The daily cognitive abuse of disinformation campaigns, the oppression of algorithmic bias, and the revelations of systematic manipulation shown in the 2020 documentary, The Social Dilemma, make combining “Ethics” and “Technology” sound like an oxymoron. In 2018 IFTF’s Digital Intelligence Lab produced The Ethical Operating System (Ethical OS) to foreground potential risks and worst-case scenarios from emerging technology products and platforms. The goal was to help technologists better anticipate long-term consequences, and design and implement ethical solutions BEFORE the worst happened. What might the world look like if technology companies had foreseen the poisonous effects social media platforms have had on democratic elections? Similarly, how might governments have dealt with these platforms if they had anticipated the consequences and been empowered to act to deter them?
The Ethical OS has been used by many organizations and agencies across the civic sector, including the California state legislature, the United States Conference of Mayors, and other local governments, to bring more foresight and long-term thinking to policy decisions about new technologies. In response to high demand from government entities, and with support from the Tingari-Silverton Foundation, the IFTF Governance Futures Lab released our new Playbook for Ethical Tech Governance for government leaders in early 2021. Adapted from the original Ethical OS, this Playbook will equip civil servants with the skills and tools to proactively resolve ethical dilemmas emerging from new technologies and new social and political dynamics.
Addressing a rapidly changing technological landscape with governing institutions born in a slower, less complex world is a challenge. Effective, ethical technology governance requires balancing the needs of individuals, groups, and larger systems in such a way that winning in the short-term does not destroy opportunity in the long-term. Policymakers need a framework through which to assess and evaluate ethical dilemmas, anticipate and mitigate unintended consequences, and act to maximize positive outcomes that point towards preferred futures.
What’s in the Playbook
Through our research and interviews, we identified 5 “Risk Zones,” or urgent topics that governments and technologists alike will need to address swiftly and thoughtfully in the coming decade. These risk zones are: Artificial Intelligence, Climate, Equity, Law Enforcement, and Public Health.
Within each risk zone, we present two scenarios: one from the present or near term, and another with a 10-year horizon. These scenarios contain inherent dilemmas that challenge implicit ethical codes. For example, in one of our scenarios we look at how AI might be used to dictate when people can or cannot use their cars to commute to work; it reduces traffic for everyone, but requires occasional sacrifices from individuals. These scenarios are intended to spark conversation about how technology might impact, or be impacted by, the implementation decisions of civic officials.
To facilitate a decision-making process that evaluates the consequences of these decisions, we adapted the classic Futures Wheel to guide users through the possible best- and worst-case outcomes of their choices. We follow each section with a series of questions for consideration, to deepen the conversation about the issues raised in the scenario as well as the broader applicability of the trade-offs contained within each decision.
The Decision Tree
Understanding the intended and unintended consequences of events and actions is a critical component of foresight. The Futures Wheel (also a tool in the IFTF Foresight Essentials training, called Draw Out Consequences) is a staple of foresight workshops and facilitated processes. It asks users to brainstorm first-, second-, and third-order consequences, and to make connections between them. We originally incorporated a Futures Wheel component into the playbook, but as we playtested it with experts and peers, it became clear that the written tool needed active facilitation or extensive explanation to be useful.
Since the playbook needs to be a stand-alone self-facilitated process, we decided to simplify the Futures Wheel into a bifurcating tree, representing positive and negative outcomes of an ethical policy choice. For the negative outcome branch of the tree, we asked what could be done to address the outcome, and what might have been done differently to avoid negative impacts. For the positive outcome, we looked at second order positive and negative outcomes, and asked what could be done to address negative outcomes at that stage. Finally, we asked users to evaluate their policy choices and overall outcomes against their core civil service values.
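The bifurcating tree described above can be sketched as a simple data structure. This is a hypothetical illustration of the process, not code from the playbook itself; the class names, the example scenario details, and the "unaddressed negative outcomes" query are all our own assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class Outcome:
    """One branch of the tree: a projected result of a policy choice."""
    description: str
    positive: bool
    # Responses brainstormed for this outcome (mitigations or reinforcements).
    responses: list = field(default_factory=list)
    # Second-order outcomes branching from this one.
    children: list = field(default_factory=list)

@dataclass
class DecisionTree:
    """A policy choice with one positive and one negative first-order outcome."""
    policy_choice: str
    positive_outcome: Outcome
    negative_outcome: Outcome

    def unaddressed_negative_outcomes(self):
        """Collect every negative outcome, at any depth, with no response yet."""
        found = []
        def walk(node):
            if not node.positive and not node.responses:
                found.append(node.description)
            for child in node.children:
                walk(child)
        walk(self.positive_outcome)
        walk(self.negative_outcome)
        return found

# Hypothetical example loosely based on the AI commuting scenario above.
tree = DecisionTree(
    policy_choice="Let an AI scheduler ration commuter car use",
    positive_outcome=Outcome(
        "Traffic congestion drops city-wide", positive=True,
        children=[
            Outcome("Shorter average commutes", positive=True),
            Outcome("Shift workers lose flexibility", positive=False),
        ],
    ),
    negative_outcome=Outcome(
        "Individuals are occasionally barred from driving", positive=False,
        responses=["Offer transit credits on restricted days"],
    ),
)

print(tree.unaddressed_negative_outcomes())
# → ['Shift workers lose flexibility']
```

Walking the tree surfaces the second-order negative outcome that still lacks a response, which mirrors the playbook's prompt to ask, at each negative branch, what could be done to address it.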
Through this process, we looked at multiple consequences, constrained the possibilities to two outcomes at each stage, and asked users to classify outcomes as generally positive or negative. Of course, the limitations we impose on the process hide the complexity and nuance of any policy choice’s impact. And the interpretation of “positive” and “negative” can differ among people and constituencies. But we believe that trading away some of this complexity ultimately benefits the process of thinking through multiple consequences, and serves as a useful on-ramp for further exploration.
Your Turn: The Playbook in Action
Our hope is that this playbook is an easy-to-use guide to help navigate complex situations. As governing institutions and leaders try to keep pace with technological advances, long-term, consequence-oriented thinking can help balance the need for and speed of innovation against what is necessary to safeguard privacy, truth, democracy, mental health, civic discourse, equality of opportunity, economic stability, and public safety. It can help ensure trust, accountability, and fairness in the process.
The Playbook for Ethical Tech Governance was completed by IFTF in early 2021. You can sign up here to receive a copy or to book us for a workshop using this playbook.
*This essay has been updated since the original date of publishing.