ADMINISTRATOR POWER: Eileen, great to see you virtually. I really miss working with you day to day, but I'm excited that some pretty daunting real world challenges have brought us back together in the form of this collaboration, and others. Eileen is also kind enough to serve on an advisory board here at USAID that's really important to our strategic thinking about the direction that this development and humanitarian Agency goes in. So Eileen, great to see you virtually, can't wait to get out there – for lots of reasons – but not least, to see you.
Thank you also to Stanford’s Global Digital Policy Incubator – such a great idea. Back in December 2021, as some of you know, President Biden convened the first U.S. Summit for Democracy, which brought together countries united by their belief in free societies. The second Summit for Democracy will convene in March, so four months away – four plus months away – to further these goals. The Global Entrepreneurship Challenge stemmed from that first Democracy Summit, and it assembled global technology innovators to champion human rights. And I have to thank our partners, the U.S. Department of State – part of the U.S. government like us – Microsoft, and the IE Business School in Madrid, for helping launch today’s Venture Day as part of that effort.
I think we’re gathered today because we all understand the same truth: global democracy everywhere is threatened by technological advances. And I have to say, as somebody who came of age when people were predicting the end of history and the triumph of liberal democracy, the way that sentence was written was really, really different back then. Eileen knows – it was “global democracy everywhere is advanced, or rendered inevitable, by technological advances.” So many, many people did not see coming quite the extent of the challenge that we are now facing.
So, we see governments when they control advanced technologies, especially in autocratic regimes, but also in democracies with more repressive tendencies. We see those governments using technology to surveil, to censor, and to repress. In Russia, large media sites outside the control of the state are banned, as you well know. In North Korea, despite all the best efforts by many on the outside to provide technology tools – some of which have penetrated very discreetly – citizens have almost no access to outside information because of state control. And in Iran right now, notwithstanding the creative use of some tools by the very diffuse and courageous protest movement, government forces are using technology to surveil protesters, to track their phones, and of course, to spread disinformation. To get a sense of the scale of the challenge: of the 3.8 billion people who have internet access, as we speak, 75 percent of those people live in countries where governments last year arrested and jailed people for expressing non-violent political or social views online.
Abuse, of course, also occurs when technology is wielded by private actors. Unchecked hate speech can rapidly spread through social networks, incite extremists, fuel division of all kinds, fuel violence – political, ethnic, gender-based violence. Sophisticated spyware has been developed by private actors that can evade defenses and expose the personal information of journalists, activists, and all kinds of truth-tellers around the world. And in the past three years – really more than the past three years, but heightened in the last few – we have seen the sheer power of misinformation and disinformation, much of it created by private individuals, threatening election integrity and public health. We saw, in the COVID context, falsehoods about political figures, lies about voter fraud, and again, just general harm to civilization, you might say, and harm to individual and collective welfare.
Regardless of whether governments or businesses – private actors – are in control, the technology, of course, is not staying behind a country’s borders; it’s being exported to foreign actors that are using these tools to curtail rights.
So, the misuse of technology erodes trust, minimizes individual consent, and stifles human rights. Rights that include the right to safety or security. The right to privacy. The right to peaceably assemble.
But whether that technology is in private or public hands, it doesn’t, of course, have to be a force for corrupting democracy. It can be a tool for strengthening it. And I’ve always appreciated, Eileen, your leadership on this, as one who came from a very knowledgeable background in the Bay Area to the Human Rights Council. Really, I’m sure you’d admit the institutions were behind the times, but you were way ahead of the times, in terms of inter-governmental attention to this. But all of us are learning from those of you who have been raising these issues for some time.
I think with the ideas of the people gathered there today, you are demonstrating a future for tech that respects individual rights. When we implement technologies that respect rights, we can make a more democratic world. Through your actions you are also empowering and inspiring other technologists to develop rights-respecting solutions. This should all be further along, but we are where we are, and we have to accelerate progress toward a rights-based mindset as we think about technology.
At USAID, we have launched our Advancing Digital Democracy (ADD) Initiative, which is meant to help promote digital democracy. We have 80 missions around the world, programs in about 100 countries, and this ADD Initiative includes helping governments advance legal and regulatory initiatives – which are hard enough to secure in this country, the United States, but you can imagine in some developing countries how challenging it is to get the regulatory environment right – increasing investments in rights-respecting innovation, and supporting organizations holding governments and companies accountable for a democratic digital ecosystem.
We are also currently working with technologists around the world to develop the Tech Code of Ethics, a set of principles developed by technologists, for technologists. We are a bit player in that, but definitely an actor, alongside the New America Foundation and others who wanted to spearhead it, nudge it along. The same way doctors pledge to first do no harm, technologists now coming together need to agree on a shared set of ideals that they hold dear, that are non-negotiable.
These are steps that we as a government are taking to help usher in the changes we hope to see. We have right there in the room with you all, Vera Zakem, a tech-native who leads our digital democracy efforts. Vera, if you haven’t already, please raise your hand, or stand up and be noticed. If you would like to hear more about the work that we’re doing, and how we might be able to partner with you, please talk to Vera.
And as you develop new technologies, as we know you are doing and will do, please keep democratic principles in mind. Think about features that can be designed to protect privacy rather than to subvert it. As you think about hyper-scaling your next product, consider how others might use that product to sow disinformation. And as you focus on the user, as all of you do as you think about these tools, please focus on users in countries in the Global South that may not have the same rights or protections as we do in the West, and think about what that could mean.
The only way to develop and deploy technologies that enhance democracy is to do so with intentionality, from the start – rather than trying to reverse or back-track from unintended consequences of products, consequences that one hasn’t even taken the time, or built the space into one’s creative process, to try to foresee. So you have great power, and with that power comes great responsibility, and we are here to learn from you, to listen to you, and to partner with you, as makes the most sense.
Back to you, Eileen, and I’m eager to take questions or comments from the audience.