Erin Simpson
Director of Technology Policy at the Center for American Progress
Spring 2022 Pritzker Fellow
Seminar Series: "Disinformation USA: A Sociotechnical Reckoning on Where We Go from Here"
Erin Simpson is a technology policy expert committed to advancing public interest values online. Simpson is currently the Director of Technology Policy at the Center for American Progress, where she develops policy responses to a range of economic, social, and democratic challenges.
Simpson has advocated for platform accountability and improved technology regulation in the European Union, United Kingdom, and United States. She served as the civil society lead for the Computational Propaganda Research Project at the University of Oxford, where she supported international civil and human rights leaders in democratic resilience and strategic responses to disinformation.
Simpson’s policymaking is informed by her earlier work in civic technology. She was the founding director of programs at Civic Hall Labs, a civic tech research and development nonprofit in New York City, and a Microsoft civic tech fellow.
Simpson is a Marshall Scholar and a Truman Scholar. She received two master’s degrees from the Oxford Internet Institute at the University of Oxford and earned a bachelor’s degree in public policy from the University of Chicago. During her time at the College, she was a student leader and organizer at the IOP, co-founding the IOP TechTeam.
She is an avid outdoorswoman and proud Wisconsinite.
Seminars
"Disinformation USA: A Sociotechnical Reckoning on Where We Go from Here"
Disinformation, it seems, is everywhere. From the doctor’s office to the ballot box, TikTok to FM radio, disinformation is held up as the culprit behind any number of US political anxieties. But while problems with propaganda are nothing new, the national fixation with disinformation on social media illustrates a meaningful challenge for our time: coming to grips with how the internet radically changed our information ecosystem, our politics, and how we understand each other. From foreign interference to domestic radicalization, democratic institutions and technology companies have struggled to stem the harms from our broken information ecosystem. Over eight weeks, we’ll critically examine the disinformation phenomenon and use it as an entry point to look more expansively at American politics and an evolving media ecosystem. We’ll explore the technical, cultural, financial, and political power struggles fueling information disorder, engaging leading practitioners and scholars in the field. Throughout, we’ll interrogate the policy implications of our discussions, turning over the technical, legal, and cultural strategies that might constitute better governance in a disinformation-rich age.
Propaganda is central to political life. It's evolved with the introduction of each new communication technology, hand-in-hand with the cultural and historical movements that have shaped our politics. To kick off our disinformation journey, we'll trace the lineage of online disinformation and talk about the technical and political affordances that have enabled recent disinformation crisis points within domestic elections, global elections, and the pandemic. Grounded in historical context, we'll discuss the racialized and gendered drivers of U.S. political disinformation, and introduce the key questions we'll grapple with around architecting voice and power online. This seminar will set the conceptual stage for the coming weeks and tee up our recurring questions around shaping technical, legal, cultural, and policy responses.
The rise of online social networking introduced a multitude of new possibilities for social and political communication. The scope, scale, and targeting specificity of these platforms offered people, movements, campaigns, and governments of all persuasions a new toolset. While some of these tools were abused, many of the products that contributed to the prevalence of online disinformation were features, not bugs. Today, we’ll speak with Adam Conner and Kip Wainscott about the early days of politics and social media, chronicling the recent history of the fraught relationship between social media companies, lawmakers, and online content problems such as disinformation.
Special Guests: Adam Conner, Vice President for Technology Policy at the Center for American Progress & Founder of Facebook’s DC Office in 2007; and Kip Wainscott, Head of Content & Product Policy at Snap
Digital advertising makes the consumer internet go round: free, user-friendly services mask the massive surveillance architecture and ad sales machines that track and sell our every click. Today, we’ll discuss the digital advertising business model at the heart of our problems with disinformation, while also questioning its premise: what is our attention online even worth? We’ll talk with Tim Hwang, whose recent book Subprime Attention Crisis critically investigated big tech’s attempts to financialize attention. We’ll discuss the limitations of online persuasion for disinformation and beyond, exploring what’s at risk if these business models continue, and what’s at risk if they collapse.
Special Guest (via Zoom): Tim Hwang, General Counsel at Substack & Author of Subprime Attention Crisis
Drawing on the deep well of U.S. cultural and political bigotry, extremist online sub-cultures have leveraged search and social media tools to recruit, radicalize, and successfully coordinate mass media disinformation campaigns. Online media platforms have become the latest arenas where hate and conspiracy groups have sought to amplify extremist ideologies and warp broader media narratives in their favor. Today, we’ll examine the evolving disinformative tactics of far-right influencers and the role that algorithmic recommendation systems can play in driving radicalization. Speaking with researcher Becca Lewis, we’ll discuss popular and media vulnerabilities to these tactics and explore potential technical, cultural, and policy paths toward greater resiliency.
Special Guest (via Zoom): Becca Lewis, Researcher at Stanford University, Affiliate at the University of North Carolina’s Center for Information, Technology, and Public Life & Author of Alternative Influence: Broadcasting the Reactionary Right on YouTube
The question of how to best protect freedom of expression online is at the heart of conversations on disinformation. The First Amendment protects internet platforms and constrains policymakers in grappling with informational threats in the privately-owned “public sphere.” Indeed, contemporary First Amendment doctrine repels judicial and legislative checks on platforms – painting a detailed picture of what cannot be done while saying little about what can. Lawmakers have encountered these constraints in the robust public debates around compelling transparency, restricting amplification, and amending Section 230 of the Communications Decency Act (the key liability shield for people and service providers online), among other avenues. Today, we’ll be joined by Professor Genevieve Lakier of the University of Chicago Law School – a leading scholar on these issues – to dive into the fascinating and expansive questions around how the United States might best steward freedom of expression online.
Special Guest: Genevieve Lakier, Professor of Law at the University of Chicago Law School and Senior Visiting Research Scholar at the Knight Institute at Columbia University
Recommended Reading:
- “After the ‘Great Deplatforming’: Reconsidering the Shape of the First Amendment,” Genevieve Lakier and Nelson Tebbe (March 2021: LPE)
The interplay among global, national, and regional informational threats poses complex challenges for stewarding a healthy information ecosystem. Across varying cultural contexts and technologies, people are grappling with unique but also surprisingly similar questions around media manipulation, platform governance, and social cohesion. Academic research has played an essential role in illuminating information operations and social media manipulation across sociopolitical contexts, especially around crisis events and democratic elections. Today we’ll be joined by Dr. Philip Howard, Director of the Oxford Internet Institute, who has been a pioneer in this field. As the principal investigator of the DemTech research group, he has led investigations into computational propaganda in dozens of elections around the world, including leading analysis for the Senate Intelligence Committee’s report on Russian interference in the 2016 U.S. general election. We’ll discuss the origin of this research, the rapid development of the field, and what role research can play in stewarding global governance of a highly political challenge.
Special Guest (via Zoom): Philip N. Howard, Professor of Internet Studies at Balliol College at the University of Oxford, Director of the Oxford Internet Institute, and Author of Numerous Books Including Lie Machines: How to Save Democracy from Troll Armies, Deceitful Robots, Junk News Operations, and Political Operatives
Recommended Reading:
- “What's stunning about the misinformation trend - and how to fix it,” Sheldon Himelfarb and Philip Howard (October 2021: CNN)
*This seminar will be held via Zoom
Armies of invisible workers around the world are doing the low-paid, high-stakes labor of commercial content moderation. The unfathomable amount of content churned out on big social media platforms has birthed a brutal industry of people charged with reviewing and removing reported content: violent, harassing, disinformative, pornographic, and otherwise. Content moderation is the last line of defense against mass disinformation online, yet the contextual, subjective, and cultural questions around political disinformation can hardly be resolved in the seconds moderators have to review each post. For workers and social media users alike, the status quo is untenable. Today we’ll speak with Dr. Sarah T. Roberts about her pioneering research into the labor of content moderation and discuss the technical, policy, and cultural approaches to improving content moderation of disinformation and beyond.
Special Guest (via Zoom): Dr. Sarah T. Roberts, Co-Founder of the UCLA Center for Critical Internet Inquiry, Consultant for Twitter’s META group (Machine Learning, Ethics, Transparency and Accountability) & Author of Behind the Screen: Content Moderation in the Shadows of Social Media
Don't miss the concluding, interactive seminar wrapping up this series.