Nina Jankowicz ’11 Publishes Book on the Information War

In her new book, Nina Jankowicz ’11 takes on Russia, fake news, and the future of conflict. In the following Q&A, she talks about how disinformation works, why we're vulnerable, and what to do about it.


What They're Doing

My book specifically describes Russian influence operations in Central and Eastern Europe, putting the revelations about Russian interference here in the U.S. into context for American readers. Often what our media coverage misses about Russian influence operations is that they go far, far beyond botnets and troll farms—the types of “fakes” we became familiar with in the 2016 election. In the examples the book describes in Estonia and the Republic of Georgia, for instance, Russia not only uses disinformation (the use of false or misleading information with malign intent) to influence political outcomes; it also launches cyberattacks, foments protest, and infiltrates cultural and media organizations.

Though the book specifically focuses on Russia, another important thread running through all of its case studies is that disinformation is harmful to democracy, whether it comes from within or outside the house. The countries that fight foreign disinformation while tacitly endorsing its tactics for their own domestic political use don’t get very far in dealing with either, undermining the free flow of information and ultimately the democratic process itself.

Why We’re Vulnerable

This touches on another misconception we in the United States have because of the widespread use of the term “fake news.” Disinformation isn’t about creating fake journalism or manipulating photos; the most successful disinformation runs on emotion and exploits societal fissures like endemic racism, economic inequality, and political gridlock and polarization. Unfortunately, the U.S. has plenty of fertile ground for disinformers to exploit.

We’re also vulnerable because of the changes to our information environment and how slow society has been to adapt to them. Social media platforms serve as a primary information source for millions of people, and these platforms’ business models prioritize and incentivize engagement with emotional content, which performs better, keeps people on the platform looking at ads, and makes tech companies money. Without changing these business models or introducing some regulatory checks, it’s up to individuals themselves to suss out what information can be trusted, and many Americans don’t have the media literacy skills or the tools to do that. They are used to mainstream media serving as a gatekeeper for their information and treat the content they encounter on social media with the same trust.

This is essentially how all disinformation operations work—Russian or otherwise. In the U.S., in 2016, Russia used racial tension and targeted Black Lives Matter supporters, even creating a Facebook page that had more followers than the official BLM account. They latched onto similar narratives when protests about the murder of George Floyd erupted this year. Ultimately, the goal is to pit Americans against one another and encourage dysfunction in our democratic process, but the model applies in other countries as well: Russia used ethnic tensions between Estonians and ethnic Russians to provoke unrest surrounding the removal of a Soviet statue in Tallinn in 2007.

How to Fix It

There are obvious solutions, like holding the perpetrators of foreign interference campaigns to account; under the Trump administration, the White House itself has undercut these efforts. While the U.S. has imposed sanctions on Russia for its malign activity, the President’s praise of Putin and assertion that Russian interference is a "hoax" have created an incongruence in our Russia policy. We are not sending a consistent message, making Moscow even less likely to heed it. We need recognition from the very top of government for these efforts to hold water. But we also need to stop securitizing this problem; this isn’t only a national security issue, one that the Departments of Defense and State should be dealing with. We also need to think about how to build America’s societal resilience, investing in media literacy and civics courses for children and voting-age adults alike, making sure Americans have access to trustworthy information by beefing up the budgets for our public broadcasters, and looking inward to try to lessen the vulnerabilities that bad actors like Russia exploit.

On an individual level, people should really be cautious when sharing information online, asking themselves if it is coming from a reputable source (does it have a masthead? contact information? has the author written anything else?) and why that source or user might be targeting them. If they feel themselves getting highly emotional, these questions are especially important, and it might be time to step away from the keyboard and practice what I call “informational distancing”—putting physical distance between yourself and emotional content that you might be tempted to share, until you can do your due diligence. Disinformation often gets its legs not through paid advertising, but because it is shared by normal users.

November 3

I’m really disturbed that we are seeing disinformation shared by American politicians and officials. This sort of behavior should be anathema to everyone. Disinformation knows no party—its ultimate victim is democracy. No matter the outcome of the election, we will be dealing with the consequences of this behavior for years to come, because disinformation undermines trust in institutions, a basic level of which we need for our government and democratic process to function. In particular, I’m worried about trust in the results on and after election night. We should all do our best to get our information from official sources (state and local election boards) rather than politicians and pundits.

The Human Face of Fake News

Out of graduate school, I worked for the National Democratic Institute, an organization that provides training and support to democratic activists around the world. I worked on programs in Russia and Eurasia. We were often the victims of Russian propaganda, which sought to paint us as “CIA-sponsored instigators of color revolution” (we weren’t). I’ve always been interested in the effects of social media on society, so when Ukraine’s Euromaidan revolution happened, I felt a strong pull to go there and work on issues related to disinformation. As a Fulbright Public Policy Fellow, I advised the Ukrainian Foreign Ministry on strategic communications issues. I watched from Kyiv as the U.S. election unfolded and America woke up to the threat of information warfare. That’s where the idea for the book was born.

While some people who study disinformation focus on network analysis and the technical side of things, I try to bring a human face to these highly technical stories and make them more accessible to curious, non-expert audiences. My research style is more sociological in that way; it is largely interview-based, though I work with primary source documents as well.

From Bryn Mawr with Love

Bryn Mawr was undoubtedly one of the biggest influences on my career! As a freshman, I didn’t intend to double major in Russian and political science, but the Russian department was too interesting and inspiring not to! I loved my classes with Sharon Bain, Tim Harte, and Dan Davidson (his language policy class gave me my first exposure to ethnic issues in Estonia!), and my time in the Russki Dom was responsible for some of my favorite Bryn Mawr memories. Also, without my Russian language skills and knowledge of Russian culture I would not be able to do half the work I do today.

The political science department was also great; I still think back to course readings from Marissa Martino Golden and from the classes of my thesis advisor, Carol Hager. Professor Hager’s class on social movements, in particular, influenced my graduate research and how I think about the several uprisings that have happened in the Eurasia region since I left Bryn Mawr. I often say that going to Bryn Mawr was the most important decision of my adult life, and that’s not disinformation.


A Disinformation Fellow at the Wilson Center, Nina Jankowicz ’11 studies the intersection of democracy and technology in Central and Eastern Europe. She is the author of How To Lose the Information War: Russia, Fake News, and the Future of Conflict (Bloomsbury/I.B. Tauris).

In How to Lose the Information War, Nina Jankowicz ’11 takes the reader on a journey through five Western governments' responses to Russian information warfare tactics—all of which have failed. She delves into the campaigns Russian operatives run and shows how we can better understand the motivations behind these attacks, how to beat them, and what is at stake: the future of civil discourse and democracy, and the value of truth itself.