Flagged: June 5, 2020

Welcome to a preview of Flagged, a newsletter from the Wilson Center’s Science and Technology Innovation Program. Each week, we flag the most important news about disinformation and how to counter it, delivering an accessible analytical brief to your inbox every Wednesday starting June 10. 

Flagged, Disinformation Distilled

Information flows are incessant; news fatigue is real. In addressing these issues, nothing matters more than the laws and policies that govern our information space. Amid the COVID-19 “infodemic,” the choices our legislatures and social media platforms make could have repercussions for years to come. And we’ll be tracking and making sense of them for you. Stay in the know; if you’d like to start receiving Flagged, please opt in here.

This Week in Disinformation: the world continued to grapple with the coronavirus “infodemic” and online discussion of the George Floyd protests, while the “big three” social media platforms continued to be scrutinized for their business practices.

  • Twitter issued a fact-check label on a tweet from President Trump about mail-in ballots, the first time it had taken such action on content from a world leader; it doubled down on its policy when the president sent a tweet that the company said “glorified violence.” Two days later, President Trump issued an executive order (EO) targeting the safe harbor provided to internet platforms under Section 230 of the 1996 Communications Decency Act. The EO recommends a new interpretation of Section 230, arguing that intermediaries are not moderating content in good faith if they censor political views or promote certain viewpoints over others; in that case, it argues, intermediaries should be treated as “publishers” of the content and held liable for it. The EO echoes conservatives’ concerns that social media platforms disproportionately police their posts, though some data points to the opposite effect. It issues a recommendation on Section 230’s interpretation and initiates a process for the FCC to issue regulatory proposals within 60 days. Reconsideration of Section 230 is now an agenda item for both the Trump and Biden campaigns.

     
  • The United Nations launched “Verified,” an initiative that seeks to challenge the spread and effects of misinformation. Verified will dispatch “digital first responders” to distribute accurate COVID-19-related content and information. It’s a new model, similar to the work of Lithuania’s famous “elves,” who fight Russian disinformation. There is evidence that respected members of families and communities can act as conduits of trustworthy information, but can they overcome widespread institutional distrust in organizations like the UN?

     
  • As social media platforms crack down on COVID-19-related misinformation (YouTube released a coronavirus policy last week), users are finding alternative platforms on which to share banned content. They move the content in question onto services with less stringent moderation practices and more limited monitoring capacity than most social media websites, like Google Drive and the Internet Archive, and use more popular platforms to distribute links. This tactical shift creates a clash between user privacy and societal harm. As Alex Stamos of Stanford’s Internet Observatory asked on Twitter, should a researcher’s copy of Plandemic on Google Drive or Dropbox be removed the same way a viral link to the video might be? Thus far, file-hosting services have grappled with these questions only in relation to illegal content, like child pornography; now the specter of disinformation is one they’ll have to confront, too.

     
  • Before 2020 is over, Alphabet Inc.’s Google will likely be the subject of two antitrust lawsuits, one federal and one state. The US Department of Justice (DOJ) is expected to file a case against Google as early as this summer, and a filing from the coalition of state attorneys general led by Texas Attorney General Ken Paxton is anticipated for this fall. The DOJ probe, launched in July 2019, examines Google’s online ad tools and whether the company leverages its dominance in search to suppress competition in the online advertising market, a market in which it now holds a 70.5 percent share. In September, a bipartisan coalition of 50 attorneys general, representing 48 states plus Washington, DC, and Puerto Rico, launched investigations into whether Google holds a monopoly over the online advertising and search markets.

     
  • The Wall Street Journal reported this week that following the 2016 US election, Facebook launched an internal effort to study how its platform shapes user behavior and to brainstorm strategies for mitigating potential harms. The research concluded that Facebook’s algorithms polarize users by driving increasingly divisive content to them “in an effort to gain user attention and increase time on the platform.” The WSJ investigation alleges that the proposed solutions were shelved because they could threaten user engagement, and that the company has since moved toward non-intervention. Facebook pushed back on the investigation’s conclusions, citing investments it has made to fight polarization, but evidence persists that its recommendation algorithms surface harmful content.

Recent Wilson Center Work

  • Russia and China are certainly involved in the coronavirus information war, but what is the COVID-19 “infodemic” like in the MENA region? The Wilson Center’s MENA and STIP programs hosted a ground truth briefing with Bayan al-Tal of the Jordan Media Institute and Faisal al-Mutar, the founder and President of Ideas Beyond Borders, to discuss what narratives are gaining steam in the region and how to combat them.

     
  • What impact will the COVID-19 infodemic have on regulation of social media platforms? Wilson Center Disinformation Fellow Nina Jankowicz joined UK Member of Parliament Damian Collins, an alumnus of the Center’s Defeating Disinformation Program, on his Infotagion Podcast to discuss whether better regulation might have mitigated the current crisis.

     
  • The Wilson Center’s Rui Zhong dives into Chinese disinformation surrounding COVID-19 and how it affects audiences inside and outside of China’s borders in an interview with the International Observatory of Human Rights. She discusses the challenges for Chinese journalists covering COVID: “China's always had reporters willing and qualified to do reporting work on natural disasters, pandemics and other issues. The problem is whether or not they have enough institutional support and wiggle room to have their writing be published and stay published.” Zhong also argues for nuance, noting that battling coronavirus disinformation is “not something that can solely be solved by taking one hawkish policy on China; we need to devise a domestic response as well, and there are no shortcuts around that.”

     
  • Congress has also set its sights on combating COVID-19 disinformation. Representatives Lauren Underwood and Elissa Slotkin of the Committee on Homeland Security spoke with Stanford’s Renee DiResta and Wilson Disinformation Fellow Nina Jankowicz about “Flattening the Misinformation Curve” through public education and awareness campaigns, as well as social media platforms’ responsibilities during the pandemic. The U.S. Helsinki Commission, which promotes human rights, military security, and economic cooperation in 57 countries across Europe, Eurasia, and North America, examined “Disinformation, COVID-19, and the Electoral Process.” Jankowicz told the Commission: “Disinformation is not a partisan issue. If we are to make any progress in protecting our democracies we need not only to clearly recognize the threat it poses, but reject its tactics whole cloth.”

Authors

Science and Technology Innovation Program

The Science and Technology Innovation Program (STIP) serves as the bridge between technologists, policymakers, industry, and global stakeholders.


Digital Futures Project

Less and less of life, war, and business takes place offline. More and more, policy is transacted in a space poorly understood by traditional legal and political authorities. The Digital Futures Project is a map to the constraints and opportunities generated by the innovations around the corner: a resource for policymakers navigating a world they didn’t build.
