Understanding “Who” Is Behind Disinformation Is Key to Fighting It

Ever since the riot at the U.S. Capitol on January 6, policymakers and the public alike have renewed their focus on technology and its role in spreading disinformation. But much of the battle centers on “who is accountable” for the information published online. Is it the content’s creator? The person doing the posting? The platform? A community? A state actor?

That was just one of the topics tackled by 92Y during its “State of Democracy Summit” on March 25. The five-hour event consisted of nine panels that assessed the aftermath of the 2020 election and discussed the future of protecting democracy in America. The panels ranged from “Democracy on Film,” a discussion of the “Boys State” documentary, to “Local Democracy,” a conversation with Ithaca Mayor Svante Myrick.

But one panel, “Technology, Democracy, and Misinformation,” sought to answer how exactly America can “wrestle with the spread of dangerous misinformation.” The panel, which was moderated by Democratic strategist Emily Tisch Sussman, included Debora Plunkett, senior fellow at the Belfer Center and former director of the NSA’s Information Assurance Directorate; Jeff Kao, a data science journalist at ProPublica; and Jevin West, director of the Center for an Informed Public at the University of Washington.

Plunkett drew a distinction between platforms and content. Platforms, she said, are “the tools and capabilities to be able to communicate largely on the internet,” such as Facebook, Twitter and Instagram; the past 10 to 15 years have seen an “exponential” rise in such platforms. Content, by comparison, is the “information that rides on those platforms that serves all kinds of purposes.”

The complication in fighting disinformation, Plunkett said, was the question of “who is accountable” for content, be it the poster, community or platform. Regulators are specifically challenged by the pace of technological changes, the borderless nature of the internet, and the notion that “internet is owned by no one.”

Although these platforms certainly have positive uses, Plunkett noted, the concern arises when they are put to malicious purposes. As an example, she cited a recently released report from the Director of National Intelligence detailing how Iran and China attempted to influence the 2020 election through online disinformation.

Kao elaborated on the foreign campaigns to spread disinformation. ProPublica tracked “10,000 suspected fake Twitter accounts” involved in a disinformation campaign traced to China. The Chinese government and its contractors used these fake accounts to praise China’s handling of the pandemic and to advance GOP goals.

Although Twitter regularly finds these fake accounts and publishes the data for researchers, that “doesn’t really deter” these campaigns, Kao argued. Platforms try to deter disinformation by raising the cost of spreading it, but those measures are less effective against actors with “deep pockets.” “As social media economy has become more sophisticated… anyone with money and time can start up one of these campaigns,” he explained.

West explained that in addition to fake accounts, “blue check verified accounts” repeatedly violated site policies and amplified disinformation in 2020, especially when it came to information about COVID-19. West cautioned that a piece of information is often created in “good faith” before “nefarious” actors twist and frame it into something more malicious. Addressing “home grown domestic problems” is as necessary as combating foreign sources of disinformation, he emphasized.

Although many individuals are aware that disinformation exists, West noted, they are unaware of new disinformation tactics, such as deepfake videos. He expressed fear that “people will start not believing anything,” which is the “objective” of disinformation campaigns. Kao explained that this broad disbelief is called the “Liar’s Dividend,” in which true things are “dismissed as fake news,” too.

All the panelists agreed that Americans can each take steps to fight disinformation. West suggested “digital civics” education to cultivate “habits of mind.” For example, when looking at information that seems “too good to be true,” ask who is saying it and what they have to gain by telling you. Kao warned about extremely emotional content: if you get worked up while reading something, he explained, “step back” and investigate it.

Plunkett argued that younger people are the future of fighting disinformation, and that they need to build up a “healthy skepticism” early on, echoing the panel’s broader calls to research and independently verify information. “We are now expected to be our own editors,” West concluded.

The summit was produced in collaboration with the Knight Foundation, ProPublica and Craig Newmark Philanthropies. The full summit is available on 92Y’s website (https://www.92y.org/state-of-democracy).


Ari Berman is Op-Ed Editor for the Journal.
