Ever since the attack on the U.S. Capitol by right-wing extremists on Jan. 6, social media and online platforms have been scrambling to take action.
Twitter permanently suspended and Facebook blocked President Donald Trump’s accounts, while Amazon removed Parler from its cloud-hosting service. Stripe, Apple, Venmo, PayPal, YouTube, Telegram, and others deleted, suspended, delisted, closed, and took down accounts, apps, and content, as compiled by First Draft.
While these actions make some wary – why are Mark Zuckerberg (Facebook) and Jack Dorsey (Twitter) deciding the fate of world leaders’ ability to communicate? – what’s enormously significant is that platforms are beginning to acknowledge they bear responsibility for information that flows on their platforms.
However, these actions are all crude approaches that don’t affect the way online platforms function algorithmically, or the systemic reasons harmful and violent content flourishes.
Fox News reported that foreign dictators and repressive regimes still have free rein on Twitter. Human-rights groups in countries such as Ethiopia, Sri Lanka, and Myanmar have argued, to little effect, for curbing hate speech in those countries, according to The New York Times.
They have a point, and I’ll raise them: Here in Colorado, we have our own local spreaders of false, misleading, and harmful information online. Examples:
- False claims of election fraud: The day after the election, a former campaign staffer for Rep. Lauren Boebert who calls herself “America’s mom” spread false information on Facebook, claiming that President-elect Joe Biden’s win was a “coup” and listing refuted claims about ballot fraud. When Sandra Fish — a data journalist with the Colorado News Collaborative (COLab) and fellow with First Draft — reported this, Facebook marked the post as containing “partly false information,” but it still got more than 400 shares. On Jan. 8, “America’s Mom” posted that her Facebook account had been suspended for seven days.
- False allegations that COVID-19 is a hoax. Local posters frequently claim the pandemic is a hoax. In one typical recent post, which got nearly 40 retweets, the writer states: “After Trump conceded schools in Colorado will reopen Feb. 7 Denver went from Red to Orange for Covid … I thought as of yesterday we were all dying from Covid … The dems can work “miracles” this has all been a hoax.”
- Personal attacks: Anti-Semitic attacks and violent rhetoric against specific local journalists, along with claims that they coordinate with antifa, flourished on Twitter, YouTube, Instagram, and other conventional platforms after a security contractor hired by Denver TV station KUSA-9News shot and killed a protester in October.
The most effective misinformation typically contains a kernel of truth, or plays on legitimate concerns and doubts people have, and these posts are no exception.
- Is it true that a demonstrator was shot and killed by a security contractor brought by 9News following an October protest? Yes. Was the news team coordinating with antifa? No.
- Is it true that the November 2020 election was hotly contested? Yes. Is it true that Democrats got “corporate media” to lock down state results announcements and stopped counting ballots? No.
Trump himself, of course, has been a master of manipulation online, mixing real or perceived grievances from his base with half-truths and outright lies, the results of which we saw when his supporters attempted to stop the certification of the presidential election on Jan. 6. His online following is (or was) vast, far exceeding any of the Colorado examples above.
So what to do? While many worry about social media executives constraining free speech, it’s important to remember the companies they run are private entities that enjoy immense profits and are largely free from concerns about liability for what’s posted on their sites. And technically, private companies are not subject to the First Amendment, which protects free speech from government intrusion.
Social media platforms are 21st-century creations, which don’t look exactly like anything that’s come before. They may have the feel of a public street corner where anyone can hold up a sign or give a speech, but they are more like a gated community without a board of homeowners providing oversight, where the developer sets algorithmic rules on what you see, when and why.
Tackling these problems requires creative thinking and deliberation. We need to approach these problems as matters of public policy, not private corporate decision making.
We need a substantive conversation in this country – and this state – about accountability, transparency, and regulation of social media platforms. This conversation must be informed by research, investigative journalism, and legal proceedings on effective means to hold social media platforms accountable and make them take responsibility.
We need hearings, we need studies, we need debates about various approaches: antitrust actions, requirements to make algorithms transparent, the creation of independent public oversight boards, developing public social media alternatives analogous to public media, and more.
These issues are too important to the public and our democracy to rely on the good, bad, or indifferent intentions of a few CEOs.
Nancy Watzman is director of Lynx LLC, where her clients include Democracy Fund and the NYU Ad Observatory. She is former director and remains an advisor to the Colorado Media Project, and in 2020 she managed a local news fellowship combating online misinformation for First Draft.
The Colorado Sun is a nonpartisan news organization, and the opinions of columnists and editorial writers do not reflect the opinions of the newsroom. Read our ethics policy for more on The Sun’s opinion policy and submit columns, suggested writers and more to email@example.com.