The alarm many of us have sounded — that what happens online is not just a game — was sadly validated when radicals spilled out of our screens and stormed the Capitol last week. The insurrection itself was a crude selfie event, an LOL magnet for Instagram likes — except it was a real, violent attack on democracy that cost human lives.
Design flaws embedded in the systems of the biggest tech platforms — especially Facebook, YouTube and Twitter — push disinformation campaigns to go viral rather than prioritizing authoritative content. The platforms lack incentives to address these structural vulnerabilities on their own. But the events of Jan. 6 must move us all to recognize that the reactive game of whack-a-mole no longer passes muster.
If the incoming Biden administration is to make significant progress on any of the other crises facing the nation — covid-19, climate change, racial justice — it will have to treat America’s debilitating information disorder. Fortunately, there are norms and a regulatory tool kit developed over many decades in consumer protection, civil rights, media, election and national security law that can be renewed for the online world.
Online platforms are designed to maximize revenue and externalize costs. It turns out that manipulation and conspiracies sell, and tempering platform design to prevent harm is a cost that platforms have been allowed to avoid. As a result, tech companies too often allow deceptive design that makes it easy to spread conspiracy theories and endanger society.
Today, we should take a three-pronged approach to changing their incentives — turning deceptive design into more democratic design.
First, clarify that what happens online is subject to the same legal standards as what happens offline; update regulations; and increase enforcement.
Updated regulations would include the bipartisan Honest Ads Act — applying broadcast election ad transparency rules to the Internet — supplemented with know-your-customer rules so dark money groups can’t hide their funding. Updated consumer protection rules and enforcement would allow agencies such as the Federal Trade Commission to crack down on deepfakes, fake accounts, hidden amplification and other clearly deceptive designs — and require greater data sharing with independent researchers.
Much of this can be done under existing law or with narrow reform (not eradication) of Section 230 to impose legal hazard on dominant platforms for online harassment, incitement to violence and civil rights violations.
Second, insist that the industry make a high-level commitment to democratic design — a so-called digital code of conduct. Each platform should also make its own individual implementation commitments, to which it would be held accountable. The code would be designed by the companies but overseen and enforced by the FTC, with any breach treated as a consumer protection violation.
After the Second World War, the Commission on Freedom of the Press concluded that the mass media had a responsibility to society in the shadow of totalitarian advances. The commission made the industry’s social responsibility explicit, warning: “If these giant agencies of communication are irresponsible, not even the First Amendment will protect their freedom from government control.”
The results were editorial standards, norms about editorial independence and design features such as separating news from opinion. These voluntary codes were complemented by rules the Federal Communications Commission imposed on electronic media.
Today, a new digital code of conduct committing online platforms to democratic design would include circuit breakers, like those Wall Street exchanges use to pause runaway trading, to slow the viral spread of dangerous lies. Other types of friction, such as stopping recommendations or limiting shares, would help, as would the accountability that comes from letting third parties audit platform data flows.
A code would provide clear, useful information about the trustworthiness of news outlets, using rankings from organizations such as NewsGuard. And platforms would commit to greater transparency about how they enforce their terms of service, making it harder to bend the rules for repeat offenders and giving users greater confidence in the platforms' decisions.
Third, we should create a new “PBS of the Internet” to strengthen our civic infrastructure and ensure a strong online supply of trustworthy, nonpartisan scientific and election information.
Start with infrastructure that supports access to high-quality information; enlist anchor institutions such as libraries and schools to promote civic engagement; and add incentives that reinforce democratic norms and habits of discourse. Next, there should be protocols to surface and prioritize authoritative information on digital platforms. Finally, we need financial support for independent public service media.
As Republican Sen. John McCain warned, “If you want to preserve democracy as we know it, you have to have a free and, many times, adversarial press.”
Updating digital norms would make organizing the next Capitol assault more difficult.
After all, this is no longer an online game — it’s real life. We must act.