US lawmakers move to limit protections enjoyed by internet platforms
Well that was quick. No sooner does Google censor the comments sections of two independent news sites than the US legislative machine moves to remove its own protections.
June 17, 2020
Axios has been all over this, reporting that the US Justice Department is urging Congress to limit the legal protections enjoyed by online platforms. Its urgings didn’t appear to have been published at the time of writing, but the move was widely leaked to multiple media outlets, so it seems legit. The limits are somewhat, well, limited insofar as they target fairly niche issues. This could, however, be the thin end of the wedge, opening the door to more profound changes to the law.
Also reported by Axios is proposed legislation from Senator Josh Hawley, a long-time opponent of the protections enjoyed by internet platforms. We’re told he wants to introduce legislation that would give consumers grounds to sue internet platform companies over allegations of selective censorship of political speech. It includes vague criteria around the concept of ‘good faith’, so the legalese evidently needs some tightening, but again this represents an attempt to chip away at the protected status of companies like Google.
All this proposed legislation feels very preliminary, but at the very least it’s a statement of intent from US lawmakers. Every time an internet platform acts unilaterally to censor on the basis of its own arbitrary criteria, it strengthens demands for it to be regulated. The purpose of regulation is to protect the free market and preserve competition, and when near-monopolists abuse their position, it’s right that they should be held accountable.
Update 09:00 18 June 2020 – Please find below the Justice Department announcement reproduced in full.
Justice Department Issues Recommendations for Section 230 Reform
Reforms Strike Balance of Protecting Citizens While Preserving Online Innovation and Free Speech
The Department of Justice released today a set of reform proposals to update the outdated immunity for online platforms under Section 230 of the Communications Decency Act of 1996. Responding to bipartisan concerns about the scope of 230 immunity, the department identified a set of concrete reform proposals to provide stronger incentives for online platforms to address illicit material on their services while continuing to foster innovation and free speech. The department’s findings are available here.
“When it comes to issues of public safety, the government is the one who must act on behalf of society at large. Law enforcement cannot delegate our obligations to protect the safety of the American people purely to the judgment of profit-seeking private firms. We must shape the incentives for companies to create a safer environment, which is what Section 230 was originally intended to do,” said Attorney General William P. Barr. “Taken together, these reforms will ensure that Section 230 immunity incentivizes online platforms to be responsible actors. These reforms are targeted at platforms to make certain they are appropriately addressing illegal and exploitive content while continuing to preserve a vibrant, open, and competitive internet. These twin objectives of giving online platforms the freedom to grow and innovate while encouraging them to moderate content responsibly were the core objectives of Section 230 at the outset. The Department’s proposal aims to realize these objectives more fully and clearly in order for Section 230 to better serve the interests of the American people.”
The department’s review of Section 230 over the last ten months arose in the context of its broader review of market-leading online platforms and their practices, which was announced in July 2019. The department held a large public workshop and expert roundtable in February 2020, as well as dozens of listening sessions with industry, thought leaders, and policy makers, to gain a better understanding of the uses and problems surrounding Section 230.
Section 230 was originally enacted to protect developing technology by providing that online platforms were not liable for the third-party content on their services or for their removal of such content in certain circumstances. This immunity was meant to nurture emerging internet businesses and to overrule a judicial precedent that rendered online platforms liable for all third-party content on their services if they restricted some harmful content.
However, the combination of 25 years of drastic technological changes and an expansive statutory interpretation left online platforms unaccountable for a variety of harms flowing from content on their platforms and with virtually unfettered discretion to censor third-party content with little transparency or accountability. Following the completion of its review, the Department of Justice determined that Section 230 is ripe for reform and identified and developed four categories of wide-ranging recommendations.
Incentivizing Online Platforms to Address Illicit Content
The first category of recommendations is aimed at incentivizing platforms to address the growing amount of illicit content online, while preserving the core of Section 230’s immunity for defamation claims. These reforms include a carve-out for bad actors who purposefully facilitate or solicit content that violates federal criminal law or are willfully blind to criminal content on their own services. Additionally, the department recommends a case-specific carve-out where a platform has actual knowledge that content violated federal criminal law and does not act on it within a reasonable time, or where a platform was provided with a court judgment that the content is unlawful, and does not take appropriate action.
Promoting Open Discourse and Greater Transparency
A second category of proposed reforms is intended to clarify the text and revive the original purpose of the statute in order to promote free and open discourse online and encourage greater transparency between platforms and users. One of these recommended reforms is to provide a statutory definition of “good faith” to clarify its original purpose. The new statutory definition would limit immunity for content moderation decisions to those done in accordance with plain and particular terms of service and consistent with public representations. These measures would encourage platforms to be more transparent and accountable to their users.
Clarifying Federal Government Enforcement Capabilities
The third category of recommendations would increase the ability of the government to protect citizens from unlawful conduct, by making it clear that Section 230 does not apply to civil enforcement actions brought by the federal government.
Promoting Competition
A fourth category of reform is to make clear that federal antitrust claims are not, and were never intended to be, covered by Section 230 immunity. Over time, the avenues for engaging in both online commerce and speech have concentrated in the hands of a few key players. It makes little sense to enable large online platforms (particularly dominant ones) to invoke Section 230 immunity in antitrust cases, where liability is based on harm to competition, not on third-party speech.
For more information about the department’s recommendations, please visit https://www.justice.gov/ag/department-justice-s-review-section-230-communications-decency-act-1996.