NTU Weighs In On House Energy & Commerce Technology Discussion Draft Bills


To: Republican Members of the House Committee on Energy and Commerce

From: Josh Withrow, Director of Technology Policy, National Taxpayers Union Foundation

            Will Yepez, Policy and Government Affairs Associate, National Taxpayers Union

Re: NTU’s Views on discussion draft bills released on July 28, 2021

On behalf of the National Taxpayers Union (NTU), we write to express our views on the discussion draft bills put forward by Energy and Commerce Republicans to improve transparency and content moderation accountability, reform Section 230 of the Communications Decency Act, promote competition, and prevent illegal and harmful activity on large online platforms. With over 30 discussion draft bills under consideration, this memo represents topline taxpayer concerns but is by no means an exhaustive analysis of every discussion draft bill.

This memo includes feedback on certain legislation as well as more general concerns with provisions that appear in several discussion drafts. We welcome the opportunity to provide input on behalf of taxpayers and appreciate Members’ and their staffs’ willingness to engage in a thoughtful discussion on these important topics.

_________________________________________________________________________________________

Section 230 Reform:

  • Definition of “covered company”: The “covered company” definition used by most of the Section 230 reforms ($3 billion in annual revenue and 300 million monthly active users) seems deliberately targeted to include Twitter, which often escapes other attempts to lump it in with “Big Tech” because of its relatively paltry market cap. Setting the bar that low, however, would likely affect not only Facebook, YouTube, Amazon, and Twitter, but also Netflix, LinkedIn, eBay, TikTok, and Spotify, among other “interactive computer services.” Likewise, many companies that do not exceed these thresholds today, such as Snapchat, Reddit, and other up-and-coming platforms, will likely do so in the near future.
    • The sort of third-party content that each of these platforms hosts, and therefore moderates, is quite different, which is one reason that regulating digital platforms via a size metric is nearly impossible to do without negative consequences for businesses that are not the target of Congress’s ire today. It makes sense to have some size threshold for the transparency and reporting requirements of this and other bills, however, because those requirements impose definite time and cost burdens that would be difficult for a startup to meet. The applicability of liability protections for content moderation, though, ought to be based on well-defined terms that make firm size irrelevant (for example, Section 230 already does not protect the hosting of federally illegal content).
  • Preserving Constitutionally Protected Speech, led by Reps. Cathy McMorris Rodgers (R-WA) and Jim Jordan (R-OH):
    • Civil liability: “Objectively reasonable” immediately raises the question “according to whom?” Shifting Section 230’s protection for taking down content from what the “provider or user considers” objectionable to an “objectively reasonable” standard would seemingly open the door to a flood of expensive litigation, allowing complainants and judges to decide for the covered sites what “objectively reasonable” means. Outside of explicitly illegal content that sites are already obligated to take down, most moderation decisions sit in a subjective grey zone, made even more difficult because people on opposite sides of America’s political divide have diametrically opposite views of what moderation should look like.
    • Notice and appeals process: While the largest social media platforms have already dedicated enormous staff and technological resources to content moderation, imposing similar notice-and-appeals standards, which would necessarily require hiring many more human moderators, could prove a difficult barrier to entry for platforms looking to compete with incumbents.
    • In addition, requiring companies of any size to distinguish which moderation decisions affected “conservative” versus “liberal” accounts would be functionally impossible to comply with and unlikely to produce useful or accurate results.
  • Specific content liability carve-outs: In general, new carve-outs in Section 230’s liability protections are likely to result in over-moderation that infringes upon aspects of legitimate speech. We saw this with the Stop Enabling Sex Traffickers Act/Fight Online Sex Trafficking Act (SESTA/FOSTA), which was intended to address the scourge of human trafficking online but was written so broadly that a large number of dating sites, and even Craigslist’s personals section, closed down entirely rather than taking the risk that they could be sued for the illegal activities of some of their users.
    • Rep. Bob Latta’s (R-OH) “Bad Samaritan” exclusion is one carve-out proposal that seems reasonable. Knowingly hosting content that is illegal under federal law is already not protected by Section 230, but explicitly removing liability protections for companies that are found to have “purposefully” enabled such content seems a reasonable clarification that ought to address bad actors like the infamous Backpage.com. Such an exclusion would seemingly render unnecessary a number of the specific proposals relating to already-illegal content such as terrorist recruitment, child exploitation, and the sale of illegal drugs.
    • In the case of certain clearly harmful categories of content that are not presently illegal under federal law (“revenge porn” being the most obvious example), it may be a better approach simply to expand criminal law to prohibit such content rather than taking the Section 230 carve-out approach.
  • Nondiscrimination Carve Out led by Rep. Dan Crenshaw (R-TX): Sec. 101, which prohibits a covered company from blocking or degrading access to content based on a user’s race, sex, political affiliation, or ethnicity, would open the door to frivolous lawsuits. With even a single lawsuit potentially costing in the six-figure range, this is good news for trial lawyers and bad news for American businesses and consumers. Sec. 201(b)(1) provides an exception if a covered company publishes a description of the content it would block or impair access to, which would likely lead to companies compiling a laundry list of general types of content they might act on.
  • Cyberbullying Carve Out led by Rep. Tim Walberg (R-MI): This bill would limit Section 230 liability protections for cyberbullying for all platforms, not just “Big Tech,” which could crush startups under waves of meritless lawsuits. The definition of cyberbullying is also so broad that it could encompass nearly any mean-spirited speech. Moreover, many companies that benefit from Section 230 protections do not collect users’ ages for some of the content they host, such as online comment sections, which would make enforcing this vague change incredibly difficult if not impossible.

Content Moderation Practices to Address Certain Content:

  • Whereas the carve-out bills try to increase moderation of undesirable content by removing liability protections, most of these proposals make such moderation practices subject to Federal Trade Commission (FTC) review and approval and empower state attorneys general to challenge non-enforcement of these requirements. This puts the covered companies in a “mother may I” relationship with the increasingly partisan FTC.
  • Having “reasonable content moderation practices” defined by the FTC might be a more deliberative method than throwing open the doors to private litigation, but the reporting and compliance costs could make these requirements just as onerous and just as likely to result in over-moderation of broad categories of content. The “covered company” threshold drops to $1 billion in annual revenue and 100 million global active users for these bills, meaning even smaller platforms, like Reddit, would have to devote significant personnel and technology resources to compliance.
  • In particular, requiring approval, whether by the Commission or by a third party, of any change to the manner in which covered companies enforce takedowns of these categories of content could actually slow their adaptation to the ever-changing methods of bad actors on their platforms.

Protecting Children from Mental Health Harms and Cyberbullying:

  • While it is certainly understandable that parents have concerns about how social media use (and potential abuse) affects the well-being of their children, bills forcing companies to study these impacts, or funding government educational campaigns, would spend taxpayer dollars on a task best left to parents and their communities.
  • Obligating companies to conduct such studies may also require them to carry out more intrusive surveillance and collection of sensitive data on minors than they currently do.

Improving Transparency:

  • All four of these bills would require large online companies to disclose their business practices quarterly, biannually, or annually, with the disclosures made publicly available by the FTC. NTU has long supported transparency in government, but requiring private companies to comply with this level of disclosure is extremely heavy-handed. For comparison, investors are not required to submit their investment strategies and decision-making processes to lawmakers so that the information can be made publicly available. Requiring public disclosures could expose proprietary information, create a roadmap for bad actors to evade the rules, and impose an unnecessary administrative burden on businesses.

Additional Accountability Bills:

  • Funding Affordable Internet with Reliable (FAIR) Contributions Act led by Rep. Markwayne Mullin (R-OK): This legislation would require a study on the feasibility of having edge providers contribute to the Universal Service Fund (USF). The USF is in urgent need of reform, but taxing edge providers makes little sense. As NTU has written, technology companies should not be forced to pay for the upkeep of broadband networks owned by telecommunications companies. NTU recommends Congress evaluate other funding reforms, like a voucher program or appropriations, as better alternatives to the current system.
  • ID Verification led by Rep. John Curtis (R-UT): Sec. 2 gives the FTC blank-check authority to decide how it wants to enforce the identification mandate. In the “Big Tech Accountability Platform,” one of the concerns listed is companies’ access to user data, so it is striking that this bill does not specify what information the FTC and social media companies would require to verify a user. More broadly, conservatives may live or work in areas where their views are not well received. The majority of Democrats view calling out others on social media for content they find offensive as “holding people accountable,” while the majority of Republicans see it as punishing people who don’t deserve it. Republicans should be wary of the implications of forcing conservatives to submit identity verification to technology companies in order to use their platforms.