March 6, 2026
The Honorable Brett Guthrie
Chair
House Energy and Commerce Committee
U.S. House of Representatives
Washington, DC 20515
The Honorable Frank Pallone, Jr.
Ranking Member
House Energy and Commerce Committee
U.S. House of Representatives
Washington, DC 20515
Dear Chairman Guthrie and Ranking Member Pallone:
On behalf of National Taxpayers Union (NTU), the nation’s oldest taxpayer advocacy organization, I write to provide our views on H.R. 7757, the Kids Internet and Digital Safety Act (KIDS Act), which is scheduled for consideration before the Committee. NTU appreciates the Committee’s continued efforts to improve online safety for minors and recognizes the importance of developing effective and durable policy solutions in this area. At the same time, several aspects of the bill raise questions about how the proposed framework would operate in practice, particularly with respect to privacy implications, compliance expectations, and implementation challenges across different types of online services.
I. Technology Verification Measures and Privacy Implications (Title I — Section 103)
Section 103 of the KIDS Act directs covered platforms to adopt and utilize “technology verification measures” designed to identify minors and prevent them from accessing sexual material harmful to minors. The provision appropriately avoids prescribing a specific verification technology, which helps ensure that the framework remains technology-neutral and adaptable as verification methods evolve.
Because compliance turns on whether a user is a minor, covered platforms would need to determine age before granting access to restricted material. In practice, this means age-assurance measures would be applied to all users seeking access, including adults, because a platform cannot restrict access for minors without first distinguishing minors from non-minors. Applying such measures to the entire user base potentially raises significant privacy and data-protection concerns.
Because the provision leaves platforms discretion in selecting age-assurance systems, verification approaches can vary widely, with differing data privacy and security implications. Examples include mechanisms that confirm age without retaining identifying information, third-party age-verification services that confirm eligibility through an external provider, and identity-document verification systems that require users to submit personal credentials. The nature and scope of the information processed, and the associated data-protection and security risks, can vary significantly depending on the method used.
In recognition of these concerns, Section 103 includes several safeguards intended to limit them. For example, the statute requires covered platforms to maintain reasonable data-security practices and includes a rule of construction stating that government-issued identification cannot be required.
However, the proposed law does not distinguish among age-verification approaches that carry significantly different privacy risks. Although the statute allows platforms flexibility in selecting verification technologies, it does not include safeguards specifically aimed at limiting the use of more privacy-intrusive methods.
Additional safeguards aimed at limiting unnecessary data collection and promoting privacy-protective forms of age assurance could help ensure that the bill’s objectives are achieved without encouraging overly intrusive verification approaches.
II. Scope of “Covered Platforms” (Title II — Section 201)
Section 201 of the KIDS Act defines a “covered platform” as a website, software, application, or online service that is publicly available to consumers and that meets several specified criteria. These criteria include enabling users to create a username or other identifier that can be searched for and followed by other users; having as a primary purpose the facilitation of sharing and gaining access to user-generated content; incorporating design features intended to facilitate user engagement; and using personal information for advertising, marketing, or recommending content. Because the obligations established in Title II apply only to services that fall within this definition, the scope of “covered platforms” plays a central role in determining the reach of the bill’s regulatory framework.
While the definition appears intended to capture platforms whose primary purpose is to facilitate the sharing of and access to user-generated content, applying this criterion in practice may still present interpretive questions. Many modern online services incorporate user accounts, content-sharing tools, and recommendation or engagement features alongside other core functions. As a result, services with substantially different structures and purposes may face uncertainty in determining whether they qualify as covered platforms under the statutory definition and are therefore subject to the compliance obligations set forth in Title II.
Section 213(b) directs that the policies, practices, and procedures required under subsection (a) be appropriate to the size and complexity of the covered platform and the technical feasibility of addressing the identified harms. This provision recognizes that platforms vary significantly in scale, resources, and technical capacity and allows the required safeguards to be implemented in ways that reflect those differences. However, these considerations apply only after a service has been determined to be a covered platform and therefore do not affect the threshold question of which services fall within the statutory definition set forth in Section 201. As a result, services with substantially different functions and operational structures may still fall within the scope of Title II, notwithstanding the tailoring that Section 213 permits once a service is covered.
Providing additional clarity regarding the intended scope of “covered platforms” could help address these concerns. Clarifying how the “primary purpose” criterion should be applied to services that incorporate user accounts, content-sharing tools, or engagement features alongside other core functions would improve predictability for platforms seeking to determine whether the statute applies to them. Congress may also wish to consider whether additional factors or thresholds reflecting platform scale, functionality, or the centrality of user-generated content to a service’s operation would help ensure that the statute reaches the services it is intended to cover: those whose primary purpose is to facilitate the sharing of and access to user-generated content.
III. Duty to Address Harms to Minors (Title II — Section 213)
Section 213 of the KIDS Act requires providers of covered platforms to establish, implement, maintain, and enforce reasonable policies, practices, and procedures intended to address certain harms to minors. The statute identifies several categories of harm, including serious threats of physical violence affecting a minor’s major life activities; sexual abuse and exploitation; the distribution, sale, and consumption of narcotics, tobacco, cannabis, and alcohol, as well as gambling, in ways that involve minors; and financial harm resulting from deceptive practices.
The statute appropriately recognizes that platforms vary substantially in their capabilities and operational structures. Section 213(b) provides that the policies and safeguards required under the provision should be calibrated to characteristics such as the platform’s size and complexity and the practical feasibility of addressing the harms identified in the statute. In addition, Section 213(c) clarifies that the provision should not be construed to impose a general duty of care on platform providers.
At the same time, the requirement that platforms maintain “reasonable policies, practices, and procedures” leaves some uncertainty regarding how compliance will ultimately be assessed. Because the provision relies on a flexible standard rather than prescriptive requirements, platforms may face questions about which safeguards would be sufficient to satisfy it. The meaning of the standard may therefore depend on how it is interpreted through enforcement actions, audits, or subsequent regulatory guidance.
Providing additional clarity regarding the application of the “reasonable policies, practices, and procedures” standard could help address these concerns. For example, Congress may wish to consider whether identifying relevant factors that regulators could take into account when evaluating compliance—such as the nature of a platform’s services, the types of risks presented to minors, and the platform’s technical capacity to mitigate those risks—would improve predictability for covered platforms. At the same time, preserving flexibility for regulators to issue interpretive guidance over time could help ensure that the framework remains adaptable as online services and platform technologies evolve.
IV. Audit and Reporting Requirements (Title II — Section 219)
Section 219 of the KIDS Act requires covered platforms to undergo periodic independent audits conducted by third-party auditors and to report the results of those audits to the Federal Trade Commission and the public. These audits are intended to evaluate whether platforms are implementing the safeguards and mitigation measures established elsewhere in the legislation to address harms affecting minors, including the parental tools designed to mitigate those risks and the ways in which platforms respond to reports of harms involving minors.
In carrying out the audit, the statute directs auditors to consider widely accepted or research-based practices and relevant evaluative frameworks when assessing safeguards for minors and parental tools, as well as approaches used to identify and address risks of harm affecting minors. The provision also requires auditors to consult with relevant stakeholders—including parents with appropriate experience, organizations focused on public or mental health and youth development, and experts in freedom of expression—when evaluating how platforms address these issues.
At the same time, the statute relies on general concepts such as best practices and evidence-based approaches without specifying how those standards should be applied in determining whether a platform’s policies and procedures sufficiently address the harms identified elsewhere in the legislation. Because these concepts are not further defined, assessments of compliance may depend substantially on the methodologies and professional judgment employed by individual auditors. Differences in audit approaches could therefore lead to variation in how comparable platform practices are evaluated across services.
Providing additional clarity regarding audit expectations—whether through statutory clarification or subsequent regulatory guidance—could help promote greater consistency in how audits are conducted and how compliance is assessed, while preserving flexibility for auditors to account for differences in platform size, design, and technical architecture.
V. Messaging Restrictions and Parental Control Requirements (Title II — Sections 233–235)
Sections 233 through 235 establish requirements governing direct messaging functionality involving minors on covered platforms. These provisions include restrictions on messaging features available to younger users and requirements that platforms provide parents with tools to manage certain messaging interactions involving teen users. The provisions are intended to reduce the risk that minors may encounter harmful interactions or exploitation through private communication channels.
Under these sections, covered platforms that provide messaging functionality must offer parents tools that allow them to manage certain aspects of a minor’s direct messaging interactions. These tools may include the ability to receive notifications when an unapproved contact seeks to initiate communication with a minor, approve or deny such requests, manage lists of approved contacts, disable messaging features, or prevent other users from initiating direct contact with a minor. The statute also prohibits covered platforms from providing direct messaging functionality to users under the age of thirteen.
While the objective of reducing harmful interactions through messaging channels is both important and widely shared, implementing parental approval requirements may present practical challenges for platforms that operate complex messaging systems. Even among services that fall within the statutory definition of a covered platform, messaging functionality often exists within broader interaction environments—including group communications, dynamic contact networks, or messaging embedded within community or gaming features. Designing parental approval mechanisms that function consistently across these varied messaging architectures may therefore present technical and operational challenges for some services.
While parental oversight can play an important role in protecting minors online, messaging risks are often addressed through a combination of complementary safeguards suited to a platform’s structure, functionality, and the risks posed to minors. Depending on the service, these safeguards may include parental controls, default settings that limit unsolicited contact with minors, restrictions on interactions between accounts with large age differences, and risk-based controls matched to the platform’s messaging system and interaction model. Allowing covered platforms flexibility to deploy combinations of these safeguards may help address the risks the statute seeks to mitigate while accommodating differences across covered platforms.
Conclusion
Protecting minors online is an important objective, and National Taxpayers Union appreciates Congress’s efforts to advance solutions addressing emerging risks in digital environments. However, several aspects of the KIDS Act raise questions about how the proposed framework would operate in practice, particularly given the heterogeneity of platform structures, the potential privacy implications, and the operational challenges involved. Key areas include the threshold for determining which services qualify as covered platforms, the interpretation and application of “reasonable policies, practices, and procedures” for addressing harms to minors, and the design and implementation of parental controls for messaging functionality.
Further guidance on these aspects would help reduce uncertainty for covered platforms while supporting the legislation’s objective of improving online safety for minors. National Taxpayers Union appreciates the Committee’s attention to these issues and stands ready to assist as lawmakers continue refining the KIDS Act.
Sincerely,
Ryan Nabil
Director of Technology Policy and Senior Fellow
National Taxpayers Union
122 C St NW, Suite 700
Washington, DC 20001