Several federal legislative proposals focused on online child safety have been introduced. If enacted, they would modify online providers’ obligations to remove and report child sexual exploitation (CSE) content (including child sexual abuse material (CSAM)) and would require providers to implement notice and takedown mechanisms for certain CSE content. Failure to comply would carry steep penalties and increased risk of liability.

The bills would also increase oversight and regulatory involvement in connection with child safety obligations. This Update explores some of the key themes and trends that appear in the STOP CSAM Act, EARN IT Act, REPORT Act, and END CSAM Act.*

Why This Matters

Under current law, online providers have no affirmative obligation to scan for CSAM, but if they choose to or otherwise learn of CSAM on their service, they must report it in accordance with federal law.

Given these new legislative proposals and the increased scrutiny they signal, online providers with existing user reporting and child safety programs will need to evaluate and prepare for the changes these bills may require to their child safety practices and assess how they can leverage existing processes.

Platforms without robust user reporting mechanisms—or those that do not scan for CSAM and lack procedures to comply with the downstream obligations—will need to evaluate whether to implement them now. For example, reports received through required user reporting mechanisms for CSE content, including CSAM, may make providers aware of this content and trigger resulting obligations.

In terms of regulatory involvement, the STOP CSAM Act would involve the Federal Trade Commission (FTC) in an area well outside its expertise through the creation of a Child Online Protection Board (Board). The FTC is a civil law enforcement and regulatory agency for antitrust and consumer protection issues—typically headed by commissioners who are antitrust lawyers—that has considerable experience in children’s online consumer privacy, advertising, and similar matters, but no meaningful experience with CSAM issues. Yet the bill would vest the FTC with significant authority over online providers and complainants with limited judicial review. What is more, while the Board would ostensibly be independent, it would be subject to the supervision of the FTC chair, a political appointee and designee of the president. In addition, the FTC chair or the full commission may assign the child protection officers duties beyond those laid out in the bill, potentially injecting broader components of the FTC into the Board’s work.

Updates to 18 USC § 2258A: Increased Mandatory Reporting Obligations and Voluntary Reporting Thresholds


STOP CSAM Act

  • Mandates reporting of apparent, planned, or imminent violations of 18 USC §§ 2251, 2252, 2252A, 2252B, or 2260, in addition to apparent CSAM. Reporting planned or imminent violations is optional under existing law.
  • Requires online providers to submit CyberTipline reports to NCMEC as soon as reasonably possible, but no later than 60 days after obtaining actual knowledge of the information that triggers a reporting obligation.
  • Creates a new voluntary standard for reporting by permitting providers to submit a CyberTipline report where they have a “reasonable belief” that facts or circumstances indicating an apparent, planned, or imminent violation of §§ 2251, 2251A, 2252, 2252A, 2252B, or 2260 exist.


EARN IT Act

  • Adds 18 USC § 1591 (minor sex trafficking) and 18 USC § 2422(b) (persuading a minor to engage in prostitution or “any sexual activity for which any person can be charged with a criminal offense”) to the list of enumerated statutes triggering a reporting obligation.
  • Expressly permits online providers to submit CyberTipline reports when they obtain knowledge of any facts or circumstances “sufficient to identify and locate each minor and each involved individual” in connection with an imminent violation of the enumerated CSAM offenses in § 2258A(a)(2)(A).


REPORT Act

  • Similar to the STOP CSAM Act, the REPORT Act would require online providers to report “planned or imminent” violations of enumerated statutes under 18 USC § 2258A(a)(2)(A).
  • Similar to the EARN IT Act, this bill also requires reporting of Section 1591 and Section 2422(b) offenses.

Updates to 18 USC § 2258A: Contents and Submission of CyberTipline Reports

  • The STOP CSAM Act would require online providers to include certain information in CyberTipline reports (to the extent applicable and reasonably available), such as identifying information regarding any individual who is the subject of the report and the terms of service in effect at the time of the violation. Further, for each item of apparent CSAM that is the subject of the report, the provider must indicate whether the apparent CSAM has previously been the subject of a report or of multiple contemporaneous reports due to rapid and widespread distribution.
  • The EARN IT Act permits online providers to include information about the identity and location of any involved minor, including their email, IP address, or URL (including any self-reported identifying information), and requires providers to use “best efforts” to conform to the CyberTipline structure.

Updates to 18 USC § 2258A: Preservation Obligations

  • The REPORT Act and EARN IT Act would require online providers to preserve the contents provided in CyberTipline reports for one year instead of 90 days (under existing law). EARN IT would permit online providers to voluntarily preserve the contents of reports for longer “for the purpose of reducing the proliferation” of online CSE or preventing online CSE.
  • Relatedly, the STOP CSAM Act would add language clarifying that submitting a CyberTipline report does not satisfy the preservation obligation in § 2258A(h).

Updates to 18 USC § 2258A: Transparency Reporting Obligations

  • The STOP CSAM Act would require online providers that meet certain threshold criteria to submit a report to the attorney general and FTC chair, disaggregated by subsidiary, that includes information on CyberTipline reports; data on the notice and action mechanism described below; the measures a provider has in place for handling reports concerning child sexual exploitation and abuse (CSEA), including a summary of actions taken; information on the provider’s CSEA policies, culture of safety on the product or service, and safety by design; and trends and patterns regarding online CSEA.

Notice and Action Mechanisms


STOP CSAM Act

  • Reporting and removal of “proscribed visual depictions relating to children.” If an online provider receives a “complete” notification from a qualified complainant that it is hosting a “proscribed visual depiction relating to a child” (referred to in this Update as a “visual depiction” or “proscribed visual depiction”), it must, as soon as possible but no more than 48 hours (or two business days for “small providers” as defined in the bill) after receiving the notice:
    • Remove the visual depiction and notify the complainant it has done so, or
    • Notify the complainant that it has determined the referenced depiction is not a proscribed visual depiction, that it is unable to remove the visual depiction using reasonable means, or that it has determined the notification is duplicative.


END CSAM Act

  • Reporting and removal of CSAM. “Social media companies” (as defined) are required to provide a notice process that allows individuals to report CSAM and are required to remove that material within 10 days of receiving that notice.

Increased Risk of Liability


EARN IT Act

  • The bill adds a carve-out to online providers’ immunity under Section 230 to permit claims against them for:
    • Civil claims under 18 USC § 2255 where the provider’s underlying conduct violates §§ 2252 or 2252A.
    • Civil or criminal claims regarding the advertisement, promotion, presentation, distribution, or solicitation of CSAM.
  • It also provides limited protections for encryption by shielding providers from specified independent bases of liability for their use of encryption.


STOP CSAM Act

  • STOP CSAM would permit victims to sue interactive computer services and software distribution services under 18 USC § 2255 for any injuries resulting from the services’ (1) intentional, knowing, or reckless promotion or facilitation of a violation of 18 USC §§ 1591, 1594(c), 2251, 2251A, 2252, or 2252A; and (2) intentional, knowing, or reckless hosting or storing of CSAM, or making CSAM available to any person. Similar to EARN IT, this bill provides limited protections for the use of encryption but does not propose any modifications to Section 230.
  • The bill would also create a new statutory section, 18 USC § 2260B, that provides that interactive computer services would be liable when they knowingly (1) host or store child pornography or make it available to any person; or (2) otherwise knowingly promote or facilitate a violation of §§ 2251, 2251A, 2252, 2252A, or 2422(b), with limited statutory defenses.

Regulatory Oversight


STOP CSAM Act

  • This bill would create a “Child Online Protection Board” that would be housed within the FTC. The Board would adjudicate notice and takedown disputes between online providers and complainants. Responding parties have the right to opt out of Board proceedings, which would include discovery and hearings on issues of fact or law and may also include witness testimony. Board determinations are subject to reconsideration, FTC review, and judicial review, subject to the parameters provided in the bill.
  • Complainants may file a petition on several grounds, including that the online provider did not remove the visual depiction within the required timeframe or incorrectly asserted its available defenses. Petitions may also allege that the visual depiction involves recidivist hosting, but petitions cannot be filed for this reason alone.
  • Online providers may file petitions on several grounds, including that the visual depiction is not a proscribed visual depiction relating to a child, that the notification is frivolous or was submitted with an intent to harass, or that the notification was duplicative.


EARN IT Act

  • The bill would establish the National Commission on Online Child Sexual Exploitation Prevention, which would develop best practices that providers may choose to implement to prevent, reduce, and respond to the online sexual exploitation of children, including the enticement, sex trafficking, and sexual abuse of children, and the proliferation of online child sexual abuse material.
    • The commission would be composed of 19 members, including the attorney general or their representative, the secretary of Homeland Security or their representative, and the chairperson of the FTC or their representative.

*This Update does not reflect a comprehensive summary of all obligations. For full details, please contact counsel.

© 2023 Perkins Coie LLP
