Standards Against Child Sexual Abuse and Exploitation (CSAE)

I. Prohibited Content and Conduct:

1.  The creation, possession, distribution, promotion, or facilitation of Child Sexual Abuse Material (CSAM) is strictly prohibited. CSAM includes any visual depiction, in any medium (photographs, videos, simulations, or otherwise), that depicts or implies the sexual abuse or exploitation of a minor.
2.  Grooming, online solicitation of minors for sexual purposes, and any other form of communication intended to exploit, abuse, or endanger children are strictly prohibited. This includes, but is not limited to:
    *   Initiating inappropriate contact with minors.
    *   Requesting or sharing personal information that could identify a minor.
    *   Engaging in sexually suggestive conversations with minors.
    *   Offering gifts, money, or other incentives in exchange for sexual acts or images.
3.  Content that sexualizes minors is prohibited, including depictions of minors in a sexual context, in a manner that is sexually suggestive or otherwise inappropriate, or in a manner that endangers their well-being.

II. Age Verification and Safety Measures:

1.  Implement reasonable and appropriate age verification mechanisms to prevent minors from accessing age-restricted content or features.
2.  Provide parental controls and other safety features to empower parents and guardians to manage their children's online activity.
3.  Maintain clear and accessible privacy policies that outline how user data is collected, used, and protected, particularly with regard to minors.

III. Reporting and Response:

1.  Provide clear and accessible in-app reporting mechanisms for users to report suspected cases of CSAE or CSAM.
2.  Establish a dedicated point of contact for receiving and responding to reports of CSAE.
3.  Maintain a documented process for investigating and addressing reported incidents of CSAE.
4.  Cooperate fully with law enforcement and other relevant authorities in investigations related to CSAE and CSAM. This includes promptly reporting suspected CSAM to the National Center for Missing & Exploited Children (NCMEC) or other appropriate authorities.

IV. Content Moderation and Filtering:

1.  Implement robust content moderation and filtering systems to detect and remove CSAM and other harmful content. This may include automated tools, human review, and user flagging mechanisms.
2.  Regularly review and update content moderation policies and procedures to ensure their effectiveness.

V. Training and Awareness:

1.  Provide regular training to staff and moderators on CSAE prevention, detection, and response.
2.  Promote awareness of CSAE issues among users through educational resources and community guidelines.

VI. Transparency and Accountability:

1.  Maintain clear and publicly accessible policies regarding CSAE.
2.  Regularly review and update these standards to reflect evolving best practices and legal requirements.
3.  Maintain records of reported incidents and actions taken.

VII. Legal Compliance:

Comply with all applicable laws and regulations related to child protection, including but not limited to the Children's Online Privacy Protection Act (COPPA), Section 230 of the Communications Decency Act (CDA), and relevant international laws.