CHILD SEXUAL ABUSE MATERIAL (CSAM) POLICY

Last updated: Nov 24, 2024

The company maintains a strict zero-tolerance policy toward any content involving the sexualization, sexual exploitation, or endangerment of children (Child Sexual Abuse Material, or CSAM). This includes, without limitation, any media, text, video, illustration, or computer-generated image that is illegal, obscene, offensive, harmful, or otherwise inappropriate. We are committed to ensuring that our platform is not used to share, view, or consume any such content, and we will promptly remove and report any violation we discover or are made aware of. Under this policy, a child is any person under the age of eighteen (18) or under the age of majority in the relevant jurisdiction, whichever is greater.

CSAM and Prohibited Content

Exploitation of Children

It is strictly prohibited to upload, distribute, or otherwise engage with any content involving child pornography, sexual depictions of individuals who appear to be minors (regardless of their actual age), or any content that presents a person as under 18 years of age through the use of scripts, makeup, behavior, costumes, settings, or props. This prohibition extends to animated and cartoon images.

Distribution, Acquisition, and Possession of Child Pornography

The distribution, dissemination, transmission, uploading, or provision of access to any child pornographic content on the platform is strictly prohibited. Child pornography is defined as any pornographic content involving:

  • Sexual acts performed by a person under the age of 18.
  • Depictions of a child under the age of 18 in a state of undress in a sexually provocative pose.
  • Sexually provocative depictions of a child’s genitals or buttocks.

Any attempt to acquire or possess child pornography, whether the depicted acts are real or realistic, with the intent to provide it to others is likewise prohibited.

Reporting Mechanisms

If any user encounters CSAM on our platform, they must immediately report it using our dedicated content removal request form. Reports can be made anonymously and will be kept confidential.

Our moderation team will promptly review all reports and take appropriate action, including removing the content and reporting it to the relevant authorities.

Enforcement Measures

We have strict policies, operational mechanisms, and technologies in place to combat CSAM and take swift action against it. When we identify or learn of an actual or potential instance of CSAM on the platform, we remove the content, investigate, and report any material identified as CSAM. We also cooperate with law enforcement investigations and respond promptly to valid legal requests in order to help combat the spread of CSAM on our platform.

In conjunction with our team of human moderators and regular audits of our platform, we also rely on innovative, industry-standard technical tools to help identify, report, and remove CSAM and other types of illegal content from our platform. We use automated detection technologies as additional layers of protection to keep CSAM off our platform.

These technologies include:

  • CSAI Match - YouTube's tool for identifying known videos containing child sexual abuse imagery.
  • PhotoDNA - Microsoft's tool for detecting and removing known child sexual abuse images.
  • Safer - Thorn's comprehensive CSAM detection tool, which protects platforms from abusive material.
  • NCMEC Hash Sharing - NCMEC's database of known CSAM hashes, including hashes submitted by individuals who have fingerprinted their own content through NCMEC's service.

Together, these tools are fundamental to our overall fight against the spread of CSAM on our platform, and to our mission to assist in the industry's collective effort to eradicate the horrific global crime of online child sexual exploitation and abuse.
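At their core, the hash-sharing tools listed above compare a fingerprint of each upload against a database of fingerprints of known illegal material. The sketch below illustrates that lookup pattern in simplified form; all names and data are hypothetical, and real systems such as PhotoDNA use perceptual hashes that survive re-encoding, not a cryptographic hash like SHA-256, which only matches byte-identical files.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of a file's bytes as lowercase hex."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes, blocklist: set[str]) -> bool:
    """Flag an upload if its fingerprint appears in the shared blocklist.

    Illustrative only: production systems use perceptual hashing so that
    resized or re-encoded copies of known material still match.
    """
    return sha256_hex(data) in blocklist

# Harmless placeholder data standing in for a shared hash database:
blocklist = {sha256_hex(b"known bad file")}

print(is_flagged(b"example upload", blocklist))   # → False (no match)
print(is_flagged(b"known bad file", blocklist))   # → True (match found)
```

In a real deployment, a match would trigger removal, preservation for evidence, and a report to NCMEC rather than a simple boolean result.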

Consequences of Violations

Any agency, model, or subscriber found to be uploading, distributing, or sharing CSAM will have the content removed immediately and their account permanently banned from our platform.

We will report all instances of apparent CSAM and fully cooperate with law enforcement in investigations, providing any information that may aid in the prosecution of offenders.

Legal Compliance and Limitations

The company operates in compliance with applicable laws and regulations. Despite our best efforts, no system is perfect, and we cannot guarantee that all CSAM will be detected and removed. We rely on the vigilance and shared responsibility of our users to report any CSAM they encounter.

We reserve the right to update this policy as necessary to adapt to changing threats and best practices in combating CSAM.

Support Resources

The company is committed to playing an active role in combating child sexual exploitation and abuse. We will continue to invest in and improve our policies, technologies, and partnerships to help create a safer online environment for all users, especially children. If you have any information related to CSAM on our platform, please report it immediately using our content removal request form.

If you have any questions, concerns, or feedback regarding this policy, please contact us at [email protected].