Our Commitment to Child Safety
HER has a zero-tolerance policy towards Child Sexual Abuse Material (CSAM) and any form of child exploitation. Ensuring a safe, secure, and ethical platform is our highest priority. We implement strict prevention, detection, and reporting measures to swiftly identify and remove offending content while ensuring compliance with global regulations.
Our Three-Pronged Approach to CSAM Prevention
We deploy both automated and manual enforcement mechanisms to detect, remove, and report CSAM effectively:
1. Automated Detection via Cloudflare’s CSAM Detection Tool
We leverage Cloudflare’s CSAM Detection Tool, which continuously scans media uploaded to our platform. The process follows these strict enforcement steps:
- Automated Detection & Isolation:
- If CSAM is detected, the user responsible for the content is immediately suspended from HER.
- The offending content is isolated, preventing any further internal or external access except for designated compliance personnel.
- Compliance & Reporting to NCMEC:
- All relevant user data—including IP address, email, phone number (if available), location, and declared app name—is collected.
- The detected content and all available offender details are submitted to the National Center for Missing & Exploited Children (NCMEC) via its reporting API.
- The offending content is encrypted and stored in a secure, restricted-access S3 bucket to ensure compliance and facilitate necessary follow-ups with authorities.
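The enforcement steps above can be sketched as follows. This is a minimal illustration, not HER's actual implementation: the function, record fields, and payload schema are assumptions, and the real NCMEC CyberTipline API defines its own schema.

```python
from dataclasses import dataclass, asdict
from typing import Optional


@dataclass
class OffenderRecord:
    """Metadata collected about the responsible user (fields illustrative)."""
    user_id: str
    ip_address: str
    email: str
    phone: Optional[str]  # phone number may be unavailable
    location: str
    app_name: str


def build_ncmec_report(offender: OffenderRecord, content_id: str) -> dict:
    """Assemble a report payload for submission to NCMEC's reporting API.

    Field names here are hypothetical; the actual API schema differs.
    """
    return {
        "incident_type": "csam",
        "content_reference": content_id,
        "reporter": {"app_name": offender.app_name},
        # Omit fields that were not available (e.g. a missing phone number).
        "offender": {
            k: v for k, v in asdict(offender).items() if v is not None
        },
    }
```

After building the payload, the service would suspend the user, POST the report to NCMEC, and copy the isolated media into the restricted-access S3 bucket with server-side encryption enabled (for example, boto3's `put_object(..., ServerSideEncryption="aws:kms")`).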
2. Processing Cloudflare Manual Takedown Notices
In addition to automated detection, HER operates a backend monitoring and enforcement service that automatically processes Cloudflare’s manual takedown notices.
- When a Cloudflare takedown notice is received, our backend system automatically verifies and processes the request.
- Offending content is immediately removed, and the corresponding user is suspended following the same isolation and compliance procedures as with automated detection.
- All takedown requests and enforcement actions are logged for auditability and compliance.
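The notice-handling flow above might look like the following sketch. The notice schema and action strings are hypothetical examples; Cloudflare's actual notice delivery format is not shown here.

```python
from typing import Optional, List

# Fields a notice must carry before it is processed automatically
# (hypothetical schema for illustration).
REQUIRED_FIELDS = {"notice_id", "content_url", "reported_at"}


def process_takedown_notice(notice: dict) -> Optional[List[str]]:
    """Validate a takedown notice and return the enforcement actions
    to perform, or None if the notice is malformed and needs manual review.
    """
    if not REQUIRED_FIELDS <= notice.keys():
        return None  # incomplete notice: hold for manual verification
    return [
        f"remove:{notice['content_url']}",             # take the content down
        f"suspend:owner_of:{notice['content_url']}",   # suspend the uploader
        f"log:{notice['notice_id']}",                  # audit-trail entry
    ]
```

Returning an explicit action list keeps every enforcement step loggable, which supports the auditability requirement noted above.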
3. User Reporting & Manual Escalation
Beyond automated detection and takedown processing, HER provides a user-driven reporting system for any content that may have bypassed automated scanning.
- In-App Reporting: Users can report any suspicious content or accounts through our built-in Trust & Safety reporting feature.
- Trust & Safety Review:
- Reports flagged as potential CSAM are escalated to HER’s Head of Trust & Safety for further assessment.
- If the report is verified, the Security Team submits it to NCMEC through the reporting API.
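The escalation path described above can be summarized in a short routing sketch; the category names, queue identifiers, and roles are assumptions for illustration only.

```python
def route_report(category: str, verified_by_tns: bool) -> str:
    """Decide where a user report goes next.

    Reports flagged as potential CSAM go to the Head of Trust & Safety;
    once verified, the Security Team submits them to NCMEC.
    All identifiers here are illustrative.
    """
    if category != "potential_csam":
        return "standard_moderation_queue"
    if not verified_by_tns:
        return "escalate:head_of_trust_and_safety"
    return "submit:ncmec_api"
```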
This three-pronged approach ensures proactive detection, systematic takedown processing, and responsive escalation to eliminate CSAM from our platform.
For any CSAM-related reports or compliance inquiries, please contact: 📧 technical-reports@weareher.com
HER is committed to full legal compliance and cooperation with law enforcement in all CSAM-related investigations.