Child Safety Standards

Last updated: January 27, 2026

Our Commitment

Peek, operated by Peek Soft, is committed to protecting children and preventing child sexual abuse and exploitation (CSAE) on our platform. We take our responsibility to create a safe environment seriously and have zero tolerance for any content, behavior, or activity that sexually exploits, abuses, or endangers children.

This page outlines our standards, policies, and procedures for addressing CSAE and child sexual abuse material (CSAM) in accordance with Google Play's Child Safety Standards policy and applicable laws.

What is CSAE?

Child Sexual Abuse and Exploitation (CSAE) refers to any content or behavior that sexually exploits, abuses, or endangers children. This includes, but is not limited to:

  • Grooming a child for sexual exploitation
  • Sextorting a child
  • Trafficking of a child for sex
  • Any other form of sexual exploitation of a child

What is CSAM?

Child Sexual Abuse Material (CSAM) is illegal and strictly prohibited on Peek. CSAM consists of any visual depiction, including but not limited to photos, videos, and computer-generated imagery, involving the use of a minor engaging in sexually explicit conduct.

Peek has zero tolerance for CSAM. Any such material discovered on our platform will be immediately removed, and we will report it to the appropriate authorities, including the National Center for Missing & Exploited Children (NCMEC) and relevant law enforcement agencies.

Age Restrictions

Peek is strictly for users who are 18 years of age or older. We do not knowingly allow individuals under the age of 18 to use our Service. During account creation, users must verify that they are at least 18 years old. If we become aware that a user is under 18, we will immediately suspend or terminate their account and delete all associated data.

Important: Although Peek is designed exclusively for adults, our commitment to child safety does not depend on the presence or absence of child users. We maintain rigorous standards to prevent CSAE and CSAM in accordance with Google Play's Child Safety Standards policy.

Our Policies and Procedures

Content Moderation

Peek employs automated and manual content moderation systems to detect and prevent CSAE and CSAM. All user-generated content is subject to review, and we use industry-standard tools and technologies to identify potentially harmful material.

Reporting Mechanisms

Users can report concerns about CSAE or CSAM through multiple in-app mechanisms:

  • In-app reporting feature accessible from any screen
  • Direct email to our child safety team (see contact information below)
  • Support channels within the app

All reports are reviewed promptly and confidentially. We take every report seriously and investigate all claims thoroughly.

Response and Action

When we obtain actual knowledge of CSAM or CSAE on our platform, we take immediate action, including:

  • Immediate removal of the content
  • Permanent suspension or termination of the account(s) involved
  • Reporting to NCMEC and relevant law enforcement agencies
  • Preservation of evidence as required by law
  • Cooperation with law enforcement investigations

Compliance with Laws

Peek complies with all applicable child safety laws and regulations, including but not limited to:

  • Laws prohibiting the production, distribution, and possession of CSAM
  • Laws protecting children from sexual exploitation and abuse
  • Mandatory reporting requirements
  • International child protection standards

We work closely with law enforcement agencies and child protection organizations to ensure compliance and to support investigations into CSAE and CSAM.

Education and Prevention

Peek is committed to preventing CSAE through education and awareness. We provide resources and information to help users understand the importance of child safety and recognize potential signs of exploitation. Our team regularly reviews and updates our policies and procedures based on industry best practices and guidance from organizations such as the Tech Coalition.

Child Safety Point of Contact

If you have concerns about child safety, CSAE, or CSAM on Peek, please contact our designated Child Safety team immediately:

Peek Child Safety Team
Email: childsafety@peek.app

For urgent matters involving immediate danger to a child, please contact your local law enforcement agency immediately.

Our Child Safety team is trained to handle reports of CSAE and CSAM and will respond promptly to all inquiries. We maintain confidentiality while ensuring appropriate action is taken in accordance with our policies and applicable laws.

Reporting to Authorities

Peek reports all instances of CSAM to the National Center for Missing & Exploited Children (NCMEC) through their CyberTipline, as required by law. We also cooperate fully with law enforcement investigations and provide necessary information and evidence when legally required.

If you discover CSAM or suspect child exploitation, you can also report directly to NCMEC's CyberTipline or to your local law enforcement agency.

Updates to This Policy

We may update this Child Safety Standards page from time to time to reflect changes in our policies, procedures, or applicable laws. We will notify users of significant changes by updating the "Last updated" date at the top of this page. We encourage you to review this page periodically to stay informed about our commitment to child safety.

Additional Resources

For more information about child safety online, organizations such as the National Center for Missing & Exploited Children (NCMEC) and the Tech Coalition provide resources and guidance.