Child Safety Standards for Amigo AI

Developer: Amigos AI, Inc.

Effective Date: October 20, 2025

Amigos AI Inc. ("Amigo," "we," "us," or "our") is committed to child safety. These Child Safety Standards describe our commitment to protecting children, our measures to prevent abuse and exploitation, and our detection and reporting mechanisms in connection with our face fusion video chatting mobile application ("App") and related services ("Services").

1. Our Commitment

At Amigo AI (by Amigos AI, Inc.), we are committed to providing a safe, trusted, and inclusive platform for all users. We maintain a zero-tolerance policy toward Child Sexual Abuse and Exploitation (CSAE). Every child deserves protection online, and we actively work to prevent, detect, and respond to such abuse.

2. Scope & Definitions

Child Sexual Abuse and Exploitation (CSAE) includes any behavior or content that exploits or endangers a minor (anyone under 18) for sexual purposes. Examples include grooming, sextortion, trafficking, or solicitation of sexual acts from a minor.

Child Sexual Abuse Material (CSAM) refers to any visual depiction (images, videos, live streams, illustrations, computer-generated imagery) that involves a minor in sexually explicit conduct.

This policy applies to:

  • All users of Amigo AI
  • All content shared, created, or stored through our platform
  • All interactions, messaging, uploads, and other in-app features

3. Prohibited Conduct

We strictly prohibit:

  • The production, distribution, sharing, or possession of CSAE or CSAM
  • Grooming, solicitation, or sexualization of minors for any purpose
  • Any content or behavior that places children at risk of sexual exploitation or abuse
  • Use of Amigo AI's features to facilitate or enable child sexual exploitation

Violations will result in immediate removal of content, suspension or termination of user accounts, and, where required, reporting to relevant law-enforcement authorities.

4. Prevention, Detection & Reporting Mechanisms

In-App Reporting and Feedback: Users can report concerns directly within the Amigo AI app using the “Report User” feature or by reaching out through in-app support, or they may contact us via email.

Automated and Manual Review: We use detection systems and human moderation to identify and act on suspicious behavior or content related to CSAE or CSAM.

Prompt Response: When CSAE or CSAM is detected, we immediately remove the content, suspend the user, and, where appropriate, report it to authorities.

User Education: We provide resources and safety tips to help users recognize and respond to potential exploitation risks.

5. Legal Compliance & Cooperation

Amigo AI complies with all applicable child-safety, data-protection, and exploitation-prevention laws, including mandatory reporting of child sexual abuse material (CSAM) to relevant authorities. We cooperate fully with law enforcement, child-protection agencies, and industry partners to help maintain a safe digital environment.

6. Child Safety Point of Contact

For any child-safety concerns, suspected CSAE or CSAM, or reports of abusive behavior, please contact:

Email: childsafety@amigoai.io

All reports are reviewed promptly and handled confidentially and with the utmost seriousness.

7. Continuous Improvement & Transparency

We recognize that protecting children online is an ongoing responsibility. We commit to:

  • Reviewing this policy regularly and updating it as new risks emerge
  • Improving moderation systems and reporting workflows
  • Maintaining internal accountability for all enforcement actions

8. Related Policies and Resources

For additional information, please see our Terms of Service and Privacy Policy.

By using Amigo AI, you agree to follow these Child Safety Standards in addition to our Terms of Service and Privacy Policy. We thank you for helping us maintain a safe environment for everyone, especially minors.