Juno Child Safety Policy
Date of Last Revision: December 2025
Juno is committed to creating a safe, respectful, and inclusive online environment for all users, especially minors. We maintain a zero-tolerance policy towards any form of child sexual abuse, exploitation, endangerment, or sexualization. Any content, activity, or individual found to violate this standard will be subject to permanent removal and potential reporting to the relevant authorities.
1. Age Restriction, Eligibility, and Age Verification
Minimum Age Requirement & Applicability: Juno is intended exclusively for users aged 18 and above. You must be at least 18 years old to create an account and use Juno. Individuals under the age of 18 ("minors") are not permitted to access or use this platform.
Prohibition of Minors & Impersonation: Minors are strictly prohibited from appearing on the Juno platform in any capacity. It is likewise forbidden to submit false age information, impersonate an adult, or create an account on a minor's behalf to give them access to the platform. Such actions will result in immediate account suspension or permanent termination.
Age Verification & Monitoring: Juno employs a combination of automated systems and manual reviews to identify underage usage, potential Child Sexual Abuse and Exploitation (CSAE) risks, and other policy violations. Any attempt to circumvent these systems and verification measures constitutes a severe violation of our policy.
2. Prohibited Conduct and Content
All users are prohibited from engaging in or facilitating any of the following activities involving minors (anyone under 18):
2.1 Child Sexual Abuse and Exploitation (CSAE)
CSAE refers to any act or material that sexually exploits, abuses, or endangers a child. The following is strictly prohibited on Juno:
- Sharing, soliciting, producing, or distributing material of a sexual or abusive nature involving minors.
- Engaging in sexualized conversations, discussions, or "role-play" involving minors, even in the absence of explicit imagery or direct contact.
- Using gestures, symbols, emojis, virtual filters, attire, or any other feature to depict, reference, or target minors in a sexual or suggestive manner.
- Attempting to approach, groom, solicit, or entice minors through the application.
2.2 Child Sexual Abuse Material (CSAM)
CSAM includes any visual, textual, or AI-generated depiction of a minor in a sexual context, whether it depicts a real person or is entirely synthetic. The following is strictly prohibited:
- Uploading, storing, distributing, or linking to any form of CSAM.
- Sharing AI-generated, deepfake, or otherwise synthetically created content that sexualizes minors.
- Sharing, producing, or disseminating any form of child nudity or sexualized content, including artistic or digital depictions such as cartoons, drawings, or animations.
2.3 Other Harmful or Dangerous Acts
Including but not limited to:
- Threatening, psychologically manipulating, bullying, or harassing minors.
- Inciting or depicting physical violence, neglect, abandonment, or trafficking against minors.
- Soliciting or coaxing minors into disclosing personally identifiable information, or sharing such information about minors.
Such content or behavior will result in immediate content removal and permanent account suspension; severe cases will be referred directly to law enforcement.
3. Safety Identification, Monitoring, and Review Mechanisms
Juno employs a multi-layered, practical approach to proactively identify and prevent harm:
3.1 Automated Screening and Detection
- Technical scanning of user-uploaded images, video streams, and profiles to identify potentially inappropriate content involving minors.
- Screening of keywords and patterns in chat messages, profiles, and posts to identify grooming, inappropriate, or harmful language.
- Triggering safety alerts based on user behavior analysis (e.g., abnormal login patterns, frequent changes to identity information, associations with known risk accounts).
3.2 Human Review and Auditing
- Our safety team conducts manual reviews of all content flagged by automated systems or reported by users.
- Random sampling audits of public platform content are conducted regularly to ensure enforcement of safety standards.
- Enhanced monitoring and review of accounts with a confirmed history of child safety violations.
- Upon verification of violations, immediate actions such as removal and suspension are taken, with decisions on escalation to authorities made according to internal guidelines and legal requirements.
4. Platform Safety and Reporting Features
To ensure community safety and enhance user vigilance, Juno provides accessible and prominent reporting channels:
4.1 Dedicated Reporting Channels
Priority Reporting Category: Within the in-app reporting menu, “Underage” is set as the primary and prominently displayed reporting category for easy identification and selection.
4.2 In-App Reporting Procedure
We strongly encourage every user to act responsibly and report any harmful or suspicious behavior. Core Reporting Flow:
- Click the “Report” button within a chat window, user profile, or app settings.
- Select “Underage” as the primary category.
- Provide a detailed description and, where possible, attach evidence such as screenshots, chat logs, or timestamps to support our investigation.
Direct Contact: Users may also contact our safety team directly via email at service@juno.center.
4.3 External Reporting Resources
If you encounter content related to child sexual abuse outside the Juno platform (e.g., on other websites or via shared links), we recommend reporting it directly to the following specialized organizations, which can coordinate global law enforcement action:
- The National Center for Missing & Exploited Children (NCMEC), which can forward reports to the appropriate authorities around the world.
- Your local hotline, reachable via the INHOPE network.
5. User Awareness, Transparency, and Education
5.1 Raising Safety Awareness
- Policy Accessibility: Banners on the app launch screen and in other prominent locations provide direct links to this policy, ensuring all users are informed of our safety standards.
- Proactive Reminders: Warnings may appear during content creation activities (e.g., initiating a video chat, updating a profile) to remind users of child protection rules.
5.2 Preventative Education
Beyond post-incident action, Juno is committed to preventing harm through education:
- Conducting safety-themed campaigns and displaying educational banners within the app to promote child safety principles and raise awareness of prohibited behaviors.
- Regularly updating and promoting community safety guidelines to align with evolving global standards and best practices for child online protection.
6. Violation Penalties and Accountability
6.1 Prohibited Persons
Any individual known to have been convicted of crimes against children (including but not limited to sexual abuse, exploitation, violence, or trafficking) is strictly prohibited from using Juno. Any such individual who is discovered will be permanently removed from the platform.
6.2 Law Enforcement and Cooperation
- Platform Penalties: Juno takes swift and severe action against any violation of this policy, including but not limited to content removal and permanent account suspension.
- Legal Reporting: All verified serious violations, particularly those involving CSAM, will be reported in accordance with the law to the National Center for Missing & Exploited Children (NCMEC) and/or other relevant law enforcement authorities.
- Cooperation with Investigations: Juno will fully cooperate, as required by law, with investigations conducted by law enforcement agencies worldwide.
7. Our Ongoing Commitment to Safety
Protecting children online is a continuous and dynamic responsibility. Juno is committed to continuously investing resources to evaluate and improve our safety policies, technological protections, review processes, and educational initiatives. We aim not only to deter violations through enforcement but also to foster a culture of safety awareness, collective responsibility, and shared vigilance throughout our community, maintaining a safer digital space for everyone.