CSAE Policy

Last Updated: January 5, 2025

This policy explains how Jamme works to keep its platform safe for children by combining strict content rules with technology-driven detection and enforcement.

Policy Rationale

At Jamme, safeguarding children is our highest priority. We uphold a zero-tolerance policy towards Child Sexual Abuse and Exploitation (CSAE) and are deeply committed to providing a safe, inclusive, and secure digital environment. Our policies and practices are designed to detect, prevent, and address any form of CSAE content or behavior.

Commitment to Child Safety

We strictly prohibit any content or actions that exploit or endanger children. Any indications of child exploitation are promptly reported to the appropriate authorities, including the National Center for Missing and Exploited Children (NCMEC) at https://report.cybertip.org/reporting, in accordance with applicable laws. We also remove content involving children, even when shared with good intentions, to mitigate potential misuse and safeguard against harm.

Prohibited Content and Activities

We do not tolerate the following:

Child Sexual Exploitation

  • Content depicting, promoting, supporting, or facilitating the sexual exploitation of children.
  • Sexualized depictions of children, whether real, fictional, or AI-generated.
  • Mocking or identifying victims of child exploitation.

Solicitation of Sexual Activity

  • Soliciting sexual content or activity involving children.
  • Distributing child sexual abuse material (CSAM) or sexualized depictions of children.

Inappropriate Interactions with Children

  • Attempts to arrange or facilitate sexual encounters with children.
  • Engaging children in sexual conversations or obtaining explicit content from them.

Exploitative Intimate Imagery and Sextortion

  • Coercing or threatening children to obtain intimate imagery or sexual favors.
  • Sharing or threatening to share private sexual content involving children.

Sexualization of Children

  • Depictions of children in sexualized contexts, including nudity, explicit poses, or costumes designed to sexualize.

Child Nudity

  • Content portraying child nudity, such as visible genitalia, anus, or uncovered female nipples (except in breastfeeding contexts). Exceptions are made for recognized works of art, health-related materials, or cultural representations with proper contextual consideration.

Age Verification Options

We’re developing new ways to help ensure the right experience for every age group on Jamme, starting with users in India. If someone tries to update their date of birth from under 18 to 18 or older, we’ll guide them through an age verification process. Users can choose one of three options: upload an ID, capture a video selfie, or ask mutual friends to vouch for their age.

Enforcement and Safety Measures

To protect children, Jamme employs a comprehensive set of tools and measures:

Content Moderation

  • AI-Based Moderation: Advanced AI models detect inappropriate audio, text, and image content in real time.
  • Manual Review: A dedicated moderation team ensures accurate enforcement of guidelines.
  • Proactive Monitoring: AI systems monitor user interactions to surface potential violations before they are reported. A rough sketch of how these layers can be combined follows this list.
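
As a rough illustration of how automated detection and manual review can be layered, the sketch below scores content with an automated classifier, acts immediately on clear violations, and routes uncertain cases to human moderators. The `Classifier` interface, the thresholds, and the decision names are assumptions made for this example, not Jamme’s actual system.

```typescript
// Hypothetical moderation pipeline sketch; the classifier, thresholds,
// and actions are illustrative placeholders, not Jamme's actual system.

type Decision = "remove_and_report" | "human_review" | "allow";

// Assume an AI model returns a probability that content violates policy.
interface Classifier {
  score(content: string): number; // 0.0 (benign) .. 1.0 (violating)
}

function moderate(content: string, model: Classifier): Decision {
  const risk = model.score(content);
  if (risk >= 0.95) return "remove_and_report"; // high confidence: act immediately
  if (risk >= 0.5) return "human_review";       // uncertain: queue for moderators
  return "allow";                               // low risk: no action
}
```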

Reporting Mechanisms

  • In-App Reporting: Users can report offensive content directly via the “Report” button.
  • 24/7 Moderation Support: A dedicated helpline addresses CSAE concerns promptly (https://jamme.app/).
  • Immediate Actions: Flagged content is removed swiftly, offending accounts are suspended or banned, and relevant authorities are notified when necessary.

Cooperation with Authorities

Jamme collaborates with law enforcement and child protection organizations, including reporting suspected CSAE to the National Center for Missing and Exploited Children (NCMEC) and other appropriate authorities in accordance with applicable laws.

Preventive Measures

  • Age Verification: AI-driven verification ensures users meet age requirements; users under 18 have restricted access to high-risk features.
  • Content Filters: Keywords, phrases, and audio cues associated with inappropriate content are automatically flagged and blocked.
  • Safety Made Simple: The app determines a user’s age from their date of birth (DOB) and restricts access for those under 18, creating a safe and secure digital space (see the sketch after this list).
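
For illustration, here is a minimal sketch of DOB-based age gating of the kind described above. The function names, the 18-year threshold on high-risk features, and the gating logic itself are assumptions for this example, not Jamme’s actual implementation.

```typescript
// Hypothetical sketch of DOB-based age gating; not Jamme's actual code.

// Compute a user's age in whole years from their date of birth.
function computeAge(dob: Date, now: Date = new Date()): number {
  let age = now.getFullYear() - dob.getFullYear();
  // Subtract one year if this year's birthday hasn't happened yet.
  const birthdayPassed =
    now.getMonth() > dob.getMonth() ||
    (now.getMonth() === dob.getMonth() && now.getDate() >= dob.getDate());
  if (!birthdayPassed) age -= 1;
  return age;
}

// Gate high-risk features behind an 18+ check (threshold assumed for illustration).
function canAccessHighRiskFeatures(dob: Date): boolean {
  return computeAge(dob) >= 18;
}

// Example: a user born 2010-06-15 is under 18, so access is denied.
console.log(canAccessHighRiskFeatures(new Date("2010-06-15"))); // false
```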

Educational Resources and Awareness

Jamme promotes education and awareness as key tools against CSAE.

Contextual Considerations

Certain content, such as imagery of child nudity in humanitarian reporting, is evaluated carefully to balance awareness and protection. Non-sexual depictions may be allowed under strict guidelines, with sensitivity labels applied as needed.

Vision for a Safe Platform

Jamme is dedicated to fostering authentic voice-based connections while prioritizing child safety. By leveraging cutting-edge technology, partnering with experts, and empowering our community to report abuse, we aim to create a secure environment where minors are protected from harm, exploitation, and abuse.
