The TLDR
The greatest online risk to children is not a stranger in a chat room — it’s a groomer who presents as a peer first. The National Center for Missing & Exploited Children (NCMEC) received over 36 million reports of suspected child sexual exploitation in 2023. Research has found that as many as one in five children receive unwanted sexual solicitation online. The threat isn’t just on sketchy websites — it’s on Discord, TikTok, Roblox, Fortnite, Instagram, and Snapchat. The platforms your kids use every day are where predators operate, because that’s where children are.
The Reality
The Crimes Against Children Research Center data tells a different story than most parents expect:
- Most online exploitation starts with willing initial contact — the child isn’t “tricked” into talking to a stranger. They believe they’re talking to a peer or a friend.
- Grooming is a process, not an event — it takes weeks or months of relationship-building before any exploitation occurs.
- The predator’s profile has changed — it’s not just “creepy old men in basements.” Some offenders are teenagers themselves, and many present as age-appropriate peers online.
- Sextortion targeting boys has exploded — the FBI reported a dramatic increase in financially motivated sextortion targeting teenage boys (ages 14–17), and multiple suicides have been linked to these schemes.
Grooming Mechanics
The Grooming Cycle
Online grooming follows a documented pattern:
Stage 1 — Targeting: The predator identifies a vulnerable child. Indicators they look for: posting about loneliness, family conflict, low self-esteem, or looking for attention. Public social media profiles provide this information freely.
Stage 2 — Trust Building: The predator establishes rapport. They’re interested in the child’s problems. They’re supportive. They share “similar” experiences. On gaming platforms, this often starts with helping the child in-game — giving items, carrying them through difficult content, being the “cool older friend.”
Stage 3 — Filling a Need: The predator becomes an emotional resource the child depends on. For a lonely teenager, having someone who “gets them” and is always available is powerful. The relationship feels real and important to the child.
Stage 4 — Isolation: The predator moves communication to private channels — from a public Discord server to DMs, from Roblox to WhatsApp, from a game to Snapchat. They encourage the child to keep the relationship secret: “Your parents wouldn’t understand.”
Stage 5 — Desensitization: Gradual introduction of sexual content. It starts with “accidental” exposure — sharing adult memes, discussing sexual topics “as friends,” normalizing sexual conversations. The boundary erosion is gradual enough that the child doesn’t recognize the shift.
Stage 6 — Exploitation: Solicitation of images, video calls, or real-world meetings. By this point, the child is emotionally invested and may not recognize what’s happening as exploitation.
How Online Grooming Differs
In-person grooming requires physical proximity and carries higher risk for the predator. Online grooming can happen from anywhere, target multiple children simultaneously, and leaves the predator less exposed to detection. A single predator can groom dozens of children concurrently across different platforms.
Platform-Specific Risks
Discord
Discord is the default social platform for gaming communities and increasingly for general teenage social life. Its structure creates specific risks:
- Server hopping: Large public servers (thousands of members) are easy to join and provide access to children. A predator joins a Minecraft server, participates normally, identifies targets, and moves to DMs.
- DM access: By default, Discord allows DMs from anyone who shares a server with you. A child doesn’t have to accept a friend request to be contacted.
- Lack of age verification: Discord’s Terms of Service require accounts to be 13+. Verification is minimal. Many children under 13 use Discord.
- NSFW servers: These are age-gated, but the verification is self-reported. Exposure to explicit content is common.
Gaming Platforms
- Voice chat on PlayStation, Xbox, and PC games provides real-time verbal communication with strangers. A predator can sound like a teenager.
- In-game gifting is used as a grooming tool — buying a child virtual items creates a reciprocity dynamic.
- Roblox specifically has been flagged by NCMEC and Thorn for high rates of predator-to-child contact due to its enormous under-13 user base and chat functionality.
TikTok
- Algorithmic exposure: TikTok’s algorithm can serve content to audiences the creator didn’t intend — a child’s video can reach adults specifically interested in children’s content.
- Direct messaging: TikTok has restricted DMs for accounts under 16 in some regions, but enforcement depends on the reported age at registration.
- Duets and comments: Even without DMs, predators can make contact through comments and duet features.
Snapchat and Instagram
- Disappearing messages on Snapchat give predators a built-in evidence destruction mechanism.
- Location sharing (Snap Map) can reveal a child’s physical location.
- Instagram DMs from strangers are possible unless the account is set to private — and even private accounts can be followed by creating a convincing fake profile.
CSAM — What Parents Need to Know
Child Sexual Abuse Material (CSAM) is a growing crisis. The NCMEC CyberTipline received 36.2 million reports in 2023.
Solicitation and Production
The most common scenario is not a predator secretly photographing a child. It’s a child being manipulated into producing images themselves — through grooming, through sextortion (threats of exposure), or through social pressure from peers.
AI-Generated CSAM
AI image generation tools have been used to create synthetic CSAM using non-sexual images of real children as reference material. This is a rapidly growing problem that NCMEC and the FBI have flagged as an emerging threat.
What to Do If You Find It
- Do not forward, copy, or share the material — possessing CSAM is a federal crime regardless of intent
- Report immediately to the NCMEC CyberTipline (online or call 1-800-843-5678)
- Report to the platform where it was found
- Contact local law enforcement if a child is in immediate danger
- Preserve evidence — don’t delete messages or conversations (screenshot without the imagery if possible), as these may be needed by law enforcement
Hidden Apps as Evidence
The Hidden Apps Protect guide covers detection. From a threat perspective:
Children may use hidden or disguised apps (calculator vaults, secret chat apps) to hide conversations with groomers. If you discover these on your child’s device:
- Do not immediately confront your child or delete the apps — destroying evidence can compromise a law enforcement investigation
- Document what you find — take notes on app names and approximate dates
- Contact NCMEC or local law enforcement before taking action
- Talk to your child — but prioritize their safety and potential evidence preservation
Identity Theft of Children
A child’s Social Security Number is particularly valuable to identity thieves because:
- Children have clean credit histories (no negative marks)
- The theft may go undetected for years (until the child applies for credit as an adult)
- Synthetic identity fraud using a child’s real SSN is difficult to detect
The FTC recommends checking whether your child has a credit report (they shouldn’t). If a credit report exists, it likely means their SSN has been misused.
What You Can Do
Communication First
Technical controls matter, but the most effective defense is an ongoing, open conversation:
- Make it normal to talk about online interactions — ask about who they play with online as casually as you’d ask about school friends
- Establish the “It’s Not Your Fault” message early — children who believe they’ll be in trouble for what happened are less likely to report
- Teach the pattern — “If someone online asks you to keep a secret from your parents, that’s the signal to tell your parents”
- Discuss sextortion specifically — teens need to know this exists and that they can come to you without judgment if it happens
Platform-Specific Controls
- Discord: Settings → Privacy & Safety → “Allow direct messages from server members” → OFF. Enable the explicit content filter.
- Gaming: Set accounts to friends-only communication. Review friends lists regularly. Consider restricting voice chat.
- TikTok: Enable Restricted Mode. Set account to private. Turn off DMs (or restrict to friends only). Enable Family Pairing.
- Instagram: Set to private. Restrict messaging from non-followers. Enable supervision features.
- Snapchat: Ghost Mode (hide from Snap Map). Restrict who can contact them to “Friends Only.”
Age-Appropriate Access
Not every platform is appropriate for every age:
- Under 13: No social media (COPPA restricts data collection from children under 13, which is why most platforms set a 13+ minimum — though enforcement is weak). Supervised gaming only.
- 13–15: Limited social media with parental oversight. Private accounts only. Friends-only communication settings.
- 16+: Gradually increasing autonomy with ongoing conversation about risks.
If Your Child Is Being Exploited
- Believe them. Your reaction determines whether they’ll keep talking.
- Don’t blame them. The adult is always responsible. Always.
- Preserve evidence. Don’t delete messages or accounts.
- Report to NCMEC CyberTipline: missingkids.org/gethelpnow/cybertipline or 1-800-843-5678
- Contact local law enforcement.
- Get professional help. NCMEC can connect you with victim services.
Sources & Further Reading
- NCMEC CyberTipline — report suspected exploitation
- Crimes Against Children Research Center — academic research on online exploitation
- Thorn: Research on Online Child Sexual Exploitation — technology-focused research and tools
- Internet Watch Foundation — CSAM detection and removal
- FBI IC3: Sextortion Advisory — FBI warnings and resources for sextortion targeting minors
- StopBullying.gov — federal cyberbullying resources
- FTC: Child Identity Theft — protecting children’s financial identity