By: Jake Sen

While scrolling through Instagram one night, I came across a post that I found deeply troubling. After hesitating, I decided to report it – a few clicks, and it was done. Days later, I received a reply: the post didn’t violate community guidelines and wouldn’t be removed. I was encouraged to block the user instead.
That was it.
If that experience left me feeling dismissed and discouraged, imagine how it feels for a teenager.
Toxic content is everywhere on social media. As parents and educators, we might ignore it, or report it and assume the matter is closed. But for teenagers, it’s rarely that simple – and the reporting systems designed to protect them are a big part of why.

What counts as toxic content?

Broadly speaking, toxic content includes anything that could cause harm – from bullying and abuse to hate speech and misinformation. It can appear not just in posts, accounts, or videos, but also in comments, private messages, and images. Notably, at the time of writing, none of the major platforms – Instagram, TikTok, or Snapchat – offers a single clear definition of “toxic content” in its community guidelines. Instead, each outlines what is and isn’t allowed, while pledging to protect users and provide reporting tools. But reporting, as many have discovered, doesn’t guarantee anything will actually be taken down.

So, how easy is it to report something?

On most platforms, the mechanics of reporting are straightforward enough. A few taps, a selection from a menu, and it’s submitted. You might be prompted to block the account or click “not interested” to stop similar content from appearing in your feed. Simple.

What happens after that, though, is far less clear. Most of us assume that if something obviously breaks a platform’s rules, it’ll be removed. But content containing hate speech or encouraging self-harm often stays up even after being reported. A 2025 study found that major platforms offer very little transparency about what actually happens once a report is submitted, and that silence goes a long way toward explaining why so many people don’t bother reporting in the first place.

Why do teens often fail to report, even when they should?

For teenagers, the barriers go deeper than frustration with the process. Distrust is a big part of it. Research published in 2025 by Bäumler et al. found that many adolescents simply don’t believe reporting will make any difference – they see it as inefficient and superficial. And for those who’ve previously turned to adults for help and felt ignored, that skepticism makes complete sense.
There’s also the social dimension, which is hard to overstate. Teens are navigating intense pressure to fit in, and social media amplifies that pressure considerably. Many worry that reporting a post or comment will make things worse – drawing attention to themselves, inviting retaliation, or damaging their reputation among peers. Staying quiet can feel like the safer option, even when it isn’t. The fear of being doxxed, publicly called out, or flooded with hostile messages is real, and for many teenagers, it’s enough to keep them from speaking up at all.

How can parents and educators help?

Social media can feel like uncharted territory for many parents – the platforms, trends, and language shift constantly, and it’s genuinely difficult to keep up. But one thing that doesn’t change is the value of an open, ongoing conversation. When teenagers feel they have a safe space to talk about what they’re seeing online – good and bad – they’re far more likely to reach out when something goes wrong.
It’s also worth knowing that the platforms themselves have introduced some protections, however imperfect. Profiles for younger users are automatically set to private. On TikTok, users under 16 have more restricted accounts that limit their exposure. Instagram applies similar restrictions to users aged 13 to 15, which can only be lifted with parental permission – and parents can directly supervise their child’s account, monitoring settings, who they’re messaging, and any reports they’ve made. Snapchat has a dedicated Family Center designed to support parents in doing the same.
These tools are a starting point, not a solution. The gap between what platforms promise and what teenagers actually experience when they report toxic content remains wide. Until that changes, the most powerful thing we can offer young people isn’t a reporting button – it’s the confidence that someone they trust will take them seriously when they use it.

Frequently Asked Questions

What is the definition of toxic content?

There isn’t one universal definition, which is part of the problem. Major platforms like Instagram, TikTok, and Snapchat don’t use the term in their community guidelines at all. Academics have approached it from different angles. One useful definition frames toxic content as anything likely to make people want to disengage from, or leave entirely, an online space. That framing is helpful because it shifts the focus from intent to impact: it’s not just about what was meant by a post or comment, but about what it does to the person on the receiving end.

Why don’t social media platforms remove content after it’s been reported?

Moderation is more complicated than most users realize. Platforms receive an enormous volume of reports, and decisions are made by a combination of automated systems and human reviewers – neither of which is perfect. What seems like an obvious violation to the person reporting it may not meet the specific threshold set by a platform’s guidelines. There’s also very little transparency about how these decisions are made, which leaves users in the dark and, understandably, less likely to report in the future.

What should I do if my teenager encounters toxic content online?

Start with the conversation, not the reporting button. When teenagers feel heard and supported at home, they’re more likely to come forward when something online upsets or worries them. From there, you can look at reporting the content together, adjusting their privacy settings, or using the parental supervision tools that platforms now offer, such as Snapchat’s Family Center. The goal isn’t to remove social media from their lives, but to make sure they don’t have to navigate its difficult parts alone.

Image Credit: Yarenci Hdz