Meta has announced a major safety update for Instagram, introducing PG-13-style content filters that will apply automatically to all teen accounts. The move marks one of Meta’s most significant steps yet in protecting young users from exposure to explicit or harmful material online.

The company confirmed that the update will make Instagram feel like “watching a PG-13 movie,” where mildly suggestive or humorous content might appear, but mature, violent, or explicit posts will remain out of reach. Teens under 18 will be placed in the 13+ setting automatically and will need parental approval to change it.

Why Meta Chose the PG-13 Standard

Meta explained that it chose the PG-13 framework because the rating is already familiar to parents worldwide. The company compared its content categories with film ratings and refined its filters to exclude anything containing strong language, adult themes, or risky behaviour.

Under these new rules, posts referencing alcohol, drugs, or suggestive acts will no longer appear in teens’ feeds or Explore pages. Meta shared results from an Ipsos survey showing that 95 percent of parents in the US believe the new controls will create safer experiences for their children online, while 90 percent said the update would make Instagram easier to monitor.

Introducing the ‘Limited Content’ Option

For families wanting stricter supervision, Meta is rolling out an additional setting called Limited Content. This feature takes control a step further by blocking more content categories, turning off comments, and filtering AI-driven interactions to avoid any inappropriate subjects.

According to Meta, nearly 96 percent of surveyed parents supported having an extra layer of restriction, even if they didn’t plan to activate it immediately.

Strengthening AI and Moderation Systems

To enforce these filters, Meta has upgraded its artificial intelligence systems to automatically detect and hide content that violates the new standards. Teenage users will no longer be able to interact with or follow accounts sharing mature or violent material.

Instagram will also prevent searches for banned terms, including misspelled versions of words such as “alcohol” or “gore.” If restricted links are shared in private messages, the platform will block access to them.
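
Meta has not disclosed how its search blocking actually works, but the behaviour described here, catching misspelled variants of terms such as “alcohol” or “gore,” can be illustrated with a simple similarity check. The sketch below is purely hypothetical: the BLOCKED_TERMS list, the is_blocked function, and the 0.8 similarity threshold are illustrative assumptions, not Meta’s implementation.

```python
import difflib

# Hypothetical blocklist, for illustration only; Meta has not published its term list.
BLOCKED_TERMS = {"alcohol", "gore"}

def is_blocked(query: str, threshold: float = 0.8) -> bool:
    """Return True if the search query closely resembles a blocked term.

    A similarity ratio is used instead of exact matching so that common
    misspellings (e.g. "alcohl", "goore") are still caught.
    """
    normalized = query.strip().lower()
    for term in BLOCKED_TERMS:
        if difflib.SequenceMatcher(None, normalized, term).ratio() >= threshold:
            return True
    return False

print(is_blocked("alcohol"))   # True
print(is_blocked("alcohl"))    # True  (misspelling still matches)
print(is_blocked("holiday"))   # False
```

The point of the example is simply that an exact-match blocklist would miss misspelled searches; any real system would rely on far more sophisticated matching and many additional signals.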

Parental Involvement and Feedback

Meta said the changes were designed with feedback from parents and educators globally. The company analysed over three million content ratings to define what qualifies as “age-appropriate.” Parents will also get new tools to report any posts they believe should be hidden from teenagers, ensuring ongoing participation in improving the system.

Internal data from Meta’s pilot tests showed that fewer than two percent of posts visible to teenagers were rated inappropriate by most parents.

Global Rollout and Future Plans

The PG-13-based filters are rolling out first across the US, UK, Canada, and Australia, with full implementation expected by the end of the year. A global expansion is scheduled for 2026, including safeguards for users who may have falsely registered as adults.

“These updates reflect our commitment to helping teens have safer, more positive online experiences,” Meta stated, adding that similar age-based protections will soon be introduced for Facebook users as well.

