Instagram introduces Teen Accounts with enhanced parental controls

Faith Alofe

Instagram, one of Meta’s social media apps, has announced a significant overhaul aimed at making its platform safer for teenagers, with new “Teen Accounts” launching in the UK, US, Canada, and Australia.

According to the BBC, the changes, which take effect from Tuesday, promise built-in protections for young users, along with more control and reassurance for parents.

Social media platforms like Instagram have faced increasing pressure globally to shield young people from harmful content.

The move comes amidst growing concerns about online safety, particularly for minors.

The UK’s National Society for the Prevention of Cruelty to Children welcomed Instagram’s new measures but stressed the need for more proactive steps to prevent harmful content from spreading in the first place.

The NSPCC’s online child safety policy manager, Rani Govender, praised the update but noted, “Meta and other social media companies must do more to prevent harmful content and sexual abuse from spreading on their platforms.”

Meta, Instagram’s parent company, described the changes as a “new experience for teens, guided by parents.”

The tech giant said these measures would give parents peace of mind, ensuring that teenagers have the right protections in place.

However, some experts have raised concerns over the effectiveness of these measures.

Ofcom, the UK’s media regulator, highlighted in a report that many parents are hesitant to actively intervene in their children’s online safety.

Senior Meta executive Sir Nick Clegg acknowledged the issue, stating that while controls are available, parents often do not use them.

Ian Russell, whose 14-year-old daughter Molly took her own life after viewing harmful content on Instagram, expressed cautious optimism about the new policies.

“Whether it works or not, we’ll only find out when the measures come into place,” he said, urging Meta to be transparent about the effectiveness of its safety measures.

The new Teen Accounts will primarily affect users aged 13 to 15, with key safety features automatically activated.

These include strict content controls to prevent exposure to harmful material, muted notifications during overnight hours, and private account settings by default.

Teenagers will need to actively approve new followers, and their content will not be visible to non-followers unless they change the settings — a process that now requires parental involvement.

Instagram’s age-verification system will continue to rely heavily on users being honest about their age.

However, Meta plans to roll out artificial intelligence (AI) tools to detect teens using adult accounts and revert them to teen settings.

The UK’s recently passed Online Safety Act requires social media platforms to take concrete steps to protect children, with the risk of hefty fines or potential bans for non-compliance. Social media analyst Matt Navarra warned that while the changes are a step in the right direction, enforcement is key.

“Teens, as we’ve seen throughout history, will find ways to bypass restrictions if they can,” he said, stressing the need for robust safeguards.

As Instagram implements these changes, the platform will be closely watched to see how well it enforces the new measures and whether it can effectively curb harmful content for its youngest users.
