Major Instagram update to give parents more control with new teen privacy settings
Designated accounts will be automatically switched over to the new teen settings and will have strict controls placed on them that only a parent can change.
Teenage Instagram users will get new privacy settings, the platform's parent company Meta has announced in a major update.
It is an attempt by Meta, which also owns WhatsApp and Facebook, to reduce the amount of harmful content seen online by young people.
Instagram allows users aged 13 and over to sign up, but under the privacy changes, all accounts identified as belonging to teenagers will automatically become teen accounts, which will be private by default.
Those accounts can only be messaged and tagged by accounts they follow or are already connected to, and sensitive content settings will be the most restrictive available.
Offensive words and phrases will be filtered out of comments and direct message requests, and teenagers will get notifications telling them to leave the app after 60 minutes of use each day.
Sleep mode will also be turned on between 10pm and 7am, which will mute notifications overnight and send auto-replies to DMs.
Users under 16 years of age will only be able to change the default settings with a parent’s permission.
But 16- and 17-year-olds will be able to turn off the settings without parental permission.
Parents will also get a suite of settings to monitor who their children are engaging with and limit their use of the app.
Meta said it will place the identified users into teen accounts within 60 days in the US, UK, Canada and Australia, and in the European Union later this year.
The rest of the world will see the accounts rolled out from January.
Ofcom, the UK’s communications regulator, called the changes “a step in the right direction” but said platforms will have to do “far more to protect their users, especially children” when the Online Safety Act starts coming into force early next year.
“We won’t hesitate to take action, using the full extent of our enforcement powers, against any companies that come up short,” said Richard Wronka, online safety supervision director at Ofcom.
Meta has faced multiple lawsuits over how young people are treated by its apps, with some claiming the technology is intentionally addictive and harmful.
Others have called on Meta to address how its algorithm works, including Ian Russell, the father of teenager Molly Russell, who died after viewing posts related to suicide, depression and anxiety online.
In the wake of the announcement, Mr Russell, who is chair of trustees at the Molly Rose Foundation, questioned why these steps were not taken sooner.
He added: “The countries of the world are uniting and saying that the platforms haven’t done well enough, and have to do better. That is why these announcements are being made by Meta, because they have to comply with the regulators of the world.
“I think the other side of the coin is, if they can do these measures now – which don’t seem that complicated in many ways – you have to ask why they didn’t take these steps sooner. If they had done, they would have protected far more young people and maybe saved some innocent young lives as well.”
Meta said the new restrictions on accounts are “designed to better support parents, and give them peace of mind that their teens are safe with the right protections in place”.
It also acknowledged that teenagers may try to lie about their age to circumvent restrictions, and said that it is “building technology to proactively find accounts belonging to teens, even if the account lists an adult birthday”.
That technology will begin testing in the US early next year.