Instagram rolls out teen accounts, other privacy changes designed to protect those under 18
Instagram is introducing separate teen accounts for those under 18 as it tries to make the platform safer for children amid a growing backlash against how social media affects young people’s lives.
Beginning Tuesday in the U.S., U.K., Canada and Australia, anyone under 18 who signs up for Instagram will be placed into a teen account, and those with existing accounts will be migrated over the next 60 days. Teens in the European Union will see their accounts adjusted later this year.
Meta Platforms Inc. acknowledges that teenagers may lie about their age and says it will require them to verify their ages in more instances, such as when they try to create a new account with an adult birthdate. The Menlo Park, Calif., company also said it is building technology that proactively finds teen accounts posing as adult accounts and automatically places them into the restricted teen accounts.
The teen accounts will be private by default. Private messages are restricted so teens can only receive them from people they follow or are already connected to. Content deemed “sensitive,” such as videos of people fighting or those promoting cosmetic procedures, will be limited, Meta said.
Teens will also get notifications if they are on Instagram for more than 60 minutes and a “sleep mode” will be enabled that turns off notifications and sends auto-replies to direct messages from 10 p.m. until 7 a.m.
Addressing three concerns based on feedback
While these settings will be turned on for all teens, 16- and 17-year-olds will be able to turn them off. Kids under 16 will need their parents’ permission to do so.
“The three concerns we’re hearing from parents are that their teens are seeing content that they don’t want to see or that they’re getting contacted by people they don’t want to be contacted by or that they’re spending too much time on the app,” said Naomi Gleit, head of product at Meta. “So teen accounts is really focused on addressing those three concerns.”
In the past, Meta’s efforts at addressing teen safety and mental health on its platforms have been met with criticism that the changes don’t go far enough. For instance, while kids will get a notification when they’ve spent 60 minutes on the app, they will be able to bypass it and continue scrolling.
LISTEN | Teens, digital rights activist Zamaan Qureshi on social media challenges — The Current: Should social media come with a warning label? (24:12)
That’s unless the child’s parents turn on “parental supervision” mode, which lets parents limit teens’ time on Instagram to a specific amount, such as 15 minutes.
With the latest changes, Meta is giving parents more options to oversee their kids’ accounts. Those under 16 will need a parent or guardian’s permission to change their settings to less restrictive ones. They can do this by setting up “parental supervision” on their accounts and connecting them to a parent or guardian.
Nick Clegg, Meta’s president of global affairs, said last week that parents don’t use the parental controls the company has introduced in recent years.
Facing lawsuits across North America
Up to 95 per cent of youth ages 13 to 17 in the U.S. report using a social media platform, with more than a third saying they use social media “almost constantly,” according to the Pew Research Center.
U.S. Surgeon General Vivek Murthy said last year that tech companies put too much on parents when it comes to keeping children safe on social media.
“We’re asking parents to manage a technology that’s rapidly evolving that fundamentally changes how their kids think about themselves, how they build friendships, how they experience the world — and technology, by the way, that prior generations never had to manage,” Murthy said in May 2023.
Instagram CEO Adam Mosseri told ABC’s Good Morning America on Tuesday the new changes don’t involve a burdensome requirement for parents.
“We’ve really decided that parents should be our north star,” he said. “They’ve been clear in what they’re most concerned about and we’re trying to proactively address those concerns, without requiring their involvement. But if a parent wants to get involved, we’ve also built some robust tools to allow them to shape the experience into what’s most appropriate for their teen.”
Meta faces lawsuits from dozens of U.S. states that accuse it of harming young people and contributing to the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that addict children to its platforms.
WATCH | Social media companies deny school board suit allegations:
Several Ontario school boards also announced lawsuits this year seeking billions from Meta, Snap Inc. and ByteDance Ltd., which operate the platforms Facebook and Instagram, Snapchat and TikTok, respectively. The boards allege the social media companies “have knowingly and/or negligently disrupted and fundamentally changed the school, learning and teaching climate by creating and sustaining prolific and/or compulsive use of their products by students.”
The claims have not been tested in court.
Top social media platforms, including Facebook, Instagram and TikTok, allow users who are 13 years of age and above to sign up.