Is Facebook’s new Instagram For Kids a bad idea? Most people think so

There’s no denying the social media platform Instagram has grown hugely popular since its launch in 2010. From celebrities to everyday citizens, the smartphone app allows people to keep track of each other’s lives by posting, liking and commenting on photos and short videos. 

It’s no wonder the platform has also attracted a large base of users among teens and children, whose lives largely revolve around what their peers are doing – and what their opinions are. But Facebook, which owns Instagram, has been under fire for the surge in younger people joining the platform. It bars people younger than 13 from using it, but it’s easy for children to lie about their age when creating an account, because age is so difficult to verify. Facebook argues most children don’t have identification documents until their mid-teens (although it’s looking into potential ways to verify age, including using artificial intelligence).

That concern sparked the idea for a version of Instagram designed specifically for children younger than 13. 

In explaining its reasoning for its plans to create Instagram For Kids, Facebook claimed that it is better to have dedicated spaces for children to use Instagram than have them lying in order to access the normal version, which lacks parental oversight. In a statement, Facebook said: “We want to improve this situation by delivering experiences that give parents visibility and control over what their kids are doing”. The details of how the parental oversight will work are yet to be formalised, but it’s likely that parents will have full access to the account activity and history, content will be vetted and advertising will be banned. Facebook also said Instagram for Kids is being created with the input of child development and mental health experts as well as privacy advocates to ensure it is as safe and healthy as possible.

But the plans have drawn considerable controversy, with some concerned the new platform will be ineffective at sheltering kids from the unhealthy aspects of social media. In fact, it’s difficult to find any well-known experts or organisations that have publicly supported the idea… except for Facebook itself, of course. 

Let’s break down the main arguments against the new platform.

Lawmakers in the US recently wrote to Facebook to share their disapproval and concerns about the planned app. The letter, signed by attorneys general from across the country, said:

“The attorneys general urge Facebook to abandon these plans. Use of social media can be detrimental to the health and well-being of children, who are not equipped to navigate the challenges of having a social media account. Further, Facebook has historically failed to protect the welfare of children on its platforms. The attorneys general have an interest in protecting our youngest citizens, and Facebook’s plans to create a platform where kids under the age of 13 are encouraged to share content online is contrary to that interest.”

The attorneys general echoed the concerns of psychologists and other health experts about the harm social media can cause to children’s health. They cited research finding that worsening mental distress and increased treatment for mental health issues among youth have paralleled a steep growth in young people’s use of smartphones and social media, as well as a study of 5.4 million children by an online-monitoring company which found “Instagram was frequently flagged for suicidal ideation, depression and body image concerns”.

Another major worry is that Instagram exploits young people’s anxiety about missing out and their craving for peer approval. It’s feared that the more children use Instagram, the more of them will feel compelled to constantly check their devices and share photos with their followers.

Last month, dozens of child safety organisations and experts in the US wrote a separate letter to Facebook in which they expanded on these concerns.

In the letter, they agree that the normal version of Instagram is not safe for children under the age of 13 and that “something must be done to protect the millions of children who have lied about their age to create Instagram accounts”. But they add that launching a version for kids will not solve the problem and will instead “put young users at great risk”. 

“The platform’s relentless focus on appearance, self-presentation, and branding presents challenges to adolescents’ privacy and wellbeing,” their letter read.

“Younger children are even less developmentally equipped to deal with these challenges, as they are learning to navigate social interactions, friendships, and their inner sense of strengths and challenges during this crucial window of development. Moreover, young children are highly persuadable by algorithmic prediction of what they might click on next, and we are very concerned about how automated decision making would determine what children see and experience on a kids’ Instagram platform.”

Another argument made was that children aged between 10 and 12 are unlikely to want to use a “babyish” version that far younger kids use and will be tempted to continue to fake their ages on the original version – especially if they’ve previously been exposed to it.

“The true audience for a kids’ version of Instagram will be much younger children who do not currently have accounts on the platform,” the letter read.

“While collecting valuable family data and cultivating a new generation of Instagram users may be good for Facebook’s bottom line, it will likely increase the use of Instagram by young children who are particularly vulnerable to the platform’s manipulative and exploitative features.”

Then there is the more general concern around the dangers of increased screen time amongst younger people. The experts argue that children do not need yet another digital distraction when excessive screen use is already linked to obesity, lower psychological wellbeing, decreased happiness, lower sleep quality, higher risk of depression and more suicide-related outcomes.

Both the attorneys general’s letter and the experts’ letter pointed to instances where Facebook has failed to protect children’s privacy online. Facebook has faced criticism for inadequately responding to reports of child exploitation (in 2020, Facebook and Instagram reported 20 million child sex abuse images). Leaked documents revealed that Facebook bragged to advertisers about its ability to target teens when their self-esteem was at its lowest, particularly in relation to physical appearance. In 2019, a glitch in Facebook’s Messenger Kids app let thousands of young children bypass parental oversight and chat with strangers.

Given this history, it’s understandable that many people would be sceptical about whether Facebook has children’s interests at heart or is just looking to improve its bottom line.