Facebook app for kids sparks privacy concerns
A new Facebook chat app designed for kids is raising concern among lawmakers and children’s groups over data privacy and safety.
The app, Messenger Kids, is targeted toward children aged 6-12, who are still too young to use Facebook.
Unveiled Monday, it differs from the existing Facebook Messenger app in key ways.
An account can only be set up by a parent, who must also add any contacts for their child. Facebook also won’t advertise to children within the app or sell any data it collects to third-party advertisers. Children don’t need to set up a Facebook account, and once they turn 13 they won’t be pushed onto the adult app.
Facebook says those safeguards will protect children and give parents more control, but lawmakers are seeking more assurances.
On Thursday, Democratic Sens. Ed Markey (Mass.) and Richard Blumenthal (Conn.) wrote to Facebook CEO Mark Zuckerberg, calling on the company to provide more information about the app.
Markey and Blumenthal expressed concern over Facebook collecting data from kids despite the firm’s promises that it wouldn’t sell any data to third-party advertisers.
“We remain concerned about where sensitive information collected through this app could end up and for what purpose it could be used,” they wrote. “Facebook needs to provide assurances that this ‘walled garden’ service they describe is fully protective of children.”
The lawmakers, both on the Senate Commerce Committee, worry that Facebook can still collect data through the app and share it with other companies it owns, such as WhatsApp and Instagram.
They aren’t alone in their fears.
“It’s not overtly stated, but everything a kid is doing is stored and perhaps used in the development of future projects at Facebook,” says Christine Elgersma, an editor at Common Sense Media, a children’s media ratings and advocacy group.
The privacy policy for Messenger Kids does acknowledge that Facebook may share information it collects from kids within its family of companies.
Elgersma also highlights another issue. She worries that an app targeted at children too young to have a Facebook account — the company’s terms of service only allow those 13 years or older to sign up — is really geared toward pulling them into the company’s other products.
“It seems clear that Facebook is trying to capture a younger audience who go to different apps that aren’t Facebook when they turn 13,” she explained. “The hope is that, because they’re already ensconced in Facebook, they’ll just continue and open an account.”
“This is an attempt to create a feature that will help Facebook win over young people and keep their parents tied to the site,” Jeffrey Chester, the executive director of the Center for Digital Democracy, a privacy and children’s advocacy group, told The New York Times.
“With YouTube monetizing the youngest children, it’s too lucrative a market for Facebook to overlook — plus the company is losing youth market share to Snapchat.”
Facebook does have a problem attracting younger users, who have instead flocked to other social media such as Instagram, which Facebook owns, and Snapchat.
Children’s advocates say that Facebook’s effort to reverse this trend isn’t necessarily a problem, but worry about exposing children to technology at a young age.
Jean Twenge, who has researched and written about the impacts of technology on children and adolescents, told the Financial Times this week that while Messenger Kids seems better for children than many available apps, higher amounts of screen time can still be damaging.
“If it significantly increases the amount of time kids are spending with digital media, it could keep them from more beneficial activities (like in-person social interaction, exercise, or sleeping). Spending more time on screens and less on non-screen activities is linked to unhappiness and depression,” she said.
“Parents should also be on the lookout for cyberbullying — friendships can turn quickly at this age, so even if kids are only communicating with approved peers issues can still crop up.”
Twenge co-authored a study published in November that found teenagers who spent more time on social media were more likely to report being depressed than those who engaged in other activities.
The study argued that this “may account for the increases in depression and suicide” among younger generations.
Facebook says that it consulted with leading child development experts, educators and parents to create a safe app.
In a post explaining Messenger Kids, Antigone Davis, public policy director and global head of safety at Facebook, said the company had considered the effects of screen time as it developed the app and had listened closely to parents on the issue.
“In all of our research, there was one theme that was consistent: parents want to know they’re in control. They want a level of control over their kids’ digital world that is similar to the level they have in the real world,” she wrote.
“And just as they want to say ‘lights out’ at night, they also want to be able to say ‘phones off.’ ”