Facebook and Instagram to restrict advertisers’ access to teenagers’ data
Facebook and Instagram are to tighten restrictions on the data advertisers can use to target ads at teenage users, the platforms’ parent company, Meta, has said.
From February, advertisers will no longer be able to see a user’s gender or the type of posts they have engaged with as a way of targeting adverts to them. Under the enhanced restrictions, only a user’s age and location will be used to show them advertising, Meta said.
The social media firm also confirmed that new controls would be introduced in March enabling teenagers to go into the settings in both apps and choose to “see less” of certain types of adverts.
Many online safety campaigners say social media platforms need to do more to control the types of advertising shown to younger users, arguing that inappropriate ads can cause as much harm as offensive or abusive content posted by others.
Meta has previously added restrictions that stop advertisers from targeting teenagers with adverts based on their interests and activities, and the company said the latest updates came in response to research on the issue, direct feedback from experts and global regulation.
“As part of our continued work to keep our apps age-appropriate for teens, we’re making further changes to their ad experiences,” Meta said in a blogpost.
“We recognise that teens aren’t necessarily as equipped as adults to make decisions about how their online data is used for advertising, particularly when it comes to showing them products available to purchase.
“For that reason, we’re further restricting the options advertisers have to reach teens, as well as the information we use to show ads to teens.”
This isn’t the first time Meta has been forced to examine its impact on teenage users. In 2020, Irish regulators launched a two-year investigation into whether Instagram had exposed the contact information of underage users by allowing them to publicly display their phone numbers and email addresses when they switched to business accounts. In September 2022, Meta was fined €405m ($402m) for violating the General Data Protection Regulation.
The Facebook whistleblower Frances Haugen also revealed, via the Wall Street Journal in September 2021, that the company’s own research showed its photo-sharing app, Instagram, had a harmful impact on the mental health of teenage girls.
In a blogpost responding to the article, Instagram’s head of public policy, Karina Newton, said the company took the findings seriously but contended that social media wasn’t “inherently good or bad for people”.
“Many find it helpful one day, and problematic the next. What seems to matter most is how people use social media, and their state of mind when they use it.”