In response to criticism over children’s safety and new privacy regulations in the United Kingdom, the United States and Europe, Facebook and Instagram continue to roll out a number of platform changes.
As of 23 August 2021, Facebook has placed new restrictions on advertising to users under the age of 18. Advertisers can no longer target under-18s using detailed targeting options such as interests, behaviours and demographics. The restriction extends to custom audiences: youth audiences will no longer be included in website remarketing, page engagement, customer list or lookalike audiences.
Businesses that do need to reach users under 18 can only do so using broader demographic targets, such as age, gender and location. This change applies to ads run on Facebook, Instagram and Messenger.
Why is it changing?
The decision came after the social media giant received criticism over its protection of children using its platforms. The change represents what the platform describes as a ‘precautionary approach’ to advertising aimed at young people.
Over the past twelve months, most social media platforms have been tightening their privacy restrictions in an effort to meet demands for greater online privacy protections for children.
A Brief (Recent) History on Social Media vs. Children Everywhere
Earlier in the year, Facebook came under heavy scrutiny in the United States after leaks revealed plans for a kids’ version of Instagram (similar to YouTube Kids and Messenger Kids), designed specifically for children under the age of thirteen. The plans were quickly met with a proposed update to the U.S.’s children’s privacy law and a call to end the project from more than 40 U.S. state attorneys general.
In an April 2021 letter to Facebook CEO Mark Zuckerberg, U.S. Congress members argued that children are “a uniquely vulnerable population online” and questioned the potential for advertisers to use the platform for influencer or other native ad messaging that children would have difficulty recognising as marketing.
Further, a letter from the National Association of Attorneys General – also addressed to Mark Zuckerberg – stated that “Instagram exploits young people’s fear of missing out and desire for peer approval”. The letter continues:
“Children do not have a developed understanding of privacy…they may not fully appreciate what content is appropriate for them to share with others, the permanency of content they post…and who has access to what they share online.”
Following these protests, Facebook appears not to have taken the criticism lightly: the kids-oriented platform has seemingly come to a halt, and new privacy measures have become a priority.
TikTok and YouTube have already introduced privacy protections for children on their platforms after both were issued multimillion-dollar fines for data exploitation. YouTube now limits features tied to personal data on children’s videos (such as comments, live chat and saving to playlists) and requires content creators to declare whether their videos are made for children, in an effort to limit data collection. TikTok, which has already blocked advertisers from specifically targeting users under 18 in Australia, revealed a further set of changes for teenagers in August, altering direct messaging privacy settings, sharing prompts and default privacy for users under the age of 15.
Instagram has now announced similar changes aimed at giving children a safer, more private experience, including defaulting users under 16 years of age to private accounts and preventing adult users from direct messaging under-18s who don’t follow them.
However, for these initiatives to work, the platforms need users to declare their age – an ironic twist on protecting children’s personal information.
This is where the UK’s Age Appropriate Design Code comes in. The code, issued in 2020 and enforced as of September 2021, requires companies to identify child users and take special care to protect their privacy online. Its main implications for social media are that companies must ensure the default settings on their apps are ‘high privacy’, must not encourage children to provide personal data and must not disclose children’s data unless it’s absolutely necessary.
As a result, Instagram now requires users to verify their age in-app to comply with the code. While new users have been required to enter a date of birth since 2019, existing users will be prompted to verify their age upon opening the app. Users who repeatedly dismiss the prompt may be locked out of their accounts until they comply.
But can’t users just lie about their age anyway?
Across almost all social media platforms – including Facebook, Instagram, YouTube and TikTok – users must be at least 13 years of age, but it’s no secret that users can and do enter false birth dates to gain access. While Instagram and other platforms have techniques in place to flag potential underage accounts for review, the Age Appropriate Design Code calls for stronger measures for legitimate age verification on social media.
Amidst the fallout of these increased privacy measures, this latest update from Facebook is one of the major changes specifically affecting advertisers, rather than users, and businesses should expect further restrictions as privacy matters continue to draw public attention.
The bottom line
It’s important to note that although targeting capabilities have been restricted, Facebook is not changing the way it collects user data. So, while advertisers can’t target a specific segment of youth audiences based on their interests or behaviours, Facebook’s algorithms will still be working hard to decide who to reach based on data about who is most likely to respond to ads – which may mean unchanged, or in some cases even improved, campaign performance.
The new restriction also aligns with the decline of cookie-based targeting, as interest-level audience targeting is slowly losing effectiveness while more users opt out. The change works to Facebook’s benefit, too, as it encourages advertisers to rely more heavily on the platform’s algorithm to reach users – essentially leading advertisers to throw caution to the wind and hope Facebook delivers results.
In terms of impact, businesses looking to reach youth audiences will largely only be losing audience insights: most campaigns will need to separate their broad under-18 audience from their other audience targets, but under-18s certainly remain targetable on platform.
Given the generally lower audience availability for users aged under eighteen, the change may further push advertisers to siphon more budget into other social media platforms, such as Snapchat or TikTok, to achieve a more omnipresent approach to reaching youth audiences.
Need help with ad targeting and management? NOUS can help. Get in contact with us at [email protected] or call us on 07 3003 0722.