Phone app screen showing the Facebook app. Photograph: Chesnot/Getty Images

Children 'interested in' gambling and alcohol, according to Facebook


Exclusive: algorithm may expose hundreds of thousands of under-18s to harmful targeted adverts

Facebook has marked hundreds of thousands of children as “interested in” adverts about gambling and alcohol, a joint investigation by the Guardian and the Danish Broadcasting Corporation has found.

The social network’s advertising tools reveal 740,000 children under the age of 18 are flagged as being interested in gambling, including 130,000 in the UK. Some 940,000 minors – 150,000 of whom are British – are flagged as being interested in alcoholic beverages.

These “interests” are automatically generated by Facebook, based on what it has learned about a user by monitoring their activity on the social network. Advertisers can then use them to specifically target messages to subgroups who have been flagged as interested in the topic.

In a statement, Facebook said: “We don’t allow ads that promote the sale of alcohol or gambling to minors on Facebook and we enforce against this activity when we find it. We also work closely with regulators to provide guidance for marketers to help them reach their audiences effectively and responsibly.”

The company does allow advertisers to specifically target messages to children based on their interest in alcohol or gambling. A Facebook insider gave the example of an anti-gambling service that may want to reach out to children who potentially have a gambling problem and offer them help and support.

But advertisers can target the interests for other purposes as well. The developers of an exploitative video game with profitable “loot box” mechanics, for instance, could target their adverts to children with an interest in gambling without breaching any of Facebook’s regulations.

The presence of automated interests also means that alcohol and gambling advertisers who try to evade Facebook’s rules against advertising to children have an audience pre-selected for them by the social network. Facebook relies primarily on automated review to flag adverts that break its policies, but that review is not guaranteed to catch breaches before the adverts start to run. Facebook recently settled a lawsuit with the financial expert Martin Lewis over the company’s long-running failure to keep his image out of scam adverts.

Facebook’s automatic categorisation of users has been criticised before. In May 2018, the social network was found to be targeting users it thought were interested in subjects such as homosexuality, Islam or liberalism, despite religion, sexuality and political beliefs being explicitly designated “sensitive information” under the EU’s General Data Protection Regulation (GDPR).

A month later, the Guardian and DBC discovered that the social network had algorithmically labelled 65,000 Russians as “interested in treason”, potentially putting them at risk from the repressive state. Facebook removed that label following inquiries from the Guardian.

Facebook’s drive to give advertisers as much targeting information as possible has repeatedly created openings for that data to be misused in ways that are borderline or outright illegal. In March, the US Department of Housing and Urban Development charged Facebook with violating the Fair Housing Act, arguing that the site’s targeting features allowed advertisers to restrict housing adverts in a way that “unlawfully discriminates based on race, colour, national origin, religion, familial status, sex, and disability”. Facebook said at the time that it would work with the department and civil rights groups to improve its ad-targeting systems.
