Facebook denies it manipulates vulnerable teens for advertising
A leaked document obtained by an Australian newspaper claims that Facebook was telling advertisers how to target emotionally vulnerable teens.
The power Facebook holds over its users has been brought into question once again after The Australian newspaper claimed last week to have documents showing that the social network was exploiting teens for the benefit of advertisers.
According to these internal documents, Facebook was helping advertisers target emotionally vulnerable teens by using its algorithms to monitor their moods.
This included Facebook’s algorithms monitoring teenagers’ activity and comments for words or phrases that might indicate stress or anxiety.
For users as young as 14, Facebook noted when a teenager used words such as ‘defeated’, ‘anxious’, ‘failure’ or ‘stupid’, and would then flag these users to advertisers.
So far, these documents tie such activities only to users in Australia and New Zealand, with two Facebook executives based in Australia – David Fernandez and Andy Sinn – having written the report.
Immediately following the revelation, Facebook issued an apology for its activities, but indicated it was willing to challenge the newspaper legally over its decision to publish the documents.
In the days since, however, Facebook has changed its stance, now saying that the article in question was misleading and denying the claim that it was working with advertisers on such a programme.
“Facebook does not offer tools to target people based on their emotional state,” the company said in a statement.
“The analysis done by an Australian researcher was intended to help marketers understand how people express themselves on Facebook. It was never used to target ads and was based on data that was anonymous and aggregated.
“Facebook has an established process to review the research we perform. This research did not follow that process, and we are reviewing the details to correct the oversight.”
For some, this proved to be familiar reading: three years ago, Facebook was found to have subjected 700,000 unsuspecting users to a social experiment that attempted to influence their moods.