"Facebook Ads and Sexuality / Drugs" – Part 2

The results of my investigation into Facebook ads that target Facebook users’ interests in certain sexual practices or in drugs.
Eleven days ago, I published an article about the strange discovery I had just made: Facebook allows advertisers to target users based on their interests – expressed or implied – in certain sexual practices or in drugs.

I reported my discovery to the CNIL (Commission Nationale de l’Informatique et des Libertés – an independent French administrative authority whose mission is to ensure that data privacy law is applied to the collection, storage, and use of personal data), which confirmed my concerns. Sophie Nerbonne, Deputy Director for Legal, IT and International Affairs at the CNIL, said during a telephone interview: «This is a use of sensitive data for advertising purposes. Facebook should have collected the approval of its users before using their sensitive data. And the social network cannot hide behind the terms and conditions that must be approved by any new user who signs up for its services: this document is not sufficient to obtain consent prior to the use of sensitive data.»

The Article 29 Data Protection Working Party (G29), which is made up of a representative from the data protection authority of each EU Member State, the European Data Protection Supervisor and the European Commission, goes even further and recommends banning all advertising that uses sensitive data.

I also contacted the Office of the Data Protection Commissioner (ODPC), which is responsible in Ireland for enforcing and monitoring compliance with data protection legislation. Among European data protection authorities, the ODPC is specifically responsible for the Facebook case, as the social network’s European headquarters is located in Dublin. Catriona Holohan, Press Officer at the ODPC, answered my questions by e-mail: «The issue of how Facebook-Ireland uses information provided by users to target advertisements was raised during our audit of the company. […] The relevant extract from the audit report reads: "[…] Facebook-Ireland undertakes to clarify its policy in this respect, which is to allow targeting on the basis of keywords entered by the advertiser but not allow targeting based upon the described categories of sensitive data."* As indicated in the audit report (page 4), we will be formally reviewing implementation of this and other recommendations in July 2012.»

I also sent my text to the Center for Democracy & Technology (CDT), a Washington, D.C.-based 501(c)(3) non-profit organization that campaigns to enhance free expression and privacy in communications technologies. For Justin Brookman, Director of Consumer Privacy at the CDT, this issue is not clear-cut: «In the US we really don’t have many substantive privacy laws, so unless Facebook affirmatively promises *not* to target based on these sensitive categories, it’s probably legal, at least from a privacy perspective.»
However, Facebook’s Advertising Guidelines state** (the emphases are mine):

«B. Attribution

Ad text may not assert or imply, directly or indirectly, within the ad content or by targeting, a user’s personal characteristics within the following categories:

i. race or ethnic origin;

ii. religion or philosophical belief;

iii. age;

iv. sexual orientation or sexual life;

v. gender identity;

vi. disability or medical condition (including physical or mental health);

vii. financial status or information;

viii. membership in a trade union; and

ix. criminal record.»

For Justin Brookman, the Facebook Advertising Guidelines are ambiguous: «I’m not sure of the details of how the Facebook advertising platform works, but it may be the case that it’s just a platform that Facebook doesn’t pre-screen all the potential categories of advertising — they just organically populate, and they leave it to the developers to *not* target based on potentially offensive categories. I’m sympathetic to the idea that Facebook shouldn’t be legally obligated to pre-screen all potential keywords that might derive — as long as they don’t go out of their way to tell their users that they won’t be advertised based on those categories. However, you could make the argument that by having developer rules in place, FB has some responsibility for enforcing those rules, even absent consumer-facing disclosures.»
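To make the mechanism Brookman describes more concrete, here is a minimal, hypothetical sketch in Python. It is not Facebook’s code and none of the names in it come from Facebook’s systems; it simply assumes, as Brookman suggests, that targeting keywords populate organically from whatever interests users type in, so sensitive terms become targetable unless the platform or the advertiser explicitly filters them out.

# Hypothetical sketch, not Facebook's code: targeting keywords that populate
# "organically" from users' declared interests include sensitive terms unless
# someone explicitly filters them out.

# Interests as typed by fictional users on their profiles.
user_interests = {
    "user_1": {"photography", "bdsm"},
    "user_2": {"hiking", "cannabis"},
    "user_3": {"photography", "cooking"},
}

# The keywords offered to advertisers are simply the union of what users typed;
# nothing is pre-screened.
available_keywords = set().union(*user_interests.values())

# A sensitive-term list the platform *could* apply, but in this sketch does not.
SENSITIVE_TERMS = {"bdsm", "cannabis"}

def audience_for(keyword):
    """Return the users whose declared interests match an advertiser's keyword."""
    return sorted(user for user, interests in user_interests.items()
                  if keyword in interests)

# An advertiser can target a sensitive keyword exactly like any other keyword.
for keyword in sorted(available_keywords):
    flag = "SENSITIVE" if keyword in SENSITIVE_TERMS else "ok"
    print(keyword, "[" + flag + "]", "->", audience_for(keyword))

The SENSITIVE_TERMS set only marks where a filtering step could sit; whether such a filter is applied by the platform or left to advertisers is precisely the ambiguity Brookman points to.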

____

Despite my repeated requests since May 16, 2012, Facebook has not answered my questions about this targeting based on sensitive categories.

Jacques Henno

____

Sources:

* Page 50 of the audit report, available on the ODPC website at http://dataprotection.ie/viewdoc.asp?DocID=1182&m=f

** www.facebook.com/ad_guidelines.php
