Political advertising based on the profiling of individuals, using data gathered by internet companies engaged in marketing, has gained prevalence in recent years.[1] This unruly practice has become a source of worry to national and international authorities, with privacy regulators rolling out action plans and directives that seek to regulate the use of profiling for targeting purposes.[2] The targeting of individuals by political groups, potentially as an ingenious means of influencing users through psychological profiles[3] built from harvested personal data, became infamous due to Cambridge Analytica.[4]

The increased monitoring of data harvesting by internet companies resulted from the Cambridge Analytica[5] scandal, which came to light in 2018.[6] The data was collected through a Facebook-based quiz app, ‘thisisyourdigitallife’, created by Aleksandr Kogan, a University of Cambridge psychologist who was granted access to the data of 270,000 Facebook members after they agreed to use the app to take a personality test.[7] Since user interaction on Online Social Networks (OSNs) can correspond to real-life relationships, OSN datasets quickly became a highly sought-after resource for social network research.[8]

The access granted to Cambridge Analytica allowed the data firm to harvest the data of more than 80 million data subjects. The data obtained was allegedly used to psychologically profile US[9] voters and to tailor adverts intended to shape users’ mindsets.[10]

The General Data Protection Regulation 2016/679 (GDPR) is Europe’s response to the need for better protection of people’s data. The GDPR defines profiling as:

“any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”[11].

Automated decision-making, on the other hand, applies to all automated individual decision-making[12], including profiling. Under the provisions of the GDPR,[13] automated decision-making that produces a legal or similarly significant effect is not allowed unless it is done with the individual’s explicit consent (separate from any consent for marketing), is necessary for entering into or performing a contract, or is authorised by EU or Member State law applicable to the data controller. A micro-targeted environment may create an ‘echo chamber’[14] in which people are exposed to regurgitated information and encounter fewer opposing opinions,[15] and it could propagate false stories and conspiracy theories.[16]
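The Article 22 logic described above can be sketched as a simple decision gate. This is a hypothetical illustration only; the function and parameter names are assumptions, not any real compliance API.

```python
# Illustrative sketch of the GDPR Article 22 rule described above:
# automated decisions with legal or similarly significant effects are
# prohibited unless one of three lawful bases applies. All names here
# are hypothetical, not a real compliance API.

def automated_decision_allowed(
    significant_effect: bool,
    explicit_consent: bool,        # consent given separately from marketing consent
    necessary_for_contract: bool,  # needed to enter into or perform a contract
    authorised_by_law: bool,       # authorised by EU or Member State law
) -> bool:
    """Return True if an automated individual decision may proceed."""
    if not significant_effect:
        return True  # Article 22 only restricts decisions with significant effects
    return explicit_consent or necessary_for_contract or authorised_by_law

# A profiling-based political-ad decision resting on none of the three bases:
print(automated_decision_allowed(True, False, False, False))  # False
```

The point of the sketch is that consent for marketing alone never appears as a lawful basis: only the three grounds named in the text unlock a significant-effect decision.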

The GDPR further adds two new categories of sensitive personal data, covering genetic and biometric data. Biometric data includes the physical, physiological or behavioural characteristics of users. In revamping the notion of personal data, the GDPR extends the coverage of its data principles to profiles of individuals built by data processors and controllers. The GSR app created by Kogan was able to build profiles of the persons whose data it had obtained; such data, while not covered under the DPA,[17] is provided for under the GDPR.

The GDPR presents mandatory principles[18] that data controllers and data processors must meet when dealing with user data. In attempting to reach users with relevant content, leading internet companies such as Facebook and Google have maintained systems of intrusive profiling[19] to increase advertising income,[20] advertising being these companies’ primary source of revenue. The GDPR sets out several data principles[21] which must not be breached by the entities subject[22] to it.

The principles are:

  1. Lawfulness[23], fairness and transparency[24];
  2. Purpose limitation[25][26];
  3. Data minimisation[27];
  4. Accuracy[28];
  5. Storage limitation[29];
  6. Integrity and confidentiality[30].

While Facebook tries to comply with the data protection principles, business interests have also led the social media giant to try to kill off privacy regulations in several countries.[31] Throughout the period of the data-handling misdemeanour, Facebook/SCL fell short of complying with five fundamental principles of the GDPR, which are discussed below:

1. Lawfulness[32], fairness and transparency[33]

Facebook was not adequately transparent in explaining to users why a political party or campaign might target them, a failing the ICO also noted. However, we cannot discount the usefulness of political microtargeting, as it can reach citizens who ignore traditional media[34] and interest individuals in politics through tailored messages.[35] Simply put, microtargeting might increase information, interest in politics, and electoral turnout.[36]

In a move that was unfair, unlawful, and against its own code of conduct, Facebook handed out data belonging to data subjects and to the friends of data subjects on its social network, even though those friends had never consented to the use of their data for profiling purposes. In breaching the GDPR in this regard, Facebook would also breach the provisions of the GDPR[37] which stipulate that data subjects have a right to be informed.

2. Integrity and confidentiality[38]

While microtargeting has its advantages,[39] users were subject to the dishonesty of Facebook and its political advertisers, who simply sifted out their targets using Facebook’s microtargeting tools. Individuals must be made aware of a company’s plans for their personal data and must consent, or have the right to opt out where consent is the basis on which their data is held, a standard Facebook and SCL did not meet.[40]

According to Which?,[41] profiling at an individual level and ‘micro-targeting’ create the potential for various types of consumer harm, including financial and non-financial harms such as discriminatory access to information or services, or lower uptake of digital services due to consumer concerns.[42] Individuals’ data entrusted to Facebook was passed to SCL/Cambridge Analytica ostensibly for research purposes; in reality, Cambridge Analytica used the data to demonstrate that it could influence people.

3. Purpose limitation[43]

During the Cambridge Analytica scandal, Facebook would also have been liable for a breach of the purpose limitation principle under the GDPR, since profiling may involve the use of personal data that was obtained for a different purpose. Whether a new purpose is compatible with the original purpose for which the data were obtained will depend on certain factors, including what information the controller initially provided to the data subject.

The GDPR further sets out[44] the conditions for determining whether the purpose limitation principle has been breached.[45] Here, the data were obtained for an academic study, yet they were used to micro-target certain data subjects with political ads. Under the provisions of the GDPR, Facebook and the now-defunct SCL would have been liable for a breach of the purpose limitation principle, as the way the data were used ran against it.

4. Storage limitation[46]

Under the GDPR, the principle of storage limitation provides that companies must delete personal data when it is no longer necessary. SCL had obtained data from Facebook through Kogan, who had collected it for academic purposes, yet the company held onto the data it had collected and the data derived from it. This is inconsistent with the storage limitation principle, as the continued storage of the data was not proportionate to any legitimate reason for retaining it.

Despite this inconsistency, retaining data for profiling can be advantageous for companies, as it allows data processors and controllers to learn more about data subjects.[47] The data here were ostensibly obtained for research purposes; in practice, SCL was able to market the data to its clients and feed data subjects the information its clients wanted them to see.
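The storage limitation principle described above amounts to a retention rule: delete records once the purpose for holding them has lapsed. A minimal sketch follows, assuming a hypothetical `Record` structure with an explicit purpose end date; none of these names come from any real system.

```python
# Hypothetical sketch of a storage-limitation sweep: records whose
# retention purpose has lapsed are deleted rather than kept "just in case".
from dataclasses import dataclass
from datetime import date

@dataclass
class Record:
    subject_id: str
    purpose: str          # e.g. "academic-study"
    purpose_ends: date    # when the stated purpose is fulfilled

def sweep(records: list[Record], today: date) -> list[Record]:
    """Keep only records whose retention purpose is still live."""
    return [r for r in records if r.purpose_ends >= today]

records = [
    Record("alice", "academic-study", date(2015, 12, 31)),  # purpose lapsed
    Record("bob", "active-contract", date(2030, 1, 1)),
]
print(len(sweep(records, date(2018, 3, 17))))  # 1 record retained
```

Holding onto the lapsed record, as SCL did with the research data, is precisely what this rule forbids.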

5. Data minimisation[48]

This brings us to the data minimisation principle, which SCL breached: while Facebook granted Cambridge Analytica access to the personal data of the persons who had been paid to take part in the research, SCL went on to build profiles of persons connected with those original data subjects.

SCL’s accessing of data belonging to persons outside the original research group conflicts with the data minimisation principle under the GDPR. To buttress this, WP29 opines that controllers should be able to justify why they obtain and retain personal data, or should consider using anonymised data for profiling.[49]

The ICO[50] noted that Facebook’s policies permitted third parties to access personal data about users who installed an app, which is inconsistent with the data minimisation principle. A good example is the GSR app: Facebook’s default settings allowed users’ friends’ data to be collected by the app unless those friends had restricted their privacy settings.
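The data minimisation default described above (releasing only the fields a declared purpose requires, and withholding friends’ data unless they opt in) can be sketched as follows; all field and function names here are hypothetical.

```python
# Hypothetical sketch of data minimisation: an app receives only the
# fields necessary for its declared purpose, and friends' data is
# withheld by default. Field and function names are illustrative only.

NECESSARY_FIELDS = {"user_id", "quiz_answers"}  # declared purpose: personality quiz

def minimise(profile: dict, friends: list[dict],
             friends_opted_in: bool = False) -> dict:
    """Strip a data release down to what the declared purpose requires."""
    released = {k: v for k, v in profile.items() if k in NECESSARY_FIELDS}
    # Under minimisation the safe default is NOT to release friends' data.
    released["friends"] = friends if friends_opted_in else []
    return released

profile = {"user_id": "u1", "quiz_answers": [3, 1], "likes": ["page_a"]}
out = minimise(profile, friends=[{"user_id": "u2"}])
print(sorted(out))     # ['friends', 'quiz_answers', 'user_id']
print(out["friends"])  # []
```

The GSR app’s behaviour was the inverse of this default: friends’ data flowed out unless the friends had actively restricted their settings.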

Facebook therefore unlawfully, and in breach of the GDPR,[51] released user data without the data subjects’ consent. The ICO deemed this action unlawful, and the GDPR[52] places an additional burden on Facebook as the data controller in respect of the micro-targeted persons.

Under the provisions of Article 6 of the GDPR, Facebook is therefore accountable for the misuse and abuse of data by third-party contractors or developers to whom it has granted access.

Conclusion

The Cambridge Analytica scandal was a consequence of Facebook’s efforts to offer fully transparent and portable data to selected firms. In light of the scandal, OSNs must limit access to vetted third-party providers who undergo frequent due-diligence checks, to prevent misuse. For now, micro-targeting based on web and purchase behaviour is not going anywhere, as its advantages are key to ensuring that relevant information is served to data subjects.

To prevent breaches of the GDPR by companies like Facebook and Google,[53] measures such as obtaining separate consent for each type of processing, greater transparency around privacy settings, and attention to the design and prominence of privacy notices should be explored. Third-party app developers should also be required to execute binding contracts when dealing with data.

In addressing the problems that may arise from the use of profiling and automated decision-making, the best course is to implement initiatives such as enforcing data protection rules, creating a super-regulator as has been advocated in the UK,[54] allowing users to exercise their rights, and creating a special set of rules for political campaigns.[55]

References

Legislation

General Data Protection Regulation 2016/679

Data Protection Act 1998

Cases           

CNIL v Google LLC [2019] (CNIL restricted committee)

Journals

Allcott H and Gentzkow M, ‘Social Media And Fake News In The 2016 Election’ (2017) 31 Journal of Economic Perspectives

Bond R and others, ‘A 61-Million-Person Experiment In Social Influence And Political Mobilization’ (2012) 489 Nature

Cadwalladr C, ‘Revealed: Facebook’S Global Lobbying Against Data Privacy Laws’ (the Guardian, 2019) <https://www.theguardian.com/technology/2019/mar/02/facebook-global-lobbying-campaign-against-data-privacy-laws-investment> accessed 8 March 2019

Datoo A, ‘Data In The Post-GDPR World’ (2018) 2018 Computer Fraud & Security

Humski L, Pintar D and Vranic M, ‘Analysis Of Facebook Interaction As Basis For Synthetic Expanded Social Graph Generation’ (2019) 7 IEEE Access

Pennycook G and Rand D, ‘Assessing The Effect Of ‘Disputed’ Warnings And Source Salience On Perceptions Of Fake News Accuracy’ [2017] SSRN Electronic Journal

Zuiderveen Borgesius F and others, ‘Online Political Microtargeting: Promises And Threats For Democracy’ (2018) 14 Utrecht Law Review

Books

‘Definition Of ECHO CHAMBER’ (Merriam-webster.com) <https://www.merriam-webster.com/dictionary/echo%20chamber> accessed 13 March 2019

Reports

Information Commissioner’s Office, ‘Democracy Disrupted? Personal Information And Political Influence’ (Information Commissioner’s Office 2018) <https://ico.org.uk/media/2259369/democracy-disrupted-110718.pdf> accessed 11 March 2019

Article 29 Data Protection Working Party, ‘Guidelines On Automated Individual Decision-Making And Profiling For The Purposes Of Regulation 2016/679’ (Article 29 Data Protection Working Party 2018)

House of Commons Digital, Culture, Media and Sport Committee, ‘Disinformation And ‘Fake News’: Final Report’ (House of Commons 2019) <https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/1791/1791.pdf> accessed 12 March 2019

Information Commissioner’s Office, ‘Investigation Into The Use Of Data Analytics In Political Campaigns’ (Information Commissioner’s Office 2019) <https://ico.org.uk/media/action-weve-taken/2260271/investigation-into-the-use-of-data-analytics-in-political-campaigns-final-20181105.pdf> accessed 13 March 2019

EU Parliament, ‘The Use Of Facebook Users’ Data By Cambridge Analytica And The Impact On Data Protection’ (EU 2018)

Websites

European Data Protection Supervisor, ‘Glossary’ (European Data Protection Supervisor, 2019) <https://edps.europa.eu/data-protection/data-protection/glossary/a_en> accessed 13 March 2019

Graham-Harrison E and Cadwalladr C, ‘Revealed: 50 Million Facebook Profiles Harvested For Cambridge Analytica In Major Data Breach’ (the Guardian, 2019) <https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election> accessed 13 March 2019

Halpern S, ‘Cambridge Analytica And The Perils Of Psychographics’ (The New Yorker, 2018) <https://www.newyorker.com/news/news-desk/cambridge-analytica-and-the-perils-of-psychographics> accessed 11 March 2019

Latham A, ‘Cambridge Analytica Scandal: Legitimate Researchers Using Facebook Data Could Be Collateral Damage’ (The Conversation, 2018) <https://theconversation.com/cambridge-analytica-scandal-legitimate-researchers-using-facebook-data-could-be-collateral-damage-93600> accessed 13 March 2019

Manokha I, ‘‘Cambridge Analytica’: Surveillance Is The DNA Of The Platform Economy’ (openDemocracy, 2018) <https://www.opendemocracy.net/en/digitaliberties/cambridge-analytica-surveillance-is-dna-of-platform-economy/> accessed 13 March 2019

Wade M, ‘Psychographics: The Behavioural Analysis That Helped Cambridge Analytica Know Voters’ Minds’ (The Conversation, 2018) <https://theconversation.com/psychographics-the-behavioural-analysis-that-helped-cambridge-analytica-know-voters-minds-93675> accessed 13 March 2019

Blaschke Y, ‘Google And IAB: Knowingly Enabling Intrusive Profiling – Edri’ (EDRi, 2019) <https://edri.org/google-and-iab-knowingly-enabling-intrusive-profiling/> accessed 13 March 2019


[1] Sue Halpern, ‘Cambridge Analytica And The Perils Of Psychographics’ (The New Yorker, 2018) <https://www.newyorker.com/news/news-desk/cambridge-analytica-and-the-perils-of-psychographics> accessed 11 March 2019.

[2] Information Commissioner’s Office, ‘Democracy Disrupted? Personal Information And Political Influence’ (Information Commissioner’s Office 2018) <https://ico.org.uk/media/2259369/democracy-disrupted-110718.pdf> accessed 11 March 2019.

[3] Michael Wade, ‘Psychographics: The Behavioural Analysis That Helped Cambridge Analytica Know Voters’ Minds’ (The Conversation, 2018) <https://theconversation.com/psychographics-the-behavioural-analysis-that-helped-cambridge-analytica-know-voters-minds-93675> accessed 13 March 2019.

[4] Providing the data to Cambridge Analytica was, it seems, against Facebook’s internal code of conduct, but it was not until March 2018 that Aleksandr Kogan, the researcher who obtained the data, was banned by Facebook from the platform.

[5] Cambridge Analytica’s parent company was known as SCL.

[6] Emma Graham-Harrison and Carole Cadwalladr, ‘Revealed: 50 Million Facebook Profiles Harvested For Cambridge Analytica In Major Data Breach’ (the Guardian, 2019) <https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election> accessed 13 March 2019.

[7] Ivan Manokha, ‘‘Cambridge Analytica’: Surveillance Is The DNA Of The Platform Economy’ (openDemocracy, 2018) <https://www.opendemocracy.net/en/digitaliberties/cambridge-analytica-surveillance-is-dna-of-platform-economy/> accessed 13 March 2019.

[8] Luka Humski, Damir Pintar and Mihaela Vranic, ‘Analysis Of Facebook Interaction As Basis For Synthetic Expanded Social Graph Generation’ (2019) 7 IEEE Access.

[9] Voters in the United States of America.

[10] Michael Wade, ‘Psychographics: The Behavioural Analysis That Helped Cambridge Analytica Know Voters’ Minds’ (The Conversation, 2018) <https://theconversation.com/psychographics-the-behavioural-analysis-that-helped-cambridge-analytica-know-voters-minds-93675> accessed 13 March 2019.

[11] General Data Protection Regulation 2016/679 Art 4(4).

[12] An “automated individual decision” is a decision which significantly affects a person and which is based solely on automated processing of personal data in order to evaluate that person. Such an evaluation may relate to different personal aspects, such as performance at work, creditworthiness, reliability, conduct, etc. See ‘Glossary’ (European Data Protection Supervisor, 2019) <https://edps.europa.eu/data-protection/data-protection/glossary/a_en> accessed 13 March 2019.

[13] General Data Protection Regulation 2016/679 Article 22(1)

[14] ‘Definition of Echo Chamber’ (Merriam-webster.com) <https://www.merriam-webster.com/dictionary/echo%20chamber> accessed 13 March 2019.

[15] See for example The Economist, How the World Was Trolled (November 4-10, 2017), Vol. 425, No 9065, pp. 21-24.

[16] Hunt Allcott and Matthew Gentzkow, ‘Social Media And Fake News In The 2016 Election’ (2017) 31 Journal of Economic Perspectives.

[17] Data Protection Act 1998

[18] General Data Protection Regulation 2016/679 Article 5

[19] Yannic Blaschke, ‘Google And IAB: Knowingly Enabling Intrusive Profiling – Edri’ (EDRi, 2019) <https://edri.org/google-and-iab-knowingly-enabling-intrusive-profiling/> accessed 13 March 2019.

[20] Gordon Pennycook and David G. Rand, ‘Assessing The Effect Of ‘Disputed’ Warnings And Source Salience On Perceptions Of Fake News Accuracy’ [2017] SSRN Electronic Journal.

[21] General Data Protection Regulation 2016/679 Article 5

[22] Data Subjects, Data Controllers and Data processors

[23] General Data Protection Regulation 2016/679 Article 5(1)a

[24] You must process personal data lawfully, fairly and in a transparent manner in relation to the data subject: Article 5(1)a

[25] You must only collect personal data for a specific, explicit and legitimate purpose. You must clearly state what this purpose is, and only collect data for as long as necessary to fulfil that purpose: Article 5(1)b

[26] General Data Protection Regulation 2016/679 Article 5(1)b

[27] You must ensure that the personal data you process is adequate, relevant and limited to what is necessary in relation to your processing purpose: General Data Protection Regulation 2016/679 Article 5(1)c

[28] You must take every reasonable step to update or remove data that is inaccurate or incomplete. Individuals have the right to request that you erase or rectify erroneous data that relates to them, and you must do so within a month: General Data Protection Regulation 2016/679 Article 5(1)d

[29] You must delete personal data when you no longer need it. The timescales are in most cases not set; they will depend on your business’s circumstances and the reasons why you collect the data: General Data Protection Regulation 2016/679 Article 5(1)e

[30] General Data Protection Regulation 2016/679 Article 5(1)f

[31] Carole Cadwalladr, ‘Revealed: Facebook’S Global Lobbying Against Data Privacy Laws’ (the Guardian, 2019) <https://www.theguardian.com/technology/2019/mar/02/facebook-global-lobbying-campaign-against-data-privacy-laws-investment> accessed 8 March 2019.

[32] General Data Protection Regulation 2016/679 Article 5(1).

[33] General Data Protection Regulation 2016/679 Article 5(1)a

[34] There is ample evidence that political information reaches segments of the population of lower political interest through incidental exposure, while people are using social media for entertainment purposes. Y. Kim et al., ‘Stumbling upon news on the Internet: Effects of incidental news exposure and relative entertainment use on political engagement’, (2013) 29 Computers in human behavior, no. 6, pp. 2607-2614.

[35] Frederik J. Zuiderveen Borgesius and others, ‘Online Political Microtargeting: Promises And Threats For Democracy’ (2018) 14 Utrecht Law Review.

[36] Robert M. Bond and others, ‘A 61-Million-Person Experiment In Social Influence And Political Mobilization’ (2012) 489 Nature.

[37] General Data Protection Regulation 2016/679 Articles 13(2)(f), 14(2)(g) and 22(1)

[38] General Data Protection Regulation 2016/679 Article 5(1)f

[39] Annabel Latham, ‘Cambridge Analytica Scandal: Legitimate Researchers Using Facebook Data Could Be Collateral Damage’ (The Conversation, 2018) <https://theconversation.com/cambridge-analytica-scandal-legitimate-researchers-using-facebook-data-could-be-collateral-damage-93600> accessed 13 March 2019.

[40] Datoo A, ‘Data In The Post-GDPR World’ (2018) 2018 Computer Fraud & Security

[41] Consumers’ Association

[42] Written evidence from Which? (IRN0116) <http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/communications-committee/the-internet-to-regulate-or-not-to-regulate/written/91639.html> accessed 13 March 2019.

[43] General Data Protection Regulation 2016/679 Article 5(1)b

[44] General Data Protection Regulation 2016/679 Article 6(4)

[45] The factors stated by the GDPR include:

1. The relationship between the purposes for which the data have been collected and the purposes of further processing;

2. The context in which the data were collected and the reasonable expectations of the data subjects as to their further use;

3. The nature of the data;

4. The impact of the further processing on the data subjects; and

5. The safeguards applied by the controller to ensure fair processing and to prevent any undue impact on the data subjects.

[46] General Data Protection Regulation 2016/679 Article 5(1)e

[47] Annabel Latham, ‘Cambridge Analytica Scandal: Legitimate Researchers Using Facebook Data Could Be Collateral Damage’ (The Conversation, 2018) <https://theconversation.com/cambridge-analytica-scandal-legitimate-researchers-using-facebook-data-could-be-collateral-damage-93600> accessed 13 March 2019.

[48] General Data Protection Regulation 2016/679 Article 5(1)c

[49] Article 29 Data Protection Working Party, ‘Guidelines On Automated Individual Decision-Making And Profiling For The Purposes Of Regulation 2016/679’ (2018).

[50] Information Commissioner’s Office, ‘Investigation Into The Use Of Data Analytics In Political Campaigns’ (Information Commissioner’s Office 2019) <https://ico.org.uk/media/action-weve-taken/2260271/investigation-into-the-use-of-data-analytics-in-political-campaigns-final-20181105.pdf> accessed 13 March 2019.

[51] General Data Protection Regulation 2016/679 Article 6(1)a

[52] General Data Protection Regulation 2016/679 Article 6

[53] CNIL v Google LLC [2019] (CNIL restricted committee).

[54] House of Commons Digital, Culture, Media and Sport Committee, ‘Disinformation And ‘Fake News’: Final Report’ (House of Commons 2019) <https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/1791/1791.pdf> accessed 12 March 2019.

[55] EU Parliament, ‘The Use Of Facebook Users’ Data By Cambridge Analytica And The Impact On Data Protection’ (EU 2018).
