
How Can Tech Giants Using Biometric Data Be GDPR & CCPA Compliant?

Humairaa Patel

Shilpi Agarwal, Susanna Raj

Ayomikun Bolaji



The guide provides advice for tech companies on how to become GDPR and CCPA compliant, as well as recommendations on how to further enhance the ethical use of data in line with the 12-pillar framework outlined by DataEthics4All.

The focus is on the use of biometric data by the five largest technology firms: Facebook, Amazon, Apple, Netflix, and Google. The guide aims to inform tech firms of the steps they should take to comply with the GDPR (an EU regulation) and the CCPA (a California state law), and to raise awareness among customers of the rights they are entitled to exercise as data subjects. DataEthics4All recommends tighter rules, particularly in the event of a breach: existing GDPR rules allow firms 72 hours to report a breach, with no rules on subsequent action.

DataEthics4All believes it is important that GDPR rules require firms to issue clear action and notices on what they will do to resolve the breach and/or compensate affected consumers, particularly in the case of biometric data.


Executive Summary.



GDPR is a transnational measure.

GDPR has a Global effect.

Both GDPR and CCPA aim to empower consumers over their own data.

Data encryption is not mandatory under either the GDPR or the CCPA.

The GDPR covers data privacy at the consumer level, including third-party data providers who sell data for marketing purposes, such as market research and mailing lists.

Data breaches and GDPR violations may result in financial penalties of up to four percent of the company’s annual global turnover or €20 million, whichever is higher.

In Europe, violations are considered to begin when a company is found to be engaging in risky data practices.


CCPA is a state law.

The CCPA applies to firms operating in California or using the data of California residents, provided they meet one of the three statutory thresholds.

Both GDPR and CCPA aim to empower consumers over their own data.

For CCPA violations, companies face civil penalties of up to $2,500 per violation, or $7,500 per intentional violation, with no limit on the number of violations that can be assessed.

The CCPA focuses more on first-party data and not necessarily on third-party data providers.

Data encryption is not mandatory under either the GDPR or the CCPA.

In California, incidents are considered to start at the time data is breached.


Readers of this guide will be able to see how the GDPR and CCPA work within the DataEthics4All framework.


Readers will be able to see GDPR and CCPA rules for biometric firms.


The guide includes DataEthics4All’s recommendations on where the law needs to extend further.


Issues with the GDPR and CCPA rules, and where they do not go far enough.


Recent trends in Big Tech firm fines.

Guide Outcomes.

Data Privacy/Security.

How can companies preserve Data Privacy?

GDPR Article 25 places responsibility for data privacy with the data controller. When processing data, the GDPR requires the implementation of appropriate technical and organisational measures, such as pseudonymisation, designed to implement data-protection principles, such as data minimisation, and to integrate the necessary safeguards into data processing.


Ensuring only personal data which is necessary for each specific purpose of the processing is obtained.

Ensuring that, by default, personal data is not made accessible to an indefinite number of natural persons without the individual’s intervention.

Maintaining a record of processing activities under its responsibility.




The pseudonymization and encryption of personal data.


Ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services.


Restore the availability and access to personal data in a timely manner in the event of a physical or technical incident.


A process for regularly testing, assessing, and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.

Security of data processing must be ensured, particularly for biometric data. Once leaked, the associated risk of misuse is extensive, as hackers gain valuable data unique to individuals. Unlike passwords, biometric data cannot be changed, which puts people’s personal identity in constant jeopardy.
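As an illustration of the pseudonymisation Article 25 names, a record’s direct identifier can be replaced with a keyed hash whose key is stored apart from the records. This is a minimal sketch, not a production scheme; the function and field names are hypothetical:

```python
import hashlib
import hmac
import secrets

def pseudonymise(identifier: str, key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (a pseudonym).

    Reversal requires the key, which is held separately from the data set:
    the 'additional information' that pseudonymisation relies on.
    """
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The key must be stored apart from the pseudonymised records.
key = secrets.token_bytes(32)

record = {"subject": "jane.doe@example.com", "template_id": "palm-0042"}
record["subject"] = pseudonymise(record["subject"], key)
```

Because the hash is keyed rather than plain, a leaked data set alone does not let an attacker re-identify subjects by hashing candidate identifiers.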

CCPA Protection

Under the CCPA, consumers have the right to sue over a loss of privacy resulting from a data breach.

Notification of a Personal Data Breach To The Supervisory Authority.

The GDPR allows firms 72 hours after having become aware of a breach to notify the supervisory authority.
In the event of a breach the controller/processor must:


Describe the nature of the personal data breach including, where possible, the categories and approximate number of data subjects concerned and the categories and approximate number of personal data records concerned.


Communicate the name and contact details of the data protection officer or other contact point where more information can be obtained.


Describe the likely consequences of the personal data breach.


Describe the measures taken or proposed to be taken by the controller to address the personal data breach, including, where appropriate, measures to mitigate its possible adverse effects.
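The four required elements above map naturally onto a structured record. A minimal sketch in Python (the class and field names are hypothetical, not drawn from the regulation or any library) that also computes the 72-hour notification deadline:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class BreachNotification:
    """Hypothetical record mirroring the four required notification elements."""
    nature: str                 # nature of the breach, categories, approx. counts
    dpo_contact: str            # data protection officer or other contact point
    likely_consequences: str
    measures_taken: str         # measures taken or proposed, incl. mitigation
    detected_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def notify_by(self) -> datetime:
        """The GDPR allows 72 hours from becoming aware of the breach."""
        return self.detected_at + timedelta(hours=72)

notice = BreachNotification(
    nature="Unauthorised access to ~10,000 biometric templates",
    dpo_contact="dpo@example.com",
    likely_consequences="Identity fraud risk; biometric data cannot be changed",
    measures_taken="Access revoked, keys rotated, affected subjects contacted",
)
```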

Communication Of a Personal Data Breach to The Data Subject.

If the breach results in a high risk to the rights and freedoms of natural persons, the controller must communicate the personal data breach to the data subject without undue delay. The company must describe in clear and plain language the nature of the personal data breach.


Communication to the data subject is not required if any of the following apply:

The controller has implemented data protection measures, such as encryption, which render the personal data unintelligible to any person who is not authorised to access it.

The high risk to data subjects is no longer likely to materialise.

It would involve disproportionate effort, in which case a public communication is relied on instead.

DataEthics4All believes this is an inadequate rule which places a significantly low obligation on firms when data breaches occur. The consequences of data breaches can affect consumers of tech for long periods, and therefore there ought to be stricter rulings beyond mere notification.

First, the 72-hour notice period is unnecessarily lengthy. Technology has developed rapidly, and firms could easily provide notifications within a shorter timeframe.

DataEthics4All recommends a maximum 24-hour notice period. The legal obligation on tech firms using biometric data should extend to informing specific customers how they were directly impacted by the breach, with details of what data was affected and how. The company should also outline what action it is taking to rectify or address the issue, including any compensation offered.

Transfers of personal data to third countries or international organisations have some regulatory restrictions.

A transfer of personal data to a third country or international organisation may take place without specific authorisation if the Commission has decided that the recipient (the third country, a territory or one or more specified sectors within that third country, or the international organisation) ensures an adequate level of protection.

DataEthics4All thinks that this transfer should require authorisation by a supervisory body. This would prevent data from being spread easily to third parties.

Data controllers/processors can still transfer data to a third country without an adequacy decision from the Commission if they have provided appropriate safeguards, and on condition that enforceable data subject rights and effective legal remedies for data subjects are available.

The Facebook–Cambridge Analytica scandal involved allowing third parties to access data via the platform. The tech giant’s failure to maintain the privacy of data and prevent its misuse by third parties highlights the need for stricter legislation when data breaches occur.

Data Processing.

How Can Companies Ensure Fair Data Processing?

GDPR Articles 5 and 6 outline principles firms must adhere to when processing the personal data of customers.

Personal data must be processed:




In a way that ensures appropriate security of the data, including protection against unlawful or unauthorised processing and against accidental loss, destruction, or damage, using appropriate technical or organisational measures.

DataEthics4All recommends taking this further: protection should come with compensatory measures for customers in the event of any unlawful processing or loss of data.

Be collected for specified, explicit, and legitimate purposes, and not be further processed in a manner incompatible with those purposes.

However, the GDPR introduced a limitation on the tech industry. Biometric data under the GDPR falls under a special category of data and is defined as “personal data resulting from specific technical processing relating to the physical, physiological, or behavioural characteristics of a natural person”. Processing of such data is prohibited by default.

DataEthics4All defines Biometric data as the unique, permanent, and collectable measurements of the physical or behavioural characteristics of an individual.

There are, however, a few limited and restrictive exceptions to this rule.



If consent has been given explicitly.

If biometric information is necessary for carrying out obligations of the controller or the data subject in the field of employment, social security, and social protection law.

If it’s essential to protect the individual’s vital interests.

If it’s critical for any legal claims.

If it’s necessary for reasons of public interest in the area of public health.

DataEthics4All recommends that the least intrusive type of data should be used, and that consideration then be given to the security of that data. Firms should choose the option that is least intrusive and most secure.

Before implementing a system that processes biometric data, organisations should assess whether they can rely on an exemption to the GDPR’s prohibition on processing biometric data. Big tech firms fall under the exception where use of the data is within the data subject’s interest, such as protecting personal devices like mobile phones.


When collecting data, the controller must also provide the data subject with the:

Purposes of the processing for which the personal data are intended, as well as the legal basis for the processing.

Identity and the contact details of the controller.

Contact details of the data protection office.

Legitimate interests pursued by the controller or by a third party.

Recipients of the personal data.

The fact that the controller intends to transfer personal data to a third country.

Once the data has been collected, data controllers need to ensure fair and transparent processing and provide the data subject with details of how long the data will be stored. The regulation empowers customers with the right to request from the controller access to their data, its rectification or erasure, restriction of processing concerning the data subject, or to object to processing.

An area which requires greater scrutiny is a change in the purpose of data processing.
DataEthics4All recommends that consumers be given a fixed period of at least one month to consider any such change before it takes effect.

CCPA and Data Privacy.

Overview of Required Notices:

Businesses must provide notice to customers when collecting data in accordance with the CCPA.

Businesses selling personal information must provide a notice with a right to opt out.

Any business providing financial incentives for data sharing must issue a notice in accordance with the CCPA.

Notice At Collection of Personal Information:

Firms must provide customers with a timely notice, either before or at the point of data collection. The notice should be plain and straightforward, avoiding legal jargon, and in a format that makes it readable and accessible to those with disabilities.

In the event a firm is collecting personal information from consumers’ mobile devices for a purpose not reasonably expected, the firm must provide a just-in-time notice.

DataEthics4All recommends the legislation should require firms to provide advance notice so consumers have time to think about whether they wish to provide consent.

Businesses must provide notices of all data and categories they will collect. They must not collect data for any purpose without issuing a notice.

DataEthics4All – the CCPA appears to be very notice-heavy but says little about data minimisation; it appears to allow firms to collect any data so long as they issue a notice to the consumer.

When issuing a notice, firms must provide a link to their privacy policy.

Data Ownership.

How can companies demonstrate clear Data Ownership?


The privacy policy must be posted online through a conspicuous link using the word “privacy” on the business’s website homepage or on the download or landing page of a mobile application.


Right to know about personal information collected, disclosed, or sold

An explanation of the customer’s rights.

Customers hold the right to request deletion of personal information, the right to opt out of the sale of personal information, and the right not to be discriminated against.

Businesses must respond to a consumer’s request within 45 calendar days. They can extend that deadline by a further 45 days (90 days total) if they notify the consumer.
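The deadline arithmetic is straightforward. A small sketch (the helper name is hypothetical, and plain calendar-day counting is assumed):

```python
from datetime import date, timedelta

CCPA_RESPONSE_DAYS = 45  # calendar days; extendable once by a further 45

def ccpa_deadline(received: date, extended: bool = False) -> date:
    """Deadline for responding to a CCPA consumer request."""
    days = CCPA_RESPONSE_DAYS * (2 if extended else 1)
    return received + timedelta(days=days)

received = date(2021, 7, 1)
base = ccpa_deadline(received)                        # 45 calendar days out
with_extension = ccpa_deadline(received, extended=True)  # 90 days, with notice
```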




Netflix has been considering an enhanced use of customer data from its streaming services. It is suspected that this will involve analysis of users’ emotional behaviour to understand their response to films: for instance, whether scenes were replayed or skipped, and how frequently. Owning such data would reveal substantially unique information about specific individuals, which raises concerns about how such vast volumes of data would be secured by the streaming platform.

As per Article 7 of the GDPR, the data controller should be able to demonstrate that the data subject consented to the processing of their data. Consent is a highly contested area, as consent is often obtained easily by tech giants within extensive and lengthy terms and conditions.

The GDPR aims to address this issue. It requires that in the event the data subject’s consent is given in the context of a written declaration concerning other matters, the request for consent shall be presented in a manner which is:

Clearly distinguishable from the other matters.


Using clear and plain language.

Easily accessible.

While the above appears to be a reasonable requirement, if a notice breaches these terms, the only consequence for firms is that the declaration is not binding. DataEthics4All suggests this ought to be taken further, since many consumers of tech are unaware whether a declaration is binding. Instead, if a firm uses a declaration in breach of the regulation, the consequences should be more severe: the data subject should be made aware of the breach, and consent ought to be re-obtained within a certain timeframe, with firms highlighting the change explicitly. This will enhance the accountability of firms and ensure data subjects are fully aware of how their data is being used.

Data subjects also have the right to withdraw consent at any time and firms are required to make this easy for data subjects.

A more thorough assessment needs to be made when considering whether a data subject freely consented to the use of their biometric data. This is imperative because biometric data, unlike other data types, is completely unique to the individual and unchangeable; a breach could therefore have disastrous consequences for customers. Hence, stricter guidelines are needed when analysing whether large tech firms have simple procedures in place for customers to opt out, with fines if they fail to do so.

The CCPA issues specific guidance on how firms must clearly inform consumers how to opt out of data collection initiatives. This includes clear links labelled ‘Do Not Sell My Personal Information’ and instructions on how to submit preferences. The notice must also include a description of the customer’s right to opt out of the sale of their information.

DataEthics4All believes customers should have the option to opt out of data collection practices beyond the sale of information to third parties. Consumer consent is central to any data usage by large tech firms: biometric data may only be used in limited circumstances under the GDPR, and users are entitled to full autonomy over how their data is used.

Data Control.

How can companies be prevented from taking excessive control of Data?

The GDPR’s guidance on data minimisation and storage is one of the ways firms can prevent excessive use and control of biometric data.



Adequate / Relevant / Limited to what is necessary for the purpose

Accurate where necessary

Kept up to date

Kept in a form that allows identification of the customer for no longer than is necessary for the purposes for which the data is processed

The regulation requires the data controller to take every reasonable step to ensure inaccurate data is rectified or erased without delay

CCPA Rights of Data Subjects.

Under the CCPA, consumers have the right to know:

The business purposes for collection

Whether that information is sold, and for what business purpose

The third-party recipients of the data

Consumers also have the right:

To access their personal information

To equal service and price, even if they exercise their privacy rights

To know what information has been collected and the sources from which that data was collected

To know what information is being collected about them

To know if their personal information is sold or disclosed, and to whom

To say ‘no’ to the sale of personal information

Amazon One, Amazon’s new biometric palm-print scanner, allows customers to pay for goods in stores by waving their palms over a scanner. The scanner captures intricate details of the palm, from lines and ridges to vein patterns. Storing such a palm signature indefinitely gives Amazon great control over the data: it is deleted only if the feature goes unused for two years.

DataEthics4All believes this timeframe gives Amazon control over the data for longer than necessary; it should be reduced to a maximum of one year.
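A retention rule like this reduces to a simple expiry check. A sketch assuming the one-year period DataEthics4All recommends (all names here are hypothetical, not Amazon’s implementation):

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical retention rule: delete a biometric template after a period of
# non-use. Amazon One reportedly uses two years; one year is assumed here,
# per the DataEthics4All recommendation.
RETENTION = timedelta(days=365)

def is_expired(last_used: datetime, now: Optional[datetime] = None) -> bool:
    """True once a template has gone unused for longer than the retention period."""
    now = now or datetime.now(timezone.utc)
    return now - last_used > RETENTION
```

A periodic job would scan stored templates with such a predicate and delete those that return true.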


Data Transparency.

How can companies remain transparent and maintain the trust of data subjects?

Article 12 of the GDPR outlines the requirements surrounding transparent information and communication.


Communicate with data subjects in a way that is concise, transparent, intelligible and easily accessible, using clear and plain language.

Provide requested information in writing.

Facilitate the exercise of data subject rights and not refuse to act on the request of the data subject for exercising their rights.

Respond to any request without undue delay and within one month of receipt. The period can be extended by two further months where necessary, taking into account the complexity of the request. The controller shall inform the data subject of any such extension, with reasons.

Handle requests free of charge unless they are unfounded or excessive, in which case the controller may: a) charge a reasonable fee reflecting the administrative costs of providing the information, or b) refuse the request.
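Unlike the CCPA’s fixed 45-day count, this GDPR deadline is expressed in calendar months. A sketch of the month arithmetic (the helper names are hypothetical; end-of-month dates are clamped to the shorter month):

```python
import calendar
from datetime import date

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping to the last day of a shorter month."""
    m = d.month - 1 + months
    year, month = d.year + m // 12, m % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def gdpr_deadline(received: date, extended: bool = False) -> date:
    """One month to respond, extendable by two further months for complex requests."""
    return add_months(received, 3 if extended else 1)
```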

A further way firms can maintain the trust of data subjects is through Article 15 of the GDPR, which grants data subjects a right of access, including the following:


Purposes of the processing and categories of biometric data.

To whom the personal data has been or will be disclosed.

The data subject has the right to lodge a complaint with a supervisory authority.

The envisaged period for which the personal data will be stored.

Where personal data is transferred to a third country or to an international organisation, the data subject has the right to be informed of the appropriate safeguards pursuant to Article 46 relating to the transfer.

Financial Incentive Notices.




A succinct summary of the financial incentive or price or service difference offered


A statement of the consumer’s right to withdraw from the financial incentive at any time and how the consumer may exercise that right


A good-faith estimate of the value of the consumer’s data that forms the basis for offering the financial incentive


Document a reasonable and good faith method for calculating the value of the consumer’s data

Transparency in the biometric sphere is imperative. Data breaches by large tech firms, for instance by Facebook in 2010, inspired mistrust among users. The social media giant began automatically tagging people in photos using a tag-suggestion tool, scanning a person’s face via facial recognition and offering suggestions as to who that person is. The breach went even further, as ghost profiles were created for individuals who did not have Facebook accounts. While the option to turn the setting off was available, users had not consented to activating the feature.

The controversy forced Facebook to introduce the following changes to increase user autonomy.


If you turn your face recognition setting on, we’ll keep your template while your account is active but will delete it if you turn your face recognition setting off

We don’t share your template with anyone else but you

We don’t have any face recognition features that tell strangers who you are

If you’re untagged from a photo or video, we won’t use that photo or video as part of the face recognition template to recognise you

Face recognition is only available to people who are over 18. People under 18 won’t have the face recognition setting


Issues with the GDPR and CCPA.

The GDPR and CCPA rules do not include substantial reference to pillars 3, 8, 9, 10, 11, and 12 of the DataEthics4All framework.

Strengthens the largest players – larger firms are protected from competition, as smaller firms are unable to collect biometric data and ensure its protection through expensive systems. This places greater power with Big Tech firms and strengthens their ability to dominate the market and, ultimately, choose which rules to follow.

The GDPR and the CCPA create risks for identity theft and online fraud – the GDPR and the CCPA purportedly give users the ability to control their data by facilitating user requests. However, they also give hackers and identity thieves an avenue to steal data, because there is no provision for authenticating the user making a request.

The GDPR has not inspired more trust online – users of services provided by FAANG still encounter intrusive pop-ups and disclosures on every digital property they visit. The legislative requirement to obtain consent to use customer data allows tech giants to obtain ever more data on consumers while remaining compliant with legislation. There need to be more legislative restraints on what data can be processed and for what purposes, particularly for biometric data. This would prevent Big Tech firms from obtaining excessive amounts of data to profile individuals, their lifestyles, and their habits.

The GDPR and the CCPA fail to include the role of privacy-enhancing innovation and consumer education in data protection. Simply increasing the number of regulators and regulations governing data does not make users safer. The regulations ought to require Big Tech firms to explain clearly and explicitly not only their use of biometric data, but also how users can protect themselves from becoming victims of data breaches.

Finally, DataEthics4All believes that data encryption should be mandatory under both the GDPR and the CCPA. Without encryption, leaked data can be read and used by hackers, causing long-term damage. Encryption increases the safety of consumer data and prevents it from being used against data subjects.

Recent Fines incurred by FAANG.

Facebook was fined $5bn by the Federal Trade Commission (FTC) after app developers linked to Cambridge Analytica accessed 87 million users’ data without clear consent. The Information Commissioner’s Office (ICO) considered this unfair processing of data, as those who had not downloaded the app, but had friends who did, also had their data shared. The tech giant failed to secure customers’ personal data because it did not carry out suitable checks on third parties using its platform.

The Irish Data Protection Commission has launched an investigation into a Facebook data leak in which the data of 533 million people from 106 countries was published online on a hacking forum in April 2021. The DPC indicated that the evidence suggests more than one provision of the GDPR may have been infringed.

Amazon has been fined $886.6 million for breaking GDPR rules. At present it is unknown exactly what breach occurred; a spokesperson for Amazon claims there has been no breach and that customer data has not been leaked to third parties.

Google was fined £44 million by France’s data regulator, the CNIL, for a ‘lack of transparency, inadequate information and lack of valid consent’ regarding advert personalisation. It was held that people were insufficiently informed about how Google collected data to personalise advertising, and the complainant groups argued that the firm had no valid legal basis to process user data for that purpose.

The fines incurred by major tech firms share an underlying theme: processing data for purposes customers have not consented to, or failing to secure data adequately, resulting in a breach and unlawful third-party access in which the data is then misused.








