The UK’s Information Commissioner’s Office (ICO) has fined Clearview AI £7.5 million (just under $10 million) for illegally collecting and using millions of people’s biometric data.

The fine, announced in May 2022, is among the more significant penalties issued by the ICO and highlights the importance of personal privacy protections.

This article will go over what happened and the implications of the ICO’s decision.

UK fines Clearview just under $10M for privacy breaches

Clearview AI is a US technology company, founded in 2017, that is best known for its facial recognition software. The company uses an algorithm to match human faces against images collected from social media and other online sources.

The US-based company has been controversial because it allegedly scrapes billions of images from across the internet without users’ consent, creating a database of biometric information that can be used to monitor individuals’ movements and activities. Although Clearview AI claims its software is used only by law enforcement, it has reportedly been sold to hundreds of organisations, including military agencies, banks, retailers and universities.

The UK’s Information Commissioner’s Office (ICO) opened an investigation into Clearview AI for collecting millions of people’s biometric data without their knowledge or consent. In May 2022, the ICO announced that it had fined Clearview AI £7.5 million for breaching UK data protection laws, including by:

failing to use the information of people in the UK in a way that is fair and transparent;

failing to have a lawful reason for collecting people’s information;

failing to have a process in place to stop the data being retained indefinitely;

failing to meet the higher data protection standards required for biometric data, which is classed as ‘special category data’; and

asking for additional personal information, including photos, when members of the public asked whether they were on the company’s database.

Background

The UK’s Information Commissioner’s Office (ICO) fined Clearview AI £7.5 million after the company was found to have illegally collected and used millions of people’s biometric data.

The case originated when press reports in January 2020 sparked an investigation by the ICO into the facial recognition company. The probe found that Clearview AI had violated British data protection laws and caused distress to the individuals whose data was collected without their knowledge.

What is biometric data?

Biometric data is data derived from a person’s physical or behavioural characteristics, such as fingerprints and facial features, that can be used to identify, authenticate or track that individual. Biometric data can also be used to verify and authenticate an individual’s identity in identity management systems.


Biometric technology has become increasingly popular because it can enhance security and access control through fast, efficient identification of individuals. As a result, it is increasingly used in systems that require a high level of security or convenience, such as access control, financial transactions and passport/visa applications.

Many types of biometric data may be collected, including fingerprints, facial recognition, iris scans, signature verification, hand geometry, eye movement tracking, voice recognition, gait analysis, skeletal structure analysis, ear prints and palm prints. All of these are characteristics unique to each person. When captured accurately, stored securely and transmitted over encrypted channels, they can be used for verification with a high degree of accuracy.
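To make that verification step concrete, here is a minimal sketch that compares two hypothetical face-embedding vectors using cosine similarity. The vectors, the threshold and the numpy-based approach are illustrative assumptions only; real systems derive embeddings from a trained face-recognition model and tune the match threshold per deployment.

```python
# Illustrative sketch of one-to-one biometric verification.
# The embedding values and threshold below are hypothetical.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Return the cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical enrolled template and a freshly captured probe embedding.
enrolled_template = np.array([0.12, 0.87, 0.45, 0.33])
probe_embedding = np.array([0.10, 0.85, 0.50, 0.30])

MATCH_THRESHOLD = 0.95  # assumed value; tuned per use case in practice

score = cosine_similarity(enrolled_template, probe_embedding)
print(f"similarity={score:.3f}", "match" if score >= MATCH_THRESHOLD else "no match")
```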

As biometrics has been adopted across industries, concerns have grown about how this type of personal data is collected and handled, particularly where users have not consented to how organisations use their information. Organisations collecting biometric data must ensure the collection process complies with applicable privacy laws in their jurisdiction; failure to do so can result in legal action such as fines or other penalties.

What is the UK’s Information Commissioner’s Office (ICO)?

The UK’s Information Commissioner’s Office (ICO) is the UK’s independent body responsible for upholding the UK GDPR, the Data Protection Act 2018 and other data protection laws. It promotes openness, transparency and privacy by setting standards for data controllers and processors. It also investigates reports of breaches of the law and carries out enforcement action where appropriate.

The ICO is also responsible for raising awareness of individuals’ rights and organisations’ responsibilities under data protection legislation, advising individuals on their rights and keeping a public register of organisations that process personal data. In addition, the ICO can issue fines and enforce compliance with the UK GDPR, including investigating potential personal data breaches by businesses or organisations in the UK.


Under the UK GDPR, the ICO can issue fines of up to £17.5 million or four percent of an organisation’s annual worldwide turnover, whichever is higher.
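As a minimal worked example of that cap, the sketch below computes the maximum possible fine for a hypothetical turnover figure; the numbers are illustrative and have nothing to do with Clearview AI’s actual finances.

```python
# Minimal sketch: the UK GDPR cap is the higher of a fixed amount (£17.5m)
# or 4% of annual worldwide turnover. The turnover figure below is hypothetical.
FIXED_CAP_GBP = 17_500_000
TURNOVER_RATE = 0.04

def max_fine(annual_worldwide_turnover_gbp: float) -> float:
    """Return the statutory maximum fine for a given turnover."""
    return max(FIXED_CAP_GBP, TURNOVER_RATE * annual_worldwide_turnover_gbp)

print(max_fine(600_000_000))  # e.g. a £600m-turnover firm -> £24m cap
```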

In this case, it fined Clearview AI £7.5 million for illegally collecting and using millions of people’s biometric data, a decision that reflects its commitment to protecting individuals’ privacy rights under UK law.

Clearview AI’s Breach

The UK’s Information Commissioner’s Office (ICO) has recently fined Clearview AI—a facial recognition software company—just under $10 million for illegally collecting and using millions of people’s biometric data without their knowledge or consent.

This is a substantial fine for privacy breaches and a good reminder of the importance of data privacy and security in the digital age.

How did Clearview AI breach UK privacy laws?

Clearview AI is a US-based facial recognition company that collects and uses biometric data without the knowledge or consent of the individuals involved. The UK’s Information Commissioner’s Office (ICO) fined Clearview AI £7.5 million for illegally harvesting and using the biometric data of millions of people in the UK.

The ICO found that Clearview AI illegally collected, shared and used billions of images scraped from public websites, including social media sites such as Twitter, YouTube and Venmo, as well as news portals and government sites. These images were used to generate biometric templates that could later be matched against a new photograph to identify an individual. This breached GDPR principles because people were not given adequate information about how their data was being used, nor asked for permission for it to be collected and processed in this way.
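To show what this one-to-many matching looks like in principle, here is a hedged sketch that searches a small, made-up database of templates for the entry closest to a probe embedding. Every name and vector is hypothetical; real deployments index billions of templates with approximate nearest-neighbour search rather than a linear scan.

```python
# Illustrative sketch of one-to-many identification against stored templates.
# All identifiers and vectors are hypothetical.
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

template_db = {  # hypothetical enrolled templates
    "person_a": np.array([0.11, 0.80, 0.40]),
    "person_b": np.array([0.90, 0.10, 0.20]),
}
probe = np.array([0.10, 0.82, 0.43])  # embedding from a later photograph

best_id, best_score = max(
    ((pid, cosine(tpl, probe)) for pid, tpl in template_db.items()),
    key=lambda item: item[1],
)
print(best_id, round(best_score, 3))
```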

Furthermore, the number of individuals affected could not be known precisely, and data subjects had no practical way to access, correct or delete their personal data once it had been transmitted across multiple jurisdictions. Alongside the fine, the ICO issued an enforcement notice ordering the company to stop obtaining and using the personal data of UK residents and to delete it from its systems. The ICO also set out recommendations on organisational measures for any company operating similar facial recognition technology, including being transparent about its practices upfront, explaining how the technology works and which datasets are used, and conducting due diligence checks on acquisitions involving large quantities of personal data from third countries.

What was the fine imposed on Clearview AI?

In May 2022, the UK Information Commissioner’s Office (ICO) imposed a fine of £7.5 million on Clearview AI for illegally collecting and using millions of people’s biometric data. This is one of the larger fines issued by the ICO and responds to Clearview AI’s violations of the UK GDPR and the Data Protection Act 2018.

Clearview AI is an American facial recognition start-up that collects and stores billions of images scraped from social media sites such as Facebook, YouTube and Venmo, and other websites. The ICO found that, under the UK GDPR and the Data Protection Act 2018, Clearview AI collected personal information without legal authority, processed it in a way that did not meet the standards set out in these laws, and failed to provide adequate security measures to protect this data from misuse or unauthorised access.

The ICO found that Clearview AI failed to comply with its obligations under the UK GDPR by processing ‘special category’ personal data without an appropriate legal basis; failing to meet its obligations regarding transparency; failing to establish an effective accountability process; and failing to protect personal data through appropriate security measures, including encryption, pseudonymisation or secure deletion of unused data.
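As one hedged illustration of the kind of safeguard mentioned above, the sketch below pseudonymises a direct identifier with a keyed hash (HMAC-SHA256) so records can still be linked without storing the raw value. The key, identifier and approach are assumptions for demonstration, not a description of Clearview AI’s systems or of any specific regulatory requirement.

```python
# Minimal sketch of pseudonymisation via keyed hashing.
# The key and identifier below are hypothetical; in practice the key would be
# held in a separate, access-controlled store.
import hashlib
import hmac

SECRET_KEY = b"example-key-held-separately"

def pseudonymise(identifier: str) -> str:
    """Return a stable pseudonym for an identifier using HMAC-SHA256."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

print(pseudonymise("jane.doe@example.com"))
```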

The fine signals the importance of protecting personal data and taking proper care when handling sensitive information such as facial recognition data. This case highlights how easily consumer privacy can be violated if companies do not ensure compliance with relevant laws when dealing with sensitive user information.

Implications

The £7.5 million fine issued by the UK’s Information Commissioner’s Office (ICO) to Clearview AI for illegally collecting and using millions of people’s biometric data has far-reaching implications.


It not only shows the ICO’s growing willingness to take action against tech companies, it also serves as a warning to them to adhere to data privacy laws.

This article will dive into the implications of the fine.

What implications does this have for data privacy in the UK?

The UK Information Commissioner’s Office (ICO) has recently fined Clearview AI, the facial recognition technology firm, £7.5 million for illegally collecting and using millions of people’s biometric data without their knowledge or consent. This landmark legal action is a stark reminder of the consequences of failing to comply with data protection and privacy laws.

The troubling case surrounding Clearview AI highlights the need for organisations to gather, retain and securely handle personal data in compliance with the UK General Data Protection Regulation (UK GDPR) and other relevant legislation in the United Kingdom. In this instance, Clearview was found to have collected more than 3 billion images from social media platforms such as Facebook, YouTube and Venmo without appropriate legal authority or user consent, an action that falls well short of the compliance requirements established under the GDPR.

This case further emphasises the importance of ensuring that organisations have satisfactory processes covering their use of people’s data, from collection through to destruction. The fine imposed by the ICO should act as a warning for all organisations handling personal data; it is now more pertinent than ever for companies to ensure that their practices are up to date with current digital privacy requirements and standards. Data breaches can have serious implications for companies; the monetary and reputational damage caused by protracted legal proceedings over GDPR non-compliance can be difficult to recover from.

The severity of this breach sends a strong message to businesses worldwide: take your responsibility to protect customers’ data seriously or face severe penalties if caught breaking the stringent rules enforced by authorities such as the ICO in the United Kingdom.

What other implications does this have?

The UK’s Information Commissioner’s Office (ICO) has fined Clearview AI £7.5 million for illegally collecting and using millions of people’s biometric data and exposing them to facial recognition technology without their knowledge, a decision that drew attention worldwide amid tightening privacy regulation. It sets a strong precedent that firms must abide by the rules on data collection and usage if they are to avoid hefty fines.

Not only does this action send a clear warning to companies worldwide to be careful when collecting personal data, it also raises questions about the increasing use of facial recognition software, particularly when used without consent or knowledge.

Additionally, this case could lead to greater public scrutiny of organisations that handle large quantities of personal information, as well as debate over how strict regulations should be to protect citizens’ rights. It also serves as an important reminder for industry stakeholders, corporations and governments alike, that robust safeguards should be in place so that individuals’ private information is not violated and biometric authentication technologies are not misused.

Ultimately, this ruling may encourage countries worldwide to take extra precautions when collecting biometric data from citizens, giving other nations an opportunity to study best practices from the UK’s case against Clearview AI and possibly create their own privacy legislation.

