
These are the dangers of selling your biometric data



Our biometric information, such as an iris scan or fingerprint, consists of unique traits that can serve as a security barrier in various contexts. However, irresponsible or careless use of this data can put us in serious danger, opening the door to fraud and scams.

In February 2024, it was revealed that WorldCoin, a startup based in San Francisco, was undertaking an ambitious initiative to create a universal identification for all of Earth’s inhabitants. Its proposal: offering cryptocurrency payments in exchange for scanning people’s irises, thus creating a global map of human identity.

While the idea of a universal digital identity may sound appealing in an increasingly interconnected world, the reality is that WorldCoin’s proposal presents serious risks and challenges that should not be overlooked. Analyzing WorldCoin’s experience in purchasing biometric information will give us clues about the risks associated with these practices. 

WorldCoin’s procedure: a controversial exchange 

WorldCoin developed a biometric scanning device called the “Orb” that captured images of people’s irises. In exchange for this scan, the company offered users an initial amount of its cryptocurrency, called “WLD”.

The proposal attracted millions of people worldwide, especially in developing countries, where the promise of an economic “reward” could be more enticing. However, the exchange soon raised serious concerns about privacy, security, and the use of biometric data.

But why is it dangerous to sell biometric data? Biometric information, such as an iris scan, is considered highly sensitive. Unlike other personal data, such as a name or address, biometric data is unique and immutable. Once compromised, it can be used to impersonate a person, access confidential information, or even cause physical harm. It can also be sold to large companies that use it for targeted advertising and for influencing consumer habits. Handing over our fingerprint or a scan of our iris is, in effect, like handing out copies of our identity document with no control over where they end up.

In the case of WorldCoin, the sheer volume of biometric information collected and the lack of transparency about its use created a scenario of elevated risk. The company defended itself by stating that the data was securely stored and used solely for identity verification purposes. However, it did not provide specific details about which security measures were in place or how it planned to share or use this data in the future. Nor did it clarify how long it intended to retain the data.

Handing over biometric information facilitates identity theft

Biometric data is unique and immutable, making it an ideal target for cybercriminals. If this data falls into the wrong hands, it could be used to impersonate a person and commit various crimes, such as: 

  • Financial identity theft. With this data, criminals could access bank accounts, credit cards, or even apply for loans in the victim’s name. 
  • Electoral fraud. Biometric data could be used to fraudulently vote in elections or engage in other activities that require identity verification. 
  • Physical crimes. Ultimately, the theft of fingerprints or facial biometric maps paves the way for physical crimes: it could grant access to restricted areas or even allow criminals to impersonate the victim while committing other offenses.

Additional concerns and challenges 

Beyond the inherent risks of selling biometric data, WorldCoin’s initiative presents other challenges, as subtle as they are complex.  

  • Digital exclusion. Relying on a mobile device and internet connection to access the “universal ID” could exclude people in areas with limited access to technology. 
  • Lack of control. Users relinquish control over their biometric data to a private company, with no clear guarantees on how it will be used or protected. 
  • Ethical implications. Creating a global database of biometric information raises questions about mass surveillance, social control, and potential discrimination. 

The Clearview case 

In 2020, another case involving biometrics and the collection and sale of private information harvested from the internet sparked an ethical and legal conflict. Clearview AI, a company based in New York, secretly collected billions of images of people’s faces from photos on social media and other internet sites, without their knowledge or consent. This database was offered to private companies, police forces, and federal agencies so they could track people using facial recognition technology.

Facial recognition technology raises serious privacy and security concerns, especially for vulnerable groups such as survivors of domestic violence and sexual assault, undocumented immigrants, and people of color, given the biases inherent in these technologies. Fighting those biases demands the utmost responsibility.

Furthermore, Clearview sold access to an application that allowed users to upload a photo and receive instant matches, exacerbating the problem of this uncontrolled trade in biometric information. 

The consequences of indiscriminate collection of biometric data 

WorldCoin’s initiative encountered resistance in several countries, which did not hesitate to take action against what they considered reprehensible activity. In Spain, the Spanish Data Protection Agency (AEPD, Agencia Española de Protección de Datos) prohibited the collection of biometric data by the company, threatening to impose sanctions of up to 20 million euros. 

In other countries, such as Kenya and India, authorities expressed concerns about the potential misuse of biometric data and also urged the company to cease its operations. 

WorldCoin’s experience highlights the need for a broader public debate on the collection, use, and sale of biometric data. It is crucial for people to understand the risks and implications of sharing this type of sensitive information. Clear regulations and legal frameworks need to be established to protect the privacy of biometric data and ensure that its use is conducted in a responsible and ethical manner. 

Ultimately, the decision to share or not share biometric data with WorldCoin or any other company lies with each individual. However, it is essential that this decision is made in an informed manner, with full knowledge of the risks and challenges involved. 

At TrustCloud, we advocate for responsible, selective, and informed use of biometric information. We apply these foundational principles across all our solutions, placing respect for individuals’ information at the core of every strategy.
