Under the General Data Protection Regulation (GDPR), which applies in Spain, biometric data are personal data relating to the physical, physiological or behavioural characteristics of an individual that allow or confirm the unique identification of that individual. Such data include facial features, voice, fingerprints and also the iris of the eye. The iris, due to its characteristics, is among the most distinctive and stable biometric identifiers over time.
Once the iris image is captured with high-resolution infrared cameras, it is processed to locate structures such as the pupil, the iris boundary, the eyelids and the eyelashes, and to extract the iris patterns. This information is then "encoded" to create the iris code. The process is often compared to a cryptographic hash function, a technique that converts a set of data into a fixed string of characters. The result is a mathematical representation of the iris, as if all those measurements and patterns were translated into an expression of numbers and letters.
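As a minimal sketch of the simplified description above, the snippet below turns a hypothetical iris feature vector into a fixed-length code by hashing it with SHA-256. The feature values and the `encode_iris` helper are illustrative assumptions for this article, not a real iris-recognition pipeline; production systems (e.g. Daugman-style encoders) derive the iris code from Gabor-filter bit patterns rather than a plain hash.

```python
import hashlib

def encode_iris(features):
    """Illustrative only: serialize a hypothetical iris feature
    vector and derive a fixed-length hexadecimal "iris code" via
    SHA-256, mirroring the article's simplified hash analogy."""
    serialized = ",".join(f"{v:.4f}" for v in features)
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Hypothetical measurements (pupil position, boundary radii,
# texture statistics) extracted from the processed image.
sample_features = [0.42, 0.77, 0.13, 0.95, 0.28]
code = encode_iris(sample_features)
print(code)  # a deterministic 64-character hexadecimal string
```

Note that a hash of this kind is deterministic: the same measurements always yield the same code, while even a slightly different input produces an entirely different one, which is why real matchers compare iris codes by bit-level distance rather than exact hash equality.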
In recent months, we have noticed considerable interest from companies in exploring the use of this data: the iris is unique, no two irises are alike, and that uniqueness makes it attractive as an identification system.
The new Apple Vision Pro headset uses an iris identification system, Optic ID. Optic ID recognizes the uniqueness of your iris, allowing you to quickly unlock the Vision Pro, authorize Apple Pay purchases, log in to many apps, access personal data and more. "Just as Touch ID revolutionized authentication using a fingerprint and Face ID revolutionized authentication using facial recognition, Optic ID revolutionizes authentication using iris recognition," says Apple. But the question naturally arises: what security guarantees are offered to the user? For its part, Apple states that the biometric data collected are encrypted and are not backed up to iCloud or anywhere else.
The company Tools for Humanity Corporation, co-founded by the CEO of OpenAI and behind the Worldcoin project, has also clearly bet on this novel biometric data, offering users money (specifically, cryptocurrency) in exchange for a one-minute eye scan, with the aim of creating a global digital identification system. In recent weeks thousands of people have registered; the figure mentioned is close to 400,000 irises captured throughout Spain.
On March 7, the Spanish Data Protection Agency notified the company responsible for the Worldcoin project of its obligation to suspend, as a precautionary measure, the collection and processing of personal data (the iris, among others) that it was carrying out in Spain, and to block the data already collected. The Agency considers that urgent measures temporarily prohibiting the activity of Tools for Humanity Corporation are justified to avoid potentially irreparable damage, and that not taking them would deprive individuals of the protection to which they are entitled under the GDPR.
As a result of this measure, from March 7 and for the following three months, Worldcoin is prohibited from continuing to scan people's irises and collect their personal data. The Agency's precautionary measure has caused a real stir: it is the first time we have seen a processing ban that, even if temporary, halts the operations of a company, which must in addition keep all personal information already collected blocked, making it unusable at every level. The decision taken by the Agency against Worldcoin is undoubtedly the toughest to date, but it would not be surprising if other authorities took similar decisions in the coming months: France, Germany and the UK are already investigating the data processing carried out by Worldcoin. The Spanish Data Protection Agency (AEPD) points out that a major problem with Worldcoin's activity is that it processes data without properly informing data subjects, without allowing them to revoke the consent given, and without explaining in detail how the information is used.
In line with the Agency's actions, it should be noted that this type of biometric processing, treated by the GDPR as a special category of data deserving enhanced protection, raises many doubts and risks for the protection of personal data: risks to identity itself, since it can be impersonated; to privacy, since the iris can reveal health data and additional information can be extracted from it; and social risks. Regarding biometric data, we must not forget the specific obligations of the GDPR: among others, the need to find an applicable exception that lifts the prohibition on processing these data, established in Art. 9 GDPR, and subsequently a lawful basis for the processing. The remaining obligations of the Regulation must of course also be met, such as the duty to inform users about the processing of their personal data, the duties of security and confidentiality, the obligation to carry out an impact assessment, and the possibility of withdrawing consent, among others. Therefore, before any processing of the iris as biometric data, particular attention must be paid to the characteristics of the processing and its lawfulness.
Finally, one of the main security risks that we see in this identification system, and that we believe will be much discussed throughout Europe, is the following: if someone fraudulently obtains our online banking or email password, we can change it quickly to prevent any unwanted access or theft of information. If someone gets hold of our iris... we cannot.