Fingerprint verification, finger image analysis, iris recognition, retina analysis, face recognition, outline of hand patterns, ear shape recognition, body odour detection (yes, body odour detection), voice recognition, DNA pattern analysis and sweat pore analysis (ugh!). These are just some of the many systems that, along with behavioural techniques like hand-written signature verification, keystroke analysis and gait analysis, are now at the disposal of providers of automatic identification, authentication and verification services – biometric systems, in other words.
The trouble is that the use of such systems raises issues that go right to the heart of privacy legislation, against which biometrics manufacturers, providers and their customers must assess the legality of their systems before putting them to practical effect. Because of the complexity of the technology and the scope of the legislation protecting personal data, this is an onerous task without guidance, as some have found to their cost. It has now been made marginally less difficult following the publication of the Article 29 Working Party's working document on biometrics.
Biometric systems
As the techniques listed above suggest, biometric systems are applications of biometric technologies which allow the automatic identification, and/or authentication/verification, of a person. The potential breadth of use of such techniques is limited only by practicality, and their areas of application are manifold. The deployment permutations are also extensive. Many biometric systems combine different biometric modalities, such as face recognition and voice recognition, while others combine biometrics with other identification or authentication technologies such as passwords, PINs and smart cards. For instance, to log on to a computer a user might insert a smart card, type a password and present a fingerprint.
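By way of illustration (not from the Working Paper), a minimal Python sketch of such a multi-factor check might look as follows; the helper names and the equality-based matcher are hypothetical stand-ins for a real card reader, credential store and biometric matching engine:

```python
import hashlib
from dataclasses import dataclass

@dataclass
class EnrolledUser:
    card_id: str        # something the user has: the smart card's identifier
    password_hash: str  # something the user knows (plain SHA-256 for brevity; salt it in practice)
    template: bytes     # something the user is: the stored biometric template

def match_score(presented: bytes, template: bytes) -> float:
    """Hypothetical stand-in for a real biometric matcher returning a similarity score."""
    return 1.0 if presented == template else 0.0

def authenticate(user: EnrolledUser, card_id: str, password: str,
                 sample_template: bytes, threshold: float = 0.9) -> bool:
    """All three factors must pass; failure of any one denies access."""
    return (card_id == user.card_id
            and hashlib.sha256(password.encode()).hexdigest() == user.password_hash
            and match_score(sample_template, user.template) >= threshold)
```

The design point is that the factors are independent: compromising the password alone, or cloning the card alone, is not enough to gain access.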
Biometric data, ie biometric samples such as the image of a fingerprint, the picture of an iris or retina or the recording of a voice, is by its nature almost always deeply personal. Sample collection is carried out during a phase called “enrolment”, using a sensor specific to each type of biometric. The biometric system extracts user-specific features from the data to build a biometric template, and it is this template, in digitised form, which is stored rather than the raw sample.
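A minimal sketch of that enrolment flow, assuming Python and a placeholder feature extractor (real systems use modality-specific algorithms such as fingerprint minutiae extraction or iris coding); the point is simply that the digitised template, not the raw sample, is what ends up in storage:

```python
import hashlib

def extract_features(raw_sample: bytes) -> bytes:
    """Placeholder for a modality-specific extractor (fingerprint minutiae,
    iris codes and so on); a fixed-length digest stands in for the features."""
    return hashlib.sha256(raw_sample).digest()

def enrol(raw_sample: bytes) -> bytes:
    """Enrolment: build the template; only the template is retained."""
    template = extract_features(raw_sample)
    # The raw sample should now be securely discarded, not stored.
    return template

stored_template = enrol(b"raw sensor output for one fingerprint")
```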
Data protection issues
Even before deployment a number of data protection issues arise. At the enrolment phase, raw data, extraction and protection algorithms (such as cryptography) and templates are all present. Because of its nature, the information gathered may well be sensitive personal data, such as data revealing racial or ethnic origin or concerning health. Systems based on face recognition, for example, would inevitably be considered sensitive. The special safeguards of Article 8 of the Data Protection Directive (95/46/EC) will therefore apply in addition to the Directive's general data protection principles.
The storage of users’ templates will also raise data protection issues depending on the type of application for which the biometric device will be used and the size of the templates themselves. For example, templates may be stored in the memory of a biometric device, on a central database, or in plastic cards, optical cards or smart cards. As the Working Party’s comments on the application of the data protection principles stress, firms must use a method of storage appropriate to the purpose of the system. Whilst identification can only be achieved by storing the reference data in a centralised database, in principle it is not necessary for the purposes of authentication/verification to store reference data centrally.
Another data protection issue arising in relation to the construction of a biometric system is that some systems are based on information, like fingerprints or DNA samples, that may be collected without the data subject's awareness. It may therefore be possible to identify the data subject by comparing the resulting template with biometric data held in another database. Not only can personal data be obtained and processed in this way without the data subject's knowledge but, in the Working Party's view, such biometric technologies “lend themselves to blanket utilisation on account of their low level intrusiveness”, such that specific safeguards are necessary in respect of them.
Purpose
According to Article 6 of the Directive, personal data must be collected for specified, explicit and legitimate purposes and not further processed in a way incompatible with those purposes. Additionally, personal data must be adequate, relevant and not excessive in relation to the purposes for which it is collected and further processed. This implies a clear determination of the purpose for which the biometric data is collected and processed. The Working Party says that, for access control purposes (ie authentication/verification), rather than being stored on a database, biometrics should only be held in an object exclusively available to the user, such as a microchip card, a mobile phone or a bank card. In other words, authentication/verification applications which can be carried out without central storage of biometric data should not implement excessive identification techniques.
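The distinction can be made concrete with a short, illustrative sketch in which exact byte equality stands in for the fuzzy matching a real system would perform. Verification needs only the template on the user's own token; identification presupposes exactly the central store that the Working Party would have authentication/verification applications avoid:

```python
from typing import Dict, Optional

def verify(presented: bytes, card_template: bytes) -> bool:
    """Authentication/verification: a 1:1 comparison against the template
    read from the user's own card; no central database is involved."""
    return presented == card_template

def identify(presented: bytes, central_db: Dict[str, bytes]) -> Optional[str]:
    """Identification: a 1:N search, which requires every user's template
    to sit in a central store."""
    for user_id, template in central_db.items():
        if template == presented:
            return user_id
    return None
```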
The Working Party thinks that the use of other types of application (ie those based on storing digital fingerprint templates in the terminal or in a central database) should be “carefully assessed” before implementation. Those intending to implement this type of system, for instance in high security installations, “may need” to submit to prior checking by the national data protection authority (the Information Commissioner in the UK).
The Working Party is also concerned about the re-use of biometric data for purposes outside those for which it was collected. The Directive prohibits further processing that will be incompatible with the original purpose and, subject to specific exemptions under the Directive, the Working Party insists that “all measures must be taken” to prevent such incompatible re-use.
In a similar vein, centralised storage of biometric data increases the risk that biometric data will be used as a key to interconnect different databases, which could yield detailed profiles of an individual's habits in both the public and private sectors. Where biometric data is intended to be used to link databases containing personal data, the Working Party regards it as “desirable” that templates and their digital representations be processed with mathematical manipulations (encryption, algorithms or hash-functions), using different parameters for every biometric product in use, so that personal data from several databases cannot be combined through the comparison of templates or digital representations.
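One way to read that suggestion, sketched below on the assumption of exact-match comparison (real template-protection schemes must also cope with the fuzziness of biometric matching), is to derive the stored value from the template with a keyed hash under a different key for each biometric product, so that the same person's template produces unlinkable values in different databases:

```python
import hashlib
import hmac

def protected_template(template: bytes, product_key: bytes) -> str:
    """Key the template digest with a secret unique to one biometric product."""
    return hmac.new(product_key, template, hashlib.sha256).hexdigest()

template = b"example biometric template"
door_db = protected_template(template, b"key held only by the door system")
payroll_db = protected_template(template, b"key held only by the payroll system")
# The same template yields unlinkable values, so the two databases cannot
# be joined simply by comparing stored values.
assert door_db != payroll_db
```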
There is also the issue of proportionality. Biometric data may only be used if adequate, relevant and not excessive in the light of the original purpose for which the data is processed. For the Working Party this implies a strict assessment of the necessity and proportionality of the processed data. Moreover, as biometric data, particularly raw data, will often contain more information than strictly necessary for identification or authentication/verification, templates should technically be constructed so as to preclude the processing of data that are not necessary and any unnecessary data collected should be destroyed as soon as possible. The Directive requires that personal data is kept for no longer than necessary for the purposes for which it is processed.
Other data protection principles
The processing of biometric data, and in particular its collection, must happen in a fair way. The data subject must also be kept informed, subject to specific exceptions, for example in relation to public security. The data subject should be told the exact purpose of the processing and the identity of the controller of the file, which will usually be the person running the biometric system or applying the biometric technique. Systems based on the collection of biometric data without the knowledge of data subjects are therefore ruled out. In this respect, systems based on facial recognition at a distance, the collection of fingerprints and the recording of the voice present the most risk.
The processing of biometric data must be based on one of the grounds of legitimacy under Article 7 of the Directive. Where this is consent, it must be a freely given, specific and informed indication by the data subject that he agrees to the processing.
Security measures
Article 17 of the Directive requires controllers to take all appropriate technical and organisational security measures to protect personal data against accidental or unlawful destruction, accidental loss, alteration and unauthorised disclosure or access. In particular, firms will need to ensure that security measures are in place where the processing of biometric data involves its transmission over a network, and security is therefore a major issue when biometric data is transmitted over the Internet. Security measures must, however, be taken at all stages of processing, including storage, extraction of characteristics and comparison.
The sort of security measures envisaged includes encryption of templates, protection of encryption keys and access control. It should be virtually impossible to reconstruct original data from templates. The measures should be in place from the outset, and especially during the enrolment phase where the biometric data are transformed into templates or images. The Working Party stresses that any loss of integrity, confidentiality or availability in respect of the databases “would be clearly prejudicial to all future applications based on the information contained in those databases”. The concern here is the mixing up of identities, for example where the fingerprints of two separate individuals are similar.
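By way of illustration of encrypting templates at rest, here is a minimal sketch using the third-party Python cryptography package (the choice of tool is an assumption; the Working Paper prescribes no particular mechanism). The organisational measure that matters is that the key is stored, and access to it controlled, separately from the template database:

```python
# Requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # store and access-control this key separately
cipher = Fernet(key)

template = b"example biometric template"
stored = cipher.encrypt(template)    # the ciphertext is what goes into the database
recovered = cipher.decrypt(stored)   # recovery requires the separately held key
assert recovered == template
```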
Conclusion
Essentially, all biometric systems deployed in the EU should be implemented according to the recommendations in the Working Paper. The emphasis throughout is on proportionality, which in the data protection context is not as woolly a concept as it sounds. The Working Party expresses “a clear preference” for biometric applications, particularly for authentication/verification, that do not process data obtained without the data subject's knowledge and that do not store data in a centralised system.
Additionally, the system must suit the purpose. For example, the French CNIL refused the use of fingerprints to control children's access to a school restaurant, but accepted the use of the outline of the hand for the same purpose. Typically, though, there will be inconsistency at national level. The UK Information Commissioner has accepted the use of fingerprints in similar circumstances where appropriate safeguards have been put in place, whereas the Portuguese data protection authority recently held the use of a biometric fingerprint system by a university to control the punctuality of non-teaching staff to be disproportionate and excessive. Entering into constructive dialogue with your data protection authority is therefore certainly to be encouraged. The German data protection authority, for instance, has approved the introduction of biometric characteristics on identity papers in order to prevent fraud, provided that the data is stored in the microchip of the card for comparison with the owner's fingerprints, rather than in a database.
On this basis there should be no need for manufacturers and users to allow their investment in sophisticated security systems to be undermined by non-compliance with data protection legislation. Used in the right way, biometrics should enhance privacy and not undermine it. The Venerable Bede Church of England Aided School in Sunderland, for example, has introduced iris recognition so that pupils can be identified without carrying cash or cards.
This is just one of an increasing number of uses to which biometrics are being put in both the public and private sectors. Earlier this year the European Commission launched the EURODAC project, aimed at processing asylum requests in EU Member States and other states belonging to the Dublin Convention. As part of this project Steria, the European IT services operator, developed a central system for fingerprint identification.
Phillip Rees is Head of IT and Communications Law Group at