Introduction

Biometrics is a set of techniques for establishing a person's identity by measuring one of their physical characteristics. There are several types of physical characteristics, some more reliable than others, but all must be tamper-proof and unique in order to designate one and only one individual. On the other hand, as we will see, these characteristics are far from perfect or precise, and the techniques quickly reach their limits.

Use

Biometric techniques currently enjoy a popularity driven largely by a fashion phenomenon, fed mainly by film and television. It is thus not uncommon to see retinal scanners with superb red lasers, fingerprint readers with very pretty flashing lights, and so on, all presented as the pinnacle of access-control technology. Biometric techniques are nevertheless genuinely spreading into our daily lives, while maintaining a somewhat misleading image. The real question is which techniques actually exist and what their limits are. This article does not claim to be exhaustive on a subject as vast as biometrics; its aim is rather to make readers aware of the essentials and give them some fundamental bases.

Physical characteristics

Several physical characteristics prove to be unique to an individual, and for each of them there are several ways to measure it:

Fingerprint (finger-scan): the basic data in the case of fingerprints is the pattern formed by the ridges and furrows of the epidermis. This pattern is unique and different for each individual. In practice, it is almost impossible to use all the information the pattern provides (there is too much of it for each individual), so we prefer to extract its main features, such as ridge bifurcations, "islands", ridge endings, etc. A complete fingerprint contains on average about a hundred of these characteristic points (the "minutiae").
If we consider the area actually scanned, we can extract about 40 of these points. Yet the products offered on the market rely on only about fifteen of them (at least 12 with respect to the law), or even fewer for many of them (down to a minimum of 8). For the record, the number 12 comes from the "12-point rule": it is statistically impossible to find two individuals sharing the same 12 characteristic points, even in a population of tens of millions of people.

Figure 1

The techniques used for the measurement are varied: optical sensors (CCD/CMOS cameras), ultrasonic sensors, electric-field sensors, capacitive sensors, temperature sensors, etc. These sensors are often coupled with a measurement intended to establish that the submitted sample is valid (in other words, that it really is a finger): measurement of the sample's relative dielectric constant, its conductivity, heartbeat, blood pressure, or even a measurement of the print under the epidermis.

Hand/finger geometry (hand-scan): this type of biometric measurement is one of the most widespread, particularly in the United States. It consists in measuring several characteristics of the hand (up to 90), such as the shape of the hand, the length and width of the fingers, the shape of the joints, inter-joint lengths, etc. The associated technology is mainly infrared imaging; in general, such systems have fairly high FARs (False Acceptance Rate, see below), especially among members of the same family or twins.

Iris (iris-scan): for the two following techniques, it is first necessary to distinguish between the iris and the retina:

Figure 2. Source: American Academy of Ophthalmology

In other words, the study of the iris focuses on the visible part of the eye. For the iris, the individual stands in front of the sensor (a CCD/CMOS camera), which scans his or her iris.
The iris is very interesting for biometrics because it is at once always different (even between twins, or between the left eye and the right), independent of the individual's genetic code, and very difficult to falsify. Indeed, the iris has a near-infinity of characteristic points (which some compare in number to those of DNA), and they do not vary during a person's life, unlike the color of the iris, which can change. That has no influence, however, because the iris images produced by the sensors are in black and white. The only problem with this technique lies in the measurement itself, which can be a source of errors or difficulties: it can almost be said that the number of problems encountered increases in proportion to the distance between the eye and the camera. Other problems arise from reflections (the lighting must be restricted and controlled) and from the detection of fake eyes (photographs) and other frauds. For the latter, certain dynamic features of the eye can be used to prove its authenticity: the reactivity of the pupil (dilation/contraction) to the amount of light, study of the iris in the infrared and the ultraviolet, etc.

Retina (retina-scan): this biometric measurement is older than the one using the iris, but it has been less well accepted by the public and by users, probably because of its overly constraining nature: the measurement must be made at a very short distance from the sensor (a few centimeters), which then scans the retina. It is physically impossible to perform a retinal measurement at a distance of 30 cm or more on a moving subject, as seen in some movies. This method requires cooperative and trained subjects.
Yet this technique seems to be just as reliable as the iris; it relies on the fact that the pattern formed by the blood vessels of the retina (the inner, opposite wall of the eye) is unique to each individual, different even between twins, and quite stable during a person's life. The measurement can provide up to 400 characteristic points of the subject, which can be compared with the 30 to 40 points provided by a fingerprint! In conclusion, the retinal measurement is the most difficult to use but also the hardest to counterfeit.

Face (facial-scan): here the idea is to take a more or less sophisticated photograph and extract from it a set of factors that are meant to be specific to each individual. These factors are chosen for their strong invariability and concern areas of the face such as the top of the cheeks or the corners of the mouth; areas occupied by hair, hairstyles, and any other area subject to change during a person's life are avoided. There are several variants of face-recognition technology. The first, developed and supported by MIT, is called "Eigenface". It consists in decomposing the face into several grayscale images, each highlighting a particular characteristic:

Figure 3. Source: MIT Face Recognition Demo Page

Another technique, called "feature analysis", builds on the previous one by adding information on the distances between features, their positions, etc. It is more tolerant of changes that may occur: viewing angle, tilt of the head, etc. Then come techniques that are less used at present, based on neural networks or on more specialized, less flexible methods.

Vein pattern (vein pattern-scan): this technique is usually combined with another, such as the study of hand geometry. It consists in analyzing the pattern formed by the network of veins on a part of an individual's body (the hand) and retaining certain characteristic points.
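The decomposition idea behind Eigenface can be sketched in a few lines. The following is only a minimal illustration of the principle (principal components via SVD on a toy data set); the image size, the number of components, and the random stand-in data are assumptions for the sketch, not a description of MIT's actual system:

```python
import numpy as np

def eigenfaces(faces, k):
    """Compute the top-k "eigenfaces" (principal components) of a
    training set. `faces` is an (n_samples, n_pixels) array; each
    row is a flattened grayscale face image."""
    mean = faces.mean(axis=0)
    centered = faces - mean
    # SVD of the centered data gives the principal axes directly.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:k]  # k eigenfaces, one per row

def project(face, mean, components):
    """Describe a face by its weights along each eigenface."""
    return components @ (face - mean)

# Toy data standing in for real images (e.g. 8x8 = 64 pixels).
rng = np.random.default_rng(0)
train = rng.random((20, 64))
mean, comps = eigenfaces(train, k=5)

# Two measurements of the "same" face project to nearby weight vectors,
# so matching becomes a nearest-neighbor search in eigenface space.
face = train[0]
noisy = face + rng.normal(0, 0.01, 64)  # same face, slightly different shot
d_same = np.linalg.norm(project(face, mean, comps) - project(noisy, mean, comps))
d_other = np.linalg.norm(project(face, mean, comps) - project(train[1], mean, comps))
print(d_same < d_other)
```

The point of the projection step is exactly the one made in the text: the face is reduced to a handful of weights on characteristic images, and recognition compares those weights rather than the raw photograph.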
Behavioral characteristics

In addition to physical characteristics, an individual also has several elements related to his or her behavior:

Keystroke dynamics (keystroke-scan): typing rhythm is influenced by many things. First of all, depending on the text being typed and, more generally, on its nature, you will tend to change the way you type on the keyboard. This is also one of the means used by certain attacks (timing attacks) to try to infer the content or the nature of the typed text, possibly all the way back to a password, for example. These techniques are quite effective but remain statistical. Then the behavioral factor comes into play, and this factor differs for each individual. The measured factors are almost the same as those just mentioned: the durations between keystrokes, the frequency of errors, the duration of each keystroke itself, etc. The difference lies more in the analysis, which can be either static and based on neural networks, or dynamic and statistical (continuous comparison between the sample and the reference).

Voice recognition (voice-scan): the data used by voice recognition comes from both physiological and behavioral factors. It is not usually imitable.

Signature dynamics (signature-scan): this type of biometrics is currently little used, but its proponents hope to impose it quickly enough for specific applications (electronic documents, reports, contracts, etc.). The process is usually combined with a graphics tablet (or equivalent) equipped with a pen.
This device measures several characteristics during the signature, such as speed, stroke order, pressure, accelerations, total time, etc. In short, everything that can identify a person as reliably as possible when using data as changeable as a signature.

Summary and new techniques

Here is an example of the results of a study carried out by an American company, the International Biometric Group ("New York Based Integration and Consulting Firm"), presenting the different criteria for each type of biometric technique:

Figure 4

Legend:
Effort: effort required from the user during the measurement.
Intrusiveness: how intrusive the user perceives the test to be.
Cost: the cost of the technology (readers, sensors, etc.).
Accuracy: effectiveness of the method (its ability to identify someone).

Several other techniques are under development right now, including biometrics based on ear geometry, odors, skin pores and DNA testing. On this last point, it is worth underlining that the process can be threatening both to people's privacy and to their freedom, given possible abuses by information systems (and other Big Brothers). Indeed, even if it depends on the technique used, a DNA test can be 100% accurate and 100% reliable, allowing zero FRR and FAR (see below). It is also universally recognized and would very easily allow cross-referencing between databases. In other words, it would be the ideal way to "catalog" people and thus destroy the privacy we have preserved so far. The CNIL website is a must-read for those interested.

Disadvantage of biometrics: equality vs. similarity

Biometrics unfortunately has a major disadvantage: none of the measures used proves to be totally accurate, because of one of the major characteristics of any living organism: it adapts to its environment. We get older, we suffer more or less serious traumas; in short, we evolve, and the measurements change with us.
Take the simplest case, fingerprints (but note that the same applies to any physical data). Depending on the circumstances, we perspire more or less; the temperature of the fingers is anything but regular (on average 8 to 10 degrees Celsius above room temperature); a simple cut is enough to introduce an anomaly into the fingerprint pattern. In short, in the majority of cases, the measurement will return a result different from the initial reference measurement. Yet authorized users must still be recognized, and in practice the system works in most cases because it allows a margin of error between the measurement and the reference. The purpose of this tolerance is simple: manufacturers are not looking for absolute security; they want something that works in practice. They therefore seek to reduce the False Rejection Rate (FRR) while maintaining a relatively low False Acceptance Rate (FAR). To explain: a false rejection is the rejection of an authorized person because his or her biometric measurement is too far from the reference measurement for that same person. A usable system will have the lowest possible FRR. Conversely, a false acceptance is the acceptance of an unauthorized person. This can happen if the person has falsified the biometric data or if the measurement confuses them with another person. A secure system will have the lowest possible FAR. In practice, manufacturers mainly look for a compromise between these two rates, FRR and FAR, which are linked to each other by the relationship illustrated here:

Figure 5. This graph is purely illustrative; delta represents the margin of error allowed by the system, ranging from 0 to infinity.

Very succinctly, we see that the greater the allowed margin of error, the higher the false-acceptance rate climbs; that is, more and more unauthorized people are accepted (and therefore the security of the system decreases).
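This trade-off can be made concrete with a small numerical sketch. The score distributions below are entirely made up (two overlapping Gaussians standing in for genuine and impostor match distances); the sketch simply sweeps the tolerance delta and computes the two rates at each step:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical distances between a measurement and the stored reference:
# genuine users cluster near the reference, impostors lie further away,
# but the two populations overlap (hence the compromise).
genuine = rng.normal(loc=2.0, scale=1.0, size=10_000)   # authorized persons
impostor = rng.normal(loc=6.0, scale=1.5, size=10_000)  # unauthorized persons

def rates(delta):
    """FRR/FAR for a given margin of error (accept iff distance <= delta)."""
    frr = np.mean(genuine > delta)    # authorized but rejected
    far = np.mean(impostor <= delta)  # unauthorized but accepted
    return frr, far

# Sweep delta: FRR falls while FAR rises, as on the curves of figure 5.
for delta in (1.0, 3.0, 5.0, 7.0):
    frr, far = rates(delta)
    print(f"delta={delta}: FRR={frr:.3f} FAR={far:.3f}")

# The usual compromise: the delta where the two curves cross (FAR == FRR),
# commonly called the equal error rate (EER).
deltas = np.linspace(0, 10, 1001)
frr_v, far_v = np.array([rates(d) for d in deltas]).T
eer_delta = deltas[np.argmin(np.abs(frr_v - far_v))]
print(f"curves cross near delta={eer_delta:.2f}")
```

With these made-up distributions the crossing point sits between the two population means; a real vendor would tune delta around that junction, exactly the compromise described above.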
Conversely, the false-rejection rate for authorized persons decreases at the same time, which makes the system more usable and better meets users' expectations. At the other extreme, if we reduce the margin of error accepted by the biometric measurement process, the trends of the two rates reverse: fewer and fewer fraudsters are accepted, but the rejection rate for authorized persons becomes too high to be tolerated in most cases. The usual compromise is to take the junction of the two curves, i.e. the point x where the pair (FAR, FRR) is minimal. In conclusion, for the most pessimistic, all of biometrics can be reduced to this compromise alone, which undermines any confidence that could be placed in the technology.

Vulnerability example: the case of fingerprints

Fingerprints are undoubtedly the most commonly used biometric data. As a result, there are a large number of products on the market, but also a lot of work on the subject, including on counterfeiting. We will look at some of the techniques used and how they are bypassed. Be careful: this section is not meant to be exhaustive. On the one hand, it is based on current technologies, which are by nature variable and evolving; on the other hand, its purpose is to raise the reader's general awareness rather than to train him or her in any particular technique. The first step is to obtain the fundamental data of the measurement, i.e. the characteristic points of the fingerprint to be counterfeited, by manufacturing a fake finger (or a thin layer of silicone reproducing the geometry of the finger). We will not give the procedure here, but know that it is quite possible, and simple, to create a fake finger from an ordinary print left on a glass, a keyboard, a door handle, etc.
Then consider each type of sensor in turn:

Temperature sensor: the thin layer of silicone changes the temperature by only 1 to 3 °C on average, which the sensors cannot detect without incurring an excessive FRR (especially outdoors).

Heartbeat sensor: the thin layer of silicone lets the sensor function normally. Moreover, any discrimination based on this measure is both infeasible and pointless. Infeasible because, in the case of athletes for example, the heart rate can drop to 40 beats per minute, which implies a measurement lasting more than 4 seconds just to evaluate the rate. Pointless because nothing is more variable than a heartbeat: the slightest effort changes it, making it unusable here.

Conductivity sensor: depending on the type of sensor, the normal value for skin is estimated at 200 kOhm. Nevertheless, this value can reach several MOhm in a dry winter and drop to a few kOhm in a humid summer. Under these conditions, it is obvious that a fake finger can pass the test without much trouble.

Relative dielectric constant: very briefly, this constant indicates the extent to which a material concentrates electrostatic flux lines. Here, the silicone would be rejected, since its value is too different from that of skin. However, it turns out that the value of this constant for skin lies between that of water (80) and that of alcohol (24). In other words, simply coat the fake finger with a water-alcohol mixture (80%/20% or 90%/10%), place the finger on the sensor, and wait for the alcohol to evaporate. As the alcohol evaporates, the value of the constant climbs back toward that of water (from 24 to 80) and passes through that of skin on the way. QED.

In general, the weaknesses of these systems lie not in the physical peculiarity on which they rest, but in the way they measure it and in the margin of error they allow.
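The water-alcohol trick lends itself to a back-of-the-envelope calculation. The sketch below is purely illustrative: it uses a crude linear mixing rule for the film's dielectric constant and a hypothetical acceptance window that a sensor might use for "skin"; both the mixing rule and the window bounds are assumptions, not measured sensor behavior:

```python
# Dielectric constants cited in the text: water ~80, alcohol ~24.
EPS_WATER, EPS_ALCOHOL = 80.0, 24.0

def mixture_eps(alcohol_fraction):
    """Crude linear mixing rule (an illustrative assumption, not a
    physical model) for the dielectric constant of a water-alcohol film."""
    return alcohol_fraction * EPS_ALCOHOL + (1 - alcohol_fraction) * EPS_WATER

# Hypothetical window a sensor might accept as "skin-like".
SKIN_LOW, SKIN_HIGH = 70.0, 78.0

# Start with an 80/20 water-alcohol coating; the alcohol evaporates first,
# so its fraction drops over time and the constant climbs toward water's 80.
# Somewhere along that climb it passes through the skin window.
for alcohol in (0.20, 0.15, 0.10, 0.05, 0.0):
    eps = mixture_eps(alcohol)
    accepted = SKIN_LOW <= eps <= SKIN_HIGH
    print(f"alcohol={alcohol:.2f}  eps={eps:.1f}  accepted={accepted}")
```

Whatever the exact window, the point stands: a value that sweeps continuously from the mixture's constant up to water's constant must at some moment fall inside any skin-like acceptance band between them, and at that moment the fake finger is accepted.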
Again, we must not be impressed by an illusory image of high-technology miracle products.

Limitations of this technology

Biometric data should not be used alone for strong authentication, because it cannot be changed: it is by nature specific to each individual. So we cannot rely on it alone, especially since, as we have seen, it is not 100% reliable (FAR/FRR). As a rule of thumb, biometrics is better suited to identification schemes than to authentication.

Biometric data is comparable to any other access-control data, such as passwords: from the point of view of the computer system, it is nothing more than a series of bits like any other. In other words, the difficulty lies in counterfeiting the physical, biological characteristic being measured, but in no way in counterfeiting its digitized value. Take the example of our old friend, the login/password pair. This system is often described as unsafe because one of the main attacks is to spy on transactions during the login process, recover the user's data, and replay it. Even with biometric techniques, this attack is still possible! What is the point of going to the trouble of reproducing a fingerprint when one can recover the digitized data directly? Or attack the databases containing all the biometric reference data?

Conclusion

We will retain several important facts about biometrics:

It is not enough to replace a login/password with a biometric measurement; the entire system must be rethought and the complete architecture secured.

A biometric measurement should not be used alone for authentication; it is preferable to couple it with a smart card, a secure token (a small storage element with high resistance to attack, even physical attack), a password or an OTP (more details on One Time Password).

Biometrics should preferably be used for identification rather than authentication.
Finally, let us shed once and for all this image of ultra-secure technology falsely propagated by the media: biometrics is by no means a "miracle and universal solution"!