Ever since the EU mandated the introduction of biometric ePassports containing fingerprints, there has been a flurry of technology development and innovation to make biometric ePassports a reality. Much of this played out behind the scenes, but electronic passports are now slowly working their way to the forefront of the public consciousness, since they are sufficiently widespread for researchers and journalists to experiment with. There have already been a number of security scare stories in which the gap between the public's perception of passport security and the reality of the protection was abruptly and rudely closed. It came as a surprise to many that the first-generation electronic chips generally do not resist identical duplication (cloning), and there have been concerns over eavesdropping and ePassport anonymity. These issues have put ePassports on the media radar.
However, there is now an even stronger driver for public interest in ePassports, beyond mere curiosity: regular electronic inspection of ePassports at border control is getting underway in earnest. This change has perhaps been too long in the making, and phased deployment has drawn the process out even further (fingers have already been burnt over the deployment of partial inspection, where the chip is read but the certificate which proves the data is intact is not verified).
From now on, the performance of ePassports will have a more direct and tangible effect on travellers, because governments realise that they cannot claim a return on investment for ePassport technology if the chips are not actually being used regularly. Without a return on investment, it is hard to justify passing on the additional cost of the ePassport to citizens in application fees. So will ePassports speed travellers through border control, bypassing the queues? Will everyone be able to use e-Gates without enrolment in the future? Will people be led away for questioning when their ePassport won't read? Or will the queues simply get longer?
Unfortunately, where we currently stand, the answers to these questions are not positive. The necessary drive towards full inspection of ePassports needs to be backed up with the same innovation and new technology that accompanied the issuance process. At the heart of the matter is the rather lengthy duration of a full biometric inspection: the machine readable zone (MRZ) on the passport must be scanned and pushed through OCR, the chip must be read, the data verified and the certificates checked. To extract fingerprint biometrics, each inspection system must submit a certificate chain to prove entitlement; the actual ePassport data must be read out, the traveller's fingers must be scanned, and finally everything must be matched up. It is a formidable process, and border control agencies are starting to realise that such inspections are simply not feasible unless the time required can be reduced - there is a very real "need for speed".
Biometric ePassport inspection is such an involved process that one technology alone is unlikely to be enough to drive inspection times down to acceptable levels. Proposed solutions include even greater numbers of e-Gates at border control, newer and faster chips to reduce cryptography time and communications overheads, and new protocol improvements. Cryptomathic has developed a unique technology in this area, designed to work in conjunction with other speed-up measures to rein in the inspection time.
High Speed Inspection
Cryptomathic's High Speed Inspection technology exploits the middle ground between a fully distributed and a fully centralised scheme: it implements local caching of data that has been read from identity documents, at a local, national or even international level. Such a system could retrieve biometric data from an encrypted cache forming part of the inspection infrastructure at a port of entry. Reading from such a cache would allow a document to be read almost instantaneously; the challenge, however, lies in implementing such a cache in a secure manner, such that the confidentiality, integrity and anonymity of personal data are preserved. There is a solution: it is possible to create an encrypted cache of biometric data where each entry can only be accessed in the presence of the original identity document from which it was sourced.
Consider the example of an ICAO-compliant EU electronic passport, whose logical data structure defines sixteen different data groups of biometric, biographical and additional information, such as iris and signature data. Any and all of these data groups could be cached with this mechanism. In particular, consider the two large biometric data groups stored on EU Extended Access Control (EAC) compliant ePassports:
Data Group 2 (DG2) - Facial Information
Data Group 3 (DG3) - Fingerprint Information
Before these large data groups are read from an ePassport, the "Document Security Object" (SOD) is read first - a sort of "summary file" which contains a digital signature and protects the integrity of the information stored on the ePassport. As the summary file contains high-entropy, unpredictable data, a key derivation function can be applied to it to generate a secure key for encrypting data from the passport. Such a key could only be recreated in possession of the summary file. Sources of entropy within the SOD include the hashes of the biometric data groups, the hashes of the cryptographic data groups (AA or EAC public keys), the digital signature itself, and timestamps.
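As a minimal sketch of this idea (the function name, purpose label and SOD contents below are illustrative, and a single SHA-256 stands in for a full key derivation function such as HKDF), a key could be derived from the SOD bytes like this:

```python
import hashlib

def derive_cache_key(sod_bytes: bytes, purpose: bytes = b"cache-key") -> bytes:
    # Hash a purpose label together with the raw SOD bytes; only a holder
    # of the SOD (i.e. of the physical document) can reproduce the result.
    return hashlib.sha256(purpose + sod_bytes).digest()

# Hypothetical SOD contents: data-group hashes plus signature material.
sod = (hashlib.sha256(b"DG2 data").digest()
       + hashlib.sha256(b"DG3 data").digest()
       + b"signature-and-timestamp")
key = derive_cache_key(sod)
```

Different purpose labels yield independent keys from the same SOD, so one derivation can serve for encryption and another for pseudonymous lookup.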
Many possible key derivation functions can be drawn from this high-entropy data, which easily provides enough entropy to derive a strong cryptographic key. Let us use a simple concrete example: use the hash value of each data group both to create a cryptographic key for encrypting the data group's bulk data, and to create a pseudonym for locating the cached data, so that the cache contains no personally identifiable information.
Figure 1: ePassport Encrypted Cache
Consider the derivation mechanism shown in Figure 1. To securely store DG2, its hash is divided in half. The first half is used as the lookup key in a (non-cryptographic) hash table in which the data group is stored. The data group itself is then encrypted using standard best-practice cryptographic techniques: salting, followed by encryption under a cryptographic key derived from the second half of the hash. An example row in the cache table would contain the following lookup key and data:
lookup key: left(hash(DG2))
data: encrypt( key = right(hash(DG2)), salt || DG2 )
In this way, only someone in possession of the real ePassport (whose Document Security Object contains the hashes of the data groups) can calculate the key and decrypt the data group. It is infeasible to predict the value of the hash of a biometric data group, even knowing the identity of the citizen from whom the data groups were made: JPEG and WSQ images are highly redundant encodings from a semantic perspective, so they contain a great deal of unpredictable data.
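The scheme above can be sketched end to end. This is an illustrative toy, not Cryptomathic's implementation: the function names are invented, a SHA-256 counter-mode keystream stands in for a real cipher such as AES, and the salt is stored alongside the ciphertext and mixed into the keystream in place of a proper random IV.

```python
import hashlib
import os

def _keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: XOR with a SHA-256 counter-mode keystream.
    # A deployment would use AES here; XOR makes encrypt == decrypt.
    out = bytearray()
    for i in range((len(data) + 31) // 32):
        pad = hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        out.extend(b ^ p for b, p in zip(data[i * 32:(i + 1) * 32], pad))
    return bytes(out)

def cache_put(cache: dict, dg_data: bytes) -> None:
    # The same hash of the data group is what the SOD carries.
    dg_hash = hashlib.sha256(dg_data).digest()
    lookup, key = dg_hash[:16], dg_hash[16:]   # pseudonym / encryption key
    salt = os.urandom(16)                      # randomises the ciphertext
    cache[lookup] = salt + _keystream_xor(key + salt, dg_data)

def cache_get(cache: dict, dg_hash: bytes) -> bytes:
    # Retrieval needs the data-group hash from a genuine passport's SOD.
    lookup, key = dg_hash[:16], dg_hash[16:]
    entry = cache[lookup]
    salt, ciphertext = entry[:16], entry[16:]
    return _keystream_xor(key + salt, ciphertext)

# Round trip: store a (fake) facial image, retrieve it via its SOD hash.
cache = {}
dg2 = b"\xff\xd8facial-image-bytes" * 10
cache_put(cache, dg2)
recovered = cache_get(cache, hashlib.sha256(dg2).digest())
```

Note that the cache itself holds only a truncated hash and an encrypted blob: without the passport's SOD there is neither a way to find an entry by identity nor a key to decrypt it.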
When is Stored Data Not Stored?
To deploy an encrypted cache scheme successfully, it is vital to be able to convince cryptography experts, lawyers, journalists and citizens alike that the information is safely held. This is a hard challenge to meet, and one that has to be tackled head on by the implementers of the EAC protocols being deployed by the EU for distributed fingerprint storage. Fortunately, the security of a distributed system is easy for the layperson to grasp: if I have the document in my hand, I have the data and I can inspect it; if I do not have the document, I cannot. In the case of the encrypted cache, the core compliance goal is to show that it does not create a central store of data, and that the security really lies with the distributed system, just as it always has. Even with the cache in place, if I don't have the document in my hand, I can do absolutely nothing.
The "encrypt and destroy" technique used to enter data into the cache can be made as strong as anybody could want: it relies only on the security of the cryptographic algorithm and on the destruction of the key in the inspection system.
Trustworthy algorithms such as 3DES and AES have the seal of approval of national security and intelligence experts, as well as having stood up to years of open academic scrutiny. Destruction of the key is in fact an easier task than destruction of the biometric data temporarily held on an inspection system during the matching process (where the reference image is compared against the candidate image from the traveller): the smaller the amount of data, the easier and quicker it is to destroy. This is why the very most sensitive data in the world - cryptographic keys used by banking authorisation systems, customer PINs, the root keys of certificate authorities, and even the arming codes for nuclear weapons - all lie under the protection of Hardware Security Modules (HSMs), which are subject to evaluation against NIST and Common Criteria requirements to demonstrate their security.

The world's most secure FIPS 140-2 Level 4 evaluated module, the IBM 4758 (and its 4764 successor), uses the "encrypt and destroy" technique to protect its data. All stored data in the device's non-volatile memory and RAM is held encrypted under an internal master key stored in battery-backed RAM. If any tampering event is sensed (evidence of an attacker trying to physically penetrate the device and extract data), it need only destroy this one key to render the entire remainder of the device useless. The only technique that could be considered more trustworthy is to use a shaped explosive charge to vaporise the storage chip's contents upon sensing of a tamper event. Even though these explosions are very small, such technology is unwieldy and remains the preserve of military equipment at particular risk of battlefield capture, such as crypto ignition keys in battlefield radios. Of course, an explosive destruction technology can only be used once (whereas biometric data has to be deleted after every inspection).
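The "encrypt and destroy" pattern can be illustrated in a few lines. This is a software sketch only (the class name is invented, and a SHA-256 counter-mode keystream stands in for the AES used by a real HSM): every record is encrypted under one internal master key, so zeroising those few bytes renders all stored records unrecoverable at once.

```python
import hashlib
import os

class EncryptAndDestroyStore:
    """Toy model of an HSM-style store: one master key protects everything."""

    def __init__(self):
        # In a real HSM this key lives in battery-backed RAM.
        self._master_key = bytearray(os.urandom(32))
        self._records = {}

    def _cipher(self, nonce: bytes, data: bytes) -> bytes:
        # SHA-256 counter-mode keystream XOR, standing in for AES.
        out = bytearray()
        for i in range((len(data) + 31) // 32):
            pad = hashlib.sha256(
                bytes(self._master_key) + nonce + i.to_bytes(4, "big")
            ).digest()
            out.extend(b ^ p for b, p in zip(data[i * 32:(i + 1) * 32], pad))
        return bytes(out)

    def put(self, name: str, data: bytes) -> None:
        nonce = os.urandom(16)
        self._records[name] = (nonce, self._cipher(nonce, data))

    def get(self, name: str) -> bytes:
        nonce, ciphertext = self._records[name]
        return self._cipher(nonce, ciphertext)

    def tamper_response(self) -> None:
        # Zeroising 32 bytes is far quicker than wiping every record;
        # afterwards, every stored ciphertext is permanently undecryptable.
        for i in range(len(self._master_key)):
            self._master_key[i] = 0
```

After `tamper_response()` the records are still physically present, but decrypting them correctly is no longer possible, which is exactly the property the cache relies on when the inspection system destroys its derived key.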
The High Speed Inspection technology developed by Cryptomathic is patent pending and incorporated into the ID Inspector product range of inspection systems and software development kits. It is also available as a technology for third party licensing.
Previously published in Cryptomathic NewsOnInk, 2008