With data protection standards such as GDPR, and the sheer volume of data that companies collect and accumulate, the protection and control of information has become increasingly important. Encryption is the backbone of any organisation’s security scheme for achieving that protection.
This post was prepared by the Apograf team. Apograf is an innovative publishing platform geared for cryptographers and developers — learn more at apograf.io.
The industry-standard AES algorithm is currently the most secure and widely used algorithm for encrypting data. Encryption (and cryptography in general) aims to provide three types of protection — source authentication, integrity and confidentiality. However, there is no such thing as absolute security; as the saying goes, “a tool is only as good as the hand that wields it”.
AES was designed to protect classified government information; it relies on block ciphers to encrypt bulk data, such as large documents. However, symmetric encryption algorithms like AES have weak points that must be considered before they are used. In this article, we focus on the ‘key leak’ problem, which occurs when a single key is overused, potentially allowing an attacker to reconstruct the encryption key. We’ll consider two cases: overuse of the cipher itself, and the risk of side-channel attacks.
Key “Exhaustion” and Leakage
“Key exhaustion” can be understood here as using a key more times than it should be used — encrypting too much data with it, or using it beyond its authorised lifecycle. This is by no means a theoretical concern: for a huge data centre like Amazon’s, which encrypts terabytes of data a day, the risk that a key becomes ‘exhausted’ or overused is legitimate. On the issue of key overuse, Jonathan Burns, Director of Cryptography and Mathematics at Ionic, notes that it can occur in several contexts — “the first context is cryptanalytic – using a key more times than it should be can reveal information about the underlying plaintext, or the secret values in the key itself”. He adds that “another restriction on key usage concerns the modes of operation for block ciphers”. That is to say, if a data centre uses AES with Galois/Counter Mode (GCM), then a single key should never be used to encrypt more than 2³² cipher blocks. This limit prevents partial or full compromise of the plaintext messages or their integrity.
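To make the GCM restriction concrete, here is a minimal Python sketch of a usage counter that refuses further encryptions once a key’s budget is spent. The class and constant names are illustrative, not from any particular library, and a real system would persist the counter alongside the key.

```python
import secrets

# Budget follows the GCM guidance discussed above (illustrative value).
GCM_INVOCATION_LIMIT = 2 ** 32

class KeyUsageTracker:
    """Counts encryptions under one key and flags exhaustion."""

    def __init__(self, key: bytes, limit: int = GCM_INVOCATION_LIMIT):
        self.key = key
        self.limit = limit
        self.uses = 0

    @property
    def exhausted(self) -> bool:
        return self.uses >= self.limit

    def record_encryption(self) -> None:
        # Called once per encryption; raises instead of overusing the key.
        if self.exhausted:
            raise RuntimeError("key exhausted: rotate before further use")
        self.uses += 1

# Demo with a tiny limit so exhaustion is reachable.
tracker = KeyUsageTracker(secrets.token_bytes(32), limit=3)
for _ in range(3):
    tracker.record_encryption()
print(tracker.exhausted)  # prints True: the budget is spent
```

In practice the counter would be enforced by the key management system itself, so no application code can bypass it.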
Additionally, side-channel attacks can recover secret keys from cryptographic algorithms (including the pervasive AES) using physical measurements, such as power consumption. All electronic products work by pushing electrons around, and this induces an electromagnetic field. There is often nothing preventing anyone from measuring this EM radiation from close by, or even covertly from a distance. The TEMPEST attack demonstrated how keys could be recovered simply by listening to the electrical emanations leaked by a laptop or computer running an algorithm. The researchers were able to obtain the secret key after observing 66 decryption processes, each lasting around 0.05 seconds. Ultimately, the more a key is used, the greater the probability that it will be exploited by an adversary, perhaps undetectably, resulting in the compromise of the key.
To borrow from the IETF Network Working Group, the term "key management" refers to the establishment of cryptographic keying material for use with a cryptographic algorithm to provide protocol security services, especially integrity, authentication, and confidentiality. The effective implementation of a key management system reduces the scope of data protection to just the keys and certain metadata. The entire security of any given scheme therefore depends on the security of the keys — specifically, the control and maintenance of key lifecycles, usage and sharing.
As noted by Vladimir Soukharev, Chief Post-Quantum Researcher & Cryptographer at InfoSec Global, “the amount of information leaked about a key or the data which it protects depends on the cryptographic scheme, software, and hardware environments in which the key is used”. Assuming they are correctly implemented and have not fallen foul of human error, key rotation and key hierarchies, as parts of a key management system, directly minimise the risks associated with key exhaustion or overuse.
Encryption keys are not meant to last forever, and the probability of a breach increases the longer a key remains in use. When an organisation considers the threat of an attack, it doesn’t know when the attack will take place, where it will happen, or how long the attacker will have access to its keys. Key rotation minimises exposure to such an attacker: an old encryption key is retired and replaced by a newly generated cryptographic key. Rotating keys on a regular basis helps meet industry standards and cryptographic best practices. Key rotation also reduces the amount of content encrypted with any single key — especially relevant given the side-channel and TEMPEST attacks against AES mentioned previously. If a key is rotated every day, only that day’s information can be decrypted by an attacker who compromises it. So, while key rotation in and of itself cannot guard against compromise, it does reduce the cost of a compromise when one occurs.
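The rotation idea above can be sketched in a few lines of Python. This is a simplified, in-memory illustration (all names are hypothetical): the current key is retired once its rotation period elapses, and retired keys are kept only so previously encrypted data can still be decrypted, never for new encryptions.

```python
import secrets
import time

class RotatingKeyStore:
    """Illustrative sketch: retire and replace a key after a fixed period."""

    def __init__(self, rotation_period_s: float):
        self.rotation_period_s = rotation_period_s
        self.retired: list[bytes] = []  # kept for decryption of old data only
        self._rotate()

    def _rotate(self) -> None:
        # Generate a fresh 256-bit key and stamp its creation time.
        self.current = secrets.token_bytes(32)
        self.created_at = time.time()

    def active_key(self) -> bytes:
        # Rotate transparently once the current key's period has elapsed.
        if time.time() - self.created_at >= self.rotation_period_s:
            self.retired.append(self.current)
            self._rotate()
        return self.current

store = RotatingKeyStore(rotation_period_s=0.01)
k1 = store.active_key()
time.sleep(0.02)
k2 = store.active_key()  # a different key once the period has elapsed
```

A production system would of course use a much longer period, persist the keys securely, and destroy retired keys once their cryptoperiod fully ends.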
An integral part of key rotation is the establishment of an appropriate and well-considered key lifecycle — i.e. the period during which a key is active or authorised for use. A key’s lifecycle should be matched to the strength needed to withstand attacks, and a key management system should ensure the key’s secure progression through each stage of the lifecycle to deactivation. The National Institute of Standards and Technology (NIST) provides strict guidelines for most aspects of the lifecycle of cryptographic keys and has also defined standards on how a cryptoperiod is determined for each key.
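The lifecycle stages NIST describes can be modelled as a simple state machine that only ever moves a key forward. The sketch below is a simplified illustration, not the full NIST SP 800-57 state model, and the state names are an assumption for the example:

```python
from enum import Enum, auto

class KeyState(Enum):
    PRE_ACTIVATION = auto()
    ACTIVE = auto()
    DEACTIVATED = auto()
    DESTROYED = auto()

# Legal forward-only transitions; a destroyed key goes nowhere.
ALLOWED = {
    KeyState.PRE_ACTIVATION: {KeyState.ACTIVE, KeyState.DESTROYED},
    KeyState.ACTIVE: {KeyState.DEACTIVATED},
    KeyState.DEACTIVATED: {KeyState.DESTROYED},
    KeyState.DESTROYED: set(),
}

class ManagedKey:
    """Enforces that a key never re-enters an earlier lifecycle stage."""

    def __init__(self) -> None:
        self.state = KeyState.PRE_ACTIVATION

    def transition(self, new_state: KeyState) -> None:
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state

key = ManagedKey()
key.transition(KeyState.ACTIVE)
key.transition(KeyState.DEACTIVATED)
```

Attempting to re-activate a deactivated key raises an error, which is exactly the property a key management system must enforce.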
The logic behind key rotation is clear. However, it raises the question: is there a deterministic process that we can follow to always compute fresh, secure keys?
This brings us to key hierarchies. A key hierarchy organises encryption keys so that a root (or master) key is used to derive and encrypt keys that are themselves used to encrypt the actual data you want to protect. It allows for the segmentation of data, meaning that different keys can be deployed for different data sets, which can then be managed separately. As long as you have the root key, all the other keys can be recomputed. Key hierarchies avoid the issues that come with encrypting too much data using a single key, or using one key for multiple purposes. Similar to key rotation, they reduce the impact of leaking (or even losing) a key. Key hierarchies ensure the integrity of the data while concentrating all data protection efforts on the root key. However, as Soukharev notes, “the key management part would apply not only to protecting the master key but also to ensuring that the key hierarchies are well implemented and executed for the data keys”.
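A two-level hierarchy can be sketched in a few lines, here using HMAC-SHA-256 as the derivation function. The labels and function names are illustrative, and a production system would typically use a vetted KDF such as HKDF; the point is only that any data key can be recomputed from the root key plus a label, so different data sets get independent keys.

```python
import hashlib
import hmac
import secrets

def derive_data_key(root_key: bytes, label: str) -> bytes:
    """Derive a per-dataset key from the root key (HMAC-SHA-256 as KDF)."""
    return hmac.new(root_key, label.encode(), hashlib.sha256).digest()

root = secrets.token_bytes(32)

# Different labels yield independent keys for separately managed data sets.
invoices_key = derive_data_key(root, "dataset:invoices/v1")
logs_key = derive_data_key(root, "dataset:logs/v1")

# The same label always reproduces the same key from the root key.
assert invoices_key != logs_key
assert derive_data_key(root, "dataset:invoices/v1") == invoices_key
```

Versioned labels (the `/v1` suffix here) also make it easy to rotate an individual data key without touching the root key or the other data sets.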
As noted by the National Institute of Standards and Technology, “the proper management of cryptographic keys is essential to the effective use of cryptography for security. Keys are analogous to the combination of a safe. If a safe combination is known to an adversary, the strongest safe provides no security against penetration. Similarly, poor key management may easily compromise strong algorithms”. IT systems work to protect operations by keeping external threats out, and encryption plays an important part in this; however, as mentioned above, there is no such thing as definitive security.
An organisation looking to deploy encryption and a key management scheme needs to work out its threat model — i.e. the power and capability of an attacker. From there, it can decide what security properties it needs to achieve. With respect to symmetric keys and the unwanted leakage of information, Burns makes the point that “formal proofs of leakage resiliency depend upon security models with certain assumptions. For instance, a cipher may be secure against non-adaptive chosen-ciphertext attacks, but insecure against an adaptive chosen-ciphertext attack. Similarly, it may be possible for an AES implementation to be formally proven leakage resistant in the Continuous Bounded-Range Leakage model, but not resilient for the dth-order side-channel security model”. If an organisation is unable to enumerate the power of an attacker, then it must assume the worst and deploy the strongest security policy possible.
The growing appetite for digital connectedness, interoperability and automation in business carries with it a corresponding increase in the sophistication of security threats. Ultimately, the dynamic nature of technological change and development is matched only by the dynamic and adaptive nature of malicious actors. Burns posits that “the practical resources of implementers and adversaries may not adhere to the theoretical constraints of a particular security model — a security proof may fail in one model while no current technology exists to crack a key in practice, and a proof that is too narrow may not account for recent advancements in technology”. Using the adage “all models are wrong, but some are useful”, he adds that security models for leakage resiliency will also continue to evolve as technology progresses.
In an ideal setting, key overuse should not lead to compromise, only make attacks slightly easier to perform. However, Soukharev makes the point that, in practice, an ideal setting does not exist: “for the real-world applications, keys are usually not protected as well as they should be, cryptographic algorithms are not updated as fast as they should be and the general key management is heavily impacted by the human factor.” While there may be no panacea for data security, the execution of a competent key management strategy remains the bare minimum requirement for any organisation or data centre.
References

- “NIST SP 800-38D – Recommendation for Block Cipher Modes of Operation: Galois/Counter Mode (GCM) and GMAC” (2007), National Institute of Standards and Technology
- “TEMPEST attacks against AES: Covertly stealing keys for €200” (undated, retrieved April 2019), Fox-IT
- “Schneier on Security: ‘Practical TEMPEST Attack’” (2016), Bruce Schneier
- “Guidelines for Cryptographic Key Management” (2005), S. Bellovin and R. Housley
- “Exploring the Lifecycle of a Cryptographic Key” (2018), Chris Allen
- “Recommendation for Key Management – Part 1: General (Revision 3)” (2012), Elaine Barker, William Barker, William Burr, William Polk, and Miles Smid, National Institute of Standards and Technology
- “Theory and Practice of a Leakage Resilient Masking Scheme” (2012), J. Balasch, S. Faust, B. Gierlichs, and I. Verbauwhede, in Lecture Notes in Computer Science
- “On the Practical Security of a Leakage Resilient Masking Scheme” (2013), Emmanuel Prouff and Matthieu Rivain