This is no exaggeration: there’s a ticking time bomb at the heart of the computing platforms that enable our digital world. Quantum computers, often heralded as the answer to so many of humanity’s unfulfilled ambitions, are also harbingers of unprecedented disruption thanks to that same mind-boggling processing power. Experts fear that a sufficiently advanced quantum computer could break most modern encryption protocols within a decade. Which is why the finalisation of PQC (post-quantum cryptography) algorithms deserves greater attention, and why it matters for bolstering global encryption standards.
In August 2024, NIST (National Institute of Standards and Technology) announced the final set of “encryption tools designed to withstand the attack of a quantum computer.” The selection process, part of NIST’s PQC standardisation project that began in 2016, took nearly a decade: over 80 encryption methods submitted by the world’s foremost cryptographic experts were narrowed down to the three PQC algorithms that received NIST’s approval. This is a crucial step, make no mistake. As part of the US Department of Commerce, NIST is tasked with developing standards and best practices for the US government and its agencies to adopt, so when it releases a set of new cryptographic standards designed to withstand cyberattacks from a quantum computer, its importance cannot be overstated.
Quantum computers are fundamentally different from all the traditional computers we know and use to date, whether desktop PCs, laptops, smartphones, servers, or supercomputers, and for certain tasks they are mind-bogglingly quicker. Those tasks include factoring large integers and solving discrete logarithms, which happen to be the mathematical foundations of many modern encryption protocols, such as RSA (Rivest–Shamir–Adleman) and ECC (Elliptic Curve Cryptography).
For example, using Shor’s algorithm, a quantum computer could factorise a large number exponentially faster than the best known classical methods, effectively breaking RSA encryption. This capability poses a significant risk to current cryptographic systems, as it could allow attackers to decrypt secure communications, financial transactions, and sensitive data with unprecedented speed. Imagine everything from your private emails, personal messages across various encrypted apps, online banking activity and e-commerce purchases laid bare; everything you ever learned about trusting the padlock icon, from your web browser to your messaging apps, gone in an instant, with nothing encrypted or private anymore and everything exposed for cyberattackers to exploit willy nilly. That’s what’s at stake: the very fabric of the cryptographic systems that underpin our modern digital infrastructure, and their legitimacy in the quantum computing era.
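To make that threat concrete, here is a minimal, purely illustrative Python sketch (not from NIST, IBM, or any real attack toolkit) of why efficient factoring breaks RSA: the moment an attacker learns the prime factors of the public modulus, the private key can be reconstructed and ciphertexts decrypted. The tiny primes and brute-force loop below merely stand in for what Shor’s algorithm on a large quantum computer would do to real 2048-bit keys.

```python
# Toy RSA with deliberately tiny primes; real keys use 2048-bit (or larger) moduli.
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+ modular inverse)

message = 42
ciphertext = pow(message, e, n)     # anyone can encrypt with the public key (n, e)

def break_rsa(n, e, ciphertext):
    """Recover the plaintext given only the public key and the ciphertext."""
    # Brute-force factoring stands in for Shor's algorithm at this toy scale.
    factor = next(x for x in range(2, n) if n % x == 0)
    p_found, q_found = factor, n // factor
    # Knowing the factors, rebuild the private exponent and decrypt.
    d_found = pow(e, -1, (p_found - 1) * (q_found - 1))
    return pow(ciphertext, d_found, n)

assert break_rsa(n, e, ciphertext) == message == pow(ciphertext, d, n)
print("Plaintext recovered by factoring n:", break_rsa(n, e, ciphertext))
```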
NIST has finalised the following three PQC standards in order to bolster modern public-key cryptography infrastructure for the quantum era:
ML-KEM (derived from CRYSTALS-Kyber), ML-DSA (derived from CRYSTALS-Dilithium), and SLH-DSA (derived from SPHINCS+) are the three brand new post-quantum cryptography algorithms. ML-KEM is a lattice-based key encapsulation mechanism selected for general encryption, such as for accessing secured websites and apps. ML-DSA is a lattice-based algorithm picked as the primary choice for general-purpose digital signatures. SLH-DSA is a stateless hash-based digital signature scheme, built on different mathematics to serve as a backup. According to IBM, both ML-KEM and ML-DSA were developed by IBM Research cryptography researchers in Zurich with external collaborators, and SLH-DSA was co-developed by a scientist who has since joined IBM Research.
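To illustrate what a key encapsulation mechanism actually does in practice, here is a hedged sketch of the generate/encapsulate/decapsulate flow. It assumes the Open Quantum Safe project’s liboqs-python bindings and the algorithm identifier "ML-KEM-768"; both the package and that exact name are assumptions (older builds expose the same scheme as "Kyber768"), so treat this as a sketch of the KEM contract rather than a drop-in implementation.

```python
# pip install liboqs-python   (assumption: also requires the liboqs C library)
import oqs

KEM_ALG = "ML-KEM-768"  # assumption: identifier in recent liboqs releases; older ones use "Kyber768"

with oqs.KeyEncapsulation(KEM_ALG) as receiver, oqs.KeyEncapsulation(KEM_ALG) as sender:
    # 1. The receiver (say, a web server) publishes a public key.
    public_key = receiver.generate_keypair()

    # 2. The sender derives a fresh shared secret plus a ciphertext that encapsulates it.
    ciphertext, shared_secret_sender = sender.encap_secret(public_key)

    # 3. The receiver decapsulates the ciphertext with its private key to get the same
    #    secret, which both sides can then feed into a symmetric cipher such as AES.
    shared_secret_receiver = receiver.decap_secret(ciphertext)

    assert shared_secret_sender == shared_secret_receiver
```

The two signature schemes, ML-DSA and SLH-DSA, follow the familiar sign/verify pattern instead; this keygen/encapsulate/decapsulate shape is what slots into protocols like TLS.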
“One of the main reasons for delayed implementation is uncertainty about what exactly needs to be implemented,” commented Whitfield Diffie, now a cryptography researcher at IBM, who, alongside collaborator Martin Hellman, introduced the Diffie-Hellman key exchange protocol nearly five decades ago, a bedrock of public-key cryptography that enables many secure communication protocols, including the SSL/TLS used in HTTPS. “Now that NIST has announced the exact standards, organisations are motivated to move forward with confidence,” he added.
An early mover in the quantum computing race, IBM deserves acknowledgement for its significant contributions towards developing these PQC algorithm standards. “IBM has worked for years to bring our expertise forward and prepare for this new era of computing,” emphasised Dr Dario Gil, Senior Vice President and Director, IBM Research.
There is, in fact, a fourth draft standard, based on FALCON, which is planned for finalisation in late 2024. According to NIST, it will be another digital signature algorithm, built on NTRU lattices, and is expected to be released as FN-DSA.
It isn’t just IBM leading the way in the development and adoption of the new PQC algorithms. As IBM noted in an update to its Quantum Safe platform, several other tech companies are moving quickly to safeguard their platforms and users from “harvest now, decrypt later” cyberattacks. Apple has already begun rolling out ML-KEM’s precursor, Kyber, in the PQ3 protocol for iMessage, and Google Chrome now supports a hybrid KEM implementation. Signal, known for its water-tight encrypted messaging, already secures chats with PQC, and Zoom Workplace calls are also protected by PQC algorithms. Even Cloudflare, which handles roughly a quarter of all connection requests across the internet, has started turning on a version of ML-KEM encryption.
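The “hybrid” approach mentioned above for Chrome is worth a quick sketch: the session key is derived from both a classical key exchange and a post-quantum KEM secret, so an attacker has to break both. The snippet below is a simplified illustration using the widely available Python cryptography package for the classical X25519 half; the post-quantum secret is faked with random bytes purely as a stand-in for an ML-KEM output, and the variable names are my own, not Chrome’s or Cloudflare’s.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical half: an ordinary X25519 Diffie-Hellman exchange.
client_priv = X25519PrivateKey.generate()
server_priv = X25519PrivateKey.generate()
classical_secret = client_priv.exchange(server_priv.public_key())

# Post-quantum half: in a real deployment this would be the shared secret from an
# ML-KEM encapsulation/decapsulation; random bytes stand in here purely for illustration.
pq_secret = os.urandom(32)

# Combine both inputs so the session key stays safe unless BOTH halves are broken.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid x25519 + ml-kem demo",
).derive(classical_secret + pq_secret)

print("256-bit hybrid session key:", session_key.hex())
```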
Apart from contributing to their co-development, the likes of Amazon, Google and Microsoft are also in the process of adopting the NIST-finalised PQC standards at scale. It shows how big tech enterprises recognise the urgent need to safeguard their hardware, software and networks with quantum-safe technology.
In conclusion, I’m reminded of something Dario Gil said when I interviewed him back in 2022. “What I worry about sometimes is that as impressive as the rate of progress we have seen around technology in the recent past, if we do not succeed in improving our speed of innovation then some of these big challenges may be too overwhelming for humans to solve. Innovations like quantum computing we have to channel with care and consideration, because we’re going to need every help we can get from these emerging technologies to solve everything from climate change to sustainability and beyond,” he emphasised.
Make no mistake, the finalisation of PQC algorithms represents a critical milestone in defusing the ticking time bomb at the heart of our interconnected digital infrastructure. Adopting these new standards is an essential step in safeguarding our data, ensuring that our encryption methods evolve to meet the challenge so that, as quantum computing scales up, it doesn’t bring down the digital world as we know it.