Post-Quantum Cryptography
Deploying Hybrid Stacks: Combining Classical and Quantum-Safe Key Exchange
Discover how to maintain backward compatibility and current security certifications by wrapping PQC algorithms with classical schemes like X25519.
The Evolution Toward Post-Quantum Resiliency
The current security of our digital economy depends almost entirely on the mathematical hardness of factoring large integers and computing discrete logarithms. These problems underpin RSA and Elliptic Curve Cryptography, which protect everything from banking transactions to private messaging. However, the advent of quantum computing introduces a fundamental shift in this landscape by making these specific mathematical problems solvable in polynomial time.
A sufficiently powerful quantum computer running Shor's algorithm could break the cryptographic primitives we rely on today. While no such machine yet exists at a scale that threatens production systems, the danger is present, not hypothetical. This leads us to the concept of "Harvest Now, Decrypt Later," where adversaries capture encrypted traffic today and store it for decryption in a future quantum era.
Post-Quantum Cryptography refers to new cryptographic algorithms designed to be secure against both quantum and classical computers. NIST has standardized several of these algorithms, most notably ML-KEM for key encapsulation and ML-DSA for digital signatures. These algorithms rely on different mathematical foundations like structured lattices which are currently believed to be resistant to quantum attacks.
The primary challenge for software engineers is not just the selection of these new algorithms but the safe transition to them. Because PQC algorithms are relatively new and have not undergone decades of production scrutiny, jumping to them exclusively is considered a high-risk move. This creates the necessity for hybrid schemes that leverage both classical and post-quantum methods simultaneously.
Understanding the Harvest Now, Decrypt Later Threat
The Harvest Now, Decrypt Later threat is a strategic risk for any data with a long shelf life. If your application handles health records, government secrets, or financial archives that must remain confidential for twenty years, you are already vulnerable. An attacker can record your TLS handshakes today and simply wait for quantum hardware to mature.
This threat model changes the urgency of cryptographic migration for many organizations. It is no longer enough to wait for the first quantum computer to be built before upgrading your infrastructure. By the time the hardware arrives, the historical data already captured by adversaries will be compromised.
To mitigate this, developers must implement forward-looking security measures that provide quantum resistance today. Hybrid cryptography allows you to add quantum protection to your existing security layers without discarding the certifications and proven reliability of classical algorithms like X25519.
The Strategy of Hybrid Cryptography
Hybrid cryptography is an architectural pattern in which a classical cryptographic algorithm is combined with a post-quantum algorithm in a single operation. The goal is to ensure that the resulting security is at least as strong as the stronger of the two components. If the post-quantum algorithm is later found to have a flaw, the classical algorithm still protects the session.
This approach provides a safety net for developers while remaining compatible with regulatory regimes such as FIPS 140-3. Many compliance frameworks require the use of approved classical algorithms, which can make a pure PQC implementation non-compliant for enterprise use cases until validated PQC modules are broadly available. Hybrid schemes resolve this by keeping the classical primitive as the compliance baseline.
Hybridization is the most responsible way to deploy post-quantum cryptography in production environments today. It allows us to gain quantum resistance while hedging against potential weaknesses in these comparatively young PQC mathematical models.
Designing the Hybrid Key Encapsulation Mechanism
Key Encapsulation Mechanisms are used during a handshake to establish a shared secret between two parties. In a hybrid KEM, we perform two independent key exchanges: one using a classical method like X25519 and one using a post-quantum method like ML-KEM-768. The two resulting shared secrets are then combined into a single master secret.
The combining process usually involves a Key Derivation Function that takes both secrets as inputs. This ensures that an attacker would need to break both the classical discrete logarithm problem and the post-quantum lattice problem to recover the session key. This dual-layer protection is the cornerstone of the migration strategy recommended by many standards bodies and national cyber-security agencies.
Developers must be careful about how they concatenate these secrets and how they handle errors. If the PQC portion of the handshake fails, the system should be able to decide whether to fall back to classical-only mode or terminate the connection based on the security policy of the application.
Implementing X25519-MLKEM-768 Concatenation
When implementing a hybrid KEM, the most common approach is to concatenate the public keys and the ciphertexts of both algorithms. For instance, a client sends a combined public key containing both the X25519 public key and the ML-KEM-768 public key. The server responds with the respective ciphertexts for both.
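A sketch of that wire layout follows, with a classical-first ordering chosen purely for illustration; the actual byte order is fixed by the negotiated group's specification. The key sizes come from the X25519 and ML-KEM-768 specifications:

```go
package main

import "fmt"

// Public key sizes from the X25519 and ML-KEM-768 specifications.
const (
	x25519PubLen   = 32
	mlkem768PubLen = 1184
)

// encodeHybridShare concatenates the two public keys into one key-share
// payload. Classical-first ordering is illustrative only; real deployments
// must follow the layout defined for the negotiated group.
func encodeHybridShare(classicalPub, pqcPub []byte) ([]byte, error) {
	if len(classicalPub) != x25519PubLen || len(pqcPub) != mlkem768PubLen {
		return nil, fmt.Errorf("unexpected key lengths: %d, %d", len(classicalPub), len(pqcPub))
	}
	out := make([]byte, 0, x25519PubLen+mlkem768PubLen)
	out = append(out, classicalPub...)
	out = append(out, pqcPub...)
	return out, nil
}

// splitHybridShare reverses the encoding on the receiving side. Because both
// components are fixed-width, no length prefixes are needed.
func splitHybridShare(share []byte) (classicalPub, pqcPub []byte, err error) {
	if len(share) != x25519PubLen+mlkem768PubLen {
		return nil, nil, fmt.Errorf("unexpected share length: %d", len(share))
	}
	return share[:x25519PubLen], share[x25519PubLen:], nil
}

func main() {
	share, _ := encodeHybridShare(make([]byte, x25519PubLen), make([]byte, mlkem768PubLen))
	fmt.Println(len(share)) // 1216
}
```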
After both parties have performed their side of the exchange, they end up with two raw shared secrets. These secrets are concatenated and passed into a HKDF-SHA256 function to produce the final symmetric key. This ensures the output entropy is derived from both sources.
import (
    "crypto/sha256"
    "fmt"
    "io"

    "golang.org/x/crypto/hkdf"
)

func DeriveHybridSecret(classicalSecret, pqcSecret []byte) ([]byte, error) {
    // Concatenate the two secrets into fresh input key material.
    // A bare append(classicalSecret, pqcSecret...) could write into
    // classicalSecret's backing array, so copy into a new slice instead.
    combinedInput := make([]byte, 0, len(classicalSecret)+len(pqcSecret))
    combinedInput = append(combinedInput, classicalSecret...)
    combinedInput = append(combinedInput, pqcSecret...)

    // Use HKDF to derive a single, uniform master secret.
    // The salt and info parameters must be consistent with your protocol spec.
    hkdfReader := hkdf.New(sha256.New, combinedInput, nil, []byte("hybrid-kem-v1"))

    masterSecret := make([]byte, 32)
    if _, err := io.ReadFull(hkdfReader, masterSecret); err != nil {
        return nil, fmt.Errorf("failed to derive shared secret: %w", err)
    }

    return masterSecret, nil
}

Key Derivation and Domain Separation
Domain separation is critical when combining multiple cryptographic inputs to prevent cross-protocol attacks. When feeding the concatenated secrets into a KDF, you should include a unique string in the info parameter that identifies the specific hybrid combination being used. This prevents the same secret from being misused in different contexts.
It is also important to follow the concatenation order defined by the protocol standard, since the order is fixed per named group; the TLS X25519MLKEM768 group, for example, places the ML-KEM shared secret before the X25519 secret. Deviating from the standard order will result in interoperability failures with other libraries and services.
Robust implementations should also zeroize the intermediate raw secrets immediately after the final master secret is derived. This minimizes the time that sensitive key material remains in memory, reducing the risk of side-channel attacks or memory leaks.
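A best-effort zeroization helper is a few lines of Go. Note the caveat in the comment: the runtime may have made copies the code cannot reach, so this is hygiene rather than a guarantee:

```go
package main

import "fmt"

// zeroize overwrites sensitive key material in place. The Go runtime may
// hold copies elsewhere (compiler temporaries, stack growth), so this is
// best-effort hygiene, not a hard guarantee of erasure.
func zeroize(b []byte) {
	for i := range b {
		b[i] = 0
	}
}

func main() {
	classicalSecret := []byte{0xAA, 0xBB, 0xCC}
	pqcSecret := []byte{0x11, 0x22, 0x33}

	// Wipe the raw component secrets once the master secret is derived.
	defer zeroize(classicalSecret)
	defer zeroize(pqcSecret)

	// ... derive master secret here ...
	zeroize(classicalSecret)
	fmt.Println(classicalSecret) // [0 0 0]
}
```

Using defer ties the wipe to the function's exit path, so the secrets are cleared even on early error returns.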
Performance Implications and Network Constraints
Transitioning to PQC is not a free upgrade; it comes with significant changes in performance and network utilization. Classical algorithms like X25519 are highly optimized and result in very small keys and signatures. In contrast, post-quantum algorithms like ML-KEM and ML-DSA use much larger data structures.
For example, an X25519 public key is only 32 bytes, whereas an ML-KEM-768 public key is 1184 bytes. This shift represents an order of magnitude increase in the amount of data sent during a TLS handshake. Engineers must account for this when designing high-throughput systems or applications running on bandwidth-constrained networks.
CPU overhead also increases, although the impact varies. While ML-KEM is quite efficient and can sometimes outperform RSA, the overall combined cost of running two algorithms in a hybrid mode will always be higher than running one. This impacts the number of handshakes a single server can handle per second.
Analyzing Latency and Payload Sizes
The increased size of PQC public keys and ciphertexts can push handshake messages past the Maximum Transmission Unit of a network link, forcing them to be split across multiple packets. This increases exposure to packet loss and can add round-trip latency. The effect is particularly noticeable in UDP-based protocols like QUIC, whose initial handshake flights are designed to fit in very few packets.
In microservice environments, this overhead can accumulate. If every service-to-service call requires a new hybrid handshake, the total latency of a request chain could increase significantly. Using session resumption or persistent connections becomes even more critical when deploying PQC.
- X25519 Public Key: 32 bytes vs ML-KEM-768 Public Key: 1184 bytes
- Classical TLS key-exchange data: ~300-500 bytes vs Hybrid PQC key-exchange data: ~3000+ bytes
- CPU cycles for ML-KEM key generation are generally 2-4x higher than X25519
- UDP-based protocols like QUIC may face issues with amplification limits due to larger server responses
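A quick ceiling-division estimate makes the packet-count impact of those numbers concrete. The 1200-byte usable payload below is an assumed figure, roughly in line with QUIC's conservative initial datagram budget:

```go
package main

import "fmt"

// packetsNeeded estimates how many datagrams a handshake flight occupies
// given a usable per-packet payload. Values are illustrative, not measured.
func packetsNeeded(flightBytes, payloadPerPacket int) int {
	return (flightBytes + payloadPerPacket - 1) / payloadPerPacket
}

func main() {
	const payload = 1200 // assumed usable bytes per initial packet

	fmt.Println(packetsNeeded(500, payload))  // classical flight fits in 1 packet
	fmt.Println(packetsNeeded(3000, payload)) // hybrid flight spreads across 3
}
```

Each extra packet is another opportunity for loss and retransmission, which is why the list above matters for lossy or high-latency links.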
Operational Monitoring and Scaling
As you deploy hybrid schemes, your monitoring strategy must evolve to track the performance impact on end users. You should measure handshake duration specifically for hybrid versus classical connections to identify potential network bottlenecks. This data is vital for deciding which PQC parameter sets to use in different regions.
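A minimal sketch of per-group latency tracking follows. A real deployment would export histograms to a metrics system rather than compute in-memory means, and the group labels are illustrative:

```go
package main

import "fmt"

// durationStats aggregates handshake latency per negotiated key-exchange
// group so hybrid and classical connections can be compared side by side.
type durationStats struct {
	totalMillis map[string]float64
	count       map[string]int
}

func newDurationStats() *durationStats {
	return &durationStats{
		totalMillis: map[string]float64{},
		count:       map[string]int{},
	}
}

// record adds one observed handshake duration for the given group.
func (s *durationStats) record(group string, millis float64) {
	s.totalMillis[group] += millis
	s.count[group]++
}

// mean returns the average duration for a group, or 0 with no samples.
func (s *durationStats) mean(group string) float64 {
	if s.count[group] == 0 {
		return 0
	}
	return s.totalMillis[group] / float64(s.count[group])
}

func main() {
	stats := newDurationStats()
	stats.record("X25519MLKEM768", 12.5)
	stats.record("X25519MLKEM768", 13.5)
	stats.record("x25519", 9.0)

	fmt.Printf("hybrid mean: %.1fms, classical mean: %.1fms\n",
		stats.mean("X25519MLKEM768"), stats.mean("x25519"))
}
```

Comparing the two series over time is what reveals whether the hybrid overhead is dominated by CPU cost or by the extra network round trips.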
Load balancers and TLS terminators may need hardware upgrades or configuration changes to handle the increased computational load. If your infrastructure uses hardware security modules, you must verify that they support the specific PQC algorithms and can handle the larger key sizes without performance degradation.
Testing with real-world network conditions is essential. Developers should simulate high-latency and high-loss environments to see how the larger PQC payloads behave. This helps in tuning timeout values and buffer sizes in the networking stack.
Practical Deployment in Modern Infrastructure
Integrating PQC into an existing enterprise environment requires a phased approach. You cannot simply update all clients and servers simultaneously. Instead, you should implement a negotiation mechanism where the client offers both classical and hybrid cipher suites, and the server selects the most secure option that both parties support.
This negotiation is natively handled in TLS 1.3 through the supported_groups and key_share extensions. By adding new group identifiers for hybrid schemes like X25519MLKEM768, you can maintain backward compatibility with legacy clients while providing quantum security to updated ones.
Modern cryptographic libraries like OpenSSL 3.x and AWS-LC have already started adding support for these hybrid groups. Leveraging these established libraries is much safer than attempting to implement the lattice mathematics from scratch, which is prone to subtle implementation errors.
Integrating with TLS 1.3 Workflows
In a TLS 1.3 handshake, the client includes a Key Share extension containing its public keys. To support hybrid PQC, the client will send an additional key share for the hybrid group. The server then checks its own configuration to see if it recognizes and prefers the hybrid group over the classical-only options.
If the server supports the hybrid scheme, it responds with its own hybrid key share. If it does not, it simply falls back to the standard X25519 key share provided by the client. This ensures that the connection is never broken, even if one of the parties is not yet PQC-ready.
const tls = require('tls');

// Configure the client to prefer the hybrid X25519 + ML-KEM group.
// Note: group names may vary based on the underlying OpenSSL version.
const options = {
  host: 'secure-api.internal',
  port: 443,
  ecdhCurve: 'X25519MLKEM768:X25519:secp256r1',
  servername: 'secure-api.internal'
};

const socket = tls.connect(options, () => {
  console.log('Connected with:', socket.getCipher().name);
  console.log('Key Exchange Group:', socket.getEphemeralKeyInfo().name);
});