# Tag: Asymmetric Cryptography

## We Need Non-Interactive Post-Quantum KEMs

Design a system that allows you to encrypt data online, but only decrypt it offline (i.e. in an airgapped environment).

If you’re in the world of symmetric cryptography, this is impossible. Fortunately, we can use asymmetric cryptography to accomplish this goal.

In the 2000s, your design might have looked like this:

1. Generate a 128-bit random key `k`.
2. Encrypt `k` with an RSA public key `p` to obtain `w`.
3. Encrypt the message `m` with `k`, using AES-CBC, to obtain `c`.
4. Store `w`, `c` for offline decryption, wipe `k` from memory.
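A minimal sketch of that 2000s-era workflow, using Node's built-in crypto. (OAEP padding is my assumption; the original design only says "RSA". The IV must be stored alongside `c` for CBC decryption.)

```typescript
import {
  generateKeyPairSync, publicEncrypt, privateDecrypt,
  createCipheriv, createDecipheriv, randomBytes, constants
} from 'node:crypto'

// The recipient's long-term RSA keypair; the private key lives offline
const { publicKey, privateKey } = generateKeyPairSync('rsa', { modulusLength: 2048 })

const message = Buffer.from('example plaintext')

// 1. Generate a 128-bit random key k
const k = randomBytes(16)

// 2. Encrypt k with the RSA public key to obtain w
const w = publicEncrypt({ key: publicKey, padding: constants.RSA_PKCS1_OAEP_PADDING }, k)

// 3. Encrypt the message with k, using AES-CBC, to obtain c
const iv = randomBytes(16)
const cipher = createCipheriv('aes-128-cbc', k, iv)
const c = Buffer.concat([cipher.update(message), cipher.final()])

// 4. Store (w, iv, c) for offline decryption; k can now be wiped

// Later, in the airgapped environment: recover k from w, then decrypt c
const k2 = privateDecrypt({ key: privateKey, padding: constants.RSA_PKCS1_OAEP_PADDING }, w)
const decipher = createDecipheriv('aes-128-cbc', k2, iv)
const recovered = Buffer.concat([decipher.update(c), decipher.final()])
```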

Since 2015, your approach might have shifted a bit:

1. Generate an ephemeral ECDH keypair $(sk_e, pk_e)$.
2. Perform a scalar multiplication of the ephemeral secret key $sk_e$ with the recipient’s public key $pk$.
3. Hash the output of step 2 to derive a 256-bit symmetric key, $k$.
4. Encrypt the message $m$ with $k$, using an AEAD mode, to yield $(c, t)$.
5. Store $(pk_e, c, t)$ for offline decryption, wipe $k$ from memory.
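The 2015-era flow, sketched with Node's built-in crypto. (X25519 for the ECDH step, HKDF-SHA256 for the hash, and AES-256-GCM for the AEAD are my choices; the original steps don't pin down specific primitives.)

```typescript
import {
  generateKeyPairSync, diffieHellman, hkdfSync,
  createCipheriv, createDecipheriv, randomBytes
} from 'node:crypto'

// Recipient's long-term keypair; the secret key stays offline
const recipient = generateKeyPairSync('x25519')

const message = Buffer.from('example plaintext')

// 1. Generate an ephemeral X25519 keypair
const ephemeral = generateKeyPairSync('x25519')

// 2. Scalar multiplication of the ephemeral secret key with the recipient's public key
const shared = diffieHellman({
  privateKey: ephemeral.privateKey,
  publicKey: recipient.publicKey,
})

// 3. Hash the shared secret to derive a 256-bit key k
const k = Buffer.from(hkdfSync('sha256', shared, Buffer.alloc(0), 'example-protocol', 32))

// 4. Encrypt with an AEAD mode to yield (c, t)
const nonce = randomBytes(12)
const cipher = createCipheriv('aes-256-gcm', k, nonce)
const c = Buffer.concat([cipher.update(message), cipher.final()])
const t = cipher.getAuthTag()

// 5. Store (ephemeral public key, nonce, c, t); wipe k from memory

// Later, offline: the recipient recomputes the same shared secret
const shared2 = diffieHellman({
  privateKey: recipient.privateKey,
  publicKey: ephemeral.publicKey,
})
const k2 = Buffer.from(hkdfSync('sha256', shared2, Buffer.alloc(0), 'example-protocol', 32))
const decipher = createDecipheriv('aes-256-gcm', k2, nonce)
decipher.setAuthTag(t)
const recovered = Buffer.concat([decipher.update(c), decipher.final()])
```

Note that only the recipient's long-term secret key is needed for decryption; the ephemeral secret is discarded after step 2, which is what makes the scheme non-interactive.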

Different approaches, different algorithms, but the same workflow in both cases: we're using asymmetric cryptography to manage the symmetric key that performs the actual message encryption. As long as our asymmetric algorithms are secure, and our keys are kept away from attackers, this approach is secure.

This was made possible because RSA encryption and ECDH key agreement are both non-interactive protocols that operate with static keypairs.

## The CRQC Has Entered the Facility

Unfortunately, a Cryptography-Relevant Quantum Computer (CRQC) defeats both RSA and ECDH and renders the above algorithms insecure.

### NIST and Post-Quantum Cryptography

In response to the looming threat of a CRQC, NIST has been working with the cryptography community to standardize post-quantum asymmetric cryptography (KEMs and signature algorithms).

At the end of Round 3, some algorithms are being standardized and a few more are being studied.

#### NIST Post-Quantum Round 3 Finalists

- KEMs:
  - CRYSTALS-Kyber
- Signatures:
  - CRYSTALS-Dilithium
  - FALCON
  - SPHINCS+

#### NIST Post-Quantum Round 4 Candidates

- KEMs:
  - BIKE
  - Classic McEliece
  - HQC
  - SIKE

### Which KEMs Are Non-Interactive?

Let’s start with the Round 3 KEM finalist: Kyber. From the Kyber documentation:

> Using IND-CCA2 security by default makes it safe to use Kyber with static keys and as a consequence also to re-use ephemeral keys for some time.

If you can use Kyber with static keys, it logically follows that you can use Kyber in a non-interactive setting without facing insecurity.

So, y’know, good job, NIST!

However, this isn’t true of many Round 4 candidates.

#### BIKE

> Key reuse or adapting BIKE to asynchronous protocols (e.g. email) require to secure long term static keys. Those usage models are possible but no longer provides forward secrecy and require IND-CCA security. Note that they are not compliant with BIKE’s current specification.

— BIKE Specification

Static keys are a no-go with BIKE.

#### Classic McEliece

The cryptanalysis literature is unclear on the security of Classic McEliece with static keys, but the authors claim IND-CCA2.

However, the enormous public key sizes make it less attractive than alternatives.

#### HQC

Static keys are discussed briefly by the specification, but there are published attacks against static-key usage of both HQC and BIKE.

## Considerations for Round 4 and Beyond

Although it’s currently believed that CRYSTALS-Kyber is sufficient for non-interactive use cases, we’re putting a lot of eggs in one basket.

If Kyber is ever broken by cryptanalytic advancement, then we will need to ensure the alternatives we consider aren’t limited to the TLS use-case.

As it stands today, Classic McEliece is the only Round 4 candidate that might be safe for these use cases if Kyber is broken.

### Do We Really Need Offline Decryption?

Yes, and for reasons beyond keeping email encryption on life support.

A lot of systems implement this today with RSA. If your scope omits non-interactive key exchange, you’re leaving a lot of commercial use-cases in the dark.

Post-quantum security is important for TLS, and I don’t want to diminish the work that’s been done already, but it isn’t important only for TLS.

## Toward Hybrid Post-Quantum Signatures

The state-of-the-art digital signature algorithm, which is widely used and widely recommended by cryptography and security experts, is Ed25519.

Ed25519 is a specific instantiation of an algorithm called EdDSA over the edwards25519 elliptic curve. The other standardized instantiation of EdDSA is Ed448.

EdDSA was created in response to security issues that plagued the incumbent elliptic curve signature algorithm (ECDSA). Overall, it greatly improved the security of elliptic curve signatures across the Internet. Consequently, many things use Ed25519.

Of course, Ed25519 isn’t perfect. Here are just a few ways to make Ed25519 insecure.

## Double Public Key Vulnerability

If you develop a cryptography library whose API accepts an Ed25519 secret key and an Ed25519 public key as distinct inputs, and an attacker can ever influence the choice of the public key input, the secret key can be leaked.

### Mitigation

Treat Ed25519 secret keys as the tuple of (secret seed, public key) rather than independent inputs. This is what NaCl and libsodium do.

For high-level protocols and libraries, consider validating that the public key portion is the correct public key for a given seed. You don’t necessarily have to do this by default (since it costs an extra scalar multiplication), but exposing the API can mitigate against this sort of key-leaking misuse.
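A sketch of that validation check using Node's built-in crypto. The PKCS#8-prefix trick is my workaround for deriving an Ed25519 public key from a raw 32-byte seed; libsodium exposes this more directly via `crypto_sign_ed25519_sk_to_pk`. Function names are mine.

```typescript
import { createPrivateKey, createPublicKey, timingSafeEqual } from 'node:crypto'

// DER prefix for a PKCS#8-encoded Ed25519 private key; appending the
// raw 32-byte seed yields a structure Node's key parser accepts.
const PKCS8_ED25519_PREFIX = Buffer.from('302e020100300506032b657004220420', 'hex')

// Derive the raw 32-byte public key that corresponds to a given seed
function publicKeyFromSeed(seed: Buffer): Buffer {
  const priv = createPrivateKey({
    key: Buffer.concat([PKCS8_ED25519_PREFIX, seed]),
    format: 'der',
    type: 'pkcs8',
  })
  const spki = createPublicKey(priv).export({ format: 'der', type: 'spki' })
  // The raw public key occupies the last 32 bytes of the SPKI encoding
  return spki.subarray(spki.length - 32)
}

// Validate that a (seed, public key) tuple is internally consistent,
// so an attacker-supplied public key can't be paired with our seed
function validateKeypair(seed: Buffer, publicKey: Buffer): boolean {
  const derived = publicKeyFromSeed(seed)
  return derived.length === publicKey.length && timingSafeEqual(derived, publicKey)
}
```

This costs one scalar multiplication per check, which is why it makes sense as an opt-in API rather than a default on every signature.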

## Fault Attacks

In embedded systems, an attacker can also use hardware faults to exploit the deterministic nature of Ed25519 and leak the secret key.

Kudelski Security’s article (linked above) does a phenomenal job explaining this attack vector. I won’t belabor the point here.

### Mitigations

The simplest way to mitigate fault attacks is to make the signature algorithm non-deterministic again, but that puts us right back into ECDSA insecurity territory.

An improved strategy is to implement so-called “hedged signatures”: keep the determinism, but inject additional randomness into the process. If your random number generator fails, a hedged scheme remains at least as secure as the deterministic algorithm it’s built on.

However, this path is fraught with patent peril.

## Cryptography-Relevant Quantum Computers

If you have a cryptography-relevant quantum computer, Ed25519 (and, indeed, every other ECC algorithm in use today that isn’t based on supersingular isogeny math) is rendered completely insecure.

### Mitigation

Use post-quantum cryptography.

## Hybrid Post-Quantum Cryptography

### Post-Quantum Cryptography

NIST recently selected its Round 3 finalists for post-quantum cryptography, while several other candidates moved on to a 4th round for further cryptanalysis.

However, if you’re an attacker, the first round of “real” cryptanalysis begins once the algorithm has become a standard. (Why waste time on a losing candidate?)

Additionally, we don’t know if or when a cryptography-relevant quantum computer will be developed. It could be 10 years away, it could be 100.

When confronted with extremely abstract risks, it’s often difficult to establish the direction of the inequality operator.

What’s more likely to happen first?

- The arrival of a cryptography-relevant quantum computer, OR
- An advancement in cryptanalysis that breaks the standardized algorithm

### Why Not Both?

Hybrid post-quantum cryptography, while controversial, is a straightforward solution to the uncertainty and ambiguity of the risks involved.

#### Hybrid Post-Quantum KEMs

The idea here is simple: perform two independent asymmetric key encapsulations, then feed both shared secrets into a key derivation function.

For example:

```
function hybridKeyExchange(
  sk1: X25519SecretKey,
  sk2: KyberSecretKey,
  pk1: X25519PublicKey,
  pk2: KyberPublicKey
): CryptoKey {
  // Two independent encapsulations: one classical, one post-quantum
  const ikm1: Buffer = crypto_kem_x25519(sk1, pk1)
  const ikm2: Buffer = crypto_kem_kyber768_90s(sk2, pk2)
  // Feed both shared secrets into a KDF to derive the final key
  return new CryptoKey(crypto_kdf_hkdf_hmac_sha384(
    /* ikm:    */ Buffer.concat([ikm1, ikm2]),
    /* salt:   */ null,
    /* info:   */ PROTOCOL_DOMAIN_SEPARATION_CONSTANT,
    /* length: */ 32
  ))
}
```

#### Hybrid Post-Quantum Signatures

Same idea, but instead of feeding both values into another operation, you simply append the two together.

```
function hybridSign(
  message: Buffer,
  sk1: Ed25519SecretKey,
  sk2: DilithiumSecretKey
): Buffer {
  const sign1: Buffer = crypto_sign_ed25519(sk1, message)
  const sign2: Buffer = crypto_sign_dilithium3(sk2, message)
  return Buffer.concat([sign1, sign2])
}

function hybridVerify(
  message: Buffer,
  signature: Buffer,
  pk1: Ed25519PublicKey,
  pk2: DilithiumPublicKey
): boolean {
  // Ed25519 signatures are always 64 bytes; the remainder is Dilithium's
  const sign1: Buffer = signature.slice(0, 64)
  const sign2: Buffer = signature.slice(64)
  // Both component signatures must verify against the same message
  const valid1: boolean = crypto_sign_verify_ed25519(pk1, message, sign1)
  const valid2: boolean = crypto_sign_verify_dilithium3(pk2, message, sign2)
  return (valid1 && valid2)
}
```

This works, but I think we can do a little better.

If the cryptography community ever decides that hybrid signatures are the way to go (as opposed to a direct migration towards post-quantum signatures without any classical counterparts), why not also mitigate EdDSA’s fault attacks and misuse-prone APIs while we’re at it?

To avoid the semi-deterministic digital signature patent (which directly affects the construction of the “nonce”), we can use deterministic signatures as-is, but prepend 32 random bytes to the actual “message” that Ed25519 sees. These same random bytes are then appended to the Ed25519 signature (which is now 96 bytes, from the API’s perspective).
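A sketch of this hedged construction, using Node's built-in Ed25519 support (function names are mine):

```typescript
import { generateKeyPairSync, sign, verify, randomBytes, KeyObject } from 'node:crypto'

// Prepend 32 random bytes r to the message, sign (r || m) deterministically,
// then append r to the signature. Callers see a 96-byte signature:
// 64 bytes of Ed25519 plus the 32 bytes of r.
function hedgedSign(message: Buffer, secretKey: KeyObject): Buffer {
  const r = randomBytes(32)
  const sig = sign(null, Buffer.concat([r, message]), secretKey) // 64 bytes
  return Buffer.concat([sig, r])
}

function hedgedVerify(message: Buffer, signature: Buffer, publicKey: KeyObject): boolean {
  if (signature.length !== 96) return false
  const sig = signature.subarray(0, 64)
  const r = signature.subarray(64)
  // Reconstruct (r || m) and verify the inner deterministic signature
  return verify(null, Buffer.concat([r, message]), publicKey, sig)
}
```

If `randomBytes` ever returned a constant, the inner signature would degrade gracefully to ordinary deterministic Ed25519 over a prefixed message, which is the whole point of hedging.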

This is the best of many worlds:

- If your RNG fails, you still have the security properties of a deterministic signature algorithm.
- Fault attack mitigations are baked-in.

To address the misuse-prone API problem, simply add a MUST to the specification for the keypair-validation mitigation discussed above.

However, it does come at the cost of extra bandwidth. Compared to post-quantum signatures, this extra bandwidth isn’t much, but it’s nonzero.

## In Summary

If we ever explore the standardization of hybrid approaches, we should also consider choosing hedged classical signature algorithms (deterministic with additional randomness), instead of non-hedged classical signature algorithms.

In most applications, the extra bandwidth is a small price to pay for better security across a myriad of use-cases.