The state-of-the-art digital signature algorithm, which is widely used and widely recommended by cryptography and security experts, is **Ed25519**.

Ed25519 is an instance of an algorithm called EdDSA, instantiated over the edwards25519 elliptic curve. The other standardized EdDSA instance is Ed448, which uses the edwards448 curve.

EdDSA was created in response to security issues that plagued the incumbent elliptic curve signature algorithm (ECDSA). Overall, it greatly improved the security of elliptic curve signatures across the Internet. Consequently, many things use Ed25519.

Of course, Ed25519 isn’t perfect. Here are just a few ways to make Ed25519 insecure.

## Double Public Key Vulnerability

If you develop a cryptography library whose signing function accepts an Ed25519 secret key and an Ed25519 public key as distinct inputs, and an attacker can *ever* influence the choice of the public key input, the library can leak the secret key.

### Mitigation

Treat Ed25519 secret keys as the tuple of (secret seed, public key) rather than independent inputs. This is what NaCl and libsodium do.

For high-level protocols and libraries, consider validating that the public key portion is the correct public key for a given seed. You don’t necessarily have to do this by default (since it costs an extra scalar multiplication), but exposing the API can mitigate this sort of key-leaking misuse.
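A minimal sketch of that validation step, in TypeScript. The names here are illustrative, not a real library API, and a bare SHA-512 stands in for the real Ed25519 derivation (clamping SHA-512(seed) and multiplying the base point) so the sketch stays dependency-free:

```
import { createHash } from "node:crypto"

interface Ed25519KeyPair {
  seed: Buffer      // 32-byte secret seed
  publicKey: Buffer // 32-byte public key
}

// Stand-in for real Ed25519 public key derivation; in a real library,
// this is a fixed-base scalar multiplication.
function derivePublicKey(seed: Buffer): Buffer {
  return createHash("sha512").update(seed).digest().subarray(0, 32)
}

// Reject any (seed, public key) pair that doesn't match, so an
// attacker-influenced public key never reaches the signing code.
function loadKeyPair(seed: Buffer, publicKey: Buffer): Ed25519KeyPair {
  if (!derivePublicKey(seed).equals(publicKey)) {
    throw new Error("public key does not match secret seed")
  }
  return { seed, publicKey }
}
```

Because the check happens at key-load time, the signing path never has to trust a caller-supplied public key.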

## Fault Attacks

In embedded systems, an attacker can also use hardware faults to exploit the deterministic nature of Ed25519 and leak the secret key.

Kudelski Security’s article (linked above) does a phenomenal job explaining this attack vector. I won’t belabor the point here.

### Mitigations

The simplest way to mitigate fault attacks is to make the signing process non-deterministic, but that puts us back into ECDSA insecurity territory.

An improved strategy is to implement so-called “hedged signatures”: keep the determinism, but inject additional randomness into the process. Even in the event of a random number generator failure, a hedged signature is *at least as secure* as a deterministic one.

However, this path is fraught with patent peril.

## Cryptography-Relevant Quantum Computers

If you have a cryptography-relevant quantum computer, Ed25519 (and, indeed, every other ECC algorithm in use today that isn’t based on supersingular isogeny math) is rendered completely insecure.

### Mitigation

Use post-quantum cryptography.

## Hybrid Post-Quantum Cryptography

### Post-Quantum Cryptography

NIST recently selected winners from among its Round 3 finalists for post-quantum cryptography, while several other candidates moved on to a fourth round of further cryptanalysis.

However, if you’re an *attacker*, the first round of “real” cryptanalysis begins once the algorithm has become a standard. (Why waste time on a losing candidate?)

Additionally, we don’t know if or when a cryptography-relevant quantum computer will be developed. It could be 10 years away, it could be 100.

When confronted with extremely abstract risks, it’s often difficult to establish the direction of the inequality operator.

**What’s more likely to happen first?**

- The arrival of a cryptography-relevant quantum computer, OR
- An advancement in cryptanalysis that breaks the standardized algorithm

### Why Not Both?

Hybrid post-quantum cryptography, while controversial, is a straightforward solution to the uncertainty and ambiguity of the risks involved.

#### Hybrid Post-Quantum KEMs

The idea here is simple: perform two independent asymmetric key encapsulations, then feed both shared secrets into a key derivation function.

For example:

```
function hybridKeyExchange(
  sk1: X25519SecretKey,
  sk2: KyberSecretKey,
  pk1: X25519PublicKey,
  pk2: KyberPublicKey
): CryptoKey {
  const ikm1: Buffer = crypto_kem_x25519(sk1, pk1)
  const ikm2: Buffer = crypto_kem_kyber768_90s(sk2, pk2)
  return new CryptoKey(crypto_kdf_hkdf_hmac_sha384(
    ikm = Buffer.concat([ikm1, ikm2]),
    salt = NULL,
    info = PROTOCOL_DOMAIN_SEPARATION_CONSTANT,
    length = 32
  ))
}
```
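The KDF step of that pseudocode can be made concrete with Node’s built-in HKDF. This is only a sketch of the combiner, assuming the two encapsulations have already produced their shared secrets; the domain-separation constant is an illustrative placeholder:

```
import { hkdfSync } from "node:crypto"

// Placeholder: a real protocol would pin down its own constant.
const PROTOCOL_DOMAIN_SEPARATION_CONSTANT = Buffer.from("example.com/v1/kex")

// Concatenate both encapsulated secrets, then derive one 32-byte
// session key with HKDF-SHA-384 (RFC 5869 extract-then-expand).
function combineSharedSecrets(ikm1: Buffer, ikm2: Buffer): Buffer {
  return Buffer.from(hkdfSync(
    "sha384",
    Buffer.concat([ikm1, ikm2]),
    Buffer.alloc(0),                      // empty salt (the NULL above)
    PROTOCOL_DOMAIN_SEPARATION_CONSTANT,  // info / domain separation
    32                                    // output length in bytes
  ))
}
```

Because HKDF mixes its entire input into the output key, an attacker who breaks one of the two encapsulations still learns nothing about the derived session key.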

#### Hybrid Post-Quantum Signatures

Same idea, but instead of feeding both values into another operation, you simply concatenate the two signatures.

```
function hybridSign(
  message: Buffer,
  sk1: Ed25519SecretKey,
  sk2: DilithiumSecretKey,
  pk1: Ed25519PublicKey,
  pk2: DilithiumPublicKey
): Buffer {
  const sign1: Buffer = crypto_sign_ed25519(sk1, message)
  const sign2: Buffer = crypto_sign_dilithium3(sk2, message)
  return Buffer.concat([sign1, sign2])
}

function hybridVerify(
  message: Buffer,
  signature: Buffer,
  pk1: Ed25519PublicKey,
  pk2: DilithiumPublicKey
): boolean {
  // Ed25519 signatures are always 64 bytes; the rest is Dilithium's
  const sign1: Buffer = signature.slice(0, 64)
  const sign2: Buffer = signature.slice(64)
  const valid1: boolean = crypto_sign_verify_ed25519(pk1, message, sign1)
  const valid2: boolean = crypto_sign_verify_dilithium3(pk2, message, sign2)
  return (valid1 && valid2)
}
```

This works, but I think we can do a little better.

## Hedging Your Hybrid Signatures

If the cryptography community ever decides that hybrid signatures are the way to go (as opposed to a direct migration towards post-quantum signatures without any classical counterparts), why not also mitigate EdDSA’s fault attacks and misuse-prone APIs while we’re at it?

To avoid the semi-deterministic digital signature patent (which directly affects the construction of the “nonce”), we can use deterministic signatures as-is, but prepend 32 random bytes to the actual “message” that Ed25519 sees. These same random bytes are then appended to the Ed25519 signature (which is now 96 bytes from the API’s perspective).
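The wire layout of that construction can be sketched as follows. HMAC-SHA512 stands in for the real deterministic Ed25519 signing step (it also emits 64 bytes, so the sizes line up); real Ed25519 verification would of course use the public key rather than the secret key:

```
import { createHmac, randomBytes, timingSafeEqual } from "node:crypto"

const RANDOMIZER_BYTES = 32
const SIGNATURE_BYTES = 64 // Ed25519 signatures are always 64 bytes

// Stand-in for deterministic Ed25519 signing; only the layout matters here.
function deterministicSign(secret: Buffer, message: Buffer): Buffer {
  return createHmac("sha512", secret).update(message).digest()
}

// Prepend 32 random bytes to the message the signer sees, then append
// them to the signature: 64 + 32 = 96 bytes from the API's perspective.
function hedgedSign(secret: Buffer, message: Buffer): Buffer {
  const r = randomBytes(RANDOMIZER_BYTES)
  const sig = deterministicSign(secret, Buffer.concat([r, message]))
  return Buffer.concat([sig, r])
}

// Split the 96-byte signature back apart and redo the prefixed signing.
function hedgedVerify(secret: Buffer, message: Buffer, signature: Buffer): boolean {
  const sig = signature.subarray(0, SIGNATURE_BYTES)
  const r = signature.subarray(SIGNATURE_BYTES)
  return timingSafeEqual(sig, deterministicSign(secret, Buffer.concat([r, message])))
}
```

If the RNG silently returns all zeroes, `hedgedSign` degrades to an ordinary deterministic signature over a prefixed message, never anything weaker.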

This is the best of many worlds:

- If your RNG fails, you still have the security properties of a deterministic signature algorithm.
- Fault attack mitigations are baked in.

To address the misuse-prone APIs, simply make the mitigation discussed above a MUST in the specification.

However, it **does** come at the cost of extra bandwidth. Compared to post-quantum signatures, this extra bandwidth isn’t much, but it’s nonzero.

## In Summary

If we ever explore the standardization of hybrid approaches, we should also consider choosing hedged classical signature algorithms (deterministic with additional randomness), instead of non-hedged classical signature algorithms.

In most applications, the extra bandwidth is a small price to pay for better security across a myriad of use cases.