The main focus of .NET Cryptography for .NET 10 was adding support for Post-Quantum Cryptography (PQC). Cryptography work tends to be in the “it’s important, but not really worth talking about” camp, but since there’s been a fair amount of buzz this year regarding PQC, this seems like a good time to talk about PQC in .NET.
First, a note about nomenclature. The “Post” in “Post-Quantum” doesn’t mean “quantum computers are here”; it mostly means “algorithms that won’t be compromised by the existence of a sufficiently powerful quantum computer.” Even that is overreaching, because a “cryptographically-relevant quantum computer” (CRQC) won’t have as significant an impact on AES or the SHA-2 and SHA-3 families of hash algorithms as it will on ECC (EC-DSA, EC-Diffie-Hellman, etc.) or RSA. So, mainly, “PQC” just means “some new algorithms we’re adding because quantum computers are a threat to RSA and ECC.”
Strategies like “Harvest now, decrypt later” mean that the transition from “traditional” asymmetric cryptography to PQC should be done before CRQCs exist. Depending on the futurist you ask (and how important your data is), the time to switch is anywhere from “years ago” to “maybe never”. We don’t have a time machine, so we can’t solve it for “years ago”, but we achieved “in our first release after the first specifications were standardized”, so “now” is about as good as it gets!
In .NET 10 we’re focusing on 4 PQC algorithms:
| Algorithm | Kind | Specification | .NET Class |
|---|---|---|---|
| ML-KEM | Key Encapsulation | NIST FIPS 203 | MLKem |
| ML-DSA | Signature | NIST FIPS 204 | MLDsa |
| SLH-DSA | Signature | NIST FIPS 205 | SlhDsa |
| Composite ML-DSA | Signature | IETF Draft “Composite ML-DSA for use in X.509 Public Key Infrastructure“ | CompositeMLDsa |
ML-DSA, SLH-DSA, and Composite ML-DSA are all replacements for RSA and EC-DSA signatures. ML-KEM logically replaces both RSA “Key Transport” and EC-Diffie-Hellman “Key Agreement”, though its actual usage isn’t close to being a drop-in replacement for either of them. There is no direct replacement for RSA “Data Encryption”, but that’s because data encryption isn’t a recommended use of RSA in the first place.
The Way We’ve Always Done It
Generally speaking, when you’re doing “a thing” that’s “like some other thing”, you should do it in the same way. In .NET Cryptography, we have an established pattern for keys of asymmetric algorithms:
- Algorithm types derive from `AsymmetricAlgorithm`.
- Implementation types derive from the algorithm types.
- Algorithm types have a static `Create()` method that works no matter what OS you’re on (unless the algorithm just isn’t supported on your OS).
- Keys can then be imported, explicitly generated, or implicitly generated.
namespace System.Security.Cryptography;
public partial class AsymmetricAlgorithm : IDisposable { }
public partial class RSA : AsymmetricAlgorithm
{
public static RSA Create();
}
public partial class DSA : AsymmetricAlgorithm
{
public static DSA Create();
}
public partial class RSACng : RSA { }
public partial class RSAOpenSsl : RSA { }
// etc
It Starts To Go Wrong
When we started the PQC project, the obvious answer was that we should continue to extend AsymmetricAlgorithm,
but there was sort of a hint that it was a bad answer… and that hint is the KeySize property on AsymmetricAlgorithm:
public partial class AsymmetricAlgorithm
{
public virtual int KeySize { get; set; }
}
This property was introduced when .NET only supported RSA and DSA, and it mostly made sense:
when creating an RSA key or a DSA key, pretty much the only parameter is the RSA modulus (n) size or the DSA prime modulus (p) size.
RSA’s KeySize values and DSA’s KeySize values shouldn’t be compared across the algorithms, but they are a property of any given key.
Then we introduced EC-DSA and EC-Diffie-Hellman.
ECC keys have a simple integer value as a private key, a number in the range [1, p), where p is the prime modulus for the “curve”.
So, OK, everyone agrees that the “key size” of an ECC key is “the number of bits required to represent p“.
.NET at the time only supported 3 curves: NIST P-256, NIST P-384, and NIST P-521.
The number after “P-” is “how many bits are required to represent p“,
so now we have a sensible answer for this property:
the getter reports the number of bits required to represent p,
and the setter chooses from NIST P-256, NIST P-384, and NIST P-521.
Then Windows added support for more elliptic curves, so .NET added support for more elliptic curves.
Brainpool’s brainpool384r1 and NIST’s P-384 both report 384 from the getter, but what should the setter do?
The best answer we came up with was “the setter still picks from the 3 options it had before, and we need a new way to inspect or specify the curve.”
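As a concrete sketch of that ambiguity (assuming a platform whose crypto stack supports the Brainpool curves), both keys below report the same `KeySize`, and the actual curve has to be inspected separately:

```csharp
using System;
using System.Security.Cryptography;

// Two keys on different curves report the same KeySize…
using ECDsa nist = ECDsa.Create(ECCurve.NamedCurves.nistP384);
using ECDsa brainpool = ECDsa.Create(ECCurve.NamedCurves.brainpoolP384r1);

Console.WriteLine(nist.KeySize);      // 384
Console.WriteLine(brainpool.KeySize); // 384

// …so the "new way to inspect the curve" is via the exported parameters.
Console.WriteLine(nist.ExportParameters(false).Curve.Oid.FriendlyName);
Console.WriteLine(brainpool.ExportParameters(false).Curve.Oid.FriendlyName);
```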
That was basically like hearing a creak of wood on a calm day while standing next to a dam.
So now we’re adding more new algorithms.
Given an ML-DSA-65 key, what should we report as the KeySize value?
“65” is an obvious answer, but it’s sort of meaningless (that name just means that this “parameter set” makes use of a 6×5 matrix).
The “raw” public key for ML-DSA-65 is 1952 bytes, so maybe 1952?
Well, this is cryptography, so it should be in bits: 15616?
This was the start of a journey where we decided to “break up” with AsymmetricAlgorithm.
… Anything Else?
Many moons ago I saw a poster that said something like “Change is terrible… unless it’s great!”.
Based on where it was, I think the target audience was UX designers and the poster was saying
“users hate it when you move buttons around, so if you’re going to move it, you better have a great reason”.
Regardless of the intended audience, the message has resonated with me all these years,
and so I knew that we needed something other than “AsymmetricAlgorithm, but without the KeySize property.”
So, we took a look at what things we like about AsymmetricAlgorithm, and what parts we don’t.
The bad parts:
- Heavy use of `public virtual` means that we have to repeat state and argument validation in every derived type.
  - And sometimes we didn’t repeat it correctly.
- You have to create an instance to ask about its capabilities (e.g. `public virtual KeySizes[] LegalKeySizes { get; }`).
- `Create()` doesn’t generate a key, in case you do import. As a result, key generation happens when the key is first needed, making for some perf surprises.
- `Dispose()` doesn’t always mean “the object is unusable”; often it meant “I’ve abandoned this key, but I can generate another one!”
- You can’t really use it as-is. If you accept one, you need to cast it to an algorithm type.
- `KeySize` doesn’t seem to make sense for these new algorithms.
- `KeyExchangeAlgorithm`, `SignatureAlgorithm`, `ToXmlString(bool)`, and `FromXmlString(string)` are intrusions from `SignedXml` and `EncryptedXml`; they’re at the wrong layer.
- `ExportParameters(bool)` makes it hard to write a consistent flow analyzer for when you have private key data or public key data.
The good parts:
- There’s a consistent way to import/export keys.
Clearly, once we started making the “breakup” list, it was pretty obvious.
Sorry, AsymmetricAlgorithm, it’s not you, it’s me (P.S.: it’s totally you).
The New Design’s Goals
- Instances represent a key/keypair.
- Once disposed, always disposed.
- Don’t have a “common base class” when two things don’t really have anything in common.
- Minimize code for derived types, so that we can minimize the room for mistakes.
- Use existing terminology when it means the same thing.
- Use new terminology when the existing terminology means something else.
- Design for `Span<T>`.
The New Design
Here’s a view of the class for ML-DSA, with most of the overloads removed for brevity:
namespace System.Security.Cryptography;
public abstract partial class MLDsa : System.IDisposable
{
public static bool IsSupported { get; }
protected MLDsa(MLDsaAlgorithm algorithm);
public MLDsaAlgorithm Algorithm { get; }
public void Dispose();
protected virtual void Dispose(bool disposing);
// Generate a new key
public static MLDsa GenerateKey(MLDsaAlgorithm algorithm);
// Algorithm-specific key format imports
public static MLDsa ImportMLDsaPublicKey(MLDsaAlgorithm algorithm, ReadOnlySpan<byte> source);
public static MLDsa ImportMLDsaPrivateKey(MLDsaAlgorithm algorithm, ReadOnlySpan<byte> source);
public static MLDsa ImportMLDsaPrivateSeed(MLDsaAlgorithm algorithm, ReadOnlySpan<byte> source);
// Standard key container format imports
public static MLDsa ImportSubjectPublicKeyInfo(ReadOnlySpan<byte> source);
public static MLDsa ImportPkcs8PrivateKey(ReadOnlySpan<byte> source);
public static MLDsa ImportEncryptedPkcs8PrivateKey(ReadOnlySpan<char> password, ReadOnlySpan<byte> source);
public static MLDsa ImportFromPem(ReadOnlySpan<char> source);
public static MLDsa ImportFromEncryptedPem(ReadOnlySpan<char> source, ReadOnlySpan<char> password);
// Algorithm-specific key format exports
public void ExportMLDsaPublicKey(Span<byte> destination);
public void ExportMLDsaPrivateKey(Span<byte> destination);
public void ExportMLDsaPrivateSeed(Span<byte> destination);
// Standard key container format exports
public byte[] ExportSubjectPublicKeyInfo();
public byte[] ExportPkcs8PrivateKey();
public byte[] ExportEncryptedPkcs8PrivateKey(ReadOnlySpan<byte> passwordBytes, PbeParameters pbeParameters);
public string ExportSubjectPublicKeyInfoPem();
public string ExportEncryptedPkcs8PrivateKeyPem(ReadOnlySpan<char> password, PbeParameters pbeParameters);
// Operations the algorithm can perform
public void SignData(ReadOnlySpan<byte> data, Span<byte> destination, ReadOnlySpan<byte> context = default);
public void SignMu(ReadOnlySpan<byte> externalMu, Span<byte> destination);
public void SignPreHash(ReadOnlySpan<byte> hash, Span<byte> destination, string hashAlgorithmOid, ReadOnlySpan<byte> context = default);
public bool VerifyData(ReadOnlySpan<byte> data, ReadOnlySpan<byte> signature, ReadOnlySpan<byte> context = default);
public bool VerifyMu(ReadOnlySpan<byte> externalMu, ReadOnlySpan<byte> signature);
public bool VerifyPreHash(ReadOnlySpan<byte> hash, ReadOnlySpan<byte> signature, string hashAlgorithmOid, ReadOnlySpan<byte> context = default);
// Key exports, implementation-specific.
protected abstract void ExportMLDsaPrivateSeedCore(Span<byte> destination);
protected abstract void ExportMLDsaPublicKeyCore(Span<byte> destination);
protected abstract void ExportMLDsaPrivateKeyCore(Span<byte> destination);
protected abstract bool TryExportPkcs8PrivateKeyCore(Span<byte> destination, out int bytesWritten);
// Algorithm operations, implementation-specific.
protected abstract void SignDataCore(ReadOnlySpan<byte> data, ReadOnlySpan<byte> context, Span<byte> destination);
protected abstract void SignMuCore(ReadOnlySpan<byte> externalMu, Span<byte> destination);
protected abstract void SignPreHashCore(ReadOnlySpan<byte> hash, ReadOnlySpan<byte> context, string hashAlgorithmOid, Span<byte> destination);
protected abstract bool VerifyDataCore(ReadOnlySpan<byte> data, ReadOnlySpan<byte> context, ReadOnlySpan<byte> signature);
protected abstract bool VerifyMuCore(ReadOnlySpan<byte> externalMu, ReadOnlySpan<byte> signature);
protected abstract bool VerifyPreHashCore(ReadOnlySpan<byte> hash, ReadOnlySpan<byte> context, string hashAlgorithmOid, ReadOnlySpan<byte> signature);
}
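Pulling that shape together, a minimal sign/verify round trip (a sketch against the surface area shown above, using the span-based overloads) looks like this:

```csharp
using System;
using System.Security.Cryptography;

byte[] data = "Hello, ML-DSA"u8.ToArray();
byte[] context = { 0x01, 0x02, 0x03 };

using MLDsa privateKey = MLDsa.GenerateKey(MLDsaAlgorithm.MLDsa65);

// Export the FIPS 204 "raw" public key, sized from the algorithm's constants.
byte[] publicKeyBytes = new byte[MLDsaAlgorithm.MLDsa65.PublicKeySizeInBytes];
privateKey.ExportMLDsaPublicKey(publicKeyBytes);

// Signatures are fixed-size too, so the destination is exactly that size.
byte[] signature = new byte[MLDsaAlgorithm.MLDsa65.SignatureSizeInBytes];
privateKey.SignData(data, signature, context);

using MLDsa publicKey = MLDsa.ImportMLDsaPublicKey(MLDsaAlgorithm.MLDsa65, publicKeyBytes);
Console.WriteLine(publicKey.VerifyData(data, signature, context)); // True
```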
Let’s first see how this stacks up against our goals:
- ✅ All of the instance methods are about “the key/keypair”; none are about “the algorithm”.
  - Generating and importing keys are `static` methods.
- ✅ You can’t tell from the class shape, but the base class tracks disposal and won’t call any virtual members once the key is disposed.
- ✅ It turns out ML-DSA and ML-KEM have very little in common. And while ML-DSA and Composite ML-DSA sound similar, they differ in important ways.
  - So all of the new algorithms directly extend `object`.
- ✅ The class extensively uses the Template Method Pattern. All argument and state validation is done in the base class’s `public` methods; the `protected abstract` methods only have to do the last step.
- ✅ `RSA` and `ECDsa` both have a method named `SignData` that takes the full data to sign and produces a signature. `MLDsa` matches that. `MLDsa`’s version gains an extra `context` parameter from the specification, but that doesn’t fundamentally change the terminology.
  - Additionally, all of the Export methods from `AsymmetricAlgorithm` are here, with the same parameters, in the same order.
- ✅ `RSA` and `ECDsa` both have a method named `SignHash` that produces a signature compatible with `SignData`. ML-DSA’s HashML-DSA variant produces an intentionally incompatible signature, so instead of `SignHash` it’s called `SignPreHash`. `SignMu` is closer to `SignHash`, but it’s still different. And it’s very different from `SignPreHash`, so it needed an even more unique name.
- ✅ There are no `abstract` or `virtual` methods that operate on arrays. The `MLDsa` base class does have a lot of overloads that accept (or return) arrays, but those are just for caller convenience.
One thing that may stand out is the prevalence of void methods writing to spans.
For ML-DSA, ML-KEM, and SLH-DSA, all of the algorithm operations have fixed-size responses;
that means there was a strong case made for “if you pass a buffer that’s not exactly the correct size, you’re holding it wrong.”
Okay, so how do you know how to hold it right?
Clearly, just open FIPS 204 (Module-Lattice-Based Digital Signature Standard),
jump down to section 4 (Parameter Sets), and read Table 2 (Sizes (in bytes) of keys and signatures of ML-DSA)
| | Private Key | Public Key | Signature Size |
|---|---|---|---|
| ML-DSA-44 | 2560 | 1312 | 2420 |
| ML-DSA-65 | 4032 | 1952 | 3309 |
| ML-DSA-87 | 4896 | 2592 | 4627 |
Just kidding.
The data from this table (as well as other data) is in the Algorithm property:
namespace System.Security.Cryptography;
public sealed partial class MLDsaAlgorithm : IEquatable<MLDsaAlgorithm>
{
public static MLDsaAlgorithm MLDsa44 { get; }
public static MLDsaAlgorithm MLDsa65 { get; }
public static MLDsaAlgorithm MLDsa87 { get; }
public string Name { get; }
public int MuSizeInBytes { get; }
public int PrivateKeySizeInBytes { get; }
public int PrivateSeedSizeInBytes { get; }
public int PublicKeySizeInBytes { get; }
public int SignatureSizeInBytes { get; }
}
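Because every size hangs off the key's `Algorithm` property, sizing code can be written once and work for all three parameter sets. A small sketch:

```csharp
using System;
using System.Security.Cryptography;

// The destination buffer is sized from the key's own Algorithm property,
// so the same helper works for ML-DSA-44, -65, and -87 alike.
static byte[] Sign(MLDsa key, ReadOnlySpan<byte> data)
{
    byte[] signature = new byte[key.Algorithm.SignatureSizeInBytes];
    key.SignData(data, signature);
    return signature;
}

using MLDsa key = MLDsa.GenerateKey(MLDsaAlgorithm.MLDsa87);
Console.WriteLine(Sign(key, "data"u8).Length); // 4627 for ML-DSA-87
```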
Every now and then, someone wants/needs to do interop with the underlying provider. So, we still have the Cng and OpenSsl derived types, but they’re much, much smaller.
namespace System.Security.Cryptography;
public sealed partial class MLDsaCng : MLDsa
{
public MLDsaCng(CngKey key) : base (GetMLDsaAlgorithm(key)) { }
public CngKey GetKey();
protected override void Dispose(bool disposing);
protected override void ExportMLDsaPrivateKeyCore(Span<byte> destination);
protected override void ExportMLDsaPrivateSeedCore(Span<byte> destination);
protected override void ExportMLDsaPublicKeyCore(Span<byte> destination);
protected override void SignDataCore(ReadOnlySpan<byte> data, ReadOnlySpan<byte> context, Span<byte> destination);
protected override void SignMuCore(ReadOnlySpan<byte> externalMu, Span<byte> destination);
protected override void SignPreHashCore(ReadOnlySpan<byte> hash, ReadOnlySpan<byte> context, string hashAlgorithmOid, Span<byte> destination);
protected override bool TryExportPkcs8PrivateKeyCore(Span<byte> destination, out int bytesWritten);
protected override bool VerifyDataCore(ReadOnlySpan<byte> data, ReadOnlySpan<byte> context, ReadOnlySpan<byte> signature);
protected override bool VerifyMuCore(ReadOnlySpan<byte> externalMu, ReadOnlySpan<byte> signature);
protected override bool VerifyPreHashCore(ReadOnlySpan<byte> hash, ReadOnlySpan<byte> context, string hashAlgorithmOid, ReadOnlySpan<byte> signature);
}
public sealed partial class MLDsaOpenSsl : MLDsa
{
public MLDsaOpenSsl(SafeEvpPKeyHandle pkeyHandle) : base (GetMLDsaAlgorithm(pkeyHandle)) { }
public SafeEvpPKeyHandle DuplicateKeyHandle();
protected override void Dispose(bool disposing);
protected override void ExportMLDsaPrivateKeyCore(Span<byte> destination);
protected override void ExportMLDsaPrivateSeedCore(Span<byte> destination);
protected override void ExportMLDsaPublicKeyCore(Span<byte> destination);
protected override void SignDataCore(ReadOnlySpan<byte> data, ReadOnlySpan<byte> context, Span<byte> destination);
protected override void SignMuCore(ReadOnlySpan<byte> externalMu, Span<byte> destination);
protected override void SignPreHashCore(ReadOnlySpan<byte> hash, ReadOnlySpan<byte> context, string hashAlgorithmOid, Span<byte> destination);
protected override bool TryExportPkcs8PrivateKeyCore(Span<byte> destination, out int bytesWritten);
protected override bool VerifyDataCore(ReadOnlySpan<byte> data, ReadOnlySpan<byte> context, ReadOnlySpan<byte> signature);
protected override bool VerifyMuCore(ReadOnlySpan<byte> externalMu, ReadOnlySpan<byte> signature);
protected override bool VerifyPreHashCore(ReadOnlySpan<byte> hash, ReadOnlySpan<byte> context, string hashAlgorithmOid, ReadOnlySpan<byte> signature);
}
And that is our last change from the existing types: there’s no “import a key into MLDsaCng”, or “generate a key with MLDsaCng”.
Why? The primary reason is that you shouldn’t care.
MLDsaCng doesn’t work on Linux, MLDsaOpenSsl doesn’t work on Windows;
so if you’re writing a run-anywhere app or library, you want to stick to just using the base class.
If you’re trying to work with the underlying provider, that work should be done using the provider classes, like CngKey.Create(...).
Does This Help Me As An Implementer?
OK, it’s pretty unusual that someone other than us extends cryptographic key types, but it happens. The answer is, emphatically, “yes!”
For RSAOpenSsl (and the hidden class used by RSA.Create() on Linux), signing looks like this:
public override bool TrySignHash(
ReadOnlySpan<byte> hash,
Span<byte> destination,
HashAlgorithmName hashAlgorithm,
RSASignaturePadding padding,
out int bytesWritten)
{
ArgumentException.ThrowIfNullOrEmpty(hashAlgorithm.Name, nameof(hashAlgorithm));
ArgumentNullException.ThrowIfNull(padding);
ThrowIfDisposed();
SafeEvpPKeyHandle key = GetKey();
int bytesRequired = Interop.Crypto.GetEvpPKeySizeBytes(key);
if (destination.Length < bytesRequired)
{
bytesWritten = 0;
return false;
}
bytesWritten = Interop.Crypto.RsaSignHash(key, padding.Mode, hashAlgorithm, hash, destination);
Debug.Assert(bytesWritten == bytesRequired);
return true;
}
Argument validation, a disposed state check, a precondition on the destination size to prevent an out-of-bounds write when calling the provider implementation, and then finally the call to the provider.
Here’s the same for MLDsaOpenSsl:
protected override void SignDataCore(ReadOnlySpan<byte> data, ReadOnlySpan<byte> context, Span<byte> destination) =>
Interop.Crypto.MLDsaSignPure(_key, data, context, destination);
“But you can hide all manner of sins behind a one-line call”. OK, here’s MLDsaSignPure:
internal static void MLDsaSignPure(
SafeEvpPKeyHandle pkey,
ReadOnlySpan<byte> msg,
ReadOnlySpan<byte> context,
Span<byte> destination)
{
int ret = CryptoNative_MLDsaSignPure(
pkey, GetExtraHandle(pkey),
msg, msg.Length,
context, context.Length,
destination, destination.Length);
if (ret != 1)
{
throw Interop.Crypto.CreateOpenSslCryptographicException();
}
}
The only thing MLDsaOpenSsl.SignDataCore needs to do is call OpenSSL,
everything else was done in the base class.
For RSA, every single derived type gets independently tested for
- Argument validation
- Disposed state
- Argument validation vs Disposed state ordering
- Buffer too small
just to make sure they’re consistent.
For MLDsa, it’s impossible for them to be inconsistent, so we only need to test the base class.
RSA-derived types also get tested with a correct buffer, and an overly large buffer, when doing “algorithm correctness” tests.
For MLDsa-derived types, that’s almost the entirety of the tests we run.
So, overall it’s less code to write, is therefore less error-prone, and by eliminating categories of tests it makes the overall testing phase faster (with no loss of coverage). Sounds like a win.
There is one drawback in testing, and that’s that we want a separation between “testing the MLDsa base class behaviors” and “testing MLDsa implementations” (so we can actually run fewer tests overall),
but the most obvious name for each of those halves is MLDsaTests.
While all of the algorithms use the same strategy of a static class to test things like “no abstract or virtual methods are called once the instance is disposed”,
and an abstract class to make sure that the implementation types are performing the algorithms correctly,
we ended up with four different naming patterns for four algorithms (none of which are clearly superior to the others):
| Algorithm | static test class | instance test class |
|---|---|---|
| ML-DSA | MLDsaTests | MLDsaTestsBase |
| ML-KEM | MLKemTests | MLKemBaseTests |
| SLH-DSA | SlhDsaContractTests | SlhDsaTests |
| Composite ML-DSA | CompositeMLDsaContractTests | CompositeMLDsaTestsBase |
What’s Up With [Experimental]?
In .NET Cryptography, we observe a modified “rule-of-two”: we don’t (usually) add an algorithm unless two (or more) of our supported OSes offer it. This helps us to reduce situations where we’ve designed something that can’t be fulfilled by a new OS, or an OS that added the feature later. When I was in college, math students used the phrase “engineering induction” for the notion of “if it works three times, it’ll work forever” (versus the much more rigid mathematical induction we had to use in formal proofs). Our “rule-of-two” is like that… except with two instead of three: if an algorithm feature is exposed by any two of Windows, OpenSSL, or macOS, it’ll probably work on the third.
Since Windows hasn’t yet (as of this writing) added support for SLH-DSA,
and neither Windows nor OpenSSL have added Composite ML-DSA as a first-class algorithm,
we’ve decided to release the SlhDsa and CompositeMLDsa classes with [Experimental] on the classes themselves.
It’s possible (though not expected) that we’ll have to make breaking structural changes to these classes when the OS support arrives.
For MLKem and MLDsa we’ve removed [Experimental] from the classes,
but it remains on a few methods:
using System.Security.Cryptography;
public abstract partial class MLKem : System.IDisposable
{
// These are [Experimental]
[Experimental("SYSLIB5006", UrlFormat="https://aka.ms/dotnet-warnings/{0}")]
public byte[] ExportEncryptedPkcs8PrivateKey(ReadOnlySpan<byte> passwordBytes, PbeParameters pbeParameters);
[Experimental("SYSLIB5006", UrlFormat="https://aka.ms/dotnet-warnings/{0}")]
public byte[] ExportPkcs8PrivateKey();
[Experimental("SYSLIB5006", UrlFormat="https://aka.ms/dotnet-warnings/{0}")]
public byte[] ExportSubjectPublicKeyInfo();
[Experimental("SYSLIB5006", UrlFormat="https://aka.ms/dotnet-warnings/{0}")]
public static MLKem ImportFromPem(ReadOnlySpan<char> source);
...
// These are not
public byte[] ExportPrivateSeed();
public static MLKem GenerateKey(MLKemAlgorithm algorithm);
public static MLKem ImportDecapsulationKey(MLKemAlgorithm algorithm, byte[] source);
public void Encapsulate(Span<byte> ciphertext, Span<byte> sharedSecret);
public void Decapsulate(ReadOnlySpan<byte> ciphertext, Span<byte> sharedSecret);
...
}
“What’s the difference?” you ask? Mainly spec ownership.
All of the parts of the class that come from FIPS 203 are in a published spec,
have been written for both Windows and OpenSSL, and we’ve integrated with them.
Therefore, there are no surprises left there, and so no need for [Experimental].
The PKCS#8 PrivateKeyInfo and X.509 SubjectPublicKeyInfo formats of the key, however, come from a different spec (in this case, “draft-ietf-lamps-kyber-certificates”). When our last possible day for changes came up, the specification had not yet been published. While draft-11 looks like it will be published as the finished RFC, we still needed to make callers aware that these formats were still susceptible to both breaking changes and interoperability concerns. If draft-11 is published as the RFC (or was by the time you read this), you can just suppress the diagnostic.
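If you've decided you're OK with that risk (or the RFC has since been published), the diagnostic can be suppressed locally, like any other [Experimental] API:

```csharp
using System;
using System.Security.Cryptography;

using MLKem key = MLKem.GenerateKey(MLKemAlgorithm.MLKem768);

// The FIPS 203 parts of the class need no suppression…
byte[] encapsulationKey = key.ExportEncapsulationKey();

// …but the draft-spec key container formats do.
#pragma warning disable SYSLIB5006
byte[] spki = key.ExportSubjectPublicKeyInfo();
#pragma warning restore SYSLIB5006

Console.WriteLine($"{encapsulationKey.Length} raw bytes vs {spki.Length} SPKI bytes");
```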
It is very similar for MLDsa, except that, additionally, SignPreHash and VerifyPreHash are [Experimental] even though they come from NIST FIPS 204.
That’s mainly because we feel that representing the hash algorithm by name isn’t quite right, and representing it by OID is not very user friendly
(“SHAKE-128” is a XOF, which means it doesn’t have a fixed output length; “2.16.840.1.101.3.4.2.11” means “a 256-bit extraction from SHAKE-128”).
We’re also not certain if we should be validating that the pre-hash length matches the hash algorithm output,
or if “garbage in, garbage out” is the correct design.
Ultimately, these both tie back to the “rule-of-two”, because while OpenSSL 3.5 supports HashML-DSA,
it’s not as polished as pure ML-DSA,
and we’re waiting on them (and the rest of the ecosystem) for some future guidance.
Where Does .NET Use These Algorithms?
These new algorithms can be used in a few places within the System.Security.Cryptography namespaces:
- `CertificateRequest`: `MLDsa`, `SlhDsa`, `CompositeMLDsa`
- `SignedCms`’s `CmsSigner`: `MLDsa`, `SlhDsa`
- COSE’s `CoseSigner`: `MLDsa`
  - `CoseSigner` doesn’t accept an `MLDsa` directly. The COSE library added a new `CoseKey` type to reduce the number of overloads required for key-specific verification, and `CoseKey` can accept an `MLDsa` instance.
  - COSE with SLH-DSA is an expired draft specification, so we left it out. Based on current trends, it’ll probably turn up for .NET 11, but that’s not a promise.
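For example, issuing a self-signed ML-DSA certificate looks much like the existing RSA/ECDsa flow. This is a sketch; the exact `CertificateRequest` constructor overload accepting an `MLDsa` key directly is the assumption here:

```csharp
using System;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;

using MLDsa key = MLDsa.GenerateKey(MLDsaAlgorithm.MLDsa65);

// Assumption: .NET 10 adds CertificateRequest overloads that take an MLDsa key.
CertificateRequest request = new CertificateRequest(
    new X500DistinguishedName("CN=PQC Test"),
    key);

using X509Certificate2 cert = request.CreateSelfSigned(
    DateTimeOffset.UtcNow,
    DateTimeOffset.UtcNow.AddDays(30));

Console.WriteLine(cert.SignatureAlgorithm.FriendlyName);
```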
CertificateRequest accepting PQC keys may have already been a hint,
but just like .NET’s X509Certificate2 instances can know about RSA, DSA, ECDsa, and ECDiffieHellman private keys,
they can also know about MLDsa, SlhDsa, and CompositeMLDsa private keys.
X509Certificate2 can also track MLKem private keys,
but since MLKem can’t self-sign it isn’t tightly integrated with CertificateRequest.
Most .NET APIs that accept an X509Certificate2 (or even X509Certificate) are interested in signing,
so ML-KEM subject keys inside a certificate can’t be used with things like TLS.
Certificates with ML-DSA public keys, though, should generally work.
The two most prominent places are SslStreamCertificateContext (and SslStream directly) and SignedCms‘ CmsSigner (which works either with attached keys, or detached keys, which is why it was also mentioned above).
Sometimes layering requires explicit support at each layer for a new algorithm.
For example, Kestrel’s CertificateConfigLoader required a change to support ML-DSA and SLH-DSA.
We fixed everywhere that we noticed, but if we missed somewhere (for an algorithm that makes sense) we’ll (probably) fix it in a servicing update.
For SslStream to work with ML-DSA or SLH-DSA certificates you need to be using TLS 1.3 (or a newer future version), and the OS needs to support it, and so does the other half of the connection.
Great, How Do I Get Started?
- Go grab a version of .NET 10.
- Ensure you’re on a computer where the OS supports the algorithms.
  - We’ll tell you if yours does via `System.Security.Cryptography.MLDsa.IsSupported` (or similar for `MLKem`, `SlhDsa`, et al).
  - For Linux, you need OpenSSL 3.5 or newer.
  - Windows support arrived this month, so if you’re running Windows 11 and have rebooted for Patch Tuesday, you should be good to go.
- If you’re targeting .NET Standard 2.0, you will need to reference a 10.0 version of Microsoft.Bcl.Cryptography.
using System.Security.Cryptography;
if (!MLKem.IsSupported)
{
Console.WriteLine("ML-KEM isn't supported :(");
return;
}
MLKemAlgorithm alg = MLKemAlgorithm.MLKem768;
using (MLKem privateKey = MLKem.GenerateKey(alg))
using (MLKem publicKey = MLKem.ImportEncapsulationKey(alg, privateKey.ExportEncapsulationKey()))
{
publicKey.Encapsulate(out byte[] ciphertext, out byte[] sharedSecret1);
byte[] sharedSecret2 = privateKey.Decapsulate(ciphertext);
if (sharedSecret1.AsSpan().SequenceEqual(sharedSecret2))
{
Console.WriteLine($"Same answer, yay math! {Convert.ToHexString(sharedSecret1)}");
}
else
{
Console.WriteLine("You just got the one in 2^165 failure. There's probably a prize for that.");
Console.WriteLine($"sharedSecret1: {Convert.ToHexString(sharedSecret1)}");
Console.WriteLine($"sharedSecret2: {Convert.ToHexString(sharedSecret2)}");
Console.WriteLine($"MLKEM768 seed: {Convert.ToHexString(privateKey.ExportPrivateSeed())}");
}
}
If you run into any surprises, let us know!
Special Thanks
As the saying goes, it takes a village (to raise a child). We wouldn’t have made it as far as we did, at the quality we did, without help.
- GitHub Security Services: Participating in the class design journey, doing all of the work for ML-KEM, and setting up a private CI leg to get the project off to a good start.
- OpenSSL, Debian (13), CentOS (10): The timing of the OpenSSL 3.5 release, and how rapidly Debian and CentOS adopted it, meant we had stable CI coverage way earlier than we expected.
- Windows Cryptography: For putting PQC into the Windows Insider builds, and being responsive to our feedback.
- IETF LAMPS-WG: For quickly responding to our questions and feedback for the composite signatures project.