Announcing .NET 6 Preview 5

Richard Lander

We are thrilled to release .NET 6 Preview 5. We’re now in the second half of the .NET 6 release, and starting to see significant features come together. A great example is .NET SDK Workloads, which is the foundation of our .NET unification vision and enables supporting more application types. Like other features, it is coming together to provide a compelling end-to-end user experience.

You can download .NET 6 Preview 5 for Linux, macOS, and Windows.

See the ASP.NET Core, EF Core, and .NET MAUI posts for more detail on what’s new for web, data access, and cross-platform UI scenarios.

Visual Studio 2022 Preview 1 is also releasing today and .NET 6 Preview 5 is included in that release. .NET 6 has also been tested with Visual Studio 16.11 and Visual Studio for Mac 8.9. We recommend you use those builds if you want to try .NET 6 with Visual Studio.

Check out the new conversations posts for in-depth engineer-to-engineer discussions of the latest .NET features.

.NET SDK: Optional Workload improvements

SDK workloads is a new .NET SDK feature that enables us to add support for new application types — like mobile and WebAssembly — without increasing the size of the SDK.

The workloads feature has been updated to include list and update verbs. These new capabilities provide a sense of the expected final experience. You’ll be able to quickly establish your preferred environment with a few simple commands and to keep it up-to-date over time.

  • dotnet workload list will tell you which workloads you have installed.
  • dotnet workload update will update all installed workloads to the newest available version.

The update verb queries for updated workload manifests, updates local manifests, downloads new versions of the installed workloads, and then removes all old versions of a workload. This is analogous to apt update and apt upgrade -y (used on Debian-based Linux distros).

The dotnet workload commands operate in the context of the given SDK. Imagine you have both .NET 6 and .NET 7 installed. If you use both, the workloads commands will provide different results since the workloads will be different (at least different versions of the same workloads).

As you can see, the workloads feature is essentially a package manager for the .NET SDK. Workloads were first introduced in the .NET 6 preview 4 release.

.NET SDK: NuGet Package Validation

Package Validation tooling will enable NuGet library developers to validate that their packages are consistent and well-formed.

This includes:

  • Validate that there are no breaking changes across versions.
  • Validate that the package has the same set of public APIs for all runtime-specific implementations.
  • Determine any target-framework- or runtime-applicability gaps.

This tool is available via the Microsoft.DotNet.PackageValidation package.

A post on this tool will soon be available.

.NET SDK: more Roslyn Analyzers

In .NET 5, we shipped approximately 250 analyzers with the .NET SDK. Many of them already existed but were shipped out-of-band as NuGet packages. We’re adding more analyzers for .NET 6.

By default most of the new analyzers are enabled at Info level. You can enable these analyzers at Warning level by configuring the analysis mode like this: <AnalysisMode>AllEnabledByDefault</AnalysisMode>.
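For reference, here is a minimal project-file sketch showing where that property goes (the target framework and output type values are illustrative):

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net6.0</TargetFramework>
    <!-- Promote the new analyzers from Info to Warning -->
    <AnalysisMode>AllEnabledByDefault</AnalysisMode>
  </PropertyGroup>

</Project>
```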

We published the set of analyzers we wanted for .NET 6 (plus some extras) and then made most of them up-for-grabs.

Credit to Newell Clark and Meik Tranel for the following implementations, included in Preview 5. Note that the community has contributed other implementations in previous previews.

  • Newell Clark, dotnet/runtime #33777: Use span-based string.Concat
  • Newell Clark, dotnet/runtime #33784: Prefer string.AsSpan() over string.Substring() when parsing
  • Newell Clark, dotnet/runtime #33789: Override Stream.ReadAsync/WriteAsync
  • Newell Clark, dotnet/runtime #35343: Replace Dictionary<,>.Keys.Contains with ContainsKey
  • Newell Clark, dotnet/runtime #45552: Use String.Equals instead of String.Compare
  • Meik Tranel, dotnet/runtime #47180: Use String.Contains(char) instead of String.Contains(String)


.NET SDK: Enable custom guards for Platform Compatibility Analyzer

The CA1416 Platform Compatibility analyzer already recognizes platform guards using the methods in OperatingSystem/RuntimeInformation, such as OperatingSystem.IsWindows and OperatingSystem.IsWindowsVersionAtLeast. However, the analyzer does not recognize other guard possibilities, such as a platform check result cached in a field or property, or complex platform check logic defined in a helper method.

To allow custom guards, we added the new attributes SupportedOSPlatformGuard and UnsupportedOSPlatformGuard for annotating custom guard members with the corresponding platform name and/or version. These annotations are recognized and respected by the Platform Compatibility analyzer’s flow analysis logic.


    [UnsupportedOSPlatformGuard("browser")] // The platform guard attribute
    #if TARGET_BROWSER
    internal bool IsSupported => false;
    #else
    internal bool IsSupported => true;
    #endif

    [UnsupportedOSPlatform("browser")]
    void ApiNotSupportedOnBrowser() { }

    void M1()
    {
        ApiNotSupportedOnBrowser();  // Warns: This call site is reachable on all platforms. 'ApiNotSupportedOnBrowser()' is unsupported on: 'browser'

        if (IsSupported)
        {
            ApiNotSupportedOnBrowser();  // Not warn
        }
    }

    [SupportedOSPlatform("Windows")]
    [SupportedOSPlatform("Linux")]
    void ApiOnlyWorkOnWindowsLinux() { }

    [SupportedOSPlatformGuard("Windows")]
    [SupportedOSPlatformGuard("Linux")]
    private readonly bool _isWindowOrLinux = OperatingSystem.IsLinux() || OperatingSystem.IsWindows();

    void M2()
    {
        ApiOnlyWorkOnWindowsLinux();  // Warns: This call site is reachable on all platforms. 'ApiOnlyWorkOnWindowsLinux()' is only supported on: 'Linux', 'Windows'.

        if (_isWindowOrLinux)
        {
            ApiOnlyWorkOnWindowsLinux();  // Not warn
        }
    }

Windows Forms: default font

You can now set a default font for an application with Application.SetDefaultFont. The pattern you use is similar to setting high DPI mode or visual styles.

class Program
{
    static void Main()
    {
+       Application.SetDefaultFont(new Font(new FontFamily("Microsoft Sans Serif"), 8f));

        Application.Run(new Form1());
    }
}

Here are two examples after setting the default font (with different fonts).

Microsoft Sans Serif, 8pt:


Chiller, 12pt:


The default font was updated in .NET Core 3.0. However that change introduced a significant hurdle for some users migrating .NET Framework apps to .NET Core. This new change makes it straightforward to choose the desired font for an app and removes that migration hurdle.

Libraries: Dropping support for older frameworks

Dropping a framework from a package is a source-breaking change. At the same time, continuing to build for every framework we have ever shipped increases the complexity and size of a package. In the past, we’ve solved this issue by harvesting, which basically meant:

  1. We build only for current frameworks
  2. During build, we download the earlier version of the package and harvest the binaries for earlier frameworks we no longer build for

While this means that you can always update without worrying that we’ve dropped a framework, it also means that you’ll never get any bug fixes or new features if you consume a harvested binary. In other words, harvested assets can’t be serviced, and that fact is hidden from you: you can keep updating the package to a later version even though you’re consuming the same old binary that we’re no longer updating.

Starting with .NET 6 Preview 5, we plan to no longer perform any form of harvesting to ensure that all assets we ship can be serviced. This means we’re dropping support for any framework that is older than these:

  • .NET Framework 4.6.1
  • .NET Core 3.1
  • .NET Standard 2.0

If you’re currently referencing an impacted package from an earlier framework, you’ll no longer be able to update the referenced package to a later version. Your choice is to either retarget your project to a later framework version or to not update the referenced package (which is generally not a significant loss, because you’re already consuming a frozen binary anyway).

For more details, including the full list of impacted packages, see dotnet/announcement: Dropping older framework versions.

Libraries: Microsoft.Extensions

We’ve been improving Microsoft.Extensions APIs this release. In Preview 5, we’ve focused on hosting and dependency injection. In Preview 4, we added a compile-time source generator for logging.

Credit to Martin Björkström for dotnet/runtime #51840 (AsyncServiceScope).

Hosting – ConfigureHostOptions API

We added a new ConfigureHostOptions API on IHostBuilder to make application setup simpler (e.g. configuring the shutdown timeout):

using var host = new HostBuilder()
    .ConfigureHostOptions(o =>
    {
        o.ShutdownTimeout = TimeSpan.FromMinutes(10);
    })
    .Build();


Prior to Preview 5, configuring the host options was a bit more complicated:

using var host = new HostBuilder()
    .ConfigureServices(services =>
    {
        services.Configure<HostOptions>(o =>
        {
            o.ShutdownTimeout = TimeSpan.FromMinutes(10);
        });
    })
    .Build();


Dependency Injection – CreateAsyncScope APIs

You might have noticed that synchronously disposing a service provider or scope throws an InvalidOperationException when one of its registered services implements IAsyncDisposable.

The new CreateAsyncScope API provides a straightforward solution, as you can see in the following example:

await using (var scope = provider.CreateAsyncScope())
{
    var foo = scope.ServiceProvider.GetRequiredService<Foo>();
}

The following example demonstrates the existing problem case and then the previously suggested workaround.

using System;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;

await using var provider = new ServiceCollection()
    .AddScoped<Foo>()
    .BuildServiceProvider();

// This using can throw InvalidOperationException
using (var scope = provider.CreateScope())
{
    var foo = scope.ServiceProvider.GetRequiredService<Foo>();
}

class Foo : IAsyncDisposable
{
    public ValueTask DisposeAsync() => default;
}

You can work around the exception by casting the returned scope to IAsyncDisposable.

var scope = provider.CreateScope();
var foo = scope.ServiceProvider.GetRequiredService<Foo>();
await ((IAsyncDisposable)scope).DisposeAsync();

CreateAsyncScope solves this problem, enabling you to safely use the using statement.

Libraries: JsonSerializer Source generation

The backbone of nearly all .NET serializers is reflection. Reflection is a great capability for certain scenarios, but not as the basis of high-performance cloud-native applications (which typically (de)serialize and process a lot of JSON documents). Reflection is a problem for startup, memory usage, and assembly trimming.

The alternative to runtime reflection is compile-time source generation. Source generators generate C# source files that can be compiled as part of the library or application build. Generating source code at compile time can provide many benefits to .NET applications, including improved performance.

In .NET 6, we are including a new source generator as part of System.Text.Json. The JSON source generator works in conjunction with JsonSerializer, and can be configured in multiple ways. It’s your decision whether you use the new source generator. It can provide the following benefits:

  • Reduce start-up time
  • Improve serialization throughput
  • Reduce private memory usage
  • Remove runtime use of System.Reflection and System.Reflection.Emit
  • Allow for trim-compatible JSON serialization

For example, instead of dynamically generating methods at runtime to get and set class properties during (de)serialization using Reflection.Emit (which uses private memory and has start-up costs), a source generator can generate code that more simply and efficiently assigns or retrieves a value directly to/from properties, which is lightning fast.

You can try out the source generator by using the latest preview version of the System.Text.Json NuGet package. We are working on a proposal for including source generators within the SDK.

Generating optimized serialization logic

By default, the JSON source generator emits serialization logic for the given serializable types. This delivers higher performance than using the existing JsonSerializer methods by generating source code that uses Utf8JsonWriter directly. In short, source generators offer a way of giving you a different implementation at compile-time in order to make the runtime experience better.

Zooming out, JsonSerializer is a powerful tool which has many features (and even more coming!) that can improve the (de)serialization of .NET types from/into the JSON format. It is fast, but can have some performance overhead when only a subset of features are needed for a serialization routine. Going forward, we will update JsonSerializer and the new source generator together.

Given a simple type:

namespace Test
{
    internal class JsonMessage
    {
        public string Message { get; set; }
    }
}

The source generator can be configured to generate serialization logic for instances of the example JsonMessage type. Note that the class name JsonContext is arbitrary. You can use whichever class name you want for the generated source.

using System.Text.Json.Serialization;

namespace Test
{
    [JsonSerializable(typeof(JsonMessage))]
    internal partial class JsonContext : JsonSerializerContext
    {
    }
}

We have defined a set of JsonSerializer features that are supported in the source generation mode that provides the best serialization throughput, via JsonSerializerOptionsAttribute. These features can be specified to the source generator ahead of time, to avoid extra checks at runtime. If the attribute is not used, then default JsonSerializerOptions are assumed at runtime.

As part of the build, the source generator augments the JsonContext partial class with the following shape:

internal partial class JsonContext : JsonSerializerContext
{
    public static JsonContext Default { get; }

    public JsonTypeInfo<JsonMessage> JsonMessage { get; }

    public JsonContext(JsonSerializerOptions options) { }

    public override JsonTypeInfo GetTypeInfo(Type type) => ...;
}

The serializer invocation with this mode could look like the following example. This example provides the best possible performance.

using MemoryStream ms = new();
using Utf8JsonWriter writer = new(ms);

JsonContext.Default.JsonMessage.Serialize(writer, new JsonMessage { Message = "Hello, world!" });

// Writer contains:
// {"Message":"Hello, world!"}

Alternatively, you can continue to use JsonSerializer, and instead pass an instance of the generated code to it, with JsonContext.Default.JsonMessage.

JsonSerializer.Serialize(jsonMessage, JsonContext.Default.JsonMessage);

Here’s a similar use, with a different overload.

JsonSerializer.Serialize(jsonMessage, typeof(JsonMessage), JsonContext.Default);

The difference between these two overloads is that the first uses the typed metadata implementation — JsonTypeInfo<T> — and the second uses a more general untyped implementation that does type tests to determine whether a typed implementation exists within the context instance. It is a little slower as a result (due to the type tests). If there is no source-generated implementation for a given type, then the serializer throws a NotSupportedException. It does not fall back to a reflection-based implementation (as an explicit design choice).

The fastest and most optimized source generation mode — based on Utf8JsonWriter — is currently only available for serialization. Similar support for deserialization — based on Utf8JsonReader — may be provided in the future depending on your feedback.

However, the source generator also emits type-metadata initialization logic that can benefit deserialization as well. To deserialize an instance of JsonMessage using pre-generated type metadata, you can do the following:

JsonSerializer.Deserialize(json, JsonContext.Default.JsonMessage);

Similar to serialization above, you might also write:

JsonSerializer.Deserialize(json, typeof(JsonMessage), JsonContext.Default);

Additional notes

  • Multiple types can be included for source generation via [JsonSerializable] on a partial class derived from JsonSerializerContext, not just one.
  • The source generator also supports nested object and collection members on objects, not just primitive types.
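As a sketch of the first point, a context can list several serializable types by stacking the attribute. This follows the JsonContext pattern shown earlier; OtherMessage and MultiTypeContext are hypothetical names used for illustration:

```csharp
using System.Text.Json.Serialization;

namespace Test
{
    // One context, multiple serializable types.
    [JsonSerializable(typeof(JsonMessage))]
    [JsonSerializable(typeof(OtherMessage))] // hypothetical second type
    internal partial class MultiTypeContext : JsonSerializerContext
    {
    }
}
```

The generator then augments MultiTypeContext with one JsonTypeInfo property per listed type.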

Libraries: WebSocket Compression

Compression is important for any data transmitted over a network. WebSockets now enable compression. We implemented the permessage-deflate extension for WebSockets (RFC 7692), which allows compressing WebSocket message payloads using the DEFLATE algorithm.

This feature was one of the top user requests for Networking on GitHub. You can follow our journey to providing that API via API review 1 and API review 2.

Credit to Ivan Zlatanov. Thanks Ivan!

We realized that using compression together with encryption may lead to attacks, like CRIME and BREACH. This means that a secret cannot be sent together with user-generated data in a single compression context; otherwise, that secret could be extracted. To bring users’ attention to these implications and help them weigh the risks, we renamed the API to DangerousDeflateOptions. We also added the ability to turn off compression for specific messages, so if a user wants to send a secret, they can do so securely, without compression.
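As a hedged sketch of that per-message opt-out, the send path accepts a WebSocketMessageFlags value, and passing DisableCompression keeps a sensitive payload out of the shared compression context. SendSecretAsync is a hypothetical helper; the socket is assumed to already be connected:

```csharp
using System.Net.WebSockets;
using System.Threading;
using System.Threading.Tasks;

static class SecretSender
{
    // Sends a sensitive payload uncompressed, even when
    // DangerousDeflateOptions is enabled on the connection.
    public static Task SendSecretAsync(WebSocket socket, byte[] secret) =>
        socket.SendAsync(
            secret,
            WebSocketMessageType.Binary,
            WebSocketMessageFlags.EndOfMessage | WebSocketMessageFlags.DisableCompression, // opt out of compression
            CancellationToken.None).AsTask();
}
```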

There was also a follow-up by Ivan that reduced the memory footprint of the WebSocket when compression is disabled by about 27%.

Enabling compression from the client side is easy, as you can see in the example below. However, please bear in mind that the server can negotiate the settings, e.g. request a smaller window, or deny compression completely.

var cws = new ClientWebSocket();
cws.Options.DangerousDeflateOptions = new WebSocketDeflateOptions()
{
    ClientMaxWindowBits = 10,
    ServerMaxWindowBits = 10
};
WebSocket compression support for ASP.NET Core was also recently added. It will be included in an upcoming preview.

Libraries: Socks proxy support

SOCKS is a proxy server implementation that can process any TCP or UDP traffic, making it a very versatile system. It is a long-standing community request that has been added to .NET 6.

This change adds support for Socks4, Socks4a, and Socks5. For example, it enables testing external connections via SSH or connecting to the Tor network.

The WebProxy class now accepts socks schemes, as you can see in the following example.

var handler = new HttpClientHandler
{
    Proxy = new WebProxy("socks5://", 9050)
};
var httpClient = new HttpClient(handler);

Credit to Huo Yaoyuan. Thanks Huo!

Libraries: Support for OpenTelemetry Metrics

We’ve been adding support for OpenTelemetry over the last couple of .NET versions, as part of our focus on observability. In .NET 6, we’re adding support for the OpenTelemetry Metrics API. By adding support for OpenTelemetry, your apps can seamlessly interoperate with other OpenTelemetry systems.

System.Diagnostics.Metrics is the .NET implementation of the OpenTelemetry Metrics API specification. The Metrics APIs are designed explicitly for processing raw measurements, generally with the intent to produce continuous summaries of those measurements, efficiently and simultaneously.

The APIs include the Meter class which can be used to create instrument objects (e.g. Counter). The APIs expose four instrument classes: Counter, Histogram, ObservableCounter, and ObservableGauge to support different metrics scenarios. Also, the APIs expose the MeterListener class to allow listening to the instrument’s recorded measurement for aggregation and grouping purposes.

The OpenTelemetry .NET implementation will be extended to use these new APIs, which add support for Metrics observability scenarios.

Library Measurement Recording Example

    Meter meter = new Meter("io.opentelemetry.contrib.mongodb", "v1.0");
    Counter<int> counter = meter.CreateCounter<int>("Requests");
    counter.Add(1, KeyValuePair.Create<string, object>("request", "read"));

Listening Example

    MeterListener listener = new MeterListener();
    listener.InstrumentPublished = (instrument, meterListener) =>
    {
        if (instrument.Name == "Requests" && instrument.Meter.Name == "io.opentelemetry.contrib.mongodb")
        {
            meterListener.EnableMeasurementEvents(instrument, null);
        }
    };

    listener.SetMeasurementEventCallback<int>((instrument, measurement, tags, state) =>
    {
        Console.WriteLine($"Instrument: {instrument.Name} has recorded the measurement {measurement}");
    });

    listener.Start();
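The observable instruments mentioned above work differently: instead of the app pushing values, a listener polls them via a callback. A minimal sketch, reusing the meter name from the examples above (the instrument name and gauge value are illustrative):

```csharp
using System.Diagnostics.Metrics;

Meter meter = new Meter("io.opentelemetry.contrib.mongodb", "v1.0");

// The callback runs when a listener polls observable instruments,
// e.g. via MeterListener.RecordObservableInstruments().
ObservableGauge<int> gauge = meter.CreateObservableGauge<int>(
    "QueueLength",
    () => 42); // illustrative value; a real app would read live state here
```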

Libraries: BigInteger Performance

Parsing of BigIntegers from both decimal and hexadecimal strings has been improved. We see improvements of up to 89%, as demonstrated in the following chart.


Credit to Joseph Da Silva. Thanks Joseph!

Libraries: Vector<T> now supports nint and nuint

Vector<T> now supports the nint and nuint primitive types, added in C# 9. For example, this change should make it simpler to use SIMD instructions with pointers or platform-dependent lengths.
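For example, here is a hedged sketch of what the new support enables. The lane count depends on the hardware vector width, so the code only relies on element values:

```csharp
using System;
using System.Numerics;

// nuint (and nint) are now valid Vector<T> element types.
Vector<nuint> ones = new Vector<nuint>(1);
Vector<nuint> doubled = ones + ones;

Console.WriteLine(Vector<nuint>.Count); // lane count is hardware-dependent
Console.WriteLine(doubled[0]);          // 2
```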

Libraries: Support for OpenSSL 3

.NET cryptography APIs support using OpenSSL 3 as the preferred native cryptography provider on Linux. .NET 6 will use OpenSSL 3 if it is available. Otherwise, it will use OpenSSL 1.x.

Libraries: Add support for the ChaCha20/Poly1305 cryptography algorithm

The ChaCha20Poly1305 class has been added to System.Security.Cryptography. In order to use the ChaCha20/Poly1305 algorithm, it must be supported by the underlying operating system. The static IsSupported property can be used to determine if the algorithm is supported in a given context.

  • Linux: requires OpenSSL 1.1 or higher.
  • Windows: build 20142 or higher (currently requires the Dev “insider” channel).
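A hedged usage sketch follows, guarded by the IsSupported check. The sizes follow RFC 8439 (32-byte key, 12-byte nonce, 16-byte tag); the plaintext is illustrative:

```csharp
using System.Security.Cryptography;
using System.Text;

if (ChaCha20Poly1305.IsSupported)
{
    byte[] key = RandomNumberGenerator.GetBytes(32);   // 256-bit key
    byte[] nonce = RandomNumberGenerator.GetBytes(12); // 96-bit nonce
    byte[] plaintext = Encoding.UTF8.GetBytes("hello");
    byte[] ciphertext = new byte[plaintext.Length];
    byte[] tag = new byte[16];                         // 128-bit authentication tag

    using var cipher = new ChaCha20Poly1305(key);
    cipher.Encrypt(nonce, plaintext, ciphertext, tag);

    byte[] roundtrip = new byte[ciphertext.Length];
    cipher.Decrypt(nonce, ciphertext, tag, roundtrip);
    // roundtrip now equals plaintext
}
```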

Credit to Kevin Jones for the Linux support. Thanks Kevin!

Interop: Objective-C interoperability support

The team has been adding Objective-C support, with the goal of having a single Objective-C interop implementation for .NET. Until now, the Objective-C interop system was built around the Mono embedding API, but we decided it wasn’t the right approach to share across runtimes. As a result, we’ve created a new .NET API that will enable a single Objective-C interop experience that will eventually work on both runtimes.

This new API for Objective-C interop has brought immediate support in both runtimes for NSAutoreleasePool, which enables support for Cocoa’s reference-counted memory management system. You can now configure whether you want each managed thread to have an implicit NSAutoreleasePool. This enables the release of Cocoa objects on a per-thread basis.

Diagnostics (EventPipe/DiagnosticsServer) – MonoVM

A lot of diagnostics features have been added to MonoVM since the beginning of .NET 6. This has enabled features like managed EventSource/EventListener, EventPipe, and DiagnosticsServer. It has enabled using diagnostics tools like dotnet-trace, dotnet-counters, and dotnet-stack for apps running on mobile devices (iOS/Android) as well as desktop.

These new features open up the ability to analyze nettrace files generated by MonoVM in tools like PerfView, SpeedScope, and Chromium, with dotnet-trace, or by writing custom parsers using libraries like TraceEvent.

We will continue to include more features going forward, primarily focusing on SDK integration and adapting more native runtime events (Microsoft-Windows-DotNETRuntime) into MonoVM enabling more events in nettrace files.

The following features are now in place:

  • Shared native EventPipe/DiagnosticsServer library between MonoVM and CoreCLR.
  • Added TCP/IP support to DiagnosticsServer and built MonoVM iOS/Android runtime packs leveraging that configuration, needed in order to support mobile platforms.
  • BCL EventSources run on MonoVM, emitting events into EventPipe.
  • BCL runtime counters emitted by System.Diagnostics.Tracing.RuntimeEventSource are wired up on MonoVM, consumable from tools like dotnet-counters.
  • Custom EventSources run on MonoVM, emitting custom events into EventPipe, consumable from tools like dotnet-trace.
  • Custom event counters run on MonoVM, emitting custom counter events into EventPipe, consumable from tools like dotnet-counters.
  • A sample profiler is implemented on MonoVM, emitting events into EventPipe. This opens up the ability to do CPU profiling on MonoVM using dotnet-trace.
  • The dotnet-dsrouter diagnostics tool enables use of existing diagnostic tooling like dotnet-trace, dotnet-counters, and dotnet-stack together with MonoVM running on mobile targets, without any need to change existing tooling. dotnet-dsrouter runs a local IPC server, routing all traffic from diagnostic tooling over to the DiagnosticsServer running in MonoVM on a simulator/device.
  • EventPipe/DiagnosticsServer are implemented in MonoVM using a component-based architecture.
  • Implementation/extension of diagnostics environment-based file sessions.

iOS CPU sampling (SpeedScope)

The following image demonstrates part of an iOS start up CPU sampling session viewed in SpeedScope.


Android CPU sampling (PerfView)

The following image demonstrates Android CPU sampling viewed in PerfView (main thread in infinite sleep).


Runtime: CodeGen

The following changes have been made in RyuJIT.

Community contributions

  • Delete the unused dummyBB variable
  • Delete unused functions reading integers in big-endian format
  • Pass TYP_FLOAT to gtNewDconNode instead of creating a new scope

Thanks to @SingleAccretion for these contributions.

Dynamic PGO

  • Revise inlinee scale computations
  • Update optReachable with excluded block check
  • Generalize the branch around empty flow optimization
  • Add MCS jitflags support for the new GetLikelyClass PGO record type
  • Generalize checking for valid IR after a tail call to support crossgen2 determinism
  • More general value class devirtualization
  • Chained guarded devirtualization

JIT Loop Optimizations

  • Improved loop inversion, which shows good performance improvements in BenchE
  • Scale cloned loop block weights
  • Don’t recompute preds lists during loop cloning to preserve existing profile data on the edges
  • Improve DOT flow graph dumping
  • Improve loop unrolling documentation


  • Include register selection heuristics in the “Allocating Registers” table

Keep Structs in Register

  • Prepare JIT backend for structs in registers
  • Liveness fix for struct enreg
  • Improve struct inits to keep ASG struct(LCL_VAR, 0) as STORE_LCL_VAR struct(0)

Optimizations & Debugging experience

  • Recognize and handle Vector64/128/256 for nint/nuint
  • Add a clrjit.natvis file for a better debugging experience, including a sample visualizer for jitstd::list, as well as for RefPosition and the decomposition of the registerAssignment inside it to show all the registers



Inlining of certain methods involving SIMD or HWIntrinsics should now have improved codegen and performance. We saw improvements of up to 95%.



.NET 6 Preview 5 is perhaps the biggest preview yet in terms of breadth and quantity of features. You can see how much Roslyn features, such as source generators and analyzers, are affecting low-level library features. The future has truly arrived. We now have a very capable compiler toolchain that enables us to produce highly optimized and correct code, and it enables the exact same experience for your own projects.

Now is a great time to start testing .NET 6. It’s still early enough for us to act on your feedback. Although we’re not shipping until November 2021, the feedback window will soon narrow to high-severity issues only: the team works about one and a half previews ahead, and will soon switch to focusing primarily on quality issues. Please give .NET 6 a try if you can.

Thanks for being a .NET developer.