Announcing .NET 6 Preview 4

Richard Lander

We are delighted to release .NET 6 Preview 4. We’re now about half-way through the .NET 6 release. It’s a good moment to look again at the full scope of .NET 6, much like the first preview post. Many features are in close-to-final form and others will come soon now that the foundational building blocks are in place for the release. Preview 4 establishes a solid base for delivering a final .NET 6 build in November, with finished features and experiences. It’s also ready for real world testing if you haven’t yet tried .NET 6 in your environment.

Speaking of the final release, we now have a date! Book off November 9-11 for .NET Conf 2021. We’ll launch .NET 6 on the 9th with many in-depth talks and demos that tell you everything you want to know about .NET 6.

You can download .NET 6 Preview 4, for Linux, macOS, and Windows.

See the ASP.NET Core and EF Core posts for more detail on what’s new for web and data access scenarios. There’s also a new .NET MAUI post that describes new client app experiences and a Hot Reload post that describes a new approach to developer productivity.

.NET 6 has been tested with Visual Studio 16.11 and Visual Studio for Mac 8.9. We recommend you use those builds if you want to try .NET 6 with Visual Studio.

Build 2021

The Microsoft Build conference is this week. It’s free and streaming on the web. It’s also not too late to register.

You’ll definitely want to check out these talks, which include lots of discussion of .NET 6 and demos that show you what’s new and now possible.

.NET 6 Themes

We started planning .NET 6 in late 2020 on GitHub. We identified eight themes across a wide set of topics, including industry scenarios, support, and education. The themes represent half to three quarters of our effort for the release. There are many projects that don’t rise to the level of a theme or that are significant but not thematic (like supporting Apple Silicon devices).

The following are the .NET 6 themes, each described with a one sentence summary. They are listed in the same order they are displayed in themesof.net.

.NET Platform unification

We’ve talked a lot in past posts and at conferences about .NET unification yet it is missing from the themes. Platform unification is baked into everything we do and has no need for its own theme. One can think of it as being the one mega-theme above and beyond the ones that are listed. It is interleaved through multiple of the themes and is a basic assumption of the team going forward.

The inner-loop performance project is a great example. It assumes that .NET 6 apps all share the same foundation, for example using the same build system and libraries. Where there is a technical difference, like using a different runtime (CoreCLR or Mono) or code generation technique (AOT or JIT), we take those things into account and deliver pragmatic and appropriate experiences, with a bias to no observable experience difference. The EventPipe project is another similar example.

Production confidence

We’ll soon start releasing “go live” builds that are supported in production. We’re currently targeting August for that. Our development model is oriented around enabling production workloads, even while we’re finishing up work on all the themes that were just mentioned.

Production confidence begins with the dotnet.microsoft.com site. It’s been running half its site load on .NET 6 starting with Preview 1. While not massive scale, it is a mission critical site for our team and we take it very seriously. .NET 6 has been working for us like a champ.

We also work with Microsoft teams who deploy their production apps on .NET previews. They do that so they can take advantage of new .NET features early. These teams are always looking for opportunities to reduce their cloud hosting costs, and deploying new .NET versions has proven to be one of the most effective and lowest effort approaches for that. These teams give us early feedback that helps us ensure new features are ready for global production use. They also significantly influence final feature shape because they are our first production users.

All of this early battle-testing with real-world apps builds our confidence that .NET 6 will be ready for running your app.

The remainder of the post is dedicated to features that are new in Preview 4.

Tools: Hot Reload with the Visual Studio debugger and dotnet CLI

Hot Reload is a new experience that enables you to make edits to your app’s source code while it is running without needing to manually pause the app or hit a breakpoint. Hot Reload improves developer productivity by reducing the number of times you need to restart your running app.

With this release, Hot Reload works for many types of apps such as WPF, Windows Forms, WinUI, ASP.NET, Console Apps and other frameworks that are running on top of the CoreCLR runtime. We’re also working to bring this technology to WebAssembly, iOS and Android apps that run on top of Mono, but this is still coming (in a later Preview).

To start testing this feature, install Visual Studio 2019 version 16.11 Preview 1 and start your app with the Visual Studio debugger (F5). Once your app is running, you’ll have a new option to make code changes and apply them using the new “apply code changes” button, as illustrated below.

.NET Hot Reload Apply Code Changes in Visual Studio 2019

Hot Reload is also available through the dotnet watch tool. Preview 4 includes multiple fixes that improve that experience.

If you want to learn more about Hot Reload you can read Introducing .NET Hot Reload.

System.Text.Json support for IAsyncEnumerable

IAsyncEnumerable<T> is an important feature that was added with .NET Core 3.0 and C# 8. The new enhancements enable System.Text.Json (de)serialization with IAsyncEnumerable<T> objects.

The following examples use streams as a representation of any async source of data. The source could be files on a local machine, or results from a database query or web service API call.

Streaming serialization

System.Text.Json now supports serializing IAsyncEnumerable<T> values as JSON arrays, as you can see in the following example.

using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;

static async IAsyncEnumerable<int> PrintNumbers(int n)
{
    for (int i = 0; i < n; i++) yield return i;
}

using Stream stream = Console.OpenStandardOutput();
var data = new { Data = PrintNumbers(3) };
await JsonSerializer.SerializeAsync(stream, data); // prints {"Data":[0,1,2]}

IAsyncEnumerable values are only supported using the asynchronous serialization methods. Attempting to serialize using the synchronous methods will result in a NotSupportedException being thrown.
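As a sketch of that failure mode (the method and message here are illustrative, mirroring the earlier example):

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

static async IAsyncEnumerable<int> GetNumbersAsync()
{
    for (int i = 0; i < 3; i++) yield return i;
}

try
{
    // The synchronous API cannot stream, so this throws NotSupportedException.
    JsonSerializer.Serialize(new { Data = GetNumbersAsync() });
}
catch (NotSupportedException)
{
    Console.WriteLine("Use SerializeAsync for IAsyncEnumerable<T> values.");
}
```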

Streaming deserialization

Streaming deserialization required a new API that returns IAsyncEnumerable<T>. We added the JsonSerializer.DeserializeAsyncEnumerable method for this purpose, as you can see in the following example.

using System;
using System.IO;
using System.Text;
using System.Text.Json;

var stream = new MemoryStream(Encoding.UTF8.GetBytes("[0,1,2,3,4]"));
await foreach (int item in JsonSerializer.DeserializeAsyncEnumerable<int>(stream))
{
    Console.WriteLine(item);
}

This example will deserialize elements on-demand and can be useful when consuming particularly large data streams. It only supports reading from root-level JSON arrays, although that could be relaxed in the future based on feedback.

The existing DeserializeAsync method nominally supports IAsyncEnumerable<T>, but within the confines of its non-streaming method signature. It must return the final result as a single value, as you can see in the following example.

using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Text.Json;

var stream = new MemoryStream(Encoding.UTF8.GetBytes(@"{""Data"":[0,1,2,3,4]}"));
var result = await JsonSerializer.DeserializeAsync<MyPoco>(stream);
await foreach (int item in result.Data)
{
    Console.WriteLine(item);
}

public class MyPoco
{
    public IAsyncEnumerable<int> Data { get; set; }
}

In this example, the deserializer will have buffered all IAsyncEnumerable contents in memory before returning the deserialized object. This is because the deserializer needs to have consumed the entire JSON value before returning a result.

System.Text.Json: Writable DOM Feature

The writeable JSON DOM feature adds a new straightforward and high-performance programming model for System.Text.Json. This new API is attractive since it avoids the complexity and ceremony of serialization and the traditional cost of a DOM.

This new API has the following benefits:

  • A lightweight alternative to serialization for cases when use of POCO types is not possible or desired, or when a JSON schema is not fixed and must be inspected.
  • Enables efficient modification of a subset of a large tree. For example, it is possible to efficiently navigate to a subsection of a large JSON tree and read an array or deserialize a POCO from that subsection. LINQ can also be used with that.
  • Enables using the C# dynamic keyword, which allows for a loosely-typed, more script-like model.

We’re looking for feedback on support for dynamic. Please give us your feedback if dynamic support is important to you.

More details are available at dotnet/runtime #6098.

Writeable DOM APIs

The writeable DOM exposes the following types.

namespace System.Text.Json.Node
{
    public abstract class JsonNode {...};
    public sealed class JsonObject : JsonNode, IDictionary<string, JsonNode?> {...}
    public sealed class JsonArray : JsonNode, IList<JsonNode?> {...};
    public abstract class JsonValue : JsonNode {...};
}

Example code

The following example demonstrates the new programming model.

    // Parse a JSON object
    JsonNode jNode = JsonNode.Parse(@"{""MyProperty"":42}");
    int value = (int)jNode["MyProperty"];
    Debug.Assert(value == 42);
    // or
    value = jNode["MyProperty"].GetValue<int>();
    Debug.Assert(value == 42);

    // Parse a JSON array
    jNode = JsonNode.Parse("[10,11,12]");
    value = (int)jNode[1];
    Debug.Assert(value == 11);
    // or
    value = jNode[1].GetValue<int>();
    Debug.Assert(value == 11);

    // Create a new JsonObject using object initializers and array params
    var jObject = new JsonObject
    {
        ["MyChildObject"] = new JsonObject
        {
            ["MyProperty"] = "Hello",
            ["MyArray"] = new JsonArray(10, 11, 12)
        }
    };

    // Obtain the JSON from the new JsonObject
    string json = jObject.ToJsonString();
    Console.WriteLine(json); // {"MyChildObject":{"MyProperty":"Hello","MyArray":[10,11,12]}}

    // Indexers for property names and array elements are supported and can be chained
    Debug.Assert(jObject["MyChildObject"]["MyArray"][1].GetValue<int>() == 11);

Microsoft.Extensions.Logging compile-time source generator

.NET 6 introduces the LoggerMessageAttribute type. This attribute is part of the Microsoft.Extensions.Logging namespace, and when used, it source-generates performant logging APIs. The source-generation logging support is designed to deliver a highly usable and highly performant logging solution for modern .NET applications. The auto-generated source code relies on the ILogger interface in conjunction with LoggerMessage.Define functionality.

The source generator is triggered when LoggerMessageAttribute is used on partial logging methods. When triggered, it is either able to autogenerate the implementation of the partial methods it’s decorating, or produce compile-time diagnostics with hints about proper usage. The compile-time logging solution is typically considerably faster at run time than existing logging approaches. It achieves this by eliminating boxing, temporary allocations, and copies to the maximum extent possible.

There are benefits over manually using LoggerMessage.Define APIs directly:

  • Shorter and simpler syntax: Declarative attribute usage rather than coding boilerplate.
  • Guided developer experience: The generator gives warnings to help developers do the right thing.
  • Support for an arbitrary number of logging parameters. LoggerMessage.Define supports a maximum of six.
  • Support for dynamic log level. This is not possible with LoggerMessage.Define alone.

If you would like to keep track of improvements and known issues, see dotnet/runtime#52549.

Basic usage

To use the LoggerMessageAttribute, the consuming class and method need to be partial. The code generator is triggered at compile time, and generates an implementation of the partial method.

public static partial class Log
{
    [LoggerMessage(EventId = 0, Level = LogLevel.Critical, Message = "Could not open socket to `{hostName}`")]
    public static partial void CouldNotOpenSocket(ILogger logger, string hostName);
}

In the preceding example, the logging method is static and the log level is specified in the attribute definition. When using the attribute in a static context, the ILogger instance is required as a parameter. You may choose to use the attribute in a non-static context as well. For more examples and usage scenarios visit the docs for the compile-time logging source generator.
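As a sketch of the non-static case (the type and member names here are illustrative): when the attribute decorates an instance method, the generator looks for an ILogger field on the containing type instead of a method parameter.

```csharp
using Microsoft.Extensions.Logging;

// Illustrative type; the generator finds the ILogger field (_logger)
// on the containing type and uses it in the generated method body.
public partial class SocketService
{
    private readonly ILogger _logger;

    public SocketService(ILogger logger) => _logger = logger;

    [LoggerMessage(EventId = 1, Level = LogLevel.Warning,
        Message = "Retrying connection to `{hostName}`")]
    public partial void RetryingConnection(string hostName);
}
```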

System.Linq enhancements

New System.Linq APIs, requested and contributed by the community, have been added.

Enumerable support for Index and Range parameters

The Enumerable.ElementAt method now accepts indices from the end of the enumerable, as you can see in the following example.

Enumerable.Range(1, 10).ElementAt(^2); // returns 9

An Enumerable.Take overload has been added that accepts Range parameters. It simplifies taking slices of enumerable sequences:

  • source.Take(..3) instead of source.Take(3)
  • source.Take(3..) instead of source.Skip(3)
  • source.Take(2..7) instead of source.Take(7).Skip(2)
  • source.Take(^3..) instead of source.TakeLast(3)
  • source.Take(..^3) instead of source.SkipLast(3)
  • source.Take(^7..^3) instead of source.TakeLast(7).SkipLast(3).

Credit to @dixin for contributing the implementation.

TryGetNonEnumeratedCount

The TryGetNonEnumeratedCount method attempts to obtain the count of the source enumerable without forcing an enumeration. This can be helpful in scenarios where you want to preallocate buffers ahead of enumeration, as you can see in the following example.

List<T> buffer = source.TryGetNonEnumeratedCount(out int count) ? new List<T>(capacity: count) : new List<T>();
foreach (T item in source)
{
    buffer.Add(item);
}

TryGetNonEnumeratedCount checks for sources implementing ICollection/ICollection&lt;T&gt; or takes advantage of some of the internal optimizations employed by LINQ.

DistinctBy/UnionBy/IntersectBy/ExceptBy

New variants have been added to the set operations that allow specifying equality using key selector functions, as you can see in the following example.

Enumerable.Range(1, 20).DistinctBy(x => x % 3); // {1, 2, 3}

var first = new (string Name, int Age)[] { ("Francis", 20), ("Lindsey", 30), ("Ashley", 40) };
var second = new (string Name, int Age)[] { ("Claire", 30), ("Pat", 30), ("Drew", 33) };
first.UnionBy(second, person => person.Age); // { ("Francis", 20), ("Lindsey", 30), ("Ashley", 40), ("Drew", 33) }

MaxBy/MinBy

MaxBy and MinBy methods allow finding maximal or minimal elements using a key selector, as you can see in the following example.

var people = new (string Name, int Age)[] { ("Francis", 20), ("Lindsey", 30), ("Ashley", 40) };
people.MaxBy(person => person.Age); // ("Ashley", 40)

Chunk

Chunk can be used to chunk a source enumerable into slices of a fixed size, as you can see in the following example.

IEnumerable<int[]> chunks = Enumerable.Range(0, 10).Chunk(size: 3); // { {0,1,2}, {3,4,5}, {6,7,8}, {9} }

Credit to Robert Andersson for contributing the implementation.

FirstOrDefault/LastOrDefault/SingleOrDefault overloads taking default parameters

The existing FirstOrDefault/LastOrDefault/SingleOrDefault methods return default(T) if the source enumerable is empty. New overloads have been added that accept a default parameter to be returned in that case, as you can see in the following example.

Enumerable.Empty<int>().SingleOrDefault(-1); // returns -1

Credit to @Foxtrek64 for contributing the implementation.

Zip overload accepting three enumerables

The Zip method now supports combining three enumerables, as you can see in the following example.

var xs = Enumerable.Range(1, 10);
var ys = xs.Select(x => x.ToString());
var zs = xs.Select(x => x % 2 == 0);

foreach ((int x, string y, bool z) in Enumerable.Zip(xs,ys,zs))
{
}

Credit to Huo Yaoyuan for contributing the implementation.

Significantly improved FileStream performance on Windows

FileStream has been rewritten in .NET 6 to have much higher performance and reliability on Windows.

The rewrite was phased across five PRs.

The final result is that FileStream never blocks when created for async IO, on Windows. That’s a major improvement. You can observe that in the benchmarks, which we’ll look at shortly.

Configuration

The first PR enables FileStream to choose an implementation at runtime. The most obvious benefit of this pattern is the ability to switch back to the old .NET 5 implementation, which you can do with the following setting in runtimeconfig.json.

{
    "configProperties": {
        "System.IO.UseNet5CompatFileStream": true
    }
}

We plan to add an io_uring strategy next, which takes advantage of a Linux feature by the same name in recent kernels.

Performance benchmark

Let’s measure the improvements using BenchmarkDotNet.

public class FileStreamPerf
{
    private const int FileSize = 1_000_000; // 1 MB
    private Memory<byte> _buffer = new byte[8_000]; // 8 kB

    [GlobalSetup(Target = nameof(ReadAsync))]
    public void SetupRead() => File.WriteAllBytes("file.txt", new byte[FileSize]);

    [Benchmark]
    public async ValueTask ReadAsync()
    {
        using FileStream fileStream = new FileStream("file.txt", FileMode.Open, FileAccess.Read, FileShare.Read, bufferSize: 4096, useAsync: true);
        while (await fileStream.ReadAsync(_buffer) > 0)
        {
        }
    }

    [Benchmark]
    public async ValueTask WriteAsync()
    {
        using FileStream fileStream = new FileStream("file.txt", FileMode.Create, FileAccess.Write, FileShare.Read, bufferSize: 4096, useAsync: true);
        for (int i = 0; i < FileSize / _buffer.Length; i++)
        {
            await fileStream.WriteAsync(_buffer);
        }
    }

    [GlobalCleanup]
    public void Cleanup() => File.Delete("file.txt");
}

```ini
BenchmarkDotNet=v0.13.0, OS=Windows 10.0.18363.1500 (1909/November2019Update/19H2)
Intel Xeon CPU E5-1650 v4 3.60GHz, 1 CPU, 12 logical and 6 physical cores
.NET SDK=6.0.100-preview.5.21267.9
  [Host]     : .NET 5.0.6 (5.0.621.22011), X64 RyuJIT
  Job-OIMCTV : .NET 5.0.6 (5.0.621.22011), X64 RyuJIT
  Job-CHFNUY : .NET 6.0.0 (6.0.21.26311), X64 RyuJIT
```

Method      Runtime    Mean       Ratio  Allocated
ReadAsync   .NET 5.0    3.785 ms  1.00   39 KB
ReadAsync   .NET 6.0    1.762 ms  0.47    1 KB
WriteAsync  .NET 5.0   12.573 ms  1.00   39 KB
WriteAsync  .NET 6.0    3.200 ms  0.25    1 KB

Environment: Windows 10 with SSD drive with BitLocker enabled

Results:

  • Reading a 1 MB file is now 2 times faster, while writing is 4 times faster.
  • Memory allocations dropped from 39 KB to 1 KB, a 97.5% reduction.

These changes should provide a dramatic improvement for FileStream users on Windows. More details are available at dotnet/core #6098.

Enhanced Date, Time and Time Zone support

The following improvements have been made to date and time related types.

New DateOnly and TimeOnly structs

Date- and time-only structs have been added, with the following characteristics:

  • Each represents one half of a DateTime: either only the date part or only the time part.
  • DateOnly is ideal for birthdays, anniversary days, and business days. It aligns with SQL Server’s date type.
  • TimeOnly is ideal for recurring meetings, alarm clocks, and weekly business hours. It aligns with SQL Server’s time type.
  • They complement the existing date/time types (DateTime, DateTimeOffset, TimeSpan, TimeZoneInfo).
  • They live in the System namespace and ship in CoreLib, just like the existing related types.
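A quick sketch of how the two halves compose (the values here are illustrative):

```csharp
using System;

var date = new DateOnly(2021, 11, 9);   // date part only
var time = new TimeOnly(17, 30);        // time part only

// Combine the two halves back into a DateTime.
DateTime meeting = date.ToDateTime(time);

// Split a DateTime into its halves.
DateOnly today = DateOnly.FromDateTime(DateTime.Now);
TimeOnly now = TimeOnly.FromDateTime(DateTime.Now);

Console.WriteLine(date.AddDays(1));
```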

Perf improvements to DateTime.UtcNow

This improvement has the following benefits:

  • Fixes a 2.5x performance regression for getting the system time on Windows.
  • Utilizes a 5-minute sliding cache of Windows leap second data instead of fetching with every call.

Support for both Windows and IANA time zones on all platforms

This improvement has the following benefits:

  • Implicit conversion when using TimeZoneInfo.FindSystemTimeZoneById (https://github.com/dotnet/runtime/pull/49412)
  • Explicit conversion through new APIs on TimeZoneInfo: TryConvertIanaIdToWindowsId, TryConvertWindowsIdToIanaId, and HasIanaId (https://github.com/dotnet/runtime/issues/49407)
  • Improves cross-plat support and interop between systems that use different time zone types.
  • Removes need to use TimeZoneConverter OSS library. The functionality is now built-in.
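The explicit conversion APIs can be sketched as follows (the specific zone IDs are just examples):

```csharp
using System;

// Convert a Windows time zone ID to its IANA equivalent, and back.
if (TimeZoneInfo.TryConvertWindowsIdToIanaId("Pacific Standard Time", out string? ianaId))
{
    Console.WriteLine(ianaId); // "America/Los_Angeles"
}

if (TimeZoneInfo.TryConvertIanaIdToWindowsId("America/Los_Angeles", out string? windowsId))
{
    Console.WriteLine(windowsId); // "Pacific Standard Time"
}
```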

Improved time zone display names

This improvement has the following benefits:

  • Removes ambiguity from the display names in the list returned by TimeZoneInfo.GetSystemTimeZones.
  • Leverages ICU / CLDR globalization data.
  • Unix only for now. Windows still uses the registry data. This may be changed later.

Other

  • The UTC time zone’s display name and standard name were previously hardcoded in English; they now use the same language as the rest of the time zone data (CurrentUICulture on Unix, the OS default language on Windows).
  • Time zone display names in WASM use the non-localized IANA ID instead, due to size limitations.
  • The TimeZoneInfo.AdjustmentRule nested class makes its BaseUtcOffsetDelta internal property public and gains a new constructor that takes baseUtcOffsetDelta as a parameter. (https://github.com/dotnet/runtime/issues/50256)
  • TimeZoneInfo.AdjustmentRule also gets miscellaneous fixes for loading time zones on Unix (https://github.com/dotnet/runtime/pull/49733, https://github.com/dotnet/runtime/pull/50131).

CodeGen

The following improvements have been made to the RyuJIT compiler.

Community contributions

@SingleAccretion has been busy making the following improvements over the last few months. That is in addition to a contribution in .NET 6 Preview 3. Thanks!

Dynamic PGO

The following improvements have been made to support dynamic PGO.

JIT Loop Optimizations

The following improvements have been made for loop optimizations.

LSRA

The following improvements have been made to Linear Scan Register Allocation (LRSA).

Optimizations

.NET Diagnostics: EventPipe for Mono and Improved EventPipe Performance

EventPipe is .NET’s cross-platform mechanism for egressing events, performance data, and counters. Starting with .NET 6, we’ve moved the implementation from C++ to C. With this change, Mono can use EventPipe as well. This means that both CoreCLR and Mono will use the same eventing infrastructure, including the .NET Diagnostics CLI tools. This change also came with a small reduction in size for CoreCLR:

lib            after size  before size  diff
libcoreclr.so  7037856     7049408      -11552

We’ve also made a series of changes over the first few previews that improve EventPipe throughput while under load, resulting in throughput as high as 2.06x what .NET 5 was capable of.

Data collected using the EventPipeStress framework in dotnet/diagnostics. The writer app writes events as fast as it can for 60 seconds. The number of successful and dropped events is recorded.

For more information, see dotnet/runtime #45518.

IL trimming

Warnings enabled by default

Trim warnings tell you about places where trimming may remove code that’s used at runtime. These warnings were previously disabled by default because the warnings were very noisy, largely due to the .NET platform not participating in trimming as a first class scenario.

We’ve annotated large portions of the .NET libraries (the runtime libraries, not ASP.NET Core or Windows Desktop frameworks) so that they produce accurate trim warnings. As a result, we felt it was time to enable trimming warnings by default.

You can disable warnings by setting <SuppressTrimAnalysisWarnings> to true. With earlier releases, you can set the same property to false to see the trim warnings.
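For example, in the project file (a config fragment using the property named above):

```xml
<PropertyGroup>
  <!-- Opt back out of trim analysis warnings -->
  <SuppressTrimAnalysisWarnings>true</SuppressTrimAnalysisWarnings>
</PropertyGroup>
```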

Trim warnings bring predictability to the trimming process and put power in developers’ hands. We will continue annotating more of the .NET libraries, including ASP.NET Core in subsequent releases. We hope the community will also improve the trimming ecosystem by annotating more code to be trim safe.

The new default Trim Mode in .NET 6 is link. The link TrimMode can provide significant savings by trimming not just unused assemblies, but also unused members.

In .NET 5, trimming tried to find and remove unreferenced assemblies by default. This is safer, but provides limited benefit. Now that trim warnings are on by default, developers can be confident in the results of trimming.
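If you want to stay on the .NET 5 behavior while keeping trimming enabled, you can set the trim mode explicitly (a config fragment; values as described in this release):

```xml
<PropertyGroup>
  <PublishTrimmed>true</PublishTrimmed>
  <!-- "link" is the new .NET 6 default; "copyused" restores the .NET 5 behavior -->
  <TrimMode>copyused</TrimMode>
</PropertyGroup>
```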

Let’s take a look at this trimming improvement by trimming one of the .NET SDK tools, as an example. I’m going to use crossgen, the Ready To Run compiler. It can be trimmed with only a few trim warnings, which the crossgen team was able to resolve.

First, let’s look at publishing crossgen as a self-contained app without trimming. It is 80 MB (which includes the .NET runtime and all the libraries).

We can then try out the (now legacy) .NET 5 default trim mode, copyused. The result drops to 55 MB.

The new .NET 6 default trim mode, link, drops the self-contained file size much further, to 36 MB.

We hope that the new link trim mode aligns much better with the expectations for trimming: significant savings and predictable results.

Shared model with Native AOT

We’ve implemented the same trimming warnings for the Native AOT experiment as well, which should improve the Native AOT compilation experience in much the same way.

Single-file publishing

The following improvements have been made for single-file application publishing.

Static Analysis

Analyzers for single-file publishing were added in .NET 5 to warn about Assembly.Location and a few other APIs which behave differently in single-file bundles.

For .NET 6 Preview 4, we’ve improved the analysis to allow for custom warnings. If you have an API that doesn’t work in single-file publishing, you can now mark it with the [RequiresAssemblyFiles] attribute, and a warning will appear if the analyzer is enabled. Adding the attribute also silences all single-file warnings within the method, so you can use the attribute to propagate the warning upward to your public API.
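A hedged sketch of how that might look (the helper and its behavior are hypothetical):

```csharp
using System.Diagnostics.CodeAnalysis;
using System.IO;
using System.Reflection;

public static class SidecarFiles
{
    // Hypothetical helper that relies on Assembly.Location, which returns
    // an empty string in a single-file bundle. The attribute makes the
    // single-file analyzer warn at every call site.
    [RequiresAssemblyFiles]
    public static string GetConfigPath() =>
        Path.Combine(
            Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location)!,
            "config.json");
}
```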

The analyzer is automatically enabled for exe projects when PublishSingleFile is set to true, but you can also enable it for any project by setting EnableSingleFileAnalysis to true. This could be helpful if you want to embed a library in a single-file bundle.

Compression

Single-file bundles now support compression, which can be enabled by setting the property EnableCompressionInSingleFile to true. At runtime, files are decompressed to memory as necessary. Compression can provide huge space savings for some scenarios.
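For example, in the project file (a config fragment using the property named above):

```xml
<PropertyGroup>
  <PublishSingleFile>true</PublishSingleFile>
  <EnableCompressionInSingleFile>true</EnableCompressionInSingleFile>
</PropertyGroup>
```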

Let’s look at single file publishing, with and without compression, used with NuGet Package Explorer.

Without compression: 172 MB

With compression: 71.6 MB

Compression can significantly increase the startup time of the application, especially on Unix platforms (because they have a no-copy fast start path that can’t be used with compression). You should test your app after enabling compression to see if the additional startup cost is acceptable.

PublishReadyToRun now uses crossgen2 by default

Crossgen2 is now enabled by default when publishing ReadyToRun images. It also optionally supports generating composite images.

The following settings are exposed to enable you to configure publishing with ReadyToRun code. The settings are shown with their default values.

    <!-- Set to true to publish with ReadyToRun native code -->
    <PublishReadyToRun>false</PublishReadyToRun>
    <!-- Set to false to use crossgen, as in .NET 5 -->
    <PublishReadyToRunUseCrossgen2>true</PublishReadyToRunUseCrossgen2>
    <!-- Set to true to generate a composite R2R image -->
    <PublishReadyToRunComposite>false</PublishReadyToRunComposite>

CLI install of .NET 6 SDK Optional Workloads

.NET 6 introduces the concept of SDK workloads that can be installed on top of the .NET SDK after the fact to enable various scenarios. The new workloads available in Preview 4 are the .NET MAUI and Blazor WebAssembly AOT workloads.

For the .NET MAUI workloads, we still recommend using the maui-check tool for Preview 4, as it includes additional components not yet available in Visual Studio or as a .NET SDK workload. To try out the .NET SDK experience anyway (using iOS as the example), run dotnet workload install microsoft-ios-sdk-full. Once installed, you can run dotnet new ios and then dotnet build to create and build your project.

For Blazor WebAssembly AOT, follow the installation instructions provided via the ASP.NET blog.

Preview 4 includes .NET MAUI workloads for iOS, Android, tvOS, macOS, and Mac Catalyst.

Note that dotnet workload install copies the workloads from NuGet.org into your SDK install, so it will need to be run elevated (or with sudo) if the SDK install location is protected (that is, an admin/root location).

Built-in SDK version checking

To make it easier to track when new versions of the SDK and Runtimes are available, we’ve added a new command to the .NET 6 SDK: dotnet sdk check

This will tell you, within each feature band, the latest available version of the .NET SDK and .NET Runtime.

CLI Templates (dotnet new)

Preview 4 introduces a new search capability for templates: dotnet new --search will search NuGet.org for matching templates. In upcoming previews, the data used for this search will be updated more frequently.

Templates installed in the CLI are available to both the CLI and Visual Studio. An earlier problem with user-installed templates being lost when a new version of the SDK was installed has been resolved; however, templates installed prior to .NET 6 Preview 4 will need to be reinstalled.

Other improvements to template installation include support for the --interactive switch to support authorization credentials for private NuGet feeds.

Once CLI templates are installed, you can check if updates are available via --update-check and --update-apply. This will now reflect template updates much more quickly, support the NuGet feeds you have defined, and support --interactive for authorization credentials.

In Preview 4 and upcoming previews, the output of dotnet new commands will be cleaned up to focus on the information you need most. For example, dotnet new --install &lt;package&gt; now lists only the templates just installed, rather than all installed templates.

To support these and upcoming changes to dotnet new, we are making significant changes to the Template Engine API that may affect anyone hosting the template engine. These changes will appear in Preview 4 and Preview 5. If you are hosting the template engine, please connect with us at https://github.com/dotnet/templating so we can work with you to avoid or minimize disruption.

Support

.NET 6 will be released in November 2021 and will be supported for three years as a Long Term Support (LTS) release. The platform matrix has been significantly expanded.

The additions are:

  • Android.
  • iOS.
  • Mac and Mac Catalyst, for x64 and Apple Silicon (AKA “M1”).
  • Windows Arm64 (specifically Windows Desktop).
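In project-file terms, these additions surface as new net6.0-* target framework monikers. A minimal multi-targeting sketch (keep only the platforms you actually target):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- New OS-specific TFMs introduced with .NET 6 -->
    <TargetFrameworks>net6.0-android;net6.0-ios;net6.0-maccatalyst;net6.0-windows</TargetFrameworks>
  </PropertyGroup>
</Project>
```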

.NET 6 Debian container images are based on Debian 11 (“bullseye”), which is currently in testing.
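If you want to try the bullseye-based images, the .NET container images on MCR carry OS-specific tags; a sketch, assuming the tags follow the usual repo:version-os convention (the exact tag name for a given preview may differ):

```console
$ docker pull mcr.microsoft.com/dotnet/sdk:6.0-bullseye-slim
```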

Closing

We’re well into the .NET 6 release at this point. While the final release in November still seems like a long way off, we’re getting close to being done with feature development. Now is a great time for feedback: the shape of the new features is established, and we’re still in the active development phase, so we can readily act on that feedback.

Speaking of November, please book off some time during November 9-11 to watch .NET Conf 2021. It is certain to be exciting and fun. We’ll be releasing the final .NET 6 build on November 9th, and a blog post even longer than this one.

Still looking for more to read? You might check out our new conversations series. It contains a lot of detailed insights about new .NET 6 features.

We hope you enjoy trying out Preview 4.

40 comments


  • Jeremy Morton

    Is this preview supposed to be buildable with Visual Studio 16.11 Preview 1? I get a lot of Code Analysis errors like this:

    2>CSC : error CS8032: An instance of analyzer Microsoft.CodeAnalysis.MakeFieldReadonly.MakeFieldReadonlyDiagnosticAnalyzer cannot be created from C:\Program Files\dotnet\sdk\6.0.100-preview.4.21255.9\Sdks\Microsoft.NET.Sdk\codestyle\cs\Microsoft.CodeAnalysis.CodeStyle.dll : Could not load file or assembly 'Microsoft.CodeAnalysis, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified..
    2>CSC : error CS8032: An instance of analyzer Microsoft.CodeAnalysis.UseExplicitTupleName.UseExplicitTupleNameDiagnosticAnalyzer cannot be created from C:\Program Files\dotnet\sdk\6.0.100-preview.4.21255.9\Sdks\Microsoft.NET.Sdk\codestyle\cs\Microsoft.CodeAnalysis.CodeStyle.dll : Could not load file or assembly 'Microsoft.CodeAnalysis, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified..
    2>CSC : error CS8032: An instance of analyzer Microsoft.CodeAnalysis.UseSystemHashCode.UseSystemHashCodeDiagnosticAnalyzer cannot be created from C:\Program Files\dotnet\sdk\6.0.100-preview.4.21255.9\Sdks\Microsoft.NET.Sdk\codestyle\cs\Microsoft.CodeAnalysis.CodeStyle.dll : Could not load file or assembly 'Microsoft.CodeAnalysis, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified..
    • Jonathon Marolf (Microsoft employee)

      I assume you have <EnforceCodeStyleInBuild>true</EnforceCodeStyleInBuild> set in your project?

      • Jeremy Morton

        Yes:

            <EnableNETAnalyzers>true</EnableNETAnalyzers>
            <AnalysisMode>AllEnabledByDefault</AnalysisMode>
            <AnalysisLevel>preview</AnalysisLevel>
            <EnforceCodeStyleInBuild>true</EnforceCodeStyleInBuild>
        
  • Jefferson Motta

    Nice toy, sadly we will never have a mature technology. Now we have LTS for 3 years. My self alone made software since 1994 that last more than 20 years, why Microsoft like so much to kill his own tech not but for money I guess. Framework 4.x is dead, soon Asp.NET is dead. I really like that some day tech evolves so big that Microsoft and his “toys” make no sense anymore. Yes, I very unpleasant with Bill Gates.

  • Andriy Savin

    Hi, does the optimized FileStream make possible “true async” not only for reading/writing, but also for opening/closing? E.g. if I open a file stream on a sleeping drive or a slow network share, I will have my app hanging in the FileStream constructor until there is a response from the driver. Does this change with the new implementation? (E.g. you could make file opening a lazy async operation happening on first attempt to execute any other async operation like read/write).

  • Hakan

    Thanks for the nice article. Those are really great.

    Just a small question, can anyone explain to me why

    Enumerable.Range(1, 20).DistinctBy(x => x % 3); // {1, 2, 3}

    and not

    Enumerable.Range(1, 20).DistinctBy(x => x % 3); // {0, 1, 2}
  • Aaron La Greca

    Hi Richard,

    Great to see the work on dot net being done. I am commenting to make a suggestion that could be of interest in future developments. I would like to see more security features built into either the compiler or ide build process. I am thinking a code dictionary that allows the compiler to identify previous code vulnerabilities identified through the code semantics and operations.

    For example, if I found a 0-day vulnerability and it came down to a string parse vulnerability where an injection could happen during a file write operation. It would be great to be able to mark the vulnerability like one does when they debug a code segment. Then, the marked code segment could be added to a code dictionary that can create a semantic and operation pattern recognition database that could be downloaded locally and then when developers and SecDevOps do code sweeps they can easily identify vulnerable and risky code and suggest strong options that mitigate the threat in the code development phase of a project, even run it on current production code and open source libraries.

    This would be a great addition to any development methodology, saving costs, mitigating risk, and protecting users, developers and their companies from growing dependency on technology in the modern day.

    Looking forward to DotNet6 and VS2022.

    With kind regards
