The future of .NET Standard

Immo Landwerth

Since .NET 5 was announced, many of you have asked what this means for .NET Standard and whether it will still be relevant. In this post, I’m going to explain how .NET 5 improves code sharing and replaces .NET Standard. I’ll also cover the cases where you still need .NET Standard.

For the impatient: TL;DR

.NET 5 will be a single product with a uniform set of capabilities and APIs that can be used for Windows desktop apps, cross-platform mobile apps, console apps, cloud services, and websites.

To better reflect this, we’ve updated the target framework monikers (TFMs):

  • net5.0. This is for code that runs everywhere. It combines and replaces the netcoreapp and netstandard names. This TFM will generally only include technologies that work cross-platform (except for pragmatic concessions, like we already did in .NET Standard).

  • net5.0-windows (and later net6.0-android and net6.0-ios). These TFMs represent OS-specific flavors of .NET 5 that include net5.0 plus OS-specific functionality.

We won’t be releasing a new version of .NET Standard, but .NET 5 and all future versions will continue to support .NET Standard 2.1 and earlier. You should think of net5.0 (and future versions) as the foundation for sharing code moving forward.

And since net5.0 is the shared base for all these new TFMs, that means that the runtime, library, and new language features are coordinated around this version number. For example, in order to use C# 9, you need to use net5.0 or net5.0-windows.
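For example, a minimal project file targeting net5.0 (which opts you into C# 9 by default) might look like this sketch:

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <!-- net5.0 brings the .NET 5 BCL and defaults the language version to C# 9 -->
    <TargetFramework>net5.0</TargetFramework>
  </PropertyGroup>

</Project>
```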

What you should target

.NET 5 and all future versions will always support .NET Standard 2.1 and earlier. The only reason to retarget from .NET Standard to .NET 5 is to gain access to more runtime features, language features, or APIs. So, you can think of .NET 5 as .NET Standard vNext.

What about new code? Should you still start with .NET Standard 2.0 or should you go straight to .NET 5? It depends.

  • App components. If you’re using libraries to break down your application into several components, my recommendation is to use netX.Y where X.Y is the lowest .NET version that your application (or applications) target. For simplicity, you probably want all the projects that make up your application to be on the same version of .NET because it means you can assume the same BCL features everywhere.

  • Reusable libraries. If you’re building reusable libraries that you plan on shipping on NuGet, you’ll want to consider the trade-off between reach and available feature set. .NET Standard 2.0 is the highest version of .NET Standard that is supported by .NET Framework, so it will give you the most reach, while also giving you a fairly large feature set to work with. We’d generally recommend against targeting .NET Standard 1.x as it’s not worth the hassle anymore. If you don’t need to support .NET Framework, then you can either go with .NET Standard 2.1 or .NET 5. Most code can probably skip .NET Standard 2.1 and go straight to .NET 5.

So, what should you do? My expectation is that widely used libraries will end up multi-targeting for both .NET Standard 2.0 and .NET 5: supporting .NET Standard 2.0 gives you the most reach while supporting .NET 5 ensures you can leverage the latest platform features for customers that are already on .NET 5.
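A reusable library that multi-targets this way needs only a one-line change in its project file, along the lines of this sketch:

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <!-- Compile once per TFM: broad reach via netstandard2.0,
         latest runtime and language features via net5.0 -->
    <TargetFrameworks>netstandard2.0;net5.0</TargetFrameworks>
  </PropertyGroup>

</Project>
```

Inside the library’s code, you can then light up .NET 5-only code paths with `#if NET5_0_OR_GREATER`, the preprocessor symbol the SDK defines per target framework.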

In a couple of years, the choice for reusable libraries will only involve the version number of netX.Y, which is basically how building libraries for .NET has always worked — you generally want to support some older version in order to ensure you get the most reach.

To summarize:

  • Use netstandard2.0 to share code between .NET Framework and all other platforms.
  • Use netstandard2.1 to share code between Mono, Xamarin, and .NET Core 3.x.
  • Use net5.0 for code sharing moving forward.

Problems with .NET Standard

.NET Standard has made it much easier to create libraries that work on all .NET platforms. But there are still three problems with .NET Standard:

  1. It versions slowly, which means you can’t easily use the latest features.
  2. It needs a decoder ring to map versions to .NET implementations.
  3. It exposes platform-specific features, which means you can’t statically validate whether your code is truly portable.

Let’s see how .NET 5 will address all three issues.

Problem 1: .NET Standard versions slowly

.NET Standard was designed at a time when the .NET platforms weren’t converged at the implementation level. This made writing code that needs to work in different environments hard, because different workloads used different .NET implementations.

The goal of .NET Standard was to unify the feature set of the base class library (BCL), so that you can write a single library that can run everywhere. And this has served us well: .NET Standard is supported by over 77% of the top 1000 packages. And if we look at all packages on NuGet.org that have been updated in the last 6 months, the adoption is at 58%.

[Chart: number of packages supporting .NET Standard]

But standardizing the API set alone creates a tax. It requires coordination whenever we’re adding new APIs — which happens all the time. The .NET open-source community (which includes the .NET team) keeps innovating in the BCL by providing new language features, usability improvements, new cross-cutting features such as Span<T>, and support for new data formats and networking protocols.

And while we can provide new types as NuGet packages, we can’t provide new APIs on existing types this way. So, in the general sense, innovation in the BCL requires shipping a new version of .NET Standard.

Up until .NET Standard 2.0, this wasn’t really an issue because we only standardized existing APIs. But in .NET Standard 2.1, we standardized brand new APIs and that’s where we saw quite a bit of friction.

Where does this friction come from?

.NET Standard is an API set that all .NET implementations have to support, so there is an editorial aspect to it in that all APIs must be reviewed by the .NET Standard review board. The board is comprised of .NET platform implementers as well as representatives of the .NET community. The goal is to only standardize APIs that we can truly implement in all current and future .NET platforms. These reviews are necessary because there are different implementations of the .NET stack, with different constraints.

We predicted this type of friction, which is why we said early on that .NET Standard will only standardize APIs that were already shipped in at least one .NET implementation. This seems reasonable at first, but then you realize that .NET Standard can’t ship very frequently. So, if a feature misses a particular release, you might have to wait for a couple of years before it’s even available and potentially even longer until this version of .NET Standard is widely supported.

We felt for some features that opportunity loss was too high, so we did unnatural acts to standardize APIs that weren’t shipped yet (such as IAsyncEnumerable<T>). Doing this for all features was simply too expensive, which is why quite a few of them still missed the .NET Standard 2.1 train (such as the new hardware intrinsics).

But what if there was a single code base? And what if that code base would have to support all the aspects that make .NET implementations differ today, such as supporting both just-in-time (JIT) compilation and ahead-of-time (AOT) compilation?

Instead of doing these reviews as an afterthought, we’d make all these aspects part of the feature design, right from the start. In such a world, the standardized API set is, by construction, the common API set. When a feature is implemented, it would already be available for everyone because the code base is shared.

Problem 2: .NET Standard needs a decoder ring

Separating the API set from its implementation doesn’t just slow down the availability of APIs. It also means that we need to map .NET Standard versions to their implementations. As someone who had to explain this table to many people over time, I’ve come to appreciate just how complicated this seemingly simple idea is. We’ve tried our best to make it easier, but in the end, it’s just inherent complexity because the API set and the implementations are shipped independently.

We have unified the .NET platforms by adding yet another synthetic platform below them all that represents the common API set. In a very real sense, this XKCD-inspired comic is spot on:

How .NET platforms proliferate

We can’t solve this problem without truly merging some rectangles in our layer diagram, which is what .NET 5 does: it provides a unified implementation where all parties build on the same foundation and thus get the same API shape and version number.

Problem 3: .NET Standard exposes platform-specific APIs

When we designed .NET Standard, we had to make pragmatic concessions in order to avoid breaking the library ecosystem too much. That is, we had to include some Windows-only APIs (such as file system ACLs, the registry, WMI, and so on). Moving forward, we will avoid adding platform-specific APIs to net5.0, net6.0 and future versions. However, it’s impossible for us to predict the future. For example, with Blazor WebAssembly we have recently added a new environment where .NET runs and some of the otherwise cross-platform APIs (such as threading or process control) can’t be supported in the browser’s sandbox.

Many of you have complained that these kinds of APIs feel like "landmines": the code compiles without errors and thus appears to be portable to any platform, but when running on a platform that doesn’t have an implementation for the given API, you get runtime errors.

Starting with .NET 5, we’re shipping analyzers and code fixers with the SDK that are on by default. This includes the platform compatibility analyzer that detects unintentional use of APIs that aren’t supported on the platforms you intend to run on. This feature replaces the Microsoft.DotNet.Analyzers.Compatibility NuGet package.

Let’s first look at Windows-specific APIs.

Dealing with Windows-specific APIs

When you create a project targeting net5.0, you can reference the Microsoft.Win32.Registry package. But when you start using it, you’ll get the following warnings:

private static string GetLoggingDirectory()
{
    using (RegistryKey key = Registry.CurrentUser.OpenSubKey(@"Software\Fabrikam"))
    {
        if (key?.GetValue("LoggingDirectoryPath") is string configuredPath)
            return configuredPath;
    }

    string exePath = Process.GetCurrentProcess().MainModule.FileName;
    string folder = Path.GetDirectoryName(exePath);
    return Path.Combine(folder, "Logging");
}
CA1416: 'RegistryKey.OpenSubKey(string)' is supported on 'windows'
CA1416: 'Registry.CurrentUser' is supported on 'windows'
CA1416: 'RegistryKey.GetValue(string?)' is supported on 'windows'

You have a few options for addressing these warnings:

  1. Guard the call. You can check whether you’re running on Windows before calling the API by using OperatingSystem.IsWindows().

  2. Mark the call as Windows-specific. In some cases, it might make sense to mark the calling member as platform-specific via [SupportedOSPlatform("windows")].

  3. Delete the code. Generally not what you want because it means you lose fidelity when your code is used by Windows users, but for cases where a cross-platform alternative exists, you’re likely better off using that over platform-specific APIs. For example, instead of using the registry, you could use an XML configuration file.

  4. Suppress the warning. You can of course cheat and simply suppress the warning, either via .editorconfig or #pragma warning disable. However, you should prefer options (1) and (2) when using platform-specific APIs.

To guard the call, use the new static methods on the System.OperatingSystem class, for example:

private static string GetLoggingDirectory()
{
    if (OperatingSystem.IsWindows())
    {
        using (RegistryKey key = Registry.CurrentUser.OpenSubKey(@"Software\Fabrikam"))
        {
            if (key?.GetValue("LoggingDirectoryPath") is string configuredPath)
                return configuredPath;
        }
    }

    string exePath = Process.GetCurrentProcess().MainModule.FileName;
    string folder = Path.GetDirectoryName(exePath);
    return Path.Combine(folder, "Logging");
}

To mark your code as Windows-specific, apply the new SupportedOSPlatform attribute:

[SupportedOSPlatform("windows")]
private static string GetLoggingDirectory()
{
    using (RegistryKey key = Registry.CurrentUser.OpenSubKey(@"Software\Fabrikam"))
    {
        if (key?.GetValue("LoggingDirectoryPath") is string configuredPath)
            return configuredPath;
    }

    string exePath = Process.GetCurrentProcess().MainModule.FileName;
    string folder = Path.GetDirectoryName(exePath);
    return Path.Combine(folder, "Logging");
}

In both cases, the warnings for using the registry will disappear.

The key difference is that in the second example the analyzer will now issue warnings for the call sites of GetLoggingDirectory() because it is now considered to be a Windows-specific API. In other words, you forward the requirement of doing the platform check to your callers.

The [SupportedOSPlatform] attribute can be applied to the member, type, or assembly level. This attribute is also used by the BCL itself. For example, the assembly Microsoft.Win32.Registry has this attribute applied, which is how the analyzer knows that the registry is a Windows-specific API in the first place.
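For a library that is Windows-only throughout, a sketch of the assembly-level form (typically placed in a file like AssemblyInfo.cs) could look like this:

```csharp
using System.Runtime.Versioning;

// Marks every API in this assembly as Windows-specific, so the analyzer
// warns at call sites in cross-platform callers instead of inside this code.
[assembly: SupportedOSPlatform("windows")]
```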

Note that if you target net5.0-windows, this attribute is automatically applied to your assembly. That means using Windows-specific APIs from net5.0-windows will never generate any warnings because your entire assembly is considered to be Windows-specific.

Dealing with APIs that are unsupported in Blazor WebAssembly

Blazor WebAssembly projects run inside the browser sandbox, which constrains which APIs you can use. For example, while thread and process creation are both cross-platform APIs, we can’t make these APIs work in Blazor WebAssembly, which means they throw PlatformNotSupportedException. We have marked these APIs with [UnsupportedOSPlatform("browser")].

Let’s say you copy & paste the GetLoggingDirectory() method into a Blazor WebAssembly application.

private static string GetLoggingDirectory()
{
    //...

    string exePath = Process.GetCurrentProcess().MainModule.FileName;
    string folder = Path.GetDirectoryName(exePath);
    return Path.Combine(folder, "Logging");
}

You’ll get the following warnings:

CA1416 'Process.GetCurrentProcess()' is unsupported on 'browser'
CA1416 'Process.MainModule' is unsupported on 'browser'

To deal with these warnings, you have basically the same options as with Windows-specific APIs.

You can guard the call:

private static string GetLoggingDirectory()
{
    //...

    if (!OperatingSystem.IsBrowser())
    {
        string exePath = Process.GetCurrentProcess().MainModule.FileName;
        string folder = Path.GetDirectoryName(exePath);
        return Path.Combine(folder, "Logging");
    }
    else
    {
        return string.Empty;
    }
}

Or you can mark the member as being unsupported by Blazor WebAssembly:

[UnsupportedOSPlatform("browser")]
private static string GetLoggingDirectory()
{
    //...

    string exePath = Process.GetCurrentProcess().MainModule.FileName;
    string folder = Path.GetDirectoryName(exePath);
    return Path.Combine(folder, "Logging");
}

Since the browser sandbox is fairly restrictive, not all class libraries and NuGet packages can be expected to work in Blazor WebAssembly — and the vast majority of libraries don’t need to, either.

That’s why regular class libraries targeting net5.0 won’t see warnings for APIs that are unsupported by Blazor WebAssembly. You have to explicitly indicate that you intend to support your project in Blazor WebAssembly by adding the <SupportedPlatform> item to your project file:

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>net5.0</TargetFramework>
  </PropertyGroup>
  
  <ItemGroup>
    <SupportedPlatform Include="browser" />
  </ItemGroup>
  
</Project>

If you’re building a Blazor WebAssembly application, you don’t have to do this because the Microsoft.NET.Sdk.BlazorWebAssembly SDK does this automatically.

.NET 5 as the combination of .NET Standard & .NET Core

.NET 5 and subsequent versions will be a single code base that supports desktop apps, mobile apps, cloud services, websites, and whatever environment .NET will run on tomorrow.

You might think "hold on, this sounds great, but what if someone wants to create a completely new implementation". That’s fine too. But virtually nobody will start one from scratch. Most likely, it will be a fork of the current code base (dotnet/runtime). For example, Tizen (the Samsung platform for smart appliances) uses .NET Core with minimal changes and a Samsung-specific app model on top.

Forking preserves a merge relationship, which allows maintainers to keep pulling in new changes from the dotnet/runtime repo, benefiting from BCL innovations in areas unaffected by their changes. That’s very similar to how Linux distros work.

Granted, there are cases where one might want to create a very different "kind" of .NET, such as a minimal runtime without the current BCL. But that would mean that it couldn’t leverage the existing .NET library ecosystem anyway, which means it wouldn’t have implemented .NET Standard either. We’re generally not interested in pursuing this direction, but the convergence of .NET Standard and .NET Core doesn’t prevent that nor does it make it any harder.

.NET versioning

As a library author, you’re probably wondering when .NET 5 will be widely supported. Moving forward, we’ll ship .NET every year in November, with every other year being a Long Term Support (LTS) release.

.NET 5 will ship in November 2020 and .NET 6 will ship in November 2021 as an LTS. We created this fixed schedule to make it easier for you to plan your updates (if you’re an app developer) and predict the demand for supported .NET versions (if you’re a library developer).

Thanks to the ability to install .NET Core side-by-side, new versions are adopted fairly fast with LTS versions being the most popular. In fact, .NET Core 3.1 was the fastest adopted .NET version ever.

.NET 5 Schedule

The expectation is that every time we ship, we ship all framework names in conjunction. For example, it might look something like this:

.NET 5              .NET 6              .NET 7
net5.0              net6.0              net7.0
                    net6.0-android      net7.0-android
                    net6.0-ios          net7.0-ios
net5.0-windows      net6.0-windows      net7.0-windows
net5.0-someoldos

This means that you can generally expect that whatever innovation we did in the BCL, you’re going to be able to use it from all app models, no matter which platform they run on. It also means that libraries shipped for the latest netX.Y framework can always be consumed from all app models, as long as those app models are on the latest version.

This model removes the complexity around .NET Standard versioning because each time we ship, you can assume that all platforms are going to support the new version immediately and completely. And we cement this promise by using the prefix naming convention.

New versions of .NET might add support for other platforms; for example, we will add support for Android and iOS with .NET 6. Conversely, we might stop supporting platforms that are no longer relevant, as illustrated by the pretend net5.0-someoldos target framework that doesn’t exist in .NET 6. We have no plans to drop a platform — that would be a big deal, isn’t expected, and would be announced long in advance. That’s the same model we had with .NET Standard, where, for example, there is no new version of Windows Phone that implements a later version of .NET Standard.

Why there is no TFM for WebAssembly

We originally considered adding a TFM for WebAssembly, such as net5.0-wasm. We decided against that for the following reasons:

  • WebAssembly is more like an instruction set (such as x86 or x64) than like an operating system. And we generally don’t offer divergent APIs between different architectures.

  • WebAssembly’s execution model in the browser sandbox is a key differentiator, but we decided that it makes more sense to only model this as a runtime check. Similar to how you check for Windows and Linux, you can use the OperatingSystem type. Since this isn’t about the instruction set, the method is called IsBrowser() rather than IsWebAssembly().

  • There are runtime identifiers (RIDs) for WebAssembly, called browser and browser-wasm. They allow package authors to deploy different binaries when targeting WebAssembly in a browser. This is especially useful for native code, which needs to be compiled to WebAssembly beforehand.
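As a sketch, a project that wants the browser-specific binaries could pin that RID when publishing (whether you need to do this explicitly depends on your workload and SDK):

```xml
<PropertyGroup>
  <!-- Select the WebAssembly-in-browser runtime identifier for publishing -->
  <RuntimeIdentifier>browser-wasm</RuntimeIdentifier>
</PropertyGroup>
```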

As described above, we have marked APIs that are unsupported in the browser sandbox, such as System.Diagnostics.Process. If you use those APIs from inside a browser app, you’ll get a warning telling you that the API is unsupported.

Summary

net5.0 is for code that runs everywhere. It combines and replaces the netcoreapp and netstandard names. We also have platform-specific frameworks, such as net5.0-windows (and later also net6.0-android, and net6.0-ios).

Since there is no difference between the standard and its implementation, you’ll be able to take advantage of new features much quicker than with .NET Standard. And due to the naming convention, you’ll be able to easily tell who can consume a given library — without having to consult the .NET Standard version table.

While .NET Standard 2.1 will be the last version of .NET Standard, .NET 5 and all future versions will continue to support .NET Standard 2.1 and earlier. You should think of net5.0 (and future versions) as the foundation for sharing code moving forward.

Happy coding!

97 comments


  • Blubberich

    First, grammar criticism because that stood out to me and it’s such an annoying one:
    “The board is comprised of .NET platform implementers […]”
    Just no. Either:
    “The board comprises .NET platform implementers […]”
    OR
    “The board is composed of .NET platform implementers […]”
    Pick one, not a mixture of both.

    On topic:
    I very much welcome the unification.
    I don’t know how often exactly I have looked up the version support table but it has been too damn often.
    If this unification also brings with it more predictable, regular shipping of new versions and features across platforms, then even better.

    Still having OS-specific versions that are standard + OS-specific add-ons is better than now, so that in the future one can use, say, Windows-specific stuff from .NET without being stuck with an old version of C#.

    It’s a good decision by Microsoft to do this with .NET and similarly with PowerShell; the fragmentation was/is just hurting adoption.

    • Joseph Musser

      “Is comprised of” is a commonly accepted (and almost ubiquitous) variant of “comprises.” (References 2-4 on https://en.wikipedia.org/wiki/Comprised_of)

  • JesperTreetop

    Straightening up .NET as one thing is a welcome outcome, but it also has some effects for those of us who use it currently. I work on a product where we target software for servers (as in: a vendored product, intended to run as ancillary software on other people’s servers), and we used to target .NET Framework. We’re just barely going to be able to use .NET Core 2.1, which is an “LTS” release, but it’s not very “L” as these things go. New developments will move to new versions, and new versions will drop support for operating systems our customers use on an ongoing basis. We’re pretty much resigned to this, but the deprecation makes this treadmill frustrating.

    Anyone who’s in a situation where they have a lot of web apps running on one or a set of Windows servers that are more pet than cattle, i.e. a large majority of servers held by small businesses, is also feeling this, and may be counting down the days until they have to budget and provision entirely new servers just to be able to upgrade, because OS deprecation means the next ASP.NET Core release won’t be supported.

    But here’s the deal: If newer operating systems were supported in new .NET releases, those of us who have to support for longer could upgrade, and more of the customer base would be running on the newer version! Each release would be validated for a few more platforms, but it would be a lot more realistic for customers to upgrade. With more hands behind a single .NET platform implementation, which is going to need to go to many places in any case, I think this sounds more doable today than it did a few years ago.

    • Immo Landwerth (Microsoft employee)

      “But here’s the deal: If newer operating systems were supported in new .NET releases, those of us who have to support for longer could upgrade, and more of the customer base would be running on the newer version!”

      Can you expand on this? What operating system support are you looking for that would help you?

      • JesperTreetop

        Sorry, I mistyped – “if more operating systems were supported in new .NET releases”, as in “if OSes that would previously fall out of Microsoft support coverage would still be supported”. One example for us is that .NET Core 3.1, which is the LTS version of the first release to support WPF, means we have to give up support for Windows Server 2008 R2 and 2012. Another is that Windows 7 support will most likely go away soon as it drops out of Microsoft support, even though Windows 7 itself will not be going away in practice. The same is true for many Linux distributions, although the typical Ubuntu LTS cycle is more generous.

        Basically: The idea that people ought to migrate off those products at their earliest opportunity for their convenience and security is well and good, but what if their earliest opportunity is basically “never” or “when the hardware dies” or “when we finally get the opportunity to do so in the middle of all the other things we have to do”? In the real world, products live on for longer, and it would make .NET a stronger, less fragmented platform if following along to newer versions was also a possibility for longer. Alternative/competing platforms have a much less stringent policy of pulling the table cloth from under one’s feet, and it’s at the point with .NET Core that despite all the features and advances, we very nearly stayed with .NET Framework.

        To pose an example of what could be helpful, what if the policy was: Windows Server 2008 R2 was supported in an LTS release. It will continue to be supported in the future, too – but we will only support it for the *latest* LTS release. But we’ll keep things building and make sure it is available on the newest LTS release such that you can upgrade to it. In other words, you’ll still have to do maintenance of some form, to get bug fixes and security fixes and new features that may be required for adhering to new laws, etc. But you won’t also simultaneously be shut out of said maintenance because something went out of support between two releases. I think this is workable since it limits the effort, and provides an incentive to move forward that’s more practically attainable beyond what the machine-wide .NET Framework installations offered, now that the runtime can be bundled along with the program. (Of course it would be a good thing if no OS ever ran out of support, but if the reason that’s not the current situation is that it takes work, this at least takes less work.)

        • Jamshed Damkewala (Microsoft employee)

          @JesperTreetop, thanks for the perspective. I acknowledge your point that only supporting a newer OS with the newer Core LTS release has the counterintuitive result of forcing you to use the older LTS for longer than you would like to.

          We are investigating what it means to have newer Core releases support older OS platforms for longer. We could consider doing this for LTS releases of Core only, like you suggested, though if we were to have, say, 3.1 LTS support an older OS, drop support for 5.0 (which is not LTS), and add support back for 6.0 (the next LTS), then it’s difficult to ensure we don’t end up taking an inappropriate dependency (on something in the newer OS but not in the older one) during the in-between non-LTS release.

          From a more practical perspective, not supporting the older OS platform for a release cycle increases debt in terms of any regression that will now be deferred until we look at that older OS more closely for the next LTS. So there are a lot of variables here, and while there are no decisions as yet we’re looking at this and hope to solve this in a way that addresses many if not most customers’ needs.

  • Mark Adamson

    Having an LTS version only every 2 years is a bit of a pain for things targeting AWS Lambda, since they say they will only add LTS versions of .NET. So we will often have to wait 18 months or more from when new language features are announced until we get to use them.

    That helps the competitiveness of Azure, though, which presumably will support .NET 5, for example.

    Hopefully C# 10 and 11 will work with .NET 6 and won’t make us wait until .NET 8!

    • Norm Johanson

      Although .NET 5 won’t be a managed runtime like .NET Core 3.1 and 2.1, since it is not an LTS, you can use .NET 5 deployed as a self-contained publish bundle. With the improvements that .NET 5 has made to reduce its size, the new trimming options for your build, and the ready-to-run feature enabled, the performance is quite comparable to the built-in managed runtimes.

      Here is a link to a post I wrote last year that talks about how to use custom Lambda runtimes and .NET self-contained publishing. It is written for .NET Core 3.0, the last non-LTS, but everything is the same for .NET 5 except for updating the target frameworks. https://aws.amazon.com/blogs/developer/net-core-3-0-on-lambda-with-aws-lambdas-custom-runtime/

      Also be sure to check out this Microsoft post about the new trimming features for .NET 5. I would not recommend doing the member-level trimming unless you also enable ready-to-run for the build; otherwise the trimming will have a negative effect on your cold starts. https://devblogs.microsoft.com/dotnet/app-trimming-in-net-5/

      A final note: there is an issue with the .NET 5 RC1 release blocking it from running in Lambda. I have confirmed the issue is fixed in the upcoming RC2 builds, so if you want to try out .NET 5 on Lambda today you either need to grab the nightly builds of RC2 or the previous preview 8 build.

      • Mark Adamson

        That’s really helpful info, thanks Norm. I had discounted custom runtimes before, but that’s a good point about the improvements in size and startup time; I will re-evaluate.

  • Aldriansyah Putra

    I spent a month learning it, since they said there would be only one .NET in the future for all platforms. I believe Microsoft is doing the right thing in moving it to native code. But they are never consistent with what they say. I’m just disappointed. Sorry for my bad English.

  • Zero

    Apologies if this has been answered already, but what are the future plans for Mono? I can’t find anything on Google, and their site is conspicuously absent of any mentions of .NET 5. My initial impressions after reading the .NET 5 announcement post from way back in the day were that it would be bundled with the .NET 5 SDK, and you would be able to select it as a runtime when building an app.

    Then there’s the issue that it’s an aging platform which existed out of necessity, and would in theory be supplanted by CoreCLR (and CoreRT for native compilation). Is there any sort of official roadmap on this, i.e. “bundle Mono with .NET 5, deprecate it in .NET 6, make Xamarin/Unity/Godot/MonoGame run on the ‘Core’ stack in .NET 7”? I’m assuming it isn’t as simple as getting rid of it everywhere (and work on components like CoreRT would have to be completed first), but I’d be interested in knowing what the general plan for the next 5-10 years looks like.

    • Immo Landwerth (Microsoft employee) 0

      Mono means several things, so let me first tease this apart so we can make sure we’re talking about the same thing:

      1. Mono runtime. Mono has a JIT-based runtime as well as an ahead-of-time (AOT) runtime. The latter is used today when targeting iOS, the former is used when you run on top of the Mono desktop framework or Android.

      2. Mono framework. Handwaving, but this is basically a cross-platform re-implementation of .NET Framework.

      3. Xamarin is a subset of the Mono framework that supports iOS and Android.

      4. Mono, the source code. There is the source code repository that contains the entire code base of (1), (2), and (3).

      So to answer your question:

      In .NET 6, we’re bringing up the Xamarin workload on top of the .NET Core-based library stack but we continue to use the Mono runtimes. Specifically, for Android, we’re going to use the Mono JIT and for iOS we’re going to use the Mono AOT runtime. I’m not entirely sure, but I expect the Mono AOT runtime to become a part of the dotnet/runtime GitHub repo at that point.

      In the same way that we see .NET Framework as done, we also see the Mono framework as being done. However, .NET Framework and .NET Core are separate code bases, whereas the Mono framework and Xamarin are built from the same code base. That means that features added at the bottom happen to work for both. I would expect that after we enable Xamarin on top of .NET 6, the Mono framework will no longer see any feature updates either, akin to .NET Framework.

      Today, Visual Studio for Mac runs on top of the Mono framework, in the same way that Visual Studio runs on top of the .NET Framework. However, since VS for Mac is much smaller than VS, there are investigations into whether we can move it on top of .NET 5. I don't know whether we've made a decision in this space yet.

      • Zero 0

        That makes sense. Thanks for taking the time to respond.

  • Gurpreet Singh 0

    Any plans for ASP.NET Web Forms? Or will it be open sourced if it's not on the roadmap? There is no word about this technology.

  • Bernd Baumanns 0

    Is it possible to get “reference sources” for .NET 5.0+ like we had with .NET Standard? I need those for Roslyn…
    With reference sources I mean something like we had here: https://github.com/dotnet/standard/tree/master/src/netstandard/ref

    I know every “library package” has a “ref” folder with the public API surface – but they are not so easy to get for the whole framework.
    Or is it required now to use the compiled reference assemblies?

    • Immo Landwerth (Microsoft employee) 0

      Do you need them as source code? We plan on making the reference assemblies for .NET Standard and .NET Core/.NET 5 easier to access when using Roslyn/CodeDOM.

      • Bernd Baumanns 0

        Yes, I prefer very much regular source code without generated / lowered stuff.

        Just copying all the sources from the ref folders did not work as expected. I got into a bit of trouble because of some “duplicates” in the “ref” folders (Index / Range) – but this was easy to fix (some sources are not “decorated” with ifdefs – some use just the csproj for conditional inclusion).
        Another issue was around 5 protected overridden functions of “protected internal” functions. I just made the overrides “protected internal”, too.

        Personally I very much dislike “protected internal” – this makes merging assemblies more difficult.

        In the reference assemblies you can find some strange compiler generated code like “EmbeddedAttribute”.

  • Kuba Szostak 0

    Will new versions of .NET be backward compatible? I wonder if a library written using .NET 5 will be compatible with .NET 6 and .NET 7. Or, thanks to the ability to install .NET Core side by side, could there be breaking changes between major versions, the same way ASP.NET Core does it?

    • Tudor Turcu 0

      If they follow SemVer, any major version can introduce breaking changes, so at least in theory .NET 6 might not be backwards compatible for apps built against .NET 5.

    • Immo Landwerth (Microsoft employee) 0

      Absolutely. It’s the same story as with .NET Core today. The platform is generally backwards compatible. However, as you’re pointing out, some parts of the stack (such as ASP.NET Core) have made breaking changes.

      Within .NET Core/.NET 5 there already is a strong separation of “layers”. You have probably seen or heard of them, such as Microsoft.NETCore.App or Microsoft.AspNetCore.App. Different parts of the stack have different bars for breaking changes. For the BCL (which is basically everything in Microsoft.NETCore.App and all the System.* NuGet packages) the bar has always been very high. We have made breaking changes, but they are 99.999% in behavior, not API shape. The general expectation is that we’re fully backwards compatible, because this part of the stack is vitally important for the library ecosystem. And this won’t change with .NET 5.
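
      That compatibility is also visible at the project level. As a sketch (project names hypothetical), a library that targets net5.0 stays referenceable from an app on a later TFM, since each new net TFM is treated as a superset of the previous one:

      ```xml
      <!-- MyLibrary.csproj: a class library targeting the .NET 5 surface -->
      <PropertyGroup>
        <TargetFramework>net5.0</TargetFramework>
      </PropertyGroup>

      <!-- MyApp.csproj: a net6.0 app can reference the net5.0 library,
           because net6.0 is compatible with the net5.0 target -->
      <PropertyGroup>
        <OutputType>Exe</OutputType>
        <TargetFramework>net6.0</TargetFramework>
      </PropertyGroup>
      <ItemGroup>
        <ProjectReference Include="..\MyLibrary\MyLibrary.csproj" />
      </ItemGroup>
      ```

      The reverse does not hold: a net5.0 project cannot reference a net6.0 library.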

  • Tudor Turcu 0

    I saw the value of .NET Standard in providing a minimum common/standardized API spec to which all alternative implementations must adhere – until recently those were mostly .NET Framework and Mono. Now that the Mono team has been hired by Microsoft, that won’t be the case.
    It’s sad that the possibility of having multiple, third-party implementations is not left open – who knows if in the future there won’t be other ‘crazy’ people to build a new ‘mono’? 🙂
    Or, more realistically, somebody willing to port .NET to another, new OS or hardware platform?

    • Bernd Baumanns 0

      Effectively, Microsoft still has a “minimum common/standardized API”, which will be the public reference assemblies of .NET 5.0+ (shipped APIs).
      In addition, there is still the opportunity for other implementations, which could simply use most of the code defined in https://github.com/dotnet/runtime/tree/master/src/libraries/ (which is what Mono / CoreCLR / CoreRT do).

      But it would be nice to make all “ref” files easily available.

      • Immo Landwerth (Microsoft employee) 0

        Exactly, there is still an API definition of .NET 5.

        There already are NuGet packages for them. I’ll see whether I can get this documented more.

  • Bruce J. Patin 0

    I don’t know what you guys are coding, but we code for business and use reports, specifically RDLC files, and cannot migrate to .NET 5 until Report Viewer supports it. I have tried two third parties: one requires us to rewrite hundreds of reports, and the other can only display Times New Roman and no images, besides costing us more money and vendor lock-in. Reporting Services came out just as Crystal Reports became Business (i.e., Expensive) Objects and developed a bug they could not and would not fix. I gladly spent months rewriting our Crystal reports as Reporting Services reports that supposedly would be reliable. Then ADFS had a bug, not fixed until much later, that couldn’t handle Reporting Services’ use of a first positional URL parameter for the report name, so I rewrote all of our code to use RDLC files and eliminated Report Server. Now, .NET Core 3.1 doesn’t support Report Viewer, which needs to be rewritten to use .NET Core (or .NET 5 now), and I am waiting until Microsoft deems it important enough to support.
