The future of .NET Standard


Immo

Since .NET 5 was announced, many of you have asked what this means for .NET Standard and whether it will still be relevant. In this post, I’m going to explain how .NET 5 improves code sharing and replaces .NET Standard. I’ll also cover the cases where you still need .NET Standard.

For the impatient: TL;DR

.NET 5 will be a single product with a uniform set of capabilities and APIs that can be used for Windows desktop apps, cross-platform mobile apps, console apps, cloud services, and websites.

To better reflect this, we’ve updated the target framework names (TFMs):

  • net5.0. This is for code that runs everywhere. It combines and replaces the netcoreapp and netstandard names. This TFM will generally only include technologies that work cross-platform (except for pragmatic concessions, like the ones we already made in .NET Standard).

  • net5.0-windows (and later net6.0-android and net6.0-ios). These TFMs represent OS-specific flavors of .NET 5 that include net5.0 plus OS-specific functionality.

We won’t be releasing a new version of .NET Standard, but .NET 5 and all future versions will continue to support .NET Standard 2.1 and earlier. You should think of net5.0 (and future versions) as the foundation for sharing code moving forward.

And since net5.0 is the shared base for all these new TFMs, that means that the runtime, library, and new language features are coordinated around this version number. For example, in order to use C# 9, you need to use net5.0 or net5.0-windows.
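For illustration, a minimal project file targeting net5.0 might look like this (a sketch; the SDK then defaults to C# 9 for this TFM, with no explicit LangVersion property needed):

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net5.0</TargetFramework>
  </PropertyGroup>

</Project>
```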

What you should target

.NET 5 and all future versions will always support .NET Standard 2.1 and earlier. The only reason to retarget from .NET Standard to .NET 5 is to gain access to more runtime features, language features, or APIs. So, you can think of .NET 5 as .NET Standard vNext.

What about new code? Should you still start with .NET Standard 2.0 or should you go straight to .NET 5? It depends.

  • App components. If you’re using libraries to break down your application into several components, my recommendation is to use netX.Y where X.Y is the lowest version of .NET that your application (or applications) target. For simplicity, you probably want all projects that make up your application to be on the same version of .NET because it means you can assume the same BCL features everywhere.

  • Reusable libraries. If you’re building reusable libraries that you plan on shipping on NuGet, you’ll want to consider the trade-off between reach and available feature set. .NET Standard 2.0 is the highest version of .NET Standard that is supported by .NET Framework, so it will give you the most reach, while also giving you a fairly large feature set to work with. We’d generally recommend against targeting .NET Standard 1.x as it’s not worth the hassle anymore. If you don’t need to support .NET Framework, then you can either go with .NET Standard 2.1 or .NET 5. Most code can probably skip .NET Standard 2.1 and go straight to .NET 5.

So, what should you do? My expectation is that widely used libraries will end up multi-targeting for both .NET Standard 2.0 and .NET 5: supporting .NET Standard 2.0 gives you the most reach while supporting .NET 5 ensures you can leverage the latest platform features for customers that are already on .NET 5.
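Such a multi-targeted library could be sketched like this in its project file:

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFrameworks>netstandard2.0;net5.0</TargetFrameworks>
  </PropertyGroup>

</Project>
```

In code, the predefined NET5_0 preprocessor symbol (#if NET5_0) then lets you light up .NET 5-only functionality while keeping a fallback path for .NET Standard 2.0 consumers.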

In a couple of years, the choice for reusable libraries will only involve the version number of netX.Y, which is basically how building libraries for .NET has always worked — you generally want to support some older version in order to ensure you get the most reach.

To summarize:

  • Use netstandard2.0 to share code between .NET Framework and all other platforms.
  • Use netstandard2.1 to share code between Mono, Xamarin, and .NET Core 3.x.
  • Use net5.0 for code sharing moving forward.

Problems with .NET Standard

.NET Standard has made it much easier to create libraries that work on all .NET platforms. But there are still three problems with .NET Standard:

  1. It versions slowly, which means you can’t easily use the latest features.
  2. It needs a decoder ring to map versions to .NET implementations.
  3. It exposes platform-specific features, which means you can’t statically validate whether your code is truly portable.

Let’s see how .NET 5 will address all three issues.

Problem 1: .NET Standard versions slowly

.NET Standard was designed at a time when the .NET platforms weren’t converged at the implementation level. This made writing code that needs to work in different environments hard, because different workloads used different .NET implementations.

The goal of .NET Standard was to unify the feature set of the base class library (BCL), so that you can write a single library that can run everywhere. And this has served us well: .NET Standard is supported by over 77% of the top 1000 packages. And if we look at all packages on NuGet.org that have been updated in the last 6 months, the adoption is at 58%.

Number of packages supporting .NET Standard

But standardizing the API set alone creates a tax. It requires coordination whenever we’re adding new APIs — which happens all the time. The .NET open-source community (which includes the .NET team) keeps innovating in the BCL by providing new language features, usability improvements, new cross-cutting features such as Span<T>, or supporting new data formats or networking protocols.

And while we can provide new types as NuGet packages, we can’t provide new APIs on existing types this way. So, in the general sense, innovation in the BCL requires shipping a new version of .NET Standard.

Up until .NET Standard 2.0, this wasn’t really an issue because we only standardized existing APIs. But in .NET Standard 2.1, we standardized brand new APIs and that’s where we saw quite a bit of friction.

Where does this friction come from?

.NET Standard is an API set that all .NET implementations have to support, so there is an editorial aspect to it in that all APIs must be reviewed by the .NET Standard review board. The board is comprised of .NET platform implementers as well as representatives of the .NET community. The goal is to only standardize APIs that we can truly implement in all current and future .NET platforms. These reviews are necessary because there are different implementations of the .NET stack, with different constraints.

We predicted this type of friction, which is why we said early on that .NET Standard will only standardize APIs that were already shipped in at least one .NET implementation. This seems reasonable at first, but then you realize that .NET Standard can’t ship very frequently. So, if a feature misses a particular release, you might have to wait for a couple of years before it’s even available and potentially even longer until this version of .NET Standard is widely supported.

We felt for some features that opportunity loss was too high, so we did unnatural acts to standardize APIs that weren’t shipped yet (such as IAsyncEnumerable<T>). Doing this for all features was simply too expensive, which is why quite a few of them still missed the .NET Standard 2.1 train (such as the new hardware intrinsics).

But what if there was a single code base? And what if that code base had to support all the aspects that make .NET implementations differ today, such as supporting both just-in-time (JIT) compilation and ahead-of-time (AOT) compilation?

Instead of doing these reviews as an afterthought, we’d make all these aspects part of the feature design, right from the start. In such a world, the standardized API set is, by construction, the common API set. When a feature is implemented, it would already be available for everyone because the code base is shared.

Problem 2: .NET Standard needs a decoder ring

Separating the API set from its implementation doesn’t just slow down the availability of APIs. It also means that we need to map .NET Standard versions to their implementations. As someone who had to explain this table to many people over time, I’ve come to appreciate just how complicated this seemingly simple idea is. We’ve tried our best to make it easier, but in the end, it’s just inherent complexity because the API set and the implementations are shipped independently.

We have unified the .NET platforms by adding yet another synthetic platform below them all that represents the common API set. In a very real sense, this XKCD-inspired comic is spot on:

How .NET platforms proliferate

We can’t solve this problem without truly merging some rectangles in our layer diagram, which is what .NET 5 does: it provides a unified implementation where all parties build on the same foundation and thus get the same API shape and version number.

Problem 3: .NET Standard exposes platform-specific APIs

When we designed .NET Standard, we had to make pragmatic concessions in order to avoid breaking the library ecosystem too much. That is, we had to include some Windows-only APIs (such as file system ACLs, the registry, WMI, and so on). Moving forward, we will avoid adding platform-specific APIs to net5.0, net6.0 and future versions. However, it’s impossible for us to predict the future. For example, with Blazor WebAssembly we have recently added a new environment where .NET runs and some of the otherwise cross-platform APIs (such as threading or process control) can’t be supported in the browser’s sandbox.

Many of you have complained that these kinds of APIs feel like "landmines" – the code compiles without errors and thus appears to be portable to any platform, but when running on a platform that doesn’t have an implementation for the given API, you get runtime errors.

Starting with .NET 5, we’re shipping analyzers and code fixers with the SDK that are on by default. This includes the platform compatibility analyzer that detects unintentional use of APIs that aren’t supported on the platforms you intend to run on. This feature replaces the Microsoft.DotNet.Analyzers.Compatibility NuGet package.

Let’s first look at Windows-specific APIs.

Dealing with Windows-specific APIs

When you create a project targeting net5.0, you can reference the Microsoft.Win32.Registry package. But when you start using it, you’ll get the following warnings:

private static string GetLoggingDirectory()
{
    using (RegistryKey key = Registry.CurrentUser.OpenSubKey(@"Software\Fabrikam"))
    {
        if (key?.GetValue("LoggingDirectoryPath") is string configuredPath)
            return configuredPath;
    }

    string exePath = Process.GetCurrentProcess().MainModule.FileName;
    string folder = Path.GetDirectoryName(exePath);
    return Path.Combine(folder, "Logging");
}
CA1416: 'RegistryKey.OpenSubKey(string)' is supported on 'windows'
CA1416: 'Registry.CurrentUser' is supported on 'windows'
CA1416: 'RegistryKey.GetValue(string?)' is supported on 'windows'

You have a few options for addressing these warnings:

  1. Guard the call. You can check whether you’re running on Windows before calling the API by using OperatingSystem.IsWindows().

  2. Mark the call as Windows-specific. In some cases, it might make sense to mark the calling member as platform-specific via [SupportedOSPlatform("windows")].

  3. Delete the code. Generally not what you want because it means you lose fidelity when your code is used by Windows users, but for cases where a cross-platform alternative exists, you’re likely better off using that over platform-specific APIs. For example, instead of using the registry, you could use an XML configuration file.

  4. Suppress the warning. You can of course cheat and simply suppress the warning, either via .editorconfig or #pragma warning disable. However, you should prefer options (1) and (2) when using platform-specific APIs.

To guard the call, use the new static methods on the System.OperatingSystem class, for example:

private static string GetLoggingDirectory()
{
    if (OperatingSystem.IsWindows())
    {
        using (RegistryKey key = Registry.CurrentUser.OpenSubKey(@"Software\Fabrikam"))
        {
            if (key?.GetValue("LoggingDirectoryPath") is string configuredPath)
                return configuredPath;
        }
    }

    string exePath = Process.GetCurrentProcess().MainModule.FileName;
    string folder = Path.GetDirectoryName(exePath);
    return Path.Combine(folder, "Logging");
}

To mark your code as Windows-specific, apply the new SupportedOSPlatform attribute:

[SupportedOSPlatform("windows")]
private static string GetLoggingDirectory()
{
    using (RegistryKey key = Registry.CurrentUser.OpenSubKey(@"Software\Fabrikam"))
    {
        if (key?.GetValue("LoggingDirectoryPath") is string configuredPath)
            return configuredPath;
    }

    string exePath = Process.GetCurrentProcess().MainModule.FileName;
    string folder = Path.GetDirectoryName(exePath);
    return Path.Combine(folder, "Logging");
}

In both cases, the warnings for using the registry will disappear.

The key difference is that in the second example the analyzer will now issue warnings for the call sites of GetLoggingDirectory() because it is now considered to be a Windows-specific API. In other words, you forward the requirement of doing the platform check to your callers.

The [SupportedOSPlatform] attribute can be applied to the member, type, or assembly level. This attribute is also used by the BCL itself. For example, the assembly Microsoft.Win32.Registry has this attribute applied, which is how the analyzer knows that the registry is a Windows-specific API in the first place.
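For illustration, marking an entire assembly as Windows-specific takes a single assembly-level attribute, which is conceptually what Microsoft.Win32.Registry does:

```csharp
using System.Runtime.Versioning;

// Every API in this assembly is now considered Windows-specific by the analyzer.
[assembly: SupportedOSPlatform("windows")]
```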

Note that if you target net5.0-windows, this attribute is automatically applied to your assembly. That means using Windows-specific APIs from net5.0-windows will never generate any warnings because your entire assembly is considered to be Windows-specific.
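For completeness, option (4), suppressing the warning, can be done inline with a #pragma (sketched here around the registry call from earlier) or via configuration:

```csharp
#pragma warning disable CA1416 // Validate platform compatibility
using (RegistryKey key = Registry.CurrentUser.OpenSubKey(@"Software\Fabrikam"))
{
    // ...
}
#pragma warning restore CA1416
```

In .editorconfig, the equivalent is dotnet_diagnostic.CA1416.severity = none.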

Dealing with APIs that are unsupported in Blazor WebAssembly

Blazor WebAssembly projects run inside the browser sandbox, which constrains which APIs you can use. For example, while thread and process creation are both cross-platform APIs, we can’t make these APIs work in Blazor WebAssembly, which means they throw PlatformNotSupportedException. We have marked these APIs with [UnsupportedOSPlatform("browser")].

Let’s say you copy & paste the GetLoggingDirectory() method into a Blazor WebAssembly application.

private static string GetLoggingDirectory()
{
    //...

    string exePath = Process.GetCurrentProcess().MainModule.FileName;
    string folder = Path.GetDirectoryName(exePath);
    return Path.Combine(folder, "Logging");
}

You’ll get the following warnings:

CA1416: 'Process.GetCurrentProcess()' is unsupported on 'browser'
CA1416: 'Process.MainModule' is unsupported on 'browser'

To deal with these warnings, you have basically the same options as with Windows-specific APIs.

You can guard the call:

private static string GetLoggingDirectory()
{
    //...

    if (!OperatingSystem.IsBrowser())
    {
        string exePath = Process.GetCurrentProcess().MainModule.FileName;
        string folder = Path.GetDirectoryName(exePath);
        return Path.Combine(folder, "Logging");
    }
    else
    {
        return string.Empty;
    }
}

Or you can mark the member as being unsupported by Blazor WebAssembly:

[UnsupportedOSPlatform("browser")]
private static string GetLoggingDirectory()
{
    //...

    string exePath = Process.GetCurrentProcess().MainModule.FileName;
    string folder = Path.GetDirectoryName(exePath);
    return Path.Combine(folder, "Logging");
}

Since the browser sandbox is fairly restrictive, not all class libraries and NuGet packages can be expected to work in Blazor WebAssembly; in fact, the vast majority of libraries aren’t expected to support running there.

That’s why regular class libraries targeting net5.0 won’t see warnings for APIs that are unsupported by Blazor WebAssembly. You have to explicitly indicate that you intend to support your project in Blazor WebAssembly by adding the <SupportedPlatform> item to your project file:

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>net5.0</TargetFramework>
  </PropertyGroup>
  
  <ItemGroup>
    <SupportedPlatform Include="browser" />
  </ItemGroup>
  
</Project>

If you’re building a Blazor WebAssembly application, you don’t have to do this because the Microsoft.NET.Sdk.BlazorWebAssembly SDK does this automatically.

.NET 5 as the combination of .NET Standard & .NET Core

.NET 5 and subsequent versions will be a single code base that supports desktop apps, mobile apps, cloud services, websites, and whatever environment .NET will run on tomorrow.

You might think "hold on, this sounds great, but what if someone wants to create a completely new implementation". That’s fine too. But virtually nobody will start one from scratch. Most likely, it will be a fork of the current code base (dotnet/runtime). For example, Tizen (the Samsung platform for smart appliances) uses .NET Core with minimal changes and a Samsung-specific app model on top.

Forking preserves a merge relationship, which allows maintainers to keep pulling in new changes from the dotnet/runtime repo, benefiting from BCL innovations in areas unaffected by their changes. That’s very similar to how Linux distros work.

Granted, there are cases where one might want to create a very different "kind" of .NET, such as a minimal runtime without the current BCL. But that would mean that it couldn’t leverage the existing .NET library ecosystem anyway, which means it wouldn’t have implemented .NET Standard either. We’re generally not interested in pursuing this direction, but the convergence of .NET Standard and .NET Core doesn’t prevent that nor does it make it any harder.

.NET versioning

As a library author, you’re probably wondering when .NET 5 will be widely supported. Moving forward, we’ll ship .NET every year in November, with every other year being a Long Term Support (LTS) release.

.NET 5 will ship in November 2020 and .NET 6 will ship in November 2021 as an LTS. We created this fixed schedule to make it easier for you to plan your updates (if you’re an app developer) and predict the demand for supported .NET versions (if you’re a library developer).

Thanks to the ability to install .NET Core side-by-side, new versions are adopted fairly fast with LTS versions being the most popular. In fact, .NET Core 3.1 was the fastest adopted .NET version ever.

.NET 5 Schedule

The expectation is that every time we ship, we ship all framework names in conjunction. For example, it might look something like this:

.NET 5              .NET 6              .NET 7
net5.0              net6.0              net7.0
                    net6.0-android      net7.0-android
                    net6.0-ios          net7.0-ios
net5.0-windows      net6.0-windows      net7.0-windows
net5.0-someoldos

This means that you can generally expect that whatever innovation lands in the BCL, you’re going to be able to use it from all app models, no matter which platform they run on. It also means that libraries built for the latest netX.Y version can always be consumed from all app models, as long as those app models run on the latest version.

This model removes the complexity around .NET Standard versioning because each time we ship, you can assume that all platforms are going to support the new version immediately and completely. And we cement this promise by using the prefix naming convention.

New versions of .NET might add support for other platforms. For example, we will add support for Android and iOS with .NET 6. Conversely, we might stop supporting platforms that are no longer relevant, as illustrated by the pretend net5.0-someoldos target framework that doesn’t exist in .NET 6. We have no plans to drop a platform, but the model supports it. Doing so would be a big deal, isn’t expected, and would be announced long in advance. That’s the same model we had with .NET Standard, where, for example, there is no new version of Windows Phone that implements a later version of .NET Standard.

Why there is no TFM for WebAssembly

We originally considered adding a TFM for WebAssembly, such as net5.0-wasm. We decided against that for the following reasons:

  • WebAssembly is more like an instruction set (such as x86 or x64) than like an operating system. And we generally don’t offer divergent APIs between different architectures.

  • WebAssembly’s execution model in the browser sandbox is a key differentiator, but we decided that it makes more sense to only model this as a runtime check. Similar to how you check for Windows and Linux, you can use the OperatingSystem type. Since this isn’t about the instruction set, the method is called IsBrowser() rather than IsWebAssembly().

  • There are runtime identifiers (RIDs) for WebAssembly, called browser and browser-wasm. They allow package authors to deploy different binaries when targeting WebAssembly in a browser. This is especially useful for native code, which needs to be compiled to WebAssembly beforehand.
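For example, a package author could ship a browser-specific implementation next to the regular one by using the standard NuGet runtimes folder convention (the file names here are hypothetical):

```text
lib/net5.0/MyLibrary.dll                   <-- used on all other platforms
runtimes/browser/lib/net5.0/MyLibrary.dll  <-- used when running in the browser
```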

As described above, we have marked APIs that are unsupported in the browser sandbox, such as System.Diagnostics.Process. If you use those APIs from inside a browser app, you’ll get a warning telling you that the API is unsupported.

Summary

net5.0 is for code that runs everywhere. It combines and replaces the netcoreapp and netstandard names. We also have platform-specific frameworks, such as net5.0-windows (and later also net6.0-android, and net6.0-ios).

Since there is no difference between the standard and its implementation, you’ll be able to take advantage of new features much quicker than with .NET Standard. And due to the naming convention, you’ll be able to easily tell who can consume a given library — without having to consult the .NET Standard version table.

While .NET Standard 2.1 will be the last version of .NET Standard, .NET 5 and all future versions will continue to support .NET Standard 2.1 and earlier. You should think of net5.0 (and future versions) as the foundation for sharing code moving forward.

Happy coding!

81 comments


  • ysmoradi

    “for WebAssembly, called browser and browser-wasm. They allow package authors to deploy different binaries when targeting WebAssembly in a browser.”

    Imagine I have a NuGet package which produces different binaries with different NuGet package dependencies using different target frameworks.

    For example: https://www.nuget.org/packages/Bit.Universal.Http/

    How can I do the same thing I’ve done for Android/iOS/UWP etc. for WebAssembly too? For example, I’d like to reference a NuGet package in my NuGet package only for WASM.

    If there was a net5-wasm like net5-windows, I would be able to do that. But I have no idea how a runtime identifier can help me in this situation.

  • Carl Scarlett

    In July 2019 the release date of .NET 5 shifted from September 2020 to November 2020.
    14 months later, in a year complicated by COVID-19, the release date has not shifted.

    In light of this and the degree of difficulty unifying all the .NET technologies, you need to be congratulated for making this milestone on time.
    What an incredible achievement!

    I can’t wait to work in the promised land of a unified .NET at last. This is going to be awesome.

  • Nicolas Pigaglio

    .NET Standard was the promise that we would be able to create our new applications in .NET Core while still being able to keep and share code with our older .NET 4.x apps. Really hoping Microsoft and the other NuGet package maintainers won’t drop .NET Standard support anytime soon.

    • Immo Landwerth (Microsoft)

      And this promise of .NET Standard remains unchanged. However, it is indeed up to the package authors to select what version of .NET they are targeting. We still recommend that libraries that go for reach should remain on .NET Standard 2.0. But I think it’s fair to say that many packages will eventually only support later versions, the reason being that new versions have more features that they are typically also forced to support, so the cost goes up. Not everyone is able to do that. In other words, the cost argument cuts both ways.

  • Ralph

    Hi Immo,
    thanks for the great article!

    There is one thing, I don’t like about it:
    Please don’t publish misleading information regarding .NET Framework.

    In my opinion, the vision-video “One .NET” is simply wrong.
    .NET Framework won’t go anywhere, won’t get feature updates, won’t be merged into .NET 5.

    This would be more true:
    .NET Standard 2.0 = .NET Framework and .NET Core
    .NET Standard 2.1 = .NET Core
    .NET Standard 2.1+ = .NET 5, .NET 6, …

    Yes, you are covering that in the details, but please also change the TL;DR to match the true story 😉

    • Emmanuel Adebiyi

      I hope you realise that what you’re stating here has been stated explicitly prior to this moment. I see no point for unproductive repetition. No one was expecting .NET Framework support on .NET 5 anyway.

      • Ralph

        Oh sure, the only thing I disagree with, is to say .NET Framework will be merged into “One .NET” – that’s simply not true.
        I fully understand that we don’t have to repeat it every time but I disagree with this kind of information.

    • Immo Landwerth (Microsoft)

      I hear what you’re saying. Our intention isn’t to mislead people; the picture is meant to cover the strategy of the platform, which also includes previous versions of .NET Core. And in .NET Core 2.0 and 3.0 we have done exactly what the picture suggested: we have ported substantial amounts of .NET Framework code to .NET Core, including WinForms and WPF. We have also added the .NET Framework compatibility mode, which allows .NET Core and .NET Standard projects to reference .NET Framework-only NuGet packages and binaries. After .NET Core 3.0 we have stated that we’re done with porting .NET Framework code to .NET Core. And with .NET 5, we’re making this explicit by changing the branding, dropping the core suffix and using 5 as the version number so that the product is positioned as the successor of .NET Framework.

      • Peter Row

        I think MS needs to be very repetitive about .NET 4.x vs. .NET 5 with regards to it being a major refactoring project for existing applications on .NET 4.x. For example, we’ll hopefully soon be upgrading to .NET 4.8 from 4.5.2, which isn’t as trivial as one might hope. Going to .NET 5 (or more likely .NET 7+ by the time we get the opportunity) is going to be a massive shift, since it’s the equivalent of going to .NET Core now.

        To a manager it doesn’t look like a big leap between versions 4.8 and 5.0, and thus it’s harder to convince them to spend the time/money on converting.

        It doesn’t help that Microsoft made .NET a component of Windows that lives and dies with the OS version; arguably that is what keeps older versions of .NET in play for so long. It’s hard to convince people to allow the upgrade whilst the existing version is still supported.

        • Immo Landwerth (Microsoft)

          That is fair. It might be nuanced, but there is no such thing as .NET 4.8. The product is called “.NET Framework 4.8”. The new product is called “.NET 5”. Moving from .NET Framework 4.x to .NET 5 will be about as much work as moving to .NET Core 3.1. The primary goal of dropping the suffix is unifying the product line up and converging them both in name and version space to .NET 5. Feel free to quote me on this 🙂

      • Ralph

        Thanks for the detailed answer, I understand your intention.
        For me, it’s very clear that .NET 4.8 is not going anywhere, but I think for others who don’t follow the news every day (as Peter mentions, managers or even developers) this is very misleading information.

        I understand that it’s not fun to repeat the message about .NET 4.8 every time, but I think it’s really necessary to do that for at least some months after the .NET 5 release.

      • Paulo Pinto

        And because of it we are stuck with .NET Framework 4.7.2, eyeing upgrades only into .NET Framework 4.8, because no one is willing to pay for the major refactoring that Microsoft assumes everyone is willing to do for free.

        This, coupled with the ongoing changes around Reunion, UWP, MAUI vs Blazor vs Forms vs WPF vs WinUI, .NET Native vs .NET 6, EF6 vs EF Core, and C++/WinRT vs C++/CX (which affects usage of UWP APIs not exposed to .NET as UWP components), means we are now very cautious with whatever Microsoft is promoting for future .NET versions.

        • Marco von Ballmoos

          major refactoring that Microsoft assumes everyone is willing to do for free.

          I think that’s a bit harsh. Microsoft has pledged to continue support for .NET Framework 4.8. I understand it’s non-trivial to upgrade from .NET Framework (I’ve done it myself with non-trivial code bases), but no-one’s forcing an upgrade. .NET Framework includes a lot of legacy design decisions that are incompatible with the cross-platform and performance goals of .NET 5. There was no realistic way to upgrade it without rewriting it.

          Even a relatively sophisticated and far-reaching library or framework should be able to continue to target .NET Standard 2.0. Package authors aren’t going to dump support just for the heck of it. If that’s the case, then it’s not really MS’s fault, is it?

          • Paulo Pinto

            Managed Direct X, XNA, Silverlight, WinRT 8, WinRT 8.1 UAP, Windows 10 UWP, .NET Native, C++/CX, WCF, EF 6 designers, Windows Forms/WPF VS component API .NET Core 1.0 JSON projects, .NET EF Core 2.1 – 3.0 changes, Reunion, ….

            Me being harsh?

    • Immo Landwerth (Microsoft)

      You will be able to reference any NuGet package that can be installed into net5.0, which also includes .NET Standard 2.1 (and earlier), .NET Core 3.1 (and earlier).

      Whether the library will work is a different matter because, as the post suggests, the code runs inside the browser’s sandbox. Not all APIs are supported. We expect that libraries will eventually update to include the attributes to communicate to their consumers whether or not Blazor WebAssembly is supported.

  • TSX

    Assuming that .NET Framework is legacy and the current VS depends on .NET Framework, then maybe you should stop investing in convoluted tools like the WPF/WinForms designers for .NET Core / .NET 5 apps, and with the new VS (2021?) go straight to a VS based on .NET 5 with WPF/WinForms designers that work naturally with VS (as the current designers do). Anyway, it will be interesting to see how long .NET Framework will drag on the next versions of VS and prevent them from switching to the newer .NET 5/6/7, as there are multiple designers/tools created for .NET Framework, and updating all of it won’t be quick and easy.

    • MgSam

      VS is a Frankenstein of different technologies from different eras from what I understand, including C++. No doubt this is the real reason Microsoft doesn’t want to invest in making it a 64 bit application, it would be too costly and require too much re-engineering.

      I think there are many people at Microsoft hoping that one day VS Code will totally subsume VS, although that day remains a long way off.

      • Immo Landwerth (Microsoft)

        You’re not entirely wrong, but you’re also not describing the full story.

        Visual Studio has a sophisticated out-of-proc model for many of the key features you use in VS. This includes, for example, the features that power IntelliSense, such as code completion, refactoring, analyzers etc. And many of these are running in a .NET Core process that the VS instance communicates with. Other IDEs have similar models, which includes VS Code and JetBrains Rider.

        The same applies to the WPF and WinForms designers. The .NET Core versions of these designers are running in a separate process that is .NET Core-based. This is necessary because both run user code and we can’t run .NET Core code on .NET Framework. That’s also the reason why the WinForms designer took more time, because the .NET Framework version runs in-proc of VS which meant we first had to make a designer that can be hosted out-of-proc.

        • JesperTreetop

          Visual Studio designer code is closed source along with the rest of Visual Studio, but has anything been revealed about the process through which the out-of-process designer works technically? Much like the “how .NET is built” post recently, I’d love to know more about this.

          • Florian Schneidereit

            From a quick peek into Microsoft.VisualStudio.WinForms.RemoteClient.dll, I assume it’s using some kind of custom remoting based on memory-mapped files. LINQPad uses a similar approach for running its queries out-of-proc. Search for "Pushing C# to the Limit" by Joe Albahari; he gave a talk back in 2017(?) where he described his approach.

          • JesperTreetop

            That’s great! I’m interested mostly not in how the bits get across the wire but in what happens on both sides before and after – how a .NET Framework Visual Studio manages to provide a well-integrated solution where the controls are in .NET Core, and where you can interact with them at least to some degree. (If they’d been the same framework, System.AddIn sounds like it could have been the solution, but this works with controls you write that aren’t at all enlightened to this type of remoting.) Even if it’s just rendering bitmaps at phenomenal speed and shoveling them back and forth, knowing how to do that at a speed and latency that’s good enough for the designer could be very educational.

            (I’ve seen Joe Albahari’s IPC too and it’s incredibly inspired and low-overhead. This appears to be the code for it.)