Runtime: The future of JSON in .NET Core 3.0

Created on 29 Oct 2018  ·  193 Comments  ·  Source: dotnet/runtime

JSON has become an essential part of virtually all modern .NET applications and in many cases has even surpassed the usage of XML. However, .NET hasn't had a (great) built-in way to deal with JSON. Instead we've relied on [Json.NET], which continues to serve the .NET ecosystem well.

Moving forward, we plan on making some changes to our JSON support:

  • We need high-performance JSON APIs. We need a new set of JSON APIs that are highly tuned for performance by using Span<T> and allow for processing UTF-8 directly without having to transcode to UTF-16 string instances. Both aspects are critical for our web server Kestrel, where throughput is a key requirement.

  • Remove dependency from ASP.NET Core to Json.NET. Today, ASP.NET Core has a dependency on Json.NET. While this provides a tight integration between ASP.NET Core and Json.NET, it also means that application developers cannot freely choose which JSON library they are using. This is also problematic for customers of Json.NET as the version is dictated by the underlying platform. However, Json.NET is frequently updated and application developers often want to -- or even have to -- use a specific version. Thus, we want to remove the dependency from ASP.NET Core 3.0 to Json.NET so that customers can choose which version to use, without fearing they might accidentally break the underlying platform. In addition, this also makes it possible to plug in an entirely different JSON library.

  • Provide an ASP.NET Core integration package for Json.NET. Json.NET has basically become the Swiss Army knife of JSON processing in .NET. It provides many options and facilities that allow customers to handle their JSON needs with ease. We don't want to compromise on the Json.NET support customers are getting today, for example, the ability to configure the JSON serialization via the AddJsonOptions extension method. Thus, we want to provide the Json.NET integration as a NuGet package that developers can optionally install so they get all the bells and whistles they get from Json.NET today. The other part of this work item is to ensure we have the right extension points so that other parties can provide similar integration packages for their JSON library of choice.

Below are more details around this plan.

The need for high-performance JSON APIs

The requirements for the .NET stack have changed a bit since the arrival of .NET Core. Historically, .NET has valued usability and convenience. With .NET Core, we've added a focus on performance, and we've made significant investments to serve high performance needs. And the improvements we made in the popular TechEmpower benchmark are a testament to that.

With .NET Core 2.1, we've added a brand new primitive called Span<T> that allows us to represent native memory and arrays in a uniform way. With this type, we've also added a set of parsing and encoding APIs that are much more memory efficient without having to resort to unsafe code.

Part of the work of minimizing allocations is to avoid having to transcode UTF-8 payloads into UTF-16 strings, purely for parsing reasons. Currently, Json.NET is implemented by reading UTF-16. We need the ability to read (and write) JSON documents directly in UTF-8 because most network protocols (including HTTP) use UTF-8.

During .NET Core 2.1 we've learned that updating our existing APIs to leverage Span<T> has limits. While we did add a bunch of overloads that accept spans, we also had to produce brand new APIs designed around minimizing allocations and dealing with buffers, which we exposed in the System.Buffers namespace. And with System.IO.Pipelines we've also added a programming model that enables developers to share buffers without having to deal with lifetime issues.

Based on these experiences, we believe that in order to support JSON parsing, we'll need to expose a new set of JSON APIs that are specifically geared for high-performance scenarios.

You might wonder why we can't just update Json.NET to include support for parsing JSON using Span<T>. Well, James Newton-King -- the author of Json.NET -- has the following to say about that:

Json.NET was created over 10 years ago, and since then it has added a wide range of features aimed to help developers work with JSON in .NET. In that time Json.NET has also become far and away NuGet's most depended on and downloaded package, and is the go-to library for JSON support in .NET. Unfortunately, Json.NET's wealth of features and popularity works against making major changes to it. Supporting new technologies like Span<T> would require fundamental breaking changes to the library and would disrupt existing applications and libraries that depend on it.

Going forward Json.NET will continue to be worked on and invested in, both addressing known issues today and supporting new platforms in the future. Json.NET has always existed alongside other JSON libraries for .NET, and there will be nothing to prevent you using one or more together, depending on whether you need the performance of the new JSON APIs or the large feature set of Json.NET.

Move Json.NET integration into a separate NuGet package

Today, you cannot use ASP.NET Core without Json.NET because it is a dependency of ASP.NET Core itself. Over the years, we've received feedback that the dependency can conflict with other libraries that have their own dependency on a different version of Json.NET. In the past, we've considered addressing this issue by using a private copy of Json.NET in ASP.NET. However, this would create problems when developers want to configure Json.NET (for instance, in order to control how the serializer behaves when formatting JSON objects).

Moving forward we'd like to:

  1. Replace the internal usage of Json.NET in ASP.NET Core by the new platform-provided JSON APIs.

  2. Factor the public facing usage of Json.NET into an optional integration package that can be acquired from NuGet.

So the existing integration between ASP.NET Core and Json.NET will continue to be supported, but will be moving out of the platform and into a separate package. However, since the integration is then designed to sit on top of the platform, it will also allow customers to update Json.NET to later versions.

Furthermore, customers who need more performance can also choose to use the new JSON APIs, at the expense of the rich feature set that Json.NET offers.

area-Meta


All 193 comments

This is great. I am all for faster, less-allocating JSON parsing.

Will there be a discussion about the features from json.net that the new json apis will support? If there is, I think the two major features that come to mind would be renaming/casing properties and ignoring null properties.

Will there be a discussion about the features from json.net that the new json apis will support?

Yes. We've done some early thinking, which we will migrate to CoreFx. It will be a feature that is designed & built in the open as usual. In addition, I've reached out to the authors of many of the popular JSON libraries and invited them to review early drafts of this announcement. My hope is that we can work together to create a solid JSON component for the platform while also keeping the ecosystem on top of it (such as ASP.NET Core) pluggable to allow for others. In the end, different consumers will have different goals, and being able to plug in a different library means you can get maximum flexibility in choosing the component that has the best cost/benefit for your app.

Hey @terrajobst. Will the new JSON appear as a netstandard API surface, or just integrated into Core for now?

Hey @terrajobst. Will the new JSON appear as a netstandard API surface, or just integrated into Core for now?

Yes, the question is just which release train it can catch. 2.1 might be too early.

So are the JSON parsing bits baked into the framework planned to be available when v3.0 goes to RTM, or will only the integration APIs in ASP.NET Core be complete (with just one implementation - JSON.NET) that will be swappable at a later date?

The plan for 3.0 is as follows:

  1. Built-in high-performance JSON APIs. A low-level reader/writer, a Stream-based reader/writer, and a serializer.
  2. ASP.NET Core is pluggable w.r.t. the JSON component.

There is an open-ended question about what the templates for ASP.NET in 3.0 will use. Depending on the fidelity we can provide by 3.0, we might have them pull in the Json.NET integration package. However, the goal is to deliver enough fidelity & parity to depend only on the built-in ones by default.

Thanks - that helps clear things up. 👍

And some additional questions!

If an integration package is used, will it be used throughout the entire ASP.NET Core pipeline or only in some places?
I'm assuming Kestrel will always use the internal readers/writers.

Would the API ergonomics be:

  • Supply an integration only when you want to enhance the built-in feature set.
  • Always supply an integration package, but have one built in, integrating the low-level reader/writers into higher-level functionality.

Assuming that this new parser will be used for all the built-in JSON stuff like appSettings.json, could I put in an early request for comments to be supported?

Thanks.

This is awesome news! Quick question: What packages will this library depend on?

Why reinvent a wheel that is tested by production customers? If there is a problem with Json.Net, just send a PR as it's open source.

I suppose the problem with Json.NET is that it's not owned by Microsoft, so it has to be replaced. Oh, but there is one already in System.Runtime.Serialization, called DataContractJsonSerializer. Can you use that, or is coding new APIs yourself just so much fun that it cannot be avoided?

The reason I'm not very happy with this is that Json.Net already supports edge cases like F# discriminated unions. Not particularly well, but on a level that developers can live with. New APIs usually forget everything other than the use case of an ASP.NET website.

@markrendle There is an opt-in setting on JsonReader (work in progress) to allow comments. The configuration system will likely enable that setting by default.

@Thorium Did you actually read the OP? It explains why not JSON.NET, and that JSON.NET will continue to be officially supported with an add-in package.

@JamesNK 😄

@Thorium Json.NET isn't going away. You aren't losing anything. This is another option for simple and high performance scenarios.

@Thorium Json.NET isn't going away. You aren't losing anything. This is another option for simple and high performance scenarios.

How will the generated JSON be backward compatible?

For example, I'm using SignalR, which is using Json.NET in the background. Now, will my F# discriminated unions serialize to similar structures, so that I will not be fighting issues with the new Azure SignalR Service (backplane) throwing runtime exceptions because it serializes the structures differently than my server's current SignalR library?

I hope others will pick up the new APIs quickly. Looking at you, @AzureCosmosDB ;-)

Are you planning on including a class like JObject and support for dynamic, or is this out of scope for this feature?

I recommend a look at one of these libs:

They could be a really good source of inspiration.

Will DataContractJsonSerializer have this new reader/writer used internally?

I've reached out to the authors of many of the popular JSON libraries and invited them to review early drafts of this announcement. My hope is that we can work together to create a solid JSON component for the platform while also keeping the ecosystem on top of it (such as ASP.NET Core) pluggable to allow for others.

Is there a reason why the 2nd most popular JSON library after JSON.NET - ServiceStack.Text, which has already been refactored to be built on Span APIs - has been excluded? The ServiceStack.Text serializers are what powers ServiceStack, one of the most popular "alternative .NET Core compatible web frameworks", which supports running on more platforms in .NET. I'm curious about which "ecosystem" you're referring to and hoping to "work together" with here. I'd obviously be interested in how compatible these "pluggable" APIs end up being, or whether this ends up being another area where the adoption and integration of new MS libraries ends up killing the ecosystem that it's replacing.

It may be worth reviewing the MIT licensed high-performance https://github.com/neuecc/Utf8Json

This is definitely what we need... my suggestion for the main class name: just use "Json".

@terrajobst I was wondering when this would happen...

I was always wondering why JSON.Net was added as a direct dependency rather than an abstraction (even considering it is the de facto JSON package for the .NET ecosystem).

However, I think adding an abstraction for JSON only is somewhat shooting yourself in the foot. I think a _serializer_ abstraction, like Orleans' IExternalSerializer, in the shape of a Microsoft.Extensions.Serialization package or something, would be more effective...

Is there any particular reason to make it JSON-only? I see other cases where people could plug in other types of serializers...

@galvesribeiro Something like IOutputFormatter/IInputFormatter?

@yaakov-h wasn't aware of those... Where are they?

Okay... makes sense now. So where do these _new_ JSON-only abstractions come into play?

The decision to start this undertaking is also a testament to the inefficiency of System.String (UTF-16 string).
I think the new JSON hooks that will abstract all JSON handling between ASP.NET and a JSON library would look significantly better if you first tackled the task of creating a UTF-8 string base type.
--> Maybe create a System.Utf8String

Yeah... I remember @migueldeicaza saying a while ago that someday he will make .NET use UTF-8 strings 😄

@jges42 @galvesribeiro The proposal to add Utf8String is https://github.com/dotnet/corefx/issues/30503. It seems it's also planned for .Net Core 3.0.

Will these new JSON APIs have dedicated code paths for both Utf8String and char/string, or does the optimization involve flipping the status quo, so that everything that's not UTF-8 will have to be transcoded to it instead? (This doesn't necessarily involve a huge cost, since nearly nothing is string's native UCS-2/UTF-16 anyway and still has to be adapted/accounted for; I'm just trying to get an idea of the API surface. Doing this to let Kestrel be more efficient is reasonable; I just hope the design considers more clients than Kestrel.)

@galvesribeiro
Actually, I think you raise a good point. I think creating an efficient serialization framework and an efficient JSON decoder/encoder are two different kinds of problems. I know there are some ways to mark a struct as Serializable, but I've never seen it used for any JSON serialization.

The Serde project from Rust actually has a good concept, splitting the problem in two:

  1. Serialize/Deserialize (traits, similar to interfaces in C#): any type that implements these can be serialized/deserialized.
  2. Serializer/Deserializer: the format-specific implementation.

A type can implement Serialize/Deserialize either by hand or via a macro (which could be seen as some form of compiler plugin) that generates the code needed to implement those traits. If a type contains a child type that does not implement those traits, it will even warn at compile time. It's a nice concept overall, because it means you can just write some data objects and (de)serialize them to any supported format. Writing a format is much easier this way.

I don't think Serde's ways will all work for C#, because it doesn't really offer any type-specific attributes, which could be important for some data structures. So there has to be some work done on this. Also, considering AoT compilation will be very important for some projects (WASM), it should also function well with it.

Here is a link to the Serde docs to make it clearer (click the bottom 4 traits to see the concept):
https://docs.serde.rs

@mythz The license of ServiceStack.Text is AGPL with some FOSS exception; it would probably prevent people from using it in proprietary software. Also, I think it requires clearance from legal for Microsoft employees to even touch it, and any employee who has looked at the source might be barred from working on any other JSON library or related technology.

@poizan42 ServiceStack.Text is dual licensed with both OSS/commercial licenses, free to use in both OSS and closed-source commercial projects. But the source code licenses are irrelevant, as MS is developing their own implementation.

The assertion was that MS was collaborating with the "ecosystem" to develop "pluggable" JSON serializer APIs. ServiceStack has been in active development for nearly a decade; it is one of the few independent .NET software suites that has managed to sustain its own independent, healthy community outside of MS within its lifetime; it maintains the 2nd most popular JSON serializer after JSON.NET and what appears to be the 2nd most popular actively developed web framework (outside of MS); and it runs on more platforms than any MS web framework. If that isn't considered part of the "ecosystem" primarily affected by these changes, I'm curious what "ecosystem" they are referring to, why we're being excluded, and how many others are being excluded because they're not considered part of the "ecosystem".

I don't understand all this resentment. ASP.NET forced you to use a specific version of json.net. They're changing it so you can pick whichever JSON parser you want (or mix them), and there's a default one OOB. ServiceStack should be happy about this change and monitor and provide feedback on this development, rather than just whine about how it had been overlooked, which is rarely an effective way to foster a good community spirit. I personally know many of the .NET team members and I'm confident they didn't intend any malice. They are all big proponents of OSS and community work.
Personally, any GPL-derived license would be a big automatic no for me. Apache or MIT for me and my customers, or we will move on. No mysterious dual licenses.

ASP.NET forced you to use a specific version of json.net

Nope. How so?

I don't understand all this resentment.

Seconded!

I am personally happy that we will finally be able to use the serializer of our choice without having to download JSON.Net only to not use it, yet still needing to ship it because ASP.NET has a hard reference to the library.

(Shameless plug: https://github.com/gregsdennis/Manatee.Json)

@dotMorten

I don't understand all this resentment.

Because you've either not read or not understood my comments. Try responding directly to my comments (i.e. use the quote feature) instead of making up your own narrative.

They're changing it so you can pick which ever JSON parser you want (or mix it)

So, like a magic "mix", they'll automatically choose the most optimal API and pluggability options, existing .NET serializers are going to be able to plug straight in/out and mix in their internal customizability options, the wire format will be exactly the same, and everything's just going to work across all serializers? In that case you're right, no collaboration or integration testing is needed before APIs are solidified. Or perhaps serializer implementations are more nuanced, with various differences and opinions: everything's not just going to work, not all customization options are going to be implemented exactly, the wire format is not going to be the same, and it's not going to be possible to achieve perfect interop between different implementations. The "pluggability" that you're glossing over makes a big difference, which can determine how much of a rewrite we'll have to do and whether or not it will be possible to support both the existing and this new implementation.

ServiceStack should be happy about this change and monitor and provide feedback on this development,

Which we've not been given any opportunity to do (nor do we know how to), but thanks for letting me know how I should feel. I'd prefer to assess the functionality, interoperability and compatibility of the library before assessing the strength of each. Maybe it will be great and it will be easy to support both implementations, but in my experience interoperating with different serializer implementations is fraught with incompatibilities, corner cases and reliance on serializer-specific implementations and features. My prediction is that interop between JSON.NET and the default impl will be great, since that's their design goal and what's being tested against, but other serializers are not going to be as fortunate.

rather than just whine about how it had been overlooked, which rarely is an effective way to foster a good community spirit.

I'm challenging their assertion that they've developed this in collaboration with the "ecosystem". .NET has had a history of killing the existing ecosystem each time they bundle a "default" library, which I'm also expecting will happen here (I'm struggling to recall a time when bundling a default has ever helped the ecosystem). Regardless, we need to develop seamless integration with whatever they're releasing, which I'd like to have access and input to before the APIs are frozen. But that's ok, I don't expect you to care how it affects the existing frameworks/libraries that need to support existing and future implementations; you're likely only concerned with whether JSON.NET remains supported, because that's all that impacts you. But try to hold onto your assumptions while letting us know how we should feel about absorbing disruptive changes like this.

I'm struggling to recall a time when bundling a default has ever helped the ecosystem

Oh, come on!

(In the rest I mostly agree with your sentiments)

@mythz I'm surprised that this is causing any problems, since today we bundle another 3rd-party JSON library into the framework. There are only a handful of places where we bake JSON in, and most of them have a provider model that users would reasonably replace (like MVC formatters).

My prediction is that interop between JSON.NET and the default impl will be great since that's their design goal and what's being tested against but other serializers are not going to be as fortunate.

I can already tell you that what we will ship will not support the gamut of features that JSON.NET supports. So that's already not true; in fact, we expect it to be less capable in some cases (because of performance and scope reasons).

The pluggability mostly already exists today and we have default JSON.NET implementations everywhere. This is just changing that default to be the new JSON serializer instead...

@abatishchev

I really can't recall any. When has embedding or adopting a default implementation in the base framework (or projects) benefited the existing surrounding ecosystem? Every time I've seen it bundled, e.g. NuGet, MVC, ORMs, unit testing, Web API, etc., it's only ever had a detrimental effect, effectively taking the oxygen and motivation for competing within that space.

There are times where competing libraries like ASP.NET Ajax failed to compete, where they ended up abandoning it and adopting jQuery, but I don't recall a time where it's ever helped. Note: this is just my observation from closely following .NET over several years; maybe there are examples and I'd be curious to know of some, but from my POV the effect of MS defaults on the existing ecosystem of functionality they replace is detrimental.

@mythz The benefits for users from having a default solution by Microsoft are not the same as the benefits for the alternative solutions' authors. EF is the best ORM in the .NET world, and MSTest was better than NUnit back in the day. In my opinion.

But let's not flood the thread and stick to the subject. Cheers!

@davidfowl This is just changing that default to be the new JSON serializer instead...

I would like to propose that there be no default serializer and require that an implementation be downloaded. If there must be a default, will it be baked into the framework or some separate library (as is the case currently)?

I would like to propose that there be no default serializer and require that an implementation be downloaded. If there must be a default, will it be baked into the framework or some separate library (as is the case currently)?

That's unreasonable as the experience will be subpar. Every modern platform has JSON built in.

@davidfowl It's not causing problems now because it's unreleased, but we still need to assess the disruption and scope of work it's going to cause. How much effort will it require to seamlessly support it? Will we be able to apply customizations to the new impl to support our existing behavior? Will we be able to support the new customization model and APIs? Are we able to customize our serializer to support the default configuration/wire format? Will the new APIs be able to support both .NET Core and .NET Framework? Whilst it's clear ASP.NET Core 3 will abandon .NET Framework, it's not clear if the new APIs are going to use .NET Core-only types, which would prohibit us from continuing to support both .NET Core and .NET Framework.

I can already tell you that what we will ship will not support the gamut of features that JSON.NET supports. So that's already not true; in fact, we expect it to be less capable in some cases (because of performance and scope reasons).

I'm only ever expecting it to support a subset of JSON.NET's features. E.g. will JSON.NET support the default wire format? (I'm assuming yes.) Will the new impl adopt JSON.NET serialization formats where possible? (Also assuming yes.)

How much effort will it require to seamlessly support it? Will we be able to apply customizations to the new impl to support our existing behavior? Will we be able to support the new customization model and APIs? Are we able to customize our serializer to support the default configuration/wire format? Will the new APIs be able to support both .NET Core and .NET Framework?

@mythz I'm not following some of this. I'm trying to figure out how much of this is discussing the APIs existing at all vs. how they are going to be consumed. Maybe we could look at some concrete scenarios?

@mythz The only real concern that I see for ServiceStack would be if this new API is not supported on .NET Framework classic; then ServiceStack won't be able to support both .NET Core and .NET classic, and as a customer, packages depending on those libraries won't be available on full .NET Framework. Is that your concern? I'm asking because your concern, as a concrete example, is not clear.

Also, this is a proposal at a very initial stage, and the goal it wants to achieve looks pretty promising. Constructive criticism is always good for any OSS project.

@mythz The benefits for users from having a default solution by Microsoft are not the same as the benefits for the alternative solutions' authors.

By ecosystem I'm referring to the surrounding .NET library ecosystem/communities (which presumably is also the "ecosystem" the OP is referring to) that it replaces. I'd also argue .NET users benefit from a healthy ecosystem with a variety of options and more competition (as is the trait of healthier ecosystems like Python, Java, Node, Ruby, PHP, etc).

EF is the best ORM in the .NET world

Soon after EF was released it quickly took the majority of the ORM marketshare, whilst being over 6x slower than NHibernate and arguably supporting fewer features. An excerpt from my 2012 InfoQ interview:

Their latest attempt at an ORM Data Access Layer in Entity Framework has negatively impacted the earlier prominent ORM NHibernate's once thriving community as well. Despite being several times slower than every other Open Source .NET ORM, EF has succeeded in attracting more downloads than all other ORMs combined.

Bear in mind this was pre .NET Core, where performance is now a top priority, but it's a historical example of the detrimental effect MS defaults have on existing ecosystems/communities. IMO it's pretty much accepted what happens to existing communities when MS introduces defaults, which is why there's been recent push back against shipping defaults that compete with IdentityServer and AutoMapper.

and MSTest was better than NUnit back in the day.

IMO it never was (and R# support for NUnit has always been excellent AFAICR), and the fact we couldn't run it cross-platform on Mono meant libraries that supported cross-platform on Mono (before .NET Core) couldn't use it.

But let's don't flood and stick to the subject. Cheers!

I also don't want to hijack this thread on this either, but needed to state why I disagree with your points.

In relation to this, the primary reason now to use a serializer other than JSON.NET is performance, and the reason for this new default serializer is performance. Since most people just use defaults, I'm expecting this to have the most noticeable impact on JSON.NET's share, whilst the primary reason for using an alternative serializer will no longer exist with this faster impl. So basically I also see this having a detrimental effect on the existing (library) ecosystem. IMO a weaker ecosystem of JSON libs is a net negative for .NET (not something most consumers will notice; they'll just use the defaults and forget about the other options), but that's not my main concern.

@davidfowl @shahid-pk

Despite this, I actually would've preferred that this existed 8 years ago, as the primary reason for developing ServiceStack.Text was that the .NET Framework JSON serializers were extremely slow. But after all this time, SS.Text has been extended with a number of features across all our libraries, e.g. customizations for supporting the different languages ServiceStack supports, different JSON customization options in ServiceStack, JSON support in ServiceStack Templates, complex-type JSON blob support in ServiceStack.Redis, etc.

So now I'm focused on assessing what the impact will be. What will the new API and pluggability options look like? Can we retrofit existing features onto it? Will we be able to adopt it as the JSON serializer in SS .NET Core apps (and what will that break)? Will ServiceStack.Text be able to support the new API? Will we be able to support .NET v4.5? Will it be possible to customize it to support the wire formats of existing deployments? I basically have no idea about the impact on any of this or what the strategy will be going forward, as I haven't had a chance to use or see anything yet. I'll know more when I get a chance to use it, and I'd obviously like the opportunity to test integration, provide feedback and propose changes before the APIs have been frozen.

@mythz

Is there a reason why the 2nd most popular JSON library after JSON.NET - ServiceStack.Text, which has already been refactored to be built on Span APIs - has been excluded?

The omission wasn't intentional. We've actively searched for and worked with JSON library authors as part of the CoreFxLab repo, and one of our devs started to benchmark our work against theirs. I believe the list was initially populated by using basic search terms like "json" on NuGet. It looks like your library simply didn't show up. I understand that this can be frustrating or concerning for you, but try to understand the situation from our end: our team cannot be expected to know every library under the sun. This announcement is part of our open development model to engage the entire community. The only reason we tend to reach out to smaller groups first is to make sure our plans & messaging have a reasonable level of thoughtfulness & quality before we share them with the world. Nothing is final yet. We're actively looking for additional feedback.

I'm curious about which "ecosystem" you're referring to and hoping to "work together" with here.

The .NET ecosystem and in particular the parties interested in JSON processing.

I'd obviously be interested in how compatible these "pluggable" APIs end up being or whether this ends up being another area where the adoption and integration of new MS libraries ends up killing the ecosystem that it's replacing.

The purpose of the planned ASP.NET Core extension points is to enable customers to replace the built-in JSON component with whatever JSON library they want. Of course, ASP.NET has always shipped with "batteries included", i.e. a reasonable default experience. In the past this has been Json.NET; moving forward it's a platform-provided component. Given that Json.NET was somewhat hard-wired into ASP.NET, the new plan seems net better for people like you, so I'm not entirely sure which part of our plan you think is a threat.

There is an open-ended question about what the templates for ASP.NET in 3.0 will use.

Isn't it time for the templates to be modular? Take vue.js, for example.


Creating a new Vue app allows you to pick the things you want. Why can't something similar be done for ASP.NET, instead of creating 500 templates to cater for all scenarios?

Here is a concrete example of a feature in ASP.NET Core 2.2 where non-JSON.NET JSON input/output formatters will face problems, and how a decoupled solution might help:
The ProblemDetails feature, which allows for an RFC 7807-compatible error response:
https://github.com/aspnet/Mvc/blob/release/2.2/src/Microsoft.AspNetCore.Mvc.Core/ProblemDetails.cs

[JsonProperty(NullValueHandling = NullValueHandling.Ignore, PropertyName = "instance")]
public string Instance { get; set; }

[JsonExtensionData]
public IDictionary<string, object> Extensions { get; } = new Dictionary<string, object>(StringComparer.Ordinal);

The above code is annotated with JSON.NET-specific attributes, including the specific fallback attribute [JsonExtensionData]: all unknown JSON properties are deserialized into this dictionary, and the contents of this dictionary are serialized into the normal JSON structure.

Any alternative JSON input/output formatter now needs to handle these JSON.NET-specific attributes to be able to properly serialize/deserialize this type, or find a different way, i.e. fall back to JSON.NET for these types.

Now, if we have a well-defined specification of the features an input/output formatter for JSON needs to support for 3.0, the above problem does not exist, as these attributes could live in a Microsoft.Extension... package and every custom JSON formatter could use them to implement this functionality compatibly.

To my knowledge, there are only a few instances of "official" source code in ASP.NET Core which are annotated with JSON.NET attributes, but I've also seen third-party libraries using JSON.NET-specific attributes (usually to specify the property name via [JsonProperty("name")]).

FWIW, that's what https://github.com/Tornhoof/SpanJson/issues/63 is about.

@terrajobst I think you answered before you read my previous comment, which IMO further clarifies my concerns.

We're actively looking for additional feedback.

Where? Is there an API proposal/document? Has an API been created? Which repo is it being developed under?

I think you answered before you read my previous comment, which IMO further clarifies my concerns.

I've read it, but it seems you're opposing having any default, which, as @davidfowl explained, isn't practical for us. My point was that our plan is an improvement over what we currently have, i.e. a de facto hard-wiring to Json.NET. Hence my question.

Where? Is there an API proposal/document? Has an API been created? Which repo is it being developed under?

We deliberately haven't done any coding/API design in .NET Core yet because we wanted to get this announcement out first. We didn't want people to read the tea leaves without providing the context that this announcement provided. In other words, stay tuned, we'll publish APIs and code soon.

@terrajobst My whole impression from the post is that:

  1. The decision to make changes has been made.

Moving forward, we plan on making some changes to our JSON support:

  2. Some preliminary design is done:
    We need high-performance JSON APIs.
    Remove dependency from ASP.NET Core to Json.NET.
    Provide an ASP.NET Core integration package for Json.NET.

All this means that the direction is taken. All that is asked of the "ecosystem" is to find obvious pain points which MS could not realistically account for.

Omitting ServiceStack and discussing it as some second-class .NET library is laughable. Even though I rely solely on MS-shipped libraries, that does not mean that I do not know about alternatives.

I have no problem with MS making decisions, but I'd prefer it be stated directly, and not covered with "community feedback" on already-made decisions.

That's my impression

@terrajobst

I've read it but it seems you're opposing having any default

Never suggested that; JSON.NET was already the default before this. I've explained in more detail above, but to reiterate: this is positioned to take over the default and become the new de facto standard, where effectively .NET Core will only have 2 JSON serializers in its future: this new de facto high-performance default, and JSON.NET for custom features. Other JSON serializers are going to become niche, e.g. unique features added to support different scenarios.

We deliberately haven't done any coding/API design in .NET Core yet because we wanted to get this announcement out first.

Ok, so it's not possible for any "outsider" to know yet how good the pluggability, extensibility or interoperability will be.

We didn't want people to read the tea leaves without providing the context that this announcement provided. In other words, stay tuned, we'll publish APIs and code soon.

So it was developed internally then? How long after you release it will outsiders have to test and propose changes to the API design? My primary concern is what the "pluggable" and "extensible" APIs will look like. Are we going to be able to "take over" and have complete control of the wire format of reference/value types? What about built-in types, enums, bools, other intrinsics, etc.? As an example, will it be possible to configure bool to accept other popular JSON values like "yes", "on", "1"?

Other questions:

  • Can this implementation be used stand-alone (in a separate NuGet package)?
  • Is the "plugable" part of the API tied to the Web Framework or can it be used elsewhere (e.g. Xamarin/UWP)
  • Will it support .NET Standard 2.0 or .NET v4.5?
  • If not, will the APIs be able to support .NET Standard 2.0 or .NET v4.5?

@mythz
It's not really developed internally (to my knowledge); the reader/writer part is done in corefxlab (https://github.com/dotnet/corefxlab/tree/master/src/System.Text.JsonLab/System/Text/Json), and there is specifically no high-level API for it yet in corefxlab.

Personally, I'd expect the extensible/pluggable API parts to be .NET Standard (i.e. attributes etc.). The library in corefxlab is .NET Standard 1.1 at the moment, but I imagine this will change depending on the performance goals of the library etc.

@mythz

You seem to be eager to take my statements and put them in a "the glass is half empty" context. I get it, you're skeptical. I suggest we save ourselves a bunch of keystrokes and discuss concrete API proposals as opposed to discussing them in the abstract. I'm fairly convinced that our plan will provide the extension points you need.

@terrajobst Not trying to be skeptical; trying to learn what the high-level proposed capabilities are, which I'm assuming have already been decided (or are they all still TBD?). This announcement is the first most of us have heard of it, so I was after some clarification on how pluggable, extensible and reusable it is intended to be. Is System.Text.JsonLab the current impl? Does this mean it will also support .NET Standard 2.0 and .NET v4.5?

This might be a nice feature for library creators, but you also have to consider the enterprise developers using 50 libraries with dependency trees and trying to find compatibility between them. Will there be binding-redirect-style mapping configurations to try to manage the mismatches?

This conversation seems on edge, whether because people have been offended in some way or because they are trying to defend an action that was taken or not taken. It's hard to read. Apologies all around! Please just let the current state be and move on.

It seems there are two reasons for this change. First is the desire for improved performance, utilizing new types within .NET Core.

Additionally, it was recognized that, some time ago, an architectural error was made in including hard references to JSON.Net, a library which resides externally to .NET. To fix this, a new, built-in JSON implementation must be introduced, along with interfaces that allow third parties to use their own implementations.

Will this break things? Yes! That's why the announcement has a "breaking changes" label on it. (Maybe that label should be replicated here.) And since this is known to be a breaking change, a discussion was started to explore the impact. Additionally, to minimize the impact, an additional library that allows people to continue using JSON.Net will be supplied.

As a library author I'm really interested in this, and I'd prefer that the conversation moved forward.


@Tornhoof in response to your examples: if I'd like to continue using JSON.Net, I would need to also reference the compatibility library I mentioned earlier. It should be mostly plug-and-play, but there may be changes. I definitely don't want the framework (.NET Core) to dictate that the serializer I choose MUST use these attributes for serialization, especially when my serializer uses a different mechanism for similar concepts.

The solution provided by .NET should be more generic than that. Specific model serialization handling should be performed by the chosen JSON implementation.

@mythz From everything I know and have seen from the people involved in making this proposal, you're going to get ample chance to review and comment on the proposed API and implementation before it is released as stable. One of the purposes of making this post at such an early stage was to find people like you for specifically this reason.

@gregsdennis
I'm not sure what you mean by more generic?
Assuming your serializer has the concept of overriding JSON property names, changing their null behaviour and/or fallback/catch-all implementations, and assuming that all three features are part of the shared specification for JSON serializers for .NET Core 3.0, then the implementation package maps those attributes to your internal implementation details.
For example, if your library prefers to use [DataMember] to specify the names of properties (like SpanJson does), your integration package should map that easily.
I'm not saying attributes are the right way; it just happens to be a visible part of the code example.

Obviously, in an ideal world no ASP.NET Core framework library would use any specific annotations to control serialization behaviour, but for the feature above that is quite complicated, if not impossible, because the RFC requires certain naming rules to be followed.

Either way, I think there will be many discussions on these shared features, how to use and describe them in the future.

Any plans to use SIMD instructions in the new JSON parser, like RapidJSON does?

Reference: http://rapidjson.org/

The proposed suggestion looks good, but please just try to smooth out "breaking changes". I'm just a general user of 3rd-party libs and recently had one of these experiences, when reflection was suddenly excluded from the UWP .NET Native release build process (compiler).

So none of my UWP apps could be built in release mode for months, as I had to rewrite all the code that used reflection in 3rd-party libs. I know many library authors had to split their implementations again to exclude reflection in those UWP parts. Most library authors did not come to the party, and I was forced to jump ship. Although MS came to the fore and committed to implementing an alternative in .NET Standard 2.1, we know the reality on the ground is that .NET Standard 2.1 will take months to deliver from the initial breaking change.

The point is simply that this was a hugely disruptive process for me, had massive consequences for end users, and was anything but "smooth" and frictionless.

This is definitely the right step to take.
I have been wondering about this Json.Net reference for a while.

@Tornhoof I think there needs to be a defined separation between the interfaces which each provider would need to implement in order to be used with .NET Core 3.0, and the default implementation that's built in.

The interfaces should be as general-use as possible. Perhaps as simplistic as defining only Serialize() and Deserialize() methods.

Other details should be left to the implementation. If the default implementation used an attribute to define a custom property keying mechanism, I'm fine with that. I think that's an implementation-specific detail that shouldn't be part of the interface.

That said, having the ability to custom-key properties could be a requirement, though I'm not sure how that could be codified.

@gregsdennis Yes, you're right; I was mainly looking at IInputFormatter/IOutputFormatter, which already exist in ASP.NET Core, and specifically the problems with replacing them with non-JSON.NET versions.
Anyway, as your and @mythz's comments show, I guess the scope definition will get interesting and probably not be that simple (I remember the problems with the DI interface specs). So it's better to get many different viewpoints involved early in the process.

@Tornhoof agreed. The current formatter interfaces are distinctly JSON.Net-based, not so much around the serializer itself as around the options object. It seems that we would need a generic way to communicate options as well (a common options object).

Does this imply that the options object dictates a minimum feature set for an implementation? I don't think it does. I recently implemented a formatter for my serializer that completely ignores the options object, but it was for my private use. If I wanted to make one for public use, I'd look at trying to interpret as many of those options as possible in order to support them.

No, that's not how we're doing things. The options are serializer-specific and there's no common interface for them. The formatters in MVC are already properly factored out and not coupled to anything. JsonResult will have breaking changes because it takes JsonSerializerSettings (the JSON.NET type) directly.

I was about to say the same thing. We don't plan on building an abstraction for a JSON reader/writer/serializer. It's not needed; frameworks either deal in IO primitives (such as Stream, TextReader) or plug into higher-level framework processing (such as the ASP.NET Core formatters).

Speaking of pain points: personally (and I am likely in a very small minority), I am concerned about the lax nature of many JSON parsers. There is a standard (tm), but most parsers choose to be lenient and accept non-conforming documents. What's bad about that in the long run is that developers don't implement towards a standard; they implement towards a library. If the library allows non-conforming documents, developers are happy as long as all parts use the same library. The pain arises when trying to communicate between domains that use different libraries. Suddenly there is a problem, because different libraries support different flavors of JSON.

Should we remove the pain points by making the JSON library as lenient as possible (but perhaps end up with complexity and ambiguity), or attack the root cause: non-conforming JSON libraries?

As influential as MS is, perhaps it is too much to ask for MS to successfully champion conforming JSON parsers in order to improve interoperability in the long run, but I wish it were different. Perhaps be lenient on read but strict on write?

(Things not in the standard: comments, trailing commas, single-quote strings, no-quote strings and so on.)

IMHO, since JSON stems from the web browser world, it seems for interoperability we should choose double as the underlying representation for numbers in JSON, even though the JSON standard says nothing about the representation.

Speaking about the API, I implicitly assume the most commonly used API will be a DOM-like API, but I would also find it very useful if there was a lower-level API that allowed me to consume the token stream, or get signaled via a visitor interface, for those large documents from which I just want to extract a small portion of data.

@mrange - As much as I like making things as strict as possible.... doing so relies on the ability to make changes in the non-conforming code.

If you're interacting with a legacy service under the control of some other company, the ability to change the offending code is near zero. Even strict-on-write, while more doable, isn't without its own problems here: what if the offending code expects to get a non-conforming object sent to it?

@terrajobst thank you! Func<Stream, CancellationToken, Task<T>> and Func<T, CancellationToken, Stream, Task> are all that is needed here. With maybe some overloads for TextReader/Writer, Span, and string.

The downside, however, is when you want to serialize/deserialize another library's type, and that type is decorated with attributes from a JSON serializer that you are not using.

@thefringeninja if you're already using the third party library for the objects, then you already have a reference to the other serializer. Nothing has changed there.

I'm not one for fear mongering, but I do think @mythz has some valid points.

@terrajobst Regarding the ecosystem: although it's impossible to account for every library out there, I don't think a quick search of "json" on NuGet was going to tell anyone much. Perhaps the name ServiceStack.Text isn't the most in-your-face way of saying "Hey! I'm a package that can (de)serialize JSON!", but there have been benchmarks over the years comparing it. Perhaps it's a case of dogfooding the MS defaults and either not knowing the breadth and popularity of the alternatives or not using them frequently enough internally to be familiar with them.

I agree that there ought to be some default to provide an experience that _just works_ out of the box. If other library authors in the ecosystem publish an integration package, it would be great if they could get a plug in the docs, release notes, etc. to emphasize that there are alternatives beyond the defaults. Making them hard to discover would be problematic for the ecosystem.

My hope is that if the goal of removing the dependency is sincere, the APIs should best represent the community's needs and not be modeled directly after Json.NET. The bottom line is it's going to require work from all of the library authors, not just ServiceStack, but the APIs shouldn't directly resemble Json.NET's API, otherwise you're back to what looks like a dependency but without the dll.

the APIs shouldn't directly resemble Json.NET's API

... or any other specific provider's implementation.

Part of the discussion that occurred prior to the announcement included the idea that the .Net team would be drawing from various libraries to get a feel for how various problems have been solved, and then to use what they considered would be the best fit of those ideas combined with their own. In many ways, it's not dissimilar to how any other new JSON library would be developed; it just happens that this one will be included within the framework.

We are all-in on having a high-performance JSON library that we don't have to build ourselves. :)

Before discussing anything, consider leveraging the results of Microsoft Research in that area (more specifically, the no-branching, no-FSM parsing): https://www.microsoft.com/en-us/research/publication/mison-fast-json-parser-data-analytics/

We are going in that direction for high-performance JSON parsing, besides Span<T> of course.
cc @terrajobst @karelz

:( all this discussion of JSON makes me feel my question on templates has gone amiss.

I wish I had more time for this discussion, because it has been hell, but I see why it has become what it is. 4.6.1 has to stay consistent through upgrades, and breaking changes will be for the rest.

I have seen a lot of recalled packages for Core and 4.6.1, and I hope this type of change will fix that.

I am worried the cascading issue will haunt us; please prove us wrong.

//dot net devs everywhere

@c0shea

[…] it would be great if they could get a plug in the docs, release notes, etc. to emphasize that there are alternatives beyond the defaults.

I’m sure that will be the case. The docs already list alternative implementations for multiple topics, like DI containers, logging providers, Swagger integration, or EF Core database providers.

@phillip-haydon
The template system already supports customizable templates, and the existing templates already have a number of options. Just check out e.g. dotnet new mvc --help for what’s currently possible. I’m sure you could easily extend that with e.g. alternative JSON serializer integrations, and feature requests or pull requests for that will likely be accepted at aspnet/Templates.

@mrange - As much as I like making things as strict as possible.... doing so relies on the ability to make changes in the non-conforming code.

If you're interacting with a legacy service under the control of some other company, the ability to change the offending code is near zero. Even strict-on-write, while more doable, isn't without its own problems here: what if the offending code expects to get a non-conforming object sent to it?

Maybe have a strict mode by default, and be able to explicitly switch to a more lenient mode.

@poke I guess you really need to go try the Vue CLI and then retry dotnet new. The current dotnet new... is... it’s...

Can I please request that you remember that the various browsers' parsing of JSON doesn't support int64 values properly, and give us the ability to serialize/deserialize longs as strings? It's one of those things you don't notice until it bites you hard. A variation on this is being able to decide whether numbers are deserialized into ints or longs by default.

Can I please request that you remember that the various browsers' parsing of JSON doesn't support int64 values properly, and give us the ability to serialize/deserialize longs as strings? It's one of those things you don't notice until it bites you hard. A variation on this is being able to decide whether numbers are deserialized into ints or longs by default.

....Ah, EcmaScript, thanks for making everything doubles. I'm sure that hasn't caused problems in some other part of code...

As a purist, I dislike stuff like this.
As a realist, I think I'm going to have to agree with this.

Will any portion of the new JSON implementation be in .NET Standard 2.1?

I've been following this a bit, and I'm not sure if I've missed it. Is there a proposed API surface or interface that I can review? I'm interested in seeing the API surface area for this proposal.

Is there a proposed API surface or interface that I can review?

There's still a lot to do, but https://github.com/dotnet/corefx/pull/33216 is a beginning. Here are the API review notes.

UPDATE: Roadmap also available here.

So how feature-complete will the ASP.NET 3.0 APIs be compared to the Json.NET API? Also, will Json.NET, for all intents and purposes, be phased out in the development of new apps by the native APIs?

Does this just improve performance, or does it mean replacing all of Json.NET's functionality?

This is amazing news.

I strongly recommend trying to collaborate with @neuecc; his work on MessagePack, Utf8Json, ZeroFormatter etc. has been phenomenal.

@linkanyway the outcome would be to replace a subset of json.net, while providing new no-allocation APIs.

I suspect the majority of ASP.NET users have no strong dependency on the Json.NET implementation and would be able to migrate nearly seamlessly.

Is there a proposed API surface or interface that I can review?

There's still a lot to do, but dotnet/corefx#33216 is a beginning. Here are the API review notes.

UPDATE: Roadmap also available here.

@khellang Are we going to get any language help so that we can write JSON in C# for real?

One more good movement in OSS direction. Will motivate developers of other Commercial platforms to develop something better than MS already does for free, or become fully OSS if they really care about the community.

Are we going to get any lang help so that we can write json in c# for real?

@dotnetchris I'm not sure what you mean by "lang help"? JSON literals? I doubt that'll happen. Have you looked at the upcoming "embedded language" feature? Specifically for JSON?

@khellang that's certainly a step in the right direction, but i mean full blow support. Using the same sample objects from your link, more akin to:

json v1 = {
    first: 0,
    second: ["s1", "s2"]
};

var andCsharp = v1.second.Where(item => item.EndsWith("1"));

With enough voodoo, such as implicit tuple/valuetuple/record object generation to make that all work at the lang level behind the scenes.

Conversely the compiler could invoke the json services, create a class and then work with an instance of the class as if you wrote:

var v1 = "{ first: 0, second: ['s1', 's2' ] }".Deserialize<MyV1>();

Edit: LOL @ downvoters.

@dotnetchris If you're interested, please vote on https://github.com/dotnet/roslyn/pull/24110. It has been closed by IDE team because of lack of interest. But if there are enough votes, perhaps that could change.

Do "built-in" and "platform provided" JSON APIs mean that it won’t be a separate (netstandard) NuGet package? If not, why not?

I suppose the new ASP.NET Core shared framework cannot depend on NuGet packages, or can it?

Isn't there already a System.Json namespace? Is this the namespace that is going to be used/expanded for .NET Core 3.0 and ultimately .NET Standard? That namespace is already used (maybe not that much, but it's available 😅) on Xamarin.Android 7.1+, Xamarin.iOS 10.8+ and Xamarin.Mac 3.0+.

Removing it would break backwards compatibility. Leaving it and starting a new API might cause some confusion. Expanding/Enhancing it without removing any APIs might be a little restrictive when adding new APIs.

It would be great to have interfaces for JsonReader/JsonWriter, because there are other sources and targets than just streams. For example I also use JSON.NET for MongoDB. It is faster and I do not have to maintain multiple serializers in my applications.

I know it could hurt the performance a little bit, but it is very useful.

@SebastianStehle: JsonReader/JsonWriter are high-level abstractions, see terrajobst's comment:

The plan for 3.0 is as follows:

  1. Built-in high-performance JSON APIs. Low-level reader/writer, a Stream-based reader/writer, and a serializer.
  2. ASP.NET Core is pluggable w.r.t. the JSON component.

There is an open-ended question of what the templates for ASP.NET in 3.0 will use. Depending on the fidelity we can provide by 3.0, we might have them pull in the Json.NET integration package. However, the goal is to deliver enough fidelity & parity to only depend on the built-in ones by default.

The high-level, easy-to-use API would be wrapped around streams, to easily serialize to and deserialize from them (these are the most common scenarios when serializing). If you want the best performance in scenarios where you can't use streams, you should use the low-level APIs that operate on Span<T> or Memory<T>; especially when you already have the data at hand/in memory, you want to use these and avoid the overhead of async.

@TsengSR

https://github.com/neuecc/Utf8Json does not provide the functionality to write custom readers/writers, for performance reasons (virtual calls and allocations, I guess), and I thought they would like to go down the same path. But so far I have not seen any serialization code.

I agree with @JonasZ95 and @gregsdennis, I hope the implementation will not be a simple abstraction of the same JSON.Net implementation details but will instead focus on what it _should_ look like.

I also think it should be approached as 2 separate functions...

  1. serialization / deserialization
  2. json version of said serialization and deserialization.

Hopefully the ASP.NET Core framework will use a generic serialization abstraction instead of a JSON specific abstraction.

As far as extensibility goes, I hope the framework will use the DI technique of coding to abstractions (interfaces, not abstract classes) and simply provide a local default. From the perspective of JSON library authors this would provide the greatest extensibility, because all you would have to do is provide an ASP.NET Core adapter class that uses the library's implementation of the interfaces and then configure ASP.NET Core to use the library adapter.

Implementation for a 3rd party library could look something like this:
```C#
// reference 3rd party libraries
using Newtonsoft.Json;

// very naive example for brevity just to
// make the point
public class NewtonsoftAdapter : ISerializer
{
    private JsonSerializerSettings _settings;
    private Formatting _format;

    public NewtonsoftAdapter(JsonSerializerSettings Configuration, Formatting FormatOption)
    {
        _settings = Configuration;
        _format = FormatOption;
    }

    // interface method
    public string Serialize<T>(T Subject)
    {
        return JsonConvert.SerializeObject(Subject, _format, _settings);
    }

    // interface method
    public T Deserialize<T>(string SerializedContent)
    {
        return JsonConvert.DeserializeObject<T>(SerializedContent, _settings);
    }
}

// ...

// setup adapter with 3rd party customization options
var settings = new JsonSerializerSettings
{
    MissingMemberHandling = Newtonsoft.Json.MissingMemberHandling.Ignore
};
var adapter = new NewtonsoftAdapter(settings, Formatting.Indented);

// configure asp.net core
// where adapter implements ISerializer (or whatever name you come up with)
// library authors could even provide their own UseXYZ() extension method.
app.UseSerializer(adapter);
```

There's been a lot of advancement in SIMD-based text parsing, including for structured text, like JSON. Is there any chance that the work in .NET Core will approach this performance level? Is there any way to use these techniques?

Hey, even Microsoft Research has some new high-perf solutions!

@AnthonyMastrean Thanks for bringing this up. I am also interested in any benchmarks to get an idea of how this current Json impl compares to simdjson.

By the way, the authors of simdjson said that they are still working on fully publishing the project (more documentation, I think). For now, you can read some interesting discussion of it on HN.

Is there any way to use these techniques?

.NET Core 3.0 just happens to ship a bunch of platform dependent intrinsics, so it's definitely doable 😄

For what it's worth, I think that in a web context the network is the primary bottleneck, and while it is cool that you can parse gigabytes within seconds, I don't think that is very useful for a web project.

Don't get me wrong, performance optimizations like these are super cool and would probably benefit even very small JSON documents. But for now, I think the focus with the new library is to avoid memory allocation and move everything very closely to the network layer. That alone should already improve the performance a lot over JSON.NET. But when the initial work is done, looking at additional optimizations might be another story.

Where I work we parse terabytes of JSON every day. I know of others as well that use .NET and F# to process a lot of JSON docs. JSON has become more than just a server => browser transport mechanism. It is used a lot in pure backend scenarios.

OFC it would be better for the backend to switch to a binary format like Avro/Protobuf, but often that is difficult, and JSON does have some benefits (I grudgingly admit). Having a really fast JSON parser could literally save $10,000 a month for companies similar to us.

@poke This project falls under .NET Core (not ASP.NET...), so it is relevant to all workloads, not just the web.

I can agree that it is too late to work on this specific optimization technique for .NET Core 3.0, but I would hope that some investigation is done now to ensure that the optimization will be possible in the future (i.e. without breaking changes).

Maybe it would be better to make something like a unified mapping assembly ('System.Text.Json.Mapping') that defines attributes and other stuff for mapping JSON to C# classes? After implementing this, all existing JSON parsers/writers could be adapted to use the unified mapping. It would give all .NET Standard applications the ability to migrate between different JSON libraries without any pain.

@AlexeiScherbakov New abstractions don't actually help that much. You will just limit yourself again by choosing a common abstraction, and there will always be new libraries that cannot use the abstraction and require more. It has always been that way, e.g. with logging.

I don't think creating a new abstraction based off this new implementation will give us any benefit, especially when the library is designed to be a lot less feature-rich anyway.

And there's actually already an existing netstandard abstraction in the form of DataContract/DataMember which I hope this library will end up respecting (even if that abstraction is somewhat limited).

Other than an ignore attribute, we cannot have a billion new attributes and scenarios to cater for. I would rather have a 1:1 mapping of JSON to classes; if you want to do anything out of the norm or support legacy formats, use Json.NET.

Personally I don't care so much about JSON <=> C# classes. I think it is important to have the concept of parsing/writing JSON separated from creating a C# object model from the JSON model.
That way I (who don't care so much about JSON <=> C# classes) can have a really efficient parser without round-tripping through some kind of object model.

@mrange that’s what the reader and writer are for

Does that mean I can expect a Reader/Writer API in the platform provided JSON API? Is the Reader/Writer pattern the most efficient one?

There are 3 types in System.Text.Json as of now:

  • Utf8JsonReader - a fast, non-cached, forward-only way to read UTF-8 encoded JSON text.
  • Utf8JsonWriter - ^ same as Utf8JsonReader, but for writing.
  • JsonDocument - a read-only random-access document model for JSON payloads.

All of the types above should be (more or less) allocation free 👍

We posted the simdjson paper : https://arxiv.org/abs/1902.08318

There is also ongoing work on a C# port of simdjson: https://github.com/EgorBo/SimdJsonSharp

cc @EgorBo

Maybe it would be better to make something like a unified mapping assembly ('System.Text.Json.Mapping') that defines attributes and other stuff for mapping JSON to C# classes? After implementing this, all existing JSON parsers/writers could be adapted to use the unified mapping. It would give all .NET Standard applications the ability to migrate between different JSON libraries without any pain.

I really hope the new abstraction does not rely on attributes. I try to use clean POCO objects in the underlying libraries and use DI to avoid them having to know about the implementation. I definitely don't want to be decorating my underlying classes with attributes required by the implementation. This could end up causing extra classes to be created in the UI layers which essentially just map an existing domain object to JSON.

Probably a 1:1 mapping of JSON to C# classes would be a better approach; at least in some cases you could avoid creating new classes, even if viewmodel-type classes will still be necessary in other cases.

It would be nice, though, if there were some sort of way to ignore properties that are not needed and, at the very least, control some of the serialization aspects (such as camel vs Pascal casing).

@mrange that’s what the reader and writer are for

@davidfowl Does that mean the new APIs are going this route? Has the design been finalized?

The serialization support is landing as we speak. The related issue says:

  • Due to time constraints, and to gather feedback, the feature set is intended to be a minimum viable product for 3.0.
  • Simple POCO object scenarios are targeted. These are typically used for DTO scenarios.
  • The API designed to be extensible for new features in subsequent releases and by the community.
  • Design-time attributes for defining the various options, but still support modifications at run-time.
  • High performance using IL Emit with fallback to standard reflection for compat.

It also details how they're planning to support enum conversion, null handling, camel- vs PascalCasing etc.

If you have feedback on any of this, you should leave your comments in that issue or the pull request.

@lemire Wow, that's really cool. simdjson is indeed super fast.

Any chance that you can implement the JSON serialization benchmark of TechEmpower? (I know it will be a lot more work)

I found these implementations: In the TechEmpower repo and in the ASP.NET repo.

@KPixel This is serialization, right? Meanwhile simdjson is a parser... Unless I am confused about the terms, these things go in opposite directions?

My bad. I assumed there was a deserialisation part (which would use the parser).

Will System.Text.Json be a .NET Standard NuGet package, or is it something that's only available for .NET Core 3.0?

I think usability should be a focus of the new JSON package as well. One feature I think it should have is JSON schema validation support. Newtonsoft charges for that. This is something that is fundamental enough that it should be provided in the platform for free, as was the case for XML schema validation.

@jemiller0 My impression is that XML validation has been somewhat of a mixed bag and JSON schemas have had so-so adoption in the real world. You can always add a schema check through a library as an extra step... Sure, this may involve acquiring a software license, but is it a big deal?

@lemire Yes, it is a big deal if you are developing open source software and want to make your software available to everyone. XML parsing is not a mixed bag. It works. Same thing with JSON schema validation. Not having a built-in, free way to do this makes the .NET platform uncompetitive.

I’ve never seen JSON Schema used in the real world. Even so, it should not be part of the implementation discussed here. And none of the billion other features and quirks in Json.NET should be implemented here either. This should be nothing more than a super lightweight, fast implementation. If you’re unhappy that you need a license for Json.NET to support JSON validation, create your own open source implementation and make it freely available.

@jemiller0

I am genuinely curious: do other programming languages offer JSON schema support in their standard library?

The goal of this library is to be a high-performance, low-level library for working with JSON. Anything that isn't that, which includes most of the more advanced features from JSON.NET and also JSON schemas, won't be a part of this.

If you want JSON schema validation, you are free to implement a validator on top of this library, which should be low-level enough to allow you to do that.

I do not believe that having JSON schema validation in a standard library has anything to do about a platform being competitive or not.

The goal of this library is to be a high-performance, low-level library for working with JSON. Anything that isn't that, which includes most of the more advanced features from JSON.NET, won't be a part of this.

Except it'll also include higher-level features, designed to be a drop-in replacement for Newtonsoft.Json 😊

@poke You are entitled to have whatever opinion it is that you want to have, just like I am. XML is used all over the place. Hence, rightly, Microsoft included validation support with the .NET Framework. Now, JSON is all the rage and is used EVERYWHERE, in config files, web APIs, etc. etc. It makes sense to have the same level of support for JSON as was historically supported for XML. Personally, I find it a little ridiculous that Microsoft is making use of third party software for this to begin with. That should be a core feature of the platform.

@lemire I will soon be checking Python. At which point I will either find out that it is built-in, or included as an easily installable package. I would be highly surprised if it wasn't. I'm looking at using NJsonSchema for what I need right now. Right off the top, you can see the documentation sucks. Yeah, relying on third party libraries that may or may not be quality is a great idea for things like working with JSON which is totally ubiquitous and used everywhere. Let's all rely on either commercial software, or third party packages with no documentation. .NET shouldn't just match other platforms like Python. They should beat them by providing quality software with proper documentation out of the box. This has been historically how it has worked. Where I work, everyone hates .NET and wants to force me to use Python. I don't know if you have noticed, but, more and more people are abandoning ship and switching to Python. Telling my boss I need to buy a license to do something simple like validating JSON won't fly. What I will get in return is questioned as to why I would want to use .NET in the first place. Never mind the fact, that this is for an open source project where you can't get a license anyway. Hopefully, I will find that NJsonSchema just works. I am just pointing out a glaring omission in the framework that should be obvious to anyone who works with JSON.

I use JSON Schemas at work, and I would rather have a really good JSON parser than a half-good JSON parser + JSON schema validator. Also, AFAIK JSON Schema is a _draft_ (draft 7 right now), so having JSON Schema as an external library that can evolve quickly together with the schema spec makes sense to me. Having JSON Schema on the roadmap would be nice, though.

@jemiller0

It makes sense to have the same level of support for JSON as was historically supported for XML.

.NET also includes support for XSLT and XPath. If you want the "same level of support", doesn't that mean you would also need some version of those for JSON?

What I'm trying to say is: the JSON ecosystem is different from the XML ecosystem. Both have different usage patterns, and the related technologies have different usage numbers and standardization levels. Also, XML was added to .NET before NuGet, git or GitHub existed. Nowadays, it is much easier and much more acceptable to rely on a 3rd-party library.

So, no, I don't think it makes sense to blindly say "XML had this, so JSON has to have it too".

Also, validation is simply orthogonal to parsing. I would absolutely be fine with validation support being added at some point (possibly in another package altogether). But it is not at all necessary for proper parsing support.

We need a way to strictly validate the data in REST API requests, because we save the JSON that comes through the API without errors, and later we cannot parse it in JS because of trailing commas and so on.

And why can’t you validate that request now?

@phillip-haydon @freerider7777 I think that any decent JSON parser should abide by the JSON specification and throw errors (and/or warnings) when the document is not well-formed (e.g., when it has trailing commas). That's pretty basic, but also distinct from validation, which is the comparison of the JSON data with a schema (at least, that's how I use the terms).

https://tools.ietf.org/html/rfc7159

Microsoft, one of the biggest software development companies there is, doesn't have anyone to work on validation. A fast parser is more important. It will let you parse the invalid JSON at the speed of light. :-) It hasn't occurred to anyone that fast validation could be useful. This is just like ASP.NET Core, a fast, dumbed-down upgrade to Web Forms.

And why can’t you validate that request now?
@phillip-haydon In controller code with such json:
ModelState.IsValid == true

🤦‍♂️

You can already do validation based on your JSON schema with NSwag + System.ComponentModel.DataAnnotations:

<Project Sdk="Microsoft.NET.Sdk" >
  <ItemGroup>
    <PackageReference Include="NSwag.MsBuild" Version="12.0.10" />
  </ItemGroup>
  <ItemGroup>
    <Compile Remove="**\*.g.cs" />
  </ItemGroup>
  <ItemGroup>
    <SchemaFiles Include="$(MSBuildProjectDirectory)\..\schema\*.json" InProject="false" />
    <EmbeddedResource Include="$(MSBuildProjectDirectory)\..\schema\*.json" LinkBase="Messages\Schema" />
  </ItemGroup>
  <Target Name="GenerateMessageContracts" BeforeTargets="GenerateAssemblyInfo">
    <Exec Command="$(NSwagExe_Core21) jsonschema2csclient /name:%(SchemaFiles.Filename) /namespace:MyNamespace.Messages /input:%(SchemaFiles.FullPath) /output:$(MSBuildProjectDirectory)/Messages/%(SchemaFiles.Filename).g.cs" />
    <ItemGroup>
      <Compile Include="**\*.g.cs" />
    </ItemGroup>
  </Target>
</Project>

I agree with @lemire, there is a difference between validating the structure of JSON and validating the JSON against a schema. I have no doubt Microsoft has spent time and effort to correctly implement the former... the latter is a corner case, and I don't think it fits into the general design of this JSON library. I'm pretty sure they've made it clear that this JSON library is designed ONLY to provide the fast, efficient parsing necessary for ASP.NET Core to operate. It was not designed to include the _extras_ that came with Newtonsoft's parser (refer to @poke's comment earlier in the thread).

I don't think the fact that they intentionally did not design this to come with all the bells and whistles makes it an inferior product.

Did this happen to ship with preview 4?

Are there plans to make the System.Text.Json.Serialization.Policies.JsonValueConverter class public to allow replacing converter classes from Json.NET?

Will System.Text.Json ship with full .NET Framework support via NuGet? It would sure be nice to ensure full interop as well as leverage the benefits from the full framework.

System.Text.Json was recently changed to produce netstandard2.0 binaries for shipping OOB; https://github.com/dotnet/corefx/pull/37129:

If possible, you should target .NET Core 3.0 and get the in-box System.Text.Json APIs. However, if you need to support netstandard2.0 (for example, if you are a library developer), you can use our NuGet package which is netstandard2.0 compatible.

benefits from the full framework

What are these again? 🤔

@khellang

My preference would be a NuGet package with multiple flavors, including full framework (4.5 or whatever the acceptable minimum is), standard, and core. Using in-box assemblies is preferable.

The linked issue above refers to this package, but it is not supported:

https://www.nuget.org/packages/System.Text.Json

Is there a current supported package?

My preference would be a NuGet package with multiple flavors, including full framework (4.5 or whatever the acceptable minimum is), standard, and core.

Why would you prefer that? If there's no need to multi-target, i.e. all used APIs are part of the standard, it's much better to just have a single target 😊

Is there a current supported package?

I don't think it's shipped yet. The PR was merged days ago. That package used to be a community project that has now been transferred over to MS.

@khellang it depends on the specifics -- I was making a general statement.

If the .NET Standard version had to omit anything from the .NET Core version that would be possible in the full .NET Framework version, I would prefer all three flavors to be available. Same general reasoning, I suspect, as the original statement in the linked issue above about preferring users to target the in-box version.

When adding a reference to the NuGet package, the most compatible flavor would automatically be selected, so it isn't a big deal. If the consuming library is .NET Standard, then the .NET Standard flavor would be selected.

My general preference for in-box flavors is that when I use Go To Definition, I get decompiled sources. If I use Go To Definition on .NET Standard libraries, I typically only see stub sources that throw NotImplementedException.

benefits from the full framework

What are these again? 🤔

Many applications use .NET Framework not because they absolutely want to stay off .NET Core but because .NET Core is not an option. I use .NET Core when it is an option; when I have to target Windows versions lower than Windows Server 2012 (the minimum .NET Core 3.0 version), I have to use .NET Framework. Much as I'm sure it was a very painful decision to drop support for Windows Server 2008 R2 and below, it's a very painful decision for every company with paying customers with servers that they don't want to upgrade/often basically recreate from scratch just so we can use a slightly newer tool.

No one would be happier than me if I could stop using .NET Framework tomorrow, but even with all the WPF and Windows Forms opportunities in .NET Core 3.0 and beyond, Microsoft is making that a practical impossibility with its support policies. I have tried to discuss this with anyone who would listen at Microsoft, but sadly, I have yet to receive as much as an email acknowledging that the message was delivered.

@JesperTreetop not to mention lack of LTS support for the versions of .NET Core worth using for an enterprise ;) I am hoping we'll get LTS support on 3.x -- as my org's .NET architect, I will be pushing for .NET Core adoption iff we get a 3.x version with LTS support.

@marksmeltzer The blog post Introducing .NET 5 from yesterday shows that .NET Core 3.1 will be LTS and is planned for release in November 2019.

Will this new JSON serializer support F# types?

@rliman well, currently it doesn't support Guid or Enum, so it has a long way to go. I do agree that full support for F# option types, similar to C# nullables, should be required IMHO.

I personally think this is a rushed solution to a bad architectural design decision. This should have been done long ago... Now it is going to cause a lot of pain everywhere, from library developers to enterprise developers.

There is no easy way to "smooth out" this new "feature".

This is MS trying to solve a problem they caused in the first place. And now everyone has to pay the price.

With .NET Core it has seemed from the beginning that the wagon is a little too "speedy"... This pure "agile" approach might need to slow down a bit and let everyone catch their breath.

It seems like with ASP.NET Core these "features" (breaking changes) have become the new normal.

In my opinion ASP.NET Core is in desperate need of a rework of its architectural design process, because time and again it keeps producing these "we will fix it later" features.

I have been developing with ASP.NET Core since it was in the early betas... And it is a great improvement on .NET.

But the MS team should stop for a moment and think about how they can address the real problem here: rushed and inconsistent architectural design decisions.

Just go back and read other threads... It seems to be a recurring theme.

So maybe it is time to sit down and re-think what the best approach is to make a more stable product.

Classic .NET might not be as mighty as Core... But it has been very stable and consistent since 2.0.

Just my opinion.

Hi @suncodefactory,
I remember some time ago when people yelled at MS for not using open source libraries; now they are blamed for doing so :D
From my point of view the ASP.NET/Core MVC APIs have been very stable since MVC 1/2! The reason why ASP.NET was stable since 2.0 is that it never changed/improved at all 😄.
To be honest, if you are using advanced features of a serialization library, you have the chance to rethink it and maybe approach the problem with a data structure suitable for the task, instead of pretending that all serializers support all the language features; imo it's the wrong problem to solve, and the wrong way to use serialization.
Clarity, backward compatibility and future extensions are what drive my serializable DTOs; very different trade-offs from those in common business logic objects (those are private, have a lot of functions, etc.).

We were able to move microservices from .NET Framework to Linux (.NET Core) with almost no effort from product teams. I don't know what you guys are talking about. Microsoft is doing a great job speeding up the implementation of changes like this that have been long overdue.

Hi @suncodefactory,
I remember some time ago when people yelled at MS for not using open source libraries; now they are blamed for doing so :D

For me the point is not about third party libraries... It is about architectural design, which in this particular case is lacking or just plain wrong.

Also, I never talked about classic ASP.NET... I was talking about .NET Framework 2.0. And the reason it was stable was not because there were no improvements, as you falsely claim (since .NET Core is based on .NET 4.6.1). The reason was that it was well planned and architected.

As for how good ASP.NET Core is vs classic ASP.NET MVC, that has nothing to do with this particular thread.

This thread is about a breaking change that MS is about to ship, once again without thinking it through.

We were able to move microservices from .NET Framework to Linux (.NET Core) with almost no effort from product teams. I don't know what you guys are talking about. Microsoft is doing a great job speeding up the implementation of changes like this that have been long overdue.

Changes like this should not be happening at all..... So you are happy with breaking changes?

And saying that the asp.net core team has been doing a great job at shipping changes is simply not true.

I have been developing with asp.net core since beta 3 and I am pretty sure the architectural design process is lacking.

As for how good asp.net core is vs classic... I have no objections as I as well believe is better than classic.

But just because ASP.NET Core is better than classic does not mean they are doing a great job at architectural design. Those are two completely different topics.

Can we limit this discussion to the JSON functionality in .NET Core 3 please?

Changes like this should not be happening at all..... So you are happy with breaking changes?

So no improvements should be done? Why are you even a programmer if you don't want software to evolve and grow and get better?

@suncodefactory

Changes like this should not be happening at all..... So you are happy with breaking changes?

Ah, c'mon, you make it sound like "breaking change" means you have to scrap your project and start from scratch.

How many breaking changes can you count that were there in ASP.NET Core 2.x/3.0 which required more than

  • Referencing a different package
  • Using a different namespace
  • Changing more than 5 lines of code
  • Removing 1-2 lines of code (i.e. Properties from Options classes)

??

@suncodefactory

This thread is about a breaking change that MS is about to ship once again without thinking it thoroughly.

How is this actually a _breaking_ change? The new JSON APIs are a completely new set of APIs that are introduced in .NET Core which neither remove nor break any existing stuff. Yes, you will see things and libraries switch to it eventually as it offers different optimization opportunities, but you are not forced to apply that to your code.

Speaking about ASP.NET Core in particular although _“that has nothing to do with this particular thread”_, you have the choice there to keep using Newtonsoft.Json if you depend on some of its more advanced features. Yes, you will have to change some code for that to make it work, but I don’t consider that really breaking considering that you only have to do that if you actually want to upgrade to the new version. That’s the nice thing now: You have more choices.

If you dislike this for whatever reason, then feel free to stick to the .NET Framework, which has a known, stable and fixed feature set. That will stay there for quite a while, so you can totally depend on it. But please stop using this thread to spread your anti-new-stuff agenda when _“that has nothing to do with this particular thread”_.

Two questions from an EF Core user.

  1. Will System.Text.Json support circular references? Circular references can occur in EF Core data where there are navigational links going both ways between classes. Json.NET handles this with settings like:
```C#
var json = JsonConvert.SerializeObject(entities, new JsonSerializerSettings()
{
    PreserveReferencesHandling = PreserveReferencesHandling.Objects,
    ReferenceLoopHandling = ReferenceLoopHandling.Ignore
});
```
  2. With the rise of DDD-styled classes with private setters and private constructors can System.Text.Json deserialize these types of classes?

@JonPSmith IMO it shouldn't matter. You should never be serializing an entity directly. You should serialize a projection. This avoids circular references, and doesn't expose all data, especially when you add more properties to the entity that may end up being sensitive.

@JonPSmith: Imho both use cases are invalid from both best practice and DDD point of view.

  1. I have never seen a best practice which recommends serializing entities directly (except in the most simple tutorial examples). Circular references always come at a cost: they require tracking of already-processed objects, which means memory allocations and additional CPU cycles. But one of the main goals of the new JSON library is exactly to avoid these memory allocations.
  2. Invalid too, since you never deserialize into a domain model, especially not when you get the data via a web request such as a WebApi call. In DDD you should always work with events/commands: send the command to your web application, get (and rehydrate) the entity from the repository (via ORM mapping or EventSourcing), apply the command, persist it.

On top of that, the new JSON API is for high-performance scenarios. For everything else, where you need a rich feature set, you can (and should) still use JSON.NET or whatever fulfills your needs.

@suncodefactory This is the opposite of a breaking change. Right now, in ASP.NET Core 2.2, JSON.NET is used by the framework as well as by user code. This has the potential to cause conflicts with your own use of Newtonsoft.Json; if ASP.NET Core 3.0 moved to JSON.NET 12.x and there was some kind of issue in there that broke your application, you'd have a problem.

For example, look at Microsoft.Extensions.Configuration.Json 2.2.0 - it has a dependency on Newtonsoft.Json 11.0.2. That's a configuration package; nothing to do with HTTP request handling or ASP.NET Core MVC. Or look at Microsoft.IdentityModel.Protocols.OpenIdConnect, which uses it for handling JSON Web Tokens; that's a hot path which needs as much performance as possible. JSON.NET is not a slow library by any standards, but it strikes a balance between performance, feature-richness and support for a massive range of user scenarios. Microsoft's new JSON library doesn't need to do that, because JSON.NET exists. So it can focus on handling the absolute basics with maximum performance.

.NET has always had its own JSON serialization solution in System.Runtime.Serialization.Json, but in the high-performance world of .NET Core it's not a very good one. I certainly wouldn't want it being invoked to check credentials on every incoming request. A new JSON library, with modern UTF-8 data handling and minimal allocations, is very welcome.

You will still be able to reference Newtonsoft.Json in your application, and continue to use it as the deserialization/serialization pipeline for request/response data as before. And from now on, you'll be able to do so without worrying about which version the Core framework depends on. That's a win for everybody.

Thanks @phillip-haydon and @TsengSR for your thoughts. I was asking if these features would be supported, and you say they aren't, which I understand and accept. I will continue to use Json.NET for the cases where I need to serialize/deserialize EF Core classes.

BTW, I do have a valid reason for serializing/deserializing DDD-styled EF Core entity classes. I have a library that contains a feature I call Seed from Production, which allows developers to take a snapshot of data from a production database, anonymise any private data, and then seed a new database with the snapshot.

I needed this feature for one of my clients and, instead of writing it just for them, I built it into my open-source library EfCore.TestSupport so others can use it (and my client didn't have to pay me for it).

Is there a plan to support [DataContract], [DataMember] and friends?

Today this is a way to define how types should serialize/deserialize (e.g. the field name) in a way that doesn't bring a dependency to any serialization library to the project using it.

The current JsonNamingPolicy only takes a string, so there's no way to inspect the member's attributes.

Hi.
We just tried to switch our microservices to .NET Core 3 preview 6 and we are unable to deserialize our immutable reference types: classes with immutable properties (no setters) and only one constructor that sets all the properties. Json.NET correctly handles these classes.
Is this an issue with the new System.Text.Json API, or is there a plan to support it?
Thanks for your responses

Thanks @khellang.
Support is indeed planned, but not for the 3.0 release.
It seems to be possible to continue using Json.NET with .NET Core 3, but I don't know how to do it (adding a package reference is not enough). Is there a way to do that?

@agjini:

```C#
services.AddControllers()
    .AddNewtonsoftJson();
```

Thanks for your help guys.
It works!
I'd missed the migration guide where it's all explained:

https://docs.microsoft.com/fr-fr/aspnet/core/migration/22-to-30?view=aspnetcore-2.2&tabs=visual-studio

IMO System.Text.Json is half-baked, and making it the default (e.g. for SignalR) in a way that breaks existing code was premature.

On the other hand, migrating from .NET Core 2.2 to 3.0 is a major version upgrade, and even if the .NET Core team isn't strictly following semantic versioning, I would expect things to break when upgrading from one major version to another without explicit changes (like explicitly adding Newtonsoft's library to the pipeline).

Closing given this is an announcement and not an issue

Although there are a lot of voices in the community against this improvement, for a new high-performance framework, bad speed is unacceptable.

I know it has been said before, but I'd like to add my wish as well.

It would be really awesome if we could have immutable objects. I know it's possible by adding Json.NET to the MVC pipeline, but in my case my tests are all failing, since I am using ReadAsAsync<>, which is now implemented somewhere in a peer dependency of Microsoft.AspNet.WebApi.Client and relies on System.Text.Json.

We provide a .NET Standard class library to customers so that they can use our library on any platform that supports .NET Standard. We need to use System.Text.Json in our class library. What will be the plan to support System.Text.Json in .NET Standard?

@alsami

It would be really awesome if we could have immutable objects.

Do you only need the ability to prevent others from mutating it or do you also need the ability to create new instances with parts being smartly replaced (like immutable collections and Roslyn)? If you need the former, we've got you covered with the upcoming JsonDocument DOM APIs.

@mwoo-o

What will be the plan to support System.Text.Json in .NET Standard?

It's available as a NuGet package for .NET Standard 2.0: System.Text.Json.

@terrajobst

Thanks. When will System.Text.Json be included in the .NET Standard SDK?
Will .NET Standard 3.0 (or some later release version) include the System.Text.Json package? Will it happen in the .NET Core 3.0 SDK production release?

@terrajobst

Are there any plans to make the Deserialize method work with PipeReader? Or to add a Patch method that can be used in streaming scenarios where we don't have all the data when we start deserialization?

Here is a simplified version of the proposed API:

private async ValueTask<T> Deserialize<T>(PipeReader reader, CancellationToken cancellationToken) 
    where T: new()
{
    T model = new T();
    while (!cancellationToken.IsCancellationRequested)
    {
        ReadResult readResult = await reader.ReadAsync(cancellationToken);
        ReadOnlySequence<byte> buffer = readResult.Buffer;

        if (readResult.IsCanceled) break;
        if (buffer.IsEmpty && readResult.IsCompleted) break;

        SequencePosition consumed = JsonSerializer.Patch(model, buffer, readResult.IsCompleted);
        reader.AdvanceTo(consumed, buffer.End);               
    }

    return model;
}

public SequencePosition Patch<T>(T model, ReadOnlySequence<byte> jsonData, bool isFinalBlock, JsonSerializerOptions options = null)
{
      ...            
}

@terrajobst

ability to prevent others from mutating it

Only this, currently. It is really just for 'data-transfer-objects'. Great news!

@mwoo-o

Thanks. When will System.Text.Json be included in the .NET Standard SDK?
Will .NET Standard 3.0 (or some later release version) include the System.Text.Json package? Will it happen in the .NET Core 3.0 SDK production release?

There is no .NET Standard SDK. .NET Standard is an API surface, available on all supported platforms. You can use System.Text.Json in any application that targets one of the platforms supported by .NET Standard, see .NET implementation support.

@TsengSR

There is no .NET Standard SDK. .NET Standard is an API surface, available on all supported platforms.

Well, there is a project type that allows you to use the APIs. I think @mwoo-o is asking whether we have plans to add System.Text.Json to .NET Standard. The answer is no. Right now, we're planning on leaving this as a NuGet package.

It's terrible. Too few functions to be usable in a project.
