Runtime: Question: Serialization support going forward from .Net Core 1.0

Created on 1 Mar 2016  ·  38 Comments  ·  Source: dotnet/runtime

Hi all,
I have heard about support for serialization being discontinued in .Net Core 1.0 as it is not viable for cross-platform. (Paraphrasing from memory) What does this mean in practical terms? Are my code-bases that use BinaryFormatter's Serialize and Deserialize methods going to be completely deprecated, and I'll have to convert my codebase over to say, protobuf-net? Or have I misunderstood?
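For reference, the code pattern at stake is the classic BinaryFormatter round trip. A minimal sketch (the `Order` type is invented for illustration; this runs on the full .NET Framework, where BinaryFormatter is available):

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
class Order
{
    public int Id;
    public string Customer;
}

static class Program
{
    static void Main()
    {
        var formatter = new BinaryFormatter();

        // Serialize an object graph to a byte array...
        byte[] payload;
        using (var ms = new MemoryStream())
        {
            formatter.Serialize(ms, new Order { Id = 1, Customer = "Sam" });
            payload = ms.ToArray();
        }

        // ...and rebuild the object from those bytes.
        using (var ms = new MemoryStream(payload))
        {
            var copy = (Order)formatter.Deserialize(ms);
            Console.WriteLine(copy.Customer); // prints "Sam"
        }
    }
}
```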
Thanks,
-Sam

area-Serialization question

Most helpful comment

Do you want people to actually use .NET Core or is this just a waste of time like Silverlight? If you want people to actually use .NET Core, make it work. If people need serialisation, build it - no back-talk! These are the people actually using your product, and their opinion is worth far more than the collective wisdom of every Microsoft employee. You see these people are, unlike Microsoft, actually building mission critical stuff on .NET and if you want .NET Core to be anything, you have to stop breaking its usefulness. Nobody asked you to ditch .NET to do a full rewrite, you could have just ported the full .NET framework over time. You would have been done by now.

All 38 comments

Hi @joshfree, yes unfortunately that was the document that caused the confusion. Of the suggestions there, JSON.NET does JSON serialization, protobuf-net does binary serialization, and DataContractSerializer does XML serialization. The problem with that is what to do if I want binary serialization. While protobuf-net is a great library, it is limited. From protobuf-net's repo, the supported types are:
  • custom classes that:
      • are marked as data-contract
      • have a parameterless constructor
      • for Silverlight: are public
  • many common primitives etc.
  • single-dimension arrays: T[]
  • List<T> / IList<T>
  • Dictionary<TKey,TValue> / IDictionary<TKey,TValue>
  • any type which implements IEnumerable<T> and has an Add(T) method
In the past this has been fine as BinaryFormatter has always been there, but this is no longer going to be the case? What's the recommended way to binary-serialize types not supported by protobuf-net? Build it ourselves?
I'm still pretty novice at all of these technologies so I may be completely missing the point on something.
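To make the constraints listed above concrete, a minimal protobuf-net contract might look like this (a sketch; the `Person` type is invented, and protobuf-net comes in as a NuGet package):

```csharp
using System.IO;
using ProtoBuf; // protobuf-net NuGet package

[ProtoContract]                 // marked as a data contract
public class Person
{
    public Person() { }         // parameterless constructor required

    [ProtoMember(1)] public string Name { get; set; }
    [ProtoMember(2)] public int Age { get; set; }
}

public static class ProtoDemo
{
    public static Person RoundTrip(Person p)
    {
        using (var ms = new MemoryStream())
        {
            Serializer.Serialize(ms, p);   // compact binary payload
            ms.Position = 0;
            return Serializer.Deserialize<Person>(ms);
        }
    }
}
```

Types that fall outside these constraints (cyclic graphs, arbitrary subtypes, closures) are exactly where BinaryFormatter had the edge.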

Also, none of them can reliably serialize exceptions (and keep all the relevant data), as needed e.g. in distributed computing (Azure...).

@cdrnet's point is a big sticking point for guys like the Orleans team, IIRC.

Hi @SamuelCox, as the porting guidance @joshfree pointed to, we (the serialization team) have no plan to bring the binary serializer to .NET Core. Without .NET remoting and the concept of an AppDomain in .NET Core, the binary serializer is much less useful. Please consider other serializers in .NET Core such as DataContractSerializer (DCS), DataContractJsonSerializer (DCJS) and XmlSerializer, which are and will continue to be supported in .NET Core. Of course, you can also consider 3rd-party serializers built on .NET Core. Maybe I should ask the question: are you particularly looking for a serializer that has a binary serialized payload (if yes, why), or are you just looking for a serializer that can serialize/deserialize the types you care about? We have been trying hard to support the serialization of most .NET types with DCS, DCJS and XmlSerializer. Please feel free to let us know if you run into any particular issues where these serializers don't work for you.
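For illustration, the DataContractSerializer path mentioned here looks roughly like this (a sketch; the `Message` type is invented):

```csharp
using System.IO;
using System.Runtime.Serialization;

[DataContract]
public class Message
{
    [DataMember] public string Body { get; set; }
}

public static class DcsDemo
{
    public static Message RoundTrip(Message m)
    {
        var dcs = new DataContractSerializer(typeof(Message));
        using (var ms = new MemoryStream())
        {
            dcs.WriteObject(ms, m);         // writes an XML payload
            ms.Position = 0;
            return (Message)dcs.ReadObject(ms);
        }
    }
}
```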

@cdrnet and @RichiCoder1, thanks for your feedback. There was an open issue, dotnet/coreclr#2715, for discussing serialization of exceptions. Please add your feedback there. I agree it is important to be able to serialize exceptions in a distributed system. Currently, without ISerializable on .NET Core, we just cannot serialize exceptions as we do in the full .NET Framework.

Hi @zhenlan, I perhaps should have mentioned originally that the main reason I'd want BinaryFormatter is to (de)serialize exceptions, and custom classes that contain exceptions.

The main use case is distributed systems: serializing some object to a byte array so that I can send it over TCP/IP. I realise it is probably more common to serialize the object to JSON via JSON.NET, for example, convert that string to a byte array, send that over TCP/IP and reverse the whole process on the other end. But that has a lot more overhead than serializing straight to a byte array and deserializing from a byte array back to the object you want.
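The round-about path described here — object to JSON string to UTF-8 bytes and back — can be sketched as (helper names invented):

```csharp
using System.Text;
using Newtonsoft.Json; // JSON.NET

public static class Wire
{
    // object -> JSON string -> UTF-8 bytes, ready for the socket
    public static byte[] ToBytes<T>(T value) =>
        Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(value));

    // bytes from the socket -> JSON string -> object
    public static T FromBytes<T>(byte[] data) =>
        JsonConvert.DeserializeObject<T>(Encoding.UTF8.GetString(data));
}
```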

Thanks @SamuelCox for the clarification. I think I understand your concern now. Binary serialization is a closed system. I would recommend you consider other, more open, standards-based serialization such as XML or JSON. You may not need it today, but it gives you more flexibility in case you need to support serialization between different platforms or with different programming languages/scripts in the future. Not sure how large your data will be, but if the size of the payload is really a concern, you can consider data compression, which is a built-in feature of many servers nowadays.
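The compression suggestion can also be applied in-process, before the payload hits the wire. A sketch using the built-in GZipStream:

```csharp
using System.IO;
using System.IO.Compression;

public static class Gz
{
    public static byte[] Compress(byte[] data)
    {
        var output = new MemoryStream();
        using (var gz = new GZipStream(output, CompressionLevel.Optimal))
            gz.Write(data, 0, data.Length);   // flushed and finalized on dispose
        return output.ToArray();              // ToArray is valid after close
    }

    public static byte[] Decompress(byte[] data)
    {
        using (var gz = new GZipStream(new MemoryStream(data), CompressionMode.Decompress))
        using (var output = new MemoryStream())
        {
            gz.CopyTo(output);
            return output.ToArray();
        }
    }
}
```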

I was more thinking of the performance overhead, than anything else. Seems like a lot of wasted operations compared to serializing and deserializing straight from objects to binary and back. I think I'm happy to close this now as I at least received clarification. I still think there's a gap between 4.6 and core 1.0 that's being opened up here, but I do understand why.

Bit late to the conversation, but here are my two cents:

I think it would be a mistake to believe that the utility of BinaryFormatter is limited to .NET remoting and AppDomains. What differentiates the (obsolete) BinaryFormatter from its newer counterparts is its sheer capability of serializing the more exotic .NET objects, including closures, subtypes and cyclic graphs. None of the other serializers listed in the current thread are capable of doing all this. It is no coincidence that many cutting edge distributed frameworks, including Microsoft projects such as Prajna and Mobius (aka SparkCLR) rely on BinaryFormatter in order to function. This is not exclusive to the .NET world: Spark uses the old and slow Java binary serializer for serializing closures.

There are other (non-binary format) serializers that replicate BinaryFormatter's capabilities, including our very own FsPickler library used by the mbrace framework. However, CoreCLR has deprecated many key APIs, to the extent that I believe porting the library to CoreCLR is an impractical endeavour.

From a business perspective, it would be great to see CoreCLR becoming a viable cross-platform competitor to JVM in the distributed computation/big data domain. This cannot happen without the platform offering reliable serialization support for POCOs and closures (binary or otherwise).

Do you want peope to actually use .NET Core or is this just a waste of time like Silverlight? If you want people to actually use .NET Core, make it work. If people need serialisation, build it - no back-talk! These are the people actually using your product, and their opinion is worth far more than the collective wisdom of every Microsoft employee. You see these people are, unlike Microsoft, actually building mission critical stuff on .NET and if you want .NET Core to be anything, you have to stop breaking its usefulness. Nobody asked you to ditch .NET to do a full rewrite, you could have just ported the full .NET framework over time. You would have been done by now.

fwiw, CSLA .NET relies on full-fidelity serialization because it rests on the concept of mobile objects.

When Silverlight came along and didn't have BinaryFormatter or NetDataContractSerializer, _and_ had all those nasty reflection limitations, we ended up implementing our own serializer that uses minimal reflection and doesn't rely on BF or NDCS.

In a post-Silverlight world the issue remains, because BF/NDCS aren't reliably available in UWP, WinRT, .NET Core, etc.

So I think there's an argument to be made that a full-fidelity serializer _should_ exist, but it is only really useful (at least imo) if it exists on all the various incarnations of .NET.

@rockfordlhotka @opinionmachine @eiriktsarpalis Glad to hear more people feel the same, although I feel it would be more productive if it were said a bit more politely, @opinionmachine, but to each their own. As this issue is closed, I would imagine the corefx team are no longer monitoring it. I'd advise you to raise your concerns on dotnet/coreclr#2715 as mentioned by @forki

I'd have to agree that dropping serialization (as well as appdomains) was a terrible idea. I really don't think that decision was well thought out.

However, what really gets me going is that MS didn't release the code necessary to support AppDomains, etc., and left it to the community to do with it what they will.

Given that coreclr is supposed to be the "server" version of .net, a lot of server features were left on the floor.

Case in point: it was decided to leave StackTrace/StackFrame out of corefx because it was "misused" by some developers, and because it was rarely used (according to MS). I believe they came to their senses on that one (after a ton of backlash) but I mean really? Who thinks of this stuff?

For what it's worth, I'm actually really happy with .NET Core and think it's a great step forward for .NET; the performance improvements are brilliant, it still has pretty much all of the features I care about, and it's cross platform. This is really the only issue I have with it. There is a lot of value in dropping features where appropriate, just look at how successful Go has been with its minimalistic approach. Not that I'm saying .NET should be as minimalistic as Go, that would be ridiculous, and I personally disagree with a minimalistic design, BUT what I'm trying to say is MS shouldn't be vilified for dropping certain features.

@SamuelCox, thanks! :-)

And to reiterate, APIs are missing from .NET Core for many reasons. Some of these gaps can be fixed easily (and we are fixing them); some are harder to fix. But we do want to hear from you about the issues you are running into (like this whole set of threads about serialization) and we want to find a way to unblock your scenarios.

Binary serialization of arbitrary object graphs across versions of a single framework is hard; across different frameworks it is even harder. And by "hard" I don't mean it is hard for us to do the work. I mean it has quite far-reaching implications for other goals of the platform (goals that I think we all share): security, performance, reliability.

Json.NET + TypeNameHandling.All + PreserveReferencesHandling.All + MemberSerialization.Fields gets you almost all the way there. There is no FormatterServices.GetUninitializedObject however so a constructor needs to be available.
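A sketch of that configuration (the `Node` type is invented; note that TypeNameHandling.All embeds type names in the payload, which is exactly the security pattern flagged later in this thread):

```csharp
using Newtonsoft.Json;

[JsonObject(MemberSerialization.Fields)]   // serialize fields, as BinaryFormatter does
public class Node
{
    public int Value;
    public Node Next;                      // may form a cyclic graph
}

public static class FullFidelity
{
    static readonly JsonSerializerSettings Settings = new JsonSerializerSettings
    {
        TypeNameHandling = TypeNameHandling.All,                    // embed .NET type names
        PreserveReferencesHandling = PreserveReferencesHandling.All // keep identity / cycles
    };

    public static string Save(Node n) => JsonConvert.SerializeObject(n, Settings);
    public static Node Load(string json) => JsonConvert.DeserializeObject<Node>(json, Settings);
}
```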

There is no FormatterServices.GetUninitializedObject

This is the core problem I think - external reflection-based serialization frameworks can get you a long way, but this specific API needs to be there to enable these frameworks to be written. My recommendation to the CoreCLR team would be to add this API back in.

At some point the .NET community (commercial or otherwise) is going to have to stop relying on Microsoft holding their hand all the time. If there is such a massive requirement for this, and it's been known for quite some time that it won't be available, then why hasn't there been any community activity on providing an alternative? Json.NET is a community alternative and we use that all the time. Heck, the source code is even available on Reference Source.

It seems to work out for other platforms and I'm sure it would work out just fine on .NET. The way I see it, this is a prime opportunity to create a project (that there seems to be demand for) and get in early on filling that void with community alternative(s).

@thecodejunkie we'd love to, but we'd still need the runtime to expose appropriate APIs for this to happen.

FYI: this works on .NET Core: Type appDomainType = Type.GetType("System.AppDomain");. And yes, it lets you do a lot of things....
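A sketch of the trick being described — reaching a type that the runtime carries but the reference assemblies don't expose (whether a given member is actually present varies by runtime version):

```csharp
using System;
using System.Reflection;

class Probe
{
    static void Main()
    {
        // The reference assemblies for .NET Core 1.x hide AppDomain,
        // but the underlying runtime still has the type.
        Type appDomainType = Type.GetType("System.AppDomain");
        if (appDomainType != null)
        {
            PropertyInfo current = appDomainType.GetProperty(
                "CurrentDomain", BindingFlags.Public | BindingFlags.Static);
            object domain = current?.GetValue(null);
            Console.WriteLine(domain != null ? "got an AppDomain" : "no CurrentDomain");
        }
    }
}
```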

Just to say that we have improved our front-end server responsiveness by 30%, reduced from 12 cores at max load to 2, decreased the Redis cache size from 1.7 GB to 350 MB, and overall reduced our Azure hosting costs by 20% (a bit more, really).

You guessed it: BinaryFormatter!

We're now using NetDataContractSerializer.

I came here looking for answers to .NET 4.6.1 being so much slower with BinaryFormatter.

It is my understanding that these APIs will come back (serialization). While it is not ideal and has some issues about being brittle, that at least should allow existing code to continue running.

There is also value in designing another simple object serializer for the core framework, one that is also lightweight but more resilient to cross-machine and potentially cross-version issues.

There are also security issues on top of brittleness.

@blowdart WRT security, do you mean things like this https://blog.scrt.ch/2016/05/12/net-serialiception/?

Yup that's the kind of thing. Any serialization format that includes the type within it is dangerous, as Java is finding out this year. I was incredibly happy when we dropped binary serialization because we removed an entire class of vulnerabilities.

@migueldeicaza @blowdart @SamuelCox
Serializers are needed not only for sending things over the wire, but even in-proc.
The binary serializers, when done properly, outperform the native object heap altogether when it comes to storing tens of millions of objects in-process.
See this:
https://www.infoq.com/articles/Big-Memory-Part-2

The serialization APIs are absolutely needed for sane cluster programming.
It is extremely inconvenient to teleport object instances CLR -> text -> CLR; it is a huge overhead.
Maybe it is not a mistake to take BinaryFormatter out, as it is VERY slow and its datagrams are huge, but it was the only serializer on the market besides NFX.Slim that supported the full CLR serialization semantic.
See the detailed speed and size charts:
http://aumcode.github.io/serbench/

ISerializable with the [OnSer/Deser] attribute family DOES make a lot of sense for in-platform teleportation.
It is not mandatory, just as in old .NET. Why not keep it?
At least support it in complex collections (e.g. Dictionary); it is not hard at all.
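A sketch of the pattern being defended — ISerializable for custom persistence plus an [OnDeserialized] callback to rebuild transient state (the type and its members are invented for illustration):

```csharp
using System;
using System.Runtime.Serialization;

[Serializable]
public class CacheEntry : ISerializable
{
    public string Key;
    [NonSerialized] public string Normalized; // transient, rebuilt after deserialization

    public CacheEntry(string key) { Key = key; Normalized = key.ToUpperInvariant(); }

    // deserialization constructor invoked by formatters
    protected CacheEntry(SerializationInfo info, StreamingContext context)
    {
        Key = info.GetString("key");
    }

    // choose exactly what gets persisted
    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("key", Key);
    }

    [OnDeserialized]
    private void Rebuild(StreamingContext context)
    {
        Normalized = Key.ToUpperInvariant(); // recompute transient state
    }
}
```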

Making everyone use JSON is an absolutely bad idea, as it is many times slower than a binary serializer (not BinaryFormatter).
It looks like everyone is building Twitter-like apps with no business logic?

@itadapter I do have one nit to pick, which is that Json.NET isn't the only serialization game in town. There's also MsgPack, Protobuf, and Microsoft's own Bond. (Edit: your second link calls out Protobuf.) JSON isn't the only option. There is still the big issue, though, of serializing .NET objects with interesting semantics (Exception).

@RichiCoder1
Yes, of course. As seen in the charts linked below, JIL is the fastest JSON serializer, for example; however, none of the text-based serializers can touch the binary ones. Protobuf is very fast at the expense of its inability to support true polymorphism and complex graphs.

All I am saying is:

  • Binary serializers are always more performant for business domain objects, especially when you have lots of numerical data
  • Serializers, in general, are necessary NOT ONLY for moving data between systems, but even inside the process, as the Big Memory approach shows. It is not a typical approach, but it makes tons of practical sense (i.e. caches and in-memory social graph traversals)
  • "Teleportation" is a technique akin to MS Remoting, which honestly was botched. Remoting in general, when done right (without horrid complexity), is VERY VERY useful in cluster systems. We use this approach all the time and the programming is WAY easier with native objects - when you can take DOMAIN objects and just send them to another method as-is, be it on this or an adjacent machine in the rack

Benchmarks of a "Typical Person", showing popular serializers:
http://aumcode.github.io/serbench/Specimens_Typical_Person/web/overview-charts.htm

Just in case anyone on this thread is not aware, the binary serializer, including ISerializable etc., has been made available in corefx:
https://github.com/dotnet/corefx/tree/master/src/System.Runtime.Serialization.Formatters

@zhenlan Part of the .NET Standard 2.0 work, I assume?

@RichiCoder1 Yes, correct.

@zhenlan That's awesome!

@zhenlan I can't seem to find the Serializable Attribute available in the preview of Standard. Am I just missing it?

@justinhelgerson it should be there.

Did you install 2.0 Preview1 SDK as instructed in the announcement blog post?

After you create a .NET Core project, can you please make sure the .csproj file has <TargetFramework>netstandard2.0</TargetFramework> if you created a Class Library or <TargetFramework>netcoreapp2.0</TargetFramework> if you created a Console App?
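For reference, the relevant fragment of the .csproj would look roughly like this (reflecting the netstandard2.0 vs netcoreapp2.0 choice described above):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- Class Library: netstandard2.0; Console App: netcoreapp2.0 -->
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
</Project>
```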

BTW, since many folks on this thread have interest in binary serializer, you may be interested in discussion in dotnet/corefx#19119, which is about scaling back [Serializable] for .NET Core 2.0. Please let us know there in case you have any feedback.

@zhenlan Thanks a lot for your quick response. I had installed the preview via NuGet but I didn't manually update my .csproj file. That did the trick! I'll read about the scaling back and provide input if it would impact our use-case.

@justinhelgerson , the scale-back changes aren't in Preview 1 (we're still working on them!), but you should be able to try them with daily builds or a future preview soon.
