Serializing a double with JsonSerializer results in JSON with lower precision #45341
@dgiddins this is a consequence of the precision of double math. Both the before and after numbers in your example have exactly the same bit representation when stored in a double:
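A minimal sketch (not the original comment's illustration) of how to check this yourself with `BitConverter.DoubleToInt64Bits`; the expected output is the bit pattern quoted in the next paragraph:

```csharp
using System;

class Program
{
    static void Main()
    {
        double before = 50.494329039350461; // the form .NET Core 2.2 produced
        double after  = 50.49432903935046;  // the form .NET 5 produces

        // Both lines print 40493F462C88BC96, per the bit pattern quoted below.
        Console.WriteLine(BitConverter.DoubleToInt64Bits(before).ToString("X16"));
        Console.WriteLine(BitConverter.DoubleToInt64Bits(after).ToString("X16"));

        Console.WriteLine(before == after); // True: they are the same double
    }
}
```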
In your program, the previous implementation happened to translate 40493F462C88BC96 into 50.494329039350461 (which happens to be what it "came from") instead of 50.49432903935046; the current implementation chooses the latter, because it is the shortest possible representation that is stored as that bit pattern (i.e., it round-trips to itself when parsed from a string). Usually this is what you want, as it is more compact. This work was done to conform to the IEEE 754 standard. Do you need numbers of this sort to deserialize to exactly the same representation? If so, you need to either serialize them as a larger type (like a 128-bit float, which .NET does not natively support) or as a string. Otherwise, perhaps you are satisfied with the explanation above.
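If the exact textual form matters, the serialize-as-a-string option mentioned above could be done with a custom converter. A minimal sketch, assuming System.Text.Json on .NET 5; the converter name and the "G17" format choice are illustrative, not from the thread:

```csharp
using System;
using System.Globalization;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical converter: writes each double as a string using the "G17"
// format, so the serialized text does not depend on the shortest-round-trip
// formatting introduced in .NET Core 3.0+.
public class DoubleAsStringConverter : JsonConverter<double>
{
    public override double Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
        => double.Parse(reader.GetString()!, CultureInfo.InvariantCulture);

    public override void Write(Utf8JsonWriter writer, double value, JsonSerializerOptions options)
        => writer.WriteStringValue(value.ToString("G17", CultureInfo.InvariantCulture));
}

// Usage (hypothetical):
// var options = new JsonSerializerOptions { Converters = { new DoubleAsStringConverter() } };
// JsonSerializer.Serialize(50.494329039350461, options)
//   -> e.g. "\"50.494329039350461\"" (a 17-significant-digit string)
```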
Thank you @danmosemsft. My understanding is that the behaviour I saw in .NET Core 2.2 was considered a bug, so I am happy with the behaviour as it is in .NET 5. The data represents a latitude or longitude in our system, so as long as we don't lose precision, the exact way the serialization works isn't critically important. We were simply serializing an object that contained a lat and long and generating a hash from the JSON representation to detect changes between different systems with the same data.
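For context, the change-detection approach described above might look roughly like the following sketch; the `Location` record, `ChangeDetector` class, and property names are hypothetical, and the point is only that hashing the JSON text makes the hash sensitive to the textual rendering of a double:

```csharp
using System;
using System.Security.Cryptography;
using System.Text.Json;

// Hypothetical shape of the object compared between systems.
public record Location(double Lat, double Lon);

public static class ChangeDetector
{
    // Hashes the JSON text, so any change in how a double is rendered
    // (such as the shorter .NET 5 form) produces a different hash even
    // though the underlying value is identical.
    public static string HashOf(Location location)
    {
        byte[] json = JsonSerializer.SerializeToUtf8Bytes(location);
        using var sha = SHA256.Create();
        return Convert.ToHexString(sha.ComputeHash(json));
    }
}
```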
Issue Title
Serializing a double with JsonSerializer results in JSON with lower precision
General
While upgrading from .NET Core 2.2 to .NET 5.0 we experienced failing tests that depend on consistent serialization of objects to generate a hash. I have tracked this down to the way the double type is serialized when passed to the JsonSerializer.Serialize method. During serialization we seem to be losing precision: some doubles come out with one fewer decimal digit, but not every number is affected.
For example
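A minimal sketch of the behaviour being described, using the value quoted in the comment above; the outputs in the comments reflect what the thread reports for .NET 5:

```csharp
using System;
using System.Text.Json;

class Program
{
    static void Main()
    {
        double value = 50.494329039350461;

        string json = JsonSerializer.Serialize(value);
        Console.WriteLine(json); // 50.49432903935046 on .NET 5 (shortest round-trippable form)

        double roundTripped = JsonSerializer.Deserialize<double>(json);
        Console.WriteLine(roundTripped == value); // True: deserializing gives the original number back
    }
}
```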
When we deserialize we get the original number back, but we need to act on the serialized data to generate our hash. Why has this behaviour changed between frameworks: is it a bug or is it intended? Are there any settings we can change to restore the previous implementation (while remaining on .NET 5, of course)? The same behaviour can be seen with Newtonsoft.Json (I have raised an issue with them as well).
The behaviour seems consistent on Windows 10 x64 and in a Linux Docker image. This is running in Visual Studio 2019 (latest update).
The problem can also be seen with these numbers:
50.494328391915907
30.316339899700989
50.494128852095287