OpenAPI incompatible with I-JSON #1517
I opened json-schema-org/json-schema-spec#704 about two years ago, which describes the same issue and asks for some of the same solutions, as a start. I wasn't aware of I-JSON at the time, and this is the first time I've seen the canonicalization draft; thank you for that. I've said my piece on the issue there; my stance hasn't really changed. |
@auspicacious Yes, we are in total agreement. I saw that I had accidentally stated 56 bits of precision. Corrected. |
For better or worse, OpenAPI borrows its type system from JSON Schema, which aims to "annotate and validate JSON documents", not I-JSON documents. Is this a hypothetical problem, or are there really APIs out there which deal in 64-bit integers and haven't already solved the problem by using a string? (I ask because I have checked ~75,000 real-world OpenAPI definitions, and no-one has used the obvious workaround.) |
I have no idea, but since Node.js is unable to handle 64-bit integers correctly using the standard JSON object, it seems to me like a problem. Maybe everybody out there already understands this limit and has created their own proprietary workarounds? It would be nice not to have to rely on workarounds. It seems that other quite popular data types lack a definition as well, like BigInteger, BigDecimal, and BigNumber, or am I just looking in the wrong documents? |
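The Node.js precision problem mentioned above is easy to reproduce. A minimal sketch (the value below is 2^63 - 1, i.e. the largest int64, chosen purely as an illustration):

```javascript
// JSON.parse maps every JSON Number to an IEEE-754 double,
// which carries only 53 bits of integer precision.
const text = '{"int64Max": 9223372036854775807}';

const parsed = JSON.parse(text);
console.log(parsed.int64Max);         // 9223372036854776000 (precision silently lost)
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991 (2^53 - 1)

// Round-tripping quietly rewrites the document:
console.log(JSON.stringify(parsed));  // {"int64Max":9223372036854776000}
```

Note that nothing throws: the value is altered without any error, which is exactly why I-JSON recommends against relying on JSON Numbers outside the 53-bit range.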
The table that is referenced here, https://github.com/OAI/OpenAPI-Specification/blob/master/versions/3.0.1.md#dataTypes, is used to help folks use the OAS format values consistently. If you want to describe I-JSON payloads, then don't add the int64 format to your definitions. |
Pardon my lack of insight into OAS, but I looked at the left side of the table and it does indeed describe a long. In https://github.com/OAI/OpenAPI-Specification/blob/master/versions/3.0.1.md#components-object-example there is a definition:
"id": {
  "type": "integer",
  "format": "int64"
}
does that map into a JSON Number? |
@cyberphone I was about to respond to the JSON Schema issue, but I'll start here. JSON Schema doesn't currently define integer formats, although we have talked about it. If we did, and if we defined an "int64", it would get a note in the interoperability concerns section, with a reference to the appropriate part of the JSON RFC. We would presumably also consider an "int53" and allowing "int64" to apply to strings. There is currently a proposal under consideration for a fixed-decimal string format, which will go into the spec as soon as someone shows up who understands the use case well enough to write it up (I have no idea if there are non-obvious interoperability problems with it, so I've left it alone). In any event, based on the current JSON Schema spec there are no JSON Schema concerns, as the JSON RFC is a normative reference and already adequately covers the challenges of working with large or high-precision numbers in JSON. To answer your question about:
{
  "type": "integer",
  "format": "int64"
}
Yes, that would map to a JSON Number, because the integer type is defined in terms of JSON Numbers. It would have obvious interoperability concerns, but that is no reason to avoid defining it. JSON Schema is not defined over JSON text, but over a data model which explicitly includes arbitrary-precision numbers. A system that has the capability to handle 64-bit integers should be able to describe and make use of them. JSON Schema definitely will not be changed to forbid this. |
Right, but that is orthogonal to "wire formats" like JSON. Even "poor" languages like JavaScript can deal with 64-bit integers, just not through JSON Number. The advantage of using the Number type is minimal even if the JSON tool in use can deal with it. Nobody has ever proposed that, for example, RSA exponents (which are big integers) should use the JSON Number type. I believe the same holds for "Money", which is syntactically compatible with JSON Number but requires another "math module". How is the data model supposed to deal with the following?
{
  "giantNumber": 1.4e+9999,
  "payMeThis": 26000.33,
  "int64Max": 9223372036854775807
}
Since one goal of OpenAPI ought to be interoperability (?), this issue seems rather important.
https://cyberphone.github.io/doc/openkeystore/javaapi/org/webpki/json/JSONObjectWriter.html#setInt53-java.lang.String-long-
I don't have any more input on this. |
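For the record, here is what a stock JavaScript JSON parser does with those three values (a sketch; other parsers behave differently, e.g. Python's json module keeps the big integer exact):

```javascript
// The same document as above: a number beyond the double range,
// a monetary value, and the largest int64.
const doc = `{
  "giantNumber": 1.4e+9999,
  "payMeThis": 26000.33,
  "int64Max": 9223372036854775807
}`;

const parsed = JSON.parse(doc);
console.log(parsed.giantNumber); // Infinity (overflows the IEEE-754 double range)
console.log(parsed.payMeThis);   // 26000.33 (fits, but as binary floating point)
console.log(parsed.int64Max);    // 9223372036854776000 (precision silently lost)
```

All three parse without error, so a validator working on the parsed values never sees the original text.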
I might be misinterpreting this, as my expertise is less in OpenAPI and more in JSON Schema, but can't OpenAPI describe APIs using wire formats other than JSON? OpenAPI allows specifying media types. There's some sort of XML support (I try to pretend it doesn't exist, but I recall an xml object in the spec). Authors and consumers of JSON APIs should be aware of JSON's (or I-JSON's) limitations and avoid formats that cannot be handled by JSON. But other sorts of APIs describable by OpenAPI might find int64 useful and interoperable. I think. Again, I'm neither an expert in OpenAPI nor in numeric representations, so if I seem to be missing something obvious here, I probably am. |
One of the big weaknesses of JSON Schema and OpenAPI is that the permissiveness of the specifications has often led to the development of code generators that generate clients that are not compatible with the APIs they are meant to talk to. For instance, since maximum and minimum sizes are not required on number fields, if you were going to write a code generator for Java that you could be reasonably confident about, you should always use BigInteger and BigDecimal.

I think that there seem to be several competing visions here as to what JSON Schema and OpenAPI should be. Are they meant merely as machine-readable documentation, from which you could generate human-readable documentation, or is OpenAPI meant as something that you could actively and usefully generate code from, or validate live traffic against? Likewise, if you wanted to use the JSON Schema-ish parts of OpenAPI in a wire-format-agnostic way, to describe XML (or Avro, or another binary format), how can you translate between those formats without knowing the exact rules governing the values? As it stands, OpenAPI does not encourage the precision that is necessary for machines to transparently and reliably accomplish these tasks. If these are not goals of the project, that should probably be made explicit. |
[Disclaimer: I'm speaking for myself, from the context of JSON Schema, and am not in any way speaking for OAI] Deep breath...

Your use case is not everyone else's use case. Your programming language or wire format restrictions are not everyone else's programming language or wire format restrictions. Your interoperability requirements are not everyone else's interoperability requirements. JSON Schema is not the only tool for solving these problems, nor should it be. It provides a layer of functionality, with a balance of flexibility and specificity that keeps it broadly useful. It is extensible for those who wish to extend it, and we are making the extensibility story much stronger in draft-08.

Currently, JSON Schema targets validation and hyperlinking as use cases. It specifically does not target code generation. Once the multi-vocabulary support in draft-08 is finalized, we plan to add a code generation vocabulary. That will address many of the problems that arise when trying to hammer a validation constraint system into an object-oriented code generation system (or other types of code generation, but OO is especially poorly matched to validation keywords). However, it will not address all of them. In particular, it will not conform to the expectations of any one language or type system. Again, JSON Schema is extensible, so those wishing to further constrain its usage may do so, for example by requiring numeric boundaries on every number field.

Please keep in mind that standards need to be written broadly, so as not to impose the constraints of one system onto another. Approaches like offering an "int53" alongside an "int64" are good, because they provide a solution to a common problem, based on the boundaries of a commonly implemented standard, not just a single language. However, forbidding "int64" is not good, because there are systems that can support 64-bit integers, and they should not be prevented from doing so.

Writing a code generation tool that either sets sensible numeric boundaries by default, or produces an error when there is not enough information in the schema to proceed in an interoperable manner, is a good solution. Requiring everyone to add boundaries whether they need them or not is not. Blaming JSON Schema for some implementation choosing the wrong type, despite it clearly not matching the specification, is also not useful. JSON Schema has never claimed that integers would fit in a Java long.

Do we need a way to map into binary protocol formats? Great, let's add that (once schema vocabularies are sorted). But don't make everyone use it, because not everyone needs it. |
Perhaps there are different visions of what JSON Schema and OpenAPI are trying to do, but it seems to me that the vision of the OpenAPI TSC is aligned with that of the JSON Schema working group. Let me try to use an analogy to see if we can agree on the different perspectives.

Consider two different human languages, let's say French and English. It seems to me that what is being asked for in this thread is for OAS to define precise rules describing how French words can be translated to English words. As nice as that idea sounds, there are lots of edge cases where it just wouldn't work. The way I see OpenAPI is that it is attempting to be an Esperanto for HTTP APIs. When people in France consume OpenAPI, they need to understand the highest-fidelity way of translating Esperanto to French. When English people consume OpenAPI, they translate in a way that makes the most sense from Esperanto. It is not possible for the OpenAPI specification to accurately describe how to translate its typing concepts into every possible implementation language. It is the responsibility of the implementations to do the thing that makes the most sense in their native language. The hope is that the terms it uses are close enough to universal concepts to be useful. I don't even want to try to answer the question of what int64 should mean in every implementation language.

The funny thing about interoperability that we have learned over the past 20 years is that the more constrained you try to make the interface definition in order to avoid interoperability problems, the more interoperability problems you end up having in the real world. WSDL and XSD went to enormous lengths to be precise about interfaces and wire formats and datatypes, and were significantly less effective at interoperability than far looser alternatives. |
The problem here is not one of understanding either of your intents. I understand what you're saying; it's not new to me; I just disagree. The predecessors to these specifications, and the problems in the ecosystems around them, have caused me a lot of pain over the past five years as I've been required to set norms and standards around them within a large company. Some of that came out in the comments to json-schema-org/json-schema-spec#704. It's difficult to separate the buggy implementations (and there are many!) from the specifications, yes, but I think responsibility lies both with the implementations and with the level of abstraction in the specification itself. That abstraction creates implicit optimization for some use cases; I don't think those are the same use cases that actually need to be targeted. Nor do I think most implementers are capable of internalizing the abstractions without much more help. As for bringing up the specter of XSD, couldn't I just as easily point to HTTP 2.0 as an example of how a lax standard had to be clarified and specified more precisely to reflect real-world use, after years of experience? Anyway, as it's clear this is a fundamental difference of opinion, I will try to make this my last comment on it. I'm glad to hear you're starting to think about code generation metadata, and maybe something interesting will come of it. |
@darrelmiller @handrews When you start looking into BigDecimal (which most monetary applications use), the problems associated with cramming every kind of number into JSON Number become more evident. Based on my experience (I'm into payment systems/banking), nobody uses JSON Number for such data. It is obvious that URL parameters do not share this problem; their only format restriction is being URL-compatible. |
FWIW, here is a related workaround for .NET using the DataContractJsonSerializer: the 64-bit value travels as a JSON string, while the public property exposes it as a long.
[DataContract]
public class MyJSONObject
{
    [DataMember]
    public double myDoubleValue;

    public long myLongValue
    {
        get { return long.Parse(myLongValueAsString); }
        set { myLongValueAsString = value.ToString(); }
    }

    [DataMember(Name = "myLongValue")]
    private string myLongValueAsString;
}

// Usage:
MyJSONObject o = new MyJSONObject();
o.myDoubleValue = 1.5e+33;
o.myLongValue = 9223372036854775807;
Works as expected when serialized and deserialized using JSON. |
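A JavaScript analog of the same workaround (a sketch, not code from the thread; the function names are illustrative) carries the 64-bit value as a JSON string on the wire and converts to BigInt at the edges:

```javascript
// Serialize: write the 64-bit value as a JSON string, never as a JSON Number.
function writeMyObject(doubleValue, longValue /* BigInt */) {
  return JSON.stringify({
    myDoubleValue: doubleValue,
    myLongValue: longValue.toString() // string on the wire, no precision loss
  });
}

// Deserialize: convert the string field back to BigInt.
function readMyObject(text) {
  const raw = JSON.parse(text);
  return {
    myDoubleValue: raw.myDoubleValue,
    myLongValue: BigInt(raw.myLongValue)
  };
}

const wire = writeMyObject(1.5e+33, 9223372036854775807n);
const back = readMyObject(wire);
console.log(back.myLongValue === 9223372036854775807n); // true: round-trip is exact
```

The same pattern works in any language with a big-integer type; only the string/native conversion at the boundary changes.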
OK, it sounds like we're arriving at closing statements, and then hopefully someone from OAI will close this. Here's my wrap-up:

@cyberphone @auspicacious No one is stopping you from, for example, requiring that everyone developing APIs for your company/project/whatever supply explicit numeric boundaries. And that's really the key point here: OpenAPI and JSON Schema give you what you need to solve your concerns. But you want to force everyone else to be constrained by the solutions to your concerns. That's just not how a successful open-ended system works. If you have concerns about using big numbers in JSON... just don't. And write your schemas / API contracts accordingly.

Alternatively: tell me why I should be constrained by your requirements. Don't show how large numbers violate your requirements; we all understand that. Show me that there is absolutely no use case ever for me or anyone else to use large numbers. Assume we have all of the wire format and programming language support that we need. Why should we be prevented from using it just so that you can feel more confident about interoperability, when you are never going to use our API and we are never going to use yours? It doesn't matter to me how many problems come up with large numbers and the JSON number type if I know that my OpenAPI / JSON Schema-based system will never hit those problems. Yes, they are real problems. Yes, APIs going for certain interoperability targets should do things to avoid those problems. But you want to make it the responsibility of the standard to force everyone to avoid those "problems", when there are many applications in which they are not problems. The fundamental disagreement here is that you are not willing to acknowledge that other people have requirements that conflict with yours, and that those requirements are equally valid.

We can offer all of the tools you might want for producing systems that will work for you, but there is no reason I can think of to break everyone else's use cases to make you more comfortable. |
I don't appreciate being made into a straw man, and I feel that's what's happened in this thread. Please have a look back and ask why some of these proposals, with modification, couldn't help OpenAPI users while imposing nothing on people who don't like them, even if you don't like my most extreme one from json-schema-org/json-schema-spec#704. |
@auspicacious I was making an effort to reply to specific things you proposed, such as minimum/maximum boundaries. Otherwise, I am replying to the general trend of all of the objections here, not just specifically you. Since your response to my attempt to discuss your own proposal that you put on the table is to withdraw it, I guess we're done. |
We can close the thread now. You may have reasons to revisit it the day you begin supporting Decimal (monetary data) and BigNum, which are distinct data types. Format-wise they can surely traverse through JSON Number, but the value/gain seems limited, and it will likely be incompatible with JavaScript, which is about to be enriched with BigNum. I would consider doing a short research effort regarding the current use of Decimal/BigNum in conjunction with JSON. Microsoft uses a JSON Object in .NET for representing BigInteger, so apparently interoperability isn't that great for us poor JSON users 😞 OpenAPI obviously has a mission to fill here 😁 |
https://github.com/OAI/OpenAPI-Specification/blob/master/versions/3.0.1.md#dataTypes
Apparently long maps (through JSON Schema) into JSON Number. This is in conflict with I-JSON: https://tools.ietf.org/html/rfc7493
IMO, I-JSON is right. Promising 64 bits of precision when only 53 bits are readily available leads to unnecessary problems.
Suggested solution:
- keep long, but mark it "deprecated"
- int53, mapped into number
- int64, mapped into string
Related: https://cyberphone.github.io/doc/security/draft-rundgren-json-canonicalization-scheme.html#json.bignumbers
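A sketch of how the suggested formats might look in a schema. Note that int53 and int64-as-string here are the hypothetical formats proposed above; they are not formats the OpenAPI specification defines:

```json
{
  "smallId": {
    "type": "integer",
    "format": "int53"
  },
  "bigId": {
    "type": "string",
    "format": "int64"
  }
}
```

With this convention, smallId stays within the range every JSON parser handles exactly, and bigId survives JavaScript round-trips because it never touches JSON Number.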