Submitted for your approval, JSOX


Submitted for your approval, JSOX

J Decker
(Thank you, Rod Serling.)

But seriously, I'd like to submit, for serious consideration, JSOX - the JavaScript Object eXchange format.  It inherits all JSON syntax, so it can process any existing JSON.  

I'm, at this point, open to changing anything (or even omitting things), including the name.

JSON is great, but it has some limits and criticisms.  JS/ES grew, while JSON had to stay the same; I'd imagine the same will hold for whatever comes next.  

So a primary goal is to encode and decode ES6 objects for transport with a simple API such as JSOX.stringify( object ) and JSOX.parse( jsoxString ).  But also keep with the simplicity of JSON,
so it can be used in human-readable circumstances.
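The intended shape of that API can be sketched with JSON itself as a stand-in (JSOX is not loaded here; the JSOX names in the comments are the proposal's, everything else is illustrative):

```javascript
// JSOX.stringify(object) / JSOX.parse(text) are meant as drop-in
// analogues of JSON's API; JSON is used as a stand-in below.
const record = { name: "bob", zipcode: "55555" };
const text = JSON.stringify(record);   // would be JSOX.stringify(record)
const back = JSON.parse(text);         // would be JSOX.parse(text)
console.log(back.name); // "bob"
```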

Types that are now (or soon will be) native to ES, such as TypedArrays (binary data), BigInt, and even the existing Date type, do not transport with JSON very well.  They become a non-identifiable string, and restoring the values to Date(), BigInt(), et al. requires extra code with knowledge of the structure of the data being transferred.    
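A runnable illustration of the Date case: the value survives only as an unmarked string, and a reviver must hardcode structural knowledge to get it back.

```javascript
// JSON flattens a Date into an unmarked ISO string.
const record = { created: new Date("2018-09-18T21:07:00Z") };
const text = JSON.stringify(record);
console.log(text); // {"created":"2018-09-18T21:07:00.000Z"} -- just a string

// Restoring it requires out-of-band knowledge of which field is a date:
const revived = JSON.parse(text, (key, value) =>
  key === "created" ? new Date(value) : value);  // field name hardcoded
console.log(revived.created instanceof Date); // true
```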

So a few weeks ago I started considering what else, beyond these simple modifications, might also be useful or address criticisms of JSON.  Handling the above types is really a trivial modification to most JSON parsers.  Each of the following modifications is really only a very slight change to behavior, although implementing typed-objects does initially involve changing error handling into identifier-fallback handling.

I initially argued for defining an object prototype, 'card(name,address,zipcode,created)', which removes the redundant keys from every following reference (and is good just for data reduction, against which 'gzip' was argued).  A JSON representation might be `{"name":"bob","address":"123 street","zipcode":"55555","created":1537304820}`; if you have a large number of the same record, the same "name":, "address":, etc. is repeated in every record.  A typed-object's value in JSOX could instead be `card{:"bob","123 street","55555",2018-09-18T21:07:00Z}`.  All objects that are revived as typed-objects share the same prototype, and the prototypes to be used may be specified before parsing.  The amount of data to process is reduced, perhaps to a significant degree.
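The scale of that redundancy is easy to measure with plain JSON (record shape and counts here are illustrative, not from the proposal):

```javascript
// With many records of the same shape, the repeated keys that the
// card(...) prototype would factor out are a large share of the payload.
const records = Array.from({ length: 100 }, (_, i) => ({
  name: "bob" + i, address: "123 street", zipcode: "55555",
}));
const json = JSON.stringify(records);
const perRecordKeys = '"name":"address":"zipcode":'.length; // 27 chars
const keyBytes = records.length * perRecordKeys;
console.log(keyBytes / json.length); // keys are roughly half the payload
```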

So <Identifier> '{' introduces typed-objects.  This construct is not allowed in JSON.  But that then leads to <Identifier> '[' - typed arrays.  Arrays don't really have the redundant-data potential of objects, but ES has TypedArrays, and while there is no way to define a new array type, hardcoded types like 'ab', 'u8', and 'ref' are used to revive binary data.  The bytes of the backing ArrayBuffer are encoded to base64 and included within '[' and ']' without quotes, using the brackets as quotes.
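A sketch of what that encoding amounts to, using Node's Buffer for the base64 step (the u8[...] wire form shown is my reading of the description, not quoted from the spec):

```javascript
// The TypedArray's backing ArrayBuffer bytes, base64-encoded between
// unquoted brackets after the hardcoded type tag.
const bytes = new Uint8Array([0, 1, 2, 3]);
const b64 = Buffer.from(bytes.buffer).toString("base64");
console.log(`u8[${b64}]`); // u8[AAECAw==]

// Revival is the reverse: decode the base64 back into a typed array.
const restored = new Uint8Array(Buffer.from(b64, "base64"));
console.log(restored.join(",")); // 0,1,2,3
```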

One built-in JSOX typed-array type is 'ref': a reference to another location in the current object, which allows encoding cyclic structures.
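A minimal sketch of how such a ref can be revived: record a key path from the outermost object and resolve it in a post-pass, so cycles (unrepresentable in JSON) can be rebuilt. The path representation here is an assumption, not JSOX's wire syntax.

```javascript
// Each pending ref names where it sits (at.obj/at.key) and the key path
// from the root it should point to; an empty path means the root itself.
function resolveRefs(root, refs) {
  for (const { at, path } of refs) {
    at.obj[at.key] = path.reduce((node, key) => node[key], root);
  }
}

const root = { name: "a", child: { name: "b", parent: null } };
resolveRefs(root, [{ at: { obj: root.child, key: "parent" }, path: [] }]);
console.log(root.child.parent === root); // true -- a cycle
```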

https://github.com/d3x0r/jsox
https://npmjs.com/package/jsox
(Initial public reaction was not very helpful, but that's probably the fault of how it was introduced?)
https://www.reddit.com/r/javascript/comments/9f8wml/jsox_javascript_object_exchange_format_preview/

There was plenty of 'why not [YAML/BSON/protobufs/(I don't think anyone said XML)/...]', and the answer is simply that none of those read JSON or have as simple an API (among other reasons that JSON is already a solution for, compared to those mentioned).

_______________________________________________
es-discuss mailing list
[hidden email]
https://mail.mozilla.org/listinfo/es-discuss

Re: Submitted for your approval, JSOX

Felipe Nascimento de Moura
I agree with that.
I too feel JSON could have an upgrade!
For instance, I almost always use JSON5 (https://json5.org/) transpilation in any project I have to deal with JSON files.

[ ]s

--

Felipe N. Moura
Web Developer, Google Developer Expert, Founder of BrazilJS and Nasc.

Website:  http://felipenmoura.com / http://nasc.io/ 
Twitter:    @felipenmoura
---------------------------------
Changing  the  world  is the least I expect from  myself!



Re: Submitted for your approval, JSOX

Isiah Meadows-2
What precisely is there to do within the spec itself?

The only reason JSON was included was because it became so ubiquitous
across numerous platforms: server side, client side, and scripting.

(Not a TC39 rep, but I strongly doubt they'd seriously consider
including support for *another* protocol unless it becomes similarly
ubiquitous.)

-----

Isiah Meadows
[hidden email]
www.isiahmeadows.com


Re: Submitted for your approval, JSOX

Mike Samuel
In reply to this post by J Decker
TC39 is not really a place to spec out new transport formats.

You proposed builtin support for JSON5 last year.
JSON5 was specified and had some adoption, but that was controversial because JSON5 was not nearly as widely used as JSON.
This seems to offer more obvious benefits over JSON than JSON5, but it doesn't yet approach the adoption threshold.

-----

IIUC, type tags can only reference builtin types with well-understood semantics like Date and typed arrays or structs with default values defined in the same JSON input.
No type referenced by JSON can be or extend a type defined by application code.

If that's not the case, please keep in mind that deserialization schemes that allow an external input to specify which types to construct make it easy for an attacker to forge objects that code might treat as privileged, because it assumes all instances are created internally.
"Malformed data or unexpected data could be used to abuse application logic, deny service, or execute arbitrary code, when deserialized."
See also "History of Java deserialization vulnerabilities" at https://www.slideshare.net/codewhitesec/java-deserialization-vulnerabilitesruhrseceditionv10

This already happens with plain JSON, so anything that allows external inputs to specify which internal types to construct would have to include a "Security Considerations" section that explains how this could be safely used by code that assumes that `if (x instanceof InternalType)` then x came from internal code that made a good-faith effort to only pass appropriate inputs to `new InternalType(...)`.
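The failure mode being warned about can be shown in a few lines. This is a deliberately naive deserializer sketch (the registry and names are hypothetical, not JSOX's mechanism):

```javascript
// If a deserializer maps a wire-side type tag straight to a prototype,
// untrusted input can mint objects that defeat instanceof-based checks.
class InternalType {}                 // assumed to be created only internally
const registry = { InternalType };

function naiveRevive(tag, fields) {   // `tag` comes from untrusted input
  return Object.assign(Object.create(registry[tag].prototype), fields);
}

const forged = naiveRevive("InternalType", { privileged: true });
console.log(forged instanceof InternalType); // true -- the check is defeated
```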


Re: Submitted for your approval, JSOX

J Decker
Again, I think it's a matter of introduction; I am just fishing for knowledge from the more knowledgeable; maybe what other JS types are important.
I did include a discussion link https://gitter.im/sack-vfs/jsox .

On Wed, Sep 19, 2018 at 8:14 AM Mike Samuel <[hidden email]> wrote:
TC39 is not really a place to spec out new transport formats.

You proposed builtin support for JSON5 last year.
JSON5 was speced and had some adoption but that was controversial because JSON5 was not nearly as widely used as JSON.
This seems to offer more obvious benefits over JSON than JSON5, but it doesn't yet approach the adoption threshold.

Yes, but JSON5 also won't handle BigInt (other than as a string) (pure speculation).
I'm not really proposing JSOX for the standard, but it is highly dependent on the standard (and yes, the subject line doesn't say that at all)
 

-----

IIUC, type tags can only reference builtin types with well-understood semantics like Date and typed arrays or structs with default values defined in the same JSON input.
No type referenced by JSON can be or extend a type defined by application code.
 
(I am assuming you mean JSOX?) Yes, types specified without the application registering information about that type are not really 'types'; they work more like macros.
And there are lots of ways to instantiate types, for which just sharing the same prototype isn't enough; a use case would be a 3D model which has lots of 'vector's and some 'matrix'es (and bone structures which may be cyclic). 
That interface probably needs work.

If that's not the case, please keep in mind though that deserialization schemes that allow an external input to specify which types to construct makes it easy for an attacker to forge objects that code might treat as privileged because it assumes all instances are created internally.
"Malformed data or unexpected data could be used to abuse application logic, deny service, or execute arbitrary code, when deserialized."
See also "History of Java deserialization vulnerabilities" at https://www.slideshare.net/codewhitesec/java-deserialization-vulnerabilitesruhrseceditionv10

Prototype binding is entirely controlled by the application; if it registers that objects of a certain type should use a certain prototype (or other construction method?), they will be that type alone, and not some arbitrary type.  If some new macro is used, the object will just get a blank prototype, shared by others of that 'type'.

And yes, the security note from JSON re eval() does still apply for a subset of valid input (since all JSON is valid JSOX).

I know of no exploits; all resulting strings should be shorter than the input (because of escapes \\).  The C version allocates an output buffer that is the same size as the input and moves decoded strings into it.  Structure characters [ { } ] , " ' ` don't transfer either.

This does stick to JSON's spirit of only transporting data.  The parser is very similar to a JSON parser, except that many inputs which would previously throw are accepted....
And references can only link to other objects/arrays within the current outermost object/array.
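The registration scheme described above can be sketched as follows (registerPrototype/reviveTyped are hypothetical names; only the behavior described in the thread is modeled):

```javascript
// App registers a prototype per tag before parsing; unregistered tags
// fall back to a blank prototype shared by all objects of that 'type'.
const registered = new Map();
const blanks = new Map();

function registerPrototype(tag, proto) { registered.set(tag, proto); }

function reviveTyped(tag, fields) {
  let proto = registered.get(tag);
  if (!proto) {                         // unknown macro: blank shared proto
    if (!blanks.has(tag)) blanks.set(tag, {});
    proto = blanks.get(tag);
  }
  return Object.assign(Object.create(proto), fields);
}

class Card {}
registerPrototype("card", Card.prototype);
console.log(reviveTyped("card", {}) instanceof Card);         // true
console.log(Object.getPrototypeOf(reviveTyped("x", {})) ===
            Object.getPrototypeOf(reviveTyped("x", {})));     // true
```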



Re: Submitted for your approval, JSOX

Mike Samuel


On Wed, Sep 19, 2018 at 12:01 PM J Decker <[hidden email]> wrote:
Again I think it's a matter of introduction; I am just fishing for knowledge from the more knowledgable; maybe what other JS types are important.
I did include a discussion link https://gitter.im/sack-vfs/jsox .

On Wed, Sep 19, 2018 at 8:14 AM Mike Samuel <[hidden email]> wrote:
TC39 is not really a place to spec out new transport formats.

You proposed builtin support for JSON5 last year.
JSON5 was speced and had some adoption but that was controversial because JSON5 was not nearly as widely used as JSON.
This seems to offer more obvious benefits over JSON than JSON5, but it doesn't yet approach the adoption threshold.

Yes, but JSON5 also won't handle bigint (other than as a string). (pure speculation).
I'm not really proposing JSOX into the standard, but it is highly dependent on the standard (and yes, the subject doesn't say that at all) 

Understood.
 

-----

IIUC, type tags can only reference builtin types with well-understood semantics like Date and typed arrays or structs with default values defined in the same JSON input.
No type referenced by JSON can be or extend a type defined by application code.
 
(I am assuming you mean JSOX?) Yes types specified, (without the application registering information about that type) are not really 'types' they work more like macros.
And there are lots of ways to instantiate types; for which, just sharing the same prototype isn't enough; a use case for that would be a 3D model which has lots of 'vector' and some 'matrix'es (and have bone structures which may be cyclic). 
that interface probably needs work; 

Sorry, JSOX.  Thanks for explaining.
 

If that's not the case, please keep in mind though that deserialization schemes that allow an external input to specify which types to construct makes it easy for an attacker to forge objects that code might treat as privileged because it assumes all instances are created internally.
"Malformed data or unexpected data could be used to abuse application logic, deny service, or execute arbitrary code, when deserialized."
See also "History of Java deserialization vulnerabilities" at https://www.slideshare.net/codewhitesec/java-deserialization-vulnerabilitesruhrseceditionv10

Prototype binding is entirely controlled by the application; if registers objects of a certain type should use a certain prototype(or other construction method?), they will be that type alone, and not some arbitrary type... if some new macro was used the object will just get a blank prototype; which would be shared by others of that 'type'.
 
And yes, the security note from JSON re eval() does still apply for a subset of valid input (since all JSON is valid),

I know of no exploits; all resulting strings should be shorter than the input (because of escapes \\ ).  The C version allocates a output buffer that is the same size as the input, and moves decoded strings into it.  Structure characters [ { } ] , " ' `  don't transfer either.

Not a vulnerability in your JSOX implementation per se, but have you looked into whether there's exploitable ambiguity between JSOX and runs of ES BlockStatements and ExpressionStatements?

JSON used to be vulnerable to cross-site snooping.

<script>// In attacker page
Array = function () { alert('Got ' + arguments[0]) };
</script>

This allowed piggybacking on HTTP credentials if an attacker could get a victim to visit their page.

The problem was that the meaning of [...] and {...} was specified in terms of global.Array and global.Object,
which could be replaced.

That's been fixed, but JSOX should probably be careful about any ambiguity with BlockStatement.
IIUC,
  { keyword: [] }
is valid as a statement so there is some ambiguity there.

Then I see examples like
//-- the following...
a { firstField, secondField }
[ a { 1, 2 }, a(5,6), a("val1","val2") ]
I haven't worked through your grammar, but I wonder whether a naive JSOX encoder might produce output like
    { looksLikeAStatementLabel: a("val1", "val2") }
or
    a
    { onlyField }
    [ a(5), a("val1") ]
allowing an attacker to do
    <script>
    let onlyField = null;
    function a(...data) {
      alert(`Got ${ data }`);
    }
    </script>
    <script src="http://other-origin/jsox-web-service"></script>

There's a lot of "ifs" in this scenario,
AND CORS solves a lot of these problems for origins that use it
AND browsers are less trusting of script srcs with Content-types:text/x-jsox than they were in 2008
BUT
    // attacker setup
    let onlyField = null;
    function a(...data) {
      alert(`Got ${ data }`);
    }
    // victim responds
    a
    { onlyField }
    [ a(5), a("val1") ]
does alert twice in Chrome.  JSON hijacking was exploited in the wild, serializers have been known to
line-wrap in attacker-controllable ways, and there may still be many JSON web services that respect ambient
credentials on cross-origin requests.

 
This does stick to JSON's spirit of only transporting data.  The parser is very similar to a JSON parser, except many places that would previously throw are accepted....
And references can only link to other objects/arrays within the current outermost object/array.

 

This already happens with plain JSON, so anything that allows external inputs to specify which internal types to construct would have to include a "Security Considerations" section that explains how this could be safely used by code that assumes that `if (x instanceof InternalType)` then x came from internal code that made a good-faith effort to only pass appropriate inputs to `new InternalType(...)`.

On Tue, Sep 18, 2018 at 5:22 PM J Decker <[hidden email]> wrote:
(Thank you Rod Sterling)

But seriously, I'd like to submit, for serious consideration, JSOX - JavaScript Object eXchange format.  It inherits all JSON syntax such that it is able to process any existing JSON.  

I'm, at this point, open to changing anything (or even omitting things), including the name.

JSON is great.  JSON has some limits, and criticisms... JS/ES Grew , but JSON has to stay the same, similarly with whatever comes next I'd imagine.  

So a primary goal is to encode and decode ES6 objects for transport with a simple API, such as JSOX.parse( jsoxString ) and JSOX.stringify( object ), while also keeping the simplicity of JSON,
so it can be used in human-readable circumstances.

Types that are now (or soon will be) native to ES, such as TypedArrays (binary data), BigInt, and even the existing Date type, do not transport with JSON very well.  They become a non-identifiable string that requires extra code, involving knowledge of the structure of the data being transferred, to restore the values to Date(), BigInt(), et al.

So a few weeks ago I started considering what else, beyond these simple modifications, might also be useful or address criticisms of JSON.  Handling the above types is really a trivial modification to most JSON parsers.  Each of the following modifications is really only a very slight change in behavior, although implementing typed-objects does initially involve changing error handling into identifier-fallback handling.

I initially argued that defining an object prototype 'card(name,address,zipcode,created)' removes the redundant data for every following reference (and is good just for data reduction, against which 'gzip' was argued).  A JSON representation might be `{"name":"bob","address":"123 street","zipcode":"55555","created":1537304820}`, where, if you have a large number of the same record, the same "name":, "address":, etc. is repeated in every record.  A typed-object's value in JSOX could instead be `card{:"bob","123 street","55555",2018-09-18T21:07:00Z}`.  All objects that are revived as typed-objects share the same prototype, and the prototypes to be used may be specified before parsing.  The amount of data to process is reduced, perhaps to a significant degree.
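For a rough sense of the size reduction, the two encodings can be compared directly.  The JSOX string below is hand-written from the fragments in this post; how the prototype declaration and the records combine into one document is an assumption here, not confirmed syntax:

```javascript
// Plain JSON: every key is repeated in every record.
const records = [
  { name: "bob", address: "123 street", zipcode: "55555", created: 1537304820 },
  { name: "sue", address: "456 avenue", zipcode: "44444", created: 1537304821 },
];
const json = JSON.stringify(records);

// Hand-written JSOX-style text following the post's fragments (assumed
// layout): the prototype is declared once, each record carries only values.
const jsox =
  'card(name,address,zipcode,created)' +
  '[card{:"bob","123 street","55555",1537304820},' +
  'card{:"sue","456 avenue","44444",1537304821}]';

console.log(json.length, jsox.length); // the JSOX form is already shorter at 2 records
```

The gap widens with record count, since the per-record key overhead is paid once instead of N times.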

So <Identifier> '{' is about typed-objects.  This construct is not allowed in JSON.  But that then leads to <Identifier> '[' - typed arrays.  Arrays don't really have the redundant-data potential that objects do, but there are TypedArrays in ES.  There is no way to define the type of an array, so hardcoded types like 'ab', 'u8', and 'ref' are used to revive binary data.  The bytes of the backing ArrayBuffer are encoded to base64 and included within '[' and ']' without quotes, using the brackets as quotes.

One JSOX typed array is the 'ref' type: a reference to another location in the current object can be specified, which allows encoding cyclic structures.
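A minimal sketch of the typed-array encoding described above, assuming Node's `Buffer` for base64; the exact `u8[...]` wire shape follows the post's hardcoded type names, and the rest of the details are assumptions:

```javascript
// Encode a Uint8Array as base64 between '[' and ']', per the post.
function encodeU8(u8) {
  const b64 = Buffer.from(u8.buffer, u8.byteOffset, u8.byteLength).toString('base64');
  return `u8[${b64}]`; // the brackets act as the quotes
}

// Decode the bracketed base64 back into a Uint8Array.
function decodeU8(text) {
  const b64 = text.slice(text.indexOf('[') + 1, -1);
  return new Uint8Array(Buffer.from(b64, 'base64'));
}

const bytes = new Uint8Array([1, 2, 255]);
const encoded = encodeU8(bytes);
console.log(encoded);           // "u8[AQL/]"
console.log(decodeU8(encoded)); // Uint8Array(3) [ 1, 2, 255 ]
```

Since base64 never produces `[`, `]`, or `"`, the unquoted form stays unambiguous to a tokenizer.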




(Initial public reaction was not very helpful, but probably that's the fault of how it was introduced?)
https://www.reddit.com/r/javascript/comments/9f8wml/jsox_javascript_object_exchange_format_preview/

There was plenty of 'why not [YAML/BSON/protobufs/(I don't think anyone said XML)/...]', and the answer is simply: because none of those read JSON, or have as simple an API (amongst other reasons for which JSON is already a solution compared to those mentioned).
_______________________________________________
es-discuss mailing list
[hidden email]
https://mail.mozilla.org/listinfo/es-discuss


Re: Submitted for your approval, JSOX

J Decker
(trimmed)

On Wed, Sep 19, 2018 at 12:08 PM Mike Samuel <[hidden email]> wrote:


On Wed, Sep 19, 2018 at 12:01 PM J Decker <[hidden email]> wrote:

I know of no exploits; all resulting strings should be shorter than the input (because of escapes, \\ ).  The C version allocates an output buffer of the same size as the input and moves decoded strings into it.  Structure characters [ { } ] , " ' `  don't transfer either.

Not a vulnerability in your JSOX implementation per se, but have you looked into whether there's exploitable ambiguity between JSOX and runs of ES BlockStatements and ExpressionStatements?

JSON used to be vulnerable to cross-site snooping.

<script>// In attacker page
Array = function () { alert('Got ' + arguments[0]) };
</script>

Interesting; that applies to JSOX for Number, BigInt, Date, ....

A parenthesis (in the C version) faults while collecting an identifier, since it is a non-identifier character as defined by the Unicode standard (per the rules for an identifier in ES6).
That lookup was omitted in the JS implementation (it's a per-character search through several thousand values).

Parentheses are reserved for code (expressions, parameter specifications) and are (or should be) forbidden except in strings.
 

This allowed piggybacking on HTTP credentials if an attacker could get a victim to visit their page.

The problem was that the meaning of [...] and {...} were specified in terms of global.Array and global.Object
which could be replaced

That's been fixed, but JSOX should probably be careful about any ambiguity with BlockStatement.
IIUC,
  { keyword: [] }
is valid as a statement so there is some ambiguity there.

Then I see examples like
//-- the following...
a { firstField, secondField }
[ a { 1, 2 }, a{5,6}, a{"val1","val2"} ]
Yeah, it's tempting to type parentheses after an identifier (fixed above)
[ {firstField:1, secondField:2 }, {firstField:5,secondField:6}, {firstField:"val1",secondField:"val2"} ]
But that doesn't really generate any more data (similar strings get collapsed?); and on parsing, there's only one reference to 'firstField' and 'secondField'.  I was trying to ponder scenarios where the data grows unbounded, but even in a case where there's a reference like

[ {a:1, b:2 }, [ref[0],ref[0]], [ref[1],ref[1]], [ref[2],ref[2]] ]
[ {a:1,b:2}, [ {a:1, b:2}, {a:1,b:2} ], [ [{a:1, b:2}, {a:1,b:2}],[{a:1, b:2}, {a:1,b:2}] ], [ [ [{a:1, b:2}, {a:1,b:2}],[{a:1, b:2}, {a:1,b:2}] ], [ [{a:1, b:2}, {a:1,b:2}],[{a:1, b:2}, {a:1,b:2}] ] ] ]
But it's not really replicated data; in the metadata between parsing and object assembly it's an array of strings/numbers, and it resolves to pointers to existing data.
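The ref expansion above is exactly what plain JSON forces: a minimal demonstration that JSON.stringify writes shared objects out repeatedly and rejects cycles outright, which is what a 'ref' mechanism addresses:

```javascript
// Shared references: the same object is serialized three times.
const shared = { a: 1, b: 2 };
const doc = [shared, [shared, shared]];
const text = JSON.stringify(doc);
console.log(text); // '[{"a":1,"b":2},[{"a":1,"b":2},{"a":1,"b":2}]]'

// Cycles: JSON.stringify throws a TypeError.
const cyclic = { name: "root" };
cyclic.self = cyclic;
let threw = false;
try { JSON.stringify(cyclic); } catch (e) { threw = e instanceof TypeError; }
console.log(threw); // true: "Converting circular structure to JSON"
```

After parsing, a ref-aware decoder resolves each reference to a pointer at the existing value, so the duplication exists only in the text, not in memory.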


I haven't worked through your grammar, but I wonder whether a naive JSOX encoder might produce output like
    { looksLikeAStatementLabel: a{"val1", "val2"} }

(Yes, but not parentheses.  Because parens are not control characters, they end up being gatherable into identifiers.)

or
    a
    { onlyField }

The current parsing will drop 'onlyField' and result in {}.
It only 'pushes' the value into the container if there is a value.

It was previously a parsing error (no value for field; an 'expected ':' and a value' sort of thing), but I ran into '{}', which is a similar parsing state...

 
    [ a(5), a("val1") ]
allowing an attacker to do
    <script>
    let onlyField = null;
    function a(...data) {
      alert(`Got ${ data }`);
    }
    </script>
    <script src="http://other-origin/jsox-web-service"></script>

There's a lot of "ifs" in this scenario,
AND CORS solves a lot of these problems for origins that use it,
AND browsers are less trusting of script srcs with Content-Type: text/x-jsox than they were in 2008,
BUT
    // attacker setup
    let onlyField = null;
    function a(...data) {
      alert(`Got ${ data }`);
    }
    // victim responds
    a
    { onlyField }
    [ a(5), a("val1") ]
does alert twice in Chrome.  JSON hijacking was exploited in the wild, serializers have been known to
line-wrap in attacker-controllable ways, and there may still be many JSON web services that respect ambient
credentials on cross-origin requests.

In the first case, a(5) turns out to be a valid identifier, which is also sort of a string, and the second one would fault on finding a " in the middle of an identifier... string-string is never allowed... "a""b"; but I see... it does depend on how parsing is implemented; grabbing the values with a regexp could do that.
 

 
This does stick to JSON's spirit of only transporting data.  The parser is very similar to a JSON parser, except that many places which would previously throw are accepted...
And references can only link to other objects/arrays within the current outermost object/array.

 

This already happens with plain JSON, so anything that allows external inputs to specify which internal types to construct would have to include a "Security Considerations" section that explains how this could be safely used by code that assumes that `if (x instanceof InternalType)` then x came from internal code that made a good-faith effort to only pass appropriate inputs to `new InternalType(...)`.


Re: Submitted for your approval, JSOX

Mike Samuel


On Wed, Sep 19, 2018, 4:07 PM J Decker <[hidden email]> wrote:
(trimmed)


Interesting; that applies to JSOX for Number, BigInt, Date, ....

Parenthesis (in the C version) fault while collecting an identifier as being a non-identifier character as defined by Unicode Standards....  (as per rules of an identifier in ES6)
That lookup was omitted in the JS implementation.  (per character search through several thousand values.) 

Parenthesis is reserved for code, expressions, parameter specifications, and is (should be) forbidden except in strings.

My apologies.  I thought there were parentheses in the docs on npmjs, but seeing what I pasted from there on my phone, it's obvious that it's all curly brackets.

As long as your syntax doesn't include parentheses, dots, or backticks you're probably fine.

 


Re: Submitted for your approval, JSOX

Mike Samuel


On Wed, Sep 19, 2018, 4:41 PM Mike Samuel <[hidden email]> wrote:


My apologies.  I thought there were parentheses in the docs on npmjs but seeing what I pasted from there on my phone it's obvious that it's all curly brackets.

As long as your syntax doesn't include parentheses, dots, or backticks you're probably fine.

Though I could probably make hay with an output that includes the token pair  ] [ 


 


Re: Submitted for your approval, JSOX

J Decker


On Wed, Sep 19, 2018 at 1:46 PM Mike Samuel <[hidden email]> wrote:


On Wed, Sep 19, 2018, 4:41 PM Mike Samuel <[hidden email]> wrote:


On Wed, Sep 19, 2018, 4:07 PM J Decker <[hidden email]> wrote:
(trimmed)

On Wed, Sep 19, 2018 at 12:08 PM Mike Samuel <[hidden email]> wrote:


On Wed, Sep 19, 2018 at 12:01 PM J Decker <[hidden email]> wrote:

I know of no exploits; all resulting strings should be shorter than the input (because of escapes \\ ).  The C version allocates a output buffer that is the same size as the input, and moves decoded strings into it.  Structure characters [ { } ] , " ' `  don't transfer either.

Not a vulnerability in your JSOX implementation per se, but have you looked into whether there's exploitable ambiguity between JSOX and runs of ES BlockStatements and ExpressionStatements?

JSON used to be vulnerable to cross-site snooping.

<script>// In attacker page
Array = function () { alert('Got ' + arguments[0]) };
</script>

Interesting; that applies to JSOX for Number, BigInt, Date, ....

Parentheses (in the C version) fault while collecting an identifier, being non-identifier characters as defined by the Unicode standard (per the rules for an identifier in ES6).
That lookup was omitted in the JS implementation (it's a per-character search through several thousand values).

Parentheses are reserved for code (expressions, parameter specifications) and are (should be) forbidden except in strings.

My apologies.  I thought there were parentheses in the docs on npmjs but seeing what I pasted from there on my phone it's obvious that it's all curly brackets.

As long as your syntax doesn't include parentheses, dots, or backticks you're probably fine.

Though I could probably make hay with an output that includes the token pair  ] [

That could occur in a stream.  (Although if it's a stream I would expect it to come in on a websocket rather than any sort of request).... But

   someText{a,b,c}[1,2,3][1,2,3]

    [1][2] 

Those are valid streams of objects... How would  '][' be used?

I converted the non-identifier character test to a bit lookup and applied it in the JS parser (so characters like =, +, -, !, ~, (, ), <, >, ... are now rejected in unquoted contexts).  But speaking of quotes, a variant allowed is back-tick quoting (` `), without the template/code aspects that implies in ES6.  What about content like

{ asdf : "hello
world" }
(A literal \n is allowed to be collected, and/or \r), but JS would fault on a multiline string without an escaped continuation...

But I've been reflecting on something you said 'custom types'.
I'm thinking of implementing basically typed strings: <identifier> " ... "  (or like "abc""reconstructiondata"), and registering fromJSOX handlers on the parser.  That would look like parser.registerFromJSOX( "someType",  function (string) { /* use string to create a thing */ } ).
Types like 'color' might want to just emit as '#RRGGBBAA' with a toJSOX... but really be separate color channels internally.
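A rough sketch of what such a per-parser registry might look like. registerFromJSOX is the name floated above; the parse stand-in (a regexp that only matches <identifier>"...") is purely illustrative, not the real grammar:

```javascript
// Sketch of a per-parser registry for typed strings like color"#RRGGBBAA".
// The parse() body is a stand-in for the real tokenizer.
function makeParser() {
  const fromHandlers = new Map();
  return {
    registerFromJSOX(typeName, revive) {
      fromHandlers.set(typeName, revive);
    },
    // stand-in: only handles a single  <identifier>"<string>"  value
    parse(text) {
      const m = /^(\w+)"([^"]*)"$/.exec(text.trim());
      if (!m) throw new Error("unsupported in this sketch");
      const revive = fromHandlers.get(m[1]);
      if (!revive) throw new Error("unregistered type: " + m[1]);
      return revive(m[2]);
    },
  };
}

const parser = makeParser();
// revive '#RRGGBBAA' into separate color channels
parser.registerFromJSOX("color", (s) => ({
  r: parseInt(s.slice(1, 3), 16),
  g: parseInt(s.slice(3, 5), 16),
  b: parseInt(s.slice(5, 7), 16),
  a: parseInt(s.slice(7, 9), 16),
}));
const c = parser.parse('color"#FF8040C0"');
```

Keeping the handler map on the parser instance (rather than a global registry) also sidesteps the module-isolation concern raised later in this thread.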
 


 

This allowed piggybacking on HTTP credentials if an attacker could get a victim to visit their page.

The problem was that the meaning of [...] and {...} were specified in terms of global.Array and global.Object
which could be replaced

That's been fixed, but JSOX should probably be careful about any ambiguity with BlockStatement.
IIUC,
  { keyword: [] }
is valid as a statement so there is some ambiguity there.

Then I see examples like
//-- the following...
a { firstField, secondField }
[ a { 1, 2 }, a{5,6}, a{"val1","val2"} ]
Ya, it's tempting to type parentheses after an identifier (fixed above)
[ {firstField:1, secondField:2 }, {firstField:5,secondField:6}, {firstField:"val1",secondField:"val2"} ]
But that doesn't really generate any more data (similar strings get collapsed?), and on parsing there's only one reference to 'firstField' and 'secondField'... I was trying to ponder scenarios where the data grows unbounded... but even in a case where there's a reference like

[ {a:1, b:2 }, [ref[0],ref[0]], [ref[1],ref[1]], [ref[2],ref[2]] ]
[ {a:1,b:2}, [ {a:1, b:2}, {a:1,b:2} ], [ [{a:1, b:2}, {a:1,b:2}],[{a:1, b:2}, {a:1,b:2}] ], [ [ [{a:1, b:2}, {a:1,b:2}],[{a:1, b:2}, {a:1,b:2}] ], [ [{a:1, b:2}, {a:1,b:2}],[{a:1, b:2}, {a:1,b:2}] ] ] ]
But it's not really replicated data, in the meta data between parsing and object assembly, it's an array of strings/numbers; and resolves to pointers to existing data.


I haven't worked through your grammar, but I wonder whether a naive JSOX encoder might produce output like
    { looksLikeAStatementLabel: a{"val1", "val2"} }

(Yes, but not parentheses.  Because parens are not control characters, they end up being gathered into identifiers.)

or
    a
    { onlyField }

The current parsing will drop 'onlyField' and result in {}.
It only 'pushes' the value into the container if there is a value.

It was previously a parsing error (no value for field; an 'expected ':' and a value' sort of thing), but I ran into '{}', which is a similar parsing state...

 
    [ a(5), a("val1") ]
allowing an attacker to do
    <script>
    let onlyField = null;
    function a(...data) {
      alert(`Got ${ data }`);
    }
    </script>
    <script src="http://other-origin/jsox-web-service"></script>

There's a lot of "ifs" in this scenario,
AND CORS solves a lot of these problems for origins that use it
AND browsers are less trusting of script srcs with Content-Type: text/x-jsox than they were in 2008
BUT
    // attacker setup
    let onlyField = null;
    function a(...data) {
      alert(`Got ${ data }`);
    }
    // victim responds
    a
    { onlyField }
    [ a(5), a("val1") ]
does alert twice in Chrome.  JSON hijacking was exploited in the wild, serializers have been known to
line wrap in attacker-controllable ways, and there may still be many JSON webservices that respect ambient
credentials on cross-origin requests.

In the first case, a(5) turns out to be a valid identifier (which is also sort of a string), and the second one would fault on finding a " in the middle of an identifier; string-string ("a""b") is never allowed.  But I see... it does depend on how parsing is implemented; grabbing the values with a regexp could do that.
 

 
This does stick to JSON's spirit of only transporting data.  The parser is very similar to a JSON parser, except many places that would previously throw are accepted....
And references can only link to other objects/arrays within the current outermost object/array.

 

This already happens with plain JSON, so anything that allows external inputs to specify which internal types to construct would have to include a "Security Considerations" section that explains how this could be safely used by code that assumes that `if (x instanceof InternalType)` then x came from internal code that made a good-faith effort to only pass appropriate inputs to `new InternalType(...)`.

On Tue, Sep 18, 2018 at 5:22 PM J Decker <[hidden email]> wrote:
(Thank you Rod Sterling)

But seriously, I'd like to submit, for serious consideration, JSOX - JavaScript Object eXchange format.  It inherits all JSON syntax such that it is able to process any existing JSON.  

I'm, at this point, open to changing anything (or even omitting things), including the name.

JSON is great.  JSON has some limits, and criticisms... JS/ES grew, but JSON has to stay the same; I'd imagine the same goes for whatever comes next.

So a primary goal is to encode and decode ES6 objects for transport with a simple API such as JSOX.parse( jsoxString ) and JSOX.stringify( object ), while keeping the simplicity of JSON
so it can be used in human-readable circumstances.

Types that are now (or soon will be) native to ES, such as TypedArrays (binary data), BigInt, and even the existing Date type, do not transport well with JSON.  They become a non-identifiable string, and restoring the values to Date(), BigInt(), et al. requires extra code with knowledge of the structure of the data being transferred.
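For illustration, the loss being described is easy to reproduce with plain JSON:

```javascript
// Round-tripping through JSON turns a Date into an unidentifiable
// string, and BigInt cannot be serialized at all.
const before = { created: new Date("2018-09-18T21:07:00Z") };
const after = JSON.parse(JSON.stringify(before));
// after.created is now just an ISO string; the Date type is gone

let bigintFails = false;
try {
  JSON.stringify({ n: 10n });
} catch (e) {
  bigintFails = true; // TypeError: BigInt is not JSON-serializable
}
```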

So a few weeks ago I started considering what else, beyond these simple modifications, might also be useful or address criticisms of JSON.  Handling the above types is really a trivial modification to most JSON parsers.  Each of the following modifications is really only a very slight change to behavior, although implementing typed-objects does initially involve changing error handling into identifier-fallback handling.

I initially argued that defining an object prototype 'card(name,address,zipcode,created)' removes the redundant data for every following reference (and is good just for data reduction, against which 'just use gzip' was argued).  A JSON representation might be `{"name":"bob","address":"123 street","zipcode":"55555","created":1537304820}`, where, if you have a large number of the same record, the same "name":, "address":, etc. is repeated in every record.  A typed-object's value in JSOX could instead be `card{:"bob","123 street","55555",2018-09-18T21:07:00Z}`.  All objects that are revived as typed-objects share the same prototype, and the prototypes to be used may be specified before parsing.  The amount of data to process is reduced, perhaps to a significant degree.
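A rough way to see the redundancy argument in plain JS (sizes here are illustrative, not a benchmark): encoding the same records positionally, as a JSOX typed-object would, drops the repeated keys.

```javascript
// With many records, the repeated "name"/"address"/"zipcode" keys
// dominate the plain-JSON encoding.
const record = { name: "bob", address: "123 street", zipcode: "55555" };
const records = new Array(1000).fill(record);
const asJson = JSON.stringify(records);

// The same values encoded positionally, roughly what a typed-object
// (with the field names declared once up front) buys you.
const positional = JSON.stringify(
  records.map((r) => [r.name, r.address, r.zipcode])
);
// positional is substantially shorter than asJson
```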

So <Identifier> '{' is about typed-objects; this construct is not allowed in JSON.  That then leads to <Identifier> '[' for typed arrays.  Arrays don't really have the redundant-data potential that objects do, but there are TypedArrays in ES.  There is no way to define the type of an array, so hardcoded types like 'ab', 'u8', and 'ref' are used to revive binary data.  The bytes of the backing ArrayBuffer are encoded to base64 and included between '[' and ']' without quotes, using the brackets as quotes.

Another JSOX typed array is the 'ref' type: a reference to another location in the current object can be specified, which allows encoding cyclic structures.
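For illustration, plain JSON simply cannot encode such structures:

```javascript
// JSON.stringify throws on cycles; a back-reference mechanism like
// JSOX's 'ref' typed array is one way to make them encodable.
const node = { name: "root", self: null };
node.self = node; // cyclic reference

let cycleFails = false;
try {
  JSON.stringify(node);
} catch (e) {
  cycleFails = true; // TypeError: Converting circular structure to JSON
}
```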





Re: Submitted for your approval, JSOX

Mike Samuel


On Thu, Sep 20, 2018 at 12:48 PM J Decker <[hidden email]> wrote:


That could occur in a stream.  (Although if it's a stream I would expect it to come in on a websocket rather than any sort of request).... But

   someText{a,b,c}[1,2,3][1,2,3]

    [1][2] 

Those are valid streams of objects... How would  '][' be used?

<!-- alerts "Intercepted [1,2,3]" -->
<script>
// Attacker setup
let someText, a, b, c;
Object.defineProperty(
  Array.prototype, 3,
  {
    get() {
      alert(`Intercepted ${ JSON.stringify(this) }`)
    }
  });
</script>
<script>
// Loaded cross origin from victim
someText
{a,b,c}[1,2,3][1,2,3]
</script>

If I own an origin and load the victim's JSON cross-origin, I can use getters on Object and Array.prototype to get any object that is square-bracket dereferenced.

You might notice that my setup declares variables to avoid "Undefined reference to someText" errors.  This could be mitigated by adding an unpredictable field name to the first type definition for every response so that the attacker always gets an "Undefined reference error".

someText
{a,b,c,R4nD0m}[1,2,3][1,2,3]

is not vulnerable since, although it is well-formed JS, evaluation fails before the intercepted array is constructed.


I converted the non-identifier character test to a bit lookup and applied it in the JS parser (so characters like =, +, -, !, ~, (, ), <, >, ... are now rejected in unquoted contexts).  But speaking of quotes, a variant allowed is back-tick quoting (` `), without the template/code aspects that implies in ES6.  What about content like

{ asdf : "hello
world" }
(A literal \n is allowed to be collected, and/or \r), but JS would fault on a multiline string without an escaped continuation...

But I've been reflecting on something you said 'custom types'.
I'm thinking of implementing basically typed strings: <identifier> " ... "  (or like "abc""reconstructiondata"), and registering fromJSOX handlers on the parser.  That would look like parser.registerFromJSOX( "someType",  function (string) { /* use string to create a thing */ } ).
Types like 'color' might want to just emit as '#RRGGBBAA' with a toJSOX... but really be separate color channels internally.

I don't see any immediate security consequences to custom literals.  People tend not to put side-effects in literal-ish types' constructors.
I would hope that developers would know to treat literal-ish types like github.com/WICG/trusted-types with suspicion if they travel across a security boundary, or implement some signature-checking scheme; there's no novel risk in unwisely registering a type to deserialize via JSOX versus via JSON revivers.

If you're already against global registries, please ignore the rest of this comment.

Since JavaScript is now used for large systems with many modules from different authors, it helps to be able to scope things to a module.

It's much harder to build secure systems when we can't reason about security properties of modules in isolation.
When I, as a security reviewer, encounter a module that uses JSOX, I might enumerate the types it deserializes and check that it vets those before making auth decisions based on their content.
But if an application loads another module alongside the first which registers a global JSOX handler, that reasoning may no longer be valid, since an input pipe to the first module may now include objects of types its authors didn't foresee.
That means I have to treat any uses of registerFromJSOX that affect parsers globally as a system-level hazard, not just a module-level hazard.
TLDR: many interesting security properties depend on human judgement; humans can't do whole program analysis for most programs, so global registries are effectively open sets; open sets complicate conservative analyses which are often necessary for sound security reasoning.

https://github.com/mikesamuel/unduck/blob/HEAD/API.md (explainer) addresses the flip side of some of JSOX, and I managed to do without global registries.
There I used a composition pattern that brings a base object with an empty registry into scope, and a registration method that returns a copy with a larger registry.
{
    let ud = require('unduck')
    ud = ud.withTypes({ /* type description */ })
    // more of the same

    // Alternatively
    // let ud = require('unduck')
    //    .withTypes(...)
    //    .withTypes(...);

    // Apply
    ud(/* untrusted input */)
}




 
 


 
