ECMAScript feature suggestion: Streaming Array items through filter/map/reduce functions


ECMAScript feature suggestion: Streaming Array items through filter/map/reduce functions

Roma Bronstein
Hi,

This is my first time suggesting a feature, so I hope I'm doing it correctly.

I really like using Array.prototype.map(), Array.prototype.reduce() and the related functions: the code is more readable and looks better.
However, when I want to write performance-sensitive code, chaining these functions is not a good approach.
For example, writing this:
// a is an Array of length N
const b = a.filter().map()

will require 2 traversals over the whole array, up to 2*N iterations (if the filter passes all items).

This is why I often resort to writing this:
const b = []
a.forEach((item) => {
  if (/* the filter condition */)
    b.push(/* mapping logic applied to item */)
})

Which requires only N iterations.
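
To make the difference concrete, here is a minimal sketch (the predicate and mapping are made-up examples: keep the even numbers and double them); the single-pass version traverses the source once and allocates no intermediate array:

// Chained version: two passes over `a` plus an intermediate array from filter().
const evensDoubledChained = a.filter(x => x % 2 === 0).map(x => x * 2)

// Single-pass version: one traversal, no intermediate array.
const evensDoubled = []
a.forEach(x => {
  if (x % 2 === 0) evensDoubled.push(x * 2)
})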

I suggest adding a capability to stream items through these functions.
My inspiration is Redis's transaction syntax, where you declare the start of a transaction and finally call EXEC to execute it.
So I'd be able to write something like this:
const b = a.stream()
  .filter()
  .map()
  .exec()

Just to clarify the example:
I've declared that I'd like to stream the items of a. Then I've chained the functions I'd like the items to pass through.
Finally, I've executed the pipeline using the exec() function.
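
As a rough illustration of the intended semantics only (not the proposed implementation), the same shape can be approximated in userland today with generators; stream, filter, map and exec below are hypothetical names mirroring the example above:

function stream(iterable) {
  let iter = iterable[Symbol.iterator]()
  const api = {
    filter(pred) {
      // Wrap the current iterator in a lazy filtering stage.
      iter = (function* (src) { for (const x of src) if (pred(x)) yield x })(iter)
      return api
    },
    map(fn) {
      // Wrap the current iterator in a lazy mapping stage.
      iter = (function* (src) { for (const x of src) yield fn(x) })(iter)
      return api
    },
    exec() {
      // Drain the pipeline: one pass over the source, all stages applied per item.
      return [...iter]
    }
  }
  return api
}

// Usage: one traversal of `a`, no intermediate arrays.
const b = stream(a).filter(x => x % 2 === 0).map(x => x * 2).exec()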

I'm not sure this is the best syntactic approach, but in my opinion this example is more intuitive to understand.

Another approach could be a "pipeline" operator, like in the UNIX CLI, providing a more generic capability for pipelining iterators.

Again, I hope I'm doing this correctly and in the right forum.
And if so, I'd be happy to hear some feedback.

Thanks,
Roma





Re: ECMAScript feature suggestion: Streaming Array items through filter/map/reduce functions

Oliver Dunk
This seems like a good place to share the idea, and it’s helpful that you provided use cases etc.

Is there a reason you prefer the proposed syntax over the forEach loop you mentioned? Personally I like how easy the forEach is to understand, but maybe there are other examples where the stream is useful and a polyfill would be much more complex.


Re: ECMAScript feature suggestion: Streaming Array items through filter/map/reduce functions

Roma Bronstein
Thanks Oliver for the quick response.

The problem for me with forEach is that it's pretty much like a for loop: anything can go inside the iteration logic.
With filter/map/reduce/some/every, however, the intention is explicit.
You also don't need to re-implement the basic array operations that those functions already provide.

For instance, in my opinion, writing something like:
a.filter()
  .map()
  .reduce()

is much clearer and safer than:

let reducedValue
a.forEach(item => {
  if (!filterCondition(item)) return
  const mappedItem = mappingLogic(item)
  reducedValue = reduceLogic(reducedValue, mappedItem)
})
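
As a concrete sketch (hypothetical task: sum the squares of the even numbers), the two styles side by side; the forEach version has to hand-roll the accumulation that reduce would otherwise handle:

// Chained version: intent is explicit, but it builds two intermediate arrays.
const sumOfEvenSquares = a
  .filter(x => x % 2 === 0)
  .map(x => x * x)
  .reduce((acc, x) => acc + x, 0)

// Single-pass version: one traversal, but the filter/map/reduce roles are implicit.
let sum = 0
a.forEach(x => {
  if (x % 2 !== 0) return
  sum += x * x
})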




Re: ECMAScript feature suggestion: Streaming Array items through filter/map/reduce functions

Gus Caplan
I always forget to reply-all :)

---------- Forwarded message ---------
From: Gus Caplan <[hidden email]>
Date: Fri, Jun 21, 2019, 16:34
Subject: Re: ECMAScript feature suggestion: Streaming Array items through filter/map/reduce functions
To: Roma Bronstein <[hidden email]>


I'm working on a proposal that adds generalized iteration methods (most of which are lazy). https://github.com/tc39/proposal-iterator-helpers

I believe this would solve your problem.

-Gus
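
For reference, a sketch of what the original filter/map example might look like under that proposal, assuming the method names in the proposal README (values() yields an array iterator, filter and map are lazy iterator methods, and toArray collects the result in a single pass):

const b = a.values()
  .filter(x => x % 2 === 0)  // hypothetical predicate
  .map(x => x * 2)           // hypothetical mapping
  .toArray()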


Re: ECMAScript feature suggestion: Streaming Array items through filter/map/reduce functions

Roma Bronstein
Thanks Gus.

I'm not sure I fully understand the proposal docs (it's the first time I'm reading something like this, OMG...).
Can you please show a quick example of how you would stream an array through filter, map and then reduce?

Thanks,
Roma



Re: ECMAScript feature suggestion: Streaming Array items through filter/map/reduce functions

Bergi
In reply to this post by Roma Bronstein
Hi!

> However, when I want to write performance sensitive code, chaining these
> functions is not a good approach.
> const b = a.filter().map()
>
> will require 2 traversals over the whole array, up to 2*N iterations (if
> the filter passes all items).

Actually, the number of passes hardly matters. It's still linear
complexity. What makes this slow is the allocation of the unnecessary
temporary array.

> I suggest adding a capability to streamline items to these functions.

We don't need streams; JavaScript already has iterators. What we do need
are proper helper functions for those - see the existing proposal at
<https://github.com/tc39/proposal-iterator-helpers>. You can then write

     const b = Array.from(a.values().filter(…).map(…))

or

     for (const x of a.values().filter(…).map(…))
         console.log(x);

kind regards,
  Bergi
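
For anyone who wants the lazy, single-pass behaviour before such a proposal ships, a minimal sketch of the same idea with plain generator functions (filterIter and mapIter are hypothetical helpers, not part of any standard; `a` is the source array from the thread):

function* filterIter(src, pred) {
  for (const x of src) if (pred(x)) yield x
}

function* mapIter(src, fn) {
  for (const x of src) yield fn(x)
}

// One traversal of `a`, with no temporary array between the stages.
const b = Array.from(mapIter(filterIter(a, x => x % 2 === 0), x => x * 2))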

Re: ECMAScript feature suggestion: Streaming Array items through filter/map/reduce functions

Roma Bronstein
Thanks for the reply Bergi.

Correct me if I'm wrong, but writing something like:
const b = Array.from(a.values().filter(…).map(…))

still requires 2 iterations over the array.
You're theoretically right that the running-time complexity is still linear.
However, with the ability to run async functions within each iteration, in real-life production code there's a significant difference between running 1M iterations and ~2M (an iteration can involve a call to a database or a third-party API).

I'd be happy to hear more from Gus about how his proposal would deal with the scenario I've described.

Thanks,
Roma
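
For what it's worth, a quick sketch that can be run today to check the traversal count with a lazy, generator-based pipeline (the filterIter/mapIter helpers are the same hypothetical ones sketched earlier, and the sample array is made up); each element is pulled through both stages in one traversal of the source:

let visits = 0

// Wraps an array so we can count how many times elements are read from it.
function* source(arr) {
  for (const x of arr) { visits++; yield x }
}

function* filterIter(src, pred) { for (const x of src) if (pred(x)) yield x }
function* mapIter(src, fn) { for (const x of src) yield fn(x) }

const sample = [1, 2, 3, 4, 5]
const result = Array.from(
  mapIter(filterIter(source(sample), x => x % 2 === 0), x => x * 2)
)

console.log(result)  // [4, 8]
console.log(visits)  // 5: each element is read from the source exactly once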
