Concurrency support?

Concurrency support?

Neil Mix
Hi all,

I've been reading the es4 pages and I'm really excited by the  
proposed language enhancements.  I'm looking forward to writing great  
software with this new language version.

I'm a little disappointed to see that there are no plans to address  
concurrent programming.  Is there any reason that concurrency is not  
specifically being addressed at this time?

FWIW, concurrency support (or more specifically, the ability to yield  
while waiting for an event) is easily the top item in my list of  
enhancement requests for the JavaScript language.  I've been building  
large-ish client-side "AJAX" applications for several years now, and  
I find it painfully difficult to debug asynchronous callbacks and  
read asynchronous code.  My wife, in particular, would like to see  
concurrency in JavaScript so that she might have normal conversations  
with me at the end of my workday.

Any chance that concurrency support can be put (or already is) on the  
radar?

   -Neil



Re: Concurrency support?

Brendan Eich-2
On Jun 25, 2006, at 6:30 PM, Neil Mix wrote:

> FWIW, concurrency support (or more specifically, the ability to  
> yield while waiting for an event) is easily the top item in my list  
> of enhancement requests for the JavaScript language.  I've been  
> building large-ish client-side "AJAX" applications for several  
> years now, and I find it painfully difficult to debug asynchronous  
> callbacks and read asynchronous code.  My wife, in particular,  
> would like to see concurrency in JavaScript so that she might have  
> normal conversations with me the end of my workday.

See http://developer.mozilla.org/es4/proposals/iterators_and_generators.html,
apologies for the gaps there.  They will be filled in shortly.

The idea in emulating Python 2.5 generators is that you can coroutine  
your code -- you can use a standard Ajax library to map a single  
function that contains yield expressions across a series of  
asynchronous callbacks, without having to break that function up into  
a bunch of callback functions.
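
Roughly, the shape of it, sketched in Python since that's the model
being emulated (the async_fetch callback API below is invented purely
for illustration -- substitute your Ajax library's completion callback):

    import threading

    def async_fetch(url, callback):
        # Stand-in for an XMLHttpRequest-style call: the callback fires
        # later, from some event source, with the response.
        threading.Timer(0.01, callback, args=("response for " + url,)).start()

    def drive(gen):
        # Resume the generator with each async result, so its body reads
        # like straight-line code even though it spans several callbacks.
        def step(value=None):
            try:
                url = gen.send(value)   # generator asks for the next resource
            except StopIteration:
                return
            async_fetch(url, step)      # resume it when the response arrives
        step()

    def handler():
        a = yield "http://example.com/a"   # looks synchronous, but suspends here
        b = yield "http://example.com/b"
        print(a, "then", b)

    drive(handler())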

/be



Re: Concurrency support?

Chris Double
In reply to this post by Neil Mix
> Any chance that concurrency support can be put (or already is) on the
> radar?

With the generator support being discussed for Javascript it seems it
would be possible to build a simple lightweight thread library, as per
this Python approach:

http://www-128.ibm.com/developerworks/linux/library/l-pythrd.html

It's not quite as flexible as Narrative JavaScript's continuation-based
approach, since the 'yield' can only be done within the generator
function itself, not in functions called by the generator function.
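
Something along these lines, roughly (a minimal sketch in Python; the
scheduler and worker names are made up for illustration):

    from collections import deque

    def scheduler(tasks):
        # Round-robin over a set of generators: each top-level yield is a
        # voluntary "time slice" boundary.
        ready = deque(tasks)
        while ready:
            task = ready.popleft()
            try:
                next(task)            # run the task until its next yield
            except StopIteration:
                continue              # task finished; drop it
            ready.append(task)

    def worker(name, count):
        for i in range(count):
            print(name, i)
            yield                     # the yield must sit here, in the generator
                                      # body itself, not in a helper it calls

    scheduler([worker("a", 3), worker("b", 2)])
    # prints a 0, b 0, a 1, b 1, a 2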

I tried to do an example similar to my lightweight threads example [1]
using JS 1.7's generators but struck that issue. I'm seeing if I can
work around it to get the example running using generators and yield.

[1] http://www.bluishcoder.co.nz/2006/06/more-concurrency-in-narrative.html

Chris.
--
http://www.bluishcoder.co.nz


Re: Concurrency support?

Nicolas Cannasse
In reply to this post by Brendan Eich-2
>>FWIW, concurrency support (or more specifically, the ability to  
>>yield while waiting for an event) is easily the top item in my list  
>>of enhancement requests for the JavaScript language.  I've been  
>>building large-ish client-side "AJAX" applications for several  
>>years now, and I find it painfully difficult to debug asynchronous  
>>callbacks and read asynchronous code.  My wife, in particular,  
>>would like to see concurrency in JavaScript so that she might have  
>>normal conversations with me the end of my workday.
>
>
> See http://developer.mozilla.org/es4/proposals/ 
> iterators_and_generators.html, apologies for the gaps there.  They  
> will be filled in shortly.
>
> The idea in emulating Python 2.5 generators is that you can coroutine  
> your code -- you can use a standard Ajax library to map a single  
> function that contains yield expressions across a series of  
> asynchronous callbacks, without having to break that function up into  
> a bunch of callback functions.
>
> /be

What about full continuations support à la call/cc? Generators are just
a specific application of continuations, which are much more powerful
when freely usable.

Nicolas



Re: Concurrency support?

Lars T Hansen
Nicolas Cannasse writes:
 > >>FWIW, concurrency support (or more specifically, the ability to  
 > >>yield while waiting for an event) is easily the top item in my list  
 > >>of enhancement requests for the JavaScript language.  I've been  
 > >>building large-ish client-side "AJAX" applications for several  
 > >>years now, and I find it painfully difficult to debug asynchronous  
 > >>callbacks and read asynchronous code.  My wife, in particular,  
 > >>would like to see concurrency in JavaScript so that she might have  
 > >>normal conversations with me the end of my workday.
 > >
 > >
 > > See http://developer.mozilla.org/es4/proposals/ 
 > > iterators_and_generators.html, apologies for the gaps there.  They  
 > > will be filled in shortly.
 > >
 > > The idea in emulating Python 2.5 generators is that you can coroutine  
 > > your code -- you can use a standard Ajax library to map a single  
 > > function that contains yield expressions across a series of  
 > > asynchronous callbacks, without having to break that function up into  
 > > a bunch of callback functions.
 > >
 > > /be
 >
 > What about full continuations support à la call/cc ? Generators are just
 > a specific application of continuations, which are much more powerful
 > when freely usable.

Full continuations interact in surprising ways with side effects.
Slipping into Scheme for a moment, the issue is exemplified by this
implementation of the map function:

    (define (map f xs)
      (let loop ((acc '()) (xs xs))
        (if (null? xs)
            (reverse! acc)
            (loop (cons (f (car xs)) acc) (cdr xs)))))

Suppose f captures its continuation and that code outside the call to
map invokes that continuation later.  Then the result returned from
the first call to map will be observably changed by this invocation:

  # (map (lambda (x) (+ x 1)) (list 1 2 3))
  (2 3 4)
  # (define (fn x)
      (if (= x 2)
          (set! c (call-with-current-continuation
                    (lambda (k) k))))
      (+ x 1))
  # (define v (map fn (list 1 2 3)))
  # v
  (2 3 4)
  # (c 4)
  # v
  (4 3 2 3 4)

This is surprising in the sense that this is not really the behavior
you expect from map, and indeed an implementation of map that does not
share structure between its intermediate data and returned data
behaves more as expected.  It just needs to be constructed more
carefully, and will cons twice as much (or use O(n) stack).

It's not obvious that this has huge implications for the standard
libraries in ECMAScript as they stand, but this type of problem will
tend to make libraries more brittle in general.

The point I want to make is that we probably do not want to provide
full continuations as an abstraction on which to build threads,
coroutines, generators, exceptions, and so on, but instead to provide
these other more controllable forms directly instead.

--lars



Re: Concurrency support?

Brendan Eich-2
On Jun 26, 2006, at 4:09 AM, Lars T Hansen wrote:

> The point I want to make is that we probably do not want to provide
> full continuations as an abstraction on which to build threads,
> coroutines, generators, exceptions, and so on, but instead to provide
> these other more controllable forms directly instead.

Thanks for the complete example.  I wanted to back up Lars here by  
affirming that we've discussed call/cc in TG1 meetings and come to  
this conclusion, already.  So this is a consensus position, as far as  
I know.

Rhino (http://www.mozilla.org/rhino/) has a call/cc implementation  
from the Apache Cocoon folks, exposed as a Continuation object.  
Chris Double knows this well, including a bug
(http://www.bluishcoder.co.nz/2006/03/javascript-partial-continuations.html).

So far, the bugs or limitations combined with the promise of call/cc  
generality seem to me to indicate overkill, or overreach.  If anyone  
on this list has an example use-case of call/cc in JS that you think  
is important, and that can't be mapped to coroutine-generators  
(Python 2.5, PEP 342), please post it.

/be



Re: Concurrency support?

Nicolas Cannasse
In reply to this post by Lars T Hansen
> It's not obvious that this has huge implications for the standard
> libraries in ECMAScript as they stand, but this type of problem will
> tend to make libraries more brittle in general.
>
> The point I want to make is that we probably do not want to provide
> full continuations as an abstraction on which to build threads,
> coroutines, generators, exceptions, and so on, but instead to provide
> these other more controllable forms directly instead.

I agree that call/cc is not for the average user, and should be used
with the appropriate care. However, while it's important to provide the
different call/cc traditional usages (which you listed), it's also
important to provide the low-level call/cc access for users that want to
use it.

There's been a lot of creativity in the past years about the way of
using continuations. I think we can expect new paradigms in the future
as well. As a compiler writer, I prefer to be able to directly use
call/cc in the generated code and libraries instead of rewrapping and
sometimes abusing the standard ES4 library.

Nicolas


Re: Concurrency support?

Brendan Eich-2
On Jun 26, 2006, at 9:53 AM, Nicolas Cannasse wrote:

> There's been a lot of creativity in the past years about the way of
> using continuations. I think we can except new paradigms in the future
> as well. As a compiler writer, I prefer to be able to directly use
> call/cc in the generated code and libraries instead of rewrapping and
> sometimes abusing the standard ES4 library.

This gets to the heart of an issue that Nicolas has discussed with me  
in private email: is ES4/JS2 likely to be a high-level language used  
by programmers who write large parts of their programs by hand, or is  
it likely to be a mid-level or "variable level" safe target language  
whose programs compilers generate based on other inputs (haXe, Google  
GWT, etc.)?

It's a good question, and I don't have an uncracked crystal ball.

I have a feeling that even with the rise of compilers (which our  
introduction of a new version of ES/JS may cause to become even more  
common), "a lot of" future ES4 code will be written by hand, and by  
people who do not need full call/cc, and who probably should not have  
access to it.

/be



Re: Concurrency support?

Lars T Hansen
In reply to this post by Nicolas Cannasse
Nicolas Cannasse writes:
 > > It's not obvious that this has huge implications for the standard
 > > libraries in ECMAScript as they stand, but this type of problem will
 > > tend to make libraries more brittle in general.
 > >
 > > The point I want to make is that we probably do not want to provide
 > > full continuations as an abstraction on which to build threads,
 > > coroutines, generators, exceptions, and so on, but instead to provide
 > > these other more controllable forms directly instead.
 >
 > I agree that call/cc is not for the average user, and should be used
 > with the appropriate care. However, while it's important to provide the
 > different call/cc traditional usages (which you listed), it's also
 > important to provide the low-level call/cc access for users that want to
 > use it.
 >
 > There's been a lot of creativity in the past years about the way of
 > using continuations. I think we can except new paradigms in the future
 > as well. As a compiler writer, I prefer to be able to directly use
 > call/cc in the generated code and libraries instead of rewrapping and
 > sometimes abusing the standard ES4 library.

I don't think that being a compiler target is among the core use cases
for ECMAScript 4.  (Nor do I think it should be.)  I also don't think
that a pragmatic programming language like ECMAScript should follow
Scheme, say, in providing a minimalist and powerful core on which a
great variety of abstractions can be written by the initiated.
Instead I think ECMAScript needs to be a rich language with features
that are immediately useful to its core audiences, and which play well
with each other in the sense that they have unsurprising consequences
when composed.

--lars



Re: Concurrency support?

John Cowan
In reply to this post by Brendan Eich-2
Brendan Eich scripsit:

> So far, the bugs or limitations combined with the promise of call/cc  
> generality seem to me to indicate overkill, or overreach.  If anyone  
> on this list has an example use-case of call/cc in JS that you think  
> is important, and that can't be mapped to coroutine-generators  
> (Python 2.5, PEP 342), please post it.

I believe that the facilities of PEP 342, while necessary, are
insufficient, as they do not allow subroutines invoked by a coroutine
to yield for it, where some of the subroutines on the dynamic chain are
coroutine-blind (or if it does, it's too subtle for me to see how).

Lua (http://www.lua.org ) provides a particular flavor of coroutines
as its sole nonlinear control abstraction.  Lua coroutines unite
generators, user-level multitasking, and backtracking into a single
fairly straightforward construct; I urge that they be considered.

Lua coroutines are asymmetric; that is, each coroutine returns only
to its caller, not to some arbitrary coroutine (though a coroutine
trampoline can overcome this restriction).  Furthermore, a yield
can happen after arbitrarily many nested subroutine calls within
a coroutine.  There's a short paper, which I strongly recommend, at
http://www.inf.puc-rio.br/~roberto/docs/corosblp.pdf that explains the
issues, the Lua coroutine syntax, a formal semantics, and points to some
interesting literature, including a demonstration that coroutines are
equivalent to one-shot delimited continuations; as is well known, call/cc
cannot provide arbitrary delimited continuations without recompiling
all uses.

Lua uses a slick implementation of its coroutines in pure C: Lua is
properly tail-recursive, and each Lua coroutine keeps its own stack
in the heap, but since coroutine invocation is entirely stack-like,
invocation is a recursive call into the Lua interpreter, and yield is a
return from the interpreter.  The only restriction is that a C routine
called from Lua cannot yield.

--
You escaped them by the will-death              John Cowan
and the Way of the Black Wheel.                 [hidden email]
I could not.  --Great-Souled Sam                http://www.ccil.org/~cowan


Re: Concurrency support?

Neil Mix
On Jun 26, 2006, at 9:41 AM, John Cowan wrote:

> I believe that the facilities of PEP 342, while necessary, is
> insufficient, as it does not allow subroutines invoked by a coroutine
> to yield for it, where some of the subroutines on the dynamic chain  
> are
> coroutine-blind (or if it does, it's too subtle for me to see how).
>
> Lua (http://www.lua.org ) provides a particular flavor of coroutines
> as its sole nonlinear control abstraction.  Lua coroutines unite
> generators, user-level multitasking, and backtracking into a single
> fairly straightforward construct; I urge that they be considered.

+1

I'd also point out that more powerful coroutines would be useful  
whether or not JavaScript becomes a target language for compilers.  
For hand-coding it makes lots of operations easier to read and  
debug.  For compiler targeting, it would allow compilers to  
instrument code for interactive debugging against the original source.



Re: Concurrency support?

Graydon Hoare-3
In reply to this post by John Cowan
John Cowan wrote:

> I believe that the facilities of PEP 342, while necessary, is
> insufficient, as it does not allow subroutines invoked by a coroutine
> to yield for it, where some of the subroutines on the dynamic chain are
> coroutine-blind (or if it does, it's too subtle for me to see how).

I've stared at PEP 342 for an hour now and cannot exactly tell.

It clearly points out this problem in the second paragraph of its
"motivation" section:

     Also, generators cannot yield control while other functions are
     executing, unless those functions are themselves expressed as
     generators, and the outer generator is written to yield in response
     to values yielded by the inner generator.

I *think* the proposed solution is in the 3rd paragraph:

     a simple co-routine scheduler or "trampoline function" would
     let coroutines "call" each other without blocking

But I'm having a hard time picturing the meaning of that, and how it
addresses the problem. I think it means that the problem is not going to
be addressed directly, but indirectly. Let's work through an example,
say a network server:

def http_service_loop():
     while True:
       s = socket.accept()
       http_serve_connection(s)

def http_serve_connection(s):
     req = http_read_requests(s)
     f = filesystem.load_file(req.filename)
     s.write(f.data())

def http_read_requests(s):
     buf = s.readline()
     ...

Suppose we want this to yield any time it does something that might
block on i/o, so inside the OS-level accept, read, load, and write
methods. How does PEP 342 recommend we rewrite this?

I *think* it says that you must still structure all the functions
containing generators *as* generators, but that the yields you sprinkle
all over the intermediate calls can have yield-expression results fed
back into them by an outer "trampoline" function. So I think it says we
rewrite as such:

def http_service_loop():
     while True:
       s = yield socket.accept()
       yield http_serve_connection(s)

def http_serve_connection(s):
     req = yield http_read_requests(s)
     f = yield filesystem.load_file(req.filename)
     yield s.write(f.data())

def http_read_requests(s):
     buf = yield s.readline()
     ...

Or something; I surely am getting the notation they have in mind wrong.
But I think the idea is that there's to be an outer function that does
something like this:

def trampoline():
    x = http_service_loop()
    try:
        y = x.send(None)
        while True:
            do_some_other_work_multiplexed_with_the_server_io()
            y = x.send(y)
    except StopIteration:
        pass

stepping the coroutine through its work by acting as a sort of auxiliary
return slot. And this would let you -- with some more code -- similarly
multiplex N service loops together, keeping track of the next value to
feed back into each as it's re-scheduled (putting aside the issue of a
call to sleep-until-one-of-these-io-channels-has-an-event).
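
For concreteness, a runnable trampoline in that style can be put together
along these lines (a sketch only: the Return wrapper and the echoed string
"I/O requests" are conventions invented here, not anything PEP 342 itself
specifies):

    import types

    class Return(object):
        # Convention for this sketch: a coroutine yields Return(v) to hand
        # a value back to its caller (in the 2.5 model, generators cannot
        # return values directly).
        def __init__(self, value):
            self.value = value

    def run(task):
        # A yielded generator is treated as a sub-coroutine "call"; a yielded
        # Return pops back to the caller; any other value stands in for a
        # blocking operation and is simply echoed back as its "result".
        stack = [task]        # simulated call stack of suspended generators
        value = None
        while stack:
            try:
                result = stack[-1].send(value)
            except StopIteration:
                stack.pop()
                value = None
                continue
            if isinstance(result, types.GeneratorType):
                stack.append(result)   # descend into the "called" coroutine
                value = None
            elif isinstance(result, Return):
                stack.pop()            # "return" to the caller with a value
                value = result.value
            else:
                value = result         # pretend the blocking op completed

    def read_request(sock):
        line = yield "readline from " + sock   # stand-in for blocking I/O
        yield Return(line.upper())

    def serve(sock):
        req = yield read_request(sock)         # sub-coroutine "call"
        yield "write " + req + " back to " + sock

    run(serve("sock0"))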

If this is what PEP 342 is proposing, then I must admit the lua strategy
seems much more appealing: make any "yield" expression return control to
the nearest dynamic "resume". That would let the low level i/o functions
know about yield points, and all the logic inbetween the scheduler and
the i/o functions ignore them.

So, follow-on question: what's *wrong* with the lua strategy? Moreover,
why did the python strategy turn out this way? Did the python group just
not understand the better strategy? Were they concerned about the
restriction of being unable to yield through C stack frames? That seems
unlikely since the same restriction probably applies to PEP 342 yields.

Maybe they were bound by semi-compatibility with the existing (and even
weaker) iterator/generator scheme in earlier python versions?

-graydon



Re: Concurrency support?

John Cowan
Graydon Hoare scripsit:

> If this is what PEP 342 is proposing

I think your analysis is right, though I had been hoping there was something
better available.

> then I must admit the lua strategy
> seems much more appealing: make any "yield" expression return control to
> the nearest dynamic "resume". That would let the low level i/o functions
> know about yield points, and all the logic inbetween the scheduler and
> the i/o functions ignore them.

Indeed.

> So, follow-on question: what's *wrong* with the lua strategy? Moreover,
> why did the python strategy turn out this way? Did the python group just
> not understand the better strategy? Were they concerned about the
> restriction of being unable to yield through C stack frames? That seems
> unlikely since the same restriction probably applies to PEP 324 yields.
>
> Maybe they were bound by semi-compatibility with the existing (and even
> weaker) iterator/generator scheme in earlier python versions?

That seems plausible.  In Python without this PEP, the caller need not
know whether a procedure is being invoked as a coroutine or a subroutine,
whereas in Lua any procedure can be invoked either way: natively as
a subroutine, or using the coroutine creation functions as a coroutine.
The Python situation can be trivially emulated in Lua by creating a
facade that is invoked as a subroutine and invokes the real coroutine
as a coroutine.

--
John Cowan  [hidden email]  http://ccil.org/~cowan
And now here I was, in a country where a right to say how the country should
be governed was restricted to six persons in each thousand of its population.
For the nine hundred and ninety-four to express dissatisfaction with the
regnant system and propose to change it, would have made the whole six
shudder as one man, it would have been so disloyal, so dishonorable, such
putrid black treason.  --Mark Twain's Connecticut Yankee


Re: Concurrency support?

Brendan Eich-2
In reply to this post by Graydon Hoare-3
On Jun 26, 2006, at 1:49 PM, Graydon Hoare wrote:

> [Accurate summary of Python trampoline scheduler]

> So, follow-on question: what's *wrong* with the lua strategy?  
> Moreover, why did the python strategy turn out this way? Did the  
> python group just not understand the better strategy? Were they  
> concerned about the restriction of being unable to yield through C  
> stack frames?

Yes, and that is a concern for us, for Rhino's Continuation object  
implementation (which cannot cross Java native frames), as for  
Python.  Several TG1 members have expressed concern about having to  
save and restore even N > 1 interpreted frames, IIRC.  Adobe and  
Opera folks, please comment.

> That seems unlikely since the same restriction probably applies to  
> PEP 324 yields.

No, it doesn't.  There is no yield across more than one level of call.

> Maybe they were bound by semi-compatibility with the existing (and  
> even weaker) iterator/generator scheme in earlier python versions?

I don't think so, but I'd have to ask around to be sure.

/be



Re: Concurrency support?

Chris Double
On 6/27/06, Brendan Eich <[hidden email]> wrote:
> Yes, and that is a concern for us, for Rhino's Continuation object
> implementation (which cannot cross Java native frames), as for
> Python.

There are ways around the inability to yield across C stack frames. A
JIT was recently announced for Lua that supports yielding from C
functions and returning to them (provided by the Coco patch):

http://luajit.luaforge.net/coco.html

I like the addition of generators to Javascript but not being able to
yield from functions called from the generator is a pain. But that
model has been in use for a while in the Python world - do they find
it a practical limitation?

I am a big fan of providing the tools for people to build these sorts
of things as libraries though. Native delimited continuations in
Javascript would enable coroutines, etc to be added on as libraries.

Chris.
--
http://www.bluishcoder.co.nz


Re: Concurrency support?

Brendan Eich-2
On Jun 26, 2006, at 5:09 PM, Chris Double wrote:

> I am a big fan of providing the tools for people to build these sorts
> of things as libraries though. Native delimited continuations in
> Javascript would enable coroutines, etc to be added on as libraries.

If this were 1995, *and* we had a sound library mechanism with the  
right identity and security guarantees, I would agree.  But at this  
point I agree with Lars: "ECMAScript needs to be a rich language with  
features that are immediately useful to its core audiences, and which  
play well with each other in the sense that they have unsurprising  
consequences when composed."

/be


Re: Concurrency support?

Bob Ippolito
In reply to this post by Chris Double
On Jun 26, 2006, at 2:09 PM, Chris Double wrote:

> On 6/27/06, Brendan Eich <[hidden email]> wrote:
>> Yes, and that is a concern for us, for Rhino's Continuation object
>> implementation (which cannot cross Java native frames), as for
>> Python.
>
> There are ways around the inability to yield across C stack frames. A
> Jit was recently announced for Lua that supports yielding from C
> functions and returning to them (provided by the Coco patch):
>
> http://luajit.luaforge.net/coco.html
>
> I like the addition of generators to Javascript but not being able to
> yield from functions called from the generator is a pain. But that
> model has been in use for a while in the Python world - do they find
> it a practical limitation?

In the Python world you don't "yield across" anything. Functions that  
use yield, when called, return a generator object with a next method.  
The next method executes the function until a yield or an exception
[1] and returns the value yielded. Generators help out considerably  
for a lot of use cases, but they're definitely not equivalent to  
coroutines in any practical sense. This isn't generally considered to  
be a pain, because generators aren't purporting to be coroutines and  
most users aren't going to be familiar with something like call/cc  
anyway.

PEP 342 turns yield into an expression that has a value or may raise  
an arbitrary exception depending on what the caller does (by adding  
the send and throw methods). It allows for coroutine-like behavior,  
but the user has to implement it with a trampoline and write code in  
a sort of communicating sequential processes style. It does not allow  
for "magical" behavior like implicit cooperative threading on I/O.  
 From the Python design perspective, this is generally considered to  
be a good thing according to EIBTI[2] ("explicit is better than  
implicit").

[1] bare return and falling off the end of the function are  
equivalent to "raise StopIteration".
[2] http://www.python.org/dev/peps/pep-0020/

-bob



Re: Concurrency support?

Brendan Eich-2
On Jun 26, 2006, at 5:43 PM, Bob Ippolito wrote:

> In the Python world you don't "yield across" anything. Functions  
> that use yield, when called, return a generator object with a next  
> method.

[Bob knows all this, but I bet others on the list don't, so I'll  
elaborate here:]

This makes the returned object an iterator, so it is a subsequent  
g.next() call that starts at the top of the generator function, or  
that resumes "just after" the last yield expression (with value  
undefined) if the generator has already been started.  And for-in  
loops and related structures call .next implicitly (implicit hook  
calling is good when unambiguous and intentional).

g.send(v) resumes so that the suspended yield expression's result is v.  
You have to g.next() or g.send(undefined) once to get things started.  
You can g.throw(e) to resume as if the suspended yield expression threw e.
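
Spelled out in Python (a small illustrative sketch, not code from the
proposal; the counter name is invented):

    def counter():
        total = 0
        while True:
            try:
                sent = yield total      # the value passed to send() lands here
            except KeyError:
                total = 0               # throw() makes the suspended yield raise
            else:
                total += sent

    g = counter()
    print(next(g))            # 0 -- the first next() runs up to the first yield
    print(g.send(5))          # 5
    print(g.send(2))          # 7
    print(g.throw(KeyError))  # 0 -- resumes as if the yield had raised KeyError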

> It does not allow for "magical" behavior like implicit cooperative  
> threading on I/O. From the Python design perspective, this is  
> generally considered to be a good thing according to EIBTI[2]  
> ("explicit is better than implicit").

Apart from the EIBTI doctrine, and I think as important for the  
members of ECMA TG1, is the practical problem of over-constraining  
implementations to be able to save and restore whole native+scripted-
function call stacks, not simply resume a generator function via  
next, send, or throw.  An optimizing compiler may have to deoptimize  
based on the indirect call relation, which in the presence of  
packages and eval is undecidable, so speculative code generation  
with fallback would be required in full.  A tiny tree-walking  
interpreter may have to switch from natural host recursion to an  
explicit control stack, just to support the proposed feature.

/be


Re: Concurrency support?

John Cowan
Brendan Eich scripsit:

> Apart from the EIBTI doctrine, and I think as important for the  
> members of ECMA TG1, is the practical problem of over-constraining  
> implementations to be able to save and restore whole native+scripted-
> function call stacks, not simply resume a generator function via  
> next, send, or throw.  An optimizing compiler may have to deoptimize  
> based on the indirect call relation, which in the presence of  
> packages and eval is undecideable, so speculative code generation  
> with fallback would be required in full.  A tiny tree-walking  
> interpreter may have to switch from natural host recursion to an  
> explicit control stack, just to support the proposed feature.

I don't follow this.  A tiny interpreter will have to escape from
natural host recursion anyway just in order to do simple Python/Icon
generators, and I don't see that yielding within C functions is
so essential to the concept -- I'd be happy with just what Lua
(unJITted) provides.

--
Mark Twain on Cecil Rhodes:                    John Cowan
I admire him, I freely admit it,               http://www.ccil.org/~cowan
and when his time comes I shall                [hidden email]
buy a piece of the rope for a keepsake.


Re: Concurrency support?

Bob Ippolito
On Jun 26, 2006, at 3:58 PM, John Cowan wrote:

> Brendan Eich scripsit:
>
>> Apart from the EIBTI doctrine, and I think as important for the
>> members of ECMA TG1, is the practical problem of over-constraining
>> implementations to be able to save and restore whole native+scripted-
>> function call stacks, not simply resume a generator function via
>> next, send, or throw.  An optimizing compiler may have to deoptimize
>> based on the indirect call relation, which in the presence of
>> packages and eval is undecideable, so speculative code generation
>> with fallback would be required in full.  A tiny tree-walking
>> interpreter may have to switch from natural host recursion to an
>> explicit control stack, just to support the proposed feature.
>
> I don't follow this.  A tiny interpreter will have to escape from
> natural host recursion anyway just in order to do simple Python/Icon
> generators, and I don't see that yielding within C functions is
> so essential to the concept -- I'd be happy with just what Lua
> (unJITted) provides.

I have no idea what you mean by "natural host recursion", but I have  
a feeling that you're mistaken. Python's implementation of generators  
simply keeps the generator's frame around (the gi_frame attribute on  
the generator object). The frame stores all of the locals and the  
bytecode offset. Calling the next() method just resumes the  
interpreter at whatever bytecode it was at. It's not written in CSP  
style and it doesn't require any weird tricks.

The only real difference between a generator frame and a regular  
frame is that the regular frame is tossed after the function returns,  
where a generator's frame is kept around until the generator is  
garbage collected.
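
For example (a small sketch using CPython's gi_frame attribute; the exact
bytecode offset printed varies by version):

    def gen():
        x = 1
        yield x
        x += 1
        yield x

    g = gen()
    next(g)                       # run to the first yield
    print(g.gi_frame.f_locals)    # {'x': 1} -- locals survive across next() calls
    print(g.gi_frame.f_lasti)     # bytecode offset where execution will resume
    next(g)
    print(g.gi_frame.f_locals)    # {'x': 2}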

You can't really write generators in C, but you can easily write  
iterators -- objects that have a next() method that behave according  
to the iterator protocol. From a user's perspective, these two kinds  
of objects are intentionally indistinguishable since generators are  
just a convenient syntax for implementing an iterator.

-bob

