The purpose of binary components

84 messages

Re: The purpose of binary components

Kyle Huey-2
On Tue, Jan 17, 2012 at 4:30 PM, Benjamin Smedberg <[hidden email]> wrote:

> I see binary components as a useful tool in very specific circumstances.
> Primarily, they are a good way to prototype and experiment with new
> features which require mucking about with Mozilla internals. But I tend to
> think that we should discourage their use in any production environment,
> including perhaps disallowing them on AMO. I tend to think we should
> consider option "B".
>

I did not read the entire thread, so forgive me if this has been raised
already, but I think driving extensions off AMO is absolutely the wrong
way to go.  I would much rather have as many add-ons as possible in the
brightly lit, police-patrolled plaza that is AMO than in the dark, shady
alleyways that are the rest of the internet.

- Kyle
_______________________________________________
dev-planning mailing list
[hidden email]
https://lists.mozilla.org/listinfo/dev-planning

Re: The purpose of binary components

Ehsan Akhgari
In reply to this post by Asa Dotzler
Do we have data on what percentage of the add-ons with binary components do
not have an updated version available when we ship a new release?

--
Ehsan
<http://ehsanakhgari.org/>


On Tue, Jan 17, 2012 at 12:32 PM, Asa Dotzler <[hidden email]> wrote:

> On 1/17/2012 9:22 AM, Benjamin Smedberg wrote:
>
>> On 1/17/2012 11:28 AM, Asa Dotzler wrote:
>>
>>>
>>> There is a class of extension authors who, for whatever reason, have
>>> expressed an unwillingness to move to JS ctypes. Perhaps they
>>> technically cannot, or they're simply unwilling (it would be useful
>>> to get to the bottom of that). It's been nearly a year now since we
>>> let them know we'd be requiring a recompile every six weeks and began
>>> encouraging them to move to JS solutions, and they haven't done it.
>>>
>> I'm looking at
>>
>> https://addons.mozilla.org/en-US/firefox/compatibility/11.0?appver=1-11.0&type=binary
>>
>>
>> What portion of these addons are not compatible at release time?
>>
>
> I believe those are only the ones hosted at AMO. That's a minority of
> add-ons.
>
>
>> I can make educated guesses about the *possibility* of these addons
>> using something other than XPCOM components to achieve their
>> functionality, or could if we added a few APIs.
>>
>
> How do we convince them to change, even if we have a destination that's
> better?
>
>>> I think we've lost this battle and we need to step back and revisit
>>> the value we get from breaking that compatibility every 6 weeks and
>>> compare that against regressing millions of users every 6 weeks.
>>>
>> If this is true, I think we should revisit rapid release in general, and
>>
>> not just this aspect of it. As Boris has mentioned, this will
>> significantly affect our ability to make changes to the platform.
>>
>
> When we moved to rapid releases, I got the impression that we needed time
> to clean up but that we wouldn't be in that state forever. I think we
> should discuss ways to amend rapid releases to make it work better for
> platform developers and users.
>
>
> - A

Re: The purpose of binary components

Dave Mandelin-2
In reply to this post by Benjamin Smedberg
On 1/17/2012 7:30 AM, Benjamin Smedberg wrote:

> This post was prompted by the thread "Why do we keep revising interface
> ids?" in mozilla.dev.platform, and by Asa's reply suggesting that we
> reconsider our current policy of breaking binary components (requiring
> a recompile) in each 6-week cycle. I thought that before we got into
> the details of a solution we should define the problem space.
>
> == Purpose ==
>
> Loadable XPCOM components allow developers of both applications and
> extensions access to the following basic functions:
>
> * Adding new functionality to the platform available via our XPCOM
> object framework.
> * Extending existing functionality of the platform, for example adding a
> new protocol handler.
> * Replacing core functionality of existing components/services.

I have started thinking of our platform as being like an OS and looking
for inspiration on APIs and architectures in OSs. Along those lines,
#1/#2 above could be done as either libraries or device drivers,
depending on how deeply they are integrated with the platform. Either
way, an OS would try to provide a pretty stable API (evolution over the
years, not the months). For #3, the main example that comes to mind is
complete UI replacements (explorer.exe or window managers), but it seems
replacing core features in a stable way is not done much and may not be
very practical--just producing a custom version of the platform seems
more likely.

But before thinking too hard about that, what is the long-term future of
binary extensions? roc recently predicted that NPAPI plugins will go
away, meaning we should worry only about user experience issues for
existing plugins:

   http://robert.ocallahan.org/2011/11/end-of-plugins.html

I'm not sure if the comment about Windows 8 applies only to NPAPI
plugins, or other binary extensions too. I think there are valid use
cases for binary extensions, but if we don't expect them to be allowed
by the underlying platform or to continue to be used, then it wouldn't
make much sense to work on them.

If we are going to continue with binary extensions, I'd like to see us
get well-designed, stable APIs. It would take a while to get there,
though, and there is still the question about whether add-on developers
would actually use the new APIs.

> == Possible Alternatives ==
>
> There are a few possible alternatives for what we could do to make using
> binary components easier or harder:
>
> A. Change nothing, leave the status quo as-is. Extensions which use
> binary components and are not recompiled every 6 weeks will "break", but
> they at least have the opportunity to do cool stuff. Users will
> continue to be vulnerable to Oracle-type crashes.

We know roughly how this looks. :-)

> B. Make no technical changes, but disallow binary components in release
> extensions on AMO. We may allow binary components in beta or
> experimental addons, with the understanding that these would need to be
> updated for each release.

This option is hard for me to understand. It seems to be essentially a
ban on binary components, except in prototyping. Is it really that much
easier to prototype as a binary extension? And what does the prototype
lead to, if it can't be shipped? A patch to the platform? A non-AMO addon?

> C. (copied from Asa's post) Only change interfaces every 3rd release (or
> something like that). This would mean that extensions which use C++
> components would need to recompile less frequently. Users would still be
> exposed to Oracle-type issues, but less frequently.

This seems really disruptive to development, especially if it applies to
all interfaces. Is there a subset of the interfaces that is useful to
add-on developers that we could keep stable?

> D. Stop loading binary components from extensions. Continue exporting
> the XPCOM functions so that application authors can continue to use
> them. Users might still be exposed to Oracle-type issues.
> E. Stop loading binary components completely: compile Firefox as a
> static binary and stop exporting the XPCOM functions completely.

These two make sense if we do decide that binary components are going away.

> == bsmedberg's Opinion ==
>
> I see binary components as a useful tool in very specific circumstances.
> Primarily, they are a good way to prototype and experiment with new
> features which require mucking about with Mozilla internals. But I tend
> to think that we should discourage their use in any production
> environment, including perhaps disallowing them on AMO. I tend to think
> we should consider option "B".

Personally, I know way too little about the ecosystem of add-ons with
binary components, what they do, and what other kinds of implementations
they could use to be able to make a decision. It seems like a really
deep problem.

But the core issue that your options revolve around does seem to be: do
we want to keep binary components at all? I see 3 main options:

1. Deprecate and eventually remove binary components, on the grounds
that they are hard to support and won't be allowed on all platforms.
(Also, we could later revive them in some new form if we wanted.)
Besides the obvious "life gets easier", I'm not sure what the effects
are--especially, how do vendors of add-ons with binary components react?
(And is this the path MS is taking?)

2. Do binary components right: provide stable APIs and QA binary
components. It's expensive and doesn't solve anything in the immediate
future, but it does create a good, stable environment for binary
components (if it works). It seems like this is what Google is going for
with PPAPI and NaCl.

3. Bounce along as we are: unstable binary components. This is OK for
now but I don't see it as a sound long-term plan.

Dave

Re: The purpose of binary components

Matt Brubeck-3
In reply to this post by Benjamin Smedberg
On 01/17/2012 07:30 AM, Benjamin Smedberg wrote:
> B. Make no technical changes, but disallow binary components in release
> extensions on AMO. We may allow binary components in beta or
> experimental addons, with the understanding that these would need to be
> updated for each release.

For non-AMO add-ons, this has no effect.

For AMO-hosted add-ons, it gives authors a choice between changing their
code or leaving AMO.  If many authors chose to leave AMO, this would not
improve the experience for users, and it would reduce our ability to use
AMO procedures to improve things in the future.

If I remember correctly, the binary add-ons with the largest number of
users are not on AMO anyway, so this option would have little or no
effect on user experience for a majority of our users.

Re: The purpose of binary components

Benjamin Smedberg
In reply to this post by Dave Mandelin-2
On 1/17/2012 3:07 PM, David Mandelin wrote:
>
>
> But before thinking too hard about that, what is the long-term future
> of binary extensions? roc recently predicted that NPAPI plugins will
> go away, meaning we should worry only about user experience issues for
> existing plugins:
As noted to Wes, this discussion isn't really about binaries in general.
NPAPI plugins and using binaries via ctypes are the 'good alternatives'
to XPCOM components.

> If we are going to continue with binary extensions, I'd like to see us
> get well-designed, stable APIs. It would take a while to get there,
> though, and there is still the question about whether add-on
> developers would actually use the new APIs.
I don't think we should spend time making stable well-designed binary
APIs, if that is going to get in the way of making stable well-designed
JS APIs (and I don't think we have the resources to do it all). So I do
think that we should be working hard to get stable well-designed JS APIs
that most extensions can use, and in the cases where extensions can't
use the existing APIs, work with them to get the APIs they need.


>
>> == Possible Alternatives ==
>>
>> There are a few possible alternatives for what we could do to make using
>> binary components easier or harder:
>>
>> A. Change nothing, leave the status quo as-is. Extensions which use
>> binary components and are not recompiled every 6 weeks will "break", but
>> they at least have the opportunity to do cool stuff. Users will
>> continue to be vulnerable to Oracle-type crashes.
>
> We know roughly how this looks. :-)
I guess I don't know how bad this looks currently. Asa believes that
many users are annoyed because these addons-with-binaries aren't updated
at release-time (and I have no reason to doubt him). If this is the
case, we'd either need to help them get updates *before* we do each
6-week release, or change our own practices so that we don't break them
at each release.

As I wrote a while back
(http://benjamin.smedbergs.us/blog/2011-09-15/complementing-firefox-rapid-release/),
I think we'd be better off having those consumers use a LTS release.

>
>> B. Make no technical changes, but disallow binary components in release
>> extensions on AMO. We may allow binary components in beta or
>> experimental addons, with the understanding that these would need to be
>> updated for each release.
>
> This option is hard for me to understand. It seems to be essentially a
> ban on binary components, except in prototyping. Is it really that
> much easier to prototype as a binary extension? And what does the
> prototype lead to, if it can't be shipped? A patch to the platform? A
> non-AMO addon?
Oftentimes these things have started out as something which hooks into
the guts of nsIContent or the docshell hierarchy in order to do
interesting stuff. The goal would be to do whatever rapid prototyping
you need, and then convert that into a patch for the platform.


>
>> D. Stop loading binary components from extensions. Continue exporting
>> the XPCOM functions so that application authors can continue to use
>> them. Users might still be exposed to Oracle-type issues.
>> E. Stop loading binary components completely: compile Firefox as a
>> static binary and stop exporting the XPCOM functions completely.
>
> These two make sense if we do decide that binary components are going
> away.
When? Immediately (breaking our most popular addons, see Jorge's link),
or eventually? If eventually, how do we get there?
>
>
> 2. Do binary components right: provide stable APIs and QA binary
> components. It's expensive and doesn't solve anything in the immediate
> future, but it does create a good, stable environment for binary
> components (if it works). It seems like this is what Google is going
> for with PPAPI and NaCl.
I don't think this would help anyone. For the most part, people don't
seem to be using binary components because of speed benefits. It's
either because they need to make special calls into OS libraries (USB
stuff for Garmin, embedding Trident for IETab, etc) or because they want
to poke at the unstable Gecko guts (RoboForm and perhaps some others).
In either case, NaCl/Pepper doesn't help. Some of the stuff we're
doing for B2G will help (WebUSB).
>
> 3. Bounce along as we are: unstable binary components. This is OK for
> now but I don't see it as a sound long-term plan.
I *think* everyone agrees that it is not a stable plan for extensions.

--BDS



Re: The purpose of binary components

Zack Weinberg-2
In reply to this post by Benjamin Smedberg
On 2012-01-17 7:30 AM, Benjamin Smedberg wrote:

> A. Change nothing, leave the status quo as-is. Extensions which use
> binary components and are not recompiled every 6 weeks will "break", but
> they at least have the opportunity to do cool stuff. Users will
> continue to be vulnerable to Oracle-type crashes.
> B. Make no technical changes, but disallow binary components in release
> extensions on AMO. We may allow binary components in beta or
> experimental addons, with the understanding that these would need to be
> updated for each release.
> C. (copied from Asa's post) Only change interfaces every 3rd release (or
> something like that). This would mean that extensions which use C++
> components would need to recompile less frequently. Users would still be
> exposed to Oracle-type issues, but less frequently.
> D. Stop loading binary components from extensions. Continue exporting
> the XPCOM functions so that application authors can continue to use
> them. Users might still be exposed to Oracle-type issues.
> E. Stop loading binary components completely: compile Firefox as a
> static binary and stop exporting the XPCOM functions completely.

I think option B is an excellent one for the near term, but that in the
long run we want option D or E, and that concurrent with the change to
AMO, we should announce that binary components are deprecated and will
stop working at a stated time in the future (one or two years seems
reasonable to me).

To inform the choice between D and E, could authors of applications
outside m-c please chime in and tell us what they still need binary
components for?  It seems *highly* desirable to me to get to a point
where Thunderbird, Seamonkey, etc don't need any binary components
beyond what's already in libxul.

zw

Re: The purpose of binary components

Dave Mandelin-2
In reply to this post by Dave Mandelin-2
On 1/17/2012 12:39 PM, Benjamin Smedberg wrote:
> On 1/17/2012 3:07 PM, David Mandelin wrote:
>>
>>
>> But before thinking too hard about that, what is the long-term future
>> of binary extensions? roc recently predicted that NPAPI plugins will
>> go away, meaning we should worry only about user experience issues for
>> existing plugins:
 >
> As noted to Wes, this discussion isn't really about binaries in general.
> NPAPI plugins and using binaries via ctypes are the 'good alternatives'
> to XPCOM components.

OK, but given that NPAPI may be on its way out (per roc's post) and
ctypes apparently is not being used much, it's not clear if the 'good
alternatives' are viable.

>> If we are going to continue with binary extensions, I'd like to see us
>> get well-designed, stable APIs. It would take a while to get there,
>> though, and there is still the question about whether add-on
>> developers would actually use the new APIs.
 >
> I don't think we should spend time making stable well-designed binary
> APIs, if that is going to get in the way of making stable well-designed
> JS APIs (and I don't think we have the resources to do it all). So I do
> think that we should be working hard to get stable well-designed JS APIs
> that most extensions can use, and in the cases where extensions can't
> use the existing APIs, work with them to get the APIs they need.

That makes sense. Do you mean Jetpack, or something more?

>>> == Possible Alternatives ==
>>>
>>> There are a few possible alternatives for what we could do to make using
>>> binary components easier or harder:
>>>
>>> A. Change nothing, leave the status quo as-is. Extensions which use
>>> binary components and are not recompiled every 6 weeks will "break", but
>>> they at least have the opportunity to do cool stuff. Users will
>>> continue to be vulnerable to Oracle-type crashes.
>>
>> We know roughly how this looks. :-)
 >

> I guess I don't know how bad this looks currently. Asa believes that
> many users are annoyed because these addons-with-binaries aren't updated
> at release-time (and I have no reason to doubt him). If this is the
> case, we'd either need to help them get updates *before* we do each
> 6-week release, or change our own practices so that we don't break them
> at each release.
>
> As I wrote a while back
> (http://benjamin.smedbergs.us/blog/2011-09-15/complementing-firefox-rapid-release/),
> I think we'd be better off having those consumers use a LTS release.

That sounds right to me.

>>> B. Make no technical changes, but disallow binary components in release
>>> extensions on AMO. We may allow binary components in beta or
>>> experimental addons, with the understanding that these would need to be
>>> updated for each release.
>>
>> This option is hard for me to understand. It seems to be essentially a
>> ban on binary components, except in prototyping. Is it really that
>> much easier to prototype as a binary extension? And what does the
>> prototype lead to, if it can't be shipped? A patch to the platform? A
>> non-AMO addon?
 >
> Oftentimes these things have started out as something which hooks into
> the guts of nsIContent or the docshell hierarchy in order to do
> interesting stuff. The goal would be to do whatever rapid prototyping
> you need, and then convert that into a patch for the platform.

OK, so for prototyping. I guess I don't know enough about what that kind
of development is like to know how important that ability is.

>>> D. Stop loading binary components from extensions. Continue exporting
>>> the XPCOM functions so that application authors can continue to use
>>> them. Users might still be exposed to Oracle-type issues.
>>> E. Stop loading binary components completely: compile Firefox as a
>>> static binary and stop exporting the XPCOM functions completely.
>>
>> These two make sense if we do decide that binary components are going
>> away.
 >
> When? Immediately (breaking our most popular addons, see Jorge's link),
> or eventually? If eventually, how do we get there?

Eventually. The how would be negotiated with add-on developers, users,
and product managers over time.

>> 2. Do binary components right: provide stable APIs and QA binary
>> components. It's expensive and doesn't solve anything in the immediate
>> future, but it does create a good, stable environment for binary
>> components (if it works). It seems like this is what Google is going
>> for with PPAPI and NaCl.
 >
> I don't think this would help anyone. For the most part, people don't
> seem to be using binary components because of speed benefits. It's
> either because they need to make special calls into OS libraries (USB
> stuff for Garmin, embedding Trident for IETab, etc) or because they want
> to poke at the unstable Gecko guts (RoboForm and perhaps some others).
> In either case, NaCl/Pepper doesn't help. Some of the stuff we're
> doing for B2G will help (WebUSB).

I am not proposing doing NaCl/Pepper, I was just observing that Google
seems to have chosen that path.

On the use cases you mention: Poking at unstable guts seems like
something that we can never make stable. Special calls into OS libraries
seems more important. Can ctypes cover it?

>> 3. Bounce along as we are: unstable binary components. This is OK for
>> now but I don't see it as a sound long-term plan.
 >
> I *think* everyone agrees that it is not a stable plan for extensions.

On further reflection, your proposal "B" to me seems like it largely
amounts to announcing that binary components are unsupported: the code
stays in the tree and you can still ship binaries but not on AMO, and
don't expect them to work as Firefox goes forward.

And that seems like "now, but more explicit about what that actually
means", which seems like a mild improvement over "now". I think it would
be better yet to combine that with a strong intention to create the JS
APIs (e.g., WebUSB, improved ctypes if necessary) to get all the
add-ons that now have to use binaries onto a supportable technology,
which you also advocated above.

Dave

Re: The purpose of binary components

Kyle Huey-2
In reply to this post by Zack Weinberg-2
On Tue, Jan 17, 2012 at 10:24 PM, Zack Weinberg <[hidden email]> wrote:

> On 2012-01-17 7:30 AM, Benjamin Smedberg wrote:
>
>> A. Change nothing, leave the status quo as-is. Extensions which use
>> binary components and are not recompiled every 6 weeks will "break", but
>> they at least have the opportunity to do cool stuff. Users will
>> continue to be vulnerable to Oracle-type crashes.
>> B. Make no technical changes, but disallow binary components in release
>> extensions on AMO. We may allow binary components in beta or
>> experimental addons, with the understanding that these would need to be
>> updated for each release.
>> C. (copied from Asa's post) Only change interfaces every 3rd release (or
>> something like that). This would mean that extensions which use C++
>> components would need to recompile less frequently. Users would still be
>> exposed to Oracle-type issues, but less frequently.
>>
>> D. Stop loading binary components from extensions. Continue exporting
>> the XPCOM functions so that application authors can continue to use
>> them. Users might still be exposed to Oracle-type issues.
>> E. Stop loading binary components completely: compile Firefox as a
>> static binary and stop exporting the XPCOM functions completely.
>>
>
> I think option B is an excellent one for the near term, but that in the
> long run we want option D or E, and that concurrent with the change to AMO,
> we should announce that binary components are deprecated and will stop
> working at a stated time in the future (one or two years seems reasonable
> to me).
>
> To inform the choice between D and E, could authors of applications
> outside m-c please chime in and tell us what they still need binary
> components for?  It seems *highly* desirable to me to get to a point where
> Thunderbird, Seamonkey, etc don't need any binary components beyond what's
> already in libxul.


Do you mean the libxul that's in mozilla-central or the libxul mail is
shipping?  If you mean the libxul that's in mozilla-central, that seems
pretty unrealistic.

- Kyle

Re: The purpose of binary components

danderson
In reply to this post by Benjamin Smedberg
My experience with binary components has been overwhelmingly negative. I ran/run two decent-sized open source projects that allow both scripted addons and binary extensions. We tried to provide a stable API, but it wasn't enough. The API will be misused, abused, and often subtly broken by other changes. These binary components are difficult to ship because you need to compile them on N platforms (as well as for the lowest-common-denominator Linux distro, if that's even possible). They are hard for their authors to debug unless they have access to those platforms. Sometimes the authors are very new to C++.

The API is often incomplete anyway. Then the worst thing: your API has some fundamental flaw you discover later, and now some ancient, unmaintained binary is totally relying on it.

In our experience, these binary components had three use cases:
 (1) Making an external implementation scriptable. For example, embedding MySQL support via libmysql.
 (2) Exposing to script something that was not provided by the API.
 (3) Having some insane, tightly-integrated extension that really needed to talk to the guts of the host app.

(1) is solved by having a really good interop layer, and we were able to eliminate many use cases with that. (2) is solved by letting users create scripted libraries that can be shared throughout the whole system. (3) is the hard one. These are usually the buggiest, because that level of integration is difficult and brittle, and it usually shows in the user experience as well. When you're totally hijacking a system from the outside, it's a miracle if it all works, and a maintenance nightmare forever after. All of them are in part solved by being proactive about what (reasonable) functionality developers need, so the hard stuff gets upstreamed.

If I were to create such a project again, I would definitely ditch binary components. They're so much of a headache.

-David

> On 1/17/2012 11:44 AM, Wes Garland wrote:
> > On 17 January 2012 11:39, Benjamin Smedberg<[hidden email]>  wrote:
> >
> >> In any non-trivial case, the recommended way to use ctypes is to compile a
> >> DLL which exposes your well-known API, and have the C compiler do all the
> >> dirty work for you. In this case none of the objections you have mentioned
> >> really apply.
> >>
> > This is absolutely true, however, by following this approach, we now have a
> > binary component.
> No, you have a *DLL*. I am in no way proposing that we get rid of
> extensions shipping DLLs. I in fact strongly encourage it for some use
> cases. IETab, for example, should almost certainly be implemented as an
> NPAPI plugin, not an XPCOM component.
>
> I am merely talking about binary *XPCOM* components, which are the root
> problem when we're talking about either freezing XPCOM interface
> definitions for some of the rapid release cycle or disallowing those
> components based on the user experience.
>
> --BDS
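
The DLL-plus-ctypes pattern described above can be sketched roughly as follows. This is a minimal illustration, not anything shipped by Mozilla; `MYADDON_EXPORT` and `my_addon_frob` are hypothetical names. The key point is that only a plain C ABI crosses the boundary, so the binary does not depend on Gecko's internal XPCOM interfaces.

```c
/*
 * Sketch of the "ship a DLL, call it via js-ctypes" pattern.
 * The extension's binary exposes a stable C ABI; nothing here
 * touches Gecko internals, so the library keeps working across
 * Firefox releases without a recompile.
 * (MYADDON_EXPORT and my_addon_frob are illustrative names.)
 */
#include <stdint.h>

#if defined(_WIN32)
#  define MYADDON_EXPORT __declspec(dllexport)
#else
#  define MYADDON_EXPORT __attribute__((visibility("default")))
#endif

/* A plain-C entry point: no XPCOM types cross the boundary, so
 * changing internal interface IDs cannot break this symbol. */
MYADDON_EXPORT int32_t my_addon_frob(int32_t input) {
    return input * 2 + 1;
}
```

Privileged extension JS would then load the library with `ctypes.open("myaddon.dll")` and bind the symbol with `lib.declare("my_addon_frob", ctypes.default_abi, ctypes.int32_t, ctypes.int32_t)`, keeping the unstable XPCOM surface out of the binary entirely.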


Re: The purpose of binary components

Zack Weinberg-2
In reply to this post by Zack Weinberg-2
On 2012-01-17 1:34 PM, Kyle Huey wrote:
> On Tue, Jan 17, 2012 at 10:24 PM, Zack Weinberg<[hidden email]>  wrote:
>> To inform the choice between D and E, could authors of applications
>> outside m-c please chime in and tell us what they still need binary
>> components for?  It seems *highly* desirable to me to get to a point where
>> Thunderbird, Seamonkey, etc don't need any binary components beyond what's
>> already in libxul.
>
> Do you mean the libxul that's in mozilla-central or the libxul mail is
> shipping?

I mean some future iteration of the libxul that's in m-c, which might
have things in it that aren't there right now.

Another way to describe the "highly desirable point" that I'm thinking
of would be: All the compiled code in the platform is in m-c.  The
compiled executable built by m-c is xulrunner. All Mozilla-platform
applications *including Firefox* consist of xulrunner plus some
JavaScript (and XUL/HTML/CSS/etc).

 > If you mean the libxul that's in mozilla-central, that seems
> pretty unrealistic.

I know that *presently* there's a lot of C++ code in the comm-central
version of libxul that isn't in the m-c version, but I don't know what
it's all for, and I could imagine a great deal of it being either
reimplementable in JS without much trouble, or reasonable to add to the
m-c codebase.

zw

Re: The purpose of binary components

Joshua Cranmer-2
On 1/17/2012 3:50 PM, Zack Weinberg wrote:
> I know that *presently* there's a lot of C++ code in the comm-central
> version of libxul that isn't in the m-c version, but I don't know what
> it's all for, and I could imagine a great deal of it being either
> reimplementable in JS without much trouble, or reasonable to add to
> the m-c codebase.

As a prelude, I should point out that the mailnews code is often old and
crufty enough to make reimplementation a major task, fraught with
complications, even without the burden of translating to a more limiting
API boundary. I can explain the major problems of rewriting most/all of
comm-central in JS:

1. Charsets. This means a lot of strings will not pass through XPConnect
happily; additionally, a fair amount of the m-c charset/MIME code we
need to use is either not available via XPIDL or marked as [noscript].
2. Threads. Some portions of our codebase--particularly IMAP and
import--need to access things from different threads. It is absolutely
impossible to use a JS-implemented XPCOM component from another thread,
which means that it is almost inevitable that the core objects would
have to be implemented in C++. Additionally, IMAP relies on using socket
communications on a different thread, which would mean the same
limitation would still apply even if the core objects were implemented
without using XPCOM at all [which is highly unlikely].
3. RDF. It is impossible to implement RDF in JavaScript (and I have
tried before). While we do have an ongoing project to remove RDF from
mailnews, it has effectively stalled (the main pusher of the effort
appears to have retired from coding Thunderbird), and the last features
are where code is most likely to produce pernicious regressions.
4. System integration. The WABI address books, at the very least, use an
MS COM implementation, which makes access via jsctypes all but
impossible. Now, this could probably be implemented via a jsctypes
thunk, but I'm just pointing out that it is almost certainly infeasible
to have all of comm-central be in JS.
5. I/O. Some of the I/O layers we use are again unusable from JS (e.g.,
nsIAuthModule). Admittedly, this is mostly fallout from 1 and 2, but I
still want to point out that it is a limit on what is possible.
6. All-or-nothing migration. Much of the current codebase is effectively
impossible to partially migrate to JS, due to several factors. There are
probably some individual files in base/src that could be independently
migrated, but the MIME, filter, composition, import, and database
libraries probably all have to be migrated as full libraries. Address
book might be partially migratible with some work, but the rest of
mailnews would pretty much have to swap all at the same time, or at
least in two units (the protocol communication and the actual base
object instances), due to liberal use of inheritance.

In short: it is, IMHO, highly unlikely that comm-central could be
rewritten in JS without several man-months of concentrated effort and
significant expansion of available APIs (I, for one, would love to see
some sort of "binary string" feature in xpconnect/JS/ishy-things, as
that pretty much eliminates my major concern for rewriting libmime in
JS). Unless, of course, you want to add all/most of comm-central back
into mozilla-central :-).

Re: The purpose of binary components

Asa Dotzler
In reply to this post by Zack Weinberg-2
On 1/17/2012 1:24 PM, Zack Weinberg wrote:

> On 2012-01-17 7:30 AM, Benjamin Smedberg wrote:
>
>> A. Change nothing, leave the status quo as-is. Extensions which use
>> binary components and are not recompiled every 6 weeks will "break", but
>> they at least have the opportunity to do cool stuff . Users will
>> continue to be vulnerable to Oracle-type crashes.
>> B. Make no technical changes, but disallow binary components in release
>> extensions on AMO. We may allow binary components in beta or
>> experimental addons, with the understanding that these would need to be
>> updated for each release.
>> C. (copied from Asa's post) Only change interfaces every 3rd release (or
>> something like that). This would mean that extensions which use C++
>> components would need to compile less frequently. Users would still be
>> exposed to Oracle-type issues, but less frequently.
>> D. Stop loading binary components from extensions. Continue exporting
>> the XPCOM functions so that application authors can continue to use
>> them. Users might still be exposed to Oracle-type issues.
>> E. Stop loading binary components completely: compile Firefox as a
>> static binary and stop exporting the XPCOM functions completely.
>
> I think option B is an excellent one for the near term, but that in the
> long run we want option D or E, and that concurrent with the change to
> AMO, we should announce that binary components are deprecated and will
> stop working at a stated time in the future (one or two years seems
> reasonable to me).

If we do B, we do nothing about most add-ons with XPCOM components. Most
are hosted outside of AMO already. If we do B, the rest will probably
just move outside of AMO where we have even less visibility.

If we do D or E, we have to be prepared for the authors of all the
"security"-related add-ons that use XPCOM components to message to our
users that they will be unsafe and should not upgrade to newer versions
of Firefox that block them. We've already seen some of that from AV
vendors in the past.

- A

Re: The purpose of binary components

Asa Dotzler
In reply to this post by Boris Zbarsky
On 1/17/2012 10:09 AM, Boris Zbarsky wrote:

> On 1/17/12 12:32 PM, Asa Dotzler wrote:
>> When we moved to rapid releases, I got the impression that we needed
>> time to clean up but that we wouldn't be in that state forever.
>
> I should note that we're not done with the cleanup yet...
>
> "cleanup" includes implementing the dom4 core spec, note.
>
>> I think we should discuss ways to amend rapid releases to make it work
>> better
>> for platform developers and users.
>
> OK, let's define "better"? It sounds like you would prefer to move to an
> 18-week release cycle for platform (including web) features, yes?

I don't yet know what I prefer. I'd like to find out what's possible. Is
it possible, for example, to stabilize and freeze (for, say, 18 weeks
or more) a few key interfaces to get a large number of add-ons compatible
while letting the more exotic ones break more frequently?

- A

Re: The purpose of binary components

Boris Zbarsky
On 1/17/12 7:05 PM, Asa Dotzler wrote:
> I don't yet know what I prefer. I'd like to find out what's possible. Is
> it possible, for example, to stabilize and freeze (for, say, 18 weeks
> or more) a few key interfaces to get a large number of add-ons compatible
> while letting the more exotic ones break more frequently?

Possibly.  But note that one of the things that's come up the most is
things like the DOM node and element interfaces, which we can only
stabilize/freeze by deferring web features, because people are adding
new APIs to those all the time.

Maybe that's no longer an issue, though.  What would be interesting, if
we had a way of getting it, is some idea of the set of interfaces binary
XPCOM components are actually using...

-Boris


Re: The purpose of binary components

Asa Dotzler
On 1/17/2012 4:10 PM, Boris Zbarsky wrote:

> On 1/17/12 7:05 PM, Asa Dotzler wrote:
>> I don't yet know what I prefer. I'd like to find out what's possible. Is
>> it possible, for example, to stabilize and freeze (for, say, 18 weeks
>> or more) a few key interfaces to get a large number of add-ons compatible
>> while letting the more exotic ones break more frequently?
>
> Possibly. But note that one of the things that's come up the most is
> things like the DOM node and element interfaces, which we can only
> stabilize/freeze by deferring web features, because people are adding
> new APIs to those all the time.
>
> Maybe that's no longer an issue, though. What would be interesting, if
> we had a way of getting it, is some idea of the set of interfaces binary
> XPCOM components are actually using...
>
> -Boris
>

If there's some way to do that, even manually, I'd be willing to help.
Is there some way to have an instrumented Firefox that spits out a list
of any APIs an add-on is poking?

- A

Re: The purpose of binary components

Boris Zbarsky
On 1/17/12 7:26 PM, Asa Dotzler wrote:
> If there's some way to do that, even manually, I'd be willing to help.
> Is there some way to have an instrumented Firefox that spits out a list
> of any APIs an add-on is poking?

Hmm.

That's an interesting question, actually.  Maybe, yes.  If we hack the
IDL and codegen for nsISupports such that the first 100 vtable entries
record some information about the concrete class the function is called
on (by examining the vtable pointer and then doing something with debug
symbols, if possible) and the index of the method, then forward on to
the "real" method (by adding 100 to the vtable index and calling the
result), we might be able to get something, esp. with some
postprocessing.  Does that sound at all feasible?

The key part is that to tell apart the addon's calls and our calls we
need to make sure that the addon is calling _different_ methods from our
internal code.  We get that somewhat for free if the two are compiled
against different headers.

-Boris



Re: The purpose of binary components

skornblith
In reply to this post by Asa Dotzler
On Jan 17, 11:28 am, Asa Dotzler <[hidden email]> wrote:

> On 1/17/2012 7:30 AM, Benjamin Smedberg wrote:
>
> > I see binary components as a useful tool in very specific circumstances.
> > Primarily, they are a good way to prototype and experiment with new
> > features which require mucking about with Mozilla internals. But I tend
> > to think that we should discourage their use in any production
> > environment, including perhaps disallowing them on AMO. I tend to think
> > we should consider option "B".
>
> Benjamin, thank you for the best description of binary and js components
> I've seen to date. It's a big help to folks like me who don't live so
> deeply in the code.
>
> There are a class of extension authors which, for what ever their
> reasons, have expressed an unwillingness to move to JS ctypes. Perhaps
> it's that they technically cannot or they're simply unwilling to (it
> would be useful to get to the bottom of that). It's been nearly a year
> now since we let them know we'd be requiring a re-compile every six
> weeks and began encouraging them to move to JS solutions and they've not
> done it.

As an add-on developer who falls into this category, I can provide
three reasons that we haven't moved to js-ctypes yet, and the
experience we've encountered in moving in that direction.

1) We have a non-trivial number of users still running Firefox 3.6 who
we still want to support. Moving to js-ctypes means dropping these
users or maintaining two separate code bases. For now, the plan is to
wait for Firefox 3.6 end-of-life before dropping support. Maintaining
two separate code bases is far more of a pain in the ass than updating
binary components for each release.

2) js-ctypes is ugly. The core of our extension is written in
JavaScript, but we have three different XPCOM components to
communicate with different word processors, all of which expose a
common object-oriented interface. With binary XPCOM, we wrote an idl
file and auto-generated the headers from it. Now we have a C API that
passes around structs for the library, and a JavaScript XPCOM
component that uses js-ctypes to interface with our native library but
exposes it in an object-oriented way. We could eventually move our
core code to expect this C API, but it's not anything resembling
idiomatic JavaScript. On top of this, memory management with js-ctypes
is a major pain; we are manually refcounting interfaces within
JavaScript. After writing the original code as an XPCOM component, js-
ctypes seems like a downgrade in just about every way. The ability to
call C++ from js-ctypes (bug 505907) would address this to some
degree, but it looks like it's been WONTFIXed.

3) It will take a lot of rapid release cycles before moving code to js-
ctypes will pay off. Moving code from XPCOM to js-ctypes requires a
non-trivial amount of effort and may introduce new bugs. Recompiling
against a new XULRunner SDK takes a couple of minutes and runs
comparatively little risk of regressions.

With that said, we are in the process of moving to js-ctypes in order
to decrease long-term maintenance requirements. However, it's not
something we are particularly enthusiastic about, and the only
benefits to us are indirect (through having a better Firefox). I don't
think it should be surprising that, when told to replace their current
paradigm with another paradigm that is worse in every way except for
forward compatibility, add-on developers are dragging their feet.

Simon

Re: The purpose of binary components

Justin Dolske-2
In reply to this post by Benjamin Smedberg
On 1/17/12 7:30 AM, Benjamin Smedberg wrote:

> B. Make no technical changes, but disallow binary components in release
> extensions on AMO. We may allow binary components in beta or
> experimental addons, with the understanding that these would need to be
> updated for each release.

Non-AMO addons are unfortunately so common that I don't think this is
likely to be effective. It might be nice, someday, to more strongly
encourage an "app store" kind of model where we make it more difficult
to install addons without some kind of AMO review... But that's a thorny
issue and not likely to happen anytime soon.


> C. (copied from Asa's post) Only change interfaces every 3rd release (or
> something like that). This would mean that extensions which use C++
> components would need to compile less frequently. Users would still be
> exposed to Oracle-type issues, but less frequently.

I think this is likely to make the problem worse! Instead of code
breaking frequently (and hopefully caught during development), longer
windows of "it still happens to work" will elapse and end up impacting
more users.

Counter proposal (possibly crazy): essentially randomize
interfaces/vtables each release to reliably _ensure_ breakage. No binary
addon will work > 6 weeks (in a release channel), so less time for
broken code to spread to users.


> I see binary components as a useful tool in very specific circumstances.
> Primarily, they are a good way to prototype and experiment with new
> features which require mucking about with Mozilla internals.

Has anyone done some kind of survey to see what the common use-cases are
for binary addons? Would it be interesting to look at ways to expose
non-XPCOM binary hooks, with greater stability levels? [Essentially
Jetpack for native code, if you will. :)]

Another way to look at the problem is that there are still a lot of
developers out there who want to use the languages they know (C/C++) to
extend Firefox, and the only obvious interface points we provide are
these hazardous, unstable XPCOM APIs.

Justin

Re: The purpose of binary components

Philip Chee
In reply to this post by Henri Sivonen
On Tue, 17 Jan 2012 11:31:57 -0500, Benjamin Smedberg wrote:

> Justin Wood (Callek) wrote:
>
>> Re: Option B.
>>
>> It would relegate Lightning to a non-release addon.
>> It would relegate Enigmail to a non-release addon.
>>
>> These are two addons that users of Thunderbird for example, consider
>> necessary. Yes doing Lightning/Enigmail with simply binary-parts but
>> using js-ctypes to access those parts should be theoretically possible,
>> but I make no guess as to the amount of work.
> If Firefox chooses option B, that doesn't necessarily mean that
> Thunderbird must also choose it. I really don't know lightning at all,
> but I believe that all enigmail needs is the IPC code, which we should just
> consider adding as part of core.

Well here's the problem. Enigmail tried multiple times to get their IPC
code into core but was rejected each time. Eventually they were
grudgingly allowed to put their code into a separate repository
somewhere in hg.mozilla.org where nobody can find it.

Phil

--
Philip Chee <[hidden email]>, <[hidden email]>
http://flashblock.mozdev.org/ http://xsidebar.mozdev.org
Guard us from the she-wolf and the wolf, and guard us from the thief,
oh Night, and so be good for us to pass.

Re: The purpose of binary components

R Kent James
In reply to this post by Zack Weinberg-2
On 1/17/2012 1:24 PM, Zack Weinberg wrote:
> To inform the choice between D and E, could authors of applications
> outside m-c please chime in and tell us what they still need binary
> components for?  It seems *highly* desirable to me to get to a point
> where Thunderbird, Seamonkey, etc don't need any binary components
> beyond what's already in libxul.

I've been a proponent of adding capability to Thunderbird to allow
adding new account types beyond the current IMAP, POP3, News, and RSS
types. The addon TweeQuilla (written in JavaScript) is a demonstration
of adding a Twitter account type, and uses a binary addon "New Account
Types" as a glue layer to the base Thunderbird code.

The fundamental mailnews design model for accounts essentially has a
base set of C++ classes (example: nsMsgDBFolder.cpp) that are then
extended with C++ subclasses to implement account-specific functionality
on top of XPCOM interfaces. So, for example, nsIMsgDBFolder has specific
methods that need to be implemented for each account type (for IMAP in
nsImapMailFolder.cpp), as well as has account-specific extensions to the
XPCOM interface (for example nsIImapIncomingServer for the server).

So why do I need binary XPCOM extensions? Because the entire mailnews
architecture is based on a C++ inheritance model which does not map well
onto javascript. Although the "New Account Types" addon showed that, in
fact, it is possible to overcome those issues in javascript, everyone
seemed to hate the complexity and ugliness required to do so, so
attempts to propose adding some of that to core got quickly mired in
criticism. While I suppose it would be possible to rewrite all of
mailnews to be more JavaScript-friendly, realistically that is not likely
to happen. I also know from bitter experience that even fairly small bug
fixes in mailnews often lead to YEARS of chasing regressions through fix
after fix, so do not kid yourself that any of this would be easy.

As for the compatibility issue, for New Account Types I currently ship
DLLs supporting versions 8, 9, and 10 in the same addon. In the review
queue (for three weeks) is the next version that will support 9, 10, and
11. With this overall scheme, someone can switch seamlessly between the
last-release, current release, and beta versions. I, as the developer,
run Aurora. This really works just fine. I just wish that
Lightning would go to a similar scheme!

So in case it is not clear, I'm quite opposed to any attempts to
remove binary compatibility for extensions for Thunderbird.

rkent