My company has a C++ program that embeds the SpiderMonkey interpreter.
One operation this program supports ends up compiling and evaluating
JavaScript code repeatedly. If I build the program using the JS38
SpiderMonkey, this long operation will eventually cause the interpreter
to run out of memory. The 1.8.5 SpiderMonkey did not have this problem.
I'll ask my question and then give more background below: I'm having
some difficulty figuring out where the memory is going. Is there some
sort of memory usage reporting I can use, either in the C++ API or in
JS itself?
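For context, the closest thing I've found so far is the GC parameter API. A minimal sketch of the kind of reporting I'm imagining, assuming a JS38 build where `rt` is the program's single JSRuntime:

```cpp
#include <cstdio>
#include "jsapi.h"

// Report the GC heap's current size and its configured cap, in bytes.
// JSGC_BYTES is the number of bytes currently allocated on the GC heap;
// JSGC_MAX_BYTES is the limit passed to JS_NewRuntime.
void ReportGCMemory(JSRuntime *rt) {
    uint32_t used = JS_GetGCParameter(rt, JSGC_BYTES);
    uint32_t max  = JS_GetGCParameter(rt, JSGC_MAX_BYTES);
    std::fprintf(stderr, "GC heap: %u of %u bytes\n", used, max);
}
```

This only covers the GC heap, though, not malloc'd memory held by the engine, so I'm not sure it would catch whatever is leaking.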
This program compiles and executes JS code as needed during its
operation. I can control how it deals with memory.
A ScriptEngine object manages a pointer to a JSContext and a global
JSObject. There's a single global JSRuntime. There are two behaviors
I've tried:
1. Every time a ScriptEngine is instantiated, a new JSContext and Global
JSObject is created. When the ScriptEngine object is destroyed, the
context and the global object are destroyed.
2. A single instance of the JSContext and global JSObject is re-used by
every instance of the ScriptEngine. The JSContext is destroyed with
JS_DestroyContext at shutdown, and the global object is managed with a
persistent root.
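To make case 1 concrete, here is roughly what the per-engine lifecycle looks like (a simplified sketch, not our actual code; the class layout and names are illustrative, and error handling is omitted):

```cpp
#include "jsapi.h"

// Global class definition, per the SM38 embedding examples.
static JSClass globalClass = {
    "global", JSCLASS_GLOBAL_FLAGS,
    JS_PropertyStub, JS_DeletePropertyStub,
    JS_PropertyStub, JS_StrictPropertyStub,
    nullptr, nullptr, nullptr, nullptr,
    nullptr, nullptr, nullptr,
    JS_GlobalObjectTraceHook
};

// Case 1: each ScriptEngine owns its own context and global object,
// both created in the constructor and torn down in the destructor.
class ScriptEngine {
public:
    explicit ScriptEngine(JSRuntime *rt) {
        cx_ = JS_NewContext(rt, 8192);  // 8K stack chunk size
        JSAutoRequest ar(cx_);
        global_.init(cx_, JS_NewGlobalObject(cx_, &globalClass, nullptr,
                                             JS::FireOnNewGlobalHook,
                                             JS::CompartmentOptions()));
    }
    ~ScriptEngine() {
        global_.reset();            // drop the root before the context dies
        JS_DestroyContext(cx_);
    }
private:
    JSContext *cx_;
    JS::PersistentRootedObject global_;
};
```

Case 2 is the same except the context and rooted global are created once and shared by every ScriptEngine instance.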
In case 1, I encounter the 'out of memory' exception from the
SpiderMonkey interpreter. In case 2, the out of memory exception never
occurs.
All I can figure out is that the JSRuntime is hanging on to memory for
some reason after the destruction of the JSContext and Global object,
even though they're destroyed.
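One experiment I'm considering, to test whether the runtime is merely deferring collection rather than truly leaking: force a full GC after tearing down each ScriptEngine and check whether the heap count drops. A sketch, assuming the JS38 API where `JS_GC` takes the runtime:

```cpp
#include <cstdio>
#include "jsapi.h"

// After destroying a ScriptEngine (context and global root gone),
// force a full collection and see whether JSGC_BYTES shrinks. If it
// does, the runtime was just holding garbage; if not, something is
// still rooting the old compartment's objects.
void CheckRuntimeRetention(JSRuntime *rt) {
    uint32_t before = JS_GetGCParameter(rt, JSGC_BYTES);
    JS_GC(rt);  // full, non-incremental collection
    uint32_t after = JS_GetGCParameter(rt, JSGC_BYTES);
    std::fprintf(stderr, "GC heap before/after GC: %u / %u bytes\n",
                 before, after);
}
```

If the number stays flat across many engine create/destroy cycles, that would point at a lingering root rather than GC scheduling.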