Re: Ratelimiting video to avoid buffer bloat


Re: Ratelimiting video to avoid buffer bloat

Jason Duell-3
[Moved to dev.tech.network]

On Thu, 2012-03-01 at 12:51 +0100, Ashwin Rao wrote:
>  I have a suggestion regarding streaming HTML5 videos. Currently
>  Firefox does not restrict the rate of data transfer while streaming
>  HTML5 videos. Google Chrome and Internet Explorer however restrict the
>  rate of data transfer. I have detailed out these rate restrictions in
>  a preliminary work available at hal.inria.fr/inria-00638063/en/


On 03/01/2012 07:53 AM, Patrick McManus wrote:
> This is actually a design item we've already incorporated for an
> upcoming implementation of the DASH framework. I don't recall having
> talked about it in the scope of normal video/webm stuff but you're
> absolutely right that it applies there.

Yes, we should do it for all video.

> I've cc'd some folks who are hands on with the video projects -
> hopefully they can chime in here and incorporate this. It's an important
> topic. Can the right person file a bug?

One big question is: at what level should we be doing this?  The obvious
two choices are to 1) have the video layer control download speed by
calling suspend/resume on the network channel as needed (a pretty blunt
hammer); or 2) add some sort of API to necko that lets the consumer
specify what sort of bitrate they need to get out of the channel, and
have necko deal with controlling the rate.  Of course, TCP doesn't
exactly give us great tools for rate control either: besides simply not
reading from the socket (same as suspending the channel), we could
fiddle with incoming OS socket buffer sizes; anything else?
Perhaps we should start with suspending/resuming the channel, so we can
make progress quickly, and keep an eye on how necko internally could
facilitate things.
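For concreteness, option 2 could sit on top of something like a token bucket: the consumer declares a target bitrate, and the transport only reads as many bytes from the socket as the bucket allows, effectively staying suspended while the budget is exhausted. A hypothetical sketch of the bookkeeping (none of these names are real necko APIs):

```python
import time

class TokenBucket:
    """Byte-budget pacer: allows up to rate_bytes_per_s sustained, with a burst cap."""

    def __init__(self, rate_bytes_per_s, burst_bytes):
        self.rate = rate_bytes_per_s   # sustained budget, bytes per second
        self.capacity = burst_bytes    # largest burst we will ever grant
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def grant(self, requested, now=None):
        """Return how many of `requested` bytes may be read right now.

        A return of 0 means "leave the socket unread for now", which is
        the same underlying mechanism as suspending the channel."""
        now = time.monotonic() if now is None else now
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        allowed = int(min(requested, self.tokens))
        self.tokens -= allowed
        return allowed
```

A consumer-facing API would then be a thin wrapper mapping a requested bitrate onto one of these per channel (or per load group).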

Thoughts?

Jason


> (jason, steve, josh, roc: here is a simple
> http://devfiles.myopera.com/articles/1891/sunflower-webm.html html5
> video page with 22 seconds of video, which I see all come down in 1
> chunk as fast as possible.. iirc some videos will use multiple chunks
> spread over time, but even that should be ratepaced where possible to
> share the network more effectively.. not sure how effectively that is
> currently happening)
>
> -Pat
_______________________________________________
dev-tech-network mailing list
[hidden email]
https://lists.mozilla.org/listinfo/dev-tech-network

Re: Ratelimiting video to avoid buffer bloat

Patrick McManus
On Thu, 2012-03-01 at 09:56 -0800, Jason Duell wrote:

>
> One big question is: at what level should we be doing this?  The obvious
> two choices are to 1) have the video layer control download speed by
> calling suspend/resume on the network channel as needed.  That's a
> pretty blunt hammer;  or 2) add some sort of API to necko that lets the
> consumer specify what sort of bitrate they need to get out of the
> channel, and have necko deal with controlling the rate.  

Presumably #2. Maybe make it load-group based instead of, or in addition
to, per-channel. A non-video use case would be items in the download
window when they are competing with interactive needs.





Re: Ratelimiting video to avoid buffer bloat

Christian Biesinger-2
On Thu, Mar 1, 2012 at 11:01 AM, Patrick McManus <[hidden email]> wrote:

> On Thu, 2012-03-01 at 09:56 -0800, Jason Duell wrote:
>> One big question is: at what level should we be doing this?  The obvious
>> two choices are to 1) have the video layer control download speed by
>> calling suspend/resume on the network channel as needed.  That's a
>> pretty blunt hammer;  or 2) add some sort of API to necko that lets the
>> consumer specify what sort of bitrate they need to get out of the
>> channel, and have necko deal with controlling the rate.
>
> presumably #2.. maybe load group based instead of or in addition to
> channel. non video use case would be things in the download window when
> they were competing with interactive needs.

FWIW, for the downloads case we could use the existing prioritization
mechanism (nsISupportsPriority) to give downloads a lower priority...

-christian

Re: Ratelimiting video to avoid buffer bloat

Patrick McManus
In reply to this post by Jason Duell-3
On Thu, 2012-03-01 at 09:56 -0800, Jason Duell wrote:

> [Moved to dev.tech.network]
>
> On Thu, 2012-03-01 at 12:51 +0100, Ashwin Rao wrote:
> >  I have a suggestion regarding streaming HTML5 videos. Currently
> >  Firefox does not restrict the rate of data transfer while streaming
> >  HTML5 videos. Google Chrome and Internet Explorer however restrict the
> >  rate of data transfer. I have detailed out these rate restrictions in
> >  a preliminary work available at hal.inria.fr/inria-00638063/en/
>
>
> On 03/01/2012 07:53 AM, Patrick McManus wrote:
> > This is actually a design item we've already incorporated for an
> > upcoming implementation of the DASH framework. I don't recall having
> > talked about it in the scope of normal video/webm stuff but you're
> > absolutely right that it applies there.
>
> Yes, we should do it for all video.
>
> > I've cc'd some folks who are hands on with the video projects -
> > hopefully they can chime in here and incorporate this. It's an important
> > topic. Can the right person file a bug?

https://bugzilla.mozilla.org/show_bug.cgi?id=733010


Re: Ratelimiting video to avoid buffer bloat

Robert O'Callahan-3
In reply to this post by Patrick McManus
On Thu, Mar 1, 2012 at 7:01 PM, Patrick McManus <[hidden email]> wrote:

> On Thu, 2012-03-01 at 09:56 -0800, Jason Duell wrote:
> > One big question is: at what level should we be doing this?  The obvious
> > two choices are to 1) have the video layer control download speed by
> > calling suspend/resume on the network channel as needed.  That's a
> > pretty blunt hammer;  or 2) add some sort of API to necko that lets the
> > consumer specify what sort of bitrate they need to get out of the
> > channel, and have necko deal with controlling the rate.
>
> presumably #2.. maybe load group based instead of or in addition to
> channel. non video use case would be things in the download window when
> they were competing with interactive needs.
>

Using the loadgroup might be tricky since video downloads have to belong to
the page's loadgroup when we're delaying the page onload event for them,
and juggling the download channel between the page's main loadgroup and its
background loadgroup is already a pain.

Rob
--
“You have heard that it was said, ‘Love your neighbor and hate your enemy.’
But I tell you, love your enemies and pray for those who persecute you,
that you may be children of your Father in heaven. ... If you love those
who love you, what reward will you get? Are not even the tax collectors
doing that? And if you greet only your own people, what are you doing more
than others?" [Matthew 5:43-47]

Re: Ratelimiting video to avoid buffer bloat

Robert O'Callahan-3
In reply to this post by Jason Duell-3
On Thu, Mar 1, 2012 at 6:21 PM, Ashwin Rao <[hidden email]> wrote:

> > One big question is: at what level should we be doing this?  The obvious
> two
> > choices are to 1) have the video layer control download speed by calling
> > suspend/resume on the network channel as needed.  That's a pretty blunt
> > hammer;
>

Our media cache already calls suspend/resume "as needed" to throttle
downloading when the cache fills up. It is a blunt hammer :-).

The good news is that the media cache already knows exactly what is being
read by the decoder and when, and makes download decisions based on that.
Improving the download management policy is therefore not architecturally
difficult.

I'm not really sure what kind of policy is needed, though. We could easily
shrink the window of data we download for a particular video (e.g. 20s
ahead of and behind the current play point), but previous experience told
us that the more we buffer, the better. Should we do something like limit
the download rate to, say, 3x the playback data rate? I'm not sure the
suspend/resume approach to bandwidth management could do that.
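For what it's worth, the suspend/resume hammer could approximate a cap like "3x the playback data rate", just coarsely: sample the transfer periodically and suspend whenever the average download rate exceeds the chosen multiple. A hypothetical sketch of the decision (not actual media cache code; the sampling interval and running-average policy are assumptions):

```python
def throttle_decision(bytes_downloaded, elapsed_s, playback_bps, factor=3):
    """Suspend when the average download rate exceeds factor * playback rate.

    bytes_downloaded: bytes received since the transfer started
    elapsed_s:        seconds since the transfer started
    playback_bps:     video encoding rate, in bits per second
    """
    if elapsed_s <= 0:
        return "resume"
    avg_bps = bytes_downloaded * 8 / elapsed_s
    return "suspend" if avg_bps > factor * playback_bps else "resume"
```

The smoothness of the resulting rate is then bounded by how often the sampler runs.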

Rob

Re: Ratelimiting video to avoid buffer bloat

Robert O'Callahan-3
On Wed, Mar 7, 2012 at 1:55 PM, Ashwin Rao <[hidden email]> wrote:

> > Our media cache already calls suspend/resume "as needed" to throttle
> > downloading when the cache fills up. It is a blunt hammer :-).
> >
> > The good news is that the media cache already knows exactly what is being
> > read by the decoder and when, and makes download decisions based on that.
> > Improving the download management policy is therefore not architecturally
> > difficult.
> >
>
> Is the rate throttling available in the dev tree? In Firefox 10.0.1 I
> am not seeing this throttling. For example, entire HTML5 videos on
> YouTube, even ones 20 minutes in duration, were downloaded at the
> end-to-end available bandwidth (100 Mbps in my lab).
>

The media cache size is 500MB. If your video is smaller than that, you
won't see any throttling.

Try setting media.cache_size to, say, 50MB and preloading a 200MB video;
you should see downloading pause after approximately 50MB has been
loaded. Then start playing, and eventually you should see downloading
resume; the download will pause and resume to keep a window of data
ahead of the play point.

(I wouldn't call this "rate throttling", since it's not explicitly based on
rate.)
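The gating behavior described above amounts to a one-line policy. An illustrative model (the units and names are mine, not the actual media cache code):

```python
def cache_download_state(downloaded_mb, play_point_mb, cache_size_mb):
    """Media-cache-style gating: pause once the data buffered ahead of the
    play point fills the cache; resume as playback frees up room."""
    buffered_ahead = downloaded_mb - play_point_mb
    return "paused" if buffered_ahead >= cache_size_mb else "downloading"
```

So with a 50MB cache and a 200MB video, the download pauses at roughly 50MB and resumes only as the play point advances.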

> A work by Don Towsley suggests that the rate of up to 2 times the
> median encoding rate is sufficient for smooth playback [
> http://dl.acm.org/citation.cfm?id=1027735 ]. About the buffering I did
> some measurements  on YouTube where I observed that YouTube begins a
> streaming session (for Flash videos) by buffering 40 seconds of
> playback data before limiting the download rate to 1.25 times the
> video encoding rate. The details of the results are available at [
> http://hal.inria.fr/inria-00638063/en/ ].
>

Either of those could be implemented in Gecko pretty easily I guess,
although as I said before, I'm not sure pausing and resuming the Necko
download is adequate to hit a smooth target rate.

Rob

Re: Ratelimiting video to avoid buffer bloat

Jason Duell-3
On 03/07/2012 04:10 PM, Robert O'Callahan wrote:

> On Wed, Mar 7, 2012 at 1:55 PM, Ashwin Rao
> <[hidden email] <mailto:[hidden email]>> wrote:
>
>     > Our media cache already calls suspend/resume "as needed" to throttle
>     > downloading when the cache fills up. It is a blunt hammer :-).
>     >
>
>     The media cache size is 500MB. If your video is smaller than that,
>     you won't see any throttling.
>
>     Try setting media.cache_size to say 50MB, preload a 200MB video,
>     you should see downloading pause after approximately 50MB has been
>     loaded. Then start playing, and eventually you should see
>     downloading resume; the download will pause and resume to keep a
>     window of data ahead of the play point.
>
>     (I wouldn't call this "rate throttling", since it's not explicitly
>     based on rate.)
>

Necko internally would be doing rate throttling using the same
suspend/resume mechanism--that's all TCP gives us.  So it's not so much
a less-blunt hammer as one that we can swing more quickly (i.e. the
socket transport thread can keep track of the bandwidth coming in and
throttle it w/o the overhead/noise of thread events being sent back and
forth between the main thread and the video cache).  I'm not sure how
much difference that makes in practice, though--there's a good chance
it'd be smoother.  AFAICT this avoidance of event latency, plus a better
sense of "bottleneck bandwidth", are the only two advantages of doing
this in necko.  That might be reason enough, or it might make sense to
do a first version of this using suspend/resume from the media cache
with a <500MB buffer.

> Patrick wrote:
>
> The problem isn't the buffered video but the IP level buffering that
> happens on big tcp downloads...  that's why I filed it against
> networking first

I don't follow--the OS buffers for a TCP socket are much smaller than
the media cache's buffer.   Or are you talking about clogging router
buffers?  I still don't see how necko doing the suspend/resume vs the
media cache makes a difference here.

Jason


>
>     A work by Don Towsley suggests that the rate of up to 2 times the
>     median encoding rate is sufficient for smooth playback [
>     http://dl.acm.org/citation.cfm?id=1027735 ]. About the buffering I did
>     some measurements  on YouTube where I observed that YouTube begins a
>     streaming session (for Flash videos) by buffering 40 seconds of
>     playback data before limiting the download rate to 1.25 times the
>     video encoding rate. The details of the results are available at [
>     http://hal.inria.fr/inria-00638063/en/ ].
>
>
> Either of those could be implemented in Gecko pretty easily I guess,
> although as I said before, I'm not sure pausing and resuming the Necko
> download is adequate to hit a smooth target rate.



Re: Ratelimiting video to avoid buffer bloat

Patrick McManus

> > The problem isn't the buffered video but the IP level buffering that
> > happens on big tcp downloads...  that's why I filed it against
> > networking first
>
> I don't follow--the OS buffers for a TCP socket are much smaller than
> the media cache's buffer.   Or are you talking about clogging router
> buffers?  I still don't see how necko doing the suspend/resume vs the
> media cache makes a difference here.

Yes, the router buffer of the bottleneck link. That's where the problem
is and where the problem can be slow to clear - once data is at the
host it's not an issue for anything but RAM.

If necko can do it, we can apply the code to things other than media
consumers (e.g. the download window), where the notion of an
appropriate rate might be derived from some shared state about all the
necko connections - so we don't want to push that decision out to the
end consumer, who doesn't have visibility into all the other transfers.

Additionally, there are things necko can do other than just stop
reading on suspend: it can slam rwin (the recv buffer) down to ~0,
which is going to stop the transmit faster in the presence of large
windows (and windows larger than bw*delay are the heart of
bufferbloat); alternatively it can just reduce rwin to something
smaller than bw*delay to effectively slow the transmission rate and
make sure the bottleneck link gets its buffers cleared at least once
every rtt, and so on.
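A back-of-the-envelope version of the rwin idea: TCP's steady-state throughput is bounded by rwin/rtt, so advertising a receive window of roughly target_rate * rtt caps the sender without ever suspending it. An illustrative helper (the rtt estimate and the rounding policy are my assumptions, not necko code):

```python
def rwin_for_rate(target_bps, rtt_s, mss=1460):
    """Receive window (bytes) that caps TCP's steady-state rate near
    target_bps on a path with round-trip time rtt_s: rate ~= rwin / rtt."""
    rwin = int(target_bps / 8 * rtt_s)
    # round up to whole segments so the sender isn't stalled mid-packet
    return max(mss, ((rwin + mss - 1) // mss) * mss)
```

For example, capping at 8 Mbps over a 50 ms path gives a window of about 50 KB; anything larger only lets the sender fill bottleneck buffers faster.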





Re: Ratelimiting video to avoid buffer bloat

Ashwin Rao
In reply to this post by Jason Duell-3
On Thu, Mar 8, 2012 at 3:48 AM, Jason Duell <[hidden email]> wrote:

> On 03/07/2012 04:10 PM, Robert O'Callahan wrote:
>
> On Wed, Mar 7, 2012 at 1:55 PM, Ashwin Rao <[hidden email]>
> wrote:
>>
>> > Our media cache already calls suspend/resume "as needed" to throttle
>> > downloading when the cache fills up. It is a blunt hammer :-).
>> >
>>
>> The media cache size is 500MB. If your video is smaller than that, you
>> won't see any throttling.
>>
>>
>> Try setting media.cache_size to say 50MB, preload a 200MB video, you
>> should see downloading pause after approximately 50MB has been loaded. Then
>> start playing, and eventually you should see downloading resume; the
>> download will pause and resume to keep a window of data ahead of the play
>> point.
>>
>> (I wouldn't call this "rate throttling", since it's not explicitly based
>> on rate.)
>
>
> Necko internally would be doing rate throttling using the same
> suspend/resume mechanism--that's all TCP gives us.  So it's not so much a
> less-blunt hammer than one that we can swing more quickly (i.e the socket
> transport thread can keep track of the bandwidth coming in and throttle it
> w/o the overhead/noise of thread events being sent back and forth to the
> main thread from the video cache).  I'm not sure how much difference that
> makes in practice, though--there's a good chance it'd be smoother .  AFAICT
> this avoidance of event latency, plus a better sense of "bottleneck
> bandwidth" are the only 2 advantages of doing this in necko.    That might
> be reason enough, or it might make sense to do a 1st version of this using
> suspend/resume from the media cache with a <500MB buffer.
>

I agree that the buffer size should be less than 500 MB. A buffer size
of 10 MB would be large enough for most videos; 10 MB = 80 Mbits = 80
seconds of playback data for a video encoded at 1 Mbps. For HD videos
encoded at 5 Mbps it would amount to 16 seconds of playback time. A
double-buffering scheme could be tried, where the download resumes (or
begins) when the buffered amount falls below 10 MB and pauses when the
buffered amount reaches 20 MB. I need to take another look at the
typical encoding rates used by YouTube, Netflix, Vimeo, and
Dailymotion; I do not know the encoding rates used by other video
streaming services.
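The double-buffering scheme above is a hysteresis gate. A sketch of the logic, treating the 10/20 MB watermarks as tunable assumptions:

```python
class DoubleBufferGate:
    """Resume the download when buffered data falls below low_mb;
    pause it once buffered data reaches high_mb."""

    def __init__(self, low_mb=10, high_mb=20):
        self.low = low_mb
        self.high = high_mb
        self.downloading = True

    def update(self, buffered_mb):
        """Feed in the current buffered amount; returns True while downloading."""
        if self.downloading and buffered_mb >= self.high:
            self.downloading = False
        elif not self.downloading and buffered_mb < self.low:
            self.downloading = True
        return self.downloading
```

The gap between the two watermarks is what keeps the channel from being suspended and resumed on every packet.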

>> Patrick wrote:
>>
>> The problem isn't the buffered video but the IP level buffering that
>> happens on big tcp downloads...  that's why I filed it against networking
>> first
>
> I don't follow--the OS buffers for a TCP socket are much smaller than the
> media cache's buffer.   Or are you talking about clogging router buffers?  I
> still don't see how necko doing the suspend/resume vs the media cache makes
> a difference here.
>

The problem is that TCP tries to saturate the buffers at the routers
until a packet drop is encountered. On a packet drop it reduces its
sending rate, and the rate is then slowly (in the case of Reno)
increased until the next packet drop. In the case of home gateways the
buffers are large. One reason for large buffers is to absorb bursts of
packets; the side effect of these large buffers, however, is that the
steady-state queuing delay at the home gateway tends to exceed the
propagation delay. This can reduce the performance and responsiveness
of TCP flows.
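To put numbers on that: the standing-queue delay a full buffer adds at the bottleneck is just the buffer size divided by the link rate. Illustrative arithmetic (the buffer and link figures are examples, not measurements):

```python
def queueing_delay_s(buffer_bytes, link_bps):
    """Worst-case delay added when the bottleneck buffer is kept full."""
    return buffer_bytes * 8 / link_bps
```

A 128 KB gateway buffer ahead of a 1 Mbps uplink adds roughly a full second of queuing delay, which dwarfs typical propagation delays of tens of milliseconds.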

TCP flows transferring streaming video content do not need to send at
the end to end available bandwidth -- they can send at a rate that is
close to the video encoding rate. This can ensure that the TCP flows
transferring video content have a smaller footprint on the router
buffers.

Another important advantage of downloading at the reduced rate is that
the amount of unused bytes -- bytes downloaded by Firefox but never
consumed by the player because the user interrupted playback -- is kept
to a minimum. This keeps Firefox's media cache, memory, and disk
footprint small. I consider unused bytes to be wasted bytes, which in
turn represent wasted network resources spent transferring them.

Regards,
Ashwin



> Jason
>
>
>
>>
>> A work by Don Towsley suggests that the rate of up to 2 times the
>> median encoding rate is sufficient for smooth playback [
>> http://dl.acm.org/citation.cfm?id=1027735 ]. About the buffering I did
>> some measurements  on YouTube where I observed that YouTube begins a
>> streaming session (for Flash videos) by buffering 40 seconds of
>> playback data before limiting the download rate to 1.25 times the
>> video encoding rate. The details of the results are available at [
>> http://hal.inria.fr/inria-00638063/en/ ].
>
>
> Either of those could be implemented in Gecko pretty easily I guess,
> although as I said before, I'm not sure pausing and resuming the Necko
> download is adequate to hit a smooth target rate.
>
>
>

Re: Ratelimiting video to avoid buffer bloat

Robert O'Callahan-3
On Sat, Mar 24, 2012 at 3:30 AM, Ashwin Rao <[hidden email]> wrote:

> I agree that the buffer size should be less than 500 MB. A buffer size
> of 10 MB would be large enough for most videos; 10 MB = 80 Mbits = 80
> seconds of playback data for a video encoded at 1 Mbps. For HD videos
> that have an encoding rate of 5 Mbps it would account for 16 seconds
> of playback time. A  double buffering scheme where download
> resumes/begins when the buffered amount falls below 10 MB and download
> pauses when the buffer size is 20 MB could be tried. I need to re-look
> at the typical encoding rates used by YouTube, NetFlix, Vimeo, and
> Dailymotion. I do not have an idea on the video encoding rates used by
> other video streaming services.
>

Someone should file a bug to add a new preference controlling the maximum
desired buffered amount for any given media resource. It will mostly revert
bug 572235 so please link to that bug.

Rob