Changes between Version 62 and Version 63 of StreamingGuide


Timestamp: Jan 3, 2014, 9:42:31 PM
Author: rogerdpack
Comment: clarify some on latency

Legend: lines removed in v63 are prefixed with "-", lines added with "+"; unmodified lines have no prefix.

  • StreamingGuide (v62 → v63)
== Latency ==

- You may be able to decrease latency by specifying that I-frames come "more frequently" (or basically always, in the case of [[x264EncodingGuide|x264]]'s zerolatency setting), though this can increase frame size and decrease quality; see [http://mewiki.project357.com/wiki/X264_Encoding_Suggestions here] for some more background.  Basically, for typical x264 streams it inserts an I-frame every 250 frames.  This means that new clients that connect to the stream may have to wait up to 250 frames before they can start receiving the stream (or start with old data).  So increasing I-frame frequency makes the stream larger, but might decrease latency.  For real-time captures you can also decrease audio latency in Windows dshow by using the dshow audio_buffer_size [http://ffmpeg.org/ffmpeg.html#Options setting].  You can also decrease latency by tuning any broadcast server you are using to minimize latency, and finally by tuning the client that receives the stream to not "cache" any incoming data, which, if it does, increases latency.
+ You may be able to decrease initial "startup" latency by specifying that I-frames come "more frequently" (or basically always, in the case of [[x264EncodingGuide|x264]]'s zerolatency setting), though this can increase frame size and decrease quality; see [http://mewiki.project357.com/wiki/X264_Encoding_Suggestions here] for some more background.  Basically, for typical x264 streams it inserts an I-frame every 250 frames.  This means that new clients that connect to the stream may have to wait up to 250 frames before they can start receiving the stream (or start with old data).  So increasing I-frame frequency makes the stream larger, but might decrease latency.  For real-time captures you can also decrease audio latency in Windows dshow by using the dshow audio_buffer_size [http://ffmpeg.org/ffmpeg.html#Options setting].  You can also decrease latency by tuning any broadcast server you are using to minimize latency, and finally by tuning the client that receives the stream to not "cache" any incoming data, which, if it does, increases latency.
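
As a rough sketch of the above (the device names, address, and port are hypothetical, not taken from this guide), a dshow capture command might force more frequent I-frames with -g and shrink the dshow audio buffer:

{{{
# Hypothetical devices/address: capture audio+video via dshow, use a 50 ms dshow
# audio buffer instead of the device's default, and emit an I-frame every 30
# frames (-g 30) rather than x264's default of 250.
ffmpeg -f dshow -audio_buffer_size 50 -i video="Hypothetical Capture Device":audio="Hypothetical Microphone" -c:v libx264 -g 30 -c:a libmp3lame -f mpegts udp://192.168.1.100:1234
}}}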

Sometimes audio codecs also introduce some latency of their own.  You may be able to get less latency by using speex, for example, or opus, in place of libmp3lame.
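
A minimal sketch of just that codec swap (filenames are hypothetical); in a streaming command the rest of the options would stay the same:

{{{
# Same source, different audio encoder: opus (or speex) typically adds far less
# algorithmic delay than mp3.
ffmpeg -i input.wav -c:a libmp3lame output.mp3
ffmpeg -i input.wav -c:a libopus output.ogg
}}}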
     
You will also want to try to decrease latency at the server side; see for instance these [http://www.wowza.com/forums/content.php?81-How-to-achieve-the-lowest-latency-from-capture-to-playback wowza] hints.

- Also setting -probesize and -analyzeduration to low values may help your stream start up more quickly (ffmpeg uses these to scan for "streams" in certain muxers, like ts, where some streams can appear "later", and also to estimate the duration, which you don't need for live streams anyway).
- 
- Reducing caching at the client side can help, too; for instance mplayer has a "-nocache" option, and other players may similarly have some type of pre-playback buffering occurring.
+ Also setting -probesize and -analyzeduration to low values may help your stream start up more quickly (ffmpeg uses these to scan for "streams" in certain muxers, like ts, where some streams can appear "later", and also to estimate the duration, which you don't need for live streams anyway).  This should be unneeded for dshow input.
+ 
+ Reducing caching at the client side can help, too; for instance mplayer has a "-nocache" option, and other players may similarly have some type of pre-playback buffering occurring.  (In reality, mplayer's -benchmark option has much more effect.)
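
For example (the UDP source address is hypothetical), the receiving side can be told to probe almost nothing before it starts playing:

{{{
# Small probesize/analyzeduration values make ffplay start showing the stream as
# soon as it has identified it, instead of buffering input to estimate duration.
ffplay -probesize 32 -analyzeduration 0 udp://192.168.1.100:1234
}}}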

Using an encoder that encodes more quickly (or possibly even raw format?) might reduce latency.
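
A sketch of that idea (input file and destination are hypothetical): the fastest x264 settings, plus the "raw format" variant the paragraph speculates about:

{{{
# Fast, low-buffering x264 settings (lower quality per bit, but quicker encoding):
ffmpeg -i input.avi -c:v libx264 -preset ultrafast -tune zerolatency -f mpegts udp://192.168.1.100:1234

# Or send unencoded video (near-zero encoder latency, but enormous bitrate);
# nut is a simple container that can carry rawvideo.
ffmpeg -i input.avi -c:v rawvideo -f nut udp://192.168.1.100:1234
}}}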
     
NB that if you are sending to UDP or whatnot, a client may have to wait until the next I-frame to be able to start receiving the stream, so the GOP setting (-g) I-frame interval will have an effect on how quickly they can connect.  Setting it to a lower number means it will use more bandwidth, but clients will be able to connect more quickly (the default for x264 is 250--so at 30 fps that means an I-frame only once every 8 seconds or so).  So it's a tradeoff if you adjust it.  This does not affect actual latency (just connection time), since the client can still display frames very quickly once it has received its first I-frame.  Also, if you're using a lossy transport like UDP, then an I-frame represents "the next chance it will have to repair the stream" if there are problems from packet loss.
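
A hypothetical sketch of that tradeoff (input and destination made up), with the arithmetic spelled out:

{{{
# With -g 60 at 30 fps there is an I-frame every 60/30 = 2 seconds, so a newly
# connecting client waits at most ~2 s (on average ~1 s) for its first I-frame,
# versus up to ~8 s with the x264 default of 250.
ffmpeg -i input.avi -c:v libx264 -g 60 -f mpegts udp://192.168.1.100:1234
}}}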

- You can also (if capturing from a live source) increase frame rate to decrease latency (which affects throughput and also I-frame frequency, of course).
+ You can also (if capturing from a live source) increase frame rate to decrease latency (which affects throughput and also I-frame frequency, of course).  This obviously sends packets more frequently (with 5 fps you introduce at least 0.2 s of latency, with 10 fps 0.1 s), but it also helps clients to fill their internal buffers, etc. more quickly.
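
A sketch with a hypothetical dshow device name, requesting a higher capture frame rate:

{{{
# At 30 fps each frame represents ~1/30 s (~0.033 s) of capture delay, versus
# 0.2 s per frame at 5 fps.
ffmpeg -f dshow -framerate 30 -i video="Hypothetical Capture Device" -c:v libx264 -tune zerolatency -f mpegts udp://192.168.1.100:1234
}}}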

Note also that using dshow's "rtbufsize" has the unfortunate side effect of allowing frames to "buffer" while it is waiting on encoding of previous frames, or waiting for them to be sent on the wire.  This means that if you use a higher value at all, it can introduce added latency if the buffer ever gets used (though if it is used, it can be helpful in other ways, like transmitting more frames overall).
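
For illustration (device name and destination are hypothetical), rtbufsize is a dshow input option, so it goes before -i:

{{{
# A large real-time buffer (here 100 MB) avoids dropping frames when encoding or
# sending falls behind, but any frames that accumulate in it are added latency.
ffmpeg -f dshow -rtbufsize 100M -i video="Hypothetical Capture Device" -c:v libx264 -tune zerolatency -f mpegts udp://192.168.1.100:1234
}}}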

- There is also apparently an option -fflags nobuffer which might possibly help [http://ffmpeg.org/ffmpeg.html#Format-AVOptions reduce latency].
+ There is also apparently an option -fflags nobuffer, usually for receiving streams, which might possibly help [http://ffmpeg.org/ffmpeg.html#Format-AVOptions reduce latency].
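
For instance, on the receiving side (source address hypothetical):

{{{
# -fflags nobuffer reduces the buffering done during initial input analysis;
# combining it with small probe values (see above) helps playback begin sooner.
ffplay -fflags nobuffer -probesize 32 -analyzeduration 0 udp://192.168.1.100:1234
}}}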

=== Testing latency ===

- By default, ffplay (as a receiver for testing latency) introduces a small latency of its own, so if you use it for testing (see the troubleshooting section) it may need some of these parameters as well.  NB that ffplay has somewhat poor video output, though, so don't base quality levels on it.  Also some settings mentioned above, like "probesize", might help it start more quickly.
+ By default, ffplay (as a receiver for testing latency) introduces a small latency of its own, so if you use it for testing (see the troubleshooting section) it may not reflect latency accurately.  NB that ffplay also has somewhat poor video output, so don't base quality levels on it.  Also some settings mentioned above, like "probesize", might help it start more quickly.

Also useful is mplayer with its -benchmark for testing latency (-noaudio and/or -nocache *might* be useful, though I haven't found -nocache to provide any latency benefit).
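
A rough sketch (source address hypothetical, and assuming your mplayer build handles udp:// input for your stream type):

{{{
# -benchmark prints timing statistics at the end of playback and, for video-only
# playback, ignores frame durations; per the guide it has a noticeable effect on
# latency.  -nocache disables mplayer's pre-playback cache.
mplayer -benchmark -nocache udp://192.168.1.100:1234
}}}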

- Using the SDL out option while using FFmpeg to receive the stream might also help to view frames with less client-side latency: "ffmpeg ... -f sdl -"
+ Using the SDL out option while using FFmpeg to receive the stream might also help to view frames with less client-side latency: "ffmpeg ... -f sdl -"  (this works especially well with -fflags nobuffer, though in my tests it is still barely slower than using mplayer).
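
Filled in with a hypothetical source address (the guide's command leaves the input elided), that might look something like:

{{{
# Receive with ffmpeg itself and display the decoded frames in an SDL window.
# -an drops audio, since the sdl output device only handles video.
ffmpeg -fflags nobuffer -i udp://192.168.1.100:1234 -an -f sdl -
}}}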

- === see also ===
+ === See also ===

- [http://stackoverflow.com/a/12085571/32453 Here] is a list of some other ideas to try (make sure you're using VBR, etc.)
+ [http://stackoverflow.com/a/12085571/32453 Here] is a list of some other ideas to try (using VBR may help, etc.)

== Cpu usage/File size ==