Changes between Version 37 and Version 38 of StreamingGuide


Timestamp:
Oct 17, 2012, 6:34:39 PM (4 years ago)
Author:
rogerdpack
Comment:

--

Legend:

Unmodified
Added
Removed
Modified
  • StreamingGuide

    v37 v38  
    11= Streaming = 
    22[[PageOutline(2, Contents)]] 
    3 FFmpeg can basically stream in one of two ways: it either streams to some "other server", which restreams it, or it can stream via UDP directly to some destination host, or alternatively directly to a multicast destination. 
     3 
     4FFmpeg can basically stream in one of two ways: it either streams to some "other server", which restreams it, or it can stream via UDP/TCP directly to some destination receiver, or alternatively directly to a multicast destination. 
     5 
    46Servers which can receive from FFmpeg (to restream) include [http://ffmpeg.org/ffserver.html ffserver] (Linux only, though it might work under Cygwin), [http://en.wikipedia.org/wiki/Wowza_Media_Server Wowza Media Server], or [http://en.wikipedia.org/wiki/Adobe_Flash_Media_Server Flash Media Server]. Even [http://en.wikipedia.org/wiki/VLC_media_player VLC] can pick up the stream from ffmpeg and then redistribute it, acting as a server.  Since FFmpeg is at times more efficient than VLC at doing the raw encoding, this can be a useful option compared to doing both the transcoding and streaming in VLC. Nginx also has an rtmp redistribution plugin, as does [http://h264.code-shop.com/trac/wiki apache], and there are probably more out there.  You can also live stream to online redistribution servers like own3d.tv or justin.tv (for instance, streaming your desktop).  Also, any [http://www.flashrealtime.com/list-of-available-rtmp-servers/ rtmp server] will most likely work to receive streams from FFmpeg (these typically require you to set up a running instance on a server). 
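For a rough sketch of the two approaches (the input file name, RTMP application/stream name, and multicast address below are placeholders, not values from this guide):

 # push to an "other server" (RTMP) which then restreams to viewers
 ffmpeg -re -i input.mp4 -vcodec libx264 -acodec libmp3lame -f flv rtmp://example.com/live/mystream
 # or send an MPEG-TS directly to a receiver (a unicast host:port would also work here)
 ffmpeg -re -i input.mp4 -vcodec libx264 -acodec libmp3lame -f mpegts udp://239.255.1.1:1234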
    57 
    68How to stream several different bitrates simultaneously is described [http://sonnati.wordpress.com/2011/08/30/ffmpeg-%E2%80%93-the-swiss-army-knife-of-internet-streaming-%E2%80%93-part-iv/ here]. 
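As a rough sketch of the idea (the bitrates, sizes, and stream names below are made up), ffmpeg can encode the same input at two fixed bitrates in one invocation by listing multiple outputs, each with its own options:

 ffmpeg -re -i input.mp4 \
   -vcodec libx264 -b:v 500k -s 640x360 -acodec libmp3lame -f flv rtmp://example.com/live/low \
   -vcodec libx264 -b:v 1500k -s 1280x720 -acodec libmp3lame -f flv rtmp://example.com/live/high

Note that each output runs its own encode, so CPU usage grows with the number of bitrates.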
    79 
    8 NB that when you are testing your streams, you may want to test them with both VLC and [http://ffmpeg.org/ffplay.html FFplay], as FFplay sometimes introduces its own artifacts when it is scaled (it has poor quality scaling).  Don't use ffplay as your baseline for determining quality. 
     10NB that when you are testing your streams, you may want to test them with both VLC and [http://ffmpeg.org/ffplay.html FFplay], as FFplay sometimes introduces its own artifacts when it is scaled (it uses poor quality scaling, which can be inaccurate).  Don't use ffplay as your baseline for determining quality. 
    911 
    1012Also note that encoding to the x264 "baseline" profile is basically for older iOS devices or the like, see [http://sonnati.wordpress.com/2011/08/30/ffmpeg-%E2%80%93-the-swiss-army-knife-of-internet-streaming-%E2%80%93-part-iv/ here].  Some people argue that just using the mpeg4video codec is better than x264 baseline (where possible) since it is a simpler codec. 
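For instance (a sketch only; the level value and output names are just examples), the baseline profile can be requested from libx264, or the simpler codec can be chosen instead:

 # constrain libx264 to the baseline profile for older devices
 ffmpeg -i input.mp4 -vcodec libx264 -profile:v baseline -level 3.0 -acodec libmp3lame out_baseline.mp4
 # or use the simpler mpeg4video codec
 ffmpeg -i input.mp4 -vcodec mpeg4 -acodec libmp3lame out_mpeg4.mp4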
    11  
    12  
    1313 
    1414== The -re flag == 
     
    103103The most popular streaming codec is probably [http://www.videolan.org/developers/x264.html libx264], though if you're streaming to a device which requires a "crippled" baseline h264 implementation, some have argued that the mp4 video codec is [http://forums.macrumors.com/showthread.php?t=398016 better].  You can also use mpeg2video, or really any other video codec, as long as your receiver can decode it and it suits your needs. 
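As an illustration (the destination address and port are placeholders), swapping codecs is just a matter of changing -vcodec/-acodec, provided the container and receiver support them:

 # MPEG-2 video with MP2 audio in an MPEG-TS, sent over UDP
 ffmpeg -re -i input.mp4 -vcodec mpeg2video -acodec mp2 -f mpegts udp://192.168.1.20:1234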
    104104 
    105 == Outputting files == 
     105== HTTP Live Streaming == 
    106106 
    107 FFmpeg supports splitting files (using "-f segment" for the output, see [http://ffmpeg.org/ffmpeg.html#segment_002c-stream_005fsegment_002c-ssegment segment muxer]) into time based chunks, useful for [http://en.wikipedia.org/wiki/HTTP_Live_Streaming HTTP live streaming] style file output.  See also [http://stackoverflow.com/questions/12041077/ffmpeg-output-to-multiple-files-simultaneously this SO post]. 
     107FFmpeg supports splitting files (using "-f segment" for the output, see [http://ffmpeg.org/ffmpeg.html#segment_002c-stream_005fsegment_002c-ssegment segment muxer]) into time based chunks, useful for [http://en.wikipedia.org/wiki/HTTP_Live_Streaming HTTP live streaming] style file output.  See also http://sonnati.wordpress.com/2012/07/02/ffmpeg-the-swiss-army-knife-of-internet-streaming-part-v 
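A sketch of the segment muxer used this way (the segment length, playlist name, and file name pattern below are arbitrary):

 # cut the output into ~10 second MPEG-TS chunks plus an .m3u8 playlist
 ffmpeg -i input.mp4 -vcodec libx264 -acodec libmp3lame -map 0 -f segment -segment_time 10 -segment_list playlist.m3u8 -segment_format mpegts out%03d.ts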
     108 
     109== Saving a file and Streaming at the same time == 
     110 
     111See [Creating%20multiple%20outputs].  Basically, you may only be able to read from a webcam from one process at a time, so you'll need to "split" your output if you want to save it and stream it as well.  See also [http://stackoverflow.com/questions/12041077/ffmpeg-output-to-multiple-files-simultaneously this SO post]. 
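For example (the RTMP URL and file name are placeholders), a single ffmpeg process can read the input once and write both a live stream and a local recording by listing two outputs:

 ffmpeg -i INPUT -vcodec libx264 -acodec libmp3lame -f flv rtmp://example.com/live/mystream \
        -vcodec libx264 -acodec libmp3lame recording.mp4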
    108112 
    109113== Transcoding/repeating == 
     
    144148 'rtmp://<wowza server IP>/live/cam0' is where the transcoded video stream gets pushed to 
    145149 
    146 == Variable bitrate based on line conditions == 
     150== Adjusting bitrate based on line conditions == 
    147151 
    148152FFmpeg doesn't (today) support varying the encoding bitrate based on fluctuating network conditions.  It does, however, support outputting several "different" fixed bitrates at the same time ([http://stackoverflow.com/questions/12041077/ffmpeg-output-to-multiple-files-simultaneously see here]), which is somewhat related.  Also, if you are doing direct capture from DirectShow, the input device starts dropping frames when there is congestion, which somewhat simulates a varying bitrate. 
     
    152156If you get a "black/blank" screen from your server, try sending it yuv422p or yuv420p type input.  Some servers get confused if you send them yuv444 input (which is the default for libx264). 
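For example (the stream URL is a placeholder), the pixel format can be forced on the output like this:

 ffmpeg -i INPUT -vcodec libx264 -pix_fmt yuv420p -acodec libmp3lame -f flv rtmp://example.com/live/mystream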
    153157 
    154  
    155158== Point to point streaming == 
    156159 
    157 If you want to stream "from one computer to another", you could start up a server on one, and then stream from FFmpeg to that server, then have the client connect to that server.  Or you could do a point to point type stream, like: 
     160If you want to stream "from one computer to another", you could start up a server on one, then stream from FFmpeg to that server, and have the client connect to that server (the server could be running on either the sending or the receiving computer).  Or you could do a point to point type stream, like: 
    158161 
    159162 ffmpeg -i INPUT -acodec libmp3lame -ar 11025 -f rtp rtp://host:port
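On the receiving end you could then play it back with something like the following (a sketch only: the .sdp file is assumed to have been saved from the SDP description ffmpeg prints for RTP outputs, and the UDP address/port is a placeholder for the MPEG-TS-over-UDP variant shown earlier):

 # play an RTP stream described by an SDP file
 ffplay stream.sdp
 # or play an MPEG-TS arriving over UDP/multicast
 ffplay udp://239.255.1.1:1234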