Opened 9 years ago

Closed 9 years ago

#5185 closed defect (needs_more_info)

Excessive HTTP traffic for DASH mp4 videos

Reported by: Misaki
Owned by:
Priority: normal
Component: undetermined
Version: 2.7.3
Keywords:
Cc:
Blocked By:
Blocking:
Reproduced by developer: no
Analyzed by developer: no

Description

This might have been reported before; if so, I apologize.

I'm using YouTube as the example because most video sources on the Internet don't use DASH and YouTube is a very popular website. ffmpeg not only tries to index the entire mp4 file, it also repeatedly downloads the file from some offset to the end, instead of only to the start of the next segment.

This might be related to the 'DASH manifest', but recent versions of ffmpeg handle webm DASH videos from YouTube just fine, without excessive downloading.

ffmpeg's own statistics are unreliable on this point. With -v trace there is a report from AVIOContext of the number of bytes read, but that number is much smaller than the actual network traffic.
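For reference, a rough sketch of how that counter can be pulled out of the log (assuming the AVIOContext message still contains the word 'Statistics'; $vurl is the stream URL obtained by the download script below):

ffmpeg -v trace -i "$vurl" -c copy -t 1 -f null - 2>&1 | grep -i 'statistics'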

The reason could be that the network layer reads more packets from the server than it needs and discards the extra, but keeps the connection open until the transfer finishes. However, my current testing shows that, while the amount downloaded is high, it isn't as high as I concluded with a previous version of ffmpeg (which appeared to download to the end of the file starting at each 5-second segment).

Using a script that watches the RX counter in ifconfig, I see that it takes ~10 MB of downloading, and 12-25 seconds, to fetch the first second of a 426x240 video file. AVIOContext reports that only ~750 kB was read; about 33 kB of the input stream was actually consumed, the same amount (the size of the read packets) was demuxed, and the output is of similar size.

The short script I'm using to diagnose this, for people like me who are lazy:

start=$(ifconfig wlan3 | grep -o 'RX bytes:[0-9]*'); start=${start#*:}   # adjust wlan3 to your interface
for i in {1..10}; do sleep 10; end=$(ifconfig wlan3 | grep -o 'RX bytes:[0-9]*'); end=${end#*:}; echo "RX change: $((end-start))"; date; start=$end; done

Script for downloading:

i="0"; v=133; title="$(youtube-dl --get-filename -f $v -o "%(title)s [%(id)s].%(ext)s" $i)"; vurl="$(youtube-dl -g -f $v -- $i)";

time ffmpeg -ss 0 -i "$vurl" -hide_banner -c copy -map_metadata 0 -movflags faststart -avoid_negative_ts 2 -f ${title##*.} -t 1 /dev/null

(The two commands can be combined into one line.)

Using the 'cache:' protocol doesn't change this significantly. I think the problem was worse in the past and was specifically related to seeking (or at least made worse by seeking); currently, seeking doesn't seem to affect it.
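To be clear, 'cache:' means prepending ffmpeg's cache protocol to the URL, roughly like this (flags as in the command above):

time ffmpeg -ss 0 -i "cache:$vurl" -hide_banner -c copy -t 1 -f ${title##*.} /dev/null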

In this case, piping the input in with <(wget -qO - "$vurl") reduces the measured download activity from ~8.3 MB to 73 kB. AVIOContext also reports a drop from ~750 kB to ~50 kB.
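In other words, the piped variant looks roughly like this (bash process substitution in place of the URL):

time ffmpeg -ss 0 -i <(wget -qO - "$vurl") -hide_banner -c copy -t 1 -f ${title##*.} /dev/null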

(Trying to use the cache protocol together with piped input breaks things.)

So, ideally, ffmpeg shouldn't take longer, or download more, when it handles the download itself than when the input comes from wget. I'm not sure whether ffmpeg can avoid downloading from the start of the file to the seek point for mp4 files (perhaps via the DASH manifest?), even though it can now do so for webm files. I think it used to be able to do so for mp4 files too, before YouTube changed something, although maybe I just never noticed the long download times. In any case, it shouldn't download large amounts of data after the end point.
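For comparison, the same measurement can be repeated against a webm-only stream by changing the itag. A rough sketch, assuming itag 242 is YouTube's 240p webm video stream (it may not be available for every video):

v=242; vurl="$(youtube-dl -g -f $v -- $i)"
time ffmpeg -y -ss 0 -i "$vurl" -hide_banner -c copy -t 1 -f webm /dev/null   # -y skips the overwrite prompt for /dev/null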

Change History (3)

comment:1 by Misaki, 9 years ago

For anyone concerned about feasibility: totem, ffplay, and vlc could all play a DASH webm file from YouTube that had its middle missing. (Created by starting a download, using 'truncate', and then starting the download again.)

None of them could play a DASH mp4 file from YouTube with its middle missing, although they can play non-DASH mp4 files damaged in the same way.
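For reproducibility, a rough equivalent of how such a "missing middle" file can be produced (I used interrupted downloads plus truncate; explicit range requests as below give the same layout, all sizes are arbitrary examples, and $vurl points at the webm or mp4 stream in question):

curl -s -r 0-2097151 -o part1 "$vurl"    # first 2 MB
curl -s -r 8388608- -o part2 "$vurl"     # everything from the 8 MB mark onward
truncate -s 8M part1                     # pad part1 with zeros up to the 8 MB mark
cat part1 part2 > missing_middle.webm    # bytes between 2 MB and 8 MB are now zeros

The resulting file can then be fed to ffplay, totem, or vlc.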

comment:2 by Carl Eugen Hoyos, 9 years ago

Please test current FFmpeg git head and provide the command line that allows reproducing the issue, together with the complete, uncut console output.

comment:3 by Carl Eugen Hoyos, 9 years ago

Resolution: needs_more_info
Status: new → closed

Please reopen this ticket if you can provide the missing information.
