Opened 3 years ago
Last modified 3 years ago
#8413 new enhancement
Demuxing embedded H.264 frames from mjpeg stream from Logitech webcam
|Reported by:||Max Dymond||Owned by:|
|Blocking:||Reproduced by developer:||no|
|Analyzed by developer:||no|
The Logitech C920 webcam has the capability of generating H.264 video. Unfortunately, in newer revisions it achieves this by embedding those frames in an MJPEG stream as APP0 data. This can be seen by doing the following, as per here:
# create a small mp4, copying mjpeg stream off the cam for one second
$ ffmpeg -f v4l2 -input_format mjpeg -i /dev/video0 -c:v copy -t 1 test.mp4
# extract the unaltered jpeg files inside the stream
$ ffmpeg -i test.mp4 -vcodec copy %03d.jpg
# view any of the jpeg files for APP attachments
$ exiv2 -pS 001.jpg
STRUCTURE OF JPEG FILE: 001.jpg
 address | marker      |  length | data
       2 | 0xd8 SOI    |       0
       4 | 0xdb DQT    |     197
     203 | 0xc0 SOF0   |      17
     222 | 0xc4 DHT    |      31
     255 | 0xc4 DHT    |     181
     438 | 0xc4 DHT    |      31
     471 | 0xc4 DHT    |     181
     654 | 0xfe COM    |      10
     666 | 0xe0 APP0   |      16 | ...............................
     684 | 0xdd DRI    |       4
     690 | 0xda SOS    |      12
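For reference, the same segment structure that exiv2 prints can be walked programmatically. A minimal sketch in pure Python (the helper names `jpeg_segments` and `appn_payloads` are my own, not part of any existing tool): every segment between SOI and SOS carries a two-byte big-endian length, so scanning for APPn payloads is straightforward.

```python
def jpeg_segments(data: bytes):
    """Yield (offset, marker, length) for each segment from SOI to SOS.

    Every segment between SOI and SOS carries a two-byte big-endian
    length field that includes the length bytes themselves.
    """
    assert data[:2] == b"\xff\xd8", "not a JPEG (no SOI marker)"
    yield 0, 0xD8, 0
    pos = 2
    while pos + 4 <= len(data):
        assert data[pos] == 0xFF, "expected a marker byte"
        marker = data[pos + 1]
        length = int.from_bytes(data[pos + 2:pos + 4], "big")
        yield pos, marker, length
        if marker == 0xDA:  # SOS: entropy-coded scan data follows, stop here
            return
        pos += 2 + length

def appn_payloads(data: bytes, n: int = 0):
    """Return the payloads of every APPn (marker 0xE0 + n) segment."""
    return [data[off + 4:off + 2 + length]
            for off, marker, length in jpeg_segments(data)
            if marker == 0xE0 + n]
```

Running `appn_payloads(frame, 0)` on one of the extracted `.jpg` files returns the raw bytes of each APP0 segment, which is where the embedded data would live according to the description above.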
My plan is to write a demuxer to demux this mjpeg stream into both mjpeg data and H.264 data, so I can simply copy H.264 data into the output. This would be similar to the gstreamer plugin uvch264mjpgdemux, except in ffmpeg.
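The per-frame split itself is mechanically simple. A self-contained sketch of the idea, assuming purely for illustration that the embedded bytes sit directly in APP0 payloads (the real payloads carry a vendor header, per Logitech's UVC documentation, which a proper demuxer would have to strip):

```python
def split_frame(jpeg: bytes, app_marker: int = 0xE0):
    """Split one JPEG frame into (mjpeg_bytes, embedded_payload).

    Walks the marker segments before SOS; segments matching `app_marker`
    are diverted to the embedded stream, everything else is kept so the
    remaining bytes are still a valid JPEG.
    """
    kept = bytearray(jpeg[:2])            # keep the SOI marker
    embedded = bytearray()
    pos = 2
    while pos + 4 <= len(jpeg):
        marker = jpeg[pos + 1]
        length = int.from_bytes(jpeg[pos + 2:pos + 4], "big")
        if marker == 0xDA:                # SOS: rest of frame is scan data
            kept += jpeg[pos:]
            break
        segment = jpeg[pos:pos + 2 + length]
        if marker == app_marker:
            embedded += segment[4:]       # drop marker + length bytes
        else:
            kept += segment
        pos += 2 + length
    return bytes(kept), bytes(embedded)
```

Concatenating the `embedded` output across frames would yield the H.264 elementary stream, while the `kept` output remains a plain MJPEG stream, which is essentially what uvch264mjpgdemux does on the GStreamer side.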
I posted about this on the ffmpeg-devel IRC and someone suggested I raise a ticket for this, so here's the ticket!
Change History (9)
comment:1 by , 3 years ago
|Component:||undetermined → avformat|
|Keywords:||mjpeg h264 added|
|Version:||unspecified → git-master|
by , 3 years ago
Stream #0:0(und): Video: mjpeg (mp4v / 0x7634706D), yuvj422p(pc, bt470bg/unknown/unknown), 1920x1080, 64007 kb/s, 30.61 fps, 29.92 tbr, 1000k tbn, 1000k tbc (default)
comment:2 by , 3 years ago
pi@octopi:~ $ ffmpeg -f v4l2 -input_format mjpeg -i /dev/video0 -c:v copy -fs 2400k -f rawvideo h264mjpeg
ffmpeg version 3.2.10-1~deb9u1+rpt1 Copyright (c) 2000-2018 the FFmpeg developers
  built with gcc 6.3.0 (Raspbian 6.3.0-18+rpi1) 20170516
  configuration: --prefix=/usr --extra-version='1~deb9u1+rpt1' --toolchain=hardened --libdir=/usr/lib/arm-linux-gnueabihf --incdir=/usr/include/arm-linux-gnueabihf --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libebur128 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx-rpi --enable-mmal --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
  libavutil      55. 34.101 / 55. 34.101
  libavcodec     57. 64.101 / 57. 64.101
  libavformat    57. 56.101 / 57. 56.101
  libavdevice    57.  1.100 / 57.  1.100
  libavfilter     6. 65.100 /  6. 65.100
  libavresample   3.  1.  0 /  3.  1.  0
  libswscale      4.  2.100 /  4.  2.100
  libswresample   2.  3.100 /  2.  3.100
  libpostproc    54.  1.100 / 54.  1.100
Input #0, video4linux2,v4l2, from '/dev/video0':
  Duration: N/A, start: 45054.742469, bitrate: N/A
    Stream #0:0: Video: mjpeg, yuvj422p(pc, bt470bg/unknown/unknown), 1920x1080, -5 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
Output #0, rawvideo, to 'h264mjpeg':
  Metadata:
    encoder         : Lavf57.56.101
    Stream #0:0: Video: mjpeg, yuvj422p(pc, bt470bg/unknown/unknown), 1920x1080, q=2-31, -5 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
Press [q] to stop, [?] for help
frame=    9 fps=0.0 q=-1.0 Lsize=    2393kB time=00:00:00.39 bitrate=49509.1kbits/s speed=   1x
video:2393kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.000000%
by , 3 years ago
Output from ffmpeg -f v4l2 -input_format mjpeg -i /dev/video0 -c:v copy -fs 2400k -f rawvideo h264mjpeg
comment:3 by , 3 years ago
Please use a file hoster if 9 frames are not enough to show the issue and provide a link.
comment:4 by , 3 years ago
@cehoyos: I believe the 9 frames should suffice as test input data; following the procedure in the main post, you can see the APP0 data in the JPG frames.
comment:5 by , 3 years ago
Link to the GStreamer demuxer:
comment:6 by , 3 years ago
After examining the data a little more, the mjpeg data coming from the camera (and captured above) doesn't appear to have H.264 present in the APP fields of the JFIFs. I do have a spare camera of the same type (purchased at a different time) which I can also try this process on; I can also try capturing the data on Windows as well to see if that makes any difference due to different drivers.
Slightly confusing, but about par for the course for this camera so far...
comment:7 by , 3 years ago
- I plugged in the other Logitech C920 camera that I bought, and that exposes H.264 - so my immediate need here has suddenly been met. However, I wanted to pursue this a bit more for future persons who are hitting issues like this.
- Comparing the working camera and the non-working camera:
- the working camera has USB ID 046d:082d
- the non-working camera has USB ID 046d:0892
- the non-working camera does not expose the H.264 codec control Extension GUID (A29E7641-DE04-47e3-8B2B-F4341AFF003B, as defined in this document, which details exactly how the MJPEG smuggling works), but the working camera does.
- I plugged the -0892 camera into my Windows PC to see if it exposed H.264 or exposed H.264 over MJPEG - it did not.
- I have raised a ticket against Logitech to ask them whether this camera even has H.264 natively or not; I don't know if I'll get back a response.
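One gotcha when hunting for that Extension GUID in USB descriptors (e.g. in `lsusb -v` output for the extension unit's `guidExtensionCode`): the first three GUID fields are stored little-endian on the wire, so the bytes won't match a naive reading of the GUID string. A quick check using Python's standard `uuid` module:

```python
import uuid

# The H.264 extension unit GUID quoted above.
H264_XU_GUID = uuid.UUID("A29E7641-DE04-47e3-8B2B-F4341AFF003B")

# bytes_le swaps the first three fields into the little-endian
# layout used by USB descriptors.
wire = H264_XU_GUID.bytes_le
print(wire.hex())  # -> 41769ea204dee3478b2bf4341aff003b
```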
At this point, I'm unsure whether I can actually generate a video file with the desired behaviour, and so also unsure whether you want to keep this enhancement request open.
- The code at https://gl.swmansion.com/bartosz.blaszkow/ffmpeg-uvcvideo/tree/release/4.0 has support for doing UVCX_VIDEO_CONFIG_PROBE (which I think is part of turning on H.264 support here)
- My guess for how you ask the camera to generate MJPEG in a container is to twiddle some of the bits in bStreamMuxOption with UVCX_VIDEO_CONFIG_COMMIT (Bit 6: MJPEG payload used as a container), but without a camera to test on, I'm just guessing.
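If that guess is right, the manipulation itself is just setting one bit in the option byte before committing it back to the camera (via an extension-unit SET_CUR, e.g. the uvcvideo UVCIOC_CTRL_QUERY ioctl on Linux). A minimal sketch; only bit 6 is taken from the quote above, and the layout of the surrounding probe/commit block is deliberately not assumed here:

```python
# Bit 6 of bStreamMuxOption: "MJPEG payload used as a container"
# (per the spec excerpt quoted above). Other bit meanings are not
# assumed in this sketch.
MJPEG_CONTAINER = 1 << 6

def with_mjpeg_container(stream_mux_option: int) -> int:
    """Return the bStreamMuxOption byte with the MJPEG-container bit set,
    leaving any other bits the camera reported untouched."""
    return stream_mux_option | MJPEG_CONTAINER
```

The workflow would presumably mirror the usual UVC probe/commit dance: GET_CUR the config block, patch the byte, then SET_CUR it with UVCX_VIDEO_CONFIG_COMMIT.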
To make this a valid ticket, run the following command and provide the complete, uncut console output, and attach the output file: