Opened 4 years ago

Last modified 3 years ago

#8459 open enhancement

V360 + BLEND

Reported by: villmer Owned by: villmer
Priority: wish Component: avfilter
Version: git-master Keywords: v360
Cc: Blocked By:
Blocking: Reproduced by developer: no
Analyzed by developer: no

Description

First I want to congratulate the ffmpeg team on implementing the excellent v360 filter. Well done.

There is one particular issue, however, I would like to see resolved in order to make this filter even better.

When converting dual-fisheye to equirectangular format, there is almost always a need to add some blending in order to reduce the sharp contrast between the two regions.

Attachments (6)

3ZGrG.jpg (85.4 KB ) - added by villmer 4 years ago.
Original Dual Fisheye Video
fli0z.png (668.2 KB ) - added by villmer 4 years ago.
Equirectangular video created using: ffmpeg -y -i in.mp4 -vf v360=dfisheye:e:in_pad=0.058 -c:v libx264 -b:v 3000k -bufsize 3000k -preset ultrafast -c:a copy out.mp4
Screen Shot 2020-02-22 at 9.40.09 PM.png (2.1 MB ) - added by villmer 4 years ago.
Stitched Equirectangular
Screen Shot 2020-02-22 at 9.48.20 PM.png (1.8 MB ) - added by villmer 4 years ago.
Stitched Equirectangular (trimmed)
out.png (656.3 KB ) - added by Michael Koch 4 years ago.
results.jpg (327.6 KB ) - added by villmer 4 years ago.
Results of blend method by mkoch

Change History (44)

by villmer, 4 years ago

Attachment: 3ZGrG.jpg added

Original Dual Fisheye Video

by villmer, 4 years ago

Attachment: fli0z.png added

Equirectangular video created using: ffmpeg -y -i in.mp4 -vf v360=dfisheye:e:in_pad=0.058 -c:v libx264 -b:v 3000k -bufsize 3000k -preset ultrafast -c:a copy out.mp4

comment:1 by villmer, 4 years ago

Owner: set to villmer
Status: new → open

in reply to:  description comment:2 by villmer, 4 years ago

After the FFmpeg conversion of the dual fisheye (dfisheye) to equirectangular format, the vertical lines separating the regions are sharp.
See attachment fli0z.png

Adding a "blend" parameter to soften these areas will provide a clean, seamless 360° video.
Without such blending, the results lack the quality often desired in professional VR content.

Last edited 4 years ago by villmer (previous) (diff)

comment:3 by Carl Eugen Hoyos, 4 years ago

Component: ffmpeg → avfilter
Keywords: dfisheye equirectangular removed
Priority: important → wish
Reproduced by developer: unset

comment:4 by villmer, 4 years ago

Noticed this was not Analyzed by developer nor Reproduced by developer. Would be nice to see a developer at least look into this.

comment:5 by Michael Koch, 4 years ago

This is a workaround for removing the left vertical border (between x=249 and x=250) by applying a suitable luminance gradient to all pixels in the x=229 to x=249 range.

ffmpeg -i fli0z.png -vf geq=cb_expr='cb(X,Y)':cr_expr='cr(X,Y)':lum_expr='if(between(X,229,249),lum(X,Y)+lerp((X-229)/20,0,lum(249,Y)-lum(250,Y)),lum(X,Y))',format=rgb24 -y out.png

comment:6 by villmer, 4 years ago

Conversion failed:

ffmpeg -i equi_frame.png -vf geq=cb_expr='cb(X,Y)':cr_expr='cr(X,Y)':lum_expr='if(between(X,229,249),lum(X,Y)+lerp((X-229)/20,0,lum(249,Y)-lum(250,Y)),lum(X,Y))',format=rgb24 -y out.png

ffmpeg version git-2020-01-04-3c8da7b Copyright (c) 2000-2020 the FFmpeg developers

built with Apple clang version 11.0.0 (clang-1100.0.33.16)
configuration: --prefix=/usr/local/Cellar/ffmpeg/HEAD-3c8da7b --enable-shared --enable-pthreads --enable-version3 --enable-avresample --cc=clang --host-cflags='-I/System/Library/Frameworks/JavaVM.framework/Versions/Current/Headers/ -fno-stack-check' --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libmp3lame --enable-libopus --enable-librubberband --enable-libsnappy --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librtmp --enable-libspeex --enable-libsoxr --enable-videotoolbox --disable-libjack --disable-indev=jack
libavutil 56. 38.100 / 56. 38.100
libavcodec 58. 65.102 / 58. 65.102
libavformat 58. 35.101 / 58. 35.101
libavdevice 58. 9.103 / 58. 9.103
libavfilter 7. 70.101 / 7. 70.101
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 6.100 / 5. 6.100
libswresample 3. 6.100 / 3. 6.100
libpostproc 55. 6.100 / 55. 6.100

Input #0, image2, from 'equi_frame.png':

Duration: 00:00:00.04, start: 0.000000, bitrate: 17496 kb/s

Stream #0:0: Video: mjpeg (Baseline), yuvj420p(pc, bt470bg/unknown/unknown), 1000x499 [SAR 1:1 DAR 1000:499], 25 tbr, 25 tbn, 25 tbc

Stream mapping:

Stream #0:0 -> #0:0 (mjpeg (native) -> png (native))

Press [q] to stop, ? for help
[Parsed_geq_0 @ 0x7fc72d700940] A luminance or RGB expression is mandatory
[AVFilterGraph @ 0x7fc72d5029c0] Error initializing filter 'geq' with args 'cb_expr=cb(X'
Error reinitializing filters!
Failed to inject frame into filter network: Invalid argument
Error while processing the decoded data for stream #0:0
Conversion failed!

comment:7 by Michael Koch, 4 years ago

Try removing the line feeds from the command line; it must all be written on one line.

comment:8 by Michael Koch, 4 years ago

Here it works:

C:\Users\mKoch\Desktop>c:\ffmpeg\ffmpeg -i fli0z.png -vf geq=cb_expr='cb(X,Y)':cr_expr='cr(X,Y)':lum_expr='if(between(X,249-20,249),lum(X,Y)+lerp((X-249+20)/20,0,lum(249,Y)-lum(249+1,Y)),lum(X,Y))',format=rgb24 -y out.png
ffmpeg version git-2020-02-20-56df829 Copyright (c) 2000-2020 the FFmpeg developers

built with gcc 9.2.1 (GCC) 20200122
configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --enable-libmfx --enable-ffnvcodec --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt --enable-amf

libavutil 56. 41.100 / 56. 41.100
libavcodec 58. 71.100 / 58. 71.100
libavformat 58. 38.101 / 58. 38.101
libavdevice 58. 9.103 / 58. 9.103
libavfilter 7. 76.100 / 7. 76.100
libswscale 5. 6.100 / 5. 6.100
libswresample 3. 6.100 / 3. 6.100
libpostproc 55. 6.100 / 55. 6.100

Input #0, png_pipe, from 'fli0z.png':

Duration: N/A, bitrate: N/A

Stream #0:0: Video: png, rgba(pc), 1000x498 [SAR 5669:5669 DAR 500:249], 25 tbr, 25 tbn, 25 tbc
Stream mapping:

Stream #0:0 -> #0:0 (png (native) -> png (native))

Press [q] to stop, ? for help
Output #0, image2, to 'out.png':

Metadata:

encoder : Lavf58.38.101
Stream #0:0: Video: png, rgb24, 1000x498 [SAR 1:1 DAR 500:249], q=2-31, 200 kb/s, 25 fps, 25 tbn, 25 tbc

Metadata:

encoder : Lavc58.71.100 png

frame= 1 fps=0.0 q=-0.0 Lsize=N/A time=00:00:00.04 bitrate=N/A speed=0.183x

video:655kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

comment:9 by Michael Koch, 4 years ago

Try adding double quotes around the filter chain:

ffmpeg -i fli0z.png -vf "geq=cb_expr='cb(X,Y)':cr_expr='cr(X,Y)':lum_expr='if(between(X,249-20,249),lum(X,Y)+lerp((X-249+20)/20,0,lum(249,Y)-lum(249+1,Y)),lum(X,Y))',format=rgb24" -y out.png

by villmer, 4 years ago

Stitched Equirectangular

comment:10 by villmer, 4 years ago

Adding the quotes stopped the errors, but I don't understand how you're using this. I assume this is a post-stitching process to soften the two vertical lines. I also assumed we had to stitch the dual-fisheye image first to create an equirectangular image and then apply this filter. If this is true, I don't see any results. I've attached the 2020-02-22 screenshots above of a stitched equirectangular. Can you show me what you're getting after you run this filter?

by villmer, 4 years ago

Stitched Equirectangular (trimmed)

comment:11 by Michael Koch, 4 years ago

Yes, this is a post-stitching process to soften the left of the two vertical lines. Did you use the input image fli0z.png that you uploaded here? It works only if you know exactly at which x position the vertical line is. In this case between x=249 and x=250.
Your output image seems to be bigger than the original input image.

Here is a better Windows batch file that also softens the other line between x=749 and x=750.

set "FF=c:\ffmpeg\ffmpeg" :: Path to FFmpeg
set "IN=fli0z.png" :: Input image
set "B1=249" :: Left side of first vertical border, right side is at B1+1
set "B2=749" :: Left side of second vertical border, right side is at B2+1
set "W=25" :: Width of interpolation area

%FF% -i %IN% -vf "geq=cb_expr='cb(X,Y)':cr_expr='cr(X,Y)':lum_expr='lum(X,Y)+between(X,%B1%-%W%,%B1%)*lerp((X-%B1%+%W%)/%W%,0,lum(%B1%,Y)-lum(%B1%+1,Y))+between(X,%B2%-%W%,%B2%)*lerp((X-%B2%+%W%)/%W%,0,lum(%B2%,Y)-lum(%B2%+1,Y))',format=rgb24" -y out.png

pause

by Michael Koch, 4 years ago

Attachment: out.png added

comment:12 by villmer, 4 years ago

Yes, the workaround actually looks quite good. Great technique, thank you for sharing it. I see two issues. First, you have to know the exact x positions of both vertical lines beforehand (although I suppose some kind of algorithm could determine this). Second, this process can only be done once the entire stitching process is complete.

Ideally, the v360 command should automatically determine how to apply the blending/luminance-gradient technique during a single stitching pass, to avoid having to run two separate processes. The stitching and blending should be done at the same time, in my opinion. Knowing the radius of both fisheye circles should be enough to determine where the blending (luminance manipulation) should take place.

I hope the FFmpeg team will consider this; it's highly needed in the VR world right now.
Thanks again for sharing your technique, mkoch. Really excellent.

comment:13 by Michael Koch, 4 years ago

Most probably the right sides of the borders are always at WIDTH/4 and WIDTH*3/4, and the left side is at one pixel less. It would be a little bit more complicated if WIDTH isn't a multiple of 4.
There is no need for two separate processes. In the filter chain you can first use the v360 filter and then the geq filter, separated by a comma:

-vf "v360=input=dfisheye:output=e,geq=..."
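That rule can be sketched in a few lines of Python (a hypothetical helper, not part of FFmpeg; note that a later example in this thread uses B1=320 for a 1280-wide video rather than 319, so the exact positions should still be verified visually):

```python
def seam_positions(width):
    """Estimate the (left, right) x-pixel pairs of the two vertical
    seams in a dual-fisheye-to-equirectangular image, assuming the
    right sides sit at WIDTH/4 and WIDTH*3/4 and the left sides one
    pixel less (only exact when WIDTH is a multiple of 4)."""
    r1, r2 = width // 4, (3 * width) // 4
    return (r1 - 1, r1), (r2 - 1, r2)

# For the 1000 px wide fli0z.png: seams near x=249/250 and x=749/750
print(seam_positions(1000))  # ((249, 250), (749, 750))
```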

You find more examples for remap and v360 in my book:
http://www.astro-electronic.de/FFmpeg_Book.pdf

I agree with you that smoothing would be a nice enhancement for the v360 filter.

comment:14 by villmer, 4 years ago

WOW, your book is really amazing!
Thanks again for your great input on this.

comment:15 by villmer, 4 years ago

Could you give me an idea of what a single command would look like to both convert the fisheyes and apply the gradient to both lines? Is there a way to also add a formula to evaluate where the vertical lines are? (I'm using Mac and Linux machines.) My familiarity with FFmpeg is very, very low.

comment:16 by Michael Koch, 4 years ago

What command line did you use to make the equirectangular image? Just add the geq filter at the end of the filter chain. I don't think there is a way to automatically evaluate where the vertical lines are. I just found a bug in my batch file and I'll update the example in my book soon. May I use your image as an example in my book?

Last edited 4 years ago by Michael Koch (previous) (diff)

comment:17 by villmer, 4 years ago

The dual fisheye image is not mine.
It is from here: https://www.youtube.com/watch?v=70Wd7Ex54jE

comment:18 by Michael Koch, 4 years ago

I downloaded the video at size 1280x640. The first command line does only the conversion from dual fisheye to equirectangular, without smoothing. The best in_pad parameter must be found by trial and error; it should always be the same value as long as you use the same camera.
The second command line does the same thing, but with smoothing included. Unfortunately, all these workarounds with the geq filter are quite slow. That's why I limited the duration to 8 seconds.

set "FF=c:\ffmpeg\ffmpeg" :: Path to FFmpeg
set "IN=dualfisheye.mp4" :: Input video
set "B1=320" :: Left side of first vertical border, right side is at B1+1
set "B2=959" :: Left side of second vertical border, right side is at B2+1
set "W=25" :: Width of interpolation area
set "T=8" :: Duration in seconds

%FF% -i %IN% -vf "v360=input=dfisheye:output=e:in_pad=0.058" -t %T% -y out1.mp4

%FF% -i %IN% -vf "v360=input=dfisheye:output=e:in_pad=0.058,geq=cb_expr='cb(X,Y)':cr_expr='cr(X,Y)':lum_expr='clip(lum(X,Y)+between(X,%B1%-%W%,%B1%)*lerp(0,lum(%B1%+1,Y)-lum(%B1%,Y),(X-%B1%+%W%)/%W%)+between(X,%B2%-%W%,%B2%)*lerp(0,lum(%B2%+1,Y)-lum(%B2%,Y),(X-%B2%+%W%)/%W%),0,255)'" -t %T% -y out2.mp4

pause
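The luminance ramp in the second command can be illustrated outside FFmpeg. Here is a minimal Python sketch of the same math applied to a synthetic one-dimensional row with toy values (the geq filter does this per scanline on the real image):

```python
def lerp(a, b, t):
    """Linear interpolation, matching FFmpeg's lerp(a, b, t) expression."""
    return a + (b - a) * t

def soften_seam(row, b, w):
    """Ramp the w pixels left of the seam between x=b and x=b+1 so
    that row[b] meets row[b+1], as the geq luminance expression does."""
    delta = row[b + 1] - row[b]        # brightness step across the seam
    out = row[:]
    for x in range(b - w, b + 1):
        t = (x - b + w) / w            # 0 at the ramp start, 1 at the seam
        out[x] = min(255.0, max(0.0, row[x] + lerp(0, delta, t)))
    return out

row = [100.0] * 10 + [120.0] * 10      # hard seam between x=9 and x=10
smooth = soften_seam(row, b=9, w=5)
# smooth[4..9] ramps 100, 104, 108, 112, 116, 120 -- the step is gone
```

The clip(..., 0, 255) in the command corresponds to the min/max here; it keeps the corrected luminance within the valid 8-bit range.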

comment:19 by villmer, 4 years ago

How would I make this script for Mac and Linux?
What's the file extension to save as?

comment:20 by villmer, 4 years ago

ah, bash script with .sh extension...

comment:21 by Michael Koch, 4 years ago

Sorry, I know almost nothing about Mac or Linux

comment:22 by Michael Koch, 4 years ago

I have added a better example to my book. It uses the "maskedmerge" filter for merging two overlapping fisheye videos and is much faster than the workaround with the "geq" filter.
http://www.astro-electronic.de/FFmpeg_Book.pdf

comment:23 by villmer, 4 years ago

Great. I look forward to trying the new methods.

comment:24 by villmer, 4 years ago

I'm not highly familiar with FFmpeg, so please forgive my lack of expertise on this.
First, I'm running macOS and I've been unable to run the script as it is; I'm not sure why. You should consider adding information in your book on the differences in script development (and execution) between Windows and other OSs, such as Linux and macOS.
In the meantime, I've been removing the variables from the script and creating single command lines like so:

ffmpeg -f lavfi -i nullsrc=size=640x640 -vf "format=gray8,geq='clip(128-128/11.5*(180-191.5/(640/2)*hypot(X-640/2,Y-640/2)),0,255)',v360=input=fisheye:output=e:ih_fov=191.5:iv_fov=191.5,format=rgb24" -frames 1 -y mergemap.png

ffmpeg -i in.mp4 -i mergemap.png -lavfi "[0]format=rgb24,split[a][b];[a]crop=ih:iw/2:0:0,v360=input=fisheye:output=e:ih_fov=191.5:iv_fov=191.5[c];[b]crop=ih:iw/2:iw/2:0,v360=input=fisheye:output=e:yaw=180:ih_fov=191.5:iv_fov=191.5[d];[c][d][1]maskedmerge,format=rgb24" -t 10 -y out.mp4
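For reference, the gray-level formula inside the first of these two commands (the merge mask) can be evaluated in plain Python. This is only a sketch under the thread's parameters (640x640 fisheye square, 191.5° field of view): the mask ramps from 0 to 255 across the roughly 11.5° overlap zone straddling the 180° boundary, which is where maskedmerge blends the two hemispheres.

```python
import math

def mergemap_value(x, y, size=640, fov=191.5):
    """Gray value of the merge mask at pixel (x, y), mirroring the geq
    expression clip(128 - 128/11.5*(180 - 191.5/(size/2)*r), 0, 255)."""
    r = math.hypot(x - size / 2, y - size / 2)  # distance from the center
    angle = fov / (size / 2) * r                # viewing angle in degrees
    v = 128 - 128 / (fov - 180) * (180 - angle)
    return min(255.0, max(0.0, v))

# Mask is 0 well inside the fisheye circle (take this lens only),
# 128 exactly at the 180-degree boundary, and 255 past the overlap,
# where maskedmerge takes the other lens.
```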

However, running the first command above gives me this error:

ffmpeg version git-2020-01-04-3c8da7b Copyright (c) 2000-2020 the FFmpeg developers

built with Apple clang version 11.0.0 (clang-1100.0.33.16)
configuration: --prefix=/usr/local/Cellar/ffmpeg/HEAD-3c8da7b --enable-shared --enable-pthreads --enable-version3 --enable-avresample --cc=clang --host-cflags='-I/System/Library/Frameworks/JavaVM.framework/Versions/Current/Headers/ -fno-stack-check' --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libmp3lame --enable-libopus --enable-librubberband --enable-libsnappy --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librtmp --enable-libspeex --enable-libsoxr --enable-videotoolbox --disable-libjack --disable-indev=jack
libavutil 56. 38.100 / 56. 38.100
libavcodec 58. 65.102 / 58. 65.102
libavformat 58. 35.101 / 58. 35.101
libavdevice 58. 9.103 / 58. 9.103
libavfilter 7. 70.101 / 7. 70.101
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 6.100 / 5. 6.100
libswresample 3. 6.100 / 3. 6.100
libpostproc 55. 6.100 / 55. 6.100

Input #0, lavfi, from 'nullsrc=size=640x640':

Duration: N/A, start: 0.000000, bitrate: N/A

Stream #0:0: Video: rawvideo (I420 / 0x30323449), yuv420p, 640x640 [SAR 1:1 DAR 1:1], 25 tbr, 25 tbn, 25 tbc

Stream mapping:

Stream #0:0 -> #0:0 (rawvideo (native) -> png (native))

Press [q] to stop, ? for help
[v360 @ 0x7fd41dc0c000] [Eval @ 0x7ffeeed800a0] Undefined constant or missing '(' in 'fisheye'
[v360 @ 0x7fd41dc0c000] Unable to parse option value "fisheye"
[Parsed_v360_2 @ 0x7fd41dc0bf40] Option 'ih_fov' not found
[AVFilterGraph @ 0x7fd41de02840] Error initializing filter 'v360' with args 'input=fisheye:output=e:ih_fov=191.5:iv_fov=191.5'
Error reinitializing filters!
Failed to inject frame into filter network: Option not found
Error while processing the decoded data for stream #0:0
Conversion failed!

Last edited 4 years ago by villmer (previous) (diff)

comment:25 by Elon Musk, 4 years ago

You are using too old an ffmpeg version, as simple as that. Your ffmpeg version has an older v360 filter with a bunch of features missing.

comment:26 by villmer, 4 years ago

No, I have it.

ffmpeg -filters

.S. v360 V->V Convert 360 projection of video.

comment:27 by Elon Musk, 4 years ago

Read what I wrote: you have an old v360 filter. Your ffmpeg is from January, while ideally it should be from March, a few days old at most.

comment:28 by Michael Koch, 4 years ago

You need a newer ffmpeg version because many options for v360 filter have been added recently (for example the "fisheye" option).

Regarding scripts for MAC or Linux I really have no knowledge. If anyone wants to write a short introduction about the differences of Windows/MAC/Linux scripts, please do it and add it to the FFmpeg wiki. Also I'd like to add this chapter to my book, of course with proper credit to the author.

comment:29 by villmer, 4 years ago

Updated ffmpeg and everything worked great.
Exceptional in fact. Wow.

I'm attaching a frame of the video showing the results.
Great work mkoch!

by villmer, 4 years ago

Attachment: results.jpg added

Results of blend method by mkoch

comment:30 by villmer, 4 years ago

Dual-fisheye videos stitched using mkoch's method:

https://pop.movie/?movie=1583331030836
https://pop.movie/?movie=1583353871422

Original videos were 1920x960.

Last edited 4 years ago by villmer (previous) (diff)

comment:31 by villmer, 4 years ago


Last edited 4 years ago by villmer (previous) (diff)

comment:32 by villmer, 4 years ago


Last edited 4 years ago by villmer (previous) (diff)

comment:33 by villmer, 4 years ago

Last edited 4 years ago by villmer (previous) (diff)

comment:34 by Michael Koch, 4 years ago

Recording from two cameras simultaneously should be possible, but I have never tried it myself. Copying the exposure data from one camera to the other? As far as I know this isn't possible with FFmpeg.

comment:35 by villmer, 4 years ago

Under Debian 10, how do I install the absolute latest (nightly) version of FFmpeg, including all of the newest v360 features?

Is there a command line action to install the newest version?
sudo apt-get install ffmpeg does not install the latest version.

On macOS we can install the newest version using brew install ffmpeg --HEAD. Is there an equivalent command for Debian 10?

Last edited 4 years ago by villmer (previous) (diff)

comment:36 by villmer, 3 years ago

[ REMOVED ]

Last edited 3 years ago by villmer (previous) (diff)

comment:37 by villmer, 3 years ago

[ REMOVED ]

Last edited 3 years ago by villmer (previous) (diff)

comment:38 by villmer, 3 years ago

Been a while since I’ve looked into this... but does the v360 filter offer any kind of blending (yet) when converting from dual fisheye to equirectangular?

Note: See TracTickets for help on using tickets.