dcadec internal default downmix is not normalised and reduces stereo separation by cross-mixing L* and R*.
Reported by: AndyF
Owned by:
Blocking:
Reproduced by developer: no
Analyzed by developer: no
Summary of the bug: When a 5.1-channel DTS stream is downmixed to stereo with -request_channels, the result is faulty: the output is not normalised, and the L* and R* channels are cross-mixed, albeit at reduced level, so some stereo separation is still perceivable.
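For illustration, here is a minimal standalone C sketch (not FFmpeg code; the channel order and coefficient values are assumptions chosen for demonstration) that contrasts a conventional normalised Lo/Ro downmix with a non-normalised matrix that cross-mixes the front channels, producing the kind of separation loss described above:

/* Minimal sketch, not FFmpeg code: compares a normalised Lo/Ro stereo
 * downmix of 5.1 against a non-normalised matrix that cross-mixes the
 * front channels. Coefficient values are illustrative assumptions. */
#include <stdio.h>

enum { FL, FR, FC, LFE, SL, SR, NCH };

static void downmix(const double m[2][NCH], const double in[NCH],
                    double out[2])
{
    for (int o = 0; o < 2; o++) {
        out[o] = 0.0;
        for (int i = 0; i < NCH; i++)
            out[o] += m[o][i] * in[i];
    }
}

int main(void)
{
    /* Conventional Lo/Ro matrix: centre and surrounds at -3 dB
     * (0.7071), no opposite-side leakage, scaled so a full-scale
     * input cannot clip. */
    const double k = 1.0 / (1.0 + 0.7071 + 0.7071); /* normalisation */
    const double good[2][NCH] = {
        { k, 0, 0.7071 * k, 0, 0.7071 * k, 0 },
        { 0, k, 0.7071 * k, 0, 0,          0.7071 * k },
    };
    /* A matrix of the kind the report describes: unnormalised, with
     * the opposite front channel mixed in at reduced level (-9 dB
     * here, an illustrative value, not the actual table contents). */
    const double bad[2][NCH] = {
        { 1.0,    0.3548, 0.7071, 0, 0.7071, 0.3548 },
        { 0.3548, 1.0,    0.7071, 0, 0.3548, 0.7071 },
    };

    const double left_only[NCH] = { 1, 0, 0, 0, 0, 0 }; /* FL only */
    double out[2];

    downmix(good, left_only, out);
    printf("normalised:  Lo=%.3f Ro=%.3f\n", out[0], out[1]);
    downmix(bad, left_only, out);
    printf("cross-mixed: Lo=%.3f Ro=%.3f\n", out[0], out[1]);
    return 0;
}

With a signal present only in FL, the normalised matrix leaves Ro at zero, while the cross-mixed matrix leaks the signal into Ro at reduced level, matching the residual stereo effect the report describes.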
How to reproduce:
% ffmpeg -request_channels 2 -i 6ch.dts 2ch.wav
ffmpeg version - git built on ... 10/08/13
This issue seems to exist with every DTS sample I have, though I don't have many, and I failed to find a "normal" VOB channel check. The channel check I used was the DTS core extracted from a 7.1 DTS-HD MA track.
Looking at the code, the default matrix in libavcodec/dcadata.h looks guilty: assuming that dca_default_coeffs refers into dca_downmix_coeffs, the resulting coefficients match the cross-channel mixing that I can hear.
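To make the suspicion concrete, here is a hedged sketch of the suspected mechanism, under the assumption that dca_default_coeffs is a per-layout table of indices into a table of gain values (dca_downmix_coeffs). All names, sizes and values below are hypothetical stand-ins, not the actual contents of dcadata.h:

/* Hedged sketch of the suspected mechanism. The tables below are
 * hypothetical; they only show how one bad entry in an index table
 * silently selects cross-mixing coefficients, and how a row sum
 * above 1.0 indicates a non-normalised matrix. */
#include <stdint.h>
#include <stdio.h>

/* Hypothetical coefficient value table (linear gains). */
static const double downmix_coeffs[] = {
    0.0,    /* 0: mute  */
    1.0,    /* 1:  0 dB */
    0.7071, /* 2: -3 dB */
    0.3548, /* 3: -9 dB */
};

/* Hypothetical default index table for a 5.1 -> stereo downmix,
 * one index per (output, input) pair: FL FR FC LFE SL SR. */
static const uint8_t default_coeffs[2][6] = {
    { 1, 3, 2, 0, 2, 3 }, /* Lo: index 3 for FR -> leakage */
    { 3, 1, 2, 0, 3, 2 }, /* Ro: index 3 for FL -> leakage */
};

int main(void)
{
    for (int o = 0; o < 2; o++) {
        double sum = 0.0;
        printf("%s:", o ? "Ro" : "Lo");
        for (int i = 0; i < 6; i++) {
            double c = downmix_coeffs[default_coeffs[o][i]];
            sum += c;
            printf(" %.4f", c);
        }
        printf("  (row sum %.4f%s)\n", sum,
               sum > 1.0 ? ", not normalised" : "");
    }
    return 0;
}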
This also raises the question of why this matrix is used at all, since I would expect studio DTS material to carry its own downmix metadata.
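For comparison, a sketch of the behaviour one would expect, under the assumption that the frame header can signal embedded downmix coefficients: use the matrix from the stream when it is present, and fall back to the built-in default only when it is not. The struct and field names here are hypothetical, not FFmpeg's:

/* Sketch of the expected decoder behaviour. FrameHeader and its
 * fields are hypothetical illustrations, not FFmpeg structures. */
#include <string.h>

typedef struct {
    int has_embedded_downmix; /* hypothetical header flag          */
    double embedded[2][6];    /* coefficients parsed from the stream */
} FrameHeader;

/* Built-in fallback: a conventional Lo/Ro matrix, for illustration. */
static const double default_matrix[2][6] = {
    { 1.0, 0.0, 0.7071, 0.0, 0.7071, 0.0 },
    { 0.0, 1.0, 0.7071, 0.0, 0.0,    0.7071 },
};

/* Prefer the embedded matrix; fall back to the default only when
 * the stream carries no downmix metadata. */
static void select_downmix(const FrameHeader *h, double out[2][6])
{
    if (h->has_embedded_downmix)
        memcpy(out, h->embedded, sizeof(h->embedded));
    else
        memcpy(out, default_matrix, sizeof(default_matrix));
}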