Version 6 (modified by jbrower888, 5 years ago)


FFmpeg provides a subsystem for hardware acceleration.

Hardware acceleration allows the use of specific devices (usually a graphics card or other dedicated hardware) to perform multimedia processing. This makes it possible to offload demanding computations to dedicated hardware, freeing the CPU for other work.

There are several hardware acceleration standard APIs, some of which are supported to some extent by FFmpeg.

VDPAU (Video Decode and Presentation API for Unix)

Developed by NVIDIA for UNIX/Linux systems. To enable this you typically need the libvdpau development package from your distribution, and a compatible graphics card.
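As a sketch, a build with VDPAU support might look like the following (the --enable-vdpau flag and the libvdpau package name are assumptions; exact names can vary by FFmpeg version and distribution):

```shell
# Hypothetical build sketch: enable VDPAU support.
# Requires the libvdpau development headers to be installed first,
# e.g. "apt-get install libvdpau-dev" on Debian-based systems.
./configure --enable-vdpau
make
```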

Official website:

Note that VDPAU cannot be used to decode frames in memory: the compressed frames are sent by libavcodec to the GPU device supported by VDPAU, and the decoded image can then be accessed using the VDPAU API. This is not done automatically by FFmpeg; it must be done at the application level (see for example the vo_vdpau.c module from MPlayer). Also note that with this API it is not possible to move the decoded frame back to RAM, for example in case you need to encode the decoded frame again (e.g. when transcoding on a server).

Several decoders are currently supported through VDPAU in libavcodec, in particular MPEG video, VC-1, H.264 and MPEG-4.


XVideo Motion Compensation. This is an extension of the X video extension (Xv) for the X Window System, and is thus again only available on UNIX/Linux.

Official specification is available here:


Video Acceleration API (VA API) is a non-proprietary, royalty-free open-source software library ("libVA") and API specification, initially developed by Intel but usable with devices from other vendors. Linux only:


DirectX Video Acceleration API, developed by Microsoft (supported on Windows and the Xbox 360).

Link to MSDN documentation:

Several decoders are currently supported, in particular H.264, MPEG-2, VC-1 and WMV3.


Video Decoding API, only supported on Mac OS X. H.264 decoding is available in FFmpeg/libavcodec.

Developers documentation:


OpenCL

Official website:

Currently only used in filtering (the deshake and unsharp filters). In order to use OpenCL code you need to configure the build with --enable-opencl. An API for using OpenCL from FFmpeg is provided in libavutil/opencl.h. No decoding/encoding is supported yet.
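A minimal sketch of using the OpenCL code path, assuming the --enable-opencl flag from above; the opencl=1 filter option and the file names are assumptions, so check the filter documentation for your FFmpeg version:

```shell
# Hypothetical sketch: build with OpenCL support, then request the
# OpenCL code path in the unsharp filter.
./configure --enable-opencl
make
# input.mp4/output.mp4 are placeholder file names.
ffmpeg -i input.mp4 -vf "unsharp=opencl=1" output.mp4
```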



Hardware/software solution with a c66x accelerator card containing multicore CPUs, DDR3 memory, and a NIC. Enabled by adding -hwaccel c66x on the ffmpeg command line. Supports (i) file output, (ii) streaming output using the NIC ports on the card, and (iii) VM operation. Enable the build with --enable-c66x.
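Putting the flags named above together, a c66x build and invocation might look like this sketch (the file names are placeholders; any other options depend on the card's setup):

```shell
# Hypothetical sketch using the flags from the text: build with c66x
# support, then decode through the accelerator card.
./configure --enable-c66x
make
# input.mp4/output.mp4 are placeholder file names.
ffmpeg -hwaccel c66x -i input.mp4 output.mp4
```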

External resources