Version 1 (modified by saste, 4 years ago)

FFmpeg provides a subsystem for hardware acceleration.

Hardware acceleration allows the use of dedicated devices (typically a graphics card or other specialized hardware) to perform multimedia processing, offloading demanding computations from the CPU.

There are several hardware acceleration API standards, some of which are supported to some extent by FFmpeg.

VDPAU (Video Decode and Presentation API for Unix)

Developed by NVIDIA for UNIX/Linux systems. To enable this you typically need the libvdpau development package from your distribution, and a compatible graphics card.

Official website:

Note that VDPAU cannot be used to decode frames into main memory: the compressed frames are sent by libavcodec to the GPU device supported by VDPAU, and the decoded image can then be accessed using the VDPAU API. This is not done automatically by FFmpeg, but must be done at the application level (see for example the vo_vdpau.c module from MPlayer). Also note that with this API it is not possible to move the decoded frame back to RAM, for example in case you need to re-encode the decoded frame (e.g. when transcoding on a server).

Several decoders are currently supported through VDPAU in libavcodec, in particular H.263, H.264 and MPEG-4.
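The application-level flow described above can be sketched as follows. This is a simplified, pseudocode-level outline of the old libavcodec hwaccel pattern; the callback and constant names follow the libavcodec headers of that era (e.g. PIX_FMT_VDPAU_H264), error handling is omitted, and a real implementation such as MPlayer's vo_vdpau.c is considerably more involved.

```c
/* Pseudocode sketch: VDPAU decoding with libavcodec (old hwaccel API).
 * Names are indicative; consult the headers of your FFmpeg version. */

/* The get_format callback is where the application opts in to VDPAU:
 * libavcodec offers a list of pixel formats, and returning the VDPAU
 * one tells it to emit VDPAU surfaces instead of software frames. */
static enum PixelFormat get_format(struct AVCodecContext *avctx,
                                   const enum PixelFormat *fmt)
{
    for (; *fmt != PIX_FMT_NONE; fmt++)
        if (*fmt == PIX_FMT_VDPAU_H264)   /* pick the hardware format */
            return *fmt;
    return fmt[0];                        /* fall back to software */
}

/* Decoding loop (heavily abridged):
 *   avctx->get_format = get_format;
 *   avcodec_decode_video2(avctx, frame, &got_frame, &pkt);
 *   // frame->data[0] now refers to a VDPAU render state/surface,
 *   // NOT to pixel data in RAM -- it must be presented (or further
 *   // processed) through the VDPAU API itself, which is why the
 *   // decoded frame cannot simply be handed back for re-encoding.
 */
```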


XvMC (XVideo Motion Compensation)

This is an extension of the X video extension (Xv) for the X Window System, and thus again is only available on UNIX/Linux.

Official specification is available here:


VA API (Video Acceleration API)

Video Acceleration API (VA API) is a non-proprietary and royalty-free open source software library ("libVA") and API specification, initially developed by Intel but usable in combination with other devices as well.


DXVA2 (DirectX Video Acceleration)

DirectX Video Acceleration API, developed by Microsoft (supports Windows and Xbox 360).

Link to MSDN documentation:

Several decoders are currently supported, in particular H.264, MPEG-2, VC-1 and WMV3.


VDA (Video Decode Acceleration)

Apple's Video Decode Acceleration API, only supported on OS X. Several decoders are available in FFmpeg/libavcodec.

Developers documentation:


OpenCL

Official website:

Currently OpenCL is only used in filtering (the deshake and unsharp filters). In order to use OpenCL code you need to configure the build with --enable-opencl. An API for using OpenCL from FFmpeg is provided in libavutil/opencl.h. No decoding/encoding is supported yet.
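As a concrete illustration, building with OpenCL support and exercising it through a filter might look like the following build-configuration fragment. The opencl=1 filter option shown is an assumption based on the unsharp filter's option set at the time; verify it against your build with ffmpeg -h filter=unsharp.

```sh
# Configure FFmpeg with OpenCL support (requires an OpenCL SDK/ICD):
./configure --enable-opencl
make

# Run the unsharp filter with its OpenCL code path enabled
# (opencl=1 is the assumed filter option; check your version):
./ffmpeg -i input.mp4 -vf "unsharp=opencl=1" output.mp4
```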

External resources