No, FFmpeg does not read a configuration file after it is installed.

There are many ways to build FFmpeg, but in general you have to configure everything before installing.

For example, with Homebrew (the macOS package manager), you have to edit the following file to supply your --enable or --disable arguments before installing:

/usr/local/Homebrew/Library/Taps/homebrew/homebrew-core/Formula/ffmpeg.rb

If you didn't pass your arguments during the original install, you need to uninstall and then reinstall with your desired options and arguments.

Also: instead of supplying your options (--with-webp, for example) on the command line during install, you can write them into ffmpeg.rb.
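Because the options are baked in at configure time, the quickest way to see what an existing binary was built with is the "configuration:" line that `ffmpeg -version` prints. A minimal sketch of pulling the flags out of that line (the helper name and the sample string are illustrative, not part of the answer):

```python
# Sketch: split the "configuration:" line that `ffmpeg -version`
# prints into its individual --enable/--disable flags.
# The sample line below is an illustrative stand-in.

def parse_configuration(line: str) -> list[str]:
    """Return the individual configure flags from a configuration line."""
    prefix = "configuration:"
    if line.startswith(prefix):
        line = line[len(prefix):]
    return line.split()

sample = ("configuration: --prefix=/usr/local --enable-gpl "
          "--enable-libx264 --enable-libwebp --disable-doc")
print(parse_configuration(sample))
# ['--prefix=/usr/local', '--enable-gpl', '--enable-libx264',
#  '--enable-libwebp', '--disable-doc']
```

In practice you would feed this the relevant line from the output of `subprocess.run(["ffmpeg", "-version"], capture_output=True, text=True)`.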

Answer from Rowe Morehouse on Stack Exchange
GitHub
github.com › FFmpeg › FFmpeg › blob › master › configure
FFmpeg/configure at master · FFmpeg/FFmpeg
echo "config:$arch:$subarch:$cpu:$target_os:$(esc $cc_ident):$(esc $FFMPEG_CONFIGURATION)" > ffbuild/config.fate
Author   FFmpeg
Discussions

FFMPEG configuration for home video - Stack Overflow
I have a lot home video from my smartphone and from camera. But they take up much space. I want to compress/convert these in x264 files by ffmpeg. I find following config: ffmpeg -y -i input.mov ... More on stackoverflow.com
🌐 stackoverflow.com
FFMPEG configuration file?
actually you could tell youtube-dl to do it. youtube-dl is a collection of several python modules and one of them calls ffmpeg and passes the necessary parameters. which parameter is it that you want to set? ... More on reddit.com
🌐 r/ffmpeg
February 19, 2016
how does ffmpeg generate configure script file? - Stack Overflow
There is no AutoTools script file such as makefile.am or configure.ac. how can I regenerate a ‘configure’ script file? and what is the .mak file? I see even the option of configure script is also More on stackoverflow.com
🌐 stackoverflow.com
FFMPEG Configuration File Example
Hi, Could somebody please help point me the right direction please. I have a NVR with 6 x external hardwired cameras and I would like to stream live video to a dashboard. I have Motioneye installed and that works well but it also hits the limited resources on my HA hardware , hence looking ... More on community.home-assistant.io
🌐 community.home-assistant.io
May 20, 2024
GitHub
gist.github.com › omegdadi › 6904512c0a948225c81114b1c5acb875
A list of all the configuration options available when compiling FFMpeg v4.1.5 · GitHub
A list of all the configuration options available when compiling FFMpeg v4.1.5 - ffmpeg-4.1.5_configure_options.txt
FFmpeg
ffmpeg.org › ffmpeg.html
ffmpeg Documentation
1 week ago - Matches streams with usable configuration, the codec must be defined and the essential information such as video dimension or audio sample rate must be present. Note that in ffmpeg, matching by metadata will only work properly for input files.
Stack Overflow
stackoverflow.com › questions › 34332816 › ffmpeg-configuration-for-home-video
FFMPEG configuration for home video - Stack Overflow
ffmpeg -y -i input.mov -c:v libx264 -preset medium -b:v 4500k -pix_fmt yuvj420p -pass 1 -an -f mp4 nul ffmpeg -y -i input.mov -c:v libx264 -preset medium -b:v 4500k -pix_fmt yuvj420p -pass 2 -c:a aac -b:a 256k -f mp4 out.mp4 · Could you, please, help me improve these config to convert files with acceptable quality and small size.
Reddit
reddit.com › r/ffmpeg › ffmpeg configuration file?
r/ffmpeg on Reddit: FFMPEG configuration file?
February 19, 2016 -

I'm trying to use a setting that isn't being passed by another program (youtube-dl) to ffmpeg, is there a way of setting that option in an ffmpeg config file to always use that setting? This is for downloading a stream.

Top answer
1 of 2
2
Actually, you could tell youtube-dl to do it. youtube-dl is a collection of several Python modules, and one of them calls ffmpeg and passes the necessary parameters. Which parameter is it that you want to set?

In /usr/lib/python2.7/dist-packages/youtube_dl/postprocessor/ffmpeg.py (or wherever), in the function

  def run_ffmpeg_multiple_files(self, input_paths, out_path, opts):

(there's also run_ffmpeg(), but that just calls run_ffmpeg_multiple_files with input_paths being a one-item list) you have the two lines:

  cmd = ([encodeFilename(self.executable, True), encodeArgument('-y')]
         + files_cmd
         + [encodeArgument(o) for o in opts]
         + [encodeFilename(self._ffmpeg_filename_argument(out_path), True)])
  p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, stdin=subprocess.PIPE)

cmd is a list that defines the external command that is being executed. It looks like this:

  ['ffmpeg', '-y', '-i', 'inputfile', ...]

So if you want to run a different command, you can manipulate the list cmd. Now it depends which parameters you want to pass to ffmpeg, because the order matters, of course.

The first part, [encodeFilename(self.executable, True), encodeArgument('-y')], is just ffmpeg -y. files_cmd is a list of -i file1 -i file2 -i file3. [encodeArgument(o) for o in opts] has encoding options (like the codec, bitrate and so on). Finally, [encodeFilename(self._ffmpeg_filename_argument(out_path), True)] is the output filename. + concatenates all of these chunks into one list.

As an example, if I want to add the global option -cpuflags (just a random example from the ffmpeg documentation), ffmpeg -cpuflags mmx ..., I would change the cmd line like this:

  cmd = ([encodeFilename(self.executable, True), '-cpuflags', 'mmx', encodeArgument('-y')]
         + files_cmd
         + [encodeArgument(o) for o in opts]
         + [encodeFilename(self._ffmpeg_filename_argument(out_path), True)])

You need to repeat those changes whenever you update youtube-dl.
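The list manipulation this answer describes can be sketched on its own, without youtube-dl's wrappers (all filenames and options below are illustrative stand-ins, not youtube-dl's actual values):

```python
# Sketch of the cmd list youtube-dl assembles, and of inserting a
# global option (-cpuflags mmx) right after the executable name,
# as the answer describes. All values here are illustrative.
executable = "ffmpeg"
files_cmd = ["-i", "inputfile.mp4"]            # one -i per input file
opts = ["-c:a", "libmp3lame", "-b:a", "192k"]  # encoding options
out_path = "output.mp3"

# Original shape: ffmpeg -y <inputs> <opts> <output>
cmd = [executable, "-y"] + files_cmd + opts + [out_path]

# Modified shape: the extra global option is placed before the
# inputs, because argument order matters to ffmpeg.
cmd_modified = ([executable, "-cpuflags", "mmx", "-y"]
                + files_cmd + opts + [out_path])

print(cmd_modified)
# ['ffmpeg', '-cpuflags', 'mmx', '-y', '-i', 'inputfile.mp4', ...]
```

Passing either list to subprocess.Popen, as youtube-dl does, runs the corresponding command.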
2 of 2
1
I don't think so, but what you can try is to set --ffmpeg-location and create a wrapper script, although you might have to figure out how exactly youtube-dl calls ffmpeg so you get the arguments and order right.
Find elsewhere
FFmpeg
ffmpeg.org › pipermail › ffmpeg-user › 2013-December › 018858.html
[FFmpeg-user] how to change ffmpeg configuration
December 12, 2013 - On Wed, 11 Dec 2013 22:43:43 +0000 e-letter <inpost at gmail.com> wrote:

> Readers,
>
> An attempt to convert flv to ogg failed:
>
> ffmpeg -i file.flv file.ogg
>
> FFmpeg version 0.6.5, Copyright (c) 2000-2010 the FFmpeg developers

This is old. Please compile from git master or use a recent build:
http://trac.ffmpeg.org/wiki/CompilationGuide
http://ffmpeg.org/download.html

> built on May 15 2012 07:00:00 with gcc 4.4.3
> configuration: --prefix=/usr --enable-shared --libdir=/usr/lib
> --shlibdir=/usr/lib --incdir=/usr/include --disable-stripping
> --enable-postproc --enable-gpl --enable-pthreads --enable-libtheora
> --enable-libvorbis --disable-encoder=vorbis --enable-x11grab
> --enable-runtime-cpudetect --enable-libdc1394 --enable-libschroedinger
> --disable-decoder=aac --disable-encoder=aac

Why --disable-decoder=aac --disable-encoder=aac?
GitHub
github.com › rvs › ffmpeg › blob › master › configure
ffmpeg/configure at master · rvs/ffmpeg
config_files="$TMPH config.mak"
cat > config.mak <<EOF
# Automatically generated by configure - do not modify!
ifndef FFMPEG_CONFIG_MAK
FFMPEG_CONFIG_MAK=1
FFMPEG_CONFIGURATION=$FFMPEG_CONFIGURATION
prefix=$prefix
LIBDIR=\$(DESTDIR)$libdir
Author   rvs
Shotstack
shotstack.io › / › learn › how to use ffmpeg: installation, commands & examples
How to use FFmpeg: Installation, commands & examples — Shotstack
To resize a video using FFmpeg, you can use the scale filter set using the -vf (video filter) option. ... -vf "scale=w:h": Replace w and h with the desired width and height of the output video. You can also set a single dimension, such as -vf "scale=-1:720" to maintain the original aspect ratio. resized.mp4: Replace this with the desired output video file name and extension.
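The scale recipe in this snippet can be expressed as a tiny helper (the helper name is my own illustration; passing -1 for one dimension preserves the aspect ratio, as described above):

```python
# Sketch: build the -vf scale argument pair described above.
# Passing -1 for one dimension tells ffmpeg to preserve the
# original aspect ratio.
def scale_args(width: int, height: int) -> list[str]:
    return ["-vf", f"scale={width}:{height}"]

print(scale_args(-1, 720))
# ['-vf', 'scale=-1:720']
```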
Home Assistant
community.home-assistant.io › configuration
FFMPEG Configuration File Example - Configuration - Home Assistant Community
May 20, 2024 - Hi, Could somebody please help point me the right direction please. I have a NVR with 6 x external hardwired cameras and I would like to stream live video to a dashboard. I have Motioneye installed and that works well but it also hits the limited resources on my HA hardware , hence looking ...
Reddit
reddit.com › r/ffmpeg › how to configure ffmpeg build for optimal performance and hardware acceleration
r/ffmpeg on Reddit: How to Configure FFmpeg Build for Optimal Performance and Hardware Acceleration
July 26, 2024 -

Hello, I am relatively new to FFmpeg.

I have been working on a customized version of FFmpeg in order to resolve the issue where the HEVC_CUVID decoder fails to retrieve HDR10+ side data after decoding.

To achieve this, I had to modify certain C files and compile them accordingly.

However, I encountered performance concerns with my current build; while Gyan's library achieves approximately 34 FPS, my custom build falls short at less than 9 FPS.

Moreover, it seems that Nvidia hardware acceleration features such as CUVID, FFNVCODEC, and NVDEC are not available or enabled in my build.

My configuration options include:

./configure --toolchain=msvc --arch=x86_64 --enable-yasm --disable-x86asm --enable-asm --enable-shared --enable-w32threads --disable-programs --disable-doc --disable-static --prefix=/c/ffmpeg3.3/DLLS

I would greatly appreciate any guidance on resolving these two issues simultaneously.

Thank you for your time and assistance.

install prefix            /c/ffmpeg3.3/DLLS
source path               .
C compiler                cl.exe
C library                 msvcrt
ARCH                      x86 (generic)
big-endian                no
runtime cpu detection     yes
standalone assembly       no
x86 assembler             nasm
MMX enabled               yes
MMXEXT enabled            yes
3DNow! enabled            yes
3DNow! extended enabled   yes
SSE enabled               yes
SSSE3 enabled             yes
AESNI enabled             yes
AVX enabled               yes
AVX2 enabled              yes
AVX-512 enabled           yes
AVX-512ICL enabled        yes
XOP enabled               yes
FMA3 enabled              yes
FMA4 enabled              yes
i686 features enabled     yes
CMOV is fast              yes
EBX available             no
EBP available             no
debug symbols             yes
strip symbols             no
optimize for size         no
optimizations             yes
static                    no
shared                    yes
postprocessing support    no
network support           yes
threading support         w32threads
safe bitstream reader     yes
texi2html enabled         no
perl enabled              yes
pod2man enabled           yes
makeinfo enabled          no
makeinfo supports HTML    no
xmllint enabled           no

External libraries:
mediafoundation         schannel

External libraries providing hardware acceleration:
d3d11va                 d3d12va                 dxva2
CODEX FFMPEG
gyan.dev › ffmpeg › builds
Builds - CODEX FFMPEG @ gyan.dev
(complete archive @ mirror) ffmpeg-2026-04-09-git-d3d0b7a5ee-essentials_build.7z .sha256 ffmpeg-2026-04-09-git-d3d0b7a5ee-full_build.7z .sha256 ffmpeg-2026-03-15-git-6ba0b59d8b-essentials_build.7z .sha256 ffmpeg-2026-03-15-git-6ba0b59d8b-full_build.7z .sha256 ffmpeg-2026-02-15-git-33b215d155-essentials_build.7z .sha256 ffmpeg-2026-02-15-git-33b215d155-full_build.7z .sha256
Top answer
1 of 2
61

FFmpeg is indeed a powerful video encoder/decoder tool¹. It operates in the command line, as opposed to using a GUI. Command line is that black window you find by typing [windows+r], then cmd in the popup field and hitting enter. This is also called "command prompt". Once setup, you enter FFmpeg commands in one of these windows to use it.

Here are the basic steps to "install" and use it:

Installation

  1. Download the latest FFmpeg build, courtesy of gyan.dev.
  2. Create a folder on your computer to unpack the zip file. This folder will be your "installation" folder. I chose C:\Program Files\ffmpeg\. This is a good idea because you will treat this like a regular program. Unpack the zip file into this folder.
  3. The folder should now contain a number of other folders, including one titled bin where ffmpeg.exe is saved. We're not done yet. Double clicking that file does nothing. Remember, this is a command line program. It runs in cmd.
  4. Before you can use ffmpeg.exe in cmd you have to tell your computer where it can find it. You need to add a new system path. First, right click This PC (Windows 10) or Computer (Windows 7) then click Properties > Advanced System Settings > Advanced tab > Environment Variables.
  5. In the Environment Variables window, click the "Path" row under the "Variable" column, then click Edit
  6. The "Edit environment variable" window looks different in Windows 10 and 7. In Windows 10, click New, then paste the path to the folder that you created earlier where ffmpeg.exe is saved. For this example, that is C:\Program Files\ffmpeg\bin\. In Windows 7, all the variables are listed in a single string, separated by semicolons. Simply go to the end of the string, type a semicolon (;), then paste in the path.
  7. Click Ok on all the windows we just opened up. Just to be sure, reboot your computer before trying any commands.

FFmpeg is now "installed". The Command Prompt will now recognize FFmpeg commands and will attempt to run them. (If you are still having issues with Command Prompt not recognizing FFmpeg try running CMD as an admin. Alternatively, you can use windows powershell instead of cmd. If it still does not work double check to make sure each step was followed to completion.)

Alternative installation methods

I've not tried these myself, but they probably work, and they're easy to do. However, you can accidentally mess up important things if you're not careful.

First, if you open cmd with administrator privileges, you can run setx /m PATH "C:\ffmpeg\bin;%PATH%", and change C:\ffmpeg\bin to your path to FFmpeg. This uses cmd to do all the gui steps listed above. Easy peasy.

Second, user K7AAY reports that you can simply drop the FFmpeg executables in C:\Windows\System32 and run them from there without having to define the path variable because that path is already defined.
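Whichever installation route you take, you can confirm the executable is actually reachable before troubleshooting further. A small sketch: Python's shutil.which performs the same PATH lookup that cmd does when you type a command name.

```python
# Sketch: check whether an executable is discoverable on PATH,
# the same lookup cmd performs when you type "ffmpeg".
import shutil

def on_path(name: str):
    """Return the full path to the executable, or None if it
    is not on PATH (e.g. the PATH step above was missed)."""
    return shutil.which(name)

print(on_path("ffmpeg"))
# e.g. 'C:\\Program Files\\ffmpeg\\bin\\ffmpeg.EXE', or None
```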

Updating FFmpeg

To update FFmpeg, just revisit the download page in step 1 above and download the zip file. Unpack the files and copy them over the old files in the folder you created in step 2.

Using FFmpeg

Using FFmpeg requires that you open a command prompt window, then type FFmpeg specific commands. Here is a typical FFmpeg command:

 ffmpeg -i video.mp4 -vn -ar 44100 -ac 1 -b:a 32k -f mp3 audio.mp3

This command has four parts:

  1. ffmpeg - This command tells cmd that we want to run FFmpeg commands. cmd will first look for ffmpeg.exe in one of the folders from step 6 in the Installation section. If it is found, it will attempt to run the command.
  2. -i video.mp4 - This is an input file. We are going to be doing work on this file.
  3. -vn -ar 44100 -ac 1 -b:a 32k -f mp3 - These are the "arguments". These characters are like mini commands that specify exactly what we want to do. In this case, it is saying create an mp3 file from the input source.
  • -vn - Leave out the video stream
  • -ar 44100 - Specifies the audio sampling rate in hertz.
  • -ac 1 - Audio channels, only 1. This is effectively "make mono".
  • -b:a 32k - Audio bitrate, set to 32 kbps.
  • -f mp3 - Force to MP3 conversion. Without this command, FFmpeg attempts to interpret what you want based on the extension you use in the output file name.
  4. audio.mp3 - This is the output file.

As you can probably guess, this short command makes an MP3 audio file from an MP4 file.
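The four parts above can also be written out as an argument list, which is how you would hand the same command to a script rather than typing it (illustrative only):

```python
# The four parts of the command above, assembled as an argv list.
program = ["ffmpeg"]                           # 1. the executable
inputs  = ["-i", "video.mp4"]                  # 2. the input file
args    = ["-vn", "-ar", "44100", "-ac", "1",  # 3. the "arguments"
           "-b:a", "32k", "-f", "mp3"]
output  = ["audio.mp3"]                        # 4. the output file

cmd = program + inputs + args + output
print(" ".join(cmd))
# ffmpeg -i video.mp4 -vn -ar 44100 -ac 1 -b:a 32k -f mp3 audio.mp3
```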

To run this command, assuming you have an MP4 file to try this on, follow these steps:

  1. Hit the Windows key + r.
  2. Type cmd then enter.
  3. Change the path to where the file is that you want to work on. Type cd [path]. It should look something like cd C:\Users\name\Desktop\.
  4. Now type the FFmpeg command with the name of your input file. The command will run with some feedback. When it's done, cmd will be available for more commands.

This is the basic way to use FFmpeg. The commands can get far more complicated, but that's only because the program has so much power. Using the FFmpeg documentation, you can learn all the commands and create some very powerful scripts. After that, you can save these scripts into a .bat file so that you just have to double click a file instead of type out the whole command each time. For example, this answer contains a script that will create MP3's from all the MP4's in a folder. Then we would be combining the power of FFmpeg with the power of cmd, and that's a nice place to be when you have to do professional quality video/audio encoding on mountains of files.


  1. As a point of technical accuracy, FFmpeg is itself not an encoder or decoder. FFmpeg is a multimedia framework which can process almost all common and many uncommon media formats. It has thousands of components to capture, decode, encode, modify, combine, and stream media, and it can make use of dozens of external libraries to do even more. Gyan.dev provides a succinct description.
2 of 2
9

The other answer covers the default way of installing it very well; I'd like to propose two other methods that are good for noobs and pros alike:

Option 1

Chocolatey is a package manager, it's a bit like the Microsoft Store, except that it's actually useful, it's all free, and it runs on the commandline. With chocolatey, installing ffmpeg—and setting up the correct $PATH etc.—is as simple as

choco install ffmpeg

It's way quicker and far safer than searching for the right website, finding the download, unzipping it, reading the installation documentation, googling how to set it up, downloading some dependency, etc. etc.

To install Chocolatey you run a command on the commandline, obvs. The website shows you how, but it is a simple cut-n-paste affair. https://chocolatey.org/

You can then check out over 6000 free packages available with choco list <search term here>. There are even non-CLI programs so it's not just for the hardcore. It makes setting up a new install of windows super easy: I have a list of software that I always install and just get chocolatey to do it for me: choco install firefox ffmpeg conemu edgedeflector ditto rainmeter imagemagick… and so on.

As an added bonus upgrading your software is as easy as choco upgrade all

Option 2

Winget is another package manager that is built into recent versions of Windows. The recipe for installing ffmpeg with winget is similar: open a terminal (can be Powershell, wsl, or even cmd if you like banging rocks together) and type

winget install ffmpeg

This will download and install the build of ffmpeg that is hosted by gyan (at the time of writing). It will also work if you don't have admin privileges, which Chocolatey prefers to have.

Top answer
1 of 2
3

First of all, to see a complete list of configure options refer to ./configure --help.

According to the FFmpeg documentation...

This isn't actually the documentation but is a wiki that is editable by anyone, so like any wiki you may want to independently verify any claims.

This made me wonder what other such options FFmpeg can be compiled with in order to give me the best build for creating the most efficient (i.e. the highest quality at the lowest bitrate) audio.

This is currently Opus audio. Enable it with --enable-libopus, and use the latest version of libopus if you want to take advantage of recent development activity.

There are claims that the Sox audio resampler is better than the built-in resampler in FFmpeg. I haven't tried it much myself. Enable it with --enable-libsoxr.

What advantages does compiling with --enable-nonfree offer?

This option alone gives no advantages. It is required for some external libraries that are considered to be non-free. You can view which libraries require this in the source code of the configure file: refer to EXTERNAL_LIBRARY_NONFREE_LIST (and HWACCEL_LIBRARY_NONFREE_LIST). As of this answer these include: decklink, libndi_newtek, libfdk_aac, openssl, libtls (and cuda_nvcc, cuda_sdk, libnpp).

A disadvantage of using --enable-nonfree is that the resulting build will be non-free and therefore non-redistributable.

What advantages does not compiling with --enable-gpl offer?

Slightly faster to compile. Somewhat smaller resulting executable file size. LGPL 2.1 license instead of GPL 2. However, these may not be of any concern to you.

See LICENSE.md included in the source code for a complete list of what requires --enable-gpl.

2 of 2
2

Thanks to llogan for pointing out that the full list of configuration options can be found by doing ./configure --help in the directory containing the FFmpeg sources. Unfortunately this information isn't documented anywhere else, so to make it available without requiring that FFmpeg's sources are downloaded, I've reproduced them below.

Note that I've only reproduced the configuration options that enable/disable support for a particular external library or hardware acceleration feature - general program configuration or debugging options have been omitted. Each option is followed by a description in square brackets of whether the option is enabled by default in FFmpeg or whether support for it is autodetected:

Licensing options:

  --enable-gpl             allow use of GPL code, the resulting libs and binaries will be under GPL [no]
  --enable-version3        upgrade (L)GPL to version 3 [no]
  --enable-nonfree         allow use of nonfree code, the resulting libs and binaries will be unredistributable [no]

External library support:

  Using any of the following switches will allow FFmpeg to link to the
  corresponding external library. All the components depending on that library
  will become enabled, if all their other dependencies are met and they are not
  explicitly disabled. E.g. --enable-libwavpack will enable linking to
  libwavpack and allow the libwavpack encoder to be built, unless it is
  specifically disabled with --disable-encoder=libwavpack.

  Note that only the system libraries are auto-detected. All the other external
  libraries must be explicitly enabled.

  Also note that the following help text describes the purpose of the libraries
  themselves, not all their features will necessarily be usable by FFmpeg.

  --disable-alsa           disable ALSA support [autodetect]
  --disable-appkit         disable Apple AppKit framework [autodetect]
  --disable-avfoundation   disable Apple AVFoundation framework [autodetect]
  --enable-avisynth        enable reading of AviSynth script files [no]
  --disable-bzlib          disable bzlib [autodetect]
  --disable-coreimage      disable Apple CoreImage framework [autodetect]
  --enable-chromaprint     enable audio fingerprinting with chromaprint [no]
  --enable-frei0r          enable frei0r video filtering [no]
  --enable-gcrypt          enable gcrypt, needed for rtmp(t)e support if openssl, librtmp or gmp is not used [no]
  --enable-gmp             enable gmp, needed for rtmp(t)e support if openssl or librtmp is not used [no]
  --enable-gnutls          enable gnutls, needed for https support if openssl, libtls or mbedtls is not used [no]
  --disable-iconv          disable iconv [autodetect]    
  --enable-jni             enable JNI support [no]
  --enable-ladspa          enable LADSPA audio filtering [no]
  --enable-libaom          enable AV1 video encoding/decoding via libaom [no]
  --enable-libaribb24      enable ARIB text and caption decoding via libaribb24 [no]
  --enable-libass          enable libass subtitles rendering, needed for subtitles and ass filter [no]
  --enable-libbluray       enable BluRay reading using libbluray [no]
  --enable-libbs2b         enable bs2b DSP library [no]
  --enable-libcaca         enable textual display using libcaca [no]
  --enable-libcelt         enable CELT decoding via libcelt [no]
  --enable-libcdio         enable audio CD grabbing with libcdio [no]
  --enable-libcodec2       enable codec2 en/decoding using libcodec2 [no]
  --enable-libdav1d        enable AV1 decoding via libdav1d [no]
  --enable-libdavs2        enable AVS2 decoding via libdavs2 [no]
  --enable-libdc1394       enable IIDC-1394 grabbing using libdc1394 and libraw1394 [no]
  --enable-libfdk-aac      enable AAC de/encoding via libfdk-aac [no]
  --enable-libflite        enable flite (voice synthesis) support via libflite [no]
  --enable-libfontconfig   enable libfontconfig, useful for drawtext filter [no]
  --enable-libfreetype     enable libfreetype, needed for drawtext filter [no]
  --enable-libfribidi      enable libfribidi, improves drawtext filter [no]
  --enable-libgme          enable Game Music Emu via libgme [no]
  --enable-libgsm          enable GSM de/encoding via libgsm [no]
  --enable-libiec61883     enable iec61883 via libiec61883 [no]
  --enable-libilbc         enable iLBC de/encoding via libilbc [no]
  --enable-libjack         enable JACK audio sound server [no]
  --enable-libklvanc       enable Kernel Labs VANC processing [no]
  --enable-libkvazaar      enable HEVC encoding via libkvazaar [no]
  --enable-liblensfun      enable lensfun lens correction [no]
  --enable-libmodplug      enable ModPlug via libmodplug [no]
  --enable-libmp3lame      enable MP3 encoding via libmp3lame [no]
  --enable-libopencore-amrnb enable AMR-NB de/encoding via libopencore-amrnb [no]
  --enable-libopencore-amrwb enable AMR-WB decoding via libopencore-amrwb [no]
  --enable-libopencv       enable video filtering via libopencv [no]
  --enable-libopenh264     enable H.264 encoding via OpenH264 [no]
  --enable-libopenjpeg     enable JPEG 2000 de/encoding via OpenJPEG [no]
  --enable-libopenmpt      enable decoding tracked files via libopenmpt [no]
  --enable-libopus         enable Opus de/encoding via libopus [no]
  --enable-libpulse        enable Pulseaudio input via libpulse [no]
  --enable-librsvg         enable SVG rasterization via librsvg [no]
  --enable-librubberband   enable rubberband needed for rubberband filter [no]
  --enable-librtmp         enable RTMP[E] support via librtmp [no]
  --enable-libshine        enable fixed-point MP3 encoding via libshine [no]
  --enable-libsmbclient    enable Samba protocol via libsmbclient [no]
  --enable-libsnappy       enable Snappy compression, needed for hap encoding [no]
  --enable-libsoxr         enable Include libsoxr resampling [no]
  --enable-libspeex        enable Speex de/encoding via libspeex [no]
  --enable-libsrt          enable Haivision SRT protocol via libsrt [no]
  --enable-libssh          enable SFTP protocol via libssh [no]
  --enable-libtensorflow   enable TensorFlow as a DNN module backend for DNN based filters like sr [no]
  --enable-libtesseract    enable Tesseract, needed for ocr filter [no]
  --enable-libtheora       enable Theora encoding via libtheora [no]
  --enable-libtls          enable LibreSSL (via libtls), needed for https support if openssl, gnutls or mbedtls is not used [no]
  --enable-libtwolame      enable MP2 encoding via libtwolame [no]
  --enable-libv4l2         enable libv4l2/v4l-utils [no]
  --enable-libvidstab      enable video stabilization using vid.stab [no]
  --enable-libvmaf         enable vmaf filter via libvmaf [no]
  --enable-libvo-amrwbenc  enable AMR-WB encoding via libvo-amrwbenc [no]
  --enable-libvorbis       enable Vorbis en/decoding via libvorbis, native implementation exists [no]
  --enable-libvpx          enable VP8 and VP9 de/encoding via libvpx [no]
  --enable-libwavpack      enable wavpack encoding via libwavpack [no]
  --enable-libwebp         enable WebP encoding via libwebp [no]
  --enable-libx264         enable H.264 encoding via x264 [no]
  --enable-libx265         enable HEVC encoding via x265 [no]
  --enable-libxavs         enable AVS encoding via xavs [no]
  --enable-libxavs2        enable AVS2 encoding via xavs2 [no]
  --enable-libxcb          enable X11 grabbing using XCB [autodetect]
  --enable-libxcb-shm      enable X11 grabbing shm communication [autodetect]
  --enable-libxcb-xfixes   enable X11 grabbing mouse rendering [autodetect]
  --enable-libxcb-shape    enable X11 grabbing shape rendering [autodetect]
  --enable-libxvid         enable Xvid encoding via xvidcore, native MPEG-4/Xvid encoder exists [no]
  --enable-libxml2         enable XML parsing using the C library libxml2, needed for dash demuxing support [no]
  --enable-libzimg         enable z.lib, needed for zscale filter [no]
  --enable-libzmq          enable message passing via libzmq [no]
  --enable-libzvbi         enable teletext support via libzvbi [no]
  --enable-lv2             enable LV2 audio filtering [no]
  --disable-lzma           disable lzma [autodetect]
  --enable-decklink        enable Blackmagic DeckLink I/O support [no]
  --enable-mbedtls         enable mbedTLS, needed for https support if openssl, gnutls or libtls is not used [no]
  --enable-mediacodec      enable Android MediaCodec support [no]
  --enable-libmysofa       enable libmysofa, needed for sofalizer filter [no]
  --enable-openal          enable OpenAL 1.1 capture support [no]
  --enable-opencl          enable OpenCL processing [no]
  --enable-opengl          enable OpenGL rendering [no]
  --enable-openssl         enable openssl, needed for https support if gnutls, libtls or mbedtls is not used [no]
  --disable-sndio          disable sndio support [autodetect]
  --disable-schannel       disable SChannel SSP, needed for TLS support on Windows if openssl and gnutls are not used [autodetect]
  --disable-sdl2           disable sdl2 [autodetect]
  --disable-securetransport disable Secure Transport, needed for TLS support on OSX if openssl and gnutls are not used [autodetect]
  --enable-vapoursynth     enable VapourSynth demuxer [no]
  --disable-xlib           disable xlib [autodetect]
  --disable-zlib           disable zlib [autodetect]

  The following libraries provide various hardware acceleration features:

  --disable-amf            disable AMF video encoding code [autodetect]
  --disable-audiotoolbox   disable Apple AudioToolbox code [autodetect]
  --enable-cuda-nvcc       enable Nvidia CUDA compiler [no]
  --disable-cuvid          disable Nvidia CUVID support [autodetect]
  --disable-d3d11va        disable Microsoft Direct3D 11 video acceleration code [autodetect]
  --disable-dxva2          disable Microsoft DirectX 9 video acceleration code [autodetect]
  --disable-ffnvcodec      disable dynamically linked Nvidia code [autodetect]
  --enable-libdrm          enable DRM code (Linux) [no]
  --enable-libmfx          enable Intel MediaSDK (AKA Quick Sync Video) code via libmfx [no]
  --enable-libnpp          enable Nvidia Performance Primitives-based code [no]
  --enable-mmal            enable Broadcom Multi-Media Abstraction Layer (Raspberry Pi) via MMAL [no]
  --disable-nvdec          disable Nvidia video decoding acceleration (via hwaccel) [autodetect]
  --disable-nvenc          disable Nvidia video encoding code [autodetect]
  --enable-omx             enable OpenMAX IL code [no]
  --enable-omx-rpi         enable OpenMAX IL code for Raspberry Pi [no]
  --enable-rkmpp           enable Rockchip Media Process Platform code [no]
  --disable-v4l2-m2m       disable V4L2 mem2mem code [autodetect]
  --disable-vaapi          disable Video Acceleration API (mainly Unix/Intel) code [autodetect]
  --disable-vdpau          disable Nvidia Video Decode and Presentation API for Unix code [autodetect]
  --disable-videotoolbox   disable VideoToolbox code [autodetect]

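As a sketch of how these flags are combined, a configure run enabling a few of the options above might look like the following (the exact flag selection here is an arbitrary example, and assumes you are in the FFmpeg source tree with the corresponding development libraries installed):

```shell
# Example configure run: enable a few optional libraries, disable sdl2.
# Which flags you need depends entirely on your use case.
./configure \
  --enable-libzimg \
  --enable-openssl \
  --enable-opencl \
  --disable-sdl2
make && make install
```

After installing, `ffmpeg -buildconf` shows which configuration flags the binary was actually built with.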
Also note that some codecs above, such as Xvid and Vorbis, already have encoders native to FFmpeg and don't require that any additional encoders be enabled.

Additional information on licensing options can be found in a screenshot of the Media Auto Build Suite, which contains the best explanation I've come across so far of what the licensing options mean and what their implications are.


There is no automatic way to do that. You have to look at the parameters of the original file and apply them to the output file.

In most cases, these will be the following:

  • Container format (MP4, MKV, …)
  • Video and audio codec (H.264, H.265, …)
  • Audio-specific:
    • Number of audio channels
    • Audio sampling rate
    • Audio bitrate
  • Video-specific:
    • Profile and Level (to ensure compatibility, see here)
    • Maximum bitrate limitations (e.g. for H.264)
    • Maximum video resolution, change via -filter:v scale or -s:v
    • Framerate, change via -filter:v fps or -r
    • Chroma subsampling, change via -pix_fmt (e.g., -pix_fmt yuv420p should give you the best compatibility)
    • GOP size (distance between IDR-frames), set via -g
    • Other specific encoding settings

But even if you get that all right, some devices may require specific, proprietary information embedded in the bitstream.
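Putting the parameters above together, a command might look like the following sketch. All filenames and values here are made-up placeholders; substitute what mediainfo reports for your original file:

```shell
# Hypothetical example: re-encode input.mov to match a target that is
# MP4 / H.264 High@4.0 / AAC, 1280x720 at 25 fps, yuv420p,
# 250-frame GOP, stereo 44.1 kHz audio.
ffmpeg -i input.mov \
  -c:v libx264 -profile:v high -level 4.0 \
  -s:v 1280x720 -r 25 -pix_fmt yuv420p -g 250 \
  -c:a aac -ac 2 -ar 44100 \
  output.mp4
```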


As for the specific task of matching x264 settings, this is not going to be trivial. I'm not aware of a single script that takes care of it; it's usually done manually. To extract the most information about the encoding settings on Unix/Linux or OS X, you can use mediainfo with some Bash tricks.

For example, for an x264-encoded video in an MP4 file:

mediainfo input.mp4 | grep "Encoding settings" | cut -d':' -f2- | tr '/' '\n' | sed 's/ //g'

This will output a list of x264 options:

cabac=1
ref=3
deblock=1:-1:-1
analyse=0x3:0x113
me=hex
subme=7
psy=1
…

You could then manually pass these options to the x264 binary.

If you go through FFmpeg, that's a little more complicated, as not all of x264's options can or should be mapped like this. Note that often a simple preset, tune and profile specification (as seen in x264 --fullhelp and the x264 encoding guide) will do, and specifying the CRF level is enough.
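If you do go through FFmpeg's libx264 wrapper, one option is to feed such a settings string to -x264-params, which takes colon-separated key=value pairs. A sketch of the text transformation (the input line is a made-up sample; options whose values themselves contain colons, such as deblock, need manual adjustment):

```shell
# Turn a mediainfo "Encoding settings" line into an -x264-params string.
# Normally you would get the line via:
#   mediainfo input.mp4 | grep "Encoding settings"
line="Encoding settings : cabac=1 / ref=3 / me=hex / subme=7"

# Keep everything after the first colon, turn the " / " separators
# into ":", and strip spaces.
params=$(printf '%s\n' "$line" | cut -d':' -f2- | tr '/' ':' | tr -d ' ')
echo "$params"   # → cabac=1:ref=3:me=hex:subme=7

# Then, assuming your ffmpeg was built with libx264:
# ffmpeg -i input.mp4 -c:v libx264 -x264-params "$params" output.mp4
```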

And this is not even considering audio, where luckily, there aren't that many options.

🌐
Frigate
docs.frigate.video › ffmpeg presets
FFmpeg presets | Frigate
go2rtc: streams: reolink_cam: http://192.168.0.139/flv?port=1935&app=bcs&stream=channel0_main.bcs&user=admin&password=password cameras: reolink_cam: ffmpeg: inputs: - path: http://192.168.0.139/flv?port=1935&app=bcs&stream=channel0_ext.bcs&user=admin&password=password input_args: preset-http-reolink roles: - detect - path: rtsp://127.0.0.1:8554/reolink_cam input_args: preset-rtsp-generic roles: - record