
@chapter Muxers
@c man begin MUXERS

Muxers are configured elements in FFmpeg which allow writing
multimedia streams to a particular type of file.

When you configure your FFmpeg build, all the supported muxers
are enabled by default. You can list all available muxers using the
configure option @code{--list-muxers}.

You can disable all the muxers with the configure option
@code{--disable-muxers} and selectively enable / disable single muxers
with the options @code{--enable-muxer=@var{MUXER}} /
@code{--disable-muxer=@var{MUXER}}.

The option @code{-formats} of the ff* tools will display the list of
enabled muxers.

A description of some of the currently available muxers follows.

@anchor{aiff}
@section aiff

Audio Interchange File Format muxer.

It accepts the following options:

@table @option
@item write_id3v2
Enable writing of ID3v2 tags when set to 1. Default is 0 (disabled).

@item id3v2_version
Select the ID3v2 version to write. Currently only versions 3 and 4
(aka ID3v2.3 and ID3v2.4) are supported. The default is version 4.
@end table
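
For example, to write an AIFF file carrying an ID3v2.3 tag with a title
(a minimal illustration; the metadata value is arbitrary):
@example
ffmpeg -i INPUT -write_id3v2 1 -id3v2_version 3 -metadata title="Some Title" out.aiff
@end example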

@anchor{crc}
@section crc

CRC (Cyclic Redundancy Check) testing format.

This muxer computes and prints the Adler-32 CRC of all the input audio
and video frames. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
CRC.

The output of the muxer consists of a single line of the form:
CRC=0x@var{CRC}, where @var{CRC} is a hexadecimal number 0-padded to
8 digits containing the CRC for all the decoded input frames.

For example, to compute the CRC of the input and store it in the file
@file{out.crc}:
@example
ffmpeg -i INPUT -f crc out.crc
@end example

You can print the CRC to stdout with the command:
@example
ffmpeg -i INPUT -f crc -
@end example

You can select the output format of each frame with @command{ffmpeg} by
specifying the audio and video codec and format. For example, to
compute the CRC of the input audio converted to PCM unsigned 8-bit
and the input video converted to MPEG-2 video, use the command:
@example
ffmpeg -i INPUT -c:a pcm_u8 -c:v mpeg2video -f crc -
@end example

See also the @ref{framecrc} muxer.

@anchor{framecrc}
@section framecrc

Per-packet CRC (Cyclic Redundancy Check) testing format.

This muxer computes and prints the Adler-32 CRC for each audio
and video packet. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
CRC.

The output of the muxer consists of a line for each audio and video
packet of the form:
@example
@var{stream_index}, @var{packet_dts}, @var{packet_pts}, @var{packet_duration}, @var{packet_size}, 0x@var{CRC}
@end example

@var{CRC} is a hexadecimal number 0-padded to 8 digits containing the
CRC of the packet.

For example, to compute the CRC of the audio and video frames in
@file{INPUT}, converted to raw audio and video packets, and store it
in the file @file{out.crc}:
@example
ffmpeg -i INPUT -f framecrc out.crc
@end example

To print the information to stdout, use the command:
@example
ffmpeg -i INPUT -f framecrc -
@end example

With @command{ffmpeg}, you can select the output format to which the
audio and video frames are encoded before computing the CRC for each
packet by specifying the audio and video codec. For example, to
compute the CRC of each decoded input audio frame converted to PCM
unsigned 8-bit and of each decoded input video frame converted to
MPEG-2 video, use the command:
@example
ffmpeg -i INPUT -c:a pcm_u8 -c:v mpeg2video -f framecrc -
@end example

See also the @ref{crc} muxer.

@anchor{framemd5}
@section framemd5

Per-packet MD5 testing format.

This muxer computes and prints the MD5 hash for each audio
and video packet. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
hash.

The output of the muxer consists of a line for each audio and video
packet of the form:
@example
@var{stream_index}, @var{packet_dts}, @var{packet_pts}, @var{packet_duration}, @var{packet_size}, @var{MD5}
@end example

@var{MD5} is a hexadecimal number representing the computed MD5 hash
for the packet.

For example, to compute the MD5 of the audio and video frames in
@file{INPUT}, converted to raw audio and video packets, and store it
in the file @file{out.md5}:
@example
ffmpeg -i INPUT -f framemd5 out.md5
@end example

To print the information to stdout, use the command:
@example
ffmpeg -i INPUT -f framemd5 -
@end example

See also the @ref{md5} muxer.

@anchor{ico}
@section ico

ICO file muxer.

Microsoft's icon file format (ICO) has some strict limitations that should be noted:
@itemize
@item
Size cannot exceed 256 pixels in any dimension

@item
Only BMP and PNG images can be stored

@item
If a BMP image is used, it must be one of the following pixel formats:
@example
BMP Bit Depth   FFmpeg Pixel Format
1bit            pal8
4bit            pal8
8bit            pal8
16bit           rgb555le
24bit           bgr24
32bit           bgra
@end example

@item
If a BMP image is used, it must use the BITMAPINFOHEADER DIB header

@item
If a PNG image is used, it must use the rgba pixel format
@end itemize
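
For example, to create a 64x64 icon from a PNG image (a minimal sketch; it
relies on the muxer's default BMP encoder and uses the 32-bit bgra format
listed above):
@example
ffmpeg -i input.png -vf scale=64:64 -pix_fmt bgra output.ico
@end example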

@anchor{image2}
@section image2

Image file muxer.

The image file muxer writes video frames to image files.

The output filenames are specified by a pattern, which can be used to
produce sequentially numbered series of files.
The pattern may contain the string "%d" or "%0@var{N}d", which
specifies the position of the characters representing a numbering in
the filenames. If the form "%0@var{N}d" is used, the string
representing the number in each filename is 0-padded to @var{N}
digits. The literal character '%' can be specified in the pattern with
the string "%%".

If the pattern contains "%d" or "%0@var{N}d", the first filename of
the file list specified will contain the number 1; all the following
numbers will be sequential.

The pattern may contain a suffix which is used to automatically
determine the format of the image files to write.

For example, the pattern "img-%03d.bmp" will specify a sequence of
filenames of the form @file{img-001.bmp}, @file{img-002.bmp}, ...,
@file{img-010.bmp}, etc.
The pattern "img%%-%d.jpg" will specify a sequence of filenames of the
form @file{img%-1.jpg}, @file{img%-2.jpg}, ..., @file{img%-10.jpg},
etc.

The following example shows how to use @command{ffmpeg} for creating a
sequence of files @file{img-001.jpeg}, @file{img-002.jpeg}, ...,
taking one image every second from the input video:
@example
ffmpeg -i in.avi -vsync 1 -r 1 -f image2 'img-%03d.jpeg'
@end example

Note that with @command{ffmpeg}, if the format is not specified with the
@code{-f} option and the output filename specifies an image file
format, the image2 muxer is automatically selected, so the previous
command can be written as:
@example
ffmpeg -i in.avi -vsync 1 -r 1 'img-%03d.jpeg'
@end example

Note also that the pattern does not necessarily have to contain "%d" or
"%0@var{N}d"; for example, to create a single image file
@file{img.jpeg} from the input video you can employ the command:
@example
ffmpeg -i in.avi -f image2 -frames:v 1 img.jpeg
@end example

The image muxer supports the .Y.U.V image file format. This format is
special in that each image frame consists of three files, one for
each of the YUV420P components. To read or write this image file format,
specify the name of the '.Y' file. The muxer will automatically open the
'.U' and '.V' files as required.

@anchor{md5}
@section md5

MD5 testing format.

This muxer computes and prints the MD5 hash of all the input audio
and video frames. By default audio frames are converted to signed
16-bit raw audio and video frames to raw video before computing the
hash.

The output of the muxer consists of a single line of the form:
MD5=@var{MD5}, where @var{MD5} is a hexadecimal number representing
the computed MD5 hash.

For example, to compute the MD5 hash of the input converted to raw
audio and video, and store it in the file @file{out.md5}:
@example
ffmpeg -i INPUT -f md5 out.md5
@end example

You can print the MD5 to stdout with the command:
@example
ffmpeg -i INPUT -f md5 -
@end example

See also the @ref{framemd5} muxer.

@section MOV/MP4/ISMV

The mov/mp4/ismv muxer supports fragmentation. Normally, a MOV/MP4
file has all the metadata about all packets stored in one location
(written at the end of the file, it can be moved to the start for
better playback using the @command{qt-faststart} tool). A fragmented
file consists of a number of fragments, where packets and metadata
about these packets are stored together. Writing a fragmented
file has the advantage that the file is decodable even if the
writing is interrupted (while a normal MOV/MP4 is undecodable if
it is not properly finished), and it requires less memory when writing
very long files (since writing normal MOV/MP4 files stores info about
every single packet in memory until the file is closed). The downside
is that it is less compatible with other applications.

Fragmentation is enabled by setting one of the AVOptions that define
how to cut the file into fragments:

@table @option
@item -moov_size @var{bytes}
Reserves space for the moov atom at the beginning of the file instead of placing the
moov atom at the end. If the space reserved is insufficient, muxing will fail.
@item -movflags frag_keyframe
Start a new fragment at each video keyframe.
@item -frag_duration @var{duration}
Create fragments that are @var{duration} microseconds long.
@item -frag_size @var{size}
Create fragments that contain up to @var{size} bytes of payload data.
@item -movflags frag_custom
Allow the caller to manually choose when to cut fragments, by
calling @code{av_write_frame(ctx, NULL)} to write a fragment with
the packets written so far. (This is only useful with other
applications integrating libavformat, not from @command{ffmpeg}.)
@item -min_frag_duration @var{duration}
Don't create fragments that are shorter than @var{duration} microseconds long.
@end table

If more than one condition is specified, fragments are cut when
one of the specified conditions is fulfilled. The exception to this is
@code{-min_frag_duration}, which has to be fulfilled for any of the other
conditions to apply.
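
For example, to write a fragmented MP4 where a new fragment is cut at every
video keyframe and whenever the current fragment reaches one second (a minimal
sketch; the one-second duration is an arbitrary value given in microseconds):
@example
ffmpeg -i INPUT -movflags frag_keyframe -frag_duration 1000000 out.mp4
@end example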

Additionally, the way the output file is written can be adjusted
through a few other options:

@table @option
@item -movflags empty_moov
Write an initial moov atom directly at the start of the file, without
describing any samples in it. Generally, an mdat/moov pair is written
at the start of the file, as a normal MOV/MP4 file, containing only
a short portion of the file. With this option set, there is no initial
mdat atom, and the moov atom only describes the tracks but has
a zero duration.

Files written with this option set do not work in QuickTime.
This option is implicitly set when writing ismv (Smooth Streaming) files.
@item -movflags separate_moof
Write a separate moof (movie fragment) atom for each track. Normally,
packets for all tracks are written in a moof atom (which is slightly
more efficient), but with this option set, the muxer writes one moof/mdat
pair for each track, making it easier to separate tracks.

This option is implicitly set when writing ismv (Smooth Streaming) files.
@end table
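
For example, to write a fully fragmented file with no samples described in
the initial moov atom (an illustration; note the QuickTime compatibility
caveat mentioned above):
@example
ffmpeg -i INPUT -movflags empty_moov+frag_keyframe out.mp4
@end example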

Smooth Streaming content can be pushed in real time to a publishing
point on IIS with this muxer. Example:
@example
ffmpeg -re @var{<normal input/transcoding options>} -movflags isml+frag_keyframe -f ismv http://server/publishingpoint.isml/Streams(Encoder1)
@end example

@section mpegts

MPEG transport stream muxer.

This muxer implements ISO 13818-1 and part of ETSI EN 300 468.

The muxer options are:

@table @option
@item -mpegts_original_network_id @var{number}
Set the original_network_id (default 0x0001). This is the unique identifier
of a network in DVB. Its main use is in the unique identification of a
service through the path Original_Network_ID, Transport_Stream_ID.
@item -mpegts_transport_stream_id @var{number}
Set the transport_stream_id (default 0x0001). This identifies a
transponder in DVB.
@item -mpegts_service_id @var{number}
Set the service_id (default 0x0001), also known as program in DVB.
@item -mpegts_pmt_start_pid @var{number}
Set the first PID for PMT (default 0x1000, max 0x1f00).
@item -mpegts_start_pid @var{number}
Set the first PID for data packets (default 0x0100, max 0x0f00).
@end table

The recognized metadata settings in the mpegts muxer are @code{service_provider}
and @code{service_name}. If they are not set, the default for
@code{service_provider} is "FFmpeg" and the default for
@code{service_name} is "Service01".

@example
ffmpeg -i file.mpg -c copy \
     -mpegts_original_network_id 0x1122 \
     -mpegts_transport_stream_id 0x3344 \
     -mpegts_service_id 0x5566 \
     -mpegts_pmt_start_pid 0x1500 \
     -mpegts_start_pid 0x150 \
     -metadata service_provider="Some provider" \
     -metadata service_name="Some Channel" \
     -y out.ts
@end example

@section null

Null muxer.

This muxer does not generate any output file; it is mainly useful for
testing or benchmarking purposes.

For example, to benchmark decoding with @command{ffmpeg} you can use the
command:
@example
ffmpeg -benchmark -i INPUT -f null out.null
@end example

Note that the above command does not read or write the @file{out.null}
file, but specifying the output file is required by the @command{ffmpeg}
syntax.

Alternatively you can write the command as:
@example
ffmpeg -benchmark -i INPUT -f null -
@end example

@section matroska

Matroska container muxer.

This muxer implements the Matroska and WebM container specs.

The recognized metadata settings in this muxer are:

@table @option

@item title=@var{title name}
Name provided to a single track
@end table

@table @option

@item language=@var{language name}
Specifies the language of the track in the Matroska languages form
@end table

@table @option

@item stereo_mode=@var{mode}
Stereo 3D video layout of two views in a single video track
@table @option
@item mono
video is not stereo
@item left_right
Both views are arranged side by side, Left-eye view is on the left
@item bottom_top
Both views are arranged in top-bottom orientation, Left-eye view is at bottom
@item top_bottom
Both views are arranged in top-bottom orientation, Left-eye view is on top
@item checkerboard_rl
Each view is arranged in a checkerboard interleaved pattern, Left-eye view being first
@item checkerboard_lr
Each view is arranged in a checkerboard interleaved pattern, Right-eye view being first
@item row_interleaved_rl
Each view is constituted by a row based interleaving, Right-eye view is first row
@item row_interleaved_lr
Each view is constituted by a row based interleaving, Left-eye view is first row
@item col_interleaved_rl
Both views are arranged in a column based interleaving manner, Right-eye view is first column
@item col_interleaved_lr
Both views are arranged in a column based interleaving manner, Left-eye view is first column
@item anaglyph_cyan_red
All frames are in anaglyph format viewable through red-cyan filters
@item right_left
Both views are arranged side by side, Right-eye view is on the left
@item anaglyph_green_magenta
All frames are in anaglyph format viewable through green-magenta filters
@item block_lr
Both eyes laced in one Block, Left-eye view is first
@item block_rl
Both eyes laced in one Block, Right-eye view is first
@end table
@end table
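
For example, to set a name and language on the first audio track while
remuxing (an illustration; the stream specifier assumes the input's first
audio stream):
@example
ffmpeg -i INPUT -c copy -metadata:s:a:0 title="Commentary" -metadata:s:a:0 language=eng out.mkv
@end example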

For example, a 3D WebM clip can be created using the following command line:
@example
ffmpeg -i sample_left_right_clip.mpg -an -c:v libvpx -metadata stereo_mode=left_right -y stereo_clip.webm
@end example

@section segment, stream_segment, ssegment

Basic stream segmenter.

The segmenter muxer outputs streams to a number of separate files of nearly
fixed duration. Output filename pattern can be set in a fashion similar to
@ref{image2}.

@code{stream_segment} is a variant of the muxer used to write to
streaming output formats, i.e. which do not require global headers,
and is recommended for outputting e.g. to MPEG transport stream segments.
@code{ssegment} is a shorter alias for @code{stream_segment}.

Every segment starts with a video keyframe, if a video stream is present.
Note that if you want accurate splitting for a video file, you need to
make the input key frames correspond to the exact splitting times
expected by the segmenter, or the segment muxer will start the new
segment with the key frame found next after the specified start
time.

The segment muxer works best with a single constant frame rate video.

Optionally it can generate a list of the created segments, by setting
the option @var{segment_list}. The list type is specified by the
@var{segment_list_type} option.

The segment muxer supports the following options:

@table @option
@item segment_format @var{format}
Override the inner container format, by default it is guessed by the filename
extension.

@item segment_list @var{name}
Also generate a listfile named @var{name}. If not specified no
listfile is generated.

@item segment_list_flags @var{flags}
Set flags affecting the segment list generation.

It currently supports the following flags:
@table @var
@item cache
Allow caching (only affects M3U8 list files).

@item live
Allow live-friendly file generation.
This currently only affects M3U8 lists. In particular, write a fake
EXT-X-TARGETDURATION duration field at the top of the file, based on
the specified @var{segment_time}.
@end table

Default value is @code{cache}.

@item segment_list_size @var{size}
Overwrite the listfile once it reaches @var{size} entries. If 0
the listfile is never overwritten. Default value is 0.

@item segment_list_type @var{type}
Specify the format for the segment list file.

The following values are recognized:
@table @option
@item flat
Generate a flat list for the created segments, one segment per line.

@item csv, ext
Generate a list for the created segments, one segment per line,
each line matching the format (comma-separated values):
@example
@var{segment_filename},@var{segment_start_time},@var{segment_end_time}
@end example

@var{segment_filename} is the name of the output file generated by the
muxer according to the provided pattern. CSV escaping (according to
RFC4180) is applied if required.

@var{segment_start_time} and @var{segment_end_time} specify
the segment start and end time expressed in seconds.

A list file with the suffix @code{".csv"} or @code{".ext"} will
auto-select this format.

@code{ext} is deprecated in favor of @code{csv}.

@item m3u8
Generate an extended M3U8 file, version 4, compliant with
@url{http://tools.ietf.org/id/draft-pantos-http-live-streaming-08.txt}.

A list file with the suffix @code{".m3u8"} will auto-select this format.
@end table

If not specified the type is guessed from the list file name suffix.

@item segment_time @var{time}
Set segment duration to @var{time}. Default value is "2".

@item segment_time_delta @var{delta}
Specify the accuracy time when selecting the start time for a
segment. Default value is "0".

When delta is specified a key-frame will start a new segment if its
PTS satisfies the relation:
@example
PTS >= start_time - time_delta
@end example

This option is useful when splitting video content, which is always
split at GOP boundaries, in case a key frame is found just before the
specified split time.

In particular it may be used in combination with the @command{ffmpeg} option
@var{force_key_frames}. The key frame times specified by
@var{force_key_frames} may not be set accurately because of rounding
issues, with the consequence that a key frame time may end up set just
before the specified time. For constant frame rate videos a value of
1/2*@var{frame_rate} should address the worst case mismatch between
the specified time and the time set by @var{force_key_frames}.

@item segment_times @var{times}
Specify a list of split points. @var{times} contains a list of comma-separated
duration specifications, in increasing order.

@item segment_wrap @var{limit}
Wrap around segment index once it reaches @var{limit}.
@end table

Some examples follow.

@itemize
@item
To remux the content of file @file{in.mkv} to a list of segments
@file{out000.nut}, @file{out001.nut}, etc., and write the list of
generated segments to @file{out.list}:
@example
ffmpeg -i in.mkv -codec copy -map 0 -f segment -segment_list out.list out%03d.nut
@end example

@item
As in the example above, but segment the input file according to the split
points specified by the @var{segment_times} option:
@example
ffmpeg -i in.mkv -codec copy -map 0 -f segment -segment_list out.csv -segment_times 1,2,3,5,8,13,21 out%03d.nut
@end example

@item
As in the example above, but use the @command{ffmpeg} @var{force_key_frames}
option to force key frames in the input at the specified location, together
with the segment option @var{segment_time_delta} to account for
possible rounding when setting the key frame times.
@example
ffmpeg -i in.mkv -force_key_frames 1,2,3,5,8,13,21 -vcodec mpeg4 -acodec pcm_s16le -map 0 \
-f segment -segment_list out.csv -segment_times 1,2,3,5,8,13,21 -segment_time_delta 0.05 out%03d.nut
@end example

In order to force key frames on the input file, transcoding is
required.

@item
To convert @file{in.mkv} to TS segments using the @code{libx264}
and @code{libfaac} encoders:
@example
ffmpeg -i in.mkv -map 0 -codec:v libx264 -codec:a libfaac -f ssegment -segment_list out.list out%03d.ts
@end example

@item
Segment the input file, and create an M3U8 live playlist (can be used
as a live HLS source):
@example
ffmpeg -re -i in.mkv -codec copy -map 0 -f segment -segment_list playlist.m3u8 \
-segment_list_flags +live -segment_time 10 out%03d.mkv
@end example
@end itemize

@section mp3

The MP3 muxer writes a raw MP3 stream with an ID3v2 header at the beginning and
optionally an ID3v1 tag at the end. ID3v2.3 and ID3v2.4 are supported; the
@code{id3v2_version} option controls which one is used. The legacy ID3v1 tag is
not written by default, but may be enabled with the @code{write_id3v1} option.

For seekable output the muxer also writes a Xing frame at the beginning, which
contains the number of frames in the file. It is useful for computing the
duration of VBR files.

The muxer supports writing ID3v2 attached pictures (APIC frames). The pictures
are supplied to the muxer in the form of a video stream with a single packet. There
can be any number of those streams, each will correspond to a single APIC frame.
The stream metadata tags @var{title} and @var{comment} map to APIC
@var{description} and @var{picture type} respectively. See
@url{http://id3.org/id3v2.4.0-frames} for allowed picture types.

Note that the APIC frames must be written at the beginning, so the muxer will
buffer the audio frames until it gets all the pictures. It is therefore advised
to provide the pictures as soon as possible to avoid excessive buffering.

Examples:

Write an mp3 with an ID3v2.3 header and an ID3v1 footer:
@example
ffmpeg -i INPUT -id3v2_version 3 -write_id3v1 1 out.mp3
@end example

To attach a picture to an mp3 file select both the audio and the picture stream
with @code{map}:
@example
ffmpeg -i input.mp3 -i cover.png -c copy -map 0 -map 1 \
-metadata:s:v title="Album cover" -metadata:s:v comment="Cover (Front)" out.mp3
@end example

@c man end MUXERS