Update ffmpeg piping with udp and rtp

Also updated the tutorial for live and added ffmpeg_piping to toc.

Change-Id: I5596723cbd709e742ceab0f0789801f58606b932
KongQun Yang 2017-09-15 14:58:56 -07:00
parent b5800fad77
commit df6878ee6d
4 changed files with 74 additions and 36 deletions


@ -2,8 +2,8 @@ Shaka Packager Library
======================
Documentation for the top level Shaka packager library. See
`Internal API <https://google.github.io/shaka-packager/docs/annotated.html>`_
for documentation on internal APIs.
.. doxygenclass:: shaka::Packager


@ -1,34 +1,57 @@
FFmpeg piping
=============
We can use *FFmpeg* to redirect / pipe input not supported by *packager*
to *packager*, for example, input from webcam devices or RTP input. The
concept depicted here can be applied to other *FFmpeg*-supported devices and
protocols.
Piping data to packager
-----------------------
There are two options to pipe data to *packager*:
- UDP socket
*FFmpeg* supports writing to a UDP socket and *packager* supports reading
from UDP sockets::
$ packager 'input=udp://127.0.0.1:40000,...' ...
$ ffmpeg ... -f mpegts udp://127.0.0.1:40000
VP9 cannot be carried in MPEG-TS. Another container, e.g. WebM, needs to be
used when outputting VP9. In this case, transcoding has to be started after
starting *packager*, as the initialization segment is only transmitted at the
beginning of the WebM output (see the WebM sketch after this list).
- pipe
Similarly, pipes are also supported by both *FFmpeg* and *packager*::
$ mkfifo pipe1
$ packager 'input=pipe1,...' ... --io_block_size 65536
$ ffmpeg ... -f mpegts pipe: > pipe1
.. note::
Option --io_block_size 65536 tells packager to use an io_block_size of 64K
for threaded I/O files. This is necessary when using a pipe because reading
from the pipe blocks until the specified number of bytes (the io_block_size)
is available, so the value of io_block_size cannot be too large.
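As a concrete illustration of the WebM case mentioned under the UDP socket
option, here is a minimal sketch using the named pipe from the second option;
the -c:v libvpx-vp9 flag and the elided FFmpeg input options are assumptions
about your build and source::
$ mkfifo pipe1
$ packager 'input=pipe1,...' ... --io_block_size 65536
$ ffmpeg ... -c:v libvpx-vp9 -f webm pipe: > pipe1
Start *packager* first so the WebM initialization segment, which is only
emitted at the beginning of the output, is not missed.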
Encoding / capture command
--------------------------
Camera capture
^^^^^^^^^^^^^^
Refer to `FFmpeg Capture/Webcam <https://trac.ffmpeg.org/wiki/Capture/Webcam>`_
on how to use *FFmpeg* to capture webcam inputs.
The example assumes Mac OS X 10.7 (Lion) or later. It captures from the default
audio / video devices on the machine::
$ ffmpeg -f avfoundation -i "default" -f mpegts udp://127.0.0.1:40000
The command starts only after packager starts.
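On Linux, a roughly equivalent capture command is sketched below; the V4L2 /
ALSA device names (/dev/video0 and default) are assumptions and vary by
machine::
$ ffmpeg -f v4l2 -i /dev/video0 -f alsa -i default -vcodec h264 \
-tune zerolatency -f mpegts udp://127.0.0.1:40000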
@ -37,20 +60,35 @@ The command starts only after packager starts.
After encoding starts, monitor the encoding speed carefully. It should always
be 1x or above. If not, adjust the encoding parameters to reduce the encoding
load.
RTP input
^^^^^^^^^
Assume there is an RTP input described by `saved_sdp_file`::
$ ffmpeg -protocol_whitelist "file,rtp,udp" -i saved_sdp_file -vcodec h264 \
-tune zerolatency -f mpegts udp://127.0.0.1:40000
.. note::
For testing, you can generate an RTP input from a static media file::
$ ffmpeg -re -stream_loop 100 -i <static.mp4> -vcodec copy -an \
-f rtp rtp://239.255.0.1:1234 -sdp_file saved_sdp_file
The command starts only after packager starts.
.. note::
After encoding starts, monitor the encoding speed carefully. It should always
be 1x or above. If not, adjust the encoding parameters to reduce the encoding
load.
Example packaging command in DASH
---------------------------------
::
$ packager \
'in=udp://127.0.0.1:40000,stream=audio,init_segment=live_cam_audio.mp4,segment_template=live_cam_audio_$Number$.m4s' \
'in=udp://127.0.0.1:40000,stream=video,init_segment=live_cam_video.mp4,segment_template=live_cam_video_$Number$.m4s' \
--mpd_output live_cam.mpd
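If the input comes through the named pipe from the pipe option above instead
of UDP, the same packaging command is sketched below with the pipe as input
and --io_block_size added::
$ packager \
'in=pipe1,stream=audio,init_segment=live_cam_audio.mp4,segment_template=live_cam_audio_$Number$.m4s' \
'in=pipe1,stream=video,init_segment=live_cam_video.mp4,segment_template=live_cam_video_$Number$.m4s' \
--mpd_output live_cam.mpd \
--io_block_size 65536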


@ -4,11 +4,6 @@ Live
A typical live source is UDP multicast, which is the only live protocol
packager supports directly right now.
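For example, a multicast source can be consumed with a UDP input directly;
the multicast address and stream settings below are placeholders (a sketch,
see the included UDP file options for the full syntax)::
$ packager \
'in=udp://239.255.0.1:40000,stream=video,init_segment=live_video.mp4,segment_template=live_video_$Number$.m4s' \
--mpd_output live.mpd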
Pipe through FFmpeg
-------------------
For other unsupported protocols, you can use FFmpeg to pipe the input.
See :doc:`ffmpeg_piping` for details.
@ -44,3 +39,7 @@ Here are some examples.
Packager does not support removing old segments internally. The user is
responsible for setting up a cron job to do so.
.. include:: /options/udp_file_options.rst
.. include:: /options/segment_template_formatting.rst


@ -9,3 +9,4 @@ Tutorials
hls.md
live.md
drm.rst
ffmpeg_piping.rst