        New post: webstreaming with ffmpeg - monochromatic - monochromatic blog: http://blog.z3bra.org
        git clone git://z3bra.org/monochromatic
       ---
        commit ea5d78639c0ec73e577163083b8e0502c85e198e
        parent 42c232ef2987b0e735909c75ee4c6b7139231d94
        Author: z3bra <willyatmailoodotorg>
       Date:   Tue, 30 Aug 2016 14:13:13 +0200
       
        New post: webstreaming with ffmpeg
       
       Diffstat:
         A 2016/08/desktop-streaming.txt       |     179 +++++++++++++++++++++++++++++++
         M config.mk                           |       3 ++-
         M index.txt                           |       1 +
       
       3 files changed, 182 insertions(+), 1 deletion(-)
       ---
        diff --git a/2016/08/desktop-streaming.txt b/2016/08/desktop-streaming.txt
        @@ -0,0 +1,179 @@
       +# [Desktop streaming](#)
       +## &mdash; 30 August, 2016
       +
       +For teaching purposes (and cool internet points!) I recently needed
       +to share my screen and microphone online. Being the unix enthusiast
        +that I am, I looked into how I could do it with "simple" command-line
        +tools.
       +
        +And here comes [`ffmpeg`](http://ffmpeg.org). `ffmpeg` is the Swiss
        +army knife for everything related to audio and video decoding/encoding.
       +I've been using it for multiple tasks already, from converting .ogg
       +to .mp3, to recording GIFs of my desktop.
       +
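        +As an aside, the .ogg to .mp3 conversion really is a one-liner (file
        +names here are just placeholders), since `ffmpeg` guesses the output
        +format from the extension, assuming your build ships an MP3 encoder:
        +
        +        ffmpeg -i input.ogg output.mp3
        +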
       +### Server part
       +
       +I started looking into how I could "stream" my desktop online, and
        +quickly found out about the `ffserver` utility (which is part of the
       +`ffmpeg` package).
       +
       +`ffserver` provides a service to do the following:
       +
       +* Receive a "Feed" sent by one user
       +* Send a "Stream" to multiple users
       +
       +A "Feed", from `ffserver` point, is a URL that a user will pass to
       +ffmpeg as the output media, to start "uploading" or "streaming" a
       +video to.  `ffserver` will then start bufferizing this input locally,
       +and expose this raw buffer via a "Stream". A stream will read from
       +this buffer, and encode it in the  specified format, with a bunch
       +of options.
       +
        +One can specify multiple output streams for a single feed, e.g. to
        +use different encoding formats (see the sketch after the
        +configuration below).
       +
        +Enough shittalk, here is what my `/etc/ffserver.conf` looked like:
       +
       +        # Port 80 was taken by the webserver
       +        HTTPPort 8090
       +        HTTPBindAddress 0.0.0.0
       +        MaxHTTPConnections 64
       +        MaxClients 28
       +        MaxBandwidth 10000
       +        
       +        CustomLog /var/log/ffserver.log
       +        
       +        # Where to send data.
       +        # URL will be: http://10.0.0.2:8090/0.ffm
       +        <Feed 0.ffm>
       +                # buffer file and max size
       +                File /tmp/ffserver/0.ffm
       +                FileMaxSize 200K
       +        
        +                # Only allow this IP to send streaming data
        +                ACL allow 10.0.0.3
       +        </Feed>
       +        
       +        # How to expose the stream
       +        # URL will be: http://10.0.0.2:8090/0.flv
       +        <Stream 0.flv>
       +                # The feed to encode data from
       +                Feed 0.ffm
       +                
       +                # Video encoding options
       +                Format flv
       +                VideoCodec libx264
       +                VideoFrameRate 5
       +                VideoSize 1440x900
       +                VideoBitRate 512
       +                AVOptionVideo tune zerolatency
       +                AVOptionVideo flags +global_header
       +                
       +                # Audio encoding options
       +                AudioCodec aac
       +                AVOptionAudio flags +global_header
       +        </Stream>
       +
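        +To illustrate the "multiple streams per feed" idea from above, here is
        +a sketch of a second `<Stream>` block that would expose the same feed
        +as WebM. I did not keep it in the end, so take it as an example rather
        +than a battle-tested config:
        +
        +        # Alternative encoding of the same feed
        +        # URL would be: http://10.0.0.2:8090/0.webm
        +        <Stream 0.webm>
        +                Feed 0.ffm
        +
        +                Format webm
        +                VideoCodec libvpx
        +                VideoFrameRate 5
        +                VideoSize 1440x900
        +                VideoBitRate 512
        +
        +                AudioCodec libvorbis
        +        </Stream>
        +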
       +I limited my research for the perfect stream to either
       +[x264](https://wikipedia.org/wiki/X264) or
       +[vp8](https://wikipedia.org/wiki/Vp8) video encoding. At first, vp8
        +seemed appealing, being a royalty-free format, and the WebM container
        +also seems to be pretty good for online videos. But x264 turned out
        +to be faster, and of higher quality (especially thanks to the
        +"zerolatency" setting). I also had to switch to x264 because I
        +couldn't get the libvorbis audio codec to synchronize well with
        +the vp8 video stream.
       +
        +The above configuration gives the best quality/speed ratio I could
        +get.
       +
        +When the config is ready, you just need to fire up the server with:
       +
       +        /usr/bin/ffserver -f /etc/ffserver.conf
       +
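        +If you want to keep an eye on it while testing, one option is to push
        +it to the background and follow the log file declared with CustomLog
        +above (adapt the path if yours differs):
        +
        +        ffserver -f /etc/ffserver.conf &
        +        tail -f /var/log/ffserver.log
        +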
       +### Watcher part
       +
       +In order to watch the stream, one has to use the URL defined by the
        +`<Stream>` tag. I personally use `mplayer` to watch it, but one can
       +use the `ffplay` command provided by `ffmpeg`:
       +
       +        ffplay http://10.0.0.2:8090/0.flv
       +
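        +The `mplayer` version is just as short (assuming a build with network
        +and FLV support, which is the common case):
        +
        +        mplayer http://10.0.0.2:8090/0.flv
        +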
        +And that's *ALL*. Watching a stream can hardly get any simpler than
        +that, right?
       +
       +### Feeder part
       +
        +In order to feed your stream to `ffserver`, you can use `ffmpeg`
       +directly.  The format of the command is pretty simple. We have 2
       +inputs: the video and the audio. We also have 1 output: the FFM
        +feed. The simplest command we can use is thus (recording our desktop
        +and microphone):
       +
       +        ffmpeg -f x11grab -i :0.0 -f alsa -i default http://10.0.0.2:8090/0.ffm
       +
       +Let's break it down a bit:
       +
       +* `-f x11grab -i :0.0`: Record the desktop, using `$DISPLAY` :0.0
       +* `-f alsa -i default`: Record the microphone, using ALSA's default input device
       +* `http://10.0.0.2:8090/0.ffm`: Feed location
       +
        +This should start recording and sending data to the stream. If you
        +look at it though, the output will not look great, so we need to
        +pass a few more flags to get a nice looking stream that records
        +the full screen in a decent way:
       +
       +        ffmpeg -f x11grab -r 5 -s 1440x900 -thread_queue_size 1024 -i :0.0 \
       +               -f alsa -ac 1 -thread_queue_size 1024 -i default \
       +               -af 'highpass=f=200, lowpass=f=2000' \
       +               -fflags nobuffer \
       +               http://10.0.0.2:8090/0.ffm
       +
        +Ok, that one looks odd. No worries, I'm no wizard and didn't come up
        +with all these flags out of nowhere! Let's review them:
       +
       +        -f x11grab -r 5 -s 1440x900 -i :0.0
       +
       +Record our X11 desktop with a framerate (`-r`) of 5 FPS, and record
       +the screen at size (`-s`) 1440x900 (my screen size).
       +
       +        -f alsa -ac 1 -i default
       +
        +Record from the default ALSA capture device, using mono input
        +(`-ac 1`, a single audio channel).
       +
       +        -thread_queue_size 1024
       +
        +For both inputs, this increases the maximum number of queued packets
        +`ffmpeg` can handle. If the thread queue is full, `ffmpeg` will
        +start dropping packets, which can lower the stream quality.
       +
       +        -af 'highpass=f=200, lowpass=f=2000'
       +
       +Add an audio filter. My microphone is utter shit, and records a lot
       +of white noise. This filters out frequencies below 200Hz and above
       +2000Hz (that's the typical voice range).
       +
       +        -fflags nobuffer
       +
       +Avoid buffering frames when possible, so the stream is available
       +as it is recorded.
       +
        +And that's pretty much it! Note that `ffmpeg` can have multiple
       +outputs, so you can record to the feed AND to a local file at the
       +same time.
       +
       +For instance, this is my `ffstream` script:
       +
       +        #!/bin/sh
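        +        # Stream the desktop and microphone to an ffserver feed, and
        +        # keep a timestamped local WebM copy at the same time.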
       +
       +        STREAM=${1:-http://10.0.0.2:8090/0.ffm}
       +        ffmpeg -f x11grab -r 5 -s 1440x900 -thread_queue_size 1024 -i :0.0 \
       +               -f alsa -ac 1 -thread_queue_size 1024 -i default \
       +               -af 'highpass=f=200, lowpass=f=2000' \
       +               -fflags nobuffer ${STREAM} \
       +               -af 'highpass=f=200, lowpass=f=2000' \
       +               -c:v libvpx -b:v 5M -c:a libvorbis webcast-$(date +%Y%m%d%H%M%S).webm
       +
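        +Usage is then simply (the second URL is just a made-up example):
        +
        +        ./ffstream                                    # default feed above
        +        ./ffstream http://example.com:8090/feed.ffm   # any other feed URL
        +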
       +That's all folks!
       +
        diff --git a/config.mk b/config.mk
        @@ -31,7 +31,8 @@ PAGES   =   index.html \
                    2015/08/cross-compiling-with-pcc-and-musl.html \
                    2015/08/install-alpine-at-onlinenet.html \
                    2016/01/make-your-own-distro.html \
       -            2016/03/hand-crafted-containers.html
       +            2016/03/hand-crafted-containers.html \
       +            2016/08/desktop-streaming.html
        
        FEEDS = rss/feed.xml
        EXTRA = css img vid data errors favicon.ico
        diff --git a/index.txt b/index.txt
        @@ -1,3 +1,4 @@
       +* 0x001c - [Desktop streaming](/2016/08/desktop-streaming.html)
        * 0x001b - [Hand-crafted containers](/2016/03/hand-crafted-containers.html)
        * 0x001a - [Make your own distro](/2016/01/make-your-own-distro.html)
        * 0x0019 - [Install Alpine at online.net](/2015/08/install-alpine-at-onlinenet.html)