24/7 Live Stream to YouTube with FFMPEG

I was recently tasked with setting up a 24/7 livestream of the surf at a local beach, from a PoE network camera to a publicly accessible webpage. Previously this was done with a 640×480 webcam sending an MJPEG stream to an Ubuntu server running FFMPEG, which would then update the image on a webpage every 2 seconds. In addition to getting a better image at a double-digit frame rate, I wanted to include data from the local NOAA buoy in the stream.

First I went looking for a suitable replacement camera and settled on the Reolink RLC-811A. It's PoE, affordable, and high definition, and given that the subject of the video stream is 200+ feet away, the 5x optical zoom allows for less unwanted scenery in the shot. Next was to replace the static Apache webpage showing the image refreshing every 2 seconds with a CDN, to reduce our bandwidth and server requirements. YouTube was the obvious answer, though after having completed the project I think I'll investigate alternatives for future projects. The last step was to get FFMPEG to cooperate, and this took by far the longest.

I had never used FFMPEG directly before, so this was a learning experience. FFMPEG has lots of flags, and while the documentation is exhaustive, it's difficult to parse and decipher. Looking around at forums for answers shows that there are a lot of varying opinions on which FFMPEG flags should be used and what they do, which doesn't help matters. I'm still honing my script, but here's what I have so far that seems to work to pull the h.264 stream from the Reolink camera and send it to YouTube at 720p (I'm limited by the uplink from the camera being 10/100 Ethernet):

ffmpeg -f lavfi -i anullsrc -rtsp_transport udp -i rtsp://<username>:<password>@<IP>:554//h264Preview_01_main -vf "drawtext=fontfile=/usr/share/fonts/truetype/dejavu/DejaVuSansMono.ttf:textfile=/surf/data.txt:reload=1:fontcolor=white:fontsize=44:box=1:boxcolor=black@0.5:boxborderw=5:x=(w-text_w)/2:y=1300" -preset ultrafast -tune fastdecode -c:v libx264 -pix_fmt yuv420p -b:v 9500k -maxrate 9500k -bufsize 9500k -f flv -g 4 rtmp://a.rtmp.youtube.com/live2/<streamkey>

I came up with this after a lot of trial and error and while i’m not 100% on exactly what all this does here is what I do know:

  • -f lavfi -i anullsrc: sends a null audio stream to YouTube, as they require audio
  • -rtsp_transport udp: use UDP as the RTSP transport; I tried TCP for a while and that was not reliable
  • rtsp://<username>:<password>@<IP>:554//h264Preview_01_main: how you pull the HD RTSP video stream from the Reolink RLC-811A
  • -vf "drawtext=fontfile=/usr/share/fonts/truetype/dejavu/DejaVuSansMono.ttf: draw text using a fixed-width font that ships with Ubuntu 22.04
  • :textfile=/surf/data.txt:reload=1: use the contents of a text file as the overlay text and re-read the file every frame so the overlay updates when the contents change
  • :fontcolor=white:fontsize=44:box=1:boxcolor=black@0.5:boxborderw=5:x=(w-text_w)/2:y=1300": white 44pt font inside a translucent black box with a 5-pixel border, centered horizontally and 1300 pixels down from the top of the frame
  • -preset ultrafast -tune fastdecode: use the fastest x264 preset, trading compression efficiency for low CPU usage, and tune the output so it is cheap to decode
  • -pix_fmt yuv420p: set the pixel format (4:2:0 chroma subsampling) to what YouTube accepts
  • -b:v 9500k -maxrate 9500k -bufsize 9500k: set the stream rate at 9500k, which is what YouTube wants for a 2K stream, and set a matching buffer
  • -f flv -g 4: package the stream as FLV for RTMP, with a keyframe every 4 frames
  • rtmp://a.rtmp.youtube.com/live2/<streamkey>: YouTube endpoint and the stream key

Definitely a lot to unpack, and it took a lot of iterating to get to this point. With this setup I still have to restart the FFMPEG script about every 3 hours to prevent the YouTube stream from dropping, which otherwise happened about once every 24 hours. I set this up as a systemd service that auto-starts on boot and set up a cron job to restart it every 3 hours. I tried starting with 12 hours and worked my way down until I found something that was stable. Next was getting the buoy data from NOAA, which was simple as they publish a text file once every 10 minutes with the latest data. I wrote a quick and dirty Python script to scrape it and a cron job to replace the data.txt file with the latest data.
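The buoy scraper can be sketched roughly like this. NDBC publishes each station's latest observations at a stable realtime2 URL; the station ID, output path, column choices, and overlay wording below are my own placeholder assumptions, not the project's exact script:

```python
#!/usr/bin/python3
# Sketch of the NOAA buoy scraper. STATION, OUT, and the overlay wording
# are hypothetical placeholders -- swap in your local buoy and paths.
import os
import urllib.request

STATION = "46053"  # hypothetical NDBC station ID
URL = "https://www.ndbc.noaa.gov/data/realtime2/%s.txt" % STATION
OUT = "/surf/data.txt"

def parse_latest(text):
    """Return the newest observation from an NDBC realtime2 file as a dict.

    Line 0 holds column names, line 1 holds units, and line 2 is the most
    recent data row (assumed standard meteorological format)."""
    lines = text.splitlines()
    header = lines[0].lstrip("#").split()
    newest = lines[2].split()
    return dict(zip(header, newest))

def update_overlay():
    """Fetch the latest observation and rewrite the drawtext source file."""
    with urllib.request.urlopen(URL, timeout=30) as resp:
        obs = parse_latest(resp.read().decode())
    line = "Waves {WVHT}m @ {DPD}s | Water {WTMP}C | Wind {WSPD}m/s".format(**obs)
    # Write to a temp file and swap it in so ffmpeg never reads a partial file
    tmp = OUT + ".tmp"
    with open(tmp, "w") as f:
        f.write(line)
    os.replace(tmp, OUT)
```

The cron job would then just call update_overlay() every 10 minutes to keep data.txt in sync with NDBC's publishing cadence.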

Now that I had a working stream to YouTube that checked all the project requirements, the last step was to add some redundancy. YouTube provides a backup ingest that it will automatically fail over to if the primary stream goes down, and during testing this happened a lot. I set up a second lightweight VM to stream a static "technical difficulties" image with this FFMPEG script:

ffmpeg -loop 1 -f image2 -i '/surf/psb.jpeg' -f lavfi -i anullsrc -vf realtime,scale=1280:720,format=yuv420p -r 30 -g 60 -c:v libx264 -x264-params keyint=60 -bufsize 500k -c:a aac -ar 44100 -b:a 128k -f flv 'rtmp://b.rtmp.youtube.com/live2?backup=1/<streamkey>'

Breaking down this FFMPEG command we have:

  • -loop 1 -f image2: loop an input image indefinitely
  • -i '/surf/psb.jpeg': the input image
  • -f lavfi -i anullsrc: more null audio
  • -vf realtime,scale=1280:720,format=yuv420p: the realtime filter slows the looped image down to real-time speed instead of letting FFMPEG feed frames as fast as it can, the image is scaled to 720p to match the primary stream, and the pixel format is converted to the yuv420p YouTube accepts
  • -r 30 -g 60: output at 30 frames per second with a keyframe every 60 frames, i.e. every 2 seconds
  • -c:v libx264 -x264-params keyint=60 -bufsize 500k: output to h.264 with a keyframe every 60 frames, and set the rate-control buffer size
  • -c:a aac -ar 44100 -b:a 128k: encode the audio as 128 kbps AAC at a 44.1 kHz sample rate
  • -f flv rtmp://b.rtmp.youtube.com/live2?backup=1/<streamkey>: package the stream as FLV and point it at YouTube's backup ingest endpoint; the output frame rate needs to match the primary stream or YouTube complains

This will reliably send out a static image, and YouTube will happily stream it until the primary stream is back. Annoyingly, YouTube will not automatically switch back to the primary stream; you have to stop the backup for 30 or so seconds before YouTube will try the primary again. There is good reason to have a reliable backup stream: YouTube doesn't give you a permanent stream address immediately, so every time your stream ends a new URL is generated. If you want to embed the video in a webpage, you'll have to manually update the embed code every time the stream ends.

I plan on continuing to refine things to improve performance and reliability; the full setup is available via GitHub.

6-month update:

After 6 months of running the stream, here is where we are:

We had lots of crashes at irregular intervals, from days to weeks apart, so the text overlay has been removed. This seems to have improved things, but I haven't tested to verify that this is what fixed it.

I can now reliably run the stream at 1440p instead of 720p.

We still had minor networking issues that were out of our hands and would kill the stream, so I wrote a Python script to handle restarting it. This was surprisingly difficult to put together, so to save others some time, since I couldn't find a good example elsewhere, here it is. (Note: this still won't auto-start the stream, but it can at least be used for alerts if the stream dies.)

#!/usr/bin/python3
import subprocess
import httplib2
import os
import time
from datetime import datetime, timedelta

from googleapiclient.discovery import build
from oauth2client.client import flow_from_clientsecrets
from oauth2client.file import Storage
from oauth2client.tools import argparser, run_flow

now = datetime.utcnow()
start_time = now + timedelta(minutes=2)
start_time_iso = start_time.strftime("%Y-%m-%dT%H:%M:%S.%fZ")
command_stop = "sudo systemctl stop surfcam.service"
command_start = "sudo systemctl start surfcam.service"
CLIENT_SECRETS_FILE = "/surf/client_secrets.json"

MISSING_CLIENT_SECRETS_MESSAGE = """
WARNING: Please configure OAuth 2.0
To make this sample run you will need to populate the client_secrets.json file
found at:
   %s
with information from the Developers Console
https://console.developers.google.com/
For more information about the client_secrets.json file format, please visit:
https://developers.google.com/api-client-library/python/guide/aaa_client_secrets
""" % os.path.abspath(os.path.join(os.path.dirname(__file__),
                                   CLIENT_SECRETS_FILE))

YOUTUBE_SCOPE = "https://www.googleapis.com/auth/youtube"
YOUTUBE_API_SERVICE_NAME = "youtube"
YOUTUBE_API_VERSION = "v3"

flow = flow_from_clientsecrets(CLIENT_SECRETS_FILE,
  message=MISSING_CLIENT_SECRETS_MESSAGE,
  scope=YOUTUBE_SCOPE)

storage = Storage("/surf/main.py-oauth2.json")
credentials = storage.get()

if credentials is None or credentials.invalid:
  flags = argparser.parse_args()
  credentials = run_flow(flow, storage, flags)
youtube = build(YOUTUBE_API_SERVICE_NAME, YOUTUBE_API_VERSION,
  http=credentials.authorize(httplib2.Http()))

request = youtube.liveBroadcasts().list(
        part="status",
        broadcastStatus="active",
        broadcastType="all"
    )
response = request.execute()
print(response)
# Restart only if no broadcast is currently live
if response['items'] and response['items'][0]['status']['lifeCycleStatus'] == "live":
    print("running")
else:
    print("dead")
    stop = subprocess.run(command_stop, shell=True, text=True, capture_output=True)
    print(stop.stdout)
    time.sleep(15)
    print("starting stream")

    stream = youtube.liveStreams().insert(
        part="snippet,cdn",
        body={
            "snippet": {
                "title": "Surf Cam",
                "description": "Surf Cam"
            },
            "cdn": {
                "resolution": "1440p",
                "frameRate": "30fps",
                "ingestionType": "rtmp"
            }
        }
    ).execute()

    # Create the broadcast that will be bound to the new stream
    broadcast = youtube.liveBroadcasts().insert(
        part="snippet,status",
        body={
            "snippet": {
                "title": "Surf Cam",
                "scheduledStartTime": start_time_iso
            },
            "status": {"privacyStatus": "public"}
        }
    ).execute()

    time.sleep(30)
    start = subprocess.run(command_start, shell=True, text=True, capture_output=True)
    print(start.stdout)
    
    # Check if the stream is active
    stream_status = youtube.liveStreams().list(
        part="status",
        id=stream["id"]
    ).execute()['items'][0]['status']['streamStatus']

    if stream_status == "active":
        # Bind the broadcast to the stream
        bind_broadcast = youtube.liveBroadcasts().bind(
            part="id,contentDetails",
            id=broadcast["id"],
            streamId=stream["id"]
        ).execute()

        # Transition the broadcast to live
        transition_broadcast = youtube.liveBroadcasts().transition(
            broadcastStatus="live",
            id=broadcast["id"],
            part="id,contentDetails"
        ).execute()

print("done")
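The script above is then scheduled with cron like everything else; the interval, script path, and log location below are assumptions, not the project's actual crontab:

```
# Check the stream every 10 minutes (paths are hypothetical)
*/10 * * * * /usr/bin/python3 /surf/watchdog.py >> /var/log/surfcam-watchdog.log 2>&1
```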