Follow the link here to get an overview of all articles.

My goal was to set up my own streaming server that provides a video live stream using the HLS protocol (Apple HTTP Live Streaming).

HLS itself is a great protocol for live streams: you can split audio and video into separate files, multiple audio and video streams for different formats and qualities are possible, the playlists are easy to read, and there is built-in support on Apple devices (iOS, macOS and tvOS). The data is delivered over an HTTP/HTTPS connection, so it can also be cached perfectly on the server side or used in combination with a CDN.

This article contains all my personal experience with HLS in FFmpeg. Some things might have changed and might not be up to date.

Requirements

First of all, a recent version of FFmpeg is required. In the next steps I will use features that are only available since version 4.0, so be sure that you use a fresh release.

To be able to follow this article you should have basic knowledge about audio and video, know the difference between a container format and a codec (e.g. h264, aac, mp3), and know how to use a command line.

So we're starting with the first parameters of the FFmpeg command:

-listen 1 -i rtmp:///stream01

RTMP is a commonly used protocol for this, and it's supported by most streaming clients.
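To show the `-listen 1 -i rtmp:///stream01` input parameters in context, here is a minimal sketch of a complete command. Everything after the input is my own illustrative assumption (typical H.264/AAC encoding and basic HLS muxer options), not the article's final setup:

```shell
# Sketch: FFmpeg acts as an RTMP server (-listen 1) and waits for an
# incoming stream on rtmp:///stream01, then writes an HLS live playlist.
# The output flags below are assumed placeholder defaults, not the
# article's configuration.
ffmpeg -listen 1 -i rtmp:///stream01 \
  -c:v libx264 -c:a aac \        # encode video as H.264, audio as AAC
  -f hls \                       # use the HLS muxer
  -hls_time 4 \                  # target segment length in seconds
  -hls_list_size 5 \             # keep only the last 5 segments listed
  stream.m3u8                    # playlist file served over HTTP(S)
```

This command blocks and waits for a client (e.g. OBS or another FFmpeg instance) to connect and publish to the RTMP URL; the resulting `stream.m3u8` and segment files can then be served by any web server.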