FFmpeg Development Notes (42): Using ZLMediaKit to Launch an SRT Live Video Service

The book "FFmpeg Development in Action: From Zero Basics to Short Video Online" introduces a lightweight streaming server MediaMTX in Chapter 10, through which you can test push and pull streams of streaming protocols such as RTSP/RTMP. However, MediaMTX is too simple to be applied to the production environment of real live broadcasts, and the real streaming media server that can be used in the production environment should also look at SRS or ZLMediaKit.

ZLMediaKit is a Chinese open-source streaming media server that supports RTSP, RTMP, SRT and other mainstream live-streaming protocols; its installation steps are covered in the earlier article "Installing ZLMediaKit in a Linux Environment to Implement Video Streaming". Pushing streams over RTSP/RTMP with ZLMediaKit and ffmpeg is already described in detail in that article, so this post focuses on how to push streams over the SRT protocol with ZLMediaKit and ffmpeg.
ZLMediaKit supports SRT by default at compile time and startup. Checking ZLMediaKit's configuration file, the srt section reads as follows, which shows that ZLMediaKit assigns port 9000 to the SRT protocol by default.

[srt]
latencyMul=4
pktBufSize=8192
port=9000
timeoutSec=5
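
If you want to confirm that the SRT port is actually open after ZLMediaKit starts, a quick socket check like the following sketch helps (SRT runs over UDP, and the port here matches the config above):

# List listening UDP sockets and filter for the SRT port:
ss -lun | grep 9000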

Beyond that, ZLMediaKit needs no other configuration changes; once it is started, the ffmpeg command further below pushes a video file to the SRT address. Note that the FFmpeg on the Linux server must have the libsrt library integrated, otherwise ffmpeg cannot push streams to an SRT address; for detailed integration steps, see the earlier article "Integrating libsrt and librist with FFmpeg in a Linux Environment".
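
To quickly confirm whether a given ffmpeg build includes libsrt, one simple check is to list the protocols it supports and look for srt, for example:

# If srt appears in the input/output protocol lists, the build can handle SRT addresses:
ffmpeg -protocols 2>/dev/null | grep srt

With that confirmed, push the test file: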

ffmpeg -re -stream_loop -1 -i "/usr/local/src/test/" -c copy -f mpegts 'srt://127.0.0.1:9000?streamid=#!::r=live/test,m=publish'

Note the second half of the SRT address in the above command, "r=live/test,m=publish". Here "r=live/test" means the SRT stream is named "live/test", and "m=publish" means the address is a publishing endpoint, intended for the pushing side.
ZLMediaKit also has requirements on the container format of the video source: not only must the source file be in TS format, the pushed stream must be in TS format as well, which is why the ffmpeg command adds "-f mpegts" to convert the output to an MPEG-TS stream. If the source file is not in TS format, or the output is not converted to mpegts, later playback of the SRT link via ffplay will report the following error.

non-existing PPS 0 referenced
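
If the source is, say, an MP4 file, one way to avoid this is to remux it into TS first. A minimal sketch, assuming a hypothetical input.mp4 (the h264_mp4toannexb bitstream filter converts H.264 from MP4's length-prefixed form to the Annex B form that MPEG-TS expects):

# Remux MP4 to MPEG-TS without re-encoding (filenames are hypothetical):
ffmpeg -i input.mp4 -c copy -bsf:v h264_mp4toannexb -f mpegts test.ts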

In addition, the audio and video codecs supported by ZLMediaKit are listed in src/Extension/; the full list of supported codecs is shown below.

#define CODEC_MAP(XX) \
    XX(CodecH264,  TrackVideo, 0, "H264", PSI_STREAM_H264, MOV_OBJECT_H264)          \
    XX(CodecH265,  TrackVideo, 1, "H265", PSI_STREAM_H265, MOV_OBJECT_HEVC)          \
    XX(CodecAAC,   TrackAudio, 2, "mpeg4-generic", PSI_STREAM_AAC, MOV_OBJECT_AAC)   \
    XX(CodecG711A, TrackAudio, 3, "PCMA", PSI_STREAM_AUDIO_G711A, MOV_OBJECT_G711a)  \
    XX(CodecG711U, TrackAudio, 4, "PCMU", PSI_STREAM_AUDIO_G711U, MOV_OBJECT_G711u)  \
    XX(CodecOpus,  TrackAudio, 5, "opus", PSI_STREAM_AUDIO_OPUS, MOV_OBJECT_OPUS)    \
    XX(CodecL16,   TrackAudio, 6, "L16", PSI_STREAM_RESERVED, MOV_OBJECT_NONE)       \
    XX(CodecVP8,   TrackVideo, 7, "VP8", PSI_STREAM_VP8, MOV_OBJECT_VP8)             \
    XX(CodecVP9,   TrackVideo, 8, "VP9", PSI_STREAM_VP9, MOV_OBJECT_VP9)             \
    XX(CodecAV1,   TrackVideo, 9, "AV1", PSI_STREAM_AV1, MOV_OBJECT_AV1)             \
    XX(CodecJPEG,  TrackVideo, 10, "JPEG", PSI_STREAM_JPEG_2000, MOV_OBJECT_JPEG)

As can be seen, if the video file to be pushed uses a codec outside the above list, it cannot be pushed normally through the SRT service address.
After running ffmpeg's SRT push command, ZLMediaKit outputs the following log messages, which show that its SRT push function is working normally.
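
To check up front which codecs a source file uses, and to transcode into a supported pair such as H264/AAC when needed, something like the following sketch works (input.mkv is a hypothetical file; the SRT address matches the push command above):

# Show the codec of each stream in the source file:
ffprobe -v error -show_entries stream=codec_type,codec_name input.mkv

# Transcode to H264 video and AAC audio on the fly while pushing over SRT:
ffmpeg -re -stream_loop -1 -i input.mkv -c:v libx264 -c:a aac -f mpegts 'srt://127.0.0.1:9000?streamid=#!::r=live/test,m=publish'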

[MediaServer] [576478-event poller 0] :103 onRecv | 1-11(127.0.0.1:33630)
[MediaServer] [576478-event poller 0] :166 operator() | test(127.0.0.1:33630) allow srt push stream
[MediaServer] [576478-event poller 0] :143 onTrack | Got track: H264
[MediaServer] [576478-event poller 0] :143 onTrack | Got track: mpeg4-generic
[MediaServer] [576478-event poller 0] :97 onStream | Add track finished
[MediaServer] [576478-event poller 0] :161 emitAllTrackReady | All track ready use 172ms
[MediaServer] [576478-event poller 0] :517 emitEvent | Media Registration:fmp4://__defaultVhost__/live/test
[MediaServer] [576478-event poller 0] :551 onAllTrackReady | stream: schema://__defaultVhost__/app/stream , codec info: mpeg4-generic[48000/2/16] H264[1280/720/25]
[MediaServer] [576478-event poller 0] :517 emitEvent | Media Registration:rtmp://__defaultVhost__/live/test
[MediaServer] [576478-event poller 0] :517 emitEvent | Media Registration:rtsp://__defaultVhost__/live/test
[MediaServer] [576478-event poller 0] :517 emitEvent | Media Registration:ts://__defaultVhost__/live/test
[MediaServer] [576478-event poller 0] :517 emitEvent | Media Registration:hls://__defaultVhost__/live/test
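
The log above also shows the same stream being registered under rtmp, rtsp, ts and hls, so in principle it can be played over those protocols too, for example (assuming ZLMediaKit's default ports, 1935 for RTMP and 554 for RTSP, and playing locally on the server):

ffplay rtmp://127.0.0.1/live/test
ffplay rtsp://127.0.0.1/live/test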

Next, following "1.3 Installing FFmpeg on Windows" in the book "FFmpeg Development in Action: From Zero Basics to Short Video Online", install FFmpeg on your PC, open an MSYS command line, and run the following ffplay command to pull and play the stream from the SRT address. Note that the FFmpeg on your computer must have the libsrt library integrated, otherwise ffplay cannot play SRT links; for detailed integration steps, see the earlier article "Integrating libsrt with FFmpeg in a Windows Environment".

ffplay -i 'srt://:9000?streamid=#!::r=live/test,m=request'

The above SRT pull address is similar to the earlier push address, except that the IP address is changed to the server's external IP and the "m=publish" at the end of the link is changed to "m=request", where request indicates that the address is a request endpoint, intended for the pulling side.
ffplay runs, pops up a player window, and plays the video and audio normally. Meanwhile, observe ZLMediaKit's service log as shown below:
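
Incidentally, the same pull address also works with ffmpeg itself, for example to record the live stream to a local file (the output filename record.ts is hypothetical, and using 127.0.0.1 assumes the pull is run on the server itself):

ffmpeg -i 'srt://127.0.0.1:9000?streamid=#!::r=live/test,m=request' -c copy record.ts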

[MediaServer] [576478-event poller 0] :103 onRecv | 2-16(112.5.138.145:57022)
[MediaServer] [576478-event poller 0] :731 onShutdown | peer close connection
[MediaServer] [576478-event poller 0] :118 onError | 2-16(112.5.138.145:57022) 6(peer close connection)
[MediaServer] [576478-event poller 0] :14 ~SrtTransportImp | test(112.5.138.145:57022) srt player(__defaultVhost__/live/test) closed, duration(s):16

As the above logs show, ZLMediaKit has successfully implemented both pushing and pulling of live video streams over the SRT protocol.

For more details on FFmpeg development, see the book "FFmpeg Development in Action: From Zero Basics to Short Video Online".