Devlog

Posts tagged with #rust

Some time ago, Edward and I wrote a new element that allows clocking a GStreamer pipeline from an MPEG-TS stream, for example received via SRT.

This new element, mpegtslivesrc, wraps around any existing live source element, e.g. udpsrc or srtsrc, and provides a GStreamer clock that approximates the sender's clock. By using this clock as the pipeline clock, the whole pipeline can run at the same speed at which the sender is producing the stream, without having to implement any kind of clock drift mechanism like skewing or resampling. Without this, it is currently necessary to adjust the timestamps of media coming out of GStreamer's tsdemux element, which is problematic if accurate timestamps are required or if the stream is to be stored in a file: a 25fps stream, for example, would no longer have exactly 40ms inter-frame timestamp differences.

The clock is approximated by making use of the in-stream MPEG-TS PCR, which essentially carries the sender's clock time at specific points inside the stream, and correlating those values with the local receive times via a linear regression to estimate the relative rate between the sender's clock and the local system clock.
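
As a rough illustration of the idea, and not the element's actual implementation, the following Rust sketch fits a line through (local receive time, PCR time) sample pairs by ordinary least squares; the slope is the relative clock rate. The function name and the f64 nanosecond representation are purely illustrative, and a real implementation would also have to deal with PCR wraparound, discontinuities and outliers.

// Illustrative sketch, not mpegtslivesrc's actual code: fit
// sender_time ≈ rate * local_time + offset by least squares over
// (local receive time, PCR time) pairs, both in nanoseconds.
fn fit_sender_clock(samples: &[(f64, f64)]) -> Option<(f64, f64)> {
    if samples.len() < 2 {
        return None;
    }
    let n = samples.len() as f64;
    let mean_x = samples.iter().map(|&(x, _)| x).sum::<f64>() / n;
    let mean_y = samples.iter().map(|&(_, y)| y).sum::<f64>() / n;
    let (mut cov, mut var) = (0.0, 0.0);
    for &(x, y) in samples {
        cov += (x - mean_x) * (y - mean_y);
        var += (x - mean_x) * (x - mean_x);
    }
    if var == 0.0 {
        return None;
    }
    let rate = cov / var; // ~1.0; >1.0 means the sender's clock runs faster
    let offset = mean_y - rate * mean_x;
    // The approximated sender clock time for a local time t is then
    // rate * t + offset.
    Some((rate, offset))
}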

Usage of the element is as simple as:

$ gst-launch-1.0 mpegtslivesrc source='srtsrc location=srt://1.2.3.4:5678?latency=150&mode=caller' ! tsdemux skew-corrections=false ! ...
$ gst-launch-1.0 mpegtslivesrc source='udpsrc address=1.2.3.4 port=5678' ! tsdemux skew-corrections=false ! ...

Addition 2025-06-28: If you're using an older GStreamer version (< 1.28), you'll have to set the ignore-pcr=true property on tsdemux instead. The skew-corrections=false property was only added in 1.28 and allows for more reliable handling of MPEG-TS timestamp discontinuities.

A similar approach for clocking is implemented in the AJA source element and the NDI source element when the clocked timestamp mode is configured.



When using the hlssink3 and hlscmafsink elements, it's now possible to track new fragments as they are added by listening for the hls-segment-added message on the pipeline's bus:

Got message #67 from element "hlscmafsink0" (element): hls-segment-added, location=(string)segment00000.m4s, running-time=(guint64)0, duration=(guint64)3000000000;
Got message #71 from element "hlscmafsink0" (element): hls-segment-added, location=(string)segment00001.m4s, running-time=(guint64)3000000000, duration=(guint64)3000000000;
Got message #74 from element "hlscmafsink0" (element): hls-segment-added, location=(string)segment00002.m4s, running-time=(guint64)6000000000, duration=(guint64)3000000000;

This is similar to how you would listen for splitmuxsink-fragment-closed when using the older hlssink2.
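
A minimal Rust sketch of listening for this message follows; the test pipeline in it is purely illustrative, while the message name and its location, running-time and duration fields are exactly as shown in the log output above:

use gstreamer as gst;
use gst::prelude::*;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    gst::init()?;

    // Illustrative pipeline; any pipeline containing hlssink3 or
    // hlscmafsink posts the same message.
    let pipeline = gst::parse::launch(
        "videotestsrc is-live=true num-buffers=300 ! x264enc ! h264parse ! hlscmafsink",
    )?;
    pipeline.set_state(gst::State::Playing)?;

    let bus = pipeline.bus().expect("pipeline has a bus");
    for msg in bus.iter_timed(gst::ClockTime::NONE) {
        use gst::MessageView;
        match msg.view() {
            // hls-segment-added arrives as an element message carrying a
            // correspondingly named structure.
            MessageView::Element(elem) => {
                if let Some(s) = elem.structure() {
                    if s.name() == "hls-segment-added" {
                        let location = s.get::<&str>("location").unwrap();
                        let running_time = s.get::<u64>("running-time").unwrap();
                        let duration = s.get::<u64>("duration").unwrap();
                        println!("added {location}: running-time {running_time}, duration {duration}");
                    }
                }
            }
            MessageView::Eos(..) => break,
            MessageView::Error(err) => {
                eprintln!("error: {}", err.error());
                break;
            }
            _ => (),
        }
    }

    pipeline.set_state(gst::State::Null)?;
    Ok(())
}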



Last month, as part of the GTK 4.14 release, GTK gained support for directly importing DMABufs on Wayland. Among other things, this makes it possible to pass decoded video frames from hardware decoders to GTK, which can then, under certain circumstances, forward the DMABuf directly to the Wayland compositor. And under even more special circumstances, it can then be passed directly to the GPU driver. Matthias wrote some blog posts about the details.

In short, this considerably reduces CPU usage and power consumption when using a suitable hardware decoder and running GTK on Wayland. A suitable hardware decoder in this case is one provided via VA, e.g. by Intel or (newer) AMD GPUs, but unfortunately not by NVIDIA, because their drivers simply don't support DMABufs.

I've added support for this to the GStreamer GTK4 video sink, gtk4paintablesink, which is part of the GStreamer Rust plugins. Previously, it was only possible to pass RGB frames in system memory (i.e., in the case of hardware decoders, after downloading them from the GPU) or GL textures (with all kinds of complications) from GStreamer to GTK4.
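
For context, this is roughly how the sink is wired into a GTK4 application: it exposes the video as a GdkPaintable via its paintable property, which any widget such as gtk::Picture can render. The sketch below is illustrative rather than taken from a real application; the application id and the videotestsrc pipeline are made up, while the element name and the paintable property are the sink's actual API.

use gstreamer as gst;
use gst::prelude::*;
use gtk::prelude::*;
use gtk::{gdk, glib};

fn build_ui(app: &gtk::Application) {
    // Illustrative pipeline; a real application would plug in its
    // actual (hardware-decoded) video source here.
    let src = gst::ElementFactory::make("videotestsrc").build().unwrap();
    let sink = gst::ElementFactory::make("gtk4paintablesink")
        .build()
        .unwrap();
    let pipeline = gst::Pipeline::default();
    pipeline.add_many([&src, &sink]).unwrap();
    src.link(&sink).unwrap();

    // The sink renders into a GdkPaintable that GTK widgets can display.
    let paintable = sink.property::<gdk::Paintable>("paintable");
    let picture = gtk::Picture::for_paintable(&paintable);

    let window = gtk::ApplicationWindow::builder()
        .application(app)
        .title("gtk4paintablesink")
        .child(&picture)
        .build();
    window.present();

    pipeline.set_state(gst::State::Playing).unwrap();
    // Keep the pipeline alive for the application's lifetime; a real
    // application would store it instead of leaking it.
    std::mem::forget(pipeline);
}

fn main() -> glib::ExitCode {
    gst::init().unwrap();
    // Hypothetical application id.
    let app = gtk::Application::builder()
        .application_id("com.example.Gtk4SinkDemo")
        .build();
    app.connect_activate(build_ui);
    app.run()
}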

In general, the GTK4 sink now offers the most complete GStreamer / UI toolkit integration, even more so than the QML5/6 sinks, and it is widely used by various GNOME applications.