An Introduction to WebRTC Simulcast

blog.livekit.io

64 points by shanewwarren 4 years ago · 14 comments

Benjamin_Dobell 4 years ago

This approach of the broadcasting "peer" sending multiple streams (at varying bitrates) to a relay server is a neat solution. It avoids transcoding on the relay servers, which keeps them cheap. It is superior to:

> Lower[ing] the bitrate of everyone’s streams so it doesn’t overwhelm the slow user (i.e. the lowest common denominator)

However, it's still a lowest common denominator solution with respect to what codec is being used. The broadcasting "peer" has to send using a codec that every other peer supports. Essentially that means you're stuck with the lowest common denominator codec.

If the relay server were instead to transcode, the broadcasting peer could send a single stream (lower bandwidth) in the best codec it supports, and the relay server would generate additional streams at various bitrates and in various codecs (which is exactly what this solution is trying to avoid).

So unfortunately there's still a trade-off.

The article does mention "scalable video codecs" as a future improvement. However, with the approach described here we're a long way off taking advantage of them, because while there's no transcoding, every participant would need to support VP9/AV1.
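
As a rough, illustrative sketch of what "sending multiple streams at varying bitrates" looks like from the publishing browser (not LiveKit's actual code; the rid labels, resolution scales, and bitrate caps below are made up):

    // Publish three simulcast layers of one camera track to an SFU.
    // Layer names, scales, and bitrate caps are illustrative only.
    async function publishSimulcast(): Promise<RTCSessionDescriptionInit> {
      const pc = new RTCPeerConnection();
      const stream = await navigator.mediaDevices.getUserMedia({ video: true });
      const [track] = stream.getVideoTracks();

      pc.addTransceiver(track, {
        direction: "sendonly",
        sendEncodings: [
          { rid: "q", scaleResolutionDownBy: 4.0, maxBitrate: 150_000 },  // quarter res
          { rid: "h", scaleResolutionDownBy: 2.0, maxBitrate: 500_000 },  // half res
          { rid: "f", maxBitrate: 2_500_000 },                            // full res
        ],
      });

      // The SFU forwards whichever layer fits each subscriber's downlink;
      // it never transcodes, which is the whole point of simulcast.
      const offer = await pc.createOffer();
      await pc.setLocalDescription(offer);
      return offer; // hand this to the SFU over your signaling channel
    }

The SFU sees the layers as separate RTP streams identified by those rid values, so it can pick a layer per subscriber without ever decoding the video.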

  • jtsiskin 4 years ago

    How often is that really a problem though?

    • allo37 4 years ago

      Codec support is a very annoying issue on mobile. Hardware support for VP9 (let alone AV1) is relatively recent on Android devices, and Apple doesn't provide HW acceleration for Google's codecs at all (or at least it didn't when I was working on this).

      • Sean-Der 4 years ago

        Some Android devices don’t have H264 HW either. libwebrtc also doesn’t support SW H264.

        That has also added lots of complexity and confusion for people building things. Quite a few footguns in this area :)

        • allo37 4 years ago

          If we want to get really pedantic... libwebrtc will support SW H264 if you enable it explicitly at compile time; it's off by default since you're technically on the hook for royalties if you compile it in yourself. Footguns galore :)

  • davidz 4 years ago

    > However, it's still a lowest common denominator solution with respect to what codec is being used. The broadcasting "peer" has to send using a codec that every other peer supports. Essentially that means you're stuck with the lowest common denominator codec.

    This is correct, though I wouldn't call them "inferior codecs": all WebRTC-capable browsers support both H.264 and VP8, as required by the spec. (A quick way to check what a given browser can actually send is sketched after this thread.)

  • MayeulC 4 years ago

    I would really like to see some projects that leverage SVC, as I haven't seen a single example (and ffmpeg doesn't support it). (A sketch of requesting an SVC encoding in the browser follows below.)
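
To make the codec point and the SVC wish above a bit more concrete, here is a hedged TypeScript sketch (the mode string, bitrate cap, and function name are illustrative, and the scalabilityMode knob comes from the WebRTC-SVC extension, which only some Chromium-based browsers implement): first check which codecs the local browser can send, then request a single scalable VP9 stream if it's available.

    // 1) Which video codecs can this browser send? H.264 and VP8 are
    //    mandatory to implement for WebRTC browsers; VP9/AV1 support varies.
    const sendable = (RTCRtpSender.getCapabilities("video")?.codecs ?? [])
      .map((c) => c.mimeType.toLowerCase());
    for (const mime of ["video/h264", "video/vp8", "video/vp9", "video/av1"]) {
      console.log(mime, sendable.includes(mime) ? "can send" : "not offered");
    }

    // 2) If VP9 is available, request one scalable stream (WebRTC-SVC)
    //    instead of simulcast. "L3T3" = 3 spatial x 3 temporal layers;
    //    the mode and bitrate cap are illustrative. The cast keeps older
    //    DOM typings happy, since scalabilityMode is an extension field.
    async function publishSvcVp9(): Promise<void> {
      const pc = new RTCPeerConnection();
      const stream = await navigator.mediaDevices.getUserMedia({ video: true });
      pc.addTransceiver(stream.getVideoTracks()[0], {
        direction: "sendonly",
        sendEncodings: [
          { scalabilityMode: "L3T3", maxBitrate: 1_500_000 } as RTCRtpEncodingParameters,
        ],
      });
      // ...createOffer/setLocalDescription and signaling as usual...
    }

Even with SVC, subscribers still have to decode VP9/AV1, which is the lowest-common-denominator issue raised upthread.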

Sean-Der 4 years ago

LiveKit is an amazing project. If you haven’t had a chance, check it out. It’s the first time I have seen WebRTC approached by people who know cloud/scaling, and it’s all open source!

pthatcherg 4 years ago

Does anyone know where LiveKit implements congestion control? I can't find it looking through their code, and to me that's a major sign of the maturity or completeness of an SFU. For example, Jitsi, Signal's SFU, and (I think) mediasoup have it, but I've not yet seen a Pion-based implementation of congestion control. I was hoping LiveKit had fixed that, but I can't find it.

pthatcherg 4 years ago

That's a well-written article covering the basics of simulcast.

If you're interested in seeing a Rust implementation of an SFU that does simulcast forwarding, we (at Signal) recently open-sourced ours:

https://github.com/signalapp/Signal-Calling-Service/blob/mai...

jvilalta 4 years ago

When I click on "Get Started" it sends me to Medium, which now asks me to create an account. Not the best experience.
