For today’s livestream and video streaming needs, low latency is key. This is where edge computing and Content Delivery Networks (CDNs) shine – by bringing data storage and computation closer to end users.
Connecting with people via livestream was once thought of as something largely used by the gaming and eSports community. However, the tech industry has caught on to livestreaming in recent years, with many technology media outlets and software companies beginning to livestream from booths, trade show floors, and at conferences worldwide. Today and for the foreseeable future as the world adjusts to the impact of COVID-19, software companies have to pivot their event focus and rely solely on livestream, in lieu of in-person conferences and meetups.
Livestreaming effectively first requires knowing how to digitize content that was once delivered in person. That content then needs to be streamed efficiently, in a way that is impactful for your end users, which means streaming with as low a latency as possible. A stream with high latency can drop frames, lag, and fall out of sync for end users, who will quickly stop watching your broadcast.
Fortunately, today’s livestreaming is bolstered by edge computing. Establishing low latency for Content Delivery Networks (CDNs) and multistreaming platforms such as Restream.io is where edge computing shines.
What is a CDN and Why Do You Need It?
Livestreaming at scale for today’s software organizations often means that events will need to be streamed worldwide. Depending on your livestream platform of choice, there are a variety of CDNs available to you, such as Cloudflare, Akamai, and Verizon Edgecast. Multistreaming one’s content can be done using services such as Restream.io, Mixer, Vimeo, Streamcloud, and more.
Content Delivery Networks are networks of globally distributed servers and data centers that are used to transport and transmit media files. Streaming via a single server is generally a recipe for disaster, as bandwidth allocation issues, traffic bottlenecks, dropped frames, and latency can occur. Many multistreaming platforms such as Restream.io or OBS have streamers select the geographical location closest to them in order to provide their stream with optimal performance. When an end user views a livestream, a CDN will likewise deliver the content from whichever location that user is nearest to, rather than from a server further away from the viewer in question.
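The nearest-location selection described above can be sketched with a simple distance calculation. The points of presence (PoPs) and coordinates below are hypothetical, and real CDNs steer viewers with anycast routing and DNS rather than an explicit lookup, but picking the minimum-distance server is the core idea:

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical edge PoP locations (name -> latitude, longitude).
POPS = {
    "frankfurt": (50.11, 8.68),
    "virginia": (38.95, -77.45),
    "singapore": (1.35, 103.82),
}

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(h))

def nearest_pop(viewer):
    """Route the viewer to the PoP closest to their coordinates."""
    return min(POPS, key=lambda name: haversine_km(viewer, POPS[name]))

# A viewer in Paris is served from the Frankfurt PoP.
print(nearest_pop((48.86, 2.35)))  # frankfurt
```

Because the stream only has to travel from the nearest PoP rather than a distant origin, round-trip time drops and buffering becomes less likely.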
Making use of a CDN ensures less latency, better network stability, and reduces overall lag and buffer time. This means that your livestream content will take less time to reach your audience, with better quality overall. In the early days of livestreaming, all traffic was usually handled through one individual server. With the advent of CDNs, traffic loads can now be spread out equally across multiple distributed servers running at the edge.
Kubernetes and Livestreaming at the Edge
Edge computing has made huge advancements possible in livestreaming by bringing data storage and computation closer to end users, thus resulting in overall better livestream performance.
A relatively new open source project, created in May 2020 by Sysdig Chief Open Source Advocate Kris Nova, aims to bring the power of Kubernetes to the open source streaming software platform Open Broadcaster Software (OBS). The project is called knobs (Kubernetes Native Open Broadcasting Software). While it is just getting started, the project consists of plugins, servers, and clients that will run live video production content and recordings on Kubernetes (or Kubernetes-adjacent) systems. Contributions from members of the Kubernetes community who are interested in knobs are encouraged, and pull requests are accepted.
Edge computing also brings with it a variety of new use cases for livestreaming. With these opportunities also come challenges, such as ensuring scalability under heavy traffic loads. Thankfully, by serving content from a server located at the edge to split and replicate content, bottlenecks and quality degradation are drastically reduced.
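The split-and-replicate pattern above can be illustrated with a minimal fan-out relay. This is an in-memory sketch under simplified assumptions: the `EdgeRelay` class and its viewer queues are hypothetical stand-ins for a real edge server that would replicate video segments over the network.

```python
from collections import deque

class EdgeRelay:
    """Sketch of an edge node: one ingest stream fanned out to many viewers."""

    def __init__(self):
        self.viewers = {}

    def subscribe(self, viewer_id):
        # Each viewer gets a local queue of pending segments.
        self.viewers[viewer_id] = deque()

    def ingest(self, chunk):
        # One upstream fetch, N local copies: the origin is hit once
        # no matter how many viewers this edge node serves.
        for queue in self.viewers.values():
            queue.append(chunk)

    def read(self, viewer_id):
        return self.viewers[viewer_id].popleft()

relay = EdgeRelay()
relay.subscribe("alice")
relay.subscribe("bob")
relay.ingest(b"segment-001")
print(relay.read("alice"), relay.read("bob"))
```

Because replication happens at the edge, a spike in viewers increases load on nearby edge nodes rather than on the single origin server, which is how bottlenecks are avoided.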
One must also consider which streaming technologies are particularly well suited to edge computing, including Web Real-Time Communication (WebRTC) and serverless edge computing platforms such as EDJX, Red5Pro, and more. Many feel that WebRTC is especially well suited to edge computing and livestreaming, as it makes use of User Datagram Protocol (UDP) instead of Transmission Control Protocol (TCP), which is what underlies Real-Time Messaging Protocol (RTMP). Many also argue that WebRTC is inherently more secure than RTMP, as it uses DTLS and SRTP to encrypt streams.
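The UDP-versus-TCP distinction matters for latency because UDP has no handshake and no retransmission: each datagram stands alone, so a lost video frame is simply skipped rather than stalling the stream. A minimal sketch over localhost, with a made-up `frame-0001` payload standing in for a media packet:

```python
import socket

# Receiver: bind a UDP socket; the OS assigns a free port.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))
addr = recv_sock.getsockname()

# Sender: fire-and-forget. Unlike TCP, there is no connect() handshake
# and no acknowledgment; the datagram is sent immediately.
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sock.sendto(b"frame-0001", addr)

data, _ = recv_sock.recvfrom(2048)
print(data)  # b'frame-0001'

send_sock.close()
recv_sock.close()
```

With TCP (and therefore RTMP), a dropped packet triggers retransmission and head-of-line blocking, which surfaces to viewers as buffering; UDP lets real-time media trade perfect delivery for lower latency.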
That being said, WebRTC support is not yet universal; notably, it is not supported by Open Broadcaster Software. Still, if you require secure, edge-computing-powered livestreaming capabilities for your use case, it may be worth looking into WebRTC-backed CDN options such as Wowza.
For software companies looking to transform their event presence, virtualize their product marketing and demonstrations, or undertake customer outreach through livestreaming, edge computing is a powerful tool, one that will enable scalability, cost reduction, and reliability for those seeking to livestream and distribute live video content over a CDN not only today, but far into the future.
- Livestreaming content with low latency is key to creating an impactful, meaningful broadcast that resonates with end users
- Making use of a CDN ensures lower latency, fewer bottlenecks, and improved livestream performance
- While there are not many Kubernetes-powered livestream projects, a few open source options exist, and are open to contributions
- One must consider whether their livestream use case requires edge computing technologies such as WebRTC, or whether the traditional RTMP livestreaming route is preferable.