Low Latency Technology: A mile ahead

  • admin
  • December 18th, 2020
  • What is Video Latency? Its Significance in Live Streaming!

    Intro

    With the boom of OTT platforms, live video delivery is becoming increasingly popular and competitive. Consumers are eager to watch live events through their video streaming services, whether sports tournaments, product launches, virtual conferences, or webinars, and live video streaming has been gaining traction for years. You often hear promises of low-latency streaming from various OTT platforms, and sometimes even of ultra-low, zero, or no-latency solutions. So what does this mean, and what is the significance of low latency in video streaming? In this blog you will get a brief description of video latency and its significance in live streaming, and you will also see how low latency can provide a better viewing experience.

    What is Video Latency?

    Latency is the delay, or lag, in video streaming: the time taken to encode a live video, deliver it to an OTT device or player, and decode it before it plays on the screen. In simple terms, latency is the time it takes for an action captured at the source to appear on the viewer's screen. For instance, while you are watching a cricket match via your OTT streaming service, your neighbors may cheer before you do during the same live game. Because of latency, viewers never see the action in true real time during a live stream; there is always some delay.

    What causes latency?

    Because of the number of steps involved in video processing and delivery, many factors contribute to video latency. Encoding, decoding, delivery, playback, and every other stage of the streaming pipeline each add a layer of latency, and the exact impact depends on the components used and the architecture of the pipeline. Any single delay may be minimal, but all the components add up. Here are some key contributors to latency:

    Network Type & Speed

    Whether you use the public internet, a satellite link, or an MPLS network, the network you choose to transmit video affects both quality and latency. Network speed determines how many megabits or gigabits of data can be transferred per second, the distance traveled adds propagation delay, and the speed of a connection is measured in terms of throughput and bandwidth.
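    As a rough illustration, the sketch below separates the two network contributions mentioned above: the time needed to push the bits onto the link (throughput) and the time the signal spends in transit (distance). The bitrate, payload size, distance, and fibre propagation speed are illustrative assumptions, not figures from this article.

```python
# Minimal sketch: how network speed and distance each contribute to latency.
# All numbers below are illustrative assumptions.

FIBRE_SPEED_KM_PER_S = 200_000  # light in fibre travels at roughly 2/3 the speed of light in vacuum

def network_delay_ms(payload_megabits: float, bandwidth_mbps: float, distance_km: float) -> float:
    """Return transmission delay plus propagation delay, in milliseconds."""
    transmission = payload_megabits / bandwidth_mbps   # seconds to push the bits onto the wire
    propagation = distance_km / FIBRE_SPEED_KM_PER_S   # seconds for the signal to travel
    return (transmission + propagation) * 1000

# e.g. a 4-megabit video segment over a 20 Mbps link across 2,000 km:
print(round(network_delay_ms(4, 20, 2000), 1), "ms")   # ~210 ms
```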

    Components in the Streaming Workflow

    Latency is very sensitive to the video compression and encoding techniques used. From capturing the video at the camera, through the encoders and decoders, to the final display, each component in the workflow introduces its own processing delay, and together they contribute to video latency in varying degrees.
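    One concrete example of such a processing delay is encoder buffering: frame-based codecs often hold frames for look-ahead analysis and B-frame reordering before emitting any output. The sketch below estimates that delay; the frame counts and frame rate are illustrative assumptions.

```python
# Minimal sketch: delay introduced while an encoder buffers frames before output.

def encoder_delay_ms(fps: float, lookahead_frames: int, b_frames: int) -> float:
    """Approximate buffering delay from look-ahead and B-frame reordering."""
    buffered_frames = lookahead_frames + b_frames
    return buffered_frames / fps * 1000

# A quality-oriented configuration (40-frame look-ahead, 3 B-frames) at 30 fps:
print(round(encoder_delay_ms(30, 40, 3)), "ms")   # ~1433 ms before the first byte leaves the encoder

# A low-latency configuration that disables look-ahead and B-frames:
print(round(encoder_delay_ms(30, 0, 0)), "ms")    # ~0 ms of buffering delay
```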

    Streaming protocols

    Video latency is also affected by the protocols used for the contribution and distribution formats that carry the video to the viewing device. Even the type of error correction a chosen protocol uses to counter jitter and packet loss can add to latency, as can firewall traversal.
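    To make the error-correction point concrete, the sketch below contrasts two common approaches: retransmission (ARQ), which costs roughly one extra round trip per retry, and forward error correction (FEC), which costs a fixed wait for a full block of packets. The round-trip time, block size, and packet interval are illustrative assumptions.

```python
# Minimal sketch: why error correction adds latency. Numbers are illustrative.

def arq_delay_ms(rtt_ms: float, retries: int) -> float:
    """Extra delay when a lost packet must be retransmitted `retries` times."""
    return rtt_ms * retries

def fec_delay_ms(block_packets: int, packet_interval_ms: float) -> float:
    """Extra delay from waiting for a full FEC block before it can be decoded."""
    return block_packets * packet_interval_ms

print(arq_delay_ms(rtt_ms=80, retries=1), "ms")                      # 80 ms for one retransmission
print(fec_delay_ms(block_packets=20, packet_interval_ms=5), "ms")    # 100 ms of block delay
```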

    Video Player Buffer

    To deliver smooth playback, online video players must buffer the video before showing it. Buffer sizes differ based on the media specifications, but playback latency depends directly on the buffer size: the larger the buffer, the longer the delay during playback.
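    Putting the contributors together, the sketch below shows how individual delays accumulate into end-to-end latency for a segment-based (HLS/DASH-style) stream, with the player buffer typically dominating. The component values, segment duration, and buffered-segment count are illustrative assumptions, not measurements from this article.

```python
# Minimal sketch: individual delays summing to end-to-end latency.

SEGMENT_DURATION_S = 6   # common default segment length (assumption)
BUFFERED_SEGMENTS = 3    # many players buffer roughly 3 segments before starting (assumption)

latency_budget_ms = {
    "capture_and_encode": 500,
    "first_mile_upload": 200,
    "cdn_delivery": 300,
    "player_buffer": BUFFERED_SEGMENTS * SEGMENT_DURATION_S * 1000,
    "decode_and_render": 100,
}

total_ms = sum(latency_budget_ms.values())
print(f"end-to-end latency: ~{total_ms / 1000:.1f} s")   # ~19.1 s, dominated by the player buffer
```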

    Why or How does it Matter?

    Does video latency matter? The answer depends on the application. When streaming previously recorded content such as movies, web series, or shows, higher latency is acceptable if it buys better picture quality. In linear broadcast workflows, the delay between the camera feed and what actually airs is typically around 10 seconds. Furthermore, a short delay between broadcast production and playout is often intentional, to make room for closed captioning and live subtitling or to prevent obscenities from airing. On OTT platforms, video latency commonly ranges from 10 to 60 seconds, while interactive live streaming use cases need latency closer to 200 milliseconds, or effectively real time.

    How can latency be Reduced?

    You can reduce video latency with techniques that do not compromise picture quality. To keep latency low over a standard internet connection, you need an encoder and decoder combination engineered for low delay; latest-generation video encoders and decoders can maintain latency of roughly under 50 ms. You can also maintain high picture quality while compressing the video to low bitrates by using HEVC. By selecting a suitable video transport protocol, you can achieve low latency and watch high-quality video over the public internet. Finally, some form of error correction is needed, as part of the streaming protocol, to recover from packet loss so that you can stream video successfully over the internet without compromising quality.
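    As a rough sketch of what such a low-delay encoder configuration can look like, the snippet below drives ffmpeg from Python with HEVC (libx265) at a low bitrate over SRT. It assumes an ffmpeg build with libx265 and SRT support; the input file, destination URL, bitrate, and GOP length are illustrative placeholders rather than recommendations from this article.

```python
# Minimal sketch: a low-latency HEVC encode pushed over SRT via ffmpeg.
import subprocess

subprocess.run([
    "ffmpeg",
    "-re", "-i", "input.mp4",       # read the source at its native frame rate (placeholder file)
    "-c:v", "libx265",              # HEVC keeps picture quality high at low bitrates
    "-preset", "ultrafast",         # minimise encoding delay
    "-tune", "zerolatency",         # disable look-ahead and frame reordering
    "-b:v", "2M",                   # compress to a low bitrate (illustrative value)
    "-g", "30",                     # short GOP so players can join quickly (illustrative value)
    "-f", "mpegts",
    "srt://example.com:9000",       # SRT adds retransmission-based error correction (placeholder URL)
], check=True)
```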

    Low Latency and Real-time Streaming

    Delivering low-latency and real-time video streaming requires specific streaming protocols. Commonly used options include RTMP (Real-Time Messaging Protocol), WebRTC (Web Real-Time Communication), and, as alternatives to WebRTC, low-latency variants of HLS and MPEG-DASH. For live streaming, it also helps to keep the encoder and the streaming service geographically close to each other.
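    For orientation, the sketch below lists approximate glass-to-glass latency ranges commonly quoted for each protocol family. Real-world figures vary widely with configuration, network conditions, and player behaviour, so treat these as ballpark assumptions rather than guarantees.

```python
# Minimal sketch: approximate latency ranges commonly quoted per protocol family.
TYPICAL_LATENCY_SECONDS = {
    "WebRTC":                 (0.2, 1),   # near real time, suited to interactive use cases
    "RTMP (contribution)":    (2, 5),     # still common between encoder and ingest
    "Low-Latency HLS / DASH": (2, 6),     # chunked delivery of partial segments
    "Standard HLS / DASH":    (10, 45),   # full segments plus a multi-segment player buffer
}

for protocol, (low, high) in TYPICAL_LATENCY_SECONDS.items():
    print(f"{protocol:<26} ~{low}-{high} s")
```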