Compression is coming to ST2110

16.11.18 05:05 PM By Nils Finger

SMPTE ST 2110 Professional Media over IP Infrastructure, with the added Part 22 for compressed video essence

The long-awaited SMPTE ST 2110 suite of standards for professional media over IP infrastructures has been out for about a year now and is a major contributor to the industry's move toward IP-based infrastructures. The suite specifies the carriage, synchronization, and description of separate video, audio, and ancillary data streams over IP for live production, playout, and other professional media applications. Because every element carries timestamps, the streams can be routed separately and brought back together at any endpoint. This synchronized separation of streams, as opposed to SMPTE ST 2022, promises to simplify adding metadata such as captions, subtitles, Teletext, and time codes, to ease video editing, and to streamline tasks such as processing multiple audio languages and types.


Today, the suite is already being embraced by the industry, and many vendors offer equipment and solutions based on SMPTE ST 2110. For an overview of vendors already shipping ST 2110 products, check out the members of the AIMS Alliance.


To shed light on the suite, we have listed and explained its parts below:

Part 10 - System timing and definitions

SMPTE ST 2059 (PTP) is used to distribute time and timebase to every device in the system, and the separate streams are timestamped against that common reference. Part 10 specifies the various system clocks and how the RTP timestamps are calculated for video, audio and ANC signals. This allows each component flow (audio, video, metadata) to stay synchronized with the others while remaining an independent stream.
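
To make the timestamp relationship concrete, here is a minimal Python sketch of how an RTP timestamp can be derived from a PTP time value. The 90 kHz video and 48 kHz audio clock rates and the 32-bit wrap are the usual ST 2110-10 conventions; the function and the sample time value are illustrative, not taken from the standard.

# Minimal sketch: deriving RTP timestamps from a shared PTP time,
# assuming the common PTP/media epoch and the usual media clock rates.

RTP_WRAP = 2**32  # RTP timestamps are 32-bit values that wrap around

def rtp_timestamp(ptp_seconds: float, media_clock_hz: int) -> int:
    """Convert a PTP time (seconds since the PTP epoch) into an RTP timestamp."""
    return int(ptp_seconds * media_clock_hz) % RTP_WRAP

# The same PTP instant expressed in the video (90 kHz) and audio (48 kHz) clocks.
t = 1_700_000_000.02  # hypothetical PTP time in seconds
print(rtp_timestamp(t, 90_000))  # video essence timestamp
print(rtp_timestamp(t, 48_000))  # audio essence timestamp

Because both timestamps are taken from the same PTP instant, a receiver can re-align the video and audio flows even though they were routed as independent streams.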


Part 20 – Uncompressed active video

This standard specifies the real-time, RTP-based transport of uncompressed active video essence over IP networks. An SDP-based signalling method is defined for image technical metadata necessary to receive and interpret the stream.

It supports resolutions up to 32k x 32k, comfortably covering today's UHD formats; Y'Cb'Cr', RGB, XYZ and I'Ct'Cp' color encodings; HDR and HFR content; and sampling/bit-depth combinations such as 4:2:2/10, 4:2:2/12 and 4:4:4/16.
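
As an illustration of that SDP signalling, the sketch below contains a hypothetical session description for a 1080p59.94, 4:2:2 10-bit ST 2110-20 stream and pulls the image technical metadata out of the fmtp line. The multicast address, port, payload type and PTP grandmaster ID are invented; the fmtp parameter names follow those defined in ST 2110-20 (plus TP from ST 2110-21), but the exact attribute set can vary by implementation.

# Hypothetical SDP for an ST 2110-20 uncompressed video stream (addresses,
# port, payload type and grandmaster ID are invented for illustration).
SDP = """v=0
o=- 123456 11 IN IP4 192.168.100.2
s=Example ST 2110-20 video
t=0 0
m=video 50000 RTP/AVP 96
c=IN IP4 239.100.9.10/64
a=rtpmap:96 raw/90000
a=fmtp:96 sampling=YCbCr-4:2:2; width=1920; height=1080; depth=10; exactframerate=60000/1001; colorimetry=BT709; PM=2110GPM; SSN=ST2110-20:2017; TP=2110TPN
a=mediaclk:direct=0
a=ts-refclk:ptp=IEEE1588-2008:39-A7-94-FF-FE-07-CB-D0:37
"""

def parse_fmtp(sdp: str) -> dict:
    """Pull the image technical metadata out of the a=fmtp line."""
    for line in sdp.splitlines():
        if line.startswith("a=fmtp:"):
            params = line.split(" ", 1)[1]
            return dict(p.strip().split("=", 1) for p in params.split(";") if "=" in p)
    return {}

print(parse_fmtp(SDP))  # {'sampling': 'YCbCr-4:2:2', 'width': '1920', ...}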


Part 21 – Traffic shaping and delivery timing for video

Part 21 defines a timing model for SMPTE ST 2110-10 video RTP streams as measured at the output of the RTP sender, and it defines the sender SDP parameters used to signal the timing properties of such streams.
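
As a rough sketch of what that signalling looks like to a receiver, the snippet below interprets the TP (sender type) parameter carried in the video fmtp line. The three profile names (narrow, narrow linear, wide) are the ones Part 21 defines; the helper itself is illustrative only.

# Sketch: interpret the TP (sender type) parameter that ST 2110-21 adds to the
# video fmtp line. The profile names are from the standard; the mapping helper
# is an illustration, not a normative check.
SENDER_TYPES = {
    "2110TPN": "narrow (gapped) sender",
    "2110TPNL": "narrow linear sender",
    "2110TPW": "wide sender",
}

def describe_sender(fmtp_params: dict) -> str:
    return SENDER_TYPES.get(fmtp_params.get("TP", ""), "unknown sender type")

print(describe_sender({"TP": "2110TPN"}))  # narrow (gapped) sender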


Part 22 - Compressed video

Currently, ST 2110 officially specifies only the transport of uncompressed video over IP (even though compressed video can already be carried in practice, for example TICO over RTP per SMPTE RDD 35). With the upcoming Part 22, SMPTE ST 2110 will officially define a standardized way to transport compressed video in IP workflows, using codecs such as TICO (SMPTE RDD 35) or TICO-XS (JPEG XS).

The introduction of compressed video to ST 2110 amplifies the existing advantages of moving to IP-based workflows (flexibility, scalability, broad accessibility) by allowing users to transport high-bandwidth video such as 4K and 8K over cost-effective COTS 1GbE/10GbE networks. With ultra-low-latency, visually lossless codecs such as TICO or TICO-XS, compression becomes a solid, sustainable option for building cost-effective, bandwidth-efficient and high-quality live production workflows. It need not be inferior to uncompressed video in either quality or latency; it simply uses far less bandwidth and lets COTS equipment such as 1GbE and 10GbE switches carry multiple HD, 4K and 8K streams.
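
To put rough numbers on the bandwidth argument, the sketch below compares approximate uncompressed bit rates of common formats with what remains after a light mezzanine compression. The 4:1 and 10:1 ratios are illustrative assumptions, not figures from any standard, and blanking and packet overhead are ignored.

# Back-of-the-envelope bit rates (active video only, 4:2:2 at 10 bit, so about
# 20 bits per pixel). The compression ratios are assumptions chosen to show why
# 4K/8K can fit on 10GbE links once a light mezzanine codec is applied.
def active_video_gbps(width, height, fps, bits_per_pixel=20):
    return width * height * fps * bits_per_pixel / 1e9

for name, w, h, fps in [("HD 1080p60", 1920, 1080, 60),
                        ("UHD 4Kp60", 3840, 2160, 60),
                        ("UHD 8Kp60", 7680, 4320, 60)]:
    raw = active_video_gbps(w, h, fps)
    print(f"{name}: ~{raw:.1f} Gb/s uncompressed, "
          f"~{raw / 4:.1f} Gb/s at 4:1, ~{raw / 10:.2f} Gb/s at 10:1")

At roughly 10:1, even an 8Kp60 4:2:2 10-bit signal drops from about 40 Gb/s to about 4 Gb/s, which is why 10GbE (and, for HD, even 1GbE) links become viable.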


Part 30 – PCM digital audio

ST 2110-30 deals only with the real-time, RTP-based transport of PCM digital audio streams over IP networks. An SDP-based signalling method is defined for the metadata necessary to receive and interpret the stream. Non-PCM digital audio signals, which include compressed audio, are beyond the scope of this standard.
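
For a sense of the packet sizes involved, here is a small sketch based on the AES67 conventions that ST 2110-30 builds on (48 kHz sampling, L24 giving 3 bytes per sample, and 1 ms or 125 microsecond packet times). The helper and its default values are illustrative.

# Sketch: RTP payload size for an ST 2110-30 style PCM stream, assuming the
# AES67-style parameters noted above (48 kHz, L24 = 3 bytes per sample).
def payload_bytes(sample_rate_hz=48_000, packet_time_ms=1.0, channels=2, bytes_per_sample=3):
    samples_per_packet = int(sample_rate_hz * packet_time_ms / 1000)
    return samples_per_packet * channels * bytes_per_sample

print(payload_bytes())                                    # 288 bytes: stereo L24, 1 ms packets
print(payload_bytes(packet_time_ms=0.125, channels=8))    # 144 bytes: 8 channels, 125 us packets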

Part 31 – AES3 transparent transport

Part 31 can also handle non-PCM audio. It specifies the real-time, RTP-based transport of AES3 signals over IP networks, referenced to a network reference clock. As in AES3 itself, each signal always carries two audio channels.

Part 40 – SMPTE ST 291-1 ancillary data

ST 2110-40 essentially describes how to use IETF RFC 8331 within a 2110 system to generically wrap ancillary data items in IP. It specifies the transport of SMPTE ST 291-1 Ancillary (ANC) data packets related to digital video streams over IP networks, enabling break-away routing of audio and VANC.
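
As a rough illustration, the snippet below holds a hypothetical SDP media section for such an ANC flow. The smpte291 payload format and the DID_SDID parameter come from RFC 8331; the port, payload type, multicast address and grandmaster ID are invented, and DID/SDID 0x61/0x01 is used here only as an example of a captioning-related ANC packet type.

# Hypothetical SDP media section for an ST 2110-40 ancillary-data flow.
# "smpte291" is the payload format registered by RFC 8331; the DID_SDID pair
# indicates which ANC packet types the sender emits. All numeric values are invented.
ANC_SDP = """m=video 50010 RTP/AVP 100
c=IN IP4 239.100.9.12/64
a=rtpmap:100 smpte291/90000
a=fmtp:100 DID_SDID={0x61,0x01}
a=mediaclk:direct=0
a=ts-refclk:ptp=IEEE1588-2008:39-A7-94-FF-FE-07-CB-D0:37
"""
print(ANC_SDP)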


About SMPTE:

For more than a century, the people of the Society of Motion Picture and Television Engineers® (SMPTE®, pronounced “simp-tee”) have sorted out the details of many significant advances in media and entertainment technology, from the introduction of “talkies” and color television to HD and UHD (4K, 8K) TV. Since its founding in 1916, the Society has received an Oscar® and multiple Emmy® Awards for its work in advancing moving-imagery engineering across the industry. SMPTE has developed thousands of standards, recommended practices, and engineering guidelines, more than 800 of which are in force today.


For more information visit: https://www.smpte.org/st-2110