What's The Trouble With I/P?

Author: George Hoover / October 23, 2015

There is a lot of talk in the trade press about productions moving to I/P. In a way, that talk is generating confusion in the industry and stalling the purchase of new equipment. I think part of the confusion over the topic of I/P is the broad range of practices this generic term seems to cover:

1. Transport of video signals between distant facilities such as OBs, transmitter plants, satellite farms, etc., via MPEG, JPEG, SMPTE or other standards over satellite or land lines

2. Storage of video as files on servers and edit systems through the use of various codecs

3. A scheme for transporting streams of video and associated audio, metadata and time code through a production facility, connecting production-facing devices such as cameras, production switchers, record and playback devices (EVS, XDCAM, Ki Pro, etc.) and graphics systems, with the goal of recording the stream and/or transporting it to another facility or device at some distance

Numbers one and two the industry has been doing for years. That's how we move signals back and forth between sites and our clients' facilities, and how we store and edit content.

Number three is what is getting the attention today as we ponder the potential benefit of using the physical I/P layer for live or near-live multi-camera, multi-source production. Interestingly enough, outside of the OB marketplace and the really big network Broadcast Centers, file-based workflow has greatly reduced the need to move baseband video between devices until the last step of integrating live productions or feeding transmitters. My friends at Utah Scientific tell me the average TV station router today is in the area of 128 inputs and outputs, mostly to support the control room used for news, live shots, network feeds and STL links. This is a far cry from the nearly 1,000 x 3,000 and larger routers in some of our biggest trucks.

Part of the debate is what the I/P signal actually is, compressed or uncompressed, and whose scheme or protocol is being used, with choices including SMPTE 2022, Sony's IP Live, TICO, ASPEN and others. You may have seen that NEP endorsed ASPEN. We did this not so much for ASPEN specifically, but to endorse the development of a standard for live multi-source production that is open and license-fee free. We will endorse other proposals as they become available and accessible for development.

In our live mastering environment, we are concerned at this point in time about protocols that compress the video (reduce the bandwidth by eliminating duplicate and repetitive data). All compression schemes have artifacts, and transcoding between schemes can cause even more visual impairment. Here is why we are concerned: in a router-centric facility design, where the router feeds the production switcher, all record devices, transmit paths and monitoring, a signal, be it baseband or I/P, can go through the router maybe a dozen times.

Here is a worst case for a typical flow to air for a production element, such as an open or highlights package, that is then played out through the switcher for transport to the Broadcast Center and ultimately the viewer:

• Camera CCU to Router
• Router to Production Switcher
• Production Switcher to Router
• Router to record device (EVS, other ingest server, optical disc recorder, VTR, etc.) for initial recording of composited elements
• Record device to Router
• Router to Production Switcher
• Production Switcher to Router
• Router to Record Device
• Record Device to Router
• Router to Production Switcher for final live switched feed to air
• Production Switcher to Router
• Router to Transmission Encoder
• Transmission Encoder output to Transport Carrier; satellite, land line, etc.
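To see how quickly the passes add up, here is a minimal sketch that walks the worst-case chain above and tallies router passes and compression generations. The hop names and the assumption that every recording and the final encode each cost one compression generation are illustrative only, not a description of any specific facility:

```python
# Sketch: tally signal passes in the worst-case flow to air described above.
# Hop names mirror the bullet list; the cost model is an assumption.

FLOW = [
    ("Camera CCU", "Router"),
    ("Router", "Production Switcher"),
    ("Production Switcher", "Router"),
    ("Router", "Record Device"),          # first recording of composited elements
    ("Record Device", "Router"),
    ("Router", "Production Switcher"),
    ("Production Switcher", "Router"),
    ("Router", "Record Device"),          # second recording, another codec pass
    ("Record Device", "Router"),
    ("Router", "Production Switcher"),    # final live switched feed to air
    ("Production Switcher", "Router"),
    ("Router", "Transmission Encoder"),
    ("Transmission Encoder", "Transport Carrier"),
]

# Every hop that touches the router is a pass that, with baseband-only
# production gear on an I/P router, implies a baseband -> I/P -> baseband trip.
router_passes = sum(1 for src, dst in FLOW if "Router" in (src, dst))

# Each arrival at a record device or the transmission encoder is (at least)
# one compression generation under this simplified model.
compression_generations = sum(
    1 for _, dst in FLOW if dst in ("Record Device", "Transmission Encoder")
)

print(router_passes)            # -> 12
print(compression_generations)  # -> 3
```

Even this simplified count gives a dozen router passes and three codec generations before the signal ever leaves the truck, which is the heart of the concern about compressed transport in the mastering chain.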

Today, with all the production-facing devices (switcher, cameras, servers, graphics) having only baseband inputs and outputs, each one of those passes requires conversion to I/P and back to baseband. Each record device deploys some form of compression that varies by the codec chosen. So the reason folks doing live production insist on uncompressed transport during the mastering process is simply that we don't know what all those passes will do to the picture if each involves compressing and decompressing the signal, often with further transcoding. Remember, after it leaves your studio or OB there is still the aggregator's facility to pass through, and the distributor's plant as well.

Lastly, there is a concern in today's world of virtual graphics, such as the yellow first-down line in American football, where graphic elements are tied to a specific moving point in the video and stay in place as the camera moves: do we run the risk of corrupting that ability when we have "squished" the image too many times?

For I/P to be ultimately widely successful in a live environment outside of a studio, there must be interoperability standards between equipment from different manufacturers. Also required will be easy selection of compressed and uncompressed signals, as well as transcoding between 720p, 1080i, 1080p and 4K. In the long term, a true self-discovering network architecture is very advantageous. Imagine how great it would be if we could bring two mobile units from different companies with different hardware to a venue, plug them together, and easily grant permission to share audio and video signals between the two OBs and the venue, regardless of the capture format. As an example: a 720p truck from GCV with a router from manufacturer A, connected to a 1080p truck from NEP with manufacturer B's router, and connected to the venue with a 4K scoreboard system. Plug them together, discover who's there, grant permission to share your sources on an individual basis with the other truck and/or the venue, and you're done!
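The discover-then-grant idea can be sketched as a simple data model. Everything below, the class name, the truck and source names, and the format-negotiation behavior, is a hypothetical illustration of the concept, not any proposed protocol or product:

```python
# Sketch of self-discovery with per-source, per-peer sharing permissions.
# All names and behaviors here are illustrative assumptions.

class MobileUnit:
    def __init__(self, name, fmt):
        self.name = name
        self.fmt = fmt            # native capture format, e.g. "1080p"
        self.sources = {}         # source name -> format
        self.grants = {}          # peer name -> set of shared source names

    def add_source(self, source, fmt=None):
        self.sources[source] = fmt or self.fmt

    def discover(self):
        """Advertise who we are and what we have (names only, no access yet)."""
        return {"unit": self.name, "format": self.fmt,
                "sources": sorted(self.sources)}

    def grant(self, peer_name, source):
        """Share one source with one peer, on an individual basis."""
        self.grants.setdefault(peer_name, set()).add(source)

    def fetch(self, peer, source):
        """Peer pulls a source; transcode when capture formats differ."""
        if source not in self.grants.get(peer.name, set()):
            raise PermissionError(f"{peer.name} has no grant for {source}")
        src_fmt = self.sources[source]
        if src_fmt != peer.fmt:
            return f"{source} ({src_fmt} -> transcoded to {peer.fmt})"
        return f"{source} ({src_fmt})"

# The example from the text: a 720p truck, a 1080p truck, a 4K venue system.
gcv = MobileUnit("GCV truck", "720p")
nep = MobileUnit("NEP truck", "1080p")
venue = MobileUnit("Venue scoreboard", "4K")

gcv.add_source("Camera 1")
gcv.grant("NEP truck", "Camera 1")          # shared with NEP, not the venue

print(gcv.discover())
print(gcv.fetch(nep, "Camera 1"))           # -> Camera 1 (720p -> transcoded to 1080p)
```

The point of the sketch is the access model: discovery advertises sources, but nothing flows until the owning unit grants each peer each source individually, and format conversion is handled at the handoff rather than forced on the whole plant.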

In a future post we will explore where manufacturers, as well as NEP Infostrada, stand in terms of achieving this goal.