Content at the Speed of Your Imagination

 
Adapters, Big Data, Ethernet, High Performance Networks, Interconnect, Switches

In the past, one 10GbE port was enough to support the bandwidth needs of 4K DPX, three ports could drive 8K formats, and four ports could drive 4K Full EXR. However, the recent evolution in the media and entertainment industry showcased this week at the NAB Show demonstrates the need for ever-higher resolution, a trend that continues to drive demand for networking technologies that can stream more bits per second in real time. Moreover, these port counts can drive only a single stream of data, while new film and video productions include special effects that require multiple simultaneous real-time streams. This creates a major “data size” challenge for studios and post-production shops, as 10GbE interconnects have been maxed out and can no longer provide an efficient solution for the ever-growing workload demands.

This is why IT managers should consider the new emerging Ethernet speeds of 25, 50, and 100GbE. These speeds have been established as the new industry standard, driven by a consortium of companies that includes Google, Microsoft, Mellanox, Arista, and Broadcom, and recently adopted by the IEEE as well. A good example of the efficiency that higher speed enables is the Mellanox ConnectX-4 100GbE NIC that has been deployed in Netflix’s new data center, where a single server now delivers the highest-quality viewing experience to as many as 100K concurrent streams. (Mellanox has also published a CDN reference architecture based on our end-to-end 25/50/100GbE solutions, including the Mellanox Spectrum™ switch, the ConnectX®-4 and ConnectX-4 Lx NICs, and LinkX™ copper and optical cables.)

 

 

[table-4k: Bandwidth required for uncompressed 4K/8K video streams]
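The bandwidth behind these port counts is easy to estimate: an uncompressed stream needs width × height × bits-per-pixel × frame-rate bits per second. The sketch below is a back-of-envelope check under assumed parameters (10-bit RGB at 24 fps); the exact formats and frame rates in the table may differ:

```python
import math

def stream_gbps(width, height, bits_per_pixel, fps):
    """Raw bandwidth of one uncompressed video stream, in Gb/s."""
    return width * height * bits_per_pixel * fps / 1e9

def ports_10gbe(gbps):
    """10GbE ports needed for one stream (ignoring protocol overhead)."""
    return math.ceil(gbps / 10)

# Assumed formats: 10-bit RGB (30 bits/pixel) at 24 fps.
dpx_4k = stream_gbps(4096, 2160, 30, 24)   # ~6.4 Gb/s  -> 1 x 10GbE port
dpx_8k = stream_gbps(8192, 4320, 30, 24)   # ~25.5 Gb/s -> 3 x 10GbE ports
```

Higher bit depths (e.g. full-float EXR) or higher frame rates scale these numbers up proportionally, which is how a single 4K stream can outgrow several 10GbE ports.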

Another important parameter that IT managers must take into account when building media and entertainment data centers is the latency of streaming the data. Running multiple streams over the heavy, CPU-hungry TCP/IP protocol consumes a significant share of CPU cycles for protocol processing rather than for the workload itself, which reduces the effective bandwidth that the real workload can use.
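To get a rough sense of that cost, a long-standing rule of thumb says TCP/IP processing burns about 1 CPU-Hz per bit/s moved. The numbers below are a sketch under that assumption, not a measurement:

```python
def cores_busy_with_tcp(link_gbps, core_ghz=2.5, hz_per_bit=1.0):
    """Cores fully occupied by TCP/IP protocol processing, using the
    classic '1 Hz of CPU per bit/s' rule of thumb (an assumption)."""
    return link_gbps * hz_per_bit / core_ghz

# Driving 40 Gb/s of video streams over TCP on 2.5 GHz cores:
cores_busy_with_tcp(40)   # -> 16.0 cores doing nothing but moving data
```

Those are cores that cannot be spent on editing, compositing, or rendering, which is exactly the overhead that offloading the transport to the NIC avoids.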

This is why IT managers should consider deploying RoCE (RDMA over Converged Ethernet). Remote Direct Memory Access (RDMA) makes data transfers more efficient and enables fast data movement between servers and storage without involving the server’s CPU. Throughput is increased, latency is reduced, and CPU power is freed up for video editing, compositing, and rendering work. RDMA technology is already widely used for efficient data transfer in render farms and in large cloud deployments such as Microsoft Azure, and it can accelerate video editing, encoding/transcoding, and playback.

 


RoCE utilizes advances in Ethernet to enable more efficient implementations of RDMA over Ethernet, making widespread deployment of RDMA practical in mainstream data center applications. RoCE-based networks are managed just like any other Ethernet network, eliminating the need for IT managers to learn new technologies. Using RoCE can result in 2X higher efficiency, doubling the number of streams compared to running over standard TCP/IP over Ethernet (source: ATTO Technology).

 

[table-4k2: The impact of RoCE vs. TCP at 40Gb/s on the number of supported video streams]
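The doubling effect can be modeled simply: the number of concurrent streams a link supports is the transport’s effective throughput divided by the per-stream bandwidth. The efficiency figures below are illustrative assumptions chosen to show the shape of the effect, not ATTO’s measured data:

```python
import math

def streams_supported(link_gbps, stream_gbps, efficiency):
    """Concurrent uncompressed streams a link can carry, given the
    fraction of line rate the transport actually delivers."""
    return math.floor(link_gbps * efficiency / stream_gbps)

# Assumptions: ~6 Gb/s per 4K stream; TCP delivers ~45% of a 40GbE
# link under load, RoCE ~90% by bypassing the kernel network stack.
tcp_streams  = streams_supported(40, 6.0, 0.45)   # -> 3
roce_streams = streams_supported(40, 6.0, 0.90)   # -> 6 (2X)
```

The same arithmetic explains why the gap widens at 50GbE and 100GbE: the per-stream cost is fixed, so every point of transport efficiency translates directly into more streams per server.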

 

Designing data centers that can serve the needs of the media and entertainment industry has traditionally been a complicated task, one that has often led to slow streams and bottlenecks in raw storage performance, and in many cases has required very expensive systems that delivered lower-than-expected efficiency gains. Using high performance networking that supports higher bandwidth and low latency guarantees hassle-free operation and enables extreme scalability and higher ROI for any industry-standard resolution and any content imaginable.

About Motti Beck

Motti Beck is Sr. Director of Enterprise Market Development at Mellanox Technologies, Inc. Before joining Mellanox, Motti was a founder of BindKey Technologies, an EDA startup that provided deep-submicron semiconductor verification solutions and was acquired by DuPont Photomasks, and of Butterfly Communications, a pioneering startup provider of Bluetooth solutions that was acquired by Texas Instruments. Prior to that, he was a Business Unit Director at National Semiconductor. Motti holds a B.Sc. in computer engineering from the Technion – Israel Institute of Technology. Follow Motti on Twitter: @MottiBeck
