FTP used to be the main method for transferring media files, but in the modern age it can no longer be counted on for speed and reliability.
The flow of data across the Internet is steadily and constantly increasing, with annual global IP traffic predicted to reach 3.3 ZB by 2021. Coined by Cisco in a recent report on IP network trends, “The Zettabyte Era” will be defined by a continued surge of traffic over the next few years. “Overall,” says the report, “IP traffic will grow at a Compound Annual Growth Rate (CAGR) of 24 percent from 2016 to 2021,” with IP video traffic making up 82 percent.
“It would take more than 5 million years to watch the amount of video that will cross global IP networks each month in 2021.”
That’s a lot of video. Granted, much of it will likely still be pet bloopers and teenage antics, but professional content created and distributed by the media & entertainment industry will take up its share as well.
At the same time that traffic is increasing exponentially, so are the sizes of the media files we’re attempting to move over IP networks. 4K files are already huge, and the push toward higher resolutions keeps raising that mark. How much bigger can they get? With each jump from 2K to 4K to 8K, the pixel count (and with it the file size) roughly quadruples: where 4K is about 8 megapixels, 8K is about 33 megapixels.
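The quadrupling is easy to see with back-of-the-envelope arithmetic. The sketch below uses common resolution figures and assumes uncompressed 8-bit RGB frames purely for illustration (real delivery formats compress heavily):

```python
# Pixel counts and uncompressed frame sizes for common resolutions.
# Assumes 8-bit RGB (3 bytes per pixel) for simplicity; actual codecs
# and bit depths vary widely.
RESOLUTIONS = {
    "2K": (2048, 1080),
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}

BYTES_PER_PIXEL = 3  # illustrative assumption: 8-bit RGB

for name, (w, h) in RESOLUTIONS.items():
    megapixels = w * h / 1e6
    frame_mb = w * h * BYTES_PER_PIXEL / 1e6
    print(f"{name}: {megapixels:.1f} MP, ~{frame_mb:.0f} MB per uncompressed frame")
```

Each doubling of horizontal and vertical resolution multiplies the pixel count, and the raw data, by four.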
The problem is that, compared to general traffic, large files are much more taxing for TCP, the transport protocol that moves most data across the Internet and the foundation of FTP (File Transfer Protocol). A technology more than 40 years old, FTP was once the easiest and most cost-effective way to transfer large files, but it faces growing challenges in today’s media landscape.
Any file between 500MB and 1GB approaches a threshold where FTP starts to break down in speed and reliability. Why does FTP struggle with large files? TCP uses a relatively unsophisticated mechanism to move data: it sends only a window of the file’s bytes, waits for acknowledgement that they have been received on the other end, then sends a bit more, and so on. At 1GB and beyond, FTP requires ever more of this back and forth to deliver the entire file, slowing the transfer and raising the risk of data loss and outright failure.
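That acknowledgement loop has a simple mathematical consequence: TCP can deliver at most one window of data per round trip, so throughput is capped at the window size divided by the round-trip time, no matter how fast the underlying link is. The figures below are illustrative:

```python
# TCP throughput is bounded by window_size / round_trip_time,
# regardless of link bandwidth (the bandwidth-delay product limit).
def max_tcp_throughput_mbps(window_bytes: int, rtt_ms: float) -> float:
    """Upper bound on TCP throughput in megabits per second."""
    return (window_bytes * 8) / (rtt_ms / 1000) / 1e6

# An unscaled 64 KB window over a 100 ms transcontinental path:
print(max_tcp_throughput_mbps(64 * 1024, 100))  # ~5.2 Mbps, even on a 10 Gbps link
```

Larger windows and window scaling raise the ceiling, but the longer the round trip, the harder it is to keep the pipe full.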
Moving files in the multigig and above range is a familiar need for many broadcasters and feature film producers, but even small media companies operating further down the supply chain are regularly working above the 1GB threshold.
Distance only compounds the speed and reliability problems with FTP. Today’s broadcasters work with globally distributed teams and freelancers, and often have post-production partners around the world.
Data can’t travel faster than the speed of light. Even in a theoretical perfectly unobstructed environment, it would take more time to send a few bytes halfway around the world than next door, but the time lag with such a small amount of data is negligible. However, compound the distance with FTP’s back and forth process of moving large files, and the delay adds up quickly.
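A rough calculation makes the point. Assuming light travels about two-thirds of its vacuum speed in optical fiber, and ignoring routing and queuing delays entirely:

```python
# Theoretical floor on round-trip time imposed by physics alone.
SPEED_OF_LIGHT_KM_S = 299_792  # in vacuum
FIBER_FACTOR = 1.5             # assumption: light is ~1.5x slower in fiber

def min_rtt_ms(distance_km: float) -> float:
    """Minimum round-trip time over fiber, ignoring all network overhead."""
    one_way_s = distance_km * FIBER_FACTOR / SPEED_OF_LIGHT_KM_S
    return 2 * one_way_s * 1000

print(f"{min_rtt_ms(20_000):.0f} ms")  # halfway around the world: roughly 200 ms
```

A fifth of a second per round trip is negligible for a single exchange, but multiplied across the thousands of round trips needed to acknowledge a multi-gigabyte file, it dominates the transfer time.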
FTP is often thought of as a complete file transfer solution. In reality, FTP is merely the protocol (a set of rules for communication between computers) around which pieced-together solutions are built. On its own, FTP lacks many basic business capabilities, such as security, notifications, checkpoint restart, storage allocation and management dashboards, so companies fill the gaps with ad hoc software written by developers or purchased add-ons. The result is usually a clumsy solution that is difficult to scale and update, and beyond the help of even the best user experience (UX) designer. And the poor UX affects more than just the people doing file transfers: end users, media operations teams and IT all feel it.
Depending on the setup, FTP users and managers run into all types of frustrations. Those most familiar with using FTP point to the “babysitting” (or “nannying” if you’re in the U.K.) needed to ensure files are delivered and received. FTP stalls and fails without notifying you, it will not automatically restart where it left off if interrupted, and it doesn’t notify recipients when new files are ready to download.
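To be fair, the FTP protocol does define a REST command for resuming a transfer from a byte offset, but the bookkeeping is entirely up to the client; nothing restarts automatically. A minimal sketch of client-side resumption using Python's standard ftplib (the host, credentials and paths here are hypothetical):

```python
import os
from ftplib import FTP

def resume_offset(local_path: str) -> int:
    """Byte offset to resume from: the size of any partial local file."""
    return os.path.getsize(local_path) if os.path.exists(local_path) else 0

def resume_download(host: str, remote_path: str, local_path: str,
                    user: str = "anonymous", password: str = "") -> None:
    """Resume a partial FTP download (host and paths are placeholders)."""
    offset = resume_offset(local_path)
    with FTP(host) as ftp, open(local_path, "ab") as f:
        ftp.login(user, password)
        # rest=offset asks the server to start sending from byte `offset`
        ftp.retrbinary(f"RETR {remote_path}", f.write, rest=offset)
```

Even this small sketch shows why "babysitting" is the norm: detecting the stall, checking the partial file and reissuing the request are all left to the user or to custom tooling.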
Media ops and IT managers, on the other hand, know the tedium of managing storage and bandwidth allocation, keeping an eye on security, onboarding new users and tracking all the unsanctioned (and insecure) “free” file transfer alternatives that employees turn to out of frustration with FTP. All of this adds up to stalled and stuck users, managers and businesses.
From big broadcast companies to small post houses, the need to quickly and easily send large files is only getting greater. Companies currently using FTP have two options — replace existing FTP systems with a modern file transfer solution or augment their FTP with a single solution that provides increased speed, reliability, ease of use and security. Signiant’s Media Shuttle can do both.
With over 200,000 users in over 25,000 businesses moving petabytes of data every month, Signiant’s Media Shuttle has become the trusted solution across the global Media & Entertainment industry. A true cloud-native SaaS solution with the scalability and rapid innovation that is making SaaS a foundation for modern businesses, Media Shuttle fits the needs of companies large and small. Learn more about Media Shuttle and Signiant’s Emmy award-winning technology here.
About Signiant
Signiant’s intelligent file movement software helps the world’s top content creators and distributors ensure fast, secure delivery of large files over public and private networks. Built on patented acceleration technology, the company’s on-premises software and cloud-native SaaS solutions allow businesses of every size to optimize mission-critical file transfers between users, data centers, and the cloud. For more information, please visit http://www.signiant.com/.