The vagaries of FTP: What to look for in a secure large file transfer alternative

01.04.2016
This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.

FTP turns 45 this year. And, while this original protocol for transferring files over the Internet is still widely used, many companies are looking for a more modern alternative. Initially, concerns about FTP centered on security. But, as IP technology became ubiquitous for global data exchange, FTP’s more fundamental performance limitations also became apparent.

Because FTP was originally designed without security features like data integrity and confidentiality, the first security concerns arose around the privacy of control channel data such as user IDs and passwords, and then spread to the actual data being transferred. “Secure” FTP (FTPS) was developed in response. FTPS is FTP with Transport Layer Security (TLS), which protects file content, user names and passwords from eavesdropping and modification while in transit over the Internet.
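As a concrete illustration (a minimal sketch, not a production setup), the snippet below uses Python's standard-library ftplib to connect over FTPS; the host name, credentials and file name are placeholder assumptions.

```python
# Minimal FTPS sketch using Python's standard-library ftplib.
# Host, credentials and file name are placeholders.
from ftplib import FTP_TLS

ftps = FTP_TLS("ftp.example.com")    # open the control connection
ftps.login("user", "password")       # login() negotiates TLS on the control channel first
ftps.prot_p()                        # switch the data channel to TLS as well
with open("report.csv", "wb") as f:
    ftps.retrbinary("RETR report.csv", f.write)  # download over the encrypted data channel
ftps.quit()
```

Note that TLS here protects only the connection itself; once the file lands on the server it sits unencrypted unless additional controls are added, which is the gap described next.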

However, FTPS doesn’t protect data at rest on servers, which are, by necessity, accessible from the Internet to allow FTP access for off-site business partners. To solve these issues, companies often built external security controls around their FTP infrastructure.

In addition to the need for security enhancements, FTP lacks many basic features and functions organizations need to operate efficiently, from email notifications and file transfer tracking to automation and user and storage management. All of these require either writing scripts around FTP servers and clients or purchasing additional software.
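To make that gap concrete, the hypothetical sketch below shows the kind of glue code teams end up writing around plain FTP just to get a completion notification; the server, credentials, mail host and addresses are all placeholder assumptions, and real scripts also have to handle retries, partial transfers and logging.

```python
# Hypothetical example of the scripting plain FTP pushes onto IT teams:
# upload a file, then send a completion email, because FTP itself provides
# neither tracking nor notifications. All hosts and addresses are placeholders.
import smtplib
from email.message import EmailMessage
from ftplib import FTP

def upload_and_notify(local_path: str, remote_name: str) -> None:
    # Push the file to the FTP server.
    with FTP("ftp.example.com") as ftp:
        ftp.login("user", "password")
        with open(local_path, "rb") as f:
            ftp.storbinary(f"STOR {remote_name}", f)

    # Hand-rolled "notification" feature.
    msg = EmailMessage()
    msg["Subject"] = f"FTP upload complete: {remote_name}"
    msg["From"] = "ops@example.com"
    msg["To"] = "partner@example.com"
    msg.set_content(f"{local_path} was uploaded as {remote_name}.")
    with smtplib.SMTP("mail.example.com") as smtp:
        smtp.send_message(msg)

upload_and_notify("report.csv", "report.csv")
```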

For most companies, the result is a system that requires heavy IT involvement and is very difficult to scale, update and manage. Basic tasks like adding users or partners to support new business initiatives require IT involvement, and those new users often struggle with archaic FTP interfaces.

The biggest problem for FTP users, however, is painfully slow transfers and a tendency to fail when sending large files over Wide Area Networks (WANs). And this is not something that can be solved by scripts or changes to the protocol.

FTP almost always runs on top of the Transmission Control Protocol (TCP), which is also the underlying protocol for the Hypertext Transfer Protocol (HTTP). While TCP was originally built to ensure accuracy and reliability, performance on today's high-bandwidth, long-distance networks wasn't given much consideration.

To ensure reliability and prevent congestion, TCP uses an acknowledgment mechanism: it sends only a window's worth of data and then waits for acknowledgment from the other end before sending more. All of this handshaking is affected by latency, which delays both the delivery of data and its acknowledgment, and the longer the distance, the higher the latency. In addition, if the connection fails because of a network disruption, FTP typically has to retransfer the entire file from the beginning.

It might seem that a higher-bandwidth connection would improve throughput, but it typically doesn't once even relatively short metropolitan-area distances are involved. In a high-bandwidth, high-latency environment, FTP can use only a fraction of the available bandwidth because of its underlying reliance on TCP, so adding bandwidth alone will not fix its slow performance.
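A back-of-the-envelope sketch makes this visible (assuming the classic 64 KB TCP window with no window scaling and ignoring packet loss): the acknowledgment round trip caps single-connection throughput at window size divided by round-trip time, independent of how fast the link is.

```python
# Back-of-the-envelope TCP throughput ceiling: window size / round-trip time.
# Assumes a 64 KB window with no window scaling and ignores packet loss.

WINDOW_BYTES = 64 * 1024  # classic 64 KB TCP receive window

def max_throughput_mbps(rtt_ms: float) -> float:
    """Upper bound on single-connection throughput, regardless of link bandwidth."""
    rtt_s = rtt_ms / 1000.0
    return (WINDOW_BYTES * 8) / rtt_s / 1_000_000  # bits per second -> Mbps

for rtt in (1, 10, 50, 100):  # roughly LAN, metro, cross-country, intercontinental
    print(f"RTT {rtt:>3} ms -> at most {max_throughput_mbps(rtt):6.1f} Mbps")
```

On a 1 Gbps link with a 50 ms round trip, for example, a 64 KB window caps a single connection at roughly 10 Mbps, about 1 percent of the line rate, which is why adding bandwidth alone does not make FTP faster.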

Fortunately, technology optimized for large file transfers over long distance IP networks was developed many years ago. It was first adopted at scale by the Media and Entertainment industry as it transitioned from tape to file-based workflows. Large media enterprises like Disney, Discovery and the BBC pioneered the use of this technology within their on-premises infrastructures to move huge media files.

However, it wasn’t until four years ago that this advanced acceleration technology was incorporated into cloud-native software as a service (SaaS) large file transfer solutions, leading to increased adoption in other industries.

Today, there are plenty of cloud-based file sharing solutions to choose from. However, when it comes to moving big files quickly, there are only a few options. Especially if organizations are involved in video production or other big data projects, it’s important to know what to look for.  

—    Speed and efficient bandwidth utilization: Modern SaaS accelerated file transfer solutions are up to 200 times faster than FTP, and the advantage grows with higher bandwidths and longer distances. In addition to fully utilizing available bandwidth, they should also provide controls for managing bandwidth and impose no file size limits.

—    Enterprise-grade security: It is essential that the solution is built using secure design principles, including implementing multiple layers of protection. Confidentiality and integrity of data should be considered when data is in transit and at rest. Be sure to verify vendors’ security practices, which should also include independent third-party evaluation.

—    Scalability and elasticity: It’s important that a data transfer solution be able to automatically scale along with business growth, that it elastically responds to the ups and downs of required throughput, and that it includes flexible billing based on actual usage and benefit (unlike a fixed licensing model). This will be true for any cloud-native solution.

—    No software to manage: With cloud-native SaaS, you do not have to worry about managing software updates or cloud servers. Make sure the vendor does not practice “cloud washing” or your business could end up with additional software maintenance and infrastructure costs.

—    Usability and global access: Ease of use is essential for today's end users. Look for a solution with a simple web interface that can ideally be branded to give users additional confidence. It should be easy to onboard users and require little to no user training. (Keep in mind that, without high usability, employees are more likely to turn to online file sharing solutions not sanctioned by IT.) Additionally, from an operations or IT perspective, the backend of the solution should also be easy to use and provide features that match the needs of each team, including authorization and tracking.

—    Storage independence: Many companies are utilizing or considering cloud object storage as a main storage solution or as an adjunct or backup to their on-premises storage servers. Look for a solution that lets you control the storage behind your file transfers and choose between on-premises and cloud storage. Ideally, it should also support multiple cloud storage providers such as Amazon S3, Microsoft Azure and Google Cloud, and allow you to move between them without a switching cost.

These are some of the main features to look for in a modern file transfer solution. If you are considering moving away from FTP, be sure to get to know all the options available including accelerated file transfer and cloud-native SaaS solutions.

Signiant’s intelligent file movement software helps the world’s top data-intensive businesses ensure fast, secure delivery of large files over public and private IP networks. Built on Signiant’s patented technology, the company’s on-premises software and SaaS solutions move petabytes of high-value data every day between users, applications and systems with proven ease.  www.signiant.com

 

(www.networkworld.com)

By Ian Hamilton, CTO, Signiant
