In an effort to help enterprises move data from their on-premises systems to the cloud, Google Cloud has announced Transfer Service.
This new managed service is designed to handle large-scale transfers, on the order of billions of files and petabytes of data, with minimal setup.
Google has launched similar services in the past, including Transfer Appliance, which lets companies ship data to its data centers via FedEx, and the BigQuery Data Transfer Service, which automates data transfers from SaaS applications.
Google Cloud’s new Transfer Service handles the heavy lifting, and it can even validate the integrity of an organization’s data as it moves to the cloud. The agent also recovers from failures automatically and uses as much available bandwidth as possible to reduce transfer times.
Transfer Service for on-premises data
To get started with Google Cloud’s Transfer Service, you install an agent on your on-premises servers and select the directories you want to copy; the service takes care of the rest. You can also use the Google Cloud console to monitor and manage your transfer jobs.
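That workflow can be sketched with the current `gcloud transfer` commands (which may differ from the beta-era tooling described here; the agent pool, source path, bucket, and job names below are hypothetical placeholders):

```shell
# Install and start a transfer agent on the on-premises server.
# Requires Docker; "my-agent-pool" is a hypothetical agent pool name.
gcloud transfer agents install --pool=my-agent-pool

# Create a job that copies a local directory tree to a Cloud Storage
# bucket. "/data/exports" and "gs://my-archive-bucket" are placeholders.
gcloud transfer jobs create posix:///data/exports gs://my-archive-bucket \
  --source-agent-pool=my-agent-pool

# Follow the job's progress from the terminal; the same information is
# also available in the Google Cloud console.
gcloud transfer jobs monitor JOB_NAME
```

These commands require an authenticated `gcloud` session and an existing Cloud Storage bucket, so they are a sketch of the flow rather than a copy-paste recipe.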
While archiving and disaster recovery are obvious use cases for this new service, Google is also targeting businesses that want to move their workloads and their attached data to the cloud.
In the blog post announcing the new service, Scott Sinclair, senior analyst at ESG, explained why adopting Transfer Service is much easier than developing a custom solution:
“I see enterprises default to making their own custom solutions, which is a slippery slope as they can’t anticipate the costs and long-term resourcing. With Transfer Service for on-premises data (beta), enterprises can optimize for TCO and reduce the friction that often comes with data transfers. This solution is a great fit for enterprises moving data for business-critical use cases like archive and disaster recovery, lift and shift, and analytics and machine learning.”