Applications that run in the cloud come with a degree of protection, but only so much. For full protection of the data those apps generate, you need cloud-to-cloud backup. Spending on public cloud services will reach US$236bn by 2020, according to Forrester. It’s a trend driven by increasing numbers of applications being delivered from the cloud.
Cloud computing is sometimes so easy that users and IT teams assume it “just works”, and they are happy to leave data protection and backup to the provider. So why are we seeing the emergence of cloud-to-cloud backup?
Data protection risks in the cloud
Moving applications, workloads or IT infrastructure to the cloud poses risks. It means handing control of a large slice of an organization’s data storage and protection to a third party. A cloud service will – or should – have multiple data centers and multiple redundant data stores to ensure business continuity and the ability to recover data. It should also provide enterprise-grade security for data.
But there have been outages among cloud services. These are relatively rare, but CIOs who fail to consider how their cloud data is backed up put their organizations at risk. For businesses that use the cloud, the question is not whether cloud services will fail, but how the business will cope when they do. Although cloud services may offer a high degree of resilience, this will not be enough for all organizations’ backup needs.
Cloud versus on-premise service levels
Cloud providers do what they can to keep services running, but CIOs should check the details of service level agreements. Public cloud services are unlikely to guarantee availability or recovery times, offering only a “best endeavors” commitment.
When it comes to the data itself, businesses are even more at risk. Software-as-a-service (SaaS) suppliers typically take responsibility for infrastructure availability, but data loss is the sole responsibility of the client. This could leave customers with a complex, costly and time-consuming data recovery exercise after an outage.
Nor will a cloud service provider take responsibility for accidental data deletion. Human error – from accidentally overwriting one field in a customer record to the wiping of an entire dataset – is the customer’s problem. Cloud services may also delete data for any user whose subscription ends. Microsoft, for example, wipes all data for users 30 days after their subscription stops. Unless a business has a robust plan to capture users’ files when they leave an organization, vital data can be lost for good.
Cloud backup options
In small-scale scenarios, users can copy files from, for example, Office 365 and G Suite to a local volume or, if security rules permit, an external drive. But this is a manual process that is hard to make reliable and will struggle to scale beyond a handful of users or to larger files and applications. Enterprises using infrastructure-as-a-service (IaaS) or SaaS applications can instead use application programming interfaces (APIs) or third-party software to back up to local servers, network-attached storage (NAS) equipment or their own data center.
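As a rough illustration of the API-driven route, a sync tool such as rclone (one of several third-party options) can be scheduled to pull SaaS data down to local or NAS storage. The remote name, paths and schedule below are assumptions for the sketch, not a recommended configuration:

```shell
# Sketch only: assumes an rclone remote named "office365" has already been
# set up with "rclone config", holding credentials for the SaaS tenant,
# and that the NAS is mounted at /mnt/nas.

# Mirror the tenant's cloud drive to NAS storage. Files deleted or
# overwritten in the source are kept in a dated archive directory
# rather than lost, giving a basic point-in-time safety net.
rclone sync office365: /mnt/nas/o365-backup \
    --backup-dir /mnt/nas/o365-archive/$(date +%Y-%m-%d) \
    --log-file /var/log/o365-backup.log

# A cron entry (hypothetical path) could run the job nightly at 02:00:
# 0 2 * * * /usr/local/bin/o365-backup.sh
```

Even automated like this, the target is still on-premise storage, which is exactly the limitation the next section describes.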
But backing up cloud services to local storage is a step backward. Instead of taking advantage of the cloud, it forces companies to retain on-site infrastructure, increases costs and limits flexibility. Enterprises that back up software-as-a-service applications will have the reassurance of holding copies of their data, but they will not be able to replicate or run the SaaS environment in-house. This limits the usefulness of local backups. At best, a business will face a lengthy recovery or migration to a new platform.
Backing up cloud services to the cloud should be a better option. Currently, just one in 10 businesses back up their data to an IaaS supplier, according to Gartner. But the firm expects this to double by 2020, as businesses realize the importance of backups and more suppliers offer cloud-specific services.