“The great enemy of the truth is very often not the contrived lie but the persuasively persistent myth.” ~ John F. Kennedy
The above quote from President Kennedy endures today because the truth often requires significant change, and change can be difficult for many people.
This is particularly true within the IT field, where the rapid pace of change continues to challenge the status quo, making it tempting to hold fast to persistent myths for the sake of continuity. In the arena of cloud-based backup, this spirit of myth-making remains prevalent. It’s time to challenge five persistent myths about online backup:
Myth 1: There’s No Need for Virtual Backup
You may have heard this one: “Since we already have local, physical servers, we have no need for virtual backup at our company.”
Whether this myth is propagated by cost-conscious bean counters or late-adopting Luddites, on-site manual backup is usually sub-optimal. This is especially true if the original files and their corresponding backups are both at the same location. Even if external hard drive backups are stored in a safe deposit box, that doesn’t necessarily protect against natural disasters or fire damage. Additionally, manual backups are time-consuming and tough to automate, and they are only as reliable as the personnel tasked with the job.
Cloud-based backup can be fully automated at any frequency you deem fit. In the event of a natural disaster that prevents access to your organization’s offices, the applications and content can be accessed from anywhere, as long as there is access to the Internet. For mobile, small-business owners who run their businesses using T-Mobile 4G coverage, this is a helpful option. Off-site backup also allows for shortened recovery time objectives (RTOs), measured in minutes rather than days.
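To make the automation point concrete, here is a minimal Python sketch of a scheduled backup job: it packages a directory into a timestamped archive that a scheduler (cron, Task Scheduler, etc.) could run at whatever frequency you choose. The function name and the omitted upload step are illustrative assumptions, not any particular vendor’s API.

```python
import shutil
import tempfile
from datetime import datetime, timezone
from pathlib import Path

def create_backup_archive(source_dir: str, staging_dir: str) -> Path:
    """Package source_dir into a timestamped zip archive, ready for upload.

    The upload itself would go through whatever API or agent your cloud
    backup vendor provides; that step is intentionally omitted here.
    """
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    archive_base = Path(staging_dir) / f"backup-{stamp}"
    # shutil.make_archive returns the full path of the archive it created.
    archive_path = shutil.make_archive(str(archive_base), "zip", source_dir)
    return Path(archive_path)

if __name__ == "__main__":
    # Demo with throwaway directories so the sketch is self-contained.
    with tempfile.TemporaryDirectory() as src, tempfile.TemporaryDirectory() as stage:
        (Path(src) / "ledger.txt").write_text("important records")
        archive = create_backup_archive(src, stage)
        print(archive.name)
```

Pointing a scheduler at a script like this is what “fully automated at any frequency” means in practice: the human is removed from the loop entirely.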
Myth 2: The Cost of Remote Backup is Too High
While there are costs associated with off-site servers and online backup functionality, these are minimal, especially if you have several geographic locations within your network.
Consider a school district, which might have 40 different high-traffic servers at 40 different locations across a given county. Problems could arise at any moment with any individual site’s hard drive, CPU, router, power supply, cooling unit, surge protector, etc.—then multiply that by 40. The expense of replacement parts, labor, windshield time and lost productivity adds up quickly, and it clearly favors a remote solution.
In addition, having a smaller number of on-site servers helps reduce facility cooling costs and dedicated computing space. This results in overall energy efficiency, and in some instances, those energy savings can subsidize the cost of remote servers.
Myth 3: It’s Too Much Work to Manage a Virtual Backup Network
Using the school district example again, it’s much more challenging for an IT director to manage the travel and time demands of servicing 40 different server sites within a county than a single data center with online virtualization.
Many of the current cloud-based solutions provide users with a proprietary software interface that offers real-time updates of the client’s network. This enables easy capacity balancing, diagnostics and troubleshooting.
Virtual vendors also offer enhanced capabilities that an understaffed IT department might struggle to provide on its own, in areas such as network security, technical support, data restoration and large-scale storage.
Myth 4: Data in the Cloud is Unsafe and Unprotected
This might be the most important myth to address, given the increasing significance of online privacy rights, information safety and data security. It’s completely reasonable for any organization to want to keep its data on independent servers that don’t share computer power with other companies or organizations.
Fortunately, vendors offer a variety of services, including banks of dedicated servers with exclusive access and usage rights (often marketed as Virtual Private Clouds). As for whether the service provider can “see” your data, it’s important to find a solution that offers a minimum level of security, such as AES 128-bit encryption. This encryption needs to be in place both while data is in transit and once it is stored at the remote location.
It’s equally important to ensure that only the right people have the encryption keys, and there is a clear understanding of the provider’s physical and electronic security policies.
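The key-holding point can be illustrated with a toy sketch of client-side encryption: if data is encrypted before it leaves your network, the provider stores only ciphertext, and only holders of the key can read it. The XOR keystream below (built from HMAC-SHA256 in counter mode) is purely illustrative, not production cryptography; a real deployment should use AES (e.g. AES-128-GCM) from a vetted library, as the article’s AES-128 baseline suggests.

```python
import hmac
import hashlib

def keystream_encrypt(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher for illustration ONLY: XORs data against an
    HMAC-SHA256 keystream. Applying it twice with the same key decrypts.
    Real systems should use AES from a vetted crypto library instead."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        block = hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).digest()
        stream.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Symmetric encryption: the same key both encrypts and decrypts.
key = b"only-the-client-holds-this-key"
plaintext = b"quarterly financials"
ciphertext = keystream_encrypt(key, plaintext)   # what the provider stores
recovered = keystream_encrypt(key, ciphertext)   # only a key holder can do this
print(recovered == plaintext)  # → True
```

The design point is that the provider never receives the key, so “the right people have the encryption keys” reduces to controlling one secret inside your own organization.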
Further, there are legislative regulations at the state and federal level that mandate extended archival functionality for a variety of organizations including government, healthcare, pharma/biotech and banking, to name a few. If that type of government-mandated data is lost or stolen, a cloud service provider can lose its license or permit to operate within a given geography.
Cloud service providers also serve as a lower-cost emergency backup if local servers are compromised via a computer attack or natural disaster, as was the case with Hurricane Sandy. Virtualization helps ensure the safety and viability of government-mandated backups.
Myth 5: You Can’t Do Full Restoration With Cloud-Based Backup
While this may be the case with low-end remote data storage, the vast majority of business-class service providers recognize the need for disaster recovery/business continuity (DR/BC) testing.
Many cloud providers offer a management interface that enables testing of the complete gamut of recovery needs, ranging from isolated application objects all the way to a bare-metal restore. The key is to know what your current needs are and what they might be in the future, and to find the platform with the flexibility and scalability to meet those organizational requirements. Fortunately, there are great resources available to help every organization assess and plan for those requirements.