
Beyond the 3-2-1 Backup Strategy – Data Virtualization

Updated: Jan 18



Ensuring that your corporate data can always be restored after ransomware attacks, natural disasters, data corruption, employee sabotage, or other mishaps is one of your corporate IT department's most important jobs. In the past, I have worked for both personal and corporate tape backup companies, ranging from HP's Colorado Memory Systems personal PC tape backup systems to the room-size automated backup systems from StorageTek.


Over the last three decades, backup technology has transformed from floppy disk backups to individual tape drives to automated tape libraries, and now to mostly spinning-disk-based backups. I was at HP when the first IBM 8086-based PCs (with 5 MB hard disks) were introduced. These first PCs were unreliable, and many regularly suffered data or OS loss. In the mid-eighties, a backup application called FastBack Plus 1.0 was introduced. The original FastBack product was the industry's first DOS and Macintosh backup system. It was unique because it could simultaneously read from a computer's hard drive and write the backup to the floppy drive. Even with these relatively simple processes, restoring your backups was hit-or-miss.


However, a basic piece of the backup strategy has not changed over the decades – the need to keep multiple backup copies because of their tendency to become corrupt or unrestorable. Many have experienced the horror of discovering that the computer could no longer read the backups they had painstakingly created.


IT departments then changed their backup strategies to include more than one copy of each backup, increasing the chances of a full restore when needed.


In the years since, companies have developed customized backup strategies to fit their specific risks and needs. However, as new threats have emerged, such as new variations of ransomware and extortionware, organizations have had to adapt quickly to these threat vectors.

3-2-1 backups

Over the years, many organizations have settled on the 3-2-1 backup strategy, a data protection strategy that advises keeping three copies of your data on two different media types, with one copy offsite. This backup method helps protect corporate data from various threats, including accidental deletion, hardware failure, and natural disasters.

This data protection should include the following:

  • The primary copy: the original copy of record

  • The backup copy: a duplicate of the primary copy stored on a separate device – tape, disk, or cloud

  • An offsite copy: a copy of the backup stored in a different location from the primary and backup copies. Many years ago, this meant creating a separate tape backup that was sent to an offsite location such as an Iron Mountain warehouse

Requiring two different media formats helps protect your data from different failure types. For example, you could store one copy of your data on a hard drive and a second copy on tape or, now, in the cloud.


The offsite copy should be stored in a different location from the primary and backup copies – such as the cloud or a separate warehouse – to help protect your data from natural disasters and other site-wide events.
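
To make the rule concrete, here is a minimal Python sketch of how a planned set of copies could be checked against the 3-2-1 rule. The class and field names are illustrative only and are not tied to any particular backup product.

```python
# Minimal 3-2-1 rule check: at least 3 copies, on at least 2 media types,
# with at least 1 copy offsite. Names here are illustrative only.
from dataclasses import dataclass

@dataclass
class BackupCopy:
    name: str       # e.g. "primary", "nightly-tape", "cloud-replica"
    media: str      # e.g. "disk", "tape", "cloud"
    offsite: bool   # stored at a different site than the primary copy?

def satisfies_3_2_1(copies: list[BackupCopy]) -> bool:
    total = len(copies)                          # 3 copies, primary included
    media_types = {c.media for c in copies}      # 2 different media types
    offsite = any(c.offsite for c in copies)     # 1 copy offsite
    return total >= 3 and len(media_types) >= 2 and offsite

plan = [
    BackupCopy("primary", "disk", offsite=False),
    BackupCopy("backup", "tape", offsite=False),
    BackupCopy("offsite-replica", "cloud", offsite=True),
]
print("3-2-1 compliant:", satisfies_3_2_1(plan))    # True
```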


Figure 1: The 3-2-1 backup strategy is the one most organizations have relied on for many years.


The 3-2-1 backup method is a straightforward and effective way to protect your important data. By following this backup strategy, you can increase the chances that you will always have access to your important corporate data. However, with the growth of corporate information stores due to ongoing business needs and the growing number of company data storage silos, both in the cloud and on-prem, many organizations have found that they are, in reality, keeping up to 8 or 9 copies of data for data protection and disaster recovery coverage. In today's data-overloaded environment, knowing where all your data is stored and how often it is accessed is essential. Inactive data does not need to be backed up multiple times.


Figure 2: In the graphic above, regulatory compliance requirements mean the on-prem data set is backed up to both an on-prem backup repository and a remote backup site.


Additionally, a second backup is stored in the cloud. Many companies will also create and store additional copies of on-prem data in specific cloud locations for extra data protection – causing additional cloud repositories to be backed up as well.


Other backup strategies

There are several other backup strategies that could be a better fit for your organization's specific needs, especially for businesses with large amounts of data, geographic requirements, complex data protection requirements, or regulatory compliance needs. Some of the better-known strategies are:

  • The 4-3-2 backup strategy adds an additional backup copy and offsite copy to the 3-2-1 backup strategy. This provides an extra layer of protection and redundancy, which can be important for businesses with critical data.

  • The cloud-first backup strategy stores all backup copies in the cloud. This can be a good option for businesses that need to access their data from anywhere or that have limited on-premises storage resources.

  • The immutable (WORM) backup strategy uses immutable storage – either on-prem or in the cloud – that cannot be modified or deleted once it is written. This can help protect data from ransomware/extortionware attacks and other malicious activity.

The best backup strategy for you will depend on your specific needs and requirements. It is important to consult with a data protection expert to develop a backup plan for your organization.

However, one backup strategy that has not been discussed much is a hybrid of the strategies above. Before describing a specific hybrid backup strategy, it is worth discussing which types of data should be backed up and how many copies are needed.


Active versus inactive data

Most corporate data is considered inactive, meaning it has been over a year since anyone has accessed it. Most industry experts believe that, on average, 80% of a company’s data is inactive and should be archived to more cost-efficient storage. Also, inactive data does not need three or more backups. In fact, you should ask yourself: do I need three copies of the inactive data stored on two different media types, with one copy stored offsite?


The obvious answer is no… This is not to say that you shouldn’t have a backup (copy) of this data somewhere, but incurring the expense of making three, four, or more copies of this underused data is a waste of time and money.


With that in mind, the first step in updating your data protection strategy is understanding the value of the data you currently include in your data protection planning. In reality, only your company's currently active data – approximately 20% of your total data – is needed daily or weekly for the ongoing operation of the business and should be included in your standard 3-2-1 data protection process.


The other data type that must be protected is your compliance data, which you must capture, store, and manage due to government or association records retention requirements.


Based on the standard best practices for data protection, most companies are wasting storage backing up inactive data more than once.


A hybrid backup strategy

Earlier in this blog, I mentioned a hybrid data protection (backup) strategy. It combines elements of the backup strategies mentioned above: the 3-2-1 backup strategy, the cloud-first backup strategy, and the immutable backup strategy. Additionally, adding storage virtualization capability to your hybrid backup plan can dramatically reduce your overall backup size and save you money.


To get started, organizations must first map where all their data resides by creating (and updating) an enterprise data map. Most organizations already have one, but few take the time to update it regularly when new resources are added.


Second, you must conduct data discovery across your enterprise storage repositories to determine how much total data you have and how much of it is inactive, meaning it has not been accessed in the last 12 months. In most organizations, inactive data makes up approximately 80% of stored data. This inactive data becomes a candidate for special data protection handling, i.e., storage virtualization.
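
As a rough illustration, a discovery pass over a single file share might look like the Python sketch below. The share path and the 12-month cutoff are assumptions; a real discovery tool would scan many repositories and might rely on metadata other than access times, which some file systems do not keep up to date.

```python
# Minimal data-discovery sketch: walk a file share and estimate how much of
# the stored data is inactive (not accessed in the last 12 months).
import os
import time

CUTOFF_SECONDS = 365 * 24 * 3600   # roughly 12 months

def discover(root: str) -> tuple[int, int]:
    """Return (total_bytes, inactive_bytes) under the given root."""
    now = time.time()
    total = inactive = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue                      # skip unreadable files
            total += st.st_size
            if now - st.st_atime > CUTOFF_SECONDS:
                inactive += st.st_size
    return total, inactive

if __name__ == "__main__":
    total, inactive = discover("/mnt/fileshare")   # hypothetical share path
    if total:
        print(f"Inactive: {inactive / total:.0%} of {total / 1e12:.2f} TB")
```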


Once all inactive data has been identified, it can be virtualized and migrated into a trusted cloud system where two copies of every file are encrypted and stored on an immutable storage tier. The inactive files on the on-prem file system are replaced with virtual pointers (on your on-prem or cloud file servers) pointing to the inactive (immutable) data in the cloud. When a user clicks on an individual pointer in the file system, the file is retrieved from the cloud and presented to the user just as it would be if the file still resided on the on-prem file system. This capability is called storage virtualization or copy data virtualization. This technology removes approximately 80% of your storage resource needs and backup requirements.
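
Conceptually, the virtual pointer is just a small stub that records where the immutable cloud copy lives and is resolved on access. The sketch below is a simplified illustration of that idea, not restorVault's actual implementation; an in-memory dictionary stands in for the cloud object store.

```python
# Simplified stub-file illustration: a tiny JSON pointer replaces the migrated
# file on the file share; opening it resolves the pointer and returns the
# content as if the file were still local. The dict stands in for the cloud.
import json

CLOUD_STORE = {"vault/reports/2019-q4.pdf": b"...original file bytes..."}

def write_stub(stub_path: str, object_key: str) -> None:
    """Replace a migrated file with a small pointer (stub) file."""
    with open(stub_path, "w") as f:
        json.dump({"object_key": object_key}, f)

def open_virtual_file(stub_path: str) -> bytes:
    """Resolve a stub: fetch the cloud object it points to and return it."""
    with open(stub_path) as f:
        stub = json.load(f)
    return CLOUD_STORE[stub["object_key"]]   # a real system downloads here

write_stub("2019-q4.pdf", "vault/reports/2019-q4.pdf")
print(open_virtual_file("2019-q4.pdf"))
```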


Your complete data protection process should then target only the 20% of active data left on your on-prem file systems and primary cloud storage systems. The much larger inactive data set (80%) is migrated to a cloud repository where two encrypted, immutable copies are created. Because these copies are encrypted and immutable, further backups are unnecessary.
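
Back-of-the-envelope, the effect on your backup workload looks roughly like this (the 100 TB figure and the 80% inactive share are assumed for illustration):

```python
# Illustrative arithmetic only (figures assumed): how much data the regular
# backup process must protect before and after virtualizing inactive data.
total_tb = 100.0
inactive_share = 0.80                      # typical industry estimate cited above

active_tb = total_tb * (1 - inactive_share)    # still protected by 3-2-1
inactive_tb = total_tb * inactive_share        # virtualized instead

backup_source_before = total_tb                # everything went through backup jobs
backup_source_after = active_tb                # only active data still does

print(f"Backup source before virtualization: {backup_source_before:.0f} TB")
print(f"Backup source after virtualization:  {backup_source_after:.0f} TB")
print(f"Reduction in backup workload:        "
      f"{100 * (1 - backup_source_after / backup_source_before):.0f}%")   # 80%
print(f"Immutable cloud copies of inactive data: {inactive_tb * 2:.0f} TB")
```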


An additional best practice is to periodically check the virtualized files against known hash values to ensure the data has not been corrupted and remains a gold copy.
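
A periodic integrity check can be as simple as recomputing each virtualized file's hash and comparing it with the value recorded at migration time. The manifest format and the fetch callable below are assumptions for illustration.

```python
# Recompute SHA-256 hashes of virtualized files and flag any that no longer
# match the values recorded when they were migrated (the "gold copy" check).
import hashlib

def find_corrupted(manifest: dict[str, str], fetch) -> list[str]:
    """Return object keys whose current hash differs from the recorded hash."""
    corrupted = []
    for object_key, expected in manifest.items():
        data = fetch(object_key)                      # retrieve the cloud copy
        if hashlib.sha256(data).hexdigest() != expected:
            corrupted.append(object_key)
    return corrupted

# Tiny self-contained example with an in-memory store standing in for the cloud
store = {"vault/reports/2019-q4.pdf": b"...original file bytes..."}
manifest = {k: hashlib.sha256(v).hexdigest() for k, v in store.items()}
print("Corrupted objects:", find_corrupted(manifest, store.__getitem__))   # []
```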

By adopting the hybrid backup strategy described above, companies would free up storage resources and servers, reduce overall costs, and add additional data security capabilities against ransomware/extortionware attacks.


restorVault storage virtualization

restorVault's storage virtualization solution replaces files (based on policies) in an on-prem active repository, such as a file share, with a pointer or virtual data file that points to the original file in the restorVault cloud. Whenever a user clicks on a virtual data file in their file explorer, the actual file is instantly retrieved from the restorVault cloud platform (see below) for viewing and continued work. This storage virtualization into the immutable restorVault trusted cloud repository also eliminates the wasteful need for numerous backups of inactive data.

It also frees up large amounts of costly enterprise storage for priority use by active data. With your inactive data stored and managed in a trusted and inexpensive cloud repository, your enterprise backups will be approximately 20% of their current size. This will enable you to restore data faster and free up costly enterprise storage. For every TB of restorVault virtual cloud storage, you should recoup 3 TB from primary, backup, and other cloud platforms - a 300% increase in usable storage capacity.



Figure 3: restorVault storage virtualization allows organizations to move inactive data to a less expensive cloud repository, freeing up to 80% of storage space on your cloud and on-prem servers. This also reduces overall backup and disaster recovery requirements by as much as 80%.


The restorVault patented cloud solutions provide two ways to store your inactive unstructured data as well as other high-value unstructured data safely and inexpensively in a trusted cloud vault:

The Compliant Cloud Archive (CCA) provides long-term information management and on-demand access to virtualized unstructured data, with an option to store your data in an immutable cloud storage tier for ransomware/extortionware protection.


The Tamperproof Cloud Storage solution (TCS) provides a hot standby-like protected storage repository that allows for complete disaster or ransomware recovery in minutes, not days.

In addition, once your inactive files are immutably stored in the cloud-based CCA or TCS data vaults, virtualization gives employees seamless access to that data while lowering your on-prem, backup, and cloud capacity requirements by as much as 80%.


restorVault storage virtualization is provided by two technologies called Offload Data Virtualization (ODV) and Copy Data Virtualization (CDV).


ODV frees up valuable capacity on primary servers by offloading inactive data to a protected CCA or TCS vault based on policies.


CDV allows fast data replication in the cloud through virtualized access to protected CCA or TCS data vaults for functions such as Dev, Training, DR, etc.


Contact us today to learn how restorVault can help your company save money by virtualizing and managing inactive data while increasing data security and storage capacity!


