
The Pros and Cons of All Backup Storage Targets Explained

Written by Hornetsecurity / 27.11.2023

The days of tape-only solutions have come to an end. Other media have caught up to it in cost, capacity, convenience, and reliability. You now have a variety of storage options. Backup applications that can only operate with tape have little value in modern business continuity plans.

Unless you buy everything from a vendor or service provider that designs your solution, make certain to match your software with your hardware.

Use the software’s trial installation or carefully read through the manufacturer’s documentation to determine which media types it works with and how it uses them. Backup software targets the following media types:

  • Magnetic tape;
  • Optical disc;
  • Direct-attached hard drives and mass media devices;
  • Media-agnostic network targets;
  • Cloud storage accounts.

Magnetic Tape in Backup Solutions

IT departments have relied on tape for backup since the dawn of the concept of backup. This technology has matured well and mostly kept up with the pace of innovation in IT systems. However, the physical characteristics of magnetic tape place a severe speed limit on backup and restore operations.

Pros of magnetic tape

  • Most backup software can target it;
  • Tape media has a relatively low cost per gigabyte;
  • Reliable for long-term storage;
  • Lightweight media, easy to transport offsite;
  • Readily reusable.

Cons of magnetic tape

  • Extremely slow;
  • Tape drives have a relatively high cost;
  • Media susceptible to magnetic fields, heat, and sunlight.

For most organizations, the slow speed of tape presents its greatest drawback. You can find backup applications that support on-demand features such as operating directly from backup media. That will not happen from a tape.

Having said that, tape has a good track record of reliability. Tapes stored on their edges in cool locations away from magnetic fields can easily survive ten years or more. Sometimes, the biggest problem with restoring data from old tape is finding a suitable, functioning tape drive. I have seen many techniques for tape management through the years.

One of the worst involved a front desk worker who diligently took the previous night’s tape offsite each day – leaving it on their car dashboard, where it baked in the sunlight for a few hours. So, even though the company and its staff meant well, and dutifully followed the recommendation to keep backups offsite, they wound up with warped tapes that had multiple dead spots.

At the opposite end, one customer used a padded, magnetically shielded carrying case to transport tapes to an alternative site. There, they placed the tapes into a fireproof safe in a concrete room.

I was called upon once to try to restore data from a tape that was ten years old. It took almost a week to find a functioning tape drive that could accommodate it. That was the only complication. The tape was still readable.

Optical Media in Backup Solutions

For a brief time, advances in optical technology made it an attractive backup target. Optical equipment carries a low cost and interfaces well with operating systems; it even supports drag-and-drop interactivity with Windows Explorer. Optical discs were most popular in the home market, though some optical systems found their way into datacenters. However, magnetic media quickly regained the advantage as its capacities outgrew optical media exponentially.

Pros of optical media

  • Very durable media;
  • Shelf life of up to ten years;
  • Inexpensive, readily interchangeable equipment;
  • Drag-and-drop target in most operating systems;
  • Lightweight media, easy to transport offsite.

Cons of optical media

  • Very limited storage capacity;
  • Extremely slow;
  • Few enterprise backup applications will target optical drives;
  • Poor reusability;
  • Wide variance in data integrity after a few years.

When recordable optical media first appeared on the market, people found its reliability attractive. CDs and DVDs do not care about magnetic fields at all and have a higher tolerance for heat and sunlight. Also, because the media itself has no mechanism, it survives rough handling better than tape.

However, they have few other advantages over other media types. Even though the ability to hold 700 megabytes on a plastic disc was impressive when recordable CDs first appeared, optical media capacities did not keep pace with magnetic storage.

By the time recordable DVDs showed up with nearly five gigabytes of capacity, hard drives and tapes were already moving well beyond that limit.

Furthermore, people discovered – often the hard way – that even though an optical disc’s structural material shows little visible wear, its data-retaining layer has a much shorter life. Even though a disc may look fine, its contents may have become unreadable long ago.

Recordable optical media has a wide range of data life, from a few years to several decades. Predicting media life span has proven difficult.

Because of its slow speed, low capacity, and need for frequent testing, you should avoid optical media in your disaster recovery solution.

Direct-Attached Storage and Mass Media Devices in Backup Solutions

You do not need to limit your backup solutions to systems that distinguish between devices and media. You can also use external hard drives and multi-bay drive chassis. Some attach temporarily, usually via USB. Others, especially the larger units, use more permanent connections such as Fibre Channel.

These types of systems have become more popular as the cost of magnetic disks has declined. They have a somewhat limited scope of applications in a disaster recovery solution, but some organizations can put them to great use.

Pros of directly attached external devices

  • Fast;
  • Reliable for long-term storage;
  • Inexpensive when using mechanical drives;
  • Easily expandable;
  • High compatibility;
  • Usable as a standard file system target.

Cons of directly attached external devices

  • Difficult to transport;
  • Additional concerns when disconnecting;
  • Mechanical drives have many failure points;
  • Expensive when using solid-state drives;
  • Not a valid target in every backup application.

Portability represents the greatest concern when using directly attached external devices for backup. Unlike tapes and discs, the media does not simply eject once the backup concludes.

With USB devices, you should notify the operating system of pending removal so that it has a chance to wrap up any writes, which could include metadata operations and automatic maintenance.
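
As a minimal illustration of that wrap-up step, here is a hedged Python sketch for a Linux host. The device paths are hypothetical, and it assumes the udisksctl utility from udisks2 is installed:

    import os
    import subprocess

    # Hypothetical device paths; adjust for your environment.
    PARTITION = "/dev/sdb1"
    DISK = "/dev/sdb"

    def detach_usb_backup_target() -> None:
        """Flush pending writes, then unmount and power off the USB drive."""
        os.sync()  # ask the kernel to flush dirty buffers to the device
        # udisksctl performs the unmount and tells the OS the device is leaving
        subprocess.run(["udisksctl", "unmount", "-b", PARTITION], check=True)
        subprocess.run(["udisksctl", "power-off", "-b", DISK], check=True)

    if __name__ == "__main__":
        detach_usb_backup_target()

On Windows, the "Safely Remove Hardware" mechanism serves the same purpose.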

Directly connected Fibre Channel devices usually do not have any sort of quick-detach mechanism. In an emergency, people should concern themselves more with evacuation than spending time going through a lengthy detach process. In normal situations, people tend to find excuses to avoid tedious processes. Expect these systems to remain stationary and onsite.

Once upon a time, such restrictions would have precluded these solutions from a proper business continuity solution. However, as you will see in upcoming sections, other advances have made them quite viable. With that said, you should not use a directly attached device alone. Any such equipment must be part of a larger solution.

You may run into some trouble using external devices with some backup applications. Fortunately, you should never encounter a modern program that absolutely cannot back up to a disk target. However, some may only allow you to use disk for short-term storage.

Others may not operate correctly with removable disks. If you purchase your devices before your software, make certain to test interoperability thoroughly.

Even though mechanical hard drives have advanced significantly in terms of reliability, they still have a lot of moving parts. Furthermore, the designers of the typical 3.5-inch drive did not build them for portability. They can travel, but not as well as tapes or discs. Even if you don’t transport them, they still have more potential failure points than tapes. Do not overestimate this risk, but do not ignore it, either.

Networked Storage in Backup Solutions

Network-based solutions share several characteristics with directly attached storage. Where you find differences between the two, you also find trade-offs. You could use the same pro/con list for networked solutions as you saw above for direct-attached systems. We emphasize different points, though.

In the “pros” column, networked storage gets even higher marks for expandability. Almost every storage unit built for the network provides multiple bays. You can start with a few drives and add more as needed. Some even allow you to connect multiple chassis, physically or logically. In short, you can extend your backup storage indefinitely with such solutions.

The network components result in a higher cost per gigabyte for network-attached storage. However, the infrastructure necessary to enable a storage device to participate on a network tends to have a side effect: more features.

Almost all these systems provide some level of security filtering. Less expensive devices, typically marketed simply as “Network-Attached Storage” (NAS), may not provide much more than that.

Higher-end equipment, commonly called “Storage Area Network” (SAN), boasts many more features. You can often make SAN storage show up in connected computers much like directly attached disks. All in all, the more you pay, the more you get. Unfortunately, though, cost increases more rapidly than features.

What you gain in capacity and features, you lose in portability. Many NAS and SAN systems are rack mounted, so you cannot transport them offsite without significant effort.

But, because these devices have a network presence, you can place them in remote locations. Using remote storage requires some sort of site-to-site network connection, which introduces higher costs, complexity, security concerns, possible reduction in speed, and more points of failure.

Even though placing networked storage offsite involves additional risks, it also presents opportunity. Most NAS and SAN devices include replication technology. You can back up to a local device and configure it to replicate to one or more remote sites automatically.

If your device cannot perform replication, or if you have different devices and they cannot replicate to each other, your backup software may have its own replication methods.

In the worst case, you can use readily available free tools such as XCOPY and RSYNC with your operating system’s built-in scheduler.
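
As a sketch of that worst case, the following Python wrapper invokes rsync and can be triggered from your scheduler. The source and destination paths are hypothetical placeholders:

    import subprocess

    # Hypothetical paths; replace with your local backup store and remote target.
    SOURCE = "/backups/nightly/"
    DESTINATION = "replica@dr-site.example.com:/backups/nightly/"

    def replicate_backups() -> None:
        """Mirror the local backup directory to the remote host with rsync."""
        subprocess.run(
            ["rsync",
             "-az",        # archive mode, compress in transit
             "--delete",   # keep the replica an exact mirror of the source
             SOURCE,
             DESTINATION],
            check=True,
        )

    if __name__ == "__main__":
        replicate_backups()

Register the script with cron on Linux or Task Scheduler on Windows to run after each backup window; XCOPY (or its successor, Robocopy) fills the same role for Windows-to-Windows copies.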

Using commodity computing equipment as backup storage

Up to this point, we have talked about network-attached devices only in terms of dedicated appliances. SANs have earned a reputation for carrying price tags that exceed their feature sets. In the best case, that reduces your budget’s purchasing power. More commonly, an organization cannot afford to use a SAN to its fullest potential – if it can afford one at all.

As a result, you now have choices in software-based solutions that run on standard server-class computing systems. Some backup applications can target anything that presents a standard network file protocol, such as NFS or SMB.
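
Before trusting such a target, it pays to confirm that the share is actually mounted and writable. A minimal Python sketch follows, assuming a hypothetical mount point:

    import os
    import tempfile

    # Hypothetical mount point for an SMB or NFS share presented by the NAS.
    BACKUP_SHARE = "/mnt/backup-share"

    def share_is_writable(path: str = BACKUP_SHARE) -> bool:
        """Confirm the network share is mounted and accepts writes."""
        if not os.path.ismount(path):
            return False
        try:
            with tempfile.NamedTemporaryFile(dir=path):
                pass  # the test file is created and removed; success means we can write
            return True
        except OSError:
            return False

    if __name__ == "__main__":
        print("share ready" if share_is_writable() else "share unavailable")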

Software vendors and open-source developers offer applications that layer network storage features on top of general-purpose operating systems. These solutions fill the price and feature space between NAS and SAN devices. They do require more administrative effort to deploy and maintain than dedicated appliances, however.

When I built my first backup solution with the intent of targeting a dedicated appliance, I quickly learned that hardware vendors emphasize the performance features of their systems. Since I only needed large capacity, I priced a low-end rack-mount server with many drive bays filled with large SATA drives. I saved quite a bit over the appliance options.

The role of hyper-converged infrastructure in backup

A comparatively new type of system, commonly known as “hyper-converged infrastructure” (HCI), has taken on a growing role in datacenter infrastructure. In the traditional three-tier model, server-class computers handle the compute work, SAN or NAS devices hold the data, and physical switches and routers connect them all together.

In HCI, the server-class computers take over all the roles, even much of the networking.

Few organizations will design an HCI just for backup. Instead, they will deploy HCI as their foundational datacenter solution.

Originally, datacenters used purpose-built hosts for specific roles, such as domain controllers and SQL servers. As technologies matured, vendors and administrators enhanced their resilience by clustering hosts.

These clusters stayed on the purpose-built path of their constituent hosts. In the second generation, server virtualization started breaking down the pattern of single-use physical hosts. However, for the sake of organization and permission scoping, most administrators continued to deploy hosts and storage around themes.

HCI supersedes that paradigm by enabling true “cloud” concepts. With HCI, we can still define logical boundaries for compute, storage, and networking groups, but the barriers only exist logically. We may not know which physical resource hosts a particular server or database file.

Even if we find out, it could move in response to an environmental event.

With files, the storage tier can scatter the bits across the datacenter – possibly even between well-connected datacenters. In short, HCI administrators only need to concern themselves with the organization’s overall capacity.

If some resource runs low, they purchase more equipment and extend their HCI footprint. When done well, hardware purchases and allocations occur in different cycles and levels than server provisioning and storage allocation.

All this gives you two considerations for backup with HCI:

  • You could place your infrastructure for on-premises backup hosting and public cloud relays in HCI just like any other server role;
  • You may have concerns about mixing the things that you back up with the backup itself.

The first viewpoint has the strongest supportable argument. You should have multiple independent copies of backup anyway, so pushing data to offsite locations reduces the impact of dependence on HCI.

Also, many administrators (and the non-technical people above them in the reporting chain) struggle to accept that coexistence does not automatically mean line-of-sight.

You can architect your HCI such that the production components have no effective visibility into backup. It works the same basic way that we have always set up datacenter backup, but the dividers exist in software instead of hardware. However, whether anyone’s fears are technically justified matters less than whether they persist.

If you encounter significant resistance to bundling backup in with the rest of your HCI deployment, then architect traditionally. It sacrifices some efficiency, but not to a crippling degree.

Cloud Storage in Backup Solutions

Several technological advances in the past few years have made Internet-based storage viable. Most organizations now have access to reliable, high-speed Internet connections at low cost. You can leverage that to solve one of the most challenging problems in backup: keeping backup data in a location safe from local disasters. Of course, these rewards do not come without risk and expense.

Pros of cloud backup

  • Future-proof;
  • Offsite from the beginning;
  • Wide geographical diversity;
  • Highly reliable;
  • Effectively infinite expandability;
  • Access from anywhere;
  • Security.

Cons of cloud backup

  • Dependencies outside your control;
  • Expensive to switch to another vendor;
  • Possibility of unrecoverable interruptions;
  • Speed.

To keep their promises to customers, cloud vendors replicate their storage across geographical regions as part of the service (cheaper plans may not offer this protection).

So, even though you do need to worry about failures in the chain of network connections between you and your provider and about outages within the cloud provider, you know that you will eventually regain access to your data. That gives cloud backup an essentially unrivaled level of reliability.

The major cloud providers all go to great lengths to assure their customers of security. They boast of their compliance with accepted, standardized security practices. Each has large teams of security experts with no other role than keeping customer data safe.

That means that you do not need to concern yourself much with breaches at the cloud provider’s level.

Yet, you will need to maintain the security of your account and access points. As with any other Internet-based resource, the provider must make your data available to you somehow.

Malicious attackers might target your entryway instead of the provider itself. So, you still accept some responsibility for the safety of your cloud-based data.

When using cloud storage for backup, two things have the highest probability of causing failure. Your Internet provider presents the first.

If you cannot maintain a reliable connection to your provider, then your backup operations may fail too often. Even if you have a solid connection, you may not have enough bandwidth to support your backup needs.

For the latter problem, you can choose a backup solution such as Hornetsecurity’s VM Backup that provides compression and deduplication features specifically to reduce the network load.
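
As a generic illustration of the idea (not how any particular product implements it), this Python sketch gzips a hypothetical backup file so that less data has to cross the link:

    import gzip
    import shutil

    # Hypothetical file names; substitute your own backup artifact.
    SOURCE_FILE = "nightly-backup.vhdx"
    COMPRESSED_FILE = "nightly-backup.vhdx.gz"

    def compress_backup(src: str, dst: str) -> None:
        """Gzip a backup file to reduce the data sent over the Internet."""
        with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
            shutil.copyfileobj(f_in, f_out)

    if __name__ == "__main__":
        compress_backup(SOURCE_FILE, COMPRESSED_FILE)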

Your second major concern is intermediary providers. While you can trust your cloud provider to exercise continuous security diligence, many third-party providers follow less stringent practices.

If your backup system transmits encrypted data directly to a cloud account that you control, then you have little to worry about beyond the walls of your institution. Verify that your software uses encryption and stay current on updates.

However, some providers ship your data to an account under their control that they resell to customers. If they fall short on security measures, then they place your data at great risk. Vet such providers very carefully.
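
If you need to add a layer of your own protection, client-side encryption before upload is straightforward. Here is a minimal sketch, assuming the third-party Python cryptography package and hypothetical file names:

    from cryptography.fernet import Fernet

    # Generate the key once and store it safely offline;
    # losing the key makes the backup unrecoverable.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    # The whole file is read into memory here, so split very large archives first.
    with open("nightly-backup.vhdx.gz", "rb") as f:
        ciphertext = cipher.encrypt(f.read())

    with open("nightly-backup.vhdx.gz.enc", "wb") as f:
        f.write(ciphertext)

With this approach, even a careless intermediary only ever holds ciphertext.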

“Cost” did not appear on either the pro or con list. Cost will always be a concern, but how it compares to onsite storage will differ between organizations. Using cloud storage allows you to eliminate so-called “capital expenditures”: payments, usually substantial, made up-front for tangible goods.

If you have an Internet connection, you will not need to purchase any further equipment. You also wipe out some “operational expenses”: recurring costs to maintain goods and services.

You will need to pay your software licensing fees, and your cloud provider will regularly bill you for storage and possibly network usage.

However, you will not need to purchase storage hardware, nor will your employees need to devote their time to maintaining it. You transfer all the hassle and expense of hardware ownership to your provider in exchange for a recurring fee.

Unfortunately, you should not transfer your entire backup load to a cloud provider. Due to the risks and speed limits of relying on an internet connection, it still makes the most sense to keep at least some of your solution on site. So, you should still expect some capital expense and local maintenance activities.

Putting It in Action

The previous section helped you to work through your software options. If you have made a final selection, then that choice exerts at least some control over your hardware purchase. If not, then you can explore your hardware options and work backward to picking software.

The exact deployment style that you use, especially for the on-premises portion of your solution, only matters to the degree that it enables your backups to function flawlessly.

Prioritize satisfying your needs above aligning with any paradigm. You need space to store your backups, software to capture them, and networking and transport infrastructure to move them from live systems.

Four steps to performing hardware selection

Truthfully, your budget is the largest restrictor of your hardware options. So, start there. Work through the features that you want to arrive at your project scope. Your general process looks like this:

1. Determine budget

2. Establish other controlling parameters:

  • Non-cloud replication only works effectively if you have multiple, geographically distant sites;
  • Inter-site and cloud replication need sufficient bandwidth to carry backup data without impeding business operations;
  • Rack space.

3. Decide on preferred media type(s). The above explanations covered the pros and cons of each type. Now you need to decide what matters to your organization (a cost-comparison sketch appears after these steps):

  • Cost per terabyte;
  • Device/media speed;
  • Media durability;
  • Media transportability.

4. Prioritize desired features:

  • Deduplication;
  • Internal redundancy (RAID, etc.);
  • External redundancy (hardware-based replication);
  • Security (hardware-based encryption, access control, etc.).

If you find that the cost of a specific hardware-based feature exceeds your budget, then your software might offer it. That can help you to achieve the coverage that you need at a palatable expense.
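
To illustrate the cost-per-terabyte comparison from step 3, here is a small Python sketch. All figures are invented placeholders; substitute quotes from your own vendors:

    # All figures are invented placeholders; plug in quotes from your vendors.
    options = {
        "LTO tape library": {"hardware": 4000.0, "media_per_tb": 8.0, "capacity_tb": 120.0},
        "HDD-based NAS":    {"hardware": 2500.0, "media_per_tb": 20.0, "capacity_tb": 80.0},
        "Cloud (per year)": {"hardware": 0.0,    "media_per_tb": 250.0, "capacity_tb": 100.0},
    }

    def cost_per_tb(hardware: float, media_per_tb: float, capacity_tb: float) -> float:
        """Spread the one-time hardware cost across total capacity, then add media cost."""
        return hardware / capacity_tb + media_per_tb

    for name, figures in options.items():
        print(f"{name}: ${cost_per_tb(**figures):,.2f} per TB")

Note that the cloud entry represents a recurring cost rather than a one-time purchase, so compare over your intended retention horizon.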

Once you have concluded your hardware selection, you could proceed to acquiring your software and equipment. However, it makes sense to work through the next article on security before making any final decision. You might decide on a particular course for securing data that influences your purchase.


To properly protect your virtualization environment and all the data, use Hornetsecurity VM Backup to securely back up and replicate your virtual machines.

We ensure the security of your Microsoft 365 environment through our comprehensive 365 Total Protection Enterprise Backup and 365 Total Backup solutions.

For complete guidance, get our comprehensive Backup Bible, which serves as your indispensable resource containing invaluable information on backup and disaster recovery.

To keep up to date with the latest articles and practices, pay a visit to our Hornetsecurity blog now.


Conclusion

In summary, exploring various backup storage options reveals both advantages and disadvantages. Each target, whether it’s cloud-based, on-premises, or a hybrid approach, offers unique benefits and challenges. The choice ultimately depends on specific business needs, budgets, and data security concerns.

By carefully considering the factors mentioned in this article, organizations can make informed decisions to ensure data protection and accessibility align with their objectives.

FAQ

What is the main purpose of storage?

Storage serves as a system that empowers a computer to store data, whether temporarily or permanently. Devices like flash drives and hard disks constitute a foundational element in most digital devices, enabling users to safeguard a wide array of data, including videos, documents, images, and raw information.

What are the benefits of backup storage?

Backups provide the means to recover deleted files or retrieve data that may have been unintentionally overwritten. Moreover, backups often represent the most reliable choice for recuperating from incidents like ransomware attacks or significant data loss events, such as a data center fire.

What are the benefits of more storage?

Here are several notable benefits associated with additional storage that may persuade you to maximize your capacity:

– Enhance organization and minimize clutter;
– Boost efficiency and productivity;
– Ensure safety;
– Enhance accessibility for everyone;
– Elevate comfort and ergonomics.

These advantages underscore the importance of making the most of available storage space.
