How Data Storage Infrastructure Is Being Redefined Today

Key Takeaways

Here we examine some of the new approaches to data storage that many vendors are offering today, such as all-flash arrays, software-defined storage, and scalable storage for unstructured and file-based data, as well as the active management of data storage infrastructure.

The nature of storage is changing. Data is what drives decisions today. For companies, the ability to access their data quickly, efficiently and predictably can provide a competitive edge in a crowded and disruptive marketplace. According to IDC, by 2025 the world will create ten times as much data as existed in 2016, an estimated 163 zettabytes in total. Moreover, while consumers have traditionally created the bulk of the world's data, enterprises will create 60 percent of it in 2025. According to the 2018 State of Infrastructure Report, the growth of data and storage is by far the biggest factor driving IT infrastructure change, with 55 percent of respondents citing it as one of the top three factors. In fact, data and storage far exceeded the need to integrate with cloud services.

As we examine data storage today within the enterprise, we find several trends:

  • Data needs to be accessed as quickly as possible.
  • Data storage must be highly scalable to accommodate accelerating growth.
  • Data storage needs to be smart, matching varying types of data with the appropriate storage.
  • Companies require active management, monitoring and support to ensure that their storage infrastructures run reliably and predictably.
  • Companies want to rid themselves of costly forklift upgrades in their data infrastructure every few years.
  • Unstructured data is growing rapidly and demands new storage approaches.

The Need for Speed

Companies today need to acquire the data they want, when they need it. This equates to speed, and whether you are talking about cars or data, speed costs money. Companies are turning to all-flash arrays (AFAs), as evidenced by the fact that the AFA market grew by 37.6 percent year over year in 2017, making it a $1.4 billion industry. While it is true that solid-state technology is more expensive than traditional drives, you may not need as much of that perceived capacity. By integrating intelligence-based tools into AFA storage infrastructure, companies can achieve data reduction ratios of 2:1, 4:1 and even 10:1. Some of these reduction tools include the following:

  • Data Compression – The combination of both inline compression and algorithm-based deep reduction helps deliver a 2–4x data reduction target. This powerful compression combination is the primary form of data compression for databases.
  • Copy Reduction – Provides instant pre-deduplicated copies of data for snapshots, clones and replication.
  • Thin Provisioning – Eliminates waste by reserving data capacity in dynamic fashion in order to stay ahead of written data.
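To make the reduction ratios above concrete, here is a minimal sketch of block-level deduplication combined with inline compression. The block size, hashing scheme and helper names are illustrative assumptions, not any vendor's actual implementation:

```python
# Hypothetical sketch: deduplicate fixed-size blocks by fingerprint,
# compress only the unique blocks, and report the reduction ratio.
import hashlib
import zlib

BLOCK_SIZE = 4096  # a typical 4 KiB block (illustrative)

def store(data: bytes):
    """Return (unique_blocks, logical_bytes, physical_bytes)."""
    unique = {}   # fingerprint -> compressed block
    logical = 0
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        logical += len(block)
        fp = hashlib.sha256(block).hexdigest()
        if fp not in unique:
            # Write avoidance: only previously unseen blocks are
            # compressed and written to the array.
            unique[fp] = zlib.compress(block)
    physical = sum(len(b) for b in unique.values())
    return unique, logical, physical

# A workload with many repeated blocks deduplicates and compresses well.
data = (b"A" * BLOCK_SIZE) * 8 + (b"B" * BLOCK_SIZE) * 8
_, logical, physical = store(data)
print(f"logical {logical} bytes stored in {physical} physical bytes")
```

Real arrays do this inline in the data path with hardware assistance, but the principle is the same: identical blocks are written once, and everything written is compressed first.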

Now consider the fact that these deduplication and pattern removal technologies help to maximize the lifespan of your storage solution. Referred to as “write avoidance techniques,” these software features help to reduce the number of times that data must be written to the array. Less usage helps preserve the durability and performance of your system. AFAs also help reduce overall data center costs. Traditional storage has many moving parts, all creating a lot of heat and consuming a lot of power. There is no movement within all-flash drives, and motionless drives equate to reduced electrical and cooling costs. (For more on saving energy, check out How Lawmakers Are Pushing Data Centers in a Green Direction.)

Software-Defined Storage

We have witnessed the software-defined transformation of many aspects of the data center in recent years, such as server virtualization and software-defined networking. Software-defined storage (SDS) is changing the very nature of many data centers today. In the same manner that enterprises have liberated themselves from costly and inflexible server hardware, they are purging their data centers of expensive proprietary storage solutions in favor of software-defined storage that utilizes x86 technology. This has several benefits:

  • There is no need for a dedicated storage controller running proprietary software.
  • It utilizes x86 technology that most IT professionals are already familiar with.
  • Enterprises can reduce the size of their storage footprint, which reduces hosting and cooling costs.
  • Companies can leverage existing storage hardware.

SDS allows vendors to integrate compute, storage and networking assets into one converged system that a single admin can manage through a single pane of glass. Gartner predicts that SDS adoption will enable companies to reduce their server and storage expenditures by 2020.
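The core idea is that a software control layer, rather than a proprietary hardware controller, pools commodity x86-attached disks and carves volumes out of the pool. A minimal sketch of that abstraction, with hypothetical class and method names:

```python
# Hypothetical sketch of an SDS control layer: commodity disks
# join a shared pool, and volumes are allocated out of pooled
# capacity by software rather than a dedicated controller.
class StoragePool:
    def __init__(self):
        self.capacity_gb = 0
        self.allocated_gb = 0

    def add_disk(self, capacity_gb: int) -> None:
        """Any commodity disk can join; software handles placement."""
        self.capacity_gb += capacity_gb

    def provision_volume(self, size_gb: int) -> bool:
        """Allocate a volume if pooled capacity remains."""
        if self.allocated_gb + size_gb > self.capacity_gb:
            return False
        self.allocated_gb += size_gb
        return True

pool = StoragePool()
for disk_gb in (2000, 2000, 4000):   # three commodity drives, 8 TB total
    pool.add_disk(disk_gb)
print(pool.provision_volume(6000))   # True: fits in the pool
print(pool.provision_volume(4000))   # False: pool would be oversubscribed
```

Production SDS stacks add replication, placement policy and failure handling on top of this, but the separation of control software from commodity hardware is the defining feature.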


Scaling Out for Unstructured Data

One of the key initiators of the astounding growth rate of data today is the explosion of unstructured data. According to a research survey sponsored by Western Digital of more than 200 technology decision-makers, 63 percent reported managing storage capacities of 50 petabytes (PB) or more, with more than half of that falling under the unstructured category. As one leading storage vendor puts it, “Unstructured, file-based data is the crown jewel of the modern day enterprise and petabyte scale data storage is the new normal.”

An example of unstructured data is IoT-generated data. IDC believes data from IoT will make up 10 percent of the data universe by 2020. As a result, companies need a new generation of scale-out storage engineered to store and manage unstructured and file-based data at web scale. Although valuable, unstructured data often does not justify the high cost of block-based storage. Unstructured data is creating the need for scale-out storage such as highly scalable NAS solutions and software-defined storage. (Consolidating your data center can also help you manage your data. Learn more in 5 Reasons Your Company Should Consolidate Its Data Center.)
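One common technique underpinning scale-out storage is consistent hashing: objects map onto a ring of nodes so that adding or removing a node relocates only a small fraction of the data. A hypothetical sketch, with illustrative node names and virtual-node count:

```python
# Hypothetical sketch of consistent hashing for object placement
# in a scale-out cluster. Virtual nodes smooth out the distribution.
import bisect
import hashlib

class HashRing:
    def __init__(self, nodes, vnodes=100):
        self.ring = []  # sorted list of (hash, node)
        for node in nodes:
            for v in range(vnodes):
                self.ring.append((self._hash(f"{node}:{v}"), node))
        self.ring.sort()

    @staticmethod
    def _hash(key: str) -> int:
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def node_for(self, key: str) -> str:
        """Place an object on the first node clockwise from its hash."""
        h = self._hash(key)
        i = bisect.bisect(self.ring, (h, "")) % len(self.ring)
        return self.ring[i][1]

ring = HashRing(["node-a", "node-b", "node-c"])
print(ring.node_for("sensor-readings-2018-04-01.json"))
```

Scale-out NAS and object stores use variations of this idea so that capacity and throughput grow roughly linearly as commodity nodes are added.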

Active Managed Data Infrastructure

Consider the typical support call concerning a failed drive in a traditional SAN appliance. Your call is answered by a service representative whose job it is to take your basic information about the problem at hand and forward it on to the appropriate technical support technician or engineer. The representative will ask for the usual – product ID numbers, your name, contact information – and remind you of the expiration date of your current service contract. Once your customer profile is established, the barrage of questions begins:

  • What software or firmware version are you running?
  • Have you made any changes recently to the unit?
  • Can you access the administrative console?
  • Are there any lights flashing on the drives?
  • Is your data currently available?

Finally, you are forwarded to a technician who requests that you pull a log from the unit and email or FTP it, and then you must wait while the log is reviewed. During all of this, your work is on hold, costing your organization valuable productivity. But what if your vendor knew about the failed drive before you did?

Not only can companies not tolerate any downtime when it comes to their storage infrastructure, they cannot afford time-consuming inefficiencies when it comes to their support staff. For this reason, some storage vendors are offering solutions that are actively monitored and managed through the cloud. Leveraging the telemetry sent from storage systems across the world, storage vendors are able to use predictive analytics to identify most problems before they occur. Oftentimes, a replacement drive is already on its way before the customer is aware of the problem.
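On the vendor side, the phone-home model boils down to evaluating fleet telemetry against failure indicators, in the spirit of SMART counters. A hypothetical sketch; the field names and thresholds below are illustrative only, not any vendor's actual rules:

```python
# Hypothetical sketch of vendor-side predictive monitoring:
# arrays phone home with drive telemetry, and simple rules (or,
# in practice, trained models) flag drives likely to fail so a
# replacement ships before the customer notices.
def at_risk(telemetry: dict) -> bool:
    """Flag a drive whose error counters suggest impending failure."""
    return (
        telemetry.get("reallocated_sectors", 0) > 50
        or telemetry.get("pending_sectors", 0) > 0
        or telemetry.get("media_wear_pct", 0) > 90
    )

fleet = [
    {"drive": "sn-1001", "reallocated_sectors": 2,
     "pending_sectors": 0, "media_wear_pct": 40},
    {"drive": "sn-1002", "reallocated_sectors": 81,
     "pending_sectors": 3, "media_wear_pct": 65},
]
for drive in fleet:
    if at_risk(drive):
        print(f"dispatch replacement for {drive['drive']}")
# prints: dispatch replacement for sn-1002
```

Real systems correlate telemetry across the entire installed base, which is what makes the predictions far more accurate than any single customer's view.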

The nature of storage infrastructure is indeed changing, and with it are new methods of storing, accessing and managing your company’s data. Needless to say, even more change in the data industry lies ahead.


Brad Rudisail
Contributor

Brad Rudisail is a network engineer, IT manager, IT instructor, technical writer and professional musician. His twenty-year writing portfolio includes a long assortment of white papers, newspaper columns, articles, learning curricula and blogs. He is also the author of two inspirational books.