Moving seldom-used “cold” data off networks and clouds can slash enterprise costs.
Storing all of your data on flash storage is like buying a race car to drive 25 miles per hour through your neighborhood – it delivers far more speed than you need, at a price you probably can’t justify.
Yet thousands of organizations make essentially that mistake every day. They store, replicate, and back up data exactly where users put it – on expensive enterprise network-attached storage (NAS) arrays – without distinguishing between regularly used “hot” data and “cold” data, which can lie dormant for years and often accounts for 60% of a typical organization’s total.
Idling Data
Yes, that data still needs to be available for analytics and other monetization projects, but that doesn’t mean it needs to sit in a race car while waiting at the stop sign.
See also: Big Data Battle Shifts Fronts
This is exactly what’s happening today with unstructured data, which is growing at an explosive rate. According to Gartner’s recent Magic Quadrant for Distributed File Systems and Object Storage, unstructured data is growing at 50% or more per year, while Gartner pegs IT budget growth at only 3.2%. How can IT close that gap and keep up with the pace of data growth as it plans its data infrastructure?
The time has come for enterprises to rethink how to handle this deluge of unstructured data, based on lessons learned over the past few years. It’s time to be smarter about where to store all of this data. An article in the Harvard Business Review asks: “What’s Your Data Strategy?” IT and business leaders need to think about how much data they have, which data is hot or cold, and where best to store it. Chief Data Officers (CDOs) across industries must step back and re-evaluate their data management practices so they can optimize their storage environments, free up critical resources, and save money.
Do You Know Where Your Data Lives?
With this unprecedented growth in data and data types, a company’s primary NAS can quickly reach the point where performance bogs down under too many hierarchical structures. Storing large volumes of cold data on NAS simply doesn’t make sense: NAS is an expensive home for unstructured data that is mostly cold and rarely accessed, and piling that data onto existing arrays clogs performance and drives up costs. There must be a more cost-effective way to store colder data that you might need to access at some point, but not every day.
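The cost argument is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below compares keeping everything on NAS against tiering the cold 60% to object storage; the capacity and per-terabyte prices are hypothetical placeholders, not figures from this article – substitute your own vendor quotes.

```python
# Rough comparison: all data on NAS vs. tiering cold data to object storage.
# All capacities and prices below are hypothetical assumptions.

TOTAL_TB = 1000            # total unstructured data footprint (assumption)
COLD_FRACTION = 0.60       # ~60% of data is cold, per the article
NAS_COST_PER_TB = 3000     # hypothetical all-in NAS cost, $/TB/year
OBJECT_COST_PER_TB = 600   # hypothetical object storage cost, $/TB/year

cold_tb = TOTAL_TB * COLD_FRACTION
cost_all_nas = TOTAL_TB * NAS_COST_PER_TB
cost_tiered = ((TOTAL_TB - cold_tb) * NAS_COST_PER_TB
               + cold_tb * OBJECT_COST_PER_TB)

print(f"Cold data:           {cold_tb:.0f} TB")
print(f"All-NAS annual cost: ${cost_all_nas:,.0f}")
print(f"Tiered annual cost:  ${cost_tiered:,.0f}")
print(f"Annual savings:      ${cost_all_nas - cost_tiered:,.0f}")
```

With these placeholder numbers, tiering cuts the annual bill roughly in half; the larger the cold fraction and the wider the price gap, the bigger the savings.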
See also: Cloud Storage Costs Emerge as Inconvenient AI Truth
Reaching to the Public Cloud
Over the past few years, companies have leveraged the public cloud to quickly add layers of storage. After the initial great wave of adoption, many are rethinking their public cloud strategy after receiving unexpectedly large bills – often egress charges incurred when it comes time to pull data back out for access and analysis. The main reason many companies use the public cloud is the ability to spin up a massive number of compute cores for a job and then decommission them afterward, avoiding the capital cost of running that job on-premises.
A Cost-Effective Alternative: On-Premises Object Storage
An alternative to the public cloud is bringing data back on-premises with an object storage solution. With object storage, IT keeps control of its data, gets better and more predictable performance, and can scale easily at lower cost. The case for on-premises analytics rests on the ability to control the environment – including its security features – to get predictable performance, and to move and store data without incurring egress charges. On-premises object storage is a cost-effective home for colder data that you might need at some point, but not every day.
Know When to Say When
How do you know when it’s time to rethink your data storage strategy? A few events that should trigger a re-evaluation are:
- An unexpected request or demand to get data out of the cloud blows your entire budget.
- Your NAS performance is so bad that you need more NAS!
- The gap between the explosive growth of data and the cost of storage becomes unmanageable.
At this point it’s time to consider bringing data back on-premises with an object storage solution.
Moving Data from NAS to Object Storage
New tools from companies such as Komprise help manage data by identifying cold data and migrating it to an object storage solution. As mentioned above, typically 60% of an organization’s data is cold and has not been accessed in over a year. Why not move that cold data to more cost-effective object storage and reclaim NAS for initiatives that drive more immediate value to your business?
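The first step in any such migration is identifying which files are actually cold. The sketch below is a minimal, generic approach – not Komprise’s method or API – that walks a file tree and flags files whose last-access time is more than a year old. The mount point is hypothetical, and note that access times are only a reliable signal when the filesystem is not mounted with `noatime`.

```python
import os
import time

COLD_AGE_SECONDS = 365 * 24 * 3600  # "cold" = not accessed in over a year


def find_cold_files(root, now=None):
    """Walk a directory tree and yield (path, size_bytes) for files
    whose last-access time is older than COLD_AGE_SECONDS."""
    now = now or time.time()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
            if now - st.st_atime > COLD_AGE_SECONDS:
                yield path, st.st_size


if __name__ == "__main__":
    # "/mnt/nas_share" is a hypothetical NAS mount point.
    total = sum(size for _path, size in find_cold_files("/mnt/nas_share"))
    print(f"Cold data candidates: {total} bytes")
```

A real tiering tool layers much more on top of this – inode scanning at scale, policy engines, and transparent links back to the original path – but the core question it answers is the same: which bytes has nobody touched in a year?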
See also: Why Big Data Storage Needs to Be a Bigger Discussion
Here at Western Digital, our CDO saved $600K in public cloud charges by deciding which data to bring on-premises and which to keep in the public cloud. In doing so, we regained control of our data while still taking advantage of the public cloud where it made sense.
An Evolving Data Strategy
It’s time for IT to rethink data management practices and how to handle the unprecedented growth of unstructured data. Data strategy needs to evolve over time and take into account lessons learned from the past. Understanding where your data is and where it is best served is paramount to a cost-effective data management solution. There is great TCO potential in re-evaluating your cloud strategy, knowing where your data lives, and optimizing resources across NAS, the cloud, and on-premises storage.