Data Center Tech Blog

The New Fundamentals of Data

There are new fundamentals for data, and they're based on value. When data lives forever, our dynamic world requires dynamic storage.

Stefaan Vervaet

Sr. Dir. Strategic Alliances & Market Development

What do Artificial Intelligence, Virtual Reality and blockchain technologies have in common? First, they all aspire to improve our day-to-day lives, whether through the automation of processes or by providing an augmented layer of experience that has never existed before.

The second commonality is that their path to success stems from massive amounts of data points. That's also their key challenge: in a world where the future value of data is not yet understood, certain discoveries are only unlocked once you have enough mass.

A single dataset of sequenced cancer cells may not look out of the ordinary, but tomorrow, when more data has been collected and faster processors allow for more sophisticated algorithms, that combined dataset could be instrumental in curing cancer. Who would want to pass up such opportunities? It's no surprise that some companies may decide never to delete data again.

The Business Challenge of Growing with Data

CIOs and storage admins continue to be challenged by never-ending storage growth that outpaces their available budgets. But there's an opportunity in accepting the challenge. Some companies reinvent their business around the data they control with a data-first strategy: they handle data as intellectual property and take a holistic approach to how they capture, manage and monetize it. Start with the data strategy, then follow with the storage architecture.

Traditional file- and block-based storage arrays are failing at cloud scale. Some are limited by the traditional RAID protection schemes and fixed storage allocation techniques they are built on; others are bound by the centralized database required to support the central locking inherent to POSIX-based systems.

Traditional systems lack the agility to take advantage of new, denser storage technologies and are subject to tedious forklift upgrades and/or migration services. They consist of dedicated storage silos that are not only expensive to maintain but also don't contribute to the incremental value that businesses are trying to achieve. Buying more of the same is no longer an acceptable way to support a long-term data strategy.

Next-Generation Technologies and Their Promises

Two key storage technologies have emerged to alleviate some of the traditional storage pain amid this abundance of data: software-defined storage and object storage. Object storage solutions are designed from the ground up to deliver the scale and flexibility that overcome the challenges inherent in traditional arrays. Software-defined storage (SDS) models bring the promise of separating data from its underlying hardware, delivering both flexibility and new cost-efficiencies through commodity hardware.

However, the marriage of these two solutions has been less successful. Running object storage in a software-only model is still expensive at scale. Commodity hardware is not all created equal, and different hardware is introduced as new standards are adopted; the cost to qualify, tune and alter an existing environment can wash all of those benefits away. Many open-source tools provide a great foundation, but early adopters quickly discovered that reaching petabyte-scale environments requires dedicated engineering teams.

On top of that, existing object storage architectures are built on static dispersal techniques, such as RING or other pre-defined placement algorithms. These techniques suffer when newer hardware is introduced to an existing cluster, triggering a rebalance of data or, in certain cases, a migration. Another side effect of static allocation is the immediate impact on ingest performance when a storage or network component fails.
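To see why static dispersal forces rebalancing, consider a deliberately simplified sketch (illustrative only, not any vendor's actual algorithm): if an object's location is a pure function of its name and the node count, then adding a node changes the "correct" location of most objects already stored.

```python
# Simplified sketch of static hash-based placement (illustrative only).
# Location is a pure function of object name and node count, so changing
# the node count silently remaps most existing objects.
import hashlib

def place(obj_name: str, num_nodes: int) -> int:
    """Map an object to a node with a static hash; ignores load and health."""
    digest = hashlib.sha256(obj_name.encode()).hexdigest()
    return int(digest, 16) % num_nodes

objects = [f"object-{i}" for i in range(10_000)]
before = {name: place(name, 8) for name in objects}  # 8-node cluster
after = {name: place(name, 9) for name in objects}   # one node added

moved = sum(1 for name in objects if before[name] != after[name])
print(f"{moved / len(objects):.0%} of objects would have to move")
# Roughly 8/9 (about 89%) of objects remap after adding a single node.
```

Consistent-hashing schemes reduce how much data moves, but any placement that is a pure function of a fixed layout still pays a migration cost whenever that layout changes.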

A New Type of Object Storage Architecture

Based on customer feedback, we understood that there is a need for a new type of object storage architecture that delivers more dynamic, real-time data placement for a robust storage system. An architecture that combines the agility and flexibility of a software-defined model and dynamically takes advantage of the most efficient building blocks over time.
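As a rough mental model of what dynamic placement means (a hypothetical sketch, not ActiveScale's actual implementation), contrast the static approach above with a system that chooses targets at write time from live cluster state and records the choice per object: new or denser nodes absorb writes immediately, failed components are simply skipped, and nothing needs to be rebalanced.

```python
# Hypothetical sketch of dynamic data placement (not ActiveScale's real algorithm).
# Targets are chosen at write time from live cluster state, and the chosen
# layout is recorded per object, so no global placement map must be rebuilt.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    free_bytes: int
    healthy: bool = True

@dataclass
class Cluster:
    nodes: list
    placement: dict = field(default_factory=dict)  # object name -> node names

    def write(self, obj_name: str, size: int, copies: int = 3) -> list:
        # Consider only healthy nodes with enough room, preferring the emptiest.
        candidates = sorted(
            (n for n in self.nodes if n.healthy and n.free_bytes >= size),
            key=lambda n: n.free_bytes,
            reverse=True,
        )
        targets = candidates[:copies]
        for node in targets:
            node.free_bytes -= size
        self.placement[obj_name] = [n.name for n in targets]
        return self.placement[obj_name]

cluster = Cluster(nodes=[Node("a", 100), Node("b", 80), Node("c", 60)])
cluster.nodes.append(Node("d", 500))   # a new, denser node joins the cluster
cluster.nodes[1].healthy = False       # node "b" fails
print(cluster.write("photo.jpg", 10))  # ['d', 'a', 'c']: new node used at once,
                                       # failed node skipped, no rebalance needed
```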

Today we announced the newest features of our ActiveScale® system. With its vertical integration and innovative stack, the ActiveScale architecture delivers exactly that, and more:

  • Dynamic Data Placement – for seamless movement of data onto new capacity and new ActiveScale hardware models without the need to rebalance or migrate data over time.
  • Support for highly efficient and reliable drive media – HGST’s enterprise HelioSeal® drives that provide leading power and cooling efficiencies.
  • Seamless scale-out of capacity to multiple racks, with support for single- or multi-geography deployments – to make scaling beyond a single rack simple and to keep data easily accessible across locations under a single namespace.
  • Variable storage capacity and extreme reliability – to ensure high data durability across different drive sizes and to simplify changes to drive capacities over time.
  • A vertically integrated object store system – Deploy an on-premises object store with a native Amazon S3™-compliant interface in a matter of hours instead of weeks, fully delivered and supported by HGST technology from the device to the operating system (a usage sketch follows this list).
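
Because the interface is S3-compliant, standard S3 tooling should work against it. Here is a minimal sketch using boto3; the endpoint URL, credentials and bucket name below are placeholders, not real ActiveScale values:

```python
# Minimal sketch: using an S3-compliant object store through boto3.
# The endpoint, credentials and bucket below are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://activescale.example.com",  # hypothetical on-prem endpoint
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
)

s3.create_bucket(Bucket="archive")
s3.put_object(Bucket="archive", Key="backups/db-dump.tar.gz", Body=b"...")

obj = s3.get_object(Bucket="archive", Key="backups/db-dump.tar.gz")
print(obj["Body"].read())
```

Applications already written against Amazon S3 can typically be pointed at such an endpoint with little more than a configuration change.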

See here how our dynamic placement works:

Our CIO’s View

A perfect example of this transition from traditional infrastructure to flexible, agile object storage that enables data and insights to thrive is how our own IT department replaced our traditional tape, backup appliances and traditional file-based storage servers with a single consolidated object storage repository.

This content repository, built on ActiveScale, hosts a multitude of backup data sets, long-term archives, manufacturing logs and VDI snapshots in a single architecture. From there it is accessed for advanced analytics, fast restores, after-the-fact metadata augmentation, audit reviews and much more.

In this new infrastructure, data is created and touched by different applications, and incremental value accrues over time. One such use case is our big data platform, which supports the drive manufacturing business in discovering techniques that improve our own drive yields.

A Dynamic World Requires Dynamic Storage

As you invest in your company's approach to the digital transformation taking place in the economy, start with your data strategy and think about an architecture that is built to dynamically scale and grow with your needs. An architecture that takes advantage of the latest storage capacities in-line and delivers continuous economics over time, turning your storage strategy from a cost focus into one of generating value through data. We look forward to helping you on that journey.

To find out more, visit us at www.hgst.com/products/systems

Watch the Data Lives Forever virtual event: www.hgst.co/datalivesforever
