
The Process of Transforming Data Centers

Data centers are undergoing a period of technology transformation, redesigning their infrastructure to add ground-breaking technologies even as they maintain core production systems with enterprise service levels.

Gartner Inc. analysts call this kind of disruptive innovation “bi-modal” IT. Gartner managing analyst Ray Paquet described it, and spoke about its impact, at the Gartner Data Center conferences in London and Las Vegas.

According to Paquet’s keynote slides (“Bi-Modal IT: Managing the Dichotomy in the Data Center”), “By 2017, 75% of IT organizations will have a bi-modal capability.” The slides defined bi-modal IT as “having two modes of IT, each designed to address different information and technology goals,” and described the two modes simply: “Mode 1 is traditional, and Mode 2 is exploratory.” Paquet recommended that customers start with an “island” project as a proof of concept before adding it formally to IT operations.1

Bi-Modal IT

As I listened at the conference, I realized that this form of IT gives new and innovative systems time to be developed and tested, even as traditional IT systems continue in production, delivering data services to end-users. SanDisk®’s Rich Petersen commented on this trend here, in the SanDisk ITBlog.

This bi-modal approach is driven by a host of new applications and is enabled by new hardware designs and new software tools. The new infrastructure leverages software virtualization across the data center, improved network “fabrics” within the data center, and a higher level of abstraction that allows workloads to be shifted within the data center, and between linked data centers, as needed.

Bi-modal IT is all about maintaining quality of service (QoS) for end-users on production systems, while simultaneously developing and deploying new applications, supported by new infrastructure approaches, in “islands” of innovation that improve, and prove, the new capabilities before they are deployed more widely.

What It Means for Business Planning

Here’s one way to look at it, in my view: it’s like building new modules for the International Space Station on Earth, then installing them on the station once they’ve been tested and proven on the ground. While the new modules are under development, production systems must continue to deliver data services to end-users. In the data center, we see new innovation happening within DevOps and in new software delivery systems for cross-platform management that will ultimately be woven into the fabric.

A similar transformation occurred in the 1990s, when open systems based on UNIX surrounded mainframe computers, off-loading non-production applications or hosting new applications that mainframes were not nimble enough to deploy quickly. That is exactly what happened with American Airlines’ Sabre reservation system, as hotel applications moved off the mainframes (which remained the core of Sabre) and were re-hosted on Unix servers. Now, the changes are taking place in multiple layers of the infrastructure, in servers, storage and networking, and in orchestrating the flow of workloads throughout the data center.

Why This Is Happening

Customers know they must move forward to build out an extensible x86 server infrastructure, even as they preserve selected islands of automation: mainframes, Unix servers and x86 infrastructure running production workloads, many of which are mission-critical.

Just as importantly, with enterprises sending many applications to the cloud for hosting, the data center needs to be able to link more easily to the cloud – and to manage both on-prem and off-prem systems in a more holistic way.

Introducing a higher level of abstraction into applications, databases and workloads, for better management, is key to moving to a software-defined data center. In his keynote presentation, “Non-Stop IT: Delivering the Integrated Data Center,” David Cappuccio, Managing VP and Chief of Research for Gartner’s Infrastructure Services, outlined the importance of Software-Defined X (SDx), or software-defined everything, in the data center, noting in his presentation’s bullet points, quoted below (see the sketch after the list), that it is:2

  • A new way to operate
  • Moves control plane (and potentially the value) from individual devices to a central controller
  • Allows provisioning, monitoring and management from a single point
  • Potential integration point of on- and off-premises data and applications
  • Key element of future data centers
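
To make the “central controller” idea concrete, here is a minimal, purely illustrative Python sketch. Every class and method name in it is hypothetical, invented for this post rather than taken from any vendor’s SDx or SDN API; it simply shows desired state being declared once and pushed out to managed devices, with monitoring from a single point.

    # Illustrative sketch only: a toy "central controller" in the SDx spirit.
    # All names here are hypothetical; this is not a real SDN/SDx API.

    class Device:
        """A managed element: a switch, storage array, or hypervisor host."""
        def __init__(self, name):
            self.name = name
            self.config = {}

        def apply(self, desired):
            # A real system would push this via an agent or device API.
            self.config = dict(desired)

    class CentralController:
        """Single point of provisioning, monitoring, and management.
        The control plane lives here, not on the individual devices."""
        def __init__(self):
            self.devices = {}
            self.desired_state = {}

        def register(self, device):
            self.devices[device.name] = device

        def set_desired_state(self, name, desired):
            self.desired_state[name] = desired

        def reconcile(self):
            # Push desired state to every registered device.
            for name, desired in self.desired_state.items():
                self.devices[name].apply(desired)

        def report(self):
            # Monitoring from a single point: one view of all device configs.
            return {n: d.config for n, d in self.devices.items()}

    ctrl = CentralController()
    ctrl.register(Device("leaf-switch-1"))
    ctrl.register(Device("flash-array-1"))
    ctrl.set_desired_state("leaf-switch-1", {"vlan": 42})
    ctrl.set_desired_state("flash-array-1", {"tier": "all-flash"})
    ctrl.reconcile()
    print(ctrl.report())

The point of the sketch is the shape, not the detail: the devices hold no policy of their own, so the controller becomes the natural integration point for on- and off-premises resources alike.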


Integrated Systems and Disaggregated Building Blocks

As for new technologies, many types of deployments are coming into the data center, with platforms becoming more optimized around specific workloads, in many cases to improve performance.

In my view, this is seen in the emergence of more integrated systems combining servers, storage and networking features, and in the industry’s move to support organizations like the Open Compute Project, which offers specifications for open-source hardware, resulting in a disaggregated approach to building out data centers. Standard building blocks are assembled, then installed in standard racks, allowing the data center to scale out, extensibly, as needed.
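
As a rough illustration of the building-block approach, here is a toy Python calculation. The node and rack capacities are invented for the example (they are not Open Compute Project figures); the point is that standard units reduce scale-out planning to simple arithmetic.

    import math

    # Hypothetical capacities for one standard node and one standard rack;
    # these numbers are invented for illustration, not OCP specifications.
    NODE_TB = 64          # usable storage per building-block node, in TB
    NODES_PER_RACK = 20   # identical nodes per standard rack

    def racks_needed(target_tb):
        """How many standard nodes and racks cover a target capacity?"""
        nodes = math.ceil(target_tb / NODE_TB)
        return nodes, math.ceil(nodes / NODES_PER_RACK)

    nodes, racks = racks_needed(5000)  # plan for roughly 5 PB
    print(f"{nodes} nodes in {racks} racks")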

Avoiding Chaos in a Time of Change

How can data centers avoid chaos? By simplifying IT, while continuing to manage the IT that is already in place for as long as it is useful and running production systems. In many cases, this may involve consolidating systems that have been deployed over many years on less efficient infrastructure.

As I see it, given this approach to software-defined data centers, here are some top considerations for the data center environment as it evolves to cope with the impact of Cloud Computing, Big Data, Mobility and Social Media (a brief scheduling sketch follows the list):

  • A high degree of virtualization, with 60-80% of servers virtualized.
  • On-prem/off-prem workloads, meaning apps that have links to the cloud or to offsite hosters. More workloads will now be seen as end-to-end, linking mobile phones and smart devices to enterprise systems and cloud systems running in the data center.
  • High-speed network links to move data quickly around the network.
  • High-speed storage to hold Big Data in scalable repositories for analysis.
  • An evolving fabric throughout the data center, allowing servers and storage to be viewed as virtual “pools” of resources.
  • Software-defined infrastructure, including software-defined networking (SDN), software-defined storage (SDS) and software-defined data centers (SDDC).
  • Sophisticated software management tools to orchestrate and manage workloads across the software-defined environment.
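
To show one small piece of what “orchestrating workloads across virtual pools” can mean, here is a deliberately simple Python sketch. It uses first-fit placement over two hypothetical resource pools (one on-prem, one cloud); real orchestration tools weigh QoS, affinity, data locality and network topology, so treat this as an idea sketch, not a design.

    # First-fit placement over pooled resources: a sketch of the idea only.

    pools = [
        {"name": "on-prem-pool", "cpus_free": 64,  "gb_free": 512},
        {"name": "cloud-pool",   "cpus_free": 256, "gb_free": 2048},
    ]

    workloads = [
        {"name": "analytics-job",  "cpus": 48, "gb": 384},
        {"name": "mobile-backend", "cpus": 32, "gb": 256},
    ]

    def place(workload, pools):
        """Put the workload on the first pool with room for it."""
        for pool in pools:
            if (pool["cpus_free"] >= workload["cpus"]
                    and pool["gb_free"] >= workload["gb"]):
                pool["cpus_free"] -= workload["cpus"]
                pool["gb_free"] -= workload["gb"]
                return pool["name"]
        return None  # no capacity anywhere: a signal to scale out

    for w in workloads:
        print(w["name"], "->", place(w, pools))

Run as written, the analytics job lands on-prem and the mobile backend spills over to the cloud pool, which is exactly the on-prem/off-prem fluidity described above.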

A Phased Approach

We know from experience that IT is highly conservative, often putting off technology refreshes for a variety of reasons, including cost and availability concerns. IT managers would rather preserve a production system than take it offline, only to replace it with something unproven or unstable. That being said, change is coming to the data center, driven by the need to update aging systems and by the need to make enterprise data center services as easy to access as services from outside cloud providers.

Given the megatrends forcing change in the data center, I believe that IT will come to see this process of introducing change as thoughtful and necessary: business must proceed even while the infrastructure evolves toward a new level of application abstraction and policy-managed administration, enabled by virtualization of servers, storage and the network itself.

Summary

Updating data centers is no easy matter. Data services cannot be disrupted while thousands of end-users are accessing them. To protect those workloads, new ones must grow up alongside those already in place, and be tested, before they replace the old ones.

Development of applications that support Big Data/Analytics, Cloud Computing, Mobility and Social Media, which is extensively chronicled in the industry, is moving very quickly, much faster than the pace of development for many long-established data center workloads. But IT managers need to manage both types of deployments, new and old, while balancing the pace of change for both.

1 Gartner presentation, “Bi-Modal IT: Managing the Dichotomy in the Data Center,” Ray Paquet, November 2014

2 Gartner presentation, “Non-Stop IT: Delivering the Integrated Data Center,” David Cappuccio, November 2014

 

 
