Data Center Tech Blog

From chatbots to edge computing and serverless computing, here are four key areas contributing to the expansion of cognitive computing and propelling value in a data-driven economy

Shailesh Manjrekar

Director of Product and Solutions Management

Cognitive systems analyze, interpret, reason and learn, augmenting human intelligence without requiring constant human involvement. By understanding the context behind the content, cognitive computing is taking Big Data analytics from data-driven to value-driven. As technological advances unleash more possibilities and opportunities for data, we are able not only to turn data into information, but also to transform it into knowledge and even wisdom.

One of the biggest changes enabling this data revolution is the unprecedented scale, performance and breakthrough economics of high-density storage systems, software-defined storage technology and cloud architectures. I recently wrote about the case for an object storage-based data lake platform and how it can enable new use cases for process discovery, process mining and process optimization. These new storage building blocks let us store, access and transform data at scale like never before, enabling IT resources to launch new applications and services and thrust forward the evolution of cognitive computing.

See how 2.8 billion rows of data flow through the data lake, opening up opportunities for creative intelligence

Cognitive Computing – 4 Areas of Upcoming Change

Relying on the foundation of storage and data at scale, here are four key areas where big changes are happening that will propel cognitive computing forward:

Less Human Interaction

As cognitive computing becomes more mainstream and more mature through deep learning, neural networks, pattern recognition and natural language processing, cobots (collaborative robots) are now performing more tasks than ever. You can see this in manufacturing plants, in virtual assistants, in human-less grocery stores and automated eateries, and, of course, in driverless car services (semi-autonomous driving).

Data-Driven Programming

Modern-day chatbots deliver simulated conversation by leveraging machine learning and picking up on conversational tendencies to mimic human interaction when delivering a service. Chatbots are also growing into a robust tool for gathering business intelligence and advancing artificial intelligence through responsive technology. Many tech platforms already help developers build chatbots, and businesses increasingly deploy chatbots through popular messengers as front ends for cognitive tools. As this evolves, we may see chatbots become the new websites and messaging apps the new browser.
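The request-to-response flow described above can be sketched with a toy intent matcher. Production chatbots use trained language models rather than keyword sets; the intents, phrases and responses below are entirely hypothetical and only illustrate the message → intent → reply pattern.

```python
# Toy intent-matching chatbot sketch. Real systems use trained NLP models;
# these intents, keywords and responses are made-up illustrations.

INTENTS = {
    "order_status": (["order", "shipping", "delivery"],
                     "Your order is on its way."),
    "store_hours": (["hours", "open", "close"],
                    "We are open 9am-9pm, Monday to Saturday."),
}

FALLBACK = "Sorry, I didn't understand. Could you rephrase?"

def reply(message: str) -> str:
    """Pick the intent whose keywords best overlap the message."""
    words = set(message.lower().split())
    best_intent, best_score = None, 0
    for intent, (keywords, _response) in INTENTS.items():
        score = len(words & set(keywords))
        if score > best_score:
            best_intent, best_score = intent, score
    if best_intent is None:
        return FALLBACK
    return INTENTS[best_intent][1]
```

A service built this way can log unmatched messages, which is one simple way chatbots feed business intelligence back to their operators.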

Edge Computing

With some analysts estimating that IoT devices may reach 50 billion by 2020, the bandwidth available to support connected devices will not be sufficient to keep up. Many IoT devices require high-speed data processing and analytics with short response times (think of vehicle-to-vehicle communication, or medicine, where milliseconds matter) that are difficult to meet by sending data to a centralized cloud. Edge and fog computing solve this pain point by placing processing and resources at the edge of the network, close to the devices, while the data itself is still stored in the cloud. This eases the bandwidth problem and enables faster local processing, delivering near real-time analytics.
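A minimal sketch of the edge-side pattern described above: instead of streaming every reading to a central cloud, an edge node aggregates a local window of sensor data and forwards only a compact summary, plus any readings that cross an alert threshold and need a near real-time response. The sensor values and threshold here are hypothetical.

```python
# Edge-side pre-processing sketch (hypothetical sensor feed and threshold).
# The edge node summarizes a window locally and surfaces only alerts and a
# compact summary, rather than shipping every raw reading to the cloud.

from statistics import mean

ALERT_THRESHOLD = 90.0  # assumed domain-specific limit

def process_window(readings):
    """Return (summary, alerts) for one window of sensor readings."""
    alerts = [r for r in readings if r >= ALERT_THRESHOLD]
    summary = {
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
    }
    return summary, alerts
```

Only `summary` (a few fields per window) and `alerts` (usually empty) travel upstream, which is where the bandwidth savings and latency win come from.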

Serverless Computing

Function-as-a-Service (FaaS) and serverless architectures are shifting how companies think about infrastructure: lightweight functions that are triggered by events. Developers can now focus on the application without building it for a specific configuration, scaling it up or down, or spinning up VMs or containers – the platform handles all of this automatically. IoT edge devices and gateways will gain great value from these architectures, here, too, enabled by less human intervention.

More Data and More Value Ahead

Artificial intelligence (AI), machine learning and deep learning are not new, but we can now harness the data these algorithms need to become more intelligent through large data lakes and data oceans, enabled by high-density storage. Algorithms are becoming a commodity; the data that trains them is a strategic asset. GPU computing power can now handle deep networks and assist unsupervised learning, and further momentum comes from the open sourcing of deep learning frameworks, mobile computing, attention marketing and, most importantly, the rise of augmented intelligence.

The power of data is undeniable, and companies are finding ways to generate more value from it through the combination of people and machines. Underlying storage, network and compute infrastructure powers the possibilities of data; choosing the right building blocks is what will open up new opportunities for businesses, services and innovation. With the onslaught of data, density, simplicity and affordability are key. Cost-effective, scalable and powerful solutions – such as high-density Big Data flash platforms like InfiniFlash™ for processing and analytics, or highly scalable modular storage systems like ActiveScale™ systems (with over 19PB raw) for cloud storage – enable businesses to address the growing demands of data and open new possibilities for delivering value. You can read more about why a Big Data flash-based data fabric is an ideal storage layer building block for converged platforms here.

Moving forward, it is important to turn data into information, knowledge and eventually wisdom. We will be able to do this by leveraging technologies like augmented intelligence, deep learning, pattern recognition and natural language processing.
