Data Center Tech Blog
Steve Fingerhut

Vice President of Marketing, Enterprise Storage Solutions

[Image: U.S. city lights from space, courtesy of NASA Goddard Space Flight Center]

SanDisk® at Gigaom Structure Data

Last month I had the opportunity to participate in a Gigaom Research Webinar discussion on “Innovations in flash storage and their impact on analytics,” moderated by Barb Goldworm, president and chief analyst, FOCUS, and joined by panelists Mike Karp, VP and principal analyst, Ptak, Noel & Associates and Ashar Baig, Founder, Analyst Connection.

The message from the panelists was unanimous: data is the most precious commodity for businesses, and flash-based storage helps to speed the process to unlock the value of that data using analytics.

With companies collecting ever more streams of data – extracting value from compliance archives, quickly analyzing log data or consumer transactions – businesses are looking to make faster decisions that drive top-line growth.

Big Data And Big Challenges

The challenges of big data and analytics derive from the sheer scale of data produced (a single passenger airliner engine produces 20TB of data every hour of an international flight) and the need to process data as quickly as possible – for security, safety or strategic business targets (in the oil and gas industry an hour of analytics downtime can cost a company $1 million).

With businesses under pressure to remain competitive and demanding high performance, reliability and faster results, traditional data center infrastructure and budgets are struggling to keep up. A poll of the webinar participants showed that the top concerns for supporting Big Data requirements are:

  1. Response time
  2. Overall throughput
  3. Reliability
  4. Capital costs / Equipment costs of hardware

These answers reflect why flash-based storage is having a profound impact on analytics environments: it delivers superior IOPS and throughput (some SSD solutions even approach DRAM performance) and better reliability (no mechanical parts to fail), while retaining lower power consumption and higher densities.

For IT managers, Big Data is a new territory with rapidly evolving tools. It’s not only about finding the right analytical tools, but also about ensuring that hardware infrastructures can deliver the needed results now, while retaining the ability to scale with steadily increasing volumes of data.

I’ll be in New York this week at the Gigaom Structure Data conference discussing how flash-based storage solutions are helping organizations meet the challenges of big data and analytics while improving Total Cost of Ownership (TCO). My colleague, Hemant Gaidhani, will join me to examine the various solutions our customers are adopting for Big Data projects, and to identify which environments can benefit most from SSDs for better scaling, performance and TCO. If you can’t attend in person, join us on Twitter this Wednesday to follow live Tweets and photos from the show!

 

Flash Solid-State Disks (SSDs): High-Performing, Cost-Effective Solutions for Big Data Analytics

Wednesday, March 19th, 2014 at 10:40 AM
Gigaom Structure Data – Pier Sixty Chelsea Piers – Aquitania West Suite
Steve Fingerhut – VP of Marketing, SanDisk Enterprise Storage Solutions
Hemant Gaidhani – Director of Technical Marketing, SanDisk Enterprise Storage Solutions
