Sitting in the stadium as the San Francisco Giants clinched the National League Championship Series (NLCS) against the St. Louis Cardinals on Thursday night, Oct. 16, it seemed that nearly every one of my bleacher neighbors had a cell phone out, recording his or her own “slice” of the action.
As the winning run crossed home plate and the entire team piled into a group jump in center field, cellphones and iPads with cameras captured the moment in pictures (flash storage!). Professional TV crews captured the moment in 3D video (flash storage, again!). Video streaming brought the images to home viewers (Flash! Flash! Flash!).
Everywhere you looked, the moment was being recorded and the video transmitted, starting at AT&T Park, where fireworks were going off behind the scoreboard – and radiating outward, across the city and around the world.
The Chain of Events
I know this: this chain of events could not have happened without flash storage in smartphones, which allows multiple gigabytes of video to be captured in the thin confines of a cellphone – and multiple terabytes of video to be stored in 4K high-definition cameras.
Once captured, the video content had to be transmitted through a series of “content depots” that passed the content along – at high speeds – to be broadcast on the Internet, and over TV networks and cable TV distribution systems. It turns out that “edge” servers, which relay content to broadcast centers, often have flash on board to expedite transmission of video’s large datasets.
In that moment of the Giants’ pennant win, just after 8 PM Pacific Time, the story of flash became instantly clear. Flash enabled the capture of the video, flash enabled the transmission of the video on ESPN, Fox and other networks, flash enabled the data-center processing by servers and storage – and flash enabled the capture for replay or archiving.
All of this digital-content transmission underscored the video lifecycle, which continues daily and worldwide – though rarely at the excitement level of this playoff game, which sent the San Francisco Giants of the National League into this week’s World Series against the Kansas City Royals of the American League.
Using flash, the fans in the stadium captured the sights and sounds of the fireworks, the team high-fives on the field, and the screams and yelps of the excited crowd in the stands. In that sense, the fans have become the broadcasters of their “slices” of the event. That moment of the final home run alone generated enormous amounts of video – not to mention the footage captured by all of the digital content providers and all of the media in the stadium’s press box and on the field.
After the initial capture, images have to be sent somewhere, often pausing at “edge servers” on their way to central-site data centers at the cell-phone companies and service providers. At IDC’s Directions conference in March 2014, IDC Research Vice President Rick Villars said that edge servers are growing rapidly, positioned at the “edge” of a network to bring content closer to where it is ultimately consumed. Supporting them are servers in the central data center, which relay the video content and distribute it out to the edge.
Data Center Servers and Storage
Flash storage is widely recognized for its compact form factors, its high capacity levels – and its ability to accelerate performance. Its ability to “cache” large amounts of data is well known, and is leveraged for transactional workloads and database updates. It is also less expensive than DRAM, a factor that makes flash a natural fit for technology refresh cycles aimed at improving system performance and data capacity.
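The caching role described above can be sketched in a few lines of code. The sketch below is purely illustrative (not SanDisk code): a hypothetical read-through cache in which a small, fast tier – standing in for flash – fronts a larger, slower backing store, so repeated reads are served without touching the slow tier. All class and variable names are invented for this example.

```python
# Illustrative sketch: a read-through cache where a fast tier (think flash)
# fronts a slower backing store (think spinning disk). Names are hypothetical.

class ReadThroughCache:
    def __init__(self, backing_store, capacity):
        self.backing_store = backing_store  # slow tier: full dataset
        self.cache = {}                     # fast tier: hot subset
        self.capacity = capacity            # fast tier is smaller than the dataset
        self.hits = 0
        self.misses = 0

    def get(self, key):
        if key in self.cache:               # hit: served from the fast tier
            self.hits += 1
            return self.cache[key]
        self.misses += 1
        value = self.backing_store[key]     # miss: slow read from backing store
        if len(self.cache) >= self.capacity:
            # simple eviction: drop the oldest-inserted entry to make room
            self.cache.pop(next(iter(self.cache)))
        self.cache[key] = value             # populate the fast tier for next time
        return value

# Tiny demo: a "video library" on the slow tier, a 10-entry fast tier in front.
disk = {f"frame-{i}": f"video-data-{i}" for i in range(100)}
cache = ReadThroughCache(disk, capacity=10)
for key in ["frame-1", "frame-2", "frame-1"]:
    cache.get(key)
print(cache.hits, cache.misses)  # the repeated frame-1 read is a hit: 1 2
```

The same pattern applies whether the fast tier is DRAM in front of flash or flash in front of disk; the economics change (flash is cheaper per gigabyte than DRAM), but the read-through logic is identical.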
SanDisk®’s repeated testing of data center workloads shows that flash storage accelerates many workloads in the data center, from financial applications and online transaction processing (OLTP) to high-performance computing (HPC). In a wide range of workload areas, flash SSDs can dramatically accelerate job completion. To drive home that point, SanDisk has posted many technical white papers with the results of these benchmarking tests here: http://www.sandisk.com/enterprise/resource-library/resources/
Bringing it All Together
In the video lifecycle, some processes are CPU-centric (rendering), while others are I/O-centric (playback, editing and data transmission). Capturing the video when it’s created is important – but it is only the first step in the process. For playback and media editing, mixed-use storage modes are prevalent, while read-intensive modes are used for media streaming to workstations, servers and high-capacity data repositories. Delivering the finished video via media streaming, and archiving it, completes the cycle.
What began with consumer devices—with flash a key component of the lightweight, portable products—has now come to the data center. Flash-enablement is already firmly established in the racks upon racks of servers in the Internet-centric data services delivered by enterprise and hyperscale data centers.
The impact of flash-enabled data processing is this: a smaller server footprint will be able to handle increasing amounts of digital content. Sporting events like baseball’s World Series and American football’s Super Bowl present variable workloads – filled with ups and downs as more viewers access on-demand video services during the games. These data centers and edge servers will have more video-serving capacity than ever before. That’s a fundamental change in the data center world that will allow it to cope with the data deluge – and with the event-driven videos that can be accessed anywhere, anytime, by a growing variety of consumer devices.
With flash, whether I’m at the event in person, as I was at the San Francisco Giants’ game last week, or viewing it on video from my home, I’m sure to get a rich fan experience that’s crisp, instantaneous and engaging. This multi-screen digital content creates a wraparound viewing environment that brings the game from the field to the living room, to the coffee shop, or wherever I happen to be when I tune in.