Better Together: Thoughts on Flash, Big Data and the Cloud as Fusion ioMemory Joins the SanDisk® Family

Coming into SanDisk® from the Fusion-io team has been an incredible experience. I've gone through several acquisitions in my career, and I'm impressed with the speed and scope of this integration: from the announcement to its swift closure, and now to navigating an integrated roadmap, aligning all the individual pieces into well-matched teams pulling toward a joint, larger goal. The pace has been amazing.

Fusion ioMemory and SanDisk – Powering Millions of Users Around the World

SanDisk consumer products are trusted around the world by millions of people. It is easy to assume that only retail products achieve this kind of scale and reach. In the enterprise sector, however, our hyperscale customers reach at least as many end users, and so do we, by supplying the infrastructure that accelerates the cloud services behind their smartphone apps.

Fusion ioMemory technology was developed out of the need to increase performance for these scale-out architectures. For the companies enabling some of the largest social networks and platforms across the globe, accelerating data has taken on entirely new dimensions because of the sheer number of users engaging on these hubs. At a scale of millions, and in some cases billions, of users, simple tasks such as updating a personal security preference on Facebook or delivering tailored music suggestions on Spotify become workloads of a magnitude that simply didn't exist before. High-speed logging, high-speed indexing and efficient compression are among the key requirements that emerged in these scale-out architectures, and the same demands are now expanding to enterprises as well.

Within this context, Fusion ioMemory technology fits perfectly with SanDisk’s vision of the flash-transformed data center.

SanDisk is committed to delivering innovative solutions for new IT demands, built on the benefits of NAND technology, while keeping costs down, improving efficiency, and delivering the performance and endurance required for critical applications.

The Software-Defined Shift

As we examine what's happening today across the entire stack that connects applications and infrastructure, we can see the emergence of a software-defined shift. Much of the challenge in delivering a better end-user experience has been that the concept of application acceleration has remained too narrow. Developers have focused on the application code itself, trying to make the code run faster. Our view of application acceleration is much broader and looks at the entire stack. For example, we ask ourselves: what does an application on your smartphone actually care about in the cloud? The truth is that it's not just the code; it's hardware latency that's holding it back. To overcome latency and deliver optimal performance, application developers need to drive new levels of efficiency into their infrastructure. That means making data-tier software aware of how the storage works natively, and providing ways for applications to tell the infrastructure how to better handle the workload.

For example, with software-defined networking (SDN) we understood that switches had unnecessary mechanisms, and related costs, that hyperscale customers did not want and preferred to bypass. A similar idea has now moved into storage. Building upon the strong capabilities of solid state memory, our integrated MemoryWare™ exposes interfaces that raise the level of abstraction offered by storage devices and systems, so applications can express to the storage system the intended semantics of their input/output operations and thereby exploit the performance and flexibility of flash memory. A case in point is Fusion ioMemory's Atomic Writes API. It pushes the performance of MySQL databases higher by letting developers exploit native properties of ioMemory that are normally hidden when flash memory is used as a basic, no-frills block device.
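
To make that concrete, here is a minimal sketch, and emphatically not the ioMemory SDK itself, of why atomic multi-page writes matter to a database such as MySQL. InnoDB normally guards against torn pages with a doublewrite buffer, writing every dirty page twice, whereas a device that guarantees a whole batch of pages lands atomically makes the second copy unnecessary and roughly halves write traffic. The atomic_write_pages() helper below is hypothetical and is simulated with an ordinary vectored write; on hardware exposing atomic-write primitives it would map to a single vendor library call.

```c
/*
 * Illustrative sketch only -- not the ioMemory SDK. It contrasts
 * InnoDB-style doublewrite protection against torn pages with a single
 * atomic multi-page write. atomic_write_pages() is hypothetical and is
 * simulated here with one vectored write.
 */
#define _GNU_SOURCE
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/uio.h>
#include <unistd.h>

#define PAGE_SIZE          16384              /* InnoDB default page size   */
#define DOUBLEWRITE_OFFSET 0                  /* scratch (doublewrite) area */
#define DATA_OFFSET        (2L * 1024 * 1024) /* final location of pages    */

/* Traditional path: write the pages to a scratch area first, sync, then
 * write them to their final location -- every page hits the device twice. */
static int doublewrite_pages(int fd, const void *pages, size_t n, off_t dest)
{
    if (pwrite(fd, pages, n * PAGE_SIZE, DOUBLEWRITE_OFFSET) < 0) return -1;
    if (fsync(fd) < 0) return -1;
    if (pwrite(fd, pages, n * PAGE_SIZE, dest) < 0) return -1;
    return fsync(fd);
}

/* Atomic path (hypothetical): a device guaranteeing that a batch of pages
 * lands completely or not at all removes the scratch copy -- and with it
 * roughly half the write traffic. */
static int atomic_write_pages(int fd, const void *pages, size_t n, off_t dest)
{
    struct iovec iov = { .iov_base = (void *)pages, .iov_len = n * PAGE_SIZE };
    if (pwritev(fd, &iov, 1, dest) < 0) return -1;
    return fsync(fd);
}

int main(void)
{
    int fd = open("tablespace.img", O_RDWR | O_CREAT, 0644);
    if (fd < 0) { perror("open"); return 1; }

    void *pages = calloc(4, PAGE_SIZE);        /* four dirty 16 KB pages */
    if (!pages) { close(fd); return 1; }
    memset(pages, 0xAB, 4 * PAGE_SIZE);

    doublewrite_pages(fd, pages, 4, DATA_OFFSET);  /* 2x write amplification */
    atomic_write_pages(fd, pages, 4, DATA_OFFSET); /* 1x with atomic writes  */

    free(pages);
    close(fd);
    return 0;
}
```

The point of the sketch is the contrast in I/O pattern: the doublewrite path issues two writes and two syncs per batch of pages, while the atomic path issues one of each.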

As we move forward to deliver even more powerful and efficient solutions, this expressed intent from software will play an even greater part in how our hardware solutions operate. SanDisk will continue to be a leader and innovator in this space, specifically because our vertical NAND foundry integration allows us to best understand and expose the application-relevant characteristics of our devices and systems.

The Emergence of Big Data

As I look at where flash technology is heading, I see a huge need for flash in Big Data and analytics. This is a different space from transactional databases or virtualization, where the application of flash technology is already yielding new solutions. Big Data analytics isn't concerned with the number of operations generated by a large number of simultaneous users, but rather with petabytes, and eventually exabytes, of data that need to be analyzed as a single image. In fact, IDC has stated that our digital universe will double every two years. As we contend with this mass of data, and as companies seek to leverage it for more insight into business and consumer decision processes, I think flash will have a huge impact on the underlying infrastructure of these systems.

If we look at servers running either batch analytics (such as Hadoop) or real-time analytics (such as Cassandra), the challenge we face is how to improve the structure of these systems. The culture, especially in batch analytics, has been to keep things as cheap as possible because of the huge scale and capacity needs. But the real-time technologies work by holding and processing more and more of their data in memory, and that is costly.

Therefore, in servers used for analytics, a large portion of the capital is spent on DRAM. Using flash to displace or extend DRAM can be a much more cost-effective approach than relying purely on DRAM as an in-memory data store. In many situations, flash memory can effectively displace costly DRAM through a smart combination of hardware and software and relieve significant budget pressure. SanDisk's ZetaScale is a great example in the Big Data space of how software optimization lets flash deliver far better price/performance than a DRAM-only approach.
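
As a rough illustration of that tiering idea, and not ZetaScale's actual API, the sketch below keeps a small hot set in DRAM and serves everything else from a flash-backed file. Flash reads cost tens of microseconds rather than DRAM's nanoseconds, but the capacity-per-dollar advantage lets a far larger working set sit close to the application. All names, sizes and the fixed-slot file layout are illustrative assumptions.

```c
/*
 * Minimal sketch only -- not ZetaScale's API. A key-value store that keeps
 * a small, fixed-size "hot" table in DRAM and spills the full data set to
 * a flash-backed file, falling back to flash on a cache miss.
 */
#include <stdio.h>
#include <string.h>

#define CACHE_SLOTS 1024   /* tiny direct-mapped DRAM tier             */
#define VALUE_SIZE  64     /* fixed-size values keep the sketch simple */

struct slot { long key; char value[VALUE_SIZE]; int valid; };

static struct slot dram_cache[CACHE_SLOTS]; /* DRAM tier                     */
static FILE *flash;                         /* flash tier: a file on the SSD */

/* Write-through put: persist to flash, keep a copy in the DRAM tier. */
static void kv_put(long key, const char *value)
{
    fseek(flash, key * VALUE_SIZE, SEEK_SET);
    fwrite(value, 1, VALUE_SIZE, flash);

    struct slot *s = &dram_cache[key % CACHE_SLOTS];
    s->key = key;
    s->valid = 1;
    memcpy(s->value, value, VALUE_SIZE);
}

/* Get: serve hot keys from DRAM (nanoseconds); on a miss, read the value
 * back from the flash tier (tens of microseconds). */
static int kv_get(long key, char *out)
{
    struct slot *s = &dram_cache[key % CACHE_SLOTS];
    if (s->valid && s->key == key) {            /* DRAM hit  */
        memcpy(out, s->value, VALUE_SIZE);
        return 0;
    }
    fseek(flash, key * VALUE_SIZE, SEEK_SET);   /* DRAM miss: go to flash */
    return fread(out, 1, VALUE_SIZE, flash) == VALUE_SIZE ? 0 : -1;
}

int main(void)
{
    flash = fopen("store.dat", "w+b");
    if (!flash) { perror("fopen"); return 1; }

    char buf[VALUE_SIZE] = "user:42 -> playlist preferences";
    kv_put(42, buf);

    char out[VALUE_SIZE];
    dram_cache[42 % CACHE_SLOTS].valid = 0; /* simulate eviction: force the flash path */
    if (kv_get(42, out) == 0)
        printf("read from flash tier: %s\n", out);

    fclose(flash);
    return 0;
}
```

In a real system the DRAM tier would be a proper cache with eviction and the flash tier a log-structured or B-tree store; the sketch only shows where each tier sits on the latency and cost curve.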

The Future

With a broad portfolio of enterprise flash solutions, I see the combined knowledge and solutions of the Fusion-io and SanDisk teams as our greatest strength.

Together we look forward to pushing flash adoption further into the enterprise, and innovating to further expand the possibilities of flash-based storage.

We’ve recently launched the SanDisk Technology Council to bring our collective wisdom to bear on strategic decisions about hardware, software and systems technologies across all of SanDisk’s technology and architecture pillars – from memory technology to software, including workloads & architectures, embedded and discrete controllers, devices and systems.

As we move forward, I’ll be sharing more of our discussions and vision for flash technology and future data storage on this blog.
