NetApp stitching it all together

Recently I wrote about how NetApp were showing a commitment to helping businesses adopt new technologies by lowering the barrier to transformation. That was not the only NetApp news in June: on June 18th they made a wide-ranging set of product and strategy announcements, which came alongside analyst firm IDC re-categorizing its definition of the hyper-converged infrastructure (HCI) market to include disaggregated systems such as NetApp’s HCI platform.

In this article, I want to offer a view on these announcements and the part they can play in a wider data strategy, helping to meet some of the demands of today’s enterprise more effectively.

At the heart of it

When NetApp introduced their “data fabric” vision it was little more than a philosophy, driven by the understanding that they could not thrive if they remained solely a provider of storage arrays. The data market was changing significantly: ever-growing volumes of increasingly crucial data needed to be stored in multiple locations while maintaining control, security and insight, all while embracing public cloud. For NetApp it was adapt or become extinct.

This has been NetApp’s direction ever since, and the June announcements underlined not only their commitment to that strategy but also their intent to provide the technology to deliver it.

Our Challenge

The use of public cloud inside the enterprise has changed our perception of how we want to experience IT. Business leaders now expect cloud flexibility and efficiency across their infrastructure and will no longer tolerate lengthy, expensive projects to deliver a new service; they want to meet business demands quickly and in true cloud style.

What do we need to meet this challenge?

  • Consistent services across public clouds and on-premises
  • Cloud-like infrastructure-as-a-service delivery on-premises
  • Faster delivery of new services and applications through automation and integration

Achieving this is not a trivial task; deploying infrastructure across diverse locations and platforms presents management challenges for the platform itself and for the data and applications we place on it.

This is the aim of NetApp’s data fabric: to bring consistency across multiple locations, delivering portability of data and applications and a true cloud-like experience across the entire infrastructure.

The How

Prior to these announcements NetApp already had strong capabilities in this area. ONTAP and Cloud Volumes provided consistency of service on-premises, near the cloud and in the public cloud. Features like FabricPool allowed cloud integration directly into an on-premises production array, and their HCI platform offered a cloud-integrated endpoint via the NetApp Kubernetes Service.

These recent announcements build comprehensively upon this foundation, and two areas in particular highlighted for me how they can help us stitch together an even more effective fabric.

Cloud Volumes allows us to deploy ONTAP-driven infrastructure in the public cloud, either as an overlay to native storage providing rich data management, or as a service delivering a file-based environment running on NetApp platforms within the hyperscale providers’ datacentres. It has had two major extensions: first, into Google Cloud, giving consistent data services across the three major providers, and second, the potentially more interesting announcement of Cloud Volumes on-premises.

I mentioned earlier IDC’s recategorization of HCI platforms. Although it gives NetApp validation of its place in the HCI market, I don’t believe the “traditional” HCI market is actually their focus. Cloud Volumes on-premises is, however, a much clearer indication of how they see this platform: as a seamlessly integrated endpoint in your cloud infrastructure to which you can easily move native cloud workloads.

It is not that capability alone that is most interesting; AWS and Azure already have platforms that can sit in your datacentre and integrate with their clouds. But let’s think more broadly: what if you wanted your on-premises environment to be the common platform between multiple providers, orchestrating the relationship between them and your existing on-premises infrastructure? That is exactly what NetApp are doing, and it is extremely interesting.

The missing piece?

As much as this has the potential to change the way we consume public cloud and integrate it with our own environments, what often makes adopting such multi-location solutions difficult is a lack of tools to manage the integration, service consumption and data portability the underlying infrastructure provides.

This is where NetApp perhaps delivered their most important announcement: Fabric Orchestrator, a solution designed to meet this very need. It isn’t a product to be sold but a toolset that provides a single management platform for your environment; it can discover and manage your entire data fabric and orchestrate management workflows, all within a single interface.

The demands of many modern businesses are challenging IT leaders to provide a cloud-like experience across the business, to be outcome focussed, and to deliver automation, mobility and scale. As we’ve discussed, this is not a simple task.

If Fabric Orchestrator delivers, it can be a significant innovation that overcomes many of the complexities of operating a multi-location environment, making it easier to adopt and exploit.

What NetApp is delivering with these innovations is not only a vision; they are backing it with technology that can actually allow you to deliver your very own data fabric and produce the outcomes modern business demands.

Where NetApp head next is going to be intriguing, as is seeing how the rest of the industry follows, and follow they will need to, before NetApp start to extend their capabilities beyond their own technology and allow us all to build a truly data-focussed platform where the underlying technology becomes less relevant and data truly becomes king.

For more detail on these wide-ranging NetApp announcements, here’s a selection of other posts you may find interesting:

Bringing data fabric to life with support for hybrid multi-cloud and devops

The season for the data fabric

It’s HCI, Jim, but not as we know it
