It’s a fair question, and when I think about many of the meetings I have, if you cut to the heart of what the person I’m meeting with is actually thinking, then on many occasions this is probably it.
So who are we and why should you care? NetApp is a Hybrid Cloud Data Services company: we enable companies to have their data and applications in the right place at the right time with the right characteristics, so they can accelerate innovation and find new insights. We do this by helping them build their data fabrics.

We are focused on three areas. The first is developing what we believe to be the best, most advanced and most cloud-integrated physical platforms to cover the more traditional workloads that companies want to keep on premises, whether that’s our industry-leading All Flash FAS (AFF) platform, our FlexPod Converged Infrastructure (CI) solution with Cisco, or advanced extreme-performance technologies such as MAX Data.
Then there are the technologies that enable companies to build their Private Cloud platforms, using our HCI or StorageGRID solutions.
And finally, we have deployed many of our solutions directly into the major Hyperscalers: AWS, Azure and GCP. This provides the enterprise-grade performance and protection that companies need in order to run their more critical applications in the Cloud. For example, we have a fully certified solution for running SAP in Azure, with our storage service (Azure NetApp Files) as the underpinning data platform.

All of these are effectively end points where data and applications can live; it’s about choosing the right ones, with the right characteristics, for what you want to achieve.
Now that you are able to choose the end points that are right for you, what if you could very simply discover them all, regardless of whether they are on premises or in any one of the Cloud providers? With them all discovered, you can associate metadata with them and apply policy-based capabilities. For example, you could assign ‘Tier 1’ as metadata to all of the end points that need the highest level of protection, then create a single protection policy and apply it to every volume associated with ‘Tier 1’, regardless of where those end points reside.
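To make the metadata idea a little more concrete, here is a minimal, purely illustrative Python sketch of the concept. The names and structures below are my own invention for this post and don’t correspond to any real NetApp API; they simply show what “tag the end points, then apply a policy by tag” looks like in code.

```python
# Purely illustrative sketch: model discovered end points as simple records,
# tag them with metadata, and apply a protection policy to everything
# carrying a given tag. None of these names are a real NetApp API.
from dataclasses import dataclass, field


@dataclass
class Endpoint:
    name: str
    location: str                      # e.g. "on-premises", "azure", "aws"
    tags: set = field(default_factory=set)


@dataclass
class ProtectionPolicy:
    name: str
    snapshot_schedule: str             # e.g. "hourly", "daily"


def apply_policy_by_tag(endpoints, tag, policy):
    """Find every end point carrying the tag and apply the policy to it."""
    matched = [ep for ep in endpoints if tag in ep.tags]
    for ep in matched:
        print(f"Applying policy '{policy.name}' ({policy.snapshot_schedule}) "
              f"to {ep.name} in {ep.location}")
    return matched


endpoints = [
    Endpoint("vol_finance", "on-premises", {"Tier 1"}),
    Endpoint("vol_analytics", "azure", {"Tier 1"}),
    Endpoint("vol_archive", "aws", {"Tier 3"}),
]

tier1_policy = ProtectionPolicy("tier1-protection", "hourly")
apply_policy_by_tag(endpoints, "Tier 1", tier1_policy)
```

The point is that the policy is defined once and follows the tag, not the location; the on-premises volume and the Azure volume get the same protection without being handled separately.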
With the end points discovered you can also start to integrate them. Maybe you have an on-premises volume residing on our AFF platform; what if you could simply create a connection from it to a volume residing in Azure, giving you DR, test and development, or even just a simple way to migrate the data?
Now we can optimise them: we can monitor how quickly they are growing and how much data could be archived, and we can alert you to how you could continuously optimise the characteristics of your data fabric as it evolves, even giving you the information you need to work out whether the current location for the data is the most cost effective.
With this foundation in place you can now start to layer services over it. From a security perspective we can show you who is accessing data across the fabric, alerting you to behaviour that may be unusual, and we’re continuing to build out our capability to provide data compliance reports.

Think about this: you’ve discovered all of the end points within your fabric, then someone copies data from an on-premises area to Azure. First, we can alert you to the fact that this activity has just happened; then we can let you run a report to identify whether the file that was moved could expose you to a compliance risk or breach.
Our NetApp Kubernetes Service (NKS) enables you to build Kubernetes clusters in just a few clicks that can span any or all of the major Hyperscalers (AWS, Azure or GCP) and, should you choose, your on-premises HCI platform, attaching persistent storage if required using the Trident storage orchestrator.
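For anyone curious what that last part looks like in practice, here is a minimal sketch using the official Kubernetes Python client to request Trident-provisioned storage from one of those clusters. It assumes a Trident-backed StorageClass called "ontap-gold" already exists in the cluster; that class name, the namespace and the claim name are illustrative assumptions, not something NKS creates for you by default.

```python
# Minimal sketch: request a persistent volume from a Trident-backed
# StorageClass using the official Kubernetes Python client.
# Assumes a StorageClass named "ontap-gold" exists (illustrative name).
# Requires: pip install kubernetes
from kubernetes import client, config

# Use the kubeconfig for the NKS-provisioned cluster.
config.load_kube_config()

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="demo-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="ontap-gold",   # hypothetical Trident-backed class
        resources=client.V1ResourceRequirements(requests={"storage": "10Gi"}),
    ),
)

core_v1 = client.CoreV1Api()
core_v1.create_namespaced_persistent_volume_claim(namespace="default", body=pvc)
print("Requested Trident-provisioned persistent volume claim: demo-data")
```

The nice thing is that once Trident is in place, persistent storage is requested the same way as any other Kubernetes storage claim, whichever cloud or on-premises end point it ultimately lands on.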
Partners can also layer their services onto your Data Fabric, as the entire environment is built using open APIs; if there are capabilities you require for the data and applications across the end points, it’s simple to layer these on.
I have this kind of discussion with nearly all the companies that I meet with, and it always seems to resonate well. What NetApp does now is a far cry from the storage company I joined 15 years ago.
Storage devices are, for now, still important, but the move to the Cloud is accelerating. I’m glad we made such a huge investment, more than five years ago now, in building out our Cloud Data Services capabilities.
If we were still purely a storage company and only now starting to think about the Cloud, then I’d be concerned.
Thanks Matt, been looking for something like this to share. It’s easy to digest, well articulated and open to further discussion.
Thank you!
Thanks for the comment David, I wrote this off the back of a few recent meetings. It did lead to some really good further discussions.
Matt
Many thanks Matt, a very helpful and easy-to-consume summary of NetApp. It’s always hard to find a comprehensive and up-to-date view of the company given the never-ending pace of innovation.
Thanks for the comment Thomas, I’m always trying to find the balance between telling a cohesive story whilst not going into too much detail about any individual piece. Like you say, this is quite a challenge when new products and innovations are coming so fast right now. It’ll be interesting to look back on this in six months, and then twelve, to see how quickly it becomes out of date.
This is a really good explanation of how NetApp is taking the underlying fabric and building it into a true data platform, one that provides intelligence, insight and automation, all of which are core tenets of a modern data strategy.
I just noticed your comment hit my spam folder Paul, not sure why, as it’s always good to hear from you. It feels like we hit a tipping point last year, where we went from a number of pieces that people could use for parts of their data fabric to something much more clear and understandable. Now we just have to make sure people are aware of what we can do.