The use of Public Cloud continues to grow, with organisations of all types moving data and services into SaaS and infrastructure platforms from the likes of Microsoft, Amazon and IBM. However, what doesn’t seem to be happening is the wholesale migration of entire infrastructures, whether because of complexity, cost or lack of suitability. So, if it’s not wholesale migration, what is driving the continued growth?

I believe much of this growth is driven by a more tactical approach to Cloud. This growing maturity in the way we view Cloud adoption shouldn’t be a surprise; for those familiar with the Gartner Hype Cycle, it’s an example of a technology moving from the peak of inflated expectations to the plateau of productivity. Cloud is reaching that end of the cycle now. We have moved on from the “Cloud first” mandate to asking a more reasonable question: “can we do this in the cloud, and should we?” (this is well argued in a blog post from Matt Watts here). We have learned how an unplanned and poorly thought-out move to Cloud can be counter-productive and expensive, and we are replacing it with a more measured and tactical view of how we use it.
Why Tactical Use?
This greater maturity is helping us understand how the core benefits of Public Cloud, its scale, agility and commercial flexibility, can become an integral part of a modern data architecture, and to appreciate that when Cloud is deployed appropriately it can deliver significant benefits.
The Technology Vendor Role
It is not just our own maturity that is growing, but also that of the technology vendors, who are equally seeing where Cloud can help them deliver solutions that meet the demands of their customers more effectively, and how the seamless integration of its scale and flexibility can deliver significant benefit.
The seamless nature of that integration is key to allowing organisations to adopt Cloud; if we can integrate it directly into our “traditional” environments without the need for a major upheaval, Cloud becomes extremely attractive.
The Case for Cloud Storage
When considering tactical adoption of Cloud services, storage makes a strong candidate: it provides scale and flexibility and is commercially attractive. However, native Cloud storage can present significant adoption challenges, as it tends not to operate the way our enterprise SAN and NAS solutions do, lacking storage efficiencies and integration with our on-prem enterprise data management tools, which can often lead to additional cost and risk.
However, if we can overcome those challenges, storage can provide an excellent tactical use of Cloud. There is no disagreement that our data is crucial, but the amount we keep and how long we keep it is a challenge, whether that’s production storage, backups or archive. While having our data available quickly when it’s needed is a necessity, there is significant potential benefit in using the almost limitless scale and low cost of public Cloud storage to hold certain types of data.
Let me illustrate this effective use of Cloud storage with two examples I’ve recently used:
NetApp FabricPool
The cost of storing data remains a significant part of our already stretched IT budget. While housing key production data on a high-performance flash-based array is attractive, paying to store rarely accessed data or snapshot copies on that same production flash tier is much less so.

NetApp has recognised how Cloud can help reduce that cost with the introduction of its FabricPool technology. While we may not want to go all in with Cloud-based storage, the idea that little-used or backup data can be seamlessly moved into a scalable, low-cost Cloud back end makes great sense. This is what FabricPool delivers: it integrates Cloud storage directly into your production aggregate and manages the movement of data based on a few simple user-defined policies, which makes for an extremely attractive proposition.
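To make that concrete, here is a minimal sketch of what attaching a cloud tier looks like from the ONTAP command line, assuming ONTAP 9.2 or later, an existing all-flash aggregate and an S3 bucket; the object-store, aggregate, SVM and volume names are placeholders:

```
# Register the S3 bucket that will act as the cloud tier
# (object-store name, bucket and keys below are placeholders)
storage aggregate object-store config create -object-store-name s3-tier \
    -provider-type AWS_S3 -server s3.amazonaws.com \
    -container-name fabricpool-bucket \
    -access-key <access-key> -secret-password <secret-key>

# Attach the cloud tier to an existing all-flash aggregate
storage aggregate object-store attach -aggregate aggr1 \
    -object-store-name s3-tier

# Tier cold snapshot blocks from a volume; "auto" and "none" are
# among the other tiering policies available
volume modify -vserver svm1 -volume vol1 -tiering-policy snapshot-only
```

Once the object store is attached, ONTAP moves cold blocks to the bucket automatically; the per-volume tiering policy is effectively the only decision the administrator has to make.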
Veeam Cloud Tier
The next example comes from availability vendor Veeam. While you may want to keep your most recent backup datasets relatively near your production environment, is housing dozens of terabytes of storage within your datacentre really the best way to hold longer-retained backups, those you may keep for a month, a year or maybe forever?
Even relatively “cheap and deep” on-prem storage still has a cost; purchasing, support and maintenance all add up over the life of your backup repository.

Cloud storage is perhaps the perfect home for long-term copies of backup data, but integrating it can be difficult. Veeam Backup and Replication Update 4 introduces Cloud Tier, which integrates Cloud object storage directly into your B&R infrastructure. Backup data is moved seamlessly into the public Cloud via a few simple policies and intelligently retrieved only when needed. No major adjustments are required: the backup administrator sees the Cloud storage in an interface they are used to, delivering non-disruptive integration with the ultimate cheap and deep storage location for our backups, saving capacity and reducing cost in the primary backup repository.
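For illustration, here is a hypothetical sketch of that setup using Veeam’s PowerShell snap-in. The repository names are invented, and the cmdlet and parameter names reflect my understanding of Update 4, so treat them as assumptions to verify against Veeam’s documentation:

```
# Placeholders throughout; verify cmdlet and parameter names against
# the Veeam Backup & Replication Update 4 documentation.

# The existing on-prem repository becomes the performance tier
$extent = Get-VBRBackupRepository -Name "Local Repo 01"

# An object storage repository already registered in B&R (the capacity tier)
$objectStore = Get-VBRObjectStorageRepository -Name "S3 Capacity Tier"

# Build a Scale-Out Backup Repository and move backups older than
# 30 days out to the object storage capacity tier
Add-VBRScaleOutBackupRepository -Name "SOBR 01" `
    -PolicyType DataLocality `
    -Extent $extent `
    -EnableCapacityTier `
    -ObjectStorageRepository $objectStore `
    -OperationalRestorePeriod 30
```

The operational restore window is the only real policy decision: anything newer stays on the local extent for fast restores, anything older is offloaded to the object store.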
Summary
What’s important about both of these approaches is the simplicity and non-disruptive nature of their deployment: nothing forces a shift of production data to a cloud tier, there is no new interface to learn and no part of our infrastructure to refactor. Instead, they seamlessly integrate the flexibility of Cloud exactly where it can be effective, without trying to enforce its use where you may not want it.
This kind of thinking from leading vendors, alongside a more mature and measured approach to Cloud adoption, is driving this more tactical use. It will be interesting to see how tactical adoption of Cloud continues to develop and what other good examples we see in the future.
You can find out more about NetApp FabricPool here and Veeam Cloud Tier here.