Exciting times are here for those of us with an absolute passion for infrastructure - continued advancements in the virtualization of x86 workloads, software-defined storage and networking, coupled with management, automation, security and compliance.
But let's not lose sight of the purpose of the infrastructure within this wave of innovation – it's there to run applications!
Paul Strong, our CTO of Global Field for VMware, kicked off TAM Day 2013 in San Francisco with an Executive Keynote and addressed the concepts behind the software-defined data center and the relationships with the application workloads.
Running applications can be challenging. Each application has to be purchased or written, architected, integrated and tested. If it’s critical to the business or the mission, the application will be thoroughly and repeatedly tested. Then it needs to be deployed, managed, secured and monitored. Backup, recovery and business continuity plans are also needed. And if you are really dedicated, you will practice actual failover between sites rather than academic paper-based exercises.
Oh no, you're not done just yet. Applications also need people with the right skills to maintain the application lifecycle and its operational processes, including patching and upgrading.
And that's just a single application. How many applications are in a typical enterprise? Some have hundreds, some thousands! It is this large number of applications and their diversity that drives complexity in IT, and thus cost. And it is diversity that has made automated management so hard – diversity in terms of applications and in terms of infrastructure – and both change frequently.
Minimizing the number of patterns is what allows cloud providers to achieve economies of scale, or what we could think of as economies of simplicity. Minimize the number of patterns deployed and in so doing you are able to massively automate the management of applications. Automation improves reliability, reduces cost and enables agility.
While enterprises are unable to reduce the number of applications they manage down to single digits, they can certainly source some of their applications as SaaS. But what about the rest? Well, it turns out that if we make all of the applications look the same from a management perspective, then we can get most, if not all, of the benefits of simplicity and automation.
And it is this principle that lies at the heart of a software-defined data center.
The software-defined data center is about two verbs – virtualize and automate. Virtualization separates applications from the physical infrastructure, placing them in simple containers (virtual machines or virtual data centers). With the applications in containers, you can isolate them from each other. You can move them from low-capacity to high-capacity infrastructure, and back. You can move them from a failing machine to a working machine. You can move them from a machine that needs maintenance to one that does not. You can move or replicate them across data centers for business continuity. You can move them out to the cloud to burst capacity or to test. Separating applications from the infrastructure allows you to do all of this, and virtualization is the means of achieving it. Placing your applications in these containers is the key to simplification.
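To make the principle concrete, here is a minimal, purely illustrative sketch (the class and host names are hypothetical, not any VMware API): once every application sits in an identical container, a single generic automation routine can evacuate, rebalance, or fail over all of them, regardless of what each application actually is.

```python
from dataclasses import dataclass

@dataclass
class Container:
    """Hypothetical uniform wrapper (think: a VM) around any application."""
    app_name: str
    host: str

    def move(self, target_host: str) -> None:
        # One move operation covers the maintenance, capacity,
        # and failover cases alike, because every app looks the same.
        self.host = target_host

def evacuate(containers, failing_host, spare_host):
    """Generic automation: relocate every container off a failing host."""
    for c in containers:
        if c.host == failing_host:
            c.move(spare_host)
    return containers

# Three very different applications, managed identically:
apps = [Container("payroll", "host-01"),
        Container("crm", "host-02"),
        Container("wiki", "host-01")]
evacuate(apps, "host-01", "host-03")
print([c.host for c in apps])  # → ['host-03', 'host-02', 'host-03']
```

The point is not the toy code itself but that `evacuate` never inspects the application inside the container – uniformity of the container is what makes the automation simple.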