If you’re like me, you’ve heard the term “cloud” more than you’ve heard your kids’ names over the last year. And, if you’re like me, you already have expensive legacy data centers. Yes, cloud is cool. No, it’s not free. If you have a barrage of app owners looking to move apps to the cloud, how do you handle all the costs you already incur?
Some companies are cloud-forward. That means they have a strategy to get applications onto the cloud and the flexibility in their cost base to make the push. Some companies are completely regressive, where any cloud dollar is incremental and therefore not OK. Everyone else is in the middle: go to the cloud opportunistically, but only when it avoids the cost of buying new servers.
In all three cases, you have to deal with the legacy costs. In my view, you have a few options.
The ideal option for you (but probably the worst for the company) is to take any legacy costs out of the allocation pool and send them to the top of the house. This is great for you and your internal clients, as it allows a quick pivot to cloud with a short memory on legacy decisions. It’s bad for the company for the same reasons: no one is accountable, and it doesn’t drive behavior.
The worst option is to allow applications to opt out of the legacy costs as they become cloud-ready. This drives behavior toward the cloud but punishes the legacy applications that aren’t suitable for it. Taken to the extreme, you end up with one dinosaur database application carrying the entire legacy cost base. Not ideal.
Like most things, the middle route is the most sensible. In this scenario, you establish a “tax” for all applications. The point of this tax is to recover legacy overheads. Applications get a break on the variable costs if they move to the cloud, while still covering their cloud bill (and any associated overheads). This lets app owners take control of their cost base (ideal!) while still living in the real world.
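To make the math concrete, here’s a minimal sketch of how that tax might be computed. Everything in it is an illustrative assumption, not a prescription: the dollar figures, the equal split of the fixed legacy pool, and the 10% cloud overhead uplift are all placeholders. In practice you’d spread the tax by whatever driver your allocation model already uses (usage, capacity, headcount).

```python
# Minimal sketch of the "legacy tax" chargeback model described above.
# All numbers and field names are illustrative assumptions, not real data.

from dataclasses import dataclass

LEGACY_FIXED_POOL = 1_200_000   # assumed annual fixed data-center overhead to recover
CLOUD_OVERHEAD_RATE = 0.10      # assumed uplift on cloud bills for shared tooling/support


@dataclass
class App:
    name: str
    in_cloud: bool
    onprem_variable_cost: float  # annual variable cost if still in the data center
    cloud_bill: float            # annual cloud spend if migrated


def chargeback(apps: list[App]) -> dict[str, float]:
    """Return each app's annual charge: an equal share of the legacy
    fixed pool (the 'tax'), plus its own variable or cloud costs."""
    tax_per_app = LEGACY_FIXED_POOL / len(apps)  # every app pays the tax, cloud or not
    charges = {}
    for app in apps:
        if app.in_cloud:
            # Cloud apps escape the on-prem variable cost but cover
            # their cloud bill plus a share of cloud overheads.
            charges[app.name] = tax_per_app + app.cloud_bill * (1 + CLOUD_OVERHEAD_RATE)
        else:
            charges[app.name] = tax_per_app + app.onprem_variable_cost
    return charges


if __name__ == "__main__":
    portfolio = [
        App("dinosaur-db", in_cloud=False, onprem_variable_cost=300_000, cloud_bill=0),
        App("web-frontend", in_cloud=True, onprem_variable_cost=0, cloud_bill=150_000),
        App("batch-reporting", in_cloud=True, onprem_variable_cost=0, cloud_bill=80_000),
    ]
    for name, charge in chargeback(portfolio).items():
        print(f"{name}: ${charge:,.0f}")
```

Run it and you’ll see the point of the middle route: the cloud apps stop paying on-prem variable costs, but nobody escapes their slice of the legacy pool, so the dinosaur database isn’t left holding the whole bill.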