How Low Cost Is That Low-Cost Cloud?

CFOs need to learn how to calculate per-application costs to make sound decisions about deploying Cloud systems.

Today’s Cloud feeding frenzy has been fuelled by heady promises of low costs, almost instant functionality and, ultimately, IT Nirvana.

It’s those putative lower costs, of course, that make most CFOs sit up and pay attention. But if the primary driver behind your Cloud initiatives is to reduce IT costs, then you need to take a second and third look at your financial assumptions.

The Cloud vs. traditional on-premise computing cost argument can be clouded (so to speak) by the way organizations structure and report their IT spend. Organizations that report IT expenses in the form of the standard chart of accounts, typically broken down into staff costs, depreciation, utilities, maintenance, and so on, may not be able to state accurately the actual total cost of a specific application. So if you’re looking to replace one of your on-premise applications with a Cloud equivalent because you think it will be cheaper, you had better be sure that that is indeed the case.
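
To make that concrete, here is a minimal sketch of how chart-of-accounts line items might be allocated to a single application to estimate its annual cost. Every category, amount and allocation fraction below is a hypothetical placeholder, not real data; the point is only that per-application cost has to be derived, because the chart of accounts does not report it directly.

```python
# Hypothetical sketch: allocating chart-of-accounts IT spend to one application.
# All figures and allocation fractions are illustrative assumptions, not real data.

annual_it_spend = {          # annual spend per chart-of-accounts category (USD)
    "staff": 1_200_000,
    "depreciation": 400_000,
    "maintenance": 250_000,
    "utilities": 150_000,
}

# Estimated share of each category consumed by the application in question,
# e.g. from time tracking, asset registers, or data-centre metering.
allocation_to_app = {
    "staff": 0.15,
    "depreciation": 0.20,
    "maintenance": 0.25,
    "utilities": 0.10,
}

app_annual_cost = sum(
    amount * allocation_to_app[category]
    for category, amount in annual_it_spend.items()
)
print(f"Estimated annual cost of the application: ${app_annual_cost:,.0f}")
```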

Comparing Apples to Oranges
The arrival of the Cloud as a technologically viable alternative to on-premise or traditionally hosted enterprise applications can make for some interesting discussions. But if you are unable to compare costs between applications (typically a per-user, per-month, per-application calculation), how can you assess whether a particular Cloud offering is low cost compared with its equivalent on-premise system?
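
As a rough illustration of that per-user, per-month comparison, the sketch below assumes a hypothetical 500-seat application; the subscription price, migration cost and contract term are placeholders, not benchmarks from any vendor.

```python
# Hypothetical per-user, per-month cost comparison for one application.
# Every number here is a placeholder assumption for illustration only.

users = 500

# On-premise: annualised application cost (e.g. from an allocation exercise
# like the sketch above), divided across users and months.
on_prem_annual_cost = 337_500          # USD per year, all-in estimate
on_prem_per_user_month = on_prem_annual_cost / users / 12

# Cloud: quoted subscription price plus one-time migration/integration costs
# amortised over the expected contract term.
cloud_subscription = 55.0              # USD per user per month (list price)
one_time_migration = 90_000            # USD, spread over a 3-year term
cloud_per_user_month = cloud_subscription + one_time_migration / users / 36

print(f"On-premise: ${on_prem_per_user_month:.2f} per user per month")
print(f"Cloud:      ${cloud_per_user_month:.2f} per user per month")
```

Even a rough model like this can change the picture: a subscription that looks cheaper at list price may come out ahead of, or behind, the on-premise figure once migration and integration costs are amortised over the contract term.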
