There is a school of thought, echoed by no less a technology behemoth than Microsoft, that maintaining your own IT infrastructure will one day be as antiquated as maintaining your own power generation capability. In the past, they remind us, companies had to generate their own power - until reliable utility power, generated centrally for broad use, came along. Suddenly it was no longer cost-effective to make your own power. The problem comes from the leap of irrationality in drawing an analogy from that step in our industrial evolution to the current practice of companies maintaining their own IT infrastructure, and assuming it leads inevitably toward a cloud model.
On the surface - and only on the surface - this might counterbalance the fear of the "new" that some folks have. People didn't trust utility power at first, but they eventually learned that it was reliable and that the cost savings were worth the risk. As it applies to utility power, this is certainly a sound argument.
Two things to keep in mind, though:
1) Electricity is fungible; data isn't.
2) Companies still have backup power systems for when (not if) the utility fails.
The cost of maintaining redundant power is pretty reasonable, and in some cases the cost of maintaining redundant data systems is reasonable too. There are certainly use cases for cloud computing, but technology leaders are still exactly right to be critical of those who argue that everything can, should, and will eventually be "in the cloud". If those advocates end up being right, it will be by accident, and the result will probably not look anything like what they imagine today. For now, outside the few "low business impact" use cases that represent the lowest-hanging fruit, the opinion of this technologist is that SaaS / Cloud / Web Hosted solutions struggle to do significantly more than exchange one set of problems for another. Not that there's anything wrong with that.