Thoughts on ‘the cloud’

July 17, 2012 | Business, Cloud

So what is the cloud? Well, that’s a discussion for another time. Short version: it’s lots of things. But for the purposes of this discussion, let’s assume it means shoving as much of your IT infrastructure as possible, especially servers, onto some third party who manages it for you and lots of other customers in a standardised way.

Over on this LinkedIn group, I posted the comment below in response to SmartCompany’s recent “Death of an IT guy” article:

Yeah, I saw this too. Whilst I agree in general, I think this is yet another example of the cyclical nature of IT architectural solutions. Think of the thick-thin-thick-thin client transitions over the last 40+ years. “The cloud” can be the same. Where it gets tricky is that “the cloud” represents multiple different layers, and slices across layers; i.e. virtualised infrastructure is quite a different beast from a software-as-a-service app (e.g. stuff). Personally, I think:

Business-wise, Australia is at too much of a bandwidth disadvantage to make much use of cloud ‘infrastructure’, at least for SMEs; most of my clients are still on prosumer ADSL2+ connections, not corporate bandwidth, even though they may have scores of users. On-premises solutions have natural advantages here. I think the main differentiator is actually the degree of incompetence in IT. It’s very difficult to be an incompetent cloud provider. It’s extremely easy to be an incompetent local “IT guy” or MSP. We compete against them all the time (you know the type – the guys who’ve never heard of S.M.A.R.T., think RAID is backup, have no idea how to do risk analysis let alone write even a technical-only DRP, rely on vendors to swap PC components, can’t do deep troubleshooting, etc.).

With ‘the cloud’, on average, you do buy a higher level of quality. However, in the analyses I’ve done, your peak quality and fitness for purpose are better with local infrastructure for most use cases.

Continuing this further for a more targeted SME audience: the primary technical problem is latency. Stuff just ain’t fast enough to do things like file access over an internet connection. And that’s assuming you have sufficient bandwidth in the first place (i.e. corporate-grade internet at a minimum of 10 Mbit/s – i.e. you’re spending > $1,000/month on your link/s).

The primary business problem, however, is a lack of flexibility, combined with a level of marketing sophistication that makes it very difficult to compare like with like. So you virtualise your infrastructure – how do you access it? You still need PCs (even thin clients cost about the same, hardware-wise). The consequences might be smaller, but you still need to stop them getting virused. If you’re using netboot, you need on-premises infrastructure to do that. You need switches and routers. You need serious internet bandwidth and reliability. You still need an IT guy. You still need to do a migration. And you’re paying an external company for their up-to-90% gross profit margins.

Are you actually better off? What happens if/when they go bust? What if you want to switch to a competitor – how easy is it to transition out? As we’ve seen lately, the cloud isn’t without its risks either. Sure, it may be cheaper in some instances, or it may be more expensive but you don’t care because the hassles go away. But do they really?
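To make the latency point concrete, here’s a rough back-of-the-envelope sketch (in Python) of fetching a typical office file over a LAN versus a prosumer ADSL2+ link. Every figure in it – file size, link speeds, round-trip times, the number of protocol round trips – is an illustrative assumption, not a measurement; chatty file protocols like SMB make many small request/response exchanges per file, which is exactly where latency bites.

```python
# Back-of-the-envelope model: time to fetch a file is serialization time
# (size / bandwidth) plus the cost of protocol round trips (RTT x count).
# All numbers below are illustrative assumptions, not measurements.

def transfer_time_s(file_mb, bandwidth_mbit_s, rtt_ms, round_trips):
    """Rough seconds to fetch a file over a link."""
    serialization = (file_mb * 8) / bandwidth_mbit_s  # MB -> Mbit, then / Mbit/s
    latency = (rtt_ms / 1000.0) * round_trips         # chatty-protocol overhead
    return serialization + latency

# Assumed scenario: a 25 MB office file, ~40 protocol round trips.
lan = transfer_time_s(25, bandwidth_mbit_s=1000, rtt_ms=0.5, round_trips=40)
adsl = transfer_time_s(25, bandwidth_mbit_s=5, rtt_ms=30, round_trips=40)

print(f"LAN:    {lan:.1f} s")
print(f"ADSL2+: {adsl:.1f} s")
```

Under these assumptions the ADSL2+ fetch is dominated by the 40-second serialization time alone – two orders of magnitude slower than the LAN – before you even account for contention from scores of users sharing the link.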