Cloud Computing
In the mid-1990s, I decided that my staff would get Windows PCs on their desktops. It was then that I started to develop a theory. My theory has to do with the time it takes humans to adapt to change. Some are faster than others, but none can adapt instantly. Microsoft Office took some time for my staff to learn, but they did. They used Outlook for email, learned to send files as attachments, and learned to schedule meetings. They learned PowerPoint to better communicate their plans. They learned Excel to better plan their work. They became more productive. Then, a year later, the company's IT support department decided that everyone should migrate to the new release of Office. When it was installed on my staff's desktops, it took them months to adapt to the differences between the old release and the new one, and months to reach the same level of productivity they had before the upgrade.
It was then that I developed a theory: to benefit from new technology, one should not change it frequently. Younger people would accuse me of being old-fashioned. So let's fast forward to 2011. Tablets like the iPad are taking over where PCs used to be. Smartphones like Google's Android devices and Apple's iPhone are extending the reach of being connected and redefining applications. Cloud computing, like Google's applications, Dropbox, SaaS, and other new offerings, is replacing software that you install and run on your own computing device.
One of the big differences between Cloud computing and old-fashioned PC-installed software is who decides when it changes. With PC-installed software, the user normally decides. Between 2000 and today (2011), that has often been challenged, with PC-installed software frequently updating itself without the user's knowledge. However, if one is technically astute, one can control when updates are done.
Not so with Cloud computing. One day it works this way, and tomorrow it works differently. The user never sees a change coming; it just appears to work differently. Currently, I'm a heavy user of Chrome and the Google applications. They are 'free', and I use Google Docs often for spreadsheet and word-processing needs. Often, when I try to do something as simple as adding a column of numbers using the SUM function (in spreadsheet talk, something like =SUM(A1:A10)), it works differently than it did before. The result: I frequently have to retrain myself to get it to work. It is normally just a minor aggravation, with only a small amount of disruption. Sometimes, however, the disruption can be more than minor.
I'm just forced to adapt: faster, more often, and unexpectedly.
If this trend becomes accepted behavior in the software and technology development world, and I claim it already has, then what will happen when technology gets really sophisticated?
Let's fast forward to 2030. The Singularity is upon us. Computing is pervasive. You are always being monitored and connected to the Cloud. Your transportation is automatic, your calendar is always updated, and your family knows what is going on with you all the time, as does the legal world. Your TV, phone, tablet, and desktop are now everywhere, and you cannot tell one from another. You have a 'personal assistant' that 'lives' with you. Some call it an AI robot, but it is really more than that. It takes care of you and plans your day. You depend on it for your health and well-being. You become attached to it. You like it. It likes you.
One day it works like this, then the next day it works differently.
What's with that?
Who, or what, is controlling whom?