Every few months the IT press is swept by talk of the “next big thing” and how it will totally transform the IT market, change how IT users work, shake organisations, batter IT departments and threaten CIOs. In reaction, many commentators post a wave of comments explaining that the “next big thing” is not new at all and has been a core part of IT for years. There are always a few who point out that IBM mainframes were doing the same thing decades ago. The current wave of press hype seems to be focused on the combination of Social Networking, Cloud computing and Mobile devices, and on trends built upon these components such as “Consumerisation of IT” and “Bring Your Own Device”. In this case I think there is some substance to the arguments made by both camps, but both seem to be neglecting what could be a slower-acting and more profound shift - a shift in the application of technology rather than the technology itself.
Data, Information and Action
The easiest way to see this shift is to start with an earlier shift that we all now take for granted - the shift from data processing to information technology.
- The origins of computing were focused on data. The earliest computing machines dealt with transmitting, computing and recording data. Early examples include the machines used to tabulate the 1890 US census and the engines which produced ballistic tables in the 1940s. In order to use the data, humans were needed to link it to real-world concepts and then make decisions and take actions based upon it.
- In the 1980s the economics of computing technology allowed a new model to become mainstream. Computing shifted to focus on information. New multimedia personal computers allowed us to present, share and analyse information. The data was still there underneath, but people could interact with non-numerical information in much more natural ways, use it to make decisions and then take appropriate action.
- So are we ready to consider what comes next? Actually, no - the world has already moved on. The focus of computing has already begun a shift onto actions. This is about executing, responding to and orchestrating actions. We use information technology to find out about a product but action technology to order it online.
Another way to think about these different shifts is to draw an analogy with language.
- Data is about symbols, codes and structures - in the study of language this equates to syntax.
- Information is about meaning which has parallels with semantics.
- Action is about effects - in the study of language this is the domain of pragmatics.
You can study the syntax and semantics of a marriage ceremony but you are really missing the point if you don’t understand the pragmatics.
SoCloMo - so yesterday
Now think about SoCloMo through this lens. Cloud and mobile computing (and Big Data for that matter) are actually new data technologies. Social networks are largely information technologies. Will they have a big impact and create new opportunities? Yes. Do they represent a fundamental shift? No. And, yes, IBM mainframes were doing this sort of thing decades ago. If you are looking for a better example of the latest era of computing technology, you could consider (relatively) old-fashioned internet shopping.
Even without considering SoCloMo there is still plenty of scope to extend the application of existing technologies in individual products (for example medicine dispensers which keep dosage records) or whole markets (such as collaborative consumption and sharing). There are also areas where we should be desperately trying to replace data era systems with action era systems (replacing dumb email with workflow springs to mind).
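To make the email-versus-workflow contrast concrete, here is a minimal sketch of what an action era system looks like in code: the approval request is a tracked object that moves through explicit states via permitted actions, rather than an intention buried in an email thread. All names here (`ApprovalRequest`, the states and actions) are hypothetical, chosen purely for illustration.

```python
# A minimal "action era" sketch: a request is an object with explicit
# states and permitted actions, plus an audit trail - things an email
# thread never gives you directly.

class ApprovalRequest:
    # Each state maps to the actions it permits and the resulting state.
    TRANSITIONS = {
        "draft":    {"submit": "pending"},
        "pending":  {"approve": "approved", "reject": "rejected"},
        "approved": {},
        "rejected": {"submit": "pending"},  # resubmit after changes
    }

    def __init__(self, title):
        self.title = title
        self.state = "draft"
        self.history = []  # who did what, and from which state

    def act(self, action, actor):
        allowed = self.TRANSITIONS[self.state]
        if action not in allowed:
            raise ValueError(f"'{action}' not allowed in state '{self.state}'")
        self.history.append((actor, action, self.state))
        self.state = allowed[action]
        return self.state


req = ApprovalRequest("Purchase new laptops")
req.act("submit", "alice")
req.act("approve", "bob")
print(req.state)  # approved
```

The point of the sketch is that the action itself is first-class: the system knows what can happen next, records what did happen, and can orchestrate the result - none of which is true of the same request scattered across an inbox.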
Chief Action Officer
Finally, if you look hard enough, you can see examples of future eras of computing which focus on decisions (perhaps some kinds of algorithmic trading in financial services meet these criteria). So do IT departments and CIOs need to panic about SoCloMo? No. But perhaps they should watch out for the Chief Action Officer or Chief Digital Product Officer.