In my judgment, our thinking around the nature and structure of the IT function in most organizations is woefully out of date, and we need to revise it radically if the function is going to survive the next ten years. What leads me to this conclusion? Information technologies and related tools are now ubiquitous and are increasingly embedded in goods that, only a few years ago, would have been unimaginable (the ‘Internet of things’). Information is now a commodity in the hands of consumers (smartphones). Social networking media have redefined the landscape for interpersonal and even business interactions. Developments in mobile information tools and technologies are changing the fundamental nature of many business operations. The ability to ‘print’ three-dimensional objects could revolutionize some industries. Data and data sources are exploding, overwhelming the ability of many organizations to cope. And changes in the information environment continue to come at us at warp speed. In fact, the pace of change is accelerating, if that’s at all possible, and I believe that is because (as the first George Bush put it some years ago now) there are now thousands more ‘points of light’ driving this change. But, before we explore this further, it might be useful to briefly examine where we have been with IT.
A Historical Perspective
When computers were first introduced into leading organizations back in the late ’50s, the environment was quite different from today. Computers filled large, specially air-conditioned rooms, yet offered a fraction of the speed, processing power, and data management capability available today in a smartphone. They were seen primarily as tools to automate manual functions and boost efficiency, helping organizations cope with the burgeoning growth that emerged after World War II. Given this limited role, as it was then foreseen, computers generally wound up in the domain of the Finance/Accounting function (after all, they worked with numbers, right?) or, in insurance companies for example, within the Policy Administration group. The sizeable budget associated with managing this new innovation also fell within the purview of Finance/Accounting. Because of the aura and mystique that surrounded these early computers, and the novel skill sets required to interact with them, executive management adopted a ‘hands-off’ approach to the new machines and didn’t really want to be drawn into their orbit. These factors would have unfortunate downstream implications that some organizations are still wrestling with.
There were a handful of legitimate computer vendors in the early days, a few of whom had their roots in producing unit record (or punched card) equipment, forerunner gear for processing large volumes of transaction data more efficiently than could be done manually. Now that the computer industry was beginning to emerge, whole new opportunities presented themselves to the vendors. A very few are still with us from those days and have evolved as conditions changed. IBM is, of course, one of those. From its early days of punched card gear and time clocks, IBM embraced computers as the way of the future, despite its chairman, Thomas J. Watson, having earlier estimated (perhaps apocryphally) that a handful of computers should be sufficient to satisfy the world’s needs. IBM became a full-service vendor, from offering computing equipment, including repairs and maintenance, through providing education and training on the new computers and concepts, to providing systems analysis and design, programming, and management support. For the most part, the people who went into the computer field were fascinated with the technology, and many of them had little real appreciation of business or the operational challenges their employers were facing. Computer systems were usually designed and programmed by in-house staff within an information processing group under the purview of Finance/Accounting. But a funny thing happened on the way to the forum: the information processing departments that had been established under the Finance/Accounting function to improve efficiency became bottlenecks or roadblocks to getting things done. Since Finance/Accounting controlled the purse strings, its projects took priority, often leaving other functions within the organization bereft of computer support.
However, many newer, more innovative computer vendors were beginning to emerge, offering smaller, less intrusive computer gear for a fraction of the cost of what had become known as mainframes. These computers were much easier to work with than the large mainframes and utilized new programming languages that were easier to interact with than those then supported by IBM et al. This got the attention of operating executives in other functions within the organization and, assuming they had sufficient budgets to support the move, these new mini-computers began to appear throughout large organizations to support specific operational functions, thereby bypassing the Finance/Accounting information processing group. These devices also began to gain a foothold in organizations that couldn’t afford mainframe services. Moreover, the vendors of these mini-computers began providing packaged application software that made turnkey installations much more viable, minimizing the need for a sizeable, highly skilled computer staff to support the computing environment. While mini-computers were generally very effective at what they did, the approach had its problems: the software they used was typically incompatible across vendors, making it impossible to share information between computers; and similar data tended to be multi-sourced, meaning that the data in the Finance/Accounting computer did not always reflect the same values as similar data from Sales/Marketing or Distribution. In large organizations, this became a colossal organizational headache, and the executive group was forced to designate an overseer for information technology initiatives, usually the Finance/Accounting IT function that had been responsible for creating the problem in the first place.
In recognition of the importance that information technology was beginning to play in many larger organizations, a new position was created to head up the evolving IT function: the Chief Information Officer, or ‘CIO’. Unfortunately, I believe the title to be largely a misnomer. Most CIOs, especially in the early days, had very little to do with information or information management, and were still firmly focused on the technology of data management. That’s because, in many organizations, the previously designated Data Processing Manager simply got a title change and a salary increase, with little real change in skill levels, operating philosophy, or management approach. Some organizations tried to avoid this pitfall by bringing in executives from outside the existing IT organization to head up the function. This was often problematic, too; without a background in IT, these executives were largely at the mercy of the incumbent IT staff, who often used these new, more articulate and accepted executives to further their own objectives and promulgate their own points of view.
While all of this was going on, IT continued to under-perform in many organizations, failing to deliver on its potential to add real value to the enterprise. Enter the micro-computer, or personal computer. These devices produced an amazing breakthrough, economically putting computing power into the hands of the people who needed it. While people still grouse about them, the business transformation wrought by organizations such as Microsoft and Apple was truly astounding. Once again the IT Department was being bypassed in many organizations. But if you thought the mini-computer revolution was chaotic, it had nothing on the chaos that the introduction of personal computers caused in many organizations. The IT Department often had no idea what was really transpiring; it was asked to deal with issues of data compatibility and data security without having been involved in any of the decision processes. When something went wrong, IT often got the blame. IT’s response was frequently to try to re-establish control. This was often fear-based; if people could bypass IT, what role was the IT function to play going forward? In many cases this led IT to try to shoehorn the new devices and concepts into an old mold that facilitated control. Clearly, this undermined the organizational independence and responsiveness that personal computers enabled. Moreover, it often led to the same kind of bottlenecks and lack of responsiveness that the enterprise was trying to transcend. Because many enterprises are still struggling with this issue today, some new approach is clearly warranted.
A New Approach
It is time for IT management and organizational architects to let go of the past. With the explosion of information and technology all around us, the ‘old ways’ will no longer suffice. Unfortunately, in many IT organizations there is still a tendency to try to improve on what has been done previously. In some circles this is known as backing into the future while looking longingly at the past. Based on the number of seminars I get invited to, we are still trying to learn to manage the past more effectively. This consumes a great deal of time and energy without delivering the benefits that the emerging information environment demands. We need to look to the future, not the past.
The first thing is to recognize the changed technological environment. Hardware, while still obviously important as an enabler, is secondary to applications. There are now thousands, if not millions, of vendors out there; it is impossible to keep on top of everything. The impact of mobile technology is being felt across many sectors, and information has become a commodity in the hands of consumers (e.g. price-comparison shopping). In fact, consumers are both major users and providers of information through their social networking tools and facilities. ‘Big data’ is being used by many enterprises to sharpen their marketing and sales initiatives, as well as to run their operations more effectively. And ‘the cloud’ is playing an increasingly important role in enabling and supporting many new initiatives. All of this means that clients and consumers will play key roles in shaping the future. And where will the IT function be in all of this?
If it doesn’t recognize and adapt to the emerging information environment, the IT function as we know it will fade into oblivion. In any event, I believe it is going to have to make significant changes if it is to remain viable. I see a continuing, yet evolving, role for a central IT function reporting to top management. Clearly, the function exists only to help the enterprise operate more effectively and more efficiently, and to provide some competitive advantage. But the days of just doing the transaction-processing grunt work are long over. The emerging role of the central IT function (as I see it) is to act as an innovator, an enabler, a facilitator, a coordinator and, most importantly, a visionary. Information policy development will also be a key component of this role. While some operations-oriented and infrastructure-related tasks will undoubtedly remain, this new approach transcends the role that many IT functions currently play.
As a result, additional capabilities will be required in the central or corporate IT group. While much of the current IT role will be distributed among the operating divisions or groups, a central entity will still be required to pull everything together and ensure chaos doesn’t reign, as has happened before. A Policy Development and Administration Group within IT will therefore be essential to ensuring that things don’t get out of control. Managing in this complex business environment will be challenging. Since building and managing relationships will be key to the overall success of this approach, a Relationship Management Group will also need to exist within IT. This group will be charged with building the relationships across and within the enterprise that will enable and support progress. Coordination across a series of entities that have their own priorities and visions can be difficult at the best of times, and this group will be key to keeping technology initiatives within an overall enterprise framework. Similarly, to keep up with emerging developments in business and technology, a Research and Development Group needs to be embedded within IT. This group will be critical to keeping the enterprise at the forefront of emerging technological developments and practices. It is essential that it envision, or at least be alert to, new applications that might emerge downstream and transform how enterprises operate and deliver goods and services. In addition, this group needs to ask, “Given what we know about emerging developments in technology, how can these be deployed to support and enable our enterprise, and what will be their impact?” For example, if the 3-D ‘gun’ printer can be fully developed, what impact will that have on manufacturing, warehousing, and distribution operations? What information systems and processes will be essential to making all of this work? And, most importantly, how will they be provided?
Summary
Today’s executive CIO has opportunities coming at her from many and varied sources. Which opportunities are likely to emerge, and how will they benefit and affect the enterprise? Executive management and the Board clearly want to know and, in many cases, the survival of the enterprise may be at stake. This puts the CIO front and center with respect to envisioning, enabling and supporting the right initiatives.
No one really knows where all of this is going over the next five to ten years, and I won’t presume to know either. But there is certainly no end in sight to the evolution of technology. What I do know is that the IT function as we have historically known it has to change. A major re-think of this function is definitely in order if it is to survive.
Robert Liley, Principal, The Signal Group www.theciohandbook.com