To me it is obvious and inevitable that "utility computing" is the future direction of IT. The provision of computing facilities will become a service, paralleling the development of power and telephone services last century. Instead of housing banks of servers, organisations of all sizes will plug into the network and access the appropriate computers and applications on an "as needed" basis. There is, however, a long way to go yet! The whole structure of the IT industry will be changed.
Historically, factories bought their own power equipment, so makers of steam engines and the like were to the fore. But with the development of electrical power and distribution networks, it was the electricity supply utility that became the focal point. Which user today knows or cares who makes the turbines, generators and transformers? The same will happen to computers: the equipment suppliers will fade into the background and the service suppliers will dominate. Who then will care whether it is Linux or Windows? This is why all the major computer system suppliers are steadily drifting away from an equipment bias towards a services bias.
The basic idea of providing computing resources as a "utility" service is not new, but in the past it has been of only fringe interest, whereas in the future it could and should become the norm. Providing the physical computers and managing them is only the first step, and probably the easiest one; a much bigger problem is the limitations inherent in the communication networks. Current networks are only just adequate, and improvements in reliability, availability, speed and cost are needed. This becomes more important as e-commerce applications escalate, since they are essentially network based. Indeed this general trend is a key catalyst in creating demand for utility computing. "Personal computing" is a solution to a communication problem, not a long-term concept. The unfortunate astronomical growth of mobile telephony has diverted resources from low-cost, high-speed data networks, but they will come, albeit later than we in the IT industry would have liked. It just shows that we are never masters of our own destiny!
The basic system software, such as the operating system, database management software and management tools, can be shared as easily as the hardware, but the actual data and applications are a much bigger problem area. The data is always unique to the company that owns it, and even if so-called standard application packages are used, they are tailored to suit individual companies' needs. The company providing the utility service can be responsible for operational services, but the organisation using the service must remain responsible for the data and applications. The experience gained over the last few years with outsourcing will prove invaluable in the long run.
The earliest attempts at utility computing were called "time-sharing", which had only limited success because of the unacceptably high cost of remote communications. These early systems were based on simple ASCII terminals, and even so they suffered from communication limitations; they could never have coped with today's demand for graphics. The Internet and the low cost of PCs have changed all that, and the time-sharing concept has resurfaced under the guise of "Application Service Providers". These use technology such as Windows Terminal Server or Citrix to provide access to managed applications which are compatible with LAN-based PC applications. In any case the Internet itself is a refined form of time-sharing, albeit one serving at present a limited range of applications, dominated by Web servers.
Martin Healey, pioneer of the development of Intel-based computers and client/server architecture. Director of a number of IT specialist companies and an Emeritus Professor of the University of Wales.