When to consider utility computing

Author: Paul J. Dravis

Many organizations are expanding their use of online services on a global scale, using a diverse range of computing and communications devices on an increasingly real-time basis. Their CIOs want to monitor which IT resources are being used and what they cost, and want to provide business solutions rather than manage systems integration. These dynamics are the fundamental drivers for utility computing, a technology platform where computing, storage, applications, and network resources are available as needed and customers are billed on a pay-per-use basis. The vision of utility computing is to deliver information services much as telephone, cable TV, or electric utilities deliver theirs.
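To make the pay-per-use idea concrete, the sketch below shows how a metered bill might be computed. The resource categories and rates are purely hypothetical and are not drawn from any provider named in this article.

```python
# Minimal sketch of pay-per-use billing: the customer pays only for resources
# actually consumed in the billing period. Rates and resource names are
# hypothetical, not any real provider's price list.

HYPOTHETICAL_RATES = {
    "cpu_hours":    0.10,   # dollars per CPU-hour used
    "storage_gb":   0.05,   # dollars per GB stored for the month
    "bandwidth_gb": 0.02,   # dollars per GB transferred
}

def monthly_bill(usage):
    """Sum metered usage multiplied by the per-unit rate for each resource."""
    return sum(HYPOTHETICAL_RATES[resource] * amount
               for resource, amount in usage.items())

# A month of modest usage produces a correspondingly modest bill.
print(monthly_bill({"cpu_hours": 1_200, "storage_gb": 500, "bandwidth_gb": 300}))
# -> 151.0  (120 + 25 + 6)
```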

Utility computing is a service provided by several startups and major systems providers. Hewlett-Packard’s approach is called Adaptive Enterprise, IBM has its On Demand strategy, and Sun Microsystems provides services via N1. Firms such as NetSuite, Salesforce.com, and Salesnet were founded to provide ERP and CRM applications online on a per-user basis, and enterprise software firms Oracle, PeopleSoft, Siebel, and SAP provide access to their product offerings in a hosted fashion as well.

Large corporations interested in the potential to leverage Web services and grid computing environments can clearly benefit from utility computing, but its potential market is actually much larger. Even small and medium-sized businesses can enhance their financial flexibility by lowering their spending on hardware, applications, and systems and network management services. Organizations with highly variable computing resource requirements should find the utility computing model particularly appealing because of its adaptive nature.
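The appeal for variable workloads comes down to simple arithmetic: fixed in-house capacity must be sized and paid for at peak demand, whereas a metered service charges only for what is used. The comparison below uses hypothetical figures to illustrate the point.

```python
# Illustrative comparison (hypothetical figures): owning capacity sized for the
# peak month versus paying a higher unit rate only for hours actually consumed.

monthly_demand_cpu_hours = [200, 250, 300, 2_000, 300, 250]   # one spike in month 4

OWNED_CAPACITY_COST_PER_CPU_HOUR = 0.08   # cost per hour of capacity, used or not
UTILITY_RATE_PER_CPU_HOUR = 0.12          # higher unit price, billed on usage only

peak = max(monthly_demand_cpu_hours)
fixed_cost = peak * OWNED_CAPACITY_COST_PER_CPU_HOUR * len(monthly_demand_cpu_hours)
utility_cost = sum(monthly_demand_cpu_hours) * UTILITY_RATE_PER_CPU_HOUR

print(f"Provision for peak: ${fixed_cost:.2f}")    # 2000 * 0.08 * 6 = $960.00
print(f"Pay per use:        ${utility_cost:.2f}")  # 3300 * 0.12     = $396.00
```

With a steady load running near peak, the comparison reverses, which is why the model fits highly variable demand best.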

Organizations evaluating utility computing must review their internal technology support costs, assess which IT functions are organizational core competencies, and identify related operational risk factors, just as they would if they were contemplating outsourcing IT functions. Understanding where internal IT staff bring value to the organization and which needs can be served by a commodity platform is a critical step in assessing utility computing. Questions to address include: How well does the provider understand my needs? Has the provider achieved sustained success with other organizations? Are the service offerings competitive? How will the provider adapt to my organization’s needs in the future?

The evaluation process should be multidimensional, driven by the breadth and complexity of the organization’s needs and the competencies of the service providers under consideration. Prospective users must examine how much customized processing their organization truly requires as they seek cost reductions; in other words, how much of the service is “nice to have” versus “good enough” to support the needs of the business.

Once an organization understands its own IT situation, it needs to begin evaluating potential utility computing service providers. Such an assessment follows a classic “buy versus build” decision-making process. In addition to examining the providers’ technical expertise, companies must test how well those providers understand the customer’s line of business.

Security remains an issue in utility computing, as it does in all forms of information processing. The utility computing model shifts some control of an organization’s information to the service provider. When evaluating the capabilities of utility computing providers, companies must therefore understand how their data is protected during normal processing and when services are upgraded, transitioned, or terminated. Be sure to review vendors’ capabilities for managing a data security breach.

A transparent, risk-free processing environment remains the IT holy grail. While the vision of utility computing is broad, most organizations that take advantage of it do so in small, incremental steps. For instance, some start with backup storage services or support for an individual business application. As an organization becomes more comfortable with its utility computing provider, it can add further applications or technology.

As with other IT offerings, managing user and customer expectations about what is achievable is critically important. The evolution of utility computing will require the convergence of core technologies spanning Web services, grid computing, broadband, storage virtualization, automatic provisioning, change management, and security. Much work remains in building out utility computing platforms, so managing the hype-to-reality ratio is important when considering its use.

The Bottom Line:

Determining whether utility computing is worth implementing in your organization requires evaluating how well today’s IT needs are met internally, along with understanding future business and IT requirements. You must identify which processes can cost-effectively leverage commodity platforms. The service provider’s business model is based on providing its capabilities in a one-to-many fashion; customers need to manage the process toward a one-to-one relationship in which solutions are obtained in a reliable, secure, and economically viable manner.

Paul J. Dravis is president of The Dravis Group, which provides research, consultancy, and solutions to help organizations understand, plan for, and adapt to the changing dynamics of the technology sector.
