My Prediction of Cloud & SOA in 2005

Moving to a utility computing model helps organizations consolidate computing resources

Okay, maybe it's petty and I'm just tooting my own horn, but I found this old article I wrote for Upstream CIO's October issue (written in July '05). In rereading it today, I surprised myself with how aware I was of the forthcoming convergence of Cloud and SOA.

Service-Oriented Architecture (SOA) is gaining ground on the heels of the success of the eXtensible Markup Language (XML) as a means for representing data in a technology-neutral manner and moving that data between applications. However, SOA plays a more important role than merely serving as a moniker for applications that communicate using XML. SOA is the software world's entry into the utility computing model.

For CIOs, the utility computing model has come to be synonymous with the ability to provision computing resources on demand, thus maximizing these resources based upon needs of the organization and its customers. In some cases, the utility computing model has begun to incorporate metering, which allows IT organizations to charge back usage to individual departments or projects, thus enabling the CIO to demonstrate the need for the current resources and to plan properly for capital expenditures of new resources.

Moving to a utility computing model helps organizations consolidate computing resources, which lowers the cost of owning and managing those resources. However, due to today's inefficient software designs, most computing resources are dedicated to a single function, such as Web applications or enterprise resource planning. Such dedicated allocation means these resources cannot be re-provisioned for other tasks when demand hits. Instead, IT departments are forced to acquire enough computing resources to satisfy each function's peak demand, which leads organizations to overspend on computing resources to compensate for this limitation imposed by the software. Thus, today's application models are undermining consolidation efforts and limiting the cost savings to the organization.

Enter SOA, which offers a software architecture that maps more closely to the utility computing models that have already been adopted, allowing better allocation of resources and greater provisioning controls. SOA enables organizations to develop and, more importantly, deploy their software in a loosely coupled fashion. This means the software is designed as a set of black boxes that have well-defined inputs and outputs, such that they can be linked together in a process flow with ease.

SOA operates on the concept of service providers and consumers who have agreed upon service contracts. These contracts are based on well-defined messages that are passed between the provider and the consumer, but at an abstract level, they are no different than the agreements you have with any other utility, such as the phone or electric company. Indeed, supporting these contracts requires the same service level agreements that would be expected of mission-critical utilities.
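The provider/consumer contract idea can be sketched in a few lines of code. This is a minimal illustration, not a real SOA stack: the message types and the `quote_service` function are hypothetical, standing in for the well-defined inputs and outputs a contract would specify in WSDL.

```python
from dataclasses import dataclass

# Hypothetical message types forming the service contract: provider and
# consumer agree only on these well-defined inputs and outputs.
@dataclass(frozen=True)
class QuoteRequest:
    symbol: str

@dataclass(frozen=True)
class QuoteResponse:
    symbol: str
    price: float

def quote_service(request: QuoteRequest) -> QuoteResponse:
    """Provider side: a black box that honors the contract."""
    # Illustrative fixed prices; a real provider would look these up.
    prices = {"ACME": 42.0}
    return QuoteResponse(symbol=request.symbol,
                         price=prices.get(request.symbol, 0.0))

# Consumer side: depends only on the message contract, never on the
# provider's implementation details.
resp = quote_service(QuoteRequest(symbol="ACME"))
print(resp.price)  # 42.0
```

Because both sides depend only on the message shapes, the provider can be reimplemented or relocated without breaking the consumer, which is exactly the property a utility-style agreement needs.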

While there is no concrete definition of SOA today, the generally agreed-upon attributes of SOA include being:

  1. Based on technology-neutral standards, such as XML and Web Services;
  2. Self-describing through metadata files created using the Web Services Description Language (WSDL);
  3. Discoverable through a URL mechanism, which means they use common Web mechanics for implementation;
  4. Stateless, which means they can easily be reused since they are not reliant upon specific resources being dedicated to them while they are in use.
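The fourth attribute, statelessness, is what makes a service interchangeable with any replica of itself. A minimal sketch, with an illustrative handler name, shows the idea: every request carries everything the service needs, so nothing is retained between calls.

```python
# A stateless handler: each request is self-contained, so any instance
# (or replica) of the service can answer any call -- the property that
# makes on-demand provisioning and reuse possible.
def convert_temperature(payload: dict) -> dict:
    """No session, no server-side state; output depends only on input."""
    celsius = payload["celsius"]
    return {"fahrenheit": celsius * 9 / 5 + 32}

# Because there is no retained state, repeated calls -- or calls routed
# to different instances -- are indistinguishable to the consumer.
print(convert_temperature({"celsius": 100}))  # {'fahrenheit': 212.0}
```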

These attributes lead to the creation of software that can be provisioned and allocated to a resource on demand, keeping in line with the tenets of utility computing. Moreover, the concept of providing a service means that computing power can be made available like other utility-based services, such as telecommunications and electricity, which have the ability to direct more or less service based on need.

Hence, this change in architecture leads to a fundamental change not only in the way software is designed, but also in the way it is deployed and managed. For example, once a service is developed, it needs to be deployed into an infrastructure that is operationally managed. This means the service may require access controls, redundancy, quality-of-service, service level agreements and root cause analysis when it is not available. These are the same services one would expect of any utility-based service.

In keeping with this model, then, SOA maps perfectly into existing utility models, such that IT can charge back infrastructure usage based on its existing utility metering, adding on charges for the number of times the service is invoked. Additionally, IT has more control over the resources used by these services, such that a single service will not require a significant allocation of a particular resource, as today's application servers or SAP servers do.
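Per-invocation chargeback can be sketched as a simple meter wrapped around each service. The service name and departments below are hypothetical; a real deployment would meter at the infrastructure layer rather than in application code.

```python
from collections import defaultdict
from functools import wraps

# Hypothetical invocation meter: counts calls per (service, department)
# so IT can charge usage back to individual departments or projects.
invocations = defaultdict(int)

def metered(service):
    @wraps(service)
    def wrapper(department, *args, **kwargs):
        invocations[(service.__name__, department)] += 1
        return service(*args, **kwargs)
    return wrapper

@metered
def credit_check(customer_id):
    # Illustrative stand-in for a real shared service.
    return {"customer": customer_id, "approved": True}

credit_check("finance", "C-100")
credit_check("finance", "C-101")
credit_check("sales", "C-200")
print(invocations[("credit_check", "finance")])  # 2
```

At billing time, the `invocations` table is exactly the per-department usage record the chargeback model needs.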

Thus, the organization can deploy services where there are available resources that fit the demand for that service. Moreover, this change will not impact existing applications if they use the model of "find and bind" that has become a cornerstone of SOA. That is, users can look up the location of a service in a registry and dynamically bind to and use that service without any prior knowledge of its existence.
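The "find and bind" pattern can be illustrated with a toy in-process registry. This is only a sketch: the registry here is a dictionary, where a real SOA deployment would use a network registry such as UDDI and bind via WSDL; the service name is invented for the example.

```python
# A toy registry implementing "find and bind": consumers look up a
# service by name at run time and bind to whatever endpoint is
# currently registered, with no prior knowledge of its implementation.
registry = {}

def publish(name, service):
    """Provider registers its service under a well-known name."""
    registry[name] = service

def find_and_bind(name):
    """Consumer resolves the name at run time; in practice this would
    be a UDDI/WSDL lookup over the network."""
    return registry[name]

# Provider publishes a tax-calculation service (illustrative rate).
publish("tax.calculate", lambda amount: round(amount * 0.06, 2))

# Consumer binds dynamically and invokes it.
tax = find_and_bind("tax.calculate")
print(tax(100.0))  # 6.0
```

Because the consumer holds only the name, the provider can be replaced or moved simply by re-publishing under the same name, which is why existing applications are insulated from change.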

This movement toward SOA requires a fundamental shift in the way that IT operates within most organizations today regarding software. Today, most IT organizations follow a system integrator model of delivering software, which means they take existing off-the-shelf applications and/or programming tools and deliver an automated solution to a business problem, similar to the service provided by Accenture or BearingPoint.

The move to utility computing using SOA requires the IT organization to operate more like the electric or phone company for their organization. The IT department may still do development work, but it will mostly be responsible for creating new services.  Eventually, business tools will create these services. Thus, IT will become the organization responsible for the infrastructure for provisioning resources for these services based on demand, with the goal of no interruption in service.

This is a major change for many IT organizations that are managing silos of applications today. If one application that supports fifty users is unavailable, IT can expect five to ten calls to the help desk. However, if a service that supports fifty applications is unresponsive, the help desk can expect to receive ten times that many calls. Needless to say, the impact of this paradigm change to the organization and the IT resources is significant from both a cost and resource perspective.

The utility computing model can save organizations millions of dollars each year through hardware and human resource consolidation. By adopting SOA, organizations can incorporate a software model that easily maps onto the utility computing model, such that resources do not need to be dedicated to particular applications, but instead can be provisioned based on demand within the organization. This approach requires the organization to plan for changes in how the IT department operates and the types of skills required to build and support an effective utility computing model.


More Stories By JP Morgenthal

JP Morgenthal is a veteran IT solutions executive and Distinguished Engineer with CSC. He has been delivering IT services to business leaders for the past 30 years and is a recognized thought leader in applying emerging technology for business growth and innovation. JP's strengths center on transformation and modernization leveraging next-generation platforms and technologies. He has held technical executive roles in multiple businesses, including CTO, Chief Architect and Founder/CEO. Areas of expertise for JP include strategy, architecture, application development, infrastructure and operations, cloud computing, DevOps, and integration. JP is the author of four trade publications, the most recent being "Cloud Computing: Assessing the Risks". JP holds both a Master's and a Bachelor's of Science in Computer Science from Hofstra University.
