Introduction to IT Process Automation

For many organizations, IT Process Automation (ITPA) is little more than a dream. IT managers, easily overwhelmed by the complexity and day-to-day rigor of maintaining IT services and supporting production environments, are spread far too thin to spend the time necessary to examine all the variables that must be understood to completely automate systems. The business that they support is often complex and, to some, a mystery fraught with continuous change management concerns, never-ending problem management sessions, and production incidents that keep teams around the globe busy at all hours of the day and night. Processes are being initiated to manage incidents, changes, server provisioning, backup and recovery exercises, and security, just to name a few, and these processes depend upon many different systems and subject matter experts (SMEs). Further, the systems used to maintain the IT infrastructure are themselves in a constant state of change as vendors release new software versions and security fixes are applied to existing systems to protect the infrastructure. Considering all this complexity, the thought of automating IT processes can be quite overwhelming. Where to begin?

This guide examines IT processes and the steps organizations can take to begin to turn the dream of ITPA into a much-anticipated reality. The benefits of ITPA will enable IT organizations to take full command of their infrastructure, realize greater efficiency, become more effective, and deliver an amazing level of business agility that will enable any organization to rapidly respond to changing business requirements with reduced overhead and greater focus on the business.

Definition

Fundamentally, a process is defined as the act of proceeding. It is the act of moving forward, continually, in a controlled manner. Taken by itself, this definition fits the many vertical needs of organizations. Processes are created, documented, and aligned to meet the many needs of business and IT operations, from associate on-boarding human resource processes to manufacturing, shipping, IT procurement, project planning, and security. Unlike human resources and the other more traditional process areas, however, IT processes are highly system-centric. That is to say, IT operations processes depend upon many different, often conflicting or competing, hardware, software, and systems solutions to meet their goals.

One universal challenge to all IT organizations is the matter of server provisioning. Once a project requests a server, and those requirements are understood, many teams are often called upon to facilitate the request including those engineers or vendors who physically install the servers, those who install the core build, and those who ultimately install the applications. To deliver server resources in the most expeditious manner, the teams supporting your infrastructure have defined many processes and implemented many different types of software solutions to fulfill those technical requests. These teams are generally well prepared to meet the challenges they face; what they often lack, however, is a means of meeting the challenges in the most efficient and effective way.

As any IT executive who has gone through a corporate merger can attest, the biggest challenge isn't any single task, such as getting a Help desk system to integrate with a paging and escalation system or provisioning servers; it is getting every system to integrate with every other. Vendor solutions often come equipped out of the box with a means to improve quality control across your enterprise as well as increase the throughput of your IT processes. The idea of integrating with competitors, however, is foreign and abstract to many vendors. Why would "the best" product integrate with a "lesser" product, and if it did, what would be the customer's motivation to consolidate to "the best" solution? This often leads to the establishment of "tribal" knowledge and very low process maturity, and although the psychology of vendor management is outside the scope of this text, the underlying problem is most clearly illustrated through such examples. Vendors lack the motivation, time, and resources necessary to drive integration to the level needed for end-to-end ITPA to be realized. ITPA tools bridge this gap and break through the silos that the "tribal" mentality has established, leading to solutions that deliver cross-functional and cross-silo efficiencies.

Responding to the need to manage IT in a controlled and predictable manner, many of your peers have begun to leverage industry-level best practices as a guide. Best practices for financial IT systems, for example, can be overheard at any major financial conference just as the best practices for IT in healthcare are being put forth during healthcare conferences. One thing these practices do (or should) have in common is a link to a larger framework. The best practice in service desk management isn't limited to a single industry segment, and selecting the best ticketing system for your Help desk, project management and change management system for your delivery teams, and provisioning software for your infrastructure teams is critical to the value proposition of IT management. IT processes simply must be aligned as efficiently and effectively as possible, and ITPA is a way to not only align your best practices but also deliver them more efficiently than ever before thought possible. Software capable of orchestrating the IT infrastructure to meet the needs of the business can be leveraged, along with tighter integration and configuration management databases (CMDBs), to deliver truly automated IT processes to the data center.

Facing the Data Center Automation Challenge

Today's data center involves myriad software, hardware, people, and processes with one purpose—serve the information technology (IT) needs of the business in the most cost-effective and efficient manner possible. Unfortunately, no organization is currently able to fully realize this purpose. Hardware and software solutions have been designed, implemented, and purposed to fulfill very specific needs. Decision criteria for software, hardware, and systems procurement should be set by the needs of the organization requesting the system, the organization that will ultimately use it. Allowing these decisions to be driven out of a non-technology group, however, can cause problems.

For example, when handling the movement of data, the likely candidates from an IT perspective would be an IT Storage Solutions team designated to manage the systems that house, manage, and administer data, and the IT Server Support team designated to support the production system. The reality is that how data is managed is often defined more by the application system being procured and delivered than by your internal technology teams. As a result, systems are purchased and implemented into the data center that do a relatively good job of delivering on their specific goals for the business but are difficult to manage, difficult to maintain, and difficult to optimize.

Keep in mind that there is a lot of great software available for IT management today that wasn't available, even in concept, as little as 5 years ago. Although these management tools do a good job delivering on their out-of-box promises, they often do very little in the way of integrating into a holistic IT operations management view. Why? The answer, simply put: it's not their job.

Consider a vendor who develops monitoring software and could invest the time and resources to integrate that software with another vendor's ticketing system. Such a feature may be given significant weight in procurement and would be a great feature, but just as there are many monitoring software vendors, there are also many ticketing systems. So the question becomes: which one(s) do they attempt to integrate with? The added overhead of attempting to integrate with all the potential interface points would be overwhelming for even the largest of contenders.

Further complicating today's IT infrastructure is the matter of sprawl. In the IT management context, "sprawl" means much the same as it does in urban development and refers quite directly to irresponsible development, design, adoption, or implementation of IT solutions. Such sprawl usually leaves the IT executive with a disparate and difficult-to-support environment consisting of incompatible "standards" and architecture. Instead of a single means of storage management, an organization may find that it has over time adopted several kinds, each with its own purpose. A small application in a business unit, for example, may use network attached storage (NAS) for storage and retrieval of temporary files at the local site. At the time, the decision seemed to be the best choice based upon cost and network bandwidth constraints. Today, however, as organizations migrate smaller storage units over to storage area networks (SANs), NAS equipment may run counter to an organization's SAN direction. If the line of business has no motivator to change its system, IT managers may be left supporting technology artifacts that have simply accumulated over time.
Fast forward to today's environment; while many have since learned their lesson on storage sprawl, new technologies, such as virtualization, offer up similar challenges that haven't yet been fully realized and are an order of magnitude more complex. Virtualization involves more than just a device; it is hardware and software working in concert to deliver the production platform on which your operational stability will be based, and its sprawl, left unchecked, could cripple operations under the weight of manual process overhead.

Organizations prone to transition, or that have gone through many rounds of acquisition and periods of growth, find this problem in abundance. Reversing this equation will require three very important and integrated parts, which we will refer to as the pillars of ITPA.

Architecting the Next-Generation Data Center

Now that we have identified some of the fundamental problems that slow or are currently preventing centralized IT management, the next step is to identify and take action to orchestrate change towards a more cohesive IT process model. What will set the next-generation data center apart from current technology will be its ability to orchestrate processes across systems to drive IT efficiency, support business agility, and enable legal and regulatory compliance. This will be accomplished through ITPA tools, configuration management, and tighter integration within the data center.

Pillar 1: ITPA

A key success factor for the next-generation data center will be its ability to gather, analyze, and make decisions based upon predefined IT processes. Solving this problem is at the very core of the definition of ITPA. It is important to first point out, however, that ITPA is not job scheduling nor does it necessarily entail any scripting or coding. What ITPA is, rather, is an emerging orchestration methodology that enables the cohesive automation of IT processes from an IT management or business process view.

When deciding upon the right ITPA tool for your organization, it is important to make certain that the options being considered are actually within the definition of ITPA. ITPA tools tend to be process oriented, rather than programming oriented, and require less technical acumen to implement and leverage than non-ITPA tools. A good ITPA tool will seamlessly integrate, out of the box, with your IT systems with no additional programming or scripting required and should have a very low technical learning curve so that those who are focused on business processes, rather than programmers, can make full use of the tool.

ITPA tools will enable your front-line IT managers to deliver cross-platform process integration. By contrast, traditionally the software most appealing to an IT manager is not necessarily the biggest or best but the one that works with what they already have within the infrastructure. At times, this may mean sacrificing features for the software's ability to integrate with pre-existing software. By providing orchestration between software, ITPA tools render this decision weight irrelevant and enable IT managers to make decisions based upon the qualities of the software rather than its ability to be integrated.

Pillar 2: CMDB (Inventory)

As ITPA addresses the orchestration of software and systems to fulfill IT processes, CMDBs, as the second pillar of the next-generation data center, will need to address how data about those information systems is kept up to date. A CMDB is a repository of information related to all the components of all the information systems within an IT infrastructure. CMDBs store granular information about information systems in records known as Configuration Items (CIs). Each CI contains data relating to the technology, the ownership of the technology, and the relationship of the technology to other CIs. The result is a central database that contains an inventory of relationships between systems, which ITPA tools can use to deliver added value.
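As a rough illustration, a CI record with ownership and relationship data might be modeled along these lines. The field names here are hypothetical and not taken from any particular CMDB product:

```python
from dataclasses import dataclass, field

@dataclass
class ConfigurationItem:
    """A single CI: the technology, who owns it, and how it relates to other CIs."""
    ci_id: str
    ci_type: str                  # e.g., "server", "application", "storage"
    owner: str                    # owning team or subject matter expert
    attributes: dict = field(default_factory=dict)
    related_to: list = field(default_factory=list)  # ci_ids of related CIs

# An application CI that depends on a database server CI
db_server = ConfigurationItem("ci-001", "server", "IT Server Support",
                              {"os": "Linux", "role": "database"})
order_app = ConfigurationItem("ci-002", "application", "Line of Business",
                              {"name": "OrderEntry"}, related_to=["ci-001"])
```

An ITPA tool that can walk `related_to` links can determine, for example, which applications are affected when a server CI changes.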

CMDBs help organizations meet one of the biggest challenges within IT management: asset and inventory management. Acquisition, sprawl, and a multitude of other factors have contributed to the size of your organization's infrastructure today, and inventorying it can be a challenge. An ITPA solution can help alleviate the pain associated with setting up and maintaining a CMDB by automating processes that search other inventory systems (such as those used by your procurement process) and integrate those assets into the CMDB. Ongoing CMDB maintenance from that point forward could consist of ITPA processes created to add further value by bolstering a CMDB's ability to automatically discover information about CIs and track and report on any changes as they occur in near real-time.
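A minimal sketch of such an on-boarding process, assuming the procurement system can be read as a list of asset records (all names here are illustrative):

```python
def sync_procurement_to_cmdb(procurement_assets, cmdb):
    """Add any procurement asset not yet tracked in the CMDB; return new CI ids."""
    added = []
    for asset in procurement_assets:
        if asset["id"] not in cmdb:
            cmdb[asset["id"]] = {
                "type": asset["type"],
                "owner": asset.get("owner", "unassigned"),
            }
            added.append(asset["id"])
    return added

# The CMDB already knows about one server; procurement reveals a storage unit
cmdb = {"srv-01": {"type": "server", "owner": "IT Server Support"}}
assets = [{"id": "srv-01", "type": "server"},
          {"id": "nas-07", "type": "storage", "owner": "Storage Solutions"}]
new_cis = sync_procurement_to_cmdb(assets, cmdb)  # only nas-07 is new
```

Run on a schedule, a process like this keeps the CMDB from drifting away from what procurement has actually purchased.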

Pillar 3: Integration

The third pillar in the architecture of the next-generation data center is one that has been deeply pursued and widely understood for some time, although its full realization has yet to come to fruition—integration. Systems simply need to be able to work and function together without human intervention. The challenges preventing integration are equally as fundamental.

As an analogy, imagine for a moment that the data center of today is an airplane and that its function is to move us from one point to another. Like an aircraft, much depends upon the data center and peoples' lives can be significantly impacted if either were to malfunction. Many countries have worked together to standardize many aviation safety requirements. The result is that a pilot trained and skilled to fly a small aircraft in the U.S. will possess the skills necessary to fly an aircraft in the UK. The function is the same (move through the air) and the controls have been well thought out and established so that yoke, rudder, and aileron controls all function in much the same way. If everyone were granted the capability of designing their own airplanes using parts from many different vendors without any centralized control, the world would be a much more dangerous place. Data centers are that dangerous place. It took decades for the aviation industry to become what it is today, and in its infancy, the airplane presented a much less complicated "infrastructure" of components. So how do we replicate, in the next-generation data center, the tight integration the aviation industry has achieved?

  • Make it a priority—With the many vertical silos that exist in supporting line-of-business operations, driving toward a goal such as integration will need to be addressed systematically. To do so, you will need the backing of senior management. In aviation, the driver is government regulation; in an organization, it typically starts at your level. It can even be said that the only person within an organization who is able to perform strategy work is the chief officer, because only he or she can provide the incentive to see that strategic decisions are fully realized.
  • Standardize—To meet the goal of integration, organizations must adopt standards. It's important to frame the standardization discussions in the appropriate context and keep the conversation moving forward. Ask your teams questions about the software, hardware, and systems in production. If you have more than one type of software or equipment performing a specific task, take a closer look to identify why. Many times, IT managers will find that the reason for differing products or services is due to differing business requirements in or among several business units. The real component that needs to be understood, however, is how the technical requirements differ. The challenge then is to work with business partners to close any of these costly gaps.
  • Provide incentives—To develop and then maintain an internal direction to standardize, your organization will need incentive. Senior management support is critical to getting it off the ground, but maintaining momentum, and focus, can best be achieved through incentives. Wherever possible, link the risk/reward associated with standardization to financial impacts. Associating an actual financial impact to a technology decision provides a driver that can be leveraged in terms the business can quantify and relate to.

Linking the Present to the Future

Today, as organizations strive to reach their operational goals, to drive out greater efficiencies, and to become more agile, and as they meet the challenges and opportunities presented by compliance with IT standards, the wise CIO will be looking for ways to link the organization's IT goals to business goals. Each of these three pillars solves problems today that can be employed to build a business case for adoption. ITPA tools can be leveraged today both to drive integration and to help with on-boarding past, present, and future CIs into the CMDB. ITPA tools may also be leveraged to open a trouble ticket and even initiate further troubleshooting steps automatically by integrating with all the software used to manage your IT infrastructure. This saves time and reduces manual process overhead.

Strategic Drivers for Process Automation Now

Achieving cost-effectiveness and efficiency in the data center is a primary business goal of most organizations. Today's data centers, however, often have internal and external drivers, many times in conflict, that impact their ability to deliver on their purpose.

Generally, these drivers fall into three categories:

  • IT effectiveness and efficiency
  • Business agility
  • Standards and compliance

An organization's ability to realize the goals behind the drivers in each of these categories can be significantly improved through ITPA. As we examine each of these drivers in detail, we will begin to reveal much of the power and flexibility ITPA and ITPA tools can provide.

IT Efficiency

Efficiency is most commonly defined as the ratio of output to input for a system. In IT, efficiency is measured in many ways. A data center may measure efficiency by the load of work on the central processing units (CPUs) of its servers against the number of "transactions" processed by the line of business using known benchmarks. It may measure efficiency by comparing and contrasting power consumption against work performed. Further still, efficiency can be measured by the number of man-hours spent intervening in IT processes that could have been automated. It is this last area that is most directly impacted by ITPA.

Today, many IT processes are manually executed. Help desk support technicians perform tasks of varying complexity within support software to assist business partners with common technical problems or requests. These are often repetitive tasks that regularly occur, such as a request for a new server, and may require an IT process to be executed in part by manual processes, such as ordering a server, which may require a human to interact with more than one system to validate the request, confirm identity of the ordering party, and so on.

An effective ITPA tool will allow users to build and automate IT processes visually without the need to script or program anything. This reduces the technical acumen required to deliver and automate IT processes. Downstream of the server-ordering process is the provisioning process. An ITPA tool can quickly and easily consolidate several manual IT processes into a single executable process that requires no IT management intervention. Figure 1.1 illustrates the steps necessary to automate the process of provisioning a virtual server.

Figure 1.1: Provisioning a server.
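The kind of sequence Figure 1.1 depicts can be sketched as a pipeline of steps executed without human intervention. The step names below are illustrative, not drawn from any particular ITPA product:

```python
# Each step takes the request, does its work, and passes the request along.
def validate_request(req):
    req.setdefault("log", []).append("validated")
    return req

def allocate_vm(req):
    req["log"].append("vm allocated")
    return req

def install_core_build(req):
    req["log"].append("core build installed")
    return req

def install_applications(req):
    req["log"].append("applications installed")
    return req

def register_in_cmdb(req):
    req["log"].append("registered in CMDB")
    return req

def provision_virtual_server(request):
    """Run every provisioning step in order; each step receives the prior output."""
    for step in (validate_request, allocate_vm, install_core_build,
                 install_applications, register_in_cmdb):
        request = step(request)
    return request

result = provision_virtual_server({"requested_by": "project-alpha"})
```

The point of the sketch is the shape, not the steps: once each team's task is expressed as a callable unit, the whole chain runs end to end with no hand-offs between teams.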

The gap in IT efficiency for many organizations is quite broad and ideas fueling new ways to fix efficiency gaps abound. At the crux of the issue is an organization's ability to simplify IT management. ITPA tools simplify IT management by reducing the number of people required to automate IT processes, reducing the technical acumen involved in creating solutions, and ultimately improving the productivity of the environment.

Business Agility

To be successful, organizations must be agile both in thought and action. Reducing the time needed to react to changing market conditions by implementing new business initiatives is vital to a business' success. IT organizations can contribute greatly to the agility of the businesses they serve by automating IT processes.

Consider an organization that has grown large enough to acquire a competitor and expand their market but whose only major limiting factor is that the organization they want to acquire is known to have deployed a different IT infrastructure model than their own organization. Integrating these two organizations is going to present a significant challenge to IT resources and create a rather large need for IT capital for project teams and replacement systems during transition. An often-unreported loss in the transition and acquisition process is the loss associated with lost software licenses. The company being acquired may have another year or more left on their software licensing agreements that will now no longer be leveraged. Wouldn't it be great if both organizations could integrate their IT processes without a wholesale replacement of software during acquisition? Through ITPA, they can. Rather than forcing an immediate change in IT processes, which would heavily impact IT associates and their core knowledge base and competencies, these two organizations could leverage an ITPA tool to integrate the disparate systems during acquisition. As Figure 1.2 illustrates, the newly acquired company's software is simply added as an interface point for the process within the ITPA tool. The end result is an organization that can rapidly assimilate new products and services with virtually no impact to the underlying IT processes.

Figure 1.2: Add a new monitor to quickly assimilate a new application.
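In code terms, "adding a new monitor as an interface point" amounts to registering another event source with an existing process. A hedged sketch (the class and method names are hypothetical):

```python
class IncidentProcess:
    """An ITPA process that consumes events from any registered monitoring system."""
    def __init__(self):
        self.monitors = []

    def register_monitor(self, monitor):
        self.monitors.append(monitor)

    def poll(self):
        """Gather events from every registered monitor, regardless of vendor."""
        events = []
        for monitor in self.monitors:
            events.extend(monitor.get_events())
        return events

# Stub adapters standing in for two different vendors' monitoring systems
class LegacyMonitor:
    def get_events(self):
        return [{"source": "legacy", "summary": "disk full"}]

class AcquiredCompanyMonitor:
    def get_events(self):
        return [{"source": "acquired", "summary": "service down"}]

process = IncidentProcess()
process.register_monitor(LegacyMonitor())
process.register_monitor(AcquiredCompanyMonitor())  # one call assimilates the new system
events = process.poll()
```

The downstream incident process is unchanged; only the list of registered sources grows, which is the agility the acquisition scenario above calls for.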

We have thus far illustrated how ITPA tools can be leveraged to provide for automated monitoring and reaction to incidents, but they can also go a step further and provide reporting and escalation based upon SLA goals. With the proper ITPA tool in place, senior management can be kept abreast of issues that matter most to their production environments and receive escalation warnings well in advance of service level objectives.
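An SLA escalation check can be as simple as flagging tickets that have consumed most of their SLA budget. The thresholds and field names below are illustrative assumptions:

```python
def escalation_warnings(tickets, sla_hours, warn_fraction=0.8):
    """Return ids of tickets that have used warn_fraction or more of their SLA window."""
    warned = []
    for ticket in tickets:
        budget = sla_hours[ticket["severity"]]
        if ticket["age_hours"] >= warn_fraction * budget:
            warned.append(ticket["id"])
    return warned

# A 4-hour SLA for critical tickets; warn at 80% of the window
sla_hours = {"critical": 4, "high": 24}
tickets = [{"id": "T1", "severity": "critical", "age_hours": 3.5},
           {"id": "T2", "severity": "high", "age_hours": 2.0}]
at_risk = escalation_warnings(tickets, sla_hours)  # T1 is near breach
```

Wired into an ITPA process, a check like this is what lets senior management hear about a ticket before, rather than after, the service level objective is missed.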

Standards and Compliance

As pointed out earlier, the aviation industry would likely have had a much more difficult and turbulent history had they not embraced, or through government controls been forced to embrace, some form of standardization. Some regulatory compliance drivers exist today that impact IT operations, especially within organizations that provide financial or medical services or that contract with the U.S. government, but no centralized regulatory organization exists to completely and authoritatively address IT infrastructure standardization. Instead, organizations need to work within and comply with many differing outside "standards" in order to be effective.

An IT operations team may find that it needs to put into place specific technical and administrative security controls to comply with the Health Insurance Portability and Accountability Act (HIPAA), the Sarbanes-Oxley Act (SOX), or any number of government regulations. Internal governance teams may also decide to adhere to internationally accepted best practices and frameworks such as ITIL and the Control Objectives for Information and related Technology (COBIT). Further still are those directives that apply throughout an organization, such as using Six Sigma as the standard process development methodology. Each of these "standards" has the potential to weigh heavily in IT decisions that have an effect on day-to-day operations. As government regulations mandate controls to how organizations do business, IT systems will need to be made compliant with those controls, and as ITIL is adopted as a standard framework for IT Service Management and Six Sigma used as the methodology for driving out process efficiencies, pressures mount. It may not be sufficient to merely deliver ITIL's overall framework; IT managers may be challenged to do so in a way that generates metrics suitable for use in Six Sigma or other process improvement efforts.

ITIL provides many advantages to IT organizations, particularly in the area of IT service support. Within the ITIL framework, service support comprises six main functions:

  • Service Desk—The goal of the service desk is to provide a single point of contact between business partners (the users of IT systems) and the IT service managers (who maintain IT systems).
  • Incident Management—This function provides a specific focus on restoring normal service operation as rapidly as possible and on minimizing the impact of service interruption on the line of business. Incident management goals are often set forth with SLAs between the IT organization and the line of business unit dependent upon the technology.
  • Problem Management—Problem management backs up incident management with root cause analysis specifically focused on problems caused by errors within the IT infrastructure.
  • Configuration Management—The objective of configuration management is to track all the individual CIs within the IT infrastructure, a task which is typically accomplished through the use of a CMDB.
  • Change Management—This function serves to ensure that changes to the IT infrastructure occur in a standard way and to minimize any opportunity for adverse impacts to line of business operations resulting from change initiatives.
  • Release Management—Like change management, the goal of release management is to minimize impact related to changes but with a specific focus on the protection of existing services. Regular updates to server platform operating systems (OSs), such as the application of security updates and hotfixes, is an example of release management.

Within each of these main functions, specific IT processes will need to be developed. Incident management may benefit from an IT process designed to monitor for IT events and dispatch tickets to the appropriate teams automatically and, depending upon the tools available within your infrastructure, automatically provide technicians with problem details through basic automated troubleshooting and provide senior managers with escalation reports.
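A simple sketch of automated ticket dispatch, assuming events carry a category that maps to a responsible team (the routing table and field names are hypothetical):

```python
# Hypothetical routing table: event category -> responsible support team
ROUTING = {"database": "DBA Team", "network": "Network Ops"}

def dispatch_ticket(event, default_team="Service Desk"):
    """Open a ticket routed to the team responsible for the event's category."""
    return {
        "summary": event["summary"],
        "assigned_to": ROUTING.get(event["category"], default_team),
        "severity": event.get("severity", "medium"),
    }

ticket = dispatch_ticket({"category": "database", "summary": "replication lag"})
```

Unrecognized categories fall back to the service desk, so the process still lands every event somewhere a human will see it.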

Problem management can benefit from automated trend analysis. When walking the line between the business needs of the line of business partners and the business needs of the IT organization, perspectives and priorities can become blurred. Take the common business goal of reducing operating costs. To a line of business partner, this may mean reducing the amount of money that can be spent on new initiatives. To the technology organization, it may mean seeking out funding to replace costly antiquated systems to trim support costs. Without a means of aligning perspectives, relationships can become strained and data gathered will lack the perspective necessary to make good business decisions. Using ITPA tools, IT teams can begin to bridge this gap. Data within incident tracking systems can be analyzed by running predefined queries against the trouble ticket system across multiple databases.
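One way to picture that cross-database analysis: run the same query against every ticket database and aggregate the results. A minimal sketch, with in-memory lists standing in for the trouble ticket databases:

```python
from collections import Counter

def incident_trend(databases):
    """Count incidents by category across several ticket databases,
    most frequent first."""
    counts = Counter()
    for db in databases:
        for ticket in db:
            counts[ticket["category"]] += 1
    return counts.most_common()

# Two ticket databases, e.g. from two business units
db_a = [{"category": "storage"}, {"category": "network"}]
db_b = [{"category": "storage"}, {"category": "storage"}]
trend = incident_trend([db_a, db_b])  # storage incidents dominate
```

Aggregating across databases is the step that gives the line of business and the IT organization one shared view of where incidents actually cluster.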

IT process surrounding change and release management can also benefit from automation. ITPA tools can be leveraged here to automate distribution processes, saving time spent in change and reducing manual overhead through repeatable and consistent ITPA processes.

Conclusion: Increased IT Effectiveness and Efficiency Through ITPA Delivers Business Agility

At its core, ITPA is about orchestrating the IT infrastructure to meet the needs of the business. It is about accomplishing business objectives through intelligent ITPA tools that can integrate multiple disparate technology systems. ITPA tools increase IT effectiveness and efficiency by being vendor agnostic and flexible enough to meet changing business needs rapidly and with reduced overhead. In doing so, ITPA tools create an agile business environment where decisions about IT product and service acquisition can be based upon the merit of the product or service rather than its ability to integrate, on its own, into an IT infrastructure.

Combining ITPA tools with the other two pillars of the "Lights Out Data Center," CMDB and integration, will deliver a future-state data center with a tremendous amount of power and flexibility that will afford IT managers the opportunity to center their attention less on service support and more on service delivery.