Process Evaluation Report

Process Evaluation Report Template

The ITIL Process Evaluation Report documents the assessment results for a process and its activities, including process performance metrics and the identification of potential process improvement items.






Process Evaluation Report Template (MS Word)                 Process Evaluation Report Template (Adobe PDF)

1. Executive Summary

[A Process Evaluation is performed to assess the behavior and effectiveness of a specific process. Process evaluations should be done regularly as part of Continual Service Improvement in the organization. A Process Evaluation Report is the main output of a Process Evaluation. The Introduction declares the purpose, scope, and overview of the document itself.]

1.1 Purpose

[State the intention of the whole process evaluation document. Mention the name of the process being evaluated. This process name should also appear as part of the document’s title.]

In recent years, “The Organization” has implemented a program to increase excellence in the provision of its IT services. A fundamental part of that program has been the implementation of initiatives for the Continual Service Improvement of the IT services, including regular process evaluations of all the main IT processes. The purpose of this document is to record the findings of the Process Evaluation performed in October 2012 for the Incident Management Process in “The Organization”.

1.2 Scope

[Define here to which extent the selected Process Evaluation has been applied.]

The Process Evaluation summarized in this report encompasses all of the components of the Incident Management Process across all of the facilities of “The Organization”, including the headquarters, branches, and temporary facilities.

1.3 Overview

[Describe here the structure of the rest of the process evaluation document, along with the purpose of each section.]

To evaluate the process, data is collected in a number of ways. Section 2, Process Activity Data, consolidates the data collected with each of the methods employed to evaluate the process. Each method evaluates the process from a specific point of view.

In Section 3, Gap Analysis, the gaps identified by the previous methods are summarized.

In Section 4, Recommended Initiatives, a set of improvement initiatives is listed to close each of the gaps previously identified.

2. Process Activity Data

[This section summarizes data acquired for each of the methods used to evaluate the process. The number of sections will vary according to the methods that are applicable in your own case.]

2.1 Feedback

[Customer and user feedback is important to assess the perception that they have about the quality of services and the processes that support them.]

Processes in “The Organization” are continually monitored for customer and user perception. The survey framework set up as part of Incident Management Process monitoring is the source for this evaluation. Data gathered in the previous month forms the basis for this report.

The following table consolidates and summarizes the values obtained:

Survey                                Target   Value
User surveys
  Quality of service support          7.0      7.12
  Technical knowledge                 7.0      7.89
  Quality of solutions                7.0      7.28
  Readiness                           7.0      7.01
Customer surveys
  Alignment to business goals         7.0      7.64
  Quality of service support          7.0      7.45
Statistics combined
  Average user/customer survey score  7.0      7.56

Table 1. Surveys.

Results show an acceptable user and customer satisfaction with the service and processes.
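The pass/fail check behind that conclusion can be sketched in a few lines of Python. The targets and values are taken from Table 1; the script itself is only an illustrative assumption, not part of the organization’s survey framework:

```python
# Illustrative sketch: check each survey metric from Table 1 against its target.
surveys = {
    "Quality of service support (user)": (7.0, 7.12),
    "Technical knowledge (user)": (7.0, 7.89),
    "Quality of solutions (user)": (7.0, 7.28),
    "Readiness (user)": (7.0, 7.01),
    "Alignment to business goals (customer)": (7.0, 7.64),
    "Quality of service support (customer)": (7.0, 7.45),
}

for metric, (target, value) in surveys.items():
    status = "OK" if value >= target else "BELOW TARGET"
    print(f"{metric}: target {target}, value {value} -> {status}")
# Every metric meets or exceeds its 7.0 target.
```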

2.2 Measurements

[Measurement is an essential step in understanding how the process is performing. A measurement framework should have been developed and implemented to produce effective reports.

Consolidate and explain here the results obtained.]

As part of Service Management, processes are continually monitored. The same measurement framework set up for process monitoring and control is also the source for this evaluation. Data collected in the previous month forms the basis for this report.

The following table consolidates and summarizes the values obtained:

CSF / KPI                                                        Target    Value
Align incident management activities and priorities with those of the business
  Percentage of incidents handled within agreed response time    98%       98.89%
Resolve incidents as quickly as possible
  Mean elapsed time to solve incidents of priority = 1           1 hr.     1.25 hrs.
  Mean elapsed time to solve incidents of priority = 2           3 hrs.    2.70 hrs.
  Percentage of incidents solved at the first tier               67%       84.82%
User satisfaction
  Average user/customer survey score                             7.0       7.56

Table 2. Process Measurements.

Data show that the only indicator that fails its target is the mean elapsed time to solve incidents of priority = 1. The indicator was affected by unexpected, repeated failures in a critical database system. An initiative is recommended to increase redundancy and auto-healing in this critical system.
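The KPI evaluation above can be sketched the same way; note that time-based KPIs are met when the value is at or below target, while percentage and score KPIs are met when the value is at or above target. The figures come from Table 2 (the second elapsed-time row, with the 3-hour target, is read here as priority 2); the data structure is an assumption for illustration:

```python
# Illustrative KPI check for Table 2.
# Each entry: (target, value, higher_is_better).
kpis = {
    "Incidents handled within agreed response time (%)": (98.0, 98.89, True),
    "Mean time to solve priority-1 incidents (hrs)": (1.0, 1.25, False),
    "Mean time to solve priority-2 incidents (hrs)": (3.0, 2.70, False),
    "Incidents solved at the first tier (%)": (67.0, 84.82, True),
    "Average user/customer survey score": (7.0, 7.56, True),
}

failed = []
for name, (target, value, higher_is_better) in kpis.items():
    met = value >= target if higher_is_better else value <= target
    if not met:
        failed.append(name)

print("KPIs failing their target:", failed)
# Only the priority-1 resolution time fails its target.
```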

2.3 Documentation

[Check that process documentation exists and is up to date.]

All the predefined documentation for the process was certified to be:

  • Available.
  • Aligned with the Best Practices.
  • Comprehensive.
  • Updated.
  • Properly handled.

2.4 Maturity Assessment

[Assess the process in relation to best practices. The goal is to identify gaps and assign priorities for actions.]

The Incident Management Process has been evaluated and assigned maturity levels on a scale from 1 to 5. The scale used for each indicator is shown in Table 3, Process Maturity Levels.

Indicator    Maturity scale (1 to 5)
Capacity     1 - Ad hoc; 2 - Conscious; 3 - Capable; 4 - Mature; 5 - Optimum
Importance   1 - Unimportant; 2 - Low priority; 3 - Important; 4 - Very important; 5 - Critical
Automation   1 - Limited automation; 2 - Limited integration; 3 - Effective inside domain; 4 - Effective across domains; 5 - Enterprise-wide integration
Governance   1 - Unclear; 2 - Clear in some domains; 3 - Clear in all domains; 4 - Clear across domains; 5 - Business aligned

Table 3. Process Maturity Levels.

Results are as follows:

  • Capacity: the process is deemed capable in many management domains.
  • Importance: the process is considered very important for the effectiveness of the IT services.
  • Automation: many tools exist for monitoring and automating the process, with some automation inside the management domains. Automation across management domains is poor.
  • Governance: governance is clearly understood inside all the domains and across the critical domains. The information needed for decision making is available.

As a conclusion, an improvement initiative is advised as an immediate step to increase the integration of the existing automation tools based on System Center.

2.5 Process Audits

[Process audits, usually by an external party, are conducted to certify compliance with standards and regulations. Mention here the findings of the most recent audit(s), if any, whose conclusions are relevant to the process and can be included as part of this process evaluation.]

To ensure an independent view of the company’s IT infrastructure controls and its adherence to industry regulations, an agreement is in place with the “Audit Commission” for the periodic auditing of “The Organization” IT processes. For the complete report, refer to (“The Organization”. IT Processes Audit Report. Last quarter 2012.)

Here is a summary of the findings relevant to the Incident Management Process:

1. All of the recommendations from the previous audit were implemented. Most of the problems have been solved.
2. Concerns remain about security and integrity. A unified approach is suggested.

2.6 Benchmarking

[Fill in this section of the process evaluation if you can gauge and compare the process with other processes, with the same process in other parts of the organization, or with the same process in other organizations.]

As part of the agreement with “The Audit Commission”, maturity levels are rated against the average of the same process in other organizations. The results are shown in the following table:

Indicator     Companies’ average benchmark    “The Organization” benchmark
Capacity      3                               3
Importance    3                               4
Automation    2                               2
Governance    3                               3

Table 4. Benchmark.

As shown in the table above, “The Organization” currently ranks at or above the median for the companies surveyed by “The Audit Commission”. Nevertheless, as many competitors are implementing programs to enhance their IT services, it is important to implement improvement initiatives to keep services to the business at competitive levels.
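As a minimal sketch (figures from Table 4; the structure is assumed for illustration only), the peer comparison can be computed directly:

```python
# Illustrative comparison of "The Organization" against the peer average (Table 4).
benchmark = {  # indicator: (peer_average, organization)
    "Capacity": (3, 3),
    "Importance": (3, 4),
    "Automation": (2, 2),
    "Governance": (3, 3),
}

for indicator, (peers, org) in benchmark.items():
    delta = org - peers
    print(f"{indicator}: organization {org} vs peer average {peers} (delta {delta:+d})")
# The organization matches the peer average on every indicator and leads on Importance.
```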

3. Gap Analysis

[One of the most important outcomes of the process evaluation report is identifying gaps for which improvement initiatives can be implemented. These gaps are measured against desirable levels such as best-practice recommendations, other companies’ results, or in-house targets. Summarize here the gaps identified in the preceding sections.]

As a result of the Incident Management Process evaluation, the following gaps have been identified:

a. Critical incidents are not solved as quickly as agreed in the SLA. One KPI (mean elapsed time to solve incidents of priority = 1) cannot reliably be met under the current process structure.
b. The maturity level in the Automation category is still too low (2 out of 5). The other indicators may be enhanced as well.
c. Security and integrity remain a concern when compared with industry-accepted practices.

4. Recommended Initiatives

[For each finding identified in the previous section, enumerate initiatives to close the gap.]

For each gap identified in the previous section, the following initiatives are recommended:

Gaps Initiatives
Critical incidents are not solved as fast as agreed in the SLA
  • Increase availability through redundancy and backup technologies.
  • Implement auto-healing procedures using the available Service Management tools.
The maturity level in the category Automation is still too low
  • Increase automation across management domains.
  • Expand licenses for new options in the Service Management tools.
Security and integrity is still a concern
  • Redesign interaction with Security Management and Access Management.
  • Expand integration along the Service Management tools.

Table 5. Gaps vs. Initiatives.

5. Annex

[Insert here anything you may like to attach to support the process evaluation document.]

5.1 Glossary

[This section of the process evaluation report provides the definitions of terms, acronyms, and abbreviations required to understand this document.]

Term Definition
Access Management The process responsible for allowing users to make use of IT services, data or other assets.
Assessment Inspection and analysis to check whether a standard or set of guidelines is being followed, that records are accurate, or that efficiency and effectiveness targets are being met.
Audit Formal inspection and verification to check whether a standard or set of guidelines is being followed, that records are accurate, or that efficiency and effectiveness targets are being met.
Availability Ability of an IT service or other configuration item to perform its agreed function when required.
Benchmark A baseline that is used to compare related data sets as part of a benchmarking exercise.
Capacity The maximum throughput that a configuration item or IT service can deliver.
Compliance Ensuring that a standard or set of guidelines is followed, or that proper, consistent accounting or other practices are being employed.
Critical Success Factor (CSF) Something that must happen if an IT service, process, plan, project or other activity is to succeed.
Customer Someone who buys goods or services. The customer of an IT service provider is the person or group who defines and agrees the service level targets.
Gap Analysis An activity that compares two sets of data and identifies the differences. Gap analysis is commonly used to compare a set of requirements with actual delivery.
Governance Ensures that policies and strategy are actually implemented, and that required processes are correctly followed.
Incident An unplanned interruption to an IT service or reduction in the quality of an IT service.
Integrity A security principle that ensures data and configuration items are modified only by authorized personnel and activities.
Key Performance Indicator (KPI) A metric that is used to help manage an IT service, process, plan, project or other activity.
Maturity A measure of the reliability, efficiency and effectiveness of a process, function, organization etc.
Process A structured set of activities designed to accomplish a specific objective.
Redundancy Use of one or more additional configuration items to provide fault tolerance.
Service Level Agreement (SLA) An agreement between an IT service provider and a customer.
User A person who uses the IT service on a day-to-day basis.

Table 6. Glossary.

5.2 List of Tables

[This section includes a list of all of the tables in the process evaluation report.]

Table 1. Surveys
Table 2. Process Measurements
Table 3. Process Maturity Levels
Table 4. Benchmark
Table 5. Gaps vs. Initiatives
Table 6. Glossary

5.3 Bibliography

Audit Commission. (2012). "The Organization". IT Processes Audit Report. Last quarter 2012.

 

Copyright © 2012 The Corporation, Inc.
Mark Piscopo