

Seeing the Forest Through the Trees
A Multi-Part IT Service Management Novella - Part 4: “Assess for Success”
By Mukesh Barot, Navy IT Service Management Office and Phil Withers, Navy ITSMO Contractor Support Staff - April-June 2015
Editor’s Note: This is the fourth installment in a series designed to highlight the products and services of the Navy IT Service Management Office, relating their capabilities in a business-case story format spread over succeeding chapters of an IT Service Management novella:

Part 1: http://www.doncio.navy.mil/chips/ArticleDetails.aspx?ID=5299
Part 2: http://www.doncio.navy.mil/chips/ArticleDetails.aspx?ID=5640
Part 3: http://www.doncio.navy.mil/chips/ArticleDetails.aspx?ID=5924

He was thinking that he was in a really good place. Recapping the sequence of events, Bob thought about his initial foray into IT Service Management after being named a process owner, and how overwhelming it all seemed at the time. Had it not been for the resources and assistance he found at the Navy IT Service Management Office, doubtless he’d be referring to ITSM as a dirty four-letter word like many of his fellow process owners did.

Ah, but with those resources…. Bob absent-mindedly motioned with his finger as if checking off an imaginary to-do list and remembered that he had gotten a firm grip on the governance aspects of his process, ensuring that the process managers were trained in the scope of their responsibilities.

Bob remembered he was able to create and promulgate a cohesive strategic communication plan to foster messaging unity for his process in concert with the enterprise strategy. By using the Navy Process Reference Model (NPRM), he had based his process on international standards and industry best practice. He also now had a service quality management plan, based on a Navy ITSMO guide and template, that captured the metrics for his process and stepped logically through implementing and sustaining end-to-end service quality. These were all good things, to be sure.

But Bob furrowed his brow as he recalled his last conversation with Sally. He had been discussing all of these accomplishments with several other process owners, who — by the way — had jotted down the links to the various Navy ITSMO documents and information contained on the wiki site. I should get a commission for recommending the Navy ITSMO documents, he thought. During that discussion, Bob had nonchalantly stated that the enterprise request fulfillment process was running like a top (or words to that effect).

“How do you know?” Sally said.

“Pardon me?” Bob had been holding court and was on a roll when the question stopped him cold.

Sally continued, “How do you know your process is, as you say, running like a top? What quantifiable measurements have you put in place that show an improvement trend, with identified gaps that let you focus your improvement efforts going forward?”

Of course, Bob knew about the Process Capability Assessment Model and Tool (PCAT), and had even listened to the overview brief on the Navy ITSMO wiki portal. He was about to name-drop the PCAT when Sally added: “Furthermore, have you identified your process SWOT criteria? A SWOT analysis is a key deliverable to leadership demonstrating that you have indeed done the hard work of establishing a process capability baseline — where you are right now — and that you have started on a well-defined path to process improvement — where you want to be in six months, a year, or longer.”

“I was just getting to that!” Bob had said with a wide grin.

He wasn’t grinning now. After that impromptu meeting, Bob had made a beeline for his office and, sitting at his desk, scoured the local hard drive data dump he had made when he harvested the Navy ITSMO products that most concerned him.

Bob paused for a second and, looking at the ceiling, tried several phonetic variations: “SWAT? SWOT? Surface Warfare…something…Officer…Training?”

It wouldn’t come. Snapping back to the task at hand, Bob found what he was looking for in the Assessment folder. It was the Process Capability Assessment Model/Tool Assessor’s Guide from the Navy ITSMO’s Assessment and Audit Library. There was other stuff in the folder as well: the PCAT Plan Template, the Report Template, a SWOT Template (yeah!), a training brief, and other supporting documentation. For now, he opened the PCAT Assessor’s Guide to get a feel for the scope of the content.

The guide was well laid out. In particular, Chapters 2 and 3 were what interested Bob the most — Assessment Team Roles, Practices for Interviews, Constraints and Communication, and Assessment Process Activities. He read through the roles and their associated responsibilities: Sponsor, Coordinator, Observer, Lead Assessor and Assessor. He found it interesting that here too, as with so many of the Navy ITSMO products and artifacts, the PCAT was based on an international standard: ISO/IEC-15504 Information Technology — Process Assessment.

Bob was also intrigued by the distinction between the assessment model and the tool itself. As it turns out, the tool is simply an automated Microsoft Excel spreadsheet that accepts the assessor input and creates a bar graph depiction of the numerical values. The red meat is the model itself. Almost immediately he noticed the alignment with all the material he had studied in Service Quality Management — the Deming Cycle was on full display as the engine for the capability baseline and incremental process improvement activity. (See Figure 1.)
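In code terms, the tool's job is modest. A minimal Python sketch of the same idea (an illustration only; the real PCAT is an automated Excel workbook, and the process attribute names and scores below are invented examples, not assessment data):

```python
# Illustrative only: the actual PCAT is an automated Excel workbook.
# Attribute names follow ISO/IEC-15504 conventions; scores are invented.
import matplotlib.pyplot as plt

scores = {
    "PA 1.1 Process Performance": 72,
    "PA 2.1 Performance Management": 48,
    "PA 2.2 Work Product Management": 55,
}

# Render the assessor-entered percentages as a bar graph, as the tool does.
plt.bar(list(scores.keys()), list(scores.values()))
plt.ylabel("Achievement (%)")
plt.ylim(0, 100)
plt.title("Process attribute achievement (illustrative)")
plt.xticks(rotation=20, ha="right")
plt.tight_layout()
plt.show()
```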

Bob took some time for an in-depth review of Chapter 3 to become familiar with the activities involved in a process assessment. The sequential activities were displayed in a flow chart (shown as Figure 2). He also took note of the four-point scale assessors use to rate the achievement of each defined process attribute:

  • Not achieved (0 to 15%) – There is little or no evidence of achievement of the defined attribute.
  • Partially achieved (> 15% to 50%) – There is some evidence of an approach to, and some achievement of, the defined attribute.
  • Largely achieved (> 50% to 85%) – There is evidence of a systematic approach to, and significant achievement of, the defined attribute.
  • Fully achieved (> 85% to 100%) – There is evidence of a complete and systematic approach to, and full achievement of, the defined attribute.
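Those rating bands translate directly into a simple lookup. A minimal Python sketch of the mapping (the function name is illustrative, not from the guide):

```python
def rate_attribute(percent: float) -> str:
    """Map an attribute achievement percentage to its ISO/IEC-15504 rating."""
    if not 0.0 <= percent <= 100.0:
        raise ValueError("achievement percentage must be in [0, 100]")
    if percent <= 15.0:
        return "Not achieved"
    if percent <= 50.0:
        return "Partially achieved"
    if percent <= 85.0:
        return "Largely achieved"
    return "Fully achieved"

print(rate_attribute(72.0))  # -> Largely achieved
```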

The assessors use the PCAT rating sheet to record their evidence, which could include documents, written artifacts, observations and interviews with key process practitioners. Each assessor (a minimum of two assessors is required — more is better) records evidence independently and assigns scores. Later in the process, the Lead Assessor helps to deconflict any wildly disparate ratings among the assessors for the same evidence, and they begin the process of “synthesizing” a single rating. Then when all of the focus areas for an assessment level have been assessed per the assessment plan, the assessors really earn their keep by performing a SWOT analysis.
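As a thumbnail of that scoring-and-synthesis step (the assessor names, scores, and the numeric threshold for "wildly disparate" below are assumptions for illustration, not PCAT rules):

```python
from statistics import mean

# Hypothetical independent scores from two assessors for one attribute.
scores = {"assessor_1": 80.0, "assessor_2": 45.0}

# Assumed spread that counts as "wildly disparate"; in practice the
# Lead Assessor judges when ratings need to be reconciled.
DISPARITY_THRESHOLD = 20.0

spread = max(scores.values()) - min(scores.values())
if spread > DISPARITY_THRESHOLD:
    print("Disparate ratings: Lead Assessor facilitates a synthesized rating")
else:
    print(f"Consensus score: {mean(scores.values()):.1f}")
```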

“There’s that non-word again,” Bob mumbled at the prospect of performing yet another analysis, but as he read through the guide, he discovered that the real, tangible value in the assessment comes from analyzing the assessment results to discover the actionable Strengths, Weaknesses, Opportunities and Threats (SWOT) that the organization can take on board to make quantifiable and measurable improvement in its processes.
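Structurally, a SWOT is nothing more exotic than four labeled buckets of findings. A toy Python illustration (the class and the example entries are invented):

```python
from dataclasses import dataclass, field

@dataclass
class SwotAnalysis:
    """Actionable findings distilled from the assessment evidence."""
    strengths: list[str] = field(default_factory=list)
    weaknesses: list[str] = field(default_factory=list)
    opportunities: list[str] = field(default_factory=list)
    threats: list[str] = field(default_factory=list)

# Invented entries for a request fulfillment process, for illustration.
swot = SwotAnalysis(
    strengths=["Process roles are trained and documented"],
    weaknesses=["No trend metrics for fulfillment cycle time"],
    opportunities=["Automate status notifications to requesters"],
    threats=["Key process manager position is unfilled"],
)
```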

Aha! SWOT! Bob jotted the acronym down on a sticky note, pulled it off the pad and stuck it on the frame of his computer monitor.

After the hard work of capturing the SWOT analysis comes reporting. Reports and their formats are agreed to in the assessment plan during the Plan & Organize activity of the assessment, and the results are then presented to the sponsor. This is all really good stuff, and straightforward too, Bob thought, which was quickly followed by a conclusion: I need to get assessor training.

Bob reviewed the ITSMO Process Capability Assessment Tool Brief and made another sticky note as a reminder to put in a service request for PCAT assessor training on the Navy ITSMO Service Request System.

Leaning back in his chair, Bob mused aloud, “If I can get my process assessed at Level 1 capability for Purpose and Outcomes, and come out with at least a ‘Largely Achieved’ rating, then I will have established an objective, quantifiable baseline of process capability from which future measurements can build. Scheduling iterative assessments will foster a continual improvement culture that can serve as a pattern for other processes to follow.”

Bob smiled to think how far he had come in his thinking. It was no longer just about his process, but rather it was about multiple processes supporting end-to-end quality service delivery to the customer, a lesson he learned from the Navy ITSMO’s Service Quality Management practice.

With his process design based on an approved enterprise architecture model, and his assessment methodology based on international standards linked to that model and supported by a tool, Bob was certain he had the winning formula not just to improve his service delivery, but to show that improvement to leadership.

About the Navy ITSMO
Chartered in April 2012, the Navy ITSMO provides IT Service Management thought leadership and assistance by creating usable products and services for the Navy ITSM community. The Navy ITSMO strives for alignment of enterprise IT architecture through discrete but interlocking practice areas to help define and support organizational IT governance and management requirements. The Navy ITSMO résumé boasts industry-certified expertise in ITIL, COBIT, Program and Project Management, DoDAF, IT Risk Management and Control, IT Skills Framework, Service Quality, CMMI, ISO/IEC-20000, ISO/IEC-15504, Information Security, Enterprise IT Governance, and Assessment and Audit.

The Navy ITSMO Wiki is located at: https://www.milsuite.mil/wiki/Navy_IT_Service_Mangement_Office/. Access to milSuite is CAC controlled. First-time users will need to register their CAC with milSuite by clicking the ‘Register’ button, confirming their information, and clicking ‘Submit’.

Figure 1. The Deming Cycle: Plan, Do, Check, Act.

Figure 2. The sequential activities involved in a process assessment.