The Essentials of Measuring Service Improvement – Sensible Service Management Part 10

In our last blog post, Rob England @theitskeptic talked about whether you actually want to improve service at all – it’s not necessarily a given that you do. We also reviewed the essential question of how service improvements fit into your business strategy.

Now that we’ve made the business case that service improvement is needed and worthwhile, here’s the next question to consider: what constitutes “better”?

Better

There are three kinds of “better” service:

  • A specific measurable goal: e.g., cut costs
  • A broad goal: e.g., make customers happier
  • The goal to just keep improving service

Depending on where you fit on that spectrum of goals, your answer to the next question will change: how will you know that you have improved?

There are two possible answers:

1) By measuring the difference in something

2) Subjectively

This may seem blindingly obvious, and subjective assessment may seem like a bad idea, but think for a moment: to measure the difference in something, you need:

  • To decide what is a good measure or measures for what you want to improve
  • A mechanism for measuring, recording and reporting
  • To do the work to measure the current state
  • To do the work to measure again afterwards and work out the difference

This could be an expensive exercise. What is the purpose of improving? Often (but not always) it is to make customers, users and/or managers feel better about the services. (Or it could be to cut costs, speed delivery or some more objective goal). If the goal is to make people happier then why not just ask them if they are happier afterwards? If they aren’t feeling happier, then all your improvement metrics could be a waste of time.

On the other hand, good metrics may help change their minds and convince them to feel better. After all, users most strongly remember the recent or painful past. Showing objective progress can help put a recent minor slip in standards into context.


So it is management’s call: spend on measurement or assess subjectively. Both approaches have their merits.

Assuming you decide to measure your improvement, the next question is: what metrics will you measure?

External

The overall measure of service should usually be from a customer perspective: in terms of the business results delivered, not in terms of some internal performance. You need to measure enough to know if you are delivering service well enough – usefulness and reliability – and to know if efforts at improvement had any effect.

The classic service metric is customer satisfaction, measured by surveys. It can be challenging to get an accurate measure – satisfaction is subjective. It’s a useful measure but not usually the best primary one. Note that customer satisfaction is not always the same thing as user satisfaction; in some cases those paying for the service don’t actually care how satisfied the users are with it.

A popular tool for measuring customer satisfaction is NetPromoter. It is a good approach, though you might like to read my sceptical blog post on the subject before deciding.
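The Net Promoter calculation itself is simple: respondents score you 0–10, those scoring 9–10 are promoters, 0–6 are detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch (the survey responses here are made up):

```python
# Net Promoter Score: % promoters (scores 9-10) minus % detractors (scores 0-6).
# Scores of 7-8 are "passives": they count in the total but neither add nor subtract.
def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

survey = [10, 9, 8, 7, 6, 10, 3, 9, 8, 5]  # hypothetical 0-10 survey responses
print(nps(survey))  # → 10.0  (4 promoters, 3 detractors out of 10)
```

Note that one survey round gives you a point-in-time number; to know whether an improvement worked, you need the score before and after.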

The most obvious objective measure of service is its availability: was it available when the customer expected it to be? This seems simple enough until we define “available.” If the service is too slow then it is not really available even if the “door is open.”

This leads us to service quality, the standard of service, the quality of the user experience. The metrics for this will differ depending on the actual service. Some minimum standard must be set, below which the service is deemed unavailable.

Once you get your operational metrics right, and understandable by customers, in some situations you can move up to reporting the customer’s processes (and your contribution), and even move up higher again to report the value you provided to the customer. Put another way, report not on what you do but rather how you help them do what they do.

Internal

Of course, your objective in improving service may not be customer oriented. You may seek to optimize service for internal reasons, e.g. lower cost, risk, cycle times, effort, higher quality, agility, accountability, profit. Sometimes the intent is to cut costs by having user satisfaction as low as possible without actually alienating anyone (e.g. after-sales support from some vendors).

Focusing too much on any one metric – whether you are reporting to customers, staff, managers or governors – will lead to distortions of behavior: people get driven to improve the metric even at a cost to something else. Metrics all cause this effect: the perfect metric has never been invented.

One way to reduce this problem is to employ a balanced scorecard (or a performance pyramid, results and determinants matrix or performance prism – there are many alternatives). The classic balanced scorecard has scores for four groups of metrics, typically with half-a-dozen well-chosen metrics in each group, based on the main objectives and strategies. The four groups/dimensions are:

  • customer
  • financial
  • internal business processes
  • learning and growth / innovation

There are a number of variations, including:

  • Dimensions specific to the business or department
  • Nested scorecards at strategic and operational levels
  • An overall (often weighted) score for each group, with the aim of improving that score

The original concept of balanced scorecard from Kaplan and Norton had the dimensions above, but other combinations are used. One service-oriented variant you might like to try:

  • customer value
  • efficiency
  • effectiveness
  • improvement

The main point is to look at a balance of many metrics across different dimensions instead of making decisions based on one or two numbers.
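To make the weighted-score variant concrete, here is a sketch of rolling metric scores up into group scores and an overall number. All the metric values and weights below are invented for illustration; in practice the weights are a management judgment call:

```python
# Each dimension holds metric scores already normalized to 0-100.
# Both the scores and the weights below are hypothetical examples.
scorecard = {
    "customer": [80, 70, 90],
    "financial": [60, 75],
    "internal processes": [85, 65, 70],
    "learning and growth": [55, 60],
}
weights = {"customer": 0.4, "financial": 0.2,
           "internal processes": 0.2, "learning and growth": 0.2}

# Average within each dimension, then take the weighted sum across dimensions.
group_scores = {d: sum(v) / len(v) for d, v in scorecard.items()}
overall = sum(weights[d] * s for d, s in group_scores.items())
print(group_scores)
print(round(overall, 1))
```

The single overall number is convenient for trend-watching, but as the text says, decisions should be made by looking at the balance of group scores, not the rolled-up figure alone.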

Another good practice is to always include a commentary with each KPI, an “intelligence report.” Numbers on their own do not tell the whole story, and behind every number is a story: why it is what it is, why it has been changing, what it is not showing…

External and internal metrics working together

So service levels should ideally be measured with customer-centric, outside-in metrics. We can use operational metrics like mean time to resolve incidents or the cost of servers to help us optimize the internal “machinery.” They are good for driving internal improvement, but they don’t measure the service: they don’t measure how happy customers are with the result, what the costs were, or how much value it delivered.
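An internal metric like mean time to resolve incidents is straightforward to compute from the service desk’s incident records, which is part of its appeal. A sketch with made-up incident timestamps:

```python
from datetime import datetime, timedelta

# Hypothetical incident records: (opened, resolved) timestamps.
incidents = [
    (datetime(2015, 3, 1, 9, 0),  datetime(2015, 3, 1, 11, 30)),  # 2h30m
    (datetime(2015, 3, 2, 14, 0), datetime(2015, 3, 2, 14, 45)),  # 45m
    (datetime(2015, 3, 3, 8, 0),  datetime(2015, 3, 3, 16, 0)),   # 8h
]

# Mean time to resolve: total resolution time divided by incident count.
mttr = sum((resolved - opened for opened, resolved in incidents),
           timedelta()) / len(incidents)
print(mttr)  # → 3:45:00
```

Easy to compute, and useful for tuning the machinery – but it still tells you nothing about whether the customer felt well served while those incidents were open.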

Outside-in, black-box measurements of service delivery like satisfaction, cost, and value will work well as service level targets in a Service Level Agreement (SLA). We combine feedback from the users and how we are tracking against the SLA targets as performance information. We combine that performance information with the internal operational metrics to determine what needs improving.


Measurement is good: it gives us objectivity and makes progress visible. It is also expensive to build and operate, so think carefully about what you really need and why. All measurement should have a purpose.

The primary purpose of measurement is to improve. Next time we’ll talk about how to improve service.

Portions of this article are derived from Rob England’s books Basic Service Management and Standard+Case.

Have you tried GoToAssist Service Desk yet? Support teams can quickly and easily log and track incidents, deliver end-user self-service and manage configurations. The GoToAssist Service Desk tool provides a simple, intuitive way to manage IT operations more effectively and gain visibility into IT services. Try it free for 30 days – start using GoToAssist Service Desk today!
