By Tim Hines | Published: November 15, 2010 | Last Updated: August 22, 2018
Management 101 tells us that success in business starts with setting goals and measuring progress toward them. Service and support leaders are metrics masters: from electronic status boards to thick monthly reports filled with charts and tables, leaders manage by the numbers.
Traditionally, measuring support operations has been straightforward. Shorter hold times are better; higher abandonment rates are worse.
Knowledge management is harder to measure. For example, how many articles should be written? As anyone who has waded through an overgrown, under-maintained knowledgebase can tell you, more content is not better unless each article is a high-quality, unique, usable, and findable solution that answers a question users really ask. It’s difficult to know in advance what the right amount of content is.
There’s no simple trick to optimize a knowledge management initiative using metrics. But, in conjunction with our customers and our work with the Consortium for Service Innovation and their Knowledge-Centered Support (KCS) best practices methodology, we’ve developed five measurement principles for success.
Measure Cases and Knowledge Together
Incident management systems can tell you how many cases are closed and what the backlog is. Conventional knowledgebase systems can tell you how many articles have been authored, or how often a particular article has been opened. But none of these factoids alone explains what to do: which accomplishments to recognize, which team members need coaching, or which knowledgebase articles are having the greatest business impact.
Metrics that combine case tracking data with knowledge management data answer the business questions that really need answering: Who is creating value in the knowledgebase? What gaps keep us from more effective self-service? What is the highest-value product improvement we can make?
Knowledge management doesn’t happen in isolation. Knowledge management reporting can’t, either.
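As a minimal sketch of what "measuring cases and knowledge together" can look like, the snippet below joins hypothetical case records (which note the knowledgebase article linked at resolution) with authorship data to credit the people whose articles resolve the most cases. The field names and data are illustrative assumptions, not a specific product's schema.

```python
from collections import Counter

# Hypothetical case records: each notes which knowledgebase article
# (if any) was linked when the case was resolved.
cases = [
    {"id": 101, "linked_article": "KB-7", "resolved": True},
    {"id": 102, "linked_article": "KB-7", "resolved": True},
    {"id": 103, "linked_article": None,   "resolved": True},
    {"id": 104, "linked_article": "KB-3", "resolved": False},
]

# Authorship data from the knowledgebase side (illustrative).
authors = {"KB-7": "alice", "KB-3": "bob"}

def article_impact(cases):
    """Count resolved cases per linked article: a simple proxy for
    which articles are having real business impact."""
    return Counter(
        c["linked_article"] for c in cases
        if c["resolved"] and c["linked_article"]
    )

# Roll article impact up to authors to see who is creating value.
impact = article_impact(cases)
value_by_author = Counter()
for article, n in impact.items():
    value_by_author[authors[article]] += n
```

Neither data source alone can produce `value_by_author`; it only exists when case tracking and knowledge data are combined.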
Use Role-Focused Dashboards
Line managers have to assess employee performance, so they need reports to identify how staff is reusing, creating and improving knowledge. Product managers need reports that highlight the most valuable product enhancements. KM program managers need to know if there are any bottlenecks in the publication process. And executives want to know the financial impact of knowledge.
Customized dashboards for these and other roles are an essential part of any KM program. Too often, a thick sheaf of generic reports is sent to everyone, regardless of role. People who would benefit from a targeted subset of metrics are deluged with irrelevant information. When dashboards are specific to each user’s role, the people who need the data get exactly the data they need.
Take the Long View
Knowledge management initiatives are long-term programs—they grow and evolve (or get into trouble) relatively slowly. With the right measures, it’s possible to trend activities and outcomes to see what’s happening. But looking at data a week or a month at a time simply doesn’t help—the signal of the long-term trend is buried in the noise of typical weekly variations. Knowledge program managers need to be able to compare results over a period of at least 12 to 24 months.
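One simple way to pull the long-term signal out of weekly noise is a trailing moving average over a long window. The sketch below uses made-up weekly self-service session counts and a four-week window purely for illustration; in practice the article's point is that the window should span many months.

```python
# Illustrative weekly self-service session counts (made-up numbers).
weekly_sessions = [520, 610, 480, 700, 650, 590, 720, 680,
                   610, 750, 690, 800, 770, 720, 840, 810]

def moving_average(series, window):
    """Trailing moving average; the first window-1 points are skipped
    because a full window of history isn't available yet."""
    return [
        sum(series[i - window + 1 : i + 1]) / window
        for i in range(window - 1, len(series))
    ]

# Individual weeks swing wildly, but the smoothed series reveals the trend.
trend = moving_average(weekly_sessions, window=4)
```

The raw series rises and falls week to week, while the smoothed series climbs steadily, which is exactly the long-view perspective a program manager needs.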
While processing a year or two of measurement data is a simple requirement in principle, applying it in the real world can be maddeningly difficult. This is because transaction logs, especially from busy self-service sites, can be enormous and complex – in some cases, processing even a day’s worth of data can take longer than a day! Obviously, this makes it effectively impossible to crunch through the kind of data required to understand what’s happening in a knowledge program.
Fortunately, technology is emerging that makes long-term analytics a practical reality, even for the highest-volume support sites. Although the new reporting technology uses complex features like in-memory analytical engines, highly efficient data models, and advanced filtering algorithms, the result is simplicity itself: reports that give program managers the ability to see the long-term trends and take action to make sure they’re going the right way.
Make the Metrics Your Own
Many of the most valuable knowledge management metrics require some degree of judgment. What percentage of self-service interactions was plausibly successful? Should knowledgebase authors get credit when they reuse their own content, or only when others do? What about editors—do they get credit, too?
When tools provide metrics like this, they’re generally packaged as a black box, calculated using a proprietary formula that customers can’t change. In effect, vendors are saying, “Trust me.”
Executives who ask for the ROI of call deflection won’t settle for “trust me.” And neither should anyone else. Ensure that reports are based on assumptions you can examine and formulas you can change to meet the specifics of your business.
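To make the point concrete, here is a hedged sketch of a contribution metric whose weighting assumptions are exposed as parameters rather than buried in a black box. The event names and default weights are illustrative assumptions, not any vendor's formula; the design choice is that every judgment call (for example, whether self-reuse earns credit) is visible and changeable.

```python
def contribution_score(events, weights=None):
    """Sum weighted credit for knowledge activity events.

    events: list of (person, action) tuples, e.g. ("alice", "create").
    weights: credit per action type; adjust these to match your own
    business rules, e.g. set "self_reuse" above 0 if authors should
    get credit for reusing their own content.
    """
    # Default weights are an explicit, inspectable assumption.
    weights = weights or {"create": 5, "edit": 2, "reuse": 1, "self_reuse": 0}
    scores = {}
    for person, action in events:
        scores[person] = scores.get(person, 0) + weights.get(action, 0)
    return scores

# Hypothetical activity log.
events = [("alice", "create"), ("alice", "self_reuse"),
          ("bob", "edit"), ("bob", "reuse")]
scores = contribution_score(events)
```

Because the formula is an ordinary function argument, anyone reviewing the metric can see, and change, exactly what is being rewarded.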
Ask the Next Question
At their best, metrics don’t just answer questions: they prompt new ones. Average handle time is going up? OK, but why? Maybe effective self-service is deflecting the easier cases, or a new product release is creating previously unknown problems, or support staff aren’t searching the knowledgebase as often as they should. Interesting data in one report often creates the need to generate three new reports.
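The "ask the next question" pattern is essentially a drill-down: take an aggregate that moved, then break it out by a candidate driver. The sketch below, using made-up handle-time data, shows how an overall average handle time invites a follow-up breakdown by case category to test the hypothesis that easy cases are being deflected.

```python
from statistics import mean

# Made-up case data: handle time in minutes, tagged by case category.
cases = [
    {"category": "password_reset",  "handle_minutes": 4},
    {"category": "password_reset",  "handle_minutes": 5},
    {"category": "new_release_bug", "handle_minutes": 35},
    {"category": "new_release_bug", "handle_minutes": 41},
]

# First question: what's the overall average handle time?
overall = mean(c["handle_minutes"] for c in cases)

# Next question: which categories are driving it?
by_category = {}
for c in cases:
    by_category.setdefault(c["category"], []).append(c["handle_minutes"])
breakdown = {cat: mean(vals) for cat, vals in by_category.items()}
```

If self-service later deflects most password resets, `overall` will rise even though nothing got worse, which is exactly the kind of insight the drill-down surfaces and a single predefined report would miss.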
True ad-hoc reporting can’t be done with most knowledgebase tools. That’s because the OLAP (online analytical processing) solutions they include use an off-line process to massage the data for reporting. The predefined reports work fine, but new reports are difficult or impossible—if that off-line process wasn’t programmed for the report in advance, too bad.
No matter what reports come “out of the box,” eventually there’s a need to answer a different question. That’s when the reporting infrastructure really needs the kind of hyper-efficient in-memory technology that’s built specifically for generating new reports on the fly.
KM success requires that all stakeholders know where they’re going and how they’re doing. With measures that pull together knowledge and case management, dashboards that meet role-specific needs, timeframes that give perspective, assumptions that are understood and adjustable, and the ability to ask new questions, KM metrics become effective guideposts on the road to success.
Based on the whitepaper: Service and Support: Made to Measure
Tim Hines is Vice President of Product Management at Consona CRM. www.consona.com