Tuesday, September 28, 2010

Metrics That Matter

In my Web Manager University course, “Delivering Great Customer Service – Essentials for Government Web Managers,” I do a section on “metrics that matter.” I often start by asking folks if they collect performance data. Heads nod. Then I ask them what they do with it – how they use it. Most use stats to track page views and unique visitors. But if I ask them how they measure customer service...well, I usually get blank looks.

Gosh, I remember so well, when I managed HUD’s website, struggling to figure out how to measure the web’s impact on mission achievement and trying to decipher what all that data was telling me. We got tons of statistics, spent a year using the American Customer Satisfaction Index (ACSI), had a contract with Nielsen NetRatings for a while, and did a little usability testing. All the right things. But how do you add up all that data to find out if your website really is providing great customer service?

The truth is that many web teams haven’t identified specific customer service and mission completion objectives to measure. What to do? Step back. Pick just a few really important things to measure, pick just a few really good measures, and follow through. Use those metrics that matter.

The process is common sense.
  1. Start by identifying a handful – and I mean 10 or fewer – of key objectives for your website. At a minimum, 6 of those objectives should be the 6 governmentwide Customer Service objectives identified by the Federal Web Managers Council (FWMC).
  2. For each objective, figure out 3-4 key performance indicators (KPIs) that will tell you whether or not you’re achieving the objective. Don’t get down in the weeds. Keep it simple.
  3. For each KPI, decide what data you need to collect. And I encourage folks to collect data in different ways, from different sources – statistics, usability testing, customer service surveys, writing quality reviews, etc. But don’t over-collect. Enough is as good as a feast.
  4. Collect and analyze the data to establish a performance baseline (there’s a quick sketch of one way to organize this, just after this list).
  5. Pinpoint places where you can improve customer service, and establish performance goals, like reducing task time, improving completion rates, or reducing errors.
  6. Make incremental improvements, and collect and analyze data again. And again. And again.
  7. Report your findings to your web team, your bosses, your agency, and – in the spirit of transparency – the public. Explain the problems in terms of their impact on customer service. And let everyone know what you’re going to do to fix the problems. Let everyone know that you care about making your website as useful and usable as possible.
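To make steps 1 through 4 concrete, here’s a minimal sketch in Python of one way a web team might keep objectives, KPIs, baselines, and goals in a single place. Every objective name, KPI name, and number below is an invented example – a starting structure, not a prescription.

  # Sketch of steps 1-4: objectives, their KPIs, and baseline/goal numbers.
  # All names and numbers here are hypothetical examples.
  from dataclasses import dataclass, field
  from typing import List, Optional

  @dataclass
  class KPI:
      name: str
      baseline: Optional[float] = None  # filled in after step 4
      goal: Optional[float] = None      # set in step 5, once you see the baseline

  @dataclass
  class Objective:
      name: str
      kpis: List[KPI] = field(default_factory=list)

  objectives = [
      Objective(
          name="Customers can complete common tasks efficiently",
          kpis=[
              KPI("Average minutes to complete task", baseline=7.5, goal=5.0),
              KPI("Percent of users who complete task", baseline=62.0, goal=80.0),
              KPI("Percent of users who get the right answer", baseline=55.0, goal=75.0),
          ],
      ),
  ]

  # A simple baseline report - rerun after each round of improvements (step 6).
  for obj in objectives:
      print(obj.name)
      for kpi in obj.kpis:
          print(f"  {kpi.name}: baseline={kpi.baseline}, goal={kpi.goal}")

Keeping everything in one small structure like this also makes step 7 – reporting – almost automatic.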
Sounds easy, doesn’t it? It’s not. It takes hard work to figure out those KPIs and make sure you get the right data to measure them. And it takes time and dedication to analyze the data, decide what you can do to improve, and make improvements. But gosh – this is so important. This is how you make your customer service the best that it can be.

Too often, web teams and agencies collect too much data. They get overwhelmed. They have trouble focusing on the biggest problems.

Or they don’t follow through. How many times do you think to yourself, “I know we’ve got a problem here – the data says it – but we don’t have the time or money to fix it right now”? So you keep collecting data that tells you the same thing. It’s a waste of time and money, not to mention a disservice to your customers.

Or they collect data first and then try to figure out what it measures. “Gee, we’ve got this great stats package giving us all this information. Hmm…what can we learn from it?” Doesn’t it make more sense to decide what you want to measure first and then find the data that will help you?

Or sometimes we think the data tells us something it doesn’t. For example, customer satisfaction surveys are important and are one great indicator of the effectiveness of your website. But they don’t give you facts about efficiency – they only report people’s feelings and perceptions about your site. I’ve heard people say, “Oh, this site is so pretty and professional-looking. I love it.” But then you watch them use the site, and they have a hard time.

OK – so here’s an example of the way it should work. Let’s take FWMC Customer Service Objective #3: (Customers) should be able to complete common tasks efficiently. How do you measure that?

Well, I’d start with these three KPIs:
  1. Length of time it takes the average person to complete the task
  2. % of people who complete the task
  3. % of people who get the right answer
I'd measure those through:
  • Usability testing: Did they finish the task? How long did it take? How many wrong turns did they take? How many clicks did it take? What words didn’t they understand? Did they come up with the right answer?
  • Statistics: How many people come to the page to start the task? How many people visit each of the subsequent pages required to complete the task? What’s the drop-out rate?
  • Plain language peer reviews: Did each of the pages required to complete the task score well?
Then I’d corroborate with a customer satisfaction survey for each of those tasks. Did people think the task was easy to complete?
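To see how those numbers come together, here’s a small sketch in Python that computes the three KPIs from usability-test results and a drop-out rate from site statistics. The session records and page-view counts are invented stand-ins for real data.

  # Computing the three KPIs from (invented) usability-test sessions.
  # Each record: (completed the task?, minutes taken, got the right answer?)
  sessions = [
      (True, 6.0, True),
      (True, 9.5, False),
      (False, 12.0, False),
      (True, 5.5, True),
      (True, 8.0, True),
  ]

  completed = [s for s in sessions if s[0]]
  avg_minutes = sum(s[1] for s in completed) / len(completed)
  pct_completed = 100.0 * len(completed) / len(sessions)
  pct_correct = 100.0 * sum(1 for s in sessions if s[2]) / len(sessions)

  print(f"KPI 1 - average minutes to complete: {avg_minutes:.1f}")
  print(f"KPI 2 - percent who complete the task: {pct_completed:.0f}%")
  print(f"KPI 3 - percent who get the right answer: {pct_correct:.0f}%")

  # Drop-out rate between the pages required to complete the task,
  # from site statistics (visits per page, start page first).
  funnel = [("start", 1000), ("step-2", 640), ("step-3", 410), ("confirm", 350)]
  for (page_a, visits_a), (page_b, visits_b) in zip(funnel, funnel[1:]):
      drop = 100.0 * (visits_a - visits_b) / visits_a
      print(f"{page_a} -> {page_b}: {drop:.0f}% drop-out")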

Are these perfect metrics? Probably not. Are they adequate? Yes. They’d give you a good start on figuring out where the problems are (too many clicks? Wrong words or terms? Bad layout or design?) and how to fix them (reduce steps, change words, use more white space or bullets or sub-heads).

And you can do this pretty efficiently. You can identify most of your usability problems by testing 3-5 users – almost any users. It takes only about 10 minutes to do a plain language review of a web page – and that includes both individual reviews and group consensus. And you know exactly which site traffic stats you need, so you don’t have to plow through that entire WebTrends report.
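That 3-5 user rule of thumb traces to Jakob Nielsen and Tom Landauer’s problem-discovery model. As a rough illustration – assuming their commonly cited average of about a 31% chance that any single test user exposes a given problem, a figure that will vary from site to site – here’s how the expected share of problems found grows with each added tester:

  # Expected share of usability problems found after testing n users,
  # per the Nielsen/Landauer model: found(n) = 1 - (1 - p)^n.
  # p = 0.31 is their often-cited average; your site's value will differ.
  p = 0.31
  for n in range(1, 9):
      print(f"{n} users: ~{1 - (1 - p) ** n:.0%} of problems found")

With p at that level, five users already surface roughly 85% of the problems – which is why a handful of testers goes such a long way.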

Don't waste your time on data for the sake of data. Think about what's important – what management purpose you're trying to achieve. Focus on measuring customer service (starting with those 6 governmentwide objectives) and impact on mission. Don’t over-think this. Don’t worry that it's not absolutely perfect. Just start at the beginning – what are the most important objectives, how will we know if we’ve achieved them, and what data do we need to measure those indicators? That’s metrics that matter.

Monday, September 20, 2010

A Sad Day for Customer Service

Today I heard that HUD has abolished the Regional Web Manager positions we (the department) created, at the recommendation of an agency-wide task force, 10 years ago. If this is true (and I think it is), how sad. I’m sad, personally, because I helped find and train these 10 fabulous people. I’m telling you – they were (are) the best. But more important, I’m sad for an agency I called home for 24 years – an agency whose mission is so important to every single citizen…finding decent, safe, and sanitary housing…an agency that really got it right about customer service through the web, for many years.

Ten years ago – under a Democratic administration – we determined that the American people value local information. Yes – they want to know how to buy a home. But what they really want to know is how to buy a home in Arizona. Or Illinois. Or New York. They want that local connection. A departmentwide task force recommended creating 10 Regional Web Managers to help us complete that link from Washington DC to the people we hoped to serve. And a Republican administration made it happen. Everyone seemed to agree that the local link was the right thing to do.

Now, that link has been broken. I’m sure there’s some good reason. At least I hope there is. But I wonder if anyone has considered the consequences. Is bridging the gap between Washington DC and “the people” who use the web to access government services a one-shot deal? Do you think you’ve done it, claim victory, and move on?

OK – yes – I have a vested interest in this issue. But I know what these Regional Web Managers brought to the table, and I know how the public responded. In the very first month we put up “state” pages (with that local connection), they became the second most-requested content on HUD’s website, behind only HUD’s home page. Surely, there’s a message there. People value that local connection. They want Washington to appreciate and understand and keep abreast of their local differences and needs.

I’m a strong advocate for “the field” – for those employees who sit in agency offices located in every state, in every major community. I used to be among them – I know how savvy they are. These people know what our customers want because they actually live with them and interact with them, every day. In my book, we should be strengthening our links to citizens, through our field offices – not breaking them.

More Americans access the federal government through the web than through any other communications vehicle. Shouldn’t we be doing everything we can – especially empowering those grassroots, on-the-scene people who know and understand what our customers want – to make sure Washington provides it, via our websites? I surely think so.

Oh – I so hope I’ll hear tomorrow that this news is false...that instead of abolishing the local link, HUD has decided to embrace this pioneering effort to improve customer service. But just in case it is true, let me say to Diane Fournier, Eric Ramoth, John Carpenter, Diane Littleton, Mykl Asanti, Steve Meiss, Barbara Bates, Lynn Kring, Jim Graver, David Lockwood, and Rachel Flagg, HUD’s first-class Regional Web Managers – I salute you. You did the right thing for your customers. You made a difference.



Thursday, September 09, 2010

As We Do What’s Exciting, Let’s Not Forget What’s Important

Improving government's customer service means constantly looking for new ways to do things, seizing new technologies, and experimenting. All good. But as we do what’s exciting and new, let’s not forget that we also need to do what’s important – like implementing all the laws, regulations, and requirements already in place for government websites.

A little more than 6 years ago, a group of government web managers came together under the umbrella of OMB’s Interagency Committee on Government Information (ICGI) and hammered out policy recommendations for federal public websites. Sheila Campbell and I co-chaired that group, and Bev Godwin was our liaison with OMB. The working group included many people whose names you’d recognize – Gwynne Kostin, Annetta Cheek, Brian Dunbar, Jeffrey Levy, Sam Gallagher, Janet Stevens, and others. In that effort, we documented all the existing laws, regulations, and requirements that applied to government websites, along with best practices already in common use across government. In December 2004, OMB issued a memo that embraced our recommendations and referred agencies to the newly created Webcontent.gov for guidance on implementation and best practices; agencies were required to confirm they’d implemented the new policies. The web manager working group – which became the Federal Web Managers Council – was quite proud that this grassroots effort had really worked!

Fast forward to today. So – what happened? Are all of those requirements spelled out in the OMB policies firmly in place? I did just a tiny bit of checking this week and discovered – well, let’s just say there are some holes. I’m not going to call anyone out, because I don’t think there’s malice here. I suspect what’s happened is that the web management workforce has changed extensively in 6 years, and there’s been a loss of knowledge. Even though Web Manager University offers a refresher course on laws, regulations, and requirements almost every semester, I suspect many web managers believe they already have everything covered and don’t need a reminder.

You know what? I think every web manager should go through refresher training every year. Heck, I knew that stuff forward and backward 6 years ago, and I couldn’t remember some of the specifics. Yes – some of it is boring and mundane, in light of the excitement of open government and social media. But we have to remember: it’s important. These are basic protections and management principles that are the foundation of web-based customer service.

So here’s my challenge, government web managers (and any of you who care about how government communicates with citizens): print out one of the handy-dandy checklists on webcontent.gov and see if your agency is complying with OMB’s policies, which include all the pertinent laws and regulations for government websites. Before you start, read the report of our ICGI working group – the one that spelled out why each of these requirements and best practices is so critical to customer service. Take a half hour and go through the chart that shows you how OMB Circular A-130 (which OMB cites in the policies) applies to web management. Do it for yourself, but – more important – do it for your customers.

As we march ahead doing what’s exciting, let’s not forget to do what’s important.