
Tips for Refining Metrics

As we do most years, we at Catalysis took the opportunity last year, in the midst of Covid, to review our True North metrics. Were we moving the needle? Were these the right metrics? For our quality and customer service metrics, we couldn’t unequivocally and enthusiastically answer “Yes!” – so we took the time to reflect. Below are the steps we took and what we learned. These can apply to metrics at any level, not just organizational True North.

1. Identify the need

We discovered that, while we were collecting and reporting on quality and customer service, we weren’t deliberately changing our behaviors and actions to move the needle. We had a 4.5 (or greater) out of 5 for quality – great! We were meeting and beating the target! But what did that mean? We realized we couldn’t easily answer this question, so we wanted to better define what it means for a customer to be “satisfied.” What we had were stale metrics!

While you may not be in a position to review and revise your organization’s metrics, the same process can apply to department metrics, a new project you’ve started, or existing metrics that just aren’t helping you improve. Ask what each metric actually means to help reveal where you might be stale.

2. Link to True North

Because the metrics we were reviewing were our True North metrics, they naturally aligned. If you are not looking specifically at organization (or department/section/etc.) True North metrics, think about how to link to them. How does the work directly contribute to the mission of the organization? How is what you’re working on connected to department goals? It’s much easier to get buy-in for support and resources if you can connect the work to higher goals or show the benefit at a strategic level.

3. Ask the users of the work - what matters to you?

When we first started reviewing how we might improve our metrics, we began brainstorming different ways to measure quality and customer service. Then we paused and said, “What if we ask our customers what’s important to them?” This was a game changer for us. We reached out to a variety of customers and asked them to describe a quality customer experience with Catalysis. We received some really interesting responses, including:

  • “… after every Catalysis event I learned something new or met an interesting/stimulating person.”
  • “connecting, learning and contributing with peers in an environment that allows me to stop, reflect, absorb and build my own action / reaction plan.”
  • “… I have felt heard and respected, and learned something or gained new insight that leaves me energized and committed to think and act differently - in a way that helps me deliver more value - to my organization…”

In your work, you could reach out to direct customers or patients, internal users of your process, or simply ask for input from outside eyes. We thought we knew what our customers valued and had some ideas around metrics, but when we went out and actually asked them, we were surprised! It can be easy to prioritize differently than your customers do, so outside perspectives can be extremely beneficial.

4. Stratify it 

At first glance, taking qualitative input and turning it into a quantitative measure seemed a bit daunting. We looked at each piece of feedback and grouped it by theme as best we could to find trends. We then brainstormed how to phrase the meaning of quality in a way that meant something to both the customer and to us. We decided to use a 10-point scale for our metrics and ended up with three meaningful metrics, asking our customers to tell us to what extent they:

  1. Would recommend this [service] to a colleague
  2. Were inspired to take action after the session
  3. Found the content current and relevant

Of course the numbers only give a general direction – the meat is in the comments that customers share to support their ratings.
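If it helps to see the arithmetic, here is a minimal sketch of how feedback like this could be rolled up. It is not our actual survey tooling; the data, field names, and ratings below are hypothetical. It simply averages the 10-point ratings for each of the three questions and keeps the comments alongside the scores, since that’s where the meat is.

    # Hypothetical illustration: average the 10-point ratings per question,
    # then review the comments that give the numbers their meaning.
    from statistics import mean

    # Each response: one rating (1-10) per question plus a free-text comment.
    responses = [
        {"recommend": 9, "inspired_to_act": 8, "current_relevant": 10,
         "comment": "Left energized and committed to act differently."},
        {"recommend": 7, "inspired_to_act": 6, "current_relevant": 8,
         "comment": "Great peer connections; wanted more time to reflect."},
    ]

    questions = ["recommend", "inspired_to_act", "current_relevant"]

    # The averages give the general direction...
    for q in questions:
        print(f"{q}: {mean(r[q] for r in responses):.1f} / 10")

    # ...and the comments carry the meaning, so review them together.
    for r in responses:
        print("-", r["comment"])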

5. Make a plan

As with all PDSA cycles, the important work happens when you analyze the feedback and use it constructively to move the needle in the intended direction. With our services, we do a PDSA after each session. It might be a small “study and adjust” to tweak something for the next time, or it might be a review of feedback that affects multiple offerings. Equally important is a regular review of the metrics themselves. Are you getting the feedback and input you expected? Are you able to adjust actions and behaviors? Is the needle moving? Why, or why not?

How are your metrics working for you? What steps could you take to review and improve them? Let us know how you’ve incorporated meaningful metrics into your work!
