What's inside
Our clients often ask: "Hey, is there any other way to measure whether our AI channel is working, beyond thumbs up or down? Are users happy?"
Most projects are still stuck there. In theory it sounds fine: thumbs up = good, thumbs down = bad. In practice, very few users click and leave feedback, and the few who do don’t represent the opinions of the majority. The result: you end up designing for a loud minority and guessing for everyone else.
What you really need is a metric that measures whether the AI actually resolved the user’s intent. No wishful thinking, no opinions: real resolution. That’s SASR, the Search Answer Success Rate.
What is SASR?
SASR measures the percentage of searches that end with a useful answer. It draws from explicit signals (when available) and implicit signals that users leave without even realizing it.
Some examples (a sketch of how these might be combined follows the list):
- The user’s interaction sentiment is positive or neutral.
- Satisfaction words appear, like “thanks,” “perfect,” “ok,” or “got it.”
- The user keeps refining the same question, a sign they’re genuinely engaging with the information provided rather than abandoning it.
- The system delivers a complete answer because it has enough context in its knowledge base.
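To make this concrete, here’s a minimal sketch of how these signals could be combined into a per-search success flag. The field names (sentiment, user_messages, answer_complete, explicit_feedback), the satisfaction-word list, and the combination rule are all illustrative assumptions, not a prescribed implementation.

```python
import re

# Hypothetical satisfaction words; tune this to your users' actual vocabulary.
SATISFACTION_WORDS = {"thanks", "perfect", "ok", "great"}

def is_successful(interaction: dict) -> bool:
    """Classify one search interaction as a 'successful answer'.

    Assumed shape of `interaction`:
      sentiment          float in [-1, 1] from your sentiment model
      user_messages      list of user message strings
      answer_complete    bool, whether the knowledge base covered the question
      explicit_feedback  "up", "down", or None when no thumb was left
    """
    # Explicit feedback, when it exists, overrides the implicit signals.
    feedback = interaction.get("explicit_feedback")
    if feedback in ("up", "down"):
        return feedback == "up"

    # Implicit signals the user leaves without realizing it.
    sentiment_ok = interaction.get("sentiment", 0.0) >= 0.0        # positive or neutral
    text = " ".join(interaction.get("user_messages", [])).lower()
    tokens = set(re.findall(r"[a-z]+", text))
    said_thanks = bool(tokens & SATISFACTION_WORDS)                # satisfaction words
    answer_complete = interaction.get("answer_complete", False)    # enough KB context

    # Simple rule: sentiment is not negative, plus at least one positive signal.
    return sentiment_ok and (said_thanks or answer_complete)
```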
The formula is simple:
SASR = (Successful answers ÷ Total searches) × 100
You can view this metric in two ways: global or by topic. And trust me, the topic view is where the magic happens, because it shows you exactly where your system is solving users’ problems and where it’s failing.
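Here’s a sketch of both views, assuming each search has already been tagged with a topic and the success flag from the classifier above (the field names are illustrative):

```python
from collections import defaultdict

def sasr(searches: list[dict]) -> float:
    """Global SASR: successful answers / total searches x 100."""
    if not searches:
        return 0.0
    successful = sum(1 for s in searches if s["successful"])
    return successful / len(searches) * 100

def sasr_by_topic(searches: list[dict]) -> dict[str, float]:
    """Per-topic SASR, assuming every search carries a 'topic' label."""
    groups = defaultdict(list)
    for s in searches:
        groups[s["topic"]].append(s)
    return {topic: sasr(group) for topic, group in groups.items()}

# Tiny illustrative dataset (not real data).
searches = [
    {"topic": "Passwords & Access", "successful": False},
    {"topic": "Passwords & Access", "successful": True},
    {"topic": "Billing", "successful": True},
    {"topic": "Billing", "successful": True},
]
print(sasr(searches))           # 75.0 (global view)
print(sasr_by_topic(searches))  # {'Passwords & Access': 50.0, 'Billing': 100.0}
```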
Why SASR rocks
Because it’s a very actionable metric. A low number doesn’t leave anything open to debate; it points straight to the problem. And when you zoom in, it almost always comes down to one of two reasons:
- Content gaps: what you have is incomplete or outdated.
- Missing business rules: sometimes the “answer” forces the user to jump around or follow an endless flow that doesn’t really solve their need. Adding direct-action buttons, showing tutorial images, or even a map can make sense.
SASR tells you which of the two it is and where to focus. It also lets you see the impact of your actions clearly, week by week.
A quick example
One client had a 70% global SASR. Looking at topics, “Passwords & Access” was at 52%. Users were asking “I forgot my password” or “I can’t log into the app,” but the answers were vague and forced too many steps.
What we did: we added a Reset Password button directly in the AI’s response and image-based tutorials to the experience. So when users asked about their password, the answer actually solved the problem.
Result: SASR for that topic jumped from 52% to 86%. Since it was a high-volume topic, the global SASR rose to nearly 80%. Reformulations and incidents dropped.
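The global lift is just weighted arithmetic: each topic contributes its SASR in proportion to its share of searches. The case study doesn’t state that share, but as a back-of-the-envelope check, a roughly 30% volume share (an assumption) makes the numbers line up:

```python
# Hedged sanity check; the 30% volume share is an assumption, not a figure
# from the case study.
old_global = 70.0                  # global SASR before the change (%)
old_topic, new_topic = 52.0, 86.0  # "Passwords & Access" before and after (%)
topic_share = 0.30                 # hypothetical share of total searches

new_global = old_global + topic_share * (new_topic - old_topic)
print(new_global)                  # 80.2 -> "nearly 80%"
```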
The takeaway
SASR isn’t a vanity metric. It’s the control panel of the user experience. It tells you whether your AI channel is truly solving users’ problems, and shows you exactly where to act.
If you already have an AI channel, start measuring this KPI and you’ll see the difference. If you don’t yet, at AIFindr we can help you launch your AI channel, measure SASR from day one, and activate the continuous improvement loop.