Dec 3, 2024 · Behavior | Metrics

Measuring Engagement That Matters

“Engagement” is too often collapsed into a single proxy: time spent. For research and civic uses, we need metrics that privilege informed decision-making over stickiness.

The pyramid of intent

  • Exposure: Did the right people see it? (reach, qualified impressions)
  • Comprehension: Did they understand it? (read depth, clarity checks)
  • Action: Did it prompt a useful step? (form completion, plan creation)
  • Reflection: Did beliefs or strategies update? (self-report, follow-ups)
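The four levels above form a funnel: each user should be credited at their deepest level and every level beneath it. A minimal sketch of that counting logic, with hypothetical event tuples and level names invented for illustration:

```python
from collections import Counter
from enum import IntEnum


class IntentLevel(IntEnum):
    """Pyramid levels, ordered from broadest reach to deepest impact."""
    EXPOSURE = 1       # qualified impression
    COMPREHENSION = 2  # read depth / clarity check passed
    ACTION = 3         # form completion, plan creation
    REFLECTION = 4     # self-report or follow-up recorded


def funnel(events):
    """Count users reaching each level; a user's deepest event also
    counts toward every level below it."""
    deepest = {}
    for user, level in events:
        deepest[user] = max(deepest.get(user, IntentLevel.EXPOSURE), level)
    counts = Counter()
    for level in deepest.values():
        for l in IntentLevel:
            if l <= level:
                counts[l] += 1
    return {l.name: counts[l] for l in IntentLevel}


events = [
    ("u1", IntentLevel.EXPOSURE),
    ("u1", IntentLevel.ACTION),
    ("u2", IntentLevel.COMPREHENSION),
    ("u3", IntentLevel.EXPOSURE),
]
print(funnel(events))
# {'EXPOSURE': 3, 'COMPREHENSION': 2, 'ACTION': 1, 'REFLECTION': 0}
```

Reading the output as a funnel makes drop-off visible: here everyone was exposed, two of three understood, one acted, none reflected yet.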

Signals worth capturing

  • Quality time: focus-weighted dwell (discounts idle tabs), scroll stability, and return visits within 24h for the same topic.
  • Clarity loops: number of clarifying questions asked, and whether confidence in understanding rises after system explanations.
  • Constructive disagreement: respectful counter-arguments posted, edits after reading opposing evidence, and cross-faction replies that stay civil.
  • Outcome alignment: goal creation → action → confirmation (e.g., booked appointment, filed request, shared source with a peer).
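The first signal, focus-weighted dwell, is the easiest to get wrong: raw dwell time counts idle background tabs. A minimal sketch of discounting idle time, assuming a hypothetical time-ordered stream of (timestamp, kind) pairs emitted on tab focus and blur:

```python
def focus_weighted_dwell(events):
    """Sum only the seconds the tab actually had focus.

    `events`: time-ordered (timestamp_seconds, kind) pairs where kind is
    "focus" or "blur" -- an assumed schema, not a specific browser API.
    """
    total = 0.0
    focused_since = None
    for ts, kind in events:
        if kind == "focus" and focused_since is None:
            focused_since = ts
        elif kind == "blur" and focused_since is not None:
            total += ts - focused_since
            focused_since = None
    return total


# 0-30 s focused, 30-90 s idle in another tab, 90-100 s focused again:
session = [(0, "focus"), (30, "blur"), (90, "focus"), (100, "blur")]
print(focus_weighted_dwell(session))  # 40.0, not the raw 100 s on page
```

In a browser, the focus/blur pairs would typically come from visibility-change events; the point is that quality time is the focused subset, not the wall-clock span.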

Instrumentation tips

  1. Tag flows by intent at design time (inform, decide, request help). Avoid one-size-fits-all events.
  2. Pair quantitative signals with lightweight prompts: “Did this answer help you decide?” with Yes/No + short reason.
  3. Track the “two-hop” effect: did a user share a source or explanation with someone else?
  4. Respect privacy: aggregate and anonymize by default; make sensitive event capture opt-in.
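Tips 1 and 4 can live in the event schema itself: reject untagged intents at record time, and gate sensitive capture behind opt-in. A minimal sketch with a hypothetical `Event` shape and intent vocabulary (the field names and the truncated hash are illustrative, not a recommendation for production anonymization, which would also need salting):

```python
import hashlib
from dataclasses import dataclass, field
from time import time

# Intent vocabulary fixed at design time, per tip 1 (hypothetical names).
INTENTS = {"inform", "decide", "request_help"}


@dataclass
class Event:
    flow: str
    intent: str          # must be one of INTENTS
    name: str            # e.g. "checklist_completed", "source_shared"
    user_hash: str       # anonymized by default, per tip 4
    sensitive: bool = False
    ts: float = field(default_factory=time)


def record(flow, intent, name, user_id, *, sensitive=False, opted_in=False):
    """Build an event, refusing untagged intents and non-opted-in
    sensitive capture; never stores the raw user id."""
    if intent not in INTENTS:
        raise ValueError(f"untagged intent: {intent}")
    if sensitive and not opted_in:
        return None  # sensitive events are opt-in only
    user_hash = hashlib.sha256(user_id.encode()).hexdigest()[:12]
    return Event(flow, intent, name, user_hash, sensitive)
```

With this shape, `record("benefits", "decide", "checklist_completed", "user-42")` yields an anonymized event, while the same call with `sensitive=True` and no opt-in yields nothing to store.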

Example scorecard

  • Topic comprehension: +18% after adding inline summaries.
  • Clarifying questions per session: down 12% (better initial framing).
  • Decision completion for benefit applications: up 9% with checklists.
  • Constructive replies between differing groups: up 6% after moderation cues.
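Scorecard deltas like these are just relative change against a baseline period. A one-line helper, with hypothetical counts for illustration (not the data behind the figures above):

```python
def pct_change(before, after):
    """Relative change versus baseline, rounded to a whole percent."""
    if before == 0:
        raise ValueError("baseline must be nonzero")
    return round(100 * (after - before) / before)


# Hypothetical: 50 of 100 users passed a comprehension check before
# inline summaries, 59 after -> an 18% lift.
print(pct_change(50, 59))   # 18
print(pct_change(100, 88))  # -12, e.g. fewer clarifying questions
```

Report the baseline alongside the percentage; "+18%" on a tiny denominator is noise, not a finding.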

Engagement that matters is intentional: it is defined by the user’s goal and measured by how reliably they achieve it without unnecessary friction.