COMPEL Glossary / GL-65
Active-User Rate
The percentage of provisioned users who meaningfully engage with an AI system within a defined measurement window (typically weekly or monthly), where "meaningful engagement" is defined per use case with an explicit action threshold.
What this means in practice
Active-user rate is tracked by cohort and by business unit so that adoption gaps surface at the team level, not just in aggregate. It is the primary leading indicator for value realization.
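As a minimal sketch of the per-cohort calculation, the following assumes an illustrative event log and a hypothetical threshold of three meaningful actions per weekly window (the user names, cohort labels, and threshold are assumptions, not part of the framework):

```python
from collections import defaultdict

# Hypothetical data: which cohort each provisioned user belongs to,
# and how many meaningful actions each user took this week.
ACTION_THRESHOLD = 3  # assumed per-use-case "meaningful engagement" threshold

provisioned = {
    "alice": "sales", "bob": "sales",
    "cara": "support", "dan": "support",
}
weekly_actions = {"alice": 7, "bob": 1, "cara": 4}  # dan: no recorded activity

def active_user_rate(provisioned, actions, threshold=ACTION_THRESHOLD):
    """Active-user rate per cohort: active users / provisioned users."""
    totals, active = defaultdict(int), defaultdict(int)
    for user, cohort in provisioned.items():
        totals[cohort] += 1
        if actions.get(user, 0) >= threshold:
            active[cohort] += 1
    return {cohort: active[cohort] / totals[cohort] for cohort in totals}

rates = active_user_rate(provisioned, weekly_actions)
# sales: only alice meets the threshold -> 0.5; support: only cara -> 0.5
```

Reporting the rate per cohort, as here, is what lets an adoption gap in one team stay visible instead of being averaged away in an aggregate figure.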
Context in the COMPEL framework
A core metric of the Adoption dimension. Instrumented during Produce and reported in Evaluate; a key input to Value Realization reviews in Learn.
Where you see this
Active-User Rate is most commonly referenced when teams work across the Produce, Evaluate, and Learn stages, especially within the Value Realization layer. It appears in governance artifacts, assessment instruments, and delivery playbooks wherever COMPEL is operationalized.
Synonyms
WAU rate, MAU rate, active usage rate, engaged-user rate
See also
- Trust & Performance Dimensions — The eight continuous-measurement axes against which every AI transformation is evaluated in COMPEL: Value, Reliability, Safety, Responsibility, Compliance, Security, Sustainability, and Adoption.
- Time-to-Value — The elapsed time from a user being provisioned on an AI system to their first recorded value-generating interaction with it, measured at the cohort level.
- Value Realization — The end-to-end process of defining, tracking, and verifying the business value delivered by AI initiatives — from initial value thesis through baseline measurement, deployment, post-deployment review, and ongoing benefit tracking.