Engaged means emotionally attached to the organization and its mission, usually beginning with one’s supervisor and immediate circle of co-workers. Committed means fully invested in one’s role, goals, and accountabilities. Both concern emotion and motivation. Given that context, what do we mean when speaking of measurement and evaluation of progress?
Measuring and Evaluating Change
Measurement boils down to counting and comparing phenomena by means of a numeric scale. We do this to determine size, magnitude, or quantity. Is the current state of observed phenomena quantitatively different from an earlier state because of growth or change? Do the direction and rate of change imply improvement, stagnation, or decline?
Evaluating change and growth, by contrast, may or may not involve a scaled measurement of that kind. But it always involves judgment of the value of any change that has occurred. Its explicit meaning is judging value, not counting. And evaluation is often done without resort to quantitative scales because of the essential incommensurability of quantity and quality.
Reducing quality to quantity always comes at the cost of simplification and the loss of the original complexity and richness. Do we know for sure what we lose in this process? Is it "merely" the inherent plenitude of qualitative meaning as originally experienced in our concrete experience? And is the "merely" a warranted discount? These are important questions.
When we talk of measuring engagement, therefore, we must recognize that the instruments used to do so won’t replace the need for conversational discovery of meaning. Surveys and questionnaires are a starting point. They yield group-level data that can be a helpful stimulus for discussing what the responses mean.
Getting to Insightful Evaluation
Most survey firms won't disagree with these observations. They recommend focus groups and offer to help management interpret survey data. That inevitably takes them into concrete and "situated" discussions of issues, attitudes, and behaviors. All of this takes time, but it usually sparks practical interest and examination of causes and effects, which in turn generate hypotheses, recommendations for change, and next-step actions.
What we obtain from such qualitative analysis is sometimes called a gap analysis. And if we continue our qualitative focus on specific situations in our next steps and include other stakeholders in the process, we further specify and validate the gap analysis. In fact, what begins to happen is engagement and commitment: “It's our situation, let’s figure this out!”
And that is only the beginning of insightful evaluation. Unlike the clean edges of a survey, ongoing qualitative evaluation is messy because it is concrete and situated. We gain insight as we intervene with action. We experiment with practical curiosity. We try to be unrushed, positioning ourselves to notice the effects of our action.
We continue this until we've arrived at a new, more adaptive normal. Our attitudes, beliefs, and motivations are shaped by these collaborative actions. We create our own qualities of engagement and commitment. And we learn to notice the signs of troubled emotions and wavering motivation that signal negative change in engagement and commitment.
In this way, we become our own ongoing data collection and adaptive intervention system. Future surveys may continue to measure the basics (e.g., Q12-type data). But they may also look at capacities for doing the ongoing qualitative work, the conversational and reflective practices that produce our feelings of engagement and commitment.