Measuring competency gains: an elusive objective

DECI-4 Blog

By Ricardo Ramirez and Dal Brodhead, DECI-4

DECI’s raison d’être is to provide evaluation and communication capacity development. How do we know to what extent we have achieved our training objectives? We work directly with individuals who are charged with evaluation design and communication planning in their respective organizations. As they build on their experience and weave in the ideas that we provide, they interact with their colleagues, thereby sharing some of the new insights. We have recently experimented with measuring competency gains at the individual level. We are well aware that the individual and organizational levels are interconnected. Labin (2014) proposes an integrated evaluation capacity-building model with outcomes at both the individual and organizational levels. At the individual level, the model mentions attitudes towards evaluation, as well as knowledge, skills, and behavior. Most important in our context is instilling evaluation and communication thinking – it is a different way to see the world.

To measure individual learning outcomes among the partner staff who worked with DECI, we used the five domains identified by the Canadian Evaluation Society (2018) to group competencies:

  • Reflective Practice;
  • Technical Practice;
  • Situational Practice;
  • Management Practice; and
  • Interpersonal Practice.

The Research Communication (ResCom) self-assessment form was developed to document research communication competency gains. It follows the same five competency domains that the Canadian Evaluation Society uses for evaluation. The competencies were collected from numerous communication sources. Both forms use a format adapted from Knowles (1975) that allows learners to identify their self-assessed level of achievement, with a view to mapping out areas for further learning. Below we copy the summary of 13 responses for the domain of reflective practice in evaluation. We have highlighted the competencies with the highest scores in gray.

The competency self-assessment forms provide a partial picture of individual staff members’ reflections on their acquired skills. Unfortunately, we did not conduct this exercise earlier to set a baseline for comparisons. Further and possibly more compelling evidence of their achievement exists in the form of presentations prepared by staff assigned to evaluation and communication roles for internal knowledge sharing.

We conducted a joint analysis of the findings within the DECI team. Among the conclusions, it was clear that the tool needs to be complemented by interviews or focus groups, whether with the person filling in the form or with their supervisor; there is also the possibility of having pairs of staff fill in the form for each other. The tool is very detailed, but does it capture the broader picture? The four-level response scale was limiting; a more detailed 1–10 rating may have been more appropriate to capture a continuum of change. We have since revised the tools and hope to apply them early on to record a baseline. We need to make it explicit that the effort is not about documenting attribution to DECI’s mentoring, but about an overall change in competencies to which DECI may have contributed. Finally, we need to find ways to make the exercise a self-learning tool and non-threatening.

The revised self-assessment tools are available from our website (CC-BY license), along with a summary of the findings and analysis. We also feel that there is a need to better understand the mechanisms by which individual competencies are translated into organizational-level capacity building.

References 

Knowles, M. (1975). Self-directed learning: A guide for learners and teachers. Chicago: Follett Publishing Co.

Labin, S. N. (2014). Developing common measures in evaluation capacity building: An iterative science and practice process. American Journal of Evaluation, 35(1), 107–115.