DECI as a capacity building approach in collaborative evaluation and communication*


By Joaquin Navas, Ricardo Ramírez & Mariana López-Fernández, DECI-4.

DECI (Designing Evaluation and Communication for Impact) is an action-research project on capacity building in evaluation and communication. By combining evaluation and communication, the DECI model provides a decision-making framework that helps social development and research projects be more strategic and adaptable to unexpected situations in order to maximize their impact. In this blog, we describe the key features of the DECI model, the main lessons learned and some of the challenges.

Distinctive features of the DECI model

The evaluator as process facilitator: In the Utilization-Focused Evaluation (UFE) approach (Patton, 2008) adopted by DECI, the role of the evaluator is not that of an “expert” who intervenes at the end of a project to judge its achievements. Instead, he or she is a facilitator who guides a team of intended users through a set of decisions that must be taken throughout the evaluation process.

Capacity building through mentors: In DECI, training is delivered through a team of mentors who offer timely support (just-in-time mentoring) to the person whom the partner organization has chosen to facilitate the evaluation process. The mentors offer support at a pace that matches each project’s calendar and stage of development. This allows mentoring to be adjusted to every context, as knowledge is shared in a timely fashion, and it allows the person being trained to cover the different steps in ways that fit their project’s needs. This tailored approach seems to be more effective than offering pre-designed, standard steps, like recipes shared through a workshop. In our experience, it is also more cost-effective for the partner project.

Evaluation as a framework for decision-making: Traditionally, project evaluations are carried out in response to donor requirements that emphasize accountability: verifying that objectives are being met and that resources are being used satisfactorily. In the DECI process, however, evaluation is seen as a tool for learning and decision-making throughout a project’s life cycle. The UFE approach proposes that the evaluation be designed and implemented from the beginning of the project rather than at the end. Primary evaluation users are the ones who determine the purpose and intended uses of the evaluation. They also formulate the questions that the evaluation should answer so that the findings are useful. Evaluation thus becomes a tool for collective reflection and strategic management that generates organizational learning. It is worth adding that the partner projects already provide the funder with regular technical and financial reporting.

Integrating evaluation and communication into projects: Communication planning and evaluation design share some common steps. As a research project, DECI has tested methodological innovations by integrating the two. The decision-making required to design evaluation and communication creates space for internal reflection within the project teams about how best to achieve outcomes and improve effectiveness.

Practical wisdom is only acquired through experience: Contrary to the notion of ‘best practices’, which suggests pre-determined “recipes”, practical wisdom refers to the ability to respond to every situation in a unique way according to the specificities of each context (Schwartz & Sharpe, 2010). The capacity-building approach adopted by the DECI project creates spaces where practical wisdom, rather than so-called ‘best practices’, guides decisions on evaluation and communication. This can only be achieved through the experience of applying specific strategies in each evaluation design and communication planning process. As mentors we are engaged in ongoing reflection, and we take the time to document each experience through case studies summarizing the process and outcomes, which can be found in the knowledge base of our website.

Lessons learned

Project readiness: a critical factor: Project “readiness” is not an evaluation tool but rather a management tool for assessing the enabling context (Ramírez, 2017) – and we have learned that it is a critical factor for implementing DECI’s capacity-building approach. We have also learned that when a project’s context does not provide enough readiness conditions, it is better to suggest that the project adopt a different evaluation approach. Our knowledge base includes several articles about readiness.

Trust between the mentor and evaluator makes a difference: It is important for the mentor to build enough trust with the evaluation or communication mentee. The mechanisms to achieve such trust depend greatly on the mentor’s personal style and his or her ability to read the project’s context. It is worth mentioning that in DECI’s context the mentor does not represent the donor, which is an advantage, as we play a third-party role. Furthermore, as an IDRC-funded project, DECI is subject to the same reporting conditions to IDRC as the partner projects we support, so in that sense we are peers.

Value is recognized at the end of the process: The UFE approach is difficult to grasp from written explanations alone for those unfamiliar with the process. However, experience shows that most organizations that collaborate with DECI end up convinced of the added value that the process provides. In addition to generating useful findings, the organizations appreciate the spaces for reflection and internal dialogue that the mentoring provides. This often results in rethinking strategies, revealing non-explicit assumptions, and confirming the value of previous decisions. This is particularly important for projects dealing with new or complex topics.

Challenges

Staff rotation in the partner projects: Staff changes within a partner project make continuity difficult, and in many cases they require the mentor to start the process from scratch with a new person. This doubles the effort and delays the work. Solution: the mentor requires patience.

Lack of time among staff working in the partner projects: The people working in the partner projects are usually working at full capacity, leaving little time for them to dedicate to evaluation or to planning a communication strategy. Solution: offer support with an open calendar to make the mentoring schedule as flexible as possible, and take advantage of moments when the process gains momentum in unexpected ways.

Lack of interest in the person to be trained: In some cases, the organization accepts DECI’s support and seems to fulfill the minimum readiness conditions, but it later becomes evident that the person to be trained is not interested in learning. This tends to happen when the person has a fair amount of experience in evaluation or communication and sees no value in additional training. Solution: perseverance by the mentor to help the person realize that DECI’s approach can provide new or complementary knowledge that is worth gaining and can prove useful for the organization.

*This blog is a summary of an article written for EvalParticipativa. You can read the full article here.


References and suggested readings

Brodhead, D. & Ramírez, R. (2014). Readiness & mentoring: Two touchstones for capacity development in evaluation. Paper presented at the CDI Conference: Improving the use of M&E processes and findings, 20-21 March 2014, Wageningen, The Netherlands.

Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice-Hall.

Mayne, J. (2009). Building an evaluation culture: The key to effective evaluation and results management. Canadian Journal of Program Evaluation 24(2): 1-30.

Patton, M. Q. (2008). Utilization-focused evaluation, 4th ed. Los Angeles, London, New Delhi, Singapore: Sage Publications.

Quarry, W. & Ramírez, R. (2014). Comunicación para otro desarrollo: Escuchar antes de hablar. Madrid: Editorial Popular.

Ramírez, R. (2011). Why “utilization focused communication” is not an oxymoron. Communication, media and development policy. BBC World Service Trust.

Ramírez, R. (2017). Un marco para la toma de decisiones en evaluación y comunicación: Resumen de investigación y comunicación. COMMONS – Revista de Comunicación y Ciudadanía Digital 6(1): 23-44.

Ramírez, R. & Brodhead, D. (2013). Las evaluaciones orientadas al uso: Guía para evaluadores. Penang: Southbound.

Ramírez, R., Quarry, W. & Guerin, F. (2015). Community Note: Can participatory communication be taught? Finding your inner phronēsis. Knowledge Management for Development Journal 11(2): 101-111.

Schwartz, B., & Sharpe, K. (2010). Practical wisdom: The right way to do the right thing. New York: Riverhead Books.