The well-organized and visionary Francophone evaluation network (RFE) conference in Marrakesh was an inspiration for me as Co-Chair of this year’s Canadian Evaluation Society Conference. I had the good fortune to connect with Jamal Ramdane, President of the host organization, the Moroccan Evaluation Association; Jean-Marie Loncle from the Francophone evaluation network; and Mouna el Ghormli, Program Coordinator of the Moroccan Evaluation Association, all of whom graciously shared their insights on organizing a successful conference.
The conference brought together attendees from over 20 countries, representing an array of sectors that was both diverse and distinct from what I’ve seen at CES. Their focus on institutions in government, parliament and government audit made for an interesting mix. High-level government presenters talked about how they are institutionalizing evaluation. This was not evaluators as outsiders looking in, but rather institutions conceptualizing evaluation as a core function. The discussion led me to reflect on questions of evaluation use and our conference theme. Ultimately, conversations about innovation, action and reflection in evaluation are motivated not only by our intrinsic desire to use evaluation to innovate and improve, but also by a call to serve societal needs, a theme underlying the discourse at the RFE conference.
The RFE conference also made space for strengthening the role of Voluntary Organizations for Professional Evaluation (VOPEs). This inspired us to invite volunteer organizations in Canada, CES Chapter Councils and CES Board Members to come together during CES 2017 to become familiar with the work each of us is doing, build relationships, and find opportunities to collaborate.
There was also a focus at the RFE conference on supporting emerging evaluators. The network of francophone emerging evaluators (RF-Ee) was officially launched at the CES conference in Montreal, and continues to play a leadership role in supporting emerging evaluators. Our own CES BC and Yukon (CESBCY) Chapter’s Michelle Naimi has played a leadership role for us locally in supporting student and emerging evaluators, and is now joined by Carolyn Camman, as a conference volunteer and new CESBCY Council Member. Under Michelle’s leadership, CESBCY initiated a number of student activities, including student events, student presentations, student bursaries, and mentoring programs. We are bringing that spirit to the CES 2017 conference, in alignment with the international work being done by CES and other international groups, to recognize the important role of students and emerging evaluators in the future of our profession.
The RFE conference also supported dialogue about innovations in evaluation methodology, and key areas of international significance, including human rights and Sustainable Development Goals
(SDGs). We know that presenters are bringing their own innovations, action and reflection as the basis for discussion in Vancouver, and we are building in novel networking events to support lively conversations.
Finally, I’m proud to report a great deal of international interest in the Credentialed Evaluator (CE) designation, as CES has played an important leadership role in the professionalization of evaluation. I am grateful for the inspiring discussions and ideas at the RFE conference. I look forward to building on that inspiration and contributing to the momentum in our field at CES 2017.
Read the proceedings
(in French) here: http://www.portail-rfe.org/ressources-du-fife2
Building meaningful frameworks for measuring diverse projects contributing to common outcomes requires a clarity and simplicity that is truly challenging to achieve. This has been my focus this year for a few programs at the provincial scale.
Communities in rural and urban areas have distinct challenges, and there are different cultures in different regions. This influences the nature of issues being addressed, as well as the kinds of approaches needed to successfully build community and effect change. “Shared measurement” rolls off the tongue, but finding ways to structure it among diverse organizations in a wide range of communities takes a thoughtful approach.
I’ve found a few things that help:
1. Aligning common measures at the right level. Common measures can’t be so specific that there can be no comparison, nor so broad that there is no longer any meaning.
2. Focusing on a small number of common quantitative metrics that are feasible and meaningful for all parties to collect. These may look more like outputs than outcomes, but they can be rolled up better than the most perfect metrics that only 20% can reasonably collect.
3. Building in a process for organizations to gather data about their specificity. What are they experiencing in their community? What particular approaches do they take, and in what context? This starts to give meaning to the metrics, allowing us to see how the overall outcomes are working, and why.
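To make the roll-up in point 2 concrete, here is a minimal sketch of how a small set of common output metrics might be summed across organizations while each group’s local context travels alongside the numbers. All organization names, metric names, and figures below are hypothetical, not drawn from any actual program:

```python
# A minimal sketch of "shared measurement" roll-up: each organization reports
# the same small set of feasible output metrics, plus a free-text context note.
# All names and numbers here are hypothetical.

COMMON_METRICS = ["participants_reached", "sessions_held"]

reports = [
    {"org": "Org A", "participants_reached": 120, "sessions_held": 8,
     "context": "Rural; transportation was a barrier, so sessions were combined."},
    {"org": "Org B", "participants_reached": 340, "sessions_held": 15,
     "context": "Urban; partnered with a local library for drop-in sessions."},
]

def roll_up(reports, metrics):
    """Sum each common metric across organizations; keep context notes intact."""
    totals = {m: sum(r[m] for r in reports) for m in metrics}
    contexts = {r["org"]: r["context"] for r in reports}
    return totals, contexts

totals, contexts = roll_up(reports, COMMON_METRICS)
print(totals)  # {'participants_reached': 460, 'sessions_held': 23}
```

The design choice here mirrors the three points above: the aggregation only touches metrics every party can collect, while the qualitative context stays attached to each organization rather than being averaged away.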
This is all easier said than done, but I am heartened by the responses from groups once they see their results presented as shared outcomes: it starts to feel powerful. Qualitative data and stories that illustrate local experiences in a particular context add depth of meaning. Taken as a whole, this can be a valuable way of demonstrating that our differences are what make us effective in a variety of settings, and that contributing to common outcomes makes us effective at scale.
I’m delighted to be Co-chairing the 2017 CES conference with Chris Lovato. My first introduction to CES was at the 2010 conference in Victoria. I’d been working in evaluation for many years, including conducting training in monitoring and evaluation. At that time, though, I wasn’t familiar with evaluation as a discipline.
What struck me most about the conference was how collegial attendees were, and how curious they were to learn. I enjoyed the sessions I attended, and more than that, I savoured the open and exploratory conversations that took place throughout the event. Almost every session made me feel like I’d finally come home. People were friendly, used a language I understood, and talked about topics I cared about. I felt from that time that I was joining a community of people working in the same realm, though the boundaries were still being worked out.
We’ve advanced a great deal as a profession since that conference, and it has only been six years. Since 2010, the Credentialed Evaluator program has been implemented in Canada, and there is a movement to promote evaluation internationally through VOPEs and global initiatives and organizations such as Better Evaluation, EvalYear 2015 and the International Organization for Cooperation in Evaluation. I’m honoured and constantly invigorated to be part of the evaluation community.
The enthusiasm and energy that members of our Planning Committee are bringing to CES 2017
is inspiring. I’m looking forward to “Facing Forward” together with evaluation colleagues joining us in Vancouver.
I’ve been hearing a lot about failure lately: “I don’t want to admit that I have failed”, “Let’s re-frame these results so it doesn’t seem like we failed”. Most of these comments have come from what I consider to be successful people, running successful programs and businesses. Why is this conversation coming up about failure?
People are trying new things. We live in a complex world that requires innovation to adapt and improve our approaches. It’s courageous to try something new, especially when we’ve received funding and feel like we’re accountable for results. Innovation requires a certain kind of risk, because we are doing something we aren’t actually sure will work. If we already knew what would work, and if our context never changed, we could confidently continue what we are already doing.
Why is innovation so important?
We live in a changing world with new challenges, complexity, and ever-shifting influences. Innovation allows us to imagine new solutions. This requires a lot of leadership and willingness to learn.
Why is innovation so hard?
We may understand the need to change, simply because we know our approach needs to improve or because we can envision a better way. Knowing change is needed, however, isn’t the same as knowing what to do about it. Until we try something, we don’t know if it will work: we are operating in a constellation of needs, stakeholders, funding, relationships and other pressures that will impact the best-laid plans. We can’t know how something will work until we try it. Trying something new means exposing ourselves to a world of unknowns. Our challenge is to act with our best information, intentions and approach. Then we need to reflect, because there will be nuances to our experience that can teach us a great deal about how we might move forward effectively. Listening carefully is the key to our ability to learn and improve, carrying forward a clear understanding of what didn’t work just as much as what did.
I am much more worried about failure of imagination, failure to act, and failure to reflect, than I am about hearing “this completely failed, let’s learn from it.” The very reason that we tried something new was to see if it worked. If it didn’t, let’s not repeat it, and let’s understand why.
How do we as evaluators create a safe space to talk about failure? It’s a conversation that helps us evolve and grow as a profession. It’s key to supporting our clients to benefit from their experience; saying something failed shouldn’t be about admitting weakness, it should be about celebrating a new approach and building collective wisdom around how it worked, what didn’t work, and what lessons can be learned and shared.
What can we do to support talking about failure?
· Create a safe space for the conversation
· Make it clear from the beginning that learning is the goal
· Focus on the experiment, not the success or failure of the organization carrying it out
If we knew exactly how to do something, it wouldn’t be innovation. We can create the opportunity to build on our failures through innovation, action, and reflection.
I’ve just completed an intensive series of reports to close multi-year evaluations with several clients. The experience has led me to reflect on my team members and how grateful I am to work with so many amazing people.
As an external evaluator I am at times seeing things from a different perspective than the Executive Directors, Project Managers and administrative staff that I work with. This is my advantage and my challenge; I have a fresh way of seeing things, but I am also one step removed from the day to day actions and changes that impact program outcomes. Sometimes the data I see is out of date and I don’t realize it, and sometimes the data I happen to have doesn’t tell the full story.
Working in collaboration with my clients, I am able to bridge these gaps and get a truer picture of what happened, and why. I am grateful to have an excellent Research Assistant who also helps me to prepare the data we do have and identify big-picture questions that we need to take back to our clients.
As we close this period of reflection I am grateful to the wonderful people I work with, and wish everyone continued success in improving healthcare, building collaboration and contributing to system-wide changes that make all of our lives better.
I recently presented a workshop on confidence and motivation for a mentorship program I support, and was asked to provide more concrete examples of how to nurture and maintain confidence and motivation.
As an entrepreneur, I need to keep up my motivation in the face of a multitude of opportunities and challenges; as a leader, I need to maintain my confidence, trusting that I’m doing my best, while knowing how to ask for help, support my team, and create space for everyone to bring their best self so that we’re all contributing in the most meaningful way possible. It’s a balancing act that takes patience and humility, plus the dedication to continuously build new skills.
Here are some of my suggestions:

Prioritize
1. Read Stephen Covey’s The 7 Habits of Highly Effective People.
2. If you’re in a rush, skip to the part about what’s urgent and what’s important. Do the exercise to help yourself prioritize.

Be Yourself
1. Figure out your social style and be yourself.
2. Read about the concept of kintsukuroi, and learn about accepting change and embracing the beauty of what comes next.
3. Try new things.

Stay Motivated
1. Create a routine.
2. Take time to do the things you love.
3. Type “confidence” or “motivation” into YouTube or Pinterest or another sharing site, and get inspired, every morning or evening.
4. Subscribe to Notes from the Universe for free daily motivation.
“No offense, but I’m not actually going to read this,” said a client last week about my Final Evaluation Report. I’ve been gathering data for two years and have spent countless hours putting it together. Actually, I don’t take offense. The final report format requires of me a certain comprehensiveness, by which you can envision a swath of dust-catching pages full of detailed data, long explanations, and figures.
In fact, I consider the final report an essential document, because it is the full version that details the methodology, data sources, analysis and other important information. I know, however, that this is not the final product that my client wants to see. It’s just the repository for all of the relevant information, including appendices with all of the survey instruments, interview protocols, and detailed results.
What my client wants to see is a richer representation of the data. They want to see it in colour, in context. They want to know what it means. This is one of the most exciting and meaningful parts of my work. I have created a number of reports in association with the final report, which help to visualize the data available, and help explain the relationships between different aspects of the work. This “report” is no longer one thing; it is a variety of versions and formats which may have multiple goals: understanding the process of a particular strategy, articulating outcomes within a combination of strategies, illustrating the results of a particular method, and communicating with different kinds of audiences ranging from internal decision-makers to community partners. This is another step beyond data analysis, drawing on skills in communication and design, and it’s challenging but rewarding.
You can find out more about better evaluation reporting from the exceptional Kylie Hutchinson, who is a great guide in making sense of data in every situation. There are also other helpful resources out there, such as Stephanie Evergreen and Ann K. Emery.
This is the new evaluation reporting. Someone is actually going to read the evaluation report. Our ability to create a meaningful, accessible report means that it will have a better chance of supporting important decisions to come and improving the work being done. Personally, I find that very exciting!
At a recent evaluation event put on by the Tamarack Institute featuring Liz Weaver and Mark Cabaj, I participated in a workshop by Vanessa Timmer. Vanessa demonstrated systems mapping using an ecosystem model, asking us to find the connections between different parts of the system. I’ve used similar tools before, but have struggled with how to make sense of the work beyond the process. Vanessa put my mind at ease, noting that systems mapping is about being curious about connections and patterns. Essentially, the process is a big part of the outcome with this tool. It allows a group to communicate about its assumptions, perceptions and expectations. Seen in that light, I really appreciated what it could do.
We also watched an incredibly illuminating video by Eric Berlow. He showed us how to take something complex and break it down into something simple. He asks two basic questions: 1. What is the sphere of influence you care about most? 2. If you eliminate everything not actionable and not under your control, what is left? The answer is pretty simple.
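Berlow’s two questions amount to a simple filter over a system map. As a toy illustration (the factors and flags below are invented for this sketch, not taken from the video), each element of the map can be marked as actionable or not, and as within our control or not, and the simplification keeps only what passes both tests:

```python
# A toy illustration of Eric Berlow's two simplification questions applied to
# a system map: keep only the factors that are actionable AND under your
# control. The factors and their flags below are purely hypothetical.

factors = [
    {"name": "community trust",   "actionable": True,  "in_our_control": True},
    {"name": "provincial policy", "actionable": True,  "in_our_control": False},
    {"name": "weather",           "actionable": False, "in_our_control": False},
    {"name": "meeting format",    "actionable": True,  "in_our_control": True},
]

def simplify(factors):
    """Question 2: eliminate everything not actionable or not in our control."""
    return [f["name"] for f in factors if f["actionable"] and f["in_our_control"]]

print(simplify(factors))  # ['community trust', 'meeting format']
```

What remains after the filter is, as Berlow suggests, a much smaller and more manageable picture than the full map.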
Take three minutes to watch this brilliant video:
At a recent evaluation workshop by the Tamarack Institute, Liz Weaver and Mark Cabaj challenged us to consider evaluation as a kind of inquiry. Mark talked about how the conceptual use of evaluation helps us to reflect on multi-dimensional issues, noting that all frameworks reveal and distort; our challenge is to be effective in adapting. Liz urged us to think about our impact at a whole-community level. They suggested that learning requires implementation; we must try things in order to understand them. Mark talked about how there’s nothing worse than looking at a complex problem from behind a desk; we need learning-rich experience to help us understand complexity.
My colleague and I shared our work using Most Significant Change to evaluate a provincial healthcare initiative. Our challenge has been to evaluate across multiple organizations, regions and strategy types. We’ve been using Most Significant Change as a generative approach, where diverse stakeholders bring forward stories that are significant to them within key domains identified at the provincial level. This participatory method is helping us tell the story of innovation in a complex and ever-changing environment. It’s allowing us to gather rich data directly from the field, from those impacted-by and influencing the changes. The power of story is also that in the process, we are building relationships and increasing our collective capacity to articulate the work, the outcomes, and their significance at a local and provincial level.
I was reviewing my achievements and learnings for 2015: working with great new clients, building even stronger relationships with existing clients, taking on new leadership roles, letting go of volunteer commitments I could no longer maintain, communicating using media that is new to me, presenting and learning from others at conferences and events, and building the facilitation and training side of my work.
I was preparing to make some resolutions for the new year, when an email from the Chopra Lifestyle Centre helped me reorient my thoughts. The article suggested that people are more likely to achieve goals than to stick to resolutions. In my own work I would have suggested the same thing. I had forgotten this simple yet transformative thing when considering my year ahead!
Goals help to orient us. They keep us “on track” exactly because if we know our goals, we can always figure out our direction no matter where we are on the path, or how we’ve gotten lost along the way. Goals give us a tangible “what” to work towards, and leave it up to our creativity to bring our goals into reality.
Resolutions tend to be more about “how” we do things, which is harder to adapt, less motivational, and not as easy to visualize. With that in mind, I have some inspiring goals I’ll be working on in 2016.
Thanks to everyone who helped me learn and grow in 2015. I wish you all the best in 2016.