“No offense, but I’m not actually going to read this,” said a client last week about my Final Evaluation Report. I’ve been gathering data for two years and have spent countless hours putting it together. Actually, I don’t take offense. The final report format requires a certain comprehensiveness of me, by which you can also envision a swath of dust-catching pages full of detailed data, long explanations, and figures.
In fact, I consider the final report an essential document, because it is the full version that details the methodology, data sources, analysis and other important information. I know, however, that this is not the final product that my client wants to see. It’s just the repository for all of the relevant information, including appendices with all of the survey instruments, interview protocols, and detailed results.
What my client wants to see is a richer representation of the data. They want to see it in colour, in context. They want to know what it means. This is one of the most exciting and meaningful parts of my work. I have created a number of reports in association with the final report, which help to visualize the data available, and help explain the relationships between different aspects of the work. This “report” is no longer one thing; it is a variety of versions and formats which may have multiple goals: understanding the process of a particular strategy, articulating outcomes within a combination of strategies, illustrating the results of a particular method, and communicating with different kinds of audiences ranging from internal decision-makers to community partners. This is another step beyond data analysis, drawing on skills in communication and design, and it’s challenging but rewarding.
You can find out more about better evaluation reporting from the exceptional Kylie Hutchinson, who is a great guide in making sense of data in every situation. There are also other helpful resources out there, such as Stephanie Evergreen and Ann K. Emery.
This is the new evaluation reporting. Someone is actually going to read the evaluation report. Our ability to create a meaningful, accessible report means that it will have a better chance of supporting important decisions to come and improving the work being done. Personally, I find that very exciting!
At a recent evaluation workshop put on by the Tamarack Institute featuring Liz Weaver and Mark Cabaj, I attended a session by Vanessa Timmer. Vanessa demonstrated systems mapping using an ecosystem model, asking us to find the connections between different parts of the system. I’ve used similar tools before, but have struggled with how to make sense of the work beyond the process. Vanessa put my mind at ease, noting that systems mapping is about being curious about connections and patterns. Essentially, the process is a big part of the outcome with this tool. It allows a group to communicate about its assumptions, perceptions and expectations. Seen in that light, I really appreciated what it could do.
We also watched an incredibly illuminating video by Eric Berlow. He showed us how to take something complex and break it down into something simple. He asks two basic questions: 1. What is the sphere of influence you care about most? 2. If you eliminate everything not actionable and not under your control, what is left? The answer is pretty simple.
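Berlow’s two questions can be read as successive filters over a system map. Here is a minimal sketch of that idea; the node names and attributes are invented purely for illustration:

```python
# A toy system map: each node is tagged with whether it sits in my
# sphere of influence and whether it is actionable by me.
system_map = {
    "national policy":       {"influence": False, "actionable": False},
    "community attitudes":   {"influence": True,  "actionable": False},
    "local programming":     {"influence": True,  "actionable": True},
    "volunteer training":    {"influence": True,  "actionable": True},
    "global funding trends": {"influence": False, "actionable": False},
}

# Question 1: keep only the sphere of influence you care about most.
# Question 2: eliminate everything not actionable and not under your control.
focus = [
    node for node, attrs in system_map.items()
    if attrs["influence"] and attrs["actionable"]
]

print(focus)  # the much simpler picture that remains
```

What is left after both filters is, as Berlow suggests, pretty simple: a short list of things you can actually work on.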
Take three minutes to watch this brilliant video:
At a recent evaluation workshop by the Tamarack Institute, Liz Weaver and Mark Cabaj challenged us to consider evaluation as a kind of inquiry. Mark talked about how the conceptual use of evaluation helps us to reflect on multi-dimensional issues, noting that all frameworks reveal and distort; our challenge is to be effective in adapting. Liz urged us to think about our impact at a whole-community level. They suggested that learning requires implementation: we must try things in order to understand them. Mark talked about how there’s nothing worse than looking at a complex problem from behind a desk; we need learning-rich experience to help us understand complexity.
My colleague and I shared our work using Most Significant Change to evaluate a provincial healthcare initiative. Our challenge has been to evaluate across multiple organizations, regions and strategy types. We’ve been using Most Significant Change as a generative approach, where diverse stakeholders bring forward stories that are significant to them within key domains identified at the provincial level. This participatory method is helping us tell the story of innovation in a complex and ever-changing environment. It’s allowing us to gather rich data directly from the field, from those impacted by and influencing the changes. The power of story is also that in the process, we are building relationships and increasing our collective capacity to articulate the work, the outcomes, and their significance at a local and provincial level.
I was reviewing my achievements and learnings for 2015: working with great new clients, building even stronger relationships with existing clients, taking on new leadership roles, letting go of volunteer commitments I could no longer maintain, communicating using media that is new to me, presenting and learning from others at conferences and events, and building the facilitation and training side of my work.
I was preparing to make some resolutions for the new year, when an email from the Chopra Lifestyle Centre helped me reorient my thoughts. The article suggested that people are more likely to achieve goals than to stick to resolutions. In my own work I would have suggested the same thing. I had forgotten this simple yet transformative thing when considering my year ahead!
Goals help to orient us. They keep us “on track” exactly because if we know our goals, we can always figure out our direction no matter where we are on the path, or how we’ve gotten lost along the way. Goals give us a tangible “what” to work towards, and leave it up to our creativity to bring our goals into reality.
Resolutions tend to be more about “how” we do things, which is harder to adapt, less motivational, and not as easy to visualize. With that in mind, I have some inspiring goals I’ll be working on in 2016.
Thanks to everyone who helped me learn and grow in 2015. I wish you all the best in 2016.
How Evaluation Saved my Family Christmas
A few weeks ago, my dad sat me down and started asking about whether we’d be having stockings for Christmas, and who would do it, and suggesting that maybe we shouldn’t have stockings anymore. My sister called me the same night to talk about whether we should have turnips in the mashed potatoes or not. I could feel the start of it…Christmas drama.
Instead of engaging in the usual backroom negotiations and lobbying for my preferences, I used appreciative inquiry to think about how I could make this the best Christmas ever. What do we all really want? I know that deep down we all want to have a joyful holiday that brings us together to eat good food and enjoy one another’s company. The worry about what kind of gifts to get or who should cook the gravy using what technique was distracting us from what is truly important.
I spent some time thinking about the tools I use in my daily work, and I decided to try a Christmas survey. What better way to find out what everyone wants? To be successful, this would need to incorporate some important principles:
1. Full engagement: Everyone would have to participate to be heard, and the experience would have to be fun to inspire their participation
2. Transparency: The raw results would have to be shared to build trust
3. Simplicity: The survey would have to be quick and simple
4. Actionable recommendations: I’d have to translate the raw data into actionable items that everyone could understand and participate in
I did my best to make the invitation to the survey fun, and to make the survey itself an enjoyable experience. I incorporated the following elements:
· Casual, fun language in the survey invite
· Explicit statement of goal in the invite: “We want to make this the best Christmas ever”
· Everyone was asked to click on “I will help make this the best, most fun Christmas ever” at the end of the survey
· The survey was highly visual, with fun and symbolic Christmas images like trees and ornaments
· The survey questions were short and simple
The full results were presented so that everyone could see the raw data. On top of the raw results, I helped with interpretation by clearly showing which choices were most and least popular. This was important in an environment where one person in a family might speak for another. For example, in one of the comments, someone said: “Thanks for organizing! Absolutely no purchased items for me or anyone in our family :-) Can't wait for cookie day - I vote sugar cookies”. Needless to say, the other family members actually voted that they wanted stockings and gifts. The survey allowed each individual to express their wishes without going through a family representative who might alter their response. Transparency was key to establishing trust in the process and to further supporting buy-in.
The survey had only two questions:
· How do you feel about the following activities?
· What do you want to eat?
The scales were simple and fun:
· Activities: Love it! * Um…maybe. * Kill me now.
· Food: Yum ☺ * If I must… * Gross.
I incorporated images into the survey and the “report” to keep it entertaining.
The results were important to share, but in order for people to take the data and act on it, the recommendations needed to be clear. In the case of Christmas, this took the form of a list of activities, menus for dinner and dessert, and a budget for gifts and stockings.
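Turning raw tallies into clear recommendations can be as simple as ranking each option by net enthusiasm. This is a hypothetical sketch of that step; the activities, scale labels, and vote counts are invented for illustration:

```python
# Invented raw tallies for a three-point "Love it! / Um...maybe. / Kill me now." scale.
raw_votes = {
    "cookie day":    {"Love it!": 5, "Um...maybe.": 1, "Kill me now.": 0},
    "stockings":     {"Love it!": 4, "Um...maybe.": 2, "Kill me now.": 0},
    "carol singing": {"Love it!": 1, "Um...maybe.": 2, "Kill me now.": 3},
}

# Weight the scale so enthusiasm counts for, indifference counts nothing,
# and strong dislike counts against.
weights = {"Love it!": 1, "Um...maybe.": 0, "Kill me now.": -1}

def net_score(tallies):
    """Sum the weighted votes for one activity."""
    return sum(weights[choice] * count for choice, count in tallies.items())

# Rank activities from most to least popular.
ranked = sorted(raw_votes, key=lambda a: net_score(raw_votes[a]), reverse=True)

print("Most popular:", ranked[0])
print("Least popular:", ranked[-1])
```

The ranked list maps directly onto the kind of actionable output described above: the top of the list becomes the activity plan, and the bottom is quietly dropped.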
Several people made new suggestions in the activities section, so we sent out a one-question ranking survey as a follow-up, and we are now starting a new family tradition: we’ll be watching James Bond together.
This is the Best Christmas ever!
I’m happy to say that I’m more convinced than ever that the tools of evaluation can help us to encourage engagement, build relationships, and create a shared sense of purpose and agreed process. We’ll be having English trifle for dessert, which is my favourite, and everyone has been completely on-task and excited about Christmas.
Last week the Canadian Evaluation Society BC & Yukon Chapter held a conference on the theme of Collaboration, Contribution and Collective Impact.
This sold-out event reminded me how much we want to collaborate in our own field. We work together with clients, partners and stakeholders all the time, incorporating new practices and innovating to build relationships and improve our outcomes.
Paul Kishchuk talked about wisdom outcomes being integral to making a difference as evaluators. We learned new skills like graphic facilitation from @jackiecamsden, designed to help us collaborate. I couldn’t attend the Evaluation Therapy session, but I heard from many people how much they enjoyed the chance to intervene in common situations and act out their own recommendations, quite literally. Incorporating a new element like theatre adds to our toolbox in unexpected ways.
Funders talked frankly about how hard it is to standardize metrics, because to respect context naturally leads to adaptation and differentiation, but to tell a bigger story, we need to define common measures.
What did I take away from the experience?
1. We learn and grow better together
2. Collaboration is complex and we benefit by being creative, observant and adaptable to make sense of it in evaluation
3. Collaboration is a confluence of individuals, mechanisms and systems, and we need to approach it with this understanding
Thanks to the many excellent presenters and to the attendees who provoked meaningful conversations. I look forward to next year!
I recently reviewed Rethinking Canadian Aid, edited by Stephen Brown, Molly den Heyer, and David R. Black, for The Philanthropist journal. It's the first book about Canadian foreign aid since CIDA became part of DFATD, and a fascinating read. Read the review.
I've been collecting stories of Most Significant Change as part of the evaluation of a systems-change initiative in the healthcare sector. It's innovative, complex and it's the first time anything of this scale is being attempted by my client.
The Most Significant Change method calls for collecting stories within a certain domain of change, so we've been focusing on the area where there has been the greatest impact since the inception of the initiative in the fall of last year. Stories have come from physicians, patients, partners and allied health professionals. Up until now I was recording the stories, but there still wasn't clear buy-in about the process of using stories as part of the evaluation.
The magic happened last week when we went through the story selection process. For the first time, the Evaluation Working Group had the opportunity to hear the stories of Most Significant Change in people's lives, and they realized what a difference they were making. The stories showed the incredible challenges of patients seeking care for complex conditions, and of care providers struggling to support patients whose health was clearly affected by social barriers beyond what the providers could address. The group also got to see how the changes came about and why they were significant to the storytellers. The group discussed at length the significance of each story, and it didn't take long before they identified themes in the stories that reflected their original motivation for getting involved in the initiative. As the conversation touched on the raw experiences of the initiative, there was an opportunity for deep reflection.
This was one of the most satisfying meetings I've had all year. There was a "click" where the hard data started to take on faces and experiences, guiding us through the journey of change that has been happening. The stories illustrated the change in a way that made our survey statistics and care data come to life.
Here are some things to consider when using Most Significant Change:
1. Stories help illustrate the context. They're complementary, though, and are most valuable when presented with hard data that shows the bigger picture.
2. Gather stories from a diversity of respondents. The target interview groups should ideally be identified as part of the evaluation plan.
3. Be ready to facilitate! The selection process is rewarding but needs guidance to maintain a safe, open space and help nudge the group towards a decision using a process they feel comfortable with.
Enjoy the process!
How do you deal with the complexity of collaborating organizations that are on different timelines, with power differentials and varying levels of data quality? Krishna Belbase of the Evaluation Office of UNICEF introduced the Resource Pack on Joint Evaluations, developed by the UN Evaluation Group, at the CES 2015 conference in Montreal. He suggested that it is structured for UN agencies, but could be adapted to suit other organizations. The Resource Pack is rich not only because of the simple yet comprehensive guide it provides for evaluation, but also because of the way it details the governance structures needed to support organizations working together on evaluation.
In today’s world many evaluations are done with some element of collaboration, and the Guidance Document and Toolkit that make up the Resource Pack can be used to help define the key functions, structures, and questions to ask when determining how to govern evaluation.
The Guidance Document helps tease out the various functions like communication, management, technical input, and logistics. The Toolkit then walks you through the steps from deciding to work together on an evaluation, preparing for the evaluation, implementing the evaluation, to utilizing the outcomes. It addresses sticky issues like readiness and buy-in, and provides advice at every stage from developing terms of reference to disseminating findings.
Do you need a steering committee, management group, reference group, stakeholder group, or advisory group? The Toolkit lays out the considerations for making important decisions about the most appropriate governance structure for your situation. Overall, the Resource Pack on Joint Evaluations is a great resource for any organization looking to support decision-makers and leaders in structuring their governance, and provides tools such as checklists, examples and good practices to evaluation practitioners.
Check out this amazing resource: Resource Pack on Joint Evaluations
As planners and evaluators we have an important and influential role in supporting decision-making, but we are ultimately advisors. How can we exercise leadership in our role while respecting the role of final decision-makers? I've found a few approaches that help:
1. Help identify decision-makers
2. Facilitate the development of terms of reference, scope of decisions or decision-making criteria
3. Ensure that organizational values are reflected in the plans and evaluation framework
4. Articulate evaluation findings and recommendations in a language and format that meets the needs of decision-makers
5. Time work to accommodate upcoming decisions and information needs
Good governance is important to everyone and we are responsible for contributing to a sound decision-making process. We can take leadership in strengthening the decision-making process in an advisory role.