I recently presented a workshop on confidence and motivation for a mentorship program I support, and was asked to provide more concrete examples of how to nurture and maintain confidence and motivation.
As an entrepreneur, I need to keep up my motivation in the face of a multitude of opportunities and challenges; as a leader, I need to maintain my confidence: trusting that I’m doing my best, while knowing how to ask for help, support my team, and create space for everyone to bring their best self so that we’re all contributing in the most meaningful way possible. It’s a balancing act that takes patience and humility, plus the dedication to continuously build new skills.
Here are some of my suggestions:

Prioritize
1. Read Stephen Covey’s The 7 Habits of Highly Effective People
2. If you’re in a rush, skip to the part about what’s urgent and what’s important. Do the exercise to help yourself prioritize.

Be Yourself
1. Figure out your social style and be yourself
2. Read about the concept of kintsukuroi to learn about accepting change and embracing the beauty of what comes next
3. Try new things

Stay Motivated
1. Create a routine
2. Take time to do the things you love
3. Type “confidence” or “motivation” into YouTube, Pinterest, or another sharing site, and get inspired every morning or evening
4. Subscribe to Notes from the Universe for free daily motivation
“No offense, but I’m not actually going to read this,” said a client last week about my Final Evaluation Report. I’ve been gathering data for two years and have spent countless hours putting it together. Actually, I don’t take offense. The final report format requires a certain comprehensiveness; envision a swath of dust-catching pages full of detailed data, long explanations, and figures.
In fact, I consider the final report an essential document, because it is the full version that details the methodology, data sources, analysis and other important information. I know, however, that this is not the final product that my client wants to see. It’s just the repository for all of the relevant information, including appendices with all of the survey instruments, interview protocols, and detailed results.
What my client wants to see is a richer representation of the data. They want to see it in colour, in context. They want to know what it means. This is one of the most exciting and meaningful parts of my work. I have created a number of reports in association with the final report, which help to visualize the data available, and help explain the relationships between different aspects of the work. This “report” is no longer one thing; it is a variety of versions and formats which may have multiple goals: understanding the process of a particular strategy, articulating outcomes within a combination of strategies, illustrating the results of a particular method, and communicating with different kinds of audiences ranging from internal decision-makers to community partners. This is another step beyond data analysis, drawing on skills in communication and design, and it’s challenging but rewarding.
You can find out more about better evaluation reporting from the exceptional Kylie Hutchinson, who is a great guide in making sense of data in every situation. There are also other helpful resources out there, such as Stephanie Evergreen and Ann K. Emery.
This is the new evaluation reporting. Someone is actually going to read the evaluation report. Our ability to create a meaningful, accessible report means that it will have a better chance of supporting important decisions to come and improving the work being done. Personally, I find that very exciting!
How do you deal with the complexity of collaborating organizations that are on different timelines, with power differentials and varying levels of data quality? Krishna Belbase of the Evaluation Office of UNICEF introduced the Resource Pack on Joint Evaluations, developed by the UN Evaluation Group, at the CES 2015 conference in Montreal. He suggested that while it is structured for UN agencies, it could be adapted to suit other organizations. The Resource Pack is rich not only because of the simple yet comprehensive guide it provides for evaluation, but also because of the way it details the governance structures needed to support organizations working together on evaluation.
In today’s world many evaluations are done with some element of collaboration, and the Guidance Document and Toolkit that make up the Resource Pack can be used to help define the key functions, structures, and questions to ask when determining how to govern evaluation.
The Guidance Document helps tease out the various functions like communication, management, technical input, and logistics. The Toolkit then walks you through the steps from deciding to work together on an evaluation, preparing for the evaluation, implementing the evaluation, to utilizing the outcomes. It addresses sticky issues like readiness and buy-in, and provides advice at every stage from developing terms of reference to disseminating findings.
Do you need a steering committee, management group, reference group, stakeholder group, or advisory group? The Toolkit lays out the considerations for making important decisions about the most appropriate governance structure for your situation. Overall, the Resource Pack on Joint Evaluations is a great resource for any organization looking to support decision-makers and leaders in structuring their governance, and provides tools such as checklists, examples and good practices to evaluation practitioners.
Check out this amazing resource: Resource Pack on Joint Evaluations
I just came back from a wonderful experience at the Canadian Evaluation Society's annual conference, Evaluation for the World We Want, CES/SCÉ 2015. I was happy that my presentation on Measuring Collaboration was well-received. It turns out that I'm not the only one struggling to figure out what collaboration really means and how to identify the appropriate tools for measuring it. I've been collecting a list of tools for measuring collaboration as I come across them. Please share your tools for measuring collaboration and I'll add them. The best thing about #EvalC2015 was how much everyone shared, because we're all in this together!
My favourite lessons from CES 2015:
1. Include legislation in the logic model to ensure it gets tracked (Nancy Carter and Robert Chatwin)
2. Government is responsible for evaluating the value of programs: whether they provided a public good, as intended, not just their dollar value (Mirianaud Oswald Agbadome)
3. Paradigm shifts happen. Evaluation can lead to systems change. "Coeur et rigueur" (heart and rigour) (Laure Waridel)