This organization adopted evaluation (and the world didn’t fall apart)

Kris Cummings knows the feeling: the way the very word “evaluation” seems to knot the stomach.

But evaluation is like the first cold swim of cottage season: it gets better after you take a deep breath and plunge in.

“When evaluation is interpreted to mean, ‘Prove you’re doing something perfectly,’ it gets twisted into an exercise in accountability,” says Cummings, who carries two titles for the United Way of Cambridge and North Dumfries: chief operating officer and director of community and voluntary sector investment.

“Evaluation at its best is about an ongoing cycle of learning. You see an issue you want to address, you respond to that issue, you implement it, you pay attention to whether it’s working the way you thought it would, and the difference you’re making . . . You keep track of the story while it’s happening and you reflect: what worked, what didn’t and what would we change the next time.”


More home-grown expertise

The Cambridge United Way embraced evaluation about 10 years ago, when it shifted from funding organizations to investing in community programs so it could focus its planning and evaluation on specific issues and programs. Back then, the United Way looked south, to experts at United Way of America, to augment what local Canadian consultants had to say on the topic.

Things have changed. Evaluation — the “storytelling” about the impact projects have on communities, and what groups learn from those projects — plays a key role in the funding strategy at the Ontario Trillium Foundation. United Way Centraide Canada collects and compiles the evaluation results of more than 100 United Ways and Centraides across Canada each year. Evaluation has gone mainstream.

Capacity Canada, a Waterloo, Ont.-based organization that provides educational services to the charitable non-profit sector, offers evaluation training and coaching through its new EvalU program.

EvalU will hold its first boot camp in Cambridge Nov. 24-25, followed by one in Kingston, Jan. 26-27. The boot camps — there will be more to come over the next few years — are part of Capacity’s commitment to the Ontario Ministry of Citizenship, Immigration and International Trade, which wants to raise the level of evaluation knowledge in the province.

The United Way of Cambridge and North Dumfries works on a three-year funding cycle. For funded programs, the evaluation-driven application and reporting process identifies the need, lays out a plan to deal with it and then provides regular updates.

Stories count, too

Evaluation does generate numbers, Cummings says, and those measurements across the United Way’s 20 partner agencies running 40 programs help identify trends in the well-being of the community.

But just as important, he adds, are anecdotes and observations in the reports the organizations file.

“They put the human face on the metrics,” he says. “You can say 80 per cent of youth in 10 programs showed a significant increase in leadership skills. And that’s great, but you still want to be able to tell the story of one of those youths to put a face on what that means: What does it do in terms of summer job aspirations? Where they’re going to go to school? All of that kind of stuff.”

With those stories and numbers, Cummings says, evaluation provides a much clearer picture of the impact programs are having. The stories add context.

For organizations thinking about taking on evaluation, Cummings suggests:

  • Viewing evaluation as innovation, rather than research and administration;
  • Starting small, with a manageable project, and building on that until evaluation becomes part of the working culture;
  • Customizing. No single approach to evaluation will work for all organizations;
  • Letting go. Evaluation tends to reveal some hard truths about much-loved programs. Organizations need to be open to change.

“That’s one of the core reasons you go into evaluation — to find out if you’re really making a difference,” Cummings says. “And that can be scary, depending on what you uncover.

“Maybe a service you thought was working great isn’t as on-target as you had expected. But it’s all in the response: If you stick your head in the sand, that’s not a great response. If you take it as a learning opportunity, that’s what it’s meant to do.”