This is the third and final post in a series about design thinking in evaluation. I’ve found that designers and evaluators grapple with similar issues, so the goal of this series is to share insights from the world of design that may help you think differently about data collection and visualization and, hopefully, start a broader conversation about what the world of social sciences can learn from the world of design.
(Re-post from AEA365; see the original here.)
At CRC, I'm the word nerd, implementing our qualitative projects. Like many evaluators, I've had to translate academically honed skills to the often faster-paced world of evaluation. A recent project for a county health department's substance abuse initiative provides an example of how I tailor qualitative methods to meet clients' needs.
Allot ample time for clarifying goals. As with all good research, methods choices flow from the question at hand.
by Sheila Matano

https://www.youtube.com/watch?v=Hx7WLlJzrlw
Recently, CRC has been working with several clients who are evaluating initiatives to combat substance misuse and abuse. In particular, these agencies have been concerned with how their local communities have been impacted by the problem of drug overdoses and opioid-involved deaths, which have markedly increased in the United States over the past decade (CDC.gov).
More than six out of ten drug overdose deaths involve an opioid, and the Centers for Disease Control and Prevention estimates that 91 Americans die every day from an opioid overdose (Rudd, Seth, & Scholl, 2016).
by Mandi Singleton
Are you passionate about making a difference in the lives of youth? Many of my clients are, and the time, effort, and money they put into creating killer programs is proof enough that they are invested in forming positive and meaningful experiences for the young people that they work with.
BUT, how do program directors really know they are creating quality experiences for youth? How exactly is this measured?
Collaborating is hard; measuring collaboration doesn’t always have to be.
We’re currently working with a client to do just that, among other things. A system-wide change initiative, located in California, this client’s work is aimed at helping a large number of agencies and organizations work together to reduce domestic violence. A primary goal of the initiative is to improve the ways in which the various, diverse partners work together. Measuring this type of change can be a challenge for evaluators.
This is the second post in a series about design thinking in evaluation. The goal of this series is to share insights from the world of design that may help you think differently about how you work and, hopefully, start a conversation about what the world of social sciences can learn from the world of design. If you missed Part 1 about radical collaboration, check it out here.
This time around we're focusing on another key idea in the design thinking world: human values.
Over the past couple of blog entries, Taj has shared lessons learned about design thinking that you can apply to your work. Taj will be continuing that series soon, but in the meantime, we wanted to share a related example of creating a simple, but effective, visualization for a client.
A local agency wanted to track client capacity on a monthly basis. This agency oversees services to pregnant women across multiple program locations, so tracking such information is necessary not only for their oversight of services, but also for sound management of dollars received from their funder.
Let's face it, fundraising can be one of the most dreaded aspects of running a nonprofit. A lot of people feel unprepared and apprehensive about it; asking for money is hard. But there are ways to make it a little easier, and more effective.

That's where data come in.
Perhaps you think of data and fundraising as natural complements to each other. Perhaps you never considered using data in your fundraising efforts.
_(Pardon our silence over these past several months! After our unintentional hiatus, we're getting back into our blogging routine, sharing evaluation-related news, tips, and tricks on a roughly monthly basis. Starting with today's post, the first in a series of posts about design and evaluation…)_
Over the past few years, as CRC has explored and embraced visual thinking, information visualization, and the use of technology in evaluation, I've gotten a real-world education in design, technology, and design thinking.
by Mandi Singleton
(Note: this post is the second part of a two-part series.)
As I mentioned in my last blog post, one of my favorite things about my job at CRC is conducting focus groups. Focus groups with elementary school students can be the most challenging, and the most fun, for me as a facilitator. Here in part two of my discussion of tips & tricks for doing focus groups with kids, I get into strategies that make for effective and enjoyable groups.