Inciter | 2013 September

Evaluation communication and audience considerations

by Dana Ansari

During the brief time I’ve worked as an evaluator, I’ve had to report on different types of information in different formats and structures for various projects. Given that I’ve had limited-to-no access to the actual databases for these projects, I typically have had to request the information I need from the project staff members who do have access. As a result, the data I receive is mostly in raw format: lengthy and complicated to understand at a glance. With time and follow-up questioning, however, I’ve been able to extract what I need from within the numbers. Of course, the work is still far from over. I then have had to filter, sort, organize, and restructure the data so that my audience can understand it without being confused or bored...


Theory-driven & process evaluation: The art of getting inside and beyond the “black box”

by Jill Scheibler

Before working in program evaluation, I received education and training as a clinician, specifically as an art therapist. My work as an art therapist was grounded in a personal belief in, and more importantly empirical observations of, the mental-health-promoting effects of making art. Through that work I became curious about how to demonstrate art’s impacts to the general public (including dubious funders and policymakers), and I found a lack of relevant research to back up what I’d seen in practice. At the same time, I talked to numerous art therapists and community artist-practitioners who were doing good work with vulnerable populations in my city of Baltimore, all around the U.S., and throughout the world. The individuals I talked to all voiced their need to “prove” the value of their...
