AEA 2014 Recap
by Mandolin Singleton
Last month I attended the 2014 American Evaluation Association conference in Denver, CO. 2014’s conference theme was “Visionary Evaluation for a Sustainable, Equitable Future.” The event brought together research and evaluation professionals from all over the globe and from a variety of disciplines (e.g., community psychology, health and human services, PreK-12 educational evaluation). Attendees were encouraged to explore ways in which evaluation could be used to support sustainability and equality across disciplines and sectors.
This year’s conference was especially exciting (as well as nerve-wracking) for me because I was attending as a first-time conference presenter. I went to numerous sessions, learned a lot, and had a great time connecting with other evaluators. (I even found a little bit of time to explore Denver’s spectacular shopping scene.) Below are some of my highlights from the conference.
- Robert Kahle: Dominators, Cynics, and Wallflowers: Practical Strategies for Moderating Meaningful Focus Groups
Robert Kahle is a sociologist and expert in qualitative research with extensive experience leading skill-building sessions for both new and experienced focus group moderators. In this workshop, he talked about how to effectively manage focus group dynamics, identified problem behaviors typically observed in groups, and reviewed strategies to recognize, prevent, and address them. I found this workshop especially informative and will be applying some of these techniques in my upcoming focus groups.
If you’re interested in learning more about what was reviewed in this session, much of the content can be found in Robert’s book, “Dominators, Cynics, and Wallflowers: Practical Strategies for Moderating Meaningful Focus Groups”.
- Veena Pankaj: Data Placemats: A DataViz Technique to Improve Stakeholder Understanding of Evaluation Results
Veena Pankaj has experience directing evaluation design and implementation with a focus on participatory approaches (you can check out one of her recent publications on participatory analysis here). Pankaj described a data visualization technique (Data Placemats) she uses to engage stakeholders, improve their understanding of evaluation results, and solicit their interpretations. She talked about the logistics of the technique (e.g., the what, when, and how) and reviewed the learning journey involved in creating the placemats.
If you’re interested in Veena’s work, you can find slides and resources from the session on SlideShare.
Deciding to one-up myself by giving not just one but two presentations my first go-around didn’t really help my nerves, but what can I say? I was enthralled by this year’s conference theme (you could probably also say there was a little part of me trying to impress the boss as well).
I gave a poster presentation on an infographic that we (CRC) created for Elev8 Baltimore in an effort to visually display evaluation findings. The poster reviewed the process, results, and implications of translating data findings into a reader-friendly infographic.
My poster showed that evaluation findings can be successfully translated into attractive, functional, and informative infographics. It suggested using infographics to report evaluation findings, as effective data visualization can attract readers, aid in the interpretation of data, and support comprehension. Perhaps most importantly, it suggested that infographics can be used to promote the use of evaluation findings to inform decision making!
I also gave a paper presentation on our (CRC’s) establishment of an early warning system at the Elev8 Baltimore sites and reviewed the application of early warning indicators to the middle grades. Within the presentation I gave a brief description of the system and reviewed the steps we took to create it, including everything from gaining access to the data to producing the final reports.
In this presentation I described how early warning indicators can enhance the accessibility and use of evaluation data: because sites receive reports on a quarterly basis, the early warning indicators can be used in near real time to inform programmatic decisions. I also discussed implications for expanding the use of early warning indicators from high school to middle school populations.
Between the multitude of great sessions I attended, learning loads of information, giving two presentations, and still finding time to scope out the shopping scene in Denver, you could say I had a whirlwind of a time at AEA 2014. For more pictures from AEA 2014, visit AEA’s Facebook page.