This is the third and final post in a series about design thinking in evaluation. I’ve found that designers and evaluators grapple with similar issues, so the goal of this series is to share insights from the world of design that may help you think differently about data collection and visualization and, hopefully, start a broader conversation about what the world of social sciences can learn from the world of design. For more on this, see my earlier posts on radical collaboration and focusing on human values.
This time around we’re focusing on another key tenet in the design thinking world: crafting clarity. To craft clarity means to frame things for understanding, reduce jargon, and make your work accessible to as broad an audience as possible.
One of the things that we do at CRC is aim to create order out of chaos. Life is messy. People are messy. Filling out forms with pen, paper, and a clipboard is messy. Spreadsheets are messy. We don’t pretend that isn’t true. But we try to frame things in a way that makes them easier for our clients to understand.
Sometimes that means sorting through what feels like reams of indicators, helping a client figure out which ones are the most important, which ones ask the same thing twice, and which are left over from a previous funder’s request and don’t need to be collected anymore. Adding structure, for example by way of a carefully considered indicator grid, can also add clarity. Beyond cleaning up unused data (and often turning 20 spreadsheets into 2), structure improves understanding of which measures are needed for which purposes and which funders.
We routinely use visual tools to help organize and clarify thoughts and program models. We love logic models for this very reason! A common tool in good evaluation design, they can organize all parts of even a complex program into a simple, one-page visual representation that even program outsiders can understand. Logic models group elements according to whether they are activities or outcomes and whether they are process or outcome measures, and they also point to who will be responsible for collecting the data and when it will be available. We like indicator grids and logic models not because they make the work itself simple, but because they frame the work, and the evaluation around it, in ways that increase understanding.
Another major way that we craft clarity is through our approach to report writing. When we write reports we keep them as simple and short as they can be, but (with a nod to Einstein) no simpler. We try to use two words when we know they will do the work of ten. We eschew obfuscation. Using conversational language and visuals, we aim to create deliverables that make meaningful sense out of things instead of making more confusion.
Data visualization has become an integral part of our work for the same reasons. Visualizing information makes it easier to understand, whether it enhances or even stands in for text. The brain can make sense of a program’s impact in a moment, rather than having to slog through a lengthy report full of academic jargon.
Life is complicated. We don’t deny that. But evaluators (along with our clients) often make it more complicated than it has to be. The whole point of collecting and analyzing data is to increase understanding, not the opposite. By crafting clarity, we make sure we’re doing everything we can to make things clear and useful.