Seven Tips to Guide Your Program Evaluation

By Rob Horowitz

Organizations are under increasing pressure to provide data demonstrating program accomplishments, but they typically don’t have the capacity to meet these expectations. If you are taking on evaluation yourself, here are some “tips” to help you think through the key issues.

  1. Understand the “audience” for your evaluation and their expectations. Why are you doing the evaluation? Who will look at and act upon the data? Who are the key constituencies—such as Boards, funders, staff, or school principals—that will be most interested in evaluation results? Solicit the views of this “audience” to ensure that the evaluation meets their needs.
  2. Decide on your evaluation questions. What do you want to learn? Be the driver of your evaluation. Don’t do an evaluation simply to satisfy the needs of others, such as funding agencies or a development office. Ask questions about your program and use the results to improve it. Develop a culture of inquiry in your organization.
  3. Be consistent in your evaluation design. Your evaluation plan should strive for consistency among these phases: focus, data collection, analysis, and reporting. That is, your evaluation questions and audience should drive your selection of assessment tools, research design, and analysis. Your analysis should attempt to answer your questions. Keep it focused and consistent. Evaluate what your program does, not what others hope it does.
  4. But…be flexible and open to unexpected findings. Yes, conditions in schools or other organizations may change, and you may not be able to adhere to your plan. Work with your partners and adjust plans as needed. Don’t let the “tail wag the dog” in evaluation by compelling a program to stick to a plan solely for the sake of the evaluation, to the detriment of students or teachers.
  5. Build trust with your data sources. Bring teachers and others into the process and let them know that you value their objective views and will use them to improve the program.
  6. “Not everything that can be counted counts, and not everything that counts can be counted.” I didn’t coin that handy aphorism, but it is a tip to guide us all. Make sure to include qualitative approaches to data collection and learn how to analyze and present information that is not in the form of a number. This is especially important when studying special education populations. You don’t want to miss out on documenting individual children’s accomplishments – with all of their wonderful diversity – because you’re only presenting aggregated statistical data.
  7. Get help when you need it. Consider whether you need an independent evaluator. Do you have the capacity and expertise to do it on your own? Do you have a budget? Keep within your means. Reach out to others when you need them.

Rob Horowitz is a consultant to arts organizations, school districts, and foundations. He has conducted over 100 program evaluations for organizations such as the Kennedy Center, the National Endowment for the Arts, and Jazz at Lincoln Center, and has conducted basic research on the impact of arts learning on cognitive and social development. He was a contributor to Champions of Change: The Impact of the Arts on Learning, Critical Links, and NEA Jazz in the Schools.
