The Five Ws + an H of Program Evaluation

By Erin J. Hoppe

Whether you work at an organization with dozens of employees or just one, evaluation is essential to accountability, transparency, and continuous improvement. I come from the latter of those two situations, but will always prioritize evaluation as a way to measure our success and work smarter. With limited funds but unlimited demands, it is essential to take a critical look at our work. Here are a few tips on how to make evaluation work, no matter the organizational circumstances.

  1. Why – Start here. Aside from the reasons previously noted, you need to identify the specific reasons why evaluation is important for your program. Are you trying to understand impact, increase efficiency and effectiveness, or demonstrate value to stakeholders? Having this answer will help you make a plan and address the rest of the evaluation process.
  2. What – This matters a lot because it shapes your research question and strategy. Clarify what you want to learn from this process and what you will do with the information. This is more specific than the “why.” What you want to know will determine what data you collect, from whom, and how—do you need a pre/post test or interviews?
  3. Who – No one is an island and no evaluation has ever been conducted by a single person. Someone is providing the data you are collecting. Someone is analyzing the information. Someone is expecting a report on the results. Build a team to help you get through the process and always over-thank participants for the extra work you are asking of them.
  4. Where – Is this evaluation taking place in your building or schools across the state? The lines of communication between administrators and participants should be wide open and responsive. Think about providing the evaluation in multiple formats and make sure there is a clear path from data collection to analysis to reporting.
  5. When – Evaluations can be a short survey after an event or span several years. Either way, I make the same recommendation for evaluation as I do for accessibility: it should be a line item during planning meetings and in the budget. This doesn’t have to mean spending more than you can afford, but it does demonstrate value.
  6. How – Large institutions might have a team with “evaluation” in their job description and funds to make it happen. Others need to find funders and outside experts. Either way, with a clear “why” and “what” the work will happen.

The best advice I can offer in program evaluation is to be thoughtful, flexible, and tenacious. Whatever the scope of your project, the results should inform your practices (even if they aren’t what you expected), and just might move the field forward so we all learn something new. I look forward to reading your findings.


Erin Hoppe's headshot

VSA Ohio Executive Director Erin Hoppe

Erin J. Hoppe is approaching her ten-year anniversary as executive director of VSA Ohio. Her background in evaluation includes work at VSAO, The Ohio State University, American Institutes for Research, and the Smithsonian Institution. She is a board member for the Columbus Arts Marketing Association, Ohio Citizens for the Arts, and ADA Ohio. If you can’t find her in the office, she is probably working on a home improvement project or bird watching.


Five Tips for Using The Arts and Special Education: A Map for Research

By Jenna Gabriel, Ed.M. and Don Glass, Ph.D.

The Kennedy Center’s recent publication, The Arts and Special Education: A Map for Research, is meant to be a living document that sparks conversation and incites action to support a shared, ambitious agenda around growing the field of the arts and special education. Thus, this resource should not be considered a “be-all-end-all” answer to large questions about the impact of arts learning on students with disabilities. Rather, it proposes priority areas for the field to focus new research, offers suggestions around the field’s responsibility to rigorous research design and methodologies, and charts a course of milestones by which we can measure our shared progress toward these goals.

Whether you are an individual researcher, a program evaluator, a funder, or a practitioner; whether you are working alone, as a program consultant, or with a large institution, there is a place for your work in this map. To get the most out of this resource, we recommend you consider the following:

1. This is not a prescriptive plan. This research map is far from comprehensive, and is not meant to be an exhaustive list of research questions to pursue or literature to reference. Rather, it is meant to be a jumping-off point. The three priority areas proposed might help frame your work or provide direction for a next step. The research questions offered could be investigated as written, and might also inspire other rich questions. The milestones are benchmarks by which the field might measure progress.

2. Contextualize your work to identify where you can contribute to this agenda. Just as this map is not exhaustive, it is impossible to think any one researcher or organization could take on all of these action steps. Don’t think about how you can pursue everything in the map. Rather, think about where your work fits: Do you work with an organization that offers innovative programs for students? Perhaps you could consider how your work contributes to a body of literature in Priority Area 2: Instructional Design and Innovation. Do you work at a large school district or state-level department of education? Perhaps you can look at large data sets that help us understand how students with disabilities participate in arts education in your area (Priority Area 1: Access and Equity). This work will grow and succeed when our efforts are aligned, not when all of us try to do a little bit of everything in isolation.

3. Look to other, more established fields as exemplars. The arts and special education is a young field, and the representative body of literature is still growing. However, this field draws on larger, established fields like arts education, special education, human development, curriculum and instruction, improvement sciences, developmental psychology, disability studies, and more. These fields have rich bodies of research literature that can offer theoretical foundations for our work as well as useful models of rigorous research methodologies.

4. Connect your work to practice and to policy. Remember—research can’t happen in a vacuum. Across the field, we work with real students in real classrooms; research should inform practice, and experiences in our classrooms should drive the next research questions we pursue. Findings should further inform policy decisions, so consider how your work (whether as a researcher or in the classroom/community) can influence systems-level ideas.

5. Be comfortable with discomfort. Close, rigorous examination of instructional practices might not show us what we want to see. Sometimes, teaching strategies we feel intuitively should work might not prove to be statistically effective. While that can be disappointing, it’s important to remember that that information helps us to better understand what does work and why—thereby improving instruction for the students we support.


Jenna Gabriel, Ed.M., is Manager of Special Education at the Kennedy Center. Don Glass, Ph.D., is Research Manager at the Kennedy Center. They are both editors of The Arts and Special Education: A Map for Research.

Growing a Field of Study in the Arts and Special Education

Image of a boy leaning over a paper and holding a pen; text: The Arts and Special Education: A Map for Research

When the Kennedy Center’s Office of VSA and Accessibility convened thought leaders from the arts education and special education communities at a forum in 2012, attendees identified a need to name and grow the relatively new field of arts and special education. This conversation has continued at the annual VSA Intersections: Arts and Special Education conferences, with participants stating the need both to quantify the work of the field and to get a handle on the literature and data that already exist.

In 2016, with these needs in mind, the Kennedy Center again brought together field leaders and asked them to envision how an action plan for research in the arts and special education might look. Seeking to create a series of guideposts for scholars, researchers, and practitioners, the group created The Arts and Special Education: A Map for Research, a new publication from the Kennedy Center.

According to Jenna Gabriel, Manager of Special Education at the Kennedy Center and co-editor of A Map for Research, three areas of focus emerged from those conversations in 2016: access and equity; instructional design and innovation; and effectiveness, efficacy, and scale-up. These topics became the three priority areas for the research map.

Image: a man helps a boy who is holding a pencil; text: "There is a need to develop and test new research methodologies which are more compatible with inquiry in arts education and special education."

Gabriel says that while the group initially set out to create a five-year strategic action plan for research in arts and special education, they realized during their conversations “…that what we needed was more of a call to action, which the map provides.” Dr. Don Glass, Research Manager at the Kennedy Center and co-editor of A Map for Research, adds, “Some of the types of research we are advocating will take time to get going, so what we are saying here is the direction we want to move and the time frame is more flexible than just five years.”

Some of the goals articulated in the map are long-term and ambitious, especially those in Priority Area 3: Effectiveness, Efficacy, and Scale-Up. This section challenges researchers to study best practices on a larger scale, across sites and contexts. But Gabriel and Glass say members of the VSA network can incorporate every priority area into their evaluation designs through thoughtful planning and consideration of the research questions.

Glass notes that for organizations just beginning their research and evaluation efforts, the map can offer some starting points. “Research can be very overwhelming, and the map is providing some focal points and guide beams. For instance, in Priority Area 1: Access and Equity, we want to make sure we are counting students with disabilities in some way and not forgetting to do so. Collect data to see who you are serving; that’s a great place to start,” offers Glass.

For those already conducting research, Gabriel says the map can support what is being done. “When an organization is assessing their program, there is a way to situate that evaluation in support of the map’s priorities. If you are implementing instructional design or innovation and have studied its success as we discuss in Priority Area 2, start thinking about how it can be scaled up and ask the bigger questions addressed in Priority Area 3.”

Image: a girl with braids in her hair looks at an art lesson; text: "There is a need to explore the development of more targeted research questions that focus on the arts and learning for all students, including those with disabilities."

Both Glass and Gabriel emphasize the importance of not just studying program outcomes, but also how a program works for a group of students, as discussed in Priority Area 2. “We want to see how diverse learners are supported by different kinds of instructional strategies,” says Glass, adding that many current program evaluations share information about the impact of a program, but not why or how it works for different groups or what adaptations were needed.

Gabriel notes that to conduct the rigorous and meaningful research prescribed in the map, the field has to be willing to look at results that are not pleasing to us. “We need to hold ourselves accountable to the standards of fields with which we want to be associated,” she explains. Glass adds, “When we see evaluation reports, what often gets featured are best case examples. They are informative, but it is equally helpful to learn how an approach that is successful for one child may present a barrier to another.”

Looking to the future, Gabriel and Glass urge researchers who are doing work in arts and special education to get their work out there so others can learn from it. They note that the VSA professional papers series is a great way to share information about practice and research, and hope to see more articles on arts education for students with disabilities in special education journals in the future.

Glass and Gabriel say the Kennedy Center is eager to contribute to the map, but emphasize that it is not a checklist they can accomplish alone. “The map is not a Kennedy Center plan, but a call to action for a broad field. We hope others want to join our effort,” explains Gabriel. Glass adds, “It is reflective of our internal thinking, but also an invitation to collaborate and help us advance the arts and special education.”

The Musical Theater Project Demonstrates the Value of Building Evaluation into Programs from Day One

One girl and two boys growl like tigers while wearing smock-style costumes.

Students participate in a Kids Love Musicals! residency. Photo credit: Heather Meeker

When leaders at the Musical Theater Project in Northeast Ohio decided they wanted to expand their Kids Love Musicals! residency program to serve students with disabilities, they were deliberate in their planning. They sought out resources and expertise from peer arts organizations already working with students with disabilities, and they attended professional development sessions on arts and special education topics. As they laid out their expansion plan, they identified program assessment as a priority and sought to include comprehensive evaluation strategies as a part of the new residencies.

With this in mind, Heather Meeker, Executive Director of the Musical Theater Project (TMTP), connected with leaders at the Schubert Center for Child Studies at Case Western Reserve University (CWRU), located nearby in Cleveland, Ohio. “CWRU is interested in being deeply involved in their community, so developing a mutually beneficial research project was of great interest to them,” says Meeker.

The Schubert Center introduced Meeker to psychology professor Sandra Russ and doctoral student Olena Zyga, who agreed to work with TMTP to assess the new residencies. TMTP agreed to support the academics’ work by raising money to pay for student researchers and faculty time, and Meeker says funders have been especially interested in supporting this collaborative assessment.

The Kids Love Musicals! residencies for children with disabilities aim to teach social skills and emotional understanding through the stories and characters from classic American musicals such as The Wizard of Oz, The Jungle Book, and You’re a Good Man, Charlie Brown. The multi-year evaluation project with the Schubert Center seeks to better understand if engaging in the residency program impacted participants’ socioemotional skills, including the ability to make eye contact, engage with others, take turns appropriately, and demonstrate emotional understanding. A secondary goal is to understand whether gains seen during the residency program extend to other environments.

Russ and Zyga created a custom measurement scale for the program, drawing on their expertise in the fields of psychology and play. TMTP initiated their new residencies for students with disabilities, collecting multiple forms of data throughout. Residency sessions were videotaped across multiple school sites, chosen to include a range of student ages and ability levels; the videos were then coded and scored according to the measurement scale. Teachers were also asked to report on the same variables being coded in each session for every student, both before the residency program began and after it had finished.
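As a rough illustration of the kind of paired pre/post comparison such teacher reports make possible, here is a minimal Python sketch. The ratings, scale, and variable name are entirely hypothetical, not TMTP's actual instrument or data; a paired design simply means each student's post-residency rating is compared to that same student's baseline.

```python
from statistics import mean, stdev

# Hypothetical pre- and post-residency teacher ratings (1-5 scale)
# for one coded variable, e.g. "turn taking"; one entry per student.
pre = [2, 3, 2, 1, 3, 2, 2, 3]
post = [3, 4, 3, 2, 4, 3, 2, 4]

# Paired differences: each student serves as their own baseline.
diffs = [b - a for a, b in zip(pre, post)]

mean_gain = mean(diffs)
# Cohen's d for paired samples: mean difference / SD of differences.
effect_size = mean_gain / stdev(diffs)

print(f"mean gain: {mean_gain:.2f}, effect size d: {effect_size:.2f}")
```

In practice, a study like the one described above would also report a significance test on the paired differences and reliability statistics for the video coders, which this toy sketch omits.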

Analysis of the first round of data, which specifically focused on The Wizard of Oz residency, suggests that students who participated in the Kids Love Musicals! program did make gains in eye contact, turn taking, engagement, and symbolic flexibility. These results were recently published in the Journal of Intellectual Disabilities. Meeker is thrilled that their collaboration with the Schubert Center led to the research being shared broadly, both through journal publication and in various conference presentations by her and Zyga.

Four children stand in front of two adults, all wearing curly gold ribbon on their heads and making roaring faces.

Teaching artists work with students in the Kids Love Musicals! residencies. Photo credit: Heather Meeker

The research collaboration between TMTP and the Schubert Center continues post-publication, including a new round of data collection focused on identifying whether similar gains are seen across the curricula presented to students. Specifically, they are asking if children made the same gains while learning The Jungle Book as they did while learning The Wizard of Oz. Analysis of this data is currently underway, with initial results suggesting that curriculum differences do not significantly impact student outcomes. A final phase of data collection, completed at the end of the 2016-2017 academic year, focused on comparing the active residency period with a pre-residency control period.

Given the success of their collaboration with the Schubert Center, Meeker encourages organizations interested in conducting robust program evaluations to consider partnering with a college or university in their own community. “If a project can be designed with the idea that both the organization and university students can benefit from it, a collaboration can really be a win-win situation,” she says.

Of course, Meeker also warns of the hard work and complications that come with conducting a large-scale assessment. She explains, “We had to make peace with the fact that we would not get 100% compliance from teachers in our data collection efforts, and that not all of the data we worked so hard to collect would ultimately be used in the study. We also did not anticipate the delays that sometimes come with working with a university, like waiting for institutional review board approvals for everything from project proposals to parent permission forms.”

But the reward for that hard work is great, Meeker says, as their research has clarified so much for TMTP about the program internally. She concludes, “If you are constantly looking to improve your work, then thorough evaluation is crucial. This project has empowered us to do even more with our programming.”

Intersections Preview: An Interview with Presenter Erin Hoppe

A woman stands in front of a group of children.

A VSA Ohio teaching artist works with students in an AIA residency.

As we prepare for the 2016 VSA Intersections: Arts and Special Education Conference, we are highlighting presenters and sharing information about their sessions. This month, we feature Erin Hoppe, who is co-presenting a session entitled, “Making the Intangible Tangible: Quantifying the Impact of an Arts Residency Program” with Sheri Chaney Jones on Monday, August 1, 2016.


Erin Hoppe, executive director of VSA Ohio, has long believed her organization’s Adaptation, Integration, and the Arts program (AIA) makes a positive impact on student success. The teaching artist residency program, now in its 14th year, has collected qualitative data from participating teachers and artists that suggests as much.


Erin Hoppe's headshot

VSA Ohio Executive Director Erin Hoppe

After continuing to receive the same feedback year after year from program participants, Hoppe, a self-described “research nerd,” decided they needed a quantitative way to assess student impact. She contracted Measurement Resources Company’s Sheri Chaney Jones to conduct an independent evaluation of how the AIA program is impacting students and teachers, and how it can improve. The Ohio Department of Education and the Kennedy Center invested dollars into the research, adding legitimacy and priority to the efforts.


The evaluation project is currently concluding the second of its three years. Hoppe says she and Jones agreed that they would need multiple years of student data to properly assess program impact. To make the data collection process as easy as possible, the researchers are analyzing existing, anonymized student test scores rather than creating a new test for teachers and students to complete.


According to Hoppe, the data collection process has proven challenging. “Schools are so busy already,” she says, continuing, “…and some sites are more conducive to our research parameters.” Still, Hoppe says the data collected during the evaluation’s first year showed some very positive results, and their experiences in year one helped them figure out ways to expand their data pool for year two.


One way they have increased their data collection success is by adding control schools to the study. These schools have been identified by AIA schools as peers, but do not currently participate in the residency program. The peer school data will help researchers see if there is a discernible difference between student achievement at the sites with arts residencies versus those without residencies.
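To illustrate the basic idea behind a peer-school comparison like this, here is a minimal difference-of-means sketch. All scores below are invented for the example; a real analysis of the AIA data would also test statistical significance and control for baseline differences between schools, since a raw gap in means is not by itself evidence of program impact.

```python
from statistics import mean

# Hypothetical test scores: schools with the arts residency vs.
# peer ("control") schools without it. Real data would be the
# existing anonymized student scores described above.
residency_scores = [74, 81, 69, 88, 77, 72, 85]
control_scores = [70, 75, 68, 79, 73, 71, 76]

# A first-pass comparison: the gap between group means.
gap = mean(residency_scores) - mean(control_scores)

print(f"residency mean: {mean(residency_scores):.1f}, "
      f"control mean: {mean(control_scores):.1f}, gap: {gap:.1f}")
```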


Ultimately, Hoppe hopes the evaluation results, whatever they may be, help improve the AIA program. She is interested in using VSA Ohio’s resources where student impact is greatest, which may mean limiting AIA classrooms to certain grade levels or subject areas based on the quantitative findings, or adjusting the residencies in some other way. “I’m interested in knowing if we are actually doing what we say we are doing, like teaching 21st century career skills or improving teacher preparedness for working with students with disabilities,” says Hoppe, continuing, “…and quantitative analysis is an essential part of answering that question.”


Erin Hoppe and Sheri Chaney Jones will present “Making the Intangible Tangible: Quantifying the Impact of an Arts Residency Program” at the 2016 VSA Intersections: Arts and Special Education conference on Monday, August 1, 2016.

Striking a Balance with Internal and External Program Evaluation

ArtsConnection teaching artist and P94M teachers

ArtsConnection teaching artist and P94M teachers collaborating in STAARS, a sequential musical theater program for children with special needs.

At ArtsConnection in New York City, evaluation is more than just a buzzword used in funding applications; it is an essential part of the arts education organization’s programs. Deputy Director for Education Carol Morgan says that ArtsConnection evaluates its programs in two ways: through working with external, independent evaluators and through practitioner research facilitated by ArtsConnection staff. This two-pronged approach has provided them with a wealth of valuable information over the years, and their work continues with an ongoing study of the Spectrum Musical Theater school residency program for students on the autism spectrum.

Morgan says program development at ArtsConnection begins with a needs assessment process, when her staff meets with teachers to discuss their artistic and educational goals for their students. “We start to build a shared language and shared concern,” explains Morgan, “…and that in itself is an inquiry process. Planning and reflection are basic level inquiry.”

ArtsConnection’s Spectrum Musical Theater program grew out of such an inquiry process with Public School P94M, also called the S.P.E.C.T.R.U.M. School. This school in New York City’s District 75 has eight sites in lower and midtown Manhattan serving 300 children from kindergarten through high school. Students at P94M have autism or other disabilities, including emotional and behavioral disturbances and intellectual disabilities. ArtsConnection and the P94M principal discussed how they could best collaborate to help students achieve their goals.

The Spectrum Musical Theater teaching artist residency program is a three-year pilot project at P94M. Program Manager Emily Lukens says the first year was about experimenting to create a program that focused on social and emotional literacy through musical theater. The second year was about fine-tuning, and now in the third year, they are focused on thoroughly evaluating the program and collecting relevant data.

Lukens says the program evaluation has two main components: an external evaluator conducting research specifically on how movement lessons impact students, and internal assessments facilitated by ArtsConnection staff and completed by the teachers and teaching artists. The internal assessment includes frequent “reflection sessions” between teachers, teaching artists, and program staff. According to Lukens, “These reflection sessions inform the next session and allow for discussion of both individual students and the group as a whole.”

As another part of the internal Spectrum Musical Theater evaluation, ArtsConnection staff created a custom student assessment form by merging one of their existing arts education forms with an assessment tool P94M uses to measure student growth. The new form assesses observable behaviors in an arts classroom as well as any growth in the student’s social and emotional skills.

The new Spectrum Musical Theater assessment tool is being completed for five students from each residency class, four times each throughout the residency. Lukens says they selected students on a variety of levels for the evaluation, including “…some higher functioning students, some lower functioning, children who are verbal and non-verbal, and students who have expressed interest in the arts and some who haven’t at all. Having this large range helps train our teaching artists and school teachers on how to really look at what students are doing, what their reactions are to certain music, movement, or directions, and how this work may be impacting their choices and lives.”

In addition to the student evaluations, teachers and teaching artists also provide assessments of the program itself. The educators enter all their observations into a spreadsheet to provide a big picture analysis at the program’s conclusion.
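As a small illustration of how observations collected in a shared spreadsheet can feed a big-picture summary at a program's conclusion, here is a Python sketch that groups session ratings by student. The column names, rating values, and CSV layout are hypothetical, not ArtsConnection's actual form.

```python
import csv
from collections import defaultdict
from io import StringIO

# Hypothetical spreadsheet export: one row per observation,
# one observation per session per student.
rows = StringIO(
    "student,session,engagement\n"
    "A,1,2\nA,2,3\nA,3,4\n"
    "B,1,1\nB,2,2\nB,3,2\n"
)

# Group each student's ratings in session order.
scores = defaultdict(list)
for row in csv.DictReader(rows):
    scores[row["student"]].append(int(row["engagement"]))

# Big-picture summary: first vs. last observed rating per student.
for student, ratings in scores.items():
    print(student, ratings[0], "->", ratings[-1])
```

The same grouping pattern extends naturally to multiple observed behaviors or to per-class averages, which is the kind of end-of-program rollup the educators' spreadsheet is meant to support.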

Lukens says that ArtsConnection hopes to replicate Spectrum Musical Theater in another multi-site school, which is why incorporating data and assessment into the program is so critical. But Morgan adds that program evaluation data is valuable for their organization in so many ways. For instance, their Board of Directors has found previous evaluation reports helped them better articulate and contextualize ArtsConnection’s work.

Morgan urges other arts education leaders to not just conduct evaluation studies, but to also be consumers of each other’s research. “It is so important for the field to think not only about what we do but how we do it and why we do it, and put it in a broader context,” she continues, “What’s underneath those promising practices? … [R]eading and discussing other work, research, and thought leaders can greatly improve our own work.”

Seven Tips to Guide Your Program Evaluation

By Rob Horowitz

Organizations are under increasing pressure to provide data demonstrating program accomplishments, but many lack the capacity to meet these expectations. If you are trying to do this yourself, here are some tips to help you think through evaluation issues.

  1. Understand the “audience” for your evaluation and their expectations. Why are you doing the evaluation? Who will look at and act upon the data? Who are the key constituencies—such as boards, funders, staff, or school principals—that will be most interested in evaluation results? Solicit the views of this “audience” to ensure that the evaluation meets their needs.
  2. Decide on your evaluation questions. What do you want to learn? Be the driver of your evaluation. Don’t do an evaluation simply to satisfy the needs of others, such as funding agencies or a development office. Ask questions about your program and use the results to improve it. Develop a culture of inquiry in your organization.
  3. Be consistent in your evaluation design. Your evaluation plan should strive for consistency across these phases: focus, data collection, analysis, and reporting. That is, your evaluation questions and audience should drive your selection of assessment tools, research design, and analysis, and your analysis should attempt to answer your questions. Keep it focused and consistent. Evaluate what your program does, not what others hope it does.
  4. But…be flexible and open to unexpected findings. Conditions in schools or other organizations may change, and you may not be able to adhere to your original plan. Work with your partners and adjust plans as needed. Don’t let the “tail wag the dog,” forcing a program to stick to a plan purely for the evaluation’s sake and to the detriment of students or teachers.
  5. Build trust with your data sources. Bring teachers and others into the process and let them know that you value their objective views and will use them to improve the program.
  6. “Not everything that can be counted counts, and not everything that counts can be counted.” I didn’t coin that handy aphorism, but it is a tip to guide us all. Make sure to include qualitative approaches to data collection, and learn how to analyze and present information that is not in the form of a number. This is especially important when studying special education populations. You don’t want to miss documenting individual children’s accomplishments – with all of their wonderful diversity – because you’re only presenting aggregated statistical data.
  7. Get help when you need it. Consider whether you need an independent evaluator. Do you have the capacity and expertise to do it on your own? Do you have a budget? Keep within your means. Reach out to others when you need them.

Rob Horowitz's photo

Rob Horowitz is a consultant to arts organizations, school districts, and foundations. He has conducted over 100 program evaluations for organizations such as the Kennedy Center, the National Endowment for the Arts, and Jazz at Lincoln Center, and has conducted basic research on the impact of arts learning on cognitive and social development. He was a contributor to Champions of Change: The Impact of the Arts on Learning, Critical Links, and NEA Jazz in the Schools.