Getting Started with Program Evaluation

This outline has been developed to assist staff at American Spaces with planning and conducting evaluations of their programs.

Start

Hold at least one planning session with staff and volunteers who will be responsible for conducting the program evaluation. Below are suggested questions to consider and discuss during the planning session.

Why

What is the purpose/goal of your evaluation, and what do you hope to learn from it?

Sample goals:

  • To see whether the American Space should run this program again
  • To improve the program
  • To see what the program accomplished for a group or for individual participants
  • To provide feedback to the Embassy on a sponsored program

Who

What is a good number of participants (sample size) for what you hope to learn?

When

When and where are the best times/places to conduct the evaluation?

  • Just after a program?
  • The day after the program?
  • Before and after the program?
  • At a separate location?
  • In the program space?

How

How will feedback be collected?

  • How should participants submit their feedback?
  • Will it be anonymous?
  • Will you ask for additional information (gender, age, education)?
  • How will you encourage participation and feedback? For example: have refreshments, use marketing or promotion, make a game of it, etc.

What

The Programs:

  • How many programs will you survey?
  • Will you use the same survey method for each program?
  • What types of programs do you want to evaluate and why?
    • Individual or one-time program?
    • Series or ongoing programs?
    • Overall evaluation of the American Space’s programming?

The Tools: What tools will you use?

  • Which tools or methods match the information needed?
  • Online survey tools:
    • SurveyMonkey: https://www.surveymonkey.com/
    • Typeform Pro: https://www.typeform.com/pricing/
  • Instant feedback options:
    • Mentimeter: https://www.mentimeter.com/why
    • Feedbackly: https://www.feedbackly.com/

The Format: What format best fits the information needed?

Examples

  • Group discussion with a facilitator or facilitators, also called a “focus group”
    • Participants are encouraged to express their opinions and share ideas with one another to provide the facilitator with honest feedback.
    • The ideal length is 1-1.5 hours.
    • The suggested number of participants is 10-14; if the group is larger, split it into two groups.
  • Group discussion — closed-ended questions
  • Group discussion — open-ended questions
  • Individual feedback — closed-ended questions
  • Individual feedback — open-ended questions
  • Individually submitted feedback — online survey or online tool

Question-writing tips

  • Use simple, direct words and phrases
  • Ask one point per question and be specific
  • Avoid confusing or leading questions
  • Break complex ideas down into multiple questions
  • Limit the number of questions in a survey (10-15, depending on topic)

Tips for group discussions

  • Make it clear at the start how long the session will be
  • Explain the project goal and the importance of each person’s opinion for this goal
  • If possible, sit around a table or put chairs in a circle
  • Staff not participating in the discussion should stay in the background (not up front)
  • Start with introductions

Types of questions

  • Closed-ended (yes/no responses): Would you recommend this program to friends?
  • Quantitative (numbers/data): How many times have you seen/attended a program here?
  • Multiple choice (3-5 options): Strongly agree, agree, neutral, disagree, strongly disagree
  • Open-ended: Questions about program quality, such as:
    • What was the primary reason you participated/attended the program?
    • What did you learn from this program that is/will be useful to you?
    • How could this program be improved?
    • Did you change your mind or opinion about anything after attending this program?