Measuring Outcomes 101
Wed., Feb. 10th, 2016, 1:15-2:30
Measuring Outcomes 101 is designed to help you get up and running this summer. The eight steps (below) are a great guide, but please bear in mind that they are not gospel — there are many ways this process can work.
Define: measuring outcomes
At camp, we can measure outcomes in three general categories: participants, programs, and practices. Most camp professionals measure something related to camper and/or parent satisfaction — this is a program outcome. Practices refers to the organizational features we evaluate in order to improve efficiency: personnel performance, marketing effectiveness, enrollment, etc. Perhaps the most difficult to measure, but the most powerful, are participant (camper) outcomes. Measuring camper outcomes = gathering information about what campers gained from their time at camp. It is generally assumed that outcomes are:
- beneficial for participants / campers — a positive change
- observable in some way
- aligned with the camp’s mission or program aims
Terms: test = instrument = tool.
Eight Steps to Measuring Outcomes
Please note timeframes in parentheses.
1. Why do I want to measure outcomes? (Timeframe: now)
First, it’s probably best to determine your reasons for wanting to do this (“the why”). Reasons can be general or specific — both are worthwhile.
General examples: I just want to get better; I want outcomes for my marketing; I want to advance the camping industry.
Specific examples: I need outcomes for a grant application; I need to know whether X program is working; I am afraid we aren’t properly preparing our staff.
2. What obstacles do I face? (Feb-Mar)
Think about organization-related obstacles: My board is nervous about it; My parents don’t like testing; I don’t have any money.
To overcome your obstacles, talk with your constituents and identify their concerns. Listen. If necessary, highlight the benefits and mitigate the risks.
3. Which outcomes should I measure? (Mar-Apr)
There are various ways to answer this question:
- Refer back to your “why” in #1 above
- Refer to your mission statement, philosophy and values
- Look to the future, e.g., 21st century skills
- If you’re using the Youth Outcomes Battery (see #4 below), choose from its list
- Ask your staff about program strengths and weaknesses, or about their interests and passions within your program
4. What instrument should I use? (Apr-May)
The ACA’s Youth Outcomes Battery (YOB) is a great place to start, even if you plan to move on in a few years.
Beware of creating your own instrument, especially without a professional involved. Even with a professional involved, it can take years to achieve validity and reliability. If you are just starting out, we strongly suggest you use an existing instrument.
Amanda (Sherwood Forest) uses YOB, plus a proprietary instrument for her reading program. Matt (Longacre Camp) uses Algorhythm.
5. Prep for delivery (May-Jun)
Ask yourself a ton of questions.
- Paper or online?
- Do I have all my materials?
- What time will it happen? Where will it happen?
- What will I tell the campers?
- What assistance can staff give? What can they not give?
- Who will troubleshoot?
- What about the kids who finish early? What about the kids who finish late?
- What do I do with the tests afterwards?
Beware of underestimating the details. Visualize every step of the way.
6. Deliver the instrument (summer)
Be clear when you instruct your campers. Their mindset matters. Also, make sure assistance from your staff is consistent, or it could affect your results.
Write down all relevant notes, including what complicated your process (distractions, weather, etc.) and what to change for next year (location, number of staff, etc.). Store nothing in your head. Make sure all tests are labeled and organized when you put them away. Next time you see the tests, you may have forgotten all the details of the day.
7. Analyze my results (fall)
How you analyze your results — that process — should factor into your instrument decision in #4 above. For some instruments, an analysis is included in the service, so weeks or months after you submit your data (online or by mail), you receive an analysis (online or by mail). If analysis is not included in your service, use a professional if possible. Beware amateur analysis.
8. Employ my results (winter and spring)
When employing your results, bear in mind that people are moved by stories, not data. Here are some ways we think about this:
- Lead with a story, then back it up with data
- Turn your data into a story, then tell the story in a compelling way
- Use a case study as an example
- Give the data context and make it tangible
- Use charts and make it visual
Raise the Bar
If you are overwhelmed by the thought of measuring outcomes, consider joining Raise the Bar. Raise the Bar (RTB) is a community of practice for camps measuring outcomes. It’s free.
RTB’s Getting Off the Ground Miniseries takes camps through the steps to “getting off the ground” — an expanded version of this session. It’s a series of Google Hangouts (multi-person video conferences). A small number of novice camps come together and are guided through the process by researchers and experienced camps.
You get access to a network of researchers and like-minded camp professionals. You can attend structured presentations about camps’ experiences and challenges. Some camps share data and analysis.
ACA launched Raise the Bar two years ago. Now, RTB operates outside the ACA structure (i.e., no funding). It’s all-volunteer.
RTB began with 18 camps in 2014. It grew to 34 camps in 2015. Members still work closely with ACA staff and CARE (the Committee for the Advancement of Research and Evaluation).
Matt is a co-leader. Amanda is a member. Laurie is CARE’s liaison to RTB.
The next application deadline is Mar. 31st. More information at acaraisethebar.com/join.
Start small; foolishly small. You can even focus on one outcome, one program, and/or one cabin.
It often takes 2-3 years to get it right. That’s normal. Don’t be afraid to make mistakes. You will not be 100% comfortable before you begin — and that’s ok, because there aren’t big risks. Measuring outcomes ≠ rock climbing.
If you’re interested in other outcomes-related sessions, please pick up a postcard in the back of the room.
Thank you! Enjoy the rest of your conference.
-Amanda, Laurie and Matt