What Does Applied Research and Evaluation Mean?

Posted by Elena Ragusa

Whenever I describe what I do to people outside of applied research and evaluation, I typically receive some variation of the same response: “Oh, that’s really cool. I hadn’t really thought about it, but that is such a need.” It might be people being polite (would you tell someone if you thought their job sounded painfully boring? Mine is not, by the way!), but the researcher in me has picked up on the repeated “I hadn’t really thought about it…” element of their reaction.

Research supports the notion that many leaders in the social sector do not go into the planning of a new initiative or program with evaluation in mind.

My own experiences and those of many of my colleagues would suggest that even those who do consider evaluation think of it as a big, resource-intensive, often external thing that happens at the end of a program (and one that brings with it a fair amount of trepidation, but that is a blog post for another day). That sort of summative evaluation brings tremendous value: it informs big decisions about changes to philosophy and approach, it can demonstrate impact, and it can certainly help with case-making – presenting to your board, expanding your geographic reach to a new site, or securing additional grant funding to continue operating.

How about the ongoing, potentially lower-resource, learn-something-new-and-use-it-to-inform-decisions-while-the-program-is-being-implemented type of evaluation, though?

There are many (many!) ways to leverage programmatic information you are already collecting, or could start collecting with relative ease, to learn, iterate, and improve your program design and implementation – and to get closer to achieving your intended outcomes.

This type of formative evaluation can be done by an external evaluation partner and/or internally by staff. Internal staff with “learning and evaluation” in their title are a good place to start, but often there are others who have some evaluative thinking skills, are interested in learning more, and are willing to roll up their sleeves. With the help of an evaluation coach or capacity builder, these folks can be powerhouses for your learning and evaluation efforts.

For example, let’s say you run a job skills program for English-language learners living in your community. The program includes a structured English course, a structured computer skills course, and a peer-to-peer mentoring component. Adults register for the program on their first day, and they sign in each time they arrive for a class session or meeting. Here are two questions you can answer with data you already have.

1. Who is coming?

As part of your registration form, you are likely collecting some basic identifying information about your participants. Do a quick comparison between who can register and who does register. Does the group you are serving reflect the population eligible for your services? Is a particular group overrepresented? If so, do you want to diversify who is coming in the door, or are you OK with the distribution as is? If you decide you want to diversify, what are some strategies for engaging the underrepresented group(s)? If you decide you are comfortable with the distribution, are there reasonable changes you want to make to further meet the needs of that group?
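
If someone on your team is comfortable with a bit of code (a spreadsheet works just as well), this comparison can take only a few lines. Here is a minimal sketch in Python with pandas, assuming two hypothetical files – registrations.csv for your sign-ups and eligible.csv describing the eligible population – each with a neighborhood column; swap in whatever identifying fields your registration form actually collects:

    # Compare who registers against who is eligible, using hypothetical
    # file and column names - adjust these to match your actual data.
    import pandas as pd

    registered = pd.read_csv("registrations.csv")
    eligible = pd.read_csv("eligible.csv")

    # Share of each group among registrants vs. among the eligible population
    reg_share = registered["neighborhood"].value_counts(normalize=True)
    elig_share = eligible["neighborhood"].value_counts(normalize=True)

    comparison = pd.DataFrame({"registered": reg_share, "eligible": elig_share}).fillna(0)
    comparison["gap"] = comparison["registered"] - comparison["eligible"]

    # Negative gaps flag groups underrepresented among your registrants
    print(comparison.sort_values("gap"))

A table like this won’t answer the “do we want to diversify?” question for you, but it makes the conversation concrete.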

2. Who is staying?

Those who come in the door once might not be the same people who come in the door again and again. In our example, the program is designed as one where participants are expected to return, and we have their session-by-session attendance data. Look at what that data is telling you. Do the people who stop coming have anything in common? Are they coming from the same neighborhood? Are they meeting with a particular instructor? Is there a common point in your curriculum where people get bored? Are they finding a job and therefore no longer need your services? (By the way, this last one is a good thing – you just need to capture it differently from someone who wasn’t retained for other reasons.)
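
For the team member who likes to tinker, the sign-in sheet translates directly into a simple drop-off analysis. A minimal sketch along the same lines, assuming a hypothetical signins.csv where each row records one participant signing in to one numbered session:

    # Look at retention session by session, using hypothetical
    # file and column names - adjust these to match your actual data.
    import pandas as pd

    signins = pd.read_csv("signins.csv")

    # How many distinct participants showed up to each session?
    by_session = signins.groupby("session_number")["participant_id"].nunique()
    print(by_session)

    # The last session each participant attended; a spike at one session
    # number suggests a common drop-off point worth a closer look
    last_seen = signins.groupby("participant_id")["session_number"].max()
    print(last_seen.value_counts().sort_index())

Joining this back to your registration data by participant ID is how you would check the neighborhood and instructor hunches above.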

These questions and others like them sometimes result in tough answers. But engaging in these exercises in real time allows for program improvement, which ultimately helps your participants and your progress toward your vision.


Do:

  • Think about evaluation throughout your entire program cycle – the earlier the better.
  • Consider what data you already have and could leverage to answer important questions that will help you meet your goals.
  • Look for the folks on staff who might want to dig into these puzzles, and support them with the resources they could use to flourish.

Don’t:

  • Be afraid of what the data will tell you. Even if you don’t see what you wanted or expected to see, it is better to know where we are as we figure out where we are going.


DR. ELENA RAGUSA

Founder and CEO of Drive Evaluation Studio.

She is committed to strengthening the places where we live, work, play, and learn through learning, evaluation, and applied research. Today you can find her supporting, growing, and evolving Drive Evaluation Studio.

