
Optimizing Program Evaluation: Insights and Recommendations from SisterWeb

by Regina Endoso, Evaluations Intern & MPH Candidate at UC Berkeley 


Coming into my internship at SisterWeb (SW), I was so excited to finally get my hands on the data and generate meaningful analyses for program evaluation, eventually leading to program improvement. But there was more work to do along the way, and it is extremely important work that I think every organization should incorporate if it isn't already doing so.


For a community-rooted organization, evaluation is essential to success. Here are some recommendations based on my hands-on experience with SisterWeb's evaluation tools and processes.


1. Have a guiding framework.


SisterWeb has adopted Results Based Accountability, which has been a great framework for accomplishing its goals! I recommend choosing a framework to help guide your organization toward its evaluation goals.


2. Evaluate everything and anything (within reason!)


I am in awe of all the measures that SW is evaluating: they have 101 performance measures and counting! SW measures everything from client satisfaction to the overall number of visits to staff emotional well-being, which goes to show how much they care about their staff as well as their clients. Every single measure has a specific purpose and contributes to the overall improvement of the organization.


Now you're probably wondering, "How do they keep track of all of that?" The answer: a Google Sheet! It may seem simple, but there is a high level of organization to the Performance Measure Spreadsheet. It's a central hub for all things data, including each measure's purpose, the tools used for collection, which program or department is being measured, how frequently the data is collected, and who is responsible for it.
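
To make that structure a little more concrete, here is a minimal sketch of how one row of such a tracker might be represented in code. The field names and the example values below are hypothetical and simplified, not SisterWeb's actual spreadsheet columns.

```python
# A minimal sketch of one row of a performance-measure tracker.
# Field names are hypothetical; the real Performance Measure Spreadsheet
# at SisterWeb contains more detail than this.
from dataclasses import dataclass

@dataclass
class PerformanceMeasure:
    name: str             # what is being measured
    purpose: str          # why the data is collected
    collection_tool: str  # e.g., survey, intake form, checklist
    program: str          # which program or department it applies to
    frequency: str        # how often the data is collected
    owner: str            # who is responsible for the data

# Illustrative example row (made-up values)
example = PerformanceMeasure(
    name="Client satisfaction",
    purpose="Track whether doula support meets client needs",
    collection_tool="Post-visit survey",
    program="Doula services",
    frequency="Quarterly",
    owner="Evaluation team",
)
```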


3. Revisit and modify your evaluation tools.


While you want to evaluate a sufficient number of measures, I think the quality of your performance measures matters even more. SW regularly reviews its measures to ensure that they are still in line with current goals. During my internship, I spent some time reviewing each measure and considering what it was measuring, why we were measuring it, and what changes needed to be made. I highly recommend that whenever you change your survey questions, you update your performance measures as well!


4. Clean up your data! 


It's important to audit your data to make sure you and your staff are recording it properly and collecting the correct measures. Data cleaning should be a continuous process that happens not only before analysis but also throughout data collection. This is essential for ensuring that the data you report back to your interested parties is accurate! This was one of my main projects during my SW internship. I reviewed the Doula Checklist of Care for each client to ensure that the visits SW doulas marked as complete matched the visits that were actually completed, and that the referral and eligibility notes and the SW Fiscal Year Client Groups were recorded correctly. After the review, I created a presentation to show the doulas how they were doing with data collection, how they might improve, and why reporting accurate data is important.
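
If your data lives in a spreadsheet, a cross-check like the one described above can also be scripted. Here is a minimal sketch, assuming a hypothetical export with made-up column names (doula_name, client_id, marked_complete, visit_logged); these are not SisterWeb's actual Checklist of Care fields or tooling.

```python
# Minimal sketch of an audit cross-check on a hypothetical spreadsheet export.
# Column names ("doula_name", "client_id", "marked_complete", "visit_logged")
# are assumptions for illustration only.
import pandas as pd

checklist = pd.read_csv("doula_checklist_export.csv")

# Flag rows where a visit was marked complete but no matching visit was logged
# (or vice versa) -- these are the records a human reviewer should double-check.
mismatches = checklist[checklist["marked_complete"] != checklist["visit_logged"]]

# Summarize mismatches per doula so feedback can be shared back with the team.
summary = (
    mismatches.groupby("doula_name")["client_id"]
    .count()
    .rename("records_to_review")
)
print(summary)
```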

If you want to see more on how I presented this information to our doulas, please click here.


5. Use qualitative methods to tell the story behind your quantitative data.


While quantitative data is important, it sometimes doesn't tell the whole story. SW uses a tool built into its data collection system called ScoreCard, which presents the quantitative data in easy-to-read graphs alongside a narrative that tells the story behind the curve: the partners involved, what was successful, and what strategies might improve or maintain the measure. This can be especially helpful for communicating with interested parties about why numbers might be low or what can be done to keep them up.


I hope these tips help you understand effective program evaluation. Program evaluation is not just a box to check; it's a dynamic process that underpins the sustainability of community-driven initiatives like SisterWeb. The journey of evaluation is just as rewarding as the destination of program improvement.


All in all, I've had such a wonderful experience at SisterWeb, for which I'll always be grateful. I was so inspired by the organization's dedication to excellence in evaluation practices and its profound commitment to both its clients and its staff. Moving forward, I hope to carry these invaluable practices with me in my future endeavors in Public Health, and I hope you are able to apply these few tips to your organization as well!
