Helen, the questions you have posed are very important. Education partners and prospective funders are increasingly seeking greater accountability. Our stations are well positioned to meet this challenge thanks to the system's enduring commitment to early childhood content, outreach, and evaluation, work that has been generously funded by the US Department of Education through Ready To Learn, especially during the past eight years.
Yes, stations do conduct school readiness activities, many of which are based on resources developed through the national CPB-PBS Ready To Learn Initiative. A good number of these outreach models have been evaluated by RTL's national research partners, including EDC, SRI, WestEd, and others (e.g., American Institutes for Research and Rockman et al.), as well as university faculty including Deb Linebarger (University of Iowa), Susan Neuman (University of Michigan), and Rebecca Silverman (University of Maryland). Our public media system is very fortunate to have major research to back up our school readiness efforts.
You mention the quantitative record-keeping that most stations engage in. At the most basic level, we agree it is important to quantify outputs.
However, as you mention, there is increasing demand for measuring the impact of interventions at the local level. Having said this, we realize that most stations do not have the funds or capacity to conduct “gold standard” or other rigorous quantitative evaluations. Given this reality, here are some thoughts.
• First and foremost, we recommend that local stations cite relevant national RTL study outcomes when working with partners and submitting proposals to funders.
• Next, we encourage stations to partner with local colleges and universities on school readiness grants, as this will make your proposal more competitive and may also provide the evaluation expertise you need to address funders' evaluation requirements. Should you go in this direction, your local research partners might benefit from taking a closer look at some of our RTL study designs as a guide or model.
• Finally, given the expense and challenges involved in measuring the impact of outreach activities (not least of which are implementing the intervention with fidelity across participants and getting survey response rates high enough to be credible), your station and research partner may wish to explore a more qualitative, case study approach to demonstrating impact. This approach would involve an independent evaluator observing project activities and conducting focus groups and interviews with relevant stakeholders and participants about the project and its impact on the target audience. While funders ideally want statistics from controlled studies, qualitative nuggets about transformation and success often carry the day.
There are some good examples of recent CPB-funded RTL demonstration stations that collaborated with local evaluators. KBTC/Tacoma partnered with its Housing Authority’s evaluation group. Montana PBS hired a local evaluator to observe different implementation strategies and develop context-sensitive recommendations to enhance learning outcomes.
Check out our "findings" report on the resources developed and research conducted during the last round of RTL, 2005-2010 (www.cpb.org/rtl/). Also stay tuned to the PBS KIDS.org/Lab website for postings of all RTL research reports from the current round (2010-2015) as they become available.
Pam Johnson, Executive Director of Ready To Learn, CPB
Barbara Lovitts, Director of Research and Evaluation/Ready To Learn, CPB