
APTS FORUMS

PostPosted: Thu Jan 17, 2013 4:42 pm 
I assume that most local PBS stations conduct an educational outreach program designed to increase the school readiness of children. This may involve:
1) workshops for Head Start teachers, daycare providers, and parents to teach the learning triangle, emphasize the value of reading to preschool children, and show how PBS television programs and the PBS Kids website can increase school readiness;
2) summer library events to teach the learning triangle and emphasize the importance of reading to children; and
3) book distributions to Head Start classrooms.

My question is whether, and how, you are measuring the impact of these programs beyond a purely quantitative record of how many workshops were given, how many people attended, how many children were reached, and how many books were distributed. Measuring the impact of these outreach programs seems very difficult to us because our programs are only a small part of what it takes to achieve "school readiness." We have tried surveying the parents of children we reached, but response rates were very low. With funders placing more emphasis on monitoring and evaluation, we are concerned that insufficient evaluation could make our applications less competitive. However, we are a small station struggling to maintain our existing programs.

_________________
Helen Hands
Grant Writer
Smoky Hills Public Television
Bunker Hill, KS


PostPosted: Thu Jan 31, 2013 4:42 pm 
Helen, the questions you have posed are very important. Education partners and prospective funders are increasingly seeking greater accountability. Our stations are well positioned to meet this challenge thanks to the system's enduring commitment to early childhood content, outreach, and evaluation, which the U.S. Department of Education has so generously funded through Ready To Learn, especially during the past eight years.

Yes, stations do conduct school readiness activities, many of which are based on resources developed through the national CPB-PBS Ready To Learn Initiative. A good number of these outreach models have been evaluated by RTL's national research partners, including EDC, SRI, WestEd, American Institutes for Research, and Rockman et al, as well as by university faculty such as Deb Linebarger (University of Iowa), Susan Neuman (University of Michigan), and Rebecca Silverman (University of Maryland). Our public media system is very fortunate to have major research to back up our school readiness efforts.

You mention the quantitative record-keeping that most stations engage in. At the most basic level, we agree it is important to quantify outputs.

However, as you note, there is increasing demand to measure the impact of interventions at the local level. That said, we realize that most stations do not have the funds or capacity to conduct "gold standard" or other rigorous quantitative evaluations. Given this reality, here are some thoughts.

o First and foremost, we recommend that local stations cite relevant national RTL study outcomes when working with partners and submitting proposals to funders.

o Next, we encourage stations to partner with local colleges and universities on school readiness grants, as this will make your proposal more competitive, and may also provide you with the evaluation expertise you need to address funders’ evaluation requirements. Should you go in this direction, your local research partners might benefit from taking a closer look at some of our RTL study designs as a guide or model.

o Finally, given the expense and challenges involved in measuring the impact of outreach activities (not least of which are implementing the intervention with fidelity across participants and getting survey response rates high enough to be credible), your station and research partner may wish to explore a more qualitative, case-study approach to demonstrating impact. This approach would involve an independent evaluator observing project activities and conducting focus groups and interviews with relevant stakeholders and participants about the project and its impact on the target audience. While funders ideally want statistics from controlled studies, qualitative nuggets about transformation and success often carry the day.

There are some good examples of recent CPB-funded RTL demonstration stations that collaborated with local evaluators. KBTC/Tacoma partnered with its Housing Authority’s evaluation group. Montana PBS hired a local evaluator to observe different implementation strategies and develop context-sensitive recommendations to enhance learning outcomes.

Check out our "findings" report on the resources developed and research conducted in the last round of RTL (2005-2010) at www.cpb.org/rtl/. Also stay tuned to the PBS KIDS.org/Lab website for postings of all RTL research reports from the current round (2010-2015) as they become available.

Pam Johnson, Executive Director of Ready To Learn, CPB
Barbara Lovitts, Director of Research and Evaluation/Ready To Learn, CPB


PostPosted: Fri Feb 22, 2013 12:07 pm 
Pam and Barbara-
Thank you very much for your response to my question about evaluating outreach programs for children. I re-read the Ready To Learn findings, found the evaluations you mentioned, and have added them to a couple of recent grant applications.

I doubt our station will ever evaluate the children reached by our outreach in a rigorous, scientific way using literacy tests. However, I think we would be capable of evaluating the effectiveness of how we do our outreach. For example, I think it would be valuable for our station to know whether:
1) the information we present in workshops is new to those attending;
2) it is being used in the classroom; and
3) the children, in the opinion of their teachers, are more receptive to it than to standard literacy instruction.

As for our programs where we give books to children in Head Start and our summer reading events, I think it would be valuable to learn whether:
1) the children involved are more interested in reading after we give them books or they attend an event; and
2) their parents read to them more.
Answering these questions could satisfy foundations' desire to know whether a program is effective and worth funding, and could lead us to improvements in our programs.

I'm sorry I was so slow to thank you.

Helen

_________________
Helen Hands
Grant Writer
Smoky Hills Public Television
Bunker Hill, KS


PostPosted: Tue Feb 26, 2013 11:55 am 
Hi Helen,

We’re happy to hear that our response about evaluating outreach programs for children was helpful.

While CPB is required by its federal grant to evaluate the impact of Ready To Learn on children, we don't expect stations to conduct the same type of rigorous controlled evaluations. These evaluations are very expensive and require a high degree of implementation fidelity, which is not easily achieved in local outreach programs.

The measures of effectiveness you have identified are very appropriate, not to mention valuable for a station, and are the types of things that would satisfy most private funders. Indeed, many of them are things we look at in our own evaluations through parent/caregiver/teacher surveys and focus groups, over and above whether children learned the targeted content.

We wish you the best with your evaluation efforts.

Pam and Barbara

