Lessons Learned at Spring Brook Farm: An Evaluation of the Farms for City Kids Program




Section Six: Appendices



Appendix A – Logic Model





Appendix B – Evaluation Overview



Farms for City Kids

Evaluation Overview

Prepared: July 2005

Updated: November 23, 2005
The program evaluation will:

  • Evaluate the effectiveness of the Farms for City Kids model in terms of process (implementation) and outcomes (results)

  • Provide useful information to program staff, board members, & funders to assist with program development, justification, & refinement

  • Provide Farms for City Kids with evaluation tools that may be used for ongoing data collection and program analysis
Evaluators’ Philosophy

  • PEER Associates is committed to using a multiple-methods, utilization-focused, participatory evaluation process. Our intention is to help organizations better understand their programs & to improve them based on evidence of program functioning & outcomes. We also intend to help organizations build their own capacity to reflect on & internally evaluate their programs, & to improve the evaluability of those programs.
Evaluators’ Roles

  • Meet with project staff to develop the logic model and evaluation plan, & modify them as needed

  • Collect data through site visits, surveys, interviews, & photo documentation

  • Analyze data & write reports

  • Provide planning and/or recommendations for subsequent evaluation (as appropriate)

  • PEER Contacts: Amy and Andrew
Farms for City Kids Staff and/or Board Roles in the Evaluation Process

  • Develop the logic model and evaluation plan with the evaluators

  • Provide input throughout the evaluation cycle, via meetings, phone, and/or email, on the evaluation's direction, the appropriateness of instruments, & the format of the final report

  • Serve as liaison between the evaluators & participants (e.g., setting up the interview schedule)

  • Collect & share observation notes, project documentation, & photos with the evaluators, as designated

  • Assist in administering surveys, if used

  • Provide incentives for teacher, student, parent, and/or staff participation in the evaluation process
Deliverable Products

  • Farms for City Kids Program Logic Model: Due December 2005

  • Informal report, delivered as a presentation to the board, on existing survey and interview data: Due December 2005

  • Formal report on 2005-06 evaluation findings: Due April 2006

  • Presentation and facilitated discussion of the final report with the board and/or staff: April 2006

Farms for City Kids Evaluation Overview, 2005-2006

Category 1: Existing Data

  Activities:
  1. Enter, code, and analyze the most relevant existing 2003-2005 teacher and student survey data. (Fall 05)
  2. Review video data and document primary themes.

  Evaluation Questions:
  • Based on the data collected so far, what are the most notable and consistent impacts of the Farms for City Kids program on students?

Category 2: Data Collection

  Activities:
  1. Initial one-day site visit and program observation. (July 05)
  2. Develop surveys and interview guides for students and teachers. (Aug 05)
  3. Conduct pre- and post-interviews and surveys with one group of students and their teachers at Spring Brook Farm (Oct 05); conduct follow-up interviews and surveys at the school in NYC (Jan 06).
  4. Conduct interviews with program staff. (Oct 05)
  5. Conduct interviews and focus groups with alumni students, teachers, parents, administrators, etc. during an on-location visit to NYC. (Jan 06)

  Evaluation Questions: As a result of their participation in the Farms for City Kids program:
  • What areas of students’ personal and social skills are being affected, and in what ways? (self-esteem, self-confidence, leadership, cooperative teamwork, conflict resolution)
  • How have students’ knowledge, attitudes, and behaviors about food and health choices changed?
  • How have students’ knowledge of and attitudes about agriculture, farm animals, and the environment changed?

Category 3: Report

  Activities:
  1. Plan, travel, transcribe, analyze, and report on interviews & site visits. Provide electronic and print versions of the final report covering the survey, observation, and interview data (Category 2) and the existing data (Category 1). (Apr 06)

Category 4: Presentations

  Activities:
  1. Present findings and facilitate discussion of the informal report on existing data and the status of the evaluation at a Staff or Board Meeting. (Dec 05)
  2. Present findings and facilitate discussion of the final report at a Staff or Board Meeting. (Apr 06)

Category 5: Program Staff Support

  Activities:
  1. Facilitate development of a program logic model, articulating the theory of change and identifying evaluation needs. Provide the final Logic Model as part of the Informal and/or Final Report. (July 05)
  2. Meet with stakeholders to define evaluation questions and methods; revise the evaluation overview as needed. (July 05, ongoing)
  3. Help build the program’s capacity to continue evaluation efforts on its own, such as by refining instruments for ongoing use. (Ongoing)

  Evaluation Questions:
  • What resources and activities are used to create desired outcomes? Where are the key evaluation needs within the program logic?
  • What are realistic expectations and plans for generating useful evaluation results within existing resource constraints?
  • In what ways can the evaluation tools and processes used in this evaluation cycle serve longer-term, internal evaluation purposes?


