Sustaining Evaluative Capacity
Evaluation Support Project, Anchoring Evaluative Capacity
In three separate projects, basic evaluation training was provided to multiple staff members with direct service delivery responsibility (e.g., program managers) in multiple organizations that had already participated in comprehensive (18+ months) evaluation training programs (see the Evaluation Support Project and Anchoring Evaluation Capacity project reports for examples). Participants included the Executive Directors of each organization, staff who had been involved in the long-term training, and staff who had never participated in evaluation training. The Evaluation Support 2.0 project involved two organizations (the Center for Anti-Violence Education in NYC and Jewish Family Service in Rochester) that had never received longer-term training but had been recipients of Bruner Foundation grant support.
The Evaluation Support Projects (2010, with the 2.0 version in 2015) and the Anchoring Project (2011-2012) helped participants refresh or learn new skills and enhance organization-wide evaluative thinking as well. In all the projects, the training included instruction and activities addressing basic evaluation planning (evaluation questions, stakeholders, designs), evaluation logic (outcomes, indicators, targets), and data analysis. The ESP 2.0 project focused specifically on data analysis using tools such as Survey Monkey and Microsoft Excel.
- For the Evaluation Support Project 2.0 (2015), training included a refresher session for the entire agency, an optional session on evaluation design, and two additional sessions focused on data analysis using tools and data from the agency's selected project. For all groups except JFS, which used the modified version as its first agency-wide training effort, the final session focused on summarizing findings. (Contact the agencies for examples of the results reports.)
- For the Anchoring Project (2011-2012), organizations identified an analysis project of interest to them and selected specific tools to accomplish their work. These included: construction and use of an electronic survey (including instrument development, administration and analysis plan development, target setting, launch, and bivariate analysis of results); development and analysis of a multi-year client intake database using SPSS (including record review to collect data, conversion of files from Excel to SPSS, and bivariate analysis of intake and placement trends); and development and analysis of a master unit-record database in Excel (including single-file construction from multiple survey, observation, and attendance databases).
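The master-database task above amounts to joining several per-participant files on a shared identifier, then running a simple bivariate summary. The organizations worked directly in Excel and SPSS; the sketch below shows the same idea in Python with pandas, using entirely hypothetical column names and values (participant_id, satisfaction, skill_rating, sessions_attended are illustrative, not drawn from any agency's data):

```python
import pandas as pd

# Hypothetical per-instrument records, each keyed by a shared participant ID.
survey = pd.DataFrame({
    "participant_id": [1, 2, 3],
    "satisfaction": ["high", "low", "high"],
})
observation = pd.DataFrame({
    "participant_id": [1, 2, 3],
    "skill_rating": [4, 2, 5],
})
attendance = pd.DataFrame({
    "participant_id": [1, 2, 3],
    "sessions_attended": [10, 3, 8],
})

# Build the master unit-record table: one row per participant,
# columns drawn from every source file.
master = (survey
          .merge(observation, on="participant_id")
          .merge(attendance, on="participant_id"))

# A simple bivariate look: average attendance by satisfaction level.
summary = master.groupby("satisfaction")["sessions_attended"].mean()
print(master)
print(summary)
```

In practice the same join can be done in Excel with lookup formulas; the point is that each source file contributes columns while the shared ID keeps one record per participant.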
- For the Evaluation Support Project (2010), participants from LifeSpan engaged in a group survey analysis project devised by the trainer, in which they developed code books and analysis plans and practiced analyzing a set of surveys according to those plans. They also developed an evaluation design for a new project.
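The code-book exercise described above can be sketched in plain Python: a code book maps raw survey responses to numeric codes, and the analysis plan then dictates which tabulations to run. The response categories and codes here are hypothetical, not LifeSpan's actual instrument:

```python
from collections import Counter

# Hypothetical code book, of the kind participants would agree on
# before analysis: each response category gets a numeric code.
code_book = {
    "strongly agree": 4,
    "agree": 3,
    "disagree": 2,
    "strongly disagree": 1,
}

# A small set of raw responses to one survey item.
responses = ["agree", "strongly agree", "disagree", "agree", "strongly agree"]

# Apply the code book, then tabulate per the analysis plan:
# frequency counts plus an item mean.
coded = [code_book[r] for r in responses]
frequencies = Counter(coded)
average = sum(coded) / len(coded)

print(frequencies)
print(round(average, 2))
```

The same coding-then-tabulating workflow is what the LifeSpan group practiced by hand on paper surveys; agreeing on the code book first is what makes the resulting counts comparable across analysts.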