Training for Managers and Supervisors to Enhance Their Capability to
Understand and Implement ASFA
Evaluation Report for the Project Period October 1, 2001 to September 30, 2002
This report presents the evaluation findings for the second full year of this project’s operations. It is intended to be a companion document to the Second Semi-Annual Performance Report submitted by the Project Director to the Children’s Bureau, detailing activities and accomplishments for that six-month period of year two. However, this evaluation report looks retrospectively at the entire twelve months recently completed, and offers a preview of evaluation methods and events planned for the third project year.
This evaluation has three components: (1) an evaluation of process, or the extent to which the work of the project is proceeding or has proceeded as expressed in the work plan; (2) an evaluation of outcomes, or the extent to which anticipated milestone events and results and products have emerged as the consequence of the work of project staff and their leaders and various collaborators; and (3) an evaluation of the individual pilot sites’ activities and experiences with evaluating the curriculum, including the lessons learned.
It should be noted here that the data sources for the first two components of this report were created and administered by project staff for their own internal use in tracking the progress of their work and ensuring that critical tasks are carried out on schedule. These tools routinely inform and guide decisions and actions in every respect. The diligence and managerial expertise with which this project has been conducted during this 12-month period is clearly evident and facilitates an accurate and complete evaluation of the project team's work. The data source for the third component is the analysis of the evaluation forms completed by pilot training participants, supplemented by qualitative assessments by key managers. The quantitative and qualitative analysis for this third component was completed by this evaluator.
The major year two activities are presented in detail within the attached matrix labeled Project Task Completion Timetable (Objectives 1-7). This project task matrix was created at the outset, has guided discussions at the meetings of the project team and is updated periodically by the project director to reflect current status and planned activities for the following period. No significant variations from the work plan developed at the start have been observed. The project has completed all its critical tasks on time, within budget and in good order. The progress attained in each substantive area of work has been described in some detail within each of the Project Director’s second year semi-annual progress reports.
Year two was primarily dedicated to field-testing, evaluating, and revising the curriculum, with the ultimate goal of producing a final curriculum for wide dissemination. Producing this curriculum is the essence of this three-year project. Year one was dedicated to developing competencies and a draft curriculum that could be piloted. Year three activities will focus on creating useful versions of and widely disseminating the final curriculum. But it is in year two that the substance of the work on this curriculum was planned to be, and indeed was, accomplished.
Kentucky was the first site to pilot the curriculum, in December 2001. During the second half of year two, the curriculum was piloted in three additional sites: New Mexico, Wisconsin, and Cuyahoga County, Ohio. As designed, the module-based approach embodied in the curriculum encouraged adaptations. Both New Mexico and Wisconsin made significant adaptations to the curriculum; Cuyahoga County made no changes to the draft curriculum; and Kentucky made some changes for its pilot and later incorporated the curriculum into a larger training for supervisors. Pilot states were required to complete pilot training activities by the end of May 2002.
In order to produce a robust and fully vetted curriculum, the project team carefully tracked the various ways states modified the curriculum to meet their own needs and requirements. The team did so in four ways: individual consultation with each site as they developed their training approach and customized the curriculum; monthly teleconferences with the training teams at all four sites; surveys of each site manager regarding lessons learned; and a debriefing meeting in September 2002 for all sites to share their experiences and provide input into the final curriculum. As a result, the project team has a deep and comprehensive understanding of exactly how the curriculum was adapted by each site and, more importantly, what worked well and why it worked. Their understanding enabled them to produce a final curriculum that is stronger than the piloted draft, remains just as flexible, and provides important guidance to potential users.
All outcomes and products scheduled for year two were completed in good order.
One of the challenges that the Project Director wanted to tackle was that of evaluating the impact of the training on practice (not merely the satisfaction of participants with the training). To do so, pilot training evaluation activities included pre- and post-test participant surveys, follow-up surveys, individual interviews with managers, and/or participant action plans. In keeping with the approach to the curriculum itself, each pilot site was asked to customize the evaluation approach to reflect the changes it made to the curriculum. Thus, as with the curriculum, the evaluation approach was field-tested across a range of organizational structures (e.g., county-based and state-based; agencies that do and do not provide juvenile justice services), training approaches (e.g., training partnerships and agency trainers), and rural and urban delivery areas.
Kentucky Department of Community Based Services
Participants in Kentucky completed the pre-training evaluation at the beginning of the first day of training, the post-training evaluation at the end of the second day of training, and the post-training evaluation again six months after the training. Questions focused on ASFA, strategic planning, using data for decision-making and collaboration. Skill assessments were higher in all areas following the training. Most participants said that they would use all or most of the information and skills from the training on the job.
Cuyahoga County, Ohio, Children and Family Services
Participants in Cuyahoga County completed the pre-training evaluation at the beginning of the first day of training, the post-training evaluation at the end of the second day of training, and the post-training evaluation again two months after the training. The forms used at the training itself (both pre and post) were lengthier than those used by any other site. Questions focused on ASFA, systemic factors, action planning, using data for decision-making, and core competencies. Most participants said that they would use all or most of the information and skills from the training on the job. The senior manager who led this project for Cuyahoga County also designed a follow-up assignment for participants to continue to practice using data and management reports. That initiative will allow her to continue to assess the impact of this training.
New Mexico Department of Children, Youth and Families
The New Mexico training team was the only one to make significant revisions to the evaluation instruments as initially designed. In fact, much of the streamlining that the team introduced (and that participants responded to favorably) has been incorporated into the evaluation instruments provided in the final curriculum. Participants there also completed the pre-training evaluation at the beginning of day one and the post-training evaluation at the end of day two. The training manager enlisted the assistance of the IT unit to complete a data analysis of the pre- and post-training forms, which was shared with participants at the very end of the second day. Questions focused on ASFA, understanding data, using management reports, and collaboration. Significant improvements were seen in all areas except collaboration. New Mexico managers reported that collaboration was already widely understood and practiced prior to the training, so they would not have expected to see much change in that area.
Wisconsin Division of Children and Family Services and the Training Partnerships
Participants in Wisconsin completed the pre-training evaluation at the beginning of the first day of training, the post-training evaluation at the end of the second day of training, and the post-training evaluation again two months after the training. As the training partnerships continue to provide this training to additional participants, they are requiring staff to complete and submit the pre-training evaluation as part of training registration. Questions focused on ASFA, outcomes required by ASFA, the application of ASFA in key practice areas, and managing by outcomes. Most participants said that they would use all or most of the information and skills from the training on the job.
In September 2002, the Muskie project team hosted an ASFA project debriefing for representatives from the four pilot sites in order to share lessons learned from participating in the project. This included a presentation by this evaluator and a discussion with the group on the lessons learned and themes from each pilot site's evaluation results.
The project team set out to design an evaluation approach that would assess the impact of the training on the practice of the participants. While the pilot trainings included a relatively small number of participants (approximately 20 to 25 at each site), making the impact of the pilot training difficult to measure, the project team did want to test an approach that would focus on impact rather than satisfaction. As training professionals know, impact evaluation is challenging at best, and this project reinforced that.
The curriculum is designed to support the transfer of learning from the classroom to office practice. The briefing to senior executives is designed to create a commitment to bringing the substance of the curriculum into action. The personal learning plans are designed to support participants' reflection on their own learning and continued post-training skill development. The pre-training evaluation form asks participants to reflect on what they need or want to learn, and the post-training evaluation form asks participants to anticipate how they will put their learning into practice back at the office.
That said, the primary focus of the pilot sites was on customizing the training-room curriculum to meet their specific local needs. None of the sites used or tested the personal learning plans. While all sites had the blessing of senior executives to participate in this initiative, Cuyahoga County was the only site that used the executive briefing as designed. Its team briefed the senior management team, including the director, on the curriculum and solicited input as to substance and focus. For example, the director asked that the action planning exercise focus on a problem of particular concern to him (namely, "foster care drift"). The results from that exercise were presented to him following the training, as were the results from the systemic factors assessment.
One qualitative measure of impact is the commitment and enthusiasm of each pilot site to delivering this curriculum to a broader audience beyond the pilot itself. Kentucky has integrated it into a comprehensive supervisory curriculum delivered in conjunction with the University of Louisville; Cuyahoga County is adapting it as three separate, companion training sessions; and New Mexico and Wisconsin are rolling out statewide, with some modifications, the curriculum they piloted. Clearly, pilot sites believe the curriculum offers important benefits to their managers and supervisors. This is echoed by participants, a large majority of whom reported that they would use "all" or "almost all" of what they learned.
As noted above, the curriculum is designed to be customized to meet local needs. It has been delivered in states that are relatively rich in training resources as well as those that have new or relatively few training resources. It has been delivered prior to a state’s Child & Family Service Review (CFSR) as well as following a state’s CFSR. Each site successfully adapted the curriculum to meet its unique needs.
Participants rated their knowledge and understanding of ASFA very highly both before and after the training. However, at the September debriefing, the trainers in two sites reported that, prior to the training, participants did not understand how ASFA actually applies to case practice. It seems most likely that on the pre-training evaluation participants at these sites were rating their understanding of ASFA as a federal requirement. During the training, they learned about the application of ASFA to case practice, so the post-training ratings reflect a deeper understanding of the impact of ASFA on the agency and its practice.
Participants rated their skills in using data to support decision-making as low relative to other skill areas. Their ratings improved in the post-training evaluation, but remained relatively low. Their comments on the evaluation forms identified a number of obstacles to developing and practicing these skills. The most commonly identified obstacles were lack of access to user-friendly reports, concerns about data integrity and quality, and the timing of the reports. The pilot trainers concurred that these are challenges with which their entire agencies are grappling. This finding highlights the importance of securing executive support for the curriculum's content, particularly in areas that might require organizational changes to support training transfer.
As with the first and second years, the year three evaluation of process and outcomes will employ methods of competency, consistency, utility, and implementation analysis. The focus of the third year study will be on approaches to national dissemination, as described in the attached matrix, Activities Planned for Year Three.
1. Project Task Completion Timetable (Objectives 1-7)
2. Major Activities and Accomplishments During Year Two
3. Activities Planned for Year Three