
Changes to the ACLS Site Visit and Monitoring Process 2017

Introduction
An ACLS team (hereinafter the Team) has developed a new, uniform, and transparent approach to site visits and revised the process for monitoring visits to align with the new FY19-FY22 Multi-Year Open and Competitive Adult Basic Education Request for Proposals (RFP). This approach strives to articulate the essential characteristics of a high-quality Community Adult Learning Center and reflects ACLS's policy shifts toward: (1) program flexibility, (2) an ABE system that is focused on outcomes, and (3) alignment with the Workforce Innovation and Opportunity Act (WIOA) priorities, specifically the 13 WIOA considerations. The Team will employ a systematic and coordinated method of gathering data to ensure that its processes are fair and objective.
 
The Team is currently piloting the new site visit and monitoring protocols with four Community Adult Learning Centers (CALCs) and one ABE Correctional Institution. These pilot visits will inform the calibration of the protocols and the sources of evidence included in this new approach. Before their release to the field, the site visit and monitoring protocols will be finalized based on the findings from the spring 2017 pilots and feedback from the open comment period.
 
The description below highlights the proposed changes to the site visit and monitoring process. ACLS is seeking your feedback on these new processes. Please review these changes carefully because, together with the revised Indicators of Program Quality (IPQ), they will inform both site visits and monitoring visits.
 
The survey questions that follow will be available until May 15, 2017.
 
Overview of Site Visits
Site visits may be grouped broadly into three types: Year One, Check-in, and Targeted Intervention (TI). Additional site visits may be scheduled as needed. The length and scope of each visit varies depending on the focus of the visit and the set of IPQs reviewed during the visit. In general, the overall purposes are:
  1. To provide support and technical assistance to the program. Technical assistance may encompass a wide range of areas, including but not limited to: consistent underperformance, a drop in performance, staff retention, a lack of reported data, a decline in student attendance, poor student retention, curriculum not aligned to the CCRSAE, or other concerns as determined by ACLS. The program is always encouraged to identify areas where technical assistance is needed.
  2. To review the program’s progress using a limited selection of IPQs as a framework for the discussions to take place among the program specialist, the ABE Director, and a selected sampling of program staff.
  3. To gather best practices for state-wide dissemination.
Each site visit will be conducted by two program specialists to ensure transparency. At the start of the fiscal year, ACLS program specialists will negotiate site visit dates with the ABE program directors. Note that the IPQs are also a useful tool for program self-evaluation.
 
See Table 1 below for more details on each type of site visit.
 
Table 1: Summary of Site Visits
 
Year One (FY19)
  • Description: All programs (introductory); may include class observations.
  • Purpose: Verify that key funded activities align with service delivery (e.g., classroom space, class plan, access to technology); provide technical assistance and identify best practices.
  • Approximate length: 0.5 day
  • IPQ(s) typically addressed: 1: Program Design; 2: Access and Equity

Check-in
  • Description: All programs; may or may not include class observations.
  • Purpose: Provide feedback on the program’s performance in terms of IPQ standards; gather evidence regarding the implementation of program activities and identify best practices.
  • Approximate length: 0.5 day
  • IPQ(s) typically addressed: 3: Career Pathways Collaboration; 4: Curriculum and Instruction; 5: Student Progress; 6: Advising and Student Support; 8: Educational Leadership

Targeted Intervention (TI)
  • Description: A wide range of issues, as identified in the overview above.
  • Purpose: Provide technical assistance, including resources and referrals to PD content areas as appropriate; assess the program’s progress on selected IPQ standards.
  • Approximate length: 0.5-1 day
  • IPQ(s) typically addressed: 1-10, as needed

Fiscal Reviews
  • Description: A sampling of programs within a funding cycle and programs with fiscal issues.
  • Purpose: Evaluate fiscal compliance.
  • Approximate length: 1-2 days
  • IPQ(s) typically addressed: 10: Fiscal and Data Accountability

Data Reviews
  • Description: A sampling of programs within a funding cycle and programs with data reporting compliance issues.
  • Purpose: Evaluate data reporting compliance.
  • Approximate length: 1-2 days
  • IPQ(s) typically addressed: 10: Fiscal and Data Accountability
 
 

Overview of Monitoring Visits
Monitoring visits will be conducted by the ACLS Program Quality Team in collaboration with workforce partners. This one-day monitoring visit serves multiple purposes:
  1. To assess the program’s progress using a selection of the IPQs as the framework for data collection activities. 
  2. To gather best practices for state-wide dissemination, along with other evidence that will become part of the larger body of evidence used to make program funding decisions.
In the spirit of continuous improvement, the monitoring team will document the outcomes of the monitoring process and assign a rating to each Indicator.
 
ACLS will notify all programs selected for a monitoring visit at the start of the fiscal year. The ABE Director and the program specialist will identify a mutually agreeable date. To prepare the program for the monitoring visit, the program specialist will ensure that the ABE Director has the link to the ACLS homepage where all the monitoring materials will be posted, including a sample schedule, student and staff interview questions, and the required documentation. Once the program director confirms the date for the monitoring visit, he or she will be responsible for preparing staff for the visit.
 
The Monitoring Structure: Major Components of the Day
  • Interviews and Focus Groups
    • Several interviews will be conducted throughout the day with the program’s leadership team (director, assistant director, coordinator, lead teachers), a sampling of advisors, a sampling of teachers, and a sampling of students. Not all program staff are required to participate in the interviews. Prior to the visit, program leadership, in collaboration with the monitoring team leader, will determine the configuration of the focus groups. Uniform interview questions will be guided by the IPQs.
  • Survey Questions
    • To maximize the monitoring team’s time with the focus groups and to allow the team to probe more deeply with fewer questions, some questions on a subset of the Indicators will be asked in the form of a survey. ACLS will send the questions prior to the visit, and program leaders are expected to provide the monitoring team with the answers two weeks before the visit.
  • Document Review
    • Two weeks prior to the visit, programs are required to provide ACLS with a set of evidentiary documents. Examples of documents could include: a sample curriculum scope and sequence, a sample curriculum unit, sample lesson plans, and a list of most recent staff PD activities.
  • Classroom Observations
    • Each classroom observation will not exceed 30 minutes and will not disrupt classroom activities. The monitoring team will visit several classrooms and will use an ACLS-developed classroom observation tool to gather evidence on activities and practices related to four essential elements of high-quality instruction: scaffolding, differentiated instruction, checking for understanding, and student engagement. Evidence collected during classroom observations will become part of the larger body of evidence and will inform the rating of Indicator 4: Curriculum and Instruction.
  • Team Debrief
    • This is time for the monitoring team to discuss the major highlights of the day, ask clarifying questions, and gather additional evidence if needed. In addition, the team will confer to ensure a shared understanding of the data collected and to determine preliminary findings.
  • Report Out
    • The team shares preliminary feedback with the ABE program leadership and key staff. 
  • The Monitoring Report
    • Each monitoring report will contain, if applicable, best practices, recommendations, and required actions related to the IPQs. The program’s performance against the Indicators will be summarized using a rating scale. In the monitoring report, each Indicator will be accompanied by one of four ratings: Limited Evidence, Developing, Proficient, or Exemplary.
 
Table 2: Rating Descriptions
Rating Description
Limited Evidence The program demonstrates little to no evidence related to the Indicator and/or significant concerns are noted.
Developing The program demonstrates inconsistent evidence related to the Indicator and/or moderate concerns are noted.
Proficient The program demonstrates consistent evidence related to most standards of the Indicator and/or minor concerns are noted.
Exemplary The program demonstrates consistent evidence related to all standards of the Indicator and is a potential exemplar in this area.
 
Please note that the Student Progress Indicator will not be rated with this scale since this particular Indicator is already assessed through the ACLS Performance Standards Framework with the Measurable Skill Gain (MSG) Standard and the post-exit measures.
 
Timeline
The implementation of the new site visit and monitoring models is scheduled to begin in FY19 (July 1, 2018). The following is a projection of the site visits and monitoring visits to be conducted in the new funding cycle:
  • Year 1 (FY19): All funded programs will receive a Year One site visit. Year One site visits will focus on providing technical assistance.
  • Year 2 (FY20): 50% of the CALC and ABE CI programs will be monitored by the ACLS PQ team. Programs not being monitored will receive a check-in visit or a targeted intervention visit.
  • Year 3 (FY21): The second half of the CALC and ABE CI programs will be monitored by the ACLS PQ team. Programs not being monitored will receive a check-in or targeted intervention visit.
  • Year 4 (FY22): No site visits or monitoring visits will be conducted.