
Step 4: Review Programs
A systematic review of the coursework and clinical experiences within programs can help identify strengths, weaknesses, and opportunities for improved alignment with evidence-based practices and frameworks. A data-driven program review is the foundation for developing a strong action plan for program reform.
Outcomes:
- The workgroup has chosen appropriate program review tools (e.g., CEEDAR Center ICs) and has established a process to conduct the review, including division of tasks and responsibilities among workgroup members.
- The workgroup has analyzed data from the program review and has shared initial findings with the rest of the faculty and other applicable stakeholders.
Steps
Step 4.1 Choose the Program Review Tools
Guiding Questions
- How will the workgroup use the CEEDAR Center ICs or other tools to facilitate the program review process?
- Does the workgroup understand the purpose of the CEEDAR Center ICs? Do they understand that the ICs are self-assessment tools with no rating or accountability function?
- How will the workgroup receive training to use the CEEDAR Center ICs? Will the workgroup use the online ICs on the NIC to complete the program matrix?
Step 4.2 Establish the Program Review Process
Guiding Questions
- How will faculty on the workgroup divide responsibility for reviewing syllabi for courses and clinical experiences for the selected programs?
- Will faculty individually or collectively review courses within programs? Will there be checks for interrater reliability?
- What level of input and control will faculty be granted in the program revision process?
- What role will the dean or program chair have in leading and facilitating the program review process?
- What supports and resources (e.g., training, time, faculty buy-outs, stipends) are needed for faculty to adequately engage in program review?
- How will the workgroup analyze data generated by the CEEDAR Center ICs or other program review tools?
Step 4.3 Analyze Program Review Data
Guiding Questions
- What did the program review reveal? What are areas of strength within the program? What are areas for improvement within the program?
- Did specific courses or content strands emerge in the review that should be prioritized for revision and course enhancement? Did specific courses emerge that do not directly align with the teacher standards?
- Did the review reveal issues in scope and sequence across content? For example, do teachers receive training on phonemic awareness across several courses while comprehension receives no attention?
- What do the data suggest about gaps in curriculum across the program?
- What do the data suggest about duplications in curriculum across the program?
- How will faculty beyond the workgroup have an opportunity to analyze, discuss, and reflect on the findings generated during the program review process?
- How will the workgroup communicate and share initial findings from the program review beyond the workgroup?
Resources
CEEDAR Center:
- CEEDAR Center Innovation Configurations (ICs): Determine the extent to which EBPs are taught, observed, and applied within teacher preparation or professional development programs. Watch this video to obtain guidance on how to use ICs.
Concerns Based Adoption Model:
- ICs: Define the components of an innovation or intervention that an EPP team is implementing.
Active Implementation:
- Usable Innovations module: Helps EPP teams come to some agreement about what the program looks like to faculty, such as a set of EBPs or frameworks used to examine coursework and clinical experiences.
- Getting Started with Usable Innovations: Encourages EPP teams to think through how their program review is defined, including a clear description, a clear and essential function, an operational definition, and a performance assessment.
Examples
- Connecticut’s state leadership team focused on teacher preparation reform in evidence-based practices (EBPs) in literacy and culturally responsive pedagogy. Louise Spear-Swerling at Southern Connecticut State University discussed her experiences with preparation reform and highlighted the use of CEEDAR Center innovation configurations to strengthen coursework provided by Dr. Nicoll-Senft at Central Connecticut State University. Faculty workgroups analyzed curricula to identify gaps, redundancies, and priorities in EBPs in literacy and writing.
- Rhode Island EPPs have used CEEDAR innovation configurations, course enhancement modules, and practice-based opportunities to review and improve their programs. The focus of program review has been aligned to state initiatives and priority areas related to EBPs, specifically in the areas of mathematics and intensive intervention.