
Step 3: Determine Program Review Focus
Before beginning the review process, the team must clearly define the evidence-based practices or frameworks that will be used to review individual programs. Teams should carefully consider all sources of data to achieve consensus about the focus and scope of the program review.
Outcomes:
- The steering committee has clearly defined the instructional focus of the program review (e.g., the evidence-based practices or frameworks that will be used to guide the review) and the individual programs, sets of courses, and/or clinical experiences that will be reviewed.
- The steering committee has appointed a faculty-led workgroup to carry out the program review tasks.
Steps
Step 3.1 Decide the Instructional Focus of the Program Review
Guiding Questions
- Based on the data, should program reform and continuous improvement efforts target a specific content area (e.g., reading, math, behavior)?
- Based on the data, should program reform and continuous improvement efforts focus on cross-cutting instructional practices that can be applied across content areas (e.g., Multi-Tiered Systems of Support [MTSS], Universal Design for Learning [UDL], high-leverage practices [HLPs], strengthening practice-based opportunities)?
- What are the pros and cons of focusing on a content area as opposed to focusing on instructional practices that can be applied across content areas and grade levels?
- How will the focus on identified instructional practices affect reforms needed in clinical experiences?
- Which of the CEEDAR Center innovation configuration (IC) topic areas best addresses identified areas of program reform?
- Based on these conversations, what is the steering committee’s final decision about the instructional focus of the program review (i.e., what evidence-based practices and/or frameworks will be used to review programs)?
Step 3.2 Select Individual Programs or Courses to be Reviewed
Guiding Questions
- Based on the data, are there individual programs that should be prioritized for review (e.g., elementary, secondary, dual certification)?
- Is there a particular set of courses that should be prioritized for review (e.g., a set of courses leading to endorsement that is shared across multiple individual programs)?
- How will scope and sequence be considered within and across programs?
- Based on these conversations, what is the steering committee’s final decision about which individual programs will be reviewed?
Step 3.3 Create a Workgroup to Conduct the Program Review
Guiding Questions
- Given the chosen program review focus, which stakeholders should be represented on the workgroup that will carry out the program review tasks? Does this group overlap with the steering committee? What additional members should be added?
- Does the workgroup encompass stakeholders outside of the EPP? For example, are district representatives or program graduates currently teaching in partner district schools involved?
- Does the workgroup include adjunct and/or clinical faculty?
- Do all workgroup members have a clear understanding of the commitment involved in reviewing and enhancing programs (e.g., revising syllabi, developing new course content)?
- Do all workgroup members understand the expectations for program improvements and revisions based on feedback from the review process, and are they committed to moving forward with the process?
Resources
Active Implementation Frameworks:
- Hexagon Tool: Systematically evaluate existing and potential innovations (program reforms) using six broad components: need, fit, resources, evidence, readiness, and capacity. This tool is helpful in determining if an innovation is the right fit for an EPP.
Leading by Convening (Coalescing Around Issues):
- Four Simple Questions: Simplifies the process to reach consensus on the purpose of the reform and a common understanding of the reform process.
Examples
- Georgia State University used an organic approach to program review and continuous improvement. The university used CEEDAR innovation configurations to conduct syllabi analysis to identify program gaps and strengths and then created a matrix to illustrate content covered across coursework and field experiences.
- Grand Valley State University in Michigan used CEEDAR resources to facilitate cross-departmental conversation about early literacy.
- Faculty at Siena Heights University engaged in curricular mapping and analysis of reading/literacy components across all programs, gathering and analyzing data to identify gaps and duplications.
- The Montana CEEDAR team, along with other EPP professional colleagues, aligned statewide and locally used instructional frameworks throughout Montana as a precursor to conducting program reviews. They cross-walked those frameworks with the TeachingWorks High-Leverage Practices (HLPs) and the Council for Exceptional Children HLPs. As a result, they developed the Comprehensive Practices Matrix to illustrate this alignment.