Program review

Purpose

Program review is an essential process that engages faculty in the systematic evaluation of Syracuse University’s academic offerings. Program review contributes to the improvement of the University’s academic programs and informs our planning of degree offerings based on principles of shared governance. Program review informs departmental, school/college, and University discussions, decisions, and recommendations. By giving increased attention to our existing academic offerings, faculty can teach the topics they love in the context of academic programs that are best structured to meet the learning goals of our students.

External groups have weighed in on the importance of program review. In a 2015 report, the Educational Advisory Board commented, “… the proliferation of courses, specializations, and programs spreads resources more thinly across a broader array of activities, reducing quality … while at the same time producing a level of complexity that creates barriers to student success.” The University’s regional accreditor, the Middle States Commission on Higher Education, comments that “program review [is] used to change and improve educational programs, consistent with institutional values, purpose, and goals.” Middle States expects the University to review every academic program on a four-year cycle, to use feedback from program assessment to improve existing programs, and to merge or sunset programs that have reached the natural juncture for such changes.

The purpose of program review is to craft and maintain a set of high-quality academic programs that support our educational objectives for students while making effective use of our institutional resources. If we do this well, we will maintain a portfolio of high-quality programs that are consistent with our mission, sought by students, and sustainable.

Consistent with Middle States and University expectations, academic programs are reviewed for their quality, demand, cost-effectiveness, and centrality to mission. Defined below, these four characteristics form the basis of Syracuse University’s program review:

Quality: The quality of the program is demonstrable by the extent of student learning, student persistence, employment outcomes, or other markers appropriate to the discipline.
Demand: There is sufficient student demand, in the form of student enrollments and/or student majors, and the potential to sustain or grow that demand, to warrant maintaining the program.
Cost-Effectiveness: The value of the program to students and to the University warrants the resources required to maintain the program.
Centrality to Mission: The program is deeply connected to the successful execution of our mission as a pre-eminent and inclusive student-focused research university as well as the specific mission of the school/college.

Process

Each school and college at Syracuse University reviews each academic program on a four-year cycle. Through the review process, faculty are asked to review common data sets along with other information to create an overview of a specific academic program. Syracuse University’s program review process incorporates all of the following features:

  • A fair, equitable, faculty-driven procedure for evaluating each program for which the school/college is responsible, using assessment outcomes, institutional data, and disciplinary norms to make recommendations;
  • Collection of evidence for evaluation that addresses the four characteristics of quality, demand, cost-effectiveness, and centrality to mission;
  • A schedule for program review that allows each program in the school or college to be evaluated at least once every four years; for schools/colleges or programs with specialized accreditors, the specialized accreditation timelines should be factored into the program review calendar to minimize redundant work;
  • A process of cross-college consultation on joint programs and other programs where modifications, mergers, or closures would affect the work of another school/college;
  • An annual school-wide or college-wide review of the full portfolio of programs to act on recent recommendations and ensure fit with the mission of the school/college and the University; this review should consider program-specific assessment plans and annual assessment progress reports as one element of the evaluation process;
  • Inclusion of the evaluation conclusions and recommended improvements identified in one review cycle as success criteria in the next review cycle.

Program Review – Process Overview

[Figure: Program review roles]

Steps to Undertake Systematic Program Review

  1. Appoint a Program Review Chair: Each dean can appoint a faculty member or staff member to take responsibility for structuring the review process. This can be the same person who leads in the area of assessment or curriculum. Program review is an annual activity, so this responsibility would ideally be held for a multi-year period. For schools/colleges that have specialized accreditation, these processes may mesh with the existing assessment and compliance duties of the staff or faculty member who runs the specialized accreditation process. The Program Review Chair will set the annual schedule for evaluation and provide the reporting deadlines for program representatives and others involved in the program review process.
  2. Appoint Program Representatives: For each unit (e.g., an academic department) that “owns” a program or set of related programs, the dean appoints one or more faculty representatives who can gather data (see next item) about the programs under review in a given year. Depending upon local culture, this could be a faculty member, a program director, a department chair, or an associate chair.
  3. Collect Data: Data about a program should always contain the common data elements (below) plus any additional indicators the program representatives consider relevant in each of the four areas. The Office of Institutional Research and Assessment (OIRA) will provide the common data elements.
  4. Submit Program Reports to School/College Curriculum Committee: Each school and college has at least one committee dedicated to curriculum management. This committee can obtain program reports from program representatives and evaluate them using a uniform set of judgment criteria that apply to all programs in the school or college. For each reviewed program, the committee should make an evaluative judgment and a recommendation: update the program with suggested improvements, maintain the program as is, merge the program with another related program, or close the program. The school/college committee then submits its report, along with recommendations, to the respective dean.
  5. Provide Mechanisms for Faculty Appeal of Recommendations: Program representatives should have an opportunity to consult with program faculty and, if necessary, appeal program recommendations by presenting additional program data to the school/college curriculum committee. Acting through the program review chair, the dean of the school/college can apply a set of deadlines and adjudication procedures to ensure a fair and equitable final decision about the program(s) in question.
  6. Conduct Consultations and Program Actions: Substantive modifications to a program’s academic content, a decision to merge, or a decision to close should be undertaken in consultation with other schools/colleges that may be affected by program changes. Most program and course changes require the approval of the Senate Committee on Curricula, and some program changes also involve communications with the New York State Education Department and our regional accreditor, the Middle States Commission on Higher Education. The Office of the Provost and Vice Chancellor can provide guidance and support on all of these administrative steps.

The Office of Institutional Effectiveness and Assessment can provide assistance with the program review process.

Evidence for Program Evaluation

Evidence evaluated for program review comprises a core data set common across all program reviews, plus any additional data the school/college may deem appropriate for individual programs. A concise narrative should accompany the presentation of the data. Data elements should be referenced in the report’s appendices. Reports should be approximately five pages. The table below lists the core data set, followed by examples of additional data schools and colleges may find valuable when examining individual programs. Additional data should be chosen to provide multiple types of evidence and a holistic view of the program.

Within broad disciplinary areas (such as STEM), it is valuable to use consistent criteria for programs in each area. Academic program leaders should not pick and choose what evidence to present; rather, the school/college curriculum committee should provide guidance and set expectations as to what evidence should be included in each program’s report.

Program Review Overview Memo 2018