6: Quality Assurance and Improvement: Program Review; Assessment; Use of Data and Evidence

(CFRs 2.4, 2.6, 2.7, 2.10, 4.1, 4.2, 4.3, 4.4, 4.5, 4.6, 4.7)

Successful quality improvement efforts are broadly participatory, iterative, and evidence-based. This component of the institutional report includes a discussion of three basic tools of quality improvement—program review, assessment of student learning, and data collection and analysis—and presents the ways these tools inform the institution’s decision making. Institutions may also discuss other quality improvement approaches that have made a difference.

Program review remains a priority for WSCUC. It is a natural nexus and point of integration for the collection of data and findings about the meaning of the degree, the quality of learning, core competencies, standards of student performance, retention, graduation, and overall student success. Because of the commitment of students to their degree programs and the loyalty of faculty to their disciplines, program review has great power to influence the quality of the educational experience. Program review can also provide insight into desirable future directions for the program and the institution.

In addition to implementing systematic program review, institutions are expected to periodically assess the effectiveness of their program review process. They can do so, for example, by reviewing the quality and consistency of follow-up after program reviews; determining how effectively program review addresses achievement of program learning outcomes; and tracing how recommendations are integrated into institutional planning and budgeting.

Assessment, along with program review, is an essential tool that supports the goals and values of the accreditation process. “Assessing the assessment” should not crowd out the work of understanding student learning and using evidence to improve it. However, good practice suggests that it is wise to step back periodically, ask evaluative questions about each stage of the assessment cycle, and seek ways to make assessment more effective, efficient, and economical.

Data provide the foundation for effective program review, assessment of student learning, and other quality improvement strategies. However, to have an impact, data need to be turned into evidence and communicated in useful formats. The discussion of data collection, analysis, and use can include, for example, information about resources provided by the institutional research office (if one exists), software used to generate reports, access to data, processes for making meaning out of data (see the WSCUC Evidence Guide for more information), and mechanisms for communicating data and findings.

Prompts: The following prompts may be helpful in getting started, but the institution is not required to follow these prompts or respond to them directly.

  • How have the results of program review been used to inform decision making and improve instruction and student learning outcomes? (CFRs 2.7, 4.1, 4.3, 4.4)
  • What was identified in the process of examining the institution’s program review process that may require deeper reflection, changes, or restructuring? What will be done as a result? What resources will be required? (CFRs 2.7, 4.1, 4.4, 4.6)
  • What has the program or institution learned as it carried out assessments of students’ learning? How have assessment protocols, faculty development, choices of instruments, or other aspects of assessment changed as a result? (CFR 4.1)
  • How adequate is the institutional research function? How effectively does it support and inform institutional decision making, planning, and improvement? How well does it support assessment of student learning? (CFRs 4.2-4.7)