Mohawk College - Program Quality Assurance Process Audit

Criterion 6

To what extent does your college's quality assurance process ensure that regular program quality assessment, involving a variety of stakeholders (faculty, students, industry representatives, and others as appropriate), is in place and occurs for the purpose of continual improvement?

Requirements:

6.1 The college has implemented a program quality management system that identifies and rectifies weaknesses, and facilitates the evolution of the program to maintain its relevance. This includes:

In the period since the last Audit, the College has further refined its formal, structured processes for program review as a key strategy for maintaining program excellence and relevance. As described earlier in the report, the College has enhanced the annual review process, creating opportunities for more regular and timely continuous improvement. Although not as comprehensive as the full review, the annual review does include examination of the curriculum and program structure.

Every program undergoes comprehensive review at least every five years. The previous approach to comprehensive review was too centralized, resulting in backlogs in the completion of reviews. The new approach maintains a comprehensive process but places more responsibility for completing the review on the academic departments. As the attached graphic indicates, the process takes 12 months to complete and integrates five steps (6.1.01). The Review Team includes the Associate Dean responsible for the program, all full-time faculty teaching in the program under review, and some part-time faculty (as budgets allow). Review Teams participate in extensive orientation and training as initial preparation for the review. The data collection phase (step 2) requires the assembly of comprehensive data in four key categories: 1) Curriculum; 2) Program Information; 3) Pathways and Partnerships; and 4) Alignment with Strategic and Academic Plans. Several key data sources are represented in each of these categories. In step 3, a series of meetings is held to analyze the data and establish requirements for the draft final report. Step 4 involves submission and review of the Final Report, first by the Associate Dean responsible for the program and then by the Vice-President, Academic. The final step addresses monitoring of review recommendations in subsequent annual reviews.

Reviewers have access to ten toolkits (6.1.02a 6.1.02b 6.1.03 6.1.04 6.1.05 6.1.06 6.1.07 6.1.08 6.1.09 6.1.10 6.1.11), which guide them systematically through data collection and analysis.

Careful attention to each of these toolkits ensures that program outcomes are met, and that the views of learners, employers, and other key stakeholders are taken into account. The process also accommodates changes necessary to maintain currency with provincial standards and the requirements of professional bodies.

Regular restructuring, reorganization, and enhancement of the academic programming complement are essential College responses to demographic shifts, changing employment trends, and the needs of employers and students in a competitive economic environment. As discussed elsewhere in this report, the College has introduced processes to address program modification. Program modifications are driven either by changes in provincial program standards or by internal decisions. A structured approvals process has been developed for program modification. The modification approvals process (6.1.12) begins with a statement of rationale that is approved by the Enrolment Planning Committee, and then requires the submission of a full Management Report. Depending on the level of change proposed, this Report requires either internal approval by the Strategic Enrolment Management (SEM) Committee and other senior College management, or external approval from the Credentials Validation Service and the Ministry of Training, Colleges and Universities.

The process for determining continuation or suspension of programs, which has been refined since the last Audit, is strategic, transparent, consultative, and evidence-based. Program Prioritization is governed by the Program Quality Policy. The Process Slide (6.1.13) outlines the steps involved in program prioritization. The process integrates extensive analysis and review, using a set of consistent criteria, many of which are aligned with the College's Program Review process. The Qualitative Analysis Template (6.1.14) ensures thorough consideration of several key issues so that program rationalization decisions are well founded within a consistently applied performance framework.

When a recommendation is made to suspend a program, a statement of rationale is submitted to the SEM Committee as a first step. Approval by the SEM Committee initiates the development of a full management report. The program suspension decision requires approval first by the SEM Committee, then by the Mohawk Executive Group, and finally by the College Board of Governors.

6.2 Documentation and other evidence arising from program quality management processes are maintained and used in ongoing quality management.

The multi-faceted program review process, described above, requires collection and dissemination of a wide variety of documentation. Program review reports are maintained on a central SharePoint site that is accessible to the Vice-President Academic and the Deans. The Vice-President Academic reads all program review reports and follows up as necessary with the Deans regarding key issues and concerns. A quarterly report outlining the status of program reviews is presented to the Board of Governors.

Data collected during the review process, and the recommendations contained in the final program review report, are referenced regularly during the annual reviews and form the basis of the action plan that drives quality assurance strategies in the period between comprehensive reviews. Program Coordinators and faculty are allotted workload hours in the spring/summer semester to implement recommendations from program review.

A similarly rigorous and structured process for new program development ensures the collection of data that academic departments use to build high-quality new programs that are responsive to student needs and aligned with the College mission.

As part of its quality assurance processes, the College uses a varied survey methodology to solicit student input and gather evidence that drives quality assurance initiatives. The Student Feedback on Teaching Survey (6.2.01a 6.2.01b) and the provincial Key Performance Indicator surveys are key tools in identifying course and program improvements. The Centre for Teaching and Learning has developed guidelines to assist faculty in using this data constructively. The College also surveys first-year students and first-generation students to determine levels of satisfaction and identify areas for improvement.

Collection, distribution, and analysis of data related to enrolment and student retention are also important quality assurance strategies. A centralized, structured process ensures that this data is easily accessible and presented in a format that supports annual planning processes. Student retention data is used to inform the development and enhancement of student success initiatives (6.2.02a 6.2.02b - in progress).

6.3 Graduates, employers, students, and other stakeholders indicate satisfaction with the program.

Mohawk College uses a variety of strategies to determine the degree of satisfaction with the program among graduates, students, employers, and other stakeholders. The most visible evidence of satisfaction among current students, graduates, and employers is the data that emerges from the Key Performance Indicator (KPI) Survey, administered annually. Since the 2009 PQAPA, Mohawk College has continued to show steady increases in KPI results and, for the third successive year, holds the status of #1 College for Student Satisfaction in the Greater Hamilton/Toronto area. The College also purchases more in-depth KPI analyses for every program area. Stored on a shared drive, these are accessible to Deans, Associate Deans, Faculty, Program Coordinators, and Support Staff (6.3.01b 6.3.01c).

The Student Feedback on Teaching Survey is a more direct indication of current students' satisfaction with specific faculty and particular courses. The process undergoes regular review and enhancement to ensure its viability and relevance. The online survey is administered each semester and can be completed on smartphones, tablets, laptops, and desktop computers. Survey results are shared with faculty and their Associate Deans and Deans after final marks have been submitted for the term. Results are used to assist with professional development plans for faculty.

Another significant mechanism for securing feedback on specific programs is the Program Advisory Committee (PAC). The Policy (6.3.02) mandates representation from at least one current student and one recent graduate, as well as appropriate community and industry representatives. Since the last Audit, the College has strengthened the role of Program Advisory Committees and made changes to ensure a dedicated Committee for each program or cluster of related programs. Program Advisory Committees meet at least twice yearly to ensure ongoing program quality and relevance. The minutes of each Advisory Committee meeting are distributed to the Vice-President Academic, program faculty, and PAC members (6.3.03a 6.3.03b 6.3.03c 6.3.03d).

Providing feedback during program review is an important responsibility of Program Advisory Committee members. The Toolkit developed to assist academic staff in completing the Final Program Review Report (6.3.04) includes a structured assessment document to demonstrate the level of engagement of Advisory Committee members in maintaining and enhancing program quality.

Additional evidence of employer satisfaction with student and graduate performance is acquired through experiential learning opportunities integrated into College programming. Employers offering clinical placements, field placements, or co-operative education work opportunities are asked to complete evaluation forms that measure their satisfaction with student participants and programming directions. These evaluations are used to identify process enhancements that continue to contribute to the quality of these workplace experiences (6.3.05).

Assessment of Criterion #6

Does the evidence provided for each of the three requirements indicate that the criterion is Met, Partially Met, or Not Met, using the definitions provided on Page 20 of the PQAPA Orientation and Training Manual?

In the event the Criterion is rated as Partially Met or Not Met, what plans have been identified to improve on this?

Our review of the evidence supporting the three requirements of Criterion #6 indicates that this Criterion has been met. Since the 2009 Audit, Mohawk College has reinforced both the policy framework and the quality assurance processes that support programming decisions made by the College. The development of structured tools to support the comprehensive review has enhanced the consistency of this essential quality process, provided faculty with a deeper understanding of the underpinnings of program quality and excellence, and ensured the collection and integration of feedback from key stakeholders.

The chart below identifies initiatives currently in progress or planned to enhance quality assurance at Mohawk.

Initiative                                              Responsibility    Timeline                                            Current Status
Implement Annual Program Review Process                 Program Quality   Pilot in Spring 2014; full implementation 2014/15   Draft process developed
Continue Program Prioritization process                 VPA Group         2014/15                                             Recommendations approved
Establish curriculum committees where they don't exist  VPA Group         2014/15                                             Implementation phase
