Program Evaluation Unit

The Program Evaluation Unit (PEU) oversees all aspects of the evaluation of the MD Program and curriculum. This includes administering more than 100,000 evaluations of courses, clerkships, and instructors each year. Program evaluation also informs continuous quality improvement and supports the accreditation process.

The PEU is an integrated administrative unit responsible for all aspects of the evaluation of the MD Program. It includes the assistant dean of program evaluation, a program evaluation specialist and a program evaluation analyst.

Contact 

Dr. Joanne Rodger, Assistant Dean, Program Evaluation
Phone: 780-492-9522
Email: joanne.rodger@ualberta.ca

Kelly Gibson, Program Evaluation Specialist
Phone: 780-492-3884
Email: ume.progeval@ualberta.ca or kngibson@ualberta.ca

Lauren Deines, Program Evaluation Analyst
Phone: 780-492-4128
Email: ume.PEanalyst@ualberta.ca or lmmansel@ualberta.ca

Program Evaluation Framework

The work of the PEU is outlined in the Program Evaluation Framework document. The document describes a systematic approach to the evaluation of the MD Program and its curriculum. This framework outlines the overall approach to program evaluation and quality improvement, the sources of data, timelines and implementation plans.

How Medical Students’ Feedback Impacts the MD Program

Student Feedback

Student feedback falls into three categories: program, place and person.

Program

This category includes feedback gathered through student feedback forms such as course/clerkship evaluations, preceptor or resident evaluations and session evaluations.

Place

This refers exclusively to hot spot surveys, which are used by the MD Program to address mistreatment in the learning environment.

Person

These feedback sources include the professionalism and racism reporting processes designed to act on the most serious concerns in a timely fashion.

MedSIS Feedback Process

Student evaluations are submitted through MedSIS and provide information to individual instructors and the program. The flowchart below demonstrates this process.

Student Feedback through MedSIS

Hot Spot Surveys

Hot spot surveys are one approach the MD Program uses to address mistreatment in the learning environment. Hot spot surveys are jointly sponsored by the MD Program and the chief wellness officer and have the unanimous support of the chairs of our clinical departments in Family Medicine, Surgery, Obstetrics and Gynecology, Pediatrics, Medicine, Psychiatry and Emergency Medicine.

Hot spot surveys are organized by rotation and site and sent out to you every four weeks to capture your most recent experience during clerkship. The overall goal is to focus on learning environments that are struggling in any of four key areas: inclusivity, bullying, harassment and discrimination.

Procedure + Reporting

Hot Spot Surveys
  • A monthly electronic survey with four items goes out to students on all mandatory rotations in Years 3 and 4
  • Data is collected and displayed in a dynamic dashboard while maintaining respondent anonymity. Personal information about respondents is not collected
Monthly Review
  • Associate dean, MD Program, assistant dean program evaluation and chief wellness officer meet quarterly to review the data from the hot spot surveys and identify potential educational environments of concern
  • Associate dean, MD Program reviews areas of concern with the vice dean, FoMD
  • Identified areas of concern are actioned as needed by the associate dean, MD Program with the chief wellness officer, associate dean, professionalism, the appropriate department chair and other stakeholders (e.g. clerkship coordinator)
  • Progress on areas of concern is monitored/evaluated by ongoing scrutiny of hot spot data
Quarterly Reports
  • Quarterly reports are generated by the PEU, reported to MDCPC and sent to clerkship coordinators and department chairs
  • Reports are broken down by rotation, and by site when the threshold is met (a minimum of 10 responses is required to display a site)

Hot Spot Surveys — Frequently Asked Questions

How is the hot spot survey initiated?

The process uses the Assess.Med system and is similar to how we collect data for entrustable professional activities.

During core clerkship rotations, all medical students will receive a monthly prompt to rate their experience on their last completed rotation and site. The survey comprises four items and should take less than a minute to submit. Participation is voluntary and separate from the evaluations of the rotation and preceptors.

Why do we need another survey?

Our existing feedback process using MedSIS is mandatory and doesn't address the learning environment. Hot spot surveys focus on the place in our "person, place and program" approach to student feedback.

How does this approach add value to our existing program evaluation process?

  • There is a dedicated focus on the learning environment itself, with feedback in real time
  • It guarantees that neither the students nor the teaching faculty can be identified
  • The process is separate from the existing MedSIS-based evaluations and similar to our existing low-stakes feedback process
  • Students can be confident that data stewardship for hot spot surveys is the direct responsibility of someone who is known to them and that their feedback will not be viewed by teaching faculty
  • The hot spot survey results will be presented in an anonymized format using a 12-month rolling average

How is data stewardship handled and privacy maintained?

Information collected in the hot spot survey is connected not to any person but to the training site and learning environment. Data linking to the site and clinical department will be accessible to the chief wellness officer, associate dean, MD Program, and assistant dean, program evaluation. No identifiable information about the learner will be available. Information will be tracked by site, by clerkship and across time.

How and to whom will the data be presented?

Data is presented in regular reports using a rolling average that includes no fewer than the last 10 student responses. Clinical department chairs and clerkship coordinators receive a quarterly custom report for their rotations, based on the principles above.
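To illustrate the reporting rule described above (a site is only displayed once it reaches the minimum-response threshold, and its result is a rolling average of recent responses), here is a minimal sketch. The function name, data shapes and window size are hypothetical illustrations, not part of MedSIS or Assess.Med.

```python
from collections import defaultdict

MIN_RESPONSES = 10  # minimum responses before a site's results are displayed


def site_report(responses, window=10):
    """Return per-site average scores, suppressing low-volume sites.

    `responses` is a list of (site, score) pairs in chronological order.
    Sites with fewer than MIN_RESPONSES responses are omitted entirely to
    protect respondent anonymity; displayed sites are averaged over their
    most recent `window` responses (a stand-in for the rolling average
    described above).
    """
    by_site = defaultdict(list)
    for site, score in responses:
        by_site[site].append(score)

    report = {}
    for site, scores in by_site.items():
        if len(scores) >= MIN_RESPONSES:
            recent = scores[-window:]  # rolling window of most recent responses
            report[site] = sum(recent) / len(recent)
        # sites below the threshold are deliberately left out of the report
    return report
```

With 12 responses for one site and only 3 for another, only the first site appears in the report; the second is suppressed until it crosses the threshold.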

About the PEU

Who We Are

The PEU administers and oversees all aspects of MD Program evaluation. Our services:

  • Provide evaluation services to the MD Program
  • Deliver strategic reporting to support leadership in the MD Program
  • Gather and report data to support the accreditation of the medical education program
  • Provide data in response to requests from MD Program leadership (including course and clerkship coordinators). For example, questions regarding the evaluation of specific sessions or activities, including timely feedback for instructors/preceptors
  • Operationalize the Program Evaluation Framework
  • Support evidence-informed continuous quality improvement in the MD Program
  • Deliver strategic reporting to measure progress toward the MD Program’s strategic plan and the Program Evaluation Framework
  • Offer evaluation advice and consultation
  • Support and facilitate health education scholarship

The PEU includes the assistant dean, program evaluation, and a program evaluation specialist, and is supported by the program evaluation analyst.

Reporting Structure

The Program Evaluation Unit reports to the associate dean, MD Program and the MD Curriculum & Program Committee (MDCPC) and collaborates with:

  • MD Program Leadership (including associate and assistant deans)
  • Curriculum Management Unit
  • Sub-Committees of MDCPC (e.g. Pre-Clerkship Coordinators Committee, Clerkship Coordinators Committee, Assessment Committee)
  • Other committees (e.g. Admissions Committee, etc.) and offices within the FoMD (e.g. Office of Rural & Regional Health, Indigenous Health Program, etc.)
  • Students

Approach to Program Evaluation

Program evaluation:

  • Is integrated
  • Is data-driven and evidence-based
  • Ensures data and analysis are valid and reliable
  • Values confidentiality and anonymity
  • Ensures data is available in a timely manner to the appropriate stakeholders
  • Emphasizes transparency
  • Supports continuous quality improvement processes
  • Is collaborative
  • Values student involvement and input
  • Is engaged with the community

Key Responsibilities of the PEU

The PEU is responsible for all aspects of evaluation of the MD Program as outlined in the Program Evaluation Framework, including student outcomes related to the program-level objectives, strategic initiatives as defined by MDCPC (including the 2022-2027 strategic plan), the effectiveness of instructors/preceptors and the student learning environment. The following table summarizes selected responsibilities of the PEU.

Broad Purpose: Evaluation of student outcomes related to the program-level objectives

Specific Tasks:
  • Develop, monitor and review methods and outcomes to ensure the consistency, reliability and validity of program evaluation
  • Develop new and innovative program evaluation methods and establish appropriate mechanisms to introduce and evaluate them
  • Monitor evaluation policies and practices to ensure consistency with evidence from the literature
  • Review and present collected evaluation data to inform key partners about program-related outcomes
  • Gather, analyze and present data and evidence to ensure overall program effectiveness

Interactions:
  • MDCPC
  • Pre-Clerkship & Clerkship Coordinators Committees
  • Assessment Committee
  • Admissions Committee
  • Assistant Deans Council
  • Directors Council
  • Black Health Lead
  • Social Accountability Lead
  • Students (represented by the Medical Students Association, Indigenous Medical & Dental Students Association and Black Medical Students Association)

Broad Purpose: Evaluation of strategic initiatives as defined by MDCPC, including the strategic plan and CACMS accreditation

Specific Tasks:
  • Gather, analyze and present data related to CACMS accreditation standards
  • Gather, analyze and present data related to the MD Program strategic plan
  • Gather and analyze data to respond to specific, ad hoc questions from MDCPC or other partners

Interactions:
  • Associate dean, MD Program
  • MDCPC & sub-committees
  • Assistant Deans Council
  • Directors Council
  • Course/clerkship coordinators
  • Instructors/preceptors

Broad Purpose: Evaluation of the effectiveness of instructors/preceptors, the curriculum and the learning environment

Specific Tasks:
  • Gather and analyze evaluation data from students (about learning sessions, courses, clerkships and individual instructors, preceptors and coordinators)
  • Present and report aggregate student feedback about courses/clerkships, course/clerkship coordinators, individual session instructors and preceptors to inform curriculum review, management and renewal

Interactions:
  • Associate dean, MD Program
  • Assistant dean, curriculum
  • Curriculum Management Unit
  • Students
  • Department chairs
  • Individual instructors/preceptors
  • Learning science team, including director, learning science

PEU Service Model Overview

[Diagram: PEU service model overview]

*Full Program Evaluation Framework document is available here.

Teaching Evaluation Score (TES) Reports

Instructors who teach in the MD Program are now sent an email through MedSIS every time a new TES report is available to them. The email includes a direct link to their individual report (for security purposes, the link expires after 72 hours). These TES reports provide instructors with feedback from students, provided the instructor has three or more evaluations.

Faculty members who have questions or concerns about their TES report are encouraged to contact the coordinator for the course or clerkship they taught in: Dr. Lana Bistritz, Assistant Dean, Curriculum (lmb4@ualberta.ca), or Dr. Darryl Rolfson, Associate Dean, MD Program (drolfson@ualberta.ca).

Instructors who require access to their TES reports separate from the direct link can log in to MedSIS and access their reports using these instructions. For any questions about accessing the TES reports through MedSIS, contact Lauren Deines, Program Evaluation Analyst (ume50@ualberta.ca).