Program Evaluation Unit

The Program Evaluation Unit oversees all aspects of the evaluation of the MD Program and its curriculum. This includes administering more than 100,000 evaluations of courses and clerkships each year. Program evaluation also informs continuous quality improvement and supports the accreditation process.

The Program Evaluation Unit is an integrated administrative unit that is responsible for all aspects of evaluation of the MD program. It includes the Assistant Dean, Program Evaluation, a Program Evaluation Specialist, and a Program Evaluation Analyst.

Contact:

Dr. Joanne Rodger, Assistant Dean, Program Evaluation
780 492 9522 | joanne.rodger@ualberta.ca

Kelly Gibson, Program Evaluation Specialist
780 492 3884 | ume.progeval@ualberta.ca | kngibson@ualberta.ca

Lauren Deines, Program Evaluation Analyst
780 492 4128 | ume.PEanalyst@ualberta.ca | lmmansel@ualberta.ca

 

Program Evaluation Framework (2022)

The work of the Program Evaluation Unit is outlined in the 2022 Program Evaluation Framework document, which describes a systematic approach to the evaluation of the MD Program and its curriculum. The framework outlines the overall approach to program evaluation and quality improvement, the role of the Program Evaluation Unit in implementing the strategy, the sources of data, timelines, and implementation plans.

Program Evaluation Framework

 



Student Feedback

What happens to my feedback?

Student feedback falls into three categories: person, place, and program.

People, Place, & Program

Feedback gathered through course evaluations and other student feedback forms relates to the program. Hot spot surveys focus on the place. Additional feedback sources focus on the person: the professionalism and racism reporting processes are designed to act on the most serious concerns in a timely fashion.


Are you a student who is curious about what happens to the feedback you provide to the MD Program during and after every course or clerkship?

This chart breaks down the different forms students complete and what happens to the information that is shared with the program.

MedSIS Feedback Process

Student evaluations are submitted through MedSIS and provide information to individual instructors and the program. The flowchart below demonstrates this process.

Student Feedback through MedSIS

Hot Spot Surveys

Hot spot surveys are one approach the MD Program uses to address mistreatment from an environmental perspective. Hot spot surveys are jointly sponsored by the MD Program and the Chief Wellness Officer, and have the unanimous support of the chairs of our clinical departments of Family Medicine, Surgery, Obstetrics and Gynecology, Pediatrics, Medicine, Psychiatry, and Emergency Medicine.

Hot spot surveys are organized by rotation and site and are sent to you every four weeks to capture your most recent experience. The overall goal is to identify learning environments that are struggling in any of four key areas: inclusivity, bullying, harassment, and discrimination.


Procedure & Reporting

 

Hot Spot Surveys

  • A four-item electronic survey goes out monthly to students on all mandatory rotations in Years 3 and 4
  • Data collected and displayed in a dynamic dashboard while maintaining respondent anonymity
  • Personal information about respondents is not collected

Monthly review

  • Associate Dean, MD Program, Assistant Dean, Program Evaluation, and Chief Wellness Officer meet monthly to review the data from the hot spot surveys and identify potential educational environments of concern
  • Associate Dean, MD Program reviews areas of concern with the Dean, FoMD
  • Areas of concern that are identified are actioned as needed by the Associate Dean, MD Program with the Chief Wellness Officer, Associate Dean, Professionalism, the appropriate Department Chair, and other stakeholders (e.g. clerkship coordinator)
  • Progress on addressing concerns is monitored and evaluated through ongoing review of hot spot data

Quarterly reports

  • Quarterly (January, April, July, October) reports are generated by the Program Evaluation Unit, reported to MDCPC, and sent to clerkship coordinators and department chairs
  • The report is broken down by rotation, and then by site when the threshold is met (a minimum of 10 responses is required to display a site)

Hot Spot Surveys - Frequently Asked Questions

 

How is the hot spot survey initiated?

This process uses the assess.med system and is similar to how we collect data for entrustable professional activities (EPAs). During core clerkship rotations, all medical students receive a monthly prompt to rate their experience on their last completed rotation and site. The survey comprises four items and should take less than a minute to submit. Participation is voluntary and separate from the evaluations of the rotation and faculty teachers.

Why do we need another survey?

Our existing MedSIS-based feedback process is mandatory and does not address the learning environment. Hot spot surveys focus on the place in our "person, place, and program" approach to student feedback.

How does this approach add value to our existing program evaluation process?

  • There is a dedicated focus on the learning environment itself with feedback in real time.
  • It guarantees that neither the students nor the teaching faculty can be identified.  
  • The process is separate from the existing MedSIS-based evaluations and similar to our existing low-stakes feedback process.
  • Students can be confident that data stewardship for hot spot surveys is the direct responsibility of someone who is known to them, and that their feedback will not be viewed by teaching faculty. 
  • Hot spot survey results are presented in an anonymized format using a rolling average.

Can you provide more details about data stewardship and privacy?

Information collected in the hot spot survey is not designed to link to any person, but rather to the training site and learning environment. Data linked to the site and clinical department are accessible to the Chief Wellness Officer; the Associate Dean, MD Program; and the Assistant Dean, Program Evaluation. No identifiable information about the learner is available. Information is tracked by site, clerkship, and across time.

How and to whom will the data be presented?

Data are presented in regular reports using a rolling average that includes no fewer than the last ten student responses. Clinical department chairs and clerkship coordinators receive a custom quarterly report for their rotations, based on the principles noted above.
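As an illustration, the two reporting rules above (a rolling average over no fewer than the last ten responses, and suppression of any site below that threshold) could be sketched as follows. This is a hypothetical sketch for readers; the data layout and function names are assumptions, not the actual reporting system.

```python
# Illustrative sketch of the hot spot reporting rules described above.
# The data structures and names here are hypothetical assumptions, not
# the implementation used by the Program Evaluation Unit.

MIN_RESPONSES = 10  # minimum responses required to display a site


def site_report(scores, window=MIN_RESPONSES):
    """Return the rolling average of the most recent responses for a site,
    or None when the site has not yet met the display threshold."""
    if len(scores) < MIN_RESPONSES:
        return None  # suppress the site to protect respondent anonymity
    recent = scores[-window:]  # no fewer than the last ten responses
    return sum(recent) / len(recent)


# Example: a site with 12 responses on a 5-point scale
print(site_report([4, 5, 3, 4, 4, 5, 2, 4, 4, 5, 3, 4]))  # prints 3.8
print(site_report([4, 5, 3]))  # prints None (below threshold)
```

Under this rule, a site with nine or fewer responses never appears in a report, so no individual respondent can be singled out.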


The Program Evaluation Unit is an integrated administrative unit that is responsible for all aspects of evaluation of the MD Program.

Who we are

The Program Evaluation Unit:

  • Provides evaluation services to the MD program
  • Delivers strategic reporting to support leadership in the MD program
  • Gathers and reports data to support the accreditation of the medical education program
  • Provides data in response to requests from MD program leadership (including course & clerkship coordinators), for example questions regarding the evaluation of specific sessions or activities, including timely feedback for instructors/preceptors
  • Operationalizes the Program Evaluation Framework
  • Supports evidence-informed continuous quality improvement in the MD program
  • Delivers strategic reporting to measure progress towards the MD program’s strategic plan (2022-2027) and the Program Evaluation Framework
  • Offers evaluation advice and consultation
  • Supports and facilitates health education scholarship

The Program Evaluation Unit includes the Assistant Dean, Program Evaluation and a Program Evaluation Specialist and is supported by the Administrative Assistant, Program Quality & Accreditation.

Reporting Structure

The Program Evaluation Unit reports to the Associate Dean, MD Program and the MD Curriculum & Program Committee (MDCPC) and collaborates with:

  • MD Program Leadership (including Associate & Assistant Deans)
  • Curriculum Management Unit
  • Sub-Committees of MDCPC (e.g. Pre-Clerkship Coordinators Committee, Clerkship Coordinators Committee, Assessment Committee)
  • Other committees (e.g. Admissions Committee, etc.) and offices within the Faculty of Medicine & Dentistry (e.g. Office of Rural & Regional Health, Indigenous Health Program, etc.)
  • Students

Approach to Program Evaluation

Program Evaluation:

  • is integrated
  • is data-driven and evidence-based
  • ensures data and analyses are valid and reliable
  • values confidentiality and anonymity
  • ensures data is available in a timely manner to the appropriate stakeholders
  • emphasizes transparency
  • supports continuous quality improvement processes
  • is collaborative
  • values student involvement and input
  • is engaged with the community

Key Responsibilities of the Program Evaluation Unit

The Program Evaluation Unit is responsible for all aspects of evaluation of the MD Program, as outlined in the Program Evaluation Framework, including student outcomes related to the program-level objectives, strategic initiatives as defined by MDCPC (including the 2022-2027 strategic plan), the effectiveness of instructors/preceptors, and the student learning environment. The following summarizes selected responsibilities of the Program Evaluation Unit, organized by broad purpose, specific tasks, and interactions.

Broad Purpose: Evaluation of student outcomes related to the program-level objectives

Specific Tasks:

  • Develop, monitor, and review methods and outcomes to ensure the consistency, reliability, and validity of program evaluation
  • Develop new and innovative program evaluation methods, establish appropriate mechanisms to introduce and evaluate them, and monitor evaluation policies and practices to ensure consistency with evidence from the literature
  • Review and present collected evaluation data to inform relevant stakeholders about program-related outcomes
  • Gather, analyze, and present data and evidence to ensure overall program effectiveness

Interactions:

  • MDCPC
  • Pre-Clerkship & Clerkship Coordinators Committees
  • Assessment Committee
  • Admissions Committee
  • Assistant Deans Council
  • Directors Council
  • Black Health Lead
  • Social Accountability Lead
  • Students (represented by the Medical Students Association, Indigenous Medical & Dental Students Association, and Black Medical Students Association)

Broad Purpose: Evaluation of strategic initiatives as defined by MDCPC, including the strategic plan and CACMS accreditation

Specific Tasks:

  • Gather, analyze, and present data related to CACMS accreditation standards
  • Gather, analyze, and present data related to the MD program strategic plan
  • Gather and analyze data to respond to specific, ad hoc questions from MDCPC or other stakeholders

Interactions:

  • Associate Dean, MD Program
  • MDCPC & Sub-Committees
  • Assistant Deans Council
  • Directors Council
  • Course/Clerkship Coordinators
  • Instructors/Preceptors


Broad Purpose: Evaluation of the effectiveness of instructors/preceptors, the curriculum, and the learning environment

Specific Tasks:

  • Gather and analyze evaluation data from students (about learning sessions, courses, clerkships, and individual instructors, preceptors, and coordinators)
  • Present and report aggregate student feedback about courses/clerkships, course/clerkship coordinators, individual session instructors, and preceptors to inform curriculum review, management, and renewal

Interactions:

  • Associate Dean, MD Program
  • Assistant Dean, Curriculum
  • Curriculum Management Unit
  • Students
  • Department Chairs
  • Individual Instructors/Preceptors
  • Learning Science team, including the Director, Learning Science

 

Program Evaluation Unit Service Model

The following diagram provides an overview of the Program Evaluation Unit’s service model.

[Diagram: Program Evaluation Unit service model]

The full Program Evaluation Framework document is available here.

 


Teaching Evaluation Score (TES) Reports

Instructors who teach in the MD Program are now sent an email through MedSIS each time a new Teaching Evaluation Score (TES) report is published in MedSIS. The email includes a direct link to their individual report (for security purposes, the link expires in 72 hours). TES reports provide instructors with feedback from students, as long as the instructor has three or more evaluations.
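The three-evaluation minimum acts as a small-sample anonymity threshold: feedback is aggregated into a report only once enough evaluations exist that individual respondents cannot easily be inferred. A minimal sketch of that rule, using hypothetical names (this is not the actual MedSIS logic):

```python
# Hypothetical sketch of the minimum-evaluations rule for TES reports.
# Names and structures are assumptions, not the actual MedSIS logic.

MIN_EVALUATIONS = 3  # reports are released only with three or more evaluations


def tes_report(evaluations):
    """Return an aggregate score for an instructor, or None when there are
    too few evaluations to release a report while protecting anonymity."""
    if len(evaluations) < MIN_EVALUATIONS:
        return None
    return round(sum(evaluations) / len(evaluations), 2)


print(tes_report([4, 5]))        # prints None: too few evaluations
print(tes_report([4, 5, 3, 4]))  # prints 4.0
```

With fewer than three evaluations, no score is released at all, rather than a score based on one or two identifiable responses.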

Faculty members who have questions or concerns about their TES reports are encouraged to contact the coordinator for the course or clerkship they taught in, Dr. Lana Bistritz, Assistant Dean, Curriculum (lmb4@ualberta.ca), or Dr. Darryl Rolfson, Associate Dean, MD Program (drolfson@ualberta.ca).

Instructors who require access to their TES reports separate from the direct link can still log in to MedSIS and access their reports using these instructions. For any questions about accessing TES reports through MedSIS, please contact Lauren Deines, Program Evaluation Analyst (ume50@ualberta.ca).