Universal Student Ratings of Instruction (USRI)

Collect feedback that counts

Universal Student Ratings of Instruction (USRIs) gather feedback from classes to help instructors, departments and faculties improve curriculum and instruction. The results also serve as one important factor in decisions affecting the career of your instructor.

The collection of USRIs is regulated by the General Faculties Council (GFC) policy. The current version of this policy may be viewed in Section 111.3 of the GFC Policy Manual.

When do USRIs occur?

Generally, USRIs are available for students to complete once the withdrawal deadline for classes has passed, and remain open until the last day of classes.

What is the USRI process?

Step 1: Prior to the start of the rating period, instructors will receive an email announcement with further information and important dates for the rating period.

Step 2: Once the announcement email is received, instructors should notify students that they will be receiving an email with instructions and encourage participation.

Step 3: When student ratings become available, students will receive an email with instructions and appropriate links to complete their ratings.

Step 4: During the rating period, instructors will receive a helpful reminder from TSQS to encourage student participation.

Step 5: If students have not yet completed their ratings, they will receive an initial reminder via email.

Step 6: A secondary email reminder will be sent to any students who have not yet completed the rating.

Step 7: Once the rating period is complete, instructors will be able to view results online through the USRI Instructor Reports.

When will I receive results from USRIs?

Instructors will receive rating results within 20 working days after the course is completed, and once the Dean, Director or Chair has signed the grade sheet.

How can instructors check response rates during the USRI rating period?

To check the response rate during the USRI course rating period, log into the USRI system; your homepage will provide a status overview and the current response rates for your courses.

Log into USRI

Where can instructors find USRI Instructor results?

To access numerical data:
If your class had at least ten students, a report of your student ratings will be available through: https://tsqs.srv.ualberta.ca/cgi-bin/indusri/ind_usri.pl.

To access comments:
A summary of comments will be available through:

To access previous comments:

  • After login, click on "Reports" (second icon, top-left)
  • Select "Individual Reports"
  • Set top drop-down boxes (Year, Period, Status, Survey) to "Show All"
  • The system will display the courses for which you can access comments

If you have comments or concerns about this process, please contact us at test.scoring@ualberta.ca.

Are USRI results made public?

On October 12, 1993, the General Faculties Council (GFC) of the University of Alberta modified its policy concerning Teaching Evaluation and Student Evaluation of Instruction to include the requirement for the collection of students' ratings of instruction on a University-wide basis using a basic set of mandated questions. The policy also made provision for releasing the associated results to the Students' Union and the Graduate Students' Association. Currently, results are not made "public" unless there have been at least 10 completed questionnaires for a class.

In 2011, online access to results was restricted to registered students. This was followed by allowing instructors to see results for their own classes and, later, providing access to Deans, Directors, and Chairs to view results for their employees. Beginning in July 2012, Deans, Directors and Chairs may extend this access to individuals whom they designate by sending an e-mail to test.scoring@ualberta.ca which provides:

  • Their name and employee number
  • The name and employee number of the person to whom they wish to grant designated access

The current version of this policy may be viewed by linking to Section 111 of the GFC Policy Manual. Included in this policy is the following cautionary note concerning the results obtained from students' ratings:

"Student questionnaires form an important part of evaluating teaching effectiveness but cannot be taken alone as a complete assessment of an instructor or course. Factors other than an instructor's teaching ability may influence ratings. These factors include class size, class level, Faculty, time of class, required versus optional course, grade expectations, student GPA, and the gender, race, ethnicity, and age of both students and instructors. Small differences in evaluation should not be considered as meaningful."

Login to Search for USRI Results

What does the USRI report consist of?

A one-page report is generated for each class from which students' ratings have been collected. The Instructor Report contains the text of each of the rating questions appearing on the questionnaire, reported in the sequence in which they were printed. Following the text of each question, the report lists the number of students who selected each point on the rating scale: Strongly Disagree (SD), Disagree (D), Neutral (N), Agree (A), and Strongly Agree (SA). These frequencies are followed by the median of the responses and reference data.
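As an illustration of the per-question summary described above, here is a minimal Python sketch. The function name `summarize` and the response data are made up for illustration; this is not TSQS's actual reporting code, and the real reports may compute the median differently (e.g. with interpolation for ordinal data).

```python
# Tally five-point responses and compute a simple median for one question.
# Responses are coded 1-5, where 1 = Strongly Disagree and 5 = Strongly Agree.
SCALE = ["SD", "D", "N", "A", "SA"]

def summarize(responses):
    """Return (frequency counts per scale point, median of the responses)."""
    counts = {label: responses.count(i + 1) for i, label in enumerate(SCALE)}
    ordered = sorted(responses)
    mid = len(ordered) // 2
    if len(ordered) % 2:           # odd number of responses
        median = float(ordered[mid])
    else:                          # even: average the two middle values
        median = (ordered[mid - 1] + ordered[mid]) / 2
    return counts, median

# Hypothetical class of ten students
counts, median = summarize([5, 4, 4, 3, 5, 4, 2, 4, 5, 4])
print(counts, median)
```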

Additional USRI reports available.

What does the USRI instrument look like?

USRI Reference Data

Reference Groups for Comparative Ratings

The columns of reference data display statistics from Tukey's box-and-whisker plot analysis (John W. Tukey, Exploratory Data Analysis, Addison-Wesley Publishing Company, Inc. 1977). The values displayed are derived from all classes in the indicated reference group. These statistics are chosen to achieve two main objectives:

  1. To summarize skewed distributions of data, and
  2. To identify outliers from the general population, if they exist.

The median value (the middle of a ranked set of numbers) is generally preferred over the mean for identifying the centre of a skewed distribution of scores. In the reference data, the median reported for a group is the value below which 50 percent of the medians from the classes in that group lie. Please note that data for the items in the current set of mandated questions have been accumulated from Academic Year 2005/06 onward. If an item (question) has been used fewer than 15 times by the indicated reference group since then, the reference data cells will be filled with the text "too few uses". It is theoretically possible for all median scores in a single year to be above, or below, the Reference Group median.

The 25th and 75th percentiles provide information about the spread of scores around the median. By definition, 25 percent of the scores are above the 75th percentile and 25 percent are below the 25th percentile. Since this occurs by definition, these values should not be used to determine whether a particular score is good or bad.

The lower Tukey Fence, which is the 25th percentile minus 1.5 times the distance from the 25th to the 75th percentile, defines a reasonable limit beyond which a score can be considered an outlier. Outliers are scores that appear to be outside the usual distribution of scores for the population being tabulated (i.e., for the indicated reference group). Given the nature of the USRI data, the upper Fence will usually be above 5.0 and, therefore, need not be reported.
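The percentile and fence calculations above can be sketched in a few lines of Python. This is an illustrative sketch only: the class medians are made up, `tukey_reference_stats` is a hypothetical helper rather than part of the USRI system, and `statistics.quantiles` interpolates percentiles slightly differently than some other conventions.

```python
import statistics

def tukey_reference_stats(class_medians):
    """Return (lower fence, 25th, median, 75th) for a reference group of class medians."""
    q1, q2, q3 = statistics.quantiles(class_medians, n=4)  # 25th, 50th, 75th percentiles
    lower_fence = q1 - 1.5 * (q3 - q1)  # scores below this are flagged as outliers
    return lower_fence, q1, q2, q3

# Hypothetical reference group of class medians on the 1-5 rating scale
medians = [3.6, 3.9, 4.0, 4.1, 4.2, 4.3, 4.3, 4.4, 4.5, 4.6, 4.7, 4.8]
fence, q1, med, q3 = tukey_reference_stats(medians)
print(f"Lower fence: {fence:.2f}, 25th: {q1:.2f}, median: {med:.2f}, 75th: {q3:.2f}")
```

Because ratings are capped at 5.0, the corresponding upper fence (q3 plus 1.5 times the interquartile distance) usually exceeds the scale maximum, which is why only the lower fence is reported.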

Please note that some items can be expected to elicit higher ratings because they are closer to "apple pie" types of items (i.e., items we would expect to be rated quite positively). This is illustrated by the campus-wide results accumulated in the years 2000-2004 for the two items shown below.

Item                                                       Tukey Fence   25%   50%   75%
The instructor treated students with respect.                  3.4       4.3   4.6   4.8
Overall, the quality of the course content was excellent.      2.9       3.8   4.1   4.3

This suggests that the median obtained for the first item in a particular class can be expected to be about 0.5 of a rating point above that for the second item, simply because that has been found to be the case in results from thousands of classes surveyed at the University of Alberta. Note that the 25th percentile for the first item corresponds to the 75th percentile for the second item.

Also, the reference group used for a particular class consists of all classes in the indicated department or faculty. One of the most consistent findings of researchers studying students' ratings of instruction is that the ratings obtained for items such as those addressing general satisfaction with a course or instructor depend on the discipline in which the course is taught. Franklin and Theall (1995) reported that "professors in fine arts, humanities, and health-related professions are more highly rated than their science, engineering and math-related colleagues." There appears to be a combination of reasons for these differences, including diversity in the characteristics of the students, in the nature of the subject matter, and in the course objectives that are emphasized in different disciplines. The sizes of the differences, and the conclusion that they are not necessarily related to characteristics of the instructors in the different disciplines, lead to the advice that "we must continue to be very cautious about — if not prohibited from — using the results of student evaluations to make comparisons across disciplines" (Marincovich, 1995).

For example, the item "Overall, this instructor was excellent" illustrates that results at the University of Alberta are consistent with the research studies. The reference data from some of the departments in which a large number of classes have been surveyed appear in the following table.

Department                              Tukey Fence   25%   50%   75%
Physics                                     2.4       3.7   4.1   4.5
Computing Science                           2.5       3.7   4.1   4.5
Electrical & Computer Engineering           2.7       3.9   4.2   4.6
Mathematical & Statistical Sciences         2.8       3.9   4.2   4.6
Earth & Atmospheric Sciences                3.0       4.0   4.3   4.6
Biological Sciences                         3.1       4.0   4.3   4.6
English                                     2.8       4.0   4.4   4.7
Modern Languages & Cultural Studies         2.9       4.0   4.4   4.8
History & Classics                          3.4       4.2   4.5   4.7
Elementary Education                        2.7       4.0   4.5   4.8
Drama                                       2.9       4.1   4.7   4.9

What provisions are made for student anonymity in the USRI system and process?

GFC Policy 111.3 D states the importance of student anonymity in completing course and instructor survey questions. Free expression of views in the ratings is essential, so long as the safety of the members of the University community is upheld. The following measures ensure anonymity in the process:

  1. The survey administrator cannot identify the student through the survey tools unless the student self-identifies.
  2. The survey tools are truly anonymous.
  3. Your ID/username is not attached to the survey results.
  4. You must log in so that the system can verify whether you have completed, partially completed, or not yet taken each of your surveys. Again, your answers to survey questions are kept completely separate from this verification.
  5. Circumstances that warrant overriding anonymity are spelled out in GFC policy 111.3 D (see above). Threats to the safety or well-being of members of the University community will not be tolerated and will result in action to identify the author of the statements in accordance with GFC policy.
  6. These surveys or ratings are conducted so that "the results help instructors and departments or faculties to initiate constructive change in curriculum and instruction. In addition, the results are one important factor in decisions affecting the career of your instructor." (GFC policy 111.3 C).

How can I customize my USRI questionnaire?

USRIs are generated using the Instructor Designed Questionnaire (IDQ). This system allows for the inclusion of items relevant to the individual instructor, university, faculty and department; and allows for normative as well as individualized feedback on the quality of instruction. Please visit our IDQ system page to obtain more information about the IDQ system.

Do USRIs support ratings if there are multiple instructors for a course, as in the case of team-teaching?

Yes. In a case such as this, student ratings can be configured to provide numerical and open-ended questions that repeat for each instructor. The questionnaire is arranged such that the questions that apply to the overall course appear first. These are followed by questions that apply to the instructor. Instructor-related questions are repeated for each instructor involved in the course.

When results are compiled and ready for viewing, each instructor will see the results for the questions that apply to the overall course along with the results for their own instructor questions. This eliminates the need to generate separate questionnaires for each instructor in a team-taught class in order to maintain confidentiality among instructors.
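The arrangement described above (course-level questions first, then the instructor questions repeated once per instructor) can be sketched as follows. This is a hypothetical illustration: `build_questionnaire` and the bracketed instructor labels are assumptions, not the actual USRI questionnaire format.

```python
# Assemble a team-teaching questionnaire: course questions first,
# then the instructor-level questions repeated for each instructor.
def build_questionnaire(course_questions, instructor_questions, instructors):
    items = list(course_questions)          # questions about the overall course
    for name in instructors:
        for q in instructor_questions:      # repeated block per instructor
            items.append(f"[{name}] {q}")
    return items

items = build_questionnaire(
    ["Overall, the quality of the course content was excellent."],
    ["The instructor treated students with respect."],
    ["Instructor A", "Instructor B"],
)
for item in items:
    print(item)
```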

Example of a team teaching questionnaire

Who should I contact if information on the USRI email announcement is incorrect?

If any information on the email announcement appears incorrect, such as the instructor name or course information, or if your course has been canceled or is missing student ratings, please contact TSQS as soon as possible by phone at 780-492-2741 or by email at test.scoring@ualberta.ca.

What should I do if I have problems logging into the USRI system?

If you have difficulty logging into the ratings system please try the following:

  1. Clear your browser cache. For assistance on clearing your browser cache refer to the following help article: How to Clear Browser Cache and Cookies.
  2. Check the status of your CCID to ensure it is valid and functioning properly by accessing https://myccid.ualberta.ca/check.

If you are still having problems logging in, please contact IST at ist@ualberta.ca.

