Our festival after the festival

More students’ perspectives on AI's limitations, academic integrity, and inclusive education in a changing landscape.

[Banner image: fotl-2023-gurleen-kaur-banner.png]

This collaborative essay follows the 2023 Student Keynote Conversation at this year’s Festival of Teaching and Learning (FoTL) and extends it, adding more authentic student voices and perspectives on the theme of “Student Engagement in/with Digital Technologies” to the ongoing conversation. Staying true to both that theme and FoTL 2023’s overarching theme of “Traditions Re-Imagined: Learning, Unlearning, and Relearning Teaching,” this essay re-imagines method and genre by taking up writing together in a Reddit-style discussion that models some of the Internet writing cultures and tropes our students engage in/with, so to speak.


r/StudentKeynoteFoTL2023

Having been in the audience for the FoTL 2023 Student Keynote, what is something that wasn't said there that you wish had been said/touched upon?

[Discussion]

EK: I think for me, I would have liked to have talked more about the limitations of AI. It’s amazing, but it’s not perfect, and I think talking about the obstacles facing this emerging technology would be a good topic. Right now, one of the biggest obstacles is getting these large language models to provide quality responses to any given prompt. There is room to train these models and have them learn, and hopefully after years of development they’ll have improved even more.

JAN: In the vein of limitations, it would have been great to hear voices from the U of A, yes, but also from other tertiary institutions and their experiences with AI. While I see us all as different pages in one book, I think it could be a much more mind-opening conversation to bring in perspectives from different books. I would have liked to see how the inclusion of limitations could have yielded a discussion where we learn from each other and say, “My campus or area of study does this …” While not every limitation has a solution … for now, I would have liked to see discussion about why the limitations are present, over and above their mere existence.

For example, the Augustana Campus does not have access to certain technologies, like agricultural technology, or to opportunities at AMII, and these types of discussions can ignite broader conversations that move us forward.

SS: Absolutely agree with you on this, Jewel; the topic of limitations is extremely important. While I believe in opening up the topic of why limitations exist, I also think it’s extremely important to discuss what could be done to mitigate or eliminate them.

A potential solution for the Augustana Campus could be creating an online Moodle platform (such as eClass or Campus Bridge) where regular opportunities from AMII or other organizations are presented in an organized manner. Taking this platform and marketing it towards the student population could have a substantial impact. This may also be somewhat more organized than, for example, a Discord server, where student comments flood in, creating a need for moderators and pushing important information out of the way.

| What is something that wasn’t said in the Student Keynote Conversation that you wish had been said/touched upon?

TL;DR - More on AI limitations. More on cross-campus collaborations with AI.

How can schools and universities ensure that their AI-driven digital engagement efforts are respectful of academic integrity?

[Discussion]

EK: These technologies are here to stay, and they will be used no matter what. It’s important to learn and grow with them rather than fear or restrict them. For example, I had a professor who encouraged the use of ChatGPT; the only thing asked was that, if it was utilized, you write a small blurb on why you used it and find a backing source to verify its claims. Restricting the use of these technologies is much like educators saying, “You won’t always have a calculator in your pocket.” Turns out most of us would …

JAN: The culture of learning is one of self-empowerment and self-improvement. The current culture of academia within the teaching and learning sphere is shifting away from attaining knowledge for growth and towards attaining knowledge to regurgitate (for the ‘A’, of course). And who can blame us? There needs to be a frequent meeting of the minds between AI (emerging technology) professionals and university instructors. The main fear stems from the unknown. It was mentioned during the Festival that the campus should bring in guidelines or regulations to streamline working with AI. These guidelines can be drawn up, and that fear alleviated, in consultation with workplace professionals so that we are preparing graduates for the workforce. In this manner, we are teaching students to fish and going back to the true meaning of what it means to learn. What the current setting demands is a change in culture so that we neither fall into stagnancy nor forget how to celebrate the learning process.

For example, and I am sure this was mentioned already during the Festival, in my Stats class we use JASP. My professor never required us to memorize methods, just to use JASP. However, we learnt the difficult behind-the-scenes methods, and their importance was stressed. We were also invited to look at the fundamentals of real-life applications: understanding a method and then implementing it practically.

| How can schools and universities ensure that their AI-driven digital engagement efforts are respectful of academic integrity?

TL;DR - Create guidelines and use the tech appropriately. Don't be fearful of the unknown.

How can schools and universities ensure that their teachings regarding AI and various evolving technologies are equitable and inclusive?

[Discussion]

JAN: When I saw this question, the first thing that came to mind was how to make AI technologies more equitable and inclusive if they are encouraged in the classroom. What about students with accommodations? Are we sure they are comfortable using the different required platforms? What about affordability, if some students cannot afford premium applications at some point in the future? While all I do is pose questions, I truly have no answers. One possible lever, whatever the answers turn out to be, would be weighting the learning process more heavily than the final result. This new rubric (with a higher weighting for the process) should reflect ethical usage of AI/technology, with a stress on procedural rather than demonstrative knowledge, keeping in mind that my lens is primarily Science and that of an Augustana student, with smaller classes and more of a focus on the Liberal Arts.

SS: Creating an inclusive and equitable environment around artificial intelligence is by no means a small task. In many cases, we may need to wait for machine learning technology to catch up to the accommodations many people currently need. Some of these could include voice activation and descriptive audio for the visually impaired in visual recognition tasks (DALL-E), subtitles for the hearing impaired in video-creation AI applications (Synthesis), etc. In regards to the university in particular, I believe one of the most daunting current tasks is creating an environment that is equal for students undergoing examinations and assignments. The introduction of AI (with ChatGPT) has allowed for dishonesty on many of these tasks, and I think addressing this is the most important topic for any academic institution. Although I, too, am posing a question that does not yet have an answer, this is something that certainly must be addressed sooner rather than later.

| How can schools and universities ensure that their teachings regarding AI and various evolving technologies are equitable and inclusive?

TL;DR - Put access and accessibility at the forefront. Assessments need to be equitable and inclusive, too.

With the technological advances we have seen in just the last year, how do you see education (mainly post-secondary) changing in the next 10 years?

[Discussion]

SS: There is so much changing today with the addition of AI in our workplaces and educational institutions, from the use of AI in assignments all the way to helping us code on the job. In the future, I can see the conventional way of teaching and learning in post-secondary education changing. Rather than a largely theoretical approach, we could potentially see a more interactive, hands-on approach to teaching in subjects such as Engineering, Computer Science, Nursing, etc.

JAN: I agree with Sankalp; the term “flipped classroom” is going to take on a new meaning. There is a basic understanding that technology cannot stand alone; it needs to be applied to different disciplines. We are definitely going to see technology embedding itself into various disciplines. The “I am not good with technology” refrain is going to be no more; we are going to be forced out of that malaise and learn to live and work symbiotically with technology.

| With the technological advances we have seen in just the last year, how do you see education (mainly post-secondary) changing in the next 10 years?

TL;DR - A lot.


An AI-Generated Conclusion

To conclude our re-imagined essay on student engagement in/with digital technologies, we threw our Reddit-style conversation and writing into ChatGPT and asked it to “Summarize this in less than 200 words.” Here is what it returned to us, and what we formally leave you with … for now:

In the Student Keynote Conversation, the speakers expressed their thoughts on what was missing from the discussion. EK mentioned the need to address the limitations of AI, acknowledging that while it is impressive, it is not flawless. They highlighted the current obstacle of large language models providing quality responses, emphasizing the importance of training and development to improve them over time. JAN added that perspectives from different campuses and areas of study could have enriched the conversation on limitations, sparking broader discussions and learning from each other's experiences. SS agreed, emphasizing the importance of discussing potential solutions to mitigate limitations, using an online platform as an example.

Regarding academic integrity in AI-driven digital engagement, EK suggested embracing these technologies and learning to grow with them instead of restricting their use. They shared an example of a professor who encouraged the use of AI tools but required verification of claims and sources. JAN emphasized the need for collaboration between AI professionals and university instructors to address the fear of the unknown and create guidelines that prepare students for the workforce. They advocated for a change in the culture of academia to prioritize the learning process and celebrate it. To ensure equitable and inclusive teachings on AI and evolving technologies, JAN raised concerns about students with accommodations, affordability, and ethical usage. They suggested emphasizing procedural knowledge over demonstrative knowledge and creating rubrics that reflect ethical AI usage. SS acknowledged the challenges of inclusivity and suggested waiting for machine learning technology to catch up with accommodation needs. They highlighted the importance of addressing the dishonesty facilitated by AI in examinations and assignments.

Looking ahead to the next 10 years, SS envisioned a shift in post-secondary education toward more interactive and hands-on approaches, particularly in disciplines like Engineering, Computer Science, and Nursing. JAN agreed, mentioning the integration of technology into various disciplines and the need to overcome the resistance to technology in order to work symbiotically with it.