Generative AI in higher education

Higher Education Partnership Network North, 27-28 February 2024

AI is reshaping education at every level – but how can we make sure higher education is ready for what’s next?

At last year’s HEPN North event, we brought together a panel of experts to explore the evolving role of generative AI in higher education, covering everything from ethical use to curriculum design.

Here are some of the key takeaways…

AI: challenges and opportunities

AI isn’t going anywhere, so we need to make sure we’re equipping managers and educators with the necessary skills and knowledge to work in AI-enabled environments.

Kicking off the panel discussion, Professor Margherita Pagani, Director of the SKEMA Center for AI at SKEMA Business School, emphasised the need to adapt pedagogy to include AI skills. She pointed to three key stakeholders in this transformation: research, education, and industry – all of which need to work together to ensure AI users are properly prepared for what lies ahead.

She also explored the different ways AI can be used to enhance creativity:

  • Using AI as a tool
  • Augmenting the creative process
  • Inspiring out-of-the-box thinking

She explained, ‘I can use generative AI to better write my document, my research or in the pedagogy to help my interaction with the students… I can also use AI to augment the creative process… And finally, there is inspiring out-of-the-box thinking which means to stimulate and maybe rethink my way to create.’

Becoming AI literate

In the second part of the session, Dr. Eleanora Pantano, Associate Professor in Retail and Marketing Technology at the University of Bristol Business School (part of the Russell Group), talked about generative AI usage from student, staff and stakeholder perspectives.

She started by highlighting the importance of AI literacy for both staff and students in higher education, stating that ‘Staff should be equipped to support students to the usage of AI, generative AI tools effectively and appropriately in their learning experience. So before training students, we need to train staff, and we need to do it because we need to perform in this world that is increasingly AI enabled.’

As well as being trained to use AI, staff and students need to incorporate it responsibly. She explained, ‘This means that we also need to adapt teaching and assessment to incorporate the ethical usage of a generative AI to support also equal access. Which means that we should not be afraid that AI can replace us or that students can use AI to develop their assignment.’

Although AI can pose many challenges, staff and students need to recognise that AI is a tool used to support human abilities, not replace them. As Eleanora explained, ‘It’s about how AI can support students in their assessments and in their assignments and not how AI can replace student in developing the assignment. Which means that we need to put some training in place, some literacy activities to make students and staff to understand how to make ethical and responsible use of generative AI.’

Eleanora then highlighted five ethical issues staff and students need to be aware of:

  • Risk of privacy and intellectual property loss
  • Potential bias
  • Inaccuracy and misinterpretation of information
  • Ethics code
  • Plagiarism

While AI presents certain risks, it’s important to remember that humans remain the final decision-makers. That’s why universities need to work closely with staff and students to ensure it’s being used responsibly. This includes providing the right training, and establishing clear, practical guidelines for ethical use.

Embedding generative AI into the curriculum: a case study from Royal Holloway

In the third part of the panel discussion, Nisreen Ameen, Director of the Digital Organisation and Society Research Centre, shared some case study examples of how Royal Holloway is embedding AI into its curriculum, particularly within its postgraduate programme in digital marketing. This approach focused on two key strategies: integrating real-world industry case studies and adopting research-informed teaching.

The first example was a real-life use case from Nutella, which used generative AI to produce seven million unique labels for its jars. By using AI in this way, Nutella was able to make meaningful connections with its customers, which ‘resulted in a very successful campaign.’ Looking at real-world examples like this helped students quickly understand the benefits generative AI can bring to the industry.

Another way Royal Holloway incorporated AI was through AI-enhanced teaching materials. In this instance, journal articles were paired with teaching and learning cases, helping boost engagement and encourage interaction with the materials. She explained, ‘For every article, there is an option for authors to create their own teaching and learning case and it will be published on the platform. I think that’s a great way to use AI, and our students found it really helpful as well.’

She continued, ‘We introduced a number of case studies for students, and we designed a class activity for different groups of students within our case for teaching and learning associated with this paper. We found that students are way more engaged with the teaching and learning case rather than reading a whole paper on generative AI.’

Ultimately, the integration of generative AI has sparked higher engagement, encouraged critical thinking and helped students understand its place in their future careers.

AI: the higher education experience

At the end of the session, our panellists split the audience into technology officers and decision-makers and ran a quick survey asking each group to describe their experience with generative AI in higher education in three words. The results revealed both common ground and contrasting perspectives.

Technology officers most frequently said:

  • Limited
  • Exciting
  • Scary
  • Hesitation

Decision-makers most commonly used:

  • Exciting
  • Potential
  • Opportunity
  • Cheating

While both groups found AI ‘exciting,’ their answers did differ in focus. As Professor Margherita Pagani explained, ‘One of the key issues is that the stakeholders around the table are different.’ By collaborating and sharing these perspectives across departments, institutions can properly prepare for the challenges and opportunities AI brings and create a shared vision for the future.

Join our future discussions

If you’d like to take part in future discussions, register your interest to join us at HEPN North on 20-21 May!