Sometimes, often, always, there’s more content in our lectures than students can ever get their minds around. Your lecture slides are amazing, you are amazing at presenting them, but students, even the ones furiously taking notes, nodding, laughing, asking questions, can’t get it all. This stuff is important, they really need to know it. So what’s the obstacle?    

Well, we can’t always be sure. Sometimes students need more practice with the materials outside your valuable lecture time. Sometimes you might even want to remove content from the lecture altogether and offer it in another format.

Supporting students

This is how we approached our recent AiR collaboration with Dr Emma Camp on 91126 Coral Reef Ecosystems. In this subject, students are required to become reasonably proficient in benthic and other reef life identification prior to going on a 7-day research field trip to the Great Barrier Reef. 

Emma provides amazing lecture slide materials with a range of coral genera on display, but she also wanted to offer something more interactive to help students study the corals in their own time.

One of Emma’s slides from the Coral Identification lecture, showing photos of Acropora coral and some of its key features, such as axial corallites

Personalisation Principle

The little grey cells of the LX.lab staff started working. We realised Emma’s slides would lend themselves very well to a range of H5P activities. But how to make them really memorable? Fortunately, our colleague Lucian Sutevski, Learning Design and Technology Specialist, had closely studied the work of Richard Mayer, in particular his Cognitive Theory of Multimedia Learning.

With all the advances in Large Language Models, Lucian suggested we leverage these to employ Mayer’s ‘personalisation principle’ as a means to personify each of the corals. 

“people learn more deeply when the words in a multimedia presentation are in conversational style rather than formal style”

(Mayer, 2014) ​

So that’s what we attempted.

Personified Corals

If you can’t wait to find out how, here’s the basic formula it boiled down to: GPT + TTS + H5P = personified learning activities.

How do you ask GPT to give a coral a personality? After playing around with a few different prompts, we settled on one that consistently worked, though I’m sure you can try your own version to create more nuance.

GPT Prompt:

Write from the perspective of a [genus – e.g. Acropora] coral. Include these details [insert key IDs – e.g. Branching growth form, >150 species, Most significant reef-building genus (vast thickets), Axial corallites].

Write about your mood and experiences with other creatures or changes in the sea during the day.

Make it engaging and suitable for an Instagram caption of less than 70 words.

Note: since we were using Emma’s teaching materials, we asked her permission to run them through GPT-3.5, and we turned off chat history so they never became part of GPT training data. Also, as we were dealing with dozens of different corals, we made slight variations to the prompts to avoid repetitive responses.
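If you’re creating captions for dozens of genera, the same prompt can also be scripted. Below is a rough sketch using the OpenAI Python library; we actually worked in the ChatGPT interface, so the model name, temperature and helper function here are illustrative rather than the exact workflow we used.

```python
# A rough sketch of scripting the prompt with the OpenAI Python library.
# We actually worked in the ChatGPT interface with GPT-3.5; the model name,
# helper function and genus details below are illustrative only.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment


def personify_coral(genus: str, key_ids: list[str]) -> str:
    """Ask the model for a short first-person caption for one coral genus."""
    prompt = (
        f"Write from the perspective of a {genus} coral. "
        f"Include these details: {', '.join(key_ids)}. "
        "Write about your mood and experiences with other creatures "
        "or changes in the sea during the day. "
        "Make it engaging and suitable for an Instagram caption of less than 70 words."
    )
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
        temperature=0.9,  # a little randomness helps avoid repetitive responses
    )
    return response.choices[0].message.content


# Example: the Acropora details from Emma's slide
caption = personify_coral(
    "Acropora",
    ["Branching growth form", ">150 species",
     "Most significant reef-building genus (vast thickets)", "Axial corallites"],
)
print(caption)
```

Bumping the temperature, or tweaking one or two prompt details per genus, is also an easy way around the repetitive responses mentioned above.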

What did we get?

Here’s the ChatGPT output:

Did you know? I belong to the most significant reef-building genus! With more than 150 species, each of us adds a unique touch to the underwater mosaic. You’ll always know me by my axial corallites. Our branching form? That’s nature’s way of letting us reach out and touch the sky… or at least the ocean’s surface!

ChatGPT output (edited).

Making them speak

The next idea was to go beyond just text. We used AI text-to-speech tools to give the corals their own voices. Our app of choice was TTSmaker, though there are numerous other tools like it out there. The trick is sifting through the voices to select those that give the most natural intonation.
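If you’d rather script this step as well, an open-source library such as gTTS can batch-generate the audio files. This is a minimal sketch only, not the TTSmaker workflow we used, and the voices will sound different:

```python
# A minimal sketch of batch text-to-speech with the gTTS library.
# (We used TTSmaker's web interface; this is just one scriptable alternative.)
from gtts import gTTS

captions = {
    "Acropora": "Did you know? I belong to the most significant reef-building genus! ...",
    # ...one entry per coral genus
}

for genus, text in captions.items():
    audio = gTTS(text=text, lang="en", tld="com.au")  # Australian-English voice
    audio.save(f"{genus.lower()}_speaks.mp3")          # e.g. acropora_speaks.mp3
```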

Since this was our first time trying something like this, have a listen and judge the result for yourself:

Acropora speaks

Putting it all together in H5P

We chose the Dialog Cards H5P item for this activity because it allowed us to combine images, audio and text into one package, and to give students the option to guess an answer before they flip the card over.

  1. The GPT-personified first-person description was placed in the ‘Text’ box.
  2. The coral genus was placed in the ‘Answer’ box.
  3. We attached an image taken from Emma’s slide deck.
  4. We attached the audio file downloaded from TTSmaker.
Screenshot of an H5P Dialog Card in edit mode, highlighting the four steps above

Each H5P item contained at least three dialog cards.
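When you’re assembling lots of cards, it helps to keep each card’s pieces together before pasting them into the H5P editor. Here’s a small bookkeeping sketch; the field names and file paths are our own illustration, not H5P’s internal content format:

```python
# A small bookkeeping sketch for collecting each Dialog Card's pieces before
# pasting them into the H5P editor. Field names and paths are illustrative,
# not H5P's internal content format.
from dataclasses import dataclass


@dataclass
class DialogCard:
    text: str        # the GPT-personified first-person description (front of card)
    answer: str      # the coral genus (back of card)
    image_path: str  # photo taken from the lecture slide deck
    audio_path: str  # MP3 downloaded from the text-to-speech tool


acropora_card = DialogCard(
    text="Did you know? I belong to the most significant reef-building genus! ...",
    answer="Acropora",
    image_path="images/acropora_slide.png",
    audio_path="audio/acropora_speaks.mp3",
)

# Each H5P item bundled at least three of these cards together.
branching_axial_cards = [acropora_card]  # plus two or more other genera
```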

The final product

The following is an example of the final product in H5P:

Coral ID – Hard corals – branching – axial corallites

Want to try it out for yourself?

Do you have a set of existing resources you’d like to make interactive? Or a topic area in your subject that could be personified to support student memory retention? Contact us in the LX.lab for guidance and support via our Service Connect portal (select LX.lab Support), or email us at LX.lab@uts.edu.au.

Discover more about Emma’s experience as an AiR and find out more about our past AiR projects.

Banner image from William Saville Kent’s Great Barrier Reef Corals 1893, Public Domain.
