Remote learning: anonymized AI powers engagement dashboard
Live online training events are a trend that’s here to stay, and while delivery platforms are more feature rich than they’ve ever been before, there are still key elements of in-person sessions that are hard to re-create. For training providers, one of the big missing pieces is how to determine whether the audience is engaged, happy, and following along with the content.
In a real-world classroom or training hall, it’s straightforward to ‘gauge the temperature of the room’, even for a large audience. All that’s required is for the trainer to glance at the class, read the expressions of the participants, and listen out for any questions or confusion. But taking a quick look around a virtual room is another story. For online events with more than a handful of participants, all dialing in from different locations, the task scales poorly: attendees’ video feeds soon become too small to read when displayed on screen all at once.
Reading the room, online
But the technology that makes it possible to host events remotely in the first place could also lend a helping hand when it comes to reading the room, together with some clever, privacy-protecting use of artificial intelligence (AI). Dubbed ‘Sense the Classroom’, researchers from the Educational Technologies team at DIPF (the Leibniz Institute for Research and Information in Education in Frankfurt, Germany), together with scientists from the Open University of the Netherlands, have prototyped a system capable of generating an ‘emotional dashboard’. The anonymized data are designed to bridge the feedback gap between online and in-person teaching.
Presenting the work at the 17th European Conference on Technology Enhanced Learning – which took place in Toulouse, France, in September 2022 – the group recapped the challenges of online learning and explained how their ‘Sense the Classroom’ setup works. As the researchers point out, video conference tools and learning management systems have worked wonders in making education more accessible and flexible. But there are downsides to remote learning in comparison with lessons held in a traditional classroom setting. “Communication is unnatural, students get distracted under isolation, and teachers have a lack of understanding of how students are doing,” said the team.
To tackle this, the group has investigated how data generated by detecting ‘learning-centered affective states’ – such as anger, anxiety, boredom, concentration, confusion, engagement, and frustration, to give just a few examples – can be turned into an informative dashboard for trainers. Non-verbal cues such as gestures, body postures, and micro-expressions, combined with observations of activities such as turning away and active listening, can be aggregated to build up an anonymized emotional picture of participants. “Sense The Classroom is a pragmatic approach to detect students’ affect in a privacy-preserving and highly scalable way,” the group explained.
In terms of system architecture, the project features student-, server- and teacher-side tools that pipe together machine learning output, database files, and session data based on images that are taken periodically (with consent) from participants’ webcams. And if trainees don’t want to participate, or wish to pause their contribution, they can simply turn off their cameras.
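The privacy-preserving, consent-based design described above can be illustrated with a short sketch. This is not the project’s actual code – the function and label names here are assumptions for illustration – but it shows the key idea: inference happens on the trainee’s side, and only an anonymized affect label (never the webcam image) is sent onward, while opting out simply means nothing is sent at all.

```python
import json
import time

# Hypothetical label set drawn from the affective states named in the
# article; the actual labels used by 'Sense the Classroom' may differ.
AFFECT_STATES = ["anger", "anxiety", "boredom", "concentration",
                 "confusion", "engagement", "frustration"]

def classify_frame(frame) -> str:
    """Placeholder for local, on-device inference mapping one webcam
    frame to an affect label. A real system would run a trained model
    here; this stub returns a fixed label for illustration."""
    return "engagement"

def build_update(frame, consent_given: bool):
    """Return an anonymized update for the server, or None if the
    trainee has opted out. Only the label and a timestamp leave the
    device; the raw image is discarded, which is what keeps the
    pipeline privacy-preserving."""
    if not consent_given:
        return None  # camera off / consent withdrawn: send nothing
    label = classify_frame(frame)
    return json.dumps({"affect": label, "ts": time.time()})
```

In this sketch the consent check gates the whole pipeline, mirroring the article’s point that trainees can pause their contribution simply by turning off their cameras.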
Affective computing, which refers to the study and development of systems and devices that can recognize, interpret, process, and simulate human affects, is attracting great interest across a range of industries. Uses include not just edtech examples such as ‘Sense the Classroom’, but also audience testing of upcoming Hollywood movies and Netflix productions, as well as many other novel applications.
Returning to the team’s solution, the system first locates facial landmarks such as the trainee’s eyes, bridge of the nose, and mouth, and then applies a neural network (a machine learning model trained to generalize patterns from large data sets) to determine a willing participant’s likely emotional state. The data is packaged into a machine-readable JSON file and combined with the expressions gathered from all of the other trainees who are taking part.
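The packaging step can be sketched as follows. Assuming (hypothetically) that the network emits one raw score per affective state, a softmax turns those scores into probabilities, and the result is serialized as a JSON record carrying only the estimate – no image data or identity. The state list and field names are illustrative assumptions, not the project’s actual schema.

```python
import json
import math

# Hypothetical affective states, per the examples given in the article.
STATES = ["anger", "anxiety", "boredom", "concentration",
          "confusion", "engagement", "frustration"]

def softmax(logits):
    """Convert raw network outputs into a probability distribution."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def package_estimate(logits):
    """Package one participant's emotion estimate as a JSON record.
    The record carries only probabilities and the most likely state,
    matching the anonymized design described in the article."""
    probs = softmax(logits)
    best = STATES[probs.index(max(probs))]
    return json.dumps({"affect": best,
                       "probabilities": dict(zip(STATES, probs))})

# Example: raw scores leaning towards 'engagement' (index 5)
record = package_estimate([0.1, 0.2, 0.3, 1.0, 0.4, 2.5, 0.2])
```

Sending only this compact record per participant, rather than video, is also what lets the approach scale to large classes.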
Together, the results give trainers valuable snapshots of the emotional distribution of their classes and overall trends as lessons progress. At a high level, the dashboard shows whether the class is happy or displaying a more neutral response. Even lower-end hardware proved capable of delivering a status update at one-second intervals. Addressing ethical concerns, the researchers are clear that consent should always be asked for before using such a system. And the tool is designed to enhance the training experience – for example, by highlighting when topics need to be revisited and the pace at which content should be delivered.
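The aggregation behind such a dashboard snapshot is simple to sketch. Assuming the server has received one affect label per participating trainee for the current interval (the labels and function name here are illustrative), a frequency count yields the class-level emotional distribution that a dashboard could display:

```python
from collections import Counter

def class_distribution(labels):
    """Aggregate the affect labels received from all participants into
    a percentage breakdown for one dashboard snapshot. Individual
    identities never enter the calculation, only the pooled labels."""
    counts = Counter(labels)
    total = len(labels)
    return {state: round(100 * n / total, 1)
            for state, n in counts.items()}

# Illustrative snapshot: six trainees' labels for one interval
snapshot = ["engagement", "engagement", "confusion",
            "boredom", "engagement", "confusion"]
dist = class_distribution(snapshot)
# e.g. half the class engaged, a third confused
```

Plotting such distributions over successive intervals would give the trend-over-time view the researchers describe.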
Furthermore, the system should support the work of trainers as a valuable add-on and shouldn’t conflict with the independence of the presenter or be used to monitor teaching performance. And the next steps for the researchers are to further road test their design and gather views from all stakeholders on how anonymized, privacy-preserving, AI-based systems can contribute positively to making online learning a more engaging and beneficial experience.
In related news, Daniele Di Mitri – one of the contributors to the ‘Sense the Classroom’ project – announced that DIPF’s EduTec department has received EUR 750,000 to investigate the use of AI systems as ‘Multimodal Tutors’ in a joint project with the Cologne Game Lab. The work involves testing the ability of AI systems to ‘provide automatic, personalised, and real-time feedback to learners in distance learning settings when a human expert is unavailable’.
22 February 2024