
VR Can Help Teachers Better Distribute Their Gaze

Teachers need to know their material, but they must also keep their students engaged and interested.

Part of that involves making eye contact with their students – all of them.

A multidisciplinary team of researchers tested several methods of data visualization in an immersive virtual reality (VR) classroom, giving teachers a way to gauge how their gaze was distributed across students. The findings could inform future VR classroom simulations and offer a tool for teachers who want to fine-tune their nonverbal behaviors.

Andrea Stevenson Won, associate professor of communication in the College of Agriculture and Life Sciences and director of the Virtual Embodiment Lab, is senior author of the paper, which was presented at the Association for Computing Machinery Conference on Human Factors in Computing Systems, held May 11-16 in Honolulu.

Co-lead authors are Yejoon Yoo, MPS ’23 (information science), a researcher and interaction designer in the Virtual Embodiment Lab; and Jonathan Segal, doctoral student in the field of information science. The other co-author is Aleshia Taylor Hayes, assistant professor in the Department of Learning Technologies at the University of North Texas.

“Making eye contact is really significant in keeping students engaged,” said Yoo, who along with Segal presented the paper at the conference. “And there really aren’t enough tools that allow you to reflect on your own teaching behavior, because you don’t record yourself teaching students or ask people to give you feedback when you’re teaching.”

Virtual classroom simulations that allow instructors to hone key skills, including nonverbal behaviors, were among the earliest uses of VR simulators.

The researchers conducted two pilot studies that measured the effectiveness of different types of gaze-distribution data visualization. Participants (college students whose primary teaching experience had been as teaching assistants) assumed the role of a teacher in a virtual classroom, lecturing to 30 virtual “student” agent-avatars (virtual human representations created and controlled by a computer program). Each participant, wearing a VR headset, taught the class for five minutes under each of four experimental conditions:

  • control (no gaze data visualization);
  • gaze distribution shown as a bar graph over each avatar, which increased while the student was being looked at and decreased when they weren’t (a minimal sketch of this update appears after the list); and
  • two conditions in which gaze distribution was conveyed by the avatar’s opacity (the avatar either faded in or faded out while being looked at).
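To make the bar-graph mechanic concrete, here is a minimal sketch of the kind of per-frame update it implies: each avatar’s bar fills while that student is being looked at and drains otherwise. The rates, names and data structures are illustrative assumptions, not details from the paper.

```python
# Minimal sketch (assumed, not the authors' code) of the bar-graph
# condition: each avatar's bar fills while gazed at and drains otherwise.

FILL_RATE = 0.5   # assumed bar units gained per second while gazed at
DRAIN_RATE = 0.1  # assumed bar units lost per second otherwise
MAX_LEVEL = 1.0

def update_gaze_bars(bar_levels, gazed_avatar_id, dt):
    """Advance each avatar's bar level by one frame of duration dt (seconds)."""
    for avatar_id, level in bar_levels.items():
        if avatar_id == gazed_avatar_id:
            level += FILL_RATE * dt   # being looked at: bar grows
        else:
            level -= DRAIN_RATE * dt  # not looked at: bar shrinks
        bar_levels[avatar_id] = min(MAX_LEVEL, max(0.0, level))
    return bar_levels
```

The opacity conditions could, in principle, reuse the same accumulator, mapping it to the avatar’s transparency instead of a bar height.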

For the first study, the authors used participants’ head positions as a proxy for gaze; the second study measured gaze with actual eye tracking. In both studies, participants reported that the bar graph condition was the most helpful for improving their gaze behavior during the experiments, and that the control condition was the least helpful.
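A head-pose proxy of the kind used in the first study can be approximated by treating the avatar that lies closest to the headset’s forward direction, within some angular tolerance, as the current gaze target. The cone angle and all names below are assumptions for illustration, not the study’s implementation.

```python
import math

# Hypothetical sketch of a head-pose gaze proxy: pick the avatar whose
# direction best aligns with the headset's forward vector, if any falls
# within an assumed tolerance cone.

CONE_HALF_ANGLE = math.radians(15)  # assumed angular tolerance

def gaze_target(head_pos, head_forward, avatar_positions):
    """Return the id of the avatar nearest the head's forward ray, or None.

    head_forward is assumed to be a unit-length 3D vector.
    """
    best_id, best_angle = None, CONE_HALF_ANGLE
    for avatar_id, pos in avatar_positions.items():
        to_avatar = tuple(p - h for p, h in zip(pos, head_pos))
        norm = math.sqrt(sum(c * c for c in to_avatar))
        if norm == 0:
            continue
        cos_angle = sum(f * c for f, c in zip(head_forward, to_avatar)) / norm
        angle = math.acos(max(-1.0, min(1.0, cos_angle)))
        if angle < best_angle:  # closer to the forward ray than any so far
            best_id, best_angle = avatar_id, angle
    return best_id
```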

Cognitive load, as measured by the NASA Task Load Index, was higher in the data visualization conditions, but Yoo said the ability to improve nonverbal behaviors could be worth the added mental tax. “We found that the visualization actually helped them perform better,” she said, “so I think there’s a tradeoff between the increased cognitive load and improved performance.”
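The NASA Task Load Index combines six subscale ratings: mental demand, physical demand, temporal demand, performance, effort and frustration. In the common unweighted (“raw TLX”) convention, the overall workload score is simply their mean; the sketch below illustrates that convention, not necessarily the authors’ exact scoring procedure.

```python
# Raw (unweighted) NASA-TLX scoring: average the six subscale ratings.

TLX_SUBSCALES = ("mental_demand", "physical_demand", "temporal_demand",
                 "performance", "effort", "frustration")

def raw_tlx(ratings):
    """Average the six subscale ratings (each 0-100) into one workload score."""
    return sum(ratings[s] for s in TLX_SUBSCALES) / len(TLX_SUBSCALES)

# Example with hypothetical ratings:
# raw_tlx({"mental_demand": 70, "physical_demand": 20,
#          "temporal_demand": 55, "performance": 40,
#          "effort": 65, "frustration": 35})  # -> 47.5
```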

Won said she plans to expand on the research this summer, recording teachers’ behavior in a physical classroom.

“Allowing teachers to get real-time feedback on their gaze distribution has been an exciting application of virtual reality,” Won said, “but how to represent gaze has been less studied. Yejoon was an ideal person to address this question given her interest in UX (user experience) design.”

This work was supported by the National Science Foundation.
