
VR Teaching Simulation

The research paper was accepted to the ACM (Association for Computing Machinery) CHI Conference on Human Factors in Computing Systems.

I led the project as first co-author.

We explored how visualizing participants' gaze, using four different data visualizations, affected their behaviors and self-reflection in an immersive virtual reality classroom simulation. This research was conducted by Yejoon Yoo, Jonathan Segal, Aleshia Hayes, and Andrea Stevenson Won.

Link to paper: https://dl.acm.org/doi/10.1145/3613905.3651105

My Role

Interaction Design,

Research Assistant,

3D Prototyping

Timeline

8 months, 

Virtual Embodiment Lab

Tools Used

Unity,

Figma,

Cinema 4D

Problem Statement

Teachers often struggle to maintain consistent and inclusive gaze behavior in the classroom, which can negatively impact student engagement and achievement. Current professional development resources and feedback mechanisms are limited, making it challenging for teachers to improve their gaze behavior and create a more inclusive learning environment. As a result, there is a need for a user-friendly, accessible tool that provides real-time feedback, data analytics, and personalized recommendations to help teachers track and improve their gaze behavior in the classroom.

Virtual reality (VR) technology has been integrated into education to enhance learning by creating immersive and interactive experiences. VR simulations offer an opportunity for teachers to examine and improve their teaching behaviors. Gaze data visualization is one technique that can give teachers feedback on their gaze distribution during VR simulations. However, the effectiveness of different types of gaze data visualization in enhancing teaching practice has not been widely researched.

Research Questions

RQ1: Can transforming how participants' gaze is represented during a virtual teaching task affect how participants distribute their gaze around a virtual classroom?

RQ2a: Is a metaphor of neglect (Fade-In) more effective than a metaphor of over-attention (Fade-Out)?

RQ2b: Is a metaphor of neglect perceived as more cognitively difficult than a metaphor of over-attention?

RQ3a: Is a more explicit measure more effective than a more implicit measure in changing participants' gaze patterns?

RQ3b: Is a more explicit measure perceived as more effective than a more implicit measure?

RQ3c: Is a more explicit measure perceived as more cognitively difficult than a more implicit measure?

RQ4: Does gaze behavior become more evenly distributed as participants complete the four conditions?

teacher_illustration.png

User Research

Target Users

  • Instructors who are interested in improving their teaching behaviors, especially making more eye contact with students

  • Individuals who are interested in trying teaching simulations to practice their teaching skills. 

Research Methods

  • In-person interview

  • Online research 

  • Quantitative research (NASA-TLX protocol, gaze distribution over time)

  • Qualitative research (Post-test questionnaire)

Pain Points / Key Insights

Based on interviews with 10 participants, I identified pain points and key insights for designing the interactions for real-time analysis of gaze data.

  • Difficulty maintaining eye contact with every student in the classroom, especially during large or fast-paced lessons.

  • Lack of feedback or guidance on how to improve gaze behavior and create a more inclusive classroom environment.

  • Inadequate resources or professional development opportunities to support teachers in improving their teaching skills and classroom management techniques.

By offering real-time gaze data analysis, a VR teaching simulation could help teachers track and improve their gaze behavior in the classroom, leading to more engaging teacher-student interactions.

VR_simulation.png

VR Teaching Simulation

I designed four VR teaching conditions to help participants correct bias in their eye gaze while teaching. One is a normal classroom setting in which participants can practice teaching in an environment similar to a real-life classroom. The other three use opacity and bar charts to visualize the participants' gaze data.

01  Condition A: Normal Classroom

I included a normal classroom setting so participants can practice teaching in a virtual classroom similar to a real-life setting. There is no data visualization in this condition; it serves as the baseline for testing the mental load of the visualization conditions.

02  Condition B: Bar Charts

I designed this setting and the instructional UI. In this condition, participants perform a teaching task while each virtual student displays a bar chart of the participant's gaze data. A student's bar rises as the participant directs more gaze toward them. I focused on making the bar charts react gradually to the participant's teaching behavior.
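The gradual bar behavior is simple to sketch in Unity. Below is a minimal, illustrative version assuming a GazeBar component on each student agent: the bar grows while the participant's gaze rests on that student and slowly decays otherwise. The class, field names, and rates are assumptions for illustration, not the project's actual code.

```csharp
using UnityEngine;

// A minimal sketch of Condition B, assuming a GazeBar component on each
// student agent. Names and rates are illustrative assumptions.
public class GazeBar : MonoBehaviour
{
    [SerializeField] private Transform barTransform;   // bar mesh above the student's head
    [SerializeField] private float growRate = 0.2f;    // height gained per second of gaze
    [SerializeField] private float decayRate = 0.05f;  // height lost per second without gaze
    [SerializeField] private float maxHeight = 1.0f;

    private float height;

    // Called each frame by a gaze-detection script with whether this
    // student is currently being looked at.
    public void UpdateBar(bool isGazedAt)
    {
        float delta = isGazedAt ? growRate : -decayRate;
        height = Mathf.Clamp(height + delta * Time.deltaTime, 0f, maxHeight);

        // Grow the bar upward by scaling its local Y axis.
        Vector3 scale = barTransform.localScale;
        scale.y = height;
        barTransform.localScale = scale;
    }
}
```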

03  Condition C: Fade-in

I used opacity to visualize participants' gaze data. In this condition, participants perform a teaching task while students fade in as the participant directs more gaze toward them.

04  Condition D: Fade-out

I took the opposite approach to visualizing gaze data through opacity. In this condition, students fade out as the participant directs more gaze toward them. This is the inverse of Condition C, and I wanted to test which of the two opacity conditions participants respond to more strongly.
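Both opacity conditions can be sketched with a single component that flips one flag between fade-in and fade-out. The sketch below assumes each student's material supports alpha blending and that accumulated gaze drains when the participant looks away; names and parameters are illustrative assumptions.

```csharp
using UnityEngine;

// A minimal sketch covering Conditions C and D, assuming an alpha-blended
// material on each student. Names and parameters are illustrative.
public class GazeFade : MonoBehaviour
{
    [SerializeField] private Renderer studentRenderer;
    [SerializeField] private bool fadeIn = true;       // true: Condition C, false: Condition D
    [SerializeField] private float fadeSeconds = 10f;  // matches the 10-second fade used in the study

    private float gazeFraction; // 0 = no accumulated gaze, 1 = a full 10 s of gaze

    public void UpdateFade(bool isGazedAt)
    {
        // Accumulate or drain gaze time, normalized to the fade window.
        float step = Time.deltaTime / fadeSeconds;
        gazeFraction = Mathf.Clamp01(gazeFraction + (isGazedAt ? step : -step));

        // Fade-in: more gaze -> more opaque. Fade-out: more gaze -> more transparent.
        Color c = studentRenderer.material.color;
        c.a = fadeIn ? gazeFraction : 1f - gazeFraction;
        studentRenderer.material.color = c;
    }
}
```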

Process

Affinity Diagram & Sketches

I created an affinity diagram to explore ways to visualize users' gaze data, along with simulation features that could help users easily view their teaching outcomes. After considering various options for visualizing eye gaze data, I chose bar charts and opacity because they are widely recognized and familiar to most people, which enhances the accessibility and clarity of the data presented.

Sketch_01.png
Sketch_02.png

Procedure of Study & UI

I created the study flow and UIs to help participants understand the instructions and procedure. The study starts with a brief introduction and a consent form. The first teaching task begins once the participant is comfortable with the virtual environment and controls. There are four teaching tasks, each in a different condition (Condition A: Normal Classroom, Condition B: Bar Charts, Condition C: Fade-in, Condition D: Fade-out). Participants complete the NASA-TLX protocol after each session and a post-test questionnaire after all sessions. The participant's head movement is recorded throughout the experiment.

Gaze Data Visualization

I used Cinema 4D to create the classroom setting and to visualize how gaze data is communicated through opacity and bar charts. These interactions were then implemented in the Unity prototype used for the research.
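Underneath all three visualization conditions is the same input: which student the participant is currently looking at. One minimal way to capture this in Unity is head-direction raycasting (a common stand-in for eye tracking), sketched below; the component name, the "Student" tag, and the distance threshold are assumptions for illustration.

```csharp
using System.Collections.Generic;
using UnityEngine;

// A minimal sketch of gaze capture via head-direction raycasting.
// Names and parameters are illustrative assumptions.
public class GazeLogger : MonoBehaviour
{
    [SerializeField] private Camera hmdCamera;   // the participant's headset camera
    [SerializeField] private float maxDistance = 20f;

    // Accumulated gaze time in seconds, keyed by student agent name.
    public readonly Dictionary<string, float> GazeSeconds = new Dictionary<string, float>();

    void Update()
    {
        // Cast a ray from the center of the participant's view each frame.
        var ray = new Ray(hmdCamera.transform.position, hmdCamera.transform.forward);
        if (Physics.Raycast(ray, out RaycastHit hit, maxDistance)
            && hit.collider.CompareTag("Student"))
        {
            GazeSeconds.TryGetValue(hit.collider.name, out float t);
            GazeSeconds[hit.collider.name] = t + Time.deltaTime;
        }
    }
}
```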

Condition A: Normal Classroom

There is no data visualization in this condition. 

Normal.png

Condition B: Bar Charts

Gaze data is visualized through bar charts displayed above each student.

BarChart.png

Condition C: Fade-in

Gaze data is visualized through opacity. Students fade in when the participant makes more eye contact. It takes 10 seconds for the student to become fully opaque.

FadeIn.png

Condition D: Fade-out

Gaze data is visualized through opacity. Students fade out when the participant makes more eye contact. It takes 10 seconds for the student to become fully transparent. 

FadeOut.png

Teaching Task

Participants were instructed to complete a teaching task in each condition. Different teaching materials were provided in each condition and the order of the sessions was randomly selected. 

Instruction:

Now, we will start the teaching task. You have to teach the students for 5 minutes based on the slides provided in front of you. You can also see the slides at the back of the classroom and on the board behind you. You can view the timer next to the laptop and on the right side of the classroom. You can go back and forth between the slides by pressing the A and B buttons on your controller. Please spread your eye gaze equally among all students.

User Testing 

Quantitative Research: NASA-TLX Protocol

NASA-TLX (NASA Task Load Index) is a widely used subjective rating scale for assessing the perceived workload or mental effort of a task. The NASA-TLX protocol consists of six subscales, each rated on a 0-100 scale: Mental Demand, Physical Demand, Temporal Demand, Performance, Effort, and Frustration.

 

Participants are asked to rate their perceived workload on each subscale based on their subjective experience during the task. The final NASA-TLX score is calculated by weighting each subscale according to its perceived importance and summing the scores.

  • Mental demand: How much mental and perceptual activity was required to perform the task (e.g., thinking, looking, searching, deciding, teaching, etc.)? 

  • Physical demand: How much physical activity was required to perform the task (e.g., turning, controlling, etc.)?

  • Temporal demand: How much time pressure did you feel due to the rate or pace at which the tasks occurred?

  • Performance: How successful do you think you were in accomplishing the goals of the task set by the experimenter? How satisfied were you with your performance in accomplishing these goals?

  • Effort: How hard did you have to work (mentally and physically) to accomplish your level of performance?

  • Frustration level: How insecure, discouraged, irritated, stressed and annoyed versus secure, gratified, content, relaxed and complacent did you feel during the task?
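As a concrete example of the weighted scoring described above, the sketch below multiplies each 0-100 subscale rating by its pairwise-comparison weight (0-5, summing to 15 across the six subscales) and divides by 15. The example values are hypothetical.

```csharp
using System;

// A minimal sketch of the weighted NASA-TLX score. The example
// ratings and weights below are made up for illustration.
class TlxExample
{
    static double WeightedTlx(double[] ratings, int[] weights)
    {
        double sum = 0;
        for (int i = 0; i < 6; i++)
            sum += ratings[i] * weights[i]; // rating in [0, 100], weight in [0, 5]
        return sum / 15.0;                  // 15 = total number of pairwise comparisons
    }

    static void Main()
    {
        // Subscale order: Mental, Physical, Temporal, Performance, Effort, Frustration.
        double[] ratings = { 70, 30, 40, 55, 65, 45 }; // hypothetical ratings
        int[] weights = { 5, 1, 2, 3, 3, 1 };          // weights sum to 15
        Console.WriteLine(WeightedTlx(ratings, weights)); // ≈ 57.7
    }
}
```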

NASA-TLX_results.png

The NASA-TLX results showed higher cognitive load in the visualization conditions compared to the control condition, although we did not see statistically significant differences between conditions on any TLX measures. Participants preferred the Bar Graph condition overall.

Quantitative Research: Gaze Distribution Over Time

Study1+2_Heatmaps.png

We calculated the time that gaze was directed at each of the 30 student agents. Because we expanded the lecture materials, we were able to use the initial four minutes of teaching in each condition. We excluded values from Col 3 and Col 4 of Row 1 in all heatmaps, as participants reported that the student agents in these positions reacted to their gaze even when the participants were looking at the teaching materials on the podium. Based on the heatmaps, the conditions with visualizations generally showed more evenly distributed eye gaze compared to the control condition. Participants' gaze was most evenly distributed in the Bar Graph condition.
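One simple way to quantify "evenly distributed" gaze is the normalized Shannon entropy of each agent's share of total gaze time: 1.0 means perfectly even, 0.0 means all gaze fell on a single agent. The sketch below is an illustrative metric, not necessarily the paper's exact analysis.

```csharp
using System;
using System.Linq;

// A minimal sketch of a gaze-evenness metric: normalized Shannon entropy
// of each agent's share of total gaze time. Illustrative only.
class GazeEvenness
{
    static double Evenness(double[] gazeSeconds)
    {
        double total = gazeSeconds.Sum();
        double entropy = 0;
        foreach (double t in gazeSeconds)
        {
            if (t <= 0) continue;   // agents never looked at contribute nothing
            double p = t / total;   // share of total gaze time
            entropy -= p * Math.Log(p);
        }
        return entropy / Math.Log(gazeSeconds.Length); // normalize to [0, 1]
    }

    static void Main()
    {
        double[] even = Enumerable.Repeat(8.0, 30).ToArray(); // uniform gaze over 30 agents
        double[] uneven = new double[30];
        uneven[0] = 240;                                      // all gaze on one agent
        Console.WriteLine(Evenness(even));   // 1.0
        Console.WriteLine(Evenness(uneven)); // 0.0
    }
}
```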

New_Head_Rotation_Rate.png

The results of participants' head movements in yaw showed that in Condition B, which featured bar charts, participants' eye gaze was more evenly distributed than in the other conditions. These findings suggest that the bar charts may have helped participants achieve a more even gaze distribution. Overall, this study highlights the importance of carefully selecting appropriate data visualization techniques to improve user experience and support effective information processing.
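A head-rotation-rate measure can be approximated from the recorded yaw samples as the mean absolute yaw change per second, with differences wrapped into [-180, 180). The sketch below assumes a fixed sampling rate and is illustrative rather than the paper's exact computation.

```csharp
using System;

// A minimal sketch of a yaw rotation-rate measure over recorded head data,
// assuming yaw angles in degrees sampled at a fixed rate. Illustrative only.
class YawRateExample
{
    static double MeanYawRate(double[] yawDegrees, double sampleHz)
    {
        double total = 0;
        for (int i = 1; i < yawDegrees.Length; i++)
        {
            double d = yawDegrees[i] - yawDegrees[i - 1];
            d = ((d % 360) + 540) % 360 - 180; // wrap into [-180, 180)
            total += Math.Abs(d);
        }
        return total / (yawDegrees.Length - 1) * sampleHz; // degrees per second
    }

    static void Main()
    {
        // Hypothetical sweep back and forth across the classroom, sampled at 10 Hz.
        double[] yaw = { -40, -20, 0, 20, 40, 20, 0, -20, -40 };
        Console.WriteLine(MeanYawRate(yaw, 10)); // 200 deg/s
    }
}
```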

Qualitative Research: Post-test Questionnaire

The post-test questionnaire is based on standard usability testing practice for evaluating applications. It assesses usability along several dimensions, including efficiency, effectiveness, satisfaction, learnability, and flexibility. The results of the evaluation helped me identify areas for improvement and make changes to enhance the overall usability of the simulation.

User feedback:

What did you like the most about the experience?

  • "The VR technology itself in providing me lively data-based feedback without recruiting real students giving subjective feedback."

  • "I really like the bar chart! It clearly shows the engagement of students. When the bar chart lowers, I'm more likely to look to that direction to make it higher."

What did you like the least about the experience?

  • "I wish the bar chart could provide an accumulative amount of eye gaze that I gave to students rather than at each instantaneous moment (i.e. grow bar when I'm looking at them, stop growing and maintain bar height when I look away)"

  • "Fading students are sometimes distracting, especially the fade-out one."

  • "Bar chart a bit too distracting; I feel like I need to be validated by the data all the time. the obsession with the data metrics is a bit too much."

  • "I don't like the agents staring at me without any interactions. I like interactions such as nodding when making eye contact, but the agents can't even blink, which is actually kinda frustrating."

Impacts

Improving teaching behavior: Using VR teaching simulations to improve gaze behavior can enhance overall teaching effectiveness. With real-time feedback on their gaze behavior, teachers can adjust their teaching methods accordingly.

 

Promoting an engaging learning experience: Eye contact can facilitate communication between the teacher and the students. It can help convey nonverbal cues such as enthusiasm, interest, and empathy, which can make the learning experience more interactive and engaging for the students.

Reflection: Takeaways

This project focused on user research and interaction design. The head-movement data, NASA-TLX protocol, and post-test questionnaire helped me understand which interactions are effective at displaying data and improving teaching behavior.

If I had more time and resources, I could have recruited more participants. I also could have created an MR version of the prototype to visualize the real-time analysis of the gaze data in a real-life classroom. 

The feedback from the post-test questionnaire provided me with a deeper understanding of how users' behavior changes according to different interactions and data visualizations. It also helped me discover areas of improvement for enhancing the usability of the simulation. 
