VR Teaching Simulation

The VR teaching simulation aims to assist instructors in improving their nonverbal behaviors, specifically their gaze attention, by providing guidelines for training.

To investigate the most effective interface for data visualization in virtual reality, we conducted a study with four conditions: 1) normal classroom, 2) gaze data presented via bar charts, 3) gaze data presented through student opacity, and 4) gaze data presented through inverted student opacity. Our goal was to determine which interface would minimize cognitive load and enhance teaching behavior. This research was conducted by Yejoon Yoo, Jonathan Segal, Aleshia Hayes, and Andrea Stevenson Won.

My Role

Interaction Design,

Research Assistant,

3D Prototyping

Timeline

8 months, 

Virtual Embodiment Lab

Tools Used

Unity,

Figma,

Cinema 4D

Problem Statement

Teachers often struggle to maintain consistent and inclusive gaze behavior in the classroom, which can negatively impact student engagement and achievement. Current professional development resources and feedback mechanisms are limited, making it challenging for teachers to improve their gaze behavior and create a more inclusive learning environment. As a result, there is a need for a user-friendly and accessible app that can provide real-time feedback, data analytics, and personalized recommendations to help teachers track and improve their gaze behavior in the classroom.

Virtual reality (VR) technology has been integrated into education to enhance learning by creating immersive and interactive experiences. VR simulations offer an opportunity for teachers to examine and improve their teaching behaviors. Gaze data visualization is one such technique that can provide teachers with feedback on their gaze distribution during VR simulations. However, the effectiveness of different types of gaze data visualization in enhancing teaching practice has not been widely researched.

Research Questions

1. What types of interactions are effective in the immersive setting for viewing data visualization in the context of teaching?

2. What technology can efficiently analyze teachers' gaze at virtual students and improve their teaching behavior?

3. How can we evaluate the teaching outcomes of users based on their performances in each simulation? 

teacher_illustration.png

User Research

Target Users

  • Instructors who are interested in improving their teaching behaviors, especially making more eye contact with students

  • Individuals who are interested in trying teaching simulations to practice their teaching skills. 

Research Methods

  • In-person interview

  • Online research 

  • Quantitative research (NASA-TLX protocol, gaze distribution over time)

  • Qualitative research (Post-test questionnaire)

Pain Points / Key Insights

Based on the interviews with 10 participants, I discovered pain points and key insights for designing the interactions for the real-time analysis of gaze data.

  • Difficulty maintaining eye contact with every student in the classroom, especially during large or fast-paced lessons.

  • Lack of feedback or guidance on how to improve gaze behavior and create a more inclusive classroom environment.

  • Inadequate resources or professional development opportunities to support teachers in improving their teaching skills and classroom management techniques.

By offering real-time gaze data analysis, VR teaching simulation could help teachers track and improve their gaze behavior in the classroom, which could lead to engaging teacher-student interactions.

VR_simulation.png

VR Teaching Simulation

I designed four VR teaching conditions to help participants correct bias in their eye gaze while teaching. One condition is a normal classroom setting in which participants can practice teaching in an environment similar to a real-life classroom. The other three conditions visualize the participants' gaze data through bar charts and student opacity.

01  Condition A: Normal Classroom

I included a normal classroom setting so participants can practice teaching in a virtual classroom that resembles a real-life setting. There is no data visualization in this condition; I used it as a baseline for comparing mental load against the conditions with data visualization.

02  Condition B: Bar Charts

I designed this setting and its instructional UI. Participants perform a teaching task in which each virtual student displays a bar chart of the participant's gaze data. A bar grows taller as the participant makes more eye contact with that student. I focused on making the bar charts react gradually to the participant's teaching behavior.
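The gradual bar-chart reaction can be sketched as a small per-student gaze timer. This is an illustrative Python sketch, not the actual study code (the prototype was built in Unity); the class name, the fill duration, and the grow/shrink behavior are assumptions.

```python
class StudentBar:
    """Hypothetical per-student gaze tracker that maps gaze time to bar height."""

    def __init__(self, max_height=1.0, fill_seconds=10.0):
        self.gaze_seconds = 0.0           # accumulated recent eye contact
        self.max_height = max_height      # world-space height of a full bar
        self.fill_seconds = fill_seconds  # gaze time needed for a full bar

    def update(self, is_gazed_at, dt):
        # Grow gradually while the teacher looks at this student,
        # shrink gradually when they look away; clamp to [0, fill_seconds].
        self.gaze_seconds += dt if is_gazed_at else -dt
        self.gaze_seconds = min(max(self.gaze_seconds, 0.0), self.fill_seconds)

    @property
    def height(self):
        # Bar height scales linearly with accumulated gaze time.
        return self.max_height * self.gaze_seconds / self.fill_seconds
```

Calling `update` once per frame with the frame time `dt` produces the gradual rise and fall described above rather than an instantaneous jump.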

03  Condition C: Fade-in

I used opacity to visualize participants' gaze data. Participants perform a teaching task in a condition in which a student fades in when the participant makes more eye contact with them.

04  Condition D: Fade-out

I used another approach to visualizing gaze data through opacity. Participants perform a teaching task in this condition in which a student fades out when the participant makes more eye contact with them. This is the inverted version of Condition C, and I wanted to test which of the two opacity conditions participants respond to more readily.

Teaching Session (Condition A)

This is a video recording of a teaching session in Condition A: Normal Classroom. The head movements of the participant were recorded throughout the whole session.

Process

Affinity Diagram & Sketches

I created an affinity diagram to envision the ways to visualize the gaze data of users and simulation features that can help users easily view their teaching outcomes. After considering various options for visualizing eye gaze data, I chose to use bar charts and opacity because they are widely recognized and familiar to most people, which can enhance the accessibility and clarity of the data presented.

Sketch_01.png
Sketch_02.png

Procedure of Study & UI

I created the flow of the study and the UIs to help participants understand the instructions and procedure. The study starts with a brief introduction and a consent form. The first teaching task begins once the participant is comfortable with the virtual environment and controls. There are four teaching tasks, one in each condition (Condition A: Normal Classroom, Condition B: Bar Charts, Condition C: Fade-in, Condition D: Fade-out). The participant completes the NASA-TLX protocol after each session and a post-test questionnaire after the final session. Head movement is recorded throughout the experiment.

Gaze Data Visualization

I used Cinema 4D to create a classroom setting and visualize how the gaze data is communicated through opacity and bar charts. This interaction was implemented into the Unity prototype that was used for research.  

Condition A: Normal Classroom

There is no data visualization in this condition. 

Normal.png

Condition B: Bar Charts

Gaze data is visualized through bar charts, which are displayed above each student.

BarChart.png

Condition C: Fade-in

Gaze data is visualized through opacity. Students fade in when the participant makes more eye contact. It takes 10 seconds for the student to become fully opaque.

FadeIn.png

Condition D: Fade-out

Gaze data is visualized through opacity. Students fade out when the participant makes more eye contact. It takes 10 seconds for the student to become fully transparent. 

FadeOut.png
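The 10-second opacity mappings in Conditions C and D can be summarized as a simple linear function of accumulated eye-contact time. This is an illustrative Python sketch under stated assumptions (the actual prototype was implemented in Unity; the function names and the linear ramp are mine, with the fade-out curve taken as the exact inverse of fade-in as the text describes):

```python
FADE_SECONDS = 10.0  # eye-contact time for a full fade, per the study design

def update_gaze_time(gaze_seconds, is_gazed_at, dt):
    """Accumulate eye-contact time toward a student, decaying when the
    teacher looks away; clamped to [0, FADE_SECONDS]."""
    gaze_seconds += dt if is_gazed_at else -dt
    return min(max(gaze_seconds, 0.0), FADE_SECONDS)

def opacity(gaze_seconds, fade_in=True):
    """Condition C (fade_in=True): a student becomes fully opaque after
    10 s of eye contact. Condition D: the inverted mapping, reaching
    full transparency after 10 s."""
    t = gaze_seconds / FADE_SECONDS
    return t if fade_in else 1.0 - t
```

In a Unity implementation the returned value would typically be written into the alpha channel of the student's material each frame.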

Teaching Task

Participants were instructed to complete a teaching task in each condition. Different teaching materials were provided in each condition and the order of the sessions was randomly selected. 

Instruction:

Now, we will start the teaching task. You will teach the students for 5 minutes based on the slides provided in front of you. You can also see the slides at the back of the classroom and on the board behind you. You can view the timer next to the laptop and on the right side of the classroom. You can move back and forth between the slides by pressing the A and B buttons on your controller. You can move around using the joystick. Please move your head and change your position often to spread your eye gaze equally among all students.

User Testing 

Quantitative Research: NASA-TLX Protocol

NASA-TLX (NASA Task Load Index) is a widely used subjective rating scale for assessing the perceived workload or mental effort of a task. The NASA-TLX protocol consists of six subscales, each rated on a 0-100 scale: Mental Demand, Physical Demand, Temporal Demand, Performance, Effort, and Frustration.

 

Participants are asked to rate their perceived workload on each subscale based on their subjective experience during the task. The final NASA-TLX score is calculated by weighting each subscale according to its perceived importance and summing the scores.

  • Mental demand: How much mental and perceptual activity was required to perform the task (e.g., thinking, looking, searching, deciding, teaching, etc.)? 

  • Physical demand: How much physical activity was required to perform the task (e.g., turning, controlling, etc.)?

  • Temporal demand: How much time pressure did you feel due to the rate or pace at which the tasks occurred?

  • Performance: How successful do you think you were in accomplishing the goals of the task set by the experimenter? How satisfied were you with your performance in accomplishing these goals?

  • Effort: How hard did you have to work (mentally and physically) to accomplish your level of performance?

  • Frustration level: How insecure, discouraged, irritated, stressed and annoyed versus secure, gratified, content, relaxed and complacent did you feel during the task?
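The weighting-and-summing step described above can be made concrete. In the standard NASA-TLX procedure, weights come from 15 pairwise comparisons between the six subscales, so each weight is 0-5 and the weights sum to 15; the sketch below is a generic illustration of that computation, not code from this study.

```python
SUBSCALES = ["mental", "physical", "temporal",
             "performance", "effort", "frustration"]

def tlx_score(ratings, weights):
    """Weighted NASA-TLX score.

    ratings: subscale -> rating on a 0-100 scale
    weights: subscale -> times chosen in pairwise comparisons (0-5)
    """
    assert sum(weights.values()) == 15, "pairwise weights must total 15"
    # Weighted sum of ratings, normalized by the 15 comparisons.
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0
```

With equal ratings across all subscales, the weighted score equals the common rating regardless of the weights, which is a quick sanity check for an implementation.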

NASA-TLX Protocol Results.png

Quantitative Research: Gaze Distribution Over Time

Head_Radian.png

In this study, the head movements of participants were analyzed to determine the distribution of eye gaze while viewing different types of data visualizations. The participants' head movements were plotted in radians and analyzed to determine the evenness of their eye gaze distribution. The following graphs present the results from one of the participants. 

*Y = Radians, X = Time in seconds

Condition A: Normal Classroom 

27091_Controll.png

Condition B: Bar Charts

27091_Bar_Chart.png

Condition C: Fade-in

27091_Fade_In.png

Condition D: Fade-out

27091_Fade_Out.png

The results of this participant's head movements showed that in Condition B, which featured bar charts, eye gaze was more evenly distributed than in the other conditions. This finding suggests that the bar charts may have helped this participant achieve a more even gaze distribution. Overall, the study highlights the importance of carefully selecting data visualization techniques to improve user experience and facilitate effective information processing.
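The "evenness" of gaze distribution can be quantified from the recorded head yaw. This is an analysis sketch under stated assumptions, not the study's published method: the bin edges, the yaw range, and the choice of normalized entropy as the evenness metric are all mine.

```python
import math

def gaze_evenness(yaw_radians, n_bins=5, yaw_range=(-math.pi / 2, math.pi / 2)):
    """Normalized entropy of head-yaw samples across classroom sectors.

    Returns a value in [0, 1]: 1 means gaze time was spread perfectly
    evenly across the bins, 0 means it all fell in a single bin.
    """
    lo, hi = yaw_range
    counts = [0] * n_bins
    for yaw in yaw_radians:
        # Map each yaw sample to one of n_bins equal-width sectors.
        idx = min(int((yaw - lo) / (hi - lo) * n_bins), n_bins - 1)
        counts[max(idx, 0)] += 1
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    entropy = -sum(p * math.log(p) for p in probs)
    return entropy / math.log(n_bins)
```

Applied to each session's yaw trace, a metric like this would let the four conditions be compared with a single number per participant.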

Qualitative Research: Post-test Questionnaire

The post-test questionnaire is based on meCUE, a usability evaluation tool that assesses usability across several dimensions, including efficiency, effectiveness, satisfaction, learnability, and flexibility. The results of the evaluation helped me identify areas for improvement and make changes to enhance the overall usability of the simulation.

IMG_1354.jpg

User feedback:

What did you like the most about the experience?

  • "The VR technology itself in providing me lively data-based feedback without recruiting real students giving subjective feedback."

  • "I really like the bar chart! It clearly shows the engagement of students. When the bar chart lowers, I'm more likely to look to that direction to make it higher."

What did you like the least about the experience?

  • "I wish the bar chart could provide an accumulative amount of eye gaze that I gave to students rather than at each instantaneous moment (i.e. grow bar when I'm looking at them, stop growing and maintain bar height when I look away)"

  • "Fading students are sometimes distracting, especially the fade-out one."

  • "Bar chart a bit too distracting; I feel like I need to be validated by the data all the time. the obsession with the data metrics is a bit too much."

  • "I don't like the agents staring at me without any interactions. I like interactions such as nodding when making eye contact, but the agents can't even blink, which is actually kinda frustrating."

Impacts

Improving teaching behavior: Using VR teaching simulations to improve gaze behavior can also enhance overall teaching effectiveness. By receiving real-time feedback on their gaze behavior, teachers can adjust their teaching methods accordingly and optimize their effectiveness.

 

Promoting an engaging learning experience: Eye contact can facilitate communication between the teacher and the students. It can help convey nonverbal cues such as enthusiasm, interest, and empathy, which can make the learning experience more interactive and engaging for the students.

Reflection: Takeaways

This project focused on user research and interaction design. The head movement data, NASA-TLX protocol, and post-test questionnaire helped me understand which interactions are effective for displaying data and improving teaching behavior.

If I had more time and resources, I could have recruited more participants. I also could have created an MR version of the prototype to visualize the real-time analysis of the gaze data in a real-life classroom. 

The feedback from the post-test questionnaire provided me with a deeper understanding of how users' behavior changes according to different interactions and data visualizations. It also helped me discover areas of improvement for enhancing the usability of the simulation. 

bottom of page