24-3001 Head Spin: Multisensory Integration of 3D Orientation in VR

Introduction

Animals, including humans, possess the ability to perceive their orientation relative to the surrounding environment. Orienting in three-dimensional space requires complex multisensory integration and cognition, as cues from various senses and axes of rotation must be combined in a nonlinear, non-commutative manner. The brain's representation of orientation can constrain neural computations, impacting functions such as how disturbances are combined across multiple axes of rotation. Impairments in orientation can result in postural imbalance and difficulties in navigation and object manipulation.
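
To make the non-commutativity concrete: applying the same two rotations in a different order generally produces a different final orientation. The short, self-contained C++ sketch below (a standalone demo, not project code) rotates a point 90° about the z-axis and then the x-axis, and vice versa, and prints two different results.

```cpp
#include <cstdio>

// Standalone demo (not project code): rotations about different axes
// do not commute, so the order they are applied in changes the result.
struct Vec3 { double x, y, z; };

// Rotate 90 degrees about the z-axis: (x, y, z) -> (-y, x, z)
Vec3 rotZ90(Vec3 v) { return { -v.y, v.x, v.z }; }

// Rotate 90 degrees about the x-axis: (x, y, z) -> (x, -z, y)
Vec3 rotX90(Vec3 v) { return { v.x, -v.z, v.y }; }

int main() {
    Vec3 p { 1.0, 2.0, 3.0 };
    Vec3 a = rotX90(rotZ90(p)); // z first, then x
    Vec3 b = rotZ90(rotX90(p)); // x first, then z
    std::printf("z then x: (%g, %g, %g)\n", a.x, a.y, a.z); // (-2, -3, 1)
    std::printf("x then z: (%g, %g, %g)\n", b.x, b.y, b.z); // (3, 1, 2)
    return 0;
}
```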

The 'Head Spin' project explores human processing of 3D spatial rotations, the interaction of disturbances across different modalities and axes, and the impact of passive and active movements on orientation perception. By examining these key questions, this research will provide valuable insights into human cognition and behaviour in spatial localization.

Background

The Head Spin project aims to create the VR tool used to carry out a study on how humans process 3D spatial rotations, and how disturbances in different modalities and axes affect how humans perceive orientation in a 3D environment. Disturbances were not in the scope of this project but will eventually be added; they include a rotational chair, galvanic vestibular stimulation (GVS), and spatial audio cues. As such, this project aims to create a Virtual Reality (VR) interface in which participants complete the study by performing a rotation task across multiple trials.

The challenge is that 3D orientation requires complex multisensory integration and cognition: cues from different senses and axes of rotation must be combined in a non-linear, non-commutative manner, so the same rotations applied in a different order do not yield the same result. If the brain's representation of orientation is impaired across multiple axes of rotation, this can lead to difficulty navigating and manipulating objects, as well as physical manifestations like postural imbalance. As such, the project aims to examine how these disturbances, as well as passive and active movements, impact how humans perceive and experience orientation. By examining these key questions, we will gain valuable insights into human cognition and behaviour in spatial localization.

Objective

As the objective of the project is to create a VR tool for a research study, the main challenge was working with virtual reality and translating the needs of the study into the VR environment. When the idea was brought to EML for development, the expectation was that phase 1 would be developed over the term, with some groundwork laid for phase 2. As such, the phase 2 tools and technologies, such as galvanic vestibular stimulation (GVS), spatial audio cues, and the rotational chair, will be incorporated after this project.

Format/Versioning

The VR interface is developed in Unreal Engine 5.3.2, with an in-house EML server used to test connections from the interface. Wireframes were designed in Figma, and 3D assets were created in Blender. The project was designed to run on the Meta Quest.

Primary Features & Functionalities

The virtual interface developed includes two primary features: a tutorial and the rotation task.

Functionalities

The first table details the success criteria as outlined in the Head Spin Project Agreement at the start of the term, and the second table provides a more comprehensive list of the project's functionalities, iterated on through the term.


Table 1: Success Criteria from Head Spin Project Agreement  

No. | Task | Priority | Status
F1 | Develop a virtual environment with controls for precise manipulation of rotating objects. | Must have | Complete
F2 | Establish a connection between the rotating platform* and the virtual environment. | Must have | Complete
F3 | Implement functionality for Phase 1, sections A and B. | Must have | Complete
F4 | Implementations will be done with consideration for the project's future plans: (1) allow for future implementation of an omnidirectional treadmill; (2) allow for future implementation of galvanic vestibular stimulation. | Must have | Complete

*Modified to be the server instead of the rotating platform due to equipment inaccessibility.


Table 2: Core Functionalities

No. | Task | Priority | Status
F1 | VR rotational task design aligns with the experimental design paradigm (e.g. phase 1, phase 2, and the transitions between them within a single trial and across trials are fully functional). | Must have | Complete
F2 | Phase 1: In a single trial, the user can view the rotations of an object. | Must have | Complete
F3 | Phase 1 Active: In a single trial, the user can view the rotations of an object and rotate the object using a controller. | Must have | Complete
F4 | Phase 1 Passive and Phase 2 transition: In a single trial, the user is taken to phase 2 after watching the rotation of the object in phase 1. | Must have | Complete
F5 | Phase 2: In a single trial, the user can rotate the object back to its initial orientation using a controller. | Nice to have | Complete
F6 | Researcher can access downloaded (binary to CSV) participant and object data when the task is finished (see the export sketch after this table). | Must have | Complete
F7 | Researcher can set up the experiment by inputting networking settings and participant information into the UI. | Should have | Complete
F8 | Virtual environment and rotational object created. | Must have | Complete
F9 | User can view and interact with a tutorial to understand the task in VR (e.g. task overview, goal of each phase, how to rotate using controllers). | Must have | Complete
F10 | User can view in-task indicators, such as new-trial alerts and controller tooltips. | Should have | Incomplete
F11 | User can access an in-task menu (e.g. to pause the task and adjust settings). | Nice to have | Incomplete
F12 | Error indicators pop up for a wrong button pressed during the tutorial/task.* | Should have | Partially complete
F13 | Error indicators pop up for wrongly inputted data during setup. | Nice to have | Incomplete
F14 | Style and component guide for cohesive implementation of the UI and related elements. | Nice to have | Complete

*This feature was instead replaced by disabling other buttons during the task.
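
For F6, the exact binary record layout written by the task is not documented on this page, so the following is only a minimal, self-contained C++ sketch of the binary-to-CSV conversion idea. It assumes a hypothetical fixed-size record of trial index, timestamp, and object/controller orientations; the struct layout, field names, and file names are illustrative and would need to match the project's actual format.

```cpp
#include <cstdint>
#include <cstdio>

// Hypothetical fixed-size record layout; the real binary format written by
// the task is not documented here, so adjust these fields to match it.
#pragma pack(push, 1)
struct TrialRecord {
    std::uint32_t trialIndex;
    float         timestamp;    // seconds since trial start
    float         objQuat[4];   // object orientation (x, y, z, w)
    float         ctrlQuat[4];  // controller orientation (x, y, z, w)
};
#pragma pack(pop)

int main() {
    // Hypothetical file names.
    FILE* in  = std::fopen("participant01.bin", "rb");
    FILE* out = std::fopen("participant01.csv", "w");
    if (!in || !out) return 1;

    std::fprintf(out, "trial,time,obj_x,obj_y,obj_z,obj_w,ctl_x,ctl_y,ctl_z,ctl_w\n");

    // Read fixed-size records until EOF, writing one CSV row per record.
    TrialRecord r;
    while (std::fread(&r, sizeof r, 1, in) == 1) {
        std::fprintf(out, "%u,%.4f,%.6f,%.6f,%.6f,%.6f,%.6f,%.6f,%.6f,%.6f\n",
                     r.trialIndex, r.timestamp,
                     r.objQuat[0], r.objQuat[1], r.objQuat[2], r.objQuat[3],
                     r.ctrlQuat[0], r.ctrlQuat[1], r.ctrlQuat[2], r.ctrlQuat[3]);
    }

    std::fclose(in);
    std::fclose(out);
    return 0;
}
```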


Tech Stack and Development Overview

Technical Components

The table below lists the tools used in the project's design and development.

Tool | Description
Unreal Engine | Game engine used to develop the project's Virtual Reality interface.
Figma | Design software used for wireframing and prototyping the project's UX/UI.
EML Server | Used as a placeholder for the rotational chair (e.g. sending messages to the server to check that the connection point works).
Miro | Mind-map visualization software used for architecture diagrams and UX-related work (e.g. user flows).
Trello | Project management software used for team and project organization.
Qualtrics | Survey software used for gathering quantitative user research findings.
Blender | Used for 3D modelling of the rotational object and related assets.
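
This page does not document the EML server's protocol or address, so the sketch below only illustrates the "send a message to check the connection point" idea: a minimal POSIX TCP client (plain C++, outside Unreal) that connects, sends a test message, and prints any reply. The host, port, and "ping" payload are hypothetical placeholders, not the project's actual settings.

```cpp
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

#include <cstdio>
#include <cstring>

int main() {
    // Hypothetical address, port, and payload; substitute the actual
    // EML server settings entered in the experiment setup UI.
    const char* host = "192.168.0.10";
    const int   port = 5000;
    const char* msg  = "ping";

    int sock = socket(AF_INET, SOCK_STREAM, 0);
    if (sock < 0) { perror("socket"); return 1; }

    sockaddr_in addr {};
    addr.sin_family = AF_INET;
    addr.sin_port   = htons(port);
    if (inet_pton(AF_INET, host, &addr.sin_addr) != 1) { close(sock); return 1; }

    if (connect(sock, reinterpret_cast<sockaddr*>(&addr), sizeof addr) < 0) {
        perror("connect");  // connection point is not reachable
        close(sock);
        return 1;
    }

    // Send a short test message and print any reply from the server.
    send(sock, msg, std::strlen(msg), 0);
    char buf[256] = {};
    ssize_t n = recv(sock, buf, sizeof buf - 1, 0);
    if (n > 0) std::printf("server replied: %s\n", buf);

    close(sock);
    return 0;
}
```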

Design Overview

First Time Setup Guide

If you do not already have Unreal Engine installed, start with the "Setup" section below.

Tools, equipment and settings

  • Desktop machine to run the project
  • Meta Quest 3 headset
  • Link cable connecting the headset to the PC

From the desktop, set up the headset:

  1. Open the Meta Quest Link app on the PC. Minimize the window.
  2. Connect the headset to the PC via the Link cable.
  3. Put on the headset.
  4. Turn on the headset by holding down the power button on the side of the headset.
    1. A blue Meta logo will appear and you will hear an audio cue.
  5. Using the trigger button on the controller, click the Wi-Fi signal/clock on the bottom left of the bar to open the Quick Settings.
  6. From the Quick Settings, select "Quest Link".
  7. At the prompt, click "Launch" using the trigger button. Make sure that the toggle for Air Link is NOT selected.
  8. When the screen changes and the menu with apps appears, you are ready to launch the project.
  9. Remove the headset and return to the desktop.

Setup

Setting up Unreal Engine

Below are instructions on how to install Unreal Engine.

  1. Visit the Unreal Engine download page and install the Epic Games Launcher.
  2. After following the steps to install the launcher, select Unreal Engine.
  3. Go to the Library tab and press the "+" icon. Choose Unreal Engine 5.3.2.
  4. Under Folder, choose where to install Unreal Engine. It is recommended not to change this if you have the space on your primary drive.
  5. Under the install path, click Options and unselect the Android, iOS, and Linux platforms. This will save you about 22 GB of installation space.
  6. Let the installation finish, then run Unreal Engine. The first launch takes a while, so be patient!

Working with the Server (For Developers)

Challenges

Trial Overview

Each trial in the study consists of the rotation completion task, which is split into two parts: phase 1 and phase 2.

Phase 1 is further split into passive and active modes. Participants see an object rotate across two axes and either copy the rotation with their controller (active) or skip directly to phase 2 (passive). The object has no rotational symmetries; for the purposes of the project, we chose a book.

In phase 2, participants use their controller to rotate the object back to its initial orientation. This phase will eventually incorporate the disturbances; currently it has none. The disturbances include:

  1. Galvanic vestibular stimulation (GVS), which uses electrodes placed on the mastoid processes to introduce weak currents that create calibrated disturbances in the vestibular system. This approximates a roll sensation.
  2. Omnidirectional treadmill/rotational chair. The rotational stage underneath the omnidirectional treadmill will rotate, causing disturbances in the yaw direction.
  3. Moving spatial audio cues, which will be delivered through a set of high-fidelity speakers surrounding the participant, causing disturbances in the roll and yaw directions.

Once participants finish a trial, they are taken to the next trial. For a more detailed look at the overall process, refer to the project's user flow diagram.
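
This page does not specify how phase 2 responses are scored, but a natural per-trial measure is the residual angle between the object's initial orientation and where the participant left it. Below is a minimal Unreal C++ sketch of that idea; the function and parameter names are illustrative, not the project's own.

```cpp
#include "CoreMinimal.h"

// Illustrative sketch (not project code): residual angular error between the
// object's initial orientation and its orientation at the end of phase 2.
// 0 degrees means a perfect response.
float ComputeTrialErrorDegrees(const FQuat& InitialQuat, const FQuat& FinalQuat)
{
    // FQuat::AngularDistance returns the angle (in radians) of the single
    // rotation that takes one orientation to the other.
    const float ErrorRadians = InitialQuat.AngularDistance(FinalQuat);
    return FMath::RadiansToDegrees(ErrorRadians);
}
```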


Future Considerations

Immediate quality-of-life improvements can be made to deliver the full user experience, such as implementing the already-designed:

  • Controller tooltips
  • Pause and settings menus
  • Sounds for button presses
  • Detailed tutorial

Beyond quality-of-life improvements, we anticipate that implementing different control mechanisms for greater ease of use is a potential direction for future work.

The controls underwent an extensive, iterative design process throughout the term. We initially designed the interface to work with joystick controls and grip "steering wheel" controls: the user could move the object with the joystick on the controller and use the grip button to rotate the object left and right like a steering wheel.

However, based on feedback from the project design review session midway through the term and from user testing, we removed the joystick functionality and transitioned to controls that mimic grabbing the object: to rotate the object, the user presses the grip button and physically rotates the controller in the direction they intend the object to rotate within the interface. A sketch of this mapping follows.
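
As a minimal Unreal C++ sketch of that grab-style mapping (function and variable names are illustrative, not the project's actual code): cache the controller and object orientations when the grip is pressed, then each frame apply the controller's rotation since the grab on top of the object's cached orientation.

```cpp
#include "CoreMinimal.h"

// Orientations cached at the moment the grip button is pressed
// (illustrative names, not the project's own).
static FQuat GrabControllerQuat;
static FQuat GrabObjectQuat;

void OnGripPressed(const FQuat& ControllerQuat, const FQuat& ObjectQuat)
{
    GrabControllerQuat = ControllerQuat;
    GrabObjectQuat = ObjectQuat;
}

// Called every frame while the grip is held; returns the object's new orientation.
FQuat WhileGripHeld(const FQuat& ControllerQuat)
{
    // How far the controller has rotated since the grab. With Unreal's FQuat,
    // A * B applies B first, then A.
    const FQuat Delta = ControllerQuat * GrabControllerQuat.Inverse();

    // Apply the same delta rotation to the object's orientation at grab time.
    return Delta * GrabObjectQuat;
}
```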

Rotation controls remain an inherently iterative design problem because of the distinct ergonomics they demand of each user, and each user's different preferences and capabilities. As such, we anticipate further iteration as a future improvement; options could include adding a grab point to the object that the user grabs to rotate it, or re-implementing joystick controls.

Team

Principal Investigator

Dr. Manu Madhav

Assistant Professor

School of Biomedical Engineering, Faculty of Applied Science

The University of British Columbia

Current Team

Victoria Lim, Project Lead, UI/UX Designer (May 2024 - Sep 2024)

Work Learn at the Emerging Media Lab at UBC

Undergraduate in Bachelor of Science in Cognitive Systems

University of British Columbia

Graydon Strachan, Software Developer (May 2024 - Sep 2024)

Work Learn at the Emerging Media Lab at UBC

Undergraduate in Bachelor of Applied Science in Computer Engineering

University of British Columbia

Jerry Wang, Software Developer (May 2024 - Sep 2024)

Work Learn at the Emerging Media Lab at UBC

Undergraduate in Bachelor of Science in Cognitive Systems

University of British Columbia

Poster

License

MIT License

Copyright (c) 2024 University of British Columbia

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.