Documentation:23-1002 Behavioural Biomarker

About EML
A collaborative space for UBC faculty, students and staff for exploration of emerging technologies and development of innovative tools and solutions. This wiki is for public-facing documentation for our projects and procedures.

Emerging Media Lab Project Documentation

Introduction

Background

Memory-based tests are widely used to assess neurodegenerative decline due to Alzheimer's disease and other dementias. Navigational ability and spatial memory are also strongly affected in such cases, yet they are often overlooked in memory testing.

The Emerging Media Lab at UBC is collaborating with the Neural Circuits for Computation, Cognition and Control (NC4) Laboratory to develop a novel approach to assessing cognitive decline through the individual's "cognitive map" using Virtual Reality (VR).

This project is a continuation of the Minimum Viable Product (MVP) of the VR task, developed by the team of the principal investigator, Dr. Manu Madhav.

"Alzheimer’s Disease (AD) causes progressive neurodegeneration, i.e. the death of neurons and their connections in the brain. If AD is diagnosed early, drugs and lifestyle changes can be used to slow down its progress. In its early stages, AD affects brain regions that are responsible for navigation and planning. The difference in our ability to remember the location of landmarks around us with respect to us and with respect to each other may be affected differently in early AD compared to normal aging. Our virtual reality task is designed to be able to quantify these differences using relatively few trials." (Dr. Manu Madhav, Principal Investigator)

Objective

The development of this VR tool is being continued at EML, with a specific focus on improving the usability and accessibility of the task. This project aims to supplement the original experience by streamlining the back-end operations and data collection methods, and by improving the accessibility and intuitiveness of the program's user interface. Key goals include creating a tutorial level and implementing a dynamic path generation algorithm that gives researchers greater flexibility and room for expansion.

Through these improvements, it is hoped that using VR as an assessment tool in experimental research will let researchers examine Alzheimer's disease from a different angle and open opportunities for greater understanding.

Format and Versioning

Behavioural Biomarker (MVP)

Behavioural Biomarker is a VR experience developed in Unity.

Behavioural Biomarker (EML Prototype)

This project is built on Unity 2021.3.4f1.

Primary Features

The experience consists of the following completed features:

F1. Dynamic Maze Generation

In the previous version of Biomarker, the paths used in each trial were pre-determined and written manually into binary files. Each binary code represented a different wall, window, or other component of a room or path, and had to be interpreted by the application before use. This limited the customizability and expandability of paths.

In the latest version, the team opted for a more dynamic, engine-driven generation process, where the only manual input needed from the developer is the length and number of turns of each path, and the number of times each difficulty level should be repeated. This input is provided through Unity components in the Editor, replacing the binary files. See section D.2. for details.

Major refactoring of the codebase was undertaken to optimise transitions between game states and ensure that the program operated according to the intended user flow.
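The in-engine generation is implemented in C# inside Unity; the sketch below illustrates the underlying idea in Python instead, under the stated inputs (path length, number of turns, and repeats per difficulty level). The function names (`generate_path`, `build_trials`) and the rejection-sampling approach are illustrative assumptions, not the project's actual API.

```python
import random

# The four cardinal unit steps on the grid (N, S, E, W).
DIRS = [(0, 1), (0, -1), (1, 0), (-1, 0)]

def _attempt(length, num_turns, rng):
    """One random-walk attempt; returns None if the path crosses itself."""
    # Pick which steps change direction. Starting at step 2 guarantees
    # every chosen turn is visible in the finished path.
    turn_steps = set(rng.sample(range(2, length), num_turns))
    heading = rng.choice(DIRS)
    cells = [(0, 0)]
    for step in range(1, length):
        if step in turn_steps:
            # 90-degree turn: exclude straight ahead and the reverse.
            heading = rng.choice([d for d in DIRS
                                  if d != heading
                                  and d != (-heading[0], -heading[1])])
        nxt = (cells[-1][0] + heading[0], cells[-1][1] + heading[1])
        if nxt in cells:
            return None
        cells.append(nxt)
    return cells

def generate_path(length, num_turns, seed=None, max_tries=100):
    """Generate a non-self-intersecting grid path of `length` cells
    containing exactly `num_turns` 90-degree turns."""
    rng = random.Random(seed)
    for _ in range(max_tries):
        cells = _attempt(length, num_turns, rng)
        if cells is not None:
            return cells
    raise RuntimeError("no valid path found; relax length/turn constraints")

def build_trials(levels, repeats):
    """Expand (length, turns) difficulty levels into a flat trial list,
    repeating each level the requested number of times."""
    return [generate_path(length, turns)
            for (length, turns), n in zip(levels, repeats)
            for _ in range(n)]
```

With this shape, a researcher only edits the `levels` and `repeats` inputs, mirroring how the Unity component exposes them in the Inspector.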

F2. Increased Usability and UI Improvements

The solution was projected to include a fully functional tutorial level. The goal of the tutorial is to prevent users from improving over the trials simply because of growing task familiarity.

The experience underwent the following improvements:

  1. Reduced motion sickness by replacing the static console with a hand-menu console.
  2. Clearer text instructions.
  3. Clear diagrams of the controls.
  4. Intuitive movement/input systems.
  5. Comprehensive instructional tutorial panels (to be implemented).

Design Methods

  • User personas
  • User flow
  • User Interface Wireframes (Figma)
  • Implementation of UI into Unity
  • Prefab creation for rooms and corridors

Development

Github Branches

Our repository can be found here: https://github.com/ubceml/23-1002-Biomarker.git

Please refer to the table below for a description of all the branches in the repository.

Active Branches

Branch Name Description
main The original project, after deletion of extra assets
Develop The most up-to-date branch, containing the latest changes
devMariane The branch with the UI implemented in it

Stale Branches

Branch Purpose Last Modified Date Modified By Build/Demo Ready?

Issues/Bugs

  • Start Location acknowledgement appears to be skipped sometimes
  • When the user is pointing to where they started (in the LocateStart state) and chooses the direction correctly, the step of acknowledging the starting point appears to be skipped.
  • This is due to the starting-point pin having an XR Interactable component responsible for detecting the right controller's raycast when acknowledging the starting point.
  • The acknowledge step is not actually skipped; it is performed immediately and passed over.
  • The start pin appears while the user is still in the process of locating the starting point and the trigger is still held down from the previous step. The pin therefore detects the controller as soon as it appears, immediately triggering the acknowledgement and moving on to the next task.
  • Suggestion: there are many potential fixes, one of which is activating the XR Interactable on the start pin after a short delay, giving the user time to release the trigger from the previous step.
  • “PathRoute.lastStepDir” variable
  • The value of PathRoute's “lastStepDir” KeyValuePair is the direction indicating where the door to the next path should be placed.
  • We attempt to assign the correct value to this variable in the “GridManager.GetValidNextStart(path)” function, around line 345. However, KeyValuePair is an immutable struct in C#, so the correct direction is saved in the PathRoute dictionary but not in the “lastStepDir” variable.
  • Solution: split both “firstStepDir” and “lastStepDir” into separate int and Vector2Int fields, which are modifiable.
  • Falling at the beginning of the game [FIXED]
  • User Falls before pressing the new Instance Button
  • Solution used: Gravity was set to false on awake and movement locked.
  • Door can be walked through [FIXED]
  • The user could walk back through the door, exit the rooms/path, and fall.
  • Solution used: the "is_trigger” variable was set to false after the user walks through the door.
  • Head tracked in scoring [FIXED]
  • When scores were calculated, the head rotation was measured as the pointing axis instead of the right-hand controller's rotation.
  • Solution used: the head rotation was replaced with the rotation of the right controller.
  • Note: the saved data needs to be reviewed to confirm it is the input/state that the PI would like to save.
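The suggested fix for the acknowledgement race above would be a coroutine or flag on the pin's XR Interactable in Unity; the Python below is only a language-neutral sketch of the guard logic. The class name, method names, and the 0.5 s delay are invented for illustration.

```python
import time

class AcknowledgePin:
    """Sketch of the suggested fix: the pin ignores the controller until
    (a) an activation delay has elapsed since it appeared, and
    (b) the trigger has been released at least once since then."""

    def __init__(self, activation_delay=0.5, now=time.monotonic):
        self._now = now                      # injectable clock for testing
        self._activation_delay = activation_delay
        self._shown_at = None
        self._released_since_shown = False

    def show(self):
        # Called when the pin spawns; the trigger may still be held down
        # from the previous step at this moment.
        self._shown_at = self._now()
        self._released_since_shown = False

    def on_trigger(self, pressed):
        # Track the first release after the pin appeared.
        if not pressed:
            self._released_since_shown = True

    def try_acknowledge(self, trigger_pressed):
        """Return True only for a deliberate, fresh trigger press."""
        if self._shown_at is None:
            return False
        if self._now() - self._shown_at < self._activation_delay:
            return False
        return self._released_since_shown and trigger_pressed
```

The release-since-shown flag alone already prevents the stale press from the previous step; the delay adds a margin against accidental double-clicks.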

First Time Setup + General Usage

How to Connect Quest 3 to PC
  1. Install the Meta Oculus application.
  2. Turn on the Quest 3 and log in to your account both on the headset and in the Oculus PC app.
  3. Connect the headset to the computer over Air Link (same network) or via cable. Note: make sure the USB-C to USB-C cable supports two-way data transfer.
  4. Open the Oculus Rift app through the quick settings menu on the headset.
  5. When the game is deployed, the headset should automatically open the VR game.
How to deploy
  1. Install the Unity 2021.3.4f1 editor.
  2. Clone the latest version of the "develop" branch.
  3. Open the project in the editor and open the "develop" scene.
  4. Connect the VR headset and click Play.

Challenges

As the title suggests

Future Plans

  • EML encourages testing the usability of the product with its specific end user (Alzheimer’s patients) to gather tangible evidence for accessible and dementia-specific needs.
  • Going forward, an important design challenge that needs to be addressed is ensuring that elderly adults and adults with memory deficits are comfortable and willing to undergo the experience.
  • Consider upgrading the graphics to a more realistic or perhaps a different style of graphical objects.
  • Complete researcher interactions to allow control of trials and/or the tutorial in real time and grant exclusive researcher access to specific menus (i.e. Pause menu or Step Select menu). This potentially involves creating a separate UI and implementing a form of remote control.  


Development  

  • The scoring:
    • The current scoring system (described in section D3) uses the custom EventData() struct from the previous version of the application.
    • Now, with a fully immersive VR experience, full continuous movement tracking has been suggested, primarily of the head and the right controller.
    • Re-purposing and adjusting the EventData() struct is suggested to support this, starting by evaluating exactly what needs to be saved into the JSON files.
  • Input mapping:
    • Create a custom action mapping for user inputs, as opposed to using the native XR Interaction Toolkit mapping, which is more complex than needed for BIOM.
  • Make the dynamic environment disappear mid-way through the path as the user is walking.
    • Suggestions:
      • This can be done by placing a trigger collider in the second or third cell of the path that disables the "DynamicEnviroment” object in the hierarchy.
      • It can also be driven by a timer instead of walking progression.
  • Filter through the assets and remove unused packages and assets.
  • Create an Android build that can be loaded directly onto the headset instead of running through the PC.
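The continuous-tracking suggestion above implies replacing the per-event EventData records with timestamped pose samples. The Python sketch below shows one possible JSON shape for such samples; the field names and the flat sample list are assumptions, not the project's actual EventData schema.

```python
import json

def make_sample(t, head_pos, head_rot, ctrl_pos, ctrl_rot):
    """One tracking sample: a timestamp plus head and right-controller pose.

    Positions are (x, y, z) tuples; rotations are quaternions (x, y, z, w),
    following Unity's Transform conventions."""
    return {
        "t": t,
        "head": {"pos": head_pos, "rot": head_rot},
        "right_controller": {"pos": ctrl_pos, "rot": ctrl_rot},
    }

def save_trial(path, trial_id, samples):
    """Write one trial's continuous samples to a JSON file."""
    with open(path, "w") as f:
        json.dump({"trial": trial_id, "samples": samples}, f)
```

Sampling at a fixed rate (e.g. every Unity FixedUpdate) rather than per event keeps the analysis side simple, at the cost of larger files; the PI's review of what actually needs saving (noted under Issues/Bugs) should drive that choice.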


UI/UX

  • Accessibility settings/trial notes in the starting menu are a potential future expansion of the current UI to cater to different user needs (e.g. large text, volume).
  • Usability testing should be conducted to assess how effectively instructions and game controls are delivered in the main and tutorial tasks, as well as design choices such as font size and panel size.
  • Many UI panels developed in Figma and in Unity still have to be implemented in the main/develop scene. This includes making the buttons XR-interactable and creating custom scripts for the specific actions the buttons perform.
  • The controller models should be replaced with device-specific ones, i.e. the Quest 3 controller models.
  • Head rotation: common feedback during demos highlighted the inability to turn using the controller; the current application only allows turning with head/headset rotation. Note that these users were standing, not seated in a rotating chair.


Tutorial

  • The solution was projected to include a fully functional tutorial level.  
  • Consider integrating premade maze routes with tutorial-specific UI.
  • Consider adding “looping” capabilities for the Step Select functions and practice rounds.
  • Researcher control over maze difficulty is currently limited to the game engine, before the trial begins. A separate graphical user interface would be a beneficial UX improvement for the researcher's ease of use.

Poster

As the title suggests

Team Members

Principal Investigator

Dr. Manu Madhav

Assistant Professor

School of Biomedical Engineering

University of British Columbia

Team

Mariane Olivan, Project Lead, UI/UX Designer (September 2023-April 2024)

Work Learn at the Emerging Media Lab at UBC

Undergraduate in Bachelor of Science in Cognitive Systems

University of British Columbia

Eric Leung, Developer (September 2023-Present)

Work Learn at the Emerging Media Lab at UBC

University of British Columbia

Mohsen Movahedi, Developer

Contractor at the Emerging Media Lab at UBC

University of British Columbia

FAQ

As the title suggests