Documentation:24-1001 Enhanced Auditory Simulation Improved Mapping
Introduction
Despite advancements in hearing technology, individuals with hearing loss often struggle with sound localization in day-to-day life. Research suggests, however, that the brain can adapt to localize sound using only one ear, underscoring the need for accessible training programs in both educational and clinical settings.
Enhanced Auditory Simulation Improved Mapping (EARSIM) is a VR prototype of such a training program, using procedural content generation and Head-Related Transfer Function (HRTF) sound design to train users' sound localization abilities. The game increases in difficulty by introducing environmental distractions and altering sound clarity. The current prototype is developed in Unreal Engine 5.4.4 and is compatible with Meta Quest 2 and 3, with optional headphone support for improved audio accuracy.
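One of the binaural cues an HRTF encodes is the interaural time difference (ITD): sound from the side reaches the far ear slightly later than the near ear. As an engine-agnostic illustration (not code from the EARSIM project), Woodworth's classic spherical-head approximation estimates this delay from the source azimuth; the head radius and speed of sound below are standard textbook values:

```cpp
#include <cmath>

// Woodworth's spherical-head approximation of interaural time
// difference (ITD) -- one of the directional cues an HRTF encodes.
// theta: source azimuth in radians (0 = straight ahead),
// valid for 0 <= theta <= pi/2.
double itd_seconds(double theta) {
    const double head_radius_m  = 0.0875;  // average adult head radius
    const double speed_of_sound = 343.0;   // m/s in air at 20 C
    return (head_radius_m / speed_of_sound) * (std::sin(theta) + theta);
}
```

For a source directly to the side (90 degrees), this yields roughly 0.66 ms, the commonly cited maximum ITD for an adult listener.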
Primary Features
- Realistic environments and animal avatars to enhance user experience and immersion
- Adaptive gameplay mechanics
- Performance tracking and immediate feedback to show user progression and gamify the experience
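The adaptive mechanics above could follow a simple staircase rule: raise the difficulty after a streak of accurate localizations and lower it after repeated misses. The sketch below is a hypothetical illustration of that pattern; the thresholds and level range are not EARSIM's actual tuning:

```cpp
#include <algorithm>

// Hypothetical adaptive-difficulty rule: three consecutive hits
// raise the level, three consecutive misses lower it. The 1..5
// level range and streak length of 3 are illustrative values.
struct DifficultyTracker {
    int level  = 1;  // 1 = easiest .. 5 = hardest
    int streak = 0;  // consecutive hits (positive) or misses (negative)

    void recordTrial(bool hit) {
        streak = hit ? std::max(streak, 0) + 1
                     : std::min(streak, 0) - 1;
        if (streak >= 3)  { level = std::min(level + 1, 5); streak = 0; }
        if (streak <= -3) { level = std::max(level - 1, 1); streak = 0; }
    }
};
```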
Process
Users interact by pointing and selecting the location of animal sound sources, receiving real-time feedback and scores based on accuracy. The experience features adjustable difficulty settings, a scoring system, and a progress panel that includes a timer countdown, target reminders, and performance tracking.
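Accuracy-based scoring of this kind typically reduces to the angle between the direction the user points and the true direction of the sound source. The sketch below shows one way to do that with a dot product; the 45-degree full-miss tolerance and the linear score mapping are hypothetical, not EARSIM's actual scoring curve:

```cpp
#include <algorithm>
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;

// Angle in degrees between two direction vectors.
double angularErrorDeg(const Vec3& a, const Vec3& b) {
    double dot = a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
    double la  = std::sqrt(a[0]*a[0] + a[1]*a[1] + a[2]*a[2]);
    double lb  = std::sqrt(b[0]*b[0] + b[1]*b[1] + b[2]*b[2]);
    double c   = std::clamp(dot / (la * lb), -1.0, 1.0);
    return std::acos(c) * 180.0 / 3.141592653589793;
}

// Map angular error linearly onto a 0-100 score.
// 45 degrees is a hypothetical threshold at which the score hits 0.
int accuracyScore(const Vec3& pointed, const Vec3& actual) {
    const double toleranceDeg = 45.0;
    double err = angularErrorDeg(pointed, actual);
    double s   = 100.0 * (1.0 - std::min(err, toleranceDeg) / toleranceDeg);
    return static_cast<int>(std::lround(s));
}
```

A perfect point scores 100, and anything at or beyond the tolerance angle scores 0.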
Design
The environment is built on a Procedural Content Generation (PCG) framework, enabling comprehensive customization to replicate real-world settings. Key environmental parameters, including vegetation density, weather conditions, and time of day, can be adjusted dynamically. The current prototype features three distinct environments, each offering a unique experience.
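A parameter-driven PCG setup like this usually exposes its tunables as a single validated block that the generator consumes. The struct and field names below are illustrative and may not match the actual inputs of the project's PCG graph:

```cpp
#include <algorithm>
#include <cmath>
#include <string>

// Hypothetical parameter block for one procedurally generated
// environment; names are illustrative, not the project's actual
// PCG graph inputs.
struct EnvironmentParams {
    std::string biome;         // e.g. "forest", "meadow", "wetland"
    double vegetationDensity;  // 0.0 (bare) .. 1.0 (dense)
    double timeOfDayHours;     // 0.0 .. 24.0, wraps around the clock
    int weatherPreset;         // index into the modular weather system
};

// Clamp user-supplied values into the ranges the generator expects.
EnvironmentParams sanitize(EnvironmentParams p) {
    p.vegetationDensity = std::clamp(p.vegetationDensity, 0.0, 1.0);
    p.timeOfDayHours =
        std::fmod(std::fmod(p.timeOfDayHours, 24.0) + 24.0, 24.0);
    p.weatherPreset = std::clamp(p.weatherPreset, 0, 3);  // 4 hypothetical presets
    return p;
}
```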
Development
EARSIM is built using Unreal Engine 5.4.4 with Meta Quest integration and is exported as a Windows PCVR package. The application uses procedural content generation (PCG) for dynamic environment creation, combined with a modular weather system, and spatializes sound with the Resonance Audio plugin. Unreal Engine's material graph system was used to create the game's textures, and behavior trees and animation blueprints drive animal behavior and movement. Game-progression logic is written in a blend of Blueprint and C++; all other functionality is implemented in Blueprint.
User Research
Usability testing was conducted to evaluate navigation, task completion, and interface clarity with participants of varying levels of VR experience and hearing ability. Overall, usability was rated positively, with 7 out of 12 participants agreeing that navigation was straightforward and intuitive.
Sound localization was identified as a key area for improvement, with some users struggling with unclear audio cues. Participants recommended enhancing sound cue clarity and refining the tutorial to provide clearer, step-by-step instructions, especially for users with no prior VR experience.
Next Steps
Areas of exploration and improvement include:
- Implementation of profile system and user-based data saving
- Introduction of more animal species for diverse sound cues
- Improvement of accessibility options (color contrast settings, UI readability, etc.)
Code
The project requires Unreal Engine 5.4.4 to build.
Third-party assets that must be added to the project:
- Resonance Audio spatialization plugin: provides HRTF-based binaural audio rendering, needed so sound sources carry the directional cues the localization training relies on.
- Meta Quest integration (Meta XR) plugin: provides headset and controller support, needed for deployment to Meta Quest 2 and 3.
Art Assets
Artwork developed for the project includes three procedurally generated environments (see Design), animal avatars with accompanying animation blueprints, and custom textures authored in Unreal Engine's material graph system.
Libraries
Third-party assets and plugins used in development:
- Resonance Audio: sound spatialization
- Meta Quest integration (Meta XR): headset support
- Unreal Engine's built-in PCG framework: procedural environment generation
Setup Guide
To try out the project locally, please refer to the attached PDF document: EARSIM Set-Up Guide.pdf
This project requires a Meta Quest headset and a PC with a VR-capable graphics card to run. Don't worry if you don't have one — you can book a demo to try out the project at EML's demo space: https://eml.ubc.ca/visit-eml/
Poster
Development Team
Principal Investigators
Dr. Douglas Sladen
Associate Professor
UBC Faculty of Medicine
University of British Columbia
Dr. Valter Ciocca
Professor
UBC Faculty of Medicine
University of British Columbia
Dr. Joanne Whitehead
Bioinformatician
BC Children's Hospital
Subject Matter Expert
Eli Hason
Sound Designer
Wabi Sabi Sound
Student Team
Sinnie Choi, Project Lead, UI/UX Designer (September 2024 - April 2025)
Work Learn at the Emerging Media Lab at UBC
Undergraduate in School of Architecture and Landscape Architecture
University of British Columbia
Eric Tang, Software Developer (September 2024 - April 2025)
Work Learn at the Emerging Media Lab at UBC
Undergraduate in Bachelor of Applied Sciences in Computer Engineering
University of British Columbia
James Edralin, Software Developer (September 2024 - April 2025)
Coop Software Developer at the Emerging Media Lab at UBC
Undergraduate in Bachelor of Science in Computer Science
University of British Columbia
Mark Hamon, Software Developer (January - April 2025)
Work Learn at the Emerging Media Lab at UBC
Undergraduate in Bachelor of Science in Physics
University of British Columbia
Julien Roy, Project Lead and Software Developer (September - December 2024)
Work Learn at the Emerging Media Lab at UBC
Undergraduate in Bachelor of Science in Computer Science
University of British Columbia
License
MIT License
Copyright (c) 2023 University of British Columbia
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
Last edit: May 18, 2023 by Daniel Lindenberger