Documentation:23-1004 VR Makerspace


Introduction

Overview

In collaboration with the UBC Master of Education in Literacy Education (LITR) and Master of Educational Technology (MET) programs, the Virtual Reality (VR) Makerspace project aims to provide a collaborative, open space for creativity. It is built with inclusive practices and accessibility in mind, and will be used initially by students in the LLED 559 (Digital Media Literacies) course in the Online Master of Education in Literacy Education program, with the potential to expand to K-12 classrooms. The core functionality of the project centres on a "cardboard challenge", in which students are tasked with building objects out of pieces of cardboard. The challenge is brought into a VR environment where students have access to a variety of materials as well as open collaboration spaces.

"Making has traditionally not focused on inclusivity and as such our problem is to re-examine maker challenges utilizing the Liberatory Design Thinking model that helps to create culturally responsive making content. This grassroots project needed to be supported by a group that would be responsive to cutting edge solutions with an inclusivity focus." - Dr. Melanie Wong and Dr. Keri Ewart, Principal Investigators

Application

Phase 1

As part of the weekly provocations, students in the LLED 559 (Digital Media Literacies) course in the Online Master of Education in Literacy Education program will have an opportunity to virtually tour and engage with the Virtual Reality (VR) Makerspace environment. The VR Makerspace will be used during asynchronous and synchronous (live Zoom sessions) discussions to provoke ongoing inquiry and support course assignments. It will also be shared with students in the UBC Master of Educational Technology (MET) program.

Cardboard Challenge

Through the practice of making and tinkering, traditional makerspaces seek to involve learners in meaningful real-world activities that cultivate innovation, critical thinking, and collaboration skills. The emergence of Virtual Reality (VR) Makerspaces extends this traditional model into immersive technology, facilitating virtual collaboration for users who are in different locations around the world. In addition to prioritizing equity, diversity, inclusion, decolonization, and anti-racism, VR Makerspaces also employ Liberatory Design Thinking Theory to encourage the participation of all learners, striving to establish a comprehensive educational framework in an ever-evolving world. Students will have the chance to use the VR Makerspace as part of their weekly provocations to foster creativity and collaboration, and to support course assignments. In its first phase, students will work on the Cardboard Challenge, where they will build items (e.g., a chair) by manipulating a variety of geometric shapes.  

The cardboard challenge exposes makers to the idea of possibilities. Cardboard is an everyday material that is often discarded; but what about re-imagining its use to create arcades and other objects that solve real-world problems? This inclusive maker challenge is inspired by Caine's Arcade (Link).

Primary Features

The key features of the VR Makerspace include:

  • Multiplayer ability
  • Object interaction
  • Object slicing
  • Open environment with inclusive design elements
  • UI Wireframes

Development Overview

The following technologies were used in the creation of the project:

  • Unreal Engine 5.3.1
  • Perforce
  • Online Subsystem EOS
  • EOS Voice Chat Plugin

Unreal Engine 5.3.1 was used to develop the VR Makerspace. Both C++ scripts and UE5 Blueprints implement the in-game functionality and the multiplayer component of the project. Currently, the slicing interaction is mapped to button A (IA_ButtonA) on the Oculus Touch controllers, and the spawning action is mapped to button B (IA_ButtonB). Teleport movement has been disabled and replaced with smooth locomotion, which prevents players from jumping around the space; player movement is mapped to the right thumbstick. Snap turn has also been disabled so it does not interfere with thumbstick movement.
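
These bindings are configured in Blueprints, but for reference, a minimal C++ sketch of the equivalent Enhanced Input wiring is shown below. The pawn class, handler functions, and the SliceAction/SpawnAction/MoveAction properties are hypothetical names; in practice they would be UInputAction* properties assigned to the IA_ButtonA, IA_ButtonB, and thumbstick input action assets.

    // Hypothetical C++ equivalent of the BP_VRPawn input bindings (the project uses Blueprints).
    #include "EnhancedInputComponent.h"
    #include "InputAction.h"

    void AMakerspaceVRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
    {
        Super::SetupPlayerInputComponent(PlayerInputComponent);

        if (UEnhancedInputComponent* Input = Cast<UEnhancedInputComponent>(PlayerInputComponent))
        {
            // Button A (IA_ButtonA): slice the targeted object.
            Input->BindAction(SliceAction, ETriggerEvent::Triggered, this, &AMakerspaceVRPawn::OnSlicePressed);

            // Button B (IA_ButtonB): spawn a new object.
            Input->BindAction(SpawnAction, ETriggerEvent::Triggered, this, &AMakerspaceVRPawn::OnSpawnPressed);

            // Right thumbstick: smooth locomotion (teleport and snap turn are disabled).
            Input->BindAction(MoveAction, ETriggerEvent::Triggered, this, &AMakerspaceVRPawn::OnMove);
        }
    }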

All relevant virtual reality interaction Blueprints can be found in BP_VRPawn, which includes the Blueprints for slicing procedural meshes and spawning actor objects. The relevant material interaction Blueprints are in BP_CutObject, which copies the geometric properties of the static mesh component onto a procedural mesh so that materials can be modified dynamically when users slice through an object. The UI for vertical and horizontal slicing has only been implemented in first person and can be found in BP_FirstPerson.
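
The copy-and-slice behaviour described above is built from Blueprint nodes backed by the engine's ProceduralMeshComponent plugin. A rough C++ sketch of the same two steps follows; variable names such as StaticMeshComponent, ProcMeshComponent, PlanePosition, PlaneNormal, and CapMaterial are placeholders, not names from the project.

    #include "ProceduralMeshComponent.h"
    #include "KismetProceduralMeshLibrary.h"

    // 1. Copy the static mesh geometry (LOD 0) onto a procedural mesh so it can be modified at runtime.
    UKismetProceduralMeshLibrary::CopyProceduralMeshFromStaticMeshComponent(
        StaticMeshComponent, /*LODIndex=*/0, ProcMeshComponent, /*bCreateCollision=*/true);
    StaticMeshComponent->SetVisibility(false);

    // 2. Slice the procedural mesh along a plane; the other half becomes a second
    //    procedural mesh so both pieces stay interactable.
    UProceduralMeshComponent* OtherHalf = nullptr;
    UKismetProceduralMeshLibrary::SliceProceduralMesh(
        ProcMeshComponent,
        PlanePosition,                 // where the cut passes through the object
        PlaneNormal,                   // vertical or horizontal, per the slicing UI
        /*bCreateOtherHalf=*/true,
        OtherHalf,
        EProcMeshSliceCapOption::CreateNewSectionForCap,
        CapMaterial);                  // material used to cap the newly exposed faces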

Multiplayer

Multiplayer is implemented peer-to-peer, based on a series of YouTube tutorials (J.1.1). A user can log in, connect to another user's session, and interact with other users. Epic Online Services (EOS) Voice Chat functionality is partially written, with only limited UI built for input and output audio device selection.
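
The device-selection UI sits on top of Unreal's IVoiceChat interface, which the EOS Voice Chat plugin implements. A minimal sketch of how the input/output device lists could be enumerated and applied is shown below; SelectedInputDeviceId and SelectedOutputDeviceId are placeholder values coming from the UI, not project variables.

    #include "VoiceChat.h"

    // Enumerate audio devices for the selection UI, then apply the user's choice.
    if (IVoiceChat* VoiceChat = IVoiceChat::Get())
    {
        for (const FVoiceChatDeviceInfo& Device : VoiceChat->GetAvailableInputDeviceInfos())
        {
            UE_LOG(LogTemp, Log, TEXT("Input device: %s"), *Device.DisplayName);
        }

        VoiceChat->SetInputDeviceId(SelectedInputDeviceId);
        VoiceChat->SetOutputDeviceId(SelectedOutputDeviceId);
    }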

Logging in and creating or joining a session is handled by the WBP_MainMenu Blueprint, which provides the Login, Create Session, and Join Session UI. A user must first log into their Epic Games account through the Epic login portal by pressing the "LOGIN WITH EOS" button. Once logged in, the user can create a new session with the "CREATE SESSION" button or join an existing session with the "JOIN SESSION" button; "JOIN SESSION" will also create a new session if one does not currently exist.
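
Under the hood, the "LOGIN WITH EOS" button calls into the Online Subsystem identity interface. A hedged C++ sketch of that call follows; the function and delegate handler names are illustrative, not the project's actual method names.

    #include "OnlineSubsystem.h"
    #include "Interfaces/OnlineIdentityInterface.h"

    void UEOS_GameInstance::LoginWithEOS()
    {
        IOnlineSubsystem* Subsystem = IOnlineSubsystem::Get(); // resolves to EOS when it is the default subsystem
        IOnlineIdentityPtr Identity = Subsystem->GetIdentityInterface();

        // "accountportal" opens the Epic Games login page so the user can sign in with their account.
        FOnlineAccountCredentials Credentials(TEXT("accountportal"), TEXT(""), TEXT(""));

        Identity->AddOnLoginCompleteDelegate_Handle(0,
            FOnLoginCompleteDelegate::CreateUObject(this, &UEOS_GameInstance::OnLoginComplete));
        Identity->Login(0, Credentials);
    }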

Current implementations make use of the EOS lobbies interface to provide multiplayer functionality; specifically, when creating the FOnlineSessionSettings object, the bUseLobbiesIfAvailable and bUseLobbiesVoiceChatIfAvailable fields are set to true. The game instance Blueprint, BP_GameInstance, inherits from the EOS_GameInstance class defined in the C++ scripts, which gives it access to the parent class's FString field, OpenLevelText. When a session is successfully created, the success callback calls ServerTravel to the map specified by OpenLevelText (currently pointing to the VRTemplateMap at /Game/VRTemplate/Maps/VRTemplateMap).
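
For reference, a sketch of what that session creation looks like in C++ is shown below. Aside from OpenLevelText and the two bUseLobbies* fields, the member and callback names are assumptions rather than confirmed project code.

    #include "OnlineSubsystem.h"
    #include "OnlineSessionSettings.h"
    #include "Interfaces/OnlineSessionInterface.h"

    void UEOS_GameInstance::CreateEOSSession()
    {
        IOnlineSessionPtr Sessions = IOnlineSubsystem::Get()->GetSessionInterface();

        FOnlineSessionSettings Settings;
        Settings.NumPublicConnections = 4;             // assumed player count
        Settings.bIsDedicated = false;                 // peer-to-peer: one player hosts as a listen server
        Settings.bShouldAdvertise = true;
        Settings.bUsesPresence = true;
        Settings.bUseLobbiesIfAvailable = true;        // route the session through the EOS lobbies interface
        Settings.bUseLobbiesVoiceChatIfAvailable = true;

        Sessions->AddOnCreateSessionCompleteDelegate_Handle(
            FOnCreateSessionCompleteDelegate::CreateUObject(this, &UEOS_GameInstance::OnCreateSessionComplete));
        Sessions->CreateSession(0, NAME_GameSession, Settings);
    }

    void UEOS_GameInstance::OnCreateSessionComplete(FName SessionName, bool bWasSuccessful)
    {
        if (bWasSuccessful)
        {
            // OpenLevelText points at /Game/VRTemplate/Maps/VRTemplateMap; "?listen" keeps the host as the server.
            GetWorld()->ServerTravel(OpenLevelText + TEXT("?listen"));
        }
    }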

Design Overview

The following tools were used in the design of the project:

  • Blender 4.0
  • Twinmotion 2023.1.2
  • Revit 2023
  • Unreal Engine 5.3.1
  • Figma
  • Miro

Environment

VR Makerspace Environment

Rather than adhering to the traditional classroom setup of solid walls and rigid structures, we designed a fresh environment tailored to enhance students' learning and inspire creativity. The VR Makerspace consists of a central room complemented by several smaller pods serving as "breakout rooms". Users first convene in the main space for instructions and large-group brainstorming before dispersing into the smaller pods for small-group collaboration on their assignment, such as the cardboard challenge. With semi-open facades and wood slats, the space is designed to facilitate visual interaction, enabling individuals to observe and draw inspiration from their peers while working on their own projects.

Avatar/Assets

With the intention of creating an inclusive space, we implemented a non-human avatar design in which users can change certain visual features of their personal representation in the VR space.

First Time Setup Guide

  • Provide enough information for a first time user to get up and running
  • Known Issues affecting functionality or stability of the program

Poster

VR Makerspace Project Poster

Team

Principal Investigators

Dr. Melanie Wong

Assistant Professor of Teaching

Program Advisor for the Online MEd in Literacy Education

University of British Columbia

Dr. Keri Ewart

Coordinator, EDI & Community Outreach

MET Practicum Instructor

MET Lecturer

University of British Columbia

Current Team

Sophia Yang, Project Lead and Software Developer (Sept 2023 - Present)

Work Learn at the Emerging Media Lab

University of British Columbia

Sinnie Choi, UI/UX Designer (Sept 2023 - Present)

Work Learn at the Emerging Media Lab

University of British Columbia

Walker Rout, Software Developer (Jan 2023 - Present)

Work Learn at the Emerging Media Lab

University of British Columbia

Previous Contributors

Karan Anand, Software Developer (Sept 2023 - Dec 2023)

License

MIT License

Copyright (c) 2023 University of British Columbia

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
