Documentation:Jupyter3D


Emerging Media Lab Project: Jupyter 3D Documentation

Note

Jupyter 3D is an ongoing project at the Emerging Media Lab. This page will be updated as the project progresses. If you have any questions, please contact us at emergingmedia.lab@ubc.ca or visit the EML website.

Introduction

Immersive Virtual Reality (VR) can transform the way educational content is delivered: it can engage users in a completely virtual world and aid in teaching students three-dimensional concepts that are currently taught in 2D. Inspired by the many VR applications in the classroom, Jupyter3D aims to build students’ understanding of and intuition for physics without the expense of building and operating a physics lab. The project creates an educational VR experience in which students can visualize and walk around a three-dimensional static wave surface, a “snapshot” (or series of snapshots) of a physically realistic, dynamic wave. Students demonstrate their knowledge by placing a virtual “flagpole” at a spot on the wave surface, showing that they have found an area of interest in the simulation. The project is built in Unity3D and uses an Oculus Rift headset and controllers to navigate the VR environment.

FAQ

  1. Why was the project initiated? The project addresses a pedagogical issue in teaching physics and math: helping students develop an intuition for mathematical models and computer simulations of real physical phenomena. It is too expensive, complicated, or dangerous to show students many real physical waves (for example, the electromagnetic waves from a radio), and 2D animations on a computer screen are not sufficiently informative or useful to a student. We want a 3D VR experience that replicates the real physics in a simulated environment. We chose waves as a demonstration topic, although other physics simulations could be considered down the road.
  2. What was the educational concern? A room full of students, or a student at home on their computer, does not have access to a real physics lab and cannot easily build an intuition for the math and physics behind complex, real phenomena. A VR simulation that is accessible to a class, or to a single user on a computer, can give an experience similar to a real physics lab. It can also go beyond one, allowing students to investigate real phenomena that cannot be seen in real life (e.g. the electromagnetic waves from a cell phone or radar are invisible to human senses).
  3. Why use VR to solve this problem? VR can create an accurate simulation of real phenomena that helps build students’ understanding of and intuition for physics without the expense of building and operating a physics lab.

How to Use Jupyter3D [Version 1.0] (August 8, 2019)

  1. Where to try:
    • Come to EML drop-ins!
    • Email emergingmedia.lab@ubc.ca
  2. Tools/Technologies Used:
    • Unity 2017.4.26f
    • Oculus Rift S
    • HTC Vive
    • C#
  3. How to use
    • Click the project’s .exe file and wait for it to load
    • Controller buttons and functions
      • Right Controller:
        • Grip: Hold to plant flag
        • Joystick: Hold to reset and remove flags (will remove all flags)
        • Trigger: Hold to move the flag around (and rotate)
        • Trigger + Joystick: Move up and down to move the wave forward and back.
      • Left Controller:
        • Grip: Hold to divide the wavelength by a factor of two.
        • Joystick: Press to change wave type; move up and down to adjust amplitude
        • Trigger: Press to play and pause the scene.

Using it in the classroom

  • Large class:
    • The professor can borrow an Oculus Rift S or HTC Vive from UBC’s EML. EML personnel can help with setup in the classroom
    • The professor wears the headset and projects what they see in VR on a screen for students to see
  • Small class:
    • Tutorial style: email emergingmedia.lab@ubc.ca to book EML space and headsets. 3–4 students can use Jupyter3D at one time.

Primary Features

The user can:

  1. Plant a flag on areas of interest on the wave (e.g. a point of maximum amplitude)
  2. Change wave amplitude and/or wavelength
  3. See wave as static or dynamic (pause/play function)
  4. See up to 6 different types of waves, including a ripple wave and a Gaussian pulse.
  5. Move position of wave
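To make the amplitude and wavelength controls concrete, here is a minimal sketch of the kind of height functions such a simulation can evaluate at each surface point. This is an illustrative Python sketch, not the project's actual Unity/C# source: the function names, parameters, and the travelling-wave form are assumptions for demonstration.

```python
import math

def wave_height(x, z, t, amplitude=1.0, wavelength=2.0, speed=1.0):
    """Height of a plane sine wave travelling along x at time t."""
    k = 2 * math.pi / wavelength          # wave number: halving the wavelength doubles k
    return amplitude * math.sin(k * (x - speed * t))

def ripple_height(x, z, t, amplitude=1.0, wavelength=2.0, speed=1.0):
    """Radial "ripple" wave centred at the origin of the xz-plane."""
    r = math.hypot(x, z)                  # distance from the centre
    k = 2 * math.pi / wavelength
    return amplitude * math.sin(k * (r - speed * t))
```

Under this sketch, the left-controller joystick would scale `amplitude`, the left grip would halve `wavelength`, and the play/pause toggle would freeze or advance `t`.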

Lessons learned

The Jupyter 3D project has evolved over the past few terms.

  1. We began with a wave simulation made by generating balls inside a square, with each ball located at a different height. We added a shader to enhance the visualization, since it was hard to differentiate the balls, especially at different depths and heights. However, we quickly realized the disadvantages of this way of generating a wave: for instance, a peak is not precise, and the area of interest is limited by the balls’ boundaries. We were also limited in the types of functions that could be displayed. Our first improvement was to use a properly scaled mesh grid to create the area for the wave simulation, which helped with the precision issue previously encountered.
  2. We then decided to create our own customized mesh generator in order to have more flexibility and control, which really helped in displaying different wave types. Each triangle in the mesh is composed of three vertices, and two triangles are needed to create a quadrilateral (quad). This approach enabled us to create meshes with as many quads in the xz-plane as needed to represent the wave; the y component of each vertex represents the wave height.
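The grid-mesh idea above can be sketched as follows. This is a hypothetical Python illustration of the bookkeeping involved (the project itself generates the mesh in C# inside Unity): vertices are laid out on an (nx+1) × (nz+1) grid in the xz-plane with y given by a height function, and each quad is split into two triangles stored as a flat index list, which is the layout Unity's Mesh API expects.

```python
def generate_wave_mesh(nx, nz, dx=1.0, height=lambda x, z: 0.0):
    """Build a grid mesh: a list of (x, y, z) vertices with y = height(x, z),
    and a flat list of triangle indices, two triangles per quad."""
    vertices = [(i * dx, height(i * dx, j * dx), j * dx)
                for j in range(nz + 1) for i in range(nx + 1)]
    triangles = []
    for j in range(nz):
        for i in range(nx):
            a = j * (nx + 1) + i              # corner of the quad at (i, j)
            b, c, d = a + 1, a + nx + 1, a + nx + 2
            triangles += [a, c, b,  b, c, d]  # two triangles covering the quad
    return vertices, triangles
```

A 2 × 2 grid of quads, for example, yields 9 vertices and 8 triangles (24 indices). To animate the wave, the height function is re-evaluated and the y components updated each step.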

Plugins/assets/prebuilds developed

  • Development Platform: Unity
  • Virtual Reality Headset: HTC Vive and Oculus Rift S
  • All assets used were free assets obtained through Unity.

Future Directions

Future student teams should work with the co-Principal Investigators on:

  1. Improving the user interface of Jupyter 3D
  2. Including more types of waves
  3. Potentially creating a pipeline from the Jupyter Notebook application to Unity, so that users who understand Jupyter Notebook and Python can generate 3D VR waves without needing to understand Unity. This is a complex and time-intensive task.

White Paper

N/A

Poster

To see the poster, please visit:

UBC Emerging Media Lab, Irving K. Barber Room 183, 1961 East Mall

Vancouver, BC Canada V6T 1Z1

Team

Co-Principal Investigators:

  • Dr. Matthew Yedlin, Electrical and Computer Engineering Department, Faculty of Applied Science, University of British Columbia
  • Dr. Michael Lamoreaux, Mathematics and Statistics, University of Calgary

Current Student Team:

  • Kyle Mas, Project Technical lead (September 2019 - present)

Previous Student Team:

  • Sabrina Ge, Project Coordinator (January 2019 - August 2019)
  • Rayhan Fakim, Project Technical Lead (May 2019 - August 2019)
  • Abel Waller, Project Technical Lead (January 2019 - April 2019)
  • Daanyaal Sobani, Project Lead (September 2018 - December 2018)
  • Patrick Kong, Developer (January 2019 - August 2019)
  • Harvey Huang, Developer (May 2019 - August 2019)
  • Librason Chen, Developer (January 2019 - August 2019)
  • Amelia He, Developer and Researcher (May 2019 - August 2019)
  • Julia Zhu, Developer (January 2019 - April 2019)
  • Kyle Mas, Developer (May 2019 - August 2019)
  • Musa Mohannad, Developer (January 2019 - April 2019)

License

Some rights reserved
Permission is granted to copy, distribute and/or modify this document according to the terms in Creative Commons License, Attribution-ShareAlike 4.0. The full text of this license may be found here: CC by-sa 4.0