25-3003 LoveLoop: Immersive Fertility Training Tool


Introduction

Background

With global estimates suggesting that 45–65% of pregnancies are unplanned (Mathers 2022), there is an urgent need for effective education in alternatives to mainstream forms of contraception, such as natural family planning methods. Unfortunately, access to Symptothermal Method courses is limited, particularly for those who are most vulnerable to unplanned pregnancies. We aim to develop and test an innovative, immersive Fertility Awareness Training program centered on the Symptothermal Method (STM).

By utilizing advanced technologies, this project seeks to provide an essential service, supporting sexually active people with additional knowledge and tools to make informed reproductive choices. The project aims to support the education of individuals and couples, and the end result should offer an engaging, user-friendly platform for transferring knowledge about STM. Further, the approach should address gaps in access to fertility awareness education for LGBTQ2+ folks; it should be religion-neutral, judgement-free, and highly applicable to populations at higher risk of unintended pregnancies.

Challenge Overview

LoveLoop provides a unique immersive learning experience about the Symptothermal Method through an interactive day-in-the-life routine of a couple practicing STM. Each of the three STM indicators (basal body temperature (BBT), cervical mucus, and cervical position) is featured as a core interaction opportunity, allowing users to see the actual changes in the body over the menstrual cycle. In the hopes of empowering a wide range of audiences with reproductive knowledge and power, the program aims to increase accessibility by being culturally inclusive, narrative-based, visually appealing, and user-friendly.

The current prototype is a desktop-based application. Future avenues of program delivery could include VR or a mobile interface. A tutorial system has been implemented to familiarize users with desktop navigation and controls before entering the game.

Primary Features

No. | Task | Priority | Status
F1 | Realistic bedroom, washroom, and clinic scenes for gameplay navigation | Must have | Complete
F2 | Uterus 3D model paired with a radial slider control for visualizing cervical position changes over the ovulation cycle | Must have | Complete
F3 | Thermometer model with a radial slider control for visualizing basal body temperature changes over the ovulation cycle | Must have | Complete
F4 | Cervical mucus model with a radial slider control for visualizing mucus stretchiness over the ovulation cycle | Must have | Complete
F5 | In-game notebook that allows the user to take notes from in-game interactions | Must have | Complete
F6 | Object inventory system with adjustable item information settings for in-game object storage | Nice to have | Complete
F7 | Tutorial system that familiarizes users with game controls, interactions, the inspection system, navigation, etc. | Must have | Complete
F8 | Object inspection system for in-game object examination (rotation, view control, etc.) | Nice to have | Complete
F9 | 2D user interfaces for game and inspection panels | Must have | Complete
F10 | First-person animations for picking up objects, walking, and jumping | Nice to have | Complete
F11 | STM metrics generator for randomizing BBT, cervical position, and mucus viscosity based on the day | Should have | Incomplete
F12 | STM day selector for the user to experience changes across multiple day settings (Day Bar UMG completed) | Should have | Partially complete
F13 | Clinic scene animation showing a conversation between the first-person character and NPCs | Should have | Partially complete
F14 | Interacting with the uterus, adding anatomical labels | Nice to have | Incomplete

Design

The team used user personas and a Miro board to map the game flow and user experience. Working from the design inspiration provided, the team designed high-fidelity scenes and MetaHuman characters and implemented gameplay systems in Unreal Engine 5.5.4, created realistic assets and textures in Blender, and built UI panels in Figma.

Here are more details regarding each component of design:

Assets

Figma Design

A variety of 3D assets were sourced from Fab to create the living room, bedroom, bathroom, and clinic scenes. A full asset list complete with licensing and links can be found in the appendix.

Female Reproductive Organs-X Section 3D Asset

This asset was sourced from user CVallance on Sketchfab. It is under a Creative Commons Attribution license and is therefore free to adapt and share. The model was imported into Blender and served as the base for the three positional variations in the cervix check.

Materials and Textures for Multiple 3D Assets

Multiple assets in the 3D environment were modelled inside Unreal Engine 5.5. Images of botanical art and South Asian patterns and motifs found through Google searches were imported into Unreal Engine to create materials and textures for various models.

UX/UI

UI Panels

Considering the story-based structure of this program, we saw a need to guide the user through the game with instructional prompts. These 2D UI panels were all created in Figma and adhere to the style guide.

  • Start: Game entrance, background information brief, and an option to enter the tutorial
  • Tutorial: Contains visual instructions for desktop game navigation - walk, turn, jump, object interaction, inventory actions, etc., as well as instructions on completing the tutorial
  • Clinic: Dialogue boxes for the conversation that takes place in the doctor’s clinic. We opted to use instructional panels for the actual educational STM content so the user can read at their own pace.
  • BBT/Mucus/Cervix: Instructions to read the doctor’s note, acquire relevant items, and record data to the STM tracker.
    • Dynamic panels for each of the stages within the menstrual cycle containing relevant information and statistics. The radial slider bar, day number, and core asset were implemented in UE.
  • End: End page that appears when all in-game objectives have been completed. Users have the option to exit the game or keep exploring.

Project Branding

Project branding consists of the project name, branding concepts, and the project logo, all developed after our research on the project and several conversations with the PI.

  • Project name: Considering the project background and overview and our initial conversations with the PI, we came up with several project name options and collectively voted on them, with LoveLoop, Ovulink, and FlowQuest as the top three choices. LoveLoop was chosen as it highlights intimacy, rhythm, and continuous learning, while also suggesting the gamified elements of the project. It also appeals emotionally while keeping things light and memorable.
  • Branding Ideas: We experimented with various ideas for colours, typography and logos to create a cohesive and aesthetically pleasing project.
  • Logo: We designed the logo with the main characteristics of the project in mind – openness, intimacy, and invitation – along with a nod to the integration of emerging technology.

MetaHumans

Custom MetaHumans were created using Unreal Engine 5.5.4, keeping diversity and inclusion in mind. We created three MetaHuman characters/avatars: Dev, the male partner; Priya, the female partner; and the physician. After creating these avatars, we imported them into our project and used various animation workflows to animate them. We used some preset body animation rigs from platforms like Mixamo; other rigging was done manually inside Unreal Engine using MetaHuman and mannequin control rigs with keyframes in level sequencers. To create face animations with lip-sync and head movement, we used Unreal Engine's Live Link Face app. We also used tools like ElevenLabs to create audio dialogue from Indian male voice samples for Dev, the male avatar, in the clinic scene animation.

Customized Metahumans

To create a customized MetaHuman in Unreal Engine 5.5, we used the MetaHuman Creator (a cloud-based tool) to design our characters. They were then imported into Unreal with Quixel Bridge.

Metahuman Body Animations

To create an animation for picking up the thermometer, we used a manual rigging workflow with IK and FK methods for global control. The animation was then manually keyframed and refined in the level sequencer. After finishing the body animation, we baked it and exported it to be retargeted and reused.

For some simple actions like sitting on a chair and talking, we used rigs from Mixamo. These rigs were imported and modified inside the level sequencer by baking the rig to the default mannequin.

Metahuman Face Animations

Facial animations were created with Unreal Engine's Live Link Face app, which can be installed on a mobile device to record videos. We imported the video recordings into a Capture Source using the MetaHuman Animator option. MetaHuman Identities were then created and used for face calibration and teeth alignment. After creating a new MetaHuman Performance instance, the previously recorded video is imported and processed to create a lip-sync animation. This animation is then exported and retargeted to the appropriate MetaHuman character within the level sequencer.

For audio, we added the audio track into the level sequencer and aligned it with the lip-sync animation.

Morph Targets - Cervix Mesh Deformation

The cervical position check involves inserting a finger up into the vaginal canal to evaluate the depth and feel of the cervix, which varies throughout the menstrual cycle. To illustrate these changes in an original way, we implemented a 3D model of the uterus; the skeletal mesh smoothly transitions between states depending on the input menstrual cycle day.

There are two variables this model displays: cervix position (low to high) and state (closed to open). Though there are four stages within the menstrual cycle, only three variations were needed.

  • Menstruation: low and slightly open
  • Infertile: low and closed
  • Fertile: high and open

The base model was imported into Blender and manipulated using a mix of modelling and sculpting to create the three variations. As the model was densely packed with vertices, we took great care not to warp the topology, which would affect the texture/material mapping.

Morph targets were used to smoothly interpolate between shapes. Meshes must have the same vertex count, vertex order, and number of subdivisions, and be perfectly overlapped, before being linked. Using the menstruation mesh as the base and the infertile/fertile meshes as the targets, all three were joined as shapes under Data properties.

Once joined, mesh transition visualization is possible by toggling the value bar next to the target meshes.

Morph targets are most often used between two meshes, so there was a lack of documentation on implementing multiple targets. Additionally, we needed our meshes to morph in a certain order. We experimented with absolute shape keys, which play in sequence like keyframes on a timeline and were ideal for creating animations within Blender, but they are not natively supported in Unreal Engine as morph target sequences. Thus, we kept the shape keys relative and handled all the logic within Unreal (see section D.4.C – Morph Targets for more information). Before exporting, an armature skeleton was parented to the base mesh with automatic weights to ensure Unreal compatibility.

A thorough documentation file was created for this process and can be found in the Documentation folder in this project's Microsoft Teams channel. Note that the final .blend file has a keyframed animation showing transitions between states, as well as corresponding text about the current menstrual phase.

Figma File Structure

All design elements can be found on the LoveLoop Figma file.

  • Game UI: Hosts all in-game and tutorial user interaction panels, organized by order of appearance. Includes both static instructional panels and wireframes for the dynamic temperature, mucus, and cervical position UIs. Other miscellaneous graphics, such as the doctor's note, STM tracker sheet, object interaction widget, and anatomical labels, are stored on this page.
  • Moodboard/Styleguide: The style guide includes the problem statement, project background and overview, typography, color palettes, imagery, and UI style inspiration. The design moodboard was made to help with the creation of the 3D scenes, which were styled with South Asian motifs at the behest of the PI team.
  • Project Branding: Project name concepts and the LoveLoop branding and logo.
  • User Personas: Biographies and objectives of fictional end users to guide user experience and accessibility.

Development

Folder Structure

Overview of directories throughout development:

Directory | Description
/Content/Animations/ | Contains all the top-level background animations for the game.
/Content/Asset/item_images/ | Stores images for interactable objects in the scene; these images are dynamically added to item descriptions in the inventory system.
/Content/Asset/UI/ | Stores all the UI panels in the game.
/Content/Asset/user_prompts/ | Stores all the control prompts used for user navigation.
/Content/Asset/uterus/ | Stores the uterus model asset.
/Content/Asset/White/ | Stores an extensive library of additional user prompts for future user controls.
/Content/Blueprints/actors/ | Stores the main actor blueprints in the game.
/Content/Blueprints/GameInstance/ | Stores all game instance blueprints in the game.
/Content/Blueprints/Interfaces/ | Stores all the interfaces used in the game.
/Content/Blueprints/UserInterfaces/ | Stores the major UI widgets in the game.
/Content/Characters/ | Stores the first-person blueprints, game mode, and controllers.
/Content/Fab/ | Stores all the assets pulled directly from the Unreal Fab store.
/Content/FabricMat/ | Stores all the fabric-dedicated materials in the bathroom scene.
/Content/fonts/ | Stores all the unique fonts used in the game for future development.
/Content/Inventory/ | Contains the content-based structure related to the inventory system.
/Content/Inventory/Inspection/ | Contains all logical components related to the inspection system (preview manager blueprints for generating render targets, the render target, the user widget, and the render target material).
/Content/Level Sequences | Contains all the level sequence animations used in the game.
/Content/Maps/ | Contains all the maps used in the game.
/Content/Materials/ | Contains miscellaneous materials for other types of objects in the scene.
/Content/Meshes/ | Contains the main interactable object meshes in the game.
/Content/metahuman_animator/ | Contains face animation sequences and dialogue used in the doctor clinic animation.
/Content/MetaHumans/ | Stores all the MetaHuman characters and their respective blueprints.
/Content/Baked_Transition/ | Stores all the baked animation sequences for transitions used in the doctor clinic animation.
/Content/Splash/ | Stores the game icon.
/Content/ThirdPerson/ | Stores the third-person blueprint implementation, input actions, and related maps.
/Content/Tutorial/ | Stores all the tutorial blueprint implementations, blueprint components, related user interfaces, and UI images.

Top Level Overview

This section introduces the top-level development pipelines including game levels, player controls, game modes, instances, etc.

Project Overview

This project contains the following project-specific settings. All of the following blueprint classes can be found under Content/Characters:

  • Game Mode: BP_MyGameMode
  • Default Pawn Class: BP_first_person_character
  • Player Controller Class: BP_MyPlayerController

Level Overview

LoveLoop contains three major levels, which can be found under Content/Maps folder.

  1. Tutorial Map (Content/Maps/tutorial_map): provides a simple and immersive environment for the user to learn the controls required for the game.
  2. Main Map (Content/Maps/Bathroom): the starting and primary map where all the interactions and gameplay take place.
  3. Story Map (Content/Maps/Doctor_Clinic): the map where gameplay animations take place to convey background stories to the users for a more engaging experience.

The current project does not contain extensive top-level logic; more details around user interactions and features will be introduced in later sections.

Input Actions

LoveLoop contains many interactions that require users to perform keyboard inputs. Each input is associated with an action name, and these input actions are defined under Content/ThirdPerson/Input/Actions. Here is an overview of the input actions that were defined in the game along with a short description.  

Input Action Name | Key | Description
IA_Jump | Space Bar | Jump in the scene (default game)
IA_Move | WASD / Arrow Keys | Move in the scene (default game)
IA_Look | Mouse | User camera movement (default game)
IA_Run | Left Shift (hold) | Run
IA_Crouch | C | Crouch
IA_Interact | F | Interact/inspect
IA_PickUp | E | Store an object in the inventory
IA_Inventory | Tab | Open/close the inventory
IA_LeftMouseClick | Left Mouse Button | Detect mouse clicks (testing)

Player

LoveLoop is centered on a first-person player, whose in-game presence is implemented in BP_first_person_character. The first-person character is built on top of Unreal Engine's third-person template with default movement and camera settings.

Implementation

The first-person character contains a collision by default: the capsule surrounding the character, which the environment uses to detect overlaps between the player and interactable actors in the scene. Around this, the first-person character blueprint implements all the input action triggers; further technical details are introduced in later sections.

Gameplay

This section provides a walkthrough of the overall game flow, from start to finish. It also introduces the key level maps and explains how they are interconnected through user interfaces.

Main Map

The Main Map serves as the primary gameplay environment where the core STM learning experience takes place. The player assumes the role of the male character in the narrative and engages with various objects throughout the scene to gain an understanding of STM. The map features MetaHuman characters, interactable and inspectable objects, and a high-fidelity environment designed to support immersive learning.

Tutorial Map

The Tutorial Map is intended to help first-time users of the desktop application familiarize themselves with the keyboard controls. Set within a maze-like environment, it guides users through a sequence of interactive tasks with the aid of instructional user interfaces. This map prepares players for effective navigation and interaction in the main LoveLoop experience.

Doctor’s Clinic

This map offers narrative context and an introductory overview of STM through an immersive, story-driven scene. Set in a doctor’s clinic, the environment includes Metahuman characters and animations built around their interactions. Users can freely explore the space and choose when to initiate gameplay in the Main Map.

Game Flow

The game begins in the Main Map, where users are presented with a panel offering two options: start the tutorial or begin the game.

Choosing the tutorial option transports the user to the Tutorial Map, where they can learn the keyboard controls by completing a simple task.

Selecting the Start Game option officially begins the game, starting in the Clinic Scene Map. In this scene, the user is immersed in the background story from the perspective of the male partner, who is accompanying his partner to a clinic for a consultation with a physician about the Symptothermal Method. As the narrative unfolds, the user gains the freedom to explore the gameplay environment and interact with elements as the story continues.

Inspection And Interaction System

This section introduces the core interaction logic in LoveLoop, the project hierarchy, and the implementation of the interaction and inspection systems.

D.4.A. Inspection and Interaction System Overview

The current version of LoveLoop is an immersive, exploration-based desktop game. The goal of the application is to construct a novel learning experience as an alternative to a traditional educational approach. The functionality in the environment thus relies heavily on the user's proactive interaction with objects and characters in the game.

The interaction system enables objects to be interactable in the scene when the player triggers an interaction through a keyboard command. The inspection system is an extension of the interaction system; it enables interactable objects to be inspectable in the game to improve user experience.

Interaction

An interaction is defined by a key trigger when the player gets close enough to an interactable object. Consider an interaction as initiating a conversation: a successful, meaningful in-person conversation can only be established when the initiator and the target are close enough. In LoveLoop, the player is always the initiator, a keyboard action is the tool of initiation, and the interactable actors in the scene are the targets.

Inspection

An inspection in the game is defined as an extension of an interaction; in other words, it can only be performed after an interaction has been triggered. Using the same analogy, inspection parallels the thinking process after initiating a conversation: a converser may prefer to take time to ponder the conversation, just as the player may want to take a closer look at an interactable object in the scene. In LoveLoop, inspection is implemented by giving the player the freedom to examine objects by enabling rotation, zooming, and special panel interactions.

Figure 4.1.0. An example of inspection widget with thermometer

D.4.B. Top Level System Hierarchy

Interfaces

The interaction/inspection system is built around an interface named Interact, which can be found at Content/Blueprints/Interfaces/BPI_Interact.

The Interact interface contains two major functions, Interact and Inspect. These functions are implemented by interactable actors in the game depending on their role and importance.

  • Interact: This can be thought of as the fundamental interaction function; it is implemented in actors that have unique interaction behaviors.
  • Inspect: This function can be thought of as the inspection trigger function. Actors that implement this function will be both interactable and inspectable.

Note: An actor in the game can implement both the Interact and Inspect functions at the same time.

Top Level Classes

GI_Love

GI_Love is a game instance class storing information that stays persistent throughout the game. This class is utilized most heavily with BP_Notebook, whose technical details will be introduced in its corresponding section.

BP_BaseActor

The BaseActor class is the parent class of all inspectable actors in the game; it implements the Interact interface above and contains interaction trigger conditions.  

On Spawn Variables

Each BP_BaseActor exposes public variables that can be modified dynamically in the scene; these usually correspond to important properties of the specific actor.

  • Interaction | Inspection | Inspect?: Toggles whether this actor is interactable. Always set this value to TRUE if you want to inspect and interact with the actor.
  • Settings | Debugging?: Toggles visibility of the BP_BaseActor's collision radius. Setting this value to true makes the collision radius visible; false makes it invisible.
  • Inventory | Item: Contains the item properties of this BP_BaseActor; details about each entry are introduced in the Inventory section.

Figure 4.2.0. BP_BaseActor variable settings

Collisions

A BaseActor contains two layers of collision for interaction: the outer and inner collisions. A collision can be considered an invisible shield encompassing the actor, used for detecting overlaps between the player and the interactable actor. Most of the trigger logic is handled by the inner collision: when an overlap is detected between the BaseActor and the player inside the inner collision radius, the IA_Interact input action can be successfully triggered.

Figure 4.2.2. BP_BaseActor base structure

Tag Interactions

Tags are icons that are only visible when the user enters their corresponding collision radius; they serve as indicators prompting the user to perform a keyboard interaction.

A BaseActor's inner and outer collision radii are customizable (technical details are covered in the Inventory section). The outer and inner tags are rendered visible one at a time as the player overlaps each tag's respective collision radius; this is implemented by overriding the sphere collisions' OnComponentBeginOverlap and OnComponentEndOverlap events in the event graph.
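Below is a minimal, engine-agnostic C++ sketch of this two-layer logic, using a distance check in place of Unreal's overlap events. The names (EvaluateTag, TagState) are illustrative assumptions, not blueprint names from the project.

    #include <cmath>

    struct Vec3 { float x, y, z; };

    static float Dist(const Vec3& a, const Vec3& b) {
        return std::sqrt((a.x - b.x) * (a.x - b.x) +
                         (a.y - b.y) * (a.y - b.y) +
                         (a.z - b.z) * (a.z - b.z));
    }

    // One tag is visible at a time, depending on which radius the player overlaps.
    enum class TagState { Hidden, OuterTag, InnerTag };

    TagState EvaluateTag(const Vec3& playerPos, const Vec3& actorPos,
                         float innerRadius, float outerRadius) {
        const float d = Dist(playerPos, actorPos);
        if (d <= innerRadius) return TagState::InnerTag; // IA_Interact can trigger here
        if (d <= outerRadius) return TagState::OuterTag; // hint tag only
        return TagState::Hidden;
    }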

User Widget

BaseActor's inspection is presented using a user widget, W_Inspect. This widget is the container for all inspection widgets and consists of three core components: the renderer, the description, and the sub-widget.

Figure 4.2.3. W_Inspect widget structure
  • Render: A Render Target that captures real-time changes in the game environment from a specific camera. Once the user triggers an inspection on a BaseActor, a copy of the interacted BaseActor is created in the world, along with a PreviewManager actor at an arbitrary location distant from the player's current position. The PreviewManager contains an input camera, and the BaseActor copy spawns right in front of it; the Render Target is projected onto the user widget, simulating an inspection of the actor in the environment.
  • Description: A description includes a title and a short text that hints to the player about the object's functionality. Our current implementation binds this component to a Struct property (covered in the Inventory section).
  • Sub-Widget: This component allows another widget to be attached to the inspection widget container. This field is filled when the BaseActor has special user interfaces for interaction; it is only a concern for the four core interactable actors, which are covered in later sections.

Inspection Controls

Besides defining the scope of successful trigger conditions, BaseActor covers all inspection logic, from rotation and zooming to sub-widget addition. As mentioned under User Widget, a copy of the interacted BaseActor is generated in the world, and that copy can be controlled using mouse events, WASD/arrow keys, etc. This is implemented by overriding a series of default mouse and keyboard events after the widget is added. In our implementation, we bind the mouse's X and Y movement to the BaseActor's location, mouse scroll events to the BaseActor's scale, and right click to resetting scale and location.

Event Inspect Implementation

BaseActor implements the Inspect function from the Interact interface; it mainly enables UI mode and appends W_Inspect to the viewport. Along with the widget addition, camera and movement controls and character visibility are temporarily toggled during inspection for a better inspection experience.

Event Inspect also silently calls the Interact function from the Interact interface; this call is intended to initiate potential actor-specific behaviour, as some subclasses of BP_BaseActor implement exclusive functionality in the event Interact function.

First-Person related controls

Recall that all input action handlers are implemented in BP_first_person_character. In our implementation, upon IA_Interact, all overlapping actors are gathered and sent through a filter; the BaseActors are then duplicated in the scene along with a generated PreviewManager.

Figure 4.2.7. Relationship diagram amongst BP_first_person_character, BP_BaseActor, W_Inspect, BP_PreviewManager, etc.

Direct Subclasses

The classes listed under Direct Subclasses all directly inherit from BP_BaseActor, which means they are all inspectable in the game. In LoveLoop, the majority of interactable objects are strictly inspectable.

BP_ItemSample

BP_ItemSample is an actor used to define interactable actors that do not have an associated interaction panel. It contains a static mesh that can be added dynamically to the environment, with the rest of its features identical to BP_BaseActor's inherent functionality. This makes BP_ItemSample the actor of choice for inspectable objects with no special behaviours, simulating plain objects in the scene. Having such objects in the game makes the experience more realistic, in that the user can pick up and look at items in the house.

The dynamic static mesh update is implemented through a setMesh property in the construction script, with the static mesh property defined per instance in the main level map.

BP_Thermometer

BP_Thermometer is one of the three core STM actors; it is mainly used to showcase the change in basal body temperature across the ovulation cycle. Users interact with BP_Thermometer just as with other BP_BaseActor subclasses, by initiating the Interact input action. Functionality-wise, BP_Thermometer comes with the inherited inspection panel, plus a dedicated sub-widget containing a slider tied to the ovulation cycle, used to visualize temperature changes and temperature-related symptoms over each critical period.

Sub-widget

The sub-widget is where the dedicated slider and actor specific information are located. This widget is appended directly onto the inspection widget introduced under BP_BaseActor.

The dedicated sub-widget is added by activating the Interact function of the Interact interface. The Interact function creates the sub-widget and appends it to the item slot entry of the inspection panel inherited from the superclass.

The dedicated sub-widget is implemented in W_ThermometerWidget. This widget contains a custom slider whose values are linked to a line plot illustrating the relationship between basal body temperature and days in the ovulation cycle. (The line plot is implemented in a Float Curve asset at Content/Blueprints/FloatCurve_BBT_Cycle.)

In terms of implementation, changes in slider value are captured by the slider's default OnValueChanged event. The slider value, which ranges from 0 to 1, is mapped to ratios for calculating the critical period ranges. Based on the range the value falls into, the information on the panel and the text renderer on BP_Thermometer's actor mesh update accordingly.
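As a rough illustration, here is a hedged C++ sketch of this slider-to-range mapping. The day boundaries are placeholder assumptions for a 28-day cycle, not the values encoded in FloatCurve_BBT_Cycle.

    #include <cmath>

    // Map the slider's 0-1 value to a cycle day, then bucket it into a phase.
    const char* PhaseForSlider(float sliderValue, int cycleLength = 28) {
        const int day = static_cast<int>(std::round(sliderValue * (cycleLength - 1))) + 1;
        if (day <= 5)  return "Menstruation"; // assumed boundary
        if (day <= 13) return "Follicular";   // assumed boundary
        if (day <= 16) return "Ovulation";    // assumed boundary
        return "Luteal";
    }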

Actor Mesh

It can be easy to confuse the implementation of actors like BP_Thermometer with BP_ItemSample. It is important to realize that BP_Thermometer and the other actors in this list possess their own pre-defined meshes rather than using BP_ItemSample's dynamic mesh mechanism. BP_Thermometer's mesh consists of a static mesh with a text renderer; the text renderer is bound to the slider values introduced above and updated dynamically using a simple setText function in the BP_Thermometer class.

BP_Notebook

BP_Notebook is an assistant actor that allows the user to record data collected from interacting with BP_Thermometer, BP_cervix, and BP_Mucus. It reminds the user of the standard practice of recording the data after performing each Symptothermal method throughout the day.

Sub-widget

Like BP_Thermometer, BP_Notebook contains logic for appending a sub-widget (W_TrackerSheet) in the event Interact function of the Interact interface.

In the implementation of W_TrackerSheet, the widget shows a copy of the tracker sheet in the viewport. Each entry of the tracker sheet is bound to an editable textbox whose value is stored in a structure (S_Journal) inside a game instance (GI_Love). Each time the user finishes typing in an editable box, the text is updated in the game instance through the OnTextCommitted event, and the changes are reflected in real time on the tracker sheet mesh via event tick.
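The pattern is simple enough to sketch in a few lines of C++: text commits write into a struct owned by a long-lived game instance, so the data outlives any single notebook actor. GameInstance and the field layout here are stand-ins, not the project's actual types.

    #include <map>
    #include <string>

    struct SJournal {                                // mirrors the S_Journal struct
        std::map<std::string, std::string> entries;  // tracker sheet fields
    };

    struct GameInstance {                            // stands in for GI_Love
        SJournal journal;                            // persists across actor lifetimes
    };

    // Equivalent of the OnTextCommitted handler described above.
    void OnTextCommitted(GameInstance& gi, const std::string& field,
                         const std::string& text) {
        gi.journal.entries[field] = text;            // shown on the sheet via event tick
    }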

Actor Mesh

BP_Notebook has a static mesh with rendered text representing user inputs; these inputs are bound to local variables connected to GI_Love's S_Journal values. This ensures that even after inspection ends or the current BP_Notebook actor is destroyed, the information on the tracker sheet persists throughout the game; every time the BP_Notebook actor is inspected again, the information is applied to both the widget and the tracker sheet mesh.

BP_DoctorsNote

BP_DoctorsNote is an actor that provides guidance on the daily tasks required for the correct application of the Symptothermal Method.

Sub-widget

Like the other subclasses described so far, BP_DoctorsNote contains logic for appending a sub-widget (W_DoctorsNote) in the event Interact function of the Interact interface. Careful users may notice that its dedicated user widget consists only of a short description. This is because BP_DoctorsNote is envisioned to eventually include more functionality; we made it an independent subclass rather than reusing BP_ItemSample to ease future development.

Independent Classes

BP_cervix

BP_cervix is another core STM actor in the game; it is intended to provide an accurate visualization of the changes in cervical position over the ovulation cycle. Functionality-wise, BP_cervix comes with the inherited inspection panel, plus a dedicated sub-widget containing a slider tied to the ovulation cycle, used to visualize cervical changes and related symptoms over each critical period.

Careful readers may notice that BP_cervix is not a subclass of BP_BaseActor. This is due to a realism consideration: a uterus model shouldn't be inspectable in a simulated household environment. However, due to time constraints we have temporarily left a model in the game scene that behaves the same way as the other actors.

Sub-widget

Like the other two core STM actors, BP_cervix's slider follows the same fundamental logic. Its dedicated sub-widget is implemented in W_CervixInterface. The slider values are bound to Unreal's default OnValueChanged event, which triggers dynamic updates of the skeletal mesh transition and the phase-specific information. The phase-specific information update is a simple variable update in the OnValueChanged event; to fully understand the skeletal mesh transition, however, we must understand the notion of morph targets.

Morph Targets: BP_cervix is an actor with an independent skeletal mesh. This mesh contains two morph targets, each corresponding to a mesh transition between two stages.

To fully understand the implementation of the mesh transition, note that the menstrual cycle consists of four stages, so there are three major transitions to consider. Of the two morph targets, the second is reusable, serving both the stage 1 to stage 2 transition and the stage 3 to stage 4 transition. For more information about morph target creation in Blender, see section D.8.D.

We created a simple mesh transition function (transitionMesh) that updates the current state of the skeletal mesh given a morph target name and corresponding morph value. In the OnValueChanged event, we parse the slider value into a ratio from 0 to 1 and pass this ratio into transitionMesh. Because morph targets only play forward, we included logic that manipulates the ratio to simulate a reverse morph transition on the second morph target, giving smooth transitions across all three stages.

By binding these transitions to pre-defined ranges of critical periods, we can simulate seamless cervical position transitions with BP_cervix.
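A minimal, engine-agnostic sketch of this transitionMesh logic might look like the following C++, assuming three equal slider ranges; the morph target names and boundaries are illustrative, not the project's actual values.

    #include <algorithm>

    struct MorphState {
        const char* targetName; // which morph target to drive
        float value;            // morph value in [0, 1]
    };

    // Map a slider ratio in [0, 1] onto two morph targets covering three
    // transitions; the second target is reused, and its ratio is inverted
    // to simulate the reverse transition described above.
    MorphState TransitionMesh(float ratio) {
        ratio = std::clamp(ratio, 0.0f, 1.0f);
        if (ratio < 1.0f / 3.0f)            // stage 1 -> 2: reusable target forward
            return {"MorphB", ratio * 3.0f};
        if (ratio < 2.0f / 3.0f)            // stage 2 -> 3: first target
            return {"MorphA", (ratio - 1.0f / 3.0f) * 3.0f};
        // stage 3 -> 4: drive the reusable target backwards
        return {"MorphB", 1.0f - (ratio - 2.0f / 3.0f) * 3.0f};
    }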

BP_Blob  

BP_Blob is another core STM actor in the game. It is intended to provide an accurate visualization of the changes in cervical mucus over the ovulation cycle. Functionality-wise, BP_Blob comes with the inherited inspection panel, plus a dedicated sub-widget containing a slider tied to the ovulation cycle, used to visualize changes in the mucus's material and consistency over each critical period.

Careful readers may note that, like BP_cervix, BP_Blob is not a subclass of BP_BaseActor. This is due to a realism consideration: mucus shouldn't be inspectable in a simulated household environment. Due to time constraints, we have temporarily left a model in the game scene that behaves the same way as the other actors.

Sub-widget

Like the other core STM actors, BP_Blob's slider follows the same fundamental logic. Its dedicated sub-widget is implemented in W_MucusSlider. The slider values are bound to Unreal's default OnValueChanged event, which triggers dynamic updates of BP_Blob through the SlimeInAction function, while phase-specific information is updated through the OnSliderChange function. The phase-specific information update is a simple variable update in the OnValueChanged event, and the phase-based mucus updates are achieved with custom materials and stretch/thin variables on the active BP_Blob instance in the scene.

Mucus Materials

BP_Blob is constructed with custom materials to represent the different mucus qualities across the menstrual cycle.

There are five different materials:  

  1. M_SlimeMenstrual: Shows the mucus during the menstrual phase, when it is masked by blood.
  2. M_SlimeYellowWhite: Shows the mucus during the early follicular phase, when it has a yellow-white tint and starts to become more abundant.
  3. M_SlimeCloud: Shows the mucus during the late follicular phase, when it is clear and cloudy.
  4. M_SlimeOvulation: Shows the mucus during the ovulation phase, when it is clear and most abundant.
  5. M_SlimeLutealWhite: Shows the mucus during the luteal phase, when it is white and starts to decrease in amount.

Strategies used to build materials

  1. Fresnel Effect: A Fresnel node adds rim highlights, enhancing the perception of wetness and thickness at glancing angles.
  2. Scrolling Normals: Multiple panning normal maps are combined to create dynamic, organic surface movement. These normals distort the surface, mimicking the wet, uneven texture of mucus.
  3. Color and Emissive Tint: A custom base color (red for menstrual mucus) and an emissive color add a subtle glow, making the mucus look more vivid.
  4. World Position Offset Jiggle: The wobble is driven by sine-wave calculations based on world position and time, causing parts of the mesh to gently shift along the X and Y axes (see the sketch after this list). To maintain visual realism and control, the distortion is blended with a texture mask, allowing localized movement while preserving detail in other areas. This creates a lifelike, jiggly behaviour resembling organic fluid motion.
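For item 4, the underlying math can be sketched outside the material graph as plain C++; the amplitude, frequency, and mask input below are illustrative assumptions, not the project's material parameters.

    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Sine-wave offset driven by world position and time, scaled by a
    // per-vertex mask (0 = rigid, 1 = fully jiggly).
    Vec3 JiggleOffset(const Vec3& worldPos, float time, float mask) {
        const float amplitude = 2.0f; // assumed amplitude (Unreal units)
        const float frequency = 3.0f; // assumed wobble speed
        Vec3 offset{0.0f, 0.0f, 0.0f};
        offset.x = amplitude * std::sin(frequency * time + worldPos.y * 0.1f) * mask;
        offset.y = amplitude * std::sin(frequency * time + worldPos.x * 0.1f) * mask;
        return offset; // jiggle is confined to the X/Y plane, as described
    }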

Movement / Consistency

After considering several techniques for representing BP_Blob movement, from custom spline paths to Niagara systems, procedural deformation with point forces was chosen to simulate mucus behaviour. Mucus movement is driven by four force targets (startLoc, endLoc, leftLoc, rightLoc) around the blob, which stretch BP_Blob in real time.

Movement is triggered on EventTick, which moves startLoc/endLoc along the z-axis by (adjustStretch/1000). Similarly, leftLoc/rightLoc are moved along the y-axis by (adjustThin/1000) on each tick.

Based on the stretching and thinning movements, relative scaling for BP_Blob was kept using three key values:  

  1. Stretch Factor: Current stretched distance / BaseLength.
  2. Parabolic Compensation: Current thinned distance / BaseThickness.
  3. Break Point: Custom point for when the mucus stops stretching.

StretchBlob

Since BP_Blob is based on a static mesh, real-time vertex-level deformation was not feasible. We therefore implemented dynamic non-uniform scaling to simulate stretch and deformation behaviour. By manipulating the mesh's scale parameters in response to control inputs (e.g., force points or distance thresholds), the mucus blob visually stretches until it reaches a breaking point:

  • Z-axis: stretch normally using the Stretch Factor.
  • X and Y axes: use volume conservation, rescaling the x-axis using the Parabolic Compensation and the y-axis using the Stretch Factor (see the sketch below).
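The scaling rule reduces to a few lines; this is a hedged, standalone C++ rendering of the described behaviour, with an assumed break-point value and a small guard against division by zero.

    #include <algorithm>

    struct Scale3 { float x, y, z; };

    // stretchFactor = current stretched distance / BaseLength
    // parabolicComp = current thinned distance  / BaseThickness
    Scale3 StretchBlobScale(float stretchFactor, float parabolicComp) {
        const float breakPoint = 3.0f; // assumed: mucus stops stretching here
        stretchFactor = std::min(std::max(stretchFactor, 0.01f), breakPoint);
        parabolicComp = std::max(parabolicComp, 0.01f);
        Scale3 s;
        s.z = stretchFactor;        // stretch normally along Z
        s.x = 1.0f / parabolicComp; // volume conservation via Parabolic Compensation
        s.y = 1.0f / stretchFactor; // volume conservation via Stretch Factor
        return s;
    }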

BP_Door

This is a side actor dedicated to handling all door open/close interactions and is triggered through the Interact input action.

The open/close feature is implemented with a Timeline, a lerp node, and a SetRelativeRotation function in the blueprint. The Timeline simulates the time it takes for the door to open or close, and the lerp node specifies the two endpoints of the relative rotation. Beyond the basic implementation, we also made the door always open in the direction away from the player. To achieve this, we calculate the dot product between the door's facing vector and the vector toward the player, and toggle the rotation range of the door's static mesh based on the polarity of the dot product.
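Only the direction choice is sketched below, engine-agnostically; the Timeline and lerp are left out, and the 90-degree swing is an assumption.

    struct Vec3 { float x, y, z; };

    static float Dot(const Vec3& a, const Vec3& b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Returns the target yaw (degrees, relative to the closed pose) so the
    // door swings away from the player.
    float OpenDirectionYaw(const Vec3& doorForward, const Vec3& doorPos,
                           const Vec3& playerPos) {
        const Vec3 toPlayer{playerPos.x - doorPos.x,
                            playerPos.y - doorPos.y,
                            playerPos.z - doorPos.z};
        // Positive dot product: the player stands in front of the door, so
        // open backwards; otherwise open forwards.
        return Dot(doorForward, toPlayer) > 0.0f ? -90.0f : 90.0f;
    }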

Animations

A pick-up animation is integrated into the interaction trigger and plays upon every inspection of a BaseActor. The pick-up animation plays first; upon its completion, the animation sequence sends a notification to the actor, which then triggers the addition of the user panels and interfaces.

Inventory System

Top Level Overview

Items can be classified into two categories: storable and non-storable. The inventory system is responsible for handling storable actors in the game. By triggering the IA_PickUp input action, a storable actor is destroyed in the scene and its item information is stored in the local inventory. Combined with the IA_Inventory input action, inventory items can be viewed and taken out as needed.

The inventory system is built in parallel with the BP_BaseActor class: all BP_BaseActor objects have attributes that control their inventory behaviour, and only BP_BaseActor and its subclasses can be stored in the inventory system.

Interfaces

The inventory system extends its functionality through the BPI_pickup interface; by default, BP_BaseActor and its subclasses all implement BPI_pickup. Classes that implement the Pickup method in BPI_pickup can add their own unique functionality.

Item Structure

As introduced in D.4.B, BP_BaseActor objects have an Item attribute of a structure type. This structure contains the fields that control the behaviour of a specific BP_BaseActor instance in the scene; a compact sketch of these fields follows the list below.

Figure 5.1.0. Item Structure for BP_BaseActor object
  • Item Inventory? (required): Controls whether this object can be stored. Always check this boolean for storable objects.
  • Item Name (required): The item's name, displayed on the inspection panel upon inspection.
  • Item Description (required): The item's description, displayed on the inspection panel upon inspection.
  • Item Stackable? (required): Controls whether the object is unique. If checked, the object is not unique and has a counter associated with it; otherwise the object is treated as unique and counted once.
  • Item Amount (required): The amount associated with the object; should always be set to 1.
  • Item Image (required): The image used as a placeholder for the stored item in the inventory.
  • Static Mesh (not required): Only required for BP_ItemSample objects or any BP_BaseActor subclass with no predefined static mesh.
  • Collision Inner Radius (required): Defines the inner collision radius of the actor; it should always be smaller than or equal to the Collision Outer Radius for the best gameplay experience.
  • Collision Outer Radius (required): Defines the outer collision radius of the actor; it should always be bigger than or equal to the Collision Inner Radius for the best gameplay experience.
  • World Scale (not required): Only required for BP_ItemSample objects; it dynamically sets the world scale of the static mesh in the scene. For other BP_BaseActor subclasses, this field is computed automatically upon creation.
  • Inspection Scale (required): Controls the item's scale in inspection mode.
  • World Rotation (not required): Only required for BP_ItemSample objects; it dynamically sets the rotation of the static mesh in the scene. For other BP_BaseActor subclasses, this field is computed automatically upon creation.
  • Base Actor Class (not required): The class type of the current actor; it is strictly not required and is computed automatically in the background when the game starts.
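A hedged C++ mirror of this structure, under the assumption that the Blueprint field types map to the obvious native ones (the real asset is a Blueprint Struct, so the names and types here are approximations):

    #include <string>

    struct FItem {
        bool  bItemInventory = false;       // storable in the inventory?
        std::string itemName;               // title shown on the inspection panel
        std::string itemDescription;        // text shown on the inspection panel
        bool  bItemStackable = false;       // true: counted; false: unique
        int   itemAmount = 1;               // should always be set to 1
        std::string itemImagePath;          // placeholder image in the inventory
        std::string staticMeshPath;         // only for BP_ItemSample-style actors
        float collisionInnerRadius = 50.f;  // must be <= outer radius
        float collisionOuterRadius = 100.f; // must be >= inner radius
        float inspectionScale = 1.f;        // item scale in inspection mode
        // World Scale, World Rotation, and Base Actor Class are omitted:
        // per the text, they are computed automatically for most subclasses.
    };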

System Structure

The inventory system uses a component-based architecture under the hood and consists of three core components: an inventory component (BPC_InventoryComponent), a parent grid (W_InventoryGrid), and item slots (W_ItemSlot). Their relationships are abstracted in the following figure:

BPC_InventoryComponent

This class can be considered the actual inventory; it contains all the inventory item information and simple functions for addition, removal, and dynamic updates. It creates the W_InventoryGrid upon construction.
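A hedged sketch of that add/remove/size logic in standalone C++ (slot layout and function names are assumptions; stackable items share a slot with a counter and unique items get their own slot, matching the Item Structure section):

    #include <string>
    #include <vector>

    struct InventoryEntry {
        std::string itemName;
        int count = 0;
    };

    class InventoryComponent {
    public:
        void AddItem(const std::string& name, bool stackable) {
            if (stackable) {
                for (auto& e : entries_)
                    if (e.itemName == name) { ++e.count; return; } // bump the counter
            }
            entries_.push_back({name, 1}); // unique item, or first of its kind
        }

        bool RemoveItem(const std::string& name) {
            for (size_t i = 0; i < entries_.size(); ++i) {
                if (entries_[i].itemName == name) {
                    if (--entries_[i].count == 0)
                        entries_.erase(entries_.begin() + i);
                    return true;
                }
            }
            return false; // not in the inventory
        }

        // The kind of size check the tutorial's capture functions rely on.
        int TotalItems() const {
            int total = 0;
            for (const auto& e : entries_) total += e.count;
            return total;
        }

    private:
        std::vector<InventoryEntry> entries_;
    };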

W_InventoryGrid

This is the parent container of all the items; it is a user widget that contains entries for all the W_ItemSlot widgets. It contains simple logic for updating all the item slot widgets when the inventory changes, and it captures onKeyDown and onDrop events for drag-and-drop behaviour.

W_ItemSlot

Each W_ItemSlot contains a button, a number, and an image to represent the item being added to the inventory. The number represents the count of the item in the inventory; it is updated automatically upon addition or deletion by the BPC_InventoryComponent. The button binds to an onClick event that displays item information to the inventory grid through a simple update function.  

W_ItemSlot overrides the onDragDetected event; this ensures that an item can be dragged out by holding onto its image.

Drag and Drop

The drag-and-drop feature is implemented jointly across the W_ItemSlot, W_InventoryGrid, and BP_MyPlayerController blueprints. W_ItemSlot detects and captures a drag by overriding its default onDragDetected event. In terms of implementation, it creates a DragDrop operation class (BP_DragDrop) to store item information during the transfer from inventory to scene. Additionally, we created an intermediate widget (W_DraggedImage), shown while dragging, to represent the item being held by the cursor.

W_InventoryGrid's canvas panel is marked as a variable, which enables the entire W_InventoryGrid to capture onDrop events. We overrode W_InventoryGrid's onDrop event to generate the item from the inventory into the scene. To realize this dropping and actor-spawning behaviour, a ray trace against visible collision is performed in BP_MyPlayerController, and the actor spawns based on the item information stored in the BP_DragDrop operation.

Figure 5.2.1. DragDrop feature relationship diagram

Tutorial System

Top Level Overview

The tutorial system was created to familiarize the user with all the controls required to enjoy LoveLoop. It has its own dedicated map under Content/Maps/tutorial_map; its scene consists of a maze and four inspectable, storable BP_BaseActor objects.

Game Flow of Tutorial System

The tutorial system is implemented as a sequential state machine, in which each state corresponds to a phase of the tutorial and automatically advances when the previous phase is completed, until the tutorial ends.

Listed below are the phases in order from start to finish under the current implementation:

  • View: Teaches user how to navigate views through mouse movement.
  • Move: Teaches user how to move using WASD/arrow keys for control.
  • Jump: Teaches user how to jump using the space key.
  • Inspect/Interact: Teaches user how to interact and inspect objects in the scene.
  • Store: Teaches user how to store an object into the inventory.
  • Inventory: Teaches user how to open and close the inventory.
  • Drag Drop: Teaches user how to drag an item out from the inventory panel.
  • Goal: A phase dedicated for users to combine all the controls together to complete a task of collecting all four inspectable objects in the scene.
  • Complete: A phase that signals the end of the tutorial.
  • None: An unimplemented phase for debugging purposes.

System Structure

The tutorial system is built from two major components: BP_TutorialSystem and BPC_InputCapturer. One may envision BP_TutorialSystem as the central manager of the tutorial system, handling all the logic related to phase changes and panel creation. BPC_InputCapturer is an actor component of BP_TutorialSystem; it is responsible for capturing user inputs during each phase of the tutorial and triggering a phase shift upon every successful phase completion.

An abstraction of the Tutorial System can be illustrated as the following:

Figure 6.1.0. Tutorial System abstraction

Logic Walkthrough

BP_TutorialSystem owns BPC_InputCapturer as an actor component. BP_TutorialSystem tracks a state of type E Tutorial State, an enum that defines the user's current tutorial phase. The tutorial starts through BP_TutorialSystem, which initializes the starting widget panel. Its current state is then passed to BPC_InputCapturer, which calls the respective capturing function depending on the state of the tutorial. This state check is performed once per frame using event tick until the capturing function detects a successful response from the user in the current state.

Upon a successful response from the user in the current phase, BPC_InputCapturer destroys the current tutorial widget and updates BP_TutorialSystem's current state. A brand-new tutorial widget is subsequently created, and BPC_InputCapturer resumes its real-time check using the next state-dependent capturing function.
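The advance-on-success loop reduces to a few lines; here is a hedged C++ sketch, with phase names taken from the list above and the per-frame check standing in for Event Tick.

    enum class ETutorialState {
        View, Move, Jump, InspectInteract, Store,
        Inventory, DragDrop, Goal, Complete, None
    };

    // Called once per frame; 'inputDetected' stands in for the result of the
    // state-dependent capturing function.
    ETutorialState TickTutorial(ETutorialState state, bool inputDetected) {
        if (state == ETutorialState::Complete || !inputDetected)
            return state;                  // keep waiting for a successful response
        // Advance to the next phase; the Blueprint would also swap widgets here.
        return static_cast<ETutorialState>(static_cast<int>(state) + 1);
    }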

Input Captures

A capturing function is a method implemented in BPC_InputCapturer whose primary role is detecting state-dependent user inputs. Once the expected user behaviour is detected, a state change is executed and the user proceeds to the next stage of the tutorial. This section walks through the implementation logic of the capturing functions in BPC_InputCapturer.

Capture View, Capture Move & Capture Jump

All three capturing functions share similar logic: each involves overriding the main character's default outputs for movement, camera, and falling.

To achieve this, we extract action values from BP_first_person_character’s default movement, jump, and camera input functions and bind them to local variables. Their setter functions are bound to these default action inputs for dynamic updates.

The three capturing functions only involve checks on the character's respective output values; their results are thus determined by the range of these values.

Capture Interaction & Capture Inventory

Phases

Both capturing functions introduce the concept of phases, currently implemented as an array of booleans in BPC_InputCapturer. A phase serves as a checkpoint for an interaction that contains multiple stages, indicating how many checkpoints the interaction has fulfilled.

Both the capture interaction and capture inventory functions contain two stages before completion, where each stage is mapped to a phase and each phase update is triggered by an indicator associated with the inventory or inspection system.

Capture Drag Drop, Capture Storage, & Capture Goal

These three functions leverage a combination of phase checks and size-check functions from BPC_InventoryComponent:

  • Capture Drag Drop: Divided into three phases, the first phase stores the number of current items in the inventory, the second phase captures a decrease in total amount of current items in the inventory, and the final phase captures the closure of the inventory panel.
  • Capture Storage: Divided into two phases, the first phase stores the current number of items in the inventory, and the second phase detects an increase in the inventory system. We leveraged simple size-checking function from BPC_InventoryComponent to detect size changes and phase triggers.
  • Capture Goal: Leveraged BPC_InventoryComponent’s size-checking functions to determine if there are a satisfactory number of items in the inventory. In this case, we deliberately made the number of storable items in the tutorial map to be the same as the goal, so this trigger will be completed as soon as all items in the scene are added to inventory.

User Widgets

All tutorial user widgets are stored under Content/Tutorial/user_interfaces. Most of these interfaces only provide instructions and guidance through images; the only exception is W_TutorialCompleted, which incorporates button interactions for the restart logic.

Miscellaneous Features

Ending Mechanism

LoveLoop's ending mechanism is hidden in its interaction and exploration. After all three core interactions have been triggered, a key is generated in the game, which the user can then store in the inventory. This feature is implemented using an informal state machine: after each core interaction is triggered, a variable representing its state is set to True, and once all three states have passed, the key generation is triggered.
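Reduced to code, the informal state machine is just three flags and a conjunction; this C++ sketch uses assumed flag and function names.

    struct EndingState {
        bool bbtDone = false;     // BP_Thermometer interaction triggered
        bool cervixDone = false;  // BP_cervix interaction triggered
        bool mucusDone = false;   // BP_Blob interaction triggered
    };

    // Checked after each core interaction; returns true once the key
    // should be generated in the scene.
    bool ShouldSpawnKey(const EndingState& s) {
        return s.bbtDone && s.cervixDone && s.mucusDone;
    }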

First Time Set Up Guide

Pre-Requisites and Setup Guide

Before presenting technical details, here is a content overview that future developers should be mindful of when expanding this project. LoveLoop is a high-fidelity desktop application with the following hardware requirements for smooth interaction and gameplay.

CPU:

  • Minimum: Intel Core i7 (12th Gen or newer), or AMD Ryzen 7 equivalent
  • Above Threshold: Intel Core i9-13900K, AMD Ryzen 9 7950X — ideal for multitasking, large projects, and compiling

GPU:

  • Minimum: NVIDIA GeForce RTX 4070 (16GB VRAM or more)
  • Above Threshold: RTX 4080, RTX 4090, or NVIDIA Quadro/A-series GPUs — preferred for heavy rendering, real-time graphics, or machine learning

RAM:

  • Minimum: 32GB DDR4 or DDR5
  • Above Threshold: 64GB or more — suitable for complex projects or running multiple tools simultaneously

Operating System:

  • Minimum: Windows 11 (or Windows 10 21H2+), or Ubuntu 22.04+
  • Above Threshold: Windows 11 Pro (Build 22631+), or Ubuntu 24.04 LTS

Display:

  • Minimum: 1080p (FHD) at 60Hz
  • Above Threshold: 1440p or 4K at 120Hz+ — beneficial for workspace and visual fidelity

Getting Started with Unreal Engine 5.5

We used Unreal Engine 5.5.4 as our main development platform. Below are instructions on how to install Unreal Engine 5.

  1. Visit Download Unreal Engine - Unreal Engine and install the Epic Games Launcher.
  2. After following the steps to install the launcher, select Unreal Engine.
  3. Go to the Library tab and press the "+" icon. Choose Unreal Engine 5.5.4.
  4. In Folder, choose where to install Unreal Engine. It is recommended not to change this if you have the space on your primary drive.
  5. In the path dialog, click on Options and unselect the Android, iOS, and Linux platforms. This will save you about 22 GB on installation.
  6. Let the installation finish and then run Unreal Engine. The first run may take a while, so please be patient!

Plugin Installation

LoveLoop uses the MetaHuman plugin from Unreal Engine 5.5 to incorporate MetaHumans into the project; the following procedure ensures proper installation of the plugin.

  1. Launch the Epic Games Launcher and sign in with your Epic Games account.
  2. Click on Unreal Engine in the left side panel.
  3. Click on Library in the top navigation bar.
  4. You should now see information related to ENGINE VERSIONS; scroll down until you see the title 'Fab Library'.
  5. Look through the installed plugins and check whether "MetaHuman Plugin" exists for your account. You may stop here if the plugin exists; otherwise continue to the next step.
  6. From the top-level navigation bar, click on Fab. This should take you to the Fab store in a separate browser tab.
  7. In the search bar, type "MetaHuman Plugin" and search. Then click download/add to library.
  8. Follow step 5 again to verify the plugin is 'installed to engine'.

Project Launching

  1. Open the project through Unreal Engine 5.
  2. Enter game mode via Alt+P or the Play button in Unreal Engine.

Build

Link to the build: https://github.com/ubcemergingmedialab/25-3003-loveloop

The project build can also be found in the P drive.

Poster

Development Team

Principal Investigator

Dr. Farah Shroff

Assistant Professor of Teaching

UBC Faculty of Medicine, Department of Family Practice

University of British Columbia

Current Team

Naomi Ng, Project Lead, UI/UX Designer (May 2025 - August 2025)

Work Learn at the Emerging Media Lab at UBC

Undergraduate in Bachelor of Science in Cognitive Systems

University of British Columbia

Shivangi Singh (May 2025 - August 2025)

Work Learn at the Emerging Media Lab at UBC

Masters in Fine Art of Design and Production

University of British Columbia

Billy Wang, Software Developer (May 2025 - August 2025)

Work Learn at the Emerging Media Lab at UBC

Undergraduate in Bachelor of Science in Computer Science and Statistics

University of British Columbia

Apram Ahuja, Software Developer (May 2025 - August 2025)

Work Learn at the Emerging Media Lab at UBC

Undergraduate in Bachelor of Science in Computer Science

University of British Columbia


License

MIT License

Copyright (c) 2023 University of British Columbia

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
