MELODY GEIGER
Game Producer. Experience Designer.
The In Between
Medium
Live Immersive and Experiential Installation Event
Timeframe
6 Months
Class
Immersive Experience Production
Project Type
40+ Person Group Project
Deliverables
An immersive production that ran for 9 shows over the weekend of April 20-22, 2023, for a cumulative audience of ~250 attendees.
Summary
The goal of The In Between was to take a live event from ideation to opening day and open it to the general public.
As the Development Lead, I was one of 7 leads chosen for the 40+ person production team, and the only undergraduate student chosen to be a part of the graduate student leadership team. In addition to collaborating with the other leads, I oversaw 4 other developers; together we were responsible for creating 8 technological interaction points for the experience.
Management
Management was an audience member's guide to The In Between. A calming psychopomp without all the answers, Management could be asked questions throughout the show's runtime and would respond to audience members in real time. On the development side, Management's visuals were built from AI-generated images and audio-reactive particles, while a live actor behind the scenes served as the voice of Management, shifting between reading a script and improvising whenever audience members asked tough questions.
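For a rough sense of the audio-reactive technique (a minimal sketch, not the show's actual network; the operator and parameter names below are placeholder assumptions): in TouchDesigner, a CHOP Execute DAT can watch a level channel coming out of an audio analysis chain and drive a particle system's parameters from it.

```python
# CHOP Execute DAT callback in TouchDesigner -- a minimal sketch, not the show's network.
# Assumes an analysis chain outputting a normalized 0-1 channel named 'level',
# and a particle component named 'particlesGPU' exposing a 'birthrate' parameter
# (both names are placeholders for illustration).

def onValueChange(channel, sampleIndex, val, prev):
    if channel.name == 'level':
        # Scale the voice level into particle emission so the figure
        # swells and settles as Management speaks.
        op('particlesGPU').par.birthrate = 100 + val * 900
    return
```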
Initially, Management was planned as a live motion capture performance: a Rokoko SmartSuit would stream into Unreal Engine 5 via Live Link, where Niagara particle systems would present the performer to the audience as an amorphous, silhouetted figure. However, very late in development the team ran into blockers that prevented the only developer on Management and me from moving forward with the mocap technique, so we pivoted to the AI audio-reactive TouchDesigner solution.
One of the challenges I faced during the development of Management was identifying the unknowns, given my own inexperience with motion capture technology; the sole developer on Management likewise had very limited experience with motion capture and UE5. Location-specific SmartSuit connectivity issues also surfaced far too late in development, leaving the two of us scrambling for viable, productive solutions at the end: of the 16 weeks we had to develop Management, it was week 14 when we learned that our live motion capture setup was not feasible. From there, the Management developer pivoted to an alternative motion capture solution while I pivoted to the TouchDesigner solution. My unfamiliarity with motion capture, my limited working knowledge of Unreal Engine 5, and the unforeseen on-location connectivity issues made it hard to know when to keep pushing forward and when to propose different solutions. Fortunately, the solution we found worked in the end and ran with relative ease during the shows.
The Universe
The Universe had two interaction components for the audience to explore: an interactive projection-mapping installation built with TouchDesigner and an Xbox Kinect, and a narrative-driven VR experience.
I oversaw the 11-week development cycle for the VR experience, which was developed by a team of 10, as well as the development of the TouchDesigner activation, which was completed by two developers.
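For a sense of how a Kinect-driven TouchDesigner interaction like this can be wired up (a hedged sketch under assumed names, not the installation's actual network): a CHOP Execute DAT can read skeleton channels from a Kinect CHOP and push a participant's hand position into the projected visuals.

```python
# CHOP Execute DAT callback -- a sketch, not the installation's actual network.
# Assumes a Kinect CHOP named 'kinect1' reporting player 1's left-hand position
# on channels 'p1/hand_l:tx' / 'p1/hand_l:ty', and a Transform TOP named
# 'transform1' that offsets the projected imagery (names are placeholders).

def onValueChange(channel, sampleIndex, val, prev):
    kinect = op('kinect1')
    hand_x = kinect['p1/hand_l:tx'].eval()  # roughly -1..1 across the sensor's view
    hand_y = kinect['p1/hand_l:ty'].eval()
    # Nudge the projected visual toward the participant's hand.
    op('transform1').par.tx = hand_x
    op('transform1').par.ty = hand_y
    return
```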
One challenge I encountered with The Universe activation was that VR development was outsourced to another class/studio, which meant I was responsible for communicating the Creative Director's vision to the VR development team. Many creative decisions had already been made by the Creative Director, but on occasion I had to help them understand the scope, affordances, and limitations of VR in order to reach a solution that fell within their vision while still allowing development to progress.