Blue In The Distance VR Performance

During a performance by Tracey Engleman and Zeitgeist of ‘The Blue in the Distance’ for live ensemble and virtual reality film, a selection of audience members was invited to view a series of 360° videos.  The videos were produced by Scott Miller, Ph.D. and Jack Wald, both from the School of Music.  The videos were synchronized to the music and controlled by a performer on stage using a MIDI foot pedal.  A version of the videos for the rest of the audience was projected onto the screen behind the performers.

That’s where MN-NICE comes in.  All of the headsets (7 Oculus Gos and 7 Oculus Quests) were networked together through a prototype MN-NICE server, which kept them all synced up.  At least, that was the idea.  This was going to be a real stress test of the system, in more ways than one.

To make sure wireless latency wasn’t an issue, we set up a closed wireless network for the headsets.  The results were mixed at best.  Even though all the headsets could connect to the server, only about half of them remained in sync.  The piece was performed 3 times over the course of the evening, to allow as many audience members as possible to see the 360° video.  Each time, only about half the headsets stayed with the server.

Another complication was the mix of headsets.  The Quest isn’t the best device for this type of audience experience.  The headsets weren’t all updated to the same firmware version, so not all of them could disable positional tracking, and the stadium seating in the recital hall caused the Guardian system to freak out.

We also identified several changes to the server architecture that would make the experience more robust.  Currently the videos are controlled by an integer value shared among all the connected devices.  This value serves as the index into an array of video names.  The server is the only device with the authority to write to that integer.  When it changes on the server, all the clients update their value and start the matching video.
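The control-value scheme above can be sketched in a few lines. This is an illustrative Python mock-up, not MN-NICE's actual code: the class names, the video list, and the callback style are all assumptions, and real networking is replaced by direct method calls.

```python
# Hypothetical video names standing in for the real array on the server.
VIDEOS = ["intro_360.mp4", "movement_1.mp4", "movement_2.mp4"]

class VideoClient:
    """A headset: it follows the shared control value but never writes it."""
    def __init__(self):
        self.current_video = None

    def on_control_value(self, index):
        # Start whichever video the shared integer points at.
        self.current_video = VIDEOS[index]

class VideoSyncServer:
    """The sole authority over the shared integer index."""
    def __init__(self):
        self.control_value = 0
        self.clients = []

    def register(self, client):
        self.clients.append(client)
        client.on_control_value(self.control_value)

    def set_control_value(self, index):
        # Only the server writes the value; every client is told of the change.
        self.control_value = index
        for client in self.clients:
            client.on_control_value(index)

server = VideoSyncServer()
headset = VideoClient()
server.register(headset)
server.set_control_value(2)   # e.g. the MIDI foot pedal advancing the piece
print(headset.current_video)  # -> movement_2.mp4
```

The key property is the single writer: because only the server may change the integer, there is no way for two headsets to disagree about which video *should* be playing, only about whether they received the update.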

We should also have the server do more to make sure the clients stay synchronized.  Right now, the message carrying the control value is only sent when the value changes on the server.  We’ll have it send updates at a regular interval, whether the value has changed or not.  Each update should carry not only the control value for the video, but also the server’s current frame within that video.
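The proposed periodic re-sync might look something like the following sketch. Everything here is an assumption about a future design, not existing MN-NICE behavior: the message shape, the function names, and the `DRIFT_TOLERANCE` threshold are all illustrative.

```python
DRIFT_TOLERANCE = 30  # frames a client may drift before it seeks (assumed value)

def make_heartbeat(control_value, server_frame):
    """The message the server would broadcast every interval, changed or not."""
    return {"control_value": control_value, "frame": server_frame}

def apply_heartbeat(client_state, msg):
    """Client-side handler: switch videos or seek, as needed."""
    if client_state["control_value"] != msg["control_value"]:
        # Wrong video entirely (e.g. a missed update): switch and jump
        # straight to the server's current frame.
        client_state["control_value"] = msg["control_value"]
        client_state["frame"] = msg["frame"]
    elif abs(client_state["frame"] - msg["frame"]) > DRIFT_TOLERANCE:
        # Right video, but drifted too far: seek back into tolerance.
        client_state["frame"] = msg["frame"]
    return client_state

# A client that stalled at frame 100 while the server played on to frame 450:
client = {"control_value": 1, "frame": 100}
client = apply_heartbeat(client, make_heartbeat(1, 450))
print(client["frame"])  # -> 450
```

The point of the tolerance band is to avoid constant seeking: small offsets from network jitter are left alone, and only a headset that has genuinely fallen out of sync (like the ones that dropped off during the performance) gets pulled back.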

While it wasn’t the most successful first public display, it was a valuable experience from a software-evaluation standpoint.

