[Animated demo: SCAD-Short-Example-HD.gif]

VR Performer

Summary

At the September 2016 Seattle VR Hackathon, we combined a VR musical sandbox with an eight-speaker, cube-shaped ambisonics sound system and a mocap suit.  Using two HTC Vives, we created a "multiplayer" VR experience that put one user in the middle of a musical sandbox and surround sound system.  In the final stage of the project, the main player was able to trigger "sound toys" and move them through a full 360° around the listener, and up and down in elevation.

Technical Description

Max Patch

Cycling '74's Max was used for the audio engine portion of this piece.  Emitters 1 and 2 are "preprogrammed" musical sequences, i.e. backing tracks.  Emitter sounds 3-8 are triggerable.  All spatialization is interactive.  Below on the right is a "top-down" monitor of the emitter sounds.

Monitor screen for the Max audio environment: spatial encoding/decoding and OSC triggers from Unity *Made by Andrew Luck*

Event Messaging

The OSC protocol was used to send coordinates, triggers, and interaction values between Unity and Max/MSP via UDP.  Wil Bown integrated OSC into Unity with the help of "OSCSharp".

Here’s an example message array: 

/bubble/pop/aed 1 2.293 0.235 3.532 1

/bubble/pop/ is the namespace of the OSC message; the coordinate type is spherical, indicated by the aed suffix (azimuth, elevation, distance).  The next value, 1, indicates which sound emitter this message is for.  The three floats that follow are azimuth, elevation, and distance, respectively.  These are received from Unity in Max to encode the spatial information.  The final value is the group.  All of our sounds remained in group 1 for this project.
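
To make the layout concrete, here is a minimal C# sketch (illustrative only, not the hackathon code) of how a Unity emitter position could be turned into the azimuth/elevation/distance values above.  The angle conventions (degrees, azimuth measured around the vertical axis from the listener's forward direction) are assumptions, and the real build sent these values as OSC arguments via OSCSharp over UDP rather than as a formatted string.

    using UnityEngine;

    // Hypothetical helper: converts an emitter's position (relative to the listener)
    // into the azimuth/elevation/distance triple carried by the /bubble/pop/aed message.
    public static class AedMessage
    {
        public static string Build(int emitterId, Vector3 listenerPos, Vector3 emitterPos, int group = 1)
        {
            Vector3 offset = emitterPos - listenerPos;

            float distance   = offset.magnitude;
            // Azimuth: angle around the vertical axis, in degrees (convention assumed).
            float azimuth    = Mathf.Atan2(offset.x, offset.z) * Mathf.Rad2Deg;
            // Elevation: angle above the horizontal plane, in degrees.
            float horizontal = new Vector2(offset.x, offset.z).magnitude;
            float elevation  = Mathf.Atan2(offset.y, horizontal) * Mathf.Rad2Deg;

            // Argument order mirrors the example above: emitter, azimuth, elevation, distance, group.
            return string.Format("/bubble/pop/aed {0} {1:F3} {2:F3} {3:F3} {4}",
                                 emitterId, azimuth, elevation, distance, group);
        }
    }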

Spatialization

The ICST Ambisonics Tools package from the Zurich University of the Arts was chosen for spatial encoding/decoding.  Audio was mostly playback of recorded WAV files or synths, routed through absorption and Doppler processing into the ambisonics encoder.  The ambisonic signals are then decoded and discretely distributed over the speaker configuration.
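
For intuition about what the encoder does with those azimuth/elevation values, here is a first-order (B-format) panning sketch in C#.  It is a simplification: the ICST externals work at higher orders, include the absorption/Doppler/distance handling, and also perform the decode to the physical speaker positions.

    using System;

    // First-order (B-format) ambisonic panning gains for a mono source.
    // A minimal sketch of what an encoder does with azimuth/elevation;
    // the ICST package works at higher orders and also handles decoding.
    public static class FirstOrderEncoder
    {
        // azimuth and elevation in radians.
        public static (float w, float x, float y, float z) Gains(float azimuth, float elevation)
        {
            float w = 1f / (float)Math.Sqrt(2.0);                        // omnidirectional channel
            float x = (float)(Math.Cos(azimuth) * Math.Cos(elevation));  // front-back component
            float y = (float)(Math.Sin(azimuth) * Math.Cos(elevation));  // left-right component
            float z = (float)Math.Sin(elevation);                        // up-down component
            return (w, x, y, z);
        }
    }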

Playback

The loudspeaker system consisted of a variety of 5” studio monitors.  We level-matched them with a microphone test so each speaker played at equal volume, and arranged the system into the most perfect cube shape we could achieve.

Music

Gus McManus and Andrew Luck wrote the theme in our delirium at the hackathon. 

Environment & 3d Assets

Written by Lou Ward

Lou Ward and Alexander Moed worked on creating 3D assets and the atmosphere, and on making sure the mocap was up and running correctly. Lou was primarily involved in creating the aesthetic look and feel of the environment; Alexander created 3D assets and helped with any mocap needs. After deciding to build a Dalí & Kandinsky styled atmosphere, Alex and Lou made a low-poly environment so the human form could be the focal point. They experimented with physics to create more organic shapes and animations, but real-time physics could have been problematic and GPU-intensive, so Alexander also experimented with blend shapes in Maya to create more organic-looking instruments/objects. We created an array of objects even for anticipated future requests, e.g. avatars for a Samsung Gear VR spectator mode.

Mocap

The mocap system was the Perception Neuron motion capture suit.  The suit uses IMU sensors and is very sensitive to magnetic fields (sources of interference need to be kept three or more feet away). “The IMU is a single unit in the electronics module which collects angular velocity and linear acceleration data which is sent to the main processor”. The system has 32 sensors and differentiates itself by having very accurate finger tracking.

When a dancer got into the suit, we calibrated using the necessary poses, putting the body into the steady, A, T, and S poses. We then verified that the Axis Neuron software was broadcasting over TCP using BVH, one of the standard output formats for mocap. Evie Powell and Wil Bown did the networking, duplicating the mesh so it could be shown in the new scene. For the design of the character, we originally had a mesh with a bunch of squares and had problems with weighting. Evie Powell created a shader that took the mesh and put cubes on top of the vertices. This let the user pass through them without tons of clipping errors, and it made the human form more abstract.
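
A rough sketch of the "duplicate the mesh in the new scene" idea, with illustrative names rather than the actual hackathon code: each frame, the joint rotations coming off the Perception Neuron-driven rig are copied onto a second skeleton that lives in the shared scene.

    using UnityEngine;

    // Hypothetical mirroring component: copies joint rotations from the mocap-driven
    // rig onto a duplicated display skeleton every frame.
    public class SkeletonMirror : MonoBehaviour
    {
        public Transform[] sourceJoints;  // joints driven by the Perception Neuron stream
        public Transform[] targetJoints;  // matching joints on the duplicated display mesh

        void LateUpdate()
        {
            for (int i = 0; i < sourceJoints.Length && i < targetJoints.Length; i++)
            {
                targetJoints[i].localRotation = sourceJoints[i].localRotation;
            }
            // The root position follows the source so the dancer moves through the space.
            if (sourceJoints.Length > 0 && targetJoints.Length > 0)
                targetJoints[0].position = sourceJoints[0].position;
        }
    }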

About the Shader

The shader was a basic geometry shader that procedurally encases each original vertex in a cube.  Geometry shading happens between the vertex shader pass and the fragment shader pass, where additional triangles are generated on the GPU.  This allows for much faster render passes using much simpler geometry and smaller model file sizes.  The shader opens up a lot of opportunities for incorporating musical feedback into our visual style: we can use the cube shader to tie geometry size, shape, and color to variables like volume, BPM, and FFT data in a musical experience.
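
As a sketch of that musical feedback loop (with hypothetical property names, not the real shader's), audio levels sampled in Unity could be pushed straight into the cube shader's parameters each frame:

    using UnityEngine;

    // Sketch: sample the audio output each frame and drive the cube geometry
    // shader's size/color properties from the loudness.
    // "_CubeSize" and "_CubeColor" are placeholder property names.
    [RequireComponent(typeof(AudioSource))]
    public class CubePulse : MonoBehaviour
    {
        public Renderer cubeRenderer;
        float[] samples = new float[256];

        void Update()
        {
            var source = GetComponent<AudioSource>();

            // Rough loudness estimate (RMS) from the current output buffer.
            source.GetOutputData(samples, 0);
            float sum = 0f;
            foreach (float s in samples) sum += s * s;
            float rms = Mathf.Sqrt(sum / samples.Length);

            // Scale the procedural cubes and tint them with the level.
            cubeRenderer.material.SetFloat("_CubeSize", Mathf.Lerp(0.01f, 0.05f, rms * 10f));
            cubeRenderer.material.SetColor("_CubeColor", Color.Lerp(Color.cyan, Color.magenta, rms * 10f));
        }
    }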


User Experience

Written by Evie Powell

The user interactions with visual effects were designed by Evie Powell.  The primary interaction for the “main player” is to pick up and move musical tracks or musical effects that take the form of several abstract objects in the scene.  The music responds to the positional placement of these objects in real time with 1:1 spatial accuracy.  Because the player experiences all of this over the ambisonics system, spectators get to experience the real-time mixing live as well.

The dancer can also influence sound and visuals by touching things in the level. In this build, the dancer cannot directly place music.  Together this makes for a novel collaborative musical/artistic experience that is unlike anything you’ve experienced before, either as a VR user or as a spectator.


Player Interactions

NewtonVR is the basis for interaction, so everything has VR-friendly, physics-based behavior.  Picking up objects, placing them, and throwing them feels very natural in Pop Rocks.  The player has abstract, gender-neutral hands that are responsive and easily communicate their function without pesky tutorial sessions or UI text.

When a player squeezes the trigger, the hands respond by making a small “halfway gripping” gesture, which communicates that picking up an item is possible.  The hands fully grip and latch onto an interactive track when the player presses the trigger on an interactable object.

A player can also trigger a point interaction by reaching out significantly from their body.  If the player's hand is significantly far from their body center while pressing the trigger, the hand switches to a pointing state, which allows different types of interactions with the environment, like shooting sound particles or interacting with an object that is out of reach.  These controls were designed to feel natural and to inspire player experimentation.
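
A minimal sketch of that grip/point state logic, with placeholder thresholds and names (the actual interactions were built on top of NewtonVR):

    using UnityEngine;

    // Possible hand states described above.
    public enum HandState { Idle, HalfGrip, Grip, Point }

    // Sketch of the grip-vs-point decision; thresholds are illustrative.
    public class HandPose : MonoBehaviour
    {
        public Transform bodyCenter;              // e.g. the headset or a chest anchor
        public float pointReachThreshold = 0.6f;  // metres from body center -> pointing

        public HandState Evaluate(float triggerValue, bool touchingInteractable)
        {
            if (triggerValue < 0.1f)
                return HandState.Idle;

            // Reaching far from the body while squeezing switches to the pointing state.
            float reach = Vector3.Distance(transform.position, bodyCenter.position);
            if (reach > pointReachThreshold && !touchingInteractable)
                return HandState.Point;

            // A light squeeze shows the "halfway gripping" gesture; a full press on an
            // interactable latches into the full grip.
            return (triggerValue > 0.9f && touchingInteractable) ? HandState.Grip : HandState.HalfGrip;
        }
    }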

Parts List

This would have been impossible without all of the team contributions!

Future Work

I hope to find events to take this setup to, and to make more time for development sprint cycles.  Now that all the parts are communicating, and we are triggering dynamically with OSC, Unity, and ambisonics, we can really begin to explore new interfaces for sound and music in a social atmosphere.  I hope to bring physical modeling, procedural audio, and some kind of spectral synthesis/editing with parametric variables into the interactions.  I'm really excited about creating new ways to visualize loops and sequencing, orbital collisions, and new 3D virtual instruments.

1. Less "backtrack" and more improvisation
2. The experience itself could have a "story"
3. The triggers and overall Max project can be cleaned up a bit
4. More synthesis and sound sculpting, with visual feedback
5. More speakers, bigger space
6. Lasers for holograms, duh (stretch goal)
7. Integrate elevation into the ambisonics monitor

Credits

Andrew Luck - Audio Engineer, Max Patcher / Team Organizer / Concept
Evie Powell - Lead Software Engineer and UX Designer / Unity Developer
Lou Ward - Chief Visual Effects Designer/Mocap Master
Wil Bown - Unity OSC Implementation
Gus McManus - Sound Designer
Vida Powell - Playtester, Production Assistant

Without the CNMAT Max/MSP externals and the ICST Ambisonics package, this would not have been achievable.

Thank you to Adam Houghton for letting us borrow your Yamaha studio monitors!  And thanks to everyone for losing sleep and bringing your gear to get this all connected at the hackathon, where we covered a lot of ground!