Typical Unity + Google VR Setup for Android: Player Camera & Spatial Audio

Prerequisites: Minimum Phone Requirements for Google VR

Cardboard on an Android phone requires Android 4.1 (Jelly Bean) or higher.

Before building the project, change the Player Settings for Android: the Minimum API Level must be set to “Android 4.4 ‘KitKat’” (API level 19).

Google VR has its own GVR SDK libraries that talk to the hardware (compatible phones), but Unity doesn’t support these directly – Unity’s VR support is not built on top of them. The SDK therefore provides prefabs (camera emulation, reticle, event system) to make development easier.

 

Typical Unity + Google VR Setup for Android: Auto-Walk Player Camera

  1. Switch platform to Android
  2. Player Settings > Android > XR Settings > Virtual Reality Supported
  3. Import Google VR SDK: https://github.com/googlevr/gvr-unity-sdk/releases
  4. Drag the GvrEditorEmulator prefab from the GoogleVR folder into the scene
  5. Create an empty game object and name it PlayerCamera
  6. Select the main camera and copy component
  7. Select the PlayerCamera and paste component value onto it
  8. Make the Main Camera a child of PlayerCamera. Change its near clipping plane to 0.1 to prevent seeing through walls at close range (shadow resolution in the distance will decrease if the value is too low)
  9. GoogleVR > Prefabs > Cardboard > GvrReticlePointer – drag this onto the Main Camera to create the reticle UI
  10. Drag the GvrEventSystem prefab into the scene to handle reticle-based selection events
  11. Add a Rigidbody component and a Capsule Collider to PlayerCamera; change the collider’s height to around 1.6 and its centre Y to -1.1 so the camera sits at the top, outside of the collider.
  12. Freeze the Rigidbody’s rotation on X, Y and Z for PlayerCamera to prevent the collider from falling over.
  13. Move the PlayerCamera forward automatically with code. Example: transform.position += Camera.main.transform.forward * walkingSpeed * Time.deltaTime;
  14. Change the UI Canvas’ Render Mode to World Space and place it in front of the camera so it is visible.
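Steps 11–13 can be sketched as a single MonoBehaviour attached to PlayerCamera. This is a minimal sketch; the class and field names (AutoWalk, walkingSpeed) are illustrative, not part of the GVR SDK:

```csharp
using UnityEngine;

// Attach to the PlayerCamera object (the parent of the Main Camera).
// Assumes the Rigidbody and Capsule Collider from steps 11-12 are present.
[RequireComponent(typeof(Rigidbody))]
[RequireComponent(typeof(CapsuleCollider))]
public class AutoWalk : MonoBehaviour
{
    public float walkingSpeed = 1.5f; // metres per second

    void Update()
    {
        // Move in whatever direction the player is currently looking.
        transform.position += Camera.main.transform.forward * walkingSpeed * Time.deltaTime;
    }
}
```

Because the Rigidbody’s rotations are frozen (step 12), the capsule collider stays upright while the head direction alone steers the walk.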

TIPS: 

  • Avoid motion sickness by avoiding acceleration – acceleration creates dissonance between the visual perception and the feeling of sitting still in a chair. Use ForceMode.VelocityChange when calling AddForce (an instantaneous velocity change that is completely mass-independent).
  • Use Debug.DrawRay(start, direction, colour) to visualise vectors in the Scene view while debugging.
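Both tips can be combined in one sketch. The names here (ComfortMove, moveSpeed) are illustrative assumptions; the point is the ForceMode.VelocityChange call and the debug ray:

```csharp
using UnityEngine;

[RequireComponent(typeof(Rigidbody))]
public class ComfortMove : MonoBehaviour
{
    public float moveSpeed = 2f;
    Rigidbody rb;

    void Start()
    {
        rb = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        Vector3 dir = Camera.main.transform.forward;

        // Compute the change needed to reach the target horizontal velocity
        // in a single step; leave the vertical axis to gravity.
        Vector3 delta = dir * moveSpeed - rb.velocity;
        delta.y = 0f;

        // VelocityChange applies the velocity instantly and ignores mass,
        // so the player never perceives a gradual acceleration ramp.
        rb.AddForce(delta, ForceMode.VelocityChange);

        // Visualise the look/movement vector in the Scene view.
        Debug.DrawRay(transform.position, dir, Color.red);
    }
}
```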

 

Google VR Spatial Audio Sources

Spatial audio adds spatial awareness to the VR game: the player can distinguish which direction a sound comes from through the headphones, because the engine simulates the time and volume differences between the two ears caused by the distance and direction of the source.

Spatial audio can be used as a guide to help the viewer look in the right direction: for example, position the audio source behind the menu.

 

Using Google Resonance Audio SDK with Unity

  1. Download Google’s Resonance SDK and import it into the project: https://developers.google.com/resonance-audio/develop/unity/getting-started
  2. Edit > Project Settings > Audio: Set the Spatializer Plugin and the Ambisonic Decoder Plugin to Resonance Audio
  3. Attach the ResonanceAudioListener script to the main camera – this acts as the listener that receives the radiated 3D sound.
  4. Attach the ResonanceAudioSource script to the audio source game object – the desired object that will radiate or produce 3D sound.
  5. Assign the Resonance Master mixer as the AudioMixerGroup for the audio source.
  6. Change the Spatial Blend from 2D to 3D.
  7. Check the Spatialize option and Spatialize Post Effects option.
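Steps 5–7 are normally done in the Inspector, but the same AudioSource properties can be set from script. A minimal sketch, assuming the Resonance Master mixer group is assigned in the Inspector (the class name SpatialSourceSetup is illustrative):

```csharp
using UnityEngine;
using UnityEngine.Audio;

[RequireComponent(typeof(AudioSource))]
public class SpatialSourceSetup : MonoBehaviour
{
    // Assign the Resonance Audio "Master" mixer group in the Inspector.
    public AudioMixerGroup resonanceMaster;

    void Awake()
    {
        AudioSource src = GetComponent<AudioSource>();
        src.outputAudioMixerGroup = resonanceMaster; // step 5
        src.spatialBlend = 1f;                       // step 6: 0 = 2D, 1 = fully 3D
        src.spatialize = true;                       // step 7: route through the Spatializer plugin
        src.spatializePostEffects = true;            // step 7: spatialize post effects too
    }
}
```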

 

Legacy GVR Spatial Audio Workflow (without the Google Resonance Audio SDK):

  1. Ensure the spatializer plugin is set (Edit > Project Settings > Audio > Spatializer Plugin: GVR Audio Spatializer). The sound intensity then increases as the player gets closer to the source.
  2. The GvrAudioSource prefab replaces Unity’s audio source. Alternatively, attach the GvrAudioSource script to the desired audio source (a GvrAudioListener must be attached to the main camera).
  3. Adjust the alpha (directivity) and sharpness of the source – the sound then differs depending on where the player is in the environment.

 

UI in VR Environments

  • Use a world-space UI Canvas for VR environments – spatial elements placed around the environment that don’t move with the player
  • Create an introductory splash screen and chevrons that direct the player’s attention towards the important UI if they happen to be looking in the wrong direction.
  • Place the UI at an optimal distance for player comfort: 0.75 – 3m, then scale the UI appropriately
  • Create a cube of unit scale, put it 3 metres in front of the camera and use it as a reference for scaling UI
  • Create gaze-based buttons and use GvrReticlePointer to interact with them – the button cannot be touched directly. The GvrReticlePointer should be set as a child of the Main Camera so it moves with the player’s head. It should change colour and enlarge when it comes into contact with interactive game objects.
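Because GvrPointerInputModule translates gaze into standard Unity pointer events, a gaze button can react through the usual EventSystems interfaces. A minimal sketch (the class name GazeButton and the colours are illustrative):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

// Attach to a world-space UI Button; GvrPointerInputModule delivers these
// pointer events when the reticle enters, leaves, or "clicks" the button.
[RequireComponent(typeof(Image))]
public class GazeButton : MonoBehaviour,
    IPointerEnterHandler, IPointerExitHandler, IPointerClickHandler
{
    Image image;

    void Awake()
    {
        image = GetComponent<Image>();
    }

    public void OnPointerEnter(PointerEventData eventData)
    {
        image.color = Color.yellow;   // highlight while gazed at
    }

    public void OnPointerExit(PointerEventData eventData)
    {
        image.color = Color.white;    // restore when the gaze leaves
    }

    public void OnPointerClick(PointerEventData eventData)
    {
        Debug.Log("Gaze button activated");
    }
}
```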

IMPORTANT: If the UI is too close, the player will feel uncomfortable and end up cross-eyed! Don’t place frequently needed UI very close to the camera.

When the GvrViewerMain prefab is added, the editor’s VR preview only renders 3D GameObjects properly, not the Canvas UI elements. The 3D GameObjects and 2D UI elements do work together once the game is actually played.

Unity’s default event system does not work properly with VR games.

Event system = a way of sending events to / interacting with GameObjects and UI elements inside the game, using input from the keyboard, mouse, touch, or custom events. To handle input events inside the VR environment, the GvrEventSystem prefab from the GVR SDK must be used (with its Gvr Pointer Manager and Gvr Pointer Input Module) for the reticle pointer to work properly.

 

Useful Links

https://developers.google.com/vr/develop/android/get-started

https://github.com/googlevr/gvr-unity-sdk

https://developers.google.com/resonance-audio/develop/overview

Google Resonance Audio

https://developers.google.com/resonance-audio/

 
