VR Escape Room
Overview
The Escape Room sample demonstrates an approach to developing physical interactions between players and the environment.
The Core provides basic interactions and locomotion while optional modules offer additional features and interaction types as sample implementations.
All modules have their own folders and can be removed if not needed. Note that the main scene provided in this sample (/Scenes/EscapeRoom Full.scene) makes use of all the modules and components and might break if modules are deleted.
The modules are described separately at the end of this document.
The Modules:
- Object Pull
- Instant Camera
- Whiteboard
- Observer
- UI Module
- Slots Module
- Shooting Module
Technical Info
- This sample uses the Server/Hosted Mode topology.
- The project has been developed with Unity 2020.3.37f1.
Before you start
To run the sample:
- Create a Fusion AppId in the PhotonEngine Dashboard and paste it into the App Id Fusion field in the Real Time Settings (reachable from the Fusion menu).
- Create a Voice AppId in the PhotonEngine Dashboard and paste it into the App Id Voice field in the Real Time Settings.
- Then load the Start scene and press Play. You can either launch the complete scene or test a specific module only.
Download
Version | Release Date | Download
---|---|---
1.1.3 | Oct 20, 2022 | Fusion VR Escape Room 1.1.3 Build 5
Handling Input
Meta Quest
- Teleport : press the A or X button to display a pointer. You will teleport onto any accepted target on release.
- Touch : simply put your hand/finger over a button to toggle it.
- Grab : first put your hand over the object, then grab it using the controller grab button.
- Use : press the select button to use the grabbed object.
Please note that the HTC Vive and Valve Index controllers should also be supported.
Desktop
Keyboard
- Walk : WASD
- Up/Down : space & left Shift
Mouse
- Rotate : keep the right mouse button pressed and move the mouse to rotate the point of view
- UI : use the left mouse button to press a UI button
- Grab & use (3D pens) : put the mouse over the object and grab it using the left mouse button. You can then use it with the keyboard space key.
Core
The core of the sample provides physics-based interactions with Rigidbodies and Articulation Bodies.
Rigidbodies are used for any throwable objects in the world, while Articulation Bodies are used for mechanical systems like levers, dials, doors and drawers.
The camera is not based on physics but only on user input. Teleport locomotion is included (TeleportHandler.cs).
Interaction flow overview:
- Poll & send input (XRInput.cs, LocalController.cs, PCInput.cs)
- Update head and hand positions based on input, using rigidbodies and forces (PlayerSystem.cs, XRObject.cs)
- Hand reads input and passes it on to HandTool and any IControllerInputReceiver (Hand.cs, IControllerInputReceiver)
- HandTool and other components react to input (HandTool.cs, TeleportHandler.cs, InstantCameraInteractable.cs)
- HandTool checks if a hotspot is in range and grabs / releases based on input (HotspotCollector.cs, Hotspot.cs, HighlightBase.cs)
- Hotspot tells InteractableBase and all IInteractable when starting / stopping an interaction (InteractableBase.cs, GrabbableBase.cs)
- GrabbableBase tracks the hand position using forces (GrabbableBase.cs, GrabbableRigidbody.cs, GrabbableArticulation.cs)
Input:
Sending Input
LocalInputBase.cs exists independently of Fusion as a DontDestroyOnLoad object and persists between sessions. It is responsible for collecting input and sending it to the Runner.
XRInput is the main implementation of this. PCInput is a debug input method for quicker iteration, with limited features.
Receiving Input
PlayerSystem.cs handles basic positioning of player objects (head and hands) via XRObject.cs.
XRObject updates rigidbody forces to track the input.
Hand.cs reads the controller input and passes it on to anything registered to receive input via IControllerInputReceiver.
It explicitly passes input to HandTool.cs for basic interactions.
- Systems that want to receive input can implement IControllerInputReceiver and register with the hand (e.g. TeleportHandler.cs); see the sketch below.
- Grabbed objects can temporarily register (e.g. InstantCameraInteractable.cs) -> Hand.AddInputBehaviour() / Hand.RemoveInputBehaviour()
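As a rough illustration of that registration pattern, here is a minimal sketch. Hand, IControllerInputReceiver, AddInputBehaviour() and RemoveInputBehaviour() are named in this document, but the interface's actual members are not, so the callback below is a hypothetical placeholder and the real signatures in the sample may differ.

```csharp
using UnityEngine;

// Sketch only: a component that registers with the Hand to receive controller
// input while it is enabled (the same pattern TeleportHandler.cs uses).
public class ExampleInputReceiver : MonoBehaviour, IControllerInputReceiver
{
    [SerializeField] private Hand hand;

    private void OnEnable()
    {
        if (hand != null) hand.AddInputBehaviour(this);    // named in this document
    }

    private void OnDisable()
    {
        if (hand != null) hand.RemoveInputBehaviour(this); // named in this document
    }

    // Hypothetical callback; the sample's IControllerInputReceiver defines the real one.
    public void OnControllerInput(/* per-tick input struct defined by the sample */)
    {
        // React to buttons here, e.g. trigger a custom tool action.
    }
}
```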
Interactions
HandTool:
- Reads Input for Grab / Drop
- Uses HotspotCollector to find Hotspots and highlight them
- Grabs / drops Hotspots
HotspotCollector:
- Finds Hotspots within a radius based on Layers and Tags
- Filter Order: Layer, Tag, HighlightPriority, Distance (see the sketch below)
- Highlights hotspots being hovered (HotspotCollector -> Hotspot -> HighlightBase)
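As an illustration of that filter order, a minimal sketch follows. It is not the sample's actual HotspotCollector code; in particular, HighlightPriority as a property on Hotspot is an assumption here.

```csharp
using System.Linq;
using UnityEngine;

// Illustrative ranking of hotspots found inside the collector's radius,
// following the documented order: Layer, Tag, HighlightPriority, Distance.
public static class HotspotFilterExample
{
    public static Hotspot PickBest(Hotspot[] inRange, Vector3 handPos, LayerMask layers, string tag)
    {
        return inRange
            .Where(h => ((1 << h.gameObject.layer) & layers.value) != 0)  // 1. Layer
            .Where(h => h.CompareTag(tag))                                // 2. Tag
            .OrderByDescending(h => h.HighlightPriority)                  // 3. HighlightPriority (assumed member)
            .ThenBy(h => Vector3.Distance(h.transform.position, handPos)) // 4. Distance
            .FirstOrDefault();
    }
}
```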
Hotspot:
Represents an interaction point through which HandTools can interact with objects.
- One hand per Hotspot: if an object needs to be grabbable with more hands, add more hotspots. These hotspots can be identical if need be.
- Passes Start / Stop Interaction calls to IInteractable on the same GameObject and InteractableBase in the parent.
InteractableBase:
Base class for interactable objects.
- Gets Start / Stop Interaction calls
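A minimal sketch of how an object could react to those calls is shown below, assuming InteractableBase exposes overridable Start / Stop Interaction methods; the method names and parameters here are assumptions, so check InteractableBase.cs for the real signatures.

```csharp
using UnityEngine;

// Hypothetical example: light up an object while a hand is interacting with it.
public class GlowOnInteract : InteractableBase
{
    [SerializeField] private Renderer glowRenderer;

    // Assumed override points corresponding to the Start / Stop Interaction calls.
    public override void StartInteraction(HandTool hand)
    {
        base.StartInteraction(hand);
        glowRenderer.material.EnableKeyword("_EMISSION");  // enable emission while held
    }

    public override void StopInteraction(HandTool hand)
    {
        base.StopInteraction(hand);
        glowRenderer.material.DisableKeyword("_EMISSION");
    }
}
```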
Object Hierarchy for Grabbable Object:
- Root: InteractableBase, Rigidbody, NetworkRigidbody, Highlight, NetworkObject, BodyProperties
  - Visuals
    - Highlight Visuals
  - Collision
  - Hotspot: Hotspot, GrabbableRigidbody / AttachmentRigidbody, Collider (Trigger)
BodyProperties
Assign a BodyPropertyCollection (scriptable object) to control how the object behaves when grabbed.
BodyPropertyCollection contains an array of BodyPropertyData which is applied to the grabbed object depending on how many hands grab the object at the time (0: not grabbed, 1: grabbed by one hand, n: grabbed by n hands).
This way objects can be made unwieldy or too heavy with one hand, but easy to handle with two or more.
- BodyPropertyData:
  - Mass
  - Drag
  - AngularDrag
  - JointFrictionArticulation (only applicable to Articulation Bodies)
  - UseGravity
  - OverrideInertia (if set, the inertia will be set to InertiaWhenGrabbedScale * Vector3.one)
  - InertiaWhenGrabbedScale (controls how much the object resists rotation changes)
  - VelocityExtrapolation (how much the current velocity is counteracted when applying the new force to the object; also used as a multiplier for the force itself)
  - TorqueExtrapolation (how much the current angular velocity is counteracted when applying the new force to the object)
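To make the hand-count lookup concrete, here is a small sketch of the idea behind the collection. The types below are illustrative stand-ins, not the sample's BodyPropertyCollection / BodyPropertyData classes, and only a subset of the fields listed above is shown.

```csharp
using UnityEngine;

// Illustrative data entry: a subset of the properties listed above.
[System.Serializable]
public class ExampleBodyPropertyData
{
    public float mass = 1f;
    public float drag = 0f;
    public float angularDrag = 0.05f;
    public bool useGravity = true;
}

public static class ExampleBodyPropertyApplier
{
    // Index 0 = not grabbed, 1 = one hand, n = n hands (clamped to the last entry).
    public static void Apply(ExampleBodyPropertyData[] entries, int grabbingHands, Rigidbody body)
    {
        var data = entries[Mathf.Min(grabbingHands, entries.Length - 1)];
        body.mass = data.mass;
        body.drag = data.drag;
        body.angularDrag = data.angularDrag;
        body.useGravity = data.useGravity;
    }
}
```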
GrabbableBase (on Hotspot):
Handles Tracking the object to the hand position when grabbed.
Choose the appropriate implementation depending on the body type:
- GrabbableRigidbody
- GrabbableArticulation
AttachmentRigidbody (on Hotspot GameObject):
Attaches a Rigidbody object to a HandTool. This is used for light objects that need to be very snappy (e.g. Whiteboard Markers, InstantCamera Pictures, Pistol).
For this to work, the object requires a certain structure:
- Root
  - Visuals
  - Collision/Logic
    - Colliders
  - Hotspot: AttachmentRigidbody, Trigger Collider
On grabbing, the Collision and Visuals GameObjects are directly attached to the Hand and act as if they are part of it.
Highlights
Visual confirmation that an interactable is in range of a hand and can be interacted with.
Value Provider / Reader
A value provider is a generic component to transfer information from one system to another and make simple logic chains without specialized components.
Example:
- Read a value from an articulation body (ArticulationBodyValueReaderFloat.cs)
- Compare it to a threshold (ValueLogicIntCompare.cs)
- Use that value to drive something else (ArticulationBodyDriverSetLimits.cs)
StreamTextureManager
Required for modules. System to send texture data across the network using the OnReliableData callback. Needs to be on the same GameObject as the Runner.
IgnoreCollision
Sometimes collisions need to be ignored between certain objects using the Physics.IgnoreCollision API.
- IgnoreCollision.cs : static ignore. Does not change during gameplay. The collider on the GameObject with the script ignores all colliders in the list.
- NetworkColliderCollection.cs : group of colliders that can be ignored.
- NetworkIgnoreCollision.cs : use AddIgnore() and RemoveIgnore() to ignore the local colliders against all assigned NetworkColliderCollections.
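A minimal sketch of the static-ignore idea (the behaviour IgnoreCollision.cs is described as having) using Unity's Physics.IgnoreCollision API; field names here are illustrative rather than the sample's exact ones.

```csharp
using UnityEngine;

// The collider on this GameObject stops colliding with every collider in the list.
public class IgnoreCollisionExample : MonoBehaviour
{
    [SerializeField] private Collider[] collidersToIgnore;

    private void Start()
    {
        var own = GetComponent<Collider>();
        foreach (var other in collidersToIgnore)
        {
            // Unity API: disables contact generation between the two colliders.
            Physics.IgnoreCollision(own, other, true);
        }
    }
}
```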
Articulation Bodies
This sample makes heavy use of Articulation Bodies. They are a stable way to simulate connected bodies in Unity and are used here for things like levers, buttons, dials and drawers.
The implementation of the NetworkArticulationBody is not feature complete in this sample.
See https://docs.unity3d.com/2020.3/Documentation/ScriptReference/ArticulationBody.html for reference.
Core Components
- NetworkArticulationBodyRoot is placed on the Root.
- NetworkArticulationBody is placed on all child Articulation Bodies.
- NetworkedArticulationDrive networks the Drive properties of the Articulation Body.
- ArticulationBodyValueReaderSingleAxis can read the value of an Articulation Body to be used in the Value Provider system (see the sketch below).
- ArticulationBodySingleAxisSoftSnap dynamically sets the drive property to snap to certain points, e.g. notches in a dial, soft-close on drawers, fixed valid positions on levers.
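As a reference point for the single-axis reading mentioned above, here is a simplified sketch. It is a stand-in for ArticulationBodyValueReaderSingleAxis, not the sample's actual implementation, and it assumes the joint is configured on the X drive.

```csharp
using UnityEngine;

// Reads the current single-axis value of an Articulation Body, e.g. a lever angle,
// and normalizes it against the drive limits so it can feed a value provider chain.
public class ArticulationAxisReadExample : MonoBehaviour
{
    [SerializeField] private ArticulationBody body;

    // For a single-degree-of-freedom joint (revolute/prismatic) the reduced
    // joint space has exactly one entry.
    public float CurrentValue => body.jointPosition[0];

    // For a revolute joint the value is in radians while the drive limits are in
    // degrees, hence the conversion; for a prismatic joint drop Rad2Deg.
    public float NormalizedValue
    {
        get
        {
            var drive = body.xDrive;
            return Mathf.InverseLerp(drive.lowerLimit, drive.upperLimit, CurrentValue * Mathf.Rad2Deg);
        }
    }
}
```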
Local Interpolation
The NetworkArticulationBody inherits from NetworkTransform and provides a stable local interpolation. For this to work as intended, the Interpolation targets need to be set and have to mirror the hierarchy of the Articulation Bodies.
Structure
- Root: NetworkObject, ArticulationBody, NetworkArticulationBodyRoot, InteractableBase
  - Visuals: follow the same hierarchy as the Articulation system.
  - Child: ArticulationBody, NetworkArticulationBody (optional: BodyProperties, ArticulationBodySoftSnap, ArticulationBodyValueReaderSingleAxis)
    - Child: (optional, same components as above)
    - Hotspot: Hotspot, Trigger Collider, GrabbableArticulation
The BodyProperties component is required at least once in the Hotspot's parent hierarchy; each Hotspot will search for the first one in its parents.
Connecting an Articulation Body to a rigidbody
If an Articulation Body needs to be attached to a Rigidbody, it has to be done with a Joint (e.g. a ConfigurableJoint).
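A minimal sketch of that connection, assuming Unity's Joint.connectedArticulationBody property (available in the Unity version this sample targets); the exact motion and drive settings depend on the mechanism you are building.

```csharp
using UnityEngine;

// Attach the Rigidbody on this GameObject to an Articulation Body link via a ConfigurableJoint.
public class AttachToArticulationExample : MonoBehaviour
{
    [SerializeField] private ArticulationBody target;

    private void Start()
    {
        var joint = gameObject.AddComponent<ConfigurableJoint>();
        joint.connectedArticulationBody = target;

        // Lock linear motion so the Rigidbody stays fixed to the articulation link.
        joint.xMotion = ConfigurableJointMotion.Locked;
        joint.yMotion = ConfigurableJointMotion.Locked;
        joint.zMotion = ConfigurableJointMotion.Locked;
    }
}
```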
Known issues
- Don't use Compute Parent Anchor on Articulation Bodies: they will be different on clients when they are not in their initial state.
- A bug in Articulation Bodies results in different indices for the articulation hierarchy. We need them to be the same across clients to properly sync them. A temporary fix is implemented in NetworkArticulationBodyRoot.Spawned(), which reorders them into the correct order.
  - This "hackfix" sadly disconnects any Configurable Joints between rigidbodies and articulation bodies (e.g. the Pistol in the shooting module). The reordering is therefore disabled for this object.
Modules
Object Pull
Pull Objects towards you by holding the grab button, pointing at the Object and flicking the hand upwards.
Requirements:
- ObjectPullCollector prefab on the HandTool
- ObjectPullBody component on a Hotspot; also requires a GrabbableBase component (GrabbableRigidbody)
Instant Camera
Instant camera showcasing more complex interactable objects (additional input) and the OnReliableData callbacks for sending larger data across the network.
- InstantCameraInteractable : takes a picture by capturing the camera render texture and spawns a printout for that texture.
- InstantCameraPrintout : waits for the texture to arrive, then fades it in using a shader. Additionally, the texture is resent to clients that join after the picture was initially taken.
- InstantCameraPicture.shader : simple surface shader to fade in the picture after it is received.
Whiteboard
Whiteboard that can be drawn on.
- WhiteboardSurface : surface to draw on. Saves its current state as a RenderTexture.
- WhiteboardMarker : interactable. Looks through surfaces (bounds) and tells the surface to paint a stroke when close.
Shaders
- WhiteboardFill : initializes the whiteboard with a given texture
- WhiteboardStroke : draws a straight line on a RenderTexture
- WhiteboardStrokeEraser : erases in a straight line on a RenderTexture
Observer
Mouse / Keyboard controller for observing VR players and interacting with the world.
Requirements:
- Observers use a different input rig and player prefab (see StartModulebutton.cs)
- Add IObserverable components to objects you want to manipulate:
  - ObserverableGrabRigidbody for dragging rigidbodies
  - ObservableArticulationBodyButton adds a constant force (can be toggled)
  - ObservableArticulationBodyDrawer adds a force for a duration and keeps a bool state (flips the force when pressed again)
  - ObservableArticulationBodyLever adds a force while held
  - ObservableValueProviderOverride sets the float value directly (with an override toggle). Must be injected into the logic chain to have an effect.
Floating UI windows are automatically created for all IObserverable components using CreateObserverUi(). Custom UI can be implemented here. Default components are defined in the UI_ObserverItem prefab.
Observer Input
Manipulation of observable objects is sent directly as input (see ObserverInput / ObserverInputHandler.cs).
Input consists of:
- NetworkBehaviourId to identify the component being modified
- Value (currently only float, bool and Vector3); a sketch of such an input struct follows below
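An illustrative input struct for this idea is sketched below; it assumes Fusion's INetworkInput, NetworkBehaviourId and NetworkBool types and is not the sample's actual ObserverInput definition.

```csharp
using Fusion;
using UnityEngine;

// Sketch: the id of the observable NetworkBehaviour being manipulated plus a
// small set of value fields, matching the description above.
public struct ExampleObserverInput : INetworkInput
{
    public NetworkBehaviourId TargetBehaviour; // which observable component to modify
    public float FloatValue;
    public NetworkBool BoolValue;              // Fusion's network-safe bool
    public Vector3 VectorValue;
}
```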
UI Module
Basic networked cursors and UI interactions. The module uses Unity's 'Tracked Device Raycaster' and 'InputSystem UI Input Module' to register canvas interactions locally and send them as input.
For use with the PC rig, the 'Graphics Raycaster' also needs to be present.
LocalController.OnInputUi()
- Saves the Canvas being pointed at to InputDataController.CanvasBehaviour (NetworkBehaviourId)
- Saves the world position of the pointer colliding with the Canvas to InputDataController.CursorPosition
- If the pointer hits a GameObject with a NetworkedUiButton component, it is saved to InputDataController.UiInteractionBehaviour (NetworkBehaviourId)
UiPointerHandler
- Reads the input and sends the pointer position and interaction on to the corresponding NetworkedCanvas and NetworkedUiButton.
Slots Module
The slottable system is used to guide objects smoothly into predetermined positions / orientations. Think: Plugs, Keys, Puzzle pieces, Magazines, etc.
To achieve this, GrabbableBase.cs has various delegates and callbacks to override the target position the object wants to reach while grabbed (a minimal sketch follows the list below):
- RotationDelegate GetRotationMethod
- GrabAndTargetPositionDelegate GetGrabAndTargetPositionMethod
- UnityAction PreTrackCallback
- UnityAction PostTrackCallback
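A minimal sketch of hooking into two of these callbacks is shown below. The document names the members but not how they are assigned, so treating PreTrackCallback / PostTrackCallback as subscribable UnityAction fields is an assumption; check GrabbableBase.cs for the actual mechanism.

```csharp
using UnityEngine;

// Illustrative component that plugs its own logic into the GrabbableBase callbacks.
public class ExampleTrackCallbacks : MonoBehaviour
{
    [SerializeField] private GrabbableBase grabbable;

    private void OnEnable()
    {
        grabbable.PreTrackCallback += OnPreTrack;    // before tracking forces are applied
        grabbable.PostTrackCallback += OnPostTrack;  // after tracking forces are applied
    }

    private void OnDisable()
    {
        grabbable.PreTrackCallback -= OnPreTrack;
        grabbable.PostTrackCallback -= OnPostTrack;
    }

    private void OnPreTrack()  { /* e.g. measure the distance to a slot */ }
    private void OnPostTrack() { /* e.g. decide whether the object should be released */ }
}
```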
Slottable
The Slottable component overrides these callbacks and changes the desired position and rotation depending on the current targets in relation to the Slot.
- PreTrack : checks if the object is close enough to a slot to be affected and calculates two factors for later use:
  - _slotFactorHand : distance from the hand to the slot. This is used to determine the target position and rotation the object should assume.
  - _slotFactorObject : distance from the object to the slot. This is used to determine if the object is slotted and should be released, and if the object can still be rotated (> RotationLockThreshold).
- PostTrack : checks if the object should be dropped.
The Object Factor is the actual position of the object while the hand factor is determined from the input position of the hand before any forces are applied.
The Slottable gets the desired position / rotation from the Slot depending on _slotFactorHand and _slotFactorObject.
Additional Components:
- GrabbableRigidbody
- HotspotCollector to find valid Slots
- (optional) NetworkColliderCollection : assign colliders to be ignored when near a Slot (they are ignored against colliders set in the slot and, if set, against hands interacting with hotspots)
Slot
The Slot component contains the data on how the object should behave when in range.
Curves determine how suddenly the position / rotation changes from the object position to the desired slot position.
It needs at least one collider marked as a trigger with the correct Layer to be picked up by the Slottable's HotspotCollector.
HotspotsToIgnoreHandCollisionWhenGrabbed can be used to ignore hands grabbing the hotspots.
This is useful when space is tight and you want to ignore the hand grabbing the Slot.
Example: one hand is holding a pistol; the magazine and the slot for it are very close to the hand, so under normal circumstances the magazine would collide with the hand.
Add the hotspot of the handle here to ignore the hand collision as well.
UseLineForRange can be set to use a line segment as the slottable target instead of a point.
DropType determines how an object is handled when it is successfully slotted.
- Kinematic : used for objects getting slotted into static level geometry. Objects are turned kinematic and can't be moved unless grabbed again.
- Destroy : the object gets despawned.
- (TeleportOffsite : not tested well; could be used for objects getting slotted on rigidbodies so they can be repositioned instantly when grabbed again. Will need some magic on the receiving end to render the slottable in place.)
Slots determine if the Slottable can be slotted by checking various parameters:
- SlotType matches the Slot
- no other Slottable is already present (or AllowMultipleObjects is set)
- more restrictions could be added in Slot.CanSlot()
Components:
- NetworkIgnoreCollision (optional): the assigned colliders can be ignored for the slotted object. This can be used to enable the Slottable to go through the wall while the hand itself or other objects not fitting the slot can not.
Shooting Module
Sample implementation of how to do gun-like interactables. Two examples are provided.
Shotgun
- Multiple bullets at once
- Heavier physics body using GrabbableRigidbody
- Reloading single shells
Pistol
- Manual cocking using articulation bodies
- Reloading full magazine
- Grabbing using AttachmentRigidbody
Good to know
Scene Setup
The Connector.cs and LocalRigSpawner.cs are present in every scene. If no local rig is spawned or pre-defined in the scene, one will be spawned (either XR or PC) and a runner will be created and connected.
For the Start.scene
the runner starts in SinglePlayer mode. This way we have full access to our game code without extra work.
Hand and object visual interpolation:
Under different circumstances the regular interpolation of the NetworkRigidbody is overwritten in Render():
- If the hand is not grabbing or colliding with anything, the local controller position is set. This avoids any delay that occurs when moving the hand object using forces (XRObject.cs, UpdateRender()).
- If the hand is grabbing an object, the object becomes the de facto authority for the visual hand position. The hand is placed relative to the grabbed object, taking the initial offset when the interaction started into account (GrabbableBase.cs, Render()).
- If the object is released, the visual hand is lerped back to its actual position (HandTool.cs, Render()).
Common issues
My grabbable object is flying all over the place / rotating weirdly : Check the following settings to make sure forces don't build up or oscillate. They are tweakable in the editor while playing.
- BodyProperties : each grabbable object has a BodyProperties component where physics properties are set and synced. The properties themselves are scriptable objects and can be reused across similar objects. Tweak the values to get the weight and feel you want the object to have. When objects get out of control, it can be either the position or the rotational component that is oscillating. Within the GrabbableBase component you can toggle whether to track position and/or rotation to find out what causes the issue and tweak accordingly.
  - Position: Mass, Drag, UseGravity, Velocity Extrapolation
  - Rotation: Mass, Angular Drag, Inertia, Torque Extrapolation
- GrabbableRigidbody, "Set Center Of Mass To Grab Point" : in the Grabbable Rigidbody component you can toggle whether the center of mass is set to the grab point when grabbing. This has a big influence on the rotational characteristics of an object and might require different BodyProperties. If similar objects with the same BodyProperties behave differently, this might be the culprit.