Avatar
The Avatar addon provides a common base for avatar-specific functionality, along with a simple avatar system that demonstrates how to let users customize their own avatar.
Avatar
The `AvatarRepresentation` class offers the following capabilities:
- subscription to `UserInfo` changes to the avatar's name or URL
- selection of the proper avatar system for a given URL
- support for LOD (see "LOD Support" below)
- definition of a common avatar description (color, hair, ...) through the `IAvatar` interface
- broadcasting of the avatar-loading-finished event to the `NetworkRig` game object's children implementing `IAvatarRepresentationListener` (and also to the `HardwareRig`'s children, if a `RigInfo` system is in place to track the hardware rig reference)
LOD Support
A `LODGroup` can be added as a sibling to the `AvatarRepresentation`; in this case, `AvatarRepresentation` offers these additional capabilities:
- addition and removal of the avatar systems' dynamically created renderers to/from the `LODGroup`
- while an avatar is loading, or if a URL is erroneous, display of a higher-level LOD (based on `loadingMode`; by default, only for remote users)
- control over whether the `LODGroup` is activated or not, with the `IgnoreDistance` method
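For reference, registering a dynamically created renderer into a sibling `LODGroup` mostly relies on standard Unity `LODGroup` calls. The sketch below is only an illustration of that mechanism under simple assumptions (arbitrary LOD index, no addon-specific bookkeeping), not the addon's actual implementation:

```csharp
using System.Linq;
using UnityEngine;

// Illustrative sketch: append a runtime-created renderer to one LOD level of a
// LODGroup. This only demonstrates the Unity API involved; the addon's own
// registration logic is more complete.
public class LodRegistrationSketch : MonoBehaviour
{
    public void AddRendererToLod(LODGroup lodGroup, Renderer newRenderer, int lodIndex)
    {
        LOD[] lods = lodGroup.GetLODs();
        if (lodIndex < 0 || lodIndex >= lods.Length) return;

        // Append the new renderer to the requested LOD level
        var current = lods[lodIndex].renderers ?? new Renderer[0];
        lods[lodIndex].renderers = current.Append(newRenderer).ToArray();

        lodGroup.SetLODs(lods);
        lodGroup.RecalculateBounds();
    }
}
```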
Avatar Representation listeners
The addon offers several components implementing `IAvatarRepresentationListener` that react to avatar loading.
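As an illustration, a custom listener could be a simple component placed in the `NetworkRig` hierarchy. The callback name and signature used below are assumptions (modeled on the `OnRepresentationAvailable` callback mentioned later for `LowPolySimpleAvatar`); check the `IAvatarRepresentationListener` interface in the addon sources for the actual contract:

```csharp
using UnityEngine;

// Hypothetical listener sketch: the member below is an assumed signature, not the
// addon's confirmed API. It only illustrates a NetworkRig child component reacting
// when the avatar representation has finished loading.
public class AvatarLoadedLogger : MonoBehaviour, IAvatarRepresentationListener
{
    public void OnRepresentationAvailable(AvatarRepresentation avatarRepresentation)
    {
        Debug.Log($"Avatar representation ready on {avatarRepresentation.gameObject.name}");
    }
}
```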
Local avatar culling
The local avatar culling system hides the avatar from the local user. To do so, the culling mask of the camera stored in the hardware rig is changed to hide a layer, and that layer is applied to the avatar game object when it is associated with the local user.
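The underlying Unity mechanism is a simple culling mask edit. The sketch below shows the idea; the "LocalAvatar" layer name is only an example, not necessarily the layer used by the addon:

```csharp
using UnityEngine;

// Minimal sketch of layer-based culling: hide one layer from the rig camera and
// move the local avatar hierarchy onto that layer. The "LocalAvatar" layer name
// is an example; the addon may use a different layer.
public static class LocalAvatarCullingSketch
{
    public static void HideLocalAvatar(Camera rigCamera, GameObject localAvatarRoot)
    {
        int layer = LayerMask.NameToLayer("LocalAvatar");
        if (layer < 0) return; // layer not defined in the project settings

        // Remove the layer from the camera's culling mask
        rigCamera.cullingMask &= ~(1 << layer);

        // Put the whole avatar hierarchy on the hidden layer
        foreach (var child in localAvatarRoot.GetComponentsInChildren<Transform>(true))
        {
            child.gameObject.layer = layer;
        }
    }
}
```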
Local avatar culling is done by adding a `LocalAvatarCulling` script on the `HardwareRig` game object.
Note that a `RigInfo` system is required for it to be functional, so in addition to `LocalAvatarCulling` on the `HardwareRig`, the following components are needed:
- a `RigInfo` on the `NetworkRunner` game object
- a `RigInfoRegister` on the `NetworkRig` game object and a `RigInfoRegister` on the `HardwareRig` game object
Hand representation managers
`NetworkHandRepresentationManager` and `HardwareHandRepresentationManager` are optional components that manage the online and offline representations of the hands.
Their purpose is mainly to colorize the hands based on the avatar skin color received.
They also offer various options to determine the appearance logic of the offline and online hands, based on the connection status, the availability of hands in the loaded avatar system, and so on.
The hardware hand also has a decoration system, so that a part of the offline hand (typically, a watch) can be moved based on online data.
This is useful for special cases, like surfaces blocking the network hands: it ensures that the hardware hand decoration, if any, follows the network hand interpolation target (useful when that interpolation target is manually changed, for instance when the hand is blocked).
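The colorization itself boils down to pushing the received skin color into the hand renderers' materials. The sketch below only illustrates this idea; the field and method names are placeholders, not the managers' actual API:

```csharp
using UnityEngine;

// Illustrative sketch only: tint hand renderers with a received skin color.
// NetworkHandRepresentationManager / HardwareHandRepresentationManager have their
// own fields and logic; nothing here mirrors their actual API.
public class HandColorizerSketch : MonoBehaviour
{
    [SerializeField] Renderer[] handRenderers;

    public void ApplySkinColor(Color skinColor)
    {
        foreach (var handRenderer in handRenderers)
        {
            // Accessing .material creates a per-renderer material instance before tinting
            handRenderer.material.color = skinColor;
        }
    }
}
```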
Low poly simple avatar
`LowPolySimpleAvatar` is in charge of configuring a low poly avatar (for the `LODGroup`) according to the avatar selected by the player (it can be a simple avatar model or a Ready Player Me model).
To do so, it uses the `OnRepresentationAvailable` callback of the `AvatarRepresentation`.
Then, the materials for the body, hair and clothes colors of the low poly avatar are configured according to the selected avatar.
Also, if the avatar model is a simple avatar, the hair mesh is set to the hair LOD mesh corresponding to the simple avatar model.
Simple Avatar
This addon includes a simple avatar system, which demonstrates how to create an avatar system that lets users select the various parameters making up their avatar: skin color, clothes meshes, hair material, etc.
`SimpleAvatar` contains methods to:
- change the avatar thanks to a new "simple avatar" URL
- generate a random "simple avatar" model
- configure specific avatar parameters (hair, clothes, skin)
- animate the avatar's eyes and mouth
It also informs the `AvatarRepresentation` when a new simple avatar is loaded, through the `RepresentationAvailable` method.
The demo scene can be found in the `Assets\Photon\FusionAddons\Avatar\Demo\Scenes\` folder.
To test simple avatars, open the `AvatarLODSimpleAvatar` scene.
When the user is spawned, select the `SimpleAvatarNetworkRig(Clone)` game object and change the `AvatarURL` parameter of the `UserInfo` component.
For example, you can use the following URL: `simpleavatar://?hairMesh=1&skinMat=2&clothMat=0&hairMat=0&clothMesh=1`
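To illustrate that URL scheme, here is a small standalone sketch that builds and parses such a `simpleavatar://` URL, using only the query parameters visible in the example above (`hairMesh`, `skinMat`, `clothMat`, `hairMat`, `clothMesh`). `SimpleAvatar` has its own generation and parsing logic, which is not reproduced here:

```csharp
using System.Collections.Generic;

// Sketch of building/parsing the documented "simpleavatar://" URL format.
// Standalone illustration only; SimpleAvatar implements its own logic.
public static class SimpleAvatarUrlSketch
{
    public static string Build(int hairMesh, int skinMat, int clothMat, int hairMat, int clothMesh)
    {
        return $"simpleavatar://?hairMesh={hairMesh}&skinMat={skinMat}&clothMat={clothMat}&hairMat={hairMat}&clothMesh={clothMesh}";
    }

    public static Dictionary<string, int> Parse(string url)
    {
        var parameters = new Dictionary<string, int>();
        int queryStart = url.IndexOf('?');
        if (queryStart < 0) return parameters;

        foreach (var pair in url.Substring(queryStart + 1).Split('&'))
        {
            var parts = pair.Split('=');
            if (parts.Length == 2 && int.TryParse(parts[1], out int value))
            {
                parameters[parts[0]] = value;
            }
        }
        return parameters;
    }
}
```

For instance, parsing the example URL above returns the five indices used to pick the hair mesh, skin material, cloth material, hair material and cloth mesh.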
Eye movement Simulation
To prevent an avatar from displaying a static eye gaze, it is possible to use the eye movement simulation system. This system can move objects (the avatar eyes) to track other objects having a `GazeTarget` component on them.
To do so:
- the scene must contain a `GazeInfo` component (which runs, at a regular interval, the background thread determining the new targets to aim at)
- the avatar object must hold a `Gazer` component, and provide the eye game objects to `gazingTransforms` (it is also possible to provide some rotation offsets through `gazingTransformOffsets`)
- optionally, to improve performance, it is possible to provide a `RendererVisible` to the `Gazer` component in the `eyeRendererVisibility` field: if the `RendererVisible` detects that the avatar renderer is not visible, the eye simulation won't be run for this avatar
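As a small wiring example, objects that the avatar eyes may look at simply need a `GazeTarget` component; the helper below adds one at runtime (assuming `GazeTarget` can be added like any other component, which is how it is described above). The `GazeInfo` and `Gazer` components, with the fields listed above, are typically configured in the inspector:

```csharp
using UnityEngine;

// Sketch: mark a scene object as a target for the eye movement simulation by
// adding a GazeTarget component at runtime. GazeInfo (one per scene) and the
// avatar's Gazer (gazingTransforms, gazingTransformOffsets, eyeRendererVisibility)
// are set up separately, as described above.
public class GazeTargetMarker : MonoBehaviour
{
    void Awake()
    {
        if (GetComponent<GazeTarget>() == null)
        {
            gameObject.AddComponent<GazeTarget>();
        }
    }
}
```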
Oculus lip synchronization
Some avatar systems will need advanced lip synchronization. For this purpose, the avatar addon includes a version of the Oculus Lipsync library, released under the Oculus Audio SDK license (https://developer.oculus.com/licenses/audio-3.3/) / Meta Platform Technologies SDK License Agreement (https://developer.oculus.com/licenses/oculussdk/), and available in the Oculus Integration.
Dependencies
- TextMesh Pro (only used to display the user name plate)
Download
This addon's latest version is included in the addon project.
Supported topologies
- shared mode
Changelog
- Version 1.0.2: use NetworkString for UserName & AvatarURL
- Version 1.0.1: fix namespace for LookAtCamera
- Version 1.0.0: First release