
Avatar SDK Latest build - 10/22/2016

Ross_Beef
Heroic Explorer

The second drop of the native Avatar SDK is now available!

As always, please let us know if you run into any issues so that we can work quickly to resolve them.










[Please note that this is not yet ready for redistribution and is covered under our existing NDAs.]

New features in this release


Test IDs


We encountered a delay getting an early build of the Avatar Editor out to developers, so to help you test your rendering implementations, we’ve temporarily enabled some test IDs that cycle through the various avatar materials. If you request a specification for a userID from 0 to 89, you will receive an avatar specification with a different material applied.
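
Roughly, the call pattern looks something like the sketch below, similar to what the Mirror sample does with the arrow keys. The entry point and header names here are illustrative; the headers in this drop are the source of truth.

    #include <cstdint>
    #include <OVR_Avatar.h>   // assumed header name for the native SDK

    // Sketch: request the avatar specification for the next test ID (0-89).
    // The result arrives asynchronously on the SDK's message queue.
    static uint64_t g_testUserID = 0;   // IDs 0-89 map to different materials

    void RequestNextTestAvatar()
    {
        g_testUserID = (g_testUserID + 1) % 90;
        ovrAvatar_RequestAvatarSpecification(g_testUserID);
    }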


The Mirror app allows you to cycle through these test IDs using the left and right arrow keys.


(Please note, this debugging feature will be removed in a future version once the Avatar Editor is available and you can customize avatars on your own.)


 


Controller support


Touch controllers are now supported natively. The SDK now takes a more complete input structure for the touch controllers, including button states and surface contact status, and uses these inputs to articulate the controllers as well as pose the hands dynamically based on the button presses, joystick movements, and finger proximity.
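
A rough sketch of the per-frame call, assuming an input struct and update call along the lines of ovrAvatarHandInputState / ovrAvatarPose_UpdateHands; the exact field and function names may differ, so check the headers and the Mirror sample in this drop.

    #include <OVR_Avatar.h>   // assumed header name for the native SDK

    // Sketch: push Touch controller state into the avatar once per frame.
    void UpdateAvatarHands(ovrAvatar* avatar,
                           const ovrAvatarTransform& leftPose,
                           const ovrAvatarTransform& rightPose)
    {
        ovrAvatarHandInputState left = {};
        left.isActive     = true;
        left.transform    = leftPose;              // tracked controller pose
        left.buttonMask   = ovrAvatarButton_One;   // buttons currently held (example value)
        left.touchMask    = ovrAvatarTouch_One;    // capacitive surface contact (example value)
        left.joystickX    = 0.0f;
        left.joystickY    = 0.0f;
        left.indexTrigger = 0.5f;                  // analog trigger values
        left.handTrigger  = 1.0f;

        ovrAvatarHandInputState right = left;      // fill from the right controller in practice
        right.transform = rightPose;

        // The SDK articulates the controller models and poses the fingers
        // from these button, joystick, and proximity inputs.
        ovrAvatarPose_UpdateHands(avatar, left, right);
    }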


 


Custom grip poses


Custom grip poses are now supported. You can call ovrAvatar_SetLeftHandGesture / ovrAvatar_SetRightHandGesture with one of the ovrAvatarHandGesture enum values to force one of the default grip poses. You can also call ovrAvatar_SetLeftHandCustomGesture / ovrAvatar_SetRightHandCustomGesture and provide your own skeletal poses.


The Mirror sample provides an example usage. You can now press the F key to “freeze” the hand poses, which copies the current hand poses directly into the CustomGesture functions.


Calling Set*HandGesture with ovrAvatarHandGesture_Default will return the hands to their normal animation behavior.
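
A minimal sketch of the flow described above; the non-default enum value and the custom-gesture signature shown here are illustrative, and the Mirror sample's F-key "freeze" path shows the real usage.

    #include <cstdint>
    #include <OVR_Avatar.h>   // assumed header name for the native SDK

    // Force one of the built-in grip poses on the left hand.
    void ForceGripPose(ovrAvatar* avatar)
    {
        ovrAvatar_SetLeftHandGesture(avatar, ovrAvatarHandGesture_GripSphere);
    }

    // Supply your own skeletal pose for the right hand.
    void SetFrozenPose(ovrAvatar* avatar,
                       const ovrAvatarTransform* joints, uint32_t jointCount)
    {
        ovrAvatar_SetRightHandCustomGesture(avatar, jointCount, joints);
    }

    // Return both hands to their normal, input-driven animation.
    void ReleasePoses(ovrAvatar* avatar)
    {
        ovrAvatar_SetLeftHandGesture(avatar, ovrAvatarHandGesture_Default);
        ovrAvatar_SetRightHandGesture(avatar, ovrAvatarHandGesture_Default);
    }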


Note - see the 'Reference' folder for .FBX files for the hands. These are pre-registration, so the bone placement will be offset from the actual hands. In a future build we'll provide the post-registration hands as imported into the SDK.


New rendering features


There is a new render part called SkinnedMeshRenderPBS that provides a skinned mesh and textures for albedo and surface properties. The surface texture follows Unity conventions:


  • R channel – Metallic
  • G channel – Occlusion
  • B channel – (unused)
  • A channel – Smoothness

This render part is used for the controllers, with textures matching the real-world surface properties of the shipping Touch controllers. A reference shader is provided with the native sample, but because the native sample doesn’t have a physically based lighting model, it just displays the albedo texture. The Unity package now includes a reference shader that shows how to hook these render parts up to Unity’s physically-based lighting model.
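
If you're walking render parts in native code, picking up the new part looks roughly like the sketch below; the accessor and field names are illustrative, and the Mirror sample is the reference implementation.

    #include <OVR_Avatar.h>   // assumed header name for the native SDK

    // Sketch: detect the PBS render part and grab its two textures.
    void ProcessRenderPart(const ovrAvatarRenderPart* part)
    {
        if (ovrAvatarRenderPart_GetType(part) == ovrAvatarRenderPartType_SkinnedMeshRenderPBS)
        {
            const ovrAvatarRenderPart_SkinnedMeshRenderPBS* pbs =
                ovrAvatarRenderPart_GetSkinnedMeshRenderPBS(part);
            // pbs->albedoTextureAssetID  : base color
            // pbs->surfaceTextureAssetID : R = metallic, G = occlusion,
            //                              B = unused,  A = smoothness
            (void)pbs;
        }
    }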


There is an additional new render part named ovrAvatarRenderPart_ProjectorRender. This render part provides a projection transform and material, and references another render part for the material to be projected onto. Though temporarily disabled, this is the technique we will be using for applying voice visualization effects on the avatar, and possibly to enable future stamp-based customization options.


Reference implementations of the projection technique are provided in both the Mirror sample and the Unity integration.
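
Handling the projector part follows the same pattern; this is just a sketch of what it carries (field names illustrative), and the Mirror sample and Unity integration remain the reference implementations.

    #include <OVR_Avatar.h>   // assumed header name for the native SDK

    // Sketch: read the projector render part's payload.
    void ProcessProjectorPart(const ovrAvatarRenderPart* part)
    {
        const ovrAvatarRenderPart_ProjectorRender* proj =
            ovrAvatarRenderPart_GetProjectorRender(part);
        // proj->componentIndex / proj->renderPartIndex identify the render part
        // the material is projected onto; proj->localTransform is the projection
        // transform and proj->materialState the material to project.
        (void)proj;
    }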


 


New material properties


The ovrAvatarMaterialState object has been expanded with new features. 


First, each texture field now has a companion Vector4 ScaleOffset value, which is used to adjust the tiling (scale) and offset of texture sampling at runtime.
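
The intended packing mirrors Unity's texture scale/offset convention; here is a sketch of the assumed mapping (xy = scale, zw = offset), written on the CPU for clarity even though in practice it happens in the reference shaders.

    // Stand-in types for illustration; the SDK exposes its own Vector4 type.
    struct Vec4 { float x, y, z, w; };
    struct UV   { float u, v; };

    // Assumed convention: uv' = uv * scaleOffset.xy + scaleOffset.zw
    UV ApplyScaleOffset(UV uv, Vec4 scaleOffset)
    {
        return { uv.u * scaleOffset.x + scaleOffset.z,
                 uv.v * scaleOffset.y + scaleOffset.w };
    }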


Second, there is now a base mask type on the root object that modulates the overall alpha value of the avatar. This allows for material styles that have transparency effects based on the object geometry, and enables a new set of “holographic” material types.


Third, a new “Pulse” mask type has been added. This mask creates “waves” that pulse along an axis over time, adding secondary motion and life to several of the materials.


All of these new features are implemented in the native and Unity reference shaders.


 


New logical components


In addition to ovrAvatarBodyComponent and ovrAvatarHandComponent, you can now query explicitly for the left and right ovrAvatarControllerComponent and the ovrAvatarBaseComponent. These will contain the “semantic” state of each of these components.


The base component currently has no visual representation but will be connected to an optional visual marker for the avatar’s spawn position in an upcoming release.
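
Querying the new components is symmetrical with the existing body/hand queries; a rough sketch follows (accessor names illustrative, check the headers in this drop).

    #include <OVR_Avatar.h>   // assumed header name for the native SDK

    // Sketch: read the semantic state of the new logical components.
    void ReadAvatarComponents(ovrAvatar* avatar)
    {
        const ovrAvatarControllerComponent* leftController =
            ovrAvatarPose_GetLeftControllerComponent(avatar);
        const ovrAvatarControllerComponent* rightController =
            ovrAvatarPose_GetRightControllerComponent(avatar);
        const ovrAvatarBaseComponent* base =
            ovrAvatarPose_GetBaseComponent(avatar);

        // The base component carries the avatar's spawn position; it has no
        // visual representation yet (see the note above).
        (void)leftController; (void)rightController; (void)base;
    }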



What’s still to come?


The following features are either not implemented or partially implemented:


  • The avatar base cone is not yet implemented (including the custom base poses and highlight colors).
  • Semantic information about hands (grip volume sizes, point vectors, etc.) is not yet implemented.
  • Packet recording/playback is not yet implemented (Unity users can still take advantage of the Unity stub implementation for now).

To be finalized

  • The avatar material definition is close to final but may necessitate small refinements as we finish implementing the remaining visual features.
  • New render part types may still be added (currently everything is a skinned mesh render part).

As always, post in the forum with any comments or questions.

Note - I've had to upload the SDK in two parts to fit our forum file size limit; make sure you grab both halves.
13 REPLIES

cloudgineJordi
Explorer
Hi,

  Just a small note about the samples: while running the new Mirror sample I get a crash inside the ATI graphics driver around line 1157 of mirror.cpp. I have seen this code hasn't changed much from the previous sample, so I have no clue why it happens for now. This was running a debug build from VS2015.

  I'm in the process of moving to the new SDK; I'll let you know if there are any new issues.

j.

Ross_Beef
Heroic Explorer
Hey Jordi - thanks for flagging.
Any chance you can get a mini-dump of the issue logs if you do get a repro?
It'd also be great to know if you repro this in release mode.

Thanks!

cloudgineJordi
Explorer
This is a link to the dump for the VS2015 Debug version. Since the crash is deep in the ATI driver, I am not sure it will be very helpful:

https://www.dropbox.com/s/s8oryl7hhhquyf1/Mirror.7z?dl=0

A similar crash happens with Release for 2015, and with Debug and Release using the VS2013 toolchain. Other Oculus apps work well on this system.

It may be purely an issue with the ATI drivers (which are the latest) combined with this GPU (R9 390X).

cloudgineJordi
Explorer
In the Mirror example, you are setting the material state directly from the Oculus data structures into the shader. When using engines like Unreal, it would be good to know the update policy for this state to avoid overhead. Do we have to assume it may have changed every frame, including texture layers, etc.?

Thanks!

Ross_Beef
Heroic Explorer
Good question Jordi,

We do modulate some of the alphas, colors, etc. on a per-material basis at runtime, which enables us to pull off voice visualization and similar effects. I'd suggest comparing the raw structures with the previous state to determine any changes. We can look into ways to make this more standardized; however, this shouldn't cause any significant overhead as it currently stands.

In a future drop we'll update the mirror sample code with a more efficient way of doing this, as it's currently refreshing continuously, rather than with deltas on specific parameters.
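
In the meantime, one way to do that delta check is to cache the previous state per render part and skip the upload when nothing changed; a sketch (not SDK API, and it assumes ovrAvatarMaterialState is a plain data struct):

    #include <cstring>
    #include <OVR_Avatar.h>   // assumed header name for the native SDK

    // Returns true (and updates the cache) only when the raw material state
    // actually changed since the last frame.
    bool MaterialStateChanged(const ovrAvatarMaterialState& current,
                              ovrAvatarMaterialState& cached)
    {
        if (std::memcmp(&current, &cached, sizeof(current)) == 0)
            return false;     // nothing to re-upload this frame
        cached = current;     // remember the new state
        return true;
    }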

Also - thanks for the data dump on the ATI bug - looking into that one

mmcginley
Honored Guest
I've imported the assetbundle into Unity but it doesn't seem to be finding the textures it needs:
"Exception: Could not find texture for asset 4575657222473777152"

Any ideas?

I assume that the Unity package is self contained and that I don't need to import the DLL or the OVRAssets?

Ross_Beef
Heroic Explorer
Worth checking out the setup guide in our first post here: https://forums.oculus.com/developer/discussion/43799/avatars-sdk-native-dll-and-new-unity-integration#latest

The DLL & Assets are temporary and will need to be included for now.

mmcginley
Honored Guest
Got it working! Thought the DLL was referenced properly but it wasn't. Worked seamlessly once I fixed the pathing issue

cloudgineJordi
Explorer
Some more info on the crash reported before: in our Unreal integration we "sometimes" receive degenerate hand pose transforms, and it only seems to happen on my system as well. I haven't been able to test the latest yet, but it could be a hint.