Welcome to the Oculus Developer Forums!

Your participation on the forum is subject to the Oculus Code of Conduct.

In general, please be respectful and kind. If you violate the Oculus Code of Conduct, your access to the developer forums will be revoked at the discretion of Oculus staff.

[Integration Notice] Oculus Avatar SDK 1.29.0 Released (September 6th 2018)

imperativity Posts: 3,587 Valuable Player

Oculus Avatar SDK 1.29.0

The Oculus Avatar SDK assists developers with implementing social and hand presence for Gear VR, Oculus Go, and Oculus Rift with Touch controllers. With the Avatar SDK, it is easy to bring the avatar appearance users create in Oculus Home into your own applications and make them viewable by other users. Avatars also include hand presence for Touch, letting you integrate Touch interaction into your app.

Includes Unity and native C/C++ support for both Rift and Mobile.

Note: If you are using Unity, consider using the Unity Integration. It includes the core Oculus APIs along with the Platform and Avatar APIs, all of which are necessary to use the Avatar Social Starter sample scene.

For more information, see the Avatar SDK Developer Guide.

New Features

  • Oculus Avatars can now be used by apps on other platforms. This functionality is demonstrated in the new Unity CrossPlatform sample. For more information, see the Cross-Platform Avatar Support section of Unity Developer Guide - Rift.
  • New Unity shader support for linear lighting.

Bug Fixes

  • An exception caused by loading LOD 1 assets in Unity on Android has been fixed.

Known Issues

  • There are no known issues with this release.


  • motorsep Posts: 1,298 Oculus Start Member

    Now that the Oculus Quest has been announced, is there any chance we will see the Avatar SDK (with expressive avatars and lip sync) integrated with UE 4.21?
  • Ross_Beef Posts: 139 Oculus Staff
    The plan is certainly to continue building against the latest UE version, and the expressive update is targeting both UE and Unity. Stay tuned for a UE mobile update (which will land before the expressive update does).
  • motorsep Posts: 1,298 Oculus Start Member

    Is it possible (or will it be possible in UE4 with the next release of the integration) to use OVRLipSync (which I assume is coming to UE4 too) to have AI "talk" using WAV files instead of mic input?
  • Ross_Beef Posts: 139 Oculus Staff
    It’s possible today to drive an avatar for an NPC, albeit without lip sync on avatars yet. You can use the Avatar SDK packet recording/playback functions to record and save avatar motion, then play it back (as you would a networked data stream).

    Given that the avatar mouth today is just powered by a vertex offset driven by the amplitude of the audio wave, feeding amplitudes read from a WAV file into that same shader should be pretty easy to hook up.
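    The amplitude-driven approach described above can be sketched as follows. This is a hypothetical helper, not Avatar SDK API: it takes 16-bit PCM samples (e.g. decoded from a WAV file) and produces a normalized per-window RMS envelope, which could then drive the mouth vertex offset the same way microphone input does.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <vector>

// Compute a per-window RMS amplitude envelope from 16-bit PCM samples.
// Each returned value is in [0, 1]; a value per audio window could be
// sampled each frame to set the avatar mouth's vertex offset.
std::vector<float> AmplitudeEnvelope(const std::vector<int16_t>& pcm,
                                     std::size_t windowSize) {
    std::vector<float> envelope;
    for (std::size_t start = 0; start < pcm.size(); start += windowSize) {
        const std::size_t end = std::min(start + windowSize, pcm.size());
        double sumSquares = 0.0;
        for (std::size_t i = start; i < end; ++i) {
            const double s = pcm[i] / 32768.0;  // normalize to [-1, 1)
            sumSquares += s * s;
        }
        envelope.push_back(static_cast<float>(
            std::sqrt(sumSquares / static_cast<double>(end - start))));
    }
    return envelope;
}
```

    RMS over a window smooths out per-sample noise, which matters because a raw sample value would make the mouth flicker at the audio sample rate.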
  • motorsep Posts: 1,298 Oculus Start Member

    Sorry, I meant OVRLipSync for non-avatars (rather for AI using custom character skeletal models).

    From what I understand (other devs talk) OVRLipSync isn't even in UE4 integration yet. If it is, how do we work with it via Blueprints?
  • pushmatrix Posts: 6
    Can we get an updated UE4 Integration guide? The one currently online is for 4.15 and a super outdated version of OvrAvatar.
  • Neontop Posts: 147 Oculus Start Member
    edited October 2018
    Quoting pushmatrix: "Can we get an updated UE4 Integration guide? The one currently online is for 4.15 and a super outdated version of OvrAvatar."
    Yes, a little update would be a good thing.

  • EldonVR Posts: 6 Oculus Start Member
    When I try the Social Starter example, should I see my own or my friend's personalized avatar? Right now I only see the default avatar. I Debug.Log the user IDs in PlatformManager and it seems to be getting the right ID.
  • Ybalrid Posts: 247

    There's an issue with the latest Avatar SDK package. The CMake script that copies over the .glsl shader files only copies three of them, and the "Mirror" sample fails to start up because of the missing file.

    As you can see, the CombinedMesh fragment shader is never copied to the executable directory. The program will encounter an error in the call to _compileProgramFromfiles() at line 1061 in Mirror.cpp.

    The source file is also not added to the list in `add_executable()` in the CMake file, so it doesn't show up inside Visual Studio.

    This is a minor annoyance, but it should be fixed for next release. ;)
    CV1+Touch; Running on GTX980 - 4770K - 32GB DDR3

    I'm writing a C++ open-source game engine for the Rift http://annwvyn.org/ - https://github.com/Ybalrid/Annwvyn
    For now it's only used for my student projects at my engineering school http://en.esiea.fr/
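    The fix Ybalrid describes could look something like the CMake fragment below. The shader and target names here are illustrative guesses, not the SDK's actual CMakeLists.txt contents; the point is to copy all of the shader files next to the built executable and list them in `add_executable()` so they appear in Visual Studio.

```cmake
# Hypothetical file names -- the missing CombinedMesh fragment shader
# must be included in this list alongside the three that were copied.
set(MIRROR_SHADERS
    AvatarVertexShader.glsl
    AvatarFragmentShader.glsl
    AvatarFragmentShaderPBS.glsl
    AvatarFragmentShaderCombinedMesh.glsl)

# Listing the shaders as sources makes them visible in the IDE.
add_executable(Mirror Mirror.cpp ${MIRROR_SHADERS})

# Copy every shader to the executable's directory after each build.
foreach(shader ${MIRROR_SHADERS})
    add_custom_command(TARGET Mirror POST_BUILD
        COMMAND ${CMAKE_COMMAND} -E copy_if_different
                ${CMAKE_CURRENT_SOURCE_DIR}/${shader}
                $<TARGET_FILE_DIR:Mirror>)
endforeach()
```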
  • gbwhatsappapk008 Posts: 1
    When I try the Social Starter sample, should I see my own or my friend's personalized avatar? At the moment I only see the default avatar.
  • Ross_Beaf Posts: 1
    @EldonVR and @gbwhatsappapk008 — have you ensured you’re using your own app ID? 