
Using Lip Sync for Avatars SDK in Unity

Hello,

I'm using the Avatar SDK and Photon for a multiplayer VR experience. I already have the Avatar SDK working, but now I want the avatars to move their lips while talking over VoIP.
My current script looks like this:
void OnAudioFilterRead(float[] data, int channels)
{
    avatar.UpdateVoiceVisualization(data);
}
The filter receives the data from the audio source, but the lips just won't move.
Sometimes I get an InvalidOperationException:
InvalidOperationException: Collection was modified; enumeration operation may not execute.
System.ThrowHelper.ThrowInvalidOperationException (System.ExceptionResource resource) (at <f2e6809acb14476a81f399aeb800f8f2>:0)
System.Collections.Generic.List`1+Enumerator[T].MoveNextRare () (at <f2e6809acb14476a81f399aeb800f8f2>:0)
System.Collections.Generic.List`1+Enumerator[T].MoveNext () (at <f2e6809acb14476a81f399aeb800f8f2>:0)
OvrAvatar.Update () (at Assets/Oculus/Avatar/Scripts/OvrAvatar.cs:608)
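For what it's worth, OnAudioFilterRead is called on Unity's audio thread rather than the main thread, so I wondered whether OvrAvatar.Update is racing with my call. Here is an untested variant I'm considering that buffers the samples on the audio thread and hands them over from Update() on the main thread (a sketch only; `avatar` is my OvrAvatar reference, and I haven't verified this fixes the exception):

```csharp
// Untested sketch: copy samples on the audio thread, consume on the main thread.
private readonly object bufferLock = new object();
private float[] pendingSamples;

void OnAudioFilterRead(float[] data, int channels)
{
    lock (bufferLock)
    {
        // Copy the frame; Unity reuses this buffer after the callback returns.
        pendingSamples = (float[])data.Clone();
    }
}

void Update()
{
    lock (bufferLock)
    {
        if (pendingSamples != null)
        {
            avatar.UpdateVoiceVisualization(pendingSamples);
            pendingSamples = null;
        }
    }
}
```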

Anyone got any hints on how to get this working?

Comments

  • imperativity Posts: 3,587 Valuable Player
    Hi,

    Welcome to the Oculus Developer Forums.

    I have passed this along to our Lipsync team for review. If I need anything else from you to help them investigate this issue further, I will be in touch with you for more context.
  • CogSimGuy Posts: 30
    Brain Burst
    ...make sure you check what type of VoIP data you're getting from Photon; if it's Opus-encoded or similar, a lot of the time it's NOT in PCM float format...
  • lostsomfan Posts: 1
    NerveGear
    We are currently having the same problem with our project; hope they can fix it ;)
  • Ross_Beef Posts: 143 Oculus Staff
    Hey folks, I just wanted to confirm that (as of today) the Avatar SDK uses a float value to set the voice, as we’re not currently leveraging Lipsync.

    There's some more discussion in this thread:
    https://forums.oculusvr.com/developer/discussion/67040/how-to-get-the-new-voice-visualization-working-is-it-sent-as-part-of-the-pose

    That said, we *are* in the process of finalizing the integration we demo’d at OC5, which updates the avatars with eye movement and facial movement powered in part by our Lipsync plugin. While this is coming soon, I just wanted to ensure you weren’t trying to implement Lipsync data for the avatars of today.

    Let me know if that clears things up!
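To illustrate CogSimGuy's note about formats above: VoIP frames often arrive as 16-bit PCM (short[]) rather than as floats, so they would need normalizing before being fed to something like UpdateVoiceVisualization. A minimal conversion sketch (the helper name and the short[] input are assumptions for illustration, not a Photon API):

```csharp
// Hypothetical helper: convert a 16-bit PCM frame into normalized
// float samples in roughly the [-1, 1] range.
static float[] ShortsToFloats(short[] pcm)
{
    var samples = new float[pcm.Length];
    for (int i = 0; i < pcm.Length; i++)
        samples[i] = pcm[i] / 32768f; // 32768 = |short.MinValue|
    return samples;
}
```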
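Ross_Beef's point that the SDK currently drives the mouth from a single float voice value could be sketched as reducing each audio frame to one loudness number, for example an RMS amplitude (a generic sketch; how that value is handed to the avatar is not shown in this thread):

```csharp
// Sketch: collapse a frame of samples to a single loudness value (RMS).
static float RmsAmplitude(float[] samples)
{
    float sum = 0f;
    foreach (float s in samples)
        sum += s * s;
    return (float)System.Math.Sqrt(sum / samples.Length);
}
```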