Hello,
I'm using the Avatar SDK and Photon for a multiplayer VR experience. I already got the Avatar SDK working, but now I want the avatars to move their lips while talking over VoIP.
My current script looks like this:
void OnAudioFilterRead(float[] data, int channels)
{
    avatar.UpdateVoiceVisualization(data);
}
The filter receives the data from the audio source, but the lips just won't move. Sometimes I also get an InvalidOperationException:
InvalidOperationException: Collection was modified; enumeration operation may not execute.
System.ThrowHelper.ThrowInvalidOperationException (System.ExceptionResource resource) (at <f2e6809acb14476a81f399aeb800f8f2>:0)
System.Collections.Generic.List`1+Enumerator[T].MoveNextRare () (at <f2e6809acb14476a81f399aeb800f8f2>:0)
System.Collections.Generic.List`1+Enumerator[T].MoveNext () (at <f2e6809acb14476a81f399aeb800f8f2>:0)
OvrAvatar.Update () (at Assets/Oculus/Avatar/Scripts/OvrAvatar.cs:608)
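Since OnAudioFilterRead runs on Unity's audio thread while OvrAvatar.Update runs on the main thread, I suspect the exception comes from calling into the avatar from the audio thread while the SDK is enumerating its internal sample list. Here's a hedged sketch of what I've been considering instead (the AvatarVoiceVisualizer class name and the avatar field are my own; only OvrAvatar.UpdateVoiceVisualization comes from the SDK) -- buffer the samples on the audio thread and hand them to the avatar from Update() on the main thread:

    using UnityEngine;

    public class AvatarVoiceVisualizer : MonoBehaviour
    {
        public OvrAvatar avatar; // assigned in the Inspector

        private readonly object bufferLock = new object();
        private float[] pendingSamples;

        // Audio thread: only copy the data, don't touch the avatar here.
        void OnAudioFilterRead(float[] data, int channels)
        {
            lock (bufferLock)
            {
                pendingSamples = (float[])data.Clone();
            }
        }

        // Main thread: safe to call into the Avatar SDK from here.
        void Update()
        {
            float[] samples;
            lock (bufferLock)
            {
                samples = pendingSamples;
                pendingSamples = null;
            }
            if (samples != null && avatar != null)
            {
                avatar.UpdateVoiceVisualization(samples);
            }
        }
    }

Cloning the buffer drops any chunks that arrive between frames, but for a visualization that seemed acceptable to me. Is something like this the intended usage, or is UpdateVoiceVisualization supposed to be audio-thread-safe?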
Does anyone have any hints on how to get this working?
Comments
Welcome to the Oculus Developer Forums.
I have passed this along to our Lipsync team for review. If they need anything else to investigate this issue further, I will be in touch with you for more context.
There is some more discussion in this thread:
https://forums.oculusvr.com/developer/discussion/67040/how-to-get-the-new-voice-visualization-working-is-it-sent-as-part-of-the-pose
That said, we *are* in the process of finalizing the integration we demo’d at OC5, which updates the avatars with eye movement and facial movement powered in part by our Lipsync plugin. While this is coming soon, I just wanted to ensure you weren’t trying to implement Lipsync data for the avatars of today.
Let me know if that clears things up!