
Question about the lip-sync for the new expressive avatar

ericwang0701
Explorer
Hi,
I have followed the developer guide for the new expressive avatar update, and everything works well except the lip-sync feature.
The document only mentions that the expressive avatar uses OVRLipSync to enable realistic lip-syncing. However, it doesn't explain how to actually make it work for the expressive avatar (e.g., which prefab or scripts we should look at and attach to the avatar).

Any help or suggestion would be really appreciated.

Best Regards,
Eric Wang


ACCEPTED SOLUTION

cloud_canvas
Expert Protege
As far as I can tell, all you need to do is set two bools to true in the Inspector for the Avatar Component itself: "Enable Expressive" and "Can Own Microphone". If you look at a local driver-driven third person instance with these settings, you should see it working.
Samsung Galaxy S8 (Snapdragon 835), Gear VR (2017), Oculus Go (64GB), Unity 2018.3.14f1


21 REPLIES

cloud_canvas
Expert Protege
As far as I can tell, all you need to do is set two bools to true in the Inspector for the Avatar Component itself: "Enable Expressive" and "Can Own Microphone". If you look at a local driver-driven third person instance with these settings, you should see it working.
Samsung Galaxy S8 (Snapdragon 835), Gear VR (2017), Oculus Go (64GB), Unity 2018.3.14f1
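
For anyone who prefers to set those flags from code instead of the Inspector, here is a minimal sketch. It assumes the Avatar SDK's OvrAvatar component exposes public EnableExpressive and CanOwnMicrophone bool fields (the same two checkboxes mentioned above); exact field names may differ between SDK versions.

```csharp
using UnityEngine;

// Sketch only: assumes OvrAvatar exposes public bools named
// EnableExpressive and CanOwnMicrophone, matching the Inspector checkboxes.
public class EnableAvatarLipSync : MonoBehaviour
{
    void Awake()
    {
        var avatar = GetComponent<OvrAvatar>();
        avatar.EnableExpressive = true;   // expressive features (blinking, lip-sync)
        avatar.CanOwnMicrophone = true;   // lets the avatar attach a mic input to its mouth anchor
    }
}
```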

Metal_Multiball
Heroic Explorer


As far as I can tell, all you need to do is set two bools to true in the Inspector for the Avatar Component itself: "Enable Expressive" and "Can Own Microphone". If you look at a local driver-driven third person instance with these settings, you should see it working.


Thanks @cloud_canvas. Is there another microphone input step required besides toggling "Can Own Microphone"? I can't get the avatar's mouth to move yet, and I can't tell if there is any voice audio input from the headset mic. I have allowed permissions on Android.
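
To rule out the permission side, here is a quick runtime check, a sketch assuming Unity 2018.3+ where the UnityEngine.Android.Permission API is available:

```csharp
#if UNITY_ANDROID
using UnityEngine;
using UnityEngine.Android;

// Sketch: confirm the RECORD_AUDIO permission was actually granted
// and that Unity can see a microphone device at all.
public class MicPermissionCheck : MonoBehaviour
{
    void Start()
    {
        if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
        {
            // Re-prompt if the user dismissed the dialog earlier.
            Permission.RequestUserPermission(Permission.Microphone);
        }
        Debug.Log("Mic devices: " + string.Join(", ", Microphone.devices));
    }
}
#endif
```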

cloud_canvas
Expert Protege
Are you including OVRLipSync from the Utilities too? The new version of the OvrAvatar Component makes use of it so you need to have it in your project.
Samsung Galaxy S8 (Snapdragon 835), Gear VR (2017), Oculus Go (64GB), Unity 2018.3.14f1

Metal_Multiball
Heroic Explorer


Are you including OVRLipSync from the Utilities too? The new version of the OvrAvatar Component makes use of it so you need to have it in your project.


Yes, I do have OVRLipSync in my project. Is there a component I need to add to the LocalAvatar game object?

ericwang0701
Explorer
@cloud_canvas and @stevehinan Thank you for your replies. I followed @cloud_canvas's suggestion and finally got the lip-sync working.

However, I did find a weird thing: if I run my scene in the editor and select LocalAvatar or RemoteAvatar in the scene hierarchy to check some info, the lip-sync just stops working. I guess I just need to avoid doing that for the moment.

ericwang0701
Explorer
@stevehinan One easy way to test whether the lip-sync is working is to use the RemoteLoopback sample scene in the Avatar SDK (i.e. the Avatar/Sample folder). Hope you find a way to fix your issue.

Metal_Multiball
Heroic Explorer
@ericwang0701 @cloud_canvas
Solved.  Thank you both!
In a nutshell, the OvrAvatar script requires the OvrAvatarLocalDriver component to be attached for voice to work.
I had created a modified OvrAvatarLocalDriver script with a different name that locks the position of the avatar.
Therefore I just added the required OvrAvatarLocalDriver script back onto the OVRAvatar object as a duplicate component, and it works.

From OVRAvatar script:

if (GetComponent<OvrAvatarLocalDriver>() != null)
{
    // Use mic.
    lipsyncContext.audioLoopback = false;
    if (CanOwnMicrophone && IsValidMic())
    {
        micInput = MouthAnchor.gameObject.AddComponent<OVRLipSyncMicInput>();
        micInput.enableMicSelectionGUI = false;
        micInput.MicFrequency = 44100;
        micInput.micControl = OVRLipSyncMicInput.micActivation.ConstantSpeak;
    }
}
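
The requirement described above could also be enforced with a small guard component; a sketch, assuming only the stock OvrAvatarLocalDriver class from the Avatar SDK:

```csharp
using UnityEngine;

// Sketch: make sure the driver component the OvrAvatar script checks for
// is present, so the mic-input branch quoted above actually runs.
[RequireComponent(typeof(OvrAvatar))]
public class EnsureLocalDriver : MonoBehaviour
{
    void Awake()
    {
        if (GetComponent<OvrAvatarLocalDriver>() == null)
        {
            gameObject.AddComponent<OvrAvatarLocalDriver>();
        }
    }
}
```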


Steve Hinan - METAL MULTIBALL

legolasshegolas
Honored Guest
I've followed all the steps in this thread and have the entire Oculus Utilities folder imported in my project (including LipSync and Avatar), but the lip-sync still doesn't seem to be working. I'm using the Oculus Go and have my avatar in third-person mode. The app asks me for permission to access the microphone at launch as well, but there's no lip-sync. Just the avatar's hands are moving; the head isn't even turning based on movement of the headset.

Please help?

Metal_Multiball
Heroic Explorer
@legolasshegolas
Do you still need help?  Have you got the head to turn as you would expect?