Are hand interactions in VR supported?

Original Discord Post by bioskop | 2024-03-15 18:33:02

Hey guys, I am doing a project that is really advanced now compared to what I originally planned, but I have a question for a future project: are hand interactions supported in the VR version? (I'm using an Oculus Quest 2 right now.) Thanks in advance!

Reply by k3kalinix | 2024-03-15 18:33:49

Hello <@470441815825448962>

Reply by k3kalinix | 2024-03-15 18:34:08

Could you give some details about what you mean by hand interaction?

Reply by bioskop | 2024-03-15 19:08:19

I mean, the current package works well with controllers, but Oculus also lets you use some apps with your hands.

Reply by bioskop | 2024-03-15 19:09:39

Embedded Content:
Advanced Hand Tracking & Lighting Fast Interaction Setup | Meta’s h…
Hi XR Developers! In this quick video we are going to look at the new improvements that version 62 of Meta’s XR SDK brought to our hand tracking and the interaction SDK. We are going to look at Multimodal, Wide Motion Mode, or WMM for short, and Cap-sense. Furthermore, we will take a quick look at Meta’s new comprehensive interaction sample and …
Link: https://www.youtube.com/watch?v=dmntgkltSoQ

Reply by k3kalinix | 2024-03-15 19:10:44

Yes, you can use it. It is not a blocker for Convai.

Reply by bioskop | 2024-03-15 19:22:12

Amazing, thanks!!

Reply by tim_u. | 2024-03-18 18:11:27

<@1023671043287699568> Can you be a bit more specific? Which action should be used to trigger "Talk" with a hand gesture? For the A button on the Quest controller it's pretty clear, but for hands?
And how could it be triggered in both cases, whether the user has a controller OR hands? An example for Unity would be great!

Reply by k3kalinix | 2024-03-19 13:08:57

At the moment we don’t officially have a demo project or documentation for Meta or XR Hand Tracking.

However, taking a "Hand Pinch Interaction" input as an example: from ConvaiNPCManager.cs, you can use the activeConvaiNPC variable to communicate with the NPC via the StartListening and StopListening methods.

I think you can find some results if you search for "Unity Meta or XR Hand Tracking Inputs" on the internet. Your main goal will be to call the StartListening method when the input begins and the StopListening method when the input stops.
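A minimal Unity C# sketch of that idea, assuming Meta's OVRHand/OVRInput APIs for input and a singleton-style `ConvaiNPCManager.Instance` exposing the `activeConvaiNPC` field with `StartListening`/`StopListening` methods (the field and method names come from the reply above; the singleton access and exact signatures are assumptions, so check them against your Convai version):

```csharp
using UnityEngine;

// Hypothetical glue script: each frame it polls both the controller button
// and the hand-tracking pinch, then forwards the combined "talk" state to
// the active Convai NPC on rising/falling edges.
public class ConvaiTalkTrigger : MonoBehaviour
{
    [SerializeField] private OVRHand rightHand; // assign the right OVRHand in the Inspector

    private bool wasTalking;

    private void Update()
    {
        // Controller path: hold the A button to talk.
        bool buttonHeld = OVRInput.Get(OVRInput.Button.One);

        // Hand-tracking path: hold an index-finger pinch to talk.
        bool pinchHeld = rightHand != null
            && rightHand.IsTracked
            && rightHand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        // Either input source can trigger "Talk", so controllers and hands both work.
        bool isTalking = buttonHeld || pinchHeld;

        var npc = ConvaiNPCManager.Instance != null
            ? ConvaiNPCManager.Instance.activeConvaiNPC
            : null;
        if (npc == null) return;

        // StartListening on the rising edge, StopListening on the falling edge.
        if (isTalking && !wasTalking) npc.StartListening();
        else if (!isTalking && wasTalking) npc.StopListening();

        wasTalking = isTalking;
    }
}
```

Because the script only checks edges of the combined state, switching between controller and hands mid-session works without extra logic.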

This conversation happened on the Convai Discord Server, so this post will be closed.