Oculus Hand Interaction SDK
The most impressive thing I felt when I started using Oculus was the way the developers managed to make us feel alive in the metaverse through hand gestures and interactions. It feels like a fantasy.
It was the first thing on my bucket list for learning on Oculus, but I didn't want to skip the basics, so I first made a few simple apps such as an e-commerce store and bedroom designs with a ray indicator, and then came to the hand interaction part.
This was around the time Oculus released their Hand Interaction SDK along with some v1 updates, which brought a lot of improvements over the previous version.
About the SDK:
The Oculus Integration SDK brings advanced rendering, social, platform, audio, and Avatars development support to Oculus VR devices and some OpenVR-supported devices. The Oculus Integration contains the following:
Audio Manager: Contains scripts to manage all the audio and sound effects in your app.
Avatar: Contains the scripts and prefabs to add Oculus Avatars to your app. See Oculus Avatars to get started.
LipSync: Contains a set of plugins and scripts that can be used to sync avatar lip movements to speech sounds. See Oculus Lipsync for Unity Development to get started.
Platform: Contains the components to add Oculus Platform Solutions functionality to your app.
Spatializer: Contains the scripts and plugins to add sound sources and make them sound as though they originate from a specific desired direction. See Oculus Native Spatializer for Unity for more information.
VoiceMod: Contains a plugin and set of scripts used to modify incoming audio signals.
VR: This folder contains the Oculus VR utilities, a set of scripts and prefabs to enable VR development. See Oculus Utilities for Unity for more information.
How to use it for Hand Interaction: For this part of the blog, we are only going to use the OVR Camera Rig with the hand prefab from the VR folder listed above.
Hand interaction makes your application feel more real. Access to it is still limited, but it has great potential to make applications both interactive and realistic. A few applications have already started using hands as the main controller instead of ray interaction.
What can we do with it: We can pick up a 3D object, trigger UI actions, and interact with a 3D model, for example closing a door, picking up a mug, playing tennis, and more. We use anchor points on the 3D object so we can handle it the same way we would in real life.
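To make this concrete, here is a minimal pinch-to-grab sketch using the OVRHand pinch API from the Oculus Integration. This is not the SDK's built-in grab system, just an illustration; the `target` field and the parenting logic are my own simplifications.

```csharp
using UnityEngine;

// Minimal pinch-to-grab sketch (not the SDK's built-in grab system).
// Attach to a hand anchor that has an OVRHand component, and assign
// `target` in the Inspector. The field name is hypothetical.
public class PinchGrab : MonoBehaviour
{
    public Transform target;        // object to pick up (assumed assigned in Inspector)
    private OVRHand hand;
    private Transform originalParent;

    void Start()
    {
        hand = GetComponent<OVRHand>();
        originalParent = target.parent;
    }

    void Update()
    {
        if (!hand.IsTracked) return;

        bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
        if (pinching && target.parent != transform)
        {
            target.SetParent(transform);      // hold the object in the hand
        }
        else if (!pinching && target.parent == transform)
        {
            target.SetParent(originalParent); // release it
        }
    }
}
```

In a real app you would use the SDK's grab components instead of parenting transforms by hand, but this shows the basic pinch signal the whole interaction is built on.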
How to use the SDK in your project:
- Import the SDK from the Asset Store into your project.
- Once it's imported, add the camera rig (OVR Camera Rig) from the Oculus package and set the tracking origin to floor level.
- Add the OVR Hand prefab from the Oculus SDK onto the left and right hand anchors inside the camera rig.
- Set the hand type on each prefab to differentiate the left and right hands.
- Now press Play and watch your hands in VR.
* Remember to have XR Plugin Management enabled. A quick sanity check you can run after setup is sketched below.
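Once the rig is set up, it helps to confirm that hand tracking is actually active at runtime. Here is a small sketch, assuming an OVRHand component on each hand anchor, that logs the tracking state and index pinch strength:

```csharp
using UnityEngine;

// Sanity-check script: attach next to an OVRHand component to log
// whether the hand is currently tracked and how strongly the index
// finger is pinching. Noisy by design; remove after verifying setup.
public class HandTrackingCheck : MonoBehaviour
{
    private OVRHand hand;

    void Start()
    {
        hand = GetComponent<OVRHand>();
    }

    void Update()
    {
        if (hand.IsTracked)
        {
            float pinch = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);
            Debug.Log($"Hand tracked, index pinch strength: {pinch:F2}");
        }
        else
        {
            Debug.Log("Hand not tracked; check XR Plugin Management and hand tracking settings.");
        }
    }
}
```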
New changes that you might miss in old tutorials: Transformables are deprecated, so use Grabbable instead.
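As a small illustration, where an older tutorial would add a Transformable component to make an object grabbable, you now add Grabbable. A minimal sketch, assuming the Interaction SDK's Oculus.Interaction namespace is available after import:

```csharp
using UnityEngine;
using Oculus.Interaction; // Interaction SDK namespace (assumed available after import)

// Ensures an object uses the newer Grabbable component in place of the
// deprecated Transformable. Configuration details may vary between
// SDK versions, so treat this as a sketch rather than a recipe.
public class GrabbableSetup : MonoBehaviour
{
    void Awake()
    {
        if (GetComponent<Grabbable>() == null)
        {
            gameObject.AddComponent<Grabbable>();
        }
    }
}
```

In practice you would usually add Grabbable in the Inspector rather than from code; the point is simply that Transformable is no longer the component to reach for.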
I have made a simple Oculus project on hand interaction. Refer to it if you are stuck somewhere; it includes all the latest changes made by the Oculus team.
Demo video: https://youtu.be/IOLlZhtRcP0
Best tutorials to learn hand interaction:
- YouTube: https://www.youtube.com/watch?v=O5o-q9odUB4
- Developer site: https://developer.oculus.com/documentation/unity/unity-handtracking/
- Additional support: https://www.youtube.com/c/ValemTutorials
What can you expect in my next article?
It's a game with hand interaction in a really cool environment where I'm using the ProBuilder tool. I'll write more about it in that blog.
Follow me on other social networks too:
LinkedIn: https://www.linkedin.com/in/shreethaanu-raveendran-7a6275b2/
GitHub: https://github.com/shreethaanu
Instagram: https://www.instagram.com/shreethaanu_blogs/
Portfolio: https://strlabz.com
Written by shree thaanu, iOS Developer | XR Researcher & Design Technologist