Introduction

1. Introduction
With the arrival of Meta Quest 3, Mixed Reality (MR) application development has entered a new phase. Its high-resolution passthrough makes it possible to enjoy, and to develop, both VR and AR experiences on a single device.
This blog has previously published articles about Quest development using the Oculus XR plugin. Since then, however, the Meta XR SDK has standardized on an OpenXR-based architecture, which changed the SDK-provided prefabs and the setup procedures. Because the existing articles no longer match the latest environment, I decided to recreate the tutorials from scratch for the new Meta XR SDK.
This series aims to be a tutorial in which beginners can gradually learn how to use the prefabs and scripts provided by the Meta XR SDK. The SDK also offers a tool called Building Blocks, which makes it easy to implement object display and interaction, so you can try out basic features without any SDK knowledge.
Building Blocks | Oculus Developers
However, once you outgrow the samples created with Building Blocks and want to adjust them to achieve the behavior you have in mind, you need to learn how to use the SDK itself. The Meta XR SDK provides abundant samples, so you can learn its detailed usage by analyzing them, but working through those samples one by one is overwhelming for beginners and presents a high barrier. In other words, even making "small changes" becomes very challenging. I struggled with this myself.
That is why I started this series: I believe that materials covering the basic procedures for displaying objects and implementing interaction on Quest will help many people learn. My aim is to show the minimal configuration needed to implement each specific feature, at an intermediate level between the no-code Building Blocks and the official samples.
2. Series Index
- Meta XR SDK Installation and Project Setup
- VR Object Display
- AR Display Using Passthrough Feature
- Disabling Locomotion
- Object Grabbing (Preparation)
- Ray Intersection-based Interaction
- Unity UI Interaction (Using Buttons as Example)
- Natural Manipulation Using Custom Hand Poses
- Palm UI Using PalmMenu
I plan to add more content as ideas arise.
Please feel free to contact me if you have any corrections, suggestions for improvement, or topics you would like me to cover.
X (formerly Twitter): https://twitter.com/Tks_Yoshinaga
3. GitHub
The samples created in this series are also published on GitHub.
GitHub - TakashiYoshinaga/MetaXR-SDK-Samples