This tutorial explains how to create your first interactions using Interaction SDK. You will add the Interaction SDK rig to your scene, set up a UI, and add interactions so users can manipulate the UI.
From the Unity Editor, select Window > Package Manager to view your installed packages.
Navigate to My Assets in the Package Manager, select Meta XR Interaction SDK, and click Install.
Confirm that you see the Meta XR Interaction SDK folder in the Packages directory.
To import the Interaction SDK samples:
From the Package Manager, select Meta XR Interaction SDK and then select the Samples tab.
Click Import next to each sample you want to import.
Run the Unity Project Setup Tool
The Unity Project Setup Tool optimizes Android project settings for Meta Quest Unity apps, including texture and graphics settings. It also applies the settings required for Meta Quest XR apps, such as setting the minimum API version and targeting ARM64.
Navigate to Meta > Tools > Project Setup Tool.
In the checklist under the Android icon tab of the Project Setup Tool, select Fix All.
If you still see Recommended Items in the list, select Apply All.
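If you prefer to apply these settings from code (for example, in a CI setup step), the same kinds of Android settings can be set with standard UnityEditor APIs. The sketch below is a hypothetical helper, not the Project Setup Tool itself; the tool may apply additional fixes beyond these:

```csharp
// Editor/ApplyQuestSettings.cs -- hypothetical helper; place in an Editor folder.
// A minimal sketch of the kinds of Android settings the Project Setup Tool applies.
using UnityEditor;
using UnityEngine;

public static class ApplyQuestSettings
{
    [MenuItem("Tools/Apply Quest Android Settings (Sketch)")]
    public static void Apply()
    {
        // IL2CPP is required to target ARM64.
        PlayerSettings.SetScriptingBackend(BuildTargetGroup.Android, ScriptingImplementation.IL2CPP);
        PlayerSettings.Android.targetArchitectures = AndroidArchitecture.ARM64;
        // Minimum API level; check the current Meta Quest requirement before relying on this value.
        PlayerSettings.Android.minSdkVersion = AndroidSdkVersions.AndroidApiLevel29;
        Debug.Log("Applied baseline Quest Android settings.");
    }
}
```

Running the menu item once is equivalent to clicking through the corresponding checklist entries, but the Project Setup Tool remains the recommended path because it tracks the full, current list of required and recommended items.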
Add the rig
Instead of manually adding the rig prefabs to the scene, we recommend using Interaction SDK Quick Actions, which find or add any missing dependencies, such as the camera rig. For a more advanced guide to using this rig, see Comprehensive Interaction Rig.
Delete the default Main Camera if it exists, since Interaction SDK uses its own camera rig.
Right-click in the Hierarchy and select Interaction SDK > Add OVR Interaction Rig Quick Action.
If the scene already contains an OVRCameraRig, the wizard references it. If there is no camera rig, click Fix All so the wizard creates one.
If you don’t need smooth locomotion in your scene, disable the Smooth Locomotion option; otherwise, if the scene has no ground collider, the camera falls indefinitely when the scene starts.
In Unity, it isn’t possible to save modifications directly to a rig prefab in a package folder, but Unity 2022+ can create a copy of the prefab for you to overwrite as needed. Select Generate as Editable Copy and set the Prefab Path to store an intermediate copy of the rig prefab so you can apply as many overrides as needed.
If you want to further customize the rig, adjust the settings in the wizard. For details on the available options, please see the OVR Interaction Rig Quick Action documentation.
Click Create to add the OVR Interaction Rig to the scene.
In the Hierarchy, select the OVRCameraRig.
On the Inspector tab, go to OVR Manager > Quest Features, and then on the General tab, in the Hand Tracking Support list, select Controllers and Hands, Hands Only, or Controllers Only, depending on your needs. The Hands Only option lets you use hands as the input modality without any controllers.
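If you would rather set Hand Tracking Support from an editor script than click through the Inspector, the Meta XR Core SDK exposes the project configuration through OVRProjectConfig. This is a hedged sketch; the menu item name and class are illustrative, and you should verify the API against your installed SDK version:

```csharp
// Editor-side sketch (assumption: Meta XR Core SDK's OVRProjectConfig API).
// Sets Hand Tracking Support to Controllers and Hands programmatically.
using UnityEditor;

public static class HandTrackingSetup
{
    [MenuItem("Tools/Enable Controllers And Hands (Sketch)")]
    public static void Enable()
    {
        var config = OVRProjectConfig.GetProjectConfig();
        config.handTrackingSupport = OVRProjectConfig.HandTrackingSupport.ControllersAndHands;
        // Persist the change to the project configuration asset.
        OVRProjectConfig.CommitProjectConfig(config);
    }
}
```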
Set up your UI
In the Project panel, navigate to the Packages > com.meta.xr.sdk.interaction > Runtime > Sample > Objects > UISet > Prefabs > Backplate folder and add a backplate for the UI by dragging the EmptyUIBackplateWithCanvas prefab to the Hierarchy panel.
The backplate prefab contains a Canvas, a background for the UI, some basic layout components, and ray and poke interactable components to enable direct touch and raycast interactions with the UI.
In the Hierarchy, select the CanvasRoot. In the Inspector, under Rect Transform, use the Width and Height properties to set the size of the Canvas. In this example, we use the following settings to scale it to a reasonable size for a few components:
Rect Transform > Width: 500
Rect Transform > Height: 250
In the Hierarchy, select the UIBackplate. In the Inspector, under Rect Transform, set the Width and Height properties to match the canvas width and height from the previous step. In this example, we use the following settings to match the canvas size:
Rect Transform > Width: 500
Rect Transform > Height: 250
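If you build the panel at runtime instead of in the Inspector, the same sizing can be applied through RectTransform.sizeDelta. This is a minimal sketch; the object names are assumed to match the prefab hierarchy used in this tutorial:

```csharp
using UnityEngine;

// Minimal runtime sketch: size CanvasRoot and UIBackplate to 500 x 250,
// mirroring the Inspector values set above. Object names are assumptions
// based on the prefab used in this tutorial.
public class SizeUIPanel : MonoBehaviour
{
    void Start()
    {
        var size = new Vector2(500f, 250f);
        var canvasRoot = GameObject.Find("CanvasRoot")?.GetComponent<RectTransform>();
        var backplate = GameObject.Find("UIBackplate")?.GetComponent<RectTransform>();
        if (canvasRoot != null) canvasRoot.sizeDelta = size;
        if (backplate != null) backplate.sizeDelta = size;
    }
}
```

Setting sizeDelta on a RectTransform with centered anchors changes its width and height directly, which is what the Inspector's Width and Height fields edit.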
Add UI elements to interact with by dragging and dropping prefabs from the Packages > com.meta.xr.sdk.interaction > Runtime > Sample > Objects > UISet > Prefabs folder onto the panel. For example, from the Buttons > UnityUIButtonBased folder, drag the PrimaryButton_IconAndLabel_UnityUIButton prefab to the UIBackplate object in the Hierarchy. The button element appears on the UI. Add as many elements as you like to build out your UI.
Make the UI grabbable
Right-click on the Canvas object for your UI and select Interaction SDK > Add Grab Interaction. The Grab wizard appears.
In the Grab wizard, select Fix All to fix any errors. This will add missing components or fields if they’re required.
If you want to further customize the interaction, adjust the interaction’s settings in the wizard. For details on the available options, please see the Grab Quick Action documentation.
Select Create. The wizard automatically adds the required components for the interaction to the GameObject. It also adds components to the camera rig if those components weren’t already there.
Open the Link desktop application on your computer.
Put on your headset, and, when prompted, enable Link.
On your development machine, in Unity Editor, select the Play button.
In your headset, interact with the UI directly or at a distance using raycasting. You can also move the UI around by grabbing it.
Test your interaction with an APK
Build your project into an .apk to test it on your headset.
Make sure your headset is connected to your development machine.
In Unity Editor, select File > Build Settings. Add your scene to the Scenes In Build list by dragging it from the Project panel or clicking Add Open Scenes.
Click Build and Run to generate an .apk and run it on your headset. In the File Explorer that opens, select a location to save the .apk to and give it a name. The build process may take a few minutes.
In your headset, interact with the UI directly or at a distance using raycasting. You can also move the UI around by grabbing it.