Game Engine Issues · Mar 20, 2026

Unity Quest Hand Tracking Not Working on Meta Quest 3 - XR Input Fix

Fix Meta Quest 3 hand tracking in Unity when hands do not appear, pinch gestures fail, or XR input actions never trigger. Step-by-step OpenXR and Meta XR configuration guide.

By GamineAI Team

Problem: Hand tracking does not work on Meta Quest 3 in Unity. Hands may never render, pinch/select does nothing, or input actions are stuck in "not performed."

Quick Solution: Most failures come from one of four causes: missing OpenXR hand-tracking feature group, Quest runtime permissions not granted, wrong interaction profiles, or input actions not bound to hand controllers.

This guide gives you a reliable setup and troubleshooting flow so hand tracking works in Editor testing and device builds.

Why Quest 3 Hand Tracking Fails in Unity

Quest hand tracking depends on several layers:

  • XR provider and feature setup (OpenXR or Meta XR)
  • Android/Quest build target and permissions
  • Interaction profiles and input action bindings
  • Runtime mode on headset (controller-only vs hands and controllers)

If one layer is wrong, Unity usually falls back to controllers or returns no input.
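A quick runtime check can tell you which layer is failing. The sketch below assumes the XR Plug-in Management and XR Hands packages (`com.unity.xr.management`, `com.unity.xr.hands`) are installed; attach it to any scene object and read the log on startup.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;       // com.unity.xr.hands package (assumed installed)
using UnityEngine.XR.Management;  // com.unity.xr.management package

public class HandTrackingDiagnostics : MonoBehaviour
{
    void Start()
    {
        // Layer 1: is any XR loader active at all?
        var loader = XRGeneralSettings.Instance != null
            ? XRGeneralSettings.Instance.Manager.activeLoader
            : null;
        Debug.Log(loader != null
            ? $"Active XR loader: {loader.name}"
            : "No active XR loader - check XR Plug-in Management.");

        // Layer 2: did a hand subsystem start? An empty list usually means
        // the hand-tracking feature is disabled or the runtime denied it.
        var handSubsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(handSubsystems);
        if (handSubsystems.Count == 0)
            Debug.LogWarning("No XRHandSubsystem found - hands will not track.");
        else
            Debug.Log($"Hand subsystem running: {handSubsystems[0].running}");
    }
}
```

If the loader is present but no hand subsystem exists, start with Solution 1; if the subsystem exists but is not running, look at the headset-side settings in Solution 2.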

Solution 1: Verify XR Plugin and Feature Setup

  1. Open Edit > Project Settings > XR Plug-in Management.
  2. Ensure OpenXR is enabled for Android.
  3. Open the OpenXR section and enable Meta/hand-tracking related feature groups.
  4. Disable conflicting providers for Android so only your intended XR path is active.

Verification: In Project Settings, OpenXR is active for Android and hand-related features are enabled.
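You can also assert the feature state from code rather than eyeballing Project Settings. This sketch assumes the OpenXR plug-in (`com.unity.xr.openxr`) plus XR Hands, whose `HandTracking` feature class lives in `UnityEngine.XR.Hands.OpenXR`:

```csharp
using UnityEngine;
using UnityEngine.XR.OpenXR;        // com.unity.xr.openxr (assumed)
using UnityEngine.XR.Hands.OpenXR;  // HandTracking feature from com.unity.xr.hands

public class OpenXRFeatureCheck : MonoBehaviour
{
    void Start()
    {
        var settings = OpenXRSettings.Instance;
        if (settings == null)
        {
            Debug.LogError("OpenXR settings not found - is OpenXR the active loader?");
            return;
        }

        // GetFeature returns null if the feature is absent for this build target.
        var handTracking = settings.GetFeature<HandTracking>();
        Debug.Log(handTracking != null && handTracking.enabled
            ? "OpenXR Hand Tracking feature is enabled."
            : "OpenXR Hand Tracking feature is missing or disabled.");
    }
}
```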

Solution 2: Confirm Meta Quest Runtime Hand Tracking Mode

On the headset:

  1. Open device settings for hand tracking.
  2. Enable Hands and Controllers (or equivalent hybrid mode).
  3. Confirm apps are allowed to use hand tracking.
  4. Reboot headset after changing mode to clear stale runtime state.

Verification: System-level UI can detect your hands outside Unity apps.

Solution 3: Fix Input Action Bindings for Hands

If hands are visible but gestures do not trigger:

  1. Open your Input Actions asset.
  2. Add or verify bindings for hand interaction paths used by your XR toolkit.
  3. Ensure action maps are enabled at runtime.
  4. Avoid duplicate maps that consume the same actions first.

Common issue: Only controller bindings exist, so hand gestures never hit gameplay actions.

Verification: Debug logs show performed/canceled events when pinching or grabbing.
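A minimal logger makes that verification concrete. The sketch below assumes the Input System package and an `InputActionReference` you assign in the Inspector (for example your select/pinch action); the hand binding path in the comment is illustrative and varies by toolkit and version.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;  // com.unity.inputsystem (assumed)

public class HandActionLogger : MonoBehaviour
{
    // Assign your pinch/select action in the Inspector. For hands it needs a
    // hand binding, e.g. something like <MetaAimHand>{LeftHand}/indexPressed
    // (exact path depends on your toolkit and package versions).
    [SerializeField] private InputActionReference pinchAction;

    void OnEnable()
    {
        if (pinchAction == null) return;
        pinchAction.action.performed += OnPerformed;
        pinchAction.action.canceled += OnCanceled;
        pinchAction.action.Enable();  // action maps are NOT enabled automatically
    }

    void OnDisable()
    {
        if (pinchAction == null) return;
        pinchAction.action.performed -= OnPerformed;
        pinchAction.action.canceled -= OnCanceled;
    }

    private void OnPerformed(InputAction.CallbackContext ctx) =>
        Debug.Log($"Pinch performed via {ctx.control.device.name}");

    private void OnCanceled(InputAction.CallbackContext ctx) =>
        Debug.Log("Pinch canceled");
}
```

If the performed log shows a controller device name while you are pinching, the action is being consumed by a controller binding first, which is the duplicate-map problem from step 4.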

Solution 4: Check XR Origin and Hand Prefab Wiring

With XR Interaction Toolkit style setups:

  1. Ensure your XR Origin has left/right hand objects assigned.
  2. Confirm hand visual prefabs are referenced and active.
  3. Check tracking components are enabled on both hands.
  4. Remove duplicate rigs in the scene that may fight for tracking ownership.

Verification: Hands render and move with your real hand motion in Play/build.
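Duplicate rigs are easy to miss, especially with additively loaded scenes. A small startup check, assuming the `XROrigin` type from XR Core Utilities (`com.unity.xr.core-utils`, used by XR Interaction Toolkit), can flag them:

```csharp
using Unity.XR.CoreUtils;  // XROrigin (com.unity.xr.core-utils, assumed)
using UnityEngine;

public class RigSanityCheck : MonoBehaviour
{
    void Start()
    {
        // More than one XROrigin means two rigs fighting for tracking ownership.
        var origins = FindObjectsOfType<XROrigin>();
        if (origins.Length > 1)
            Debug.LogError($"Found {origins.Length} XROrigin instances - remove duplicates.");
        else if (origins.Length == 0)
            Debug.LogError("No XROrigin in scene - hands have nothing to attach to.");
    }
}
```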

Solution 5: Build Settings and Permissions for Quest 3

  1. Set Build Target to Android.
  2. Confirm minimum API and graphics settings match current Quest requirements.
  3. In Player settings, ensure XR and required permissions are included.
  4. Rebuild cleanly after major XR setting changes.

Verification: Fresh APK runs on Quest 3 and hand input initializes during startup.
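If hand input never initializes on device, inspect the merged AndroidManifest in the build. The entries below are the Meta-documented hand-tracking permission and feature declaration; recent Meta XR and OpenXR packages typically inject them automatically when the feature is enabled, so treat this as a verification target rather than something to always add by hand.

```xml
<!-- Quest hand tracking: permission plus feature declaration -->
<uses-permission android:name="com.oculus.permission.HAND_TRACKING" />
<!-- required="false" keeps the app installable on controller-only setups -->
<uses-feature
    android:name="oculus.software.handtracking"
    android:required="false" />
```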

Alternative Fixes for Stubborn Cases

  • Delete Library folder and reimport if XR packages desynced after upgrades.
  • Update Meta XR/OpenXR packages to compatible versions.
  • Test in a minimal blank XR scene to isolate project-level conflicts.
  • Disable custom input wrappers temporarily and test raw action callbacks.

Prevention Tips

  • Keep one canonical XR setup prefab and reuse it across scenes.
  • Lock package versions per project to avoid silent XR regressions.
  • Add a startup diagnostic panel that reports active XR runtime and interaction mode.
  • Test both controllers and hands before each release build.
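The startup diagnostic panel from the tips above can be as simple as an `OnGUI` overlay. This sketch assumes the same XR Management and XR Hands packages as the earlier diagnostics:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;       // com.unity.xr.hands (assumed)
using UnityEngine.XR.Management;  // com.unity.xr.management

public class XRStatusPanel : MonoBehaviour
{
    void OnGUI()
    {
        var loader = XRGeneralSettings.Instance != null
            ? XRGeneralSettings.Instance.Manager.activeLoader
            : null;

        var hands = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(hands);
        bool handsRunning = hands.Count > 0 && hands[0].running;

        // On-screen summary of the XR runtime state for quick device checks.
        GUILayout.Label($"XR loader: {(loader != null ? loader.name : "none")}");
        GUILayout.Label($"Hand tracking: {(handsRunning ? "running" : "inactive")}");
    }
}
```

Wrap the panel in a debug-build check before release so it never ships to players.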

Related Problems and Links

Bookmark this fix for your XR deployment checklist. Share it with your team if it saved a debugging session.

FAQ

Do I need controllers paired for hand tracking to work?
Not always, but hybrid mode and runtime configuration can affect fallback behavior. Test both controller-present and controller-absent scenarios.

Why does hand tracking work in one scene but not another?
Usually scene wiring differences: missing action map enable calls, wrong XR Origin prefab, or duplicate rigs.

Should I use OpenXR or a Meta-specific path?
Use one consistent path per project and keep package versions aligned. Mixed setups are a frequent cause of intermittent failures.