Meta Reality Labs Multimodal Interaction

AR/VR Product Design / HCI

At Meta Reality Labs, I work on defining what AI hardware and XR interaction will look like over the next 3–5 years. As a lead product designer, I collaborate with scientists, engineers, and researchers to turn emerging technologies into clear interaction visions and meaningful user experiences.

AI Voice Interface for Quest Horizon OS

Selected Published Works

I led the end-to-end design of AI voice interaction for Meta Quest, architecting the system to support a conversational AI interface. Key projects include:

  1. User barge-in and follow-up logic

  2. Camera-based intent inference

  3. Audio concurrency

  4. Visual and earcon feedback

Starting with Horizon OS V85, users can navigate Quest hands-free, performing all OS-level input primitives with voice and a head cursor (a rough sketch of the turn-taking logic follows).
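
To make the turn-taking pieces above concrete, here is a minimal TypeScript sketch of the barge-in and follow-up pattern; the state names, event handlers, and 4-second window are my own illustrative assumptions, not Horizon OS APIs.

```typescript
// Minimal sketch of assistant turn-taking with barge-in and follow-ups.
// All names and the follow-up window length are illustrative assumptions,
// not Horizon OS APIs.

type VoiceState = "idle" | "listening" | "responding" | "followUpWindow";

const FOLLOW_UP_WINDOW_MS = 4000; // assumed duration, for illustration

class VoiceSession {
  private state: VoiceState = "idle";
  private followUpTimer?: ReturnType<typeof setTimeout>;

  // Wake word or button press opens the microphone.
  onWake(): void {
    this.state = "listening";
  }

  // Speech detected while the assistant is talking is a barge-in:
  // cut assistant audio immediately and hand the turn back to the user.
  onUserSpeechStart(): void {
    if (this.state === "responding") {
      this.stopAssistantAudio();
      this.state = "listening";
    } else if (this.state === "followUpWindow") {
      clearTimeout(this.followUpTimer); // follow-up arrived in time
      this.state = "listening";
    }
  }

  onAssistantResponseStart(): void {
    this.state = "responding";
  }

  // After a response, keep the mic open briefly so the user can ask a
  // follow-up without repeating the wake word.
  onAssistantResponseEnd(): void {
    this.state = "followUpWindow";
    this.followUpTimer = setTimeout(() => {
      this.state = "idle"; // window expired; wake word required again
    }, FOLLOW_UP_WINDOW_MS);
  }

  private stopAssistantAudio(): void {
    // Placeholder: stop or duck TTS playback here (this is where the
    // "audio concurrency" rules decide how other app audio is mixed).
  }
}
```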

Input Paradigm for Neural Interface

Selected Published Works

I led the exploration of input paradigms and experiences for XR interaction, establishing the interaction design foundation for the world’s first neural interface for AR glasses on Meta Ray-Ban Display and future products.
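
As a rough illustration of what an input paradigm for a neural wristband can look like, the sketch below maps a small gesture vocabulary onto generic input primitives; the gesture names, types, and mapping are hypothetical and not taken from any Meta SDK.

```typescript
// Hypothetical sketch: mapping wrist-based neural gesture events onto
// the same OS-level input primitives a controller would produce. The
// gesture names and types are illustrative assumptions only.

type NeuralGesture = "pinch" | "doublePinch" | "thumbSwipeLeft" | "thumbSwipeRight";

type InputPrimitive =
  | { kind: "select" }
  | { kind: "back" }
  | { kind: "scroll"; direction: "left" | "right" };

// Keep the gesture vocabulary small and reliable: targeting happens
// elsewhere (e.g., a head or wrist cursor), so gestures only need to
// confirm and navigate.
function toPrimitive(gesture: NeuralGesture): InputPrimitive {
  switch (gesture) {
    case "pinch":
      return { kind: "select" };
    case "doublePinch":
      return { kind: "back" };
    case "thumbSwipeLeft":
      return { kind: "scroll", direction: "left" };
    case "thumbSwipeRight":
      return { kind: "scroll", direction: "right" };
  }
}
```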

Adaptive Tangible User Interface

Selected Published Works

Exploring tangible UI in XR by repurposing everyday objects, turning the physical world into an adaptive interface.
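
One way to read "adaptive" here: the system decides which virtual control a recognized object should carry based on its physical affordances. The sketch below is a hypothetical illustration; the object labels, types, and mapping are assumptions, not a shipped system.

```typescript
// Hypothetical sketch of the adaptive tangible-UI idea: a recognized
// everyday object is repurposed as an input surface by anchoring a
// virtual control to it. Types and labels are illustrative only.

interface DetectedObject {
  id: string;
  label: "mug" | "book" | "tabletop"; // from a scene-understanding model
  pose: {
    position: [number, number, number];
    rotation: [number, number, number, number];
  };
}

interface VirtualControl {
  kind: "dial" | "button" | "slider";
  anchorId: string; // the physical object this control follows
}

// Map object affordances to controls: a mug twists like a dial, a flat
// tabletop affords sliding, a book cover presses like a button.
function adaptControl(obj: DetectedObject): VirtualControl {
  switch (obj.label) {
    case "mug":
      return { kind: "dial", anchorId: obj.id };
    case "tabletop":
      return { kind: "slider", anchorId: obj.id };
    case "book":
      return { kind: "button", anchorId: obj.id };
  }
}
```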

Want to know more?

This is a high-level overview. Happy to share the full case study if you'd like more detail.
