Meta Reality Labs "Whisperer"
Voice interactions represent a new frontier in VR immersion. Imagine a future in which you can talk to non-player characters, navigate complex systems, and have virtual worlds respond to your commands in a way that feels natural and rich with possibility. To get there, developers will have to learn how to use Voice tools within their experiences. So Meta Reality Labs asked us: how can we teach devs to use the Quest platform’s Voice SDK, and get them excited about these new features? Our answer: a cozy demo we call Whisperer.
A new frontier in VR immersion.
In Whisperer, you play a ghost trying to transcend the mortal plane. To do so, you must help the elderly heroine of the story rekindle her love of gardening. The only tools at your disposal are your voice, your wits, and scattered hints from a fussy macaw!
Try it out here!
Guided by Voice SDK
The beating heart of our demo is Wit.ai – a natural language interface that turns speech into structured data. This means you don't have to use precise keywords or phrases to trigger interactions in Whisperer: the system interprets what you say, maps it to a specific intent, and then executes that intent.
For example: Say you tell an object to "go to the right by a bit" (in the demo, you can move objects by talking to them), while another user says "jump far to the left." Both prompts are funneled into a single "Move" intent, and the correct response is triggered. The system adapts to the user, rather than the user having to adapt their own language to the system’s limited understanding.
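The flow above can be sketched in a few lines. This is an illustrative Python sketch, not the demo's actual code: the payload shape mirrors Wit.ai's `/message` response (a list of intents ranked by confidence), while the handler names and threshold value are hypothetical.

```python
# Illustrative intent dispatch over a Wit.ai-style response.
# The "intents" list shape follows Wit.ai's /message endpoint;
# handlers and the threshold below are assumptions for this sketch.

CONFIDENCE_THRESHOLD = 0.7  # assumed tuning value, not from the demo


def handle_move(utterance):
    return f"moving object per: {utterance!r}"


def handle_unknown(utterance):
    return f"no confident intent for: {utterance!r}"


HANDLERS = {"Move": handle_move}


def dispatch(wit_response, utterance):
    """Pick the top-confidence intent and route to its handler."""
    intents = wit_response.get("intents", [])
    if intents and intents[0]["confidence"] >= CONFIDENCE_THRESHOLD:
        handler = HANDLERS.get(intents[0]["name"], handle_unknown)
    else:
        handler = handle_unknown
    return handler(utterance)


# Two differently phrased commands resolve to the same "Move" intent:
resp_a = {"intents": [{"name": "Move", "confidence": 0.93}]}
resp_b = {"intents": [{"name": "Move", "confidence": 0.88}]}
print(dispatch(resp_a, "go to the right by a bit"))
print(dispatch(resp_b, "jump far to the left"))
```

The key point is that the phrasing never matters to the game logic: by the time a command reaches the dispatcher, Wit.ai has already collapsed it into a named intent.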
Why this (ghost) story?
With two months to create our demo, we knew we needed to build in a few constraints up front. To keep users laser-focused on learning how to use voice in VR, we set up a few ground rules:
-> We’d remain in one location throughout, and
-> the user wouldn't be able to rely on movement or tactile interactions to drive the experience.
We needed to create a narrative context in which the user would accept not being able to touch anything and could only interact via the power of suggestion. A freshly minted ghost hit the mark!
Building the Greenhouse
BUCK wrote the entire story, built and animated the world, and designed the UI (a.k.a. stuff you already knew we were all about) – but there's also a tremendous amount of technical artistry and UX thinking under the hood.
As an example, in the demo, your hands act as a reticle to select objects you're then able to command with your voice. This makes for an intuitive metaphor, free of extraneous UI elements, that first-time users can pick up with ease. But tuning the range of movement so this selection mechanic felt right was the product of extensive testing and development, which led to a unique ray-casting solution we haven’t yet seen in VR.
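The basic idea of hand-as-reticle selection can be sketched with a standard ray-sphere test. To be clear, this is a generic illustration in Python: the demo's actual ray-casting solution is not described here, and every name in this sketch (`ray_hits_sphere`, `select`, the object list) is hypothetical.

```python
import math

# Generic hand-as-reticle selection sketch: cast a ray from the hand
# along its pointing direction and pick the nearest object whose
# bounding sphere the ray intersects. Plain ray-sphere math for
# illustration only; not the demo's actual solution.


def ray_hits_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the sphere, or None on miss."""
    oc = [c - o for o, c in zip(origin, center)]
    t = sum(d * v for d, v in zip(direction, oc))  # projection onto ray
    if t < 0:
        return None  # sphere is behind the hand
    closest_sq = sum(v * v for v in oc) - t * t  # perpendicular dist^2
    if closest_sq > radius * radius:
        return None  # ray passes outside the sphere
    return t - math.sqrt(radius * radius - closest_sq)


def select(origin, direction, objects):
    """objects: list of (name, center, radius); return nearest hit name."""
    best = None
    for name, center, radius in objects:
        t = ray_hits_sphere(origin, direction, center, radius)
        if t is not None and (best is None or t < best[0]):
            best = (t, name)
    return best[1] if best else None


# Hand at the origin, pointing straight ahead along +z:
scene = [("watering_can", (0, 0, 5), 1.0), ("pot", (0, 3, 9), 1.0)]
print(select((0, 0, 0), (0, 0, 1), scene))
```

In practice the interesting work is exactly what the paragraph above describes: deciding how forgiving the ray should be (cone width, sticky targets, smoothing of hand jitter) so that selection feels effortless rather than fiddly.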
The ghost in the machine
These kinds of UX challenges factored into every part of the experience – but the net result is that you don't notice them! It all just works and feels, dare we say, a little supernatural.
Since this is ultimately meant to be a teaching tool and source of inspiration for developers, what explicit UI there is exists to help the user make sense of the possibilities, and to discover even deeper interactions on their second or third play-through.*
Find it and Play it!
In addition to being downloadable from the Quest App Lab, all code and assets are 100% open source and available on GitHub.
We're so excited "Whisperer" is out in the world, and can't wait to see what kinds of experiences the Quest developer community is able to build with it!
*You might be able to levitate objects, and cast fireballs you can control with your hands. But we never told you that!
Executive Creative Director
Group Creative Director
Associate Creative Director
Director of Creative Technology
Lead Creative Technologist