The Design & Implementation of Oculus Quest Hand-tracking in Myst

Using the Presence Platform’s upgraded Hand Tracking API, we introduced hand tracking to Myst on the Meta Quest platform with our latest update, titled ‘Hands & More’. We’re super excited to finally let people play Myst on Quest without physical controllers! In this post, we’ll discuss the evolution and iteration of implementing hand tracking in Myst — and especially the additional support we built for it in Unreal Engine 4.27.2.

Guest article by Hannah Gamiel

Hannah Gamiel is Director of Development at Cyan – the studio behind the original ‘Myst’ games – and helped develop ‘Myst (2020)’, which includes VR support. Originally coming from a purely technical background, she now helps lead production on all titles and manages business and technology efforts at Cyan. She has worked on titles such as ‘Myst’ (2020), ‘The Witness’, ‘Braid, Anniversary Edition’, ‘Obduction’, ‘Firmament’ (coming soon!), and more.

Design phase and considerations

Design of navigation for hand tracking

Imagine indicating where you want to go. You were probably thinking of pointing, right? That’s why we chose a ‘pointing’ method for movement in Myst.

When in teleport mode, you can point to where you want to go and the teleport ring will appear at your destination. When you ‘un-point’ (by extending the rest of your fingers, or by retracting your index finger back into your palm), the teleport is executed.

When in smooth motion mode, pointing with your dominant free-motion hand (configurable in our control settings, but defaulting to the left hand) will smoothly move you in the direction you’re pointing.

When testing motion with pointing, we found that hand tracking can sometimes be unreliable for the index and middle fingers when they’re occluded by the rest of the hand. The system isn’t sure whether these fingers are fully pointing or fully ‘enclosed’ in your hand. We added a bit of a ‘fudge’ factor to the code to allow for more stable motion initiation and execution on this front — which we’ll get into later when we discuss changes made to the out-of-the-box hand tracking support in Unreal Engine.
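To make that concrete, here’s a minimal sketch of the kind of hysteresis such a ‘fudge’ factor implies. The `GetIndexFingerCurl()` helper and both thresholds are illustrative assumptions, not Cyan’s actual code or values:

```cpp
// Hypothetical helper: returns 0.0 when the index finger is fully
// extended and 1.0 when it is fully curled into the palm.
float GetIndexFingerCurl();

bool bIsPointing = false;

void UpdatePointGesture()
{
    const float Curl = GetIndexFingerCurl();

    // Asymmetric enter/exit thresholds (hysteresis): once pointing has
    // started, the finger must curl well past the start threshold before
    // the gesture is dropped. This keeps motion stable when tracking
    // jitters on a partially occluded index finger.
    constexpr float StartPointingBelow = 0.25f; // finger nearly straight
    constexpr float StopPointingAbove  = 0.60f; // finger clearly curled

    if (!bIsPointing && Curl < StartPointingBelow)
    {
        bIsPointing = true;   // begin aiming the teleport / smooth motion
    }
    else if (bIsPointing && Curl > StopPointingAbove)
    {
        bIsPointing = false;  // 'un-point': execute teleport / stop moving
    }
}
```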

Turning

The ‘point’ method does not work for all navigation needs, though. For turning, we initially combined pointing with wrist rotation: comparing the forward vector of the player’s wrist against the camera’s forward vector would indicate the direction of the twist (and how large the twist should be). We tried this first because it seemed intuitive to keep the “pointing” theme consistent across all navigation modes.
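As a rough illustration of that comparison (our sketch, not Cyan’s actual code), the signed yaw between the two forward vectors gives both the turn direction and its magnitude:

```cpp
// Positive result = twist to the right, negative = twist to the left
// (flip the sign convention if your setup differs).
float GetWristTwistYawDegrees(const FTransform& WristTransform,
                              const FTransform& CameraTransform)
{
    // Project both forward vectors onto the horizontal plane so that
    // pitching the hand up or down doesn't register as a turn.
    FVector WristFwd  = WristTransform.GetUnitAxis(EAxis::X);
    FVector CameraFwd = CameraTransform.GetUnitAxis(EAxis::X);
    WristFwd.Z  = 0.f; WristFwd.Normalize();
    CameraFwd.Z = 0.f; CameraFwd.Normalize();

    const float Dot  = FMath::Clamp(FVector::DotProduct(WristFwd, CameraFwd), -1.f, 1.f);
    const float Sign = FMath::Sign(FVector::CrossProduct(CameraFwd, WristFwd).Z);
    return Sign * FMath::RadiansToDegrees(FMath::Acos(Dot));
}
```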

However, complications arose in comfort tests. In playtesting, most players would point forward with their palm facing the ground, as you would probably do when pointing at something outside of a game as well. Rotating your wrist left and right (around the wrist’s upward axis) while keeping your palm facing the ground is challenging and has a very limited range of motion, especially when twisting away from your chest.

The problem is the same even if you ask a player to point at something in front of them with their palm facing inward. You can bend your wrist in toward your body quite a bit, but you won’t get the same range of motion bending your wrist away from your body.

So how did we solve this? We ended up assigning turning to a ‘thumbs-up’ gesture instead of a finger-pointing gesture.

Imagine giving a thumbs-up. Now twist your wrist to the left and right. Note that even though you don’t have a large range of motion, it’s still pretty easy to consistently indicate either ‘left’ or ‘right’ with your thumb in this gesture.

This is what we settled on for turning in hand tracking mode. Although turning by pointing with your thumb doesn’t seem like the most intuitive choice, it did end up being the most comfortable and consistent way to do it.

With snap turning, twisting your wrist left or right from a thumbs-up position initiates a single snap turn. You must then return your hand to the ‘center’ (straight up) position to reset the snap, and additionally wait for a very short cooldown before another snap turn can be initiated.

With smooth turning, twisting your wrist while in a thumbs-up position begins rotating you left or right — with a ‘dead zone’ that prevents a turn from registering until you pass the threshold.
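To make both turn modes concrete, here’s a minimal sketch of a turn update driven by that thumbs-up twist angle. It assumes a hypothetical `AMystPawn` with `bSmoothTurnMode`, `bSnapArmed`, and `SnapCooldown` members; the dead zone, snap angle, cooldown, and turn speed are illustrative, not Cyan’s actual values:

```cpp
// Assumed Pawn members: bool bSmoothTurnMode; bool bSnapArmed; float SnapCooldown;
void AMystPawn::UpdateTurn(float DeltaTime, float ThumbTwistDeg)
{
    constexpr float DeadZoneDeg   = 15.f;   // ignore small twists entirely
    constexpr float SnapAngleDeg  = 45.f;   // rotation applied per snap turn
    constexpr float SmoothDegPerS = 90.f;   // smooth turn speed
    constexpr float SnapCooldownS = 0.25f;  // the 'very short cooldown'

    if (bSmoothTurnMode)
    {
        // Smooth turning: rotate only once the twist leaves the dead zone.
        if (FMath::Abs(ThumbTwistDeg) > DeadZoneDeg)
        {
            AddControllerYawInput(FMath::Sign(ThumbTwistDeg) * SmoothDegPerS * DeltaTime);
        }
    }
    else
    {
        // Snap turning: one snap per twist, then the hand must return to
        // center (and the cooldown must elapse) before the next snap.
        SnapCooldown = FMath::Max(0.f, SnapCooldown - DeltaTime);

        if (bSnapArmed && SnapCooldown <= 0.f && FMath::Abs(ThumbTwistDeg) > DeadZoneDeg)
        {
            AddControllerYawInput(FMath::Sign(ThumbTwistDeg) * SnapAngleDeg);
            bSnapArmed   = false;
            SnapCooldown = SnapCooldownS;
        }
        else if (!bSnapArmed && FMath::Abs(ThumbTwistDeg) < DeadZoneDeg)
        {
            bSnapArmed = true; // hand re-centered; snap is armed again
        }
    }
}
```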

Handling conflicts between gestures and object interaction

Of course, pointing a finger is too broad a gesture to be assumed to be used only for navigation. People will make the same pointing gesture to press buttons or interact with other things in the world, out of habit or their own expectations. It would be pretty jarring to walk up to (but not right up to) a button, point your finger to press it, and then suddenly (and unwantedly) move closer to it in-game — or initiate an unintended teleport!

The way we prevent motion from occurring while the player might be interacting with something is by blocking any motion code from firing while the hand making the ‘move’ gesture is within a certain radius of an interactable object. This radius was tweaked several times, based on playtesting, to find a good ‘sweet spot’.

We found that this sweet spot is about 25 cm from the world-space location of the bone at the tip of the index finger. Myst is full of interactable objects of various sizes (everything from small buttons to very large handles) set in both large open spaces and narrow corridors, so it took a lot of testing to settle on this number. We initially tried 60 cm (about two feet), but that prevented movement from occurring when players still needed to get closer to an object. Likewise, anything under 25 cm triggered unwanted player movement when players tried to grab or touch an object.

One of our best test areas was the generator room on Myst Island, where you walk through a narrow entrance and are immediately greeted by a panel full of buttons. When the interaction-check radius was too large, players couldn’t move through the entrance and toward the panel, because buttons were detected within reach of the index finger.

That said, 25 cm is what worked specifically for Myst. Other games may need to adjust this number if they implement something similar, with their own criteria in mind.
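A minimal sketch of that locomotion gate might look like the following. The 25 cm radius is the playtested value from above; the dedicated collision channel for interactables and the overlap query setup are our assumptions for illustration:

```cpp
// Returns true if the pointing hand is far enough from every interactable
// for a motion gesture to be allowed to fire.
bool CanHandInitiateMovement(UWorld* World, const FVector& IndexTipWorldLocation)
{
    constexpr float InteractableRadiusCm = 25.f; // the playtested 'sweet spot'

    TArray<FOverlapResult> Overlaps;
    const FCollisionShape Sphere = FCollisionShape::MakeSphere(InteractableRadiusCm);

    // Assumes interactable objects live on their own trace channel.
    World->OverlapMultiByChannel(Overlaps, IndexTipWorldLocation,
                                 FQuat::Identity, ECC_GameTraceChannel1, Sphere);

    // Any interactable within reach of the fingertip suppresses motion, so
    // pointing at a nearby button never teleports or moves the player.
    return Overlaps.Num() == 0;
}
```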

Designing object interactions for hand tracking

Right now, all grabbable interactions in Myst are built to work with hand tracking — turning valves, opening doors, pressing buttons, turning book pages, and so on.

The interactions piggy-back off what we had already set up in Myst for Touch controllers. There, pressing the grip button automatically blends the in-game mesh representation of your hand into a “gripped” pose, either curling your hand into a fist (if empty) or grabbing an object. With hand tracking, we’ve added code that makes an educated guess as to when you’ve curled your fingers enough to ‘grab’ something, and then kicks off the same logic as before.
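As a sketch of what that ‘educated guess’ might look like, here’s a simple average-curl heuristic. The finger enum, curl helper, and threshold are illustrative assumptions, not Cyan’s actual implementation:

```cpp
// Hypothetical helper: 0.0 = finger straight, 1.0 = fully curled.
enum class EFingerType { Index, Middle, Ring, Pinky };
float GetFingerCurl(EFingerType Finger);

bool ShouldTriggerGrab()
{
    // Average the curl of the four non-thumb fingers.
    float TotalCurl = 0.f;
    for (EFingerType Finger : { EFingerType::Index, EFingerType::Middle,
                                EFingerType::Ring,  EFingerType::Pinky })
    {
        TotalCurl += GetFingerCurl(Finger);
    }

    // Past this threshold we run the same grip logic that the Touch
    // controllers' grip button drives.
    return (TotalCurl / 4.f) > 0.7f;
}
```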

For example, when you’re using hand tracking and your hand hovers over something grabbable, the hand mesh is highlighted orange (exactly what happens when you’re not using hand tracking in Myst VR, too). When you grab an interactable object by starting to curl your fingers into a fist, an orange sphere replaces your hand mesh and represents where the hand is attached to the object.

The reason we went with this method — instead of making custom poseable meshes for your hands, or having your hands and fingers appear to physically interact with parts of these objects — is that we wanted the interactions to stay at parity with what we currently offer on the Touch controller side.

However, pressing buttons works differently. There’s no need for abstraction here, since buttons aren’t graspable objects; instead, we let you simply press a button using capsule colliders generated between each of the finger joints of the poseable hand mesh. You can do all sorts of weird and funny things because of this — like using only your pinky, or the knuckle of your ring finger, to interact with every single button in the game if you really want to.
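A rough sketch of generating those press colliders at runtime, assuming a poseable hand mesh with per-joint bones (the bone names and capsule sizes here are illustrative):

```cpp
// Spawns a capsule collider spanning each pair of adjacent finger joints,
// so any part of any finger can physically push a button.
void AddFingerCapsules(UPoseableMeshComponent* HandMesh, AActor* Owner)
{
    const TArray<TPair<FName, FName>> JointPairs = {
        { "index_1", "index_2" }, { "index_2", "index_3" },
        { "middle_1", "middle_2" } /* ...and so on for every finger... */
    };

    for (const auto& Pair : JointPairs)
    {
        const FVector A = HandMesh->GetBoneLocationByName(Pair.Key,   EBoneSpaces::WorldSpace);
        const FVector B = HandMesh->GetBoneLocationByName(Pair.Value, EBoneSpaces::WorldSpace);

        UCapsuleComponent* Capsule = NewObject<UCapsuleComponent>(Owner);
        Capsule->SetCapsuleSize(/*Radius=*/0.8f, /*HalfHeight=*/FVector::Dist(A, B) * 0.5f);
        Capsule->RegisterComponent();
        Capsule->SetWorldLocationAndRotation((A + B) * 0.5f,
                                             FRotationMatrix::MakeFromZ(B - A).Rotator());

        // Attach to the parent joint so the capsule follows the posed finger.
        Capsule->AttachToComponent(HandMesh,
                                   FAttachmentTransformRules::KeepWorldTransform, Pair.Key);
    }
}
```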

This implementation differs slightly from how Touch controllers interact with in-game buttons, where we typically expect players to use the grip button on their controller to pose the hand into a “finger pointing” mesh and make precise button presses with the fingertip. With hand tracking, there is obviously far more flexibility in the poses you can make with your hand, and therefore far more ways to press buttons with the same accuracy.

Menu/UI interactions

To interact with menus, we went with the same interaction paradigm Meta uses for the Quest platform: a two-finger pinch between the thumb and index finger, on either hand. This can be used both to open our in-game menu and to interact with items in it. There was no point reinventing the wheel here, since players are already taught this gesture in the OS-level menus when they first enable hand tracking on Quest!
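A simple fingertip-distance check is one way to approximate that pinch, as sketched below; the bone lookup helper and the threshold are assumptions (the Quest runtime also reports pinch state directly, which is another option):

```cpp
// Hypothetical helpers: hand side and fingertip bone lookup.
enum class EHandSide { Left, Right };
FVector GetBoneWorldLocation(EHandSide Hand, FName BoneName);

bool IsMenuPinching(EHandSide Hand)
{
    const FVector ThumbTip = GetBoneWorldLocation(Hand, TEXT("thumb_tip"));
    const FVector IndexTip = GetBoneWorldLocation(Hand, TEXT("index_tip"));

    // Fingertips closer than ~1.5 cm read as a deliberate pinch
    // (the threshold is an illustrative assumption).
    return FVector::Dist(ThumbTip, IndexTip) < 1.5f;
}
```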

Communicating all of this to the player

Because hand tracking isn’t as common an input method on Quest as Touch controllers, and because some people may be playing Myst for the very first time (or even playing their very first VR game!), we tried to be considerate in how we communicate all this hand tracking information to the player. We made sure to include a version of our “controller diagram” specifically tailored to describe hand tracking interactions (shown when hand tracking is enabled in Myst), and we show the player specialized notifications that explain exactly how to move around with their hands.

Additionally, we thought it important to remind players how to have a smooth hand tracking experience once it’s enabled. Players are notified in Myst’s menu that hand tracking stability is much better if they ensure they’re in a well-lit room and keep their hands within their field of view.

Meta also informs players that these factors are key to a well-tracked hand tracking environment, but we recognize that some players may jump into a game without carefully reading Meta’s messaging about this first, so we’ve chosen to remind people in case they’ve forgotten.
