Using Your Hands in Virtual Reality

By
John P. Martin
January 17, 2023

In virtual reality, there is rarely a single way of doing something. Depending on the game or application you're building, there can be several ways to achieve the same result; whether you're configuring how the player moves or how an enemy attacks, there is always more than one path to get it done. In this article, let's look at how we interact in VR. Most of the time, you use hand controllers to navigate and interact with your environment, and controllers ship with nearly every headset: the Oculus Quest, HTC Vive, and Pico Neo, to name a few. But what if you wanted to drop the controllers and use your bare hands? The question we're exploring here is whether our hands can work as well as, or better than, the controllers.

I had to deal with this type of interaction in a recent project. As developers, we always have to keep an open mind; that mindset is what lets us find the best option for enhancing these already impressive applications. Without getting too technical, let's talk about the obstacles I ran into on this particular project. The project uses Microsoft's MRTK, the Mixed Reality Toolkit. Microsoft built this toolkit for the HoloLens, which tracks your hands in real time using the cameras and sensors embedded in the front of the device and lets you handle and manipulate 3D models. The HoloLens, however, is an AR (augmented reality) device, not a virtual reality one. So how do we make this work for VR? The answer is quite simple: we use Unity.

MRTK

Unity is a game engine, and a very popular one, I might add. This fantastic software lets you create virtual worlds with relative ease, thanks in part to its extensive documentation. You can build for both AR and VR, which means you can target a wide range of devices, including the Microsoft HoloLens. Because Unity lets you develop for the HoloLens, it only makes sense that Unity can also incorporate the MRTK SDK (software development kit). This is where it gets interesting, because you can use this SDK on other supported devices, such as the Oculus Quest 2. Will the MRTK work for the Quest? It's a difficult question to answer, as it depends on the type of application you are trying to build.
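To give a sense of what that looks like in practice, here is a minimal sketch of how an object is typically made grabbable with MRTK hand tracking in Unity. It assumes MRTK v2 is installed and a profile with articulated hand tracking is active in the scene; the ObjectManipulator and NearInteractionGrabbable components come from that toolkit, while the script name SetupGrabbable is just an illustrative choice, not part of the SDK.

```csharp
using UnityEngine;
using Microsoft.MixedReality.Toolkit.UI;    // ObjectManipulator
using Microsoft.MixedReality.Toolkit.Input; // NearInteractionGrabbable

// Sketch: make a 3D model grabbable and movable with tracked hands (MRTK v2).
// Attach to any GameObject that already has a Collider.
public class SetupGrabbable : MonoBehaviour
{
    private void Awake()
    {
        // Lets the object be moved and rotated by hand or controller input.
        gameObject.AddComponent<ObjectManipulator>();

        // Required so articulated (tracked) hands can grab the object up close.
        gameObject.AddComponent<NearInteractionGrabbable>();
    }
}
```

In practice you would usually add these components in the Inspector rather than from code, but the point is the same: with a collider and these two components, a tracked hand can reach out and manipulate the object.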

HoloLens

In the case of the project I was working on, the MRTK was not quite hitting the mark. The way the hands interacted with objects and models did not feel natural, and the way things interacted with one another felt disjointed and clumsy. Rotating an object around a single axis, for example, proved difficult, and simply picking an object up without it falling out of your hand felt like an arduous chore. Let's be honest: if your application gets little things like this wrong, no one will want to use it. So, like any developer, I tried to troubleshoot as best I could. I went through the documentation for both the MRTK and Unity, swapped in different components, and rewrote scripts, but it made no noticeable difference in the application. I needed something other than the Mixed Reality Toolkit to fix these issues.

After a short conversation with a colleague, I found that another SDK, in this case Oculus Integration, worked best. That only makes sense, since the same company that developed the Oculus Quest also develops this SDK. Keep in mind that hand tracking is still relatively new to the Oculus Quest; it was first introduced in early 2020. After installing the SDK and setting everything up in Unity, I started testing and found that the hand tracking was smoother and the interactions more fluid. I could easily pick models up without them falling out of my hand, and I could rotate objects around a single axis with relative ease. Does this mean everything will work thanks to this SDK? No. It came with its own set of issues, and I found that Oculus Integration fell short in specific areas where the MRTK did well, and vice versa. Each SDK has its pros and cons, and it is up to us, the developers, to work out which one suits a given application or game.
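As a rough illustration of how little code it takes to read hand input with Oculus Integration, here is a minimal pinch-to-grab sketch. It assumes an OVRCameraRig with hand tracking enabled and an OVRHand component assigned in the Inspector; the script name PinchGrab and the re-parenting logic are my own simplification for illustration, not the SDK's built-in grab system.

```csharp
using UnityEngine;

// Sketch: pick an object up by pinching with a tracked hand (Oculus Integration).
// Assumes hand tracking is enabled on an OVRCameraRig and that "hand" is the
// OVRHand component on the right (or left) hand anchor.
public class PinchGrab : MonoBehaviour
{
    public OVRHand hand;          // Tracked hand to read pinch input from.
    public Transform grabTarget;  // Object to pick up while pinching.

    private Transform originalParent;

    private void Update()
    {
        // Only react when the headset actually has a track on the hand.
        if (hand == null || !hand.IsTracked)
            return;

        bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        if (pinching && grabTarget.parent != hand.transform)
        {
            // Grab: parent the object to the hand so it follows it.
            originalParent = grabTarget.parent;
            grabTarget.SetParent(hand.transform, worldPositionStays: true);
        }
        else if (!pinching && grabTarget.parent == hand.transform)
        {
            // Release: restore the original parent and let the object go.
            grabTarget.SetParent(originalParent, worldPositionStays: true);
        }
    }
}
```

In a real project you would lean on the grab and interaction components that ship with the package rather than re-parenting transforms by hand, but the sketch shows how directly you can read pinch state from a tracked hand, which is part of why the interactions felt smoother.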

The Oculus Integration SDK alleviated those issues, but it left me facing a new, and more significant, problem: the entire application had to be reworked to strip out the MRTK in favor of the Oculus SDK, because the two could not work in tandem. Developing for VR and AR is never linear; it's nothing if not trial and error, and what works on one device may not work on another. The takeaway is to try different methods and ideas, or, in this case, a different SDK. Sometimes it's best to take a step back rather than reinvent the wheel to get a project working. And sometimes all it takes is a conversation with a friend or colleague to spark an idea.