This was one of the best times of my career. I was able to explore the outer fringes of design, free from the conventions and copy-paste patterns that dominate the product design world. Unique problems were more than abundant, and the highly technical nature of coding prototypes spoke to my nerdcore roots. One of the most exciting aspects of working in VR was that many problems exist for which there are no known solutions. Navigating the large sets of tools found in programs like Photoshop or 3D modeling applications was one of those problems. Diving into UX for VR meant looking at problems like these and starting to create solutions that could become industry standards.
Everyone has a different workflow, and there's no telling whether people actually work the way I assume they do, or whether they hope for the same design qualities that I do. I collected data through surveys of professional creatives (architects, 3D modelers, designers), then conducted in-person user testing sessions to see how they felt about creating in VR. While almost everyone tested felt that the "point and click" method of navigation didn't impede their first-time explorations in the application, they also felt it would take significantly longer to build a real professional project in VR without all of the special tools and shortcuts available to them on mouse and keyboard.
It was clear that most professionals used keyboard shortcuts and special tools to speed up their workflow and reduce cognitive load, so I began looking into what qualities highly memorable interactions had. Was there a way to design an interface in VR that both allowed users to navigate via muscle memory and provided the guidance that less experienced users would need to find what they were looking for?
I broke down the qualities that define successful rapid tool selection on mouse and keyboard and used them as measuring sticks for the design process. Essentially, good menus, keyboard shortcuts, and gesture-based interactions all rely on the same qualities: positional consistency and a low chance of mis-selection, which together lend themselves to building muscle memory over time. Wanting to capitalize on the 3D nature of VR, I initially focused on gestures, but quickly learned how unsuccessful they were, given that they require the user to have already learned each unique gesture. On top of that, complex gestures can make the user physically tired in VR, and who really wants to learn 100 unique gestures for 100 individual tools? Users needed more structure to learn the ropes, which led me down another track.
It became clear that what I needed was something with a defined, easily memorable structure. What I ultimately came up with was a 3D menu system that allowed less experienced users to explore to find what they needed, and also made it easy for an experienced user to select tools without having to look at the menu. I'll spare you the technical details (there's lots of talk about Euler angles and data objects), but essentially each tool corresponds to a specific angle that's measured by comparing one controller's position to the other.
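The core of that selection logic can be sketched in a few lines. Here's a minimal Python sketch of the idea, assuming a hypothetical tool layout and menu plane (the tool names, sector count, and dead zone are all made up for illustration; a real implementation would read controller poses from the VR runtime):

```python
import math

# Hypothetical layout: eight tools arranged in a ring,
# each owning a 45-degree sector of the menu.
TOOLS = ["brush", "eraser", "extrude", "scale",
         "move", "rotate", "paint", "select"]
SECTOR = 360 / len(TOOLS)
DEAD_ZONE = 0.05  # meters; ignore tiny offsets between controllers

def select_tool(anchor_pos, pointer_pos):
    """Map the angle between two controller positions to a tool.

    anchor_pos and pointer_pos are (x, y, z) positions; the menu
    plane here is assumed to be the vertical x/y plane in front of
    the user. Returns None while the controllers are too close
    together to read a reliable angle.
    """
    dx = pointer_pos[0] - anchor_pos[0]
    dy = pointer_pos[1] - anchor_pos[1]
    if math.hypot(dx, dy) < DEAD_ZONE:
        return None
    angle = math.degrees(math.atan2(dy, dx)) % 360
    return TOOLS[int(angle // SECTOR)]

# Pointing the off hand up and to the right of the anchor hand
# lands at 45 degrees, i.e. the second sector:
print(select_tool((0, 0, 0), (0.2, 0.2, 0)))  # eraser
```

Because the sector boundaries never move, the angle-to-tool mapping stays positionally consistent, which is exactly the property that lets muscle memory form.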
The end result was a menu that was easy to explore, but that an experienced user could operate without looking, provided they knew the unique angle a specific tool occupied. Adding haptic guides that physically signaled where the user was in the menu further reduced the need to look at the menu at all.
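The haptic guides can be modeled as a boundary-crossing check: whenever the selection angle moves into a new sector, fire a short pulse so the user feels a "tick" between tools. A small sketch of that idea, assuming the same hypothetical eight-sector layout (the `pulse` callback stands in for whatever haptics call the VR runtime exposes):

```python
class HapticGuide:
    """Fire a haptic pulse whenever the selection angle crosses
    from one tool sector into the next."""

    def __init__(self, sector_count, pulse):
        self.sector_size = 360 / sector_count
        self.pulse = pulse          # callback into the VR runtime
        self.last_sector = None

    def update(self, angle_degrees):
        sector = int((angle_degrees % 360) // self.sector_size)
        if self.last_sector is not None and sector != self.last_sector:
            self.pulse()            # tick felt on the controller
        self.last_sector = sector
        return sector

pulses = []
guide = HapticGuide(8, lambda: pulses.append(1))
for angle in (10, 20, 50, 60, 100):  # sweep across two boundaries
    guide.update(angle)
print(len(pulses))  # 2 ticks: crossing 45 and 90 degrees
```

The ticks give the same kind of eyes-free feedback a ratcheting physical dial does: the user can count boundaries by feel instead of checking the menu visually.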
Here's a variation of the menu for a controller-free setup, using hand tracking and custom gestures to open the menu. I built it partly out of curiosity, and partly to prove that a menu like this could work in an AR setup.