A revolutionary method of experiencing virtual reality — our hands.
At the Yale Ultraspace Symposium in 2024, I unveiled an experimental virtual reality (VR) approach that integrates physical hand movements with virtual environments. This method transcends traditional VR interactions, offering a more immersive and intuitive experience.
Ceci n’est pas une pipe: This Is Not a Pipe
The Apple
Imagine holding an apple in your hand while blindfolded. Despite the visual deprivation, the tactile sensation and spatial awareness remain vivid. This concept became the foundation for our VR experiments, where the boundary between physical and virtual realities blurs. By recreating this tactile experience in a virtual space, we aimed to bridge the gap between the real and the digital, providing users with a more authentic and engaging interaction.
Experimenting with the trackers
This approach draws heavily on the sensory experiences often overlooked in VR. Most traditional VR systems focus primarily on visual and auditory stimuli, often neglecting the importance of touch. By emphasizing the tactile aspect, we enhance the realism of the virtual environment and make the experience more intuitive and accessible. Users can naturally interact with virtual objects as if manipulating real-world items.
Eyes of the Machine
For VR to accurately overlay a virtual environment onto physical reality, it must first understand its surroundings. This is achieved using an Inertial Measurement Unit (IMU) within the headset. The IMU comprises accelerometers, gyroscopes, and magnetometers, each operating in three dimensions, to provide comprehensive data on the headset’s position and movement. The accelerometers measure linear acceleration, the gyroscopes track angular rotation, and the magnetometers detect orientation relative to the Earth’s magnetic field. Fused together, these sensor streams yield a continuous estimate of the headset’s orientation and motion, ensuring that the virtual environment responds accurately to the user’s actions.
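To make this fusion concrete, the sketch below shows a complementary filter, one common way to blend the gyroscope’s fast but drifting angle integration with the accelerometer’s noisy but drift-free gravity reference. The class and constants are illustrative, not any headset’s actual firmware, and the magnetometer’s yaw correction is omitted for brevity.

```csharp
using System;

// Illustrative complementary filter: fuses gyroscope rates with the
// accelerometer's gravity reference to estimate pitch and roll.
// Real headsets use more elaborate filters (e.g. Kalman variants).
class ImuOrientationFilter
{
    const double Alpha = 0.98;   // trust in the gyro over short intervals
    public double PitchDeg { get; private set; }
    public double RollDeg  { get; private set; }

    // gyro rates in deg/s, accel in m/s^2, dt in seconds
    public void Update(double gyroPitchRate, double gyroRollRate,
                       double ax, double ay, double az, double dt)
    {
        // Integrate angular rate: smooth but drifts over time.
        double gyroPitch = PitchDeg + gyroPitchRate * dt;
        double gyroRoll  = RollDeg  + gyroRollRate  * dt;

        // Gravity direction gives a drift-free but noisy reference.
        double accelPitch = Math.Atan2(-ax, Math.Sqrt(ay * ay + az * az)) * 180.0 / Math.PI;
        double accelRoll  = Math.Atan2(ay, az) * 180.0 / Math.PI;

        // Blend: gyro for fast motion, accelerometer to cancel drift.
        PitchDeg = Alpha * gyroPitch + (1 - Alpha) * accelPitch;
        RollDeg  = Alpha * gyroRoll  + (1 - Alpha) * accelRoll;
    }
}
```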
Mechanical eye
In addition to the IMU, VR systems use controllers fitted with infrared LEDs. These LEDs are tracked by cameras, either mounted on the headset or placed around the play area, which detect the infrared light the LEDs emit. By comparing the angles at which each LED is seen by different cameras, the system can triangulate the exact position and orientation of the controllers.
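Reduced to two dimensions, the geometry is a small linear system: each camera contributes a bearing angle to an LED, and intersecting the two rays recovers its position. The sketch below is an illustrative simplification; real systems solve the full 3D problem across many LEDs at once to recover a complete pose.

```csharp
using System;

// Illustrative 2D triangulation: two cameras at known positions each
// report the bearing angle to the same infrared LED; intersecting the
// two rays recovers the LED's position.
static class Triangulation
{
    // camA/camB: camera positions; angleA/angleB: bearing angles in radians
    public static (double X, double Y) LocateLed(
        (double X, double Y) camA, double angleA,
        (double X, double Y) camB, double angleB)
    {
        // Ray i: P = cam_i + t_i * (cos(angle_i), sin(angle_i))
        double dxA = Math.Cos(angleA), dyA = Math.Sin(angleA);
        double dxB = Math.Cos(angleB), dyB = Math.Sin(angleB);

        // Solve camA + tA*dA = camB + tB*dB for tA (a 2x2 linear system).
        double denom = dxA * dyB - dyA * dxB;
        if (Math.Abs(denom) < 1e-9)
            throw new InvalidOperationException("Rays are parallel; no unique intersection.");

        double tA = ((camB.X - camA.X) * dyB - (camB.Y - camA.Y) * dxB) / denom;
        return (camA.X + tA * dxA, camA.Y + tA * dyA);
    }
}
```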
Base stations and infrared beams
Advanced setups include base stations emitting IR laser beacons, further enhancing spatial tracking accuracy. These base stations, typically positioned at opposite corners of the play area, sweep the environment with infrared laser beams. The controllers and headset have sensors that detect these beams, allowing the system to triangulate their positions with high precision. This ensures that even the smallest movements are captured accurately, providing a seamless and responsive VR experience.
The Base station setup
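The sketch below illustrates the timing-to-angle conversion at the heart of such laser sweeps: a sensor measures the delay between the station’s sync flash and the moment the laser hits it, and the known rotor speed turns that delay into an angle. The 60 Hz rate matches first-generation base stations, but treat the constants and names here as assumptions rather than a vendor specification.

```csharp
using System;

// Illustrative Lighthouse-style timing-to-angle conversion: the base
// station's rotor spins at a known rate, so the elapsed time between
// the sync pulse and the laser hitting a photodiode encodes the sweep
// angle to that sensor.
static class SweepAngles
{
    const double RotorHz = 60.0;                      // assumed sweep rate
    const double RadPerSecond = 2 * Math.PI * RotorHz;

    // dtSeconds: time from sync flash to laser hit on the photodiode
    public static double SweepAngleRad(double dtSeconds)
        => RadPerSecond * dtSeconds;
}
```

For example, a hit 2.5 ms after the sync flash lies about 0.94 rad (54°) into the sweep. Combining the horizontal and vertical sweep angles from one station yields a ray to the sensor; rays from two stations intersect at its 3D position, much like the camera triangulation above.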
The Eyes of the Skin
Despite these advancements, traditional VR systems limit natural hand interactions, translating movements through mediated controls rather than direct physical engagement. This is where our approach diverges. By integrating everyday objects into the VR environment as interactive elements, we can bypass the need for traditional controllers. Users can interact with virtual objects directly, using their hands as they would in the real world. This enhances the sense of presence and makes the interaction more intuitive and satisfying.
Bike wheel experiment
For example, in one of our experiments, we created a virtual model of a bicycle tire and linked it to a tracked controller attached to the rim of a real tire. With this setup, spinning or tilting the physical tire moved the virtual tire in lockstep, and the tactile feedback from the real tire provided a level of realism that is difficult to achieve with traditional controllers.
Enhancing VR with Hand-Based Interactions
By pushing the boundaries of VR technology, we aim to create a more immersive and interactive experience. We configured everyday objects as interactive VR controllers using the Unity game development engine. This involved hacking HTC controllers and integrating them with Unity’s Input Manager, mapping their inputs to virtual axes that drive virtual twins of the physical objects.
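As a minimal sketch of what this looks like on the Unity side, the script below reads a virtual axis defined in the Input Manager and applies it as rotation to the virtual twin. The axis name "TireSpin" is hypothetical, and this is an illustration of the technique rather than our exact project code.

```csharp
using UnityEngine;

// Illustrative Unity sketch: reads a virtual axis mapped in the Input
// Manager (the axis name "TireSpin" is hypothetical) and applies it as
// rotation to the virtual twin of the physical object.
public class VirtualTwinAxis : MonoBehaviour
{
    [SerializeField] float degreesPerSecond = 180f;

    void Update()
    {
        // Input.GetAxis returns -1..1 from whatever hardware axis
        // the Input Manager maps to "TireSpin".
        float spin = Input.GetAxis("TireSpin");
        transform.Rotate(Vector3.right, spin * degreesPerSecond * Time.deltaTime);
    }
}
```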
Hacking the HTC Tracker let us manipulate the virtual tire within the VR space without directly handling the controller, providing a tactile interaction that authentically mimicked the tire’s physical movements. This experiment demonstrated the feasibility of bringing multiple physical objects into the VR world and interacting with them naturally with our hands.
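One way to wire a tracker’s pose onto a virtual object in Unity is through the XR input API, as sketched below. This assumes the hacked device reports as a hardware tracker node; the details of our original setup differed, so read it as one plausible implementation rather than the definitive one.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Illustrative sketch: copies the pose of a hardware tracker (e.g. an
// HTC Vive Tracker strapped to the real tire's rim) onto the virtual
// tire each frame, so the digital twin follows the physical object.
public class TrackedTire : MonoBehaviour
{
    readonly List<InputDevice> trackers = new List<InputDevice>();

    void Update()
    {
        InputDevices.GetDevicesAtXRNode(XRNode.HardwareTracker, trackers);
        if (trackers.Count == 0) return;

        // Use the first tracker found; a real setup would match a
        // specific device rather than taking whichever appears first.
        if (trackers[0].TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 pos) &&
            trackers[0].TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rot))
        {
            transform.SetPositionAndRotation(pos, rot);
        }
    }
}
```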
Practical Applications and Future Potential
Extending sensor technology to include direct interaction with physical objects can profoundly enhance VR’s immersive experience. This method transforms VR from a mediated experience into a natural extension of our physical actions.
The implications are vast. In educational settings, for example, students could interact with virtual lab equipment as if it were real, deepening their understanding through hands-on experience; the same principle could shape the design of mental health products.
Conclusion
Our exploration at the Yale Ultraspace Symposium showcased an experimental approach to VR that integrates physical hand movements with virtual environments. Inspired by artistic principles and grounded in advanced technology, this method redefines our interaction with virtual reality. By merging the physical and digital realms, we open new avenues for immersive, intuitive experiences beyond traditional VR capabilities. This hand-based interaction method is not just a technological innovation; it’s a paradigm shift in how we engage with virtual worlds.
Modern Still Life