After presenting some brief statistics on the rise of GPU computing, Jen-Hsun presented a live video feed showing four Nvidia employees inside the virtual reality experience. Each employee wore an exoskeleton resembling a Stormtrooper's armor from Star Wars, though each individual was visible only from the torso up. The feed then showed a high-resolution model of a sports car, which all four employees could interact with directly by opening its doors and entering the cabin.
Real-time movement accuracy using the PhysX physics engine
The car was designed by noted automotive designer Christian von Koenigsegg, who communicated with the team in real time to rotate the car and discuss its features. To demonstrate the model's modularity in VR space, Jen-Hsun issued a verbal command that exploded it into hundreds of individual parts, each of which could be examined one by one. The purpose of the demonstration was to show that, in contrast to Microsoft HoloLens and other competitors, Project Holodeck will feature the most advanced physics simulation engine on the market.
Nvidia believes it can capture at least a niche market in the social VR category: users who prefer image quality and photorealism over expandability and video programming choices. The platform could be a strong competitor in the enterprise space, where Hyperfair VR has become a primary platform for businesses to engage customers through virtual spaces. In the consumer space, however, Nvidia is likely to face heated competition from existing players such as AltspaceVR and Sansar (from the creators of Second Life), along with any social apps that crop up on the Oculus Rift, HTC Vive, and PlayStation VR.
Object rigidity using PhysX will be a key feature of Holodeck
Jen-Hsun says his company's approach is to make the visuals as photorealistic as possible and to couple them with a suite of "believable physics" that competitors may not offer. One example is object rigidity: when a user grabs a steering wheel in virtual space, the avatar's hand stops on the wheel rather than passing through it. This is accomplished through Nvidia's PhysX engine, which enables the corresponding VR game engine to deliver accurate visual and haptic responses.
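The idea behind object rigidity can be illustrated with a minimal sketch (this is not Nvidia's PhysX API, and the function and parameter names here are hypothetical): if the tracked hand position would penetrate a rigid object, the physics step pushes it back to the object's surface, and the corrected position is what the renderer and haptics respond to.

```python
import math

# Hypothetical sketch: constrain a tracked hand against a rigid sphere
# (e.g. a steering-wheel hub), so the hand rests on the surface instead
# of passing through. Real engines use full collision shapes and solvers.
def constrain_hand(hand_pos, center, radius):
    """Return hand_pos, projected onto the sphere's surface if it
    would otherwise penetrate the rigid object."""
    offset = [h - c for h, c in zip(hand_pos, center)]
    dist = math.sqrt(sum(d * d for d in offset))
    if dist >= radius or dist == 0.0:
        return list(hand_pos)  # outside the object: no correction needed
    scale = radius / dist      # push the hand out along the surface normal
    return [c + d * scale for c, d in zip(center, offset)]

# A hand reaching inside a 0.3 m sphere centered at (0, 1, 0.5)
# is stopped at the surface rather than sinking into the model.
corrected = constrain_hand((0.0, 1.0, 0.7), (0.0, 1.0, 0.5), 0.3)
```

The key design point is that the correction happens before rendering, so the avatar's hand is never drawn inside the object, and the penetration depth that was removed can drive a haptic response.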
“Everything obeys the laws of physics,” says Jen-Hsun.
During GTC 2016, the company announced a dramatic expansion of its VRWorks platform to include new technologies that enhance presence through virtual sound and touch. One of these was VRWorks Audio, a path-traced audio platform that uses ray tracing to simulate sound propagation based on the material properties of the virtual environment. Ray tracing is very processor-intensive, but it can be remarkably accurate in calculating the paths individual rays take from a source to a listener.
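A toy sketch conveys the core idea of path-traced audio (this is not the VRWorks Audio SDK; the function and the absorption values are illustrative assumptions): each source-to-listener path contributes an arrival delay set by the path's total length, and an energy reduced by distance and by the absorption of each surface the ray bounces off.

```python
# Hypothetical sketch of one audio path's contribution. A real path
# tracer would trace many rays against scene geometry; here the path
# is given directly as segment lengths and per-bounce absorptions.
def trace_energy(path_lengths, absorptions, speed_of_sound=343.0):
    """path_lengths: lengths (m) of each segment of one path;
    absorptions: absorption coefficient (0..1) of the surface hit
    at each bounce. Returns (arrival_delay_s, remaining_energy)."""
    total_len = sum(path_lengths)
    energy = 1.0
    for a in absorptions:
        energy *= (1.0 - a)              # each bounce loses energy to the material
    energy /= max(total_len, 1e-6) ** 2  # inverse-square spreading loss
    return total_len / speed_of_sound, energy

# Direct path (5 m) vs. one bounce off a hard wall (3 m + 4 m, absorption 0.02):
direct = trace_energy([5.0], [])
bounced = trace_energy([3.0, 4.0], [0.02])
```

Summing many such delayed, attenuated contributions yields an impulse response for the room, which is why material properties of the environment shape the result so strongly.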
Previously, VRWorks Audio was a set of APIs within Nvidia's larger VRWorks suite, but this week it was announced as a standalone VRWorks Audio SDK, which will be available to developers shortly.
Available for early access in September
During a Q&A session after the keynote, Nvidia VP of GPU engineering Jonah Alben said that Project Holodeck is aimed at showing the physical accuracy that can be achieved in VR space, but it is not a final product. He says Project Holodeck will be available for early access in September 2017, and the company hopes to "get the enthusiasts of the world bringing in content."