AMD Tackles Holy Grail of Providing Life-Like Presence for Virtual Reality Game Developers
Tuesday, March 10, 2015
AMD has released the LiquidVR platform, aimed at achieving life-like presence for players. Its technologies focus on enabling virtual reality content development for AMD hardware, improving comfort in VR applications through better performance, and delivering plug-and-play compatibility with VR headsets.
In virtual reality, 'presence' describes the perception of being physically present in a simulated, non-physical world in a way that fully immerses the user. LiquidVR tackles a key obstacle to achieving presence by addressing motion-to-photon latency: the time between when users move their head and when their eyes see an updated image reflecting the new position. Minimizing motion-to-photon latency is critical to achieving both presence and comfort, two key elements of great VR.
Reducing latency involves the entire processing pipeline, from the GPU, to the application, to the display technology in the headset. AMD's GPU software and hardware subsystems are part of that latency equation, and with LiquidVR, AMD is helping to solve the challenge by bringing smooth, liquid-like motion and responsiveness to game developers and content creators for life-like presence in VR environments powered by AMD hardware.
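To make the latency budget concrete, the sketch below sums hypothetical per-stage costs against the refresh interval of a headset. The 90 Hz rate and all stage timings are illustrative assumptions for this example, not AMD figures.

```python
# Illustrative motion-to-photon budget at an assumed 90 Hz refresh rate.
refresh_hz = 90
frame_budget_ms = 1000.0 / refresh_hz  # ~11.1 ms per displayed frame

# Hypothetical stage costs along the pipeline described above
# (sensor -> application -> GPU -> display).
stages_ms = {
    "sensor read + transport": 1.0,
    "application / CPU":       3.0,
    "GPU render":              6.0,
    "display scan-out":        2.0,
}
total_ms = sum(stages_ms.values())

print(f"budget {frame_budget_ms:.1f} ms, pipeline {total_ms:.1f} ms")
# Here the pipeline (12.0 ms) overshoots the budget (~11.1 ms), so the
# headset would miss a refresh -- which is why every stage must be shaved.
```

Even a single stage running long pushes the whole pipeline past the refresh deadline, which is the motivation for the SDK features listed below.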
The alpha version of the LiquidVR SDK 1.0 is now available and offers these features:
- Async Shaders for smooth head-tracking, enabling Hardware-Accelerated Time Warp: a technology that takes updated information on a user's head position after a frame has been rendered and warps the image to reflect the new viewpoint just before sending it to the VR headset, effectively minimizing the latency between a user turning their head and the result appearing on screen.
- Affinity Multi-GPU for scalable rendering: a technology that allows multiple GPUs to work together to improve frame rates in VR applications by letting applications assign work to specific GPUs. Each GPU renders the viewpoint of one eye, and the outputs are then composited into a single stereo 3D image. With this technology, multi-GPU configurations become ideal for high-performance VR rendering, delivering high frame rates for a smoother experience.
- Latest data latch for smooth head-tracking: a programming mechanism that gets head-tracking data from the head-mounted display to the GPU as quickly as possible by binding the data as close to real time as possible, practically eliminating API overhead and reducing latency.
- Direct-to-display for intuitively attaching VR headsets, to deliver a seamless plug-and-play virtual reality experience from an AMD Radeon graphics card to a connected VR headset, while enabling features such as booting directly to the display or using extended display features within Windows.
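The core idea behind time warp in the list above can be sketched as a small rotation calculation: compare the head orientation that was latched when the frame started rendering with the latest orientation available just before scan-out, and derive the corrective rotation the warp pass applies to the finished image. This is a minimal illustration of the math, not the LiquidVR API; the quaternion helpers and the sample yaw angles are assumptions for the example.

```python
import math

def quat_mul(a, b):
    # Hamilton product of two quaternions stored as (w, x, y, z).
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def quat_conj(q):
    # Conjugate; for unit quaternions this is the inverse rotation.
    w, x, y, z = q
    return (w, -x, -y, -z)

def yaw_quat(degrees):
    # Rotation about the vertical (y) axis.
    half = math.radians(degrees) / 2.0
    return (math.cos(half), 0.0, math.sin(half), 0.0)

def timewarp_delta(q_render, q_latest):
    # The corrective rotation the warp pass applies to a finished frame:
    # how far the head has turned since the frame's view pose was latched.
    return quat_mul(q_latest, quat_conj(q_render))

# Head pose when the frame started rendering vs. just before scan-out
# (illustrative angles: the user kept turning while the GPU rendered).
q_render = yaw_quat(30.0)
q_latest = yaw_quat(33.0)

delta = timewarp_delta(q_render, q_latest)
angle = math.degrees(2.0 * math.acos(max(-1.0, min(1.0, delta[0]))))
print(round(angle, 1))  # 3.0 -- degrees of corrective yaw to warp by
```

Because the warp is a cheap image-space correction rather than a full re-render, it can run in the last moments before scan-out, which is why pairing it with asynchronous shader execution matters.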