Researchers develop 3D holographic heads-up display for enhanced road safety

Researchers from the University of Cambridge, the University of Oxford and University College London (UCL) have developed an augmented reality heads-up display that aims to improve road safety by presenting high-resolution three-dimensional holograms of potential hazards directly in a driver’s field of vision in real time.

Unlike existing heads-up display systems, which offer two-dimensional projections onto a vehicle’s windshield, the new system uses 3D laser scanning (LiDAR) data to construct a comprehensive 3D representation of London streets. Because the underlying scan captures the whole scene, the technology can effectively “see” through objects, projecting holographic representations of hidden road obstacles that match their real-world counterparts in both size and distance.
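The study itself does not spell out the projection maths, but the basic alignment requirement can be illustrated with simple perspective geometry: an object’s apparent size in the driver’s view is fixed by its physical size and its distance, both of which a LiDAR scan provides. The Python sketch below is a minimal, hypothetical illustration of that idea; the sign dimensions, eye position and coordinate conventions are assumptions for the example, not values from the paper.

```python
import numpy as np

def angular_size_deg(width_m: float, distance_m: float) -> float:
    """Apparent angular width (degrees) of an object of a given
    physical width seen at a given distance from the eye."""
    return np.degrees(2.0 * np.arctan2(width_m / 2.0, distance_m))

def project_to_view(points_xyz: np.ndarray, eye: np.ndarray) -> np.ndarray:
    """Perspective-project 3D points (x right, y up, z forward, metres)
    onto a normalised image plane one metre in front of the eye."""
    rel = points_xyz - eye           # points relative to the driver's eye
    z = rel[:, 2:3]                  # forward distance of each point
    return rel[:, :2] / z            # (x/z, y/z): where each point lands in the view

# Hypothetical example: a road sign 2 m wide, centred 40 m ahead,
# hidden behind a truck but recovered from a prior LiDAR scan.
sign_corners = np.array([
    [-1.0, 1.5, 40.0], [1.0, 1.5, 40.0],
    [-1.0, 3.0, 40.0], [1.0, 3.0, 40.0],
])
eye = np.array([0.0, 1.2, 0.0])      # assumed driver eye position

print("apparent width:", angular_size_deg(2.0, 40.0), "degrees")
print("projected corners:\n", project_to_view(sign_corners, eye))
```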

Imagine a scenario in which a road sign obscured by a large truck becomes visible as a 3D hologram, giving the driver precise information about its location and content. The 3D holographic projection technology keeps the driver’s attention on the road rather than confining it to a small patch of the windshield, and it has the potential to significantly enhance road safety by presenting real-time holographic depictions of road obstacles and potential hazards from various angles.

Published in the journal Advanced Optical Materials, the development responds to the pressing need to reduce the approximately 16,000 deaths caused every day by traffic accidents attributable to human error. By using technology to give drivers timely information about potential hazards, the augmented reality heads-up display represents a promising step toward improving road safety. Current heads-up displays typically show information such as current speed or driving directions; this approach adds a new dimension to safety by integrating holographic projections aligned with the physical environment.


Jana Skirnewskaja, the first author of the study from Cambridge’s Department of Engineering, emphasizes the crucial concept behind heads-up displays (HUDs): keeping the driver’s eyes focused on the road to prevent potential accidents. However, she notes a drawback with existing HUDs, stating, “Because these are two-dimensional images, projected onto a small area of the [windshield], the driver can be looking at the image, and not actually looking at the road ahead of them.”

In their pursuit of enhancing road safety through more accurate information delivery, Skirnewskaja and her colleagues have been exploring alternatives to traditional HUDs. Their goal is to project information throughout the driver’s field of view in a way that is neither overwhelming nor distracting, and that stays strictly related to the driving task at hand.

The team developed an augmented reality holographic point cloud video projection system that aligns displayed objects with real-world objects in terms of size and distance within the driver’s field of view. The system combines 3D holographic data with LiDAR (light detection and ranging) data. LiDAR works by emitting pulses of light and calculating an object’s distance from the time its reflections take to return.
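The ranging principle itself is a time-of-flight calculation: a light pulse travels to the object and back at the speed of light, so the one-way distance is half the round-trip time multiplied by the speed of light. A minimal sketch of that arithmetic (the pulse return times below are invented for illustration):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def lidar_distance_m(round_trip_time_s: float) -> float:
    """Distance to a reflecting surface from the round-trip time of a
    light pulse: the pulse covers the distance twice, so halve it."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# Hypothetical return times for three pulses (in seconds).
for t in (1.0e-7, 2.5e-7, 6.67e-7):
    print(f"round trip {t:.2e} s  ->  {lidar_distance_m(t):.1f} m")
```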


The researchers tested the system by scanning Malet Street, on the UCL campus in London. The LiDAR point cloud data was transformed into layered 3D holograms incorporating up to 400,000 data points, and the holograms were generated in real time so that road hazards can be reassessed as conditions change.
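The article does not describe the exact hologram-generation pipeline, but a common way to organise point cloud data for a layered hologram is to slice the cloud into depth bins and rasterise each bin into an image that a hologram algorithm can then process plane by plane. The sketch below shows only that slicing step, with a randomly generated cloud standing in for the Malet Street scan; the layer count, resolution and data are assumptions, not the authors’ settings.

```python
import numpy as np

def slice_into_depth_layers(points_xyz: np.ndarray,
                            n_layers: int = 8,
                            resolution: int = 256) -> np.ndarray:
    """Bin a point cloud (N x 3, metres) into depth layers and rasterise
    each layer into a binary occupancy image. Returns an array of shape
    (n_layers, resolution, resolution)."""
    x, y, z = points_xyz[:, 0], points_xyz[:, 1], points_xyz[:, 2]

    def to_index(values: np.ndarray, n_bins: int) -> np.ndarray:
        # Map values linearly onto integer bin indices 0..n_bins-1.
        span = values.max() - values.min() + 1e-9
        return np.clip(((values - values.min()) / span * n_bins).astype(int), 0, n_bins - 1)

    col = to_index(x, resolution)        # lateral position -> image column
    row = to_index(y, resolution)        # height -> image row
    layer = to_index(z, n_layers)        # depth -> layer index
    layers = np.zeros((n_layers, resolution, resolution))
    layers[layer, row, col] = 1.0        # mark occupied pixels per depth layer
    return layers

# Hypothetical cloud of 400,000 random points, mimicking the scale of the scan.
cloud = np.random.rand(400_000, 3) * np.array([50.0, 10.0, 100.0])
layers = slice_into_depth_layers(cloud)
print(layers.shape, "occupied pixels per layer:", layers.sum(axis=(1, 2)))
```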

Skirnewskaja highlights the system’s adaptability to evolving conditions, explaining, “The data we collected can be shared and stored in the cloud, so that any drivers passing by would have access to it—it’s like a more sophisticated version of the navigation apps we use every day to provide real-time traffic information.”

While collecting more data enhances accuracy, the researchers focused on strategically selecting data points for a 360° view, enabling a comprehensive assessment of road hazards. Skirnewskaja emphasizes the need for balance, stating, “With as little as 100 data points, we can know what the object is and how big it is. We need to get just enough information so that the driver knows what’s around them.”
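How the team selects those points is not detailed in the article; one standard way to keep an object recognisable from roughly 100 points is farthest-point sampling, which spreads the retained points across the object so its shape and extent survive. The sketch below uses that technique purely as an illustration, with made-up data; it is not presented as the authors’ method.

```python
import numpy as np

def farthest_point_sample(points: np.ndarray, k: int) -> np.ndarray:
    """Pick k points that spread out over the cloud, so the rough shape
    and extent of an object survive even with very few samples."""
    selected = [0]                                    # start from an arbitrary point
    dist = np.linalg.norm(points - points[0], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(dist))                    # point farthest from the current set
        selected.append(nxt)
        dist = np.minimum(dist, np.linalg.norm(points - points[nxt], axis=1))
    return points[selected]

# Hypothetical: reduce a dense scan of one object to 100 representative points.
dense_object = np.random.rand(50_000, 3) * np.array([2.0, 1.0, 4.0])  # roughly car-sized box
sparse = farthest_point_sample(dense_object, 100)
print("bounding box of the 100-point sketch:", sparse.max(axis=0) - sparse.min(axis=0))
```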

Following a virtual demonstration earlier this year, the researchers have fine-tuned the system for inclusivity and user-friendliness, considering factors like eye strain and visual impairments. Collaborating with Google, they aim to test the technology in real cars, planning road tests—on public or private roads—in 2024. The researchers envision a system that not only improves safety for drivers but also benefits pedestrians and cyclists, aligning with their commitment to creating a technology that is accessible and enhances overall road safety.

Source: University of Cambridge
