Reducing Driver Distraction with a Cutting-Edge Head-Up Display System
Overview
Our team, RN2 (Radhika Balaji, Nallely Martínez, Nabhi Shah, and myself), developed a modular Head-Up Display (HUD) designed to minimize driver distraction. By consolidating key dashboard functionalities into the HUD, our solution eliminates the need for a traditional dashboard, offering manufacturers both cost and space savings while aligning with the industry’s shift toward minimalistic vehicle interiors.
What difference did we make?
A Cost-Effective Solution That Reduces Driver Distractions
1.56s
Using the HUD resulted in a 1.56-second reduction in dwell time for responding to incoming calls.
80%
Users said that the HUD would improve their driving experience and significantly reduce distractions.
#Efficient
Cost and space are saved by replacing the traditional dashboard with a highly functional HUD.
The Problem to Solve
What Prompted Us to Re-Imagine HUDs?
According to a previous study[1], in-car navigation and infotainment systems cause major distractions. Even voice-activated features can divert attention for over 40 seconds. Just two seconds of distraction doubles crash risk, and inputting navigation at 25 mph covers the distance of four football fields. Though hands-free audio controls exist, high error rates and low trust often push drivers toward visual or manual use.
We also carried out a comprehensive heuristic evaluation of existing HUDs across various market segments. Most current solutions adopt a minimal approach, displaying only key functions like speed, navigation, and music playback, while still relying on the dashboard or secondary displays for additional information such as fuel level, playlists, incoming calls, and ADAS information.
Both the BMW i5 M60 and the Audi Activesphere concept introduced sophisticated HUD systems aimed at reducing the distraction often associated with secondary screens. Audi replaces the traditional display with AR glasses that project information throughout the car’s interior, but this still demands some visual attention away from the road.
In contrast, BMW clusters information more densely, increasing the risk of cognitive overload. Despite their forward-thinking designs, both HUDs lacked alignment with fundamental usability principles such as consistency, visual simplicity, and ease of interaction.
Our solution incorporates features such as calls and ADAS data, thoughtfully designed to reduce distractions while following heuristic design principles: it offers this functionality with visual simplicity and ease of interaction.
User Research
Understanding Driver Needs
We interviewed five experienced US drivers (3 to 5+ years of driving) to explore their expectations and behaviors related to Head-Up Displays (HUDs). Using affinity mapping, we analyzed preferences, customization habits, and concerns across diverse driving contexts, and derived the following insights:
✔️ Clarity over clutter
People prefer having navigation and speed information up front, but they do not want to compromise the minimalism of the HUD interface and expressed concerns about clutter.
✔️ Varying priorities
We also noticed that users have varying preferences regarding what they consider important. For instance, one individual prioritized RPM and wanted it displayed in the HUD, while others did not share this preference.
✔️ Fear of distraction
Most drivers found music control and phone alerts distracting, often leaving music handling to co-passengers and minimizing its importance while driving.
The Solution
Strategic Design
We addressed the identified user needs and existing gaps by striking a balance between functionality and simplicity. Minimalism was achieved through thoughtful element placement, smooth transitions, and deliberate design choices. Strategic design decisions enabled the integration of additional features without adding complexity.
➡️ Positioning
According to a study[2], key elements should be positioned within 5–15 degrees of the driver’s line of sight to ensure quick access to important information. Content that could cause distraction should be placed outside this range. Including such elements beyond the recommended range is still beneficial, however, as they cause less distraction than when displayed on secondary screens.
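The placement rule above can be sketched as a simple visual-angle check. This is an illustrative example, not code from the project: the element offset, projection distance, and the 15-degree cutoff for the primary zone are assumptions layered on the study's 5–15 degree guideline.

```python
import math

def visual_angle_deg(offset_m: float, distance_m: float) -> float:
    """Angle between the driver's line of sight and an element
    placed off-axis at a given virtual-image distance."""
    return math.degrees(math.atan2(offset_m, distance_m))

def placement_zone(offset_m: float, distance_m: float) -> str:
    """Classify an element against the study's 5-15 degree guideline.
    The exact cutoff (15 deg) is an assumption for illustration."""
    angle = abs(visual_angle_deg(offset_m, distance_m))
    if angle <= 15:
        return "primary"      # key information: speed, navigation
    return "peripheral"       # potentially distracting content

# An element 0.5 m off-axis on a virtual image plane 2.5 m ahead
# sits at roughly 11.3 degrees, inside the primary zone.
print(placement_zone(0.5, 2.5))
```

A check like this could run at layout time to flag widgets that drift outside their intended zone.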
➡️ Design for HUD
Our primary concerns were readability and visibility, while also ensuring a clean, transparent interface that wouldn’t obstruct the driver’s view of the road or surroundings. In our research into design principles used by AR technology designers, we found that while some techniques—like background blur, vibrancy text effects, and other background-aware enhancements—are effective in AR, they aren’t applicable to HUDs due to the limitations of projection-based technology.
➡️ Adapting to light
The HUD design was sensitive to changes in environmental lighting—not just the difference between day and night, but the overall brightness of the surroundings. To ensure optimal visibility, we created adaptive design variants that respond to lighting conditions. These versions adjust contrast, shadow hues, and modify color brightness and saturation to maintain clarity in any environment.
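The adaptive variants described above can be sketched as a mapping from ambient brightness to rendering parameters. This is a minimal illustration; the lux thresholds and the specific brightness, saturation, and contrast values are hypothetical, not the project's actual tuning.

```python
def adaptive_theme(ambient_lux: float) -> dict:
    """Pick HUD rendering parameters from ambient brightness.
    Thresholds and values are illustrative assumptions."""
    if ambient_lux < 50:
        # Night or tunnel: dim output, muted colors, raised contrast
        return {"mode": "night", "brightness": 0.4,
                "saturation": 0.7, "contrast": 1.2}
    if ambient_lux < 5000:
        # Overcast or dusk: intermediate settings
        return {"mode": "dim", "brightness": 0.7,
                "saturation": 0.85, "contrast": 1.1}
    # Direct daylight: full brightness and saturation for legibility
    return {"mode": "day", "brightness": 1.0,
            "saturation": 1.0, "contrast": 1.0}

print(adaptive_theme(12000)["mode"])
```

In a real system the input would come from the car's ambient light sensor, ideally with hysteresis so the theme doesn't flicker at threshold boundaries.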
➡️ Audio Design
During our first user test, the driver had to pretend to drive using only visual cues, without audio. Their feedback indicated that, for a well-rounded testing scenario, our team would have to use online audio libraries to play relevant instructions and sounds while the driver drove through the pretend scenarios.
This prompted us to collect 4 audio files that were relevant for the scenarios described below.
“Take the right onto NE TIER I” - we used OpenAI to generate a calming voice announcing which exit to take.
Indicator sound - a ticking sound played after we instructed the driver to turn on the right indicator.
Warning - warning sounds were used to highlight the ADAS system.
Calling - a ringtone was played a few seconds before the call appeared on the HUD, preparing the user for it.
Resources used:
ChatGPT, Voice of Sol; chosen for its soothing and upbeat personality.
Car Chime Sounds
➡️ Transitions
Users were particularly concerned about distractions from incoming calls or music info on the HUD. To mitigate this, we implemented smooth transitions paired with audio cues that subtly signaled upcoming events. This approach helped users anticipate system behavior, minimized unexpected interruptions, and empowered them to maintain control over their attention.
➡️ Modularity
One key insight during our design process was that every car offers a unique design—and consequently, a unique user experience. Our visit to the NY Auto Show highlighted just how varied car interfaces can be. We observed that user experience is highly subjective: while some users find the shift from traditional dashboards to digital displays frustrating, others appreciate the change.
Given the limitations of car hardware, offering full customization isn’t always possible. With our HUD design, however, the fully digital nature of the display makes this flexibility feasible.
To address this, we adopted a modular approach—similar to smartphone widgets—allowing users to prioritize the information they see, within defined rules for layout and sizing. This modularity ensures the HUD can adapt to different driving styles and user preferences.
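The widget-style modularity with "defined rules for layout and sizing" can be sketched as a small validator. This is an illustrative model only: the slot names, capacity units, and widget names are assumptions, not the project's actual specification.

```python
from dataclasses import dataclass

# Hypothetical layout rules: each HUD slot holds a fixed
# number of widget units (values are assumptions).
SLOTS = {"left": 1, "center": 2, "right": 1}

@dataclass
class Widget:
    name: str
    size: int   # units of HUD space the widget occupies
    slot: str   # region of the HUD the user assigned it to

def validate_layout(widgets) -> bool:
    """Check a user-chosen widget set against the layout rules,
    rejecting unknown slots and over-filled regions."""
    used = {slot: 0 for slot in SLOTS}
    for w in widgets:
        if w.slot not in SLOTS:
            raise ValueError(f"unknown slot: {w.slot}")
        used[w.slot] += w.size
        if used[w.slot] > SLOTS[w.slot]:
            raise ValueError(f"slot '{w.slot}' over capacity")
    return True

validate_layout([Widget("music", 1, "left"),
                 Widget("navigation", 2, "center"),
                 Widget("speed", 1, "right")])
```

Constraining customization this way keeps the "clarity over clutter" insight intact: users reorder priorities, but the system never lets the display overfill.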
Prototype & testing
ClearPath On a Test Drive
The most significant milestone in the project was the testing phase, where we successfully simulated a real driving experience to gather precise feedback on the HUD. Using an advanced prototyping setup and a carefully designed test, we conducted a Wizard-of-Oz-style user test. In this setup, drivers were presented with four different driving scenarios and tasks involving the use of the HUD, allowing us to observe how naturally they integrated it into their driving behavior.
➡️ Scenario 1: Multi-lanes and Exits
In this scenario, the chosen environment takes the driver through a carefully curated situation where they are asked to take an upcoming exit. The goal is to take the first exit by following the navigation cues and audio feedback. Our team evaluated how the navigation information and other cues, such as arrows on the road and auditory inputs, holistically assisted drivers in making the right decisions without taking focus away from the road.
☑️ Insights observed:
The map needs to provide more information well ahead of time. It should also display navigation milestones.
Audio cues would help; without audio, it is difficult to understand what to anticipate next.
➡️ Scenario 2: ADAS Information
In this scenario, the user must assess the safety of a lane change using the ADAS data displayed on the HUD, while also anticipating a nearby vehicle approaching closely, as indicated by the same system.
☑️ Insights observed:
3/5 users interpreted the information correctly. Hence, there is potential to enhance its representation.
One possible improvement could be the inclusion of a detailed lane map to improve clarity of communication.
➡️ Scenario 3: Incoming Phone Call
In this scenario, users are prompted to answer a phone call using the HUD interface. We monitor the duration of distraction caused by the incoming call and assess their response while driving. The dwell time is measured and compared with the dwell time required to answer calls via the secondary touch panel.
☑️ Insights observed:
Users were able to answer the phone call with ease, showing an average dwell time of under 2s.
The smooth transition helped users anticipate the call, providing better control over their attention.
Users preferred physical buttons for managing the call, as the voice-based cues (commands presented as quoted text on the HUD) were not easily discoverable; audio instructions would provide better affordance.
➡️ Scenario 4: Changing Music
In this scenario, the driver is asked to change the music and guess the upcoming track while being interrupted by a pedestrian crossing the road. The goal is to intentionally stress-test the readability and ease of use of the music panel on the left.
☑️ Insights observed:
When asked to change the music, 3/5 users pressed the left button, since the music panel appeared on the left. This reminded us that users require a clear mapping between controls and element placement in the real world.
At first, 5/5 users had to squint to read the upcoming track titles, indicating that we need to improve the legibility of the panel.
Users were satisfied with the duration the music widget remained visible, as it provided ample time for them to gather the necessary information about their playlist.
Why it Works
HUD's Effect on Distractions
“This is certainly an improvement over using the side touch screen.”
“Yes, this causes much fewer distractions than the way I usually handle calls.”
“Once I learn how to use it, I’d definitely use it daily—it seems helpful.”
Apart from qualitative affirmation from the users we tested with, we also compared the dwell time involved in attending calls using the designed HUD versus using the secondary screen (the infotainment display to the right of the driver). This dwell time represents the total duration a driver is distracted from driving, including the time taken to shift gaze to the target, the fixation time on the target, and the time needed to refocus on driving.
Dwell Time = Transition Time + Fixation Time
While we observed an average difference of 1.56 seconds, this figure should be considered indicative rather than precise, given measurement limitations and potential margins of error. Nonetheless, it suggests that the HUD significantly contributes to reduced driver distraction.
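The comparison above can be made concrete with a short sketch of the dwell-time arithmetic. The formula is the one given in the write-up; the per-participant sample values are hypothetical stand-ins, since the raw measurements are not published here.

```python
def dwell_time(transition_s: float, fixation_s: float) -> float:
    """Dwell Time = Transition Time + Fixation Time (per the study)."""
    return transition_s + fixation_s

def average(samples):
    return sum(samples) / len(samples)

# Hypothetical per-participant measurements in seconds
# (transition, fixation) -- not the project's real data.
hud_samples = [dwell_time(0.3, 1.4), dwell_time(0.2, 1.6), dwell_time(0.4, 1.5)]
screen_samples = [dwell_time(0.8, 2.6), dwell_time(0.9, 2.9), dwell_time(0.7, 3.1)]

# The reported figure is the difference between the two averages.
reduction = average(screen_samples) - average(hud_samples)
print(f"average dwell-time reduction: {reduction:.2f}s")
```

With the project's real samples, this difference came out to about 1.56 seconds, subject to the measurement caveats noted above.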
In addition to minimizing distractions, we also addressed drivers’ concerns about information overload on the HUD. The test results demonstrated that the HUD was not only effective in significantly reducing distractions but also achieved this in a user-friendly manner.
100% of the users found the information easy to read and clear.
None of the content obstructed or interfered with the driver’s line of sight for any user.
Automatically appearing information did not catch any user off guard, and the subtle transitions helped.
Conclusion & Future Scope
Reflection + Improvements
Designing for the automotive domain stands apart from other fields: it requires specialized adjustments in how we prototype and test products. Along the way, we refined our approach based on insights into the industry’s unique dynamics. The outcome is a compelling solution focused on minimizing driver distraction, a crucial consideration in this space. Thanks to its modular structure, it also adapts well to diverse driving experiences. Below are some ideas that could help advance the product even further.
✅ Customization space
Although we understand the need for customization, making it user-friendly presents an entirely different challenge. Exploring solutions such as preset modes and studying user behavior could be valuable directions for future development.
✅ Exploring additional interaction modalities
Our primary interaction method was the haptic button. Further research could explore how users familiar with the system respond to and adopt alternative modalities such as voice or gesture controls.
✅ Exploring tailored themes
Every manufacturer has a distinct design language, and the HUD offers a convenient opportunity for customization to reflect that style. Unlike traditional dashboards, which demand significant effort to tailor, the HUD can be easily themed to align with a brand’s identity.