User-Centered Perspectives for Automotive AR

| a short summary of a paper on human aspects related to automotive AR application design

A research paper with the same title as this blog post, written by experts from the Honda Research Institute (USA), Stanford University (USA) and the Max-Planck-Institut für Informatik (Germany) [1], discusses benefits, challenges, the authors' design approach and open questions regarding Augmented Reality in the automotive context, with a focus on the users.

Augmented Reality can help drivers by pointing out important and potentially dangerous objects in their view and thereby increase their situation awareness. However, if the information is presented incorrectly, the resulting distraction and confusion can lead to dangerous situations.

The authors of the paper set up a design process that focuses on finding the appropriate form of solution to a driver's problem (rather than just describing ideas technically).

To understand the drivers’ problems in the first place, they conducted in-car user interviews with different demographic groups to gather information about driving habits, concerns and the integration of driving into daily life.

After the interviews they ideated prototype solutions and tested the concepts in a driving simulator with a HUD. One realization was that for a left turn, drivers needed help in timing the turn relative to oncoming traffic rather than an arrow or graphical aids for the turning path, which even distracted them from the oncoming traffic. The authors' design solutions therefore focus on giving the driver additional cues that enhance awareness rather than only issuing navigation commands. After testing different graphical styles of turning-path indication, the results showed less distraction with a solid red path projection, which remains visible in peripheral vision while the driver focuses on traffic, than with simple chevron-style lines.

Human visual perception

Regarding human perception, the authors analyzed the influence of visual depth perception and the field of view. The human eye can only focus on one distance at a time, so see-through AR displays and head-up displays can cause a problem: the driver's focus has to remain on the road ahead rather than shifting to the windshield's distance, which would blur the farther imagery.

The eye's foveal focus with the highest acuity covers only a ca. 2° area at the center of the visual field. This determines the so-called "Useful Field Of View" (UFOV), the limited area from which information can be gained without head movement. These restrictions imply that augmentation should be used only in the driver's main field of view, not across the whole windshield. Objects in the periphery should therefore be signaled either inside the UFOV or through other methods.
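
To get a feeling for how small this foveal region is, the following minimal calculation (my own illustration, not taken from the paper) converts the roughly 2° visual angle into the width it covers at a few assumed viewing distances:

```python
import math

def visual_angle_width(distance_m, angle_deg=2.0):
    """Linear width covered by a given visual angle at a given viewing distance."""
    return 2 * distance_m * math.tan(math.radians(angle_deg / 2))

# Illustrative viewing distances in metres (my assumptions, not values from the paper)
for d in (2.5, 10.0, 50.0):
    print(f"at {d:5.1f} m a 2 deg foveal area is only about {visual_angle_width(d):.2f} m wide")
```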

Distraction

The National Highway Traffic Safety Administration (NHTSA) of the USA states three types of driver distraction:

  • visual distraction (eyes off road)
  • cognitive distraction (mind off driving)
  • manual distraction (hands off the wheel)

Each of these types of distraction can be mitigated, but also caused, by Augmented Reality applications in vehicles.

The authors discussed the human attention system and cognitive dissonance problems further.

  • Attention system
    Regarding the human attention system, so-called "selective visual attention" and "inattentional blindness" can be problems in driving conditions. Important visual cues can be suppressed when the driver is focusing on secondary tasks or when they lie outside the focus of attention. Warning signs on a HUD can help by attracting attention, but can also distract from objects outside of the augmented field of view. The study states the need for further research on the balance between increasing attention and avoiding unwanted distraction.
  • Cognitive Dissonance
    Cognitive dissonance, the perception of contradictory information, could occur e.g. when 2D graphics overlap badly with the 3D view of the surroundings, causing confusion or misinterpretation of the visual cues.

Human behaviour

As a third category, the study discusses the effects of AR technology on human behaviour.

Situation awareness – maintaining information about the current and future state of the surroundings – is described by a cited source in three steps:

  1. Perception of elements in the environment
  2. Comprehension of their meaning
  3. Projection of future system states

Augmented Reality can help drivers not only with perception but also with the further steps. State-of-the-art computers, AI technology and connected-car data from the surroundings can be especially helpful in cases where additional computational power can predict traffic dynamics. [comment by MSK]
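
As a rough illustration of the third step, projection, a naive constant-velocity extrapolation of a tracked vehicle could look like the sketch below. This is purely my own example to make the idea tangible; the paper does not describe such an implementation, and real systems use far more sophisticated traffic models.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    x: float   # position along the road (m)
    y: float   # lateral position (m)
    vx: float  # longitudinal speed (m/s)
    vy: float  # lateral speed (m/s)

def project(obj: TrackedObject, dt: float) -> TrackedObject:
    """Level-3 situation awareness: predict a future state from the current one
    (here with a naive constant-velocity model)."""
    return TrackedObject(obj.x + obj.vx * dt, obj.y + obj.vy * dt, obj.vx, obj.vy)

oncoming = TrackedObject(x=60.0, y=-3.5, vx=-14.0, vy=0.0)  # assumed example values
print(project(oncoming, dt=2.0))  # roughly where the oncoming car will be in 2 s
```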

One aspect is the behavioural change of drivers after longer use of assistance systems. A cited study implies that the reduced mental workload could lead to the deterioration of the drivers' native skills. Furthermore, the phenomenon called "risk compensation" can occur after getting used to the aids: the driver behaves more riskily than normal due to higher confidence in the systems. Because these behavioural changes can have dangerous consequences, the authors suggest using driver aids only when needed.

According to one source, the user's trust in a technology can be increased with more realistic visual displays, such as AR rather than simple map displays. Furthermore, AR can also help to build trust in autonomous cars by communicating the system's perception, plans and reasons for its decisions.

Some open questions were stated at the end of the paper, to be considered in further work: How can multiple aiding systems interact at the same time? How will long-term use of AR affect drivers' behaviour and skills when they have to switch back to a non-AR vehicle? Will drivers' skills deteriorate over time, and will they become dependent on these aiding systems?

My conclusion

This paper was published in 2013, and the technology has developed significantly since then. Nevertheless, the basic principles and human factors that have to be considered when designing safety-critical automotive applications are still the same.

Reliability and understanding the behaviour of autonomous vehicles will be essential for creating acceptance among drivers and passengers. Augmented Reality can be of great help not only for additional driving assistance systems, but also for the complete user experience at different automation levels.

The human-factors topics in this paper focused only on visual augmentation and assistance. They could be expanded to other modalities such as sound and haptic augmentation, and the perception of a combined driver assistance could be analysed as well.

Source

[1] Ng-Thow-Hing, Victor & Bark, Karlin & Beckwith, Lee & Tran, Cuong & Bhandari, Rishabh & Sridhar, Srinath. (2013). User-centered perspectives for automotive augmented reality. 2013 IEEE International Symposium on Mixed and Augmented Reality – Arts, Media, and Humanities, ISMAR-AMH 2013. 13-22. 10.1109/ISMAR-AMH.2013.6671262.
Retrieved on 30.01.2022. https://www.researchgate.net/publication/261447349_User-centered_perspectives_for_automotive_augmented_reality

UI principles of in-car infotainment

| design challenges and principles from the car navigation system developer company TomTom

As stated in my earlier blog entry, one of the current cockpit design trends is the multiplicity of screens in cars. This increasing display real estate challenges automotive UX designers to create an effective driver experience instead of displaying as much beautiful information as possible and thereby distracting the driver.

The navigation system and mapmaker company TomTom also discusses this topic in a blog post featuring their Principal UX Interaction Designer Drew Meehan, with insightful content about the design principles to be considered.

Finding balance in information overload

The key to building an interface with informational balance is "action plus overview". Across several screens, the shown information should be clustered to provide hints for the next actions and, further, to give an overview of the car's journey. This is achieved by distributing the information across separate screens so that they complement each other.

An example would be a car equipped with a head-up display (HUD), a cluster behind the steering wheel and a central display. The HUD would only show current status information, about the "here and now". The cluster would show information about upcoming actions in the near future. The central stack would give the complete overview of the journey, arrival time and complementary information such as refueling/recharging possibilities.
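
A minimal sketch of this "action plus overview" split could route each piece of information to a display by its time horizon. This is my own illustration of the idea, not TomTom's implementation; the display names and thresholds are assumptions.

```python
def route_to_display(seconds_ahead: float) -> str:
    """Assign a piece of information to a display by its time horizon (thresholds assumed)."""
    if seconds_ahead <= 10:      # "here and now" -> head-up display
        return "HUD"
    elif seconds_ahead <= 120:   # upcoming actions -> cluster behind the wheel
        return "cluster"
    else:                        # journey overview -> central display
        return "central"

for item, t in [("lane change now", 5), ("exit in 1 km", 45), ("arrival time", 3600)]:
    print(f"{item:>15} -> {route_to_display(t)}")
```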

This structure creates a flow of eye movement, which helps the driver understand the information placement easily and know where to look for specific interests.

Information structure by TomTom for in-car interfaces (source: see below)

Challenges in automotive interface design

There are some aspects and strategies that need to be considered when designing in-car interfaces:

  • Responsive and scalable content according to screen size: complying with different screen sizes in different vehicle models of a brand
  • Adaptive content: displaying only the information needed for the current driving situation. This requires prioritizing the information according to the driver's needs (a small sketch after this list illustrates the idea). If the fuel level or battery charge is critical, the next stations should be displayed; if the tank or battery is full, the screens can focus on less data. If no immediate route change is necessary, e.g. on a straight highway for 50 km, data from other driver assistance systems could be shown (e.g. lane keeping). In the city, with intense navigation needs, it may be best to show prompt actions on the HUD, closest to the driver's eyeline.
  • Creating one interface ecosystem: all screens should be connected and not segregated. The screens and the shown information should create continuity and complement each other.
  • Customization options: despite good information balance, some people could be overloaded and stressed by multiple screens. They should be allowed to change screen views and positions of content.
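
As announced in the adaptive-content bullet above, here is a small sketch of what such a prioritization rule could look like. All function names, thresholds and rules are my own illustrative assumptions, not taken from the TomTom article.

```python
def select_content(fuel_level: float, km_to_next_maneuver: float, in_city: bool):
    """Pick what to show for the current driving situation (illustrative rules only)."""
    content = []
    if fuel_level < 0.15:                      # critical fuel/charge: stations come first
        content.append("nearby fuel/charging stations")
    if in_city or km_to_next_maneuver < 1.0:   # intense navigation: prompt actions on the HUD
        content.append("next maneuver on HUD")
    elif km_to_next_maneuver > 50.0:           # long straight stretch: room for ADAS info
        content.append("lane keeping / ADAS status")
    return content or ["minimal status only"]

print(select_content(fuel_level=0.10, km_to_next_maneuver=0.4, in_city=True))
```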

TomTom's UX department has done user research with varying screen contents. They found that "users want easy, glanceable and actionable information", which reduces cognitive load and stress.

In summary, the UI design has to support the driver's actions by showing essential, easily digestible information. It should be placed where the driver most expects the content to be and have just the right amount of detail for the current driving situation.

Source

Online article by Beedham, M.: Informing without overwhelming, the secret to designing great in-car user experiences, 13.10.2021.
Retrieved on 09.01.2022.
https://www.tomtom.com/blog/navigation/designing-effective-in-car-user-interfaces/

Automotive intelligent cockpit design trends

| short summary of a cockpit design trends report, published early 2021

According to a 2020 report that looked at new car models and concept cars released in recent years, the following major directions in intelligent automotive cockpit design can be summarized:

  1. Richer versatilities
    New products such as driver monitoring systems, driving recorders, and rear-row and co-pilot entertainment displays are being introduced as automotive electronics develop. Additionally, intelligent surfaces allow further versatility – window or sunroof glass can become displays and intelligent seat materials can become interfaces as well.
  2. Multi-channel, fused human-vehicle interaction
    New interaction channels besides touch and voice control include active voice assistants, gesture control, fingerprint readers, sound localization, face recognition and holographic imaging. These multi-channel interaction modes can contribute to safer use and driving as well as deliver an extended user experience.
  3. 3D and multiple screen cockpit displays
    We see dual-, triple- and quint-screen and A-pillar display implementations for delivering control, co-pilot and rear row interactions.
  4. “User experience”-centricity and scenario-based interaction
    In-vehicle scenario modes are coming into focus – the car interior should serve as an intelligent, connected, flexible and comfortable personal space, e.g. for driving, resting, working or even shopping. As a UX-centered implementation example, the Mercedes-Benz S-Class ambient lighting system was cited, which uses 263 LEDs to adapt to driving situations (warnings) or give real-time feedback on interactions with the onboard computer.
  5. Interaction with every surface via intelligent materials
    New surface materials are introduced in concept cars to explore touch-control possibilities, such as displaying functional buttons in new ways.
  6. Touch feedback as key technology for higher level of safety
    Besides Tier-1 suppliers, several start-ups are also developing touch feedback technologies to support less distraction and more effective driver-car interaction.
  7. Software systems will be keys of differentiation
    The introduction of Android to in-car entertainment systems was a big step. The need for personalization, simultaneous software and hardware iterations and 3D vision pose new challenges for operating-system development in realizing intelligent cockpit systems.

Source

[1] Summary of: “Automotive Intelligent Cockpit Design Trend Report, 2020” by ReportLinker. Retrieved on 09.01.2022
https://www.reportlinker.com/p06003502/Automotive-Intelligent-Cockpit-Design-Trend-Report.html?utm_source=GNW

Haptics & driving safety

| A summary of an interesting research paper that fits well into the multimodal view of my research on in-car AR solutions.

All information summarised in this blog post was taken from the research survey cited at the end of the post, which contains the exact sources of the statements.

“The Use of Haptic and Tactile Information in the Car to Improve Driving Safety: A Review of Current Technologies” – by Y. Gaffary and A. Lécuyer, 2018

The paper summarizes the results of experimental studies on the above-mentioned topic, categorizes them and discusses findings, limits and open ends.

Several instruments and devices on a car's dashboard require visual attention from the driver, who is already busy with the driving task. While the visual and auditory channels are highly engaged, the tactile and kinesthetic channels could be used for additional, parallel input.

Several sources cited in the paper state that haptic feedback can be perceived even under high cognitive load, more effectively than visual or auditory feedback.

Within the haptic modality there are two kinds of possible feedback:

  • tactile feedback: perception from the skin
  • kinesthetic feedback: perception through muscular effort (force feedback)

Haptic technologies in cars

To transfer haptic feedback, the actuators need to be fitted to specific positions in the car's interface that have a direct connection to the driver: steering wheel, pedals, seat, seat belt, clothes and the dashboard.

One cited source, by Van Erp and Van Veen, classified the types of information that could be transferred through haptics in cars:

  • spatial information about surrounding objects
  • warning signals
  • silent communication only with the driver
  • coded information about statuses
  • general information about settings

This paper focuses on two groups: haptic assistance systems (feedback triggered by voluntary action) and haptic warning systems.

Haptic assistance systems

Controlling the car’s functions

Several sources cited in this paper analysed the influence of tactile feedback on the "eyes-off-road time" with rotary knobs and sliders on the dashboard, central console and steering wheel (the main sources of haptic feedback). The devices had clicking effects or could change their movement friction or vibration frequency. The results were most effective with visuo-haptic feedback (combining visuals and haptics), reducing the glancing time by ca. 0.5 s and 39%. One study found a preference for a 230 Hz vibration on the steering wheel over lower frequencies. With this input method, the vibrations of the road are a limiting factor.

Maneuver support

The paper states that the main source of haptic help for maneuvering is kinesthetic feedback on the steering wheel. Several studies were mentioned looking at difficult driving situations: parking, reversing with a trailer, low visibility. In all of these cases the results showed improvements (lower mental demand at the same performance) when force feedback helped the driver steer in the right direction at the right time.

Navigation

To prevent additional visual or auditory load and distraction, studies were described that used different actuator placements to give directional feedback to the driver. Examples, besides the steering wheel, were the augmentation of waist belts or the driver's seat with actuator matrices indicating navigational directions. The results showed less distraction than with auditory guidance alone, and even a 3.7 times lower failure rate with haptic-auditory feedback.
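
To make the idea of directional feedback through an actuator matrix more tangible, here is a minimal sketch (my own illustration, not taken from any of the cited studies) that maps a target bearing to the nearest actuator in a ring around the driver's waist or seat:

```python
def actuator_for_bearing(bearing_deg: float, n_actuators: int = 8) -> int:
    """Map a direction (0 deg = straight ahead, clockwise) to the index of the
    nearest actuator in a ring of n_actuators around the waist or seat."""
    step = 360 / n_actuators
    return round((bearing_deg % 360) / step) % n_actuators

print(actuator_for_bearing(0))    # 0   -> actuator straight ahead
print(actuator_for_bearing(90))   # 2   -> actuator on the right
print(actuator_for_bearing(350))  # 0   -> wraps back to the front actuator
```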

Haptic warning systems

Awareness of surroundings

Similar to the navigational use cases, studies described in the paper propose augmenting waist belts and seats to give directional information as warning signals about surrounding cars or other objects – most importantly in blind spots or behind the vehicle.

Collision prevention

Collision prevention needs fast driver reactions once the danger is noticed. According to the paper, haptic feedback can significantly improve reaction times. As collision warnings are also based on spatial information, the same methods were analysed in studies as for navigation support or awareness of the surroundings – augmented belts, seats and pedals. One system with actuators in the seat improved the spatial localization of threats by 52% compared to audio-only warnings.

Lane departure

The main methods to warn about lane departure were tactile and kinesthetic feedback on the steering wheel. As the direction has to be corrected by turning the wheel, drivers responded intuitively to the augmentation of the wheel with vibrators and motors. These solutions are already widespread in the automotive industry. Vibrotactile seats and pedals were also tested and found to work better, be less annoying and cause less interference than audio warnings.

Speed control

As the accelerator pedal is the device for controlling speed, this survey reports many studies on its augmentation. They look at implementations of tactile feedback and also force feedback (resistance to pressure and a controlled reaction force). Both methods led to positive results in correcting excessive speeds and maintaining a given speed, and were reported by users to be satisfying and useful.

Limits of existing experimental protocols

There are several limiting factors described, which should be considered for further analysis:

  • The age of users and the differences in perception of haptic feedback. Older people seem to be more affected by them.
  • Augmented seats: the thickness of clothing, the height and the weight of the users.
  • Different ways (habits) of holding and turning the steering wheel.
  • Static vs. dynamic signals can have different effects (dynamic signals were seen to be more effective).
  • Effects of multiple haptic feedback systems working parallel in the same car have to be analysed.

Almost all of the described studies were done with the help of driving simulators. These can deliver comparable results but do not fully represent the real driving environment. Realistic stress and overconfidence in the feedback systems were not analysed either.

My Summary

While driving, the driver is under high visual and auditory cognitive load from the basic tasks. In these cases haptic feedback can be a very effective way to trigger the driver's reactions. The interfaces to be used are limited to the areas with which the driver is permanently in contact (steering wheel, seat, pedals, clothes), plus the dashboard for changing car functions and settings.

It can be concluded that it makes sense to augment those interfaces with haptic feedback that are relevant for the specific task the feedback relates to – for example, tactile or force feedback on the steering wheel for maneuvering support or lane-departure warnings, and haptic feedback from the accelerator pedal for speed warnings.

It is interesting to see that spatial information can be perceived well through the body via vibrator matrices in augmented seats. This method carries more limitations than interfaces touched by the hands though.

The most effective solutions seem to be combinations of modalities (visual-haptic and auditory-haptic feedback), but in all cases the situations and possible use cases have to be considered as well. E.g. a vibration of the seat can be perceived well while parking slowly, but not while driving fast on a bumpy road.

As the information gathered from this paper is based on simulated experiments, I will also try to find further studies or at least reports on currently implemented haptic systems in production cars.

Source

Gaffary, Y. and Lécuyer, A., on Frontiers in ICT 5:5: The Use of Haptic and Tactile Information in the Car to Improve Driving Safety: A Review of Current Technologies; 2018.
Retrieved on 12.12.2021.
https://www.frontiersin.org/articles/10.3389/fict.2018.00005/full

Further automotive AR examples

| continuing my last blog entry on examples

In the past week I was searching for further examples of AR implementations in cars' user interfaces. After three weeks of research on this topic, I currently have the impression that the industry is mainly focusing on visual augmentation as a help for the driver. Here are some further examples that offer new aspects and features.

GMC Sierra HD trailer camera

The 2020 GMC Sierra HD pickup truck features a novel implementation of AR technology. The truck is designed to tow heavy-duty trailers and has a total of 15 cameras to help the driver navigate the very large vehicle. One camera can be mounted at the rear end of the trailer, looking at the road behind. This image can then be blended into the truck's built-in rear-camera view, letting the trailer almost disappear. [3]

I personally find the solution to be a nice gimmick but would question its practical benefit. The view certainly doesn't help with maneuvering the attached trailer.

GMC trailer camera [3]

Land Rover Clearsight ground view

Looking at special utility solutions, Land Rover also implemented camera augmentation in the Evoque and Defender models – on the main screen. The system works with cameras on the side mirrors and on the front grille and helps the driver see a 180° ground view in front of the car and between the front wheels, below the normal field of vision. As Land Rover targets off-road enthusiasts, this feature could be welcome for showing dangerous obstacles on rough terrain, or simply when climbing steep hills. A similar system is also implemented in the higher-class Bentley Bentayga SUV. [7]

Besides this "transparent bonnet" system, the company Jaguar Land Rover also conducted research on a "transparent pillar" solution. It should help drivers in a more urban environment see their surroundings in 360°, including objects hidden by the roof pillars, with the help of cameras and AR. The research was done in 2014 and I couldn't find any further outcome of the idea. Additionally, they have also shown a unique kind of AR navigation help: a ghost car projected in front of the driver that has to be followed along the route. [8] [9]

Jaguar transparent pillar and ghost car concept [8]

Panasonic’s state-of-the-art AR HUD

Panasonic Automotive is another supplier (like Continental, etc.) developing onboard systems for automotive OEMs, such as an AR head-up display with high-end features. Their product was shown at the CES 2021 exhibition and is claimed to be implemented in a series-production car of an undisclosed brand in 2024. The system stands out from other existing HUDs through the following features:

  • AI software for 3D navigation graphics, supporting smooth responses to sudden changes ahead of the car. It also uses information from all the onboard ADAS systems (e.g. a 180° forward-facing radar with 90 m range) and generates updates in less than 0.3 seconds (the spatial-AI AR navigation platform is patented by Phiar).
  • Eye tracking technology to ensure that the driver always sees the projected information on the right place at any head movement.
  • Vibration control: image stabilization for bumpy roads.
  • Advanced optics, 4K resolution with laser and holography technology (by Envisics).

[5] [6]

To cover all relevant sensory fields in a car interior interface, next week I want to focus on the topic of haptics and tactile feedback solutions on the market.

Sources

[1] YouTube video from Roadshow: Car Tech 101: The best ways AR is being installed in cars | Cooley On Cars. Retrieved on 11.12.2021.
https://www.youtube.com/watch?v=PHhvCRexjWQ

[2] YouTube video from Roadshow: 2020 GMC Sierra HD: Heavy-duty hauler debuts “Transparent Trailer” tech. Retrieved on 11.12.2021.
https://www.youtube.com/watch?v=U0gZ9HaCWsA

[3] Online article by Road And Track: The 2020 GMC Sierra HD Can Make Your Trailer “Invisible”. Retrieved on 12.12.2021.
https://www.roadandtrack.com/new-cars/future-cars/a26009122/2020-gmc-sierra-heavy-duty-invisible-trailer/

[4] YouTube video by About Cars: Panasonic’s Innovative Augmented-Reality HUD Could Be in Cars by 2024. Retrieved on 12.12.2021.
https://www.youtube.com/watch?v=cLgMnSTSxog

[5] Online article by Panasonic: Panasonic Automotive Brings Expansive, Artificial Intelligence-Enhanced Situational Awareness to the Driver Experience with Augmented Reality Head-Up Display. Retrieved on 12.12.2021.
https://na.panasonic.com/us/news/panasonic-automotive-brings-expansive-artificial-intelligence-enhanced-situational-awareness-driver

[6] Online article by Auganix.org: Panasonic collaborates with Phiar to bring real-world AI-driven Augmented Reality navigation to its automotive solutions. Retrieved on 12.12.2021.
https://www.auganix.org/panasonic-collaborates-with-phiar-to-bring-real-world-ai-driven-augmented-reality-navigation-to-its-automotive-solutions/

[7] Online article by Car Magazine: Does it work? Land Rover’s ClearSight handy X-ray vision tech. Retrieved on 12.12.2021.
https://www.carmagazine.co.uk/car-news/tech/land-rover-clearsight/

[8] Online article by Autocar: Jaguar Land Rover previews transparent pillar technology. Retrieved on 12.12.2021.
https://www.autocar.co.uk/car-news/new-cars/jaguar-land-rover-previews-transparent-pillar-technology

[9] Online article by Jaguar: Jaguar Land Rover Develops Transparent Pillar And ‘Follow-Me’ Ghost Car Navigation Research. Retrieved on 12.12.2021.
https://media.jaguarlandrover.com/news/2014/12/jaguar-land-rover-develops-transparent-pillar-and-follow-me-ghost-car-navigation

Automotive AR examples

| looking at some state of the art examples of in-car AR systems on the market

MBUX – the newest infotainment system of Mercedes-Benz

2018 was the year when Mercedes-Benz introduced their newest infotainment system, called MBUX. It uses the front camera (originally used for parking) to create a live stream of the road ahead, combined with graphics for navigation hints or finding addresses. Since then it has been continuously improved, and the latest version was revealed in 2021 in the S-Class/EQS models, featuring an AR navigation display and a HUD with distance assist, lane keeping assist and dynamic arrows showing directions.

Video demonstrations of the 2021 MBUX system:
https://www.youtube.com/watch?v=hnRbi5UcJnw
https://www.youtube.com/watch?v=DCgy3askMcM

Audi AR HUD

Audi announced an augmented reality HUD as an optional feature for their newest high-end electric SUV, the Q4 e-tron, for 2021. The visual information shown in front of the driver is similar to MBUX's content. Audi explicitly defines two areas: the status field (at a visual distance of ca. 3 m) and the AR field (at a visual distance of ca. 10 m), which seems to be larger than in the German competitor's solution.

Demo video of the Q4 e-tron HUD:
https://www.youtube.com/watch?v=Ea6o-_smVk8

Hyundai and WayRay

Looking further at HUDs, Hyundai/Genesis was the first brand to implement a laser-holographic AR head-up display in their G80 model, presented by the young AR developer company WayRay in 2019. It is said to have tremendous benefits compared to past HUDs (which use reflected LCD screens) in terms of precision and visibility for the driver.

The Swiss startup WayRay claims to be the only company to have implemented holography in HUDs. The holographic optical elements (HOEs) in their displays are supposed to provide unprecedented 3D images while remaining transparent and capable of being bent to fit windshields. The company presents its uniqueness in the field by covering both "deep-tech" holography hardware development (e.g. blue-laser beams) and software development, all realised in-house.

They have already received large funding from Hyundai and Porsche, have presented a 180° AR cockpit experience and offer different add-on solutions for vehicles, boats and airplanes. Their newest project is a shared-car concept ("Holograktor"), designed for the "Metaverse", with complete gaming/working/learning possibilities while traveling autonomously. In their cooperation with Pininfarina on a concept car, they proposed "True AR" displays also for the side windows, providing new ways of passenger infotainment and entertainment experiences.

A report from the FIA Formula E on WayRay's developments also predicts the use of HUD systems in race cars in the future. Drivers behind the wheel could get visualized ideal racing lines, braking points or a ghost car to chase on the track.

Hyundai’s In-Car Noise Cancelling

Besides HUDs, Hyundai is pushing the development of augmentation solutions in cars in other respects as well. As in our headphones, noise cancelling has also found its way into car interiors, bringing more comfort to the passengers. According to Hyundai, earlier systems were only capable of masking steady engine noise, but their newest solution ("Road Noise Active Noise Control") in the upcoming Genesis GV80 is capable of cancelling different tire noises at different speeds. It uses multiple microphones placed directly in the wheel wells, accelerometers, amplifiers and a digital signal processor. As a result of the complex calculations for each individual wheel, in-car noise should be reduced by half (3 dB).
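
The quoted "half" and "3 dB" fit together because sound levels are logarithmic: halving the acoustic power corresponds to a change of about -3 dB. A quick check of that arithmetic:

```python
import math

# Halving the sound power corresponds to a level change of 10*log10(1/2) dB
change_db = 10 * math.log10(0.5)
print(f"halving the noise power = {change_db:.2f} dB (roughly -3 dB)")
```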

Engine sound enhancements

Writing about car noises, we also have to take a short look at the opposite of noise cancelling – engine sound enhancement devices. Due to the downsizing of engine displacements, the roar coming from the combustion has also been reduced. To keep the emotions connected to sporty engine sounds, manufacturers use additional devices to create compensating sound effects.

These can be pipes from the intake manifold connected to the dashboard wall, in some cases with an extra flap to control the sound throughput only in sporty driving situations (Toyota, Ford, Porsche).

BMW was known to amplify engine sound through a synthesised reproduction of the actual engine noise played simply over the car's speakers.

The Volkswagen Group made it a bit more complicated by adding a special speaker device ("Soundaktor") below the windshield to produce deep, buzzing tones resembling larger engines. In some models there are also speakers built into the exhaust pipes to alter the natural noises coming from the engine and make them more emotional or masculine.

Sound module for the Mercedes G350d

3D ADAS system of Arkamys

Beeping noises have existed in cars for many years, with the intention of helping drivers. But beeping on its own is not always enough to give an understandable signal about what is happening or dangerous around the driver. The company Arkamys presented an intuitive alerting concept for Advanced Driver Assistance Systems – parking, lane keeping, blind spot and other assistants – by placing several different speakers inside the cabin and generating a 3D sound experience. This makes it possible to signal the direction in which a possible danger exists, making the recognition and processing of the information easier and more intuitive for the driver.

Electric cars

Electric cars are further good examples where in-car noise generators are used to give the driver and passengers the familiar feeling of vehicle driving dynamics. Porsche is a perfect example where specific sounds are developed to represent the brand's identity within the driver experience. They call the system "Porsche Electric Sport Sound"; it enhances some natural noises of the drivetrain while reducing disturbing ones, and implements sounds to comply with the legal regulations for electric vehicle alerting sounds.

Thinking further about sound augmentation in cars, probably the most widespread system is the parking assistant, giving beeping feedback on the remaining distance to obstacles around the car. The design of these systems could probably fill a chapter on its own, but as it is already an everyday tool, I won't go further into detail on it.

The examples listed above are not even close to a complete list of use cases. Therefore I want to research the current technologies further. The next step will then be to look into the reasons for these systems: why they were developed and what practical needs, feelings and experiences are the underlying causes.

Sources

Online article on Wired: With In-Car AR, Drivers Get a New View of the Road Ahead. Retrieved on 05.12.2021
https://www.wired.com/story/in-car-ar-drivers-get-new-view-road-ahead/

Article on Wired: Hyundai’s Luxury SUV Mixes Mics and Math for a Silent Ride. Retrieved on 05.12.2021
https://www.wired.com/story/hyundai-genesis-gv80-suv-noise-cancelling/

Online Article on FIA Formula E: How AR and VR are revolutionising the car industry. Retrieved on 05.12.2021
https://www.fiaformulae.com/en/news/2020/june/ar-vr

WayRay – offical website. Retrieved on 05.12.2021
https://wayray.com/#who-we-are
https://wayray.com/press-area/#media_coverage

Online article on WayRay by CNET and Autocar. Retrieved on 05.12.2021
https://www.hyundai.news/eu/articles/press-releases/hyundai-wayray-unveil-next-generation-visual-technology-at-ces-2019.html

YouTube video by Roadshow: CES 2019: WayRay’s holographic AR windshield is real, hitting the road soon. Retrieved on 05.12.2021
https://www.youtube.com/watch?v=HFIgjQI2E6Y

AutoCar article on the Pininfarina concept car. Retrieved on 05.12.2021
https://www.autocar.co.uk/car-news/new-cars/pininfarina-concept-car-showcased-holographic-ar-display

Online article by AutoZeitung: Mercedes entwickelt MBUX weiter. Retrieved on 05.12.2021
https://www.autozeitung.de/mercedes-infotainment-192628.html

Mercedes-Benz MBUX System – online articles and images, retrieved on 05.12.2021
https://www.wired.com/story/in-car-ar-drivers-get-new-view-road-ahead/
https://www.extremetech.com/extreme/314758-2021-mercedes-s-class-2-hud-sizes-level-3-autonomy-4d-sound-5-lcds
https://carbuzz.com/news/new-mercedes-s-class-shows-off-amazing-augmented-reality-display

Audi AR HUD system: online article and YouTube video on Slashgear. Retrieved on 05.12.2021
https://www.slashgear.com/the-audi-q4-e-trons-augmented-reality-head-up-display-is-dashboard-genius-09662735/
https://www.audi-technology-portal.de/de/elektrik-elektronik/fahrerassistenzsysteme/audi-q4-e-tron-ar-hud-de/

Online article on GeekDad: Augmented Reality for Your Ears. Retrieved on 01.02.2022
https://geekdad.com/2016/02/arkamys/

Image of Mercedes G350d soundmodule. Retrieved on 01.02.2022
https://www.tuningblog.eu/kategorien/tuning-wiki/soundgenerator-nachruesten-232502/

CarThrottle article on sound enhancers. Retrieved on 05.12.2021
https://www.carthrottle.com/post/5-ways-that-manufacturers-enhance-the-sound-of-their-cars/

The Porsche Sound – online article, retrieved on 05.12.2021
https://newsroom.porsche.com/de/produkte/taycan/sound-18542.html

AR basics and automotive trends

| a short and basic definition on Augmented Reality, the first implementations in vehicles and current innovation trends

What exactly is Augmented Reality and when was it first used?

To have a clear distinction between related expressions, Paul Milgram's Reality-Virtuality Continuum from 1994 shows the relation of Augmented, Mixed and Virtual Reality in a very comprehensible way. [3] As shown in the illustration below, AR is an evolution of the real environment in the direction of complete virtuality, while still consisting mostly of real content. Augmented Virtuality, on the other hand, describes systems using more virtual than real elements.

Illustration by P. Milgram and H. Colquhoun Jr., in A Taxonomy of Real and Virtual World Display Integration [4]

To have an official definition, in The Concise Fintech Compendium AR is described as “an enhanced version of the physical, real-world reality of which elements are superimposed by computer generated or extracted real-world sensory input such as sound, video, graphics or haptics.” [1]

Already in 1997 R. T. Azuma stated three essential characteristics of AR systems [2]:

  • combining reality with a virtual world
  • interacting in real-time
  • registering in 3D space

Azuma also described the two basic possibilities of combining virtual inputs with the real world: virtual objects can be added to the real perception, or real objects can be hidden by overlaying virtual effects. This is possible not only for optical perception, but also for sound and haptics. He described systems with speakers and microphones that alter the incoming sound of our surroundings (like today's noise cancelling), or gloves with additional haptic feedback of simulated forces. [2] Basically AR could help us enhance all of our senses, but it is mostly implemented in visual systems. [6]

After reading basic theories on Augmented Reality from the early 1990s, one might not expect that the first personal AR system – a head-mounted display (HMD) – was already developed in 1968 at Harvard University by Ivan Sutherland, the "father of computer graphics". [8]

Regarding vehicles and the first implementation of AR, we have to go even further back in time. Elliot Flight Automation, a predecessor of today's BAE Systems plc, together with Cintel, claims the development of the first head-up display (HUD) in operational service in 1961 – for a military aircraft of the British Royal Navy, the Blackburn Buccaneer. [9]

The first HUD in a passenger car is said to have been used in the Oldsmobile Cutlass Supreme Indy 500 pace car made by General Motors in 1988. [10] The following photo depicts this very simple AR solution on the windscreen.

The HUD in the Oldsmobile Cutlass Supreme Indy500 pace car, from 1988.
Source: https://www.autoevolution.com/news/how-to-add-a-head-up-display-to-your-car-136497.html

In the last decades, AR was further developed and implemented in many different areas, and with the evolution of displays, projectors and computer graphics, we now have our own AR applications on our smartphones and in passenger cars. While starting to dig deeper into existing automotive AR solutions, I found the following interesting study as a foundation for framing my topic of interest.

AR innovations in the automotive industry today

A study carried out by the Austrian "innovation intelligence company" StartUs GmbH analysed over 400 startups and created an overview of the most innovative use cases of AR in the automotive industry [7]:

The study chart by StartUs GmbH [7]

They state that the total augmented-reality automotive market is growing by 177% every year and will reach $5.5 billion by 2022. [7]

Of their five areas of innovation, my main focus will be on "Experience Enhancement". The use cases are see-through displays, windshield projectors and various wearables that can give the driver additional, immediate information on important events in the surroundings without distraction. [7]

Existing solutions for this area will follow in my further research.

Sources

[0] Wikipedia – Summaries on Augmented Reality
https://en.wikipedia.org/wiki/Augmented_reality
https://de.wikipedia.org/wiki/Erweiterte_Realität
https://en.wikipedia.org/wiki/Mixed_reality

[1] Schueffel, P.: The Concise Fintech Compendium. Fribourg: School of Management Fribourg/Switzerland, 2017
https://web.archive.org/web/20171024205446/
http://www.heg-fr.ch/EN/School-of-Management/Communication-and-Events/events/Pages/EventViewer.aspx?Event=patrick-schuffel.aspx

[2] Azuma, R. T.: A Survey of Augmented Reality. In: Presence: Teleoperators and Virtual Environments. 6, Nr. 4, 1997, S. 355–385

[4] Milgram, P., Colquhoun Jr., H.: A Taxonomy of Real and Virtual World Display Integration. In: Mixed reality: Merging real and virtual worlds, Springer, 1999, p. 1-26

[5] The basics of Augmented Reality – Interview with an AR expert; Indestry.com; Retrieved on 27.11.2021
https://www.indestry.com/blog/the-basics-of-augmented-reality-interview-with-an-ar-expert

[6] Kipper, G., Rampolla J.: Augmented Reality: An Emerging Technologies Guide to AR; Elsevier; 2013

[7] Online article: How Augmented Reality Disrupts The Automotive Industry; by StartUs Insights Research Blog. Retrieved on 28.11.2021
https://www.startus-insights.com/innovators-guide/how-augmented-reality-disrupts-the-automotive-industry/

[8] Online article by Javornik, A: The Mainstreaming of Augmented Reality: A Brief History; Harvard Business Review; 2016. Retrieved on 28.11.2021
https://hbr.org/2016/10/the-mainstreaming-of-augmented-reality-a-brief-history

[9] Online article by BAE Systems: The evolution of the Head-Up Display. Retrieved on 28.11.2021
https://www.baesystems.com/en/feature/our-innovations-hud

[10] Wikipedia summary on automotive Head-Up Displays:
https://en.wikipedia.org/wiki/Automotive_head-up_display

Augmented Reality solutions and their challenges for in-car UX/UI

| a short introduction to the research topic

Drivers of new cars nowadays are served by several different assistance systems presenting information about the car and its environment. With the evolution of digital interfaces, the cars' GUIs have improved in quality, size and customizability.

Besides displays on the dashboard and center console, head-up displays (HUDs) were developed and have gained importance due to their optimal positioning directly in the driver's line of sight. Displaying information on the windshield is the perfect condition for using Augmented Reality in vehicles. This technology enables a completely new user experience without any need for clumsy AR glasses. There are already some advanced systems in new, high-end cars on the market, and the newest demos of suppliers like Continental and Panasonic indicate a fast spread of augmented reality technologies in vehicles in the future.

With the further evolution of connected cars and smart cities, there will be countless inputs and data to be shown beyond driver navigation. Until cars are completely autonomous, the collected big data will be used more and more to help the driver on the way. On the one hand, some implementations like HUDs will probably become mandatory equipment because of their safety benefits. On the other hand, the data and information have to be selected and designed precisely so as not to overwhelm, confuse or distract the driver in any way.

In my research I want to examine the current and possible future trends of AR solutions for in-car user interfaces and the involved challenges that have to be overcome. Some examples of questions I want to get answers to are:

  • What in-car AR solutions are already on the market and in development?
  • What are the possible digital information groups that can truly help drivers behind the steering wheel?
  • In which ways can AR be implemented to deliver helpful information to the driver?
  • What are the boundaries of the human perception and which psychological aspects have to be considered to design a safety critical in-car AR interface?

Image source: Futurus

Inspirations:

Forbes: Are You Ready for Augmented Reality in your Car?
https://www.forbes.com/sites/bernardmarr/2019/08/26/are-you-ready-for-augmented-reality-in-your-car/?sh=3ffda8973144

Wards Auto: Augmented Reality HUDs Will Make Vehicles Safer, But There’s a Catch
https://www.wardsauto.com/vehicles/augmented-reality-huds-will-make-vehicles-safer-there-s-catch

Next Reality News: Augmented Reality in Cars — the Companies & Tech Driving Us into the Future
https://next.reality.news/news/augmented-reality-cars-companies-tech-driving-us-into-future-0182485/

First literature examples:

Pesce, M.: Augmented Reality – Unboxing Tech’s Next Big Thing; Polity Press, 2021

Aukstakalnis, S.: Practical Augmented Reality: A Guide to the Technologies, Applications and Human Factors for Ar and Vr (Usability); Addison-Wesley Professional, 2016

Castro, C.: Human Factors of Visual and Cognitive Performance in Driving; CRC Press, 2009

Azzam, M.: Virtual Reality and Augmented Reality Safety Rules; Independently published, 2019

Jacob, P. & Jeannerod, J.: Ways of Seeing: The scope and limits of visual cognition; Oxford University Press, 2004

Wiklund, M.E., Costantino, C., Post, R. et al.: Designing for Safe Use: 100 Principles for Making Products Safer; CRC Press, 2019

Parker, P.M.: The 2021-2026 World Outlook for Augmented Reality Products; ICON, 2020