The intended purpose
With this deck, we at PCH INNOVATIONS present our vision of new building blocks for HMI systems by 2018+ that could be market-ready within the next five years. Before presenting these solutions, the deck opens with a wrap-up of the technological and societal givens of 2018+ that will challenge and shape interface design. We then provide a comprehensive assessment of the current HMI landscape from a holistic research perspective.
Today’s best-in-class user experiences are clearly defined by the smartphone, the gaming industry and the sci-fi genre. But direct translation from consumer electronics and other disciplines into the automotive world still has its obstacles and limitations. Defining the user experiences and rituals for cars 2018+ means meshing traditional usage patterns with a wide range of new functions such as real-time, cloud-based infotainment, in-car health, in-car education, etc., driven by the coming revolution of autonomous driving.
Our Collaborative Vision
Having worked in the automotive industry for 20 years, we have found ourselves at the center of HMI design and development in recent years.
This, along with the continuous dialogue with our clients at Jaguar Land Rover, VW, Audi, Denso, etc. and with our partners at MIT, Google X, UCLA, TU Berlin and others, drove us toward a radically new generation of HMI and cockpit design, founded on the principles of intuitive, effortless, aural and highly adaptive interaction. With this vision, we strive for advanced but accessible and highly scalable solutions with respect to pricing, regional specificities and consumer groups, because we believe in the HMI as an important differentiator to counteract the overall decline in the importance of cars and the brands behind them.
With ‘Blended Drive’ we clearly intend to change a fundamental part of the driving experience: we have a sensory transformation in mind, with interfaces and graphics that are grounded in reality and make previously invisible things visible and experienceable. That could make any HMI more organic and far more intuitive to use, changing the way we see, interact and actively participate.
CHAPTER 1: UNDERSTANDING THE WORLD BY 2018+
Looking to 2018+, exponential innovation and technological immersion are no longer deniable. Our environment will be equipped with autonomous, context-aware and energy-self-sufficient sensors, re-articulating our actions, conversations, perspectives and relationships in powerful new directions. We are already products of our environment; soon our environments will become extensions of us. Sensors, tagged objects and data will act as artificial nerves, extending our nervous systems into the world beyond and enabling the environment, as well as the car, to become a proactive, interactive and perceptive sentient system. That will redefine our relationships with products and services, and especially our mobility rituals and routines.
These new modes will inherently change the way we interact as individuals, as groups and on a global scale. By 2018+, we will encounter unseen and all-encompassing interaction spheres. To bring this new sense of space into the automotive context, the industry must ask itself: How can in-car interactivity broaden our borders and horizons by 2018+? How will an empathically responsive environment enhance our in-car routines and experiences?
With the rise of automated and intuitive data reasoning, cars will become devices and more personalized partners, storing and reflecting our daily routines. Utilizing this information beyond our home, leisure and work environments, cars will help us improve ourselves through highly informed, better decisions, activities and behavior.
With ubiquitous and more natural interaction, real-time monitoring capabilities will extend our focus and reach, and the way we interact with our surroundings and with each other. If the automotive industry starts embracing these new levels of intuitive interaction, it can also offer greater possibilities for broadening the natural, temporal, spatial and social borders of our daily in-car routines.
CHAPTER 2: ANALYZING HMI STATE-OF-THE-ART 2015-
Before developing our own solutions for HMI systems, new interaction modes and content-provisioning logics that sufficiently address the challenges of the upcoming automotive revolution, we defined reference points and investigated the capabilities, limits and consumer reactions of state-of-the-art HMI systems (2015-), developed by automotive OEMs, system suppliers, research labs and players from other industries.
This analysis covers HMI-relevant interior components, augmentation technologies, multi-screen setups, and mechanical and digital controls; in addition, we assessed UI flows, graphic schemes and aesthetic implications, and aspects of data intelligence. We also considered the most recent and upcoming HMI technologies such as gesture control, eye tracking, aerial feedback and controls, speech recognition and biometrics. And because we believe the purpose of the car is changing drastically, we analyzed health-, eco-drive-, educational- and autonomous-drive-assisting systems.
From the 25 systems and concepts we analyzed, we derived some crucial TAKE-AWAYS, which we would like to pass on to you:
The continuously rising quantity of in-car information and information layers makes it more and more complex to keep crucial information clear and well structured for the driver. HMI systems have to have a clear functional and control hierarchy, e.g. by separating mechanical and digital functions (not every function will be digital, not even by 2020!).
Today, we are still unable to personalize content in the car, to share it between passengers and across screens, or to take it out of the car, which is key for trans-modal mobility services. In-car content management still cannot handle human errors; it is not forgiving, and not predictive and intuitive enough. People expect simple or even ‘blind’ interaction logics that use age-old muscle memory and learned usage patterns; learning times for HMIs should be kept to a minimum, e.g. for the rising number of ad-hoc drivers.
For the HMI to know the driver and the passengers by heart and to make the driving experience truly individualized, OEMs and system suppliers will have to think well beyond biometric data, because single HMI technologies such as gesture control, voice control or eye tracking can only be leveraged to their full potential by combining them with new kinds of haptic feedback mechanisms. Cognitive monitoring via brain-vehicle interfaces comes into play when we want to detect the origin of our intuitions, intentions and emotions. Combined with deep-learning algorithms, this would also allow the car to become more individualized and far more adaptive, intuitive and fluid.
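To illustrate what combining such modalities could look like, here is a minimal, hypothetical sketch of confidence-weighted intent fusion; all names, signal shapes and the threshold are our own assumptions, not part of any production HMI:

```python
from typing import Dict, Optional, Tuple

def fuse_intents(signals: Dict[str, Tuple[str, float]]) -> Optional[str]:
    """Pick the intent with the highest combined confidence across modalities.

    `signals` maps a modality (e.g. 'gesture', 'voice', 'gaze') to a
    (detected intent, confidence in 0..1) pair.
    """
    scores: Dict[str, float] = {}
    for _modality, (intent, confidence) in signals.items():
        scores[intent] = scores.get(intent, 0.0) + confidence
    if not scores:
        return None
    best = max(scores, key=lambda k: scores[k])
    # Only act (and e.g. confirm via haptic feedback) when modalities agree
    # strongly enough; a single weak signal is ignored.
    return best if scores[best] >= 1.0 else None

# Gesture and gaze agree on 'accept_call'; the combined score (1.3) passes.
print(fuse_intents({"gesture": ("accept_call", 0.6), "gaze": ("accept_call", 0.7)}))
```

In such a scheme, the haptic channel would fire only on a successfully fused intent, which is one way single technologies reinforce each other instead of competing for the driver's attention.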
With respect to the rising need to address the upcoming challenges of autonomous driving, we believe the vehicle interior’s architecture, versatility and functions should be rethought. The progressing evolution from autonomous driving level 2 to level 3 (and later 4) requires a driver cockpit and vehicle interior far more adaptive and physically reconfigurable than today’s.
CHAPTER 3: DECRYPTING THE UPCOMING AUTOMOTIVE REVOLUTION – Part I
When mankind took the bold step from horses to very basic ‘cars’ back in 1883, automobiles, in an age of very basic transportation and mobility, remained a privilege for a few wealthy pioneers. After World War I, when cars started to be equipped with rubber tires and cities with traffic-signaling systems, more and more people could afford a car and suddenly experienced an unprecedented freedom.
The 1930s and 1940s brought the first major changes in vehicle morphology: aerodynamics and car bodies made from pressed metal, driven by the consumer’s desire for speed. In response, the OEMs often adapted the symmetrical shells of boats and airplanes.
The 1950s and 1960s brought sculptural elements into car design, to control reflections on glossy painted surfaces. The seamless progression and transition of vehicle surfaces evolved as boat carpenters and airplane designers had perfected the mathematics of symmetry and continuity over decades, even centuries. Beauty, elegance, comfort and thereby luxurious interiors became the consumer’s preference. With rising speeds and better infrastructure in the 1960s and 1970s, serious safety and ecological concerns drove the political, societal and economic agenda, which led the OEMs to focus on smaller cars, on the first alternative drivetrain concepts and on passive safety features. The big players combined these developments with fully automated assembly processes.
Cars in the 1980s became more complex, more diversified, safer and thereby loaded with electronic features, answering the consumers’ more rational and conscious mobility needs. The 1980s and 1990s already introduced us to the first driver-assistance systems, enhanced electronic safety features and enforced efficiency; from then on, vehicle design itself evolved only slowly and logically, due to stricter legislative demands and a focus on technological integration.
The 1990s were all about optimizing and strengthening the vehicle structure, compensating for the increase in weight simply with stronger engines. At the same time, the hybrid movement was initiated, and the first CAN bus system and head-up display (HUD) made their way into series production, before the 2000s, with GPS, park assist, Bluetooth integration and other more advanced assistance systems, gave us the first hints of the connected car.
With global warming and a global financial crisis in mind, we saw consumers, the automotive industry and especially new players introduce a change that was overdue. Alternative drivetrains hit everybody’s private or commercial agenda; new mobility services, trans-modal systems and smart transportation-related ecosystems arose; and several new high-tech in-car applications made the 2010s the beginning of partly autonomous and hyper-connected vehicles, at least three to five years earlier than expected.
Unlike many other industries, most automakers still struggle to adapt and transform themselves so that they can cater to the drastically changing mobility usage patterns and in-car routines. That leaves a lot of vehicle systems, interfaces and content provisioning either outdated or too fancy and complex. At the same time, the OEMs are endeavoring to evolve basic, familiar safety and information features into infotainment systems that integrate technologies such as natural voice recognition, advanced use of contextualized services, multi-modal and multi-screen HMI logics that respond to gestures, and AR windows that promise not only to increase driving safety but also to engage passengers individually, and finally to turn the car into a personal platform.
CHAPTER 3: DECRYPTING THE UPCOMING AUTOMOTIVE REVOLUTION – PART II
But what is happening now couldn’t be phrased better than CNN’s Steven Johnson recently did: “For almost 100 years, the underlying conventions of automobile technology have been stable ones: we own our own gas-powered cars, and drive them (mostly by ourselves) to work and back. But the eruption of innovative startups working in the automotive field ... suggests that the entire stack of technologies and behaviors associated with automobiles may be ripe for reinvention.”
So now, mostly driven by out-of-industry players such as Google X Labs, Uber, Tesla, Boston Dynamics, Lyft, SideCar, etc., the OEMs face a grand revolution: today’s cars are already making the shift from manual to (partly) autonomous driving sooner than many expected. Built-in sensors, cameras and radars enable the vehicle to take over much of the basic driving task. Pushed by stunning leaps in AI-based applications, robotics and other technological fields, autonomous mobility is basically around the corner. The concept has moved from research-and-lab status to safe, on-road prototypes, at least five years before the OEMs thought it would.
(Partly) autonomous and self-driving cars will definitely save lives; they will be more efficient, provide time savings and environmental benefits, and be much better at navigating through traffic and minimizing congestion. They will certainly offer new freedom to mothers and fathers with kids in the car, and to elderly and handicapped people. This new generation of cars (and services) will increase in-car safety and thereby mobility for drivers at risk from, e.g., low blood pressure, strokes, diabetes, spasticity, etc.
To make that leap now, and to reduce the need for expensive infrastructure updates, the sensor-based and connected vehicle has to advance in data security, privacy and analytics, and has to become far smarter at mimicking human senses and understanding human intentions (even before the human realizes them). And the OEMs have to rethink ownership models, expand opportunities for vehicle sharing, and create ecosystems that drive down the cost of mobility, address changing demographics and the different needs of new consumer tribes, and finally help democratize mobility beyond the Western world.
Chapter 4: CHARGING THE CAR WITH A NEW PURPOSE
But with all this coming, the main focus of HMI and system development is to keep the human more in the loop than ever before, because level 2/3 autonomous technologies will already free the driver to do other things and be far more productive while driving. That requires changing the way technology, information and people interact, because the car of 2018+ has the potential to become the driver’s home, work environment and even life-support system, keeping the driver engaged along the entire ride: not bored, not falling asleep, and not struggling to maintain the necessary level of attention.
In order to define future HMI requirements and to develop new, consumer-relevant HMI solutions, we need to understand the future mobility patterns and driving styles of 2018+, and thereby how people will be using their cars. With exponentially increasing technological possibilities, the car’s capabilities also have to undergo significant change to really leverage this potential. The car should enable the driver to transition seamlessly between different modes of transportation, to execute daily routines and rituals in the background while driving, and to stay connected on the go. For that, input and content streams have to be fused and probabilistic data has to be provisioned (outside-in), in order to guide the driver and passengers through the infrastructural, cultural and social layers of the car’s surroundings, based on the driver’s interests, mood, context and location, traffic, construction areas, schedule, etc. We also envision solutions where the vehicle informs its surroundings about the driver’s and the vehicle’s intentions (inside-out).
Another key purpose of the car 2018+ will be to make the passengers feel relaxed and to keep them in balance at all times. Their rational well-being is defined by a 360-degree-transparent footprint and a feeling of comfort and maximum safety. The emotional side of well-being addresses new features that help the driver not to worry, to de-stress and relax while driving, to reboot and refresh his/her senses, and to have a meaningful experience on the road. These new functions require continuous monitoring of the driver’s state of mind, body and soul; possible threats from the environment, driving styles and footprint are monitored in real time, letting the vehicle systems react smoothly to the driver’s condition. In other words, we envision a car that not only visualizes and uses key vital signs for safety maximization and the convenience of, e.g., in-car health or educational services, but one that is able to pre-formulate the driver’s demands.
Since we live in times of intellectual athleticism, where people strive for lifelong learning and continuous education, the car is the perfect environment for people of all ages to learn a new language, to learn about a city’s narrative and history, etc. The vehicle, as we see it for 2018+, leverages the passengers’ physical and cognitive capabilities by dynamically providing the appropriate dose of continuous, contextual learning matched to the passengers’ skills, by unveiling the most unique, original and ‘insider’ information about the surroundings, and by offering new meaning and experiences.
Chapter 5: INVENTING NEW HMI BUILDING BLOCKS
In the previous slides we hopefully gave you a deep understanding of why we believe the car’s purpose is changing, in resonance with changing driver and passenger habits, needs and preferences, under the assumption of gradually introduced autonomous-driving levels. With the design of interface solutions by 2018+, we continue to forge a new horizon of interactive possibilities that broaden our spatial, temporal and social borders by creating responsive and proactive sentient systems.

Our relationships with interfaces are radically changing: being connected and in a flow, getting inspiration, personal enrichment, being in control, switching modes and having peace of mind will drive the attributes and functions of future interfaces. These altered needs raise new interaction patterns and rituals, e.g. dealing with inside-out/outside-in information, or the vehicle as a real-time learning and knowledge hub, a health space, a concierge, an experience machine, a retail store, etc. Interfaces 2018+ will let us experience full immersion in a natural way, and we will perceive our devices and cars more as partners. Soon we will be used to spatial recognition and to using our mirrored digital identities, benefiting from augmented memories, ubiquitous real-time reflection and even parallel, contextualized realities.

Against this background, the design of in-car interface solutions has to reflect driver-centered but context-adaptive control ergonomics and a fused dialogue architecture, enabling a hyper-personalized, true human-surround experience that facilitates an empathic perception of our surroundings.
Today’s HMI solutions keep the driver’s attention mostly on the dashboard; the graphical user interface (GUI) and the content provisioning are often too complex and not adaptive to the current traffic situation. There is no sufficient classification between permanent and temporary, primary and secondary information, and very little option for customization. With the ‘Blended Drive’ concept we want to enable the driver to keep his/her eyes on the road at all times, with maximum simplicity in the GUI and the content. We developed a concept for proper content classification, driven by a highly dynamic and spatial (360-degree) content organization across the de-/populating, multi-layered and highly customizable displays.
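As a minimal sketch of such a permanent/temporary and primary/secondary classification, one could model content items and their display zones as follows; the zone names and the mapping rule are illustrative assumptions, not our concept specification:

```python
from dataclasses import dataclass
from enum import Enum

class Persistence(Enum):
    PERMANENT = "permanent"   # always present, e.g. current speed
    TEMPORARY = "temporary"   # event-driven, e.g. a collision warning

class Priority(Enum):
    PRIMARY = 1    # safety-relevant, must stay near the driver's line of sight
    SECONDARY = 2  # comfort/infotainment, may be relocated or hidden

@dataclass
class ContentItem:
    label: str
    persistence: Persistence
    priority: Priority

def assign_zone(item: ContentItem) -> str:
    """Map a classified content item to a display zone."""
    if item.priority is Priority.PRIMARY:
        # temporary primary items pop up in the head-up area, on the road axis;
        # permanent primary items live in the instrument cluster
        return "hud" if item.persistence is Persistence.TEMPORARY else "cluster"
    # everything secondary populates the peripheral center display
    return "center_display"
```

A de-/populating display would then show or hide items per zone as the driving situation changes, keeping the primary layer clear at all times.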
With respect to manual and autonomous driving, our HMI concept also has the purpose of contextually communicating the advantage of the automated vehicle systems and of providing one clear, action-oriented message to the driver in any situation. Even if a particular action (such as braking) is triggered by the vehicle systems, the HMI must behave consistently and communicate proactively where to look and what to do to support the vehicle system, and to avoid anxiety and discomfort for the driver. Increased traceability, understandability and education on automated decision-making (ADM) will avoid surprises and fearful situations.
Following the naturally random positioning of dangers in traffic, the ‘Blended Drive’ concept uses all available space inside and outside the vehicle and features multi-sensory input channels (e.g. touch, hearing, feeling) for anticipatory and ambient warnings.
Besides all the new functions and features, safety still remains our key concern. In order to make ‘safe’ decisions while operating a car, the OEMs have to make sure that certain signals, warnings and tasks are processed properly by the driver and the passengers. This can only be achieved if the cerebral, cognitive proof is monitored by the vehicle’s systems in real time. Cognitive signals can only be sensed, processed and reliably analyzed by brain-computer interfaces (BCI). We are not talking about BCI solutions in which people would control a car by thought; we see BCI as a key background technology to ensure that certain signals and tasks are properly perceived and cognitively processed by the driver.
BCI technology can reduce critical work-overload conditions, monitor the driving style to detect aggressiveness, or monitor the state of drowsiness to drastically enhance safety. And a BCI system can not only optimize but even enable the effective use of new technologies such as gaze detection, person identification, populating/de-populating and interactive interfaces, biometric monitoring, etc. In conjunction with the technologies named above, BCI will enable mind-free, blended driving and will make the car a true human-surround interface.
HMI BUILDING BLOCK 2 – Brain Vehicle Interface (BVI)
Chapter 6: Outro
We need to close the loop between the driver, the car and the HMI, so that the car is aware of the driver’s state and the driver is aware of the driving situation. That requires a holistic multi-modal, multi-touch and multi-zone HMI design that takes the environment, driver expectations, driver state and individual preferences into account, resulting in state-compliant content provisioning and situation-specific warnings and information, which leads to a much safer, more effective and more pleasurable driving experience and a joyful ride.
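Closing that loop could be sketched, very roughly, as a rule that filters content by driver state; the thresholds, signal names and content classes below are hypothetical placeholders:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DriverState:
    workload: float    # 0.0 (relaxed) .. 1.0 (overloaded), e.g. from biometrics/BCI
    drowsiness: float  # 0.0 (alert) .. 1.0 (asleep)

def provision_content(state: DriverState, requested: List[str]) -> List[str]:
    """Return the content classes the HMI may show, given the driver's state."""
    allowed = list(requested)
    if state.workload > 0.7:
        # overloaded driver: suppress everything except safety-critical content
        allowed = [c for c in allowed if c == "safety"]
    if state.drowsiness > 0.6:
        # drowsy driver: prepend an alerting stimulus before any other content
        allowed.insert(0, "wake_up_alert")
    return allowed
```

For example, a driver with high workload would see only safety content, while a drowsy driver would first receive an alerting stimulus; this is the "state-compliant content provisioning" idea in its simplest possible form.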
The simplicity and intuitiveness of the ‘blended’ modes and logics of interaction shown here also take into account the increasing age of drivers in mature markets and the growing popularity of car sharing, especially among young drivers, which brings the need for personalization for various drivers in the same vehicle. We vote for new logics in the dynamic and balanced orchestration of the various displays and information sources, supplemented by 360-degree ambient, multi-sensory and natural interaction schemes, and by sound-, aerial-, tactile- and ultrasonic-supported feedback.
Having said that, safety clearly remains the cornerstone of HMI design. Minimizing driver distraction while optimizing the driver and passenger experience, and at the same time exploiting the potential of new interaction schemes, is a real opportunity for intelligent design concepts that provide the driver and passengers with a unique and compelling experience.
Once these ‘blended’ modes and logics of interaction have been implemented successfully, cars by 2018+ will be essential to unlocking and re-enabling the creative genesis of cities, homes and companies. We will be incited to travel new routes, leave the beaten path behind and adopt new rituals by virtue of previously unseen possibilities and advantages, because consumers will desire interactive tools and schemes that not only optimize safety, productivity and savings, but also foster quality of life, value of time and love of curiosity, and reveal pure serendipity.
Seamless in-vehicle interactivity will enable us to empathically sense our situations and challenges, amplify our anticipation of individual and societal needs, and act in our mutual best interests, as if guided by a reliable assistant, close friend or mentor.
We can already see fundamental building blocks of autonomous and even driverless cars on the road. Systems that provide crash prevention, emergency braking, self-parking, lane keeping, etc. have been on the market for several years. However it plays out, autonomous and self-driving vehicles are on the rise, and fast. Their full adoption will take some time, but their convenience, cost, safety and other factors will make them ubiquitous and indispensable. As with any technological revolution, the companies that plan ahead, adjust the fastest and imagine the biggest will survive and thrive, while companies invested in old technology and practices will need to evolve or risk losing influence.
The same goes for rigid attitudes towards the integration of cutting-edge technologies and revamped interface design into HMI development 2018+. The future device and content logic, and the visual language for screen or mid-air design, should refer not so much to the known computer/phone/wearable interfaces, but rather to broader visual references such as luminescent underwater wildlife or astronomical phenomena. The visual language of future HMIs should be more organic and natural, and should reflect our surroundings in such a way that the actual technology behind it appears almost foreign and unrecognizable to its users.
Regardless of how far into the future we speculate, usability and functionality are key: the HMI should encompass a robust, utilitarian screen design, narrative clarity and brand-specific ingenuity, affordable and accessible for everybody. That is ‘The Blended Drive 2018+’.
PCH’s HMI CORE TEAM
Partner & Managing Director
Concept/ Visual Design
Concept/ Visual Design
Digital & Hardware Prototyping
SPECIAL THANKS TO
OUR HMI CLIENTS
Jaguar Ltd., Audi AG, Denso Corporation, FEV GmbH
OUR INTERFACE TEAM & PARTNERS
Ash Thorp, Ryan Cashman, FEV GmbH, TU Berlin, UCLA Health Lab, Google Automotive Group, Ostendo, Mirai Media Lab
All documentation, discoveries, designs and methods of PCH INNOVATIONS GMBH remain solely the intellectual property of PCH INNOVATIONS GMBH. The analysis and works presented in the form of ideas, text, graphic design, product design, engineering plans, photographs, audio, video and other storage media made available along these concept lines reflect the realization of the concept by PCH INNOVATIONS GMBH. Free use of these materials is only possible under contract with the creators and through the fulfillment of that contract, and only where a contract with the creators grants entitlement to such use. Where these rights are granted, usage is determined by the spatial, temporal and substantive scope of the rights of use. Usage, in whole or in extracts, without the consent of the creators, and its disclosure to third parties, constitutes a breach of copyright with all legal consequences. PCH INNOVATIONS GMBH exclusively performs its services under its General Terms and Conditions, which are available on our homepage (www.pch-innovations.com). By submitting an offer of contract to PCH INNOVATIONS GMBH you agree that the General Terms and Conditions of PCH INNOVATIONS GMBH shall exclusively apply to the performance of that contract.