In the future, cars will become increasingly intelligent, and there will be ever more human-computer interaction between the vehicle and its occupants. Emotion perception is already a relatively mature concept: the car analyzes and recognizes the emotional state of the occupants and responds through interior ambient lighting, fragrance, vibration, and other cues to enhance the comfort and safety of the ride.
An emotion perception solution typically uses a camera for face recognition together with AI biosignal sensors, analyzes the driver's emotional state in real time with dedicated software, and then soothes or invigorates the occupant through interior elements such as ambient lighting and fragrance. Face recognition methods include geometric recognition, three-dimensional recognition, skin-texture recognition, thermal-imaging recognition, and so on.
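The actuation side of such a pipeline is essentially a lookup from a classified emotion to a set of cabin settings. The sketch below illustrates the idea; the emotion labels, colors, and fragrance names are assumptions for illustration, not any vendor's actual configuration.

```python
from dataclasses import dataclass

@dataclass
class CabinResponse:
    light_rgb: tuple   # ambient lighting color (R, G, B)
    fragrance: str     # fragrance cartridge to release
    audio: str         # audio program to play

# Illustrative mapping from a detected emotion to cabin actuation.
# A real system would drive lighting, fragrance, and audio controllers
# from a classifier running on camera and biosignal input.
RESPONSE_TABLE = {
    "calm":     CabinResponse((255, 180, 100), "none",     "ambient"),
    "stressed": CabinResponse((80, 120, 255),  "lavender", "soothing"),
    "sad":      CabinResponse((255, 140, 60),  "citrus",   "upbeat"),
    "drowsy":   CabinResponse((255, 255, 255), "mint",     "energetic"),
}

def respond(emotion: str) -> CabinResponse:
    """Return cabin settings for a detected emotion; default to 'calm'."""
    return RESPONSE_TABLE.get(emotion, RESPONSE_TABLE["calm"])
```

Keeping the policy in a table like this makes it easy for an OEM to tune the mapping per market or per driver profile without touching the recognition code.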
Emotion perception links the previously separate ambient lighting, fragrance, and audio systems of the car into a coordinated whole, bringing a more comfortable travel experience.
Car Ambient Lighting with Emotion Perception Development in OEMs
Kia demonstrated Real-time Emotion Adaptive Driving (READ), which focuses on the driver's emotional state to create a more enjoyable mobility experience. It uses AI-based biosignal recognition technology to analyze the driver's emotions in real time and to optimize and personalize the vehicle's seating space. Based on this intelligent evaluation, the system changes the cockpit lighting, temperature, and music to adjust the atmosphere and make the driver's mood more pleasant and relaxed.
The system was jointly developed by Kia and the affective computing team of the MIT Media Lab. READ monitors the driver's emotional state in real time through AI biosignal recognition: sensors check facial expressions, heart rate, electrodermal activity, and other signals to determine the driver's mood, and the cockpit environment is then customized to create a "more pleasant driving experience."
Toyota Boshoku launched AceS (Active Comfortable Engagement Space), which adapts to the driver based on detection of physical condition, posture, and emotion:
Like many concept cockpits, AceS has an emotion-sensing system. When the onboard computer determines that an occupant is in a bad mood, it tries to improve it by releasing fragrance and adjusting the ambient lighting. If the sensors detect that the driver has become drowsy, it uses music and vibration to help them stay awake.
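The two rules described above can be sketched as a small decision function. The score ranges and thresholds here are hypothetical, not Toyota Boshoku's actual values; they only show how bad mood and drowsiness would trigger different interventions.

```python
def aces_actions(mood_score: float, drowsiness: float,
                 mood_threshold: float = 0.4,
                 drowsy_threshold: float = 0.6) -> list:
    """Decide cabin interventions from normalized scores in [0, 1].

    mood_score: higher means a better mood (assumed scale).
    drowsiness: higher means the driver is sleepier (assumed scale).
    """
    actions = []
    if mood_score < mood_threshold:      # occupant in a bad mood
        actions += ["release_fragrance", "adjust_ambient_lighting"]
    if drowsiness > drowsy_threshold:    # driver becoming drowsy
        actions += ["play_alert_music", "vibrate_seat"]
    return actions
```

Because the two checks are independent, a driver who is both unhappy and drowsy would receive all four interventions at once.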
In short, the Toyota Boshoku approach uses sensors to perceive occupant emotion and then adjusts it through ambient lighting, fragrance, and music.
At the 2019 CES show, Panasonic Automotive unveiled its latest SPYDR 2.0, a single-brain cockpit domain controller solution equipped with a driver monitoring system (DMS) integrated with a head-up display (HUD).
The DMS HUD uses Panasonic's proprietary DMS algorithm to achieve seamless integration based on virtual imaging. The driver-monitoring camera is embedded in the HUD device, drawing on Panasonic's expertise in manufacturing precision optical components such as projectors and cameras, and the system is 40% smaller than comparable products on the market today. In addition, the system can track the driver's face and adjust the HUD mirror and display vertically according to the driver's facial features and position.
Panasonic also exhibited its "Human Insight Technology," which covers three areas: human-body sensing, which recognizes and estimates attributes such as face, age, gender, and vital-sign information (heart rate) using only camera capture, image processing, and artificial intelligence; body-load sensing, which digitizes and quantifies body-load characteristics through 3D sensing of human activity in a space; and emotion sensing, which combines components such as cameras, thermal-imaging cameras, pressure sensors, and odor sensors with analysis algorithms to accurately estimate human conditions such as emotion, body temperature, weight, and pressure.
Hyundai Mobis eMotion emotion mode: the vehicle detects the emotions and mood states of the occupants through facial recognition and provides analysis and feedback on the results, which occupants can experience as a game. In addition, the ambient lighting changes with the occupants' mood.
In the I-mobility TYPE-C concept car released by Aisin, the traditional compartment on the left side supports only semi-autonomous driving. When the vehicle's driver-status perception system detects an abnormal state such as drowsiness, the seat vibrates to wake the driver and refocus their attention on driving.
Of course, as leaders in the automotive industry, Mercedes-Benz and BMW have already taken action on emotion perception. For example:
At CES Asia 2017, Mercedes-Benz presented the Fit & Healthy smart health concept car, which collects emotional information through passive sensors on the steering wheel and wearable devices such as smart bracelets worn by passengers, and then creates different atmosphere scenes to fit the occupants' current emotional state through sound and light systems, seat massage, fragrance release, and other means.
During the 2018 BMW Summer School cross-disciplinary event, 34 international doctoral students, researchers, and industry professionals shared their views on the current state and challenges of emotion-aware vehicle assistance technology.
In the future, the car will become the second most intimate and private space outside the home, and interior ambient lighting has become a design element that cannot be ignored. Ambient lighting is still in its market-introduction and growth stage; designing it well and fully realizing human-car interaction will require the joint efforts of OEMs, interior-lighting factories, interior-trim suppliers, and makers of materials, controllers, films, light-guide materials, and other components.