DUBLIN, February 13, 2024 /PRNewswire/ — The "China Automotive Multimodal Interaction Development Research Report, 2023" report has been added to ResearchAndMarkets.com's offering.
The China Automotive Multimodal Interaction Development Research Report, 2023 examines in detail the mainstream cockpit interaction modes, the application of interaction modes in major vehicle models launched in 2023, supplier cockpit interaction solutions, and the trend toward multimodal interaction convergence.
A review of the interaction modes and features of new models announced over the past year shows that active, anthropomorphic, and natural interaction has become mainstream. In single-modal interaction, the control range of mainstream modes such as touch and voice has expanded from inside the vehicle to outside it, and in-vehicle application cases of emerging modes such as fingerprint and electromyographic recognition are increasing. In multimodal fusion interaction, combinations such as voice + head posture/face/lip language and face + emotion/scent are now available in vehicles, creating more active and natural human-vehicle interaction.
Large-scale AI models are evolving from single-modal operation toward multimodal, multitask convergence. Whereas a single-modal model can process only one type of data, such as text, images, or audio, a multimodal model can process and understand multiple types of data, including visual, auditory, and linguistic inputs, enabling it to better understand and generate complex information.
As multimodal foundation models continue to develop, their capabilities will improve significantly. This will give AI agents more sophisticated cognitive and environmental-understanding abilities, enabling more intelligent automated decision-making and action, opening new possibilities for automotive applications and a broader outlook for future intelligent development.
Spark Cockpit OS, developed by iFlytek based on the Spark model, supports multiple interaction modes such as voice, gestures, eye tracking, and DMS/OMS. Spark Car Assistant enables multi-intent recognition through deep contextual understanding, delivering more natural human-machine interactions. First featured on EXEED Terra ES models, the iFlytek Spark model brings five new experiences: Vehicle Function Tutor, Empathic Partner, Knowledge Encyclopedia, Trip Planning Expert, and Physical Health Consultant.
The AITO M9, released in December 2023, features the built-in HarmonyOS 4 IVI system. Xiaoyi, the intelligent assistant of HarmonyOS 4, is connected to Huawei's Pangu model, which includes natural language models, visual models, and multimodal models. The combination of HarmonyOS 4 + Xiaoyi + Pangu model further enhances ecosystem features such as device collaboration and AI scenarios, and uses multimodal interaction technology to support diverse interaction modes such as voice recognition, gesture control, and touch control.
Main topics covered:
1 Overview of multimodal interaction
1.1 Defining multimodal interactions
1.2 Multimodal interaction industrial chain
1.3 Multimodal fusion algorithm
1.4 Multimodal interaction policy environment
2 Touch-based human-computer interaction
2.1 Haptic interaction development route
2.2 OEM haptic interaction highlights
2.3 Cockpit display trends
2.4 Development trends of smart surface materials
2.5 Haptic feedback mechanism
3 Auditory-based human-computer interaction
3.1 Development route for voice functions
3.2 Summary of audio functions of OEM companies
3.3 Overview of OTA updates for OEM voice features
3.4 Development trends of voice interaction personas
3.5 Application of voiceprint recognition to car models
3.6 Trends in customizing voice functions
3.7 Major suppliers of voice functionality
3.8 OEM voice function development model
4 Vision-based human-computer interaction
4.1 Face recognition
4.2 Gesture recognition
4.3 Lip movement recognition
4.4 Other visual interactions
5 Human-computer interaction based on smell
5.1 Development route of olfactory interaction function
5.2 Principle of intelligent fragrance system
5.3 Fragrance system technology
5.4 Application of olfactory interaction in car models
5.5 Overview of OEM fragrance system technology
5.6 Trends in olfactory interaction design
5.7 Overview of olfactory interaction suppliers
6 Human-computer interaction based on biometrics
6.1 Fingerprint recognition
6.2 Iris recognition
6.3 Myoelectric recognition
6.4 Vein recognition
6.5 Heart rate recognition
7 Multimodal Interaction Applications by OEMs
7.1 Emerging car manufacturers
7.1.1 Multimodal interactions in Xpeng G6
7.1.2 Multimodal interactions in Li L7
7.1.3 Multimodal Interaction with NIO EC7
7.1.4 Multimodal Interaction in Neta GT
7.1.5 Multimodal interaction in HiPhi Y
7.1.6 Multimodal interaction of Hycan A06
7.1.7 Multimodal Interaction in Hycan V09
7.1.8 New AITO M7 multimodal interaction
7.1.9 AITO M9 multimodal interaction
7.2 Traditional Chinese independent car manufacturers
7.2.1 Multimodal interaction in Chery Cowin Kunlun
7.2.2 Multimodal interaction in WEY Blue Mountain DHT PHEV
7.2.3 Multimodal Interaction in Hyper GT
7.2.4 Multimodal Interaction with Trumpchi E9
7.2.5 Multimodal Interaction in Voyah Passion
7.2.6 Multimodal Interaction with Denza N7
7.2.7 Multimodal Interaction in Frigate 07
7.2.8 Multimodal interaction in Changan Nevo A07
7.2.9 Multimodal Interaction in Jiyue 01
7.2.10 ARCFOX Kaola Multimodal Interaction
7.2.11 Multimodal interactions in Deepal S7
7.2.12 Multimodal interactions on Galaxy L6
7.2.13 Multimodal interactions in Lynk & Co 08
7.2.14 Multimodal interaction in LIVAN 7
7.2.15 Multimodal interaction with ZEEKR X
7.2.16 Multimodal interaction of ZEEKR 009
7.2.17 Multimodal Interaction in IM LS7
7.2.18 GEOME G6 multimodal interaction
7.3 Traditional joint venture automobile manufacturers
7.3.1 Multimodal interaction in Mercedes-Benz EQS AMG
7.3.2 Multimodal interaction with GAC Toyota bZ 4X
7.3.3 Multimodal interaction in FAW Toyota bZ 3
7.3.4 Buick Electra E5 Multimodal Interaction
7.3.5 11th Generation GAC Honda Accord Multimodal Interaction
7.3.6 Multimodal interaction of FAW Audi e-tron GT
7.3.7 Multimodal Interaction in BMW XM
7.4 Concept car
7.4.1 Multimodal interaction of Audi A6 Avant e-tron
7.4.2 Multimodal interaction in BMW i Vision Dee
7.4.3 Multimodal Interaction with RAM 1500 Revolution
7.4.4 Multimodal interaction in Peugeot Inception
7.4.5 Multimodal interaction with Yanfeng XiM23
8 Supplier Multimodal Interaction Solutions
8.1 Adaptability
8.2 Cipia Vision
8.3 Cerence
8.4 Continental
8.5 iFlytek
8.6 SenseTime
8.7 ADAYO
8.8 Desay SV
8.9 ArcSoft Technology
8.10 AISpeech
8.11 Horizon Robotics
8.12 Thundersoft
8.13 PATEO
8.14 Joyson Electronics
8.15 Huawei
8.16 Baidu
8.17 Tencent
8.18 Banma Network
8.19 Minieye
8.20 Hikvision
9 Overview and trends of multimodal interaction
9.1 Trends in multimodal interaction integration
9.2 Cockpit computing power required for multimodal interactions
9.3 Large-scale AI models needed for multimodal interactions
9.4 Multimodal Interaction and Cockpit Hardware Integration
9.5 Overview of multimodal interaction features of a typical car model
For more information about this report, please visit https://www.researchandmarkets.com/r/jva2r1.
About ResearchAndMarkets.com
ResearchAndMarkets.com is the world's leading source of international market research reports and market data. We provide the latest data on international and regional markets, key industries, top companies, new products and latest trends.
Media contact:
Research and Markets
Laura Wood, Senior Manager
[email protected]
For Eastern Standard Time office hours, please call +1-917-300-0470.
For USA/Canada Toll Free +1-800-526-8630
For GMT office hours, please call +353-1-416-8900.
US Fax: 646-607-1904
Fax (outside the US): +353-1-481-1716
Logo: https://mma.prnewswire.com/media/539438/Research_and_Markets_Logo.jpg
SOURCE Research and Markets