In the dynamic world of automotive technology, China is taking the lead in developing innovative multimodal interactions. The recently released “China Automotive Multimodal Interaction Development Research Report, 2023” reveals a paradigm shift in cockpit interaction modes and their applications.
Anthropomorphic natural interaction: the new normal
This report highlights a significant trend toward active, anthropomorphic, and natural interactions, marking a shift from traditional single-modal interactions to more sophisticated and intuitive interfaces. The expansion of control modes beyond touch and voice is evidence of this change.
Newer modes such as fingerprint recognition and electromyography (EMG) sensing are gaining traction, reflecting the industry's commitment to improving the user experience. Voice control in particular is becoming increasingly common across car models.
Evolution of AI models: from single-modal to multimodal
Large-scale AI models are evolving from single-modal to multimodal, multitask systems, significantly expanding their capabilities. This evolution is evident in the development of in-vehicle voiceprint recognition technology, which identifies who is speaking rather than merely what is said.
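To make the idea behind voiceprint recognition concrete, here is a minimal, hypothetical sketch in Python. The report does not describe any vendor's implementation; extract_voiceprint below is a stand-in for a real speaker-embedding model, and the threshold is an illustrative assumption. Only the comparison step (cosine similarity against an enrolled embedding) reflects how such systems are commonly structured.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def extract_voiceprint(audio: np.ndarray, dim: int = 192) -> np.ndarray:
    """Hypothetical stand-in for a real speaker-embedding network
    (e.g. an x-vector model); faked with a seeded RNG so the sketch runs."""
    rng = np.random.default_rng(int(audio.sum()) % 2**32)
    return rng.standard_normal(dim)

def is_same_speaker(enrolled: np.ndarray, probe: np.ndarray,
                    threshold: float = 0.75) -> bool:
    """Accept the probe utterance if its embedding is close enough to the
    enrolled driver's voiceprint. The threshold here is illustrative; real
    systems tune it on held-out data."""
    return cosine_similarity(enrolled, probe) >= threshold

enrolled = extract_voiceprint(np.ones(16000))  # driver's enrollment sample
probe = extract_voiceprint(np.ones(16000))     # a new utterance in the cabin
print("same speaker:", is_same_speaker(enrolled, probe))
```

Once a speaker is verified this way, the cockpit can personalize responses, for example loading the recognized driver's seat, climate, and media preferences.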
For example, iFlytek's Spark Cockpit OS and Spark Car Assistant support multiple interaction modes, demonstrating the potential of multimodal interfaces. Similarly, the AITO M9's HarmonyOS 4 IVI system connects its intelligent assistant, Xiaoyi, to Huawei's Pangu large model, demonstrating the power of multitask fusion.
Supplier cockpit interaction solutions: A glimpse of the future
The report also takes a closer look at the cockpit interaction solutions offered by suppliers, providing insight into emerging trends. As multimodal interaction fusion gains momentum, we can expect more advanced and intuitive interfaces in the near future; a simple fusion sketch follows below.
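As a concrete, hypothetical illustration of what multimodal interaction fusion can mean in practice, the late-fusion sketch below combines a voice intent and a gesture intent by weighting each modality's confidence and keeping the stronger signal. The weights, intent names, and ModalityReading type are assumptions for illustration, not details drawn from the report.

```python
from dataclasses import dataclass

@dataclass
class ModalityReading:
    intent: str        # e.g. "open_window"
    confidence: float  # in the range 0.0 to 1.0

def fuse_intents(voice: ModalityReading, gesture: ModalityReading,
                 w_voice: float = 0.6, w_gesture: float = 0.4) -> ModalityReading:
    """Late fusion: weight each modality's confidence and keep the intent
    with the higher weighted score. The weights are illustrative; a real
    system would learn or calibrate them per context (noise level, etc.)."""
    scored = [
        ModalityReading(voice.intent, voice.confidence * w_voice),
        ModalityReading(gesture.intent, gesture.confidence * w_gesture),
    ]
    return max(scored, key=lambda r: r.confidence)

# The driver says "open the window" with high confidence, and a pointing
# gesture agrees, so the fused command is unambiguous.
fused = fuse_intents(ModalityReading("open_window", 0.9),
                     ModalityReading("open_window", 0.7))
print(fused)
```

The appeal of this kind of fusion is robustness: when one modality is degraded (voice in a noisy cabin, gestures in low light), the other can carry the interaction.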
In conclusion, the “China Automotive Multimodal Interaction Development Research Report, 2023” paints an exciting picture of the future of automotive technology. With a focus on active, anthropomorphic, and natural interactions, the industry is moving toward more intuitive and user-friendly interfaces. The evolution of large-scale AI models from single-modal to multimodal, multitask fusion is a major step forward that promises enhanced functionality and an improved user experience.
As of February 13, 2024, the impact of these developments is far-reaching and signals a transformation of the automotive industry where interactions become seamless, intuitive, and inherently human.