TomTom and Microsoft Unveil Their Conversational AI Voice Assistant at CES 2024
NEWS
In January 2024, TomTom and Microsoft showcased their joint venture in generative Artificial Intelligence (AI) voice assistants at CES. Building on Microsoft’s AI expertise, TomTom has created a conversational AI assistant for the car that will enable easier access to climate control, navigation options, and other vehicle functions. The venture is indicative of a transition toward more sophisticated voice controls in vehicles, and several other Original Equipment Manufacturers (OEMs) and stakeholders have explored Voice Assistants (VAs) that integrate AI. BMW’s i Vision Dee (Digital emotional experience) concept car features an intelligent digital companion that becomes excited when it sees the driver and expresses moods through screens on the front grille, while Chinese Electric Vehicle (EV) company NIO’s NOMI AI goes beyond typical infotainment functions, learning users’ preferences for daily routines, travel patterns, and vehicle functions over time and communicating through a face-like interface. Finally, Mapbox debuted MapGPT in October 2023, a conversational AI tool that allows OEMs to build their own VAs on top of Mapbox location services, with customizable voices, avatars, personalities, and wake words.
The Advantage of AI in Voice Assistants
IMPACT
The inherent need for drivers to stay focused on the road makes VAs ideally suited to the vehicle; while they struggled to generate revenue in the smart home model, this will not be the case in the in-vehicle ecosystem. By enabling easy completion of tasks that would otherwise require attention and interaction, VAs provide a material value-add to the driving experience, and AI is propelling VAs to better address the significant distraction vectors that drivers encounter when they try to complete a non-driving task.
AI and natural language training also allow a VA to draw on language models trained on massive amounts of data, alongside specialized datasets that reflect a range of languages, regional dialects, and accents, enabling mass-market adoption across more regions.
A significant proportion of drivers in the United States recognize the benefits of VAs: more than 60% of those who have used one state that the presence of a VA is a factor when buying a car, and 13% report that it is a significant consideration. Constantly improving Natural Language Processing (NLP) capabilities will only further fuel this growth in adoption and importance. Consumers already expect conversational VAs to permeate their vehicles across several functions, with 95% expecting to use them within 3 years. Of these, 74% would prefer using them to book a service appointment and 72% to order specific services like groceries.
From an AI Voice Assistant to an AI Virtual Co-Pilot
RECOMMENDATIONS
As adoption grows, the capabilities of VAs should grow in tandem, especially against the backdrop of the Software-Defined Vehicle (SDV) transition, where new features will be delivered entirely through software. To continue adding value to the driving experience, VAs will have to extend their applications from responsive actions to proactive suggestions. As the vehicle collects data on drivers and passengers to build a coherent profile of habits, likes, and dislikes, it will be able to recommend locations or in-car experiences before those needs are communicated to the vehicle. For example, as a driver reaches the third hour of a cross-country trek to visit their parents this holiday season, the car can use information about their typical rest habits on long journeys to judge when they should take a break, and recommend a nearby restaurant serving their favorite cuisine for lunch during the rest stop. In-vehicle sensors and cameras will make this functionality even more sophisticated, with recognition of driver traits such as gaze, posture, or alertness enabling more accurate evaluation of fatigue and better-timed recommendations for rest periods.
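As an illustration only, the sketch below shows one minimal way such proactive logic could be structured: a driver profile built from historical trips is combined with live signals (time since the last break, a fatigue estimate from cabin monitoring) to decide whether the assistant should volunteer a suggestion. The class names, fields, and thresholds are hypothetical and do not reflect any OEM or supplier implementation.

```python
from dataclasses import dataclass, field

@dataclass
class DriverProfile:
    # Hypothetical profile learned from the driver's past journeys
    preferred_cuisines: list[str] = field(default_factory=list)
    typical_break_interval_hours: float = 2.5  # assumed average time between rest stops

@dataclass
class TripState:
    hours_since_last_break: float
    fatigue_score: float  # 0.0 (alert) to 1.0 (drowsy), e.g., derived from gaze/posture monitoring

def suggest_break(profile: DriverProfile, trip: TripState) -> str | None:
    """Return a proactive suggestion, or None if no prompt is warranted.

    Purely illustrative rule-based logic; a production co-pilot would weigh
    many more signals (traffic, calendar, charge state, time of day) and
    likely use a learned model rather than fixed thresholds.
    """
    overdue = trip.hours_since_last_break >= profile.typical_break_interval_hours
    drowsy = trip.fatigue_score >= 0.7  # hypothetical fatigue threshold
    if not (overdue or drowsy):
        return None
    cuisine = profile.preferred_cuisines[0] if profile.preferred_cuisines else "any"
    return (f"You've been driving for {trip.hours_since_last_break:.1f} hours. "
            f"Shall I find a nearby {cuisine} restaurant for a lunch stop?")

# Example: a driver three hours into a long trip, showing mild fatigue
profile = DriverProfile(preferred_cuisines=["Italian"], typical_break_interval_hours=2.5)
print(suggest_break(profile, TripState(hours_since_last_break=3.0, fatigue_score=0.4)))
```

The point of the sketch is the design pattern, not the rules themselves: the assistant initiates the interaction only when profile and sensor data cross a relevance threshold, which is what separates a proactive co-pilot from a purely responsive voice interface.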
The transition toward proactive suggestions also presents opportunities for in-car advertising, which a recent ABI Insight, “What Must the Automotive Industry Do to Make Success of In-Car Advertising?,” has shown to be an increasingly feasible reality, given OEMs’ urgent search for new revenue streams and advertisers’ improving sophistication. Recommendations customized to the driver’s tastes enable the VA to advertise in more discreet and useful ways than traditional in-car advertisements, and can complement other discreet advertising solutions, such as 4screen’s point-of-interest pins.
As Level 3 (L3) Autonomous Vehicles (AVs) begin to enter the market, an AI virtual co-pilot becomes even more relevant to the in-vehicle experience, as it can take on the role of recommending infotainment content based on its knowledge of journey times and the driver’s likely hands-off time. To achieve this future, however, OEMs need to consider the best ways to place advanced VAs in their vehicles. In-house development seems infeasible for many OEMs, given the significant knowledge and investment required for next-generation VAs, so partnerships with specialized AI or VA companies may be the way forward. BMW Group, for example, took this approach for the Intelligent Personal Assistant in the BMW 7 Series and i7, partnering with Cerence for its AI and voice technology. Whichever approach is chosen, OEMs must perfect the development and customization of the VA for their brands if they are to fully utilize it as an avenue for revenue generation through advertising, commerce, or feature-on-demand options.