Patti Robb, chief strategist and head of innovation for automated driving at Intel, discusses the various ways in which natural language processing enhances AV development
Interactive and immersive experiences are two features that enable richer user experiences and device differentiation. Touchscreens are an early instantiation of interactive experience: they empowered smartphones and tablets, and were later ported to PCs and 2-in-1s. Speech recognition and natural language processing (NLP) are another form of interactive experience, first used in text editors to remove the hassle of typing, and later widely adopted in clinics and hospitals to help clinical staff deliver medical reports efficiently. More recently, NLP applications have evolved on users’ devices for personal assistants (e.g. Alexa, Cortana, Google Assistant and Siri) and home assistant/entertainment devices (e.g. Amazon Echo, Apple HomePod, Google Home).
The autonomous vehicle industry is leveraging AI and advanced compute capabilities, with connectivity to the network and to the cloud, to focus on the automated driving experience. There is an opportunity to enhance the interactive experience for passengers, who will be the only occupants of Level 5, fully autonomous ‘no driver’ vehicles. The passenger experience can also benefit from streaming HD video content and bringing VR to the car. NLP has big potential in these areas and many more. For this to happen, the ecosystem needs NLP actors to boldly join AV actors and extend their platform APIs, bringing NLP to every car.
AVs, ranging from consumer vehicles to robo-taxis, are big incubators for compute capabilities and AI, starting with the vehicle and spanning to the cloud and the network edge as 5G deployments evolve. This opens big opportunities to leverage compute and connectivity capabilities to enable interactive experiences for passengers, and between passengers and the vehicle. A recent study we conducted showed that NLP applications enable not only a richer experience for passengers, but also a trust relationship between the passenger and the autonomous vehicle.
As with any new technology, end-consumers’ trust is a key pillar for the wide-scale adoption of AVs across geographies. The best way to think about ‘trust’ is to draw inspiration from the human world, where trust between people is built through dialog and conversation. Similarly, end-users (who will be the passengers in AVs) need to build a trust relationship with the AV to ensure not only belief in and adoption of the technology, but also comfort during autonomous rides. NLP is a game-changer for building this trust. Within Intel, we conducted several studies and test drives in which passengers used NLP, and user groups of different demographics consistently agreed that their experience with NLP helped them gain trust in AVs.
NLP enhances trust in AVs
The following are examples of new and exciting services enabled by interactive experiences in autonomous vehicles:
· Natural language dialog with the vehicle: Passengers talk to the vehicle, for example asking for a ride to a destination, changing the route, stopping at a restaurant, or stopping at a shopping mall. This provides a very rich experience and is especially useful for elderly passengers, who may not be keen on interacting with app-based automated systems. The in-vehicle AI algorithms for NLP would include a reasoning engine to make sure that kids in the car could not take advantage of such a system for play. We would not want the car to head off to the North Pole to visit Santa without parental approval!
· Natural language to call a virtual valet driver: Passengers away from their vehicle can call to request that it come and pick them up (a phone call with AI interaction, or a text message with the time and location of the pickup). This provides a very rich experience and an alternative to installing apps on phones to request the vehicle. The AI algorithms for NLP in the vehicle need to process voice and text messages, interpret the pickup location and time, and confirm authorization. The vehicle can then monitor local traffic conditions so that it arrives on time.
· Rich and interactive passenger experience: Interactive experience through NLP allows passengers to request the type of entertainment they want (e.g. ‘show me a movie with actor A’, ‘show the local weather’, or ‘display the news’). The AI may run in each vehicle, or it can be distributed between the vehicle and the cloud.
· Interactive tourism, enabling new services for passengers and cities: Passengers in tourist buses can receive video streams of the surrounding environment, displayed on side windows and enabling interactive services (e.g. checking museums’ opening hours and purchasing tickets, thus saving time and eliminating long queues). The vehicle can add interactive layers to the content streamed from the cloud.
· Online gaming: Passenger entertainment through online gaming during travel is an expected service leveraging the compute, connectivity and multiple screens of autonomous vehicles. With 5G, the players in an online game can be passengers in different vehicles on the road, constituting a local cloud for gaming.
· Emergency handling: Interactive experience also extends to emergency handling, where a third party can interact with the vehicle in an emergency to steer it to a safe stop (e.g. remote control and remote driving of a vehicle in the event of an accident). With 5G, emergency handling and interaction with the vehicle can happen in real time.
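As a rough illustration of the ‘reasoning engine’ mentioned for in-car dialog, the sketch below gates route-changing voice commands on the speaker’s authority. All names here (`Intent`, `authorize`, the restricted-action set) are hypothetical assumptions for illustration, not a real API; a production system would combine speaker identification with a much richer policy model.

```python
from dataclasses import dataclass

@dataclass
class Intent:
    # Hypothetical parsed output of an NLP front end (illustrative, not a real API)
    action: str            # e.g. "set_destination", "add_stop"
    argument: str          # e.g. "shopping mall", "North Pole"
    speaker_is_adult: bool # assumed to come from speaker identification

# Assumed policy: actions that change the route need an adult speaker
RESTRICTED_ACTIONS = {"set_destination", "change_route"}

def authorize(intent: Intent) -> bool:
    """Return True if the command may be executed without escalation."""
    if intent.action in RESTRICTED_ACTIONS and not intent.speaker_is_adult:
        # Escalate to a parent/guardian instead of executing the command
        return False
    return True

# A child asking to drive to the North Pole is blocked; an adult's stop request passes
santa_trip = Intent("set_destination", "North Pole", speaker_is_adult=False)
lunch_stop = Intent("add_stop", "restaurant", speaker_is_adult=True)
```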
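The virtual valet scenario hinges on extracting a pickup time and location from a passenger’s message. A minimal sketch, assuming a fixed ‘pick me up at <time> at <location>’ message pattern (a real system would use a trained language model for open-ended phrasing, plus sender authentication for the authorization step):

```python
import re

def parse_pickup_request(message: str):
    """Extract pickup time and location from a simple valet text message.

    The pattern below is an illustrative assumption about message shape,
    not a description of any shipping system.
    """
    match = re.search(r"pick me up at (\d{1,2}:\d{2}) at (.+)", message, re.IGNORECASE)
    if not match:
        return None  # fall back to a clarifying dialog with the passenger
    time, location = match.groups()
    return {"time": time, "location": location.strip().rstrip(".")}

request = parse_pickup_request("Please pick me up at 17:30 at the main mall entrance.")
```

Returning `None` on an unrecognized message, rather than guessing, lets the vehicle ask a follow-up question, which is itself part of the trust-building dialog described above.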