<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	xmlns:media="http://search.yahoo.com/mrss/"
	>

<channel>
	<title>AI &amp; Sensor Fusion News | ADAS &amp; Autonomous Vehicle International</title>
	<atom:link href="https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion/feed" rel="self" type="application/rss+xml" />
	<link>https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion</link>
	<description></description>
	<lastBuildDate>Wed, 29 Apr 2026 16:58:18 +0000</lastBuildDate>
	<language>en-GB</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
<image>
	<url>https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/01/avi-logo-black-square-2026-150x150.jpg</url>
	<title>AI &amp; Sensor Fusion News | ADAS &amp; Autonomous Vehicle International</title>
	<link>https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Neusoft and AISpeech partner on AI-powered in-vehicle mobility companion using LLM technology</title>
		<link>https://www.autonomousvehicleinternational.com/news/connectivity/neusoft-and-aispeech-partner-on-ai-powered-in-vehicle-mobility-companion-using-llm-technology.html</link>
		
		<dc:creator><![CDATA[Zahra Awan]]></dc:creator>
		<pubDate>Wed, 29 Apr 2026 16:58:18 +0000</pubDate>
				<category><![CDATA[AI & Sensor Fusion]]></category>
		<category><![CDATA[Connectivity]]></category>
		<guid isPermaLink="false">https://www.autonomousvehicleinternational.com/?p=24111</guid>

					<description><![CDATA[<a href="https://www.autonomousvehicleinternational.com/news/connectivity/neusoft-and-aispeech-partner-on-ai-powered-in-vehicle-mobility-companion-using-llm-technology.html"><img width="300" height="168" src="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/04/AdobeStock_1840776318-300x168.jpg" alt="Neusoft and AISpeech partner on AI-powered in-vehicle mobility companion using LLM technology" align="left" style="margin: 0 20px 20px 0;max-width:100%" /></a><p>In the AI-native era, large language models (LLMs) are reshaping in-vehicle interaction. Traditional single-command systems are giving way to more advanced approaches that enable contextual understanding, prediction and cross-scenario coordination. Natural voice interaction with continuous conversation flow and long-term memory is emerging as a key development to improve the in-car user experience.</p>
<p>Neusoft Corporation and AISpeech have signed a memorandum of cooperation (MoC) to jointly develop an AI-powered mobility companion featuring natural conversation and proactive intelligence for the global market, by integrating AISpeech&#8217;s cutting-edge AI voice interaction technologies into Neusoft&#8217;s recently launched OneCoreGo Global In-Vehicle Intelligent Mobility Solution 7.0.</p>
<p><a href="https://www.autonomousvehicleinternational.com/news/connectivity/neusoft-and-aispeech-partner-on-ai-powered-in-vehicle-mobility-companion-using-llm-technology.html" rel="nofollow">Continue reading Neusoft and AISpeech partner on AI-powered in-vehicle mobility companion using LLM technology at ADAS &amp; Autonomous Vehicle International.</a></p>
]]></description>
		
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">24111</post-id>
		<media:content url="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/04/AdobeStock_1840776318.jpg" medium="image" />
	</item>
		<item>
		<title>Pony.ai debuts new autonomous driving compute platform</title>
		<link>https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion/pony-ai-debuts-new-autonomous-driving-compute-platform.html</link>
		
		<dc:creator><![CDATA[Anthony James]]></dc:creator>
		<pubDate>Mon, 27 Apr 2026 08:38:59 +0000</pubDate>
				<category><![CDATA[AI & Sensor Fusion]]></category>
		<guid isPermaLink="false">https://www.autonomousvehicleinternational.com/?p=24046</guid>

					<description><![CDATA[<a href="https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion/pony-ai-debuts-new-autonomous-driving-compute-platform.html"><img width="300" height="168" src="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/04/final-executive-image-e1777283781737-300x168.jpg" alt="Pony.ai debuts new autonomous driving compute platform" align="left" style="margin: 0 20px 20px 0;max-width:100%" /></a><p>Pony.ai has begun using a new autonomous driving domain controller – a high-performance compute system designed for Pony.ai’s L4 autonomous driving platform and also a broader set of customer applications across autonomous mobility. Developed in collaboration with Nvidia, the new controller is built on the Nvidia Drive Hyperion platform and powered by Nvidia Drive AGX Thor with Nvidia NVLink, supporting Pony.ai’s next phase of commercialization in robotaxis and its growing domain controller business.</p>
<p>The new system is designed to deliver significant gains in AI computing performance, energy efficiency and support for the latest AI models, while meeting core L4 requirements such as multisensor fusion, full-scenario perception and high-complexity scenario understanding.</p>
<p><a href="https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion/pony-ai-debuts-new-autonomous-driving-compute-platform.html" rel="nofollow">Continue reading Pony.ai debuts new autonomous driving compute platform at ADAS &amp; Autonomous Vehicle International.</a></p>
]]></description>
		
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">24046</post-id>
		<media:content url="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/04/final-executive-image-e1777283781737.jpg" medium="image" />
	</item>
		<item>
		<title>INTERVIEW: Tiancheng Lou, founder and CTO, Pony.ai on PonyWorld 2.0</title>
		<link>https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion/interview-tiancheng-lou-founder-and-cto-pony-ai-on-ponyworld-2-0.html</link>
		
		<dc:creator><![CDATA[Anthony James]]></dc:creator>
		<pubDate>Wed, 22 Apr 2026 08:42:22 +0000</pubDate>
				<category><![CDATA[AI & Sensor Fusion]]></category>
		<category><![CDATA[Features]]></category>
		<guid isPermaLink="false">https://www.autonomousvehicleinternational.com/?p=23988</guid>

					<description><![CDATA[<a href="https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion/interview-tiancheng-lou-founder-and-cto-pony-ai-on-ponyworld-2-0.html"><img width="300" height="168" src="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/04/DSCF7490-拷贝-scaled-e1776849130170-300x168.jpg" alt="INTERVIEW: Tiancheng Lou, founder and CTO, Pony.ai on PonyWorld 2.0" align="left" style="margin: 0 20px 20px 0;max-width:100%" /></a><p class="p2"><strong><em>AAVI</em> catches up with Pony.ai&#8217;s founder and CTO, Tiancheng Lou, following the company&#8217;s recent launch of PonyWorld 2.0, the latest upgrade to its proprietary world model and a major advance in the core training system behind its autonomous driving stack.</strong> </p>
<p class="p2">After validating the unit economics of robotaxi operations in two major metropolitan markets in China with its seventh-generation robotaxi fleet, Pony.ai is keen to speed commercialization across China and beyond. It is targeting a fleet of more than 3,000 vehicles by the end of this year, with deployments spanning 20 cities globally. Nearly half of those cities will be in overseas markets.</p>
<p><a href="https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion/interview-tiancheng-lou-founder-and-cto-pony-ai-on-ponyworld-2-0.html" rel="nofollow">Continue reading INTERVIEW: Tiancheng Lou, founder and CTO, Pony.ai on PonyWorld 2.0 at ADAS &amp; Autonomous Vehicle International.</a></p>
]]></description>
		
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">23988</post-id>
		<media:content url="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/04/DSCF7490-拷贝-scaled-e1776849130170.jpg" medium="image" />
	</item>
		<item>
		<title>AI-defined vehicles and expanded electrification shape Nissan’s future mobility strategy</title>
		<link>https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion/ai-defined-vehicles-and-expanded-electrification-shape-nissans-future-mobility-strategy.html</link>
		
		<dc:creator><![CDATA[Zahra Awan]]></dc:creator>
		<pubDate>Mon, 20 Apr 2026 10:37:52 +0000</pubDate>
				<category><![CDATA[AI & Sensor Fusion]]></category>
		<category><![CDATA[Business]]></category>
		<guid isPermaLink="false">https://www.autonomousvehicleinternational.com/?p=23970</guid>

					<description><![CDATA[<a href="https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion/ai-defined-vehicles-and-expanded-electrification-shape-nissans-future-mobility-strategy.html"><img width="300" height="168" src="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/04/APTI-and-AAVI-Nissan-e1776681146387-300x168.jpg" alt="AI-defined vehicles and expanded electrification shape Nissan’s future mobility strategy" align="left" style="margin: 0 20px 20px 0;max-width:100%" /></a><p>Nissan Motor has announced its long‑term vision: Mobility Intelligence for Everyday Life, a plan to integrate mobility intelligence into everyday life through Nissan&#8217;s focus on AI-defined vehicles (AIDV), offering a choice of electrification technologies to meet diverse customer and market needs.</p>
<p>Ivan Espinosa, president and CEO, said, &#8220;This is the right moment to articulate Nissan&#8217;s long‑term vision as we move beyond the Re:Nissan recovery plan and set a clear path for the future. Our vision defines where Nissan is headed, with customer experience as our guiding priority. By advancing mobility intelligence, we will deliver intuitive, advanced and reliable products and technologies that offer outstanding value and enrich how mobility is experienced.&#8221;</p>
<p>At the core of the strategy is Nissan AIDV, combining Nissan AI Drive and Nissan AI Partner technologies.</p>
<p><a href="https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion/ai-defined-vehicles-and-expanded-electrification-shape-nissans-future-mobility-strategy.html" rel="nofollow">Continue reading AI-defined vehicles and expanded electrification shape Nissan’s future mobility strategy at ADAS &amp; Autonomous Vehicle International.</a></p>
]]></description>
		
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">23970</post-id>
		<media:content url="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/04/APTI-and-AAVI-Nissan-e1776681146387.jpg" medium="image" />
	</item>
		<item>
		<title>Pony.ai launches self-improving physical AI engine PonyWorld 2.0</title>
		<link>https://www.autonomousvehicleinternational.com/news/adas/pony-ai-launches-self-improving-physical-ai-engine-ponyworld-2-0.html</link>
		
		<dc:creator><![CDATA[Zahra Awan]]></dc:creator>
		<pubDate>Mon, 13 Apr 2026 13:47:04 +0000</pubDate>
				<category><![CDATA[ADAS]]></category>
		<category><![CDATA[AI & Sensor Fusion]]></category>
		<category><![CDATA[Testing]]></category>
		<guid isPermaLink="false">https://www.autonomousvehicleinternational.com/?p=23913</guid>

					<description><![CDATA[<a href="https://www.autonomousvehicleinternational.com/news/adas/pony-ai-launches-self-improving-physical-ai-engine-ponyworld-2-0.html"><img width="300" height="168" src="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/04/20260409-182610-300x168.jpg" alt="Pony.ai launches self-improving physical AI engine PonyWorld 2.0" align="left" style="margin: 0 20px 20px 0;max-width:100%" /></a><p>The autonomous driving industry is entering a new commercial phase. The challenge is no longer simply proving that driverless technology works, but improving performance quickly and consistently to enable broader deployment, stronger unit economics and sustained technical leadership.</p>
<p>To address this, Pony.ai has launched PonyWorld 2.0, the latest upgrade to its proprietary world model and a major advancement in the core training system behind its autonomous driving stack. The key advancement is its ability to identify its own weaknesses and guide targeted improvements. The upgrade introduces three core capabilities: self-diagnosis, targeted data collection in areas where performance is still limited and more efficient training focused on the most challenging cases.</p>
<p><a href="https://www.autonomousvehicleinternational.com/news/adas/pony-ai-launches-self-improving-physical-ai-engine-ponyworld-2-0.html" rel="nofollow">Continue reading Pony.ai launches self-improving physical AI engine PonyWorld 2.0 at ADAS &amp; Autonomous Vehicle International.</a></p>
]]></description>
		
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">23913</post-id>
		<media:content url="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/04/20260409-182610.jpg" medium="image" />
	</item>
		<item>
		<title>OPINION: How the software-defined vehicle is redefining development</title>
		<link>https://www.autonomousvehicleinternational.com/opinion/opinion-how-the-software-defined-vehicle-is-redefining-development.html</link>
		
		<dc:creator><![CDATA[Laura Kalka ]]></dc:creator>
		<pubDate>Wed, 18 Mar 2026 09:00:23 +0000</pubDate>
				<category><![CDATA[ADAS]]></category>
		<category><![CDATA[AI & Sensor Fusion]]></category>
		<category><![CDATA[Opinion]]></category>
		<category><![CDATA[Software]]></category>
		<guid isPermaLink="false">https://www.autonomousvehicleinternational.com/?p=23362</guid>

					<description><![CDATA[<a href="https://www.autonomousvehicleinternational.com/opinion/opinion-how-the-software-defined-vehicle-is-redefining-development.html"><img width="300" height="200" src="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/03/IMG_5638-300x200.jpg" alt="OPINION: How the software-defined vehicle is redefining development" align="left" style="margin: 0 20px 20px 0;max-width:100%" /></a><p>The automobile is undergoing one of the most profound transformations in its history. What was once a largely mechanical product is evolving into a software-centric system: the software-defined vehicle (SDV). In this new paradigm, it is no longer hardware alone that defines a vehicle’s capabilities, but software, says <strong>Laura Kalka, team lead marketing at b-plus</strong>, with functions, performance characteristics, and even user experience increasingly shaped by code.</p>
<p>This shift is more than a technological upgrade. It represents a structural realignment of the entire industry. Vehicles are turning from static, finished products into dynamic platforms that continue to evolve long after leaving the production line.</p>
<p><a href="https://www.autonomousvehicleinternational.com/opinion/opinion-how-the-software-defined-vehicle-is-redefining-development.html" rel="nofollow">Continue reading OPINION: How the software-defined vehicle is redefining development at ADAS &amp; Autonomous Vehicle International.</a></p>
]]></description>
		
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">23362</post-id>
		<media:content url="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/03/IMG_5638-scaled-e1771325043912.jpg" medium="image" />
	</item>
		<item>
		<title>RoboSense lidars selected for WeRide and Geely’s Robotaxi GXR</title>
		<link>https://www.autonomousvehicleinternational.com/news/sensors/robosense-lidars-selected-for-weride-and-geelys-robotaxi-gxr.html</link>
		
		<dc:creator><![CDATA[Zahra Awan]]></dc:creator>
		<pubDate>Fri, 13 Mar 2026 16:24:47 +0000</pubDate>
				<category><![CDATA[AI & Sensor Fusion]]></category>
		<category><![CDATA[Robo-Taxis]]></category>
		<category><![CDATA[Sensors]]></category>
		<guid isPermaLink="false">https://www.autonomousvehicleinternational.com/?p=23607</guid>

					<description><![CDATA[<a href="https://www.autonomousvehicleinternational.com/news/sensors/robosense-lidars-selected-for-weride-and-geelys-robotaxi-gxr.html"><img width="300" height="168" src="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/03/RoboSense_LiDAR__EM4_E1-e1773419043789-300x168.jpg" alt="RoboSense lidars selected for WeRide and Geely’s Robotaxi GXR" align="left" style="margin: 0 20px 20px 0;max-width:100%" /></a><p>RoboSense’s digital lidar sensors, the EM4 and fully solid-state E1, will be integrated into the Robotaxi GXR, jointly developed by WeRide and Zhejiang Farizon New Energy Commercial Vehicle Group, a subsidiary of Geely. Production of the vehicle is scheduled to begin in Q3 2026, with 2,000 units planned for domestic and international markets.</p>
<p>The GXR is equipped with WeRide’s latest Gen8 autonomous driving suite, built on the company’s Sensor Suite 8.0 (SS8.0). This includes RoboSense’s thousand-beam-level digital lidar EM4 as the primary sensor, and the E1 fully solid-state digital lidar for blind-spot detection, providing redundancy and enhanced reliability in operation.</p>
<p><a href="https://www.autonomousvehicleinternational.com/news/sensors/robosense-lidars-selected-for-weride-and-geelys-robotaxi-gxr.html" rel="nofollow">Continue reading RoboSense lidars selected for WeRide and Geely’s Robotaxi GXR at ADAS &amp; Autonomous Vehicle International.</a></p>
]]></description>
		
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">23607</post-id>
		<media:content url="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/03/RoboSense_LiDAR__EM4_E1-e1773419043789.jpg" medium="image" />
	</item>
		<item>
		<title>Qualcomm and Wayve collaborate on pre-integrated ADAS/AD system for auto makers</title>
		<link>https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion/qualcomm-and-wayve-collaborate-on-pre-integrated-adas-ad-system-for-auto-makers.html</link>
		
		<dc:creator><![CDATA[Anthony James]]></dc:creator>
		<pubDate>Tue, 10 Mar 2026 13:12:00 +0000</pubDate>
				<category><![CDATA[AI & Sensor Fusion]]></category>
		<category><![CDATA[Business]]></category>
		<guid isPermaLink="false">https://www.autonomousvehicleinternational.com/?p=23571</guid>

					<description><![CDATA[<a href="https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion/qualcomm-and-wayve-collaborate-on-pre-integrated-adas-ad-system-for-auto-makers.html"><img width="300" height="168" src="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/03/Wayve_Qualcomm_Logos-e1773160209850-300x168.png" alt="Qualcomm and Wayve collaborate on pre-integrated ADAS/AD system for auto makers" align="left" style="margin: 0 20px 20px 0;max-width:100%" /></a><p class="x_MsoNormal">Qualcomm Technologies, Inc and Wayve have confirmed they are working together on an advanced, production-ready ADAS and AD system for auto makers worldwide. The collaboration adds the Wayve AI Driver as an end-to-end AI driving intelligence layer to Qualcomm Technologies’ high-performance, field-proven Snapdragon Ride platform, which consists of system-on-chips and tightly integrated active safety software. The result is a pre-integrated system that supports hands-off ADAS deployment within regulatory requirements, expanding over time to broader driving environments and eyes-off capabilities.</p>
<p class="x_MsoNormal">Designed to serve as an advanced ADAS/AD foundation, the pre-integrated platform enables auto makers to deploy highly capable, advanced features quickly, while also enabling customization, future scaling and upgrading.</p>
<p><a href="https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion/qualcomm-and-wayve-collaborate-on-pre-integrated-adas-ad-system-for-auto-makers.html" rel="nofollow">Continue reading Qualcomm and Wayve collaborate on pre-integrated ADAS/AD system for auto makers at ADAS &amp; Autonomous Vehicle International.</a></p>
]]></description>
		
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">23571</post-id>
		<media:content url="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/03/Wayve_Qualcomm_Logos-e1773160209850.png" medium="image" />
	</item>
		<item>
		<title>XPeng to deploy second-generation AI-driven autonomous driving system by 2027</title>
		<link>https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion/xpeng-to-deploy-second-generation-ai-driven-autonomous-driving-system-by-2027.html</link>
		
		<dc:creator><![CDATA[Matt Ross]]></dc:creator>
		<pubDate>Tue, 03 Mar 2026 17:26:33 +0000</pubDate>
				<category><![CDATA[AI & Sensor Fusion]]></category>
		<guid isPermaLink="false">https://www.autonomousvehicleinternational.com/?p=23521</guid>

					<description><![CDATA[<a href="https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion/xpeng-to-deploy-second-generation-ai-driven-autonomous-driving-system-by-2027.html"><img width="300" height="168" src="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/03/image_72584830041772462345157-e1772558681838-300x168.png" alt="XPeng to deploy second-generation AI-driven autonomous driving system by 2027" align="left" style="margin: 0 20px 20px 0;max-width:100%" /></a><p>Chinese auto maker XPeng has announced that its second-generation Vision-Language-Action autonomous driving system (VLA 2.0) will begin global delivery in 2027, with Volkswagen designated as the inaugural launch partner in the Chinese market.</p>
<p>In addition, this month will see the company begin rolling out its XOS 5.8.7 over-the-air update to existing vehicle owners in Europe.</p>
<p>At XPeng’s launch event in Amsterdam, Netherlands, on March 2, the company showcased VLA 2.0&#8217;s architecture and outlined the global rollout strategy.</p>
<p>The evolution to VLA 2.0 marks a significant advance in AI-controlled driving. Unlike traditional VLA models that rely on a sequential vision‑language‑action pipeline, VLA 2.0 uses an end-to-end vision-to-action architecture that removes intermediate language-based translation layers and directly converts perception into driving decisions.</p>
<p><a href="https://www.autonomousvehicleinternational.com/news/ai-sensor-fusion/xpeng-to-deploy-second-generation-ai-driven-autonomous-driving-system-by-2027.html" rel="nofollow">Continue reading XPeng to deploy second-generation AI-driven autonomous driving system by 2027 at ADAS &amp; Autonomous Vehicle International.</a></p>
]]></description>
		
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">23521</post-id>
		<media:content url="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/03/image_72584830041772462345157-e1772558681838.png" medium="image" />
	</item>
		<item>
		<title>Tensor and Arm collaborate on agentic robocar</title>
		<link>https://www.autonomousvehicleinternational.com/news/adas/tensor-and-arm-collaborate-on-agentic-robocar.html</link>
		
		<dc:creator><![CDATA[Zahra Awan]]></dc:creator>
		<pubDate>Fri, 27 Feb 2026 14:44:49 +0000</pubDate>
				<category><![CDATA[ADAS]]></category>
		<category><![CDATA[AI & Sensor Fusion]]></category>
		<category><![CDATA[Sensors]]></category>
		<guid isPermaLink="false">https://www.autonomousvehicleinternational.com/?p=23492</guid>

					<description><![CDATA[<a href="https://www.autonomousvehicleinternational.com/news/adas/tensor-and-arm-collaborate-on-agentic-robocar.html"><img width="300" height="168" src="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/02/DSC00169-dark-1400x933-1-300x168.jpg" alt="Tensor and Arm collaborate on agentic robocar" align="left" style="margin: 0 20px 20px 0;max-width:100%" /></a><p>To support global commercialization of Level 4 autonomous capabilities, Tensor and Arm have announced a multiyear strategic collaboration to provide the compute architecture for Tensor&#8217;s agentic AI personal robocar. Using the Arm platform, which integrates hardware, software and ecosystem support, Tensor has deployed over 400 Arm-based cores per vehicle.</p>
<p>Tensor is designing its vehicles around built-in intelligence, rather than retrofitting autonomy onto existing or legacy platforms. The robocar is powered by a vertically integrated Level 4 autonomy stack, and a comprehensive sensor suite of 37 cameras, five lidars, 11 radars, 22 microphones, 10 ultrasonic sensors, three IMUs, GNSS, 16 collision detectors, eight water-level detectors, four tire-pressure sensors, a smoke detector and triple-channel 5G connectivity.</p>
<p><a href="https://www.autonomousvehicleinternational.com/news/adas/tensor-and-arm-collaborate-on-agentic-robocar.html" rel="nofollow">Continue reading Tensor and Arm collaborate on agentic robocar at ADAS &amp; Autonomous Vehicle International.</a></p>
]]></description>
		
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">23492</post-id>
		<media:content url="https://www.autonomousvehicleinternational.com/wp-content/uploads/2026/02/DSC00169-dark-1400x933-1.jpg" medium="image" />
	</item>
	</channel>
</rss>
