In an exclusive feature first published in the January 2024 issue of ADAS & Autonomous Vehicle International, Anthony James investigates how a UK project seeks to boost autonomous driving on rural and urban residential roads, partly by training AVs to act more assertively
Nissan is playing a major role in a government-funded project to bolster the UK’s burgeoning autonomous driving sector. Launched at the end of September, evolvAD will be key to pushing AD technologies at the OEM as part of its long-term vision, Nissan Ambition 2030. The project will test how well fully electric Nissan Leafs equipped with cutting-edge AD technology deal with tricky residential areas and rural roads. It will also examine the role of V2I technologies in supporting the deployment of autonomous vehicles.
“In our previous HumanDrive and ServCity research projects, our AD team and partners have tackled highways and complex city environments,” explains Robert Bateman, evolvAD project manager and manager of the research and advanced engineering team at Nissan Technical Centre Europe (NTCE).
“Nissan will now enhance its technology further by testing and trialling it in other driving environments, specifically urban residential and complex rural roads. The project will also explore what transport opportunities autonomous mobility can provide to A roads and minor roads that are mostly found within rural and intercity communities.”
These types of driving environments present their own unique set of challenges for AD technology. For example, drivers in residential areas often face narrow roads, single lanes with parked vehicles on either side and slow driving speeds. Rural roads can include similar conditions but with higher driving speeds, winding profiles, blind corners, blind gradients and few to no road markings.
Delivered by a consortium of five industry partners including Nissan as technical lead, evolvAD is jointly funded by government and the consortium partners, with some of the money coming from the government’s £100m (US$124m) Intelligent Mobility Fund, administered by the Centre for Connected and Autonomous Vehicles (CCAV) and delivered by the UK’s innovation agency, Innovate UK.
The research project will run for 21 months, coming to an end in March 2025, and will see six members of the NTCE team working with around 20 experts from Connected Places Catapult (CPC), Humanising Autonomy, SBD Automotive and TRL.
“TRL, which manages the Smart Mobility Living Lab (SMLL), is looking at the development of infrastructure,” notes Bateman. “It’s also looking at supply chain readiness regarding the test specifications and design specifications that UK suppliers will need to deliver an autonomous vehicle to an OEM. CPC is working on a UK-developed, high-definition map, and Humanising Autonomy is focused on trying to better predict what pedestrians are going to do.”
The latter’s website says the company’s “ethical computer-vision software analyzes videos to quickly classify, interpret and predict human behavior so that it can better inform automated decision-making engines”. Bateman says improving how an AV can predict the often-surprising behavior of humans forms a key part of the project.
“When we speak to our Japanese colleagues, they often remark how, until they came to London, they didn’t realize pedestrians won’t always wait for the green man before crossing,” he explains. “Better prediction algorithms can help the car to understand where and when a pedestrian is going to cross. It’s also about ensuring that when you are in one of our vehicles, it provides a comfortable ride rather than slamming on the brakes every time it thinks a pedestrian might cross the road. We’re trying to make our AD as human-like as possible in how it responds.”
Urban residential road testing will be done in partnership with TRL, which will use SMLL’s real-world testbed, spread across London roads in Greenwich and the Queen Elizabeth Olympic Park.
The testbed features 24km of instrumented public urban roads, both single- and multi-lane, including traffic circles. All testing will take place during daylight hours, mainly between 9:00am and 3:00pm: “This is to avoid the school run – but there is an actual school on one of the main roads we are using for the test, so we will evaluate at school pick-up as well,” says Bateman.
There are also speed bumps, traffic signals, one-way systems, pedestrian crossings, bridges, underpasses and overpasses. SMLL’s extensive test zone has been chosen to enable the project’s partners to rigorously test and observe the performance of their technologies and vehicles from every angle, as they take on the everyday challenges of driving down a busy urban road.
“Take speed bumps,” continues Bateman. “There are lots of different types, including those that place three ridges in a line across the width of the road – but which should you go over? Do you go over the middle or both?” Furthermore, “Very few human drivers actually drive fully around a mini roundabout [traffic circle], with some going straight over.”
Electric scooters will also be in the mix: “In the last three or four years we’ve had e-scooters appear alongside cyclists.”
“We’ll be testing even when it is raining or when there is light fog,” notes Nirav Shah, an NTCE research engineer working on the project, when asked for further examples of how evolvAD will differ from HumanDrive and ServCity. “We will also be testing on roads where there is no division between oncoming traffic and the test vehicle’s direction of travel, whereas in previous projects there has been a central reservation between lanes.”
The project will also explore and trial vehicle-to-infrastructure (V2I) technology: TRL will connect the test vehicles to roadside infrastructure and feed new sources of data to them, improving their situational awareness, path planning and overall performance.
“There are 270 cameras in total across the SMLL testbed,” explains Shah. “We will use the data from these cameras to better understand whether objects are moving or stationary.”
Burrage Road, a busy, single-lane route at the heart of the testbed – with lots of parked cars down either side – is of particular interest. “The information from the roadside cameras will help the test subject to understand if it needs to move toward the middle of the road to avoid a vehicle that has begun to pull out, for example,” says Shah. “It will need to move further out to go around, but in so doing its path may conflict with an oncoming vehicle – how will the stack deal with that, in comparison to driving on a dual carriageway?”
Bateman shares his colleague’s enthusiasm for the challenge ahead. “As part of our 2030 vision, we want to make this technology available to everybody,” he says. “To do that, it’s got to work in rural areas and busy residential roads, whether we’re using it for delivering goods or to help people visit friends or family – you’ve got to be able to get down those streets.”
However, he says previous research has revealed that this is far from an easy task for a computer: “Currently, if a parked car started to pull out, the autonomous vehicle would give way, only proceeding if there was no other car coming toward it for quite some distance. AVs will need to be more assertive, going forward. In the previous ServCity trial, the car just wasn’t ready for it.”
For rural environments, where vehicle speeds mean the stakes are even higher, testing will initially be conducted inside proving grounds within the UK, namely UTAC Millbrook and Mira. This testing will include the development and validation of enhanced autonomous vehicle motion control in highly dynamic use cases, and will provide lots of useful data to inform further simulation modeling.
Use cases such as blind corners, road gradient ascents and descents, and low-quality lane markings will be used to optimize vehicle trajectory, speed and motion planning at speeds of up to 96km/h, with vehicle acceleration limits increased to 0.5g. Testing will only move to public roads after an intense period of simulation to fine-tune performance. “Rural testing has begun at UTAC Proving Ground in Millbrook utilising its outer handling and alpine routes,” notes Shah.
Three cars will be used for testing, with some additional cars set up for data collection. Two of the test cars will be used for urban testing while the third will cover rural trials.
The vehicles are equipped with cameras, lidar, GPS and radar, and the computers required to process the incoming information and steer the vehicle. Localization and path planning are based on several sensor inputs, as well as a digital map stored internally in the vehicle, so that the car is not dependent on any single sensor for its safe operation.
During the trials, the automated vehicles will be occupied by a trained test driver and operator responsible for overseeing safe vehicle operation. All test cars will drive within the regular speed limits of the various roads that they encounter.
“Before we even put the software into the vehicle, we do simulations using data previously collected from the testbed area and some of the area beyond, so that we get an idea of any unique features that will be required to then update the software,” explains Shah.
“We then do more simulation and test the software for any unexpected behavior, and then we take it onto a public road with a safety driver, where we run the software offline to compare it with what the safety driver is doing,” he continues.
All of evolvAD’s safety drivers are fully qualified and have been trained to react in the event of a system failure. “If the control system of the vehicle fails, then the safety driver follows the training they have undergone previously at the proving ground, which includes acceleration or steering override,” says Shah. “There are a lot of safety checks, protocols and procedures undertaken before the vehicle can begin testing on the open road.”
The team are reluctant to reveal any safety driver intervention data at this stage: “Everything depends on what testing we are doing and the maturity of that software and hardware at the time,” explains Bateman. “However, the plan is that when we get to the end of the project there will be no intervention from the safety driver.”
Five companies make up the evolvAD project:
Nissan – lead partner, leading the development of the connected and autonomous vehicles (CAVs) that will be trialled during the project
Connected Places Catapult – applying advanced machine learning techniques to generate high-definition maps from aerial imagery
Humanising Autonomy – UK supplier with advanced vulnerable road user (pedestrians, cyclists and motorcyclists) perception and behavior estimation capability
SBD Automotive – onboard cybersecurity and advanced safety case
TRL – developing vehicle system validation processes utilizing infrastructure on the Smart Mobility Living Lab (SMLL) testbed
Shah notes that the safety drivers are integral to the project, providing valuable input that shapes the AD software. “All the safety drivers are very well trained not just in test driving but in every aspect of driving on a public road,” he says. “They all drive like an expert chauffeur and they help us understand how the vehicle should behave from their perspective. There’s a lot of discussion and collaboration with our safety drivers as this ensures the vehicle doesn’t behave like someone driving on the road for the first time.”
As to how best to solve the conundrum presented by a parked car wanting to pull out, Bateman says it’s all a matter of tuning: “If there is enough space and we want to be assertive, then potentially we could go more toward the middle of the lane, keeping enough space on the right-hand side for the oncoming vehicle to pass. In software terms, it’s not only longitudinal movement but also lateral movement, and deciding when to move laterally is very interesting in those scenarios. A lot of it is down to fine-tuning, which takes up a lot of the software engineers’ time.”
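Bateman’s description of that tuning trade-off can be sketched as a toy decision rule. The function below is purely illustrative – the names, thresholds and logic are this article’s assumptions, not Nissan’s actual planner – but it captures the idea of only committing to a lateral move when there is both physical room and a sufficient time gap to oncoming traffic:

```python
def lateral_offset(lane_width_m: float,
                   ego_width_m: float,
                   oncoming_gap_s: float,
                   min_clearance_m: float = 0.5,
                   min_gap_s: float = 4.0) -> float:
    """Hypothetical sketch of an 'assertiveness' rule: return a lateral
    offset toward the middle of the road (in metres) if there is enough
    spare width and a large enough time gap to the oncoming vehicle;
    otherwise return 0.0, i.e. hold position and give way longitudinally."""
    # Width left over after the ego vehicle plus clearance on both sides
    spare = lane_width_m - ego_width_m - 2 * min_clearance_m
    if spare <= 0 or oncoming_gap_s < min_gap_s:
        return 0.0  # not enough room or time: yield rather than move out
    # Take at most half the spare width, leaving space for oncoming traffic
    return spare / 2
```

In this framing, the “fine-tuning” Bateman mentions amounts to adjusting parameters such as `min_clearance_m` and `min_gap_s` until the vehicle’s behavior feels natural rather than timid or aggressive.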
Bateman notes that Nissan learned a similar lesson from the previous ServCity project: “When we first started, as the car approached a roundabout, it would slow right down to give way, even when there was no oncoming traffic, but that isn’t how humans drive – we tend to maintain the same speed if the path is clear. However, the software that first came across from Japan slowed the car down to give way before then pulling off. If you were the car behind, and you weren’t paying attention, you’d end up going straight into the back of it.”
He continues, “Nirav and his team immediately began tuning the software to make it more assertive, so now it says, ‘I’m at the roundabout. There isn’t a car coming. I don’t need to slow down, I can go’. We then began to tune it to take the same ‘lane’ around the roundabout as a human would take. We used to joke that one of the roundabouts was more of a ‘square-about’ – you had to straighten up a bit, then go around a little bit and then straighten up, etc. In the end, we tuned it so rather than the steering wheel being jerky, it moved far more naturally.”
While all evolvAD vehicles will be equipped with 100% autonomous drive capability, Nissan is keen to stress that the project does not signal any intention to launch a fully autonomous vehicle in the UK and Europe in the near future.
Instead, evolvAD fits into a wider autonomous drive research and development program that is taking place across Nissan’s R&D facilities worldwide. As such, the project’s findings will help inform future Nissan AD systems for passenger vehicles, with a focus on how the OEM can ensure its systems integrate into urban environments.
The company already offers ProPILOT 2.0, a hands-off driver assistance system for use on highways under approved conditions, in certain countries. Research projects such as evolvAD will be vital in taking Nissan’s future technology to the next level.
This feature was first published in the January 2024 issue of ADAS & Autonomous Vehicle International magazine.