► We trial Nissan’s ProPilot 3 self-driving tech…
► … with a baptism of fire in the heart of Tokyo
► Developed with British AI firm Wayve
Tokyo’s streets make London look like a one-horse village. The road network is built on a completely unintelligible mesh of exits and overpasses. Every junction is crammed with filter lights, and pedestrians and cyclists swarm every crossing. No matter how much driving experience you have, you’ll feel out of your depth here.
Right now, our Nissan Ariya is perched on one such junction near one of the capital’s busy crossings, waiting for a never-ending line of commuters before it can turn. But there’s no white-gloved taxi driver behind the wheel. This EV is running Nissan’s ProPilot 3 autonomous technology.

What is ProPilot 3?
It’s a completely autonomous driving system that Nissan has been developing for more than a decade. We last saw ProPilot 2.0 in action on the Playmobil-esque streets of Milton Keynes but, in the last two years, Nissan has stepped up the system’s AI integration by partnering with British firm Wayve – and the result is its most advanced system yet.
Nissan’s original ProPilot system debuted in 2016, capable of keeping the car in a single lane. It was followed by ProPilot 2.0 in 2019, which added multi-lane capability. The ProPilot 3 software we’re trying out here is expected to launch in 2027 and aims to be as good as a human driver.
‘Honestly speaking, the highway was a very simple environment, but now we are driving into the very complex functions,’ Tetsuya Iijima, Nissan’s executive chief engineer of autonomous tech, tells us before the 15km drive. ‘This means the next generation of ProPilot is equal to, or better than, human driving capability.’

The Wayve software Nissan is using is constantly learning, but the key processes needed to get it as functional as a human were completed around four to five years ago.
Wayve’s system began by training foundation models on footage from taxi dash cams and other vehicles; that data isn’t precise, but ‘it gives the AI brain a very high-level understanding of the world and right direction to move.’
After that, Wayve used data from what Iijima calls ‘safety vehicle operators’ to polish things further. These are specially selected drivers who are careful and safe but also able to make progress, so they don’t slow traffic.
Put all that learning together and HD maps aren’t needed; the AI can read and understand any driving environment it’s pointed at.
What about the hardware?
As is the case with humans, Nissan’s latest AI brain can’t work without relevant, reliable sensors. For that reason, the Ariya we’re testing is fitted with what looks like a space-age roof box. It contains one lidar, eleven cameras and five radars, all working together to provide the best possible picture. Nissan isn’t currently disclosing where these sensors are from, nor the chips that must crunch their data – but we do at least know lidar is key to the formula.
‘Lidar goes beyond the reach of a camera, because as far as a camera can see, [the AI will work]. Beyond the reach of a camera, it doesn’t work,’ Iijima tells us, as our Ariya continues on its way, now indicating before working its way around a stranded kei car with its hazards on. ‘Typical situation: in the United States, interstates are almost completely dark,’ he says. ‘Driving at 75 to 80mph, in that situation, the camera’s range is 40 to 50 metres. A camera is not enough.’
What’s it like to (not) drive?
The conditions we’re driving in are nowhere near as precarious as a moonlit American highway strewn with unlit breakdowns, but they are challenging. The Ariya needs to constantly manoeuvre around traffic, stop at complex junctions and allow enough space for cyclists and other road users.
And if it works here, it’ll also work in rural areas, with their lighter traffic and less cluttered, less complex 3D environment. ‘Historically, these kinds of tests start from rural areas,’ Iijima explains, ‘because rural areas are easier.’

The AI’s driving style can be considered ‘accommodating,’ but I later learn there are at least three possible personalities on offer, each with varying levels of ‘assertiveness.’ It’d be interesting to see how the other two modes work – and whether they offer a noticeable benefit to the quality of the treatment you receive from other road users.
The thinking is sound, and so is the overall style of driving. Acceleration is smooth and gradual, while steering inputs are refined. This partly comes down to the forward planning the ProPilot 3 system is doing, but also to the inputs Nissan has tuned the system to make.
Early verdict
The weirdest thing? How quickly it begins to feel mundane. Less than halfway through the journey, my attention begins to stray from the display that shows the Ariya’s autonomous thinking – and even from the wheel, which has been moving itself throughout. Instead, I find myself beginning to check my emails, glance at the time in the UK and think of headlines for this article.
Like solid-state battery technology, fully autonomous driving is perennially ‘around the corner,’ but it’s fair to say this 15km test has at least convinced me that it’s truly viable. Unlike solid-state battery technology, though, autonomous tech must find a way past legality as well as scale. In the meantime, watch this space.
