► Bosch is integrating AI into ADAS and autonomous car tech
► We ride in a demo of a ChatGPT-like AI driving an ID. Buzz
► ‘VLA’ describes what it sees and can explain its decisions
I’ve just been driven around in a Volkswagen ID. Buzz by an AI large language model very much like ChatGPT – albeit gingerly.
Bosch – seemingly like most of the major industry players – is investing in AI, convinced it’s the next big thing. And yes, Bosch has now got to the stage where it has put its AI directly into the driver’s seat, testing whether tech that’s quickly growing in capability can ferry us around, blending the digital and physical worlds into one.
Yes, sorry, I should back up a few steps.
Bosch, the engineering and manufacturing giant, is pouring €2.5bn into artificial intelligence by the end of 2027, with CEO Stefan Hartung saying that ‘AI’s true potential doesn’t lie so much on screens and in the purely virtual world, but rather where it meets the physical world.’
Hartung, speaking at the company’s tech day in July 2025, is keen to point out that Bosch has been working on AI for years, founding its own ‘Centre for Artificial Intelligence’ in 2017 – a centre that has since grown to four locations. As well as integrating AI in other areas of the business, Bosch has been testing and developing AI technology to work with – and one day surpass – conventional advanced driver assistance systems (ADAS) in the cars of the future. It has also joined forces with VW Group’s beleaguered software arm CARIAD to assist with its endeavours.
Alongside that, though, Bosch has a range of its own ADAS sensors, cameras and software that it is working to overlay with artificial intelligence. We got the chance to sample them because Bosch invited a very limited number of media to the final day of a demonstration event it had put on for OEMs – a new shop front showing off its latest findings and future ADAS tech that car makers could buy straight off Bosch’s shelves.
‘Bosch is shaping the future of the mobility experience,’ says Jerome Rigobert, Bosch’s head of ADAS product management. ‘We are doing that with our products, our ADAS software solutions, sensors and our AI technology – we have a clear target to double our expected sales for these products by the mid-2030s.’ Essentially, Bosch wants a piece of the growing AI pie.
The idea is that AI can provide more context and subjective decision making to driver assistance systems that are programmed to stick rigidly to rules and regulations. Bosch’s software engineers are developing something called a ‘vision language action’ (VLA) model that builds upon the ‘large language model’ setup of something like ChatGPT. We already have ChatGPT in many modern cars, but it’s usually limited to being a voice assistant within the infotainment system. AI is also working its way into how car makers design and produce cars.
What makes a VLA different to your normal, off-the-shelf ChatGPT, though, is that it ‘understands the world better than ADAS systems without language functionality,’ according to Dr. Frederik Zilly, AI expert at Bosch (and our guide in the ID. Buzz). The idea is that the VLA can see its surroundings via a car’s built-in cameras and sensors and, as Zilly puts it, can ‘talk in a language ADAS systems understand’, providing extra contextual information for the ADAS tech.
On top of that, because the VLA model is based on a large language model, it can describe in human language what’s around it. Before we got rolling, Zilly – via a laptop connected to the ID. Buzz demo car – asked the VLA to describe what it saw through the front camera. It did just that, with remarkable accuracy: we were in a rainy parking area, next to a temporary structure and some other parked cars, and the AI described the scene pretty flawlessly.
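To make the ‘talk in a language ADAS systems understand’ idea concrete, here’s a minimal, purely illustrative sketch: a vision-language model’s free-text scene description is parsed into structured context flags that a rule-based ADAS stack could consume. The model call is stubbed out with the demo’s parking-lot scene, and every function name here is an assumption for illustration, not Bosch’s actual API.

```python
# Hypothetical sketch: turning a VLA's free-text scene description into
# structured context an ADAS stack can act on. Not Bosch's implementation.

def describe_scene(_camera_frame):
    # Stand-in for the VLA's front-camera description from the demo
    return "A rainy parking area next to a temporary structure, with parked cars."

def scene_to_adas_context(description):
    """Parse a plain-language description into simple context flags."""
    text = description.lower()
    return {
        "low_grip_likely": any(w in text for w in ("rain", "snow", "ice")),
        "parking_area": "parking" in text,
        "static_obstacles": any(w in text for w in ("parked", "structure")),
    }

context = scene_to_adas_context(describe_scene(None))
print(context)
# {'low_grip_likely': True, 'parking_area': True, 'static_obstacles': True}
```

In a real system the parsing step would itself be learned rather than keyword-based, but the point stands: the language model’s output becomes machine-readable context for downstream assistance functions.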
So, how does that apply to ADAS? Take traffic sign recognition (TSR). The theory of having an AI overlay on a system like that is that when TSR reads a road sign wrong, the AI can step in and correct it if there’s enough additional context to warrant doing so. If, for example, TSR reads a ‘130’ sticker on the back of a European lorry while you’re on a British motorway, the system has a tendency to take that as the actual speed limit and display it in your instruments. AI, provided it has vision via cameras and other sensors, can correct the TSR because it recognises your car is on a motorway and has just passed a lorry, so the ‘70’ or national speed limit icon stays in the display.
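The lorry-sticker scenario boils down to a plausibility check: veto a detected limit that contradicts what’s known about the road. Below is a minimal sketch in that spirit – the function name, the country table and the rules are all hypothetical illustrations, not Bosch’s logic.

```python
# Illustrative plausibility check for traffic sign recognition (TSR).
# All names and rules are hypothetical, not Bosch's implementation.

def plausible_speed_limit(detected_kph, road_class, country, sign_on_vehicle):
    """Return the limit to display, vetoing implausible TSR readings."""
    # Hypothetical national motorway limits, km/h (None = no fixed limit)
    motorway_limits = {"UK": 112, "DE": None, "FR": 130}

    # A '130' sticker read off the back of a lorry is context, not a sign
    if sign_on_vehicle:
        return None  # keep the previously displayed limit

    limit = motorway_limits.get(country)
    if road_class == "motorway" and limit is not None and detected_kph > limit:
        return limit  # clamp to the known national limit
    return detected_kph

# A '130' read from a lorry on a British motorway is rejected:
print(plausible_speed_limit(130, "motorway", "UK", sign_on_vehicle=True))   # None
print(plausible_speed_limit(130, "motorway", "UK", sign_on_vehicle=False))  # 112
```

The AI overlay’s job, in this framing, is supplying the context inputs – road class, and whether the ‘sign’ was on the back of a vehicle – that a camera-only TSR system lacks.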
But that’s just the first step. The next step is to let generative AI drive your entire car when given full access to the car’s suite of sensors, cameras, radar and – of course – the car’s controls. Zilly is quick to point out, though, that the AI driver assistance tech is defined as ‘Level 2++’ within the agreed levels of autonomy; ‘the driver is still the boss,’ says Zilly.
Effectively, yes. Sat in the back of the ID. Buzz, we’re first taken out to a flat obstacle course to see if the tech can perform some basic manoeuvres. I’ll be honest: things don’t get off to a great start. Zilly, speaking into a microphone, tells the system to ‘turn left at the next intersection’ as the ID. Buzz gingerly meanders forward, steering wheel tweaking left and right constantly. But the voice command isn’t transcribed properly, so the turn doesn’t happen. Instead, Zilly types the command into a chat box, and the car obliges – again very slowly, like a learner driver still bumbling about in a car park.
A bigger challenge comes when a map marker is set, and the system plots a trajectory around some of the roads on the Bosch campus in Renningen. The pace picks up, and the Buzz holds its course pretty smoothly. Zilly interrupts the route plan by saying ‘stop by that car’ – a Passat parked by an access road. Because the system can see the car, it keeps driving and stops accurately alongside.
It’s all a bit eerie, but fascinating at the same time. This technology is very clearly still in its early stages, with Bosch’s legion of software developers working hard to make it all come together over the next few years. As for practical applications, Zilly says the AI system will work on production cars without any additional hardware; its functionality will instead be limited by the sensors and cameras already installed on the car it’s applied to.
Well, that’s the next issue. As we’re finding out as it grows and develops, AI can introduce inaccuracies or downright falsehoods when given open-ended prompts – something we’re seeing regularly on social media via generative AI content tools.
The issue with overlaying AI onto sometimes safety-critical driver assistance systems is that if it’s not seen as trustworthy, car owners may not use it – or may actively turn it off. We’re already seeing this with ADAS, as drivers frequently complain about tech like lane-keeping assistance or traffic sign recognition and find ways to switch it off.
‘I think that’s the most important question for the overall industry,’ Rigobert tells CAR. ‘If you want to bring in AI, we need to ensure the public – the end customer – will go with it. We have done surveys asking if people trust AI and how they would react if AI wasn’t working, and I see the answer as very region specific. It’s very negative in Europe, but in China there’s not much issue. Part of that answer is that we have to consider where we are in the world with it.’ Rigobert points to the fact that Bosch’s VLA can visualise, describe and explain its actions, which he says will help build trust in the system.
Matthias Klauda, Bosch’s executive vice president for engineering and R&D, expands that point further. ‘There’s a usability perspective and a technical perspective,’ he tells CAR. ‘In terms of usability, what we’ve found out is what a user wants is to understand what the car is doing. What is the car seeing and what is it reacting to, etcetera. The vision language action model is a really good measure here, because you can really set the scene and get a verbal interpretation of it.
‘The technical part is that you need to design safety right into the stack, so that means you need redundancies in both hardware and software,’ Klauda adds. ‘You can’t base redundancy with two stacks that use the same methodology, because then they would both make the same mistake. You need differing software paradigms for the two stacks and make sure they supervise each other so the system remains safe in all conditions.’
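Klauda’s point about two stacks with differing paradigms can be sketched in a few lines. In this purely illustrative example, one stack computes braking distance with closed-form kinematics and the other with numerical integration – two deliberately different methods for the same quantity – and a supervisor only accepts the result when they agree. The functions, tolerance and fallback are assumptions for illustration, not Bosch’s architecture.

```python
# Minimal sketch of cross-supervised redundancy: two independently
# implemented stacks, a supervisor that checks agreement. Illustrative only.

def stack_a_brake_distance(speed_ms, decel=6.0):
    # Stack A: closed-form kinematics, v^2 / (2a)
    return speed_ms ** 2 / (2 * decel)

def stack_b_brake_distance(speed_ms, decel=6.0, dt=0.001):
    # Stack B: a deliberately different paradigm - numerical integration
    distance, v = 0.0, speed_ms
    while v > 0:
        distance += v * dt
        v -= decel * dt
    return distance

def supervised_brake_distance(speed_ms, tolerance=0.5):
    """Accept a result only if both stacks agree within a tolerance."""
    a = stack_a_brake_distance(speed_ms)
    b = stack_b_brake_distance(speed_ms)
    if abs(a - b) > tolerance:
        raise RuntimeError("stacks disagree - fall back to a safe state")
    return max(a, b)  # take the more conservative estimate
```

Because the two stacks share no code or method, a bug in one shows up as a disagreement rather than a silently shared mistake – which is exactly why Klauda rules out basing redundancy on two copies of the same methodology.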
Well, that’s a wider question, but a safe answer is around 2030. A big technological breakthrough being ‘five years away’ is a bit of a running gag these days, with the same general trajectory having been predicted for years when it comes to tech like self-driving cars or solid-state batteries.
But Bosch is actually being realistic. It says the first level of its AI-integrated ADAS could be ready for Europe in 2027 but, because it’s a supplier rather than an OEM, it might be another couple of years before the technology actually makes it to a production car. Given Bosch’s partnership with CARIAD, the first cars to carry the technology could be VW Group products running on the new Scalable Systems Platform from 2030.
By Jake Groves
CAR's news editor; gamer, trainer freak and serial Lego-ist