Auto Trader cars


A ride in a self-driving car

‘Self-driving’ tech is increasingly common in our cars, but Wayve’s AI-driven Jaguar proves the real thing is rather more involved

I’m sitting in the passenger seat of a Jaguar I-Pace, being driven round the streets of Camden. As we approach a zebra crossing on a busy street, a woman ambles vaguely in its direction. I’m not at all certain, 10 yards out, if she intends to use it or not, but the car slows to a stop, just as it becomes clear she is indeed going to cross the road. So far, so normal.
But it wasn’t a driver who stopped the car, and it wasn’t a human who interpreted the woman’s stroll from 10 yards away and decided she was intending to cross the road. It was the Wayve artificial intelligence software driving the car. Wayve is the well-documented brainchild of Alex Kendall, one of those frighteningly young high achievers (he’s 32) who make you wonder what you’ve been doing with your life. We both attended Cambridge University, but he went on to co-found Wayve in 2017, the start-up he now leads as CEO, a billion-dollar company that’s likely to change the notion of transport for ever, and I went on to bitch about other people through the medium of journalism. C’est la vie.
Anyway, Wayve is all about self-driving cars, and it powers them with embodied AI. Otherwise known as AV 2.0, Wayve’s approach to artificial intelligence is a ‘learned’ one: the software teaches itself how to manage the road network, building up its understanding every time it receives more data from its cameras and sensors. Some rivals in the AI space use AV 1.0 instead, a rules-based approach built on mapping and coded-in responses to predetermined situations.
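To make that distinction concrete, here is a deliberately simplified sketch, nothing like Wayve’s actual code, with every name and number invented for illustration. The AV 1.0 system only handles situations someone anticipated and coded in, while the AV 2.0 system maps whatever the sensors report to an action via a learned model (here a trivial weighted sum stands in for a deep network trained on fleet data).

```python
# Illustrative contrast between AV 1.0 (rules-based) and AV 2.0 (learned).
# All names, scenarios and numbers are hypothetical.

def av1_decide(scenario: str) -> str:
    """AV 1.0: every situation must be anticipated and coded in advance."""
    rules = {
        "red_light": "stop",
        "pedestrian_on_crossing": "stop",
        "speed_hump": "slow",
        "clear_road": "proceed",
    }
    # An unmapped scenario has no coded response: the system falls back.
    return rules.get(scenario, "hand_over_to_human")

def av2_decide(features: list[float], weights: list[float]) -> str:
    """AV 2.0 (schematic): a learned model scores raw sensor features.
    Real systems use deep networks; the weighted sum is a stand-in."""
    score = sum(f * w for f, w in zip(features, weights))
    return "stop" if score > 0.5 else "proceed"

# The rules-based system handles only what was foreseen:
print(av1_decide("pedestrian_on_crossing"))       # stop
print(av1_decide("woman_ambling_near_crossing"))  # hand_over_to_human

# The learned system maps whatever the cameras see to an action:
print(av2_decide([0.9, 0.8], [0.5, 0.4]))  # stop (score 0.77)
```

The critics’ point in the next paragraph is about the left-hand approach in reverse: a purely learned system with too-narrow data can meet a situation it has never seen, which is why some argue certain responses must still be coded in.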
Critics of embodied AI say the data it feeds off isn’t deep enough to cover every scenario, so some instances must still be coded in to stay safe. Wayve’s scientists insist the data is wide enough in scope: they draw on the trial cars themselves, dash cams, data from partner car brands, road signs and more.

Wayve doesn’t build the cars; rather, it creates the self-driving software and partners with car brands to put it into their models. Manufacturers are apparently crawling over each other to sign up. The potential benefits of self-driving are huge, and largely come down to efficiency, of both time and money. True autonomous vehicles save on manpower, move more predictably in a fleet, and should crash less once human error is dialled out.
However, we are a little way off the UK’s roads being populated by robot vehicles. The first iteration of Wayve software in cars will be level 2/3 autonomy, which to you and me just means the sort of advanced safety functions a lot of cars already have, like active cruise control and lane-keep assist.
All the same, the software includes dormant level 4 (full autonomy) capability, so when regulation allows, the vehicles will be ready for hands-off driving. And that includes lorries as well as cars.

So, how good is the tech? I’m spared the true freakiness of sitting in the car with no one in the driver’s seat because, thanks to UK law, a human currently has to be present and ready to take control should the software muck up. We head out of the gated compound in a dodgy area of Camden, onto the street, and the driver presses the green button that allows the car to take control. Off we go, quickly up to the 20mph speed limit, easing past parked cars, slowing slightly for speed humps, and coming to a stop at a red light. So far, so surprisingly confident and smooth.

Then there’s the woman-at-the-crossing incident, as above, which blows my mind. The car simultaneously understood that there was a crossing requiring attention, and that one pedestrian among about five on the pavement was displaying enough intent to cross the road that it should stop. I ask how the software reads intent. “Physics”, says the Wayve executive sitting in the back seat. The software reads her gait, her pace, her direction and her momentum to discern that she is headed for the crossing, something I would have struggled to do. And all in a few milliseconds. That’s impressive.
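As a rough, purely illustrative sketch of what “physics” might mean here (this is not Wayve’s model; the function, inputs and threshold are all invented), intent can be scored from position and velocity alone: how fast the pedestrian is moving and how directly their heading points at the crossing.

```python
import math

def crossing_intent(ped_pos, ped_vel, crossing_pos, speed_floor=0.3):
    """Score (0 to 1) how strongly a pedestrian's motion suggests they
    will use a crossing, from physics alone: position, speed, heading.
    Purely illustrative -- not Wayve's model; the threshold is made up."""
    dx = crossing_pos[0] - ped_pos[0]
    dy = crossing_pos[1] - ped_pos[1]
    dist = math.hypot(dx, dy)
    speed = math.hypot(*ped_vel)
    if speed < speed_floor or dist == 0:
        return 0.0  # standing still, or already on the crossing
    # Cosine of the angle between the heading and the line to the
    # crossing: 1.0 means walking straight at it, <= 0 means walking away.
    heading = (ped_vel[0] * dx + ped_vel[1] * dy) / (speed * dist)
    return max(0.0, heading)

# A pedestrian roughly 9m away, ambling straight toward the crossing
# (positions in metres, velocity in metres per second):
score = crossing_intent((0.0, 0.0), (1.0, 0.5), (8.0, 4.0))
# heading is ~1.0 here, which a planner could treat as strong intent
```

A real system would fold in gait and momentum over many camera frames rather than a single velocity reading, but the flavour is the same: the maths runs in milliseconds, faster than a human can form the same judgement.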
The next couple of situations are less impressive. We pull up in slow traffic behind a stationary bus. The car edges out to see if there is space in front of the bus to go round it, but the bus is simply in a queue of traffic. The car decides to go round anyway, so the driver intervenes to brake and reverse back into line behind it. The other incident involves a woman in a wheelchair who approaches a zebra crossing as we arrive at it. The car and the woman both hesitate; the woman goes to cross just as the car starts to accelerate, so we carry on past her, apologetic and blushing.
These things are infrequent though in a drive round some of London’s most congested, busy roads. The car deals easily with a maze of fresh roadworks which weren’t there the day before, and its cameras can even read road signs that stipulate which hours a bus lane is operational, and react accordingly. The car is, in other words, behaving just as a human would do. Undoubtedly, there’s a way to go yet, and mass consumer appetite isn’t there either. Inevitably the first application of this tech will be in commercial fleets. But in time, who knows? It’s an impressive start, with heavy investment and experienced industry executives on board. It’s also made me feel extremely inadequate about my career choice.