
ROBORACE: “The vehicles will actually have personalities”

In Montréal, we grabbed half an hour with Bryn Balcombe, Chief Technical Officer of Roborace, for a chat about the psychology of AI and what racing is in a technology age. Well, technically we just meant to talk to him for ten minutes about the future of Roborace but things escalated a little and he humoured us by getting into what it means to call an AI a driver and whether the Robocar will ever actually offer gestures of universal respect and polite suggestions on driving improvement to its fellow drivers. This two-part interview will give you an insight into ROBORACE that you have not had so far.


Roborace is an electric, autonomous racing championship, designed to be contested not by drivers but by programmers and technologists, via specialist racing Artificial Intelligences. In its early stages, it has been running demonstrations at Formula E tracks and privately testing and developing the sensory, mechanical and programmatic technology that will ultimately allow it to launch its racing. Currently, most of its demonstration work centres on Devbot, an adapted LMP3 car which can run autonomously. Soon the bespoke Robocars will hit the streets in earnest, with autonomous racing beginning alongside Formula E. But what does that mean, for AI and human drivers alike?

To take it back to the roots of Roborace: when you first started this project, did you imagine that you’d go for something like Devbot as it is now, where it has to be mechanically operable as an LMP3 car – that is, a driveable LMP3 – or did you think you’d go straight into the Robocars?

When it first started, the original idea was to use motorsport to advance technology – that’s what motor racing has been used for before. The other side of it is that you need to be able to engage the public – and that’s one of the things that motorsport is struggling to do – it’s a great challenge. Formula E’s doing a great job of building a new base for motorsport, Formula 1’s changing now to try and retain its base.

But you have to have an entertainment proposition and you have to be able to inspire the public and that’s why they created Robocar, that’s why Daniel Simon came on board to do that part. You then have the engineering elements of that, so you have to go from a 3D rendering to a manufacture-level product, that takes time. And during that process, what we created was Devbot. So while we were waiting for the final designs, in that period between concept design and product we created Devbot as a development vehicle to test out the concepts that were going into Robocar. So: four wheel drive powertrain, sensor suite, the computer architecture, is all identical on Devbot to Robocar.

Oh ok, so it actually is the same – with some variance for shape and whatever?

Yes, the only difference is we have a cockpit, and some differences in the steering mechanism because we have a human that can actually drive the car manually. But overall the idea is that you can develop software on Devbot and then you just port the software across, put it on Robocar and Robocar will work. And that’s the normal development routine that we go through at an event.

Ostensibly, the LMP3 looks very different from the Robocar but I guess it’s mechanically close to the same thing?

The main advantage is that you get to manually drive the thing, so you can test out all our powertrain systems, you can actually acquire a lot of data – which is what we’ve been doing here, so you can manually drive around just doing data acquisition. Now, in Robocar you can’t do that because it needs that data to drive. If it hasn’t got the data beforehand then it can’t drive. So it’s always good to have the option to do that scanning and data acquisition manually. We’re then able to run the software in the car, so we’re able to have a human in the car monitoring the AI’s driving performance – so again, that’s another validation you can get; you can’t get closer to the vehicle dynamics than being in the car.

So when the human drives – obviously the driver takes Devbot for a lap, does Devbot then take the driver for a lap?

Basically – that’s exactly what happens. So you do some manual installation laps, so that’s really checking the vehicle performance, making sure everything mechanically is fine, from a human perspective.

That the brakes are working and things?

Yeah, exactly. And then the second step is that you put it into the AI mode, the driver stays in the car but takes his hand off the wheel, foot off the throttle and the brake and then he monitors the racing line and the performance of the AI driver. Once he’s confident, he can then step out of the car and can start to do AI driver development testing.

In terms of developing the AI, it’s now within 8% of a human driver’s speed – or that was the last figure I saw?

Yeah, that was right – when we were in Berlin. But that’s a variable and that’s something that we’ll be talking about a lot more next season. That is: if you take the performance, how do you break it down? We look at it on three levels: you have the vehicle hardware’s performance; you then have the performance of the vehicle intelligence platform – which is the environment sensing and the communications – and then you have the performance of the AI driver. So those three things define overall performance and it’s important for us to be able to communicate that.

Does the AI driver – obviously it’s data based, it’s programmatical, it is ultimately a computer programme but does it have tracks it prefers?

Yeah, this is where you get the difference – so you would say, like a mechanical setup or an aero setup on the car will be tailored to a particular track. So that’s vehicle hardware, you can tailor the car to that. If you look at a sensor suite and the communications for the car – again, you can say that could be biased towards one type of environment or another. So when we run in Formula E tracks, for example, the lidars are really effective at looking at vertical surfaces on the walls.

Oh of course, cus you’ve got the sheer concrete barriers.

Exactly – then when you go to somewhere like Silverstone then you’re relying much more on machine vision, for example.

Because it’s not seeing an edge to the track?

You don’t have such a defined edge so yeah, for sure, the different environments we run at have different performance, if you like. So I wouldn’t call that a personality in terms of the programmatical.

No, but it’s better in certain circumstances.

Yeah. When you start to look at the personality of a driver, that’s really where the teams come in – and whether their use of sensors really affects their performance on a track, gives a sense of personality on track.

Which would be where, when you get Robocar into manufacture and the series gets going, the vehicles will actually have personalities in that respect, according to the way that they’re set up?

Somewhat – Daniel Simon described it the other day and he said that software always has a personality: Bill Gates created Windows and Steve Jobs created iOS, and you have different people who are fans of each of those products, and the way you use the software and the way it feels is very different. The same thing will manifest itself in cars. What’s interesting is that some of that comes through the dashboard and the human-machine interface of the car – the perception of the car – but actually the personality of the car is defined by its actions.

That’s what’s really interesting. So in different countries there are drivers who we know all drive a certain car in a certain way, so you can ascribe to the entire population of those car owners a particular personality. It will be the same for autonomous vehicles. And that’s what’s really interesting: the actions of the car define its personality. So in racing, it’s the actions but also the interactions with the other objects, with the other drivers.

So you might get like – a French Clio driver has a very different style to a British Volvo driver?

(Laughing) Yeah, so… in the UK we’re allowed to say ‘White Van Man’.

(Also laughing) Yes, yeah.

So effectively you attribute a style of driving to a certain model of vehicle because you’ve seen that vehicle being driven so many times in that way. So you’d describe that as a population. So when you then buy an autonomous… let’s say it’s a Toyota, for example: if the software is identical in all those vehicles then they’ll all behave the same way. So you’ll start to ascribe personality to those cars.

And there’s kind of like… as the owner of the car then you have to work out whether that personality matches your personality. And that’s what becomes really interesting. So we’ll see that play out but on a race track, in terms of whether one car pushes to overtake another car and if there’s an accident then who does the public think is in the wrong or in the right. And it’ll split the public, in the same way that Rosberg-Hamilton splits the public. You will have a particular favourite that’s built up over time, based on their actions that have gone before.

Yes definitely. I mean, the good thing about Robocar is it probably won’t storm down the pitlane having an argument with anyone.

Yeah I mean, that’s the interesting thing…

Find out whether machines will indeed take over in part two of our interview with Bryn Balcombe, which will be available tomorrow.

Images courtesy of ROBORACE and FIA Formula E