Tesla self-driving test driver: ‘you’re running on adrenaline the entire eight-hour shift’

A new report based on interviews with former test drivers on Tesla’s internal self-driving team reveals the dangerous extremes to which Tesla is willing to go to test its autonomous driving technology.

While you could argue that Tesla’s customers are themselves self-driving test drivers, since the automaker deploys what it calls its “supervised self-driving” (FSD) system to them, the company also operates an internal fleet of testers.

We previously reported on Tesla hiring drivers all over the country to test its latest ‘FSD’ software updates.

Now, Business Insider is out with a new report after interviewing nine of those test drivers, who worked on a specific project called ‘Rodeo’. The report describes the project:

Test drivers said they sometimes navigated perilous scenarios, particularly those drivers on Project Rodeo’s “critical intervention” team, who say they’re trained to wait as long as possible before taking over the car’s controls. Tesla engineers say there’s a reason for this: The longer the car continues to drive itself, the more data they have to work with. Experts in self-driving tech and safety say this type of approach could speed up the software’s development but risks the safety of the test drivers and people on public roads.

One former test driver described the job as being like “a cowboy on a bull and you’re just trying to hang on as long as you can” – hence the program’s name.

Apart from sometimes using versions of Tesla FSD that haven’t been released to customers, the test drivers generally use FSD much as customers do – the main difference being that they try to push it to its limits far more often.

Business Insider explains Project Rodeo’s “critical intervention team” in more detail:

Critical-intervention test drivers, who are among Project Rodeo’s most experienced, let the software continue driving even after it makes a mistake. They’re trained to stage “interventions” — taking manual control of the car — only to prevent a crash, said the three critical-intervention drivers and five other drivers familiar with the team’s mission. Drivers on the team and internal documents say that cars rolled through red lights, swerved into other lanes, or failed to follow posted speed limits while FSD was engaged. The drivers said they allowed FSD to remain in control during these incidents because supervisors encouraged them to try to avoid taking over.

These are behaviors that FSD is known to exhibit in customer vehicles, but drivers generally take over before it goes too far.

The goal of this team is to go too far.

One of the test drivers said:

“You’re pretty much running on adrenaline the entire eight-hour shift. There’s this feeling that you’re on the edge of something going seriously wrong.”

Another test driver described how Tesla FSD came within a couple of feet of hitting a cyclist:

“I vividly remember this guy jumping off his bike. He was terrified. The car lunged at him, and all I could do was stomp on the brakes.”

The driver’s supervisor was reportedly pleased by the incident. “He told me, ‘That was perfect.’ That was exactly what they wanted me to do,” said the driver.

You can read the full Business Insider report for many more examples of the team doing very dangerous things around unsuspecting members of the public, including pedestrians and cyclists.

How does this compare to other companies developing self-driving technology?

Market leader Waymo reportedly has a team doing work similar to that of Tesla’s Rodeo “critical intervention team”, but with one key difference: Waymo conducts this kind of testing in closed environments, using dummies.

Electrek’s Take

This appears to be a symptom of Tesla’s start-up approach of “move fast, break things”, but I don’t think it’s appropriate here.

To be fair, none of the nine test drivers interviewed by BI said they had been in an accident, but they all described very dangerous situations in which bystanders were dragged into the testing without their knowledge.

I think that’s a bad idea and ethically wrong. Elon Musk claims that Tesla is about “safety first”, but the examples in this report sound anything but safe.
