A Tesla driver says he’s still a loyal user of Full Self-Driving, even after colliding with another vehicle while using Autopilot.


According to a recent New York Times report, Tesla driver Dave Key was quick to defend the electric carmaker’s driver-assistance software, even years after crashing while using Autopilot.

The 69-year-old former software entrepreneur told the newspaper that, despite the accident, he believes the Full Self-Driving software, which is still in beta, has the potential to save lives, adding that even fatal accidents should not derail the wider vision of autonomous vehicles. “As a society, we choose the path that saves the most lives,” he told The New York Times.

On May 29, 2018, Dave Key’s 2015 Tesla Model S was driving him home from the dentist on Autopilot. It was a route Key had followed countless times before: a two-lane highway climbing into the hills above Laguna Beach, California. But on that trip, while Key was distracted, the car drifted out of its lane and slammed into the back of a parked police SUV, spinning the Tesla around and pushing the SUV aside. No one was hurt, but the Tesla was badly damaged.

Key, 69, a former software entrepreneur, put on his engineer’s hat and took a dispassionate view of the accident: “The problem with stationary objects, sorry, it sounds silly, is that they don’t move,” he said. For years, Tesla’s AI struggled to distinguish stationary objects from the background. Rather than being troubled that the computer couldn’t handle such a seemingly simple problem, Key convinced himself that there was a reason behind the crash: not some random glitch, but a known software limitation, a kind of black-swan event.

Apparently this was a problem other Autopilot users were having.

Last year, the National Highway Traffic Safety Administration (NHTSA) upgraded its investigation into more than a dozen crashes in which Teslas using Autopilot struck emergency vehicles. The agency said the driver-assistance feature had difficulty identifying parked vehicles.

Since the crash, Key has bought three more Teslas and continues to use both Autopilot and FSD. He adds that the software has since been upgraded to monitor the driver’s attention to the road, with reminders based on steering-wheel pressure and tracking of the driver’s eyes.

Quote from The New York Times:

As we drove, Key compared FSD with the Autopilot in his 2015 Tesla. Autopilot, he said, was like sophisticated cruise control: speed, steering, collision avoidance; although, in his case, he admitted, “I don’t think it really prevented accidents.” He was far more impressed by FSD: it handled every situation it was presented with. “My only real complaint is that it doesn’t always choose the route I would choose.”

After a minute, the car prompted Key to keep his hands on the wheel and his eyes on the road. “Tesla is a kind of nanny now,” he complained. Whereas Autopilot once dangerously allowed inattentive drivers to fall asleep at the wheel, that flaw has been eliminated, like the stationary-object error. “Between the steering wheel and the eye tracking, it’s just a solved problem,” Key (…) said.

Key brought four pages of notes to our interview, listing facts about the accidents and organized under subheadings such as “Tesla Full Self-Driving Technology (discussion).” He is the kind of man who walks around with a well-formed battery of opinions on life’s most important topics (computers, software, exercise, money) and is ready to share them. He was particularly keen that I understand that Autopilot and FSD were lifesavers: the data showed their crash rate during the beta was far lower than that of other cars, his notes read. Slowing the [development of the] FSD beta will result in more accidents and deaths, based on hard statistics.

Accidents like his, and even fatal ones, are unfortunate, he says, but they should not distract society from the goal of widespread adoption of autonomous vehicles. Key drew an analogy to the coronavirus vaccines, which have prevented hundreds of thousands of deaths but have also caused rare deaths and injuries from adverse reactions. “As a society,” he concluded, “we choose the path that saves the most lives.”

After Autopilot’s release in October 2015, Musk encouraged drivers to think it was more advanced than it was, saying in January 2016 that it was better than a human driver. In November of that year, the company released a video of a Tesla driving on Bay Area roads with the following caption: “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.” Musk also rejected the name Copilot in favor of Autopilot.

A 2016 video promoting Tesla’s self-driving car

The fine print made clear that the technology was only a driver-assistance system, but that message came through in only some of Musk’s announcements. Many drivers seemed genuinely confused about Autopilot’s capabilities (Tesla also declined to disclose that the car had crashed in the company’s parking lot during the filming of the 2016 video, a fact later revealed by a Tesla engineer who admitted that the self-driving footage was staged).

Tesla faces a number of lawsuits

Unfortunately for Musk, the short term comes first, and his company is in for a difficult few months. The first lawsuit against Tesla over an accident involving Autopilot goes to trial in February; four others will follow in quick succession. Donald Slavik, who will represent the plaintiffs in at least three of those cases, says a normal automaker would already have settled out of court: “They see it as a cost of doing business.” Musk has vowed to fight in court regardless of the risks that poses to Tesla. The dollar amounts could add up quickly, Slavik said, especially if punitive damages are awarded.

In one of his complaints against Tesla, Slavik listed notable accidents involving Autopilot, from A to WW. In China, a Tesla crashed into a road sweeper. In Florida, a Tesla hit a tractor-trailer on a highway. In Indiana, a Model 3 veered off the road and burst into flames during a downpour. In the Florida Keys, a Model S drove past the end of a road. In New York, a Model Y struck a man who was changing a tire on the shoulder of the Long Island Expressway. In Montana, a Tesla swerved unexpectedly into a highway barrier. Then the same thing happened in Dallas, Mountain View and San Jose.

In fact, the many allegations in the ongoing lawsuits boil down to a single theme: Tesla consistently inflated consumer expectations and downplayed the dangers involved. The cars lacked adequate driver monitoring because Musk did not want drivers to think the car needed human supervision.

Note that in April 2019 Musk said: “If you have a system that is at or below human-level reliability, then driver monitoring makes sense. But if your system is significantly better, more reliable than a human, then monitoring doesn’t help much.”

Drivers were not warned about problems with sudden automatic braking or uncommanded lane changes. The company eventually acknowledged the technology’s limitations in its user manual, yet still released viral videos of a Tesla driving a complicated route with no human intervention.

The cause of your accident or incident always lies with you

Musk’s ideal customer is someone who, like Key, is willing to take the blame when something goes wrong but has almost unlimited faith in the next update. In a deposition, a Tesla engineer made this clear: “We want to let the customer know that, first, you should trust your car: everything is working just as it should. Then, the cause of your accident or incident always lies with you.”

As if to illustrate these remarks, when they [Key and the journalist] were driving and his Tesla missed a left turn, Key quickly diagnosed the problem: if the system had been upgraded to FSD 10.69, he claimed, the car would have made the turn without fail.

Unfortunately for Musk, not all Tesla owners are like Dave Key. The plaintiffs in the Autopilot lawsuits may agree that the artificial intelligence is getting better, but they argue it is doing so on the backs of early adopters, and of bystanders who may be killed along the way.

Online, there’s a battle between pro-Musk and anti-Musk groups over Autopilot and FSD.

Key is far from the only Tesla driver to report software issues while continuing to support Elon Musk’s vision for the company. Another outlet reported earlier, for example, that a Tesla driver had covered 6,392 miles (about 10,286 kilometers) using mainly Autopilot and FSD and, despite encountering bugs in the software, continued to view the program as a lifesaver.

Source: The New York Times Magazine

And you?

Do you understand Dave Key’s perspective? Do you share his opinion? To what extent?
“The cause of your accident or incident always lies with you”: do you agree with this Tesla engineer’s statement? Why or why not?
