Tesla reported today that the National Highway Traffic Safety Administration (NHTSA) is opening a preliminary evaluation to determine whether Autopilot worked according to expectations in a recent crash that resulted in the death of a Model S driver.
According to Tesla, the high ride height of the trailer, combined with its positioning across the road, caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield. When asked why the radar sensors didn't recognize the truck and apply the brakes or change trajectory, Elon Musk explained that the radar tunes out what looks like an overhead road sign to avoid false braking events.
Tesla and Elon Musk extended their condolences to the family of the victim, saying the driver was a friend and a longtime EV supporter.
Tesla was recently forced to deny, in a statement, a rumor of an NHTSA investigation into suspension failures on the Model S after a blogger mounted a smear campaign against the company. Elon Musk then went on Twitter to clarify that 37 of 40 suspension complaints to NHTSA were fraudulent, registered to false locations or vehicle identification numbers, adding that NHTSA had found no safety concerns with the Model S suspension and had no further need for data from Tesla on the matter.
According to The Verge, the accident occurred on May 7th in Williston, Florida. The driver, 40-year-old Ohio resident Joshua Brown, had recorded a video of his Tesla Model S’ Autopilot saving him from a crash back in April.
NHTSA said it was working with the Florida Highway Patrol in the inquiry into Mr. Brown's fatal accident. The agency cautioned that opening an investigation did not mean it believed there was a defect in the vehicle being examined.
Tesla has repeatedly made it clear that drivers must keep their hands on the wheel at all times, as Autopilot is still in a beta phase. In fact, every time Autopilot is engaged, the car reminds the driver to "Always keep your hands on the wheel. Be prepared to take over at any time." The system also checks that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.
Tesla says owners of its electric vehicles have driven 130 million miles using its semi-autonomous Autopilot feature; worldwide, among all vehicles, there is one fatality for approximately every 60 million miles driven. Tesla's director of Autopilot Programs, Sterling Anderson, spoke last month at the EmTech Digital conference in San Francisco, on the MIT Tech Review panel "Delivering the Promise of Autonomous Vehicles". He told the audience that the data collected from those trips is being analyzed to improve Autopilot and introduce more upgrades. He warned that Autopilot cannot legally replace a driver yet and insisted that it "should be used with a driver fully engaged, fully in the loop, using their cognitive abilities as they normally would". In addition, he advised drivers to "stay very in tune with the set of scenarios the car doesn't handle well", adding that they "should be very engaged and be prepared to take over."
During a Q&A at OCE Discovery in Toronto last month, Tesla CTO JB Straubel made the case that the safety benefits of using Autopilot outweigh the risks:
Data is going to end up telling us this at some point. We, as an industry, will have statistical proof that some autonomous systems are safer when enabled than when they’re not enabled. It’s not a question of ethics: if you have several billion miles of driving, we can look at the data and say these billion miles are safer than these billion miles, so clearly we want to use that system. Just like airbags today, they can still cause some harm, there are definitely dangers to having airbags, but I think everybody understands you’re much better off having airbags, and some of those dangers, than not having them at all. I believe we’ll have a similar situation with autonomous driving.
You can read Tesla's full statement below.
We learned yesterday evening that NHTSA is opening a preliminary evaluation into the performance of Autopilot during a recent fatal crash that occurred in a Model S. This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.
Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred. What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.
It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times," and that "you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.
We do this to ensure that every time the feature is used, it is used as safely as possible. As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.
The customer who died in this crash had a loving family and we are beyond saddened by their loss. He was a friend to Tesla and the broader EV community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission. We would like to extend our deepest sympathies to his family and friends.