"When activating Summon from the parking stalk, choose the direction of travel on the touchscreen before exiting the car" the update screen reads. Autopark will then start after the driver exits the car.
Last week, Utah Model S owner Jared Overton's car crashed into a trailer while in Summon mode. Overton claimed that the Summon feature started on its own and slammed his car into the trailer in front of it while he was in a grocery store. After reviewing the car's logs, Tesla disputed his claim, saying that Summon had been activated and that Overton should have been watching his car.
In an unrelated incident, Ars Technica reported last month on a Model S owner who was driving north from Los Angeles on I-5, cruising in Autopilot mode. "All of a sudden the car ahead of me came to a halt. There was a decent amount of space so I figured that the car was going to brake as it is supposed to and didn't brake immediately. When it became apparent that the car was not slowing down at all, I slammed on the brakes but was probably still going 40 when I collided with the other car."
In response, Tesla said that the car's logs show the driver hitting the brake pedal, which deactivated Autopilot and Traffic-Aware Cruise Control and returned the car to manual control instantly. (This has been industry-wide practice for cruise control systems for many years.) The use of the brake also apparently disengaged the automatic emergency braking system, a feature that has been standard across Tesla's range since firmware version 6.2 rolled out last year.
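A rough sketch of that brake-override behavior, as described in Tesla's account, might look like the following. This is not Tesla's code; the `DriverAssist` class and its flags are assumptions made purely to illustrate the disengagement logic:

```python
# Illustrative sketch (not Tesla's implementation) of the brake-override
# behavior described above: any brake-pedal input cancels Autopilot and
# traffic-aware cruise control, and per this account also disengages
# automatic emergency braking, on the premise that the driver has taken over.
class DriverAssist:
    def __init__(self):
        self.autopilot_engaged = True
        self.tacc_engaged = True
        self.aeb_armed = True

    def on_brake_pedal(self):
        # Industry-wide convention: brake input returns manual control.
        self.autopilot_engaged = False
        self.tacc_engaged = False
        # Per the account above, driver braking also stands down AEB,
        # since the driver is assumed to be in full control.
        self.aeb_armed = False


car = DriverAssist()
car.on_brake_pedal()
assert not car.autopilot_engaged and not car.tacc_engaged and not car.aeb_armed
```

The design choice at issue is that a late, partial brake press hands everything back to the human at exactly the moment the automated systems might otherwise have braked harder.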
Autopilot still requires drivers to remain attentive even while the car is driving or parking itself. Over time, the feature should become better at preventing accidents in unusual circumstances as it collects data from drivers covering billions of miles per year.
During a Q&A at OCE Discovery 2016 last week, Tesla CTO JB Straubel was asked about the ethics of letting the car dictate decisions during accidents when Autopilot is enabled. He answered:
"Data is going to end up telling us this, at some point we as an industry will have statistical proof that some autonomous systems are safer when enabled than when they’re not enabled. It’s not a question of ethics, if you have several billion miles of driving, we can look at the data and say these billion miles are safer that these billion miles, clearly we want to use that system.
Just like airbags today, they can still cause some harm; there are definitely dangers to having airbags, but I think everybody understands you're much better off having airbags, and some of those dangers, than not having them at all. I believe we'll have a similar situation with autonomous driving."
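The comparison Straubel describes reduces to simple rate arithmetic across the fleet. A minimal sketch, with numbers that are entirely made up for illustration:

```python
# Minimal sketch of the fleet-level comparison Straubel describes:
# incident rates per million miles, with and without the system enabled.
# All figures below are invented for illustration only.
def incidents_per_million_miles(incidents: int, miles: float) -> float:
    return incidents / (miles / 1_000_000)


rate_enabled = incidents_per_million_miles(incidents=130, miles=1.3e9)
rate_disabled = incidents_per_million_miles(incidents=190, miles=1.3e9)

print(f"enabled:  {rate_enabled:.3f} incidents per million miles")
print(f"disabled: {rate_disabled:.3f} incidents per million miles")
# If the enabled rate is consistently lower across billions of miles,
# the data-driven argument is that the system should be on.
```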