Thursday, July 7, 2016

Tesla responds to Fortune about the fatal crash: you could have asked first

Ever since the fatal accident on May 7th in Florida involving 40-year-old Ohio resident Joshua Brown, there has been a barrage of negative media reports blaming both the victim and the Autopilot of his Tesla Model S for the crash. Some claimed the crash happened because the driver was watching a Harry Potter DVD, while others blamed the Autopilot for not steering away from the tractor-trailer that was crossing the highway. You can read more about the accident in our previous article.

One of the more striking reports came from Fortune, which accused Tesla and its CEO Elon Musk of hiding the news of the fatal accident when the company raised $1.46 billion in fresh capital through the sale of 6.8 million new common shares on May 18th.
On May 18, eleven days after Brown died, Tesla and CEO Elon Musk, in combination (roughly three parts Tesla, one part Musk), sold more than $2 billion of Tesla stock in a public offering at a price of $215 per share—and did it without ever having released a word about the crash. 
To put things baldly, Tesla and Musk did not disclose the very material fact that a man had died while using an auto-pilot technology that Tesla had marketed vigorously as safe and important to its customers.
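As an aside, the numbers hang together. Here is a quick arithmetic check using only the figures cited above; note that the size of Musk's portion is inferred from Fortune's combined total, not a reported figure:

```python
# Quick consistency check of the offering figures cited above
# (illustrative; the implied size of Musk's sales is an inference).

OFFER_PRICE = 215        # USD per share, per Fortune
TESLA_SHARES = 6.8e6     # new common shares sold by Tesla

tesla_proceeds = TESLA_SHARES * OFFER_PRICE
print(f"Tesla's portion: ${tesla_proceeds / 1e9:.2f}B")  # ~$1.46B, as reported

# Fortune's combined figure of "more than $2 billion" implies
# Musk's concurrent sales were at least roughly:
COMBINED_FLOOR = 2.0e9
musk_shares = (COMBINED_FLOOR - tesla_proceeds) / OFFER_PRICE
print(f"Musk's implied sales: at least ~{musk_shares / 1e6:.1f}M shares")
```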
The magazine went on to claim it reached out to a Tesla representative on July 4th to get the company's side of the story before publishing the article the next day. However, according to Tesla, Fortune never actually asked the relevant questions and refused to wait a day for the facts to be confirmed.
When Fortune contacted Tesla for comment on this story during the July 4th holiday, Fortune never asked any of these questions and instead just made assumptions. Tesla asked Fortune to give it a day to confirm these facts before it rushed its story to print. They declined and instead ran a misleading article.
Fortune magazine went on to publish a second article, pointing to the "semi-self driving autopilot" as the cause of the accident.
[Fortune's articles] assume that this accident was caused by an Autopilot failure. To be clear, this accident was the result of a semi-tractor trailer crossing both lanes of a divided highway in front of an oncoming car. Whether driven under manual or assisted mode, this presented a challenging and unexpected emergency braking scenario for the driver to respond to. In the moments leading up to the collision, there is no evidence to suggest that Autopilot was not operating as designed and as described to users: specifically, as a driver assistance system that maintains a vehicle's position in lane and adjusts the vehicle's speed to match surrounding traffic. 
The company was forced to issue a statement yesterday about Fortune's accusations after the magazine published multiple follow-up articles capitalizing on the spat between its editor Alan Murray and Elon Musk on Twitter. Tesla claims that Fortune never asked the company why the news about the incident was not made public before the stock offering occurred.
To summarize Tesla's timeline: the company informed NHTSA of the incident on May 16th, nine days after the crash, when its investigation had barely started. Because of the nature of the accident, a Tesla investigator had to fly to Florida to inspect the car and the crash site and pull the vehicle's logs. The investigator finished reviewing the logs by the end of May, days after the stock offering took place.

Tesla also accused Fortune of failing to acknowledge that there have been zero confirmed fatalities over the more than 130 million miles its customers worldwide have driven with Autopilot activated. According to Tesla, driving with Autopilot engaged is statistically safer than driving without it.
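For context, a minimal sketch of the arithmetic behind that claim, assuming Tesla's 130-million-mile figure and the roughly 94 million miles per road fatality that Tesla has cited as the US average (both figures are Tesla's, and a single data point supports only a rough comparison):

```python
# Rough fatality-rate comparison using the figures Tesla cites.
# Illustrative only: one fatality is far too small a sample for
# firm statistical conclusions.

AUTOPILOT_MILES = 130e6       # miles driven with Autopilot engaged
AUTOPILOT_FATALITIES = 1      # the single confirmed fatality to date

US_MILES_PER_FATALITY = 94e6  # approximate US average cited by Tesla

autopilot_rate = AUTOPILOT_MILES / AUTOPILOT_FATALITIES
print(f"Autopilot: one fatality per {autopilot_rate / 1e6:.0f}M miles")
print(f"US average: one fatality per {US_MILES_PER_FATALITY / 1e6:.0f}M miles")
```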
During a Q&A at OCE Discovery in Toronto last month, Tesla CTO JB Straubel made the case that the safety benefits of using Autopilot outweigh the risks.
Data is going to end up telling us this at some point. We, as an industry, will have statistical proof that some autonomous systems are safer when enabled than when they're not enabled. It's not a question of ethics: if you have several billion miles of driving, we can look at the data and say these billion miles are safer than those billion miles, and clearly we want to use that system. Just like airbags today: they can still cause some harm, there are definitely dangers to having airbags, but I think everybody understands you're much better off having airbags, and some of those dangers, than not having them at all. I believe we'll have a similar situation with autonomous driving.
Separately, NHTSA announced yesterday that it is investigating a July 1st crash of a Tesla Model X in Pennsylvania to determine whether automated functions were in use at the time of the accident. State police said the Model X struck a turnpike guardrail, then veered across several traffic lanes and into the median, where it landed on its roof in the middle of the roadway. The driver and a passenger were injured, Reuters reported. In a statement, Tesla Motors said, "Based on the information we have now, we have no reason to believe that Autopilot had anything to do with this accident."

You can read Tesla's full statement below:
First, Fortune mischaracterizes Tesla's SEC filing. Here is what Tesla's SEC filing actually says: "We may become subject to product liability claims, which could harm our financial condition and liquidity if we are not able to successfully defend or insure against such claims." [full text included below] This is just stating the obvious. One of the risks facing Tesla (or any company) is that someone could bring product liability claims against it. However, neither at the time of this SEC filing, nor in the several weeks to date, has anyone brought a product liability claim against Tesla relating to the crash in Florida. 
Next, Fortune entirely ignores what Tesla knew and when, nor have they even asked the questions. Instead, they simply assume that Tesla had complete information from the moment this accident occurred. This was a physical impossibility given that the damage sustained by the Model S in the crash limited Tesla's ability to recover data from it remotely. 
When Tesla told NHTSA about the accident on May 16th, we had barely started our investigation. Tesla informed NHTSA because it wanted to let NHTSA know about a death that had taken place in one of its vehicles. It was not until May 18th that a Tesla investigator was able to go to Florida to inspect the car and the crash site and pull the complete vehicle logs from the car, and it was not until the last week of May that Tesla was able to finish its review of those logs and complete its investigation. When Fortune contacted Tesla for comment on this story during the July 4th holiday, Fortune never asked any of these questions and instead just made assumptions. Tesla asked Fortune to give it a day to confirm these facts before it rushed its story to print. They declined and instead ran a misleading article. 
Here's what we did know at the time of the accident and subsequent filing:
That Tesla Autopilot had been safely used in over 100 million miles of driving by tens of thousands of customers worldwide, with zero confirmed fatalities and a wealth of internal data demonstrating safer, more predictable vehicle control performance when the system is properly used.
That contrasted against worldwide accident data, customers using Autopilot are statistically safer than those not using it at all. 
That given its nature as a driver assistance system, a collision on Autopilot was a statistical inevitability, though by this point, not one that would alter the conclusion already borne out over millions of miles that the system provided a net safety benefit to society. 
Given the fact that the "better-than-human" threshold had been crossed and robustly validated internally, news of a statistical inevitability did not materially change any statements previously made about the Autopilot system, its capabilities, or net impact on roadway safety. 
Finally, the Fortune article makes two other false assumptions. First, they assume that this accident was caused by an Autopilot failure. To be clear, this accident was the result of a semi-tractor trailer crossing both lanes of a divided highway in front of an oncoming car. Whether driven under manual or assisted mode, this presented a challenging and unexpected emergency braking scenario for the driver to respond to. In the moments leading up to the collision, there is no evidence to suggest that Autopilot was not operating as designed and as described to users: specifically, as a driver assistance system that maintains a vehicle's position in lane and adjusts the vehicle's speed to match surrounding traffic. 
Fortune never even addresses that point. Second, Fortune assumes that, putting all of these other problems aside, a single accident involving Autopilot, regardless of how many accidents Autopilot has stopped and how many lives it has saved, is material to Tesla's investors. On the day the news broke about NHTSA's decision to initiate a preliminary evaluation into the incident, Tesla's stock traded up, not down, confirming that not only did our investors know better, but that our own internal assessment of the performance and risk profile of Autopilot were in line with market expectations.
The bottom line is that Fortune jumped the gun on a story before they had the facts. They then sought wrongly to defend that position by plucking boilerplate language from SEC filings that have no bearing on what happened, while failing to correct or acknowledge their original omissions and errors.
Featured headline image: Reuters.

2 comments:

  1. Someone died while riding a bike... it is the same nonsense... the keychain incident killed more people, as did other cases where brakes did not work or cars caught fire... But remember, such things as bad publicity do not exist...

  2. It feels very painful to read about this accident involving the Quality Trailers! Tesla's Autopilot isn't working 100%, yet according to their update information it is clearly described as working with 100% accuracy.
    Thanks for sharing!
