@pch101: the problem with your case here is that you are basing it on an accident report that is fairly superficial and not necessarily accurate. We know the speed of the Tesla cited in the report is a guess (which you have never acknowledged), and I think it is highly likely that the speed of the truck is also nothing more than a guess. The drawing is clearly labeled "not to scale" and shows us nothing other than the paths of the vehicles; it gives no indication of the changes in their positions over time, so you really can't draw many conclusions from that sketch.
It is possible that the truck driver was unable to see the Tesla when he started making his turn. It is possible that he did see the car, decided he could safely complete the turn but was mistaken because the car was traveling at a higher than expected speed. It is possible that he determined that as long as the Tesla slowed down the turn would be OK. It is possible he just didn’t bother looking at all. And so on. There are scenarios where the truck driver is at fault, there are scenarios where he may not be at fault.
Your one-eyed view of the event does not admit of any of these alternative scenarios and so you simply repeat your statement and insult anyone who disagrees with you. That’s not really adding anything to the thread. I expect some sort of silly response to this post which I will ignore.
The inquiry may be able to reconstruct the events more accurately than the FHP report, but I would expect their main focus to be on the automation aspect. It's pretty clear that the system failed to interpret the input data. What is not clear is how the driver interacted or failed to interact with the system, whether the system gave any indication to the driver of a need to assume control, and, on a broader front, to what degree drivers are becoming dependent on the system and using it in ways that Tesla does not intend. If the report is going to offer anything of use, it has to look at those kinds of issues, both for the sake of drivers and for the sake of those trying to develop the systems, regardless of the degree of assistance the system provides.
Automation dependency is a significant problem in aviation (google "Children of Magenta" for an excellent presentation of this). There was a piece in the Wall Street Journal this week presenting reactions of aviation experts to the incident in question here. I am not suggesting that the systems in planes and cars are at the same stage of automation, but the human factors involved in introducing automation into automobiles will have quite a lot in common with those on the flight deck, excepting issues of crew resource management, since we don't generally divide driving responsibilities in a car.
It is generally accepted that humans don't do an especially good job of monitoring computer systems, and the sudden transition from automatic flight to manual control has caused pilots problems. It may have been a factor in the wreck in PA; it may have been a factor here. It will be interesting to read the NTSB's conclusions.
Source: http://www.thetruthaboutcars.com/2016/07/driver-fatal-florida-tesla-crash-speeding-autopilot-nhtsa-confirms/