Page 1 of 2 (Results 1 to 30 of 48)

Thread: Tesla autopilot not so good

  1. #1
    Join Date
    Oct 2006
    Location
    R.I. USA
    Posts
    9,870

    Tesla autopilot not so good

    Reported on CNBC today: an accident involving an autopilot that didn't recognize a tractor-trailer
    crossing the highway in front of it.
    A search found this site: Electrek
    A fatal accident involving a Tesla Model S on Autopilot and a tractor-trailer, which is just now coming to light but happened last month, prompts a preliminary evaluation by the U.S. National Highway Traffic Safety Administration (NHTSA). The evaluation will determine whether the system worked according to expectations and it is the first step before an investigation which could potentially lead to a recall.

    Tesla issued a statement regarding the accident, which you can read in full below:


    It looks like it could be the first reported death caused by a crash while the Autopilot was activated.
    More in the article. Not the last word on this.

    Dan

  2. #2
    Join Date
    Jun 2006
    Posts
    4,542
    Quote Originally Posted by danscope View Post
    Reported on CNBC today: an accident involving an autopilot that didn't recognize a tractor-trailer
    crossing the highway in front of it.
    A search found this site: Electrek
    A fatal accident involving a Tesla Model S on Autopilot and a tractor-trailer, which is just now coming to light but happened last month, prompts a preliminary evaluation by the U.S. National Highway Traffic Safety Administration (NHTSA). The evaluation will determine whether the system worked according to expectations and it is the first step before an investigation which could potentially lead to a recall.

    Tesla issued a statement regarding the accident, which you can read in full below:


    It looks like it could be the first reported death caused by a crash while the Autopilot was activated.
    More in the article. Not the last word on this.

    Dan
    Is a link missing here?

  3. #3
    Join Date
    Feb 2003
    Location
    Depew, NY
    Posts
    10,655
    It seems to be a hot story. Try this link: http://www.cnbc.com/2016/06/30/us-re...tal-crash.html
    Solfe, Dominus Maris Pavos.

  4. #4
    Join Date
    Jun 2005
    Posts
    12,460
    I can understand why the camera missed it, since apparently it was a white truck against a white sky. But I wonder why the radar didn't catch it. In any case, though, the autopilot system apparently is not intended to be fully automated.

    http://www.theverge.com/2016/6/30/12...gs-fatal-crash

    It specifically warns that it can't always detect vehicles, so the driver should always be ready to take control.
    As above, so below
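As a purely hypothetical sketch (not Tesla's actual algorithm, and all names and thresholds here are invented for illustration), this shows how a fused camera-plus-radar system could still miss a crossing trailer: a common design discards radar returns that look like stationary overhead structures (bridges, signs) to avoid false braking, and a high, flat trailer side can resemble exactly that.

```python
# Hypothetical sensor-fusion sketch. CAMERA_THRESHOLD, the radar
# field names, and the "overhead" filtering rule are all invented
# for illustration; they are not Tesla's implementation.

def should_brake(camera_conf, radar_return):
    """Decide whether to brake, given a camera confidence (0..1) and a
    radar return dict with 'range_m' and 'height_class' fields
    (or None if there is no return)."""
    CAMERA_THRESHOLD = 0.6  # illustrative value

    camera_sees_obstacle = camera_conf >= CAMERA_THRESHOLD

    # Radar clutter filter: ignore returns classified as overhead
    # structure, which is how a false-braking suppressor might work.
    radar_sees_obstacle = (
        radar_return is not None
        and radar_return["height_class"] != "overhead"
        and radar_return["range_m"] < 60.0
    )

    return camera_sees_obstacle or radar_sees_obstacle

# A white trailer against a bright sky: low camera confidence, and the
# radar return gets filtered as "overhead" clutter, so no braking.
print(should_brake(0.2, {"range_m": 40.0, "height_class": "overhead"}))  # False
print(should_brake(0.2, {"range_m": 40.0, "height_class": "vehicle"}))   # True
```

The point of the sketch is that both sensors can individually behave "as designed" (low camera confidence in glare, radar suppressing overhead clutter) and the fused decision still comes out wrong.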

  5. #5
    Join Date
    Apr 2011
    Location
    Norfolk UK and some of me is in Northern France
    Posts
    7,126
    Junctions always seem to be a hazard. Maybe the GPS should link in and warn at every junction, so that only long stretches of road are automated?
    sicut vis videre esto
    When we realize that patterns don't exist in the universe, they are a template that we hold to the universe to make sense of it, it all makes a lot more sense.
    Originally Posted by Ken G
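The idea above could be sketched as a simple geofence check. This is a hypothetical illustration only (the function names, junction list, and 300 m radius are invented): given GPS position and a map of junction coordinates, automation is permitted only when the car is far from every known junction.

```python
# Hypothetical geofence sketch of the "warn at every junction" idea.
# All names and values are illustrative, not any shipping system.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def autopilot_allowed(pos, junctions, warn_radius_m=300.0):
    """Return (allowed, nearest_m): automation is disallowed within
    warn_radius_m of any known junction, so only the long open
    stretches between junctions stay automated."""
    nearest = min(haversine_m(pos[0], pos[1], j[0], j[1])
                  for j in junctions)
    return nearest > warn_radius_m, nearest

# Illustrative coordinates only.
junctions = [(29.410, -82.540), (29.450, -82.540)]
allowed, dist = autopilot_allowed((29.410, -82.541), junctions)
print(allowed)  # False: roughly 100 m from a junction, inside the radius
```

The obvious practical weakness is map completeness: a junction missing from the database gets no warning, which is the same trust problem in a different place.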

  6. #6
    Join Date
    Feb 2003
    Location
    Depew, NY
    Posts
    10,655
    I thought Tesla's "autopilot" was more like driver assist than autonomous. Perhaps it doesn't have the ability to override the driver in extreme cases.
    Solfe, Dominus Maris Pavos.

  7. #7
    Join Date
    Jun 2002
    Posts
    1,815
    Quote Originally Posted by Jens View Post
    It specifically warns that it can't always detect vehicles so that the driver should always be ready to take control.
    So it merely issues an invitation to inattentiveness, then tells you to pay attention.

  8. #8
    Join Date
    Apr 2007
    Location
    Nowhere (middle)
    Posts
    34,619
    The thing that bothers me is that I've heard several people say "Oh, if the person had been in control it wouldn't have crashed!" ...Because human drivers have never caused car crashes before.
    "I'm planning to live forever. So far, that's working perfectly." Steven Wright

  9. #9
    Join Date
    Feb 2003
    Location
    Depew, NY
    Posts
    10,655
    It kind of sounds to me like someone placed the expectation on the "autopilot" that it could competently control the vehicle, when it's actually for assisting the driver. Perhaps the reporters.

    The car can do a few things autonomously, but outside of that range of performance, it assists the driver. The same driver in the same car previously posted a video where the car signaled him to take over and he swerved to avoid another vehicle. The "autopilot" made a sound instead of doing anything else. That isn't autonomous driving by any measure.

    I would think that in this particular situation, neither the car nor the driver could have anticipated and reacted to the crash in time. Sometimes physics happens. 99% of driving is anticipating problems and not entering that arena.
    Solfe, Dominus Maris Pavos.

  10. #10
    Join Date
    Feb 2005
    Posts
    10,862
    This was a case of dual blind spots.

    Nothing new:
    http://www.aiga.org/keep-on-truckin-with-caution/
    http://www.aiga.org/archivedmedia/ke...wide-turns.jpg

    If you can't see me in my mirror--I can't see you.

    <----PASSING SIDE
    ----> SUICIDE

    Long story made short: he with 18 wheels always has the right-of-way, even if he doesn't.

  11. #11
    Join Date
    Oct 2006
    Location
    R.I. USA
    Posts
    9,870
    It appears that you cannot abdicate your responsibility to remain "the Pilot in Command" of a vehicle.
    I think the proximity braking and speed governing are excellent technology, but they surely will not replace alert driving.

    Dan

  12. #12
    Join Date
    Apr 2016
    Location
    Nether
    Posts
    706
    There is a really big grey area between fully controlling a car and a 100% self-driving vehicle. In time these systems will improve in their safety gains and subsequently in social acceptance. Taking one small step after another, it's not impossible; we just can't get there in one go.

    For instance, cars can already park themselves fully automated, which is great for certain 'target groups', but when an accident happens during automated parking there is already an issue with responsibility, I think.
    "Downwards is the only way forwards" Cobb

    Nothing in science is proven actual

  13. #13
    Join Date
    Jun 2005
    Posts
    12,460
    Quote Originally Posted by AFJ View Post
    For instance, cars can already park themselves fully automated, which is great for certain 'target groups', but when an accident happens during automated parking there is already an issue with responsibility, I think.
    Of course there is an issue with responsibility. But there always is, which is why we have courts and juries.
    As above, so below

  14. #14
    Join Date
    Oct 2006
    Location
    R.I. USA
    Posts
    9,870
    If you press a button to park your car, you STILL need to visually clear yourself all around first. Using the robot isn't going to feed the bulldog in a court of law.

  15. #15
    Join Date
    Jun 2005
    Posts
    12,460
    Quote Originally Posted by Solfe View Post
    It kind of sounds to me like someone placed the expectation on the "autopilot" that it could competently control the vehicle, when it's actually for assisting the driver. Perhaps the reporters.
    It may have been the driver. I don't think it's been determined yet, but there are reports that the driver might have been watching a movie at the time of the crash. If so, then there is a problem that people might think the technology is better than it really is. It's still a "beta" system.
    As above, so below

  16. #16
    Join Date
    Jun 2005
    Posts
    12,460
    Quote Originally Posted by danscope View Post
    If you press a button to park your car, you STILL need to visually clear yourself all around first. Using the robot isn't going to feed the bulldog in a court of law.
    Well, there was a crash of a driverless train in Germany back in 2006. In that case the courts found two line dispatchers guilty of negligence. So courts will look at what happened and decide who--the manufacturer, the operator, the municipality--bears responsibility for an accident.
    As above, so below

  17. #17
    Join Date
    Feb 2009
    Posts
    511
    Quote Originally Posted by Jens View Post
    It may have been the driver. I don't think it's been determined yet, but there are reports that the driver might have been watching a movie at the time of the crash. If so, then there is a problem that people might think the technology is better than it really is. It's still a "beta" system.
    Plus, beta or not, it's a driving assist system, not a fully automated car. Autopilot may be a bad name for it.

  18. #18
    Join Date
    Dec 2002
    Location
    Nashville, TN
    Posts
    1,657
    Quote Originally Posted by Elukka View Post
    ...Autopilot may be a bad name for it.
    When a commercial pilot is flying an aircraft with autopilot engaged, he is still supposed to be constantly alert and ready to take over in an instant. He is supposed to maintain awareness of different flight regimes and relative risk factors -- even on autopilot. For example when making an autolanding he knows this phase entails higher risk, and he'd be extremely vigilant. The pilot flying will not be engrossed in a movie on his iPad. Similarly a Tesla on autopilot traveling through an intersection is at higher risk. If the driver of the recent Tesla fatal accident had been behaving like a pilot does when using autopilot, the accident probably would not have happened. In this sense "autopilot" is a good term for the current Tesla system. Just as in aviation, "autopilot" does not imply a fully autonomous vehicle with no human controls. It means there is limited automation that requires continued awareness and prudent decision making from the driver/pilot.

  19. #19
    Join Date
    Oct 2006
    Location
    R.I. USA
    Posts
    9,870
    Hi Joema , Well said .

  20. #20
    Join Date
    Nov 2010
    Posts
    5,561
    From what I understand this is the first fatality associated with the Tesla autopilot in 130 million miles of use, so maybe the more important question is how many accidents have been avoided and lives saved already.

    It also looks like commercial trucks are going to go fully autonomous before passenger vehicles; this technology is maturing very quickly.

    https://www.trucks.com/2016/05/17/go...s-trucks-kits/
    "Back off man, I'm a Scientist!"- Peter Venkman, PhD in Psychology and Parapsychology

  21. #21
    Join Date
    Jul 2015
    Location
    Castle Valley, Utah
    Posts
    155
    Quote Originally Posted by starcanuck64 View Post

    It also looks like commercial trucks are going to go fully autonomous before passenger vehicles, this technology is becoming mature very quickly.

    https://www.trucks.com/2016/05/17/go...s-trucks-kits/
    From the above link...



    "The researchers are discovering that the development of self-driving trucks provides a greater challenge because of their greater mass and inertia."

    “When you do this for heavy trucks, it’s different because of weight and dynamics,” Bo Wahlberg, a professor at the Royal Institute of Technology, told Trucks.com. “Trucks are much more complicated—much more difficult and more dangerous.”



    It would be hard enough trying to build one of these things from the ground up; bolting a bunch of aftermarket hardware onto an existing rig seems to me, a veteran trucker, a recipe for disaster.
    Last edited by Lucretius; 2016-Jul-05 at 02:26 PM.
    You're a ghost driving a meat-covered skeleton made of stardust, riding a rock hurtling through space. Fear nothing.

  22. #22
    Join Date
    Oct 2001
    Location
    The Space Coast
    Posts
    3,999
    Quote Originally Posted by joema View Post
    When a commercial pilot is flying an aircraft with autopilot engaged, he is still supposed to be constantly alert and ready to take over in an instant. He is supposed to maintain awareness of different flight regimes and relative risk factors -- even on autopilot. For example when making an autolanding he knows this phase entails higher risk, and he'd be extremely vigilant. The pilot flying will not be engrossed in a movie on his iPad. Similarly a Tesla on autopilot traveling through an intersection is at higher risk. If the driver of the recent Tesla fatal accident had been behaving like a pilot does when using autopilot, the accident probably would not have happened. In this sense "autopilot" is a good term for the current Tesla system. Just as in aviation, "autopilot" does not imply a fully autonomous vehicle with no human controls. It means there is limited automation that requires continued awareness and prudent decision making from the driver/pilot.
    Except, isn't this the very problem that contributed to airline crashes like Air France 447? The crew wasn't watching a movie, per se, but due to the "autopilot" handling everything, when multiple things started going wrong, they couldn't get their minds up to speed fast enough to deal with it all. I think there's only so "vigilant" one can stay over the course of a drive (particularly a long one) if you're not actively driving.

    CJSF
    "A scientific theory
    Isn't just a hunch or guess
    It's more like a question
    That's been put through a lot of tests
    And when a theory emerges
    Consistent with the facts
    The proof is with science
    The truth is with science"
    -They Might Be Giants, "Science Is Real"


    lonelybirder.org

  23. #23
    Join Date
    Nov 2010
    Posts
    5,561
    Quote Originally Posted by Lucretius View Post
    From the above link...



    "The researchers are discovering that the development of self-driving trucks provides a greater challenge because of their greater mass and inertia."

    “When you do this for heavy trucks, it’s different because of weight and dynamics,” Bo Wahlberg, a professor at the Royal Institute of Technology, told Trucks.com. “Trucks are much more complicated—much more difficult and more dangerous.”



    It would be hard enough trying to build one of these things from the ground up; bolting a bunch of aftermarket hardware onto an existing rig seems to me, a veteran trucker, a recipe for disaster.
    It's certainly going to be a challenge. One development that may make it less chaotic is the ability to join multiple trucks into a closely spaced train, which gives less wind resistance and possibly an easier control problem. And as with passenger aircraft, which are becoming progressively more automated, there will likely be drivers in semis for years.
    "Back off man, I'm a Scientist!"- Peter Venkman, PhD in Psychology and Parapsychology
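The closely spaced "train" described above is usually called platooning, and a common way to frame the follower's control problem is a constant-time-gap controller. The sketch below is hypothetical (the gains and the 0.5 s gap are invented illustration, not any vendor's controller): each follower accelerates toward a spacing proportional to its own speed while matching the leader's speed.

```python
# Hypothetical constant-time-gap platoon follower. Gains kp, kv and
# the 0.5 s time gap are illustrative values, not a real controller.

def follower_accel(gap_m, own_speed, lead_speed, time_gap_s=0.5,
                   kp=0.5, kv=1.0):
    """Return a commanded acceleration (m/s^2): close the error between
    the actual gap and the desired gap (time_gap_s * own_speed), and
    match the lead truck's speed. All quantities in SI units."""
    desired_gap = time_gap_s * own_speed
    gap_error = gap_m - desired_gap          # positive: too far back
    speed_error = lead_speed - own_speed     # positive: leader pulling away
    return kp * gap_error + kv * speed_error

# At 25 m/s with a 0.5 s time gap, desired spacing is only 12.5 m.
print(follower_accel(20.0, 25.0, 25.0))  # 3.75: accelerate to close the gap
print(follower_accel(12.5, 25.0, 23.0))  # -2.0: leader slowing, back off
```

A 0.5 s gap is far below human reaction time, which is why platooning depends on vehicle-to-vehicle links rather than drivers reacting to brake lights; it also shows why the greater mass and braking distance of trucks, mentioned earlier in the thread, makes the tuning harder than for cars.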

  24. #24
    Join Date
    Jun 2005
    Posts
    12,460
    Quote Originally Posted by CJSF View Post
    Except, isn't this the very problem that contributed to airline crashes like Air France 447? The crew wasn't watching a movie, per se, but due to the "autopilot" handling everything, when multiple things started going wrong, they couldn't get their minds up to speed fast enough to deal with it all. I think there's only so "vigilant" one can stay over the course of a drive (particularly a long one) if you're not actively driving.

    CJSF
    I think the problem with that flight was that the speed indicators were messed up, which caused the autopilot to disengage, and then the pilots failed to handle the problem correctly. They may have been inadequately trained, but the problem would have existed with or without the autopilot. A better example of the danger of autopilot may be China Airlines Flight 140.
    As above, so below

  25. #25
    Join Date
    Jan 2005
    Location
    Anzakistan
    Posts
    10,638
    Quote Originally Posted by Solfe View Post
    I thought Tesla's "autopilot" was more like driver assist than autonomous. Perhaps it doesn't have the ability to override the driver in extreme cases.
    Yes.

    In "our" news today: http://www.stuff.co.nz/motoring/news...y-crash-itself

    TL;DR : it wasn't really autopilot.
    Measure once, cut twice. Practice makes perfect.

  26. #26
    Join Date
    Oct 2006
    Location
    R.I. USA
    Posts
    9,870
    That sounds more like it .

  27. #27
    Join Date
    Dec 2002
    Location
    Nashville, TN
    Posts
    1,657
    Quote Originally Posted by CJSF View Post
    Except, isn't this the very problem that contributed to airline crashes like Air France 447? The crew wasn't watching a movie, per se, but due to the "autopilot" handling everything, when multiple things started going wrong, they couldn't get their minds up to speed fast enough to deal with it all. I think there's only so "vigilant" one can stay over the course of a drive (particularly a long one) if you're not actively driving.
    CJSF
    The AF447 voice and data recorders indicate the crew was highly alert -- they were skirting a thunderstorm. It was not a lack of vigilance. Rather when airspeed data was lost and the flight control system degraded to "alternate law", there was a complete breakdown of crew coordination and communication. The copilot pulled full back on his sidestick and stalled the aircraft, then held it in a stall with mostly full back sidestick all the way down -- for three minutes. The Airbus controls are not physically linked so the captain did not notice this at first. Then when he noticed it, the copilot refused to yield control. This had nothing to do with the autopilot.

  28. #28
    Join Date
    Oct 2001
    Location
    The Space Coast
    Posts
    3,999
    I'm probably conflating AF447 with another crash, then. I recall quite a few analyses of aircrews that, when suddenly faced with a convergence of warnings and unexpected behaviors, were found to be too slow to adapt and assess the situation, and that training was later changed to try to keep crews sharp and able to respond. The reliance on autopilot was cited, if I recall (which I obviously don't recall too well, it would seem).

    Oh well, thanks for the clarification.

    CJSF
    "A scientific theory
    Isn't just a hunch or guess
    It's more like a question
    That's been put through a lot of tests
    And when a theory emerges
    Consistent with the facts
    The proof is with science
    The truth is with science"
    -They Might Be Giants, "Science Is Real"


    lonelybirder.org

  29. #29
    Join Date
    Feb 2008
    Posts
    381
    Quote Originally Posted by CJSF View Post
    .... I think there's only so "vigilant" one can stay over the course of a drive (particularly a long one) if you're not actively driving.

    CJSF
    Yes the vigilance problem is a tricky one. I'm a fan of the Tesla but don't own one yet.

    In my fevered imagination I'm driving the thing along the road on autopilot, but it has occurred to me that it's unlikely I will be able to sustain vigilance without the constant prompting of needing to adjust speed and lane position etc. that normal driving entails.

    I suppose NOT watching a DVD while doing it would be a good start...

    I don't know if it's just me... but I have found that even with just a cruise control managing the speed, I tend to do essentially silly things.... For instance if the traffic slows ahead I take no action hoping that it clears enough to still maintain that cruise speed. I end up much closer than I would normally be to the car ahead, simply because I was reluctant to intervene.

    Mostly my intervention would be a frantic clicking downward of the speed control hoping to get it to cruise at the new lower speed without having to disengage the system by braking. There is no rational reason for this behaviour of mine... it seems that after engaging the tech solution to speed control I am reluctant to abort the process.

    Although it doesn't directly translate to the Tesla system, I suspect that I will be similarly reluctant to intervene with it also when I have it engaged. Obviously when a truck has filled the windscreen I'll be abandoning my fascination with the tech in favour of a robust braking procedure... but will it be too late then because I was hoping for too long that the radar collision avoidance would do it for me?

    I think there is a human factor involved here that will be hard to mitigate until the systems achieve full autonomy.... until then people will probably continue to do non-rational silly things with their toy.

  30. #30
    Join Date
    Jan 2005
    Location
    Anzakistan
    Posts
    10,638
    Quote Originally Posted by CJSF View Post
    I'm probably conflating AF447 with another crash, then. I recall quite a few analyses of aircrews that, when suddenly faced with a convergence of warnings and unexpected behaviors, were found to be too slow to adapt and assess the situation, and that training was later changed to try to keep crews sharp and able to respond. The reliance on autopilot was cited, if I recall (which I obviously don't recall too well, it would seem).

    Oh well, thanks for the clarification.

    CJSF
    Sounds a bit like this might be the one you were thinking of: https://en.wikipedia.org/wiki/China_Airlines_Flight_006 (Or at least, similar).

    I know of it from a Discovery channel documentary I saw last weekend. The T.V. show seemed to make much of the pilots putting too much effort into the engine that was out, and too little on actually flying the plane (leaving it to autopilot, which struggled with the situation).

    (Experts interviewed on these shows have often said "flying the plane is first priority, solving problems is second".)
    Measure once, cut twice. Practice makes perfect.
