Craig Doty II, a Tesla owner, narrowly avoided a collision after his vehicle, in Full Self-Driving (FSD) mode, allegedly steered towards an oncoming train.

Nighttime dashcam footage from earlier this month in Ohio captured the harrowing scene: Doty’s Tesla rapidly approaching a train with no apparent deceleration. He insisted his Tesla was in Full Self-Driving mode when it barreled towards the train crossing without slowing down.

  • FangedWyvern42@lemmy.world · 6 months ago

    Every couple of months there’s a new story like this. And yet we’re supposed to believe this system is ready for use…

    • Thorny_Insight@lemm.ee · 6 months ago

      In what way is it not ready to use? Do cars have some other driver-assist features that are foolproof? You’re not supposed to blindly trust any of those. Why would FSD be an exception? The standards people are applying to it are quite unreasonable.

      • assassin_aragorn@lemmy.world · 6 months ago

        You’re not supposed to blindly trust any of those. Why would FSD be an exception?

        Because that’s how Elon (and by extension Tesla) market it. Full self driving. If they’re saying I can blindly trust their product, then I expect it to be safe to blindly trust it.

        And if the fine print says I can’t blindly trust it, they need to be sued or put under legal pressure to change the term, because it’s incredibly misleading.

        • Thorny_Insight@lemm.ee · 6 months ago

          Full Self Driving (Beta), nowadays Full Self Driving (Supervised).

          Which of those names inspires enough trust to put your life in its hands?

          It’s not in fine print. It’s told to you when you purchase FSD, and the vehicle reminds you of it every single time you enable the system. If you’re looking at your phone, it starts nagging at you, eventually locking you out of the feature. Why would they put a driver-monitoring system in place if you’re supposed to put blind faith into it?

          That is such an old, beat-up strawman argument. Yes, Elon said it would be fully autonomous within a year or so, which turned out to be a lie, but nobody today is claiming it can be blindly trusted. That simply is not true.

          • assassin_aragorn@lemmy.world · 6 months ago

            Unfortunately, companies also have to make their products safe for idiots. If the system is in beta or must be supervised, there should be inherently safe design that prevents situations like this from happening even if an idiot is at the wheel.

            • Thorny_Insight@lemm.ee · 6 months ago

              ESP is not idiot-proof either, to name one such feature that’s been available for decades. It assists the driver but doesn’t replace them.

              Hell, cars themselves are not idiot proof.

          • Honytawk@lemmy.zip · 6 months ago

            It isn’t Full Self Driving if it is supervised.

            It’s especially not Full Self Driving if it asks you to intervene.

            It is false advertising at best, deadly at worst.

      • ammonium@lemmy.world · 6 months ago

        Because it’s called Full Self Drive and Musk has said it will be able to drive without user intervention?

        • Thorny_Insight@lemm.ee · 6 months ago

          It’s called Full Self Driving (Supervised)

          Yeah, it will be able to drive without driver intervention eventually. At least that’s their goal. Right now, however, it’s level 2, and no one is claiming otherwise.

          In what way is it not ready to use?

        • dream_weasel@sh.itjust.works · 6 months ago

          The naming is poor, but in no way does the car represent to you that no intervention is required. It also constantly asks you for input and even watches your eyes to make sure you pay attention.

            • dream_weasel@sh.itjust.works · 6 months ago

              Marketing besides the naming we have already established, and Elon himself masturbating to it? Is there some other marketing that pushes this narrative? Because I certainly have not seen it.

      • Holyginz@lemmy.world · 6 months ago

        No, the standards people are applying to it are the bare minimum for a full self-driving system like the one Musk claims to have.

        • Thorny_Insight@lemm.ee · 6 months ago

          It’s a level 2 self-driving system, which by definition requires driver supervision. It’s even stated in the name. What are the standards it doesn’t meet?

      • Piranha Phish@lemmy.world · 6 months ago

        It’s unreasonable for FSD to see a train? … one that’s 20 ft tall and a mile long? Am I understanding you correctly?

        Foolproof would be great, but I think most people would set the bar at least as high as not getting killed by a train.

        • Thorny_Insight@lemm.ee · 6 months ago

          Did you watch the video? It was insanely foggy there. It makes no difference how big the obstacle is if you can’t even see 50 meters ahead of you.

          Also, the car did see the train. It just clearly didn’t understand what it was or how to react to it. That’s why the car has a driver who does. I’m sure this exact edge case will be added to the training data so that this doesn’t happen again. Stuff like this takes ages to iron out. FSD is not a finished product. It’s under development, receives constant updates, and keeps improving. That’s why it’s classified as level 2 and not level 5.

          Yes. It’s unreasonable to expect brand-new technology to be able to deal with every possible scenario a car can encounter in traffic. Just because the concept of a train in fog makes sense to you as a human doesn’t mean it’s obvious to the AI.

          • Piranha Phish@lemmy.world · 6 months ago

            In what way is it not ready to use?

            To me it seems you just spent three paragraphs answering your own question.

            can’t even see 50 meters ahead

            didn’t understand what it was and how to react to it

            FSD is not a finished product. It’s under development

            doesn’t mean it’s obvious to the AI

            If I couldn’t trust a system not to drive into a train, I don’t feel like I would trust it to do even the most common tasks. I would drive the car like a fully attentive human and not delude myself into thinking the car is driving me with “FSD.”

              • Piranha Phish@lemmy.world · 6 months ago

                Completely true. And I would dictate my driving characteristics based on that fact.

                I would drive at a speed and in a manner that would allow me to not almost crash into things. Especially trains.

                • Thorny_Insight@lemm.ee · 6 months ago

                  I agree. In fact, I’m surprised the vehicle even lets you enable FSD in that kind of poor visibility, and based on the video it seemed to be going quite fast as well.

                • Thorny_Insight@lemm.ee · 6 months ago

                  Yeah, there’s a wide range of ways to map the surroundings. Road infrastructure, however, is designed for vision, so I don’t see why cameras alone wouldn’t be sufficient. The issue here is not that it didn’t see the train - it’s on video, after all - but that it didn’t know how to react to it.

    • dream_weasel@sh.itjust.works · 6 months ago

      Every couple of months you hear about an issue like this, just like you hear about every airline malfunction. It ignores the base rate of accurate performance, which is very high.

      FSD is imperfect but still probably more ready for use than a substantial fraction of human drivers.

      • lolcatnip@reddthat.com · 6 months ago

        You hear so much about the people Jeffrey Dahmer murdered, but never anything about all the people he didn’t murder!

          • lolcatnip@reddthat.com · edited · 6 months ago

            I see you’ve decided to be condescending, and also made a falsifiable claim. This is the part where you bring some actual data or STFU.

            • dream_weasel@sh.itjust.works · edited · 6 months ago

              Whatever you say, Mr. Dahmer-joke-instead-of-content. I see that was really all in good faith, and maybe I unintentionally hurt your feelings by citing a source on base-rate bias?

              What data would you like me to bring for discussion since you’ve been so open thus far? Do you want me to bring some data showing that teslas spend more time not having accidents than having accidents? I’m happy to go do some homework to enrich this interaction.

              It’s not as though you can just ask Tesla for every case of an FSD crash. The falsifiable claim is just me tossing out a number; the point is that memorable bad press and bad stats are not the same thing.

      • buddascrayon@lemmy.world · 6 months ago

        This isn’t actually true. The Tesla Full Self-Driving issues we hear about in the news are the ones that result in fatal and near-fatal accidents, but the forums are chock-full of reports from owners of the thing malfunctioning on a regular basis.

        • dream_weasel@sh.itjust.works · 6 months ago

          It IS actually true. It does goofy stuff in some situations, but on the whole it’s a little better than your typical relatively inexperienced driver. It sometimes gets it wrong about when to be assertive and when to wait, it thinks there’s enough space for a courteous merge when there isn’t (it does some Chicago-style merges), it follows the lines on the road like they’re gospel, and it doesn’t always properly estimate how to come to a smooth and comfortable stop. These are annoying things, but not outrageous, provided you’re paying attention like you’re obliged to do.

          I have it, I use it, and I make lots of reports to Tesla. It is way better than it used to be and still has plenty of room to improve, but a Tesla can’t reboot without having a disparaging article written about it.

          Also fuck elon, because I don’t think it gets said enough.

          • bane_killgrind@lemmy.ml · 6 months ago

            typical relatively inexperienced driver

            Look at the rates at which teenagers crash; this is an indictment.

            provided you are paying attention

            It was advertised as fully autonomous, dude. People wouldn’t have this much of a hard-on for trashing it if it wasn’t so oversold.

            • Thorny_Insight@lemm.ee · edited · 6 months ago

              This fully autonomous argument has been beaten to death already. Every single Tesla owner knows you’re supposed to pay attention and be ready to take over when necessary. That is such a strawman argument. Nobody blames the car when automatic braking fails to see the car in front of it. It might save your ass if you’re distracted, but ultimately it’s always the driver who’s responsible. FSD is no different.

              • Pazuzu@midwest.social · 6 months ago

                If it’s not fully capable of driving itself, then maybe they shouldn’t call it Full Self Driving.

          • buddascrayon@lemmy.world · 6 months ago

            Seriously, you sound like a Mac user in the ’90s. “It only crashes 8 or 9 times a day, it’s so much better than it used to be. It’s got so many great features that I’m willing to deal with a little inconvenience…” The difference being that when a Mac crashes, it just loses some data and has to reboot, but when a Tesla crashes, people die.

            • dream_weasel@sh.itjust.works · 6 months ago

              These are serious rate differences man.

              Every driver, and even Tesla itself, will tell you it’s a work in progress, and you’d be hard-pressed to find someone who has had an accident with it. I’d be willing to bet money that IF you find someone who has had an accident, they have a driving record that’s shitty without it too.

              If you want to talk stats, let’s talk stats, but “It seems like Tesla is in the news a lot for near crashes” is a pretty weak metric, even from your armchair.

    • darki@lemmy.world · 6 months ago

      It is ready because Musk needs it to be ready. Watch out, this comment may bring morale down, and Elron will be forced to… cry like a baby 😆

      • Buffalox@lemmy.world · 6 months ago

        Didn’t he recently claim the Tesla robotaxi is only months away?
        Well, I suppose he didn’t say how many months, but the implication was less than a year, which has been his claim every year since 2016.

        • dustyData@lemmy.world · 6 months ago

          He said that Teslas were an investment worth hundreds of thousands of dollars because owners would be able to use them as robotaxis when they weren’t using their car and charge a small fee, all by the following year… in 2019. Back then he promised 1 million robotaxis nationwide in under a year. Recently he gave the date August 8 to reveal a new robotaxi model. So, going by Cybertruck estimates, I would say a Tesla robotaxi is a possibility by late 2030.

          He is just spewing shit to keep the stock price afloat, as usual.

          • dual_sport_dork 🐧🗡️@lemmy.world · 6 months ago

            He also said they were ready to manufacture the 2nd-generation Tesla Roadster “now,” which was back in 2014. No points for guessing that, as of yet (despite taking in millions of dollars in preorders), they have not produced a single one.

            Given this very early and still quite relevant warning, I’m astounded that anyone is dumb enough to believe any promise Elon makes about anything.