Craig Doty II, a Tesla owner, narrowly avoided a collision after his vehicle, in Full Self-Driving (FSD) mode, allegedly steered towards an oncoming train.

Nighttime dashcam footage from earlier this month in Ohio captured the harrowing scene: Doty’s Tesla rapidly approaching a rail crossing with no apparent deceleration. He insists the car was in Full Self-Driving mode as it barreled toward the passing train without slowing down.

  • Akasazh@feddit.nl

    I don’t see any information about the crossing. Was it one without gates? The sensors must’ve picked those up when driving towards it. If so, it’s a huge oversight not to put up gated crossings nowadays, certainly on busy roads, regardless of the performance of self-driving cars.

    • AmidFuror@fedia.io

      So sick of people referring to “Do not ram train” mode. You see it all over social media, but especially Lemmy. It’s “Do not ram train (Supervised)” mode, and you’d have to be living under a rock for the last 5+ years to think you don’t have to actually take control of the wheel to stop it from ramming a train.

  • werefreeatlast@lemmy.world

    Oh! As a token of ah…of…aah… a knowledge mental acknowledgement, we the US people would like to gift this here Tesla to you all, Putin, and Iran leadership. You get a Tesla and you get a Tesla…and you get a Tesla!

  • FangedWyvern42@lemmy.world

    Every couple of months there’s a new story like this. And yet we’re supposed to believe this system is ready for use…

    • dream_weasel@sh.itjust.works

      Every couple of months you hear about an issue like this, just like you hear about every airline malfunction. That ignores the base rate of accurate performance, which is very high.

      FSD is imperfect but still probably more ready for use than a substantial fraction of human drivers.
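The base-rate point can be sketched with a few lines of Python. All of the numbers below are made up for illustration; they are not actual Tesla or airline statistics:

```python
# Illustrative only: hypothetical numbers showing how rare failures can
# dominate news coverage even when the per-trip failure rate is tiny.
def expected_reported_stories(trips, failure_rate,
                              report_prob_failure, report_prob_success):
    """Expected count of reported failure vs. success stories."""
    failures = trips * failure_rate
    successes = trips * (1 - failure_rate)
    return failures * report_prob_failure, successes * report_prob_success

# Hypothetical: a million trips, a 1-in-100k failure rate, every failure
# makes the news, essentially no uneventful trip does.
fail_stories, ok_stories = expected_reported_stories(
    trips=1_000_000, failure_rate=1e-5,
    report_prob_failure=1.0, report_prob_success=1e-7)
print(fail_stories, ok_stories)  # ~10 failure stories vs ~0.1 success stories
```

Under these (assumed) rates, the news feed is nearly all failures even though 99.999% of trips were uneventful, which is the base-rate bias being described.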

      • buddascrayon@lemmy.world

        This isn’t actually true. The Tesla full self driving issues we hear about in the news are the ones that result in fatal and near fatal accidents, but the forums are chock full of reports from owners of the thing malfunctioning on a regular basis.

        • dream_weasel@sh.itjust.works

          It IS actually true. It does goofy stuff in some situations, but on the whole is a little better than your typical relatively inexperienced driver. It gets it wrong about when to be assertive and when to wait sometimes, it thinks there’s enough space for a courteous merge but there isn’t (it does some Chicago style merges sometimes), it follows the lines on the road like they are gospel, and doesn’t always properly estimate how to come to a smooth and comfortable stop. These are annoying things, but not outrageous provided you are paying attention like you’re obliged to do.

          I have it, I use it, and I make lots of reports to Tesla. It is way better than it used to be and still has plenty of room to improve, but a Tesla can’t reboot without having a disparaging article written about it.

          Also fuck elon, because I don’t think it gets said enough.

          • bane_killgrind@lemmy.ml

            typical relatively inexperienced driver

            Look at the rates at which teenagers crash; this is an indictment.

            provided you are paying attention

            It was advertised as fully autonomous dude. People wouldn’t have this much of a hard-on for trashing it if it wasn’t so oversold.

            • Thorny_Insight@lemm.ee

              This fully autonomous argument is beat to death already. Every single Tesla owner knows you’re supposed to pay attention and be ready to take over when necessary. That is such a strawman argument. Nobody blames the car when automatic braking fails to see the car in front of it. It might save your ass if you’re distracted, but ultimately it’s always the driver who’s responsible. FSD is no different.

              • Pazuzu@midwest.social

                If it’s not fully capable of self driving then maybe they shouldn’t call it full self driving

          • buddascrayon@lemmy.world

            Seriously you sound like a Mac user in the '90s. “It only crashes 8 or 9 times a day, it’s so much better than it used to be. It’s got so many great features that I’m willing to deal with a little inconvenience…” Difference being that when a Mac crashes it just loses some data and has to reboot but when a Tesla crashes people die.

            • dream_weasel@sh.itjust.works

              These are serious rate differences man.

              Every driver, and even Tesla, will tell you it’s a work in progress, and you’d be hard pressed to find someone who has had an accident with it. I’d be willing to bet money that if you find someone who has had an accident, they have a driving record that’s shitty without it too.

              If you want to talk stats, let’s talk stats, but “It seems like Tesla is in the news a lot for near crashes” is a pretty weak metric, even from your armchair.

      • lolcatnip@reddthat.com

        You hear so much about the people Jeffrey Dahmer murdered, but never anything about all the people he didn’t murder!

          • lolcatnip@reddthat.com

            I see you’ve decided to be condescending, and also made a falsifiable claim. This is the part where you bring some actual data or STFU.

            • dream_weasel@sh.itjust.works

              Whatever you say, Mr. Dahmer-joke-instead-of-content. I see that was all in good faith, and maybe I unintentionally hurt your feelings by citing a source on base-rate biases?

              What data would you like me to bring for discussion since you’ve been so open thus far? Do you want me to bring some data showing that teslas spend more time not having accidents than having accidents? I’m happy to go do some homework to enrich this interaction.

              It’s not as though you can just ask Tesla for every case of an FSD crash. The falsifiable claim is just me tossing a number, the point is that memorable bad press and bad stats are not the same.

    • Thorny_Insight@lemm.ee

      In what way is it not ready to use? Do cars have some other driver-assist features that are foolproof? You’re not supposed to blindly trust any of those. Why would FSD be an exception? The standards people are applying to it are quite unreasonable.

      • Piranha Phish@lemmy.world

        It’s unreasonable for FSD to see a train? … that’s 20ft tall and a mile long? Am I understanding you correctly?

        Foolproof would be great, but I think most people would set the bar at least as high as not getting killed by a train.

        • Thorny_Insight@lemm.ee

          Did you watch the video? It was insanely foggy there. It makes no difference how big the obstacle is if you can’t even see 50 meters ahead of you.

          Also, the car did see the train. It just clearly didn’t understand what it was and how to react to it. That’s why the car has a driver who does. I’m sure this exact edge case will be added to the training data so that this doesn’t happen again. Stuff like this takes ages to iron out. FSD is not a finished product. It’s under development and receives constant updates and keeps improving. That’s why it’s classified as level 2 and not level 5.

          Yes. It’s unreasonable to expect brand-new technology to be able to deal with every possible scenario a car can encounter in traffic. Just because the concept of a train in fog makes sense to you as a human doesn’t mean it’s obvious to the AI.

          • Piranha Phish@lemmy.world

            In what way is it not ready to use?

            To me it seems you just spent three paragraphs answering your own question.

            can’t even see 50 meters ahead

            didn’t understand what it was and how to react to it

            FSD is not a finished product. It’s under development

            doesn’t mean it’s obvious to the AI

            If I couldn’t trust a system not to drive into a train, I don’t feel like I would trust it to do even the most common tasks. I would drive the car like a fully attentive human and not delude myself into thinking the car is driving me with “FSD.”

              • Piranha Phish@lemmy.world

                Completely true. And I would dictate my driving characteristics based on that fact.

                I would drive at a speed and in a manner that would allow me to not almost crash into things. But especially trains.

                • Thorny_Insight@lemm.ee

                  I agree. In fact I’m surprised the vehicle even lets you enable FSD in that kind of poor visibility, and based on the video it seemed to be going quite fast as well.

                • Thorny_Insight@lemm.ee

                  Yeah, there’s a wide range of ways to map the surroundings. Road infrastructure, however, is designed for vision, so I don’t see why cameras alone wouldn’t be sufficient. The issue here is not that it didn’t see the train - it’s on video, after all - but that it didn’t know how to react to it.

      • ammonium@lemmy.world

        Because it’s called Full Self Drive and Musk has said it will be able to drive without user intervention?

        • dream_weasel@sh.itjust.works

          The naming is poor, but in no way does the car represent to you that no intervention is required. It also constantly asks you for input and even watches your eyes to make sure you pay attention.

            • dream_weasel@sh.itjust.works

              Marketing besides the naming we have already established and Elon himself masturbating to it? Is there some other marketing that pushes this narrative, because I certainly have not seen it.

        • Thorny_Insight@lemm.ee

          It’s called Full Self Driving (Supervised)

          Yeah, it will be able to drive without driver intervention eventually. At least that’s their goal. Right now, however, it’s level 2 and no one is claiming otherwise.

          In what way is it not ready to use?

      • assassin_aragorn@lemmy.world

        You’re not supposed to blindly trust any of those. Why would FSD be an exception?

        Because that’s how Elon (and by extension Tesla) market it. Full self driving. If they’re saying I can blindly trust their product, then I expect it to be safe to blindly trust it.

        And if the fine print says I can’t blindly trust it, they need to be sued or put under legal pressure to change the term, because it’s incredibly misleading.

        • Thorny_Insight@lemm.ee

          Full Self Driving (Beta), nowadays Full Self Driving (Supervised)

          Which of those names inspires enough trust to put your life in its hands?

          It’s not in fine print. It’s told to you when you purchase FSD and the vehicle reminds you of it every single time you enable the system. If you’re looking at your phone it starts nagging at you eventually locking you out of the feature. Why would they put driver monitoring system in place if you’re supposed to put blind faith into it?

          That is such an old, beat up strawman argument. Yes, Elon has said it would be fully autonomous in a year or so which turned out to be a lie but nobody today is claiming it can be blindly trusted. That simply just is not true.

          • Honytawk@lemmy.zip

            It isn’t Full Self Driving if it is supervised.

            It’s especially not Full Self Driving if it asks you to intervene.

            It is false advertising at best, deadly at worst.

          • assassin_aragorn@lemmy.world

            Unfortunately, companies also have to make their products safe for idiots. If the system is in beta or must be supervised, there should be inherently safe design that prevents situations like this from happening even if an idiot is at the wheel.

            • Thorny_Insight@lemm.ee

              ESP isn’t idiot-proof either, just to name one such feature that’s been available for decades. It assists the driver but doesn’t replace them.

              Hell, cars themselves are not idiot proof.

      • Holyginz@lemmy.world

        No, the standards people are applying to it are the bare minimum for a full self driving system like what musk claims.

        • Thorny_Insight@lemm.ee

          It’s a level 2 self driving system which by definition requires driver supervision. It’s even stated in the name. What are the standards it doesn’t meet?
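For reference, the automation levels this comment invokes come from SAE J3016. A small lookup, paraphrased from the standard (the helper function is just for illustration):

```python
# SAE J3016 driving-automation levels, paraphrased. At Level 2 the human
# must supervise continuously; only from Level 3 up may attention shift.
SAE_LEVELS = {
    0: ("No Automation", "human drives; features only warn or momentarily assist"),
    1: ("Driver Assistance", "steering OR speed assisted; human supervises"),
    2: ("Partial Automation", "steering AND speed assisted; human supervises at all times"),
    3: ("Conditional Automation", "system drives in its domain; human must take over on request"),
    4: ("High Automation", "system drives in its domain; no takeover expected"),
    5: ("Full Automation", "system drives everywhere; no human needed"),
}

def requires_constant_supervision(level: int) -> bool:
    """True for levels where the human driver must monitor continuously."""
    return level <= 2

print(requires_constant_supervision(2))  # True - FSD's current classification
```

The dispute in the thread is essentially whether a product named "Full Self Driving" can fairly sit at Level 2 in this table.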

    • darki@lemmy.world

      It is ready because Musk needs it to be ready. Watch out, this comment may bring the morale down, and Elron will be forced to … Cry like a baby 😆

      • Buffalox@lemmy.world

        Didn’t he recently claim Tesla robotaxi is only months away?
        Well I suppose he didn’t say how many months, but the implication was less than a year, which has been his claim every year since 2016.

        • dustyData@lemmy.world

          He said that Teslas were an investment worth hundreds of thousands of dollars because owners would be able to use them as robot taxis when they weren’t using their car and charge a small fee by next year…in 2019. Back then he promised 1 million robot taxis nationwide in under a year. Recently he gave the date August 8 to reveal a new model of robot taxi. So, by Cybertruck estimates, I would say a Tesla robot taxi is a possibility by late 2030.

          He is just spewing shit to keep the stock price afloat, as usual.

          • dual_sport_dork 🐧🗡️@lemmy.world

            He also said they were ready to manufacture the 2nd-generation Tesla Roadster “now,” which was back in 2014. No points for guessing that as of yet (despite taking in millions of dollars in preorders) they have not produced a single one.

            Given this very early and still quite relevant warning, I’m astounded that anyone is dumb enough to believe any promise Elon makes about anything.

  • cestvrai@lemm.ee

    As a frequent train passenger, I’m not overly concerned.

    Seems a bit too weak to derail, probably only delay.

  • riodoro1@lemmy.world

    What a bunch of morons people were in 1912 to believe a ship could be unsinkable. Amirite guys?

    • Fades@lemmy.world

      What a horrible thing to say, especially since Elon and Tesla have only relatively recently turned to absolute shit. There are a lot of Tesla drivers that don’t support what he has done to the company and all that.

      Here you are advocating for the death of people because they purchased a vehicle. A lot of people bought Teslas as they were one of the better EVs at the time during Tesla’s climb to their peak (which they have since fallen very far from). They too deserve death?

      • Captain Aggravated@sh.itjust.works

        Here you are advocating for the death of people because they purchased a vehicle.

        No; I’m expressing the same sentiment that I express for motorcycle riders that refuse to wear a helmet. I really, genuinely don’t care if they beat their brains out on the front bumper of a Hyundai, but I don’t think they get to force a Hyundai driver to hose brains off their car.

        Teslas are death traps. Their owners can make that choice for themselves but I don’t think they get to make it for others, which is what they try to do every time they turn on that self-driving feature.

  • Buffalox@lemmy.world

    Obvious strong blinking red light ahead, obvious train passing ahead…

    Tesla FSD: Hmmm let’s not even slow down, I don’t see any signs of problems.

    FSD is an acronym for Fool Self Driving.

  • nyan@lemmy.cafe

    Are there any classes of object left that Tesla FSD has not either hit or almost hit? Icebergs, maybe?

  • ElPenguin@lemmynsfw.com

    As someone with more than a basic understanding of technology and how self driving works, I would think the end user would take special care driving in fog since the car relies on cameras to identify the roads and objects. This is clearly user error.

    • Noxy@yiffit.net

      Leaving room for user error in this sort of situation is unacceptable at Tesla’s scale and with their engineering talent, as hamstrung as it is by their deranged leadership

      • SaltySalamander@fedia.io

        If you are in the driver’s seat, you are 100% responsible for what your car does. If you let it drive itself into a moving train, that’s on you.

        • Noxy@yiffit.net

          I cannot fathom how anyone can honestly believe Tesla is entirely faultless in any of this, completely and totally free of any responsibility whatsoever.

          I’m not gonna say they’re 100% responsible but they are at least 1% responsible.

          • SaltySalamander@fedia.io

            If Tesla is at fault for an inattentive driver ignoring the myriad warnings he got to remain attentive when he enabled FSD and allowing the 2 ton missile he’s sitting in to nearly plow into a train, then Dodge has to be responsible for the Challenger being used to plow into those protestors in Charlottesville.

            God fucking damn it, why do you people insist on making me defend fucking Tesla?!

    • tb_@lemmy.world

      This is clearly user error.

      When it’s been advertised to the user as “full self driving”, is it?

      Furthermore, the car can’t recognize the visibility is low and alert the user and/or refuse to go into self driving?

      • darganon@lemmy.world

        When FSD is active in subpar circumstances there are many quite loud alerts about how it is degraded, and the car will slow down. That video was pretty foggy; I’d say the dude wasn’t paying attention.

        I came up on a train Sunday evening in the dark, which I hadn’t had happen in FSD, so I decided to just hit the brakes. It saw the crossing arms as blinking stoplights, so it probably wouldn’t have stopped.

        Either way that dude was definitely not paying attention.

      • Maddier1993@programming.dev

        When it’s been advertised to the user as “full self driving”, is it?

        I wouldn’t believe an advertisement.

        • tb_@lemmy.world

          I wouldn’t trust Musk with my life either.

          But, presumably, we have moved beyond the age of advertising snake oil and miracle cures and advertisements have to be somewhat factual.

          If a user does as is advertised and something goes wrong I do believe it’s the advertiser who is liable.

          • 0x0@programming.dev

            But, presumably, we have moved beyond the age of advertising snake oil and miracle cures and advertisements have to be somewhat factual.

            Keyword presumably.

            • tb_@lemmy.world

              Right. But can you blame the user for trusting the advertisement?

            • jaybone@lemmy.world

              If the product doesn’t do what it says it does, that’s the product/manufacturer’s fault, not the user’s fault. Wtf lol how is this even a debate.

    • Jakeroxs@sh.itjust.works

      Is this showing that it works or not? I can’t tell and there isn’t audio; it seems like it would have stopped correctly.

  • itsonlygeorge@reddthat.com

    Tesla opted not to use LIDAR as part of its sensor package and instead relies on cameras, which are not enough to determine accurate location data for other cars, trains, etc.

    This is what you get when billionaires cheap out on their products.

    • Imalostmerchant@lemmy.world

      I never understood Musk’s reasoning for this decision. From my recollection it was basically “how do you decide who’s right when lidar and camera disagree?” And it felt so insane to say that the solution to conflicting data was not to figure out which is right but only to listen to one.
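For what it's worth, sensor fusion has a standard answer to "who's right when they disagree": weight each reading by its certainty. A minimal sketch with hypothetical numbers (this is the textbook inverse-variance weighted mean, which a Kalman filter update reduces to, not Tesla's actual pipeline):

```python
# Fuse disagreeing sensor readings by weighting each one by the inverse
# of its variance: the less noisy a sensor is, the more it counts.
def fuse(readings):
    """readings: list of (measurement, variance) pairs. Returns the
    inverse-variance weighted mean."""
    weights = [1.0 / var for _, var in readings]
    total = sum(w * m for (m, _), w in zip(readings, weights))
    return total / sum(weights)

# Hypothetical fog scenario: the camera estimates an obstacle at 48 m
# but is noisy (variance 25), lidar says 32 m with variance 1.
# The fused estimate leans heavily toward the lidar.
print(fuse([(48.0, 25.0), (32.0, 1.0)]))  # ~32.6 m
```

The point being: "which sensor wins" is a solved modeling question, not a reason to discard one modality outright.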

      • Jakeroxs@sh.itjust.works

        Also, LIDAR is more expensive than cameras, which means a higher end-user price, as far as I remember.

      • wirehead@lemmy.world

        I mean, I think he’s a textbook example of why not to do drugs and why we need to eat the rich, but I can understand the logic here.

        When you navigate a car as a human, you are using vision, not LIDAR. Outside of a few edge cases, you aren’t even using parallax to judge distances. Ergo, LIDAR is not going to see the text on a sign, the reflective stripes on a truck, etc. And it gets confused differently than the eye, is absorbed by different wavelengths, etc. And you can jam LIDAR if you want. Thus, if we were content to wait until the self-driving car is actually safe before throwing it out into the world, we’d probably want the standard to be that it navigates as well as a human in all situations using only visual sensors.

        Except, there’s some huge problems that the human visual cortex makes look real easy. Because “all situations” means “understanding that there’s a kid playing in the street from visual cues so I’m going to assume they are going to do something dumb” or “some guy put a warning sign on the road and it’s got really bad handwriting”

        Thus, the real problem is that he’s not using LIDAR as harm reduction for a patently unsafe product, where the various failure modes of the LIDAR-equipped self-driving cars show that those aren’t safe either.

    • Wrench@lemmy.world

      LIDAR would have similarly been degraded in the foggy conditions that this occurred in. Lasers are light too.

      While I do think Tesla holds plenty of responsibility for their intentionally misleading branding in FSD, as well as cost saving measures to not include lidar and/or radar, this particular instance boils down to yet another shitty and irresponsible driver.

      You should not be relying on FSD over train tracks. You should not be allowing FSD to be going faster than conditions allow. Dude was tearing down the road in thick fog, way faster than was safe for the conditions.

      • Pazuzu@midwest.social

        Maybe it shouldn’t be called full self driving if it’s not fully capable of self driving

      • Rekorse@lemmy.dbzer0.com

        A Tesla driver might get the impression that the car’s “opinion” is better than their own, which could cause them to hesitate before intervening, or to allow the car to drive in a way they are uncomfortable with.

        The misinformation about the car reaches the level of negligence because even smart people are being duped by this.

        Honestly I think some people just don’t believe someone could lie so publicly, loudly, and often; it must be something else besides a grift.

      • FreddyDunningKruger@lemmy.ml

        One of the first things you learn to get your driver’s license is the Basic Speed Law, you must not drive faster than the driving conditions would allow. If only Full Self Driving followed the law and reduced its max speed based on the same.
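The Basic Speed Law point can be made concrete with a back-of-the-envelope calculation. The deceleration and reaction-time values below are common textbook assumptions, not measured Tesla figures:

```python
import math

# Fastest speed from which a car can stop within its sight distance,
# solving  d = v*t_react + v^2 / (2*a)  for v (quadratic in v).
# a = 7 m/s^2 approximates hard braking on dry pavement; t_react = 1.5 s
# is a typical assumed driver reaction time.
def max_safe_speed(sight_distance_m, a=7.0, t_react=1.5):
    """Returns the maximum safe speed in m/s for a given sight distance."""
    return -a * t_react + math.sqrt((a * t_react) ** 2 + 2 * a * sight_distance_m)

v = max_safe_speed(50.0)   # ~50 m visibility in thick fog
print(round(v * 3.6))      # ~65 km/h tops, and that's on *dry* pavement
```

So with roughly 50 m of visibility, anything much above 65 km/h already cannot stop for a stationary obstacle, human or computer at the wheel.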

    • skyspydude1@lemmy.world

      Not only that, but they took out the radar, which, while it has its own flaws, would have had no issue seeing the train through the fog. They claimed it was because they had “solved vision” and didn’t need it anymore, but that’s bullshit, and their engineering team knew it. They were in the middle of sourcing a new radar, but because of supply-chain limitations (like everyone in 2021) with both their old and potential new suppliers, waiting would have broken their “infinite growth” narrative, and fElon wouldn’t have gotten his insane pay package. They knew for a fact it would significantly hurt performance, but did it anyway so the line could go up.

      While no automotive company’s hands are particularly clean, the sheer level of willful negligence at Tesla is absolutely astonishing. I’ve seen and heard so many stories about their shitty engineering practices that the only impressive thing is how relatively few people have died as a direct result of their lax attitude towards basic safety practices.

  • Noxy@yiffit.net

    Feels like these things were more capable a decade ago when they had radar.

    Not that they should be called “full self driving” either then or now, but at least radar can deal with fog better than regular-ass cameras.