• NutWrench@lemmy.world · 5 months ago

    “self-driving cars” are not going to be a thing within our lifetimes. It’s a problem that requires MUCH smarter AIs than we currently have.

  • Toes♀@ani.social · 5 months ago

    This is speculation, but were most of them from people who disabled the safety features?

      • Toes♀@ani.social · 5 months ago

        I heard it’s fairly common for people to disarm the feature that requires you to hold the wheel.

        • NotMyOldRedditName@lemmy.world · 5 months ago

          Anything remotely supportive of Tesla on lemmy usually results in massive downvotes.

          You’ve angered the hive mind by suggesting that people are actively trying to bypass Tesla’s safety system so they can be idiots, thus making it not wholly Tesla’s fault.

          And yes, many people are actively using bypass devices, but not all.

    • NotMyOldRedditName@lemmy.world · 5 months ago

      You don’t have to disable it to beat the safety system.

      They were all pretty much due to inattentiveness, though. Many were drunk drivers.

      Many do use defeat devices as well, but not all.

      This was all brand new when it first came out, and we didn’t really have proper regulations for it. Things have gotten more restrictive, but people still find ways around it, and there’s no foolproof solution, since humans are smart and will find ways around things.

  • root@precious.net · 5 months ago

    There are some real Elon haters out there. I think the cars are ugly as sin, but I’m happy to see more people driving vehicles with all the crazy safety features, even if those features aren’t perfect.

    You’re in control of a massive vehicle capable of killing people and destroying property; you’re responsible for it.

    • machinin@lemmy.world · 5 months ago

      If only Elon would say something similar when he re-tweets a video of people having sex while the car is on autopilot. Can you guess what he actually said?

    • Thorny_Insight@lemm.ee · 5 months ago

      I’m quite certain that there will be some humble pie served to the haters in the not-too-distant future. The performance of FSD 12.3.5 is all the proof you need that an actual robotaxi is just around the corner. Disagree with me all you want. All we need to do is wait and see.

      However, I’m also sure that the cognitive dissonance is going to be so strong for many of these people that even a mountain of evidence won’t change their minds, because their position isn’t based on reason in the first place but on emotion.

      • machinin@lemmy.world · 5 months ago

        What makes this time any different from the dozens of other times Musk has said we’re six months away from FSD? When do you think Tesla will take responsibility for accidents that happen while using their software?

        If they do that in the next year, I’ll gladly eat humble pie. If they can’t, will you?

  • antlion@lemmy.dbzer0.com · 5 months ago

    These span from the earliest adopters up until August of last year. Plenty of idiots using a cruise control system and trusting their lives to beta software. Not the same as the current FSD software.

    Your own car insurance isn’t based on your driving skill when you had your learner’s permit. When Tesla takes on the liability and insurance for CyberCab, you’ll know it’s much safer than human drivers.

    • Hegar@kbin.social · 5 months ago

      Plenty of idiots using a cruise control system and trusting their lives to beta software.

      Using it exactly as it was marketed doesn’t make you an idiot.

      • halcyoncmdr@lemmy.world · 5 months ago

        You really want to get into reality versus marketing in this world? Very little marketing actually shows real world products and use cases in a real world environment. Heck, advertising often doesn’t even show the actual product at all.

        Your McDonald’s burger is NEVER going to look like the marketing photo. You don’t want to get anywhere near that “ice cream” or “milkshake” from the ad either, mashed potatoes and glue are often used for those advertising replacements.

        This doesn’t even get into things like disclaimers and product warnings, or people ignoring them.

      • Thorny_Insight@lemm.ee · 5 months ago

        The car prompts you every single time you enable this system to keep your eyes on the road and be prepared to take over at any moment.

        • machinin@lemmy.world · 5 months ago

          That’s the fine print. He’s talking about the marketing - the influencer videos, Musk’s tweets of those videos, Tesla’s own marketing videos, etc.

    • machinin@lemmy.world · 5 months ago

      But Tesla had a video in 2016 saying that people were only in the driver’s seat for legal reasons. Musk even said it was only an issue with regulators.

      Oh, who to believe!

      Notice, when talking about new features, Tesla shills love to promote how great it is and how often it saves them from problems (I can’t imagine how badly they must drive. We intervened with our grandmother after a couple of close calls). Then, when there is news about these accidents, they are so quick to blame the driver.

      Also, all these problems are with the old versions, the new versions clean up everything.

      I do agree with OP here about one thing - don’t take anything Tesla and Musk say about the cars’ capabilities seriously (including how that might impact stock price) until Tesla is willing to take financial responsibility for accidents. Until then, it’s all Musk bullshit.

  • AutoTL;DR@lemmings.world [bot] · 5 months ago

    This is the best summary I could come up with:


    In March 2023, a North Carolina student was stepping off a school bus when he was struck by a Tesla Model Y traveling at “highway speeds,” according to a federal investigation that was published today.

    The Tesla driver was using Autopilot, the automaker’s advanced driver-assist feature that Elon Musk insists will eventually lead to fully autonomous cars.

    NHTSA was prompted to launch its investigation after several incidents of Tesla drivers crashing into stationary emergency vehicles parked on the side of the road.

    Most of these incidents took place after dark, with the software ignoring scene control measures, including warning lights, flares, cones, and an illuminated arrow board.

    Tesla issued a voluntary recall late last year in response to the investigation, pushing out an over-the-air software update to add more warnings to Autopilot.

    The findings cut against Musk’s insistence that Tesla is an artificial intelligence company that is on the cusp of releasing a fully autonomous vehicle for personal use.


    The original article contains 788 words, the summary contains 158 words. Saved 80%. I’m a bot and I’m open source!

    • ForgotAboutDre@lemmy.world · 5 months ago

      Cameras and AI aren’t a match for radar/lidar. This is the big issue with the approach to autonomy Teslas take: you’ve only got a guess as to whether there are hazards in the way.

      Most algorithms are designed to work and are then statistically tested to validate that they work. When you develop an algorithm with AI/machine learning, there is only the statistical step: you have to infer whole-system performance purely from that. There isn’t a separate process for verification and validation; it’s validation alone.

      When something is developed with only statistical evidence that it works, you can’t be reliably sure it works in most scenarios, only in the exact ones you tested. When you design an algorithm to work, you can assume it works in most scenarios if the results are as expected when you validate it. With machine learning, the algorithm is obscured and uncertain (unless it’s only used for parameter optimisation).

      Machine learning is never used because it’s a better approach; it’s only used when the engineers don’t know how to develop the algorithm directly. Once you understand this, you understand the hazard it presents. If you don’t understand this, or refuse to understand it, you build machines that drive into children. Through ignorance, greed and arrogance, Tesla built a machine that deliberately runs over children.
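
      A minimal sketch of the distinction being drawn here, with purely illustrative values and hypothetical function names (nothing from any real driving stack): a hand-written rule can be read and reasoned about before it is ever tested, while a learned model can only be scored on whatever scenarios happen to be in the test set.

          # A hand-written braking rule: you can inspect it, reason about its
          # edge cases, and then also validate it statistically.
          def should_brake(distance_m: float, speed_mps: float) -> bool:
              # Brake if we cannot stop within the available distance,
              # assuming a fixed 6 m/s^2 deceleration (illustrative value).
              stopping_distance_m = speed_mps ** 2 / (2 * 6.0)
              return stopping_distance_m >= distance_m

          # A learned model: the only evidence that it works is its score on
          # the scenarios you happened to test - validation without a
          # separate verification step.
          def validate(model, scenarios):
              correct = sum(model(d, v) == expected for d, v, expected in scenarios)
              return correct / len(scenarios)  # a statistic, not a guarantee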

  • tearsintherain@leminal.space · 5 months ago

    Move fast, break shit. Fake it till you sell it, then move the goal posts down. Shift human casualties onto individual responsibility, a core libertarian theme. Profit off the lies because it’s too late, money already in the bank.

  • magnetosphere@fedia.io · 5 months ago

    I’ve often wondered why the FTC allows it to be marketed as “Full Self-Driving”. That’s blatant false advertising.

          • Thorny_Insight@lemm.ee · 5 months ago

            Might want to check your facts there. FSD works anywhere in the US, both cities and highways. Even on unmapped roads and parking lots.

            • Turun@feddit.de · 5 months ago

              What Tesla is (falsely IMO) advertising as “full self driving” is available in all new Mercedes vehicles as well and works anywhere in the US.

              Mercedes is in the news for expanding that functionality to a level where they are willing to take liability if the vehicle causes a crash during this new mode. Tesla does not do that.

              • Thorny_Insight@lemm.ee · 5 months ago

                works anywhere in the US

                The system Mercedes is using is extremely limited and hardly comparable to FSD in any way.

                Drivers can activate Mercedes’s technology, called Drive Pilot, when certain conditions are met, including in heavy traffic jams, during the daytime, on specific California and Nevada freeways, and when the car is traveling less than 40 mph. Drivers can focus on other activities until the vehicle alerts them to resume control. The technology does not work on roads that haven’t been pre-approved by Mercedes, including on freeways in other states.

                Source

                • BeigeAgenda@lemmy.ca · 5 months ago

                  I would much rather use FSD that is limited to routes and conditions where the developers and testers agree that it’s safe.

                  Compared to a company that says “everything works”, and “those drivers that got killed must have been doing something wrong”.

                • machinin@lemmy.world · 5 months ago

                  If I understand that person correctly, you are confusing the two systems.

                  Mercedes has two systems. One is a driver-assist system that does everything the current version of FSD can do. It is unlimited in the same way that Tesla’s FSD is unlimited.

                  They have an additional system, that you cite, that is Level 3, a true hands-off self-driving system. It is geographically limited.

                  So, the question is, does Tesla have any areas where you can legally drive hands free using their software?

                • Turun@feddit.de · 5 months ago

                  That is the new system. Tesla has no equivalent to it. Or to phrase it differently:

                  Drivers cannot activate Tesla’s equivalent technology, no matter what conditions are met, including not in heavy traffic jams, not during the daytime, not on specific California and Nevada freeways, and not when the car is traveling less than 40 mph. Drivers can never focus on other activities. The technology does not exist in Tesla vehicles.

                  If you are talking about automatic lane change, auto park, etc. (what Tesla calls Autopilot or Full Self-Driving), these are all features you can find in most, if not all, high-end cars nowadays.

                  The new system gets press coverage, because as I understand it, if there is an accident while the system is engaged Mercedes will assume financial and legal responsibility and e.g. cover all expenses that result from said accident. Tesla doesn’t do that.

            • machinin@lemmy.world · 5 months ago

              “Fuck this guy for bringing facts into our circlejerk” - The downvoters, probably

              Ha! Just saw this. Did someone get their facts confused?

            • suction@lemmy.world · 5 months ago

              When you stop using the Tesla kool-aid marketing terms and start to understand the actual state of the technology and, more importantly, the legislation, we might start to listen to what you are trying to say. Hint: using the term “FSD” or “Autopilot” is an immediate disqualifier.

            • machinin@lemmy.world · 5 months ago

              Oops, you fell for the Tesla marketing BS. FSD isn’t actually full self driving like the Mercedes system. With Tesla, you have to keep your hands on the wheel at all times and pay close attention to the road. You are completely responsible for anything that happens. Mercedes takes responsibility for any accidents their software causes.

        • machinin@lemmy.world · 5 months ago

          But it works and it’s hands off. Tesla can’t even legally do that under any condition.

          And fuck you if you ask Tesla to pay for any mistakes their software might make. It is ALWAYS your fault.

        • spamspeicher@feddit.de · 5 months ago

          Level 3 in the S-Class and EQS has been available since May 2022. And the speed limit is there because it’s part of a UN regulation that the Mercedes system is certified for. The regulation has been updated since the release of Mercedes Drive Pilot to allow speeds up to 140 km/h, but Mercedes needs to recertify for that.

        • conciselyverbose@sh.itjust.works · 5 months ago

          Because they’re doing shit responsibly.

          For the target audience they chose that thing is a fucking bargain. Do you know how many people making damn good money sit in hours of 4 lane bumper to bumper traffic every day? “You don’t have to drive and we assume liability if our system fucks up” is a massive value add.

          (Not enough that I’d ever consider dealing with that kind of commute no matter what you paid me. But still.)

        • suction@lemmy.world · 5 months ago

          Still the most advanced system that is legal to use on public roads, worldwide. Tesla’s most advanced system is many leagues below that, so I’m not sure why it’s so hard for some people to believe that Tesla is nothing but an also-ran.

    • Thorny_Insight@lemm.ee · 5 months ago

      You can literally type in an address and the car will take you there with zero input on the driver’s part. If that’s not full self-driving then I don’t know what is. What FSD was capable of a year ago and how it performs today are completely different.

      Not only do these statistics include the far less capable older versions, they also include accidents caused by Autopilot, which is a different system than FSD. It also fails to mention how the accident rate compares to human drivers.

      If we replaced every single car in the US with a self-driving one that’s a 10x safer driver than your average human, that would still mean over 3,000 deaths a year due to traffic accidents. That’s 10 people a day. If one wants to ban these systems because they’re not perfect, then that means they’d rather have 100 people die every day instead of 10.
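
      A rough back-of-the-envelope check of that arithmetic, assuming a ballpark figure of about 36,500 US road deaths per year (the real number varies by year):

          us_road_deaths_per_year = 36_500   # assumed ballpark figure, varies by year
          safety_factor = 10                 # "10x safer than the average human"

          human_deaths_per_day = us_road_deaths_per_year / 365              # ~100
          automated_deaths_per_day = human_deaths_per_day / safety_factor   # ~10

          print(f"human drivers:   ~{human_deaths_per_day:.0f} deaths/day")
          print(f"10x safer fleet: ~{automated_deaths_per_day:.0f} deaths/day")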

      • machinin@lemmy.world · 5 months ago

        You can literally type in an address and the car will take you there with zero input on the driver’s part. If that’s not full self-driving then I don’t know what is.

        Who is responsible if there is an accident, you or Tesla? That is the difference between true FSD and regular driver-assistance features.

        Regarding driving regulations -

        If we had better raw data, I’m sure we could come up with better conclusions. Knowing the absolutely tremendous amount of BS that Musk spews, we can’t trust anything Tesla reports. We’re left to speculate.

        At this point, it is probably best to compare statistics for other cars with similar technologies. For example, Volvo reported that they went 16 years without a fatal accident in their XC90 model in the UK (I don’t know about other places). That was a couple of years ago; I don’t know if they have been able to keep that record up. With a record that has lasted that long, I think we have to ask why Tesla is so bad.

      • Turun@feddit.de · 5 months ago

        It also fails to mention how the accident rate compares to human drivers.

        That may be because Tesla refuses to publish proper data on this, lol.

        Yeah, they claim it’s ten times better than a human driver, but none of their analysis methods or data points are available to independent researchers. It’s just marketing.

        • dgmib@lemmy.world · 5 months ago

          This is the part that bothers me.

          I’d defend Tesla when FSD gets into accidents, even fatal ones, IF they showed that FSD caused fewer accidents than the average human driver.

          They claim that’s true, but if it is why not release data that proves it?

          • machinin@lemmy.world · 5 months ago

            The benchmark shouldn’t just be the average driver. Most cars are equipped with driver-assist features, so we should ask whether it’s better than people using the current driver-assist features from other companies. If Tesla is behind everyone else but better than a 20-year-old car, it’s still problematic.

        • machinin@lemmy.world · 5 months ago

          I have a feeling that user blocks people that are critical of Tesla. They are probably oblivious to several comments in this thread. It’s really no wonder why they have no clue about how bad Tesla really is.

        • Thorny_Insight@lemm.ee · 5 months ago

          I’m not claiming it is 10x safer than a human - I’m saying that even if it were, there would still be daily deaths despite that.

          Tesla has published the data; people just refuse to believe it because it doesn’t show what they think it should. There’s nothing more Tesla can do about it at this point. It’s up to independent researchers from now on.

          • machinin@lemmy.world · 5 months ago

            Comment:

            none of their analysis methods or data points are available to independent researchers.

            Your response:

            It’s up to independent researchers from now on.

            I think you missed an important point there. Can you show the detailed methods and data points that Tesla used for their marketing materials?

          • Turun@feddit.de · 5 months ago

            I would love to see this data, can you link it? Either a paper by unaffiliated researchers or the raw data is fine.
            I am aware their marketing pushes the “10x better” number. But I have yet to see the actual data to back this claim.

            • Thorny_Insight@lemm.ee · 5 months ago

              Either a paper by unaffiliated researchers or the raw data is fine.

              Like I said: the only data available is from Tesla itself, which any reasonable person should take with a grain of salt. If you want to see it you can just google it. There are plenty of YouTubers independently testing it as well, but these are all obviously biased fanboys who can’t be trusted either.

              • ForgotAboutDre@lemmy.world · 5 months ago

                Tesla sues people who criticise them in the media, so you really can’t trust most reviews. The reviewers are also looking for money from companies like Tesla, so they’re not impartial.

    • reddig33@lemmy.world · 5 months ago

      As is “autopilot”. There’s no automatic pilot. You’re still expected to keep your hands on the wheel and your eyes on the road.

      • halcyoncmdr@lemmy.world · 5 months ago

        I am so sick and tired of this belief, because it’s clear people have no idea what Autopilot on a plane actually does. They always seem to assume it flies the plane while the pilot does nothing. Autopilot alone does not fly the damned plane by itself.

        “Autopilot” in a plane keeps the wings level at a set heading, altitude, and speed. It’s literally the same as cruise control with lane-centering, minus the altitude part, since there’s no altitude issue on a road.

        There are more advanced systems available on the market that can be installed on smaller planes and in use on larger jets that can do things like auto takeoff, auto land, following waypoints, etc. without pilot input, but basic plain old autopilot doesn’t do any of that.

        That expanded capability is similar to how things like “Enhanced Autopilot” on a Tesla can do extra things like change lanes, follow highway exits on a navigated route, etc. Or how “Full Self-Driving” is supposed to follow road signs and lights, etc. but those are additional functions, not part of “Autopilot” and differentiated with their own name.

        Autopilot, either on a plane or a Tesla, alone doesn’t do any of that extra shit. It is a very basic system.

        The average person misunderstanding what a word means doesn’t make it an incorrect name or description.

        • Turun@feddit.de · 5 months ago

          I’d wager most people, when talking about a plane’s autopilot, mean the waypoint-following or autoland capability.

          Also, it’s hard to argue “full self driving” means anything but the car is able to drive fully autonomously. If they were to market it as “advanced driver assist” I’d have no issue with it.

          • halcyoncmdr@lemmy.world · 5 months ago

            I’d wager most people, when talking about a plane’s autopilot mean the follow waypoints or Autoland capability.

            Many people are also pretty stupid when it comes to any sort of technology more complicated than a calculator. That doesn’t mean the world revolves around a complete lack of knowledge.

            My issue is just with people expecting basic Autopilot to do more than it’s designed or intended to do, and refusing to acknowledge their expectation might actually be wrong.

            Also, it’s hard to argue “full self driving” means anything but the car is able to drive fully autonomously. If they were to market it as “advanced driver assist” I’d have no issue with it.

            Definitely won’t get an argument from me there. FSD certainly isn’t in a state to really be called that yet. Although, to be fair, when signing up for it, and when activating it there are a lot of notices that it is in testing and will not operate as expected.

            At what point do we start actually expecting and enforcing that people be responsible with potentially dangerous things in daily life, instead of just blaming a company for not putting enough warnings or barriers to entry?

            • Turun@feddit.de · 5 months ago

              Also, it’s hard to argue “full self driving” means anything but the car is able to drive fully autonomously. If they were to market it as “advanced driver assist” I’d have no issue with it.

              Definitely won’t get an argument from me there. FSD certainly isn’t in a state to really be called that yet. Although, to be fair, when signing up for it, and when activating it there are a lot of notices that it is in testing and will not operate as expected.

              At what point do we start actually expecting and enforcing that people be responsible with potentially dangerous things in daily life, instead of just blaming a company for not putting enough warnings or barriers to entry?

              Then the issue is simply what we perceive as the predominant marketing message. I know that in all legally binding material Tesla states exactly what the system is capable of and how alert the driver needs to be. But in my opinion that is vastly overshadowed by the advertising Tesla runs for their FSD capability. They show a 5-second message about how they are required by law to warn you to stay alert at all times, before showing the car driving itself for 3 minutes with the demo driver’s hands completely off the wheel.

            • machinin@lemmy.world · 5 months ago

              At what point do we start actually expecting and enforcing that people be responsible with potentially dangerous things in daily life, instead of just blaming a company for not putting enough warnings or barriers to entry?

              Volvo seeks to have zero human deaths in their cars. Some places seek zero fatality driving environments. These are cultures where safety is front and center. Most FSD enthusiasts (see comments in the other threads below) cite safety as the main impetus for these systems. Hopefully we would see similar cultural values in Tesla.

              Unfortunately, Musk tweets out jokes when responding to a video of people having sex on Autopilot. That is Tesla culture. Musk is responsible for putting these dangerous things in consumers’ hands and has created a culture where irresponsible and possibly fatal abuse of those things is something funny for everyone to laugh at. Of course, punish the individual users who go against the rules and abuse the systems. But you also have to punish the company, and the idiot at the top, who holds those same rules in contempt.

        • machinin@lemmy.world · 5 months ago

          I say let Tesla market it as Autopilot if they pass a regulatory safety framework similar to the one aviation autopilot functions go through.

        • Captain Aggravated@sh.itjust.works · 5 months ago

          Flight instructor here.

          I’ve seen autopilot systems that have basically every level of complexity you can imagine. A lot of Cessna 172s were equipped with a single axis autopilot that can only control the ailerons and can only maintain wings level. Others have control of the elevators and can do things like altitude hold, or ascend/descend at a given rate. More modern ones have control of all three axes and integration with the attitude instruments, and can do things like climb to an altitude and level off, turn to a heading and stop, or even something like fly a holding pattern over a fix. They still often don’t have any control over the power plant, and small aircraft typically cannot land themselves, but there are autopilots installed in piston singles that can fly an approach to minimums.

          And that’s what’s available on piston singles; airline pilots seldom fly the aircraft by hand anymore.

        • Saik0@lemmy.saik0.com · 5 months ago

          “Autopilot” in a plane keeps the wings level at a set heading, altitude, and speed. It’s literally the same as cruise control with lane-centering, minus the altitude part, since there’s no altitude issue on a road.

          Factually incorrect. There are autopilot systems on planes now that can take off, fly, and land the flight on their own. So yes, “autopilot” is EXACTLY what people are assuming it to mean in many cases, especially on the planes they would typically be accustomed to… which is to say, the big airliners.

          Now where you’re missing the point… There are varying degrees of autopilot. And that would be fine and dandy for Tesla’s case if you wish to invoke it. But consider that the company has touted it to be the “most advanced” and “Full Self-Driving” and “will be able to drive you from California to New York on its own”. They’ve set the expectation that it is the most advanced autopilot, akin to the plane that doesn’t actually need a pilot (although one is always present) for all three major parts of the flight. No Tesla product comes even close to that claim, and I’m willing to bet they never will.

          • halcyoncmdr@lemmy.world · 5 months ago

            Now where you’re missing the point… There are varying degrees of autopilot. And that would be fine and dandy for Tesla’s case if you wish to invoke it. But consider that the company has touted it to be the “most advanced” and “Full Self-Driving” and “will be able to drive you from California to New York on its own”.

            I have said from the beginning that there are varying levels of autopilot on planes and that needs to be taken into account when talking about the name and capabilities… that’s my entire argument, you illiterate fool.

            You are, at best, failing to acknowledge, or more likely, willfully ignoring the fact that Tesla does differentiate these capabilities with differently named products. All while claiming that a plane Autopilot must inherently be the most advanced version on the market to be compared to Tesla’s most basic offering.

            You are adding in capabilities from the more advanced offerings that Tesla has, like Enhanced Autopilot, and Full Self Driving and saying those are part of “Autopilot”. If you want to compare basic Tesla Autopilot, then compare it to a basic plane Autopilot. Tesla doesn’t claim that basic “Autopilot” can do all the extra stuff, that’s why they have the other options.

            That’s the issue I have with these conversations, people are always comparing apples and oranges, and trying to claim that they’re not to try and justify their position.

            Tesla’s website does indicate these differences between the versions, and has as each added capability was added to the overall offerings.

            • Saik0@lemmy.saik0.com · 5 months ago

              You are, at best, failing to acknowledge

              No. That whole statement INCLUDING what you quoted was me allowing you to invoke it.

              Literally : “And that would be fine and dandy for Tesla’s case if you wish to invoke it.” Then I stated why that’s bad to invoke.

              You can claim I’m willfully ignorant. But you’re just a moron Elon shill.

              Tesla doesn’t claim that basic “Autopilot” can do all the extra stuff, that’s why they have the other options.

              And that’s why I’m just going to call you a moron Elon shill and move on. You’re full of shit. All they do is claim that it’s amazing/perfect. Then you buy the car, you expect the function, and it doesn’t do it, not even close.

              • halcyoncmdr@lemmy.world · 5 months ago

                But you’re just a moron Elon shill.

                Ah yes, the classic internet response of calling anyone you disagree with a shill. Because clearly someone disagreeing with you and pointing out issues with claims means they must inherently be defending a company without any valid claims. Easy to ignore when you don’t consider them a real person having a discussion.

                No point in arguing with someone unwilling to have an actual discussion and just resorting to calling someone a shill because they refuse to accept a different point of view can even exist.

                “You’re a shill, so nothing you say matters”.

                • Saik0@lemmy.saik0.com · 5 months ago

                  When you outright lie about the facts, it’s hard to have any other opinion about you. So yes, you’re a shill.

        • reddig33@lemmy.world · 5 months ago

          “But one reason that pilots will opt to turn the system on much sooner after taking off is if it’s stormy out or there is bad weather. During storms and heavy fog, pilots will often turn autopilot on as soon as possible.

          This is because the autopilot system can take over much of the flying while allowing the pilot to concentrate on other things, such as avoiding the storms as much as possible. Autopilot can also be extremely helpful when there is heavy fog and it’s difficult to see, since the system does not require eyesight like humans do.”

          Does that sound like something Tesla’s autopilot can do?

          https://www.skytough.com/post/when-do-pilots-turn-on-autopilot

          • Captain Aggravated@sh.itjust.works · 5 months ago

            Flight instructor here. The flying and driving environments are quite different, and what you need an “autodriver” to do is a bit different from an “autopilot.”

            In a plane, you have to worry a lot more about your attitude, aka which way is up. This is the first thing we practice in flight school with 0-hour students, just flying straight ahead and keeping the airplane upright. This can be a challenge to do in low visibility environments such as in fog or clouds, or even at night in some circumstances, and your inner ears are compulsive liars the second you leave the ground, so you rely on your instruments when you can’t see, especially gyroscopic instruments such as an attitude indicator. This is largely what an autopilot takes over for from the human pilot, to relieve him of that constant low-level task to concentrate on other things.

            Cars don’t have to worry about this so much; for normal highway driving any situation other than “all four wheels in contact with the road” is likely an unrecoverable emergency.

            Navigation in a plane means keeping track of your position in 3D space relative to features on the Earth’s surface. What airspace are you in, what features on the ground are you flying over, where is the airport, where’s that really tall TV tower that’s around here? Important for finding your way back to the airport, preventing flight into terrain or obstacles, and keeping out of legal trouble. This can be accomplished with a variety of ways, many of which can integrate with an autopilot. Modern glass cockpit systems with fully integrated avionics can automate the navigation process as well, you can program in a course and the airplane can fly that course by itself, if appropriately equipped.

            Navigation for cars is two separate problems; there’s the big picture question of “which road am I on? Do I take the next right? Where’s my exit?” which is a task that requires varying levels of precision from “you’re within this two mile stretch of road” to “you’re ten feet from the intersection.” And there’s the small picture question of “are we centered in the traffic lane?” which can have a required precision of inches. These are two different processes.

            Anticollision, aka not crashing into other planes, is largely a procedural thing. We have certain best practices such as “eastbound traffic under IFR rules fly on the odd thousands, westbound traffic flies on the even thousands” so that oncoming traffic should be a thousand feet above or below you, that sort of thing, plus established traffic patterns and other standard or published routes of flight for high traffic areas. Under VFR conditions, pilots are expected to see and avoid each other. Under IFR conditions, that’s what air traffic control is for, who use a variety of techniques to sequence traffic to make sure no one is in the same place at the same altitude at the same time, anything from carefully keeping track of who is where to using radar systems, and increasingly a thing called ADS-B. There are also systems such as TCAS which are aircraft carried traffic detection electronics. Airplanes are kept fairly far apart via careful sequencing. There’s also not all that much else up there, not many pedestrians or cyclists thousands of feet in the air, wildlife and such can be a hazard but mostly during the departure and arrival phases of flight while relatively low. This is largely a human task; autopilots don’t respond to air traffic control and many don’t integrate with TCAS or ADS-B, this is the pilot’s job.

            Cars are expected to whiz along mere inches apart via see and avoid. There is no equivalent to ATC on the roads, cars aren’t generally equipped with communication equipment beyond a couple blinking lights, and any kind of automated beacon for electronic detection absolutely is not the standard. Where roads cross at the same level some traffic control method such as traffic lights are used for some semblance of sequencing but in all conditions it requires visual see-and-avoid. Pedestrians, cyclists, wildlife and debris are constant collision threats during all phases of driving; deer bound across interstates all the time. This is very much a visual job, hell I’m not sure it could be done entirely with radar, it likely requires optical sensors/cameras. It’s also a lot more of the second-to-second workload of the driver. I honestly don’t see this task being fully automated with roads the way they are.

          • FiskFisk33@startrek.website · 5 months ago

            At SkyTough, we pride ourselves on ensuring our readers get the best, most helpful content that they’ll find anywhere on the web. To make sure we do this, our own experience and expertise is combined with the input from others in the industry. This way, we can provide as accurate of information as possible. With input from experts and pilots from all over, you’ll get the complete picture on when pilots turn autopilot on while flying!

            This is GPT.

            After that intro I don’t trust a single word of what that site has to say.

            If the writer didn’t bother to write the text, I hope they don’t expect me to bother to read it.

  • set_secret@lemmy.world · 5 months ago

    Verge articles seem to be getting worse over the years; they’ve almost reached Forbes level. Yes, this does raise some valid safety concerns, but no, Tesla isn’t bad just because it’s Tesla.

    It doesn’t really give us the full picture. For starters, there’s no comparison with Level 2 systems from other car makers, which also require driver engagement and have their own methods to ensure attention. This would help us understand how Tesla’s tech actually measures up.

    Plus, the piece skips over extremely important stats that would give us a clearer idea of how safe (or not) Tesla’s systems are compared to good old human driving.

    We’re left in the dark (probably on purpose) about how Tesla compares in scenarios like drunk, distracted, or tired driving, which are common issues that automation aims to mitigate.

    It feels like the article is more about stirring up feelings against Tesla rather than diving deep into the data. A more genuine take would have included these comparisons and variables, giving us a broader view of what these technologies mean for road safety.

    I feel like any opportunity to jump on the Elon hate wagon is getting tiresome. (And yes, I hate Elon too.)

    • WormFood@lemmy.world · 5 months ago

      a more genuine take would have included a series of scenarios (e.g. drunk/distracted/tired driving)

      I agree. they did tesla dirty. a more fair comparison would’ve been between autopilot and a driver who was fully asleep. or maybe a driver who was dead?

      and why didn’t this news article contain a full scientific meta analysis of all self driving cars??? personally, when someone tells me that my car has an obvious fault, I ask them to produce detailed statistics on the failure rates of every comparable car model

      • mojofrododojo@lemmy.world · 5 months ago

        a driver who was fully asleep. or maybe a driver who was dead?

        why does it need to become a specious comparison for it to be valid in your expert opinion? because those comparisons are worthless.

    • PersnickityPenguin@lemm.ee · 5 months ago

      A couple of my criticisms of the article, which is about “Autopilot” and not FSD:

      -Conflating Autopilot and FSD numbers: they are not interchangeable systems. They are separate code bases with different functionality.

      -The definition of “autopilot” seems to have been lifted from the aviation industry. The term is used to describe a system that controls the vector of a vehicle, i.e. its speed and direction. That’s all. This does seem like a correct description of what the Autopilot system does, while “FSD” does not seem to live up to expectations, not being a true Level 5 driving system.

      Merriam Webster defines autopilot thusly:

      “A device for automatically steering ships, aircraft, and spacecraft also : the automatic control provided by such a device”

    • mojofrododojo@lemmy.world · 5 months ago

      So we should let Musk endanger people needlessly for Tesla’s profits?

      A human driver isn’t 100%, but you can at least hold the human liable for their mistakes. Is Musk going to be liable for the accidents this causes?

      Because that’s the human left in the loop, the fool self drive champion.

  • Betide@lemmy.world · 5 months ago

    The same people who are upset over self driving cars are the ones who scream at the self checkout that they shouldn’t have to scan their own groceries because the store isn’t paying them.

    32% of all traffic crash fatalities in the United States involve drunk drivers.

    I can’t wait until the day this kind of technology is required by law. I’m tired of sharing the road with these idiots, and I absolutely trust self-driving vehicles more than I trust other humans.

    • Flying Squid@lemmy.world · 5 months ago

      people who are upset over self driving cars

      If you are talking about Teslas, you can’t be upset about something a car doesn’t actually do unless you think it’s actually capable of doing it.

      The only thing I don’t like is that Tesla is able to claim it has a “full self driving” mode which is not full self driving. Seems like false advertising to me.

    • kat_angstrom@lemmy.world · 5 months ago

      I’ve never heard of anyone screaming because they had to scan their own groceries at a self-checkout. Is this a common thing?

      • Fridgeratr@lemmy.dbzer0.com · 5 months ago

        No. No idea what they’re talking about. If someone really feels that way, there are usually other aisles with people that can scan the groceries.

    • EvolvedTurtle@lemmy.world · 5 months ago

      I recently learned that at least half of the drivers where I live think it’s fine to cut me off with no signal while we’re going 70 mph on the highway.

  • werefreeatlast@lemmy.world · 5 months ago

    It’s just a dozen! You know how many people COVID took? And everyone wanted COVID! …it spreads through the air? Where’s my fabric non-filtering 😷 mask with added holes, baby!? So you know… how cool would it be if you’re riding in an ordinary car and someone else is driving it into a wall or a semi, except it’s actually not a sentient being but an algorithm? It would be pretty cool, right?

    • letsgo@lemm.ee · 5 months ago

      OK.

      Question: how do you propose I get to work? It’s 15 miles, there are no trains, the buses are far too convoluted and take about 2 hours each way (no I’m not kidding), and “move house” is obviously going to take too long (“hey boss, some rando on the internet said “stop using cars” so do you mind if I take indefinite leave to sell my house and buy a closer one?”).

        • letsgo@lemm.ee · 5 months ago

          I already have (Yamaha MT10), but presumably that has the same problem that cars do (burning fossil fuels); also it’s no good in shit weather (yeah I know that means I need better clothing).

        • letsgo@lemm.ee · 5 months ago

          Sure, but the challenge was “Don’t use cars”, not “Don’t use cars where there is viable mass transit in place”.

    • TypicalHog@lemm.ee · 5 months ago

      I swear some people in this thread would call airplane autopilot bad because it causes SOME deaths from time to time.

    • dependencyinjection@discuss.tchncs.de · 5 months ago

      When I see this comment it makes me wonder, how do you feel when you see someone driving a car?

      Should I feel guilty for owning a car? I’m 41 and I got my first car when I was 40, because I changed careers and the new job was 50 miles away.

      I rarely used it outside of work; it was a means to get me there. I now work remote three days a week, so I only drive two.

      I don’t have social media or shop with companies like Amazon. I have just been to my first pro-Palestine protest.

      Am I to be judged for using a car?

      • hydration9806@lemmy.ml · 5 months ago

        I believe what they mean is “fuck car-centric societal design”. No reasonable person should be mad that someone is using the current system to live their life (i.e. driving to work). The real goal is spreading awareness that a car-centric society is inherently isolating and stressful, and that one more lane does absolutely nothing to lessen traffic (except for a month or so).

      • PlexSheep@infosec.pub · 5 months ago

        That’s a good question!

        The short answer is no. Cars suck for many reasons, but it’s a fact in many parts of the world that you cannot be a functioning member of a society without one, especially if your government doesn’t get that cars suck or you live somewhere remote.

        How do I feel when I see someone driving a car? Mostly my feelings don’t change, because it is so normalized. But I get somewhat angry when I see uselessly huge cars that are obviously just a waste of resources. I have fun ridiculing car centric road and city design, but it’s the bad kind of fun.

        I am also very careful around cars, both while I’m in them and outside of them. Cars are very heavy, and drivers are infamous for being bad at controlling them. This isn’t their fault; it’s super easy to make mistakes while driving. You just have to move your feet a little too fast or move your hand a little too far and boom, someone is dead.

        Think about driving on a highway. If the guy next to you accidentally moves the wheel a little more than usual, that car will crash into you, creating a horrendous scene. It’s just too prone to failure, and failure will probably mean people get hurt. For this reason, cars legitimately scare me, even if I have to deal with them.

        Sorry if that does not make sense to you. I’m still trying to figure all this out for myself and I’m not always rational about these topics, because seeing the potential of our cities being wasted by car centric design makes me angry.

      • machinin@lemmy.world · 5 months ago

        Probably not you personally, but the system, oil companies, and people like Musk and his followers that want to prioritize private driving over public transportation.

        I say fuck cars, and I have one too. I try to avoid using it, but it’s easy to be lazy. I’m also fortunate to live someplace with great public transportation.

        Don’t take it personally, just realize life can be better if we could learn to live without these huge power-hungry cargo containers taking us everywhere.