The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday after the company reported four crashes in which Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

  • FiskFisk33@startrek.website · 2 months ago

    Tesla, which has repeatedly said the system cannot drive itself and human drivers must be ready to intervene at all times.

    how is it legal to label this “full self driving”?

    • krashmo@lemmy.world · 2 months ago

      That’s pretty clearly just a disclaimer meant to shield them from legal repercussions. They know people aren’t going to do that.

      • GoodEye8@lemm.ee · 2 months ago

        Last time I checked, that disclaimer was there because officially Teslas are SAE Level 2, which lets them evade the regulations that apply at higher SAE levels, even though in practice the Tesla FSD beta operates like SAE Level 4.

      • don@lemm.ee · 2 months ago

        “But to be clear, although I most certainly know for a fact that the refreshing sparkling water I sell is exceedingly poisonous and should in absolutely no way be consumed by any living (and most dead*) beings, I will nevertheless very heartily encourage you to buy it. What you do with it after is entirely up to you.”

        *Exceptions may apply. You might be one.

    • kiku@feddit.org · 2 months ago

      If customers can’t assume that boneless wings don’t have bones in them, then they shouldn’t assume that Full Self Driving can self-drive the car.

      The courts made it clear that words don’t matter, and that the company can’t be liable for you assuming that words have meaning.

        • LifeInMultipleChoice@lemmy.world · edited · 2 months ago

          Now go after Oscar Mayer and Burger King: I’m not getting any ham in my burger or any dog in my hot dogs. These buyers know full well before they complete the sale that the car does not, and is not lawfully allowed to, autopilot itself around the country. The owner’s manual gives them a full breakdown as well, I’m sure. If you spend thousands of dollars on something and don’t know the basic rules and guidelines, you have much bigger issues. If anything, registering one of these vehicles to drive on the road should require being made aware of that.

          If someone is dumb or ignorant enough to jump through all the hoops and still not know, let’s be honest: they shouldn’t be driving a car either.

      • FiskFisk33@startrek.website · 2 months ago

        legal or not, it’s absolutely bonkers. Safety should be the default legal assumption for marketing terms like this, not an optional extra.