The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday after the company reported four crashes in which Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

  • conciselyverbose@sh.itjust.works

    It’s worse than that, though. Our eyes are significantly better than cameras (with some exceptions at the high end) at adapting to varied lighting conditions, especially rapid changes.

    • Buddahriffic@lemmy.world

      Not only that, when we have trouble seeing things, we can adjust our speed to compensate (though tbf, not all human drivers do, but I don’t think FSD should be modelled after the worst of human drivers). Does Tesla’s FSD go into a “drive slower” mode when it gets less certain about what it sees? Or do its algorithms always treat its best guess with high confidence?

    • jerkface@lemmy.ca

      Hard to credit without a source; modern cameras have far more dynamic range than the human eye.

      • magiccupcake@lemmy.world

        Not in one exposure. Human eyes are much better at dealing with extremely high contrasts.

        Cameras can be much more sensitive, but at the cost of overexposing brighter regions in an image.

        • conciselyverbose@sh.itjust.works

          They’re also pretty noisy in low light and generally need long exposures (a problem for a camera at high speeds) to get sufficient input to see anything in the dark. Especially if you aren’t spending thousands of dollars on a massive sensor per camera.

          • jerkface@lemmy.ca

            I dunno what cameras you are using, but a standard full-frame sensor and an f/4 lens see way better in low light than the human eye. If I take a raw image off my camera, there is far more dynamic range than I can see or a monitor can even represent: you can double the brightness at least four times (i.e. 16x brighter) and parts of the image that looked pure black to the eye become perfectly usable. There is so, so much more dynamic range than the human eye has.

            • conciselyverbose@sh.itjust.works

              Do you know what the depth of field at f/4 looks like? It’s not anywhere in the neighborhood of suitable for a car, and it still takes a meaningful exposure length in low light conditions to get a picture at all, which is not suitable for driving at 30mph, let alone actually driving fast.

              That full frame sensor is also on a camera that’s several thousand dollars.
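The “double the brightness at least four times (i.e. 16x)” arithmetic a few comments up is just photographic stops: each stop doubles the light, so n stops is a 2^n ratio. A minimal sketch — the stop counts for raw files and displays below are rough illustrative assumptions, not measurements from the thread:

```python
# Photographic "stops": each stop doubles the light, so n stops = 2**n ratio.

def stops_to_ratio(stops: float) -> float:
    """Convert a number of stops to a linear brightness ratio."""
    return 2.0 ** stops

# Pushing shadows up by 4 stops, as in the comment above:
print(stops_to_ratio(4))  # 2**4 = 16x brighter

# Rough illustrative figures (assumptions): a raw file from a modern
# full-frame sensor is often quoted around 12-14 stops of dynamic range,
# while a standard SDR display shows roughly 8.
raw_stops, display_stops = 14, 8
headroom = stops_to_ratio(raw_stops - display_stops)
print(headroom)  # 2**6 = 64x more range in the file than the display shows
```

This is why raw shadow recovery works at all: the extra stops are in the file even though the monitor can’t show them in one rendering.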
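The depth-of-field objection can be made concrete with the standard thin-lens approximation. Every number here is an illustrative assumption (an 85 mm lens, f/4, the common 0.03 mm full-frame circle of confusion, focus at 10 m), not anything stated in the thread:

```python
def hyperfocal(f_mm: float, N: float, c_mm: float = 0.03) -> float:
    """Hyperfocal distance (mm) for focal length f_mm, f-number N,
    and circle of confusion c_mm (0.03 mm is a common full-frame value)."""
    return f_mm ** 2 / (N * c_mm) + f_mm

def dof_limits(f_mm: float, N: float, s_mm: float, c_mm: float = 0.03):
    """Near/far limits of acceptable focus (mm) when focused at s_mm."""
    H = hyperfocal(f_mm, N, c_mm)
    near = H * s_mm / (H + (s_mm - f_mm))
    far = H * s_mm / (H - (s_mm - f_mm)) if s_mm < H else float("inf")
    return near, far

# Illustrative assumption: an 85 mm full-frame lens at f/4 focused at 10 m.
near, far = dof_limits(85, 4, 10_000)
print(round(near / 1000, 1), "m to", round(far / 1000, 1), "m")
```

With those assumed numbers only a band of a few metres around the focus distance is acceptably sharp, which is the commenter’s point; a much shorter focal length widens that band considerably, so the trade-off depends heavily on the lens chosen.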