The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday after the company reported four crashes in which Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

  • conciselyverbose@sh.itjust.works · 1 month ago

    It’s worse than that, though. Our eyes are significantly better than cameras (with some exceptions at the high end) at adapting to varied lighting conditions, especially rapid changes.

    • Buddahriffic@lemmy.world · 1 month ago

      Not only that, when we have trouble seeing things, we can adjust our speed to compensate (though tbf, not all human drivers do, but I don’t think FSD should be modelled after the worst of human drivers). Does Tesla’s FSD go into a “drive slower” mode when it gets less certain about what it sees? Or do its algorithms always treat its best guess with high confidence?
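
      The behavior the commenter is describing can be sketched in code. This is a purely hypothetical illustration of a confidence-gated speed policy, the kind of conservative fallback a human driver applies in fog or glare; it says nothing about how Tesla's actual FSD software works, and all names and thresholds here are invented for the example.

      ```python
      # Hypothetical sketch: scale target speed down as perception
      # confidence drops, like a human slowing in fog or sun glare.
      # NOT Tesla's implementation; thresholds are illustrative only.

      def target_speed(base_limit_mph: float, perception_confidence: float) -> float:
          """Return a reduced target speed for degraded visibility.

          perception_confidence ranges from 0.0 (effectively blind)
          to 1.0 (fully certain about the scene ahead).
          """
          if perception_confidence >= 0.9:   # clear conditions: drive normally
              return base_limit_mph
          if perception_confidence < 0.3:    # too uncertain to proceed: stop
              return 0.0
          # In between, scale speed linearly with confidence rather than
          # driving at full speed on a low-confidence best guess.
          return base_limit_mph * perception_confidence
      ```

      The alternative the commenter worries about, treating a low-confidence best guess as certain, would correspond to always returning `base_limit_mph` regardless of `perception_confidence`.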