• anlumo@feddit.de
    1 year ago

    As a software developer, I can tell you that’s not how testing works. QA is always trying to come up with weird edge cases to test, but once it’s out in the wild with thousands (or more) of real-world users, there’s always going to be something nobody ever tried to test.

    For example, there was a crash where an unmarked truck with exactly the same color as the sky was 90° sideways on the highway. This is just something you wouldn’t think of in lab conditions.

    • lloram239@feddit.de
      1 year ago

      there’s always going to be something nobody ever tried to test.

      That’s not what is happening. These aren’t weird edge cases; we see self-driving cars blocking emergency vehicles and driving through barriers.

      For example, there was a crash where an unmarked truck with exactly the same color as the sky was 90° sideways on the highway.

      The sky is blue and the truck was white. Testing the dynamic range of the camera system is absolutely something you do in a lab situation. And a thing blocking the road isn’t exactly unforeseen either.
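
The dynamic-range point above can be sketched numerically: if both the sky and the truck are brighter than the sensor's full-scale level, they clip to the same pixel value and there's no contrast left to detect. The function name, full-scale level, and luminance figures below are invented for illustration, not from any real camera system.

```python
# Hypothetical sketch of sensor clipping: two very different scene
# luminances map to the same saturated 8-bit pixel value.

def to_8bit(luminance_nits, full_scale_nits=1000):
    """Map a scene luminance to an 8-bit pixel value, clipping at full scale."""
    value = round(255 * luminance_nits / full_scale_nits)
    return min(value, 255)

sky = to_8bit(6000)          # bright sky, far above full scale
white_truck = to_8bit(4000)  # sunlit white trailer, also above full scale
print(sky, white_truck)      # both clip to 255: zero contrast to segment on
```

A lab test for this is straightforward: point the camera at targets brighter than full scale and check whether the pipeline still distinguishes them.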

      Or how about railroad crossings: Tesla can’t even tell the difference between a truck and a train. Trucks blipping in and out of existence, even changing direction, are totally normal for Tesla too.

      I don’t expect self-driving cars to be perfect and handle everything, but I expect the manufacturers to be transparent about their abilities, and they aren’t. Furthermore, I expect the self-driving system to have a way to react to unforeseen situations; crashing in fog is not acceptable when the fact that there was fog was plainly obvious.

      • abhibeckert@beehaw.org
        1 year ago

        And a thing blocking the road isn’t exactly unforeseen either.

        Tesla’s system intentionally assumes “a thing blocking the road” is a sensor error.

        They have said that if they don’t do that, about every hour or so you’d drive past a building and the car would slam on the brakes and stop in the middle of the road for no reason (and then, probably, a car would crash into you from behind).
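
The trade-off described above can be sketched as a naive filter: radar returns that close at roughly the car's own speed are stationary, so discarding them suppresses phantom braking on overpasses and buildings, but it discards a stopped truck in the lane for exactly the same reason. The names, speeds, and tolerance here are invented for illustration and do not describe Tesla's actual implementation.

```python
# Hypothetical stationary-clutter filter: anything closing at ~ego speed
# is treated as a fixed object and dropped as presumed sensor clutter.

EGO_SPEED = 30.0  # m/s, the car's own speed (illustrative value)

def is_probably_clutter(closing_speed, tolerance=1.0):
    """Flag a radar return as stationary if it closes at roughly ego speed."""
    return abs(closing_speed - EGO_SPEED) < tolerance

returns = {"overpass": 30.0, "stopped truck": 29.8, "moving car": 5.0}
kept = {k: v for k, v in returns.items() if not is_probably_clutter(v)}
print(kept)  # the stopped truck is discarded along with the overpass
```

The overpass and the stopped truck are indistinguishable to this filter, which is the failure mode the comment describes.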

        The good sensors used by companies like Waymo don’t have that problem. They are very accurate.