A Tesla was in its self-driving mode when it crashed into a parked patrol vehicle that was responding to a fatal crash in Orange County on Thursday morning, police said.

The officer, whose vehicle was struck, was on traffic control duty blocking Orangethorpe Avenue in Fullerton for an investigation into a suspected DUI crash that had left a motorcyclist dead around 9 p.m. Wednesday.

A Fullerton Police Department spokesperson said the officer was standing outside his vehicle around midnight when he saw a Tesla driving in his direction and not slowing down.

  • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 7 months ago

    My Subaru with adaptive cruise control is smart enough not to zoom into the back of a parked car. If my car, with a potato for a CPU, can figure it out, then why can't a Tesla in any mode, with its significantly more advanced computer?

    • halcyoncmdr@lemmy.world · edited · 7 months ago

      The answer is simple: it depends on what data the vehicle is actually using to detect other vehicles and maintain distance from them.

      These systems process a lot of information, and much of it is noisy data that needs to be cleaned of erroneous readings before it can be acted on. Sensors stream a lot of info, and not all of it is accurate. The same is true for a Tesla or any other vehicle, and filtering that data well means a better experience.

      Say your vehicle has a forward-facing radar, and you're driving along the highway when the radar gets a return for a large object 100 feet ahead, where the returns immediately before showed a 300-foot clear zone. Is it more likely that a large object suddenly appeared in front of the car, or that this return is erroneous and the next few returns will show a clear zone again? Overhead signs and overpasses can produce returns similar to a large truck in your lane, for instance. This is one advantage lidar has over radar: more accurate angle measurements at all distances.

      So say the vehicle acts on that return and slams on the brakes because the "object" is only 100 feet ahead at highway speed. Then the erroneous return goes away and there's a clear road again. That's the "phantom braking" I'm sure you've seen various people talk about: the system reacting to an erroneous return instead of filtering it out as a bad reading. Random braking in the middle of a highway is dangerous as well, so you need to minimize that. Is it more likely that a massive wall suddenly appeared directly in front of the car, or that it's a couple of bad readings? The car has to make that determination to decide what to do. And different types of sensors detect things differently: to some sensors, materials like paper are essentially invisible, while metal is clear as day. If the sensor can't detect something, it won't react to it.
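      As a rough sketch of that filtering idea (purely illustrative Python, not any manufacturer's actual logic; the names and thresholds are invented), a persistence check might only treat a close return as real once it shows up in several consecutive frames:

          from collections import deque

          class RadarPersistenceFilter:
              """Only treat a close return as real if it persists across frames."""

              def __init__(self, window=5, required_hits=3, close_ft=120):
                  self.recent = deque(maxlen=window)  # last few range readings (feet)
                  self.required_hits = required_hits
                  self.close_ft = close_ft

              def should_brake(self, range_ft):
                  self.recent.append(range_ft)
                  # One 100 ft return amid 300 ft clear readings is more likely an
                  # overpass or sign artifact than a wall; demand several hits.
                  close_hits = sum(1 for r in self.recent if r < self.close_ft)
                  return close_hits >= self.required_hits

          filt = RadarPersistenceFilter()
          for reading in [300, 310, 100, 305, 300]:       # one erroneous blip
              print(reading, filt.should_brake(reading))  # stays False throughout

      The tradeoff is baked in: the stricter the filter, the less phantom braking, but the longer a genuinely stationary obstacle has to persist before the car reacts.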

      Note that these readings don't involve a camera at all. They inherently work differently than a human driver looking at the road. Many people online want to claim that sensors are more "reliable" or "trustworthy" than vision since there's little processing, you just get a data point, yet sensors produce bad data often enough that a filter is needed to remove it. A camera works like a person: it can see everything, you just need to teach it to identify what it needs to pay attention to and what it can ignore, like the sky, or power lines, or trees passing by on the side of the road. But not the human on the side of the road; it needs to see that.
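      To make the "teach it what to ignore" point concrete, here is a toy sketch (labels and thresholds invented for illustration) of how detections from a vision model might be filtered before the planner ever sees them:

          RELEVANT = {"car", "truck", "pedestrian", "cyclist", "cone"}
          IGNORED = {"sky", "power_line", "tree", "billboard"}

          def filter_detections(detections, min_conf=0.6):
              """Keep only confident detections of classes that matter for driving."""
              kept = []
              for label, conf in detections:
                  if label in IGNORED:
                      continue  # never affects driving, safe to drop
                  if label in RELEVANT and conf >= min_conf:
                      kept.append((label, conf))
              return kept

          frame = [("sky", 0.99), ("tree", 0.8), ("pedestrian", 0.7), ("car", 0.9)]
          print(filter_detections(frame))  # [('pedestrian', 0.7), ('car', 0.9)]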

      Then we get into the fact that various sensors that existed on older vehicles have been removed from newer ones. Things like radar and ultrasonic sensors have been dropped in favor of computer vision via the cameras directly, like a human driver watching the road: going frame by frame to categorize what it sees (vehicles, people, cones, lanes, etc.) and comparing to previous frames to extrapolate motion and relative speed. But cameras have issues of their own, like being blinded, just as a bright light blinds a person. It takes a little time for a camera to adjust its exposure to compensate for a light shining directly into the lens.
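      That frame-to-frame extrapolation amounts to simple differencing. A minimal sketch, assuming a distance estimate already exists per frame (in a real stack it would come from depth estimation or bounding-box scale; the frame rate here is illustrative):

          FRAME_DT = 1 / 36  # seconds between frames

          def relative_speed(dist_prev_m, dist_curr_m, dt=FRAME_DT):
              """Negative result means we are closing in on the object."""
              return (dist_curr_m - dist_prev_m) / dt

          # Lead vehicle tracked at 50.0 m, then 49.7 m one frame later:
          print(f"{relative_speed(50.0, 49.7):.1f} m/s")  # -10.8 m/s closing speed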

      You might suggest using as many sensors as possible then, but that makes it nearly impossible to actually make a decision. Sensor integration is a huge data-processing problem: how do you determine what data to accept and what to ignore when different types of sensors give conflicting results? This is why Tesla is trying to do it all via vision. One type of sensor, roughly equivalent to a human eye but with wider visual-spectrum sensitivity: just classify what's in each frame and act on it. A simple implementation that needs A LOT of data to train it across as many situations as possible.
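      The conflict problem is easy to see in miniature (the trust weights here are made up, just to illustrate): radar says the lane is clear, vision says there's an object, and whatever rule you pick has a failure mode:

          def fused_obstacle(radar_hit, vision_hit, radar_trust=0.5, vision_trust=0.5):
              """Weighted vote over two boolean sensor opinions."""
              score = radar_trust * radar_hit + vision_trust * vision_hit
              return score > 0.5

          print(fused_obstacle(False, True))                    # False: a tie, no braking
          print(fused_obstacle(False, True, vision_trust=0.8))  # True: trusting vision wins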

      And that camera is where we get to emergency vehicles specifically. In my opinion, these emergency-vehicle accidents are likely the camera being blinded repeatedly by the rotating emergency lights, with its exposure shifting up and down every second or so to try to maintain an image it can actually process. As a human, at night, those lights make it hard for even me to see the rest of the road.
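      A toy model of that exposure-hunting behavior (all numbers invented): an auto-exposure loop that corrects only partway each frame never settles while strobes keep flipping the scene between bright and dark:

          def simulate_auto_exposure(scene_brightness, step=0.3, target=0.5):
              exposure = 0.5  # normalized exposure setting
              for b in scene_brightness:
                  perceived = b * exposure
                  # Partial correction each frame; the strobe flips faster than
                  # the controller can converge, so the image keeps swinging.
                  exposure += step * (target - perceived)
                  print(f"scene={b:.1f} perceived={perceived:.2f} exposure={exposure:.2f}")

          simulate_auto_exposure([2.0, 0.1, 2.0, 0.1, 2.0, 0.1])  # bright/dark strobe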

      It’s not like regular drivers never crash into emergency vehicles either; those crashes just don’t make national news, just like the 33 car fires every hour in the US alone.

      It’s not a simple thing, and even your “simple” car is doing a lot to filter the data it gets. It could be using completely different kinds of data for that cruise control than another vehicle does, so in the right circumstances it may react differently.

      For what it’s worth, my Model 3 has rarely had issues with Autopilot acting in any sort of dangerous manner. A few phantom braking issues back when I got it in 2018, but I haven’t had a single one in maybe four years now, even in areas where it would almost always react that way back then. Sometimes a little lane weirdness with old, poorly marked lane lines, or old lane lines still visible alongside the current ones in some areas. It’s pretty easy to tell which situations AP might have issues with once you’ve used it just a few times.