A Tesla was in its self-driving mode when it crashed into a parked patrol vehicle responding to a fatal crash in Orange County Thursday morning, police said.

The officer's vehicle was struck while he was on traffic control duty, blocking Orangethorpe Avenue in Fullerton for an investigation into a suspected DUI crash that left a motorcyclist dead around 9 p.m. Wednesday.

A Fullerton Police Department spokesperson said the officer was standing outside his vehicle around midnight when he saw a Tesla driving in his direction and not slowing down.

  • halcyoncmdr@lemmy.world · 5 months ago

    I thought “autopilot” was only supposed to be used on freeways. And obviously assisted by a human, who should have seen a fucking parked cop car coming and intervened anyway.

    It depends on which system they actually had on the vehicle. It’s more complicated than random people seem to think. But even with the FSD beta, it specifically tells the driver every time they activate it that they need to pay attention and are still responsible for the vehicle.

    Despite what the average internet user seems to think, not all Teslas even have the computer capable of Full Self Driving installed. I’d even say most don’t. Most people seem to think that Autopilot and FSD are the same, they’re not, and never have been.

    There have been 4+ computer systems in use over the years as they’ve upgraded the hardware and added capabilities in newer software. Autopilot, Enhanced Autopilot, and Full Self Driving BETA are three different systems with different capabilities. Anything bought prior to the very first small public closed beta of FSD a couple years ago would need to be replaced with a new computer to use FSD. Installation cost is included if someone buys FSD outright, or they have to pay for the upgrade if they instead want the subscription. All older Teslas however would be limited to Autopilot and Enhanced Autopilot without that computer upgrade.

    The AP and FSD systems are not at all the same, and they use different code. Autopilot is designed and intended for highways and doesn’t require the upgraded computer. Autopilot is and always has been effectively just Traffic Aware Cruise Control and Auto Steer. Enhanced Autopilot added extra features like Summon, Auto Lane Change, Navigate on Autopilot (on-ramp to off-ramp navigation), etc., but has never been intended for city streets. Autopilot itself hasn’t really been updated in years; almost all the updates have been to the FSD beta.

    The FSD beta is what is being designed for city streets, intersections, etc. and needs that upgraded computer to process everything for that in real time. It uses a different codebase to process data.

    • ShepherdPie@midwest.social · 5 months ago

      The spokesperson said that the Tesla was in self-drive mode and **the driver admitted to being on a cellphone at the time of the crash.**

      That seems to answer all the questions about this accident.

      • halcyoncmdr@lemmy.world · 5 months ago

        Not really. “Self drive mode” isn’t the name of either of the driving systems, that could be either Autopilot or the Full Self Driving beta. The spokesperson was for the police department, not Tesla. They’re unlikely to know there is a difference between the two systems, like most people.

        At best, it’s secondhand info from the driver, most likely him saying he was on the phone and the car was driving, which is what he’d likely have said with either system running. I doubt the driver explained the differences between AP and FSD to the police.

        • Takumidesh@lemmy.world · 5 months ago

          I think the above commenter was just pointing out that the driver admitted to not paying attention to the road

    • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 5 months ago

      My Subaru with adaptive cruise control is smart enough not to zoom into the back of a parked car. If my car, with a potato for a CPU, can figure it out, then why can’t a Tesla with its significantly more advanced computer?

      • halcyoncmdr@lemmy.world · 5 months ago

        It’s not that simple; it depends on what the vehicle actually uses to detect other vehicles and maintain distance from them.

        These systems process a lot of information, and a lot of it is pretty bad data that needs to be cleaned to remove erroneous readings before it can be processed. Sensors stream a lot of info, and not all of it is perfectly accurate. The same is true for a Tesla or any other vehicle, and filtering that data accurately means a better experience.

        Say your vehicle has a forward-facing radar, and you’re driving along the highway when the radar gets a return for a large object 100 feet ahead of the car, while the returns immediately before showed a 300-foot clear zone. Is it more likely that a large object suddenly appeared in front of the car, or that this return is erroneous and the next few returns will show a clear zone again? Overhead signs and overpasses can produce returns similar to a large truck in your lane, for instance. This is one advantage lidar has over radar: more accurate angle measurements at all distances.

        So say the vehicle acts on that return and slams on the brakes because the “object” is only 100 feet ahead at highway speed. Then the erroneous return goes away and there’s a clear road again. That’s the “phantom braking” I’m sure you’ve seen various people talk about: the system reacting to an erroneous return instead of filtering it out as a bad reading. Random braking in the middle of a highway is dangerous as well, so that needs to be minimized. Is it more likely a massive wall suddenly appeared directly in front of the car, or that it’s a couple of bad readings? The car has to make that determination to decide what to do. And different types of sensors detect things differently. To some sensors, materials like paper are essentially invisible, for instance, while metal is clear as day. If the sensor can’t detect something, it won’t react to it.
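        The “filter out a one-off bad return” idea above can be sketched as a simple debounce: only confirm an obstacle once several consecutive returns agree. This is my own toy illustration, not Tesla’s or any automaker’s actual code; the window size and distance threshold are made up.

```python
from collections import deque

class ReturnFilter:
    """Toy debounce filter for radar range returns (illustrative only)."""

    def __init__(self, window=3, obstacle_threshold_ft=150):
        self.window = window                # consecutive agreeing returns required
        self.obstacle_threshold_ft = obstacle_threshold_ft
        self.recent = deque(maxlen=window)  # rolling window of recent readings

    def update(self, range_ft):
        """Feed one range reading; True only when an obstacle is confirmed."""
        self.recent.append(range_ft)
        if len(self.recent) < self.window:
            return False
        # Confirmed only if every recent return agrees there's a close object.
        return all(r < self.obstacle_threshold_ft for r in self.recent)

# A single spurious 100 ft return among 300 ft clear-zone readings
# never triggers braking:
f = ReturnFilter()
alerts = [f.update(r) for r in [300, 300, 100, 300, 300]]  # all False
```

        A persistent object (say, three straight returns under 150 ft) would confirm on the third reading. That’s the trade-off: a few readings of extra latency in exchange for ignoring one-off glitches.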

        Note that these readings do not involve a camera at all. They inherently work differently than a human driver looking at the road. Many people online want to claim that sensors are more “reliable” or “trustworthy” than vision since there’s little processing, you just get a data point, yet sensors produce bad data often enough that it has to be filtered out. A camera works like a person: it can see everything, you just need to teach it to identify what it needs to pay attention to and what it can ignore, like the sky, or power lines, or trees passing by on the side of the road. But not the human on the side of the road; it needs to see that.

        Then we get into the fact that various sensors that exist on older vehicles have been removed from newer ones. Things like radar and ultrasonic sensors have been dropped in favor of computer vision via the cameras directly, like a human driver watching the road: going frame by frame to categorize what it sees (vehicles, people, cones, lanes, etc.) and comparing to previous frames to extrapolate motion and relative speed. But cameras have their own issues, like being blinded by bright lights, just as a person is. If a light shines directly into the lens, it takes a little time for the camera to adjust exposure to compensate.
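        The frame-to-frame extrapolation can be illustrated with simple differencing: given a per-frame distance estimate, closing speed is just the change in distance over the frame interval. A hedged sketch; the function name and the numbers are mine, not anything from an actual vision stack.

```python
def closing_speed_mph(dist_prev_ft, dist_curr_ft, frame_dt_s):
    """Closing speed from distance estimates in two consecutive frames.

    Positive result means the object is getting closer.
    """
    feet_per_second = (dist_prev_ft - dist_curr_ft) / frame_dt_s
    return feet_per_second * 3600.0 / 5280.0  # ft/s -> mph

# An object estimated at 88 ft, then 85.8 ft one 30 fps frame later,
# is closing at roughly 45 mph.
speed = closing_speed_mph(88.0, 85.8, 1.0 / 30.0)
```

        In practice a real system would smooth these estimates over many frames, since per-frame distance from a camera is noisy for exactly the reasons described above.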

        You might suggest using as many sensors as possible then, but that makes it nearly impossible to actually make a decision. Sensor integration is a huge data processing problem: how do you determine what data to accept and what to ignore when you get conflicting results from different types of sensors? This is why Tesla is trying to do it all via vision. One type of sensor, roughly equivalent to a human but with wider visual spectrum sensitivity. Just classify what’s in each frame and act on it. A simple implementation, it just needs A LOT of data to train it in as many situations as possible.
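        The conflicting-sensors problem boils down to a decision rule like this toy one (my own illustration, not Tesla’s or anyone’s real fusion logic): when two sensor types disagree beyond some tolerance, there is no obviously safe answer.

```python
def fuse_distance(radar_ft, camera_ft, tolerance_ft=30.0):
    """Combine two distance estimates; None signals an unresolved conflict."""
    if radar_ft is None:
        return camera_ft          # only one sensor reporting: use it
    if camera_ft is None:
        return radar_ft
    if abs(radar_ft - camera_ft) <= tolerance_ft:
        return (radar_ft + camera_ft) / 2.0  # agreement: average them
    return None                   # conflict: which sensor do you trust?

fuse_distance(100.0, 110.0)  # agreement -> 105.0
fuse_distance(100.0, 300.0)  # overhead-sign style conflict -> None
```

        Going vision-only sidesteps this arbitration entirely, at the cost of inheriting every weakness of cameras, which is exactly the trade-off being argued here.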

        And that camera is where we get to emergency vehicles specifically. In my opinion, these emergency vehicle accidents are likely the camera being blinded repeatedly by the emergency lights rotating and the camera shifting exposure up and down every second or so to try and maintain an image it can actually process. As a human, at night, those lights make it hard for even me to see the rest of the road.

        It’s not like regular drivers never crash into emergency vehicles either, they just don’t make national news, just like the 33 car fires every hour in the US alone.

        It’s not a simple thing, and even your “simple” car by comparison is doing a lot to filter the data it gets. It could be using completely different kinds of data than another vehicle for that cruise control, so given the right circumstances it may react differently.

        For what it’s worth, my Model 3 has rarely had issues with Autopilot acting in any sort of dangerous manner. A few phantom braking issues back when I got it in 2018, but I haven’t had a single one of those in maybe 4 years now, even in areas where it would almost always react that way back then. Sometimes a little lane weirdness with old, poorly marked lane lines, or old lane lines still visible alongside the current ones in some areas. It’s pretty easy to tell which situations AP might have issues with once you’ve used it just a few times.

    • NotMyOldRedditName@lemmy.world · 5 months ago

      All cars since mid 2019 have the computer required for FSD.

      At this point that includes the majority of all Teslas ever sold. Somewhere between 750k and 800k of 6 million don’t have the hardware, and of those, 100-200k are upgradeable, maybe more, but the research time isn’t worth it.

      That being said, it still could have been AP and not FSD as the media gets it confused all the time.

      • halcyoncmdr@lemmy.world · 5 months ago

        Is that the actual time cutoff? My 2018 model 3 that came with Enhanced Autopilot was originally said to have the hardware necessary for FSD (Computer 2.5 the car says), but there were updates before FSD became actually available.

        I never considered buying it so I never paid more than cursory attention to all of the different hardware revisions, only major ones like Computer 3, removing radar during parts shortages around COVID, the Ultrasonic sensors, etc.

        Also, I hadn’t realized it had actually been that long since I bought it; without most of the regular time-based car maintenance like oil changes, time has flown by with it. Or that production had ramped up so significantly since I got my Model 3. I knew it had ramped obviously, and that the Model Y launched, but I didn’t realize how significant all of that was when added together.

        • NotMyOldRedditName@lemmy.world · 5 months ago

          Ya, it’s been that long and they’ve made a lot of cars since heh.

          I got mine in H1 2019 and it was HW 2.5, and sometime shortly after that HW3 came out. At the time I knew that was the situation but I wasn’t concerned since they said they’d upgrade it.

          It took a while after HW3 came out to be offered the upgrade though. By the time I was eligible, we were in the peak of early COVID lockdowns and I wasn’t traveling to the not-so-close service center for the upgrade.

          Eventually they did it via mobile service and I got it.