• abraham_linksys@sh.itjust.works · ↑116 / ↓6 · 1 year ago

    We need to build special roads so self driving cars can navigate properly.

    You could even connect self-driving cars together: by letting the front car pull the others, they could save their batteries.

    And with these “trains” of self driving cars pulling each other, you wouldn’t have to build the self driving car roads very wide, they could just run on narrow “tracks” for the wheels.

    Then we’d have more space for human stuff instead of car stuff like roads and parking lots everywhere.

    He’s done it again. Elon Musk is a god damn genius.

    • amanneedsamaid@sopuli.xyz · ↑83 / ↓4 · 1 year ago

      Bill the manufacturer, 100%, IMO. That’s why I think self-driving cars raise an unanswerable legal question: when the car drives for you, why would you be at fault? And how will businesses survive if they have to take full accountability for accidents caused by self-driving cars?

      I think it’s almost always pointless to hold back innovation, but in this case I think a full ban on self-driving cars would be a great move.

      • DauntingFlamingo@lemmy.ml · ↑29 / ↓3 · edited · 1 year ago

        The most basic driving, like long stretches of highway, shouldn’t be banned from using AI/automated driving. Fast-paced inner-city driving should be augmented but not fully automatic. Same goes for driving in inclement weather: augmented, with hard limits on speed and automated braking for anything that could result in a crash.

        Edit: I meant this statement as referring to the technology in its current consumer form (what is available to the public right at this moment). I fully expect that as the technology matures, the percentage of incidents will decline. We are likely to attain a largely driverless society one day in my lifetime.

        • snooggums@kbin.social · ↑22 / ↓3 · 1 year ago

          “Self driving with driver assist,” or whatever they call it when it isn’t 100% automated, is basically super fancy cruise control and should be treated as such. The main problem with the term autopilot is that for airplanes it means 100% control, so it’s very misleading when used for fancy cruise control in cars.

          I agree that it should be limited to highways and other open roads, like when cruise control should be used. People using cruise control in the city without being ready to brake is the same basic issue.

          100% full automation with no expectation of driver involvement should only be allowed once it has surpassed regular drivers. To be honest, we might even be there already, given how terrible human drivers are…

          • GonzoVeritas@lemmy.world · ↑22 / ↓1 · 1 year ago

            Autopilot systems on airplanes make fewer claims about autonomous operation than Tesla. No pilot relies completely on autopilot functionality.

          • Amju Wolf@pawb.social · ↑4 · 1 year ago

            Autopilot in aircraft is actually kinda comparable: it still needs a skilled human operator to set it up and monitor it (and the other flight controls) all of the time. And in most modes it’s not even really all that autonomous; at most it follows a pre-programmed route.

              • Amju Wolf@pawb.social · ↑2 · 1 year ago

                They can, but the setup is still non-trivial, and full auto-landing capability isn’t used all that much even when technically available. It also isn’t just about the capability of the aircraft; it requires a shitton of supporting infrastructure on the ground (the airport), and many airports don’t support this.

                That would be equivalent to installing new intersections that also broadcast the current signals for each lane, which would help self-driving cars immensely (and eventually regular cars too, through assistive technologies that help drivers drive more safely), but that’s simply not a thing yet.

        • possibly a cat@lemmy.ml · ↑5 · edited · 1 year ago

          At this point, I vote for whatever is most well-demonstrated to be safe. I like your ideas.

          I was also thinking, maybe a standardized protocol could be implemented where municipalities broadcast a signal containing the local road rules, which could then be interpreted by the car’s processor. With enough bandwidth you could feasibly even give site-specific instructions, like extra braking distance and signal time at a specific intersection (or, you know, the light status lol), road-state characteristics like dryness, or lockout areas with road work or accidents.
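          As a rough sketch of what one such broadcast message could carry (every field and name here is hypothetical, not any real V2I standard):

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical vehicle-to-infrastructure broadcast: a municipality
# advertises local road rules and live conditions for one intersection.
@dataclass
class RoadRuleBroadcast:
    intersection_id: str
    speed_limit_kmh: int
    extra_braking_distance_m: float  # extra stopping margin requested here
    signal_state: str                # e.g. "red", "amber", "green"
    road_dryness: float              # 0.0 = flooded .. 1.0 = fully dry
    locked_out: bool                 # road work / accident: do not enter

def encode(msg: RoadRuleBroadcast) -> str:
    """Serialize for broadcast; JSON stands in for a real radio payload."""
    return json.dumps(asdict(msg))

def decode(raw: str) -> RoadRuleBroadcast:
    """Parse a received broadcast back into a typed message."""
    return RoadRuleBroadcast(**json.loads(raw))
```

          A car’s processor would decode each message and fold it into its planning, e.g. a decoded `locked_out` flag forcing a reroute. Securing such a channel against spoofing would be a whole problem of its own.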

          However, I also think that, for the time being, the driver should ultimately be responsible for the safety of the vehicle’s operation, including when the vehicle is driving itself. The driver has the ability to hit the brake and take control. While the technology is this immature, it is irresponsible for the operator not to supervise it. Levy a hefty fine on the manufacturer for implementing unsafe technology, and a much smaller but still meaningful fine on the operator for unsafe operation.

          Unfortunately, due to the number of vehicles on the roads and the resource and pollution intensity of manufacture and maintenance, the best solution to these problems is to replace personal vehicle infrastructure, not to upgrade it.

          • Amju Wolf@pawb.social · ↑5 · 1 year ago

            I mean that’s a huge issue for human drivers too.

            We need assistive technologies that protect us, but if at any point the driver is no longer driving the car manufacturer needs to take full responsibility.

          • DauntingFlamingo@lemmy.ml · ↑2 · edited · 1 year ago

            That would be the augmented part, plus the AI. ANYTHING that presents a potential hazard already takes a vehicle out of automated driving in most models, because after a few Teslas didn’t stop, people started suing.

        • Dudewitbow@lemmy.ml · ↑2 · 1 year ago

          It’s why I’m all for automated trucking. Truck drivers are a dwindling resource, and the lifestyle of a cross-country truck driver isn’t a highly sought-after one. The self-driving should do the long trip from hub to hub, and each hub should handle the last few miles. That keeps drivers local and fixes a problem that is only going to get worse.

        • amanneedsamaid@sopuli.xyz · ↑1 · 1 year ago

          I disagree. I feel that no matter how good the technology becomes, the odd one-in-a-million glitch that kills someone is not preferable to me over the accidents caused by humans (even if we assume self-driving cars crash at a lower rate than human drivers).

          The less augmentation past lane assist and automated braking, the better, IMO. I definitely disagree with a capped speed limit built into the vehicle; speed should never be limited below the point where engine components could melt or something (and even that limit should take time to kick in). The detriments that system would cause when it malfunctions far outweigh the benefits it would bring to safety.

        • sin_free_for_00_days@sopuli.xyz · ↑3 · edited · 1 year ago

          I’m pretty sure there are autonomous cars driving around San Francisco, and there have been for some time.

          EDIT: Here’s an uplifting story about San Francisco-ians(?) interacting with the self-driving cars.

      • stanleytweedle@lemmy.world · ↑5 · edited · 1 year ago

        I think it’s almost always pointless to hold back innovation, but in this case I think a full ban on self-driving cars would be a great move.

        I agree on both points. Also, I think it’s important to characterize the ‘innovation’ of self-driving as more socio-economic than technological.

        The component systems (sensing, processing, communications, power, etc.) have a wide range of engineering applications, and their research and development will inevitably continue no matter the future of self-driving. Self-driving only solves a very particular socio-economic-technological issue that only exists because of how humans historically chose to address the same issue with older technology. Self-driving is more of a product than a ‘technology’ in my book.

        So my point there is that I don’t think a ban on full self-driving really qualifies as ‘holding back innovation’ at all. It’s just telling companies not to develop a specific product. Hyperbolic example, but nobody would say banning companies from creating a nuclear-powered oven was ‘holding back innovation’. If anything, forcing us to re-envision human transportation without integrating legacy requirements advances innovation more than using AI to patch the problems created by using humans to solve the original problem of how to move people around in cars.

        • amanneedsamaid@sopuli.xyz · ↑4 · 1 year ago

          I see it the same way, but an incredible number of people I’ve discussed this with say that it’s stupid to hold back technological innovation “like self-driving cars”. It’s an unnecessary piece of technology.

          I also just think the whole ethical complication is fucked. The way we have it now, every driver is responsible for their actions, and no driver ever glitches out on the freeway (and if they do, they bear the consequences). Imagine a man’s wife and kids getting killed by a drunk driver vs. by a self-driving car. In one scenario you can clearly place blame, and take action in a much more meaningful way than just suing a car manufacturer.

      • sugar_in_your_tea@sh.itjust.works · ↑2 · edited · 1 year ago

        The responsible party should be the owner of the vehicle, not the manufacturer or passenger. If a company runs an automated ride-share service, for example, that company should be liable. Likewise, if you own a car and use the self-driving feature, you are at fault if it goes wrong, so you should use it at your own risk.

        That said, for the owner to be truly responsible, they need ownership of the self-driving code, as well as diagnostics for them to be able to monitor it. If they don’t have that, do they truly own the car?

        Of course, there’s nothing stopping a manufacturer or dealer from making a deal to cover self-driving fines.

        • amanneedsamaid@sopuli.xyz · ↑2 · edited · 1 year ago

          Well, exactly. I see no way that all the self-driving source code will be FOSS (I don’t think corporations would ever willingly sign onto this). So the responsible party in the case of a malfunction should be the company, because in a full self-driving setup the occupant is not controlling the vehicle and has no reasonable way to ensure the safety of the code.

          • sugar_in_your_tea@sh.itjust.works · ↑2 / ↓1 · 1 year ago

            Which is why it should be dual responsibility. The owner of the vehicle chose to use the feature, so they have responsibility. If it malfunctions when the driver was following the instructions, the manufacturer has responsibility. Both are culpable, so they should share responsibility.

    • schroedingershat@lemmy.world · ↑20 / ↓2 · 1 year ago

      Nah. Give Tesla the same number of points everyone else gets on their license. If the company runs out, no more cars controlled by Tesla on the roads…

      • MeshPotato@lemmy.world · ↑2 · 1 year ago

        We already had that in the ’70s and ’80s. Those were RoRo trains.

        You put your car on a drive-on ramp, go into the comfy cabin, maybe even a sleeper cabin for overnight journeys, get out at the other end, drive your car down the carrier, and explore the area that you’ve journeyed to with the vehicle that you own. Look up the ’80s ABC film about the Ghan railway closing down.

        I live in Australia and love seeing the distance from my home to the centre of the country. Unfortunately, long-distance trains here have become a lifestyle luxury experience rather than transportation. Same goes for bicycles and motorcycles.

  • Rostby@lemm.ee · ↑55 / ↓3 · 1 year ago

    All I want is to see a post not related to Elon Musk.

    • Piecemakers@lemmy.world · ↑24 / ↓3 · edited · 1 year ago

      I dunno. I could go for one about him launching himself to Mars in a carbon-fiber & titanium capsule, piloting via gamepad, ya know? Especially if he brought Bezos, the Koch bros. & Gates along. 🤷🏼‍♂️ It’d save time at least on setting up the ol’ woodchipper down the road, ya know?

  • Arotrios@kbin.social · ↑12 · 1 year ago

    JFC that’s frightening. It blew that red at about 30mph, didn’t even really slow down except for the curve.

    • killall-q@kbin.social · ↑16 / ↓1 · 1 year ago

      Because the car didn’t recognize it as a red light, probably due to all the green lights that were facing a similar direction.

      The issue is not the speed at which it took the turn, but that it cannot distinguish which traffic lights are for the lane the car is in.

      • possibly a cat@lemmy.ml · ↑6 · edited · 1 year ago

        Check out the interface screen, specifically from 0:17 to 0:21. I think the navi is operating in a developer mode. It shows what the FSD senses.

        Interestingly, it seems that it does accurately sense the lights for a moment. But it also erroneously senses them, and the flicker pattern means it probably wasn’t able to come to a confident determination. If that’s the case, this thing should have been built to fail safe.
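        The fail-safe could be as simple as requiring a sustained, confident detection before ever treating a light as green (purely illustrative logic, not what Tesla actually runs):

```python
# Illustrative fail-safe for a flickering traffic-light classifier:
# only proceed if every recent frame agrees the light is green with
# high confidence; any flicker, doubt, or lack of data resolves to "stop".
def decide(frames, threshold=0.9, window=10):
    """frames: list of (label, confidence) tuples, oldest to newest."""
    recent = frames[-window:]
    if len(recent) < window:
        return "stop"  # not enough evidence yet
    if all(label == "green" and conf >= threshold for label, conf in recent):
        return "proceed"
    return "stop"  # disagreement or low confidence fails safe
```

        A flickering detection, i.e. frames alternating between green and red, would never pass the `all(...)` check and so would always resolve to a stop.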

        As the operator states, that’s a highway. They were taking quite the risk to chance it a second time, which is the iteration captured on video.

      • NotMyOldRedditName@kbin.social · ↑5 · edited · 1 year ago

        If you’ve watched any of their recent AI talks, they talk a lot about these unusual and complex intersections, with lane mapping in complex intersections being one of the hardest problems. Currently they’re taking data from numerous cars to reconstruct intersections like this, turning them into simulations, and training on them so the system learns more and more complex situations.

        There really are only 2 options.

        Solve this with vision and AI, or solve this with HD maps.

        But it has to be solved.

      • SheeEttin@lemmy.world · ↑3 · 1 year ago

        If it sees red and green, it should take the safe option and stop until it is sure or the driver takes over.

        • NotMyOldRedditName@kbin.social · ↑4 · edited · 1 year ago

          If it’s unsure, it should do exactly that, but for whatever reason that failed here; it seemed sure.

          I’ve had the car slow down in unsure situations before, so it can and does.

          It just got this one very wrong for some reason.

    • drekly@lemmy.world · ↑10 / ↓1 · 1 year ago

      It blows my mind that they decided not to use LIDAR anymore. Of course it’s getting worse.

  • adhdplantdev@lemm.ee · ↑7 / ↓2 · 1 year ago

    Man, Hacker News is full of people criticizing the poster, saying that he should have disengaged the system so it learns, completely missing the point that FSD should not be considered safe.

  • TGTX@lemmy.world · ↑2 / ↓2 · 1 year ago

    Definitely an unusual intersection, where one street looks like a diagonal merge into another, but the stoplight placement is bizarre, as the driver can see two different light directions at the same time on the approach.

  • AwkwardPenguin@lemmy.world · ↑5 / ↓5 · 1 year ago

    To be fair, it’s a messy intersection with lots of traffic lights; I’m struggling to understand which one is the one to look at. However, I find it hard to believe Tesla actually has the skills to un-beta this shithole.

    • galaxies_collide@lemmy.world · ↑8 / ↓1 · 1 year ago

      That’s the thing, if FSD isn’t advanced enough to handle tricky intersections no matter the circumstance, then it’s not ready for deployment.

  • rusticus1773@lemmy.ml · ↑6 / ↓12 · 1 year ago

    So sick of shit like this getting posted. Of course the software is not perfect; there are so many warnings about it not being independent of driver intervention that it’s crazy. Yet here we are, with the entire internet hating on Musk so much that we have to tear down the evolution of self-driving cars, which is arguably the most complicated computing and programming problem in history. Bring on the downvotes, but for the record: I think Musk is a douchebag, but I can separately appreciate the effort involved in the herculean task of programming cars that drive themselves.

      • rusticus1773@lemmy.ml · ↑3 / ↓9 · 1 year ago

        Oh, I definitely get it. It doesn’t matter how you think it’s marketed; only an idiot would think it could completely drive independent of human input.

        • czech@no.faux.moe · ↑9 / ↓1 · edited · 1 year ago

          My Uber was a Tesla once. The guy was convinced he could text and drive just fine on the highway, and looked at me like I was a total Karen. The average person is an idiot. That’s the crux of the issue.

          • ANuStart@kbin.social · ↑2 / ↓1 · 1 year ago

            OK, I am all in on the Elon hate, but I know you’re lying, because I have a friend with Full Self-Driving, and if you so much as look at the person in the passenger seat, the car flips a shit and will even go as far as to disable Full Self-Driving for the rest of your trip.

            • czech@no.faux.moe · ↑4 / ↓1 · edited · 1 year ago

              Looks like cameras were introduced in 2021. When was my incident?

          • rusticus1773@lemmy.ml · ↑2 / ↓2 · 1 year ago

            Again, the in cabin camera monitors you and makes sure you are paying attention to the road. Otherwise, how can Tesla be held responsible for idiots? Remember that the idea is to DEVELOP self driving software so that the roads can be SAFER (not 100% safe but safer) from idiot humans.

            • czech@no.faux.moe · ↑2 / ↓1 · edited · 1 year ago

              That’s the first time you mentioned a camera; there is no “again”.

              But what does that change if it’s still being advertised as self-driving?

              People are idiots. They rationalize away the safety features, e.g. “oh, it’s not yet approved by regulators, so they must use the driver cam, but Musk has said it’s self-driving, so it’s okay.”

              Remember that the idea is to DEVELOP self driving software so that the roads can be SAFER (not 100% safe but safer) from idiot humans.

              Umm, what? The idea is to turn a profit for shareholders. Musk is not Tony Stark. In fact, Tesla is now behind other self-driving platforms because Musk does not prioritize safety.
              https://www.businessinsider.com/elon-musk-demanded-cameras-over-radar-in-self-driving-cars-nytimes-2021-12

                • czech@no.faux.moe · ↑1 · edited · 1 year ago

                  Found Elon’s number one fan. Thanks for a good laugh.

                  Quoted for posterity:

                  Stop believing what you read without real world experience. Seriously, you’re what’s wrong with social media. You’re so gullible you believe everything you read.

                  Is FSD perfect or even good yet? No. But I promise you it’s better than any other option out there. I’ve personally used ford, Volvo ridden in waymo and own FSD. It’s not close.

                  Don’t pretend to know what Teslas mission is. Because you obviously don’t.

                  Your comments are so laughably ignorant it’s hard to even understand what your point it other than hatred for Musk.

    • Addv4@kbin.social · ↑3 / ↓1 · 1 year ago

      I think it’s not a case of the software not being perfect, but of it actively failing in live environments, where it is critically important that it not fail. If that is an issue, then they need to reach a level of confidence where they don’t need to worry about such failures, and Tesla is apparently not there yet.

      • rusticus1773@lemmy.ml · ↑2 / ↓1 · 1 year ago

        Yet the software disengages if it detects that you are not paying attention. So in reality, when it’s engaged there are actually TWO drivers. How can anyone argue that’s less safe than ONE driver?

  • PenguinJuice@kbin.social · ↑3 / ↓28 · edited · 1 year ago

    He was probably fired because he couldn’t program the thing to stop at a red light.

    Also, who knows when this footage was taken, or whether it was just test footage of issues that have since been ironed out.

        • SpooneyOdin@lemmy.ml · ↑6 · 1 year ago

          Or you could - oh, I don’t know - read the article you are commenting on… it says he was a test operator and not a programmer.

          • PenguinJuice@kbin.social · ↑1 / ↓5 · 1 year ago

            Oh lol, well then yeah, this is like releasing footage of a half-baked game and claiming it’s buggy. Of course it is.

        • Chozo@kbin.social · ↑1 · 1 year ago

          That’s not how burden of proof works.

          Do you have information to back up what you said?

    • iByteABit [he/him]@lemm.ee · ↑2 · 1 year ago

      Just the fact that you think programming a car to stop at a red light is a one-man task is enough to show how little you know about what you’re talking about.

    • Rough_N_Ready@lemmy.world · ↑5 / ↓13 · 1 year ago

      If you had read the article, you’d know his job was “advanced driver assistance systems test operator”. His job was to test the cars, not program them.