• coffeebiscuit@lemmy.world

    Autopilot beta? People are willing to test betas in cars? Are you insane? Insurance is going to have a field day.

    • zeppo@lemmy.world

      What bothers me is that I have to share the road with people running some braindead Elon Musk software.

        • Uniquitous@lemmy.one

          I was taught to always drive defensively. You never know when someone’s going to get distracted, get stupid, have a stroke… add glitchy robots to the list, it doesn’t make a whole lot of difference.

        • skyspydude1@lemmy.world

          And yet FSD is still worse than the one time I got in the car with an exchange student who had never driven a car before coming to the US and thought her learner’s permit was the same as a driver’s license.

    • elxeno@lemm.ee

      From what I read, Autopilot (AP) just keeps you in your lane, while Full Self Driving (FSD) just switches lanes into oncoming traffic.

      • nomad@infosec.pub

        Funny how George Hotz of Comma.ai predicted this exact same issue years ago: “if I were Elon Musk I would not have shipped that lane change”.

        This issue likely arises because the car’s sensors cannot look “far enough ahead” in the lane it is changing into. That can lead to crashes from behind by much faster cars and, in this case, lane confusion, as the car cannot see oncoming traffic.
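
        A rough worked example of the range problem (all speeds and the maneuver time here are illustrative assumptions, not measured values):

        ```python
        # Back-of-envelope: how much clear road must the sensors see before
        # changing into a lane that may carry oncoming traffic?
        own_speed_mps = 25        # ~55 mph (assumed)
        oncoming_speed_mps = 25   # oncoming car at a similar speed (assumed)
        maneuver_time_s = 5       # assumed time to complete the lane change

        closing_speed = own_speed_mps + oncoming_speed_mps  # head-on closure rate
        required_clear_range_m = closing_speed * maneuver_time_s

        print(f"Need roughly {required_clear_range_m:.0f} m of visibly clear lane")  # ~250 m
        ```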

    • Dr. Dabbles@lemmy.world

      Even better, several people have died using it or have killed someone else. It also has a long history of driving underneath semi-truck trailers. Only Europe was smart enough to ban this garbage.

      • Ocelot@lemmies.world

        FSD has never driven under a truck; that was Autopilot, which is an LKAS system. The incident happened a year prior to “Navigate on Autopilot,” so the car in question was never even able to change lanes on its own. The driver deliberately instructed the car to drive into the trailer.

        FSD beta is currently available in most of Europe and has been for several months.

        • Dr. Dabbles@lemmy.world

          FSD has never driven under a truck

          Yes it has. Well, into the back of one so fast that it went under at least.

          which is an LKAS system.

          So is FSD. 🤣 It’s level 2, bud. You’re really REALLY confused for someone pretending to own one.

          The driver deliberately instructed the car to drive into the trailer.

          Are you saying Josh Brown killed himself? Because if you are, that would be a new repulsive low even for you Elon simps.

    • MrSqueezles@lemm.ee

      The craziest part of the article is just how much effort the author put into collecting data, filing feedback, and really, really hoping that Tesla would pull the videos (they can), before going on to actively try, and succeed, at recreating the problem at high speed next to another car.

    • meco03211@lemmy.world

      Not Auto Pilot (AP). There’s a difference between FSD and AP. AP will just keep you between the lane lines and pace the car in front of you. It can also change lanes when told to. There’s also Enhanced Auto Pilot (EAP), which was supposed to bridge the gap between AP and FSD: it would go “on ramp to off ramp,” so it could switch lanes as needed and get to exit ramps. FSD is the mode where you shouldn’t need to touch anything outside of answering the nag (the frequent prompt to “apply force to the steering wheel” to tell it you are still alive and paying attention)*.

      '* At least I think that’s the same for FSD. I’m only on AP with AP1 hardware. Never had an issue that I’d blame on a “bug” or the software doing something “wrong”.

      • grue@lemmy.world

        Even if those dipshits “opted in,” the rest of us sharing the road sure as Hell didn’t!

      • ours@lemmy.film

        This isn’t just some email web app that may have a few bugs, it’s putting lives at risk on the road. They shouldn’t be able to just label it a beta, overpromise its capabilities, and neglect any responsibility.

      • abcxyz@lemm.ee

        I just can’t understand how regulators all over the world allow these things on the road. How the fuck do you allow the release of potentially deadly (for everyone involved, not just for the user) software en masse for the public to beta test for you… This is not Diablo IV…

      • foo@programming.dev

        “Beta” only means “buggy piece of shit” to people who use consumer software, and even then mostly gamers. In industries where prototypes can kill people, a “beta” product is one that is safe for the intended use. For example, if you invented a new way to do internal scans of people, before you could even test it on humans you would have done extensive testing on animals to learn what works, what doesn’t, and what gives them cancer, and you would have done the modelling to have a strong understanding of whether it is safe for humans.

        Nobody would tolerate a scanner that gave people cancer, oops

  • Flying Squid@lemmy.world

    I’m not especially sympathetic to the Tesla drivers this might kill.

    I’m worried about everyone else.

  • asudox@lemmy.world

    It shouldn’t even have been released for normal people to use in daily life, on real roads full of other cars. This poses a big risk to life if you ask me. I hope countries start banning this feature soon, otherwise many more deaths will happen, and Elon will somehow get away with them. What’s so hard about driving a real car manually? Did you all become fatass lazy people who don’t even have the willpower to drive a car? Ridiculous. ML is experimental, and for a machine it’s amazing, but it isn’t as good as a human YET, so it causes life-threatening accidents. FSD is literally still in beta, and people are driving at full speed on public roads with this beta software.

    • dufr@lemmy.world

      It can’t be used in the EU. It would need to pass a review; Elon has claimed they are close to getting it through, but Elon says a lot of things.

      • Echo Dot@feddit.uk

        Self-driving cars are actually only legal in a few countries. And those countries have tests.

        It’s only the United States that just lets anyone do whatever on earth it is that they want, even if it’s insanely dangerous.

        Everywhere else any car company that’s espousing self-driving tech would actually have to prove that it is safe, and only a few companies have managed to do this and even then the cars are limited to predefined areas where they are sure they’re not going to come across difficult situations.

      • tony@lemmy.hoyle.me.uk

        In its current state it has basically no chance IMO.

        If they’d concentrated on making AP/highway driving smarter first, they might have got that through… there are already rules for that… but cities? I’d love to see the autonomous car that could drive through London or Manchester.

    • Ocelot@lemmies.world

      Humans did not evolve to drive cars. ML did. It drives consistently with no distractions. It is never tired or drunk, and it never experiences road rage. It has superhuman reaction time and can see in a full 360 degrees. It is not about being a lazy fatass; it is about safety. Hundreds of people in the US were killed in car accidents just today, and none of them were from self-driving cars.

      Also, please provide an example of a life-threatening accident caused by FSD.

      • Zummy@lemmy.world

        The article listed two life-threatening near-accidents that were only prevented because the person behind the wheel took over and kicked out FSD. Read the article and then comment.

        • CmdrShepard@lemmy.one

          Hilarious telling them to read the article first when you couldn’t even be bothered to read their question before replying.

          • Zummy@lemmy.world

            I read it just fine. He asked for an example of a life-threatening accident caused by Full Self Driving. I noted that two examples were listed in the article. The ONLY difference was that the driver prevented the accidents by being aware. FSD was going to cause accidents without intervention. I guess in your world people are supposed to do nothing to avoid a major accident. Hilarious that you love FSD so much that you’re willing to defend a billionaire who wouldn’t piss on you if you were on fire. Billionaires are not your friends. FSD is a BETA feature that doesn’t work properly. Take your love somewhere else and away from my comment, because you read it, didn’t understand it, and fired off a reply claiming I didn’t do something I did, because you couldn’t understand me. The next time you want to have a discussion, come prepared, or don’t come at all!

            • CmdrShepard@lemmy.one

              Ah, the “only difference” in your two examples of life-threatening accidents occurring is that no accident occurred in either example? That’s quite the difference if you ask me… this isn’t a level 4 or 5 system, so driver intervention is required. These systems can’t improve without real-world testing, meanwhile a hundred people die on the road every single day. I guess you’d prefer more people die on the road from drunk or distracted drivers than have manufacturers roll out solutions that aren’t absolutely 100% perfect, even if they’re more perfect than human drivers most of the time.

              Your obsession with Musk is clouding your judgment. I made no mention of him, nor do I like or defend him. This tech wasn’t built by Musk, so who gives a shit about him in this discussion?

              • Zummy@lemmy.world

                I am not obsessed with Musk in any form, but the fact of the matter is that when you have FSD systems that fail to do the thing they are supposed to do, maybe it’s not the best idea to roll them out to the entire world. Maybe it’s better to continue with more limited testing. You act as if all drunk/distracted driving will stop when FSD is used, and that simply isn’t the case. Many people still use gasoline-powered cars and drink and drive even though it’s dangerous to do so. Furthermore, FSD will lead to more distracted driving, because people will assume “self driving” means the car will take care of everything and there is no need to be vigilant.

                The plain truth is that while FSD may be the future, rolling it out despite knowing that it isn’t ready is not the solution; it’s irresponsible and will cause harm. The near-accidents that you aren’t concerned with would most likely have killed the driver and probably other people too. Our difference of opinion here is that you believe it’s okay if people die as long as the testing shows there is a chance they won’t die in the future, and I think if anyone dies it’s too much. The feature clearly isn’t ready for prime time and needs more limited real-world testing, but the fact of the matter is that testing doesn’t bring in money.

                Your inability to even consider that a worldwide rollout might not be the best idea right now, since the testing shows the car isn’t ready, shows that you really aren’t arguing in good faith. You have chosen the position that FSD is good and ready even when confronted with articles like the one above showing it isn’t. I would wager that a lot of people want the era of FSD; they just want it when it works. Keep the rollout more limited and do further testing. When mistakes happen, take the time to figure out why and how they can be prevented in the future. You argue testing is needed, but are in favor of a full rollout now even though we need much more limited real-world testing. Both can’t be true. Time to think about what you really want, because I don’t think you know… and accusing any person who doesn’t want a complete rollout of FSD today of having a bias against Musk shows that.

        • Ocelot@lemmies.world

          Teslas have 360 degree dashcams that are recording all the time. Why didn’t they upload the video? I promise you they have it.

          Such a video would go viral pretty easily. It would light a fire under tesla engineering to fix such a dangerous and life threatening situation. Where is it? Why is there never any footage attached to these articles? Why can’t I find a video ANYWHERE of such a thing? Why can nobody in this thread bashing the tech over and over produce any justification for their fear?

          If I were Tesla and I wanted to cover up the dangers of FSD trying to kill people, I wouldn’t give everyone a constantly running dashcam. It would really make them look bad.

          Could it possibly be, just maybe, that the video disagrees with the “journalist” opinion that it was performing dangerously? Could it be that an article that says “Tesla FSD performs admirably, swerves to avoid obstacle that would have caused a blowout” might not get nearly as many clicks and ad revenue? Maybe?

          FSD is aware of where barriers and medians are. If it needs to swerve to avoid an obstacle it will go in whatever direction is safest. Sometimes that means towards a barrier. Sometimes the driver panicking and disengaging and taking over interrupts the maneuver and causes danger that wasn’t otherwise present. We will never know what actually happened because there is no evidence. Evidence that I promise you exists but for whatever reason was omitted.

          If a cop said something outrageous and dangerous happened to them and they say they are completely clear of fault and wrongdoing, would it not be reasonable to want to see the bodycam footage? If for whatever reason the police department says “we don’t have it” “its corrupted” or whatever other excuse would that not raise eyebrows? The same situation applies here.

          There are plenty of youtube channels out there like dirtytesla, whole mars catalog, AI Driver, Chuck Cook, and many others that show and even livestream FSD. None of them have been in an accident, even in very early releases of the beta software. These people are comfortable with the beta and often don’t take over control of the vehicle under any circumstances, even in their torture test scenario.

          Is it at all possible, just maybe, that FSD isn’t as dangerous as you might think? Fear is often a result of ignorance.

          I am extremely open to changing my mind here just show me some convincing evidence. Every tesla is recording all the time so it should be really easy to find some, no?

          I’m sure I’m just a Tesla shill or fanboy, whatever. The truth is I’m just looking for facts. I would like to know why people feel this way and are so afraid of new technology despite overwhelming evidence that it is saving lives.

          • wizardbeard@lemmy.dbzer0.com

            Wow that’s sure a lot of text for someone that didn’t read the article.

            The author states that despite having storage plugged in, he was not given the option to save a recording.

            • Ocelot@lemmies.world

              That’s because it’s a rolling recording. If you explicitly want to save a clip long-term, you honk the horn. This is clearly laid out in the manual, and it’s a setting right on the screen where the dashcam is enabled. This line is a pure cop-out. They had the footage; they just refused to upload it. Possibly they never bothered to check for it, but that would be incredibly irresponsible for anything resembling “journalism.”
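
              For what it’s worth, a rolling dashcam is essentially a ring buffer; here is a minimal sketch of the concept (a generic illustration, not Tesla’s actual implementation):

              ```python
              from collections import deque

              FPS = 30
              BUFFER_SECONDS = 60

              # Keep only the most recent BUFFER_SECONDS of frames; older frames fall off.
              ring = deque(maxlen=FPS * BUFFER_SECONDS)

              def on_new_frame(frame):
                  ring.append(frame)  # once full, each append silently drops the oldest frame

              def on_honk():
                  # The save event (here, a horn honk) snapshots the rolling buffer
                  # to long-term storage before it gets overwritten.
                  clip = list(ring)
                  print(f"Saved {len(clip)} frames to the thumb drive")
              ```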

          • Zummy@lemmy.world

            So you are saying that since the author of this article didn’t upload a video of the events he detailed, FSD has absolutely no issues and is completely safe for every person on the road to use all the time? Seems like quite a leap to me, but what do I know? It seems to me that people here want FSD when it’s ready. You want it now, ready or not. I guess that’s where we disagree. And I don’t really think you are open to anyone changing your mind. I think you picked your position, and come hell or high water you’re sticking to it.

            • Ocelot@lemmies.world

              FSD is not without issues, but yes, lots of people in this thread are implying that FSD is unsafe and causes tons of accidents, and there is absolutely no evidence to back that up. It’s just a “feeling” they have. They believe that it is irresponsible of anyone to use it and that doing so puts others at unnecessary risk. People genuinely believe that I am putting my family and neighbors at serious risk of harm because I’m irresponsible enough to have and use FSD. All I have asked for this entire time is some kind of evidence that it is dangerous. Anything. Help me understand your view. Please.

              The reason I defend it so much at this point is that it has already been demonstrated to be far safer than the average human driver, and it is getting better with every release. With the new V12 and the full neural net it is expected to get far smoother and drive even more like a human, with less code and lower power consumption. We have seen massive improvements in the tech just in the past year, and the rate at which it gets better continues to accelerate. It is impossible to count how many lives it has already saved through accident avoidance. We don’t need misinformed people bashing and casting doubt on this technology, holding it back just because they “feel” a certain way about it. You should absolutely criticize valid concerns, but FFS, please bring some facts and evidence to the table.

              The reason I am confident FSD is safe despite “feelings” is how it’s programmed. For several years prior to even the earliest public beta, the camera and AI system learned how to correctly identify everything on the road: other cars, pedestrians, dogs, cats, babies, telephone poles, traffic cones, whatever. It is now about as accurate as it can be, and any remaining issues are with mislabeling one thing as something else (a car as a truck, etc.). That doesn’t actually matter for self-driving, because effectively the first rule of FSD is: “This is a car, this is a truck, this is a pedestrian, this is a dog; this is where it is, where it is going, and how far away it is… OK? Don’t hit those.” And it doesn’t. Everything else comes secondary.

              It drives like a robot. It obeys traffic laws to a T, and that pisses off other drivers, or freaks out whoever is behind the wheel because the car didn’t do exactly what they would have done in that situation and is therefore “wrong,” so they take over. Often the act of the driver taking over actually puts them in more danger than if they had just let the car finish the maneuver. This shouldn’t be surprising, because humans as a whole SUCK at driving and at making decisions like that. It is sometimes unnecessarily cautious around pedestrians (but honestly, how would you want it to behave?). It might suddenly detect a hazard and swerve to avoid it, possibly moving the car into another unoccupied space. It is fully aware of the space it is occupying and fully aware of the space it is about to occupy. And it doesn’t hit anything. There are lots of YouTube channels that demonstrate this; they upload regularly, stress-test FSD, and try to get it into trickier and trickier situations, and it never hits anything. It acts indecisively sometimes and waits for overly large gaps out of an abundance of caution, but those are the issues that are getting better over time. At no point does it do anything “unsafe,” especially since the wide release of the public beta.

              Imagine, if you would, a world where all cars are like this. The most dangerous part of driving right now, FSD or not, is other drivers. The more people we have using it who understand it and are comfortable with it, the better it gets, and the safer our roads get. I really don’t care how you feel about Elon; he deserves every bit of hate that is sent his way. But FFS, please look at FSD for what it is and what it is becoming. If it helps you feel any better, he was not personally responsible for writing a single line of code or designing any of the components of the system.
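
              To make that “label everything, then don’t hit it” ordering concrete, here is a toy sketch; the names and numbers are purely illustrative and are not Tesla’s actual code:

              ```python
              from dataclasses import dataclass

              @dataclass
              class Detection:
                  label: str         # "car", "truck", "pedestrian", "dog", ...
                  distance_m: float  # how far away it is
                  in_path: bool      # whether it sits in the planned path

              def plan_speed(detections, cruise_mps=25.0, brake_gap_m=30.0):
                  # Rule zero: if an object is in our path and close, stop; everything
                  # else is secondary. Mislabeling a car as a truck changes nothing
                  # here, because "don't hit it" applies to every label equally.
                  for obj in detections:
                      if obj.in_path and obj.distance_m < brake_gap_m:
                          return 0.0
                  return cruise_mps

              scene = [Detection("dog", 120.0, False), Detection("truck", 18.0, True)]
              print(plan_speed(scene))  # 0.0 -- brakes for the truck regardless of its label
              ```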

              All I’ve gotten to “back up” the claims that it is dangerous here is three different articles referencing the exact same incident (the Bay Bridge pile-up). The video clearly shows the car coasting (regen) to a stop and just sitting there. Had emergency braking been engaged, the hazards would have been turned on and the car would have stopped a lot quicker. FSD has never had any history or incident of completely stopping in a lane. Complaints about “phantom braking” usually describe the car slowing down due to a detected hazard which may or may not be present. There is no evidence of this ever happening anywhere else; there are 500k of these cars on the road and no other similar reports. Is that a fault of the software, or is it more likely some kind of user error?

              From my standpoint, having actually used FSD for several years, I can tell you with complete certainty that the car would never behave like that, and there are far too many red flags in that video to reasonably cast blame on the software. Of course, we will see what plays out in the court case once it is completed, but in my professional opinion the driver clearly disengaged FSD, allowed the car to come to a complete stop on its own, and did nothing to move the car out of the way; it had absolutely nothing to do with the software. I’m 100% open to disagreement on that and am curious what a civilized discussion of it would sound like and what someone else thinks happened here, but so far it just turns into a flame war and I get called a deluded fanboy, even a liar, and other names. No evidence, no discussion, only anger.

              Again, here is my point. If FSD is as dangerous as others are implying, then we should see tons of accidents. Given that every single one of these cars has a constantly running 360-degree dashcam, we should see some evidence, right? Maybe not from this specific case; maybe there’s a valid reason why they couldn’t upload it. But surely, with half a million cars on the road, many millions of miles traveled collectively, and more Teslas hitting the road every day, we should at least see something by now, right? There are tons and tons of videos of Teslas avoiding accidents, but nobody wants to mention or talk about those.

              People are focusing all of their energy on one highly suspect negative with nothing to back it up, holding back technology and safety and refusing to have any sort of civilized discussion around it. Their entire perception of how this technology works comes from a few clickbait headlines where they didn’t even bother to read the article. They come here and confidently proclaim that they know for a fact it doesn’t work and will never work, and how dare you even try to bring any facts to the table. It’s as if the discussion is being led by children. It’s not productive, it’s not based on any sort of facts, and it doesn’t go anywhere, and we’re all confused and less informed as a result.

              For example, someone posts an accident that occurred in 2018 as evidence that full self-driving doesn’t work, when FSD didn’t even exist until 2021. If you point that out, there’s no concession, there’s no rebuttal, there’s only anger. Pointing out simple, easily verifiable facts makes you a Tesla fanboy, and therefore any opinion or input you may have on the matter is invalid. “Your mind is already made up and you will never see our point of view!” No, I don’t currently see your point of view, because you don’t have even the most basic facts straight.

              • Zummy@lemmy.world

                It’s clear from what you wrote that you want FSD to be as good as it can be, and I think we can get there, but we aren’t there yet. You say there haven’t been any reports of accidents with FSD save for one, but I don’t know if that’s true, and evaluating it would require some serious research on my behalf. First, I don’t know the number of cars capable of FSD; you said 500k on the road, but provided no evidence, so I can’t say that’s true without independent evaluation. Second, I have no knowledge of how many of those cars actually use FSD. It may be a lot; it may not. You don’t say, and I don’t know. Now, there may be far fewer accidents with FSD, but if the number of vehicles on the road in Q1 is 286 million just in the US (https://www.statista.com/statistics/859950/vehicles-in-operation-by-quarter-united-states/ ), it would stand to reason that FSD appears in far fewer accidents because there are far fewer such cars. You also mention that it has become good at detecting objects, and I think it has, but being able to detect objects and being able to avoid accidents when there are 286 million cars on the road using FSD exclusively every single time the vehicle is in use are two different things.

                The fact is, I do want FSD to be a thing, but when I read an article by someone who says he twice had to take over so the car didn’t kill the driver or others, I start to worry that FSD isn’t ready. And frankly, although there are YouTube channels about electric vehicles that have never brought up accidents, I wonder if they have a reason not to. I’m not sure. Also, I can’t say the big YouTube channels have never talked about this, because I haven’t watched every video they’ve ever posted, and I would have to do that to know if you’re correct.

                I see that you are passionate about FSD, and I think your passion makes you overlook the real discussion going on. People (certainly not all people) generally want FSD to exist for the reasons you stated, but they want to make sure the cars are safe first. And I get that you take a risk every time you drive a car, but from reading this article I get the sense that FSD isn’t ready to be used by every person with a driver’s license. It sounds like the author knew what to do because he had been driving for some time. If he hadn’t, I think the situation could have been very different.

                You talk about the car not doing exactly what the driver would have done, but in the article’s case it was going to crash. I don’t think anyone would have done that. If the car was able to detect the object, why was it going to crash into it? That is something that would need to be investigated. You argue that people talk about FSD being removed/cancelled because they have a feeling it isn’t good, but I haven’t seen that in droves. I’ve seen several people say that they think FSD needs more testing and a more limited rollout.

                I know I didn’t hit all your points; they were quite numerous. I want full self driving, but I want it to be reliable. And I think as long as articles like this are being written, we just aren’t there yet. Yes, keep it coming, but be real about its current limitations.

      • Chocrates@lemmy.world

        Self-driving is not there yet, and it may never get there, but you are right: we can save so many lives if we get this right.

        I don’t know if Musk is responsible enough to be the one to get us there, though.

    • sdoorex@slrpnk.net

      Well, this article is written by FredTesla, who used to mod the TeslaMotors subreddit. Not only did he drink the koolaid, he brewed the damn stuff.

  • sdf05@lemmy.world

    This is like that show “Upload”; the guy literally gets killed by a car

    • III@lemmy.world

      You should finish watching that first episode before making such bold statements.

      • Ocelot@lemmies.world

        I mean, I think it’s still a valid point. The car in the show was sabotaged, and that is definitely something that might become a thing once all cars self-drive, especially once they remove controls like steering wheels.

        There hasn’t been a Tesla FSD hack yet, but one would take spoofing a software update (and spoofing the authentication, certs, etc.)… The attacker would also need access to a pretty massive supercomputer to build their own custom self-driving software, and today getting the certs and everything right is next to impossible… but even then it’s only next to impossible, not impossible.
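
        For context on why spoofing an update is so hard: over-the-air updates are normally protected by code signing. A minimal sketch of the idea, assuming an Ed25519 key pair (a generic scheme, not Tesla’s actual one):

        ```python
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
        from cryptography.exceptions import InvalidSignature

        # Manufacturer side: sign the firmware image with a private key that
        # never leaves the factory. The car ships with only the public key.
        private_key = Ed25519PrivateKey.generate()
        public_key = private_key.public_key()
        firmware = b"self-driving build 11.4.7"
        signature = private_key.sign(firmware)

        # Car side: refuse any image whose signature does not verify.
        def install_update(image: bytes, sig: bytes) -> bool:
            try:
                public_key.verify(sig, image)  # raises if forged or tampered with
                return True
            except InvalidSignature:
                return False

        print(install_update(firmware, signature))           # True
        print(install_update(b"attacker build", signature))  # False
        ```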

          • jabjoe@feddit.uk

            “Attack surface” is the term you want. Big software means big attack surface. So keep code lean for security as well as efficiency.

          • Ocelot@lemmies.world

            There are still a lot of other layers that would need to be compromised beyond the cert for such an attack to even be possible. Even so, I suspect that when such an attack does happen, it will probably be for stealing cars: your car would just wake up in the middle of the night and drive itself somewhere to be cut up for parts. A safety issue is less likely, since it’s so easy to take over control of the car.

        • 8ender@lemmy.world

          Don’t even need sabotage. You already share the road with cars that someone repaired under a tree with the cheapest parts they could find.

          • reddithalation@sopuli.xyz

            oh look, more anti-right-to-repair sentiment.

            no, cars repaired by people other than the manufacturer won’t kill you

            • 8ender@lemmy.world

              I’ve been repairing cars for over 15 years. There’s a massive spectrum for quality on almost any aftermarket replacement part. Literally the same part can range from $50 to $400 and the only difference is quality and durability.

              Sometimes the cheap part is fine, sometimes they cause weird problems. Especially electrical parts.

              • reddithalation@sopuli.xyz

                yeah sure, but that is unlikely to kill you or someone else, and DIY repair is almost always good for the consumer

    • jabjoe@feddit.uk

      Well hold on there, he survived the crash, and would probably have been ok. It was the upload that killed him.

      • sdf05@lemmy.world

        Yeah, my bad 🤣 I meant the car technically endangered him to not live longer 😔

  • Mockrenocks@lemmy.world

    Frankly, it speaks incredibly poorly of the NHTSA that this kind of behavior is allowed. “Beta testing” a machine-learning driving-assistance feature on active highways at 70+ miles an hour is a recipe for disaster. Calling it Full Self-Driving while not putting guardrails on its behavior is false advertising as well as just plain dangerous.

  • fosforus@sopuli.xyz

    I like my Tesla but there’s no way I’ll be switching that thing on. They’re even calling it beta, what the fuck do people think that means?

  • megalodon@lemmy.world

    FFS. He was testing a beta update at 73 miles per hour. Is he really expecting sympathy?

    • spezz@lemmynsfw.com

      Maybe it shouldn’t be released for real-world use with such major bugs then. Don’t give me the crap that iTs DiFfErEnT because Tesla is a “technology company” either. It’s a car; safety features on it should work damn near 100% of the time before it is released.

      • megalodon@lemmy.world

        What crap am I giving? I’m just saying it’s a stupid idea to beta test self driving technology on the highway.

        • hackitfast@lemmy.world

          It’s the same reason we don’t take drugs that haven’t been tested yet. You know, not in lab rats.

          Google treats its users as beta testers all the time. The difference is, a phone won’t kill me when it crashes and reboots.

    • SomeRandomWords@lemmy.blahaj.zone

      I thought all FSD updates were beta updates? Did I miss the announcement of FSD going GA and being stable?

      If that’s the case, then yeah I probably wouldn’t test run a new update on the highway first. But I also have no idea if this issue happens at lower speeds as well.

      • megalodon@lemmy.world

        Isn’t that the issue? He’s using something that’s still in beta on the highway.

        • SomeRandomWords@lemmy.blahaj.zone

          Yes, 100%. Anyone is a fool to use Tesla “FSD Beta” pretty much anywhere. But Tesla markets it as totally safe to use anywhere and everywhere (especially highways), so there’s a point where you have to stop calling everyone who owns a Tesla a fool and acknowledge that the common denominator is Tesla, not the owners’ foolishness.

          • megalodon@lemmy.world

            I didn’t call everyone that owns a Tesla a fool. I questioned whether someone who decides to risk their life to test a feature still in beta deserves sympathy.

  • Ocelot@lemmies.world

    Electrek has a long history of anti-Tesla clickbait. Take this with a grain of salt.

    Teslas come factory-equipped with a 360-degree dashcam, yet we never see any footage of these alleged incidents.

    • silvercove@lemdro.idOP

      Are you kidding me? Youtube is full of Tesla FSD/Autopilot doing batshit crazy things.

        • zeppo@lemmy.world

          Musk just did a 20 minute video that ended with it trying to drive into traffic.

            • zeppo@lemmy.world

              The video ended when he made an “intervention” at a red light. I’m not watching whatever link that is because I’m not a masochist.

              • Ocelot@lemmies.world

                Here’s the specific timestamp of the incident you mentioned, in case you wanted to actually see it: https://youtu.be/aqsiWCLJ1ms?t=1190 The car wanted to move through the intersection on a green left-turn arrow. I’ve seen a lot of human drivers do the same. In any case, it’s fixed now and was never part of any public release.

                The video didn’t end there; it was near the middle. What you’re referring to is a regression specific to the HW3 Model S that failed to recognize one of the red lights. Now, I’m sure that sounds like a huge deal, but here’s the thing…

                This was a demo of a very early alpha release of FSD 12 (the current public release is 11.4.7), representing a completely new and more efficient method of utilizing the neural network for driving, and it has already been fixed. It is not released to anyone outside a select few Tesla employees. Other than that, it performed flawlessly for over 40 minutes in a live demo.

                • midorale@lemmy.villa-straylight.social

                  Other than that it performed flawlessly for over 40 minutes in a live demo.

                  I get that this is an alpha, but the problem with full self-driving is that this is way worse than what users want. If ChatGPT gave you perfect information for 40 minutes (it doesn’t) and then huge lies once, we’d still use it everywhere, because you can validate the lies.

                  With FSD, that failure threshold means a lot of people would have terrible accidents. No amount of perfect driving outside of that window would make you feel very happy.

                • zeppo@lemmy.world

                  It has to perform flawlessly 99.999999% of the time. The number of 9s matters. Otherwise, you are paying some moron to kill you and perhaps other people.
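
                  A quick illustration of why the number of 9s matters (the mileage figure is an assumption for the sake of arithmetic):

                  ```python
                  # Assume ~13,000 miles per car per year and treat each mile as an
                  # independent chance for a serious failure.
                  miles_per_year = 13_000

                  for nines in range(3, 9):
                      per_mile_reliability = 1 - 10 ** -nines   # 3 nines = 99.9%, etc.
                      p_clean_year = per_mile_reliability ** miles_per_year
                      print(f"{nines} nines: {p_clean_year:.2%} chance of an incident-free year")
                  ```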

          • Ocelot@lemmies.world

            I’m sure you’re just going to downvote this and move on without reading but I’m going to post it anyway for posterity.

            First, a little about me. I am a software engineer by trade with expertise in cloud and AI technologies. I have been an FSD beta tester since late 2020 with tens of thousands of incident-free miles logged on it.

            I’m familiar with all of these incidents. Its great that they’re in chronological order, that will be important later.

            I need to set some context and history because it confuses many people when they refer to the capabilities of autopilot and FSD. Autopilot and FSD (Full Self-Driving) are not the same thing. FSD is a $12,000 option on top of any Tesla, and no Tesla built prior to 2016 has the hardware capability to run FSD.

            The second historical point is that FSD did not have any public release until mid-2022, with some waves of earlier releases going to the safest drivers starting in mid-2021. Prior to that it was exclusive to Tesla employees and select few trusted beta testers in specific areas. Any of the issues in this article prior to mid-2021 are completely irrelevant to the topic.

            Tesla’s Autopilot system is an LKAS (lane keep assist system). This is the same as what’s offered by Honda (Honda Sensing), Nissan (ProPilot Assist), Subaru, Cadillac, etc. Its capabilities are limited to keeping you in your lane (via a front-facing camera) and maintaining distance to the car in front of you (via radar, or cameras in later models). It does not automatically change lanes. It does not navigate for you. It does not make turns or take exits. It does not understand most road signs. Until 2020 it did not even know what a red light was. It is glorified cruise control and has always been presented as such. Tesla has never advertised it as any sort of “hands-off” system where the driver does not need to pay attention. They do not allow the driver to lose attention from the road in FSD either, requiring hands on the wheel and constant torque, as well as eyes on the road (via an interior camera), in order to work. If you are caught not paying attention enough times, the system will disengage, and with enough violations it will even kick you out of the program.
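
            That attention-monitoring behavior amounts to a simple escalation state machine. A rough sketch of the concept (purely illustrative, not Tesla’s code):

            ```python
            def attention_tick(torque_ok: bool, eyes_on_road: bool, strikes: int):
                """One evaluation tick of a driver-attention monitor.

                Returns (action, strikes): attentive driving passes through,
                inattention escalates from a warning to forced disengagement.
                """
                if torque_ok and eyes_on_road:
                    return "ok", strikes
                strikes += 1
                if strikes >= 3:
                    return "disengage_and_record_violation", strikes
                return "warn_driver", strikes

            print(attention_tick(torque_ok=False, eyes_on_road=True, strikes=2))
            # ('disengage_and_record_violation', 3)
            ```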

            OK, now that being said, let’s dig in:

            November 24, 2022: FSD malfunction causes 8-car pile-up on Bay Bridge

            • I’m from the area and have driven this exact spot hundreds of times on FSD and have never experienced anything even remotely close to what is shown here
            • “Allegedly” with FSD engaged
            • Tesla FSD “phantom” braking does not behave like this, and never has in the past. Teslas have 360 degree vision and are aware of traffic in front of and behind them.
            • Notice at the beginning of the video that this car was in the process of a lane change, this introduces a couple of possibilities as to what happened here, namely:
            • Teslas do have a feature under Autopilot/FSD where, after multiple warnings for the driver to pay attention with no engagement, the car will slow down, pull over to the shoulder, and stop. This particular part of the Bay Bridge does not have a shoulder, so it would have stopped where it was. This seems unlikely, though, since neural networks are very capable of identifying what a shoulder is and that the car is in an active lane of traffic, and even with Tesla’s massive fleet of vehicles on FSD there are no other recorded instances of this happening anywhere else.
            • This particular spot on the bay bridge eastbound has a very sudden and sharp exit to Yerba Buena Island. What I think happened is that the driver was aiming for this exit, saw that they were about to miss it and tapped the brake and put on the turn signal not realizing that they just disengaged FSD. The car then engaged regen braking and came to a full stop.
            • When a tesla comes to a full stop automatically (an emergency stop) it puts the hazards on automatically. This has been a feature since the v1 autopilot days. This car’s hazards do not come on after the stop.
            • What seems especially weird to me is that the driver continued to let the car sit there at a full stop while traffic piled up behind them. In FSD you are always in control of your own car and all it would have taken is tapping the accelerator pedal to get moving again. FSD will always relinquish control over the car to you if you tap the brakes or grab and move the steering wheel hard enough. Unless there was some mechanical issue that brought the car to a stop and prevented it from moving, in which case this is not the fault of the FSD software.
            • Looking at how slowly the car came to a stop, this very clearly seems to be the car using regen braking, not emergency braking. I’m almost positive this means that FSD was disengaged completely.
            • We don’t have all the facts on this case yet, and I’ll be anxious to see how it plays out in court, but there are definitely many red flags here that have me questioning what actually happened. I doubt FSD had anything to do with it.
            • If my earlier point is true this is actually an instance of an accident being caused because the driver disengaged self-driving. The car would have been much safer if the driver wasn’t even there.

            April 22, 2022: Model Y in “summon mode” tries to drive through a $2 million jet

            • This one is a favorite among the tesla hate community. Understandably so.
            • Smart summon has 0 to do with FSD or even autopilot. It is a party trick to be used under very specific supervised processes
            • Smart summon relies exclusively on the front camera and ultrasonic sensors
            • While smart summon is engaged, the user still has full control over their car via the phone app. If the car does anything unexpected you only need to release your finger from the button and the car stops immediately. The “driver” did not do this and was not supervising the car, the car did not see the jet because it was entirely above the ultrasonic sensors, and as I’m sure you can understand the object recognition isn’t exactly trained on parked airplanes.
            • The app and the car remind the driver each and every time it is engaged that they need to be within a certain range and within eyesight of the car to use it. If you remote control your car into an obstacle and it causes an accident, its your fault, period.
            • Tesla is working on a new version of smart summon which will make this feature more useful in the future.

            February 8, 2022: FSD nearly takes out bicyclist as occupants brag about system’s safety

            • I suggest actually watching the video here. The claim is highly at odds with what is actually in the video, but the vid is just over an hour long, so I bet most people don’t bother watching it.
            • “It wouldn’t have hit them, it definitely wouldn’t have hit them. Do we need to cut that?” “No, you can keep it in”
            • If you look at what was happening on the car’s display, it detected someone entering the crosswalk and stepping out into traffic on the left side. The car hit the brake, sounded an alert and swerved to the right. There was a bicycle in front of where the car swerved but at no point was it about to “nearly take out a bicyclist”. It did definitely overreact here out of safety but at no point was anyone in danger.
            • Relatively speaking this is a very old version of FSD software, just after the first wave of semi-public release.

            December 6, 2021: Tesla accused of faking 2016 Full Self Driving video

            • lol

            March 17, 2021: Tesla on Autopilot slams into stationary Michigan cop car

            • Now we’re getting into pre-FSD Autopilot. See the above comments about the capabilities of Autopilot. Feel free to compare these to other cars’ LKAS systems; you will see that there are still lots of accidents across the board even with LKAS. That is because it is an assist system, and the driver is still fully responsible for and in control of the car.

            June 1, 2020: Tesla Model 3 on Autopilot crashes into overturned truck

            • Again, pre-FSD. If the driver didn’t see the overturned truck in time to stop, then I’m not sure how anyone expects a basic LKAS system to see it for them.

            March 1, 2019: NHTSA, NTSB investigating trio of fatal Tesla crashes

            • This one involves a fatality, unfortunately. However, the car was not self-driving. There is something else very important to point out here:
            • The feature that allows Teslas to change lanes automatically on the freeway (Navigate on Autopilot) was not released until a year after this accident happened. That means, that if AP was engaged in this accident, the driver deliberately instructed the car via engaging the turn signal to merge into that truck.

            May 7, 2016: First known fatality involving Tesla’s Autopilot system

            • Now we’re getting way back into the V1 Autopilot systems, which weren’t even made by Tesla. V1 used a system called MobilEye, made by a third party, which is even less capable than V2 Autopilot.

            So, there we go. FSD has been out to the public for a few years now to a massive fleet of vehicles, driving collectively millions upon millions of miles and this is the best we’ve got in terms of a list showing how “Dangerous” it is? That is pretty remarkable.

            Excited to see your response.

            • naeemthm@lemmy.world

              Interesting: you wrote an entire dissertation on why you think this is all a false flag about Full Self Driving, but it seems to be mostly anecdote and what you think is happening. Being a “software by trade” isn’t enough to face the fact that something fishy is 100% going on with Tesla’s Autopilot system.

              “The last time NHTSA released information on fatalities connected to Autopilot, in June 2022, it only tied three deaths to the technology. Less than a year later, the most recent numbers suggest 17 fatalities, with 11 of them happening since May 2022. The Post notes that the increase in the number of crashes happened alongside a rapid expansion of Tesla’s “Full Self-Driving” software from around 12,000 vehicles to almost 400,000 in about a year”

              https://www.caranddriver.com/news/a44185487/report-tesla-autopilot-crashes-since-2019/#

              You claim the timeline is important here and this is all post-2022.

              • CmdrShepard@lemmy.one

                What’s fishy about it? You realize 40,000 people die every year in car accidents, meaning 110 die every single day, and you’re referencing 17 fatalities spread out over a few years as some big crisis. This tech (from any manufacturer) isn’t going to prevent 100% of accidents, and there’s not much you can do when drivers willingly drive their car into the side of a semi, just like they did before this technology existed.
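
                For scale, here is a crude back-of-envelope using only figures already quoted in this thread; it conflates Autopilot and FSD the same way the article does and is not exposure-matched, so treat it as illustration only:

                ```python
                # Figures quoted upthread; the period and fleet sizes are rough assumptions.
                us_deaths_per_year = 40_000
                us_vehicles = 286_000_000
                linked_deaths = 17          # NHTSA-linked fatalities over roughly 4 years
                tesla_fleet = 400_000       # vehicles with the software, per the article

                fleet_rate = us_deaths_per_year / us_vehicles
                linked_rate = (linked_deaths / 4) / tesla_fleet

                print(f"overall US fleet: {fleet_rate:.6f} deaths per vehicle-year")
                print(f"Autopilot-linked: {linked_rate:.6f} deaths per vehicle-year")
                ```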

                I won’t argue that AP, FSD, or any other system doesn’t have its issues, but most of these responses are overblown sensationalism.

              • Ocelot@lemmies.world

                I am not a “Software by trade”; that was a typo. Believe it or not, I wrote that entire thing on mobile.

                Correlation does not equal causation. Tesla sold far more vehicles in the past two years than ever before. Also, in 2019, 2020, and part of 2021 not a lot of people were driving due to the pandemic.

                And yes, a lot of what I wrote about the first incident was anecdotal, or what I think is happening. Importantly, it is what I think is happening as someone with years and tens of thousands of miles of experience using the FSD beta. I do not have the facts, and, importantly, neither do you. I am interested to see what comes out of that court case, but from where I sit I do not think FSD was involved at all.

                Please let me know where I have misrepresented facts; I will either correct them or cite sources.

                Again, Teslas come with a factory installed 360 dashcam. It records all the time. Where are all of the videos of these FSD related incidents?

      • CmdrShepard@lemmy.one

        The ones from that guy who runs his own competing autonomous-driving company, and who refused to let anyone else perform the test with the car (all of which was proven to be bullshit later, because he was pressing the accelerator pedal)? There’s a lot of misinformation and FUD floating around out there.

        • Ocelot@lemmies.world
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          1
          ·
          edit-2
          1 year ago

          Dan O’Dowd of Green Hills Software. He spent millions of dollars on a Super Bowl ad hit piece, and it backfired spectacularly. Although clearly there are still a few people who believe it. You should listen to the podcast with Whole Mars Catalog of him trying to explain himself. It’s really wild.

          Tesla took him to court and won.

    • kinther@lemmy.world
      link
      fedilink
      English
      arrow-up
      19
      arrow-down
      2
      ·
      1 year ago

      Wishful thinking that Tesla would publicly distribute footage of an accident caused by one of their cars…

      • Ocelot@lemmies.world
        link
        fedilink
        English
        arrow-up
        8
        arrow-down
        10
        ·
        edit-2
        1 year ago

        It’s saved onto a thumb drive; any user can pull the footage off and use or post it anywhere. It never gets uploaded to Tesla, only snapshots and telemetry.

        lol, the anti-Tesla crew will downvote even the most basic facts.

        • kinther@lemmy.world
          link
          fedilink
          English
          arrow-up
          4
          ·
          1 year ago

          But is it technically the user’s data, or is there some clause in Tesla car ownership that says it is Tesla the company’s data?

          Forgive me, I’m ignorant of the fine details. I purchased a Chevy Bolt, but I had been looking into a Tesla as an alternative until Elon tried to be the super-cool Twitter guy.

          • Ocelot@lemmies.world
            link
            fedilink
            English
            arrow-up
            3
            arrow-down
            1
            ·
            1 year ago

            It’s definitely the user’s data. There are a few Tesla dashcam channels out there loaded with footage of other drivers acting like idiots.

    • Dr. Dabbles@lemmy.world
      link
      fedilink
      English
      arrow-up
      19
      arrow-down
      4
      ·
      1 year ago

      Bud, we’ve seen literally thousands of videos of this happening, even from the Tesla simps. You’re seven years behind on your talking points.

        • Dr. Dabbles@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          ·
          edit-2
          1 year ago

          The first public release was much later than the smaller beta, which I had access to. And my reference to seven years was to Josh Brown being killed by Autopilot in 2016.

      • Ocelot@lemmies.world
        link
        fedilink
        English
        arrow-up
        6
        arrow-down
        16
        ·
        1 year ago

        Can you link a few? Something where FSD directly or indirectly causes an accident?

        • Dr. Dabbles@lemmy.world
          link
          fedilink
          English
          arrow-up
          12
          arrow-down
          3
          ·
          1 year ago

          You’re working very hard in this thread to remain in the dark. You could take two seconds to look for yourself, but it seems like you won’t. Hell, they performed a recall because it was driving through stops. Something it’ll still do, of course, but they performed a recall.

          • Astroturfed@lemmy.world
            link
            fedilink
            English
            arrow-up
            10
            ·
            1 year ago

            Elon literally had to hit the brakes manually in a livestream of the self-driving tech because the car was about to go straight through a red light. Like, less than a week ago… SOOOO safe, all the news stories of it killing people are fake!

            • Dr. Dabbles@lemmy.world
              link
              fedilink
              English
              arrow-up
              6
              arrow-down
              2
              ·
              1 year ago

              Yeah. These people aren’t even good liars, but they try their hardest to defend the complete nonsense and lies.

                • CmdrShepard@lemmy.one
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  arrow-down
                  2
                  ·
                  1 year ago

                  And that’s an article about Autopilot, which is a completely separate system. For someone with such strong opinions, you sure seem to lack even a basic understanding of the technology you’re discussing here, but I’m sure you’ll just pull out more insults and keep making references to your current obsession, Musk, as if that makes your argument any more credible or factual.

            • Ocelot@lemmies.world
              link
              fedilink
              English
              arrow-up
              2
              arrow-down
              12
              ·
              edit-2
              1 year ago

              The early alpha build that isn’t part of the public release? That video? The one with the known regression in the Model S?

              That video was a demo of the new FSD beta 12 software, the first time a neural network was in complete control of the car, resulting in a massive reduction in code and much smoother driving overall. Did I mention the part where it was unreleased to the public? Maybe there’s a reason for that?

              Other than that, the car performed flawlessly during the entire 40-minute drive.

          • Ocelot@lemmies.world
            link
            fedilink
            English
            arrow-up
            4
            arrow-down
            9
            ·
            edit-2
            1 year ago

            The recall was most definitely not for “driving through stops”. It was to fix the behavior of doing a “rolling stop”, which is something 99.5% of drivers do, and which is how it learned to do that in the first place. Where do you see that it still doesn’t make a complete stop at stop signs?

            https://www.forbes.com/sites/bradtempleton/2022/02/01/feds-make-tesla-remove-rolling-stops-its-a-terrible-decision/?sh=67b344722111

            I’m not trying to remain in the dark here; I’m just presenting facts. I’m very open to changing my mind on this situation entirely, just give me the facts. You said there were thousands of these videos; I’m only asking for evidence. Instead, I just get downvoted, and nobody posts any of it.

            I’m an AI professional and have been an FSD beta tester for almost 3 years, with tens of thousands of miles logged. How can I possibly be the one “in the dark” here?

            • Dr. Dabbles@lemmy.world
              link
              fedilink
              English
              arrow-up
              8
              arrow-down
              3
              ·
              1 year ago

              “rolling stop”

              Or put another way by someone not desperate for Elon’s attention, not stopping. Driving through stops.

              Where do you see that it still does not make a complete stop at stop signs?

              Signs, lights, it’ll gladly not stop for any of them. Where do I see it? Real life. Actually owning one of these foolish gadgets for 5 years. Where do you see your examples?

              Also, don’t send me Brad Templeton opinion pieces; he’s a complete hack and has outed himself as such many times. He does have a nice video explaining what he thinks of Tesla stans like you, though. Did you watch that one, or do you only link his material when it’s convenient?

              I’m just presenting facts

              No, you aren’t; you’re presenting a curated social media marketing campaign. Congrats, you fell for the ad. Do you think that beer is going to make you more attractive, too?

              I’m very open to change my mind on this situation entirely

              Ok. Tell us what evidence it would take for you to completely change your mind on this and realize Elon is a hack running a dangerous con, with low-quality software being released to cars in the US and Canada. What evidence would you require to change your mind and accept that Tesla doesn’t properly test releases before they go out to customers?

              I’m an AI professional

              This has absolutely zero bearing on anything except that you’re probably extremely susceptible to Elon’s outright lies.

              have been an FSD beta tester for almost 3 years

              Doubt.

              How can I possibly be the one “in the dark” here?

              The term is “delusion”.

              • Ocelot@lemmies.world
                link
                fedilink
                English
                arrow-up
                2
                arrow-down
                7
                ·
                1 year ago

                Let’s not resort to name-calling or personal attacks here. You stated there are “thousands of videos” of FSD-related accidents; I only asked for a few examples. Please tell me where it is you’re getting this information. Help me change my mind.

                Do you understand what a “rolling stop” is? It’s when you don’t come to a complete stop at a stop sign: you slow down to 0.5 or 1 mph, check both ways, and move through. It has been studied time and time again, and practically NOBODY on the road comes to a full and complete stop at stop signs. That is how the FSD beta behaved in earlier releases, because that’s how it learned to drive. NHTSA said the cars had to come to a complete stop, so Tesla fixed it. You again said that Teslas are still rolling through stop signs, and I’m once again asking where you got that information.

                “Actually owning one of these foolish gadgets for 5 years”: I’m guessing you’re trying to say you own a Tesla? You clearly don’t have FSD, because if you did you’d know that it makes full stops 100% of the time. I certainly have doubts you actually own a Tesla, because if someone spends 40-60 grand on something they consider a “foolish gadget”, why on earth would they hold on to it for so long? Just sell it and get something else, move on with your life, and don’t bash people who like their cars.

                I’ve asked you three times now to present evidence: video evidence of FSD doing dangerous things. Given that all Teslas have 360° dashcams that are constantly recording, and that we live in an age where sharing video is this easy, that really shouldn’t be a big ask if FSD is as dangerous as you’re implying. These incidents should be happening daily. That is what would change my mind. What would change yours?

                I bought my Model Y in 2020 with FSD and emailed Tesla for early beta access based on my engineering experience and the part of the country I’m located in. They granted it almost a year later, and I’ve been driving with it almost every day since. Why on earth would you doubt that? Do you need some kind of evidence?

                • Dr. Dabbles@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  8
                  arrow-down
                  2
                  ·
                  1 year ago

                  Lets not resort to name calling or personal attacks here

                  If you’re going to be a liar, I’m going to call you one.

                  Someone has already provided you samples, and you still contorted yourself trying to deny their existence. That shows the caliber of person you are.

                  Do you understand what a “rolling stop” is?

                  Do you understand what a red light and a stop sign are? See, traffic control devices are to be obeyed properly, and creating software that intentionally breaks the law is irresponsible at best. Only a clown would attempt to defend this. Meanwhile, you’re ignoring Elon’s own video from a week ago because it instantly disproves your insane position.

                  I’m guessing you’re trying to say you own a Tesla?

                  I did. I learned my lesson and am a proud one-and-done former tesla owner.

                  You clearly don’t have FSD

                  I did. Swing and a miss.

                  it makes full stops 100% of the time

                  Except, you know, the fucking recall proves it didn’t. And it still doesn’t after the recall. And of course, it misses traffic control devices frequently, ignores them at speed, attempts to pull through them when stopped, etc. Please, do yourself a favor and end this now. Lying to me isn’t going to work.

                  if someone spends 40-60

                  2018 P3D with performance package. More like 70+, with EAP from the factory, and the $2k FSD upgrade when Elon was busy being an idiot about pricing. If you have any questions for someone that’s actually owned one, I’d be glad to answer them for you.

                  why on earth would they hold on to it for so long?

                  Waiting for my replacement.

                  don’t bash people who like their cars.

                  I didn’t. I bashed you for being a liar.

                  I’ve asked you, now 3 times now to present evidence.

                  I asked what evidence would change your mind, and I see you entirely dodged that question. Because there is none. There’s nothing that would change your mind, because your mind is made up. It’s religion, and you don’t convince someone their religion is nonsense. I’m not surprised, of course. All liars behave like this- they pretend there’s something that could completely shift their world view, and change a core piece of their identity… like simping for Musk. But deep down, they know. There’s no such evidence. The racism, the sexual assaults, the financial grift, the hard right bullshit, the transphobia and homophobia, none of that changes your mind. The untested nature of AP and FSD, the release of “smart” summon that immediately started crashing into things, the fact they sent engineers down to Chuck Cook’s intersection for three months to program a single behavior. None of that sinks in when you believe in the religion of Tesla.

                  I bought my model Y in 2020

                  lmao, so you absolutely didn’t have FSD longer than me. Delightful. Hysterical and delightful.

                • Flying Squid@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  5
                  arrow-down
                  1
                  ·
                  1 year ago

                  Do you understand what a “rolling stop” is?

                  I sure do. I got pulled over for doing one. Because they’re not legal.

    • DingoBilly@lemmy.world
      link
      fedilink
      English
      arrow-up
      18
      arrow-down
      5
      ·
      1 year ago

      Given your posts and rampant Tesla fanboyism, I honestly wouldn’t be surprised if you’re Elon himself just anxiously trying to save face.

      Then again, Elon would just publicly spout misinformation about it all, so it probably isn’t. Still, it’s surprising that people are so obsessed with Tesla that they can’t take the bad with the good.

      • Ocelot@lemmies.world
        link
        fedilink
        English
        arrow-up
        6
        arrow-down
        11
        ·
        1 year ago

        All I’m asking for is some evidence of the bad. Nobody can provide it. It really shouldn’t be that hard.

        • DingoBilly@lemmy.world
          link
          fedilink
          English
          arrow-up
          7
          arrow-down
          1
          ·
          1 year ago

          You were provided evidence, and you disregarded it and made excuses for it. It’s hard to have a discussion if you just exclude all the evidence.

          Think of it another way: you’re saying there’s absolutely no way that FSD has ever failed in its publicly available software, even with hundreds of thousands of cars on the road? Run a logic test on yourself and ask whether that’s realistic.

          • Kage520@lemmy.world
            link
            fedilink
            English
            arrow-up
            4
            ·
            1 year ago

            FSD makes a TON of mistakes. I’ve had the beta since the first public release. I don’t trust it to do anything more than lane holding and cruise control, with maybe some supervised lane changes. But it’s a beta. I understand that I am helping to test beta software.

            FSD in its current form should not be given to everyone. Tesla had it right when they gave it only to proven drivers (okay, it would have been better to test with paid employees, but I digress).

            FSD right now is like handing the keys to your 15 year old child and going to sleep in the back while they drive you home.

          • CmdrShepard@lemmy.one
            link
            fedilink
            English
            arrow-up
            2
            arrow-down
            2
            ·
            1 year ago

            Can you point to this evidence as I don’t see it anywhere?

            Also, busting out a strawman argument one reply into the discussion isn’t a good sign for the strength of your argument.

            • DingoBilly@lemmy.world
              link
              fedilink
              English
              arrow-up
              2
              ·
              edit-2
              1 year ago

              Look at the replies. I’m not going to sit and hand-pick them out for you; there are plenty in there. Plenty of people are either posting videos or giving firsthand accounts of issues with FSD.

              Not sure where you see a strawman either. But whatever: if you aren’t seeing any evidence despite the many posts, and don’t see how impossible a perfect record is, then you won’t be convinced by any evidence regardless, or you’re just a troll.

              • CmdrShepard@lemmy.one
                link
                fedilink
                English
                arrow-up
                1
                ·
                edit-2
                1 year ago

                I have looked at the replies, and there are only a couple of links about Autopilot crashes from users who think it’s the same as FSD when it isn’t.

                Your strawman is claiming that this user is saying FSD is perfect and has never had a failure. Nobody is arguing that. You guys keep mentioning all the deaths related to FSD, yet nobody has been able to provide a single one as evidence.

                • DingoBilly@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  ·
                  1 year ago

                  I tried, mate. You can lead a horse to water, but you can’t force it to drink.

                  I guess you’re right, then: if we exclude all the videos, since they’re not FSD-related, and all the written testimonials about FSD not working, because they’re just written after all, then it looks like FSD is indeed perfect and there’s no possible evidence of malfunctions. Praise be to Elon.

          • Ocelot@lemmies.world
            link
            fedilink
            English
            arrow-up
            2
            arrow-down
            5
            ·
            edit-2
            1 year ago

            Please let me know where I stated anything inaccurate in the comment about the single incident that has been dug up among the 500k FSD cars and the millions of miles traveled self-driving.

            Also, let’s please keep this civil and avoid name-calling. I hate Elon as much as anyone else, and he deserves pretty much all the hate he gets. However, that doesn’t change the facts. It’s not like he was responsible for writing even a single line of code in FSD, or designed or built any of the cars himself.

    • Astroturfed@lemmy.world
      link
      fedilink
      English
      arrow-up
      13
      arrow-down
      2
      ·
      1 year ago

      Ah yes, there’s no readily available footage of the dead bodies flying into the street or being crushed under the wheels, so it’s made up. Of course.

      • Ocelot@lemmies.world
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        16
        ·
        edit-2
        1 year ago

        Not all accidents are that violent. I would even accept a video of a simple fender bender to prove that the FSD beta causes accidents with any sort of frequency. Those should be pretty common if FSD is as dangerous as a lot of people are implying, right?

        • Flying Squid@lemmy.world
          link
          fedilink
          English
          arrow-up
          7
          ·
          edit-2
          1 year ago

          Wait, are you now suggesting you won’t accept that Teslas with FSD ever get into accidents without video evidence? FSD is perfect?

          • Ocelot@lemmies.world
            link
            fedilink
            English
            arrow-up
            3
            arrow-down
            7
            ·
            edit-2
            1 year ago

            No, I would never suggest that. The overwhelming consensus here is “FSD is dangerous. More dangerous than humans.” I’m asking for any proof of that here. So far, nothing. If they were getting into accidents all the time, there would be all kinds of footage, no? The fact is that even at this beta stage it’s already safer than human drivers. That apparently rubs people the wrong way for some reason. Don’t we all want safer roads?

              • Ocelot@lemmies.world
                link
                fedilink
                English
                arrow-up
                2
                arrow-down
                3
                ·
                1 year ago

                Do you have any footage to share of FSD fender benders? If not, how can you even claim it’s dangerous? Every car equipped with FSD hardware has 360° dashcams. It should be really easy to find some footage where FSD is at fault for an accident.

        • Astroturfed@lemmy.world
          link
          fedilink
          English
          arrow-up
          7
          arrow-down
          2
          ·
          edit-2
          1 year ago

          Look, I don’t like children either, but wanting more child-mowing cars out on the road is pretty twisted.

    • LibertyLizard@slrpnk.net
      link
      fedilink
      English
      arrow-up
      8
      arrow-down
      3
      ·
      1 year ago

      Hilariously, I’ve also seen them accused of a pro-Tesla bias. Personally, I think they’re pretty balanced.

      • Dr. Dabbles@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        ·
        1 year ago

        They are for sure not balanced. Alfred might have become more realistic about Elon and his bullshit once it became a guarantee he would never get his Roadster. That doesn’t mean he’s balanced.

        • LibertyLizard@slrpnk.net
          link
          fedilink
          English
          arrow-up
          9
          arrow-down
          1
          ·
          edit-2
          1 year ago

          And this opinion is based on what? Obviously every online news source is concerned with increasing readership. But I’m not aware of any consistent factual issues in their reporting.

            • LibertyLizard@slrpnk.net
              link
              fedilink
              English
              arrow-up
              9
              arrow-down
              1
              ·
              edit-2
              1 year ago

              Honestly, the quality of journalism in this article is pretty low. Some of the points are valid, but most are just nitpicks about the little opinion pieces at the ends of the articles. I don’t find those particularly valuable, and they sometimes contain some bad takes, as pointed out here, but that’s not an issue of factual reporting. So the worst they’ve identified is a few minor omissions, which, sure, but if you write thousands of articles that’s going to happen.

              And by the way, this article is making the case that Electrek is deliberately biased towards Tesla, not away from them. So if anything it undermines your point.

              I think the scandal about car referrals was pretty suspicious, but again, when you look at their reporting, it comes across as pretty balanced. Perhaps you could argue they talk too much about Tesla, but they cover the good and the bad. And I would say almost everyone in America has been talking about Tesla too much for quite some time.