it seemed fine at the start, then it suddenly pulled to the hard right.
Fast, artistic, high-definition, simply stunning
“FSD” should always be in quotes. What a joke from a company run by a mentally ill self-proclaimed Nazi.
Full Sorta Drives
Fucking Shit at Driving
They made it about 2.5% of the planned trip on Tesla FSD v13.9 before crashing the vehicle.
A ha ha ha ha
and how much of the $2T promised did Edolf Twitler cut from government spending?
shareholder-influencers
I did not know my body could feel this revolted
Have you heard that they call themselves “rebellionaires”?
Rebellends
I don’t care what shareholders do behind closed doors, but they shouldn’t be allowed to shove it down our throats in public where children can see it!
(Obviously sarcasm, they shouldn’t be allowed to exist behind closed doors either)
Well, the core of every tech-related mania since at least the dot-com boom in the late ’90s has been people invested in the mania passing themselves off as just folks giving friendly advice online, trying to convince others to jump on the bandwagon so their own stakes go up in value.
This kind of shit has been more than normalized for decades.
The only unusual thing here is that they’re open about having an investment in TSLA.
I can’t hear you over me holding my tulips so tightly my knuckles are cracking.
The internet and social media have given everyone worldwide voices, and that’s both good and bad.
And there is so much pessimism for the future that lots of people are willing to sell their souls to cash out while they’re still here.
Both this incident and one from the author’s own experience point to the same thing: when you’re relying on Autopilot, even when you see something well in advance, you hesitate to react because you expect the car to do it for you.
I’ve always felt the myriad safety features that protect the driver through corrective inputs do more harm than good. If you rely on your lane assist, adaptive cruise control, and proximity sensors, you aren’t prepared to react when they fail.
You shouldn’t be under the impression that a car will save your life. You should always have the mindset that you are responsible for the vehicle. If someone hit my small car because a sensor failed on theirs, I don’t give a shit that their system failed. They’re still the responsible driver.
If you rely on your lane assist, adaptive cruise control, and proximity sensors, you aren’t prepared to react when they fail.
Yes! They’re making people lazy and inattentive behind the wheel.
It’s basically LLM brain rot, but for driving.
The safety systems would work perfectly if the cars could communicate, or more robustly if they could be mechanically linked together for the “easy” (highway) portion of the drive. Imagine a lane with nose to tail cars all doing exactly the same thing. At exits (predefined stops) you could get off to change or stay on the same one. Put a little station there with bathrooms and food.
Maybe we could even replace that highway lane with steel tracks and the tires with steel wheels for lower friction.
Damn you just always re-derive the train
Simply linking the cars wouldn’t be enough to address an issue like this though. They still need to individually recognize something like the debris this car ran over and deal with it appropriately.
If cars are linked to share data like this then I can easily see a scenario where one model of car with really good sensors sends a warning saying “hey, there’s road debris here”. But subsequent cars still need to be able to see it and avoid it as well. If the sensors in a following car aren’t as good as the sensors in the first car then that second car could still strike it.
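That handoff could be sketched as a tiny V2V message with a staleness decay. Everything here is hypothetical (the message fields, the 30-second decay window), just to illustrate why a broadcast warning is advisory at best and the following car still has to re-detect the debris with its own sensors:

```python
from dataclasses import dataclass

@dataclass
class DebrisWarning:
    """Hypothetical V2V broadcast from the first car to spot debris."""
    lat: float
    lon: float
    sent_at: float     # epoch seconds when the sender detected the debris
    confidence: float  # sender's sensor confidence, 0.0-1.0

# Hypothetical window after which the report is worthless: each strike
# can move or fragment the debris, so old reports decay fast.
STALE_AFTER_S = 30.0

def trust_level(w: DebrisWarning, now: float) -> float:
    """Decay the sender's confidence with age. A following car should
    treat anything below its own detection threshold as 'look again,
    yourself' rather than as a reason to swerve blindly."""
    age = now - w.sent_at
    if age >= STALE_AFTER_S:
        return 0.0
    return w.confidence * (1.0 - age / STALE_AFTER_S)

w = DebrisWarning(lat=49.2, lon=-123.1, sent_at=0.0, confidence=0.9)
print(trust_level(w, now=15.0))  # 0.45: half the window gone, half the trust
print(trust_level(w, now=45.0))  # 0.0: stale, must rely on own sensors
```

So even a perfect warning channel only tells the follower *where to look*; a car with worse sensors can still fail to confirm the hazard in time, which is the weak link the comment above points at.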
Debris doesn’t remain stationary. Each vehicle that hits it will move it, possibly break it into multiple pieces, etc. And eventually, either through that process or by a person moving it, it will cease being a hazard.
I was just jokingly rederiving a train instead. I think automated cars are mostly really silly.
We have an interesting highway near me where the HOV lane reverses direction for morning/evening commutes. When I come home from my son’s and it’s going the opposite direction, the stupid car would happily plough through the multiple striped lift arms with red ribbons and flashing red lights at the entrance.
They were idiots to be driving with AP/FSD and to have waited as long as they did to intervene.
This is the massive gulf between level 3 and 4 systems, and why level 3 is potentially dangerous.
I’m a very firm believer that safety features should be annoying and uncomfortable. Your lane assist needs to beep loudly every time it moves you back, not only keeping you safe but also indirectly conditioning you to stay between the lanes to avoid the annoying beep.
My dad’s Mercedes indeed beeps incredibly loudly (anyone sleeping immediately wakes up in a panic) if the blind spot sensor goes off… which it does as soon as you put your blinker on.
Guess what that wonderful bit of tech taught my dad to do? That’s right, don’t use the blinker to change lanes if you don’t want your eardrums blown out.
The fundamental problem is that car manufacturers aren’t being held liable for the accidents caused directly or indirectly by these “safety” systems. There is zero oversight and no mandate to investigate false positives of these systems, even when they cause an accident. The end result is that for the manufacturers the point is not to improve safety but to do obnoxious safety theater so regulators look away from rising pedestrian deaths. “Sure our cars are one ton heavier, but they have automatic braking soooo we’re good right?”
Who knows whether these gadgets actually do anything, or whether they even decrease overall safety. The manufacturer gets positive marketing, throws the regulator off their scent, and isn’t held liable for shit when the “safety” system fails or encourages bad habits. Win-win-win. Except the general public loses. But who ever cared about those schmucks?
I have a particular gripe against lane keep assist. When it was active on cars I’ve rented on the mountain passes just outside the Vancouver area, it went off way too often: the lines would get blurry, or you’d have to drift toward the shoulder a bit to stay clear of oncoming trucks around a curve. And when giving space to cyclists on the shoulder, you (after checking, of course) move toward the centre just a tad.
Making these features more annoying would lead to alarm fatigue more than better behaviour.
I had to turn off the lane assist in our Mazda for that reason. It was constantly steering me back toward obstacles I was trying to avoid. I cursed it many times.
Other false alarms are frequent enough that I’m starting to ignore the alarm, so when it actually catches me in a mistake, I’ll probably ignore it then, too, and be in a crash.
That’s an easy argument to make but the reality is not that simple. Determining how many accidents are caused by these systems is much easier than determining how many accidents they have prevented. When an accident happens there’s something that can be investigated. There’s data. But when the system saves you and you go on your merry way, it’s never reported anywhere. The statistics have a very extreme bias here.
Accidents per mile driven takes successful trips into account.
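A toy calculation (all figures invented, purely for illustration) shows what a per-mile rate does and doesn’t capture: the uneventful miles land in the denominator, but the numerator still can’t distinguish crashes the system caused from crashes it merely failed to prevent.

```python
# All figures below are invented, purely for illustration.
assisted_miles, assisted_crashes = 1_000_000, 4
manual_miles, manual_crashes = 1_000_000, 6

def crashes_per_million_miles(crashes: int, miles: int) -> float:
    """Rate metric: every successful trip counts via the mileage denominator."""
    return crashes / miles * 1_000_000

assisted_rate = crashes_per_million_miles(assisted_crashes, assisted_miles)
manual_rate = crashes_per_million_miles(manual_crashes, manual_miles)

# Assisted driving "looks safer" (4.0 vs 6.0 per million miles), but that
# comparison can't tell whether the 4 crashes were *caused by* the system
# or merely *not prevented by* it; sorting that out takes per-crash
# investigation, which is the asymmetry the comment above describes.
print(assisted_rate, manual_rate)
```

So both comments are right in part: the rate metric does credit the system for uneventful miles, yet the attribution question (caused vs. not prevented) still needs the kind of investigation that only happens after a crash.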
This will sound ridiculous and I’m not claiming it’s even a valid feeling, but I’d rather die by my own hand with my input being involved than to have a safety system fail and have no involvement from me.
At least then I know there was some action I could have maybe taken to prevent it. But when it’s a safety system (still under heavy development) that fails, I’d feel way more cheated. Someone convinced me I would be safe and now they’ve lied.
However, after returning it to the Turo owner and having the suspension damage evaluated by Tesla, the repair job was estimated to be roughly $10,000. I wouldn’t be surprised if there’s a similar situation with this accident.
Hmm. That makes me wonder.
Like, it’s hard for me or for Joe Blow to evaluate how effective a car company’s self-driving functionality is. Requires expertise, and it’s constantly changing. And ideally, I shouldn’t be the one to bear cost, if I can’t evaluate risk, because then I’m taking on some unknown cost when purchasing the car.
And the car manufacturer isn’t in a position to be objective.
But an insurer can do that.
Like, I wonder if it’d be possible to have insurers offer packages that cover cost of accidents that occur while the car is in self-driving mode. That’d make it possible to put a price tag on accidents from self-driving systems.
Yes, if and only if the car manufacturer is the one paying for it. Otherwise the buyer is still taking on an unknown cost when buying the car in the form of an unknown number of insurance premiums.
The odds of this happening are of course zero.
Wouldn’t that be a warranty issue? The car is sold with self-driving; if self-driving caused an accident, that’s because the product sold didn’t behave in the manner expected. This is of course only really valid for vehicles sold as fully self-driving and not as an add-on.
Though this really only applies in places with strong consumer protection laws lol.
Warranty is provided for a limited time only, though. Would you be comfortable riding a self-driving car past that warranty, when the self-driving software is no longer receiving updates and you’ll be the one who has to pay for any damage it causes?
Yes? People use driver aids today without warranty and many cars are on the road past their warranty.
Perhaps there could be a high insurance premium if the system seems insufficient, yet that might not stop people either. People are lazy and not very logical.
The cars that are on the road today past their warranty aren’t likely to decide to cause an accident. Mechanical failures leading to accidents are very rare. Self-driving technology has more examples of accidents caused than accidents prevented. I don’t think the same reasoning applies.
I agree to that extent, but I don’t think people will be deterred by it unless it’s not allowed by law.
A car from the early ’90s is still driven unless it becomes too expensive for the comfort it provides; safety doesn’t seem to be a consideration for many at this price point (and I guess at other price points too). Modern regular cars are far safer than what was typical in the ’90s, and trucks are far less safe than modern regular cars, yet they’re on the road.
As such, I think people will keep using it, downplaying the risk involved. Many don’t treat cars as a boring means of transportation but rather as a desirable object. We humans don’t act very logically when we want something.
There’s no arguing with that.
One could say “the thing speaks for itself”.
Hans?
Ja, Franz? Is it time for us to pump! [Clap] You up?
Not sure if that’s where you were going, my comment from before is just a literal translation of OP’s username.
I get the Hans & Franz reference, but I didn’t notice OP’s username. My reference is to chess grandmaster Hans Niemann describing how he beat world champion Magnus Carlsen in a controversial match.
Lines up nicely as “chess ipsa loquitur” as long as you don’t mind being hunted to the ends of the Earth by dead language pedants.
“People called Romanes they go the house”?
Lol. But indeed nice of the Youtubers to put it to the test and also be honest with the results.
I think they planned for it to fail; maybe not this fast. But there’s much more engagement bait potential with everyone loving to hate Tesla.
A duo of Tesla shareholder-influencers tried to complete Elon Musk’s coast-to-coast self-driving ride that he claimed Tesla would be able to do in 2017 and they crashed before making it about 60 miles.
From the article. I don’t think the Bearded Tesla Guy YouTube channel was trying to have their Tesla fail so quickly and spectacularly. I think they just wanted to rely fully on self-driving and got unlucky with something being in the middle of the road.
Maybe. Or they’re just trying to make a buck as the ship sinks.
I like my Tesla, but it really has trouble driving by itself in many situations.
The irony of people downvoting you is that they probably also love to hate Tesla, or at least what it’s become under Elon.
I know I do. We all know Elon didn’t make shit, but he still thinks of it as his baby, that’s why I love to see these types of fails.
You’re probably right. That’s not enough to avoid downvotes because you potentially don’t count yourself among the Tesla haters. If your comment is ambiguous to the point where people aren’t sure where you stand, they will assume the worst.
I can handle downvotes ;-)
It’s funny how one needs to toe the exact line of a sub’s mindset on Lemmy or get downvoted or even banned. In some ways it’s even worse than Reddit.
For the record, I hate Tesla because of Elon. Some of the actual engineering is cool. I have a Tesla (bought before Elon went off the deep end) and quite like it. I wouldn’t buy another now.
They said it was the latest software but did not say whether the hardware was the latest. I wonder if it had HW4, the higher-resolution cameras, or the front bumper camera.
Since they’ve gotten rid of LIDAR, there has been nothing but trouble. Such a monumentally stupid decision to try to brute force things the software route.
They never had LIDAR, only a few companies do (like Waymo). Teslas used to have RADAR but even that was removed.
To rely on cameras… because it doesn’t ever rain or snow in Southern California.
Who cares, Tesla doesn’t have FSD software or hardware
No one has full self-driving yet, but it does have the potential to change the world if anyone succeeds. I’m interested in all developments toward that, regardless of Nazi CEOs.
You have to be a certain kind of stupid to allow this to happen. The dude in the driver seat is a paint chip eater for sure