Garbage cars, garbage software, garbage ceo.
"Poorly designed, partially functional software running on substandard hardware with a subpar implementation, designed by overextended engineers and burnt-out developers led by a known megalomaniac, malfunctions; local man astonished."
That site is thoroughly enshittified:
Google's wet dream is to make all web pages like this.
I love having uBlock Origin on my mobile browser
My well-trained monkey brain scrolled right past your comment, then I did a double take because I realized Lemmy shouldn't have ads. Thanks for that little scare, lol.
Why would you go to a site like that without an ad blocker?
I'm not sure any car company could make me feel comfortable about using a full auto feature.
Error: Angle of Attack too high, pitching nose down.
Error: Stall warning! Pitching nose down. Please do not resist!
I knew that was the Boeing 737 Max "issue" without clicking the link. When safety is an add-on cost, capitalism and profit margins negate safety.
From Fight Club:
A new car built by my company leaves somewhere traveling at 60 mph. The rear differential locks up. The car crashes and burns with everyone trapped inside. Now, should we initiate a recall? Take the number of vehicles in the field, A, multiply by the probable rate of failure, B, multiply by the average out-of-court settlement, C. A times B times C equals X. If X is less than the cost of a recall, we don't do one.
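The quote's recall formula is simple expected-value arithmetic. A minimal sketch, with entirely made-up illustrative numbers:

```python
# The recall formula from the quote: compare the expected payout X = A * B * C
# against the cost of a recall. All numbers below are hypothetical.
vehicles_in_field = 1_000_000    # A: vehicles in the field
failure_rate = 0.00002           # B: probable rate of failure
avg_settlement = 3_000_000       # C: average out-of-court settlement ($)
recall_cost = 100_000_000        # cost of a recall ($)

expected_payout = vehicles_in_field * failure_rate * avg_settlement  # X
initiate_recall = expected_payout > recall_cost

print(f"X = ${expected_payout:,.0f}; initiate recall? {initiate_recall}")
# With these numbers, X is $60,000,000 -- less than the recall cost,
# so under this logic no recall is initiated.
```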
Reading about the 737 Max disaster should be a must for everyone.
I agree. But automation has made air travel safer by an order of magnitude. The problem with the 737 Max debacle was trying to use automation as a band-aid to avoid costly recertification and pilot training. They didn't inform pilots about the new pitch control system, and they didn't train them on how to deal with runaway trim. Oh, and relying on a single sensor to detect AoA was also a bad move. So many mistakes... for which a lot of people died.
Notwithstanding that, every day hundreds of planes rely on automation to help keep passengers safe.
The other issue is they only had one of those angle of attack sensors; there should have been more redundancy.
You'd think aerospace engineers would have it down to reflex that things need to be fail-safe. It's ironic that a system designed to make the plane safer actually crashed the plane. That one should get an award for world's worst engineering.
Like any accident it wasn't just one thing. The maker implemented a safety system that was not fault tolerant, then the airline neglected to train pilots how to deal with a failure of that system. In fact that particular airline didn't even know the system had been added to their planes. Bad engineering, communication, and training still happens in the industry, but really it's pretty amazing how safe these machines are overall.
Pilot error is still the cause of the majority of accidents. A big problem is that bad pilots who don't pass regular exams can slip through the system because of management deficiencies. As with pilots, it happens in the medical industry, where bad doctors or nurses just get passed on from one hospital to the next. Employers fail to do proper checks on previous job performance.
And FSD was going to be delivered in 2018, 2019, 2020, just around the corner.
Glad I never paid for that option when it was $5K, not to mention the ludicrous $15K they want for it now.
Elon's motto: "Move fast and break stuff."
In this case, stuff may mean people, property, laws, take your pick.
Why not all three?
The software sucks, it always has sucked, and it's always going to suck. The cameras are in the wrong place, there isn't enough compute available, and the jitter in distance/size measurement because of non-existent stereoscopic cameras means there's no hope for real depth measurement.
People got ripped off for $15k for this crap.
Musk also refuses to use LIDAR for who knows what reason.
The reason is to save money so they can make more money. It always is about the $$$.
The Elon simps in the replies...
If anyone is searching for an easier way to see the videos:
If anyone is searching for an easier way to see twitter in general:
Lmao car can't tell the difference between a green light facing the other road and the red light facing it.
Corporations only have incentive to suppress results that don't help them. This is why some third-party evaluator (such as a government agency) should be receiving automotive data separately for evaluation.
Car companies could easily send encrypted camera data to a third-party data holder that both the client and the company can decrypt. This would prevent the government from decrypting this data en masse, and when the car violates a law or crashes it could be decrypted by either party.
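The scheme described above can be sketched as envelope encryption with two recipients: the footage is encrypted once under a random data key, and that key is wrapped separately for the client and the company, so either can decrypt while the escrow holder cannot. A minimal sketch using the third-party `cryptography` package; the key names and sizes here are illustrative assumptions, not any real deployment:

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# RSA-OAEP padding used for wrapping the data key.
oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Each party holds its own key pair (hypothetical 2048-bit RSA keys).
client_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
company_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# 1. Encrypt the footage once under a fresh symmetric data key.
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(b"camera frame bytes")

# 2. Wrap the data key for each authorized recipient. The escrow service
#    stores (ciphertext, wrapped keys) but holds no private key itself,
#    so it cannot read the footage on its own.
wrapped_for_client = client_key.public_key().encrypt(data_key, oaep)
wrapped_for_company = company_key.public_key().encrypt(data_key, oaep)

# 3. After a crash or violation, either party unwraps its copy of the
#    data key and decrypts the footage.
recovered_key = company_key.decrypt(wrapped_for_company, oaep)
footage = Fernet(recovered_key).decrypt(ciphertext)
```

A real system would also need authenticated key distribution and an access-audit trail, but the two-recipient key wrapping is the core of the idea.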
So I hate Elon Musk and I think Tesla is way overhyped, but I do want to point out that singular anecdotes like this don't mean anything.
Human drivers run red lights and crash cars all the time. It's not a question of whether a self-driving car runs a light or gets in a crash, it's whether they do it more often than a human driver. What are the statistics for red lights run per mile driven?
I'd hazard a guess that it's lower, but regardless, it shouldn't be available to consumers yet if this is what they found in preliminary testing. Trying to hide it is quite disingenuous, though: of course it's going to make mistakes while in testing and even after; trying to hide those mistakes and act as if they don't exist is not how you treat your potential customer base.
It’s kind of like how people are more likely to die in a car accident on their way to the airport than they are to die in a plane. Yet people are more likely to be afraid of flying.
There’s a sense of control that people tend to gravitate towards. In the aggregate, Tesla might run fewer red lights than humans. At the individual level there will certainly be “safe drivers” who run red lights at a lower rate, who end up dying because of the Tesla. It’s a hard pill to swallow.
I think what it really boils down to is that the vast majority of drivers who run red lights choose to do so out of stupidity, whereas someone trusting Tesla's claims about their new "self-driving" car might not have the chance to stop the vehicle as it hurtles itself through a red light. So yes, in terms of raw numbers it will cause fewer accidents in some cases, but the fact that it can happen at all, when the average trusting consumer would expect it never to happen in a normal car, should be a huge issue.
Also, as far as liability goes, I'm horrified to think about what the future of vehicle injury lawsuits will look like in the US when the driver can blame the software and the company providing the software is run by a grifter asshole.
"one of those AMERICAN MADE self driving cars"
Rip the Simpsons
Why didn't he disengage and start braking?
Because he was busy playing with his phone, duh!
Thanks for posting. Be warned, that cross-post link is NOT the same post. Definitely NSFW.
Technology
This is a most excellent place for technology news and articles.
Our Rules
- Follow the lemmy.world rules.
- Only tech related content.
- Be excellent to each other!
- Mod approved content bots can post up to 10 articles per day.
- Threads asking for personal tech support may be deleted.
- Politics threads may be removed.
- No memes allowed as posts, OK to post as comments.
- Only approved bots from the list below; to ask if your bot can be added, please contact us.
- Check for duplicates before posting; duplicates may be removed.