this post was submitted on 12 Feb 2024
191 points (93.6% liked)

‘Boycott Tesla’ ads to air during Super Bowl — “Tesla dances away from liability in Autopilot crashes by pointing to a note buried deep in the owner’s manual, that says Autopilot is only safe on fr...

[–] Ghostalmedia@lemmy.world 27 points 9 months ago (3 children)

When I was looking to buy a new car back in early 2019, I walked into a showroom for a final test drive before I threw some money down for a Model 3.

It started to rain pretty hard on the drive back. During an auto lane change, the sensors freaked out because of the water and violently yanked the car back into the original lane halfway through the maneuver. It hydroplaned a hair and scared the shit out of my wife and me. The Tesla employee assured us, “it’s ok, this is normal.” Hearing that this was normal was not comforting.

Upon returning to the showroom, a different Model 3 in the parking lot started backing toward a small child. My wife saw what was happening and threw herself in front of the car, which caused it to halt.

I’m sure the software has progressed in the past 5 years, but suffice it to say, we changed our minds on the car right then. Those two incidents within 15 minutes really made us question how that shit was legal.

[–] Draupnir@lemmy.world 7 points 9 months ago* (last edited 9 months ago) (1 children)

If the car was backing out, a human driver was in control, not Autopilot. Autopilot can only be engaged while driving on a well-marked roadway. The first part is plausible, however. The software at the time likely couldn’t handle rain appropriately, and you are absolutely right to question it if they told you that was normal.

[–] Ghostalmedia@lemmy.world 4 points 9 months ago

The car was being summoned from a parking space. Summon / Smart Summon will absolutely back out of a space fully autonomously.

[–] Fisch@lemmy.ml 5 points 9 months ago* (last edited 9 months ago) (2 children)

That's the thing: it's only legal in the US (as far as I know, at least). In Germany, you're only allowed to use self-driving if your hands are on the steering wheel at all times and you can take over if something goes wrong.

[–] eltrain123@lemmy.world 2 points 9 months ago

That’s the case in the US, too. If the car senses the driver isn’t keeping their hands on the wheel, it produces a loud audible alert after a few seconds. After the third warning, the alert continues until the driver takes back control, and the car automatically shuts off autopilot.

There are also several warnings about keeping your hands on the wheel and staying alert when engaging autopilot.
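
In other words, it's a three-strikes escalation. Here's a rough Python sketch of the logic as described above - the function names, timings, and thresholds are my own guesses, not Tesla's actual implementation:

```python
import time

# Guessed values - the real grace period and warning count may differ.
HANDS_OFF_GRACE_S = 5.0   # seconds hands-off before an alert fires
MAX_WARNINGS = 3          # disengage after the third warning

def sound_alert() -> None:
    """Stand-in for the car's audible alert."""
    print("ALERT: apply steering wheel pressure")

def hands_on_monitor(hands_on_wheel) -> None:
    """Runs while autopilot is engaged; escalates if hands leave the wheel.

    `hands_on_wheel` is a callable returning True when the car detects
    pressure or torque on the wheel.
    """
    warnings = 0
    hands_off_since = None
    while warnings < MAX_WARNINGS:
        now = time.monotonic()
        if hands_on_wheel():
            hands_off_since = None          # hands detected, reset
        else:
            if hands_off_since is None:
                hands_off_since = now
            if now - hands_off_since >= HANDS_OFF_GRACE_S:
                warnings += 1
                sound_alert()
                hands_off_since = now       # restart the grace period
        time.sleep(0.1)
    # Third strike: keep alerting until the driver takes back control,
    # then shut autopilot off.
    while not hands_on_wheel():
        sound_alert()
        time.sleep(1.0)
    print("autopilot disengaged; driver in control")
```

Obviously the real system keys off steering-wheel torque sensors and is far more involved; this is just the shape of the escalation.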

The people saying otherwise are either ignorant or disingenuous.

[–] c0m47053@feddit.uk 2 points 9 months ago (1 children)

I'm pretty sure that is also the case in the US. These incidents are either caused by some sort of defeat device (I have seen weights that wrap around the steering wheel, no idea if they work), or by people who have just gotten good at resting a hand on the wheel without paying attention, I think.

[–] Fisch@lemmy.ml -1 points 9 months ago

I thought Tesla just added that by choice, not because it's required by law.

[–] JasSmith@sh.itjust.works 3 points 9 months ago

These errors are obviously alarming, but all the evidence we have says these systems are still safer than human drivers. They will make mistakes - and sometimes those mistakes will cost lives - but they will make fewer mistakes than humans. Given that, as visceral as these stories feel, I think our ire is misplaced. Automated driving will never be perfect; if that’s the bar we’re aiming for, we should just give up and go home. The goal is to be better than humans, and in many conditions, it’s already there.