this post was submitted on 02 May 2024
60 points (91.7% liked)


According to the news, self-driving trucks are about to hit the road with no driver on board.

But according to this book, that is not going to happen. The author argues that the real purpose is to get rid of skilled drivers and replace them with underpaid button pushers.

Will they really do that? What's the situation going to be a few years from now?

[–] IphtashuFitz@lemmy.world 9 points 6 months ago (1 children)
  • The AI will shut off before an impending accident just to transfer the blame onto the human.

I may be mistaken, but I thought a law was passed (or maybe it was just an NHTSA regulation?) stipulating that any self-driving system is at least partially to blame if it was in use within 30 seconds of an accident. I believe this was done after word got out that Tesla’s FSD was supposedly doing exactly this.

[–] barsquid@lemmy.world 5 points 6 months ago (1 children)

The time limit should be higher, but that sounds like a step in the right direction.

[–] laurelraven@lemmy.blahaj.zone 1 points 6 months ago

The time limit is probably adequate, since 30 seconds is actually quite a long time on the road in terms of response time. Actions taken that far before an accident will not lead irrevocably to the accident.
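
The claim that 30 seconds is a long time on the road can be sanity-checked with quick arithmetic. A minimal sketch, assuming a typical US highway speed of 65 mph (the speed value is an assumption, not from the thread):

```python
# How far does a vehicle travel in 30 seconds at highway speed?
speed_mph = 65          # assumed typical highway speed, not from the thread
window_seconds = 30     # the lookback window in the regulation discussed above

miles = speed_mph * window_seconds / 3600   # 3600 seconds per hour
feet = miles * 5280                         # 5280 feet per mile

print(f"{miles:.2f} miles (~{feet:.0f} feet)")
# At 65 mph, 30 seconds covers roughly half a mile of roadway.
```

Over half a mile of travel, a driver (or an automated system) typically has many opportunities to brake, change lanes, or otherwise avoid a collision, which supports the point that events 30 seconds out rarely make a crash inevitable.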