this post was submitted on 18 Jul 2023
43 points (92.2% liked)

Technology


This is the official technology community of Lemmy.ml for all news related to creation and use of technology, and to facilitate civil, meaningful discussion around it.
[–] Aldehyde@kbin.social 4 points 1 year ago (3 children)

If self-driving cars can be disabled by putting a cone on the hood, maybe they aren’t ready to be on the streets.

[–] HughJanus@lemmy.ml 3 points 1 year ago* (last edited 1 year ago)

By that measure they never will be, because people are never going to stop fucking with them.

I took a ride in a self-driving car. Three different people just walked right out in front of the car. One of them crossed the street, then turned back around to walk in front of it again. People really enjoy fucking with them. This is why we can't have nice things.

[–] hayander@lemmyngton.au 1 point 1 year ago (1 children)

I wouldn’t expect a human driver to move if their view is obstructed or there are objects on the vehicle.

[–] theluddite@lemmy.ml 1 point 1 year ago

Humans are capable of assessing and addressing the obstruction; these cars, meanwhile, are permanently disabled without outside assistance.

[–] dom@lemmy.ca 1 point 1 year ago

They won't be ready to be on the streets until they've had a lot of time on the streets.

It's a catch-22.

[–] davewritescode@lemm.ee 3 points 1 year ago

Attacking AI-based systems with malicious input is like shooting fish in a barrel. Combine high complexity with low comprehension of how the internals of the system actually work and you have a field day for security researchers.

It’s only a matter of time before someone hurts or robs a Tesla driver by forcing the car to do something that’s not in the best interest of the operator.
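The kind of malicious-input attack described above can be sketched with the classic fast gradient sign method (FGSM), a standard adversarial-example technique; nothing here is specific to Tesla or any real perception model. The linear "classifier", its weights, and all numbers below are invented for illustration:

```python
import numpy as np

# Toy sketch of an FGSM-style attack. A linear "classifier" stands in for a
# perception model; the attacker is assumed to know the weights (white-box).
rng = np.random.default_rng(0)
w = rng.normal(size=16)  # made-up model weights
b = 0.1

def predict(x):
    """Return 1 if the score w.x + b is positive (e.g. 'obstacle present'), else 0."""
    return 1 if w @ x + b > 0 else 0

# An input the model confidently labels as class 1, by construction.
x = 0.5 * np.sign(w)
original = predict(x)  # -> 1

# For a linear model, the gradient of the score w.r.t. the input is just w,
# so stepping each feature by -epsilon * sign(w) lowers the score as fast as
# possible while changing no single feature by more than epsilon.
epsilon = 1.0
x_adv = x - epsilon * np.sign(w)

print(original, predict(x_adv))  # the small, bounded perturbation flips the label
```

The point the comment makes falls out of the math: the attacker needs no understanding of "why" the model decides anything, only a direction in input space that moves the score, which gradients hand over for free.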
