this post was submitted on 17 Jul 2023
248 points (96.6% liked)

Technology


Well this is terrifying...

[–] nandeEbisu@lemmy.world 2 points 1 year ago* (last edited 1 year ago)

I'm less concerned about that if it's purely public data. If a police officer sat in a helicopter looking for drivers driving erratically, then notified a trooper on the ground to check on the car and perform a field sobriety test if there were cause to do so, I think that would fall within the confines of the law, even though thousands of cars could have been in their field of view and considered for a potential DUI.

I am of the opinion that if the data is either directly in public view, or the user can opt out of having it persisted and it is available to the general public (even if for a fee), then it's fine to use the data. I think any kind of AI algorithm's suggestions on their own should not be considered probable cause: you can use them to narrow down suspects, but you need actual evidence for a warrant or an arrest.

The issue I have with this situation is the collection and storage of such a vast amount of travel data on individuals without their consent. If leaked, that data could be used to track down victims of stalking and abuse, or political dissidents.