this post was submitted on 03 Aug 2023

TeslaMotors


Tesla (formerly Tesla Motors) is a tech company based in Palo Alto, California, with a mission to accelerate the world's transition to sustainable energy. It develops products across the complete energy life cycle, from solar generation and battery storage to all-electric vehicles with a heavy focus on autonomy.

top 6 comments
[–] notfromhere@lemmy.one 1 points 1 year ago (1 children)

Jailbreaking automated equipment introduces a ton of risk. I’m generally a supporter of being able to do whatever you want with things you own, but when the tinkering involves a heavy object on wheels where a glitch can kill people, I’d be fine with some amount of regulation here, probably from NHTSA. We really don’t want some random person running a 1-click tool to hack their self-driving car and install buggy self-driving software that may or may not still have the safety overrides working. Imagine if what they think they’re installing is the FSD beta from Tesla, but what they’re really installing is something infected with spyware that corrupts the safety overrides.

I don’t know much about how driver takeover and brake-pedal override work on Teslas, so maybe these fears are unfounded. Just my 2 cents.

[–] rebelsimile@sh.itjust.works 0 points 1 year ago (1 children)

Yeah I don’t know if I agree with this at all. You could make the same argument about nearly any component on any car (engine, brakes, etc).

[–] notfromhere@lemmy.one 0 points 1 year ago (1 children)

We’re talking about an autonomous system controlling the steering, acceleration, and braking. That’s hardly an apt comparison to "engine, brakes, etc." Those things are components of the overall system. The self-driving stack sits on top of them and needs to be able to identify issues with the prime mover, the brakes, and so on, and to disengage if it’s unsafe. Allowing someone to jailbreak the self-driving system and override safety shutdowns is a recipe for disaster.
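
To make that layering concrete, here is a minimal sketch, in Python, of the kind of supervisory check described above. The class, field, and function names are hypothetical and have nothing to do with Tesla’s actual software; the point is just the general pattern of a higher-level autonomy layer that must disengage when a lower-level component reports a fault or the driver takes over.

```python
from dataclasses import dataclass


@dataclass
class VehicleStatus:
    """Hypothetical fault flags reported by the low-level vehicle systems."""
    brake_fault: bool
    powertrain_fault: bool
    steering_fault: bool
    driver_brake_pressed: bool


def autonomy_may_stay_engaged(status: VehicleStatus) -> bool:
    """The self-driving layer sits on top of the base vehicle: if any
    underlying component reports a fault, or the driver takes over by
    pressing the brake, the autonomy layer must disengage."""
    if status.driver_brake_pressed:
        return False  # driver takeover always wins
    if status.brake_fault or status.powertrain_fault or status.steering_fault:
        return False  # unsafe to keep automated control engaged
    return True


# Example: a brake fault forces disengagement even though the driver
# hasn't touched the pedal.
print(autonomy_may_stay_engaged(
    VehicleStatus(brake_fault=True, powertrain_fault=False,
                  steering_fault=False, driver_brake_pressed=False)
))  # -> False
```

A jailbreak that bypasses or corrupts a check like this is exactly the failure mode being worried about here.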

[–] rebelsimile@sh.itjust.works 0 points 1 year ago (1 children)

I’m saying that the power to tinker with a car and destroy its ability to be safely operated is not a new technology thing. Getting the government involved to prevent people from tinkering with items they own is a severe overreaction to a “threat” that has existed as long as people have owned wrenches and cars.

[–] notfromhere@lemmy.one 0 points 1 year ago (1 children)

That’s a great point. I guess my concerns come more from my bad experiences with computers randomly doing weird stuff, and from the idea of trusting my life to a system like that.

[–] rebelsimile@sh.itjust.works 1 points 1 year ago

I can understand that. I have an OBD-II module that I’ve used on my EV to unlock certain stupid locked features (including a larger gas tank capacity; it’s a PHEV), but I definitely didn’t want to touch anything that could cause, say, a computer crash while driving down the road. Then again, I’ve also had tires blow out, headlights fall out, transmissions break, and engines seize over the years. There are plenty of mechanical things anyone could do that would cause a catastrophe on the road. I just don’t wanna go overboard on government involvement, since I think we should be able to actually repair/tinker with/jailbreak whatever we own, especially when it costs tens of thousands of dollars.
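
For what it’s worth, here is a minimal read-only sketch of the kind of OBD-II access mentioned above, using the python-obd library with a generic ELM327 adapter. The standard PIDs queried here are chosen for illustration; the manufacturer-specific feature-unlock commands the comment refers to are proprietary and not shown.

```python
# Read-only look at what the car reports over OBD-II, using the
# python-obd library (https://python-obd.readthedocs.io/) with a
# generic ELM327 adapter. Reading standard PIDs like these is harmless;
# the proprietary "unlock" writes mentioned above are a different story.
import obd

connection = obd.OBD()  # auto-detects the adapter on a serial/USB port

if connection.is_connected():
    for cmd in (obd.commands.RPM, obd.commands.SPEED, obd.commands.COOLANT_TEMP):
        response = connection.query(cmd)
        if not response.is_null():
            print(f"{cmd.name}: {response.value}")
else:
    print("No OBD-II adapter found")
```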