this post was submitted on 14 Aug 2023
0 points

New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times: Six officers who were injured in the crash are suing Tesla, despite the fact that the driver was allegedly impaired

top 13 comments
[–] CaptainProton@lemmy.world 0 points 1 year ago* (last edited 1 year ago) (1 children)

This is stupid. Teslas can park themselves; they're not just on rails. It should be pulling over and putting the flashers on if a driver is unresponsive.

That being said, the driver knew about this behavior and acted with wanton disregard for safe driving practices, so the incident is the driver's fault and they should be held responsible for their actions. It's not the court's job to legislate.

It's actually NHTSA's job to regulate car safety (the NTSB only investigates crashes and makes recommendations), so if it doesn't already have the authority, Congress needs to grant it the power to regulate what AI behavior is acceptable and to define safeguards against misbehaving AI.
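A minimal sketch of the "warn, then pull over" escalation the comment above argues for, assuming a hypothetical driver-monitoring interface; the class names, threshold, and callback are invented for illustration and are not Tesla's actual software:

```python
# Hypothetical escalation policy for an unresponsive driver.
# Everything here (names, threshold, interface) is invented for
# illustration; it is not Tesla's implementation.
from enum import Enum, auto


class EscalationState(Enum):
    NOMINAL = auto()       # driver attentive, normal operation
    WARNING = auto()       # alert issued, waiting for a response
    PULLING_OVER = auto()  # hazards on, executing a minimal-risk stop


class DriverMonitor:
    # Escalate after a handful of ignored alerts rather than
    # issuing 150 of them, as reportedly happened in this crash.
    MAX_IGNORED_WARNINGS = 3

    def __init__(self) -> None:
        self.state = EscalationState.NOMINAL
        self.ignored_warnings = 0

    def on_attention_check(self, driver_responded: bool) -> EscalationState:
        """Run periodically; escalates while the driver stays unresponsive."""
        if driver_responded:
            self.state = EscalationState.NOMINAL
            self.ignored_warnings = 0
            return self.state

        self.ignored_warnings += 1
        if self.ignored_warnings >= self.MAX_IGNORED_WARNINGS:
            # Hand over to a minimal-risk maneuver: flashers on,
            # pull over, come to a stop.
            self.state = EscalationState.PULLING_OVER
        else:
            self.state = EscalationState.WARNING
        return self.state
```

The design point is simply that the warning counter should feed into a terminal action (a minimal-risk stop) instead of looping forever.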

[–] dzire187@feddit.de 0 points 1 year ago

It should be pulling over and putting the flashers on if a driver is unresponsive.

Yes. Actually, just stopping in the middle of the road with hazard lights would be sufficient.

[–] EndOfLine@lemmy.world 0 points 1 year ago* (last edited 1 year ago) (1 children)

Officers injured at the scene are blaming and suing Tesla over the incident.

...

And the reality is that any vehicle on cruise control with an impaired driver behind the wheel would’ve likely hit the police car at a higher speed. Autopilot might be maligned for its name but drivers are ultimately responsible for the way they choose to pilot any car, including a Tesla.

I hope those officers got one of those "you don't pay if we don't win" lawyers. The responsibility ultimately resides with the driver, and I don't see them getting any money from Tesla.

[–] friendlymessage@feddit.de 0 points 1 year ago

Well, in the end it comes down to whether Tesla's ADAS complies with laws and regulations. If there really were 150 warnings from the ADAS without it disengaging, that might indicate faulty software, and therefore Tesla being at least partially at fault. It goes without saying that the driver is mostly to blame, but an ADAS shouldn't just keep on driving when it senses that the driver is incapacitated.

[–] masterairmagic@sh.itjust.works 0 points 1 year ago (1 children)

Teslas aren't safe.
[–] Thorny_Thicket@sopuli.xyz 0 points 1 year ago* (last edited 1 year ago) (1 children)

Tesla on Autopilot/FSD is almost 4 times less likely to be involved in a crash than a human-driven Tesla, which even then is less than half as likely to end up in an accident compared to the average car. You not liking Musk fortunately doesn't change these facts.

In the 2nd quarter, we recorded one crash for every 4.41 million miles driven in which drivers were using Autopilot technology (Autosteer and active safety features). For drivers who were not using Autopilot technology (no Autosteer and active safety features), we recorded one crash for every 1.2 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.

Source
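Taking the quoted figures at face value, the ratios behind "almost 4 times" are easy to sanity-check; a quick sketch using only the numbers cited above:

```python
# Sanity check of the crash-rate ratios quoted above
# (miles driven per recorded crash, per Tesla's own report).
autopilot_miles_per_crash = 4_410_000           # Autopilot engaged
tesla_no_autopilot_miles_per_crash = 1_200_000  # Tesla, Autopilot off
us_average_miles_per_crash = 484_000            # NHTSA figure cited by Tesla

print(autopilot_miles_per_crash / tesla_no_autopilot_miles_per_crash)   # ~3.7
print(tesla_no_autopilot_miles_per_crash / us_average_miles_per_crash)  # ~2.5
```

So, if accurate, the data gives roughly 3.7x for Autopilot versus a manually driven Tesla, and roughly 2.5x for a manually driven Tesla versus the NHTSA average; whether the figures themselves can be trusted is exactly what the replies below dispute.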

[–] masterairmagic@sh.itjust.works 0 points 1 year ago (1 children)

So Tesla says. There is no independent verification of this data. It could all be bullshit.

[–] Thorny_Thicket@sopuli.xyz 0 points 1 year ago (1 children)

Perhaps. I'm sure you'll provide me with the independent data you're basing that "Teslas are not safe" claim on.

[–] masterairmagic@sh.itjust.works 0 points 1 year ago (1 children)

So you take Tesla's word and believe it, but ask for proof for the contrary?

You're just a hypocrite.

[–] narp@feddit.de 0 points 1 year ago (1 children)

You made the first comment: "Teslas aren't safe", without providing proof.

And now you're calling someone a hypocrite because he asks for data of exactly what you claimed, while you're redefining your first argument as "the contrary".

So, do you have proof that Teslas aren't safe compared to other cars, or is it just your opinion?

[–] masterairmagic@sh.itjust.works 0 points 1 year ago (1 children)

We're literally having this discussion under a video where automatic braking should have kicked in, but didn't.

[–] narp@feddit.de 0 points 1 year ago

But you can't base a fact on one accident, or even several. What if newspapers especially like writing about Tesla accidents to generate clicks?

Teslas seemingly have a lot of accidents, but without checking the statistics and comparing them to those of other manufacturers, you wouldn't really know whether the perceived truth is actually a fact.

[–] chakan2@lemmy.world 0 points 1 year ago

I hope the cops win. Autopilot lets a driver completely disengage their attention from the car in a way that isn't possible with cruise control alone.

There's no way you can drop a human into a life-threatening critical situation with 2.5 seconds to make a decision and expect them to react reasonably. Even stone-cold sober, that's a lot to ask of a person when the car makes a critical mistake like this.

On cruise control, the driver would still have to be aware that they were driving. With Autopilot, the driver had likely passed out and the car carried on its merry way.