this post was submitted on 18 Oct 2024
36 points (92.9% liked)

No Stupid Questions

35862 readers
1342 users here now

No such thing. Ask away!

!nostupidquestions is a community dedicated to being helpful and answering each others' questions on various topics.

The rules for posting and commenting, besides the rules defined here for lemmy.world, are as follows:

Rules (interactive)


Rule 1- All posts must be legitimate questions. All post titles must include a question.

All posts must be legitimate questions, and all post titles must include a question. Questions that are joke or trolling questions, memes, song lyrics as title, etc. are not allowed here. See Rule 6 for all exceptions.



Rule 2- Your question subject cannot be illegal or NSFW material.

Your question subject cannot be illegal or NSFW material. You will be warned first, banned second.



Rule 3- Do not seek mental, medical and professional help here.

Do not seek mental, medical and professional help here. Breaking this rule will not get you or your post removed, but it will put you at risk, and possibly in danger.



Rule 4- No self promotion or upvote-farming of any kind.

That's it.



Rule 5- No baiting or sealioning or promoting an agenda.

Questions which, instead of being of an innocuous nature, are specifically intended (based on reports and in the opinion of our crack moderation team) to bait users into ideological wars on charged political topics will be removed and the authors warned - or banned - depending on severity.



Rule 6- Regarding META posts and joke questions.

Provided it is about the community itself, you may post non-question posts using the [META] tag on your post title.

On fridays, you are allowed to post meme and troll questions, on the condition that it's in text format only, and conforms with our other rules. These posts MUST include the [NSQ Friday] tag in their title.

If you post a serious question on friday and are looking only for legitimate answers, then please include the [Serious] tag on your post. Irrelevant replies will then be removed by moderators.



Rule 7- You can't intentionally annoy, mock, or harass other members.

If you intentionally annoy, mock, harass, or discriminate against any individual member, you will be removed.

Likewise, if you are a member, sympathiser or a resemblant of a movement that is known to largely hate, mock, discriminate against, and/or want to take lives of a group of people, and you were provably vocal about your hate, then you will be banned on sight.



Rule 8- All comments should try to stay relevant to their parent content.



Rule 9- Reposts from other platforms are not allowed.

Let everyone have their own content.



Rule 10- Majority of bots aren't allowed to participate here.



Credits

Our breathtaking icon was bestowed upon us by @Cevilia!

The greatest banner of all time: by @TheOneWithTheHair!

founded 1 year ago
MODERATORS
 

The reason I ask is that I've been seeing a lot of news and cases of Tesla's self-driving acting up and being a point of contention. But back in 2016-17 my ex's uncle and aunt got a Model X when they first dropped, and they "auto-drove" us like 50 miles without any noticeable issue.

Was I just gambling my life, or has the tech somehow gotten worse?

top 35 comments
[–] Sanctus@lemmy.world 35 points 1 month ago (2 children)

The tech never "evolved" in the first place. It was a deformity from the start. LIDAR was developed for this reason. But Tesla uses cheap-ass cameras that try to interpret what they're seeing from visual data alone. I'm guessing, with my layman's knowledge, that this is why they veer at semi trucks: because the technology itself is based on a shitty premise.

[–] BertramDitore@lemm.ee 24 points 1 month ago (1 children)

In general I think you’re right about the tech just being shitty, but a slight correction: LiDAR was not developed for self-driving, it’s just a relevant application of the technology. LiDAR has been around for quite a while, and was initially best known as a remote sensing technology. It is effective at remote sensing because it can penetrate certain solid materials, most importantly foliage. So when an aerial LiDAR dataset is collected for a forested area, since the light can penetrate through most of the foliage, one can essentially ‘delete’ the vegetation from the resulting point cloud, leaving a bare earth model, which is a very close approximation of the landscape’s actual topography if there had been no trees. This can be especially valuable for archaeological research, as foliage is often a significant obstacle for accurately mapping large sites, or even finding them in the first place.
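(If it helps to picture the "deleting the vegetation" step, here's a minimal sketch of filtering a classified point cloud down to ground returns. The array and its values are made up for illustration; real aerial LiDAR would typically come from a LAS/LAZ file carrying ASPRS classification codes, where class 2 is ground.)

```python
import numpy as np

# Hypothetical point cloud: columns are x, y, z, classification code.
# ASPRS convention (as in LAS files): 2 = ground, 3-5 = low/medium/high vegetation.
points = np.array([
    [10.0, 20.0, 105.2, 2],   # ground return
    [10.1, 20.1, 118.7, 5],   # high vegetation (canopy)
    [10.2, 20.2, 112.3, 4],   # medium vegetation
    [10.3, 20.3, 105.4, 2],   # ground return
])

GROUND_CLASS = 2

# Keep only ground returns -> a "bare earth" point set,
# which can then be interpolated into a terrain model.
bare_earth = points[points[:, 3] == GROUND_CLASS]

print(bare_earth[:, :3])  # x, y, z of the ground surface only
```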

All of that to say, yeah, self-driving buzz made LiDAR well known as tech, but it wasn’t developed for that purpose.

[–] Sanctus@lemmy.world 6 points 1 month ago

Thanks, I did not know that and it is honestly a way cooler application. Foliage X Ray vision.

[–] RaoulDook@lemmy.world 2 points 1 month ago (1 children)

In a non-Tesla car I've driven, there was an autopilot cruise control mode that just used cameras. In practice it only works out well if you're driving long distances on a highway with low traffic. It's still nice to have (much better than having no autopilot cruise mode), but I don't trust it around multiple lanes of other cars doing unpredictable shit. It also quits working in the rain when the cameras are obscured.

[–] Sanctus@lemmy.world 3 points 1 month ago

Sounds like normal cruise control functions with fewer unexpected errors.

[–] FuglyDuck@lemmy.world 34 points 1 month ago

Tesla’s marketing has consistently lied to you about the state of their “full self driving”.

It’s not. It never was. This is a perfect example of “fake it ‘til you make it”.

[–] fine_sandy_bottom@lemmy.federate.cc 21 points 1 month ago (2 children)

The tech hasn't regressed, it just hasn't progressed while the marketing has.

Look up the automation levels: https://au.pcmag.com/cars-auto/94559/is-your-car-autonomous-the-6-levels-of-self-driving-explained

My wife's car is 6 years old, and is level 2. Nothing amazing now, but kinda cool in 2018.

Since then expectations have increased dramatically, and the problems you're hearing about are cars expected to have the higher levels of automation but failing to achieve that.

It seems like this is one of those technical problems that gets exponentially more difficult to solve, the closer we get to solving it. What I mean is, suppose a human averages 100,000 km per "incident". It was easy to make a car do 90,000 km per incident, less so to have it do 95,000 km per incident, but we're finding it very, very difficult to get that last 5% of performance.
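(To put rough numbers on that, here's a quick back-of-the-envelope sketch; every figure, including the fleet mileage, is invented purely for illustration.)

```python
# Toy numbers for the "last 5%" point: even as the gap to the human
# baseline shrinks, each remaining increment still represents a lot of
# extra incidents at fleet scale. All figures are invented.
HUMAN_KM_PER_INCIDENT = 100_000
FLEET_KM_PER_YEAR = 1_000_000_000  # hypothetical fleet mileage

def yearly_incidents(km_per_incident: float) -> float:
    return FLEET_KM_PER_YEAR / km_per_incident

human_baseline = yearly_incidents(HUMAN_KM_PER_INCIDENT)

for label, km in [("90,000 km/incident", 90_000),
                  ("95,000 km/incident", 95_000),
                  ("99,000 km/incident", 99_000)]:
    excess = yearly_incidents(km) - human_baseline
    print(f"{label}: ~{excess:,.0f} more incidents per year than the human baseline")
```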

[–] FuglyDuck@lemmy.world 5 points 1 month ago (1 children)

> It seems like this is one of those technical problems that gets exponentially more difficult to solve, the closer we get to solving it. What I mean is, suppose a human averages 100,000 km per "incident". It was easy to make a car do 90,000 km per incident, less so to have it do 95,000 km per incident, but we're finding it very, very difficult to get that last 5% of performance.

To add to this, most cars with relatively high degrees of automation bounce out of autonomous mode when they encounter something they feel they can't handle adequately. Most of the time that works, but it would be like a human who was somehow only able to drive when everything was routine. (90% of our driving is routine; it's that last ten percent that isn't, and that's when we crash. Well, barring things like drunk driving or distracted driving. Humans are dumb. Autonomous driving was designed by humans and isn't any smarter.)

[–] fine_sandy_bottom@lemmy.federate.cc 3 points 1 month ago (1 children)

You're not wrong, but that's not really what I meant although perhaps I didn't explain it very well.

Another way to say the same thing: if you group together all the various components or aspects of "driving", 95% of them might be solved relatively easily, but getting the last 5% right is extraordinarily difficult.

It's deceiving because the first time you saw a Level 2 car in 2018, it was natural to think that if they'd made so much progress seemingly overnight, then surely in the next few years we would have Level 5 cars.

I do take your point that humans are also good drivers 95% of the time and mistakes only occur within 5% of situations. The issue there is the imperative that autonomous cars must be better than a human in all circumstances. If a human makes, on average, 5 serious mistakes every 500,000km, but an autonomous car makes 6, you'd probably not want to put your family in that autonomous car.

[–] FuglyDuck@lemmy.world 2 points 1 month ago (1 children)

Well, to be fair, I'm pretty ardently opposed to self-driving cars. I wouldn't put my family in one even if it was better than humans.

The reason is that all it takes is one bug in one piece of code and your entire family is fucked; and with the way corporations are now handling updates, I frankly don't trust them to maintain it properly at all. (AKA forced updates, with shit-for-testing CrowdStrike, for example; it could have been prevented, but they just had to push that update globally, all at the same time, to everyone. Imagine a malicious admin doing a terror attack by making all your cars crash, or some intern pushing the wrong code and suddenly your car is bricked. I'll keep my dumb car, thank you very much.)

Yeah. I tend to agree.

Being able to drive without killing someone is only one aspect of an autonomous vehicle, and security is one that I'm not confident about in the least.

I've noticed that my wife's Level 2 car is just hopeless outside of the city. Sure, that's where most people live, and it's fine for most people.

Driving on country roads, it spends more time with its autonomous features self-disabled than not, simply because it can't see the road or what have you.

[–] thisguy1092@lemmy.world 1 points 1 month ago

Probably the best answer

[–] NeoNachtwaechter@lemmy.world 10 points 1 month ago (2 children)

Was i just gambling my life or has the tech somehow gotten worse?

We can safely assume that the tech has evolved for the better during this time.

But it is still too dangerous. It should not be allowed on public roads yet.

[–] BlameThePeacock@lemmy.ca 6 points 1 month ago (1 children)

The public honestly shouldn't be allowed on roads yet either... they're fucking terrible at driving.

[–] 0x4E4F@sh.itjust.works 4 points 1 month ago

Especially kids with BMWs.

[–] Drivebyhaiku@lemmy.world 1 points 1 month ago

That the tech has evolved to be better is actually an assumption. The novel-data problem hasn't been meaningfully addressed at all, so mostly we assume that progress has been made... but it's not meaningful progress. The promises being made for future capability are mostly pretty stale hype that hasn't changed year to year, with a lot of the targets remaining unchanged. We are getting more data on where specifically and how it's failing, which is something, but overall it appears to be a plateau of non-linear progress, with some updates being less safe than others.

That actually safe self-driving cars might be decades away, however, is antithetical to the hype-driven marketing campaigns that are working overtime to put up smoke and mirrors around the issue.

[–] gusgalarnyk@lemmy.world 9 points 1 month ago (1 children)

I appreciate these comments saying the tech hasn't degraded and has been at a standstill, or that it was never great in the first place, all of which is true, but I would like to interject my own Model 3 experience. When we first bought the Tesla in 2019, the self-driving functionality on the highway felt safe and functional in nominal conditions. When we sold the Tesla 2 years ago (2022), the self-driving felt noticeably more finicky. It struggled to switch lanes, struggled to recognize when lanes started and ended, and had noticeably more issues with maintaining proper speed and distance with other cars.

It probably wasn't significantly more dangerous, but it felt like it was. What was a feature we used for the first year or two without much complaint turned into something we never used, and our driving time went down in that third year, not up, so I don't think it was exposure time.

[–] ContrarianTrail@lemm.ee 1 points 1 month ago (1 children)

That's Autopilot though, not FSD. Until not too long ago it was still switching to Autopilot on highways, whereas in cities it was using an entirely different system. Nowadays the neural-net-based system is used on highways too, assuming you have an FSD subscription. If not, then it'll keep using Autopilot, which is a less advanced system.

[–] Anticorp@lemmy.world 3 points 1 month ago (2 children)

So you're saying the less advanced system works better?

[–] asdfasdfasdf@lemmy.world 1 points 1 month ago (1 children)

Probably, because it's simpler. Driving on a highway is way less complex of a task than driving in cities.

[–] Anticorp@lemmy.world 1 points 1 month ago

They should leave the working system in place for highways then.

[–] bitchkat@lemmy.world 1 points 1 month ago

It does a lot less. It just keeps you in your lane and adjusts speed. You can't change lanes without disengaging AP.

[–] ch00f@lemmy.world 7 points 1 month ago (1 children)

The first Model X had Autopilot 1, which was a system designed by Mobileye. Tesla's relationship with Mobileye fell apart, and they replaced it with an Nvidia-based system in 2017(?). It was really, really bad at the start, as they were essentially starting from scratch. This system also used 8 cameras instead of the original 1.

Then Tesla released AP hardware 3, a custom-built silicon chip designed specifically for self-driving, which also enabled proper navigation of surface streets in addition to just the highway lane-keeping offered in AP1. This broadened scope of actually dealing with turns and traffic from multiple angles is probably where the reputation of it being dangerous has come from.

My HW3-enabled Model 3 does make mistakes, though it's rarely anything like hitting a pedestrian or running off the road. Most of my issues are with navigational errors. If the GPS gets messed up in a tunnel, it'll suddenly decide to take an exit that it isn't supposed to, or it'll get in the left lane to pass someone 1/4 mile from a right-hand exit.

[–] zante@lemmy.wtf 17 points 1 month ago (2 children)

I’m glad you rarely hit pedestrians

[–] 0x4E4F@sh.itjust.works 7 points 1 month ago

We call them NPCs here sir.

[–] ch00f@lemmy.world 4 points 1 month ago (1 children)

Heh, I guess I should have phrased that differently.

But yeah, it's actually really courteous. Sometimes a little too much. It'll move over to the left side of the lane if it sees a cyclist or pedestrian on the shoulder to the right. Unfortunately, it doesn't understand when there's a 3 ft concrete barrier between me and the pedestrian and will do it anyway. Makes some narrow bridge crossings a little scarier than necessary.

[–] zante@lemmy.wtf 1 points 1 month ago

[–] hendrik@palaver.p3x.de 7 points 1 month ago* (last edited 1 month ago)

Probably the gambling thing. And it's not like they wreck themselves every 50 miles, so the anecdotal evidence doesn't really apply. I mean, unless they can't do that anymore. But I think the negative press is about rarer incidents.

[–] NevermindNoMind@lemmy.world 6 points 1 month ago

Just a guess, but it's probably a combination of two things. First, if we say a self-driving car is going to hit an edge case it can't resolve once every, say, 100,000 miles, the number of Teslas and other self-driving cars on the roads now means more miles are driven more frequently, which means those edge cases are going to occur more frequently. Second, people are becoming over-reliant on self-driving - they are (incorrectly) trusting it more and paying less attention, meaning less chance of human intervention when those edge cases occur. So the self-driving is probably better overall, but the number of accidents overall is increasing.
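(A toy illustration of that first point, with every number made up: even a fixed, low edge-case rate produces a lot more visible incidents once the fleet's total mileage gets big enough.)

```python
# Toy illustration: a constant edge-case rate still yields more total
# incidents as fleet mileage grows. All figures are invented.
MILES_PER_EDGE_CASE = 100_000  # one unresolved edge case per ~100k miles

for fleet_miles_per_day in (1_000_000, 10_000_000, 100_000_000):
    edge_cases_per_day = fleet_miles_per_day / MILES_PER_EDGE_CASE
    print(f"{fleet_miles_per_day:>11,} fleet miles/day -> ~{edge_cases_per_day:,.0f} edge cases/day")
```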

[–] thesohoriots@lemmy.world 6 points 1 month ago

There are also many more Teslas on the road, and the “full self driving” incidents are more widely reported on since the new ownership likes to overpromise and vastly underdeliver. Other commenters have already addressed the tech side, but a few years ago, the Tesla-specific FSD was found to be active right up until a split second before some prolific collisions with emergency vehicles, leading to speculation on liability. Tesla aside, I think it’s just laziness on the part of drivers used to FSD doing the menial tasks of driving.

[–] Jobe 3 points 1 month ago

A guy on YT used to (or still does, idk) upload entire FSD trips after every update. I haven't watched any lately, but the trend was that the software was improving, with some regressions every now and then that were fixed in the next updates.

[–] phoenixz@lemmy.ca 3 points 1 month ago

No. Elon Musk simply is a liar on the level of Trump and always claimed Teslas could self-drive from one end of the country to the other, whereas in reality one wouldn't be able to get itself out of a parking lot.

Any claim and/or idea from Musk can basically be considered a lie.

[–] bitchkat@lemmy.world 2 points 1 month ago

That car had AP1, which used hardware developed by Mobileye. Tesla doesn't like licensing software, so they ditched that for a homegrown solution.

[–] ContrarianTrail@lemm.ee -4 points 1 month ago* (last edited 1 month ago) (1 children)

Tesla recently switched from human-written code to full neural networks with their FSD system, which resulted in a significant leap in how reliable and human-like it became. Nowadays it's safe to assume it will perform better than the average human driver on most trips, but it's still not reliable enough to be blindly trusted. You still need to sit there, paying attention, to make sure it doesn't do anything stupid. It's by far the most advanced 'self-driving' system available on a vehicle you can buy.

The reason it's constantly in the news is that anything related to Elon, AI, or self-driving is guaranteed to get clicks in today's mediascape. It's not a flawless system by any means, but it's also not as bad as people make it out to be.

Obvious shill gonna shill obviously.