Chozo

joined 4 months ago
[–] Chozo@fedia.io 3 points 8 hours ago* (last edited 8 hours ago)

(I'm assuming we're talking about unprotected left turns.)

I don't know if I ever saw it happen, myself, so I can't say for certain. My understanding of the SDC's logic is that if it was already in the intersection, it would complete the turn, and then pull off to the right shoulder to let the emergency vehicle pass. If it hasn't yet entered the intersection and detects siren lights behind it, I believe it will turn on the hazard lights and remain stationary unless honked at (I could be mistaken, but I think it'll recognize being honked at by emergency vehicles, and will assume it to mean "move forward and clear a path"). The SDCs have an array of microphones around the car to detect honks, sirens, nearby crashes, etc, and can tell the direction the sounds are coming from for this purpose.
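Just to make the branching I described concrete, here's a tiny sketch of that decision logic (entirely hypothetical names and structure on my part, not Waymo's actual code):

```python
def respond_to_siren(in_intersection: bool, honked_at: bool) -> str:
    """Hypothetical sketch of the emergency-vehicle response described above."""
    if in_intersection:
        # Already committed to the intersection: finish the turn,
        # then yield on the right shoulder.
        return "complete_turn_then_pull_right"
    if honked_at:
        # A honk from the emergency vehicle is read as "clear a path".
        return "move_forward_to_clear_path"
    # Default: hazards on, stay put until signaled otherwise.
    return "hazards_on_and_hold"
```

The microphone array is what feeds the `honked_at` side of this, since it can localize where the honk came from.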

That said, because it's listening for sirens, the SDC will usually be aware that there's an emergency vehicle heading toward it well ahead of time, and if they've got their lights on, the SDC will usually be able to determine which vehicle, specifically, is the emergency vehicle, so it can monitor its trajectory and make sure it's staying out of the way when possible. Typically, they will be proactive about steering clear of anything with lights/sirens running.

This would also be considered a higher-priority event, and usually it will automatically ping a live human to remotely monitor the situation. Depending on the specific context, they may command the SDC to remain stationary, proceed forward, make a U-turn, or whatever else may be necessary. In case the emergency vehicle has a loudspeaker, we'd also be able to hear any requests they're making of us.

For what it's worth, I know that Waymo also works pretty closely with the Phoenix PD, and provide them with updates about any significant changes to the car's behaviors or any tips/tricks for dealing with a stuck car in an emergency situation, so if a situation got particularly sticky, the cops would know how to work around it. My understanding is that Phoenix PD has generally been very cooperative, though they've apparently had issues with state troopers who don't seem to care to learn about how to deal with the cars.

[–] Chozo@fedia.io 2 points 12 hours ago

I tried it on mine, but couldn't get it to happen. It's possible that Gboard might be trying to make a grammatical correction. Not to imply that you don't know how to write, but something that needs to be asked from a troubleshooting perspective: Are you sure you're using the correct word for the sentence?

Also, and this might be an annoying step to take but, does it continue after clearing the Gboard app settings completely? I think Gboard will also make "habitual" corrections based on your usage, so it could also be possible that it thinks you frequently delete "don't" and replace it with "didn't", and is trying to preemptively correct it.

[–] Chozo@fedia.io 44 points 14 hours ago (5 children)

I used to work on the software for these cars, so I can speak to this a little. For what it's worth, I'm no longer with the project, so I have no reason to be sucking Google's dick if these weren't my honest opinions on the tech being used here. None of this is to excuse or defend Google, just sharing my insight on how these cars operate based on my experiences with them.

Waymo's cars actually do a really good job at self-navigation. Like, sometimes it's scary how good they actually are when you see the conditions they can operate under. There are so many layers of redundancies that you could lose all of the camera feeds, GPS, and cellular data, and they'll still be able to navigate themselves through traffic by the LIDAR sensors. Hell, even if you removed the LIDAR from that scenario, those cars accurately know their location based on the last known location combined with how many times each tire has revolved (though it'd just run into everything along the way, but at least it'd know where it's located the entire time). All of the other sensors and data points collected by the cars actually end up making GPS into the least accurate sensor on the car.
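The tire-revolution trick is just classic dead reckoning. A minimal illustration of the idea (the circumference value and function names are made up for illustration, obviously not the real system):

```python
import math

WHEEL_CIRCUMFERENCE_M = 2.0  # hypothetical tire circumference, in meters

def dead_reckon(x: float, y: float, heading_rad: float, revolutions: float):
    """Advance a last-known (x, y) position by the distance implied
    by tire revolutions, along the current heading."""
    distance = revolutions * WHEEL_CIRCUMFERENCE_M
    return (x + distance * math.cos(heading_rad),
            y + distance * math.sin(heading_rad))

# Starting at the origin heading due east (0 rad), 10 revolutions
# moves the estimated position ~20 m east.
x, y = dead_reckon(0.0, 0.0, 0.0, 10)
```

The real system fuses this with everything else (IMU, LIDAR, etc.), which is why losing GPS alone barely matters.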

That said, the article mentions that it was due to "inconsistent construction signage", which I'd assume to be pretty accurate from my own experience with these cars. Waymo's cars are usually really good at detecting cone placements and determining where traffic is being rerouted to. But... that's generally only when the cones are where they're supposed to be. I've seen enough roadwork in Phoenix to know that sometimes Mad Max rules get applied, and even I wouldn't know how to drive through some of those work zones. It was pretty rare that I'd have to remotely take over an SDC, but 9 times out of 10 when I did, it was because construction signs/equipment were in weird places and I'd have to K-turn the car back the way it came.

That's not to say that construction consistently causes the cars to get stuck, but I'd say it was one of the more common pain points. In theory, if somebody runs over a cone and nobody picks it back up, an SDC might not interpret that obstruction properly and can make a dumb decision, like going down the wrong lane under the incorrect assumption that traffic has been temporarily rerouted that way. It sounds scary, and probably looks scary as hell if you saw it on the street, but even then it's going to stop itself before coming anywhere near an oncoming car, even if it thinks it has right of way, since proximity to other objects takes priority over temporary signage.

The "driving through a red light" part I'm assuming might actually be inaccurate. Cops do lie, after all. I 100% believe in a Waymo car going down the opposing lane after some sketchy road cones, but I have a hard time buying that it ran a red light, since they will not go if they don't detect a green light. Passing through an intersection requires a positive detection of a green light; positive or negative detection of red won't matter, it has to see a green light for its intended lane or it will assume it has to stop at the line.
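The green-light gate I'm describing amounts to something like this (a hypothetical simplification of my own, not the real perception stack):

```python
def may_enter_intersection(detections: dict) -> bool:
    """Proceed only on a positive green detection for the intended lane.
    Merely failing to detect a red light is NOT sufficient to go."""
    return detections.get("green", False)
```

So `may_enter_intersection({})` is `False`: no detection at all means stop at the line, and even `{"red": False}` (no red seen) still means stop. Only an affirmative `{"green": True}` lets the car proceed.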

In the video, the cop says he turns on his lights and the SDC blows through a red light. While I was working there, red light violations were so rare that literally 100% of the ones we received happened while a human was driving the car in manual mode. What I'd assume was likely going on is that the SDC was already in a state of "owning" the intersection for an unprotected left turn when the lights came on. When an SDC thinks it's being pulled over, it's going to go through its "pullover" process, which first requires exiting an intersection if currently in one. So what likely ended up happening is: the SDC was already in the intersection preparing for a left turn, the light turned red while the SDC was in the box (where it still legally had right of way), the cop turned on his lights, and the SDC proceeded "forward" through the intersection until it was able to pull over.

But, that's just my speculation based on my somewhat outdated understanding of the software behind these cars. I'd love to see the video of it, but I doubt Waymo will release it unless there's a lawsuit.

[–] Chozo@fedia.io 44 points 15 hours ago (11 children)

I'm not sure why the police say it's "not feasible" to issue Google a citation. Google is the registered owner of the vehicles and thus responsible for any actions they perform; just mail them a ticket?

[–] Chozo@fedia.io 12 points 18 hours ago (1 children)

You can enable monetization on your server once you hit certain requirements. It will let users pay to become a member/subscriber, kinda similar to subscribing/joining on Twitch/YouTube. Basically just gives you a special role in the server, maybe access to hidden channels, etc, depending on how the server owner has it configured.

[–] Chozo@fedia.io 10 points 22 hours ago

So, a subway handle.

[–] Chozo@fedia.io 84 points 1 day ago (1 children)

Microsoft's much-heralded Notepad.exe was storing files as plain text

Same level of security concern. Quit putting your sensitive data into apps that aren't meant for it.

[–] Chozo@fedia.io 5 points 1 day ago (3 children)

Not sure if you're joking or not, but if not: they lost a lawsuit just over a month ago pertaining to distributing copyrighted works, the fallout of which is still being sorted through.

So maybe don't upload more copyrighted works to their servers for now.

[–] Chozo@fedia.io 4 points 1 day ago* (last edited 1 day ago)

The Alabama Ammo Bandito?!

[–] Chozo@fedia.io 51 points 1 day ago (3 children)

Texas here. Wishing I knew what that felt like.

[–] Chozo@fedia.io 1 points 1 day ago

Wordle 1,112 3/6*

🟨🟨⬛🟨⬛ 🟩🟩⬛🟩🟩 🟩🟩🟩🟩🟩

[–] Chozo@fedia.io 7 points 1 day ago

Gee, I can't imagine why somebody might make a throwaway account to ask a question about something they already said they've been bullied over.

 

Roko's basilisk is a thought experiment which states that an otherwise benevolent artificial superintelligence (AI) in the future would be incentivized to create a virtual reality simulation to torture anyone who knew of its potential existence but did not directly contribute to its advancement or development, in order to incentivize said advancement. It originated in a 2010 post at discussion board LessWrong, a technical forum focused on analytical rational enquiry. The thought experiment's name derives from the poster of the article (Roko) and the basilisk, a mythical creature capable of destroying enemies with its stare.

While the theory was initially dismissed as nothing but conjecture or speculation by many LessWrong users, LessWrong co-founder Eliezer Yudkowsky reported users who panicked upon reading the theory, due to its stipulation that knowing about the theory and its basilisk made one vulnerable to the basilisk itself. This led to discussion of the basilisk on the site being banned for five years. However, these reports were later dismissed as being exaggerations or inconsequential, and the theory itself was dismissed as nonsense, including by Yudkowsky himself. Even after the post's discreditation, it is still used as an example of principles such as Bayesian probability and implicit religion. It is also regarded as a simplified, derivative version of Pascal's wager.

Found out about this after stumbling upon this Kyle Hill video on the subject. It reminds me a little bit of "The Game".

 
 

Don't poke the Viper in the jungle unless you're ready for the venom.
