this post was submitted on 18 Aug 2024
17 points (87.0% liked)

Cybersecurity


Copilot Autofix, a new addition to the GitHub Advanced Security service, analyzes vulnerabilities in code and offers code suggestions to help developers fix them.
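
To make that concrete, here is a minimal sketch (not actual Copilot Autofix output) of the kind of flaw such a scanner flags and the style of fix it typically suggests: SQL built by string concatenation, rewritten as a parameterized query. The function names and table schema are made up for illustration.

import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, username: str):
    # Flagged pattern: user input concatenated into the SQL string (SQL injection risk).
    query = "SELECT id, email FROM users WHERE name = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_fixed(conn: sqlite3.Connection, username: str):
    # Suggested style of fix: a parameterized query, so the driver handles escaping.
    return conn.execute("SELECT id, email FROM users WHERE name = ?", (username,)).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
    conn.execute("INSERT INTO users (name, email) VALUES ('alice', 'alice@example.com')")
    print(find_user_fixed(conn, "alice"))  # [(1, 'alice@example.com')]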

top 4 comments
[–] ShinkanTrain@lemmy.ml 22 points 2 months ago (3 children)
[–] prex@aussie.zone 7 points 2 months ago

Autofix has now corrected your sentence to:

"We're all going to die."

This is now a perfectly correct sentence in every way.

Thank you for using Autofix.

[–] Gladaed 4 points 2 months ago

True, but unrelated. LLMs aren't sentient. They're just a useful tool at times.

[–] hoch@lemmy.world 0 points 2 months ago

Please point to where the language model hurt you.