Asifall

joined 1 year ago
[–] Asifall@lemmy.world 1 points 6 days ago

Is there evidence that this is true? I’ve read that the US is actually not more litigious than some European nations, and that the idea that it is has been boosted by corporations that want to shift public opinion against plaintiffs (an example being the infamous McDonald’s coffee lawsuit).

[–] Asifall@lemmy.world 1 points 6 days ago

I’m not qualified to say if this is accurate, but thanks for putting in the effort to write it!

[–] Asifall@lemmy.world 7 points 1 week ago (1 children)

Growing up, my mom didn’t understand this and always insisted that sink plungers were the only kind that worked (she also called them toilet plungers) and that toilet plungers (the fancy kind) were some kind of trick. It took until I was in college to learn you shouldn’t have to break a sweat unclogging your toilet.

[–] Asifall@lemmy.world 1 points 1 month ago

I think we also need levels of PII or something, maybe a completely different framework.

There’s this pattern I see at work where you want a user to be identifiable by some key, so you generate that key when an account is created and then pass it around instead of someone’s actual name or anything. The problem, though, is that as soon as you link that value to user details anywhere in your system, the value itself becomes PII, because it could be used to correlate more relevant PII in other parts of your system. This viral property creates a situation where a stupid percentage of your data must be considered PII, because the only way it isn’t is if you can show there is no way to link the data to anybody’s personal information across every data store in the company.
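To illustrate what I mean, here’s a toy sketch (completely made-up service names and fields, not from any real system): one service mints an opaque key at account creation, a second service only ever stores that key, and yet a trivial join between the two stores re-identifies the user, which is why the key itself has to be treated as PII.

```python
import uuid

# Account service: mints an opaque key so other services never handle the
# real name or email directly. (Hypothetical in-memory stores for illustration.)
accounts = {}        # user_key -> {"name": ..., "email": ...}

def create_account(name: str, email: str) -> str:
    user_key = str(uuid.uuid4())  # surrogate key, carries no personal info by itself
    accounts[user_key] = {"name": name, "email": email}
    return user_key

# A downstream service only ever stores the opaque key...
order_history = []   # list of {"user_key": ..., "item": ...}

def record_order(user_key: str, item: str) -> None:
    order_history.append({"user_key": user_key, "item": item})

if __name__ == "__main__":
    key = create_account("Alice Example", "alice@example.com")
    record_order(key, "toilet plunger")

    # ...but anyone who can join the two stores re-identifies the person,
    # which is why the "anonymous" key ends up having to be treated as PII too.
    for order in order_history:
        person = accounts[order["user_key"]]
        print(person["name"], "bought", order["item"])
```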

So why is this a problem? Because if all data is sensitive, none of it is. It creates situations where the production systems are so locked down that the only way for engineers to do basic operations is to bend the rules, and inevitably they will.

Anyway, I don’t know what the solution is, but I expect data leaks will continue to be common past the point when the situation is obviously unsustainable.

[–] Asifall@lemmy.world 1 points 1 month ago

In my experience most Starbucks workers will just let you say small/medium/large without questioning it.

[–] Asifall@lemmy.world 3 points 1 month ago (3 children)

Maybe, but nobody cares when you call them fascists, so I’m not sure what the best move is.

I do actually think the weirdness argument probably plays to the suburban traditional-values types who want to believe they’re the normal ones and everyone else is going crazy.

[–] Asifall@lemmy.world 5 points 1 month ago

I think at least some of it is because Republicans already call Democrats corrupt and criminal. When the Dems come back and make the same accusations, it just looks like bickering. Ideally the substance of such claims would matter, but current political discourse in the US prioritizes sound bites and quips.

[–] Asifall@lemmy.world 1 points 1 month ago (1 children)

That’s another good one. The Trump/Putin kissing mural is a great example of something that ends up being homophobic rather than partisan.

So you think it should be illegal?

> If you used it to slander your neighbor, it would not be legal.

You’re entirely ignoring my point: I’m not trying to pass the video off as real, therefore it’s not slander.

[–] Asifall@lemmy.world 1 points 1 month ago (3 children)

You keep referring to this as revenge porn, which to me is a case where someone spreads nudes around as a way to punish their current or former partner. You could use AI to generate material to use as revenge porn, but I bet most AI nudes are not that.

Think about a political comic showing a pro-corporate politician performing a sex act with Jeff Bezos. Clearly that would be protected speech. If you generate the same image with generative AI, though, then suddenly it’s illegal even if you clearly label it as a parody. That’s the concern. Moreover, the slander/libel angle doesn’t make sense if you include a warning that the image is generated, as you are not making a false statement.

To sum up why I think this bill is kinda weird and likely to be ineffective: it’s perfectly legal for me to generate and distribute a fake AI video of my neighbor shooting a puppy as long as I don’t present it as a real video. If I generate the same video but my neighbor’s dick is hanging out, straight to jail. It’s not consistent.

[–] Asifall@lemmy.world 9 points 1 month ago (3 children)

Do you know what enthusiasm means? The article you linked has a number of examples of data that might lead someone to believe in the increased enthusiasm. Did you read it?

[–] Asifall@lemmy.world 1 points 1 month ago (5 children)

> That’s arguably a better rule than the more traditional flat-fee penalties, as it curbs the impulse to treat violations as cost-of-business. A firm that makes $1B/year isn’t going to blink at a handful of $1000 judgements.

No argument there, but it reinforces my point that this law is written for Taylor Swift and not a random high schooler.

> You’d be liable for producing an animated short starring “Definitely Not Mickey Mouse” under the same reasoning.

Except that there are fair use exceptions specifically to prevent copyright law from running afoul of the First Amendment. You can see the parody exception used in many episodes of South Park, for example, and even specifically used to depict Mickey Mouse. Either this bill allows for those types of uses, in which case it’s toothless anyway, or it’s much more restrictive of speech than existing copyright law.

[–] Asifall@lemmy.world 9 points 1 month ago* (last edited 1 month ago) (9 children)

Not convinced on this one

It seems like the bill is being pitched as protecting women who have fake nudes passed around their school, but the text of the bill seems more aimed at the Taylor Swift case.

1. The bill only applies where there is an “intent to distribute”

2. The bill talks about damages being calculated based on the profit of the defendant

The bill also states that labeling the image as AI generated, or relying on the context of publication, won’t keep you from running afoul of this law. That seems at odds with the First Amendment.
