this post was submitted on 14 Jun 2023

Reddit is OpenAI's moat (www.cyberdemon.org)
submitted 1 year ago* (last edited 1 year ago) by ambystoma@feddit.de to c/lemmyworld@lemmy.world
 

Interesting theory for what might have been another motivation behind the API changes. After all, Sam Altman (CEO of OpenAI) is a member of the Reddit board. What do you think?

edit: this is not my article by the way

[–] Izzy@lemmy.world 0 points 1 year ago (2 children)

I thought it was obvious this was the case. Twitter and Reddit are unhappy that AI language models have been trained on all this data without them getting paid for it. Personally, I don't even consider it "their" data to begin with, even if they can claim legal ownership of it. But they want to be paid obscene amounts of money for data that was created by the goodwill of their users.

[–] ambystoma@feddit.de 0 points 1 year ago

Yes, but the special thing here is that OpenAI, which shares several stakeholders with Reddit, has already trained its models on Reddit's data, so it might have an interest in cutting off that access for other companies. It might also be in a better position than smaller companies to negotiate special access to the data with Reddit.

It's a pretty wild theory, but interesting nonetheless.

[–] flibbertigibbet@feddit.de 0 points 1 year ago

If they were worried about data being scooped up for AI training, they would have approached it differently. What they actually did was target third-party apps to drive people to the official app, where they can collect tracking data and push ads.