this post was submitted on 01 Sep 2024
343 points (96.7% liked)

RPGMemes

10131 readers

Humor, jokes, memes about TTRPGs

founded 1 year ago

This comic follows on from the previous comic, which will almost certainly provide context.

You might not wanna be famous, but when you're level 10, every organization within a mile is watching what you're doing.

[–] Forester@yiffit.net 46 points 2 weeks ago* (last edited 2 weeks ago) (14 children)

This won't fix it but it might help.

Make sure you have a robots.txt file with a crawl delay of 30 seconds set for all agents, and that you're disallowing most of the WordPress directories, such as wp-admin, the media uploads directory, etc.
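A minimal robots.txt along those lines might look like this (paths assume a default WordPress install; note that `Crawl-delay` is a nonstandard directive that some crawlers ignore):

```
User-agent: *
Crawl-delay: 30
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/uploads/
```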

I would also strongly recommend using a caching system if you aren't already. It's a lot more efficient to serve the same image to a hundred different bots from RAM than to load it off your drive every time.
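One common way to do that is a page cache at the web server. A sketch for nginx in front of WordPress (the zone name and paths here are illustrative, not from the post; on many distros `/var/run` is tmpfs, so hits are served from memory):

```
fastcgi_cache_path /var/run/nginx-cache levels=1:2 keys_zone=wpcache:100m inactive=60m;

server {
    location ~ \.php$ {
        fastcgi_pass unix:/run/php/php-fpm.sock;
        include fastcgi_params;
        fastcgi_cache wpcache;
        fastcgi_cache_valid 200 60m;   # repeat hits come from the cache, not PHP/disk
    }
}
```

A WordPress caching plugin achieves much the same thing at the application layer if you don't control the server config.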

Just my personal opinion from working in a web hosting environment.

That'll probably help if it's an I/O issue.

[–] ahdok@ttrpg.network 35 points 2 weeks ago (8 children)

Most of these AI scrapers don't respect robots.txt, so I'm not sure that really helps much, but... we have tried doing all of these things.

[–] itslilith@lemmy.blahaj.zone 22 points 2 weeks ago (6 children)

Someone on Lemmy suggested creating a dummy endpoint that normal people won't be able to navigate to, and disallowing it in robots.txt.

Then when somebody crawls it, you know they're ignoring robots.txt, and you IP-ban them.
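The idea boils down to a few lines of logic. A minimal sketch in Python (the trap path name is made up for illustration; a real setup would enforce the ban at the firewall or reverse proxy, not in application code):

```python
# Honeypot: TRAP_PATH is linked nowhere and disallowed in robots.txt,
# so any client requesting it is ignoring robots.txt and gets banned.
TRAP_PATH = "/trap-do-not-crawl/"
banned_ips = set()

def handle_request(client_ip: str, path: str) -> int:
    """Return an HTTP status code for a request, banning robots.txt violators."""
    if client_ip in banned_ips:
        return 403                 # already banned
    if path == TRAP_PATH:
        banned_ips.add(client_ip)  # crawler ignored robots.txt
        return 403
    return 200                     # normal request
```

Persisting the ban list and feeding it to something like a firewall rule set is left out here for brevity.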

[–] ahdok@ttrpg.network 14 points 2 weeks ago (3 children)

That's pretty clever.

I think these AI scrapers might be smart enough that this doesn't really work, though. At least if I were designing them, I'd have them all come from dynamic IPs and never have any of them hit the same target more than once. These things are very dedicated to acquiring content without consent, and if they're capable of causing problems for (say) Reddit, I'm not sure my little website is going to have much luck deterring them.

Honestly a better strategy might be to just glaze everything I draw.

[–] Johanno 7 points 2 weeks ago (1 children)

I am not sure if it costs money, but you could implement captchas.

Or use cloudflare to do that bot detecting for you.

Worst case you make it so you need to create an account to see content.

[–] ahdok@ttrpg.network 4 points 2 weeks ago

Well, we're already using Cloudflare; that's one of the other reasons the site is so slow... I don't think the other two suggestions would prevent a scraper from requesting the information from the server. I think they'd just make it more arduous for real people to access the content.

[–] MouseKeyboard@ttrpg.network 3 points 1 week ago

> Honestly a better strategy might be to just glaze everything I draw.

I doubt that will help, they can still scrape the site and then wait until whatever version of Glaze was applied is cracked.

[–] Lumisal@lemmy.world 2 points 2 weeks ago (1 children)

Instead of a tech solution, why not a legal one? State somewhere on the website that refusal to follow your robots.txt constitutes agreement to pay you X amount of money for your content. Then combine that with the dummy-page solution the other person brought up, so you can record the IP address and take them to court to make them pay. It has the potential to bring you a really, really nice chunk of money.

[–] ahdok@ttrpg.network 5 points 2 weeks ago

I believe that there are multiple very high profile billion-dollar lawsuits being run against AI companies right now. I don't really have the budget to sue these companies.
