this post was submitted on 23 Mar 2023
Lemmy Support
Support / questions about Lemmy.
Yes, it's a bug in the crawler: it throws a "too many files" error. The problem seems to come from recursion, because up to 2700 tasks end up active concurrently. Need to find a way to optimize this.
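One common fix for this kind of problem (not necessarily what the crawler ended up doing) is to bound concurrency with a semaphore, so that only a fixed number of tasks hold a network connection, and thus a file descriptor, at any one time. A minimal sketch in Python's asyncio, where the URLs and the `fetch` body are purely illustrative:

```python
import asyncio

async def fetch(url, sem):
    # The semaphore caps how many fetches run at once, so even with
    # thousands of queued tasks only `max_concurrent` descriptors are open.
    async with sem:
        await asyncio.sleep(0)  # stand-in for the real network request
        return url

async def crawl(urls, max_concurrent=50):
    sem = asyncio.Semaphore(max_concurrent)
    # All 2700 tasks are created up front, but the semaphore throttles them.
    return await asyncio.gather(*(fetch(u, sem) for u in urls))

results = asyncio.run(crawl([f"https://example.org/{i}" for i in range(2700)]))
print(len(results))
```

The same idea works in other runtimes (e.g. a tokio `Semaphore` in Rust); the key point is that the number of queued tasks no longer determines the number of simultaneously open sockets.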
Do you run them in parallel?