In the past two weeks, the site has been very unstable: one moment a page would load, the next it wouldn’t load at all. The status of lemmy.world fluctuates a lot on lemmy-status.org, and there have been times when the site was down for hours.
Either the problems with its API responses are breaking lemmy.world, or a broken lemmy.world is causing problematic API responses.
Currently, you can ask lemmy.world for page one billion of its communities and it’ll return a non-empty response (listing the communities it thinks are on that page, instead of the empty response it should return). For something like lemmyverse.net, this means its crawler can never reach the end of a scan, and some apps may be trying to load the list endlessly.
References:
https://github.com/tgxn/lemmy-explorer/issues/139
(there’s also a post in this community about it, which I would link to if lemmy.world would load for me)
This is the same reason I had to turn off my search engine’s crawler.
Changes were made to the API to ignore any page > 99, so if you ask for page 100 or page 1_000_000_000 you get the first page again. This caused my crawler to never finish fetching “new” posts.
lemm.ee, on the other hand, made a similar change, but anything over page 99 returns an empty response. lemm.ee also flat-out ignores `sort=Old`, always returning an empty array.

Both of these servers did it for, I assume, the same reason: using a high page number significantly increases the response time. Before pages over 99 were blocked, those responses could take 8–10 seconds, while asking for a low page number would return in 300ms or less. Because the existing queries are hard, and maybe impossible, to optimize, the problematic APIs were simply disabled for now.
Can you check periodically so you can get the link?
Yep. Worked this time, so I’ve edited my comment.