It has been brought to my attention that certain users are abusing the torrent search and other resources by issuing multiple requests per second with scrapers or scripts, and those actions have been making the site slower and slower.

I have implemented rate limiting and the current limits are:

Default page, per IP: 50 requests / 5 minutes
Default page, per user: 100 requests / 5 minutes

RSS page, per IP: 4 requests / 5 minutes
RSS page, per user: 8 requests / 5 minutes

Torrent page, per IP: 20 requests / 5 minutes
Torrent page, per user: 40 requests / 5 minutes

API, per IP: 20 requests / 5 minutes
API, per user: 40 requests / 5 minutes

Default = all other pages.
Per IP = requests are counted by IP address.
Per user = if you access the site from multiple IPs/locations (browser at home, seedbox with RSS/autodl, Jackett, Sonarr, Radarr), all of those requests are combined. The per-user limits are currently calculated assuming 2 instances.

I urge every user to check whether they have automations that do not respect these limits and, if so, to adjust them to meet these criteria.
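If your automation polls on a fixed schedule, one simple way to stay under a limit is to space requests evenly: divide the window length by the request budget to get a minimum interval. As a rough sketch (the Throttle class and its use are my own illustration, not an official client), the RSS per-IP limit of 4 requests / 5 minutes works out to one request every 75 seconds:

```python
import time

# Illustrative helper: spaces requests evenly so a client stays under
# a "N requests per window" limit. Not an official tool of the site.
class Throttle:
    def __init__(self, max_requests, per_seconds):
        # Minimum gap between requests that keeps us within the budget.
        self.interval = per_seconds / max_requests
        self.last = 0.0

    def wait(self):
        # Sleep just long enough to honor the interval, then record the time.
        now = time.monotonic()
        sleep_for = self.last + self.interval - now
        if sleep_for > 0:
            time.sleep(sleep_for)
        self.last = time.monotonic()

# RSS per-IP limit: 4 requests / 5 minutes -> one request every 75 s.
rss = Throttle(max_requests=4, per_seconds=300)
print(rss.interval)  # 75.0
```

Calling rss.wait() before each fetch keeps a single-IP RSS poller within the limit; remember that the per-user limits combine all of your IPs, so budget across your whole setup, not per machine.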

The current limits are aggressive and subject to change. We will adjust them so that normal usage of the site is not impacted. If you have a unique setup and cannot adjust it, please let us know so we can take your needs into consideration.