I had a look. It looks like if you set the category, it works; however, there seems to be a bug in Radarr where it ignores the indexer's lack of movie-search support. AT doesn't advertise the 'movie-search' capability, and there's code in Radarr to check for that, but it still sends movie searches regardless (it should be sending regular searches instead). I'll see if I can get a bug report in.
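The intended behaviour described above can be sketched like this. This is a hypothetical illustration, not Radarr's actual code: `Caps` and `buildSearchParams` are made-up names, and the point is only the fallback from `t=movie` to `t=search` when caps don't list movie-search.

```typescript
// Hypothetical sketch of capability-aware query building (not Radarr's code):
// if the indexer's caps don't advertise movie-search, fall back to a plain
// search query instead of sending t=movie anyway (the bug described above).
type Caps = { searching: { search: boolean; movieSearch: boolean } };

function buildSearchParams(caps: Caps, term: string): URLSearchParams {
  const params = new URLSearchParams({ apikey: "0", q: term });
  // Downgrade to a regular 'search' when 'movie' queries are unsupported.
  params.set("t", caps.searching.movieSearch ? "movie" : "search");
  return params;
}
```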
I'm attempting to add Anime Tosho to Sonarr as an indexer, but keep getting a 404 error when testing the connection. I'm running the latest version as a Docker container in Unraid. Here are the settings:
Enable RSS: Yes
Enable Automatic Search: Yes
Enable Interactive Search: Yes
URL: https://feed.animetosho.org/api
API Key: 0
Categories: SD, HD
Anime Categories: TV (5000) [I'm assuming this is all of them]
After hitting Test, I receive the error "Unable to connect to indexer, check the log for more details." Output from the log is: NzbDrone.Common.Http.HttpException: HTTP request failed: [404:NotFound] [GET] at [https://feed.animetosho.org/api/api?t=caps&apikey=0]
I have no issues connecting to the site in a browser (on a different computer); on the server, pinging animetosho.org comes back perfectly with no packet loss, and traceroute makes it all the way there.
I grabbed an NZB mirror and manually added it to SABnzbd (another docker container on the same server), and it found the file and downloaded it no problem.
What am I missing here? I'm sure I'm not the first person to have this issue; is there a walkthrough or something that explains this?
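Judging from the logged URL, the likely cause is that Newznab clients like Sonarr typically append `api` to the configured base URL themselves, so putting `/api` in the URL field doubles the path. A small sketch (`buildCapsUrl` is a hypothetical helper, not Sonarr's actual code) shows the failure mode:

```typescript
// Hypothetical sketch of how a Newznab client builds its caps request,
// assuming the usual convention of appending '/api' to the base URL.
function buildCapsUrl(baseUrl: string, apiKey: string): string {
  const base = baseUrl.replace(/\/+$/, ""); // strip trailing slashes
  return `${base}/api?t=caps&apikey=${apiKey}`;
}

// With the URL field set to ".../api", the path gets doubled, matching
// the 404 URL seen in the log:
buildCapsUrl("https://feed.animetosho.org/api", "0");
// → "https://feed.animetosho.org/api/api?t=caps&apikey=0"

// Setting the URL field to the bare host avoids the doubling:
buildCapsUrl("https://feed.animetosho.org", "0");
// → "https://feed.animetosho.org/api?t=caps&apikey=0"
```

If this is the issue, changing the URL field to https://feed.animetosho.org should make the Test button succeed.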
Are you ticking the 'Remember' checkbox when you log in?
If so, try checking what cookie is being set after you log in. In Chrome, press F12, go to Application tab, then on the left side, search for Cookies and click on that. On the list on the right, there should be an entry for 'ant[usertoken]' - check that the Expires time is correct.
That's something I am aware of. I use Chrome and Vivaldi as well as Opera and Puffin on desktop and mobile, and since these are my own devices, I always turn off the option to clear browsing data on exit. There are no issues with other sites: they were still logged in when I accessed them, but not AnimeTosho. That's why I said it's either a server-side or a user-side issue. I don't know ¯\_(ツ)_/¯
It's either a server-side or a user-side issue, but the site logs me out every time I close my browser. I know that logging in isn't required to use the site, but I'd like to see releases in my local time zone, and the only way to do so is to log in. I appreciate how the site currently functions, but if possible, how about making the 'current time' automatically detect and adjust to the user's timezone without requiring a login?
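For what it's worth, the standard Intl API can do the suggested login-free timezone detection client-side. This is a sketch of the idea, not anything AnimeTosho actually implements; the function names are made up.

```typescript
// Sketch of login-free local-time display using the standard Intl API.
function detectTimeZone(): string {
  // The visitor's IANA timezone, e.g. "Asia/Tokyo" — no login required.
  return Intl.DateTimeFormat().resolvedOptions().timeZone;
}

function formatLocal(utcMillis: number, timeZone: string): string {
  // Render a stored UTC timestamp in the visitor's zone.
  return new Intl.DateTimeFormat("en-GB", {
    dateStyle: "short",
    timeStyle: "short",
    timeZone,
  }).format(new Date(utcMillis));
}
```

A page would call `formatLocal(releaseTimeUtc, detectTimeZone())` on each timestamp, so release times adjust automatically for every visitor.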
Thanks for the suggestion. I'm a little unsure if it'll break anything out there (I think some clients are configured with specific categories, so they may be stuck with 5070). The other complication is that the API requires supporting a 'cat' filter, which I can't easily do. None of the sources (Nyaa/TT/Anidex) separate TV/movies, and though I could try something with matched AniDB data, I don't have an index to query against. Still, I can investigate further, but can't promise anything.
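If the matched-AniDB-data route were ever pursued, the mapping itself would be simple; the hard part, as noted, is having an index to query. Purely as a hypothetical sketch (the function and type names are invented; 2000 = Movies and 5070 = TV > Anime follow common Newznab category conventions):

```typescript
// Hypothetical sketch: map a matched AniDB entry type to a Newznab
// category id that a 'cat' filter could then select on. Not real
// AnimeTosho code; category numbers follow common Newznab conventions.
function toNewznabCategory(anidbType: string | null): number {
  switch (anidbType) {
    case "Movie":
      return 2000; // Movies — what Radarr could filter via &cat=2000
    case "TV Series":
    case "OVA":
      return 5070; // TV > Anime
    default:
      return 5070; // no metadata matched: keep today's single category
  }
}
```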
I don't know much about Radarr, but would it perhaps be possible to override categories on that end?
Thanks for your reply. I think there is no distinction between normal movies and anime movies, so it's unlikely there will ever be a separate category. I suggest using the regular movies category until such time as they actually add an anime movies category.
Thanks for the suggestion. Unfortunately Newznab doesn't define a category for anime movies, so can't really be done until they change that. There may also be some difficulty with distinguishing between movies and TV series, but I could try.
Comment in Feedback 15/03/2021 12:25 — Anonymous: "Likable Person"
Hi, thank you for your great website; it's hands down one of the best for NZBs. If I may ask, could you also add a category for anime movies in the NZB API? Right now AnimeTosho doesn't seem to work in Radarr.
We're just that busy. At the moment, we have about a 2.5 hour backlog remaining for the highest priority files, those mostly being the single file new releases.
AT doesn't create any torrents, and it skips end-of-season batches of weekly files it has previously uploaded from groups like Erai-Raws and SubsPlease rather than re-uploading them.
maybe it's because they are in a container instead of being uploaded as single files.
Forcing everything into an archive will obviously force indexers to treat the whole thing as one file. But wrapping things in RARs is generally a silly thing to do and isn't what's done here. It's unfortunate that this is the case, but that's just Usenet I guess.
the known bad stuff like Cleo or Judas which take another release and re-encode it
I believe Cleo generally re-encodes off someone else's encode, but Judas generally encodes directly from source, which presumably fits your definition of "original release", so you'd be mistaken on that part.
i see usenet as a backup for obscure and hard-to-find content that might not be seeded anymore by the time i find it, so that's why i see this site as a huge waste of potential
Processing more content is nice, and I have no idealistic objection against it. However I have limited resources to work with, so, as much as I don't like it, I have to draw a line somewhere.
I'm glad that you see an opportunity for hosting obscure content in Usenet, and again, I'm not against the idea - I just am unable to do it. And this is where you can help - with that motivation, I'd encourage you to look at what's needed to make it happen. If it's of any help, I do publish the tools I've written for Usenet uploading, free to be used in any way, to further assist you (or others) in achieving such a goal.
I'm hoping you see AT not as "wasted potential", but more as a demonstration of potential. Go and achieve what I am unable to!
What you want doesn't suit me and what I want doesn't suit you. But it sounds like you have plenty of resources, so I'll just wish you good ridden-- luck.
Interesting. They never got split up like that when I uploaded complete packs; maybe it's because they are in a container instead of being uploaded as single files. And for the encodes, I was talking about the known bad stuff like Cleo or Judas, which take another release and re-encode it. You upload plenty of those but miss the original release. These days many releases can be up to 30GB in size, and sometimes even more. I see Usenet as a backup for obscure and hard-to-find content that might not be seeded anymore by the time I find it, so that's why I see this site as a huge waste of potential.
02/04/2021 20:59 — Anonymous