Guess I need to go back a week and switch to SubsPlease. Sucks that they gave us no info. Maybe Erai-raws got scared of the new DMCA laws passed in the COVID bill. Not sure if any of it applies to anime. I do know it applies to IPTV streaming sites. Government officials definitely got a lot of money in payoffs to pass such an idiotic bill.
The script does try to use hashes and video length to help identify what a file is, but it can't get those until it has all the data, so matching is usually delayed until the torrent has finished. For skipped entries, since it'll never get that info, it just tries to match with the name alone.
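As a rough illustration of the two stages described above (all names here, like `identify` and `hash_index`, are hypothetical, not the site's actual code):

```python
# Sketch: hash+length match when the data is available, name-only
# fallback otherwise. Purely illustrative, not AnimeTosho's code.
import hashlib
import re

def guess_from_name(filename):
    """Name-only guess, the best a skipped entry can ever get."""
    name = re.sub(r"\[[^\]]*\]", "", filename)   # drop [Group] tags
    name = re.sub(r"\.\w+$", "", name)           # drop the extension
    return name.strip()

def identify(filename, data=None, hash_index=None):
    """Hash + length match when the full data is available (i.e. the
    torrent has finished downloading); otherwise fall back to the
    name alone."""
    if data is not None and hash_index is not None:
        key = (hashlib.md5(data).hexdigest(), len(data))
        if key in hash_index:
            return hash_index[key]
    return guess_from_name(filename)
```

A name-based pre-match while the torrent is still downloading would just mean calling the fallback path first and replacing the result once the hash lookup becomes possible.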
Since it seems that people care, I can look at wiring it so it tries a name based pre-match whilst the torrent is still downloading.
I believe the ones being skipped have no notable delay in displaying series info. The delay is in those being processed, and I think it probably has to do with creating entries and adding other info from anidb without the files.
But it's clear now what the question is about so you'll probably get a better answer after the holidays.
This has nothing to do with rss. By grayed out entries it means both those currently being processed and those that will be skipped. The stuff about how the torrent is classified depending on whether it is or is not fetched is moot, as the .torrent file is always fetched (it's in storage.animetosho.org/torrent, as you can tell by downloading any .torrent file). Obviously some torrents cannot be matched to an anidb etc entry, but the majority are. This site has all the information it will ever use (short of the owner manually intervening) the moment an entry appears for the users to see, so it would seem to be perfectly simple to implement the suggestion as stated in the OP - by running that part of the script earlier.
Do you mean torrents being processed, rather than those being skipped?
There is no field in a torrent header that declares what series it belongs to. My understanding is that AT looks for keyword matches between the torrent and anidb's database. If the torrent is not fetched, the torrent title is used. If it is fetched, the torrent title and filenames are used. So it has to wait for the filenames to become available.
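To illustrate, a keyword-overlap matcher might look something like this (the scoring and all names are assumptions for illustration, not AT's real algorithm):

```python
# Sketch: score each known series by keyword overlap with the torrent
# title plus any available filenames. Illustrative only.
import re

def tokens(text):
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def best_match(torrent_title, filenames, series_titles):
    """Filenames only exist once the .torrent has been fetched, which
    is why matching improves after the file list is available."""
    query = tokens(torrent_title)
    for name in filenames:
        query |= tokens(name)
    best, best_score = None, 0
    for series in series_titles:
        score = len(query & tokens(series))
        if score > best_score:
            best, best_score = series, score
    return best  # None -> the torrent stays "unsorted"
```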
Also, some torrents cannot be classified and remain "unsorted".
can the site owner make it so that the grayed out entries have a link to what they are? i frequently don't know what something is and have to look it up elsewhere. not really a problem but annoying enough to ask. i'm assuming it's all automated anyway, so just run that part of the script earlier?
While you wait, are you trying to overwrite with meaningless info?
Comment in Feedback 27/12/2020 11:47 — Anonymous: "nakatsuka"
I can't delete my email address in the account settings, so please delete my account if possible. I wouldn't like to abandon my account; as I said, I'm not active anymore due to personal reasons and I'll have to delete my email account and social media accounts too.
It's a wish and again I'd like to thank you for creating such a useful website, keep up the good job and stay healthy. Have a nice New Year! Xx
Thanks for the comment. You can just delete the email address in the Account Settings page, if that helps. Or you can just abandon the account. I can also delete your account as well, if you wish, though I'm not sure what the point of that is...
Comment in Feedback 26/12/2020 12:17 — Anonymous: "nakatsuka"
Hi admin! I have a question regarding my account: is it possible to delete my account? If yes, how?
Please don't get me wrong, I find this website useful, but I haven't been active for the past months and I'll have to delete my email account soon. If I have time (in the near future, hopefully) I'll come visit again. Thank you for such a wonderful website, but I have to say goodbye (for now, I hope) :((
First, to convince me there is an issue, you need to cite specific torrents, what they prove and why, which you didn't. By specific torrents, I mean actual links, of course.
Second, to be worth a discussion you would have to explain what a solution would look like and what it would take to implement it, which you didn't.
I don't think it's obvious at all: It's been demonstrated how the same group can upload the same files two slightly different ways on a different site and get two completely different results on this one, and your only response seems to be "don't care LOL".
An explanation would be something along the lines of "the specific way it's uploaded on nyaa is really important because _______ "; taking the time to say that you could easily explain this but you decided not to just seems mean. Not everyone knows everything that seems obvious to you.
We're well aware of bittorrent's deficiencies, but I fail to see how anything you wrote applies to Usenet or DDL: thirteen files is thirteen files, and it's the same thirteen files however they're presented. If I hand you thirteen nzbs with one file each, or one nzb with thirteen files, it will literally download the exact same articles. Likewise, there's no functional difference between me linking you thirteen MEGA files and one MEGA folder.
This is only going to become more of an issue as more stuff disappears behind the Netflix timegate: Netflix Release Day rolls around and group A releases thirteen single-file torrents because that's how their bot works, and they get picked up; group B releases one torrent of thirteen files because that's how their bot works, and they get skipped. Netflix releases an eleven-episode show one day, and group B is picked up; Netflix releases a twelve-episode show and they're skipped. There is no sense to any of this.
Forgive me being blunt, but my overall takeaway is that you went tl;dr, then churned out a stock response to some other post you assumed you'd read. The question was not about a specific number in the skipping policy, but why it treats the exact same release differently depending on details of how it is posted on nyaa that aren't relevant outside of nyaa.
The purpose of the skipping policy is a judgement on how much of limited resources to allocate by default on something that no one may actually have an interest in. We also skip batches of files that have already come here as single episodes, such as Erai's end of season batches and batches of Horriblesubs files.
You probably know that most torrent hosts have an "anti-flooding" rule so they delete a bunch of single episode torrents that could have been batched. What hosts want are large, well-seeded swarms. And that's what torrent users want too.
You can think of AT like a pie: If one slice gets bigger for whatever reason, the other slices will have to shrink to offset it. Increasing the skipping policy level would be one such change. But the math doesn't change however it happens.
Something I've noticed is the exact same files will get skipped if the release group puts them on nyaa as a batch (which is more and more common with the decline of broadcast TV and the rise of anime being dropped onto streaming services in 13-episode chunks), whereas they wouldn't be if they'd been put on nyaa one file per torrent.
Used to be that a massive torrent was always an after-the-fact batch, but this is less and less the case thanks to Netflix.
The question has certainly been asked before. Unfortunately, I don't know of any good way to synchronise updates (the consumer-level sync tools typically don't work well for mass distribution).
Using the database dumps, it should be possible to build a tool to retrieve all files and sync them daily, but this does require someone to write such a tool.
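As a rough sketch of what such a tool might look like, assuming, purely for illustration, a dump of tab-separated `id` and `url` lines (not the real dump schema):

```python
# Hypothetical daily-sync tool driven by a database dump. The
# "id<TAB>url" format and all names here are placeholders.
import os
import urllib.request

def parse_dump(lines):
    """Parse 'id<TAB>url' dump lines into (file_id, url) pairs."""
    return [tuple(line.rstrip("\n").split("\t"))
            for line in lines if line.strip()]

def pending(entries, dest_dir):
    """Entries whose file is not yet in dest_dir, so repeated daily
    runs only fetch what is new since the last run."""
    return [(fid, url) for fid, url in entries
            if not os.path.exists(os.path.join(dest_dir, fid))]

def sync(dump_path, dest_dir):
    os.makedirs(dest_dir, exist_ok=True)
    with open(dump_path) as dump:
        todo = pending(parse_dump(dump), dest_dir)
    for fid, url in todo:
        urllib.request.urlretrieve(url, os.path.join(dest_dir, fid))
```

Because already-downloaded files are skipped, a cron job running this once a day would approximate the incremental sync described above.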
I wanted to know if AnimeTosho would be able to upload all their extracted archives of subtitles/attachments/etc somewhere (like Google Drive, Archive.org, or something like SyncThing). Along with Kitsunekko, I think AnimeTosho is probably one of the largest and most important archives of fansubs on the internet today. There are so few places one can find archived copies of subtitles from multiple groups for the same series, and most general subtitle websites barely have any anime subs available. Seeing how, just recently, a lot of the old DDL links for series are going to disappear made me aware of how precious this data is and how easy it is to lose. Loads of the original torrents these subs/attachments came from are long since dead, and this site is one of the few available places to download them from.
To preserve all of this important data, I'd love to see (and I'm sure others would too) a complete archive of all the available subs/attachments, downloadable as one large compressed archive that gets updated maybe once every other month (Kitsunekko offers a similar downloadable archive), or some sort of folder we could sync via SyncThing (or similar software). It's kind of like asking a really useful piece of software to open source itself, so that in the worst case all this valuable data doesn't just disappear and the work can continue.
Anyway, thank you so much for running such a useful website that I literally use every day! Keep up the good work! ^_^
Skipping policy: Since our bandwidth use and backlogs of shows waiting for ddl-links are both up quite a bit, I don't see AT increasing the torrent skipping size any time soon.
Help out?: We appreciate your appreciation, but we're good for what we want to do. Thanks anyways.
Really appreciate your site and have been using it for years. Are there any chances of increasing the file size of the skipping policy? Any ways I could help out?
Looks like a bug in the version of mkvextract being used here - thanks for reporting. I've updated to a newer version, so hopefully that fixes it; future extracts will use the newer version.
11/01/2021 07:29 — Anonymous