Oh well, there comes the usual ugly little shit who is bullied by everyone in the real world because he looks like a dork, playing internet warrior here again lol. No one respects you here either, bro, and no, you're not smart. It doesn't change your situation; you're still what you are even while anonymous = a dork lol
Is there any particular benefit in supporting those that you're interested in? I personally can't see any reason, so 'never' is certainly a possibility at this point.
The most likely reason is that you now need an account to upload content. AT only does anonymous uploads, as far as I'm aware. They also have a weird bitcoin thing going on.
And the FAQ contains the usual amount of confusing double-talk, such as 'unlimited storage', yet for multiple uploads it says, "However the sum of all files should not exceed 500MB." Hard to figure out what they mean.
There is no such listing on Nyaa or TT at this time. And I don't see a reference to it on the Deadfish website. When they torrent it, we will have it. (It will be a Deadfish if based on Mori. It would be a Bakedfish if/when Horrible does it.)
Dear AnimeTosho, please upload BakedFish Dagashi Kashi. Thank you very much.
Comment in Feedback 05/01/2016 04:29 — Anonymous: "Church of Tibb"
O Tibb-sama, thy name will be forever remembered. O Lord, so thou knowest, thy servants will always be prepared for thy glorious return. O God, blessed by anime we are :P
Comment in Feedback 31/12/2015 16:55 — Anonymous: "Anonymouse"
I got these episodes of Bubblegum Crisis Tokyo 2040 from chauthanh.info, but ep 18 has been unavailable for some weeks now, and I don't have an account there.
The file is [Anime-Takeover]_Bubblegum_Crisis_Tokyo_2040_Ep18_(6CF75f1C); I have all the other ones. Does anyone know another way (link) to get it? My ISP blocks torrents.
It's meant to be scrollable to accommodate smaller screens that can't fit the whole height. I never thought that panel was particularly useful though, other than for search. Was there anything else you found useful there?
I would like to propose changing the left panel to always stay at the top of the page, so that when you scroll down, the left panel remains at the top left (in its style, add style="position: fixed;"). Something like this could also be added to the "Search result" page ('div id="title_desc"' and 'div class="home_list_filter"'). Also, I like the Julie style the most; I think it's because there are few images of her :)
If you really want to keep the Henrietta theme, do this:
1. Save the .css of this style ("style_121.css").
2. Right-click anywhere on the website and go to "Website preferences..." (that's what the option is called in my web browser).
3. Go to "Display" and look for "My stylesheet".
4. Choose the style file from your local disk (for example "style_121.css" for Henrietta).
5. Everything should now look as if you had the Henrietta style.
Not sure if this will work 100%; you may need to add some !important in the .css. The other way is to use a browser add-on (I use Violentmonkey) that can force a style onto a website (but you still need the .css); that solution may also need some !important added to the .css. As long as the website structure doesn't change, this approach should keep working indefinitely.
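The position: fixed idea from the suggestion above can be sketched as a user-stylesheet fragment. The selector name here is an assumption (the site's actual panel id/class may differ), and as the commenter notes, !important may be needed to override the site's own rules:

```css
/* Hypothetical selector: replace with the panel's actual id or class. */
#left_panel {
    position: fixed !important; /* keep the panel pinned while scrolling */
    top: 0 !important;
    left: 0 !important;
}
```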
Ah, that may be the reason. I've added UsersCloud as a host we upload to directly, anyway. I think Go4Up doesn't use an account, so the link dies after 24 hours.
Given that AT targets people who quickly get releases the moment they are out, then AT is perfectly fine as it is.
I'd say that this target is mostly because I don't know of a reliable way to keep files available long term (other than relying on torrents). So I don't mind looking at ways to extend storage time, but ultimately don't think there's too much that can be feasibly done.
For the hosts you mentioned:
- Solidfiles: we upload to them.
- MediaFire and Mega: we uploaded to both until they started blocking anonymous uploads. Their current account limitations and the way their systems are set up don't really make them feasible to use from an automated script.
- Userscloud: I looked into it, but forgot why I haven't implemented it. Perhaps I got distracted... will look into it further.
We did use to upload to Usersfiles until they stopped working. Because they delete anonymous uploads after 24 hours, we uploaded to an account, but as the account had no limits, it worked. Userscloud looks very similar and may ultimately be the same deal.
Basically this only works if the count is made when the download button is pressed.
It may also depend on whether they track who is performing the download (i.e. account/IP). I would think that hosts consider the possibility of uploaders trying to fool such a system, then again, many file hosters don't seem to put much effort into their systems...
One note: you may already realize this, but free-account uploads like Mega and MediaFire won't work for AT if there is a storage limit per account, which there generally is (otherwise there would be no point in requiring accounts). In the case of Mega, the free account limit last time I looked was 50GB. AT creates links for about 100GB per day. It would need some automated way to create new accounts as needed and distribute content across them, which is probably not possible.
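To put numbers on the comment above: with the figures quoted (a 50GB cap per free Mega account and roughly 100GB of links created per day), an automated uploader would be forced to burn through fresh accounts continuously. A minimal sketch, assuming those quoted figures:

```python
import math

def accounts_needed(daily_upload_gb: float, account_limit_gb: float) -> int:
    """How many fresh free accounts a single day's uploads would consume,
    assuming each account is filled once and never reused."""
    return math.ceil(daily_upload_gb / account_limit_gb)

# Figures quoted in the comment above (assumptions, not measurements):
print(accounts_needed(100, 50))  # 2 new accounts needed every single day
```

Even at only two accounts a day, that is over 700 accounts a year to create, verify, and track, which is why the comment concludes it is probably not workable.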
In uploading, there are two common methods I often see:

Method 1: use mirror-creating services to put the file on as many hosters as possible.
  Pros:
  + No total space limit
  + No account required
  Cons:
  + Limited file size and expiration time

Method 2: use only the more reliable hosters (for both uploading and downloading) and upload with free accounts.
  Pros:
  + Larger file size and longer availability
  Cons:
  + Account termination
  + Limited total space
  + Account management
Given that AT targets people who grab releases the moment they are out, AT is perfectly fine as it is. I can't suggest any change on that front, as all the necessary features are there. I understand AT relies on public, free hosting services for uploading, and such hosts cannot be counted on for long-term storage. I just thought it would be nice if file longevity could be extended; my first suggestion aims at that. But in reality, going the extra mile for a small set of people is not worth it. That's just common sense.
Regarding reliable hosters: as a downloader, I can only name some that are reliable from a downloader's perspective (MEGA, SolidFiles, Userscloud, MediaFire). For uploaders without an account, the treatment is much the same across all hosters; the most noticeable limits are on file size and expiration time. With a (free) account it is a bit nicer: one can upload bigger files and have them stored for longer, but then there is a limit on total storage capacity.
Regarding tricking the hosters, my apologies. I previously left out the limitations placed on downloading from these hosts without a premium account, so please disregard that. Basically, the trick only works if the count is made when the download button is pressed, and I believe most hosts will eventually use a system which only counts once the file is fully transferred.
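The distinction drawn above, counting at the moment the download button is pressed versus counting only once the file has fully transferred, can be sketched as two policies side by side. This is an illustration of the idea only, not any particular hoster's implementation:

```python
class DownloadCounter:
    """Two counting policies a file host might use to gauge 'popularity'."""

    def __init__(self):
        self.clicks = 0       # counted when the download button is pressed
        self.completions = 0  # counted only after the last byte is sent

    def download(self, file_size: int, bytes_received: int) -> None:
        """Record one download attempt that transferred bytes_received bytes."""
        self.clicks += 1
        if bytes_received >= file_size:
            self.completions += 1

counter = DownloadCounter()
counter.download(file_size=1000, bytes_received=1000)  # completed download
counter.download(file_size=1000, bytes_received=10)    # aborted almost immediately

# A bot that merely "presses the button" inflates clicks, not completions.
print(counter.clicks, counter.completions)  # 2 1
```

Under the second policy, keeping a file "accessed" would require actually transferring it in full each time, which is exactly why the free-account download limits make the trick impractical.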
About active link checking: if everyone can simply be pointed to a browser add-on that does it, there is no need to implement it on the site.
Thank you all for the discussion and for bothering with my comment.
There is also the complication of the difference between what these file hosts promise and what they actually deliver. Basically, they promise the moon, but only deliver what makes business sense. They promise the moon only for as long as you don't try to take them up on it.
Isn't there a reliable hoster that only deletes a file if it is not accessed after a set interval (even if it was uploaded by an unregistered user)? Out of curiosity though, isn't it possible to make the hoster believe that a given file is accessed before each time that interval expires?
Do you have any such hosts that you'd suggest? It may or may not be possible to fool the hoster into thinking a file has recently been accessed - that would depend on how they determine its popularity. There's also feasibility to consider, i.e. if the file needs to be fully downloaded periodically, is there enough capacity to do this (considering hoster download limitations, server resources etc)? If you know of any solution where this works, please do share.
But such link checking, as I once tried it, is quite easy to do and fairly accurate.
It's doable, no doubt. I've never really gotten around to it, and haven't considered it terribly important, since users can quite easily do it themselves (your suggested Greasemonkey scripts, for example). It's not quite as simple as you put it (not all hosts give nice error messages, and some in fact keep all the download pages but not the files themselves), but the main difficulty is managing the sheer volume of links.
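As a rough illustration of the link checking being discussed: a checker fetches each download page and classifies the response. The heuristic below is an assumption on my part; as noted above, some hosts return a normal 200 page even for deleted files, so a real checker would also need per-host body patterns (the DEAD_PAGE_PHRASES list here is purely hypothetical):

```python
import urllib.error
import urllib.request

# Hypothetical markers of a "file deleted" page that is served with HTTP 200.
# A real checker would need host-specific phrases.
DEAD_PAGE_PHRASES = ["file not found", "has been removed", "no longer available"]

def classify(status: int, body: str) -> bool:
    """Return True if the link looks alive. 404/410 means dead; a 200 page
    containing a known 'deleted' phrase is also treated as dead."""
    if status in (404, 410):
        return False
    lowered = body.lower()
    return not any(phrase in lowered for phrase in DEAD_PAGE_PHRASES)

def check_link(url: str, timeout: float = 10.0) -> bool:
    """Fetch only the download page (not the file itself) and classify it."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read(4096).decode("utf-8", "replace")
            return classify(resp.status, body)
    except urllib.error.HTTPError as e:
        return classify(e.code, "")
    except (urllib.error.URLError, OSError):
        return False  # unreachable host: treat as dead for now
```

Even with good per-host heuristics, the volume problem mentioned above remains: checking every link on a schedule is a lot of requests, which is part of why leaving it to user-side scripts is attractive.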
There is also the issue of diminishing returns: How much effort do you want to put out just in case someone might want something some day, if no one wanted it in the last 60 days?
26/01/2016 13:46 — Anonymous