The server was down for a few days, so you need to give it some time to process everything. Mediafire especially, because it's a very slow place to upload to and often spits out errors.
Fair enough, though it's unknown which hosts do that. But yes, making files resistant to being taken down means less convenience. I'll definitely consider encryption if a lot of files here get taken down. I would presume most people upload to their account to earn points/income from downloads, which may be a reason why the files you've seen have lasted longer. Files here are uploaded anonymously so they may not last quite as long, but are somewhat resistant to the whole account being taken down. Another possibility is that the files were downloaded more - there isn't a lot of traffic on this site, so I imagine there are plenty of files that rarely get downloaded.
Do you have Javascript disabled? I tried making the CAPTCHA as friendly as possible, but all of that obviously requires Javascript. I guess the no-Javascript solution could be a little better.
Hash use has been talked about a lot by the people who issue DMCA deletions. The most recent article I read on the subject is http://torrentfreak.com/megaupload-wha...on-120120/ though the discussion there is about Megaupload's use of hashes.
I have been following the subject for several years, but as always I miss a major point or two when I synthesize my data into information for others, and you are right about the user account bans - but those usually happened, or used to happen, only after repeat violations. I'm not quite sure what the method is, but anecdotes suggest they've gotten a bit more trigger-happy with user bans. My statements about reports, filenames, and why people encrypt the headers of archives come from extensive experiments people have done, which I followed. I don't see the expired-links issue come up as much, so my statement on that is much weaker.
By the way, it seems I always fail your captcha the first time through ill luck, so I've gotten into the habit of copying my comment, hitting the back button, then pasting it back in.
Thanks for the info. I'm not too sure if I explained it well, but just from what I've noticed (correct me if I'm wrong), links here often die due to expiry. I base this assumption on the fact that most dead links here are at least 30 days old, while the large majority of newer links are up. Also, a number of hosts specifically say that they delete files after a period of time, with an especially short period for anonymous uploads.
But certainly, I take your point on evasive measures if necessary :)
I've been considering whether to try automatically redownloading files after a certain period, but am unsure what exactly the trigger is - does downloading the first 1KB of the file constitute a "download"? Also, there's some concern about the number of files needing to be refreshed continually growing, but I guess letting older files die is a possible solution.
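The refresh step itself is trivial either way; the open question is purely whether the host counts it. A minimal sketch (Python, with a made-up URL and assuming plain HTTP links) of pulling just the first kilobyte via a Range request:

    # Periodically fetch the first 1 KB of each mirror with an HTTP Range request.
    # Whether a host counts this as a "download" is exactly the open question above;
    # this only illustrates the mechanics. The URL below is a placeholder.
    import urllib.request

    def touch_link(url):
        req = urllib.request.Request(url, headers={"Range": "bytes=0-1023"})
        with urllib.request.urlopen(req, timeout=30) as resp:
            resp.read()          # pulls at most the first kilobyte
            return resp.status   # 206 = range honoured, 200 = range ignored

    for url in ["http://example.com/some-old-mirror"]:   # placeholder link list
        print(url, touch_link(url))

Running that against each link on a schedule would be the whole "refresh" job, if the hosts play along.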
As for automated removal, there are potentially many different means through which hosts could be detecting content to be deleted, including:
1. HTTP referer - the reason why some forums only post links in code boxes / use link anonymizer services (see the sketch below)
2. Uploader's IP
3. Uploader's account
4. Some other deep search / fingerprinting system
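On point 1, the referer is nothing more than a request header the browser tacks on, which is why code boxes and anonymizers (which drop it) help. A tiny illustration (Python, placeholder URLs) of what the host does and doesn't see:

    # The Referer is just a request header; dropping it is all a code box or
    # anonymizer really does. Both URLs below are placeholders.
    import urllib.request

    url = "http://filehost.example/dl/abc123"

    # Clicking a link straight from a forum thread sends the thread URL along:
    req_forum = urllib.request.Request(
        url, headers={"Referer": "http://some-forum.example/thread/42"})

    # Via a code box / anonymizer, no Referer header is sent at all:
    req_clean = urllib.request.Request(url)

    print(req_forum.headers)   # includes the tell-tale 'Referer' entry
    print(req_clean.headers)   # empty - nothing for the host to trace back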
Unless the host specifies what they do, it's really just speculation though. By the way, have you found an example which indicates that hosts are really using hash matches? Another way of getting around hash matches could be to simply append a null byte at the end of the file - though this does mean any CRC in the filename will no longer match.
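To make the null-byte idea concrete, here's a quick sketch (Python, hypothetical filename) showing that one appended byte gives a completely different digest:

    # Appending a single null byte changes the file's hash, so a host-side
    # MD5/SHA1 match against a known hash no longer fires.
    # "episode.avi" is a hypothetical filename.
    import hashlib

    def md5_of(path):
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    print("before:", md5_of("episode.avi"))
    with open("episode.avi", "ab") as f:
        f.write(b"\x00")                        # one extra byte at the end
    print("after: ", md5_of("episode.avi"))     # completely different digest

And, as mentioned, the CRC32 tag in the filename (e.g. the [47088CCF] part) would no longer match the padded file either.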
By CRC I meant a hash such as MD5. Animetake's files are deleted quite often, so for example "[SubDESU] Phi-Brain 17(720x400 XviD MP3)[47088CCF].avi" gets reported. Your links go down because the hash matches even if the file name is different. They are using hashes and wiping all links much more since the Megaupload fiasco; before, they sometimes just deleted one link at a time.
As far as I know, links are removed on the following criteria, with the standard prevention for each case depending on what level of protection you want:
1. Reported | Obfuscate the link with DLC, captchas, and/or a required login
2. File name match | Randomize the filename, or make an archive with a random name but keep the file inside the same
3. Archive contains a matched file | Encrypt the headers in the archive to scramble the filenames, using 7z (mhe option) or rar (hp option) etc. (see the sketch below)
4. CRC match | Archive the file and add a password
5. Link expired | Rarely happens with major hosts and popular websites as long as the file is in use, but you can automatically download a little of the file periodically
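Here's a rough sketch of points 2-4 combined (Python driving the 7z command-line tool, which needs to be installed; the filename and password are placeholders): pack the file into a randomly named, password-protected archive with encrypted headers, so neither the original filename nor its hash is visible to the host.

    # Randomly named, password-protected archive with encrypted headers.
    # Assumes the 7z binary is on PATH; source filename and password are placeholders.
    import secrets
    import subprocess

    src = "episode.avi"                    # hypothetical input file
    password = "hunter2"                   # placeholder password
    out = secrets.token_hex(8) + ".7z"     # random, meaningless archive name

    # -p sets the password, -mhe=on also encrypts the headers (the filename list).
    subprocess.run(["7z", "a", "-p" + password, "-mhe=on", out, src], check=True)
    print("upload this:", out)

    # The rar equivalent uses -hp, which likewise encrypts both data and headers:
    #   rar a -hp<password> <randomname>.rar episode.avi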
Churning through stuff I guess >_> Have disabled updates for now, and done some 'load chunking' to try to help it get through without totally dying.
Sucks, but at least the host has been quite helpful through this and they're actively trying to stop it occurring in the future (they're fine most of the time, but sometimes things stuff up). Downtime like this is rare, fortunately enough.
Interesting point about the link scrapers - I hadn't thought of that. Though I guess if one's just looking to obfuscate links, one can simply obfuscate them oneself without requiring downloader support? I think most links at the moment are being removed due to expiry from the host, as opposed to take-down notices, but that may not always be the case later on. So for now, I guess the benefit will just be to have everything in a single list, and I'll keep the plaintext links available for convenience's sake.
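For example, do-it-yourself obfuscation could be as crude as base64-encoding the plaintext list (the links in this sketch are placeholders), so scrapers grepping for "http://" miss them while users can still decode them in one step:

    # Crude DIY link obfuscation: base64 the list so naive scrapers looking for
    # "http://" patterns don't see it, while a user can decode it trivially.
    import base64

    links = "http://filehost.example/dl/abc123\nhttp://filehost.example/dl/def456"

    encoded = base64.b64encode(links.encode()).decode()
    print(encoded)                              # what would sit on the page
    print(base64.b64decode(encoded).decode())   # what the user recovers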
That was actually a reply to the bottom of pg 15, but I failed the captcha, and when I hit back my comment was deleted and the reply bubble didn't fill back up, so after retyping it I forgot to hit reply.
Those things exist not because people wanted to hide links, but to prevent deletion by robot grabbers and minimum-wage workers. DLC support is not exclusive to JDownloader but is a secured technology to help discourage (there are utilities that work around it) abuse by link-deleting third parties. Some other downloaders that have been approved are pyLoad and CryptLoad. There are some link obscurer/protector websites where you submit your links and they give your users a DLC, or if you want the actual links you have to solve a captcha for them; many have a backend for automated downloaders such as JDownloader/pyLoad so you don't even have to actually have people get the DLC.
By following the practice of producing randomly renamed rar files with a password and filename encryption enabled, and only distributing your links through DLC or related technology, you can ensure your links will weather not just days/weeks but years.
Yes, it is a list of links for downloading a lot of parts. http://cl.ly/0h000X2K200j172i0j1r ("kaiba" is a DLC container of 24 links) However, it is not only for JDownloader. I think MyPony uses it too.
Really sorry about the downtime. It's frustrating for me too, as it means I have to worry about the amount of load the server will have to get through when it does come back online. The host has informed me that the DC apparently will only reconnect the server on Monday, and provided a detailed explanation of what the issue was (I like this host because they actively inform you of issues and how they're trying to solve them).
So here's hoping everything goes back to normal on Monday and the server doesn't die from trying to mirror so much at once. Apologies for the inconvenience.
Thanks for the link - looks interesting. I've tried reading that page, but I'm not too sure what exactly it is. It sounds like some encrypted list of links that only JDownloader can use? I'm a bit confused about the encryption part - it would seem that if JDownloader needs to contact a server to decrypt the links, then any application pretending to be JDownloader could do the same.
So I guess the only real point of it is for JDownloader to download everything as one part?
A more stable server (or maybe some backup if it can be set up) would be nice, but I use not-so-stable servers because they're cheap. If someone's willing to supply a better server to me for free I'll look into it.
Apparently someone on the node was abusing the server, and it got disconnected by the DC, so now the host has to try and get it back up and running. Arguably the host's fault for not filtering out the attack.
A FLAC audio stream is typically 100-150MB per episode (as opposed to, say, a 128kbps AAC stream which would be around 20-25MB per episode; why anyone would use AC3 is beyond me), but if you subtract the difference, their files are still large. I'm actually not sure how they make their files so big, at least for their 720p encodes (excessive filtering?) - their x264 settings seem decent.
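(For reference, the audio figures above are just bitrate times duration; a quick back-of-the-envelope check, assuming a ~24-minute episode and a FLAC stream averaging roughly 700kbps:)

    # Back-of-the-envelope check of the audio sizes mentioned above,
    # assuming a ~24-minute episode.
    def stream_size_mb(bitrate_kbps, minutes):
        return bitrate_kbps * 1000 / 8 * minutes * 60 / 1e6  # kilobits/s -> megabytes

    print(stream_size_mb(128, 24))   # ~23 MB for a 128kbps AAC track
    print(stream_size_mb(700, 24))   # ~126 MB for FLAC averaging ~700kbps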
Anyway, I don't particularly like criticising people who give away their time for free, so I apologise for even starting the topic.
All hosts are uploaded to anonymously - accounts are never used. I'm a bit iffy over raising the batch limit at the moment as I feel that the server is pretty much near its processing limits anyway. Maybe some day. (mutters about how Coalgirls' releases are usually oversized anyway)
Are the files uploaded on Bayfiles under an anonymous account? There seemed to be no captcha or countdown when I tried to download them.
When I went to the site, it said 5GB for everyone. Is that true? Can anyone testify to that?
On another note, could you raise the file size limit to 2GB if the server permits? This is because quite a handful of Coalgirls' releases (720p) are not being processed, as they are over the limit. [That is, whatever the script considers to be a 'batch' is over 1.5GB in size]
31/01/2012 21:45 — admin