|
I randomly tried dognzb 2 days ago and registrations were open (I think they were closed before?). My account says trial and a number of days. Maybe it's a stupid question, but do trial accounts get converted to some permanent but limited account, or are they just closed if you don't sign up before the trial period is over?
|
# ¿ Sep 2, 2018 17:51 |
|
Rexxed posted:They give you 30 days after the trial ends where you can't use the site but you can upgrade trial to member. If you don't buy membership they delete the trial account 30 days after the trial expiration day. It's hard to find the correct info about how their stuff works but they do have a support forum: Thanks for the answer! I was looking for the details about the trial membership, read the FAQ and searched the forum a bit but didn't find anything. I've been using DrunkenSlug and Usenet Crawler but the latter is often down. I'm looking for some decent indexers but they aren't that easy to come by IMO. dognzb has everything neatly sorted and labeled, at least.
|
# ¿ Sep 3, 2018 07:40 |
|
I'll keep that in mind, thanks! I signed up for dog (only for a year in case they decide to shut down in the meantime or something) and a kind goon sent me some more tips in a PM so I'm all set for now. Setting everything up was a bit of a pain in the rear end, but even on my relatively slow internet usenet has been a crazy QoL improvement for me, so it was worth it in the end. e: typo lordfrikk fucked around with this message at 06:18 on Sep 7, 2018 |
# ¿ Sep 5, 2018 09:10 |
|
Where do you even store all that stuff when you're downloading 1TB per month?
|
# ¿ Oct 18, 2018 22:37 |
|
nerox posted:https://forums.somethingawful.com/showthread.php?threadid=2801557&perpage=40 Colostomy Bag posted:NAS. Usually what I do is copy stuff down, unzip/rar/par/whatever on my SSD then send it up to my repurposed server that runs FreeNAS with Plex on a ZFS pool. Works for me. It's an old 2600K Sandy Bridge box. But how big are your home NASes anyway? I've seen that one video of a guy with 70 TB of storage complete with an enterprise-level server enclosure, but I'm thinking there aren't many people who go all out on storage like that, right? EL BROMANCE posted:Depends if it's being archived or not really, but being able to download at high speed means you can get the very best version of something that exists. Maybe a UHD remux thats 100gb, then deleted afterwards. Yeah, I could understand if people download, watch and delete. Storing all that stuff is pretty mind-boggling to me, though.
|
# ¿ Oct 19, 2018 08:37 |
|
I signed up for a monthly on NewsgroupDirect and while it works great most of the time, I'm noticing a lot of older stuff is missing parts. Do I understand it correctly that this is not an indexer problem but a usenet provider problem? If so, what other providers have great retention and won't have you paying through the roof? For reference, I snagged their $3.14/mo deal.
|
# ¿ Feb 20, 2020 11:36 |
|
Canuckistan posted:NewsGroupDirect used to be a reseller of a larger network with 3500+ days retention, but they split off and became an independent provider. As such their retention went to poo poo, purporting to be 1100 days but I find even that to be unreliable. That sucks, but luckily they offer a 3 month trial (or close to it, with the first 3 months being under half a dollar) so I might cancel it. A lot of the time I want to download the original BD disc and those are mostly gone. Posted on the previous page, but Newshosting seems (on paper) to have high retention. Does it match reality? Personally I don't download enough to warrant several providers at once, so I'm looking for the best overall. Right now I only have 300 Mbps so it doesn't need to support anything higher than that.
|
# ¿ Feb 20, 2020 17:39 |
|
Xaris posted:Which one is that? Watchtower. Go here: https://containrrr.dev/watchtower/ The website has complete docs that will help you set it up. I have it configured to automatically update daily at midnight, restart containers and delete the old images. I have these 3 environment variables set but you can also use command line parameters: code:
https://containrrr.dev/watchtower/arguments/#time_zone https://containrrr.dev/watchtower/arguments/#cleanup https://containrrr.dev/watchtower/arguments/#scheduling You can also tell it to ignore certain containers etc. One thing I will say is that automatic updating of images/containers makes the image tag you're using more important. Usually I use "latest" for everything, but using that for databases is a really bad idea unless you like a bunch of broken containers and having to roll back your databases (probably a very bad idea). I know because I set my PostgreSQL to latest and it updated automatically from 12 to 13, breaking my poo poo. Oops! norp posted:https://hub.docker.com/r/v2tec/watchtower Seems like the author maintains the image themselves: https://github.com/containrrr/watchtower
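For reference, a minimal sketch of what those three variables could look like as a docker run command, per the Watchtower docs linked above. The schedule value is an assumption matching "daily at midnight" (Watchtower uses a 6-field cron expression with seconds first), and the TZ value is just an example:

```shell
# Hedged sketch: run Watchtower so it checks for updates daily at
# midnight, removes old images after updating, and uses an explicit
# time zone for the schedule. TZ value is a placeholder example.
docker run -d \
  --name watchtower \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -e TZ=Europe/Prague \
  -e WATCHTOWER_CLEANUP=true \
  -e WATCHTOWER_SCHEDULE="0 0 0 * * *" \
  containrrr/watchtower
```

The socket mount is what lets Watchtower restart your other containers, which is also the security concern discussed further down the thread.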
|
# ¿ Dec 2, 2020 10:00 |
|
Craptacular! posted:While I can understand being able to issue a command that makes containers automatically pull a new image and rebuild themselves, I can't get behind having that happen entirely hands-off automated as Watchtower does. Docker kind of has a busted security model that requires administrators to exercise caution and Watchtower takes it out of their hands in a way that just seems too scary to me. You're essentially giving a whole bunch of different software authors the reservation to run whatever software they please on your system with actions that are processed through a daemon that runs in root, and if one of the containers ever becomes malicious or something you'll probably auto-update to it unawares. Good point, I can only agree with that! I accepted that poo poo will break when I started using Watchtower, and it has, but I don't mind that much because the time saved across all the containers I'm running was worth it. I only had PostgreSQL and Nextcloud break, both on major version upgrades, which I am now more mindful of. Containers from more established providers like LinuxServer should minimize all sorts of problems, but they won't ever prevent the user from being dumb like me.
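One way to limit the blast radius, sketched here as an assumption rather than a recommendation: pin databases to a major-version tag so Watchtower only pulls patch releases, or exclude them entirely with Watchtower's disable label:

```shell
# Hedged sketch: pinning postgres to its major version means
# Watchtower can only move between 12.x patch releases, never a
# 12 -> 13 jump like the one described above. The label tells
# Watchtower to skip this container altogether.
docker run -d \
  --name postgres \
  --label com.centurylinklabs.watchtower.enable=false \
  -e POSTGRES_PASSWORD=changeme \
  postgres:12
```

Either measure alone would have prevented the 12-to-13 surprise; using both is belt and suspenders.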
|
# ¿ Dec 3, 2020 14:51 |
|
Knot My President! posted:How does Dog compare to DrunkenSlug? According to NZBHydra2 indexer stats, it's easily competing on unique hits with DrunkenSlug, even on the free tier. Of course my paid subs to DrunkenSlug and NZBGeek are processing a lot more API hits than the 100 free hits on DogNZB, but still, it has a lot of stuff the other two don't.
|
# ¿ May 18, 2021 15:29 |
|
I had some time to mess around with my setup, tried replacing NZBHydra 2 with Prowlarr, and ended up keeping it. I like the method of adding indexers to Radarr/Sonarr directly better, just because I can see the name of the indexer in the history or during interactive search. It's a small change, but it matters to me. I also found a small tool called unmanic (https://github.com/Unmanic/unmanic) that is a bit finicky but lets you easily optimize your library, for example by automatically removing all English language tracks from every file.
|
# ¿ Oct 3, 2021 22:34 |
|
Are those secret indexer people going to enter the thread and say mean words to me if I mention them or what? I'm just lolling irl at the dumb fight club rules.
|
# ¿ Oct 23, 2021 22:44 |
|
I like Usenet a lot more than private trackers. Got Eweka as the only provider, and DrunkenSlug and NZBGeek as indexers. Works like a charm! For anime I prefer Nyaa + AnimeBytes, though.
|
# ¿ Oct 30, 2021 16:09 |
|
This is after switching to Prowlarr roughly 3 months ago. I have paid accounts on the first two; for DOGnzb I'm using my expired premium account. DOGnzb is highly competitive with the other two for my use case, but the reason I'm currently not subscribed is the payment method, which is extremely dumb.
|
# ¿ Nov 3, 2021 18:17 |
|
Does anyone have any experience with mergerfs? I just want to pool together 3 drives, so I thought that might be the easiest. However, with all the SABnzbd/Sonarr/Radarr settings kept as before, some weird permission issue is cropping up. SABnzbd thinks it cannot write to the drive and/or says the disk is full (it's not), but small files like TV show episodes get downloaded nonetheless, while bigger downloads produce so many errors that SABnzbd pauses the downloading. Doing docker exec into the container shows that I can't create anything as the assigned user while root works. However, when deleting a file as root it says the file doesn't exist, even though it does, and it does get deleted.
|
# ¿ Jun 7, 2022 20:41 |
|
Keito posted:I used mergerfs for years before moving on to ZFS, so it'd be from memory, but maybe I can help if you post some details because there's not enough to go on from this post. Thank you for your help! I use OMV6 to manage everything, and from what I can tell it's running directly on the host system. I have 3 physical drives mounted to /srv/dev-disk-by-uuid-$UUID and the mergerfs union mountpoint is /srv/mergerfs/storage. mergerfs is running as root. The mount options are allow_other, cache.files=off, use_ino, dev, and suid. In the containers the volumes are mapped to /srv/mergerfs/storage/$FOLDER. The user and group of the folders in the containers are the same (UID 1000, GID 100). The raw ps aux | grep merger output is: code:
code:
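For anyone following along, a setup like the one described would look roughly like this as an /etc/fstab line. This is a sketch reconstructed from the details in the post; the by-UUID paths are placeholders, not the actual disks:

```shell
# Hedged /etc/fstab sketch: three by-UUID branches (colon-separated)
# pooled into one mergerfs mount, using the options listed above.
# UUID-AAA/BBB/CCC are hypothetical placeholders.
/srv/dev-disk-by-uuid-AAA:/srv/dev-disk-by-uuid-BBB:/srv/dev-disk-by-uuid-CCC  /srv/mergerfs/storage  fuse.mergerfs  allow_other,cache.files=off,use_ino,dev,suid  0  0
```

The colon-separated branch list on the left is mergerfs's way of declaring the pool members in a single mount source.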
|
# ¿ Jun 8, 2022 09:15 |
|
I have actually figured it out... I set a 200 GB minfreespace and the drive where I originally stored all the files has less than that free now. The policy was set to "existing path, most free space" but the paths did not exist on the other drives 🤦 Which policy is best for saving to one drive and then selecting another when the space runs out, if I don't want to micromanage folders?
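For context, a sketch of the relevant mergerfs options (values assumed, not a definitive recommendation): "existing path, most free space" is the epmfs policy, which only writes to branches that already contain the target directory, which is exactly the trap described above. A policy like ff ("first found") matches the fill-one-drive-then-spill behavior being asked about:

```shell
# Hedged sketch of mergerfs create-policy mount options.
# category.create=ff writes new files to the first branch that has
# at least minfreespace available, so one drive fills up before the
# next one is touched. minfreespace value here mirrors the post.
-o category.create=ff,minfreespace=200G,allow_other
```

Other policies like mfs ("most free space") balance across drives instead of filling them in order, so the right choice depends on whether you want spillover or spreading.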
|
# ¿ Jun 8, 2022 10:25 |
|
Keito posted:Thanks for the info dump, I was about to ask you about that minfreespace setting but you beat me to it. I've set it to "all" and it works! "mfs" wouldn't work for some reason... mergerfs is a bit more involved than I imagined; however, I will say that I am a huge idiot who didn't read anything about mergerfs before setting it up, so it's 100% on me. It's actually surprising nothing burned down because of my stupidity.
|
# ¿ Jun 8, 2022 15:32 |
|
Warbird posted:Would anyone know any Usenet discussion groups that would be focused on foreign films? Specifically Chinese ones? The usual suspects mentioned here don’t seem to do well in that area. I personally don't know about Usenet indexers but for torrents there's Avistaz, they have all kinds of Asian movies/TV.
|
# ¿ Apr 2, 2023 17:20 |
|
Kinda late but one big advantage of Jellyfin is that it works completely offline because the login is local, no need to access any other online service. The UI is not great but combine it with Infuse on Apple TV/iOS and it’s suddenly the best combo IMO.
|
# ¿ Jul 25, 2023 21:28 |
|
Steam can usually saturate a gigabit connection. Somebody in the Steam Deck thread posted this picture from the Baldur’s Gate 3 launch:
|
# ¿ Sep 21, 2023 18:12 |
|
Downloading hundreds of GBs of Steam games on gigabit can tax your SSD and CPU more than you might expect, and I'm sure either can be a bottleneck, too. It's also possible that you have a cheaper SSD that relies on a write cache to achieve its higher speeds, so your download (and usually the whole system) will screech to a halt when you run out of it during bigger downloads (my previous cheap SATA SSD would hit that around 80 GB).
|
# ¿ Sep 23, 2023 05:51 |
|
Deluge was good because it had a thin client I could run on my PC and use to easily feed it torrents. I had problems with stability, though, so I switched to rTorrent, which is more stable for me, but I can't find a way to connect to it on my NAS from the desktop.
|
# ¿ Oct 10, 2023 07:56 |
|
I've had dog for a while but it stops working intermittently and it's not the fastest. I do have a lot of successful grabs from it, though, so I'm not sure how much that matters in an automated environment.
|
# ¿ Nov 24, 2023 13:38 |
|
I haven't bought any lifetime subs for anything because I'm always worried they will go back on their promise, and I can feel superior supporting the devs more.
|
# ¿ Nov 24, 2023 17:58 |
|
I've been using Sonarr v4 for months now and I can't remember a single bug, so either they're nonexistent, or minuscule, or I'm entirely too dumb to notice.
|
# ¿ Dec 21, 2023 12:50 |
|
Jellyfin with Infuse on Apple TV is perfect. I had been using CoreELEC for 2 years before and it felt like using something from the 90s. I appreciate all the people working on it but drat.
|
# ¿ Dec 30, 2023 01:18 |
|
I've been using SABnzbd since I started self-hosting stuff ~4 years ago and haven't had any issues with it.
|
# ¿ Jan 16, 2024 09:33 |
|
LunaSea is available for iOS and Android, and it also does Sonarr/Radarr etc., but I don't know how good NZBClient is.
|
# ¿ Apr 17, 2024 19:28 |
|
Jel Shaker posted:can i ask what apps that people are using for watching video on their iphone? Infuse. It works on iPhone, iPad, Apple TV and Macs, and connects flawlessly to Jellyfin. I regularly use it to download stuff to my Mac and watch it offline on a train/plane.
|
# ¿ Apr 30, 2024 13:47 |