 
more falafel please
Feb 26, 2005

forums poster

How does one go about getting an invite for the new NZBs.org? I checked the OP and wiki. I have an account on the old one, but it doesn't work in SickBeard anymore.


more falafel please
Feb 26, 2005

forums poster

So, my DogNZB premium subscription is expiring, and they want me to either send bitcoin or buy a t-shirt through a weird third party website that seems sketchy, and costs an extra $10. Is there a way to send bitcoin that doesn't involve setting up some wacky account and/or paying basically double for it?

more falafel please
Feb 26, 2005

forums poster

Thanks, I sent a gift card.

more falafel please
Feb 26, 2005

forums poster

Ok, so I've slowly been modernizing my setup. Running Sonarr/Radarr, using NZBGeek as an indexer, and dropped SuperNews for a deal on Thundernews a while back. AFAICT Thundernews is a Highwinds reseller, and has 4000+ days retention, which is great. I'm thinking of getting a block account to try to use as backup. What block accounts are good other than Blocknews/NewsDemon (which are also Highwinds)?

more falafel please
Feb 26, 2005

forums poster

sedative posted:

https://usenet.farm/ is good and this UsenetExpress reseller is having a sale on blocks right now https://usenetfire.com/plans/

Both are on different backbones.

Thanks, bought the $20 1000GB block from UsenetFire. Should last me a grip.

more falafel please
Feb 26, 2005

forums poster

uhhhhahhhhohahhh posted:

It's supposed to be the same quality as x264 at half the file size, but I don't know if that's actually true yet. I remember there were some problems with compression artefacts and blocking a while ago. I've had it excluded for years because I'm still running a Pi 2 with Kodi and can't play them anyway.

Don't worry, a Pi 3 can't play them either, unless there's some codec license I need to pay :2bux: for that I haven't bought yet. Audio works but video doesn't. Ironically my 2011 iMac plays them fine in VLC.

more falafel please
Feb 26, 2005

forums poster

This may be more suited for the NAS/Storage thread, but I *need* to do something about my lovely setup. I run SAB/Sonarr/Radarr off my 2011 iMac, which probably has bad RAM or something because it's consistently making GBS threads the bed; store on an old Drobo which I named "toaster" because the lights reminded me of cylons and BSG was literally on the air at the time; and mount the Drobo on my Pi 3 over NFS, then use Kodi for playback.

I want to upgrade to a machine that's dedicated to downloading/serving media -- I'd like to keep the playback devices completely separate, if possible, so they can be more easily replaced. Is there a good cheap NAS solution for this? I've been a Unix kid forever, so I'm fine getting in the weeds a bit with setup, but once it's set up I want it to just work.

Also, there's not really a way to migrate off a Drobo without just copying poo poo off to new drives, right?

more falafel please
Feb 26, 2005

forums poster

Maybe y'all can help. I'm running Sonarr/Radarr/NZBget on a beefy PC, storing on a USB external HDD and then playing on an RPi3 with OSMC (so Kodi), mounting the shared external over SMB. The RPi is on WiFi, the PC is hardwired cat6 to the router.

I have trouble with buffering almost anything that downloads in 1080p. I know the RPi can't handle x265, so I exclude that, but even x264 stuff will get choppy and I'll have to pause so it can buffer. My best guess is that it's bitrate/filesize related rather than decoding related, because bigger files tend to cause more problems.

The easiest fix I've found is to just prioritize 720p profiles in the Any quality category, which seems to work fine, but it seems like the RPi3 should be able to handle 1080p video, and it seems like the network should be able to handle ~2.5 gigs over 45 minutes, and it seems like the HDD and the USB interface should be able to as well, considering I can download the same file in, like, 5 minutes to the same drive.

What might be my bottleneck? The WiFi? My PS4 is able to get 1080p video over WiFi, an inch or two away from the RPi.
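Quick back-of-envelope check on those numbers (a ~2.5 GiB file over a 45-minute runtime, my rough figures from above):

```python
# Average throughput needed to stream a ~2.5 GiB file over 45 minutes.
size_bytes = 2.5 * 1024**3          # ~2.5 GiB episode
duration_s = 45 * 60                # 45-minute runtime
mbps = size_bytes * 8 / duration_s / 1e6
print(f"average bitrate: {mbps:.1f} Mbps")  # ~8 Mbps
```

~8 Mbps average is nothing for the hardware or even decent WiFi, which is why the smart money is on caching/burstiness rather than raw bandwidth.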

more falafel please
Feb 26, 2005

forums poster

Keito posted:

I find Kodi to be almost useless on WiFi without modifying the caching. The defaults are way too optimistic and will make you buffer with the slightest disruption.

https://kodi.wiki/view/HOW-TO:Modify_the_video_cache

Oh wow. I should mention that the Samba shares are mounted with fstab (much easier than figuring out whatever GUI tools might be there, and OSMC doesn't have much in the way of stuff outside Kodi), so I wouldn't be surprised if Kodi just thinks it's a local drive and doesn't do any caching. That would make sense.

What settings do you use?

more falafel please
Feb 26, 2005

forums poster

I don't think dog does open reg anymore.

more falafel please
Feb 26, 2005

forums poster

Geek is the only indexer I have set up currently, and I don't have any problems.

more falafel please
Feb 26, 2005

forums poster

Mr. Crow posted:

Can somebody explain this to me? I have been thinking about trying Usenet for a while, but I'm on some private trackers, so having to pay to get involved and the lack of clear instructions around Usenet have been off-putting. I'm only just seeing we have this thread, and the OP seems better than anything else I've seen. Is it still up to date?

Am I paying to download 1TB of data only, and will I need to buy more when I hit the limit? Do I need to pay for some other membership to access it (I see something about a free trial..), or is it unlimited access till I hit the limit? Would this be a good deal / indexer for a new user?

The OP is pretty out of date -- the basic concepts are there, but the numbers have changed dramatically.

There are two different services you need access to in order to easily download files from Usenet: a provider and an indexer. The deal that was posted was for a provider.

There are also two types of provider accounts: unlimited monthly and block. Unlimited monthly means you pay every month (often there are deals for prepaying 6 or 12 months, but same difference) and can download an unlimited amount. Block accounts charge by the gigabyte (or terabyte). Think of it as the difference between an unlimited-minutes phone plan and prepaid cards. If you download, say, 200 GB a month, the plan that got posted is less than $2/month, because you're only buying a new block every 5 months or so. If you download a lot, or have big spikes, unlimited may make more sense. Often people will get an unlimited plan and a backup block account on a different provider backbone, so that if one network is missing things, there's a chance you can get them from the other.
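To make the block-vs-unlimited math concrete, here's a toy comparison. The prices are made up for illustration ($20 per 1000 GB block, $10/month unlimited), not any specific deal:

```python
# Hypothetical prices: a $20/1000 GB block vs. a $10/month unlimited plan.
block_price, block_gb = 20.0, 1000.0
unlimited_per_month = 10.0

def monthly_block_cost(gb_per_month):
    """Effective monthly cost of refilling blocks at this usage rate."""
    return block_price * gb_per_month / block_gb

for usage in (50, 200, 600):
    block = monthly_block_cost(usage)
    better = "block" if block < unlimited_per_month else "unlimited"
    print(f"{usage:>4} GB/mo -> block ${block:.2f}/mo -> {better} wins")
```

The crossover is just usage * (block price per GB) vs. the flat monthly fee, so light or bursty downloaders come out way ahead on blocks.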

more falafel please
Feb 26, 2005

forums poster

cr0y posted:

Sounds like ND wants to be their own backbone eventually...



It's bonkers to me that companies can make money in this business charging people a couple bucks a month and likely needing ungodly amounts of storage and bandwidth. Would love to learn more about the economics of it though...

Storage and bandwidth are way cheaper than they used to be, despite what an abusive opiate addict leeching off a website for the last 20 years may have told you

more falafel please
Feb 26, 2005

forums poster

Yeah, so if I bought a lifetime membership a while back, that CC info wasn't breached? Honestly can't remember if I used PayPal or not.

more falafel please
Feb 26, 2005

forums poster

Yeah, with SSL your ISP knows you're talking to a Usenet server, but they don't know you're not just fetching an awful lot of articles from rec.baseball or whatever to read later.

DMCA notices from torrent trackers work because you're connecting to a tracker and broadcasting out to everyone else connected to that tracker HEY I HAVE THIS FILE/CAN I GET THIS FILE, so rights holders can just connect to the same tracker and log everyone who says they're sharing their stuff. The only way they'd know you were downloading files from a Usenet server is if a) the server kept logs about which users accessed which articles and b) shared those logs because of a subpoena or something

Usenet is going to your cousin's place to buy drugs from him, public torrent trackers are standing on a street corner yelling I WOULD LIKE TO BUY DRUGS
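For the curious, here's roughly what that street-corner yelling looks like on the wire. A hedged sketch of a BitTorrent HTTP tracker announce; the tracker URL and IDs are made up:

```python
# Sketch of a BitTorrent tracker announce: every peer sends (roughly) this
# HTTP GET, so anyone watching the tracker sees who claims to have what.
from urllib.parse import urlencode

announce = "http://tracker.example.org/announce"  # hypothetical tracker
params = {
    "info_hash": b"\x12" * 20,              # identifies the torrent
    "peer_id":   b"-XX0001-abcdefghijkl",   # identifies this client
    "port": 6881,
    "uploaded": 0, "downloaded": 0,
    "left": 0,                              # left=0 means "I'm seeding"
}
url = announce + "?" + urlencode(params)
print(url)
```

A rights holder just makes the same request with `left` set to the full size, gets back the peer list, and logs every IP that answered: no subpoena needed.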

more falafel please
Feb 26, 2005

forums poster

Takes No Damage posted:

Also since torrenting is peer-based, things that are new and/or popular will have higher availability since more people are trading pieces around, while usenet (in my mind at least) is more like an old school FTP server where a file just sits there and waits to be requested upon. I'm sure that's a gross oversimplification but it makes sense to me :saddowns:

No, that's basically true. Usenet providers store every article in every group they carry, going back to their maximum retention. So if a provider's maximum retention is 10 years, there should be no difference between a file that was uploaded last week and one that was uploaded 5 years ago.

more falafel please
Feb 26, 2005

forums poster

I need to transition my Sonarr/Radarr/NZBget setup to something other than the machine it's running on now, and I've also been getting sick of switching between the PS4 (to watch Netflix/Prime/etc.) and the OpenELEC RPi3 I've been using to play video. My big plan was to get a Roku device (ended up getting the Streaming Stick+), move the media drive to the Pi 3, and set up Sonarr/Radarr/NZBget and Plex Media Server on the Pi as well.

I got it set up, and... well, Plex can't handle transcoding, like at all. That shouldn't be much of a problem, except it seems to be transcoding just about any .mkv I try to watch. .mp4s are fine, and clearly not transcoding. If I disable transcoding on the server, it says that the mkv files aren't available for direct play on the Roku.

Is there One Weird Trick I'm missing to get these videos to play direct, or am I going to have to run PMS off something beefier? The Pi is connected via gigabit, so bandwidth shouldn't be the issue. Everything's fine when it's not transcoding (htop is using more CPU than Plex) but totally pegged when it is.

edit: the video in question that made me realize this was not gonna work is 334MB/2.2Mbps, 720p H264 with 2ch AAC, in an MKV container. So not crazy by any stretch, and the Roku should be able to play the video stream with zero problems.

more falafel please fucked around with this message at 05:15 on Jun 30, 2021

more falafel please
Feb 26, 2005

forums poster

UltimoDragonQuest posted:

You really can't transcode on a Pi 3 but it should be able to Direct Play most codecs. FLAC and some boutique ones give me trouble on my Roku TV but not AC3 or normal stuff.

My best guesses:
It's trying to convert needlessly because quality is throttled. Plex App->Settings->Video->Local Quality->Original
Subtitle formatting. Plex app->Settings->Video->Burn Subtitles->Only Image Formats could help.

Oh dang, that did it, thanks!


Somehow I didn't bother looking in the client settings, I figured all of that would be server-side. I'm still new to Plex. I also set it to force direct play, so maybe now I can disable transcoding completely.

I think this means this setup will work for me, as long as I don't care about remote streaming (I mostly don't).

more falafel please
Feb 26, 2005

forums poster

Mister Fister posted:

It's been years since i've used SABNZBD and PVR Software...

Between Sonarr, Sickbeard, Sickrage and whatever other PVR software there is, which is the best currently right now (that works with sabnzbd)? Thanks!

I switched from Sickbeard to Sonarr a few years ago and I can't think of any time I've missed something from Sickbeard.

more falafel please
Feb 26, 2005

forums poster

kri kri posted:

IMO sonarr except if tvdb fucks up the series, then you kinda gotta do manually or use medusa

add prowlarr to your stack too

What does Medusa do about tvdb loving up series? Is there a way to fix those while still using Sonarr?

more falafel please
Feb 26, 2005

forums poster

Taima posted:

I suspect that Newsgroup Ninja may be throttling my download speed due to too much bandwidth use, anyone know if that's a thing?

In fact I'm almost certain they are doing it considering how low the speed is regardless of how long the file has been on the server...

It may be your ISP throttling you down.

more falafel please
Feb 26, 2005

forums poster

NZBGeek has been nails for me. A few years ago I would have loved something like Prowlarr or NZBHydra to aggregate all my indexers, but now I get everything from Geek and it works great.

more falafel please
Feb 26, 2005

forums poster

more falafel please posted:

NZBGeek has been nails for me. A few years ago I would have loved something like Prowlarr or NZBHydra to aggregate all my indexers, but now I get everything from Geek and it works great.

I say this and now it's giving me SSL errors:

code:
2021-10-07 23:11:05.3|Warn|Newznab|Unable to connect to indexer

[v3.0.6.1265] System.Net.WebException: Error: TrustFailure (Authentication failed, see inner exception.): 'https://api.nzbgeek.info/api?t=caps&apikey=(removed) ---> System.Net.WebException: Error: TrustFailure (Authentication failed, see inner exception.) ---> System.Security.Authentication.AuthenticationException: Authentication failed, see inner exception. ---> Mono.Btls.MonoBtlsException: Ssl error:1000007d:SSL routines:OPENSSL_internal:CERTIFICATE_VERIFY_FAILED
  at /build/mono-sto__t/mono-6.8.0.105+dfsg/external/boringssl/ssl/handshake_client.c:1132
  at Mono.Btls.MonoBtlsContext.ProcessHandshake () [0x00064] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0 
  at Mono.Net.Security.MobileAuthenticatedStream.ProcessHandshake (Mono.Net.Security.AsyncOperationStatus status, System.Boolean renegotiate) [0x00106] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0 
  at (wrapper remoting-invoke-with-check) Mono.Net.Security.MobileAuthenticatedStream.ProcessHandshake(Mono.Net.Security.AsyncOperationStatus,bool)
  at Mono.Net.Security.AsyncHandshakeRequest.Run (Mono.Net.Security.AsyncOperationStatus status) [0x00006] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0 
  at Mono.Net.Security.AsyncProtocolRequest.ProcessOperation (System.Threading.CancellationToken cancellationToken) [0x0012a] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0 
   --- End of inner exception stack trace ---
  at Mono.Net.Security.MobileAuthenticatedStream.ProcessAuthentication (System.Boolean runSynchronously, Mono.Net.Security.MonoSslAuthenticationOptions options, System.Threading.CancellationToken cancellationToken) [0x00346] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0 
  at Mono.Net.Security.MonoTlsStream.CreateStream (System.Net.WebConnectionTunnel tunnel, System.Threading.CancellationToken cancellationToken) [0x001f4] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0 
  at System.Net.WebConnection.CreateStream (System.Net.WebOperation operation, System.Boolean reused, System.Threading.CancellationToken cancellationToken) [0x001f5] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0 
   --- End of inner exception stack trace ---
  at System.Net.WebConnection.CreateStream (System.Net.WebOperation operation, System.Boolean reused, System.Threading.CancellationToken cancellationToken) [0x00275] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0 
  at System.Net.WebConnection.InitConnection (System.Net.WebOperation operation, System.Threading.CancellationToken cancellationToken) [0x0015b] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0 
  at System.Net.WebOperation.Run () [0x000b7] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0 
  at System.Net.WebCompletionSource`1[T].WaitForCompletion () [0x000b1] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0 
  at System.Net.HttpWebRequest.RunWithTimeoutWorker[T] (System.Threading.Tasks.Task`1[TResult] workerTask, System.Int32 timeout, System.Action abort, System.Func`1[TResult] aborted, System.Threading.CancellationTokenSource cts) [0x00118] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0 
  at System.Net.HttpWebRequest.GetResponse () [0x00019] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0 
  at NzbDrone.Common.Http.Dispatchers.ManagedHttpDispatcher.GetResponse (NzbDrone.Common.Http.HttpRequest request, System.Net.CookieContainer cookies) [0x00123] in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Common\Http\Dispatchers\ManagedHttpDispatcher.cs:81 
   --- End of inner exception stack trace ---
  at NzbDrone.Common.Http.Dispatchers.ManagedHttpDispatcher.GetResponse (NzbDrone.Common.Http.HttpRequest request, System.Net.CookieContainer cookies) [0x001bb] in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Common\Http\Dispatchers\ManagedHttpDispatcher.cs:107 
  at NzbDrone.Common.Http.HttpClient.ExecuteRequest (NzbDrone.Common.Http.HttpRequest request, System.Net.CookieContainer cookieContainer) [0x00086] in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Common\Http\HttpClient.cs:126 
  at NzbDrone.Common.Http.HttpClient.Execute (NzbDrone.Common.Http.HttpRequest request) [0x00008] in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Common\Http\HttpClient.cs:59 
  at NzbDrone.Common.Http.HttpClient.Get (NzbDrone.Common.Http.HttpRequest request) [0x00007] in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Common\Http\HttpClient.cs:281 
  at NzbDrone.Core.Indexers.Newznab.NewznabCapabilitiesProvider.FetchCapabilities (NzbDrone.Core.Indexers.Newznab.NewznabSettings indexerSettings) [0x000a1] in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Indexers\Newznab\NewznabCapabilitiesProvider.cs:64 
  at NzbDrone.Core.Indexers.Newznab.NewznabCapabilitiesProvider+<>c__DisplayClass4_0.<GetCapabilities>b__0 () [0x00000] in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Indexers\Newznab\NewznabCapabilitiesProvider.cs:36 
  at NzbDrone.Common.Cache.Cached`1[T].Get (System.String key, System.Func`1[TResult] function, System.Nullable`1[T] lifeTime) [0x000b1] in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Common\Cache\Cached.cs:104 
  at NzbDrone.Core.Indexers.Newznab.NewznabCapabilitiesProvider.GetCapabilities (NzbDrone.Core.Indexers.Newznab.NewznabSettings indexerSettings) [0x00020] in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Indexers\Newznab\NewznabCapabilitiesProvider.cs:36 
  at NzbDrone.Core.Indexers.Newznab.Newznab.get_PageSize () [0x00000] in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Indexers\Newznab\Newznab.cs:24 
  at NzbDrone.Core.Indexers.Newznab.Newznab.GetRequestGenerator () [0x00000] in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Indexers\Newznab\Newznab.cs:28 
  at NzbDrone.Core.Indexers.HttpIndexerBase`1[TSettings].TestConnection () [0x00007] in M:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Indexers\HttpIndexerBase.cs:335 

2021-10-07 23:11:05.8|Warn|SonarrErrorPipeline|Invalid request Validation failed: 
 -- : Unable to connect to indexer, check the log for more details
Anyone else seeing this?

edit: weird. There's a note on their announcements from a month ago saying Comcast customers are having SSL issues because of xFi Advanced Security flagging it as a "suspicious website", but I don't have any of that enabled because I'm using their modem in bridge mode. Also, I'm able to hit https://api.nzbgeek.info/api?t=caps&apikey=(removed) from this machine and the machine Sonarr is on (with curl, it's headless) and get an XML category list. Definitely looks like it's a problem on Sonarr's side, but I haven't changed anything.

edit 2: Ok, I checked Radarr and it worked, so I restarted Sonarr and now it works fine. No idea.

more falafel please fucked around with this message at 05:32 on Oct 8, 2021

more falafel please
Feb 26, 2005

forums poster

Is there a way I can limit how much Sonarr will queue up "Wanted" releases? I'm on a bandwidth cap on Comcast, and occasionally I look at it and realize it's queued up like 2TB worth of seasons from a show that's in my library but incomplete. I can pause the ones that get added and just keep them in my NZBget queue forever, but I'd rather give it a limit of, say, 500 GB/month for old releases.

more falafel please
Feb 26, 2005

forums poster

George RR Fartin posted:

Nzb360 is a great Android app for managing all that and the .arrs, too

I use LunaSea on iOS, which is pretty nice. The only thing I wish it had was the ability to quickly switch between apps, like when I'm 4 layers deep in Sonarr menus and want to see what NZBget is doing, I have to go all the way back to the main screen to get there. It's still nice.

more falafel please
Feb 26, 2005

forums poster

You're going directly to your provider, who already has your personal information, so if they wanted to screw you over, they could whether you were on a VPN or not. The reason you have problems with torrents is that rights holders are just on the tracker, listening to you say HEY Y'ALL GOT THIS BLOCK OF COPYRIGHTED MATERIAL / I HAVE THIS OTHER BLOCK OF COPYRIGHTED MATERIAL, so hiding your IP is important. Your ISP knows you're connecting to a Usenet provider, but there are plenty of legitimate uses of Usenet, and rights holders can't tell what you're doing at all.

more falafel please
Feb 26, 2005

forums poster

Rights holders are definitely still sending DMCA notices, and at least Comcast is still responding to them. I think there's a multiple-strike system though.

more falafel please
Feb 26, 2005

forums poster

I added a one-season, 10-episode series to Sonarr and downloaded a single NZB of the whole season, but due to some weirdness that was happening with another download, it didn't get imported. NZBGet lists it as successful, and I can see the files sitting in dst/Series/Blahblah.S01.1080p.blahblahblahblah, but of course the filenames are all obfuscated "abc.xyz.04bec16627f729.mkv" junk, because that's just the way Usenet works now.

I can't manually import them because I don't know which is which, they don't seem to have any metadata with episode numbers (at least according to mediainfo), and forcing NZBGet to re-postprocess the download doesn't seem to have done anything.

Was NZBGet supposed to have the information it needs to rename these properly? I think Sonarr would only be able to find that information if it's in the filenames. Do I just need to download the episodes individually? If that's the case, it kinda sucks that whole-season downloads are just broken now.

more falafel please
Feb 26, 2005

forums poster

I did some more googling and apparently some indexers (including nzb.su, which I use) do a filename obfuscation thing. It's not a problem with single episodes, but with full seasons it is. There's maybe a script for NZBGet that can deobfuscate them, and apparently SAB can do it automatically. Maybe it's time to switch back to SAB.

more falafel please
Feb 26, 2005

forums poster

It seems that there's some mapping in the NZB itself, but I haven't parsed out what it is.
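For anyone who wants to poke at that mapping: an .nzb is just XML, and each `<file>` element carries the original posting's subject line, which sometimes still holds the real name even when the stored filename is junk. A minimal sketch (the NZB content here is made up):

```python
# An .nzb is plain XML; pull each <file> subject to see what was posted.
import xml.etree.ElementTree as ET

nzb = """<nzb xmlns="http://www.newzbin.com/DTD/2003/nzb">
  <file subject="Show.S01E01.1080p.mkv (1/50)"/>
  <file subject="Show.S01E02.1080p.mkv (1/50)"/>
</nzb>"""

ns = {"n": "http://www.newzbin.com/DTD/2003/nzb"}
subjects = [f.get("subject") for f in ET.fromstring(nzb).findall("n:file", ns)]
print(subjects)
```

If the indexer obfuscated the subjects too, of course, there's nothing useful to recover this way.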

more falafel please
Feb 26, 2005

forums poster

I just sprung for a seedbox for a number of reasons, but the upshot is it'd be nice to have my whole setup running on beefy-ish hardware that can actually transcode, and the bandwidth and storage are nice.

I don't have root, but I do have SSH. The host has a control panel type thing that lets you install a number of apps, so I've got the standard SABnzbd/Sonarr/Radarr/Lidarr setup. I added Prowlarr because hey, easy to install, and that way it's easier to add indexers. Prowlarr also lets you download arbitrary NZBs/torrents from your indexers, so I decided to hook it up to my SAB instance so it could actually download things. The problem is that everything's set up to have friendly external URLs (https://mysweetseedbox.host.eu/username/sabnzbd for example), but internally the apps are running on their own ports. I'm sure there's some Docker business going on as well as a reverse proxy situation.

Specifically, I can't connect Prowlarr to the SABnzbd instance. Sonarr/Radarr/Lidarr have it set up as localhost, port 8500, with an API key and username/pass. Prowlarr, for whatever reason (I'm assuming some Docker shenanigans?), doesn't like that at all. Prowlarr also doesn't have a "URL path" configuration option for the SABnzbd setup, so I can't use the external URL either (the host field balks at anything other than a hostname; can't put a path in there). It seems like I can't use a black hole directory either, because Prowlarr can't see my home directory.

Is there anything I can do about this short of contacting the host and asking for them to mess with it? I am decidedly not a docker expert, and I assume I don't have permissions to mess with the docker configs myself.

e: I found Prowlarr's config directory and set up a blackhole directory there, and that does appear to have worked. I'd still prefer to have it connect directly, but blackhole isn't the end of the world. I suspect this is related to why I couldn't get Ombi to connect to Sonarr/Radarr either.

more falafel please fucked around with this message at 21:52 on Mar 24, 2022

more falafel please
Feb 26, 2005

forums poster

I've just started using Lidarr to keep track of my music collection, but I'm mostly not downloading from Usenet or using Lidarr to do it.

It's great that I've been doing Usenet stuff for like 15 years and it keeps getting *better* instead of worse. I just migrated all my stuff to a seedbox, and instead of uploading all my video, I just downloaded it all again, and it worked perfectly. Within like 3 days I had gotten basically everything.

more falafel please
Feb 26, 2005

forums poster

It is pretty funny that Sonarr (and possibly other *arrs) doesn't like NZBs with the names of languages in the title, even when they're part of the name of the episode. It had no problem manually downloading the episode "Greek Revival Bookcases", but it sure wasn't gonna do it automatically.

Maybe I'll dig into the source and see if I can make it ignore language descriptors if they appear to be part of the episode title.
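The failure mode looks something like this (a toy regex of my own, not Sonarr's actual parser):

```python
# Toy version of the bug: a release-name parser that treats any language
# word as a language tag will flag titles that merely contain one.
import re

LANGUAGE_TAG = re.compile(r"\b(french|german|greek|italian)\b", re.IGNORECASE)

releases = [
    "Some.Show.S05E03.Greek.Revival.Bookcases.720p.WEB",  # false positive
    "Some.Show.S05E03.720p.WEB.GERMAN",                   # real language tag
]
for name in releases:
    m = LANGUAGE_TAG.search(name.replace(".", " "))
    print(name, "->", f"flagged as {m.group(1)}" if m else "ok")
```

A smarter parser would only honor language tokens outside the parsed episode-title span, which is presumably the fix worth looking for in the source.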

more falafel please
Feb 26, 2005

forums poster

Tea Bone posted:

I get keeping the quality, release group, etc. in the file name, but I can't think of a single time it would be useful to have the IMDb ID in the file name. Some kind of library that can't index by movie name and year alone?

Edit: what's with the exclamation mark after series names? I checked it out on the guide and they don't give a reason for it.

I've recently started using nzb360, which has been a game changer for me on mobile. Radarr and Sonarr have usable mobile interfaces, but they're clunky and far from optimised.

Looks like this is basically the Android equivalent of LunaSea. Super useful; the only gripe I have is that if I add an episode, I'm several layers deep (Sonarr -> Show -> Season -> Episode Search) and getting back to SABnzbd means going all the way back to the main screen and losing all that context. I usually just keep an SABnzbd tab open in my browser so I don't have to deal with it.

more falafel please
Feb 26, 2005

forums poster

Hughmoris posted:

For those with a large amount of linux ISOs, how do you handle backups?

I keep my ISOs on an external USB drive plugged into the desktop, no backup solution. I know it's only a matter of time before that bites me.

I recently moved my whole setup to a seedbox in the Netherlands on a gigabit pipe. It took less than a day to redownload all my Linux ISOs.

more falafel please
Feb 26, 2005

forums poster

Incessant Excess posted:

I know little of how this works but would be interested, can you explain it and share the name of the company you went with?

Sure: https://www.seedhost.eu/dedicated-seedboxes.php It's not super cheap (I got the 16TB storage one), but I'm sick of managing hard drives, worrying if they're gonna fail, wondering if I should set up a NAS or homebrew one myself, etc. Plus, I want to be able to access my Plex remotely, share it with friends, and let it transcode (before, I was running everything off a Raspberry Pi, so if the Roku couldn't play a file natively, I couldn't play it). I also don't want to worry about my own bandwidth when I download a few seasons of Linux ISOs, and I can use public torrents for Linux ISOs. I can also seed torrents indefinitely without worrying about my own upload bandwidth, so I can give something back to the open source community.

The only real drawbacks are the price, the fact that now everything is remote so I use bandwidth to watch stuff, and that I'm not sure what happens if I need more storage.

They give shell access (but not root); every program I've wanted to install has a one-click thing on their control panel, and I should be able to install arbitrary stuff as long as it doesn't need root.

But drat, is it sick to hit download on a 10 gig Linux ISO and have it... ready to install... and showing up in Plex in like a minute.

more falafel please
Feb 26, 2005

forums poster

Lidarr can use A Thing That Isn't Usenet that I'm not sure if we're supposed to talk about here, and that Thing has private Things for music that are very, very, very good, but I definitely can't talk more about that here.

more falafel please
Feb 26, 2005

forums poster

Shumagorath posted:

I've been on TweakNews since I switched off Astraweb waaaay back, but they struggle to crack 25MB/sec and tonight they were dropping to low single digits while BlockNews hits a solid 50MB/sec. Is there a new top provider out there that won't listen to takedown notices?

I'm hoping for an unlimited account but I could always buy nothing but blocks, I guess? Newshosting is reputed to be faster (and have better retention) but they use the same Omicron backbone...?

I'm currently using Eweka and Thundernews, both unlimited accounts. I'm probably gonna dump Thundernews when it expires; I pull almost everything through Eweka.

In my experience I have to cycle through providers every 2 years or so, but cheap deals are easy enough to find that it's pretty painless.

more falafel please
Feb 26, 2005

forums poster

Laserface posted:

has anyone had issue configuring Lunasea to talk to SABNZBD+? everything else on my system (sonarr, radarr) works fine, but I cant for the life of me get it to talk to SAB.

I have turned on external access and have tried with and without authentication. I keep getting a 403 error from LunaSea's connection test, which seems to suggest the authentication method is not 'basic' and needs custom headers?

Mine is just set up with the API key from SABnzbd's Settings->General. Works fine.


more falafel please
Feb 26, 2005

forums poster

I used NZBget for a while, but I switched back to SABnzbd and I like it better. Maybe I'm just more used to it, maybe it's that the switch back corresponded to me switching from self-hosted to a seedbox with an extremely fat pipe, but it just seems to work nicer.
