|
Is anyone running recyclarr and willing to share their configs? I use an Apple TV with infuse for pretty much all my clients. I am looking through the wiki and it's kind of doing my head in. I know I can run sonarr extended, but those just pull the profiles and not the scores as far as I can tell.
|
# ? Apr 4, 2023 15:08 |
|
Taima posted:Couple of questions if yall don't mind.

If you’re using Plex, you can go down the Plex Meta Manager path perhaps. It syncs with Plex and the *arrs when you feed it external lists and can fill in the gaps. I only have a basic setup with a handful of types of collections (best movies of the last decade per year, actor collections, Oscar stuff, currently trending which is really useful to add to the Home page if you backfill a lot). I don’t have it synced to download personally; I just watch the list it spits out in the morning when it runs so I can see what's missing in each category. It could likely spiral out of hand quickly, I'm sure, ha.

For actual downloading I use mdblist.com with the *arrs for watching lists of new content that’s airing/releasing; no reason you couldn’t use that for backfill if you can write up the filters that match your needs.

Unrelated, but I've managed to consolidate my little setup to a single row on my hifi shelf. Still have 3 empty drive bays with about 70TB online. Nice, simple, essentially silent and not much in the way of power draw. The 4-bay Terramaster on the right is neat, with screwless drive mounting now. Definitely an improved design over the 5-bay.

EL BROMANCE fucked around with this message at 15:18 on Apr 4, 2023 |
# ? Apr 4, 2023 15:15 |
|
Thanks for the recommendations yall! Bromance that is looking great, and more or less exactly what I run (2x 4 drive enclosures) though mine leads to a media center windows pc with a 3080/5800x that serves as a secondary gaming pc and the primary plex server that's on 24/7. That's a nice setup you got there.
|
# ? Apr 4, 2023 16:54 |
|
Thanks, back in the olden days I lived in a big house with a friend who had a similar tech mentality, so there were racks everywhere, speakers covering every wall and just clutter everywhere (we once bought a row of airline seats simply because someone was selling them and it seemed funny at the time). It's nice to be able to reduce everything so small!

I just noticed point 3 on your original post. I use two services personally: IDrive for a fixed 5TB or so of space that the most important of my stuff goes onto. They do a decent first-year rate, and when I was about to cancel last time they pretty much matched it; not sure if that works indefinitely but it was enough to keep them around. I also use an unlimited Backblaze plan, which takes a long, long time to back up that amount of data (a few months even on gigabit), and the client isn't exactly invisible when it comes to processing, but it did all get up there and the versioning works well. Getting files back out of it won't be fun if a huge drive drops; the plan in that case would be to rebuild from usenet quickly, then use the online archive to fill in the gaps that were harder to find or stuff I've encoded myself. I pay the extra dollar or two for a year's worth of version history too.
|
# ? Apr 4, 2023 17:17 |
|
kri kri posted:Is anyone running recyclarr and willing to share their configs? I use an Apple TV with Infuse for pretty much all my clients. I am looking through the wiki and it's kind of doing my head in. I know I can run sonarr extended, but those just pull the profiles and not the scores as far as I can tell.

I understand where you are coming from; it took me a few days to get my head around the config too when I looked at recyclarr. My setup is using Sonarr v4 for an HDR Samsung TV and Android clients, basically HD and UHD profiles. It is a very simple config, and I'm sure your Apple TV has different requirements. code:
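A config of that shape looks roughly like the sketch below. Everything in it is a placeholder rather than the poster's actual file: the real trash_ids come from the TRaSH Guides pages for the custom formats you want, and field names have shifted between recyclarr versions, so check the docs for your release.

```yaml
# Minimal sketch of a recyclarr config for a Sonarr v4 instance.
# base_url, api_key, profile name, and trash_ids are all placeholders.
sonarr:
  main:
    base_url: http://localhost:8989
    api_key: YOUR_SONARR_API_KEY
    quality_definition:
      type: series
    custom_formats:
      - trash_ids:
          - 0000000000000000000000000000000a  # placeholder: e.g. a WEB tier format
          - 0000000000000000000000000000000b  # placeholder: e.g. an unwanted-release block
        quality_profiles:
          - name: HD-1080p  # must match a profile that already exists in Sonarr
            # omitting a score here uses the TRaSH-recommended default
```

Running `recyclarr sync` against a file like this is what pushes both the formats and their scores into Sonarr, which is the part the extended scripts don't do.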
|
# ? Apr 5, 2023 10:03 |
|
EL BROMANCE posted:Thanks, back in the olden days I lived in a big house with a friend who had a similar tech mentality so there were things like racks everything, speakers covering every wall and just clutter everywhere (once bought a line of airline seats simply due to the fact that someone was selling them and it seemed funny at the time). It's nice to be able to reduce everything so small!

Oh poo poo ok this is making way more sense. Backblaze is what I tried to use and it was just trash. But it sounds like it works, just takes months? Good times. But I also moved to the Seattle area and unfortunately no longer have the 1000/1000 fiber I had in the Bay Area, and I'm limping along on gigabit cable (i.e. no upload speed), so I don't know, maybe expecting to back up 100TB over a cable upload is stupid regardless. Does a cable modem get mad if you saturate your 30Mb upload or whatever?

Honestly this has been kind of an issue in general: I had no problem saturating the fiber to download 5TB in a few days or whatever. But I just did that over the weekend, and it made me realize that I don't actually know if cable providers tolerate such large swathes of downloads. Anyone know what the score is here? I have Astound if that helps. I also SSL my traffic; that's enough I hope?

Also, I recently added 2x Exos X18 enterprise drives in the hope that they would add durability, but looking back, I'm not sure if that's true; do the drives actually last appreciably longer, or do they just have good support if/when they fail?

Taima fucked around with this message at 11:18 on Apr 5, 2023 |
# ? Apr 5, 2023 11:15 |
|
I just downloaded a thing that was posted over 10 years ago. Usenet is kind of amazing.
|
# ? Apr 19, 2023 14:54 |
|
I was getting all my movies via a 3-at-a-time Netflix DVD subscription back then. I had my Mac Mini set up to automatically rip + tag + import into iTunes any DVDs I inserted. I would mail the DVDs back from my downtown office mailroom, because I knew it’d reliably take less time to get back to Netflix and be processed so a new DVD would be mailed out.
|
# ? Apr 20, 2023 03:25 |
|
Dicty Bojangles posted:I was getting all my movies via a 3-at-a-time Netflix DVD subscription back then. I had my Mac Mini set up to automatically rip + tag + import into iTunes any DVDs I inserted. I would mail the DVDs back from my downtown office mailroom, because I knew it’d reliably take less time to get back to Netflix and be processed so a new DVD would be mailed out.

As a chronic digital hoarder, manually mirroring all of Netflix speaks to me.

e: Or at least it would have when they were just a DVD rental service, not wasting 1s and 0s on their original content
|
# ? Apr 20, 2023 03:40 |
|
Well, you know, they just had everything, and it was so easy to have a mile-long queue which made for some nice surprises when something would show up in your mailbox and you had no idea why you would’ve added it.
|
# ? Apr 20, 2023 03:53 |
|
Dicty Bojangles posted:Well, you know, they just had everything, and it was so easy to have a mile-long queue which made for some nice surprises when something would show up in your mailbox and you had no idea why you would’ve added it.

It's like Wish.com: you order a ton of random crap while drunk, and then for the next 4 months stuff shows up in your mailbox that you had no idea you ordered.
|
# ? Apr 20, 2023 06:52 |
|
omg seems like a good indexer that I would like to use...
|
# ? Apr 23, 2023 05:53 |
|
I have NZBGet downloading to an SSD to make the most of my connection, then Radarr/Sonarr moves the completed download to a traditional HDD which Plex reads from. Obviously the move takes a while: since it's moving to a new device, it's a copy operation rather than a true move. Is there a way for Radarr/Sonarr to link to the original file on the SSD temporarily while the copy is in progress, so that files are available in Plex as soon as they've downloaded? I've googled around and can't seem to find anything, which makes me think I'm going about this the wrong way. Is there a better way of doing this? Tea Bone fucked around with this message at 17:57 on Apr 24, 2023 |
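For what it's worth, whether a move can be an instant rename or has to be a full copy comes down to whether the two paths live on the same filesystem. A minimal Python sketch of that check (`move_strategy` is a made-up name for illustration, not anything the *arrs expose):

```python
import os
import tempfile

def move_strategy(src_dir, dst_dir):
    """Predict whether moving a file from src_dir to dst_dir can be a cheap
    rename (same filesystem) or must fall back to copy + delete (different
    filesystems) -- the slow path an SSD-to-HDD import takes."""
    same_fs = os.stat(src_dir).st_dev == os.stat(dst_dir).st_dev
    return "rename" if same_fs else "copy"

# Two temp dirs on the same filesystem: a move between them is a rename.
with tempfile.TemporaryDirectory() as a, tempfile.TemporaryDirectory() as b:
    print(move_strategy(a, b))  # → rename
```

An SSD download folder and an HDD library will always take the "copy" branch, which is why the import can't be instant no matter what the client does.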
# ? Apr 24, 2023 17:46 |
|
even if you could symlink the file temporarily on a separate volume, surely the copy time is such a small fraction of the dl/unpack that it's not worth it.
|
# ? Apr 24, 2023 20:04 |
|
You'd be better off making the SSD a general cache drive and then having whatever platform you're running handle moving the stale data onto the spinning disks.
|
# ? Apr 24, 2023 20:25 |
|
Tea Bone posted:I've googled around and can't seem to find anything which makes me think I'm going about this the wrong way. Is there a better way of doing this?

Like everyone else said, use the hard drive. You didn't say what your connection is, but a gigabit connection (roughly 100MB/s usable) won't saturate a spinning drive. If you are on a 10gig internet connection (roughly 1000MB/s) then you'll have to decide where to bottleneck. A spinning hard drive is surprisingly fast at reading and writing a stream of sequential data.

gariig fucked around with this message at 20:52 on Apr 24, 2023 |
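The arithmetic behind those parentheticals is a divide-by-eight, since link speeds are quoted in megabits and drive speeds in megabytes; a quick sketch (real-world throughput lands a few percent below line rate once protocol overhead is counted):

```python
def mbps_to_mb_per_s(mbit_per_s):
    """Convert a link speed in Mbit/s to MB/s (raw line rate).
    TCP/SSL overhead shaves a few percent off in practice."""
    return mbit_per_s / 8

print(mbps_to_mb_per_s(1000))   # gigabit: 125.0 MB/s line rate
print(mbps_to_mb_per_s(10000))  # 10 gigabit: 1250.0 MB/s line rate
```

So a single modern spinning drive (150-250MB/s sequential) comfortably keeps up with gigabit, and only 10gig forces a choice of bottleneck.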
# ? Apr 24, 2023 20:50 |
|
Qwijib0 posted:surely the copy time is such a small fraction of the dl/unpack that it's not worth it.

Thanks Ants posted:You'd be better off making the SSD a general cache drive and then having whatever platform you're running handle moving the stale data onto the spinning disks.

gariig posted:Like everyone else said, use the hard drive. You didn't say what your connection is but a gigabit connection won't saturate a spinning drive (100MB/s)

Thanks everyone. I clearly have something configured poorly, because I'm on half a gigabit and the move operation takes about the same time as the download does. I'll try downloading directly to the spinning disks and see how it goes.
|
# ? Apr 24, 2023 21:17 |
|
I’ve wanted to do the same thing, and the slowdown is real. Plex is scanning the incoming file to create thumbnails and find the intro/credits. So if you queue up a season you are reading and writing at the same time, and you’ll end up writing at like 10MB/s.
|
# ? Apr 25, 2023 05:01 |
|
The main reason I have the NZB client sending to an SSD before moving off to an array of regular HDDs is if I’m grabbing a stack of releases at once, it seems to perform a lot better when it’s post processing something while downloading another. Before I was setting it to pause post processing during downloads.
|
# ? Apr 25, 2023 05:59 |
|
I've always found the relationship between usenet and speed (both bandwidth and processing) to be kind of interesting. On one hand you want to unpack quickly, and you want to get releases asap, especially if they're from one of the media empires that take down files. It's basic human nature to optimize.

But most people have quasi-dedicated computers for Usenet, which, combined with the automated suite of tools that exists now, becomes so interesting imo. You can download and process 24 hours a day, so the very concept of speed becomes relative; is there a real upside to a gigabit fiber connection? Or a quick unpack? Kind of, I guess. But in practice not really? In the past week I've downloaded like 10TB of legal linux ISOs, and I don't think it would have been appreciably different if I was on a connection half as quick.

It really shows how robot workers can be incredibly inefficient but, running 24 hours a day, will still get similar results. It's an interesting life lesson about how time intersects with quality and oftentimes is interchangeable to some degree. Idk where I'm going with this but I feel like there are lessons to be had here. Speed approaches irrelevancy in Usenet because it's holistically automated in a way that consumers have very little normal access to. It's a small taste of the thought process, experience and decision making that goes on in corporations replacing human labor with machine labor.
|
# ? Apr 25, 2023 10:24 |
|
I mean you're right. My internet is only 50mbps and I limit Usenet to 20mbps because it will completely saturate that in a single multithreaded download and kill other stuff for a bit. I definitely don't care how long something takes from requesting in Overseerr to downloading and unpacking into plex. If I really want it fast to watch now I'll just grab the smallest version of it. Technically I can upgrade my internet to 1gbps fibre but I really don't see any point for our 2 person household for 3x the cost. The only time it bothers me is when I want to play something from Steam noooww but that's pretty rare.
|
# ? Apr 25, 2023 11:53 |
|
Taima posted:You can download and process 24 hours a day so the very concept of speed becomes relative; is there a real upside to a gigabit fiber connection? Or a quick unpack? Kind of, I guess. But in practice not really? In the past week I've downloaded like 10TB of legal linux ISOs, and I don't think it would have been appreciably different if I was on a connection half as quick.

Or if you're like me and have a media collection you've been building up for years, you decide you want to watch something you downloaded ages ago and it turns out it's a 700MB XviD which looks like muddy crap on a 4K OLED. Being able to upgrade in the time it takes to fix a snack is worth every penny of my 2gig fiber.

wolrah fucked around with this message at 14:21 on Apr 25, 2023 |
# ? Apr 25, 2023 14:18 |
|
For me, I just love that when my mom calls up requesting a certain distro of Redhat it's available for her in Plex by the time the phone call is over.
|
# ? Apr 25, 2023 15:07 |
|
wolrah posted:I don't disagree in general, but those times when the automation fails and you just sat down expecting to watch your favorite open source short film and the copy it grabbed has burned in subtitles, or it wasn't even the right thing altogether, being able to delete that and run a manual download in seconds is really nice.

It's exactly this. Most of the time it doesn't really make any difference to me if something downloads in 5 minutes or 2 hours, but on the occasions I've not planned ahead it would be nice to queue up the download and be ready to go in those 5 minutes.
|
# ? Apr 25, 2023 15:09 |
|
Fair points for sure.

Random Sonarr question: I notice that while it's alright at doing its job, I can usually do a lot better by stacking nzbs myself. In particular, it seems really bad at downloading entire seasons of shows vs grabbing individual episodes and combining them together. That doesn't always happen, but usually if I'm willing to play in the mud and queue my own nzbs there's a decent chance I'm going to do a better job than the auto-get, and sometimes extremely better (for example, Sonarr was sure it couldn't find the final season of a show this morning, but the first "S03" full-season nzb worked right away). Don't get me wrong, Sonarr is extremely good, but is there any way to increase its accuracy?

e: also, do you guys have any ways to procure missing bits of a season(al Linux ISO)? Because often I get most of a season(al Linux ISO) and it's super annoying to get the last couple of episodes (of Linux). I don't want to use bittorrent.

By the way, this has nothing to do with my question/comments, but I recently added Geek to my indexers and have found it to be exceptional! It runs a little slow, probably because they have so much bespoke code running on top to refine searches, but it consistently finds ISOs that other indexers don't see or keep. Between .su and Geek you don't need much else (I know a lot of people would add Slug here, but I find it to be almost the same as .su, and pretty expensive on top of that). I'm just floored at how good Geek is. I've tried a LOT of indexers and Geek just shits all over everything I've tried for shows in particular. Their invites are open too, so have at it.

The one holy grail that I've yet to find, though, is somewhere that consistently has working HBO and/or Paramount Plus 4K open source wallpaper ISOs. Definitely open to recommendations on that front.

Taima fucked around with this message at 13:06 on May 2, 2023 |
# ? May 2, 2023 11:38 |
|
I know you don't want to use BT but I do find it far better with Sonarr for season packs that don't end up a mishmash of qualities from different releases. Obviously having a good private tracker helps a lot too.
|
# ? May 2, 2023 14:19 |
|
Qualities can be helped by scoring some keywords. I basically tell it ‘once you see the word Amazon/amzn in a file name, grab that and we’re good’ and then it doesn’t matter who packed the file. Sometimes it grabs the same episode in the same quality twice, as not every group uses that tag, but it gives me consistency.
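That kind of keyword scoring is essentially substring matching over the release name. A toy Python version of the idea (the words and point values here are made up for illustration, not Sonarr's actual internals):

```python
# Made-up preferred words and scores; in Sonarr these live in a
# release profile / custom format, not in code.
PREFERRED = {"amzn": 100, "amazon": 100, "web-dl": 50}

def score_release(name):
    """Sum the points of every preferred word found in the release name."""
    lowered = name.lower()
    return sum(pts for word, pts in PREFERRED.items() if word in lowered)

print(score_release("Show.S01E01.1080p.AMZN.WEB-DL.x264-GROUP"))  # → 150
print(score_release("Show.S01E01.720p.HDTV.x264-OTHER"))          # → 0
```

The highest-scoring candidate wins regardless of which group packed it, which is exactly why tagging on "amzn" gives consistent sources.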
|
# ? May 2, 2023 15:26 |
|
Tags in Sonarr (and Radarr) are an evolving task. I'll add tags to the ignore/prefer lists as things are grabbed and they do or don't meet expectations. I have certain groups tagged as ignored, and other tags for preferred/final grabs of anything I want to keep long-term. Occasionally I'll grab things manually, but for day-to-day it's been reliable.
|
# ? May 2, 2023 17:06 |
|
To get Radarr/Sonarr up to snuff, I highly suggest setting up Custom Formats from Trash Guides. Custom Formats in Radarr (and Sonarr v4, which is the current develop branch) are great. You can set it so that certain groups are preferred over other groups, and so those releases will always get preferred. You can also blacklist subpar groups and never download things from them.
|
# ? May 2, 2023 17:20 |
|
Taima posted:Fair points for sure.

One thing that has helped with accuracy a lot is having the year in the folder and episode naming structure.
|
# ? May 2, 2023 21:46 |
|
Taima posted:Idk where I'm going with this but I feel like there's lessons to be had here. Speed approaches irrelevancy in Usenet because it's holistically automated in a way that consumers have very little normal access to. It's a small taste of the thought process, experience and decision making that goes on in corporations replacing human labor for machine labor.

Yup, I also had this thought back when I was botting in Diablo 2. This is what captains of industry must have felt like during industrialization. Sure, my bot may not play optimally and die occasionally, but it runs 24/7 and doesn't need lunch breaks. Same with the *arrs, they replaced so much menial busy work I used to do daily that I now wonder why I ever did them.
|
# ? May 3, 2023 10:43 |
|
Xenthalon posted:Same with the *arrs, they replaced so much menial busy work I used to do daily that I now wonder why I ever did them.

I used to spend sometimes hours a week browsing Newzbin and TVDB to figure out what I was missing and queue it up, then I'd manually rename and organize it as well before importing into my XBMC library. When I finally gave in and spent a weekend setting it up it was like magic; stuff would just appear on its own, and I felt stupid for how much time I had wasted being stubborn.
|
# ? May 3, 2023 16:08 |
|
wolrah posted:Up until what had to be just before the current version of this thread (which turns 12 tomorrow, btw) I was a holdout still doing things manually and in hindsight I can not understand why I spent so long doing that. I know there was something about how Sickbeard organized shows that I was picky about and IIRC at the time Couch Potato was still kinda janky and showing its roots as Sickbeard code bodged to handle movies, but still. You took me back with newzbin (RIP)

The good news is that you eventually caved and joined us! Better late than never.
|
# ? May 3, 2023 17:44 |
|
I used to manually tag every single MP3 with artist, album, title, track # and year, sometimes even genre. WITH PROPER CAPITALIZATION.

I still remember the moment I gave up: I was in the middle of some rap album with lots of guest artists, so for every track the artist tag was a different 'x with y featuring z', and I looked up and said gently caress This and just quit tagging my poo poo for years.

Nowadays I just let the MusicBrainz client take a whack at everything and that automates 99% of it. The few things it can't find I just rawdog into Plex, and it usually knows enough to at least display a band pic and bio.
|
# ? May 4, 2023 08:57 |
|
^^^ same, it's just too much meticulous, menial work and I can't go back to manually managing my music library after getting used to beets doing everything for me. Automation rules.
|
# ? May 4, 2023 09:15 |
|
I wish TRaSH Guides would add some guides for Lidarr.
|
# ? May 7, 2023 19:11 |
|
Dicty Bojangles posted:I wish TRaSH Guides would add some guides for Lidarr.

I was never able to get Lidarr to work very well at all; there are just too many variations of releases to consistently get what you're looking for without manually searching each thing, which largely defeats the purpose of an *arr app. Either load up Nicotine+ or bite the bullet and pay for a Deezer subscription IMO.
|
# ? May 7, 2023 22:32 |
|
Oh yeah I already use the other options, sending the results through beets for organizing. Lidarr has worked moderately ok for some purposes so far, but the default quality settings are crap and it frequently messes up and downloads individual file posts rather than proper releases.
|
# ? May 7, 2023 23:32 |
|
So what's the best way to get Star Trek TOS into Sonarr without any of the newfangled edits?
|
# ? May 30, 2023 19:19 |
|
Super-NintendoUser posted:So what's the best way to get Star Trek TOS into Sonarr without any of the newfangled edits?

You could use minimum age and set it to a little longer than however long ago the newfangled edits were released. Or do manual searches for each episode and only download ones that are old enough.
|
# ? May 30, 2023 19:22 |