|
So people have been talking about using distributed networking or something so you can lend the 4090, or parts of it, to other computers? Can you actually do that? I am interested...
|
# ? Oct 19, 2022 22:10 |
|
Taima posted:So people have been talking about using distributed networking or something so you can lend the 4090, or parts of it, to other computers? https://forums.somethingawful.com/showthread.php?threadid=3871439 Someone recently posted about a project to render blender projects, if that’s more your thing.
|
# ? Oct 19, 2022 22:23 |
|
Rinkles posted:Idk if it means anything besides supplies being limited, but fwiw the 750 and 770 quickly go out of stock whenever they’re available at Newegg. Which, combined with the fact that people ARE SCALPING INTEL GPUS ON EBAY citation: leads me to believe there are a ton of scalpers who don't know what the gently caress they're doing, think "gpu price always go up" without actually understanding what's going on, and are in for a bad time. Intel will be fine; do not subsidize a lovely product, the scalpers are already doing that fine.
|
# ? Oct 19, 2022 22:31 |
|
SwissArmyDruid posted:Which, combined with the fact that people ARE SCALPING INTEL GPUS ON EBAY citation:
|
# ? Oct 19, 2022 22:40 |
|
It looks like precisely one scalped A770 has actually sold on ebay, for $650 + $25. Zero scalped A750s have moved at all.
|
# ? Oct 19, 2022 22:50 |
|
hobbesmaster posted:For the digital hoarders out there this means put your 8+ TB spinny drives in a NAS.

I love the convenience and expense of running two computers for all the functionality of one. As long as your mobo has SATA ports on it, I see no reason not to have at least one storage disk locally if it's going to save you 50-150W+ over running it in a NAS. Especially if you're doing something like local video editing.

Shipon posted:Honestly I watch enough YouTube on a second monitor while playing games that I wouldn't mind picking up an Intel card just for video decode, but apparently Chrome doesn't let you actually choose which GPU to use for hardware acceleration. I am sick of stuttering frames while watching videos and playing games.

I was reading somewhere else that this might also just be Chrome sucking major rear end; try Firefox if you're stuttering.

CatelynIsAZombie fucked around with this message at 23:41 on Oct 19, 2022 |
# ? Oct 19, 2022 23:09 |
|
CatelynIsAZombie posted:I love the convenience and expense of running two computers for all the functionality of one. As long as your mobo has SATA ports on it, I see no reason not to have at least one storage disk locally if it's going to save you 50-150W+ over running it in a NAS. Especially if you're doing something like local video editing.

A desktop Synology uses like 5 W plus the power usage of the hard drives. That’s less than the hit I take from raising the SOC voltage for memory.

hobbesmaster fucked around with this message at 01:05 on Oct 20, 2022 |
# ? Oct 20, 2022 00:08 |
|
i don't know anything about video editing but do people do it from spinners?
|
# ? Oct 20, 2022 00:09 |
|
shrike82 posted:i don't know anything about video editing but do people do it from spinners? Most editing rigs I see are SSDs for the project being worked on and a big spinny drive for other stuff. Then it all gets thrown on a backup system never to be looked at again.
|
# ? Oct 20, 2022 00:11 |
|
unpronounceable posted:If you’re thinking of things like Folding@home, where you can let your computer do some scientific processing while you aren’t using it, we have a thread for it.

Sorry, I am not being clear. I've just been told before that the technology exists to share a single GPU between computers. For example, my 4090 can "be available" and be apportioned according to need among multiple rigs. That is an interesting thought, because I'll only be saturating the card during certain time frames (high-end games at 4K/144), and the rest of the time I think it would be awesome to be able to apportion that power elsewhere within my network. I might just be making this whole thing up, though, or simply not understanding what people were talking about, which is very possible.
|
# ? Oct 20, 2022 00:17 |
|
Taima posted:Sorry, I am not being clear. You can have multiple GPUs assigned to different VMs that are accessed remotely, but I don't believe you can split a GPU? (edit: I guess you can but it seems finicky) LTT's done a bunch of videos on this if you want to go searching. Dr. Video Games 0031 fucked around with this message at 00:42 on Oct 20, 2022 |
# ? Oct 20, 2022 00:19 |
|
Taima posted:Sorry, I am not being clear. You're thinking about virtual machines, where multiple virtual PCs share access to the underlying physical hardware. You can do things like put multiple GPUs in a machine and assign one to each VM, or split up one GPU between multiple VMs. But it's a tremendous amount of yak shaving to really do so.
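For anyone curious what that yak shaving actually looks like: handing a whole GPU to one VM is usually done with VFIO/PCI passthrough. A minimal libvirt domain-XML fragment might look like the sketch below — the PCI address is a placeholder for whatever lspci reports for your card, and actually splitting one GPU between several VMs additionally needs vendor vGPU/SR-IOV support, which Nvidia largely gates to its pro cards.

```xml
<!-- libvirt guest XML fragment: pass the host GPU at PCI address
     0000:01:00.0 (placeholder) through to this VM via VFIO -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
</hostdev>
```

The host also needs IOMMU enabled in firmware and the vfio-pci driver bound to the card before the VM starts, which is where most of the yak shaving happens.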
|
# ? Oct 20, 2022 00:34 |
|
CatelynIsAZombie posted:I love the convenience and expense of running two computers for all the functionality of one.
|
# ? Oct 20, 2022 00:37 |
|
yeah the savings is from not having your PC on 24/7
|
# ? Oct 20, 2022 00:42 |
|
Well, irresponsible decisions were made. (Radeon 9700 Pro provided for scale)

The MSI 4090 Gaming Trio. The card is indeed very large, but I was honestly expecting it to be even bigger. This is a 3.5-slot design that also isn't unreasonably tall or long, and it fits comfortably into my Lancool III. It seems like it might be a 3090 Ti cooler design copy and pasted onto the 4090, which is fine by me. The thermal and acoustic performance is very reasonable. The fans seem to have good quality bearings and motors that make more of a whoosh sound than a whir, even at high RPMs. The hottest the card got when I was stressing it out with Furmark was 78C, and the fans were running at a very quiet 1400 RPM. Even at 1600 RPM, the fans are whisper quiet, so I'm surprised that their default fan curve is so loose. This may be the first card where I want to make the stock fan curve more aggressive.

It only came with a 3x8-pin-to-16-pin adapter, and the power limit tops out at 106% (477W?), which is not an issue for me. There is some coil whine when the card is under load, but it's quiet enough to not be audible over game audio. It may become more annoying if I ever want to move my PC onto my desk instead of tucked away in a corner to my left.

Stable Diffusion runs slower with the 4090 than it did with my previous card, so presumably there is something I need to configure differently or reinstall to make it take full advantage of the new card. Games perform as expected (extremely well). edit: I indeed had to update some stuff for SD to run at full speed. I posted about it in the GBS AI art thread.

I tested the card in TimeSpy Extreme and got 18.6k at stock and 19.5k with a +200/+1200 OC tossed on (which does not consume much more power since I didn't touch voltages). Not sure if I'll keep the OC. It seems stable and doesn't move the needle on power much, but it also doesn't move the needle on performance much, so meh. +200 brings the core clock to 2850. It crashed with +400. Didn't try any in between because I don't want to push things that much.

Dr. Video Games 0031 fucked around with this message at 22:19 on Oct 20, 2022 |
# ? Oct 20, 2022 01:25 |
|
Dr. Video Games 0031 posted:Well, irresponsible decisions were made. #jealous
|
# ? Oct 20, 2022 01:27 |
|
lol like the noctua browns hiding in the background
|
# ? Oct 20, 2022 01:28 |
|
I got the base Trio too. OK, random question: after slotting the card, the Nvidia Control Panel now has a new setting called... Change ECC State, which has to do with error correction or something. Should I enable that?
|
# ? Oct 20, 2022 01:45 |
|
Managed to get an order in today with Best Buy for the Gaming Trio as well. Scheduled for pickup Saturday, so hopefully it actually goes through. Will need to swap from my NR200 back to my Fractal Define S, however.
|
# ? Oct 20, 2022 01:57 |
|
shrike82 posted:lol like the noctua browns hiding in the background traded the mattress sheets for them
|
# ? Oct 20, 2022 02:05 |
|
please do not make fun of me for my storage bed
|
# ? Oct 20, 2022 02:17 |
|
Had an interesting article pop up in my feed and would be very interested in the thread's opinion: https://www.techradar.com/news/nvidia-rtx-4090-gpu-is-alarmingly-good-at-cracking-passwords Is this something that might shift the needle on the prevalence of password cracking, or is it simply an evolution of something that was already commonplace? The article suggests a standard 8-character password could be brute-forced within an hour with eight 4090s. That seems like a relatively low barrier to entry for somebody willing to spend a bit of cash to break into some specific (and presumably lucrative) targets.
|
# ? Oct 20, 2022 02:23 |
|
Ok boys we are switching from bitcoin mining to password mining.
|
# ? Oct 20, 2022 02:27 |
|
EGPUS ARE BACK ON THE MENU. Thunderbolt’s next spec triples bandwidth to 120Gbps—with a catch. The catch is that it's asymmetric: 120 out, 40 in.
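For scale, a quick back-of-envelope conversion of why that still matters for eGPUs. This uses raw link rates only, assumes PCIe 4.0's 128b/130b line encoding, and ignores protocol overhead on both sides, so treat it as a rough sketch rather than real-world throughput:

```python
# Rough comparison of the new Thunderbolt boost mode against a
# desktop PCIe 4.0 x16 slot. Raw link rates; real throughput is lower.
tb_boost_gbps = 120                        # one direction of the asymmetric mode
tb_gbs = tb_boost_gbps / 8                 # gigabits/s -> gigabytes/s = 15.0

# PCIe 4.0: 16 GT/s per lane, 16 lanes, 128b/130b line encoding
pcie4_x16_gbs = 16 * 16 * (128 / 130) / 8  # ~31.5 GB/s

print(f"TB boost: {tb_gbs:.1f} GB/s, PCIe 4.0 x16: {pcie4_x16_gbs:.1f} GB/s")
```

So even the boosted direction is roughly PCIe 4.0 x8 territory, which is workable for an eGPU in a way the old 40Gbps link really wasn't.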
|
# ? Oct 20, 2022 02:29 |
|
RGX posted:Had an interesting article pop up in my feed and would be very interested in the threads opinion:
|
# ? Oct 20, 2022 02:43 |
|
RGX posted:Had an interesting article pop up in my feed and would be very interested in the thread's opinion: They can do it in that short amount of time only after they've obtained the hash of a password, I assume? Once they've got their hands on that, something bad has already gone down, like a MITM attack or a leaked database, and then it's game over anyway. It's not like they could try to log into your Google account 3 billion times per second (at least I hope so)
|
# ? Oct 20, 2022 02:49 |
|
RGX posted:Had an interesting article pop up in my feed and would be very interested in the thread's opinion:

Offline password brute-forcing is an embarrassingly parallel problem; you can already crack passwords that quickly if you're willing to throw enough EC2 instances at it. Using local GPUs just makes it potentially cheaper. Also, using GPUs to do this is not a new idea; I knew some grad students at my university playing with the idea back in 2007. And the relative ease of offline brute-forcing means industry already assumes that leaked passwords are cracked passwords.

sauer kraut posted:They can do it in that short amount of time only after they obtained the hash of a password I assume? Once they got their hands on that, something bad has already gone down like a MITM attack or a leaked database, and then it's game over anyway.

Yeah, exactly.
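To put numbers on "that quickly": an offline attack is just keyspace divided by aggregate hash rate. A back-of-envelope sketch, assuming roughly 250 GH/s of NTLM per 4090 — a ballpark taken from published hashcat benchmarks, not something measured here:

```python
# Worst-case time for eight 4090s to exhaust every 8-character
# password over the 95 printable ASCII characters, against a fast
# unsalted hash like NTLM. The per-GPU rate is an assumption.
CHARSET = 95
LENGTH = 8
RATE_PER_GPU = 250e9          # hashes/second (assumed)
GPUS = 8

keyspace = CHARSET ** LENGTH  # ~6.6e15 candidates
hours = keyspace / (RATE_PER_GPU * GPUS) / 3600

print(f"{keyspace:.2e} candidates, ~{hours:.1f} h worst case")  # ~0.9 h
```

Slow, salted hashes like bcrypt cut that hash rate by many orders of magnitude, which is exactly why they exist.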
|
# ? Oct 20, 2022 02:52 |
|
sauer kraut posted:They can do it in that short amount of time only after they obtained the hash of a password I assume? Once they got their hands on that, something bad has already gone down and it's game over anyway. This really isn't news given the 4090's known performance improvement over the 3090Ti.
|
# ? Oct 20, 2022 02:53 |
|
A Silent Hill 2 remake in UE5 was announced today, and, well... https://store.steampowered.com/app/2124490/SILENT_HILL_2/ RIP to Pascal. You had a good run.
|
# ? Oct 20, 2022 05:04 |
|
Bit of a gulf between a 2080 and a 6800XT. Or maybe they’re accounting for ray tracing.
|
# ? Oct 20, 2022 05:08 |
|
are there any games that perform better with >16GB RAM these days? i'd assumed 32GB is a safe default but a quick google doesn't seem to indicate that it's necessary
|
# ? Oct 20, 2022 05:10 |
|
Taima posted:I got the base Trio too. ECC is one of those things where, if you don't already know that you need ECC, then you don't need it and you should leave it off. It will incur a performance penalty with GDDR6/6X, from my understanding. I'm not sure if exposing that setting for the 4090 is a mistake or not; it is usually something that only shows up with the Studio drivers and a Quadro or A/T-series pro GPU. For a home user, there really isn't any reason to toggle it on.
|
# ? Oct 20, 2022 05:11 |
|
Dr. Video Games 0031 posted:A Silent Hill 2 remake in UE5 was announced today, and, well... Lol RIP 3050 owners
|
# ? Oct 20, 2022 05:12 |
|
On the other hand:
|
# ? Oct 20, 2022 05:15 |
|
there isn't even a release window yet for the SH2 remake so i wouldn't read too much into those requirements yet but still lol
|
# ? Oct 20, 2022 05:17 |
|
Rinkles posted:Bit of a gulf between a 2080 and a 6800XT. Or maybe they’re accounting for ray tracing. if it's using lumen, then there's no avoiding it. the AMD cards will just be much worse than nvidia cards no matter what.
|
# ? Oct 20, 2022 05:22 |
|
Spent an hour upgrading from a generic 750 watt PSU to a much nicer MSI 850 watt model; undoing the cable management and redoing it was not very glamorous. I know that fully modular is the fancy way to go, but I don't think it's really necessary for the 24-pin and CPU connectors to be modular, as you pretty much need those anyway. I guess I'll stop being picky about which 4090 to throw in and use the Zotac 4090. I managed to order the Gigabyte OC model that is $1,699 before tax, but I may just let it lapse instead of picking it up.
|
# ? Oct 20, 2022 06:25 |
|
infraboy posted:Spent an hour upgrading from a generic 750 watt PSU to a pretty nicer MSI 850 watt model, was not very glamorous undoing cable management and redoing it. I know that fully modular is the fancy way to go but I don't really think it's really necessary to need the 24pin and CPU connectors as modular as you pretty much need those. the word you're looking for is "semi-modular". A lot more prevalent in SFX PSUs.
|
# ? Oct 20, 2022 07:05 |
|
shrike82 posted:are there any games that perform better on >16GB RAM these days? There are cases where it's credibly reported as useful: MS Flight Sim, large projects in Cities: Skylines, and leaky hack jobs like Tarkov, Just Cause 3/4, etc. If you have a second screen and/or a browser open while playing, I'd go for 32.
|
# ? Oct 20, 2022 09:37 |
|
|
Dr. Video Games 0031 posted:if it's using lumen, then there's no avoiding it. the AMD cards will just be much worse than nvidia cards no matter what. well, there's potentially avoiding hardware raytracing, depending on which lumen mode they use. remember, lumen can use hardware RT (accurate) or software RT (less accurate but potentially faster, especially on AMD)
|
# ? Oct 20, 2022 11:10 |