Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
So people have been talking about using distributed networking or something so you can lend the 4090, or parts of it, to other computers?

Can you actually do that? I am interested...


unpronounceable
Apr 4, 2010

You mean we still have another game to go through?!
Fallen Rib

Taima posted:

So people have been talking about using distributed networking or something so you can lend the 4090, or parts of it, to other computers?

Can you actually do that? I am interested...
If you're thinking of things like Folding@home, where you can let your computer do some scientific processing while you aren't using it, we have a thread for it.

https://forums.somethingawful.com/showthread.php?threadid=3871439

Someone recently posted about a project to render Blender projects, if that's more your thing.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Rinkles posted:

Idk if it means anything besides supplies being limited, but fwiw the 750 and 770 quickly go out of stock whenever they’re available at Newegg.

Which, combined with the fact that people ARE SCALPING INTEL GPUS ON EBAY, citation:

[screenshots of eBay listings]

leads me to believe that a ton of scalpers who don't know what the gently caress they're doing, and who think "gpu prices always go up" without actually understanding what's going on, are in for a bad time.

Intel will be fine; do not subsidize a lovely product, the scalpers are already doing that fine.

FlamingLiberal
Jan 18, 2009

Would you like to play a game?

SwissArmyDruid posted:

Which, combined with the fact that people ARE SCALPING INTEL GPUS ON EBAY, citation:

[screenshots of eBay listings]

leads me to believe that a ton of scalpers who don't know what the gently caress they're doing, and who think "gpu prices always go up" without actually understanding what's going on, are in for a bad time.

Intel will be fine; do not subsidize a lovely product, the scalpers are already doing that fine.
Yeah, I imagine this is NEW GPU = SCALP more than anything else.

DoombatINC
Apr 20, 2003

Here's the thing, I'm a feminist.

It looks like precisely one scalped A770 has actually sold on ebay, for $650+25

Zero scalped A750s have moved at all :cawg:

CatelynIsAZombie
Nov 16, 2006

I can't wait to bomb DO-DON-GOES!

hobbesmaster posted:

For the digital hoarders out there this means put your 8+ TB spinny drives in a NAS.

I love the convenience and expense of running two computers for all the functionality of one. As long as your mobo has SATA ports on it, I see no reason not to have at least one storage disk locally if it's going to save you the 50-150+ W of running it in a NAS. Especially if you're doing something like local video editing.

Shipon posted:

Honestly I watch enough YouTube on a second monitor while playing games that I wouldn't mind picking up an Intel card just for video decode, but apparently Chrome doesn't let you actually choose which GPU to use for hardware acceleration. I am sick of stuttering frames while watching videos and playing games

I was reading somewhere else that this might also just be Chrome sucking major rear end; try Firefox if you're stuttering.

CatelynIsAZombie fucked around with this message at 23:41 on Oct 19, 2022

hobbesmaster
Jan 28, 2008

CatelynIsAZombie posted:

I love the convenience and expense of running two computers for all the functionality of one. As long as your mobo has SATA ports on it, I see no reason not to have at least one storage disk locally if it's going to save you the 50-150+ W of running it in a NAS. Especially if you're doing something like local video editing.


A desktop Synology uses like 5 W plus the power usage of the hard drives.

That’s less than the hit I take from raising the SOC voltage for :rice: memory.

hobbesmaster fucked around with this message at 01:05 on Oct 20, 2022

shrike82
Jun 11, 2005

i don't know anything about video editing but do people do it from spinners?

njsykora
Jan 23, 2012

Robots confuse squirrels.


shrike82 posted:

i don't know anything about video editing but do people do it from spinners?

Most editing rigs I see are SSDs for the project being worked on and a big spinny drive for other stuff. Then it all gets thrown on a backup system never to be looked at again.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

unpronounceable posted:

If you're thinking of things like Folding@home, where you can let your computer do some scientific processing while you aren't using it, we have a thread for it.

https://forums.somethingawful.com/showthread.php?threadid=3871439

Someone recently posted about a project to render Blender projects, if that's more your thing.

Sorry, I am not being clear.

I've just been told before that the technology exists to share a single GPU between computers.

For example my 4090 can "be available" and apportioned according to need among multiple rigs.

That is an interesting thought because I'll only be saturating the card during certain time frames (high end games at 4K/144) and the rest of the time I think it would be awesome to be able to apportion that power elsewhere within my network.

I might just be making this whole thing up though, or simply not understanding what people were talking about, which is very possible.

Dr. Video Games 0031
Jul 17, 2004

Taima posted:

Sorry, I am not being clear.

I've just been told before that the technology exists to share a single GPU between computers.

For example my 4090 can "be available" and apportioned according to need among multiple rigs.

That is an interesting thought because I'll only be saturating the card during certain time frames (high end games at 4K/144) and the rest of the time I think it would be awesome to be able to apportion that power elsewhere within my network.

I might just be making this whole thing up though, or simply not understanding what people were talking about, which is very possible.

You can have multiple GPUs assigned to different VMs that are accessed remotely, but I don't believe you can split a GPU? (edit: I guess you can but it seems finicky)

LTT's done a bunch of videos on this if you want to go searching.

Dr. Video Games 0031 fucked around with this message at 00:42 on Oct 20, 2022

Gwaihir
Dec 8, 2009
Hair Elf

Taima posted:

Sorry, I am not being clear.

I've just been told before that the technology exists to share a single GPU between computers.

For example my 4090 can "be available" and apportioned according to need among multiple rigs.

That is an interesting thought because I'll only be saturating the card during certain time frames (high end games at 4K/144) and the rest of the time I think it would be awesome to be able to apportion that power elsewhere within my network.

I might just be making this whole thing up though, or simply not understanding what people were talking about, which is very possible.

You're thinking about virtual machines, where multiple virtual PCs share access to the underlying physical hardware. You can do things like put multiple GPUs in a machine and assign one to each VM, or split up one GPU between multiple VMs. But it's a tremendous amount of yak shaving to really do so.
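To make that concrete: on a typical Linux passthrough setup, a GPU handed to a VM gets bound to the vfio-pci driver instead of the normal nvidia/amdgpu one. A minimal sketch for checking which driver owns each display adapter, assuming a Linux host with sysfs (the script itself is illustrative, not from any particular passthrough guide):

code:
#!/usr/bin/env python3
"""Sketch: list display adapters and the kernel driver bound to each.

On a VFIO passthrough host, a GPU assigned to a VM shows up bound to
vfio-pci rather than nvidia/amdgpu. Assumes a Linux host.
"""
import os

PCI_ROOT = "/sys/bus/pci/devices"

for dev in sorted(os.listdir(PCI_ROOT)):
    path = os.path.join(PCI_ROOT, dev)
    with open(os.path.join(path, "class")) as f:
        pci_class = f.read().strip()
    if not pci_class.startswith("0x03"):  # 0x03xxxx = display controller
        continue
    driver_link = os.path.join(path, "driver")
    driver = (os.path.basename(os.readlink(driver_link))
              if os.path.islink(driver_link) else "(unbound)")
    print(f"{dev}  class={pci_class}  driver={driver}")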

Shumagorath
Jun 6, 2001

CatelynIsAZombie posted:

I love the convenience and expense of running two computers for all the functionality of one.
The resiliency you can get with a good ZFS system plus the convenience of not having to turn my gaming PC on for Plex far outweigh the upkeep. I also save ~40 watts of 24/7 power draw.

shrike82
Jun 11, 2005

yeah, the savings come from not having your PC on 24/7
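Back-of-the-envelope, since the thread keeps trading wattage numbers (the 80 W idle figure and the electricity rate are both assumptions; plug in your own):

code:
# Rough annual cost of 24/7 power draw. Wattages and $/kWh are assumptions.
HOURS_PER_YEAR = 24 * 365  # 8760
RATE = 0.15                # USD per kWh; adjust for your utility

for label, watts in [("idle gaming PC", 80), ("desktop NAS + drives", 40)]:
    kwh = watts * HOURS_PER_YEAR / 1000
    print(f"{label}: {watts} W -> {kwh:.0f} kWh/yr -> ${kwh * RATE:.0f}/yr")

# idle gaming PC: 80 W -> 701 kWh/yr -> $105/yr
# desktop NAS + drives: 40 W -> 350 kWh/yr -> $53/yr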

Dr. Video Games 0031
Jul 17, 2004

Well, irresponsible decisions were made.

[photo: the MSI 4090 Gaming Trio next to a Radeon 9700 Pro, provided for scale]

The MSI 4090 Gaming Trio. The card is indeed very large, but I was honestly expecting it to be even bigger. This is a 3.5-slot design that also isn't unreasonably tall or long, and it fits comfortably into my Lancool III. It seems like it might be the 3090 Ti cooler design copy-pasted onto the 4090, which is fine by me.

The thermal and acoustic performance is very reasonable. The fans seem to have good-quality bearings and motors that make more of a whoosh sound than a whir, even at high RPMs. The hottest the card got when I was stressing it out with FurMark was 78C, and the fans were running at a very quiet 1400 RPM. Even at 1600 RPM, the fans are whisper quiet, so I'm surprised that their default fan curve is so loose. This may be the first card where I want to make the stock fan curve more aggressive.

It only came with a 3x8-pin-to-16-pin adapter, and the power limit tops out at 106% (477 W), which is not an issue for me. There is some coil whine when the card is under load, but it's quiet enough to not be audible over game audio. It may become more annoying if I ever want to move my PC onto my desk instead of tucked away in a corner to my left.

Stable Diffusion runs slower with the 4090 than it did with my previous card, so presumably there is something I need to configure differently or reinstall to make it take full advantage of the new card. Games perform as expected (extremely well).

edit: I indeed had to update some stuff for SD to run at full speed. I posted about it in the GBS AI art thread. I tested the card in TimeSpy Extreme and got 18.6k at stock and 19.5k with a +200/+1200 OC tossed on (which does not consume much more power since I didn't touch voltages). Not sure if I'll keep the OC. It seems stable and doesn't move the needle on power much, but it also doesn't move the needle on performance much, so meh. +200 brings the core clock to 2850 MHz. It crashed at +400; didn't try anything in between because I don't want to push things that much.

Dr. Video Games 0031 fucked around with this message at 22:19 on Oct 20, 2022
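For anyone else seeing Stable Diffusion regress on a 40-series card: the usual suspect is a PyTorch build without Ada (sm_89) kernels, so a quick check along these lines can save some head-scratching (the torch calls are real APIs; reading the output as "needs a newer CUDA 11.8+ build" is the assumption):

code:
import torch

# Ada Lovelace (RTX 40-series) is compute capability (8, 9). Older PyTorch
# wheels built against pre-11.8 CUDA ship no sm_89 kernels and can run the
# card on slower fallback paths.
print("torch:", torch.__version__, "| built for CUDA:", torch.version.cuda)
print("device:", torch.cuda.get_device_name(0))
print("compute capability:", torch.cuda.get_device_capability(0))
print("kernel arch list:", torch.cuda.get_arch_list())  # want 'sm_89' in here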

MarcusSA
Sep 23, 2007

Dr. Video Games 0031 posted:

Well, irresponsible decisions were made.

[photo: the MSI 4090 Gaming Trio next to a Radeon 9700 Pro, provided for scale]

The MSI 4090 Gaming Trio. The card is indeed very large, but I was honestly expecting it to be even bigger. This is a 3.5-slot design that also isn't unreasonably tall or long, and it fits comfortably into my Lancool III. It seems like it might be the 3090 Ti cooler design copy-pasted onto the 4090, which is fine by me.

The thermal and acoustic performance is very reasonable. The fans seem to have good-quality bearings and motors that make more of a whoosh sound than a whir, even at high RPMs. The hottest the card got when I was stressing it out with FurMark was 78C, and the fans were running at a very quiet 1400 RPM. Even at 1600 RPM, the fans are whisper quiet, so I'm surprised that their default fan curve is so loose. This may be the first card where I want to make the stock fan curve more aggressive.

It only came with a 3x8-pin-to-16-pin adapter, and the power limit tops out at 106% (477 W), which is not an issue for me. There is some coil whine when the card is under load, but it's quiet enough to not be audible over game audio. It may become more annoying if I ever want to move my PC onto my desk instead of tucked away in a corner to my left.

Stable Diffusion runs slower with the 4090 than it did with my previous card, so presumably there is something I need to configure differently or reinstall to make it take full advantage of the new card. Games perform as expected (extremely well).

#jealous

shrike82
Jun 11, 2005

lol like the noctua browns hiding in the background

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
I got the base Trio too.

Ok, random question: after slotting the card, the Nvidia Control Panel now has a new setting called... Change ECC State, which has to do with error correction or something. Should I enable that?

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
Managed to get an order in today with Best Buy for the Gaming Trio as well. Scheduled to pick up Saturday, so hopefully it actually goes through. Will need to swap from my NR200 back to my Fractal Define S, however.

Inept
Jul 8, 2003

shrike82 posted:

lol like the noctua browns hiding in the background

traded the mattress sheets for them

Dr. Video Games 0031
Jul 17, 2004

please do not make fun of me for my storage bed

RGX
Sep 23, 2004
Unstoppable
Had an interesting article pop up in my feed and would be very interested in the thread's opinion:

https://www.techradar.com/news/nvidia-rtx-4090-gpu-is-alarmingly-good-at-cracking-passwords

Is this something that might shift the metric on the prevalence of password cracking, or is it simply an evolution of something that was already commonplace? The article suggests a standard 8-character password could be brute-forced within an hour with eight 4090s. That seems like a relatively low barrier to entry for somebody willing to spend a bit of cash to break into some specific (and presumably lucrative) targets.

MarcusSA
Sep 23, 2007

Ok boys we are switching from bitcoin mining to password mining.

SwissArmyDruid
Feb 14, 2014

by sebmojo
EGPUS ARE BACK ON THE MENU,

Thunderbolt’s next spec triples bandwidth to 120Gbps—with a catch

The catch is that it's asymmetric: 120 Gbps out, 40 Gbps in.

Shumagorath
Jun 6, 2001

RGX posted:

Had an interesting article pop up in my feed and would be very interested in the thread's opinion:

https://www.techradar.com/news/nvidia-rtx-4090-gpu-is-alarmingly-good-at-cracking-passwords

Is this something that might shift the metric on the prevalence of password cracking, or is it simply an evolution of something that was already commonplace? The article suggests a standard 8-character password could be brute-forced within an hour with eight 4090s. That seems like a relatively low barrier to entry for somebody willing to spend a bit of cash to break into some specific (and presumably lucrative) targets.
It's not a huge deal. 2x better at any given algo doesn't change what you're able to realistically attack, and you'll get more coverage out of a better dictionary + rule set.

sauer kraut
Oct 2, 2004

RGX posted:

Had an interesting article pop up in my feed and would be very interested in the thread's opinion:

https://www.techradar.com/news/nvidia-rtx-4090-gpu-is-alarmingly-good-at-cracking-passwords

Is this something that might shift the metric on the prevalence of password cracking, or is it simply an evolution of something that was already commonplace? The article suggests a standard 8-character password could be brute-forced within an hour with eight 4090s. That seems like a relatively low barrier to entry for somebody willing to spend a bit of cash to break into some specific (and presumably lucrative) targets.

They can do it in that short amount of time only after they've obtained the hash of a password, I assume? Once they've got their hands on that, something bad has already gone down, like a MITM attack or a leaked database, and then it's game over anyway.
It's not like they could try to log in to your Google account 3 billion times per second (at least I hope so).

acksplode
May 17, 2004

RGX posted:

Had an interesting article pop up in my feed and would be very interested in the thread's opinion:

https://www.techradar.com/news/nvidia-rtx-4090-gpu-is-alarmingly-good-at-cracking-passwords

Is this something that might shift the metric on the prevalence of password cracking, or is it simply an evolution of something that was already commonplace? The article suggests a standard 8-character password could be brute-forced within an hour with eight 4090s. That seems like a relatively low barrier to entry for somebody willing to spend a bit of cash to break into some specific (and presumably lucrative) targets.

Offline password brute-forcing is an embarrassingly parallelizable problem; you can already crack passwords that quickly if you're willing to throw enough EC2 instances at it. Using local GPUs just makes it potentially cheaper. Also, using GPUs to do this is not a new idea; I knew some grad students at my university playing with it back in 2007. And the relative ease of offline brute-forcing means the industry already assumes that leaked passwords are cracked passwords.

sauer kraut posted:

They can do it in that short amount of time only after they've obtained the hash of a password, I assume? Once they've got their hands on that, something bad has already gone down, like a MITM attack or a leaked database, and then it's game over anyway.
It's not like they could try to log in to your Google account 3 billion times per second (at least I hope so).

Yeah exactly.

Shumagorath
Jun 6, 2001

sauer kraut posted:

They can do it in that short amount of time only after they've obtained the hash of a password, I assume? Once they've got their hands on that, something bad has already gone down and it's game over anyway.
It's not like they could try to log in to your Google account 3 billion times per second (at least I hope so).
It depends on the context. NTLM is something you could conceivably do live-to-air. A bcrypt at-rest database might let an attacker get into some accounts before the admins issue a force-reset with secondary authentication. Any database can also provide hashes you can save for later and a good sample of human-generated passwords you can re-use elsewhere.

This really isn't news given the 4090's known performance improvement over the 3090 Ti.
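The article's headline number is easy to sanity-check. A worked example, taking the widely circulated ~300 GH/s hashcat NTLM figure for a single 4090 as an assumption:

code:
# Sanity check: eight 4090s vs. an 8-character password, NTLM.
# ~300 GH/s per card is the circulating hashcat benchmark (assumption).
charset = 95                   # printable ASCII
keyspace = charset ** 8        # ~6.63e15 candidates
rate = 8 * 300e9               # eight cards, hashes per second
hours = keyspace / rate / 3600
print(f"keyspace {keyspace:.2e}, worst case {hours:.1f} h")  # ~0.8 h

# Against a deliberately slow hash like bcrypt, the same search takes
# orders of magnitude longer, which is the point made above.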

Dr. Video Games 0031
Jul 17, 2004

A Silent Hill 2 remake in UE5 was announced today, and, well...

https://store.steampowered.com/app/2124490/SILENT_HILL_2/

[screenshot of the Steam page's system requirements]
RIP to Pascal. You had a good run.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Bit of a gulf between a 2080 and a 6800XT. Or maybe they’re accounting for ray tracing.

shrike82
Jun 11, 2005

are there any games that perform better with >16GB of RAM these days?

i'd assumed 32GB is a safe default these days, but a quick google doesn't seem to indicate that it's necessary

Cygni
Nov 12, 2005

raring to post

Taima posted:

I got the base Trio too.

Ok, random question: after slotting the card, the Nvidia Control Panel now has a new setting called... Change ECC State, which has to do with error correction or something. Should I enable that?

ECC is one of those things where, if you don't already know that you need it, you don't need it and you should leave it off. From my understanding, it will incur a performance penalty with GDDR6/6X.

I'm not sure if exposing that variable for the 4090 is a mistake or not; it's usually something that only shows up with the Studio drivers and a Quadro or A/T-series pro GPU. For a home user, there really isn't any reason to toggle it on.
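If you're curious what the toggle maps to under the hood, the driver exposes the same state through nvidia-smi. A hedged sketch (the nvidia-smi flags are real; how fully the consumer 4090 driver honors them is the assumption):

code:
import subprocess

# Query current and pending ECC mode.
print(subprocess.run(["nvidia-smi", "-q", "-d", "ECC"],
                     capture_output=True, text=True).stdout)

# Toggling is `nvidia-smi -e 0` (off) / `-e 1` (on); it needs admin rights
# and a reboot to take effect. On GDDR6/6X, ECC is implemented in-band
# (it borrows normal memory capacity and bandwidth), which is where the
# performance penalty mentioned above comes from.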

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

Dr. Video Games 0031 posted:

A Silent Hill 2 remake in UE5 was announced today, and, well...

https://store.steampowered.com/app/2124490/SILENT_HILL_2/

[screenshot of the Steam page's system requirements]
RIP to Pascal. You had a good run.

Lol RIP 3050 owners

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
On the other hand:

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.
there isn't even a release window for the SH2 remake yet, so i wouldn't read too much into those requirements, but still lol

Dr. Video Games 0031
Jul 17, 2004

Rinkles posted:

Bit of a gulf between a 2080 and a 6800XT. Or maybe they’re accounting for ray tracing.

if it's using lumen, then there's no avoiding it. the AMD cards will just be much worse than nvidia cards no matter what.

infraboy
Aug 15, 2002

Phungshwei!!!!!!1123
Spent an hour upgrading from a generic 750 watt PSU to a much nicer MSI 850 watt model; undoing the cable management and redoing it was not very glamorous. I know that fully modular is the fancy way to go, but I don't think the 24-pin and CPU connectors really need to be modular, as you pretty much always need those.

I guess I'll stop being picky about which 4090 to throw in and use the Zotac 4090. I managed to order the Gigabyte OC model that is $1,699 before tax, but I may just let it lapse instead of picking it up.

SwissArmyDruid
Feb 14, 2014

by sebmojo

infraboy posted:

Spent an hour upgrading from a generic 750 watt PSU to a much nicer MSI 850 watt model; undoing the cable management and redoing it was not very glamorous. I know that fully modular is the fancy way to go, but I don't think the 24-pin and CPU connectors really need to be modular, as you pretty much always need those.

I guess I'll stop being picky about which 4090 to throw in and use the Zotac 4090. I managed to order the Gigabyte OC model that is $1,699 before tax, but I may just let it lapse instead of picking it up.

The word you're looking for is "semi-modular". A lot more prevalent in SFX PSUs.

sauer kraut
Oct 2, 2004

shrike82 posted:

are there any games that perform better with >16GB of RAM these days?

i'd assumed 32GB is a safe default these days, but a quick google doesn't seem to indicate that it's necessary

There are cases where it's credibly reported as useful: MS Flight Sim, large projects in Cities Skylines, and leaky hack jobs like Tarkov, Just Cause 3/4, etc.

If you have a second screen and/or a browser open while playing, I'd go for 32.
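If you'd rather measure than guess, a logging loop like this run during a play session will show whether you're actually brushing up against 16GB (psutil is a third-party package; the 5-second poll interval is arbitrary):

code:
import time
import psutil  # pip install psutil

# Log peak system memory use while you play: start it, game for a while,
# then Ctrl+C to see the high-water mark.
peak = 0
try:
    while True:
        peak = max(peak, psutil.virtual_memory().used)
        time.sleep(5)
except KeyboardInterrupt:
    print(f"peak RAM used: {peak / 2**30:.1f} GiB")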


repiv
Aug 13, 2009

Dr. Video Games 0031 posted:

if it's using lumen, then there's no avoiding it. the AMD cards will just be much worse than nvidia cards no matter what.

well there's potentially avoiding hardware raytracing, depending on which lumen mode they use

remember lumen can use hardware RT (accurate) or software RT (less accurate but potentially faster, especially on AMD)
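For reference, which Lumen path a UE5 game uses is a developer-side choice, but it comes down to a single cvar. A sketch of the relevant Engine.ini lines (the cvar name is UE5's; whether any given game, SH2 included, lets you override it is an assumption):

code:
[SystemSettings]
; Lumen with hardware ray tracing (more accurate):
r.Lumen.HardwareRayTracing=1
; Set to 0 to force Lumen's software path (distance fields), which is
; less accurate but often faster, especially on AMD cards.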
