AzureSkys
Apr 27, 2003

Kind of a dumb question, but I need to use a few more monitors with my RTX 3090 (which has 4 connectors, and I need two more). I have an old GTX 660 Ti that fits OK in my motherboard and case.

However, the drivers for the RTX 30 series don't include the GTX 600 series so it's not being recognized by Windows 10 x64. I don't see any RTX 30 series drivers that do. Installing any that are compatible with the 660 then disables the 3090.

Is there any way to get them both to function or is it just not doable without the driver support working for both?

It's not a big deal, I'm just experimenting with some options and wanted to see if I can use this old hardware first.


orange juche
Mar 14, 2012



AzureSkys posted:

Kind of a dumb question, but I need to use a few more monitors with my RTX 3090 (which has 4 connectors, and I need two more). I have an old GTX 660 Ti that fits OK in my motherboard and case.

However, the drivers for the RTX 30 series don't include the GTX 600 series so it's not being recognized by Windows 10 x64. I don't see any RTX 30 series drivers that do. Installing any that are compatible with the 660 then disables the 3090.

Is there any way to get them both to function or is it just not doable without the driver support working for both?

It's not a big deal, I'm just experimenting with some options and wanted to see if I can use this old hardware first.

Not really, unless you want to dabble in custom-edited drivers, and you don't want to do that: unsigned drivers are a pretty big security risk.

Get a newer second GPU that fits in your case; there are single-card solutions new enough to be supported, I think, if card height is your concern.

Yudo
May 15, 2003

I have the terrible feeling that fsr 3 isn't coming until the middle of rdna 3's life cycle. That isn't going to help in terms of developer adoption, though apparently fsr is easy to implement. Maybe if it runs on rdna 2 it will end up on consoles.

Arivia
Mar 17, 2011

AzureSkys posted:

Kind of a dumb question, but I need to use a few more monitors with my RTX 3090 (which has 4 connectors, and I need two more). I have an old GTX 660 Ti that fits OK in my motherboard and case.

However, the drivers for the RTX 30 series don't include the GTX 600 series so it's not being recognized by Windows 10 x64. I don't see any RTX 30 series drivers that do. Installing any that are compatible with the 660 then disables the 3090.

Is there any way to get them both to function or is it just not doable without the driver support working for both?

It's not a big deal, I'm just experimenting with some options and wanted to see if I can use this old hardware first.

Do you have an integrated gpu on your cpu you could use, if the last two monitors don’t need much graphics acceleration?

AzureSkys
Apr 27, 2003

orange juche posted:

Not really, unless you want to dabble in custom-edited drivers, and you don't want to do that: unsigned drivers are a pretty big security risk.

That's sorta what I thought.

Arivia posted:

Do you have an integrated gpu on your cpu you could use, if the last two monitors don’t need much graphics acceleration?
Unfortunately I don't have one with my Ryzen 9 5950X and motherboard.

I'm using Spacedesk to run a laptop as the extra screens over my home network, but there are some latency issues. I also have a TripleHead2Go, but it treats the extra monitors as a single display, which makes certain window placement finicky.
I'll keep my eye out for a newer simple GPU that would fit. I think I have access to a 1080 that I may try out, if it'll fit and not cause heating concerns.

It's for Flight Simulator instrument panel displays which gets a bit ridiculous.

orange juche
Mar 14, 2012



Yudo posted:

I have the terrible feeling that fsr 3 isn't coming until the middle of rdna 3's life cycle. That isn't going to help in terms of developer adoption, though apparently fsr is easy to implement. Maybe if it runs on rdna 2 it will end up on consoles.

FSR3 does run on RDNA2. It's an open standard, and AMD specifically stated it would run on older hardware; they'd be silly not to, since current consoles are RDNA2-based and the lion's share of their graphics hardware goes into consoles.

Arivia
Mar 17, 2011
It sounds like you could buy a used 1050 for peanuts to do it with.

Stanley Pain
Jun 16, 2001


It's always the crazy *sim folks ;)

repiv
Aug 13, 2009

Yudo posted:

I have the terrible feeling that fsr 3 isn't coming until the middle of rdna 3's life cycle. That isn't going to help in terms of developer adoption, though apparently fsr is easy to implement. Maybe if it runs on rdna 2 it will end up on consoles.

it's pretty obvious they announced it very early in development; the discussion of it at GDC was little more than "yeah we're working on it", with no examples or performance numbers given

they still need to come up with their own version of reflex in order to make interpolation actually palatable

Animal
Apr 8, 2003

I imagine the increase in frame generation latency is more pronounced when you have something like 45 real frames generating up to 60. But as I am looking at it now on MSFS, Plague Tale, and CP2077 on a 4090, I would not be able to pass a double blind test. And I’m very sensitive to stuff like artifacting, micro stutters and input lag. Maybe I would be able to notice in a twitch game like Overwatch, but those competitive FPS’ tend to have very high FPS on mid-end hardware so frame generation is not necessary. Very exciting technology, would be amazing to have a Nintendo Switch 2 that can take Zelda from 40fps to 60fps.

Yudo
May 15, 2003

Animal posted:

I imagine the increase in frame generation latency is more pronounced when you have something like 45 real frames generating up to 60. But as I am looking at it now on MSFS, Plague Tale, and CP2077 on a 4090, I would not be able to pass a double blind test. And I'm very sensitive to stuff like artifacting, micro stutters and input lag. Maybe I would be able to notice in a twitch game like Overwatch, but those competitive FPS' tend to have very high FPS on mid-end hardware so frame generation is not necessary. Very exciting technology, would be amazing to have a Nintendo Switch 2 that can take Zelda from 40fps to 60fps.

I have seen it demonstrated that frame generation works best when the number of "real frames" is already adequate, or close to it. When interpolation is applied with fewer real frames, latency increases, and I'd guess artifacting does as well, though that is harder to measure and harder to demonstrate via compressed video.
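
For what it's worth, here's a rough back-of-the-envelope sketch of why the base frame rate matters so much. The one-frame-of-buffering model and the 3 ms generation cost are my own assumptions, not measured numbers:
code:
# Rough model: interpolation has to hold the newest real frame until the next
# one arrives, so displayed latency grows by roughly one base frame time,
# plus whatever the generated frame itself costs to produce.
def added_latency_ms(base_fps: float, gen_cost_ms: float = 3.0) -> float:
    """Approximate extra latency from frame interpolation at a given base rate."""
    return 1000.0 / base_fps + gen_cost_ms

for fps in (30, 45, 60, 90, 120):
    print(f"{fps:>3} real fps -> ~{added_latency_ms(fps):.1f} ms extra latency")
The penalty roughly halves every time the base frame rate doubles, which lines up with "it works best when there are already adequate real frames."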

Animal
Apr 8, 2003

It's a pretty great first iteration. I'm looking forward to seeing it mature. DLSS 2 was a big jump from when it was first released.

mdxi
Mar 13, 2006


AzureSkys posted:

I have an old GTX 660ti

You've already gotten a functionally correct answer, but just to explain the why: the GTX 600 series is Kepler, and I believe current driver support only goes back to Turing.

repiv
Aug 13, 2009

AzureSkys posted:

That's sorta what I thought.

Unfortunately I don't have one with my Ryzen 9 5950X and motherboard.

I'm using Spacedesk to run a laptop as the extra screens over my home network, but there are some latency issues. I also have a TripleHead2Go, but it treats the extra monitors as a single display, which makes certain window placement finicky.
I'll keep my eye out for a newer simple GPU that would fit. I think I have access to a 1080 that I may try out, if it'll fit and not cause heating concerns.

It's for Flight Simulator instrument panel displays which gets a bit ridiculous.


depending on the specifics of the monitors you might be able to use a displayport MST hub to pull it off without a second graphics card

the catch is that the combination of monitors you attach to the hub can't exceed the bandwidth limit of a single displayport output

Shipon
Nov 7, 2005

repiv posted:

depending on the specifics of the monitors you might be able to use a displayport MST hub to pull it off without a second graphics card

the catch is that the combination of monitors you attach to the hub can't exceed the bandwidth limit of a single displayport output

If he's just using the extra outputs for pop-out flight displays those really shouldn't be too much of a bandwidth hit - those are probably something like 1280x1024 or whatever anyway.
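
To put some very rough numbers on that: the resolutions, refresh rates, and 20% blanking overhead below are hypothetical, and the ~25.9 Gbit/s figure is the commonly quoted effective payload of a four-lane DP 1.4 HBR3 link, so treat this as arithmetic, not gospel.
code:
# Ballpark MST bandwidth check: total pixel bandwidth of everything on the hub
# has to stay under what a single DisplayPort output can carry.
DP14_EFFECTIVE_GBPS = 25.92  # 4 lanes x 8.1 Gbit/s HBR3, minus 8b/10b encoding

def monitor_gbps(width, height, refresh_hz, bpp=24, blanking=1.2):
    """Approximate link bandwidth one monitor needs, in Gbit/s."""
    return width * height * refresh_hz * bpp * blanking / 1e9

# hypothetical mix: one 2560x1440@144 panel plus two 1280x1024@60 instrument displays
monitors = [(2560, 1440, 144), (1280, 1024, 60), (1280, 1024, 60)]
total = sum(monitor_gbps(*m) for m in monitors)
print(f"total ~= {total:.1f} Gbit/s vs ~{DP14_EFFECTIVE_GBPS} Gbit/s available")
A couple of low-res instrument panels barely register next to a main display, so an MST hub should have plenty of headroom in that kind of scenario.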

orcane
Jun 13, 2012


mdxi posted:

You've already gotten a functionally correct answer, but just to explain the why: the GTX 600 series is Kepler, and I believe current driver support only goes back to Turing.
Maxwell and newer, as in the GTX 900 series and the GTX 750 variants (745, 750, 750 Ti), but not the rest of the GTX 700 series.
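
As a quick unofficial cheat sheet of the generations being discussed (the mapping is from memory, so double-check NVIDIA's own support list before relying on it):
code:
# Cards mentioned in the thread and the architecture each belongs to.
ARCH_BY_CARD = {
    "GTX 660 Ti": "Kepler",
    "GTX 780":    "Kepler",        # most of the 700 series
    "GTX 750 Ti": "Maxwell",       # 745/750/750 Ti are the exceptions
    "GTX 970":    "Maxwell",
    "GTX 1080":   "Pascal",
    "RTX 2070":   "Turing",
    "RTX 3090":   "Ampere",
    "RTX 4090":   "Ada Lovelace",
}

# per the correction above: the current mainline driver branch goes back to Maxwell
SUPPORTED_ARCHES = {"Maxwell", "Pascal", "Turing", "Ampere", "Ada Lovelace"}

for card, arch in ARCH_BY_CARD.items():
    status = "current driver" if arch in SUPPORTED_ARCHES else "legacy driver only"
    print(f"{card:<11} {arch:<13} {status}")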

Canned Sunshine
Nov 20, 2005




Paul MaudDib posted:

you know, people put a very negative spin on NVIDIA doing their damndest to make sure there's inventory on shelves at MSRP on launch, and then even throwing in a gift card as a promo. Like, you can also view that as recognition that price is what matters right now and not loving around with 3 months of above-MSRP cards while the channel fills/etc, like, that's actually kinda respecting consumers, and they wrote big checks and kicked some partner asses (who promptly came grumbling to tech media ofc). The tonal difference with how people interpret AMD and NVIDIA's actions is kind of extreme.

like literally AMD did/still is? doing the same thing with free DDR5 kits on AM5, and it's so nice that they're trying to get people over the hump of lovely mobo costs with a solid kit of RAM, etc. Or TLOU on any GPU, an actually good game bundle-in (and this is even a $150 RX 6400 - buy a $150 HTPC card/server card and get a $60 game, if that was a day-1 buy for you). NVIDIA promo: "why is it going below MSRP if it's such a great value".

(and yeah AM5 obviously isn't the model of high sales, but like the 4070+Steam card it's reached the point of "ok that's actually generous and (if you can get the deal) this becomes a reasonable/compelling buy compared to AM4/Ampere/RDNA2 actually" and people still don't want it. It's the "2070S moment", people are so invested in "AM5 too expensive" that they don't weigh the slightly more expensive mobo against a free decent ram kit (which gets you over the hump of "but I have some old DDR4" too). And yeah it's been a bad week but AMD will get past this, it's a lot of noise but it's getting fixed. And that's the tonal difference again lol.)

I dunno what the wholesale price of a Steam wallet card is; unlike a RAM kit, it's probably not something that gets much cheaper in bulk, so it may be a case where Microcenter is paying part of it and taking zero margin on the sale to get people in the door buying a power strip/etc, but I'd really think NVIDIA is kicking in some money here too, just like AMD. Microcenter is great, but they're not "slap a $100 promo on every GPU" level generous on their own; they're just very willing to do dealz like this given their focus on foot traffic, which prevents the internet from trivially farming/reselling the deals.

NVIDIA is still doing their damndest to not give on the RAM though, and while 12GB is fine for a while it really is time to start moving the stack upwards. 12GB for $600, 16GB for $1200, 24GB for $1600 is still pretty steep. $500 for 12GB... getting reasonable. But it's just also not that expensive to add more.

You do not, in fact, have to give it to them. Nvidia is not your friend (and neither is AMD).

Miguel Prado
Nov 5, 2008


So I got purple lines all over my monitor and device manager has disabled my GPU citing error code 43. Happened as I was booting up Jedi Survivor, been playing it fine all week.

Can anybody point me in the right direction to where I can get some help?

Twerk from Home
Jan 17, 2009


Miguel Prado posted:

So I got purple lines all over my monitor and device manager has disabled my GPU citing error code 43. Happened as I was booting up Jedi Survivor, been playing it fine all week.

Can anybody point me in the right direction to where I can get some help?

If a reboot doesn't fix it, you likely have a dead GPU on your hands. That's how VRAM dies.

Breetai
Nov 6, 2005


orcane posted:

Maxwell and newer, as in the GTX 900 series and the GTX 750 variants (745, 750, 750 Ti), but not the rest of the GTX 700 series.

Huh. Is dropped support just a case of not being able to play games released after support ended, or is it a case of critical security concerns/a potential attack vector for malicious code? Now that I've got my new rig, my old one with a 970 is connected to my TV as a living room media centre/couch co-op gaming platform, and I was just wondering if it's likely to have a limited shelf life, as apparently that card's architecture is next on the chopping block.

hobbesmaster
Jan 28, 2008

Breetai posted:

Huh. Is dropped support just a case of not being able to play games released after support ended, or is it a case of critical security concerns/a potential attack vector for malicious code? Now that I've got my new rig, my old one with a 970 is connected to my TV as a living room media centre/couch co-op gaming platform, and I was just wondering if it's likely to have a limited shelf life, as apparently that card's architecture is next on the chopping block.

It mainly means they use different drivers. The other thing is those “game ready” updates won’t be available for old cards but I doubt they do much for older generations anyway.

The lifecycle for security updates is going to be much longer, but a security update would still be a separate driver.

Oxyclean
Sep 23, 2007


ijyt posted:

A used 3080 10/12GB would be a nice upgrade, especially if you're around 2560x1440.
How much should I be expecting to pay for a 3080? I'm in Canada so I'm not super sure where I should be looking to buy used, and I'm not clear if there's concerns regarding cards that were used for crypto? I've heard that's sometimes exaggerated but I'd hate to get burned unless I'm getting a card significantly cheaper.

This place has some in the $850-900 (CAD) range, which does still feel like a lot of money.

Newegg.ca has some for sub-$900, but they're either "Peladn Gaming", which I've never heard of before, or refurb cards (which I assume are fine).

What is the general state of prices/the market? I don't really need to upgrade -right now- but I got the impression we're coming down from the insane prices of the crypto craze, but the 4xxx series is priced super high and the 4070 is kind of trash?

How does the 3070 match up to the 3080?

Yudo
May 15, 2003

Oxyclean posted:

How much should I be expecting to pay for a 3080? I'm in Canada so I'm not super sure where I should be looking to buy used, and I'm not clear if there's concerns regarding cards that were used for crypto? I've heard that's sometimes exaggerated but I'd hate to get burned unless I'm getting a card significantly cheaper.

This place has some in the $850-900 (CAD) range, which does still feel like a lot of money.

Newegg.ca has some for sub-$900, but they're either "Peladn Gaming", which I've never heard of before, or refurb cards (which I assume are fine).

What is the general state of prices/the market? I don't really need to upgrade -right now- but I got the impression we're coming down from the insane prices of the crypto craze, but the 4xxx series is priced super high and the 4070 is kind of trash?

How does the 3070 match up to the 3080?

The 30 series hasn't been discounted the same way as AMD's 6000 series has. They are not good value at prices that are near their original msrp. The 3070 is becoming a non-starter anyway due to the increasing probability of encountering its vram constraints. Lots of online types are saying that 12gb should be considered the new minimum, and while I have no idea if that will be the case for you, Nvidia is way too stingy on that front. If you can find a 12gb 3080 for less than $600 usd, it may be worth considering, or just get the 4070 which can use dlss 3.

The market sucks, but there at least are cards on the shelves. In terms of value (frame per usd), the 4070, 7900xt (which is now less than $800 usd), the 6800 and 6950 are the best at the higher end of the price spectrum.

My experience with high wattage amd cards is one of unacceptable levels of coil whine: I may go for a 4070 due to it being less power hungry and perhaps less likely to have an annoying whine.

shrike82
Jun 11, 2005

https://twitter.com/cataferal/status/1654146597456863235?s=20

:lmao: don't think PC performance is going to be fixed anytime soon

Dr. Video Games 0031
Jul 17, 2004

shrike82 posted:

https://twitter.com/cataferal/status/1654146597456863235?s=20

:lmao: don't think PC performance is going to be fixed anytime soon

https://twitter.com/digitalfoundry/status/1653374383862235145

Oxyclean
Sep 23, 2007


Yudo posted:

The 30 series hasn't been discounted the same way as AMD's 6000 series has. They are not good value at prices that are near their original msrp. The 3070 is becoming a non-starter anyway due to the increasing probability of encountering its vram constraints. Lots of online types are saying that 12gb should be considered the new minimum, and while I have no idea if that will be the case for you, Nvidia is way too stingy on that front. If you can find a 12gb 3080 for less than $600 usd, it may be worth considering, or just get the 4070 which can use dlss 3.

The market sucks, but there at least are cards on the shelves. In terms of value (frame per usd), the 4070, 7900xt (which is now less than $800 usd), the 6800 and 6950 are the best at the higher end of the price spectrum.

My experience with high wattage amd cards is one of unacceptable levels of coil whine: I may go for a 4070 due to it being less power hungry and perhaps less likely to have an annoying whine.

Looking at the 4070s, there are some at the low end of $800 CAD (~$600 USD), which seems a bit more palatable than cards pushing $900-1000.
But I'm also not really understanding why there's so much price variance? These all appear to be 12GB cards - I guess some are Ti? I forgot what that means, is it a higher clock speed model or something?

Between the Asus and Gigabyte ones at 809$ CAD, is there much of a difference/recommendation?

Oxyclean fucked around with this message at 01:30 on May 5, 2023

repiv
Aug 13, 2009

the profile looks pretty gross

https://twitter.com/SheriefFYI/status/1653970157319131137

Yudo
May 15, 2003

Oxyclean posted:

Looking at the 4070s, there are some at the low end of $800 CAD (~$600 USD), which seems a bit more palatable than cards pushing $900-1000.
But I'm also not really understanding why there's so much price variance? These all appear to be 12GB cards - I guess some are Ti? I forgot what that means, is it a higher clock speed model or something?

Between the Asus and Gigabyte ones at 809$ CAD, is there much of a difference/recommendation?

Both the 4070 and 4070ti have 12gb of VRAM. Though they are both based on the same chip (AD104), the 4070 is a cut down version (fewer compute cores) of the 4070ti that is also clocked lower. As a result, the 4070ti is a considerably faster card that uses more power. Is it worth the price premium? Personally, I don't think it is, but if you need more grunt than the 4070, the 4070ti has it.

The Asus one says Asus, and the Gigabyte one says Gigabyte. Aside from that, they are likely almost the same. I think Asus is a better overall brand than Gigabyte, even in light of their current difficulties. I would not pay a premium for either, and certainly not above whatever MSRP is in CAD (which I assume is $809). Buy whatever is cheaper, but tie goes to Asus.

Yudo fucked around with this message at 01:51 on May 5, 2023

Subjunctive
Sep 12, 2006



That is some rookie poo poo, I mean come on now.

Branch Nvidian
Nov 29, 2012



I just completely deleted Jedi Survivor from my system. For one reason or another the patch never downloaded, and the game continued to hitch and lag like poo poo. Some real Bush League poo poo.

gradenko_2000
Oct 5, 2010

That thread actually made me feel better about myself.

DoombatINC
Apr 20, 2003






https://twitter.com/SheriefFYI/status/1653972212993642496

ay caramba

Twerk from Home
Jan 17, 2009

I wonder how they scale and schedule worker threads, that's hilarious. That would kill them extra hard if they synchronize across multiple CCDs or with the slower E cores on the Intel side, right? I haven't seen any benchmarks of Jedi on small CPUs, what if the ultimate Jedi CPU is an i3, with only 4 P cores?

Hell, I'm thinking now of an Alder Lake deep dive I read where they saw that the ring bus between cores runs faster when the E cores are disabled.

orange juche
Mar 14, 2012




They made the game for console and were told at the last minute "btw you need to release for PC too" but since they made the game all hosed up on account of consoles being able to handle that, the game is just wrecked.

Not that their dumb coding doesn't have an impact on consoles, just less of one.

gradenko_2000
Oct 5, 2010


Twerk from Home posted:

I wonder how they scale and schedule worker threads, that's hilarious. That would kill them extra hard if they synchronize across multiple CCDs or with the slower E cores on the Intel side, right? I haven't seen any benchmarks of Jedi on small CPUs, what if the ultimate Jedi CPU is an i3, with only 4 P cores?

Hell, I'm thinking now of an Alder Lake deep dive I read where they saw that the ring bus between cores runs faster when the E cores are disabled.

https://twitter.com/Dachsjaeger/status/1653687263740538880?t=mGafVURrzPuZCvNxz1JmDA&s=19

There's a perf improvement with more P-cores so an i3 isn't the answer, but there's a regression when the E-cores get turned on lmao
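
If anyone wants to sanity-check the E-core thing on their own machine, here's a rough sketch using psutil: launch the game, pin it to the P-cores only, and compare frame rates. The CPU indices and the executable name are assumptions about a typical hybrid part (P-cores' logical CPUs listed first), not anything from the thread, so check your actual topology in Task Manager first.
code:
import psutil

# assumption: 8 P-cores with SMT enabled -> logical CPUs 0-15
P_CORE_LOGICAL_CPUS = list(range(16))

def pin_to_p_cores(process_name: str) -> None:
    """Restrict every process whose name matches to the assumed P-core CPU set."""
    for proc in psutil.process_iter(["name"]):
        name = proc.info["name"] or ""
        if name.lower().startswith(process_name.lower()):
            proc.cpu_affinity(P_CORE_LOGICAL_CPUS)
            print(f"pinned PID {proc.pid} to CPUs {P_CORE_LOGICAL_CPUS}")

# pin_to_p_cores("JediSurvivor")  # hypothetical executable name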

repiv
Aug 13, 2009

it's impressive how they managed to screw up such low level details while using unreal engine, which abstracts all that stuff away

maybe they customized unreal and bit off more than they could chew?

Subjunctive
Sep 12, 2006


This must mean that nobody has even gestured at the build with a profiler on PC, or else the supernova-bright problems are just baked in so deep they don’t know how to attack them.

900ms waits is deep, bellowing lols though.

orange juche
Mar 14, 2012



Subjunctive posted:

This must mean that nobody has even gestured at the build with a profiler on PC, or else the supernova-bright problems are just baked in so deep they don’t know how to attack them.

900ms waits is deep, bellowing lols though.

They got their professional certs at the bottom of a cracker jack box, the entire team

shrike82
Jun 11, 2005

https://twitter.com/D_S_O_Gaming/status/1654111230334849024?s=20

pretty funny that paid mods to patch graphics are a thing now


Cross-Section
Mar 18, 2009

Yeah, I was planning on taking a break until more patches dropped but then I stumbled upon that DLSS3 mod

https://www.youtube.com/watch?v=BbRdpHex2No

You indeed have to drop 5 bucks on the creator's Patreon but it doubled my framerate and got rid of a lot of the stutters I was experiencing, so I'd say that's a good enough deal
