|
Kind of a dumb question, but I need to use a few more monitors with my RTX 3090 (which has 4 connectors, and I need two more). I have an old GTX 660 Ti that fits fine in my motherboard and case. However, the RTX 30 series drivers don't include the GTX 600 series, so the card isn't recognized by Windows 10 x64, and I don't see any RTX 30 series drivers that do. Installing any driver that is compatible with the 660 Ti disables the 3090. Is there any way to get them both to function, or is it just not doable without driver support for both? It's not a big deal; I'm just experimenting with some options and wanted to see if I could use this old hardware first.
|
# ? May 3, 2023 16:11 |
|
AzureSkys posted:Kind of a dumb question, but I need to use a few more monitors with my RTX 3090 (Which has 4 connectors and I need two more). I have an old GTX 660ti that fits OK to install on my motherboard and case. Not really, unless you want to dabble in custom-edited drivers, and you don't want to do that: unsigned drivers are a pretty big security risk. Get a newer second GPU that fits in your case; if card height is your concern, there are single-slot solutions that are new enough to be supported, I think.
|
# ? May 3, 2023 16:18 |
|
I have the terrible feeling that FSR 3 isn't coming until the middle of RDNA 3's life cycle. That isn't going to help developer adoption, though apparently FSR is easy to implement. Maybe if it runs on RDNA 2 it will end up on consoles.
|
# ? May 3, 2023 16:20 |
|
AzureSkys posted:Kind of a dumb question, but I need to use a few more monitors with my RTX 3090 (Which has 4 connectors and I need two more). I have an old GTX 660ti that fits OK to install on my motherboard and case. Do you have an integrated GPU on your CPU you could use, if the last two monitors don't need much graphics acceleration?
|
# ? May 3, 2023 16:21 |
|
orange juche posted:Not really unless you want to dabble into custom edited drivers and you don't want to do that, unsigned drivers are a pretty big safety risk. That's sorta what I thought. Arivia posted:Do you have an integrated gpu on your cpu you could use, if the last two monitors don’t need much graphics acceleration? I'm using Spacedesk to turn a laptop into the extra screens over my home network, but there are some latency issues. I also have a TripleHead2Go, but it treats the extra monitors as a single display, which makes certain window placement finicky. I'll keep my eye out for a newer simple GPU that would fit. I think I have access to a 1080 that I may try out, if it'll fit and not cause heat concerns. It's for Flight Simulator instrument panel displays, which gets a bit ridiculous.
|
# ? May 3, 2023 16:32 |
|
Yudo posted:I have the terrible feeling that fsr 3 isn't coming until the middle of rdna 3's life cycle. That isn't going to help in terms of developer adoption, though apparently fsr is easy to implement. Maybe if it runs on rdna 2 it will end up on consoles. FSR3 should run on RDNA2: it's an open standard, and AMD has specifically stated it will run on older hardware. They'd be silly not to, since the current consoles are RDNA2-based and the lion's share of their graphics hardware goes into consoles.
|
# ? May 3, 2023 16:33 |
|
You could buy a used 1050 for peanuts to do it with, it sounds like.
|
# ? May 3, 2023 16:33 |
|
AzureSkys posted:...O__O It's always the crazy *sim folks
|
# ? May 3, 2023 16:38 |
|
Yudo posted:I have the terrible feeling that fsr 3 isn't coming until the middle of rdna 3's life cycle. That isn't going to help in terms of developer adoption, though apparently fsr is easy to implement. Maybe if it runs on rdna 2 it will end up on consoles. it's pretty obvious they announced it very early in development; the discussion of it at GDC was little more than "yeah, we're working on it," with no examples or performance numbers given. they still need to come up with their own version of Reflex to make interpolation actually palatable
|
# ? May 3, 2023 17:22 |
|
I imagine the increase in frame generation latency is more pronounced when you have something like 45 real frames generating up to 60. But as I am looking at it now in MSFS, Plague Tale, and CP2077 on a 4090, I would not be able to pass a double-blind test. And I'm very sensitive to stuff like artifacting, micro stutters, and input lag. Maybe I would be able to notice in a twitch game like Overwatch, but those competitive FPSes tend to have very high FPS on mid-end hardware, so frame generation is not necessary. Very exciting technology; it would be amazing to have a Nintendo Switch 2 that can take Zelda from 40fps to 60fps.
|
# ? May 3, 2023 17:36 |
|
Animal posted:I imagine the increase in frame generation latency is more pronounced when you have something like 45 real frames generating up to 60. I have seen it demonstrated that frame generation works best when there are already adequate, or close to adequate, "real frames." When interpolation is applied with fewer base frames, latency increases, and I'd guess artifacting does as well, though that is harder to measure and harder to demonstrate via compressed video.
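To put rough numbers on that, here's a toy model of my own (not a measurement of DLSS 3's actual pipeline): interpolation has to buffer the newest real frame before it can display the generated in-between frame, so the added delay is on the order of one base frame-time, which shrinks as the base framerate rises.

```python
# Toy model: extra latency from frame interpolation.
# Assumption (mine, not NVIDIA's published numbers): the interpolator
# must hold one real frame in a buffer, so the added delay is roughly
# one base frame-time.
def added_latency_ms(base_fps: float) -> float:
    """Approximate extra latency in milliseconds at a given base framerate."""
    return 1000.0 / base_fps

for fps in (30, 45, 60, 90):
    print(f"{fps:>3} real fps -> ~{added_latency_ms(fps):.1f} ms added")
```

At a 45 fps base that's roughly 22 ms of extra delay versus roughly 11 ms at 90, which lines up with the intuition that frame generation feels worse the lower the starting framerate.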
|
# ? May 3, 2023 17:52 |
|
It's a pretty great first iteration. I'm looking forward to seeing it mature. DLSS 2 was a big jump from when it was first released.
|
# ? May 3, 2023 18:13 |
|
AzureSkys posted:I have an old GTX 660ti You've already gotten a functionally correct answer, but just to be informational about the why: the GTX 600 series is Kepler, and I believe current driver support only goes back to Turing.
|
# ? May 3, 2023 18:33 |
|
AzureSkys posted:That's sorta what I thought. depending on the specifics of the monitors, you might be able to use a DisplayPort MST hub to pull it off without a second graphics card. the catch is that the combination of monitors you attach to the hub can't exceed the bandwidth limit of a single DisplayPort output
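a quick way to sanity-check that limit (rough math with hypothetical monitor modes; the 25.92 Gbit/s figure assumes a DP 1.4 HBR3 link after 8b/10b encoding overhead, and the blanking factor is a loose approximation):

```python
# Rough DisplayPort MST bandwidth check.
# The monitor modes below are hypothetical examples, and `blanking`
# is a crude fudge factor for blanking intervals, not an exact timing.
def mode_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.2):
    """Approximate uncompressed bandwidth for one video mode, in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

DP14_USABLE_GBPS = 25.92  # DP 1.4 HBR3 x4 lanes, after 8b/10b overhead

monitors = [
    (2560, 1440, 144),  # main display
    (1280, 1024, 60),   # pop-out instrument panel
    (1280, 1024, 60),   # pop-out instrument panel
]
total = sum(mode_bandwidth_gbps(w, h, hz) for w, h, hz in monitors)
print(f"total ~= {total:.1f} Gbit/s, fits on one output: {total <= DP14_USABLE_GBPS}")
```

with small instrument-panel displays like those, the combo comes in well under the single-output limit, which is why an MST hub can work for this use case.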
|
# ? May 3, 2023 18:46 |
|
repiv posted:depending on the specifics of the monitors you might be able to use a displayport MST hub to pull it off without a second graphics card If he's just using the extra outputs for pop-out flight displays those really shouldn't be too much of a bandwidth hit - those are probably something like 1280x1024 or whatever anyway.
|
# ? May 3, 2023 19:50 |
|
mdxi posted:You've already gotten a functionally correct answer, but just to be informational about the why: GTX 600 series is Kepler and I believe the current support is only back to Turing. Maxwell and newer, as in the GTX 900 series and the GTX 750 variants (745, 750, 750 Ti), but not the rest of the GTX 700 series.
|
# ? May 4, 2023 13:59 |
|
Paul MaudDib posted:you know, people put a very negative spin on NVIDIA doing their damndest to make sure there's inventory on shelves at MSRP on launch, and then even throwing in a gift card as a promo. Like, you can also view that as recognition that price is what matters right now and not loving around with 3 months of above-MSRP cards while the channel fills/etc, like, that's actually kinda respecting consumers, and they wrote big checks and kicked some partner asses (who promptly came grumbling to tech media ofc). The tonal difference with how people interpret AMD and NVIDIA's actions is kind of extreme. You do not, in fact, have to give it to them. Nvidia is not your friend (and neither is AMD).
|
# ? May 4, 2023 17:58 |
|
So I got purple lines all over my monitor, and Device Manager has disabled my GPU, citing error code 43. It happened as I was booting up Jedi Survivor; I'd been playing it fine all week. Can anybody point me in the right direction to where I can get some help?
|
# ? May 4, 2023 19:11 |
|
Miguel Prado posted:So I got purple lines all over my monitor and device manager has disabled my GPU citing error code 43. Happened as I was booting up Jedi Survivor, been playing it fine all week. If a reboot doesn't fix it, you likely have a dead GPU on your hands. That's how VRAM dies.
|
# ? May 4, 2023 19:22 |
|
orcane posted:Maxwell and newer, as in the GTX 900 series and the GTX 750 variants (745, 750, 750 Ti), but not the rest of the GTX 700 series. Huh. Is dropped support just a case of not being able to play games released after the support ended, or is it a case of critical safety concerns/a potential attack vector for malicious code? Now that I've got my new rig, my old one with a 970 is connected to my TV as a living room media centre/couch co-op gaming platform; I was just wondering if it's likely to have a limited shelf life, as apparently that card's architecture is next on the chopping block.
|
# ? May 4, 2023 22:19 |
|
Breetai posted:Huh. Is dropped support just a case of not being able to play games released after the support ended, or is it a case of critical safety concerns/potential attack vector for malicious code? Now I've got my new rig my old one with a 970 is connected to my tv as a living room media centre/couch co-op gaming platform, was just wondering if I'm likely to have a limited shelf life as apparently that card's architecture is next on the chopping block. It mainly means they use different drivers. The other thing is those “game ready” updates won’t be available for old cards but I doubt they do much for older generations anyway. The lifecycle for security updates is going to be much longer, but a security update would still be a separate driver.
|
# ? May 4, 2023 23:06 |
|
ijyt posted:A used 3080 10/12GB would be a nice upgrade, especially if you're around 2560x1440. This place has some in the 850-900$ (CAD) range Which does still feel like a lot of money. Newegg.ca has some for sub-$900, but they're either "Peladn Gaming," which I've never heard of before, or refurb cards (which I assume are fine). What is the general state of prices/the market? I don't really need to upgrade -right now- but I got the impression we're coming down from the insane prices of the crypto craze, but the 4xxx series is priced super high and the 4070 is kind of trash? How does the 3070 match up to the 3080?
|
# ? May 5, 2023 00:39 |
|
Oxyclean posted:How much should I be expecting to pay for a 3080? I'm in Canada so I'm not super sure where I should be looking to buy used, and I'm not clear if there's concerns regarding cards that were used for crypto? I've heard that's sometimes exaggerated but I'd hate to get burned unless I'm getting a card significantly cheaper. The 30 series hasn't been discounted the way AMD's 6000 series has; they are not good value at prices near their original MSRP. The 3070 is becoming a non-starter anyway due to the increasing probability of hitting its VRAM constraints. Lots of online types are saying that 12GB should be considered the new minimum, and while I have no idea if that will be the case for you, Nvidia is way too stingy on that front. If you can find a 12GB 3080 for less than $600 USD, it may be worth considering, or just get the 4070, which can use DLSS 3. The market sucks, but at least there are cards on the shelves. In terms of value (frames per USD), the 4070, the 7900 XT (which is now less than $800 USD), the 6800, and the 6950 are the best at the higher end of the price spectrum. My experience with high-wattage AMD cards is one of unacceptable levels of coil whine; I may go for a 4070 due to it being less power hungry and perhaps less likely to have an annoying whine.
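If you want to rank options on that frames-per-USD metric yourself, the arithmetic is trivial to script. The fps figures below are made-up placeholders, not benchmark results; plug in averages from a review you trust.

```python
# Frames-per-dollar comparison helper. The fps values here are
# hypothetical placeholders -- substitute real benchmark averages.
def fps_per_dollar(avg_fps: float, price_usd: float) -> float:
    return avg_fps / price_usd

cards = {
    "card A (hypothetical)": (100.0, 600.0),  # (avg fps, price in USD)
    "card B (hypothetical)": (125.0, 800.0),
}
ranked = sorted(cards.items(), key=lambda kv: fps_per_dollar(*kv[1]), reverse=True)
for name, (fps, price) in ranked:
    print(f"{name}: {fps_per_dollar(fps, price):.3f} fps/$")
```

The ranking shifts fast with street prices, which is why a card that looks bad at MSRP can become the value pick after a discount.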
|
# ? May 5, 2023 00:57 |
|
https://twitter.com/cataferal/status/1654146597456863235?s=20 don't think PC performance is going to be fixed anytime soon
|
# ? May 5, 2023 01:05 |
|
shrike82 posted:https://twitter.com/cataferal/status/1654146597456863235?s=20 https://twitter.com/digitalfoundry/status/1653374383862235145
|
# ? May 5, 2023 01:14 |
|
Yudo posted:The 30 series hasn't been discounted the same way as AMD's 6000 series has. They are not good value at prices that are near their original msrp. The 3070 is becoming a non-starter anyway due to the increasing probability of encountering its vram constraints. Lots of online types are saying that 12gb should be considered the new minimum, and while I have no idea if that will be the case for you, Nvidia is way too stingy on that front. If you can find a 12gb 3080 for less than $600 usd, it may be worth considering, or just get the 4070 which can use dlss 3. Looking at the 4070s, there are some at the low end of $800 CAD (~$600 USD) - that seems a bit more palatable than cards pushing $900-1,000. But I'm also not really understanding why there's so much price variance? These all appear to be 12GB cards - I guess some are Ti? I forgot what that means; is it a higher clock speed model or something? Between the Asus and Gigabyte ones at $809 CAD, is there much of a difference/recommendation? Oxyclean fucked around with this message at 01:30 on May 5, 2023 |
# ? May 5, 2023 01:27 |
|
the profile looks pretty gross https://twitter.com/SheriefFYI/status/1653970157319131137
|
# ? May 5, 2023 01:40 |
|
Oxyclean posted:Looking at the 4070s, there some at the low end of 800$ CAD (~600USD) - that seems a bit more palatable to cards pushing 900-1000$. Both the 4070 and 4070ti have 12gb of VRAM. Though they are both based on the same chip (AD104), the 4070 is a cut down version (fewer compute cores) of the 4070ti that is also clocked lower. As a result, the 4070ti is a considerably faster card that uses more power. Is it worth the price premium? Personally, I don't think it is, but if you need more grunt than the 4070, the 4070ti has it. The Asus one says Asus, and the Gigabyte one says Gigabyte. Aside from that, they are likely almost the same. I think Asus is a better overall brand than Gigabyte, even in light of their current difficulties. I would not pay a premium for either, and certainly not above whatever MSRP is in CAD (which I assume is $809). Buy whatever is cheaper, but tie goes to Asus. Yudo fucked around with this message at 01:51 on May 5, 2023 |
# ? May 5, 2023 01:48 |
|
repiv posted:the profile looks pretty gross That is some rookie poo poo, I mean come on now.
|
# ? May 5, 2023 01:50 |
|
I just completely deleted Jedi Survivor from my system. For one reason or another the patch never downloaded, and the game continued to hitch and lag like poo poo. Some real Bush League poo poo.
|
# ? May 5, 2023 01:53 |
|
That thread actually made me feel better about myself.
|
# ? May 5, 2023 01:57 |
|
https://twitter.com/SheriefFYI/status/1653972212993642496 ay caramba
|
# ? May 5, 2023 01:58 |
|
I wonder how they scale and schedule worker threads; that's hilarious. That would kill them extra hard if they synchronize across multiple CCDs or with the slower E-cores on the Intel side, right? I haven't seen any benchmarks of Jedi on small CPUs; what if the ultimate Jedi CPU is an i3, with only 4 P-cores? Hell, I'm thinking now of an Alder Lake deep dive I read where they saw the ring bus between cores run faster when the E-cores are disabled.
|
# ? May 5, 2023 02:05 |
|
They made the game for console and were told at the last minute "btw you need to release for PC too" but since they made the game all hosed up on account of consoles being able to handle that, the game is just wrecked. Not that their dumb coding doesn't have an impact on consoles, just less of one.
|
# ? May 5, 2023 02:11 |
|
Twerk from Home posted:I wonder how they scale and schedule worker threads, that's hilarious. That would kill them extra hard if they synchronize across multiple CCDs or with the slower E cores on the Intel side, right? I haven't seen any benchmarks of Jedi on small CPUs, what if the ultimate Jedi CPU is an i3, with only 4 P cores? https://twitter.com/Dachsjaeger/status/1653687263740538880?t=mGafVURrzPuZCvNxz1JmDA&s=19 There's a perf improvement with more P-cores, so an i3 isn't the answer, but there's a regression when the E-cores get turned on lmao
|
# ? May 5, 2023 02:12 |
|
it's impressive how they managed to screw up such low-level details while using Unreal Engine, which abstracts all that stuff away. maybe they customized Unreal and bit off more than they could chew?
|
# ? May 5, 2023 02:15 |
|
This must mean that nobody has even gestured at the build with a profiler on PC, or else the supernova-bright problems are just baked in so deep they don’t know how to attack them. 900ms waits is deep, bellowing lols though.
|
# ? May 5, 2023 02:16 |
|
Subjunctive posted:This must mean that nobody has even gestured at the build with a profiler on PC, or else the supernova-bright problems are just baked in so deep they don’t know how to attack them. They got their professional certs at the bottom of a cracker jack box, the entire team
|
# ? May 5, 2023 02:53 |
|
https://twitter.com/D_S_O_Gaming/status/1654111230334849024?s=20 pretty funny that paid mods to patch graphics are a thing now
|
# ? May 5, 2023 02:53 |
|
Yeah, I was planning on taking a break until more patches dropped, but then I stumbled upon that DLSS3 mod https://www.youtube.com/watch?v=BbRdpHex2No You do indeed have to drop 5 bucks on the creator's Patreon, but it doubled my framerate and got rid of a lot of the stutters I was experiencing, so I'd say that's a good enough deal
|
# ? May 5, 2023 02:55 |