Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Paul MaudDib posted:

GPUs push way more power than most CPUs (apart from Intel hurrr), need way beefier VRMs, and then have to deal with all of that heat somehow. And it all needs to fit into a dual-slot card.

Usually it's either caps or MOSFETs that burn out these days; we're past the bad days of the early leadless/RoHS-compliant solders, and the GPU itself basically doesn't fail unless you do something Buildzoid-level stupid and physically wire around all the safeties. Cap/MOSFET failures are basically a function of heat and runtime.
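A rough way to see the heat-and-runtime point: electrolytic capacitor life is commonly estimated with the rule of thumb that expected life roughly doubles for every 10°C below the rated temperature. A minimal sketch of that arithmetic follows; the rated hours and temperatures are made-up example values, not specs for any particular card.

code:
#include <cmath>
#include <cstdio>

// Rule-of-thumb (Arrhenius-style) estimate for electrolytic capacitor life:
// expected life roughly doubles for every 10 degrees C below the rated temperature.
// ratedHours/ratedTempC are hypothetical datasheet values, not real part specs.
double estimatedCapLifeHours(double ratedHours, double ratedTempC, double actualTempC) {
    return ratedHours * std::pow(2.0, (ratedTempC - actualTempC) / 10.0);
}

int main() {
    // Hypothetical 2,000-hour @ 105 C part running at two different board temperatures.
    std::printf("at 85 C: ~%.0f hours\n", estimatedCapLifeHours(2000, 105, 85));  // ~8,000 h
    std::printf("at 65 C: ~%.0f hours\n", estimatedCapLifeHours(2000, 105, 65));  // ~32,000 h
    return 0;
}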

That said, there's a not-insignificant number of motherboard failures as well. It's probably the second most likely component to die behind the GPU, at least on higher-end rigs that push a lot of CPU power. Like the GPU, the CPU itself is usually fine, but the VRMs don't live forever. Things like the PSU and memory are pretty stable up until, you know, they're getting senile (8+ years).

I'd also say some of this is design. The same CPUs are used across tons of applications, so reliability is key. In the "Pick Two: Cost, Reliability, Speed" equation, consumer GPUs always skimp on reliability. The failure rates (in my experience) of Quadros or datacenter cards are much lower per hour worked. This is OK though; I'd rather GeForce have a slightly higher failure rate in exchange for not having the cost factor the other lines do.


CaptainSarcastic
Jul 6, 2013



Of my last 8 video cards I think only one doesn't work anymore, and I'm not 100% sure about that. I was building a computer for a family member and intended to put in my old EVGA 9800 GTX, but it wouldn't power on. Rather than troubleshoot, I just threw in the next card I could locate, an EVGA GTX 660.

I guess I have 4 cards installed in machines right now:

2070 Super
1060 6GB
GTX 660
FX 5200 (this is a PCI card in a dual PIII server I keep around because I don't know, I just do, shut up)

Somewhere around here:
9800 GTX (probably dead)
9500 GT
GTX 460
GTX 260

v1ld
Apr 16, 2012

repiv posted:

there's certainly conspicuous use of raytracing in gran turismo 7, and it's running at 60fps

https://www.youtube.com/watch?v=oz-O74SmTSQ

can't wait to see the end of lovely screenspace reflections

That's impressive. This is running on RDNA2 hardware, I guess.

What's the state of ray tracing support in the APIs? What will PS5 games in dev right now use to access the ray tracing hardware?

repiv
Aug 13, 2009

v1ld posted:

What's the state of ray tracing support in the APIs? What will PS5 games in dev right now use to access the ray tracing hardware?

In terms of which APIs expose RT hardware? DX12 yes, Vulkan yes through an Nvidia extension and soon through a cross-vendor extension, CUDA yes, DX11 and OpenGL no.

Sony usually roll their own APIs, so it's probably something custom in their case; that's all under NDA and nobody is talking yet.
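For reference, a minimal sketch of how an application might check whether a device even advertises those Vulkan extensions; the extension strings here are the Nvidia one and the provisional cross-vendor one as of mid-2020 (later finalized under different names), and instance/device setup is assumed to have happened elsewhere.

code:
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

// Returns true if the given physical device advertises the named extension.
// Assumes a VkPhysicalDevice has already been selected; instance creation is omitted.
bool hasExtension(VkPhysicalDevice dev, const char* name) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(dev, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(dev, nullptr, &count, exts.data());
    for (const auto& e : exts)
        if (std::strcmp(e.extensionName, name) == 0) return true;
    return false;
}

void reportRayTracingSupport(VkPhysicalDevice dev) {
    // Nvidia-specific extension, exposed on Turing-era drivers.
    std::printf("VK_NV_ray_tracing:  %s\n", hasExtension(dev, "VK_NV_ray_tracing") ? "yes" : "no");
    // Provisional cross-vendor extension at the time of this thread (June 2020).
    std::printf("VK_KHR_ray_tracing: %s\n", hasExtension(dev, "VK_KHR_ray_tracing") ? "yes" : "no");
}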

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Asciana posted:

This has got to be the worst time in the GPU cycle ever for my 1080 to kick the bucket (EVGA, VRM exploded out of warranty).

Did you buy it on a credit card? If so, it's worth looking to see if your card has any sort of extended warranty that you could take advantage of. A lot of Citibank cards give a free 2-year extension (which is why I buy most of my electronics on said card), while a lot of other ones have a 1-year extension. Worth a look if you haven't.

Asciana
Jun 16, 2008

DrDork posted:

Did you buy it on a credit card? If so, it's worth looking to see if your card has any sort of extended warranty that you could take advantage of. A lot of Citibank cards give a free 2-year extension (which is why I buy most of my electronics on said card), while a lot of other ones have a 1-year extension. Worth a look if you haven't.

UK goon here. It was second hand, unfortunately, and even then outwith the third-party warranty. It's certainly put me off buying second hand, though I wonder if it had been used for mining at some point.

Twibbit
Mar 7, 2013

Is your refrigerator running?

repiv posted:

In terms of which APIs expose RT hardware? DX12 yes, Vulkan yes through an Nvidia extension and soon through a cross-vendor extension, CUDA yes, DX11 and OpenGL no.

Sony usually roll their own APIs, so it's probably something custom in their case; that's all under NDA and nobody is talking yet.

The PS5 tech breakdown says they are using Vulkan's ray tracing API, which was finished up quite some time ago.

repiv
Aug 13, 2009

Twibbit posted:

The PS5 tech breakdown says they are using Vulkan's ray tracing API, which was finished up quite some time ago.

Where did they say that? Googling PS5 Vulkan turns up nothing but speculation as far as I can see.

Also the cross-vendor Vulkan raytracing extension is still provisional, not fully finalized yet.

repiv fucked around with this message at 01:23 on Jun 13, 2020

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Rumors flyin’ hot n heavy: 3080 and 3090 possible co-processor...

https://wccftech.com/nvidia-traversal-coprocessor-rtx-3090-gpu/

e: https://www.youtube.com/watch?v=i537qwchiAw

Taima fucked around with this message at 02:21 on Jun 15, 2020

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
That's one of those rumors that is totally wild and out there but also... maybe makes sense? Assuming there aren't performance problems with doing your RT calculations in a separate place, it would have a bunch of advantages:

By splitting off a significant portion of the GPU, Nvidia would gain much of the benefit of MCM without having to do actual MCM. By making the core GPU more space-efficient and avoiding yield issues, Nvidia could pack more graphics acceleration hardware into the same cost of silicon. Probably the biggest reason would be Nvidia getting a huge advantage in professional and enterprise product segmentation: putting the RT cores on physically separate chips would let Nvidia sell distinct products to segments of the increasingly diverse business compute market, products that are more efficient for end users and thus more profitable for Nvidia.

I tend to assume it's not real because it's one of those things that's so wild that it would definitely have leaked by now, and there probably would have been some evidence hinting in that direction in Nvidia's original Ampere presentation.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

im ready for physx 2.0

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
That likely can't be done, for the same reasons you can't have a "stand-alone RTX card" like people were whining for when Turing first launched. The RTX units need to share cache with the FP32 units to achieve reasonable performance; pushing that to an external chip would be difficult architecturally.

(Also, the RT "cores" are not really standalone cores; they are another type of execution unit within the SM, like the integer units, FP32 units, and tensor units.)

I'm not really sure how or why that would work anyway. The CPU already knows, when it's building the scene, which models are "repeated"; there is no need to determine this experimentally GPU-side with great effort. And there really is no point in doing so: knowing a model is repeated tells you nothing, since the important part is the placement of the model within the scene. If you have two teapots next to a light source but one is occluded by another object, that BVH tree isn't relevant to the rays cast at the other one.
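For what it's worth, repeated geometry is already expressed through instancing in the acceleration structures: the top-level structure holds several instances pointing at the same bottom-level mesh with different transforms, so there's nothing to "detect". A toy sketch of the idea; the struct names are illustrative only, not the actual DXR or Vulkan types.

code:
#include <array>
#include <cstdint>
#include <vector>

// Toy illustration of instanced geometry in a ray tracing acceleration structure.
// These types are made up for illustration; real APIs (DXR/Vulkan) define their own.
struct BottomLevelAS {               // one copy of the actual triangle data, e.g. a teapot mesh
    std::vector<float> vertices;
    std::vector<uint32_t> indices;
};

struct Instance {                    // one placement of some BLAS in the scene
    std::array<float, 12> transform; // 3x4 row-major world transform
    uint32_t blasIndex;              // which bottom-level structure this instance references
};

struct TopLevelAS {                  // what rays are actually traced against
    std::vector<Instance> instances;
};

TopLevelAS buildSceneWithTwoTeapots(uint32_t teapotBlas) {
    TopLevelAS tlas;
    // The same mesh twice with different placements: the transforms (and any occluders)
    // are what matter to the rays, not the fact that the mesh is "repeated".
    tlas.instances.push_back({{1,0,0,-2,  0,1,0,0,  0,0,1,0}, teapotBlas});
    tlas.instances.push_back({{1,0,0, 2,  0,1,0,0,  0,0,1,0}, teapotBlas});
    return tlas;
}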

Furthermore, you really cannot have anything substantial there and still meet PCIe board spec. No cooler and probably not even a backplate. Even just a package itself would be pushing it.

If we’re throwing out wild ideas, I would say it’s more likely that Hopper is coming next year because the MCM tech is already in Ampere and Hopper is just two Ampere chips back to back. But then you run into the same “that product probably can’t exist within the PCIe card spec and certainly would be weird with a second cooler”, etc.

I am guessing the answer is far more mundane and those are the pads where the passives go, or those are test points for engineering sample cards and won’t be on production cards at all.

Coreteks has always struck me as being like Moore's Law Is Dead, in that he knows what he's talking about maybe 2/3rds of the time at best. I would take this one with a mountain of salt.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

K8.0 posted:

I tend to assume it's not real because it's one of those things that's so wild that it would definitely have leaked by now, and there probably would have been some evidence hinting in that direction in Nvidia's original Ampere presentation.

The one thing making me wonder if it's possible is the wacky fan design on the prototype cards. But yeah, odds are it's probably BS.

Overall, this gen has me salivating. Nvidia actually has to try for once. Who knows what they'll pull out of their rear end. Regardless, I feel like once this gen is out, the 2000 series will be considered pretty trash* based on the general info being leaked (and I say that as a 2080 owner). That's good. Progress is good. I only bought a 2080 because Fry's sold me one for $500... long story. It's just great to be really excited about a GPU release. I haven't really felt this way since the 1080Ti.

I mean poo poo, just HDMI 2.1 alone is completely game-changing. If the 3080 was just like 25% faster, did RTX better, and had HDMI 2.1, I would already be totally sold. But it luckily looks like we're in for a really big step up. Only question remaining is exactly how much of a jump we're looking at, and how we get there.

That doesn't even count the super-wildcard that is DLSS 2.0 possibly becoming a regular feature. And given how ridiculously good it is, it probably will. In that case, Ampere could very conceivably do 4K/120 with RTX. That's wild. Even 6 months ago we all figured that would be completely impossible; now it's not even that far-fetched.

*relatively speaking, no one get mad please

Taima fucked around with this message at 05:02 on Jun 15, 2020

Rabid Snake
Aug 6, 2004



Asciana posted:

This has got to be the worst time in the GPU cycle ever for my 1080 to kick the bucket (EVGA, VRM exploded out of warranty).

Man, I've heard of that happening when those EVGA Pascal cards (1070/1080) first came out. Did you install the thermal pads they issued as replacements? JW if the thermal pads addressed the issue or not. Not many people realized it was a problem before it was too late, which sucks.

Rabid Snake fucked around with this message at 06:08 on Jun 15, 2020

SwissArmyDruid
Feb 14, 2014

by sebmojo
I have actually convinced myself to wait until the major confluence of new technologies hits, in the form of the combined USB4/DDR5/DP2.0, followed by widespread adoption of PCIe 4.0 devices, before I pull the trigger on any major tech purchases.

Until then, I'mma just keep flogging this old hardware that I have to keep on going just a little bit longer...

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Same but HDMI 2.1 and VRR on television sets

Might be waiting a while :smith:

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Zedsdeadbaby posted:

Same but HDMI 2.1 and VRR on television sets

Might be waiting a while :smith:

You... I mean... you can do that today?

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Taima posted:

You... I mean... you can do that today?

The bodged FreeSync implementations we see on LG and Samsung TVs aren't quite the same thing; they still suffer from low Hz ranges and flickering, and HDMI 2.0 can't handle 4K over 60fps unless it's compressed.
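Rough numbers behind that last bit, for anyone curious. This is back-of-the-envelope only: it ignores blanking intervals and link-layer encoding overhead, so real signals need somewhat more than these figures, but the 18 Gbps vs 48 Gbps comparison is the point.

code:
#include <cstdio>

// Back-of-the-envelope uncompressed video bandwidth: width * height * refresh * bits per pixel.
// Ignores blanking and link encoding overhead, so actual requirements are a bit higher.
double rawGbps(int width, int height, int hz, int bitsPerPixel) {
    return double(width) * height * hz * bitsPerPixel / 1e9;
}

int main() {
    const double hdmi20MaxGbps = 18.0; // HDMI 2.0 maximum link rate
    const double hdmi21MaxGbps = 48.0; // HDMI 2.1 maximum link rate
    std::printf("4K60  8-bit RGB : ~%.1f Gbps (HDMI 2.0 limit %.0f)\n", rawGbps(3840, 2160, 60, 24), hdmi20MaxGbps);
    std::printf("4K120 8-bit RGB : ~%.1f Gbps (HDMI 2.1 limit %.0f)\n", rawGbps(3840, 2160, 120, 24), hdmi21MaxGbps);
    std::printf("4K120 10-bit RGB: ~%.1f Gbps\n", rawGbps(3840, 2160, 120, 30));
    return 0;
}

Which is roughly why 4K above 60Hz over HDMI 2.0 needs chroma subsampling or compression, and why HDMI 2.1 matters here.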

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

SwissArmyDruid posted:

I have actually convinced myself to wait until the major confluence of new technologies hits, in the form of the combined USB4/DDR5/DP2.0, followed by widespread adoption of PCIe 4.0 devices, before I pull the trigger on any major tech purchases.

Until then, I'mma just keep flogging this old hardware that I have to keep on going just a little bit longer...

There's always new poo poo on the horizon that will be AMAZING (read: usually a mundane extension of the current tech).

Alchenar
Apr 9, 2008

I'm just on the lookout for 4k monitors that will refresh faster than 60fps without having an enormous price premium attached.

VelociBacon
Dec 8, 2009

Alchenar posted:

I'm just on the lookout for 4k monitors that will refresh faster than 60fps without having an enormous price premium attached.

Same, but there doesn't seem to be anything decent at the moment that doesn't also have poorly implemented HDR with fans etc., or a curve, or an ultrawide aspect ratio.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Alchenar posted:

I'm just on the lookout for 4k monitors that will refresh faster than 60fps without having an enormous price premium attached.

Unfortunately I think you’ll be waiting awhile for that to happen.

Wrar
Sep 9, 2002


Soiled Meat

SwissArmyDruid posted:

I have actually convinced myself to wait until the major confluence of new technologies hits, in the form of the combined USB4/DDR5/DP2.0, followed by widespread adoption of PCIe 4.0 devices, before I pull the trigger on any major tech purchases.

Until then, I'mma just keep flogging this old hardware that I have to keep on going just a little bit longer...

Zen 3 is a big uplift if you're several CPU generations behind. It has everything but DDR5 (DP 2.0 is a video card improvement, imo).

Worf
Sep 12, 2017

If only Seth would love me like I love him!

it's gonna be so neat in, idk, 2 years or so when we have, all at once, new DDR, bigger display pipes, and whatever benefits PCIe 4 brings or whatever.

plus more VRR on TVs and better monitors coming out

tbh i dont usually penny pinch on building my desktops because of how much i use them but this time i might actually start saving in advance just to make it easier.

i figure im over $2k on just GPU and display(s) at this point

getting a massive ultrawide with actual good resolution and fast refresh with *sync is gonna own so hard

do people stack those at all lol

Worf fucked around with this message at 15:20 on Jun 15, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Statutory Ape posted:

i figure im over $2k on just GPU and display(s) at this point

getting a massive ultrawide with actual good resolution and fast refresh with *sync is gonna own so hard

do people stack those at all lol

Yeah, I'm in the same boat. I was running a 2500K up until a few years ago, and I've been running a 5820K@4.5 since then because it's more than fast enough for Destiny 2, FFXIV, etc., when paired with the 1080Ti that buttcoins paid for at the height of that insanity, and I haven't seen a whole lot of reason to upgrade since.

Blowing $2k on a 3080(Ti?) + better ultrawide towards the end of the year is basically a given at this point; now I have to figure out if it'll be at all worth it to wait for DDR5 to refresh the rest of my computer.

And, yes, some people stack ultrawides. gently caress, I probably will end up doing that just because by then I doubt there'll be much of a resale market for a 3440x1440@100Hz monitor.

Zarin
Nov 11, 2008

I SEE YOU

DrDork posted:

And, yes, some people stack ultrawides. gently caress, I probably will end up doing that just because by then I doubt there'll be much of a resale market for a 3440x1440@100Hz monitor.

One on top of the other, or one off to the side?

Mistikman
Jan 21, 2001

I was born ready. I'm Ron Fucking Swanson.

Alchenar posted:

I'm just on the lookout for 4k monitors that will refresh faster than 60fps without having an enormous price premium attached.

The big issue right now is that the DisplayPort and HDMI specs used on all monitors and video cards currently on the market can't support much more than 60 fps at 4K.

There are various monitors out there that use workarounds to get beyond this, but the implementation isn't perfect and yeah, it's expensive.

With any luck the next generation of cards from nvidia and AMD will have the newer specs, and the monitors will quickly follow. I expect there will still be a price premium for a couple years, but the implementation won't be so hacky either.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Zarin posted:

One on top of the other, or one off to the side?

On top. I've already got a 27" 1440p on each side. Of course then I'd have to find a monitor arm capable of holding two ultrawides like that, but I'm sure there's one out there somewhere.

(yes, it's kinda crazy, but given that I'm WFH for the foreseeable future, being able to connect a work laptop to one ultrawide while leaving the other two monitors up for my normal personal computer has actually been real nice. It'd be a lot nicer if I could use a proper KVM so I didn't have to deal with a lovely laptop keyboard, but that's blocked by policy so no dice there)

Zarin
Nov 11, 2008

I SEE YOU

DrDork posted:

On top. I've already got a 27" 1440p on each side. Of course then I'd have to find a monitor arm capable of holding two ultrawides like that, but I'm sure there's one out there somewhere.

(yes, it's kinda crazy, but given that I'm WFH for the foreseeable future, being able to connect a work laptop to one ultrawide while leaving the other two monitors up for my normal personal computer has actually been real nice. It'd be a lot nicer if I could use a proper KVM so I didn't have to deal with a lovely laptop keyboard, but that's blocked by policy so no dice there)

Eh, even with the 2 side screens, I didn't even consider that wild haha.

So many of my co-workers only used 1 screen until I showed them The Way. My boss commented on that once, and I told her that if someone would sign off on the budget for it, I'd use 6 if I could.

I did manage to talk them into getting us each a 4K this year, so we went from 3x 1080p to (technically, sorta, almost) 6x 1080p by swapping out one monitor for the 4K.

I'd need a different dock or something to try and hook everything up, and I'm not even sure these laptops could do more than 3 screens anyhow.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Wrar posted:

Zen 3 is a big uplift if you're several CPU generations behind. It has everything but DDR5 (DP 2.0 is a video card improvement, imo).

A 1600 desktop or a 7700K laptop, a mishmash of SATA 2.5", SATA M.2, a couple different speeds of RAM (with the fastest being the 3200 SO-DIMMs in a machine that can't clock them faster than 2400), spinning rust in 2.5" AND 3.5" form factors, and my choice of either a 1050 Ti or 1060 Max-Q while I wait for my monitor to get RMA'd?

Yeah, I think it's safe to say that I'm in a good position to wait for the aforementioned things.

Shaocaholica
Oct 29, 2002

Fig. 5E
I'm building/upgrading a Core 2 system for ~reasons~. It's going to have an Extreme Edition CPU with a mild overclock, so something like a 3.8GHz quad core. What GPU would go nicely with this CPU for 1440p gaming? I want to at least try modern games on this, so VRAM is a concern when factoring in older GPUs. Budget is $200. Used market is preferable!

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
So are you looking for era-appropriate GPUs or just something that makes sense? If you are fine with a more modern GPU and want to save money, a used RX 570 or RX 580 for <$100 can be a great deal. If you're trying to get as much performance as possible without breaking the budget, then you're going to have to go hunting but a used 1660 Super or Ti is probably going to win if you can find one that squeaks into your budget.

sauer kraut
Oct 2, 2004
Huh, that is quite the combo of wants for $200.
The only thing that comes to mind is a used Vega 56. They're supposed to be fun projects for people who love tinkering with clocks & volts & drivers all day.

Asciana
Jun 16, 2008

Rabid Snake posted:

Man, I've heard of that happening when those EVGA Pascal cards (1070/1080) first came out. Did you install the thermal pads they issued as replacements? JW if the thermal pads addressed the issue or not. Not many people realized it was a problem before it was too late, which sucks.

I actually fired the serial number into EVGA's checker and it actually shipped with the BIOS update and the pads, so it died regardless! A week later I'm still trying to decide between a 1660 Super or a 2070 Super. I guess it's down to how big a jump these 30xx cards are going to be. I do wish some solid info would come out so I could make a decision. I'm more swayed to the 1660, but I worry it's going to hit September and I'll be playing Cyberpunk on a subpar GPU (I game at 1440p).

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

With a 1440p144 monitor the 2070S is the obvious choice over a 1660S. However, if you can wait until September, mise do so.

Alchenar
Apr 9, 2008

Mistikman posted:

The big issue right now is that the DisplayPort and HDMI specs used on all monitors and video cards currently on the market can't support much more than 60 fps at 4K.

There are various monitors out there that use workarounds to get beyond this, but the implementation isn't perfect and yeah, it's expensive.

With any luck the next generation of cards from nvidia and AMD will have the newer specs, and the monitors will quickly follow. I expect there will still be a price premium for a couple years, but the implementation won't be so hacky either.

Yeah, I have a 4K 60Hz monitor now on a 4.5-year-old system. The plan is to do a full new build this year (or early next), spending silly money to get peak performance, then look for a new monitor in 2-3 years, when it's still going to be expensive but the tech will hopefully be a bit more mature and widespread.

Asciana
Jun 16, 2008

Some Goon posted:

With a 1440p144 monitor the 2070S is the obvious choice over a 1660S. However, if you can wait until September, mise do so.

I guess the only games I'm playing at the minute are indie games, some DOOM Eternal, and Divinity: Original Sin II, which I don't think would tax a 1660S too much. My biggest fear is that the 30xx series is going to end up being outside my budget and I'll be stuck with a 1660 for a while. (Guess I should have thought about that before going for 1440p, eh?)

CaptainSarcastic
Jul 6, 2013



Asciana posted:

I guess the only games I'm playing at the minute are indie games, some DOOM Eternal, and Divinity: Original Sin II, which I don't think would tax a 1660S too much. My biggest fear is that the 30xx series is going to end up being outside my budget and I'll be stuck with a 1660 for a while. (Guess I should have thought about that before going for 1440p, eh?)

What games you play would make a difference in the analysis.

I built a whole new system and bought a 1440p 144Hz monitor, and limping along with my old 1060 6GB from the previous system just didn't feel sustainable to me. That said, I had been waiting to play Far Cry 5 (my old CPU was missing an instruction set needed to play it), and have been playing Far Cry New Dawn, with Metro Exodus and Borderlands 3 next in my queue. At 1440p the 1060 6GB struggled, and I had to set graphical details down to maintain a decent framerate.

Depending on what you're playing and plan to play in the near future, and where you fall in terms of expectations on graphical details and framerate, a 1660S might indeed be workable. I opted to get a 2070S, but that was based on my own preferences.

Shaocaholica
Oct 29, 2002

Fig. 5E

K8.0 posted:

So are you looking for era-appropriate GPUs or just something that makes sense? If you are fine with a more modern GPU and want to save money, a used RX 570 or RX 580 for <$100 can be a great deal. If you're trying to get as much performance as possible without breaking the budget, then you're going to have to go hunting but a used 1660 Super or Ti is probably going to win if you can find one that squeaks into your budget.

Nah, it doesn't have to be period-correct. I already have a Vega 56 from another project I could harvest.

So an over-11-year-old Core 2 Quad Extreme can keep up with literally current GPUs at 1440p?


Asciana
Jun 16, 2008

CaptainSarcastic posted:

What games you play would make a difference in the analysis.

I built a whole new system and bought a 1440p 144Hz monitor, and limping along with my old 1060 6GB from the previous system just didn't feel sustainable to me. That said, I had been waiting to play Far Cry 5 (my old CPU was missing an instruction set needed to play it), and have been playing Far Cry New Dawn, with Metro Exodus and Borderlands 3 next in my queue. At 1440p the 1060 6GB struggled, and I had to set graphical details down to maintain a decent framerate.

Depending on what you're playing and plan to play in the near future, and where you fall in terms of expectations on graphical details and framerate, a 1660S might indeed be workable. I opted to get a 2070S, but that was based on my own preferences.

I don't really mind dropping down a graphical quality notch for a while, I guess. I just need to prepare to commit to getting that 30xx card when they come out! Would a 2060S be a bad choice at this time?
