Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Rolo posted:

What do you guys use to peek at GPU wattage?

I usually use HWMonitor but now that I have the Mystery Dell 3080, both HWMonitor and NZXTCam are reading 0 watts.

Back in real life I always have this plugged in to my PC to keep tabs on the full system load:

https://smile.amazon.com/P3-P4400-Electricity-Usage-Monitor/dp/B00009MDBU

Which works well but I'd love if there was a model with faster refresh (this is only once per second) and an always-on backlight so I can see it at night.



TIMG my dude
vvvv

Zero VGS fucked around with this message at 18:57 on Oct 30, 2020


Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."
Hardware Unboxed has their Watch Dogs: Legion optimization guide up; I've linked to their DLSS section.

It's interesting insofar as, like F1 2020 and unlike Death Stranding & Control, it's a case where DLSS still represents an image quality downgrade vs. native TAA, even at the Quality setting. It's very slight at Quality, and considering the performance uplift it's a no-brainer for the very slight texture detail drop in that mode, but Balanced/Performance look noticeably worse than native.



DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
I would say if you need to zoom to 300% to really see the difference, then it's not something to worry about unless you're simply taking screenshots for a blog.

But, yeah, for all the black magic that DLSS is, it isn't perfect.

repiv
Aug 13, 2009

It seems like a bit of give and take: texture detail is resolved better by their TAA, but DLSS resolves micro-geometry details better, going by PCGH's examples

https://www.pcgameshardware.de/commoncfm/comparison/clickSwitch.cfm?id=162627
https://www.pcgameshardware.de/commoncfm/comparison/clickSwitch.cfm?id=162628

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

DrDork posted:

I would say if you need to zoom to 300% to really see the difference, then it's not something to worry about unless you're simply taking screenshots for a blog.

But, yeah, for all the black magic that DLSS is, it isn't perfect.

We definitely need to see more shots. One aspect I was really impressed with in Death Stranding & Control was that DLSS not only sharpened vs TAA, but actually reduced subpixel aliasing better than TAA as well. This effect may still be present here even if the texture detail is dialed back. Also, as mentioned in the video, additional sharpening could be added, which may help.

Another interesting tidbit is that with ray tracing at 4k with Ultra textures, you're overflowing an 8GB card, and 10GB is cutting it close.

repiv
Aug 13, 2009

Yeah I would expect that better resolution of small geometry details to manifest as better temporal stability overall

Maybe DigitalFoundry will take a closer look at it, they tend to focus on stability

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Happy_Misanthrope posted:

Another interesting tidbit is that with ray tracing at 4k with Ultra textures, you're overflowing an 8GB card, and 10GB is cutting it close.

So you're saying I should flip the 3080 I've got inbound and get a 3090 :thunk:

Someone please talk me out of this.

The Grumbles
Jun 5, 2006
The thing that's important to keep in mind about DLSS is that it's really loving cool. If you're into computers to the extent that you're building them and messing about under the hood, it almost doesn't matter that there are imperfections compared to whatever other method. I mean, half the reason for wanting to run a high-end graphics card is so you can point at the screen and remind yourself that the box on your desk is rendering that stuff in real time.

Cabbages and VHS
Aug 25, 2004

Listen, I've been around a bit, you know, and I thought I'd seen some creepy things go on in the movie business, but I really have to say this is the most disgusting thing that's ever happened to me.

Rolo posted:

Mine just now went from “reserving” to “preparing for shipment.” There’s still hope.

It hasn’t shipped, but at least the little progress bar went from yellow to green.

mine went to "delayed" and I am not very optimistic, but we'll see:


If there weren't sixteen things in my life / the 2020 world to be much more alarmed about, I would find this launch debacle infuriating instead of a more or less equal mix of tilting and hilarious, because it is a dark comedy at the intersection of late stage capitalism, a mismanaged pandemic, and people with more money than sense (including myself, since I have a 3070 and 3080 on order at present and will absolutely buy a Navi card if I don't have one of those two in my possession when I have a window to snap up a Radeon).

Kibner
Oct 21, 2008

Acguy Supremacy

Happy_Misanthrope posted:

Another interesting tidbit is that with ray tracing at 4k with Ultra textures, you're overflowing an 8GB card, and 10GB is cutting it close.

How much VRAM would be used with those settings at 1440p, instead?

Unhappy Meal
Jul 27, 2010

Some smiles show mirth
Others merely show teeth

Happy_Misanthrope posted:

Another interesting tidbit is that with ray tracing at 4k with Ultra textures, you're overflowing an 8GB card, and 10GB is cutting it close.

Actually overflowing, or the typical "But I NEED it" overflowing where the game requests more than it actually uses?

Volguus
Mar 3, 2009
There is one EVGA GeForce RTX 3070 XC3 Black Gaming card available on Canadian Amazon. $1,999 CAD.

Ships from United States and sold by Sorry for Scalping Shop.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Kibner posted:

How much VRAM would be used with those settings at 1440p, instead?

According to the review, still around 7.5GB VRAM for Ultra settings + Ultra textures, while High drops it down to 6.5GB, and Medium to 4.4GB. There is also basically no visual distinction between Ultra, High, and even Medium textures. Soooo yeah. Makes me kinda wonder if the textures were natively created at "High" and then upsampled to make the Ultra ones, which made them larger while not really doing much to increase quality.

But, yes, it does seem like this is a game that you don't want to run 4k+RTX+Ultra everything on an 8GB card.

Unhappy Meal posted:

Actually overflowing, or the typical "But I NEED it" overflowing where the game requests more than it actually uses?

He claims running Ultra (7.5GB) on a 6GB card (didn't say which) produced noticeable stuttering, so this may be legitimate actual "need" (except that there's no real need to run Ultra in the first place).

e: and while he didn't give raw performance numbers, it's probably safe to say that running WD:L maxed out at 4k is not gonna get you good performance on what is presumably a 1060 6GB, regardless of textures.

DrDork fucked around with this message at 19:40 on Oct 30, 2020

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Unhappy Meal posted:

Actually overflowing, or the typical "But I NEED it" overflowing where the game requests more than it actually uses?

Actually overflowing - massive stutters.

bus hustler
Mar 14, 2019

Ha Ha Ha... YES!
Wonder what the difference in the long tail between 2080ti's 11GB and 3070's tensor 2.0 cores will wind up being at higher settings.

repiv
Aug 13, 2009

DrDork posted:

According to the review, still around 7.5GB VRAM for Ultra settings + Ultra textures, while High drops it down to 6.5GB, and Medium to 4.4GB. There is also basically no visual distinction between Ultra, High, and even Medium textures. Soooo yeah. Makes me kinda wonder if the textures were natively created at "High" and then upsampled to make the Ultra ones, which made them larger while not really doing much to increase quality.

It might be that the texture setting is controlling the amount of VRAM allocated for streaming, rather than directly controlling the texture resolution loaded, so medium-high is enough to load the highest quality textures but high-ultra allows textures to stay cached for longer and potentially alleviate pop-in (especially on non-SSD systems)

v1ld
Apr 16, 2012

bus hustler posted:

Wonder what the difference in the long tail between 2080ti's 11GB and 3070's tensor 2.0 cores will wind up being at higher settings.

What do the tensor 2.0 cores bring over what's in the 2080ti? What're the other in-silicon improvements of the 3070/80 over the 2080ti?

Asking because the 2080ti is slightly better than the 3070 in some games and up to 10-12% better in others in some of the graphs I've seen. If you can get one for $450ish, it's a good alternative to the 3070, so it'd be interesting to know where the 3070 has scope to improve over time.

E: I'd still like the 3080 personally since I think it'll have very long legs at 3440x1440 but a sweet deal on a 2080ti may sway me. A 12GB 3080ti would be even sweeter. The monitor will outlive the card for sure, not interested in moving off that res for a long while, so even a 2080ti could be a reasonable fit till the next monitor.

v1ld fucked around with this message at 19:46 on Oct 30, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

v1ld posted:

What do the tensor 2.0 cores bring over what's in the 2080ti? What're the other in-silicon improvements of the 3070/80 over the 2080ti?

Considerably better RTX performance, mostly. Otherwise it's not really an upgrade over the 2080 Ti, it's just, you know, 40% of the price of one.

But yeah, if you can snake a 2080 Ti for $400, that'd be a compelling deal vs a 3070.

bus hustler
Mar 14, 2019

Ha Ha Ha... YES!
I bought a 2080ti for ~$750 used almost 2 years ago and have been happy with my purchase, & I'm almost certainly going to be able to skip this entire generation anyway; being slightly behind the 3070 even by mid/end of the cycle isn't going to be a bad place at all. edit: guess it's more like 18mo

Or maybe the world will end or I'll win the lotto before then, who knows. Or get a raise (the least likely!)

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Even at the same price I would struggle between a 2080Ti and a 3070. The 3070 no doubt has some architectural improvements that have the potential to pay off down the road, but 11 vs 8GB of VRAM is nice and the bandwidth advantage is huge for the 2080Ti: 37.5% more memory bandwidth and 37.5% more memory is hard to pass up. I'd probably lean toward the 2080Ti from what I've seen so far. The 1% lows on the 3070 are not competitive with the 2080 Ti even if average framerates are. The biggest downside to a 2080Ti is that you're locked to DP 1.4, but I don't think a 3070 is a great 4k GPU anyway.

Rolo
Nov 16, 2005

Hmm, what have we here?

Zero VGS posted:

I usually use HWMonitor but now that I have the Mystery Dell 3080, both HWMonitor and NZXTCam are reading 0 watts.

HWMonitor does wattage? I can't get it to post anything but voltage or temp.
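If the usual monitoring tools won't show it, the NVIDIA driver will: `nvidia-smi` reports board power directly. A quick sketch (assumes `nvidia-smi` from the NVIDIA driver is on your PATH; the helper names are mine):

```python
import subprocess

def parse_power_draw(csv_output: str) -> list[float]:
    """Parse watt values from nvidia-smi's csv,noheader,nounits output."""
    return [float(line) for line in csv_output.splitlines() if line.strip()]

def read_gpu_power() -> list[float]:
    """Query the current board power draw (watts) of every NVIDIA GPU."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_power_draw(out)

if __name__ == "__main__":
    try:
        print(read_gpu_power())  # one value per GPU, in watts
    except (FileNotFoundError, subprocess.CalledProcessError):
        print("nvidia-smi not available on this machine")
```

`power.draw` is the whole-board figure the driver exposes (the same counter most monitoring tools read), so polling this in a loop gives a live readout even when HWMonitor draws a blank on a weird OEM card.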

v1ld
Apr 16, 2012

bus hustler posted:

I bought a 2080ti for ~$750 used almost 2 years ago and have been happy with my purchase, & I'm almost certainly going to be able to skip this entire generation anyway; being slightly behind the 3070 even by mid/end of the cycle isn't going to be a bad place at all. edit: guess it's more like 18mo

Or maybe the world will end or I'll win the lotto before then, who knows. Or get a raise (the least likely!)

I'd be sitting happy with a 2080Ti for this gen if I had one, but I'm one of those that missed buying the 1080ti before the crypto crunch hit and stuck with the 7970 for longer than I should have - though it was a brilliant card for all of its time.

This 5700 is doing great in every game I play so there's no absolute need to get another card right now. Waiting until I have games that can benefit from RTX is smart - but a well-priced 2080ti is going to be tempting, in part due to the 3GB extra mem as K8.0 said e: and because I have no need for 4K right now and won't for years.

v1ld fucked around with this message at 20:12 on Oct 30, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

v1ld posted:

but a well-priced 2080ti is going to be tempting, in part due to the 3GB extra mem as K8.0 said e: and because I have no need for 4K right now and won't for years.

If you have no intention of going 4k, your need for >8GB of VRAM is likely to be near zero. Comparing the 99th%ile frames might be a more interesting decision point, though with GSync that issue also largely is papered over.

Euphoriaphone
Aug 10, 2006

Cabbages and VHS posted:

mine went to "delayed" and I am not very optimistic, but we'll see:


If there weren't sixteen things in my life / the 2020 world to be much more alarmed about, I would find this launch debacle infuriating instead of a more or less equal mix of tilting and hilarious, because it is a dark comedy at the intersection of late stage capitalism, a mismanaged pandemic, and people with more money than sense (including myself, since I have a 3070 and 3080 on order at present and will absolutely buy a Navi card if I don't have one of those two in my possession when I have a window to snap up a Radeon).

You can check my earlier posts ITT but I had the same issue when I ordered a 3080 for pickup from BB. You can chat with a virtual agent who can tell you if it's a true delay. They told me it was just a notification the system sometimes sends, but not to worry about it. My card ended up coming earlier than my original confirmation even said.

runaway dog
Dec 11, 2005

I rarely go into the field, motherfucker.
jeeze, another batch of NIB 2080tis sold for $430 on ebay again this morning, uuuugh

runaway dog fucked around with this message at 20:37 on Oct 30, 2020

redreader
Nov 2, 2009

I am the coolest person ever with my pirate chalice. Seriously.

Dinosaur Gum
Look closely at the graphs: they confusingly list the same card twice, for RTX on vs off. So it looks like the 3070 is about 2-5% worse with RTX off and maybe 5-7% better than the 2080ti with RTX on at 1440p in Metro, while in Control the 3070 is slightly better or the same without RTX and a marginal amount better with RTX.
https://www.techpowerup.com/review/gigabyte-geforce-rtx-3070-gaming-oc/36.html

For people wanting that big ray-tracing boost at 1440p@144, the 3080 (or even the 3090, if money is no object) seems the way to go. That said, the 3070 looks to me about 3x faster than my current 980, but the same could be said for the 2080 ti. The 3080 just seems better, unlike the normal 10%-better-for-30%-more-money 70-vs-80-series difference. Given the numbers here, with RTX on the 3080 is 23% better than the 3070 at 1440p in Metro and 25% better in Control; with RTX off, the 3080 is 28% better in Control and 24% better in Metro.

The base price of the 3070 Founders is $500 and the 3080 is $700, so the 3080 is 40% more expensive.

Ray tracing on, 1440p average (Metro and Control): 3080 is 24% better.
Ray tracing off, 1440p average: 3080 is 26% better.

So the 3080 is 24% or 26% better than the 3070 depending on what you're looking at (RTX on vs off) at 1440p, is more future-proof due to more RAM, and costs 40% more.


Please someone check my numbers. I basically think this means that the 3070 and 3080 are both worth it at 1440p. Normally there's a clear value winner; I suppose the 3070 is still it, but there's no real argument against getting a 3080.
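Since you asked for a check, here's the arithmetic run straight (a quick sketch; the per-game percentages are the ones quoted above from TechPowerUp, and $500/$700 are the Founders MSRPs):

```python
# 3080 uplift over the 3070 at 1440p, per the TechPowerUp numbers quoted above.
uplift_rtx_on = {"metro": 23, "control": 25}   # % faster with RTX on
uplift_rtx_off = {"metro": 24, "control": 28}  # % faster with RTX off

avg_on = sum(uplift_rtx_on.values()) / len(uplift_rtx_on)
avg_off = sum(uplift_rtx_off.values()) / len(uplift_rtx_off)

# Founders Edition MSRPs.
msrp_3070, msrp_3080 = 500, 700
price_premium = (msrp_3080 - msrp_3070) / msrp_3070 * 100

print(f"avg uplift, RTX on:  {avg_on:.0f}%")        # 24%
print(f"avg uplift, RTX off: {avg_off:.0f}%")       # 26%
print(f"3080 price premium:  {price_premium:.0f}%")  # 40%
```

Both averages check out; the price premium is the only number that moves depending on whether you compare FE MSRPs or street prices.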

pzy
Feb 20, 2004

Da Boom!

4000 Dollar Suit posted:

jeeze, another batch of NIB 2080tis sold for $430 on ebay again this morning, uuuugh

Check the feedback... someone is trying to manipulate eBay sold prices. Those cards do not exist.

runaway dog
Dec 11, 2005

I rarely go into the field, motherfucker.
I guess it did seem off that when I clicked on them, ebay just said they couldn't find the auction. I figured that was because they were already sold, but every other sold item doesn't do that, so oh well.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

redreader posted:

Please someone check my numbers. I basically think this means that the 3070 and 3080 are both worth it at 1440p. Normally there's a clear value winner; I suppose the 3070 is still it, but there's no real argument against getting a 3080.

Yeah, that's about right, and part of why the 3080 is such a hot little item: its pricing has put it at almost a linear price:perf increase vs the 3070, which is uncommon; as you say, usually you're paying like 30% more for 10% more performance or something.

The 3070 isn't really intended to be notably better than the 2080Ti. It's supposed to give you very comparable performance at $500 instead of $1200. If you already have a 2080Ti, a 3070 is not a step up, and would be a pretty silly thing to consider moving to.

As far as 1440p goes, yeah, if you have a monitor that'll do 144Hz or more, you can use the extra horsepower of the 3080. If you don't, or don't really care about the difference between 100FPS and 125FPS, then the 3070 is a very competent card at a modest discount. Don't buy more card than you need, unless you really like the idea of holding on to a card for 3+ years for some reason.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

4000 Dollar Suit posted:

I guess it did seem off that when I clicked on them, ebay just said they couldn't find the auction. I figured that was because they were already sold, but every other sold item doesn't do that, so oh well.

Yeah, not sure what's exactly happening in the background, but they're absolutely fake. Same way there've been a few listings for 3080 and 3090's at MSRP or even below from the Philippines or whatever, but while they show up on searches, if you actually click on them they turn out to be invalid. Someone's trying to gently caress with eBay's metrics to drive prices down.

Actual 2080 Ti's are still going for $700-$900, depending.

Seamonster
Apr 30, 2007

IMMER SIEGREICH
Not everybody has a case that can handle 340 watts of heat just from the GPU. The 3070 is 60 watts less than a 2080ti, and while the 3080 stomps all the same, it uses 70 more watts to do so; in other words, a whole extra 6-pin power connector's worth.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

terrorist ambulance posted:

Is anyone who's got their 3080 or 3090 actually gaming with it at 4k, and if so how's it going?

Yeah. It's good, what do you want to know? Ampere has by far the greatest gains at 4K, of all resolutions. If you want to do 4K/120 and have it be viable for as long as possible you probably want a 3090, otherwise a 3080 is fine.

Taima fucked around with this message at 20:57 on Oct 30, 2020

terrorist ambulance
Nov 5, 2009

Taima posted:

Yeah. It's good, what do you want to know? Ampere has by far the greatest gains at 4K, of all resolutions. If you want to do 4K/120 and have it be viable for as long as possible you probably want a 3090, otherwise a 3080 is fine.

Just whether it's working as advertised. I'm a bit skeptical of its ability to do 4k 120, or even 4k 60, for long, given that games like Control and Tomb Raider are getting sub-60 fps benchmarks at 4k already with high settings and RTX.

I realize that those are demanding games and you can probably finesse performance some by fine-tuning details and using DLSS where possible, but if the RTX 3080 is really taxed at 4k, it's a thing to consider before putting down a bunch of money on a 4k monitor or whatever.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
4K high refresh max settings will remain "I don't care how much money I spend yearly" territory for years. 1440p sure, you can expect to buy a GPU like a 3080 and still get very acceptable high refresh max/near max settings performance in the newest games 2 or 3 years from now, but 4k does not work that way and won't for quite some time. Maybe the second gen of MCM GPUs or something will bring hideously overpowered GPUs that perform beyond all reason, but that's like 5+ years down the line.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

terrorist ambulance posted:

Just whether it's working as advertised. I'm a bit skeptical of its ability to do 4k 120, or even 4k 60, for long, given that games like Control and Tomb Raider are getting sub-60 fps benchmarks at 4k already with high settings and RTX.

I realize that those are demanding games and you can probably finesse performance some by fine-tuning details and using DLSS where possible, but if the RTX 3080 is really taxed at 4k, it's a thing to consider before putting down a bunch of money on a 4k monitor or whatever.

Thing is, I wouldn't expect too many games to be coming out with RTX and not DLSS. I'm sure there'll be a few, but they seem to mostly be going hand-in-hand so far. And that takes Control back up into the 90's at 4k w/RTX for a 3080.

But yeah, you won't be doing 4k ULTRA MAXXX w/RTX on AAA titles 2-3 years from now and expect to be >60Hz, just like you can't expect to be >60Hz at 4k ULTRA MAXXX even without RTX on a 2080 Ti today: progress marches on, and your 2 year old card is never gonna be able to cut it against the top-end super-ultra-max settings.

runaway dog
Dec 11, 2005

I rarely go into the field, motherfucker.
I just realized, the amd cards probably aren't going to work with the g-sync hardware in my monitor huh?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

4000 Dollar Suit posted:

I just realized, the amd cards probably aren't going to work with the g-sync hardware in my monitor huh?

Depends. If you have an older "true" GSync monitor, with the GSync hardware and all and no mention whatsoever of FreeSync support...no, it probably won't. If you got a "GSync compatible / certified" monitor in the last year or two, though, that does mention FreeSync (these are now much more common than the "true" GSync monitors), then you'll be ok.

repiv
Aug 13, 2009

DrDork posted:

Thing is, I wouldn't expect too many games to be coming out with RTX and not DLSS. I'm sure there'll be a few, but they seem to mostly be going hand-in-hand so far.

RT is beginning to show up in AMD sponsored games like DIRT 5 and those of course won't have DLSS

...but given what we know about RDNA2's RT throughput, Ampere will probably perform better anyway :shrug:

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

repiv posted:

RT is beginning to show up in AMD sponsored games like DIRT 5 and those of course won't have DLSS

...but given what we know about RDNA2's RT throughput, Ampere will probably perform better anyway :shrug:

Yeah, it'll be interesting to see what the quality is like; it'll be hard for AMD to really showcase "full" RT like NVidia has if their cards simply can't keep up, so we may end up with games like that not really needing DLSS since the RT quality (and corresponding performance hit) will be forced to stay on the low side of things.


Cabbages and VHS
Aug 25, 2004

Listen, I've been around a bit, you know, and I thought I'd seen some creepy things go on in the movie business, but I really have to say this is the most disgusting thing that's ever happened to me.

Euphoriaphone posted:

You can check my earlier posts ITT but I had the same issue when I ordered a 3080 for pickup from BB. You can chat with a virtual agent who can tell you if it's a true delay. They told me it was just a notification the system sometimes sends, but not to worry about it. My card ended up coming earlier than my original confirmation even said.

Thanks, but I did chat with them, and reading between the lines, Gigabyte and/or Nvidia hosed them on allocation; they don't know how badly yet. So they suggested keeping my order for the week, and they're going to cancel it for me if they don't have a card at that point.

:shrug:

I continue to see this as mostly a proxy snipe hunt to keep my mind off USPOL, and I find it a little hard to believe that I won't have a new card by, say, early 2021 in the worst case. I guess I'm equal parts tilted and amused by the whole debacle; the last time I tried to get in-demand hardware anywhere close to release was the Xbox 360 in 2005, and at that point I just called a couple local Target stores every morning before work for like 4 days until one of them had one. It was before the world of widespread botnet scalpers, discords for tracking stock, etc. So that side of things is kind of interesting and silly to me.

repiv posted:

RT is beginning to show up in AMD sponsored games like DIRT 5 and those of course won't have DLSS

DIRT Rally 2 is one of the best VR experiences I own, and the lack of VR in DIRT 5 really tilts me

Cabbages and VHS fucked around with this message at 21:45 on Oct 30, 2020
