VorpalFish
Mar 22, 2007
reasonably awesometm

27GL850 is what I got since the 83 was out of stock. It's nice, I like it a lot. Probably gonna ride it until microled is actually viable.

Indiana_Krom
Jun 18, 2007
Net Slacker

K8.0 posted:

There are still big advantages to capping frame rate. The delay from framerate > refresh rate + vsync is always going to be significant. Going to a higher refresh rate makes it less bad, but it does not quite work the way you assume it does. And even if you can never hit your max refresh rate, latency is much lower when the GPU isn't fully loaded. Please stop posting this ignorant garbage. If you don't care about latency, that's fine, but don't spread misinformation. Gaining even 10ms of latency advantage is always going to be a big deal. That represents a significant shift in your percentile in human reaction speed, even if we ignore the benefits that aren't just straight up reaction time.
The discussion is about 240 Hz monitors, but your linked videos were taken on a 144 Hz monitor, with obvious cherry-picking of the data and no attention paid to where in the input-to-display pipeline the latency is actually happening. What you're seeing is the impact of CPU render-ahead: more frames get buffered ahead by the CPU in uncapped operation, regardless of GPU utilization. Every single example in those videos is the result of CPU render-ahead. The in-game cap stops the CPU from buffering up as much work ahead of the GPU in those specific games, which produces a significant latency reduction; that effect is display-agnostic and will happen in those games any time the CPU is not the limit, regardless of refresh rate. It's also game-specific, and not the result of a blanket cap, because RTSS actually makes the latency WORSE than uncapped in these same games:
https://www.youtube.com/watch?v=VtSfjBfp1LA
Also observe how Counter-Strike behaves in this video: caps universally increase latency all the way up to 650 FPS, and the display makes no difference at all.
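
(A toy model of the render-ahead effect described above, for anyone who wants to see it in isolation. This is not any real game or API; a bounded producer/consumer queue stands in for the CPU feeding frames to the GPU, the queue depth stands in for how many frames the CPU is allowed to buffer ahead, and the timing numbers are purely illustrative.)

```python
import queue
import threading
import time

FRAME_TIME = 1 / 60      # the "GPU"/display presents one frame every ~16.7 ms
CPU_FRAMES = 240         # how many frames the "CPU" thread will generate

def simulate(buffer_depth):
    """Average input-to-present latency (ms) for a given render-ahead depth."""
    frames = queue.Queue(maxsize=buffer_depth)  # bounded: the CPU blocks when full
    latencies = []

    def cpu():
        for _ in range(CPU_FRAMES):
            sampled = time.perf_counter()        # "input" is sampled here
            frames.put(sampled)                  # blocks if the buffer is full
        frames.put(None)                         # sentinel: no more frames

    def gpu():
        while True:
            sampled = frames.get()
            if sampled is None:
                break
            time.sleep(FRAME_TIME)               # pretend to render/scan out one frame
            latencies.append(time.perf_counter() - sampled)

    producer = threading.Thread(target=cpu)
    consumer = threading.Thread(target=gpu)
    producer.start(); consumer.start()
    producer.join(); consumer.join()
    return 1000 * sum(latencies) / len(latencies)

for depth in (1, 3):
    print(f"render-ahead depth {depth}: ~{simulate(depth):.0f} ms average latency")
```

In this toy model the average input-to-present latency works out to roughly (depth + 2) display frames, so shrinking the buffer from 3 to 1 saves about two frame times; a real engine's bookkeeping is messier, but the direction is the same.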

SwissArmyDruid
Feb 14, 2014

by sebmojo
Just as a further reminder of how time has lost all meaning in this hellscape of plague and idiocy, this talk about microLEDs and FALD poked loose a stray memory about something I saw at <checks notes> CES last year?! It seemed so long ago...

https://www.youtube.com/watch?v=STdZ_kiHYEY

I'm surprised that displays like this aren't clogging store shelves, bypassing the FALD and microLED approaches entirely. And if a lower-tier manufacturer like Hisense can be on it (it's not even their idea; Dolby was showing dual-layer HDR displays well over a decade ago), why aren't the bigger names like Samsung and LG?

I'm sure the input latency must be entirely unsuitable for gaming, but think of the benefits for movie watching and desktop use.

SwissArmyDruid fucked around with this message at 22:42 on Aug 7, 2020

Chimp_On_Stilts
Aug 31, 2004
Holy Hell.
Are there any best guesses as to power consumption for the new 30xx GPUs?

I'm trying to plan a new system and am thinking about power supplies. 850 watts should be sufficient, right?

repiv
Aug 13, 2009

That Hisense is expensive ($2600 in China) and barely qualifies as HDR with a peak of 500 nits, so the technology isn't really a slam dunk yet.

I think the main problem is that to forgo FALD you need an extremely bright, hot, and power-hungry backlight across the entire panel, even if you only need a small bright zone. Reintroducing FALD fixes that but drives the cost up even more.

There's a professional display that uses dual-layer LCD tech and reaches 1000 nits, but it consumes *checks notes* two hundred and eighty watts for a 31" panel to get there:

https://www.flandersscientific.com/XM311K/

SwissArmyDruid
Feb 14, 2014

by sebmojo

repiv posted:

That Hisense is expensive ($2600 in China) and barely qualifies as HDR with a peak of 500 nits, so the technology isn't really a slam dunk yet.

I think the main problem is that to forgo FALD you need an extremely bright, hot, and power-hungry backlight across the entire panel, even if you only need a small bright zone. Reintroducing FALD fixes that but drives the cost up even more.

There's a professional display that uses dual-layer LCD tech and reaches 1000 nits, but it consumes *checks notes* two hundred and eighty watts for a 31" panel to get there:

https://www.flandersscientific.com/XM311K/

Ah, the XD9G is already out over there? The things I was reading said that it was going to come out Q3.... which it is, already, jesus, time has lost all meaning.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Indiana_Krom posted:

The discussion is about 240 Hz monitors, but your linked videos were taken on a 144 Hz monitor, with obvious cherry-picking of the data and no attention paid to where in the input-to-display pipeline the latency is actually happening. What you're seeing is the impact of CPU render-ahead: more frames get buffered ahead by the CPU in uncapped operation, regardless of GPU utilization. Every single example in those videos is the result of CPU render-ahead. The in-game cap stops the CPU from buffering up as much work ahead of the GPU in those specific games, which produces a significant latency reduction; that effect is display-agnostic and will happen in those games any time the CPU is not the limit, regardless of refresh rate. It's also game-specific, and not the result of a blanket cap, because RTSS actually makes the latency WORSE than uncapped in these same games:
https://www.youtube.com/watch?v=VtSfjBfp1LA
Also observe how Counter-Strike behaves in this video: caps universally increase latency all the way up to 650 FPS, and the display makes no difference at all.

The monitor is magic, but also the monitor is irrelevant - what? You're so all over the place that it's hard to even keep track. Since you've repeatedly ignored it every time I've pointed out how and why you're wrong in the past, and this time haven't even critically viewed the videos I linked or the one you linked, I'm not going to bother writing a huge step-by-step effortpost breaking down every way you're wrong and why. I guess I'll just have to settle for calling you out every time you post out-of-context and/or incorrect nonsense about why you would want to use a frame limiter.

For anyone else who is actually interested: yes, you can get the lowest latency possible by running vsync off, and with vsync off it is sometimes preferable to run a completely uncapped framerate. I used to fall into the camp that says vsync off is always best for competitive play. However, after playing around with both vsync off and on, it's clear that, at least for some people, your actual reactions are worse with vsync off, because tearing hurts your visual perception by more milliseconds than the delays from capping cost you.

Once you have vsync on, you basically always want a framerate cap, for at least one of several reasons: to keep yourself inside the VRR range and out of plain vsync (which carries a very significant latency penalty that does not entirely scale with your monitor's refresh rate); to avoid GPU backpressure/CPU render-ahead (this is where in-game limiters tend to excel, since they can control the game logic and input timing, not just the rendering pipeline); and to deal with frame pacing issues (this is where some in-game limiters fall flat on their face, and sometimes it may be better to pay a small latency penalty to avoid wild swings in visual pacing and input latency; also, many single-player and less competitive games have no built-in frame limiter at all). In-game frame limiters are (mostly) getting better all the time, but there are still times when you want to use the Nvidia frame limiter. There is no longer a reason to use RTSS to cap frames, though; the Nvidia limiter has replaced it.
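
(To make the point about in-game limiters controlling input timing concrete, here's a minimal sketch, not any engine's actual loop: sample_input(), simulate_and_render(), and the 141 FPS cap value are all made-up stand-ins. The only thing it illustrates is where the wait sits relative to input sampling.)

```python
import time

CAP_FPS = 141                 # e.g. a cap a few FPS under a 144 Hz refresh (arbitrary example value)
FRAME_BUDGET = 1.0 / CAP_FPS

def sample_input():
    """Stand-in for reading mouse/keyboard state."""
    return time.perf_counter()

def simulate_and_render(input_state):
    """Stand-in for game simulation + draw calls."""
    pass

def in_game_limited_frame(deadline):
    # Limiter runs inside the game loop: wait FIRST, then sample input,
    # so the sample is as fresh as possible for the frame it ends up in.
    while time.perf_counter() < deadline:
        time.sleep(0)
    simulate_and_render(sample_input())
    return deadline + FRAME_BUDGET

def externally_limited_frame(deadline):
    # Limiter sits outside the game: the frame is built from an input sample
    # taken BEFORE the wait, so the wait is pure added latency.
    simulate_and_render(sample_input())
    while time.perf_counter() < deadline:
        time.sleep(0)
    return deadline + FRAME_BUDGET

# Tiny driver just to show the loop shape.
deadline = time.perf_counter() + FRAME_BUDGET
for _ in range(5):
    deadline = in_game_limited_frame(deadline)
```

Real external limiters are smarter than that worst case, but they can't move the point where input gets sampled, which is the structural advantage in-game caps have.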

Nothing about the Hardware Unboxed video debunks BattleNonsense's videos (or his primary focus for the last four years or so), because he didn't understand what BattleNonsense was testing and why. He tested a bunch of irrelevant stuff, running vsync-off tests on a technique that has more universal relevance when vsync is on, and didn't even clearly label his tests. At least half of what he tested is just validation of behavior we've known about for 20+ years if you're fine with tearing, and it doesn't apply when you won't accept tearing.

K8.0 fucked around with this message at 23:14 on Aug 7, 2020

forest spirit
Apr 6, 2009

Frigate Hetman Sahaidachny
First to Fight Scuttle, First to Fall Sink


So in late March I bought a used EVGA 2080 Ti for 1300, which I thoroughly enjoyed over the pandemic. I just sold it for 1400 and bought an EVGA 2080 Super, the plan being to use the Step Up program and get a 3080/3090 when they launch.

I'm pretty sure I'm doing this right

apart from the interim period right now where I'm using a gtx 1050. It's so cute!

Indiana_Krom
Jun 18, 2007
Net Slacker
I'm thinking this is a communication issue instead of one of us being right or wrong.

Your videos both show exactly the same thing: the CPU at 300 FPS in Overwatch/PUBG/BFV has worse latency than the CPU at <something closer to the GPU or display FPS>. You are looking at the VRR window and vsync penalties, but they aren't the biggest offender in those videos; the CPU cap is:
CPU = 60, no GPU limit == 50/45/35 ms latency.
CPU = 300, GPU limited to 85 FPS via 200% resolution scale == 85/72/60 ms latency. *Still within the 30-144 Hz VRR window*, so this is not the fault of vsync.
The game logic is running asynchronously to the GPU/display, and doing it very badly, so it adds a lot of latency that is entirely on the game engine. In other words, you should absolutely cap Overwatch, but the VRR window and vsync aren't as important to it as capping below your GPU utilization limit, which is where you really get punished. Vsync also triggers this behavior, just not as severely as hitting the GPU limit, probably because vsync in those videos is 144 Hz. At 60/85 Hz it might be just as bad as being GPU-limited to 60/85 FPS, which probably deserves further investigation (I don't have Overwatch).

All I've really been saying is that with a 240+ Hz FreeSync/G-Sync monitor, what most people should do is just turn on global vsync + low latency mode and forget about frame capping, because it's a lot of effort that makes things worse more than half the time and doesn't return significant benefits when you're already all the way up at 240+ Hz, even when it does work. The benefit at 60 Hz can be massive and should absolutely be a priority; at 144 Hz it's there and worth a quick check if a particular game needs it (most games don't); but the higher you go past those refresh rates, the less it matters.
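
(Putting rough numbers on that diminishing-returns point; the "two frames of buffering" figure below is just an illustration, but the frame times are exact.)

```python
# Frame time per refresh rate, and what trimming (say) two frames' worth of
# buffering is worth at each rate.
for hz in (60, 144, 240, 360):
    frame_ms = 1000 / hz
    print(f"{hz:>3} Hz: {frame_ms:5.1f} ms/frame -> "
          f"~{2 * frame_ms:4.1f} ms saved by removing two buffered frames")
```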

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

Chimp_On_Stilts posted:

Are there any best guesses as to power consumption for the new 30xx GPUs?

I'm trying to plan a new system and am thinking about power supplies. 850 watts should be sufficient, right?

I'm hoping so; I just upgraded my older 650 watt Corsair to a Seasonic Focus PX-850.

If 850 watts isn't enough for a 3080Ti, Nvidia can go gently caress themselves.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
The new 12-pin connector that was rumored, and later confirmed, for Ampere is supposed to be able to deliver 600W, plus another 75W from the PCI-e slot.

In comparison, a card that needs two 8-pin connectors is rated to draw up to 300W through them.

Make of that what you will.
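
(Back-of-the-envelope version of those budgets, taking the quoted 600W figure at face value and using the standard 150W-per-8-pin and 75W-slot ratings.)

```python
PCIE_SLOT_W = 75        # power deliverable through the PCIe x16 slot
EIGHT_PIN_W = 150       # rated power per 8-pin PCIe power connector
TWELVE_PIN_W = 600      # rumored rating of the new 12-pin connector

print("two 8-pin + slot :", 2 * EIGHT_PIN_W + PCIE_SLOT_W, "W")   # 375 W
print("one 12-pin + slot:", TWELVE_PIN_W + PCIE_SLOT_W, "W")      # 675 W
```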

FuturePastNow
May 19, 2014


Maybe the 3090 really is a dual GPU card, then lol

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

gradenko_2000 posted:

The new 12-pin connector that was rumored, and later confirmed, for Ampere is supposed to be able to deliver 600W, plus another 75W from the PCI-e slot.

In comparison, a card that needs two 8-pin connectors is rated to draw up to 300W through them.

Make of that what you will.

There is absolutely zero chance consumer Ampere is going to draw 600W. There's little reason to think it'll even draw 300W unless overclocked. We already know TSMC's 7nm process is pretty good power-wise, and Samsung's 8nm process isn't much worse, especially since its primary customers are cell phone and mobile chips. The only possible exception would be if they really did drop a dual-GPU card in there, which seems highly unlikely given that SLI is dead.

This sounds like a "building for the future" thing, where the future is giant gently caress-off datacenter compute cards.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

DrDork posted:

There is absolutely zero chance consumer Ampere is going to draw 600W. There's little reason to think it'll even draw 300W unless overclocked. We already know TSMC's 7nm process is pretty good power-wise, and Samsung's 8nm process isn't much worse, especially since its primary customers are cell phone and mobile chips. The only possible exception would be if they really did drop a dual-GPU card in there, which seems highly unlikely given that SLI is dead.

This sounds like a "building for the future" thing, where the future is giant gently caress-off datacenter compute cards.

600W @ 12V is 50A; they'd be better off doing 48V and two-stage conversion. drat, that's a lot of copper.
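
(The current math behind that complaint, plain P = V x I and nothing specific to any real connector: higher voltage means proportionally less current, and therefore thinner conductors for the same power.)

```python
power_w = 600
for volts in (12, 48):
    print(f"{power_w} W at {volts} V -> {power_w / volts:.1f} A")
# 600 W at 12 V -> 50.0 A
# 600 W at 48 V -> 12.5 A
```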

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Malcolm XML posted:

600W @ 12V is 50A; they'd be better off doing 48V and two-stage conversion. drat, that's a lot of copper.

Yeah, it's a lot of power. But it's not entirely without precedent: the Radeon 295x2 claimed 500W on its spec sheet (with actual use being closer to 600W at times), at 28A per 8-pin connector, which technically violates PCIe standards, but apparently AMD was mostly able to get away with it on the grounds that most 800+W PSUs were able to push that much power, regardless of the specs.
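
(For scale on that "technically violates PCIe standards" aside: the 8-pin auxiliary connector is nominally rated for 150W, i.e. about 12.5A at 12V, so the quoted 28A figure is more than double the rating. The 28A number itself is just the figure from the post above.)

```python
VOLTS = 12
spec_amps = 150 / VOLTS      # ~12.5 A: nominal 150 W per 8-pin connector
claimed_amps = 28            # figure quoted above for the 295x2
print(f"spec ~{spec_amps:.1f} A vs claimed {claimed_amps} A "
      f"(~{claimed_amps * VOLTS} W per connector)")
```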

And for those of you keeping score in terms of cooling, AMD slapped a single 38x120mm AIO on there (two pumps/plates, of course) and was able to keep that 600W monstrosity at ~70C under load at max clocks, while the 290Xs it was based on would routinely hit 94C at similar clocks, and it was also 15dB quieter under load (because the 290Xs had impressively poo poo coolers).

So for anyone worrying that a single 120mm rad looks "small" for a GPU...don't worry about it.

To be clear, I still don't think NVidia is going to release a dual-GPU card given they've basically told everyone that SLI is dead and to please stop bothering them about it. I don't expect to see any multi-GPU gaming product out of them until chiplets are ready to go, and that's not really the same thing.

shrike82
Jun 11, 2005

https://videocardz.com/newz/nvidia-geforce-rtx-30-ampere-series-to-be-announced-on-september-9th

quote:

According to GamersNexus, NVIDIA is set to announce its GeForce RTX 30 series based on Ampere architecture on September 9th.

9/9.... 3090??

Soul Glo
Aug 27, 2003

Just let it shine through
I think this is the right place to ask a question which might not have a good answer just yet, considering, but here goes:

Should I expect to be able to play PS5/Xbox Series X generation games on a 2070 Super (paired with an i7 9700)? My current monitor is a 1080p/240 Hz panel, so my understanding is that I should be just fine, since the next consoles are targeting 4K with GPUs that stack up against the 2080 and 2070 in closed, optimized systems. I know I probably should be on a 1440p panel, and might upgrade, but I plan to use this 1080p panel for the time being.

Right now Googling just gets me articles citing teraflop differences, which doesn't tell me much about what to expect from the next gen at lower resolutions. I know future-proofing is dumb and bad and not possible, but I need to replace a computer earlier than expected, so I might be upgrading my machine now rather than next year, and I can get a good deal on a 2070S.

ufarn
May 30, 2009
3090 also being the available stock.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Soul Glo posted:

I think this is the right place to ask a question which might not have a good answer just yet, considering, but here goes:

Should I expect to be able to play PS5/Xbox Series X generation games on a 2070 Super (paired with an i7 9700)? My current monitor is a 1080p/240 Hz panel, so my understanding is that I should be just fine, since the next consoles are targeting 4K with GPUs that stack up against the 2080 and 2070 in closed, optimized systems. I know I probably should be on a 1440p panel, and might upgrade, but I plan to use this 1080p panel for the time being.

Right now Googling just gets me articles citing teraflop differences, which doesn't tell me much about what to expect from the next gen at lower resolutions. I know future-proofing is dumb and bad and not possible, but I need to replace a computer earlier than expected, so I might be upgrading my machine now rather than next year, and I can get a good deal on a 2070S.

Will you? Yes. They will not release PC games that require GPUs that only 10% or fewer of their users own. At 1080p I'd expect you'll generally have everything turned up to max and be fine. At 1440p you might be making some decisions between high framerates and a couple of options (particularly RTX, especially if there's no DLSS option), and at 4K there are probably more of those decisions.

I don't think you'll be in a position where you'll "need" to upgrade anything for a while.

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

ufarn posted:

3090 also being the available stock.

How long do third parties like EVGA take to get cards out once they’re announced?

I’d like to stick with them because I’ve never had a bad experience with them and the warranty is rad.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
It's a mixed bag, and depends on what NVidia decides to do. There have been launches in the past where partners have had stuff out the same day it's announced. NVidia has also enforced a short (two week, IIRC?) product ban for other launches to push their FE models. No one really knows. "As soon as they're allowed to" is about the best we can say.

ufarn
May 30, 2009
Any thoughts on what to make of the slim eight-day launch window between announcement and availability? Sounds a little like a paper launch, no?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

ufarn posted:

Any thoughts on what to make of the slim eight-day launch window between announcement and availability? Sounds a little like a paper launch, no?

That is the expectation, yes. Though not because of any gap between announcement and availability, but rather because everyone is having capacity and shipping problems right now, and GPUs for the last several years have not generally launched with much immediate volume.

Geemer
Nov 4, 2010



Even if there wasn't a pandemic, the last couple of generations it's felt like by the time there was actual stock, people were already being told to wait for the next generation that was right around the corner.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
The rumor mill says September 9th announcement / September 17th launch.

https://www.gamersnexus.net/news-pc/3609-hw-news-rtx-3000-release-date-amd-x86-marketshare-intel-leaks

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Even if relative stock for the 3080Ti is better than the 2080Ti launch, I’m assuming that quarantine will cause feverish, insane demand at almost any price point they care to set.

My plan is to try to grab an FE the nanosecond that’s possible, and if that fails, keep my ear to the ground on a few forums to track partner boards as they arrive. I fully expect this to be a fuckin’ battle.

I sat out the 2080Ti, what were the winning strategies then?

I got a 1070 right when they released by ocd trolling amazon listings / nowinstock

Taima fucked around with this message at 23:26 on Aug 8, 2020

GRINDCORE MEGGIDO
Feb 28, 1985


Taima posted:

Even if relative stock for the 3080Ti is better than the 2080Ti launch, I’m assuming that quarantine will cause feverish, insane demand at almost any price point they care to set.

My plan is to try to grab an FE the nanosecond that’s possible, and if that fails, keep my ear to the ground on a few forums to track partner boards as they arrive. I fully expect this to be a fuckin’ battle.

I sat out the 2080Ti, what were the winning strategies then?

I got a 1070 right when they released by ocd trolling amazon listings / nowinstock

I'm gonna wait a few months and not give a poo poo :dukedoge:

FuturePastNow
May 19, 2014


Wonder what the used market for 2080s will look like after the release.

CaptainSarcastic
Jul 6, 2013



I'm starting to think I should stop having Steam display my FPS in every game and only start worrying about FPS when I notice visual distortions. I've been catching up on some older games where my Ryzen 5 + 2070 Super are way overpowered, and the games lock in at 60 FPS despite showing the in-game target framerate as 143 FPS, and they still look and play perfectly.

I'm wondering if the habit I've developed over the last couple of years of watching my framerate number, instead of going by how the game actually looks and plays, has become counterproductive. Training myself to rely on subjective experience instead of objective measurement might not be an easy adjustment, though.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Ignorance is bliss. There are plenty of issues I wouldn't have batted an eyelid at a decade ago that now will immeasurably irritate me.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
And I'm saying this as someone with outdated hardware who's happy to get a consistent 60 at 1080p.

ufarn
May 30, 2009
Honestly, it's 2020; why doesn't Steam have a frametime graph at this point?

MikeC
Jul 19, 2004
BITCH ASS NARC

Taima posted:

Even if relative stock for the 3080Ti is better than the 2080Ti launch, I’m assuming that quarantine will cause feverish, insane demand at almost any price point they care to set.

My plan is to try to grab an FE the nanosecond that’s possible, and if that fails, keep my ear to the ground on a few forums to track partner boards as they arrive. I fully expect this to be a fuckin’ battle.

I sat out the 2080Ti, what were the winning strategies then?

I got a 1070 right when they released by ocd trolling amazon listings / nowinstock

Why the desperation? Cyberpunk is releasing in mid-November. The 3070 is on track to release in October and the 3060 shortly afterwards. AMD is going to break its back to get mid-range RDNA 2.0 out the door before Thanksgiving (US).

The winning strategy for the 2080Ti era was not to buy the RTX 2XXX series at all.

queeb
Jun 10, 2004

m



I'm just hoping to get into the EVGA Step Up queue with the 2070 I just grabbed and wait like 6-8 months until they can send me a 3080.

shrike82
Jun 11, 2005

Odd to rush to get an FE if you're getting the highest end card.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

shrike82 posted:

Odd to rush to get an FE if you're getting the highest end card.

Not really. The 20-series FEs were legitimately good cards. Sure, you can get AIB ones with more bling, or more stripped-down ones if you enjoy lovely thermals, but if you're already accepting that you're going to be paying $1200+ for it, and an FE is available while AIBs are not (which wouldn't surprise me), then grabbing one at least gets you a decent card for the next few years. If you wait to play the AIB game, you might get a good card at a slight discount from the FE (and in EVGA's case get excellent support out of the deal, too), or you might end up getting sniped out of the market by bots and have to either pay insane prices or wait weeks to months for inventory.

Broose
Oct 28, 2007
I'm eager to replace my 970, but I've never paid attention to a new generation's release until the 2xxx gen. I've been saving money over the entire generation since they didn't seem like a good idea when they released. Should I try to get my hands on a FE card? Are they worse than third-party cards? Do they have worse warranties? From the way people post about them, it doesn't seem like they are held in high regard, but I don't know exactly why.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
Getting the cooling apparatus off the FE 2xxx series stressed out Tech Jesus more than I think I've ever seen him before, so that's something to take into consideration.

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

Broose posted:

I'm eager to replace my 970, but I've never paid attention to a new generation's release until the 2xxx gen. I've been saving money over the entire generation since they didn't seem like a good idea when they released. Should I try to get my hands on a FE card? Are they worse than third-party cards? Do they have worse warranties? From the way people post about them, it doesn't seem like they are held in high regard, but I don't know exactly why.

My understanding is FE coolers have typically been kind of lovely.

repiv
Aug 13, 2009

BIG HEADLINE posted:

Getting the cooling apparatus off the FE 2xxx series stressed out Tech Jesus more than I think I've ever seen him before, so that's something to take into consideration.

Isn't it disassembling the FE 2xxx cooler that's difficult, if you want to replace the fans or whatever? I don't think separating it from the board is that problematic.
