Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

K8.0 posted:

Every card this gen other than the 4090 has a name two slots above what its hardware corresponds to. The 4070 is really a 4060, it should be $300ish. Consumers should not be giving them credit for this renaming bullshit they're pulling.

In what generation did a xx60 card meet/beat a xx80 from the previous generation like the 4070 (that you claim is a 4060) meets/beats a 3080?

The 4070 is a sidegrade from the 3080. The 1060 was a step below the 980, the 2060 below the 1080, the 3060 a (big) step down from the 2080S.

Lockback fucked around with this message at 23:31 on Apr 18, 2023

Google Butt
Oct 4, 2005

Xenology is an unnatural mixture of science fiction and formal logic. At its core is a flawed assumption...

that an alien race would be psychologically human.

chaleski posted:

If the 4070 drops to about $500 I'll get one, otherwise go to hell Nvidia

Same but $450

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

It's still super funny to me that the 3050 is worse than the 2060

FuturePastNow
May 19, 2014


The 1630 is worse than the 1050 too

hobbesmaster
Jan 28, 2008

change my name posted:

It's still super funny to me that the 3050 is worse than the 2060

2060 was in active production and $100 less than the 3050 at launch too.

Bondematt
Jan 26, 2007

Not too stupid

hobbesmaster posted:

2060 was in active production and $100 less than the 3050 at launch too.

Don't worry, they fixed that glitch!

Branch Nvidian
Nov 29, 2012



FuturePastNow posted:

The 1630 is worse than the 1050 too

Yeah, and I'm worse than my dad, what's your point?

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Lockback posted:

In what generation did a xx60 card meet/beat a xx80 from the previous generation like the 4070 (that you claim is a 4060) meets/beats a 3080?

The 4070 is a sidegrade from the 3080. The 1060 was a step below the 980, the 2060 below the 1080, the 3060 a (big) step down from the 2080S.

It's not about performance, it's about the specs of the hardware. Physical size of the chip, size relative to flagship, core count relative to flagship, bus width, etc, all point consistently to Nvidia just loving consumers and grossly overcharging compared to all prior generations. The "4070" has 36% the core count of the 4090. The 3060 had 34% of the 3090, the 2060 had 44% of the 2080 Ti, the 1060 had 36% of the 1080 Ti, the 960 had 36% of the 980 Ti, the 760 had 40% of the 780 Ti... how far back do you want me to go? Especially the die size tells you what the product actually is and roughly what it cost Nvidia to make in relative terms.
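
(A quick sanity check on those ratios, using the commonly listed CUDA core counts — the specific counts below are assumed, not quoted from the post:)

```python
# Back-of-envelope check of the cut-down-card vs flagship core count ratios.
# Core counts are the commonly listed figures for each card (assumed).
cards = {
    "4070 vs 4090":    (5888, 16384),
    "3060 vs 3090":    (3584, 10496),
    "2060 vs 2080 Ti": (1920, 4352),
    "1060 vs 1080 Ti": (1280, 3584),
    "960 vs 980 Ti":   (1024, 2816),
    "760 vs 780 Ti":   (1152, 2880),
}

for pair, (small, flagship) in cards.items():
    print(f"{pair}: {small / flagship:.0%} of the flagship's cores")
```

That prints roughly 36% / 34% / 44% / 36% / 36% / 40%, matching the figures in the post.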

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

K8.0 posted:

It's not about performance, it's about the specs of the hardware.

What?? Of course it's about performance. No one cares what the BoM is, you charge based on what kind of performance the card is able to deliver. That is a very strange "No true Scotsman" type argument.

some dillweed
Mar 31, 2007

The 1060 and 980 are pretty similar in performance if you check benchmarks, unless you're talking about the cut down 3 GB version. Also, why exactly are you defending Nvidia's overpricing of their products? This is a weird argument.

Shipon
Nov 7, 2005

Grog posted:

The 1060 and 980 are pretty similar in performance if you check benchmarks, unless you're talking about the cut down 3 GB version. Also, why exactly are you defending Nvidia's overpricing of their products? This is a weird argument.

Comparing anything to the 10 series is pretty unfair, because that generation was by far the biggest outlier in value proposition. No poo poo they're not gonna make that mistake again...

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
1080ti was definitely an aberration and not a trend-setter. I think I read here that nvidia made the 1080ti quite strong because they thought the 500 series were going to be competitive (they weren't)

ijyt
Apr 10, 2012

Zedsdeadbaby posted:

1080ti was definitely an aberration and not a trend-setter. I think I read here that nvidia made the 1080ti quite strong because they thought the 500 series were going to be competitive (they weren't)

bold of them to admit that they'd have left performance on the table if they could

mobby_6kl
Aug 9, 2009

by Fluffdaddy


1060 > 980
2060 > 1080
3060 > 2070
4060 > 3060???

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
I'm having a hell of a time understanding the intersection of VRR and GSync. My monitor is a Samsung S95B QD-OLED. It officially supports 120 Hz, but you can overclock it to 144 Hz.

If I set VSync to enabled, whether in-game or in the Nvidia Control Panel, it caps the fps back down to 120 for some reason. I have applied the recommended CRU settings (raising the HDMI 2.1 and FreeSync VRR ranges to 144 Hz and such) and set up the custom resolutions in the Nvidia Control Panel.

If I turn off VSync it does 144/4K no problem, but there is clearly tearing at times, and frankly games are unplayable without GSync after you get used to it.

Any ideas here? Literally everything works fine except:

1) VSync being on caps at 120
2) VRR doesn't appear to be activating above 120

These two things feel connected, right? And if I have to set the default limits back to 120 Hz, that won't ruin my life; the difference between 120 and 144 is honestly pretty difficult to notice, even for someone who really cares about image fidelity. But at the same time, I'm stumped and just want it to work right.

e: Btw, as a random aside, I can't believe how amazing the 7800X3D plays with a 4090. For example it completely unlocked Returnal to the point where I can set everything including ray tracing at Epic level (albeit with DLSS Quality) and it maintains capped 120/4K with 109 fps 1% lows!!! wtf man. So good.

sauer kraut posted:

Try turning vsync off everywhere, and setting the Nvidia driver fps limiter to 140 globally.
If you get tearing that way, the overclocking disables gsync and should be avoided, but I've never used such a beast display myself so :shrug:

Vsync plus Gsync generally just means that if you go above max display fps, good old vsync kicks in and throttles the card. But only then.

Unfortunately I've tried the few permutations I can think of: global VSync on but off in-game, global VSync off but on in-game, and global and in-game VSync both off and both on. That is combined with an Nvidia Control Panel FPS limit of 140. Could that be an issue? Maybe I should try not limiting the FPS in the control panel, but my understanding is that a limiter is borderline required for VRR to behave right near the cap...?

It really, really feels to me like VRR just isn't kicking in above 120fps despite CRU settings saying otherwise...

Idk. I've spent hours on this poo poo and have nothing to show for it; the pc gods hate me. I figure I'm probably just hosed and should stay at 120, but you guys are smart as hell, so I figured if anyone would have an idea, it would be here. The only thing keeping me going at this point is that no one else who has this display has reported that VRR doesn't work up to 144 Hz (the overclocking guides specifically recommend raising the GSync/FreeSync VRR range to 144 in CRU, which suggests it works fine). But I'm at the end of my rope here and ready to throw in the towel.

What sucks even worse is that Samsung intentionally capped the S95B at 120 (requiring much fuckery to get 144 Hz), but in their new 2023 version, the S95C, they made 144 Hz the baseline. It's obvious that they intentionally locked down the monitor just so they could have another selling point for the 2023 version. I hate that type of poo poo...

Taima fucked around with this message at 13:00 on Apr 19, 2023

sauer kraut
Oct 2, 2004
Try turning vsync off everywhere, and setting the Nvidia driver fps limiter to 140 globally.
If you get tearing that way, the overclocking disables gsync and should be avoided, but I've never used such a beast display myself so :shrug:

Vsync plus Gsync generally just means that if you go above max display fps, good old vsync kicks in and throttles the card. But only then.

AutismVaccine
Feb 26, 2017


SPECIAL NEEDS
SQUAD

sauer kraut posted:

Try turning vsync off everywhere, and setting the Nvidia driver fps limiter to 140 globally.
If you get tearing that way, the overclocking disables gsync and should be avoided, but I've never used such a beast display myself so :shrug:

Vsync plus Gsync generally just means that if you go above max display fps, good old vsync kicks in and throttles the card. But only then.

Imo that's the way.

Limit the fps to your max panel refresh rate (the non-overclocked one) minus 4 or so. (I have a 144 Hz/165 Hz OC ViewSonic Elite; I just limit my fps in the Nvidia panel to 140.)

With the games I play (mostly older ones) it works fine.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
What you're saying makes me question if the display is actually running at 144hz. It kind of sounds like it's stuck at 120 and you haven't got the 144 thing actually working properly.

repiv
Aug 13, 2009

yeah check the OSD to see if it's actually running in 144hz mode when you set it to 144hz input
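
If the OSD is ambiguous, you can also ask Windows what mode it thinks the panel is running in. A minimal sketch, assuming pywin32 is installed (pip install pywin32) and that the S95B is the primary display:

```python
# Print the resolution and refresh rate Windows reports for the current display mode.
# Requires pywin32; queries the primary display's active settings.
import win32api
import win32con

devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
print(f"{devmode.PelsWidth}x{devmode.PelsHeight} @ {devmode.DisplayFrequency} Hz")
```

If that reports 120 while the custom resolution claims 144, the 144 Hz mode isn't actually being applied.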

repiv
Aug 13, 2009

another possibility is you have a bad HDMI cable dropping it down to HDMI 2.0 mode, which can do 4K120 (with chroma subsampling) but not 4K144
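
Rough numbers on why the cable matters; this is back-of-envelope raw pixel data only, ignoring blanking intervals and TMDS/FRL encoding overhead, and the usable link rates are approximate:

```python
# Back-of-envelope: raw pixel data rate for 4K modes vs approximate usable link rates.
# Ignores blanking and encoding overhead, so real requirements are somewhat higher.
def gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_2_0_GBPS = 14.4   # ~18 Gbit/s raw minus 8b/10b overhead
HDMI_2_1_GBPS = 42.0   # ~48 Gbit/s FRL minus encoding overhead (approx.)

modes = {
    "4K120 RGB 8-bit":   gbps(3840, 2160, 120, 24),
    "4K120 4:2:0 8-bit": gbps(3840, 2160, 120, 12),
    "4K144 RGB 10-bit":  gbps(3840, 2160, 144, 30),
}
for name, rate in modes.items():
    fits_20 = "fits" if rate < HDMI_2_0_GBPS else "too much"
    fits_21 = "fits" if rate < HDMI_2_1_GBPS else "too much"
    print(f"{name}: {rate:.1f} Gbit/s  (HDMI 2.0: {fits_20}, HDMI 2.1: {fits_21})")
```

Which lines up with the above: a link stuck at HDMI 2.0 speeds can squeeze in 4K120 with 4:2:0 subsampling but has no chance at 4K144.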

VorpalFish
Mar 22, 2007
reasonably awesometm

K8.0 posted:

It's not about performance, it's about the specs of the hardware. Physical size of the chip, size relative to flagship, core count relative to flagship, bus width, etc, all point consistently to Nvidia just loving consumers and grossly overcharging compared to all prior generations. The "4070" has 36% the core count of the 4090. The 3060 had 34% of the 3090, the 2060 had 44% of the 2080 Ti, the 1060 had 36% of the 1080 Ti, the 960 had 36% of the 980 Ti, the 760 had 40% of the 780 Ti... how far back do you want me to go? Especially the die size tells you what the product actually is and roughly what it cost Nvidia to make in relative terms.

I mean the names are completely arbitrary; as long as they're consistent (so a huge gently caress you to Nvidia for the entire mobile stack) what matters is price and performance, which is... unexciting this gen, but generally maybe okay if you don't have a previous gen card.

Nvidia is a company, they're always going to try to extract the absolute most profit they can which means charging as much as they can get away with while selling the volume they need to.

Of course they're loving consumers to the extent possible, it's an adversarial relationship.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Except they've never done this before. They've followed a very clear pattern of naming relative to CUs, die size, bus width, etc. for about a decade, and the 40 series breaks it. People say things like "xxx was too good of a value, they'll never do another xxx again," but they have: the 3060 Ti and 3080 were incredible deals at MSRP, and that's just one generation ago. They'll probably do significantly better next generation too, especially if people refuse to buy this generation at lovely prices, like people did with the initial Turing GPUs.

bone emulator
Nov 3, 2005

Wrrroavr

Do I need a new PSU if I get a 4070? I have a 500 or 550W now.

Mega Comrade
Apr 22, 2004

Listen buddy, we all got problems!

bone emulator posted:

Do I need a new PSU if I get a 4070? I have a 500 or 550W now.

Nvidia recommend a minimum of 650w, so yes.

Mega Comrade fucked around with this message at 17:52 on Apr 19, 2023

8-bit Miniboss
May 24, 2005

CORPO COPS CAME FOR MY :filez:

Mega Comrade posted:

Nvidia recommend a minimum of 750w, so yes.

Nvidia says 650w for the 4070.

https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/rtx-4070-4070ti/

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
The 4070 is 200 W, yeah? So 550 W would MAYBE be OK depending on your CPU and what other devices you have. Probably not a great idea, though. You'll probably need to replace that PSU eventually anyway, and this seems like a good reason to do it.
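
For what it's worth, the napkin math behind that goes something like this; every component figure below is an assumption of mine, not an Nvidia number:

```python
# Napkin math for PSU sizing; all component figures are rough assumptions.
gpu_w        = 200   # RTX 4070 board power
cpu_w        = 150   # a mid/high-end CPU under gaming load (assumed)
rest_w       = 75    # motherboard, RAM, drives, fans, USB (assumed)
spike_factor = 1.3   # headroom for transients, PSU aging, efficiency sweet spot

steady = gpu_w + cpu_w + rest_w
recommended = steady * spike_factor
print(f"steady-state load ~{steady} W, comfortable PSU size ~{recommended:.0f} W")
```

That already lands right around 550 W with a fairly tame CPU, which is why the blanket 650 W recommendation exists: it leaves headroom for hungrier CPUs, transient spikes, and PSUs that have aged a bit.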

Mega Comrade
Apr 22, 2004

Listen buddy, we all got problems!
It's due to the power spikes all these GPUs have, isn't it?

Nfcknblvbl
Jul 15, 2002

It’s because CPUs are pulling a lot of power now.

8-bit Miniboss
May 24, 2005

CORPO COPS CAME FOR MY :filez:

Mega Comrade posted:

It's due to the power spikes all these GPUs have isn't it?

They reined in the transient power spikes for the most part on the 40 series as far as I'm aware.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Everything is pulling more power, but that's nothing new. I remember thinking "Why would I ever need more than a 300 W PSU?" and then, later, "Why would I ever need more than a 500 W PSU?" Doing more work requires more power, and efficiency only advances so fast to compensate for that.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

K8.0 posted:

Except they've never done this before. They've followed a very clear pattern of naming relative to CUs, die size, bus width, etc for about a decade, and the 40 series breaks it.

if you follow this train of thought to its conclusion, the 6950XT is really a RX 480-class GPU because it's got a 256b bus, right? 6700XT is a 1060 competitor because it's got a 192b bus?

kinda funny how nobody said a damned word about it last generation when it was AMD doing it. Why is smaller bus/bigger cache suddenly this hot-button issue for people?

oh right green man bad

like, there are very valid reasons we are never going back to the days of 2MB L2 / no L3 / 512b buses. PHYs don't shrink, and cache density on 7nm/5nm nodes is super high, so it's advantageous to trade PHY area for cache area.

And you probably can't even route that sort of thing on GDDR6X/GDDR7. Look at the PCB layout of a 3090/4090 and they've got the memory modules tucked basically underneath the ILM, as close as humanly possible, to shorten the traces up. There simply isn't room on those PCBs for another 4 modules.

MCDs help a little bit since they enable you to staple on PHYs without affecting your main GCD, but it's always going to be a balance, because you can just as easily staple cache onto the MCDs, or stack it on the GCD/MCDs.

honestly the real reason that it's come up all of a sudden is that when AMD did it they gave it the Infinity Cache (TM) style branding and NVIDIA just quietly integrated it. We make fun of the GamerCache(TM) poo poo that AMD constantly does but this is an object lesson in why you do it regardless of how stupid it is to enthusiasts - people don't know what they should care about unless you give them a word that lets them care. Like you would think enthusiasts would grasp the significance of doubling cache but here we are.

Ada has GameCache on it, that's why it's got smaller memory buses. What's so hard?

Paul MaudDib fucked around with this message at 18:47 on Apr 19, 2023
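
To put some numbers on the smaller-bus-plus-bigger-cache trade, using the commonly listed specs (the bus widths, data rates, and L2 sizes below are my assumptions, not from the post above):

```python
# Peak memory bandwidth = bus width (bits) / 8 * per-pin data rate (Gbit/s).
# Spec figures are the commonly listed ones (assumed); L2 sizes approximate.
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

cards = {
    #            bus bits, Gbit/s per pin, L2 cache (MB)
    "RTX 3070": (256, 14.0,  4),
    "RTX 4070": (192, 21.0, 36),
}
for name, (bus, rate, l2) in cards.items():
    print(f"{name}: {bus}-bit bus, {bandwidth_gbs(bus, rate):.0f} GB/s, ~{l2} MB L2")
```

So even with the narrower bus the raw bandwidth went up generation-on-generation, and the much larger L2 means fewer trips to VRAM in the first place, which is the PHY-area-for-cache-area trade described above.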

The Wonder Weapon
Dec 16, 2006



Dessel posted:

gsync/freesync have been known to cause these issues, as have multi-monitor setups where the refresh rates aren't clean multiples of each other, but those problems might have been fixed; I'm not sure about their current status on Windows/AMD/Nvidia.

Generally speaking I use one browser with hardware acceleration turned off for watching videos in the background, and another with hardware acceleration on for when I actually focus on the content. You will very likely skip frames, but it's better than having the game slow down or the video downright freezing.

edit: Having another GPU might indeed sidestep this problem, though sometimes which GPU handles which window can be a little funky as far as I'm aware (it should generally be window -> monitor -> GPU). Might want to look into this or similar if you end up going this route. Browsers may also have a built-in option to select a GPU. Running two GPUs might lead to other interesting problems, but I haven't run into anything catastrophic with my past integrated+discrete setups.

I know this is from like a week ago but I sort of forgot that I posted it

It was mentioned that you can tell Windows which GPU should render what if you have more than one. I was always under the impression that the GPU the monitor was plugged into was the GPU that would render it. So the 1440p monitor on the 5700, and the two 1080p monitors on a secondary card. Is that not how it works?
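
As far as I know, modern Windows renders per application rather than strictly per monitor (frames get copied across if the output is on the other GPU), and you can override the choice per app under Settings > System > Display > Graphics. The same setting can be scripted through the UserGpuPreferences registry key; a sketch, with a hypothetical executable path:

```python
# Sketch: set Windows' per-app GPU preference, i.e. the same toggle exposed in
# Settings > System > Display > Graphics. The exe path below is a hypothetical example.
import winreg

APP = r"C:\Program Files\Mozilla Firefox\firefox.exe"  # hypothetical path, adjust to taste
PREF = "GpuPreference=2;"  # 0 = let Windows decide, 1 = power saving, 2 = high performance

with winreg.CreateKey(winreg.HKEY_CURRENT_USER,
                      r"Software\Microsoft\DirectX\UserGpuPreferences") as key:
    winreg.SetValueEx(key, APP, 0, winreg.REG_SZ, PREF)
```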

Racing Stripe
Oct 22, 2003

I have a 1440P monitor connected via DP to a Radeon 6700XT. I was just on a work zoom meeting, and my screen went black, and first it showed a little "DP" icon and then said no signal. Meanwhile, the audio from the meeting became very low-fi, but I could still hear them and they could hear me.

I restarted and everything seems to have gone back to normal, but I've never had that happen before, and I'm wondering what the variety of possible culprits could be. It seems like either a GPU or monitor problem since the system continued running, but the sudden performance dip in Zoom (or maybe it was audio, not zoom; I use a discrete sound card) suggests to me that the problem didn't affect only my video signal.

poo poo's all working fine now, so of course I'll just hope it stays that way. But if this happens again, any idea on what I should start investigating? Could, like, the GPU driver crash and wipe out the signal to the monitor and have some residual effects on other parts of the system without causing a total system shutdown?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
check your Windows Event Viewer logs although I doubt you'll see anything too significant there.

first step in debugging is always to replace the displayport/hdmi cable, it's cheap and sometimes it helps (audio travels on a separate pair iirc so maybe it'd help but who knows)
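
One concrete thing to look for there is the TDR event ("Display driver ... stopped responding and has successfully recovered"), which as far as I recall shows up in the System log as Event ID 4101. A quick sketch that shells out to the built-in wevtutil tool:

```python
# List the most recent Event ID 4101 entries (display driver timeout/recovery)
# from the Windows System log, newest first. Uses the built-in wevtutil tool.
import subprocess

query = "*[System[(EventID=4101)]]"
result = subprocess.run(
    ["wevtutil", "qe", "System", f"/q:{query}", "/c:5", "/rd:true", "/f:text"],
    capture_output=True, text=True,
)
print(result.stdout.strip() or "no recent driver-recovery events found")
```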

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Paul MaudDib posted:

if you follow this train of thought to its conclusion, the 6950XT is really a RX 480-class GPU because it's got a 256b bus, right? 6700XT is a 1060 competitor because it's got a 192b bus?

This is an incredibly irrelevant rant, Paul. Nvidia hasn't changed their bus widths for the entire time period I'm talking about: it's been 384-bit flagships and 128-192 bit (mostly 192) x60-class cards the whole time. If this were about Nvidia shrinking bus widths without starving their GPUs for bandwidth (which they aren't doing), I wouldn't be complaining. It's about them transparently loving consumers by downsizing GPUs below the flagship by not just bus width but every single metric we have consistently been able to use to predict product positioning for a decade.

Cygni
Nov 12, 2005

raring to post

K8.0 posted:

It's about them transparently loving consumers by downsizing GPUs below the flagship by not just bus width but every single metric we have consistently been able to use to predict product positioning for a decade.

"Predict product positioning"? What? The names change every cycle, they dont mean anything, they are made by marketers and don't represent any physical aspect of the product and have literally always been that way.

VorpalFish
Mar 22, 2007
reasonably awesometm

Literally nobody buys a GPU based on "this is 40% of the die size of the largest die they currently offer."

It's interesting as a thought exercise to guess which products their margins might be higher on relative to others of the same gen, but from a consumer perspective it's mostly meaningless. And given we don't know what they pay per wafer of TSMC N4, I don't know that we could extrapolate whether margins are high or low relative to previous gens: are margins abnormally low on the 4090 relative to the flagships of previous gens? Are margins on everything else abnormally high? Some combination of the two?

What matters to consumers is how much it costs and how fast it is, both in absolute terms and relative to the alternatives (either a card from someone else or keeping what I already have).

Where poo poo gets misleading is when they launch 18 different versions called the same thing.
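
On the thought-exercise side, the one thing you can estimate from die size alone is how many candidate dies fit on a wafer; everything past that (yield, what TSMC actually charges for N4) is unknown, so treat this as illustration only. Die sizes below are the commonly listed figures (assumed):

```python
# Rough candidate dies per 300 mm wafer using the standard dies-per-wafer
# approximation. Ignores yield and wafer price, so it says nothing about margins.
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    radius = wafer_diameter_mm / 2
    return (math.pi * radius**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

die_sizes = {"AD102 (4090)": 608.5, "AD104 (4070 / 4070 Ti)": 294.5}
for name, area in die_sizes.items():
    print(f"{name}: ~{dies_per_wafer(area):.0f} candidate dies per wafer")
```

Roughly 2.3x as many AD104 candidates as AD102 per wafer, but without yield and wafer-price numbers that still doesn't translate into a margin comparison.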

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.
they're definitely overpricing stuff this generation (the 4080 and 4070 Ti alone are proof of that), but all the comparisons of how the 40-series cards stack up against the 4090 versus how the 30-series stacked up against the 3090 get silly very quickly, because the 3090 was extremely bad value compared to every other card in the 30 series

if they'd done something more normal and, say, called the 4070 Ti just the 4070 and called the 4070 the 4060 Ti (since that lines up much closer to how the 30-series related to the 20-series in terms of performance), and priced everything at most $100 above the equivalent 30-series cards, then people would probably be happy enough overall

Cygni
Nov 12, 2005

raring to post

lih posted:

they're definitely overpricing stuff this generation (the 4080 and 4070 Ti alone are proof of that), but all the comparisons of how the 40-series cards stack up against the 4090 versus how the 30-series stacked up against the 3090 get silly very quickly, because the 3090 was extremely bad value compared to every other card in the 30 series

if they'd done something more normal and, say, called the 4070 Ti just the 4070 and called the 4070 the 4060 Ti (since that lines up much closer to how the 30-series related to the 20-series in terms of performance), and priced everything at most $100 above the equivalent 30-series cards, then people would probably be happy enough overall

If they priced them at zero dollars everyone would be even happier!

Many people have been conditioned by the last 10 years to have expectations in this hobby that a) new products are always coming within a year or two, and b) those new products will continue to offer some assumed level of price/performance increase. Some people also seem to assume that certain names or price points imply the card is "high end" (whatever that means) or will be able to handle future games (that haven't been released) in some specific way or with specific settings. The expectations have also crept up, going from "30fps at 640x480 is all anyone needs" to scoffing at anything less than 1440p/144fps with RT on.

Those assumptions and expectations are increasingly becoming problematic as the cost of development skyrockets. It is not going to be like the recent past moving forward; there is no forever growth. Unfortunately, physics is real.

The new products offer better performance and power draw than the ones they replace at the same price point. That's objectively a win for consumers. Whether it's a big enough win for you to buy the new toy? That's up to you.

SwissArmyDruid
Feb 14, 2014

by sebmojo
The 6800U is dead, long live the 7940HS!

https://www.youtube.com/watch?v=Yqrw9cGA6QU
