Arivia
Mar 17, 2011

Zero VGS posted:

What they all need to do is grow a spine and dictate terms to Nvidia, not the other way around.

"Even though we're competitors, you agree to give us all access to review samples on the same date, and if you bully any of us about content, all of us will skip your sample next time and poo poo-talk your behavior to customers"

If they could resist the urge to "scoop" one another they'd be better off for it.

There’s no way you could get everyone to agree, and even if you could, nvidia would just find some more YouTube channels to send review samples to who care about the exposure more than ethics.


gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
It's the journalism idea that if the White House press secretary refuses to answer a question from one network, all the other journos from the other networks should press them on the same question

But it doesn't work even for something as important as that

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

gradenko_2000 posted:

It's the journalism idea that if the White House press secretary refuses to answer a question from one network, all the other journos from the other networks should press them on the same question

But it doesn't work even for something as important as that

Right? Why are they so self-defeating? It's some kinda maximum crab bucketing that turns a whole industry into a joke.

Dr. Video Games 0031
Jul 17, 2004

I mean, HUB did a public call-out and posted the email Nvidia sent them, which resulted in a lot of other major content creators also joining the call-out. But their message was clear: they'd just find another way to review the cards anyway, working with their retail or AIB contacts to get early samples instead. Not reviewing Nvidia's cards would've been a win for Nvidia and is the outcome they were looking for, honestly, because HUB was very critical of Turing at the time.

Arivia
Mar 17, 2011

gradenko_2000 posted:

It's the journalism idea that if the White House press secretary refuses to answer a question from one network, all the other journos from the other networks should press them on the same question

But it doesn't work even for something as important as that

also like, it's easy for us to just think of banding together the big youtube tech news people - ltt, gn, hub, etc. but the list of people who get preview cards is much bigger than that - do you see ars technica joining such an effort these days? pc magazine? the verge?

and that's JUST talking about english north american media. igor might pay attention, maybe. how about people doing reviews in asian countries who have no idea what the relationship is like in the english-speaking part of the world? do you think videocardz or whomever is gonna avoid posting translated reviews (like the stuff they already translate) in support of a solidarity boycott?

the answer is no.

AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord

Zero VGS posted:

Right? Why are they so self-defeating? It's some kinda maximum crab bucketing that turns a whole industry into a joke.

It's the prisoner's dilemma

Weird Pumpkin
Oct 7, 2007

Dr. Video Games 0031 posted:

You are plugging in the pcie power connector incorrectly. You cannot plug in the 6-pin first and then do the +2, that will not work. You have to line up the plastic bits on the connectors and "slot" them in together. This part circled here:



And then once they're slotted together, plug the whole thing in at once as if it's an 8-pin. If you do the 6-pin first and then the +2 after, the plastic part on the 6-pin will interfere with the +2's connection, and you won't be able to plug in the +2 all the way. This connector is designed so there's no way for you to fully plug in the 6-pin without also fully plugging in the +2 as long as you're doing it correctly.

You absolutely need to have another go at this, because a loose connection will cause random shutdowns for you (or prevent the system from turning on) in the best case, and in the worst case it can literally start a fire.

edit: to be fair, a fire would be extremely unlikely, but it may melt your power connector and ruin your card.

Oh geez, well fair enough! I'll pop it off and fix it then

Edit ok well, I feel very dumb but that did in fact fix it and nothing was damaged or anything, so no harm no foul I guess! Thanks very much for letting me know. Weird boot hiccup is still there, but at least it's no longer a fire risk!

Weird Pumpkin fucked around with this message at 05:28 on May 20, 2023

Kazinsal
Dec 13, 2011

Weird Pumpkin posted:

Oh geez, well fair enough! I'll pop it off and fix it then

Edit ok well, I feel very dumb but that did in fact fix it and nothing was damaged or anything, so no harm no foul I guess! Thanks very much for letting me know. Weird boot hiccup is still there, but at least it's no longer a fire risk!

Serious Hardware/Software Crap > GPU Megat[H]read: but at least it's no longer a fire risk!

Quaint Quail Quilt
Jun 19, 2006


Ask me about that time I told people mixing bleach and vinegar is okay

Twerk from Home posted:

I miss MSAA. I've been playing some Xbox 360 stuff on a Series S, and that era had MSAA pretty much across the board on the 360 and now they all run at 4x the resolution and twice the frame rate. Just looks really crisp and clean in a way that modern post-processing smears away.
I wasn't happy with Rocket League's subpar/broken AA, and found a suggestion on reddit to add 8x MSAA as a per-game profile (in addition to anisotropic, FXAA, and regular AA) using NCP or Nvidia Profile Inspector, and it helps

Looking again at some reddit guides I probably have enough PC to run SGSSAA (which is like loads stacked together?)

It gets kinda confusing which kinds of AA work well together, whether it's optimal to turn off regular AA for best results, and whether my hardware can handle it, so I just hack that in for that particular game and am generally happy with most other games.

Things like dark souls I can just double the render resolution with DSR instead but it tanks the fps in most other games.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

repiv posted:

MSAA post

Building on this, what are the characteristics of supersampling anti-aliasing, besides the obvious one that it's much more taxing since you're rendering at a higher-than-native resolution?

Josh Lyman
May 24, 2009


Josh Lyman posted:

Amazon has been absolutely useless trying to get my Diablo 4 game code over 2 sessions with customer support. At this point, I'm tempted to return my 4070 (though they might finally ban my account for too many returns) and just ride out my 1070 until the next gen since I get 70-80fps in D4 on 1440p high with FSR quality. Not ideal but I wouldn't feel bad about waiting.
As a follow up, I decided to give it one more try in case the 7900XT magically drops to $700 in the next 3 weeks and/or I decide to return my 4070 since Amazon would likely deduct the cost of D4 from the return even if I never received the code. I got someone on the phone who updated my file, which apparently hadn't been done with my previous 2 attempts, and said he would personally follow up with me on Sunday.

On a slightly unrelated note, having hardware AV1 encoding is nice for recording festival livestreams though HEVC would also be fine.

ijyt
Apr 10, 2012

Every kind of 40 series FE card has been readily available in the UK for like a week now and hasn't gone out of stock. Love to see it. Still too expensive.

repiv
Aug 13, 2009

gradenko_2000 posted:

Building on this, what are the characteristics of supersampling anti-aliasing, besides the obvious one that it's much more taxing since you're rendering at a higher-than-native resolution?

the main thing that differentiates "proper" supersampling from just rendering at a higher resolution then scaling down is the sampling pattern: the points sampled within each pixel are rotated about 45 degrees from the pixel grid, which has nicer visual properties



MSAA also does this (but only on triangle edges), as does TAA (cycling through each of the sample positions with each frame)
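the difference is easy to see in numbers - here's a quick sketch (the rotated offsets below are the common 4-sample "4 rooks" RGSS pattern, used purely for illustration):

```python
# Sample positions within one pixel, coordinates in [0, 1).

def ordered_grid_4x():
    """plain 2x2 grid: only 2 distinct x and 2 distinct y positions"""
    return [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

def rotated_grid_4x():
    """rotated ("4 rooks") grid: 4 distinct x and 4 distinct y positions,
    so a near-horizontal or near-vertical edge gets 4 coverage steps, not 2"""
    return [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

for name, pts in (("ordered", ordered_grid_4x()), ("rotated", rotated_grid_4x())):
    print(name, "distinct x:", len({x for x, _ in pts}),
          "distinct y:", len({y for _, y in pts}))
```

the extra distinct positions per axis are what let the rotated pattern resolve shallow-angle edges so much better than an ordered grid at the same sample cost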

on an unrelated note when i reposted that diagram on imgur it auto flagged it as potentially adult material, if there was any doubt that their porn purge will catch a ton of clean posts, like tumblr's purge

Weird Pumpkin
Oct 7, 2007

Kazinsal posted:

Serious Hardware/Software Crap > GPU Megat[H]read: but at least it's no longer a fire risk!

Wouldn't be the first time a thread title has come from my idiocy, and definitely won't be the last as I'm very dumb you see :hai:

Lamquin
Aug 11, 2007
Got a quick question regarding FPS capping;
I know it's recommended to set a max framerate of [Monitor Refresh Hz] - 3 so that Gsync is always on, but also that running the GPU at 100% load is not great as it can introduce some input latency, so I usually try to run a benchmark and then cap the framerate on a per-game basis so the GPU is at 90-95%.

https://www.youtube.com/watch?v=7CKnJ5ujL_Q&t=334s

However, with my new fancy 4070 TI I have an opposite problem; the CPU is at 100% in many games while the GPU is taking it easy. Does the same rule apply there (i.e. cap the game at a lower framerate, or you get some input latency) or is it a whole different ballpark?
It's not critical as I'm not exactly a competitive FPS gamer, but I'm curious. :shobon:
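For what it's worth, the two rules from that video boil down to a bit of arithmetic (the proportional utilization model here is my own rough assumption, not from the video):

```python
def gsync_cap(refresh_hz: int) -> int:
    """Cap a few FPS below refresh so frames never queue outside the VRR range."""
    return refresh_hz - 3

def util_cap(uncapped_fps: float, gpu_util: float, target: float = 0.93) -> float:
    """Estimate the cap that brings a GPU-bound game from `gpu_util` down to
    `target` utilization, assuming FPS scales roughly linearly with GPU load."""
    return uncapped_fps * target / gpu_util

print(gsync_cap(144))              # 141
print(round(util_cap(130, 0.99)))  # 122
```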

hobbesmaster
Jan 28, 2008

Lamquin posted:

However, with my new fancy 4070 TI I have an opposite problem; the CPU is at 100% in many games while the GPU is taking it easy. Does the same rule apply there (i.e. cap the game at a lower framerate, or you get some input latency) or is it a whole different ballpark?
It's not critical as I'm not exactly a competitive FPS gamer, but I'm curious. :shobon:

I wouldn’t worry about reported CPU or GPU % for that, simple percentage is not necessarily a good proxy for responsiveness.

As a trivial example a tight busy wait that is constantly polling for a button to be pressed would show 100% cpu and be very responsive to that button press.
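A toy version of that example (timings are illustrative) shows the trade-off: the spin loop pegs a core but reacts almost instantly, while a coarse sleep-poll idles the CPU at the cost of up to one poll interval of lag.

```python
import threading
import time

def spin_wait(flag: threading.Event) -> float:
    """Hot loop: shows up as ~100% usage on one core, near-zero reaction time."""
    start = time.perf_counter()
    while not flag.is_set():
        pass
    return time.perf_counter() - start

def sleep_poll(flag: threading.Event, interval: float = 0.05) -> float:
    """Near-idle CPU, but can notice the event up to `interval` late."""
    start = time.perf_counter()
    while not flag.is_set():
        time.sleep(interval)
    return time.perf_counter() - start

for waiter in (spin_wait, sleep_poll):
    flag = threading.Event()
    threading.Timer(0.1, flag.set).start()  # event fires after ~100 ms
    print(f"{waiter.__name__}: noticed after {waiter(flag) * 1000:.0f} ms")
```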

steckles
Jan 14, 2006

repiv posted:

the main thing that differentiates "proper" supersampling from just rendering at a higher resolution then scaling down is the sampling pattern, the points sampled within each pixel are rotated about 45 degrees from the pixel grid which has nicer visual properties
Beyond just sampling at a higher rate than your output pixel grid, there’s no strict definition of supersampling, but yeah, any decent implementation is gonna move the sub-samples. Even a rotated grid isn’t great, and the patterns used by MSAA can be quite complicated. Ideally you want a different set of stratified samples per pixel and then reconstruct them with a really huge filter.

Offline renderers use filters that touch 16, 36, or even 64 pixels per sample. 1000 samples with a box filter will usually look worse than 16 samples with a good filter that approximates sinc. It’d be interesting to see where the inflection point was on the GPU, where you’re better off throwing resources into filtering versus rendering. Of course, I guess it’d be pointless given that good sampling isn’t really compatible with all the temporal reconstruction and denoising shenanigans that are needed today.
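A 1D sketch of the filtering point, using a Lanczos window as the sinc approximation (the choice of window is illustrative; offline renderers also use Gaussian, Mitchell, Blackman-Harris, and others):

```python
import math

def box_weight(x: float, radius: float = 0.5) -> float:
    """Box filter: every sample inside the pixel counts equally, none outside."""
    return 1.0 if abs(x) <= radius else 0.0

def lanczos_weight(x: float, a: int = 3) -> float:
    """sinc(x) * sinc(x/a), a common windowed approximation of the ideal sinc
    filter; its footprint spans 2*a pixels and it has negative lobes."""
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

print(round(lanczos_weight(1.5), 4))  # -0.1351: negative lobe -> mild sharpening
print(box_weight(1.5))                # 0.0: box never looks past its own pixel
```

Those negative lobes reaching into neighboring pixels are why a handful of well-filtered samples can beat a big pile of box-filtered ones.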

repiv
Aug 13, 2009

it's unfortunate that when we do have the GPU headroom to supersample a game, the only universal way to do it (DSR/VSR) is sub-optimal since it uses ordered grid sampling

nvidia and AMD used to have modes which abused the MSAA hardware to do full-screen SSAA with shifted samples but they only worked right in games which supported MSAA natively

Mental Hospitality
Jan 5, 2011

Weird Pumpkin posted:

Wouldn't be the first time a thread title has come from my idiocy, and definitely won't be the last as I'm very dumb you see :hai:

Don't be too hard on yourself poster. The last video card I installed was an AGP GeForce 3 in a home-built Athlon XP system (:corsair:), I imagine I'd do all kinds of stuff to modern components.

E: "The pins are in the motherboard CPU socket now?! Wtf?!"
-Me, probably

Mental Hospitality fucked around with this message at 17:45 on May 20, 2023

MadFriarAvelyn
Sep 25, 2007

So, should the stars align, I should be picking up a 4090 FE from Best Buy tomorrow. But I've only got a 750W PSU in my current rig. Do I throw it into my current rig (otherwise powering an i9 9900k), or do I hold off installing it until I've upgraded the CPU/PSU, to avoid CPU throttling and/or the PSU exploding? Recommended advice to reduce the power target to 80% with a slight overclock aside?

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
videocardz: RX 7600 & RTX 4060 Ti tested in 3DMark

quote:

The 3DMark benchmark may not always be the best comparison tool for modern gaming graphics cards. With powerful upscaling technologies and raytracing, gamers can see massive changes in performance with just one option enabled. Yet, the 3DMark data, especially the Graphics scores, are very consistent and comparable metrics for GPUs, especially when we include all the most popular benchmarks.

The 3DMark suite offers DX11, DX12 and now even raytracing tests. We requested FireStrike, TimeSpy and Speed Way data from reviewers to see where both of the soon-to-be-released cards sit.



There is no doubt that RTX 4060 Ti will be faster than Radeon RX 7600. NVIDIA’s GPU tends to be especially better in DX12 and DX12/RT tests. The RTX 4060 Ti appears to be 24% to 25% faster than the RX 7600 in TimeSpy and up to 63% faster in Speed Way (DX12 with ray tracing). However, there seems to be just a single-digit performance difference between the two cards in the FireStrike test (RX 7600 offers 92% to 97% of the RTX 4060 Ti's performance).

In terms of generation upgrade, the RTX 4060 Ti 8GB is on average 10% faster than RTX 3060 Ti, while RX 7600 sees 34% higher performance than RX 6600.



Kindly note that RTX 4060 Ti/RX7600 benchmarks were collected from multiple reviewers and the data was averaged. Both cards were using pre-launch press drivers. There is a chance that either NVIDIA or AMD could release an updated driver before launch that can affect the scores.

Josh Lyman
May 24, 2009


Videocardz previously reported the 4070 scores 17,783 in Time Spy which is 33% higher than the 4060 Ti. There were idiots on YouTube and Reddit saying the 4060 Ti 16GB would be more appealing than the 4070 due to the extra VRAM.

Indiana_Krom
Jun 18, 2007
Net Slacker

MadFriarAvelyn posted:

So, should the stars align, I should be picking up a 4090 FE from Best Buy tomorrow. But I've only got a 750W PSU in my current rig. Do I hold off throwing it into my current rig (otherwise powering an i9 9900k) or do I wait to install it for an upgraded CPU/PSU to avoid both CPU throttling and/or PSU exploding? Recommended advice to reduce the power target to 80% with a slight overclock aside?

You definitely want a new PSU (preferably with the native 16-pin power plug for the 4090) whether you keep it in the 9900k system or not. That 750W is likely not going to cut it because it probably doesn't have the necessary 4x PCIe 8-pin connectors, and a 4090 can spike to 600W. And you are also likely to run into CPU limits even at 4K.

For reference, I have a 3080 Ti with a 9900k, and I semi-regularly reach CPU limits at 1440p. I haven't upgraded yet because the 13900k and the 7800X3D are both clusterfucks, between insane power consumption and stupid power/efficiency scheduler problems on the 13900k, or just plain exploding on the pad with the 7800X3D.
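The headroom math, with ballpark wattages (all of the figures here are assumptions for illustration, not measurements):

```python
def psu_ok(psu_watts: float, parts: dict, margin: float = 0.8) -> bool:
    """True if worst-case draw fits within `margin` of the PSU rating; running
    a PSU flat out continuously hurts stability, efficiency, and lifespan."""
    return sum(parts.values()) <= psu_watts * margin

rig = {
    "RTX 4090 transient spike": 600,  # assumed worst-case excursion
    "i9-9900K under load": 200,       # assumed
    "board/RAM/drives/fans": 75,      # assumed
}

print(psu_ok(750, rig))   # False: 875 W worst case vs a 600 W budget
print(psu_ok(1200, rig))  # True: 875 W fits in a 960 W budget
```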

BONESTORM
Jan 27, 2009

Buy me Bonestorm or go to Hell!
I have an 850 watt PSU and I have been fine with a 4090 Gaming X Trio I’ve owned since not long after release. I believe it’s only the higher-end models that can pull 600W; the FE is limited to 450W, similar to the model I own, from what I’ve read. I would power limit to like 80% with a 750 watt power supply, but if it’s a good brand and not ancient I think it would be doable, if not the best idea.

I’m in a similar boat CPU-wise: my 9900K is holding my 4090 back more than I expected, but the current high-end offerings aren’t rock solid enough for me to throw down the cash just yet. Summer’s almost here so gaming isn’t a high priority now anyway; I'll probably just wait for the next round of processors and rebuild then with the transplanted 4090.

BONESTORM fucked around with this message at 14:37 on May 21, 2023

FuturePastNow
May 19, 2014


"Effective memory bandwidth" is as real as downloading more RAM

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

FuturePastNow posted:

"Effective memory bandwidth" is as real as downloading more RAM

New marketing strategy: sell RAM kits based on the speed and latency of the SRAM L1 cache on the CPUs you can use the kit with!

You'd think nvidia would have learned the first time round that advertising the fastest bit of RAM and keeping quiet about the slow bit doesn't end well

Yudo
May 15, 2003

According to a leak posted on videocardz, the 4060ti is on average 10% faster than the 3060ti in 3dmark. If that generalizes, day one reviews are not going to be kind.

Edit: a 3rd party retailer is selling the reference XFX 7900 XT on Amazon for $700 and the Sapphire Nitro XTX for $900. The seller looks dicey, though.

Yudo fucked around with this message at 18:35 on May 21, 2023

Cygni
Nov 12, 2005

raring to post

Cheapest 3060 Ti is $409 on PCPP. 4060 Ti is a little bit faster it seems, uses a lot less power, has the generational improvements (AV1 encoder + DLSS3), and is the same price. Ready for GN to describe it as boring, like most of the 20 series parts (and most GPU launches in the future). As always, wait for the reviews, should be out in about 48hrs.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Extrapolating, we can expect the RTX 4060 to be less than 10% faster than the RTX 3060, which was an underwhelming product at the same MSRP nearly two years ago. LOL Nvidia.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
There are going to be reviewers trolling for clicks who will expertly craft situations where the 3060 with its 12GB of VRAM outperforms the 4060.

The 4060 Ti may be only 10% faster, but inflation-adjusted its sticker price is more than 10% lower, making for a decent performance-per-dollar improvement.

Yudo
May 15, 2003

It's a dog. There is no spinning it if that relative performance generalizes to other applications. Lol at whipping out the inflation calculator, as if that is a thing your average consumer does or even understands.

MarcusSA
Sep 23, 2007

Twerk from Home posted:

There are going to be reviewers trolling for clicks who will expertly craft situations where the 3060 with its 12GB of VRAM outperforms the 4060.

The 4060 Ti may be only 10% faster, but inflation-adjusted its sticker price is more than 10% lower, making for a decent performance-per-dollar improvement.



The only thing that helps this is the record inflation rates we’ve had.

mobby_6kl
Aug 9, 2009

by Fluffdaddy
I have to say I don't particularly care about "memory bandwidth" in any way, like "how wide is the memory bus" was never a question I had when choosing a GPU. If it performs how it should for the money, it's fine. Which with the 4060 I don't think it does.

MarcusSA posted:

The only thing that helps this is the record inflation rates we’ve had.
But you can have 300% more performance when rendering at 720p and with frame generation


e: let's check my previous predictions!



Looks like I was too optimistic, again. I thought it would be around 3070 Ti, but 10% over 3060 Ti would put it on the level of the 3070. My new guess for the 4060 is that it'd be around the A770 level.

mobby_6kl fucked around with this message at 20:15 on May 21, 2023

FuturePastNow
May 19, 2014


At least it's power efficient I guess. Maybe they'll make some smaller cards.

Dr. Video Games 0031
Jul 17, 2004

mobby_6kl posted:

I have to say I don't particularly care about "memory bandwidth" in any way, like "how wide is the memory bus" was never a question I had when choosing a GPU. If it performs how it should for the money, it's fine. Which with the 4060 I don't think it does.

But you can have 300% more performance when rendering at 720p and with frame generation


e: let's check my previous predictions!



Looks like I was too optimistic, again. I thought it would be around 3070 Ti, but 10% over 3060 Ti would put it on the level of the 3070. My new guess for the 4060 is that it'd be around the A770 level.

You're still too optimistic about the 4060 in my opinion. A750 level is more likely.

FuturePastNow posted:

At least it's power efficient I guess. Maybe they'll make some smaller cards.

There's already at least one dual-slot, single-fan 4060 Ti announced from Palit/Gainward: https://videocardz.com/newz/first-mini-itx-geforce-rtx-40-graphics-card-announced

Cygni
Nov 12, 2005

raring to post

Assuming you are willing to compromise one thing or another, the 6650XT and Arc 750 are likely gonna end up with better price/performance than either the Radeon 7600 or 4060 launching next week. But the reality is it seems the market has pretty clearly spoken that people aren't willing to compromise to save the $50.

Yudo
May 15, 2003

Cygni posted:

Assuming you are willing to compromise one thing or another, the 6650XT and Arc 750 are likely gonna end up with better price/performance than either the Radeon 7600 or 4060 launching next week. But the reality is it seems the market has pretty clearly spoken that people aren't willing to compromise to save the $50.

Seeing as this generation of video cards is selling terribly, I think people aren't willing to compromise and thus aren't buying anything.

MadFriarAvelyn
Sep 25, 2007



I've never driven more carefully to and from a Best Buy in my life.

Dr. Video Games 0031
Jul 17, 2004

For the record, I'd plug that bad boy in right away instead of waiting for more new parts. Set an 80% power limit, and it'll surely work with a 750W power supply just fine.


Cross-Section
Mar 18, 2009

Dr. Video Games 0031 posted:

For the record, I'd plug that bad boy in right away instead of waiting for more new parts. Set an 80% power limit, and it'll surely work with a 750W power supply just fine.

I've been doing just this for months, can second
