Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

gradenko_2000 posted:

Diablo 2 Resurrected does this for fur and then the Switch port can't clean it up and it ends up looking like static

Switch is really rough, it basically never has any extra oomph for AA and you can always tell.

My kids found the Pikmin 3 demo on Switch and then we broke the Wii U out of storage because I'm not going to re-buy a game for $60, and it turns out that Pikmin 3 looks noticeably better on Wii U than Switch. They're both 720p, but the Wii U release either has a better scaler or better AA.


meta²
Sep 11, 2001

What the flip was Grandma doing at the dunes?

Am I crazy or could the 4070 actually be a good card for me? I have an SFF and need something smaller than 330 x 150 x 50mm. I am upgrading from a 2060 because I can’t get it to support my LG C2 properly. I am willing to spend some cash and pass this card on to my son, it seems like the fastest card to fit my constraints.

pyrotek
May 21, 2004



meta² posted:

Am I crazy or could the 4070 actually be a good card for me? I have an SFF and need something smaller than 330 x 150 x 50mm. I am upgrading from a 2060 because I can’t get it to support my LG C2 properly. I am willing to spend some cash and pass this card on to my son, it seems like the fastest card to fit my constraints.

It isn't a bad product, the price is just too high. If you have a Micro Center nearby, this Radeon RX 6950 XT will likely be around 25% faster than the 4070 in raster performance at UHD and should work well for you if you have a good enough power supply. Some games are already over 12GB of VRAM usage at UHD, so the 16GB is an actual advantage these days and will be going forward. You get roughly 3090 raster performance and the extra 4GB of VRAM over the 4070, but you lose DLSS and RT performance, and the 4070 will definitely be more efficient.

New Zealand can eat me
Aug 29, 2008

:matters:


You know, I wasn't expecting much based on the extremely critical articles about the Elden Ring raytracing update but: god drat. I beat the game at launch, but this looks markedly better than I remember it being. It's not necessarily a blatant difference, but it feels like a "it has colored lights" quake 2 moment. Having no issue maintaining 60fps @ max everything with a 7900XTX.

Personally the glowy edges at launch bothered me enough to the point that I played offline in 2880p with the framerate limit unlocked, but this is much, much better.

Side note: Does anyone know how to re-enable that feature with RDNA3 cards? I really liked being able to enable super resolution and run at 2880p when I was recording replays etc, but that seems to be gone now which is frustrating.

Truga
May 4, 2014
Lipstick Apathy

repiv posted:

there's a tension between responsive controls and natural looking animation, if you let the player move like doomguy then you can't really make the animations look physically plausible

games like RDR2 and TLOU lean hard into making the animation look natural at the expense of responsiveness

funny thing is, doom movement is insanely floaty by modern standards; it's just that there's no animation to go with it, just some shotgun swaying at the bottom

IMO, gw2 went for the ideal scenario where you have really responsive controls, but the animations are made so starting/stopping looks relatively natural

Kibner
Oct 21, 2008

Acguy Supremacy
Guild Wars 2?

Blurb3947
Sep 30, 2022
Garden Warfare 2?

Gyrotica
Nov 26, 2012

Grafted to machines your builders did not understand.
Grandma Waifus 2, clearly.

Branch Nvidian
Nov 29, 2012



Gyrotica posted:

Grandma Waifus 2, clearly.

They still look like they’re barely 13 though.

mobby_6kl
Aug 9, 2009

by Fluffdaddy
So some 4070 listings leaked from the official sites (allegedly) https://www.tomshardware.com/news/multiple-vendors-list-nvidia-rtx-4070-boards-early

:lmao:

New Zealand can eat me
Aug 29, 2008

:matters:


900 earth dollars for a GPU with 12GB of VRAM? Uhhhhh

pyrotek
May 21, 2004



Don't be silly, people. Those prices are higher than the 4070 Ti versions, and those already aren't moving.

Josh Lyman
May 24, 2009


When I went to return my 4070 Ti to Microcenter yesterday, a manager-looking person commented that he didn’t blame me since it didn’t have enough memory. The CSR who actually processed my returns said they were getting a lot of 40 series back (it looked like the box was marked for return to vendor). And there was a person at another register who was returning the exact same model.

I doubt all these returns will show up on the 1st quarter financials, but at some point I hope we can see just how the 40 series is actually doing.

Prescription Combs
Apr 20, 2005
   6
I definitely don't regret getting the 7900XTX. Thing has been completely solid and stable.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
It seems crazy to me that 10 or 12 GB is insufficient now, I mean 970's with the "3.5" GB ran lots of games very well and that wasn't that long ago.

Like, Last of Us can easily go over 10 just from ramping up textures. Are they just, not compressed or something?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Zero VGS posted:

It seems crazy to me that 10 or 12 GB is insufficient now, I mean 970's with the "3.5" GB ran lots of games very well and that wasn't that long ago.

Like, Last of Us can easily go over 10 just from ramping up textures. Are they just, not compressed or something?

It wasn't crazy to me; the 3080 10GB seemed like a bad idea from day one. But the fact that we're just blowing through 12GB casually is somewhat alarming, and it honestly seems like an optimisation issue.

That said, the 970 was 8 and a half years ago, and there was such a fuss kicked up about the slow 512MB of RAM that there was a lawsuit.

But if we take the facts as they are, I think NVIDIA miscalculated by cheaping out on RAM this time, arguably more than they have in the past. I wouldn't recommend a card over $500 with less than 16GB, and AMD will give you that.

HalloKitty fucked around with this message at 20:20 on Apr 3, 2023

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

HalloKitty posted:

It wasn't crazy to me; the 3080 10GB seemed like a bad idea from day one. But the fact that we're just blowing through 12GB casually is somewhat alarming, and it honestly seems like an optimisation issue. I wouldn't recommend a card over $500 with less than 16GB.

That said, the 970 was 8 and a half years ago, and there was such a fuss kicked up about the slow 512MB of RAM that there was a lawsuit.

AMD also responded to the GTX 970 being outstanding by throwing you 8GB of VRAM for less money with the otherwise outclassed R9-390. The 970 didn't have a ton of VRAM, it had just enough. I do think that Nvidia may have guessed wrong about the VRAM usage trend. Hell, we may see the 3060 age weirdly well as a $330 GPU compared to the 8GB 3070 at $499 and 3070 Ti at $599.

Re: TLOU PC port, it's a lovely port and shouldn't really be used as a benchmark, but it definitely will because almost every game runs extremely well right now so everybody's looking for things that run poorly. I've seen that it's also extremely CPU heavy in addition to being VRAM heavy, because the PS5 has dedicated hardware to handle the streaming from storage but TLOU didn't implement the Windows / Xbox DirectStorage option to do that, so it makes the CPU do a ton of extra work.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Twerk from Home posted:

AMD also responded to the GTX 970 being outstanding by throwing you 8GB of VRAM

Oh that's right, I totally remember I got one of those RX 480 cards at launch with 4GB of ram, and I was able to flash it into 8GB. That "download more memory" meme actually came true!

https://www.techpowerup.com/223893/amd-4gb-radeon-rx-480-can-be-flashed-into-8gb

mobby_6kl
Aug 9, 2009

by Fluffdaddy

Twerk from Home posted:

AMD also responded to the GTX 970 being outstanding by throwing you 8GB of VRAM for less money with the otherwise outclassed R9-390. The 970 didn't have a ton of VRAM, it had just enough. I do think that Nvidia may have guessed wrong about the VRAM usage trend. Hell, we may see the 3060 age weirdly well as a $330 GPU compared to the 8GB 3070 at $499 and 3070 Ti at $599.

Also the 1070 next gen had 8GB. I guess 12GB would've been ok if the cards weren't going to be like $700

pyrotek posted:

Don't be silly, people. Those prices are higher than the 4070 Ti versions, and those already aren't moving.
Those look like some fancy OC versions? I dunno MSI stuff but yeah I'd assume they'd be lower once actually launched. Still, I'm not counting on $600 cards.

pyrotek
May 21, 2004



Twerk from Home posted:

Re: TLOU PC port, it's a lovely port and shouldn't really be used as a benchmark, but it definitely will because almost every game runs extremely well right now so everybody's looking for things that run poorly. I've seen that it's also extremely CPU heavy in addition to being VRAM heavy, because the PS5 has dedicated hardware to handle the streaming from storage but TLOU didn't implement the Windows / Xbox DirectStorage option to do that, so it makes the CPU do a ton of extra work.

I agree that it is a lovely port, but that certainly doesn't mean it won't be a sign of things to come as we get more PS5-centered games. RE4 remake crashes with RT if you don't have enough VRAM and you put texture settings too high, the D4 beta wants tons of RAM for its highest texture setting, and (ugh) Hogwarts Legacy uses 16GB with RT.

Now, of course you can just lower settings etc. to get around these issues, but these are $800+ cards! It is bad enough to have to worry about VRAM allocation on expensive cards a couple years down the road, but these games are available right now.

mobby_6kl posted:

Those look like some fancy OC versions? I dunno MSI stuff but yeah I'd assume they'd be lower once actually launched. Still, I'm not counting on $600 cards.

Ventus is their near-MSRP line. 2x is probably dual-fan with a small heatsink, 3x has better cooling. The X Trio is the step up from that.

For reference, the Ventus 3x 4070 Ti (there is no 2x for the 4070 Ti) non-OC is $815 at Micro Center. The OC version is $860. The Gaming Trio non-X is $880 and the X is $890. I think the store plugged in 4070 Ti-equivalent prices for the 4070 models somehow.

pyrotek fucked around with this message at 20:58 on Apr 3, 2023

Branch Nvidian
Nov 29, 2012



Prescription Combs posted:

I definitely don't regret getting the 7900XTX. Thing has been completely solid and stable.

I don’t regret getting a 7900XTX, but AMD’s driver software absolutely has a learning curve to it. FFXIV in borderless window mode doesn’t obey frame rate capping, so the GPU cranks out 300+ fps while the fans spin up to maximum speed and sound like a jet engine (found a solution for this using the Chill feature eventually).

RE4 Remake has crashes on it as well that are different from the crashes my 3070 Ti was running into. I’ll exit a cutscene, bust through a door, or enter/exit combat and the game will hitch for a few frames and then just freeze. Don’t know if this is the card’s fault or not. Haven’t tested anything else out with it other than compiling shaders on TLOU, and lmao, that’s not worth loving with at this juncture.

Lamquin
Aug 11, 2007
The ongoing concerns about VRAM for the future have absolutely put a damper on the honeymoon of getting a 4070 Ti (12GB) back in January.
I'm hoping it'll be a gradual thing over several years and not whatever The Last of Us port is showing us, but talk about a mood killer. I run games at 1440p, was aware of the concerns regarding 4K when I purchased it, and this became an issue far quicker than I thought. :shobon:

Watching Digital Foundry's video slamming the port does at least bring a bit of hope that maybe future developers will try to find a way to work around it.

UHD
Nov 11, 2006


Lamquin posted:

The ongoing concerns about VRAM for the future have absolutely put a damper on the honeymoon of getting a 4070 Ti (12GB) back in January.
I'm hoping it'll be a gradual thing over several years and not whatever The Last of Us port is showing us, but talk about a mood killer. I run games at 1440p, was aware of the concerns regarding 4K when I purchased it, and this became an issue far quicker than I thought. :shobon:

Watching Digital Foundry's video slamming the port does at least bring a bit of hope that maybe future developers will try to find a way to work around it.

or maybe its just a really lovely port and shouldnt be indicative of the future unless lovely pc ports are the norm

wait poo poo okay maybe it is gonna be that bad

glibness aside i dunno. either devs will make it work or theyll shut out so many customers the port wont make financial sense at all. cp2077 is a showcase for a lot of the hot poo poo gpu tech and even it will run on weaker cards if you accept lower settings. we're still a long ways away from raytracing being necessary instead of just neat

Dr. Video Games 0031
Jul 17, 2004

It's a lovely port, but it's also the third game this year already that has run into issues with 8GB cards, after HogLeg and RE4make, and there were a few last year too (Miles Morales, for one example). At some point, you have to stop calling these outliers, because there are bound to be more games like this this year.

That said, I'm not so doom and gloomy about 12GB cards yet. And hopefully DirectStorage with GPU decompression can help speed up asset streaming and reduce the required vram capacity or make it less costly when vram is exceeded.

Shumagorath
Jun 6, 2001
I’m a lot happier with a 12GB 3080 than I would be with a 12GB 4070 “Ti”, because I’d want the latter to carry me a lot longer. I even paid $1000 CAD for the former a year ago and I’m still happier with it because expectations are met.

The 5000 series will be pointless unless I make it drive something bigger than 1200p60. The 3080 12GB can RT that all day in any game with reasonable optimization.

TOOT BOOT
May 25, 2010

Is GDDR expensive or is this just market segmentation?

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

TOOT BOOT posted:

Is GDDR expensive or is this just market segmentation?

GPU manufacturers are limited by bus size. The VRAM options they had for the 3070 were 8GB or 16GB. It's weird as hell to me that they didn't go with 16GB given that the $330 RTX 3060 has 12GB.

They could have done a 16GB 3070 and 12GB 3080.

hobbesmaster
Jan 28, 2008

TOOT BOOT posted:

Is GDDR expensive or is this just market segmentation?

For nvidia at least it’s more about architecture and the number of chips that can be used. When they cut down the die they cut off memory bus width too.

Dr. Video Games 0031
Jul 17, 2004

TOOT BOOT posted:

Is GDDR expensive or is this just market segmentation?

It's just market segmentation. The 3060 has 12GB of GDDR6 out of necessity due to the memory bus width, and Nvidia didn't have a problem selling that for much cheaper than the 8GB 3070 or 10GB 3080. Meanwhile, Intel produced 16GB and 8GB versions of the A770, and they were priced just $20 apart initially. It's easy to conclude from this that GDDR6 is probably pretty cheap.
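The price inference in this post is simple enough to spell out (list prices as quoted above; the arithmetic and variable names are mine, purely back-of-envelope):

```python
# Back-of-envelope GDDR6 cost inference from the A770 example above.
# The $20 gap between the 8GB and 16GB A770 at launch bounds what Intel
# was effectively charging per extra gigabyte of VRAM.
price_gap_usd = 20            # 16GB A770 vs 8GB A770 launch pricing
extra_vram_gb = 16 - 8        # additional VRAM on the bigger card
print(price_gap_usd / extra_vram_gb)  # 2.5 -> roughly $2.50 per GB
```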

hobbesmaster
Jan 28, 2008

Dr. Video Games 0031 posted:

It's just market segmentation. The 3060 has 12GB of GDDR6 out of necessity due to the memory bus width, and Nvidia didn't have a problem selling that for much cheaper than the 8GB 3070 or 10GB 3080. Meanwhile, Intel produced 16GB and 8GB versions of the A770, and they were priced just $20 apart initially. It's easy to conclude from this that GDDR6 is probably pretty cheap.

The 3060 12GB is because the other option is 6GB, and that would be unacceptable. The 4000 series “cut down” chips have relatively narrow buses, so they're already maxed out at these lower VRAM sizes. 192 bit = 6 chips. GDDR6 comes in 8Gb or 16Gb chips, so 12GB is the best they can do on a 4070 (Ti).

The 3000 series equivalents however had larger buses and used 8Gb chips because those were the more available type. That’s why there kept being rumors of “double vram” cards of various types.

I guess that’s still market segmentation, but the VRAM options are very limited for nvidia when they cut down their chips in ways that AMD and Intel designed around.
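The chip-count arithmetic in this post can be sketched directly (a toy illustration; the helper name is mine, and it assumes exactly one GDDR6 chip per 32-bit channel with the 8Gb/16Gb densities described above):

```python
# One GDDR6 chip hangs off each 32-bit memory channel, and GDDR6 ships
# in 8Gb (1GB) and 16Gb (2GB) densities, so bus width fixes the menu of
# possible VRAM capacities.

def vram_options(bus_width_bits: int) -> list[int]:
    """Possible VRAM capacities in GB for a given memory bus width."""
    chips = bus_width_bits // 32       # one chip per 32-bit channel
    return [chips * density for density in (1, 2)]

print(vram_options(192))  # 4070/4070 Ti class: [6, 12] -> 12GB ceiling
print(vram_options(256))  # 3070 class: [8, 16]
print(vram_options(320))  # 3080 10GB class: [10, 20]
```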

hobbesmaster fucked around with this message at 15:28 on Apr 4, 2023

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Zero VGS posted:

Oh that's right, I totally remember I got one of those RX 480 cards at launch with 4GB of ram, and I was able to flash it into 8GB. That "download more memory" meme actually came true!

https://www.techpowerup.com/223893/amd-4gb-radeon-rx-480-can-be-flashed-into-8gb


Reminds me of the old ATI 9500 nonpro -> 9700pro mod.

Dr. Video Games 0031
Jul 17, 2004

hobbesmaster posted:

The 3060 12GB is because the other option is 6GB, and that would be unacceptable. The 4000 series “cut down” chips have relatively narrow buses, so they're already maxed out at these lower VRAM sizes. 192 bit = 6 chips. GDDR6 comes in 8Gb or 16Gb chips, so 12GB is the best they can do on a 4070 (Ti).

The 3000 series equivalents however had larger buses and used 8Gb chips because those were the more available type. That’s why there kept being rumors of “double vram” cards of various types.

I guess that’s still market segmentation, but the VRAM options are very limited for nvidia when they cut down their chips in ways that AMD and Intel designed around.

They are the ones who design the architectures, and they have the ability to make these chips any bus width they want. So they can also use any VRAM amount they want, as long as they design the GPUs accordingly.

New Zealand can eat me
Aug 29, 2008

:matters:


Who makes the least worst pci-e risers? Some of the language model poo poo I'm doing wants >24GB & <40GB of VRAM, and supports multiple GPUs. Feeling pretty :smugdog: for keeping the Radeon VII now. Also, what is the longest one I can reasonably hope to get away with? I don't plan on gaming with it at all, it's purely going to be a VRAM bitch (as it should, that HBM2 is quick). If I could get it outside of the case that would be ideal as it's a significant source of heat.

The only earnest complaint I have about the Merc 310 7900XTX is how loving fat it is. I don't mind a long card at all (The Sapphire Tri-X Fury OC was something like 14.1" long), but occupying 4 slots is fuckin rude.

Also snagging 128GB of RAM, because these things expect 60GB to be able to quantize the models and I'm sure they'll only get bigger. I can't really complain tho, these feel like '2023 problems', and I'll finally be able to run the UE5 cityscape demo properly after this :v:

New Zealand can eat me fucked around with this message at 05:56 on Apr 4, 2023

Josh Lyman
May 24, 2009


How much can Nvidia really be saving by using narrower buses and (relatively) less VRAM? The only rational reason is planned obsolescence unless they're somehow just focused on FLOPS?

Cygni
Nov 12, 2005

raring to post

Product segmentation (just like the rest of this market), and yes, it can be significantly cheaper to make in the BOM: simpler signaling, simpler packaging, simpler PCB layouts, fewer layers, smaller coolers, smaller VRMs, less capacitance, etc.

That said, as always, the vast majority of the cost of these products is in the engineering and tooling that is done before a single MEGA GAMER card is sold.

E: also worth pointing out that when these chips were taping out, GDDR6 prices had hit $16/GB on DigiKey and people were talking about the chip shortage lasting years. Wouldn’t surprise me if that factored into the planning.

Cygni fucked around with this message at 06:53 on Apr 4, 2023

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib
The introduction of non-binary RAM chips should alleviate the constraint that bus width places on memory capacity. It's probably too late for any cards in this generation, but at least in the future it means Nvidia/AMD won't be forced to choose between 1GB and 2GB chips.
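The point about non-binary densities can be shown with the same kind of chips-per-bus arithmetic (a hypothetical sketch with my own helper name; 24Gb parts are the 3GB non-binary chips being introduced):

```python
# Adding a 24Gb (3GB) non-binary density to the usual 8Gb/16Gb lineup
# creates in-between capacities without touching the bus width.

def capacities(bus_width_bits: int, densities_gb=(1, 2, 3)) -> list[int]:
    chips = bus_width_bits // 32       # one chip per 32-bit channel
    return sorted(chips * d for d in densities_gb)

print(capacities(192))  # [6, 12, 18] -> 18GB now fits on a 192-bit bus
print(capacities(128))  # [4, 8, 12]
```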

Shrimp or Shrimps
Feb 14, 2012


Cross-Section posted:

I'm the weirdo who cranks the res up to 2.25x DLDSR in every game that supports it simply because he can :cool:

I do DLDSR 2.25x on my 1080p monitor with a 3080 so that's like 2880 x 1620 and also use DLSS to upscale from a lower internal render. I don't know if it's just placebo or if I'm just dumb, but it looks better to me than doing native and using game engine AA like 100% of the time.

The 3080 pulls double duty on a 4k tv though and there is definitely no using DLDSR there!
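For anyone puzzling over the numbers in this post: DLDSR's 2.25x factor is a pixel-count multiplier, i.e. 1.5x per axis (a quick check; the helper name is mine):

```python
# DLDSR/DSR factors scale total pixel count, so each axis scales by
# the square root of the factor.

def dldsr_resolution(width: int, height: int, factor: float = 2.25):
    scale = factor ** 0.5              # 2.25x pixels -> 1.5x per axis
    return int(width * scale), int(height * scale)

print(dldsr_resolution(1920, 1080))    # (2880, 1620), as in the post
```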

Shrimp or Shrimps fucked around with this message at 09:28 on Apr 4, 2023

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
My impression is that as more and more current-gen/PS5 ports come to PC, the PC ports are suffering because they have no equivalent of the PS5's super-fast decompression capability

PCs have DirectStorage, but it's just not being used at all

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

New Zealand can eat me posted:

Who makes the least worst pci-e risers?

You want a riser that is either solid PCB, flexible PCB, or "twinaxial". Just don't get anything that's a normal ribbon cable; it will have crosstalk and probably won't hold up at PCIe 3.0/4.0 speeds.

edit: I can vouch that this works perfectly: https://www.amazon.com/LINKUP-RX6900XT-High-Speed-Gen4%E2%94%83Universal-Compatible/dp/B08XN26DRZ/

Zero VGS fucked around with this message at 13:07 on Apr 4, 2023


Dr. Video Games 0031
Jul 17, 2004

Zedsdeadbaby posted:

My impression is that as more and more current-gen/PS5 ports come to PC, the PC ports are suffering because they have no equivalent of the PS5's super-fast decompression capability

PCs have directstorage but it's just not being used, at all

I think that's a part of it. If developers can't stream as much data per second off your storage drive, then they may need to preload more data which causes longer load times and more VRAM consumption. It's also possible that developers are trying to account for players who have slower SATA SSDs. DirectStorage could help, but GPU decompression only came out early this year/late last year, so it may take some time for it to trickle into shipped games.
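The preloading tradeoff described here can be put in back-of-envelope terms (all numbers purely illustrative; the helper is mine):

```python
# If assets can't be streamed in within the latency budget, the engine
# has to keep them resident in VRAM instead: slower effective streaming
# (e.g. CPU decompression, SATA SSDs) means a larger resident set.

def resident_vram_gb(working_set_gb: float,
                     stream_rate_gb_s: float,
                     budget_s: float) -> float:
    """VRAM needed when everything not streamable in time is preloaded."""
    streamable = stream_rate_gb_s * budget_s
    return max(working_set_gb - streamable, 0.0)

# 20GB of level assets and a 1-second streaming budget:
print(resident_vram_gb(20, 8.0, 1.0))  # fast decompression path: 12.0
print(resident_vram_gb(20, 2.0, 1.0))  # slow SATA-SSD assumption: 18.0
```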
