shrike82
Jun 11, 2005

man is jedi survivor half-baked on PC, i'm on the second "open" planet and have hit a location where the game is consistently crashing


Arrath
Apr 14, 2011


Shipon posted:

Have their games ever been bad performance wise? The bugs have always come in other forms like hilarious physics bugs or required quests just not working randomly.

Skyrim and Fallout had infamously bad performance and save corruption issues on the PS3 IIRC, but I think that was more down to the architecture of the console and how they tried to kludge the engine into working on it.

hobbesmaster
Jan 28, 2008

I’m pretty sure Skyrim had some graphics weirdness on release but that was over 10 years ago

shrike82
Jun 11, 2005

fallout 76 runs like dogshit but it's an online thing so many people give them a pass on it

Dr. Video Games 0031
Jul 17, 2004

Starfield is going to be Bethesda's first DX12 game, I believe, so you should be ready for anything.

sauer kraut
Oct 2, 2004

Branch Nvidian posted:

Reporting back in on my FFXIV GPU issue. Turns out the problem was being caused by LG's on-screen control application, which is used to update monitor firmware and such....

Oh yeah always stay away from software like that. Anything Asus (although AI Suite won't work with Windows' virtualization enabled anyway lol), Samsung SSD stuff, LED controls, mouse&keyboard poo poo.
Glad you found it at least :)

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
ah yes RGB software that doesnt save the settings into the firmware

those are the loving best

some dillweed
Mar 31, 2007

Shipon posted:

Have their games ever been bad performance wise? The bugs have always come in other forms like hilarious physics bugs or required quests just not working randomly.

Morrowind on Xbox was also pretty bad, from what I saw. When I worked in a store's photo/electronics department, someone brought that game back saying that their kid told them it didn't work. We had an Xbox to test it, and when I put the disc in, it started up normally but just ran really poorly and choppily. They saw it running that way, and I think they might have asked if it was "supposed to look like that" and seemed disappointed. In the end, I think they couldn't return it because our manager/policies didn't allow returns for being unsatisfied with a game's performance.

Branch Nvidian
Nov 29, 2012



Arrath posted:

Skyrim and Fallout had infamously bad performance and save corruption issues on the PS3 IIRC, but I think that was more down to the architecture of the console and how they tried to kludge the engine into working on it.

My favorite FO3 thing was the Alaska DLC where something about the light reflections off the snow would hard lock the PS3 and you'd have to pull the power cable out of the console because nothing would work on the system, to include the power button. After the patch that came with that DLC, sometimes the system would also hard lock because an explosion was too bright.

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.

Dr. Video Games 0031 posted:

Starfield is going to be Bethesda's first DX12 game, I believe, so you should be ready for anything.

oh good point

i'd certainly expect all sorts of stuttering problems and other weirdness, just probably not being graphically ambitious to the point where they're completely dependent on poorly implemented upscaling

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

lih posted:

oh good point

i'd certainly expect all sorts of stuttering problems and other weirdness, just probably not being graphically ambitious to the point where they're completely dependent on poorly implemented upscaling

they dont need to push the technical envelope to be bethjank

the sheer weight of project management alone would do it already

Anime Schoolgirl
Nov 28, 2002

Palladium posted:

they dont need to push the technical envelope to be bethjank

the sheer weight of project management alone would do it already

as many as 500 people are in the starfield dev team and i'm left wondering if bethesda has the management capacity to make that run the least bit smoothly

like it's great for train ride games like cod because they churn those out and have a formula (even for mp) activision's hundreds of people can follow to the letter, but i'm not sure it's going to work as well for a game that theoretically can have any combination of things and quest flags happening or not happening

New Zealand can eat me
Aug 29, 2008

:matters:


Cygni posted:

https://www.digitimes.com.tw/tech/dt/n/shwnws.asp?CnlID=1&Cat=40&id=0000662749_VR76FCFB51XKH38BW2VPS

DigiTimes reporting that AMD and Nvidia wont move to TSMC 3 until 2025, and have likely delayed future plans for using smaller nodes. Yield rate, performance, and wafer price are apparently disappointing on the node. So wouldn't expect any new high end GPUs (except for maybe a 4090 Ti) for quite a while.

Initially, when Apple dropped a fuckton of their TSMC 3 reservations, I attributed it to a grim forecast of the near future. In hindsight, it sounds much more like they found out that it sucks and went "nah, you can keep that"

meta²
Sep 11, 2001

What the flip was Grandma doing at the dunes?

Upgraded my sffpc’s card from a 2060 to a 4070, and have been thoroughly enjoying playing on my LG-C2.

In other controversial opinions, man have I been enjoying Jedi Survivor. Most beautiful game I have played yet, and the core gameplay is incredible.

MarcusSA
Sep 23, 2007

Starfield is going to be fine. It’s not going to be technically amazing but it’s not going to be whatever this current poo poo is.

SCheeseman
Apr 23, 2003

But will it have working doors (no, door technology is too advanced)

Yudo
May 15, 2003

Is the 7900xt for $700 a good value proposition? I have the opportunity to buy one at that price (though a model made by my least liked amd aib) and it seems reasonable vis-a-vis its performance. I ask here due to reviews (rightfully) dismissing it as too expensive at msrp and glossing over its performance as a result.

I would prefer a 4080, but that is at least $300 more. I like raytracing more than I thought I would and dlss 2 looks better than fsr 2 to me.

MagusDraco
Nov 11, 2011

even speedwagon was trolled

Shipon posted:

Have their games ever been bad performance wise? The bugs have always come in other forms like hilarious physics bugs or required quests just not working randomly.

Fallout 4 would drop to literally 0 fps sometimes for a few seconds on one of the consoles at launch

DoctorRobert
Jan 20, 2020

Yudo posted:

Is the 7900xt for $700 a good value proposition? I have the opportunity to buy one at that price (though a model made by my least liked amd aib) and it seems reasonable vis-a-vis its performance. I ask here due to reviews (rightfully) dismissing it as too expensive at msrp and glossing over its performance as a result.

I would prefer a 4080, but that is at least $300 more. I like raytracing more than I thought I would and dlss 2 looks better than fsr 2 to me.

:S sure sounds like you should get your nvidia card buddy

Yudo
May 15, 2003

DoctorRobert posted:

:S sure sounds like you should get your nvidia card buddy

Paying $1000+ for a video card is a line I won't cross, and lol at $850 for 12gb of vram in 2023. There is nothing for me to buy.

I need a new card and am trying to get the least hosed, meaning I am okay with settling if the price is right.

Yudo fucked around with this message at 06:35 on May 1, 2023

Yaoi Gagarin
Feb 20, 2014

tbh bethesda is the only AAA studio that I expect could give any fucks at all about PC. it seems like an afterthought everywhere but if anyone would at least try it would be Todd and his seven dwarves right?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
so with my price rant, let me ask the moore's law question: if the 7600XT (6700 non-XT plus a bit-ish performance, which is a chunk above the 6600XT) launched at $300 or $329 with 8GB, if the 4070 12GB (3080/6800XT) slots in above it at a de facto $500, and then partners launched custom 7600XT clamshell 16GB cards at $349/$379, or even $399 (slower, but 16GB vs 12GB of VRAM and it's cheaper), what would you buy there?

navi 33 is legit pretty cheap, it's monolithic 6nm and decently smaller than navi 23 on the same node while also being slightly faster. it will truly be interesting; I don't have a good sense which way this will go between market pressure, AMD's internal wafer economy, and overall cost pressures from manufacturing. I kinda think there's a decent chance it'll land uncomfortably higher than the clearance prices but who knows, maybe AMD wants to do business and is willing to eat margins to move wafers. Navi 33 is actually going to be a solid product and a reasonably efficient performer, I think.

Anime Schoolgirl
Nov 28, 2002

if that is indeed the 7600xt and not say a 7600 no letters, the SKU spread for navi 32 is going to be a little weird, because the W7800 in the radeon pro series implies that a possible 7800 sku is going to be a navi 31 die with two dead MCDs and 70 active CUs.

assuming navi 33 is 7600xt:
super cut down navi 31 seen in the w7800 -> 7800 XTX with 16gb?
navi 32 with 4 MCDs -> 7800 no letters + XT with 16gb
navi 32 with 3 MCDs -> 7700 no letters + XT still with 12gb

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Anime Schoolgirl posted:

if that is indeed the 7600xt and not say a 7600 no letters, the SKU spread for navi 32 is going to be a little weird, because the W7800 in the radeon pro series implies that a possible 7800 sku is going to be a navi 31 die with two dead MCDs and 70 active CUs.

assuming navi 33 is 7600xt:
super cut down navi 31 seen in the w7800 -> 7800 XTX with 16gb?
navi 32 with 4 MCDs -> 7800 no letters + XT with 16gb
navi 32 with 3 MCDs -> 7700 no letters + XT still with 12gb

from what I have seen this seems to have been an MCM penalty: N33 looks like a solid performer that made about the same perf/mm² step without making a node step, it's iso-node.

yes, that leaves N32 in a super hosed up slot because it's pretty expensive compared to N33 but not really all that much faster, because it's got the MCM penalty. And it still uses a pretty significant amount of 6nm MCD space for its mediocrity. 6x 37.5mm² = 225mm², which is more than a whole N33 die, right?

(edit: 4x 37.5 being 150mm2 of 6nm MCD area, slightly less)

for the additional marginal cost of the 5nm GCD in this scenario, is adding a big 5nm die going to return anything notable in additional revenue?

AMD has a massive hole in the middle of their lineup, unfortunately. N33 is going to be great. N31 is ok. N32 is going to be interesting.

Paul MaudDib fucked around with this message at 08:30 on May 1, 2023
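The die-area arithmetic in the post above can be checked with a quick sketch. The die sizes are approximate public figures, so treat them as assumptions rather than exact numbers:

```python
# Rough RDNA3 chiplet area math, using approximate public die sizes.
MCD_AREA_MM2 = 37.5   # one 6nm memory-cache die (approximate)
N33_DIE_MM2 = 204     # monolithic Navi 33 on 6nm (approximate)

def mcd_area(num_mcds: int) -> float:
    """Total 6nm MCD area for a given chiplet configuration."""
    return num_mcds * MCD_AREA_MM2

navi31_mcds = mcd_area(6)  # full Navi 31: 6 MCDs -> 225.0 mm^2
navi32_mcds = mcd_area(4)  # full Navi 32: 4 MCDs -> 150.0 mm^2

# 6 MCDs alone exceed a whole N33 die; 4 MCDs are less, as the edit notes.
print(navi31_mcds > N33_DIE_MM2)  # True
print(navi32_mcds > N33_DIE_MM2)  # False
```

This is only the 6nm MCD side of the cost; the 5nm GCD area discussed in the post would come on top of it.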

Animal
Apr 8, 2003

This is a weird one. I installed a 4090 FE. TimeSpy Extreme gets really good results compared to similar hardware on the 3DMark scoreboard. But for some reason, the regular TimeSpy gets a very low graphics score of 22,000 but the average for my hardware is 33,000. The GPU does not go up to 100% utilization. It's definitely not CPU limited with my overclocked 12900k. I tried nuking the drivers with DDU, changing the power plan to high performance, disabling G-Sync and V-sync. No dice. I suspect that some games are also running worse than they should, so I definitely wanna get this fixed.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
You might have an FPS cap set somewhere, perhaps in msi afterburner or your nvidia control panel. The GPU utilization not being 100% on an easier timespy run is a telltale sign

As you run gsync, it's likely you set the framerate cap to something just under your monitor's refresh rate; the easier timespy run could go higher than that, but the cap prevents it. The more stressful timespy extreme probably fully uses your GPU but still doesn't hit the framerate cap, hence you getting the expected score on that. That's my assumption at least.

Zedsdeadbaby fucked around with this message at 13:11 on May 1, 2023

Animal
Apr 8, 2003

Zedsdeadbaby posted:

You might have an FPS cap set somewhere, perhaps in msi afterburner or your nvidia control panel. The GPU utilization not being 100% on an easier timespy run is a telltale sign

As you run gsync, it's likely you set the framerate cap to something just under your monitor's refresh rate; the easier timespy run could go higher than that, but the cap prevents it. The more stressful timespy extreme probably fully uses your GPU but still doesn't hit the framerate cap, hence you getting the expected score on that. That's my assumption at least.

That made so much sense that I actually ran to the computer to surely fix the problem. Sadly, I didn't have a framerate cap in either the NVIDIA control panel or RivaTuner, the only two places I've ever applied a cap. And I'd already nuked and reinstalled both applications. I tried setting a limit and then removing it, but that didn't fix it. While monitoring the hardware during the test, the FPS doesn't seem to be pegged at a specific framerate: it'll go from 140fps up to the high 160s, and the GPU is fully boosting to almost 3GHz and the CPU to 5GHz. But they are both pulling lower wattage and utilization, and the fans are barely spinning. This also seems to happen in Metro Enhanced, where the framerate will drop to the 40s, which is not good. If it was just TimeSpy, I wouldn't care. It's a head scratcher.

I just tried a motherboard BIOS update, which didn't fix it. The only thing I can think of is that I'm not using all 4 PCIe cables with the included squid adapter. But if that were an issue, you'd think TimeSpy Extreme would suffer even more for lack of power. Also, the RivaTuner monitor doesn't show the GPU as being power limited during the test. It doesn't show it's limited at all.

Animal fucked around with this message at 14:02 on May 1, 2023

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
That's very strange, what's your PSU? We've had goons in the thread before with odd framerate behaviour and it turned out their PSUs weren't quite providing enough juice. The 4090 and 12900k are both very demanding power-wise. The GPU would definitely benefit from a proper power cabling setup at the least.

quote:

The only thing I can think of is that I'm not using all 4 PCIe cables with the included squid adapter.

It's possible your GPU is only drawing up to 300W and not the full 450W it needs. If I recall right, if it can only 'see' 300W being fed to it, it will treat that as 100% of its power, instead of as 66% of 450W, hence the weird readouts you're getting. If I was a betting man, I'd say a proper, fully utilized connection will get you the full power input again. But it sounds like you may need a new PSU. You need to use all 4 connections.

Dr. Video Games 0031
Jul 17, 2004

Just hook up one of the daisy chained power connectors to the adapter. That's perfectly fine to do.

repiv
Aug 13, 2009

Zedsdeadbaby posted:

It's possible your GPU is only drawing up to 300W and not the full 450W it needs. If I recall right, if it can only 'see' 300W being fed to it, it will treat that as 100% of its power, instead of as 66% of 450W, hence the weird readouts you're getting. If I was a betting man, I'd say a proper, fully utilized connection will get you the full power input again. But it sounds like you may need a new PSU. You need to use all 4 connections.

AFAIK the 4090 won't boot with only two power connectors attached, which is what would theoretically limit it to 300W

the only states it will accept are three connectors (which limits it to a flat 450W) or four connectors (which makes it default to 450W with the option of raising it to 600W)
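The connector-sense behavior described above can be sketched as a simple lookup. This models the behavior as described in the thread for a 4090 FE with the 4x8-pin adapter, not NVIDIA's actual firmware logic:

```python
def power_limit_watts(connectors: int) -> tuple[int, int]:
    """Return (default power limit, max raisable limit) in watts for a 4090 FE
    with the 4x8-pin squid adapter, per the behavior described in the thread.
    Raises ValueError for configurations where the card won't boot."""
    if connectors <= 2:
        raise ValueError("card refuses to boot with fewer than 3 connectors")
    if connectors == 3:
        return (450, 450)  # flat 450 W, no overclocking headroom
    return (450, 600)      # default 450 W, slider can be raised to 600 W

print(power_limit_watts(3))  # (450, 450)
print(power_limit_watts(4))  # (450, 600)
```

This matches Animal's later observation: after connecting the fourth cable, Afterburner's power limit slider unlocked above 100%.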

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Oh of course. 150W per connection, which makes a lot of sense. It's been a while since I watched GN's video on the 4090, and I got the three/four connectors info mixed up.
I do reckon Animal's issue stems from the power supply/connection for sure

So it sounds like his card is working at max 450w, and it's trying to draw in more than that, but can't?

Animal
Apr 8, 2003

Zedsdeadbaby posted:

That's very strange, what's your PSU? We've had goons in the thread before with odd framerate behaviour and it turned out their PSUs weren't quite providing enough juice. The 4090 and 12900k are both very demanding power-wise. The GPU would definitely benefit from a proper power cabling setup at the least.

It's possible your GPU is only drawing up to 300W and not the full 450W it needs. If I recall right, if it can only 'see' 300W being fed to it, it will treat that as 100% of its power, instead of as 66% of 450W, hence the weird readouts you're getting. If I was a betting man, I'd say a proper, fully utilized connection will get you the full power input again. But it sounds like you may need a new PSU. You need to use all 4 connections.

Another great suggestion. But it actually pulls up to 450w (and spikes higher) during TimeSpy Extreme and Cyberpunk. I’ll try daisy chaining the cables just in case.

repiv
Aug 13, 2009

have you checked if anything weird is going on with the PCIe bandwidth?

GPU-Z shows the link speed
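Besides GPU-Z, `nvidia-smi` can report the same link state from the command line. Here is a small sketch that parses its CSV output; the sample string below is hypothetical, standing in for real output from the query:

```python
# On a real system you would run:
#   nvidia-smi --query-gpu=pcie.link.gen.current,pcie.link.width.current \
#              --format=csv,noheader
# which prints one CSV line per GPU. The sample below is a hypothetical
# idle-state reading, not captured output.
sample_output = "2, 16"  # hypothetical: PCIe gen 2 at idle, all 16 lanes active

def parse_pcie_link(csv_line: str) -> tuple[int, int]:
    """Return (PCIe generation, lane width) from one CSV line of query output."""
    gen, width = (field.strip() for field in csv_line.split(","))
    return int(gen), int(width)

gen, width = parse_pcie_link(sample_output)
print(f"PCIe {gen}.0 x{width}")  # a low gen at idle is normal power saving
```

As repiv notes later in the thread, the link generation dropping at idle is expected; it should climb back to 3 or 4 under load.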

Animal
Apr 8, 2003

I just daisy-chained the last power connector. It now allows AfterBurner to boost up to 133%! But it didn't fix the low utilization issue...
I also tried setting the Windows power plan and the NVIDIA control panel power setting both to max performance.


repiv posted:

have you checked if anything weird is going on with the PCIe bandwidth?

GPU-Z shows the link speed

repiv
Aug 13, 2009

oh you might need to put the GPU under load to get it to switch to PCIe3/4 mode

all 16 lanes are active at least

Dr. Video Games 0031
Jul 17, 2004


This is saying that your card is running at PCIe 2.0 speeds. You'll have to go back into your bios and see if there's a way to set what PCIe speed your GPU slot is operating at.

edit: Or maybe not? I don't know much about gpu-z so repiv probably knows better than me

edit 2: yeah, that's how gpu-z works. ignore me then.

power crystals
Jun 6, 2007

Who wants a belly rub??

My 4090 says x8 4.0 at idle (which is differently weird). I'd say it's at least worth checking the BIOS to see if it says anything about the speed of that slot. Do you have a riser cable or some other unusual connection between the card and the board?

Animal
Apr 8, 2003

repiv posted:

oh you might need to put the GPU under load to get it to switch to PCIe3/4 mode

all 16 lanes are active at least

It goes up to 4.0 when I put it under load with that little button in the app itself.

Animal
Apr 8, 2003

power crystals posted:

My 4090 says x8 4.0 at idle (which is differently weird). I'd say it's at least worth checking the BIOS to see if it says anything about the speed of that slot. Do you have a riser cable or some other unusual connection between the card and the board?

Nope, and I've updated the BIOS and gone back to defaults since I spotted the problem. I'll dig deeper in the BIOS. But it's running at 16x 4.0 under load, so I don't know what else to look for.


repiv
Aug 13, 2009

Dr. Video Games 0031 posted:

This is saying that your card is running at PCIe 2.0 speeds. You'll have to go back into your bios and see if there's a way to set what PCIe speed your GPU slot is operating at.

edit: Or maybe not? I don't know much about gpu-z so repiv probably knows better than me

edit 2: yeah, that's how gpu-z works. ignore me then.

yeah they dynamically clock the bus down on idle to save power and GPU-Z reflects the current state
