|
man is jedi survivor half-baked on PC, i'm on the second "open" planet and have hit a location where the game is consistently crashing
|
# ? May 1, 2023 01:15 |
|
Shipon posted: Have their games ever been bad performance wise? The bugs have always come in other forms like hilarious physics bugs or required quests just not working randomly.

Skyrim and Fallout had infamously bad performance and save corruption issues on the PS3 IIRC, but I think that was more down to the architecture of the console and how they tried to kludge the engine into working on it.
|
|
# ? May 1, 2023 01:17 |
|
I’m pretty sure Skyrim had some graphics weirdness on release but that was over 10 years ago
|
# ? May 1, 2023 01:19 |
|
fallout 76 runs like dogshit but it's an online thing so many people give them a pass on it
|
# ? May 1, 2023 01:20 |
|
Starfield is going to be Bethesda's first DX12 game, I believe, so you should be ready for anything.
|
# ? May 1, 2023 01:21 |
|
Branch Nvidian posted: Reporting back in on my FFXIV GPU issue. Turns out the problem was being caused by LG's on-screen control application, which is used to update monitor firmware and such...

Oh yeah, always stay away from software like that. Anything Asus (although AI Suite won't work with Windows' virtualization enabled anyway lol), Samsung SSD stuff, LED controls, mouse & keyboard poo poo. Glad you found it at least
|
# ? May 1, 2023 01:32 |
|
ah yes RGB software that doesnt save the settings into the firmware those are the loving best
|
# ? May 1, 2023 01:36 |
|
Shipon posted: Have their games ever been bad performance wise? The bugs have always come in other forms like hilarious physics bugs or required quests just not working randomly.
|
# ? May 1, 2023 02:43 |
|
Arrath posted: Skyrim and Fallout had infamously bad performance and save corruption issues on the PS3 IIRC, but I think that was more down to the architecture of the console and how they tried to kludge the engine into working on it.

My favorite FO3 thing was the Alaska DLC where something about the light reflections off the snow would hard lock the PS3 and you'd have to pull the power cable out of the console because nothing would work on the system, including the power button. After the patch that came with that DLC, sometimes the system would also hard lock because an explosion was too bright.
|
# ? May 1, 2023 02:59 |
|
Dr. Video Games 0031 posted: Starfield is going to be Bethesda's first DX12 game, I believe, so you should be ready for anything.

oh good point. i'd certainly expect all sorts of stuttering problems and other weirdness, just probably not being graphically ambitious to the point where they're completely dependent on poorly implemented upscaling
|
# ? May 1, 2023 03:05 |
|
lih posted: oh good point

they dont need to push the technical envelope to be bethjank, the sheer weight of project management alone would do it already
|
# ? May 1, 2023 03:51 |
|
Palladium posted: they dont need to push the technical envelope to be bethjank

like it's great for train ride games like cod because they churn those out and have a formula (even for mp) activision's hundreds of people can follow to the letter, but i'm not sure it's going to work as well for a game that theoretically can have any combination of things and quest flags happening or not happening
|
# ? May 1, 2023 04:36 |
|
Cygni posted: https://www.digitimes.com.tw/tech/dt/n/shwnws.asp?CnlID=1&Cat=40&id=0000662749_VR76FCFB51XKH38BW2VPS

Initially, when Apple dropped a fuckton of their TSMC 3 reservations, I had attributed it to a grim forecast of the near future. In hindsight, it sounds much more like them to find out that it sucks and be like "nah, you can keep that"
|
# ? May 1, 2023 05:17 |
|
Upgraded my sffpc’s card from a 2060 to a 4070, and have been thoroughly enjoying playing on my LG-C2. In other controversial opinions, man have I been enjoying Jedi Survivor. Most beautiful game I have played yet, and the core gameplay is incredible.
|
# ? May 1, 2023 05:19 |
|
Starfield is going to be fine. It’s not going to be technically amazing but it’s not going to be whatever this current poo poo is.
|
# ? May 1, 2023 05:51 |
|
But will it have working doors (no, door technology is too advanced)
|
# ? May 1, 2023 06:18 |
|
Is the 7900xt for $700 a good value proposition? I have the opportunity to buy one at that price (though a model made by my least liked amd aib) and it seems reasonable vis-a-vis its performance. I ask here due to reviews (rightfully) dismissing it as too expensive at msrp and glossing over its performance as a result. I would prefer a 4080, but that is at least $300 more. I like raytracing more than I thought I would and dlss 2 looks better than fsr 2 to me.
|
# ? May 1, 2023 06:19 |
|
Shipon posted: Have their games ever been bad performance wise? The bugs have always come in other forms like hilarious physics bugs or required quests just not working randomly.

Fallout 4 would drop to literally 0 fps sometimes for a few seconds on one of the consoles at launch
|
# ? May 1, 2023 06:27 |
|
Yudo posted: Is the 7900xt for $700 a good value proposition? I have the opportunity to buy one at that price (though a model made by my least liked amd aib) and it seems reasonable vis-a-vis its performance. I ask here due to reviews (rightfully) dismissing it as too expensive at msrp and glossing over its performance as a result.

:S sure sounds like you should get your nvidia card buddy
|
# ? May 1, 2023 06:28 |
|
DoctorRobert posted: :S sure sounds like you should get your nvidia card buddy

Paying $1000+ for a video card is a line I won't cross, and lol at $850 for 12gb of vram in 2023. There is nothing for me to buy. I need a new card and am trying to get the least hosed, meaning I am okay with settling if the price is right.

Yudo fucked around with this message at 06:35 on May 1, 2023 |
# ? May 1, 2023 06:32 |
|
tbh bethesda is the only AAA studio that I expect could give any fucks at all about PC. it seems like an afterthought everywhere but if anyone would at least try it would be Todd and his seven dwarves right?
|
# ? May 1, 2023 06:45 |
|
so with my price rant, let me ask the moore's law bit: if the 7600XT (6700 non-XT plus a bit-ish performance, which is a chunk above the 6600XT) launched at $300 or $329 with 8GB, the 4070 12GB (3080/6800XT) slots in above it at a de facto $500, and then partners launched custom 7600XT clamshell 16gb cards at $349/$379, or even $399 (slower but 16gb vs 12gb VRAM and cheaper), what would you buy there?

navi 33 is legit pretty cheap: it's monolithic 6nm and decently smaller than navi 23 on the same node while also being slightly faster. it will truly be interesting; I don't have a good sense which way this will go between market pressure, AMD's internal wafer economy, and overall cost pressure from manufacturing. I kinda think there's a decent chance it'll land uncomfortably higher than the clearance prices, but who knows, maybe AMD wants to do business and is willing to eat margins to move wafers. Navi 33 is actually going to be a solid product and a reasonably efficient performer, I think.
|
# ? May 1, 2023 07:36 |
|
if that is indeed the 7600xt and not, say, a 7600 no-letters, the SKU spread for navi 32 is going to be a little weird, because the W7800 in the radeon pro series implies that a possible 7800 sku is going to be a navi 31 die with two dead MCDs and 70 active CUs. assuming navi 33 is the 7600xt:

super cut down navi 31 (as seen in the W7800) -> 7800 XTX with 16gb?
navi 32 with 4 MCDs -> 7800 no letters + XT with 16gb
navi 32 with 3 MCDs -> 7700 no letters + XT still with 12gb
|
# ? May 1, 2023 07:54 |
|
Anime Schoolgirl posted: if that is indeed the 7600xt and not, say, a 7600 no-letters, the SKU spread for navi 32 is going to be a little weird, because the W7800 in the radeon pro series implies that a possible 7800 sku is going to be a navi 31 die with two dead MCDs and 70 active CUs.

from what I have seen this seems to have been a MCM penalty. N33 looks like a solid performer that made about the same perf/mm2 step without making the node step; it's iso-node, so the shortfall elsewhere really is an MCM penalty after all.

yes, that leaves N32 in a super hosed up slot, because it's pretty expensive compared to N33 but not really all that much faster, since it's got the MCM penalty too. And it still uses a pretty significant amount of 6nm MCD space for its mediocrity: 4x 37.5mm2 is 150mm2 of 6nm MCD area, only slightly less than a whole N33 die (Navi 31's 6x 37.5mm2 = 225mm2 is more than one). for the additional marginal cost of the 5nm GCD in this scenario, is adding a big 5nm GCD going to return anything notable in additional revenue?

AMD completely has a massive hole in the middle of their line unfortunately. N33 is going to be great. N31 is ok. N32 is going to be interesting.

Paul MaudDib fucked around with this message at 08:30 on May 1, 2023 |
# ? May 1, 2023 08:04 |
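To make the MCD area arithmetic in the post above easy to sanity-check, here's a quick sketch. Note the ~37.5mm2-per-MCD figure and the 6-vs-4 MCD counts come from the posts themselves, not an official spec, so treat them as assumptions:

```python
# Rough sketch of the MCD area math from the post above.
# Assumption (from the thread): each Navi 31/32 MCD is ~37.5 mm^2 of 6nm silicon.
MCD_AREA_MM2 = 37.5

def mcd_area(n_mcds: int) -> float:
    """Total 6nm area spent on memory/cache chiplets for a given config."""
    return n_mcds * MCD_AREA_MM2

# Navi 31 uses 6 MCDs; the Navi 32 config discussed above uses 4.
print(mcd_area(6))  # 225.0 mm^2
print(mcd_area(4))  # 150.0 mm^2
```

Both figures land in the same ballpark as a whole monolithic N33 die, which is the point the post is making about N32's cost structure.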
|
This is a weird one. I installed a 4090 FE. TimeSpy Extreme gets really good results compared to similar hardware on the 3DMark scoreboard. But for some reason, the regular TimeSpy gets a very low graphics score of 22,000, while the average for my hardware is 33,000. The GPU does not go up to 100% utilization. It's definitely not CPU limited with my overclocked 12900k. I tried nuking the drivers with DDU, changing the power plan to high performance, and disabling G-Sync and V-Sync. No dice. I suspect that some games are also running worse than they should, so I definitely wanna get this fixed.
|
# ? May 1, 2023 08:33 |
|
You might have an FPS cap set somewhere, perhaps in MSI Afterburner or your nvidia control panel. The GPU utilization not being 100% on an easier timespy run is a telltale sign.

As you run gsync, it's likely you set the framerate cap to something just under your monitor's refresh rate; the easier timespy run could go higher than that, but the cap prevents it. The more stressful timespy extreme probably fully uses your GPU while still not hitting the framerate cap, hence you getting an expected score on that. That's my assumption at least.

Zedsdeadbaby fucked around with this message at 13:11 on May 1, 2023 |
# ? May 1, 2023 13:09 |
|
Zedsdeadbaby posted: You might have an FPS cap set somewhere, perhaps in msi afterburner or your nvidia control panel. The GPU utilization not being 100% on an easier timespy run is a telltale sign

That made so much sense that I actually ran to the computer to surely fix the problem. Sadly, I didn't have a framerate cap in either the NVIDIA control panel or RivaTuner, the only two places I've ever applied a cap. And I'd already nuked and reinstalled both applications. I tried setting a limit and then removing it, but that didn't fix it.

While monitoring the hardware during the test, the FPS doesn't seem to be 'pegged' at a specific framerate. It'll go from 140fps up to the high 160s, and the GPU is fully boosting to almost 3GHz and the CPU to 5GHz. But they are both pulling lower wattage and utilization, and the fans are barely spinning. This also seems to happen in Metro Enhanced, where the framerate will drop to the 40s, which is not good. If it was just TimeSpy, I wouldn't care. It's a head scratcher.

I just tried a motherboard BIOS update, which didn't fix it. The only thing I can think of is that I'm not using all 4 PCI-E cables with the included squid adapter. But if that were the issue, you'd think TimeSpy Extreme would suffer even more for lack of power. Also, the RivaTuner monitor doesn't show the GPU as being power limited during the test. It doesn't show it's limited at all.

Animal fucked around with this message at 14:02 on May 1, 2023 |
# ? May 1, 2023 13:58 |
|
That's very strange, what's your PSU? We've had goons in the thread before with odd framerate behaviour and it turned out their PSUs weren't quite providing enough juice. The 4090 and 12900k are both very demanding power-wise. The GPU would definitely benefit from a proper power cabling setup at the least.

quote: The only thing I can think of is that I'm not using all 4 PCI-E cables with the included squid adapter.

It's possible your GPU is only drawing up to 300W and not the full 450W it needs. If I recall right, if it can only 'see' 300W being fed to it, it will treat that as 100% of its power, instead of just 66% of 450W, hence the weird readouts you're getting. If I was a betting man, I'd say a proper, fully utilized connection will get the full power input again. But it sounds like you may need a new PSU. You need to use all 4 connections.
|
# ? May 1, 2023 14:10 |
|
Just hook up one of the daisy chained power connectors to the adapter. That's perfectly fine to do.
|
# ? May 1, 2023 14:12 |
|
Zedsdeadbaby posted: It's possible your GPU is only drawing up to 300W and not the full 450W it needs. If I recall right, if it can only 'see' 300W being fed to it, it will treat that as 100% of its power, instead of just 66% of 450W, hence the weird readouts you're getting. If I was a betting man, I'd say a proper, fully utilized connection will get the full power input again. But it sounds like you may need a new PSU. You need to use all 4 connections.

AFAIK the 4090 won't boot with only two power connectors attached, which is what would theoretically limit it to 300W. The only states it will accept are three connectors (which limits it to a flat 450W) or four connectors (which makes it default to 450W with the option of raising it to 600W).
|
# ? May 1, 2023 14:13 |
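The connector-count behaviour described above (150W per 8-pin into the squid adapter, three connectors giving a flat 450W, four giving 450W raisable to 600W) can be sketched as a quick sanity-check. These numbers are taken from the posts in this thread, not from an official spec, so treat them as assumptions:

```python
# Sketch of the 12VHPWR adapter logic described in the posts above.
# Assumptions (from the thread, not a spec): each 8-pin feeding the
# adapter is good for 150W; the card won't boot on fewer than 3.
def power_limit(connectors: int) -> tuple[int, int]:
    """Return (default_limit_w, max_limit_w) for a given connector count."""
    if connectors < 3:
        raise ValueError("card reportedly won't boot with fewer than 3 connectors")
    if connectors == 3:
        return (450, 450)  # flat 450W, no headroom
    return (450, 600)      # 4 connectors: default 450W, raisable to 600W

print(power_limit(3))  # (450, 450)
print(power_limit(4))  # (450, 600)
```

This also matches the symptom discussed above: with only three connectors, Afterburner wouldn't offer the 133% power slider, since there's no headroom beyond 450W.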
|
Oh of course, 150 per connection, which makes a lot of sense. It's been a while since I watched GN's video on the 4090 and I got the three/four connectors info mixed up. I do reckon Animal's issue stems from the power supply/connection for sure. So it sounds like his card is working at max 450W, and it's trying to draw more than that, but can't?
|
# ? May 1, 2023 14:17 |
|
Zedsdeadbaby posted: That's very strange, what's your PSU? We've had goons in the thread before with odd framerate behaviour and it turned out their PSUs weren't quite providing enough juice. The 4090 and 12900k are both very demanding power-wise. The GPU would definitely benefit from a proper power cabling setup at the least.

Another great suggestion. But it actually pulls up to 450w (and spikes higher) during TimeSpy Extreme and Cyberpunk. I'll try daisy chaining the cables just in case.
|
# ? May 1, 2023 14:18 |
|
have you checked if anything weird is going on with the PCIe bandwidth? GPU-Z shows the link speed
|
# ? May 1, 2023 14:26 |
|
I just daisy-chained the last power connector. It now allows Afterburner to boost up to 133%! But it didn't fix the low utilization issue... I also tried setting the Windows power plan and the NVIDIA control panel power setting both to max performance.

repiv posted: have you checked if anything weird is going on with the PCIe bandwidth?
|
# ? May 1, 2023 14:30 |
|
oh you might need to put the GPU under load to get it to switch to PCIe 3/4 mode. all 16 lanes are active at least
|
# ? May 1, 2023 14:32 |
|
This is saying that your card is running at PCIe 2.0 speeds. You'll have to go back into your BIOS and see if there's a way to set what PCIe speed your GPU slot is operating at.

edit: Or maybe not? I don't know much about GPU-Z so repiv probably knows better than me

edit 2: yeah, that's how gpu-z works. ignore me then.
|
# ? May 1, 2023 14:32 |
|
My 4090 says x8 4.0 at idle (which is differently weird). I'd say it's at least worth checking the BIOS to see if it says anything about the speed of that slot. Do you have a riser cable or some other unusual connection between the card and the board?
|
# ? May 1, 2023 14:34 |
|
repiv posted: oh you might need to put the GPU under load to get it to switch to PCIe 3/4 mode

It goes up to 4 when I put load on it with that little button on the app itself.
|
# ? May 1, 2023 14:36 |
|
power crystals posted: My 4090 says x8 4.0 at idle (which is differently weird). I'd say it's at least worth checking the BIOS to see if it says anything about the speed of that slot. Do you have a riser cable or some other unusual connection between the card and the board?

Nope, and I've updated the BIOS and gone back to defaults since I spotted the problem. I'll dig deeper in the BIOS. But it's running at x16 4.0 under load, so I don't know what else to look for.
|
# ? May 1, 2023 14:37 |
|
|
Dr. Video Games 0031 posted: This is saying that your card is running at PCIe 2.0 speeds. You'll have to go back into your bios and see if there's a way to set what PCIe speed your GPU slot is operating at.

yeah, they dynamically clock the bus down at idle to save power and GPU-Z reflects the current state
|
# ? May 1, 2023 14:39 |
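Since the driver reports the *current* link state, a low gen reading at idle is expected power saving, not a fault; it only matters if the link stays slow under load. A minimal sketch of checking this programmatically, assuming a recent driver with `nvidia-smi` on the PATH (the query keys are real `nvidia-smi` fields, but treat the exact CSV formatting as an assumption):

```python
import subprocess

# nvidia-smi query keys for PCIe link state (see nvidia-smi --help-query-gpu).
QUERY = "pcie.link.gen.current,pcie.link.gen.max,pcie.link.width.current"

def query_link_csv() -> str:
    """Ask the driver for the current/max PCIe gen and current lane width."""
    return subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

def parse_link_csv(line: str) -> dict:
    """Parse a line like '2, 4, 16' into link-state fields."""
    gen_cur, gen_max, width = (int(v.strip()) for v in line.split(","))
    return {"gen_current": gen_cur, "gen_max": gen_max, "width": width}

def looks_like_idle_downclock(state: dict) -> bool:
    """Gen below max is normal at idle; a low gen *under load* is what
    would point at a BIOS setting, slot, or riser problem."""
    return state["gen_current"] < state["gen_max"]

# e.g. state = parse_link_csv(query_link_csv())  # needs an NVIDIA GPU present
```

So the diagnostic flow the thread lands on is: put the card under load (GPU-Z's render test button, or a benchmark), then read the link state; only a persistently low reading at that point is worth a BIOS dig.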