|
I have an Asrock 6900XT I bought used in the fall but it recently started crashing while gaming. It only happens after I’ve been gaming for a while, so the card is heat soaked. None of my temps in particular are super high, the gpu itself is around 75-80 usually, with a hot spot of 95-100, but it has happened even at 70 degrees. It won’t happen if the whole card is still cold. I ran OCCT VRAM test and I get like 8 errors a minute while the card is warm. Does that mean my card is bad? I think it does but I’m hoping there’s something I can do to salvage it before I go buy a 4090.
|
# ? Apr 27, 2023 13:23 |
|
|
TheCobraEffect posted:I have an Asrock 6900XT I bought used in the fall but it recently started crashing while gaming. It only happens after I’ve been gaming for a while, so the card is heat soaked. None of my temps in particular are super high, the gpu itself is around 75-80 usually, with a hot spot of 95-100, but it has happened even at 70 degrees. It won’t happen if the whole card is still cold. I think it's likely that this is an ex-mining card and the memory has gone through a lot of abuse. The VRAM test returning some errors supports this theory. My first thought is to try underclocking the memory slightly to see if that helps. See if you can do -200 MHz on the memory or something. You probably won't lose much performance and it may help keep things stable.
|
# ? Apr 27, 2023 13:27 |
|
Branch Nvidian posted:I don't have any of those settings available to me right now since I removed the Adrenalin software and went with a driver only install. Install the full suite, there are a lot of useful things in there!
|
# ? Apr 27, 2023 15:44 |
|
Yeah. I know the traumatic stress response from dealing with Nvidia drivers is to go bare drivers only, no GFE, but this is not actually the case with AMD right now. (I will not be held responsible for this statement if this changes a few years down the line, like if Raja wheedles his way back into the fold, or something.) SwissArmyDruid fucked around with this message at 07:02 on Apr 28, 2023 |
# ? Apr 27, 2023 16:11 |
|
Dr. Video Games 0031 posted:I think it's likely that this is an ex-mining card and the memory has gone through a lot of abuse. The VRAM test returning some errors supports this theory. I tried this, same errors in OCCT and it crashes in the same way. Dang, guess I got unlucky in the used gpu market. Thanks!
|
# ? Apr 27, 2023 18:26 |
|
Lol
|
# ? Apr 27, 2023 21:50 |
|
What would be a good upgrade option from the 1070? The 4070 is expensive and I've heard it's not really all that great? The 3070 doesn't feel a whole lot cheaper either. Granted, I'm having a little bit of a "new shiny" urge, so I don't know if I need to upgrade -right now- since it's doing the job for most of the stuff I play, but an upgrade might be nice since it's been so long. Like, I've got a Quest 2, VR Chat via Quest Link runs alright, but when I tried that Valve VR tech demo (Hand Lab, I think) it was basically unplayably laggy, so I'm assuming I'm going to struggle to run most VR stuff that's more involved than VRC.
|
# ? Apr 27, 2023 23:05 |
|
shrike82 posted:
At this point they're literally just throwing random poo poo on these shits.
|
# ? Apr 27, 2023 23:33 |
|
Oxyclean posted:What would be a good upgrade option from the 1070? The 4070 is expensive and I've heard it's not really all that great? The 3070 doesn't feel a whole lot cheaper either. A used 3080 10/12GB would be a nice upgrade, especially if you're around 2560x1440.
|
# ? Apr 27, 2023 23:57 |
|
steckles posted:That is exactly how zero bounce direct lighting works. The randomness comes in either when you’ve got large lights which can be partially occluded, or when you want to know what light is being reflected off non-light surfaces and contributing to the illumination at a particular spot. For that, you can only really do random sampling. There are lots of clever ways to increase the likelihood you’re shooting rays in “good” directions, but randomness is fundamentally baked into the whole concept. VostokProgram posted:It's a classic Monte Carlo approach. Shoot the rays randomly and sum the results. Keep adding rays until it converges. It's what they do for photorealistic renders for movies You want a random sample but you don’t have to walk them in random order, right? I don’t get why you can’t do bounced light the same way, hmm.
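That intuition is roughly what stratified (jittered) sampling does: keep the randomness, but space the samples out evenly instead of drawing them in a purely random order. A toy Python sketch, with a made-up target (the hemisphere integral of cos θ, whose exact value is π) standing in for what a renderer would estimate; none of these function names come from any real renderer:

```python
import math, random

def estimate(samples):
    """Monte Carlo estimate of the hemisphere integral of cos(theta),
    whose exact value is pi. Each sample u in [0,1) stands for a
    uniformly chosen hemisphere direction with cos(theta) = u."""
    return 2.0 * math.pi * sum(samples) / len(samples)

def pure_random(n, rng):
    # Every sample position fully random.
    return [rng.random() for _ in range(n)]

def stratified(n, rng):
    # Still random, but one jittered sample per equal-width bin: the
    # sample *positions* are ordered, only the jitter within each bin is random.
    return [(i + rng.random()) / n for i in range(n)]
```

With only 64 stratified samples the estimate is already within about 0.05 of π, while a purely random set of 64 can easily be off by several times that; the randomness survives (so no banding), but the variance drops.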
|
# ? Apr 28, 2023 00:10 |
|
Craptacular! posted:It feels like every generation of computer is hosed over by running four sticks of RAM. I was going to replace my 3700X because of this poo poo. gently caress. I think Zen4 will get better here, because people are really going to want to run 128GB with the 79xx parts, but right now it’s definitely a bad idea. I have 2x32 still in the box because it was out of the return window when I watched the Level 1 Techs video.
|
# ? Apr 28, 2023 00:14 |
|
Craptacular! posted:It feels like every generation of computer is hosed over by running four sticks of RAM. I was going to replace my 3700X because of this poo poo. gently caress.
|
# ? Apr 28, 2023 01:39 |
|
Anime Schoolgirl posted:I never had a problem running four sticks in the late 00s-early 10s. Granted, the platforms I was using them on were DDR2 platforms and non-Z Haswell (where you couldn't go above ddr3-1600) in those respective eras. In general “regular” PCs have topped out at two channels and two ranks per channel. Zen has always been “happiest” with all 4 ranks populated. That may mean 4 DIMMs or it may be 2 DIMMs.
|
# ? Apr 28, 2023 01:46 |
|
Jedi Survivor apparently has awful performance on all platforms. Being an AMD tie-in title doesn't help, since they skipped DLSS and included a bad FSR implementation. But it is once again a game where every card in the 30-series can't play it at the resolutions the segmentation was intended for. So expect to play at 1080p on a 3070 and 1440p on a 3080 Ti because 4K requires 18GB of VRAM or some poo poo.
|
# ? Apr 28, 2023 02:23 |
|
As of tonight the following settings seemed to work:
5900X in "Game Mode," so only one CCD active
Radeon Chill on with Minimum at 238fps, Maximum at 239fps
SAM disabled
GPU Tuning: Minimum Frequency 2800MHz, Maximum 2900MHz, 1150mV
Power Tuning: +15%
Enhanced Sync: Off
Wait for Vertical Refresh: Always On
Adaptive Sync Compatible: Enabled
I'm sure none of this will matter and the issue will reappear tomorrow, but figured I'd update with what mostly worked for this evening. Still frequently had GPU utilization dropping down into the 60% range. Starting to wonder if it's a power limitation, even though I should be well under the necessary power threshold and a giant 3x transient spike isn't shutting the system down or anything.
|
# ? Apr 28, 2023 03:04 |
|
Gave Jedi Survivor a shot since it's on the EA subscription plan, and oof, does it run poorly. I'm hitting framerates in the 50s on a 13600K/4090 at 1440p UW with FSR2 Quality and RT.
|
# ? Apr 28, 2023 06:42 |
|
Subjunctive posted:You want a random sample but you don’t have to walk them in random order, right? I don’t get why you can’t do bounced light the same way, hmm. There are two problems that mean randomness will always be part of ray tracing. The first is that to compute the incoming radiance at any given point, you need information about the entire scene. Each pixel needs to estimate the average colour within some cone of directions, depending on the BSDF of the material and the orientation of the geometry. To calculate that, you need to do the same for every single surface within that cone, and to calculate those, you need all the surfaces in those cones, and so on forever. That's obviously not a tractable problem, so sampling a random assortment of directions to some random depth of recursion is the best that we can do. Eliminate the randomness between pixels, and you end up with correlation artefacts which cause ugly banding when still and nasty flickering in motion. The more fundamental problem is that to answer the question "can these two points in space see each other?", you need to load an unknowable-in-advance amount of geometry to make that determination. You can have two rays that start and end 1mm apart and one will need to check 1kb of geometry and the other will need to check 50mb worth because it took a different path through the BVH. No matter how coherently you're processing samples, you're still gonna end up being hit by bad memory access patterns when determining visibility. There are a few ways to address these. For the visibility problem, going super async and batching huge numbers of rays together or making sure your whole scene will fit in L2 cache are basically the only strategies that work. If you didn't mind false occlusions and light leaking, you could trace against low-resolution proxy geometry.
For sampling the path integrals, you can pick your random numbers in a way that maximizes the distance between points on the hypercube (Quasi-Monte Carlo), you can sample clusters of rays when you find a path with a high contribution (MLT, MEMLT, Path Guiding, too many others to list), you can sample clusters of random numbers when you find a good point on the hypercube (PSSMLT), you can try to share rays between pixels when you find a path with a high contribution (ERPT, ReSTIR), you can use some proxy representation of scene radiance (Voxel Cone Tracing, Light Probes, VPLs), or you can use a surface-based approach where radiance is computed at fixed locations and each pixel is interpolated from its nearest neighbors (Radiosity, Surfels, Photon Mapping, plain old Light Maps). All of these serve to minimize randomness, but come with various tradeoffs in maximum quality, time to image, or memory usage, and you can always come up with some pathological scene geometry that will make any algorithm perform badly. steckles fucked around with this message at 07:45 on Apr 28, 2023 |
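The "random depth of recursion" idea can be sketched with a toy one-number "scene" in place of real geometry (the albedo, emission, and survival probability here are all made up for illustration): Russian roulette randomly terminates each path, yet dividing by the survival probability keeps the estimator unbiased.

```python
import random

def radiance(rng, emitted=1.0, albedo=0.5, rr_prob=0.5):
    """Toy 'scene': every bounce hits a surface that emits `emitted` and
    reflects a fraction `albedo` of whatever arrives from the next bounce.
    The exact series sum is emitted / (1 - albedo) = 2.0. Russian roulette
    gives each path a random recursion depth; dividing the surviving
    contribution by rr_prob keeps the estimate unbiased."""
    if rng.random() >= rr_prob:  # path randomly terminated: no further bounces
        return emitted
    return emitted + albedo * radiance(rng, emitted, albedo, rr_prob) / rr_prob

rng = random.Random(0)
n = 200_000
avg = sum(radiance(rng) for _ in range(n)) / n
# avg converges to 2.0 as n grows; any single path is a noisy sample.
```

Each individual path is cheap and terminates quickly, which is the whole point: you never have to follow the infinite recursion, only pay for it on average.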
# ? Apr 28, 2023 07:43 |
|
https://twitter.com/Sebasti66855537/status/1651828680706998273?t=lL5WAGk-0jnJAqtIzgcjaA&s=19 Lol
|
# ? Apr 28, 2023 10:59 |
|
gradenko_2000 posted:https://twitter.com/Sebasti66855537/status/1651828680706998273?t=lL5WAGk-0jnJAqtIzgcjaA&s=19 the gently caress
|
# ? Apr 28, 2023 11:05 |
|
yep, that's a cpu bottleneck. the game engine is unable to effectively utilize more than a few threads, and your gpu will end up at below 100% utilization most of the time. Not even the 7800X3D can get a consistent 60fps. GPU and resolution are almost irrelevant.
Dr. Video Games 0031 fucked around with this message at 12:04 on Apr 28, 2023 |
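Back-of-the-envelope of why GPU and resolution stop mattering under a CPU bottleneck (the millisecond figures below are purely illustrative): if the serialized CPU side of the loop takes longer per frame than the GPU's rendering work, the GPU idles for the difference.

```python
def gpu_utilization(cpu_ms, gpu_ms):
    """With a serialized CPU-bound loop, frame time is gated by the slower
    side; the GPU is busy gpu_ms out of every max(cpu_ms, gpu_ms) ms."""
    return gpu_ms / max(cpu_ms, gpu_ms)

# A 20 ms CPU-bound frame (50 fps) with 7 ms of GPU work -> 35% utilization.
# A GPU twice as fast (3.5 ms) just halves utilization to 17.5%,
# with zero change in frame rate.
```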
# ? Apr 28, 2023 11:19 |
|
Almost sounds more like a CPU bug than a CPU bottleneck per se.
|
# ? Apr 28, 2023 12:17 |
|
the good news this time is it's broken on everything not just pc, so not merely a terrible port
|
# ? Apr 28, 2023 12:33 |
|
It's funny how people, after years and years of cycles of hype and disappointment, still run and buy games at launch, let alone pre-order digital games. Why would developers bother to get their poo poo right from the get-go?
|
# ? Apr 28, 2023 12:52 |
|
But how else are people supposed to participate in memes and Twitter slapfights with maximum attention on launch day?
|
# ? Apr 28, 2023 12:56 |
|
star wars may run like poo poo but it has the frog in 4K Ultra HD so it's impossible to say if it's bad or not https://twitter.com/lukeisamazing/status/1651264103510421506
|
# ? Apr 28, 2023 13:01 |
|
he's got his work cut out for him https://twitter.com/TechPowerUp/status/1651157886830014469
|
# ? Apr 28, 2023 13:06 |
|
And the corporation shall be huge and black, and the eyes thereof red with the blood of living creatures, and the whore of Crypto shall ride forth on a three-headed serpent, and throughout the lands, there'll be a great scalping of parts.
|
# ? Apr 28, 2023 13:10 |
|
steckles posted:There are two problems that mean randomness will always be part of ray tracing. Thanks, this was great.
|
# ? Apr 28, 2023 14:01 |
|
another apology tweet for the collection https://twitter.com/EAStarWars/status/1651990800862183426
|
# ? Apr 28, 2023 19:12 |
|
tired: day one patch
inspired: day one apology
|
# ? Apr 28, 2023 19:16 |
|
Their framing of the performance issues is kind of weird though. It's not just "high-end machines" that are underperforming. If high-end hardware struggles to hit 60, then you better believe that midrange and low-end hardware are even worse off.
|
# ? Apr 28, 2023 19:18 |
|
i don't know what audience their specification of the problem is for either. maybe it's just what happens when an apology from developers has to go through several corporate layers like legal+marketing+social
|
# ? Apr 28, 2023 19:20 |
|
Dr. Video Games 0031 posted:Their framing of the performance issues is kind of weird though. It's not just "high-end machines" that are underperforming. If high-end hardware struggles to hit 60, then you better believe that midrange and low-end hardware are even worse off. to me it reads like "well it's your fault for running windows 10 and it's your fault for running a 4090 with a not amazing cpu" which is just stupid bullshit writ large
|
# ? Apr 28, 2023 19:22 |
|
Dr. Video Games 0031 posted:Their framing of the performance issues is kind of weird though. It's not just "high-end machines" that are underperforming. If high-end hardware struggles to hit 60, then you better believe that midrange and low-end hardware are even worse off. Low end systems were supposed to suck, clearly, so expectations met.
|
# ? Apr 28, 2023 19:24 |
|
gradenko_2000 posted:https://twitter.com/Sebasti66855537/status/1651828680706998273?t=lL5WAGk-0jnJAqtIzgcjaA&s=19 HUB warned us that these ports meant that the 4090 was basically a 480p GPU now, but you all thought you knew better
|
# ? Apr 28, 2023 20:21 |
|
Dr. Video Games 0031 posted:Their framing of the performance issues is kind of weird though. It's not just "high-end machines" that are underperforming. If high-end hardware struggles to hit 60, then you better believe that midrange and low-end hardware are even worse off. I think a big part of it was mid-range systems were ok (a 3060 Ti runs at like 45 fps on Ultra 1440p and over 60 on 1080p Ultra), but lowering settings does very little and higher-end GPUs appear to not be used at all. It's even playable on a 3050! So it's more complex than "Runs poorly".
|
# ? Apr 28, 2023 20:36 |
|
Lockback posted:I think a big part of it was mid-range systems were ok (a 3060ti runs at like 45 fps on Ultra 1440p and over 60 on 1080p Ultra), but setting lower settings does very little and higher end GPUs appear to not be used at all. It's even playable on a 3050! So it's more complex than "Runs poorly". It’s absolutely a bug. If we’re throwing out wild guesses, something is locking up the game loop and a large amount of the CPU time is being spent spinning instead of telling the GPU what to do. GPU utilization is a bad thing to measure, but 36% probably means something in the game logic is seriously broken. I wonder if the 4090 is even clocking up.
|
# ? Apr 28, 2023 21:03 |
|
Dr. Video Games 0031 posted:Their framing of the performance issues is kind of weird though. It's not just "high-end machines" that are underperforming. If high-end hardware struggles to hit 60, then you better believe that midrange and low-end hardware are even worse off. Rumormill is that actually it runs pretty well in its recommended hardware of a 11600K and an RX6700XT, which could imply that it has thread scheduling problems on both multi-CCD AMD chips and the heterogeneous core Intel chips. Hell, I want to see a review from someone on a $99 Core i3-13100F. I bet it runs great on those.
|
# ? Apr 28, 2023 21:06 |
|
repiv posted:star wars may run like poo poo but it has the frog in 4K Ultra HD so it's impossible to say if it's bad or not It also has cantina band music from two real-world bands that I quite like (and two that I don't know, but who seem pretty good), performing under SW themed aliases. https://open.spotify.com/album/1ldPSreHFSpICI8VVNK5rL
|
# ? Apr 28, 2023 21:19 |
|
|
hobbesmaster posted:something is locking up the game loop and some large amount of the cpu time is spent spinning instead of telling the GPU what to do. Frog croaking
|
# ? Apr 28, 2023 21:29 |