repiv
Aug 13, 2009

igor's analysis seems to be more misses than hits lately

capacitorgate was also him, and he blamed the EVGA 3090 issues on their custom fan controller which ended up being wrong

kliras
Mar 27, 2021
... and then jay usually aggregates the igor piece and dials up the clickbait a little extra on youtube

Termyie
Aug 18, 2022

Always choose violence.

Igor is sounding more like Reddit armchair analysis than anything. Why not let us get the video cards that could replace a water heater and see what the results are? I wonder if it will be as loud as the 5000 series back in the day with the leaf blower for a fan.

Shipon
Nov 7, 2005
i support nvidia's battle against the SFF meme, for one

Bondematt
Jan 26, 2007

Not too stupid
...do they make that PC/104-sized board with a PCIe slot?

I have an idea for a terrible build.

hobbesmaster
Jan 28, 2008

Bondematt posted:

...do they make that PC/104-sized board with a PCIe slot?

I have an idea for a terrible build.

PCIe x16 slots won’t fit in the form factor. I’m poking around industrial embedded sites and I think the smallest place you can find an x16 slot is ITX.

Bondematt
Jan 26, 2007

Not too stupid

hobbesmaster posted:

PCIe x16 slots won’t fit in the form factor. I’m poking around industrial embedded sites and I think the smallest place you can find an x16 slot is ITX.

Yeah, the best I found was that you might be able to do a breakout board to PCIe x16 on certain PC/104 boards, but no mention of what gen it even is.

Just wanted a stupid tiny case mounted to a 4090.

Cygni
Nov 12, 2005

raring to post

Termyie posted:

Igor is sounding more like Reddit armchair analysis than anything. Why not let us get the video cards that could replace a water heater and see what the results are? I wonder if it will be as loud as the 5000 series back in the day with the leaf blower for a fan.

The whole obsessive internet culture around GPUs these days is pretty tiring honestly. Between the losers still brand warrioring publicly traded billion dollar tech companies in 2022 (lol), the leak industrial complex both getting used as marketing directly and also just randomly making stupid poo poo up, the youtubers/bloggers realizing that being negative about every single product gets more clicks, the tiresome repeated reddit memes, etc.


These are supposed to be fun toys that you can choose to buy or not, man, like, it's ok!

repiv
Aug 13, 2009

Bondematt posted:

Yeah, the best I found was that you might be able to do a breakout board to PCIe x16 on certain PC/104 boards, but no mention of what gen it even is.

Just wanted a stupid tiny case mounted to a 4090.

comedy option: intel nuc compute element paired with a 4090

Bondematt
Jan 26, 2007

Not too stupid

repiv posted:

comedy option: intel nuc compute element paired with a 4090

...oh man I didn't even think of those.

Might need to do some digging as they might even make a competent gaming PC rather than just a joke.

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC
AMD 22.10.1 Release Notes

quote:

Known Issues

Chromium-based browsers may experience video stuttering on extended monitors with mixed refresh rates using Radeon™ RX 6000 series GPUs.
Users may encounter dropped frames during video playback using hardware acceleration on Radeon™ RX 6000 series GPUs. [Resolution targeted for 22.10.2]
Display may briefly show corruption when switching between video and game windows on some AMD Graphics Products such as the Radeon™ RX 6700 XT. [Resolution targeted for 22.10.2]
When Vertical Refresh Sync is set globally to Always Off, system stuttering or driver timeout may occur during video playback using Radeon™ RX 6000 series GPUs. [Resolution targeted for 22.10.2]
Driver timeout may occur using HEVC hardware encoding on Radeon™ RX 6000 series GPUs. [Resolution targeted for 22.10.2]

Ohhhhhh. So AMD is why YouTube (on Firefox) was freezing and only playing audio on my second monitor above 480p while I was playing FFXIV on my main.

And why the Photos app crashes when playing gifs or mp4s, and why the Windows Movies & TV app freezes and crashes my desktop.

AMD hasn't had Recommended WHQL drivers for five months, and these have been issues for a good while. I really hope 22.10.2 comes soon and fixes them.

edit: I managed to mitigate these issues by replacing the Windows apps with third-party ones like VLC Media Player and IrfanView some time ago.

OhFunny fucked around with this message at 20:10 on Oct 6, 2022

kliras
Mar 27, 2021
yeah there's been a lot of frustration on the /r/amd subreddit over the regressions in the last update. also sounds like it's the first driver regression in a while

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
GPU acceleration has been semi-broken on both brands forever; there are active browser bugs on AMD that I remember from when I got my 7850 back in 2012.

The least aggravating thing is to just turn it off, at least on desktop where power doesn’t matter.

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC

kliras posted:

yeah there's been a lot of frustration on the /r/amd subreddit over the regressions in the last update. also sounds like it's the first driver regression in a while

Video playback in browsers has been an issue since June at least. I remember posting about it here, since I thought it was Firefox acting up, but then found it happening on Edge and Chrome too. I did manage to fix that after flipping enough switches in AMD's control settings.

The problem with playing videos on the desktop is at least a month old. I noticed it in early September when I moved all the screenshots and mp4s on my Switch over to my PC to free up space and discovered all the mp4s would freeze or crash my system. I thought it was something Nintendo did until I realized every video on my PC was doing the same. I figured it was a Windows issue since swapping their apps out for third-party ones fixed it.

Which is why I didn't post about it here.

edit:

Paul MaudDib posted:

GPU acceleration has been semi-broken on both brands forever; there are active browser bugs on AMD that I remember from when I got my 7850 back in 2012.

The least aggravating thing is to just turn it off, at least on desktop where power doesn’t matter.

This is something I tried, but switching off hardware acceleration didn't fix either the browser issues or the desktop video issues.

OhFunny fucked around with this message at 20:38 on Oct 6, 2022

hobbesmaster
Jan 28, 2008

Bondematt posted:

...oh man I didn't even think of those.

Might need to do some digging as they might even make a competent gaming PC rather than just a joke.

Those won’t work: PCIe x4 only and no physical slot.

I think the smallest and silliest you can do is the mITX type boards intended for “gaming” applications like slot machines, video poker and so forth.

For example:
A general industrial motherboard
A gaming targeted one

unpronounceable
Apr 4, 2010

You mean we still have another game to go through?!
Fallen Rib

hobbesmaster posted:

Those won’t work: PCIe x4 only and no physical slot.

Some of them say they have Thunderbolt ports, so the full silly option would be connecting it to an external GPU, assuming it works like I think it should.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
How were Amazon for getting 3000 series cards? I would really prefer to use Amazon due to the same day delivery and returns policy

I wanna order that poo poo in the morning and get it by noon :chaostrump:

8-bit Miniboss
May 24, 2005

CORPO COPS CAME FOR MY :filez:

Taima posted:

How were Amazon for getting 3000 series cards? I would really prefer to use Amazon due to the same day delivery and returns policy

I wanna order that poo poo in the morning and get it by noon :chaostrump:

Botted to gently caress like every other retailer during the chip shortage.

Enos Cabell
Nov 3, 2004


Best Buy is the way to go IMO. Same day delivery if you have one in town, or you can drive out and have them put it in your car for you without entering the store.

Dr. Video Games 0031
Jul 17, 2004

Paul MaudDib posted:

RDNA2 just has bad RT perf in general. I think it’s just a lower priority for AMD, at least as of last gen, especially with consoles being in the driver’s seat, but AMD could have designed it to be more scalable if they wanted. Like they could have had consoles use a double-pumped or quad-pumped version that executes more slowly, or designed faster units but only put them in half the TMUs on the consoles. AMD doesn’t seem to prioritize it right now, kinda like tessellation back in the GCN1 days.

(RT is around 4% of the NVIDIA die as of Turing, and rdna2 is supposedly around 2%, and AMD’s synthetic/path-traced RT performance is around 50% of Turing relative to raster, so the numbers kinda line up there.)

But anyway, RT is kinda hard on cache I think; there’s not really a way to predict what material/texture you’re going to hit after a bounce, and random access is tough on caches. It’s the same reason why ethereum mining was uncacheable…

RDNA2's RT performance is trash, but I wasn't talking about that. AMD performs worse than usual at rasterization in some games like spider-man, and in SM's case specifically I suspect it's due to the lower memory bandwidth. Though I did gently caress up by saying "40-series" in that post instead of "30-series."

Zedsdeadbaby posted:

That makes sense, the 600w rumor hung around like a bad smell for like a year. It was only recently they apparently went back on it

The biggest cards probably actually ARE still 600W cards. Supposedly they're dual-BIOS cards that have a stock BIOS and an OC BIOS with a ridiculous power limit. There's a reason why Asus recommends a 1000W PSU for the ROG Strix while they only recommend 850W for the TUF Gaming.

Here's a 4090 pulling 600W: https://videocardz.com/newz/nvidia-geforce-rtx-4090-spotted-reaching-3-0-ghz-and-616-watts-with-gpu-stress-tool

The clock speed shown is lower at 600W. It would be funny if that power limit caused the card to thermally throttle and perform worse. (edit: the 600W was pulled in FurMark, which explains the clock speed differential)

Dr. Video Games 0031 fucked around with this message at 00:32 on Oct 7, 2022

MarcusSA
Sep 23, 2007

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

repiv posted:

the double-edged sword of nvidia pushing traversal into dedicated hardware is they can't add any new functionality there without new hardware

They could do a software traversal like AMD, if the new capability was worth it, no?

repiv
Aug 13, 2009

Subjunctive posted:

They could do a software traversal like AMD, if the new capability was worth it, no?

they could, but discarding much of their general purpose traversal performance to gain some relatively specialized optimizations almost certainly isn't worth it

AMD have nothing to lose by tacking more features onto their traversal loop however
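
For anyone wondering what the "traversal loop" actually is, here's a rough sketch (plain Python, with made-up data structures) of the stack-based BVH walk that RDNA2 runs as shader code, while RTX cards do the equivalent inside a fixed-function unit:

code:

from dataclasses import dataclass

@dataclass
class Node:
    bbox_min: tuple            # axis-aligned bounding box, min corner (x, y, z)
    bbox_max: tuple            # max corner
    left: int = -1             # child node indices, -1 = no child
    right: int = -1
    triangles: tuple = ()      # non-empty only for leaf nodes

def slab_test(ray_o, ray_inv_d, bmin, bmax):
    """Ray vs axis-aligned box intersection (slab method)."""
    tmin, tmax = 0.0, float("inf")
    for axis in range(3):
        t1 = (bmin[axis] - ray_o[axis]) * ray_inv_d[axis]
        t2 = (bmax[axis] - ray_o[axis]) * ray_inv_d[axis]
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def traverse(nodes, ray_o, ray_inv_d, hit_triangle):
    """Walk the BVH for one ray, calling hit_triangle(tri) on every candidate."""
    stack = [0]                                 # start at the root node
    while stack:                                # <- the "traversal loop" itself
        node = nodes[stack.pop()]
        if not slab_test(ray_o, ray_inv_d, node.bbox_min, node.bbox_max):
            continue                            # ray misses this whole subtree
        if node.triangles:                      # leaf: hand triangles to the intersector
            for tri in node.triangles:
                hit_triangle(tri)               # custom logic (LOD, culling, ...) could go here
        else:                                   # inner node: descend into children
            if node.left >= 0:
                stack.append(node.left)
            if node.right >= 0:
                stack.append(node.right)

Because every step of that loop is ordinary code on AMD, extra logic can be spliced into the descent order or the leaf step; on nvidia the equivalent loop lives inside the RT core, which is why new traversal features need new hardware.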

repiv
Aug 13, 2009

makes me wonder what shenanigans they'll get up to in the console space as they're stuck with RDNA2 for the next 6-8 years; being able to customize the BVH walk arbitrarily could lead to some fun optimizations that will never see the light of day on PC

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

repiv posted:

makes me wonder what shenanigans they'll get up to in the console space as they're stuck with RDNA2 for the next 6-8 years; being able to customize the BVH walk arbitrarily could lead to some fun optimizations that will never see the light of day on PC

I hope to be proven wrong, but I actually think that as the console generation marches on and games stop targeting PS4 / Xbox One, we're going to see 30 fps make a comeback. Time will tell.

repiv
Aug 13, 2009

we've yet to see UE5's Lumen running at more than 30fps on the consoles :v:

at least metro exodus enhanced serves as proof that the consoles can handle advanced raytracing effects at 60fps and decent resolution (well, maybe not on the Series S) if developers put their minds to it

repiv fucked around with this message at 01:44 on Oct 7, 2022

wargames
Mar 16, 2008

official yospos cat censor

repiv posted:

we've yet to see UE5's Lumen running at more than 30fps on the consoles :v:

at least metro exodus enhanced serves as proof that the consoles can handle advanced raytracing effects at 60fps and decent resolution (well, maybe not on the Series S) if developers put their minds to it

RDNA2 doesn't have the most robust raytracing but next gen consoles with RDNA4 probably will.

v1ld
Apr 16, 2012

repiv posted:

at least metro exodus enhanced serves as proof that the consoles can handle advanced raytracing effects at 60fps and decent resolution (well, maybe not on the Series S) if wizards put their minds to it

FTFY

Those folks have always been able to wring performant lighting out of potatoes, going back to their STALKER roots.

Agree with you, joking aside: there's good motivation to figure out how to get at least smooth 30 fps out of the RDNA2 hardware for a while yet.

Good observation on how that will motivate software side optimizations for RT. Do you think this was a deliberate design choice from AMD in their existing RT implementation as opposed to not having the time to implement it until RDNA3?

I don't even know what's missing in RDNA2 for RT in hardware that may be added in RDNA3 to achieve perf parity with Nvidia's stuff, to be honest. But I hope everyone embraces it somewhere during this console gen.

shrike82
Jun 11, 2005

i'm also curious about the rumored switch 2 nvidia soc and how it'll fare against games targeting PS5/XSX

hobbesmaster
Jan 28, 2008

shrike82 posted:

i'm also curious about the rumored switch 2 nvidia soc and how it'll fare against games targeting PS5/XSX

It's probably going to look almost exactly like one of the "Jetson Orin Nano series" columns.
https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/jetson-orin/#orion-prod-module-dev-kit-specs

What's interesting is the GPU config: "1024-core NVIDIA Ampere architecture GPU with 32 Tensor Cores" at 625MHz.
The current Switch is 256:16:16 (shaders aka cores, TMUs, ROPs) at 1 GHz.

Despite the lower clock it has double the memory bandwidth and paper FP32 performance, which is all we can use as a proxy for graphics from a datasheet. An Ampere GPU would come with DLSS, which will probably be the biggest thing, but it's still a 5-15W APU.

hobbesmaster fucked around with this message at 02:35 on Oct 7, 2022
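
A quick back-of-the-envelope check on the paper FP32 numbers quoted above (a rough sketch: the core counts and clocks are just the figures from the post, and the 2 FLOPs per core per clock factor is the standard FMA assumption, so treat the output as ballpark only):

code:

def paper_fp32_gflops(cores: int, clock_mhz: float) -> float:
    """Paper FP32 throughput: cores x 2 FLOPs per clock (FMA) x clock."""
    return cores * 2 * clock_mhz / 1000.0

# Figures quoted in the post above, not verified specs.
switch_class = paper_fp32_gflops(cores=256, clock_mhz=1000)   # ~512 GFLOPS
orin_nano    = paper_fp32_gflops(cores=1024, clock_mhz=625)   # ~1280 GFLOPS

print(f"Switch-class GPU:    {switch_class:.0f} GFLOPS")
print(f"Orin Nano-class GPU: {orin_nano:.0f} GFLOPS ({orin_nano / switch_class:.1f}x)")

With those inputs the sketch lands around 0.5 vs 1.3 TFLOPS, i.e. roughly the doubling-and-change described in the post, before DLSS enters the picture.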

Vintersorg
Mar 3, 2004

President of
the Brendan Fraser
Fan Club



Dr. Video Games 0031 posted:

Mining demand was nearly unlimited at the time, so I don't think it would've taken any pressure off of Nvidia and AMD cards. Increase the GPU supply by 2x, and miners would've bought 2x as many cards, I'm positive.

https://twitter.com/aschilling/status/1577693207332360193

This is just embarrassingly big. What a joke this gen is. It’s pathetic.

Craptacular!
Jul 9, 2001

Fuck the DH

doesn't come with a sag bracket, naturally.


Vintersorg posted:

What a joke this gen is. It’s pathetic.

repiv
Aug 13, 2009

v1ld posted:

Good observation on how that will motivate software side optimizations for RT. Do you think this was a deliberate design choice from AMD in their existing RT implementation as opposed to not having the time to implement it until RDNA3?

I don't even know what's missing in RDNA2 for RT in hardware that may be added in RDNA3 to achieve perf parity with Nvidia's stuff, to be honest. But I hope everyone embraces it somewhere during this console gen.

i think the implementation in RDNA2 was just a minimum viable product; it's pretty clear which way the wind is blowing, with both nvidia and intel offloading more and more RT functions to dedicated hardware

nvidia spent a decade refining software raytracing in OptiX before RTX arrived; they certainly gave the idea a fair shake before throwing up their hands and deciding that it warranted dedicated silicon

repiv fucked around with this message at 02:38 on Oct 7, 2022

shrike82
Jun 11, 2005

hobbesmaster posted:

It's probably going to look almost exactly like one of the "Jetson Orin Nano series" columns.
https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/jetson-orin/#orion-prod-module-dev-kit-specs

What's interesting is the GPU config: "1024-core NVIDIA Ampere architecture GPU with 32 Tensor Cores" at 625MHz.
The current Switch is 256:16:16 (shaders aka cores, TMUs, ROPs) at 1 GHz.

Despite the lower clock it has double the memory bandwidth and paper FP32 performance, which is all we can use as a proxy for graphics from a datasheet. An Ampere GPU would come with DLSS, which will probably be the biggest thing, but it's still a 5-15W APU.

you have to wonder if the existing pretrained DLSS networks will work well with the upscaling input and output resolutions that a Switch would need. not sure if Nvidia would do that work for free or if Nintendo would pay for it either

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.
amd are just technologically behind the curve on raytracing and are playing catch-up, i'm not sure there's that much more to it. they didn't dedicate as much die space to dedicated rt hardware like nvidia did, presumably because they were more focused on catching up with rasterisation. if rdna3 is competitive with the 30-series on raytracing, that'll probably be good enough for now, with the 40-series being what it is.

the leaked switch 2 graphics api included support for raytracing & dlss but i wouldn't really expect much in the way of rt performance at all. dlss is pretty decent at even fairly low native resolutions these days, but idk exactly where the performance sweet-spot will be for the switch 2. would hazard a guess that they'll want to aim for 720p native so they can do 4k alright with dlss ultra performance, and 1080p extremely well? maybe 540p to 720p for handheld depending on screen resolution?
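
For reference, the internal render resolutions behind those guesses; a rough sketch using the commonly published per-axis DLSS scale factors (Quality 1.5x, Balanced ~1.7x, Performance 2x, Ultra Performance 3x), which individual games are free to override:

code:

DLSS_SCALE = {"quality": 1.5, "balanced": 1.7, "performance": 2.0, "ultra_performance": 3.0}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Approximate DLSS internal render resolution for a given output and mode."""
    s = DLSS_SCALE[mode]
    return round(out_w / s), round(out_h / s)

for out_w, out_h in [(3840, 2160), (1920, 1080)]:
    for mode in DLSS_SCALE:
        w, h = internal_resolution(out_w, out_h, mode)
        print(f"{out_w}x{out_h} via {mode:>17}: renders at ~{w}x{h}")

With those factors, 4K output via Ultra Performance works out to a 1280x720 internal resolution, which is where the 720p-native guess comes from, and 1080p via Quality lands on roughly 1280x720 as well.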

repiv
Aug 13, 2009

shrike82 posted:

you have to wonder if the existing pretrained DLSS networks will work well with the upscaling input and output resolutions that a Switch would need. not sure if Nvidia would do that work for free or if Nintendo would pay for it either

DLSS is definitely on the cards for the next switch; in the big nvidia leak there were direct references to running it under the NVN2 API (NVN being the custom graphics API on the switch), and also references to needing an ampere card to optimally simulate NVN2 in a PC development environment

as for how well it will work at the modest resolutions the switch 2 will probably target, who knows

shrike82
Jun 11, 2005

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack
Well it’s definitely been true for a couple of generations now that you need to double check your case size for GPU clearance before you buy something. Modern cases have gotten better for GPU clearance by necessity, but compact cases have always been touch and go for the top end cards.

E: I should state I’m not defending the humongous size of the 4090s, just that the pics of “lol it doesn’t fit because it’s big” have been around for years, it’s not a new phenomenon.

kliras
Mar 27, 2021
even some atx motherboards won't fit in the most popular cases out there. somehow the noctua d15 is the last component you have to worry about for clearance

Dr. Video Games 0031
Jul 17, 2004

AirRaid posted:

Well it’s definitely been true for a couple of generations now that you need to double check your case size for GPU clearance before you buy something. Modern cases have gotten better for GPU clearance by necessity, but compact cases have always been touch and go for the top end cards.

E: I should state I’m not defending the humongous size of the 4090s, just that the pics of “lol it doesn’t fit because it’s big” have been around for years, it’s not a new phenomenon.

Maybe, but we're able to take those pics with bigger cases each new generation, and it never stops being funny.
