Subjunctive
Sep 12, 2006

✨sparkle and shine✨

This looks very cool, though I haven’t really dug into it:

https://github.com/adam-maj/tiny-gpu

quote:


If you want to learn how a CPU works all the way from architecture to control signals, there are many resources online to help you.

GPUs are not the same.

Because the GPU market is so competitive, low-level technical details for all modern architectures remain proprietary.

While there are lots of resources to learn about GPU programming, there's almost nothing available to learn about how GPUs work at a hardware level.

The best option is to go through open-source GPU implementations like Miaow and VeriGPU and try to figure out what's going on. This is challenging since these projects aim at being feature complete and functional, so they're quite complex.

This is why I built tiny-gpu!


gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
https://twitter.com/HardwareUnboxed/status/1783689694762668286

slidebite
Nov 6, 2005

Good egg
:colbert:

God dammit, my local best buy shows stock on 4090FE, now I have to decide if I want to drop the coin :argh:

Hughmoris
Apr 21, 2007
Let's go to the abyss!

slidebite posted:

God dammit, my local best buy shows stock on 4090FE, now I have to decide if I want to drop the coin :argh:

Do it for us Goons still rocking a 970.

:mad:

Anime Schoolgirl
Nov 28, 2002

some at videocardz have speculated this may be related to Ryzen Z1 allocation and how Lenovo and Asus have gobbled up practically all the available supply for that (the other systems being sold by smaller vendors are using regular 78/940Us that are only slightly less efficient)

Cygni
Nov 12, 2005

raring to post

They were never very serious about their Radeon line or very prolific, tbh. Sounds like they pulled the plug on it. They are also pretty heavy on Intel/Nvidia in their other product lines too (no AMD desktops, Intel with their handheld etc), but they do have have a token few new AMD laptops.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

slidebite posted:

God dammit, my local best buy shows stock on 4090FE, now I have to decide if I want to drop the coin :argh:

I would heavily caution anyone against buying a 4090. The part is approaching 2 years old, and Nvidia has a serious fetish for gating proprietary technology behind their newest model lines.

As much as Nvidia doesn't have competition in the top end, it's worth bearing in mind that their gaming and AI solutions share common architecture. If Nvidia wants to toe up against AMD AI alternatives, it's going to juice those products to ensure dominance. And that juice will by default trickle down to the GPU lineup, especially the top end.

In the worst case scenario you could be looking at 4-6 months before your 2k purchase is utterly annihilated by something new and probably not much more expensive.

Anyways I was curious, Jedi Survivor came out on Game Pass, and I've been playing it for 30 minutes or so. Digital Foundry has a video from a few months ago that claims the port is absolute poo poo and totally unplayable. I trust them, but my experience has been... fine? Like sure, it's no Forbidden West PC port, but did they fix it recently or something?

The reason I ask is because I don't want to get too far into this game just to discover, later, that aspects of the port ruin the experience...

Taima fucked around with this message at 23:23 on Apr 26, 2024

Listerine
Jan 5, 2005

Exquisite Corpse

Taima posted:

Could the people buying 4090s elaborate on why? It just feels like the worst time.

I've only been considering it since I just got $1000 in a tax return, and they're now apparently obtainable without standing in line or other hijinx. I'm too old/busy to stalk cards, and my inner argument is: do I get a 4090 now or risk waiting for a 5000 series card, which I may or may not be able to get at launch? I guess it's a case of "bird in the hand" etc.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
What are you gaming on right now and what is your target resolution and fps?

Cross-Section
Mar 18, 2009

Taima posted:

The reason I ask is because I don't want to get too far into this game just to discover, later, that aspects of the port ruin the experience...

Most of the performance criticism stems from the location you visit after the prologue, so yeah it's kind of this, but also mark me down as someone who played through the whole game on PC with the DLSS3 mod and had a great time. :shrug:

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Oh amazing thanks! For the record it has DLSS3 and frame gen officially now btw!

Listerine
Jan 5, 2005

Exquisite Corpse

Taima posted:

What are you gaming on right now and what is your target resolution and fps?

I'm not gaming, I use the CUDA cores for rendering. The 4090 has roughly twice the CUDA count of the two 3000 series cards I'm using now, I could get work done in half the time basically if I completely replaced these two cards with one.
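Listerine's half-the-time estimate is just a proportionality. As a rough sketch, assuming render throughput scales linearly with CUDA core count (a simplification that ignores clocks, memory bandwidth, and architecture differences; the core counts below are placeholder numbers, not the actual cards):

```python
def render_speedup(new_cores: int, old_cores: int) -> float:
    # Assumes render throughput scales linearly with CUDA core count,
    # so the relative speedup is just the ratio of core counts.
    return new_cores / old_cores

# Placeholder counts: one new card vs. the combined total of two older cards
speedup = render_speedup(16000, 2 * 4000)
print(speedup)  # 2.0 -> renders finish in roughly half the time
```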

YerDa Zabam
Aug 13, 2016



Taima posted:

I would heavily caution anyone against buying a 4090. The part is approaching 2 years old, and Nvidia has a serious fetish for gating proprietary technology behind their newest model lines.

As much as Nvidia doesn't have competition in the top end, it's worth bearing in mind that their gaming and AI solutions share common architecture. If Nvidia wants to toe up against AMD AI alternatives, it's going to juice those products to ensure dominance. And that juice will by default trickle down to the GPU lineup, especially the top end.

In the worst case scenario you could be looking at 4-6 months before your 2k purchase is utterly annihilated by something new and probably not much more expensive.

Anyways I was curious, Jedi Survivor came out on Game Pass, and I've been playing it for 30 minutes or so. Digital Foundry has a video from a few months ago that claims the port is absolute poo poo and totally unplayable. I trust them, but my experience has been... fine? Like sure, it's no Forbidden West PC port, but did they fix it recently or something?

The reason I ask is because I don't want to get too far into this game just to discover, later, that aspects of the port ruin the experience...

-ninja- I think your original edit asked why people had bought their 4090s. This was written with that in mind rather than a justification or something. -ninja-
I bought mine (a month or so ago) because I could afford it, and I wanted to play everything maxed out now. Nobody knows when the next thing will be out, how much it will cost, or how available it will be. I have zero interest in strategizing poo poo like that any more.
If I decide I want the 5090 or whatever then I'll sell my 4090 at a loss of, idk £500-£700 so maybe £70-£100 per month cost from the use I got, and I'll be happy with that. All bearing in mind we're talking multi-thousand £/$ toys.

But, I'll probably be happy with it for a few years. I may end up eating my words, but I'm not sure what the 5090 (or whatever) would give me? Make the minimum lows higher maybe? Everything is maxed to gently caress and I don't want for anything more really.
Sure, new games will come out that are more demanding, but that's all hearsay. At the moment I'm loving loving it, and if something comes out in a few months that is actually 70% better (lol) then it won't take anything away from what I have atm, and even then.
I used to try and plan around stuff like this but over the years I realised that I'd spent months loving about with igpus and stressing about releases for the sake of a few hundred quid.


I've also just started playing Jedi Survivor and I'm really enjoying it. Maybe 3 hours into it. It does seem to have some occasional weirdness for no discernible reason. It isn't perfect, but I don't think it's nearly as bad as I was fearing. I remember the DF video, and it is definitely nothing as bad as that showed. I think the initial issues were dealt with, though some people still claim that it runs badly at points. I've got it all maxed out and it runs 4K between 60-120 with the occasional little bit of frame pacing or something, but nothing that affects gameplay.
Oh I see others have mentioned DLSS3 and frame gen, that probably makes all the difference.

I'll be kicking myself when the 5090 is a requirement for DLSS4 lmao ;-)

Listerine posted:

I'm not gaming, I use the CUDA cores for rendering. The 4090 has roughly twice the CUDA count of the two 3000 series cards I'm using now, I could get work done in half the time basically if I completely replaced these two cards with one.

This is the kind of reply that makes me feel like a vapid bourgeoisie

YerDa Zabam fucked around with this message at 23:54 on Apr 26, 2024

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Taima posted:

Anyways I was curious, Jedi Survivor came out on Game Pass, and I've been playing it for 30 minutes or so. Digital Foundry has a video from a few months ago that claims the port is absolute poo poo and totally unplayable. I trust them, but my experience has been... fine? Like sure, it's no Forbidden West PC port, but did they fix it recently or something?

I mean if it runs way way worse than the original despite them both being Unreal Engine 4 then I can see why it gets so much poo poo, like how do you gently caress that up, it's the same engine and probably a lot of the same assets.

Did it get frame interpolation yet? I think that's what I was waiting on.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Listerine posted:

I'm not gaming, I use the CUDA cores for rendering. The 4090 has roughly twice the CUDA count of the two 3000 series cards I'm using now, I could get work done in half the time basically if I completely replaced these two cards with one.

If your professional use case is such that you can absorb the loss/investment of buying a 2 year old part today, then so be it.

YerDa Zabam posted:

-ninja- I think your original edit asked why people had bought their 4090s.

Sorry about that edit, guilty as charged!

Honestly "I can afford it and don't really care about the value prop" is a great reason to go forward now. My words of caution only apply for those who have a budget.

YerDa Zabam posted:

I've also just started playing Jedi Survivor and I'm really enjoying it. Maybe 3 hours into it. It does seem to have some occasional weirdness for no discernible reason. It isn't perfect, but I don't think it's nearly as bad as I was fearing. I remember the DF video, and it is definitely nothing as bad as that showed. I think the initial issues were dealt with, though some people still claim that it runs badly at points, but I'm just going to keep at it since it is a pretty cool game so far and if that stops there's plenty more.

That's so great to hear. I'm having a blast too!

Zero VGS posted:

I mean if it runs way way worse than the original despite them both being Unreal Engine 4 then I can see why it gets so much poo poo, like how do you gently caress that up, it's the same engine and probably a lot of the same assets.

Did it get frame interpolation yet? I think that's what I was waiting on.

For sure- we've also just... you know... had some amazing ports recently. So the bar is high, and I totally get the criticism because frame issues in general are such a killjoy. If by frame interpolation you are referring to DLSS3/FG specifically I am happy to report that I tested it for you just now and it looks like it's working fine!

Taima fucked around with this message at 00:02 on Apr 27, 2024

Inept
Jul 8, 2003

Cross-Section posted:

Most of the performance criticism stems from the location you visit after the prologue, so yeah it's kind of this, but also mark me down as someone who played through the whole game on PC with the DLSS3 mod and had a great time. :shrug:

Why are you using spoilers like that when you might actually be talking about the game you beat

Cross-Section
Mar 18, 2009

Inept posted:

Why are you using spoilers like that when you might actually be talking about the game you beat

I was trying to be cute about using a DLSS3 mod while forgetting that it's long since been patched into the game lol

YerDa Zabam
Aug 13, 2016



Taima posted:


Honestly "I can afford it and don't really care about the value prop" is a great reason to go forward now. My words of caution only apply for those who have a budget.


Budget, value and 4090. Those words do not belong together. (except when they sort of do, though, heh)

Zero VGS posted:

I mean if it runs way way worse than the original despite them both being Unreal Engine 4 then I can see why it gets so much poo poo, like how do you gently caress that up, it's the same engine and probably a lot of the same assets.

Did it get frame interpolation yet? I think that's what I was waiting on.

To my eyes, and in a very unscientific way, it seems to have a lot more going on than the first game. Looks fancier to a degree that I could imagine it being more demanding.
Not that I think that excuses the mess they delivered, initially at least, gently caress no, that seems to be something else.

shrike82
Jun 11, 2005

Listerine posted:

I'm not gaming, I use the CUDA cores for rendering. The 4090 has roughly twice the CUDA count of the two 3000 series cards I'm using now, I could get work done in half the time basically if I completely replaced these two cards with one.

the 4090 runs much faster and cooler for compute - downside is no nvlink but i don't think that's a factor against your 3080s

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

If I just play 1440p is there any card that would be a big upgrade over my 3080?

Cyberpunk drops frame rates hard on me in the PL expansion at some sections. Not enough to be unplayable but enough to annoy me.

YerDa Zabam
Aug 13, 2016



Kraftwerk posted:

If I just play 1440p is there any card that would be a big upgrade over my 3080?

Cyberpunk drops frame rates hard on me in the PL expansion at some sections. Not enough to be unplayable but enough to annoy me.

How much are you looking to spend? Also, what refresh rate is your monitor?
4080 super would be amazing of course, pricey though, but even a 4070Ti Super would be a big upgrade. Still not cheap tbh.
I've not got the Cyberpunk expansion yet, but supposedly there are a couple of areas that are just bastards to run.

I really need to actually play Cyberpunk instead of just wandering about marveling at how cool everything looks

YerDa Zabam fucked around with this message at 00:28 on Apr 27, 2024

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

YerDa Zabam posted:

How much are you looking to spend? Also, what refresh rate is your monitor?
4080 super would be amazing of course, pricey though, but even a 4070Ti Super would be a big upgrade. Still not cheap tbh.
I've not got the Cyberpunk expansion yet, but supposedly there are a couple of areas that are just bastards to run.

I really need to actually play Cyberpunk instead of just wandering about marveling at how cool everything looks

Monitor has a 144hz refresh rate. It’s a very good Asus ROG monitor 27” that I don’t wanna get rid of.

YerDa Zabam
Aug 13, 2016



Kraftwerk posted:

Monitor has a 144hz refresh rate. It’s a very good Asus ROG monitor 27” that I don’t wanna get rid of.

Cool, just making sure it wasn't a 60 or 75Hz or the like.
1440p/144Hz, then yeah, the 4070 Ti Super would be a big upgrade over your 3080. Any more would be well into diminishing returns, anything less wouldn't be that much above what you have.

Dr. Video Games 0031
Jul 17, 2004

The 4070 Ti Super is maybe 25 - 30% faster than the 3080, not really a huge upgrade.

I'd just turn on DLSS in games that don't run super smoothly and wait until the next gpu gen, tbh. Quality mode DLSS looks fine at 1440p.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



YerDa Zabam posted:

This is the kind of reply that makes me feel like a vapid bourgeoisie

I mean, anyone that can afford a 4090 for gaming is a bourgeoisie, lol.

Not that it matters in the end.

YerDa Zabam
Aug 13, 2016



Dr. Video Games 0031 posted:

The 4070 Ti Super is maybe 25 - 30% faster than the 3080, not really a huge upgrade.

I'd just turn on DLSS in games that don't run super smoothly and wait until the next gpu gen, tbh. Quality mode DLSS looks fine at 1440p.

Yeah, true dat, the 1440p uplift wasn't great really.

A deranged option for the terminally impatient such as myself is...
Sell the 3080. (I'll use UK prices here) £450. Buy second hand 4080 (non super) £750. £300 for 50% improvement, no waiting!
Sell the 4080 whenever the next stuff comes (if you are bothered)

Again, a bit UK-centric, but you'll maybe save £50 a month in electric cost between the 30 and 40 series lol.
(not the 4090, that thing is scaring me every time I look at the watt meter, even with an undervolt)
To be "fair" to the electric companies, it has reduced a wee bit in the last year or two. The side effect of those prices is that I had to sell all my older enterprise homelab/NAS gear, as having 600W going all day wasn't viable. That partly funded the 4090, so silver linings and all that jazz
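The sell-and-rebuy arithmetic above can be sketched as cost per unit of performance gained (the prices and the ~50% uplift figure are the post's own rough guesses, not market data):

```python
# UK second-hand prices assumed in the post (rough guesses, not market data)
sell_3080 = 450    # GBP recouped by selling the 3080
buy_4080 = 750     # GBP for a second-hand 4080 (non-Super)
perf_gain = 0.50   # assumed ~50% uplift over a 3080

net_cost = buy_4080 - sell_3080
cost_per_percent = net_cost / (perf_gain * 100)
print(net_cost, cost_per_percent)  # 300 6.0 -> GBP 300 net, GBP 6 per % of uplift
```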

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

YerDa Zabam posted:

Yeah, true dat, the 1440p uplift wasn't great really.

A deranged option for the terminally impatient such as myself is...
Sell the 3080. (I'll use UK prices here) £450. Buy second hand 4080 (non super) £750. £300 for 50% improvement, no waiting!
Sell the 4080 whenever the next stuff comes (if you are bothered)

Again, a bit UK-centric, but you'll maybe save £50 a month in electric cost between the 30 and 40 series lol.
(not the 4090, that thing is scaring me every time I look at the watt meter, even with an undervolt)
To be "fair" to the electric companies, it has reduced a wee bit in the last year or two. The side effect of those prices is that I had to sell all my older enterprise homelab/NAS gear, as having 600W going all day wasn't viable. That partly funded the 4090, so silver linings and all that jazz

50 quid a month?? difference?? surely there's no way, unless you're talking about running the cards flat out all the time. What are the numbers on that?

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I wanted to drop a quick note on games added to GeForce Now over the last two months, since there's been a number of big hitters lately.

in no particular order, and as an incomplete list:

- Balatro
- Dragon's Dogma 2
- Elder Scrolls Online
- Evil West
- Manor Lords
- Pacific Drive
- Halo Infinite
- RAGE and RAGE 2
- Starcraft 2
- Starcraft Remastered
- Hearthstone
- Overwatch 2
- Diablo 2: Resurrected
- Diablo 3
- Diablo 4
- Fallout 4
- Fallout 76

YerDa Zabam
Aug 13, 2016



On their long podcast type video, Digital Foundry mentioned that GeForce Now has VRR support now. Pretty impressive stuff.

Sorbus
Apr 1, 2010
Ordered a b-stock Asus Tuf 4080 Super for about 1060 EUR. It will be a bit of a pain to dismantle current water loop but this should last until the 8XXX series or whatever.

repiv
Aug 13, 2009

seems capcom is working on pathtracing for RE Engine and there's an early version already hidden in dragons dogma 2

https://www.youtube.com/watch?v=UnYOKXS_wuo

YerDa Zabam
Aug 13, 2016



Not sure if the takeaway was that Capcom are working on it. Seems it's something that's sometimes used during production for workflow reasons. It'd be totally rad if they do though, and you never know.
Actually, I'd be happy if they'd at least fix the current issues

kliras
Mar 27, 2021
re8 had it in some nominal capacity, could be it's just lying around in the re engine from that. re2+3+7 also got it backported, but that was a bit of a mess. i think people also say to just turn it off in re4. i guess capcom's making some kind of attempt, but rarely a wholehearted one

could also be capcom's been messing around with it for the ps5 pro where sony's supposedly pushing for raytracing

(also please don't ever upload videos or do livestreams where you have clearly modded the game; capcom will strike that poo poo down like an angry god)

kliras fucked around with this message at 16:01 on Apr 27, 2024

repiv
Aug 13, 2009

YerDa Zabam posted:

Not sure if the takeaway was that Capcom are working on it. Seems it is something that is sometimes used during production for work flow reasons.

true, it could just be for quicker iteration during level editing before committing to a full GI bake

YerDa Zabam
Aug 13, 2016



Yeah he shows something similar in a UE5 game about half way through the video.

The bit when he maxes out the bounces and it drops to 20fps is pretty sobering.
For every time that I say to myself "the 4090 is insane, I can't imagine what I'd want from a 5090" there's something like that which answers the question, sort of. There's a Cyberpunk mod that lets you tweak bounces and stuff, mainly to improve performance iirc, and it also lets you go ham with bounces and it loving craters performance.

YerDa Zabam fucked around with this message at 16:10 on Apr 27, 2024

Dr. Video Games 0031
Jul 17, 2004

repiv posted:

true, it could just be for quicker iteration during level editing before committing to a full GI bake

DD2 doesn't use baked GI, though other RE Engine games have. Alex speculated that it may have been used as a ground truth to compare against when developing their RTGI implementation.

It would be neat if they can get it in good enough shape to ship it as a bonus feature in a patch, though it needs not only denoising but likely a new sampling algorithm that will fix the candle issue. (it's probably not using anything like restir)

Adding in a true denoiser might be beyond what's possible with modding, but I wonder if modders can mod in ray reconstruction somehow. That might solve the denoising problem, at least.

Dr. Video Games 0031 fucked around with this message at 16:29 on Apr 27, 2024

slidebite
Nov 6, 2005

Good egg
:colbert:

Yeah, I resisted that 4090FE. I'll hold out to the next gen.

Animal
Apr 8, 2003

slidebite posted:

Yeah, I resisted that 4090FE. I'll hold out to the next gen.

But what if this is the last gen?

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
Is there some kilowatt-hour calculation to make comparing power efficiency between using a 4090 vs two 3090s or whatever? Could it actually be worth it?


Lord Stimperor
Jun 13, 2018

I'm a lovable meme.

I mean it'll vary greatly with usage and whatnot. I think in the US energy is cheap enough that it isn't worth the bother. In the EU, energy prices can be multiples of US prices. So, over the lifetime of your computer, a more energy efficient GPU (as compared to a less efficient one) may pay itself back, while at the same time allowing you to keep a smaller PSU, with less heat, less strain, etc.



I'm reading that at full load, 2 3090s would consume about 750W. For a 4090 that's supposedly 461W. Full load is not a realistic scenario for gaming and most usages, but it gives an upper limit for the savings. I'm also reading that 3090s consume about 100W idle. That seems implausibly high to me. For the 4090 it's supposedly less than 10W.

So, under full load you get a 289W difference. At a 25 ct. energy price that's about 7.2 ct. difference per hour of max load. If you do high quality VR for two hours every day, that'd be 14.45 ct. per diem, or about 53 EUR/USD per annum. Not bad, but maybe not worth it.

For the idle condition the difference might still be a whopping 180W - if those reports of high idle power consumption are true. That would actually start making a pretty substantial effect. Let's say your computer runs another 2 hours idle every day, you'd save another 33 EUR/USD per annum.


Altogether the absolute maximum might be in the order of 86 EUR/USD per annum. Less if you live in Texas (ca. 50 USD p.a.), more if you live in Germany (maybe 110 EUR p.a.). That's in addition to improvements in quality of life, chiefly less load, heat, fan noise, and wear. And warranty. So depending on what you pay for electricity and how long your computer runs, it might be worth it.

Lord Stimperor fucked around with this message at 00:02 on Apr 28, 2024
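The estimate above can be re-run with a small script, using the same assumed figures (750W for two 3090s vs 461W for a 4090 at full load, an unverified ~180W idle-draw gap, 2 hours per day of each, 0.25 EUR/kWh):

```python
PRICE_PER_KWH = 0.25  # EUR, the post's assumed rate

def yearly_savings(watts_diff: float, hours_per_day: float,
                   price: float = PRICE_PER_KWH) -> float:
    """Yearly cost difference in EUR for a given wattage gap and daily usage."""
    return watts_diff / 1000 * hours_per_day * 365 * price

load = yearly_savings(750 - 461, 2)  # two 3090s vs one 4090 at full load
idle = yearly_savings(180, 2)        # the (unverified) idle-draw gap
print(round(load), round(idle), round(load + idle))  # 53 33 86
```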
