spunkshui
Oct 5, 2011



FlamingLiberal posted:

Are people already over Diablo 4? I've heard about all of the dumb poo poo that they did to kill the Overwatch franchise, which is frankly impressive, but I thought people mostly liked D4.

They put zero thought into levels 50 to 100, so when the story ends everyone just stops playing.

No cool PVP battle grounds.

No re-fighting campaign bosses. (LOL?)

They made it open world with a bunch of one-time NPC quest givers you will never care about, vs the old design where you really got to know the town members.

The game scales to your level, and whoever came up with the difficulty tuning did a dogshit job, so you can't ever jump forward to harder content.

The game desperately needed more interesting dungeon concepts.

Every single dungeon was like "here is a door you cant open, go kill 2 guys and make a key" copy and pasted over and over and over.


Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

idgi, what does re-encoding a youtube video and storing it locally get you

I don't re-encode youtube, but yt-dlp has support for remuxing your exact choice of youtube formats into your choice of output container (mp4, mkv, etc). Having a file you can drop on your phone or play through the tv is convenient, and I don't want to watch ads etc. And VLC uses a lot less cpu power than a web browser if you're on battery etc.
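
Something like this with the python module, for the curious (format IDs, filename template, and URL below are just placeholders; yt-dlp -F lists what's actually available for a given video):

    # Grab a specific pair of YouTube streams and remux them into mkv
    # without re-encoding. IDs/URL are illustrative only.
    from yt_dlp import YoutubeDL

    opts = {
        "format": "399+251",           # e.g. 1080p AV1 video + opus audio
        "merge_output_format": "mkv",  # remux only, no re-encode
        "outtmpl": "%(title)s.%(ext)s",
    }

    with YoutubeDL(opts) as ydl:
        ydl.download(["https://www.youtube.com/watch?v=EXAMPLE"])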

for family vids and stuff, the point is to reduce the size at archival quality. realtime-encoded files (even from hardware encoders) tend to use a lot more bitrate than you need; with a proper software encode you can bring the size down to like a third at nearly the same visual quality just by running it through x264 placebo or veryslow. and if you're going to store them forever (on your drive or in glacier etc), you want to get the size down (just like google).
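
roughly the kind of re-encode I mean, driven from python for batching (filenames and the CRF value are just examples, not a recommendation):

    # Archival-quality software re-encode: slow preset, quality-targeted CRF,
    # audio passed through untouched. Names and CRF are illustrative.
    import subprocess

    subprocess.run([
        "ffmpeg", "-i", "family_vid.mp4",
        "-c:v", "libx264", "-preset", "veryslow", "-crf", "18",
        "-c:a", "copy",
        "family_vid_archival.mkv",
    ], check=True)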

alternatively, when I encode a video clip that's going to be on backblaze or google drive for others to watch etc, I'll squish it down to 720p but make it the best-looking 720p I can, so that I don't burn up a ton of storage and they don't need a ton of bandwidth on their phones etc. Google does re-encode it for the "preview" most people use, but it still eats a bunch more space than it needs to.

Paul MaudDib fucked around with this message at 06:05 on Aug 19, 2023

Cygni
Nov 12, 2005

raring to post

I don't play multiplayer (disgusting), but I thought D4 was an… ok single player campaign, though I have no desire to play through it again. D2, Grim Dawn, Titan Quest, and PoE are some of the few games I’ve ever really played through multiple times, but D4 has no second-run appeal to me at the moment.

The game clearly wants you to play it like a stupid person, it reeks of Ubisoftisms, and even the single player felt tailored to modern action MMO sensibilities. It’s a shame, because the amount of effort the artists put into it is clearly staggering. Maybe it will get better in the future, i dunno.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord

Cygni posted:

I don't play multiplayer (disgusting), but I thought D4 was an… ok single player campaign, though I have no desire to play through it again. D2, Grim Dawn, Titan Quest, and PoE are some of the few games I’ve ever really played through multiple times, but D4 has no second-run appeal to me at the moment.

The game clearly wants you to play it like a stupid person, it reeks of Ubisoftisms, and even the single player felt tailored to modern action MMO sensibilities. It’s a shame, because the amount of effort the artists put into it is clearly staggering. Maybe it will get better in the future, i dunno.

I was gonna impulse buy D4, then realized it was $70 *before* all the weird battle pass stuff, and with the time that has passed it seems like BG3 was the pro impulse purchase.

Completely unrelated, what happens after 4k gaming? When I went from 24” 1080p to 27” 1440p I don’t remember it being all that much crisper. Still an improvement, but nothing groundbreaking. Anecdotally, I hear the quality jump from 1440p to 4k is even slimmer. Is there a point where we stop climbing resolution tiers because the computation cost is so high vs the quality benefit?

Yudo
May 15, 2003

buglord posted:

I was gonna impulse buy D4, then realized it was $70 *before* all the weird battle pass stuff, and with the time that has passed it seems like BG3 was the pro impulse purchase.

Completely unrelated, what happens after 4k gaming? When I went from 24” 1080p to 27” 1440p I don’t remember it being all that much crisper. Still an improvement, but nothing groundbreaking. Anecdotally, I hear the quality jump from 1440p to 4k is even slimmer. Is there a point where we stop climbing resolution tiers because the computation cost is so high vs the quality benefit?

For the most part, we are already there. The pixel density of 4k is fine for most large screen sizes; under 27in, I can't always tell 1440p from 4k. At some pixel density, it stops being noticeable. Everyone has different eyes, and you should probably check this stuff out for yourself.

8k (and arguably 4k) ratchets up computational demands dramatically. An HD image is about 2 mpx and 4k about 8.3; 8k is a whopping 33 mpx. That is a lot of pixels for a GPU to calculate. There is also the fact that 8k TVs are only a low-single-digit percentage of TVs sold each year. Maybe 8k will be the next upsell for Nvidia and AMD, but I doubt it: the juice for most setups isn't worth the squeeze. I am sticking with 1440p for video games, since features like path tracing in Cyberpunk are still compatible with a 144fps experience there, something I doubt is doable at 4k. I doubt path tracing will be a common feature in games for some time to come; still, UE5 also looks extremely hardware demanding.
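
The arithmetic behind those numbers, for the standard 16:9 resolutions:

    # Megapixels per frame at common 16:9 resolutions.
    resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440),
                   "4k": (3840, 2160), "8k": (7680, 4320)}
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h / 1e6:.1f} Mpx")
    # 1080p: 2.1 Mpx, 1440p: 3.7 Mpx, 4k: 8.3 Mpx, 8k: 33.2 Mpx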

Pixel density to me is more of a productivity thing, not really for games. I like the extra density for text, window tiling and such. I guess it depends on how big the screen you are using is. A lot of people here play PC games on a big 4k TV, and that makes sense. 8k is a solution looking for a problem unless you are uber rich with a 100in+ home theater, in my opinion.

Josh Lyman
May 24, 2009


Dr. Video Games 0031 posted:

Diablo 4 had a strong start, which Blizzard proceeded to absolutely destroy by releasing a series of terrible patches that ruined a lot of what people liked about the game.

Ya, I loved the game before season 1, but they nerfed the poo poo out of sorcerer, which was already a fairly weak class, and I just don’t enjoy playing other classes as much.

shrike82
Jun 11, 2005

i think people are going to stop worrying about "after 4k gaming" once 4K30 console games are the norm again and the transition to UE5 tanks current gen PCs

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

buglord posted:

Completely unrelated, what happens after 4k gaming? When I went from 24” 1080p to 27” 1440p I don’t remember it being all that much crisper. Still an improvement, but nothing groundbreaking. Anecdotally, I hear the quality jump from 1440p to 4k is even slimmer. Is there a point where we stop climbing resolution tiers because the computation cost is so high vs the quality benefit?

I think that the answer is "dynamic resolution and upscaling", which we already have right now. TVs seem to be barreling right towards 8K and I wouldn't be surprised if display tech continues to give us pretty cheap pixels in the future. Nobody's going to play at native 8K though.

4K works much better than 1440 as an upscaling target for a variety of resolutions. 1080 scaled up to 1440 looks kind of gross; I realize DLSS can fix most of that, but not every game has DLSS. You can upscale 1080 to 4K using integer scaling if you want, and more importantly you can upscale a dynamic resolution from 900p-1600p or whatever up to 4K and it'll look much better than trying to scale that to 1440. That side of things would continue to get better at 8K, I'd assume.
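
Quick illustration of why 1080 maps cleanly onto 4K but not onto 1440 (just the scale-factor arithmetic):

    # Per-axis scale factors from a 1080p source. 4K is an exact 2x,
    # so each source pixel maps to a clean 2x2 block; 1440p is 1.33x,
    # which is why it tends to look soft.
    src_w, src_h = 1920, 1080
    for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
        print(name, w / src_w, h / src_h)
    # 1440p 1.333... 1.333...
    # 4K 2.0 2.0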

Arivia
Mar 17, 2011

Paul MaudDib posted:

av1 is so fuckin good, for real. I use computers wrong so I eschew the tools built for the purpose and rip the exact formats+files I want. Youtube's 1080p AV1 variant of the team fortress 2 expiration date vid is 114mb (399+251+embedded metadata), looks fantastic even in the action sequences.

And for podcasts and video essays and tech news and the like, I like having a local copy for convenience/ease-of-use. I use a space-optimized file (480p + 60kbps opus) cause I don't want 10 episodes of 5 shows at 500mb+ per episode sitting around on my phone. 480p is pretty compact already and 480p AV1 again looks great, and squeezes out some impressive file sizes too. A 65-minute Greg's Airplanes video is 75mb; again that's a bit of a happy case since it's a slideshow-format essay, but that's perfectly fine for a "portable" copy. For a video that's a GN interview with wendell (2019-10-31 unraid pt1) that's 35m long, it's 105mb.

It takes a while for the av1 version to be uploaded (understandable) and I'm not quite clear about who gets encoded/re-encoded with av1. Obviously the answer is "whoever google thinks will save them the most bandwidth on delivery", but for example greg has a fair bit of older (not that much older even!) catalog that's never been re-encoded even though I'm sure it's listened to decently frequently etc. It seems completely random. Obviously big channels are getting their newer content done with it to bring down costs, but not always? It's weird. I'm also not clear whether google re-encodes, or only allows av1 when the uploader does it, or maybe re-encodes for you when you're popular enough?

but it's an obvious win for visual quality and bringing down my storage usage/amount of my phone storage it eats (lolapple).

I've never tinkered with software AV1 for family videos etc but I'll do placebo x265 encodes if I know I'm gonna keep it, and you can get some decent compression like that too. A 15 minute 2x720p video takes like 8 hours or whatever on my 9900K but eh, who cares. And it spits out pretty good looking files even for some pretty small sizes (CRF27/29) that deal with motion a lot better than H264 even with placebo. To me the proposition of getting AV1 files from hardware encode that have about the same quality as the 8-hour x265 encode sounds great! but on the other hand it at least would have to have decent ffmpeg support before I'd really touch it.

I just wanted to make a Katy perry joke

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Arivia posted:

I just wanted to make a Katy perry joke

https://www.youtube.com/watch?v=9kaIXkImCAM

Cygni
Nov 12, 2005

raring to post

buglord posted:

Completely unrelated, what happens after 4k gaming? When I went from 24” 1080p to 27” 1440p I don’t remember it being all that much crisper. Still an improvement, but nothing groundbreaking. Anecdotally, I hear the quality jump from 1440p to 4k is even slimmer. Is there a point where we stop climbing resolution tiers because the computation cost is so high vs the quality benefit?

4k 144 or 240hz at sub $300, and then QD-OLED/MicroLED 4k 144 or 240hz at sub $300. And at the current pace of the monitor market, that takes us like 10 years into the future when Mecha-Obama outlaws videogames because of the 4th Anime War and the destruction of New NeoCleveland III.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

buglord posted:

Completely unrelated, what happens after 4k gaming? When I went from 24” 1080p to 27” 1440p I don’t remember it being all that much crisper. Still an improvement, but nothing groundbreaking. Anecdotally, I hear the quality jump from 1440p to 4k is even slimmer. Is there a point where we stop climbing resolution tiers because the computation cost is so high vs the quality benefit?

I don't think the resolution treadmill is going to continue. 4K is more worth it than people give it credit for, the ability to have "retina display" oversampling of the viewer's visual acuity is good, and that's true for both consumer/professional monitor and TV markets. But I don't see any appetite for anyone to continue the treadmill to 8K. The hardware isn't there for it yet either, it would really take a leap to MCM with multiple big GCDs to start getting to playable framerates. $4k or $8k GPU type stuff. And even then right now it's only luxury market TVs and laser projectors that can even show the signal.

Framerate isn't worthless either, and even at 4K there is already the tradeoff that 1440p is going to be faster on any given generation of hardware. And that same tradeoff exists between 4K and 8K, and I just don't see it being worth it for 99% of people to go from 4K120 to 8K60 or 4K240 to 8K120. And you have to drive that higher resolution - of course you can upscale etc but lol.

(the exception is VR, I think VR is still absolutely insatiable for resolution and vision pro is the first thing that's truly eliminated screendoor effect, and even then it's probably still not where they want it for fine-detail acuity like text on the fake-monitor. but of course even then VR cares a lot about framerate too, and the answer is more or less that VR wants more of everything, and maybe we're currently in the trough of disillusionment where people finally fully understand just how hard a problem it is to make a virtual world a half inch from your eye with enough resolution that it works like a normal monitor that you'd want to write code on, and update it 240 times a second, with low latency, and make the thing efficient enough that you can get a reasonable lifespan out of it in a portable (wearable) format. And they'll use temporal sampling and upscaling and foveated rendering and any other trick they need to get there.)

The fundamental problem with 4K144 compared to 1440p144 is that you can do the latter off a stock DP1.2 link (overclocked was 165 Hz) and the former requires an (overclocked or chroma subsampled) DP1.4 link (stock is 4K120). At any given level of transmission, 1440p is more pixel-efficient than 4K for a given framerate. You can support more streams on your dock, or thunderbolt link, or MST chain, or idle pixel clock (lol), or anything else. And if you step that down to 8K, now you've got real deep compromises in framerate at a given link speed. and future link speed specs will improve this, but they'll also improve the other resolutions, you can already do 1440p240 or something on DP1.4, or 2x 120hz. And while the refresh rate treadmill won't scale forever, there's worthwhile gains to at least 240hz if you can send them and display them. And the connector standards iterate sloooowww. It doesn't get that much easier to move 10-40gbps+ that often.
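
Back-of-envelope version of that link budget (active pixels only, so blanking pushes the real numbers a bit higher; the link rates are the usual effective-payload figures for 4-lane HBR2/HBR3):

    # Uncompressed video bandwidth vs. what the link can actually carry.
    def gbps(w, h, hz, bpp=24):
        return w * h * bpp * hz / 1e9  # active pixels only

    links = {"DP1.2 (HBR2)": 17.28, "DP1.4 (HBR3)": 25.92}  # effective Gbps
    modes = {"1440p144": (2560, 1440, 144), "4K120": (3840, 2160, 120),
             "4K144": (3840, 2160, 144), "8K60": (7680, 4320, 60)}

    for mode, (w, h, hz) in modes.items():
        need = gbps(w, h, hz)
        fits = ", ".join(n for n, rate in links.items() if need <= rate)
        print(f"{mode}: {need:.1f} Gbps -> {fits or 'needs DSC/subsampling/overclock'}")
    # 1440p144: 12.7, 4K120: 23.9, 4K144: 28.7, 8K60: 47.8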

Eventually I wonder if we won't get to something that looks more like a video codec than a scanout raster sweep or a packetized bit array. if oleds can draw at 0.1ms response time that's effectively thousands of FPS response time, why not try to feed data faster in the areas of motion in the image etc? Macroblocks are more efficient than selective line scanout and that's more efficient than a simple start-to-finish scanout. Just have the protocol send some kind of macroblock update or line update format. Or foveated scanout ;)

The easy answer is "do the upscaling over there so you don't have to send the fullframe image" but I don't think it would ever be feasible to pull that off without access to textures, memory, SMs, cache locality, etc. But I think a "video compression" style thing would work, if OLEDs can be fully or partially agile in the pixel addressing and you send a stream of updates. You can have "1000hz refresh in the action area" but also have high resolution in the other areas, and make the most effective use of the limited link speed. Spend your bits where they do the most good.
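
Toy version of the "send only what changed" idea - nothing like a real link protocol, just diffing two frames into dirty 16x16 tiles:

    # Conceptual sketch only: ship changed tiles instead of a full scanout.
    import numpy as np

    TILE = 16

    def dirty_tiles(prev, curr):
        """Yield (x, y, block) for every 16x16 tile that changed."""
        h, w = curr.shape[:2]
        for y in range(0, h, TILE):
            for x in range(0, w, TILE):
                block = curr[y:y+TILE, x:x+TILE]
                if not np.array_equal(block, prev[y:y+TILE, x:x+TILE]):
                    yield x, y, block

    prev = np.zeros((1440, 2560, 3), dtype=np.uint8)
    curr = prev.copy()
    curr[96:112, 192:208] = 255  # one tile's worth of motion
    total = (1440 // TILE) * (2560 // TILE)
    print(f"{len(list(dirty_tiles(prev, curr)))} dirty tile(s) out of {total}")
    # 1 dirty tile(s) out of 14400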

But there's just zero appetite for 8K right now in the general public imo, and zero reason to adopt it, especially given the increasing tradeoffs on refresh rate. Who is the use-case beyond movie buffs with whatever services let you rent 8k movies or whatever and videography carts/video editing pros? Especially if you have to upscale at performance or ultraperformance to get there.

(to be clear professional 8K does exist, including oled, but it's the $25k 24" calibrated reference monitor not your gaming stuff. And they are fine with the tradeoffs because they're not working at 240fps. And that one dell monitor, which was pulled from the market iirc)

Paul MaudDib fucked around with this message at 06:29 on Aug 19, 2023

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Paul MaudDib posted:

I don't think the resolution treadmill is going to continue.

...

But there's just zero appetite for 8K right now in the general public imo, and zero reason to adopt it, especially given the increasing tradeoffs on refresh rate.

(to be clear professional 8K does exist, including oled, but it's the $25k 24" calibrated reference monitor not your gaming stuff. And they are fine with the tradeoffs because they're not working at 240fps. And that one dell monitor, which was pulled from the market iirc)

You can buy 8K TVs at Best Buy for under $1000 already. Within a decade there's going to be a ton of 8K displays in homes, whether people specifically want them or not. I'd be surprised if some of that doesn't trickle into monitors too. Hell, I'd bet that at whatever early planning stage the PS6 / Xbox Next are at, they're being designed around an 8K output target, using upscaling of course.

Once you're there, wouldn't it make sense to send that 8K display an input that's been upscaled by DLSS or FSR or XeSS to 8K rather than to 4K?

Dr. Video Games 0031
Jul 17, 2004

As Digital Foundry has said in the past, we're entering into a "post-resolution" era. The concept of your render resolution being 1:1 with your display resolution is quickly going out of fashion. It's a trend that started several years ago and is going to accelerate over the next few. The future of graphical presentation will involve balancing resolution with graphical features to give you the level of fidelity you're most happy with at the frame rate you want. Usually this will mean rendering at below your display's resolution, though it could conceivably mean rendering above it too. People who insist on native-resolution only five years from now will look like how anti-TAA people look now.

Further into the future, sometime in the 2030s or beyond, it's possible that we'll end up with micro-LED displays with hyper-dense LED arrays that can create any arbitrary resolution. And at that point, we really will be in a post-resolution era. Nobody will be thinking about their display resolution at all anymore. Maybe we'll even create new signal standards capable of delivering images of non-uniform pixel density without scaling artifacts or a loss of sharpness so edges can have lots of pixel information while other parts of the image have less. Or some kind of macroblock/foveated method like Paul describes (or a combination of both). And then we'll be in both a post-resolution and a post-frame-rate era, which is when things will start getting really spooky.

Trying to do a flat 8K though is just dumb when it comes to gaming, and I hope nobody ever takes that idea seriously

Twerk from Home posted:

You can buy 8K TVs at Best Buy for under $1000 already. Within a decade there's going to be a ton of 8K displays in homes, whether people specifically want them or not. I'd be surprised if some of that doesn't trickle into monitors too. Hell, I'd bet that at whatever early planning stage the PS6 / Xbox Next are at, they're being designed around an 8K output target, using upscaling of course.

Once you're there, wouldn't it make sense to send that 8K display an input that's been upscaled by DLSS or FSR or XeSS to 8K rather than to 4K?

Even with an 8K TV (which will remain a bad idea to buy no matter how cheap it is), I'd still want games to DLSS/FSR/XeSS up to 4K and then integer scale that to 8K. If you think VRAM is becoming a major concern now, just wait until you try to run a game at 8K (those upscaling techniques don't reduce VRAM requirements of their higher output resolution much at all)

Also I'd have to see a side-by-side to know for sure, but I have a feeling that running DLSS Quality at 4K and integer scaling that up to 8K will probably look better than running DLSS Ultra Performance at 8K on top of being faster and requiring less VRAM. It just seems like there'd be a lot more room for error when you're doing a 3x upscale.
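
Rough numbers, going off the published DLSS scale factors (Quality is ~2/3 per axis, Ultra Performance ~1/3):

    # Internal render resolution implied by each approach.
    def render_res(out_w, out_h, scale):
        return round(out_w * scale), round(out_h * scale)

    print("4K Quality renders at:", render_res(3840, 2160, 2 / 3))     # (2560, 1440)
    print("8K Ultra Perf renders at:", render_res(7680, 4320, 1 / 3))  # (2560, 1440)
    # Both start from roughly 1440p internally, but the 8K path has to
    # reconstruct 9x the pixels it rendered instead of 2.25x.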

Dr. Video Games 0031 fucked around with this message at 05:42 on Aug 19, 2023

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Twerk from Home posted:

You can buy 8K TVs at Best Buy for under $1000 already. Within a decade there's going to be a ton of 8K displays in homes, whether people specifically want them or not. I'd be surprised if some of that doesn't trickle into monitors too. Hell, I'd bet that at whatever early planning stage the PS6 / Xbox Next are at, they're being designed around an 8K output target, using upscaling of course.

Once you're there, wouldn't it make sense to send that 8K display an input that's been upscaled by DLSS or FSR or XeSS to 8K rather than to 4K?

that's fair, people are dumb, but I learned this lesson with my seiki 50" (which was the shitass 50UY-01 model that didn't take the same firmware as the 120hz!): pure resolution per dollar isn't the same thing as image quality. I would absolutely choose a high-zone-count (someone said they're up to 2k or 4k zones or something in the new gen?) miniLED or QLED or LG WOLED 4K panel over an 8k shitass panel any day. there is such a thing as a product being too cheap to actually compete in its bracket at all, and actually too cheap to be worth it over a premium lower-res panel.

should have gotten a nicer 1080p instead, (pre-amazon) woot screwed me on a vizio E-series refurb, a decent 1080p "not amazing but no big problems" class thing. broken in transit cause of their lovely boxes, and they didn't have a functional return process for large items at all, and refused to process the item even though the helpful USPS clerk let me put a normal package label on an oversize item and it was delivered to them. had to do a chargeback

then sears was clearing out the seiki for something silly like $300 and you could supposedly do 1080p120 on it with the hacked firmware. It was poo poo, awful zone uniformity, anything space-opera'y was hideously checkerboarded. And the hacked firmware didn't flash on that hardware revision.

Paul MaudDib fucked around with this message at 05:48 on Aug 19, 2023

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Paul MaudDib posted:

that's fair, people are dumb, but I learned this lesson with my seiki 50" (which was the shitass 50UY-01 model that didn't take the same firmware as the 120hz!): pure resolution per dollar isn't the same thing as image quality. I would absolutely choose a high-zone-count (someone said they're up to 2k or 4k zones or something in the new gen?) miniLED or QLED or LG WOLED 4K panel over an 8k shitass panel any day. there is such a thing as a product being too cheap to actually compete in its bracket at all, and actually too cheap to be worth it over a premium lower-end panel.

should have gotten a nicer 1080p instead, woot screwed me on a vizio E-series refurb, a decent 1080p "not amazing but no big problems" class thing. broken in transit cause of their lovely boxes, and they didn't have a functional return process for large items at all, and refused to process the item even though the helpful USPS clerk let me put a normal package label on an oversize item and it was delivered to them. had to do a chargeback

then sears was clearing out the seiki for something silly like $300 and you could supposedly do 1080p120 on it with the hacked firmware. It was poo poo, awful zone uniformity, anything space-opera'y was hideously checkerboarded. And the hacked firmware didn't flash on that hardware revision.

Yes, right now the 8K displays at any given price point are worse than the 4K displays, and 8K specifically isn't worth paying for at this time, but the market momentum is absolutely heading towards 8K and I think once the economies of scale are there everybody's getting an 8K TV whether they want it or not.


Dr. Video Games 0031 posted:

Even with an 8K TV (which will remain a bad idea to buy no matter how cheap it is), I'd still want games to DLSS/FSR/XeSS up to 4K and then integer scale that to 8K. If you think VRAM is becoming a major concern now, just wait until you try to run a game at 8K (those upscaling techniques don't reduce VRAM requirements of their higher output resolution much at all)

Also I'd have to see a side-by-side to know for sure, but I have a feeling that running DLSS Quality at 4K and integer scaling that up to 8K will probably look better than running DLSS Ultra Performance at 8K on top of being faster and requiring less VRAM. It just seems like there'd be a lot more room for error when you're doing a 3x upscale.

Memory prices are dirt cheap and look to stay low, right? 24GB of GDDR6 is what, $80? https://www.tomshardware.com/news/gddr6-vram-prices-plummet. I'm resisting the urge to bring stupid memes into this, but AMD is able to sell a card around $700 now with 20GB of VRAM on top of their super-expensive MCM process.

About DLSS quality at a lower resolution vs DLSS performance at a higher resolution, we're living that right now with the 1440p vs 4K debate and I think that it's clearly not settled. If anything, I'd rather have DLSS upscale a given frame more than generate frames on its own, we've seen how DLSS framegen has some serious downsides. Why not have it work upscaling a 1440p render to 8K rather than upscaling it to 4K and then doing framegen?

Dr. Video Games 0031
Jul 17, 2004

See, my thought is that while there are meaningful improvements to image quality to be had between 1440p and 4K, when you compare 4K with 8K there's really not much there. Upscaling to 4K from 1440p looks better than native 1440p, but if 8K doesn't look better than 4K, then why are we trying to upscale to it directly?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

this is what I meant by macroblock, really just thinking of an arbitrary "output buffer blit operation" that can be executed by the other side, and you can come up with codings/ISAs for that. 8088 Domination "executable sprites", or some kind of a delta-compression/neural-compression thing. it just has to run at really low latency, and turn bits into on-screen updates promptly (perhaps the early bits get devoted to hotspots, or pixel draws are scheduled across the whole real clock time window ala "chasing the beam").

I just think it's generally possible to be more flexible than Display Stream Compression if you abandon line scanout. Line scanout is no longer a good mechanism when you have the ability to drive arbitrary pixels in the array, and I think OLEDs can do that.

(executable bitstream compression is extremely cursed)

Paul MaudDib fucked around with this message at 07:22 on Aug 19, 2023

BurritoJustice
Oct 9, 2012

Truga posted:

maybe nvidia should invest into making FSR, which is opensource, better instead of pushing for a proprietary cuda based solution, then

it does not

FSR is open source in its license, but it is not collaborative like "true" open-source projects. They've never accepted a pull request, and probably never will given that even really basic ones that are just simple bug fixes get ignored, and all the development happens behind the scenes. AMD just dumps the latest code on GitHub every six months, typically a month or two after the latest version shipped in binary form in a sponsored game.

Nvidia could branch off FSR2 and work on that, but that obviously isn't going to be the version that is shipped in AMD sponsored games. And at that point, why bother starting with FSR2 as a base at all?

So no, Nvidia can't meaningfully improve FSR.

Dr. Video Games 0031
Jul 17, 2004

shrike82 posted:

puredark's gotten a review copy of starfield and will apparently have a dlss mod ready at launch

This turned out to be misinfo by the way. Puredark said he plans on making the mod available within a day of the game's launch. Some idiots assumed that must mean he has a review code and presented his comments as if he did. He has since replied that he doesn't have a review code and he simply thinks he'll be able to get the mod ready within a day because he's built a framework to do this quickly.

Truga
May 4, 2014
Lipstick Apathy

BurritoJustice posted:

So no, Nvidia can't meaningfully improve FSR.

amd already ships new and improved drivers the linux community fixed for them even on windows, so if a bunch of hobbyists could do it i'm sure jensen can too

Tha_Joker_GAmer
Aug 16, 2006

I can't believe nvidia remote killed my (almost) 9-year-old 980, forcing me into a very awkward situation (not wanting to buy bad graphics cards but wanting to play video games). Posting right now with the power of integrated grafix.

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack

Tha_Joker_GAmer posted:

I can't believe nvidia remote killed my (almost) 9-year-old 980, forcing me into a very awkward situation (not wanting to buy bad graphics cards but wanting to play video games). Posting right now with the power of integrated grafix.

I have a 1050Ti you can have for the low low price of (checks the market) a thousand dollars.

Truga
May 4, 2014
Lipstick Apathy

still don't care what the whole youtuber drama is about, other than ltt being a shithole to work for, cause i avoided that already, but the content's been amazing

Kibner
Oct 21, 2008

Acguy Supremacy

In addition to releasing a new chipset driver, AMD also announced and released a developer tool called Radeon GPU Detective (RGD) to help developers diagnose crashes: https://gpuopen.com/radeon_gpu_detective_available_now/

Animal
Apr 8, 2003

repiv posted:

the fact that they're still running the gsync module on an FPGA in the year of our lord 2023 is just baffling, they're literally an ASIC company, why wouldn't you make an ASIC once the functionality is nailed down

it's a good implementation but the hardware cost is unjustifiable nowadays

I agree with this.

The reason why I stick to monitors with a G-Sync Ultimate module is that the VRR range goes down to 0hz. Does Freesync go below 40hz these days? The whole point of VRR is to smooth out gameplay; I want it to smooth things out when it dips below 40.

BurritoJustice
Oct 9, 2012

Truga posted:

amd already ships new and improved drivers the linux community fixed for them even on windows, so if a bunch of hobbyists could do it i'm sure jensen can too

AMDGPU is actually a bonafide collaborative open source project though, not the faux style of FSR.

Animal posted:

I agree with this.

The reason why I stick to monitors with a G-Sync Ultimate module is that the VRR range goes down to 0hz. Does Freesync go below 40hz these days? The whole point of VRR is to smooth out gameplay; I want it to smooth things out when it dips below 40.

Yeah, freesync does low framerate compensation. It's done a bit differently from how gsync does it, though: gsync has a buffer on the monitor that it uses to hold frames that need repeating, while Freesync LFC is a driver-side fix where the GPU handles the doubling.

You can make the driver side LFC misbehave and flicker if you intentionally create a situation where it's just on the edge of the freesync range and it's turning on and off, but otherwise it's a near identical experience to module GSync for low framerates.
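
Roughly what the driver-side logic boils down to (the 48-144Hz VRR window here is just an example panel, not any specific spec):

    # Low framerate compensation: repeat each frame enough times that the
    # refresh rate the panel sees lands back inside its VRR window.
    def lfc_multiplier(fps, vrr_min=48, vrr_max=144):
        # LFC generally needs vrr_max >= 2 * vrr_min so a doubled frame
        # always fits inside the window.
        if fps >= vrr_min:
            return 1
        n = 2
        while fps * n < vrr_min:
            n += 1
        return n

    for fps in (100, 40, 20, 10):
        n = lfc_multiplier(fps)
        print(f"{fps} fps -> {n}x -> panel refreshes at {fps * n} Hz")
    # 100 -> 1x (100 Hz), 40 -> 2x (80 Hz), 20 -> 3x (60 Hz), 10 -> 5x (50 Hz)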

BurritoJustice fucked around with this message at 15:55 on Aug 19, 2023

repiv
Aug 13, 2009

it does provided the monitor has a wide enough VRR range for driver-level LFC to work, where the GPU doubles or triples or quadruples up frames to push lower framerates up into the range the monitor can actually display

the gsync module does something similar at low framerates but it's implemented in the module itself rather than on the GPU side

e;fb

Indiana_Krom
Jun 18, 2007
Net Slacker

Animal posted:

I agree with this.

The reason why I stick to monitors with a G-Sync Ultimate module is that the VRR range goes down to 0hz. Does Freesync go below 40hz these days? The whole point of VRR is to smooth out gameplay; I want it to smooth things out when it dips below 40.

If the monitor has "g-sync compatible" certification from Nvidia it should support LFC down to single-digit rates. Basically Nvidia started their own certification program to fix AMD's poo poo for them.

BurritoJustice
Oct 9, 2012

The major downside of AdaptiveSync relying on the source for LFC in general is that it's easy to be SOL if the driver writers don't give a poo poo about anything but the strict requirements of the spec (which doesn't specify LFC). The PS5 doesn't support LFC, which means VRR is effectively useless in a lot of cases. Some game devs, like Insomniac, get around this by doing game-level LFC where the game engine itself is doing the duplication.

Yudo
May 15, 2003

For those of you convinced that RDNA3 can't do math:

WCCF posted:

But just this week, both Intel and AMD optimized their software stacks to get massive speedups in generative AI, which has seen AMD's RX 7900 XTX get higher performance per dollar than an NVIDIA RTX 4080 in generative AI (specifically Stable Diffusion with A1111/Xformers). Considering Stable Diffusion accounts for the vast majority of non-SaaS, localized generative AI right now - this is a major milestone and finally offers some competition to NVIDIA.

Using Microsoft Olive and DirectML instead of the PyTorch pathway results in the AMD 7900 XTX going from a measly 1.87 iterations per second to 18.59 iterations per second! You can read the detailed guide by AMD over here. This level of performance in Automatic1111 is pretty close to the SHARK-based approach to Stable Diffusion and definitively puts the company on the map with regards to generative AI. As it turns out, it also makes the 7900 XTX offer slightly higher GenAI performance per dollar (in Stable Diffusion/A1111) than the comparative RTX 4080 - at least at current prices.

wargames
Mar 16, 2008

official yospos cat censor

AMD needs to do this with more projects.

Kazinsal
Dec 13, 2011

Now if only it were for something actually useful instead of godawful AI "art" garbage.

repiv
Aug 13, 2009

look, someone has to poo poo up every art platform with 10 billion generated fantasy landscapes and sameface scifi girls

Branch Nvidian
Nov 29, 2012



Installed my brother's old RX 470 and the same issue I've been having with the 7900 XTX appeared. Used DDU, installed drivers fresh, and problem continues to exist. Do I just cut my losses and buy an Nvidia GPU at this point?

Branch Nvidian fucked around with this message at 19:45 on Aug 19, 2023

Yudo
May 15, 2003

Branch Nvidian posted:

Installed my brother's old RX 470 and the same issue I've been having with the 7900 XTX appeared. Used DDU, installed drivers fresh, and problem continues to exist. Do I just cut my losses and buy an Nvidia GPU at this point?

If you installed a known-good video card and have the exact same problem, I think that is a good sign the problem lies elsewhere. At the least, I would be reluctant to throw good money after bad without understanding what is happening. Do you have an Nvidia card lying around that you can test?

njsykora
Jan 23, 2012

Robots confuse squirrels.


Kazinsal posted:

Now if only it were for something actually useful instead of godawful AI "art" garbage.

Yeah but they can’t sucker in billions of tech investor money by making a better upscaler for videogames.

Branch Nvidian
Nov 29, 2012



No, I don't, unfortunately. I told UPS to reroute my shipment so I'll have my GPU back in a couple days. Going to try nuking my windows install again and starting over, testing with each new piece of software and driver as I install them. Don't know what else to try in the meantime.

PC LOAD LETTER
May 23, 2005
WTF?!

Branch Nvidian posted:

Installed my brother's old RX 470 and the same issue I've been having with the 7900 XTX appeared. Used DDU, installed drivers fresh, and problem continues to exist. Do I just cut my losses and buy an Nvidia GPU at this point?

Huuuh?

If you have the same problem with a different, known-good video card, then the problem isn't the video card. It's something else in the system.

Just an example of how something weird can screw things up: I once had a case with a paper clip in it that somehow fell behind the mobo and caused some weird problems I couldn't figure out until I pulled the mobo out of the case and found it by accident. Worked fine after that was removed!

I would start testing everything in that computer in another PC with known good parts until the problem is duplicated, to find the culprit. That is the old, hard, and slow way of doing it, but it does work.


Yudo
May 15, 2003

Branch Nvidian posted:

No, I don't, unfortunately. I told UPS to reroute my shipment so I'll have my GPU back in a couple days. Going to try nuking my windows install again and starting over, testing with each new piece of software and driver as I install them. Don't know what else to try in the meantime.

Install Linux and see if it has the same problem. It may also throw more informative errors than what you are getting from Windows.
