shrike82
Jun 11, 2005

i have both the 256gb deck and the rog ally extreme - there are trade-offs with both

the biggest issue with the deck right now is its lovely 720P screen - the low res makes some games hard to parse (e.g. diablo 4), and i'm not sure why, but the screen is noticeably dingier than other 720P screens. also, it only supports wifi 5. in contrast, the ally's 1080p screen is great even compared to other 1080p mobile handhelds. it also supports wifi 6(e) and handles home streaming from my PS5 and PC more consistently at higher bit rates

the main downside with the ally is you have to deal with windows crap and significantly worse battery life at low TDP settings. at >=15W you'll hit the same wall with both devices though

i guess let's hope there'll be a new deck model with a decent 1080P (OLED) screen and wifi 6 support?

repiv
Aug 13, 2009

gradenko_2000 posted:

I'm surprised this technique isn't really used for PCs. Seems like a cheap and easy alternative

ubisoft games had checkerboard rendering on PC for a time, but they ripped it out in favor of TAAU

RE Engine still has it (they call it "interlaced mode") but they also have FSR2 now which is just better
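
for context, checkerboarding shades only half the pixels each frame in an alternating pattern and fills the holes from the previous frame. here's a toy sketch of the idea in numpy (static scene, no motion vectors or resolve heuristics, so nothing like a real engine's implementation):

```python
import numpy as np

h, w = 4, 8
yy, xx = np.mgrid[0:h, 0:w]
scene = (xx * 10 + yy).astype(float)   # stand-in for the fully shaded image

prev = np.zeros((h, w))                # whatever was on screen last frame
for frame in range(2):
    mask = (xx + yy + frame) % 2 == 0  # shade half the pixels, alternating per frame
    out = np.where(mask, scene, prev)  # holes are filled from the previous frame
    prev = out

# with a static image, two frames reconstruct the full-res scene exactly;
# real implementations need motion vectors to reproject the previous frame
print(np.array_equal(out, scene))      # True
```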

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.

repiv posted:

i can't make heads or tails of this neural geometry stuff, researchers seem very excited about it but they were also very excited about voxels and here we are in 2023 still using triangles for everything

good news, neural geometry is voxels - the ultra high-res models they're path-tracing with it are voxel-based

MarcusSA
Sep 23, 2007

shrike82 posted:

i have both the 256gb deck and the rog ally extreme - there are trade-offs with both

the biggest issue with the deck right now is its lovely 720P screen - the low res makes some games hard to parse (e.g. diablo 4), and i'm not sure why, but the screen is noticeably dingier than other 720P screens. also, it only supports wifi 5. in contrast, the ally's 1080p screen is great even compared to other 1080p mobile handhelds. it also supports wifi 6(e) and handles home streaming from my PS5 and PC more consistently at higher bit rates

the main downside with the ally is you have to deal with windows crap and significantly worse battery life at low TDP settings. at >=15W you'll hit the same wall with both devices though

i guess let's hope there'll be a new deck model with a decent 1080P (OLED) screen and wifi 6 support?

I have both as well and I agree with this. I have been using my Ally more though because I’ve been playing games that don’t run on the deck.

Dr. Video Games 0031
Jul 17, 2004

repiv posted:

wait FF16 is using FSR1? in this economy?

:barf:

The FSR1 resolve looks decent in the quality mode, though you lose some background detail.


[screenshots omitted]

(DRS doing a bit of a number on that last one though)

It's pure rear end in the performance mode though, and a lot of detail ends up lost at middle camera distances too. You have to zoom right in on faces, for instance, to see them in decent detail; otherwise they look blank and featureless. At least all cutscenes are done at 1440p-ish at 30fps.

I don't know if they have any performance headroom to use FSR2 without dropping resolution even more, which will probably result in a pretty nasty image. They probably went a bit too hard on geometric detail in the environments and shadow resolution, as much as I like the shadows in this game.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
it'd be crazy to me if Valve didn't launch a 24GB GDDR6 console or something like that. they are the heir-apparent to the PC market if they want it now: people are super pissed about discrete GPU prices, and Valve consumer sentiment and product confidence around the Steam Deck are very high. People like it, and honestly the Ally launching has really only emphasized how good a job Valve did with it in every way; the Ally is an advert for the Deck. And people are just not having any of the cost increases in mobos/DDR5/CPUs/GPUs, so Valve could do another "this is pretty much at cost and we just make it up in volume" midrange gaming-PC APU and just absolutely crush the market. And this is almost a unique moment where I don't think most people would even care about the downsides (which will exist).

imagine the igpu-disabled 4700S PS5 harvest chip, but made for Valve, with a big pool of midrange VRAM and good zero-copy support (because it's an APU/console), etc. Series X isn't that much of a loss; you can make a pretty good "console" for $700 or $800 even without subsidy. and they have the buy-in to do the "PC console" thing right now, but it can also just boot Windows (like a 4700S), etc. It would sell like crazy: drop a modern Zen2+RDNA2 APU at $699 or $799 ish with like 6700XT performance or w/e.

If you wanna be super cheeky, have AMD do a semicustom tweak (as MS and Sony do) on RDNA3: have them add the tensor accelerator from CDNA and tie it into that RDNA3 instruction. Do a push for a Valve "Steamline" API that unifies everything so it runs on their poo poo.

people are gonna puke at the idea of no more upgrading memory, but socketed DDR can't make a viable performance APU. You'd need like 8-channel, with an Epyc-sized socket and a bunch of power for data movement. LPDDR5X puts a bunch of channels on the package itself, at super low power, and GDDR6+ continues to be the mack truck of bulk-bandwidth, latency-insensitive memory for fast GPUs, etc. with socketed DDR4/5 you just end up with a crazy product that's much worse than it needs to be.
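
some back-of-the-envelope peak bandwidth numbers to illustrate (a rough sketch using published bus widths and data rates; sustained real-world bandwidth is lower):

```python
# peak theoretical bandwidth = bus width (bytes) x transfer rate
def peak_gb_s(bus_bits: int, mt_s: int) -> float:
    return bus_bits / 8 * mt_s / 1000

configs = {
    "DDR5-5600, dual channel (128-bit)":    (128, 5600),
    "DDR5-5600, 8-channel (512-bit)":       (512, 5600),
    "LPDDR5X-8533, quad channel (256-bit)": (256, 8533),
    "GDDR6 18Gbps, 256-bit":                (256, 18000),
}
for name, (bits, rate) in configs.items():
    print(f"{name}: {peak_gb_s(bits, rate):.0f} GB/s")

# DDR5-5600, dual channel (128-bit): 90 GB/s
# DDR5-5600, 8-channel (512-bit): 358 GB/s
# LPDDR5X-8533, quad channel (256-bit): 273 GB/s
# GDDR6 18Gbps, 256-bit: 576 GB/s
```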

AMD and Intel are both doing cache stacking on their performance APUs soon, with large high-power packages (120W+). AMD has quad-channel memory and stacked cache on the Strix Halo SKU, so it's gonna be a bigboi SKU; the cache will amplify the quad-channel memory bus's already significant bandwidth, and that's how they're getting to performance APUs. But it's gonna cost a lot and come with downsides too: that's a multichip package with stacked cache on every unit. Soldered GDDR6+ and stacked LPDDR5X just pull sooo much less energy for a given amount of data movement, even before you get to cost. It sucks, but computronium memory is not socketed, I guess.

Paul MaudDib fucked around with this message at 04:22 on Jul 1, 2023

hobbesmaster
Jan 28, 2008

Dr. Video Games 0031 posted:

They probably went a bit too hard on geometric detail in the environments and shadow resolution, as much as I like the shadows in this game.

Looking forward to the heartfelt conversation in sin’s core before Yoship transforms into Tanaka’s final aeon.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Have there been any good comparisons between MetalFX vs DLSS and FSR? I know there was a little bit with RE2 but now I’m curious if there’s anything more recent.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

SourKraut posted:

Have there been any good comparisons between MetalFX vs DLSS and FSR? I know there was a little bit with RE2 but now I’m curious if there’s anything more recent.

realistically the metal porting toolkit is simply not good enough even to evaluate the merits of metalfx: with a ~halved framerate (due to thunking overhead) it isn't getting as many temporal samples as it should, and temporal algorithms in general get glitchier with very low input framerates (and very low input resolutions). It is a "holy poo poo guys, the game pretty much already works on the compat layer, just do the tweaks to make it run ok" message from apple to devs, not a final product.
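
as a toy illustration of why low input framerates starve temporal techniques, here's a generic exponential-accumulation sketch in numpy (this is the common TAA-style idea, not MetalFX's actual internals):

```python
import numpy as np

# each frame blends a noisy jittered sample into a history buffer; fewer frames
# of valid history (low framerate, frequent disocclusions) = worse estimate
rng = np.random.default_rng(0)
alpha, truth, trials = 0.1, 0.5, 10_000

for frames in (60, 30, 15):                      # history length before motion resets it
    errs = []
    for _ in range(trials):
        history = rng.random()                   # disocclusion: history starts as garbage
        for _ in range(frames):
            sample = truth + rng.normal(0, 0.2)  # noisy jittered per-frame sample
            history = (1 - alpha) * history + alpha * sample
        errs.append(abs(history - truth))
    print(f"{frames:>2} frames of history -> mean error {np.mean(errs):.3f}")
```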

you need actual games (not tech demos) that are properly ported onto each API, and MacOS just doesn't have any games with Metal that matter other than RE:Village (which is the one iirc).

not having at least vulkan support makes them a nonstarter, moltenvk+porting kit seemingly runs too slow, gg for any adoption.

I do think Vision Pro/etc are probably using MetalFX too. But that's not PC gaming, that's AR experiences or pro work.

Also do bear in mind there's both spatial and temporal MetalFX!

Paul MaudDib fucked around with this message at 05:15 on Jul 1, 2023

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Yeah, MetalFX "Quality" mode uses temporal upscaling while the "Performance" mode uses spatial, apparently. World of Warcraft would be another candidate, since it's been ported natively to Metal, but it currently just has FSR 1.0.

shrike82
Jun 11, 2005

https://twitter.com/VideoCardz/status/1675041667924066305?s=20

would be neat if this became a thing
it's impossible to install a new SSD onto my mobo without removing my 4090 given how large it is

Dr. Video Games 0031
Jul 17, 2004

shrike82 posted:

https://twitter.com/VideoCardz/status/1675041667924066305?s=20

would be neat if this became a thing
it's impossible to install a new SSD onto my mobo without removing my 4090 given how large it is

This is only a thing with the 4060 Ti because it uses only eight lanes. You probably won't see it with higher-end cards since you'd have to do lane-sharing shenanigans with them.

Edit: The way this appears to work is that four of the PCIe lanes are connected directly to the m.2 slot. So the GPU and SSD have no direct access to each other, for those wondering. In order for this to work, you will have to bifurcate the x16 slot into x8/x8 so your motherboard can detect the SSD and GPU as separate devices.
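
For a sense of the lane budget, here's some rough math (assuming PCIe 4.0 rates and counting only the 128b/130b line-encoding overhead):

```python
# PCIe 4.0: 16 GT/s per lane, 128b/130b encoding -> ~1.97 GB/s per lane
GB_S_PER_LANE = 16e9 * (128 / 130) / 8 / 1e9

for device, lanes in [("GPU (x8)", 8), ("M.2 SSD (x4)", 4), ("unused (x4)", 4)]:
    print(f"{device}: {lanes * GB_S_PER_LANE:.1f} GB/s")

# GPU (x8): 15.8 GB/s   -- the 4060 Ti is an x8 card anyway, so nothing is lost
# M.2 SSD (x4): 7.9 GB/s -- full bandwidth for a PCIe 4.0 drive
# unused (x4): 7.9 GB/s  -- presumably the unwired half of the second x8 group
```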

Dr. Video Games 0031 fucked around with this message at 08:55 on Jul 1, 2023

repiv
Aug 13, 2009

Paul MaudDib posted:

you need actual games (not tech demos) that are properly ported onto each API, and MacOS just doesn't have any games with Metal that matter other than RE:Village (which is the one iirc).

by god that's kojima's music

https://www.youtube.com/watch?v=8v5zQv3-H-0&t=39s

but yeah the porting toolkit is overall just baffling as a strategic move, i don't think developers needed convincing that their games could technically run on a mac, it's just not worth their while to produce a native port

Kibner
Oct 21, 2008

Acguy Supremacy
It's for showing devs where the bottlenecks are in their code to make porting to AS take less time and effort. Not replacing the porting effort or showing that it's possible.

repiv
Aug 13, 2009

i don't follow how observing bottlenecks under ISA/API emulation is useful when you ultimately have to target native ARM and native Apple APIs regardless

apple tried to present it that way in the session but their logic was more or less

1. run your game under emulation
2. profile your game running under emulation
3. observe that the emulation results in bad performance
4. throw emulation in the trash and make a native port

so we went from "gently caress you make a native port" to "gently caress you make a native port, with extra steps"

Falcorum
Oct 21, 2010

Kibner posted:

It's for showing devs where the bottlenecks are in their code to make porting to AS take less time and effort. Not replacing the porting effort or showing that it's possible.

That's not going to make it take less time and effort since any bottlenecks you identify might be nonsense when you're doing a native port, and even if they aren't it won't be a 1-1 relation. Profilers also exist and are significantly simpler to use than having two versions of a game running under different backends and trying to compare them.

As a dev, this "porting toolkit" is utterly bizarre and I can't see who it's for at all. It's not actually for porting so companies that aren't porting their games to Macs already won't care, and companies that are will have better tools anyway.

Falcorum fucked around with this message at 14:08 on Jul 1, 2023

repiv
Aug 13, 2009

people are getting some mileage out of using the porting toolkit as a proton equivalent for mac, but the future of that is already looking shaky since rosetta 2 can't run AVX code

the consoles have full-rate AVX now so it's going to creep into more games, TLOU already doesn't run under rosetta because it requires AVX
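
for illustration, here's roughly how you'd gate an AVX path at runtime — a sketch using the third-party py-cpuinfo package (pip install py-cpuinfo), not what any shipping game actually does:

```python
# rosetta 2 doesn't expose AVX, so a binary that hard-requires it (like TLOU)
# simply can't run there; a runtime check is how you'd branch to a fallback
import cpuinfo  # third-party: py-cpuinfo

flags = cpuinfo.get_cpu_info().get("flags", [])
if "avx" in flags:
    print("AVX available: take the wide SIMD path")
else:
    print("no AVX (e.g. x86 code under rosetta 2): need an SSE fallback")
```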

change my name
Aug 27, 2007
Probation
Can't post for 3 hours!

MarcusSA posted:

I have both as well and I agree with this. I have been using my Ally more though because I’ve been playing games that don’t run on the deck.

I've tested the Ally before but would still choose the Deck if I could only own one of the two because a) I already own a gaming PC, and b) 4-6 hours of battery life for indie and emulated games is great and the Ally can't get anywhere close.

steckles
Jan 14, 2006

Falcorum posted:

As a dev, this "porting toolkit" is utterly bizarre and I can't see who it's for at all. It's not actually for porting so companies that aren't porting their games to Macs already won't care, and companies that are will have better tools anyway.

I figured it’d work as a tool to show developers that there might be enough Mac users to justify a real port. “Look at these people jumping through hoops to play your game. If these people exist, maybe the market for Mac gamers is big enough to justify a real port.” It’s not like it’s a small installed base, and the GPU in the M1/M2 is good enough that it wouldn’t be a total embarrassment on the performance front.

On the flip side, perhaps leadership at Apple is nervous about games and this is a weird baby step dreamed up to gauge interest while not offending the ardent anti-game contingent too much. “Look at these people jumping through hoops to play games on our computers. Maybe there are enough dollars there to justify more support from us.”

Don Dongington
Sep 27, 2005

#ideasboom
College Slice
I'm in a weird spot at the moment.

Looking to spend some of my tax return to upgrade my 980ti finally, but the options have me scratching my head.
(AU pricing):
RTX 4060: $499
RX 6750: $529
RX 6800: $779
RX 6800 XT: $860
RTX 4070: $899

System is an R5 5600/B550/32GB DDR4-3200. 1440p freesync monitor.
Have a series X that I've mostly used for recent games but historically have been more of a PC gamer and always had a reasonable gaming PC.

Usually I buy an x70 or x80 series nVidia card and get about 5+ years out of it, and I'm not above spending $800-900 if that's what it takes; however, I don't see the 4070 as a long-term player, so I've been considering going RDNA2 or an 8GB card as a stopgap.

6750 seems like a better option here but I may be missing something.

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.
the 6700xt (cheapest on PCPartPicker is $484) or even the 6700 ($439, if you really want to save) would be better options at the low end. there is barely any meaningful difference between the 6750xt and the 6700xt, and the 6750 non-xt does not exist. the caveat is the usual amd one of worse ray-tracing performance and no dlss, but you get more vram.

if you're willing to spend $900 on a card you should probably just get the 4070. the 6800xt is pretty close in performance, just with worse raytracing, and 12gb vs 16gb of vram isn't too important at 1440p for now. idk that the 6800 really makes sense as an option in between the lower end and the 4070

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
My top two options would be considering doing nothing and checking the prices on used 3080s.

genericnick
Dec 26, 2012

I've had some issues with the HDMI connection on my new-ish 4080. Specifically, waking from standby causes windows to shrink and accumulate at the top left of the screen. Also, rarely, there's a loss of signal for a second. This doesn't occur with DisplayPort, but that's only 1.4, so it's not great for 4K HDR. I already tried using Display Driver Uninstaller in safe mode and replacing the cable. Since the card only has one HDMI port I can't rule out a hardware issue, but is there anything else I should try or look at?

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
What's the monitor? It sounds like the monitor isn't telling the GPU it's present when it goes to sleep, so the GPU defaults to dropping to some lower resolution when it wakes up with no monitor detected.

Also, are you on Win 11? The way it handles disconnected monitors is WAY better.

genericnick
Dec 26, 2012

K8.0 posted:

What's the monitor? It sounds like the monitor isn't telling the GPU it's present when it goes to sleep, so the GPU defaults to dropping to some lower resolution when it wakes up with no monitor detected.

Also, are you on Win 11? The way it handles disconnected monitors is WAY better.

It's the Samsung Odyssey Neo G7. Seems like there's no newer firmware for it. But yeah, I'm still on 10. Might be time to try out 11.

change my name
Aug 27, 2007
Probation
Can't post for 3 hours!

genericnick posted:

It's the Samsung Odyssey Neo G7. Seems like there's no newer firmware for it. But yeah, I'm still on 10. Might be time to try out 11.

Upgrading to 11 solved the similar issues I was having running a 4K monitor next to a 1440p one.

Don Dongington
Sep 27, 2005

#ideasboom
College Slice

K8.0 posted:

My top two options would be considering doing nothing and checking the prices on used 3080s.

$850-900. I think the days of the cheap used 3080 are already over here.

change my name
Aug 27, 2007
Probation
Can't post for 3 hours!
Used 3060s have dropped to $200 on eBay; that's the play over a new 6600 for around the same price, right?

Don Dongington
Sep 27, 2005

#ideasboom
College Slice
I feel like the 3060/6600 aren't a great proposition unless you're stuck on 1080p.

A 6700 or 3060 Ti is going to have better legs and handle 1440p far better for not much more money.

SlowBloke
Aug 14, 2017

genericnick posted:

This doesn't occur with DisplayPort, but that's only 1.4, so it's not great for 4K HDR.

DisplayPort 1.4 can handle 8K resolutions, so 4K HDR will do fine.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



SlowBloke posted:

DisplayPort 1.4 can handle 8K resolutions, so 4K HDR will do fine.

It can handle 8K @ 60 Hz with DSC; 4K HDR @ 60 Hz is well within DP 1.4's ability, but if they're trying to push 4K HDR @ 120 Hz or higher, they might run into problems due to bandwidth restrictions.
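
Back-of-the-envelope math, counting raw pixel rate only (blanking overhead makes the real requirement somewhat higher):

```python
# DP 1.4: 4 lanes x 8.1 Gbps (HBR3), 8b/10b encoding -> 25.92 Gbps of payload
DP14_GBPS = 4 * 8.1 * 0.8

def needed_gbps(w, h, hz, bpc=10):  # bpc=10 for HDR, 3 color channels
    return w * h * hz * bpc * 3 / 1e9

for hz in (60, 120, 144):
    need = needed_gbps(3840, 2160, hz)
    verdict = "fits" if need < DP14_GBPS else "needs DSC"
    print(f"4K {hz} Hz 10-bit: {need:.1f} Gbps -> {verdict}")

# 4K 60 Hz 10-bit: 14.9 Gbps -> fits
# 4K 120 Hz 10-bit: 29.9 Gbps -> needs DSC
# 4K 144 Hz 10-bit: 35.8 Gbps -> needs DSC
```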

Inept
Jul 8, 2003

genericnick posted:

I've had some issues with the HDMI connection on my new-ish 4080. Specifically, waking from standby causes windows to shrink and accumulate at the top left of the screen. Also, rarely, there's a loss of signal for a second. This doesn't occur with DisplayPort, but that's only 1.4, so it's not great for 4K HDR. I already tried using Display Driver Uninstaller in safe mode and replacing the cable. Since the card only has one HDMI port I can't rule out a hardware issue, but is there anything else I should try or look at?

Is the cable specifically rated for 4K or better? If both cables you tried were old HDMI 1.4 ones, they're going to have issues.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Don Dongington posted:

$850-900. I think the days of the cheap used 3080 are already over here.

That's a shame. In the US a 3080 is ~2/3 as much as a 4070 for a slightly faster GPU, definitely a better buy.

I don't think you have a great choice. It's going to come down to either buying the cheapest GPU you can live with and hanging on for a few years hoping things get better, or spending a lot on the 4070 and probably being dissatisfied with it a lot sooner than you should be with a GPU that expensive.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

https://twitter.com/VideoCardz/status/1675041667924066305?s=20

would be neat if this became a thing
it's impossible to install a new SSD onto my mobo without removing my 4090 given how large it is

I love the idea of the QNAP 10GbE + dual M.2 NVMe card (with a PCIe switch, so no bifurcation needed) to make the most of your one slot. There really is an unexplored market for "combination cards", both bifurcated and otherwise. (SFP+ plz)

https://www.qnap.com/en-us/product/qm2-2p410g1t

And yeah, especially with GPUs only taking x8 lanes, doing something else with the slot would be cool. Network too, Thunderbolt, etc. The downside being that the thermals are probably pretty intense there too.

genericnick
Dec 26, 2012

Inept posted:

Is the cable specifically rated for 4K or better? If both cables you tried were old HDMI 1.4 ones, they're going to have issues.

Yeah, I double checked. Both were rated for 8k.

Lord Stimperor
Jun 13, 2018

I'm a lovable meme.

K8.0 posted:

That's a shame. In the US a 3080 is ~2/3 as much as a 4070 for a slightly faster GPU, definitely a better buy.

I don't think you have a great choice. It's going to come down to either buying the cheapest GPU you can live with and hanging on for a few years hoping things get better, or spending a lot on the 4070 and probably being dissatisfied with it a lot sooner than you should be with a GPU that expensive.

Could it be that used 3080s are more available in the US because mining was more prevalent there? The last time someone could profitably GPU mine in my area was like 2016 or 2017 (at least that I can recall).

Dr. Video Games 0031
Jul 17, 2004

The mining crash hit the Chinese GPU market harder than any other, and used GPUs have become dirt cheap there. I'm pretty sure they're exporting a lot of them to America instead of selling them at rock-bottom prices in China, though many of those are ending up rebranded and fraudulently sold as new on Newegg, like this "MLLSE" card. That poo poo is 100% a used mining GPU that had its cooler replaced and is being sold as new. I've seen that exact same cooler on several other cards from fake brands.

FlamingLiberal
Jan 18, 2009

Would you like to play a game?



Apparently I missed the fact that Microsoft signed an exclusivity deal with AMD so that Starfield won’t have DLSS support? That’s incredibly stupid.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

FlamingLiberal posted:

Apparently I missed the fact that Microsoft signed an exclusivity deal with AMD so that Starfield won’t have DLSS support? That’s incredibly stupid.

Lack of DLSS isn’t confirmed, but not unlikely given their past track record.

repiv
Aug 13, 2009

three different outlets have asked AMD whether or not their contract blocks competitors' middleware now, and all of them got vague non-responses
