Cygni
Nov 12, 2005

raring to post

Scoss posted:

Is the upcoming AMD reveal likely to be limited only to bigass flagship enthusiast cards like Nvidia?

Probably unlikely that they will reveal anything that would disrupt the $300-400 tier right?


AMD has Navi33 (our friend the Hotpink Blowfish), which is expected to be a monolithic 4096-shader part with an 8x interface and 8GB of RAM. But I don't think anyone has any desire to launch any low-end parts at the moment, so it's possible we don't see it for a while.

Truga
May 4, 2014
Lipstick Apathy
yeah, i think the low end agenda for next year's gonna be "if you want low end, there's gonna be $250 3080s on ebay soon" lol

remember $200 radeon 290s? lmfao

MarcusSA
Sep 23, 2007

AMD is going to release a $299 card with 16GB and kill whatever Intel just put out

Chainclaw
Feb 14, 2009

It looks like the RTX A6000 is going for around $5,000. At that price, I'm wondering if it makes more sense to get a Mac Pro. Didn't someone in this thread mention that the Mac Pro shares RAM between the GPU and CPU? But I'm getting confusing results when searching: the Mac Pro builder makes it look like it has separate VRAM / regular RAM, but Google claims it's shared?

MarcusSA
Sep 23, 2007

Chainclaw posted:

It looks like the RTX A6000 is going for around $5,000. At that price, I'm wondering if it makes more sense to get a Mac Pro. Didn't someone in this thread mention that the Mac Pro shares RAM between the GPU and CPU? But I'm getting confusing results when searching: the Mac Pro builder makes it look like it has separate VRAM / regular RAM, but Google claims it's shared?

AFAIK it's all unified RAM at this point.

The Clap
Sep 21, 2006

currently training to kill God

Chainclaw posted:

It looks like the RTX A6000 is going for around $5,000. At that price, I'm wondering if it makes more sense to get a Mac Pro. Didn't someone in this thread mention that the Mac Pro shares RAM between the GPU and CPU? But I'm getting confusing results when searching: the Mac Pro builder makes it look like it has separate VRAM / regular RAM, but Google claims it's shared?

The current Mac Pro (released Dec 2019) is one of the last remaining Intel holdouts with separate CPU & GPU. It looks like Apple is holding off until they're able to put together an "Extreme" version of the M2 with 48 cores before they update the Mac Pro to their ARM chips with unified RAM. All of their laptops, the Mac Studio, and the current iMac have already moved to the ARM chips with unified RAM, though.

Chainclaw
Feb 14, 2009

The Clap posted:

The current Mac Pro (released Dec 2019) is one of the last remaining Intel holdouts with separate CPU & GPU. It looks like Apple is holding off until they're able to put together an "Extreme" version of the M2 with 48 cores before they update the Mac Pro to their ARM chips with unified RAM. All of their laptops, the Mac Studio, and the current iMac have already moved to the ARM chips with unified RAM, though.

Someone pointed me at the Studios and they do seem to have integrated memory. The main gotcha, I think, is that Nvidia puts out a ton of cool software that only runs on Nvidia GPUs, but building an Nvidia machine would probably cost like $10k, while an equivalent Mac Studio would be like $7k. I think more stuff works on Nvidia besides just first-party Nvidia software, too.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Scoss posted:

Is the upcoming AMD reveal likely to be limited only to bigass flagship enthusiast cards like Nvidia?

Probably unlikely that they will reveal anything that would disrupt the $300-400 tier right?

AMD has Navi 33 for midrange (7700XT or 7600XT tier, depending on how they do it), but they opted to push it back to next year and launch the high-end parts first. I think the implication is they don't want to be fighting miner inventory either... like I said, I think that one's lose-lose: if you undercut pricing, then miners will adapt and undercut back, and $200 is better than $0 for them.

AMD has the benefit that they don't have to ditch a bunch of their own last-gen inventory on top of that, though... ideally, NVIDIA really needs to time it so that they sell through at about the same time as miner inventory starts to clear, and I'm not sure that's possible; they'd really have to move a lot of inventory.

I'm guessing we don't see Navi 33 in much volume until Q2 next year. It's hard to say when they'll do the announcement/launch because timelines are so squishy. They could honestly launch it as soon as CES, but in that case I'd expect a very protracted launch, with reviews releasing in mid/late Feb and cards on the market no earlier than late Feb/early March, with very little inventory in March and things really firming up in April. Or they could do the announcement in Feb and a firmer launch in mid or late March, or even slide things back a little further and announce in March. Those two scenarios (CES announce vs. Feb/March announce) would be my guesses at this point, but there aren't rumors on this that I've seen, just my guess from entrail-reading.

Chainclaw posted:

It looks like the RTX A6000 is going for around $5,000. At that price, I'm wondering if it makes more sense to get a Mac Pro. Didn't someone in this thread mention that the Mac Pro shares RAM between the GPU and CPU? But I'm getting confusing results when searching: the Mac Pro builder makes it look like it has separate VRAM / regular RAM, but Google claims it's shared?

The Mac Pro hasn't been updated with M1/M2 processors yet; it still uses Intel (not sure if it's Skylake-SP or Ice Lake-SP at this point), so no shared RAM there. Right now the Mac Studio with the M1 Ultra (2x M1 Max chiplets on a package) is the beefiest thing Apple offers.

But yes, the M1/M2 series do share RAM between all the processors/coprocessors... the CPU, GPU, and NPU are all attached to a single memory controller and have their own ports, but not all the ports can saturate the whole controller. So you are in a situation where the CPU could potentially touch 64GB of memory, but it caps out at 1/4 of the bandwidth the GPU could access.

Note that the ML ecosystem isn't highly evolved for Apple yet... especially for training; inference is a lot easier (I know there's Stable Diffusion running on M1). A lot of the tooling basically still assumes NVIDIA since that's the industry standard. And out of all Apple's performance claims, the GPU ones are the most dubious... I don't think many actual benchmarks have borne out the "3090 performance" claim. But it's hard to validate because not a lot of software runs on anything besides NVIDIA anyway.

Be sure to look at what types of math (bfloat16, etc.) your particular training approach uses and check that Apple supports those with an acceptable level of performance, because that varies too.
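
If you want to sanity-check that before buying, a tiny probe like the sketch below works. It assumes a reasonably recent PyTorch build with the MPS backend, and the mps_supports helper is just something made up for illustration; it only tells you whether a dtype runs at all, not how fast.

code:
# Minimal sketch: probe whether PyTorch's Apple-GPU (MPS) backend is present
# and whether a given dtype (e.g. bfloat16) can actually execute a kernel.
# Availability only -- this says nothing about performance.
import torch

def mps_supports(dtype):
    if not torch.backends.mps.is_available():
        return False
    try:
        a = torch.randn(256, 256, device="mps", dtype=dtype)
        (a @ a).sum().item()  # force the op to run, not just be queued
        return True
    except (RuntimeError, TypeError):
        return False

for dtype in (torch.float32, torch.float16, torch.bfloat16):
    print(dtype, "->", mps_supports(dtype))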

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Could orders please open up :/ sigh.

What 4090 models are y'all getting? I have no idea what the preferred SKUs are.

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop

repiv posted:

brief XeSS comparison in the new DF direct

https://www.youtube.com/watch?v=tb3P_Tdn2HA&t=1985s

motion handling "XeSS [on non-intel hardware] is still pretty good here I would say" voiced over this image:



That's not "pretty good", it's hilariously awful ghosting. Might as well be playing Snake! with those motion trails...

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Harik posted:

motion handling "XeSS [on non-intel hardware] is still pretty good here I would say" voiced over this image:



That's not "pretty good", it's hilariously awful ghosting. Might as well be playing Snake! with those motion trails...

That’s very reminiscent of what you get in Wonderlands with FSR2. Sadly the latest update broke the DLSS hack.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
Is there any competitive advantage to running CS:GO or Valorant or whatever at 240+ fps? Maybe this is more Games forum related, but Hardware Unboxed and Gamers Nexus talk about playing games at 300 FPS, and I really have to wonder whether this stuff makes or breaks noscope headshot performance on a noticeable level, or whether any competitive advantage it brings is lost to latency when playing online.

njsykora
Jan 23, 2012

Robots confuse squirrels.


buglord posted:

Is there any competitive advantage to running CS:GO or Valorant or whatever at 240+ fps? Maybe this is more Games forum related, but Hardware Unboxed and Gamers Nexus talk about playing games at 300 FPS, and I really have to wonder whether this stuff makes or breaks noscope headshot performance on a noticeable level, or whether any competitive advantage it brings is lost to latency when playing online.

Higher framerate = more responsive, and when you're in a split-second reaction shooter like CS:GO or Valorant, that stuff can matter when you're not in your mid-30s with wrists of dust.

Chainclaw
Feb 14, 2009

njsykora posted:

Higher framerate = more responsive, and when you're in a split-second reaction shooter like CS:GO or Valorant, that stuff can matter when you're not in your mid-30s with wrists of dust.

Don't most modern games decouple render framerate from input polling framerate? If you're rendering at, say, 300 FPS, chances are the game isn't polling input at 300 FPS.

If you're playing Quake 2 or something built on ancient tech, then yeah, rendering and gameplay logic loops will be coupled.
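
For reference, the decoupled structure being described usually looks something like this minimal sketch: a fixed-timestep simulation with rendering (and often input sampling) free-running on top. It's not any particular engine's code; poll_input, simulate, and render are placeholder callbacks.

code:
# Minimal sketch of a decoupled game loop: the simulation ticks at a fixed
# rate while rendering (and, in many engines, input sampling) runs as fast
# as the hardware allows. Not taken from any real engine.
import time

TICK_RATE = 64               # e.g. a 64 Hz simulation/tick rate
TICK_DT = 1.0 / TICK_RATE

def game_loop(poll_input, simulate, render, running):
    accumulator, previous = 0.0, time.perf_counter()
    while running():
        now = time.perf_counter()
        accumulator += now - previous
        previous = now

        events = poll_input()          # sampled once per rendered frame
        while accumulator >= TICK_DT:  # fixed-step simulation catches up
            simulate(TICK_DT, events)
            accumulator -= TICK_DT
        render(accumulator / TICK_DT)  # interpolate between sim states

Even in this structure, what you see (and when your input gets sampled) still tracks the render framerate, so input lag still scales with FPS.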

Dr. Video Games 0031
Jul 17, 2004

Chainclaw posted:

Don't most modern games decouple render framerate from input polling framerate? If you're rendering at, say, 300 FPS, chances are the game isn't polling input at 300 FPS.

If you're playing Quake 2 or something built on ancient tech, then yeah, rendering and gameplay logic loops will be coupled.

Rendering is an inherent part of input lag since input lag is measured from an action being input to the display of that action. The higher the frame rate, the sooner that action will be shown on screen and the more responsive a game will feel. For an extreme example, try capping your favorite game to 30 fps and see how different it feels.
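
To put rough numbers on just the frame-time slice of that chain (ignoring mouse, game logic, and display latency): on average an input lands mid-frame, so it waits about half a frame before the next frame even starts. A quick back-of-the-envelope sketch:

code:
# Back-of-the-envelope: the frame-time contribution to input lag at various
# framerates. Average wait is ~half a frame; worst case is a full frame.
for fps in (30, 60, 144, 240, 300):
    frame_ms = 1000.0 / fps
    print(f"{fps:>3} fps: frame {frame_ms:5.2f} ms, "
          f"avg wait {frame_ms / 2:5.2f} ms, worst case {frame_ms:5.2f} ms")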

Nfcknblvbl
Jul 15, 2002

I remember in Quake 3 there were certain ledges you could only jump to if your frame rate was capped at 125 fps. 333 fps lets you jump even higher, but online play limits you to 125.
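
For anyone wondering why framerate can change jump height at all: with a fixed-step integrator, the discrete jump arc lands slightly off the continuous-time apex, and by how much depends on the timestep. The sketch below is not Quake 3's actual movement code (Q3's quirk also involves frame times being rounded to whole milliseconds), and the jump/gravity constants are made up; it just shows the general effect.

code:
# Toy sketch: semi-implicit Euler integration of a jump at different
# framerates. The apex depends on the timestep, which is the flavor of
# effect that makes some ledges framerate-dependent. Constants are made up.
def jump_apex(dt, jump_velocity=270.0, gravity=800.0):
    height, velocity, peak = 0.0, jump_velocity, 0.0
    while True:
        velocity -= gravity * dt   # apply gravity first...
        height += velocity * dt    # ...then move (semi-implicit Euler)
        if height <= 0.0:          # back on the ground
            return peak
        peak = max(peak, height)

for fps in (60, 125, 333):
    print(f"{fps:>3} fps -> apex {jump_apex(1.0 / fps):.2f} units")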

Kibner
Oct 21, 2008

Acguy Supremacy
Yes, it does make a difference. If nothing else, opposing players will become visible from around corners earlier the higher your framerate is. The greater your fps is over your opponent's, the more time you have to react before they do.

This is in addition to seeing your physical actions correspond to in-game actions more quickly, leading to you not having to overcorrect as much since there is less "lag time" between performing your action and seeing the result on the screen.

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop
Are we polling input now or just getting events as they come in? There was a big to-do about implementing a type of nested I/O wait on Linux to better emulate how Windows event loops work for Wine gaming, so I thought everything was event-driven on modern engines.

Chainclaw
Feb 14, 2009

Dr. Video Games 0031 posted:

Rendering is an inherent part of input lag since input lag is measured from an action being input to the display of that action. The higher the frame rate, the sooner that action will be shown on screen and the more responsive a game will feel. For an extreme example, try capping your favorite game to 30 fps and see how different it feels.

Yeah, in that range, but the question was whether it mattered at 300 FPS. If the game is polling input at 30 or 60 FPS, running 300 FPS locally won't give you a competitive edge. Sure, it will look way nicer, and that's what I care more about, but the question was whether it helped competitive gamers.

Dr. Video Games 0031
Jul 17, 2004

Chainclaw posted:

Yeah, in that range, but the question was whether it mattered at 300 FPS. If the game is polling input at 30 or 60 FPS, running 300 FPS locally won't give you a competitive edge. Sure, it will look way nicer, and that's what I care more about, but the question was whether it helped competitive gamers.

Polling rates in competitive games are much faster than 30 or 60 fps. Input lag is measurably better at higher frame rates in most popular competitive games, though there have been a few oddball examples like Apex Legends (which is still using the Source engine, god bless it).

kliras
Mar 27, 2021
embargoes on Arc 7 up tomorrow

https://twitter.com/VideoCardz/status/1577400695728312348

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.

Paul MaudDib posted:

I'm guessing we don't see Navi 33 in much volume until Q2 next year. It's hard to say when they'll do the announcement/launch because timelines are so squishy. They could honestly launch it as soon as CES, but in that case I'd expect a very protracted launch, with reviews releasing in mid/late Feb and cards on the market no earlier than late Feb/early March, with very little inventory in March and things really firming up in April. Or they could do the announcement in Feb and a firmer launch in mid or late March, or even slide things back a little further and announce in March. Those two scenarios (CES announce vs. Feb/March announce) would be my guesses at this point, but there aren't rumors on this that I've seen, just my guess from entrail-reading.

there are rumours about an expected CES announcement but nothing too solid

it'd make sense since they should be announcing a bunch of laptop chips like they always do & we already know that Navi 33 mobile is going to get a big push for laptops

TheScott2K
Oct 26, 2003

I'm just saying, there's a nonzero chance Trump has a really toad penis.
Got my 1660 for sale if anyone's looking for a starter GPU for their kid's fortnite box or something. You won't be able to reply though. I have PMs here and my twitter is @adequate_scott

https://forums.somethingawful.com/showthread.php?threadid=4013976

TheScott2K fucked around with this message at 23:36 on Oct 4, 2022

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Chainclaw posted:

Yeah, in that range, but the question was whether it mattered at 300 FPS. If the game is polling input at 30 or 60 FPS, running 300 FPS locally won't give you a competitive edge. Sure, it will look way nicer, and that's what I care more about, but the question was whether it helped competitive gamers.

It's a measurable difference, but not really a meaningful one. Still, if you're a competitive gamer and your livelihood depends on how fast you click people's heads, it'd be very stupid not to take every millisecond you can possibly get.

hobbesmaster
Jan 28, 2008

Dr. Video Games 0031 posted:

Polling rates in competitive games are much faster than 30 or 60 fps. Input lag is measurably better at higher frame rates in most popular competitive games, though there have been a few oddball examples like Apex Legends (which is still using the Source engine, god bless it).

I saw the other day that there are apparently arguments about whether the Apex Legends engine is really still the Source engine, because it's so heavily modified.

I’ll just confuse things further by claiming it’s the Quake I engine :)

repiv
Aug 13, 2009

respawn made it clear in GDC talks that massive chunks of the engine (rendering, collision, streaming...) were rewritten from scratch as early as titanfall 1, and it probably needed further work to support the big open apex maps

the titanfall/apex engine is source in the same way the modern COD engine is idtech, there's a direct lineage there but it's been ship of theseus'd at this point

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

If it still has the inverse square root function then it’s idtech, easy call.

repiv
Aug 13, 2009

if you want to see what a battle royale actually looks like on vanilla source just look to CS:GO danger zone, which feels like a janky mod that's barely keeping the engine from catching fire despite the map and player count being tiny by usual BR standards

Lackmaster
Mar 1, 2011
The game engine of Theseus

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

buglord posted:

Is there any competitive advantage to running CS:GO or Valorant or whatever at 240+ fps? Maybe this is more Games forum related, but Hardware Unboxed and Gamers Nexus talk about playing games at 300 FPS, and I really have to wonder whether this stuff makes or breaks noscope headshot performance on a noticeable level, or whether any competitive advantage it brings is lost to latency when playing online.

It's always going to be a meaningful advantage against peers. Frame rate matters a lot. The higher the framerate goes, the smaller the advantage is, but even the ~1 ms you gain going from 240 to 300 is going to give you a measurable performance increase.


Chainclaw posted:

Yeah, in that range, but the question was whether it mattered at 300 FPS. If the game is polling input at 30 or 60 FPS, running 300 FPS locally won't give you a competitive edge. Sure, it will look way nicer, and that's what I care more about, but the question was whether it helped competitive gamers.

I don't think there's a game anyone takes seriously that polls at 60 Hz. In fact, AFAIK every single game polls mouse input at framerate, because the experience of decoupling mouse input from visual output would feel heinous, with horrible stutter inevitably resulting. Even if a game did, 300 FPS would still be a significant advantage, because you'd still be seeing things at least a millisecond faster than someone at a lower refresh rate. It'd just be less consistent in terms of reacting properly.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
Isn't polling rate controlled by the OS and mouse driver, not the game?

Shumagorath
Jun 6, 2001

K8.0 posted:

It's always going to be a meaningful advantage against peers. Frame rate matters a lot. The higher the framerate goes, the smaller the advantage is, but even the ~1 ms you gain going from 240 to 300 is going to give you a measurable performance increase.

I don't think there's a game anyone takes seriously that polls at 60 Hz. In fact, AFAIK every single game polls mouse input at framerate, because the experience of decoupling mouse input from visual output would feel heinous, with horrible stutter inevitably resulting. Even if a game did, 300 FPS would still be a significant advantage, because you'd still be seeing things at least a millisecond faster than someone at a lower refresh rate. It'd just be less consistent in terms of reacting properly.
Do you have any research supporting that 1 ms makes a measurable difference? Last I looked into it, Olympians have simple reaction times (starting pistol, ruler test, etc.) on the order of 0.15 s. 1 ms would not have a measurable impact on a far more complex action like IFF -> Track -> Fire.

Dr. Video Games 0031
Jul 17, 2004

I would not say that 1 ms on its own has a big impact, but competitive gamers seek to optimize every single part of the chain. Shave 1 ms off here, 2 ms off there, etc., and eventually it will add up to a pretty big difference.

And reaction times are an entirely different story. Reacting to a starting pistol and starting a sprint is a very different type of reaction than seeing something on a screen and flicking a mouse. These are not comparable actions, and I would suspect that this "far more complex" action can be done much faster than starting a sprint. And latency of any kind doesn't overlap with reaction times; it adds to them.

KillHour
Oct 28, 2007


Shumagorath posted:

Do you have any research supporting that 1 ms makes a measurable difference? Last I looked into it, Olympians have simple reaction times (starting pistol, ruler test, etc.) on the order of 0.15 s. 1 ms would not have a measurable impact on a far more complex action like IFF -> Track -> Fire.

This comes up a lot, and yes. Latency is additive. Going from a 3 ms device latency to a 1 ms device latency still removes 2 ms from the total round-trip time, regardless of whether that total time is 5 ms or 500 ms.
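
A toy budget makes the point; every number below is invented, only the additivity matters.

code:
# Toy end-to-end latency budget (numbers are made up): every stage adds, so
# shaving any one stage shifts the whole total by exactly that amount.
budget_ms = {
    "mouse/USB": 1.0,
    "game input sampling": 2.0,
    "simulation + render": 4.0,
    "display scanout + pixel response": 5.0,
    "human reaction": 180.0,
}
total = sum(budget_ms.values())
print(f"total: {total:.1f} ms; with a 2 ms faster mouse: {total - 2.0:.1f} ms")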

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
A follow-up question here is: does additional FPS matter beyond your refresh rate?

hobbesmaster
Jan 28, 2008

gradenko_2000 posted:

A follow-up question here is: does additional FPS matter beyond your refresh rate?

It depends on the game engine. Usually not these days.

Either way, if you’re talking 144 (or 141 after “recommendations”) then it really doesn’t matter. If it does matter you’ll have a coach…

Shumagorath
Jun 6, 2001
I’m arguing that 5-10 ms of lag (input, network, whatever) is less impactful than many here are claiming it is, outside of very simple games. Say your game is two players having a dot appear on screen and the fastest on-target click wins. In that scenario, every millisecond matters. In a game where both players have free movement in 3-space, audio cues, materials, etc., quality of play is going to dominate minor latency gaps in everything but hypothetical quickdraws.

KillHour posted:

This comes up a lot, and yes. Latency is additive. Going from a 3 ms device latency to a 1 ms device latency still removes 2 ms from the total round-trip time, regardless of whether that total time is 5 ms or 500 ms.
Yes, but I’m saying that 10 ms of ping plus 15 ms of input and display lag is still faster than all but the best professionals can even perceive.

Shumagorath fucked around with this message at 01:57 on Oct 5, 2022

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
I’m guessing there aren’t CRTs that are fast enough at 1080p-ish, but if they existed, I wonder if pros would stick to CRTs to get around sample-and-hold blur, which is honestly a big asterisk for high-refresh-rate LCDs.

Dr. Video Games 0031
Jul 17, 2004

Shumagorath posted:

I’m arguing that 5-10 ms of lag (input, network, whatever) is less impactful than many here are claiming it is, outside of very simple games. Say your game is two players having a dot appear on screen and the fastest on-target click wins. In that scenario, every millisecond matters. In a game where both players have free movement in 3-space, audio cues, materials, etc., quality of play is going to dominate minor latency gaps in everything but hypothetical quickdraws.

Yes, but I’m saying that 10 ms of ping plus 15 ms of input and display lag is still faster than all but the best professionals can even perceive.

Input lag is, to an extent, additive with reaction times too. Two people can have the same reaction time of 100 ms, but if one has 10 ms less input lag, then they are going to react faster.

Kibner
Oct 21, 2008

Acguy Supremacy

Shumagorath posted:

Do you have any research supporting that 1 ms makes a measurable difference? Last I looked into it, Olympians have simple reaction times (starting pistol, ruler test, etc.) on the order of 0.15 s. 1 ms would not have a measurable impact on a far more complex action like IFF -> Track -> Fire.

The person with the faster system gets to react with a lag advantage, making them more likely to win reaction-time exchanges against their opponents. So, yes, it would make a measurable impact.
