Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Anime Schoolgirl posted:

if i manage to get the game running at 30fps on my deskmini there's absolutely no reason why starfield shouldn't have a 60fps mode on the 4th gen xbox consoles, they somehow think people care about how sharp their assets are (when in practice they'll be degraded due to vram/LOD)

That's a pretty big if given that minimum specs got published today as an RX 5700. It's going to take minimum settings and probably some hacking with config files or mods to get it running at 30 on an iGPU.

Edit: I think this is the first game I've seen where the minimum requirements are higher than the most common GPU on the steam hardware survey. There's still a ton of 1060s and 1650s out there.

Twerk from Home fucked around with this message at 02:26 on Jun 12, 2023

Anime Schoolgirl
Nov 28, 2002

in my experience minimum requirements seem to be "what's the shittiest computer owned out of our entire staff that will 'run' it" and they're very often wildly inaccurate as a barometer to what the real "minimum" is, as most AAA games will scale downwards really well and some (mostly indie) games will actually have bigger practical requirements than stated (indie games listing pentium 4 cpus as a requirement when they're using the latest unity version for example)

Dr. Video Games 0031
Jul 17, 2004

If a game can do 4K30 on consoles but can't do 60fps at 1440p or lower with other graphical adjustments, then that's a sign that performance is being limited by the CPU. So expect this to be another CPU-heavy game (or another poorly optimized game that only utilizes 2-4 threads)

edit: We've seen this be the case with Gotham Knights, Plague Tale, and Redfall thus far. Those are games that run at high resolution and high graphical presets (in Gotham Knights' case, with RT forced on) because reducing visual fidelity has a minimal effect on performance if the CPU can't keep up at high frame rates. It's probably the same with Starfield.
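That reasoning can be sketched as a back-of-envelope check (the 1.15x threshold and the frame rates below are made-up illustrative numbers, not measurements from any of these games):

```python
# Back-of-envelope bottleneck check: if dropping resolution barely raises
# fps, the GPU wasn't the limiter, so the CPU (or engine) probably is.
def likely_bottleneck(fps_high_res: float, fps_low_res: float,
                      threshold: float = 1.15) -> str:
    """A small fps gain at a much lower resolution suggests a CPU limit."""
    return "CPU" if fps_low_res / fps_high_res < threshold else "GPU"

# Stuck near 30fps at both 4K and 1440p: looks CPU-limited.
print(likely_bottleneck(30.0, 33.0))   # CPU
# Nearly doubling fps at the lower resolution: looks GPU-limited.
print(likely_bottleneck(30.0, 55.0))   # GPU
```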

Dr. Video Games 0031 fucked around with this message at 02:42 on Jun 12, 2023

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Dr. Video Games 0031 posted:

If a game can do 4K30 on consoles but can't do 60fps at 1440p or lower with other graphical adjustments, then that's a sign that performance is being limited by the CPU. So expect this to be another CPU-heavy game (or another poorly optimized game that only utilizes 2 - 4 threads)

These are consoles, where "4K" means "900p-1440p dynamic resolution being reconstructed up to 4K". I'd bet money that Digital Foundry sees the Series S running 720p or lower, reconstructing up to 1440p.

I fully agree it must be CPU bottlenecked, otherwise you could just run Series S settings on a Series X to get the 60fps performance mode.
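For a sense of scale, here's the pixel-count arithmetic behind those reconstruction guesses (resolutions taken from the post above; the Series S figure is a guess, not a confirmed spec):

```python
# Pixel counts behind "900p-1440p reconstructed to 4K" style claims.
def pixels(w: int, h: int) -> int:
    return w * h

native_4k = pixels(3840, 2160)      # 8,294,400 pixels
drs_low = pixels(1600, 900)         # 900p, low end of the guessed DRS window
series_s_guess = pixels(1280, 720)  # hypothetical Series S internal res

print(round(native_4k / drs_low, 2))         # 900p renders ~5.76x fewer pixels
print(pixels(2560, 1440) // series_s_guess)  # 720p is 1/4 of a 1440p output
```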

Dr. Video Games 0031
Jul 17, 2004

Twerk from Home posted:

These are consoles, where "4K" means "900p-1440p dynamic resolution being reconstructed up to 4K". I'd bet money that Digital Foundry sees the Series S running 720p or lower, reconstructing up to 1440p.

I fully agree it must be CPU bottlenecked, otherwise you could just run Series S settings on a Series X to get the 60fps performance mode.

I don't know, I think there's a chance that it could at least be near 4K on XSX and near 1440p on XSS. I've edited my last post to reflect this, but the general trend has been to make the most out of the forced 30fps target by aiming for high resolutions and graphical presets. Jedi Survivor has been the biggest exception thus far, with it having terrible image quality on console while also only rarely hitting the 60fps target in its performance mode.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Anime Schoolgirl posted:

in my experience minimum requirements seem to be "what's the shittiest computer owned out of our entire staff that will 'run' it" and they're very often wildly inaccurate as a barometer to what the real "minimum" is, as most AAA games will scale downwards really well and some (mostly indie) games will actually have bigger practical requirements than stated (indie games listing pentium 4 cpus as a requirement when they're using the latest unity version for example)

Yeah if I can run Cyberpunk on a Mendocino laptop, you can run anything on anything if you turn it down low enough, esp with FSR

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

gradenko_2000 posted:

Yeah if I can run Cyberpunk on a Mendocino laptop, you can run anything on anything if you turn it down low enough, esp with FSR

Cyberpunk ran on the last-gen consoles and their terrible cat core CPUs, and Cyberpunk's listed GPU minimum spec is a GTX 970 or RX 470, half the performance that Starfield's minimum is asking for.

I guess we'll see when it gets here. Do games even let you choose to run at 640x480 anymore?

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Twerk from Home posted:

Cyberpunk ran on the last-gen consoles and their terrible cat core CPUs, and Cyberpunk's listed GPU minimum spec is a GTX 970 or RX 470, half the performance that Starfield's minimum is asking for.

I guess we'll see when it gets here. Do games even let you choose to run at 640x480 anymore?

The Dead Space remake let you go down that low, yes

njsykora
Jan 23, 2012

Robots confuse squirrels.


hobbesmaster posted:

lol at playing a Bethesda game on console

lol at playing a Bethesda game that can't be modded (like the PC game pass version)

wargames
Mar 16, 2008

official yospos cat censor

hobbesmaster posted:

RDNA1 support for ARM64 on Linux first appeared in 6.2 in February of this year.

Nvidia’s Linux drivers are basically released for x86-64 and arm64 at the same time.

:iiam:

I don't know about the arm side of linux.

hobbesmaster
Jan 28, 2008

wargames posted:

I don't know about the arm side of linux.

For some reason AMD and Intel aren’t jumping to provide first class support for arm64 in traditional PC use cases.

A real mystery that.

The other funny thing is to think about how much nvidia and apple must hate each other at the moment. I wonder which direction would cause more rage at this point in their feud: Apple demanding proprietary details of nvidia’s GPUs to implement OSX drivers or nvidia demanding proprietary details of apple silicon to implement OSX drivers.

Kibner
Oct 21, 2008

Acguy Supremacy

njsykora posted:

lol at playing a Bethesda game that can't be modded (like the PC game pass version)

You can mod (some) PC Game Pass games. There is a whole "mods" folder for games that support it, now.

e: "Program Files/ModifiableWindowsApps", I think

Kibner fucked around with this message at 04:22 on Jun 12, 2023

FuturePastNow
May 19, 2014


Minimum specs nowadays aren't the shittiest computer that will run the game, they're more like the minimum that won't require turning the details down so low people post screenshots that make the game look bad

Inept
Jul 8, 2003

Giving the current consoles good CPUs was a real monkey’s paw wish. Instead of having to optimize for weak CPUs, they can just crank out some poo poo that runs poorly on everything

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Inept posted:

Giving the current consoles good CPUs was a real monkey’s paw wish. Instead of having to optimize for weak CPUs, they can just crank out some poo poo that runs poorly on everything

We wanted more realistic, precise and interactive physics simulations. We wanted better AI in our NPCs.

What we got was poor optimisation. To be expected, I guess.

BurritoJustice
Oct 9, 2012

hobbesmaster posted:

For some reason AMD and Intel aren’t jumping to provide first class support for arm64 in traditional PC use cases.

A real mystery that.

The other funny thing is to think about how much nvidia and apple must hate each other at the moment. I wonder which direction would cause more rage at this point in their feud: Apple demanding proprietary details of nvidia’s GPUs to implement OSX drivers or nvidia demanding proprietary details of apple silicon to implement OSX drivers.

IIRC, last I heard the blocking entity for NVIDIA in MacOS is Apple. NVIDIA would love to sell GPUs to Apple users but Apple wants nothing to do with them since the dying laptop GPUs fiasco.

That's pre ASi though, the blocker now is that Apple doesn't want to support dGPUs at all. The M2 Mac Pro seems like a loving terrible workstation, with the limited RAM, lack of dGPU support, and reused CPU from the Mac studio.

Inept posted:

Giving the current consoles good CPUs was a real monkey’s paw wish. Instead of having to optimize for weak CPUs, they can just crank out some poo poo that runs poorly on everything

I wouldn't classify the CPUs as good, strictly. They're not as far behind as the Atom tier CPUs in the PS4/Xbone, but Zen2 wasn't an incredible gaming architecture and the low clocks, lessened cache, and high latency memory make the CPUs perform like desktop Zen1. And Zen1 levels of gaming performance was standard in the PC world ten years ago. I think they're definitely more likely to be CPU than GPU limited in modern games, which is why we are seeing 4K30 more often.

BurritoJustice fucked around with this message at 10:55 on Jun 12, 2023

repiv
Aug 13, 2009

BurritoJustice posted:

That's pre ASi though, the blocker now is that Apple doesn't want to support dGPUs at all. The M2 Mac Pro seems like a loving terrible workstation, with the limited RAM, lack of dGPU support, and reused CPU from the Mac studio.

if you actually use all those pcie slots it's going to have really bad contention too, according to the asahi linux devs they're using a giant PCIe switch to hang two 16x slots and three 8x slots from 16 CPU lanes

i wonder if the original plan was for it to use the rumoured quad-die chip, and this was plan B

Anime Schoolgirl
Nov 28, 2002

HEDT platform with 16 total pcie lanes :pwn:

repiv
Aug 13, 2009

32 lanes total, but five of the pcie slots share 16 lanes, and the other x8 slot shares 8 lanes with all of the integrated I/O



the SSDs get 8 dedicated lanes, but those are proprietary modules so if you want to use standard NVMe drives they need to share 24 lanes with everything else
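Quick oversubscription math for that layout (topology as reported by the Asahi devs above, so treat it as a description, not an official spec):

```python
# Lane math for the layout described above (per the Asahi devs' report).
upstream = 16                  # CPU lanes feeding the big PCIe switch
downstream = 2 * 16 + 3 * 8    # two x16 slots + three x8 slots behind it

print(downstream)              # 56 electrical slot lanes...
print(downstream / upstream)   # ...sharing 16 upstream lanes: 3.5x oversubscribed
```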

repiv fucked around with this message at 12:11 on Jun 12, 2023

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
https://twitter.com/VideoCardz/status/1668160171321819138?t=8Tnsnxmi5ohK9lzmZSunOA&s=19

Lord Stimperor
Jun 13, 2018

I'm a lovable meme.

gradenko_2000 posted:

The Dead Space remake let you go down that low, yes

I'm here thinking that a modern game in 640x480 might actually look really pretty, depending on the art style

Cross-Section
Mar 18, 2009

Kibner posted:

You can mod (some) PC Game Pass games. There is a whole "mods" folder for games that support it, now.

e: "Program Files/ModifiableWindowsApps", I think

Basically if the game has the "Administrator approval required for installation" note on its page in the app then it's about as moddable as your regular Steam game

If it doesn't then it's UWP nightmare time

Kerbtree
Sep 8, 2008

BAD FALCON!
LAZY!

Lord Stimperor posted:

I'm here thinking that a modern game in 640x480 might actually look really pretty, depending on the art style

That’s basically 40K: Boltgun’s thing.

Weird Pumpkin
Oct 7, 2007

Lord Stimperor posted:

I'm here thinking that a modern game in 640x480 might actually look really pretty, depending on the art style

I kinda like the look of some of the games where people have manually edited configs to get the lowest requirements possible. It's weird seeing modern games look so crunchy but in a really fun way

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

This thing is gonna suuuuuck.

Where's the betting pool on performance? Personally my guess is faster than a 3060 at 1080p and lower, slower at 1440p and higher.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Twerk from Home posted:

This thing is gonna suuuuuck.

Where's the betting pool on performance? Personally my guess is faster than a 3060 at 1080p and lower, slower at 1440p and higher.

I know we joked about nvidia leaning on frame generation to sell new cards but drat

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
poo poo, the 3050 might give it a run for its money

Termyie
Aug 18, 2022

Always choose violence.

Why is the memory bus getting worse on the lower-end cards? Is Nvidia cutting costs but keeping the prices sky high?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Termyie posted:

Why is the memory bus getting worse on the lower-end cards? Is Nvidia cutting costs but keeping the prices sky high?

Well, yeah

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
It's not that they are cutting memory buses down, it's just that (aside from the 4090) they are taking a given GPU and labeling it and pricing it two tiers higher than they historically would have.

Hopefully it's a single-generation aberration, but it may depend in part on how long the AI bubble lasts and how long Nvidia can keep screwing consumers while enjoying insatiable datacenter demand.

njsykora
Jan 23, 2012

Robots confuse squirrels.


Termyie posted:

Is Nvidia cutting costs but keeping the prices sky high?

Yes that is how capitalism works. Mostly though they're almost entirely dominant in the market and have no reason to do anything else. Even when AMD cards are better they still don't sell. Hell the 4060Ti was extremely close to the A750 so if the 4060 is slower than both that and the RX7600 (both cheaper cards, much cheaper with the A750) we're probably about to see a practical demonstration of "I don't care I only buy Nvidia cards".

njsykora fucked around with this message at 16:08 on Jun 12, 2023

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
it's not even just the memory bus; the 4060Ti only has 8 lanes of PCIe, where the 3060Ti, and even the 3060 non-Ti, still had a full 16 lanes, and only the 3050 had 8 lanes

mobby_6kl
Aug 9, 2009

by Fluffdaddy

njsykora posted:

Yes that is how capitalism works. Mostly though they're almost entirely dominant in the market and have no reason to do anything else. Even when AMD cards are better they still don't sell. Hell the 4060Ti was extremely close to the A750 so if the 4060 is slower than both that and the RX7600 (both cheaper cards, much cheaper with the A750) we're probably about to see a practical demonstration of "I don't care I only buy Nvidia cards".

I wouldn't put it past Intel to bail on the market after the first half-baked attempt but hopefully Nvidia's shenanigans encourage them to stay in for a slice of the fat margins

FuturePastNow
May 19, 2014


Termyie posted:

Why is the memory bus getting worse on the lower-end cards? Is Nvidia cutting costs but keeping the prices sky high?

To save a few pennies

repiv
Aug 13, 2009

isn't part of it that the memory PHYs don't shrink along with logic, so on denser nodes it's harder to justify putting a big bus on a small chip because it ends up dominating the die

same with PCIe lanes, the PHY size per lane doesn't shrink so we're seeing them cut down to 8 or 4 lanes on lower end cards

not to discount price gouging though, it's also that
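The bus-width point is easy to put numbers on: bandwidth is the bus width in bytes times the effective per-pin data rate. The figures below match the public GDDR6 specs for the 3060 Ti and 4060 Ti:

```python
# Memory bandwidth = bus width in bytes * effective per-pin data rate.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(256, 14))  # 3060 Ti: 256-bit @ 14 Gbps = 448.0 GB/s
print(bandwidth_gbs(128, 18))  # 4060 Ti: 128-bit @ 18 Gbps = 288.0 GB/s
```

The newer card's faster memory and bigger L2 claw some of that back, but the raw bus is a lot narrower.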

repiv fucked around with this message at 16:53 on Jun 12, 2023

UHD
Nov 11, 2006


gradenko_2000 posted:

it's not even just the memory bus; the 4060Ti only has 8 lanes of PCIe, where the 3060Ti, and even the 3060 non-Ti, still had a full 16 lanes, and only the 3050 had 8 lanes

do fewer lanes translate to meaningful performance issues? because i was under the impression pcie speed doesn't really impact performance very much.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

UHD posted:

do fewer lanes translate to meaningful performance issues? because i was under the impression pcie speed doesn't really impact performance very much.

Fine if you have a PCIe 4 board and CPU, poo poo if you have a PCIe 3 board and CPU, now you're down to PCIe 3 8x

That's the main complaint

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

UHD posted:

do fewer lanes translate to meaningful performance issues? because i was under the impression pcie speed doesn't really impact performance very much.

you can run into a bottleneck if your CPU/motherboard is still on PCIe 3.0

this was a problem for AMD as early as their 6000 series, when they cut the RX 6500XT and 6400 down to just FOUR lanes. An RX 6600 with eight lanes is mostly okay, but at the performance envelope that the 4060Ti is operating at, eight lanes of PCIe 3.0 can actually become an issue, compared to, say, a 3060Ti with 16
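The rough per-direction link throughput makes the complaint concrete (per-lane rates derived from the 8/16 GT/s line rates after 128b/130b encoding, ignoring protocol overhead):

```python
# Approximate per-direction PCIe throughput (GB/s), ignoring overhead.
PER_LANE = {3: 0.985, 4: 1.969}  # GB/s per lane after 128b/130b encoding

def link_gbs(gen: int, lanes: int) -> float:
    return round(PER_LANE[gen] * lanes, 1)

print(link_gbs(4, 8))   # 4060 Ti in a PCIe 4.0 slot: ~15.8 GB/s
print(link_gbs(3, 8))   # the same card on a PCIe 3.0 board: ~7.9 GB/s
print(link_gbs(3, 16))  # a 3060 Ti x16 card on that PCIe 3.0 board: ~15.8 GB/s
```

On a 4.0 board nothing is lost; on a 3.0 board the x8 card gets half the link of its x16 predecessor.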

Dr. Video Games 0031
Jul 17, 2004

A theory emerges on why Gigabyte GPUs are disproportionately represented when it comes to cracked PCBs: https://www.tomshardware.com/news/gigabyte-gpu-design-details-emerge-about-pcb-cracking



Gigabyte cards have that notch next to the PCIe latching tab, which on some cards causes the edge of the PCB to come very close to the bottom-right GPU cooler mounting hole. This seems to be a problem with the GA102 cards in particular, with the other GPUs having less problematic mounting hole placement. EVGA's and Zotac's 3090s appear to have a similar notch and mounting hole placements, while Asus and MSI's don't.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



repiv posted:

32 lanes total, but five of the pcie slots share 16 lanes, and the other x8 slot shares 8 lanes with all of the integrated I/O



the SSDs get 8 dedicated lanes, but those are proprietary modules so if you want to use standard NVMe drives they need to share 24 lanes with everything else

Yeah, it’s a really poo poo machine.

The 2019 Intel Mac Pros are probably going to hold value for quite some time as a result of just how bad that machine is.

Edit: to put it into perspective, I have a tracker for Apple refurbished hardware from the Apple Store, and pre-WWDC, it wasn’t uncommon for refurbished Mac Pros to sit for days on end without being purchased, but since WWDC, almost any decent 7,1 MP is being purchased immediately. They had a 16-core MP with 1TB SSD, 48 GB DDR4, and W5700X come in at a really great price early this morning, and it apparently immediately sold out.

njsykora posted:

Yes that is how capitalism works. Mostly though they're almost entirely dominant in the market and have no reason to do anything else. Even when AMD cards are better they still don't sell. Hell the 4060Ti was extremely close to the A750 so if the 4060 is slower than both that and the RX7600 (both cheaper cards, much cheaper with the A750) we're probably about to see a practical demonstration of "I don't care I only buy Nvidia cards".

We need an Intel Core vs. AMD Ryzen type of situation that forces nVidia to be more competitive on price or performance, but lol, lmao, at the idea of AMD getting its act together enough to put out something that gives nVidia an actual run for its money across the key performance metrics.

Canned Sunshine fucked around with this message at 16:58 on Jun 12, 2023
