|
Anime Schoolgirl posted:
if i manage to get the game running at 30fps on my deskmini there's absolutely no reason why starfield shouldn't have a 60fps mode on the 4th gen xbox consoles, they somehow think people care about how sharp their assets are (when in practice they'll be degraded due to vram/LOD)

That's a pretty big if given that minimum specs got published today as an RX 5700. It's going to take minimum settings and probably some hacking with config files or mods to get it running at 30 on an iGPU.

Edit: I think this is the first game I've seen where the minimum requirements are higher than the most common GPU on the steam hardware survey. There's still a ton of 1060s and 1650s out there.

Twerk from Home fucked around with this message at 02:26 on Jun 12, 2023
# ? Jun 12, 2023 02:03 |
|
in my experience minimum requirements seem to be "what's the shittiest computer owned out of our entire staff that will 'run' it" and they're very often wildly inaccurate as a barometer of what the real "minimum" is, as most AAA games will scale downwards really well and some (mostly indie) games will actually have bigger practical requirements than stated (indie games listing pentium 4 cpus as a requirement when they're using the latest unity version for example)
|
# ? Jun 12, 2023 02:34 |
|
If a game can do 4K30 on consoles but can't do 60fps at 1440p or lower with other graphical adjustments, then that's a sign that performance is being limited by the CPU. So expect this to be another CPU-heavy game (or another poorly optimized game that only utilizes 2-4 threads)

edit: We've seen this be the case with Gotham Knights, Plague Tale, and Redfall thus far. Those are games that run at high resolution and high graphical presets (in Gotham Knights' case, with RT forced on) because reducing visual fidelity has a minimal effect on performance if the CPU can't keep up at high frame rates. It's probably the same with Starfield.

Dr. Video Games 0031 fucked around with this message at 02:42 on Jun 12, 2023
# ? Jun 12, 2023 02:37 |
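The reasoning above is the standard resolution-scaling test: drop the render resolution, and if the frame rate barely moves, the GPU was never the limiter. A minimal sketch of that heuristic (the 10% threshold is an arbitrary assumption, not anything the post specifies):

```python
def likely_cpu_bound(fps_high_res, fps_low_res, threshold=0.10):
    """Heuristic: if dropping resolution barely raises the frame rate,
    the GPU wasn't the limiter -- the CPU (or engine) probably is."""
    gain = (fps_low_res - fps_high_res) / fps_high_res
    return gain < threshold

# A game stuck near 30fps at both 4K and 1440p points at the CPU:
print(likely_cpu_bound(31, 33))   # True: ~6% gain from a big resolution drop
print(likely_cpu_bound(31, 58))   # False: the drop nearly doubled the frame rate
```

This is why "just lower the settings" doesn't buy a 60fps mode on a CPU-limited game: the inputs to the left-hand side barely change.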
|
Dr. Video Games 0031 posted:
If a game can do 4K30 on consoles but can't do 60fps at 1440p or lower with other graphical adjustments, then that's a sign that performance is being limited by the CPU. So expect this to be another CPU-heavy game (or another poorly optimized game that only utilizes 2 - 4 threads)

These are the consoles, 4K means "900p-1440p dynamic resolution being reconstructed up to 4K". I'd bet money that Digital Foundry sees the Series S running 720p or lower, reconstructing up to 1440. I fully agree it must be CPU bottlenecked, otherwise you could just run Series S settings on a Series X to get the 60fps performance mode.
|
# ? Jun 12, 2023 02:40 |
|
Twerk from Home posted:
These are the consoles, 4K means "900p-1440p dynamic resolution being reconstructed up to 4K". I'd bet money that Digital Foundry sees the Series S running 720p or lower, reconstructing up to 1440.

I don't know, I think there's a chance that it could at least be near 4K on XSX and near 1440p on XSS. I've edited my last post to reflect this, but the general trend has been to make the most out of the forced 30fps target by aiming for high resolutions and graphical presets. Jedi Survivor has been the biggest exception thus far, with it having terrible image quality on console while also only rarely hitting the 60fps target in its performance mode.
|
# ? Jun 12, 2023 02:48 |
|
Anime Schoolgirl posted:
in my experience minimum requirements seem to be "what's the shittiest computer owned out of our entire staff that will 'run' it" and they're very often wildly inaccurate as a barometer of what the real "minimum" is, as most AAA games will scale downwards really well and some (mostly indie) games will actually have bigger practical requirements than stated (indie games listing pentium 4 cpus as a requirement when they're using the latest unity version for example)

Yeah if I can run Cyberpunk on a Mendocino laptop, you can run anything on anything if you turn it down low enough, esp with FSR
|
# ? Jun 12, 2023 02:52 |
|
gradenko_2000 posted:
Yeah if I can run Cyberpunk on a Mendocino laptop, you can run anything on anything if you turn it down low enough, esp with FSR

Cyberpunk ran on the last-gen consoles and their terrible cat core CPUs, and cyberpunk's listed GPU minimum spec is a GTX 970 or RX 470, half the performance that Starfield's minimum is asking for. I guess we'll see when it gets here. Do games even let you choose to run at 640x480 anymore?
|
# ? Jun 12, 2023 02:55 |
|
Twerk from Home posted:
Cyberpunk ran on the last-gen consoles and their terrible cat core CPUs, and cyberpunk's listed GPU minimum spec is a GTX 970 or RX 470, half the performance that Starfield's minimum is asking for.

The Dead Space remake let you go down that low, yes
|
# ? Jun 12, 2023 02:58 |
|
hobbesmaster posted:
lol at playing a Bethesda game on console

lol at playing a Bethesda game that can't be modded (like the PC game pass version)
|
# ? Jun 12, 2023 03:16 |
|
hobbesmaster posted:
RDNA1 support for ARM64 on Linux first appeared in 6.2 in February of this year.

I don't know about the arm side of linux.
|
# ? Jun 12, 2023 03:32 |
|
wargames posted:
I don't know about the arm side of linux.

For some reason AMD and Intel aren’t jumping to provide first class support for arm64 in traditional PC use cases. A real mystery that.

The other funny thing is to think about how much nvidia and apple must hate each other at the moment. I wonder which direction would cause more rage at this point in their feud: Apple demanding proprietary details of nvidia’s GPUs to implement OSX drivers or nvidia demanding proprietary details of apple silicon to implement OSX drivers.
|
# ? Jun 12, 2023 03:54 |
|
njsykora posted:
lol at playing a Bethesda game that can't be modded (like the PC game pass version)

You can mod (some) PC Game Pass games. There is a whole "mods" folder for games that support it, now.

e: "Program Files/ModifiableWindowsApps", I think

Kibner fucked around with this message at 04:22 on Jun 12, 2023
# ? Jun 12, 2023 04:20 |
|
Minimum specs nowadays aren't the shittiest computer that will run the game, they're more like the minimum that won't require turning the details down so low that people post screenshots that make the game look bad
|
# ? Jun 12, 2023 04:30 |
|
Giving the current consoles good CPUs was a real monkey’s paw wish. Instead of having to optimize for weak CPUs, they can just crank out some poo poo that runs poorly on everything
|
# ? Jun 12, 2023 04:39 |
|
Inept posted:
Giving the current consoles good CPUs was a real monkey’s paw wish. Instead of having to optimize for weak CPUs, they can just crank out some poo poo that runs poorly on everything

We wanted more realistic, precise and interactive physics simulations. We wanted better AI in our NPCs. What we got was poor optimisation. To be expected, I guess.
|
# ? Jun 12, 2023 05:06 |
|
hobbesmaster posted:
For some reason AMD and Intel aren’t jumping to provide first class support for arm64 in traditional PC use cases.

IIRC, last I heard the blocking entity for NVIDIA in MacOS is Apple. NVIDIA would love to sell GPUs to Apple users but Apple wants nothing to do with them since the dying laptop GPUs fiasco.

That's pre ASi though, the blocker now is that Apple doesn't want to support dGPUs at all. The M2 Mac Pro seems like a loving terrible workstation, with the limited RAM, lack of dGPU support, and reused CPU from the Mac studio.

Inept posted:
Giving the current consoles good CPUs was a real monkey’s paw wish. Instead of having to optimize for weak CPUs, they can just crank out some poo poo that runs poorly on everything

I wouldn't classify the CPUs as good, strictly. They're not as far behind as the Atom tier CPUs in the PS4/Xbone, but Zen2 wasn't an incredible gaming architecture and the low clocks, lessened cache, and high latency memory make the CPUs perform like desktop Zen1. And Zen1 levels of gaming performance was standard in the PC world ten years ago. I think they're definitely more likely to be CPU than GPU limited in modern games, which is why we are seeing 4K30 more often.

BurritoJustice fucked around with this message at 10:55 on Jun 12, 2023
# ? Jun 12, 2023 10:50 |
|
BurritoJustice posted:
That's pre ASi though, the blocker now is that Apple doesn't want to support dGPUs at all. The M2 Mac Pro seems like a loving terrible workstation, with the limited RAM, lack of dGPU support, and reused CPU from the Mac studio.

if you actually use all those pcie slots it's going to have really bad contention too, according to the asahi linux devs they're using a giant PCIe switch to hang two 16x slots and three 8x slots from 16 CPU lanes

i wonder if the original plan was for it to use the rumoured quad-die chip, and this was plan B
|
# ? Jun 12, 2023 11:05 |
|
HEDT platform with 16 total pcie lanes
|
# ? Jun 12, 2023 11:43 |
|
32 lanes total, but five of the pcie slots share 16 lanes, and the other x8 slot shares 8 lanes with all of the integrated I/O

the SSDs get 8 dedicated lanes, but those are proprietary modules so if you want to use standard NVMe drives they need to share 24 lanes with everything else

repiv fucked around with this message at 12:11 on Jun 12, 2023
# ? Jun 12, 2023 11:52 |
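Taking the lane counts from the two posts above at face value (they come from the Asahi devs' reporting, not independent verification), the oversubscription behind the switch is simple arithmetic:

```python
# Slot configuration as described for the M2 Mac Pro: five slots hung off
# one PCIe switch that has only 16 lanes back to the SoC.
cpu_lanes_behind_switch = 16
slot_lanes = [16, 16, 8, 8, 8]  # two x16 slots + three x8 slots

oversubscription = sum(slot_lanes) / cpu_lanes_behind_switch
print(oversubscription)  # 3.5 -> if every slot is busy at once, each sees ~29% of rated bandwidth
```

The switch can buffer bursty traffic, so the 3.5x figure only bites when multiple cards stream simultaneously, but that's exactly the workload a workstation with five populated slots implies.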
|
https://twitter.com/VideoCardz/status/1668160171321819138?t=8Tnsnxmi5ohK9lzmZSunOA&s=19
|
# ? Jun 12, 2023 12:04 |
|
gradenko_2000 posted:
The Dead Space remake let you go down that low, yes

I'm here thinking that a modern game in 640x480 might actually look really pretty, depending on the art style
|
# ? Jun 12, 2023 13:23 |
|
Kibner posted:
You can mod (some) PC Game Pass games. There is a whole "mods" folder for games that support it, now.

Basically if the game has the "Administrator approval required for installation" note on its page in the app then it's about as moddable as your regular Steam game. If it doesn't then it's UWP nightmare time
|
# ? Jun 12, 2023 13:46 |
|
Lord Stimperor posted:
I'm here thinking that a modern game in 640x480 might actually look really pretty, depending on the art style

That’s basically 40K: Boltgun’s thing.
|
# ? Jun 12, 2023 14:42 |
|
Lord Stimperor posted:
I'm here thinking that a modern game in 640x480 might actually look really pretty, depending on the art style

I kinda like the look of some of the games where people have manually edited configs to get the lowest requirements possible. It's weird seeing modern games look so crunchy but in a really fun way
|
# ? Jun 12, 2023 14:58 |
|
gradenko_2000 posted:
https://twitter.com/VideoCardz/status/1668160171321819138?t=8Tnsnxmi5ohK9lzmZSunOA&s=19

This thing is gonna suuuuuck. Where's the betting pool on performance? Personally my guess is faster than a 3060 at 1080p and lower, slower at 1440p and higher.
|
# ? Jun 12, 2023 14:59 |
|
Twerk from Home posted:
This thing is gonna suuuuuck.

I know we joked about nvidia leaning on frame generation to sell new cards but drat
|
# ? Jun 12, 2023 15:29 |
|
poo poo, the 3050 might give it a run for its money
|
# ? Jun 12, 2023 15:37 |
|
Why is the memory bus getting worse on the lower-end cards? Is Nvidia cutting costs but keeping the prices sky high?
|
# ? Jun 12, 2023 15:40 |
|
Termyie posted:
Why is the memory bus getting worse on the lower-end cards? Is Nvidia cutting costs but keeping the prices sky high?

Well, yeah
|
# ? Jun 12, 2023 16:03 |
|
It's not that they are cutting memory buses down, it's just that (aside from the 4090) they are taking a given GPU and labeling it and pricing it two tiers higher than they historically would have. Hopefully it's a single-generation aberration, but it may depend in part on how long the AI bubble lasts and lets Nvidia screw consumers while enjoying insatiable datacenter demand.
|
# ? Jun 12, 2023 16:05 |
|
Termyie posted:
Is Nvidia cutting costs but keeping the prices sky high?

Yes, that is how capitalism works. Mostly though they almost entirely dominate the market and have no reason to do anything else. Even when AMD cards are better they still don't sell. Hell, the 4060Ti was extremely close to the A750, so if the 4060 is slower than both that and the RX7600 (both cheaper cards, much cheaper with the A750) we're probably about to see a practical demonstration of "I don't care, I only buy Nvidia cards".

njsykora fucked around with this message at 16:08 on Jun 12, 2023
# ? Jun 12, 2023 16:05 |
|
it's not even just the memory bus; the 4060Ti only has 8 lanes of PCIe, where the 3060Ti, and even the 3060 non-Ti, still had a full 16 lanes, and only the 3050 had 8 lanes
|
# ? Jun 12, 2023 16:18 |
|
njsykora posted:
Yes, that is how capitalism works. Mostly though they almost entirely dominate the market and have no reason to do anything else. Even when AMD cards are better they still don't sell. Hell, the 4060Ti was extremely close to the A750, so if the 4060 is slower than both that and the RX7600 (both cheaper cards, much cheaper with the A750) we're probably about to see a practical demonstration of "I don't care, I only buy Nvidia cards".

I wouldn't put it past Intel to bail on the market after the first half-baked attempt but hopefully Nvidia's shenanigans encourage them to stay in for a slice of the fat margins
|
# ? Jun 12, 2023 16:21 |
|
Termyie posted:
Why is the memory bus getting worse on the lower-end cards? Is Nvidia cutting costs but keeping the prices sky high?

To save a few pennies
|
# ? Jun 12, 2023 16:34 |
|
isn't part of it that the memory PHYs don't shrink along with logic, so on denser nodes it's harder to justify putting a big bus on a small chip because it ends up dominating the die

same with PCIe lanes, the PHY size per lane doesn't shrink so we're seeing them cut down to 8 or 4 lanes on lower end cards

not to discount price gouging though, it's also that

repiv fucked around with this message at 16:53 on Jun 12, 2023
# ? Jun 12, 2023 16:35 |
|
gradenko_2000 posted:
it's not even just the memory bus; the 4060Ti only has 8 lanes of PCIe, where the 3060Ti, and even the 3060 non-Ti, still had a full 16 lanes, and only the 3050 had 8 lanes

does fewer lanes translate to meaningful performance issues? because i was under the impression pcie speed doesn't really impact performance very much.
|
# ? Jun 12, 2023 16:37 |
|
UHD posted:
does fewer lanes translate to meaningful performance issues? because i was under the impression pcie speed doesn't really impact performance very much.

Fine if you have a PCIe 4 board and CPU, poo poo if you have a PCIe 3 board and CPU, now you're down to PCIe 3 8x. That's the main complaint
|
# ? Jun 12, 2023 16:42 |
|
UHD posted:
does fewer lanes translate to meaningful performance issues? because i was under the impression pcie speed doesn't really impact performance very much.

you can run into a bottleneck if your CPU/motherboard is still on PCIe 3.0

this was a problem for AMD as recently as their 6000 series, when they cut the RX 6500XT and 6400 down to just FOUR lanes. An RX 6600 with eight lanes is mostly okay, but at the performance envelope that the 4060Ti is operating at, eight lanes of PCIe 3.0 can actually become an issue, compared to, say, a 3060Ti with 16
|
# ? Jun 12, 2023 16:51 |
|
A theory emerges on why Gigabyte GPUs are disproportionately represented when it comes to cracked PCBs: https://www.tomshardware.com/news/gigabyte-gpu-design-details-emerge-about-pcb-cracking

Gigabyte cards have that notch next to the PCIe latching tab, which on some cards causes the edge of the PCB to come very close to the bottom-right GPU cooler mounting hole. This seems to be a problem with the GA102 cards in particular, with the other GPUs having less problematic mounting hole placement. EVGA's and Zotac's 3090s appear to have a similar notch and mounting hole placements, while Asus and MSI's don't.
|
# ? Jun 12, 2023 16:55 |
|
repiv posted:
32 lanes total, but five of the pcie slots share 16 lanes, and the other x8 slot shares 8 lanes with all of the integrated I/O

Yeah, it’s a really poo poo machine. The 2019 Intel Mac Pros are probably going to hold value for quite some time as a result of just how bad that machine is.

Edit: to put it into perspective, I have a tracker for Apple refurbished hardware from the Apple Store, and pre-WWDC, it wasn’t uncommon for refurbished Mac Pros to sit for days on end without being purchased, but since WWDC, almost any decent 7,1 MP is being purchased immediately. They had a 16-core MP with 1TB SSD, 48 GB DDR4, and W5700X come in at a really great price early this morning, and it apparently immediately sold out.

njsykora posted:
Yes, that is how capitalism works. Mostly though they almost entirely dominate the market and have no reason to do anything else. Even when AMD cards are better they still don't sell. Hell, the 4060Ti was extremely close to the A750, so if the 4060 is slower than both that and the RX7600 (both cheaper cards, much cheaper with the A750) we're probably about to see a practical demonstration of "I don't care, I only buy Nvidia cards".

We need an Intel Core vs. AMD Ryzen type of situation where it forces nVidia to be more competitive on price or performance increases, but lol, lmao, at the idea of AMD getting its act together enough to put something that gives nVidia an actual run for its money across the key performance metrics.

Canned Sunshine fucked around with this message at 16:58 on Jun 12, 2023
# ? Jun 12, 2023 16:55 |