Ganondork
Dec 26, 2012

Ganondork

DrDork posted:

The actual reliability of USB 3 is perfectly fine. What's not has been the compatibility, as 3.x decided to introduce multiple layers of confusing and conflicting speeds, capabilities, etc. They'll all work as a generic 3.0 host, but then whether it supports super high speed, a given wattage for charging, video, etc., is all a poo poo-show.

That’s exactly it: vendor implementations have been all over the place. Seems like it was completely over-engineered.

Also, interesting story! Having built out many 3rd party integrations myself, I can say that all it takes is 1 poo poo vendor to force you into bending to their crap. I’m sure your friend wished he could slap those fools!

pyrotek
May 21, 2004



DLSS 2.0 is amazing in the whopping four games that support it right now, but I can't stop thinking how much I'd love to have the technology in a new Switch revision.

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib

TheFluff posted:

The selling point with USB-C in particular is that hey, it's one connector that does everything and you don't have to flip it three times when plugging in, sweet. The problem is that if you look at any random USB-C port there's absolutely no way of telling what it might support. Does it supply power? Maybe, but at 5V, 12V or 20V, or any of them? How much power? Enough to charge your laptop? Who knows! Does it support power input, for that matter? Sometimes only one of the ports on a device does. Can you use it for display output? Quite possibly, but it's anyone's guess what resolutions and refresh rates are supported, or even which protocol is being used (HDMI is possible, AFAIK). It's quite common that 4K 60Hz doesn't work. And even if you just use it for data transfer, there's no consistency there either. It's a loving mess.

Also, multiple types of USB-C cables with different capabilities. It's fun!

With displays, most of the time it's DisplayPort, but some also support HDMI.

Peanut Butler
Jul 25, 2003



so I used to work in computers, ages ago (trying to get back into it but that's neither here nor there), so I'm not really up on how PCIe works since AGP was the standard back then, and I've been pretty poor for a while, using what IT skills I have to stretch this Pentium G620 machine into functionality as long as possible-

as I posted upthread, I have a Radeon HD4870, a beast of a GPU that unfortunately can't do DX11. Wanted to play Disco Elysium, and my buddy gave me his old GeForce GT630- I'm stoked, I don't rly need this to be an amazing 60fps graphical experience, I'm here for the writing.

Went ahead and put the lil GT630 into the second PCIe slot, so I wouldn't have to disconnect the power cables from the chonky HD4870. After a little tinkering, got the GT630 working on a second monitor, and confirmed that it was handling DX11 shaders just fine.

I was using the VGA connector, and thought to use HDMI instead, so I unplugged the VGA- and Disco Elysium moved over to the primary monitor, still plugged into the HD4870 via DVI- and DX11 shaders still worked.

What's going on inside the computer/in software? Is the 630 handling rendering while outputting through the 4870? Is it just doing DX11 shader calls? Is this One Weird Trick to make your expensive and beefy old GPU that's hard to replace cheaply do DX11?

joke edit: how long until my computer explodes from having ATI and nVidia drivers on it at the same time

Peanut Butler fucked around with this message at 21:47 on Apr 5, 2020

Taerkar
Dec 7, 2002

kind of into it, really

I've currently got a 1060 6GB version and it's really starting to show its age with certain things, pushing me to look to getting an upgrade. I'll be moving this card into my partner's PC so they'll get good use out of it, but I'm mostly looking to not spend too much. I'm thinking around $300 or so but I could go up to $400 if it's a good enough card to justify spending extra on it.

Right now I've got a 1920x1080 setup but I occasionally use a 4K tv for group stuff.

TOOT BOOT
May 25, 2010

Peanut Butler posted:

so I used to work in computers, ages ago (trying to get back into it but that's neither here nor there), so I'm not really up on how PCIe works since AGP was the standard back then, and I've been pretty poor for a while, using what IT skills I have to stretch this Pentium G620 machine into functionality as long as possible-

as I posted upthread, I have a Radeon HD4870, a beast of a GPU that unfortunately can't do DX11. Wanted to play Disco Elysium, and my buddy gave me his old GeForce GT630- I'm stoked, I don't rly need this to be an amazing 60fps graphical experience, I'm here for the writing.

Went ahead and put the lil GT630 into the second PCIe slot, so I wouldn't have to disconnect the power cables from the chonky HD4870. After a little tinkering, got the GT630 working on a second monitor, and confirmed that it was handling DX11 shaders just fine.

I was using the VGA connector, and thought to use HDMI instead, so I unplugged the VGA- and Disco Elysium moved over to the primary monitor, still plugged into the HD4870 via DVI- and DX11 shaders still worked.

What's going on inside the computer/in software? Is the 630 handling rendering while outputting through the 4870? Is it just doing DX11 shader calls? Is this One Weird Trick to make your expensive and beefy old GPU that's hard to replace cheaply do DX11?

joke edit: how long until my computer explodes from having ATI and nVidia drivers on it at the same time

There's no real reason to have 2 GPUs installed. Just use the more powerful one which is probably the most recent one if it's much newer.

Arzachel
May 12, 2012

TOOT BOOT posted:

There's no real reason to have 2 GPUs installed. Just use the more powerful one which is probably the most recent one if it's much newer.

You'd think so, but the HD4870 is twice as fast as the GT630 which is a rebadged GT440.

repiv
Aug 13, 2009

a hd4870 isn't going to do you much good when dx11 is mandatory to run nearly everything released in the last 5+ years though

it may be time to buy a new card

repiv
Aug 13, 2009

the hd4xxx series doesn't even have official windows 10 drivers

NJD2005
Sep 3, 2006
...
Honestly sounds like a good situation for GeForce Now, Disco Elysium is supported and it's not a game where you need split second reflexes. There's a free version that gives you an hour session but I think right now has long lines to connect or you can get their premium version which gives you 6 hour sessions (you can just reconnect immediately after) and priority access for $4.99 a month.

Demostrs
Mar 30, 2011

by Nyc_Tattoo

Peanut Butler posted:

What's going on inside the computer/in software? Is the 630 handling rendering while outputting through the 4870?

Nobody else really answered this question, but I bet this scenario is what is happening. I know it’s at least possible, as I had my Intel iGPU rendering a game and displaying through a GTX 1060 before when that card was dying on me.

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib

NJD2005 posted:

Honestly sounds like a good situation for GeForce Now, Disco Elysium is supported and it's not a game where you need split second reflexes. There's a free version that gives you an hour session but I think right now has long lines to connect or you can get their premium version which gives you 6 hour sessions (you can just reconnect immediately after) and priority access for $4.99 a month.

Disco Elysium looks like a game that will run on pretty much anything, though.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
It's also possible that while the game has a "is this system DX11 capable" launch check, Unity is capable of doing fallback to DX9 (although not all the shaders may get proper fallbacks, etc).

Peanut Butler
Jul 25, 2003



repiv posted:

a hd4870 isn't going to do you much good when dx11 is mandatory to run nearly everything released in the last 5+ years though

it may be time to buy a new card

lol it's been time to buy a new card for a while but it's more pressingly time to buy food; shelter

a huge number of games I play use OpenGL, or DX9 mode, tho, I'm not rly into AAA titles; I've been barred from a few newer games but, like, there are decades of games I've never played so it hasn't been a sweat really

TOOT BOOT posted:

There's no real reason to have 2 GPUs installed. Just use the more powerful one which is probably the most recent one if it's much newer.

this has been addressed, but it's one of the frustrations of having a GPU that was very expensive when Bush was president- until real recently, most sub-$200 options that can do DX11 have been partial downgrades. I live on abt $10k/yr, so- it's all been about making the components I got when I had more money, plus components friends give me on upgrade, last as long as possible. The 4870 is probably the best card I coulda bought a decade ago, given my current situation, the thing renders most newer games it's compatible with super well for its age- waaaay better than a GPU manufactured in 1999 would in 2010!

NJD2005 posted:

Honestly sounds like a good situation for GeForce Now, Disco Elysium is supported and it's not a game where you need split second reflexes. There's a free version that gives you an hour session but I think right now has long lines to connect or you can get their premium version which gives you 6 hour sessions (you can just reconnect immediately after) and priority access for $4.99 a month.

Lambert posted:

Disco Elysium looks like a game that will run on pretty much anything, though.

K8.0 posted:

It's also possible that while the game has a "is this system DX11 capable" launch check, Unity is capable of doing fallback to DX9 (although not all the shaders may get proper fallbacks, etc).

Disco Elysium will start up if you can't do DX11, and is just unplayably visually glitchy as shader calls fail. Checked the log, and there aren't any DX11 shader errors now, runs great on my frankenstein potato machine with what I affectionately call an 'i1' CPU.
I just thought it was funny that DX11 calls work perfectly now despite me not having a monitor hooked up to the 630 at all

edit: lol it's even funnier, it glitches out if I start it on the old card, but not when I start it on a monitor hooked to the new card, and then yank the monitor cord to force it to display only on the main one
also framerate is significantly better on the old card, it's just drawing vertices out of bounds like an overheating GPU would
havin a computer lmaopalooza over here

Peanut Butler fucked around with this message at 00:56 on Apr 6, 2020

JnnyThndrs
May 29, 2001

HERE ARE THE FUCKING TOWELS
gently caress, dude, PM me your address and I’ll send you a GTX 560 for free. That’ll do DX11 just fine and runs DX9 stuff a little faster than a HD4870. I might even have an HD6950 somewhere you can have, it’s a decent card but can’t be used in a Hackintosh, so I replaced it.

forest spirit
Apr 6, 2009

Frigate Hetman Sahaidachny
First to Fight Scuttle, First to Fall Sink


Looten Plunder posted:

This is currently a gaming rig running 1440p on one of those overclocked 96hz Korean monitors.

FYI if your 1440p Korean monitor is anything like MY 1440p Korean monitor then it's probably dual-link DVI, yeah? I don't think more modern cards have DVI anymore, at least mine didn't, so you'll have to buy an active adapter to get it working. The passive adapter that came with my card didn't work.

They're like 50~100 bucks. This is even if you decide to get a new graphics card and not just upgrade your cpu

Schiavona
Oct 8, 2008

Penpal posted:

FYI if your 1440p Korean monitor is anything like MY 1440p Korean monitor then it's probably dual-link DVI, yeah? I don't think more modern cards have DVI anymore, at least mine didn't, so you'll have to buy an active adapter to get it working. The passive adapter that came with my card didn't work.

They're like 50~100 bucks. This is even if you decide to get a new graphics card and not just upgrade your cpu

Tell me more. I have a Qnix from back when they were the 1440p overclockable 27inch hotness, and bought a converter with the rest of my stuff from Microcenter (https://www.microcenter.com/product/485545/HDMI_Male_to_DVI-D_Female_Adapter). Is this not gonna work?

JnnyThndrs
May 29, 2001

HERE ARE THE FUCKING TOWELS
Nope. The best deals going are the Dell adapters sold on eBay; they were about $39 last month when I got one for my Catleap 27”.

Edit: https://www.ebay.com/i/293400633776...ASABEgIUU_D_BwE

JnnyThndrs fucked around with this message at 03:17 on Apr 6, 2020

repiv
Aug 13, 2009

It's not gonna work, those cheap passive adapters only do single link DVI and you need dual link for 1440p. The more expensive active adapters can do dual link but they can't be overclocked so you're stuck at 60hz.

e;fb
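To put ballpark numbers on the single-link vs dual-link limit (a rough sketch assuming CVT-RB-style blanking, not any specific monitor's timings):

```python
# Ballpark pixel-clock math behind the single-link vs dual-link DVI limit.
# Blanking figures approximate CVT-RB timings; exact values vary per monitor.

SINGLE_LINK_DVI_MHZ = 165.0   # max TMDS pixel clock on one link
DUAL_LINK_DVI_MHZ = 330.0     # two links, roughly double

def approx_pixel_clock_mhz(h_active, v_active, refresh_hz,
                           h_blank=160, v_blank=41):
    """Approximate pixel clock in MHz with a CVT-RB style blanking budget."""
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

for hz in (60, 96, 120):
    clk = approx_pixel_clock_mhz(2560, 1440, hz)
    verdict = "fits single-link" if clk <= SINGLE_LINK_DVI_MHZ else "needs dual-link"
    print(f"2560x1440 @ {hz:>3} Hz -> ~{clk:.0f} MHz pixel clock ({verdict})")

# 1440p60 already needs roughly 240 MHz, far past the 165 MHz single-link
# ceiling, so a passive (single-link) HDMI-to-DVI dongle can't drive these
# panels at all. The overclocked refresh rates push past even the 330 MHz
# dual-link figure, which a direct connection tolerates but active adapters
# generally won't.
```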

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender
I used one of those Dell adapters with my Qnix for a while and I was able to o/c to ~80Hz before I started getting artifacts.

Direct connect I was able to get to 100Hz.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I've been reading this thread and watching some Digital Foundry videos and I wanna check to see if I'm understanding some of these concepts correctly:

DLSS is when the game is running at a lower resolution than is native to the monitor, but the GPU upscales the image to fit, such as a 1080p image upscaled to fit a 4k monitor, or a 540p image upscaled to fit a 1080p monitor.

The idea is that if the DLSS algorithm works well enough to avoid artifacting and other noticeable image distortions, the image should look close enough to running a 4k image on a 4k monitor that you can't tell the difference.

And the reason why you want to do this, is so that you can squeeze out more performance from the card - the idea being that since 540p is easier to run, then you can do 540p on a High graphics preset and then DLSS upscales it to 1080p, and that would look better than running 1080p natively but having to use the Low preset to keep the same framerate.

Or, say, if your card isn't powerful enough to run 4k natively and keep a decent framerate even with the settings already turned down, then you can start with 1080p and have DLSS upscale it to 4k, which should theoretically give you better performance without having to sacrifice the resolution. What you really want to avoid is running at a resolution smaller than your monitor's native resolution, because that will stretch the image and generally look noticeably worse.

Again, assuming that DLSS works well enough that you do get the performance savings from running a natively lower resolution, but without significantly sacrificing image quality.
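To put rough numbers on the savings (illustrative pixel arithmetic only; it ignores the upscaler's own fixed cost, so real gains are smaller than the ratio):

```python
# Pixels shaded per frame for a couple of internal -> output resolution pairs.

def megapixels(width, height):
    return width * height / 1e6

pairs = [
    ("540p -> 1080p", (960, 540), (1920, 1080)),
    ("1080p -> 4K", (1920, 1080), (3840, 2160)),
]

for label, internal, output in pairs:
    shaded = megapixels(*internal)
    native = megapixels(*output)
    print(f"{label:<14} shade {shaded:.2f} MP instead of {native:.2f} MP "
          f"({shaded / native:.0%} of the native-res work)")

# Both cases shade roughly a quarter of the pixels per frame, which is where
# the headroom for higher presets or framerates comes from, provided the
# reconstruction really does look close enough to native.
```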

___

On the flipside, VSR and DSR are when the game is running at a higher resolution than is native to the monitor, but the GPU downscales the image to fit, such as a 1080p image downscaled to fit a 720p monitor.

This is a method of supersampling (anti-aliasing?), where the higher-res image, when downscaled to a smaller monitor, produces a sharper image.

And the reason why you want to do this, is if you have performance to spare: if you're already running on the High preset on your monitor's 720p native resolution and your GPU still has room to gallop, so to speak, you can use VSR/DSR to get even prettier visuals.

___

is that right?

lllllllllllllllllll
Feb 28, 2010

Now the scene's lighting is perfect!

Taerkar posted:

I've currently got a 1060 6GB version and it's really starting to show its age with certain things, pushing me to look to getting an upgrade. I'll be moving this card into my partner's PC so they'll get good use out of it, but I'm mostly looking to not spend too much. I'm thinking around $300 or so but I could go up to $400 if it's a good enough card to justify spending extra on it.

Right now I've got a 1920x1080 setup but I occasionally use a 4K tv for group stuff.

Might consider an Nvidia 2060 Super, which starts at $400 and is roughly 50% faster than the 1060 with 2 more GB of RAM. Not quite the jump it was from 9XX to 1XXX but noticeably faster. Below that I'm not sure it's worth buying a new card. Might want to wait a few more months to see what is being released.
https://www.gpucheck.com/en-usd/com.../high/high/-vs-

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
I think it's easier to just start calling the 2060S a 2070*, since that's what it is performance-wise.

Also, the EVGA 2060S SC Black is really low at the moment after a $20 promo code and rebate: https://www.newegg.com/evga-geforce-rtx-2060-super-08g-p4-3062-kr/p/N82E16814487485

BIG HEADLINE fucked around with this message at 09:10 on Apr 6, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

gradenko_2000 posted:

is that right?

You're glossing over a bit of how DLSS works compared to a normal up-scaling algorithm, but more or less, yeah.

DLSS = +performance, -image quality (though with DLSS 2.0 it's real hard to tell in most cases, and the performance gain is huge)
VSR = Supersampling for AMD; -performance, +image quality
DSR = Supersampling for NVidia; -performance, +image quality

In practice not too many people are running VSR/DSR, because the performance penalty vs image quality trade-off is usually not worth it except on older games where you've got plenty of GPU horsepower to spare. DLSS 2.0 looks to potentially be a game-changer, by granting massive performance gains at little to no discernible image quality loss.

Setset
Apr 14, 2012
Grimey Drawer
How does DLSS 2 compare to RIS?

repiv
Aug 13, 2009

RIS is just sharpening, it can only enhance details that are already there. It's fine for cleaning up the small blur you get from TAA, or a small amount of upscaling at a stretch, but it's fundamentally limited to whatever detail was already there in the input frame.

DLSS is a whole image reconstruction pipeline that can generate convincing 1440p frames out of input frames between 720p and 960p depending on the mode. No amount of sharpening alone will make 720p look like 1440p.
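As a minimal sketch of that difference (assuming NumPy; RIS/CAS itself is contrast-adaptive and runs on the GPU, this is just the core idea of a sharpen filter):

```python
# A basic 3x3 sharpen: each output pixel is a weighted sum of its neighbours,
# so only contrast that already exists in the frame gets amplified.
import numpy as np

def sharpen(img, amount=0.5):
    """Unsharp-mask style sharpen on a 2D grayscale array with values in [0, 1]."""
    sharpen_kernel = np.array([[ 0, -1,  0],
                               [-1,  5, -1],
                               [ 0, -1,  0]], dtype=float)
    identity = np.zeros((3, 3))
    identity[1, 1] = 1.0
    k = (1 - amount) * identity + amount * sharpen_kernel
    out = img.copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y, x] = np.clip((img[y-1:y+2, x-1:x+2] * k).sum(), 0.0, 1.0)
    return out

# A flat frame comes back unchanged: there is nothing for the filter to
# enhance. A reconstruction pipeline like DLSS instead pulls in information
# from previous frames (motion vectors + history) to synthesize detail the
# current low-res frame doesn't contain.
flat = np.full((8, 8), 0.5)
assert np.allclose(sharpen(flat), flat)
```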

Taerkar
Dec 7, 2002

kind of into it, really

lllllllllllllllllll posted:

Might consider an Nvidia 2060 Super, which starts at $400 and is roughly 50% faster than the 1060 with 2 more GB of RAM. Not quite the jump it was from 9XX to 1XXX but noticeably faster. Below that I'm not sure it's worth buying a new card. Might want to wait a few more months to see what is being released.
https://www.gpucheck.com/en-usd/com.../high/high/-vs-

Is that 'a few more months' certain or likely to be pushed back even more with current events?

The one BIG HEADLINE linked to does look rather inviting.

Cygni
Nov 12, 2005

raring to post

Taerkar posted:

Is that 'a few more months' certain or likely to be pushed back even more with current events?

The one BIG HEADLINE linked to does look rather inviting.

Nobody knows. Judging by past Nvidia releases and the leaks though, it will likely be high end first ($1k+ and $700 price points), then cascading new parts over the following months. So it may be a while before there is a replacement at the $400 price point.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Yeah I would guess December at the soonest for the x60. Even still, I wouldn't personally be doing that upgrade right now. I have never stopped hating the value proposition of Turing as an upgrade, and while it was improved for a while after the Super launch, with the next generation creeping into sight it's becoming worse than ever.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

K8.0 posted:

Yeah I would guess December at the soonest for the x60. Even still, I wouldn't personally be doing that upgrade right now. I have never stopped hating the value proposition of Turing as an upgrade, and while it was improved for a while after the Super launch, with the next generation creeping into sight it's becoming worse than ever.

Indeed. As lovely as it is to tell people to "wait for the next gen" in a thread where we regularly pan people for saying that, this time it's probably a bit more reasonable, especially at the lower-end. The big unanswered question right now is whether NVidia is going to push Tensor cores down below the RTX xx60 level or not--getting a 1660( |S|Ti) or lower right now only to have them drop a 3660 or whatever silliness they're going to use for a naming convention with Tensor cores in Dec is gonna make the new card potentially double the speed with DLSS enabled.

On the other hand, that's also 8 months away, which is a looooong time, especially for people cooped up with not a whole lot to do right now.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Epiphany moment when explaining that neither NVidia nor AMD have good drivers at the moment leads to de-bullshittification moment.

Is it more sensible to call RDNA 1 at this point, like... RDNA .5, and label RDNA 2 as RDNA 1 instead?

Because clearly the plan for AMD here is to get everyone and everything from the consoles to later discrete GPUs onto RDNA "2", so they're focusing all of their time and development on that, which has left anyone who picked up an RDNA 1 card in the lurch as they leave the drivers for that in a merely "mostly usable" state.

wargames
Mar 16, 2008

official yospos cat censor

SwissArmyDruid posted:

Epiphany moment when explaining that neither NVidia nor AMD have good drivers at the moment leads to de-bullshittification moment.

Is it more sensible to call RDNA 1 at this point, like... RDNA .5, and label RDNA 2 as RDNA 1 instead?

Because clearly the plan for AMD here is to get everyone and everything from the consoles to later discrete GPUs onto RDNA "2", so they're focusing all of their time and development on that, which has left anyone who picked up an RDNA 1 card in the lurch as they leave the drivers for that in a merely "mostly usable" state.

not just rdna, vega56 owner here, and for a while amd drivers were good but 2020 updates have been poo poo.

Setset
Apr 14, 2012
Grimey Drawer

SwissArmyDruid posted:

Epiphany moment when explaining that neither NVidia nor AMD have good drivers at the moment leads to de-bullshittification moment.

Is it more sensible to call RDNA 1 at this point, like... RDNA .5, and label RDNA 2 as RDNA 1 instead?

Because clearly the plan for AMD here is to get everyone and everything from the consoles to later discrete GPUs onto RDNA "2", so they're focusing all of their time and development on that, which has left anyone who picked up an RDNA 1 card in the lurch as they leave the drivers for that in a merely "mostly usable" state.

It’s possible that RDNA1 is unfixable at the moment. There were reports (that Adored guy) of AMD having horrible difficulties getting the drivers to function, well ahead of Navi’s launch. It’s possible RDNA2 came out so quickly because the only way to resolve the issue is an architectural change.

E: that said, I have a Navi card and it runs everything I throw at it very well.

Setset fucked around with this message at 13:14 on Apr 7, 2020

Seamonster
Apr 30, 2007

IMMER SIEGREICH

Lube banjo posted:

It’s possible RDNA2 came out so quickly because the only way to resolve the issue is an architectural change.


I literally posted this on pg 1901.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

SwissArmyDruid posted:

Epiphany moment when explaining that neither NVidia nor AMD have good drivers at the moment leads to de-bullshittification moment.

Is it more sensible to call RDNA 1 at this point, like... RDNA .5, and label RDNA 2 as RDNA 1 instead?

Because clearly the plan for AMD here is to get everyone and everything from the consoles to later discrete GPUs onto RDNA "2", so they're focusing all of their time and development on that, which has left anyone who picked up an RDNA 1 card in the lurch as they leave the drivers for that in a merely "mostly usable" state.

Well if that is the case...I'm totally down for the comedy that r/AMD will be when RDNA 1 ages like poo poo and Turing enters Finewine (tm) territory. RDNA 1 looks ripe to age poorly considering its lack of DX12 Ultimate level features, so it's going to be behind Turing/Ampere, RDNA 2 and the consoles in feature set.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Lube banjo posted:

It’s possible that RDNA1 is unfixable at the moment. There were reports (that Adored guy) of AMD having horrible difficulties getting the drivers to function, well ahead of Navi’s launch. It’s possible RDNA2 came out so quickly because the only way to resolve the issue is an architectural change.

E: that said, I have a Navi card and it runs everything I throw at it very well.

I'm guessing RTG just needs to get their act together on the software side. If it was fundamentally broken to that extent then compute wouldn't work (that FFT bug eventually got fixed), and I doubt they managed to break the pixel pipeline side of things that loving badly. To be blunt it's just a reworked GCN, it's not that different in the actual internal operation.

(there isn't really a need for a "clean sheet" redesign as long as they keep moving. You can trace the lineage of Turing all the way back to Fermi and even Tesla, it's just that NVIDIA makes major internal tweaks every generation. What really hurt AMD was doing basically nothing with GCN since its introduction. As long as they keep moving forward it's fine.)

Beautiful Ninja posted:

Well if that is the case...I'm totally down for the comedy that r/AMD will be when RDNA 1 ages like poo poo and Turing enters Finewine (tm) territory. RDNA 1 looks ripe to age poorly considering its lack of DX12 Ultimate level features, so it's going to be behind Turing/Ampere, RDNA 2 and the consoles in feature set.

I'm guessing the raster performance will age fine (goes along with AMD rarely updating their architectures), and people went into it eyes-open knowing that raytracing wasn't going to be there, ever. Drivers and the poo poo-tier hardware encoding are bigger factors in the purchase decision than RT IMO.

And conversely I'm not sure the RT performance on Turing is going to be anything viable in 5 years or whatever. Like, I guess it is balanced by what is going into consoles, and it's not like Sony and MS went nuts with the RT performance there either, but I fully expect Ampere to make a sizeable leap in RT performance, there just isn't enough RT there to run much of anything. The effects we're getting right now are just the very shallow end of the pool. Even with DLSS helping out, you just need more rays.

Paul MaudDib fucked around with this message at 20:18 on Apr 7, 2020

Arzachel
May 12, 2012
Besides, the big console titles will be built for previous gen for another year or two. That's what happened with the last console launch and I'd be surprised if the execs had magically become less risk-averse since then.

SwissArmyDruid
Feb 14, 2014

by sebmojo
I mean.... yes? Forcing a target switch between this gen and next and blowing up years and millions of dollars of work completely defeats the purpose of having a shared x86 codebase between this gen and the next.

In theory (and I cannot stress this enough), this should basically be like moving from a Sandy Bridge machine running Win 8 to a Zen2 machine running Win 10. The ceiling on performance should be higher, but there shouldn't be anything stopping the exact same code targeted for a PS4 from running on a PS5.

I say "in theory", because 1) plus or minus DRM schemes, 2) video games devs really, really, really, really have this hardon for tying game logic and physics to framerate, which leads to older games being utterly unplayable when the framerate gets too high, and 3) Microsoft is more likely to actually achieve this, what with UWP being the status quo for them and actually having to support desktop OS.

Mindblast
Jun 28, 2006

Moving at the speed of death.


Devs need to make their projects work more so than make them work elegantly. It loving sucks for us but I see how it happens. It's complicated enough as it is.

The number of developers going the distance the way current id Software does is small. And even that is in part due to their design goals demanding it.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

SwissArmyDruid posted:

I mean.... yes? Forcing a target switch between this gen and next and blowing up years and millions of dollars of work completely defeats the purpose of having a shared x86 codebase between this gen and the next.

It reduces, but does not eliminate, the effort to translate between generations. A lot of console dev work is spent tuning things to that specific blob of hardware to get it to work, let alone work well. Considering that the new generations will have different RAM layouts, different GPUs, etc., it quickly gets non-trivial. If it "just worked" the way you suggest, XBox -> PC ports would take a few hours of recompiling and that'd be it and they'd run fine and all would be happy flowers and roses. We know this isn't the case, despite the XBox and a Windows PC being incredibly similar in most ways.
