Cygni
Nov 12, 2005

raring to post

competitive twitch gaming is dumb, what's cool is trying to max out Subnautica without stuttering on loading (impossible)

please get it while it's free on the Epic store, it's so good

Sphyre
Jun 14, 2001

VelociBacon posted:

Post em? I'm wondering if you guys have had to adjust these much.



Still reeling at how badly I lost the silicon lottery on my XC :negative:

il serpente cosmico
May 15, 2003

Best five bucks I've ever spent.

TheFluff posted:

Yes, but almost nobody notices input lag (not even among pro esports players), so you don't need to bother really. It's measurable but placebo. It gets talked about in nerd reviews mostly because it's hard to measure so you need to seek out other turbonerds who care about it. It's a social cue more than it is a meaningful performance metric. This is especially true on high refresh rate monitors where the frame times are low to begin with.

Yeah, I highly doubt I'll notice input lag when framerates are this high, but this is easy enough for me to set up this way, so I might as well.

Input lag on older games can be a bitch, though. I remember trying to play Mike Tyson's Punchout on a PC emulator / LCD monitor and literally not being able to react fast enough to certain punches, and I had to start guessing when to dodge. I ended up using a higher quality emulator with much lower input lag, and I was able to read and react to what was happening.

Prior to setting things up like I mentioned before, I typically left V-Sync off in the Nvidia control panel, and if the game I was playing had a frame limiter, I would use that; otherwise I'd just turn on V-Sync to cap games at 144 Hz. I doubt I'll feel a difference between this and using a third-party frame limiter, but part of my reasoning is that lag is cumulative. If the game is online, network input lag stacks on top of monitor input lag, system input lag, and input lag introduced by the game. Since it's easy to optimize a G-Sync setup, I might as well.
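
To make the stacking concrete, a toy end-to-end lag budget (every number below is an assumption for illustration, not a measurement):

code:

# Toy end-to-end input lag budget. Every value is made up for
# illustration; real numbers vary wildly by game, network, and hardware.
budget_ms = {
    "input device polling": 4.0,
    "game simulation + render queue": 20.0,
    "display processing + scanout": 15.0,
    "network (one-way)": 25.0,
}
print(f"total: {sum(budget_ms.values()):.0f} ms")  # trimming any stage lowers the sum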

il serpente cosmico
May 15, 2003

Best five bucks I've ever spent.
Regarding the 20ms thing, the difference in frame time between a 30FPS game and a 60FPS game is 16.67ms. Pretty much anyone who plays games can notice the look and feel of this, and I think the person playing at 60 FPS would have a noticeable reaction advantage, so what makes this so different from non-FPS-related input lag of roughly the same amount?
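
To spell out the arithmetic behind those numbers (a trivial sketch):

code:

# Frame time in milliseconds is 1000 / FPS.
for fps in (30, 60, 144):
    print(f"{fps} FPS -> {1000 / fps:.2f} ms per frame")  # 33.33, 16.67, 6.94
print(f"30 vs 60 FPS delta: {1000 / 30 - 1000 / 60:.2f} ms")  # 16.67 ms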

In the earlier example I gave of Mike Tyson's Punchout, I wasn't able to notice the input lag playing it casually on the first several fights on the first setup, but once I progressed to the point where things got much more reaction-based, I did much better on the lower input lag setup. I might not have felt the difference in a blind test, though.

And like I mentioned earlier, input lag is cumulative, and there are many components that introduce it. Lowering it in as many stages as you can seems helpful.

EDIT: Fighting games have 1-frame windows for certain links that only give you 16.67ms to time right. I would think the people who get really good at those would think 20ms of input lag is important?

il serpente cosmico fucked around with this message at 10:04 on Dec 15, 2018

coke
Jul 12, 2009

Craptacular! posted:

Judder when the scene suddenly changes and the buffer isn't ready.

I maintain the input lag added by v-sync at such high refresh is minimal enough to be negligible to most people. (CSGO pros aren't most people.)

Oh I see the disconnect between the two arguments.

One is saying that low input lag is not noticeable.

The other is saying less input lag will give you an advantage.



I’m going to agree with the second and say that lower input lag is always an advantage even though it might not be noticeable because who the hell sees lag as a good thing??

coke fucked around with this message at 12:53 on Dec 15, 2018

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE
Just to put it out there, I'd like to apologize for being overly confrontational yesterday.

il serpente cosmico posted:

Regarding the 20ms thing, the difference in frame time between a 30FPS game and a 60FPS game is 16.67ms. Pretty much anyone who plays games can notice the look and feel of this, and I think the person playing at 60 FPS would have a noticeable reaction advantage, so what makes this so different from non-FPS-related input lag of roughly the same amount?

In the earlier example I gave of Mike Tyson's Punchout, I wasn't able to notice the input lag playing it casually on the first several fights on the first setup, but once I progressed to the point where things got much more reaction-based, I did much better on the lower input lag setup. I might not have felt the difference in a blind test, though.

And like I mentioned earlier, input lag is cumulative, and there are many components that introduce it. Lowering it in as many stages as you can seems helpful.

EDIT: Fighting games have 1-frame windows for certain links that only give you 16.67ms to time right. I would think the people who get really good at those would think 20ms of input lag is important?

People who can hit 1-frame windows in fighting games and speedruns aren't doing it with reaction time, they're latching on to a cadence. It's kinda like playing tight music - you can get extremely precise timings that way.

The reason it's hard to notice whether two stimuli happen at the same time or not, or whether there's a delay between a physical action you take and something happening in reaction to it, is that your brain is actively trying to fool you into thinking that there isn't a delay. It does this because there's a huge amount of input lag in the nervous system itself, and noticing that all the time would be distracting, so your perception does its best to cover that irritating detail up. You can most likely train yourself out of that to a certain extent, but not completely.

Here's a fascinating twitter thread about it that was one of the reasons I started reading up on these things:

https://twitter.com/analogist_net/status/1014397203450744832

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

coke posted:

Oh I see the disconnect between the two arguments.

One is saying that low input lag is not noticeable.

The other is saying less input lag will give you an advantage.



I’m going to agree with the second and say that lower input lag is always an advantage even though it might not be noticeable because who the hell sees lag as a good thing??

Of course it's an advantage, I'm not disputing that. The question is whether it's a meaningful advantage that you should be worrying about, which I'd argue it's not. If you do a purely theoretical simulation (like Blurbusters did in one of the articles I linked some posts back) where two players meet each other and the one who shoots first wins, and you make everything that impacts delay the same, including player reaction time, then of course the player who gets the stimulus first will win every time. If you then assume very fast reaction times, introduce a (very low) variance in reaction time between players (Blurbusters used a 170ms reaction time between seeing the other player and triggering the fire button, with a 20ms standard deviation), and keep everything exactly equal except for a 15ms delay between the game and the display for one of the players, then the player with less delay will get the huge advantage that K8.0 goes on about, and win 70% of such duels.

That kind of calculation considered in a vacuum really isn't meaningful, though. Actual gaming scenarios are not purely reaction-time based; the vast majority of players do not react that fast; the variance in human reaction time is far greater than a 20ms standard deviation under real gaming conditions; there are many other sources of variance in input lag in play (including differences in ping time and operating system interrupt handling times, for example); and in the case of multiplayer the server usually doesn't tick very fast anyway (IIRC Overwatch ticks at 60Hz, and the client update rate is lower than that). The more factors that introduce variance in reaction times, the less that 15ms of input lag difference from your monitor matters. 15ms is actually quite a short time, especially considering that a 50ms ping is usually considered pretty good.
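
If you want to sanity-check both halves of that, here's a minimal Monte Carlo sketch of the Blurbusters-style duel (the 170ms mean, 20ms standard deviation, and 15ms display delay are their numbers; the larger spreads are assumptions to show the variance effect):

code:

import random

TRIALS = 200_000
MEAN_MS = 170.0          # Blurbusters' assumed mean reaction time
DISPLAY_DELAY_MS = 15.0  # extra display latency for the "slow" player

# Win rate for the low-lag player as reaction time variance grows.
for stddev in (20.0, 50.0, 100.0):  # 20ms is Blurbusters'; the rest are assumptions
    wins = 0
    for _ in range(TRIALS):
        fast = random.gauss(MEAN_MS, stddev)
        slow = random.gauss(MEAN_MS, stddev) + DISPLAY_DELAY_MS
        if fast < slow:  # low-lag player's shot lands first
            wins += 1
    print(f"stddev {stddev:5.0f} ms -> low-lag player wins {wins / TRIALS:.1%}")
# Roughly 70% at a 20ms stddev, shrinking toward a coin flip as variance grows.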

There's no reason not to reduce the input lag of course, but if you can't be bothered to janitor your settings with third party tools, don't worry too much about it as a casual player. Except if you have a 60Hz monitor, then it can be worth worrying about (or you can just get a 144Hz monitor instead).

TheFluff fucked around with this message at 13:36 on Dec 15, 2018

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Craptacular! posted:

Judder when the scene suddenly changes and the buffer isn't ready.

I maintain the input lag added by v-sync at such high refresh is minimal enough to be negligible to most people. (CSGO pros aren't most people.)

Bad frame pacing is just variation in latency. It's bad at low framerates, it's also bad at high framerates when you're throwing random frames away because you're overrunning the buffer. People correctly bitched about SLI micro-stutter for years because having a high framerate is still a lovely experience if the frames aren't representing well-distributed samples in in-game time.
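
To illustrate with made-up numbers, here are two hypothetical frame-time logs with identical average framerate and very different pacing (a toy sketch, not real measurements):

code:

import statistics

# Two hypothetical frame-time logs, both averaging exactly 100 FPS.
even_pacing   = [10.0] * 8                                    # 10 ms every frame
micro_stutter = [5.0, 15.0, 5.0, 15.0, 5.0, 15.0, 5.0, 15.0]  # alternating

for name, times in (("even", even_pacing), ("stutter", micro_stutter)):
    avg_fps = 1000 / statistics.mean(times)
    jitter = statistics.pstdev(times)  # frame-time standard deviation
    print(f"{name}: {avg_fps:.0f} FPS average, {jitter:.1f} ms jitter")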

You can maintain something objectively wrong, but it's still objectively wrong. Neither of you has made even the slightest attempt to explain how moving yourself 20ms to the left in the distribution of human reaction times isn't a significant advantage to everyone. The spread of average human reaction times is in the region of 100ms. It doesn't matter if you and the other players you're trying to defeat are at the slow end of the reaction times; it doesn't matter if you and the other players you're trying to defeat have huge variances in reaction times (you don't, btw); you are still going to improve your performance a very significant, extremely noticeable amount by reducing latency that much.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
I'm just sitting here waiting for variable refresh rates to appear on televisions :smith:

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

K8.0 posted:

Bad frame pacing is just variation in latency. It's bad at low framerates, it's also bad at high framerates when you're throwing random frames away because you're overrunning the buffer. People correctly bitched about SLI micro-stutter for years because having a high framerate is still a lovely experience if the frames aren't representing well-distributed samples in in-game time.

You can maintain something objectively wrong, but it's still objectively wrong. Neither of you has made even the slightest attempt to explain how moving yourself 20ms to the left in the distribution of human reaction times isn't a significant advantage to everyone. The spread of average human reaction times is in the region of 100ms. It doesn't matter if you and the other players you're trying to defeat are at the slow end of the reaction times; it doesn't matter if you and the other players you're trying to defeat have huge variances in reaction times (you don't, btw); you are still going to improve your performance a very significant, extremely noticeable amount by reducing latency that much.

Dude, you’ve been super confrontational for a couple pages and I mean... if we were talking politics or something, okay yeah, I’d get that.

But it’s response time man. There’s an easy way to make your argument and to be respectful to the other dude you disagree with -assuming the dude is arguing in good faith which I believe he is.

It’s the GPU thread. Be cool, friend.

endlessmonotony
Nov 4, 2009

by Fritz the Horse

coke posted:

Oh I see the disconnect between the two arguments.

One is saying that low input lag is not noticeable.

The other is saying less input lag will give you an advantage.



I’m going to agree with the second and say that lower input lag is always an advantage even though it might not be noticeable because who the hell sees lag as a good thing??

Input lag on sub-20ms levels is literally impossible to notice for 99.9% of the human population. It can make a difference in professional gaming, but a lot of settings fuckery to avoid it is... questionably worth it if you have no money on the line?

Worf
Sep 12, 2017

If only Seth would love me like I love him!

I suspect K9.0 will be much nicer, almost like a faithful puppy

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

When I worked on VR headsets, a 20ms difference in motion-to-photon latency would have been considered pretty catastrophic. Generally we wanted total latency to be around that, or people started getting woozy. From playing with debug settings and badly-tuned software, I can tell you that I would bet a lot of money that I could tell you which VR config was 20ms more latent, and I am the farthest thing from a twitch gamer you can imagine.

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Subjunctive posted:

When I worked on VR headsets, a 20ms difference in motion-to-photon latency would have been considered pretty catastrophic. Generally we wanted total latency to be around that, or people started getting woozy. From playing with debug settings and badly-tuned software, I can tell you that I would bet a lot of money that I could tell you which VR config was 20ms more latent, and I am the farthest thing from a twitch gamer you can imagine.

I don't doubt that at all, and you can also very clearly see input lag if it's visualized in a different way, like on a touch screen for example:

https://www.youtube.com/watch?v=vOvQCPLkPt4&t=51s

That's an entirely different class of perception, though. The difficulty in seeing lag between moving your finger and something happening on the screen is caused by the brain trying to cover up your own built-in input lag (that is, the nerves are really slow at transmitting signals).

TheFluff fucked around with this message at 15:45 on Dec 15, 2018

Indiana_Krom
Jun 18, 2007
Net Slacker
Frame limiting to 141 FPS makes the game engine wait ~7 ms between the start of each frame. So it starts a frame, the CPU does the simulation and feeds the data to the GPU, which renders the frame and then scans it out when it's done. If less than 7 ms has passed since starting the last frame, it waits until 7 ms has passed and then repeats the process. If more than 7 ms has already passed, it starts rendering the next frame immediately. The frame rate will never exceed ~141 FPS, so the GPU will always be able to scan out the next frame to the display as soon as it is done; frames never sit in buffers, and latency is minimized.

The thing is, Nvidia put quite a bit of effort into their frame pacing, so when you enable double-buffered vsync the Nvidia driver will lie to the game about when it is ready to start the next frame and make the game wait until roughly 6.94 ms after the start of the last frame before starting the next one. At that point the only difference is that frame limiting does "wait and then render" while vsync does "render and then wait". The amount of additional latency caused by vsync in that scenario is always going to be trivial, less than one refresh interval.
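
Roughly, the "wait and then render" loop looks like this (a sketch of the idea in Python, not any particular limiter's implementation; simulate and render are placeholders for the game's actual work):

code:

import time

TARGET_FRAME_TIME = 1.0 / 141  # ~7.09 ms per frame for a 141 FPS cap

def run_frames(simulate, render, frames):
    # "Wait and then render": burn off any leftover frame budget before
    # starting the next frame, so each finished frame can scan out
    # immediately instead of sitting in a buffer. (Double-buffered vsync
    # does the opposite order: render first, then wait on the buffer flip.)
    next_start = time.perf_counter()
    for _ in range(frames):
        now = time.perf_counter()
        if now < next_start:              # less than ~7 ms since last start:
            time.sleep(next_start - now)  # wait out the remainder
        # otherwise, start the next frame immediately
        next_start = time.perf_counter() + TARGET_FRAME_TIME
        state = simulate()                # CPU does the simulation
        render(state)                     # GPU renders, scans out when done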

You will only run into latency trouble if you didn't change the one critically important latency setting in the Nvidia control panel / global 3D settings, labeled "Maximum pre-rendered frames". The description even says this is how many frames ahead the driver will let the CPU queue up; this setting is absolutely a big fat high-latency buffer any time your CPU is faster than your GPU or display. If it's anything other than 1, you are setting yourself up for significantly increased input latency. You can even test it yourself: disable gsync, set your refresh rate to 60 Hz, force vsync, and then play a game you could easily break 144 Hz in while adjusting that setting. You can even set vsync to half-refresh adaptive, which will let you push the latency of that buffer up to over 133 ms (4 frames ahead at 33.3 ms/frame). Beware that some games also have their own render-ahead setting, which can and often will be stacked on top of the driver's.

Granted, latency flies right out the window when you are using AFR SLI, which requires deeper and therefore higher-latency buffers just for the technology to function at all. At a minimum, the latency of AFR SLI is going to be double the latency of a single GPU at the same frame rate, or equal to a single GPU at half the frame rate.

Indiana_Krom fucked around with this message at 15:57 on Dec 15, 2018

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

TheFluff posted:

That's an entirely different class of perception, though.

I misunderstood then: I thought people were arguing about the human brain’s ability to perceive latency between the person’s action and an optically-sensed result. Isn’t that what’s at issue in all of these configurations, including touch and VR?

MagusDraco
Nov 11, 2011

even speedwagon was trolled

Zedsdeadbaby posted:

I'm just sitting here waiting for variable refresh rates to appear on televisions :smith:

There are a few Samsung TVs that support variable refresh rate, but only at 1080p and only FreeSync, of course.

They support VRR a bit at 4K too, but only from 48 Hz to 60 Hz. At 1080p it's 20 Hz to 120 Hz, but it depends on the TV, with some models in the same line not supporting 120 Hz (usually the smaller screen sizes).

https://www.rtings.com/tv/tests/motion/variable-refresh-rate

codo27
Apr 21, 2008

Zedsdeadbaby posted:

I'm just sitting here waiting for variable refresh rates to appear on televisions :smith:

Hasn't been much on the BFG displays lately. Where do they stand?

Rabid Snake
Aug 6, 2004



Anyone having any issues with the latest Nvidia drivers where your high refresh rate monitor randomly refuses to 'wake up and register' after sleep mode? I wish I could downgrade drivers again, but Battlefield 5 forces the new drivers on you. It seems like other people are having the same problem on the Nvidia forums, but I wasn't sure if there was a workaround for it.

Rabid Snake fucked around with this message at 18:20 on Dec 15, 2018

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

codo27 posted:

Hasn't been much on the BFG displays lately. Where to they stand?

From what I've read, it seems that BFG is a set of standards made by Nvidia and not an actual television itself. I think the specs are something like 65", 4K resolution, HDR support, and G-Sync support. There's a handful of TVs that meet this standard, but none of them are available yet and may not be until early 2019. The prices are prohibitively high though, several thousand dollars.

AMD's FreeSync TVs aren't quite as fully fleshed out, and obviously AMD itself doesn't have any GPUs capable of driving those resolutions at high refresh rates, but they're close. Really hope Vega 2 makes a splash alongside TVs with wider FreeSync Hz ranges.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
If I understand what was being said correctly, and I have a 144 Hz G-Sync monitor and a PC easily capable of a locked 144 FPS, my best settings for the lowest possible input lag are:

vsync on in game settings
fps capped at 141 using nvidia inspector
gsync on

Is that all, did I miss anything?

Rabid Snake posted:

Anyone having any issues with the latest Nvidia drivers where your high refresh rate monitor randomly refuses to 'wake up and register' after sleep mode? I wish I could downgrade drivers again, but Battlefield 5 forces the new drivers on you. It seems like other people are having the same problem on the Nvidia forums, but I wasn't sure if there was a workaround for it.


My problem is that my monitors refuse to turn off from inactivity after the most recent drivers. Something in GeForce Experience is constantly running in the background. Closing it fixes it.

Stickman
Feb 1, 2004

Fauxtool posted:

If I understand what was being said correctly, and I have a 144 Hz G-Sync monitor and a PC easily capable of a locked 144 FPS, my best settings for the lowest possible input lag are:

vsync on in game settings
fps capped at 141 using nvidia inspector
gsync on

Is that all, did I miss anything?

Almost, but it's typically recommended to turn off in-game vsync and use "fast" vsync from the Nvidia control panel.

E: Apparently fast vsync is DX11-only. If you're consistently running over max refresh in a DX12, OpenGL, or Vulkan game, you'll want to use the normal vsync option.

Fauxtool posted:

My problem is that my monitors refuse to turn off from inactivity after the most recent drivers. Something in GeForce Experience is constantly running in the background. Closing it fixes it.

This might be related to default Broadcast LIVE settings, per this post. See if this fixes the problem!

quote:

Noticed that after installing the latest Nvidia driver update, 390.77, my Windows 10 Creators 64-bit screensaver stopped working. After hours of trying the various listed fixes, I discovered that Nvidia installs the driver with Broadcast LIVE set to ON, which prevents the screensaver from starting because LIVE is a background process. To remedy this and get the screensaver working again, you need to manually turn the Broadcast LIVE setting OFF in the GeForce Experience settings menu.

Hope this helps anyone having the same issue and saves wasted time trying to fix it.

Stickman fucked around with this message at 20:14 on Dec 15, 2018

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Fauxtool posted:

If I understand what was being said correctly, and I have a 144 Hz G-Sync monitor and a PC easily capable of a locked 144 FPS, my best settings for the lowest possible input lag are:

vsync on in game settings
fps capped at 141 using nvidia inspector
gsync on

Is that all, did I miss anything?

I'll give you the tldr first and then explain it:
Set a global FPS cap at (refresh rate - 4) in RTSS, not Nvidia Inspector.
Force Vsync on through Nvidia control panel.

For any given game:
Turn vsync off in-game.
If tearing appears, turn vsync on in-game.
Optional slight effort (which for some reason makes people decide not to use gsync at all): if the game has a framerate limiter, google whether it's better than RTSS. If it is, make an RTSS profile for that game that disables the framerate cap, and cap at (refresh rate - 4) in game.

That's it. Now the why.

Nvidia Inspector does not do a particularly good job of reducing latency - in well engineered games, it's barely better than vsync with no cap.
RTSS does a pretty good job of reducing latency.
Some game FPS limiters do an exceptionally good job of reducing latency (most competitive games). Some are absolute garbage (GTA V is notorious for this).

Generally, forcing vsync through the drivers is ideal, because in some games turning vsync on will add more buffers and gently caress everything up, but in some games forcing vsync through the control panel won't work. I think the safest general approach is to force vsync on through the control panel, leave it off in-game, and if you still get tearing, turn it on in-game.

If you want to dive into latency stuff in general for games, BattleNonsense on youtube has great information in general.

As far as I know, all this is still correct. Someone up the page commented about Nvidia making improvements to the driver's frame pacing, but I have not seen any testing that shows this.

sauer kraut
Oct 2, 2004

Rabid Snake posted:

Anyone having any issues with the latest Nvidia drivers where your high refresh rate monitor randomly refuses to 'wake up and register' after sleep mode? I wish I could downgrade drivers again, but Battlefield 5 forces the new drivers on you. It seems like other people are having the same problem on the Nvidia forums, but I wasn't sure if there was a workaround for it.

Is it a Dell monitor by chance? A very popular model they sold has a unique bug with standby stuff that you need to disable in the monitor menu.

Also, please stop it with the sync stuff; you can talk endlessly about it. You still have to check and tweak it for every game just like a decade ago, since frame limiters and 'forced' driver settings don't work reliably at all in my experience.
I just tried to wrangle Fallout 4 and it went lol nope to any driver setting, ingame vsync or bust.

sauer kraut fucked around with this message at 21:07 on Dec 15, 2018

Strong Sauce
Jul 2, 2003

You know I am not really your father.





I used to be able to record my entire desktop with ShadowPlay, but using the new GeForce Experience 3 it seems like I can only record video for games that are "supported" by Nvidia. Is there a way to record my entire desktop?

edit: nevermind. gf exp 3 is just broken as poo poo and restarting it fixed the issue.

Strong Sauce fucked around with this message at 21:58 on Dec 15, 2018

Craptacular!
Jul 9, 2001

Fuck the DH

Fauxtool posted:

If I understand what was being said correctly, and I have a 144 Hz G-Sync monitor and a PC easily capable of a locked 144 FPS, my best settings for the lowest possible input lag are:

vsync on in game settings
fps capped at 141 using nvidia inspector
gsync on

You have it basically down, but many competitive games have either a finely controlled FPS limiter or a number of settings. Overwatch has a limiter built into the game that you can set at 141. Quake Champions lets you set an FPS cap at the command line. League and Warframe let you set an FPS cap of 120, which is within G-Sync range and isn't notably different from 144. Dota 2, the game pros play for millions of dollars, automatically caps its FPS at 120.

If you’re seeking “playing Rocket League for money” precision you can do all that RTSS bullshit K8 suggests, but gently caress that noise imo.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.
Turns out the Vega FE blows rear end in an SFF case and even undervolting doesn't help

Craptacular!
Jul 9, 2001

Fuck the DH

sauer kraut posted:

Also please stop it with sync stuff, you can talk endlessly about it. You still have to check and tweak it for every game just like a decade ago, since frame limiters and 'forced' driver settings don't work reliably at all in my experience.
I just tried to wrangle Fallout 4 and it went lol nope to any driver setting, ingame vsync or bust.

Gamebryo engine (Fallout, Elder Scrolls, anything with Todd Howard's face selling it) ties its physics to the frame rate, so you don't want FPS over 60, at least without some significant modding, because, as Fallout 76 players found out, it causes objects to zip around the room like The Flash on heroin.

This is what happens when you develop one engine for 20 years and refuse to start over. :todd:

Stickman
Feb 1, 2004

Strong Sauce posted:

I used to be able to record my entire desktop with ShadowPlay, but using the new GeForce Experience 3 it seems like I can only record video for games that are "supported" by Nvidia. Is there a way to record my entire desktop?

Apparently it has something to do with the new privacy features? There's a somewhat convoluted process to get the option to turn it off to show up.

E: Glad it "fixed" itself!

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
Has anyone messed around with undervolting their Turing cards yet? I had pretty good luck with it on my MSI 1080 ti.

I have the MSI 2080 Ti Gaming X Trio undervolted to 0.962V at 1965 MHz at the moment, which dropped the power consumption from 330W at stock settings to around 270W. Because of this I can run my fans at around 40% and maintain temps around 70C. 0.975V will pull 300W and 1V will pull the max 330W the BIOS allows, so there's not a ton of room to add voltage given the max power limit.

Xerophyte
Mar 17, 2008

This space intentionally left blank

Craptacular! posted:

This is what happens when you develop one engine for 20 years and refuse to start over. :todd:

I seem to be posting this in a lot of threads but: you know that Source 2 and idTech 5/6 are ultimately both directly derived from idTech 2, which ran the original Quake, right? Likewise for UE4 and the original Unreal. I'm sure that all traces of the original 90s code and structure are long gone in the current revisions but getting there has been an iterative process. Epic and id aren't doing that to be idiosyncratic, they're doing it because iterating is much more efficient than starting over. Code doesn't rust: building on an existing foundation, keeping what works and fixing what doesn't gets you further, faster as a rule.

NetImmerse, Gamebryo and Creation in general, and the Bethesda-specific physics modules in particular, have and have had their stupid quirks and bugs. I'm sure a lot of parts have tech debt that might in some cases best be cleared by rewriting from scratch using a more modern approach. Creation, as far as I understand, was an effort to throw out a lot of the core Gamebryo structure that didn't really fit their modern use case and build a new framework around their existing modules, similar to what other engines do with their major rewrites. Their situation would almost certainly not have been improved by throwing out absolutely everything and starting over. No one does that, for good reasons.

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib
Eh, Gamebryo/Creation does have very annoying limitations and problems that are pretty obvious and related to the way the engine handles data structures/loads environment cells. It's a lovely engine that hitches constantly even on very high-end computers. The rewrite can't have been all that comprehensive.

Lambert fucked around with this message at 22:48 on Dec 15, 2018

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
The reason some of the fundamentals from id Tech and Unreal have been carried forward is that they were well designed to begin with and have been carefully shepherded with forward-thinking engineering. Bethesda made a rushed, incompetent mess and has spent 16 years making a worse mess.

That said, fixing the physics bug is really easy; it's a single-line ini edit that takes like 30 seconds. The really hilarious part is that all you do is divide 1 by your framerate to get the number, but that's too complex for Bethesda themselves to do automatically.
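
For the curious, the number is literally just 1 divided by your framerate. Community guides for the Bethesda games put it under [HAVOK] in the ini as fMaxTime; treat that key name as an assumption and check a guide for your specific game:

code:

# Physics timestep for a given refresh rate: just 1 / framerate.
# Community guides put this under [HAVOK] in the game's ini as fMaxTime
# (assumed key name; verify against a guide for your specific game).
refresh_hz = 144
print(f"fMaxTime={1 / refresh_hz:.5f}")  # fMaxTime=0.00694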

Xerophyte
Mar 17, 2008

This space intentionally left blank

Lambert posted:

Eh, Gamebryo/Creation does have very annoying limitations and problems that are pretty obvious and related to the way the engine handles data structures/loads environment cells. It's a lovely engine that hitches constantly even on very high-end computers. The rewrite can't have been all that comprehensive.

Sure, but the way to fix that is emphatically not to throw away the entire engine and start over from scratch. I'm not saying that Creation is a great engine: I'm saying that if they'd decided to write a new one it would've been worse.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Craptacular! posted:

The Flash on heroin.

When I think of going fast, I think of heroin.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

Stickman posted:


This might be related to default Broadcast LIVE settings, per this post. See if this fixes the problem!

it was definitely this. I always had the live replay off but the latest update turned it on for me.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

Craptacular! posted:

You have it basically down, but many competitive games have either a finely controlled FPS limiter or a number of settings. Overwatch has a limiter built into the game that you can set at 141. Quake Champions lets you set an FPS cap at the command line. League and Warframe let you set an FPS cap of 120, which is within G-Sync range and isn't notably different from 144. Dota 2, the game pros play for millions of dollars, automatically caps its FPS at 120.

If you’re seeking “playing Rocket League for money” precision you can do all that RTSS bullshit K8 suggests, but gently caress that noise imo.

pro level bullshit you are nowhere near good enough to need is what pc gaming is all about baby. My professional level setup is all that barely keeps me on par with the 12 year olds on adderall

Craptacular!
Jul 9, 2001

Fuck the DH

Fauxtool posted:

pro level bullshit you are nowhere near good enough to need is what pc gaming is all about baby. My professional level setup is all that barely keeps me on par with the 12 year olds on adderall

I realized this morning that the best feature of 144 Hz really isn't 144 FPS, it's a locked 120 FPS within adaptive sync range and no extraneous bullshit.

Xerophyte posted:

Their situation would almost certainly not have been improved by throwing out absolutely everything and starting over. No one does that, for good reasons.

I didn’t mean starting over from scratch. “Starting over” means taking one of the better behaving engines and adapting it to do the kind of games they make. Running things on your own engine works for Rockstar but at this point it’s clear Bethesda isn’t Rockstar.

Keep in mind that we’re talking about the company that is married by corporate to id Software. This is why the announcement that they’ll reuse this engine yet again for TES and Starfield when they have the best engine coders in the industry virtually next door seems so very :tif:

Craptacular! fucked around with this message at 00:17 on Dec 16, 2018

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Craptacular! posted:

I realized this morning that the best feature of 144 Hz really isn't 144 FPS, it's a locked 120 FPS within adaptive sync range and no extraneous bullshit.


I didn’t mean starting over from scratch. “Starting over” means taking one of the better behaving engines and adapting it to do the kind of games they make. Running things on your own engine works for Rockstar but at this point it’s clear Bethesda isn’t Rockstar.

Keep in mind that we’re talking about the company that is married by corporate to id Software. This is why the announcement that they’ll reuse this engine yet again for TES and Starfield when they have the best engine coders in the industry virtually next door seems so very :tif:

It's almost certainly extremely costly for them to port over the toolkit built on top of the game engine (which is what causes most of the bugs) and also to retrain their game devs.

Indiana_Krom
Jun 18, 2007
Net Slacker

Craptacular! posted:

I realized this morning that the best feature of 144hz really isn’t 144 FPS, it’s a locked 120 FPS within adaptive sync range and no extraneous bullshit.

240 Hz monitors are even better; pretty much none of that bullshit ever applies, because actually exceeding the refresh limit enough for it to matter is borderline impossible.
