fishmech
Jul 16, 2006

by VideoGames
Salad Prong

SwissArmyDruid posted:

I'm not saying I think you're crazy... but I think a blind A-B experiment with multiple pairs of identical monitors running at different maximum refresh rates should be done to prove your assertion.

Two things:
1) The most important aspect of a 240Hz monitor is that it's fast enough to display 120Hz 3D content using a shutter-glasses method (120Hz per eye), which is nice for a certain style of games and other media.

2) The human eye and brain can tell the difference between quite high framerates just fine, even if, the higher you go, the less of the additional frames per second you can really perceive. And depending on the game in question, there can be some really nice improvements in how responsive the controls feel.

New Zealand can eat me
Aug 29, 2008

:matters:


Generic Monk posted:

i'm sorry but that's loving mental

It really isn't. It's good you're sorry, though; it's not good to call people that. In all seriousness, it's a 1.67x improvement over 144Hz. When you eventually experience it for yourself, you'll feel silly for having said this, I promise.

SwissArmyDruid posted:

I'm not saying I think you're crazy... but I think a blind A-B experiment with multiple pairs of identical monitors running at different maximum refresh rates should be done to prove your assertion.

Everyone I've had over agrees the difference is immediately obvious. It's not unlike half-wearing a VR headset and just looking at the screens, except those have a bit more latency.

FWIW, I was skeptical myself until I had multiple friends go to QuakeCon and play on 240Hz monitors. Their "butt dyno" reviews were what convinced me it was time to upgrade. I originally intended to hold out until we had the bandwidth for 10-bit 144Hz, but that seems to be a ways off yet.

A few friends have been like "what? No that's stupid" and then I'm like "no really, move the mouse" and they react like they just touched the chrome goop in The Matrix. Just an immediate "whoa! okay yeah wow" (I am known for sometimes loving around, so their initial skepticism is warranted)

In a humorous/awesome twist of fate, Amazon actually sent me two AW2518HFs. I paid for the fast shipping and FedEx hosed up, so they sent another one UPS and never cancelled the FedEx shipment, which showed up a bit later. I'm almost tempted to just get a third, but there are too many issues with multi-monitor 240Hz FreeSync around capturing/streaming, so the second one is off most of the time.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
I'd be tempted once 27" 1440p models hit the market. I don't think I could bring myself to accept a 24" 1080p panel again at this point.

Also, the NVIDIA 3D Vision system is pretty much abandonware at this point, and I don't even know if it can run at 240 Hz.

taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

I didn't even know I wanted a 240Hz monitor until today but I know it would be great. Actually producing 1440p @ 240Hz video is going to take some serious juice though.

New Zealand can eat me
Aug 29, 2008

:matters:


I definitely agree; prior to 240Hz existing, I prioritized dot pitch and color accuracy over everything else, and was really happy with Dell's 25" 1440p. Having to compromise was pretty painful, but the color accuracy of this panel is surprisingly good. It's still obviously a TN when you're looking at it from some hilariously oblique angle, but otherwise it's not like you're sacrificing 20% of the sRGB space or anything wild anymore.

IIRC we don't have 1440p@240Hz for the same reason we don't have 10-bit 1080p@144Hz: we need to wait for a DisplayPort cable/standard that actually has enough bandwidth to carry that kind of signal.

Apparently PCI-E 4.0's 1.0 spec just got finalized, so the future isn't that far off (I'm not sure if this is one of the limitations, but I'm assuming doubling the available bandwidth is one of the prerequisites)

Edit: also wanted to say that the bigger pixels are important for serious shootmans

New Zealand can eat me fucked around with this message at 18:41 on Oct 29, 2017
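
The bandwidth claim above is easy to sanity-check with back-of-envelope math: uncompressed pixel data is width × height × refresh × bits per pixel, and real links add roughly 20% blanking overhead on top. A minimal sketch, assuming the commonly quoted DisplayPort payload rates (HBR2 for DP 1.2, HBR3 for DP 1.3/1.4):

```python
# Back-of-envelope display bandwidth, ignoring blanking overhead
# (real timings add roughly 20% on top of these figures).

def raw_gbps(width, height, hz, bits_per_pixel):
    """Uncompressed pixel data rate in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

DP12_HBR2 = 17.28  # usable payload of DP 1.2, Gbit/s
DP14_HBR3 = 25.92  # usable payload of DP 1.3/1.4, Gbit/s

for label, w, h, hz, bpp in [
    ("1080p 240Hz 8-bit",  1920, 1080, 240, 24),
    ("1440p 144Hz 10-bit", 2560, 1440, 144, 30),
    ("1440p 240Hz 8-bit",  2560, 1440, 240, 24),
]:
    print(f"{label}: ~{raw_gbps(w, h, hz, bpp):.1f} Gbit/s raw")
```

1440p@240Hz comes out around 21 Gbit/s raw; with blanking overhead that lands right at the edge of HBR3, consistent with the post's point that the display link, not the GPU, is the limiting factor.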

Generic Monk
Oct 31, 2011

New Zealand can eat me posted:

I definitely agree; prior to 240Hz existing, I prioritized dot pitch and color accuracy over everything else, and was really happy with Dell's 25" 1440p. Having to compromise was pretty painful, but the color accuracy of this panel is surprisingly good. It's still obviously a TN when you're looking at it from some hilariously oblique angle, but otherwise it's not like you're sacrificing 20% of the sRGB space or anything wild anymore.

IIRC we don't have 1440p@240Hz for the same reason we don't have 10-bit 1080p@144Hz: we need to wait for a DisplayPort cable/standard that actually has enough bandwidth to carry that kind of signal.

Apparently PCI-E 4.0's 1.0 spec just got finalized, so the future isn't that far off (I'm not sure if this is one of the limitations, but I'm assuming doubling the available bandwidth is one of the prerequisites)

Edit: also wanted to say that the bigger pixels are important for serious shootmans

i didn't know they made a tn version of the u2515h. i have that very monitor and it is extremely needs suiting


e: ohh you're talking about your 240hz monitor. my bad

taqueso posted:

I didn't even know I wanted a 240Hz monitor until today but I know it would be great. Actually producing 1440p @ 240Hz video is going to take some serious juice though.

not if you're a true patrician and don't play anything newer than 2004

Generic Monk fucked around with this message at 18:52 on Oct 29, 2017

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy

New Zealand can eat me posted:

I definitely agree; prior to 240Hz existing, I prioritized dot pitch and color accuracy over everything else, and was really happy with Dell's 25" 1440p. Having to compromise was pretty painful, but the color accuracy of this panel is surprisingly good. It's still obviously a TN when you're looking at it from some hilariously oblique angle, but otherwise it's not like you're sacrificing 20% of the sRGB space or anything wild anymore.

IIRC we don't have 1440p@240Hz for the same reason we don't have 10-bit 1080p@144Hz: we need to wait for a DisplayPort cable/standard that actually has enough bandwidth to carry that kind of signal.

Apparently PCI-E 4.0's 1.0 spec just got finalized, so the future isn't that far off (I'm not sure if this is one of the limitations, but I'm assuming doubling the available bandwidth is one of the prerequisites)

Edit: also wanted to say that the bigger pixels are important for serious shootmans

If you use your gaming PC for any kind of graphical work, a TN panel is still complete garbage. As a web developer, it can leave me simply unable to see something in my implementation of a graphical design; it can make solid background colours appear as a slight gradient. It doesn't matter that others on a TN panel probably won't notice all of the flaws, because you can bet your arse that the designer is on a Mac and can see everything just fine.

So yeah, I just want TN to gently caress off already.

Generic Monk
Oct 31, 2011

New Zealand can eat me posted:

It really isn't. It's good you're sorry, though; it's not good to call people that. In all seriousness, it's a 1.67x improvement over 144Hz. When you eventually experience it for yourself, you'll feel silly for having said this, I promise.

in another sense though, the further up you go, the smaller the reduction in how long each frame sits on the screen becomes. i'm not saying that it's snake oil, just that by definition the returns diminish. maybe i'm just not sensitive to this poo poo though - i've used 60hz screens all my life and recently got the ipad pro with the 120hz display, and while i can notice a difference and it's very cool, i'd give that up before i gave up the great colour reproduction or hidpi. the only exception is working with the pencil, which is improved immeasurably

i don't doubt that i would have a different opinion if i was a pro level counterstrike player or something though. and i am of the opinion all displays should be 120hz so i can watch my stories without 3:2 pulldown

also if vr ever truly gets off the ground, give me the highest refresh rate humanly possible obviously

Generic Monk fucked around with this message at 19:04 on Oct 29, 2017
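
The diminishing-returns point is just frame-time arithmetic; a minimal sketch of the deltas:

```python
# Frame time in ms is 1000 / refresh rate. Each step up in refresh rate
# shaves fewer milliseconds off than the previous step did.
rates = [60, 120, 144, 240]

prev = None
for hz in rates:
    ms = 1000 / hz
    delta = f"  (saves {prev - ms:.1f} ms)" if prev is not None else ""
    print(f"{hz:>3} Hz -> {ms:5.1f} ms per frame{delta}")
    prev = ms
```

Going from 60Hz to 120Hz saves 8.3ms per frame, but 144Hz to 240Hz saves only another 2.8ms, which is the shrinking (but still real) win being described.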

New Zealand can eat me
Aug 29, 2008

:matters:


If you haven't even used a proper 144Hz FreeSync/G-Sync monitor, I'm not sure why you're even offering your opinion on high refresh rate displays :rolleyes: quite literally just theorycrafting when you could spend a few hundred dollars and stop lying to yourself

Measly Twerp posted:

If you use your gaming PC for any kind of graphical work, a TN panel is still complete garbage. As a web developer, it can leave me simply unable to see something in my implementation of a graphical design; it can make solid background colours appear as a slight gradient. It doesn't matter that others on a TN panel probably won't notice all of the flaws, because you can bet your arse that the designer is on a Mac and can see everything just fine.

72% NTSC == 99% sRGB, yeah? I understand your hatred for the general performance of TN panels, but this one is actually quite serviceable. I have it right next to my U2518D, and I find that I prefer this screen more often than not. The only thing that actually makes either screen look bad is the P3 displays on my MBP/iPad/iPhone
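
The 72%-NTSC-to-99%-sRGB rule of thumb comes from comparing gamut triangle areas in CIE 1931 xy space; a quick check using the standard published primaries (shoelace formula):

```python
# Compare the areas of the sRGB and NTSC (1953) gamut triangles in
# CIE 1931 xy chromaticity space.

def tri_area(p1, p2, p3):
    """Triangle area via the shoelace formula."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

SRGB = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]  # R, G, B primaries
NTSC = [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)]  # NTSC 1953 primaries

print(f"sRGB area / NTSC area = {tri_area(*SRGB) / tri_area(*NTSC):.0%}")
# -> ~71%
```

So a panel rated at ~72% NTSC has roughly the same triangle area as sRGB, though the triangles don't nest perfectly, which is why the equivalence is only a rule of thumb.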

Generic Monk
Oct 31, 2011

New Zealand can eat me posted:

If you haven't even used a proper 144Hz FreeSync/G-Sync monitor, I'm not sure why you're even offering your opinion on high refresh rate displays :rolleyes: quite literally just theorycrafting when you could spend a few hundred dollars and stop lying to yourself

i've been lying to myself all this time, you're right. i am actually gay for frametimes

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
I guess the other thing is that at least with a 24" TN panel you aren't getting as much color shift at the edges as you do with a 27" TN panel. That was my only real complaint about the S2716DG: you had to keep your head relatively still; if you slid around a couple inches you could see the corners start to shift.

New Zealand can eat me
Aug 29, 2008

:matters:


Paul MaudDib posted:

I guess the other thing is that at least with a 24" TN panel you aren't getting as much color shift at the edges as you do with a 27" TN panel. That was my only real complaint about the S2716DG: you had to keep your head relatively still; if you slid around a couple inches you could see the corners start to shift.

I have to be sitting uncomfortably close (like T-rex arms on the keyboard/mouse) and bend myself over my armrests to start to discolor the far side of the screen. I wouldn't have bought it if I felt it was going to be a serious compromise based on the stats, but I'm definitely inclined to agree with all of the 9/10 reviews. Prior to this I just had some garbo $200 Nixeus, so I'm acutely aware of how bad it can really be.

Generic Monk posted:

i've been lying to myself all this time, you're right. i am actually gay for frametimes

You're definitely something, going around calling people mental based on inherently wrong assumptions

brainwrinkle
Oct 18, 2009

What's going on in here?
Buglord
We are all very impressed by your 240 Hz monitor in the AMD CPU thread.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Generic Monk posted:

the new forza does a locked 4k60 on xbonex, and while that's not strictly a representative title it's plenty powerful enough to do 1080p60 in most games afaik.

Depends entirely on how CPU-centric the game is. It can still be a large bottleneck.

Generic Monk
Oct 31, 2011

New Zealand can eat me posted:

You're definitely something, going around calling people mental based on inherently wrong assumptions

please calm down, i am positive your monitor is exactly as good as you say it is x

New Zealand can eat me
Aug 29, 2008

:matters:


With the PCI-E 4 1.0 spec finalized, does that mean the Electromechanical Specification isn't far off? We already know for sure the additional IO isn't important to GPU performance, but being able to pull more power through the slot might be useful.

brainwrinkle posted:

We are all very impressed by your 240 Hz monitor in the AMD CPU and Platform thread.

FreeSync is an AMD brand, friend. Welcome to the thread; thanks for your contribution.

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy

New Zealand can eat me posted:

With the PCI-E 4 1.0 spec finalized, does that mean the Electromechanical Specification isn't far off? We already know for sure the additional IO isn't important to GPU performance, but being able to pull more power through the slot might be useful.


FreeSync is an AMD brand, friend. Welcome to the thread; thanks for your contribution.

You spelt platfrom incorrectly.

New Zealand can eat me
Aug 29, 2008

:matters:


:negative: gently caress

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

New Zealand can eat me posted:

With the PCI-E 4 1.0 spec finalized, does that mean the Electromechanical Specification isn't far off? We already know for sure the additional IO isn't important to GPU performance, but being able to pull more power through the slot might be useful.


FreeSync is an AMD brand, friend. Welcome to the thread; thanks for your contribution.

The electromechanical spec has been pretty nailed down for a while, but there has been some incorrect reporting on the power. Some sites were saying there would be 300W on the slot, which is NOT correct; it is still 75W over the slot. The supplemental power spec will now allow up to 300W total with additional connectors.
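
For reference, the standard power budget priznat is describing works out like this, assuming the usual connector limits (75W from the slot, 75W per 6-pin, 150W per 8-pin):

```python
# PCIe board power budget: the slot supplies at most 75 W; anything
# beyond that comes from supplemental connectors.
SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def board_power(*connectors):
    """Maximum allowed board power for a given connector loadout."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(board_power())                  # 75 W, slot only
print(board_power("6-pin"))           # 150 W
print(board_power("8-pin", "6-pin"))  # 300 W, the ceiling priznat cites
```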

GRINDCORE MEGGIDO
Feb 28, 1985


Measly Twerp posted:

You spelt platfrom incorrectly.

Yessss :getin:

Yaoi Gagarin
Feb 20, 2014

Generic Monk posted:

ps4 pro gpu is roughly equivalent in perf to a gtx970 which is still a solid 1080p60 card

I'm gonna need you to source a benchmark on this just because I have a 970 and don't want to believe you

repiv
Aug 13, 2009

VostokProgram posted:

I'm gonna need you to source a benchmark on this just because I have a 970 and don't want to believe you

The closest thing to the PS4 Pro GPU on PC is the RX 470 - they're the same architecture, and while the RX 470 is a little faster, the PS4 Pro has bolted-on 2xFP16 support, so they're probably pretty close in practice.
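
A rough way to sanity-check that comparison is theoretical FP32 throughput (shader count × 2 FLOPs per clock × clock speed); the shader counts and clocks below are approximate published figures, not repiv's numbers:

```python
# Theoretical FP32 throughput: shaders * 2 FLOPs/clock (FMA) * clock.
# The PS4 Pro additionally runs FP16 at double rate, which is the
# "bolted-on 2xFP16 support" mentioned above.

def tflops(shaders, mhz):
    return shaders * 2 * mhz * 1e6 / 1e12

print(f"PS4 Pro: {tflops(2304, 911):.1f} TFLOPS FP32")   # ~4.2
print(f"RX 470:  {tflops(2048, 1206):.1f} TFLOPS FP32")  # ~4.9
```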

Otakufag
Aug 23, 2004
But aren't its 8 Jaguar CPU cores garbage? I think the Destiny 2 devs said it horribly bottlenecked the GPU.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Otakufag posted:

But aren't its 8 Jaguar CPU cores garbage? I think the Destiny 2 devs said it horribly bottlenecked the GPU.

Less of a bottleneck than the original PS4's and Xbox One's 8 Jaguar cores were, and the Xbox One X got further improvements to the processors beyond what the PS4 Pro did.

Yaoi Gagarin
Feb 20, 2014

repiv posted:

The closest thing to the PS4 Pro GPU on PC is the RX 470 - they're the same architecture, and while the RX 470 is a little faster, the PS4 Pro has bolted-on 2xFP16 support, so they're probably pretty close in practice.



Dang, that's kind of alarming. I'm unused to console hardware being this close to PC performance.

crazypenguin
Mar 9, 2005
nothing witty here, move along

New Zealand can eat me posted:

Apparently PCI-E 4.0's 1.0 spec just got finalized, so the future isn't that far off (I'm not sure if this is one of the limitations, but I'm assuming doubling the available bandwidth is one of the prerequisites)

GPUs can support the newer cable standards without needing any internal IO upgrade (this is why the ports are directly on the card: so all that data doesn't have to go over any internal bus).

The biggest thing PCIe 4 will give us is doubling the bandwidth between the CPU and chipset (which should make it easier to support fancy USB, and is probably a prerequisite for supporting USB 3.2 / 20 Gbps). And it'll let NVMe drives immediately slam into a 7 GB/s bandwidth bottleneck instead of their current 3.5! :D

I'm excited for PCIe 4 for a lot of reasons, but it probably won't improve your frametimes much.
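
Those NVMe numbers fall straight out of the per-lane math; a minimal sketch, assuming standard link parameters (8 GT/s for gen 3, 16 GT/s for gen 4, 128b/130b encoding):

```python
# Usable PCIe bandwidth for an x4 NVMe link, before protocol overhead.
def gbytes_per_s(gt_per_s, lanes=4, encoding=128 / 130):
    return gt_per_s * encoding * lanes / 8  # Gbit -> GByte

print(f"PCIe 3.0 x4: {gbytes_per_s(8):.2f} GB/s")   # ~3.94
print(f"PCIe 4.0 x4: {gbytes_per_s(16):.2f} GB/s")  # ~7.88
```

Protocol overhead knocks those down to roughly the 3.5 and 7 GB/s figures above.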

GRINDCORE MEGGIDO
Feb 28, 1985


VostokProgram posted:

Dang, that's kind of alarming. I'm unused to console hardware being this close to PC performance.

They're always close to *some* PC hardware. Didn't the 970 launch in 2014?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

VostokProgram posted:

Dang, that's kind of alarming. I'm unused to console hardware being this close to PC performance.

And XB1X is even a little faster than that.

The new consoles are actually pretty decent hardware. Like, the CPU is still not good (Jaguar sucks regardless of clocks, and the clocks still aren't on par with desktop chips), but the GPU is basically on par with lower-midrange desktop GPUs. You should actually be able to get close to 60fps for once, and the XB1X (at least; not sure about the PS4 Pro) supports FreeSync.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

crazypenguin posted:

GPUs can support the newer cable standards without needing any internal IO upgrade (this is why the ports are directly on the card: so all that data doesn't have to go over any internal bus).

The biggest thing PCIe 4 will give us is doubling the bandwidth between the CPU and chipset (which should make it easier to support fancy USB, and is probably a prerequisite for supporting USB 3.2 / 20 Gbps). And it'll let NVMe drives immediately slam into a 7 GB/s bandwidth bottleneck instead of their current 3.5! :D

I'm excited for PCIe 4 for a lot of reasons, but it probably won't improve your frametimes much.

pcie4 is totally pointless for gaming, since gaming is not pcie bandwidth bound. maybe there's some latency improvement, but it's not the orders of magnitude that'd be necessary


and pcie5 is like 2 years out after that, so i suspect amd/nvidia will just wait for it for mass market stuff, and for GPGPU/deep learning (where it IS pcie bandwidth bound) jump on pcie4 and then pcie5

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Paul MaudDib posted:

And XB1X is even a little faster than that.

XB1X is in 1070 range even. Hell, Outlast 2 is apparently native 4K and 60fps on the 1X - don't think a 1070 can do that, albeit that may be an outlier. The mining boom is making the value proposition of PC GPUs look not that great against mid-release console cycles now. Painful to think where we would be on prices now if the mining impact didn't exist, at least with the low/midrange.

Happy_Misanthrope fucked around with this message at 04:37 on Oct 30, 2017

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Happy_Misanthrope posted:

XB1X is in 1070 range even. Hell, Outlast 2 is apparently native 4K and 60fps on the 1X - don't think a 1070 can do that, albeit that may be an outlier. The mining boom is making the value proposition of PC GPUs look not that great against mid-release console cycles now. Painful to think where we would be on prices now if the mining impact didn't exist, at least with the low/midrange.

Mid-release console cycles aren't really a thing; there was simply nothing comparable in previous generations. There were tiny, minor speedups, usually accompanying cost-reduction revisions, but those weren't usually noticeable outside constrained scenarios. And you'd have random poo poo like being able to double the RAM in the N64, or adding extra RAM to the Saturn through a cartridge, but nothing really affecting processing speeds.

And especially on the Microsoft side of things, they seem really dedicated to a "your old games will work on any of our consoles, eventually we'll stop releasing new games for old hardware" approach that they probably meant to take from the start but that got lost among other launch-day missteps. So probably the Xbox XP or whatever comes after the Xbox One X and goes against a PS5 will just straight up play Xbox One stuff natively - no more typical console lifespan.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

fishmech posted:

So probably the Xbox XP or whatever comes after the Xbox One X and goes against a PS5 will just straight up play Xbox One stuff natively - no more typical console lifespan.

And, I don't think that necessarily locks them into AMD either. Consoles are more or less just a custom low-power x86 gaming desktop nowadays. I bet you could switch most games over to NVIDIA hardware pretty easily; after all, they already have desktop drivers for any ports.

It arguably doesn't make sense for them to be using something exotic and expensive anyway. The PS3 was baller and all, but its titles also didn't tend to port well to anything else. The problem space is pretty well explored at this point - the appropriate question is largely "would you like that with ARM or x86", which pretty much comes down to your power envelope.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
You can achieve console size with the Mini-STX form factor, and both Sony and Microsoft have talked about upgrades as a kind of rolling replacement for having to release new consoles in the future. Maybe they instead push some kind of locked MXM-type module: go with an AMD processor (because AMD will ALWAYS offer better deals than Intel), but it leaves them open and flexible to dealing with AMD, Nvidia, or whoever else wants to throw their hat into the ring for the GPU side of things. It should mean devs have a pretty set hardware feature set as well, since they'd obviously limit it to a handful of GPUs (let's say it's still an APU, so you get baseline APU performance and then Sony/MS decide on a GPU upgrade every year; games target APU performance but get expanded features when going to dGPU).

This might also push Mini-STX as a form factor as a side effect.

Paul MaudDib posted:

It arguably doesn't make sense for them to be using something exotic and expensive anyway. The PS3 was baller and all, but its titles also didn't tend to port well to anything else. The problem space is pretty well explored at this point - the appropriate question is largely "would you like that with ARM or x86", which pretty much comes down to your power envelope.

On the power envelope front, I want to see what the 4W Banded Kestrel does vs the Tegra X1.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

FaustianQ posted:

You can achieve console size with the Mini-STX form factor,

There are some real admirable efforts coming out lately. Gigabyte stuffed a 1070 in a tall form-factor NUC; Zotac managed a 1080 in an even smaller one with a full liquid-cooling solution. MSI has had a couple decent-ish console-style prebuilts too.

Modular motherboards based on a small footprint and an MXM card would be pretty baller. The decades-old ATX form factor (and variants thereof) is not the end-all be-all of case design. Unfortunately there is a pretty good pile of dead standards that never saw any adoption.

Paul MaudDib fucked around with this message at 06:28 on Oct 30, 2017

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

fishmech posted:

Mid-release console cycles aren't really a thing; there was simply nothing comparable in previous generations. There were tiny, minor speedups, usually accompanying cost-reduction revisions, but those weren't usually noticeable outside constrained scenarios. And you'd have random poo poo like being able to double the RAM in the N64, or adding extra RAM to the Saturn through a cartridge, but nothing really affecting processing speeds.
Yes, I know. Mid-release console cycles 'now' was meant to indicate they're a thing - now. I never considered the small revisions made in previous gens a new 'release' of the platform; I'm speaking strictly about what started with the Pro.

Happy_Misanthrope fucked around with this message at 06:37 on Oct 30, 2017

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."
edit, nm

Happy_Misanthrope fucked around with this message at 07:13 on Oct 30, 2017

Arzachel
May 12, 2012
Nvidia isn't a viable choice for consoles unless Sony and Microsoft decide to swap to ARM CPUs for whatever reason.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Arzachel posted:

Nvidia isn't a viable choice for consoles unless Sony and Microsoft decide to swap to ARM CPUs for whatever reason.

Nvidia GPUs are workable in a semi-modular system, though, and it's likely 7nm Ryzen will be good enough to last until materials change.

A SWEATY FATBEARD
Oct 6, 2012

:buddy: GAY 4 ORGANS :buddy:
Board: ASUS Prime X370-Pro
CPU: 1700X

drat, this board still suffers from teething troubles. The CPU core boost function causes the system to crap out randomly, and I have to take special care to disable it in the BIOS after each update. It's not the processor - I can manually clock it up by several hundred megahertz and the system will be rock solid, but if the board tries to do something funny with the CPU frequency on the fly, I get random and unpredictable BSODs.

I had hoped that ASUS at least would have its poo poo together with first-gen parts, but that seems not to have been the case. The problem remains - firmware updates so far just carry the bug forward.

New Zealand can eat me
Aug 29, 2008

:matters:


Do you have the chipset drivers installed, with the AMD power plan?
