BonoMan
Feb 20, 2002

Jade Ear Joe

Cygni posted:

If you haven't already, I would spend some time researching specifically which programs you want to use (I know you said Cinema4D, but there's also Redshift, Blender, etc.), because GPU rendering can be all over the place. In some software it's a huge deal and basically mandatory. In some, you dick around for hours with plugins and the renders still run slower than CPU alone. Some will use any GPU you can find; others absolutely will only work with an Nvidia card. The compatibility/usefulness varies wildly, even today.

That said, in general, that motherboard and CPU will run just dandy with a 2070 Super for rendering. You might want to check that the power supply and case can handle it, though. These can be physically long cards (which sometimes causes issues with drive bays and such in older cases), and they want plenty of juice and cooling to run sustained renders without throttling.

Sorry for the late reply, but thanks! This is primarily for the Redshift renderer in Cinema4D and CUDA core utilization in Premiere Pro/After Effects (and Nuke shortly). The case and PSU are both up to it. It's a gigantic fuck-off case and I put a fairly large PSU in it back in the day.
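
Since Redshift and the CUDA paths in Premiere/After Effects are NVIDIA-only, a quick pre-flight check on any box is to ask the driver what it can see. A minimal sketch, assuming Python 3.7+ and the nvidia-smi tool that ships with the NVIDIA driver; the nvidia_gpus helper name is my own:

```python
# Hypothetical sanity check: confirm an NVIDIA GPU is visible before
# committing to CUDA-only renderers like Redshift. Assumes nvidia-smi
# is on PATH (it ships with the NVIDIA driver).
import subprocess

def nvidia_gpus():
    """Return ['name, memory.total', ...] per GPU, or [] if no GPU/driver."""
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name,memory.total",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout
    except (FileNotFoundError, subprocess.CalledProcessError):
        return []
    return [line.strip() for line in out.splitlines() if line.strip()]

print(nvidia_gpus() or "No NVIDIA GPU visible; CUDA renderers won't run.")
```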

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Doesn't OLED still have burn-in that makes it impractical for Windows and gaming, where elements of the image stay in place for extended periods of time, like HUDs and taskbars, etc.?

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Zedsdeadbaby posted:

Doesn't OLED still have burn-in that makes it impractical for Windows and gaming, where elements of the image stay in place for extended periods of time, like HUDs and taskbars, etc.?

That's a whole topic, but the TL;DR is that the newer models (C9, CX) have robust anti-burn-in features that are very good: things like a larger red subpixel, selective dimming, etc.

The C9 came out about 1.5 years ago and it's really hard to find more than a handful of people reporting burn-in so far, and we're talking r/OLED spergs who see a touch of off-color on a mono-color test screen that you would never notice in real life. I don't use mine as a Windows desktop, though; it's connected as a second monitor for gaming/media only. No problems so far.

So yeah, the general consensus is that it's fine. However, I wouldn't use it as a monitor unless you know what you're getting into. And if you play one game for really long hours, constantly, and not much else, that's probably a red flag too. That being said, for 95% of people, I think we're well over the hump for general game and media use.

You get a huge number of benefits: near-instant pixel response times, HDMI 2.1 G-Sync, perfect contrast, great HDR (even on the desktop in Win10), 4K 120fps if you have one of the upcoming HDMI 2.1 cards, and extremely low input lag, under 20ms (and in the single digits if you use 120Hz). It's definitely one of those "you can never go back" things.

That being said, there are some games that you need to play right in front of you with a KB/M, and current OLEDs are generally too large to fill that role. For those reasons, an OLED is really a second monitor, not a primary monitor, for most people.

When an OLED 40 inches or under comes out (preferably 27-34 inches), I think you'll see a whole section of PC gamers switch over and use it for all of their computer use, full-time. The smallest right now is 48 inches, which really limits that segment, but it's coming.

Encrypted
Feb 25, 2016

43" is the sweet spot for 4K on the desktop without any scaling, because Windows 10 scaling is garbage. The DPI is similar to a standard desktop monitor's, so everything looks about the same size.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
AMD and Intel battle over TSMC capacity

And this is why it’s not good to see Intel fall off the leading edge. Now AMD has to outbid Intel too, or else get cockblocked out of their own production.

4D chess theorycrafting: Intel doesn't have to beat AMD in the CPU market; they just need to be good enough in the GPU market to starve AMD of the wafers it needs for growth. Especially if those are high-margin enterprise GPUs where the cost can be justified, and where Intel can leverage its money to do the software development needed to get its product into customers' workflows without them having to re-invent the wheel. Intel has always been good at that.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Single-source bottlenecks are never good for the consumer. But reading the article itself, it sounds like AMD and Intel were bidding against each other for the capacity freed up by Huawei dipping out. So AMD may not be losing anything versus the situation a few months ago; they just weren't able to capitalize on that new opportunity.

But it does mean you may be right going forward for new contracts and new nodes, where Intel's interest may force AMD to up its bids, and that translates to more expensive end products.

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib

Taima posted:

When an OLED 40 inches or under comes out (preferably 27-34 inches), I think you'll see a whole section of PC gamers switch over and use it for all of their computer use, full-time. The smallest right now is 48 inches, which really limits that segment, but it's coming.
Why aren't there decent, affordable OLED monitors? According to this site, some exist but they're all extremely expensive and the sizes are either huge or tiny. It's not like OLED TVs are anywhere near as pricey.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Encrypted posted:

43" is the sweet spot for 4K on the desktop without any scaling, because Windows 10 scaling is garbage. The DPI is similar to a standard desktop monitor's, so everything looks about the same size.

It's a 'sweet spot' if your desk is 6' deep, sure.

Windows 10 DPI scaling is definitely not as solid as macOS's, but the majority of apps are finally fine with it. I have two 27" 4K monitors and it's pretty rare that I run into scaling issues; the worst are basically the Dashlane app and a cheap old version of Vegas I use, which just upsample, so their text/UI elements are blurry.

You get used to not seeing pixels in text/UI elements. Going back to a display at 100% scaling after using a scaled 4K display (with every other device I own being high-DPI too), it really stands out.
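
For what it's worth, the blurry-app behaviour usually comes down to DPI awareness: a process that never declares it is rendered at 96 DPI and then bitmap-stretched by the compositor. A minimal ctypes sketch of how an app opts in on Windows 8.1+; this is illustrative, not a claim about what Dashlane or Vegas actually do:

```python
# Why non-DPI-aware apps look blurry at >100% scaling: Windows renders
# them at 96 DPI and stretches the bitmap. An app avoids that by declaring
# awareness before creating any windows, e.g.:
import ctypes

PROCESS_PER_MONITOR_DPI_AWARE = 2  # value from the Win32 ShellScalingApi enum

def opt_into_per_monitor_dpi():
    try:
        # Windows 8.1+ only; harmless to skip elsewhere.
        ctypes.windll.shcore.SetProcessDpiAwareness(PROCESS_PER_MONITOR_DPI_AWARE)
    except (AttributeError, OSError):
        pass  # non-Windows or pre-8.1: nothing to do
```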

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

ConanTheLibrarian posted:

Why aren't there decent, affordable OLED monitors? According to this site, some exist but they're all extremely expensive and the sizes are either huge or tiny. It's not like OLED TVs are anywhere near as pricey.

In the past, the argument was that usage patterns for computer monitors were pretty much "worst case" scenarios for burning out OLEDs, and so no one wanted to release a $1000+ monitor that would likely be messed up in a year or two. The PR from that would be terrible, let alone the RMA/warranty costs.

With the C-series LG TVs, the burn-in issue seems to have been cracked. However, it's hard to say if that's entirely true (how many of them are being used to do office work 8hrs/day?), and even harder to say if whatever LG is doing with those TVs to make them burn-in resistant could be applied to smaller monitors. For example, it may be that they're able to use the size of the pixel elements in ways that would be challenging to manufacture at 1/4 the size.

Which is a longer way of saying "no one's really sure, and the panel manufacturers aren't telling."

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
The way LEDs work, the more you stress them, the faster they burn out, and I assume the same is true for OLEDs. So stuff that works on 48" and bigger TVs can't necessarily be scaled down to smaller monitors, because the smaller pixels have to be driven harder to produce the same brightness, and so they'll burn out faster. You don't necessarily need the same level of brightness on a monitor, but that might not be enough on its own.

Among other things, I imagine LG might be doing something like tracking the usage of each subpixel and using a known wear curve to increase the drive level to compensate, which would get you a much longer usable lifespan out of an LED that you don't need 100% brightness from to begin with.
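
As a toy illustration of that idea, here is what such a compensation loop could look like. The decay curve and its time constant are invented for the sketch; LG hasn't published anything like this:

```python
# Toy wear-compensation model: track hours each subpixel has been driven,
# estimate its remaining efficiency from an assumed decay curve, and raise
# the drive level to hold perceived brightness constant. Illustrative only.
import math

DECAY_TAU_HOURS = 30_000.0  # assumed luminance-decay time constant

def compensated_drive(target_level, hours_on):
    """Drive level (0.0-1.0) needed for a worn subpixel to emit target_level.

    Once the required drive exceeds 1.0, the panel can no longer fully
    compensate; we clip, which is where visible burn-in would begin.
    """
    efficiency = math.exp(-hours_on / DECAY_TAU_HOURS)  # fraction of new output
    return min(target_level / efficiency, 1.0)

# A subpixel that has shown a static HUD for 10,000 hours needs ~40% more drive:
print(compensated_drive(0.6, 10_000))  # ~0.84
```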

Cygni
Nov 12, 2005

raring to post

LG has said they don't think their OLED technology is suitable for desktop use, so they won't be making desktop sizes/SKUs. I imagine a lot of that is due to not wanting to deal with burn-in and customer expectations. Supposedly their next range of high-end desktop screens after Nano IPS (stuff like the 27GL850) will be MicroLED, whenever that's ready.

Enos Cabell
Nov 3, 2004


How bad is burn-in on OLED compared to plasma? I've got an old Samsung plasma downstairs that has the Rock Band UI burned in so badly you can see it with the power off.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Happy_Misanthrope posted:

It's a 'sweet spot' if your desk is 6' deep, sure.

Whoops, I sit a normal (2-3') distance from mine. I'm sure most people would think it was nuts. I really appreciate 100% scaling; Windows and most programmes are just super janky at anything else.

Enos Cabell posted:

How bad is burn-in on OLED compared to plasma? I've got an old Samsung plasma downstairs that has the Rock Band UI burned in so badly you can see it with the power off.

OLED "burn-in" is the compounds degrading. The blue especially degrades fast, which is why in Pentile displays they tend to make the blue subpixel larger than the others.
It's funny that organic is a selling point, whereas in the LCD projector world, inorganic LCD is the selling point

Ultimately though, whether the phosphor on a CRT or Plasma is getting pounded with radiation and losing its ability to convert one form of energy to another, or as in OLEDs, the compounds break down and emit less and less light in certain subpixels, the effect is the same.. uneven brightness, colour shift and retained images
How bad it is on a modern OLED TV I couldn't guess at, my only experience has been with smartphone displays, where it was clear to see where the notification bar and on-screen buttons were when playing full screen video (the normally black notification and button areas had barely degraded at all, and were much more vibrant than the rest of the display)

HalloKitty fucked around with this message at 21:14 on Jul 27, 2020

Seamonster
Apr 30, 2007

IMMER SIEGREICH

Encrypted posted:

43” is the sweet spot for 4k on desktop without any scaling because window 10 is garbage. It has similar dpi and all and everything looks about the same.

Preach. Also, 43" is about as large as can fit on a reasonably sized desk, both physically and in terms of necessary viewing distance.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
I owned a 40-inch 4K monitor at one point, and personally I thought it was too low-DPI and hard to view comfortably from a normal distance. I do too much text/code-based work to be fuckin' around with low DPI.

Win10 scaling isn't perfect, but it's not worth abandoning high DPI just because a few critical apps don't support it (fuck you, Dashlane). I try to stay at 150ppi+ for professional work. For gaming, I can see lower DPI working better, though.

Taima fucked around with this message at 01:28 on Jul 28, 2020

Encrypted
Feb 25, 2016

Seamonster posted:

Preach. Also, 43" is about as large as can fit on a reasonably sized desk, both physically and in terms of necessary viewing distance.
Yeah, 43" works fine with a desk that's only 2-3' deep, and you can make it even better with a VESA arm.

With a proper ergonomic setup, where your elbow is at 90 degrees to the desk and all, the top of the 43" screen ends up around eye height, which is pretty much the max you can do without straining your eyes. It seriously just looks like a regular screen with a lot more space on every side.

And lol at the fact that on Windows the scaling gets even worse if you mix and match DPI/scaling across multiple displays and drag apps between them.

PC LOAD LETTER
May 23, 2005
WTF?!
Yuup.

~40in is about the practical limit for most people and their desks.

And 4K at ~40in is nearly perfect, since the PPI is noticeably higher than a 24in 1080p monitor's but text still isn't so tiny that you can't read it without scaling, so you can avoid dealing with the scaling crap altogether.

Wrar
Sep 9, 2002


Soiled Meat
40" 4k has almost exactly the same pixel density as 27" 1440.

VelociBacon
Dec 8, 2009

If you're playing certain games, I don't think you'd want a monitor over 32" at most (assuming you sit a normal distance from your screen). I can't imagine playing StarCraft or a competitive shooter on a huge screen, having to physically turn my head to look at the minimap, etc.

Having said that, I have a Surface Laptop 2, and while the screen is good, the text scaling thing kinda sucks sometimes.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
I wonder if the text scaling issues depend on your exact scale percentage. I run macOS for work and have a Win10 PC for gaming, both connected to a single Thunderbolt/DP4 monitor that I can easily switch back and forth. Because of that I'm always flipping between the two environments, which makes them super easy to compare.

I've been really critical of Win10 scaling, but to be honest, even macOS scaling tends to kinda suck if you have it set to a weird percentage. I keep a flat 200% scale on my 5K2K monitor, and I think that since it's a whole multiple, it produces the best possible text.

On my Win10 box I run at 150% scale, and honestly, while the text is probably not quite as crisp as macOS at 200%, it's still very nice. In fact, I'd say 150% scaling on Win10 makes text look better than 150% scaling on macOS, at least in my experience.
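
One plausible reason whole multiples render cleaner, sketched below: at 200% every logical-pixel boundary lands exactly on the device-pixel grid, while at 150% half of them fall between device pixels, so something has to resample or reposition glyphs. (My illustration, not anything Microsoft or Apple document this way.)

```python
# Where logical pixel boundaries land on the physical grid at a given scale.
def device_edges(scale, logical_px=8):
    return [logical * scale for logical in range(logical_px)]

print(device_edges(2.0))  # [0.0, 2.0, 4.0, ...]      every edge on the grid
print(device_edges(1.5))  # [0.0, 1.5, 3.0, 4.5, ...]  half the edges fractional
```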

Anyway, if you're using scaling on a high-DPI display and find the text rough on Win10, I'd be curious what exact percentages people are using, just to see if that's part of the issue.


VelociBacon posted:

If you're playing certain games I don't think you'd want a monitor over 32" at the most (assuming you sit a normal distance from your screen). I can't imagine playing StarCraft or a competitive shooter with a huge screen, having to physically turn my head to look at the minimap etc.

SC2 was a major reason I sold my 40" 4K screen. A single building was absolutely giant. 100% unplayable. That being said, I think Blizzard is especially shit about scaling, IIRC. Truth be told, I'm very surprised to hear that people find that size preferable, but fair enough. It certainly wasn't preferable in my experience. Like you say, moving your head to see parts of the screen sucks :shrug:

VelociBacon
Dec 8, 2009

Taima posted:


Anyway, if you're using scaling on a high-DPI display and find the text rough on Win10, I'd be curious what exact percentages people are using, just to see if that's part of the issue.


SC2 was a major reason I sold my 40" 4K screen. A single building was absolutely giant. 100% unplayable. That being said, I think Blizzard is especially shit about scaling, IIRC.

I think the scaling on my Surface Laptop 2 is 125%. It's fine and clear and normal 75% of the time, but the rest of the time it's almost like it's out of focus: stuff like system error messages, etc. All the text is clear in word processing, websites, etc.

SC2 doesn't scale what's on the screen, because everyone would just use the setting to give themselves the widest possible view.

space marine todd
Nov 7, 2014



When I had a 42-inch screen (or even a 30" screen on a very narrow desk), I just played competitive FPS games in windowed mode. It not only made things easier on my peripheral vision, but the reduced resolution also increased the average and minimum framerates.

FuturePastNow
May 19, 2014


I've found 27" to be the ideal desktop size for me.

repiv
Aug 13, 2009

DF did a follow-up on DLSS 2.0, comparing it to the PS4 version's checkerboard reconstruction:

https://www.youtube.com/watch?v=9ggro8CyZK4

Shrimp or Shrimps
Feb 14, 2012


space marine todd posted:

When I had a 42-inch screen (or even a 30" screen on a very narrow desk), I just played competitive FPS games in windowed mode. It not only made things easier on my peripheral vision, but the reduced resolution also increased the average and minimum framerates.

Does running windowed mess with refresh rates and/or adaptive sync?

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

repiv posted:

DF did a follow-up on DLSS 2.0, comparing it to the PS4 version's checkerboard reconstruction:

https://www.youtube.com/watch?v=9ggro8CyZK4

Yikes, those cryptobiote trails, though. Overall, yes, even performance mode looks better than checkerboarding, which is what I would have expected, but that's a very noticeable graphical anomaly due to DLSS.

Would have been nice to see FidelityFX compared as well.

As Alex mentioned (and he's tweeted about this before), DLSS is just not something that can be done at the driver level; it requires some pretty significant knowledge of the rendering engine. I've always thought the DLSS 3.0 rumours about it being enabled through the driver, or through drivers for select games, were bullshit. I expect the quality/performance to continue to improve, but all indications are that it will still require developer involvement.

Happy_Misanthrope fucked around with this message at 15:48 on Jul 28, 2020

space marine todd
Nov 7, 2014



Shrimp or Shrimps posted:

Does running windowed mess with refresh rates and/or adaptive sync?

Nope! G-Sync used to only work with full-screen, but it has worked in windowed mode for the last couple of years.

Cactus
Jun 24, 2006

The trails looked cool, though; I liked them better than without.

repiv
Aug 13, 2009

Happy_Misanthrope posted:

Yikes, those cryptobiote trails, though. Overall, yes, even performance mode looks better than checkerboarding, which is what I would have expected, but that's a very noticeable graphical anomaly due to DLSS.

Yeah, it's probably just missing motion vectors on that particular shader, though. Any reconstruction technique will break in that case, but it seems DLSS is particularly sensitive to it.
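
A toy sketch of the reprojection step being described here, to show why a missing vector smears: history is fetched through the per-pixel motion vector before blending, so an object that moves but reports zero motion gets blended with stale pixels from where it used to be. Illustrative numpy, nothing like the actual DLSS internals:

```python
# Minimal temporal reconstruction step: re-sample last frame's accumulated
# history through the motion vectors, then blend in a little of the new frame.
import numpy as np

def temporal_blend(current, history, motion, alpha=0.1):
    """current, history: (H, W) float images. motion: (H, W, 2) per-pixel
    (dy, dx) offsets since the previous frame. Returns the updated history."""
    h, w = current.shape
    ys, xs = np.indices((h, w))
    # Fetch each pixel's history from where it came from (nearest-neighbor).
    src_y = np.clip(np.rint(ys - motion[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.rint(xs - motion[..., 1]).astype(int), 0, w - 1)
    reprojected = history[src_y, src_x]
    # A shader that omits motion vectors contributes zeros to `motion`, so
    # its reprojected history comes from the object's old position: that is
    # the ghost trail.
    return alpha * current + (1 - alpha) * reprojected
```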

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

space marine todd posted:

Nope! G-Sync used to only work with full-screen, but it has worked in windowed mode for the last couple of years.

How do you use G-Sync? Do you just enable V-Sync in games, and if your video card/monitor support it, it'll work automatically?

space marine todd
Nov 7, 2014



Kraftwerk posted:

How do you use G-Sync? Do you just enable V-Sync in games, and if your video card/monitor support it, it'll work automatically?

Nope!

1) Disable V-Sync in games.
2) Go to the NVIDIA Control Panel and enable G-Sync for your monitor.
3) Also, while you're in the NVCP, globally set the NVIDIA framerate limiter to about 3fps below your monitor's refresh rate (so 141 if your monitor is 144Hz), so G-Sync stays engaged (see the arithmetic below).
4) Also also, while you're in the NVCP, globally enable V-Sync.

See:

https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/14/
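
The arithmetic behind step 3, for anyone curious (my numbers; the rule of thumb itself is from the Blur Busters article above):

```python
# Step 3 in numbers: a 141 fps cap makes each frame take slightly longer
# than the 144Hz scanout period, so the GPU never outruns the monitor and
# G-Sync stays in its variable-refresh range instead of falling into V-Sync.
refresh_hz = 144
for fps_cap in (144, 141):
    print(f"{fps_cap} fps cap: {1000 / fps_cap:.2f} ms/frame "
          f"vs {1000 / refresh_hz:.2f} ms refresh")
# 144 fps cap: 6.94 ms/frame vs 6.94 ms refresh  (no headroom)
# 141 fps cap: 7.09 ms/frame vs 6.94 ms refresh  (~0.15 ms of headroom)
```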

space marine todd fucked around with this message at 16:24 on Jul 28, 2020

SwissArmyDruid
Feb 14, 2014

by sebmojo

K8.0 posted:

The way LEDs work, the more you stress them, the faster they burn out, and I assume the same is true for OLEDs. So stuff that works on 48" and bigger TVs can't necessarily be scaled down to smaller monitors, because the smaller pixels have to be driven harder to produce the same brightness, and so they'll burn out faster. You don't necessarily need the same level of brightness on a monitor, but that might not be enough on its own.

Among other things, I imagine LG might be doing something like tracking the usage of each subpixel and using a known wear curve to increase the drive level to compensate, which would get you a much longer usable lifespan out of an LED that you don't need 100% brightness from to begin with.

I want one of these monster displays, but for a very different reason. For years I have railed against every single fucking "smart" TV that is absolute ass: never provisioned with enough processing power or enough RAM, with a GUI that responds like molasses while datamining the crap out of you and shoving advertisements in your face.

Give me one of these displays for my living room; I will hook up my own fucking devices to it and be vastly happier.

space marine todd
Nov 7, 2014



SwissArmyDruid posted:

I want one of these monster displays, but for a very different reason. For years I have railed against every single fucking "smart" TV that is absolute ass: never provisioned with enough processing power or enough RAM, with a GUI that responds like molasses while datamining the crap out of you and shoving advertisements in your face.

Give me one of these displays for my living room; I will hook up my own fucking devices to it and be vastly happier.

I have had LG and Samsung TVs for the past few years and they seemed to work perfectly fine in terms of GUI experience/performance when I used them just as displays for my Roku or Chromecast.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

space marine todd posted:

I have had LG and Samsung TVs for the past few years and they seemed to work perfectly fine in terms of GUI experience/performance when I used them just as displays for my Roku or Chromecast.

Yeah, everything works fine when you use smart TVs as dumb displays for other devices. But there are a lot of TVs (especially at the lower end) trying to run their own poorly optimized, ad-filled GUI mess on a stale potato with 512KB of RAM or whatever, and the user experience is atrocious. I'd almost forgotten about that, since all my recent TVs have run a built-in Roku UI that works quite well, but I went to a friend's house to help her move and dear god, the TV she had was awful. It took us 20 minutes to delete all her profiles off that thing, because booting Netflix took literally 5 minutes, returning to the Home screen from anywhere took 20-30 seconds, you could watch it load ad tiles sequentially one by one, and there was a ~5s lag between hitting a button and anything happening in the GUI.

Whoever greenlighted that thing should have been taken out back and beaten over the head with that TV.

Cygni
Nov 12, 2005

raring to post

I Love It™ when TVs force you to use a GUI just to cycle through inputs, instead of just changing inputs when you press the button. Or better yet, they get rid of the input button completely and force me to interact with their slow, ad-laden OS (TCL) to change inputs. That's good to me.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Cactus posted:

The trails looked cool, though; I liked them better than without.

Huh, yeah, I thought it was because they had some kinda chiral goo on them.

CaptainSarcastic
Jul 6, 2013



"Smart" devices are rear end because the manufacturers are on, at best, yearly release cycles, and have little to no motivation to actually support the hardware moving forward. You're basically getting a TV (or Blu-ray player or whatever) with a cellphone running it that has lots of bloatware, probably a good load of spyware, and that will not be getting updates of any kind within a year. Samsung or Sony or LG or whoever has no interest in making sure the device you bought 2 years ago is still in good shape - they are entirely geared to sell you a new device.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

CaptainSarcastic posted:

"Smart" devices are rear end because the manufacturers are on, at best, yearly release cycles, and have little to no motivation to actually support the hardware moving forward. You're basically getting a TV (or Blu-ray player or whatever) with a cellphone running it that has lots of bloatware, probably a good load of spyware, and that will not be getting updates of any kind within a year. Samsung or Sony or LG or whoever has no interest in making sure the device you bought 2 years ago is still in good shape - they are entirely geared to sell you a new device.

I feel bad because most of the, I guess, victims of this are older people I've talked to who, aside from wanting a nice picture, thought they were getting something they could actually use for a long time.

Those UIs and the hardware that drives them are a shitty, horrible Achilles heel.

It was sad to see people give feedback that their TV had basically stopped doing the basic things they wanted.

Quaint Quail Quilt
Jun 19, 2006


Ask me about that time I told people mixing bleach and vinegar is okay

Kraftwerk posted:

How do you use G-Sync? Do you just enable V-Sync in games, and if your video card/monitor support it, it'll work automatically?
Seconding this
https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/14/

Instead of ParkControl, you can use 1usmus' power plan/BIOS settings on Ryzen; it seems to do the same stuff, I think. BIOSes/Windows are starting to implement it anyway.

Saukkis
May 16, 2003

Unless I'm on the inside curve pointing straight at oncoming traffic the high beams stay on and I laugh at your puny protest flashes.
I am Most Important Man. Most Important Man in the World.
From a bunch of comparison reviews I've read, I've gotten the impression that Panasonic models are less smart and more TV.
