Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Is it actually fixing it though, or is it just taking the bit depth it has and stretching it out and causing banding? How can feeding it a different white point change how the monitor is handling the gamut internally?
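The "stretching it out" worry can be sketched concretely. A minimal Python illustration (not tied to this monitor or its firmware; the bit depths are just example numbers): truncate an 8-bit gradient to 6 bits and rescale it back to 8 bits, and only 64 distinct levels survive, which is exactly the banding in a smooth ramp.

```python
# Illustrative sketch only: truncating an 8-bit gradient to 6 bits and
# stretching it back to 8 bits produces repeated output values, i.e.
# visible banding in what should be a smooth ramp.

def stretch_bit_depth(value_8bit: int, keep_bits: int = 6) -> int:
    """Drop low-order bits, then rescale the result back to 0-255."""
    truncated = value_8bit >> (8 - keep_bits)            # 0..63 for 6 bits
    return round(truncated * 255 / ((1 << keep_bits) - 1))

ramp = [stretch_bit_depth(v) for v in range(256)]
unique_levels = len(set(ramp))
print(unique_levels)  # 64: the 256 input shades collapse into 64 bands
```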

Extended mode is definitely overclocking and LG says right in the manual that it can cause flickering. In this model it apparently fucks the gamut up too.

Again, if the hardware is so great, what's so loving hard about shipping it in the right mode from the start? LG knows it's broken, that's why it's toggled off by default and why they tell you in the manual that "if it doesn't work sucks to be you, we know it's got problems".

That's always been the problem with FreeSync, 60 percent of the time it works every time. And LG and Samsung are the worst offenders there.

If it passed their testing, NVIDIA would have said so.

Paul MaudDib fucked around with this message at 22:09 on Jun 13, 2019

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Paul MaudDib posted:

As always, "Extended Mode" is overclocking. If it was stable enough not to gently caress with the display, it would have been the "Normal Mode" in the first place. LG isn't making it an optional toggle just because they hate selling panels with big features on the box.
it's super cool by the way that you're basically repeating word-for-word poo poo you read on /r/ultrawidemasterrace

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Nice post. Someone said a thing on reddit. Congrats.

He's not wrong either.

Alchenar
Apr 9, 2008

Absurd Alhazred posted:

Took me a few hours over several days to build my new PC because I'm very anxious, but I did it. The only part that is harder than essentially building legos and screwing some things together is attaching the heatsink to the CPU.

And the trick to attaching the heatsink to the CPU is to realise that 99% of handwringing on the internet about how best to apply thermal paste is complete nonsense and you will get identical results no matter what as long as you don't over-apply.

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Paul MaudDib posted:

Is it actually fixing it though, or is it just taking the bit depth it has and stretching it out and causing banding? How can feeding it a different white point change how the monitor is handling the gamut internally?
I dunno, maybe take a colorimeter to it or something if you doubt the most reputable independent monitor review site on the entire internet so much?

Paul MaudDib posted:

can cause flickering
LG knows it's broken
we know it's got problems
60 percent of the time it works every time

dang Paul you're right, Freesync was actually a scam all along and the 34gk950f is an unusable piece of garbage

this loving hyperbole lmao


now i know where you got the ideas about the supposedly extremely widespread backbleed issues on the xv273k too, it's random rear end reddit posts all the way down

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
No people complaining in the last 6 months? No people complaining with Nvidia cards? Let's invent a story about how a firmware update released 6 months after a reddit post caused time-travel color gamut issues on AMD cards that definitely had nothing to do with Windows wide gamut support being absolute dogshit.

Paul this is a new level of dumbposting, even for you.

Indiana_Krom
Jun 18, 2007
Net Slacker

Alchenar posted:

And the trick to attaching the heatsink to the CPU is to realise that 99% of handwringing on the internet about how best to apply thermal paste is complete nonsense and you will get identical results no matter what as long as you don't over-apply.

The only downside of putting too much on is cleaning it off the next time you remove the heatsink.

As for the other discussion about water cooling, I have a custom loop myself and the fan/pump RPMs are slaved to the coolant temperature. Crank my CPU and GPU to max (Prime95 AVX + Furmark) and it takes ~3 minutes before the coolant is warm enough for the 3x140mm radiator fans to throttle up at all. The point is that even while dissipating around 400W and keeping everything under 50C, the cooler hardly ever makes a sound other than eventually a low hum from the Noctua fans reaching ~700 RPM.

If a short burst of CPU or GPU activity spikes the temperature, and in response your system audibly revs up a cooling fan for the three seconds the spike lasts, and you find that annoying: a well-designed water cooler will basically never do that.
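A hypothetical sketch of the coolant-slaved control described above (the curve breakpoints here are made-up illustrative numbers, not anyone's actual settings):

```python
# Hypothetical sketch: fan duty follows slow-moving coolant temperature
# instead of spiky CPU/GPU die temperature, so brief load spikes never
# audibly rev the fans. Breakpoints are illustrative, not real settings.

def fan_duty_percent(coolant_c: float) -> float:
    """Linear interpolation over a (coolant_temp_C, duty_%) curve."""
    curve = [(25.0, 20.0), (35.0, 30.0), (45.0, 60.0), (50.0, 100.0)]
    if coolant_c <= curve[0][0]:
        return curve[0][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if coolant_c <= t1:
            return d0 + (d1 - d0) * (coolant_c - t0) / (t1 - t0)
    return curve[-1][1]  # clamp above the last breakpoint

print(fan_duty_percent(30.0))  # 25.0: halfway between the first two points
```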

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

Indiana_Krom posted:

The only downside of putting too much on is cleaning it off the next time you remove the heatsink.

As for the other discussion about water cooling, I have a custom loop myself and the fan/pump RPMs are slaved to the coolant temperature. Crank my CPU and GPU to max (Prime95 AVX + Furmark) and it takes ~3 minutes before the coolant is warm enough for the 3x140mm radiator fans to throttle up at all. The point is that even while dissipating around 400W and keeping everything under 50C, the cooler hardly ever makes a sound other than eventually a low hum from the Noctua fans reaching ~700 RPM.

If a short burst of CPU or GPU activity spikes the temperature, and in response your system audibly revs up a cooling fan for the three seconds the spike lasts, and you find that annoying: a well-designed water cooler will basically never do that.

hey im putting together a loop right now and would like to set up something like that. What are you using for that or is it bios?

GRINDCORE MEGGIDO
Feb 28, 1985


Alchenar posted:

you will get identical results no matter what as long as you don't over-apply.

Be careful with liquid metal though, kids.

Beve Stuscemi
Jun 6, 2001




GRINDCORE MEGGIDO posted:

Be careful with liquid metal though, kids.

For real, it can kill you

Beve Stuscemi
Jun 6, 2001





This is right in line with used 2070’s.

What says this thread on the 1080 Ti vs the 2070 for VR? Is the greater VRAM amount worth more on the 1080 Ti, or is the future-proofing worth more on the 2070?

Indiana_Krom
Jun 18, 2007
Net Slacker

Fauxtool posted:

hey im putting together a loop right now and would like to set up something like that. What are you using for that or is it bios?
A simple BIOS fan curve based off the temperature reading from one of these:
https://www.amazon.com/gp/product/B00CMR3CC2/
I replaced one of the plugs on my GPU block with it, so it doesn't require any extra hardware since my motherboard has thermistor headers.
If your motherboard doesn't have thermistor headers, an Aqua Computer controller is the 800-pound gorilla of fan controllers; it can be set up to control multiple fans/pumps/etc. based off a sensor like that and can operate completely independently of the host system.
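For the curious, a hedged sketch of what the BIOS or a fan controller does with a bare sensor like that: an NTC thermistor's resistance maps to temperature via the beta-parameter equation. R0 = 10 kOhm at 25 C and B = 3950 K are common datasheet values, assumed here purely for illustration.

```python
import math

# Hedged sketch: converting an NTC coolant-sensor resistance reading to a
# temperature with the beta-parameter equation. R0/B values are typical
# datasheet numbers assumed for illustration, not specs of the linked part.

def ntc_temperature_c(resistance_ohm: float,
                      r0_ohm: float = 10_000.0,
                      t0_c: float = 25.0,
                      beta_k: float = 3950.0) -> float:
    """1/T = 1/T0 + (1/B) * ln(R/R0), with T in kelvin."""
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(resistance_ohm / r0_ohm) / beta_k
    return 1.0 / inv_t - 273.15

print(round(ntc_temperature_c(10_000.0), 1))  # 25.0 at the reference point
```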

Stickman
Feb 1, 2004

Jim Silly-Balls posted:

This is right in line with used 2070’s.

What says this thread on the 1080 Ti vs the 2070 for VR? Is the greater VRAM amount worth more on the 1080 Ti, or is the future-proofing worth more on the 2070?

The 1080 Ti is something like a 15-20% performance boost over the 2070, so really RAYZ is the only "future proofing" you're missing out on. Are there even any VR games that support real-time raytracing (and run at a solid clip with it turned on)? I suspect both cards will hold their value equally well, so I'd just upgrade when there are actually some RTRT VR games and decent cards that can run them.

E: Refurbs will usually just have a 1-year warranty while used 2070s should still have at least two years of transferable warranty left if you buy an EVGA, Gigabyte or MSi card. Asus has been reported to honor warranty transfers, too, though it's excluded from their warranty agreement.

Stickman fucked around with this message at 06:29 on Jun 14, 2019

Absurd Alhazred
Mar 27, 2010

by Athanatos

Stickman posted:

The 1080 Ti is something like a 15-20% performance boost over the 2070, so really RAYZ is the only "future proofing" you're missing out on. Are there even any VR games that support real-time raytracing (and run at a solid clip with it turned on)? I suspect both cards will hold their value equally well, so I'd just upgrade when there are actually some RTRT VR games and decent cards that can run them.

If I had to guess I'd say VR games would be the very last ones to add RT support; the goal is shallow pipelines, and RT just adds more depth, both in the tracing itself and in the denoising.

Craptacular!
Jul 9, 2001

Fuck the DH

Zedsdeadbaby posted:

It's worth noting that this isn't entirely true - support is extremely spotty and Nvidia has to curate specific freesync monitors.

A good resource for this sort of thing.

EdEddnEddy
Apr 5, 2012



Yeah, a 1080 Ti trades blows pretty well with a 2080, so I'd get it over a 2070 personally for VR.

The Rayz are really cool, but not essential enough yet to pass up that sort of deal. The 1080 Ti is a hell of a card.

Stickman
Feb 1, 2004

Absurd Alhazred posted:

If I had to guess I'd say VR games would be the very last ones to add RT support; the goal is shallow pipelines, and RT just adds more depth, both in the tracing itself and in the denoising.

Yeah, that'd be my guess, too - it's just too much of a hit right now and VR needs stable and relatively high frame rates.
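The frame-rate point is easy to quantify with back-of-the-envelope arithmetic (the refresh rates below are typical examples, not claims about any specific headset):

```python
# Back-of-the-envelope sketch: why a fixed ray-tracing cost hurts VR more
# than desktop play. Refresh rates are typical examples only.

def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available to render one frame at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 90, 120):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.2f} ms per frame")
# A renderer that fits 16.67 ms at 60 Hz has only 11.11 ms at 90 Hz, so a
# fixed RT cost of a few ms eats a much larger share of the VR budget.
```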


That's cool, thanks!

Arzachel
May 12, 2012

Paul MaudDib posted:

This is interesting and now I understand what they mean by "hybrid" architecture. It sounds like this is really console-first and they have a "legacy mode" that will run their old code and a "new mode" that mimics NVIDIA a lot more closely in various aspects (32-thread wavefront, etc). That gives some hope for die size reductions in future "pure" implementations for desktops if they can strip out "legacy mode".

It is still not a new architecture; it's analogous to something like the Kepler->Maxwell switch, where NVIDIA moved from 192 cores per SMM to 128 to improve occupancy, but we'll have to wait and see what the performance implications are. It is really more like a new "chapter" in the GCN architecture: not a new architecture, but major tweaks on GCN. Which is fine; again, Kepler->Maxwell was the same thing conceptually.

It might not be a Terascale -> GCN level change, but I think it's much closer to that than to Maxwell. The 64-wide wavefront over 4 clocks cadence was baked into pretty much every design decision for the various GCN archs, and going from that to 64-wide over two clocks (realistically 32-wide over one clock; it looks like they're setting up to ditch 64-wide wavefronts with the next generation) is a massive change.
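The cadence difference is simple arithmetic. A rough sketch as I read the public GCN/RDNA descriptions (heavily simplified; real scheduling has many more constraints than lane count alone):

```python
# Rough sketch of the issue-cadence change: GCN pushes a 64-wide wavefront
# through a 16-wide SIMD over 4 clocks, while the newer design runs a
# 32-wide wavefront on a 32-wide SIMD in a single clock. Simplified model.

def issue_clocks(wavefront_width: int, simd_width: int) -> int:
    """Clocks to issue one wavefront through one SIMD, lane-serialized."""
    return wavefront_width // simd_width

gcn = issue_clocks(64, 16)    # GCN: 64-wide wave, 16-wide SIMD
rdna = issue_clocks(32, 32)   # "new mode": 32-wide wave, 32-wide SIMD
print(gcn, rdna)              # 4 1: one instruction per 4 clocks vs per clock
```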

Shame the products are very ehh, but next gen is shaping up to be very interesting.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Jim Silly-Balls posted:

This is right in line with used 2070’s.

What says this thread on the 1080 Ti vs the 2070 for VR? Is the greater VRAM amount worth more on the 1080 Ti, or is the future-proofing worth more on the 2070?

The 1080 Ti is about as fast as a 2080, and a good deal faster than a 2070 (RTX naturally excepted), so it's the better deal. Seeing as RTX is incredibly heavy even on good hardware, and therefore unlikely to show up in VR soon (high frame rates being very necessary there), the 1080 Ti is the clear winner in that scenario.

HalloKitty fucked around with this message at 11:14 on Jun 14, 2019

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."
32 minutes...32 loving minutes

Cygni
Nov 12, 2005

raring to post

it's cool how he's able to make a living just talking about presentation slides for 30 minutes. the thing isn't even out yet.

VelociBacon
Dec 8, 2009

He knows his market, AMD owners have been used to slideshows for years.

NewFatMike
Jun 11, 2015

VelociBacon posted:

He knows his market, AMD owners have been used to slideshows for years.

:iceburn:

EdEddnEddy
Apr 5, 2012



VelociBacon posted:

He knows his market, AMD owners have been used to slideshows for years.

:golfclap:

I wonder if I could get a HW Review/Preview channel going that is dedicated to 5-minute-or-less clips, considering the trend... Hmmm

ilkhan
Oct 7, 2004

I LOVE Musk and his pro-first-amendment ways. X is the future.

VelociBacon posted:

He knows his market, AMD owners have been used to slideshows for years.
:drat:

GRINDCORE MEGGIDO
Feb 28, 1985


VelociBacon posted:

He knows his market, AMD owners have been used to slideshows for years.

Brutal

Arzachel
May 12, 2012

EdEddnEddy posted:

:golfclap:

I wonder if I could get a HW Review/Preview channel going that is dedicated to 5-minute-or-less clips, considering the trend... Hmmm

Youtube algorithms poo poo on clips shorter than 10 minutes, which is why a ton of stuff that should only take a couple minutes tops gets stretched out to hit that magical 10-minute mark.

SwissArmyDruid
Feb 14, 2014

by sebmojo

EdEddnEddy posted:

:golfclap:

I wonder if I could get a HW Review/Preview channel going that is dedicated to 5-minute-or-less clips, considering the trend... Hmmm

Pad it to 10, so you can monetize that poo poo.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Arzachel posted:

Youtube algorithms poo poo on clips shorter than 10 minutes, which is why a ton of stuff that should only take a couple minutes tops gets stretched out to hit that magical 10-minute mark.

It's more that Adored is unable to collect his thoughts enough to put out a concise video so every video starts from when Earth was a molten ball of metal and covers every intermediate step along the way. It's actually harder to be concise and still convey the information you want to and he's just not good at stripping out the extraneous stuff.

You can pad it to 10 minutes without every single one turning into a 35 minute slideshow. Some of these videos are really multi-parters that literally stretch to more than an hour. You don't need to be putting out feature-length tech presentations.

Compare his video to something like a Deviant Ollam video at Defcon or some other high-quality conference proceeding and you'll understand just how vapid Adored's content is. They cover way more content, more concisely, more interestingly. Adored is popular because he scratches the AMD fanboy itch (insert 1-hour "Why Intel sucks" and "Why NVIDIA sucks" videos here) not because it's good.

That said, I am unable to look away from a slow-motion trainwreck as much as anyone else. Whatever you say about Adored, the guy knows how to drive clicks with his "5.1 GHz stock Zen2" and "$250 Navi" videos as well as both the fanboy support and outrage he gets off them.

Paul MaudDib fucked around with this message at 19:30 on Jun 14, 2019

Enos Cabell
Nov 3, 2004


Bring back Vine, but only for video reviews

EdEddnEddy
Apr 5, 2012



Well then, 5 minutes of actual content with 5+ more of filler gameplay, or something else relevant but not mindless bumbling, maybe.

Maybe hide a game key or two in each video as an incentive for people who stick around to discover.. lol

Kazinsal
Dec 13, 2011


EdEddnEddy posted:

Well then, 5 minutes of actual content with 5+ more of filler gameplay, or something else relevant but not mindless bumbling, maybe.

Maybe hide a game key or two in each video as an incentive for people who stick around to discover.. lol

"Thank you for watching this three minute review containing all the content you actually need to make an informed decision. Please enjoy the following seven minutes of smooth jazz."

EdEddnEddy
Apr 5, 2012



Kazinsal posted:

"Thank you for watching this three minute review containing all the content you actually need to make an informed decision. Please enjoy the following seven minutes of smooth jazz."

Would you watch it?

Kazinsal
Dec 13, 2011


Definitely. I'd stick around for the smooth jazz/Apex gameplay/recitation of Vogon poetry as well.

EdEddnEddy
Apr 5, 2012



I might have to give this a try then....


On the other video length side, I did create the 10hr Spinfusor videos for Tribes 1 and Tribes 2. Mainly in response to the Goon Tribes Reunion a while back.

Arzachel
May 12, 2012

Paul MaudDib posted:

It's more that Adored is unable to collect his thoughts enough to put out a concise video so every video starts from when Earth was a molten ball of metal and covers every intermediate step along the way. It's actually harder to be concise and still convey the information you want to and he's just not good at stripping out the extraneous stuff.

Oh, Adored definitely manages to make his videos several times longer than they have any right to be without any outside influence, I just wanted to complain about Youtube contributing to bullshit padding :v:

Alpha Mayo
Jan 15, 2007
hi how are you?
there was this racist piece of shit in your av so I fixed it
you're welcome
pay it forward~
Do you get more YouTube monetization bux if someone watches the full 10 minutes, or is it less if they just watch the first 3 minutes of a 10-minute video?

ufarn
May 30, 2009
You get additional ads on it iirc.

RME
Feb 20, 2012

the real barrier in content creation for something like tech news is figuring out how to pump out regular content when real significant news only happens every few months

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Happy_Misanthrope posted:

32 minutes...32 loving minutes



I have to believe there's weirdos who treat it like a Scottish Accent ASMR. That's the only explanation.
