Worf
Sep 12, 2017

If only Seth would love me like I love him!

First 12 minutes are introduction and sponsors


RME
Feb 20, 2012

Mr.PayDay posted:

It’s usually the transition to 1440p that “forces” upgrades, as we're talking about going from 2.1 million pixels to 3.7 million pixels. The 980 Ti was one of the best price-performance GPUs (if not the best) of the last ~4 years, but the games and graphics engines of the last 2 years (especially open-world games) will brutally tank your fps.
I love playing on the best settings (why would I cut down the eye candy and visual immersion), but my 980 Ti got heavily pulled down in new games from 2017 on at 1440p, and by down I'm talking about serious sub-60 fps.

If you stay at full HD for a while, the 980 Ti is still perfectly fine.

when i got my 1060 i, kind of foolishly, didn't really foresee myself getting a 1440p monitor for a while
right now it barely hangs on to adequacy because G-Sync really helps smooth out its struggling, but every once in a while there's just going to be a scene that murders it. Hopefully it holds through Sekiro okay, and it should, because I'm kind of unsure what I really want to upgrade to and I should only have more options later
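For reference, the pixel figures in that quote check out with quick arithmetic (a sketch; 4K added for comparison):

```python
# Pixel counts for common gaming resolutions, relative to 1080p
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
base = pixels["1080p"]

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels ({count / base:.2f}x 1080p)")
# 1080p: 2,073,600 pixels (1.00x 1080p)
# 1440p: 3,686,400 pixels (1.78x 1080p)
# 4K: 8,294,400 pixels (4.00x 1080p)
```

So 1440p is a ~78% jump in shaded pixels over 1080p, which is why a card that was fine at 1080p suddenly struggles.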

craig588
Nov 19, 2005

by Nyc_Tattoo
I had a 4K monitor with a 680 that I ended up returning because the 4K implementation was too hacky. Too many games would get stuck on half the screen. I don't remember significant performance issues with it; I must have just been lucky with the games I was trying to play while I had it. I thought it was cool because I remembered a time when even 800x600 was crazy high for reasonable performance, and it seemed like resolutions had been figured out to the point where moving up wasn't dramatically difficult.

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

spasticColon posted:

Is anyone else having issues with Nvidia's newest driver? I had to roll back to the previous driver because my system would randomly freeze up for 1-2 seconds, audio would stutter, and then it would go back to working normally until the next random 1-2 second freeze-up. Everything seems to be running fine after rolling back to 419.17, though.

Ok I thought this was something on my end. I've had random hitching in BFV and actually had a full system shutdown after a freeze up.

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>
it COULD be a 5 minute article, but they don't make ad rev on the youtubes

Cygni
Nov 12, 2005

raring to post

I like the longer HardwareUnboxed/GamersNexus videos. Yall know you can just skip ahead to the graphs if ya want.

FuzzySlippers
Feb 6, 2009

Mr.PayDay posted:

In my Discord community I'm seeing a pattern of users with the 6700K, and serious hints that this CPU is a heavy bottleneck when combined with a 2070/2080/2080 Ti on new games/game engines.
Despite having upgraded to a 20-series Turing GPU, the fps jumps were often underwhelming, and games like Battlefield V, Kingdom Come: Deliverance, Hitman 2018, Metro, or now The Division 2 punish 4-core CPUs more than ever.

Pairing the best-in-slot gaming GPU, the 2080 Ti, with a 6700K will hold your shiny GPU back in new games.
If you're playing AAA games released in 2017 or earlier, the 6700K is fine. I had one myself until it became a serious bottleneck.

Unscientifically: after I bought a 2080 I decided to upgrade my very old i7 (2011-ish?) to new Intel stuff, and I didn't see much of a max-framerate change, but I definitely noticed my minimums improved and games overall felt smoother.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
If that video is like his other ones, it's long because he goes through every setting in detail and its performance hit, with visual comparisons. I love them for optimising.

Reality
Sep 26, 2010
I changed my settings while watching that and it was pretty helpful. It was either watch that or fumble around myself.

I’d do the “palm slam a vhs and do the moves with the main character” thing but I’m not that funny or dedicated to the joke.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
I'm surprisingly impressed by the RTX implementation in Exodus.

I didn't even get a 2080 to do RTX, it was just a better deal than a 1080Ti so I got it. I fully expected to just run with RTX off and not care. But drat is it good. There are so many random times in the game that I find myself losing sight of whatever I'm up to just to look at an especially good lighting implementation.

Don't get me wrong, it's far from perfect. You also need to be the right kind of person to appreciate it, because it's not really "flashy" like many graphic effects. In fact, it often makes the game darker than it otherwise would be, and some people probably hate that.

But there's something so awesome about it. The shadows cast on your gun when you're by a campfire at night. Even things like daytime illumination on NPC armor. The weirdest things in the game world captivate your attention in a very authentic and interesting way.

Anyways I just wanted to say that I'm way more impressed than I thought I would be. Ray tracing is the real deal. It's probably not worth getting into unless you have the cash to throw around, but it's been years since I've been so impressed by a graphic feature.

Really looking forward to the future of this tech. That's weird to say because if you asked me a month ago I would have said it was overhyped trash.

Ihmemies
Oct 6, 2012

I have a 2560x1600@60Hz monitor with DL-DVI. My first problem was that the GTX 970 couldn't run games at 60fps anymore, so I bought the very cheapest RTX 2060 with a DL-DVI port as a stopgap while waiting for LG's 144Hz 38" ultrawide (with DP) and maybe the 3xxx-series GPUs.

So this lovely Gainward 2060 Golden Sample GPU had a terrible cooling solution. It ran at 40C even while idle and climbed towards 80C even with the fans at full speed... Well, fan #2 worked erratically, so I RMA'd the card before doing anything to it and got a new one. I played for an evening with card #2 and everything worked fine.

Next day I slapped Arctic Cooling's Twin Turbo III cooler on it. It doesn't support the RTX 2060, which means you have to drill new holes in the cooler bracket, salvage M2 nuts from the backplate holder clips, and remove a piece of the bracket because of a protruding surface component. I also learned the new holes must be drilled about 4-5mm offset, because otherwise the cooler hits the GPU's connector bracket and doesn't fit. Luckily the cooler's baseplate was large enough to just barely cover the entire GPU die...

Anyways. Now I have the same fan problem as with card #1's stock fans. The fans work fine until the GPU core rises past 1350MHz, and then the fan speed starts ramping up and down allll the time. At this point I am extremely fed up with this poo poo, so I ordered a loving GPU 4-pin -> motherboard PWM 4-pin adapter from China (I actually ordered two, just in case one is broken): https://www.ebay.de/itm/pc-Cooler-C...HQAAOSw4SxbTbPF

With that cable I can connect the GPU fan to my motherboard and hopefully drive the fans without any bullshit piss rear end software/driver/$whatever loving with the fan speed. Until that cable arrives, I'm stuck with my GTX 970 again :sigh:

So... don't get used to quiet GPUs; your life will be easier. I first started replacing GPU coolers with this bad boy, a completely fanless cooler:



Since then I haven't had a moment of peace with stock GPU cooling solutions.

Cavauro
Jan 9, 2008

What was the first GPU to fully saturate and lose any performance on PCI-E 2.0?

craig588
Nov 19, 2005

by Nyc_Tattoo
16x? The 2080 Ti. If you mean technically ANY performance, the 680 ran like 1% slower on PCIe 2.0 vs 3.0.

Cavauro
Jan 9, 2008

I am very happy that you've told me.

craig588
Nov 19, 2005

by Nyc_Tattoo
The difference is smaller than I remembered. For some reason I thought the 2080 Ti had an 8% difference between 3.0 and 2.0, but it's still only like 3%: https://www.techpowerup.com/reviews/NVIDIA/GeForce_RTX_2080_Ti_PCI-Express_Scaling/6.html
x16 v1 = x8 v2 = x4 v3
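That equivalence follows from the per-lane transfer rates; a rough sketch (real-world throughput is a bit lower still, due to protocol overhead beyond the line coding):

```python
# Usable PCIe bandwidth: raw transfer rate per lane (GT/s) times line-code
# efficiency (8b/10b for gen 1/2, 128b/130b for gen 3), converted bits -> bytes.
def link_bandwidth_gbs(gen: int, lanes: int) -> float:
    gt_per_s = {1: 2.5, 2: 5.0, 3: 8.0}[gen]
    efficiency = {1: 8 / 10, 2: 8 / 10, 3: 128 / 130}[gen]
    return gt_per_s * efficiency * lanes / 8

print(f"gen1 x16: {link_bandwidth_gbs(1, 16):.2f} GB/s")  # 4.00 GB/s
print(f"gen2 x8:  {link_bandwidth_gbs(2, 8):.2f} GB/s")   # 4.00 GB/s
print(f"gen3 x4:  {link_bandwidth_gbs(3, 4):.2f} GB/s")   # 3.94 GB/s
```

Each generation roughly doubles per-lane bandwidth, so halving the lane count each generation lands on (almost) the same link speed.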

Mr.PayDay
Jan 2, 2004
life is short - play hard

Taima posted:

I'm surprisingly impressed by the RTX implementation in Exodus.

I didn't even get a 2080 to do RTX, it was just a better deal than a 1080Ti so I got it. I fully expected to just run with RTX off and not care. But drat is it good. There are so many random times in the game that I find myself losing sight of whatever I'm up to just to look at an especially good lighting implementation.

Don't get me wrong, it's far from perfect. You also need to be the right kind of person to appreciate it, because it's not really "flashy" like many graphic effects. In fact, it often makes the game darker than it otherwise would be, and some people probably hate that.

But there's something so awesome about it. The shadows cast on your gun when you're by a campfire at night. Even things like daytime illumination on NPC armor. The weirdest things in the game world captivate your attention in a very authentic and interesting way.

Anyways I just wanted to say that I'm way more impressed than I thought I would be. Ray tracing is the real deal. It's probably not worth getting into unless you have the cash to throw around, but it's been years since I've been so impressed by a graphic feature.

Really looking forward to the future of this tech. That's weird to say because if you asked me a month ago I would have said it was overhyped trash.

I'm serious when I write that I “miss” RTX features in The Division 2. It has nice graphics overall, but drat did Battlefield V and Metro spoil me with reflections (BF V) and ambient occlusion + global illumination (Metro).
The Division 2 would pair perfectly with immersive RTX stuff.
RTX adds a visually enhanced experience that I will now miss in most games that don't offer it :crossarms:

VelociBacon
Dec 8, 2009

Mr.PayDay posted:

I'm serious when I write that I “miss” RTX features in The Division 2. It has nice graphics overall, but drat did Battlefield V and Metro spoil me with reflections (BF V) and ambient occlusion + global illumination (Metro).
The Division 2 would pair perfectly with immersive RTX stuff.
RTX adds a visually enhanced experience that I will now miss in most games that don't offer it :crossarms:

I have a 2080 Ti and also enjoy RTX stuff. I haven't played the new Division, but I do think a lot of what you're perceiving as RTX can be achieved with the ways of approximating ray tracing they've already been using for years. I think I just enjoy great lighting in games, regardless of whether it comes from RTX or not. I imagine level designers and lighting artists in the industry are so used to making things look great without ray tracing that it'll be 3-4 years before RTX clearly pulls ahead.

repiv
Aug 13, 2009

never learn how the non-raytracing approximations work, you'll see their flaws everywhere once you do :suicide:

crazypenguin
Mar 9, 2005
nothing witty here, move along

Unfortunately, not a great article.

Game engines just aren't written to transfer a lot of data over PCIe normally. The key word is "normally," though. The statistic to compare when checking whether PCIe speed is acceptable is minimum FPS, 99th percentile, or similar good frame-time data.

Any article about graphics cards and PCIe that only reports average FPS is just garbage. You went through all that trouble and ignored the most interesting statistics!

That said, IIRC, PCIe 2.0 x16 (equivalent to PCIe 3.0 x8) is generally regarded as "fine, usually." I forget where to find an actually good benchmark of this, however.
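To make that concrete, here's one way to pull those statistics out of a per-frame-time capture (a sketch; tools like PresentMon log frame times in milliseconds):

```python
# Given per-frame render times in milliseconds, compute average FPS
# and a "1% low" style metric (FPS at the 99th-percentile frame time).
def fps_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000 * n / sum(frame_times_ms)
    ordered = sorted(frame_times_ms)
    p99 = ordered[min(n - 1, int(0.99 * n))]  # 99th-percentile frame time
    return avg_fps, 1000 / p99                # second value: "1% low" FPS

# Two captures with (nearly) the same average but very different smoothness:
smooth = [16.7] * 100
spiky = [15.0] * 95 + [50.0] * 5

for name, times in [("smooth", smooth), ("spiky", spiky)]:
    avg, low = fps_stats(times)
    print(f"{name}: avg {avg:.0f} fps, 1% low {low:.0f} fps")
# smooth: avg 60 fps, 1% low 60 fps
# spiky: avg 60 fps, 1% low 20 fps
```

Both runs average ~60 fps, but only the percentile metric exposes the stutter, which is exactly why average-only PCIe scaling articles miss the point.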

coke
Jul 12, 2009

Ihmemies posted:

Anyways. Now I have the same fan problem as with card #1's stock fans. The fans work fine until the GPU core rises past 1350MHz, and then the fan speed starts ramping up and down allll the time. At this point I am extremely fed up with this poo poo, so I ordered a loving GPU 4-pin -> motherboard PWM 4-pin adapter from China (I actually ordered two, just in case one is broken): https://www.ebay.de/itm/pc-Cooler-C...HQAAOSw4SxbTbPF


What if you just kept the stock heatsink but removed the fan/shroud and zip-tied one or two 140mm ultra-quiet Noctua fans on it?
You can even restore the original lovely stock fan setup when you resell it.

coke fucked around with this message at 05:21 on Mar 18, 2019

Ihmemies
Oct 6, 2012

coke posted:

What if you just kept the stock heatsink but removed the fan/shroud and zip-tied one or two 140mm ultra-quiet Noctua fans on it?
You can even restore the original lovely stock fan setup when you resell it.

I can only guess, but I assume the heatsink is the worst part of the stock cooler. It looks/feels more like a restrictive heat box where air can't flow through and heat can't escape: https://www.techpowerup.com/reviews/Palit/GeForce_RTX_2060_Gaming_Pro_OC/4.html In comparison, the Accelero is a very sparse design where air can move freely. Or maybe Palit/Gainward just uses bad TIM?



Whatever the reason, I've always had a noticeably better experience with the Accelero cooler than with any stock GPU cooler I've used. This cheap RTX 2060 has already become very expensive: 50€ for the aftermarket cooler, 15€ for VRM & memory heatsinks, some tools like drills, a file, a new blade for the metal saw, etc... and now the fan adapter cable.

I could have saved 25€ by buying the Twin Turbo II model, which includes the heatsinks, but I didn't realize they still sell it: https://www.amazon.de/Arctic-Accele...=gateway&sr=8-3

The backplate heatsink on the Twin Turbo III is so big it doesn't fit between my GPU and CPU heatsink.

craig588
Nov 19, 2005

by Nyc_Tattoo
Adding normal fans to a GPU only really makes sense if you already have them. They might cost 30 dollars, you need to come up with your own mounting (which can be easy but still isn't free), and they'll probably perform 90% as well as a 3rd-party cooler. Or spend 50 dollars on a 3rd-party cooler that will just work. I have a Noctua'd 1080 and am completely satisfied with it, but I only went that route because I already had the spare fans from my D15.

craig588 fucked around with this message at 13:21 on Mar 18, 2019

mcbexx
Jul 4, 2004

British dentistry is
not on trial here!



Is MSI the only manufacturer who offers cards that keep their fans off below 60°C? There was a Gainward 2080 Triple Fan on sale that I almost picked up, but after a quick check it turned out that keeping the fans off at idle is not a common feature. I thought it was.

craig588
Nov 19, 2005

by Nyc_Tattoo
I had an Asus 980 that turned off its fans at idle. I disabled that in the BIOS, because running at sub-30C at noise levels I couldn't hear made me happier.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
It's a feature on most cards from the 1000 series onward.

orcane
Jun 13, 2012

Fun Shoe

mcbexx posted:

Is MSI the only manufacturer who offers cards that keep their fans off below 60°C? There was a Gainward 2080 Triple Fan on sale that I almost picked up, but after a quick check it turned out that keeping the fans off at idle is not a common feature. I thought it was.

It's a common feature but only for higher tier SKUs (MSI Gaming, IIRC the Armor does it too nowadays, Asus Strix, Gigabyte Gaming etc.). The "budget" cards with weaker coolers frequently keep the fans running.

craig588 posted:

I had an Asus 980 that turned off its fans at idle. I disabled that in the BIOS, because running at sub-30C at noise levels I couldn't hear made me happier.

As opposed to 35°C and you can't hear them even in an open case? :v:

craig588
Nov 19, 2005

by Nyc_Tattoo
Yeah, that would have been good, but the card ran at about 50C with the fans stopped in my case, and it made no sense to me to run it warm for no benefit.

sanchez
Feb 26, 2003
Would it be dumb to buy a used GTX 1070 from eBay for ~$250 this week? The GTX 1660 Ti is similarly priced new and close in performance, but I mostly play X-Plane, which apparently appreciates the extra 2GB of VRAM on the 1070. X-Plane seems to perform better on Nvidia cards for whatever reason, or I'd also be looking at a Vega 56. I currently have a 4690K with 24GB of DDR3 and an R9 270 2GB.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Mr.PayDay posted:

I'm serious when I write that I “miss” RTX features in The Division 2. It has nice graphics overall, but drat did Battlefield V and Metro spoil me with reflections (BF V) and ambient occlusion + global illumination (Metro).
The Division 2 would pair perfectly with immersive RTX stuff.
RTX adds a visually enhanced experience that I will now miss in most games that don't offer it :crossarms:

:same:

Division 2 with RTX Shadows + GI would probably cause my eyes to explode.

VelociBacon
Dec 8, 2009

9900K @ 5GHz and a 2080 Ti XC Ultra with +145 on the clock, getting around 80-120fps with all settings maxed, depending on indoor/outdoor. I don't really know how RTX would go; I'd probably have like 60fps?

Cygni
Nov 12, 2005

raring to post

Leather Father has a keynote at GTC today at 2pm. It's a deep learning/AI conference, so expect lots of that talk. Considering Intel/Facebook went hard this week with Nervana products and the OAM form factor aimed right at Nvidia's SXM market dominance, it might be interesting.

Captain Yossarian
Feb 24, 2011

All new "Rings of Fire"

sanchez posted:

Would it be dumb to buy a used GTX 1070 from eBay for ~$250 this week? The GTX 1660 Ti is similarly priced new and close in performance, but I mostly play X-Plane, which apparently appreciates the extra 2GB of VRAM on the 1070. X-Plane seems to perform better on Nvidia cards for whatever reason, or I'd also be looking at a Vega 56. I currently have a 4690K with 24GB of DDR3 and an R9 270 2GB.

If you aren't in a rush, I picked up a 1070 on eBay for $200 a couple weeks ago so the deals are out there.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Microsoft standardizes variable-rate shading as a part of DirectX 12

Another one of those ideas where you kind of look at it and wonder, ".....why the gently caress didn't we get this sooner?"

repiv
Aug 13, 2009

VRS is cool. Most of the coverage focuses on the ability to undersample tiles for better performance, but it can actually supersample individual tiles as well.

That could probably be used to fix up the difficult cases where TAA doesn't work well, by brute-force supersampling a small number of tiles each frame, similar to the ATAA idea but without the need for raytracing.

Maybe even do some pseudo-foveated rendering in an FPS by supersampling the area around the crosshair where you're usually focused.
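That crosshair idea can be sketched as a per-tile rate map. This is a toy illustration only: real VRS goes through the graphics API, and the tile size and rate labels here are assumptions, not actual API enum values.

```python
# Toy shading-rate map: supersample tiles near the crosshair, shade
# normally in a middle band, and coarsen toward the screen edges.
TILE = 16  # VRS tile size in pixels (hardware-dependent)

def shading_rate(tile_x, tile_y, tiles_w, tiles_h):
    # Distance of this tile from the screen center, as a fraction of
    # the farthest (corner) distance.
    cx, cy = tiles_w / 2, tiles_h / 2
    dist = ((tile_x - cx) ** 2 + (tile_y - cy) ** 2) ** 0.5
    max_dist = (cx ** 2 + cy ** 2) ** 0.5
    if dist < 0.1 * max_dist:
        return "2xSSAA"  # supersample the area around the crosshair
    elif dist < 0.6 * max_dist:
        return "1x1"     # one shade per pixel (normal)
    return "2x2"         # one shade per 2x2 pixel block at the periphery

tiles_w, tiles_h = 1920 // TILE, 1080 // TILE
rates = [shading_rate(x, y, tiles_w, tiles_h)
         for y in range(tiles_h) for x in range(tiles_w)]
```

The bulk of the screen lands in the cheap or normal bands, so the extra cost of supersampling the handful of center tiles stays small.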

repiv fucked around with this message at 22:42 on Mar 18, 2019

repiv
Aug 13, 2009

Nvidia is going to enable their compute-based DXR path on Pascal and Turing GTX soon: https://www.nvidia.com/en-us/geforce/news/geforce-gtx-ray-tracing-coming-soon/

I don't know what the use-cases would be though, it's probably too slow to run any of the DXR stuff we've seen so far at usable speeds.

There's some new RTRT demos too. Unity showed their implementation for the first time :dogcited:

https://www.youtube.com/watch?v=AG7DDXwYpD0

https://www.youtube.com/watch?v=Kic-QDmS_Yw

https://www.youtube.com/watch?v=b2WOjo0C-xE

https://www.youtube.com/watch?v=ncv5Oe5q5zs

repiv fucked around with this message at 23:41 on Mar 18, 2019

EdEddnEddy
Apr 5, 2012



One thing that seems clear is that the pre-RTX style of rendering still needed all the raw horsepower just to push 1440p/4K, and now that we're close enough, give or take, we can start using either dedicated hardware or leftover resources in current hardware to do some ray tracing. If we had added it earlier, even with optimizations, would it have been possible to render at even 1080p? There are a ton of variables to consider, but it would make an interesting research paper once this tech is out: the performance penalty of even super-optimized ray tracing on RTX hardware versus anything prior.

Overall, RTX/Tensor cores or not, Nvidia did kind of kick off the next level of rendering tech. It feels like the GeForce 3 days all over again in some ways.


Oh, and want some actual Quake II RTX?

EdEddnEddy fucked around with this message at 23:25 on Mar 18, 2019

NewFatMike
Jun 11, 2015

I'd be very interested in seeing whether my GTX 1080 can push Quake II RTX, even if it's nothing more than a tech demo.

repiv
Aug 13, 2009

I tried Q2VKPT on my 1070 when they (accidentally?) enabled Vulkan raytracing on Pascal in the Linux driver, and it ran at about 13fps at 1080p :cheeky:

Who knows if Q2RTX will run better or worse.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

SwissArmyDruid posted:

Microsoft standardizes variable-rate shading as a part of DirectX 12

Another one of those ideas where you kind of look at it and wonder, ".....why the gently caress didn't we get this sooner?"

Caller ID and *69 always existed; the phone companies just didn't have an incentive to give the average person access to them until people started asking why they didn't have them.

Also, I've a feeling MS is adding poo poo to DirectX to give their next console more marketing buzzwords.


SwissArmyDruid
Feb 14, 2014

by sebmojo
On the contrary, I think these features improve a lowest common denominator GPU by reducing the amount of work it needs to do, so that it can focus resources where they need to go.

I mean, yeah, there's nothing stopping them from ALSO using it as marketing, but it still helps.

We'll see in August.
