snickothemule
Jul 11, 2016

wretched single ply might as well use my socks
I remember having the R9 Nano paired with a 2500K and it whined terribly. The CPU was struggling at 1440p with it, and the card couldn't hit its rated clock of 1050 MHz, always sitting around 900-930. Pairing it with a 6800K helped get the clock speed to spec and I think the coil whine reduced a bit too, but undervolting was the only thing that ultimately helped.

I still have a soft spot for that little card :kiddo:


Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
If I screw with the frequency and it doesn't help, how do I go about undervolting? Is it as simple as reducing the power limit in Afterburner?

LRADIKAL
Jun 10, 2001

Fun Shoe
Yup

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Power limit is watts. You don't want to reduce that, especially on Turing, which has really strict power limits to begin with. IIRC reducing voltage is a bit of a pain in the rear end on Pascal/Turing but is still doable.

lDDQD
Apr 16, 2006
whoever's struggling with a wonky 5700xt: have you tried disabling pcie4?

Shrimp or Shrimps
Feb 14, 2012


If you want to undervolt with Afterburner, have a look for a video on the voltage curve (ctrl+f).

What I'm wondering is: would lowering the power limit (say by 10 percent) but then applying a core overclock of 10 percent achieve something similar? Voltage-curve undervolting is simply telling the card to hit the same clocks at a lower voltage, which is all the core clock offset is doing anyway, since the card has a strict upper voltage limit.

Arzachel
May 12, 2012

Shrimp or Shrimps posted:

If you want to undervolt with Afterburner, have a look for a video on the voltage curve (ctrl+f).

What I'm wondering is: would lowering the power limit (say by 10 percent) but then applying a core overclock of 10 percent achieve something similar? Voltage-curve undervolting is simply telling the card to hit the same clocks at a lower voltage, which is all the core clock offset is doing anyway, since the card has a strict upper voltage limit.

I *think* the power limit and core clock numbers serve as caps and don't actually modify the MHz/voltage curve, but maybe I'm wrong.
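One way to picture the difference is with a toy model (all the numbers below are made up for illustration — the curve points, the 50 mV undervolt, the 220 W cap, and the power constant are hypothetical, not real Turing telemetry). Treat the V/F curve as a set of (voltage, clock) points, model power as roughly proportional to f·V², and compare what a power cap alone allows versus what the same cap allows after the curve is shifted down in voltage:

```python
# Toy model of a GPU voltage/frequency curve (hypothetical numbers).
# Dynamic power scales roughly with clock * voltage^2.

# (voltage in V, clock in MHz) points on a stock V/F curve
STOCK_CURVE = [(0.80, 1600), (0.90, 1800), (1.00, 1950), (1.05, 2000)]

def power(voltage, clock, k=0.12):
    """Rough dynamic power estimate in watts: P ~ k * f * V^2."""
    return k * clock * voltage ** 2

def max_clock_under_cap(curve, power_cap):
    """Highest clock on the curve whose estimated power fits under the cap."""
    fitting = [clock for v, clock in curve if power(v, clock) <= power_cap]
    return max(fitting) if fitting else None

# An undervolt shifts the whole curve: same clocks at ~50 mV less.
UNDERVOLT_CURVE = [(v - 0.05, clock) for v, clock in STOCK_CURVE]

CAP = 220  # watts, hypothetical power limit
print(max_clock_under_cap(STOCK_CURVE, CAP))      # stock curve under the cap
print(max_clock_under_cap(UNDERVOLT_CURVE, CAP))  # undervolted curve, same cap
```

In this sketch the cap behaves like a ceiling, while the undervolt moves the curve itself, so the same power budget sustains a higher clock — which matches the intuition that a curve offset and a power-limit change are different levers.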

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Shrimp or Shrimps posted:

If you want to undervolt with Afterburner, have a look for a video on the voltage curve (ctrl+f).

What I'm wondering is: would lowering the power limit (say by 10 percent) but then applying a core overclock of 10 percent achieve something similar? Voltage-curve undervolting is simply telling the card to hit the same clocks at a lower voltage, which is all the core clock offset is doing anyway, since the card has a strict upper voltage limit.

This is how I had a functional GPU in my xps15 before Dell shipped a BIOS update that enabled "function happening" on it

The Electronaut
May 10, 2009
Not sure if this has been happening to anyone else: when starting up Rainbow Six Siege for the past couple of months I've had a weird momentary screen flicker and lost some (not all) system tray icons when launching the game. The driver update from Nvidia yesterday had release notes that finally clued me into what was happening: the Ultra Low Latency setting in the control panel, which I had enabled back in November or whenever that feature was added, was causing other applications to crash when BattlEye launched.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

The Electronaut posted:

Not sure if this has been happening to anyone else: when starting up Rainbow Six Siege for the past couple of months I've had a weird momentary screen flicker and lost some (not all) system tray icons when launching the game. The driver update from Nvidia yesterday had release notes that finally clued me into what was happening: the Ultra Low Latency setting in the control panel, which I had enabled back in November or whenever that feature was added, was causing other applications to crash when BattlEye launched.

that issue was seemingly specific to people using vulkan API, per that same patch note

The Electronaut
May 10, 2009

Statutory Ape posted:

that issue was seemingly specific to people using vulkan API, per that same patch note

The Vulkan API for R6S was only released in the past week or so.

It was this note that clued me into what might be happening. After updating drivers and rebooting, the issue went away for me.

[Battleye][Low-Latency Mode]: Launching Battleye with NVIDIA Low Latency Mode set to Ultra may cause DWM to reset. [2775906]


This issue, which I'm guessing you're referring to (apologies if wrong), wasn't what was affecting me, as I wasn't switching between windowed and full screen.

Windows 10 Only [Tom Clancy's Rainbow Six Siege][Vulkan][G-SYNC]: When playing the game in Vulkan mode with G-SYNC enabled, flickering occurs after switching the game between full-screen and windowed mode.[200578641]

Worf
Sep 12, 2017

If only Seth would love me like I love him!

The Electronaut posted:

Vulkan API for R6S was only released in the past week or so.

It was this note that clued me into what might be happening. After updating drivers and rebooting, the issue went away for me.

[Battleye][Low-Latency Mode]: Launching Battleye with NVIDIA Low Latency Mode set to Ultra may cause DWM to reset. [2775906]


This issue, which I'm guessing you're referring to (apologies if wrong), wasn't what was affecting me, as I wasn't switching between windowed and full screen.

Windows 10 Only [Tom Clancy's Rainbow Six Siege][Vulkan][G-SYNC]: When playing the game in Vulkan mode with G-SYNC enabled, flickering occurs after switching the game between full-screen and windowed mode.[200578641]

lol no apologies necessary, you got it, i misunderstood the post and thought it was the same issue

Craptacular!
Jul 9, 2001

Fuck the DH
I just want to say the latest video by DigitalFoundry is kind of neat. They looked at 1080p performance of some games using various engines (CryEngine, Frostbite DX12, etc.) with the most popular mid-range GPUs of the past few years, namely the 970 and 1060 6GB, and what kind of card was necessary to achieve that same performance at 1440p, for both Nvidia and AMD. Cost wasn't a factor here; it was simply "if you like this card at 1080p and want to move to 1440p, what gives you equivalent performance?"

One of the things they discovered is that Turing changed some things that newer titles have optimized for, but some older titles that no longer receive support will see hardly any improvement at all. Going from a 1060 to a 2060 basically gets you nothing in Crysis 3 due to the significant overhead of unoptimized code.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Craptacular! posted:

Going from a 1060 to a 2060 basically gets you nothing in Crysis 3 due to the significant overhead of unoptimized coding.

lol i mean

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
I have an RX 580 8GB Nitro+ by Sapphire that has been sitting unused for over a year. Never been flashed with a nonsense mining BIOS, totally unmodified. I used it mostly in CrossFire on a gaming PC and it was the secondary card, so its workload was extra light. If I had to guess, it probably has about 600 hours of full load on it, and the fans are the kind that just pop out for replacement.

Anyone wanna buy it for like $100 + shipping?
If you have the know-how you can apparently flash it to work in a Mac Pro and resell it for about $300.
Let me know here or PM and I'll make an SA-Mart thread.


Edit: its been sold

Fauxtool fucked around with this message at 20:01 on Feb 10, 2020

Macichne Leainig
Jul 26, 2012

by VG
How easy is it to replace a cap on a GPU if your soldering skills are mediocre to bad? My GTX 970 no longer outputs any video, and I think the culprit is a capacitor that's so loose I can wiggle it with a finger easily.

If it doesn't work out, eh, I still got 6 years out of the thing. Bought it the day after it came out on a 20% Micro Center open box discount as someone returned it as it was "too loud."

SwissArmyDruid
Feb 14, 2014

by sebmojo
Difficulty of soldering is inversely proportional to the number of tools you have at hand.

Yes, you can get by with just a soldering iron and solder, but if you have flux and wick, replacing caps becomes almost trivial.

Geemer
Nov 4, 2010



Protocol7 posted:

How easy is it to replace a cap on a GPU if your soldering skills are mediocre to bad? My GTX 970 no longer outputs any video, and I think the culprit is a capacitor that's so loose I can wiggle it with a finger easily.

If it doesn't work out, eh, I still got 6 years out of the thing. Bought it the day after it came out on a 20% Micro Center open box discount as someone returned it as it was "too loud."

Shouldn't be too bad. Pick up a piece of broken electronics/a thing you don't care about breaking and practice removing and re-attaching components a bit on that. That way you can get a feel for it.

Macichne Leainig
Jul 26, 2012

by VG

SwissArmyDruid posted:

Difficulty of soldering is inversely proportional to the number of tools you have at hand.

Yes, you can get by with just a soldering iron and solder, but if you have flux and wick, replacing caps becomes almost trivial.

I don't have flux; I should probably get some of that. I know it can help quite a bit. I have two different irons, solder, and wick though.

Geemer posted:

Shouldn't be too bad. Pick up a piece of broken electronics/a thing you don't care about breaking and practice removing and re-attaching components a bit on that. That way you can get a feel for it.

Yeah, I've got some Arduino stuff laying around. I should try with that first.

idiotsavant
Jun 4, 2000
2070S questions. this card was recommended earlier in the PC building thread. The goal is 1440/144 gaming starting with a build from scratch. CPU would be a Ryzen 3600, and the game I'm specifically interested in is Tarkov, though I'm also interested in upcoming stuff like Cyberpunk, etc.

I haven't built a computer or paid attention to parts for years, so I'm a little (or a lot) in the dark. The benchmark comparisons for the 2070S that I've seen with other cards make it seem ok - maybe not as much pure value as other cards but a decent balance between better performance and reasonable value. Is that the case or am I better off with something else? Also is the linked EVGA card a decent one or am I better off with another manufacturer? I'm a little concerned with noise (like every review comment seems like "great, but noisy/coil whine/etc") but I'm not sure if that's an issue with the card or with higher performance GPUs in general.

Final goal is that I haven't built a comp in a while, and I'm really not interested in having to gently caress around with things not working - reliability and not dicking with drivers etc is a priority. Not to restart the company quality argument, but it seems like NVIDIA is a little better there.

tldr: 1440/144 on a Ryzen 3600, priorities are ease of build, reliability, hopefully not insanely noisy card if possible. Is 2070S good, or should I go with something else?

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
2070S is fine and the EVGA ultra is a good option for $500.

VelociBacon
Dec 8, 2009

Just woke up, I loving dreamed last night that there was an nvidia page where you could see their fps limiter versus the RTSS limiter and you had that thing where it was a screenshot with the middle divider you could drag back and forth to see the render from the RTSS limiter vs the nvidia one and I was going through them in my dumb sleep brain sliding them back and forth like ehhh I dunno it's pretty slight maybe a little better color in the highlights there hmm.

GPUs have poisoned me.

lllllllllllllllllll
Feb 28, 2010

Now the scene's lighting is perfect!
Always go for the native limiter for more vibrant colours and crisper edges imho. It's the one less limiting.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

idiotsavant posted:

2070S questions. this card was recommended earlier in the PC building thread. The goal is 1440/144 gaming starting with a build from scratch. CPU would be a Ryzen 3600, and the game I'm specifically interested in is Tarkov, though I'm also interested in upcoming stuff like Cyberpunk, etc.

I haven't built a computer or paid attention to parts for years, so I'm a little (or a lot) in the dark. The benchmark comparisons for the 2070S that I've seen with other cards make it seem ok - maybe not as much pure value as other cards but a decent balance between better performance and reasonable value. Is that the case or am I better off with something else? Also is the linked EVGA card a decent one or am I better off with another manufacturer? I'm a little concerned with noise (like every review comment seems like "great, but noisy/coil whine/etc") but I'm not sure if that's an issue with the card or with higher performance GPUs in general.

Final goal is that I haven't built a comp in a while, and I'm really not interested in having to gently caress around with things not working - reliability and not dicking with drivers etc is a priority. Not to restart the company quality argument, but it seems like NVIDIA is a little better there.

tldr: 1440/144 on a Ryzen 3600, priorities are ease of build, reliability, hopefully not insanely noisy card if possible. Is 2070S good, or should I go with something else?

Yeah, it's a good card for 1440p/144. I like the 2080S a bit more for that spot when you can find it for $650 or less after coupons/whatnot but I can't see any of those right now, so this is probably the best value pick.

With a few exceptions at the bottom of the stack, there isn't a HUGE difference between cards. That one's just as good as some of the 2070S cards that go for $50-100 more.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

idiotsavant posted:

2070S questions. this card was recommended earlier in the PC building thread. The goal is 1440/144 gaming starting with a build from scratch. CPU would be a Ryzen 3600, and the game I'm specifically interested in is Tarkov, though I'm also interested in upcoming stuff like Cyberpunk, etc.

I haven't built a computer or paid attention to parts for years, so I'm a little (or a lot) in the dark. The benchmark comparisons for the 2070S that I've seen with other cards make it seem ok - maybe not as much pure value as other cards but a decent balance between better performance and reasonable value. Is that the case or am I better off with something else? Also is the linked EVGA card a decent one or am I better off with another manufacturer? I'm a little concerned with noise (like every review comment seems like "great, but noisy/coil whine/etc") but I'm not sure if that's an issue with the card or with higher performance GPUs in general.

Final goal is that I haven't built a comp in a while, and I'm really not interested in having to gently caress around with things not working - reliability and not dicking with drivers etc is a priority. Not to restart the company quality argument, but it seems like NVIDIA is a little better there.

tldr: 1440/144 on a Ryzen 3600, priorities are ease of build, reliability, hopefully not insanely noisy card if possible. Is 2070S good, or should I go with something else?

Tarkov is almost always CPU bound and you will never get good FPS. It doesn't care as much about your GPU. That said, in general the 2070S is a good 1440p card.

Steakandchips
Apr 30, 2009

How's a 1080 compare to a 2070s for 1440/144 gaming?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Steakandchips posted:

How's a 1080 compare to a 2070s for 1440/144 gaming?

https://www.anandtech.com/bench/product/2527?vs=2516

sauer kraut
Oct 2, 2004

Steakandchips posted:

How's a 1080 compare to a 2070s for 1440/144 gaming?

The 2070S is about 30% faster on average. But of course it's $500-550, as opposed to a decent used 1080 for $300 or less.
A bit late in the cycle now to buy a new Turing card imo.
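Those two figures make the value comparison easy to run yourself — a quick sketch using the numbers above (relative performance normalized so the 1080 = 1.0 and the 2070S = 1.3; the $525 is just the midpoint of the quoted $500-550 range):

```python
# Perf-per-dollar check using the figures from the post above.
used_1080 = {"perf": 1.0, "price": 300}   # decent used 1080
new_2070s = {"perf": 1.3, "price": 525}   # ~30% faster, midpoint of $500-550

ppd_1080 = used_1080["perf"] / used_1080["price"]
ppd_2070s = new_2070s["perf"] / new_2070s["price"]

print(f"used 1080: {ppd_1080:.5f} perf/$  vs  2070S: {ppd_2070s:.5f} perf/$")
# At these prices the used 1080 delivers roughly 35% more performance
# per dollar (525 / (300 * 1.3) ~= 1.35), which is the core of the
# "late in the cycle" argument.
```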

Glass of Milk
Dec 22, 2004
to forgive is divine
My 970 is starting to randomly crash my system. I'm just doing 1080p - would one of the single-fan 1660S cards be a good choice for a mATX system? It seems to be the best bang for the buck based on my limited research.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Glass of Milk posted:

My 970 is starting to randomly crash my system. I'm just doing 1080p - would one of the single-fan 1660S cards be a good choice for a mATX system? It seems to be the best bang for the buck based on my limited research.

How much are you saving going single-fan vs dual, and aren't the single-fan models triple-slot? I don't remember the benchmarks, but I'd be inclined toward a dual fan over a single. The 1660S is a/the good choice for 1080p.

Steakandchips
Apr 30, 2009


Thanks mate, well handy.


sauer kraut posted:

2070s is about 30% faster on average. But of course it's $500..550 as opposed to a decent, used 1080 for $300 or less.
A bit late in the cycle now to buy a new Turing imo.

I already have a 1080.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Glass of Milk posted:

My 970 is starting to randomly crash my system. I'm just doing 1080p - would one of the single-fan 1660S cards be a good choice for a mATX system? It seems to be the best bang for the buck based on my limited research.

Are you sure it isn't the power supply going bad?

StratGoatCom
Aug 6, 2019

Our security is guaranteed by being able to melt the eyeballs of any other forum's denizens at 15 minutes notice


lDDQD posted:

whoever's struggling with a wonky 5700xt: have you tried disabling pcie4?

Also check your PSU; Navi 1x had an overoptimistic power filtering setup and apparently they really flip out if given ripple. It's an early issue with 7nm-and-below GPUs; I imagine the 2x series on 7nm+ and Ampere will have significantly better filtering as a result.

Glass of Milk
Dec 22, 2004
to forgive is divine

Statutory Ape posted:

Are you sure it isn't power supply going bad?

I don't think so. The blue screen seems to indicate it hits some section of video memory and then crashes.

ItBreathes posted:

How much are you saving going single-fan vs dual, and aren't the single-fan models triple-slot? I don't remember the benchmarks, but I'd be inclined toward a dual fan over a single. The 1660S is a/the good choice for 1080p.

It was not much - $20? I'll look at a dual-fan; I should have the room in my case.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Steakandchips posted:

I already have a 1080.

I would hold onto it a bit longer, I don't think you'll get good value for money if you stump up for a 2070s.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Hi I found a thing

https://twitter.com/kiriekagarino/status/1227021740062191618?s=20

Edit: But wait, there's more where that came from: http://www.yeston.net/index/index/english

Sidesaddle Cavalry fucked around with this message at 01:23 on Feb 11, 2020

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
GN already reviewed the GPU, which is actually a 570 (lol chinese market), but I haven't seen that case before and my verdict is that it owns, even though it would run at about 120°C.

VelociBacon
Dec 8, 2009


I'm looking for a pc case.

it must be pinku (thats japanese for pink) or any girl color. It has to be of 2 or more ATX kubota (thats japanese for 2 compartments) and has be be chibi (small) sized for LANs. And has to be really kawaii (cute). Also It has to be about 10-20 bux. And you have to post pics of it first (i want to make shure it's kawaii [cute]). And it would be nice if it came with matching GPU (WITH backplate). OH! and it CANNOT have any cartoon pictures, or be made out of plastic. It has to be made of aluminum, or something like that. Also it would be nice if it was made in japan. and not in china or corea (korea) or whatever. I have found a pc case similar to the one im describing in e-bay, but it was M-ITX, and i dont want my GPU to touch my other things (it can get wet and i would not like that, plus E-ATX looks more kawaii)

Enos Cabell
Nov 3, 2004


VelociBacon posted:

I'm looking for a pc case.

it must be pinku (thats japanese for pink) or any girl color. It has to be of 2 or more ATX kubota (thats japanese for 2 compartments) and has be be chibi (small) sized for LANs. And has to be really kawaii (cute). Also It has to be about 10-20 bux. And you have to post pics of it first (i want to make shure it's kawaii [cute]). And it would be nice if it came with matching GPU (WITH backplate). OH! and it CANNOT have any cartoon pictures, or be made out of plastic. It has to be made of aluminum, or something like that. Also it would be nice if it was made in japan. and not in china or corea (korea) or whatever. I have found a pc case similar to the one im describing in e-bay, but it was M-ITX, and i dont want my GPU to touch my other things (it can get wet and i would not like that, plus E-ATX looks more kawaii)

:golfclap:


eames
May 9, 2009

Digital Trends reports new details on Intel's dGPU.

TL;DR: it looks like a modular architecture, with the top model using four chips and 400-500W power consumption. Assuming linear scaling, that would be four times the performance of the dev card.


https://www.digitaltrends.com/computing/intel-xe-graphics-gpu-reveals-tdp-tile-architecture-exclusive/
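That "assuming linear scaling" step is just multiplication, but it's easy to sanity-check as a back-of-envelope sketch (the single-tile baseline figures here are hypothetical placeholders, not numbers from the article):

```python
# Naive linear-scaling estimate for a multi-tile GPU.
# Baseline single-tile figures are hypothetical, for illustration only.
dev_card = {"tiles": 1, "rel_perf": 1.0, "watts": 115}

def scale_linearly(card, tiles):
    """Linear scaling assumption: perf and power both multiply by tile count."""
    return {"tiles": tiles,
            "rel_perf": card["rel_perf"] * tiles,
            "watts": card["watts"] * tiles}

top_model = scale_linearly(dev_card, 4)
print(top_model)  # 4x the perf of the dev card, at 4x its power draw
```

Of course, real multi-chip designs rarely scale perfectly linearly (inter-tile bandwidth and scheduling overhead eat into it), which is why the 4x figure is an upper bound on this assumption.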
