Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

CrazyLoon posted:

If anyone has a bit of time to help me solve a new pc build conundrum:

Been hooking up my monitor, whose only option to connect to my RX 580 8GB (which happens to be an ex-mining card) is an HDMI 1.4 port. Everything else seems to work, I'm sure the card is clicked into the slot properly + is obviously powered on due to the light on it...but even though the computer starts up, the monitor can't get an HDMI signal and shows nothing (while it has no problem doing so at all on my older GTX 650Ti on my older computer).

I suspect it could be the fact it's a mining card, meaning its drivers are just wrong, but even so...shouldn't it at least be able to display basic BIOS and such? Or am I being stupid about something?

yes, the card should be able to run in at least a limited mode (e.g. 640x480 or something) right from boot to let you edit BIOS and stuff

Double-check the 8-pin cable is plugged in. It could be giving you a green light even if it's not plugged in all the way or whatever.

if you have another HDMI cable then give that a shot, it's always worth a try. Or try DVI and see if that helps (HDMI and DVI are pretty much the same thing with a different connector).

But I suspect you're right and the card has a messed up VBIOS from mining.

Plug your 650 Ti back in, put the RX 580 in a second slot, go to TechPowerUp's website and try to find the right VBIOS for your card, flash it, and see if it helps. It may take a couple of tries; there can be multiple VBIOSes for the same card with different VRAM chip timings and so on. If you can get a GPU-Z readout, it might tell you what brand of memory the card has, assuming it's getting far enough for GPU-Z to talk to it.
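If you want to script the reflash, the flow is roughly this. A sketch only: amdvbflash is the usual tool for Polaris cards, but the adapter index and exact flags here are assumptions, so follow the instructions TechPowerUp posts alongside the VBIOS database.

```python
# Rough sketch of the reflash, shelling out to AMD's amdvbflash tool.
# Adapter index 0 and the flag spellings are assumptions -- verify against
# the instructions on TechPowerUp before running anything.
import subprocess

def run(*cmd):
    print(">", " ".join(cmd))
    subprocess.run(cmd, check=True)

run("amdvbflash", "-i")                            # list adapters, find the RX 580's index
run("amdvbflash", "-s", "0", "backup_mining.rom")  # back up the current (mining) VBIOS first
run("amdvbflash", "-p", "0", "rx580_stock.rom")    # program the stock ROM you downloaded
```

Keep the backup ROM somewhere safe; it's your way back if the stock image turns out to be the wrong variant.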

There's an outside chance it could just be fucked up entirely though, blown display I/Os or whatever.

Paul MaudDib fucked around with this message at 21:02 on Oct 23, 2019


TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Blorange posted:

I'm confused by these numbers; people are claiming that the driver is measurably stuttering for 1 millisecond? Something might be wrong with the display driver's input lag, but it feels like they're posting these numbers simply because they're easy to measure.

People who use latencymon are almost universally nutjobs, yes. It's like electromagnetic hypersensitivity for gamers. If you're bored and want to stare deep into the abyss, there's a both hilarious and somewhat unnerving thread on overclock.net where these people fuck around with basically any setting they can get their hands on in their quest to make a meaningless number go down. There are some pretty spicy claims in there; the thread has at various points attributed input lag and "floaty mouse" to things like allowing the CPU VRM to turn off some of the power stages at low power settings, the Windows keyboard layout setting (English Philippines is lowest latency, apparently), DisplayPort cables (HDMI is clearly lower latency), PWM control of chassis fans, the fan cables themselves, the power grid, and many other things. It's some bizarre shit.

TheFluff fucked around with this message at 22:42 on Oct 23, 2019

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
remember when r/AMD woo-woo people started taking pictures of their monitors to "prove" that AMD gives you a sharper image and better detail

edit:

https://www.reddit.com/r/Amd/comments/9z7ezh/amd_vs_nvidia_image_quality_does_amd_give_out_a/

https://www.reddit.com/r/Amd/comments/c7yxrb/question_is_nvidia_cheating_on_image_quality_can/ (oh Coreteks, you never fail to disappoint me...)

"yeah I can definitely see a difference in this texture compression, ngreedia cheating again!" [ed note: texture compression is lossless and AMD uses it too since GCN 1.2...]

There is this weird AMD media-sphere with people like Adored, Coreteks, Mindblanktech, RedGamingTech, Moore's Law Is Dead, etc., where a little vague tech knowledge meets ayyymd logic and the magic happens.


vvv GPU, not CPU, but then there are the woo-woos who think that AMD gives you a "smoother experience" but it can't be measured in 1% or 0.1% frametimes (or even just FCAT timings) somehow... but judging by the woo-woos on the last page it looks like it's Ryzen with the interrupt performance problems :v: vvv

Fully expect to see "NVIDIA microstutter problems!" everywhere tomorrow though, full blast from the AMD mediasphere.

Paul MaudDib fucked around with this message at 23:18 on Oct 23, 2019

taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

Paul MaudDib posted:

remember when r/AMD woo-woo people started taking pictures of their monitors to "prove" that AMD gives you a sharper image and better detail

We're going to need a 'Ridicule CPUophiles' thread.

Craptacular!
Jul 9, 2001

Fuck the DH
I gotta say that the cool thing about being locked in the vault of G-Sync monitor hardware is that you stop caring about the GPU wars, because you're gonna buy Nvidia no matter what it does or doesn't do vis-a-vis AMD.

Nvidia can only compete against themselves for my dollar, and I gotta say their past selves from a few years ago are kicking their present selves' ass at the moment, in the name of "innovation".

Cavauro
Jan 9, 2008

If I were trying to make that sound good I would go with simple instead of cool

Craptacular!
Jul 9, 2001

Fuck the DH
Yes, but there was definitely a time when I would have thrown time and effort into changing GPUs over stuttergate, or turned into one of these message board jihadists. But it is just another in a series of very insignificant complaints I have about Nvidia. I've already had to throw away my Hackintosh installation and settle for a buggier Linux (Nvidia is currently not great if you care about non-Windows desktops), so what's one more microcomplaint on the pile when I am, generally speaking, still satisfied.

It reins in a terrible habit I have of discarding good hardware in pursuit of perfection.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Hey I can appreciate the train of thought that "I am a neurotic person, and I like being locked into something because I don't worry about it as much". You know what works for you, nothin' wrong with that.

Cavauro
Jan 9, 2008

I'm sorry. It actually was cool, in the end.

repiv
Aug 13, 2009

Nvidia has some RTX on/off comparisons for the new Call of Duty. It's not an earth-shattering improvement, but at least it's relatively cheap compared to other RTX implementations.

ufarn
May 30, 2009

repiv posted:

Nvidia has some RTX on/off comparisons for the new Call of Duty. It's not an earth-shattering improvement, but at least it's relatively cheap compared to other RTX implementations.
I watched a stream of it, and it’s incredibly selectively applied, which is pretty jarring outside small, dark areas.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Yeah it's kind of like the medium setting on Tomb Raider, which is a waste of time. Ultra on Tomb Raider looks cool because all light sources give ray-traced shadows, but is of course v spendy

Fallows
Jan 20, 2005

If he waits long enough he can use his accrued interest from his savings to bring his negative checking balance back into the black.

Statutory Ape posted:

He didn't pick the worst Valve product from that era to speculate GPU purchases on, at least

https://www.youtube.com/watch?v=8KPIPIleULo

VelociBacon
Dec 8, 2009

This is the Ultra vs Low particle lighting slider and I can't believe how little of a difference it makes.

Stickman
Feb 1, 2004

VelociBacon posted:

This is the Ultra vs Low particle lighting slider and I can't believe how little of a difference it makes.

Honestly, it looks like the sort of effect that might be more obvious in motion - the explosion is brighter and seems to have more definition and contrast, so I could imagine it "popping" more in action. Then again, maybe not!

Worf
Sep 12, 2017

If only Seth would love me like I love him!

That latency post on Reddit felt like watching a group of people discuss the shapes of the clouds on the bluest day

Indiana_Krom
Jun 18, 2007
Net Slacker

Statutory Ape posted:

That latency post on Reddit felt like watching a group of people discuss the shapes of the clouds on the bluest day
The most amusing part is that higher DPC latency can be the result of a billion different things, like the display driver bundling up more work to avoid pipeline stalls and get higher efficiency out of the system, so that it takes less raw CPU time away from everything else.
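A toy model of that tradeoff, with completely made-up microsecond costs, just to show the shape of it:

```python
# Toy model: why a driver that batches work posts scarier DPC numbers in
# latencymon while using *less* total CPU time. All costs are invented.
OVERHEAD_US = 2.0   # fixed cost to enter/exit one DPC
WORK_US = 0.5       # cost to process a single event
EVENTS = 1000

def per_event():
    # one DPC per event: tiny worst-case latency, maximum total overhead
    return EVENTS * (OVERHEAD_US + WORK_US), OVERHEAD_US + WORK_US

def batched(batch=50):
    # one DPC per batch: overhead amortized, but each DPC runs longer
    dpcs = EVENTS // batch
    return dpcs * (OVERHEAD_US + batch * WORK_US), OVERHEAD_US + batch * WORK_US

print("per-event: cpu_us=%.0f, worst_dpc_us=%.1f" % per_event())  # 2500 / 2.5
print("batched:   cpu_us=%.0f, worst_dpc_us=%.1f" % batched())    # 540 / 27.0
```

Same work either way; the batched driver just shows a 10x "worse" number while leaving far more CPU for everything else.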

iospace
Jan 19, 2038


Statutory Ape posted:

That latency post on Reddit felt like watching a group of people discuss the shapes of the clouds on the bluest day

Yeah, but it gives the AMD true believers ammo, so it's hard to say if it's bad or not.

Mr.PayDay
Jan 2, 2004
life is short - play hard

Stickman posted:

That'll depend heavily on the price/performance improvement in the next generation and how much RTX becomes a "must-have" feature, both of which are still up in the air. They'd need to dedicate relatively more silicon to raytracing to improve the relative performance hit, and by all accounts 7nm is already pretty expensive compared to previous node shrinks.

E: If we do get a decent price/performance boost, I'd expect the 2080 Ti to take the biggest relative hit, since its marginal cost increase is much higher than the marginal increases lower in the stack (outside of the 2080 Super)


It was mostly because the original question specifically asked about the NVidia models. But yeah, AMD is the way to go if you don't care about rayz.

I have some buddies who skipped the 2xxx gen because they got a 1080 Ti and did not want to pay $1,200 for an additional 35% avg fps plus RT.
These guys will definitely buy a 3080 Ti, which should be 30-40% faster with RTX on and off compared to the 2080 Ti, and 60-80% faster than a 1080 Ti (quick compounding check below).
But how will NVidia set the price tag? Every Ti gen has been way more expensive than the one before it, since the 780 Ti iirc.
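(Quick compounding check on those numbers, since generational gains multiply rather than add; all figures are my estimates above, not benchmarks:)

```python
# Compounding the estimates above: gains multiply across generations.
uplift_2080ti = 1.35                 # 2080 Ti vs 1080 Ti: "+35% avg fps"
for uplift_3080ti in (1.30, 1.40):   # 3080 Ti vs 2080 Ti: "+30-40%"
    total = uplift_2080ti * uplift_3080ti
    print(f"3080 Ti vs 1080 Ti: +{(total - 1) * 100:.0f}%")
# prints +76% and +89%, so if anything a bit above the 60-80% guess
```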

An incentive and a problem at the same time might be that the 3080 Ti will be the first GPU that can hold 60 fps at 4K Ultra, but that is a niche.
The transition from full HD to 1440p is still in its early stages and might need some more years, because GPUs are expensive. The 2080 Ti has so much power at 1440p (80-200 avg fps on ultra, depending on the game and engine) that a 3080 Ti would not be a useful upgrade, at least with RTX off, so 2080 Ti owners might skip the 3xxx gen, and that keeps prices high for the few used 2080 Tis that hit the secondhand market.


I still hope SLI/nvlink gets a revival.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Mr.PayDay posted:

I still hope SLI/nvlink gets a revival.

I don't. I'd far prefer that nVidia put way more R&D money into single-package efficiency. That doesn't rule out the possibility of them doing a Ryzen-like architecture with multiple dies per package (if Intel can put out a CPU that's the size of a playing card, there's no reason why nVidia couldn't put out a Ti/Titan card of the same size), and naming *their* version of the ~Infinity Fabric~ something stupid like ~Quantum NVLink~, but until hooking two cards up in parallel yields a 100%+ performance gain with no driver dickery, SLI/NVLink should stay 'dead.'

Maybe as we get up into PCIe 5 and 6 as the bandwidth grows and the latencies shrink, we could finally see a CrossFire-like interface that could provide something approaching that.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
I bought myself a Corsair H55, and tomorrow I'll be heading to Micro Center to buy a Kraken G12 to cool my 2070 Super. Doing some reading tonight, I think I'll need heatsinks for the VRM/VRAM? If so, would these Raspberry Pi ones work? Also, my card is an EVGA RTX 2070 Super Black Gaming, the cheapest non-blower one they make. Does anyone know if it's the reference PCB design? Apparently reference 2070 Supers use the same mounting as 2080s, which are listed as compatible with the G12.

Why did I jump down this stupid rabbit hole.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Endymion FRS MK1 posted:

I bought myself a Corsair H55, and tomorrow I'll be heading to Micro Center to buy a Kraken G12 to cool my 2070 Super. Doing some reading tonight, I think I'll need heatsinks for the VRM/VRAM? If so, would these Raspberry Pi ones work? Also, my card is an EVGA RTX 2070 Super Black Gaming, the cheapest non-blower one they make. Does anyone know if it's the reference PCB design? Apparently reference 2070 Supers use the same mounting as 2080s, which are listed as compatible with the G12.

Why did I jump down this stupid rabbit hole.

You don't need the heatsinks, as the G12 has a fan for them, but they certainly won't hurt. If you wanted, the ones you linked, or basically any others with self-adhesive thermal tape, would work fine. VRAM doesn't really need much cooling.

Frankly, if you're gonna slap heatsinks on anything, put them on the VRMs first.

And you're doing this because it's awesome to have a whisper silent GPU with a cooling solution you will very likely be able to move over to your next card, as well!

Worf
Sep 12, 2017

If only Seth would love me like I love him!

how are yall fixing nvidia's GFE error 0x0003


i nuked the program itself and it's still acting fucky after reinstall

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib
You could try completely nuking anything Nvidia-related with DDU (disable your network connection before doing this so Windows doesn't automatically reinstall a driver).
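The other fix that gets passed around for 0x0003, before going full DDU, is bouncing every NVIDIA service. A rough sketch; the "nv" prefix match is an assumption (service names change between driver versions, and it may catch unrelated services), so eyeball the list before running it:

```python
# Sketch of the commonly suggested 0x0003 fix: restart all NVIDIA services.
# Run from an elevated prompt; the "nv" name filter is an assumption.
import subprocess

out = subprocess.run(["sc", "query", "type=", "service", "state=", "all"],
                     capture_output=True, text=True).stdout
names = [line.split(":", 1)[1].strip() for line in out.splitlines()
         if line.strip().startswith("SERVICE_NAME")]
for name in (n for n in names if n.lower().startswith("nv")):
    subprocess.run(["sc", "stop", name])
    subprocess.run(["sc", "start", name])
```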

Worf
Sep 12, 2017

If only Seth would love me like I love him!

You know what, I suspected that answer was coming, and I've already decided to nuke the entire boot drive, which is now in progress!

I assume it won't even fix it and I'll be back in an hour with 0x0003.2


E: lol, forgot this tidbit i read ITT the other day re: you don't have to install GFE anymore, it's just an option at the start.

blesssss




e2: oh ok so, all those errors are fixed, amazing. however the only real reason i'm trying to get this shit fixed is so i can play the new call of duty on this pc. i guess people are having issues getting the GTX 1080 Ti to work (probably other cards too) when force enable IGPU is active instead of auto in BIOS

i had/have force enabled that option because my tertiary screen is plugged into the IGPU output. anybody know of a workaround for this, or has nvidia unfucked 3 monitors (for now) again?


E: lol, i like the way my nvidia shit performs, but forcing me to go into the BIOS (and lose my third monitor entirely, as currently set up) to play call of duty, because my top end consumer card is unable to appropriately use even a fraction of its outputs without drawing hundreds of times the power it is designed to, is going to get me to switch ASAFP

Worf fucked around with this message at 13:00 on Oct 28, 2019

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Can you manually set the card power state like in the early days of shitty dual monitor support?

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
If you go to the Guru3D forums, there's a guy who provides "clean" versions of all new nVidia drivers for GTX and RTX cards, free of GFE and everything superfluous. There's also a utility out there now called NVSlimmer that enables you to do your own trimming and customizing of a driver package.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

What does some random dude's driver offer over unchecking the "install gfe" box in Nvidia's installer?

Fantastic Foreskin fucked around with this message at 15:36 on Oct 28, 2019

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

ItBreathes posted:

What does some random dude's driver offer over unchecking the "install gfe" box in Nvidia's installer?

It's just a driver with nothing but the core, PhysX, HD Audio, and RTX components (if you download/need them): https://forums.guru3d.com/threads/440-97-clean-version.421390/

Geemer
Nov 4, 2010



ItBreathes posted:

What does some random dude's driver offer over unchecking the "install gfe" box in Nvidia's installer?

The idea is probably not having a gigabyte of your disk space wasted by unpacked, but unused, superfluous driver components, which can grow to several GB if you don't manually go in there to clean out the older installers Nvidia "helpfully" neglects to clean up when you install a newer version.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
There's a telemetry service that restarts itself if you try to kill it from Task Manager :tinfoil: Stripped-down drivers avoid installing it altogether
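If you'd rather not hunt down stripped drivers, you can also just disable the service instead of playing whack-a-mole in Task Manager. A sketch; NvTelemetryContainer as the service name is an assumption, so check sc query first:

```python
# Sketch: disable the telemetry service outright instead of killing it.
# The service name is an assumption -- verify with `sc query` first.
# Needs an elevated prompt.
import subprocess

SERVICE = "NvTelemetryContainer"
subprocess.run(["sc", "stop", SERVICE])
subprocess.run(["sc", "config", SERVICE, "start=", "disabled"])
```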

Cygni
Nov 12, 2005

raring to post

Don't know if we talked about it here yet, but Intel's GPU has reached the "power on" stage, meaning it's been fabbed on real 10nm silicon and they have parts in hand. AT thinks this is on track for a mid/late 2020 launch for the first products, but there are still lots of questions about what markets it will target, how it will target them, and when.

https://www.anandtech.com/show/15032/intel-2019-fab-update-10nm-hvm-7nm-on-track

But intel's shroud concepts are EVOLVING at a terrifying rate

Stickman
Feb 1, 2004

Sadly that's fan-made concept art, but I'll be extremely disappointed if Raja doesn't steal at least one or two of the ideas:

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib
I mean, the steampunk market is certainly underserved in the GPU space.

Stickman
Feb 1, 2004

It's purely functional. How else would you turn the fans? Most gpus just hide the gears in a sad attempt to look high-tech.

ChaseSP
Mar 25, 2013



E: Sorry, posted in the wrong thread.

ChaseSP fucked around with this message at 19:51 on Oct 28, 2019

Cygni
Nov 12, 2005

raring to post

Stickman posted:

Sadly that's fan-made concept art, but I'll be extremely disappointed if Raja doesn't steal at least one or two of the ideas:



lol forbes reported the shroud as fact. i always forget how shitty forbes is now.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Forbes is a blogging platform. I'm not even sure Forbes proper publishes anything anymore.

Stickman
Feb 1, 2004

ChaseSP posted:

E: Sorry, posted in the wrong thread. (Originally asking about upgrade from 590 @ 1440p for $200-350)

E: Moved to PC building thread.

Stickman fucked around with this message at 20:03 on Oct 28, 2019


SwissArmyDruid
Feb 14, 2014

by sebmojo

Nomyth posted:

There's a telemetry service that restarts itself if you try to kill it from Task Manager :tinfoil: Stripped-down drivers avoid installing it altogether

fucking greeeeeat, wasn't this the entire fucking point of GFE? When enough people aren't installing GFE because they don't want to be tracked, Nvidia, IT MEANS THEY DON'T WANT TO BE TRACKED.

I swear to god, I have hated the fucking Nvidia card in this laptop from day 1.
