|
CrazyLoon posted:If anyone has a bit of time to help me solve a new pc build conundrum:

Yes, the card should be able to run in at least a limited mode (e.g. 640x480) right from boot to let you edit BIOS settings and so on.

Double-check that the 8-pin cable is plugged in; the card can show a green light even if it's not seated all the way. If you have another HDMI cable then give that a shot, it's always worth a try. Or try DVI and see if that helps (HDMI and DVI are pretty much the same signal with a different connector).

But I suspect you're right and the card has a messed-up VBIOS from mining. Plug your 650 Ti back in, put the RX 580 in a second slot, go to TechPowerUp's website and try to find the right VBIOS for your card, flash it, and see if that helps. It may take a couple of tries, since there can be multiple VBIOSes for the same card with different VRAM chip timings and so on. If you can get a GPU-Z readout, it might tell you what brand of memory the card has, assuming the card gets far enough for GPU-Z to talk to it.

There's an outside chance it's just hosed entirely though, blown display IOs or whatever.

Paul MaudDib fucked around with this message at 21:02 on Oct 23, 2019 |
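For the flashing step, a hedged sketch of what the command-line flow usually looks like with AMD's amdvbflash tool. The adapter index and ROM filenames below are placeholders, not real values; get the actual image for your exact board variant from TechPowerUp's VBIOS database.

```shell
# Sketch only -- adapter index 0 and the .rom filenames are assumptions.
amdvbflash -i                      # list detected adapters and their indices
amdvbflash -s 0 backup.rom         # ALWAYS save the current VBIOS first
amdvbflash -p 0 rx580_correct.rom  # program the replacement image to adapter 0
```

If the tool refuses to write, it has force options, but at that point it's worth reading a proper flashing guide rather than guessing, since a bad flash on the only working slot can leave you with no display at all.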
# ? Oct 23, 2019 20:58 |
|
|
|
Blorange posted:I'm confused by these numbers, people are claiming that the driver is measurably stuttering for 1 millisecond? Something might be wrong with the display driver's input lag but it feels like they're posting these numbers simply because they're easy to measure.

People who use LatencyMon are almost universally nutjobs, yes. It's like electromagnetic hypersensitivity for gamers. If you're bored and want to stare deep into the abyss, there's a both hilarious and somewhat unnerving thread on overclock.net where these people mess around with basically any setting they can get their hands on in their quest to make a meaningless number go down.

There are some pretty spicy claims in there; the thread has at various points attributed input lag and "floaty mouse" to things like allowing the CPU VRM to turn off some of its power stages at low load, the Windows keyboard layout setting (English Philippines is lowest latency, apparently), DisplayPort cables (HDMI is clearly lower latency), PWM control of chassis fans, the fan cables themselves, the power grid, and many other things. It's some bizarre shit.

TheFluff fucked around with this message at 22:42 on Oct 23, 2019 |
# ? Oct 23, 2019 22:29 |
|
Remember when r/AMD woo-woo people started taking pictures of their monitors to "prove" that AMD gives you a sharper image and better detail?

edit: https://www.reddit.com/r/Amd/comments/9z7ezh/amd_vs_nvidia_image_quality_does_amd_give_out_a/ https://www.reddit.com/r/Amd/comments/c7yxrb/question_is_nvidia_cheating_on_image_quality_can/ (oh Coreteks, you never fail to disappoint me...)

"Yeah I can definitely see a difference in this texture compression, ngreedia cheating again!" [ed note: texture compression is lossless, and AMD has used it too since GCN 1.2...]

There is this weird AMD media-sphere with people like Adored, Coreteks, Mindblanktech, RedGamingTech, Moore's Law Is Dead, etc., where a little vague tech knowledge meets ayyymd logic and the magic happens.

vvv GPU, not CPU, but then there are the woo-woos who think that AMD gives you a "smoother experience" that somehow can't be measured in 1% or 0.1% frametimes (or even just FCAT timings)... but judging by the woo-woos on the last page it looks like it's Ryzen with the interrupt performance problems vvv

Fully expect to see "NVIDIA microstutter problems!" everywhere tomorrow though, full blast from the AMD mediasphere.

Paul MaudDib fucked around with this message at 23:18 on Oct 23, 2019 |
# ? Oct 23, 2019 22:37 |
|
Paul MaudDib posted:remember when r/AMD woo-woo people started taking pictures of their monitors to "prove" that AMD gives you a sharper image and better detail We're going to need a 'Ridicule CPUophiles' thread.
|
# ? Oct 23, 2019 22:52 |
|
I gotta say that the cool thing about being locked into the vault of G-Sync monitor hardware is that you stop caring about the GPU wars, because you're gonna buy Nvidia no matter what it does or doesn't do vis-a-vis AMD. Nvidia can only compete against itself for my dollar, and I gotta say, their selves from a few years ago are kicking their present selves' asses at the moment, in the name of "innovation".
|
# ? Oct 24, 2019 03:48 |
|
If I were trying to make that sound good I would go with simple instead of cool
|
# ? Oct 24, 2019 04:44 |
|
Yes, but there was definitely a time when I would have thrown time and effort into changing GPUs over stuttergate, or turned into one of these message board jihadists. But it is just another in a series of very insignificant complaints I have about Nvidia. I've already had to throw away my Hackintosh installation and settle for a buggier Linux (Nvidia is currently not great if you care about non-Windows desktops), so what's one more microcomplaint on the pile when I am, generally speaking, still satisfied? It reins in a terrible habit I have of discarding good hardware going after perfection.
|
# ? Oct 24, 2019 18:29 |
|
Hey I can appreciate the train of thought that "I am a neurotic person, and I like being locked into something because I don't worry about it as much". You know what works for you, nothin' wrong with that.
|
# ? Oct 24, 2019 22:40 |
|
I'm sorry. It actually was cool, in the end.
|
# ? Oct 25, 2019 00:14 |
|
Nvidia has some RTX on/off comparisons for the new Call of Duty. It's not an earth shattering improvement but at least it's relatively cheap compared to other RTX implementations.
|
# ? Oct 25, 2019 23:25 |
|
repiv posted:Nvidia has some RTX on/off comparisons for the new Call of Duty. It's not an earth shattering improvement but at least it's relatively cheap compared to other RTX implementations.
|
# ? Oct 26, 2019 00:03 |
|
Yeah, it's kind of like the medium setting on Tomb Raider, which is a waste of time. Ultra on Tomb Raider looks cool because all light sources give ray-traced shadows, but it is of course v spendy
|
# ? Oct 26, 2019 00:33 |
|
Statutory Ape posted:He didn't pick the worst valve product to speculate GPU purchases on from that era, at least https://www.youtube.com/watch?v=8KPIPIleULo
|
# ? Oct 26, 2019 02:11 |
|
This is the Ultra vs Low particle lighting slider and I can't believe how little of a difference it makes.
|
# ? Oct 26, 2019 07:11 |
|
VelociBacon posted:This is the Ultra vs Low particle lighting slider and I can't believe how little of a difference it makes. Honestly, it looks like the sort of effect that might be more obvious in motion - the explosion is brighter and seems to have more definition and contrast, so I could imagine it "popping" more in action. Then again, maybe not!
|
# ? Oct 26, 2019 07:18 |
|
That latency post on Reddit felt like watching a group of people discuss the shapes of the clouds on the bluest day
|
# ? Oct 26, 2019 09:49 |
|
Statutory Ape posted:That latency post on Reddit felt like watching a group of people discuss the shapes of the clouds on the bluest day
|
# ? Oct 26, 2019 12:44 |
|
Statutory Ape posted:That latency post on Reddit felt like watching a group of people discuss the shapes of the clouds on the bluest day

Yeah, but it gives the AMD true believers ammo, so it's hard to say if it's bad or not.
|
# ? Oct 27, 2019 00:06 |
|
Stickman posted:That'll really depend heavily on the price/performance improvement in the next generation and how much RTX becomes a "must-have" feature, both of which are still up in the air. They'd need to dedicate relatively more silicon to raytracing to improve the relative performance hit, and by all accounts 7nm is already pretty expensive compared to previous node shrinks.

I have some buddies that skipped the 2xxx gen because they got a 1080 Ti and did not want to pay $1200 for an additional 35% avg fps and RT. These guys will definitely buy a 3080 Ti, which should be 30-40% faster than the 2080 Ti with RTX on and off, and 60-80% faster than a 1080 Ti. But how will Nvidia set the price tag? Every Ti gen has been way more expensive than the one before it since the 780 Ti, iirc.

An incentive and a problem at the same time might be that the 3080 Ti will be the first GPU to hold 60 fps at 4K Ultra, but that is a niche. The transition from full HD to 1440p is still in early progress and might need some more years, because GPUs are expensive. The 2080 Ti has so much power at 1440p (80-200 avg fps on Ultra depending on the game and engine) that a 3080 Ti would not be a useful upgrade, at least with RTX off, so 2080 Ti users might skip the 3xxx gen, and that keeps prices high for the few used 2080 Tis that hit the second-hand market.

I still hope SLI/nvlink gets a revival.
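The uplift numbers in that post compound multiplicatively, which is easy to get wrong when chaining generations. A quick sanity check in Python, using only the percentages quoted in the post (assumptions, not benchmarks):

```python
# Sanity check of the generational uplift figures quoted above.
# Assumptions taken from the post, not from benchmarks:
#   2080 Ti ~ +35% average fps over a 1080 Ti
#   3080 Ti ~ +30-40% over a 2080 Ti
base = 1.00                  # 1080 Ti as the baseline
perf_2080ti = base * 1.35    # 2080 Ti relative performance
perf_3080ti_low = perf_2080ti * 1.30   # pessimistic 3080 Ti estimate
perf_3080ti_high = perf_2080ti * 1.40  # optimistic 3080 Ti estimate

# Percentage uplifts compound multiplicatively, not additively:
print(f"3080 Ti vs 1080 Ti: +{(perf_3080ti_low - base) * 100:.0f}% "
      f"to +{(perf_3080ti_high - base) * 100:.0f}%")
```

Chaining the quoted figures actually gives roughly +76% to +89% over a 1080 Ti, a bit above the post's 60-80%, which is the usual trap of adding percentages instead of multiplying them.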
|
# ? Oct 27, 2019 21:24 |
|
Mr.PayDay posted:I still hope SLI/nvlink gets a revival. I don't. I'd far prefer if nVidia put way more R&D money into single-package efficiency. That doesn't rule out the possibility of them doing a Ryzen-like architecture where there are multiple dies per package (if Intel can put out a CPU that's the size of a playing card, no reason why nVidia couldn't put out a Ti/Titan card of the same size), and them naming *their* version of the ~Infinity Fabric~ something stupid like ~Quantum NVLink~, but until hooking two cards up in parallel yields 100%+ in performance with no driver dickery, SLI/NVLink should stay 'dead.' Maybe as we get up into PCIe 5 and 6 as the bandwidth grows and the latencies shrink, we could finally see a CrossFire-like interface that could provide something approaching that.
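To make the scaling complaint concrete, here is a toy model (illustrative numbers, not benchmark data) of why sub-100% multi-GPU scaling is a bad deal:

```python
# Toy model of multi-GPU scaling (illustrative numbers, not benchmarks).
# Each card beyond the first contributes only a fraction `efficiency`
# of a full card's performance, due to sync and driver overhead.
def effective_gpus(n: int, efficiency: float) -> float:
    """Effective single-GPU-equivalents delivered by n cards."""
    return 1.0 + (n - 1) * efficiency

# The bar the post sets for SLI/NVLink being worth reviving:
perfect = effective_gpus(2, 1.0)   # 100% scaling: 2 cards act like 2 cards
# A plausible figure for a well-supported SLI title (assumed, not measured):
typical = effective_gpus(2, 0.7)   # 2x the cost and heat for 1.7x the frames
print(perfect, typical)
```

The 0.7 efficiency figure is an assumption for illustration; the point is that anything below 1.0 means the second card costs proportionally more than it delivers, before you even count driver headaches.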
|
# ? Oct 27, 2019 21:32 |
|
I bought myself a Corsair H55 and tomorrow I'll be heading to Micro Center to buy a Kraken G12 to cool my 2070 Super. Doing some reading tonight, I think I'll need heatsinks for the VRM/VRAM? If so, would these Raspberry Pi ones work? Also, my card is an EVGA RTX 2070 Super Black Gaming, the cheapest non-blower one they make. Does anyone know if it's the reference PCB design? Apparently reference 2070 Supers use the same mounting as 2080s, which are listed as compatible with the G12. Why did I jump down this stupid rabbit hole.
|
# ? Oct 28, 2019 08:08 |
|
Endymion FRS MK1 posted:I bought myself a Corsair H55 and tomorrow I'll be heading to Micro Center to buy a Kraken G12 to cool my 2070 Super. Doing some reading tonight I think I'll need heatsinks for the VRM/VRAM? if so would these Raspberry Pi ones work? Also, my card is an EVGA RTX 2070 Super Black Gaming, the cheapest non-blower one they make. Does anyone know if its the reference PCB design? Apparently reference 2070 Supers use the same mounting as 2080s, which are listed as compatible with the G12.

You don't strictly need the heatsinks, as the G12 has a fan for that area, but they certainly won't hurt. The ones you linked, or basically any others with self-adhesive thermal tape, would work fine. VRAM doesn't really need much cooling; frankly, if you're gonna slap heatsinks on anything, put them on the VRMs first. And you're doing this because it's awesome to have a whisper-quiet GPU with a cooling solution you will very likely be able to move over to your next card, as well!
|
# ? Oct 28, 2019 08:17 |
|
how are yall fixing nvidia's GFE error 0x0003? i nuked the program itself and it's still acting fucky after reinstall
|
# ? Oct 28, 2019 10:50 |
|
You could try completely nuking anything Nvidia-related with DDU (disable your network connection before doing this so Windows doesn't automatically reinstall a driver).
|
# ? Oct 28, 2019 10:56 |
|
You know what, I suspected that was the answer coming, and I've already just decided to nuke the entire boot drive, which is now in progress! I assume it won't even fix it and I'll be back in an hour with 0x0003.2

E: lol, forgot this tidbit I read ITT the other day: you don't have to install GFE anymore, it's just an option at install, blesssss

e2: oh ok so, all those errors are fixed, amazing. However, the only real reason I'm trying to get this shit fixed is so I can play the new Call of Duty on this PC. I guess people are having issues getting the GTX 1080 Ti to work (probably other cards too) when the iGPU is force-enabled in BIOS instead of set to auto. I had/have force-enabled that option because my tertiary screen is plugged into the iGPU output. Anybody know of a workaround for this, or has Nvidia unfucked 3 monitors (for now) again?

E: lol, I like the way my Nvidia stuff performs, but forcing me to go into the BIOS (and lose my third monitor entirely, as currently set up) to play Call of Duty, because my top-end consumer card is unable to appropriately use even a fraction of its outputs, is going to get me to switch ASAFP

Worf fucked around with this message at 13:00 on Oct 28, 2019 |
# ? Oct 28, 2019 10:58 |
|
Can you manually set the card's power state, like in the early days of shitty dual monitor support?
|
# ? Oct 28, 2019 14:17 |
|
If you go to the Guru3D forums, there's a guy who provides "clean" versions of all new nVidia drivers for GTX and RTX cards, free of GFE and everything superfluous. There's also a utility out there now called NVSlimmer that enables you to do your own trimming and customizing of a driver package.
|
# ? Oct 28, 2019 15:12 |
|
What does some random dude's driver offer over unchecking the "install GFE" box in Nvidia's installer?
Fantastic Foreskin fucked around with this message at 15:36 on Oct 28, 2019 |
# ? Oct 28, 2019 15:32 |
|
ItBreathes posted:What does some random dudes driver offer over unchecking the "install gfe" box in Nvidia's installer? It's just a driver with nothing but the core, PhysX, HD Audio, and RTX components (if you download/need them): https://forums.guru3d.com/threads/440-97-clean-version.421390/
|
# ? Oct 28, 2019 15:41 |
|
ItBreathes posted:What does some random dudes driver offer over unchecking the "install gfe" box in Nvidia's installer?

The idea is probably not having a gigabyte of your disk space wasted by unpacked but unused superfluous driver components, which can grow to several GB if you don't manually go in there and clean out the older installers Nvidia "helpfully" neglects to remove when you install a newer version.
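If you want to see what that cleanup is worth on your own machine, a small sketch. The `Installer2` path below is the usual leftover-installer location on Windows, but treat it as an assumption and verify before deleting anything:

```python
import os

def dir_size_mb(path: str) -> float:
    """Total size of all regular files under `path`, in MiB."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.isfile(fp):  # skip dangling symlinks
                total += os.path.getsize(fp)
    return total / (1024 * 1024)

# Usual location of leftover Nvidia installer packages on Windows
# (an assumption -- check your own machine before deleting anything):
leftovers = r"C:\Program Files\NVIDIA Corporation\Installer2"
if os.path.isdir(leftovers):
    print(f"{dir_size_mb(leftovers):.0f} MiB of old installer packages")
```

The function itself is generic, so you can point it at any directory you suspect of hoarding old driver payloads before deciding whether the cleanup is worth the trouble.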
|
# ? Oct 28, 2019 16:16 |
|
There's a telemetry service that restarts itself if you try to kill it from Task Manager. Stripped-down drivers avoid installing it altogether.
|
# ? Oct 28, 2019 17:52 |
|
Don't know if we talked about it here yet, but Intel's GPU has reached the "power on" stage, meaning it's been fabbed on real 10nm silicon and they have parts in hand. AT thinks this is on track for a mid/late 2020 launch for first products, but there are still lots of questions about what markets it will target, how it will target them, and when. https://www.anandtech.com/show/15032/intel-2019-fab-update-10nm-hvm-7nm-on-track But Intel's shroud concepts are EVOLVING at a terrifying rate
|
# ? Oct 28, 2019 19:20 |
|
Sadly that's fan-made concept art, but I'll be extremely disappointed if Raja doesn't steal at least one or two of the ideas:
|
# ? Oct 28, 2019 19:27 |
|
I mean, the steampunk market is certainly underserved in the GPU space.
|
# ? Oct 28, 2019 19:33 |
|
It's purely functional. How else would you turn the fans? Most gpus just hide the gears in a sad attempt to look high-tech.
|
# ? Oct 28, 2019 19:45 |
|
E: Sorry posted in the wrong thread.
ChaseSP fucked around with this message at 19:51 on Oct 28, 2019 |
# ? Oct 28, 2019 19:47 |
|
Stickman posted:Sadly that's fan-made concept art, but I'll be extremely disappointed if Raja doesn't steal at least one or two of the ideas:

lol, Forbes reported the shroud as fact. i always forget how shitty Forbes is now.
|
# ? Oct 28, 2019 19:52 |
|
Forbes is a blogging platform. I'm not even sure Forbes proper publishes anything anymore.
|
# ? Oct 28, 2019 19:54 |
|
ChaseSP posted:E: Sorry posted in the wrong thread.

(Originally asking about an upgrade from a 590 @ 1440p for $200-350)

E: Moved to PC building thread.

Stickman fucked around with this message at 20:03 on Oct 28, 2019 |
# ? Oct 28, 2019 19:57 |
|
|
|
Nomyth posted:There's a telemetry service that restarts itself if you try to kill it from Task Manager. Stripped-down drivers avoid installing it altogether.

Fucking greeeeeat, wasn't this the entire point of GFE? When enough people aren't installing GFE because they don't want to be tracked, Nvidia, IT MEANS THEY DON'T WANT TO BE TRACKED. I swear to god, I have hated the fucking Nvidia card in this laptop from day 1.
|
# ? Oct 28, 2019 22:37 |