|
I was already outbid, meh. The secondary market has basically 0 discount from new.
|
# ¿ Jan 29, 2015 03:12 |
|
My weakness is that I don't want to spend more than $300, and Micro Center charges tax so the $300 open-box doesn't work.
|
# ¿ Jan 29, 2015 03:31 |
|
A MIRACLE posted:
This might be the wrong thread to ask but is there a way to enable "vsync" or some setting like that to prevent screen tearing on windows 7 desktop? It happens in any video program, or when I try to drag a window across the screen. It's really really really really really annoying now that I noticed it's happening. This website causes it really bad too, when I zoom in: http://www.amarasoftware.com/flash-animations/barcode.htm

Uh... Hm. DWM has no option to turn Vsync off in Windows 7, so desktop tearing shouldn't be happening in the first place. Update your graphics driver to the latest and then check the control panel to make sure it's not being overridden in the 3D settings or whatnot.

That said, Flash (which is the tech behind that site's animation) is not synced to the refresh rate at all. Flash's Vsync support is basically non-existent, except for the artist manually setting the framerate of the animation to match the monitor's refresh rate (or an even divisor of it), or setting the animation mode to a GPU-accelerated one.

Factory Factory fucked around with this message at 04:26 on Jan 29, 2015
# ¿ Jan 29, 2015 04:22 |
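A minimal sketch of that "even divisor" trick, if anyone wants to play with it (function name is made up, nothing to do with any real Flash API): an animation framerate stays in step with a fixed-refresh monitor only if the refresh rate is a whole multiple of it, so each frame sits on screen for an exact number of refreshes.

```python
# Sketch: framerates that divide evenly into a monitor's refresh rate.
# Each such rate holds every animation frame for a whole number of
# refreshes, so the animation never drifts against the scanout.
def tear_free_rates(refresh_hz: int) -> list[int]:
    return [fps for fps in range(1, refresh_hz + 1) if refresh_hz % fps == 0]

print(tear_free_rates(60))  # [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]
```

So on a bog-standard 60 Hz panel, a Flash artist would pick 30, 20, 15, etc.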
|
Poor Nvidia this week... the assholes. You know how they totally hypothetically could support FreeSync but aren't? That's because they're re-implementing it and calling it G-Sync for use in gaming laptops. As was extremely obvious yet they always denied, the G-Sync module was never required except to bring adaptive sync to non-mobile hardware. It's just an overpriced little monitor driver board that Nvidia can capture BoM dollars with.
|
# ¿ Jan 31, 2015 05:56 |
|
Airflow would be the only solution other than just sucking it up and not worrying about it. With the fans off at idle, the heat from the lower 970 has nothing to do but be carried by convection straight into the upper card.
|
# ¿ Feb 1, 2015 01:54 |
|
Jeez. Have you checked the electrical quality from your wall sockets? Voltage, brownouts, spikes, that kind of thing.
|
# ¿ Feb 1, 2015 04:12 |
|
Yeah, I agree with AnandTech that the 970 specs being wrong was a stupid mistake rather than a willful lie. But I am peeved as gently caress about G-Sync/FreeSync segmentation.
|
# ¿ Feb 3, 2015 20:01 |
|
If Intel adopts FreeSync, it will instantly be widely supported and outrageously popular, even likely backported to a wide variety of current hardware. AMD might not be able to out-strongarm Nvidia, but Intel can. And not just because they used to do StrongARM.
|
# ¿ Feb 4, 2015 17:44 |
|
SwissArmyDruid posted:
G-Sync is technically superior. As it has a frame buffer on the monitor side, it can continue to refresh the last image indefinitely. With AdaptiveSync, as no frame buffer exists, it HAS to grab a new frame after a while if the framerate gets too low, because the effect of sitting on a frame for too long is a gradual washing out (on the scale of dozens of milliseconds) to white. Left unchecked, this would appear to the human eye as flickering, and that is worse than any kind of screen tearing. Therefore, yes, monitor makers will program their scalers so that if it really absolutely NEEDS it, an AdaptiveSync display WILL still tear before it flickers.

The tablet I'm using right now has a tiny framebuffer DRAM chip hooked to the screen to enable Panel Self-Refresh. DisplayPort pretty much already does everything G-Sync does.
|
# ¿ Feb 4, 2015 21:47 |
|
And once it comes out, they'll get about a year of market parity with Nvidia before Team Green comes out with the next Big Thing That Makes More Money Than Gaming GPUs and changes everything again.
|
# ¿ Feb 5, 2015 01:31 |
|
Sure, if you're into self-interacting big beautiful molecules.
|
# ¿ Feb 5, 2015 15:48 |
|
The "best" way is with a G-Sync/FreeSync screen, but that's more than just V-sync. Second best is V-sync enabled on a 120 Hz or 144 Hz screen, triple buffered. The fast refresh minimizes latency and also gives you additional V-sync'd refresh rates between 60 Hz and 30 Hz (36, 48, and 72 for 144 Hz; 30, 40, and 60 for 120 Hz).

Beyond that, it all comes down to trade-offs. Triple buffering is strictly superior to double buffering except for the extra VRAM use, which shouldn't matter most of the time, and especially not in older backlog games.

Adaptive V-Sync is just a setting that turns off V-Sync if the frame rate drops below the monitor refresh rate. It's mostly only useful on 60 Hz screens in games that are *usually* a solid 60 FPS, where on the occasional dips you'd prefer to see tearing rather than 30 FPS judder.

59 FPS vs. 60 FPS caps on a 60 Hz display... well... With V-Sync at 60 Hz, a 59 FPS cap displays at 30 FPS. If you can do a solid 60, then half your refresh rate will be wasted, and that's just bad. If your display can do 59 Hz and is set to do that, that's cool and all but doesn't change anything - typically that's actually 59.94 Hz, meant to match playback of NTSC/ATSC media from discs or TV tuners, and a 59 FPS cap is still below refresh, so it will give you a locked 29.97 FPS experience.

If you want to get esoteric, and your game can reliably render faster than monitor refresh, and your system has outputs for the Intel IGP (or AMD IGP, I guess), you can set up LucidLogix MVP in i-mode and enable Virtual V-Sync and HyperFormance. Together, these cost a decent chunk of average framerate, but the faster than refresh the GPU can render, the more display lag is reduced (since the final displayed frame is taken from nearer the end of the frame interval than a normal V-sync'd frame, which is taken from the beginning of the frame interval).
|
# ¿ Feb 10, 2015 08:09 |
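For the curious, the "59 FPS cap displays at 30 FPS" arithmetic above falls out of a one-liner (function name and numbers are just illustrative): with double-buffered V-sync, a frame that misses a refresh waits for the next one, so the displayed rate is the refresh rate divided by the whole number of refresh intervals each frame takes.

```python
import math

# Sketch: effective displayed framerate under double-buffered V-sync.
# A frame occupies a whole number of refresh intervals, so missing one
# refresh by even a hair halves the displayed rate.
def vsync_display_fps(refresh_hz: float, render_fps: float) -> float:
    frame_time = 1.0 / render_fps   # seconds to render one frame
    interval = 1.0 / refresh_hz     # seconds per monitor refresh
    return refresh_hz / math.ceil(frame_time / interval)

print(vsync_display_fps(60, 59))   # 30.0 -- just missing 60 halves the rate
print(vsync_display_fps(60, 120))  # 60.0
print(vsync_display_fps(144, 50))  # 48.0 -- why fast panels have more fallback steps
```

The 144 Hz case shows why the fast refresh helps: the fallback steps (72, 48, 36...) are much closer together than 60 → 30.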
|
So you know those mobile GeForce 900M GPUs you used to be able to overclock? Nvidia, you poor bastards.
|
# ¿ Feb 14, 2015 00:25 |
|
Panty Saluter posted:I have Afterburner and Nvidia Inspector installed. Is there a decent guide online for overclocking? I hate to just start goosing it and then burn the poor thing up As long as you don't mod the BIOS for additional voltage, there's nothing you can do to cause the card any significant damage.
|
# ¿ Feb 18, 2015 23:47 |
|
I can! Apple just started an out-of-warranty repair program for all Macs with AMD GPUs from 2011 through 2013 (though most of the problems are with 2011 models).
|
# ¿ Feb 20, 2015 01:36 |
|
Daviclond posted:
I disagree. I'm not looking to give Nvidia an easy pass here, but it's completely conceivable that in a large, highly technical, multi-discipline working environment, two separate groups (marketing guys and engineer guys) hosed up in communicating technical specifications and a fairly obscure, esoteric piece of information was missed. It's not the job of Engineer Guys to go on websites in the months following the product release and check every marketing chart. Checking should have happened before the information was released to marketing (though I don't know how strict their checking/approval QA processes would be - these things get very lax in the absence of audits even if they do have procedures. We can expect they're much stricter now) - but even if it was, "4 GB VRAM, xxx-bit bus width" is still superficially correct, and I can see how it might not be caught as an error.

The suit is almost definitely a settlement grab. The absolute best case, assuming no settlement, is that the suit survives the motion to dismiss (where the standard is "Is there any actionable claim alleged under any half-plausible set of facts whatsoever?") and then the plaintiff lawyers get to rummage through Nvidia's communications during discovery. That would be annoying enough that Nvidia would settle for significant "nuisance money" just to be rid of them. Even that level of success is pretty optimistic, considering the level of intentionality necessary in the claims they're alleging.
|
# ¿ Feb 22, 2015 21:45 |
|
You guys... I am disappointed.

Panty Saluter posted:
More importantly the Vcore voltage is hitting as high as 1.4 despite my telling it not to.

Check your Load Line Calibration setting and turn it off. LLC works by adding voltage in high-stress situations, essentially acting as an additional offset to the Vcore. It's useful, I guess? But it makes your Vcore higher than you think.
|
# ¿ Mar 8, 2015 17:00 |
|
It also has the same gimpy FP64 ratio as the 980. Big Maxwell is not a compute card the way Big Kepler is. But for FP32, it's hugely faster. So... it's a graphics card. Not an HPC card. E: A "deep learning"/neural network compute card.
|
# ¿ Mar 17, 2015 18:00 |
|
Imagination released details on a new PowerVR uarch. It's looking pretty boss. http://blog.imgtec.com/powervr/powervr-vogue-gpu-finally-runs-crysis
|
# ¿ Apr 1, 2015 16:24 |