|
Kazinsal posted:And will use about 30% more power under load. Unless you're using it for compute work for hours on end, I really doubt that's going to drill your power bill.
|
# ? Jan 16, 2015 23:28 |
|
|
|
Hace posted:Unless you're using it for compute work for hours on end, I really doubt that's going to drill your power bill. Yeah, but you have to be in the same room as it. And even with cheap power, sticker savings don't mean poo poo if it crosses the A/C threshold.
|
# ? Jan 16, 2015 23:34 |
Twerk from Home posted:At under $240, the R9-290 is slightly better performance per dollar than a $340+ 970, with the downside of being twice as hot and loud. And fewer features. I'm really loving DSR for older games; Skyrim and Diablo 3 look really great at fake 4K and run buttery smooth on the 970. It really makes the upgrade feel significant for everything, not just the newer games.
|
|
# ? Jan 16, 2015 23:35 |
|
Speaking of DSR, correct me if I'm wrong, but has Nvidia just slapped a new marketing term on what we used to call supersampling?
|
# ? Jan 16, 2015 23:37 |
|
Kazinsal posted:And will use about 30% more power under load. 145W TDP versus 290-300W TDP. It's more than a 30% difference, for less performance.
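For what it's worth, those TDP numbers put the gap at roughly double, not 30%; a quick sanity check (figures as quoted above; real draw under load varies by board and workload):

```python
# TDP figures as quoted above; actual power draw varies by board.
gtx_970_tdp = 145  # watts
r9_290_tdp = 290   # watts (low end of the 290-300W range)

extra = (r9_290_tdp - gtx_970_tdp) / gtx_970_tdp
print(f"{extra:.0%} more power under load")  # 100% more, i.e. double
```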
|
# ? Jan 16, 2015 23:38 |
|
Kazinsal posted:Speaking of DSR, correct me if I'm wrong, but has Nvidia just slapped a new marketing term on what we used to call supersampling? DSR is not supersampling.
|
# ? Jan 16, 2015 23:41 |
|
AVeryLargeRadish posted:And fewer features. I'm really loving DSR for older games; Skyrim and Diablo 3 look really great at fake 4K and run buttery smooth on the 970. It really makes the upgrade feel significant for everything, not just the newer games. The R285 and R290 have a form of that with the latest (Omega) drivers.
|
# ? Jan 16, 2015 23:44 |
|
Hace posted:DSR is not supersampling. Okay, then can you explain it to me? The Nvidia marketing blurb says it "renders a game at a higher, more detailed resolution and intelligently shrinks the result back down to the resolution of your monitor", which sounds a lot to me like supersampling.
|
# ? Jan 16, 2015 23:46 |
|
Kazinsal posted:Okay, then can you explain it to me? The Nvidia marketing blurb says it "renders a game at a higher, more detailed resolution and intelligently shrinks the result back down to the resolution of your monitor", which sounds a lot to me like supersampling.
|
# ? Jan 16, 2015 23:46 |
|
It is, however, bizarrely limited compared to DSR. Amusing that the only card capable of 4K VSR is the flop R9 285 with its 2GB framebuffer.
|
# ? Jan 16, 2015 23:49 |
Kazinsal posted:Speaking of DSR, correct me if I'm wrong, but has Nvidia just slapped a new marketing term on what we used to call supersampling? With DSR you can just select the resolution you want from within the game. DSR also lets you scale down from any resolution to any resolution, while with supersampling you have to use a fixed 2:1 or 4:1 type scale. Basically it does something similar but makes it much more usable and flexible. Of course a native 4K monitor would be better, but I could not justify $340 plus $700-ish. I'll go 4K once I can get a 4K 23" IPS display for $200-$300.
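Mechanically, the "render high, shrink down" step described above is plain resampling: render more samples per output pixel, then filter down. A toy numpy sketch of the classic integer-factor case (DSR's contribution is doing the equivalent at arbitrary, non-integer ratios, reportedly with a 13-tap Gaussian filter rather than the simple box average used here):

```python
import numpy as np

def supersample_downscale(img, factor):
    """Classic ordered-grid supersampling: average each factor x factor
    block of the high-res render down to one output pixel (a box filter).
    Integer factors only; DSR generalizes this to arbitrary ratios."""
    h, w = img.shape[:2]
    assert h % factor == 0 and w % factor == 0
    blocks = img.reshape(h // factor, factor, w // factor, factor, -1)
    return blocks.mean(axis=(1, 3))

# "Fake 4K": render at 3840x2160, display at 1920x1080 (factor 2).
hi_res = np.random.rand(2160, 3840, 3).astype(np.float32)
lo_res = supersample_downscale(hi_res, 2)
print(lo_res.shape)  # (1080, 1920, 3)
```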
|
|
# ? Jan 16, 2015 23:52 |
|
AVeryLargeRadish posted:With DSR you can just select the resolution you want from in the game. DSR also allows you to scale down from any resolution to any resolution while with supersampling you have to use a 2:1 or 4:1 and so on type scale. Basically it does something similar but makes it much more usable and flexible. Of course a 4k native monitor would be better but I could not justify $340 plus $700-ish. I'll go 4k once I can get a 4k 23" IPS display for $200-$300. So technique-wise, it's like supersampling, but implementation-wise it's much more flexible. Almost like supersampling designed for computers with GPUs circa 2015 and not for the 3dfx Voodoo. Makes sense.
|
# ? Jan 16, 2015 23:58 |
|
BurritoJustice posted:It is, however, bizarrely limited compared to DSR. Hah didn't notice that. Bizarre. Happy_Misanthrope posted:The R285 and R290 have a hilariously lovely version of that with the latest (Omega) drivers.
|
# ? Jan 17, 2015 00:11 |
|
What would cause textures to randomly flicker on and off in pretty much every game? This has been happening to my MSI GTX 970 for a while and I've tried updating drivers and updating the card's BIOS. It was fine for a couple of months and then all of a sudden I'd see things flickering; the card isn't overclocked or anything. It doesn't happen often but it's noticeable on occasion. Games are still playable, but I'm not sure what I should try before I go back to Scan and complain the card isn't working.
|
# ? Jan 17, 2015 14:59 |
|
track day bro! posted:What would cause textures to randomly flicker on and off in pretty much every game? This has been happening to my MSI GTX 970 for a while and I've tried updating drivers and updating the card's BIOS. It was fine for a couple of months and then all of a sudden I'd see things flickering; the card isn't overclocked or anything. Texture issues are usually a sign something is wrong with your VRAM. If you're overclocking your VRAM, or it comes overclocked stock, you can try running it at 7GHz and see if it's stable; pretty sure that's the base clock for the GDDR5 used on modern high-end cards. I think there's a memtest86-style program for GPUs, but I'm not sure; never used it myself.
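Such tools do exist (MemtestG80 and MemtestCL run memtest-style patterns as GPU kernels over VRAM). As a toy illustration of one classic pattern, here's "walking ones" on an ordinary CPU-side numpy buffer; the real tools do the same write/read-back/compare over the card's memory:

```python
import numpy as np

def walking_ones_test(n_words):
    """One classic memtest pattern: for each bit position, fill a buffer
    with a word having only that bit set, read it back, and count
    mismatches. GPU memtest tools run this same idea as kernels over
    VRAM; this is a CPU-side toy version."""
    buf = np.empty(n_words, dtype=np.uint32)
    errors = 0
    for bit in range(32):
        pattern = np.uint32(1 << bit)
        buf[:] = pattern                                  # write phase
        errors += int(np.count_nonzero(buf != pattern))   # verify phase
    return errors

print(walking_ones_test(1 << 20))  # 0 on healthy memory
```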
|
# ? Jan 17, 2015 15:42 |
|
GDDR5 VRAM has error correction so it'll slow down performance if it encounters errors due to overclocking too far or insufficient voltage, but unlike GDDR3 it shouldn't result in texture issues unless there was something very wrong like memory controller damage. Might be a faulty card assuming the PSU isn't garbage, but I'd try a clean driver install first. track day bro!, what's the model # of the power supply you're using? Also try completely uninstalling the GPU drivers in safe mode with Display Driver Uninstaller (hell, run it twice even) then install the latest drivers from Nvidia.
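Worth spelling out: GDDR5's "error correction" is really link-level error detection. Each transfer is covered by a CRC, and on a mismatch the memory controller replays it, so a marginal memory overclock costs bandwidth instead of drawing artifacts. A toy detect-and-retry loop (the CRC-8 polynomial 0x07 and the framing here are illustrative, not the actual GDDR5 wire protocol):

```python
def crc8(data, poly=0x07):
    """Bitwise CRC-8 (polynomial x^8 + x^2 + x + 1, init 0)."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ poly) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

def transfer_with_retry(burst, link, max_retries=8):
    """Send a burst across an unreliable link (a function that may
    corrupt bytes); the receiver recomputes the CRC and requests a
    replay on mismatch, trading bandwidth for correctness."""
    sent_crc = crc8(burst)
    for attempt in range(max_retries):
        received = link(burst)
        if crc8(received) == sent_crc:
            return received, attempt  # attempt > 0 means lost bandwidth
    raise IOError("link too unreliable")

data, retries = transfer_with_retry(b"\x12\x34\x56\x78", link=lambda b: b)
print(data == b"\x12\x34\x56\x78", retries)  # True 0
```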
|
# ? Jan 17, 2015 18:09 |
|
cisco privilege posted:GDDR5 VRAM has error correction so it'll slow down performance if it encounters errors due to overclocking too far or insufficient voltage, but unlike GDDR3 it shouldn't result in texture issues unless there was something very wrong like memory controller damage. Might be a faulty card assuming the PSU isn't garbage, but I'd try a clean driver install first. I've got a Seasonic G series 550W which I bought only a few months before. I'll try removing the drivers completely; I did try a slightly older driver with no difference. Strangely, I had a couple of months where the 970 was flawless, and my old 6870 was fine too.
|
# ? Jan 17, 2015 18:31 |
|
cisco privilege posted:GDDR5 VRAM has error correction so it'll slow down performance if it encounters errors due to overclocking too far or insufficient voltage, but unlike GDDR3 it shouldn't result in texture issues unless there was something very wrong like memory controller damage.
|
# ? Jan 17, 2015 18:47 |
|
track day bro! posted:I've got a Seasonic G series 550W which I bought only a few months before. I'll try removing the drivers completely; I did try a slightly older driver with no difference. So I tried the driver uninstaller and did a clean install of the latest driver; it made no difference. I also tried removing the OC on my 4670K, and I'm still getting weird flickery shadows and far-away textures flickering.
|
# ? Jan 17, 2015 20:51 |
|
track day bro! posted:So I tried the driver uninstaller and did a clean install of the latest driver, made no difference. I also tried removing the oc on my 4670k, again still getting weird flickery shadows and far away textures flickering.
|
# ? Jan 17, 2015 21:00 |
|
Switched in an MSI 4G Gaming 970 to replace my EVGA 970 SC. While I'll give the EVGA a tiny amount of credit for having a higher stock overclock, the MSI runs 12-15C cooler at both idle and load. After the BIOS update I'm not sure I ever heard the EVGA card making noise over my Intel stock cooler, but I can't hear the MSI even after upgrading to a much quieter Noctua cooler. Definitely blows away the "6%" improvement I would have gotten with EVGA's step-up program.
|
# ? Jan 17, 2015 23:11 |
|
Not sure if I should ask this here or in the monitor thread, but here goes: I have an Asus GTX660 TI-DC2O-2GD5 hooked up to 3 screens: Dell U2312HM (LED IPS, DisplayPort), Dell 2209WA (IPS, DVI), Samsung HDTV (LCD, HDMI). Now, I've been happy with my setup so far; the main screen (Dell 2312) looks good, was calibrated, and everything was dandy. My HDTV clones the main screen so I can sit on the couch and play videogames; the colours always looked a bit different (a tad more "washed out" than the PC monitor), but I always assumed that was because the monitor is an LED IPS panel and the HDTV a somewhat older LCD one. That is, until I read somewhere that Nvidia cards output limited RGB over HDMI instead of full RGB. I poked around a bit, saw an option for "full RGB" in the Nvidia control panel but it did poo poo, and I ended up doing a registry hack. At this point I noticed a difference on the HDTV, so I calibrated that and now it looks the same as the monitor, which is great. But I'm using AVS HD 709's files to calibrate the monitors (brightness, contrast), and in this video from HD Nation (which uses the files I'm using), it says I should adjust my brightness so the lines from 16 and down are not visible. Thing is, when he cranks up the brightness you can clearly see the black lines below 16, but when I crank up the brightness on my monitors (all 3), I never see anything at or below 16. Now, considering limited RGB is 16-235, could it be I'm not getting full RGB output from my graphics card, even after that "registry tweak" I mentioned earlier? Bear in mind that when I ran it I did see a difference on my HDTV (HDMI, which was what, according to what I read, had issues with limited RGB). Thanks, and sorry for the gigantic post. 
/edit: Right, update: turns out the bug I read about was the graphics card outputting limited RGB to every monitor if it found something connected via HDMI, since it "assumed" it was an HDTV and hence unable to display full RGB. In my case my monitors were OK (full RGB) but the HDTV wasn't. That registry proggie I mentioned fixed this by making the graphics card output full RGB to everything connected to it. Now, since the latest driver (347.09), you can actually toggle this from the Nvidia control panel; I just tried and I can see the difference on the HDTV. Finally, I was also wrong about the RGB range: limited RGB (16-235) means everything below black level 16 is output at the same brightness (16), which "flattens" the shadows. I can now see details in the darker areas of my HDTV that I couldn't see before (but could see on my monitor). I had originally assumed "limited" meant it just didn't output anything below 16, so I was freaking out about that calibration video, but I was mistaken. Again, thanks, and sorry for the long post. Edmond Dantes fucked around with this message at 00:39 on Jan 18, 2015 |
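For anyone else untangling this: the limited/full toggle is just a linear remap of the 8-bit code values (video black is 16, white is 235 per the usual video-level convention; note 16-235, not 16-255). A sketch of both directions, assuming plain 8-bit RGB:

```python
def limited_to_full(v):
    """Expand an 8-bit limited-range code (video black 16, white 235)
    to full range 0-255, clamping anything outside the video range."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

def full_to_limited(v):
    """Compress a full-range 0-255 code into the 16-235 video range."""
    return round(16 + v * 219 / 255)

print(limited_to_full(16), limited_to_full(235))  # 0 255
print(full_to_limited(0), full_to_limited(255))   # 16 235
```

This is also why a source/display range mismatch shows up as either washed-out or crushed blacks: one side maps black to 16 while the other expects 0.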
# ? Jan 17, 2015 23:49 |
|
cisco privilege posted:Maybe try a different cable? Switch the port on the back of the card either to another DVI-D port or HDMI or whatever to see if that makes any difference. I'll give that a go when I get a chance. Also, Warframe crashed to desktop from a driver failure today after the clean driver install.
|
# ? Jan 18, 2015 02:50 |
|
I just switched graphics cards from an ATI 7970 to a GTX 970, and am struggling to regain my 3 monitor setup due to different port configurations. On the 7970 I had a 3 monitor set up with 1 display port cable, and 2 DVI cables. I can't use this setup, because the GTX 970 only has 1 DVI port. One of my monitors can do display port, but the other two can only do DVI. Is my only option to regain third monitor use to buy an active DVI-Displayport Adapter? Am I correct in thinking a passive adapter won't work here?
|
# ? Jan 18, 2015 06:45 |
|
Megasabin posted:I just switched graphics cards from an ATI 7970 to a GTX 970, and am struggling to regain my 3 monitor setup due to different port configurations. On the 7970 I had a 3 monitor set up with 1 display port cable, and 2 DVI cables. I can't use this setup, because the GTX 970 only has 1 DVI port. One of my monitors can do display port, but the other two can only do DVI. Is my only option to regain third monitor use to buy an active DVI-Displayport Adapter? Am I correct in thinking a passive adapter won't work here? Which 970 did you get that only has the one DVI port? Always make sure the video card has the ports you expect before buying it. But yeah, you're correct that you'll need an active adapter. Personally though, I can't stand displayport on my monitors. Windows treats me powering off my monitors as disconnecting them and scrambles all my windows and icons around (not that I have too many icons). DVI is blessedly dumb and doesn't send the same information through the cable.
|
# ? Jan 18, 2015 06:51 |
|
Megasabin posted:I just switched graphics cards from an ATI 7970 to a GTX 970, and am struggling to regain my 3 monitor setup due to different port configurations. On the 7970 I had a 3 monitor set up with 1 display port cable, and 2 DVI cables. I can't use this setup, because the GTX 970 only has 1 DVI port. One of my monitors can do display port, but the other two can only do DVI. Is my only option to regain third monitor use to buy an active DVI-Displayport Adapter? Am I correct in thinking a passive adapter won't work here? Assuming you have the reference 970 port setup, you can just use a cheap passive HDMI-DVI adaptor.
|
# ? Jan 18, 2015 07:24 |
|
BurritoJustice posted:Assuming you have the reference 970 port setup, you can just use a cheap passive HDMI-DVI adaptor. Even neater is a cable with HDMI on one end and DVI-D on the other.
|
# ? Jan 18, 2015 10:55 |
|
Schiavona posted:I just sold a 560ti-448 for $100. I put it on Craigslist for that price assuming it'd get haggled down. Am I missing something or did the guy just really want a video card asap and not care about price? Checking eBay it seemed to be worth half as much? You never know what you're going to get selling a video card on Craigslist. I had a GTX570 1280MB for sale for $50, and a GTX660 2GB for $60. Performance-wise they're roughly the same, but the 660 is newer with more VRAM, so it's the better buy. Nobody was interested in the 660, and I mean nobody. I got dozens of offers on the 570, even one for $20 in which the buyer wanted me to meet him 60 miles away from here. I get that he probably wanted it for SLI, but come on. Eventually I sold the 570 when someone offered me asking price hassle-free (I knocked $10 off when she told me it was for her brother for Christmas and gave her a 90-day warranty), and put the 660 in my backup computer. I kept the ad up and still get offers on the 570, but not the 660.
|
# ? Jan 18, 2015 14:38 |
|
Rumor popped up on a Chinese site: 40% faster than a 980, supposedly.
|
# ? Jan 18, 2015 22:11 |
|
Titan II can't be the only card coming from nVidia, right? There's going to be an awfully wide gap between the 980 and the Titan II. If the 390 is the only card there, it's going to be popular.
|
# ? Jan 19, 2015 05:03 |
|
HalloKitty posted:Even neater is a cable with hdmi one end, dvi-d the other. These are cheap, too. Monoprice.
|
# ? Jan 19, 2015 05:16 |
|
calusari posted:Titan II can't be the only card coming from nVidia right? There's gonna be an awfully wide gap between the 980 and the Titan II. If the 390 is the only card there its going to be popular.
|
# ? Jan 19, 2015 05:21 |
|
Biggest human being Ever posted:Rumor popped up on a chinese site, 40% faster than a 980 supposedly This is the Titan-X card whose PCB has been popping up on tech sites, right? calusari posted:Titan II can't be the only card coming from nVidia right? There's gonna be an awfully wide gap between the 980 and the Titan II. If the 390 is the only card there its going to be popular. Agreed, a chopped-down GM200 needs to fill this gap. I would be really surprised if Titan-X is the chopped-down card and we see a Titan-X Black in the future. We're still looking at a Q1 launch, right? Khagan fucked around with this message at 06:53 on Jan 19, 2015 |
# ? Jan 19, 2015 06:51 |
|
Hey, I recently bought a Gigabyte GTX970 and noticed that it's making this weird grinding noise. It only occurs at specific fan speeds (around 55%-65%) and goes away when I tilt the case (Node 304) by about 45 degrees. It also occurs whenever I boot up the computer, since the GPU fan seems to start at a very high speed and quickly spins down as the PC boots. Currently I'm working around the issue by setting the fan speed in Afterburner to 70% or over when gaming and 40% or under on the desktop, but I'm not sure if this is the right course of action. Should I just get a replacement, or is this normal for modern GPUs?
|
# ? Jan 19, 2015 11:22 |
|
Make sure the fans are not scraping against a wire or cable, otherwise yeah send it in for a replacement.
|
# ? Jan 19, 2015 11:58 |
|
Khagan posted:This is the Titan-X card whose PCB has been popping up on tech sites right? No, that was a benchmark of an unknown Radeon R9-300 series card, as you can see from the screenshot.
|
# ? Jan 19, 2015 12:08 |
|
http://maxict.nl/product/4779325/-pny-gtx-960-2gb-gddr5 Dutch GTX 960 listed for 249€ plus VAT, in a shop that sells 970s for good prices.
|
# ? Jan 19, 2015 13:05 |
|
calusari posted:Titan II can't be the only card coming from nVidia right? There's gonna be an awfully wide gap between the 980 and the Titan II. If the 390 is the only card there its going to be popular. A 980 Ti, I'm sure.
|
# ? Jan 19, 2015 22:24 |
|
Is there a definitive best GTX 970 for a triple monitor setup?
|
# ? Jan 19, 2015 22:49 |
|
|
|
Safetyland posted:Is there a definitive best GTX 970 for a triple monitor setup? Personally I'd say Gigabyte, because you can run all DisplayPort with it. (Full disclosure: I have the G1. It's huge but good.)
|
# ? Jan 19, 2015 22:58 |