|
Knifegrab posted:So is the 1070 still a good card to get if you are looking for a relatively high end card but are budgeting more on price/performance? Not unless you find one at MSRP and/or are a buttcoiner.
|
# ? Aug 21, 2017 03:37 |
|
Palladium posted:Fanfic truly comes from the most unexpected of places... Never underestimate the autist.
|
# ? Aug 21, 2017 04:00 |
|
This is a general question, but is there a reason that, while desktop CPUs are routinely running at 4 GHz and cell phone SOCs are running at 2-3 GHz, GPUs are barely hitting 2 GHz?
|
# ? Aug 21, 2017 04:13 |
|
sincx posted:This is a general question, but is there a reason that, while desktop CPUs are routinely running at 4 GHz and cell phone SOCs are running at 2-3 GHz, GPUs are barely hitting 2 GHz? I'm guessing it's the 2000 cores.
|
# ? Aug 21, 2017 04:21 |
|
VulgarandStupid posted:Jay was also 500 pounds at one point, I kind of doubt Linus has ever broken 150. Jesus; good for him.
|
# ? Aug 21, 2017 04:53 |
|
Avalanche posted:
I stay away from all the screeching harpy build and unboxing bs personally, even with a new pc to build.
|
# ? Aug 21, 2017 04:54 |
|
Knifegrab posted:So is the 1070 still a good card to get if you are looking for a relatively high end card but are budgeting more on price/performance? I bought one three weeks ago. I could have had a 1080 for $26 more, but it would have been a baseline Gigabyte, and instead I got an EVGA with the "we swear this one won't burn down" system. This is because I always, always always pay for quality air coolers that have premium quiet fans (my last card was the red and black Twin Frozr IV cooler on a wimpy little RX470). If your priorities are different, like you own a 4K monitor now and not in three years like me, then get a 1080. Or if that's all too high end for you, get the Gigabyte 3GB 1060 for $200 after Masterpass promo at Newegg. It's fine for now, but requires you to understand that "ultra-extreme settings" are a glorified screenshot mode that requires intense scrutiny to tell apart from High. Craptacular! fucked around with this message at 05:26 on Aug 21, 2017 |
# ? Aug 21, 2017 05:23 |
|
Craptacular! posted:I bought one three weeks ago. I could have had a 1080 for $26 more, but it would have been a baseline Gigabyte, and instead I got an EVGA with the "we swear this one won't burn down" system. This is because I always, always always pay for quality air coolers that have premium quiet fans (my last card was the red and black Twin Frozr IV cooler on a wimpy little RX470). If your priorities are different, like you own a 4K monitor now and not in three years like me, then get a 1080. If you really bought a 1070 for only $26 less than a 1080, then I honestly think that was a bad value decision. I own the crappiest 1080 dual fan card in existence (an Inno3D Twinx2) and while it gets loud, it's perfectly capable of handling a stock 1080. Then I undervolted + overclocked, and because Pascal is so responsive to undervolting, it's even better now. Even then, you could squirrel away 10 bucks a week and get a decent cooler replacement, like an AIO, in less than 3 months.
|
# ? Aug 21, 2017 05:59 |
|
Gigabyte fans are really good also. I'd have gone for the 1080 no question.
|
# ? Aug 21, 2017 06:19 |
|
I actually didn't mean to spend that much. I was going to get a 1070 Black for $390 but hosed it up and missed out; eBay had a 10% off coupon expiring on August 1st, and I didn't want to spend another month empty-handed on integrated graphics when Overwatch launched summer games again. Anyway, I live in the desert with a computer that's about four feet from my head and an inefficient air conditioner that costs hundreds a month running constantly, and our power bills are so high that we're actually conscious of wall warts being unnecessarily left plugged in, so I try to run stuff stock and cool and quiet. I'm playing 1080p/60 and just want something to get me through the next few years. I have no idea by the time the 1070 is useless if I'll even like games, or can afford to play them. All but $100 of this card was paid for with Crypto Lottery Cash flipping my last one. Making about $375 on a RX470 at the very moment things peaked was great, but I got tired of waiting for the bubble to burst and bought back in at nearly as ridiculous prices. The way I looked at it was like, I could buy a 1060 6GB, and it would be enough for me. But since I resell everything eventually, I should get the card that someone using a 1440p monitor might look at, since in a few years there's going to be even more people who have those. Edit: And if the bubble does crash in the next 2.3333 months and prices tumble down, I spent a lot of money already toward a step-up. Craptacular! fucked around with this message at 07:24 on Aug 21, 2017 |
# ? Aug 21, 2017 07:20 |
|
I have no right at all to judge anyone else's GPU choice, I'm running a 1080 @ 1080p 60hz.
|
# ? Aug 21, 2017 07:26 |
|
Sure enough, I felt crazy for about 24 hours, which was how I wound up cancelling the 1070 Black that could have cost $30 less. I am just a noise obsessive (my hearing is always so sensitive, and was so good as a kid that it's often a curse), and while the EVGA fans aren't as quiet as MSI's best, the fact that only one spins up instead of both on the ICX cooler when playing Overwatch for a few hours is pretty sweet. And I had a Gigabyte motherboard in my HTPC that should have been recalled (but wasn't) for its widespread inability to keep track of time, which was a real bad first impression. I'm ordering their low-end 1060 right now for a friend though, so I guess I'll find out how stupid I was. I have until Halloween to hope the libertarians lose their shirts and prices return to sanity.
|
# ? Aug 21, 2017 07:39 |
|
Craptacular! posted:And I had a Gigabyte motherboard in my HTPC that should have been recalled (but wasn't) for its widespread inability to keep track of time, which was a real bad first impression. Yeah, just to point out, but you shouldn't trust EVGA then since their ACX 3.0 could catch fire (as you previously referenced). Every company has its issues, but thankfully nVidia's vendor QC requirements are pretty good (unlike AMD, who will let vendors churn out whatever).
|
# ? Aug 21, 2017 07:44 |
|
GRINDCORE MEGGIDO posted:I have no right at all to judge anyone else's GPU choice, I'm running a 1080 @ 1080p 60hz. Due to a 7970HD blowing up and taking my motherboard and ram with it, I now transport my 1080-equipped computer from next to the TV in the living room to the bedroom where the 7970HD used to live whenever I want to play... ...Battlefield 1... At 1680x1050 resolution, 60hz. What can I say, I'm a masochist.
|
# ? Aug 21, 2017 07:46 |
|
Judging by their low-end products isn't a good idea either. An entry-level luxury brand car is going to be full of shiny plastic compared to just one step up in the product line. It's not cheap or low-end because it's good.
|
# ? Aug 21, 2017 07:48 |
|
Shrimp or Shrimps posted:Due to a 7970HD blowing up and taking my motherboard and ram with it, I now transport my 1080-equipped computer from next to the TV in the living room to the bedroom where the 7970HD used to live whenever I want to play... We both really need new displays.
|
# ? Aug 21, 2017 07:51 |
|
Fauxtool posted:judging by their low end products isnt a good idea either. An entry level luxury brand car is going be full of shiny plastic compared to just one step up in the product line. Its not cheap or low-end because its good Well that was actually the thing. The only 1080 I could afford is a low end one and had a lot of reviews saying it was noisy. If I could swing the G1, Xtreme, Aorus etc for that money I would have gone for it.
|
# ? Aug 21, 2017 07:53 |
|
SourKraut posted:Yeah, just to point out, but you shouldn't trust EVGA then since their ACX 3.0 could catch fire (as you previously referenced). Every company has its issues, but thankfully nVidia's vendor QC requirements are pretty good (unlike AMD, who will let vendors churn out whatever). EVGA got a ton of poo poo for that, way more than what was actually warranted. link That being said, I still would have gotten the 1080 for that difference. Gigabyte isn't a disreputable manufacturer by any means.
|
# ? Aug 21, 2017 08:20 |
SourKraut posted:Yeah, just to point out, but you shouldn't trust EVGA then since their ACX 3.0 could catch fire (as you previously referenced). Every company has its issues, but thankfully nVidia's vendor QC requirements are pretty good (unlike AMD, who will let vendors churn out whatever). That is no reason to trust or distrust EVGA; the VRM ran hot but well within spec. The failures were due to faulty parts, faulty parts that they had no ability to test and that none of the AIBs have the ability to test. They all depend on quality control from the parts manufacturer, but sometimes a bad batch slips through and there is nothing they can do about it other than replace the failed cards. They only added the thermal pads and updated the cooler design because they needed to do something to stop people panicking; it was a PR move.
|
|
# ? Aug 21, 2017 08:25 |
Mooktastical posted:EVGA got a ton of poo poo for that, way more than what was actually warranted. link IDK, the last and only Gigabyte card I got was their mini 970, and that thing crashed on stock clocks, which is apparently a known issue for the card.
|
|
# ? Aug 21, 2017 08:52 |
|
Again, I've watched a lot of reviews of great Gigabyte cards. But almost everybody and their brother got the G1, and my option was the base-level Windforce model that has next to no YouTube coverage but middling reviews and a lot of "it's noisy", and even a guy on Amazon had to shove the fan sensor down so the fan stopped hitting it. In addition, almost everybody loves EVGA for reasons and I had never bought from them before, while even the Gigabyte subreddit has called out their lackluster customer support if something goes wrong. I have an Ivy Bridge processor, 8GB memory, and a 1080p display, and this was already over my self-imposed budget. I get the feeling any benefit would be bottlenecked by my system, and I deliberately resisted 1440p and don't bother to seek out any demonstrations of Gsync/Freesync. My choices were a $415 1070 ACX with no backplate, a $425 1070 ICX, a $450 1080 from the "Not Good Enough For G1" bin, or to keep living on an Intel HD4000 iGPU. If you disagree with my judgment and would have chosen differently, that's cool. I approached this like a guy who has won a bunch of money in a casino, but not enough to walk away, so he just blows it like free money. (After all, the best use of money would be "stop buying computer gamer poo poo".)
|
# ? Aug 21, 2017 09:15 |
|
Has anybody done testing on how Pascal cards scale in performance per watt with undervolting? Overclocking feels like it yields 3% performance for 50% higher power consumption, so I'm undervolting my 1070 to 0.875 V at roughly 1800 MHz, but I suspect that peak efficiency is still far lower because the Max-Q parts boost to ~1.4 GHz. Nvidia showed these slides at the Max-Q launch but they're useless without numbers.
|
# ? Aug 21, 2017 09:42 |
|
https://videocardz.com/newz/amd-radeon-vega-8-and-vega-10-mobile-spotted Dear loving god, the GPU side has infected the CPU side with its stupidity.
|
# ? Aug 21, 2017 10:04 |
|
It's looking like my $599 Amazon Gigabyte Vega 64s will be arriving today! Flipped one to a friend for $850 and the other to a miner for $1100. Friend is a nice guy and said I could open his and sperg out while he waits for the rest of his build to ship. I really just want to experience the 4600 RPM fan noise. Gonna be a long day waiting for the box.
|
# ? Aug 21, 2017 11:05 |
|
Reminder: 100% of this pricing idiocy is the result of AMD's CTO promising the investors a 50% margin on Vega. Literally all they have to do to make this shitshow stop is drop their margins a bit and sell at their stated MSRP. That ship sailed long ago when it became obvious that Vega wasn't going to perform well for gaming. It's a Titan Xp-sized chip with HBM2 on it that performs like a 1080. lovely margins are a foregone conclusion. The only other option would be to withdraw it from the gaming market entirely. To be honest, if margins are this lovely I'm kinda surprised they launched RX Vega at all. I certainly can't see them selling it priced <$300 going against the 1160 in a year, and I just don't see how they can sustain production for very long if the cost problem is this acute for them even at $500. Long term I think prices on this have to go down anyway, so I'm not sure why AMD is dragging their feet so hard. Without the mining bubble they would hardly be able to move them at MSRP, and even with the bubble I don't think they can move them at the bundle prices. I think prices may drift downwards quickly anyway; once the FreeSync true-believers are sated there's not much reason to choose Vega over an NVIDIA product that is faster and cheaper and uses less power.
|
# ? Aug 21, 2017 11:10 |
|
AVeryLargeRadish posted:They skew things pretty heavily in their case reviews, about all they are good for is pretty footage. It's stuff like saying that a case runs "a little hot" when the difference between it and a well cooled case is 7-8C, a huge difference. It's very annoying because it's so subtle and it's not like they are lying outright, just phrasing stuff in a very deceptive way. Really the only ones I trust are Gamers Nexus and Jay, because both are pretty blunt and open about how they do stuff. If you want some good case reviews, watch anything by Leo with Kitguru. The b-roll isn't as shiny as Hardware Canucks, but he'll pick apart any flaws that case has, then he'll show you the full closed-loop watercooling solution he put together in there.
|
# ? Aug 21, 2017 11:19 |
|
Paul MaudDib posted:Reminder: 100% of this pricing idiocy is the result of AMD's CTO promising the investors a 50% margin on Vega. Literally all they have to do to make this shitshow stop is drop their margins a bit and sell at their stated MSRP. The more you look into it, the worse it gets. Apparently the monitor in the bundles that they said is $900 or whatever was selling for $700 or less just a few months ago. I would have been livid had I actually snagged one of those Liquid bundles now. Has something changed that prevents companies from getting crafty with SKUs to hide poo poo like this better? I remember that being one of the big things with Black Friday back in 2010-2012, moreso on LCD TVs than monitors. They'd do special barebones SKUs for Black Friday and used multiple types of less desirable panels alongside the one everyone actually wanted, so if you cared you had to return it 2-3-4 times and get lucky. Otherwise you were just paying for a model number with a substandard panel, and it was nearly impossible to cross-shop. That said, were they promising 50% margins at launch, on the reference cards? I could see them approaching that figure once they start doing custom designs. For whatever reason r/AMD only ever found German articles, apparently stating that AMD will offer a wide range of bins for Vega, and also that there are height differences between them, complicating potential cooler designs? Seems like having to account for small height variations would be counterproductive to keeping margins low.
|
# ? Aug 21, 2017 11:45 |
|
It looks like a bigger Polaris chip would have delivered the same performance at a much lower price point. At the very least they could've pumped those out to give them to the miners, IIRC Polaris margins are incredibly good. Vega as a consumer product should have been outright cancelled many months ago but I guess the project was long past the point of no return. Oh well, at least it's fun to watch because I don't really care whether the next 1180 is $599 or $899 as long as $NVDA performs the way it does.
|
# ? Aug 21, 2017 11:54 |
|
If multicore Navi gets beaten by a monolithic NVidia chip on any metric, RTG might as well just give it up. I wonder how much of Navi is based on Vega and how far along it is.
|
# ? Aug 21, 2017 11:59 |
|
eames posted:It looks like a bigger Polaris chip would have delivered the same performance at a much lower price point. At the very least they could've pumped those out to give them to the miners, IIRC Polaris margins are incredibly good. I have no idea why big Polaris isn't out, filling the gap. XboneX is getting it, so why not the desktop market? The one in the X is even a high yield version with 256 of the 2816 shaders disabled... there are so many good SKU opportunities there, but no, here we are with expensive as hell Vega, chowing down on all the power. HalloKitty fucked around with this message at 12:27 on Aug 21, 2017 |
# ? Aug 21, 2017 12:24 |
|
sincx posted:This is a general question, but is there a reason that, while desktop CPUs are routinely running at 4 GHz and cell phone SOCs are running at 2-3 GHz, GPUs are barely hitting 2 GHz? Also, CPUs drop to similar frequencies while running wide SIMD instruction sets like AVX; they just don't report it that way. Khorne fucked around with this message at 13:13 on Aug 21, 2017 |
# ? Aug 21, 2017 13:02 |
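To put a rough number on the clock-speed question: dynamic switching power scales roughly as P ≈ C·V²·f, and voltage has to rise with frequency, so a chip with a couple thousand shader cores pays a far bigger power bill per extra MHz than an 8-core CPU does. A toy Python model of that (the core counts, capacitance, and linear V/f relation are all illustrative assumptions, not real silicon data):

```python
# Toy model of dynamic switching power: P ~ cores * C * V^2 * f,
# with an assumed, simplified linear relation V = k * f.
# Every constant here is an illustrative placeholder, not measured GPU data.

def dynamic_power(cores, freq_ghz, volts_per_ghz=0.5, cap_per_core=1.0):
    """Relative dynamic power for `cores` units running at `freq_ghz`."""
    voltage = volts_per_ghz * freq_ghz   # assumed linear V/f curve
    return cores * cap_per_core * voltage ** 2 * freq_ghz

gpu = dynamic_power(cores=2000, freq_ghz=2.0)  # wide and slow
cpu = dynamic_power(cores=8, freq_ghz=4.0)     # narrow and fast
print(gpu / cpu)  # ~31x more relative power for the wide chip
```

Under the linear V/f assumption, power grows with the cube of frequency, so pushing the 2000-core chip from 2 GHz to 4 GHz would multiply its power draw by 8x. That's the intuition for why wide parts stop around 2 GHz while narrow CPUs can afford 4 GHz.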
|
Was any solution ever discovered for that annoying Nvidia 9/10 series driver bug where your voltage and GPU clock gets stuck at idle boost (or lower) even under full load in a game after the PC has been on a while? It recently reoccurred after my 2nd last EVGA 1070 driver update and it's intensely obnoxious. It's not throttling or PSU related, because the temps are as cool as they've ever been in the low 50C range under load.
|
# ? Aug 21, 2017 13:11 |
|
Oh is that common? I occasionally encountered that with my GTX 980 and yeah, it wasn't throttling and sometimes it even fixed itself after a while, otherwise required a reboot.
|
# ? Aug 21, 2017 13:20 |
|
eames posted:Has anybody done testing on how Pascal cards scale in performance per watt with undervolting? If you filter this thread to my posts only, I did some undervolting benches in Heaven and listed total system draw at the wall as well. You're basically correct in your assumption. Edit: here https://forums.somethingawful.com/showthread.php?threadid=3484126&perpage=40&pagenumber=5#post474459416
|
# ? Aug 21, 2017 14:54 |
|
Are you overclocking? There's an intentional feature to save the card if it sees instability by locking it to idle speeds, but if you're not it could be something bleaker about what it thinks might be wrong with the card (even if there's nothing wrong at all).
|
# ? Aug 21, 2017 14:57 |
|
B-Mac posted:If you filter this thread to my posts only, I did some undervolting benches in Heaven and listed total system draw at the wall as well. You're basically correct in your assumption. Pascal is massively overtuned at stock, let alone auto-OCed. My GTX1070 is able to sustain 1700 MHz at a mere 0.8 V, compared to 1811 MHz @ 1.05 V at stock. I'm only losing like 7% real performance at worst for ~50 W less power. I can't go any lower only because 0.8 V is the absolute minimum that the MSI Afterburner voltage/freq curve can do.
|
# ? Aug 21, 2017 15:07 |
|
craig588 posted:Are you overclocking? There's an intentional feature to save the card if it sees instability by locking it to idle speeds, but if you're not it could be something bleaker about what it thinks might be wrong with the card (even if there's nothing wrong at all).
|
# ? Aug 21, 2017 15:30 |
|
TheRationalRedditor posted:Was any solution ever discovered for that annoying Nvidia 9/10 series driver bug where your voltage and GPU clock gets stuck at idle boost (or lower) even under full load in a game after the PC has been on a while? It recently reoccurred after my 2nd last EVGA 1070 driver update and it's intensely obnoxious. It's not throttling or PSU related, because the temps are as cool as they've ever been in the low 50C range under load. Oh crap, I have to check the clocks. I've been getting single digits after playing 7 Days to Die for a while, and that game's not much prettier than Minecraft.
|
# ? Aug 21, 2017 15:41 |
|
Palladium posted:Pascal is massively overtuned at stock, let alone auto-OCed. My GTX1070 is able to sustain 1700 MHz at a mere 0.8 V, compared to 1811 MHz @ 1.05 V at stock. I'm only losing like 7% real performance at worst for ~50 W less power. I can't go any lower only because 0.8 V is the absolute minimum that the MSI Afterburner voltage/freq curve can do. Mine also does around 1700 MHz at 0.8 V, but actual performance is worse than 1633 MHz at 0.8 V because of the invisible throttling going on when the chip runs at its stability/reliability limit. The difference between 1797 MHz/0.875 V and 1633 MHz/0.8 V is 11% total system power consumption for 2.1% average FPS (Unigine Heaven 1440p Ultra), so I think peak efficiency is far lower, probably in the 1.3-1.4 GHz range that the Max-Q chips boost to. Too bad we can't check. Note that this is measured at the wall and even includes my monitor, so actual GPU power/heat savings are higher. I guess the lesson to be learned here is that Nvidia ships factory overclocked chips so they can squeeze more performance out of smaller dies at the cost of higher power consumption and overclocking Pascal is .
|
# ? Aug 21, 2017 15:50 |
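Plugging the percentage deltas quoted above into a quick sketch shows how lopsided the trade is (only the 2.1% FPS and 11% power deltas come from the post; the baseline FPS and wattage are normalized placeholders):

```python
# Perf-per-watt comparison of the two operating points quoted above:
# 1797 MHz @ 0.875 V vs 1633 MHz @ 0.8 V, where dropping down cost
# 2.1% average FPS but saved 11% total system power at the wall.
# Baselines are normalized to 100; only the deltas are from the post.

def perf_per_watt(fps, watts):
    return fps / watts

high = perf_per_watt(fps=100.0, watts=100.0)               # 1797 MHz point
low = perf_per_watt(fps=100.0 - 2.1, watts=100.0 - 11.0)   # 1633 MHz point
print(low / high)  # ~1.10: the lower point is ~10% more efficient
```

And since the wall measurement includes the monitor and the rest of the system, the GPU-only efficiency gain is larger still, which supports the post's point that Pascal's peak efficiency sits well below stock clocks.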
|
TheRationalRedditor posted:Was any solution ever discovered for that annoying Nvidia 9/10 series driver bug where your voltage and GPU clock gets stuck at idle boost (or lower) even under full load in a game after the PC has been on a while? It recently reoccurred after my 2nd last EVGA 1070 driver update and it's intensely obnoxious. It's not throttling or PSU related, because the temps are as cool as they've ever been in the low 50C range under load. What works for me is power cycling my second monitor. I suspect it's incredibly specific to my setup, unfortunately. Also, for me this bug only happens at power-on. Once resolved it won't recur until I reboot, and even then it's extremely rare - like once every few months.
|
# ? Aug 21, 2017 16:20 |