Goddammit I just went to Micro Center yesterday for a 970
|
|
# ¿ Jul 10, 2015 23:02 |
What the heck does ASIC mean in this context? I might be dumb, but I don't see how 'application-specific IC' fits there.
|
|
# ¿ Jul 21, 2015 04:45 |
Don Lapre posted:They test the asic on the cards and then sell them for more money if it's a higher number. What are they measuring? Efficiency?
|
|
# ¿ Jul 21, 2015 04:51 |
that's a weird as gently caress way of using it, then. Why not just refer to the jitter or something?
|
|
# ¿ Jul 21, 2015 05:06 |
My only issue with Nvidia Inspector's multi-display power saving thing is that it holds the clock down no matter what. When I play games I get poo poo frame rates with it on.
|
|
# ¿ Sep 2, 2015 04:46 |
DrDork posted:Hence step 2 in my solution: on the Multi Display Power Saver window, select Activate Full 3D by GPU Usage and set the threshold to something like 20-25%. That should fix it for you. Ohhhh yeah there's my problem... I'm a dummy
|
|
# ¿ Sep 2, 2015 05:04 |
Panty Saluter posted:Newegg I am somewhat suspicious of your claims. Mostly because I've been running happily on 450W for a while now Haha, yeah my system draws 275W at the absolute max from the wall, and that's with an OC'd 970 AND an i7-4790k at 4.7GHz. 500W would be perfect to take advantage of the efficiency maximum at 50% load.
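To put numbers on that 50% sweet spot: a quick sketch, assuming a flat 90% AC-to-DC conversion efficiency (a made-up round figure for illustration, not a measured curve):

```python
# Back-of-the-envelope PSU sizing from wall draw.
WALL_DRAW_W = 275          # measured at the wall (AC side), from the post
EFFICIENCY = 0.90          # assumed conversion efficiency, not measured

dc_load_w = WALL_DRAW_W * EFFICIENCY   # what the components actually pull
for psu_w in (450, 500, 650):
    load_pct = 100 * dc_load_w / psu_w
    print(f"{psu_w}W PSU: ~{load_pct:.0f}% load")
```

A 500W unit lands almost exactly on the 50% mark, which is where typical 80 Plus efficiency curves peak.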
|
|
# ¿ Sep 10, 2015 05:22 |
Don Lapre posted:Its because 90% of the people are going to buy What the gently caress is a bandwidth regulator and what does it have to do with power delivery?
|
|
# ¿ Sep 10, 2015 05:24 |
Can't argue with that.
|
|
# ¿ Sep 10, 2015 05:28 |
The new Deus Ex game will have DX12 support.
|
|
# ¿ Sep 25, 2015 19:42 |
My 970, OC'd to almost 1500MHz, and a 4790k OC'd to 4.6GHz pull 280W max from the wall running at full tilt.
|
|
# ¿ Nov 4, 2015 22:45 |
To be fair, if absolute price/perf were the chief metric everyone went by, we'd all be getting a 750ti or 950.
|
|
# ¿ Dec 31, 2015 04:28 |
I have the asus one and it works just fine (overclocked to 1450mhz even), though it is loud. My case isn't a soundproofing one, though.
|
|
# ¿ Dec 31, 2015 23:12 |
The article said there would be passive options though
|
|
# ¿ Jan 27, 2016 20:45 |
I'm assuming it's lower-clocked and/or undervolted
|
|
# ¿ Jan 27, 2016 20:52 |
Idk what you guys are doing, but I have almost everything maxed on a 970 OC'd to 1450 and it runs at 45-60 fps just fine. I even have the fancy hair turned on. I suspect if I turned shadows down a tad I could get 50-60.
|
|
# ¿ Jan 29, 2016 17:02 |
Yeah I just dusted my computer today, and a very light coating dusted off translated to 10 degrees C cooler temperatures for both GPU and CPU under load, and sub 30C idle! It's insane what a little dust can do.
|
|
# ¿ Feb 7, 2016 04:26 |
SwissArmyDruid posted:Intel's production capacity is for Intel. That is not going to change any time in the near or foreseeable future. Their fabs are something they guard very jealously, when realistically the only other company out there that's as vertically integrated as they are from design to end product is Samsung. It's funny, I was interviewing at Samsung a few weeks ago and one thing they really wanted to hype up was how superior they were to Intel, process- and fab-wise.
|
|
# ¿ Apr 11, 2016 04:33 |
Yeah, one of the managers I was talking to was quite certain that Intel's was 16nm, which for all I know it could be and they just call it 14nm (which is really more of an academic difference I would think).
|
|
# ¿ Apr 11, 2016 16:03 |
THE DOG HOUSE posted:Yeah and they aren't even saying that I would guess. They being Samsung the company. What a hiring manager says to someone is definitely not representative of what the company would actually say in public, it would be a stretch to even consider it marketing. Every company is the best at everything, just about to take over the competition any day now *wink!*, and a really great place to get in now before it blows up - when you're in an interview lol. Haha, yeah it was in a big presentation they gave to all of us who were interviewing.
|
|
# ¿ Apr 12, 2016 16:32 |
I'd done maybe ten or so mail in rebates over the years, most recently last year, and I got money from every single one of them *shrug*
|
|
# ¿ May 3, 2016 16:00 |
Shimrra Jamaane posted:They have a card for that, its called the 970 and itll be available for about $100 in a few months. The 970 is already available
|
|
# ¿ May 7, 2016 05:38 |
Bardeh posted:For $100? Show me where, and I'll buy one right now. ahh, misread your post, whoops!
|
|
# ¿ May 7, 2016 06:02 |
I'm gonna replace my 970 with a scalped $1k+ FE 1080 to run my 1080p 60Hz monitor because I'm rich and I can
|
|
# ¿ May 29, 2016 03:49 |
Zero VGS posted:I had a ton of expiring Dell store credit, and they were nice enough to put up the PNY GTX 1080 FE for sale with "ships in 20-30 days", so looks like I have two of those on the way I'll need to flip. Well, you'd have to figure out how to fit a micro or regular ATX into the mini-itx case first...
|
|
# ¿ Jun 3, 2016 03:24 |
Zero VGS posted:With a bifurcated Mini-ITX like the newest ASRock stuff, you can (theoretically) use a custom ribbon cable to split the single x16 PCIe port into two x8, like with this: http://www.ebay.com/itm/331051900725 Huh. So it is, though I don't know enough about PCIe to know whether two cards can share the JTAG, SMBus and wake/reactivation pins like that...
|
|
# ¿ Jun 3, 2016 18:14 |
There won't be.
|
|
# ¿ Jun 5, 2016 16:55 |
fozzy fosbourne posted:Some people are speculating that 1080 supply might be more constrained than 1070 because of the gddr5x Yes, but custom boards take time to design, as does modifying whatever needs to be modified to fit their custom coolers to the reference board.
|
|
# ¿ Jun 5, 2016 17:18 |
Anime Schoolgirl posted:There's probably a good reason why AMD made a seemingly last-minute switch to Samsung's fabs for Polaris. *Global Foundries, which licensed 14nm LPP from Samsung. Who knows if that includes any process or equipment improvements, though... |
|
# ¿ Jun 17, 2016 13:20 |
Craptacular! posted:Newegg had an Asus 960 for $160. I'm limping along with an Asus 660 with factory OC and have for three years, going from playing Bioshock Infinite at ultra settings in 2013, to playing Overwatch in above-medium, GTA at medium, and Doom at downright ugly. I have an overclocked 970 and I run doom on the highest settings it lets me do at 1920x1200 with almost constant 60 fps. Sometimes it dips below it, but only for a second here and there.
|
|
# ¿ Jul 3, 2016 18:47 |
SwissArmyDruid posted:Ashes *is* a bad benchmark Part of the game's claim to fame is that poo poo is procedurally generated every time you play it. Last I heard, this also extended to the benchmark as well. And if there's one thing that you absolutely don't want when benchmarking things, it's a random seed changing itself from run to run. Even if the seed is exactly identical every time the benchmark is run, we're still just supposed to take their word that the result should be the same, regardless of hardware? Please. That's what multiple trials are for! Assuming seeds map evenly onto scene complexity, averaging over several runs is statistically more representative than a single fixed seed that happens to generate a complex scene every time.
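The multiple-trials point can be sketched like this; `run_benchmark` is a hypothetical stand-in that fakes an fps score, since a real run would launch the game:

```python
import random
import statistics

def run_benchmark(seed):
    """Hypothetical stand-in for one benchmark run: returns a fake fps
    score that varies with the (procedurally generated) scene."""
    rng = random.Random(seed)
    return 60 + rng.uniform(-5, 5)   # fake variation; a real run launches the game

# Averaging over many randomly seeded trials narrows the estimate:
scores = [run_benchmark(seed) for seed in range(30)]
mean = statistics.mean(scores)
stdev = statistics.stdev(scores)
stderr = stdev / len(scores) ** 0.5   # uncertainty shrinks with sqrt(trials)
print(f"mean {mean:.1f} fps, standard error {stderr:.2f}")
```

The standard error of the mean falls off as the square root of the number of trials, which is exactly why repeated runs beat any single seeded run.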
|
|
# ¿ Jul 5, 2016 12:35 |
Pssshh all you need is a script to automate the benchmarking.
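A minimal sketch of such a script; the benchmark command here is a stub that just prints a number, standing in for whatever the real game's benchmark CLI would be:

```python
import statistics
import subprocess
import sys

# Hypothetical benchmark command: a stub that prints one fps number.
# Swap in the real game's benchmark CLI for actual use.
BENCH_CMD = [sys.executable, "-c", "print(60.0)"]
N_RUNS = 3

def run_once():
    # Assumes the benchmark prints a single number to stdout.
    out = subprocess.run(BENCH_CMD, capture_output=True, text=True, check=True)
    return float(out.stdout.strip())

scores = [run_once() for _ in range(N_RUNS)]
print(f"mean over {N_RUNS} runs: {statistics.mean(scores):.1f}")
```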
|
|
# ¿ Jul 5, 2016 13:02 |
I succumbed to impulse and bought a EVGA 1070 SC. I'm a bit bummed that it can only overclock up to 2050 at the absolute max, but its still drat good. Since I only have one, 1200p monitor, I also went ahead and bought a crossover 2795 and a monitor arm. I was very tempted to buy a x34, but I have spent waaay too much money in the past couple months and while my new credit card is interest free until september of next year, I don't NEED that fancy a monitor lol.
|
|
# ¿ Jul 15, 2016 04:26 |
Of course overclocking the 10x0 series is a silicon lottery; it's a brand new process node. It's nowhere near the 95+% yields of 28nm (probably more like 75-85%), and you can bet the good 16nm bins are all drat good, rather than bins that just barely squeak into spec. On newer nodes the kinks haven't all been worked out, nor have there been anywhere near as many power or judder/swing reduction improvements. You need another few years to get back to that point, especially with FinFETs.
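For a feel of how yield falls off on a new node, here's the classic Poisson yield model with invented defect densities chosen to land near the 95% vs. 75-85% figures above (not actual foundry data):

```python
import math

# Poisson yield model: Y = exp(-die_area * defect_density).
# Defect densities below are hypothetical illustrations, not foundry figures.
DIE_AREA_CM2 = 3.14          # ~314 mm^2 die, roughly GP104-sized
MATURE_D0 = 0.016            # defects/cm^2, hypothetical mature 28nm
NEW_D0 = 0.07                # defects/cm^2, hypothetical early 16nm

for label, d0 in (("mature node", MATURE_D0), ("new node", NEW_D0)):
    print(f"{label}: yield ~{100 * math.exp(-DIE_AREA_CM2 * d0):.0f}%")
```

Same die, same design: just raising the defect density from "mature" to "early node" levels drops yield from ~95% to ~80%.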
|
|
# ¿ Jul 16, 2016 02:49 |
Fauxtool posted:I just found out that you can lease enthusiast GPUs. It seems about as good an idea as renting rims. You are locked into a 12 month contract and you still pay the same monthly rate even when the value of the card inevitably suffers many price cuts. You can make a 13th payment at the end to buy the card. Welcome to Rent-a-Center
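Quick math on what that lease structure costs, with made-up example numbers for the monthly rate and retail price (the 12 payments plus a 13th buyout payment is the structure described above):

```python
# Rough lease-vs-buy comparison. MONTHLY and RETAIL are hypothetical
# example figures, not quoted from any actual lease program.
MONTHLY = 60.0        # hypothetical monthly lease payment
RETAIL = 650.0        # hypothetical retail price at signing

lease_total = 13 * MONTHLY          # 12 months + the buyout payment
premium = lease_total - RETAIL      # extra paid vs. buying outright
print(f"lease total ${lease_total:.0f}, ${premium:.0f} over buying outright")
```

And that premium ignores the price cuts the card takes over the year, so the effective markup at the end is even worse.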
|
|
# ¿ Aug 1, 2016 09:02 |
Wouldn't 110 be, you know, worse than 108? It'd be like 1120 poo poo tier. And I think that 14nm for the next gen is mixing rumors up...
|
|
# ¿ Aug 12, 2016 22:54 |
Malcolm XML posted:nvidia doing tick tock now???
|
|
# ¿ Aug 13, 2016 01:30 |
PBCrunch posted:It seems a lot more likely that Nvidia would move low-power stuff like Tegra over to Samsung. Well, to be perfectly fair, GloFo might be loving up the process somehow, or Polaris' design is flawed and can't hit the speeds they want. I don't understand the physics well enough, but I can't see how whatever ln14lpcdwhateverthefuck wouldn't work for "high power" stuff unless they aren't giving the transistors enough fins to carry current well (which makes me wonder if there is a fin difference between TSMC/Intel and Samsung), or TSMC has an awesome process tech that allows that much current and/or improves rise/fall times a ton.
|
|
# ¿ Aug 13, 2016 02:53 |
Anime Schoolgirl posted:isn't that precisely the reason why amd contracted samsung directly for fabrication recently Huh? Are you sure you aren't thinking of Nvidia's pascal shrink?
|
|
# ¿ Aug 22, 2016 15:12 |
Anime Schoolgirl posted:Nope. This is apparently a separate fab contract from the ones with TSMC (for Bristol Ridge) and Global Foundries. I was wondering if that was what you were referring to. All that's really saying is that they have the option, thanks to the licensing agreement between Samsung and GloFo ensuring easy compatibility. Now, I'm sure they've run a few test lots through Giheung and/or Austin to make sure the design translates, but I really don't think there's anything more concrete.
|
|
# ¿ Aug 22, 2016 21:27 |