|
AVeryLargeRadish posted:I figured that they don't have 3-way and 4-way SLI working the way they want yet so it's software-locked. They would tell the press this so that people stop spreading the "1080 can only do 2-way SLI!" rumor; the 1080/1070 might not have it at launch but the hardware can do it once Nvidia gets it working properly. I suspect this is correct, but if they did want to lock features behind a license 3/4-way SLI is where I'd want them to start because I don't give a poo poo about that.
|
# ? May 15, 2016 03:52 |
|
|
Lockback posted:I suspect this is correct, but if they did want to lock features behind a license 3/4-way SLI is where I'd want them to start because I don't give a poo poo about that. Better to make your money selling SLI bridges & licensing them out to mobo manufacturers.
|
|
# ? May 15, 2016 04:07 |
|
Subjunctive posted:My money is on "Titan 10". Still won't sell one to me.
|
# ? May 15, 2016 06:27 |
|
...Why don't they just make the card bigger and put more stuff on it? (I don't mean a 1090)
|
# ? May 15, 2016 06:53 |
|
Ak Gara posted:...Why don't they just make the card bigger and put more stuff on it? (I don't mean a 1090) How do you mean? They've already got a larger chip than the 1070/1080; it's going to be the Titan whatever and then the 1080 Ti once they have more chips than they can sell for a few grand apiece to compute customers.
|
# ? May 15, 2016 07:06 |
|
FaustianQ posted:I guess there is precedent for a GX104 chip to be considered flagship so that's really not outside the realm of possibility. But rather than a 1080 Ti, just the new Titan (G? X2? Y? K? P?) or something, $900. The weird fact is that the
|
# ? May 15, 2016 07:14 |
|
I dunno, it kind of makes sense to not use the P100 design for a consumer gaming card; a redesign that turns 75% of the FP64 ALUs into FP32 could make an immensely powerful card as a GP102 or GP110. P100 is just horrifically inefficient for consumer usage IMHO.
|
# ? May 15, 2016 08:48 |
|
SwissArmyDruid posted:Wait, is that seriously why they keep doing SLI through fingers? o.o No
|
# ? May 15, 2016 09:10 |
|
BIG HEADLINE posted:Or they move to other colors on ROYGBIV plus White. Somehow I think they'd skip the Titan Red, though. Titan Mauve
|
# ? May 15, 2016 15:23 |
|
El Scotch posted:Titan Mauve You aren't thinking enough in terms of "prestige product" Titan Rose Gold
|
# ? May 15, 2016 16:59 |
|
Evil Fluffy posted:If you have a 5-6+ year old PC I'm not sure how effectively you'll be able to use a 1070 or 1080 since the rest of the system won't be nearly as powerful. It still feels like maybe replacing it with something with a fan, even if it's just a GT 730, would probably be an alright investment. Can't seem to find any replacement fans for the 9500 GT.
|
# ? May 15, 2016 17:03 |
|
So I'm looking for a switched HDMI splitter. That is, I want to have a single input, and select which of the two outputs it goes to. (I don't think it'll work if they're both connected at the same time.) All the splitters I can find are "always on" to both, and all the switches I can find are multi-input/single-output. My fallback is to have a short HDMI cable to the top of my desk with a F+F adapter, and then switch them manually when I use the different outputs, but that seems pretty uncivilized. E: I wonder if I can just run a switch backwards...
|
# ? May 15, 2016 17:21 |
|
I have a 2500k I bought in 2011 and I don't think a 1070 will outpace an OC'd 2500k. I don't think it's like it was 6-10 years ago when you had to upgrade in a rotation, more or less.
|
# ? May 15, 2016 17:22 |
|
Most games just aren't CPU limited. There's some out there that can push it (RTS games like SupCom or Total War, etc.), but they're in the minority. Even still, the 2500k remains completely relevant, even more so with a decent overclock.
|
# ? May 15, 2016 17:31 |
|
Subjunctive posted:My money is on "Titan 10". I could see it being called "Titan 4K".
|
# ? May 15, 2016 17:42 |
|
Titan TNT
|
# ? May 15, 2016 17:45 |
|
Games are starting to be able to actually use more and more cores, but it's a slow process, and half the time it favors old chips with a lot of cores because the total CPU requirement isn't going up as fast. We're finally approaching the future of gaming that Bulldozer wouldn't suck miserably at, the one the die-hards talked about, right as it gets replaced with yet another high-IPC processor that may actually be good. Generally these days, if you have the budget for both but not at once and the CPU isn't totally incompetent, I'd do the GPU first because it will be a huge upgrade, and then follow with the CPU. The CPU on its own isn't going to give you a huge result, so even if GPUs get better before you upgrade the CPU (not a worry, we're on the leading edge of the next generation and these things tend to last two years now), the time before you've upgraded both being good instead of bad will more than outweigh the time after both are upgraded being slightly less good.
|
# ? May 15, 2016 17:45 |
|
xthetenth posted:Games are starting to be able to actually use more and more cores, but that's a long progress and half the time it's in favor of old chips with a lot of cores because the total cpu requirement isn't going up as fast. We're finally approaching the future of gaming that bulldozer wouldn't suck miserably at that the die hards talked about right as it gets replaced with yet another high IPC processor that may be actually good. I think what's interesting is that the components that get you the most bang-for-your-buck nowadays are the complete opposite of what was suggested 10 years ago - RAM and CPU seemed to be the "best" upgrades back then, but now it's GPUs and SSDs (although SSDs less so for gaming). I could see myself upgrading my GPU 3x+ before I bother upgrading my i5 4690k.
|
# ? May 15, 2016 17:49 |
|
xthetenth posted:Games are starting to be able to actually use more and more cores, but that's a long progress and half the time it's in favor of old chips with a lot of cores because the total cpu requirement isn't going up as fast. We're finally approaching the future of gaming that bulldozer wouldn't suck miserably at that the die hards talked about right as it gets replaced with yet another high IPC processor that may be actually good. i need your dealer's number, my weed isn't that good
|
# ? May 15, 2016 17:59 |
|
Bag of Sun Chips posted:I think what's interesting is that the components that get you the most bang-for-your-buck nowadays are the complete opposite of what was suggested 10 years ago - RAM and CPU seemed to be the "best" upgrades back then, but now it's GPUs and SSDs (although SSDs less so for gaming). CPUs don't move much year to year these days, so getting a real performance upgrade means going up the stack. That basically kills the price effectiveness right off the bat. Ask me about being a penny-wise fool and not getting a Haswell-E when I could have gotten one instead of a 4790k. I'm going to end up replacing that much sooner than a 6-core, and probably end up paying more over time. The big thing is that CPUs are slamming into the law of diminishing returns on single-core speed, and servers and laptops matter more than desktops, so Intel's perfectly happy to pocket advances in tech to make laptops more efficient and make servers either more efficient or stuff them with more cores. Desktop is still okay with paying current prices per core, so it keeps limping along. Meanwhile GPUs aren't gaining as fast because things are just overall slowing down, but they're still able to directly turn transistors into performance.
|
# ? May 15, 2016 17:59 |
|
I've got a lovely i5 2400 running at 3.1ghz from like 5 years ago together with a gtx960, do you think upgrading to a 1070 will produce too much bottleneck? Just aiming to play at 1080p with everything else maxed.
|
# ? May 15, 2016 18:07 |
|
Hahahahahahahahahahahaha *breathe* hahahahahahaa quote:In any case GeForce GTX 1080 only tops those charts in two benchmarks: Bitcoin Mining and Face Detection.
|
# ? May 15, 2016 18:34 |
|
The thing to understand about CPU bottlenecks is that they vary wildly from game to game and are more or less completely separate from GPU performance. The latest, greatest processor might run a corridor shooter at 300 FPS and an open-world game at 80. If your GPU is so strong compared to your CPU that it actually gets held back, then your CPU will most likely choke so hard on a demanding game that your GPU won't have a chance to get hit that hard to begin with. If you have a Sandy Bridge or later CPU (and possibly even one a little older than that) I'd wait to upgrade it until you actually want to play a game it can't handle. HMS Boromir fucked around with this message at 18:37 on May 15, 2016 |
# ? May 15, 2016 18:35 |
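The bottleneck point above can be sketched as a toy model: a frame can't finish faster than its slowest stage, so whichever of the CPU or GPU is slower sets the frame rate (the 300/80 FPS figures come from the post; everything else here is purely illustrative):

```python
# Toy model: each frame waits on both the CPU (simulation, draw calls)
# and the GPU (rendering), so the slower stage caps the frame rate.

def effective_fps(cpu_fps, gpu_fps):
    """Frame rate is limited by whichever stage is slower."""
    return min(cpu_fps, gpu_fps)

# Corridor shooter: CPU could do 300 FPS, GPU manages 144 -> GPU-bound.
print(effective_fps(300, 144))  # 144

# Open-world game on the same hardware: CPU chokes at 80 -> CPU-bound,
# and the GPU never gets the chance to be the bottleneck.
print(effective_fps(80, 144))   # 80
```

The takeaway is the same as the post's: the game, not the hardware pairing alone, decides which side bottlenecks.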
|
NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO
|
# ? May 15, 2016 18:44 |
|
oh please butters please buy up all the GP104 stock the salt will be amazing
|
# ? May 15, 2016 19:06 |
|
Anime Schoolgirl posted:oh please butters please buy up all the GP104 stock the salt will be amazing You know what's better than "salt"? Being able to buy a new video card for less than 2x the MSRP.
|
# ? May 15, 2016 19:14 |
|
Kramjacks posted:You know whats better than "salt"? Being able to buy a new videocard for less than 2X the msrp.
|
# ? May 15, 2016 19:16 |
|
Anime Schoolgirl posted:oh please butters please buy up all the GP104 stock the salt will be amazing They've been doing a great job of cleaning up the back stock of Hawaii.
|
# ? May 15, 2016 19:29 |
|
I'm glad cryptocurrency is continuing to be a baffling force for evil in the world.
|
# ? May 15, 2016 19:32 |
|
A few weeks later a mining script is released that works wonders with Pascal cards, and AMD releases Polaris 10 for a competitive price/perf that has poor optimization for the mining script. Meanwhile, in AMD headquarters, the entire company boardroom steeples their fingers in unison as Lisa Su says but one word. "Excellent".
|
# ? May 15, 2016 19:45 |
|
Kramjacks posted:NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO Also I guess the pre-order people are gonna make bank flipping. Alright AMD, I don't know how many more chances you need, but now is the time penus penus penus fucked around with this message at 20:33 on May 15, 2016 |
# ? May 15, 2016 20:29 |
|
FaustianQ posted:A few weeks later a mining script is released that works wonders with Pascal cards, and AMD releases Polaris 10 for a competitive price/perf that has poor optimization for the mining script. Meanwhile, in AMD headquarters, the entire company boardroom steeples their fingers in unison as Lisa Su says but one word. Please let this be a thing. Please.
|
# ? May 15, 2016 20:33 |
|
Isn't gpu mining not a thing anymore?
|
# ? May 15, 2016 20:35 |
|
KakerMix posted:Isn't gpu mining not a thing anymore? yeah all the 'big boys' use specialized FPGAs now.
|
# ? May 15, 2016 20:38 |
|
Some new 1080 slides http://videocardz.com/59962/nvidia-geforce-gtx-1080-final-specifications-and-launch-presentation
|
# ? May 15, 2016 20:42 |
|
Even if the 1080 is the best GPU by a mile for mining, it still won't be nearly enough to cover your electricity. Specialized mining equipment has taken over for the last couple of years. Other cryptos can use the GPU, but those never last more than a few weeks before the exit scam commences.
|
# ? May 15, 2016 21:04 |
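For anyone curious why electricity kills GPU mining, a back-of-envelope sketch (every figure below is a hypothetical placeholder, not a real hashrate, card wattage, or coin price):

```python
# Back-of-envelope mining profitability. All inputs are hypothetical
# examples, not measured 1080 numbers.

def daily_profit(revenue_per_day, card_watts, price_per_kwh):
    """Mining revenue minus the cost of running the card 24/7."""
    kwh_per_day = card_watts / 1000 * 24
    return revenue_per_day - kwh_per_day * price_per_kwh

# A 180 W card earning $0.50/day at $0.12/kWh loses money:
# ~4.3 kWh/day costs more than the coins are worth.
print(daily_profit(0.50, 180, 0.12) < 0)  # True
```

ASICs win this arithmetic by doing orders of magnitude more hashes per watt, which is why GPU revenue per day collapses once they show up for a given coin.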
|
tima posted:Some new 1080 slides http://videocardz.com/59962/nvidia-geforce-gtx-1080-final-specifications-and-launch-presentation lol if AMD's async compute ace-in-the-hole they've been sitting on for 5 years gets negated after paying off in like 3 games
|
# ? May 15, 2016 21:05 |
|
Rigged Death Trap posted:yeah all the 'big boys' use specialized FPGAs now.
|
# ? May 15, 2016 21:05 |
|
Rigged Death Trap posted:yeah all the 'big boys' use specialized FPGAs now. I thought they went to ASICs now?
|
# ? May 15, 2016 21:06 |
|
|
|
Rigged Death Trap posted:yeah all the 'big boys' use specialized FPGAs now. They keep trying, again and again, to design coins that work for GPUs and not FPGAs/ASICs. The newest coin, "Ethereum", is apparently set up in a way that you can mine it most efficiently on GPUs, and it has a "killswitch" coded in so it'll stop producing new coins right around the time someone would be able to develop an FPGA/ASIC for it.
|
# ? May 15, 2016 21:10 |