|
SwissArmyDruid posted:Which is fine, because the gains from using quad-channel memory over dual-channel have never been statistically, price-performance, or realistically significant, unless you're doing some crazy-rear end compute poo poo. This approach by AMD is totally justified. (Only having dual-channel, I mean. We already knew that AMD was going to skew some benchmarks somehow, we just didn't know *how* yet.) I can't wait for an AMD chip with like 8 TB of memory per socket, and all the deep learning labs buying them up so they can run retardo-huge datasets through it.
|
# ? Jan 12, 2017 10:39 |
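The dual- versus quad-channel gap being hand-waved at above is easy to put rough numbers on. A back-of-the-envelope sketch of theoretical peak bandwidth (the DDR4-2400 speed and 64-bit channel width are standard DDR4 figures, not anything AMD announced):

```python
def peak_bandwidth_gbs(channels, mt_per_s, bus_bytes=8):
    """Theoretical peak: channels x transfers/s x 64-bit (8-byte) bus width,
    reported in decimal GB/s."""
    return channels * mt_per_s * bus_bytes / 1000

for ch in (2, 4):
    print(f"{ch}-channel DDR4-2400: {peak_bandwidth_gbs(ch, 2400):.1f} GB/s")
```

Quad-channel doubles the theoretical ceiling (38.4 vs 76.8 GB/s here), which is exactly why it only pays off for bandwidth-bound workloads rather than typical desktop use.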
|
|
If memory speeds have noticeable influence on frame rates, why wouldn't quad channel?
|
# ? Jan 12, 2017 13:09 |
|
Combat Pretzel posted:If memory speeds have noticeable influence on frame rates, why wouldn't quad channel? in most cases latency is way more important than throughput (and higher speed RAM nowadays has significantly lower latency) but more and more games care about throughput so more than two channels is an actual boon now, GTAV/Witcher/ARMA 3 being the most notable ones, and Fallout 4 and Skyrim Remaster are affected by it to an absurd degree
|
# ? Jan 12, 2017 14:52 |
|
Anime Schoolgirl posted:in most cases latency is way more important than throughput (and higher speed RAM nowadays has significantly lower latency) but more and more games care about throughput so more than two channels is an actual boon now, GTAV/Witcher/ARMA 3 being the most notable ones, and Fallout 4 and Skyrim Remaster are affected by it to an absurd degree Yep, the minimum framerates can go up 10-15% in some instances going from the generic DDR3 1600 to the super-stupid 2400+ kits, and DDR4 is more of the same. The latency going down is the important part, not so much the added bandwidth, which is why a super fast dual channel kit improves things, whereas a quad channel setup on an -E CPU wouldn't.
|
# ? Jan 12, 2017 18:43 |
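The latency point above can be sanity-checked with simple arithmetic: first-word latency is the CAS count divided by the memory clock (which runs at half the MT/s transfer rate). A quick sketch, using typical CAS ratings for illustration rather than the exact kits discussed:

```python
def cas_latency_ns(cl, mt_per_s):
    """First-word latency in ns: CL cycles at the memory clock,
    which ticks at half the MT/s transfer rate."""
    return cl * 2000 / mt_per_s

kits = {  # typical CAS ratings; exact timings vary by kit
    "DDR3-1600 CL9": (9, 1600),
    "DDR3-2400 CL10": (10, 2400),
    "DDR4-2400 CL15": (15, 2400),
}
for name, (cl, rate) in kits.items():
    print(f"{name}: {cas_latency_ns(cl, rate):.2f} ns")
```

By this arithmetic a fast DDR3-2400 CL10 kit lands around 8.3 ns against 11.25 ns for generic DDR3-1600 CL9, which is the latency drop the post credits for the frametime gains.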
|
Anime Schoolgirl posted:in most cases latency is way more important than throughput (and higher speed RAM nowadays has significantly lower latency) but more and more games care about throughput so more than two channels is an actual boon now, GTAV/Witcher/ARMA 3 being the most notable ones, and Fallout 4 and Skyrim Remaster are affected by it to an absurd degree Did not know about Witcher, somehow I am not actually surprised by Arma 3, but GTAV/Fallout/Skyrim are WTFs. Got links?
|
# ? Jan 12, 2017 19:55 |
|
http://www.eurogamer.net/articles/digitalfoundry-2016-is-it-finally-time-to-upgrade-your-core-i5-2500k This is the only article I've read this year.
|
# ? Jan 12, 2017 20:19 |
|
baderp. I really gotta stop skimming articles before lunch.
SwissArmyDruid fucked around with this message at 21:19 on Jan 12, 2017 |
# ? Jan 12, 2017 21:16 |
|
Jago posted:http://www.eurogamer.net/articles/digitalfoundry-2016-is-it-finally-time-to-upgrade-your-core-i5-2500k Yes, upgrade. To the 2600k.
|
# ? Jan 12, 2017 21:24 |
|
incoherent posted:Yes, upgrade. You know, I'm really regretting cheaping out with one of the worst P67 motherboards out there. I never thought I'd be trucking on the same machine 6 years later with no real plans to upgrade soon.
|
# ? Jan 12, 2017 21:26 |
|
My poor 3550 is having problems with Battlefield 1, to the point where I get better framerates in DX12 mode. BF1 generally doesn't give you better framerates in DX12 mode, ever. I'm the super special snowflake (that's trying to push 2560x1440@144hz with a 1070 that's being hobbled by the processor). I'll probably upgrade at some point, but not until the Zen dust settles.
|
# ? Jan 12, 2017 21:32 |
|
Sounds like 4C/4T will be a thing as well with Zen. Hopefully they don't go down to 2 cores, so they can put 4 cores right up against the i3 (and hopefully be ~2x faster) and then go 4C/8T against the i5s.
|
# ? Jan 12, 2017 21:34 |
|
havenwaters posted:2560x1440@144hz How many frames do you average? Does running half res look terrible?
|
# ? Jan 12, 2017 22:00 |
|
EdEddnEddy posted:Sounds like 4C/4T will be a thing as well with Zen i3s have gotten really high clocks recently. It won't be as cut and dried as 2x faster, and probably more like a 3.9 GHz i3-7100 vs a 3 GHz i5-7400. If they can sell an i5-7400 competitor at i3-7100 prices, AMD's in business.
|
# ? Jan 12, 2017 22:04 |
|
Jago posted:How many frames do you average? Does running half res look terrible? For whatever reason DX11 averages 30s to 40s. DX12 averages 50s to 70s, which is fine by me, but DX12 will randomly stutter sometimes. DX11 is "supposed" to average in the 70s to 90s at that resolution with a 1070 when not CPU bound at all. edit: It's apparently a known issue that happens to some people, and one of the fixes was to set up a framerate cap at 60, but that kind of defeats the point of having a high refresh rate monitor, so oh well. Honestly, by the time I want to upgrade I probably won't be playing BF1 anymore and this'll be a moot point. 95% of other games still work fine. Deadly Premonition doesn't, but well, that's Deadly Premonition. MagusDraco fucked around with this message at 22:18 on Jan 12, 2017 |
# ? Jan 12, 2017 22:11 |
|
Twerk from Home posted:You know, I'm really regretting cheaping out with one of the worst P67 motherboards out there. I never thought I'd be trucking on the same machine 6 years later with no real plans to upgrade soon. Yeah having a lovely old mobo for my 2600k was basically my only excuse for building a new system, it is nice having two PCs though.
|
# ? Jan 13, 2017 00:08 |
|
Twerk from Home posted:You know, I'm really regretting cheaping out with one of the worst P67 motherboards out there. I never thought I'd be trucking on the same machine 6 years later with no real plans to upgrade soon. I'm regretting cheaping out in a different way, buying a regular 2500 instead of the 2500k. If the 6-core Zen has good performance and a reasonable price, I'm there. It'll be nice to get Oculus Home to stop complaining every time I boot it up.
|
# ? Jan 13, 2017 00:15 |
|
4C/4T might be being positioned to replace current Athlons? https://twitter.com/Thracks/status/819608674675081218 This is me, being super thirsty for Zen. *tightens belt, slaps arm* I need more!
|
# ? Jan 13, 2017 12:53 |
|
Ludicrous Gibs! posted:I'm regretting cheaping out in a different way, buying a regular 2500 instead of the 2500k. Ouch, so much performance left on the table for so little extra money. Oh well, Zen hopefully offers a way forward! Luckily, I'm finding it hard to believe it will suck given everything we know, unlike Vega, which looks like it will not quite reach the performance levels it needs to.
|
# ? Jan 13, 2017 13:20 |
|
Zen will have to be $30 and come with a (good) free gaming title before I consider upgrading from my 2500k. It takes everything I throw at it!
|
# ? Jan 13, 2017 16:04 |
|
You know what just give me Zen for free and I'll help "spread the word about it" that's worth something right
|
# ? Jan 13, 2017 16:25 |
|
I have absolutely no reason to upgrade from a 4790k, but if Zen is good and the 6-core is affordable it's going to be hard to stop myself... Especially since I could give the Haswell machine to my sister so she's not using my Phenom II X4 955 from 2010.
|
# ? Jan 13, 2017 16:59 |
|
That's been the trend, hasn't it? Even though CPUs have gotten somewhat better, the drive to build a whole new system every 1-3 years practically evaporated if you moved to Sandy Bridge or later from whatever you had before. Outside of some nice features coming with newer CPUs and platforms, it hasn't been nearly as exciting to build a new system when your present one, even if old, still handles everything modern software can throw at it, unlike in the old days (especially the pre-SSD days, ugh). GPU tech did get a boost with the 10XX series, and hopefully Vega can keep that ball rolling, but I sure do hope Zen is the start of a new race, if only to get more cores into the consumer realm, and maybe bring the -E chips down to less-than-stupid upgrade prices too. We will see. Nvidia pretty much confirmed that they held off announcing the 1080 Ti at CES because of AMD's not-quite-full Vega unveil, so we have to wait until March to see what it has in store and how Nvidia tries to crush AMD at launch. Hopefully Vega can put up a good fight for once.
|
# ? Jan 13, 2017 20:08 |
|
I'm a dumb and bought an FX-8350 a year ago, so I get the drive to build a new system in a year anyway.
|
# ? Jan 13, 2017 20:12 |
|
I bought a 4670k in 2013; I was tempted to go i7, but they had almost no performance gains in games back then and I figured I'd be upgrading in a couple years anyway. Now I'm wishing I had gone with the i7, because more cores is the only reason I'm looking to upgrade right now. A video card upgrade a few months ago means that 99% of games run like I have a brand new computer, though I'm CPU bottlenecked for sure when trying to hit 165fps in stuff like Overwatch.
|
# ? Jan 14, 2017 01:04 |
|
the difference between an 115x i5-k and i7-k is not that big, honestly.
|
# ? Jan 14, 2017 03:23 |
|
Is there any reason - other than fear - that AMD hasn't published a head to head benchmark of Zen against the 2500k? What are they hiding?
|
# ? Jan 14, 2017 06:42 |
|
ItBurns posted:Is there any reason - other than fear - that AMD hasn't published a head to head benchmark of Zen against the 2500k? Probably because it's what, 3-4 generations old at this point?
|
# ? Jan 14, 2017 06:45 |
|
ItBurns posted:Is there any reason - other than fear - that AMD hasn't published a head to head benchmark of Zen against the e6750?
|
# ? Jan 14, 2017 07:16 |
|
ItBurns posted:Is there any reason - other than fear - that AMD hasn't published a head to head benchmark of Zen against the Pentium 4?
|
# ? Jan 14, 2017 07:34 |
|
Pentium 4 is also funny because it marks the exact starting point of Kyle Bennett's salt trail against AMD: an ill-advised $1000 purchase, beaten by a $200 CPU.
|
# ? Jan 14, 2017 07:43 |
|
Anime Schoolgirl posted:Pentium 4 is also funny because it marks the exact starting point of Kyle Bennett's salt trail against AMD Is there more somewhere that I can read about this?
|
# ? Jan 14, 2017 07:53 |
|
SourKraut posted:Is there more somewhere that I can read about this? He was also responsible for creating the dogmatic AMD fanboy as a concept, and as with all gimmicks, the irony washed off after a few years, but they still hound him to this day. Anime Schoolgirl fucked around with this message at 08:01 on Jan 14, 2017 |
# ? Jan 14, 2017 07:58 |
|
Thinking back to those days I didn't follow the whole Itanium thing that closely... but in retrospect if K7 hadn't cockslapped Netburst and opened a hole for AMD to deliver the x86-64 gut punch would we all be suckling at Intel's Itanium teat today on the desktop in 2017? I realize there were other factors at play: Itanium had growing pains and Microsoft plus a legion of legacy software vendors jumped at the 64 bit extension way out. But if K7 failed and K8 didn't happen as a result for whatever reason surely Intel could've strong armed everyone into their way of thinking? Or if K7 failed and Itanium wet farted just as hard in that reality would something else have risen up?
|
# ? Jan 14, 2017 08:51 |
|
Fabulousity posted:Thinking back to those days I didn't follow the whole Itanium thing that closely... but in retrospect if K7 hadn't cockslapped Netburst and opened a hole for AMD to deliver the x86-64 gut punch would we all be suckling at Intel's Itanium teat today on the desktop in 2017? I realize there were other factors at play: Itanium had growing pains and Microsoft plus a legion of legacy software vendors jumped at the 64 bit extension way out. But if K7 failed and K8 didn't happen as a result for whatever reason surely Intel could've strong armed everyone into their way of thinking? Or if K7 failed and Itanium wet farted just as hard in that reality would something else have risen up? power architecture and the loss of low power consumer electronics as a concept until arm got better
|
# ? Jan 14, 2017 08:55 |
|
Anime Schoolgirl posted:power architecture and the loss of low power consumer electronics as a concept until arm got better In the K7/Itanium die timeline my Discman with 20-second anti-skip could have still been sort of awesome in 2003? Nice. I love your avatar, by the way. "Come then! Show me what passes for compassion among your beloved kind!"
|
# ? Jan 14, 2017 10:08 |
|
As far as I know, Itanium was built on the flawed premise that ILP could be achieved through compiler optimizations and explicitly parallel architectures, rather than through hardware methods like superscalar microarchitectures, out-of-order execution, etc. It seems like almost everyone uses the hardware approach nowadays, and I'm not aware of anyone having much success with the software approach in the traditional CPU market.
|
# ? Jan 14, 2017 11:05 |
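The compiler-vs-hardware ILP split described above can be sketched with a toy list scheduler. This is only an illustration of the EPIC premise (the compiler statically packs independent instructions into fixed-width bundles, instead of hardware discovering parallelism at runtime); the 3-slot bundle width nods to IA-64, but the `(dest, srcs)` instruction format here is made up for the example:

```python
BUNDLE_WIDTH = 3  # IA-64 grouped instructions into 3-slot bundles

def schedule(instructions, live_in):
    """Greedy list scheduler: a bundle may only contain ops whose source
    operands were produced in an earlier bundle (or are live-in values),
    mirroring the compile-time dependence analysis EPIC relied on."""
    ready = set(live_in)
    pending = list(instructions)
    bundles = []
    while pending:
        # take up to BUNDLE_WIDTH ops whose inputs are already available
        bundle = [ins for ins in pending
                  if all(s in ready for s in ins[1])][:BUNDLE_WIDTH]
        if not bundle:
            raise ValueError("unsatisfiable dependency")
        for ins in bundle:
            pending.remove(ins)
        ready.update(dest for dest, _ in bundle)
        bundles.append(bundle)
    return bundles

# a = x+y; b = x*2; c = a+b  ->  a and b are independent, c needs both
prog = [("a", ("x", "y")), ("b", ("x",)), ("c", ("a", "b"))]
print(schedule(prog, live_in={"x", "y"}))  # two bundles: [a, b] then [c]
```

The whole bet was that analysis like this, done once at compile time, could replace the out-of-order machinery that superscalar CPUs spend transistors on, which is why a microarchitecture change could force a recompile.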
|
The hardware approach makes more sense anyway, because when a CPU generation changes things up a lot internally, you'd have to recompile just about everything to keep the performance going. That probably doesn't matter for word processors and such, but anything beyond that? Games would need at least two builds of the executable to span even one architecture improvement, media creation people not wanting to upgrade their software would get shafted, and so on. Whereas if it happens in the CPU, it's mostly transparent, although there are still some minor performance advantages to be had when playing with the peculiarities of a processor architecture.
|
# ? Jan 14, 2017 13:23 |
|
Anime Schoolgirl posted:Pentium 4 is also funny because it marks the exact starting point of Kyle Bennett's salt trail against AMD Kyle Bennett is the Zsa Zsa Gabor of hardware reviews.
|
# ? Jan 14, 2017 14:32 |
|
Anime Schoolgirl posted:power architecture and the loss of low power consumer electronics as a concept until arm got better It's still hilarious that Intel had Xscale ARM processors and saw them get used in most every Blackberry and Palm Treo, and yet chose to sell off the entire division and its 1,400 employees to Marvell because they couldn't imagine Apple selling iPhones in large quantities.
|
# ? Jan 14, 2017 15:21 |
|
|
Rastor posted:It's still hilarious that Intel had Xscale ARM processors and saw them get used in most every Blackberry and Palm Treo, and yet chose to sell off the entire division and its 1,400 employees to Marvell because they couldn't imagine Apple selling iPhones in large quantities. On one hand, I'd like to say Intel's consumer division had a fundamental misunderstanding of its customer base (the vast majority of which has never bought Intel CPUs solely on performance, contrary to Intel's belief); on the other hand, ARM mobile SoCs on the whole have proved to be far less profitable than Intel's consumer division.
|
# ? Jan 14, 2017 16:23 |