  • Locked thread
Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

SwissArmyDruid posted:

Which is fine, because the gains from using quad-channel memory over dual-channel have never been statistically, price-performance, or realistically significant, unless you're doing some crazy-rear end compute poo poo. This approach by AMD is totally justified. (Only having dual-channel, I mean. We already knew that AMD was going to skew some benchmarks somehow, we just didn't know *how* yet.)

Naples, on the other hand, I think I saw a thing showing that Naples will have 8-channel memory? Now there's a chip that will probably need it, if the way AMD wants to handle manipulating large data sets by default pans out.

I can't wait for an AMD chip with like 8 TB of memory per socket, and all the deep learning labs buying them up so they can run retardo-huge datasets through it.


Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
If memory speeds have noticeable influence on frame rates, why wouldn't quad channel?

Anime Schoolgirl
Nov 28, 2002

Combat Pretzel posted:

If memory speeds have noticeable influence on frame rates, why wouldn't quad channel?
in most cases latency is way more important than throughput (and higher speed RAM nowadays has significantly lower latency) but more and more games care about throughput so more than two channels is an actual boon now, GTAV/Witcher/ARMA 3 being the most notable ones, and Fallout 4 and Skyrim Remaster are affected by it to an absurd degree
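The throughput side of this is simple arithmetic — here is a rough sketch of theoretical peak bandwidth per channel count (these are theoretical maximums, not measured numbers):

```python
# Theoretical peak memory bandwidth: channels x 64-bit bus x transfer rate.
# DDR "speed" ratings (e.g. DDR4-2400) are in megatransfers per second.
def peak_bandwidth_gbs(channels: int, transfer_rate_mts: float) -> float:
    bytes_per_transfer = 8  # each channel is 64 bits wide
    return channels * bytes_per_transfer * transfer_rate_mts / 1000

# Dual vs. quad channel at DDR4-2400:
dual = peak_bandwidth_gbs(2, 2400)  # 38.4 GB/s
quad = peak_bandwidth_gbs(4, 2400)  # 76.8 GB/s
```

So quad channel doubles the ceiling, which only matters once a game actually streams enough data to press against the dual-channel number.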

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

Anime Schoolgirl posted:

in most cases latency is way more important than throughput (and higher speed RAM nowadays has significantly lower latency) but more and more games care about throughput so more than two channels is an actual boon now, GTAV/Witcher/ARMA 3 being the most notable ones, and Fallout 4 and Skyrim Remaster are affected by it to an absurd degree

Yep, minimum framerates can go up 10-15% in some instances going from generic DDR3-1600 to the super-stupid 2400+ kits, and DDR4 is more of the same. The latency going down is the important part, not so much the added bandwidth, which is why a super fast dual channel kit improves things whereas a quad channel setup on an -E CPU wouldn't.
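The "faster kits have lower real latency" point falls out of the timing math — a sketch, with the CL values below being typical retail-kit timings assumed for illustration:

```python
# Actual CAS latency in nanoseconds. DDR transfers twice per I/O clock, so
# the clock in MHz is half the MT/s rating; one cycle is 1000/clock ns.
def cas_latency_ns(transfer_rate_mts: float, cas_cycles: int) -> float:
    clock_mhz = transfer_rate_mts / 2
    return cas_cycles * 1000 / clock_mhz

# Typical (assumed) timings for the kits in question:
slow = cas_latency_ns(1600, 9)   # DDR3-1600 CL9  -> 11.25 ns
fast = cas_latency_ns(2400, 11)  # DDR3-2400 CL11 -> ~9.17 ns
```

The faster kit needs more cycles, but its cycles are enough shorter that the wall-clock latency still drops.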

SwissArmyDruid
Feb 14, 2014

by sebmojo

Anime Schoolgirl posted:

in most cases latency is way more important than throughput (and higher speed RAM nowadays has significantly lower latency) but more and more games care about throughput so more than two channels is an actual boon now, GTAV/Witcher/ARMA 3 being the most notable ones, and Fallout 4 and Skyrim Remaster are affected by it to an absurd degree

Did not know about Witcher, somehow I am not actually surprised by Arma 3, but GTAV/Fallout/Skyrim are WTFs. Got links?

LRADIKAL
Jun 10, 2001

Fun Shoe
http://www.eurogamer.net/articles/digitalfoundry-2016-is-it-finally-time-to-upgrade-your-core-i5-2500k

This is the only article I've read this year.

SwissArmyDruid
Feb 14, 2014

by sebmojo
baderp. I really gotta stop skimming articles before lunch.

SwissArmyDruid fucked around with this message at 21:19 on Jan 12, 2017

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010

Yes, upgrade.

to the 2600k :getin:

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

incoherent posted:

Yes, upgrade.

to the 2600k :getin:

You know, I'm really regretting cheaping out with one of the worst P67 motherboards out there. I never thought I'd be trucking on the same machine 6 years later with no real plans to upgrade soon.

MagusDraco
Nov 11, 2011

even speedwagon was trolled
My poor 3550 is having problems with battlefield 1 to the point where I get better framerate in dx12 mode. BF1 generally doesn't give you better framerate in DX12 mode, ever. I'm the super special snowflake (that's trying to push 2560x1440@144hz with a 1070 that's being hobbled by the processor)



I'll probably upgrade at some point but not until the Zen dust settles.

EdEddnEddy
Apr 5, 2012



Sounds like 4C/4T will be a thing as well with Zen

Hopefully they don't go down to 2 so they can just go right up against the i3 with 4 cores (and hopefully be ~2x faster) and then go 4C/8T with the i5's.

LRADIKAL
Jun 10, 2001

Fun Shoe

havenwaters posted:

2560x1440@144hz
How many frames do you average? Does running half res look terrible?

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

EdEddnEddy posted:

Sounds like 4C/4T will be a thing as well with Zen

Hopefully they don't go down to 2 so they can just go right up against the i3 with 4 cores (and hopefully be ~2x faster) and then go 4C/8T with the i5's.

i3s have gotten really high clocked recently. It won't be as cut and dried as 2x faster; it'll probably be more like a 3.9 GHz i3-7100 vs a 3 GHz i5-7400.

If they can sell an i5-7400 competitor for i3-7100 prices, AMD's in business.

MagusDraco
Nov 11, 2011

even speedwagon was trolled

Jago posted:

How many frames do you average? Does running half res look terrible?

For whatever reason DX11 averages 30s to 40s. DX12 averages 50s to 70s which is fine by me but DX12 will randomly stutter sometimes.


DX11 is "supposed" to average in the 70s to 90s though at that resolution with a 1070 when not cpu bound at all.

edit: It's apparently a known issue that happens to some people, and one of the fixes was to set up a framerate cap at 60, but that kind of defeats the point of having a high refresh rate monitor, and oh well.

Honestly by the time I want to upgrade probably won't be playing BF1 anymore and this'll be a moot point. 95% of other games work fine still. Deadly Premonition doesn't but well that's Deadly Premonition.

MagusDraco fucked around with this message at 22:18 on Jan 12, 2017

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

Twerk from Home posted:

You know, I'm really regretting cheaping out with one of the worst P67 motherboards out there. I never thought I'd be trucking on the same machine 6 years later with no real plans to upgrade soon.

Yeah having a lovely old mobo for my 2600k was basically my only excuse for building a new system, it is nice having two PCs though.

Ludicrous Gibs!
Jan 21, 2002

I'm not lost, but I don't know where I am.
Ramrod XTreme

Twerk from Home posted:

You know, I'm really regretting cheaping out with one of the worst P67 motherboards out there. I never thought I'd be trucking on the same machine 6 years later with no real plans to upgrade soon.

I'm regretting cheaping out in a different way, buying a regular 2500 instead of the 2500k.

If the 6-core Zen has good performance and a reasonable price, I'm there. It'll be nice to get Oculus Home to stop complaining every time I boot it up.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
4C/4T might be being positioned to replace current Athlons?

https://twitter.com/Thracks/status/819608674675081218

This is me, being super thirsty for Zen. *tightens belt, slaps arm* I need more!

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Ludicrous Gibs! posted:

I'm regretting cheaping out in a different way, buying a regular 2500 instead of the 2500k.

Ouch, so much performance left on the table for so little extra money.

Oh well, Zen hopefully offers a way forward!

Luckily, I'm finding it hard to believe it will suck given everything we know, unlike Vega, which looks like it will not quite reach the performance levels it needs to.

ItBurns
Jul 24, 2007
Zen will have to be $30 and come with a (good) free gaming title before I consider upgrading from my 2500k. It takes everything I throw at it!

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
You know what just give me Zen for free and I'll help "spread the word about it" that's worth something right

Yaoi Gagarin
Feb 20, 2014

I have absolutely no reason to upgrade from a 4790k but if Zen is good and the 6-core is affordable it's going to be hard to stop myself... Especially since I can give the haswell machine to my sister so she's not using my phenom II x4 955 from 2010

EdEddnEddy
Apr 5, 2012



That's been the current trend, hasn't it. Even though CPUs have gotten somewhat better, the drive and want to make a whole new system every 1-3 years practically evaporated if you got a Sandy Bridge or higher from whatever you had before. Outside of some nice features coming with newer CPUs and platforms, it hasn't been near as exciting to build a new system if your present (even if old) one still takes everything modern software can throw at it, unlike in the old days. (Especially the pre-SSD days, ugh.)

GPU tech did get a boon with the 10XX series, and hopefully Vega can do something to keep that ball rolling, but I sure do hope Zen is the start of a new race, if anything to get more cores into the consumer realm, and maybe bring the -E chips down to less-than-stupid prices to upgrade to. We will see.

Nvidia pretty much confirmed that they held off announcing the 1080 Ti at CES because of AMD's not-quite-full Vega unveil, so we have to wait until March to see what it has in store, presumably timed to try and crush AMD's launch as well. Hopefully Vega can put up a good fight for once.

FuturePastNow
May 19, 2014


I'm a dumb and bought a FX-8350 a year ago, so I get the drive to build a new system in a year anyway

Cinara
Jul 15, 2007
I bought a 4670k in 2013, was tempted to go i7 but they had almost no performance gains in games back then and I figured I would be upgrading in a couple years anyways. Now I am wishing I had gone with the i7 cause more cores is the only reason I am looking to upgrade right now. A video card upgrade a few months ago means that 99% of games run like I have a brand new computer, though I am CPU bottlenecked for sure when trying to hit 165fps in stuff like Overwatch.

Anime Schoolgirl
Nov 28, 2002

the difference between an 115x i5-k and i7-k is not that big, honestly.

ItBurns
Jul 24, 2007
Is there any reason - other than fear - that AMD hasn't published a head to head benchmark of Zen against the 2500k?


What are they hiding?

Haquer
Nov 15, 2009

That windswept look...

ItBurns posted:

Is there any reason - other than fear - that AMD hasn't published a head to head benchmark of Zen against the 2500k?


What are they hiding?

Probably because it's what, 3-4 generations old at this point?

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

ItBurns posted:

Is there any reason - other than fear - that AMD hasn't published a head to head benchmark of Zen against the e6750?


What are they hiding?

Anime Schoolgirl
Nov 28, 2002

ItBurns posted:

Is there any reason - other than fear - that AMD hasn't published a head to head benchmark of Zen against the Pentium 4?


What are they hiding?

Anime Schoolgirl
Nov 28, 2002

Pentium 4 is also funny because it marks the exact starting point of Kyle Bennett's salt trail against AMD

An ill-advised $1000 purchase, beaten by a $200 cpu

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Anime Schoolgirl posted:

Pentium 4 is also funny because it marks the exact starting point of Kyle Bennett's salt trail against AMD

An ill-advised $1000 purchase, beaten by a $200 cpu

Is there more somewhere that I can read about this? :allears:

Anime Schoolgirl
Nov 28, 2002

SourKraut posted:

Is there more somewhere that I can read about this? :allears:
the forum meltdowns have been lost to the memory hole, but there's still a wealth of articles of Kyle Bennett trying very, very hard to say AMD didn't come out on top during that era

he was also responsible for creating the dogmatic AMD fanboy as a concept, and as with all gimmicks the irony washed off after a few years, but they still hound him to this day

Anime Schoolgirl fucked around with this message at 08:01 on Jan 14, 2017

Fabulousity
Dec 29, 2008

Number One I order you to take a number two.

Thinking back to those days I didn't follow the whole Itanium thing that closely... but in retrospect if K7 hadn't cockslapped Netburst and opened a hole for AMD to deliver the x86-64 gut punch would we all be suckling at Intel's Itanium teat today on the desktop in 2017? I realize there were other factors at play: Itanium had growing pains and Microsoft plus a legion of legacy software vendors jumped at the 64 bit extension way out. But if K7 failed and K8 didn't happen as a result for whatever reason surely Intel could've strong armed everyone into their way of thinking? Or if K7 failed and Itanium wet farted just as hard in that reality would something else have risen up?

Anime Schoolgirl
Nov 28, 2002

Fabulousity posted:

Thinking back to those days I didn't follow the whole Itanium thing that closely... but in retrospect if K7 hadn't cockslapped Netburst and opened a hole for AMD to deliver the x86-64 gut punch would we all be suckling at Intel's Itanium teat today on the desktop in 2017? I realize there were other factors at play: Itanium had growing pains and Microsoft plus a legion of legacy software vendors jumped at the 64 bit extension way out. But if K7 failed and K8 didn't happen as a result for whatever reason surely Intel could've strong armed everyone into their way of thinking? Or if K7 failed and Itanium wet farted just as hard in that reality would something else have risen up?
power architecture and the loss of low power consumer electronics as a concept until arm got better

Fabulousity
Dec 29, 2008

Number One I order you to take a number two.

Anime Schoolgirl posted:

power architecture and the loss of low power consumer electronics as a concept until arm got better

In the K7/Itanium-die timeline, my Discman with 20 second anti-skip could have still been sort of awesome in 2003? Nice.

I love your avatar, by the way. "Come then! Show me what passes for compassion among your beloved kind!"

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
As far as I know, Itanium was built on the flawed premise that ILP could be achieved through compiler optimizations and explicitly parallel architectures, rather than through hardware methods like superscalar microarchitectures, out-of-order execution, etc. It seems like almost everyone uses the hardware approach nowadays, and I'm not aware of anyone having much success with the software approach in the traditional CPU market.
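The compiler-side half of that premise can be illustrated with a toy static scheduler — this is a simplified sketch of the idea, not Itanium's actual bundle format or hazard rules: the compiler, rather than the hardware, has to prove which instructions are independent and pack them together ahead of time.

```python
# Toy EPIC-style bundler: greedily pack instructions into "bundles" of
# independent ops that could issue in the same cycle. Each instruction is
# (dest_register, set_of_source_registers). A real EPIC compiler does this
# analysis statically; an out-of-order core discovers the same parallelism
# dynamically in hardware. (WAR hazards are ignored here to keep it a toy.)
def bundle(instructions, width=3):
    bundles = []
    current, written = [], set()
    for dest, srcs in instructions:
        # Start a new bundle if this one is full, or if the op reads or
        # rewrites a register the current bundle already writes (RAW/WAW).
        if len(current) == width or written & (set(srcs) | {dest}):
            bundles.append(current)
            current, written = [], set()
        current.append((dest, srcs))
        written.add(dest)
    if current:
        bundles.append(current)
    return bundles

prog = [("r1", {"r0"}), ("r2", {"r0"}),          # independent: same bundle
        ("r3", {"r1", "r2"}), ("r4", {"r0"})]    # r3 depends on both above
# bundle(prog) -> two bundles: [r1, r2 together], then [r3, r4]
```

The catch Combat Pretzel raises below is exactly this: the grouping is baked in at compile time, so a wider or differently-shaped future core can't exploit parallelism the old binary never expressed.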

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
The hardware approach makes more sense anyway, because when a CPU generation changes things up a lot internally, you'd have to recompile just about everything to keep the performance going. Probably doesn't matter for word processors and such, but anything beyond that? Games needing at least two builds of the executables to span even one architecture improvement, media creation people not wanting to upgrade software getting shafted, and so on. Whereas if it happens on the CPU, it's mostly transparent, although there are still some minor performance advantages to be had when playing to the peculiarities of a processor architecture.

FuturePastNow
May 19, 2014


Anime Schoolgirl posted:

Pentium 4 is also funny because it marks the exact starting point of Kyle Bennett's salt trail against AMD

An ill-advised $1000 purchase, beaten by a $200 cpu

Kyle Bennett is the Zsa Zsa Gabor of hardware reviews.

Rastor
Jun 2, 2001

Anime Schoolgirl posted:

power architecture and the loss of low power consumer electronics as a concept until arm got better

It's still hilarious that Intel had Xscale ARM processors and saw them get used in most every Blackberry and Palm Treo, and yet chose to sell off the entire division and its 1,400 employees to Marvell because they couldn't imagine Apple selling iPhones in large quantities.


Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Rastor posted:

It's still hilarious that Intel had Xscale ARM processors and saw them get used in most every Blackberry and Palm Treo, and yet chose to sell off the entire division and its 1,400 employees to Marvell because they couldn't imagine Apple selling iPhones in large quantities.

On one hand I would like to say Intel's consumer division had a fundamental misunderstanding of their customer base (the vast majority of whom have never bought Intel CPUs solely on performance, contrary to Intel's belief), yet on the other hand, ARM mobile SoCs on the whole have proved to be a ton less profitable than Intel's consumer division.
