BobHoward
Feb 13, 2012


VodeAndreas posted:

Yes! My i7 920 struggles slightly at Battlefield 3 & 4 physics.

But not enough that I've felt pressured into upgrading yet... I think part of me is waiting for another jump like my Athlon X2 to Conroe then again to Nehalem.

Another jump like that? Dude, it's already here.

http://www.anandtech.com/bench/product/47?vs=1199


Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

wheez the roux posted:

:psyduck: What kind of mental gymnastics does it take to consider this anything but a positive? Have you ever, even once in the past decade, been limited in any capacity whatsoever by CPU speed?

Not speaking for that goon, but a few MMOs' CPU dependency ruined my 120fps dreams. I am a big sperg.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

BobHoward posted:

Another jump like that? Dude, it's already here.

http://www.anandtech.com/bench/product/47?vs=1199

Yeah, the small gains per generation add up to a pretty unflattering picture of Nehalem right now, especially from a power consumption perspective. I'd say it's worth upgrading a Nehalem machine if you get CPU bottlenecked by things.

VodeAndreas
Apr 30, 2009

Yeah, the incremental upgrades have added up over the last 5 years or so; that shows a bigger gap than I thought (in benchmarks at least).

I know the way the tech cycle works and that there'll always be something bigger on the horizon; I guess I'm just waiting for a 'killer app' to force me. My home desktop's primarily for gaming, though, which hasn't had much pushing it forward lately.

Not too concerned about power consumption; for file serving and other general purposes I've also got an Intel Atom-based system that consumes a hefty 38W at full load.
Compare that with my desktop, which idles at 140W or so and draws ~350W at load (single GPU, SSD).

wheez the roux
Aug 2, 2004
THEY SHOULD'VE GIVEN IT TO LYNCH

Death to the Seahawks. Death to Seahawks posters.

sincx posted:

Actually quite frequently. Encoding/transcoding media, batch projects in Photoshop, file compression, etc.

I've done some massive video encodes, and even my friends in the professional VFX industry only upgrade every 3 years or so. I mean, I'm sure fringe cases exist, but for 99.9% of people, year to year (or even every other year) updates have served zero purpose for a while.

SCheeseman
Apr 23, 2003

wheez the roux posted:

I've done some massive video encodes, and even my friends in the professional VFX industry only upgrade every 3 years or so. I mean, I'm sure fringe cases exist, but for 99.9% of people, year to year (or even every other year) updates have served zero purpose for a while.

Emulation. Namely PS2 and Wii, as well as some of the more intensive MAME stuff.

Police Automaton
Mar 17, 2009
"You are standing in a thread. Someone has made an insightful post."
LOOK AT insightful post
"It's a pretty good post."
HATE post
"I don't understand"
SHIT ON post
"You shit on the post. Why."
The problem with some online charts showing speed differences is that they're purely synthetic and don't really tell you how a new CPU would improve your real-world, day-to-day computing. Even a CPU that's double or triple the speed per clock of what you had before won't necessarily make your life easier. In a real environment with real apps it could mean that something takes 3-4 seconds less. Hardly worth the investment. Applications haven't exactly pushed the envelope of processing power in a long time. Many are even written especially with old computers in mind.
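To put toy numbers on that point (all figures invented for illustration, not measurements):

```python
# Toy numbers for the point above: how much wall-clock time a faster
# CPU saves on a short everyday task. All figures are invented.

def time_saved(task_seconds, speedup):
    """Seconds saved when a fully CPU-bound task runs on a CPU that is
    `speedup` times faster (everyday apps are rarely fully CPU-bound,
    so real savings are smaller)."""
    return task_seconds - task_seconds / speedup

print(time_saved(6.0, 2.0))  # doubling per-clock speed saves 3.0 s
print(time_saved(6.0, 3.0))  # tripling it saves only 4.0 s
```

Diminishing returns in a nutshell: each extra multiple of speed buys less wall-clock time than the last.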

Also, videogames nowadays are almost all written for consoles which are all pretty weak even compared to a somewhat older PC.

I still do stuff with Motorola 68060 and 68040 CPUs all the time. Performance-wise they're somewhere below early Pentiums, though their architecture is so vastly different that it's actually somewhat hard to compare. Even so, they're often completely sufficient for the usage at hand; it just depends on the individual situation and what I want to do.

Welmu
Oct 9, 2007
Metri. Piiri. Sekunti.
Devil's Canyon on June 2nd is a paper launch, delayed for purchase 'til August-September.



Source.

Panty Saluter
Jan 17, 2004

Making learning fun!

sincx posted:

Actually quite frequently. Encoding/transcoding media, batch projects in Photoshop, file compression, etc.

I tried encoding a DVD to h.264 when I got my 4670k and averaged around 300fps. :stare: Makes me wish I still had DVDs to rip.

The Phenom II wasn't bad at all, but you would only get 100+ fps on an encode during the credits, and it averaged 60-70.
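For scale, some rough math on those averages (the frame count is an assumption, a two-hour film at 24 fps; 65 fps stands in for the reported Phenom II 60-70):

```python
# Rough math on those encode averages. The frame count is an
# assumption (a two-hour film at 24 fps), and 65 fps stands in for
# the reported Phenom II average of 60-70.

frames = 2 * 60 * 60 * 24             # 172,800 frames in the film

haswell_minutes = frames / 300 / 60   # 4670K at the reported 300 fps
phenom_minutes = frames / 65 / 60     # Phenom II at ~65 fps

print(round(haswell_minutes, 1))      # 9.6
print(round(phenom_minutes, 1))       # 44.3
```

So the 4670K turns a three-quarters-of-an-hour rip into a coffee break.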

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Welmu posted:

Devil's Canyon on June 2nd is a paper launch, delayed for purchase 'til August-September.



Source.
Ugh, that really sucks if true. At least we only have to wait a week to know. A delay out to after July would really tempt me to wait for Broadwell, except that all indications are it will keep getting pushed out as well, and waiting until Fall/Winter 2015 for a new system isn't acceptable. I'd really like a CPU with L4 cache though...

One big takeaway from the 14nm issues should be that scaling below 20nm is really hard. Some people believed Intel engineers could always find a way, but here we are with 14nm products a year behind schedule and still with low yield on advanced production. That doesn't mean continued scaling is impossible by any means, but it's getting harder and slower at an increasing rate with every generation.

beejay
Apr 7, 2002

Let's hope Intel has been working on whatever is the successor to silicon.

Edit: This is an interesting read: http://www.pcworld.com/article/2038207/intel-keeping-up-with-moores-law-becoming-a-challenge.html

beejay fucked around with this message at 15:25 on May 26, 2014

mobby_6kl
Aug 9, 2009

by Fluffdaddy
Didn't we just have an announcement that Broadwell would be out for Back to School? Did that get revised already? Somehow I think Intel could get this stuff out sooner if AMD didn't have its head inserted fully into its rear end right now.

This is really starting to suck though as I've been hoping to replace my C2Q with a broadwell machine around this time. It's still very adequate but there are a ton of use cases where I'd benefit from an upgrade: photo and video editing, 3D rendering, data mining and LP solving are all very CPU intensive and something I do quite frequently.

E: What's Skylake supposed to bring on top of that? If Broadwell is delayed, would they stick to the same interval before the next update?

beejay
Apr 7, 2002

Skylake should bring DDR4.

Honestly, if you want to upgrade, there is not really any reason not to do it now. It's not like Broadwell will be a massive jump over Haswell refresh. You will get big gains over a Core 2 regardless. If I had to make a best guess based on how things are looking now, I'd expect Broadwell next spring and Skylake maybe early 2016. That's based on nothing but gut feelings though. I'd just say upgrade now for you.

beejay fucked around with this message at 15:42 on May 26, 2014

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

mobby_6kl posted:

Didn't we just have an announcement that Broadwell would be out for Back to School? Did that get revised already? Somehow I think Intel could get this stuff out sooner if AMD didn't have its head inserted fully into its rear end right now.
On May 19th the Intel CEO confirmed they would miss back-to-school and were hoping to have initial product availability by the holidays. Note that this is only for their ultra-low-power CPUs, which will be followed by laptop CPUs by the end of the year; desktop CPUs are slated for Summer 2015. Based on the usual lag from Intel's limited release to general availability, I think we'll see Broadwell laptops in early 2015 and desktop CPUs in Fall/Winter.

beejay posted:

Honestly, if you want to upgrade, there is not really any reason not to do it now. It's not like Broadwell will be a massive jump over Haswell refresh. You will get big gains over a Core 2 regardless. If I had to make a best guess based on how things are looking now, I'd expect Broadwell next spring and Skylake maybe early 2016. That's based on nothing but gut feelings though. I'd just say upgrade now for you.
Broadwell will include L4 cache on more models, which is a pretty significant boost for memory-bandwidth- and latency-sensitive applications, which seems to be what he's running.

pmchem
Jan 22, 2010


wheez the roux posted:

:psyduck: What kind of mental gymnastics does it take to consider this anything but a positive? Have you ever, even once in the past decade, been limited in any capacity whatsoever by CPU speed?

Every single day of my work life (science!).

Matt Zerella
Oct 7, 2002

Norris'es are back baby. It's good again. Awoouu (fox Howl)

pmchem posted:

Every single day of my work life (science!).

That'll do it alright.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Also, if you view growing system requirements as a reflection of increasing capabilities and functionality, then stagnation in requirements growth is definitely a bad thing. Look at gaming, the fact that games have been designed for the same consoles for 7 years has slowed the growth of system requirements, but also means that game technology is advancing more slowly.

Ignoarints
Nov 26, 2010
Welp I'm glad I didn't pull the trigger on parting out my computer then

coffeetable
Feb 5, 2006

TELL ME AGAIN HOW GREAT BRITAIN WOULD BE IF IT WAS RULED BY THE MERCILESS JACKBOOT OF PRINCE CHARLES

YES I DO TALK TO PLANTS ACTUALLY
Since Broadwell has been pushed back a year, are we likely to see any of the architectural upgrades that would've been scheduled for Skylake show up in Broadwell instead?

computer parts
Nov 18, 2010

PLEASE CLAP

Alereon posted:

Also, if you view growing system requirements as a reflection of increasing capabilities and functionality, then stagnation in requirements growth is definitely a bad thing. Look at gaming, the fact that games have been designed for the same consoles for 7 years has slowed the growth of system requirements, but also means that game technology is advancing more slowly.

"Capabilities" is not directly correlated with actual features, but more some specific features (for example, larger textures or more advanced physics mechanics).

In other words you can have a generic Call of Duty type game which has high system requirements or you can have a fairly unique and intriguing game (e.g., Gone Home) that has low system requirements. It's kind of hard now to actually do something "cool" (read: interesting) with increased processing power that wouldn't be possible on an earlier system.

That being said, the current craze is apparently having everything run in 4k resolutions so your graphics cards are still going to get better if only because of that.

computer parts fucked around with this message at 14:02 on May 27, 2014

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
We've been held back graphically far more by consoles (360/PS3) than by processing power.

computer parts
Nov 18, 2010

PLEASE CLAP

go3 posted:

We've been held back graphically far more by consoles (360/PS3) than by processing power.

Yeah, but honestly the main constraint has been memory and any other issue was a very distant second (near the end devs would just crank down the settings on the console versions rather than gently caress with PC games).

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

wheez the roux posted:

:psyduck: What kind of mental gymnastics does it take to consider this anything but a positive? Have you ever, even once in the past decade, been limited in any capacity whatsoever by CPU speed?

Well, if you were the sort of user that didn't need big CPU processing power, nothing was forcing you to upgrade; it's just that now the people who do need that power don't have any upgrade options.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

computer parts posted:

"Capabilities" is not directly correlated with actual features, but more some specific features (for example, larger textures or more advanced physics mechanics).

In other words you can have a generic Call of Duty type game which has high system requirements or you can have a fairly unique and intriguing game (e.g., Gone Home) that has low system requirements. It's kind of hard now to actually do something "cool" (read: interesting) with increased processing power that wouldn't be possible on an earlier system.

That being said, the current craze is apparently having everything run in 4k resolutions so your graphics cards are still going to get better if only because of that.
How much cool stuff you can do in your game engine depends on the hardware resources available, along with how efficiently you use them. It seems like your point is that a technically excellent game engine doesn't mean the resulting game will be good, which is true, but an engine with fewer limitations makes it possible, and indeed easier, to make a better game. More processing power and better engines make game development easier because you don't need to depend on complicated tricks to get acceptable performance; the engine can abstract things away from you. Look at how much effort Xbox 360 devs put in to work within the system's eDRAM, then imagine them simply not having to do that because the platform has enough resources. The 360 is nearly a decade old, so we are well beyond the point where anyone can wring more performance from the hardware, and we have just been spending years developing new gimmicks to distract people from how lovely everything looks. The reality is that three PPC cores simply wasn't poo poo, and while 8 Jaguar cores isn't a huge amount of CPU performance, it pushes us forward a LOT, especially in that it encourages game developers to write well-threaded x64 code.
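As a toy sketch of one piece of what "well-threaded" means in practice, here's the kind of per-frame work-splitting an engine might do to keep 8 cores fed (the function name and entity counts are invented, not from any real engine):

```python
# Toy sketch of the work-splitting a well-threaded engine does: divide
# this frame's entities into near-equal chunks, one per core. The
# function name and the entity counts are made up for illustration.

def split_work(entities, cores=8):
    """Split a list of per-frame work items into `cores` chunks whose
    sizes differ by at most one, preserving order."""
    size, rem = divmod(len(entities), cores)
    chunks, start = [], 0
    for i in range(cores):
        end = start + size + (1 if i < rem else 0)
        chunks.append(entities[start:end])
        start = end
    return chunks

# 803 entities across 8 cores: three chunks of 101, five of 100.
chunks = split_work(list(range(803)))
print([len(c) for c in chunks])  # [101, 101, 101, 100, 100, 100, 100, 100]
```

The hard part isn't the splitting, of course; it's making the per-chunk work independent enough that the cores don't spend the frame waiting on each other.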

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD

Alereon posted:

...especially in that it encourages game developers to write well-threaded x64 code.

This is the biggest advantage of new consoles.

Shanakin
Mar 26, 2010

The whole point of stats are lost if you keep it a secret. Why Didn't you tell the world eh?
Personally, I'm finding myself increasingly CPU-bound in games with the greater prevalence of advanced physics engines and physics-based games. This tends to be even more true of games that make no pretence of being console-compatible.

Graphics at least tends to have options and falls back fairly gracefully if you can't manage the full quality. CPU not so much.

Ignoarints
Nov 26, 2010
What kind of games? I'm not asking to be a dick; I'm actually interested. In my experience so far, the only aspect of a game that actually stresses my 4670K is the really intense online component of some games. The only way I could tell was switching back and forth between stock and overclocked speeds, and it only ever applied to that particular workload. But it was a significant difference, and it's why I always try to say "the CPU isn't the bottleneck for most gaming"

Pimpmust
Oct 1, 2008

I see that you haven't been playing very many games of Victoria II :v:

fookolt
Mar 13, 2012

Where there is power
There is resistance

Pimpmust posted:

I see that you haven't been playing very many games of Victoria II :v:

I also hear Civilization V is pretty brutal for processors. But this whole topic makes me wonder: why do gamers care so much about extreme overclocking when the vast majority of games aren't CPU bound?

I'm not attacking anyone who does it; I just want to understand the benefits because I thought the days of AMD 2500 are long gone. Is the overall OS experience much smoother or something?

Ignoarints
Nov 26, 2010
My desktop experience is basically indistinguishable between stock and overclocked. Frankly, I just do it because I like it. There are tangible benefits in some programs other than a web browser, but they're few and far between if you just game. In my best-case scenario I gained 10 fps and some subjective smoothness in online Battlefield 4. That was shocking to me because I wasn't expecting a visually noticeable difference.

But overall... it's just like my car. I'm squeezing the most I can out of it within reason, in ways that are pretty parallel to CPU overclocking. But honestly, how often do I use that? 1% of the time? Still worth it to me though.

I'd never heard of Victoria II. However, after googling it... it's hard to believe it can actually bottleneck at the CPU without some serious optimization issues on the development side. Though the search results seemed to be full of people upset that it's a single-threaded game, which is something that has certainly improved between CPU generations.

Col.Kiwi
Dec 28, 2004
And the grave digger puts on the forceps...

fookolt posted:

I also hear Civilization V is pretty brutal for processors. But this whole topic makes me wonder: why do gamers care so much about extreme overclocking when the vast majority of games aren't CPU bound?
I don't think they/we do. I'm only speaking anecdotally, but I personally see very little overlap between serious gamers and serious overclockers. The serious gamers I know mostly don't know enough about hardware to even build a PC. The serious overclockers I've known have dabbled in games as if they were tech demos, playing a little single-player on easy difficulty to enjoy the eye candy, but that's it.

quote:

I'm not attacking anyone who does it; I just want to understand the benefits because I thought the days of AMD 2500 are long gone. Is the overall OS experience much smoother or something?
I feel like for most overclockers it's more about having a project than about seeing a real-world difference. The pleasure comes from tweaking and from the competitive aspect of trying to get the best possible stable numbers. It was always like that for me; a few years back I was pretty into overclocking for a while. Amusingly, back when I was into overclocking I only played games quite casually, while these days I game semi-competitively and have no interest in tweaking or messing with my computer. My attitude now is that if it ain't broke, don't fix it: it's a tool for me instead of a project. What I said above is based on observing others, though, not just my own experience.

Pimpmust
Oct 1, 2008

Ignoarints posted:

My desktop experience is basically indistinguishable between stock and overclocked. Frankly, I just do it because I like it. There are tangible benefits in some programs other than a web browser, but they're few and far between if you just game. In my best-case scenario I gained 10 fps and some subjective smoothness in online Battlefield 4. That was shocking to me because I wasn't expecting a visually noticeable difference.

But overall... it's just like my car. I'm squeezing the most I can out of it within reason, in ways that are pretty parallel to CPU overclocking. But honestly, how often do I use that? 1% of the time? Still worth it to me though.

I'd never heard of Victoria II. However, after googling it... it's hard to believe it can actually bottleneck at the CPU without some serious optimization issues on the development side. Though the search results seemed to be full of people upset that it's a single-threaded game, which is something that has certainly improved between CPU generations.

It's mainly the "POP" system, which tracks the needs and desires of a world's worth of population (more or less), coupled to a "free market".

Then of course there's AI for all the countries and military units, plus untriggered events tagged to those countries constantly checking their triggers in the background...

Really, any Paradox game requires a beefy CPU, and game developers in general are lovely at CPU optimization.
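A toy sketch of why that kind of tick gets expensive; the structure here is guessed from the posts above, not taken from Paradox's actual code, and all counts are invented:

```python
# Toy model of a Victoria II-style daily tick: every POP gets its
# needs/market update, and every untriggered event is re-checked
# against every country. Structure guessed from the posts above,
# not from Paradox's actual code; all counts are invented.

def daily_tick(pops, countries, events):
    work = 0
    for pop in pops:                  # per-POP needs and market step
        pop["cash"] += pop["income"]
        work += 1
    for event in events:              # background event triggers...
        for country in countries:     # ...evaluated per country
            work += 1
    return work

pops = [{"cash": 0, "income": 1} for _ in range(10_000)]
per_tick = daily_tick(pops, countries=list(range(200)), events=list(range(500)))
print(per_tick)  # 10,000 POP updates + 500 * 200 trigger checks = 110,000
```

And if the game runs all of that on one thread, as the posts above complain, the whole bill lands on a single core every in-game day.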

JawnV6
Jul 4, 2004

So hot ...
Is there any way to search ARK for package size? I have an odd application where Intel procs might be the right answer, but only if I can get them small enough. The filters give options for Socket, but I don't have any translation from socket type to millimeters. I've been poking around the Embedded section hoping that has everything. I'm not sure if "Atom for Storage" or "Atom for Communication" have any different package sizes.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Intel has announced a strategic partnership with RockChip, China's largest SoC vendor, in which Intel will provide CPU, modem, and other IP blocks, while RockChip glues them together into an SoC and arranges fabrication at TSMC. I'm not sure how competitive this will be with contemporary Qualcomm SoCs, but we shall see.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


The brilliance of that is Intel managing to take fab capacity away from its competitors by utilizing TSMC to produce an x86 product.

Shanakin
Mar 26, 2010

The whole point of stats are lost if you keep it a secret. Why Didn't you tell the world eh?

Ignoarints posted:

What kind of games? I'm not asking to be a dick; I'm actually interested. In my experience so far, the only aspect of a game that actually stresses my 4670K is the really intense online component of some games. The only way I could tell was switching back and forth between stock and overclocked speeds, and it only ever applied to that particular workload. But it was a significant difference, and it's why I always try to say "the CPU isn't the bottleneck for most gaming"
I don't really bother with overclocking but perhaps I should.

Kerbal Space Program
DCS World
ARMA 3
A bunch of paradox games several hours in when there are a stupid amount of armies on the map, or victoria in more general terms
wargame (in 10v10 maps)
Civilisation
Prison Architect
etc
I'm guessing games like minecraft etc probably count too if you're into that (but they're also memeory limited often IIRC).

Getting a better GPU would let me turn up a few graphics settings, but plopping in the newest $500 graphics hotness probably won't improve the framerate of most of these too much in general play. KSP is especially easy to explain: it's not especially graphically demanding, but the performance of the CPU basically determines how big you can build your stuff, so it has a real and tangible benefit.
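A back-of-envelope sketch of the KSP point: per-step physics work grows with every part you bolt on (the cost model and numbers are illustrative, not measured from the game):

```python
# Back-of-envelope model of KSP-style physics cost: each part is a
# rigid body joined to a neighbour, so a craft of N parts means ~N-1
# joints simulated every physics step. Costs are illustrative only,
# not measured from the game.

def physics_cost(parts, cost_per_joint=1.0):
    joints = max(parts - 1, 0)    # part-to-part joints in the craft tree
    return joints * cost_per_joint

print(physics_cost(50))    # small rocket: 49.0 units of work per step
print(physics_cost(500))   # big station: 499.0, roughly 10x the load
```

Since that cost is paid every physics step regardless of what's on screen, a faster CPU directly raises the ceiling on how big you can build.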


On the other hand, if I boot up Skyrim or something, then I'm noticeably graphics-bound on my HD 5850. I'll agree that for the most part the big triple-A games are generally graphics-bound, though.

Shanakin fucked around with this message at 23:02 on May 27, 2014

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

JawnV6 posted:

Is there any way to search ARK for package size? I have an odd application where Intel procs might be the right answer, but only if I can get them small enough. The filters give options for Socket, but I don't have any translation from socket type to millimeters. I've been poking around the Embedded section hoping that has everything. I'm not sure if "Atom for Storage" or "Atom for Communication" have any different package sizes.

Looks like on some/most there's a package size field? You can Select All, Compare, and export to Excel if you want to search/sort that field.

Probably the smallest would be the Quark SoCs at 15mm x 15mm, but they're not fast.

Ignoarints
Nov 26, 2010

Shanakin posted:

I don't really bother with overclocking but perhaps I should.


What kind of CPU do you have, out of curiosity, to give me an idea?

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

fookolt posted:

I also hear Civilization V is pretty brutal for processors.

Not really. It's not CPU-bound in the sense that you're starving for processing time to render a single frame. Most of the heavy CPU work is simulating turns on larger maps. All that means is more waiting between turns, but you can still look around the map and plan your next move.
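A minimal sketch of the split Jan describes, with the turn simulated on a worker thread while the "render loop" keeps running (timings invented for the example):

```python
# Toy version of the turn-time vs frame-time split: the AI turn runs
# on a worker thread while the "render loop" keeps ticking, so you
# can still look around the map. Timings are invented.
import threading
import time

result = {"done": False}

def simulate_turn():
    time.sleep(0.2)        # stand-in for heavy AI/turn processing
    result["done"] = True

worker = threading.Thread(target=simulate_turn)
worker.start()

frames = 0
while not result["done"]:  # render loop: draw map, handle input...
    frames += 1
    time.sleep(0.01)       # pretend frame time
worker.join()
print(f"drew {frames} frames while the turn simulated")
```

A slow CPU just stretches the worker's sleep, so you wait longer between turns without the frame rate ever dropping.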


atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.
Don't forget Dwarf Fortress, where a main cause of a lost game is FPS death because it runs as fast as it can on a single core.
