|
VodeAndreas posted:Yes! My i7 920 struggles slightly at Battlefield 3 & 4 physics. Another jump like that? Dude, it's already here. http://www.anandtech.com/bench/product/47?vs=1199
|
# ? May 26, 2014 06:42 |
|
|
wheez the roux posted:What kind of mental gymnastics does it take to consider this anything but a positive? Have you ever, even once in the past decade, been limited in any capacity whatsoever by CPU speed? Not speaking for that goon, but a few MMOs' CPU dependency ruined my 120fps dreams. I am a big sperg.
|
# ? May 26, 2014 07:01 |
|
BobHoward posted:Another jump like that? Dude, it's already here. Yeah, the small gains per generation add up to a pretty unflattering picture of Nehalem right now, especially from a power consumption perspective. I'd say it's worth upgrading a Nehalem machine if you get CPU bottlenecked by things.
|
# ? May 26, 2014 08:04 |
|
Yeah, the incremental upgrades have added up over the last 5 years or so; that shows a bigger gap than I thought (in benchmarks at least). I know how the tech cycle works and that there'll always be something bigger on the horizon, so I guess I'm just waiting for a 'killer app' to force me. My home desktop's primarily for gaming, though, which hasn't had much pushing it forward lately. Not too concerned about power consumption; for file serving and other general purposes I've got an Intel Atom based system as well that consumes a hefty 38W at full load. Compared with my desktop, which idles at 140W or so and pulls ~350W at load (single GPU, SSD).
|
# ? May 26, 2014 08:22 |
sincx posted:Actually quite frequently. Encoding/transcoding media, batch projects in Photoshop, file compression, etc. I've done some massive video encodes, and even my friends in the professional VFX industry only upgrade every 3 years or so. I mean, I'm sure fringe cases exist, but for 99.9% of people, year to year (or even every other year) updates have served zero purpose for a while.
|
|
# ? May 26, 2014 08:37 |
|
wheez the roux posted:I've done some massive video encodes, and even my friends in the professional VFX industry only upgrade every 3 years or so. I mean, I'm sure fringe cases exist, but for 99.9% of people, year to year (or even every other year) updates have served zero purpose for a while. Emulation. Namely PS2 and Wii, as well as some of the more intensive MAME stuff.
|
# ? May 26, 2014 11:26 |
|
The problem with some online charts showing speed differences is that they're purely synthetic and don't really tell you how a new CPU would improve your real-world, day-to-day computing. Even a CPU that's double or triple the speed per clock of what you had before won't necessarily make your life easier. In a real environment with real apps it could mean that something takes 3-4 seconds less. Hardly worth the investment. Applications haven't exactly pushed the envelope of processing power in a long time; many are even written specifically with old computers in mind. Also, videogames nowadays are almost all written for consoles, which are all pretty weak even compared to a somewhat older PC. I still do stuff with Motorola 68060 and 68040 CPUs all the time. Performance-wise they're somewhere below early Pentiums, though their architecture is so vastly different that it's actually somewhat hard to compare. Even those are often completely sufficient for the usage at hand; it just depends on the individual situation and what I want to do.
|
# ? May 26, 2014 11:59 |
|
Devil's Canyon on June 2nd is a paper launch, delayed for purchase 'til August-September. Source.
|
# ? May 26, 2014 13:34 |
|
sincx posted:Actually quite frequently. Encoding/transcoding media, batch projects in Photoshop, file compression, etc. I tried encoding a DVD to h.264 when I got my 4670k and averaged around 300fps. Makes me wish I still had DVDs to rip. The Phenom II wasn't bad at all but you would only get 100+ fps on an encode during the credits and it averaged 60-70.
|
# ? May 26, 2014 13:38 |
|
Welmu posted:Devil's Canyon on June 2nd is a paper launch, delayed for purchase 'til August-September. One big takeaway from the 14nm issues should be that scaling below 20nm is really hard. Some people believed Intel engineers could always find a way, but here we are with 14nm products a year behind schedule and still with low yield on advanced production. That doesn't mean continued scaling is impossible by any means, but it's getting harder and slower at an increasing rate with every generation.
|
# ? May 26, 2014 14:09 |
|
Let's hope Intel has been working on whatever is the successor to silicon. Edit: This is an interesting read: http://www.pcworld.com/article/2038207/intel-keeping-up-with-moores-law-becoming-a-challenge.html beejay fucked around with this message at 15:25 on May 26, 2014 |
# ? May 26, 2014 15:23 |
|
Didn't we just have an announcement that Broadwell would be out for Back to School? Did that get revised already? Somehow I think Intel could get this stuff out sooner if AMD didn't have its head inserted fully into its rear end right now. This is really starting to suck though, as I've been hoping to replace my C2Q with a Broadwell machine around this time. It's still very adequate, but there are a ton of use cases where I'd benefit from an upgrade: photo and video editing, 3D rendering, data mining and LP solving are all very CPU intensive and something I do quite frequently. E: What's Skylake supposed to bring on top of that? If Broadwell is delayed, would they stick to the same interval before the next update?
|
# ? May 26, 2014 15:24 |
|
Skylake should bring DDR4. Honestly, if you want to upgrade, there is not really any reason not to do it now. It's not like Broadwell will be a massive jump over Haswell refresh. You will get big gains over a Core 2 regardless. If I had to make a best guess based on how things are looking now, I'd expect Broadwell next spring and Skylake maybe early 2016. That's based on nothing but gut feelings though. I'd just say upgrade now for you. beejay fucked around with this message at 15:42 on May 26, 2014 |
# ? May 26, 2014 15:37 |
|
mobby_6kl posted:Didn't we just have an announcement that Broadwell would be out for Back to School? Did that get revised already? Somehow I think Intel could get this stuff out sooner if AMD didn't have its head inserted fully into its rear end right now. beejay posted:Honestly, if you want to upgrade, there is not really any reason not to do it now. It's not like Broadwell will be a massive jump over Haswell refresh. You will get big gains over a Core 2 regardless. If I had to make a best guess based on how things are looking now, I'd expect Broadwell next spring and Skylake maybe early 2016. That's based on nothing but gut feelings though. I'd just say upgrade now for you.
|
# ? May 26, 2014 15:54 |
|
wheez the roux posted:What kind of mental gymnastics does it take to consider this anything but a positive? Have you ever, even once in the past decade, been limited in any capacity whatsoever by CPU speed? Every single day of my work life (science!).
|
# ? May 26, 2014 18:39 |
|
pmchem posted:Every single day of my work life (science!). That'll do it alright.
|
# ? May 26, 2014 18:42 |
|
Also, if you view growing system requirements as a reflection of increasing capabilities and functionality, then stagnation in requirements growth is definitely a bad thing. Look at gaming: the fact that games have been designed for the same consoles for 7 years has slowed the growth of system requirements, but it also means that game technology is advancing more slowly.
|
# ? May 26, 2014 21:57 |
Welp I'm glad I didn't pull the trigger on parting out my computer then
|
|
# ? May 27, 2014 07:35 |
|
Since Broadwell has been pushed back a year, are we likely to see any of the architectural upgrades that would've been scheduled for Skylake land in Broadwell instead?
|
# ? May 27, 2014 13:56 |
|
Alereon posted:Also, if you view growing system requirements as a reflection of increasing capabilities and functionality, then stagnation in requirements growth is definitely a bad thing. Look at gaming, the fact that games have been designed for the same consoles for 7 years has slowed the growth of system requirements, but also means that game technology is advancing more slowly. "Capabilities" is not directly correlated with actual features, but more with some specific features (for example, larger textures or more advanced physics mechanics). In other words you can have a generic Call of Duty type game that has high system requirements, or you can have a fairly unique and intriguing game (e.g., Gone Home) that has low system requirements. It's kind of hard now to actually do something "cool" (read: interesting) with increased processing power that wouldn't be possible on an earlier system. That being said, the current craze is apparently having everything run in 4k resolutions, so your graphics cards are still going to get better if only because of that. computer parts fucked around with this message at 14:02 on May 27, 2014 |
# ? May 27, 2014 13:59 |
|
We've been held back graphically thanks to consoles (360/PS3) far more than processing power.
|
# ? May 27, 2014 14:38 |
|
go3 posted:We've been held back graphically thanks to consoles(360/PS3) far more than processing power. Yeah, but honestly the main constraint has been memory and any other issue was a very distant second (near the end devs would just crank down the settings on the console versions rather than gently caress with PC games).
|
# ? May 27, 2014 14:40 |
|
wheez the roux posted:What kind of mental gymnastics does it take to consider this anything but a positive? Have you ever, even once in the past decade, been limited in any capacity whatsoever by CPU speed? Well if you were the sort of user that didn't need big CPU processing power nothing was forcing you to upgrade, it's just that now the people who do need that power don't have any upgrade options.
|
# ? May 27, 2014 15:49 |
|
computer parts posted:"Capabilities" is not directly correlated with actual features, but more some specific features (for example, larger textures or more advanced physics mechanics).
|
# ? May 27, 2014 16:09 |
|
Alereon posted:...especially in that it encourages game developers to write well-threaded x64 code. This is the biggest advantage of new consoles.
|
# ? May 27, 2014 16:11 |
|
Personally I'm finding myself increasingly CPU bound in games with the greater prevalence of advanced physics engines and physics based games. This tends to be even more true of games that make no pretences of being console compatible. Graphics at least tends to have options and falls back fairly gracefully if you can't manage the full quality. CPU not so much.
|
# ? May 27, 2014 16:45 |
What kind of games? I'm not asking to be a dick, actually interested. In my experience so far the only aspects of a game that actually stress my 4670k are the really intense online components. The only way I could tell was switching between stock speed and overclocked speed back and forth, and it only ever applied to that particular workload. But it was a significant difference, and is why I always try to say "CPU isn't the bottleneck for most gaming".
|
|
# ? May 27, 2014 17:08 |
|
I see that you haven't been playing very many games of Victoria II
|
# ? May 27, 2014 18:11 |
|
Pimpmust posted:I see that you haven't been playing very many games of Victoria II I also hear Civilization V is pretty brutal for processors. But this whole topic makes me wonder: why do gamers care so much about extreme overclocking when the vast majority of games aren't CPU bound? I'm not attacking anyone who does it; I just want to understand the benefits, because I thought the days of the AMD 2500 were long gone. Is the overall OS experience much smoother or something?
|
# ? May 27, 2014 18:19 |
My desktop experience is basically indistinguishable stock and overclocked. Frankly, I just do it because I like it. There are tangible benefits in some programs other than a web browser, but they're few and far between if you just game. In my best case scenario I gained 10 fps and subjectively smoother play in online Battlefield 4. This was shocking to me because I wasn't expecting a visually noticeable difference. But overall... it's just like my car: I'm squeezing the most I can out of it within reason, in many ways that are pretty parallel to CPU overclocking. But honestly, how often do I use that? 1% of the time? Still worth it to me though. I'd never heard of Victoria II. However, after googling it, it's hard to believe that can actually bottleneck at the CPU without some serious optimization issues on the development side. Though the search results seemed to be full of people upset that it's a single-threaded game, which is something that has certainly improved between generations.
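For anyone who wants to put a number on an overclock instead of eyeballing framerates, here's a rough sketch: time a fixed CPU-bound workload once at stock and once overclocked, and compare the wall times. The loop below is my own toy stand-in, not a real benchmark suite.

```python
import time

# Toy CPU-bound workload (invented, not a standard benchmark): run this
# once at stock clocks and once overclocked, then compare the times.
def cpu_workload(n=300_000):
    total = 0
    for i in range(2, n):
        total += i * i % 97
    return total

def bench(repeats=3):
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        cpu_workload()
        best = min(best, time.perf_counter() - start)
    return best  # best-of-N filters out background scheduler noise

print(f"best of 3 runs: {bench():.3f}s")
```

The ratio of the two times approximates the real gain; if it's much smaller than the clock speed ratio, something other than the core clock is the limit.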
|
|
# ? May 27, 2014 18:31 |
|
fookolt posted:I also hear Civilization V is pretty brutal for processors. But this whole topic makes me wonder: why do gamers care so much about extreme overclocking when the vast majority of games aren't CPU bound? quote:I'm not attacking anyone who does it; I just want to understand the benefits because I thought the days of AMD 2500 are long gone. Is the overall OS experience much smoother or something?
|
# ? May 27, 2014 18:34 |
|
Ignoarints posted:My desktop experience is basically indistinguishable stock and overclocked. Frankly, I just do it because I like it. There are tangible benefits in some programs other than a web browser, but they're few and far between if you just game. In my best case scenario I got 10 fps and subjective smoothness in online battlefield 4. This was shocking to me because I wasn't expecting a visually noticeable difference. It's mainly the "POP" system which tracks the needs and desires of a world worth of population (more or less) coupled to a "free market". Then of course AI for all the countries and military units, untriggered events that are tagged to said countries constantly checking in the background against their triggers... Really, any Paradox game requires a beefy CPU and game developers in general are lovely at CPU optimization.
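To illustrate why that kind of design chews through a single core, here's a toy sketch (invented structure and numbers, not Paradox's actual code): every tick re-evaluates every POP against a shared market and re-checks every pending event trigger.

```python
import time

# Toy sketch (not Paradox's actual code) of a POP-style simulation tick:
# every POP is touched every tick, and untriggered events constantly
# re-check their conditions in the background.
def simulate_tick(pops, events):
    for pop in pops:
        # each POP tries to satisfy its needs out of its income
        pop["satisfaction"] = min(1.0, pop["income"] / pop["needs"])
    return [e for e in events if e["trigger"](pops)]

pops = [{"income": i % 7 + 1, "needs": 5, "satisfaction": 0.0}
        for i in range(100_000)]
events = [{"trigger": lambda ps: ps[0]["satisfaction"] < 0.1}
          for _ in range(20)]

start = time.perf_counter()
fired = simulate_tick(pops, events)
elapsed = time.perf_counter() - start
# Cost grows linearly with POP count, and the shared market makes it hard
# to split across cores -- which is why one fast core wins in these games.
```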
|
# ? May 27, 2014 18:35 |
|
Is there any way to search ARK for package size? I have an odd application where Intel procs might be the right answer, but only if I can get them small enough. The filters give options for Socket, but I don't have any translation from socket type to millimeters. I've been poking around the Embedded section hoping that has everything. I'm not sure if "Atom for Storage" or "Atom for Communication" have any different package sizes.
|
# ? May 27, 2014 19:19 |
|
Intel has announced a strategic partnership with RockChip, China's largest SoC vendor, in which Intel will provide CPU, modem, and other IP blocks, while RockChip glues them together on the SoC and arranges fabrication at TSMC. I'm not sure how competitive this will be with contemporary Qualcomm SoCs, but we shall see.
|
# ? May 27, 2014 19:39 |
|
The brilliance of that is Intel managing to take fab capacity away from its competitors by utilizing TSMC to produce an x86 product.
|
# ? May 27, 2014 20:32 |
|
Ignoarints posted:What kind of games? I'm not asking to be a dick, actually interested. In my experience so far the only aspect of a game that actually stresses my 4670k are really intense online components of a game. The only way I could tell was reverting to stock speed and overclock speed back and forth, and it only ever applied to that particular workload for games. But it was a significant difference, and is why I always try to say "CPU isn't the bottleneck for most gaming"
Kerbal Space Program
DCS World
ARMA 3
A bunch of Paradox games several hours in, when there are a stupid amount of armies on the map, or Victoria in more general terms
Wargame (in 10v10 maps)
Civilisation
Prison Architect
etc.
I'm guessing games like Minecraft probably count too if you're into that (though they're also often memory limited, IIRC). Getting a better GPU would let me turn up a few graphics settings, but plopping in the newest $500 graphics hotness probably won't improve the framerate of most of these too much in general play. KSP is especially easy to explain because it's not especially graphically demanding, but the performance of the CPU basically determines how big you can build your stuff, so it has a real and tangible benefit. On the other hand, if I boot up Skyrim or something then I'm noticeably graphics bound on my HD5850. I'll agree for the most part the big triple-A games are generally graphics bound though. Shanakin fucked around with this message at 23:02 on May 27, 2014 |
# ? May 27, 2014 22:58 |
|
JawnV6 posted:Is there any way to search ARK for package size? I have an odd application where Intel procs might be the right answer, but only if I can get them small enough. The filters give options for Socket, but I don't have any translation for socket type to millimiters. I've been poking around the Embedded section hoping that has everything. I'm not sure if "Atom for Storage" or "Atom for Communication" have any different package sizes. Looks like on some/most there's a package size field? You can Select All, Compare, and export to Excel if you want to search/sort that field. Probably the smallest would be the Quark SoCs at 15mm x 15mm, but they're not fast.
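If the export-to-Excel route works, a small script can do the size filtering ARK won't. A sketch under assumptions: that you saved the comparison as CSV, that the size column is literally named "Package Size" with values like "15mm x 15mm", and that there's a "Product" column. Check the real headers in your export; ARK's naming varies by product family.

```python
import csv

# Filter an ARK comparison export (saved as CSV) by package dimensions.
# Column names "Package Size" and "Product" are assumptions -- verify
# them against the actual export before relying on this.
def parse_mm(value):
    """Parse strings like '15mm x 15mm' into (width, height) floats."""
    parts = value.lower().replace("mm", "").split("x")
    if len(parts) != 2:
        raise ValueError(f"unrecognized size format: {value!r}")
    return tuple(float(p.strip()) for p in parts)

def small_enough(row, max_mm, column="Package Size"):
    try:
        w, h = parse_mm(row[column])
    except (KeyError, ValueError):
        return False  # no parseable size on this row; skip it
    return w <= max_mm and h <= max_mm

def parts_that_fit(csv_path, max_mm):
    with open(csv_path, newline="") as f:
        return [row["Product"] for row in csv.DictReader(f)
                if small_enough(row, max_mm)]
```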
|
# ? May 27, 2014 23:14 |
Shanakin posted:I don't really bother with overclocking but perhaps I should. What kind of CPU do you have, out of curiosity, to give me an idea?
|
|
# ? May 27, 2014 23:35 |
|
fookolt posted:I also hear Civilization V is pretty brutal for processors. Not really. It's not CPU bound in the sense that you're starving for processing time to render a single frame. Most of the heavy CPU work is simulating turns on larger maps. All this means is more wait between turns, but you can still look around the map and plan your next move.
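That "look around the map while the AI thinks" behavior is just the turn simulation running off the render loop. A minimal sketch of the pattern (my own toy code, not Civ's actual implementation):

```python
import queue
import threading
import time

# Minimal sketch (not Civ's actual code): the expensive between-turns
# simulation runs on a worker thread, so the main render/UI loop never
# blocks -- you wait longer for the next turn, but the map stays
# interactive the whole time.
def simulate_turn(result_q):
    total = sum(i * i for i in range(2_000_000))  # stand-in for AI work
    result_q.put(total)

results = queue.Queue()
worker = threading.Thread(target=simulate_turn, args=(results,))
worker.start()

frames_rendered = 0
while worker.is_alive():
    frames_rendered += 1   # the UI loop keeps drawing and handling input
    time.sleep(0.001)      # stand-in for rendering one frame

worker.join()
turn_result = results.get_nowait()
```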
|
# ? May 28, 2014 04:07 |
|
|
Don't forget Dwarf Fortress, where a main cause of a lost game is FPS death because it runs as fast as it can on a single core.
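FPS death is just the fixed budget of one core being eaten by a growing world. A toy model of it (numbers invented, nothing to do with Dwarf Fortress internals):

```python
# Toy model of "FPS death" (invented costs, not DF internals): every
# creature and item adds per-tick simulation cost, all of it on one core,
# so effective frame rate is simply the inverse of the tick cost.
def effective_fps(n_creatures, n_items,
                  cost_per_creature_us=50, cost_per_item_us=5, cap_fps=100):
    tick_us = n_creatures * cost_per_creature_us + n_items * cost_per_item_us
    if tick_us == 0:
        return cap_fps
    return min(cap_fps, 1_000_000 / tick_us)

# A young fort runs at the FPS cap; a mature fort drowning in items
# crawls, no matter how many idle cores the machine has.
early_fort = effective_fps(n_creatures=20, n_items=200)
late_fort = effective_fps(n_creatures=200, n_items=50_000)
```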
|
# ? May 28, 2014 04:47 |