fishmech
Jul 16, 2006

by VideoGames
Salad Prong

HMS Boromir posted:

As someone who's never bought a laptop, I associate i3/i5/i7 with 2c/4t, 4c/4t, 4c/8t, and it's really confusing whenever I want to help a friend pick out a laptop because I have to figure out what they actually mean there. The one thing I think I've learned is that high end quad cores are i7-####HQ. Is that consistent or are there exceptions?

My Sandy Bridge 4 core/8 thread mobile i7 is i7-2630QM. Not sure when they changed between QM and HQ.


fishmech
Jul 16, 2006

by VideoGames
Salad Prong

blowfish posted:

The typical consumer has like 5 tabs open at most.

And? That doesn't make horrible messes of javascript any easier to execute.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Boiled Water posted:

Ehh they really don't. Some of the "cores" are regular ol' CPU cores, but the majority, I think half, are graphics processors that only make pictures pretty.

I'm pretty sure you're thinking of the PS3's CPU. That had 8 cores: one main PowerPC core, six :airquote:Synergistic Processing Elements:airquote: on a different architecture that were used variably for standard general purpose code, video decode, or graphics assistance, and yet another core that was primarily reserved for the system OS and security/encryption. The SPE cores lacked branch prediction.

(The Xbox 360's CPU was much simpler, 3 identical PowerPC cores that were effectively hyperthreaded, for 6 threads of execution)
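For reference, here's that pair boiled down into a quick sketch. Python is just being used for layout; the counts are the commonly cited figures (the Cell shipped 8 SPEs, with one disabled for yield and one reserved by the OS), so treat the thread totals as approximate.

code:
# Rough summary of the last-gen console CPUs described above.
# Figures are the commonly cited ones, not official spec sheets.
consoles = {
    "PS3 (Cell)":       {"general_cores": 1, "spe_cores": 7, "hw_threads": 9},
    "Xbox 360 (Xenon)": {"general_cores": 3, "spe_cores": 0, "hw_threads": 6},
}

for name, c in consoles.items():
    print(f"{name}: {c['general_cores']} general core(s), "
          f"{c['spe_cores']} SPE(s), ~{c['hw_threads']} hardware threads")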

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Josh Lyman posted:

Aren't there only a handful of games where increased CPU parallelism would even help?

This is only because it took until this generation for every console to have multiple identical cores, and console design often places constraints on any PC games that are also to be ported to consoles. The Wii was single core, the PS3 was the mess of different cores I mentioned above, and the 360 was the only one with multiple identical cores.

In order to take advantage of the PS4/Xbox One's full capability, you need to be using 6 or 7 of the cores (one or two of the cores in those 8-core setups are usually reserved for other things while a game is playing). And even the Wii U has a CPU with 3 identical PowerPC cores (possibly 6 threads), essentially echoing the last-gen Xbox 360's design.

So with every current console having more than 2 cores for games, you can expect future games on the PC to also really want a lot of cores. It is, after all, easier to just plop your core-hungry game engine onto the PC rather than spend a bunch of time refactoring things so that it runs just as well with 2 cores or 1 core. GTA IV was the first major example of this years back - remember all the people who were angry that it really needed a quad-core CPU to run well because of how Rockstar chose to port it from the 360?
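And to be concrete about why "plop it on more cores" works at all, here's a toy sketch - not anything any real engine does, just a workload that's already chopped into independent jobs getting spread across however many workers you hand it:

code:
# Toy illustration only - not any real game engine's code. It just shows why
# work that's already split into independent jobs speeds up as you add cores,
# which is the situation console-first engines create.
import math
import time
from multiprocessing import Pool

def simulate_chunk(seed: int) -> float:
    # stand-in for one job's worth of physics/AI number crunching
    return sum(math.sqrt(seed + i) for i in range(200_000))

if __name__ == "__main__":
    jobs = list(range(64))
    for workers in (1, 2, 4, 8):
        start = time.perf_counter()
        with Pool(processes=workers) as pool:
            pool.map(simulate_chunk, jobs)
        print(f"{workers} worker(s): {time.perf_counter() - start:.2f}s")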

fishmech fucked around with this message at 18:33 on May 29, 2016

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

ohgodwhat posted:

Did they see £79 and just assume that's the same as $79?

Could also be that the UK site offers US dollar prices (probably with a hefty shipping cost tacked on, which the store would ignore).

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

mayodreams posted:

Too bad that chart does not show how AMD is curb stomped on single thread performance even with their 'massive improvement' in generational performance.

I am still rooting for Zen to be competitive, but that chart is marketing bullshit.

It reminds me of when, back in 2006 as Apple was breaking out of their lowest share of the computer market since 1977 (they'd dropped to around 2% in the last part of the PowerPC era), they started putting out press releases about how they were growing faster than the Windows PC market!!

And it's like, yeah, it's easy to post very good percentage growth when you're finally crawling out of rock bottom!

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Ak Gara posted:

I'm getting tired of Minecraft of all loving things, maxing out my 5ghz 2500k. 100% cpu usage across all 4 cores @ 67c.

Would a 6700k help? Does Java care about cores vs threads?

Just curious, but are you using a lot of mods as well? It's kinda unusual for Minecraft to peg a 2500k without them.

You will certainly get better performance with the newer chip regardless, but it seems like something's gone fucky with your Minecraft install/world.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Ak Gara posted:

100's of mods! :v:

In that case, the new chip will probably also peg at 100% on all cores with all that stuff you have, but things will run better while doing it. :v:

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

HMS Boromir posted:

I'm not sure why people react so violently to low resolution monitors. For me a smooth framerate and decent settings are orders of magnitude more important than high resolution, and I'm not in the business of buying $300 video cards so I can get all three. The only reason I'm planning to ever upgrade from 1360x768 at all is because adaptive sync sounds like a cool way to further extend the life of a GPU by softening the effect of framerate drops once it starts getting long in the tooth.

Because they're horrible, and there's no situation other than "I absolutely must play games at the highest settings at the native resolution" where having a low resolution benefits you. Using those 1280x720/1360x768/1366x768 monitors is essentially still using the same 1024x768 resolution we were using 20 years ago; it's simply not appropriate anymore, and things aren't really designed to work with it. Sure, sometimes it's the only option on very small laptop displays, but that's a tradeoff you have to put up with, the same way you used to have to put up with screens that ghosted so badly that playing most games on them was impossible too.

I mean you might as well say this is a perfectly usable way to browse the web:

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

HMS Boromir posted:

...I should probably stop posting about my dumb monitor. Every time I do a swarm of people has to jump in and tell me I'm an idiot for not spending a bunch of money on something I don't want. I've used 1080p monitors and the "upgrade" is not worth money to me.

You can literally get a better monitor at a Goodwill for $5. If that's a bunch of money to you, I don't know what to tell you. 768-high resolutions were already obsolete before LCD monitors went mass market.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

PerrineClostermann posted:

Our first LCD monitor was a big, heavy thing with 1280x1024 native, from 2001.

Fun fact: a 1366x768 monitor has 20% fewer pixels than a 1280x1024 one. That's why they're so garbo.
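If you want to check that math yourself:

code:
# Quick check of the pixel-count claim above.
px_768  = 1366 * 768    # 1,049,088 pixels
px_1024 = 1280 * 1024   # 1,310,720 pixels
print(f"1366x768 has {(1 - px_768 / px_1024) * 100:.0f}% fewer pixels than 1280x1024")
# -> about 20% fewer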


HMS Boromir posted:

19.5" and I stand fairly close to it, for what it's worth. I admit I haven't gone to thrift stores looking for good deals on 1080p monitors but that might be because, as you might have noticed, I don't particularly care about getting a 1080p monitor - again, I've used one and I can go back without wanting to kill myself, which is apparently the expected reaction.

Honestly I wouldn't even have posted about it if I hadn't gotten this reaction before. It's seriously fine and there isn't something wrong with my brain that's making me think so.

Why are you obsessively focusing on 1080p? Even 1280x1024 or 1600x900 or a bunch of other resolutions would be better. The whatever-by-720/768 monitors were last mainstream in the late 90s, and since then, especially in widescreen formats, they've been pretty much the rock bottom for a desktop monitor.

The point is there is nothing fine about the monitor you have; it's terrible for actually using modern things, which expect modern resolutions. We're not saying there's anything wrong with your brain, just that you've been making life hard for yourself to save $5.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

HMS Boromir posted:

I think my logic is pretty sound. I've used a higher resolution monitor, it's nicer for desktop use but not by enough that I actually want one for that reason, especially since again I am not doing any work I'm getting paid for on this monitor. I don't think that calls for a cascade of people calling my choice of monitor resolution "horrible" and "making things hard on myself" and "Stockholm Syndrome" (jesus) but apparently goons disagree.

I'm gonna bow out here, I don't think any of us are having a good time and this is the Intel CPU thread anyway. Enjoy your high framerates at affordable prices KingEup, I'll continue enjoying mine.

If you didn't want people to tell you why your monitor is bad, why'd you come here talking about how you love using a bad monitor? Seriously.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Ak Gara posted:

Clearly I've underestimated the popularity of TrackPoint™ Style Pointers.

They work very well for very compact laptops. I certainly wish they had been popular during the netbook fad instead of those postage-stamp trackpads!

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
It is unforgivable how Apple led the trend of not having actual buttons.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

japtor posted:

Mobile processors!

Speaking of core count, I think I'm gonna finally start a PC build and I'm wondering how long I can get away with an i3 (or Pentium I guess?), or should I just spend the extra chunk for an i5 now for the extra cores? Going for something like a low end setup, but I'm hoping it has enough longevity at lower settings or with simpler games to last a while.

If you plan to play any sort of modern games, an i5 for more cores is definitely the way to go.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Ak Gara posted:

There's something I don't get: is it motherboard limited or CPU limited that a motherboard would come with 2 USB 3.0s and 20 USB 2.0s? They're backwards compatible! Why even include the slower ones? Same for SATA - why only 2 SATA 6Gbps and 10 SATA 3Gbps?

If it was just cost, I'd imagine the top end boards would at least offer more of them for more money?

You only need SATA II for conventional hard drives, unless you're going for some real high-end drives. Similarly, it's perfectly fine to plug your mouse and keyboard into USB 2.0 ports - and if you want to support Windows 7 well, as many customers still need, you should have some 2.0 ports in there because 7 has trouble using 3.0 in a fresh install. There are also some instances where certain USB 1.0/1.1 devices people still have won't work right on USB 3.0 ports due to driver/implementation issues, especially if the USB 1.0/1.1 device was connected through a hub. I think newer chipsets aren't known to have this problem anymore, but it was a thing for a bit.

And in some cases there just isn't enough bandwidth available to the CPU/on the motherboard to handle having all the ports they want to sell you be SATA III/USB 3 respectively, but that's comparatively rare nowadays.
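To put rough numbers on the SATA II point (ballpark figures, not measurements of any particular drive):

code:
# Why SATA II is plenty for spinning disks, in back-of-the-envelope numbers.
sata2_line_rate_gbps = 3.0   # SATA II signalling rate
sata3_line_rate_gbps = 6.0   # SATA III signalling rate
encoding_efficiency  = 0.8   # 8b/10b encoding overhead

sata2_usable_MBps = sata2_line_rate_gbps * 1000 / 8 * encoding_efficiency  # ~300 MB/s
sata3_usable_MBps = sata3_line_rate_gbps * 1000 / 8 * encoding_efficiency  # ~600 MB/s
typical_hdd_MBps  = 180      # ballpark sequential throughput of a 7200rpm drive

print(f"SATA II usable: ~{sata2_usable_MBps:.0f} MB/s, "
      f"SATA III usable: ~{sata3_usable_MBps:.0f} MB/s, "
      f"typical HDD: ~{typical_hdd_MBps} MB/s")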

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

DrDork posted:

You're right about single thread performance, but more and more games (let alone everything else) are seriously jumping on the multi-thread bandwagon, so that's not nearly the limitation that it was 4 or 5 years ago.

Would absolutely love to see Zen be A Thing, but...well, AMD.

Ironically, AMD is to thank for the move to heavily multicore games, because 8-core AMD x86-64 CPUs are in both current generation consoles (the Wii U is really a late last-gen console).

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

feedmegin posted:

Exactly what regular consumer software do you think can make use of 22 cores? Using 4 is a bit of a struggle for most things. Cores aren't magic, having more of them is only useful if you have parallelisable work and that's just not true of the average word processor or whatever.

It would sure be stupid of most consumer software to optimize towards massive amounts of cores when massive amounts of cores aren't available to consumers. The argument you're making is like it was the late 80s and someone was saying that since DOS doesn't currently support more than 32 MB in a partition, no hard drive manufacturers should offer larger drives.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

feedmegin posted:

Exactly how many programs do you think your average user is actively using simultaneously?


If they're a typical user running Chrome and a word processor, they're actually running easily 10 "programs" at once.

feedmegin posted:

Again, cores aren't magic. Look up Amdahl's Law. You can't 'optimize towards massive amounts of cores' if the task you are attempting to complete is fundamentally sequential.

No one said they're magic, but these days we have plenty of programs that operate best with 2 cores and games that operate best with 4 or 8 threads. And having 8 threads is still pretty massive to the average consumer system, which has only 2 or 4 threads.
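Since Amdahl's Law got invoked, here's the actual formula in a few lines, with made-up parallel fractions just to show the shape of it - piling on threads only pays off when most of the work actually parallelizes:

code:
# Amdahl's Law: speedup(n) = 1 / ((1 - p) + p / n), where p is the fraction
# of the work that can run in parallel and n is the thread count.
# The values of p below are illustrative, not measurements of any real program.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.5, 0.9, 0.95):
    print(f"p={p:.2f}: " + ", ".join(
        f"{n} threads -> {amdahl_speedup(p, n):.2f}x" for n in (2, 4, 8, 20)))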

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

feedmegin posted:

Of course not. I said 'Average consumer task'. Not deep learning, not graphics stuff (because that's exactly why GPUs with loads of cores are a thing), not scientific computing, and specifically not high performance gaming, but the sort of thing Joe Sixpack does with his computer every day when he's not playing games on his PS4. You can throw a few more cores at that stuff and it'll help, but throwing 20 at it simply isn't going to have that much effect. Maybe if he's got like 8 tabs open and visible then the Javascript on each of those tabs goes a little bit faster.

The average consumer task is to have like Excel or Word open and then also a bunch of browser tabs (which in many browsers means multiple processes) and maybe the Spotify app or something similar running too. That's actually a lot of processing that needs to be done, and it does bog down on their systems. If they had 20 threads on their processor, all that stuff they run at once would run a lot better even if they didn't know why it was running better.
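If you're curious how much is actually running on a "light" desktop, this little sketch counts it (needs the third-party psutil package, so purely illustrative):

code:
# Count processes and threads on the current machine.
# Requires: pip install psutil
import psutil

procs = list(psutil.process_iter(attrs=["name", "num_threads"]))
total_threads = sum(p.info["num_threads"] or 0 for p in procs)
print(f"{len(procs)} processes, {total_threads} threads, "
      f"{psutil.cpu_count(logical=True)} hardware threads available")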

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

lllllllllllllllllll posted:

Kaby Lake promises less CPU strain playing 4K videos. Is this something that would be delegated to a modern dedicated graphics card (like Nvidia's 10XX series) if you have it? Or is this a useful feature whether you go with built-in graphics as well as for users with dedicated graphic cards?

Although they advertise it as being for 4K video, what's going on is that they're adding support for a newer standard codec, which also benefits decoding and encoding in that format at 1080p or even 320x240. Most video cards out right now don't support it yet, but it'll eventually be standard there too.

The previous generation of CPUs supported the new HEVC codec but only really well for 1080p resolution and 8 bit per channel color depth. Kaby Lake has better support for HEVC that remains useful at 4K resolution and 10 bits per color channel AND adds hardware decoding for VP9 format video as well.
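Rough arithmetic on why you want fixed-function decode for this stuff (ballpark figures):

code:
# The *decoded* output of a 4K 10-bit 60fps stream is enormous compared to
# the compressed stream, which is why doing it on general purpose cores hurts.
width, height, fps = 3840, 2160, 60
bits_per_pixel = 15          # 10-bit 4:2:0 averages 15 bits per pixel

raw_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"Decoded 4K 10-bit @ 60fps: ~{raw_gbps:.1f} Gbit/s of pixels to produce")
print("Typical compressed HEVC stream feeding it: ~0.05 Gbit/s (50 Mbit/s)")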

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
Yeah, very high quality, very short cables and connectors could potentially push VGA to 8K, but you'd be unlikely to achieve that in practice.

Standard quality cables and connectors at normal lengths start showing noticeable signal degradation above 1920x1200 or so, and who really wants to spend a bunch of money on cables just to keep using VGA these days?

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

BIG HEADLINE posted:

Fair enough - I was suggesting the TV thing simply so he could see if his eyes were hypersensitive to 120Hz. People got ill during the HFR showings of The Hobbit, and others couldn't stand the 'soap opera effect.' I know I always notice the higher frame rate the most during commercials, specifically pill commercials.

The Hobbit's "high" frame rate was just 48 fps, rather than something above 60, so it's not really relevant to 120 Hz panels. It's also a different thing when the TV is set up to generate its own fake intervening frames, like you're seeing in commercials, versus native high framerate content.

Edit: it's worth remembering that in general our video frame rate standards are accidental rather than carefully planned. The only reason movies are shot at 24 fps is that it happened to be a decent compromise between capturing motion and conserving physical film back in the 20s and 30s (if we'd standardized at 30 fps, for instance, every second of footage would need 25% more physical frames, and consequently the amount of time that could be continuously shot on the same length of film would be shorter, etc). It could have been just about anything though, as rates from 15 to 35 had been common before standardization.
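To put numbers on the film-stock tradeoff (35mm sound film runs 16 frames to the foot):

code:
# Film consumption at different frame rates.
frames_per_foot = 16         # 35mm film
for fps in (15, 24, 30, 35):
    feet_per_minute = fps * 60 / frames_per_foot
    print(f"{fps} fps -> {feet_per_minute:.1f} feet of film per minute")
# 30 fps would have burned 25% more stock than 24 fps for the same runtime.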

When shot correctly, any sufficiently high framerate looks fine. But there's nearly a century of experience in getting, say, 24 fps content well lit and everything related, and not nearly as much with other rates. Combine that deep expertise in 24 fps (and to a lesser extent 25 and 30, for television in different parts of the world) with the fact that people have been watching that stuff for a long time, and you get used to it looking "correct" even though there's no inherent advantage.

fishmech fucked around with this message at 14:59 on Oct 6, 2016

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

SuperDucky posted:

The Altera purchase raises a lot of questions. What was that quote about Saudi Princes upthread, though?

Saudi princes have assloads of cash and a habit of buying the most expensive thing regardless of whether it's worth it. So the joke is that you probably wouldn't benefit enough to justify the cost.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Boiled Water posted:

And presumably all the bandwidth in the world.

It's only going to be like 45 megabits per second at most for a really high quality stream at 3840x2160, and probably a lot lower most of the time. Current Netflix 4K streaming requires a minimum of 16 megabits before it boots you down to 1080p streaming.

In case you're wondering, the new Ultra HD Blu-ray standard for 4K movies and fancy soundtracks requires support for at least 108 megabits per second to handle the highest quality content.
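Translated into data-cap terms (decimal GB, ignoring container overhead):

code:
# Convert a video bitrate into data used per hour of viewing.
def gb_per_hour(mbit_per_s: float) -> float:
    return mbit_per_s * 3600 / 8 / 1000

for label, rate in [("Netflix 4K floor", 16), ("high quality 4K stream", 45),
                    ("UHD Blu-ray minimum", 108)]:
    print(f"{label}: {rate} Mbit/s -> ~{gb_per_hour(rate):.1f} GB/hour")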

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
Part of why Netflix is ok with these restrictions is that a lot of people really don't care about full 4K or even 1080p video in their browser, and a lot of people are viewing on a tablet, phone, or set top box/game console instead. And those tend to have fewer limits, especially the latest Xbox One and PS4.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Pryor on Fire posted:

News came out today that the new Nintendo console will charge over usb-c, while I think this is irrelevant to any sort of motherboard situation I can imagine it's still kind of a mile marker moment, and I'd really want a computer with usb-c going forward because loving everything will run off that now like it or not.

Well it's just a tablet, and USB charging is already common on those. It would mean a lot more if it was something like the new PS4/Xbox One revisions using USB-C.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Actuarial Fables posted:

I'm looking for a processor to run a home server for a storage RAID (backups, videos, music), certificate training (linux VMs for RHCSA), and whatever else I might want to experiment with. Is something from the Atom line going to be enough for my needs (something like http://www.newegg.com/Product/Product.aspx?Item=N82E16813182855 )? I'm out of my depth when it comes to server-oriented processors, and there's so many to choose from that I'm really lost. Should I be looking for a cheap Xeon processor instead?

Also here is a friend for all of you.


I wouldn't advise an Atom for a system that you expect to handle virtual machines, even if they'll be lightweight ones. But you also don't need a full-on new Xeon system for this either.

If you're looking to go with new chips, a recent i5 system would probably be the best. Otherwise, try to get hold of an old used Xeon from 2013 or so - like one using the E3-1240 v3.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
A lot of great codenames have serious rights issues lurking if you tried to use them in production. :shrug:

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

PC LOAD LETTER posted:

PC market is shrinking but there are still sales to be had. Especially in laptop and server land where ASPs are still higher, I believe.

Avg. users won't care about the CPU but they do care about bang vs buck and many of them haven't bothered to upgrade for much the same reason many enthusiasts haven't. The value just wasn't there to make a new system worthwhile.

It's worth remembering that when we talk about the "PC market shrinking" we're talking about going from something like 330 million units a year to 280 million units a year. That's still an assload of stuff selling.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

PerrineClostermann posted:

TIL enterprise solutions and consumer systems are equivalent

So you're saying spending $50 once to unlock something on your system is horrible but spending $50,000 a month instead is fine?

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

PerrineClostermann posted:

Let me download this processor real quick

You can literally download improved functionality for regular CPUs in the form of new microcode though, to say nothing of what you can get up to with FPGA stuff.
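For instance, on a Linux x86 box you can see the currently loaded microcode revision straight out of /proc/cpuinfo (Linux-only sketch; distros ship newer microcode as ordinary package updates):

code:
# Print the running CPU's microcode revision on Linux.
with open("/proc/cpuinfo") as f:
    for line in f:
        if line.startswith("microcode"):
            print(line.strip())
            break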

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Sidesaddle Cavalry posted:

I don't like that product segmentation thrives off of marketing extra features to people who don't know whether or not said features will actually be useful to them in practice or everyday use. Especially in products as complicated as microprocessors vs. average Joe consumers with no IT or compsci background.

I don't like marketing :saddowns:

Yeah but if we were just marketing CPUs based off of "what does the average person know about a processor" what would that even be? The raw clock speed maybe.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

WAR DOGS OF SOCHI posted:

Sorry for the derail, but I love this stuff. Are you saying that they don't bother cooling any more than is absolutely necessary? To the point that a human being could suffer heatstroke or something while in the racks? lmao that's crazy

It's not usually hot enough to cause heatstroke in the time someone would actually be in the room, but it'll get pretty hot at times, easily up to like 100F.

The basic reasoning is that since the server vendors will now agree to warranty their equipment to run constantly at the higher temperatures, why should you try to force temps down to the 60s-70s F range anymore? You spend a ton less on power, which is good for saving money and great for the environment in general. If I remember right, Google was one of the pioneers of the practice and then the influence of their testing convinced the manufacturers to allow it.

In many places, this practice results in near zero use of air conditioning equipment over large stretches of the winter and even spring and fall, since natural cooling to the environment happens quickly enough if you set something like 100F or 110F as the point where the AC has to come on.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Twerk from Home posted:


Good news is ARM options are coming up quickly to fill this gap. With dedicated hardware for encryption and huge i/o bandwidth, they'll gladly fill the CPU-light edge server role.

Haven't they been "coming up quickly" but never actually getting here for like, 5+ years now though?

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

ohgodwhat posted:


The problem, from my perspective, is that this opens up a lot more surreptitious physical attacks, where now you can get JTAG access from momentary physical access without disturbing the look of the device, which was not possible before.

You mean the sort of physical attacks that you could already do with most laptops via, say, Thunderbolt, or FireWire, or ExpressCard, and I think maybe PCMCIA? That is, attacks that let you extract a bunch of information from the running system and allow for the possibility of injecting some manner of malicious executable code.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

ohgodwhat posted:

Yeah? Those are bad too? Thanks fishmech.

The point is there's a shitload of these things already, and those methods are all a lot more practical too. And those are just the ones that any random ook with a pre-built device could shove in.

Feel free to poo poo yourself and weld a metal bar over your laptop USB ports if you want, I guess.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

ohgodwhat posted:

"Boy howdy, you think that's bad? But other things are bad too so we might as well do nothing."

But there is nothing to do? Like there's not actually a problem with this sort of debugging being possible through the USB port.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

ohgodwhat posted:

Not actually a problem:


How many Skylake laptops have PCMCIA? How many have USB?

edit: To be clear since the point tends to fly over your head, if this wasn't possible with USB, this attack vector likely would not exist for many new laptops. That previous standards were also bad isn't exactly a defense.

Nor would it be inconceivable for manufacturers to take a bit more care with the design to make such functionality available, but not turned on by default. Will this affect a huge number of people? Probably not, but there's really no excuse for continuing to make the same bad decisions.

But again, it isn't actually a problem. With a regular rear end USB port on any laptop, you can already build a malicious USB device that can inject malware and even read out significant sections of memory. That is a fundamental flaw in how USB works, since the host "trusts" that devices are what they say they are and won't try to be malicious.

There is nothing "bad decision" about having this functionality.

You're just going to have to put on your big boy pants and accept that once someone has physical access, there's not much you can really do about security. The only way to prevent this would be to make it nearly impossible to use external devices with your computer at all.
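About the only knob that exists is the kernel's USB authorization interface, and it mostly proves the point: the "fix" is refusing new devices outright until someone manually approves them (Linux-only sketch, needs root):

code:
# Make newly plugged USB devices start out deauthorized on every bus.
# They stay inert until root writes 1 to that device's "authorized" file.
import glob

for path in glob.glob("/sys/bus/usb/devices/usb*/authorized_default"):
    with open(path, "w") as f:
        f.write("0")
    print(f"new devices on {path.split('/')[-2]} now need manual authorization")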


fishmech
Jul 16, 2006

by VideoGames
Salad Prong

ohgodwhat posted:

I disagree that direct JTAG access is no more concerning of an attack vector than "normal" USB, mediated by the OS which at least in theory can mitigate attack from said devices. I'm sure there's bad poo poo there too but again, the policy shouldn't be to grow the attack surface.

Please continue to be unnecessarily hostile though, it really adds to the conversation.

You can disagree all you want; it's still true. The attack surface hasn't grown in the least. OSes being protected "in theory" doesn't matter - in theory you'll have a hard time using this thing as well.
