Mr Chips
Jun 27, 2007
Whose arse do I have to blow smoke up to get rid of this baby?

Professor Science posted:

it's about as stupid as running unmodified MPI jobs on it, so whatever. (which is to say, it isn't smart, but it could be a *lot* more idiotic)
Isn't the point of the Phi that it can appear as an MPI cluster, so all you need to do is recompile your existing MPI apps to run on it? Or have I missed something fundamental? I understand that if the jobs don't fit in the card's memory, it'll struggle to shine.


Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

Mr Chips posted:

Isn't the point of the Phi that it can appear as an MPI cluster, so all you need to do is recompile your existing MPI apps to run on it? Or have I missed something fundamental? I understand that if the jobs don't fit in the card's memory, it'll struggle to shine.
you can, it's probably just going to run like garbage due to unnecessary communication latency, GDDR5 memory latency, poor singlethreaded performance relative to a standard CPU, and underutilization of LRBni (or whatever they're calling it now, I can't remember).

Mr Chips
Jun 27, 2007
Whose arse do I have to blow smoke up to get rid of this baby?
I get why that would be an issue for pthreads/openMP stuff, but surely the communications latency at least wouldn't be any worse than a typical MPI job running over TCP on 1/10 gig ethernet?


It looks like for some of the stuff I'm interested in (e.g. RAxML), the Phi 5110P gets about 2x the performance of a dual-socket Xeon E5-2680 with minimal code modifications. It seems like there is a point with these tasks where 60 crappy 1GHz cores do deliver better overall performance than 16 2.7GHz ones with AVX, SSE4, etc.

Mr Chips fucked around with this message at 10:02 on Nov 4, 2014

mobby_6kl
Aug 9, 2009

by Fluffdaddy
So I was looking around for some updated information on the Skylake situation and came across this article:
http://www.kitguru.net/components/cpu/anton-shilov/intel-speeds-up-introduction-of-unlocked-core-i5i7-broadwell-processors/

If this is to be believed, the good news is that it's scheduled for a Q2 launch. The bad news is that they might only release locked, non-K versions at the time, while keeping Broadwell around.



quote:

...since Intel has no plans to release Skylake processors with unlocked multiplier in Q2 2015

By contrast, Intel’s “Haswell-K” and “Broadwell Unlocked” will offer overclockability, but will not feature native SATA Express support and will continue to rely on DDR3 memory.

The slide snap looks legit enough, but has anyone come across anything else to support or refute this?

PC LOAD LETTER
May 23, 2005
WTF?!
No, but it jibes with rumors that K-version Skylake won't be on sale until Q3-Q4 '15.

Irritating, but maybe by then faster DDR4 won't be so $rape$ price-wise.

chocolateTHUNDER
Jul 19, 2008

GIVE ME ALL YOUR FREE AGENTS

ALL OF THEM
In general, how good are those haswell-based pentiums when compared to an i3? Trying to put together a nice, small form-factor build for my GF's mom who will mostly use it for Quickbooks and email.

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

chocolateTHUNDER posted:

In general, how good are those haswell-based pentiums when compared to an i3? Trying to put together a nice, small form-factor build for my GF's mom who will mostly use it for Quickbooks and email.

Quickbooks and email will never see the difference between a Pentium and i3.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

chocolateTHUNDER posted:

In general, how good are those haswell-based pentiums when compared to an i3? Trying to put together a nice, small form-factor build for my GF's mom who will mostly use it for Quickbooks and email.
A G3258 would be fine for desktop use. If you're trying to impress her, save the difference from the i3 and put it into a SSD.

chocolateTHUNDER
Jul 19, 2008

GIVE ME ALL YOUR FREE AGENTS

ALL OF THEM

mayodreams posted:

Quickbooks and email will never see the difference between a Pentium and i3.

Figured, just wanted to be sure.

cisco privilege posted:

A G3258 would be fine for desktop use. If you're trying to impress her, save the difference from the i3 and put it into a SSD.

This was the plan, glad to see you guys verified it! Thanks!

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>

cisco privilege posted:

A G3258 would be fine for desktop use. If you're trying to impress her, save the difference from the i3 and put it into a SSD.

Wouldn't an i3 still be better for general multitasking and the like?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Hace posted:

Wouldn't an i3 still be better for general multitasking and the like?

Yes, the same way an i7 would be over an i5 - anything but the most intensive use or specific tasks will never show the difference.

Chuu
Sep 11, 2004

Grimey Drawer
A very long time ago (~Gulftown) I found a doc on Intel's site that explicitly said what the new max turbo frequency was for a processor as you disabled cores. I've searched Intel's site for about an hour now and can't find a similar doc. Does anyone know where they hide this information now?

Deuce
Jun 18, 2004
Mile High Club
Crossposting from the Parts Picker thread, Microcenter has the 4690k on sale for $179 right now.

http://www.microcenter.com/product/434177/Core_i5-4690K_35GHz_LGA_1150_Boxed_Processor

LiquidRain
May 21, 2007

Watch the madness!

Chuu posted:

A very long time ago (~Gulftown) I found a doc on Intel's site that explicitly said what the new max turbo frequency was for a processor as you disabled cores. I've searched Intel's site for about an hour now and can't find a similar doc. Does anyone know where they hide this information now?
You can't do this anymore because there's no longer a simple turbo frequency table of "1 core = 3.9GHz, 2 core = 3.6GHz, 3+ core = 3.4GHz." Turbo and GPU Turbo look at the total thermal output/temperature, and power usage of the entire die (GPU and CPU combined, how much of each is being used, and how many cores in use at what levels) and choose what to turbo and how much to turbo based on that. It's not simple.

LiquidRain fucked around with this message at 06:07 on Nov 18, 2014

Chuu
Sep 11, 2004

Grimey Drawer

LiquidRain posted:

You can't do this anymore because there's no longer a simple turbo frequency table of "1 core = 3.9GHz, 2 core = 3.6GHz, 3+ core = 3.4GHz." Turbo and GPU Turbo look at the total thermal output/temperature, and power usage of the entire die (GPU and CPU combined, how much of each is being used, and how many cores in use at what levels) and choose what to turbo and how much to turbo based on that. It's not simple.

Thanks for the info. Do you know when this changed?

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

Chuu posted:

Thanks for the info. Do you know when this changed?

LiquidRain is wrong, the turbo frequency table you're interested in is still a thing. Here it is:

http://www.intel.com/support/processors/corei7/sb/CS-032279.htm

The stuff LiquidRain mentioned about turbo control being based on lots of sensor data is true, but this isn't a new development. The table's meaning has always been "with N cores active the cores will run somewhere between BaseFreq and FreqN, depending on conditions".

GokieKS
Dec 15, 2012

Mostly Harmless.
Ugh, seems like Dragon Age: Inquisition just flat out refuses to work on a dual-core machine regardless of how fast those cores may be. So my plan of just using this G3258 to tide me over until ASUS releases a mATX X99 board (which doesn't seem like it's going to happen until next year at the earliest) is coming back to bite me.

Debating if I want to just pick up a 4690K for $170 and plop it into this (kinda crappy) MSI Z97 board (that came with the G3258 in the $100 bundle) and continue waiting, forget Haswell-E/X99 and just get a 4790K + ASUS Maximus Gene VII, or just get the 5820K now and settle for a "cheap" X99 board. I guess I could also get the Rampage V Extreme, but I really don't want a full ATX motherboard... or to spend $500 on one.

No Gravitas
Jun 12, 2013

by FactsAreUseless

GokieKS posted:

Ugh, seems like Dragon Age: Inquisition just flat out refuses to work on a dual-core machine regardless of how fast those cores may be.

From what little I looked at, it seems something like DRM takes up an entire core or something.

Ugh.

You don't want X99, Haswell-E or (for now) a 5XXX CPU. Those are for people who think big model numbers/big price = better. For games they are worse than Z97. Don't fall for lovely misleading Intel marketing.

BurritoJustice
Oct 9, 2012

GokieKS posted:

Ugh, seems like Dragon Age: Inquisition just flat out refuses to work on a dual-core machine regardless of how fast those cores may be. So my plan of just using this G3258 to tide me over until ASUS releases a mATX X99 board (which doesn't seem like it's going to happen until next year at the earliest) is coming back to bite me.

Debating if I want to just pick up a 4690K for $170 and plop it into this (kinda crappy) MSI Z97 board (that came with the G3258 in the $100 bundle) and continue waiting, forget Haswell-E/X99 and just get a 4790K + ASUS Maximus Gene VII, or just get the 5820K now and settle for a "cheap" X99 board. I guess I could also get the Rampage V Extreme, but I really don't want a full ATX motherboard... or to spend $500 on one.

Even if you are dead set on X99 (no reason to be for gaming), why not just get an X99 mATX board from any other manufacturer? Bonus points because you won't have to pay the ASUS tax. Don't buy a Rampage V Extreme, there is no justification for that ever.

The logical solution is to just grab a 4690K if you need an upgrade, it will swap right in and be absolutely enough for every game out.

edit: Dunno what Gravitas is saying though. 5xxx CPUs aren't worse for gaming than 4xxx, and it certainly isn't Intel being misleading. The base clocks are lower, sure, but even an average 5960x will hit 4.5GHz with proper cooling. The extra cores might not help in most games, but they won't hinder and you'll hit the same clocks as a 4790k within margins of error.

BurritoJustice fucked around with this message at 06:20 on Nov 19, 2014

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

GokieKS posted:

Ugh, seems like Dragon Age: Inquisition just flat out refuses to work on a dual-core machine regardless of how fast those cores may be. So my plan of just using this G3258 to tide me over until ASUS releases a mATX X99 board (which doesn't seem like it's going to happen until next year at the earliest) is coming back to bite me.

Debating if I want to just pick up a 4690K for $170 and plop it into this (kinda crappy) MSI Z97 board (that came with the G3258 in the $100 bundle) and continue waiting, forget Haswell-E/X99 and just get a 4790K + ASUS Maximus Gene VII, or just get the 5820K now and settle for a "cheap" X99 board. I guess I could also get the Rampage V Extreme, but I really don't want a full ATX motherboard... or to spend $500 on one.

I'm curious as to what parallel universe you imagine yourself living in where it makes even the slightest bit of sense to buy Haswell-E for your home gaming computer? Haswell-E is for workstations. Games will not run any better - a 5820K may well run them worse than a 4690K because each core is clocked lower. Even a 4790K is a complete waste of money. Just buy a 4690K, and sell the combo if you want to fund a Maximus Gene.

Can you just confirm for me though GokieKS that you've bought Inquisition and you've demonstrated conclusively that it won't run if you only have a dual core?

The Lord Bude fucked around with this message at 06:24 on Nov 19, 2014

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

GokieKS posted:

Ugh, seems like Dragon Age: Inquisition just flat out refuses to work on a dual-core machine regardless of how fast those cores may be.

What, really? I've seen reports of people having huge problems, but also people successfully playing the game on e.g. mobile i5s, which are dual core.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Factory Factory posted:

What, really? I've seen reports of people having huge problems, but also people successfully playing the game on e.g. mobile i5s, which are dual core.

If it's true we need to put a notice in the partspicking thread. Lots of people are going to be building PCs for this game.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
It's also important to keep in mind that Haswell-E requires another large investment in DDR4 memory just to get started. It's not a good buy right now at all for anything.

GokieKS
Dec 15, 2012

Mostly Harmless.
While I don't necessarily *need* it, I do, in fact, want Haswell-E. This machine will be doing a decent amount of video editing and encoding in addition to gaming.

And the reason why I was waiting for ASUS's ROG mATX X99 is that in my recent experience, the ROG mATX boards have pretty much been the best combination of quality components, high-end features, great overclocking capability, and a better BIOS/UEFI than Gigabyte/ASRock/MSI. Now, that's not to say that I literally would not consider any other option, but their track record meant that I definitely wanted to see what they had on offer first. And really, there are very limited options for mATX X99 right now, with the first two on the market (ASRock and eVGA) both apparently having some quirks and issues. Gigabyte just announced a new one, and it may end up being my best option, but it's not widely available yet.

BurritoJustice
Oct 9, 2012

GokieKS posted:

While I don't necessarily *need* it, I do, in fact, want Haswell-E. This machine will be doing a decent amount of video editing and encoding in addition to gaming.

And the reason why I was waiting for ASUS's ROG mATX X99 is that in my recent experience, the ROG mATX boards have pretty much been the best combination of quality components, high-end features, great overclocking capability, and a better BIOS/UEFI than Gigabyte/ASRock/MSI. Now, that's not to say that I literally would not consider any other option, but their track record meant that I definitely wanted to see what they had on offer first. And really, there are very limited options for mATX X99 right now, with the first two on the market (ASRock and eVGA) both apparently having some quirks and issues. Gigabyte just announced a new one, and it may end up being my best option, but it's not widely available yet.

Killer NIC on that one, forget it.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE
On the dragonage issue:

From what I'm reading it looks like the dual core issues might be a bug rather than intended behaviour. People are having success by running the game in windows 7 compatibility mode.

GokieKS
Dec 15, 2012

Mostly Harmless.

The Lord Bude posted:

I'm curious as to what parallel universe you imagine yourself living in where it makes even the slightest bit of sense to buy Haswell-E for your home gaming computer? Haswell-E is for workstations. Games will not run any better - a 5820K may well run them worse than a 4690K because each core is clocked lower. Even a 4790K is a complete waste of money. Just buy a 4690K, and sell the combo if you want to fund a Maximus Gene.

Can you just confirm for me though GokieKS that you've bought Inquisition and you've demonstrated conclusively that it won't run if you only have a dual core?

It's not a gaming-only PC. I am well aware of what Haswell-E/X99 is and isn't, and I do have a legitimate reason to choose that. It's the same reason that I would definitely go for the 4790K over the 4690K if I just settle for Haswell/Z97.

And yes, I can confirm that I bought DA:I, installed it on my current (temporary) build (G3258 + MSI Z97 + GTX 780), and that it will not load past the EA/BioWare logo video and "don't close while game is saving" message. A lot of other users are experiencing the same thing, and many (though not all) of them have fewer than the 4 cores the minimum system requirements list.

Sidesaddle Cavalry posted:

It's also important to keep in mind that Haswell-E requires another large investment in DDR4 memory just to get started. It's not a good buy right now at all for anything.

I am well aware. That was another reason why I was OK with waiting on the Rampage Gene V - I figured DDR4 prices will probably drop a bit too.

BurritoJustice posted:

Killer NIC on that one, forget it.

poo poo, you're right, I missed that. And yes, that completely removes it from consideration - I don't have many hard and fast rules when it comes to PCs that I build, but using an Intel NIC is one of them.

The Lord Bude posted:

On the dragonage issue:

From what I'm reading it looks like the dual core issues might be a bug rather than intended behaviour. People are having success by running the game in windows 7 compatibility mode.

It would be nice if that were the case, though since they explicitly listed a quad-core CPU as a system requirement, it's hard to say. As for W7 compatibility mode, that didn't make any difference for me.

GokieKS fucked around with this message at 06:44 on Nov 19, 2014

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

GokieKS posted:

It's not a gaming-only PC. I am well aware of what Haswell-E/X99 is and isn't, and I do have a legitimate reason to choose that. It's the same reason that I would definitely go for the 4790K over the 4690K if I just settle for Haswell/Z97.

And yes, I can confirm that I bought DA:I, installed it on my current (temporary) build (G3258 + MSI Z97 + GTX 780), and that it will not load past the EA/BioWare logo video and "don't close while game is saving" message. A lot of other users are experiencing the same thing, and many (though not all) of them have fewer than the 4 cores the minimum system requirements list.


I am well aware. That was another reason why I was OK with waiting on the Rampage Gene V - I figured DDR4 prices will probably drop a bit too.


poo poo, you're right, I missed that. And yes, that completely removes it from consideration - I don't have many hard and fast rules when it comes to PCs that I build, but using an Intel NIC is one of them.

Are you running windows 8.1? If so have you tried running the game in windows 7 compatibility mode? I've seen reports that that can fix the issue. Beyond that I guess you'll have to wait for a patch. I've long since learned not to play big ambitious RPGs on release day.

GokieKS
Dec 15, 2012

Mostly Harmless.
Yeah, I'm on 8.1. And yep, tried W7 compatibility mode. Didn't make any difference.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

GokieKS posted:

Yeah, I'm on 8.1. And yep, tried W7 compatibility mode. Didn't make any difference.

Bugger. Here's hoping they patch all the bugs in the next couple of days.

VorpalFish
Mar 22, 2007
reasonably awesometm

Factory Factory posted:

What, really? I've seen reports of people having huge problems, but also people successfully playing the game on e.g. mobile i5s, which are dual core.

The mobile i5s will present as 4 logical cores because of hyperthreading. Maybe that's the difference.

movax
Aug 30, 2008

I've got a problem that I'm pretty sure I'm over-complicating because of my background. I have a system that operates on a 10Hz SYNC signal that is distributed throughout the system and to various nodes. I have an x86 box running Linux that acts as a tester and needs to consume the 10Hz SYNC as interrupts to Linux to synchronize timestamps, etc.

I'm so broken that my easiest solution is to throw in a PCIe FPGA devkit and have it issue MSIs at a 10Hz rate to the kernel, since that's pretty simple. The machines are new enough that any legacy I/O doesn't even exist on the mobo as a header from the SuperIO, it's PCIe add-in card or bust. Am I forgetting any other braindead simple ways to wire a signal to the 8259-esque interrupt controller in the PCH?

I think most USB devices that expose GPIOs would have to poll on the interface. I was also entertaining the thought of having the 10Hz signal cycle the SMBus ALERT pin.

e: asking here because this is sort of generic x86 chat and there are plenty of Intel lurkers

No Gravitas
Jun 12, 2013

by FactsAreUseless

movax posted:

I've got a problem that I'm pretty sure I'm over-complicating because of my background. I have a system that operates on a 10Hz SYNC signal that is distributed throughout the system and to various nodes. I have an x86 box running Linux that acts as a tester and needs to consume the 10Hz SYNC as interrupts to Linux to synchronize timestamps, etc.

I'm so broken that my easiest solution is to throw in a PCIe FPGA devkit and have it issue MSIs at a 10Hz rate to the kernel, since that's pretty simple. The machines are new enough that any legacy I/O doesn't even exist on the mobo as a header from the SuperIO, it's PCIe add-in card or bust. Am I forgetting any other braindead simple ways to wire a signal to the 8259-esque interrupt controller in the PCH?

I think most USB devices that expose GPIOs would have to poll on the interface. I was also entertaining the thought of having the 10Hz signal cycle the SMBus ALERT pin.

e: asking here because this is sort of generic x86 chat and there are plenty of Intel lurkers

Maybe I'm an idiot, but if you have a USB device that can tell you how many SYNCs hit since your last check and that can tell you how far back in time each of those was... then you can figure out the timestamps x86 side? Maybe?

EDIT for clarity: Any microcontroller with a USB-UART should do this.

Ugh. I am an idiot, probably best to ignore me, but I just could not refuse a stab at this riddle.

No Gravitas fucked around with this message at 05:32 on Nov 20, 2014

JawnV6
Jul 4, 2004

So hot ...

movax posted:

I think most USB devices that expose GPIOs would have to poll on the interface. I was also entertaining the thought of having the 10Hz signal cycle the SMBus ALERT pin.

e: asking here because this is sort of generic x86 chat and there are plenty of Intel lurkers

Arduino/atmega consuming the edge and sending it up the ftdi/serial? Shouldn't add too much latency.

Ninja Rope
Oct 22, 2005

Wee.
Is this a proto-PTP type thing?

movax
Aug 30, 2008

JawnV6 posted:

Arduino/atmega consuming the edge and sending it up the ftdi/serial? Shouldn't add too much latency.

That's kind of what I was thinking, but I don't have a really good understanding at the moment about what FTDI devices do with Linux driver-wise (just the generic virtual COM port driver I assume) -- I could maybe just hook SYNC up to the Rx line, or maybe put the FTDI part in MPSSE mode. Wouldn't user space still have to poll on that tty though?

Ninja Rope: yeah, it is PTP like, system has a custom bus that ties together a few dozen processors that can perform certain critical operations synchronously.

movax fucked around with this message at 08:16 on Nov 20, 2014

fat bossy gerbil
Jul 1, 2007

I've got an older Ivy Bridge Celeron G1610 on an MSI B75MA-E33 that I purchased when it was the new Celeron on the block, figuring I'd put in a better CPU somewhere down the line. Never did, and now I don't really feel like spending money on a new old processor for this old board, but I'm also not in the market for a new rig until later next year whenever Skylake hits. In the meantime, bumping this old Celeron up to 3.0GHz to squeeze a little more life out of her should be doable on the stock cooler, right?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

The French Army! posted:

I've got an older Ivy Bridge Celeron G1610 on an MSI B75MA-E33 that I purchased when it was the new Celeron on the block, figuring I'd put in a better CPU somewhere down the line. Never did, and now I don't really feel like spending money on a new old processor for this old board, but I'm also not in the market for a new rig until later next year whenever Skylake hits. In the meantime, bumping this old Celeron up to 3.0GHz to squeeze a little more life out of her should be doable on the stock cooler, right?

Sure! Plenty of thermal headroom. Except good luck overclocking a locked-multiplier CPU on a platform that shits itself if the reference clock varies by more than a few percent off 100 MHz.

robostac
Sep 23, 2009

movax posted:

That's kind of what I was thinking, but I don't have a really good understanding at the moment about what FTDI devices do with Linux driver-wise (just the generic virtual COM port driver I assume) -- I could maybe just hook SYNC up to the Rx line, or maybe put the FTDI part in MPSSE mode. Wouldn't user space still have to poll on that tty though?

Ninja Rope: yeah, it is PTP like, system has a custom bus that ties together a few dozen processors that can perform certain critical operations synchronously.

FTDI doesn't use the generic USB serial drivers. On Windows it's a driver that implements their direct USB mode (D2XX) and also creates a virtual serial port (it shows up as two different devices). It also defaults to only updating every 16ms, which you can change in the .inf file for the driver. I've not looked into how this works on Linux though.


JawnV6
Jul 4, 2004

So hot ...

movax posted:

Wouldn't user space still have to poll on that tty though?

Yeah, true. Still, you're essentially counting off 100ms chunks, so even with a poll interval of 16ms you're not too far off? 100ms seems like eons, how much jitter can you tolerate there?

C# has spoiled me, I have nice DataReceived events that act enough like interrupts that I wasn't thinking about the USB device not having that capability.
