|
Professor Science posted:it's about as stupid as running unmodified MPI jobs on it, so whatever. (which is to say, it isn't smart, but it could be a *lot* more idiotic)
|
# ? Nov 4, 2014 07:05 |
|
Mr Chips posted:Isn't the point of the Phi that it can appear as an MPI cluster, so all you need to do is recompile your existing MPI apps to run on it? Or have I missed something fundamental? I understand that if the jobs don't fit in the card's memory, it'll struggle to shine.
|
# ? Nov 4, 2014 07:50 |
|
I get why that would be an issue for pthreads/OpenMP stuff, but surely the communication latency at least wouldn't be any worse than a typical MPI job running over TCP on 1/10 gig ethernet? It looks like for some of the stuff I'm interested in (e.g. RAxML), the Phi 5110P gets about 2x the performance of a dual-socket Xeon E5-2680 with minimal code modifications. It seems like there is a point with these tasks where 60 crappy 1GHz cores do deliver better overall performance than 16 2.7GHz ones with AVX, SSE4, etc. Mr Chips fucked around with this message at 10:02 on Nov 4, 2014 |
# ? Nov 4, 2014 10:00 |
|
So I was looking around for some updated information on the Skylake situation and came across this article: http://www.kitguru.net/components/cpu/anton-shilov/intel-speeds-up-introduction-of-unlocked-core-i5i7-broadwell-processors/ If this is to be believed, the good news is that it's scheduled for a Q2 launch. The bad news is that they might only release locked, non-K versions at the time, while keeping Broadwell around. quote:...since Intel has no plans to release Skylake processors with unlocked multiplier in Q2 2015 The slide snap looks legit enough, but has anyone come across anything else to support or refute this?
|
# ? Nov 10, 2014 01:41 |
|
No, but it jibes with rumors that K-version Skylake won't be on sale until Q3-Q4 '15. Irritating, but maybe by then faster DDR4 won't be so $rape$ price-wise.
|
# ? Nov 10, 2014 01:51 |
|
In general, how good are those haswell-based pentiums when compared to an i3? Trying to put together a nice, small form-factor build for my GF's mom who will mostly use it for Quickbooks and email.
|
# ? Nov 10, 2014 23:56 |
|
chocolateTHUNDER posted:In general, how good are those haswell-based pentiums when compared to an i3? Trying to put together a nice, small form-factor build for my GF's mom who will mostly use it for Quickbooks and email. Quickbooks and email will never see the difference between a Pentium and i3.
|
# ? Nov 11, 2014 00:40 |
|
chocolateTHUNDER posted:In general, how good are those haswell-based pentiums when compared to an i3? Trying to put together a nice, small form-factor build for my GF's mom who will mostly use it for Quickbooks and email. A G3258 would be fine for desktop use. If you're trying to impress her, save the difference from the i3 and put it into a SSD.
|
# ? Nov 11, 2014 00:41 |
|
mayodreams posted:Quickbooks and email will never see the difference between a Pentium and i3. Figured, just wanted to be sure. cisco privilege posted:A G3258 would be fine for desktop use. If you're trying to impress her, save the difference from the i3 and put it into a SSD. This was the plan, glad to see you guys verified it! Thanks!
|
# ? Nov 11, 2014 04:57 |
|
cisco privilege posted:A G3258 would be fine for desktop use. If you're trying to impress her, save the difference from the i3 and put it into a SSD. Wouldn't an i3 still be better for general multitasking and the like?
|
# ? Nov 11, 2014 07:00 |
|
Hace posted:Wouldn't an i3 still be better for general multitasking and the like? Yes, the same way an i7 would be over an i5 - anything but the most intensive use or specific tasks will never show the difference.
|
# ? Nov 11, 2014 07:46 |
|
A very long time ago (~Gulftown) I found a doc on Intel's site that explicitly said what the new max turbo frequency was for a processor as you disabled cores. I've searched Intel's site for about an hour now and can't find a similar doc. Does anyone know where they hide this information now?
|
# ? Nov 18, 2014 01:18 |
|
Crossposting from the Parts Picker thread, Microcenter has the 4690k on sale for $179 right now. http://www.microcenter.com/product/434177/Core_i5-4690K_35GHz_LGA_1150_Boxed_Processor
|
# ? Nov 18, 2014 02:01 |
|
Chuu posted:A very long time ago (~Gulftown) I found a doc on Intel's site that explicitly said what the new max turbo frequency was for a processor as you disabled cores. I've searched Intel's site for about an hour now and can't find a similar doc. Does anyone know where they hide this information now? You can't do this anymore because there's no longer a simple turbo frequency table of "1 core = 3.9GHz, 2 core = 3.6GHz, 3+ core = 3.4GHz." Turbo and GPU Turbo look at the total thermal output/temperature, and power usage of the entire die (GPU and CPU combined, how much of each is being used, and how many cores in use at what levels) and choose what to turbo and how much to turbo based on that. It's not simple. LiquidRain fucked around with this message at 06:07 on Nov 18, 2014 |
# ? Nov 18, 2014 02:11 |
|
LiquidRain posted:You can't do this anymore because there's no longer a simple turbo frequency table of "1 core = 3.9GHz, 2 core = 3.6GHz, 3+ core = 3.4GHz." Turbo and GPU Turbo look at the total thermal output/temperature, and power usage of the entire die (GPU and CPU combined, how much of each is being used, and how many cores in use at what levels) and choose what to turbo and how much to turbo based on that. It's not simple. Thanks for the info. Do you know when this changed?
|
# ? Nov 18, 2014 04:09 |
|
Chuu posted:Thanks for the info. Do you know when this changed? LiquidRain is wrong, the turbo frequency table you're interested in is still a thing. Here it is: http://www.intel.com/support/processors/corei7/sb/CS-032279.htm The stuff LiquidRain mentioned about turbo control being based on lots of sensor data is true, but this isn't a new development. The table's meaning has always been "with N cores active the cores will run somewhere between BaseFreq and FreqN, depending on conditions".
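If that page ever moves again, the same numbers can be pulled straight off the CPU: MSR_TURBO_RATIO_LIMIT (0x1AD) encodes one max turbo ratio per active-core count. Here's a minimal decode sketch, assuming the Sandy/Ivy Bridge byte layout (byte N-1 = max ratio with N cores active) and a 100 MHz bclk; the raw example value is made up:

```python
# Decode MSR_TURBO_RATIO_LIMIT (0x1AD) into per-active-core-count turbo
# frequencies. Assumes the Sandy/Ivy Bridge layout: byte N-1 holds the
# maximum turbo ratio with N cores active, in units of the 100 MHz bclk.

def decode_turbo_ratios(msr_value, cores=4, bclk_mhz=100):
    """Return {active_core_count: max_turbo_MHz}."""
    freqs = {}
    for n in range(1, cores + 1):
        ratio = (msr_value >> (8 * (n - 1))) & 0xFF
        freqs[n] = ratio * bclk_mhz
    return freqs

# Made-up raw value for an i5-like part: 1C=3.9, 2C=3.8, 3C=3.7, 4C=3.6 GHz
raw = 0x24252627
print(decode_turbo_ratios(raw))  # -> {1: 3900, 2: 3800, 3: 3700, 4: 3600}
```

On Linux you'd get the raw value with msr-tools (`sudo rdmsr 0x1AD`); the actual frequency you see under load still depends on the thermal/power stuff LiquidRain described.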
|
# ? Nov 18, 2014 06:04 |
|
Ugh, seems like Dragon Age: Inquisition just flat out refuses to work on a dual-core machine regardless of how fast those cores may be. So my plan of just using this G3258 to tide me over until ASUS releases a mATX X99 board (which doesn't seem like it's going to happen until next year at the earliest) is coming back to bite me. Debating if I want to just pick up a 4690K for $170 and plop it into this (kinda crappy) MSI Z97 board (that came with the G3258 in the $100 bundle) and continue waiting, forget Haswell-E/X99 and just get a 4790K + ASUS Maximus Gene VII, or just get the 5820K now and settle for a "cheap" X99 board. I guess I could also get the Rampage V Extreme, but I really don't want a full ATX motherboard... or to spend $500 on one.
|
# ? Nov 19, 2014 06:12 |
|
GokieKS posted:Ugh, seems like Dragon Age: Inquisition just flat out refuses to work on a dual-core machine regardless of how fast those cores may be. From what little I've looked at, it seems something like DRM takes up an entire core. Ugh. You don't want X99, Haswell-E or (for now) a 5XXX CPU. Those are for people who think big model numbers/big price = better. For games they are worse than Z97. Don't fall for lovely misleading Intel marketing.
|
# ? Nov 19, 2014 06:18 |
|
GokieKS posted:Ugh, seems like Dragon Age: Inquisition just flat out refuses to work on a dual-core machine regardless of how fast those cores may be. So my plan of just using this G3258 to tide me over until ASUS releases a mATX X99 board (which doesn't seem like it's going to happen until next year at the earliest) is coming back to bite me. Even if you are dead set on X99 (no reason to be for gaming), why not just get an X99 mATX board from any other manufacturer? Bonus points because you won't have to pay the ASUS tax. Don't buy a Rampage V Extreme, there is no justification for that ever. The logical solution is to just grab a 4690K if you need an upgrade, it will swap right in and be absolutely enough for every game out. edit: Dunno what Gravitas is saying though. 5xxx CPUs aren't worse for gaming than 4xxx, and it certainly isn't Intel being misleading. The base clocks are lower, sure, but even an average 5960x will hit 4.5GHz with proper cooling. The extra cores might not help in most games, but they won't hinder and you'll hit the same clocks as a 4790k within margins of error. BurritoJustice fucked around with this message at 06:20 on Nov 19, 2014 |
# ? Nov 19, 2014 06:18 |
|
GokieKS posted:Ugh, seems like Dragon Age: Inquisition just flat out refuses to work on a dual-core machine regardless of how fast those cores may be. So my plan of just using this G3258 to tide me over until ASUS releases a mATX X99 board (which doesn't seem like it's going to happen until next year at the earliest) is coming back to bite me. I'm curious as to what parallel universe you imagine yourself living in where it makes even the slightest bit of sense to buy Haswell-E for your home gaming computer? Haswell-E is for workstations. Games will not run any better - a 5820K may well run them worse than a 4690K because each core is clocked lower. Even a 4790K is a complete waste of money. Just buy a 4690K, and sell the combo if you want to fund a Maximus Gene. Can you just confirm for me though GokieKS that you've bought Inquisition and you've demonstrated conclusively that it won't run if you only have a dual core? The Lord Bude fucked around with this message at 06:24 on Nov 19, 2014 |
# ? Nov 19, 2014 06:22 |
|
GokieKS posted:Ugh, seems like Dragon Age: Inquisition just flat out refuses to work on a dual-core machine regardless of how fast those cores may be. What, really? I've seen reports of people having huge problems, but also people successfully playing the game on e.g. mobile i5s, which are dual core.
|
# ? Nov 19, 2014 06:24 |
|
Factory Factory posted:What, really? I've seen reports of people having huge problems, but also people successfully playing the game on e.g. mobile i5s, which are dual core. If it's true we need to put a notice in the partspicking thread. Lots of people are going to be building PCs for this game.
|
# ? Nov 19, 2014 06:25 |
|
It's also important to keep in mind that Haswell-E requires another large investment in DDR4 memory just to get started. It's not a good buy right now at all for anything.
|
# ? Nov 19, 2014 06:27 |
|
While I don't necessarily *need* it, I do, in fact, want Haswell-E. This machine will be doing a decent amount of video editing and encoding in addition to gaming. And the reason why I was waiting for ASUS's ROG mATX X99 is that in my recent experience, the ROG mATX boards have pretty much been the best combination of quality components, high-end features, great overclocking capability, and also a better BIOS/UEFI than Gigabyte/ASRock/MSI. Now, that's not to say that I literally would not consider any other option, but their track record meant that I definitely wanted to see what they had on offer first. And really, there are very limited options for mATX X99 right now, with the first two on the market (ASRock and eVGA) both apparently having some quirks and issues. Gigabyte just announced a new one, and it may end up being my best option, but it's not widely available yet.
|
# ? Nov 19, 2014 06:28 |
|
GokieKS posted:While I don't necessarily *need* it, I do, in fact, want Haswell-E. This machine will be doing a decent amount of video editing and encoding in addition to gaming. Killer NIC on that one, forget it.
|
# ? Nov 19, 2014 06:32 |
|
On the dragonage issue: From what I'm reading it looks like the dual core issues might be a bug rather than intended behaviour. People are having success by running the game in windows 7 compatibility mode.
|
# ? Nov 19, 2014 06:32 |
|
The Lord Bude posted:I'm curious as to what parallel universe you imagine yourself living in where it makes even the slightest bit of sense to buy Haswell-E for your home gaming computer? Haswell-E is for workstations. Games will not run any better - a 5820K may well run them worse than a 4690K because each core is clocked lower. Even a 4790K is a complete waste of money. Just buy a 4690K, and sell the combo if you want to fund a Maximus Gene. It's not a gaming-only PC. I am well aware of what Haswell-E/X99 is and isn't, and I do have a legitimate reason to choose that. It's the same reason that I would definitively go for the 4790K over the 4690K if I just settle for Haswell/Z97. And yes, I can confirm that I bought DA:I, installed it on my current (temporary) build (G3258 + MSI Z97 + GTX 780), and that it will not load past the EA/BioWare logo video and "don't close while game is saving" message. A lot of other users are experiencing the same thing, and many (though not all) of them have less than the 4 cores the minimum system requirements lists. Sidesaddle Cavalry posted:It's also important to keep in mind that Haswell-E requires another large investment in DDR4 memory just to get started. It's not a good buy right now at all for anything. I am well aware. That was another reason why I was OK with waiting on the Rampage Gene V - I figured DDR4 prices will probably drop a bit too. BurritoJustice posted:Killer NIC on that one, forget it. poo poo, you're right, I missed that. And yes, that completely removes it from consideration - I don't have many hard and fast rules when it comes to PCs that I build, but using an Intel NIC is one of them. The Lord Bude posted:On the dragonage issue: It would be nice if that was the case, though since they explicitly listed quad-core CPU as a system requirement, it's hard to say. As for W7 compatibility mode, that didn't make any difference for me. GokieKS fucked around with this message at 06:44 on Nov 19, 2014 |
# ? Nov 19, 2014 06:37 |
|
GokieKS posted:It's not a gaming-only PC. I am well aware of what Haswell-E/X99 is and isn't, and I do have a legitimate reason to choose that. It's the same reason that I would definitively go for the 4790K over the 4690K if I just settle for Haswell/Z97. Are you running windows 8.1? If so have you tried running the game in windows 7 compatibility mode? I've seen reports that that can fix the issue. Beyond that I guess you'll have to wait for a patch. I've long since learned not to play big ambitious RPGs on release day.
|
# ? Nov 19, 2014 06:41 |
|
Yeah, I'm on 8.1. And yep, tried W7 compatibility mode. Didn't make any difference.
|
# ? Nov 19, 2014 06:44 |
|
GokieKS posted:Yeah, I'm on 8.1 And yep, tried W7 compatibility mode. Didn't make any difference. Bugger. Here's hoping they patch all the bugs in the next couple of days.
|
# ? Nov 19, 2014 06:46 |
|
Factory Factory posted:What, really? I've seen reports of people having huge problems, but also people successfully playing the game on e.g. mobile i5s, which are dual core. The mobile i5s will present as 4 logical cores because of hyperthreading. Maybe that's the difference.
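The 2C/4T distinction is visible in /proc/cpuinfo, which is presumably what a naive core-count check would read. A quick sketch that counts logical vs. physical cores from that text; the sample data below is made up to resemble a dual-core hyperthreaded mobile i5, not pulled from a real machine:

```python
# Count logical vs. physical cores by pairing (physical id, core id)
# entries from /proc/cpuinfo. With hyperthreading, two "processor"
# entries share the same (physical id, core id) pair.

SAMPLE_CPUINFO = """\
processor : 0
physical id : 0
core id : 0

processor : 1
physical id : 0
core id : 1

processor : 2
physical id : 0
core id : 0

processor : 3
physical id : 0
core id : 1
"""

def count_cores(cpuinfo_text):
    logical, physical = 0, set()
    phys_id = None
    for line in cpuinfo_text.splitlines():
        key, _, value = (part.strip() for part in line.partition(":"))
        if key == "processor":
            logical += 1
        elif key == "physical id":
            phys_id = value
        elif key == "core id":
            physical.add((phys_id, value))
    return logical, len(physical)

print(count_cores(SAMPLE_CPUINFO))  # -> (4, 2): 4 logical, 2 physical
```

If the game only checks the logical count, a 2C/4T mobile i5 passes while a G3258 (2C/2T) fails, which would fit the reports.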
|
# ? Nov 19, 2014 21:29 |
|
I've got a problem that I'm pretty sure I'm over-complicating because of my background. I have a system that operates on a 10Hz SYNC signal that is distributed throughout the system and to various nodes. I have an x86 box running Linux that acts as a tester that needs to consume the 10Hz SYNC as interrupts to Linux to synchronize timestamps, etc. I'm so broken that my easiest solution is to throw in a PCIe FPGA devkit and have it issue MSIs at a 10Hz rate to the kernel, since that's pretty simple. The machines are new enough that any legacy I/O doesn't even exist on the mobo as a header from the SuperIO; it's PCIe add-in card or bust. Am I forgetting any other braindead simple ways to wire a signal to the 8259-esque interrupt controller in the PCH? I think most USB devices that expose GPIOs would have to poll on the interface. I was also entertaining the thought of having the 10Hz signal cycle the SMBus ALERT pin. e: asking here because this is sort of generic x86 chat and there are plenty of Intel lurkers
|
# ? Nov 20, 2014 04:17 |
|
movax posted:I've got a problem that I'm pretty sure I'm over-complicating because of my background. I have a system that operates on a 10Hz SYNC signal that is distributed throughout the system and to various nodes. I have a x86 box running Linux that acts as a tester that needs to consume the 10Hz SYNC as interrupts to Linux to synchronize timestamps, etc. Maybe I'm an idiot, but if you have a USB device that can tell you how many SYNCs hit since your last check and that can tell you how far back in time each of those was... then you can figure out the timestamps x86 side? Maybe? EDIT for clarity: Any microcontroller with a USB-UART should do this. Ugh. I am an idiot, probably best to ignore me, but I just could not refuse a stab at this riddle. No Gravitas fucked around with this message at 05:32 on Nov 20, 2014 |
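For what it's worth, the back-computation No Gravitas describes is just subtraction once the device reports edge ages. A toy sketch, with a message format made up for illustration (link latency assumed negligible/constant and ignored):

```python
# Sketch of the "report edges after the fact" idea: the microcontroller
# sends (count, [age_us, ...]) where each age is how long before the
# message was sent that a SYNC edge occurred. The host timestamps the
# message on arrival and subtracts the ages to recover edge times in
# its own clock domain.

def recover_edge_times(host_rx_time_us, edge_ages_us):
    """Reconstruct host-clock timestamps for each reported SYNC edge."""
    return [host_rx_time_us - age for age in edge_ages_us]

# Host receives a report at t=1_000_000 us saying two edges fired,
# 150_000 us and 50_000 us before the report was sent (10 Hz spacing).
print(recover_edge_times(1_000_000, [150_000, 50_000]))
# -> [850000, 950000]
```

The accuracy then hinges on how well the micro measures the ages and how stable the USB/serial latency is, not on how fast the host polls.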
# ? Nov 20, 2014 05:29 |
|
movax posted:I think most USB devices that expose GPIOs would have to poll on the interface. I was also entertaining the thought of having the 10Hz signal cycle the SMBus ALERT pin. Arduino/atmega consuming the edge and sending it up the ftdi/serial? Shouldn't add too much latency.
|
# ? Nov 20, 2014 05:36 |
|
Is this a proto-PTP type thing?
|
# ? Nov 20, 2014 06:55 |
|
JawnV6 posted:Arduino/atmega consuming the edge and sending it up the ftdi/serial? Shouldn't add too much latency. That's kind of what I was thinking, but I don't have a really good understanding at the moment about what FTDI devices do with Linux driver-wise (just the generic virtual COM port driver I assume) -- I could maybe just hook SYNC up to the Rx line, or maybe put the FTDI part in MPSSE mode. Wouldn't user space still have to poll on that tty though? Ninja Rope: yeah, it is PTP like, system has a custom bus that ties together a few dozen processors that can perform certain critical operations synchronously. movax fucked around with this message at 08:16 on Nov 20, 2014 |
# ? Nov 20, 2014 08:14 |
|
I've got an older Ivy Bridge Celeron G1610 on an MSI B75MA-E33 that I purchased when it was the new Celeron on the block, figuring I'd put in a better CPU somewhere down the line. Never did, and now I don't really feel like spending money on a new old processor for this old board, but I'm also not in the market for a new rig until later next year whenever Skylake hits. In the meantime, bumping this old Celeron up to 3.0GHz to squeeze a little more life out of her should be doable on the stock cooler, right?
|
# ? Nov 20, 2014 09:09 |
|
The French Army! posted:I've got an older Ivy Bridge Celeron G1610 on an MSI B75MA-E33 that I purchased when it was the new Celeron on the bock, figuring I'd put in a better CPU somewhere down the line. Never did, and now I don't really feel like spending money on a new old processor for this old board, but I'm also not in the market for a new rig until later next year whenever Skylake hits. In the mean time bumping this old Celeron up to 3.0GHz to squeeze a little more life out of her should be doable on the stock cooler right? Sure! Plenty of thermal headroom. Except good luck overclocking a locked-multiplier CPU on a platform that shits itself if the reference clock varies by more than a few percent off 100 MHz.
|
# ? Nov 20, 2014 10:00 |
|
movax posted:That's kind of what I was thinking, but I don't have a really good understanding at the moment about what FTDI devices do with Linux driver-wise (just the generic virtual COM port driver I assume) -- I could maybe just hook SYNC up to the Rx line, or maybe put the FTDI part in MPSSE mode. Wouldn't user space still have to poll on that tty though? FTDI doesn't use the generic USB serial drivers. On windows it's a driver that implements their direct USB mode (d2xx) which creates a virtual serial port (it shows up as two different devices). It also defaults to only update every 16ms which you can change in the inf file for the driver. I've not looked into how this works on linux though.
|
# ? Nov 20, 2014 12:55 |
|
movax posted:Wouldn't user space still have to poll on that tty though? Yeah, true. Still, you're essentially counting off 100ms chunks, so even with a poll interval of 16ms you're not too far off? 100ms seems like eons, how much jitter can you tolerate there? C# has spoiled me, I have nice DataReceived events that act enough like interrupts that I wasn't thinking about the USB device not having that capability.
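On the "does userspace have to poll the tty" question: not in the busy-loop sense. A blocking read() (or select/poll) sleeps in the kernel until the driver delivers bytes, and you timestamp on wakeup; the cost is wakeup latency plus the USB latency timer, not CPU spin. A self-contained sketch of the idea, with a pipe standing in for the FTDI tty (the sysfs latency_timer path mentioned in the comment is the ftdi_sio one and is an assumption about your setup):

```python
import os, time, threading

# Userspace doesn't busy-poll a tty: a blocking read() sleeps until
# bytes arrive, and we timestamp with CLOCK_MONOTONIC on wakeup. A pipe
# simulates /dev/ttyUSB0 so the sketch runs anywhere. With a real
# ftdi_sio device you'd also want to drop the 16 ms latency timer, e.g.
# via /sys/bus/usb-serial/devices/ttyUSB0/latency_timer (path assumed).

rd, wr = os.pipe()

def fake_sync_source():
    for _ in range(3):
        time.sleep(0.1)            # 10 Hz SYNC edges
        os.write(wr, b"S")         # one byte per edge
    os.close(wr)

threading.Thread(target=fake_sync_source, daemon=True).start()

stamps = []
while True:
    b = os.read(rd, 1)             # blocks; no CPU spin
    if not b:                      # EOF when the writer closes
        break
    stamps.append(time.clock_gettime(time.CLOCK_MONOTONIC))

deltas = [b - a for a, b in zip(stamps, stamps[1:])]
print([round(d, 2) for d in deltas])   # spaced roughly 0.1 s apart
```

At 10 Hz, even a few ms of wakeup plus latency-timer jitter is small relative to the 100 ms period, so whether that's good enough comes down to how much jitter the timestamping can tolerate.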
|
# ? Nov 20, 2014 17:33 |