|
go3 posted:You're not going to convince IT guy since he is a cast-iron idiot. If you want change, convince the people above/around him. "My lovely laptop I got for $300 at walmart runs our software twice as fast as the machines you just finished building for us, why is that? Were they even cheaper than $300?"
|
# ? Sep 18, 2014 06:14 |
|
|
# ? May 13, 2024 10:46 |
|
Aquila posted:Probably never. Intel is very bad about actually bringing enterprise ssd's to market, and demand for the P3600 and P3700 is phenomenal. KHAAAAAAN! I really, really want those drat drives when I refresh our server offerings. Unless anyone else can think of a good PCIe SSD for write-heavy(ish) loads <$1.50/GB. KillHour fucked around with this message at 07:48 on Sep 18, 2014 |
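For anyone screening candidates against that $/GB ceiling, the arithmetic is trivial to script. A minimal sketch in Python; the drive names are from the post above, but the capacities and prices here are placeholders, not real quotes:

```python
# Hypothetical shortlist filter for PCIe SSDs under a $/GB ceiling.
# Capacities and prices below are illustrative placeholders, not quotes.
drives = [
    ("Intel P3600",      1600, 2100.00),  # (name, capacity_GB, price_USD)
    ("Intel P3700",       800, 1600.00),
    ("Generic PCIe SSD", 1200, 1500.00),
]

CEILING = 1.50  # dollars per GB

for name, gb, price in drives:
    per_gb = price / gb
    verdict = "ok" if per_gb < CEILING else "too expensive"
    print(f"{name}: ${per_gb:.2f}/GB -> {verdict}")
```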
# ? Sep 18, 2014 07:32 |
|
I seem to remember Intel teasing a feature where they could basically sleep a computer, and then have it instantly wake up to (say) respond to an attempted connection. This was different from a typical Wake on Lan. Anyone remember what I'm talking about? Is that feature in any of the NUC PCs?
|
# ? Sep 19, 2014 21:11 |
|
Paul MaudDib posted:I seem to remember Intel teasing a feature where they could basically sleep a computer, and then have it instantly wake up to (say) respond to an attempted connection. This was different from a typical Wake on Lan. Connected Standby?
|
# ? Sep 19, 2014 21:26 |
|
Not quite, but this let me dig up what I was talking about. quote:Ready Mode essentially allows a 4th Gen Core processor (and presumably newer chips, which are slated to arrive later in the year) to enter a low C7 power state, while the OS and other system components remain connected and ready for action. Intel demoed the technology and, along with compatible third-party applications and utilities, showed how Ready Mode can allow a mobile device to automatically sync to a PC to download and store photos. The PC could also remain in a low power state and stream media, serve up files remotely, or receive VOIP calls. Ready Mode will be supported by a number of Intel's system partners in a variety of all-in-one type systems a little later in the year. More or less the same concept, but aimed at the desktop market instead of Atom processors and Metro apps. I want this on a NUC or something similar as a low-power server. Doesn't look like it's available yet, though. Paul MaudDib fucked around with this message at 21:59 on Sep 19, 2014
# ? Sep 19, 2014 21:57 |
|
C7 is disabled by default, because only the most recent PSUs support it.
|
# ? Sep 19, 2014 22:16 |
|
Does HDMI 2.0 support just require a board redesign, or does it require CPU support as well? Is it possible we could get some Haswell boards with HDMI 2.0?
|
# ? Sep 19, 2014 22:51 |
|
It requires a board redesign because of the huge clock speed increase, to 600 MHz from 340 MHz. Haswell and Broadwell support HDMI 1.4, not 2.0, and that's that.
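For the curious, that clock bump maps directly onto link bandwidth: an HDMI link has three TMDS data channels, each carrying 10 bits per clock because of 8b/10b encoding. A quick back-of-the-envelope in Python:

```python
# TMDS link bandwidth from the maximum clock: 3 data channels,
# 10 bits per channel per clock (8b/10b encoding, 80% efficient).
def tmds_gbps(clock_mhz, channels=3, bits_per_clock=10):
    return clock_mhz * channels * bits_per_clock / 1000  # Gbps

for version, clock_mhz in [("HDMI 1.4", 340), ("HDMI 2.0", 600)]:
    raw = tmds_gbps(clock_mhz)
    effective = raw * 8 / 10  # strip the 8b/10b coding overhead
    print(f"{version}: {raw:.1f} Gbps raw, {effective:.2f} Gbps effective")
```

That 14.4 Gbps effective figure is what makes 4K at 60 Hz possible, which HDMI 1.4's 8.16 Gbps can't carry.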
|
# ? Sep 19, 2014 23:24 |
|
Wait wait wait, I was actually browsing around, saw the thread, skimmed through the OP and saw that the integrated graphics of the i5 4690K is better than the GPU in my old PC (a Radeon 4850)??? Did I read that wrong? Because I wanted to use my two-week money-back warranty on my video card, so I sent it back and plugged everything back into my old PC. Would there be any realistic downside to this (like extra strain on my CPU or something)? I don't plan to do any actual gaming till I get a new video card, but I do want to watch YouTube, videos, films, etc. and do poo poo. P.S. Do I have to switch from my current GPU in the BIOS to use the integrated graphics? edit: apparently I'm ancient, the GPU die is a separate block on the actual CPU chip; I should stress the cooler a bit more, but unless I abuse it (which I won't) I should be ok, right? uaciaut fucked around with this message at 19:17 on Sep 24, 2014
# ? Sep 24, 2014 19:11 |
|
uaciaut posted:Wait wait wait, i was actually browsing around, saw the thread and skimmed through the OP and saw that the integrated graphics of the i5 4690k is better than the GPU on my old PC (a radeon 4850)??? No strain. Integrated graphics gets used a lot, although maybe not so typically on a 4690K. It's no harm or anything. I'd just plug it in and boot it. There are options in the BIOS for it, but I think those only matter if you specifically disable it.
|
# ? Sep 24, 2014 19:15 |
|
Silly nomenclature question: is there any reason at all why all the <architecture>-E chips are one number too high in the "thousands place"? Sandy Bridge CPUs were 2xxx, Sandy Bridge-E were 3xxx, Ivy Bridge were 3xxx, Ivy Bridge-E were 4xxx, and now Haswell CPUs are 4xxx and Haswell-E are 5xxx and Broadwell will probably be 5xxx and a hypothetical Broadwell-E will be 6xxx and so on. That seems like a pointless and confusing decision. Did Intel ever explain this?
|
# ? Sep 25, 2014 06:14 |
|
It's marketing. Pure "bigger numbers = better." For a tenuous justification: the -E variants are released later, closer to the release of the die shrink than to the release of the original uarch. Add to that that they are better chips (in the sense of the numbers, if not necessarily benchmarks), and therefore they're "next-gen."
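The offset is easy to see if you lay the scheme out in code. A sketch (the helper function is hypothetical, just to show the +1 pattern described above, and the final assertion shows the number collisions that make it confusing):

```python
# Each microarchitecture's consumer parts get an N-thousand model
# number; its -E enthusiast parts get (N+1)-thousand.
uarch_series = {"Sandy Bridge": 2, "Ivy Bridge": 3, "Haswell": 4, "Broadwell": 5}

def model_series(uarch, enthusiast=False):
    # hypothetical helper, just to illustrate the offset
    return (uarch_series[uarch] + (1 if enthusiast else 0)) * 1000

assert model_series("Sandy Bridge") == 2000
assert model_series("Sandy Bridge", enthusiast=True) == 3000
# The confusing part: every -E line collides with the NEXT uarch's consumer line.
assert model_series("Ivy Bridge", enthusiast=True) == model_series("Haswell")
```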
|
# ? Sep 25, 2014 06:23 |
|
What's the latest word on desktop Broadwell? Core M chips should launch within a month or two, we *may* get LGA1150 CPUs in Q1 2015, and it seems they'll be a ~5% clock-for-clock upgrade over Haswell. I'm pondering whether I should grab a 4690K / 4790K now or wait for the Broadwell desktop lineup. I could also grab an Anniversary Edition Pentium for *cheap*, overclock the hell out of it (watercooled rig) and sell it once Broadwell launches, since Z97 motherboards support both Haswell and Broadwell. Upgrading from a Nehalem i7-975.
|
# ? Sep 26, 2014 14:15 |
|
Welmu posted:What's the latest word on desktop Broadwell? Core M chips should launch within a month or two, we *may* get LGA1150 CPUs in Q1 2015 and it seems that they'll be a ~5% clock-for-clock upgrade over Haswell. Wait for Skylake; Intel has said it will not be delaying it.
|
# ? Sep 26, 2014 14:24 |
|
Are there going to be any more CPUs made for the Z97 chipset? Is the 4790K the last and greatest for this mobo?
|
# ? Sep 26, 2014 15:44 |
|
r0ck0 posted:Are there going to be any more CPUs made for the z97 chipset? Is the 4790k the last and the greatest for this mobo? Broadwell has, so far, very different power delivery needs than Haswell. Even if they keep LGA1150, you would likely need a new motherboard to accommodate the new power requirements. Skylake will absolutely need a new socket, as the switch to DDR4 is a big move.
|
# ? Sep 26, 2014 15:58 |
|
Pretty much what I figured; I think the last time I upgraded my CPU without replacing my mobo was the Celeron 300A.
|
# ? Sep 26, 2014 15:59 |
|
EoRaptor posted:Broadwell has, so far, very different power needs than haswell. Even if they preserve lga1150, you would likely need a new MB to accommodate the new power requirements.
|
# ? Sep 26, 2014 16:11 |
|
Malcolm XML posted:Wait for Skylake; intel has said it will not be delaying it. That's what they said about Broadwell. Let's be honest: for a product launching in twelve months or less, there is practically nothing known about Skylake, and this does not bode well. Edit: Intel is apparently considering forgoing EUV for the 7nm node. Rime fucked around with this message at 17:37 on Sep 26, 2014
# ? Sep 26, 2014 16:31 |
|
EoRaptor posted:Broadwell has, so far, very different power needs than haswell. Even if they preserve lga1150, you would likely need a new MB to accommodate the new power requirements. Most motherboard makers are saying that their Z97 boards will support the 5th-gen Core chips when they come out. But time will tell.
|
# ? Sep 26, 2014 16:38 |
|
So what's the over/under on when they finally say gently caress it and start switching over to graphene processes?
|
# ? Sep 26, 2014 18:23 |
|
Not for a number of years, by which time silicon progress will have ground almost to a halt, with enormous gaps of time between process changes.
|
# ? Sep 26, 2014 18:27 |
|
Correct me if I'm wrong, but I don't think anyone's ever managed to make even a one-off graphene processor in the lab with a non-trivial number of transistors (more than a half-adder, or something).
|
# ? Sep 26, 2014 18:42 |
|
KillHour posted:Correct me if I'm wrong, but I don't think anyone's ever managed to make even a one-off graphene processor in the lab with a non-trivial amount of transistors (like more than a half-adder, or something). Derp, that's not a processor.
|
# ? Sep 26, 2014 18:54 |
|
Rime posted:
Still impressive; further along than I thought they were. Unless they can get a band gap, though, there's no hope of ever making a digital logic circuit out of it.
|
# ? Sep 26, 2014 19:13 |
|
KillHour posted:Still impressive; further than I thought they were. Unless they can get a band gap, though, there's no hope in ever making a digital logic circuit out of it. Rastor posted:The future may be the spiritual successor to the vacuum tube.
|
# ? Sep 26, 2014 19:33 |
Yeah, graphene is fundamentally incompatible with traditional semiconductor manufacturing and logic without some very clever workarounds that haven't really been successful.
|
|
# ? Sep 26, 2014 20:15 |
|
Can they simply combine graphene with quantum computers and be done with it?
|
# ? Sep 26, 2014 20:41 |
|
this is all moot, synaptic cpus will allow us to bend time and space and run crysis at 5k at 300 fps before any of that
|
# ? Sep 26, 2014 21:33 |
|
Won't a non-silicon processor also need a completely new instruction set?
|
# ? Sep 26, 2014 22:35 |
|
Tab8715 posted:Won't a non-silicon processor also need a completely new instruction set? No.
|
# ? Sep 26, 2014 22:42 |
|
KillHour posted:Still impressive; further than I thought they were. Unless they can get a band gap, though, there's no hope in ever making a digital logic circuit out of it. What about virtual bandgaps in graphene?
|
# ? Sep 27, 2014 02:19 |
|
With the vacuum tube: if it has a higher GHz limit, would it be worth skipping graphene and going straight for that? I don't know anything about electrical engineering, but to me it sounds like the magic next step, not graphene. Is quantum computing some new super-fast computer, or is it a kind of new magic way of looking at electrons? I remember someone telling me it's about using the fuzzy circumference of electrons as bits instead of using electrons as bits (i.e., 1 electron or whatever is now 2+ bits instead of just 1), but that doesn't sound like any performance gain.
|
# ? Sep 27, 2014 03:02 |
|
Tab8715 posted:Won't a non-silicon processor also need a completely new instruction set? Basic digital logic gates (AND, OR, NAND, NOR, XOR and so on) don't require silicon to be implemented, nor do x86 instructions (although building an 8086 entirely out of non-silicon components would be a fun project)
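To make that concrete: the logic is defined independently of the substrate. Here's a toy sketch building the usual gates, and the half-adder mentioned earlier in the thread, out of nothing but NAND, which any switching element (silicon, germanium, tubes, relays) can provide:

```python
# NAND is functionally complete: every boolean function can be built
# from it alone, regardless of what physically implements the switch.
def NAND(a, b):
    return 1 - (a & b)

def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))

def XOR(a, b):
    # classic 4-NAND XOR construction
    n = NAND(a, b)
    return NAND(NAND(a, n), NAND(b, n))

def half_adder(a, b):
    # returns (sum, carry) for two input bits
    return XOR(a, b), AND(a, b)

assert half_adder(1, 1) == (0, 1)
assert half_adder(1, 0) == (1, 0)
```

The instruction set lives entirely at this layer, which is why a non-silicon processor could still speak x86.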
|
# ? Sep 27, 2014 03:12 |
|
Lord Windy posted:With the vacuum tube, if it has a higher Ghz limit would it be worth skipping graphene and going straight for that? I don't know anything about electrical engineering but to me it sounds like the magic next step not graphene. A normal computer stores data in memory as a state (A bit is either '0' or '1', in binary systems). A quantum computer takes advantage of the fact that you can have a particle be in a superposition of multiple states (A qubit [short for quantum bit] can be both '0' and '1' at the same time.). For most math we do on a computer, this doesn't mean jack-squat. However, there are certain problems that are hard on classical computers, but aren't hard on a quantum computer ('Hard' has a specific meaning in computer science). It's these problems that quantum computers help solve; they're not better than classical computers in any generalized sense. Edit: Watch this. https://www.youtube.com/watch?v=g_IaVepNDT4 KillHour fucked around with this message at 04:35 on Sep 27, 2014 |
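A toy illustration of the superposition idea, nothing like real quantum hardware: track the two amplitudes of a single qubit, apply a Hadamard gate to put it in equal superposition, and measure via the Born rule.

```python
import math
import random

# One qubit as a pair of (real) amplitudes for |0> and |1>.
def hadamard(state):
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state, rng=random.random):
    # Born rule: probability of outcome 0 is |amplitude_0|^2
    p0 = state[0] ** 2
    return 0 if rng() < p0 else 1

state = hadamard((1.0, 0.0))   # start in |0>, apply H
print(round(state[0] ** 2, 2)) # P(measure 0) is now 0.5
```

The point of the sketch is that the qubit carries amplitudes, not a definite bit value; only measurement collapses it to 0 or 1, which is what "both at once" is gesturing at.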
# ? Sep 27, 2014 04:33 |
|
Lord Windy posted:Is Quantum computing some new super fast computer or is it a kind of new magic way of looking at electrons? I remembering someone telling me it's about using the fuzzy circumference of electrons as bits instead of using electrons as bits (ie, 1 electron or whatever is now 2+ bits instead of just 1) but that doesn't sound like any performance gains.
|
# ? Sep 27, 2014 04:34 |
|
Lord Windy posted:With the vacuum tube, if it has a higher Ghz limit would it be worth skipping graphene and going straight for that? I don't know anything about electrical engineering but to me it sounds like the magic next step not graphene. Quantum computing, at least in the way it's currently being done by outfits like D-Wave (which I've toured, weird place), is not in any way usable for consumer purposes. It's not general-purpose hardware; it's closer to Bitcoin ASICs, and it's unlikely you could ever run a traditional OS or game on a quantum processor just due to the weirdness of it being a non-binary system. Certainly, backwards compatibility would not be possible.
|
# ? Sep 27, 2014 05:03 |
|
Rastor posted:Not until a number of years from now, by which time silicon progress has ground to an almost halt, sputtering adrift with enormous gaps of time between process changes. the chipmakers might have to focus their innovation in areas other than shrinking Mr Chips posted:Basic digital logic instructions (AND, OR, NAND, NOR, XOR and so on) don't necessarily require silicon to be executed, nor do x86 instructions (although building an 8086 entirely out of non-silicon components would be a fun project) you could probably build one out of germanium fairly easily for a laid-back, chilled-out processing experience.
|
# ? Sep 27, 2014 07:23 |
|
Rastor posted:Quantum computers are neither necessarily faster (though they have the potential to solve problems quickly that classical computers could not solve in a reasonable time, or at all) nor do they work via switching flowing electrons. They are fundamentally different things; even at the concept level they are based on qubits instead of bits. They are (would be) used for solving different problems than classical computers, and though no doubt someone will invent some kind of game for them, you won't be running Crysis on one.
|
# ? Sep 27, 2014 07:49 |
|
|
|
atomicthumbs posted:the chipmakers might have to focus their innovation in areas other than shrinking But we have ample evidence that this simply can't happen, look at how badly nV screwed up when they got stuck at 28nm for too long, the fools
|
# ? Sep 27, 2014 08:07 |