|
Tapedump posted:This is where I need you to assure me that you're just joking and not really an idiot. I'm not joking. What would the downside of a modern CPU architecture running at 20GHz be vs a 4GHz quad core? You can still schedule multiple threads on a single core just fine, and when you do have just one thread crunching it would be way faster.
|
# ? Jun 3, 2015 05:18 |
|
Twerk from Home posted:I'm not joking. What would the downside of a modern CPU architecture running at 20GHz be vs a 4GHz quad core? You can still schedule multiple threads on a single core just fine, and when you do have just one thread crunching it would be way faster. That's what Intel tried to do with the P4 and they ran straight into a thermal brick wall.
|
# ? Jun 3, 2015 05:20 |
|
Ragingsheep posted:That's what Intel tried to do with the P4 and they ran straight into a thermal brick wall. Oh, I know that it's impossible in reality, but it's a nice dream. If we somehow could get more single threaded speed it would be vastly preferable to more cores.
|
# ? Jun 3, 2015 05:23 |
|
Twerk from Home posted:Oh, I know that it's impossible in reality, but it's a nice dream. If we somehow could get more single threaded speed it would be vastly preferable to more cores. Intel and ARM keep trying every way they can (though the conspiracy theorists here argue otherwise), and all we get are IPC, branch prediction, cache efficiencies, and new instruction sets that help single threads. Physics is a bitch. Can't go faster. At least, not while we're still using silicon.
|
# ? Jun 3, 2015 05:51 |
|
Twerk from Home posted:What would the downside of a modern CPU architecture running at 20GHz be...? <speaks in French> <is replied to in Tagalog> Let's not play the what-if game here. By your logic, why not just dream/ask of having a bajillion+1 cores that run at lunacy+X GHz?
|
# ? Jun 3, 2015 05:52 |
|
Seems you'll get your 20GHz processor by 2020 now: http://www.extremetech.com/computing/185688-ibm-betting-carbon-nanotubes-can-restore-moores-law-by-2020
|
# ? Jun 3, 2015 07:48 |
|
Twerk from Home posted:You know what would be way better than us getting better at using multiple cores? Really fast single cores. I would love to have a single core 17.6GHz Haswell instead of a quad core 4.4 GHz one. The comments on that article are amazing: Allen posted:If 10 GHz is the best that Intel can do by 2011, AMD or somebody else is going to eat their lunch. Intel better pick up the pace if they want to remain dominant. Besides, I want it NOW. What will I do with it. Well, I also want the applications now. I guess I've been spoiled by the industry and expect incredible improvements every year. StickWithApple posted:doesn't matter the speed of intel's chip in 2011 because motorola and ibm will already have chips out by then that are more effective, use energy better, and run applications ready for the new iMac's that are introduced at the 2011 MacWorld. They'll run at somewhere near 7GHz and still be faster than an 11GHz or even 128GHz sh*t that intel puts out.. Sinful posted:Back in grad school I worked on computing with light and transistors that had ten states (0-9 or base 10) rather than 2 (0 and 1 or binary). Anybody who doesn't think that these types of technology won't be commercially available by 2011 is kidding themselves.
|
# ? Jun 3, 2015 09:56 |
|
I don't know if it's been mentioned but Thunderbolt 3 is now in a USB-C form factor. I think it's supposed to launch alongside Skylake too.
|
# ? Jun 3, 2015 14:37 |
|
Twerk from Home posted:You know what would be way better than us getting better at using multiple cores? Really fast single cores. I would love to have a single core 17.6GHz Haswell instead of a quad core 4.4 GHz one. quote:I'll be waiting in line at CompUSA.
|
# ? Jun 3, 2015 15:02 |
|
I wonder if Thunderbolt will ever get much cheaper though. It's cost-prohibitive compared to USB-c devices by an order of magnitude at least and that results in stupidly high prices for Thunderbolt accessories typically. Heck, I'm kind of shocked that the Thunderbolt ports on my LG34UM95P didn't make the monitor $1300+ at launch.
|
# ? Jun 3, 2015 15:23 |
|
necrobobsledder posted:I wonder if Thunderbolt will ever get much cheaper though. It's cost-prohibitive compared to USB-c devices by an order of magnitude at least and that results in stupidly high prices for Thunderbolt accessories typically. Heck, I'm kind of shocked that the Thunderbolt ports on my LG34UM95P didn't make the monitor $1300+ at launch. One of the reasons for that is it's so little-used that none of the parts have really achieved economies of scale. If it gets popular costs will come down, though by how much is an open question.
|
# ? Jun 3, 2015 15:31 |
|
necrobobsledder posted:I wonder if Thunderbolt will ever get much cheaper though. It's cost-prohibitive compared to USB-c devices by an order of magnitude at least and that results in stupidly high prices for Thunderbolt accessories typically. Heck, I'm kind of shocked that the Thunderbolt ports on my LG34UM95P didn't make the monitor $1300+ at launch. Supposedly all of the USB-C 3.1 ports will be able to interact with Thunderbolt so the only cost issue will be the cable.
|
# ? Jun 3, 2015 15:31 |
|
necrobobsledder posted:I wonder if Thunderbolt will ever get much cheaper though. It's cost-prohibitive compared to USB-c devices by an order of magnitude at least and that results in stupidly high prices for Thunderbolt accessories typically. Heck, I'm kind of shocked that the Thunderbolt ports on my LG34UM95P didn't make the monitor $1300+ at launch. bingo usb chips are dirt cheap compared to intel's weird rear end t-bolt chip and licensing scheme
|
# ? Jun 3, 2015 15:32 |
|
Tapedump posted:Not sure how to respond, as it wouldn't be modern architecture, would it? All that I meant is as a software developer, faster single cores would be vastly preferable to multiple slower cores. Why couldn't we have modern architectures at higher clocks? The reason I chose ~20GHz is that's roughly 4x the current clock speed of quad cores, so there's theoretically about the same number of cycles there.
|
# ? Jun 3, 2015 15:48 |
|
pmchem posted:There are compilers and debuggers being written for Intel chips by a heck of a lot more companies, universities, and worldwide open source teams than "just Intel". No idea how you're even making that statement. pmchem posted:Intel has an incredible manufacturing/process advantage and huge resources for chip R&D, beyond something like "Cray" which was never really a CPU company so is a terrible example. It's far beyond what DEC was to Intel in the 90s. Cray still exists by the way: I used three of them today! They're not ignoring the competition from ARM; check out the recent realworldtech article about Atom improvements. Indeed, even for Intel facets of design are subject to competitive forces. MaxxBot posted:Theoretically ARM has the advantage of the decoder logic taking up a smaller portion of the CPU die but there's also been a trend over time of the decoder logic taking up a smaller portion of the CPU die across all microarchitectures so this might not end up mattering that much. From what I have seen so far Intel still has a big advantage in performance/watt due to their superior fab when compared to a theoretical ARM competitor, especially with their low power Xeons. It would be interesting to see how the performance/watt would compare if Intel made some ARM cores, which it looks like might happen before too long. Why are we letting Intel waste time with x86? Why not nationalize them, open their fab to everyone, and enjoy the benefits of the best ARM design teams getting the best possible silicon to run on? MaxxBot posted:EDIT: But fundamentally saying "Intel's going to be eaten alive by ARM" doesn't really make any sense. ARM is an ISA and Intel is a chip manufacturer, if Intel eventually sees an advantage in dumping x86 for ARM they will do so and go along designing and manufacturing chips just as they did before. Twerk from Home posted:I'm not joking. What would the downside of a modern CPU architecture running at 20GHz be vs a 4GHz quad core? 
You can still schedule multiple threads on a single core just fine, and when you do have just one thread crunching it would be way faster.
|
# ? Jun 3, 2015 15:51 |
|
evilweasel posted:One of the reasons for that is it's so little-used that none of the parts have really achieved economies of scale. If it gets popular costs will come down, though by how much is an open question. Has there been any reason for Thunderbolt to become popular that has legs? All I've seen for it are monitor hookups for big Macs and serious business peripherals that have firewire, usb 3 or straight up pci-e connected models available as well.
|
# ? Jun 3, 2015 15:55 |
|
JawnV6 posted:Can I be there when you tell the poor designer trying to hit a 1ns window that he now has to hit .05ns? It's ok, we'll reassure him with this bit about the OS's scheduler. Also we went from 4 cores but multiplied the time by 5. So everyone be aware we're not using that boring linear speedup, it's something superlinear at the least. I should have just used 4ghz and 16ghz, I meant I'd rather have all the cycles on one core rather than across several. Also, all my thoughts have been purely from the software end and ignoring chip design itself, I know absolutely nothing about it.
|
# ? Jun 3, 2015 15:56 |
|
necrobobsledder posted:I wonder if Thunderbolt will ever get much cheaper though. It's cost-prohibitive compared to USB-c devices by an order of magnitude at least and that results in stupidly high prices for Thunderbolt accessories typically. Heck, I'm kind of shocked that the Thunderbolt ports on my LG34UM95P didn't make the monitor $1300+ at launch. Previous versions also required expensive active cables, while this one works just fine with a $2 passive cable (at current generation bandwidth levels). The 40Gbps version still needs an active cable though.
|
# ? Jun 3, 2015 15:58 |
|
Twerk from Home posted:All that I meant is as a software developer... To answer the question, it's been answered at least twice on this page. Ragingsheep posted:That's what Intel tried to do with the P4 and they ran straight into a thermal brickwall. LiquidRain posted:Just as you said, it's impossible. Twerk from Home posted:Oh, I know that it's impossible in reality Please stop. I'll not pick up this derail again.
|
# ? Jun 3, 2015 16:01 |
|
this wonderful and completely imagined 'ARM is going to eat Intel' argument, brought to you on Intel processors.
|
# ? Jun 3, 2015 16:02 |
|
Twerk from Home posted:I should have just used 4ghz and 16ghz, I meant I'd rather have all the cycles on one core rather than across several. Also, all my thoughts have been purely from the software end and ignoring chip design itself, I know absolutely nothing about it. go3 posted:this wonderful and completely imagined 'ARM is going to eat Intel' argument, brought to you on Intel processors. JawnV6 fucked around with this message at 16:12 on Jun 3, 2015 |
# ? Jun 3, 2015 16:05 |
|
JawnV6 posted:Well, yeah, if you knew anything about chip design you would've seen the multi-core future coming from a decade ago. That much is clear. We all know SW folks are lazy and can't be bothered to learn and leverage parallelism, it's just funny to see someone take that and start demanding physical impossibilities. It's a tough problem. Some workloads parallelize well, some just don't. If it were easy to do highly parallel software then AMD would be more competitive right now. Their FX CPUs with 8 integer cores theoretically have similar performance to Ivy Bridge quad cores for highly parallel integer math.
|
# ? Jun 3, 2015 16:13 |
|
go3 posted:this wonderful and completely imagined 'ARM is going to eat Intel' argument, brought to you on Intel processors. I'm continually baffled why this is and has been a thing for so long. Like, why do people care so deeply what the instruction set is that runs their machines or the servers hosting their websites and databases and such? Because holy poo poo between the arm ANY DAY NOW!!! Crusaders or the "Intel just isn't innovating fast enough" brigade it always seems like it's way more important to people than it should be. Like, I legitimately have no earthly idea what the various advantages and disadvantages are of various styles of architecture, be it x86, arm version whatever, Power, etc. I know that the arm ISA takes marginally fewer transistors in terms of die space, but that ISA decode blocks are a pretty small % of dies in general these days so that's not really a deal like it was back when we were on 130nm chips. Is x86 just a really lovely scheme to work with being propped up by truckloads of R&D money? If someone is an embedded programmer I'd actually like to know. Because honestly it seems like we've continued to see pretty fantastic leaps in server performance from each generation of Xeon, while mobile chips have vastly improved battery life and maintained good enough performance over the last few years. Desktops haven't exactly leapt forward in single threaded performance as mentioned many times, but welp that's market forces for you. The money ain't there even if the nerd demand is. And like the other poster mentioned, so what if arm "wins?" Arm doesn't make chips.
|
# ? Jun 3, 2015 16:30 |
|
I thought x86 is just a front-end these days? I mean, with all that decoding to micro-ops and poo poo.
|
# ? Jun 3, 2015 16:39 |
|
Combat Pretzel posted:I thought x86 is just a front-end these days? I mean, with all that decoding to micro-ops and poo poo. I have no idea. Like, for the people clamoring for arm to "Win" do you think that means we get meaningfully better desktop CPUs in some form? or better performing laptops with even better battery life? Is it just "Competition will mean we get better stuff than we have now?" (Although at least in the laptop arena I sorta think the CPU's impact on runtime has definitely started to get eclipsed by things like 4k+ screens- See the 10 hour vs 15 hour XPS13 runtimes with 4k vs 1080 panels)
|
# ? Jun 3, 2015 16:47 |
|
Supposedly some stuff can still be processed natively for speed, but yeah, most everything else is 'cracked' into micro-ops over several cycles or more (in some cases hundreds or thousands of cycles for 'legacy' instructions that are seldom used) of some sort to run on the 'back end' which actually does all the computational work. Generally it's not seen as a big deal anymore to do this sort of thing. At least for x86. It's the x87 FPU that is supposed to be the real nightmare these days to deal with, what with its weird support for stuff like 80-bit double extended precision instructions which no one messes with anymore but still has to be supported.
|
# ? Jun 3, 2015 16:49 |
|
evilweasel posted:One of the reasons for that is it's so little-used that none of the parts have really achieved economies of scale. If it gets popular costs will come down, though by how much is an open question. Combat Pretzel posted:I thought x86 is just a front-end these days? I mean, with all that decoding to micro-ops and poo poo.
|
# ? Jun 3, 2015 16:55 |
|
Is there any point in introducing a completely new instruction set, that moves the burden of optimization a little more back to the compiler? Using mode switching antics, this doesn't sound unpossible.
|
# ? Jun 3, 2015 17:05 |
|
Introducing a completely new instruction set means having to build an extensive emulation setup that can provide performance/power usage not significantly worse than previous chips if you want it to get any traction in the real computer market.
|
# ? Jun 3, 2015 17:07 |
|
Combat Pretzel posted:Is there any point in introducing a completely new instruction set, that moves the burden of optimization a little more back to the compiler? Using mode switching antics, this doesn't sound unpossible. Intel tried that with Itanium. It turns out trying to make the compiler do more of the work for the hardware and programmer doesn't work IRL even though it sounds great on paper.
|
# ? Jun 3, 2015 17:11 |
|
NATIONALIZE INTEL! Malaysia will lead us into the new millennium of computing.
|
# ? Jun 3, 2015 17:25 |
|
Nintendo Kid posted:Introducing a completely new instruction set means having to build an extensive emulation set up that can provide performance/power usage not significantly worse than previous chips if yu want it to get any traction int he real computer market. Combat Pretzel fucked around with this message at 17:42 on Jun 3, 2015 |
# ? Jun 3, 2015 17:40 |
|
Twerk from Home posted:It's a tough problem. Some workloads parallelize well, some just don't. If it were easy to do highly parallel software then AMD would be more competitive right now. Their FX CPUs with 8 integer cores theoretically have similar performance to Ivy Bridge quad cores for highly parallel integer math. CS folks have this tendency to ask things of systems in a particularly inelegant way. There's some huge pressure to reduce things to simple asks that totally lose grounding in the painstakingly created abstractions. I saw some talk from a guy saying "if we had infinite memory, problem solved!!" and my systems prof came out giggling. You've essentially got infinite memory now. The whole problem is fast access to the bits you want in a timely fashion, not the arbitrary limit on total storage. Gwaihir posted:Like, I legitimately have no earthly idea what the various advantages and disadvantages are of various styles of architecture, be it x86, arm version whatever, Power, etc. I know that the arm ISA takes marginally fewer transistors in terms of die space, but that ISA decode blocks are a pretty small % of dies in general these days so that's not really a deal like it was back when we were on 130nm chips. Is x86 just a really lovely scheme to work with being propped up by truckloads of R&D money? If someone is an embedded programmer I'd actually like to know. Everyone's really quick to point out that x86 gets decoded and that decode takes more transistors. So I guess we're totally unconcerned with the fact that a 2-byte opcode can do a flow checking hundreds of bytes of state that would break out to hundreds of ARM operations and whatever bus is bringing instructions in can't possibly use that extra bandwidth. It's so unbelievably myopic towards the tradeoffs implied by a basic CISC/RISC analysis. Would you spend transistors to reduce bus bandwidth? Gwaihir posted:And like the other poster mentioned, so what if arm "wins?" Arm doesn't make chips. I don't think you get to have much of an opinion at all with a tabloid-grade understanding of the tradeoffs.
PC LOAD LETTER posted:Intel tried that with Itanium. It turns out trying to make the compiler do more of the work for the hardware and programmer doesn't work IRL even though it sounds great on paper.
|
# ? Jun 3, 2015 17:42 |
|
JawnV6 posted:I don't think you get to have much of an opinion at all with a tabloid-grade understanding of the tradeoffs. Great, so talk then. Why is arm eating intel's lunch and what does them "winning" get us, the consumer? Are there huge flaws in current (Intel) chips that just haven't been exposed because AMD is a dumpster fire and the various arm licensees don't have Intel's foundry expertise? Do you just think we'd have much better chips in general if there were more competitors in the CPU market? e: And it's not like I have to be a dedicated embedded programmer or IC engineer to grasp the applied implications of how different chips perform, but thanks for that anyways Gwaihir fucked around with this message at 18:15 on Jun 3, 2015 |
# ? Jun 3, 2015 18:07 |
|
Wulfolme posted:Has there been any reason for Thunderbolt to become popular that has legs? All I've seen for it are monitor hookups for big Macs and serious business peripherals that have firewire, usb 3 or straight up pci-e connected models available as well. We're right about at the point where 4k stuff is starting to be adopted so there's that at least. It's unlikely that backups or external graphics cards or whatever are going to be a driver, I'll admit.
|
# ? Jun 3, 2015 18:48 |
|
Is Thunderbolt on PC still crippled?
|
# ? Jun 3, 2015 19:14 |
|
How so? Driver support?
|
# ? Jun 3, 2015 19:17 |
|
Tablets are totally gonna replace computers too
|
# ? Jun 3, 2015 19:53 |
|
Gwaihir posted:How so? Driver support? I am way out of date on things but basically http://www.anandtech.com/show/8529/idf-2014-where-is-thunderbolt-headed quote:Thunderbolt on PCs: A Crippled Experience
|
# ? Jun 3, 2015 20:57 |
|
go3 posted:Tablets are totally gonna replace computers too Can't tell if sarcastic, but for digital content consumption, tablets/phones are pretty much there already.
|
# ? Jun 3, 2015 21:01 |