EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
While I can see modularity going away for the CPU/board, being able to adjust RAM and GPU is far too important within the desktop space. Further, on the desktop, modularity also limits the cost of a single point of failure, whereas complete integration would require replacing the entire system. My perspective might be different, though; I build and maintain PCs for fun.

Tanreall posted:

Isn't that what the Nvidia SHIELD is trying to be?

I suppose, although I take issue with the limitations SoC design currently imposes on overall GPU performance.

NihilismNow posted:

And if you want to buy a non modular ARM board and use it as a desktop, well you can already do that.

None of them are even close to potent enough for the midrange due to their design, though, and in many cases they're very limited in terms of support and peripherals.


Nintendo Kid
Aug 4, 2011

by Smythe

Boiled Water posted:

Another question is: would it make any sense, and can PPC even compete with x86?

Well, are there any POWER/PowerPC chips that can run acceptably in a laptop (one of the biggest reasons Apple jumped to Intel is that the PPC G5 was impossible to fit in a laptop without going brick-thick desktop replacement)? Laptops are, after all, the biggest consumer market for x86-64 CPUs.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER



FaustianQ posted:

Yeah, implied desktop there, I know tablets are fine but I'd like to see ARM move into laptops, and even desktops to an extent. I wouldn't mind seeing some kind of focus on a modular Nano-ITX form factor with a socketed ARMv8, SO-DIMM slots, and something like PowerVR Series 6 on its own PCB with GDDR5 support on PCIe x4. I think it could be done cheaply enough, and potentially simply enough, that desktop PCs could have a really healthy competitive environment again for both CPU and GPU, while not really alienating customers due to the apparent complexity of replacement, construction, and maintenance. Think of it as $200-300 midrange gaming and home entertainment units.

I know, a lot of words saying "Wow, it kind of sucks staring at the bleak future of Nvidia and Intel".

What you're describing already exists in huge numbers, with a brand name and a couple of major players involved. They're known simply as Chromebooks.

The MUMPSorceress
Jan 6, 2012


^SHTPSTS

Gary’s Answer

FaustianQ posted:

While I can see modularity going away for the CPU/board, being able to adjust RAM and GPU is far too important within the desktop space. Further, on the desktop, modularity also limits the cost of a single point of failure, whereas complete integration would require replacing the entire system. My perspective might be different, though; I build and maintain PCs for fun.


I don't know that this is even true anymore. I'd consider myself an enthusiast (I built my first PC when I was 10) and have always built my own PCs. Looking back, I have never replaced a discrete component in my computer other than a failed disk or optical drive. If I've got enough money to upgrade the GPU, I can save for 4 extra months and have enough money to just replace the whole thing, so that's what I do. It's just less headache than shuffling parts around and figuring out what I want to drag across builds and such. Instead, my old PC then becomes a media center somewhere in my house or a hand-me-down for one of my brothers who needs a better Minecraft rig or whatever.

And straight up, my Surface Pro 3 is better than my gaming machine at basically everything except playing video games, and I barely have time for video games anymore. I think many of us in the bracket that can afford fancy desktops are starting to have families and such and it's just easier/better to have a console in a family space so you can play with your kids or at least keep an eye on them while you play. My current gaming machine is 3 years old now, and I may not replace it at all.

WhyteRyce
Dec 30, 2001

Chuu posted:

The consumer market is very highly competitive thanks to ARM. Apple is working on custom silicon powerful enough to power MacBooks and PowerBooks, and Intel only has a toehold in the Chromebook/tablet/smartphone world. I wouldn't be surprised to see an ARM-based MacBook in the next five years -- and one of the side effects of the cloud is that a Chromebook is likely to be completely functional for your average desk jockey very soon, if not already.

If you care about desktops in 2015 you're a bit of a dinosaur. Sadly.

Intel has more than a toehold in the Chromebook and tablet world (contra revenue yeah yeah)

Gwaihir
Dec 8, 2009
Hair Elf

Nintendo Kid posted:

Well, are there any POWER/PowerPC chips that can run acceptably in a laptop (one of the biggest reasons Apple jumped to Intel is that the PPC G5 was impossible to fit in a laptop without going brick-thick desktop replacement)? Laptops are, after all, the biggest consumer market for x86-64 CPUs.

Nope, they're pretty exclusively focused on the server and datacenter market. That's where the $$$$ is. IBM's not interested in selling low margin consumer crap when they can sell big ole server chips for 4 digits a pop.

e: I mean, look at how well selling low margin consumer crap works for AMD in the consoles :v: It's not exactly a winning strategy.

evol262
Nov 30, 2010
#!/usr/bin/perl

Chuu posted:

If ARM had anything even remotely competitive to Xeon it would happen very quickly. The reality is, as much as people like to talk about ARM servers, ARM doesn't have anything remotely competitive by pretty much any metric that you would want to rack in a data center.

AArch64 is competitive, depending on your use case. Many workloads these days are light on CPU. AArch64 is a little slower than modern Atom cores (a little), and about on par with them in performance/watt, though Xeon beats it on both counts. Still, if you're looking for a ton of cores without a lot of heat, and so-so performance is OK, ARM is definitely an option.

Chuu posted:

IBM is dumping an absolute ton of money into POWER right now, and they have some big wins in some very specific industries (see: financial, oil & gas). I don't think you'll see POWER show up in something like EC2 because production just isn't there -- but I wouldn't be surprised to see POWER spread to other industries in the next couple of years.

OpenPOWER is potentially a big deal if anyone takes it up. They're still huge dies, and Xeon wins on performance per core, but the number of cores (and SMT threads) they fit in is unreal, and the performance isn't that far behind. IBM's problem, as always, is price. OpenPOWER is supposedly going to help that by bringing in reference hardware and external vendors, but the CPUs themselves are still stupidly expensive.

VostokProgram posted:

How difficult would an industry-wide transition from x86 to ARM be? Would we have to throw out the entire IBM PC-esque architecture? Or would it be more like: your motherboard still uses a PCIe bus and UEFI/BIOS firmware and all the other things that define a modern PC, except that the CPU is running a different architecture and therefore your programs need to be recompiled?

BIOS is a no-go (and nobody wants it anyway), but PCIe and UEFI work fine on other architectures. EFI started life on Itanium anyway.

Gwaihir posted:

Nope, they're pretty exclusively focused on the server and datacenter market. That's where the $$$$ is. IBM's not interested in selling low margin consumer crap when they can sell big ole server chips for 4 digits a pop.

e: I mean, look at how well selling low margin consumer crap works for AMD in the consoles :v: It's not exactly a winning strategy.

ARM is chipping away at this, but embedded PowerPC and MIPS are still players, even if the money isn't as flashy as shoving P8s into max-config mainframes and making customers pay :10bux: to enable them.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

As long as Intel holds even a fractional lead in performance per watt, no one will stray from the Blue path.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

LeftistMuslimObama posted:

I don't know that this is even true anymore. I'd consider myself an enthusiast (I built my first PC when I was 10) and have always built my own PCs. Looking back, I have never replaced a discrete component in my computer other than a failed disk or optical drive. If I've got enough money to upgrade the GPU, I can save for 4 extra months and have enough money to just replace the whole thing, so that's what I do. It's just less headache than shuffling parts around and figuring out what I want to drag across builds and such. Instead, my old PC then becomes a media center somewhere in my house or a hand-me-down for one of my brothers who needs a better Minecraft rig or whatever.

And straight up, my Surface Pro 3 is better than my gaming machine at basically everything except playing video games, and I barely have time for video games anymore. I think many of us in the bracket that can afford fancy desktops are starting to have families and such and it's just easier/better to have a console in a family space so you can play with your kids or at least keep an eye on them while you play. My current gaming machine is 3 years old now, and I may not replace it at all.

Back when I started building, around 2004 or so, CPUs still made large enough leaps and sockets changed often enough that the ticks and tocks of CPUs and GPUs matched up, so replacing whole systems made sense - from 2004 to 2006 that would have been an Athlon 64 and X850 to a Conroe and 8600GT for me, and that's a pretty massive difference in performance. However, since about 2009, x86 CPUs have kind of plateaued, which has led to just the GPU/PSU/SSD and maybe RAM changing over time; the CPUs and boards haven't needed replacing. If that trend continues, it'd still make sense for units meant for the desktop space to at least have flexible GPU/PSU installation. RAM is more debatable, and externals and cloud storage could theoretically handle data fine. Further, integrating a potent GPU solution into a system is still a ways off - the best so far is what, Radeon R7, Iris Pro 6200, and PowerVR Series 8?

I've got a Nexus 7 and it's fine, but I wouldn't use it for much else besides surfing and gaming, if only because I find touch screens clumsy as poo poo and there honestly isn't enough oomph behind a lot of them. Maybe a nice 10" A72 w/DDR4 and a decent PowerVR or Adreno could entice me to change my mind.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

LeftistMuslimObama posted:

I don't know that this is even true anymore. I'd consider myself an enthusiast (I built my first PC when I was 10) and have always built my own PCs. Looking back, I have never replaced a discrete component in my computer other than a failed disk or optical drive. If I've got enough money to upgrade the GPU, I can save for 4 extra months and have enough money to just replace the whole thing, so that's what I do. It's just less headache than shuffling parts around and figuring out what I want to drag across builds and such. Instead, my old PC then becomes a media center somewhere in my house or a hand-me-down for one of my brothers who needs a better Minecraft rig or whatever.


Definitely not a thing these days. The only real components that have gotten performance improvements are the GPUs. My rig is from 2012 and there's no other part to upgrade; I replaced my dual-GPU solution with a single card and put in an SSD. That's all I could really do.

Wistful of Dollars
Aug 25, 2009

If Zen can give me 8 cores and ~90% of Skylake I'm in. Perhaps out of pity, if nothing else.

Yes, I can actually make use of more than 4 cores

adorai
Nov 2, 2002

10/27/04 Never forget
Grimey Drawer
As someone who runs POWER in my datacenters (finance), I would switch to x86 in a heartbeat. There is something to be said for being able to just order something from CDW with barely more than zero thought and knowing that it will fit my needs. POWER is so loving expensive, it's like $1500 just to license a single core. So my 8-core proc has only 2 cores licensed.

Tindahbawx
Oct 14, 2011

adorai posted:

As someone who runs POWER in my datacenters (finance), I would switch to x86 in a heartbeat. There is something to be said for being able to just order something from CDW with barely more than zero thought and knowing that it will fit my needs. POWER is so loving expensive, it's like $1500 just to license a single core. So my 8-core proc has only 2 cores licensed.

So, wait, are you saying you've an 8 core machine that only has 2 cores working?

Riso
Oct 11, 2008

by merry exmarx

Tindahbawx posted:

So, wait, are you saying you've an 8 core machine that only has 2 cores working?

IBM has always done that. AS/400s come fully featured but only unlock based on what you pay for.

Gwaihir
Dec 8, 2009
Hair Elf
Yes, IBM spins it as "Dynamic capacity!", aka we'll sell you a server with 8 CPUs but you only get to use 6 of them, and if you run into a period of heavy workloads you can pay us extra $$ to enable the last two for some temporary window.

Tindahbawx
Oct 14, 2011

Gwaihir posted:

Yes, IBM spins it as "Dynamic capacity!", aka we'll sell you a server with 8 CPUs but you only get to use 6 of them, and if you run into a period of heavy workloads you can pay us extra $$ to enable the last two for some temporary window.

You essentially lease the cores? How bizarre, I don't think I'd appreciate that.

Gwaihir
Dec 8, 2009
Hair Elf
Well, I mean it just depends on how much you want to spend and what kind of flexibility you want. You can obviously buy a system with whatever you want in it - it's just that, the way these things are priced and used, it's not too bad a hedge against unexpected "Oh poo poo" workloads, like a pile of unexpected transactions getting dropped on you all at once, etc. They'll put a spare CPU in there for a very low or nominal cost, and if you run into a situation where you need it, it's a pushbutton "now your system is faster" thing without any need for downtime, spinning up an extra VM, adding extra nodes to a cluster, or whatever we would typically do when an x86-based server app needs more oomph, and you'll probably have spent less than if you had just flat-out bought/activated more CPUs or cores in the first place.

It's definitely a different/weird mindset though.

e:
I admin an 8-socket AS400/Power770 at work and we've actually run into a situation like that once in ~5 years or so. We only use 7 of the CPUs (CPU use during the day usually floats around 50%-ish), normally split between a separate test and prod LPAR, and a vendor software fuckup resulted in dumping a week's worth of transactions into our WMQ queues all at once first thing in the morning. So the normal day's stuff was stuck in line behind all that crap, and we couldn't just delete the stuff out of the queue because it was valid data we needed. Luckily our IBM support team is good and likes us, so they activated the extra cores without charge or waiting for bureaucratic crap, and with that + borrowing all but one of the test LPAR's cores, the queue got chunked through in time for normal backups. The system works? :confuoot:

Gwaihir fucked around with this message at 15:50 on Jul 24, 2015

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

El Scotch posted:

If Zen can give me 8 cores and ~90% of Skylake I'm in. Perhaps out of pity, if nothing else.

Yes, I can actually make use of more than 4 cores

Aren't the construction cores there already in relation to something like a 4690? In perfectly multithreaded tasks, I remember the 220W 4.7GHz FX being equal to or a smidge faster than the 84W i5-4690.

BurritoJustice
Oct 9, 2012

Twerk from Home posted:

Aren't the construction cores there already in relation to something like a 4690?

Hahaha

In unrelated news AMD just hit $1.71. Almost a full dollar below where it was at before the 300 series launch.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

BurritoJustice posted:

Hahaha

In unrelated news AMD just hit $1.71. Almost a full dollar below where it was at before the 300 series launch.

I'm not joking here, and not saying that there's any case where anybody should buy a 9590, because they cost more than their Intel competitors while using 4x the electricity, but with a perfectly threaded 8-thread load isn't an AMD 8-core @ 5GHz broadly equivalent to an Intel quad @ 3.9 like the 4690? That's what I understood El Scotch to be saying: if a Zen 8-core is 90% of the total multithreaded compute of the Intel competitor, he may be in, out of pity.



Edit: This is pretty much the only benchmark AMD is competitive in because very few things are perfectly 8+ threaded.

Twerk from Home fucked around with this message at 16:28 on Jul 24, 2015

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Twerk from Home posted:

I'm not joking here, and not saying that there's any case where anybody should buy a 9590, because they cost more than their Intel competitors while using 4x the electricity, but with a perfectly threaded 8-thread load isn't an AMD 8-core @ 5GHz broadly equivalent to an Intel quad @ 3.9 like the 4690? That's what I understood El Scotch to be saying: if a Zen 8-core is 90% of the total multithreaded compute of the Intel competitor, he may be in, out of pity.



Edit: This is pretty much the only benchmark AMD is competitive in because very few things are perfectly 8+ threaded.

It's only even vaguely competitive if you close your eyes when you see the power usage.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

BurritoJustice posted:

Hahaha

In unrelated news AMD just hit $1.71. Almost a full dollar below where it was at before the 300 series launch.

That's pretty impressive. Releasing new products and losing money hand over fist.

Anime Schoolgirl
Nov 28, 2002

Boiled Water posted:

That's pretty impressive. Releasing new products and losing money hand over fist.

If it weren't for the x86 gently caress-you, they would have been bought a long time ago.

Wistful of Dollars
Aug 25, 2009

BurritoJustice posted:

Hahaha

In unrelated news AMD just hit $1.71. Almost a full dollar below where it was at before the 300 series launch.

Sounds like it's time to buy some AMD stock! It'll triple in value when Zen is released and I can cash out and use the dividends to buy a sandwich.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

In this analogy IBM plays tennis and VIA plays... snooker or something.

Generic Monk
Oct 31, 2011

Twerk from Home posted:

I'm not joking here, and not saying that there's any case where anybody should buy a 9590, because they cost more than their Intel competitors while using 4x the electricity, but with a perfectly threaded 8-thread load isn't an AMD 8-core @ 5GHz broadly equivalent to an Intel quad @ 3.9 like the 4690? That's what I understood El Scotch to be saying: if a Zen 8-core is 90% of the total multithreaded compute of the Intel competitor, he may be in, out of pity.



Edit: This is pretty much the only benchmark AMD is competitive in because very few things are perfectly 8+ threaded.

holy poo poo i didn't know it was loving 220w

is it steam powered? do i have to shovel coal into the front of my computer?


anandtech commenters saying running this thing 24/7 for a year would cost you upwards of £500 lmao - did anyone actually buy this? has anyone plotted the sales of this chip against fire department callouts yet? jesus loving christ.
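
As a sanity check on that number, here's the back-of-the-envelope arithmetic; the £/kWh rate below is an assumption for illustration, not a figure from the thread:

code:
#include <stdio.h>

int main(void) {
    const double watts = 220.0;                       /* FX-9590 rated TDP */
    const double kwh   = watts * 24 * 365 / 1000.0;   /* ~1927 kWh per year at full load */
    const double rate  = 0.15;                        /* assumed GBP per kWh, circa 2015 */
    printf("%.0f kWh/year -> roughly £%.0f for the CPU alone\n", kwh, kwh * rate);
    /* ~£289 for the chip by itself; the £500 figure presumably assumes a
     * pricier tariff or counts the whole system, not just the CPU. */
    return 0;
}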

Generic Monk fucked around with this message at 19:49 on Jul 24, 2015

Nintendo Kid
Aug 4, 2011

by Smythe

Generic Monk posted:

holy poo poo i didn't know it was loving 220w

is it steam powered? do i have to shovel coal into the front of my computer?

Well if it gets any higher, it might be cheaper to generate your own electricity for it.

Generic Monk
Oct 31, 2011

Nintendo Kid posted:

Well if it gets any higher, it might be cheaper to generate your own electricity for it.

considering buying this and a 290 and just routing my plumbing through it. play some arkham knight in the shower to get it nice and toasty

ElehemEare
May 20, 2001
I am an omnipotent penguin.

Generic Monk posted:

considering buying this and a 290 and just routing my plumbing through it. play some arkham knight in the shower to get it nice and toasty

It gets cold in Canada in the winter. Why should I waste my money on an Intel CPU and a space heater when I could buy a 9590? AMD is just adding value, guys.

Nintendo Kid
Aug 4, 2011

by Smythe
What was the highest-wattage Pentium 4 Intel ever put out?

mmkay
Oct 21, 2010

115 Watts according to wiki.

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
AMD is paving the way in space heater miniaturization

Edward IV
Jan 15, 2006

mmkay posted:

115 Watts according to wiki.

And 130 watts for the dual-core Pentium D.

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender
See, now we've found AMD's ideal customer: 1) someone whose computing workload is perfectly multi-threaded and 2) who doesn't have to pay for electricity.

If only AMD could invest some time and money into identifying and marketing to this select group of individuals, their stock price would pick right back up.

Fallorn
Apr 14, 2005

Krailor posted:

See, now we've found AMD's ideal customer: 1) someone whose computing workload is perfectly multi-threaded and 2) who doesn't have to pay for electricity.

If only AMD could invest some time and money into identifying and marketing to this select group of individuals, their stock price would pick right back up.

What you mean is that power plants don't yet know they need this chip for doing whatever.

Chuu
Sep 11, 2004

Grimey Drawer

Krailor posted:

See, now we've found AMD's ideal customer: 1) someone whose computing workload is perfectly multi-threaded and 2) who doesn't have to pay for electricity.

If only AMD could invest some time and money into identifying and marketing to this select group of individuals, their stock price would pick right back up.

My sarcasm meter is sometimes broken, but in case you are not serious -- before Bitcoin crashed and ASIC mining became a thing -- they absolutely catered and marketed directly to that segment.

I don't remember why, but Bitcoin mining was dramatically more efficient on AMD cards, and certain models were in very short supply and sold at or above retail as a result.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Wasn't it due to consumer-level cards having really good DP capability? I swear the 5000-7000 series did.

repiv
Aug 13, 2009

Cryptography (and therefore mining) is all integer operations, so DP performance was irrelevant. AMD won the Bitcoin race for two reasons: they invested heavily in GCN's integer performance in spite of most GPU applications being float-centric, and they happened to include an instruction which directly maps to a key step of SHA-256, while Nvidia's pre-Kepler architectures had to break it down into three instructions.
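
As an aside, here's a minimal sketch of what that single-instruction shortcut buys - illustrative C, not anything from the post; the AMD instruction usually cited for this is a bit-select (BFI_INT), though repiv doesn't name it. The SHA-256 round's "choose" and "majority" functions take three logic ops each when written generically, and a bit-select covers each in one.

code:
#include <stdint.h>
#include <stdio.h>

/* SHA-256 round helpers, written the generic three-instruction way.
 * A bit-select computes (mask & a) | (~mask & b) in a single op, which is
 * exactly ch(); maj() reduces to bitselect(a ^ b, c, b). */
static uint32_t ch(uint32_t e, uint32_t f, uint32_t g)  { return (e & f) ^ (~e & g); }
static uint32_t maj(uint32_t a, uint32_t b, uint32_t c) { return (a & b) ^ (a & c) ^ (b & c); }

int main(void) {
    /* spot check with arbitrary constants */
    printf("ch  = %08x\n", ch(0xF0F0F0F0u, 0xFF00FF00u, 0x0F0F0F0Fu));
    printf("maj = %08x\n", maj(0xF0F0F0F0u, 0xFF00FF00u, 0x0F0F0F0Fu));
    return 0;
}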

Nintendo Kid
Aug 4, 2011

by Smythe
And more importantly, AMD quickly realized this didn't help at all for actual graphics card stuff, and subsequent GPU designs largely removed it.


Fabulousity
Dec 29, 2008

Number One I order you to take a number two.

I thought it was also because pre-GCN stuff was VLIW, which worked really well for BC robber barons in waiting? I vaguely remember Radeon 6700/6800s being in high demand, and buying used ones was really risky because you didn't know if a miner had drilled the thing to death on a milk-crate chassis with box fans for who knows how many hours and lied about it.
