mobby_6kl
Aug 9, 2009

by Fluffdaddy
Maybe AMD's new CPU will be the NetBurst of this generation, the one that will get the clock speed wars going again :rice:


champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

How does this analogy work? The 'dozers were already AMD's NetBurst.

canyoneer
Sep 13, 2005


I only have canyoneyes for you

mobby_6kl posted:

Maybe AMD's new CPU will be the NetBurst of this generation, the one that will get the clock speed wars going again :rice:

I'll get my 80mm case fans with blue LEDs ready

PC LOAD LETTER
May 23, 2005
WTF?!
The details that were leaked a ways back said otherwise. The best rumors we have suggest a 3-4GHz clock speed for Zen.

Bulldozer was their attempt at a NetBurst-ish uarch, and it was a failure. Intel didn't do very well with it either, so pursuing a speed-demon uarch in x86 land doesn't seem to be a good idea.

NihilismNow
Aug 31, 2003

PC LOAD LETTER posted:

Most of those are the types you could say "it's 64-bit so it's powerful" to and they'd buy it though. They have little to no understanding of the underlying tech and don't really know what it is they're buying. Just that it's "better" somehow.

I think you underestimate how successful Intel's branding of their Core line has been. People who have no idea how many bits their CPU is, or what that even means, know i7 > i5.
No one is going to bring back the 80886 for nostalgia's sake.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

feedmegin posted:

Sure...but the argument that started this off was something like 'if AMD were stronger, Intel would be forced to compete and our desktop CPUs would be shooting up in performance again'. My counter-argument is that this is untrue because of physics, and this remains the case. People at home, even power users, aren't generally running dozens of VMs or high-traffic webservers or whatever, so giving them more CPUs or more cores wouldn't do them any good; otherwise we are stuck with 5% improvements in the only area where improvements matter for anything you can't do with your GPU.

Physics stops you from increasing clocks much anymore, but you can always put more transistors on the die for bigger caches, better branch prediction, and so on. They've really slowed down on doing that on the desktop side ever since Sandy Bridge came out, while they're still trucking along with increased transistor counts on the server side, which is why I was comparing the progress of desktop and server chips. I'm guessing this is due to a combination of diminishing returns, high cost/heat, and the fact that it would basically be a niche product for PC enthusiasts which they have little economic incentive to provide. I still think they would be pushing harder if they were still in close competition with AMD, but yes, there are inherent design challenges to providing more IPC that make it harder than just adding more cores like with GPUs and server CPUs.
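
To put toy numbers on that: if single-thread performance is roughly IPC times clock, 5%-per-generation gains compound very slowly next to core-count growth. A back-of-envelope sketch; the 5% figure comes from the quoted post, while the per-generation core doubling is assumed purely for illustration:

```python
# Toy model: single-thread performance ~ IPC x clock. With clocks stuck,
# ~5% IPC per generation compounds slowly; parallel throughput scales
# with core count instead. Rates are illustrative, not measured.
generations = 5                # roughly Sandy Bridge -> Skylake

ipc_gain = 1.05                # assumed 5% single-thread uplift per generation
core_growth = 2                # assumed core-count doubling (server parts)

single_thread = ipc_gain ** generations
parallel = core_growth ** generations

print(f"single-thread after {generations} gens: {single_thread:.2f}x")  # ~1.28x
print(f"parallel throughput after {generations} gens: {parallel}x")     # 32x
```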

PC LOAD LETTER
May 23, 2005
WTF?!

NihilismNow posted:

I think you underestimate how successful Intel's branding of their Core line has been.
Their branding has been successful in that Intel is about the only name people know when it comes to PC CPUs. But many of those same people had no issue buying AMD chips back when they were competitive with Intel's and were sold in suspicious-looking white boxes with hastily printed labels slapped on. That was exactly how my first Slot A Athlon was sold at Fry's when they were first released, BTW. Same went for the motherboard. The CPU is just a component in the PC. Expecting average people to demand a particular CPU because of branding is about as reasonable as expecting them to demand a certain brand of capacitor in their TV or a particular MOSFET on their motherboard.

NihilismNow posted:

People who have no idea how many bits their CPU is, or what that even means, know i7 > i5.
No, it tells them almost nothing. Is it an i7-2600 or an i7-5930K? Would your average buyer who only needs Office, YouTube, and email even need either? How exactly will they determine features and performance versus their needs and budget from only that cryptic set of a letter and five numbers? Even as someone who follows tech closer than most and can build his own PC, I couldn't tell you exactly. Yeah, sure, I know one is overclockable and a much more recent Core architecture than the other, but that's it. Cache size? Clock speed? Core count? Anything other than a handwavey "that one should be faster in general"? I'd have to google it.
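
For what it's worth, the naming scheme really is that thin. A hypothetical decoder (my own sketch of Intel's public Core naming convention circa 2016, not anything official) recovers everything the string encodes, which isn't much:

```python
import re

# Sketch of the Core naming scheme circa 2016: brand modifier, leading
# generation digit, SKU digits, optional suffix. Cache size, clocks, and
# core count are NOT encoded; you need a spec sheet for those.
PATTERN = re.compile(r"i([357])-(\d)(\d{3})([A-Z]*)", re.IGNORECASE)

SUFFIXES = {"K": "unlocked multiplier", "U": "low-power mobile",
            "T": "power-optimized desktop", "": "standard desktop"}

def decode(model: str) -> dict:
    m = PATTERN.fullmatch(model.strip())
    if m is None:
        raise ValueError(f"not a Core model number: {model!r}")
    family, gen, sku, suffix = m.groups()
    return {"family": f"Core i{family}", "generation": int(gen), "sku": sku,
            "suffix": SUFFIXES.get(suffix.upper(), suffix)}

for model in ("i7-2600", "i7-5930K"):
    print(model, "->", decode(model))
# Nothing here reveals 4 cores/8MB cache (2600) vs. 6 cores/15MB (5930K).
```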

NihilismNow posted:

No one is going to bring back the 80886 for nostalgia's sake.
I know. It wasn't an "AMD/Intel should do what I say" post. That's why I put in "honestly I kinda half want" before talking about that stuff.

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me
The Phenom and FX brands are poisoned after being attached to garbage products for the last 5+ years. Maybe AMD could bring the Opteron branding down to their new high-performance desktop chips. Athlon for midrange and Duron for low-end seems like a good idea to me.

adorai
Nov 2, 2002

10/27/04 Never forget
Grimey Drawer
Which is not necessarily true anyway. Since everyone loves car analogies: in the '90s everyone knew a V6 was "better" than an I4, but these days they know they need more information, because a turbo I4 can very well be "better" than a V6. The fact is you are right, Intel has been successful in the branding game, despite the branding not always being accurate.

fart simpson
Jul 2, 2005

DEATH TO AMERICA
:xickos:

The dual core with hyperthreading mobile i5 I have in my work laptop is garbage.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

fart simpson posted:

The dual core with hyperthreading mobile i5 I have in my work laptop is garbage.

Which one is it? I have the i5-5300U in my work laptop and the i5-5257U in my home laptop and I've been pretty happy with them.

MaxxBot fucked around with this message at 21:49 on Mar 25, 2016

A Bad King
Jul 17, 2009


Suppose the oil man,
He comes to town.
And you don't lay money down.

Yet Mr. King,
He killed the thread
The other day.
Well I wonder.
Who's gonna go to Hell?
Got an m3 Compute Stick and it's a 900MHz proc that can't handle 1440p YouTube video as well as my quad-core i5, and this upsets me.

Anime Schoolgirl
Nov 28, 2002

A Bad King posted:

Got an m3 Compute Stick and it's a 900MHz proc that can't handle 1440p YouTube video as well as my quad-core i5, and this upsets me.
That's supposed to be hardware-decodable. If you use a Firefox variant or such, have you set "media.gmp-provider.enabled" to false?
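
For anyone else chasing this: the pref can be flipped in about:config, or pinned in the profile's user.js (standard Firefox pref syntax; whether it actually cures the stutter rests on the claim above):

```js
// user.js in the Firefox profile directory (standard pref syntax).
// Disables the GMP plugin provider, per the suggestion above.
user_pref("media.gmp-provider.enabled", false);
```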

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

A Bad King posted:

Got an m3 Compute Stick and it's a 900MHz proc that can't handle 1440p YouTube video as well as my quad-core i5, and this upsets me.

Which i5?

Wistful of Dollars
Aug 25, 2009

Introducing the new 'Goes Like Hell' line of processors.

mobby_6kl
Aug 9, 2009

by Fluffdaddy

MaxxBot posted:

Which one is it? I have the i5-5300U in my work laptop and the i5-5257U in my home laptop and I've been pretty happy with them.

The new T-series ThinkPads that IT wants me to use instead of my T520 are actually (very mildly) slower than what I have. Of course they're thinner and lighter, but why would I care if it spends 90% of the time in the dock anyway?

suck my woke dick
Oct 10, 2012

:siren:I CANNOT EJACULATE WITHOUT SEEING NATIVE AMERICANS BRUTALISED!:siren:

Put this cum-loving slave on ignore immediately!

mobby_6kl posted:

The new T-series ThinkPads that IT wants me to use instead of my T520 are actually (very mildly) slower than what I have. Of course they're thinner and lighter, but why would I care if it spends 90% of the time in the dock anyway?

Because they also have fewer features and ports (apart from USB 3), of course!

Lafarg
Jul 23, 2012

mobby_6kl posted:

The new T-series ThinkPads that IT wants me to use instead of my T520 are actually (very mildly) slower than what I have. Of course they're thinner and lighter, but why would I care if it spends 90% of the time in the dock anyway?

Free upgrade?

Setzer Gabbiani
Oct 13, 2004


PBCrunch posted:

garbage products

Say what you will about the FX series, but the Phenom II was AMD's last great lineup that could compete with Intel on the cheap. poo poo, I was able to make an 1100T last for 5 years with no real problems in modern games, save for some stuff on PCSX2, which is where most former AMD owners like me start to hit a wall, with SSE4.1/AVX2 being mandatory for pickier games.

sincx
Jul 13, 2012

furiously masturbating to anime titties
Given how Intel "borrowed" their branding from BMW (3-series, 5-series, 7-series), perhaps AMD should similarly borrow from Mercedes (C-class Zen, E-class Zen, S-class Zen). They can even justify it as: C=consumer, E=enthusiast, S=super.

A Bad King
Jul 17, 2009


Suppose the oil man,
He comes to town.
And you don't lay money down.

Yet Mr. King,
He killed the thread
The other day.
Well I wonder.
Who's gonna go to Hell?

Anime Schoolgirl posted:

That's supposed to be hardware-decodable. If you use a Firefox variant or such, have you set "media.gmp-provider.enabled" to false?

Why isn't this enabled by default!? Was using Firefox, googled the issue and how to fix it, problem solved. Thanks!


6500, got it while it was on sale at $180.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

They should have similar video performance; they have almost the same iGPU (Intel HD Graphics 515 for the Core m3, 530 for your i5).

A Bad King
Jul 17, 2009


Suppose the oil man,
He comes to town.
And you don't lay money down.

Yet Mr. King,
He killed the thread
The other day.
Well I wonder.
Who's gonna go to Hell?

Boiled Water posted:

They should have similar video performance; they have almost the same iGPU (Intel HD Graphics 515 for the Core m3, 530 for your i5).


One is a compute stick, the other is an HTPC in a 10-liter mITX case. One is a 5W dual-core part at 900MHz/2.2GHz, the other is a 65W quad-core part at 3.2/3.6GHz paired with a Fury Nano. I'm upset that the compute stick is not comparable. /s

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

And yet the graphics parts of the chips are identical. I'm not sure what you're getting at.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


If USB-C is capable of feeding full-fat video cards NOW, what possible incentive is there to make new video cards PCI Express exclusives? Available, even primary, sure; PCI Express is on most workstation form factor computers and USB-C is on at most five percent of them. But backward-thinking at a time when onboard's aiming at the gaming market seems like suicide.

Will there even be a reason for PCI Express in five years' time? For the workstation form factor at all?

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

Boiled Water posted:

And yet the graphics parts of the chips are identical. I'm not sure what you're getting at.

Throttling makes a big difference, especially 65W vs. 5W.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

Eletriarnation posted:

Throttling makes a big difference, especially 65W vs. 5W.

Surely not for YouTube videos?

Sir Unimaginative posted:

If USB-C is capable of feeding full-fat video cards NOW, what possible incentive is there to make new video cards PCI Express exclusives? Available, even primary, sure; PCI Express is on most workstation form factor computers and USB-C is on at most five percent of them. But backward-thinking at a time when onboard's aiming at the gaming market seems like suicide.

Will there even be a reason for PCI Express in five years' time? For the workstation form factor at all?

What are you asking? I've read and re-read it and can't make sense of your text.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

Boiled Water posted:

Surely not for YouTube videos?

Why not? Hardware decoding doesn't mean it uses no power, and 5W is a low ceiling.
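
A toy model of why the TDP gap dominates, assuming the usual rule of thumb that dynamic power goes as f·V² with voltage scaling roughly with frequency (so power scales roughly with the cube of clock speed); real throttling depends on workload, cooling, and firmware limits:

```python
# Toy DVFS model: P = C*V^2*f, and V scales roughly with f in the DVFS
# range, so P ~ f^3. Purely illustrative; real parts throttle based on
# workload, cooling, and firmware power limits.
def sustained_clock_ratio(budget_w: float, reference_w: float) -> float:
    """Clock ratio sustainable on budget_w vs. a reference_w part."""
    return (budget_w / reference_w) ** (1 / 3)

print(f"{sustained_clock_ratio(5, 65):.0%}")  # ~43% of the 65W part's clocks
```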

Sir Unimaginative posted:

If USB-C is capable of feeding full-fat video cards NOW, what possible incentive is there to make new video cards PCI Express exclusives? Available, even primary, sure; PCI Express is on most workstation form factor computers and USB-C is on at most five percent of them. But backward-thinking at a time when onboard's aiming at the gaming market seems like suicide.

Will there even be a reason for PCI Express in five years' time? For the workstation form factor at all?

I think there's a divide between 'capable' and 'ideal' here. Even with Thunderbolt, USB-C is not going to push as much bandwidth as PCIe 3.0 x16 in the near term unless I missed something.
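
The arithmetic backs that up. PCIe 3.0's 8 GT/s per lane and Thunderbolt 3's 40 Gbit/s link are published figures; the sketch below counts only line coding, so treat the results as ceilings rather than benchmarks:

```python
# Rough bandwidth ceilings, early-2016 numbers (line coding only).
pcie3_lane_gbit = 8 * 128 / 130              # 8 GT/s, 128b/130b: ~7.88 Gbit/s
pcie3_x16_gbyte = pcie3_lane_gbit * 16 / 8   # ~15.75 GB/s

tb3_link_gbyte = 40 / 8                      # Thunderbolt 3: 40 Gbit/s = 5 GB/s
tb3_pcie_gbyte = pcie3_lane_gbit * 4 / 8     # PCIe tunnel ~ one 3.0 x4: ~3.94 GB/s

print(f"PCIe 3.0 x16:   {pcie3_x16_gbyte:.2f} GB/s")
print(f"TB3 over USB-C: {tb3_link_gbyte:.2f} GB/s link, ~{tb3_pcie_gbyte:.2f} GB/s PCIe")
```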

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Boiled Water posted:

What are you asking? I've read and re-read it and can't make sense of your text.

I...I think they're saying that if USB-C can feed even halo-tier GPUs, and thus work externally and be swapped live, then when you compare port sizes PCIe ends up being hilariously inefficient with motherboard space, and you'll get way more out of just using USB-C. Like, theoretically you could get away with XCF/SLI using USB-C on a Nano-ITX board, which you'd never accomplish with PCIe.

Basically, eGPU over USB-C offers potentially huge scalability in a much smaller package? Compared to PCIe, anyway.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

Eletriarnation posted:

Why not? Hardware decoding doesn't mean it uses no power, and 5W is a low ceiling.

Sure it'll use power, but I don't think it'll hit that ceiling decoding 1080p videos. edit: Realized I was on the internet and looked it up: the Compute Stick runs fine with 1080p and is described as "struggling" at 1440p.

FaustianQ posted:

I...I think they're saying that if USB-C can feed even halo-tier GPUs, and thus work externally and be swapped live, then when you compare port sizes PCIe ends up being hilariously inefficient with motherboard space, and you'll get way more out of just using USB-C. Like, theoretically you could get away with XCF/SLI using USB-C on a Nano-ITX board, which you'd never accomplish with PCIe.

Basically, eGPU over USB-C offers potentially huge scalability in a much smaller package? Compared to PCIe, anyway.

Sure, that could be done, but when you factor in how huge graphics cards are, there's no reason to do it, except of course for the external GPU, and that business case has yet to prove itself.

champagne posting fucked around with this message at 19:10 on Mar 28, 2016

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Sir Unimaginative posted:

If USB-C is capable of feeding full-fat video cards NOW, what possible incentive is there to make new video cards PCI Express exclusives? Available, even primary, sure; PCI Express is on most workstation form factor computers and USB-C is on at most five percent of them. But backward-thinking at a time when onboard's aiming at the gaming market seems like suicide.

Will there even be a reason for PCI Express in five years' time? For the workstation form factor at all?

Because most users have no need of an easily detachable video card? And putting a USB-C socket inside the case to handle the video card sounds kinda insane? Unless you're also making it so that the video cards themselves can be made much smaller, there's no point in saving a minuscule amount of board space with the port; if you're really cramped for space you can use the smaller PCI Express slots and restrict your choice in video cards a little.

Plus I'm pretty sure you do need to do a non-trivial amount of extra engineering to ensure that your video card chipset will adequately function on both PCI Express and USB-C connection methods, as well as extra work for the drivers.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Boiled Water posted:

Sure, that could be done, but when you factor in how huge graphics cards are, there's no reason to do it, except of course for the external GPU, and that business case has yet to prove itself.

But GPUs really don't have to be gently caress huge with HBM2 and no longer being tied to PCIe. You could theoretically get smaller than MXM for the PCB, although the cooling solution at that point would be interesting.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

Yes, they do have to be huge. The largest part of a graphics card isn't the card itself but the heat sink attached to it.

SwissArmyDruid
Feb 14, 2014

by sebmojo
No, the *card* has to be huge, but the interface doesn't. With most proper video cards getting most of their power straight from the power supply as opposed to from the bus, we are kind of approaching the point where our graphics cards can be plugged in by cables, just like our SATA drives. Like, remove the PCIe fingers off the card, replace them with a hypothetical GPU cable port, a la SATA, and then mount the GPU into cages like you would a hard drive, as opposed to hanging it off the motherboard.

I mean, wouldn't that be nice for the mini-ITX crowd, obviating the need for those fuggin' 3M PCIe riser cards?

SwissArmyDruid fucked around with this message at 19:31 on Mar 28, 2016
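
The premise checks out against the PCIe spec's power ceilings (75W from the slot, 75W per 6-pin, 150W per 8-pin). A quick tally for a hypothetical 250W card:

```python
# PCIe power-delivery ceilings per the spec: 75W slot, 75W 6-pin, 150W 8-pin.
# For a big card most of the power already bypasses the slot, which is the
# premise behind a cabled, SATA-style GPU connector. Card wattage is made up.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

card_w = 250                           # hypothetical high-end card
by_cable_w = SIX_PIN_W + EIGHT_PIN_W   # 6-pin + 8-pin = 225W
via_slot_w = card_w - by_cable_w       # only 25W has to cross the slot

print(f"{via_slot_w}W from the slot, {by_cable_w}W ({by_cable_w / card_w:.0%}) by cable")
```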

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Boiled Water posted:

Yes, they do have to be huge. The largest part of a graphics card isn't the card itself but the heat sink attached to it.

Hmm, yeah, you'd have to come up with some exotic cooling solution for what is supposed to be a highly scalable, flexible, off-the-shelf way of getting a ton of GPUs running in parallel. At that point, you're not doing much better than the Razer Core. Otherwise, custom FPGA solution.

HMS Boromir
Jul 16, 2011

by Lowtax
what if you like

made a graphics card that was like a folded up version of current cards and then in the middle you put a heatsink, with a fan on each side

graphics cube

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

No, the *card* has to be huge, but the interface doesn't. With most proper video cards getting most of their power straight from the power supply as opposed to from the bus, we are kind of approaching the point where our graphics cards can be plugged in by cables, just like our SATA drives. Like, remove the PCIe fingers off the card, replace them with a hypothetical GPU cable port, a la SATA.

I mean, wouldn't that be nice for the mini-ITX crowd, obviating the need for those fuggin' 3M PCIe riser cards?

At that point you might net something smaller than the DAN A4 with HBM2 cards that are roughly the size of or smaller than the Nano.

SwissArmyDruid
Feb 14, 2014

by sebmojo
And (and I made this edit after you quoted it but before you posted) your GPUs could start being mounted in cages or on sleds elsewhere in the case.

Hell, that Thermaltake Level 10 might finally be viable after all, if you could put the one vertical plane in the center, and mount GPUs on the backside of the motherboard tray.

Honestly, I think the DAN A4 is probably as small as it gets, unless you start restricting card form factors to, like, Fury Nano-only. It's a given that everyone's video cards are going to get smaller (thinner coolers at minimum, thanks to the combination of the process shrink and full-sized chips not being launched yet, even if the areal shrink from HBM doesn't arrive until later).

SwissArmyDruid fucked around with this message at 20:37 on Mar 28, 2016

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Finally, the 5.25" bay makes a comeback!


Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

SwissArmyDruid posted:

And (and I made this edit after you quoted it but before you posted) your GPUs could start being mounted in cages or on sleds elsewhere in the case.

Hell, that Thermaltake Level 10 might finally be viable after all, if you could put the one vertical plane in the center, and mount GPUs on the backside of the motherboard tray.

Honestly, I think the DAN A4 is probably as small as it gets, unless you start restricting card form factors to, like, Fury Nano-only. It's a given that everyone's video cards are going to get smaller (thinner coolers at minimum, thanks to the combination of the process shrink and full-sized chips not being launched yet, even if the areal shrink from HBM doesn't arrive until later).

You could probably fit a pair of Fury Nanos in a DAN A4-style case in an mATX footprint.

Other than that there's not really a point to the Nano form factor - the DAN A4 fits a full-sized GPU, so why would you need a shortie card?
