champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

And when you consider that the large Intel sockets (LGA 2011, I think) will fit on a mini-ITX board, it's not like you need that much space.

Edit: it may also be that MSI are terribad at making motherboards, which is not outside the realm of possibility.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
So long as they undercut Intel or offer more cores for the same dollar, combined with lower board costs, the platform savings from moving to Red Team will be pretty great. I just made a Haswell build I can dump for the price I paid (seriously, wtf) and grab a Ryzen offering for peanuts.

GRINDCORE MEGGIDO
Feb 28, 1985


SwissArmyDruid posted:

Had they just gone and aped Intel's mounting hole positions, they'd probably have freed up a lot more space to bring those RAM slots in closer.

I'd have thought that'd be better for higher-frequency RAM.

I wonder what the memory controller's like. What kind of RAM speed do people get out of Bulldozer when they overclock it? I know it doesn't relate much to this (hopefully).

SwissArmyDruid
Feb 14, 2014

by sebmojo
Latency *is* a thing that can be affected by trace length from the socket to the RAM slots, but I'm not willing to make a claim as to whether or not it's significant; I don't have any of my bookmarks to those tests on my phone.
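
A rough back-of-envelope (my own assumed numbers, not anything from those bookmarked tests): FR4 board material propagates signals at roughly 170 ps per inch, so you can weigh an extra inch or two of trace against a DRAM clock period.

code:

# Sketch: does an extra inch or two of trace matter at DDR4 speeds?
# Assumes ~170 ps/inch propagation in FR4, a common rule of thumb.
PS_PER_INCH = 170

def clock_period_ps(mts):
    """Clock period in ps for a DDR speed grade (MT/s = 2 transfers/clock)."""
    return 1e12 / (mts / 2 * 1e6)

extra_delay = 2 * PS_PER_INCH            # a hypothetical 2 extra inches of trace
period = clock_period_ps(3200)           # DDR4-3200
print(f"{extra_delay} ps extra vs a {period:.0f} ps clock period")
# ~340 ps against ~625 ps: a real fraction of a cycle, which is why boards
# length-match DRAM traces even if the user-visible latency impact is murky.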

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
Zen copypasta from the AT forums, which is in turn copypasta from Reddit:

quote:

https://www.reddit.com/r/Amd/comments/5mfjun/amd_drops_huge_news_on_ryzen_overclocking_and/

Summary...

-There will be a full range of CPUs available at launch, not just 8-core. (me: woot)
-All Ryzen chips are overclockable. (me: woot x2)
-However, not all mobos will allow overclocking, only X370, X300, and B350. (me: still a lot better than Intel)
-Only X370 will allow CrossFire and SLI. (Not quite right, see edit below.) (me: I doubt anybody gives a poo poo about multi-GPU by this point, so)

AMD is aiming for early to mid Q1 for launch. (GDC?)
Edit: Looking at a few of the B350 boards shown at CES, some of them definitely support CrossFire. I'm going to say PCWorld has this a little wrong. I would say that B350 will support x16/x4 CrossFire only; x8/x8 CrossFire/SLI will be reserved for X370, similar to how Intel differentiates the Z and B/H chipsets.

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.
Does the new mobo chipset support Thunderbolt 3? Because I really want to put my PC under the stairs and run an optical Thunderbolt cable to my bedroom.

PC LOAD LETTER
May 23, 2005
WTF?!
The chipset doesn't, but any vendor can slap an Intel TB chip on there if they want. I don't think any mobo has been announced with such a setup, though.

Active USB 3 cables go out to 50'+, so one of those plus a USB 3 dock of some sort might do what you want for lots less, of course. I can't believe how much those longer optical TB cables cost.

edit: \/\/\/\/ Honestly, TB vs USB 3.1/3.2 feels a lot like FireWire vs USB 2 all over again. It's got Intel behind it, and that is huge, but it still seems to be hardly getting any use at all, even if the tech itself is impressive. I think if USB hadn't improved to 10 Gb/s with USB 3.1 Gen 2, it might've had a good chance, but once that happened it's just too niche and expensive.

PC LOAD LETTER fucked around with this message at 16:03 on Jan 7, 2017

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Thunderbolt is a loving travesty on the PC. How long has the standard been out? And there are still no/very, very few mainboards with a port? Up until recently, you had to install an expansion card that plugged into PCIe and some header on the mainboard.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

Combat Pretzel posted:

Thunderbolt is a loving travesty

I've yet to see a compelling use case for Thunderbolt that isn't well covered by regular USB-C 3.1.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

PC LOAD LETTER posted:

The chipset doesn't, but any vendor can slap an Intel TB chip on there if they want. I don't think any mobo has been announced with such a setup, though.

Active USB 3 cables go out to 50'+, so one of those plus a USB 3 dock of some sort might do what you want for lots less, of course. I can't believe how much those longer optical TB cables cost.

edit: \/\/\/\/ Honestly, TB vs USB 3.1/3.2 feels a lot like FireWire vs USB 2 all over again. It's got Intel behind it, and that is huge, but it still seems to be hardly getting any use at all, even if the tech itself is impressive. I think if USB hadn't improved to 10 Gb/s with USB 3.1 Gen 2, it might've had a good chance, but once that happened it's just too niche and expensive.

I'd go as far as to say TB is DOA tech. It's way too confusing as a standard for consumers, nobody except Intel/Apple wants to suck Intel's dick on licensing costs, and it's competing with a free, no-nonsense I/O called USB 3.0 that already offers 625 MB/s.

Meanwhile, average office exec #123423 is still plugging in a 30-year-old VGA cable for his meeting presentation.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Boiled Water posted:

I've yet to see a compelling use case for Thunderbolt that isn't well covered by regular USB-C 3.1.
I was mostly interested in it a while ago for cheap higher-than-Gigabit networking. But alas...

penus penus penus
Nov 9, 2014

by piss__donald

Combat Pretzel posted:

Thunderbolt is a loving travesty on the PC. How long has the standard been out? And there are still no/very, very few mainboards with a port? Up until recently, you had to install an expansion card that plugged into PCIe and some header on the mainboard.

I don't know much about Thunderbolt, but I'm pretty sure it's on all mid-range and up Z170 boards. It's confusing, though, because it's labeled USB 3.1 (and is a regular USB port) and Thunderbolt 3 simultaneously. I didn't realize Thunderbolt was Intel-only, but since it shares a port with USB 3.1, I assume AMD can use it as well?

SwissArmyDruid
Feb 14, 2014

by sebmojo
Say what you will about Thunderbolt, it's presently the only thing that can even think about making external GPU docks a reality.

No, I don't think the change to USB type-C connectors is any help. I frankly think that in a perfect hypothetical world, Apple would have made the Lightning connector open source, and we could actually be using THAT instead of the abortion that USB type-C presently is*. I feel like this could have opened a pathway for Thunderbolt to migrate to the PC as a result, assuming it was done early enough in its lifespan.



*(It's easier and cheaper to replace a cable than to replace a port when the tab snaps off. That, I think, could have made Thunderbolt relevant to more people. But no, Apple gotta :apple:.)

SwissArmyDruid fucked around with this message at 18:34 on Jan 7, 2017

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

The Lightning connector is only six pins, right? Is that enough for a USB 3.1 signal?

SwissArmyDruid
Feb 14, 2014

by sebmojo


Eight pins, but I *believe* the contacts in the Lightning connector are double-sided. The Lightning connector doesn't care which way you plug it in, after all, and then I think negotiation takes care of the rest?

In any event, I think that if Apple weren't so goddamn obsessed with screaming "THIN! THIN! THIN!", you could probably double the number of wires in the cable: eight wires going to one side of the connector, eight going to the other, and definitely enough wires for PCIe x4.

edit: Ha, no, I forgot that type-C is 24-pin, not 16.
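
For what it's worth, a quick wire count on that idea (my own arithmetic, counting only the high-speed data pairs):

code:

# Wires for the hypothetical PCIe x4-over-Lightning idea (data pairs only):
lanes = 4
wires_per_lane = 2 * 2          # one TX pair + one RX pair, 2 wires each
print(lanes * wires_per_lane)   # 16 -> "eight per side, doubled" checks out
# USB type-C really is 24-pin, but only 8 pins are SuperSpeed pairs;
# Thunderbolt 3 gets its bandwidth by driving those pairs much faster.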

SwissArmyDruid fucked around with this message at 18:49 on Jan 7, 2017

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
The Lightning connector is garbage for reliability, even beyond its lack of pins. And it definitely couldn't carry the 5 amp / 20 volt (that's 100 watts) power a USB-C connector can be specced for under the USB Power Delivery spec that accompanies USB 3.0.
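
A quick sanity check of that wattage; the Lightning comparison figure is an assumed charger rating, not anything from a spec:

code:

# USB Power Delivery's top profile is just volts times amps:
usb_pd_watts = 20 * 5          # 20 V * 5 A = 100 W, per the USB PD spec
lightning_watts = 5 * 2.4      # ~12 W, an assumed iPad-class 5 V / 2.4 A brick
print(usb_pd_watts, lightning_watts)   # 100 vs 12: no contest for power duty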

SwissArmyDruid
Feb 14, 2014

by sebmojo
On the other hand, I am dreading the day I have to tell someone they now have a busted type-C port, the expensive kind with Thunderbolt 3, because they first didn't plug the type-C connector in hard enough, then too hard.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Is something different with the combo USB3/TB ports? If the ones on my phones are worth anything as a reference, you've gotta be really stupid to not plug it in "hard enough".

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

Combat Pretzel posted:

I was mostly interested in it a while ago for cheap higher-than-Gigabit networking. But alas...

10G cards don't cost much anymore if you get them used, and cheaper 2.5/5G controllers that can use existing Cat5e installations, like 1G does, are on the horizon.

Not quite like the 40G of TB3, but without a flash-based array you'd have trouble saturating 10G anyway.
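
Rough numbers on that, using typical sustained drive speeds (assumed figures, not benchmarks):

code:

# Link ceilings vs what a single drive can realistically push:
links = {"TB3": 40, "10GbE": 10, "GigE": 1}                # Gb/s
drives = {"NVMe SSD": 3500, "SATA SSD": 550, "HDD": 180}   # MB/s, typical
for name, gbps in links.items():
    print(f"{name}: {gbps / 8:.3f} GB/s ceiling")
for name, mbps in drives.items():
    print(f"{name}: {mbps / 1000:.3f} GB/s sustained")
# Only the NVMe drive (3.5 GB/s) can fill 10GbE's 1.25 GB/s; a lone SATA
# SSD or spinning disk can't, hence "flash-based array".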

RyuHimora
Feb 22, 2009

GRINDCORE MEGGIDO posted:

I'd have thought that'd be better for higher frequency RAM's.

I wonder what the memory controllers like. What kind of RAM speed do people get out of Bulldozer when they overclock it? I know it doesn't relate much to this (hopefully).

Both my FX-8350 and Athlon 750K, when overclocked to 4.5 GHz+, were able to run DDR3 2400, but I don't know if that's common or not. It was very turn-key though, I had zero issues.

fake edit: The Athlon actually got to 2666 before I chickened out, IIRC.

RyuHimora fucked around with this message at 05:47 on Jan 8, 2017

Kazinsal
Dec 13, 2011


Eletriarnation posted:

10G cards don't cost much anymore if you get them used, and cheaper 2.5/5G controllers that can use existing Cat5e installations, like 1G does, are on the horizon.

Not quite like the 40G of TB3, but without a flash-based array you'd have trouble saturating 10G anyway.

10GBASE-T is terrifying. Insane power consumption, and you get to experience your network cable getting physically warm to the touch. 10GBASE-CR/Direct Attach cables are more expensive (since they're twinaxial cables permanently affixed to a pair of SFP+ transceivers) and have pretty severe length limitations but are much better.

e: I could be wrong, but IIRC 1000BASE-T uses about 0.4-0.6 watts per port, while 10GBASE-T can use upwards of 12 watts on really cheap equipment.
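
Scaled up, those per-port figures get ugly fast. A sketch, assuming a hypothetical 24-port switch and the numbers above:

code:

# Per-port PHY power, using the (admittedly rough) figures above:
ports = 24                      # hypothetical 24-port switch
gig_w, tengig_w = 0.5, 12.0     # W/port: 1000BASE-T vs worst-case 10GBASE-T
print(f"1000BASE-T: ~{ports * gig_w:.0f} W of PHY power")
print(f"10GBASE-T:  ~{ports * tengig_w:.0f} W of PHY power")
# ~12 W vs ~288 W: that gap is why the cables run warm, and why SFP+
# direct-attach at ~0.1 W/port looks attractive despite the cable cost.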

Kazinsal fucked around with this message at 09:06 on Jan 8, 2017

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
http://hexus.net/tech/news/cpu/101290-amd-confirms-ryzen-cpus-will-unlocked/

Supposedly all of the Ryzen CPUs will have unlocked multipliers. Nice.

Edit: whoops, this has already been mentioned

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Kazinsal posted:

10GBASE-T is terrifying. Insane power consumption, and you get to experience your network cable getting physically warm to the touch. 10GBASE-CR/Direct Attach cables are more expensive (since they're twinaxial cables permanently affixed to a pair of SFP+ transceivers) and have pretty severe length limitations but are much better.
That means if I were to go with SFP, I wouldn't have a new heating element between my computer and the NAS? I think SFP DA cables come in 5m lengths, which should be sufficient in my case. Sadly, I haven't found cheap used cards that work in FreeNAS.

--edit: Heh, passive DAC SFP+ is 0.1W.

Combat Pretzel fucked around with this message at 17:09 on Jan 9, 2017

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
It might have been mentioned already, but I didn't realize that Ryzen chips will have 24 PCIe lanes directly from the CPU (e: 4x reserved for the chipset). As an Intel 750 NVMe SSD owner, this is putting me on the bandwagon and making me reconsider Intel's X299 HEDT platform as my next upgrade, if Socket AM4 CPUs do end up with a worthwhile performance/price difference from Skylake-E/Kaby Lake-X. Video is Linus, but he's not being too insufferable in this one:

https://www.youtube.com/watch?v=vPByz-PtWkw


e2: the chipset is slightly future-leaning too, with USB 3.1 Gen 2

e3: while I'm on the foolish topic of futureproofing, sure, why not, let's mention AMD's going to keep making CPUs for this socket until 2020
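
A quick lane-budget sketch of what those numbers mean, assuming the usual x16 GPU slot and an x4 M.2 slot:

code:

# AM4 lane budget, per the figures above (24 total, 4 to the chipset):
total_lanes, to_chipset = 24, 4
gpu_slot, m2_slot = 16, 4       # the usual x16 GPU slot + one x4 M.2 drive
print(total_lanes - to_chipset - gpu_slot - m2_slot)
# 0 -> an x16 GPU and a CPU-attached NVMe SSD use up the budget exactly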

Sidesaddle Cavalry fucked around with this message at 02:51 on Jan 10, 2017

SwissArmyDruid
Feb 14, 2014

by sebmojo
I have heard Papermaster's comments about how “We’re not going tick-tock,” and “Zen is going to be tock, tock, tock.”

http://www.pcworld.com/article/3155129/components-processors/amd-says-its-zen-cpu-architecture-is-expected-to-last-four-years.html

Great. We'll be stuck on 14nm chips until 2020. Although who knows how much of this is "because GloFo can't un-gently caress their poo poo sufficiently to get us onto 10nm before then".

edit: In retrospect, what he PROBABLY meant was "tick, tick, tick".

Ticks: die shrinks and optimizations
Tocks: new microarchitectures

edit edit: VVVVV :downsgun:

SwissArmyDruid fucked around with this message at 04:21 on Jan 10, 2017

Rastor
Jun 2, 2001

I saw another source saying that AMD is planning to stay on 14nm for a while. Possibly related to reports that GloFo is going to try to skip over 10nm.

wargames
Mar 16, 2008

official yospos cat censor
We know the IBM group may have found a way to 7nm, and Intel is the only one working on 10nm?

http://arstechnica.com/gadgets/2015/07/ibm-unveils-industrys-first-7nm-chip-moving-beyond-silicon/

and honestly I see us stuck at 7nm for a very very very long time.

Anime Schoolgirl
Nov 28, 2002

Sidesaddle Cavalry posted:

e3: while I'm on the foolish topic of futureproofing, sure, why not, let's mention AMD's going to keep making CPUs for this socket until 2020
Considering the competition has been changing sockets every 15 months, this is pretty welcome.

filthychimp
Jan 2, 2006
Damned dirty ape

wargames posted:

We know the IBM group may have found a way to 7nm, and Intel is the only one working on 10nm?

http://arstechnica.com/gadgets/2015/07/ibm-unveils-industrys-first-7nm-chip-moving-beyond-silicon/

and honestly I see us stuck at 7nm for a very very very long time.

As you can tell from the article's skeptical tone, this was pretty much expected. The real challenge is making a process that lets you mass-produce millions of chips, and do so economically.

Everyone is in the process of shoving their 10nm processes out the door (except GloFo, who's skipping that node), and work is being done on 7nm. 5nm is the area of active research, and where a new transistor design or new materials are going to have to come in. Here's a more detailed article on 5nm.

SwissArmyDruid
Feb 14, 2014

by sebmojo
The amount of faith I have in GloFo not albatrossing the gently caress out of AMD is nil. Intel is still on track for 2020, right? 2020 is going to roll around, GloFo will still be in risk production, citing "unforeseen issues" and "developmental hurdles", and they won't have a 10nm node to fall back on.

SwissArmyDruid fucked around with this message at 06:25 on Jan 10, 2017

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
There is always FD-SOI for GloFo and AMD, of which they have 22nm, 12nm, and 7nm variants. I have no idea what the hurdles would be in moving from FinFET to FD-SOI, or whether FD-SOI is even suitable for such things, but it's apparently a much less complex process, if not more expensive in small volumes.

Also keep in mind that GloFo's 7nm FinFET description sounds more like Samsung/TSMC 10nm, and they aren't planning EUV for the first run. My guess is AMD does 14nm until mid-2018 (so Zen, Raven, and Zen+, which is likely an update to add AVX-512), then shifts over to first-generation "7nm GloFo" in late 2018/early 2019, with another batch of chips arriving in mid-2020 using EUV if it's available. This is based on AMD's own roadmap for their GPUs, as Vega 20 is supposed to be a 7nm chip, but that's likely on the back of GloFo's promises.

Gwaihir
Dec 8, 2009
Hair Elf

Sidesaddle Cavalry posted:

It might have been mentioned already, but I didn't realize that Ryzen chips will have 24 PCIe lanes directly from the CPU (e: 4x reserved for the chipset). As an Intel 750 NVMe SSD owner, this is putting me on the bandwagon and making me reconsider Intel's X299 HEDT platform as my next upgrade, if Socket AM4 CPUs do end up with a worthwhile performance/price difference from Skylake-E/Kaby Lake-X. Video is Linus, but he's not being too insufferable in this one:

https://www.youtube.com/watch?v=vPByz-PtWkw


e2: the chipset is slightly future-leaning too, with USB 3.1 Gen 2

e3: while I'm on the foolish topic of futureproofing, sure, why not, let's mention AMD's going to keep making CPUs for this socket until 2020

While I love silly high-end stuff like the Intel 750, what are you running on it that would possibly perform better on CPU-driven PCIe lanes vs the chipset's? Have there ever even been benchmarks of SSDs running off CPU lanes vs PLX chips or chipset slots?

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry

Gwaihir posted:

While I love silly high-end stuff like the Intel 750, what are you running on it that would possibly perform better on CPU-driven PCIe lanes vs the chipset's? Have there ever even been benchmarks of SSDs running off CPU lanes vs PLX chips or chipset slots?

I tested it on my system. I didn't do it scientifically or anything, but the performance increase in Sandra was barely even there.

crazypenguin
Mar 9, 2005
nothing witty here, move along
I don't think you should see a performance increase. I think it's mostly about bottleneck avoidance.

We have a maximum of 32 Gb/s (4 PCIe lanes) of bandwidth between the CPU and the chipset, and we're starting to get a lot of very high-bandwidth devices: 10 Gb/s USB 3.1 Gen 2, 10 Gb/s Ethernet, etc. I was ABOUT to say the bottleneck was somewhat theoretical, but apparently we already have NVMe SSDs hitting 3500 MB/s, which all but saturates that link on its own.

So giving NVMe SSDs their own lanes avoids starving other devices during peak usage.
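
To put numbers on it, using the figures above:

code:

# Chipset uplink math, using the numbers above:
uplink_gbps = 32                     # PCIe 3.0 x4 between CPU and chipset
nvme_gbps = 3500 * 8 / 1000          # a 3500 MB/s NVMe SSD -> 28 Gb/s
usb_gbps, eth_gbps = 10, 10          # USB 3.1 Gen 2 + 10 Gb/s Ethernet
demand = nvme_gbps + usb_gbps + eth_gbps
print(f"{demand:.0f} Gb/s potential demand vs {uplink_gbps} Gb/s uplink")
# 48 vs 32: the SSD alone nearly fills the link, so giving it CPU lanes
# keeps the chipset uplink free for everything else.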

Gwaihir
Dec 8, 2009
Hair Elf
Right, and the question remains, when are you seeing usage like that at home, heh?

NihilismNow
Aug 31, 2003
Agreed, 640kb should be enough for anyone.

E: Maybe someone has a :krad: homelab with teamed 10Gb NICs doing a backup from an Intel 750 to a remote system.

NihilismNow fucked around with this message at 19:27 on Jan 10, 2017

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


Gwaihir posted:

Right, and the question remains, when are you seeing usage like that at home, heh?

Maybe not yet, but if you can go as long with this as you could with the 2500K, you very likely will be in 6 years.

EdEddnEddy
Apr 5, 2012



4K video editing at home is a thing, and it's only going to get bigger each year, especially with 360° video coming for VR, which needs >4K to not look like crap.

Remember that Radeon Pro with the onboard SSD scrubbing 8K video in real time?

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Gwaihir posted:

Right, and the question remains, when are you seeing usage like that at home, heh?

NihilismNow posted:

Agreed, 640kb should be enough for anyone.

E: Maybe someone has a :krad: homelab with teamed 10Gb NICs doing a backup from an Intel 750 to a remote system.

:smith:

Pryor on Fire
May 14, 2013

they don't know all alien abduction experiences can be explained by people thinking saving private ryan was a documentary

EdEddnEddy posted:

4K video editing at home is a thing, and it's only going to get bigger each year, especially with 360° video coming for VR, which needs >4K to not look like crap.

Remember that Radeon Pro with the onboard SSD scrubbing 8K video in real time?

It's really not. Yes, there are a few jobs in this field, but most of them have gone away in the past 10 years. 90% of "video production" just happens on YouTube or on phones nowadays.

Now, the number of millennials who think they "need" a high-end computer and camera for the awesome videos they're going to make isn't decreasing at all, even if only 1 in 100 of those people actually does anything with it.
