priznat posted:They might have some junk cluttering up the motherboard where an x16 edge would bang into it, I guess. Workstation/server boards are how I know about open-ended daughterboard slots.
|
|
# ? Apr 21, 2024 23:36 |
|
|
Asrock tends to leave the x1 slots open-ended on about half of their mid-high end consumer boards. I've only ever seen ASUS do it for their workstation-marketed and enterprise things.
|
# ? Apr 22, 2024 05:46 |
|
In terms of PCIe slots, the best board I found was the Asus ProArt B650-Creator, notably better than the X670 ProArt if you need a third slot with more than two lanes. The B650 will do x8/x8/x4 PCIe 4.0, which might limit upgrades in the future, but current-gen Nvidia cards only do 4.0 and my NIC is PCIe 3.0, so I needed the lanes. In practice I'm only able to do about 20 Gbit/s with iperf on a 40GbE NIC, instead of the 32ish I was hoping for, but that might be limited by the CPU at the other end and a lack of proper offload somewhere. The X670 ProArt will do x8/x8 PCIe 5.0, but then the third slot is stuck at two lanes of PCIe 4.0.

edit: Oh yeah, there are MSI boards that will also do x8/x8/x4, but I had excluded them due to historically bad IOMMU support from MSI. If you're not concerned about that, the MSI boards will probably be even better.

Desuwa fucked around with this message at 16:08 on Apr 22, 2024 |
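For anyone sanity-checking those numbers, a quick back-of-envelope helps: a PCIe 3.0 x4 link tops out around 31.5 Gbit/s after line encoding, so "32ish" was already the ceiling for that NIC in that slot. A rough sketch (illustrative only; real usable throughput is a few percent lower still due to TLP and flow-control overhead):

```python
# Back-of-envelope PCIe link throughput. Illustrative; ignores TLP and
# flow-control overhead, which cost several percent more in practice.

GT_PER_LANE = {          # raw transfer rate per lane, GT/s
    "3.0": 8.0,
    "4.0": 16.0,
    "5.0": 32.0,
}

def usable_gbps(gen: str, lanes: int) -> float:
    """Approximate usable bandwidth in Gbit/s after 128b/130b encoding."""
    raw = GT_PER_LANE[gen] * lanes
    # PCIe 3.0 and later use 128b/130b encoding: ~1.5% line overhead.
    return raw * 128 / 130

print(f"Gen3 x4: {usable_gbps('3.0', 4):.1f} Gbit/s")   # ~31.5
print(f"Gen4 x4: {usable_gbps('4.0', 4):.1f} Gbit/s")   # ~63.0
```

So landing at 20 Gbit/s leaves real headroom on the link itself, which is consistent with the bottleneck being the far-end CPU or missing offloads rather than the slot.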
# ? Apr 22, 2024 07:11 |
|
The MSI MPG X670E Carbon WiFi could be another option; it will do x8/x8/x4 with the first two x8s at PCIe 5.0. I've been using this board populated with a 3080 in the first slot, a P420 SAS controller in the second, and an X520 2x 10Gb NIC in the third without any issues so far.
|
# ? Apr 22, 2024 09:51 |
Bjork Bjowlob posted:The MSI MPG X670E Carbon WiFi could be another option; it will do x8/x8/x4 with the first two x8s at PCIe 5.0. I've been using this board populated with a 3080 in the first slot, a P420 SAS controller in the second, and an X520 2x 10Gb NIC in the third without any issues so far.
|
|
# ? Apr 22, 2024 09:58 |
|
You'd also need to consider power delivery, as wider slots can supply more power; you'd need to make sure these cute small open-ended slots can handle cards that draw up to 75 watts.
|
# ? Apr 22, 2024 12:14 |
|
Kivi posted:You'd also need to consider the power delivery as wider slots have more power to them, you'd need to make sure that these cute small open ended slots can handle up to 75 watts cards.

PCIe power is all on that front stubby bit, which is the same on every size of slot. Every slot needs 75 watts by spec.

My guess would be that open-end slots are much easier to break or damage. Server stuff gets put together and then shoved into racks and nobody touches it until it fails or is obsolete. DIYers are always monkeying with their PCs. And if someone puts a heavy x16 GPU into an open-end x4 slot and then is moving or shipping the PC, it probably ends in tears. Thus the general absence in consumer boards and prevalence in server and pro-grade stuff.
|
# ? Apr 22, 2024 14:01 |
|
Klyith posted:PCIe power is all on that front stubby bit, which is the same on every size of slot.

Klyith posted:Every slot needs 75 watts by spec.
|
# ? Apr 22, 2024 15:23 |
|
25 W ought to be enough. Even the most inefficient 400Gbit ConnectX-7 barely goes past that at 26 W, and other models are less power-hungry.
|
# ? Apr 22, 2024 16:20 |
|
Mixing different types of memory with similar specs is still a no-go? I want to add more RAM to my NAS, but it turns out they have since switched from Micron E-die to Hynix C-die on the (mostly) same specific model of module.
|
# ? Apr 22, 2024 17:09 |
|
Combat Pretzel posted:Mixing different types of memory of similar specs is still a no-go?

Do you mean like one stick of each, or adding another pair of sticks?

Mixed sticks in a single pair: your problem is that the XMP/AMP values might be slightly different for the two. When loading XMP it just looks at one stick; it doesn't do any comparison or smarts. It should work fine at JEDEC, or with speed backed down one notch from the rated value. But to fully OC you may need manual settings for timings & voltage, because you have to target the worst value for both.

Adding a second pair: it doesn't matter, because even 4 perfectly matched sticks will get trained to slower timings, if they can run at rated speed at all.
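That "target the worst value for both" step is just an element-wise worst-case merge over the sticks' profiles: lowest common speed, loosest (highest) timings, highest required voltage. A toy sketch, with invented module values (not real E-die/C-die specs):

```python
# Toy worst-case merge of XMP-style memory profiles when mixing sticks.
# The module numbers below are invented for illustration, not real specs.

from dataclasses import dataclass

@dataclass
class Profile:
    mts: int        # speed in MT/s
    cl: int         # CAS latency
    trcd: int
    trp: int
    volts: float

def worst_case(profiles: list[Profile]) -> Profile:
    """Settings every stick should tolerate: slowest speed,
    loosest timings, highest voltage."""
    return Profile(
        mts=min(p.mts for p in profiles),
        cl=max(p.cl for p in profiles),
        trcd=max(p.trcd for p in profiles),
        trp=max(p.trp for p in profiles),
        volts=max(p.volts for p in profiles),
    )

micron_e = Profile(3200, 16, 18, 18, 1.35)   # invented numbers
hynix_c  = Profile(3200, 18, 19, 19, 1.35)
print(worst_case([micron_e, hynix_c]))
```

Memory training does essentially this automatically at JEDEC settings, which is why the mixed kit should still boot fine; the manual version only matters when chasing an overclock.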
|
# ? Apr 22, 2024 17:32 |
|
Going from two to four sticks, at JEDEC speeds. They're DDR4-3200 CL22 out of the box. Well, four of them are gonna run at most at 2666 from what the mainboard manual says.
|
# ? Apr 22, 2024 17:48 |
|
Combat Pretzel posted:Mixing different types of memory of similar specs is still a no-go?

It’s a NAS, run JEDEC speeds.

Combat Pretzel posted:Going from two to four sticks, at JEDEC speeds. They're DDR4-3200 CL22 out of the box. Well, four of them are gonna run at most at 2666 from what the mainboard manual says.

…well, if the mobo’s original release predates zen2, I bet you could get 3200 cl22, which is the fastest JEDEC speed.
|
# ? Apr 22, 2024 21:28 |
|
Tuna-Fish posted:True.
|
# ? Apr 24, 2024 19:39 |
|
crazypenguin posted:e: and it looks like Apple's A17 Pro is 35 TOPS, and A18 will probably come out at the same time, so maybe qualcomm isn't that far ahead of everyone here

For what it's worth, Apple traditionally used 16-bit TOPS as their marketing number (*), and their NPUs always double that number when doing 8-bit computations. Some think that for whatever reason, they chose to market the A17 Pro using 8-bit TOPS while sticking with 16-bit numbers for the M3. The reasoning is simple: in the past Apple has reused the same NPU block in both A-series and M-series chips, and the M3 and A17 Pro are both N3, launched at about the same time, and share lots of other cores (same CPUs, for example). There should be no reason why the A17 Pro's is about 2x as fast.

Frustratingly, in the months since launch, nobody seems to have benchmarked this to confirm or deny the hypothesis. Or if they have, I can't find it.

By the way, yes, this is a huge problem for NPU TOPS comparisons in general. Be sure you're comparing the same thing...

edited to add this footnote:

* - iirc they seldom or never explicitly said they were using 16-bit; people had to test the M1 to determine that the marketing number was 16-bit TOPS

BobHoward fucked around with this message at 03:24 on Apr 26, 2024 |
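The unit trap described above is easy to sketch: to compare NPUs you have to normalize marketed TOPS to a common precision, since halving the element width typically doubles throughput on these blocks. Illustrative only; the doubling assumption doesn't hold for every NPU design, and the 18 TOPS M3 figure is quoted from memory:

```python
# Normalize marketed TOPS figures to a common precision before comparing.
# Assumes throughput doubles when element width halves -- typical for these
# NPU blocks, but not guaranteed for every design.

def tops_at(marketed_tops: float, marketed_bits: int, target_bits: int) -> float:
    """Re-express a marketed TOPS figure at a different element width."""
    return marketed_tops * marketed_bits / target_bits

# Apple marketed the M3 NPU at 18 TOPS (quoted from memory, an assumption).
# If that figure is 16-bit and the A17 Pro's 35 TOPS is 8-bit, the two
# NPUs come out nearly identical:
print(tops_at(18, 16, 8))   # 36.0
```

Which is exactly the hypothesis in the post: the apparent 2x gap may be entirely a units artifact.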
# ? Apr 26, 2024 03:21 |
|
Oh interesting. I definitely didn't take the time to think carefully about whether these sources were using different units. I was confused about the difference between M3 and A17, but shrugged it off. I wonder if we'll see standardization here or if bitwidths and formats will continue to change.
|
# ? May 3, 2024 16:27 |
|
Looks like AMD is already ditching the (bad, stupid) "decoder ring" naming scheme to more closely copy Intel... and work the current AI scam into the name, lol
|
# ? May 7, 2024 16:13 |
|
It makes sense, but I’ve always loved AMD’s dumb naming conventions regardless of how stupid they were. We wouldn’t have gotten Threadripper without it.
|
# ? May 7, 2024 16:26 |
|
Really, are we going to have to do the thing of putting "AI" in all of the product names for a few years? It's going to be like raytracing all over again. All the marketing will be saying there's something revolutionary right around the corner which will totally need this new functional unit, and then if and when that (unnecessary) killer app appears we're all going to realize that the first gen hardware isn't actually fast enough to do it effectively.
|
# ? May 7, 2024 16:31 |
|
We're gonna see "AI" on everything the way we used to see "smart" and "cloud" on everything, eventually marketing will find another word to strip of all meaning
|
# ? May 7, 2024 16:40 |
|
DoombatINC posted:We're gonna see "AI" on everything the way we used to see "smart" and "cloud" on everything, eventually marketing will find another word to strip of all meaning This has already happened.
|
# ? May 7, 2024 16:42 |
|
Eletriarnation posted:Really, are we going to have to do the thing of putting "AI" in all of the product names for a few years? I went to a water/wastewater treatment conference last month where pretty much 90% of the presentations were either Direct Potable Reuse or PFAS; I joked with some colleagues that DPR/PFAS are to the water industry what AI now is to everything else.
|
# ? May 7, 2024 17:08 |
|
3D TVs, anyone? Curved ones, which are a terrible idea at TV viewing distances or with more than one person watching?
|
# ? May 7, 2024 18:13 |
|
I still have a 3D TV that I baby, because I actually enjoy watching movies in 3D. But I know I’m in the minority. Looking forward to the new Acer 3D monitor they’re coming out with that tracks eye movement to adjust the stereoscopic view (which also means it’s glasses-free).
|
# ? May 7, 2024 18:14 |
|
Noticed that Edge (on Android at least) has rebranded itself "Edge - AI Browser". As if I wasn't already uninterested in using it, sheesh
|
# ? May 7, 2024 18:36 |
|
Yeah, they rebranded it that way across all versions of Edge I think, since I saw it in Windows 11 branded that way, recently. Honestly I don’t have an issue with Edge - it works decently enough, and I prefer it to Chrome. Firefox and Safari are my daily drivers though.
|
# ? May 7, 2024 18:57 |
|
Hey, these NPUs are gonna revolutionize PCs by, uh... doing Teams background blurring a little more efficiently, and that one Photoshop filter. Also running some parts of Microsoft's Copilot AI locally, even though everyone hates it and instantly uninstalls it. The AI Miracle®!

I do think it's funny to rebrand your entire SoC because of the inclusion of a little fixed-function block that takes up sub-10% of your die space and accelerates a handful of things. Shoulda called them the AMD Ryzen AV1 series; at least that fixed-function block might actually be useful in the future.
|
# ? May 7, 2024 19:03 |
|
AMD has always been quick to fad up the names. Like the Athlon XP dropping at the same time as Windows XP.
|
# ? May 7, 2024 19:15 |
|
As dumb as the AI hype is right now, I'm not going to complain about them adding hardware capabilities to client devices. They're investing in creating more capabilities for us, and the alternative was probably them shipping all our data off to their cloud. (Not that they won't want to do that anyway, of course.)

Actual use cases do need to show up, but I suspect they exist or are coming. Dumb gimmicks like ChatGPT are obscuring the useful cool stuff. iPhones have been able to search pictures with text queries for a while; it'd be neat to see that become a more standard feature of PCs. Power-efficient NPUs and E-cores might enable that kind of indexing without sucking battery and spinning up fans.
|
# ? May 7, 2024 19:20 |
|
Are the “Pro” Ryzen CPUs usually available for retail sale, or are those just for system builders? I'm looking at the Ryzen 5 Pro 8000 series (maybe the 8600G) for a NAS build, as it has integrated graphics, low TDP, and ECC support.
|
# ? May 7, 2024 19:20 |
|
priznat posted:Are the “pro” ryzen cpus usually available for retail sale or are those just for system builders? No retail, for system builders or at least by-the-tray only.
|
# ? May 7, 2024 19:36 |
|
Klyith posted:No retail, for system builders or at least by-the-tray only.

Dammit, that'd be the ideal one to get. I'm never building a new NAS at this rate (it's literally been over a year)
|
# ? May 7, 2024 19:38 |
|
Would these NPU blocks be able to work in conjunction with a GPU? I can imagine at best this could let you pool system and GPU memory and computation. I'm sure the PCI Express bus will be a bottleneck for this sort of thing.
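On whether the bus would bottleneck pooled memory: yes, by over an order of magnitude. A rough comparison, with approximate bandwidth figures (the 3080 and DDR5 numbers are ballpark, not exact):

```python
# Rough bandwidth comparison in GB/s. All figures approximate.

# PCIe 4.0 x16: 16 GT/s * 16 lanes, 128b/130b encoding, 8 bits per byte.
pcie4_x16 = 16e9 * 16 * 128 / 130 / 8 / 1e9   # ~31.5 GB/s usable

vram_3080 = 760.0    # RTX 3080 GDDR6X, ballpark figure
ddr5_dual = 80.0     # dual-channel DDR5, ballpark figure

print(f"PCIe 4.0 x16: {pcie4_x16:.1f} GB/s")
print(f"VRAM is ~{vram_3080 / pcie4_x16:.0f}x faster than the bus")
```

So anything that has to cross the bus per-access, rather than in bulk transfers, would crawl compared to keeping the working set in local memory on either side.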
|
# ? May 7, 2024 19:39 |
|
crazypenguin posted:Actual use cases do need to show up, but I suspect they exist or are coming. Dumb gimmicks like chatgpt are obscuring the useful cool stuff. Iphones have been able to search pictures with text queries for awhile, it'd be neat to see that become a more standard feature of PCs. Power efficient NPUs and e-cores might enable that kind of indexing without sucking battery and spinning up fans.

Honestly, I'm not really anti-NPU overall; I do think they have some potential niche uses. Which is why AMD had already put the NPUs in the design long before the current stupid AI marketing scams kicked into gear. Strix Point, the chips being rebranded with AI in the name, likely taped out years ago... they've just decided to cash in on the current hype/scam by renaming it.
|
# ? May 7, 2024 20:01 |
|
|
Yeah, the hardware improvements being made are good (as long as they aren't to the detriment of general CPU performance improvements); it's just the general “AI HERE, AI THERE, WE NEED AI EVERYWHERE” marketing that's annoying/bad.
|
# ? May 7, 2024 21:02 |