Saukkis
May 16, 2003

Unless I'm on the inside curve pointing straight at oncoming traffic the high beams stay on and I laugh at your puny protest flashes.
I am Most Important Man. Most Important Man in the World.
Why do you have to be so confrontational about it? Everyone can use whichever works best. Or both.

I have connected my main 165Hz monitor to my gaming desktop, and the 144Hz secondary monitor with HDMI, so that I can connect my work laptop to the 165Hz with HDMI and to the 144 with DP. The desktop is connected to the old tertiary monitor with an HDMI->DVI cable, and the work laptop with HDMI. A secondary work laptop is then connected to the 144 with a DP->HDMI adapter and an HDMI cable, but it runs at only 1080p instead of 1440p, what the hell is wrong with that? And also VGA to the tertiary.

The desktop is of course connected with HDMI to the AV receiver for audio, and through it to the TV. I'm about ready to invest in a 7-meter optical HDMI 2.1 cable for the TV so I can get the full 4K 120Hz.

Let me tell you, pandemic remote work has been a challenge in trying to figure out all the connections. "So I have these ports, adapters, docks and cables. I can connect that to there and from there to that, then I can cross these here. Wait, what was I doing?" When I bought my new GPU, a major consideration was that it had enough ports of suitable types. I had to drop most GPU models as unsuitable.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Anime Schoolgirl posted:

They're also more capable than my hobby demands, which is "finding the shittiest possible modern hardware to play big-budget games on"

my brother in Christ 🙏🙏🙏

Anime Schoolgirl
Nov 28, 2002

gradenko_2000 posted:

my brother in Christ 🙏🙏🙏
Starfield on Mendocino is gonna be exquisite :getin:

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Anime Schoolgirl posted:

AMD probably will never do an AM5 Athlon 3000G successor, especially considering how expensive the AM5 chipset setup has to be

why not simply rebrand the X300 chipset for entry-level Zen4 products :nsa:

(Actually, while this is a meme, the joke is that X300 literally means "no chipset": all AMD processors are SoCs and are perfectly capable of running without one, which technically also means there's no reason you can't put an X470 or whatever on an AM5 chip. The chipsets are just the world's most expensive IO expander cards; in stark contrast to Intel, they play no role in processor bringup, and you literally could put an X470 on AM5 and it would work fine. There is absolutely nothing special about an "AM5 chipset" other than it (maybe) having PCIe 5 capability. See also TR4 vs TRX40, and how that was bullshit when they maintained three generations of socket compatibility with SP3.)

(naturally, Asrock has slapped one on a PCIe card :catdrugs:)

I have to imagine that both AMD and Intel employ a full-time Asrock Wrangler whose job is just to stay on top of whatever crackhead thing their engineering department is up to this week and yoink them offstage if they start to act out in public

Paul MaudDib fucked around with this message at 07:16 on Mar 17, 2023

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.

Pittsburgh Fentanyl Cloud posted:

You're welcome.

As long as DP does 165hz at 2560x1440 I don't really need to have any other preferences, until I change my monitor.

yeah dp is still the best option for a lot of monitors, because they don't support the newer hdmi standards that can match or exceed it. it's really only 4k at high framerates where you start to run into the limits of dp 1.4
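
rough back-of-the-envelope sketch of where that limit sits, for anyone curious - the ~8% blanking overhead is just an assumed stand-in for reduced-blanking timing, and this ignores DSC entirely:
code:
# DP 1.4 = HBR3 x4 lanes = 32.4 Gbit/s raw, minus 8b/10b encoding overhead
DP14_PAYLOAD_GBPS = 32.4 * 8 / 10  # ~25.92 Gbit/s usable

def needed_gbps(w, h, hz, bpp=24, blanking=1.08):
    # uncompressed bandwidth: pixels x refresh x bits per pixel x blanking overhead
    return w * h * hz * bpp * blanking / 1e9

modes = {
    "2560x1440 @ 165Hz": (2560, 1440, 165),
    "3840x2160 @ 120Hz": (3840, 2160, 120),
    "3840x2160 @ 144Hz": (3840, 2160, 144),
}
for name, mode in modes.items():
    print(f"{name}: ~{needed_gbps(*mode):.1f} Gbit/s needed, {DP14_PAYLOAD_GBPS:.2f} available")

1440p/165 comes out around 16 Gbit/s, 4k/120 is already scraping the ceiling, and 4k/144 is over it without DSC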

Arzachel
May 12, 2012

Paul MaudDib posted:

why not simply rebrand the X300 chipset for entry-level Zen4 products :nsa:

I need gently caress all for IO. Gimme a cheap chipset-less itx board you cowards!

Hockenheim
Oct 20, 2022

by VG
DP is far and away the superior physical interface for computer stuff.
It’s doomed to be the next FireWire because they never got TV and amplifier makers onboard though.

hobbesmaster
Jan 28, 2008

Paul MaudDib posted:

I have to imagine that both AMD and Intel employ a full-time Asrock Wrangler whose job is just to stay on top of whatever crackhead thing their engineering department is up to this week and yoink them offstage if they start to act out in public

hobbesmaster posted:

Somewhere an AMD product manager’s phone starts ringing, they pick it up and see Asrock as the caller id. They hit answer, scream “NO” and hang up immediately.

Anyway I’m not sure what the market for that x670 expander would be. ASRock proved it could be done at least!

Wiseblood
Dec 31, 2000

What's going on with the Gigabyte B650E Aorus Master? It's not in stock anywhere (except marked up from 3rd party sellers) and B&H is saying it's discontinued.

Gyrotica
Nov 26, 2012

Grafted to machines your builders did not understand.
Update on heavily modded Stellaris on 7950X3D: by 2463 I'm at 30 seconds per month or so.

(heavily modded includes Gigastructures, ACOT, AOT, UI overhauls, bunch of other stuff)

Dr. Video Games 0031
Jul 17, 2004

Gyrotica posted:

Update on heavily modded Stellaris on 7950X3D: by 2463 I'm at 30 seconds per month or so.

(heavily modded includes Gigastructures, ACOT, AOT, UI overhauls, bunch of other stuff)

What was your CPU before upgrading, and how fast was it then?

hobbesmaster
Jan 28, 2008

Gyrotica posted:

Update on heavily modded Stellaris on 7950X3D: by 2463 I'm at 30 seconds per month or so.

(heavily modded includes Gigastructures, ACOT, AOT, UI overhauls, bunch of other stuff)

How many stars?

I assume the blokkats haven’t shown up to garbage collect.

BlankSystemDaemon
Mar 13, 2009



Ah, the proud tradition of throwing out a single number, with no high/low, which means no mean/median, and absolutely no chance of confidence intervals, Student's t-tests, or standard deviations.

Gyrotica
Nov 26, 2012

Grafted to machines your builders did not understand.

Dr. Video Games 0031 posted:

What was your CPU before upgrading, and how fast was it then?

9900K - by now it would usually be about 60 seconds for a month (a guesstimate, but definitely less pleasant and playable - I'd usually hit this speed at around 2350/2375, and then into the 2400s it would really be struggling)


hobbesmaster posted:

How many stars?

I assume the blokkats haven’t shown up to garbage collect.

1000, no blokkats yet. Have endgame set for 2500.

hobbesmaster
Jan 28, 2008

Heh, I haven’t bothered with above medium in a while. That’s quite impressive, actually. Given that my 5800X3D could keep up with those mods in a medium galaxy a bit longer, I bet you’d fly to 2500 on medium.

Or maybe it’d be almost exactly the same because the cache size is the same. Well, actually, Zen 4 has more L2, so maybe that’ll help some more.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Arzachel posted:

I need gently caress all for IO. Gimme a cheap chipset-less itx board you cowards!

googled for "X300 mini-itx" and wouldn't you know it...

Cygni
Nov 12, 2005

raring to post

Paul MaudDib posted:

googled for "X300 mini-itx" and wouldn't you know it...

IIRC, the board makers weren't allowed to sell these outside of the embedded/China space... so naturally you can buy one on AliExpress or even Newegg shipped directly from Hong Kong, lol

hobbesmaster
Jan 28, 2008

Cygni posted:

IIRC, the board makers weren't allowed to sell these outside of the embedded/China space... so naturally you can buy one on AliExpress or even Newegg shipped directly from Hong Kong, lol

It’s the socket that’s unique there; the embedded market is going to be FP6 (i.e. BGA AM4).

Arzachel
May 12, 2012

Paul MaudDib posted:

googled for "X300 mini-itx" and wouldn't you know it...

Of course it's Asrock. AM5 would work even better since Raphael should have enough lanes to do front USB C headers as well, if I understand the block diagram right.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Arzachel posted:

Of course it's Asrock. AM5 would work even better since Raphael should have enough lanes to do front USB C headers as well, if I understand the block diagram right.

I have one of the Asrock Deskminis, and the front USB solution is so clever: the ports are mounted directly on the motherboard and just exposed as front I/O through the case, including USB-C. I hadn't realized that X300 meant "no chipset", but that's cool as hell.

Mofabio
May 15, 2003
(y - mx)*(1/(inf))*(PV/RT)*(2.718)*(V/I)
The Asrock guys use their computers and dare to ask: what if?

Anime Schoolgirl
Nov 28, 2002

The APUs might have 8 PCIe lanes to the GPU again, considering the APU chips list x20 instead of x28 for lane count, but since they're PCIe 4 you probably won't lose any performance except in weird edge cases (of which Intel Arc possesses many) or with the 4090.

Though if you were gunning for an APU, a GPU was probably not one of your priorities to begin with.

edit: though it would be a real shame to buy a 4x NVMe M.2 2280 card for one of these things and have only two of the drives actually work.
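
(For a rough sense of scale on the x8 question - this is just per-lane arithmetic, not a benchmark - a Gen4 x8 link carries about the same usable bandwidth as the Gen3 x16 slot most cards were designed around:)
code:
# PCIe 3.0/4.0/5.0 signal at 8/16/32 GT/s per lane with 128b/130b encoding
def usable_gb_per_s(gt_per_s, lanes):
    return gt_per_s * (128 / 130) / 8 * lanes  # GT/s -> GB/s of payload

print(f"Gen3 x16: {usable_gb_per_s(8, 16):.1f} GB/s")   # ~15.8
print(f"Gen4 x8:  {usable_gb_per_s(16, 8):.1f} GB/s")   # ~15.8, same as Gen3 x16
print(f"Gen4 x16: {usable_gb_per_s(16, 16):.1f} GB/s")  # ~31.5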

Anime Schoolgirl fucked around with this message at 04:55 on Mar 18, 2023

Cygni
Nov 12, 2005

raring to post

hobbesmaster posted:

It’s the socket that’s unique there; the embedded market is going to be FP6 (i.e. BGA AM4).

I was using the term more broadly for true embedded and industrial/gaming/signage. The “X300” stuff that Paul linked, for example, is a thin-ITX AM4 board that’s meant for those markets.

BlankSystemDaemon
Mar 13, 2009



Anime Schoolgirl posted:

edit: though it would be a real shame to buy a 4x NVMe M.2 2280 card for one of these things and have only two of the drives actually work.

If a motherboard manufacturer adds a PCIe switch, it'll work fine - albeit at reduced performance.

Hockenheim
Oct 20, 2022

by VG
I don’t think you have to worry about a board maker going to the expense of adding a very pricey Gen4 or 5 switch to a small board.

Yaoi Gagarin
Feb 20, 2014

Arzachel posted:

Of course it's Asrock. AM5 would work even better since Raphael should have enough lanes to do front USB C headers as well, if I understand the block diagram right.

How many angels has AMD unleashed and when do we get Ramiel

FuturePastNow
May 19, 2014


Have they used all the ninja turtle names yet?

Cygni
Nov 12, 2005

raring to post

They have been using famous artists for desktop/mobile and Italian cities for server as their "codenames" (which are mostly used for marketing), but they are supposedly dropping that and adopting the boring Intel philosophy of a random first word plus the same second word depending on the product line.

Mobile will end in "Point", so currently Phoenix Point and then next is Strix Point. And desktop will be "Ridge", so Zen 5 is Granite Ridge.

Server still seems to be Italian cities. Turin is the Zen 5 server name.

Scam Likely
Feb 19, 2021

Bulldozer was a cool name, shame it was such a turd.

Josh Lyman
May 24, 2009


Has anyone run into weird issues coming out of sleep with their 7000 CPU? It's been two weeks with my 7900X + Asus ROG Strix B650E-F on Win10, and I think two or three times now it has completely hung coming out of sleep and the monitors don't connect even when I power cycle them. The only solution is a hard reset. I'm asking in here because my RGB fans are set to be solid in Windows through Armoury Crate, and when my system hangs coming out of sleep they're stuck on their default color cycling. I don't recall this happening with my old 3570K system.

Kazinsal
Dec 13, 2011
Everyone I've talked to who has a 7000 series on an ASUS board has various baffling issues. Mine randomly stalls itself out after between 36 and 72 hours of uptime. No event viewer, no crash dump, nothing.

Part of me wants to just buy a new motherboard from literally any other vendor, but the rest of me is lazy enough to just accept the need to reboot once a day.

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

I posted about the Phoronix 7950X sci/eng workload testing over in the Folding@Home discord, and joked that the "comedy option" of the 13900K had been thrown in -- burning 2.5X the watts to do 0.8X the work of the 7950X.
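
(Taking those two figures at face value, the perf-per-watt gap works out to roughly 3x - trivial arithmetic, not a measurement:)
code:
# relative efficiency from the quoted figures: 2.5x the watts for 0.8x the work
relative_work = 0.8    # 13900K throughput vs the 7950X
relative_watts = 2.5   # 13900K power draw vs the 7950X
perf_per_watt = relative_work / relative_watts
print(f"13900K perf/W is ~{perf_per_watt:.2f}x the 7950X's, "
      f"i.e. the 7950X is ~{1 / perf_per_watt:.1f}x more efficient on this workload")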

My experience there has been that it's a place full of reasonably level-headed nerds who just care about hardware as a tool for producing optimal amounts of science. But that offhand comment awakened something in one user, who proceeded to post two full screens of a semi-comprehensible "actually I'm not mad but have you considered" style screed. I haven't seen that kind of longform, pure-strain console warrioring since my USENET days, and was really surprised by it. I just skimmed the wall of text and did not engage, but the crux of his argument seemed to be:

1) Phoronix was using "flawed" numbers from some third party -- a bizarre accusation, given that Phoronix maintains their own benchmark suite

2) And regardless of point #1, the 13900K actually "won" because testing chips in their stock configuration is unfair, because everyone knows that the E cores are crap. So to make things fair you disable all E cores on the Intel side, and then also disable 8 Zen4 cores on the AMD side, and this makes it a "fair" 8 vs 8 core shootout in which the 13900K will handily defeat Team Red.

What. The Christ.

mdxi fucked around with this message at 17:35 on Mar 20, 2023

Tabemono
Oct 24, 2004
my tummy is empty

Cygni posted:

The online sentiment seems to be that Gigabyte has the best AM5 boards at the moment. But just like AM4 at first, the whole thing is a bit messy, especially with RAM speeds greater than what AMD lists. 6000+ may not work with some CPUs or boards at all.

Is this still the prevailing feeling? I'm doing a 7950X3D + X670E build, and trying to parse all the AM5 issues, BIOS updates, etc. between motherboard manufacturers to figure out who's the good one this gen, and my brain is melting.

I'm down to either the Gigabyte X670E Aorus Master, or on the Asus side the X670E-E ROG Strix or the X670E ProArt Creator board (MSI and Asrock are out; two PCIe slots is too limited for my use case). Going to run everything at stock; as long as it's all stable at EXPO with that Trident Z5 CL30 kit everyone gets, I'll be more than happy. I started out defaulting to Asus, but it seems like I see more complaints about them on AM5 than about Gigabyte at this point.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Kazinsal posted:

Everyone I've talked to who has a 7000 series on an ASUS board has various baffling issues. Mine randomly stalls itself out after between 36 and 72 hours of uptime. No event viewer, no crash dump, nothing.

Which is odd. If you start comparing the layouts of the various boards between vendors, the location of the connectors and specific components are more or less the same. Seems like they were all derived from a reference design. And there isn't much vendor specific other than LED crap and such.

hobbesmaster
Jan 28, 2008

mdxi posted:

I posted about the Phoronix 7950X sci/eng workload testing over in the Folding@Home discord, and joked that the "comedy option" of the 13900K had been thrown in -- burning 2.5X the watts to do 0.8X the work of the 7950X.

My experience there has been that it's a place full of reasonably level-headed nerds who just care about hardware as a tool for producing optimal amounts of science. But that offhand comment awakened something in one user, who proceeded to post two full screens of a semi-comprehensible "actually I'm not mad but have you considered" style screed. I haven't seen that kind of longform, pure-strain console warrioring since my USENET days, and was really surprised by it. I just skimmed the wall of text and did not engage, but the crux of his argument seemed to be:

1) Phoronix was using "flawed" numbers from some third party -- a bizarre accusation, given that Phoronix maintains their own benchmark suite

2) And regardless of point #1, the 13900K actually "won" because testing chips in their stock configuration is unfair, because everyone knows that the E cores are crap. So to make things fair you disable all E cores on the Intel side, and then also disable 8 Zen4 cores on the AMD side, and this makes it a "fair" 8 vs 8 core shootout in which the 13900K will handily defeat Team Red.

What. The Christ.

If they name-dropped “advanced marketing devices” in the middle of that, that is 100% the guy who runs UserBenchmark.

hobbesmaster
Jan 28, 2008

Combat Pretzel posted:

Which is odd. If you start comparing the layouts of the various boards between vendors, the location of the connectors and specific components are more or less the same. Seems like they were all derived from a reference design. And there isn't much vendor specific other than LED crap and such.

It wouldn’t surprise me if, between ATX and the design rules for DDR5 and PCIe 4/5, they’re more or less forced into that configuration even if there isn’t a specific reference design.

BlankSystemDaemon
Mar 13, 2009



mdxi posted:

What. The Christ.
"What. The Christ" a good question to ask when it comes to benchmarking, because in so far as every benchmark suite is flawed, the same is true of the one Phoronix uses and maintains.

If you look at this benchmark you'll see that Michael decided that we'd get to know the exact set of compiler flags that everything was compiled with.
Unfortunately, if you want to compare with this benchmark or this benchmark, you can't really do that, because in those benchmarks Michael knew we didn't deserve to know the exact set of compiler flags, presumably because we've been bad readers.
It gets worse, though, because sometimes we get articles like this which include a different set of variables, presumably because Michael thinks that the readers deserve it as a little treat.

The problem with all of this, of course, is that the benchmarks don't always include highs, lows, medians, and means, as well as standard deviation, Student's t-tests, and confidence intervals - all things that are considered completely necessary in the field of statistics, and if you try to publish without them your paper is guaranteed not to get accepted in any journal.
Whether these are included varies quite a bit, with more recent articles including them, indicating that he's partially listened to the feedback people have been giving him for almost a decade - but it's still not all of them, so you might as well treat each datapoint as representing a single run and therefore invalid for forming a statistical universe.

There is a bigger problem, though - which is that even if you do all of the above right, we know from the academic field of computer science that there are a lot of things that can alter benchmark results, such as:
Compiler version (even a minor version bump, let alone a completely different choice of compiler), link order (i.e. the exact order of object files given to ld), and environment (as in the UNIX environment variables, the things usually inherited from the user starting the compile) - all of these can cause unpredictability in measurements, and in the worst case the difference in measurements can be much bigger than whatever performance optimizations come from using -O3.

None of this is new knowledge, of course - the original paper that introduced the concept broadly is from 2009, and Emery Berger presented on it in 2019:
https://www.youtube.com/watch?v=r-TLSBdHe1A

And also in 2020:
https://www.youtube.com/watch?v=bzU-VEP9s78

And also in 2022:
https://www.youtube.com/watch?v=vVUnCXKuNOg

In one of those talks, Emery even shows that measurement variability can be as high as 40% - so it's kinda hard to ignore.
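
For what it's worth, reporting this stuff properly isn't much code. A minimal sketch of what it could look like (run_benchmark() here is just a placeholder for whatever is actually being measured, and the 95% CI uses Student's t via scipy):
code:
# run the workload N times and report a distribution instead of a single number:
# min/max, median, mean, sample stddev, and a 95% confidence interval
import statistics
import time

from scipy import stats  # Student's t critical value

def run_benchmark():
    # placeholder workload - substitute the real benchmark here
    start = time.perf_counter()
    sum(i * i for i in range(1_000_000))
    return time.perf_counter() - start

samples = [run_benchmark() for _ in range(20)]
n = len(samples)
mean = statistics.mean(samples)
stdev = statistics.stdev(samples)              # sample standard deviation (n-1)
ci = stats.t.ppf(0.975, df=n - 1) * stdev / n ** 0.5

print(f"min/max: {min(samples):.4f} / {max(samples):.4f} s")
print(f"median:  {statistics.median(samples):.4f} s")
print(f"mean:    {mean:.4f} +/- {ci:.4f} s (95% CI, Student's t)")
print(f"stddev:  {stdev:.4f} s")

It doesn't fix the link-order/environment effects above, but it's the bare minimum for a number to mean anything.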

BlankSystemDaemon fucked around with this message at 19:23 on Mar 20, 2023

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Tabemono posted:

Is this still the prevailing feeling? I'm doing a 7950X3D + X670E build, and trying to parse all the AM5 issues, BIOS updates, etc. between motherboard manufacturers to figure out who's the good one this gen, and my brain is melting.

I'm down to either the Gigabyte X670E Aorus Master, or on the Asus side the X670E-E ROG Strix or the X670E ProArt Creator board (MSI and Asrock are out; two PCIe slots is too limited for my use case). Going to run everything at stock; as long as it's all stable at EXPO with that Trident Z5 CL30 kit everyone gets, I'll be more than happy. I started out defaulting to Asus, but it seems like I see more complaints about them on AM5 than about Gigabyte at this point.

fwiw I'm running a https://www.msi.com/Motherboard/MPG-B650I-EDGE-WIFI with https://www.gskill.com/product/165/377/1649234720/F5-6000J3040F16GX2-RS5W-F5-6000J3040F16GA2-RS5W

XMP works fine. I'm using Memory Context Restore to make the boot times acceptable (why don't people talk about that more? The stock boot times are insane!), and to bring the power back down (because the uncore is set to run at full speed this way), I dropped the uncore voltage and used the Curve Optimizer to drop the core voltage.

hobbesmaster
Jan 28, 2008

I’ve never used an AM5 board, do they need to regularly retrain when running XMP timings?

That seems… bad, but I guess early AM4 did that too.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

hobbesmaster posted:

I’ve never used an AM5 board, do they need to regularly retrain when running XMP timings?

That seems… bad, but I guess early AM4 did that too.

Every single boot if you don't change that setting, and wow it's a long boot, with nothing but a black screen to show for it
