Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Dr. Video Games 0031 posted:

I think Intel has shown that Intel 7 is pretty dang competitive still despite being "10nm," with more overall perf in raptor lake over zen 4, and there are some comparisons where they get surprisingly close on perf/watt.

This might be a dumb question, but isn't this in large part because Raptor Lake's P-cores are 6-wide while the E-cores are 5-wide, compared to Zen 4's 4-wide architecture? So between that, Gracemont's execution port differences, and just cranking up the juice significantly at the high end of the performance curve to squeeze out a bit more, Intel still seems to be just about hitting the wall with what "Intel 7" can do without a significant architectural change.


Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Kibner posted:

Even on Noctua's massive NH-D15, they say in their documentation (iirc) that adding a second fan will lower temps by a max of 3 (three) C. That's it.

I'll have to find it, but I did an effort post about this previously. Basically: a second fan on a radiator isn't really going to increase the airflow through the radiator; at most it helps overcome the pressure drop across the fins.

For radiators with high fin density and the ability to transfer heat effectively to the fins, you may see a noticeable improvement in temperature as a result of adding a second fan, because the second fan is going to help the first fan reach its full airflow potential on its curve. This would be more noticeable with liquid cooling setups though; the thermodynamics of a tower air cooler with limited thermal conveyance via heat pipes and radiator size are such that ramping up air flow and/or more easily overcoming pressure losses has little benefit.

Of course there's much more to it than just this: ambient temperature, case airflow patterns, etc.
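
If it helps to picture why, here's a quick toy calculation (every number below is made up purely for illustration, not taken from any fan or cooler datasheet): treat the fan's pressure-airflow curve as roughly linear, treat the fin stack's pressure drop as proportional to flow squared, and find where they cross for a single fan versus push-pull (approximated as doubling the available pressure). A low-restriction tower barely moves its operating point; a dense radiator moves more.

code:
# Toy fan-curve vs. pressure-drop sketch. All numbers invented for illustration.
def fan_pressure(q, n_fans=1, p_max=2.0, q_max=140.0):
    """Static pressure (mmH2O) available at airflow q (m^3/h).
    Crude linear fan curve; push-pull approximated as doubling pressure."""
    return max(0.0, n_fans * p_max * (1.0 - q / q_max))

def fin_stack_drop(q, k):
    """Pressure drop across the fins, roughly proportional to q^2."""
    return k * q * q

def operating_point(k, n_fans):
    """Airflow where available fan pressure equals the fin-stack pressure drop."""
    q = 0.0
    while fan_pressure(q, n_fans) > fin_stack_drop(q, k):
        q += 0.1
    return q

for label, k in [("low-restriction tower", 0.00002), ("dense radiator", 0.0005)]:
    q1, q2 = operating_point(k, 1), operating_point(k, 2)
    print(f"{label}: one fan ~{q1:.0f} m^3/h, push-pull ~{q2:.0f} m^3/h "
          f"(+{100 * (q2 - q1) / q1:.0f}% airflow)")

The absolute temperature change then depends on how much of that extra airflow the heat pipes or coolant can actually make use of, which is the other half of the argument above.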

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



But if they don't dump money wastefully into hardware, they'll just end up using it to do stock buybacks! (because they sure as hell aren't going to put it towards salaries...)

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



mdxi posted:

Re: Consoles and AMD, it's been big for them for quite a while. If you wanna go back and dig through financials, you're looking for the "Semicustom" division.

In other news, Phoronix has put out 7950X3D benchmarks. There are three take-aways for me, when looking at the sci/eng workloads.

One: if you're doing CFD, get thyself an X3D chip.


I have a 12700KF now, but man, that is really tempting (as someone who does CFD modeling as a side consulting gig).

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



SwissArmyDruid posted:

I don't think Sony has total exclusives anymore. I think they saw the light (or, perhaps they greedily saw the opportunity to double-dip to fatten their bottom line) of bringing games to PC to extend the tail on games.

I belieeeeve some of those games were eligible for a PC port at some point, based on the leak from last year where someone pulled the titles from GeForce Now's store.

And then there are those just too big to have anything other than timed exclusivity, like FF7R2, FFXVI, and Demon's Souls Remake. Tentatively Spodermang 2 as well.

In contrast to previous installments, Capcom outright rejected exclusivity for Street Fighter 6; they're going WIDE on release with PS4/5, Xbox X/S, and PC, with fully-integrated crossplay.

At most I think Sony does timed exclusives, where a game might be available only on PS5 for 6 months, 12 months, whatever. Sometimes that window is probably the real reason, and other times it's probably just them taking advantage of the time needed to port the game to another platform and calling it the "exclusive" period.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



SwissArmyDruid posted:

All these great motherboards bringing back great memories.

And my dumb rear end was stuck with Soyo.

Yeah, I love whenever PC Nostalgia chat occurs.

Anyone remember which company actually put vacuum tubes on a mobo for audio? That’s the type of crazy poo poo that ASRock would do now.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



phongn posted:

AOpen had a few of them in their 'TubeSound' line.

Thanks! Found a review of them: https://www.techwarelabs.com/reviews/motherboard/ax4ge_tube-g/index_2.shtml

I love the “Made in Russia”, Sov(iet?)tek sound tubes! :comrade:

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION




"Rapid unscheduled disassembly", lmao

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Paul MaudDib posted:

if it ever was it would require absolutely titanic VSOC voltage

I guess you could say that these enthusiasts were going down with the chip

I’ll see myself out…

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



lamentable dustman posted:

I assume they do a quick QA run on the chips before grinding them down to fit the 3D cache, so it makes sense there wouldn't be that many defective units to do a full launch.

e: also MC doesn't have that many stores, only 25ish countrywide, and even some of those serve the same metro, like the Atlanta stores. Awesome store to visit though if you are ever close to one.

I’m still salty that they won’t expand to Phoenix. It made sense that they wouldn’t when there were two Fry’s stores here, but now there isn’t really any competition.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION




:lol:

BlankSystemDaemon posted:

RT is great for screenshots, but I sure as poo poo don't notice it when I'm playing a game where it's enabled - except that the performance usually gets hit pretty bad, and that to compensate usually involves some kind of upsampling where the game is rendered at a lower resolution.

Seems kinda like that defeats the whole purpose of making things look good.

Yeah, when I've enabled it in the games I care about, it either hasn't been that impressive (which I'll chalk up to the developers half-assing its implementation), or it has looked good, but not worth the performance impacts.

It seems like we've entered a period where you need DLSS/FSR/XeSS just to enable features like RT and not crash performance, but it's all to chase marginal improvements...

Then they'll start pushing 8K and the cycle will start all over.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



repiv posted:

or more broadly, look to modern 3D animated movies/TV/anime which are all using pathtracing at this point regardless of their art style. even spiderverse is pathtraced.

how long it's going to take to get there is up for debate, but the endgame is for games to eventually converge on pathtracing like offline rendering already has

I’m not even sure how common this long-term convergence will be, because I think there’s a difference between how you observe media that you have no direct input into (such as movies) versus media that you can have input into, such as games.

Which isn't to say that it shouldn't be done, just that I think there will always be a difference in biological response to what the end user is observing, based on how they engage with it.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Shipon posted:

8K is going to be utterly pointless being that 4k is already towards the limits of visual acuity at typical viewing distances anyway.

I don't think this is true. People who have used Apple's Pro Display XDR 6K and Dell's 6K and 8K offerings have all praised the image quality; they're just in a different use-case category right now and not aimed at consumers.

But companies need to keep pushing numbers bigger for profit, and eventually 4K will seem like stagnation…
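
For reference, the back-of-the-envelope math behind the "acuity limit" argument usually looks like this; the ~60 pixels-per-degree figure is just the common 1-arcminute rule of thumb for 20/20 vision, and the sizes and distances below are ones I picked for illustration:

code:
import math

def pixels_per_degree(h_res, diag_in, dist_in, aspect=(16, 9)):
    """Horizontal pixels per degree of visual angle for a flat panel."""
    ar = aspect[0] / aspect[1]
    width_in = diag_in * ar / math.hypot(ar, 1.0)   # panel width from diagonal
    h_fov_deg = math.degrees(2 * math.atan(width_in / (2 * dist_in)))
    return h_res / h_fov_deg

setups = [("65-inch TV at 8 ft", 65, 96), ("32-inch monitor at 2 ft", 32, 24)]
for label, diag, dist in setups:
    for name, h_res in [("4K", 3840), ("6K", 6016), ("8K", 7680)]:
        ppd = pixels_per_degree(h_res, diag, dist)
        print(f"{label}, {name}: ~{ppd:.0f} ppd (1-arcmin rule of thumb: ~60 ppd)")

By that crude rule, a TV at couch distance is already past the limit at 4K, while a big monitor at desk distance is right around it, which is probably part of why the 6K/8K panels people rave about are desktop/productivity displays.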

Stanley Pain posted:

People in this thread not seeing a difference with 4k or RT on. :eyepop:

I'm visually impaired and can see the difference lmfao..

I can see the difference when 4K or 5K are used for HiDPI settings, for fonts, etc., yeah. I can't when gaming.

Also, being visually impaired probably helps even more for noticing some of these things, but that’s a whole separate discussion…

Canned Sunshine fucked around with this message at 06:26 on Jul 4, 2023

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Shipon posted:

The difference with those monitors is the use case is for UI elements and high information density, where there is a clear advantage to higher resolutions. I don't agree that the same is true for 3D game worlds for the most part. I think there's far more value in higher refresh rates than higher resolutions at this point - we may actually be able to achieve "real" motion blur if we can do 1000+ FPS.

I honestly don't think there's value in trying to constantly push the boundary of higher resolution in pursuit of "real motion blur" in games, because you're getting into the biology of the human eye and how foveal vision is processed by the brain relative to the periphery. That's why HiDPI looks so good, and why there's still so much that can be done with HiDPI implementations, especially in gaming, once you start getting into per-pixel opportunities.

When it comes to improving motion blur, you'd be better served by upsizing the monitor enough to cover your entire field of view, and then having the game engine target full detail within an estimated foveal area while reducing detail in the periphery, i.e. variable frame rate across the display itself. I'm not even sure that's currently possible in terms of display tech, but it'd be pretty cool! Especially since it would open up all kinds of opportunities for how displays present information, including the return of stereoscopic 3D (and glasses-free this time)!
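
A very rough sketch of that foveated idea (the gaze point, tile size, and eccentricity cutoffs are all numbers I made up; real eye-tracked foveated or variable-rate-shading pipelines are far more involved):

code:
import math

def shading_rate(tile_center, gaze, px_per_degree=40.0):
    """Coarser shading the farther a tile sits from the gaze point.
    Cutoff angles are arbitrary placeholders."""
    ecc_deg = math.dist(tile_center, gaze) / px_per_degree
    if ecc_deg < 5:       # foveal region: full detail
        return "1x1"
    if ecc_deg < 15:      # parafoveal: half-rate shading
        return "2x2"
    return "4x4"          # periphery: quarter-rate shading

# Count how much of a 3840x2160 frame ends up at each rate for a central gaze.
gaze, counts = (1920, 1080), {}
for ty in range(60, 2160, 120):
    for tx in range(60, 3840, 120):
        r = shading_rate((tx, ty), gaze)
        counts[r] = counts.get(r, 0) + 1
print(counts)

The same per-tile idea could in principle drive update rate instead of shading rate, which is the part that would need display tech to cooperate.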

Otherwise though, if you're looking for the entire display to do it, it's always going to appear "off" to you even if you could hit 5,000 or 10,000 FPS, because of how the brain is processing what it's observing, and I doubt you'd end up happy with it...

This is where VR/augmented gaming could really shine.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



That’s not UW 4K. UW 4K would be like the 5120x2160 resolution LG 40WP95C.

3840x1600 is closer to 21:9.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Dr. Video Games 0031 posted:

According to the way monitor manufacturers refer to monitors, 3840x1600 is 4K Ultrawide. It's (around) 4000 pixels wide and is an ultra wide form factor. 5120x2160 isn't 4K, it's 5K. (or as some people say, 5K2K)

I thought 3840x1600 was (U)WQHD+, since that's how it'd go by traditional VESA nomenclature? I've honestly never seen a manufacturer refer to 3840x1600 as "4K Ultrawide", so I'd be curious to see an example.

Most seem to be adhering (somewhat) well to VESA and ITM’s guidelines in general for usage of the 4K acronym.
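
Just to put numbers on the two resolutions being argued about (the naming dispute is really about marketing conventions, but the geometry itself is unambiguous):

code:
from math import gcd

def describe(w, h):
    g = gcd(w, h)
    return f"{w}x{h}: {w / h:.2f}:1 aspect (= {w // g}:{h // g}), {w * h / 1e6:.1f} MP"

for w, h in [(3840, 2160), (3840, 1600), (5120, 2160)]:
    print(describe(w, h))
# 3840x2160: 1.78:1 aspect (= 16:9),   8.3 MP
# 3840x1600: 2.40:1 aspect (= 12:5),   6.1 MP
# 5120x2160: 2.37:1 aspect (= 64:27), 11.1 MP

So both are ~2.4:1 ultrawide-class panels; 5120x2160 is a full 3840x2160 extended sideways (hence "5K2K"), while 3840x1600 matches UHD's width but not its height.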

Canned Sunshine fucked around with this message at 03:20 on Jul 6, 2023

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Cygni posted:

Underneath the marketing speak in that interview, they still have some of the exact same problems for mobile/desktop as Alder/Rocket (and Lakefield before it) had. Regardless of ISAs or theoretical max IPC of the cores themselves, they will have fast power hungry cores with lots of cache, moderate efficiently clocked cores with less cache, and slow SMT threads on the big cores... all of which have dynamic performance based on power and heat targets, and will be working with OSes that are making decisions of what to put where based on high level and slow to implement assumptions. It might be simpler for AMD to address than on ARM or Intel, but they still have to address the issue to maximize the design. Relying on the OS scheduler making adjustments in the 10-100s of millisecond range without knowledge of the underlying architecture efficiency tables isn't gonna be ideal.

So yeah, I assume AMD has a plan here, and I hope they do... because the workaround for the similar but not identical dual-die X3D part issue was not super confidence inspiring.

e: apparently the 5c cores will have SMT enabled, so there are actually 4 performance states to calculate. from today's news:

https://milkyway.cs.rpi.edu/milkyway/show_host_detail.php?hostid=996435

I didn't read that as marketing speak, but rather as intentional ambiguity: he wants to walk back some of what Papermaster said without saying so outright or making it look like Papermaster was wrong.

My personal interpretation of that interview is that AMD still has relatively little interest in going big.LITTLE, and would rather optimize the cores for their specific uses, i.e. Zen 4c/5c could be ideal for power-efficient laptops, servers, etc., while standard Zen 4/5 goes into gaming laptops, desktops, etc., where power efficiency is less of a concern. So it isn't the Alder/Rocket Lake problem at all.

And to me it makes sense, because unless you're applying it to a closed hardware system, like a mobile device or Apple's ecosystem, where you can really optimize for heterogeneous cores, you'll end up in the Alder Lake situation. Intel could have just thrown in more efficiency cores in lieu of performance cores and called it a day, and Alder Lake would probably have been even better?

Edit: Ironically, if the performance ends up matching what we saw, Zen 4c could be AMD's "Pentium M -> Core" moment of sorts.
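
To make the scheduling problem from the quote a bit more concrete, here's a toy (completely made-up) placement rule over a static per-core-type table, which is roughly the kind of coarse information an OS scheduler has to act on; real schedulers (Thread Director hints, the dual-CCD X3D driver heuristics, etc.) work with richer but still delayed feedback:

code:
# Toy heterogeneous-core placement. All numbers invented for illustration.
CORE_TABLE = {
    # core type: (relative single-thread perf, relative perf-per-watt)
    "big":     (1.00, 0.60),
    "big_smt": (0.55, 0.70),   # second SMT thread sharing a big core
    "dense":   (0.70, 1.00),   # Zen 4c/5c-style compact core
}

def place_thread(kind, on_battery=False):
    """Foreground work chases raw performance, everything else chases
    perf-per-watt. Real schedulers also weigh thermals, current load,
    and migration cost, usually without exact knowledge of these tables."""
    metric = 0 if (kind == "foreground" and not on_battery) else 1
    return max(CORE_TABLE, key=lambda c: CORE_TABLE[c][metric])

print(place_thread("foreground"))                   # -> big
print(place_thread("background"))                   # -> dense
print(place_thread("foreground", on_battery=True))  # -> dense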

Canned Sunshine fucked around with this message at 15:39 on Jul 21, 2023

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



K8.0 posted:

I just picked up the 7700X + 32gb - Pro B650-P Wifi bundle from Microcenter. You know, the $400 one.

Except it cost me $230 plus tax. I asked them if that was the right price and they said yes. At this point if you have a vague interest in a new PC I suggest getting the gently caress over to Microcenter before they fix whatever is causing them to sell you that bundle at that price.

gently caress, we really need to get a Micro Center in Phoenix. It's been a barren wasteland since Fry's effectively ceased functioning*.



* I’m not counting the several years with poor inventory and consignment status while technically still “open”

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



What, who doesn’t want IHS sponsorships?

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Well the intern had to take a break from coding FSR3 to write those, so cut them some slack!

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



hobbesmaster posted:

There’s probably one person out there doing this and their reasoning would have to be “AMD is so bad at this. Now, Intel, there’s a company that actually knows how to gently caress over other hardware vendors”

I laughed a lot harder than I probably should have at this.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



If I had the money I would buy one for CFD analysis, since this would be a perfect application for it.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



repiv posted:

windows ARM can run x86 binaries under emulation, but the performance is nowhere near as good as rosetta since non-apple hardware lacks the TSO extension that rosetta leans on

if qualcomm does TSO and microsoft updates their emulator to use it then maybe they can get somewhere

Yeah, I hope Qualcomm and Microsoft do, since it would benefit not only future customers buying Windows-on-ARM machines but also those running Windows 11 ARM on ASi, like you said.

I played around with a number of the engineering apps I use under Windows 11 ARM emulation on my M1 Max; I had issues with a few, but by and large the apps worked well and everything was still pretty snappy.
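
To illustrate the TSO point from the quote in toy form (this is just the flavor of the problem, not how any real translator works, and the mnemonics are pretend): x86 code assumes stores become visible in program order, so a translator targeting a weakly-ordered ARM core has to insert barriers conservatively around memory operations, while hardware TSO lets it skip them.

code:
# Pretend instruction stream; "arm.dmb" stands in for an explicit memory barrier.
def translate(x86_ops, hardware_tso=False):
    out = []
    for op in x86_ops:
        out.append(op.replace("x86.", "arm."))
        if not hardware_tso and op.startswith(("x86.load", "x86.store")):
            out.append("arm.dmb")   # preserve x86's stronger ordering in software
    return out

hot_loop = ["x86.load r1, [flag]", "x86.store [data], r2", "x86.add r3, r1"]
print(translate(hot_loop))                      # barriers after every memory op
print(translate(hot_loop, hardware_tso=True))   # barriers elided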

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



NewFatMike posted:

What software have you tried? I’ve been curious about SOLIDWORKS, but I haven’t found anyone who’s actually tried to use it outside a few old Reddit posts.

I've used CFD software (Ansys, Simflow) with varying levels of success, and a few Autodesk products (Revit, AutoCAD) that, strangely enough, worked or didn't depending on the version (2022 and 2024 worked for me, while 2023 had issues). I've used a few others too (WaterGEMS, Flow2D, HEC-RAS) without any real problems.

I installed and ran Solidworks briefly using a prior model, but haven't really used it at all. It ran after I made a few adjustments I had seen online, but it did feel a bit slow at times with the model I used.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Cygni posted:

Intel’s RDRAM excursion and amd’s early reluctance to dabble in chipsets unless they absolutely had to left the door wide open for VIA, and they did have a short hot streak. Their only real competition was ALi, lol.

Once Nvidia (and ATI, and even SiS for a bit) entered the market, it became real obvious how bad VIA's compatibility and performance really were, and they faded out of the market fast. But I guess so did everyone else once Intel and AMD went full anti-competitive and killed an entire market off forever, while the courts twiddled their thumbs.

To some extent, I do miss the days of having to research a few different chipset options to see which one made the most sense for me.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Subjunctive posted:

Yeah it sounded like even 2 years could cause issues, and my hardware usually gets handed down twice at least, so I guess I’m going to read all these semi-conductor articles and decide how scared I am.

I think we had a pretty good discussion on all of this once, but I would not be worried about electromigration as it relates to your use, or general consumer use. There are other factors at play that are independent of whether the part is technically operating outside of JEDEC recommendations, and those factors would generally make themselves apparent regardless.

Edit: fwiw, I spend a healthy chunk of each week evaluating reports related to materials science and the impacts of things like voltage, MIC, hydraulics, etc. Electromigration exists, but concerns about it for general consumer use, and even enthusiast use, are significantly overstated.
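
For anyone curious, the classic starting point for reasoning about electromigration lifetime is Black's equation; the parameters below are placeholders rather than real process data, so only the relative scaling between the two cases means anything:

code:
import math

K_BOLTZMANN_EV = 8.617e-5  # eV per kelvin

def black_mttf(j, temp_k, a=1.0, n=2.0, ea_ev=0.9):
    """Black's equation: MTTF = A * J**-n * exp(Ea / (k*T)).
    A, n, and Ea are fitted per process/metal stack; placeholders here."""
    return a * j ** (-n) * math.exp(ea_ev / (K_BOLTZMANN_EV * temp_k))

baseline = black_mttf(j=1.0, temp_k=358)   # ~85 C, nominal current density
pushed   = black_mttf(j=1.2, temp_k=373)   # ~100 C, 20% more current density
print(f"relative lifetime: {pushed / baseline:.2f}x of baseline")

It only tells you how lifetime scales with current density and temperature; the absolute margin is the part that gets fitted per process, which is where the "overstated for consumer use" argument lives.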

Canned Sunshine fucked around with this message at 17:44 on Nov 12, 2023

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Subjunctive posted:

Wild—laptop only at this point I assume, unless ASRock has a special baby that supports it.

hobbesmaster posted:

Basically, it exists to allow socketable LPDDR.

Somewhere, an ASRock engineer's ears started to burn a little bit...

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Who doesn't support the efficiency of a random hexadecimal naming convention!

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Zen 5 is also moving from 4 to 6 ALUs, so I'm now wondering whether Zen 5c will see that too, or will remain at 4 if it's based off of Zen 4.

That would be disappointing, if so.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Ironically, Microsoft getting Windows on ARM to have good DX12 performance might actually be a decent boon to macOS gaming via CrossOver, Parallels, and VMware Fusion.

It is frustrating that gaming isn't more than just a bullet point for Apple, since they're more than cash-rich enough to make a significant push by supporting developers' porting efforts. And by all accounts, MetalFX upscaling is superior to FSR :haw:

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



It makes sense, but I've always loved AMD's naming conventions regardless of how dumb they were. We wouldn't have gotten Threadripper without them.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Eletriarnation posted:

Really, are we going to have to do the thing of putting "AI" in all of the product names for a few years?

It's going to be like raytracing all over again. All the marketing will be saying there's something revolutionary right around the corner which will totally need this new functional unit, and then if and when that (unnecessary) killer app appears we're all going to realize that the first gen hardware isn't actually fast enough to do it effectively.

I went to a water/wastewater treatment conference last month where pretty much 90% of the presentations were about either Direct Potable Reuse or PFAS; I joked with some colleagues that DPR and PFAS are to the water industry what AI now is to everything else.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



I still have a 3D TV that I baby, because I actually enjoy watching movies in 3D. But I know I'm a rarity.

Looking forward to the new Acer 3D monitor they’re coming out with that tracks eye movement to adjust the stereoscopic view (which also means it’s glasses-free).

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Yeah, I think they rebranded it that way across all versions of Edge, since I saw it branded that way in Windows 11 recently.

Honestly I don’t have an issue with Edge - it works decently enough, and I prefer it to Chrome. Firefox and Safari are my daily drivers though.


Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Yeah, the hardware improvements being made are good (as long as they aren't to the detriment of general CPU performance improvements); it's just the general "AI HERE, AI THERE, WE NEED AI EVERYWHERE" marketing that's annoying.
