8-bit Miniboss
May 24, 2005

CORPO COPS CAME FOR MY :filez:

Subjunctive posted:

Corsair documents the set of compatible PSUs, I believe.

That’s fine. Still won’t do it.


Arrath
Apr 14, 2011


I think my leftover PSU cables got thrown into my old-old-old mobo box of 'assorted pc building bits n bobs', along with the leftover cables from the PSU before that.

Might as well toss all of em rather than risk it.

Inept
Jul 8, 2003

8-bit Miniboss posted:

That’s fine. Still won’t do it.

So if you were missing a cable and the manufacturer had a replacement for sale that was listed as compatible, you wouldn't buy it, and would instead get a whole new PSU?

Cabbages and VHS
Aug 25, 2004

Listen, I've been around a bit, you know, and I thought I'd seen some creepy things go on in the movie business, but I really have to say this is the most disgusting thing that's ever happened to me.

8-bit Miniboss posted:

Braver than I. I’d never mix and match modular cables. I won’t gently caress around with power.

Subjunctive posted:

Corsair documents the set of compatible PSUs, I believe.

Yep, https://www.corsair.com/us/en/psu-cable-compatibility

8-bit Miniboss posted:

That’s fine. Still won’t do it.

:shrug: I'm not wasting 30 mins of my life doing something the manufacturer says is completely unnecessary, especially when they go as far as printing the cable type on the physical cables so you can line them up and check. But to each their own. Wasn't trying to start an argument, just pointing out that Corsair is uniquely cool and good in this regard.

I think at least some of the cabling in there goes back two PSUs to an RM750x, and may stick around for an H1200 whenever I upgrade to whatever comes after a 4090 :laugh:

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Arrath posted:

I think my left over PSU cables got thrown into my old-old-old mobo box of 'assorted pc building bits n bobs' along with the leftover cables from my PSU before that.

Might as well toss all of em rather than risk it.

Yeah, and not saying this is for everyone, but if someone was really concerned, they could always check the pin layouts themselves and confirm compatibility.
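
For illustration only, here's a minimal sketch of that kind of check; the pinout tables are invented for the example (they are not Corsair's actual layouts), so the real mapping has to come from the manufacturer's documentation or from probing the cable with a multimeter.

```python
# Toy "do the pin layouts match?" check before reusing a modular cable.
# The pinouts below are made up for illustration -- take the real mapping
# from the PSU maker's docs or a continuity test, never from guesswork.

OLD_PSU_PINOUT = {1: "+12V", 2: "+12V", 3: "GND", 4: "GND", 5: "+5V", 6: "SENSE"}
NEW_PSU_PINOUT = {1: "+12V", 2: "GND", 3: "+12V", 4: "GND", 5: "+5V", 6: "SENSE"}

def compatible(old: dict, new: dict) -> bool:
    """True only if every PSU-side pin carries the same rail on both units."""
    ok = True
    for pin in sorted(set(old) | set(new)):
        if old.get(pin) != new.get(pin):
            print(f"pin {pin}: {old.get(pin)} on the old unit vs {new.get(pin)} on the new one")
            ok = False
    return ok

# Any mismatch can put 12V where the component expects ground, so fail loudly.
print("safe to reuse" if compatible(OLD_PSU_PINOUT, NEW_PSU_PINOUT) else "do not reuse this cable")
```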

Quaint Quail Quilt
Jun 19, 2006


Ask me about that time I told people mixing bleach and vinegar is okay
I think I've come up with the solution to the 4090 being CPU limited,

It's time for dual CPUs again!

Joke aside, a quick Google says this doesn't work, and what we really need is for games' code to actually use more threads and cores effectively.
https://cpuninja.com/dual-cpu-motherboard/
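
For what "use more threads" even looks like, here's a toy Python sketch that fans independent per-frame systems out over a thread pool. The system names and workloads are invented for illustration, and real engines have ordering dependencies (physics before rendering, etc.) that make this much harder, which is the whole problem.

```python
# Toy sketch: run independent per-frame "systems" across a pool of workers.
# In CPython, CPU-bound work like this would really want ProcessPoolExecutor
# because of the GIL; the structure, not the speedup, is the point here.
from concurrent.futures import ThreadPoolExecutor
import math
import os

def simulate(system: str, n: int = 50_000) -> str:
    # Stand-in for per-frame work (AI, physics, audio, particles...).
    checksum = sum(math.sqrt(i) for i in range(n))
    return f"{system} done ({checksum:.0f})"

SYSTEMS = ["ai", "physics", "audio", "particles", "streaming", "ui"]

with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
    for result in pool.map(simulate, SYSTEMS):
        print(result)
```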

I'm happy with my 3080 FE and 5800X3D at 1440p
(Though I am in the market for an OLED in a year or 2)

Cygni
Nov 12, 2005

raring to post

Quaint Quail Quilt posted:

I think I've come up with the solution to the 4090 being CPU limited,

Bring Back Kryotech.


Au Revoir Shosanna
Feb 17, 2011

i support this government and/or service
imagine the sound

DoombatINC
Apr 20, 2003

Here's the thing, I'm a feminist.





If you can think of a better way to cool the blistering 30w of a K6-3 I'm all ears :colbert:

8-bit Miniboss
May 24, 2005

CORPO COPS CAME FOR MY :filez:

Inept posted:

So if you were missing a cable and the manufacturer had a replacement for sale that was listed as compatible, you wouldn't buy it, and would instead get a whole new PSU?

That’s stupid and I don't need to follow up with anything.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
If they'd just put the turbo button back on desktops we wouldn't be CPU limited at all anymore

Arrath
Apr 14, 2011


Lockback posted:

If they'd just put the turbo button back on desktops we wouldn't be CPU limited at all anymore

TBH I think the turbo button should make a comeback, maybe call it the Eco button. Let me run the machine in Eco mode for everyday use, and step up to the ludicrous stock power settings as needed like when encoding a video or something. All without having to go into a control panel or BIOS to tweak power targets.

DeathSandwich
Apr 24, 2008

I fucking hate puzzles.
Edit: off topic, disregard.

CatelynIsAZombie
Nov 16, 2006

I can't wait to bomb DO-DON-GOES!

hobbesmaster posted:

A desktop synology uses like 5 W plus the power usage of the hard drives.

That’s less than the hit I take from raising the SOC voltage for :rice: memory.

This is interesting because reading the NAS thread, people are constantly fretting over how much their home servers are drawing. Idk what unit a "desktop synology" is, but are we talking less than the $2,000 they're asking for a 4-slot rackmount? https://www.amazon.com/Synology-Rackstation-Diskless-RS1619xs-4-Bay/dp/B07KM8C4PQ

shrike82 posted:

i don't know anything about video editing but do people do it from spinners?

The hours and hours of bullshit you're saving for b-roll, or just in case, or because you don't feel like archiving it but don't want to delete it, really is best kept on a spinny big boy imo. Even just running an entire DaVinci project off an HDD isn't really noticeable until poo poo gets huge or you're otherwise killing CPU/GPU perf with other processing.

CatelynIsAZombie fucked around with this message at 19:00 on Oct 20, 2022

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

Lockback posted:

If they'd just put the turbo button back on desktops we wouldn't be CPU limited at all anymore

I will build a retro computer case with a working turbo button one day

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

CatelynIsAZombie posted:

This is interesting because reading the NAS thread, people are constantly fretting over how much their home servers are drawing. Idk what unit a "desktop synology" is, but are we talking less than the $2,000 they're asking for a 4-slot rackmount? https://www.amazon.com/Synology-Rackstation-Diskless-RS1619xs-4-Bay/dp/B07KM8C4PQ

Yeah, when they say "desktop" they mean it sits on your desk just like a desktop computer. Rackmount models typically are meant for enterprise use and will cost more. The model you link there has a Xeon D and 5yr warranty, which explains a lot of the cost.

A lot of people who have concerns about power consumption from home servers are running stuff like old LGA2011 Xeons or other full-sized server systems with high platform power requirements which still apply at idle. Desktop NAS units on the other hand are typically built on a low-powered processor like an Atom/embedded APU/ARM chip, or at most a typical desktop-class processor like a Xeon E.

You can get similar numbers in a self-build NAS by just choosing hardware that doesn't take so much power. My Plex server is an i5-10400 and it idles under 10W (with no hard drives), but even my old Xeon L3426 in a Supermicro board from 2010 probably only adds about 15W typically on top of the 8 hard drives it manages. It's an old platform, but it's very much optimized for low power consumption and still has plenty of performance for just ZFS/samba/torrents.
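
For anyone fretting about the wattage, the back-of-the-envelope math is simple; the $0.15/kWh rate below is an assumed example, so plug in your own tariff.

```python
# What an always-on box's idle draw costs per year, at an assumed $0.15/kWh.
def annual_cost(idle_watts: float, usd_per_kwh: float = 0.15) -> float:
    kwh_per_year = idle_watts * 24 * 365 / 1000
    return kwh_per_year * usd_per_kwh

# Roughly: a low-power NAS, a desktop-class build, an old server platform at idle.
for watts in (10, 25, 100):
    kwh = watts * 24 * 365 / 1000
    print(f"{watts:>3} W idle = {kwh:.0f} kWh/yr = ${annual_cost(watts):.0f}/yr")
```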

Nfcknblvbl
Jul 15, 2002

Thank goodness for Micro Center. Besides the MSI 4090 I got from Newegg, everything I need for the build will be available for pickup after I get off work tonight.

wolrah
May 8, 2006
what?

Dr. Video Games 0031 posted:

I wonder if that's going to push devs to release PS5 exclusives just so they don't have to develop for the S. Microsoft doesn't let you do Series X only, do they?

So far I don't think anyone's allowed games to be exclusive to refresh or upgrade models within the same generation. As far as I'm aware there were never any Xbox One X exclusive or PS4 Pro exclusive titles last generation either, you have to support the entire lineup for whatever generation you're targeting.

I think there are a couple of games for Switch that won't work on Switch Lite due to the fixed controllers and lack of dock support, but that's not really the same thing.

gradenko_2000 posted:

to Microsoft's point I can sort of see why they'd want to keep it: 300 bucks and a Game Pass sub is a LOT of gaming for a relatively small up-front investment, and I imagine (though this is speculation) that there's a market segment that the S has managed to tap

Also there are a lot of people still gaming on smaller, older, low resolution TVs. I have a Series X in my living room because I have a 4K120 HDR OLED that can make use of every bit of its capabilities, but from a gameplay standpoint my old first-release Xbox One still does what I need on the 2008 era 1080p60 LCD in my bedroom. I've considered upgrading that to a Series S when there have been bundle deals with it just for the boot times from not having to wait on a hard drive.

tehinternet posted:

I will build a retro computer case with a working turbo button one day

I'm actually kind of surprised some enthusiast brand hasn't done this as a novelty: just add another front panel button header and make it toggle the boost clock feature by default, maybe with the option to configure it to switch between clock profiles instead. Sell an optional add-on 3.5" bay insert with a retro-style toggle button and a four-digit seven-segment display showing the current clock for bonus points.
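
A software version of that button is already scriptable, for what it's worth. Here's a hedged Linux-only sketch that flips the intel_pstate driver's no_turbo knob; AMD/acpi-cpufreq systems expose /sys/devices/system/cpu/cpufreq/boost instead, it needs root, and whether the path exists at all depends on your driver.

```python
# Sketch of a software "turbo button" on Linux: toggle CPU boost by writing
# the intel_pstate no_turbo sysfs knob. Needs root; on non-Intel drivers the
# equivalent is usually /sys/devices/system/cpu/cpufreq/boost (inverted logic).
from pathlib import Path

NO_TURBO = Path("/sys/devices/system/cpu/intel_pstate/no_turbo")

def toggle_turbo() -> None:
    turbo_disabled = NO_TURBO.read_text().strip() == "1"
    NO_TURBO.write_text("0" if turbo_disabled else "1")
    print("turbo is now", "ON" if turbo_disabled else "OFF")

if __name__ == "__main__":
    toggle_turbo()
```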

Cyrano4747
Sep 25, 2006

Yes, I know I'm old, get off my fucking lawn so I can yell at these clouds.

wolrah posted:

So far I don't think anyone's allowed games to be exclusive to refresh or upgrade models within the same generation. As far as I'm aware there were never any Xbox One X exclusive or PS4 Pro exclusive titles last generation either, you have to support the entire lineup for whatever generation you're targeting.

Sure, but the problem here is that the S was kind of gimpy when it launched. That poo poo is going to be looking reeeealy long in the tooth by the time we're near the end of this gen.

It was a terrible call on microsoft's part.

MarcusSA
Sep 23, 2007

Cyrano4747 posted:


It was a terrible call on microsoft's part.

Ok but the S is selling really well sooooo
:shrug:

Cyrano4747
Sep 25, 2006

Yes, I know I'm old, get off my fucking lawn so I can yell at these clouds.

MarcusSA posted:

Ok but the S is selling really well sooooo
:shrug:

Well, yeah, they gave people an option to buy a $300 "next gen" console. I'm not surprised it's selling really well.

But we're already starting to see some pinch points, and that poo poo is going to get worse as the generation progresses. The BONE lasted 7 years. We're coming up on year 2 of the X/S era. Those hardware limitations are going to be really, really, REALLY noticeable in 3 or 4 years time.

Basically the options are to keep AAA games locked in amber so they can run on a mid-range 2016 PC, let publishers stop targeting the S, or have people playing at 320p or whatever low target resolution you have to hit to run whatever 2025's latest and greatest video card melter is. There's just no way they're going to be able to hide that it's a potato at that point.

edit: whoever said they need to give an option to stream from a Series X instance has it right. As standalone hardware these are going to look really anemic in just a couple of years (well, more so than they already do), but as a first step towards a streaming-only tier of even cheaper hardware? That could be interesting.

MarcusSA
Sep 23, 2007

My vote would be to make the game engines scale better.

That would be better for the consumers.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Cygni posted:

ECC is one of those things where, if you don't already know that you need ECC, then you don't need it and you should leave it off. It will incur a performance penalty with GDDR6/6X, from my understanding.

I'm not sure if exposing that variable for the 4090 is a mistake or not; it's usually something that only shows up with the Studio Drivers and a Quadro or A/T series pro GPU. For a home user, there really isn't any reason to toggle it on.

Thanks brother.

If this is a Quadro-type setting that's now available to 4090 users, I wonder if that increases the card's value for those who need it?

They also may have enabled the pathway since it sounds like the Titan isn't coming out or whatever.
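
If anyone wants to see what the driver reports, nvidia-smi exposes the ECC state as a query field; whether a GeForce 4090 also lets you flip it the way pro cards do (nvidia-smi -e 0|1) is an assumption worth verifying on your own card.

```python
# Ask nvidia-smi what it thinks the current/pending ECC mode is.
# The query fields are standard nvidia-smi ones; changing the mode on a
# GeForce card (vs. a Quadro/RTX A-series) may or may not be supported.
import subprocess

result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,ecc.mode.current,ecc.mode.pending",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())
```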

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me
The Xbox Series S does its job very well: get people to buy an Xbox rather than a PlayStation, and get used to spending money on Game Pass.

The unfortunate (for MS) side effect is that making games playable on Series S probably makes it easier for devs to port to Switch.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

MarcusSA posted:

My vote would be to make the game engines scale better.

That would be better for the consumers.

That's just not possible. You can't endlessly scale down functional RT; there are limits, and the Series S doesn't have the hardware for it. Even the "big" consoles really don't. If developers can get a title with significant RT effects running on PS5/XSX, they're already doing an amazing job of scaling.

With modern renderers and techniques being fundamental to how the visuals are designed, you can't just turn stuff off. The only option to make it run on really crappy hardware is incredibly low resolutions and framerates. And with more and more temporal techniques, you can't just crater framerates without doing horrible things to IQ.

The GPUs in the Series X and PS5 weren't awful when announced, but we've seen two huge steps since then. They're way behind on GPU horsepower, and there are still ~5 years of probably significant scaling to go as we get into the MCM GPU era. The Series S is just gimped; it's way too little power for modern games.

K8.0 fucked around with this message at 22:09 on Oct 20, 2022

shrike82
Jun 11, 2005

If the Series S continues outselling the X... it might actually end up being the default hardware target for console devs

I can see certain developers going PS5 only if that's the case

repiv
Aug 13, 2009

even ignoring performance, raytracing also has a significant VRAM overhead and the XSS is lacking in that department too

we're already seeing developers throw up their hands and disable RT completely on the XSS, in games that offer it on the PS5/XSX

mobby_6kl
Aug 9, 2009

by Fluffdaddy
So they made a "next-gen console" without the "next gen" part? And it's outselling the real one? That's pretty hilarious.

Arrath
Apr 14, 2011


mobby_6kl posted:

So they made a "next-gen console" without the "next gen" part? And it's outselling the real one? That's pretty hilarious.

The 2nd part isn't so hard to imagine; money is tight for a lot of folks.

njsykora
Jan 23, 2012

Robots confuse squirrels.


In the UK the Series S is £250. It's cheaper than the Switch in some places.

mobby_6kl
Aug 9, 2009

by Fluffdaddy
Yeah, obviously that part makes sense, unlike MS actually releasing a crippled console and having it take over the market from their flagship. I dunno, maybe the financials work out in their favor though.

MarcusSA
Sep 23, 2007

repiv posted:

even ignoring performance, raytracing also has a significant VRAM overhead and the XSS is lacking in that department too

we're already seeing developers throw up their hands and disable RT completely on the XSS, in games that offer it on the PS5/XSX

What’s the problem with that?

It’s a 1080p console

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
I'm actually curious to see what happens once games finally stop targeting the PS4 / Xbox One. I'd bet money that we end up with 30 FPS as the default option. PC CPUs are not currently twice as fast as console CPUs, so if the consoles ship a CPU- or memory-latency-limited 30 FPS game, there's no way PCs are going to be able to hit 60 FPS, except maybe a 4090 doing frame interpolation.

shrike82
Jun 11, 2005

Did they save that much money from having multitiered slow ram? Seems like a decision that they're going to regret long term along the lines of the Xbox One ram issue

njsykora
Jan 23, 2012

Robots confuse squirrels.


mobby_6kl posted:

Yeah, obviously that part makes sense, unlike MS actually releasing a crippled console and having it take over the market from their flagship. I dunno, maybe the financials work out in their favor though.

More Xbox consoles means more Game Pass subs, which is all they really care about. The whole story of Game Pass, as Phil Spencer tells it, is that it was his way to convince the Microsoft higher-ups that the games division could fit into and contribute to their pivot to cloud services. Hence the long-running rumour that there's going to be a dedicated Xbox streaming stick at some point.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

shrike82 posted:

If the Series S continues outselling the X... it might actually end up being the default hardware target for console devs

Where are people getting numbers from?

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

shrike82 posted:

Did they save that much money from having multitiered slow ram? Seems like a decision that they're going to regret long term along the lines of the Xbox One ram issue

The Series X has split RAM too, with the fast part at 560GB/s and the slow part at 336GB/s. It was an intentional decision for sure, and it actually let them make the fast part of the RAM faster! The Series X has 10 memory chips; if they had done 16GB across 8 chips it would have to be slower, because fewer chips means a narrower total bus. If they had done 10x2GB chips, then they'd have 20GB of GDDR6, which they didn't want to pay for in a $499 box.

I'm sure that Microsoft's vision is that the slow part of the RAM is used for the OS and non-VRAM workloads; the dual-channel DDR4-3200 in your desktop only runs at about 51GB/s total bandwidth and you get by just fine with that, after all. That leaves 10GB of fast RAM on the Series X, and 8GB of fast RAM on the Series S.
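
Those split-pool figures fall straight out of bus-width arithmetic, assuming 14Gbps GDDR6 on 32-bit chips (the fast 10GB interleaves across all 10 chips, the slow 6GB across only 6 of them):

```python
# Bus-width arithmetic behind the Series X memory split.
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8  # GB/s

print("fast 10GB pool:", bandwidth_gb_s(10 * 32, 14), "GB/s")  # 560.0
print("slow 6GB pool :", bandwidth_gb_s(6 * 32, 14), "GB/s")   # 336.0
# Dual-channel DDR4-3200 on a desktop for comparison: 128-bit bus, 3.2 GT/s.
print("DDR4-3200 dual channel:", bandwidth_gb_s(128, 3.2), "GB/s")  # 51.2
```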

repiv
Aug 13, 2009

MarcusSA posted:

What’s the problem with that?

It’s a 1080p console

it's another roadblock for developers if they have to maintain an RT-off path just for the XSS

shrike82 posted:

Did they save that much money from having multitiered slow ram? Seems like a decision that they're going to regret long term along the lines of the Xbox One ram issue

i'm not convinced that's an issue in practice, the slow RAM is sized such that it'll all or mostly be used for CPU allocations and it's plenty fast enough for that

even the slow RAM on the XSS is a little faster than typical PC DDR5

MarcusSA
Sep 23, 2007

repiv posted:

it's another roadblock for developers if they have to maintain an RT-off path just for the XSS


But they have to have an RT off path for PC too.


K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

MarcusSA posted:

What’s the problem with that?

It’s a 1080p console

The problem is that it's crippling game design for anything that wants to run on console, and will continue to for the next 5+ years.
