AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord

canyoneer posted:

If you wish to identify your ports USB
You must first answer these riddles three

:wow:

Khorne
May 1, 2002

mrk posted:

My Gigabyte also has USB 2.0 ports lol.
Most Type-A devices are still 2.0 even if they're labeled as 3.x, with the obvious notable exceptions that need all the extra bandwidth the newer standards allow.

USB is what happens when you let number-go-up marketing drive your standard naming convention and adopt any dumb poo poo someone wants to do with no regard for the ecosystem you're creating. They make HDMI look good.

Khorne fucked around with this message at 07:17 on Dec 8, 2021

WhyteRyce
Dec 30, 2001

You can be USB 3 compliant but still only support HS/FS/LS
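For anyone decoding the alphabet soup, here's a rough reference sketch of the speed grades (spec-level ceilings only; real devices land well below these, and per the above, a compliant port doesn't have to support the fast ones):

code:
# USB signaling rates by speed grade (spec-level numbers; marketing names shift between revisions)
USB_SPEEDS_MBIT = {
    "LS (Low Speed)": 1.5,           # USB 1.x
    "FS (Full Speed)": 12,           # USB 1.x
    "HS (High Speed)": 480,          # USB 2.0
    "SuperSpeed (Gen 1)": 5_000,     # "USB 3.0" / 3.1 Gen 1 / 3.2 Gen 1
    "SuperSpeed+ (Gen 2)": 10_000,   # USB 3.1 Gen 2 / 3.2 Gen 2
    "Gen 2x2": 20_000,               # USB 3.2 Gen 2x2
}

for name, mbit in USB_SPEEDS_MBIT.items():
    # divide by 8 for a raw bytes-per-second ceiling; real throughput is lower
    # because of encoding and protocol overhead
    print(f"{name:22} {mbit:>8} Mbit/s  (~{mbit / 8:,.0f} MB/s raw)")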

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Is that why you can get "USB 3.0" flash drives that are still painfully slow (even connected through USB 3.0+)?

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Most USB drives have poo poo-tier flash that can't come close to saturating 2.0, never mind 3.0.

orcane
Jun 13, 2012

Fun Shoe
I have USB 3.0 flash drives that do their USB 3 speed for about 10 seconds then they overheat and throttle to under USB 2.0 speeds for the rest of the transfer, if they don't just stall out completely after a while. It's amazing.

hobbesmaster
Jan 28, 2008

canyoneer posted:

If you wish to identify your ports USB
You must first answer these riddles three point two, two by two

WhyteRyce
Dec 30, 2001

orcane posted:

I have USB 3.0 flash drives that do their USB 3 speed for about 10 seconds then they overheat and throttle to under USB 2.0 speeds for the rest of the transfer, if they don't just stall out completely after a while. It's amazing.

Yeah I got one of those little tiny ones that are about the size of a usb wireless mouse receiver and after a couple of minutes it just hangs on a transfer. Have to plug it into an old 2.0 port just to transfer anything bigger than a gig

AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord
Beat the system by buying a cheap sdxc card and a usb reader plug for it, for an el-cheapo 256gb drive.

Pro-tip: don’t do this ^ because the card will get insanely hot and fry itself into oblivion

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Trolling Thunder posted:

Beat the system by buying a cheap sdxc card and a usb reader plug for it, for an el-cheapo 256gb drive.

Pro-tip: don’t do this ^ because the card will get insanely hot and fry itself into oblivion

I was recently moving a lot of files between computers, and I tested SDXC versus HDD via USB 3.0; the latter was significantly faster. 44GB took 10 minutes with the hard drive, ~15 with the SD card.

For reference, when I tried the same transfer using my laptop's SD slot it was going to take over an hour, so the card reader is clearly important, and it's possible the USB hub I used for my test was also bottlenecking it. But otoh, if most card readers suck, that's a big strike against SD cards in its own way.

I guess the solution is to get a USB 3.0+ SSD?
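Rough numbers on that, assuming "44GB" means decimal gigabytes and treating the "over an hour" laptop run as roughly 60 minutes (back-of-the-envelope only):

code:
def avg_throughput_mb_s(gigabytes, minutes):
    """Average transfer rate in MB/s, assuming decimal GB and a steady transfer."""
    return gigabytes * 1000 / (minutes * 60)

print(avg_throughput_mb_s(44, 10))  # HDD over USB 3.0: ~73 MB/s
print(avg_throughput_mb_s(44, 15))  # SD card via USB reader: ~49 MB/s
print(avg_throughput_mb_s(44, 60))  # laptop's built-in slot: ~12 MB/s, basically USB 2.0 territory
All three are far below what a USB 3.0 link itself can carry (on the order of 500 MB/s), so the media and the reader are the bottleneck, not the port.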

orcane
Jun 13, 2012

Fun Shoe
Card readers are one of the obvious points where manufacturers try to save another 50 cents without a lot of people noticing or complaining about it. Some reviewers point them out in notebook or phone reviews and they're often fairly crappy. And a $5 USB card reader (that's actually sold for $50) is not going to be great either.

You can still do it but then you have to actually research a) fast cards and b) fast card readers (vs. just finding a fast USB thumbdrive). USB 3.0 SSDs are one alternative, or you get a decent (again, you have to find one that isn't awful) enclosure with USB 3.0 and UASP and an okay SSD and make your own external USB SSD.

WhyteRyce posted:

Yeah I got one of those little tiny ones that are about the size of a usb wireless mouse receiver and after a couple of minutes it just hangs on a transfer. Have to plug it into an old 2.0 port just to transfer anything bigger than a gig
Haha that's exactly the kind I had in mind. My mom used tiny USB 2.0 ones for music in her car in the past and I wanted to get her a new one from Sandisk. When I tried to move data from my computers the transfer rates cratered almost immediately and sometimes the transfer just froze or aborted at some point. I then tried to use it to back up data on my PS4 and that kept nearly crashing the PS4 when in use. They're not actually usable unless you limit them to USB 2.0 speeds.

E: I actually tried to come up with ways to cool them but 90% of their volume is a plastic heat trap resting inside the USB connector. You'd have to replace the plastic shell with a heat-conducting one, and at that point you can just buy a new stick instead :suicide:

orcane fucked around with this message at 14:30 on Dec 9, 2021

orcane
Jun 13, 2012

Fun Shoe
Speaking of cheap parts.

Some LGA-1700 boards for Alder Lake come with a socket and retention mechanism design that easily bends the CPU once it's clamped in, to the point the base of the CPU cooler can no longer transfer heat properly because it's not actually in contact with the entire heat spreader anymore:
https://www.igorslab.de/en/bad-cooling-at-alder-lake-problems-at-socket-lga-1700-on-the-lane-among-all-remedies/

Too Many Birds
Jan 8, 2020


orcane posted:

Speaking of cheap parts.

Some LGA-1700 boards for Alder Lake come with a socket and retention mechanism design that easily bends the CPU once it's clamped in, to the point the base of the CPU cooler can no longer transfer heat properly because it's not actually in contact with the entire heat spreader anymore:
https://www.igorslab.de/en/bad-cooling-at-alder-lake-problems-at-socket-lga-1700-on-the-lane-among-all-remedies/

why are they not naming and shaming

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

LRADIKAL posted:

You guys aren't wrong to diss it, but I can generally plug anything into anything and it basically works. It will often tell me if it could be working better or maybe I don't even notice. Seems to me it is stupid, but good enough. At least you can plug it in in either direction and it's pretty small and fits pretty snug.

That's the saving grace to all of this. 99% of users and their devices are fine with basically anything they plug them into, because outside of mass-storage devices and NICs, the bandwidth needs of almost anything else you're plugging in are tiny.

I mean, I'm totally on the bandwagon for TB-over-USB-ports because I've actually got some devices that could benefit, but that by itself puts me in the 0.1%, and even I only have two of said devices out of dozens and dozens of USB items. Being able to do single-cable video + USB + power to a monitor sure will be nice whenever we actually get there in any meaningful sense (no, having the occasional crappy office monitor that supports it via TB2 doesn't count for much).

We bitch and complain that the naming is stupid (because it is), but for most people it just doesn't matter.

Skyarb
Sep 20, 2018

MMMPH MMMPPHH MPPPH GLUCK GLUCK OH SORRY I DIDNT SEE YOU THERE I WAS JUST CHOKING DOWN THIS BATTLEFIELD COCK DID YOU KNOW BATTLEFIELD IS THE BEST VIDEO GAME EVER NOW IF YOULL EXCUSE ME ILL GO BACK TO THIS BATTLECOCK
So maybe I missed the obvious answer, but it seems like a lot of the fervor for the new intel cpus died down. I really want to replace my 8700k with a top of the line CPU because I am cpu bound in a handful of games at the moment. Are any of the new intel cpus worth looking at?

CoolCab
Apr 17, 2005

glem
yes for sure, big performance gains over last gen and current ryzen if you want more power. motherboards are super expensive though.

Cygni
Nov 12, 2005

raring to post

Yea, the 12600k and 12700k are probably what to get for a mid/high system right now if you are starting from scratch, especially if you have a local Microcenter. The Zen3 parts are a bit overpriced in comparison. Pair it with a ddr4 board.

You can save a little by getting something older, but if you only build every few years, I would spend the money and just go 12th gen.

Shipon
Nov 7, 2005

Skyarb posted:

So maybe I missed the obvious answer, but it seems like a lot of the fervor for the new intel cpus died down. I really want to replace my 8700k with a top of the line CPU because I am cpu bound in a handful of games at the moment. Are any of the new intel cpus worth looking at?

Gaming performance is really good, with the caveat that some older games might not play well with the E-cores in play, and you'll need to move to Win 11 to take full advantage of the core scheduler. I just went from a 9700k to a 12900k myself because of Flight Sim and a few other things.

Stick with DDR4 though, DDR5 is somehow even harder to find than graphics cards at the moment and the modules that are out aren't very good.

CoolCab
Apr 17, 2005

glem
ddr4 is also very inexpensive and my expectation is that ddr5 will be much more reasonable, available and affordable in like a year, unless supply chains etc.

it might be a viable upgrade in future, if you were up for it, to replace your mobo in 1-2 years and upgrade your ram assuming it's cheaper. kind of weird that's true tbh, I feel like replacing your mobo before your CPU is atypical

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Skyarb posted:

So maybe I missed the obvious answer, but it seems like a lot of the fervor for the new intel cpus died down. I really want to replace my 8700k with a top of the line CPU because I am cpu bound in a handful of games at the moment. Are any of the new intel cpus worth looking at?

There aren't many games where you will see any significant performance uplift from a properly OC'd 8700k. What exactly are you having issues with?

CoolCab
Apr 17, 2005

glem
well, iirc from the other thread he's pairing it with a 3080ti so this isn't a price/performance machine, it's more of a high end or showpiece. i feel like four generations and four years is a reasonable stretch for a routinely updated high end machine.

Cygni
Nov 12, 2005

raring to post

I went from an 8700K to a 5950X (which a 12700k will meet or beat) and def noted a lot more performance increase with a 3090 than i was expecting, but its extremely game dependent. Stuff like GTAV, HZD, and ROTR/SOTR all had double digit FPS gains, but RDR2 and Asscreed had zero change. YMMV obviously.

Skyarb
Sep 20, 2018

MMMPH MMMPPHH MPPPH GLUCK GLUCK OH SORRY I DIDNT SEE YOU THERE I WAS JUST CHOKING DOWN THIS BATTLEFIELD COCK DID YOU KNOW BATTLEFIELD IS THE BEST VIDEO GAME EVER NOW IF YOULL EXCUSE ME ILL GO BACK TO THIS BATTLECOCK
Yeah games vary. But for me hzd, rdr2, sea of thieves hilariously, hunt showdown, maybe battlefield if it ever doesn't suck rear end.

And yes I have a 3080ti so I'm not really looking to be smart but instead just get very high end. I will stick to ddr4 though

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.
I have an 8700k and a 2070rtx and would probably have an easier time just upgrading the cpu/motherboard/ram :sigh:

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Skyarb posted:

So maybe I missed the obvious answer, but it seems like a lot of the fervor for the new intel cpus died down. I really want to replace my 8700k with a top of the line CPU because I am cpu bound in a handful of games at the moment. Are any of the new intel cpus worth looking at?

11th gen landed with a wet fart (except maybe for the low-end) because it was a bad series, but 12th gen is a real step-up

I think it just doesn't get as much talk because CPUs are a lot easier to get than GPUs nowadays

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
What do people think of the E-cores? I've seen a fair amount of coverage saying they're close to pointless for gaming, and even sometimes decrease performance because they introduce some kind of latency (as in, in effect decreasing the FPS).

Rinkles fucked around with this message at 04:06 on Dec 15, 2021

Cygni
Nov 12, 2005

raring to post

I will be proudly not memorizing the math on these ever in my life. How much is 4x48 sticks? No idea. Will never know. Use a calculator.

https://www.anandtech.com/show/17117/sk-hynix-to-manufacture-48-gib-and-96-gib-ddr5-modules

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Rinkles posted:

What do people think of the E-cores? I've seen a fair amount of coverage saying they're close to pointless for gaming, and even sometimes decrease performance because they introduce some kind of latency (as in, in effect decreasing the FPS).

TPU did an E-Cores only review of the 12900K: https://www.techpowerup.com/review/intel-core-i9-12900k-e-cores-only-performance/

This surprisingly remains the story with 1080p, where the relative performance holds. The E-cores are bottlenecking the machine, the P-core-only configuration with HTT disabled ends up 33% faster, and the stock configuration about 39% faster. You have to understand that the P-core-only configuration also has HTT disabled, which explains the performance gap with the stock configuration. An exceptional result here is Civilization 6, where the E-cores-only configuration is found beating the 8-core/16-thread Core i7-10700K! Highly optimized games such as DOOM Eternal really don't care about the E-cores, where at 1080p it's seen matching just about any other chip.

At 1440p, our RTX 3080 GPU begins to feel the heat. The 8-core E-core only configuration performs roughly the same as most other Ryzen 3000 Zen 2 processors. Civilization and DOOM remain the outliers here, and there remain several games that are GPU-bottlenecked enough to where the CPU isn't making much of a difference. At 4K, with the bottleneck squarely in the hands of the GPU, our E-core bunch is holding out surprisingly well, performing on par with Zen 2 chips. However, the margins of performance between all chips are wafer-thin. The slowest chip is just 2% slower, and the fastest only 4.9% faster.

BIG HEADLINE fucked around with this message at 06:29 on Dec 15, 2021

Helter Skelter
Feb 10, 2004

BEARD OF HAVOC

Rinkles posted:

What do people think of the E-cores? I've seen a fair amount of coverage saying they're close to pointless for gaming, and even sometimes decrease performance because they introduce some kind of latency (as in, in effect decreasing the FPS).

Kinda neutral? They're not really "for" gaming, so I'm neither shocked nor appalled that they're not particularly amazing at it (though it should be noted that they're not usually terrible at it, either). They're pretty good for non-gaming thread-heavy stuff if you have a need (Intel sure did), and from what I gather there are usually workarounds with Process Lasso if you do run into stuff that's trying to hit the E-cores when it really shouldn't.

Overall I think they make a lot more sense in mobile platforms where space and power are more constrained (because that's what the Atom line they're based off of was made for). A 4p+8e laptop with a decent GPU could be pretty rad. I don't think we'd be seeing them on desktop like this if Intel wasn't still playing catch-up with AMD on multi-chip designs, though.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Skyarb posted:

So maybe I missed the obvious answer, but it seems like a lot of the fervor for the new intel cpus died down. I really want to replace my 8700k with a top of the line CPU because I am cpu bound in a handful of games at the moment. Are any of the new intel cpus worth looking at?

TPU has the 12600K as being 25% faster than a 10600K (which is a reasonable proxy for an OC'd 8700K) in 720p gaming. It's a decent chunk faster. Just, for a lot of people sitting on 8700K/9900K, maybe not quite worth it yet (although that's starting to get into the range of worthwhile).

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Cygni posted:

I will be proudly not memorizing the math on these ever in my life. How much is 4x48 sticks? No idea. Will never know. Use a calculator.

https://www.anandtech.com/show/17117/sk-hynix-to-manufacture-48-gib-and-96-gib-ddr5-modules

Don’t tell anyone else, but 4x48 = 4x50 - 4x2 = 200 - 8 = 192 :pseudo:
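Or just brute-force it; with the 48 GiB and 96 GiB modules from the article, the kit totals work out like this (plain arithmetic, not a list of kits anyone is actually shipping yet):

code:
from itertools import product

module_sizes_gib = (48, 96)   # per the SK hynix announcement above
slot_counts = (2, 4)          # typical desktop boards

for slots, size in product(slot_counts, module_sizes_gib):
    print(f"{slots} x {size} GiB = {slots * size} GiB")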

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Rinkles posted:

What do people think of the E-cores? I've seen a fair amount of coverage saying they're close to pointless for gaming, and even sometimes decrease performance because they introduce some kind of latency (as in, in effect decreasing the FPS).

i'm really impressed with gracemont in performance terms and size terms relative to other x86, but an A15 Blizzard core is around the same performance at 1/3 the area. Yeah, 5nm vs 7nm, but even if you give it a generous 2x scaling with logic, Blizzard is far more area-efficient. On an equivalent node it would likely be >30% more performant per mm2.

efficiency cores may be where the bit about x86 decoder overhead being area-hungry and power-inefficient comes into play - sure it's a couple percent of a giant performance core, but if your efficiency core needs the same decode width and it's 1/3 the area of a performance core, then the decoder also makes up 3x the relative area of the core. If the area/power difference is 3% on a performance core it's 9% on an efficiency core, and if you need 8 of them there's no easy way around that, you need 8 frontends for your 8 cores. It's not gonna instantly murder everything but it's another factor dragging down the attempt to scale x86 downwards.
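Plugging the post's own rough figures in (illustrative ratios, not measured die areas):

code:
# Illustrative ratios from the argument above, not measured die areas.
p_core_area = 1.0                   # normalize a performance core to 1.0
e_core_area = p_core_area / 3       # efficiency core at ~1/3 the area
decoder_area = 0.03 * p_core_area   # decoder ~3% of a P-core, same width either way

print(f"decoder vs P-core: {decoder_area / p_core_area:.0%}")   # 3%
print(f"decoder vs E-core: {decoder_area / e_core_area:.0%}")   # 9%

# Eight E-cores means eight full-width front ends:
eight_decoders = 8 * decoder_area
print(f"8 decoders = {eight_decoders:.2f} P-core equivalents "
      f"(~{eight_decoders / e_core_area:.1f} E-cores' worth of area)")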

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
if you're doing productivity work then it's my understanding that the E-cores are nice because it's more cores no matter what

for gaming, it's probably going to come down to how far the E-core-less i5-12400 is going to be pushed in terms of clocks, because even if we say that the E-cores are not-good for gaming and you're better off not using them, an i5-12600K is still going to run its P-cores at a higher frequency out of the box, on top of being able to be overclocked if you really wanna push it, so it's not a given that the 12400 is going to be a better value proposition for not having E-cores

(or rather, a 12400 probably is going to be a good value-buy, but because of being cheaper overall and letting you get away with a B- or H- series board, not because of the P- vs E-core performance)

Skyarb
Sep 20, 2018

MMMPH MMMPPHH MPPPH GLUCK GLUCK OH SORRY I DIDNT SEE YOU THERE I WAS JUST CHOKING DOWN THIS BATTLEFIELD COCK DID YOU KNOW BATTLEFIELD IS THE BEST VIDEO GAME EVER NOW IF YOULL EXCUSE ME ILL GO BACK TO THIS BATTLECOCK

Paul MaudDib posted:

TPU has the 12600K as being 25% faster than a 10600K (which is a reasonable proxy for an OC'd 8700K) in 720p gaming. It's a decent chunk faster. Just, for a lot of people sitting on 8700K/9900K, maybe not quite worth it yet (although that's starting to get into the range of worthwhile).

Is the 12600k the best of the new series?

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Skyarb posted:

Is the 12600k the best of the new series?

the top SKU is the i9-12900K, with 8 P-cores and 8 E-cores
the next step down from that is the i7-12700K, with 8 P-cores and 4 E-cores
the next step down from that is the i5-12600K, with 6 P-cores and 4 E-cores

the i5-12600K gets a lot of attention because it gets you most of the performance - the i9 is only ~6% faster in gaming but is at twice the MSRP

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Oh my god, the 12600K is now $230 at Micro Center. I probably would've waited if I'd known, even though a Z690 board equivalent to the Z590 I got would've been a whole lot more expensive. (I could technically still return all the parts, but it would be a big hassle and pretty nerve-wracking.)

Dr. Video Games 0031
Jul 17, 2004

The 12600K is the sensible pick. It's $300 in many stores, $250 at Micro Center. That's a steal. The 12700K is the overkill pick. You'll pay $100 - $120 for like 5% extra performance and potentially more longevity as games get better at multithreading, though that remains to be seen and future proofing is, as usual, somewhat dumb. By the time that difference starts mattering, you'll probably want to upgrade again anyway. The 12900K is the high-end one that you get if you are going to do any kind of heavily multithreaded productivity work. Do not get it for gaming, it will only barely help you there over the 12700k.

The other important difference aside from core count (perhaps the most important difference) is cache size. 12600K has 16MB of L3 cache, the 12700K has 25MB, and the 12900K has 30MB. This accounts for almost all of the performance difference in games. The minor clock speed differences (4.9ghz for 12600k and 12700k, 5.2 for 12900k) account for the rest. The core counts are basically not a factor at all right now.

edit: or $230 from micro center for the 12600k. wow. The 12700k is $300. For a $70 price difference, that's worth it if you have a super high-end GPU and you're playing at 1440p. At 4K it's still iffy, but it's so cheap...

Also lol wow, alder lake really must be underperforming when it comes to sales.
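For the value math, here's the rough comparison using the Micro Center prices and the ~5% gaming gap mentioned above (obviously ballpark figures):

code:
price_12600k, price_12700k = 230, 300   # Micro Center prices quoted above
perf_12600k, perf_12700k = 1.00, 1.05   # ~5% faster in games, per the post

premium = (price_12700k - price_12600k) / price_12600k  # how much more money
uplift = (perf_12700k - perf_12600k) / perf_12600k      # how much more performance

print(f"price premium: {premium:.0%}, gaming uplift: {uplift:.0%}")
print(f"$ per unit of relative performance: "
      f"12600K {price_12600k / perf_12600k:.0f}, 12700K {price_12700k / perf_12700k:.0f}")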

Dr. Video Games 0031 fucked around with this message at 09:53 on Dec 15, 2021

BurritoJustice
Oct 9, 2012

12700K at $300 is a wildly good deal, wtf

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Yeah that's insane and I'm now extremely tempted to build a 12700k system like tomorrow.


BurritoJustice
Oct 9, 2012

K8.0 posted:

Yeah that's insane and I'm now extremely tempted to build a 12700k system like tomorrow.

If you're near a microcenter and on anything pre-zen3 I'd just do it lol

Get a Z690-A and some bdie, run it at 4000c15 or so. Insane performance for the money.
