|
hobbesmaster posted:OEMs like Dell seem to be trying very hard to make it happen so Intel probably only wants to deal with that. The weird Dell power supply/motherboard seen in their PC reviewed by GamersNexus didn't actually use the ATX12VO standard. It used their own proprietary standard that doesn't work with anything else.
|
# ¿ Jul 29, 2021 01:41 |
|
|
ITX motherboards could switch to SO-DIMMs and hide those on the underside like they already do to M.2 drives.
|
# ¿ Jul 31, 2021 01:02 |
|
1.1M on the low end for him. He clearly needed to do an even more absurd face.
|
# ¿ Jul 31, 2021 12:11 |
|
Harik posted:That's not what "idle" power means. Idle power is when it's off, just wasting energy waiting on someone to press the power button. Minimal current to keep the USB (and possibly ethernet) circuitry on for wakeup events. That's not how it's been explained in the press, and in that Linus Tech Tips video that was posted earlier, he showed a drop in desktop idle power draw (measured at the wall) from around 60 watts to 30 watts when switching to an ATX12VO PSU on an otherwise identical build. Current 80plus certification doesn't even bother measuring efficiency below 20% load, and ATX12VO is meant to be much more efficient at that.
|
# ¿ Aug 2, 2021 10:03 |
|
Shipon posted:i would hope that a component would be usable for its warranty period. if that's a 10 year period then why shouldn't you be able to use it for that length? if it's a cheaper one with a 3 year warranty yeah you're probably a fool to hang onto it longer The "way too long" part was in reference to the guy saying he's held onto his current one for fifteen years. That thing's a ticking time bomb.
|
# ¿ Aug 2, 2021 13:19 |
|
Also, what's really being discussed is the power being drawn by your system measured at the wall, not the power being consumed by your core components. A better power supply will consume less electricity, but it won't change the figures you see in hwinfo at all. But if you're using such a low-power machine and it has a low-power power supply to match, then it should be reasonably efficient.
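To put some numbers on it (the efficiency figures here are invented for illustration, not measurements of any real unit):

```python
# Illustrative only: wall draw = DC load / PSU efficiency at that load.
# The efficiency percentages below are made up for the example, not measured.
def wall_draw_watts(dc_load_watts: float, efficiency: float) -> float:
    """Power drawn at the wall for a given DC load and PSU efficiency (0-1)."""
    return dc_load_watts / efficiency

# Same hypothetical 30 W idle DC load through two different supplies:
print(wall_draw_watts(30, 0.50))  # 60.0 W at the wall (poor low-load efficiency)
print(wall_draw_watts(30, 0.85))  # ~35.3 W at the wall (better low-load efficiency)
```

The DC load (what hwinfo reports) is identical in both cases; only the wall reading changes.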
|
# ¿ Aug 2, 2021 21:50 |
|
Palladium posted:Who knows, but if it's a quality gold+ unit it would be irrelevant anyway IMO How does a power efficiency rating relate to longevity when unpowered? Some people put way too much stock into the 80+ ratings. Power efficiency is important for a number of reasons (more efficiency = less waste heat = less wear and tear at load and less noise), but the bronze/gold/etc ratings don't indicate anything other than power efficiency. Perhaps you can draw some correlation between companies that care about product quality and companies that sell high-efficiency power supplies, but I wouldn't take that as a given.
|
# ¿ Aug 3, 2021 09:37 |
|
It's good that the 12600K will be less than $300 but I mourn the death of the sub-$200 CPU.
|
# ¿ Sep 11, 2021 12:14 |
|
Twerk from Home posted:Yeah, the 4C/8T i3s were an outstanding value that got completely overlooked by the Rocket Lake update. The 11400's MSRP is less than $200, in theory, making it a really great value too. Too bad only OEMs seem to have decent access to it. All X400 CPUs have been less than $200, until Alder Lake. It's just how it's going to be from now on. AMD already raised their prices for Zen 3 and Intel would be foolish not to follow suit to some degree (the 12600K seems like a potentially good value still, though).
|
# ¿ Sep 12, 2021 11:24 |
|
How does the switch to DDR5 actually affect performance? Does it do anything to be faster than DDR4 at the same speeds and timings, or are they directly comparable to one another? What I mean is, is DDR5-4800 at those timings as terrible as it looks on paper, or is there a secret sauce that makes it good?
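For reference, the raw first-word latency math is easy to sketch out. The CL figures below are just typical launch-era kits I'm assuming for the comparison, not anything official:

```python
# Rough sketch: first-word latency of a DDR kit in nanoseconds.
# DDR transfers twice per clock, so the real clock is half the MT/s figure.
def first_word_latency_ns(mt_per_s: int, cas_latency: int) -> float:
    clock_mhz = mt_per_s / 2           # e.g. DDR5-4800 runs a 2400 MHz clock
    return cas_latency / clock_mhz * 1000

# Assumed typical kits (not official JEDEC or vendor specs):
ddr4 = first_word_latency_ns(3200, 16)   # 10.0 ns
ddr5 = first_word_latency_ns(4800, 40)   # ~16.7 ns
print(f"DDR4-3200 CL16: {ddr4:.1f} ns, DDR5-4800 CL40: {ddr5:.1f} ns")
```

By that math the early DDR5 kits really do look worse on latency; whatever they gain has to come from bandwidth and the doubled bank groups instead.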
|
# ¿ Oct 13, 2021 01:23 |
|
This is one argument we really don't need to drag back up from the grave it was resting comfortably in.
|
# ¿ Oct 20, 2021 12:39 |
|
Zedsdeadbaby posted:It's not there anymore but on my phone one of the google recommended articles was about a 6000MT/s stick, I think it was g.skill. https://www.techpowerup.com/288082/g-skill-announces-worlds-fastest-ddr5-6600-cl36-trident-z5-memory-kits DDR5-6600 CL36. I expect it to cost an arm and a leg.
|
# ¿ Oct 22, 2021 14:25 |
|
Cygni posted:ROG boards have "leaked": The biggest pain in the rear end about this is that most low-profile coolers aren't compatible with this kind of layout. Good luck fitting an NH-L12 or an Alpenföhn Black Ridge in there.
|
# ¿ Oct 23, 2021 02:05 |
|
Rocket Lake was relatively available on launch day. All CPUs were present at MSRP for some hours, enough time to order one if you were paying attention. After that, availability was hit or miss for a month or two but they came permanently back into stock in due time (aside from the 11400). Alder Lake may be in higher demand though due to it not being overshadowed by a competing CPU line out of the gate. As long as you're paying attention to the launch windows though, I would expect you to be able to get one.
Dr. Video Games 0031 fucked around with this message at 05:23 on Oct 25, 2021 |
# ¿ Oct 25, 2021 05:21 |
|
BlankSystemDaemon posted:You might wanna take this with a grain of salt, considering that this is the guy who's been destroying his companies business critical data for like a decade through wilful ignorance and outright dumb actions that were guaranteed to lead to failure. What, specifically, in this video is inaccurate?
|
# ¿ Oct 26, 2021 23:18 |
|
FC6 is also heavily single-threaded. Both it and AoE IV should mirror the single-core performance gains Alder Lake is bringing. None of these gains are very surprising. It'd be a pretty shocking disappointment if the new architecture Intel is releasing wasn't better than a year-old product line. AMD is sort of slipping here by not having anything competitive to release until early next year. edit: 5600X availability has been somewhat spotty this last month. Makes me wonder if AMD is holding any product back for a surge + price cut. A $250 or $240 5600X would at least stay semi-competitive with the 12600KF. If AMD keeps the 5600X at $300, then I wouldn't see any reason to ever consider it, unless you have heat/power concerns. Dr. Video Games 0031 fucked around with this message at 12:38 on Oct 27, 2021 |
# ¿ Oct 27, 2021 12:28 |
|
VorpalFish posted:AMDs tdp ratings aren't strict power limits like Intel's which is why you can't directly compare. The equivalent value for AMD would be the PPT, which is 142w for the 105w tdp chips and 88w for the 65w chips. As a 5600X owner, I'm reasonably sure the stock PPT for it is somewhere in the mid to upper 70s. You can increase it with the press of a button in Ryzen Master. You get like 98% of all the performance you're going to get out of the chip by the time you hit 90w though. (And 90% of the performance at 75w)
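For anyone wanting to sanity-check those numbers, the commonly cited relationship (a community rule of thumb, not an official AMD formula) is PPT = 1.35x TDP:

```python
# Rule-of-thumb sketch of AMD AM4 stock power limits. The 1.35x factor is the
# commonly cited TDP-to-PPT relationship, not a formula AMD publishes.
def stock_ppt_watts(tdp_watts: int) -> int:
    return round(tdp_watts * 1.35)

print(stock_ppt_watts(105))  # 142 (matches the 105 W TDP parts)
print(stock_ppt_watts(65))   # 88 (matches the 65 W TDP parts)
```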
|
# ¿ Oct 28, 2021 00:18 |
|
VorpalFish posted:PPT is 88 watts stock for the 5600x No, that's not what I'm observing. Hammering the 5600X with Prime95 causes it to boost well under the max clock of 4600mhz across all cores. The default PPT as per Ryzen Master is 76 watts.
|
# ¿ Oct 28, 2021 00:52 |
|
The 12400 wasn't even mentioned by Intel today. They'll probably stealth drop it on some random day without giving it to press first like they did with the 11400.
|
# ¿ Oct 28, 2021 02:06 |
|
You're still getting an IPC uplift, so yeah, I'd expect something akin to 5600X performance but for $200 or less.
|
# ¿ Oct 28, 2021 03:28 |
|
VorpalFish posted:If it actually drops as a faster, more efficient 11400 at $200 and is readily available at msrp it pretty much instantly becomes the best value CPU for gaming. It is only now, about 7 months later, that the 11400 is actually in stock and available for purchase for $200. https://www.newegg.com/intel-core-i5-11400-core-i5-11th-gen/p/N82E16819118241 This "sale" ends in 2 hours supposedly and then it's back to being $220, which is $30 above MSRP. The F SKU is still wholly unavailable lol.
|
# ¿ Oct 28, 2021 05:12 |
|
My inclination would be to just wait another year. The 12th gen rollout seems like a clusterfuck of overpriced components, and the 11th gen may not be a large enough upgrade to justify the cost. 8th gen mobos have m.2 2280 slots too, don't they? And while they may not have PCIe Gen 4, the 3090 sees only a very minor uplift from using Gen 4. (like, 2 - 3% higher framerates.) IMO, wait and see if motherboard prices come down, and watch what happens with Zen 3 with 3D vcache and potentially Zen 4, and then step up into a bigger upgrade. gradenko_2000 posted:we really couldn't know for sure until reviews come in and there's independent verification of how much better Alder Lake is going to be, but on the other hand Alder Lake is probably going to be fine and the real watch-out is early-adopter costs and issues You have this partially backwards. If you're doing 4K gaming, that's less of a reason to upgrade your CPU. The higher the resolution you game at, the less work your CPU does. Ironically, if he's sticking with 1080p in order to push very high frame rates, that would see a much bigger benefit from upgrading his CPU. Also the 11700K offers very, very little benefit over the 11600K when it comes to gaming. Go with the 11600K if you do plan on upgrading to 11th gen. Dr. Video Games 0031 fucked around with this message at 08:29 on Oct 28, 2021
# ¿ Oct 28, 2021 08:00 |
|
"High-refresh" 4K gaming isn't really a thing except in old games and a select few competitive games after you've turned the settings down very low. And in that case, then sure, it may be worth it. I've been trying to find reviews that have benchmarks with both the 11th gen and 8th gen CPUs so we can get some direct comparisons. Guru3D has a few games with applicable benchmarks (up to 1440p; 4K charts would be even more flat): https://www.guru3d.com/articles_pages/intel_core_i7_11700kf_processor_review,24.html And TechPowerUp has benchmarks with the 9th gen through 11th, with some low power 9th parts that are worse than the 8700K (up to 4K): https://www.techpowerup.com/review/intel-core-i7-11700kf/17.html These are resources you should be looking at to determine if an upgrade is worth it. Also it occurs to me that Intel didn't specify what resolution they benchmarked gaming performance at for their Alder Lake performance comparison charts, but I'd bet it's 1080p low settings, or something to that effect. Always assume they're showing the most exaggerated results possible when they don't specify how they achieved them. In normal high resolution gaming conditions, Alder Lake won't magically give you 15% more FPS (and neither will the upcoming Zen 3 3D vcache chips). For people who are building PCs that are primarily used for gaming, just wait out the chaos, or maybe jump on any Zen 3 or Rocket Lake discounts that happen if you're desperate (or if you just have money to burn and an itch to upgrade I guess).
|
# ¿ Oct 28, 2021 08:35 |
|
TheFluff posted:Where on earth did you get this idea? I got on the 4K 120Hz train basically as soon as the first <$1000 monitor launched a few years ago. I have an 8700k and a 2080ti and while I don't play many AAA titles, there are very few titles I'm aware of that I couldn't run at >100fps (RDR2, basically). I run at mostly high or very high settings although with mostly useless performance hogs like SSAO turned off. As for competitive games, Rainbow 6 Siege is probably the most GPU intensive one I know and even that will easily run at >150fps with a 2080ti/3080 on ultra settings. You can also turn on DLSS if you want, but I generally don't because it's not necessary and it does look worse. This is strange to me because the vast majority of games I personally play get a max of maybe 90 FPS even with a 3080 Ti. I did mention competitive games (like R6 Siege) as being capable of higher frame rates, but the difference here is our personal perspectives of how much of the market they constitute, I guess. We definitely play very different kinds of games. Either way, point taken.
|
# ¿ Oct 28, 2021 14:36 |
|
I guess the point is more that when it comes to 4K, even at higher frame rates, for most games you're going to be much more GPU-bound than CPU-bound. There are obvious exceptions to this (again, R6 Siege and some other competitive games), but on the whole I would say that being on a state-of-the-art CPU is not hugely important to having good 4K gaming performance.
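A toy model of what "GPU-bound" means here, with invented frame times just to illustrate the point:

```python
# Toy model: each frame costs roughly max(cpu_ms, gpu_ms), so fps is bounded
# by whichever component is slower. Frame times below are invented examples,
# not benchmarks of any real hardware.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=6.0, gpu_ms=14.0))  # ~71.4 fps: GPU-bound, as at 4K
print(fps(cpu_ms=4.0, gpu_ms=14.0))  # still ~71.4 fps despite a faster CPU
print(fps(cpu_ms=6.0, gpu_ms=5.0))   # ~166.7 fps: now the CPU is the bottleneck
```

Which is why a CPU upgrade shows up at 1080p and mostly vanishes at 4K.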
|
# ¿ Oct 28, 2021 15:08 |
|
Sphyre posted:Is alder lake finally 10nm? Did intel finally fix their busted rear end process? It's not 10nm, it's Intel 7®! (it's 10nm, but intel is moving away from talking about nanometers now because everyone measures them differently and process labels have lost all meaning) After Intel 7 comes Intel 5, then Intel 3, then Intel 20A, because once you hit 2nm, naming your process after transistor size becomes good again. Look, don't think about it, okay? Dr. Video Games 0031 fucked around with this message at 23:50 on Oct 28, 2021 |
# ¿ Oct 28, 2021 23:48 |
|
The cheapest B550 board I would want to buy is the MSI B550-A Pro for $120. The cheaper boards strike me as a little too cheap. You should also stay away from those cheap ASRock Z690 boards. They cheap out on the VRMs hard on those things, and the end result is you get very weak power delivery that won't be able to get anywhere near the full PL2 performance out of a higher end chip (and maybe not even the 12600K). Those things are some real pieces of poo poo, especially the Phantom Gaming 4 (edit: The ASRock Z690 Pro seems maybe not as terrible, its VRM is much beefier at least. wait for reviews). I think you're probably starting at $200 or $220 for a decent Z690 ATX board for now. And it's worth having a healthy dose of skepticism towards the more affordable DDR5 kits that are showing up. It seems very much possible that those will perform worse than DDR4 kits that are 60% their price. If you want the kits that may actually be pretty performant, then the prices go up by an astronomical amount. I enjoy being on the bleeding edge, but this is looking a little too spicy. edit: The cheapest DDR5 board is the MSI PRO Z690-A DDR5 for $230, so you should make that the baseline when trying to compare the bottom end of each platform. At least the MSI Pro series tends to be trustworthy. Dr. Video Games 0031 fucked around with this message at 05:47 on Oct 29, 2021
# ¿ Oct 29, 2021 05:26 |
|
https://twitter.com/VideoCardz/status/1453107202365984782 So yeah, don't buy the Phantom Gaming 4 just because it's the cheapest one lol.
|
# ¿ Oct 29, 2021 08:01 |
|
Dum-dum #1 right on time.
|
# ¿ Nov 4, 2021 08:49 |
|
mobby_6kl posted:Whoa the 12600 looks like a great deal unless they decide to sell the 12900 for peanuts too. The power consumption in Blender is pretty wild (is that the AVX or something?) but I'm not going to have it rendering 24/7 so They cut AVX-512 support. That's just how the 12900K reacts to any intense workload.
|
# ¿ Nov 4, 2021 09:27 |
|
Paul MaudDib posted:The money question is still how much ddr4 affects performance - because with DDR5 you’re spending $400 on memory with your $200 processor. What. The 5600X was and still is the no brainer best choice in the Zen 3 lineup for most people. It had killer performance for the price and the 5800X launched at an MSRP that was $150 higher for barely any gain in most real-world applications (including gaming).
|
# ¿ Nov 4, 2021 10:21 |
|
Paul MaudDib posted:5600X was an objectively bad choice in multiple dimensions - significantly worse perf/$ than the 3600 (price increased by far more than the performance) while the longevity is significantly worse than the 8700K (you got 3 extra years of lifespan with the 8700K for $75 extra MSRP vs MSRP) for relatively minimal performance increases. I have to say, I don't think I've seen a post on this forum this out of touch with reality in a very long time. The 5600X was over 30% better than the 3600X in gaming benchmarks and even better in productivity benchmarks. The lack of cores doesn't hold it back in the slightest, and its longevity is looking like it'll be pretty good. I plan on holding onto mine for a few more years.
|
# ¿ Nov 4, 2021 10:52 |
|
The 12600K is out of stock already at the bigger online stores, but you should be able to find some at the smaller stores (B&H Photo, for instance) for a little bit longer before it's out of stock everywhere for the next few months.
|
# ¿ Nov 4, 2021 14:41 |
|
CoolCab posted:yeah wowsers, if it is matching the 5800X then the 5600X would need a massive price drop to justify. what's pricing like? i heard the 12600k is a hundred bucks cheaper than a 5600X? Uh, no. The 12600K is $320. 5600X is around $300.
|
# ¿ Nov 4, 2021 14:42 |
|
Drakhoran posted:And here's a list of games you won't be able to play on Alder Lake: From TechSpot: quote:Luckily, there is a way to avoid the issue before the patches roll out. It involves enabling Legacy Game Compatibility Mode, which will place the E-cores in a standby mode while playing games. Here’s how to enable the feature:
|
# ¿ Nov 6, 2021 00:16 |
|
CoolCab posted:do we have any indicators of how twelfth gen is doing saleswise, is it impossible to buy? steve hardware unboxed talked some poo poo but i don't know how seriously to take it and would take him with a grain of salt on this anyway. All he said is that his retail contacts told him that the 12th gen is selling below expectations. That's hardly "talking poo poo" lol.
|
# ¿ Nov 9, 2021 01:20 |
|
Anecdotally, availability seems better than Zen 3 at its launch, though that may just come down to supply. Intel owns its own fabs, while AMD has to compete for fab time.
|
# ¿ Nov 9, 2021 05:31 |
|
Intel 7 is a "10nm" technology, not 7nm. Their supposedly 7nm process is "Intel 4".
|
# ¿ Dec 4, 2021 22:18 |
|
Beef posted:And since finFET we're in 3D structure territory, making just gate pitch or length technically even less meaningful. Though I find it very funny that in the same breath as they declared nanometers a useless marketing term, they announced that they'll be returning to measurement-based node labels in a few years anyway with "20A"
|
# ¿ Dec 5, 2021 21:34 |
|
|
The 12600K is the sensible pick. It's $300 in many stores, $250 at Micro Center. That's a steal. The 12700K is the overkill pick. You'll pay $100 - $120 for like 5% extra performance and potentially more longevity as games get better at multithreading, though that remains to be seen and future proofing is, as usual, somewhat dumb. By the time that difference starts mattering, you'll probably want to upgrade again anyway. The 12900K is the high-end one that you get if you are going to do any kind of heavily multithreaded productivity work. Do not get it for gaming, it will only barely help you there over the 12700k. The other important difference aside from core count (perhaps the most important difference) is cache size. 12600K has 16MB of L3 cache, the 12700K has 25MB, and the 12900K has 30MB. This accounts for almost all of the performance difference in games. The minor clock speed differences (4.9ghz for 12600k and 12700k, 5.2 for 12900k) account for the rest. The core counts are basically not a factor at all right now. edit: or $230 from micro center for the 12600k. wow. The 12700k is $300. For a $70 price difference, that's worth it if you have a super high-end GPU and you're playing at 1440p. At 4K it's still iffy, but it's so cheap... Also lol wow, alder lake really must be underperforming when it comes to sales. Dr. Video Games 0031 fucked around with this message at 09:53 on Dec 15, 2021 |
# ¿ Dec 15, 2021 09:02 |