|
So with the 8C CPUs allegedly being 8+0 and 3900X being 6+6, are games etc on a 3900X going to run off one chiplet by default, or are we going to need Zen 2-specific optimizations to ensure you don't hit the IF accidentally? Be fun if you got better frametimes in-game on a 3700X/3800X than on a 3900X, for instance.
|
# ? May 28, 2019 15:37 |
|
Shouldn't matter since both dies are behind the memory controller and have equal access to the memory. Inter-die latency should be decent and IF bus bandwidth more than doubled too. edit: sounds good enough to me. TR and Epyc's issues with gaming weren't really related to inter-die or inter-CCX latency. It was because they had to use NUMA, which isn't supported properly by most games or software. One of the big advantages of Zen 2 is that the off-die memory controller means they get to use chiplets without needing NUMA anymore. vvvvvvvvvvv PC LOAD LETTER fucked around with this message at 15:56 on May 28, 2019 |
# ? May 28, 2019 15:42 |
|
I don't think so since all cores are one hop from the I/O die, but I'm also just some guy with no expertise. If I understand correctly, Zen and Zen+ had latency issues because different cores had different hops to and from memory, and that was especially the case with SKUs like the 2990WX. We won't know until benchmarks, I guess, but I'd like to know if my understanding is on the right track.
|
# ? May 28, 2019 15:45 |
|
PC LOAD LETTER posted:Then you give the Icelake U system DDR4 2400 too in order to make the comparison more apples to apples instead of giving it some absurd overclocking RAM. If you were comparing two graphics cards, you wouldn't demand that each of them have the same memory speed for all the tests. Intel's memory controller (and announced products, with today's Dell launch) are at 3733 with LPDDR (3200 for regular DDR), and the max supported frequency for the Ryzen 3700U is 2400. That AMD can't match that frequency with Zen+ is AMD's problem. Where the comparison falls down for me is not the specs, it's that 3700U laptops with DDR4 2400 (like the Acer Swift 3 announced a few days ago) are going to be like half the price or less of the Icelake U laptops that are launching today. So yeah, looks like Intel can beat AMD's IGPs now, but you are gonna pay for it.
|
# ? May 28, 2019 17:17 |
|
Cygni posted:If you were comparing two graphics cards, you wouldn't demand that each of them have the same memory speed for all the tests. They're iGPUs that will both be stuck with system memory. And the reality is they'll frequently be stuck in lower-end machines with cheap RAM. If there are some special higher-cost models that do end up using this fast RAM but are low volume, while the higher-volume models end up using the more typical cheap stuff, then it's still perfectly legit and reasonable to expect reviewers to use the cheap stuff when benchmarking them too.
|
# ? May 28, 2019 17:30 |
|
Benchmark both ways and present both, explaining the above that RAM speed is going to be up to the laptop manufacturer for the most part.
|
# ? May 28, 2019 17:43 |
|
Is there a list of 370/470 mobos that allow you to flash the bios without a cpu in them?
|
# ? May 28, 2019 18:01 |
|
pixaal posted:Benchmark both ways and present both, explaining the above that RAM speed is going to be up to the laptop manufacturer for the most part. This is the solution. If you bench only with equivalent RAM, you're isolating the graphical unit for testing, but you're missing an important component of real-world performance. Testing only with max-speed RAM is misleading if shipping systems may not have max-speed RAM. It's like GN's CPU tests where they only pair with the most powerful GPU and mostly test games at low/medium settings at 1080p in order to try to find the CPU bottleneck points. It's great because people who understand the complete model can interpret the data, but it's misleading for consumers who might come away with ideas like "an overclocked 9900K is a ~20% boost to game fps over a 2600X" when you're unlikely to see any difference at all for most common GPU/monitor/game configurations. Steve does add a few comments and charts on that, but I do wish he'd add more real-world-type comparisons.
|
# ? May 28, 2019 19:03 |
|
Asrock's engineers are apparently still smoking the good stuff. They're making a limited edition $999 X570 motherboard that watercools the god damned chipset: https://www.youtube.com/watch?v=Mi3eaDKirPI
|
# ? May 28, 2019 19:16 |
|
Drakhoran posted:Asrock's engineers are apparently still smoking the good stuff. They're making a limited edition $999 X570 motherboard that watercools the god damned chipset: I'm still disappointed that they haven't been the ones to make an ITX Threadripper board. Also, Buildzoid is way down on ASRock, saying that on the surface, their X570 boards just look like warmed-over X470 boards. PC LOAD LETTER posted:edit: yeah they used much faster RAM (LPDDR4X-3733) on the Intel Icelake U system: https://www.anandtech.com/show/14405/intel-teases-ice-lake-integrated-graphics-performance Cygni posted:If you were comparing two graphics cards, you wouldn't demand that each of them have the same memory speed for all the tests. Intel's memory controller (and announced products, with today's Dell launch) are at 3733 with LPDDR (3200 for regular DDR), and the max supported frequency for the Ryzen 3700U is 2400. That AMD can't match that frequency with Zen+ is AMD's problem. PC LOAD LETTER posted:They're not discrete graphics cards though. Agreed, and worse, where this is actually important, the RAM is entirely likely to be soldered, rather than SODIMMs. OEMs aren't exactly known for soldering on great memory. SwissArmyDruid fucked around with this message at 19:32 on May 28, 2019 |
# ? May 28, 2019 19:26 |
|
There was actually a Hardware Canucks video a while back where they tested an expensive Gigabyte laptop handicapped with the single-channel RAM it normally ships with, compared to just going dual channel. It wasn't pretty, and that's with a good discrete GPU! So let me know how their iGPU performs with a single stick of 2133 or whatever nonsense, because OEMs are weird.
|
# ? May 28, 2019 20:27 |
|
Broke out the Kill-a-watt this afternoon to see what the actual system power usage was on my compute stack machines. I currently have 2700s and 1600s in use, both with a listed TDP of 65W. Other than the CPUs, all 4 systems are identical. I got an idle power usage of 24.5W -- that would be CPU at idle, stock HSF, 2 x 8GB DDR4 (3000MHz), 2 x 120mm fans, and a M.2 SSD. Usage at 100% load on all cores+threads was 98.5W for the 1600 and 104.5W for the 2700. So the CPUs, at load, were burning 74W and 80W, respectively. I'm really curious to see what the under-load usage of the 3900X will be, and how that will drop as it is underclocked. Fun times ahead!
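The subtraction above can be sanity-checked in a couple of lines. A quick sketch (the wattages are the measurements from this post; the big assumption, flagged in the comments, is that the non-CPU parts draw about the same at idle as at load):

```python
# Back-of-the-envelope CPU power from wall measurements (figures from the
# post above). Big assumption: the non-CPU components (RAM, SSD, fans,
# VRM losses) draw roughly the same at idle as at load, which is only
# approximately true -- fans and VRMs draw a bit more under load.

idle_watts = 24.5  # whole-system idle draw
load_watts = {"Ryzen 1600": 98.5, "Ryzen 2700": 104.5}  # all cores loaded

draws = {cpu: watts - idle_watts for cpu, watts in load_watts.items()}
for cpu, w in draws.items():
    print(f"{cpu}: ~{w:.0f}W attributable to the CPU at full load")
```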
|
# ? May 29, 2019 01:12 |
|
How tricky is it to replace a CPU? I'm thinking of upgrading a B450 board with a 1700 with a 3000 series CPU. I'm wondering whether I should get a new gen mobo as well (GPU is an RTX Titan, 64GB ram). I vaguely remember the (3rd party) CPU fan clamp being a bitch to install. 1. Do people normally remove the parts surrounding the CPU (GPU, ram) to help with the install? 2. Are there any issues with the OS on a CPU change? 3. Can I expect to get the full performance of the CPU with a 1st gen board?
|
# ? May 29, 2019 02:16 |
|
shrike82 posted:How tricky is it to replace a CPU? I'm thinking of upgrading a B450 board with a 1700 with a 3000 series CPU. I'm wondering whether I should get a new gen mobo as well (GPU is an RTX Titan, 64GB ram). 1. That's recommended, especially if you have fat fingers and aren't familiar with the fastening mechanism. 2. **Before you replace**, make sure you update your motherboard's BIOS firmware to the latest iteration first. Some boards' latest BIOS may be beta-only and hosted somewhere other than the manufacturer's usual download page. Also, if you're on Windows, you may need to get in touch with their customer support to get your machine re-validated. Microsoft now gives a ton of help and has dedicated staff just for this one job, so that part at least isn't frustrating. 3. Not noticeably at all. The only thing you're really limited on with your old board is that IIRC the 3000 series presents more I/O to you than the previous generation, as in, you can plug more things into the CPU than your old board has slots/sockets for. Sidesaddle Cavalry fucked around with this message at 02:38 on May 29, 2019 |
# ? May 29, 2019 02:34 |
|
shrike82 posted:How tricky is it to replace a CPU? I'm thinking of upgrading a B450 board with a 1700 with a 3000 series CPU. I'm wondering whether I should get a new gen mobo as well (GPU is an RTX Titan, 64GB ram). If you are not very experienced with DIY PC building, I would strongly recommend doing a teardown to extract the mobo from the case. It's much easier to work with large air coolers with the mobo out of the case. If you're really new, this is a pretty good tutorial on how to take a PC apart. quote:2. Are there any issues with the OS on a CPU change? If using Win10, no, though you may need to re-activate the license. Older versions of Windows, maybe. e: Sidesaddle Cavalry's BIOS thing is the super-important bit. quote:3. Can I expect to get the full performance of the CPU with a 1st gen board? This is a question that many people want answered but we won't get answers for until launch & independent testing. If your B450 board has meh power delivery, you might have trouble with the high-core-count versions of the 3000 series -- but if you're looking to just get an 8-core it should be fine. Klyith fucked around with this message at 02:44 on May 29, 2019 |
# ? May 29, 2019 02:41 |
|
shrike82 posted:How tricky is it to replace a CPU? I'm thinking of upgrading a B450 board with a 1700 with a 3000 series CPU. I'm wondering whether I should get a new gen mobo as well (GPU is an RTX Titan, 64GB ram). This isn't exactly what you were asking about, but before you lift the CPU cooler off the CPU give it a gentle twist back and forth to break the adhesion of the thermal paste. The AM4 socket doesn't hold the CPU in very hard so the CPU can sometimes get pulled out with the cooler if the thermal paste is too sticky and you don't take precautions. If the CPU does come out with the cooler just gently pry it off the cooler and set it down. Also, make sure you have some isopropyl/rubbing alcohol, some paper towels, and q-tips to clean off the old thermal paste and some new thermal paste to apply when you reinstall the CPU and cooler.
|
# ? May 29, 2019 03:14 |
|
https://www.youtube.com/watch?v=Mi3eaDKirPI ASRock has an AM4 ITX board with Intel mounting hole layout. Honestly, I can't wait until AM5 rolls around and AMD can reasonably either work with Intel to standardize this poo poo, or something.
|
# ? May 29, 2019 03:51 |
|
Standardization would be great between both platforms but even if AMD wanted to Intel wouldn't want to I bet. Last time they had the same HSF mounting and sockets was like the Socket 7 era I think?
|
# ? May 29, 2019 04:02 |
|
mdxi posted:Broke out the Kill-a-watt this afternoon to see what the actual system power usage was on my compute stack machines. This is super hilarious to me since I've been considering retiring my FX-8350 (don't judge, I got it for free) in my VM box in favor of a 1600 or 1700. I wasn't sure how much power savings I'd actually see, but the 8350 box idles at about 100w. Granted, this is with 4 spinning drives and a couple SSD's, but still..
|
# ? May 29, 2019 04:33 |
|
BeastOfExmoor posted:This is super hilarious to me since I've been considering retiring my FX-8350 (don't judge, I got it for free) in my VM box in favor of a 1600 or 1700. I wasn't sure how much power savings I'd actually see, but the 8350 box idles at about 100w. Granted, this is with 4 spinning drives and a couple SSD's, but still.. Khorne fucked around with this message at 04:54 on May 29, 2019 |
# ? May 29, 2019 04:45 |
|
Khorne posted:What's drawing so much power at idle? I think 4 spinning drives can pull ~24-32W while idle although modern WD reds pull sub 1w while sleeping. The SSD should pull sub 1w. I'm assuming it's just the super-inefficient Bulldozer 8-core CPU causing the 50W delta. There's nothing else in the box that would burn much power. It has a fanless GPU just to give me the ability to boot, a single-fan cooler, and 32GB of RAM.
|
# ? May 29, 2019 04:54 |
|
BeastOfExmoor posted:I'm assuming its just the super inefficient Bulldozer 8 core CPU causing the 50w delta. There's nothing else in the box that would burn much power. It has a fanless GPU just to give me the ability to boot, a single fan cooler, and 32GB of RAM. Also: I guess my computer pulls crazy idle watts as well because it's a 3770k.
|
# ? May 29, 2019 04:54 |
|
So what you're saying is I would save money by upgrading. I'll pass that one on to the wife to make my case.
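For the wife-pitch, the math on that ~50W idle gap is quick to run. A sketch (the 50W delta comes from the posts above; the $0.12/kWh electricity rate is a made-up placeholder, substitute your own utility's rate):

```python
# Ballpark annual cost of the ~50W idle-power gap discussed above.
# Assumes the machine runs 24/7 at that delta; the electricity rate
# is purely illustrative.

delta_watts = 50
hours_per_year = 24 * 365   # always-on VM box
rate_per_kwh = 0.12         # USD, assumed for illustration

kwh_per_year = delta_watts * hours_per_year / 1000
annual_cost = kwh_per_year * rate_per_kwh
print(f"{kwh_per_year:.0f} kWh/year, roughly ${annual_cost:.2f}/year")
```

So the upgrade wouldn't pay for itself overnight, but the number is not nothing for an always-on box.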
|
# ? May 29, 2019 04:57 |
|
Khorne posted:Do you let the hard drives go into standby? That's a good question. 30 seconds of googling indicates that my hypervisor (Proxmox) should spin down the drives if they're inactive, but I'm not sure how to confirm that. Puddin posted:So what you're saying is I would save money by upgrading. I'll pass that one on to the wife to make my case. It probably wouldn't completely pay for itself, but if I could get off my rear end and eBay the chip, motherboard, and 64GB of DDR3 I have sitting around, I could probably come close to paying for a first-gen Ryzen, B450 mobo, and 32GB of RAM with the proceeds.
|
# ? May 29, 2019 05:18 |
|
SwissArmyDruid posted:Honestly, I can't wait until AM5 rolls around and AMD can reasonably either work with Intel to standardize this poo poo, or something. Intel can't even keep to a standard in their own ecosystem for more than 2 generations tho
|
# ? May 29, 2019 05:22 |
|
Klyith posted:Intel
|
# ? May 29, 2019 10:57 |
|
Yeah, let's not mistake can't for won't, here. That said, Intel HAS retained the same 75mm hole spacing for mounting desktop coolers across LGA 1156, LGA 1155, LGA 1150, and LGA 1151, which means since 2010-2011, so it's not *impossible* for them to hold to SOME kind of standard.
|
# ? May 29, 2019 11:58 |
|
Noctua is working on a TR4 version of the D15
|
# ? May 29, 2019 14:19 |
|
I like how they advertise their aircooler for 400W TDP like it's no big deal.
|
# ? May 29, 2019 17:25 |
|
that passive heatsink looks awesome
|
# ? May 29, 2019 18:34 |
|
The reporting is unclear, but it also seems to be a new version of the D14/D15 line for normal CPUs too.
Khorne fucked around with this message at 18:51 on May 29, 2019 |
# ? May 29, 2019 18:43 |
|
Some interesting information about overclocking Zen 2 ES chips. The final product should be about this good or better. Text version of post with punctuation intact (it's a screenshot): quote:-4.8Ghz achievable on all cores I'm assuming they were comparing an 8C Zen 2 to the 9900K but it doesn't explicitly say so. There's also an offhand comment at the bottom of the screenshot about the RAM used in most of the leaks being pretty dang slow and high latency (stuff like CL20-24 DDR4 2100 or worse I think) and that there really isn't a regression in memory latency to speak of with Zen 2 in practice vs Zen 1/Zen+.
|
# ? May 29, 2019 18:51 |
|
Holy poo poo 4.8GHz is way better all core than I had expected. I was hoping for 4.5GHz boost all around. Extremely excited for Threadripper, I will absolutely be getting that enormous loving Noctua cooler if it's hitting 4.8 with PBO. MY RENDERS. RIP AND TEAR MY RENDERS.
|
# ? May 29, 2019 19:30 |
|
I'd assume it's a 12C part based on the Cinebench vs 9900K numbers. AMD isn't suddenly beating Intel by 13% clock for clock in Cinebench. Those clocks are pretty solid looking for a 12C part, especially given that it's a (probably late) ES and the retail products will probably be slightly better. Really interested to see how well the memory controller does.
|
# ? May 29, 2019 19:39 |
|
NewFatMike posted:Extremely excited for Threadripper, I will absolutely be getting that enormous loving Noctua cooler if it's hitting 4.8 with PBO.
|
# ? May 29, 2019 19:52 |
|
Combat Pretzel posted:If you're air-cooling, pray to Baby Jesus that your mainboard BIOS has time constants on fan curves, otherwise the fan spooling from Precision Boost will drive you insane. But Noctua fans are fairly silent to start with.
|
# ? May 29, 2019 19:56 |
|
Combat Pretzel posted:If you're air-cooling, pray to Baby Jesus that your mainboard BIOS has time constants on fan curves, otherwise the fan spooling from Precision Boost will drive you insane. Inconsistent or loud noise from my pc drives me insane. Khorne fucked around with this message at 22:17 on May 29, 2019 |
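The "time constants" Combat Pretzel mentions amount to a low-pass filter on the temperature the fan curve sees, so a two-second boost spike doesn't spool the fans. A minimal sketch of the idea (the alpha value and temperatures are invented for illustration, not anything a real BIOS exposes):

```python
# Exponential smoothing ("one-pole low-pass") of the CPU temperature
# before it feeds the fan curve. A brief Precision Boost spike barely
# moves the smoothed value, so fan speed stays steady instead of
# ramping up and down with every boost event.

def smooth(temps, alpha=0.1):
    """Smaller alpha = longer time constant = lazier fan response."""
    out, s = [], temps[0]
    for t in temps:
        s += alpha * (t - s)
        out.append(s)
    return out

# Steady 50C with a two-sample boost spike to 75C
temps = [50.0] * 10 + [75.0, 75.0] + [50.0] * 10
smoothed = smooth(temps)
print(f"raw peak: {max(temps):.0f}C, smoothed peak: {max(smoothed):.2f}C")
```

The raw trace peaks at 75C while the filtered one barely climbs, which is exactly why a BIOS with fan-curve smoothing is so much quieter under bursty loads.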
# ? May 29, 2019 19:57 |
|
SwissArmyDruid posted:That 100 GbE looking like a bottleneck instead of a feature, all of a sudden. Good news! 400GbE is gonna be shipping soon.
|
# ? May 29, 2019 20:52 |
|
Brief conclusion to my RAM woes in case anyone else stumbles into this thread racking their brain. I returned the extra RAM and extra PSU I ordered and went back to square one with the 2x8GB HyperX Fury and 2x8GB EVGA SuperClocked. The X470 Taichi did something on first boot that the B350 Tomahawk did not - it automatically detected an optimal latency setting for the four sticks of ram. So despite trying a full CL17 setting previously, with a timing of 16-17-17-37, the system boots totally fine with no blue screens. Great success? Who knows, but I guess moving to X470 wasn't as useless as I thought.
|
# ? May 29, 2019 21:16 |
|
|
Combat Pretzel posted:If you're air-cooling, pray to Baby Jesus that your mainboard BIOS has time constants on fan curves, otherwise the fan spooling from Precision Boost will drive you insane. Oh yeah, I've got a Meshify C and it's got all the slots for fans. That's going to be moving some air. It'll take up to 360mm fans.
|
# ? May 29, 2019 22:01 |