BurritoJustice
Oct 9, 2012

I remember my old Q6700 computer. Bought it when I knew nothing about computers way back when, paid like 2300. The thing also had a 9800GTX+ and an explosive Huntkey V500 power supply that likely ended up being the death of my motherboard. Had a 1tb HDD which BLEW my mind at the time.

Ran XP on the bastard until 2011.

BurritoJustice
Oct 9, 2012

ShaneB posted:

So is there any word on whether or not the new CPUs will work on the previous generation of motherboard?

New CPUs won't work in Z87, only old Haswells. Everything LGA1150 will work in Z97, up until and including Broadwell. Skylake is going to be LGA1151 or something I believe.

BurritoJustice
Oct 9, 2012

ShaneB posted:

Is it possible with a BIOS upgrade?

Supposedly not, but people have managed to get overclocking working on B85/H87 before, so I'm guessing people will try to find a way.

BurritoJustice
Oct 9, 2012

Broadwell is guaranteed to be at least "unofficially supported" by Z97, similar to the situation with Devil's Canyon/Refresh and Z87. I'm guessing this means we will see a Z107 or similar as the launch platform for Broadwell.

BurritoJustice
Oct 9, 2012

If you are going to be using custom water, why not buy an EK Supremacy (The Best Waterblock™) and then use the Precisemount Naked Ivy addon to mount directly? Saves buying a motherboard you don't want just for the guard, and it's easier than trying to buy a resold one.

BurritoJustice
Oct 9, 2012

Shaocaholica posted:

That's not really the same thing.

To be fair if he is using custom water it serves the same purpose and is only 4 dollars for the mounting kit. Won't work for aircoolers or whatever, but who uses aircoolers with 400 dollar motherboards and delidded CPUs anyway.

BurritoJustice
Oct 9, 2012

The Lord Bude posted:

I sate my itch by posting in the PC part picking thread.

The best way to scratch the itch I've found is to build computers for everyone I know (like 20k AUD worth this last year). In lull periods though I usually do build lists for myself and then convince myself I don't need it :shobon:.

BurritoJustice
Oct 9, 2012

The biggest thing I got from those benches is how much of a hideous bottleneck the AMD APUs are in every single game. In some tests it is the difference between 60fps and 40fps. And the A10-7850k costs more than an i5 :psyduck:

BurritoJustice
Oct 9, 2012

To be fair, you can't actually use SLI alongside an M.2 PCIe 3.0 x4 SSD (like the Samsung) on LGA1150 without a PLX chip. Honestly, if you really had to do it, the cheapest option would be a 4690K/4790K alongside an ASRock Extreme9 (the only motherboard I know of with both a PCIe 3.0 x4 M.2 slot and a PLX chip, and also a good 200 dollars cheaper than other PLX boards). But you shouldn't.

BurritoJustice
Oct 9, 2012

The Lord Bude posted:

The Samsung drive is PCIe 2.0 x4, not 3.0, and you can get an adapter for like $20 to plug it into a PCIe 2.0 x4 slot which is fairly common on just about every ATX sized board.

Even if you insisted on using the M.2 Slot, the two PCIe 3.0x16 slots would still run at x8 and x4, which I doubt would measurably impact performance of the cards unless maybe they were Titan Zs or something. 3.0x4 is still equivalent to 2.0x8, and people run SLI just fine on 2.0x8 without running into bandwidth issues. Here's some data on the subject:

http://www.techpowerup.com/reviews/Intel/Ivy_Bridge_PCI-Express_Scaling/24.html

So it is perfectly possible, if you wanted to be a nutter, and use SLI and M.2; to do so without a PLX chip. And the Extreme 6 does in fact support booting from M.2. Any z97 board with M.2 support should - I know for a fact Asus has released a bios update to support it for 9 series boards.

Even using an Extreme9 - which at 236 dollars is only a tad more expensive than say a Maximus Gene, and much better featured - isn't an unreasonable purchase for someone looking to build a high end rig with money to burn.

Going for Haswell-E for a gaming machine is lunacy though, and honestly so is buying a samsung M.2 SSD vs just getting an 850 pro or something.

A lot of this is wrong. The XP941 is 3.0, it just wasn't marketed as such because until ASRock did 3.0 M.2 it wasn't an option. See Anandtech's review of the Extreme6 specifically addressing this.

Also it isn't possible to run SLI using 3.0 x4, Titan Zs or otherwise, as Nvidia disables SLI below x8.

You're right that (a) the Extreme9 is a huge amount of motherboard for the money, and (b) normal SATA SSDs on a cheaper board make way more sense.
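For anyone following the bandwidth argument above, here's a quick sketch of per-direction PCIe throughput. The encoding figures (8b/10b for 2.0, 128b/130b for 3.0) are the spec values; protocol overhead above line coding is ignored, so these are rough upper bounds:

```python
# Rough per-direction PCIe bandwidth, accounting only for line-coding overhead.
def pcie_gbps(gen, lanes):
    rates = {2: (5.0, 8 / 10),      # gen2: 5 GT/s, 8b/10b encoding
             3: (8.0, 128 / 130)}   # gen3: 8 GT/s, 128b/130b encoding
    gt_per_s, efficiency = rates[gen]
    return gt_per_s * efficiency * lanes  # usable Gb/s per direction

print(pcie_gbps(2, 8))  # 2.0 x8 -> 32.0 Gb/s
print(pcie_gbps(3, 4))  # 3.0 x4 -> ~31.5 Gb/s, essentially the same pipe
```

Which is why "3.0 x4 is equivalent to 2.0 x8" holds as a bandwidth statement, even though SLI still won't run on it.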

BurritoJustice
Oct 9, 2012

Alereon posted:

:siren:The Samsung XP941 is not PCI-Express 3.0:siren:

quote:

"one of the few native PCIe 3.0 x4 drives in OEM circulation, the Samsung XP941"

:confused:

To be clear, I'm not recommending the drive at all here, I just wish to be as factual as possible.


(I love your work in the SSD thread)

BurritoJustice
Oct 9, 2012

A confusingly worded Anandtech review, then, especially considering that later on they question the performance of the drive compared to the 3.0 x4 theoretical maximum. (Also, I'm sorry for correcting you on that, Bude; my mistake.)

I think the moral of the story is that there are a variety of crazy ways to strap PCI-E devices to a system, and it's all fricking fascinating.

Thanks for clearing that up Alereon.

BurritoJustice
Oct 9, 2012

I don't think we should stop recommending ASRock based entirely on speculation and anecdotes, ignoring the multitude of objective reviews showing them to be good, excellent even. The only spot in that review where the ASRock is not fantastic is a single audio benchmark, where it is middle of the pack instead of the absolute best (like it is for DPC latency).

The added connector for PCI-E power is not an indicator of PCI-E trace quality, it's a feature. It's useful for stability when you have a whole pile of add in cards each requesting 75w from the slot. It's entirely optional. You know what else has one? The ~Asus~ Rampage V Extreme super ultra gamer edition.

BurritoJustice
Oct 9, 2012

Alereon posted:

Like I said above, the point is that the quality of isolation of the audio traces tells you a lot about the quality of the board in general. That's a board loaded with typically higher-end features, but with a mid-range price corners have been cut to get there. Some of those, like the limited number of VRM phases, are pretty reasonable. Poor quality trace layout and isolation, which you see as that poor THD+N result, is not. The difference between "good" and "crap" is making the right tradeoffs, and lower quality for more features is almost never the right trade.
Extreme high-end boards typically do have at least one additional 8-pin power connector to handle a large number of cards. Very low-end boards on which a lot of corners have been cut have a SATA power connector or a molex connector. The $400 Asus X99 Deluxe does not have any additional power connectors and it is meant to take four PCI-E 3.0 x16 cards, plus an x4. It has the build quality to power its slots without additional power connectors, the Asrock board does not. Consider that Gigabyte boards with a SATA power connector require it to be connected for stable operation with even a single videocard, despite the manual claiming it is for multi-GPU.

You are making some interesting jumps in logic here. While the audio results aren't fantastic, they certainly aren't terrible. The isolation of audio components doesn't depend just on trace quality, but also on how the traces are routed, nearby components, shielding, etc. By the logic of inferring poor tracing solely from that one result, we shouldn't recommend boards from MSI, EVGA, Gigabyte, and ASUS either, as they all produce motherboards with worse distortion figures than the Extreme6. I also find your comment about low-end boards and molex connectors amusing, considering that the aforementioned Rampage V Extreme uses a molex connector. Anecdotally, I remember both Asus and ASRock being quoted as choosing molex for auxiliary power because it is more commonly a spare connector on modern power supplies, whereas PCI-E connectors are commonly either all in use or on long dual-headed cables for graphics cards.

Like I said earlier, the additional power connector is a feature (ignoring shitty outlier Gigabyte boards) that is popular amongst enthusiasts. It is definitely not needed for normal operation on the ASRock boards, as noted by reviewers and from my own anecdote (my friend runs an Extreme4 with two 780s without the connector merely because he didn't want to route the cable). It is commonly used in one of three situations: 3- or 4-card setups, insane GPU overclocking/volting, and mining rigs with a large number of cards lacking PCI-E power connectors that stress the slot power draw immensely (think 6 or 7 750 Tis). Even then, in 99% of these situations it is merely a placebo. It certainly is popular in crowds such as [H]ardForum, overclock.net, etc., and I'd be willing to bet ASRock gets more sales because of it.



EDIT: I forgot to address your comment on power phases. If you could expand on it, that would be good. It is an interesting comment considering that the Extreme6 has a 12-phase design with good Nippon Chemi-Con caps rated at 12k hours. The equivalent Asus board, the Z97-A (which is more expensive), has an 8-phase design with 10k-hour-rated caps.

BurritoJustice fucked around with this message at 10:05 on Sep 1, 2014

BurritoJustice
Oct 9, 2012

GokieKS posted:

Ugh, seems like Dragon Age: Inquisition just flat out refuses to work on a dual-core machine regardless of how fast those cores may be. So my plan of just using this G3258 to tide me over until ASUS releases a mATX X99 board (which doesn't seem like it's going to happen until next year at the earliest) is coming back to bite me.

Debating if I want to just pick up a 4690K for $170 and plop it into this (kinda crappy) MSI Z97 board (that came with the G3258 in the $100 bundle) and continue waiting, forget Haswell-E/X99 and just get a 4790K + ASUS Maximus Gene VII, or just get the 5820K now and settle for a "cheap" X99 board. I guess I could also get the Rampage V Extreme, but I really don't want to a full ATX motherboard... or to spend $500 on one.

Even if you are dead set on X99 (no reason to be for gaming), why not just get an X99 mATX board from any other manufacturer? Bonus points because you won't have to pay the ASUS tax. Don't buy a Rampage V Extreme; there is no justification for that, ever.

The logical solution is to just grab a 4690K if you need an upgrade, it will swap right in and be absolutely enough for every game out.

edit: Dunno what Gravitas is saying though. 5xxx CPUs aren't worse for gaming than 4xxx, and it certainly isn't Intel being misleading. The base clocks are lower, sure, but even an average 5960X will hit 4.5GHz with proper cooling. The extra cores might not help in most games, but they won't hinder, and you'll hit the same clocks as a 4790K within margin of error.

BurritoJustice fucked around with this message at 06:20 on Nov 19, 2014

BurritoJustice
Oct 9, 2012

GokieKS posted:

While I don't necessarily *need* it, I do, in fact, want Haswell-E. This machine will be doing a decent amount of video editing and encoding in addition to gaming.

And the reason why I was waiting for ASUS's ROG mATX X99 is that in my recent experience, the ROG mATX boards have pretty much been the best combination of quality components, high-end features, great overclocking capability, and also a better BIOS/UEFI than Gigabyte/ASRock/MSI. Now, that's not say that I literally would not consider any other option, but their track record meant that I definitely wanted to see what they had on offer first. And really, there's very limited options for mATX X99 right now, with the first two on market (ASRock and eVGA) both apparently having some quirks and issues. Gigabyte just announced a new one, and it may end up being my best option, but it's not widely available yet.

Killer NIC on that one, forget it.

BurritoJustice
Oct 9, 2012

Welmu posted:

It's also possible to analyze this noise and use it to extract RSA keys.


In news, Skylake Core-M coming later this year according to Krzanich.

Haswell-EX specs:


Can anyone explain to me why the "top" (at least model-number-wise) part is a 4-core 140W part? Seems a bit insane; the frequency difference is relatively minor compared to the 18-core part.

BurritoJustice
Oct 9, 2012

mayodreams posted:

A 22% increase in clock speed is not a minor difference. Some workloads are not heavily threaded, so clock speed becomes king, and a 140W TDP 4 core proc will get you better results than an 18 core with 25W more TDP that will throttle cores to maintain that TDP max.

But Intel ships 4.4GHz quad-cores at 88W TDP, and the only differences I can really see are the extra cache and QPI links, which can't make up the gap. Looking at the clock speeds again, I should've compared to the 10-core: 6 more cores and 12.5% lower clocks for only a 25W delta?
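The rough throughput math behind that comparison, using the figures from the post (a 3.2GHz 4-core versus a 10-core at 12.5% lower clocks; cores-times-clock is a crude proxy that ignores cache, QPI, and turbo behaviour):

```python
# Back-of-envelope aggregate throughput: cores x clock, nothing fancier.
# Figures are the rough ones from the discussion, not validated specs.
def aggregate_ghz(cores, clock_ghz):
    return cores * clock_ghz

four_core = aggregate_ghz(4, 3.2)   # 12.8 "core-GHz" at 140 W
ten_core = aggregate_ghz(10, 2.8)   # ~28 "core-GHz" at 165 W
print(round(four_core, 1), round(ten_core, 1))
```

Roughly double the aggregate throughput for an extra 25W, which is why the 4-core flagship looks odd unless the workload is lightly threaded.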

BurritoJustice
Oct 9, 2012

r0ck0 posted:

The asus hero maximus VII hero just got a BIOS update:

http://www.asus.com/us/Motherboards/MAXIMUS_VII_HERO/HelpDesk_Download/

I looked up USB 3.1 and its pretty fast, up to 10gb/sec

http://en.wikipedia.org/wiki/USB#USB_3.1

How is this possible in a BIOS update? Will this mean the USB 3.0 ports on the mobo now support 3.1 devices/speeds? Or even better does it make the 2.0 ports 3.1?

It would have to be USB 3.1 over a PCI-E card or similar, as USB 3.1 requires a whole new physical controller, which is only just arriving on some MSI and ASRock boards.

BurritoJustice
Oct 9, 2012

Darkpriest667 posted:

If USB 3.1 has a different plug and controller it will take forever for adoption to occur. Everyone remember firewire?

USB C (the new plug everyone is hyping) is completely optional and in fact independent of USB 3.1. There will be USB A (the old rectangular hole) 3.1 ports, and they'll likely be around for a long time to come. The controller is only different in the sense that a USB 3.0 controller can't run devices at 3.1 speeds, just as a USB 2.0 controller can't run devices at 3.0 speeds. Sure it is a burden in the meantime, but the industry is used to periodically adding a new USB standard.

I have a feeling USB 3.1 will replace 3.0 outright, while USB 2 sticks around in parallel with 3.1, instead of the usual "this generation and the generation before are common" pattern we see with USB adoption.

BurritoJustice
Oct 9, 2012

Panty Saluter posted:

My 4670K (at 4.2 gHz) will tickle 80C but only if I'm stretching it (benchmarking or video encoding). Games only hit 60ish. I have a Noctua D15 (something like that) with one fan. Would the optional second fan help temperatures noticeably if I start pushing it a lot?

That sounds awfully high for what is the most overkill air cooler you can buy.

BurritoJustice
Oct 9, 2012

All those years AMD worked on APUs, and Intel just up and doubles their graphics performance in one release. Impressive, especially the power consumption figures.

BurritoJustice
Oct 9, 2012

Skylake is out in Aus and the CPUs and motherboards are hilariously expensive. $529 for a 6700K, and the motherboards are at X99-level prices. I mean, an Asus Z170-Deluxe for 529 fucking dollars, or the comedy option Gigabyte Z170X-Gaming for $800.

BurritoJustice
Oct 9, 2012

Anime Schoolgirl posted:

The bins must be incredibly inconsistent then since a lot of people have actual trouble getting past 4.0-4.1.

Every data point I've seen has put 4.5 as an average chip, 4.6 as a good one and 4.4 as a crappy chip. People not getting past 4.0 are doing something wrong.

BurritoJustice
Oct 9, 2012

Palladium posted:

Preliminary analysis from a value hunter perspective:

-i7 turbo bins: 6700K (4/4/4/4.2GHz) vs 6700 (3.7/3.8/3.9/4GHz), not as big as a difference compared to 4790K vs non-K.

-Locked 6600 is the best value for a non-OC i5 with the same turbo bins (3.6/3.7/3.8/3.9GHz) as the 6600K and a lower TDP of 65W, despite the lower base clock of the former (3.3GHz).

-The i3 6320 (3.9GHz) and 6300 (3.8GHz) are simply overpriced. Only the 6100 (3.7GHz) is reasonably priced at $117.

Turbo bins require much less thought once you realise that every motherboard worth buying includes some form of MCE.

BurritoJustice
Oct 9, 2012

Twerk from Home posted:

What does this acronym mean? I'm assuming it means "can run at full turbo bins all the time as thermals allow", basically ignoring the TDP if you have sufficient cooling. Can this be done with a non-K CPU and and non-Z motherboard? If that's a switch I can flip on an H170 board with a 6600, then there's a value king.

MCE = Multi Core Enhancement. Basically, it is an alternate turbo behaviour that runs every core at the maximum (single-core) turbo bin. And yes, it is available on H-series motherboards and can be used with non-K SKUs.
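A rough sketch of what MCE changes, using a hypothetical turbo table (the bins below mirror the 6600 figures quoted above; actual tables vary by SKU):

```python
# Stock turbo picks a bin by active-core count; MCE applies the 1-core bin
# to every core. Table below is illustrative, not an official spec.
stock_turbo = {1: 3.9, 2: 3.8, 3: 3.7, 4: 3.6}  # GHz by active cores

def effective_clock(active_cores, mce_enabled):
    if mce_enabled:
        return max(stock_turbo.values())  # every core runs the top bin
    return stock_turbo[active_cores]

print(effective_clock(4, False))  # 3.6 GHz all-core at stock
print(effective_clock(4, True))   # 3.9 GHz all-core with MCE
```

So on a locked chip, MCE is effectively a free all-core overclock to the single-core turbo, within thermal limits.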

BurritoJustice
Oct 9, 2012

necrobobsledder posted:

Wasn't there a guy either in this thread or another one in SH/SC that wound up buying a Xeon Phi and running some benchmarks for his workloads at home?

That was "No Gravitas" if I remember correctly. It was a cool series of posts.

BurritoJustice
Oct 9, 2012

Malloc Voidstar posted:

Would the rumored i7-6850k (6 cores @ 3.6GHz) work well in gaming vs an i7-2600k @ 4.0GHz?
I do some video encoding where the older CPU hurts me but I don't want to upgrade my CPU/mobo/RAM only to lose a bunch of frames in games.

The increase in IPC between Sandy Bridge and Broadwell is greater than the difference in clockspeed, so even if you don't overclock the Broadwell-E CPU (you should), you won't lose per-core performance (which is what matters in games).
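Back-of-envelope, the comparison is just IPC times clock. The ~15% uplift below is a hedged estimate for Sandy Bridge to Broadwell, not a measured figure:

```python
# Relative single-thread performance as IPC factor x clock (crude model).
# The 1.15 IPC factor is an assumption for illustration, not benchmark data.
def relative_perf(ipc_factor, clock_ghz):
    return ipc_factor * clock_ghz

sandy_2600k = relative_perf(1.00, 4.0)   # baseline: 2600K at 4.0 GHz
broadwell_e = relative_perf(1.15, 3.6)   # rumored 6850K at stock-ish 3.6 GHz
print(sandy_2600k, broadwell_e)          # Broadwell-E edges ahead even stock
```

Even without touching the multiplier, the newer chip comes out slightly ahead per core, and any overclock widens the gap.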

BurritoJustice
Oct 9, 2012

Twerk from Home posted:

Woo. Anybody who wants >4 cores right now should still be putting Haswell-E on one of the nice new X99 boards, right? BW-E really looks more expensive for the same performance.

Broadwell-E is actually basically worse than Haswell-E anyway, because the IPC increase isn't enough to offset the significantly worse overclocking. The E chips are overclocking about as well as the 1150 socket Broadwells, which is to say the worst overclocking headroom of any architecture in the last 5 years.

If someone offered me a 5960X or a 6900K I'd take the Haswell all day long.

BurritoJustice
Oct 9, 2012

PerrineClostermann posted:

What exactly is NVMe, anyway?

It's a storage protocol (think AHCI or IDE) designed to handle both the increased throughput of PCI-E and the random access speeds of SSDs.

It's distinct from PCIe/SATA (the interfaces) and M.2/SATA (the connectors).

Yes it's all a bit of letter soup.
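One way to keep the letter soup straight: protocol, interface, and connector are independent axes. A few illustrative combinations (the XP941's AHCI-over-PCIe pairing is part of why the naming gets confusing):

```python
# Protocol (command set), interface (electrical link), and connector
# (physical plug) are separate choices. Example drives for illustration.
drives = [
    {"name": "2.5-inch SATA SSD", "protocol": "AHCI", "interface": "SATA", "connector": "SATA"},
    {"name": "XP941-style M.2",   "protocol": "AHCI", "interface": "PCIe", "connector": "M.2"},
    {"name": "NVMe M.2 SSD",      "protocol": "NVMe", "interface": "PCIe", "connector": "M.2"},
]
for d in drives:
    print(f'{d["name"]}: {d["protocol"]} over {d["interface"]} via {d["connector"]}')
```

So "M.2" alone tells you nothing about speed; an M.2 drive can be SATA, AHCI-over-PCIe, or NVMe.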

BurritoJustice
Oct 9, 2012

BIG HEADLINE posted:

With ~25% of 76/7700Ks making 5Ghz, I really wonder what Kaby-X's 'secret sauce' is going to be - 5Ghz native plus Sky-X's PCIe lanes?

Where did you get 25% from? SiliconLottery is quoting 62% for 7700K's.

BurritoJustice
Oct 9, 2012

Anime Schoolgirl posted:

things can go up to 5ghz in prime but crash and burn in variable workloads such as "running photoshop" and playing video games

so the 62% prime95 figure is pretty suspect and most definitely not a figure to drive up sales volume from their site!!!!

Would it not be in their interest to quote a lower figure, making their "guaranteed 5GHz+" CPUs rarer and therefore more desirable than rolling the dice on a retail chip? If it were 1/4 they wouldn't be selling 5GHz 7600Ks for only $10 above retail.

Also the idea of something being stable in prime95 and unstable otherwise is completely counter to the usual goon groupthink of "don't bother with prime95 stability as it is a way higher load than your CPU will see in literally any other situation".

Not to mention the shitloads of reviews and anecdotal reports you can find all over the internet of people getting 5GHz easily in the 1.3-1.4v range.

E: Also, why would we put more weight on unsourced "I heard this on the internet" claims than on a company that exists solely to buy CPUs in bulk and bin them, gathering objective data on average overclockability?

BurritoJustice fucked around with this message at 22:17 on Jan 17, 2017

BurritoJustice
Oct 9, 2012

Multi-rail designs are perfectly fine as long as the rails are big enough for your usage. Single-rail PSUs are simpler and easier, but there is nothing inherently wrong with a multi-rail PSU and you shouldn't throw yours out just for that. OklahomaWolf of JonnyGuru fame famously prefers multi-rail designs, even if they have all but disappeared from the market.

BurritoJustice
Oct 9, 2012

Jago posted:

It's interesting, the consoles have 8 cores, but even the new scorpio in the xbox is otherwise anemic in the horsepower department in terms of single threaded performance. 2.3ghz not jaguar , not ryzen isn't impressive. (Though it is running GDDR5 RAM)
On the other hand the graphics are what, ten times more powerful?

I guess what I really want to know is when I can get graphics memory for my pc's processor.

Consoles having lots of hot-garbage cores means that even more effort will have to be put into multithreaded optimisation. If they had eight merely average cores, developers could coast on mostly using the first four, but they are working with what are basically ULV laptop chips, so they really need to squeeze out everything they can.

BurritoJustice
Oct 9, 2012

Scorpio is looking to be neat and I can't wait for all the "no but see, if you buy this $30 Xeon and this old bitcoin Fury with a dollar-store PSU and a cardboard box you can smash the Scorpio on performance and price!" builds on r/PCMR. They always get whipped up to prove that consoles should never, ever be considered a purchase.

BurritoJustice
Oct 9, 2012

To be fair, a Skylake-E 6-core and a Coffee Lake 6-core wouldn't conflict as much as you'd think. LGA2066 gets you quad-channel memory and massive amounts of PCI-E lanes, while LGA1151 gets you better clocks and an iGPU. They'll probably be roughly the same price, as with the 7700K/6800K, ignoring platform cost differences, which lines them up as meaningful sidegrades depending on use case. The platform features are almost all dependent on chipset, and I'm sure Intel will line them up so there won't be a big delta like there is with X99/Z270 right now.

Damn I'm excited for Coffee Lake to hopefully hit the best of both worlds of the 7700K and Ryzen/HEDT Intels.

BurritoJustice
Oct 9, 2012

If it carries over the better DDR4 controller of mainstream Skylake we'll probably see DDR4-3866+. Haswell-E got to 3466 with quad channel on the highest end boards so it wouldn't be too unreasonable of a jump.

BurritoJustice
Oct 9, 2012

Actuarial Fables posted:

Who are the people buying an m-ITX motherboard then sticking it in a case large enough to fit 2+ GPUs?

Founders 1080 Ti x2 + EK waterblocks with single-slot brackets + a two-slot bridge to fit both cards into the two slots that all ITX cases have? Of course there is the issue of aligning the graphics cards with the standard slots once a riser is added, so case modification would still be required for most cases. It should work with cases that have a separate GPU mount connected by a riser, though.

Palladium posted:

Meanwhile in the Asrock exec meeting room: AM4 ITX? Who cares?


BurritoJustice
Oct 9, 2012

MaxxBot posted:

I guess in one Intel slide at Computex they were touting the i9 for "12k gaming." I mean come the fuck on, you can't even do 8k properly with SLI 1080 Tis; even 5k is pushing the limits of practicality in modern games.

"12K" as is commonly marketed is an incorrect way of naming 3x4K multi-monitor (11,520x2,160), so it is actually less than "8K" as properly marketed (7,680x4,320).

But then again "4K" is actually 4096x2160 not 3840x2160, so whatever, hail satan
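The pixel counts behind that complaint, worked out:

```python
# "12K" surround is three 3840x2160 panels side by side; true 8K is a
# single 7680x4320 panel. Compare total pixels.
def pixels(width, height):
    return width * height

triple_4k = pixels(3840 * 3, 2160)  # 11520 x 2160 = 24,883,200 pixels
true_8k = pixels(7680, 4320)        # 33,177,600 pixels
print(triple_4k, true_8k)           # 8K is ~1.33x the pixels of "12K"
```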

BurritoJustice
Oct 9, 2012

Just a comment on the der8auer Skylake-X overclocking video, because a few people said they distrust him since Intel is providing him with so many expensive CPUs to bin.

He works for Caseking.de, which bins Intel CPUs and sells them for more than retail (think SiliconLottery.com). That's where he got the piles of Skylake-X CPUs. As with SiliconLottery, even assuming malice, it would be in their best interest to understate the average overclockability of retail processors so that their own binning service looks more appealing. He even states in the video that 5GHz is a binned chip and standard chips are closer to 4.8GHz.

As with SiliconLottery last time this was discussed, I think it is logical not to immediately distrust his stats, even if no weight is given to his reputation in the overclocking community.

E: My personal guess is that with the 14nm+ process from Kaby and the IVR, the chips will have fantastic voltage scaling up to 5GHz but will be heavily temperature limited, making it unfeasible for most. The higher core-count chips will cut a few hundred MHz off that, like with past HEDT platforms.

BurritoJustice fucked around with this message at 18:56 on Jun 14, 2017
