|
The performance of that onboard video encoder is very surprising. I had no intention of ever using it but now it looks like it could be a nice added bonus
|
# ? Jan 3, 2011 06:41 |
|
|
Interesting tidbit for those curious about boot times on EFI: Anandtech reports that their Intel P67 board cut POST times by a quarter versus Intel P57 and X58 boards, from about 29 seconds to about 22 seconds.
|
# ? Jan 3, 2011 06:43 |
|
|
WhyteRyce posted:The performance of that onboard video encoder is very surprising. I had no intention of ever using it but now it looks like it could be a nice added bonus There was a pretty lovely footnote to this, Intel Quick Sync transcode technology only works if the on-die graphics is enabled and in-use. This means those of us with P67 boards or discrete graphics cards can't use the video transcoder, which just smacks of a retarded implementation.
|
# ? Jan 3, 2011 07:07 |
|
Alereon posted:There was a pretty lovely footnote to this, Intel Quick Sync transcode technology only works if the on-die graphics is enabled and in-use. This means those of us with P67 boards or discrete graphics cards can't use the video transcoder, which just smacks of a retarded implementation. Oh yeah I forgot about that. Never mind then!
|
# ? Jan 3, 2011 07:13 |
|
Alereon posted:There was a pretty lovely footnote to this, Intel Quick Sync transcode technology only works if the on-die graphics is enabled and in-use. This means those of us with P67 boards or discrete graphics cards can't use the video transcoder, which just smacks of a retarded implementation. Couldn't this be solved by clever motherboard design having both the discrete and integrated in use but only using the discrete to actually display anything?
|
# ? Jan 3, 2011 08:11 |
|
Wow, 4.4GHz on that lovely low-profile stock heatsink is pretty drat impressive.
|
# ? Jan 3, 2011 08:12 |
|
talk show ghost posted:Couldn't this be solved by clever motherboard design having both the discrete and integrated in use but only use the discrete to actually display anything? It's not that simple. You'd have to also be aware there are 2 GPUs and data would have to flow well from one to the other. There's a host of annoyances. Similar to why no one has a good solution for dynamically switching between an integrated GPU and a discrete one on the fly based on workload - you have to be able to assume the rest of the system is playing nice, which it most definitely is not.
|
# ? Jan 3, 2011 08:16 |
|
So if I don't intend to use the integrated graphics at all (and I'm using a P67/other board with no video), that part of the chip will be doing nothing? What the heck? That seems really stupid.
|
# ? Jan 3, 2011 08:29 |
|
Spite posted:Similar to why no one has a good solution for dynamically switching between an integrated GPU and a discrete one on the fly based on workload - you have to be able to assume the rest of the system is playing nice, which it most definitely is not. Don't some of the macbooks do exactly this?
|
# ? Jan 3, 2011 08:34 |
|
DuckConference posted:Don't some of the macbooks do exactly this? Not quite. The most recent MacBook Pros will switch from an integrated Intel part to the discrete Nvidia chip. System apps that are "aware" will run integrated until a switch occurs. But they have to be coded with the assumption that their graphics context can and will be yanked out from underneath them at any time. Apps that aren't aware will always power up the discrete part, even if what they are doing doesn't need that power. For example, a simple Core Animation app will switch EVERYTHING over to discrete. It's a whitelist, not based on computational need.
|
# ? Jan 3, 2011 08:41 |
|
Are they really boxing the Extreme cooler with the K chips? Goodbye V8
|
# ? Jan 3, 2011 08:51 |
|
|
Sandy Bridge is the biggest disapointment (sic) of the year 2 days in. Oh Charlie. I like how the "showstopper bug" is that booting from USB3 isn't stable. Also, OpenGL is slow when running in software emulation. How sure is the Jan 5th release? Overclockers is still claiming Jan 9th.
|
# ? Jan 3, 2011 09:47 |
|
Honestly I am kinda disappointed in desktop Sandy Bridge. The absence of VT-d and TXT on the K-series isn't really significant, but it feels kind of insulting to cut features out of a premium SKU. The fact that you only get Intel HD Graphics 3000 on the K-series CPUs that are least likely to use it is also odd. It's also bullshit that we can't use Quick Sync without using the on-die graphics; I don't see what keeps them from running the silicon even if it's not driving a display. Given the Quick Sync limitation, the feature division between chipsets (P67 gets dual graphics and overclocking, H67 gets Intel HD Graphics and Quick Sync) is even more frustrating.

On the plus side, Sandy Bridge appears to have more than delivered for the mobile sector. Sandy Bridge laptops are insanely fast and have great battery life, and Intel HD Graphics 3000 performs competitively with the Geforce GT 325M, which is pretty drat good for integrated graphics. Paired with a Geforce GTX 560M you'd have a pretty drat efficient mobile gaming system.

Alereon fucked around with this message at 18:20 on Jan 3, 2011 |
# ? Jan 3, 2011 10:13 |
|
incoherent posted:Are they really boxing the Extreme cooler with the K chips. So the K's come with the boxy cooler, like the one seen here: http://techreport.com/articles.x/20188/4 ? I've got my eye on a 2500K and was wondering if it's worth buying a fancier cooler. I'm more concerned about the noise level, though, and most aftermarket coolers seem to be louder than the old stock Intel one, at least; I haven't seen any reviews saying anything about this one.
|
# ? Jan 3, 2011 11:42 |
|
Anandtech says a couple of times that the stock cooler on the K is the low-profile one.
|
# ? Jan 3, 2011 11:43 |
|
I was excited until I found out the 2500K can't be overclocked on a board that actually supports the on-board graphics. The only games I play are older games that wouldn't use anything that powerful, and I was really hoping to reduce my build costs by saving on the PSU and graphics card, while having head-room to play with in terms of clock speed. I'm disappointed. I guess I'll wait until March to see how the market changes. I was hoping to build something sooner rather than later, but I'm not in a rush.
|
# ? Jan 3, 2011 12:43 |
|
Intel charging X dollars for the ability to overclock is downright evil. I say that fully aware that by buying a new processor (it's time for me to do so) and board I am supporting this.
|
# ? Jan 3, 2011 13:22 |
|
I'm glad that Anandtech mentioned it. I want/need the IOMMU for virtualization purposes. So I'm going to avoid the K-series like the plague. i7-2600 it is.
|
# ? Jan 3, 2011 15:11 |
|
Are the K chips still going to be price-gouged the first few weeks?
|
# ? Jan 3, 2011 16:18 |
|
R1CH posted:Sandy Bridge is the biggest disapointment (sic) of the year 2 days in. Oh Charlie. I like how the "showstopper bug" is booting from USB3 isn't stable. Thanks for that laugh, it's been a busy morning. Linux doesn't work quite right on bleeding-edge hardware? YOU DON'T SAY!?!?!
|
# ? Jan 3, 2011 17:07 |
|
Props to Anand, we've had SB CPUs here for a while now, and I never realized *Ks didn't have VT-d. I don't virtualize quite enough for that to drive me away, but like someone said, it's kind of a downer that the premium SKU (at least for now) doesn't have all the bells and whistles. My reasoning (guess) is that in QA testing, that functionality went straight to hell above certain clocks, and rather than have users enable/disable it in BIOS, they just killed it off. (Either that, or it can't cohabitate with the HD 3000 graphics, which I think is unlikely.)

Also, offering up the 2600K and saying "yeah, you could get above 5GHz with this", and then basically requiring a chipset (P67) that doesn't implement FDI...why make us pay for that HD3000? Maybe they just expect people to wait for the Z68. Maybe it was a chip packaging decision. I wouldn't mind being able to use the integrated GPU to run my 3rd display, it'd leave me PCIe slots free.

HW transcoding and acceleration is nice on paper, as it always is, but dealing with being on the cutting edge of multimedia tech, I've been taking the approach of just throwing CPU brawn at decoding to avoid headaches. Colorspace conversion errors, incompatibility between output renderers and combinations of DShow filters, asinine limitations for HW encoders...good thing there's a CPU underneath all of this that can actually earn its keep.

Looks like a home run for the mobile market, AMD is continuing to get owned there. Interestingly enough, I guess Intel is (like they have been) content to leave the "value" market to AMD.

e: spasticColon, yes, I hope you like the feeling of Newegg's capitalist phallus in you! I will probably take it happily in return for a 2600K
|
# ? Jan 3, 2011 17:43 |
|
The only difference between P67 and H67 is whether there are hookups for video hardware, right? So why does the vanilla Asus P8H67 have no display connectors? As far as I can tell, it's just a P8P67 with a shittier audio codec.
|
# ? Jan 3, 2011 18:08 |
|
Factory Factory posted:The only difference between P67 and H67 is whether there are hookups for video hardware, right? So why does the vanilla Asus P8H67 have no display connectors? As far as I can tell, it's just a P8P67 with a shittier audio codec. H67 doesn't support PCI-Express port bifurcation (dividing the x16 slot into two x8 slots) or overclocking. It's probably also less expensive than the P67.
|
# ? Jan 3, 2011 18:19 |
|
Alereon posted:H67 doesn't support PCI-Express port bifurcation (dividing the x16 slot into two x8 slots) or overclocking. It's probably also less expensive than the P67. It looks like the spec page says it supports some uber quad-GPU Radeon config; maybe they repurposed the FDI link? Though, IIRC, electrically FDI is very similar to DisplayPort, so maybe it is just the cost issue.
|
# ? Jan 3, 2011 18:51 |
|
Also, Asus saw it fit to make me dig through their product pages to figure poo poo out (their product comparator was borked when I tried it), and then I saw Legit Reviews put up a nice spec table: http://www.legitreviews.com/article/1500/1/

Happy:
- all have USB 3.0
- all have Firewire
- all have SATA 6Gbps
- P8P67 PRO and above implement a PHY for the integrated MAC in the P67
- P8P67 is only $160.00...so, $200 at the egg?

Sad:
- need to get the PRO or better for SLI
- P8P67 doesn't use the Intel ethernet controller
- P8P67 is starved for lanes
- no legacy 775 holes

Also, don't buy the LE. Personally I think I will go for the PRO, because I don't SLI, but want the Intel ethernet solution. Happy to see that pretty EFI implementation. I guess the Asus engineers that were sleeping all the time during training could learn in their sleep.

e: In case anyone was curious about the integrated Intel Ethernet functionality, it's similar to what was in the Q57. The chipset provides an integrated 10/100/1000 MAC. This MAC is useless without an accompanying PHY, however, which as its name suggests interfaces with the physical ethernet network. Apparently, it's cheaper for some makers to buy a Realtek controller (MAC+PHY) and use that compared to buying just the Intel PHY. If you don't use the Intel PHY, however, I think you can repurpose that PCIe x1 link for something else...like the aforementioned Realtek chip.

e2: And now I know why the PEX8608 is backordered, thanks Asus! I think everyone should seriously consider the "splurge" for getting the Intel solution; your torrents will thank you.

movax fucked around with this message at 19:26 on Jan 3, 2011 |
# ? Jan 3, 2011 19:01 |
|
movax posted:I think everyone should seriously consider the "splurge" for getting the Intel solution; your torrents will thank you. If we don't torrent, should we care?
|
# ? Jan 3, 2011 19:27 |
|
Even if you do torrent, you're almost certainly not going to care about your NIC chipset. GigE has been common for long enough that even your standard Realtek or Marvell chipset has no problems making it work well.
|
# ? Jan 3, 2011 19:30 |
|
movax posted:Props to Anand, we've had SB CPUs here for awhile now, and I never realized *Ks didn't have VT-d. I don't virtualize quite enough for that to drive me away, but like someone said, it's kind of a downer that the premium SKU (at least for now) doesn't have all the bell and whistles. My reasoning (guess) is that in QA testing, that functionality went straight to hell above certain clocks, and rather than have users enable/disable it in BIOS, they just killed it off. (Either that, or it can't co-habitate with the HD 3000 graphics, which I think is unlikely). Yeah right, it's product differentiation.
|
# ? Jan 3, 2011 19:38 |
|
Alereon posted:Even if you do torrent, you're almost certainly not going to care about your NIC chipset. GigE has been common for long enough that even your standard Realtek or Marvell chipset has no problems making it work well. I know that Broadcom and Atheros aren't up to snuff compared to Intel's offering, and my experiences with Realtek haven't given me the most favorable opinion. Granted, this can be due to the implementation of the controller IC (or drivers!), but Dell had R610s available with Broadcom NICs that would randomly kill off Solaris. quote:Yeah right, it's product differentiation.
|
# ? Jan 3, 2011 20:01 |
|
Combat Pretzel posted:Trying to figure out what VMCS is, I ran over VirtualBox documentation that suggests that it's a feature available on all Intel CPUs with VT-x. The Virtual Machine Control Structure (VMCS) is explained in the Intel PRM Vol. 3B; if that's not quite enough to put you to sleep, you can read the rest of the PRM. It's a 4KB region of memory containing guest state, host state, control bits, etc., accessed through the vmread/vmwrite instructions. Basically it's an implementation detail of VT-x, and unless you're rolling your own hypervisor you shouldn't care about it. If any of the other SNB vets in FM want to meet up for lunch or at least to awkwardly stare at each other's shoes, hit me up on PM.
|
# ? Jan 3, 2011 20:09 |
|
JawnV6 posted:If any of the other SNB vets in FM want to meet up for lunch or at least to awkwardly stare at each other's shoes, hit me up on PM.
My spirit will be at hotdog island.
|
# ? Jan 3, 2011 20:20 |
|
JawnV6 posted:The Virtual Machine Control Structure (VMCS) is explained in the Intel PRM Vol. 3B, if that's not quite enough to put you to sleep you can read the rest of the PRM. It's a 4kb region of memory containing guest state, host state, control bits, etc. accessed through the vmread/vmwrite instructions. Basically it's an implementation detail of VT-x and unless you're rolling your own hypervisor you shouldn't care about it. God my PRMs are so old. Ordering some shiny new ones today. I assume that you could possibly answer this: when will the Intel product pages for their 6 Series Chipsets go up? After/during CES?
|
# ? Jan 3, 2011 20:26 |
|
I absolutely love Intel. I buy their stuff constantly, BUT I would never trust their driver team to be able to deliver stable and feature-packed graphics drivers for this new chip. As of right now Intel has never been able to deliver a competitive graphics chip/driver. According to a lot of reviews, SB cannot decode FILM correctly and outputs 24fps instead of 23.976fps, which makes movie watching jittery. That is loving unacceptable. I mean, all this hubbub over what amounts to a lovely entry-level graphics chip that lacks features and who knows what else? No loving thanks. also, Marvell Yukon PCI-E GigE NICs loving rock. You can get a nice Rosewill one from newegg for 25ish bucks.
|
# ? Jan 3, 2011 20:30 |
|
Raptop posted:My spirit will be at hotdog island. Hotdog Island is gone, man. JawnV6 posted:If any of the other SNB vets in FM want to meet up for lunch or at least to awkwardly stare at each other's shoes, hit me up on PM.
Does CPT count? We crash your celebrations anyway.
|
# ? Jan 3, 2011 20:32 |
|
movax posted:My reasoning (guess) is that in QA testing, that functionality went straight to hell above certain clocks, and rather than have users enable/disable it in BIOS, they just killed it off.
It creates a separate product for the gamer/enthusiast market that enterprise can't use. No doubt they don't want to offer the K models at all but are worried that completely cutting off overclocking will give AMD an advantage, so they have pigeonholed it with the K models- removing virtualization and playing these ridiculous chipset games.
|
# ? Jan 3, 2011 20:33 |
|
|
redeyes posted:According to a lot of reviews this SB cannot decode FILM correctly and outputs 24fps which makes movie watching jittery. That is loving unacceptable.

1. It's a good thing there's a CPU included with SB that can decode video purely in software.
2. It's been nearly half a century and everyone still has to deal with framerate fuckery (and they still get it wrong). Though screaming about 23.976 vs 24.000 is a new one to me.

I wonder if the DisplayPort jitter issue that was present on Ibex Peak snuck its way into the 6 Series from all the design reuse?
|
# ? Jan 3, 2011 20:41 |
|
|
So, being that the 2500K is priced at $216, how much will it likely sell for on a site like, say, Newegg?
|
# ? Jan 3, 2011 21:01 |