freeforumuser
Aug 11, 2007
Sandy Bridge, AKA: finally an Intel GPU that doesn't have to be avoided like the plague.

Even so, I'm still not a fan of Intel GPUs, but hopefully this will drive down midrange mobile GPU prices. Desktop-wise, I don't see why anyone would even bother with SB if they already have an overclocked ~4GHz i7/i5/i3/C2Q/Phenom II CPU.


freeforumuser
Aug 11, 2007

4 Day Weekend posted:

I'm not sure if this is correct or not, but I swear I read somewhere that the reason most games don't need much processing power is that most are designed to be able to run on consoles as well. So once next gen consoles roll around, we'll see a big jump in system requirements.

Assuming they aren't ported over to the PC craptacularly like Splinter Cell: Conviction. I didn't upgrade my graphics card to play games that look worse AND run slower than before.

freeforumuser
Aug 11, 2007
Let's face it, the only real apps left that are still primarily CPU limited are rendering and video encoding. Interestingly, both lend themselves well to massively parallel processing on GPUs, same for gaming physics. And now we see Intel and AMD pushing CPUs with integrated GPUs. Coincidence? Methinks not; I proclaim the multicore era already over, and I welcome our new GPU-dominant processor overlords.

freeforumuser
Aug 11, 2007

Alereon posted:

440BX didn't have that much longevity. It was replaced within a year, and obsolete within 2 years as all of Intel's CPUs after that point required a 133Mhz FSB. The 1100Mhz Coppermine P3 was the last supported CPU, the Coppermine-T and Tualatin CPUs (as well as all previous 133Mhz FSB CPUs) required at least an Intel i810E chipset.

Good old times, when chipsets actually affected CPU performance. VIA ruled the roost back in 2000/01 because Intel tried to dictate the RAM market and shove RDRAM down our throats, and what a karmic epic failure that was. Even without adjusting for inflation, PC700 RDRAM cost more than an entire gaming rig of today. However, once Intel got their poo poo together by supporting DDR and nForce came out for AMD, VIA was pretty much dead.

freeforumuser
Aug 11, 2007

Kerris posted:

How does AMD's Fusion compare to Sandy Bridge? Is Intel closer to delivering Sandy Bridge?

The mainstream part for Fusion is Llano, which is K10 + a 400/480-shader GPU. Needless to say the GPU will slaughter SB outright, but the CPU portion will be 2 generations behind SB by the time it comes out in 2H 2011. The real star of the show is Bobcat, which should deliver C2D + 5450 class performance in the power footprint of an Atom, and be released by the end of this year... Now that is impressive!

freeforumuser
Aug 11, 2007

WhyteRyce posted:

It should make the netbook market interesting again at least, although I've never understood the fascination people have with wanting to play games on a netbook.

With current netbooks I agree, but Bobcat should lift netbook performance to the point that netbook gaming is finally viable, and provide real competition for the slowass Atom. If Bobcat can force Intel to put a scaled-down SB into netbooks, even better on the whole.

freeforumuser
Aug 11, 2007

Alereon posted:

Fudzilla is reporting that Apple will switch to AMD Fusion processors in their upcoming products (update with additional confirmation here). This is probably a direct response to Sandy Bridge's on-die graphics not supporting OpenCL, as Apple has committed to shipping every computer with support for GPGPU acceleration. On products without dedicated GPUs, they do this by pairing an older Core 2 Duo processor with an nVidia Geforce 320M chipset that provides CUDA and OpenCL support. Since the Core 2 Duo is getting old and nVidia won't be making any more chipsets, they pretty much have to switch to AMD unless they're willing to put a dedicated GPU in every product (including the Macbook Air) or abandon their GPU acceleration plans.

I don't really like the idea of Intel dominating the mobile market, so that's a win for us all.

freeforumuser
Aug 11, 2007

KKKLIP ART posted:

The other side of the coin is that over this generation processors are doing more work per clockcycle than before. I would bet a dollar or two that my E8400 (assuming I could turn one core off) can do more work faster than any of the p4 era 3.0GHz processors. AMD really started this trend with their Athlon 64 line, lower clocks but faster. Numbers for requirements were probably inflated a bit (like the wattage on crappy power supplies are) to make sure you could get the performance you needed

To give you some sense of scale, the slowest C2D at debut was the 1.86GHz E6300, and that was just as fast as the most powerful Netburst chip, the 3.73GHz dual-core + HT Extreme Edition. A 3GHz E8400, which is 10% more efficient per clock than the original C2Ds, is going to eat any P4 for breakfast, regurgitate it as lunch, and chew through it as dinner.

On the other end of the spectrum, a 2.2GHz P4 released in 2002 is on par with a 1.6GHz Atom. That should show how terrible Atoms are in terms of performance/price, even if netbooks are pretty cheap to start with.
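To put rough numbers on that clock-versus-IPC argument, here's a toy model. The IPC multipliers are my own illustrative guesses, not measured benchmark figures:

```python
# Toy throughput model: performance ~ clock (GHz) x relative per-clock work (IPC).
# The IPC factors below are rough illustrative guesses, not benchmarks.
def rel_perf(clock_ghz, ipc_factor):
    return clock_ghz * ipc_factor

p4_ee = rel_perf(3.73, 1.0)   # Netburst dual-core Extreme Edition as the baseline
e6300 = rel_perf(1.86, 2.0)   # Conroe: roughly double the per-clock work
e8400 = rel_perf(3.00, 2.2)   # Penryn: ~10% more per clock than Conroe

print(e6300 / p4_ee)  # ~1.0: the 1.86GHz chip keeps pace with the 3.73GHz one
print(e8400 / p4_ee)  # ~1.77: no contest
```

Half the clock, same performance: that's why the "GHz wars" numbers stopped meaning anything after Conroe.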

freeforumuser
Aug 11, 2007
I'm waiting to score some cheap i5 quads when SB hits. You guys can have the SB chips and let cheapskates like me clear away the (not-so) outdated trash, thanks.

freeforumuser
Aug 11, 2007

PC LOAD LETTER posted:

Yea this guy got a engineering sample. He posts at XS all the time and got his sample to a little over 5Ghz. Intel also demoed a overclocked system at 4.9Ghz with a stock heatsink running Cinebench before him though. Supposedly you don't need high volts to OC them either, but we'll have to wait and see because all we have right now are ES screens on random sites to go by.

If you run stuff at stock and already have a Core i3/5/7 SB really isn't worth upgrading for performance wise. If you overclock and/or have an older system it should be quite an upgrade though.

I'm sure retail SBs won't hit the overclocks achieved by the hand-picked ES chips.

freeforumuser
Aug 11, 2007

dud root posted:

Anyone have any experience with the Asrock boards? The Exp4 & Exp6 have 775 mounting holes, which I need for my water block. Ill be pairing it with the 2500k

My friend and I bought the cheapest ASRock 1155 mATX mobo for his i5 2400. Neither of us really cared about USB 3.0, but we later found out it has that and UEFI too. Amazing deal, I tell ya.

freeforumuser
Aug 11, 2007

Faceless Clock posted:

Was there really any reason to wait? It looks like there are a few extra overclocking goodies, an SSD related feature, and...?

If you wait to buy hardware based on such minor revisions you'll never actually buy any hardware.

When the cheapest P67 boards come with Crossfire and USB 3.0 as the baseline standard, it's really hard to care about Z68. Ditto for LGA 2011, when 99.99% of us won't need more than an overclocked 2500K.

And I still don't get why Intel is selling the i5-760 at the same price as a 2500K. Do they really believe idiots will fall for buying last-gen stuff at current-gen prices?

freeforumuser
Aug 11, 2007

Alereon posted:

Anandtech just published some additional details about the upcoming Sandy Bridge-E processors and X79 chipset. Disappointingly, only the hex-core processors will be unlocked, the quad-core CPUs are limited to 6 bins above the maximum Turbo frequency, or 4.5Ghz. I'm curious to know what the actual chipset<->CPU bus is, and how many PCIe lanes from the chipset are actually available for devices.

It will most likely be X58 again but without any of the perks (better overclocking compared to S1156). Overpriced CPUs, overpriced mobos and overrated quad-channel memory. Anyone with a sense of cost-effectiveness would avoid this like the plague.
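For anyone unfamiliar with "bins": each bin is one 100MHz multiplier step, so the quoted 6-bin cap works out as below. The 3.9GHz max Turbo here is inferred from the 4.5GHz figure in the quote, not a confirmed spec:

```python
BIN_MHZ = 100  # one multiplier "bin" on these chips = 100 MHz

def max_oc_mhz(max_turbo_mhz, bin_limit):
    # Highest multiplier-limited clock: max Turbo plus the allowed extra bins.
    return max_turbo_mhz + bin_limit * BIN_MHZ

# An assumed 3.9 GHz max Turbo + 6 bins gives the 4.5 GHz ceiling quoted above.
print(max_oc_mhz(3900, 6))  # 4500
```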

freeforumuser
Aug 11, 2007
Seems like Intel heavily gimped the SB mobile i3s to a mere 2.1GHz, while the mobile i5s can turbo all the way up to 3.3GHz. I think they realized the Arrandale i3s were simply too good for the price.

freeforumuser
Aug 11, 2007


All hail my new toy. (on a cheapo-yet-incredible Biostar TP67B+)

:smug:

freeforumuser fucked around with this message at 15:14 on Apr 29, 2011

freeforumuser
Aug 11, 2007

real_scud posted:

What is standard voltage? Cause I'm running 4.5 right now but my cpu voltage is at 1.22 and with all the dicking around I've forgotten what it originally was.

It's not stated on the Intel product page, but according to my BIOS, 1.25V. However, I'm not sure if the 2500K is one of the CPUs with varying stock voltages.

freeforumuser
Aug 11, 2007

unpronounceable posted:

If those transistors make the performance increase that dramatic, I'll just keep my laptop until Ivy Bridge (and whatever AMD processor) comes out. Can't justify getting rid of it anyway while it's still under extended warranty.

Considering SB mobile i7s are already faster than a desktop i7-920, imagine mobile IB with 22nm + 3D transistors, which I'd guess means 20% higher clocks at the same power envelope, on top of architecture improvements and 3x the IGP performance = Oh my gooooooooooooooooooooood.

freeforumuser
Aug 11, 2007

movax posted:

Holy poo poo, called that one awhile ago! Not a big loss, PCIe 2.0 at current link widths is doing just fine. Still out of my price range though.

Unless you like to wave your e-peen around, LGA2011 is a horribly unappealing platform, with the cheapest hex-core starting at ~$560 and boards probably around $250. The CPU and mobo alone cost more than an entire 2500K gaming rig, just for 2 extra cores most apps won't even use. I'm soooo excited! (not)

freeforumuser
Aug 11, 2007

COCKMOUTH.GIF posted:

I'm definitely working on waiting for Ivy Bridge. The pushed back release deadline helps because it's a little bit after tax refund season which is good. I'm not exactly hurting at the moment, I'm on an e8400 with a Radeon HD 4890 so I'm thinking I should have no problem waiting until spring of next year to upgrade.

The confusing thing I can see is how some of these Sandy Bridge processors/motherboard chipsets have different features. There are a lot of pro's and con's like one processor/motherboard does virtualization well, but another does overclocking fairly well. I hope they don't try to pull that poo poo in Ivy Bridge.

Yeah, it's ridiculous that the 2600K has VT-x but not VT-d despite being the top-tier CPU for its socket, and the P67 chipset should never have existed in the first place. But on the chipset front, AMD had a lot more pointless chipsets this gen and the last.

freeforumuser fucked around with this message at 02:07 on Sep 12, 2011

freeforumuser
Aug 11, 2007
My god, another 4-6% performance per clock over SB? What drugs are Intel on?

freeforumuser
Aug 11, 2007

karoshi posted:

Lovely chipset table, nice segmenting to squeeze all the dollars from the high-end must have all boards. Want to have SSD caching? Oh buy the cheapest boards that cant overclock. Oh, you wanted to overclock, too? Yeah, no, our middle of the range boards dont support that, can i interest you in this FARTAL1tY ROG XTR3M GTX 997 board for only $359? You can triple SLI with this puppy, too!

Is there a difference other than marketing (and 3x graphics support) between the z77 and z75? And aren't the graphics PCIe bundles controlled by the CPU anyway? So the bios checks the installed chipset and sets a flag on the CPU to block 3x SLI, because they havent installed a chip that's outside of the electrical path between CPU and GPU(s) anyway. So effectively this is like the famous SLI tax of the past where it wouldnt SLI unless it had a uselessSLI-optimizer chip on the board. The 891 triple SLI users are getting ripped I'd say.

It's funny that the same PC industry that complains PC gaming is too expensive is the one selling overpriced "gaming" hardware.

And speaking of SLI, my friend bought a $250 680i triple SLI mobo back in the E6600 days with an 8800GTS 320MB, only to end up with a single 9600GT before the whole thing was replaced with a 2500K + 6870. That's $150 extra paid for nothing.

freeforumuser
Aug 11, 2007
http://en.wikipedia.org/wiki/Haswell_(microarchitecture)

The most interesting bit about Haswell is the integrated southbridge. Not surprising, given the trend of integrating more and more circuitry onto the CPU package itself.

This surely must make motherboard makers quake in their boots as they see Intel slowly removing them from the equation. Intel could simply sell ready-made mobos with Haswell SoCs themselves, bypassing Asus & co completely.

Maybe that's been Intel's devious plan all along.

freeforumuser
Aug 11, 2007

Combat Pretzel posted:

Is a 30-50% guesstimate really worth 3-4x the price?

If gaming is the most intensive stuff you will do on your PC, LGA2011 is stupid as far as price/performance is concerned. 6 cores do nothing over 4 cores for games now, and for at least the next 2-3 years.

freeforumuser
Aug 11, 2007

Agreed posted:

They've demonstrated that it's a good overclocker. I imagine that's going to be their push in the desktop market. Intel seems better positioned (due to having dump-trucks full of hundred dollar bills) to bring their prices down if it means staying competitive, though. At the proposed price points, AMD is going to have trouble, especially since Intel's going to be offering the 2700K at the 2600K's price point (which is probably just going to be a binning process with the current 2600Ks, but still, it'll push prices on the 2600K and 2500K down by design to make it that much harder for AMD).

AMD has talked about "easily getting 5ghz on air" as part of their press release regarding their record-holding 8GHz+ clock on a Bulldozer chip; the only problem is that their claimed clock for clock improvement, unless it's intentional misdirection (why would it be, let their stock take a dive while Intel pushes forward with high-powered architecture and even greater power efficiency for the markets where the real money is?) means that if every single Bulldozer buyer gets 5GHz no problem, it's just a big number. Assuming 35% improvement over current AMD CPUs, that's still lagging behind Intel's performance, and the clock advantage means that even if most Intel CPUs aren't likely to hit 5.0GHz right now, most of them will do 4.4-4.5GHz, so even among enthusiasts they've got an advantage in performance at the same price point as AMD's Bulldozer chips. I guess it remains to be seen what the more-than-just-hyperthreading, less-than-additional-cores extra parts on the Bulldozer chips will do for performance, but they better all overclock like crazy if AMD's got a shot at regaining some performance competitiveness this generation. And bring the server parts down, Intel runs the market there, how will AMD compete apart from cost savings?

Llano is taking a monstrous 1.5+ volts to hit a measly 3.8GHz on 32nm, which is terrible considering the same architecture hit 4GHz at lower voltages on 45nm. My hunch is that all these BD delays were due to poor 32nm yields, and since BD was designed from the ground up to hit 4GHz+, I don't think BD will be a winner on the performance/watt front if the voltages are still sky-high.

freeforumuser
Aug 11, 2007

Magic Underwear posted:

Nope. Read the article again. It's going to paper launch in Q4 2011, but "according to Otellini, first Ivy Bridge systems should become available in Spring 2012".

If you're in the market to upgrade you might as well do it now (especially if you're still using a C2D like me), IB isn't going to be a factor for 6 months or more. Not to mention that IB's performance gains are likely to be modest. Hell, if IB turns out to be amazing you can always just replace your 2500k, IB will be an 1155 part just like the 2500k.

edit: drat you star wars sex parrot for posting that article, nobody is reading the important part. let me summarize it in bold: As a consumer, there's no way in hell you're seeing Ivy Bridge this side of Q2 2012. That article just says they're going to get a few units out so that they can say they shipped in Q4 2011.

Word. Intel ticks (Penryn, IB) aren't really worth waiting for; it's the tocks (Conroe, Nehalem, SB) that are far more impressive.

freeforumuser
Aug 11, 2007
Ivy Bridge SKUs leaked:
http://forums.anandtech.com/showthread.php?t=2208665

tl;dr: not worth holding off on SB for IB.

freeforumuser
Aug 11, 2007

Economy Clown Car posted:

I was just about to post this very question :catstare: but for the new line of CPUs in general. Because I thought it was funny to go from triple channel back to dual channel from the regular i7s to the SBs.


Also, I have been sitting on my hands waiting for the "New Standards"(TM) since we've been using 1920x1080 rez displays for quite a few years now and I'm waiting for anything higher than DDR3 2000 for memory. Logic being I don't want to get pranked in 2-3 years time and suddenly I have no upgrade capability because my mobo is the old standard and I have an old broke hoopty monitor.

Memory bandwidth hasn't been a problem ever since dual-channel chipsets debuted, especially with today's CPUs and their plentiful caches and prefetchers, whose very purpose is to minimize on-demand RAM access as much as possible. In particular, LGA1366 triple-channel was overkill to the max; it runs just as fast in dual-channel mode. The DDR3 2000+ kits you see now are basically a marketing ploy to sell overpriced RAM that does next to nothing over dirt-cheap $15 DDR3 1333 sticks.
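The dual- vs triple-channel point is easy to sanity-check against theoretical peak bandwidth. Each DDR3 channel is 64 bits wide, so 8 bytes move per transfer (these are peak numbers; real-world throughput is lower):

```python
def peak_gb_s(channels, mega_transfers):
    # channels x MT/s x 8 bytes per 64-bit transfer, converted to GB/s
    return channels * mega_transfers * 8 / 1000

print(round(peak_gb_s(2, 1333), 1))  # 21.3 GB/s: plain dual-channel DDR3-1333
print(round(peak_gb_s(3, 1333), 1))  # 32.0 GB/s: LGA1366 triple-channel
print(round(peak_gb_s(2, 2000), 1))  # 32.0 GB/s: pricey dual-channel DDR3-2000
```

On paper the expensive kits move 50% more data, but if the CPU's caches and prefetchers already hide most RAM accesses, that headroom goes unused, which is the whole point above.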

freeforumuser
Aug 11, 2007

Shaocaholica posted:

I've been asked to setup things like this and I just try to stress that if they plan on using it for any kind of 'work' work it will most likely give them a headache even if the brightness is adjusted correctly for close up viewing.

Yeah, those are the same people who complained about flickering 17" CRTs set at 60Hz @ 640x480 in the good old days. Most HDTVs are loving terrible for actual desktop use; I don't understand how anyone finds enormous V-shaped subpixels bearable for viewing anything less than 3 feet away. Bonus points if the image is distorted for whatever reason. It's kind of amazing how much poo poo people can tolerate with computers.

freeforumuser
Aug 11, 2007

WardeL posted:

I couldn't find a better place for this question so I guess I'll ask here.

I'm upgrading my computer, because my C2D is showing its age. I know this seems like a pretty basic question but I've been out of the loop about this kind of thing for like the past decade (the last time I knew a lot about computer hardware was when the Ti 4600 or whatever it was was the best GPU on the market).

If I don't give the tiniest poo poo about overclocking, there's no reason to get a i5 2500k over a 2500, right? That's the conclusion I got to but I just want to make double sure I'm right before I put down $500+ on new hardware.

There is little reason why you shouldn't overclock Sandy Bridge.

freeforumuser
Aug 11, 2007

Alereon posted:

EXPreview has some details and graphics benchmarks of the upcoming Ivy Bridge Core i5 3570K. HD Graphics 4000 is between 30-85% faster than HD 3000 in real games, up to more than twice as fast in 3DMark.

VR-Zone is also reporting that the previously announced delay for Ivy Bridge will only apply to dual-core mobile chips. Desktops and quad-core mobile chips will launch at the beginning of April as planned.

I was like "omg" when I saw the IB GPU beating SB's by 60%, until I realised it's still much slower than my $150 mid-2008 9600GT (>> GT 240 DDR3).

freeforumuser
Aug 11, 2007

Beelzebubba9 posted:

Also, Intel's biggest competition these days is from its own legacy products. Unlike AT&T, Intel makes money by selling widgets, not services, so if they don't make new chips compelling enough to warrant an upgrade, then they don't generate revenue. I suspect that in a vacuum of viable competition we'll see the rate of improvements slow, but not by much. Intel still has to make vastly improved CPUs year over year to fuel growth, so unless Intel's shareholders want to settle for lower revenue and higher margin (and risk getting mauled on the low end by ARM/WOA), I don't see this changing.

Where AMD was useful was that they could keep Intel from solely dictating the direction of the market. Had AMD imploded a decade ago, I strongly suspect we'd all be running IA64 CPUs right now. Could be worse, really.

But here's a question: How many of you have greatly slowed the rates of your own CPU upgrades? I used to get a new computer/CPU every 2-3 years, but my current i7 920 @ 3.6Ghz has provided enough performance that I feel it'll be until Haswell (or its death) before I upgrade. Anyone else finding themselves in a similar boat?

My 2500K is going to last a long time. From a strictly desktop POV, IB is too lackluster to jump ship for.

It's not even just CPUs; AMD is making 7770s that are barely faster than its own 5770 from almost 2.5 years ago, as if they knew Nvidia wouldn't try to one-up them in the future (even though the GTX 560 is already waaaay better). Stagnation is here, folks.


freeforumuser
Aug 11, 2007
3770K Chinese (leaked?) preview.

http://news.mydrivers.com/1/218/218443_9.htm

Stock clocks draw only 18W less than a 2600K at load. Performance is pretty much indistinguishable from a 2600K in reality.

5GHz OC at 1.27V. Impressive by SB standards, but that might be a bit too high a voltage for 22nm.
