roadhead
Dec 25, 2001

Sinestro posted:

I know this is pretty :downs: of me, but is BD going on laptops, or is it just Llano?

Last I heard the mobile equivalent of Bulldozer is Bobcat, but I might be mistaken there.

And by equivalent I mean you would need discrete graphics to use Bobcat. Does anyone know more?


EDIT: Don't listen to me, I'm stupid. Bobcat is just the redesigned CPU core for the upcoming "below Llano" notebooks. I don't think they are going to do many "new" designs for notebooks that aren't APUs - so high-end notebook workstations will maybe just continue to adapt the HE desktop chips?

roadhead fucked around with this message at 15:12 on May 19, 2011


roadhead
Dec 25, 2001

Looking at the hardwareheaven.com benches, it seems they just used insane settings so that all the games were GPU-bound - like it's a graphics test or something.

After that it's just a matter of running each test enough times and throwing out the high scores for Intel and the low scores for AMD.

Had an 8150 + mobo in my cart on Newegg and plenty of time to finish the checkout, but I could not bring myself to do it. $100 more than the Phenom II X6 1100T AND I need a new motherboard? Hmmmm.

I still want one though - will wait for FX-8170.

Also I love the little A8-3850 in my HTPC - so what AMD lacks in the low-volume enthusiast gamer market can perhaps be made up in the "I want the cheapest computer you have that plays games" segment at Best Buy.

roadhead
Dec 25, 2001

I'm hoping against hope that the thread scheduling is all out of whack - that threads which should be sharing a module (and its L2) aren't, so the L3 is being used for things it shouldn't be under normal circumstances.

The only other explanation I can think up is they really did alter the integer pipelines significantly (lengthened them, I fear) from Phenom II and there will be no way a simple Windows 7 patch can save it.

How could they not see that it wasn't reliably beating the x4 and x6 Phenom IIs a while back, unless they had some synthetic (simulated most likely) situations where it was?
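
If the scheduler theory is right, you don't even need to wait for a Windows patch to test it - just pin the cooperating threads to the same module yourself. Here's a minimal sketch (Linux/pthreads; it assumes the kernel enumerates the two cores of each module as adjacent CPU IDs - 0/1, 2/3, and so on - which you'd want to confirm under /sys/devices/system/cpu/cpuN/topology/ on real hardware). On Windows 7 the equivalent call is SetThreadAffinityMask.

```c
/* Sketch: pin two cooperating threads onto one Bulldozer module.
 * Assumes module siblings are adjacent CPU IDs (0/1 = module 0).
 * Build: gcc -O2 -pthread pin.c -o pin
 */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

static void *worker(void *arg)
{
    int cpu = *(int *)arg;
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(cpu, &set);              /* restrict this thread to one core */
    pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
    /* ... do the shared-data work here: both threads now share the
     * module's L2, so the L3 goes back to caching main memory ... */
    printf("thread running on cpu %d\n", sched_getcpu());
    return NULL;
}

int main(void)
{
    int cpus[2] = {0, 1};            /* the two cores of module 0 */
    pthread_t t[2];
    for (int i = 0; i < 2; i++)
        pthread_create(&t[i], NULL, worker, &cpus[i]);
    for (int i = 0; i < 2; i++)
        pthread_join(t[i], NULL);
    return 0;
}
```

If the same pair of threads runs measurably faster pinned to one module than split across two, the scheduler really is the problem.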

roadhead
Dec 25, 2001

Bob Morales posted:

So Bulldozer is AMD's Merced?

It's not *that* much of a complete waste of money and effort - at least it's still x86/x64 and not some one-off instruction set that almost no one uses.

If threads that should be cooperating are being assigned to separate modules, and L2 and L3 are being wasted on inter-module communication as opposed to actually caching main memory like they should be, the ripple effect of a simple scheduling fix could be huge.

Even 10-15% at this point would at least be enough of an improvement not to look completely foolish next to the 2500k or the 1100T (once they drop the price of the 8150 to $235 or so) - but this is just wild speculation.

The seemingly abundant supply of the 6100 means that yields of fully working Bulldozers are probably not that good :/

roadhead
Dec 25, 2001

movax posted:

I think those guys are floundering because they don't have enough money. They're having to cut corners somewhere, be it the architecture team, process, software support, packaging, etc. They can't fire on all cylinders. In an ideal world they'd have an army of software engineers preparing drivers and updates for the major operating systems while the hardware team gets the actual hardware ready.

If they really have switched to a ton of EDA tools as well, I can see a disconnect between some old guard engineers and fresh guys that studied with EDA in school. I know I'm a baby engineer and I had EDA tools at my disposal during school, but I've had to go back to the dark ages a bit in supporting some legacy products.

The eternal optimist in me wants to say they automated Bulldozer while the hand-tuned transistor work was (is) being done for Piledriver.

Perhaps these chips are more or less the same at the block-level and all the improvement in PD will be from tweaking the circuits down to as few gates as possible and other tuning.

Otherwise I just don't know anymore - this is obviously not the product we needed to come out of AMD to actually keep Intel on their toes. Did they even have to drop the price of the 2600K in response?

roadhead
Dec 25, 2001

The 1100T I bought as a consolation prize for myself installed smoothly - but I can no longer get any sort of program to read its temperature outside of the BIOS itself.

HWMonitor, AMD Overdrive, and SpeedFan all say 0°C - preposterous!

Gigabyte GA-MA790FXT-UD5P with the F8N BIOS revision of course.

Oh well I needed to upgrade in order to build my brother's wedding gift you see...

roadhead
Dec 25, 2001

Alereon posted:

Use CoreTemp for monitoring AMD CPU temperatures.

That also reports 0°C when I am running Prime95 - obviously incorrect.

Probably a result of the "beta" status of this BIOS, the only one on this motherboard that supports this CPU.
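
For anyone else chasing this on Linux, the k10temp driver exposes the same Tctl reading those tools use as a plain number in sysfs - millidegrees in temp1_input. A minimal sketch of reading it directly (the hwmon0 index is an assumption; it varies per machine, so check the "name" file under each /sys/class/hwmon/hwmonN/ first):

```c
/* Sketch: read the CPU temperature the Linux k10temp driver exposes.
 * The hwmon index (hwmon0 here) is assumed - verify which hwmonN has
 * "k10temp" in its "name" file before trusting the path.
 */
#include <stdio.h>

int main(void)
{
    const char *path = "/sys/class/hwmon/hwmon0/temp1_input"; /* assumed index */
    FILE *f = fopen(path, "r");
    if (!f) {
        perror(path);
        return 1;
    }
    long millideg = 0;
    if (fscanf(f, "%ld", &millideg) == 1)          /* value is millidegrees C */
        printf("CPU temp: %.1f C\n", millideg / 1000.0);
    fclose(f);
    return 0;
}
```

If that also reads 0, the problem is below the OS - the beta BIOS isn't setting up the sensor, and no monitoring tool can route around that.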

roadhead
Dec 25, 2001

HalloKitty posted:

It makes it sound slightly less badly engineered. Still doesn't change the facts of the benchmarks/power & heat. I guess they're trying any kind of damage control right now.

Makes you think the recent house-cleaning let go a lot of pure marketing people and maybe a real engineer took their place? How else do you officially release a number that is nearly twice the actual transistor count?!?

roadhead
Dec 25, 2001

Newegg messed up and sent me an email shouting about ordering it RIGHT NOW. And I followed the links to nothing. So.... Soon?


http://www.newegg.com/GTX680/?nm_mc=EMCCP-032212-NV&cm_mmc=EMCCP-032212-NV-_-GeForceGTX680-_-banner-_-SHOPNOW


Goes to a promo page, but the links don't work yet.

roadhead
Dec 25, 2001

nmfree posted:

For anyone looking for a late night laugh, AMD just sent out their 2011 Annual Report; I got mine in the mail yesterday so I haven't read through much of it yet but there's gotta be some good stuff in there. (If the direct link doesn't work go here instead to get it.)

I'm in the same boat as you in that I wanted to upgrade my X3 to an 1100T (or whatever) to increase my F@H points. When I looked 2 months ago all of the X6es were pretty difficult to find; wish I would have pulled the trigger then.

I'm glad I got mine when I did back in November. They probably quit making them a while back and the channel finally sold them all. I bet if you look hard enough and are willing to deal with an unknown vendor you can find one.

roadhead
Dec 25, 2001

Christobevii3 posted:

When did you see that it was nvidia? I've always seen ibm/amd?

http://semiaccurate.com/2012/01/18/xbox-nextxbox-720-chips-in-production/

Yea I was pretty sure AMD/ATI had all the design wins for the upcoming generation of consoles.

roadhead
Dec 25, 2001

Factory Factory posted:

No. Brazos might, maybe, to sort-of-compete with the new Atom SoCs (which have a huge advantage in power consumption and platform cost). But Trinity's lowest TDP will be 17W, destined for "ultrathin" laptops. Maybe possibly someone will make a fat-tab around it, like with IVB ULV CPUs, but that'd be a tough sell when tablet gaming largely does not need the type of GPU that Trinity prioritizes over CPU performance. According to AnandTech's testing, the load power consumption is kinda poo poo, too, with the 65W Trinity A8-5600K using 14W more power on Metro 2033 than a 77W i7-3770K with HD 4000. This is despite having an idle power 7W lower.

--

TechReport is pissed about AMD's staggered NDA-lifts for Trinity, accusing its marketing of trying to exert de facto editorial control. The excerpts of the NDA-lift communication are pretty... well...

The NDA didn't lift on non-gaming benchmarks (i.e. CPU) because:


I think we can read between the lines there: Piledriver cores blow.

TR is withholding its review on principle, acknowledging that all the buzz has already been set by other outlets' preview articles. It says:


AMD responded, and TR published it with commentary. TR didn't think it changed much.


I'm pretty sure the average person can count on one hand the number of times they've completely pegged their CPU in the last week. Yes, it's the thing everyone thinks they need, but raw CPU performance is "good enough" on any modern processor in my opinion.

It's the overall experience (balance of CPU and GPU) that AMD is aiming for now (since they'll never catch up in raw CPU performance, obviously), and rightly so. You go where you think you can make a difference.

What I want to know is the actual street date of this Lenovo IdeaPad S405! All the press releases indicate it should be out now but I can't find it anywhere!

roadhead
Dec 25, 2001

Bob Morales posted:

You'll still notice an i5 vs an A6 or whatever. Pegging your CPU is like pegging your internet connection. "100%" only means you were using it at full speed for a whole second.

You might not "peg" a 5Mb connection that often, but that doesn't mean a 20Mb connection won't be 4 times faster, and that you won't notice it.

Not when the bottleneck is your HDD, as it is when most people complain of their machine being "slow" - people don't react to the extra seconds (minutes?) it takes to encode an album full of MP3s; they react to the overall responsiveness and usefulness of the machine.

"Can I still browse the web without it herking and jerking while I encode these MP3s," type thinking.

roadhead
Dec 25, 2001

Professor Science posted:

Er, dedicated? As far as I've seen, PS4 seems to have a GPU with (at least relatively) standard GCN units. The only interesting thing that HSA would provide is if the GPU and the CPU have some level of cache coherence.

Main memory is shared and I bet all the L2 cache (and L3 if there is any) is shared as well.

roadhead
Dec 25, 2001

Professor Science posted:

Shared main memory != cache coherence. This kind of coherence (between CPU and integrated GPU) is one of the big promises of HSA. If I had to make a bet, I'd say the GPU can snoop the CPU L1/L2 but not vice-versa.

I doubt Sony would push this out without that feature though. This is heavily customized and will have volume in the millions most likely over the next 8-10 years, so I'm assuming all stops were pulled.

Unless you have a PS4 dev kit, and then I'd assume the NDA would keep you from posting anyway ;)

roadhead
Dec 25, 2001

JawnV6 posted:

It's not a matter of volume ("millions over 8-10 years" is pitiful) or pulling stops. Cache coherence between heterogeneous compute cores is a Hard Problem. It's entirely possible the complexity of making the agents agree on protocol was far greater than either team could manage.

Not to mention pointless if you can just solve it in software.

It's hard, sure, but when you have a stable hardware platform and a supporting compiler/SDK/best practices, a lot of the variables and unknowns that make it "Hard" go away and it becomes a lot more solvable. I'm betting we get cache coherence on BOTH the PS4 and the next Xbox (when using the AMD-supplied tool-chain, of course), and that is probably the thing that sold the AMD APU to Sony and Microsoft for this gen.
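
For the skeptics, "solve it in software" concretely means the CPU writes back its dirty cache lines before handing a buffer to the GPU. A rough sketch of that handoff using x86 intrinsics - gpu_kick() here is a made-up stand-in for whatever submit call the real console SDK provides:

```c
/* Sketch: software-managed coherence for a CPU -> GPU buffer handoff.
 * With no hardware coherence, the CPU must write back its dirty cache
 * lines before the GPU reads the buffer. gpu_kick() is hypothetical.
 * Build: gcc -O2 -msse2 coherence.c
 */
#include <emmintrin.h>   /* _mm_clflush, _mm_mfence (SSE2) */
#include <stddef.h>
#include <stdio.h>

#define CACHE_LINE 64

static void flush_range(const void *buf, size_t len)
{
    const char *p = (const char *)buf;
    for (size_t off = 0; off < len; off += CACHE_LINE)
        _mm_clflush(p + off);   /* write one dirty line back to memory */
    _mm_mfence();               /* order all flushes before the submit */
}

static void gpu_kick(const void *buf, size_t len)
{
    /* hypothetical: the real SDK's command-submission call goes here */
    printf("submitted %zu bytes at %p\n", len, buf);
}

int main(void)
{
    static float buf[1024];
    for (size_t i = 0; i < 1024; i++)
        buf[i] = (float)i * 2.0f;     /* CPU-side work on the buffer */
    flush_range(buf, sizeof buf);     /* make it visible to the GPU */
    gpu_kick(buf, sizeof buf);
    return 0;
}
```

That flush loop is exactly the overhead hardware coherence would make disappear - which is why it matters whether the console APUs actually have it.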

roadhead
Dec 25, 2001

PS4 and next-Xbox news are the only real things going on - and anyone who yawns at the thought of HSA: wtf?

roadhead
Dec 25, 2001

keyvin posted:

So, we know that the PS4 is using AMD, and I suspect the 360 is as well. Do you guys think this will lead to AMD returning as a viable option for PC gaming? I built my first Intel box ever this time around because of AMD's lovely single-core performance. If developers are targeting AMD levels of per-core performance, then it seems to follow that AMD will be acceptable on the PC.

But it's not even the big-boy core that you MIGHT (don't) build a desktop around. It's the low-power Jaguar core that's meant to compete with Atom. Yeah, there are 8 of them, but...


A better bet might be a GCN-based AMD GPU, as the shaders they write for the consoles should also work on the PC side - but all the CPU code will just be C/C++ anyway and at the mercy of the compiler.


If for some reason they did hand-tuned assembly on the consoles' CPUs, the microarchitectures are different enough that it would have to be re-tuned for Atom, Haswell, Bulldozer, Phenom II, etc. Which is why they don't often hand-tune x86 that much anymore.
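
To make the "mercy of the compiler" point concrete: the same portable C gets retargeted per microarchitecture with nothing but a flag, which is exactly why per-uarch assembly stopped being worth maintaining. A toy sketch (the -march values are real GCC targets; the function is just an example):

```c
/* Sketch: one portable C function, tuned per microarchitecture by the
 * compiler instead of by hand:
 *   gcc -O3 -march=btver2    dot.c   # Jaguar (the console cores)
 *   gcc -O3 -march=bdver2    dot.c   # Piledriver
 *   gcc -O3 -march=core-avx2 dot.c   # Haswell
 * Each build gets that chip's vector width and instruction scheduling
 * with zero source changes (add -ffast-math to vectorize the reduction).
 */
#include <stdio.h>

float dot(const float *a, const float *b, int n)
{
    float sum = 0.0f;
    for (int i = 0; i < n; i++)   /* the compiler retargets this loop */
        sum += a[i] * b[i];
    return sum;
}

int main(void)
{
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    printf("%f\n", dot(a, b, 8));
    return 0;
}
```

Swap the flag and the compiler redoes instruction selection and scheduling; redo that by hand across five microarchitectures and you see why nobody bothers.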

roadhead
Dec 25, 2001

If it's like the A8 in my HTPC build, it will game fine at medium/high details at low resolution (720p), but no amount of sacrificing fidelity seems to get decent 1920x1080 frame-rates in any games.

Obviously for desktop/media tasks it's sufficient/overkill.

roadhead
Dec 25, 2001

Boiled Water posted:

That is, to say the least, unlikely.

He didn't say anything about per-watt performance though :)

roadhead
Dec 25, 2001

Malcolm XML posted:

AMD is going up uP UP!

up like 25% in 2 days


much like power consumption on Zen and Polaris

I've been long on AMD so long I got my shares for $2.49

Yea :(

roadhead
Dec 25, 2001

They must not be using QuickSync on the 6700K that is streaming like poo poo, right?

roadhead
Dec 25, 2001

Haquer posted:

I'm still on a Phenom II X6, I feel you

A year ago I finally played musical chairs: pulled the 705e out of the server, moved my 1100T there, and went Intel on my personal desktop for the first time since a 733 MHz Coppermine mounted on a Slocket adapter...


roadhead
Dec 25, 2001

K8.0 posted:

The main time I find myself lacking frames is when I'm playing a game while watching something on my second monitor. That's a very common use that benchmarks aren't going to account for, and where going to 6 or 8 cores would probably have a real impact on performance.

What is it about advancing to the next episode on Netflix that totally kills my frame-rates for about 1 second? I've got 32 gigs of RAM!
