Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Well, Crossfire works, as long as the game supports it, the drivers support the game, you're in full screen, and, if doing Hybrid Crossfire, your GPUs are relatively similar. But if any of those factors aren't taken care of, then yeah, it blows.

syzygy86
Feb 1, 2008

Nalin posted:

As somebody with an A8-3500M with a discrete Radeon 6750M in my laptop, I have to say that it is underwhelming. AMD's drivers just can't do it. I got better performance by flashing a modded BIOS and disabling the GPU core on my CPU. Crossfire just doesn't work.

Would the new AMD Enduro drivers help with this? My understanding is that AMD is finally trying to catch up to nVidia's switchable graphics, but I'm not sure what kind of hardware support is required.
http://www.anandtech.com/show/6243/amds-enduro-switchable-graphics-levels-up

OldPueblo
May 2, 2007

Likes to argue. Wins arguments with ignorant people. Not usually against educated people, just ignorant posters. Bing it.
Is any of this tech making its way into a Win8 tablet?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
No. Brazos might, maybe, to sort-of-compete with the new Atom SoCs (which have a huge advantage in power consumption and platform cost). But Trinity's lowest TDP will be 17W, destined for "ultrathin" laptops. Maybe someone will make a fat-tab around it, like with IVB ULV CPUs, but that'd be a tough sell when tablet gaming largely does not need the type of GPU that Trinity prioritizes over CPU performance. According to AnandTech's testing, the load power consumption is kinda poo poo, too, with the 65W Trinity A8-5600K using 14W more power on Metro 2033 than a 77W i7-3770K with HD 4000, despite idling 7W lower.

--

TechReport is pissed about AMD's staggered NDA-lifts for Trinity, accusing its marketing of trying to exert de facto editorial control. The excerpts of the NDA-lift communication are pretty... well...

The NDA didn't lift on non-gaming benchmarks (i.e. CPU) because:

AMD NDA e-mail posted:

This is in an effort to allow consumers to fully comprehend your analysis without prejudging based on graphs which do not necessarily represent the experiential difference and to help ensure you have sufficient content for the creation of a launch day article.

I think we can read between the lines, there: Piledriver cores blow.

TR is withholding its review on principle, acknowledging that all the buzz has already been set by other outlets' preview articles. It says:

TechReport posted:

[We're sympathetic, b]ut none of that changes the fact that this plan is absolutely, bat-guano crazy. It crosses a line that should not be crossed.

Companies like AMD don't get to decide what gets highlighted in reviews and what doesn't. Using the review press's general willingness to agree on one thing—timing—to get additional control may seem clever, but we've thought it over, and no. We'll keep our independence, thanks.

AMD responded, and TR published it with commentary. TR didn't think it changed much.

TechReport posted:

By exercising this sort of control over the release of independent test data, AMD gets to highlight its product's strengths and downplay its weaknesses. Or, as the firm put it, "The goal was to provide an opportunity to talk about the real-world experience with the product and also highlight the performance in key applications where we are targeting the product." At least they're being upfront about it.

roadhead
Dec 25, 2001

Factory Factory posted:

No. Brazos might, maybe, to sort-of-compete with the new Atom SoCs (which have a huge advantage in power consumption and platform cost). But Trinity's lowest TDP will be 17W, destined for "ultrathin" laptops. Maybe someone will make a fat-tab around it, like with IVB ULV CPUs, but that'd be a tough sell when tablet gaming largely does not need the type of GPU that Trinity prioritizes over CPU performance. According to AnandTech's testing, the load power consumption is kinda poo poo, too, with the 65W Trinity A8-5600K using 14W more power on Metro 2033 than a 77W i7-3770K with HD 4000, despite idling 7W lower.

I'm pretty sure the average person can count on one hand the number of times they've completely pegged their CPU in the last week. Yes, it's the thing everyone thinks they need, but raw CPU performance is "good enough" on any modern processor, in my opinion.

It's the overall experience (balance of CPU and GPU) that AMD is aiming for now (since they'll never catch up in raw CPU performance, obviously), and rightly so. You go where you think you can make a difference.

What I want to know is the actual street date of this Lenovo IdeaPad S405! All the press releases indicate it should be out now but I can't find it anywhere!

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

roadhead posted:

I'm pretty sure the average person can count on one hand the number of times they've completely pegged their CPU in the last week. Yes, it's the thing everyone thinks they need, but raw CPU performance is "good enough" on any modern processor, in my opinion.


You'll still notice an i5 vs. an A6 or whatever. Pegging your CPU is like pegging your internet connection. '100%' only means you were using it at full speed for a whole second.

You might not 'peg' a 5 Mbps connection that often, but that doesn't mean a 20 Mbps connection won't be 4 times faster, and that you won't notice it.
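
(To put rough numbers on that analogy - everything below is invented for illustration, not a benchmark - here's a tiny Python sketch of how a once-per-second utilization average hides short bursts, and why a faster chip still finishes each burst sooner even though neither one ever reads 100%:)

```python
# Made-up numbers to illustrate the analogy above: a once-per-second
# utilization average hides short bursts, but a faster CPU still finishes
# each burst sooner.
burst_ms = 50   # hypothetical workload: ten 50 ms bursts of work per second
bursts = 10

busy_slow = burst_ms * bursts        # 500 ms of actual work
util_slow = busy_slow / 1000         # monitor reads ~50%, never "pegged"

# A chip with twice the single-threaded speed does each burst in 25 ms.
busy_fast = (burst_ms / 2) * bursts  # 250 ms of actual work
util_fast = busy_fast / 1000         # monitor reads ~25%

# Neither chip ever shows 100%, but every click/page load finished in half
# the time on the faster one.
print(f"slower CPU: {util_slow:.0%} average, {burst_ms} ms per burst")
print(f"faster CPU: {util_fast:.0%} average, {burst_ms / 2:.0f} ms per burst")
```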

roadhead
Dec 25, 2001

Bob Morales posted:

You'll still notice an i5 vs. an A6 or whatever. Pegging your CPU is like pegging your internet connection. '100%' only means you were using it at full speed for a whole second.

You might not 'peg' a 5 Mbps connection that often, but that doesn't mean a 20 Mbps connection won't be 4 times faster, and that you won't notice it.

Not when the bottleneck is your HDD, as it is when most people complain of their machine being "slow" - people don't react to the extra seconds (minutes?) it takes to encode an album full of MP3s; they react to the overall responsiveness and usefulness of the machine.

"Can I still browse the web without it herking and jerking while I encode these MP3s," type thinking.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
That's not true at all. You'll notice the difference between an AMD processor and an Intel processor whether the system has a solid-state boot drive or a traditional mechanical one. AMD isn't even competitive in the mobile market now that HD 4000 is out, and Haswell's IGP will just be the final nail in the coffin.

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

The slowest thing these days is probably browsing websites since they have so much drat javascript and flash and poo poo on them.

JawnV6
Jul 4, 2004

So hot ...

roadhead posted:

Not when the bottleneck is your HDD

What century are you posting from??

Devian666
Aug 20, 2008

Take some advice Chris.

Fun Shoe

JawnV6 posted:

What century are you posting from??

I'm pretty sure the transfer rate from wax cylinders is bottlenecking my system. They don't even have SATA connectors.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
I can improve my burst transfer speed by whipping the child-scribes.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I learned to yell louder so that the information travels farther, though as far as I can tell it's done nothing for my transfer speeds per se.

nmfree
Aug 15, 2001

The Greater Goon: Breaking Hearts and Chains since 2006

Devian666 posted:

I'm pretty sure the transfer rate from wax cylinders is bottlenecking my system.
I run mine in RAID0 for speed, what's the worst that could happen?

edit: I think I did that RAID joke right

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
AnandTech has part 2 of desktop Trinity up.

While it's a big bump over Llano, and CPU performance in non-gaming applications is improved and competitive with an i3, discrete GPU gaming is sadsack. Piledriver's improved IPC makes the A10-5800K a better gamer than the FX-8150 Bulldozer octocore, but it still falls short of dual core + HT IVB.

On the overclocking front, the A10-5800K is practically maxed out at its stock 4.2 GHz Turbo (1.45V stock :gonk:). It took ludicrous voltage to push the chip to 4.4 GHz (three times more than it takes my i5-2500K to go from 3.7 GHz top turbo to 4.6 GHz/all cores), and the reviewer could not get the system to boot at 4.5 GHz. For reference, Piledriver at 4.4 GHz manages about the same Cinebench single-threaded performance as Sandy Bridge at 2.9 GHz. The IPC difference is enormous.
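
(Quick back-of-the-envelope on that Cinebench data point, taking the "same single-threaded score" claim at face value - this is just arithmetic on the two clock speeds quoted above:)

```python
# Equal Cinebench 1T scores at these clocks imply per-clock throughput
# differs by the ratio of the clocks (clocks from the post above).
piledriver_ghz = 4.4
sandy_bridge_ghz = 2.9

ipc_ratio = piledriver_ghz / sandy_bridge_ghz  # ~1.52
print(f"Sandy Bridge does ~{ipc_ratio - 1:.0%} more work per clock on that test")
```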

So AMD's "Clock fast, volt hard, and leave a good-looking multi-corpse" strategy has not changed since first-gen Bulldozer. Llano's overclocking peculiarities are gone, but that doesn't get you much because the chip is pretty much maxed out anyway, on the CPU side.

Surprisingly, the chip isn't all bad - the IGP performance is pretty great, the platform idle power is really good, and it includes AES-NI instructions at price points where Intel strips them off. And with RAM prices down, it may be that an A10 actually does end up cheaper than an i3 + Radeon 6570. But that's the silver lining. Bottom line is that the chip is not compelling on the desktop.

Ragingsheep
Nov 7, 2009
All in all, not that surprising I guess.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I actually have to say, these things are remarkably competitive with Intel's current low-end offerings. The A8-5600K in particular seems like a perfectly reasonable option in a low-end system. I do wish the power efficiency was better though, and Intel has a lot of room to cut prices on the i3s or slot in some faster Pentiums.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
Well, at least the APUs no longer have totally laughable performance on the desktop, though unless I absolutely had to game on a computer and had to do so without a discrete graphics card, the CPU is still inferior to a Core i3 overall while using twice as much power at the same price. They are roughly equivalent in multi-threaded tasks, and Intel, as one would expect, dominates the single-threaded tasks. If Intel were to take Alereon's idea and cut prices on the Core i3s by 10-20 bucks, it would do a lot to take the wind out of Trinity's sails. A theoretical 100-dollar Core i3 would really hit the right spot marketing-wise.

Civil
Apr 21, 2003

Do you see this? This means "Have a nice day".

Beautiful Ninja posted:

Well, at least the APUs no longer have totally laughable performance on the desktop, though unless I absolutely had to game on a computer and had to do so without a discrete graphics card, the CPU is still inferior to a Core i3 overall while using twice as much power at the same price. They are roughly equivalent in multi-threaded tasks, and Intel, as one would expect, dominates the single-threaded tasks. If Intel were to take Alereon's idea and cut prices on the Core i3s by 10-20 bucks, it would do a lot to take the wind out of Trinity's sails. A theoretical 100-dollar Core i3 would really hit the right spot marketing-wise.

If you're talking about gaming without a discrete card, there's no question - the i3 pales in comparison. Realistically, both CPUs can push most modern games, but the Trinity GPU kicks the poo poo out of the HD 2500.

This will be a respectable CPU for cheap, small integrated systems. The $400 box you get for your gramma at Best Buy.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."
It kicks the crap out of Intel's HD 4000 graphics as well, actually, going by reviews - over twice as fast in some gaming benchmarks.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Civil posted:

If you're talking about gaming without a discrete card, there's no question - the i3 pales in comparison. Realistically, both CPUs can push most modern games, but the Trinity GPU kicks the poo poo out of the HD 2500.

This will be a respectable CPU for cheap, small integrated systems. The $400 box you get for your gramma at Best Buy.

Just general desktop performance, since that was what the article Anand released today was about. For budget gaming, I'd get an Intel Pentium + Radeon 7750 for 160 bucks, since it's literally double the performance of the Trinity CPU at 130 bucks. Obviously you'd lose out on some multi-threaded performance for desktop applications that make use of it, but I think the trade-off is worthwhile.

Autech
Jun 6, 2008

The Brazilian Legkick Machine
I am probably going to end up buying an A10-5800K. A slight overclock on the CPU, ramp the GPU up to 1000 MHz, and maybe it will keep me ticking over till I can be bothered forking out for a decent graphics card.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
You'd be better off waiting until you can afford a better graphics card and processor rather than throwing away money now. Seriously. Quasi-instant gratification is not worth the exasperation later when your processor doesn't measure up.

fart simpson
Jul 2, 2005

DEATH TO AMERICA
:xickos:

On the other hand, I'm still using an Athlon X4 and I feel exactly 0 exasperation about my processor. I can play my games and use my computer just fine and feel no reason to upgrade now. And I'm guessing if he thinks he can get by for a while with integrated graphics then he's not expecting to max new games at 1440p, either, so he might be just fine.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

MeramJert posted:

On the other hand, I'm still using an Athlon X4 and I feel exactly 0 exasperation about my processor. I can play my games and use my computer just fine and feel no reason to upgrade now. And I'm guessing if he thinks he can get by for a while with integrated graphics then he's not expecting to max new games at 1440p, either, so he might be just fine.

I'm pretty much here too. I've had an Athlon X4 for nearly three years now, and while I wouldn't mind a faster CPU at this point, my graphics were a bigger bottleneck than anything until I recently upgraded that to a 7950, especially on account of screen resolution.

That said, if I was shopping for a new MB/CPU now, instead of probably next year, I'd be looking at an i5 or so.

Killer robot fucked around with this message at 03:36 on Oct 3, 2012

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Autech posted:

I am probably going to end up buying an A10-5800K. A slight overclock on the CPU, ramp the GPU up to 1000 MHz, and maybe it will keep me ticking over till I can be bothered forking out for a decent graphics card.

Get a Pentium + Radeon 7750 if you're looking to buy a budget gaming rig. It will be twice as fast for gaming and cost maybe 30-40 bucks more.

syzygy86
Feb 1, 2008

Beautiful Ninja posted:

Get a Pentium + Radeon 7750 if you're looking to buy a budget gaming rig. It will be twice as fast for gaming and cost maybe 30-40 bucks more.

Yeah, the AnandTech review shows that the Trinity desktop chips really don't hold up well against an Intel chip combined with discrete graphics. Desktop Trinity APUs are really only a decent choice if you know discrete graphics are not going to be used and you're OK with the tradeoffs.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Autech posted:

I am probably going to end up buying an A10-5800K. A slight overclock on the CPU, ramp the GPU up to 1000 MHz, and maybe it will keep me ticking over till I can be bothered forking out for a decent graphics card.

Yeah, I was looking for a reason to like this offering from AMD too - an HTPC maybe, I thought. But ultimately it just isn't going to be any good compared to Intel + discrete card, in any real aspect.

The only place this kind of on-die integration is going to be useful is in making smaller laptops with half-decent graphics.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT

Killer robot posted:

I'm pretty much here too. I've had an Athlon X4 for nearly three years now, and while I wouldn't mind a faster CPU at this point, my graphics were a bigger bottleneck than anything until I recently upgraded that to a 7950, especially on account of screen resolution.

That said, if I was shopping for a new MB/CPU now, instead of probably next year, I'd be looking at an i5 or so.

MeramJert posted:

On the other hand, I'm still using an Athlon X4 and I feel exactly 0 exasperation about my processor. I can play my games and use my computer just fine and feel no reason to upgrade now. And I'm guessing if he thinks he can get by for a while with integrated graphics then he's not expecting to max new games at 1440p, either, so he might be just fine.

I switched from a desktop using an Athlon X4 635 and a 560 Ti to a ThinkPad W530 with a 3720QM and a K2000M. The difference is literally night and day; everything is snappier and generally more responsive. I can't really quantify it, but qualitatively speaking, it's a move worth making, and I'd bet the farm that you're not exasperated because you haven't tried anything better.

Edit: I guess that it's also important to say that if it's good enough for your needs, then you shouldn't upgrade. However, there's no reason to buy an AMD processor outright when you could get a better processor and a discrete card for $50 more that will draw similar amounts of power but offer better performance and longevity.

InstantInfidel fucked around with this message at 14:31 on Oct 3, 2012

fart simpson
Jul 2, 2005

DEATH TO AMERICA
:xickos:

Well obviously if I was building a new computer now, I'd go with Intel. But I have used newer computers with i5 and i7 processors fairly extensively and it's really not a night and day difference to me. It's moderately faster, but honestly it really doesn't matter that much to me (or to most real life people I know). Going from my netbook to my desktop is a night and day difference; the speed discrepancy is so great that it fundamentally changes the kinds of things I can do with my computer. Going from an Athlon X4 640 to an i5 is not; it doesn't unlock any new possibilities for me, it just does everything slightly faster.

That said, I'll probably upgrade my computer in 2013 and unless AMD really pulls something out that nobody is expecting then it will be the first time in a decade that I'll own an Intel-based computer as my primary machine for personal use.

GRINDCORE MEGGIDO
Feb 28, 1985


For sure Intel is the way to go, but going from a 4GHz 965BE to a 4GHz i5 2500K - for general work I can't tell any difference at all. The AMD box feels slow as poo poo now that it inherited my old WD Green drive instead of an SSD, though (no surprise).

I don't regret the upgrade to Intel in the slightest though (gaming, and the Intel box runs quieter).

GRINDCORE MEGGIDO fucked around with this message at 15:53 on Oct 3, 2012

Autech
Jun 6, 2008

The Brazilian Legkick Machine

InstantInfidel posted:

You'd be better off waiting until you can afford a better graphics card and processor rather than throwing away money now. Seriously. Quasi-instant gratification is not worth the exasperation later when your processor doesn't measure up.

That is a fair point. I can afford a high-end system, I just need to stop being stingy. Time to start browsing the build threads :)

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


Autech posted:

That is a fair point. I can afford a high-end system, I just need to stop being stingy. Time to start browsing the build threads :)

There is just one build thread, and it's a sticky. Pick a system from the quick pick list, and you shouldn't be disappointed.

edit: herp, hosed up the link somehow - forgot the closing bracket and didn't catch it in preview.

pixaal fucked around with this message at 19:54 on Oct 3, 2012

Autech
Jun 6, 2008

The Brazilian Legkick Machine

pixaal posted:

There is just one build thread, and its a [url=http://forums.somethingawful.com/showthread.php?threadid=3458091] sticky. Pick a system from the quick pick list, and you shouldn't be disappointed.

Excellent link dude. Thank you very much and bye bye AMD!

(I am one of those "root for the underdog" types but sadly AMD just can't hang with Intel this year :( )

pigdog
Apr 23, 2004

by Smythe

roadhead posted:

I'm pretty sure the average person can count on one hand the number of times they've completely pegged their CPU in the last week. Yes, it's the thing everyone thinks they need, but raw CPU performance is "good enough" on any modern processor, in my opinion.
If you're gaming, then there are substantial differences in overall smoothness of gameplay, even if the differences in GPU-dependent, averaged framerates aren't big. Most of the time there are other bottlenecks in the system, but in spots where it is CPU-dependent, the difference is noticeable. E.g. the time taken to render one frame at a constant 60 FPS is 16.7 ms, and if it takes longer than that, then that's a stutter, a drop in framerate. The graph for the Skyrim timedemo is especially telling.
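
(A minimal sketch of that frame-time idea, with made-up frame times rather than anything from the actual review: average FPS can look perfectly healthy while a handful of CPU-bound frames still blow the 16.7 ms budget and read as stutter.)

```python
# Sketch of the frame-time point above, with invented numbers: a run can
# average comfortably over 60 FPS and still stutter, because a few slow
# frames blow past the per-frame budget.
BUDGET_MS = 1000 / 60  # ~16.7 ms per frame for a constant 60 FPS

# Hypothetical capture: mostly quick frames plus a few CPU-bound spikes.
frame_times_ms = [10] * 20 + [40, 35, 50]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
slow_frames = [t for t in frame_times_ms if t > BUDGET_MS]

print(f"average: {avg_fps:.0f} FPS")  # ~71 FPS, looks fine on paper
print(f"frames over the {BUDGET_MS:.1f} ms budget: {slow_frames}")  # the visible hitches
```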

Nintendo Kid
Aug 4, 2011

by Smythe

pigdog posted:

If you're gaming, then there are substantial differences in overall smoothness of gameplay, even if the differences in GPU-dependent, averaged framerates aren't big. Most of the time there are other bottlenecks in the system, but in spots where it is CPU-dependent, the difference is noticeable. E.g. the time taken to render one frame at a constant 60 FPS is 16.7 ms, and if it takes longer than that, then that's a stutter, a drop in framerate. The graph for the Skyrim timedemo is especially telling.



The fact that the Phenom II X4 980 manages to outperform every other AMD chip gets me every time. How did AMD let something like that happen?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Install Gentoo posted:

The fact that the Phenom II X4 980 manages to outperform every other AMD chip gets me every time. How did AMD let something like that happen?

In terms of the K10 generation, I reckon it takes the crown over the hex-core 1100T because it's the highest-clocked quad-core and hits the sweet spot in terms of the lowest time-management overhead. There probably just aren't many games that can take full advantage of AMD's hex-core processors; they do alright in sequential and very efficiently threaded operations and will show that performance gain over the quad K10s, but it seems like there must be something with cache management or memory access that's bottlenecking the faster part in more dynamic workloads. But Bulldozer, yeah, that's just a total fart of a processor, wrecked the company for a reason. They took a shiiiiiiiitty gamble, bet the farm and lost.
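
(Here's a rough Amdahl's-law sketch of that quad-vs-hex point - the 60% parallel fraction is invented for illustration and the clocks are just approximate stock figures - showing how cores beyond a game's scaling limit buy little while clock speed helps everything:)

```python
# Rough Amdahl's-law illustration of the quad-vs-hex point above.
# The parallel fraction is assumed; clocks are approximate stock values.

def relative_perf(parallel_fraction, cores, clock_ghz):
    """Throughput relative to a single 1 GHz core."""
    serial = 1 - parallel_fraction
    time_per_unit = serial + parallel_fraction / cores
    return clock_ghz / time_per_unit

game_parallel = 0.6  # assume the game only scales across ~60% of its work

x4_980   = relative_perf(game_parallel, cores=4, clock_ghz=3.7)  # higher-clocked quad
x6_1100t = relative_perf(game_parallel, cores=6, clock_ghz=3.3)  # lower-clocked hex

print(f"X4 980-ish:   {x4_980:.2f}")    # ~6.7
print(f"X6 1100T-ish: {x6_1100t:.2f}")  # ~6.6 - two extra cores don't make up the clock
```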

I am more interested in the high-end processors from Intel on the graph. I don't think a 100 MHz bump alone could account for the 2600K spending half as much time holding up the GPU compared to the 2500K; is hyperthreading really that efficient at helping the execution pipeline do its thing? I'm all for it, running a 2600K and all, but it still surprises me that essentially the same processor with a minor clock increase gets that much extra performance out of hyperthreading.

Just efficiency of resource utilization, I guess. I feel jealous of Ivy Bridge, though. Stupid ever-increasing efficiency, how am I supposed to have top-end hardware when they keep making more of it and better all the time? :mad:

Agreed fucked around with this message at 23:22 on Oct 3, 2012

nmfree
Aug 15, 2001

The Greater Goon: Breaking Hearts and Chains since 2006

Autech posted:

(I am one of those "root for the underdog" types but sadly AMD just can't hang with Intel this year :( )
I've been using AMD systems since I built my first desktop in 1999 - partially because of that sentiment and partially because they've had some competitive products, but there comes a time when a person has to accept reality. The graph in pigdog's post just tidily sums up how bad it's gotten in the current generation.

Arivia
Mar 17, 2011
So here's something hilarious I just found out in my British Imperial Literature class: Because of the famous Dr. Livingstone's mess of a river expedition, to "Zambezi" something means to turn it into a giant boondoggle, mess, or in modern terms, clusterfuck.

Quite the pedigree AMD had going there.

chocolateTHUNDER
Jul 19, 2008

GIVE ME ALL YOUR FREE AGENTS

ALL OF THEM
Welp, I wanted to stick a Trinity in my soon-to-be-built HTPC/light gaming PC, but man, those results...

...guess I'll have to go with an Intel-based system :( Sorry AMD, I really wanted to like you.
