|
Well, Crossfire works, as long as the game supports it, the drivers support the game, you're in full screen, and, if doing Hybrid Crossfire, your GPUs are relatively similar. But if any of those factors aren't taken care of, then yeah, it blows.
|
# ? Oct 1, 2012 05:20 |
|
Nalin posted:As somebody with an A8-3500M with a discrete Radeon 6750M in my laptop, I have to say that it is underwhelming. AMD's drivers just can't do it. I got better performance by flashing a modded bios and disabling the gpu core on my cpu. Crossfire just doesn't work. Would the new AMD Enduro drivers help with this? My understanding is that AMD is finally trying to catch up to nVidia's switchable graphics, but I'm not sure what kind of hardware support is required. http://www.anandtech.com/show/6243/amds-enduro-switchable-graphics-levels-up
|
# ? Oct 1, 2012 06:04 |
|
Is any of this tech making its way into a Win8 tablet?
|
# ? Oct 1, 2012 08:50 |
|
No. Brazos might, maybe, to sort-of compete with the new Atom SoCs (which have a huge advantage in power consumption and platform cost). But Trinity's lowest TDP will be 17W, destined for "ultrathin" laptops. Maybe someone will make a fat-tab around it, like with IVB ULV CPUs, but that'd be a tough sell when tablet gaming largely doesn't need the type of GPU that Trinity prioritizes over CPU performance. According to AnandTech's testing, the load power consumption is kinda poo poo, too, with the 65W Trinity A8-5600K drawing 14W more power in Metro 2033 than a 77W i7-3770K with HD 4000, despite idling 7W lower.

In other news, TechReport is pissed about AMD's staggered NDA lifts for Trinity, accusing its marketing of trying to exert de facto editorial control. The excerpts of the NDA-lift communication are pretty... well... The NDA didn't lift on non-gaming (i.e. CPU) benchmarks because:

AMD NDA e-mail posted:This is in an effort to allow consumers to fully comprehend your analysis without prejudging based on graphs which do not necessarily represent the experiential difference and to help ensure you have sufficient content for the creation of a launch day article.

I think we can read between the lines there: Piledriver cores blow. TR is withholding its review on principle, while acknowledging that all the buzz has already been set by other outlets' preview articles. It says:

TechReport posted:[We're sympathetic, b]ut none of that changes the fact that this plan is absolutely, bat-guano crazy. It crosses a line that should not be crossed.

AMD responded, and TR published the response with commentary. TR didn't think it changed much.

TechReport posted:By exercising this sort of control over the release of independent test data, AMD gets to highlight its product's strengths and downplay its weaknesses. 
Or, as the firm put it, "The goal was to provide an opportunity to talk about the real-world experience with the product and also highlight the performance in key applications where we are targeting the product." At least they're being upfront about it.
|
# ? Oct 1, 2012 10:39 |
|
Factory Factory posted:No. Brazos might, maybe, to sort-of compete with the new Atom SoCs (which have a huge advantage in power consumption and platform cost). But Trinity's lowest TDP will be 17W, destined for "ultrathin" laptops. Maybe someone will make a fat-tab around it, like with IVB ULV CPUs, but that'd be a tough sell when tablet gaming largely doesn't need the type of GPU that Trinity prioritizes over CPU performance. According to AnandTech's testing, the load power consumption is kinda poo poo, too, with the 65W Trinity A8-5600K drawing 14W more power in Metro 2033 than a 77W i7-3770K with HD 4000, despite idling 7W lower. I'm pretty sure the average person can count on one hand the number of times they've completely pegged their CPU in the last week. Yes, it's the thing everyone thinks they need, but raw CPU performance is "good enough" on any modern processor, in my opinion. It's the overall experience (the balance of CPU and GPU) that AMD is aiming for now (since they'll obviously never catch up in raw CPU performance), and rightly so. You go where you think you can make a difference. What I want to know is the actual street date of this Lenovo IdeaPad S405! All the press releases indicate it should be out now, but I can't find it anywhere!
|
# ? Oct 1, 2012 20:02 |
|
roadhead posted:I'm pretty sure the average person can count on one hand the number of times they've completely pegged their CPU in the last week. Yes, it's the thing everyone thinks they need, but raw CPU performance is "good enough" on any modern processor, in my opinion. You'll still notice an i5 vs. an A6 or whatever. Pegging your CPU is like pegging your internet connection: '100%' only means you were using it at full speed for a whole second. You might not 'peg' a 5Mb connection that often, but that doesn't mean a 20Mb connection won't be four times faster, and that you won't notice it.
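(Editor's sketch of the sampling point above, with made-up numbers: a one-second utilization average hides how long a burst actually takes, so a faster chip still finishes bursts sooner and feels snappier even when neither machine ever reads 100%.)

```python
def burst_latency_ms(work_units, units_per_ms):
    """Time to finish a fixed burst of work at a given throughput."""
    return work_units / units_per_ms

def one_second_utilization(work_units, units_per_ms):
    """What a 1-second averaging window would report for that burst."""
    busy_ms = burst_latency_ms(work_units, units_per_ms)
    return min(busy_ms / 1000.0, 1.0)

# Hypothetical throughputs - only the ratio matters, not real hardware.
work = 200  # units of work in one UI-triggered burst
for name, speed in [("slow", 1.0), ("fast", 4.0)]:  # units per ms
    print(name, f"{burst_latency_ms(work, speed):.0f} ms busy,",
          f"{one_second_utilization(work, speed):.0%} utilization")
# slow: 200 ms busy, 20% utilization
# fast: 50 ms busy, 5% utilization
```

Neither box "pegs" over the second, but the fast one hands control back four times sooner, which is what the user actually feels.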
|
# ? Oct 1, 2012 20:18 |
|
Bob Morales posted:You'll still notice an i5 vs. an A6 or whatever. Pegging your CPU is like pegging your internet connection: '100%' only means you were using it at full speed for a whole second. Not when the bottleneck is your HDD, as it is when most people complain of their machine being "slow" - people don't react to the extra seconds (minutes?) it takes to encode an album full of MP3s, they react to the overall responsiveness and usefulness of the machine. "Can I still browse the web without it herking and jerking while I encode these MP3s?" type thinking.
|
# ? Oct 1, 2012 22:33 |
|
That's not true at all. You'll notice the difference between an AMD processor and an Intel processor whether the system has a solid-state boot drive or a traditional mechanical one. AMD isn't even competitive in the mobile market since the release of HD 4000, and Haswell's IGP will just be the final nail in the coffin.
|
# ? Oct 1, 2012 23:15 |
|
The slowest thing these days is probably browsing websites since they have so much drat javascript and flash and poo poo on them.
|
# ? Oct 1, 2012 23:43 |
|
roadhead posted:Not when the bottleneck is your HDD What century are you posting from??
|
# ? Oct 2, 2012 00:20 |
|
JawnV6 posted:What century are you posting from?? I'm pretty sure the transfer rate from wax cylinders is bottlenecking my system. They don't even have SATA connectors.
|
# ? Oct 2, 2012 01:00 |
|
I can improve my burst transfer speed by whipping the child-scribes.
|
# ? Oct 2, 2012 03:48 |
|
I learned to yell louder so that the information travels farther though as far as I can tell it's done nothing for my transfer speeds per se.
|
# ? Oct 2, 2012 04:08 |
|
Devian666 posted:I'm pretty sure the transfer rate from wax cylinders is bottlenecking my system. edit: I think I did that RAID joke right
|
# ? Oct 2, 2012 05:58 |
|
AnandTech has part 2 of desktop Trinity up. While it's a big bump over Llano, and CPU performance in non-gaming applications is improved and competitive with an i3, discrete-GPU gaming is sadsack. Piledriver's improved IPC makes the A10-5800K a better gamer than the FX-8150 Bulldozer octocore, but it still falls short of dual-core-plus-HT IVB.

On the overclocking front, the A10-5800K is practically maxed out at its stock 4.2 GHz Turbo (1.45V stock). It took ludicrous voltage to push the chip to 4.4 GHz (three times more than it takes my i5-2500K to go from its 3.7 GHz top turbo to 4.6 GHz on all cores), and the reviewer could not get the system to boot at 4.5 GHz. For reference, Piledriver at 4.4 GHz manages about the same Cinebench single-threaded performance as Sandy Bridge at 2.9 GHz. The IPC difference is enormous. So AMD's "clock fast, volt hard, and leave a good-looking multi-corpse" strategy has not changed since first-gen Bulldozer. Llano's overclocking peculiarities are gone, but that doesn't get you much because the chip is pretty much maxed out anyway on the CPU side.

Surprisingly, the chip isn't all bad - the IGP performance is pretty great, the platform idle power is really good, and it includes AES-NI instructions at price points where Intel strips them off. And with RAM prices down, it may be that an A10 actually ends up cheaper than an i3 + Radeon 6570. But that's the silver lining. Bottom line: the chip is not compelling on the desktop.
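(Editor's sketch: the size of that IPC gap falls straight out of the clock ratio - if two chips tie on the same single-threaded score, per-clock throughput scales as the inverse of frequency. Numbers are the ones quoted above.)

```python
# Equal Cinebench 1T scores imply: ipc_PD * 4.4 GHz == ipc_SNB * 2.9 GHz,
# so Sandy Bridge's per-clock throughput relative to Piledriver is 4.4/2.9.
piledriver_ghz = 4.4
sandy_bridge_ghz = 2.9

ipc_ratio = piledriver_ghz / sandy_bridge_ghz  # SNB IPC relative to PD
print(f"Sandy Bridge does ~{ipc_ratio:.2f}x the work per clock")  # ~1.52x
```

In other words, Piledriver needs roughly 50% more clock to break even on a single thread, which is why "clock fast, volt hard" is the only lever AMD has left here.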
|
# ? Oct 2, 2012 08:56 |
|
All in all, not that surprising I guess.
|
# ? Oct 2, 2012 09:08 |
|
I actually have to say, these things are remarkably competitive with Intel's current low-end offerings. The A8-5600K in particular seems like a perfectly reasonable option in a low-end system. I do wish the power efficiency was better though, and Intel has a lot of room to cut prices on the i3s or slot in some faster Pentiums.
|
# ? Oct 2, 2012 10:11 |
|
Well, at least the APUs no longer have totally laughable performance on the desktop, though unless I absolutely had to game on a computer and had to do so without a discrete graphics card, the CPU is still inferior to a Core i3 overall while using twice as much power at the same price. They are roughly equivalent in multi-threaded tasks, and Intel, as one would expect, dominates the single-threaded tasks. If Intel were to take Aleron's idea and cut prices on the Core i3s by 10-20 bucks, it would do a lot to take the wind out of Trinity's sails. A theoretical 100-dollar Core i3 would really hit the right spot, marketing-wise.
|
# ? Oct 2, 2012 13:23 |
|
Beautiful Ninja posted:Well, at least the APUs no longer have totally laughable performance on the desktop, though unless I absolutely had to game on a computer and had to do so without a discrete graphics card, the CPU is still inferior to a Core i3 overall while using twice as much power at the same price. They are roughly equivalent in multi-threaded tasks, and Intel, as one would expect, dominates the single-threaded tasks. If Intel were to take Aleron's idea and cut prices on the Core i3s by 10-20 bucks, it would do a lot to take the wind out of Trinity's sails. A theoretical 100-dollar Core i3 would really hit the right spot, marketing-wise. If you're talking about gaming without a discrete card, there's no question - the i3 pales in comparison. Realistically, both CPUs can push most modern games, but the Trinity GPU kicks the poo poo out of the HD 2500. This will be a respectable CPU for cheap, small integrated systems. The $400 box you get for your gramma at Best Buy.
|
# ? Oct 2, 2012 20:36 |
|
Going by reviews, it actually kicks the crap out of Intel's HD 4000 graphics too - over twice as fast in some gaming benchmarks.
|
# ? Oct 2, 2012 20:39 |
|
Civil posted:If you're talking about gaming without a discrete card, there's no question - the i3 pales in comparison. Realistically, both CPUs can push most modern games, but the Trinity GPU kicks the poo poo out of the HD 2500. Just general desktop performance, since that was what the article AnandTech released today was about. For budget gaming, I'd get an Intel Pentium + Radeon 7750 for 160 bucks, since it's literally double the gaming performance of the 130-buck Trinity. Obviously you'd lose out on some multi-threaded performance for desktop applications that make use of it, but I think the trade-off is worthwhile.
|
# ? Oct 2, 2012 21:05 |
|
I am probably going to end up buying an A10-5800K. A slight overclock on the CPU, ramp the GPU up to 1000MHz, and maybe it will keep me ticking over till I can be bothered forking out for a decent graphics card.
|
# ? Oct 3, 2012 00:20 |
|
You'd be better off waiting until you can afford a better graphics card and processor rather than throwing away money now. Seriously. Quasi-instant gratification is not worth the exasperation later when your processor doesn't measure up.
|
# ? Oct 3, 2012 02:31 |
|
On the other hand, I'm still using an Athlon X4 and I feel exactly 0 exasperation about my processor. I can play my games and use my computer just fine and feel no reason to upgrade now. And I'm guessing if he thinks he can get by for a while with integrated graphics then he's not expecting to max new games at 1440p, either, so he might be just fine.
|
# ? Oct 3, 2012 03:21 |
|
MeramJert posted:On the other hand, I'm still using an Athlon X4 and I feel exactly 0 exasperation about my processor. I can play my games and use my computer just fine and feel no reason to upgrade now. And I'm guessing if he thinks he can get by for a while with integrated graphics then he's not expecting to max new games at 1440p, either, so he might be just fine. I'm pretty much here too. I've had an Athlon X4 for nearly three years now, and while I wouldn't mind a faster CPU at this point, my graphics were a bigger bottleneck than anything until I recently upgraded that to a 7950, especially on account of screen resolution. That said, if I was shopping for a new MB/CPU now, instead of probably next year, I'd be looking at an i5 or so. Killer robot fucked around with this message at 03:36 on Oct 3, 2012 |
# ? Oct 3, 2012 03:34 |
|
Autech posted:I am probably going to end up buying an A10-5800K. A slight overclock on the CPU, ramp the GPU up to 1000MHz, and maybe it will keep me ticking over till I can be bothered forking out for a decent graphics card. Get a Pentium + Radeon 7750 if you're looking to buy a budget gaming rig. It will be twice as fast for gaming and cost maybe 30-40 bucks more.
|
# ? Oct 3, 2012 03:38 |
|
Beautiful Ninja posted:Get a Pentium + Radeon 7750 if you're looking to buy a budget gaming rig. It will be twice as fast for gaming and cost maybe 30-40 bucks more. Yeah, the AnandTech review shows that the Trinity desktop chips really don't hold up well against an Intel chip combined with discrete graphics. Desktop Trinity APUs are really only a decent choice if you know discrete graphics are not going to be used and you're OK with the tradeoffs.
|
# ? Oct 3, 2012 03:55 |
|
Autech posted:I am probably going to end up buying an A10-5800K. A slight overclock on the CPU, ramp the GPU up to 1000MHz, and maybe it will keep me ticking over till I can be bothered forking out for a decent graphics card. Yeah, I was looking for a reason to like this offering from AMD too - an HTPC, maybe, I thought. But ultimately it just isn't going to be any good compared to Intel + a discrete card in any real aspect. The only place this kind of on-die integration is going to be useful is in making smaller laptops with half-decent graphics.
|
# ? Oct 3, 2012 09:30 |
|
Killer robot posted:I'm pretty much here too. I've had an Athlon X4 for nearly three years now, and while I wouldn't mind a faster CPU at this point, my graphics were a bigger bottleneck than anything until I recently upgraded that to a 7950, especially on account of screen resolution. MeramJert posted:On the other hand, I'm still using an Athlon X4 and I feel exactly 0 exasperation about my processor. I can play my games and use my computer just fine and feel no reason to upgrade now. And I'm guessing if he thinks he can get by for a while with integrated graphics then he's not expecting to max new games at 1440p, either, so he might be just fine. I switched from a desktop using an Athlon X4 635 and a 560Ti to a Thinkpad W530 with a 3720QM and a K2000M. The difference is literally night and day: everything is snappier and generally more responsive. I can't really quantify it, but qualitatively speaking, it's a move worth making, and I'd bet the farm that you're not exasperated because you haven't tried anything better. Edit: I guess it's also important to say that if it's good enough for your needs, then you shouldn't upgrade. However, there's no reason to buy an AMD processor outright when you could get a better processor and a discrete card for $50 more that will draw similar amounts of power but offer better performance and longevity. InstantInfidel fucked around with this message at 14:31 on Oct 3, 2012 |
# ? Oct 3, 2012 13:31 |
|
Well, obviously if I was building a new computer now, I'd go with Intel. But I have used newer computers with i5 and i7 processors fairly extensively and it's really not a night-and-day difference to me. It's moderately faster, but honestly it really doesn't matter that much to me (or to most real-life people I know). Going from my netbook to my desktop is a night-and-day difference; the speed discrepancy is so great that it fundamentally changes the kinds of things I can do with my computer. Going from an Athlon X4 640 to an i5 is not; it doesn't unlock any new possibilities for me, it just does everything slightly faster. That said, I'll probably upgrade my computer in 2013, and unless AMD really pulls out something nobody is expecting, it will be the first time in a decade that I'll own an Intel-based computer as my primary machine for personal use.
|
# ? Oct 3, 2012 15:17 |
|
For sure Intel is the way to go, but going from a 4GHz 965BE to a 4GHz i5-2500K - for general work I can't tell any difference at all. The AMD box feels slow as poo poo now that it inherited my old WD Green drive instead of an SSD, though (no surprise). I don't regret the upgrade to Intel in the slightest (gaming, and the Intel box runs quieter). GRINDCORE MEGGIDO fucked around with this message at 15:53 on Oct 3, 2012 |
# ? Oct 3, 2012 15:32 |
|
InstantInfidel posted:You'd be better off waiting until you can afford a better graphics card and processor rather than throwing away money now. Seriously. Quasi-instant gratification is not worth the exasperation later when your processor doesn't measure up. That is a fair point. I can afford a high-end system, I just need to stop being stingy. Time to start browsing the build threads
|
# ? Oct 3, 2012 17:29 |
|
Autech posted:That is a fair point. I can afford a high-end system, I just need to stop being stingy. Time to start browsing the build threads There is just one build thread, and it's a sticky. Pick a system from the quick-pick list and you shouldn't be disappointed. edit: herp, hosed up the link somehow - forgot the closing bracket and didn't catch it in preview. pixaal fucked around with this message at 19:54 on Oct 3, 2012 |
# ? Oct 3, 2012 17:54 |
|
pixaal posted:There is just one build thread, and it's a [url=http://forums.somethingawful.com/showthread.php?threadid=3458091] sticky. Pick a system from the quick-pick list and you shouldn't be disappointed. Excellent link, dude. Thank you very much, and bye bye AMD! (I am one of those "root for the underdog" types, but sadly AMD just can't hang with Intel this year )
|
# ? Oct 3, 2012 19:29 |
|
roadhead posted:I'm pretty sure the average person can count on one hand the number of times they've completely pegged their CPU in the last week. Yes its the thing everyone thinks they need, but raw CPU performance is "good enough" on any modern processor in my opinion.
|
# ? Oct 3, 2012 21:02 |
|
pigdog posted:If you're gaming, then there are substantial differences in overall smoothness of gameplay, even if the differences in GPU-dependent, averaged framerates aren't big. Most of the time there are other bottlenecks in the system, but in spots where it is CPU-dependent, the difference is noticeable. E.g. the time taken to render one frame at a constant 60 FPS is 16.7 ms, and if it takes longer than that, then that's a stutter, a drop in framerate. The graph on the Skyrim timedemo is especially telling. The fact that the Phenom II X4 980 manages to outperform every other AMD chip gets me every time. How did AMD let something like that happen?
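(Editor's sketch of the frame-time arithmetic in that quote, using made-up frame times: the per-frame budget at 60 FPS is 1000/60 ≈ 16.7 ms, and a trace can average close to 60 FPS while still containing visible stutters.)

```python
def frame_budget_ms(target_fps):
    """Per-frame time budget in milliseconds for a target framerate."""
    return 1000.0 / target_fps

def count_stutters(frame_times_ms, target_fps=60):
    """Count frames that blew past the budget for the target framerate."""
    budget = frame_budget_ms(target_fps)
    return sum(1 for t in frame_times_ms if t > budget)

# Hypothetical frame-time trace (ms): the average FPS looks fine, but two
# frames take roughly 2-3x the budget and show up as visible hitches.
trace = [12, 12, 12, 33, 12, 12, 12, 40, 12, 12]
avg_fps = 1000.0 * len(trace) / sum(trace)
print(f"budget: {frame_budget_ms(60):.1f} ms")  # 16.7 ms
print(f"stutters: {count_stutters(trace)}")     # 2
print(f"avg FPS: {avg_fps:.0f}")                # 59
```

This is exactly why frame-time graphs expose CPU differences that averaged-FPS bar charts hide.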
|
# ? Oct 3, 2012 22:08 |
|
Install Gentoo posted:The fact that the Phenom II X4 980 manages to outperform every other AMD chip gets me everytime. How did AMD let something like that happen? In terms of the K10 generation, I reckon it takes the crown over the hex-core 1100T because it's the highest clocked 4-core and hits the sweet spot in terms of the lowest time management overhead. There probably just aren't many games that can take full advantage of AMD's hex-core processors, they do alright in sequential and very efficiently threaded operations and will show that performance gain over the quad K10s, but it seems like there must be something with cache management or memory access that's bottlenecking the faster part in more dynamic workloads. But Bulldozer, yeah, that's just a total fart of a processor, wrecked the company for a reason. They took a shiiiiiiiitty gamble, bet the farm and lost. I am more interested in the high end processors from Intel on the graph. I don't think the 100MHz bump could account for half as much time spent catching up to the GPU compared to the 2500K; is hyperthreading really that efficient in helping the execution pipeline do its thing? I'm all for it, running a 2600K and all, but still it surprises me that essentially the same processor with a minor clock increase gets that much extra performance out of hyperthreading. Just efficiency of resource utilization, I guess. I feel jealous of Ivy Bridge, though. Stupid ever-increasing efficiency, how am I supposed to have top-end hardware when they keep making more of it and better all the time? Agreed fucked around with this message at 23:22 on Oct 3, 2012 |
# ? Oct 3, 2012 23:20 |
|
Autech posted:(I am one of those "root for the underdog" types but sadly AMD just can't hang with the Intel this year )
|
# ? Oct 4, 2012 05:27 |
|
So here's something hilarious I just found out in my British Imperial Literature class: Because of the famous Dr. Livingstone's mess of a river expedition, to "Zambezi" something means to turn it into a giant boondoggle, mess, or in modern terms, clusterfuck. Quite the pedigree AMD had going there.
|
# ? Oct 9, 2012 19:49 |
|
Welp, I wanted to stick a Trinity in my soon-to-be-built HTPC/light gaming PC, but man, those results... ...guess I'll have to go with an Intel-based system. Sorry AMD, I really wanted to like you.
|
# ? Oct 10, 2012 02:11 |