Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

Cybernetic Vermin posted:

I think you are vastly underestimating the number of extremely pessimistic global locks in software today. There is bound to be a lot of software that goes from being merely "multi-threaded" to scaling almost linearly in some operations once memory transactions are cheap.
transactional memory is a cool band-aid on top of bad programming models. if you're writing pthreads today (and you're not writing a language runtime or something similarly low-level) you are probably doing your job really badly. learn how to track dependencies and avoid shared state; learn better programming models!
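
A minimal sketch of that "avoid shared state" model, assuming nothing beyond the C++ standard library; the function names and chunking scheme are illustrative, not from any particular codebase:

code:
// Each task owns private state; the only shared step is joining the
// futures, so there is no lock for the workers to contend on.
#include <future>
#include <numeric>
#include <vector>

long long sum_range(const std::vector<int>& data, size_t lo, size_t hi) {
    // Private partial result: no locks, no shared mutable state.
    return std::accumulate(data.begin() + lo, data.begin() + hi, 0LL);
}

long long parallel_sum(const std::vector<int>& data, size_t chunks) {
    std::vector<std::future<long long>> parts;
    const size_t step = data.size() / chunks;
    for (size_t i = 0; i < chunks; ++i) {
        size_t lo = i * step;
        size_t hi = (i + 1 == chunks) ? data.size() : lo + step;
        parts.push_back(std::async(std::launch::async, sum_range,
                                   std::cref(data), lo, hi));
    }
    long long total = 0;  // the join is the only "dependency" to track
    for (auto& f : parts) total += f.get();
    return total;
}

int main() {
    std::vector<int> data(1 << 20, 1);
    return parallel_sum(data, 8) == (1 << 20) ? 0 : 1;
}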

edit: you're basically arguing that there are a lot of applications that are inherently parallel, CPU-bound, and extremely impacted by locking overhead on some shared object. those are the only cases where transactional memory could theoretically make a performance difference. what applications would those be? also, please note that Haswell TSX won't solve those problems, since any app that relies on a monolithic global lock can't fit all of its shared state in L1, so its transactions will just abort.
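
For reference, the TSX (RTM) pattern under discussion looks roughly like this; a hedged sketch, assuming a TSX-capable CPU and g++ with -mrtm, with the lock and counter names invented for illustration:

code:
#include <immintrin.h>
#include <atomic>

std::atomic<bool> big_lock{false};  // stand-in for the "monolithic global lock"
long shared_counter = 0;

void lock_fallback() {
    while (big_lock.exchange(true, std::memory_order_acquire)) { /* spin */ }
}
void unlock_fallback() { big_lock.store(false, std::memory_order_release); }

void increment_elided() {
    unsigned status = _xbegin();          // start a hardware transaction
    if (status == _XBEGIN_STARTED) {
        // Reading the lock word puts it in our read set, so a thread taking
        // the real lock aborts us instead of racing us.
        if (big_lock.load(std::memory_order_relaxed)) _xabort(0xff);
        ++shared_counter;                 // no lock traffic on the fast path
        _xend();                          // commit
    } else {
        // Conflict or capacity abort (e.g. the write set spilled out of L1,
        // which is the objection above): take the real lock instead.
        lock_fallback();
        ++shared_counter;
        unlock_fallback();
    }
}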

the transactional memory people have been tooting this horn for ten years; if it were actually as amazing as all the academic papers claim, Rock probably wouldn't have killed Sun

Professor Science fucked around with this message at 00:36 on Feb 24, 2013

Cybernetic Vermin
Apr 18, 2005

Professor Science posted:

you're basically arguing that there are a lot of applications that are inherently parallel, CPU-bound, and extremely impacted by locking overhead on some shared object. those are the only cases where transactional memory could theoretically make a performance difference. what applications would those be? also, please note that Haswell TSX won't solve those problems, since any app that relies on a monolithic global lock can't fit all of its shared state in L1, so its transactions will just abort.

Right, that is the claim I am making (assuming that you are saying "locking overhead" not to mean overhead but rather actual lock contention). It is hard to have any real statistical foundation for this, but I have worked on a fair bit of software myself where a lot of time is spent waiting on a lock on some huge collection where each thread will touch only a tiny random subset of the data held. It is not that this situation occurs because finer-grained locking is that hard; it is just one of those things one knows is very hard to reason about with perfect certainty, so locks tend to be very pessimistic to be sure they cover enough.

To some extent, the way to view this is that a perfectly successful transactional memory implementation will give you the performance of the finest-grained locking possible with the bug-resistance (and effort) of coarse locking.
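
The manual alternative being weighed against TM here is lock striping: the one pessimistic lock over the huge collection becomes N locks over hash shards, so threads touching disjoint keys rarely contend. A sketch, with the class name and shard count purely illustrative:

code:
#include <array>
#include <mutex>
#include <unordered_map>

template <typename K, typename V, size_t Shards = 64>
class ShardedMap {
    struct Shard {
        std::mutex m;
        std::unordered_map<K, V> data;
    };
    std::array<Shard, Shards> shards_;

    Shard& shard_for(const K& key) {
        return shards_[std::hash<K>{}(key) % Shards];
    }

public:
    void put(const K& key, V value) {
        Shard& s = shard_for(key);
        std::lock_guard<std::mutex> g(s.m);  // contention limited to one shard
        s.data[key] = std::move(value);
    }

    bool get(const K& key, V& out) {
        Shard& s = shard_for(key);
        std::lock_guard<std::mutex> g(s.m);
        auto it = s.data.find(key);
        if (it == s.data.end()) return false;
        out = it->second;
        return true;
    }
};

int main() {
    ShardedMap<int, int> m;
    m.put(42, 7);
    int v = 0;
    return (m.get(42, v) && v == 7) ? 0 : 1;
}
The TM pitch is that you get roughly this behavior without having to pick a shard count or convince yourself the sharding is correct.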

Your phrasing of my statement is a bit disingenuous, though: of course I am talking about CPU-bound software, since I have no idea what you expect Intel or AMD to put in their CPUs to improve the situation for software that is not CPU-bound. It should also be clear that any discussion is in terms of successfully eliminating false lock dependencies, since no parallel processing technology will help when there are real dependencies. If you find a huge step towards perfect extraction of parallelism amazing, then fine. I have no idea how you imagine that transactional memory on Rock would have saved Sun, since they very notably didn't actually manage to ship a Rock CPU, a recurring phenomenon where Sun hardware promises were concerned (on the other hand they very successfully wasted money on unprofitable stuff like Java and OpenOffice).

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
Crysis 3 does in fact favor AMD CPUs, according to this test from yesterday. This is kinda cool; maybe it means I should go for hyperthreading when I upgrade to a Skylake/AMD-equivalent chip later, if these trends continue?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Keep in mind that that's a rather odd mix of overclocked and non-overclocked processors; I wouldn't read too much into it until we get repeatable, correctly tested results from reputable English-language sites.

Star War Sex Parrot
Oct 2, 2003

So what's new at AMD?

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
Waiting for Kabini and HSA to blow us away, I guess.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
The 8970M is out (a 7970M with boost clocking). Richland and Kaveri are starting to sneak out: Richland is just showing up now in gaming notebooks, a higher-clocked A10 in the same power envelope, and Kaveri is coming next June as the next APU revision with HSA junk.

Yawn.

roadhead
Dec 25, 2001

PS4 and Next-Box news are the only things really - and anyone who yawns at the thought of HSA - wtf?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Richland is literally just a higher-clocked Trinity.

Kaveri might be nice, what with GCN and HSA and an on-chip ARM core, but compared to Haswell, which is coming out at about the same time, we know practically nothing about it. Except that it's Gen 3 Bulldozer. Big deal? Who knows?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
https://twitter.com/anandshimpi/status/335518239830454272

Apparently we may see the NDA lift on Jaguar next week. Brazos was exciting and popular; Brazos 2.0, not so much. Here comes Tiny APU For Baby Computers Gen 3.

Maxwell Adams
Oct 21, 2000

T E E F S
There are benchmarks of Temash out there already.

http://www.notebookcheck.net/Review-AMD-A6-1450-APU-Temash.92264.0.html

It compares pretty well to Atom. No sign of Kabini yet, though.

Yudo
May 15, 2003

Maxwell Adams posted:

There are benchmarks of Temash out there already.

http://www.notebookcheck.net/Review-AMD-A6-1450-APU-Temash.92264.0.html

It compares pretty well to Atom. No sign of Kabini yet, though.

This is, for me, a really interesting part: low power, low heat, etc., but not quite as useless as the current Atom or as expensive as an i3. It would make a nice traveling companion that could do real work.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Anandtech's AMD Jaguar architecture article and AMD A4-5000 "Kabini" review are live.

SYSV Fanfic
Sep 9, 2003

by Pragmatica
So, we know that the PS4 is using AMD, and I suspect the next Xbox is as well. Do you guys think this will lead to AMD returning as a viable option for PC gaming? I built my first Intel box ever this time around because of AMD's lovely single-core performance. If developers are targeting AMD levels of per-core performance, then it seems to follow that AMD will be acceptable on the PC.

roadhead
Dec 25, 2001

keyvin posted:

So, we know that the PS4 is using AMD, and I suspect the next Xbox is as well. Do you guys think this will lead to AMD returning as a viable option for PC gaming? I built my first Intel box ever this time around because of AMD's lovely single-core performance. If developers are targeting AMD levels of per-core performance, then it seems to follow that AMD will be acceptable on the PC.

But it's not even the Big-Boy core that you MIGHT (don't) build a desktop around. It's the low-power Jaguar core that's meant to compete with Atom. Yeah, there are 8 of them, but...


A better bet might be a GCN-based AMD GPU, as the shaders they write for the consoles should also work on the PC side, but all the CPU code will just be C/C++ anyway and at the mercy of the compiler.


If for some reason they did hand-tune assembly on the consoles' CPUs, the microarchitectures are different enough that it would have to be re-tuned for Atom, Haswell, Bulldozer, Phenom II, etc., which is why they don't often hand-tune x86 that much anymore.
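
That compromise can even be automated: GCC's function multi-versioning (target_clones, GCC 6+ on x86) compiles one portable function several times and picks the best clone at load time via a CPUID-based resolver, so nobody hand-tunes per microarchitecture. A hedged sketch; the function itself is just an illustrative loop:

code:
#include <cstddef>

// One portable C++ function; GCC emits an AVX2 clone, an SSE4.2 clone,
// and a baseline clone, then selects among them at program load time.
__attribute__((target_clones("avx2", "sse4.2", "default")))
void scale(float* out, const float* in, std::size_t n, float k) {
    for (std::size_t i = 0; i < n; ++i)
        out[i] = in[i] * k;  // auto-vectorized differently in each clone
}

int main() {
    float in[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float out[8];
    scale(out, in, 8, 2.0f);
    return (out[0] == 2.0f && out[7] == 16.0f) ? 0 : 1;
}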

Zhentar
Sep 28, 2003

Brilliant Master Genius

keyvin posted:

If developers are targeting AMD levels of per-core performance, then it seems to follow that AMD will be acceptable on the PC.

The problem is, AMD already is acceptable, just wholly inferior at every price point. And it's not Bulldozer cores going into these consoles, so it's not even going to encourage developers to optimize for the strengths that could possibly make the Bulldozer architecture cost-competitive.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

keyvin posted:

So, we know that the PS4 is using AMD, and I suspect the next Xbox is as well. Do you guys think this will lead to AMD returning as a viable option for PC gaming? I built my first Intel box ever this time around because of AMD's lovely single-core performance. If developers are targeting AMD levels of per-core performance, then it seems to follow that AMD will be acceptable on the PC.

The PS4 is using an AMD APU, and so is the Xbox One.

However, the per-core performance of the CPUs they are using is below what AMD is offering now anyway. It's essentially a pumped-up netbook CPU; they just happen to have two modules of 4 cores back to back. So: not amazing CPU performance, but 8 of them. It doesn't change the scene in terms of PC parts.

It is of course a win for AMD.

Maxwell Adams
Oct 21, 2000

T E E F S

HalloKitty posted:

It doesn't change the scene in terms of PC parts.

Not really, but it basically guarantees that new games will be highly multithreaded and 64-bit. It could also mean that video cards with less than 6 gigs of ram will be obsolete.

The really interesting stuff is the hUMA architecture in the new consoles. The CPU and GPU can access the same memory, which could be amazing with the memory bandwidth in the PS4. Someone might, for example, make a hot new tessellation algorithm that relies on hUMA. Who knows what would happen next? Radeon cards with a few Jaguar cores tacked on?
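
For a concrete picture of what that sharing buys, here is a hedged host-side sketch using the OpenCL C API (the stack AMD was pushing for HSA at the time): CL_MEM_USE_HOST_PTR wraps the CPU's own allocation, and on a true shared-memory APU the map/unmap below is bookkeeping rather than a copy over PCIe. Error handling is trimmed, and an OpenCL GPU device is assumed to exist:

code:
#include <CL/cl.h>
#include <vector>

int main() {
    cl_platform_id platform;
    clGetPlatformIDs(1, &platform, nullptr);
    cl_device_id device;
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &device,
                                     nullptr, nullptr, nullptr);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, nullptr);

    // One allocation, visible to both CPU and GPU on shared-memory hardware.
    std::vector<float> host(1024, 1.0f);
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_USE_HOST_PTR,
                                host.size() * sizeof(float), host.data(),
                                nullptr);

    // ... enqueue kernels against buf here; the GPU reads the same pages ...

    // Map for CPU access: on hUMA-style hardware this is ideally zero-copy.
    float* p = static_cast<float*>(clEnqueueMapBuffer(
        queue, buf, CL_TRUE, CL_MAP_READ | CL_MAP_WRITE, 0,
        host.size() * sizeof(float), 0, nullptr, nullptr, nullptr));
    p[0] += 1.0f;  // CPU touches GPU-visible memory directly
    clEnqueueUnmapMemObject(queue, buf, p, 0, nullptr, nullptr);
    clFinish(queue);

    clReleaseMemObject(buf);
    clReleaseCommandQueue(queue);
    clReleaseContext(ctx);
    return 0;
}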

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Maxwell Adams posted:

Radeon cards with a few Jaguar cores tacked on?

Damn, now that's a good idea.

Maxwell Adams
Oct 21, 2000

T E E F S

HalloKitty posted:

Damn, now that's a good idea.

Just imagine how smug PC gamers could be. "Oh, your console runs on a sophisticated APU? Yeah, my PC has that. As an accessory."

Rawrbomb
Mar 11, 2011

rawrrrrr

Maxwell Adams posted:

Not really, but it basically guarantees that new games will be highly multithreaded and 64-bit. It could also mean that video cards with less than 6 gigs of ram will be obsolete.



While the Xbone and PS4 both seem to have 8GB of RAM, I somehow doubt that developers will get access to more than 2-4GB.

Dilbert As FUCK
Sep 8, 2007

by Cowcaster
Pillbug

Rawrbomb posted:

While the Xbone and PS4 both seem to have 8GB of RAM, I somehow doubt that developers will get access to more than 2-4GB.

IIRC, Sony already mentioned that games will have access to 6GB of the RAM, with 2 reserved for the OS.

The Xbone runs DDR3 while the PS4 is running GDDR5; it can have a significant impact, as the APU will be utilizing system RAM for video processing. Hell, my APU got a nice 15fps boost in CS:GO when I went from 1333MHz CAS9 to 1600MHz CAS9 RAM.

Anand did a nice writeup; it kept me occupied on my lunch today.
http://www.anandtech.com/show/6972/xbox-one-hardware-compared-to-playstation-4

Dilbert As FUCK fucked around with this message at 23:55 on May 24, 2013
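
Since the DDR3-vs-GDDR5 gap is just arithmetic on the public specs (DDR3-2133 on a 256-bit bus for the Xbox One, 5.5 GT/s GDDR5 on a 256-bit bus for the PS4, and 128-bit dual-channel DDR3 for a desktop APU), a quick back-of-envelope sketch:

code:
#include <cstdio>

// Peak bandwidth = transfer rate (GT/s) * bus width in bytes per transfer.
double peak_gbps(double gtps, int bus_bits) {
    return gtps * (bus_bits / 8);
}

int main() {
    std::printf("Dual-channel DDR3-1600: %5.1f GB/s\n", peak_gbps(1.600, 128)); // 25.6
    std::printf("Dual-channel DDR3-2133: %5.1f GB/s\n", peak_gbps(2.133, 128)); // 34.1
    std::printf("Xbox One DDR3-2133:     %5.1f GB/s\n", peak_gbps(2.133, 256)); // 68.3
    std::printf("PS4 GDDR5 @ 5.5 GT/s:   %5.1f GB/s\n", peak_gbps(5.500, 256)); // 176.0
    return 0;
}
The roughly 20% step from DDR3-1333 to 1600 is consistent with the CS:GO boost described above, since an APU's shaders are usually starved for memory bandwidth.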

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
It's DDR3-2133 on the XBO, so that's something, plus the eSRAM cache will help a lot with GPU transfers, the same way Haswell's GT3e is helped by its eDRAM cache.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Anandtech has a good article about the Kabini value proposition, which mostly seems to be offering better value in cheap/small laptops than Celeron/Pentium processors.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Alereon posted:

Anandtech has a good article about the Kabini value proposition, which mostly seems to be offering better value in cheap/small laptops than Celeron/Pentium processors.

Seems to be a decent little thing in the sector, but they're right: it needs to turbo aggressively under lightly threaded loads. Then it would be a bit of a winner in the low-end segment.

Riso
Oct 11, 2008

by merry exmarx
Sony said it will be a 7/1 memory split for games/OS; MS is 5/3.

SYSV Fanfic
Sep 9, 2003

by Pragmatica

Maxwell Adams posted:

Not really, but it basically guarantees that new games will be highly multithreaded and 64-bit. It could also mean that video cards with less than 6 gigs of ram will be obsolete.

I thought it was the lack of highly threaded applications that was holding AMD back in the PC market. IIRC Battlefield 4 runs nearly as well on an eight-core A8479578439 or whatever the AMD branding is these days as it does on a Core i7 that costs over $100 more. So games being highly multithreaded seems like a big win for AMD, especially considering that the Intel roadmap has them sticking to four cores for the next tick-tock cycle.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Maybe, maybe not. As I said a couple of pages ago, each Jaguar core does a pitiful amount of work by desktop CPU standards. An i5-3570K could run every instruction eight Jaguar cores could and still have clocks to spare.
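
Rough peak-issue arithmetic behind that claim (treat it as an upper-bound sketch, since real IPC is workload-dependent): Jaguar is a 2-wide core at about 1.6 GHz in the consoles, while Ivy Bridge is roughly 4-wide at 3.4 GHz.

code:
#include <cstdio>

int main() {
    // cores * issue width * GHz = peak instruction throughput (G ops/s)
    double jaguar_x8 = 8 * 2 * 1.6;  // 25.6
    double i5_3570k  = 4 * 4 * 3.4;  // 54.4
    std::printf("8x Jaguar @ 1.6 GHz: %.1f G ops/s peak\n", jaguar_x8);
    std::printf("i5-3570K  @ 3.4 GHz: %.1f G ops/s peak\n", i5_3570k);
    return 0;
}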

The Illusive Man
Mar 27, 2008

~savior of yoomanity~

Maxwell Adams posted:

Not really, but it basically guarantees that new games will be highly multithreaded and 64-bit.

This is something I keep going back and forth on when toying with the idea of building a cheap-ish Steambox. Is it better to have an i3, or a similarly priced AMD part with 4-6 cores? It seems like the dual-core i3 (even with Hyper-Threading) will be a liability before long.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
A Haswell i3 will probably handle things just fine, plus you can kick it up a bit with a 25% BCLK strap overclock. An Ivy i3? Yeah, that'll probably feel slow for next-gen ports.

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010

Riso posted:

Sony said it will be a 7/1 memory split for games/OS; MS is 5/3.

3 gigs for dynamic ad space.

orange juche
Mar 14, 2012



incoherent posted:

3 gigs for dynamic ad space.

I wouldn't mind this if the ads actually reduced the cost of the console.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

orange juche posted:

I wouldn't mind this if the ads actually reduced the cost of the console.

Hell, give me a game with a good physics engine, a sandbox mode, and a bunch of cans of Coke you can really shake up, and I'd pay for that ad.

Anyway, to finish answering the "i3 vs. FX-4 or FX-6 chip" question: if you're buying now, the FX-6 chip might look better when you get to a CPU-bound game made for the PS4 and XBO. That still doesn't mean it's a single-thread powerhouse that'll run StarCraft II well, though.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

Riso posted:

Sony said it will be a 7/1 memory split for games/OS; MS is 5/3.

It's fairly meaningless, though; that's something easy enough to change in software if it really starts constraining developers over the console's lifespan, and the worst case of doing so is that the switch from gaming to the Windows-based OS on the One gets a little less snappy. Microsoft was just conservative about how much memory to tell developers it's okay to use, since until Sony's announcement they were hoping the PS4 wasn't going to go with a full 8GB.

Shaocaholica
Oct 29, 2002

Fig. 5E
Paging anyone with an FX-8 series CPU who would care to run this benchmark in the overclocking thread?

http://forums.somethingawful.com/showthread.php?threadid=3465021&pagenumber=38#post415824410

Factory Factory asked an interesting question on how well the AMD arch would do given what we know about the algorithm.

Riso
Oct 11, 2008

by merry exmarx

Killer robot posted:

Microsoft was just conservative about how much memory to tell developers it's okay to use, since until Sony's announcement they were hoping the PS4 wasn't going to go with a full 8GB.

Until their announcement, Sony didn't know they were doing 8GB; they originally thought they could only get four.

SYSV Fanfic
Sep 9, 2003

by Pragmatica

Shaocaholica posted:

Paging anyone with an FX-8 series CPU who would care to run this benchmark in the overclocking thread?

http://forums.somethingawful.com/showthread.php?threadid=3465021&pagenumber=38#post415824410

Factory Factory asked an interesting question on how well the AMD arch would do given what we know about the algorithm.

Wrong thread, no one actually owns an AMD processor here.

orange juche
Mar 14, 2012



Shaocaholica posted:

Paging anyone with an FX-8 series CPU who would care to run this benchmark in the overclocking thread?

http://forums.somethingawful.com/showthread.php?threadid=3465021&pagenumber=38#post415824410

Factory Factory asked an interesting question on how well the AMD arch would do given what we know about the algorithm.


keyvin posted:

Wrong thread, no one actually owns an AMD processor here.

We're all pretty much bitter ex-AMD owners who gathered up to bitch about how bad they are.

Riso
Oct 11, 2008

by merry exmarx
Let me be the first to admit to still using an AMD Athlon II X4 640.


fart simpson
Jul 2, 2005

DEATH TO AMERICA
:xickos:

Riso posted:

Let me be the first to admit to still using an AMD Athlon II X4 640.

Me too, and I don't see much of a reason to upgrade at the moment. If I did upgrade, though, I'd pretty much have to go Intel for the first time in over 10 years.
