Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Star War Sex Parrot posted:

They made the vertical axis origin non-zero to make the AMD bars tower over the NVIDIA bars. The AMD bar is at times 4x larger than the NVIDIA bar when that really only represents 1.6x faster.

PR departments are poo poo (especially AMD's). What else is new. That's why I said take it with a grain of salt.

Yeah, fair enough, but it's pretty clear if you're familiar with baseline comparisons and can rub a few brain cells together what the actual differences are. A 30% to 60% improvement over the fastest card today is dynamite regardless of how they put the graph together.

Although I will say this, it does follow part one of the pattern since, what, the nVidia 6000-series? of ATI firing the opening shot and people being suitably impressed, then nVidia being quiet about it for a bit and then dropping an even more powerful top end card. I'm waiting for that shoe to drop before making any decisions, but it would be nice to support ATI this generation.
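(To put numbers on the axis complaint above: with the y-axis starting at 0.8 instead of 0, a 1.6x score draws a bar four times taller than the baseline's. A quick sketch; the 0.8 start and the 1.6x figure come from the thread, everything else is illustrative.)

code:
# How a truncated y-axis turns a 1.6x speedup into a 4x-looking bar.
baseline = 0.8   # the chart's y-axis reportedly starts at 0.8, not 0
nvidia = 1.0     # GTX 580 score, normalized to 1.0
amd = 1.6        # 7970 score in the most favorable case

actual_ratio = amd / nvidia                            # 1.6x real speedup
visual_ratio = (amd - baseline) / (nvidia - baseline)  # 0.8 / 0.2 = 4.0
print(actual_ratio, visual_ratio)                      # 1.6 4.0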

Star War Sex Parrot
Oct 2, 2003

Agreed posted:

Yeah, fair enough, but it's pretty clear if you...can rub a few brain cells together what the actual differences are.
We're talking about gamers here.

Shaocaholica
Oct 29, 2002

Fig. 5E
They could have used log scale.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Star War Sex Parrot posted:

We're talking about gamers here.

Didn't you recently build a computer just for games?

And then Star War Sex Parrot was a zombie.

Longinus00
Dec 29, 2005
Ur-Quan

Shaocaholica posted:

They could have used log scale.

Why would they do that?

Shaocaholica
Oct 29, 2002

Fig. 5E

Longinus00 posted:

Why would they do that?

It was a joke.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Agreed posted:

In what way?

Where does it say "times the performance"?

For all we know, it could be 1fps vs 1.6fps!

Or it could be the difference between the framerates at the very top of the graph, as opposed to the whole bar - I mean, look, it starts at 0.8. I think this graph is most definitely misleading.

Doesn't mean I'm making GBS threads on AMD though, and yes, the GTX 580 is limited at 1.5GB, very much so.

Beelzebubba9
Feb 24, 2004

Agreed posted:

Yeah, fair enough, but it's pretty clear if you're familiar with baseline comparisons and can rub a few brain cells together what the actual differences are. A 30% to 60% improvement over the fastest card today is dynamite regardless of how they put the graph together.

Although I will say this, it does follow part one of the pattern since, what, the nVidia 6000-series? of ATI firing the opening shot and people being suitably impressed, then nVidia being quiet about it for a bit and then dropping an even more powerful top end card. I'm waiting for that shoe to drop before making any decisions, but it would be nice to support ATI this generation.

Not that Tahiti XT's performance isn't fantastic, but AMD does have the advantage of a full node die shrink over the GF110, so of course it's going to outperform it by a large margin. If the 28nm GPUs from nVidia and AMD are anything like their 40nm products, the major factor in determining who has the top performing GPU will likely be which company is willing to put out the hotter, larger, more expensive card. And even then, most of us will likely buy the card that slots in nearest $250 anyway.

Considering the size and performance, is it safe to assume that the 7970 will be a $500 card?
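(On the full-node point: going from 40nm to 28nm ideally halves transistor area, which is why a big jump over GF110 is expected rather than surprising. Idealized scaling math; real processes never shrink perfectly.)

code:
# Idealized area scaling for a full-node shrink, 40nm -> 28nm.
old_node, new_node = 40.0, 28.0
area_scale = (new_node / old_node) ** 2
print(round(area_scale, 2))  # 0.49 -- roughly 2x the transistors per mm^2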

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

HalloKitty posted:

Where does it say "times the performance"?

For all we know, it could be 1fps vs 1.6fps!

Or it could be the difference between the framerates at the very top of the graph, as opposed to the whole bar - I mean, look, it starts at 0.8. I think this graph is most definitely misleading.

Doesn't mean I'm making GBS threads on AMD though, and yes, the GTX 580 is limited at 1.5GB, very much so.

It's a normalized graph, with the performance in % normalized to the GTX 580. Those are GTX 580 units. Nice way for ATI to dickwave, I guess, but if nVidia uses 7970 units and can put out a similar graph it's going to be kind of hilarious. And I will generally go for a hotter, noisier card because I am a child when it comes to shiny things, and you don't get workable CUDA performance until the high end :/
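("Normalized to the GTX 580" just means every card's score is divided by the 580's, so the 580 always lands at exactly 1.0. A minimal sketch; the FPS numbers here are made up.)

code:
# Baseline normalization: divide every score by the GTX 580's result,
# so the 580 is always exactly 1.0 "580 units".
fps = {"GTX 580": 52.0, "HD 6970": 44.2, "HD 7970": 83.2}  # hypothetical FPS
baseline = fps["GTX 580"]
normalized = {card: round(score / baseline, 2) for card, score in fps.items()}
print(normalized)  # {'GTX 580': 1.0, 'HD 6970': 0.85, 'HD 7970': 1.6}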

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Don't get me wrong, I'm eagerly awaiting it. But let's be honest, we need reviews from the likes of AnandTech before making a real call.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

HalloKitty posted:

Don't get me wrong, I'm eagerly awaiting it. But let's be honest, we need reviews from the likes of AnandTech before making a real call.

The beauty of following tech news is you get to have your cake and eat it too. All the speculation is just a bunch of bullshit shop talk; nobody's going to buy anything until there's actual info out. In the meantime, I have no problem drooling (either as if hungry or, er, slow) over a card that can put in 1.6 580 units in a game I've been struggling to play on my 580 with the same settings :D

eames
May 9, 2009

Shaocaholica posted:

They could have used log scale.

They tried but their PR department is running Bulldozer machines and would not have been able to calculate the values in time for the release. :downsrim:

In any case, I think the upcoming generation of PC GPUs is what next-gen consoles will run for the next 5-6 years. Hopefully they won’t mess this up.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

eames posted:

They tried but their PR department is running Bulldozer machines and would not have been able to calculate the values in time for the release. :downsrim:

In any case, I think the upcoming generation of PC GPUs is what next-gen consoles will run for the next 5-6 years. Hopefully they won’t mess this up.

To me, that's what's most exciting about this. Generally speaking, when a company develops for a console they end up taking a hit in the PC market that generation, but if they aren't just pulling stuff out of their rear end, that doesn't look like it's going to be the case this time around (probably because of how parallel development for consoles and PC has become), and they're definitely putting together some salty hardware capable of at least hitting 1080p TVs with lots and lots of beautiful-looking stuff, at extremely low power draw and with broad hardware acceleration and app integration for further multimedia use. Even if nVidia does come out with Kepler and it's a monster and performs better for PC folks, the Southern Islands cards' features alone will go far to improve the console-as-media-center experience.

Shaocaholica
Oct 29, 2002

Fig. 5E
As far as consoles go, I'm more interested in seeing how much memory the developers are asking for and how much they actually get. Working with 512mb for both graphics and data is really outdated. I would love to see next gen consoles go 64bit just so developers can port over to x86-64 and ditch 32bit builds altogether.

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

Shaocaholica posted:

As far as consoles go, I'm more interested in seeing how much memory the developers are asking for and how much they actually get. Working with 512mb for both graphics and data is really outdated. I would love to see next gen consoles go 64bit just so developers can port over to x86-64 and ditch 32bit builds altogether.

What percentage of gamers are still using a 32-bit OS?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Bob Morales posted:

What percentage of gamers are still using a 32-bit OS?

According to Steam, 17% use XP 32-bit, 11.5% use Vista 32-bit, and 10% use Windows 7 32-bit. So almost 40%.

Star War Sex Parrot
Oct 2, 2003

Shaocaholica posted:

I would love to see next gen consoles go 64bit just so developers can port over to x86-64 and ditch 32bit builds altogether.
The next consoles won't be x86 anyway so it doesn't matter.

JawnV6
Jul 4, 2004

So hot ...

Shaocaholica posted:

I would love to see next gen consoles go 64bit just so developers can port over to x86-64 and ditch 32bit builds altogether.

What? I think the gap between PPC and x86 is a tad bigger than 32/64. And if you're not addressing more than 4GB of DRAM/MMIO, why do you want 64-bit?

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

Shaocaholica posted:

As far as consoles go, I'm more interested in seeing how much memory the developers are asking for and how much they actually get. Working with 512mb for both graphics and data is really outdated. I would love to see next gen consoles go 64bit just so developers can port over to x86-64 and ditch 32bit builds altogether.

I'm with this. As both modern console generations have gotten old, their biggest problem keeping up with "modern" titles has been less raw power (though that too) and more just the tiny amount of memory included. Even as primarily a PC gamer, with multiplatform game development being the norm, I have to put up with inconveniences based on console memory size and load times even when a new generation is out, much less five years later.

Zhentar
Sep 28, 2003

Brilliant Master Genius

Star War Sex Parrot posted:

The next consoles won't be x86 anyway so it doesn't matter.

Also, they can't "go" 64-bit; both the 360 and the PS3 already are 64-bit.

Edit: funnily enough, despite Nintendo hitting 64-bit first with the Nintendo 64 in 1996, I think everything they've released since has been 32-bit.

Zhentar fucked around with this message at 22:53 on Dec 21, 2011

Shaocaholica
Oct 29, 2002

Fig. 5E
^^^ I would think devs are using 32-bit pointers somehow to save space, since neither the 360 nor the PS3 needs the additional addressing bits even if their CPUs are 64-bit.

JawnV6 posted:

What? I think the gap between PPC and x86 is a tad bigger than 32/64. And if you're not addressing more than 4GB of DRAM/MMIO, why do you want 64-bit?

I think you misread. I'm hoping next gen consoles have more than 4GB of memory so all builds will have to be 64bit to use it all.

Also, do you really think game developers for PPC and x86 are all programming in assembly? Game devs are constantly porting from 360/PS3 to x86. I would think it would be easier to go from a native 64bit source to another 64bit build.

Shaocaholica fucked around with this message at 23:06 on Dec 21, 2011
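(The 32-bit-pointers-on-a-64-bit-CPU idea is a standard space-saving trick: store 4-byte offsets from a common base instead of full 8-byte pointers. A rough sketch of the size difference; the addresses are made up.)

code:
# 4-byte offsets from a shared base vs. full 8-byte pointers.
import struct

base = 0x100000000                      # hypothetical heap base address
ptrs = [base + 0x10, base + 0x2000, base + 0xFFFF0]

full_pointers = struct.pack("<3Q", *ptrs)                # 3 x 8 bytes
offsets = struct.pack("<3I", *(p - base for p in ptrs))  # 3 x 4 bytes
print(len(full_pointers), len(offsets))                  # 24 12
# As long as the heap stays under 4GB, the offsets lose nothing --
# which is also why a console with <4GB gains little from 64-bit pointers.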

movax
Aug 30, 2008

Shaocaholica posted:

^^^ I would think devs are using 32-bit pointers somehow to save space, since neither the 360 nor the PS3 needs the additional addressing bits even if their CPUs are 64-bit.


I think you misread. I'm hoping next gen consoles have more than 4GB of memory so all builds will have to be 64bit to use it all.

Also, do you really think game developers for PPC and x86 are all programming in assembly? Game devs are constantly porting from 360/PS3 to x86. I would think it would be easier to go from a native 64bit source to another 64bit build.

I think he's getting at the fact that x86 and PPC have vast architectural differences as well. Obviously they aren't programming entirely in assembly, given the robust development tools supplied by each manufacturer; this isn't the dark ages of game development.

Remember that the 360/PS3 are exotic compared to our x86 boxes. The 360 is a three-core PPC-based CPU, each core capable of 2-way SMT. The Cell far outpaced the state of the art in compiler development and parallel programming tools, and pairs a single PPC-derived core with 7 128-bit RISC cores.

And yet, most companies buy an engine that's been designed to run on both of those platforms to base their games on. The obvious exceptions tend to be each system's halo titles; I remember some blog posts from a Naughty Dog developer detailing how they dropped down to the assembly level to get some effects in Uncharted working properly.

I think we'll definitely see more memory in the new-generation consoles, especially for high-res textures, but I don't see why they'd need in excess of 4GB. Even our PC GPUs are fine pushing 1080p with 1GB VRAM, and I don't think most titles in the PC environment eat more than 1GB or so of system memory.

Anyways, AMD getting some much-needed revenue for winning a contract to supply a GPU would be most welcome! And IBM will probably get the business (again) for the CPU design.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
As far as we know AMD won the GPU contract for all three next-gen consoles, and may even be supplying an APU for the 360 successor.

Star War Sex Parrot
Oct 2, 2003

This looks to be a mostly-complete dump of the reviewer's kit for the 7970.

http://videocardz.com/29950/amd-radeon-hd-7970-benchmarks-gaming-performance

Looks like I'm buying a new video card when I get back from vacation on January 1.

Star War Sex Parrot fucked around with this message at 00:11 on Dec 22, 2011

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
The multi-monitor performance is almost certainly spot on. A 3GB card making GBS threads on a 1.5GB card is easy to believe when they're talking about 3x 30 inchers.
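(Rough buffer math for triple 2560x1600, to show where the memory goes; illustrative numbers, not measurements.)

code:
# Back-of-the-envelope VRAM math for 3x 30" (2560x1600) Eyefinity.
width, height, monitors = 2560, 1600, 3
bpp = 4                                  # bytes per pixel, 32-bit RGBA

color_buffer = width * height * monitors * bpp
print(color_buffer / 2**20)              # ~46.9 MB for one color buffer
print(color_buffer * 4 / 2**20)          # ~187.5 MB at 4x MSAA
# Depth/stencil, extra render targets, and double buffering multiply that
# further -- all before a single texture loads, which is where
# 1.5GB vs 3GB starts to matter.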

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

HalloKitty posted:

The multi-monitor performance is almost certainly spot on. A 3GB card making GBS threads on a 1.5GB card is easy to believe when they're talking about 3x 30 inchers.

Yeah, honestly, with the full kit displayed like that I'm more skeptical than before, especially since they display 6970 numbers. They're looking at scenarios that trip up the GTX 580 because of a lack of video memory, hence it turning in performance in those graphs that appears nearly identical to, or only marginally better than, the 6970, despite being quite a bit faster than the 6970 in most usage scenarios.

The 6970 is a great card, but now I'm dialing back my nerd wood a bit to wait for some proper testing. I still figure the next generation of graphics cards is going to kick rear end, but with that kind of tricky poo poo going on, I'm not as impressed. They're misrepresenting the 580's 3DMark11 score, too, again in a way that really favors their hardware. I can run a damned test right now and see numbers bigger than that, and I don't turn it to Performance or any e-peen bullshit when I benchmark; I just use it as a stability test for DX11 features when overclocking.

Shaocaholica
Oct 29, 2002

Fig. 5E

movax posted:

but I don't see why they'd need in excess of 4GB. Even our PC GPUs are fine pushing 1080p with 1GB VRAM, and I don't think most titles in the PC environment eat more than 1GB or so of system memory.

Well, it's a catch-22. Devs won't make a game that eats 4GB of memory if they know >50% of their user base won't have that available.

Also, you have to remember that not all game data is stuff for the GPU. I think Rockstar and their open-world titles would most definitely use >4GB if you gave it to them.

Again, this isn't about what devs 'need' right now. It's about moving forward and pushing through barriers. Remember, these consoles have to last a while. If they're just 'ok' for what devs need on launch day, they'll feel obsolete that much faster as the months and years roll by.

Shaocaholica fucked around with this message at 01:06 on Dec 22, 2011

Nintendo Kid
Aug 4, 2011

by Smythe
When you can get a 2GB video card and 8GB of system RAM in a fairly cheap laptop, it'd be rather nice if the consoles we'll be using for the next 8 years had that much to work with.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Anandtech has reposted their Graphics Core Next architecture preview article, probably in preparation for a midnight Eastern launch unveiling tonight (in an hour and 40 minutes).

Edit: Probably shouldn't call it a launch since there won't be even notional product to buy.

Alereon fucked around with this message at 04:39 on Dec 22, 2011

Star War Sex Parrot
Oct 2, 2003

I really shouldn't be this excited for a new GPU. :f5:

Autarch Kade
May 6, 2008

Star War Sex Parrot posted:

This looks to be a mostly-complete dump of the reviewer's kit for the 7970.

http://videocardz.com/29950/amd-radeon-hd-7970-benchmarks-gaming-performance

Looks like I'm buying a new video card when I get back from vacation on January 1.

I've looked over what was shown on that site. At first I was actually pretty excited to see how the performance would turn out. Now, though, I'm starting to fear that my dual 1GB 5870 cards are going to suffer immensely come Christmas when I add the third 30" to my setup. Judging by the performance figures they were showing for single 7970s on the multiple-monitor charts, anti-aliasing won't even be a functional option for me, and even without it my frames per second will suffer.

Upgrading to one or two 7970s sounds tempting, and almost necessary in my case, but rather expensive.

The Crossfire scaling does seem to work well according to those figures.

What's the earliest we are expecting independent benchmarks and reviews?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Autarch Kade posted:

What's the earliest we are expecting independent benchmarks and reviews?
My guess is in about 40 minutes, though that may just be an announcement without cards in reviewers' hands.

L-O-N
Sep 13, 2004

Pillbug
Some review sites like [H]ard have already confirmed that they have the cards.

Autarch Kade
May 6, 2008

L-O-N posted:

Some review sites like [H]ard have already confirmed that they have the cards.

HardOCP 7970 Review (this link has Eyefinity benchmarks in it as well).

Well, definitely faster than a GTX 580. Not sure the increased performance is worth the price if you already have a 580, however.

EDIT: Fixed the URL they broke.

Autarch Kade fucked around with this message at 11:02 on Dec 22, 2011

Star War Sex Parrot
Oct 2, 2003

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review

edit: So it's a paper launch. gently caress that noise. Expected availability on January 9, so I might as well wait to see how the 7950 turns out.

Star War Sex Parrot fucked around with this message at 06:32 on Dec 22, 2011

Longinus00
Dec 29, 2005
Ur-Quan
Considering the process issues, it's not surprising they don't have enough supply. It's a repeat of last year, and the year before.

PC LOAD LETTER
May 23, 2005
WTF?!

Install Gentoo posted:

When you can get a 2GB video card and 8GB of system RAM in a fairly cheap laptop, it'd be rather nice if the consoles we'll be using for the next 8 years had that much to work with.
High-clocked GDDR5 and XDR2 are pretty pricey, unfortunately. I wish they were as cheap as GDDR/DDR3. 8GB of video/system RAM on next-gen consoles would probably go a long way towards inoculating future PC games against consolitis.

Star War Sex Parrot posted:

might as well wait to see how the 7950 turns out.
The 7950 is supposed to be about on par with the GTX 580 for ~$450-500, according to the rumor mill.

If you already have a GTX 580 or 6970 it might not be worth upgrading, especially if rumors of a major refresh mid-year pan out. If you're still clunking along on a 4xxx or 5xxx card, or if you really want to push multiple high-resolution monitors, then it's worth upgrading IMO. For the latter application, CF'd 7970s are probably sensible, if expensive.

So long as you keep your resolution around 1920x1080, 1GB of VRAM is still fine. It's not really a big problem until you go to higher resolutions.

PC LOAD LETTER fucked around with this message at 08:14 on Dec 22, 2011

texting my ex
Nov 15, 2008

I am no one
I cannot squat
It's in my blood
This was a bit disappointing, in the sense that it doesn't change much. I was expecting something really refreshing; what we got looks like "just" a solid launch like we've gotten the last few years.

I really need a new card but might as well wait for Nvidia's response. I've got a 5970 now, and I still don't understand why they only put 1GB of memory on it :psyduck:. It's still very fast but limited in newer games because of this.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
There is a 2GB version, but I agree, it is a limiting factor :(

Star War Sex Parrot posted:

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review

edit: So it's a paper launch. gently caress that noise. Expected availability on January 9, so I might as well wait to see how the 7950 turns out.

But coming in and making GBS threads on all existing single GPU cards, even if expected, is very nice.
Idle temps/power consumption = loving awesome.

The review is missing CrossFire and Eyefinity performance, though; I'd like to see the tipping point at which that 3GB comes into its own.

HalloKitty fucked around with this message at 10:43 on Dec 22, 2011

Rawrbomb
Mar 11, 2011

rawrrrrr
I would totally kill for a new card. I play on a single monitor, but with three monitors attached. BF3 has a habit of forcing Aero to disable itself for the extra memory :(
