Gonkish
May 19, 2004

Animal posted:

That is the blingiest video card I have seen. What's that crap protruding from the side?

It's the buyer's self-worth. Dollah, dollah bill.


Monday_
Feb 18, 2006

Worked-up silent dork without sex ability seeks oblivion and demise.
The Great Twist
Just bought a 2GB Radeon 7850 for $135. Gonna throw it into my current outdated machine for now, then transfer it into a new one I'm gonna build around Christmas time. Will definitely be bottlenecked for now with my Athlon X2 5200, but should at least be a slight upgrade from my 5670.

Monday_ fucked around with this message at 06:27 on Sep 15, 2013

Arzachel
May 12, 2012
Hawaii XT teaser: https://www.techpowerup.com/gpudb/2460/radeon-r9-290x.html

Clocks might not be final, though.

Wistful of Dollars
Aug 25, 2009

Arzachel posted:

Hawaii XT teaser: https://www.techpowerup.com/gpudb/2460/radeon-r9-290x.html

Clocks might not be final, though.

"Data on this page may change in the future."

I'll reserve judgement until it's actually confirmed. I can't find a source for that data (yet).

FetalDave
Jun 25, 2011

Moumantai!
Looks like nVidia released their newest drivers this morning.

http://www.geforce.com/drivers/results/66884

Doesn't look like they changed much from the previous beta drivers. These are WHQL.

Arzachel
May 12, 2012

El Scotch posted:

"Data on this page may change in the future."

I'll reserve judgement until it's actually confirmed. I can't find a source for that data (yet).

Techpowerup are the guys behind GPU-Z, not a random hit-baiting site, so this has a reasonable chance of being true. We'll know for sure on the 25th, though.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

FetalDave posted:

Looks like nVidia released their newest drivers this morning.
AMD released the Catalyst 13.9 WHQL drivers, but these do not incorporate the fixes from the Betas so should not be used (except if you have an Enduro configuration that is not working with the 13.10 Betas).

Im_Special
Jan 2, 2011

Look At This!!! WOW!
It's F*cking Nothing.

FetalDave posted:

Looks like nVidia released their newest drivers this morning.

http://www.geforce.com/drivers/results/66884

Doesn't look like they changed much from the previous beta drivers. These are WHQL.

So has anyone with a GTX 560Ti been brave enough to try these yet, and how do they fare? I've been stuck on 314.22 for what feels like a year now because every new 32X.XX driver has given me weird stutter/mini-lockups.

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT
Man, Nvidia seems like they're having a string of terrible luck with their drivers lately. It's like an on-and-off thing with them, I remember back when I had a GTX 465, the only drivers that played games smoothly in my rig at the time were the Forceware 175.19 drivers - any more recent ones after that caused massive slowdowns in certain parts of games where I'd never seen it before. A good example was playing BioShock 1 & 2, any areas that had any fog/dust effects would get laggy and jittery, and even some of the other smoke and fire effects looked "off" in comparison.

Wistful of Dollars
Aug 25, 2009

More 9000 series news/rumours. (I refuse to abide by the new naming scheme).

The first picture comes from someone at :dice:, so I think it's a legit picture of the physical card. The benchmark results, who knows how accurate they are.

Arzachel
May 12, 2012
Aaand I was wrong, TPU does pull their pre-release specs from rumours. The 512bit memory bus is pretty ballsy if true.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

Arzachel posted:

The 512bit memory bus is pretty ballsy if true.
Not really.. ATI/AMD used to use 512bit memory buses on their cards up through the 2900XT's. They only moved to 256bit for the 3870's when they moved to faster GDDR4.

Arzachel
May 12, 2012

LCD Deathpanel posted:

Not really.. ATI/AMD used to use 512bit memory buses on their cards up through the 2900XT's. They only moved to 256bit for the 3870's when they moved to faster GDDR4.

Yeah, that was a ring bus I believe. Nvidia ran 512bit DDR3 on the GTX280 too. It's not that it hasn't been done before, but the added complexity and power draw weren't seen as worth it since AMD and Nvidia started using GDDR5.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Arzachel posted:

Aaand I was wrong, TPU does pull their pre-release specs from rumours. The 512bit memory bus is pretty ballsy if true.

Wide bus plus fast memory is different than wide bus plus slow memory, but...

If both next-gen architectures aren't pulling for bandwidth with both hands and feet they're not doing it right. I think it's going to be a little bit of keep-up-with-consoles because of how drastically different they handle things. Our strong single-threaded performance versus their vastly parallel performance and as little overhead as possible in terms of moving poo poo around to be worked on (coherence). We're already staring coherence issues in the muzzle. Our systems are not purpose-built like consoles, we have so much poo poo in the way of doing game-specific things; all hands are programming in x86, but there is just a lot more overhead and a lot less communication possible with computers and robust operating systems vs. thin OSes made to get the gently caress out of the way when games are played and custom silicon aimed at parallelism. Without bandwidth we don't even have a shot at coherence on the PC, but even with it, I have doubts about being able to do things like consoles, shared language or not. Some major code transliteration will be at least as required for ports to run well on the PC, obviously.

Edit: If current leaks are true, performance of the 9000s being ~roughly at Titan level (a few FPS here or there depending on AA, games, etc.) is not exciting to me. Releasing a new generation that competes strongly!... with the current generation. Welp. Hopefully they can hit hard on price this go-around, because I doubt Maxwell's highest end single GPU card is going to be an inch slower than the top-tier Kepler cards, would bet significantly faster; and, they've got some neat stuff that might be relevant to PC gaming being able to keep up with some of the cool poo poo that consoles are doing - at least according to currently available info on the green side of things. (disclaimer: if the leaks are true, obv.)

Agreed fucked around with this message at 20:48 on Sep 22, 2013

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
There's been news over the last week of additional problems involving Crossfire and Eyefinity. In short, when running in Crossfire mode with Eyefinity, about half of the frames are dropped without being displayed. Further, TechReport is reporting that Crossfire is inherently broken with 4K resolution due to Crossfire link bandwidth limitations.
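The detection idea behind those reports can be sketched in a few lines. This is a hypothetical illustration only, not TechReport's actual FCAT tooling (which captures frames off the display output); it just shows how half-dropped frames show up in presentation timestamps:

```python
# Toy frame-time analysis in the spirit of FCAT-style tools: given the
# timestamps (in seconds) at which frames hit the display, flag "runt"
# intervals too short for the frame to have been meaningfully shown.

def frame_intervals_ms(timestamps):
    """Gaps between consecutive displayed frames, in milliseconds."""
    return [(b - a) * 1000.0 for a, b in zip(timestamps, timestamps[1:])]

def count_runts(timestamps, runt_ms=2.0):
    """Frames displayed for under runt_ms are effectively dropped."""
    return sum(1 for gap in frame_intervals_ms(timestamps) if gap < runt_ms)

# Healthy 60 FPS: frames ~16.7 ms apart, so no runts.
smooth = [i / 60.0 for i in range(10)]

# Broken CrossFire + Eyefinity pattern: each second frame arrives ~0.5 ms
# after the previous one, so roughly half the frames are never really seen.
broken = []
for i in range(5):
    broken += [i / 30.0, i / 30.0 + 0.0005]

print(count_runts(smooth), count_runts(broken))  # 0 5
```

The average FPS counter sees twenty frames in both cases; the runt count is what exposes the difference.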

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
This wasn't unexpected, as AMD has said from the start that the brokenness of CrossFire is additionally complicated by all the hacks used to get Eyefinity to work. It's also worth noting (as the TR author did) that the problem with 4K monitors is not due to 4K inherently, but due to teaming two DVI connections for the display, which is treated as Eyefinity internally (just as it is treated as a Surround group on Nvidia GPUs). Using a DisplayPort 1.2 connection to the 4K monitor avoids the tearing issue entirely: while there are still multiple data streams, they share a timing source because they come from a single display controller.

Factory Factory fucked around with this message at 22:06 on Sep 22, 2013

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I am going to be really interested in seeing how the next generation of cards and its refresh, explicitly, deal with the 1080p render target the consoles will have forever. It's looking more and more like the best looking console games will have access to resources in a way that gives them a huuuge advantage relative to their absolute processing power, so I'm kind of anticipating a wait for computers to do anything but make damned sure they look at least as good as consoles and can maybe do 1600p, before the real push toward higher resolutions comes. Hopefully the systemic problems with driver implementations of 4K will be solved by the time that's a more salient concern.

Arzachel
May 12, 2012

Agreed posted:

Edit: If current leaks are true, performance of the 9000s being ~roughly at Titan level (a few FPS here or there depending on AA, games, etc.) is not exciting to me. Releasing a new generation that competes strongly!... with the current generation. Welp. Hopefully they can hit hard on price this go-around, because I doubt Maxwell's highest end single GPU card is going to be an inch slower than the top-tier Kepler cards, would bet significantly faster; and, they've got some neat stuff that might be relevant to PC gaming being able to keep up with some of the cool poo poo that consoles are doing - at least according to currently available info on the green side of things. (disclaimer: if the leaks are true, obv.)

Node shrinks don't do miracles anymore. With TSMC's track record lately and how the 7970/680 launch went, we'd be lucky to see the first 20nm cards a year from now, slightly faster but more expensive than the 780/290X. If the performance is there, the 290X is likely to get even more mileage than the GTX580. Price it under the competition, bundle it with BF4, don't require BIOS flashing for sane manual overclocking and it will do as well as a $600 GPU can.

computer parts
Nov 18, 2010

PLEASE CLAP

Agreed posted:

I am going to be really interested in seeing how the next generation of cards and its refresh, explicitly, deal with the 1080p render target the consoles will have forever. It's looking more and more like the best looking console games will have access to resources in a way that gives them a huuuge advantage relative to their absolute processing power, so I'm kind of anticipating a wait for computers to do anything but make damned sure they look at least as good as consoles and can maybe do 1600p, before the real push toward higher resolutions comes. Hopefully the systemic problems with driver implementations of 4K will be solved by the time that's a more salient concern.

Given that last generation consoles couldn't even reliably do 720p I'm going to go out on a limb and think that constant 1080p will still be a pipe dream.

VDay
Jul 2, 2003

I'm Pacman Jones!

computer parts posted:

Given that last generation consoles couldn't even reliably do 720p I'm going to go out on a limb and think that constant 1080p will still be a pipe dream.

What does one have to do with the other? Last generation's consoles are like 8 years old at this point and came out before everyone owned an HDTV.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
Didn't know whether to put this in the AMD thread or here, but it is more GPU related and this thread is more active.

Apparently Kaveri is pretty good

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Endymion FRS MK1 posted:

Didn't know whether to put this in the AMD thread or here, but it is more GPU related and this thread is more active.

Apparently Kaveri is pretty good
It's important to note that that's a very small set of benchmarks, probably the only one where Kaveri is anywhere close. It's also troubling that they aren't showing an order-of-magnitude boost in memory bandwidth for Iris 5200 over HD 4600, which indicates they did something wrong/shady during testing.
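For anyone wanting to sanity-check bandwidth claims like that on their own box, a crude copy test gives an order-of-magnitude number. This is a rough sketch only; real tools like STREAM or AIDA64 control for caching, NUMA and timer resolution far more carefully:

```python
# Crude main-memory bandwidth estimate: time a large array copy and
# count bytes read plus bytes written. Order-of-magnitude only.
import array
import time

def copy_bandwidth_gbs(n_bytes=64 * 1024 * 1024, reps=3):
    src = array.array('b', bytes(n_bytes))
    best = float('inf')
    for _ in range(reps):
        t0 = time.perf_counter()
        dst = src[:]  # reads n_bytes and writes n_bytes
        best = min(best, time.perf_counter() - t0)
        del dst
    # Take the best of several runs to dodge one-off scheduler hiccups.
    return (2 * n_bytes) / best / 1e9  # GB/s moved

print(f"~{copy_bandwidth_gbs():.1f} GB/s")
```

If an iGPU review claims an order-of-magnitude bandwidth gap, numbers from something like this at least tell you what the host side of the system can do.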

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

computer parts posted:

Given that last generation consoles couldn't even reliably do 720p I'm going to go out on a limb and think that constant 1080p will still be a pipe dream.

This would be an awesome point for the thread's resident industry dev person to come in and expand on how efficient render calls are on the consoles vs. the overhead introduced by an OS, how highly parallel they can actually be without so much poo poo in the way of scheduling... Paging, paging?

Mozi
Apr 4, 2004

Forms change so fast
Time is moving past
Memory is smoke
Gonna get wider when I die
Nap Ghost
I've been struggling with my laptop graphics card (Quadro 2000M w/ Optimus) hitting 93C and downthrottling, and just noticed that when I play games in windowed or borderless windowed mode my GPU utilization is ~75%, but when I switch to fullscreen it jumps to 99% and the heat starts rising. When I minimize the windowed game the same thing happens. I'm posting this here in case this is a known problem, I just reinstalled everything so there's nothing funky in the system.
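For troubleshooting like this, it helps to log utilization and temperature together rather than eyeballing them. A quick hypothetical sketch built on nvidia-smi (assumes a driver recent enough to ship nvidia-smi on the PATH; the exact query fields are the standard `utilization.gpu`/`temperature.gpu` ones):

```python
# Quick-and-dirty GPU utilization/temperature poller built on nvidia-smi.
import subprocess

def parse_stats_line(line):
    """Parse one 'util, temp' CSV line as emitted by nvidia-smi."""
    util, temp = (int(field.strip()) for field in line.split(","))
    return {"util_pct": util, "temp_c": temp}

def read_gpu_stats():
    """Query the first GPU; requires nvidia-smi on the PATH."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True)
    return parse_stats_line(out.splitlines()[0])

# Example usage: poll every couple of seconds while reproducing the issue.
#
#   import time
#   while True:
#       s = read_gpu_stats()
#       print(f"{s['util_pct']:3d}%  {s['temp_c']} C")
#       if s["temp_c"] >= 90:
#           print("throttle territory")
#       time.sleep(2)
```

Logging both columns while switching between windowed and fullscreen makes the utilization jump and the heat ramp easy to correlate.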

Fallows
Jan 20, 2005

If he waits long enough he can use his accrued interest from his savings to bring his negative checking balance back into the black.
Windowed games don't anti-alias, so fullscreen is hitting your GPU harder.

Mozi
Apr 4, 2004

Forms change so fast
Time is moving past
Memory is smoke
Gonna get wider when I die
Nap Ghost
With AA disabled in Worms Revolution my GPU load goes from ~45% to 95%+ just from switching from windowed to fullscreen, and goes back above 90% when I minimize the game when it's windowed.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Is Vsync enabled in fullscreen mode? Because only rendering 60fps would definitely reduce power usage.
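The effect is easy to demonstrate: a frame cap turns "render as fast as possible" into "render, then idle out the rest of the frame budget", and that idle time is exactly where the power saving comes from. A hypothetical sketch with a do-nothing render function standing in for the game:

```python
# Why a 60 FPS cap (vsync) lowers GPU load: most of each 16.7 ms frame
# budget is spent sleeping instead of rendering.
import time

def run_capped(render_frame, fps_cap=60, seconds=0.2):
    """Call render_frame at most fps_cap times/sec; return frames rendered."""
    budget = 1.0 / fps_cap
    deadline = time.perf_counter() + seconds
    frames = 0
    while time.perf_counter() < deadline:
        start = time.perf_counter()
        render_frame()
        frames += 1
        # Sleep away whatever is left of this frame's time slice.
        leftover = budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)
    return frames

uncapped = run_capped(lambda: None, fps_cap=10_000)
capped = run_capped(lambda: None, fps_cap=60)
print(uncapped, capped)  # capped stays near 60 * 0.2 = 12 frames
```

Same loop, same "work" per frame; the cap alone cuts the number of frames (and thus the GPU's duty cycle) by orders of magnitude.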

Mozi
Apr 4, 2004

Forms change so fast
Time is moving past
Memory is smoke
Gonna get wider when I die
Nap Ghost
Yes, I forced VSync on.

I disabled Optimus in the BIOS and it was definitely related - now, windowed programs still fully utilize the GPU, but for some reason at max use I top out a full 10 degrees C cooler than before! So I guess the problem isn't solved but isn't a huge hassle anymore... until I need battery life. I have a replacement fan assembly I was prepared to put in, which along with new thermal paste and a good cleaning would probably have helped.

Thanks for everyone's suggestions, sorry to clog up the thread. Carry on!

Edit: Still gets up to 93 eventually. Guess I'll have to open it up.

Mozi fucked around with this message at 01:14 on Sep 24, 2013

PC LOAD LETTER
May 23, 2005
WTF?!

Agreed posted:

Edit: If current leaks are true, performance of the 9000s being ~roughly at Titan level (a few FPS here or there depending on AA, games, etc.) is not exciting to me.
Titan performance for 40% less gets a "meh"? Sure, $600 is still a lot, but it's for a halo product at launch. The slightly-less high end and mid range products will probably sell for lots less, much like the 7950 vs 7970 this gen. I don't think it'll be 4xxx levels of prices all over again, but we should see some decent prices for some fairly powerful hardware. What would be exciting for you? Making PC gaming development more console-ish somehow? I'm not sure how MS could pull that off. Make a whole new "Windows for games" OS and have it load/unload when needed and just run everything through a hypervisor? That is probably a stupid idea but what else would work?

VDay posted:

What does one have to do with the other?
The reasoning I've seen is that this gen's consoles don't have the massive performance boost over their predecessors that the X360/PS3 had over the Xbox/PS2. Xb1/PS4 are both supposed to be around 6-8x faster than their predecessors. Sony and MS didn't push the hardware limits anywhere near as hard as they did last time, which is why the Xb1 is supposed to be profitable on day 1 for the hardware alone and the PS4 is supposed to make money after 1 or 2 game sales.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

PC LOAD LETTER posted:

Titan performance for 40% less gets a "meh"? Sure $600 is still a lot but its for a halo product at launch. The slightly-less high end and mid range products will probably sell for lots less much like the 7950 vs 7970 this gen. I don't think it'll be 4xxx level of prices all over again but we should see some decent prices for some fairly powerful hardware. What would be exciting for you? Making PC gaming development more console-ish somehow?

It's specious as all hell to talk about "Titan performance" - everyone who pays attention to this stuff knows that really it's the GTX 780 it's going up against, and Titan is the entry-level GPGPU card in the Kepler lineup. I will grant you that the launch schedule resulted in there being a period of time where Titan existed and the GTX 780 did not, and at dead stock settings it might get a few FPS higher, but let them use default turbo behavior and move the sliders up and the GTX 780 pretty quickly outruns the Titan for performance in games since it's got the same TDP but has half the VRAM to worry about and fewer SMXes.

So we're really looking at a lateral movement in terms of performance, not a revolutionary "holy poo poo Titan performance for around $600?!" - nVidia already did that, it's called the GTX 780. What AMD is doing, if the leaks are correct, is making a card that is competitive with the GTX 780, and good for them. Maybe they can lower prices on it since 28nm is pretty mature, if Maxwell launches and does outperform Kepler significantly enough to merit price competition.

And yeah, making PC game development "more console-ish" would be pretty neat, if by that you mean addressing the problem of moving workloads around coherently. As to how to do it, let's wait and see, yeah?
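For reference, the on-paper gap between those two cards works out from Nvidia's published core counts and base clocks (back-of-the-envelope only; as noted above, boost behaviour narrows it further in practice):

```python
# Peak FP32 throughput from published specs: 2 FLOPs per CUDA core per
# clock (one fused multiply-add). Base clocks only; boost ignored.
def peak_tflops(cuda_cores, clock_mhz):
    return 2 * cuda_cores * clock_mhz * 1e6 / 1e12

titan = peak_tflops(2688, 837)   # GTX Titan: 2688 cores @ 837 MHz base
gtx780 = peak_tflops(2304, 863)  # GTX 780:   2304 cores @ 863 MHz base

print(f"Titan:   {titan:.2f} TFLOPS")
print(f"GTX 780: {gtx780:.2f} TFLOPS ({gtx780 / titan:.0%} of Titan)")
```

Roughly a 12% paper deficit at identical 250W TDPs, which is why "Titan-class" and "780-class" are effectively the same gaming target once you let the clocks run.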

Agreed fucked around with this message at 00:39 on Sep 24, 2013

PC LOAD LETTER
May 23, 2005
WTF?!

Agreed posted:

It's specious as all hell to talk about "Titan performance" - everyone who pays attention to this stuff knows that really it's the GTX 780 it's going up against
Did AMD gimp GPGPU performance in Hawaii though? I haven't seen anything that suggests they have. GPGPU might not be too important right now, I'd put it up there with PhysX at the moment, but it's got a lot of potential.

Agreed posted:

And yeah, making PC game development "more console-ish" would be pretty neat, if by that you mean addressing the problem of moving workloads around coherently. As to how to do it, let's wait and see, yeah?
Only big development in hardware that maybe applies here for a while is going to be AMD hUMA I think. What is MS supposed to be doing for DX12?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

PC LOAD LETTER posted:

Only big development in hardware that maybe applies here for a while is going to be AMD hUMA I think. What is MS supposed to be doing for DX12?

Not much. Possibly nothing.

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

PC LOAD LETTER posted:

Only big development in hardware that maybe applies here for a while is going to be AMD hUMA I think. What is MS supposed to be doing for DX12?
I've read all the HSA stuff released thus far--I don't see any way it can possibly work with discrete GPUs. Also, considering half of it is basically "hey, WDDM is bad, we should do this magical thing instead that is less bad in this entirely hand-wavy way," yeah, not happening.

The_Franz
Aug 8, 2003

Agreed posted:

This would be an awesome point for the thread's resident industry dev person to come in and expand on how efficient render calls are on the consoles vs. the overhead introduced by an OS, how highly parallel they can actually be without so much poo poo in the way of scheduling... Paging, paging?

Is there any advantage anymore? The new consoles aren't the unitaskers of previous generations. They are running real operating systems (Windows/FreeBSD) with enough background tasks to justify reserving gigs of memory for the OS and they seem to be running actual compositors or windowing systems judging from how you can multitask while jumping in and out of games as opposed to the current method of reserving a millisecond or two to draw some GUI elements over the frame before the buffer flip.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
The point of a console OS and a multicore CPU, whether divided logically like the PS4 or virtualized like the Xboner, is that you can devote a fixed set of resources to unitasking even while having stuff left over for multitask stuff. It does matter and there is an advantage. Windows by itself doesn't have a mechanism to guarantee resource availability in the same way.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

PC LOAD LETTER posted:

Did AMD gimp GPGPU performance in Hawaii though? I haven't seen anything that suggests they have. GPGPU might not be too important right now, I'd put it up there with PhysX at the moment, but it's got a lot of potential.


I'd really not like to have to pay $1k for ungimped dual precision calculations. If Hawaii can compete with Titan on compute, that's good enough for me.

The_Franz
Aug 8, 2003

Factory Factory posted:

The point of a console OS and a multicore CPU, whether divided logically like the PS4 or virtualized like the Xboner, is that you can devote a fixed set of resources to unitasking even while having stuff left over for multitask stuff. It does matter and there is an advantage. Windows by itself doesn't have a mechanism to guarantee resource availability in the same way.

True, I was addressing the issue of console OSes no longer being "lite" in the sense that they now take more memory and may have even more going on in the background than an idle desktop OS and GPU operations being more expensive since they are now dealing with multiple GPU contexts and windowing systems as opposed to having exclusive GPU access in the past.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

The_Franz posted:

True, I was addressing the issue of console OSes no longer being "lite" in the sense that they now take more memory and may have even more going on in the background than an idle desktop OS and GPU operations being more expensive since they are now dealing with multiple GPU contexts and windowing systems as opposed to having exclusive GPU access in the past.

Well, Xboner is really not what you're thinking of. It's Xbox OS and Windows running side-by-side on a HyperV hypervisor. It's literally a unitasking OS and a multitasking OS running side by side, with resource division (and sharing) enforced by the hypervisor.

And even so, console OSes are "lite" in the sense that they offer bare-metal hardware access in a way that programming through DirectX on Windows does not. There is very little abstraction from the hardware, and optionally none. You just don't get that in Windows, even with OpenGL.

Alkanos
Jul 20, 2009

Ia! Ia! Cthulhu Fht-YAWN

Im_Special posted:

So has anyone with a GTX 560Ti been brave enough to try these yet, and how do they fare? I've been stuck on 314.22 for what feels like a year now because every new 32X.XX driver has given me weird stutter/mini-lockups.

Tried it, and got two freezes within an hour. Back to 314 I go!


The_Franz
Aug 8, 2003

Factory Factory posted:

And even so, console OSes are "lite" in the sense that they offer bare-metal hardware access in a way that programming through DirectX on Windows does not. There is very little abstraction from the hardware, and optionally none. You just don't get that in Windows, even with OpenGL.

Even on the PS3 and 360 you never really had total access to the hardware. You were still running on top of a hypervisor, a kernel with a task scheduler and an OS layer that needed a certain amount of time on a specific core(s). You had almost exclusive GPU access, but your completed frame still went through the OS so it could draw GUI elements on top of it. The PS3 had even more overhead since the disk encryption made it impossible to do unbuffered IO. I think the last platforms where you could boot up and basically have nothing else between you and the hardware were the PS2 and GameCube.

The PS4 and Xbone are still built for games, but no matter how you look at it there is a lot more overhead this time around as it's just the nature of the beast when you want proper multitasking, the ability to record footage and the ability to watch your cable box in a window next to your game.

The_Franz fucked around with this message at 03:21 on Sep 24, 2013
