  • Locked thread
L-O-N
Sep 13, 2004

Pillbug

Coredump posted:

http://techreport.com/discussions.x/22543
JPR: Nvidia ruled discrete GPUs last quarter


I know this is going back a ways, but this right here is where I was getting that idea from. I don't know what AMD can do, but they don't have the GPU market share that their products should have earned them.

This doesn't take into account AMD's integrated GPUs, which are eating into both Nvidia's low end and their own low end.

real_scud
Sep 5, 2002

One of these days these elbows are gonna walk all over you
Okay, so DP is nice to have as a quick and simple cable to connect. Frankly it sucks that every time I switch to another input and then come back, the damned resolution freaks out and whatever I had displayed on my secondary monitor [the DP-connected one] gets moved back to my primary one.

Guess I'll switch the input formats on the monitors since I never switch inputs on my primary monitor.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

real_scud posted:

Okay, so DP is nice to have as a quick and simple cable to connect. Frankly it sucks that every time I switch to another input and then come back, the damned resolution freaks out and whatever I had displayed on my secondary monitor [the DP-connected one] gets moved back to my primary one.

Guess I'll switch the input formats on the monitors since I never switch inputs on my primary monitor.

Yeah, when I got a 6950, I thought it would be fun to try DisplayPort. I ended up going back to DVI because of this. As I understand it, it's not a problem inherent to DisplayPort, but in the way Windows handles it.

I understand that detecting screens and altering the desktop accordingly is reasonable default behaviour, such as when you plug in a projector, but it would be really nice to have an option somewhere that preserved the desktop layout regardless of screen detection state.

Coredump
Dec 1, 2002

L-O-N posted:

This doesn't take into account AMD's integrated GPUs, which are eating into both Nvidia's low end and their own low end.

So what, that doesn't invalidate the point I was making.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
AMD has acquired SeaMicro, an innovative high-density server vendor and close Intel partner. SeaMicro were the guys pushing ultra-dense Atom servers, and servers that combined Atoms and Xeons together into very tiny packages. Perhaps we'll see servers with Brazos APUs instead of Atoms, combining better threaded performance with the opportunity for GPU computing? The 80 shader ALUs on an E-350 have about ten times the potential floating point performance of the CPU cores, so if someone cares enough to optimize their applications to run on a GPU there's a lot of performance available to pick up. The integrated memory controller on the Brazos APUs is also a pretty significant advantage over Atom for non-trivial workloads.
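For a rough sense of where that "ten times" figure comes from, here's a back-of-the-envelope peak-FLOPS estimate. The clock speeds and FLOPs-per-cycle values below are assumptions for illustration (80 shader ALUs at roughly 492 MHz, two Bobcat cores at 1.6 GHz), not official specs:

```python
# Back-of-the-envelope peak-FLOPS comparison for an AMD E-350 (Brazos).
# All figures are assumed for illustration: 80 shader ALUs at ~492 MHz
# doing 2 FLOPs/cycle (multiply-add), and 2 Bobcat cores at 1.6 GHz
# retiring ~2 FLOPs/cycle each.

def gflops(units, clock_ghz, flops_per_cycle):
    """Theoretical peak GFLOPS = units * clock * FLOPs per cycle per unit."""
    return units * clock_ghz * flops_per_cycle

gpu_peak = gflops(units=80, clock_ghz=0.492, flops_per_cycle=2)
cpu_peak = gflops(units=2, clock_ghz=1.6, flops_per_cycle=2)

print(f"GPU peak: {gpu_peak:.1f} GFLOPS")   # ~78.7
print(f"CPU peak: {cpu_peak:.1f} GFLOPS")   # ~6.4
print(f"Ratio:    {gpu_peak / cpu_peak:.1f}x")
```

With those assumed numbers the ratio lands around 12x, in the same ballpark as the "about ten times" above; theoretical peaks like these are never reached in practice, which is why the payoff depends on someone actually optimizing for the GPU.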

Bonus Edit: Rumors that Sony would be using an AMD Trinity APU (two Piledriver modules, four "cores", plus VLIW4 graphics) for the Playstation 4 appear to be solidifying. Previously it was sort of assumed (despite rumors to the contrary) that Sony wouldn't dump the Cell even if they switched to AMD as the GPU provider, but this is looking more and more likely.

Alereon fucked around with this message at 00:21 on Mar 1, 2012

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Interesting, and pretty sensible. Seems like a good way to keep the packaging small without building your own design.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Alereon posted:

AMD Trinity APU (two Piledriver modules, four "cores", plus VLIW4 graphics)

Eh? I thought Trinity was going to have GCN graphics. Leaked slides apparently referenced "next-gen DX11," which is kind of a limited field but suggests DX11.1.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Factory Factory posted:

Eh? I thought Trinity was going to have GCN graphics. Leaked slides apparently referenced "next-gen DX11," which is kind of a limited field but suggests DX11.1.
It's been confirmed as VLIW4 since last year; there's a pretty long lead time between when a new architecture hits the discrete market and when it gets integrated.

Anandtech posted:

Because of this need to inform developers of the hardware well in advance, while we’ve had a chance to see the fundamentals of GCN, products using it are still some time off. At no point has AMD specified when a GPU using GCN will appear, so it’s very much a guessing game. What we know for a fact is that Trinity – the 2012 Bulldozer APU – will not use GCN; it will be based on Cayman’s VLIW4 architecture.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Alereon posted:

It's been confirmed as VLIW4 since last year; there's a pretty long lead time between when a new architecture hits the discrete market and when it gets integrated.

Friggin' Google only giving me results since October. Thanks.

Star War Sex Parrot
Oct 2, 2003

Cross-posting because this is a drat joke.

Vagrancy posted:

AMD have released Catalyst drivers for the consumer preview:

http://support.amd.com/us/kbarticles/Pages/Windows8ConsumerPreviewDrivers.aspx
Dear AMD: please support the Radeon HD 7000 series.

Just in case you're not aware, it is your flagship card and you released it 2 months ago.

Love,
Star War Sex Parrot

tijag
Aug 6, 2002

Star War Sex Parrot posted:

Cross-posting because this is a drat joke.

Dear AMD: please support the Radeon HD 7000 series.

Just in case you're not aware, it is your flagship card and you released it 2 months ago.

Love,
Star War Sex Parrot

Agreed.

They beat nvidia to market by 4 months [at least] and yet it looks like it was too much for the driver team. GCN is the second 'new' arch from them in a year. I know VLIW-4 wasn't as big a change from VLIW-5 as GCN is, but still, they've been tasked with a big load.

I think they will figure it out and get it right. I feel like improving their drivers is going to increase performance a good amount.

illectro
Mar 29, 2010

:jeb: ROCKET SCIENCE :jeb:

Hullo, I'm Scoot Moonbucks.
Please stop being surprised by this.
I've got an issue with a newly installed Radeon HD 6450: if I install the full Catalyst package, the computer locks up solid after I log in, but it's fine if I install only the graphics drivers alone. This might be related to hardware monitoring, because when I try to run any third-party temperature monitoring program I get random temporary freezes, though those aren't fatal.

Has anyone seen this or got a clue about a fix? Am I missing some critical card management component if I don't have these ATI services running? Is my card going to fail to throttle itself and cook the GPU?
Furthermore, the DVI output doesn't appear to be producing a signal (I haven't tested the HDMI), and I'd really like to avoid being stuck on VGA. I can't figure out how to fix this without the Catalyst app.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

illectro posted:

I've got an issue with a newly installed Radeon HD 6450: if I install the full Catalyst package, the computer locks up solid after I log in, but it's fine if I install only the graphics drivers alone. This might be related to hardware monitoring, because when I try to run any third-party temperature monitoring program I get random temporary freezes, though those aren't fatal.

Has anyone seen this or got a clue about a fix? Am I missing some critical card management component if I don't have these ATI services running? Is my card going to fail to throttle itself and cook the GPU?
Furthermore, the DVI output doesn't appear to be producing a signal (I haven't tested the HDMI), and I'd really like to avoid being stuck on VGA. I can't figure out how to fix this without the Catalyst app.
Uninstall all existing drivers, use Driver Sweeper to remove the remnants, reboot, install latest drivers.

illectro
Mar 29, 2010

:jeb: ROCKET SCIENCE :jeb:

Hullo, I'm Scoot Moonbucks.
Please stop being surprised by this.

Alereon posted:

Uninstall all existing drivers, use Driver Sweeper to remove the remnants, reboot, install latest drivers.

No joy, but I did isolate it to the Catalyst Control Center gizmo; everything else works.

Star War Sex Parrot
Oct 2, 2003

The 7870 has landed.

http://techreport.com/articles.x/22573

http://www.anandtech.com/show/5625/amd-radeon-hd-7870-ghz-edition-radeon-hd-7850-review-rounding-out-southern-islands

forbidden dialectics
Jul 26, 2005






I like how they specifically call out 5850 owners. Give us a better reason to upgrade and we will, AMD!

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Wow, I was really expecting to be disappointed with near-6870 performance, but this is amazing. I guess I'm just gonna forget about building a new computer and drop a 7870 into my machine!

Edit: Holy poo poo these overclocking benchmarks!

Alereon fucked around with this message at 07:05 on Mar 5, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Very, very nice. That's a fairly killer price:performance bracket. nVidia is going to be proper screwed to compete (on merit, though that doesn't go as far as one might wish) if they can't produce a really nice line, and soon.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Nostrum posted:

I like how they specifically call out 5850 owners. Give us a better reason to upgrade and we will, AMD!

Eh, why would you? You can still play everything..

KillHour
Oct 28, 2007


This is pretty awesome, since I'm finally replacing my aging rig when Ivy Bridge hits. Hopefully nVidia puts out some awesome cards as well, so I can have my pick of the litter when I'm ready to buy.

heated game moment
Oct 30, 2003

Lipstick Apathy
Kind of disappointed that I can't justify replacing my $200 5870 with any of the new AMD cards. At least, not without stepping up to 2560x1600.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Cyrezar posted:

Kind of disappointed that I can't justify replacing my $200 5870 with any of the new AMD cards. At least, not without stepping up to 2560x1600.

Surely it's good that you have a relevant card that will last you longer?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Anandtech has been saying that this is the last generation of games that will run at 1080p on cards with 1GB of video RAM, but of course 8000-series cards may be about to hit before we see demanding, next-gen games. Unreal 4 could be pretty incredible, especially given the amazing things Epic Games was able to do with Unreal 3 in their Samaritan demo. That demo could run in real time on a Radeon HD 7900-series card. Jorge Jimenez's real-time separable SubSurface Scattering demo is also amazing; it runs pretty smoothly even on old cards. This CryTek paper on rendering tech in Crysis 3 is pretty cool too.
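To put the 1GB figure in context, here's a quick sketch of how much VRAM full-screen render targets actually take at 1080p. The target counts and formats are illustrative assumptions (32-bit colour, a five-target G-buffer), not any particular engine's layout:

```python
# Rough VRAM budget for full-screen render targets at 1080p.
# Assumes 4 bytes per pixel per target (e.g. RGBA8 colour or D24S8
# depth/stencil); real engines vary widely, so treat as illustrative.

WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4

def targets_mib(count, msaa=1):
    """MiB used by `count` full-screen targets at a given MSAA level."""
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * count * msaa / 2**20

basic = targets_mib(count=3)                # front + back buffer + depth
deferred_4x = targets_mib(count=5, msaa=4)  # 5-target G-buffer, 4x MSAA

print(f"Simple forward renderer: {basic:.0f} MiB")        # ~24 MiB
print(f"Deferred + 4x MSAA:      {deferred_4x:.0f} MiB")  # ~158 MiB
```

Even the heavyweight case stays well under 200 MiB, so the squeeze on 1GB cards comes mostly from texture and geometry assets, which is exactly what next-gen games will pile on.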

I've been watching Timothy Lottes' blog pretty closely; he works at nVidia and posts a lot about game and rendering technology. He's currently working on FXAA4, which is a shader-based antialiasing filter designed to be much more effective and efficient than traditional antialiasing. There are some pretty cool pictures here and here. If the game developer integrates the temporal FXAA4, you basically get 2X MSAA AND 2X SSAA for a tiny fraction of the price of regular 2X MSAA, and it works even behind shaders where traditional antialiasing doesn't (so, for example, an aliased edge seen distorted through glass will get antialiased too).

Yaos
Feb 22, 2003

She is a cat of significant gravy.
Speaking of Unreal 4, Epic is supposed to be talking about their new engine at GDC. Thanks for the link to that guy's blog; I like reading about the future of graphics.

I have a 5850 and it's been doing great for me; I'm waiting until at least the 8000 series comes out, maybe even longer. I'll have to see what AMD and Nvidia are offering at the end of this year before I get anything new. At that point my computer will be 3 years old, and I usually replace everything, so I'll be looking at processors too. I may go longer, though, since my hardware has been doing great.

Yaos fucked around with this message at 15:49 on Mar 6, 2012

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.
FXAA also blurs everything on the screen slightly, though.

tijag
Aug 6, 2002

Aleksei Vasiliev posted:

FXAA also blurs everything on the screen slightly, though.

If the developer implements FXAA I believe it is possible that the text can remain untouched by the process.

Peechka
Nov 10, 2005
I just replaced the 8800GS that I had in my crappy Dell system, and that card pretty much kept up until SWTOR, but that game has a lovely engine so it's partly the game's fault. And we're talking about a 5-year-old $130 card. I replaced it with a $135 ATI 6850, got about a 100% performance increase, and fully expect this one to last at least 3-4 years.

While it would be great to go out and spend $500 on a new top of the line card, I just find this to be foolish.

tijag
Aug 6, 2002

Peechka posted:

I just replaced the 8800GS that I had in my crappy Dell system, and that card pretty much kept up until SWTOR, but that game has a lovely engine so it's partly the game's fault. And we're talking about a 5-year-old $130 card. I replaced it with a $135 ATI 6850, got about a 100% performance increase, and fully expect this one to last at least 3-4 years.

While it would be great to go out and spend $500 on a new top of the line card, I just find this to be foolish.

What resolution is your monitor?

I have a 5850 with an i5-750 OC'd to 3.6ghz and SW:TOR ran pretty good [45-60 fps] with medium+ settings on 2048x1152.

If you're rocking a CRT monitor with a low resolution, your statement makes sense. If you have very, very low standards for what kind of graphical fidelity you require, your statement makes sense.

Peechka
Nov 10, 2005

tijag posted:

What resolution is your monitor?

I have a 5850 with an i5-750 OC'd to 3.6ghz and SW:TOR ran pretty good [45-60 fps] with medium+ settings on 2048x1152.

If you're rocking a CRT monitor with a low resolution, your statement makes sense. If you have very, very low standards for what kind of graphical fidelity you require, your statement makes sense.

My monitor is a cheap Dell at 1680x1050, and the 8800GS ran it at low settings, but there were slowdowns in some situations. I never measured the FPS, but I'd say it dipped to 10 fps at the lowest and averaged about 25 fps.

My proc is a Q6600 @ 2.4GHz with 4 gigs of RAM. With the new ATI 6850 I get anywhere from 35-50 FPS at medium settings.

In the other computer I have a Sapphire 5850 Extreme and a 2500K at 4.3GHz, and it runs SWTOR at 50-70 FPS on high with a 1680x1050 monitor. So what I'm assuming is that I'm more CPU-bound than anything in the old Dell.
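The CPU-bound conclusion holds up as rough arithmetic. Here's a small sketch using the numbers from these posts; the frame rates are midpoints of the reported ranges, so treat the ratios as approximate:

```python
# Sanity check on the CPU-bound hypothesis using figures from the thread.
# FPS values are midpoints of reported ranges, so ratios are approximate.

def megapixels(w, h):
    return w * h / 1e6

res_dell = megapixels(1680, 1050)    # the cheap Dell monitor
res_other = megapixels(2048, 1152)   # the other poster's monitor

# Only ~1.34x the pixels, so resolution alone doesn't explain large
# frame-rate gaps between similar GPUs.
print(f"Pixel ratio: {res_other / res_dell:.2f}x")

# Same resolution, same GPU class (5850/6850), very different CPUs:
fps_q6600 = (35 + 50) / 2    # Q6600 @ 2.4 GHz, medium settings
fps_2500k = (50 + 70) / 2    # 2500K @ 4.3 GHz, high settings
print(f"FPS ratio:   {fps_2500k / fps_q6600:.2f}x")  # faster CPU wins big
```

Roughly 1.4x the frame rate at the same resolution despite higher settings is the classic signature of a CPU bottleneck.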

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Aleksei Vasiliev posted:

FXAA also blurs everything on the screen slightly, though.
That was true for older versions, but FXAA3 and its later point releases (especially the current 3.11 version) are a lot better about removing aliasing while not affecting texture detail as much. HardOCP did a good FXAA article last year; even though they weren't using the latest version, it was much better than 2X MSAA or AMD's MLAA, delivering quality near 4X MSAA in many cases with performance better than 2X MSAA. FXAA4's temporal algorithm should be a lot more effective, as jittering the view between frames allows the algorithm to more effectively identify and remove temporal aliasing from all sources (not just the edges of geometry) while increasing texture sharpness thanks to the free 2X supersampling. The only downsides are that it requires game support and doesn't work as well during motion, but most games use motion blur anyway. The FXAA4 spatial-only algorithm will still cause some degree of sharpness loss, but I think that's less objectionable than the aliasing you give up. You can also combine FXAA with regular MSAA; 2X MSAA makes FXAA much more effective without too much cost.
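The core idea behind these filters (find high-contrast luma edges in the finished frame and blend across them) can be sketched in a few lines. This toy version is a drastic simplification of the real FXAA shader, which also estimates edge direction and searches along the edge; it's here purely to illustrate the principle, including the slight blur being discussed:

```python
# Toy FXAA-style pass: compute luma per pixel, flag high-contrast pixels,
# and blend them with their neighbours. A drastic simplification of the
# real shader, for illustration only.

def luma(rgb):
    """Perceptual brightness from an (r, g, b) tuple of 0..1 floats."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def fxaa_ish(img, threshold=0.25):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y][x], img[y - 1][x], img[y + 1][x],
                      img[y][x - 1], img[y][x + 1]]
            lumas = [luma(p) for p in window]
            if max(lumas) - min(lumas) > threshold:  # aliased edge found
                # blend the pixel with its 4 neighbours (a box filter)
                out[y][x] = tuple(sum(p[c] for p in window) / 5
                                  for c in range(3))
    return out

white, black = (1.0, 1.0, 1.0), (0.0, 0.0, 0.0)
img = [[white, white, black, black] for _ in range(4)]  # hard vertical edge
smoothed = fxaa_ish(img)
# Pixels along the edge move toward grey: 1.0 -> 0.8 and 0.0 -> 0.2
```

The edge-softening and the texture blur are the same operation here; the real algorithm's edge-direction search is what lets it smooth geometry edges while leaving texture detail mostly alone.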

Star War Sex Parrot
Oct 2, 2003

MLAA 2.0 is looking a lot more attractive now. I'd like to see an updated compare/contrast article with the latest FXAA.

Zhentar
Sep 28, 2003

Brilliant Master Genius

tijag posted:

If the developer implements FXAA I believe it is possible that the text can remain untouched by the process.

Correct. It's pretty simple - the developer just tells it to do the FXAA pass before putting the text on the screen.
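That ordering can be sketched as a trivial pipeline; all the names here are illustrative stand-ins, not any real engine's API:

```python
# Sketch of the ordering described above: run the FXAA pass over the 3D
# scene first, then composite text/HUD afterwards so UI edges stay crisp.
# All function names are illustrative stand-ins.

def render_scene(scene):
    return {"content": scene, "fxaa_applied": False, "layers": ["scene"]}

def fxaa_pass(framebuffer):
    # antialiases only what has been drawn so far (the 3D scene)
    framebuffer["fxaa_applied"] = True
    return framebuffer

def draw_hud(framebuffer, hud):
    # text/HUD lands after the AA pass, so it is never blurred
    framebuffer["layers"].append(hud)
    return framebuffer

def render_frame(scene, hud):
    fb = render_scene(scene)
    fb = fxaa_pass(fb)       # order matters: AA before the HUD
    return draw_hud(fb, hud)

frame = render_frame("geometry", "chat text")
# frame["fxaa_applied"] is True, and the HUD layer was added afterwards
```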

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Star War Sex Butt posted:

MLAA 2.0 is looking a lot more attractive now. I'd like to see an updated compare/contrast article with the latest FXAA.

I disagree. More specifically, new MLAA might be better than old MLAA (stressing the might, as it's more "different" than better - seems to be able to differentiate HUD elements better, but it also doesn't categorically improve antialiasing behavior, either, in some cases better, some cases worse). It's not more attractive than FXAA's modern iterations by a wide stretch. But if that's not what you meant, apologies.

Star War Sex Parrot
Oct 2, 2003

Agreed posted:

I disagree.
I should have said "a lot more attractive than MLAA 1.0." My apologies if I made it seem like I was talking about FXAA.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Not that it's anything to do with AMD at all, but: http://www.anandtech.com/show/5672/acer-aspire-timelineu-m3-life-on-the-kepler-verge

An uninspiring (1366x768 15.6") Acer notebook with.. Kepler! Nothing interesting to see here really, other than the battery life on this machine.

Is there an NVIDIA thread? Should there be one? I don't know, but we were talking about graphics cards in general here for a bit, and some were waiting for Kepler info.

movax
Aug 30, 2008

HalloKitty posted:

Not that it's anything to do with AMD at all, but: http://www.anandtech.com/show/5672/acer-aspire-timelineu-m3-life-on-the-kepler-verge

An uninspiring (1366x768 15.6") Acer notebook with.. Kepler! Nothing interesting to see here really, other than the battery life on this machine.

Is there an NVIDIA thread? Should there be one? I don't know, but we were talking about graphics cards in general here for a bit, and some were waiting for Kepler info.

I was thinking of a dedicated GPU thread; I've got like 70% of an OP written for that, just had to put it on hold for a while.

That's kinda hilarious though: everyone scrambling because Acer "accidentally" let a notebook out the door with a Kepler-based chip on board. If they're already selling that machine, though, Acer must have been shipped sufficient quantities of the chip to put it into production. Maybe desktop Kepler will actually show up in Q3!

Seamonster
Apr 30, 2007

IMMER SIEGREICH

HalloKitty posted:

Not that it's anything to do with AMD at all, but: http://www.anandtech.com/show/5672/acer-aspire-timelineu-m3-life-on-the-kepler-verge

An uninspiring (1366x768 15.6") Acer notebook with.. Kepler! Nothing interesting to see here really, other than the battery life on this machine.

Is there an NVIDIA thread? Should there be one? I don't know, but we were talking about graphics cards in general here for a bit, and some were waiting for Kepler info.

All it needs is the 1080p screen from the xps 15z or even the regular xps if possible...

adorai
Nov 2, 2002

10/27/04 Never forget
Grimey Drawer
We received two demo DL165 G7 servers from our reseller this week. We put ESXi 4.1u2 on them, joined them to one of our VMware clusters, and started migrating VMs over. So far, I am actually pretty impressed with the Bulldozer processors in them. Since it really all depends on VM load this next number is useless, but the max CPU utilization we saw today was 35% on a single-proc 6272.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Ah, don't tell me you're impressed with Bulldozer virtualisation performance, because I had the choice recently between Opteron 4184 (previous generation 6-core, 2.8GHz) and Opteron 4284 (Bulldozer, 4-module, 3GHz) and after much deliberation, went for the 4184.

Potentially the wrong decision. Anyone else got experience to chime in with?

HalloKitty fucked around with this message at 10:39 on Mar 15, 2012

Gwaihir
Dec 8, 2009
Hair Elf
Most of the benchmarks comparing the two architectures put performance clock-for-clock at about the same, with the power-use advantage going to the older Magny-Cours chips. Either way, you're getting about the same performance for VM use (you're just using more juice with Bulldozer). Bulldozer wins in a couple of applications thanks to enhancements it has, like hardware-accelerated AES support, which the older chip completely lacks, but obviously that's not a really common thing to be using your servers for in terms of the usual web/database/email/whatever. The new chips do have significantly lower *idle* power usage, however, so if the machines are not loaded too heavily you'd win some savings there.

e: http://www.anandtech.com/show/5058/amds-opteron-interlagos-6200/5 - these are the higher-end 12-core 6100s and 4-module BD chips, but the results should scale down.

Gwaihir fucked around with this message at 18:44 on Mar 15, 2012
