|
Coredump posted: http://techreport.com/discussions.x/22543 This doesn't take into account AMD's integrated GPUs, which are eating into both Nvidia's low end and their own low end.
|
# ? Feb 28, 2012 19:01 |
|
Ookay, so DP is nice, a quick and simple cable to connect. Frankly it sucks that every time I switch to another input and then come back, the damned resolution freaks out and whatever I had displayed on my secondary monitor [the DP-connected one] gets moved back to my primary one. Guess I'll switch the input formats on the monitors, since I never switch inputs on my primary monitor.
|
# ? Feb 29, 2012 06:11 |
|
real_scud posted:Ookay, so DP is nice, to have a quick and simple cable to connect. Frankly it sucks that every time I switch to another input, and then come back to it the damned resolution freaks out and whatever I had displayed on my secondary monitor [the DP connected one] gets moved back to my primary one. Yeah, when I got a 6950 I thought it would be fun to try DisplayPort, and I ended up going back to DVI because of this. As I understand it, it's not a problem inherent to DisplayPort but in the way Windows handles it. Detecting screens and altering the desktop accordingly is reasonable enough default behaviour, such as when you plug in a projector, but it would be really nice to have an option somewhere that preserved the desktop layout regardless of screen detection state.
|
# ? Feb 29, 2012 11:06 |
|
L-O-N posted:This doesn't take into account AMD's integrated GPUs, which is eating into both Nvidia's low end and their own low end. So what? That doesn't invalidate the point I was making.
|
# ? Feb 29, 2012 14:01 |
|
AMD has acquired SeaMicro, an innovative high-density server vendor and close Intel partner. SeaMicro were the guys pushing ultra-dense Atom servers, and servers that combined Atoms and Xeons together into very tiny packages. Perhaps we'll see servers with Brazos APUs instead of Atoms, combining better threaded performance with the opportunity for GPU computing? The 80 shader ALUs on an E-350 have about ten times the potential floating-point performance of the CPU cores, so if someone cares enough to optimize their applications to run on the GPU, there's a lot of performance available to pick up. The integrated memory controller on the Brazos APUs is also a pretty significant advantage over Atom for non-trivial workloads. Bonus Edit: Rumors that Sony would be using an AMD Trinity APU (two Piledriver modules, four "cores", plus VLIW4 graphics) for the Playstation 4 appear to be solidifying. Previously it was sort of assumed (despite rumors to the contrary) that Sony wouldn't dump the Cell even if they switched to AMD as the GPU provider, but this is looking more and more likely. Alereon fucked around with this message at 00:21 on Mar 1, 2012 |
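As a sanity check on that "about ten times" figure, here's some back-of-the-envelope arithmetic. The clock speeds and per-cycle throughputs below are rough assumptions for illustration (the E-350's IGP clock and Bobcat's SSE throughput in particular), not vendor-published peak numbers:

```python
# Rough peak-FLOPS comparison for the AMD E-350 (Zacate) APU.
# All throughput figures below are assumptions for illustration.

GPU_ALUS = 80            # VLIW5 shader ALUs in the integrated HD 6310
GPU_CLOCK_GHZ = 0.492    # ~492 MHz shader clock (assumed)
FLOPS_PER_ALU = 2        # one multiply-add (2 FLOPs) per ALU per clock

CPU_CORES = 2            # Bobcat cores
CPU_CLOCK_GHZ = 1.6
CPU_FLOPS_PER_CYCLE = 2  # rough per-core SSE throughput (assumed)

gpu_gflops = GPU_ALUS * GPU_CLOCK_GHZ * FLOPS_PER_ALU          # ~78.7
cpu_gflops = CPU_CORES * CPU_CLOCK_GHZ * CPU_FLOPS_PER_CYCLE   # ~6.4

print(f"GPU ~{gpu_gflops:.1f} GFLOPS, CPU ~{cpu_gflops:.1f} GFLOPS, "
      f"ratio ~{gpu_gflops / cpu_gflops:.0f}x")
```

Under these assumptions the GPU comes out at roughly an order of magnitude ahead, which is consistent with the ~10x claim; real-world gains depend entirely on how well the workload maps to the GPU.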
# ? Mar 1, 2012 00:15 |
|
Interesting, and pretty sensible. Seems like a good way to keep the packaging small without building your own design.
|
# ? Mar 1, 2012 00:27 |
|
Alereon posted:AMD Trinity APU (two Piledriver modules, four "cores", plus VLIW4 graphics) Eh? I thought Trinity was going to have GCN graphics. Leaked slides apparently referenced "next-gen DX11," which is kind of a limited field but suggests DX11.1.
|
# ? Mar 1, 2012 00:36 |
|
Factory Factory posted:Eh? I thought Trinity was going to have GCN graphics. Leaked slides apparently referenced "next-gen DX11," which is kind of a limited field but suggests DX11.1. Anandtech posted:Because of this need to inform developers of the hardware well in advance, while we've had a chance to see the fundamentals of GCN, products using it are still some time off. At no point has AMD specified when a GPU using GCN will appear, so it's very much a guessing game. What we know for a fact is that Trinity – the 2012 Bulldozer APU – will not use GCN; it will be based on Cayman's VLIW4 architecture.
|
# ? Mar 1, 2012 00:41 |
|
Alereon posted:It's been confirmed as VLIW4 since last year, there's a pretty long lead time between when a new architecture hits the discrete market and when it gets integrated. Friggin' Google only giving me results since October. Thanks.
|
# ? Mar 1, 2012 00:50 |
|
Cross-posting because this is a drat joke. Vagrancy posted:AMD have released Catalyst drivers for the consumer preview: Just in case you're not aware, it is your flagship card and you released it 2 months ago. Love, Star War Sex Parrot
|
# ? Mar 1, 2012 03:57 |
|
Star War Sex Parrot posted:Cross-posting because this is a drat joke. Agreed. They beat Nvidia to market by 4 months [at least], and yet it looks like it was too much for the driver team. GCN is the second 'new' arch from them in a year. I know VLIW4 wasn't as big a change from VLIW5 as GCN is, but still, they've been handed a big load. I think they'll figure it out and get it right. I feel like improving their drivers is going to buy a good amount of performance.
|
# ? Mar 1, 2012 18:32 |
|
I've got an issue with a newly installed Radeon HD 6450: if I install the full Catalyst package, the computer locks up solid after I log in, but it's fine if I install only the graphics drivers alone. This might be related to hardware monitoring, because when I try to run any third-party temperature monitoring program I get random temporary freezes, though those aren't fatal. Has anyone seen this, or got a clue to a fix? Am I missing some critical card management component if I don't have these ATI services running? Is my card going to fail to throttle itself and cook the GPU? Furthermore, the DVI output doesn't appear to be producing a signal (I haven't tested the HDMI), and I'd really like to avoid being stuck on VGA. I can't figure out how to fix this without the Catalyst app.
|
# ? Mar 4, 2012 18:48 |
|
illectro posted:I've got an issue with a newly installed Radeon HD6450, if I just install the catalyst package then the computer will lock up solid after I login, but, it's fine if I only install the graphics drivers alone. This might be related to hardware monitoring because when I try to run any third party temperature monitoring program I get random temporary freezes, but these aren't fatal. Uninstall all existing drivers, use Driver Sweeper to remove the remnants, reboot, install latest drivers.
|
# ? Mar 4, 2012 19:01 |
|
Alereon posted:Uninstall all existing drivers, use Driver Sweeper to remove the remnants, reboot, install latest drivers. No joy, but I did isolate it to the Catalyst control panel gizmo; everything else works.
|
# ? Mar 5, 2012 00:43 |
|
The 7870 has landed. http://techreport.com/articles.x/22573 http://www.anandtech.com/show/5625/amd-radeon-hd-7870-ghz-edition-radeon-hd-7850-review-rounding-out-southern-islands
|
# ? Mar 5, 2012 06:28 |
|
Star War Sex Butt posted:The 7870 has landed. I like how they specifically call out 5850 owners. Give us a better reason to upgrade and we will, AMD!
|
# ? Mar 5, 2012 06:34 |
|
Wow, I was really expecting to be disappointed with near-6870 performance, but this is amazing. I guess I'm just gonna forget about building a new computer and drop a 7870 into my machine! Edit: Holy poo poo these overclocking benchmarks! Alereon fucked around with this message at 07:05 on Mar 5, 2012 |
# ? Mar 5, 2012 06:56 |
|
Very, very nice. That's a fairly killer price:performance bracket. nVidia is going to be proper screwed to compete (on merit, though that doesn't go as far as one might wish) if they can't produce a really nice line, and soon.
|
# ? Mar 5, 2012 07:10 |
|
Nostrum posted:I like how they specifically call out 5850 owners. Give us a better reason to upgrade and we will, AMD! Eh, why would you? You can still play everything.
|
# ? Mar 5, 2012 09:50 |
|
This is pretty awesome, since I'm finally replacing my aging rig when Ivy Bridge hits. Hopefully nVidia puts out some awesome cards as well, so I can have my pick of the litter when I'm ready to buy.
|
# ? Mar 5, 2012 16:07 |
|
Kind of disappointed that I can't justify replacing my $200 5870 with any of the new AMD cards. At least, not without stepping up to 2560x1600.
|
# ? Mar 6, 2012 01:07 |
|
Cyrezar posted:Kind of disappointed that I can't justify replacing my $200 5870 with any of the new AMD cards. At least, not without stepping up to 2560x1600. Surely it's good that you have a relevant card that will last you longer?
|
# ? Mar 6, 2012 10:09 |
|
Anandtech has been saying that this is the last generation of games that will run at 1080p on cards with 1GB of video RAM, but of course 8000-series cards may be about to hit before we see demanding next-gen games. Unreal 4 could be pretty incredible, especially given the amazing things Epic Games was able to do with Unreal 3 in their Samaritan demo. That demo could run in real time on a Radeon HD 7900-series card. Jorge Jimenez's real-time separable subsurface scattering demo is also amazing; it runs pretty smoothly even on old cards. This CryTek paper on rendering tech in Crysis 3 is pretty cool too. I've been watching Timothy Lottes' blog pretty closely; he works at Nvidia and posts a lot about game and rendering technology. He's currently working on FXAA4, a shader-based antialiasing filter designed to be much more effective and efficient than traditional antialiasing. There are some pretty cool pictures here and here. If the game developer integrates the temporal FXAA4, you basically get 2X MSAA AND 2X SSAA for a tiny fraction of the price of regular 2X MSAA, and it works even behind shaders where traditional antialiasing doesn't (so, for example, an aliased edge seen distorted through glass will get antialiased too).
|
# ? Mar 6, 2012 10:38 |
|
Speaking of Unreal 4, Epic is supposed to be talking about their new engine at GDC. Thanks for the link to that guy's blog; I like reading about the future of graphics. I have a 5850 and it's been doing great for me, so I'm waiting until at least the 8000 series comes out, maybe even longer. I'll have to see what AMD and Nvidia are offering at the end of this year before I get anything new. At that point my computer will be 3 years old, and I usually replace everything, so I'll be looking at processors too. I may go longer though, since my hardware has been doing great. Yaos fucked around with this message at 15:49 on Mar 6, 2012 |
# ? Mar 6, 2012 15:46 |
|
FXAA also blurs everything on the screen slightly, though.
|
# ? Mar 8, 2012 18:28 |
|
Aleksei Vasiliev posted:FXAA also blurs everything on the screen slightly, though. If the developer implements FXAA I believe it is possible that the text can remain untouched by the process.
|
# ? Mar 8, 2012 20:44 |
|
I just replaced the 8800GS that I had in my crappy Dell system, and that card pretty much kept up until SWTOR, but that game has a lovely engine, so it's partly the game's fault. And we're talking about a 5-year-old $130 card. I replaced it with a $135 ATI 6850, got about a 100% performance increase, and fully expect this to last at least 3-4 years. While it would be great to go out and spend $500 on a new top-of-the-line card, I just find that to be foolish.
|
# ? Mar 8, 2012 21:11 |
|
Peechka posted:I just replaced my 8800GS that I had in my crappy Dell system, and this card pretty much kept up until SWTOR, but that game has a lovely engine so its partly the games fault. And were talking about a 5 yr old $130 card. I replaced that with a $135 ATI 6850, got about a 100% performance increase and fully expect this to last at least 3-4 yrs. What resolution is your monitor? I have a 5850 with an i5-750 OC'd to 3.6GHz, and SWTOR ran pretty well [45-60 fps] with medium+ settings at 2048x1152. If you're rocking a CRT monitor with a low resolution, your statement makes sense; likewise if you have very, very low standards for graphical fidelity.
|
# ? Mar 8, 2012 22:39 |
|
tijag posted:What resolution is your monitor? My monitor is a cheap Dell at 1680x1050, and the 8800GS ran it at low settings, but there were slowdowns in some situations. I never saw the FPS, but I'd say 10 fps at the lowest, averaging about 25 fps. My proc is a Q6600 @ 2.4GHz with 4 gigs of RAM. With the new ATI 6850 I get anywhere from 35-50 fps at medium settings. The other computer has a Sapphire 5850 Extreme and a 2500K at 4.3GHz, and it runs SWTOR at 50-70 fps on high with a 1680x1050 monitor. So what I'm assuming is that I'm more CPU-bound than anything in the old Dell.
|
# ? Mar 9, 2012 01:29 |
|
Aleksei Vasiliev posted:FXAA also blurs everything on the screen slightly, though.
|
# ? Mar 9, 2012 18:05 |
|
MLAA 2.0 is looking a lot more attractive now. I'd like to see an updated compare/contrast article with the latest FXAA.
|
# ? Mar 9, 2012 19:25 |
|
tijag posted:If the developer implements FXAA I believe it is possible that the text can remain untouched by the process. Correct. It's pretty simple - the developer just tells it to do the FXAA pass before putting the text on the screen.
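As a sketch of what that ordering means, here's a hypothetical frame loop where each pass is a stand-in function that just records when it ran. The function names are made up for illustration, not from any real engine API; the point is only that the post-process AA runs on the scene buffer before the UI is composited:

```python
# Hypothetical render order: scene -> FXAA -> UI.
# Each "pass" is a stub that logs itself, so the ordering is explicit.

calls = []

def render_scene(buf):
    calls.append("scene")   # 3D geometry, lighting, etc.
    return buf

def fxaa_pass(buf):
    calls.append("fxaa")    # post-process AA applied to the scene only
    return buf

def draw_ui(buf):
    calls.append("ui")      # text/HUD composited after AA, so it stays crisp
    return buf

def render_frame():
    buf = render_scene(object())
    buf = fxaa_pass(buf)    # run FXAA *before* the UI is drawn
    return draw_ui(buf)

render_frame()
print(calls)  # → ['scene', 'fxaa', 'ui']
```

Swap the last two passes and the filter would blur the text along with everything else, which is exactly the complaint about driver-level FXAA that can only operate on the final frame.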
|
# ? Mar 9, 2012 19:38 |
|
Star War Sex Butt posted:MLAA 2.0 is looking a lot more attractive now. I'd like to see an updated compare/contrast article with the latest FXAA. I disagree. More specifically, new MLAA might be better than old MLAA (stressing the might, as it's more "different" than better: it seems to differentiate HUD elements better, but it doesn't categorically improve antialiasing behavior either; some cases are better, some worse). It's not more attractive than FXAA's modern iterations, not by a long shot. But if that's not what you meant, apologies.
|
# ? Mar 9, 2012 21:29 |
|
Agreed posted:I disagree.
|
# ? Mar 9, 2012 22:43 |
|
Not that it's anything to do with AMD at all, but: http://www.anandtech.com/show/5672/acer-aspire-timelineu-m3-life-on-the-kepler-verge An uninspiring (1366x768, 15.6") Acer notebook with... Kepler! Nothing interesting to see here really, other than the battery life on this machine. Is there an NVIDIA thread? Should there be one? I don't know, but we were talking about graphics cards in general here for a bit, and some people were waiting for Kepler info.
|
# ? Mar 14, 2012 14:43 |
|
HalloKitty posted:Not that it's anything to do with AMD at all, but: http://www.anandtech.com/show/5672/acer-aspire-timelineu-m3-life-on-the-kepler-verge I was thinking of a dedicated GPU thread; I've got like 70% of an OP written for it, but I had to put it on hold for a while. That's kinda hilarious though, everyone scrambling because Acer "accidentally" let a notebook out the door with a Kepler-based chip on board. If they're already selling that machine, though... Acer got shipped sufficient quantities of the chip to put it into production? Maybe desktop Kepler will actually show up in Q3!
|
# ? Mar 14, 2012 16:06 |
|
HalloKitty posted:Not that it's anything to do with AMD at all, but: http://www.anandtech.com/show/5672/acer-aspire-timelineu-m3-life-on-the-kepler-verge All it needs is the 1080p screen from the xps 15z or even the regular xps if possible...
|
# ? Mar 14, 2012 19:06 |
|
We received two demo DL165 G7 servers from our reseller this week. We put ESXi 4.1u2 on them, joined them to one of our VMware clusters, and started migrating VMs over. So far I am actually pretty impressed with the Bulldozer processors in them. Since it really all depends on VM load this next number is useless, but the max CPU utilization we saw today was 35% on a single-proc 6272.
|
# ? Mar 15, 2012 01:27 |
|
Ah, don't tell me you're impressed with Bulldozer virtualisation performance, because I had the choice recently between Opteron 4184 (previous generation 6-core, 2.8GHz) and Opteron 4284 (Bulldozer, 4-module, 3GHz) and after much deliberation, went for the 4184. Potentially the wrong decision. Anyone else got experience to chime in with? HalloKitty fucked around with this message at 10:39 on Mar 15, 2012 |
# ? Mar 15, 2012 10:36 |
|
Most of the benchmarks comparing the two architectures put clock-for-clock performance at about the same, with the power-use advantage going to the older Magny-Cours chips. Either way, you're getting about the same performance for VM use (you're just burning more juice with Bulldozer). Bulldozer wins in a couple of applications thanks to its enhancements, like hardware-accelerated AES support, which the older chip completely lacks, but obviously that's not a really common thing to be using your servers for in terms of the usual web/database/email/whatever. The new chips do have significantly lower *idle* power usage, however, so if the machines aren't loaded too heavily you'd see some savings there. e: http://www.anandtech.com/show/5058/amds-opteron-interlagos-6200/5 these are the higher-end 12-core 6100s and 4-module BD chips, but the results should scale down. Gwaihir fucked around with this message at 18:44 on Mar 15, 2012 |
# ? Mar 15, 2012 18:41 |