|
SourKraut posted:I've been debating between the non-X 290 or going with the ASUS Matrix R9-280X. Aside from a faster card you do get AMD's TrueAudio - assuming developers support it.
|
# ? Oct 14, 2013 22:01 |
|
The 290 non-X should have TrueAudio too. TrueAudio is present on all GCN 1.1 GPUs, just not on the re-launched GCN 1.0 models.
|
# ? Oct 14, 2013 22:04 |
|
Yeah, TrueAudio is somewhat tempting, but for the games I play at 1440p (WoW, D3, some old Source games, Civ 5) I'm not sure how much benefit I'd see going with the 290 over the 280X. I've been tempted to just get the 280X now and, if I need a little more power down the road, buy another 280X once specials kick in and Crossfire it. I know Crossfire isn't the best, but I'm hoping AMD will keep fixing it via driver updates, so by the time I'd actually Crossfire them it'd be better. At the same time, though, the lack of a bridge on the 290s is an interesting development for Crossfire.
|
# ? Oct 14, 2013 22:30 |
|
Now it appears that the 290X is slated for October 18th. Goddammit AMD, which is it? Oct. 18th, or Oct. 24th? Some reports even say it won't be out until Halloween. The constant date jumping plus the lack of confirmation on price is really wracking my brain.
|
# ? Oct 15, 2013 00:25 |
|
Rahu X posted:Now it appears that the 290X is slated for October 18th. What's the card going to retail at? At least it'll be out before Battlefield 4 is released. That's all we could ever hope for.
|
# ? Oct 15, 2013 04:09 |
|
Tab8715 posted:What's the card going to retail at? I'm hoping the 290 non-x will be out before BF4 is released as my PC is 3 years old and I'm building a new one.
|
# ? Oct 15, 2013 04:35 |
|
Wikipedia shows the R290 is supposed to be released tomorrow?
|
# ? Oct 15, 2013 07:19 |
|
Tab8715 posted:Wikipedia shows the R290 is supposed to be released tomorrow?
|
# ? Oct 15, 2013 16:29 |
|
Rahu X posted:Now it appears that the 290X is slated for October 18th. Funny that. Nvidia is supposed to announce something on the 17th. The plot thickens. I just need more pixels, my 7970 is overclocked to 1200 and can't go a lick faster without artifacts and can't push 3 monitors for poo poo if I turn up any of the settings.
|
# ? Oct 15, 2013 19:41 |
|
Some supposed 290X numbers. How accurate/reliable these are, well, who knows.
|
# ? Oct 16, 2013 15:59 |
|
Wasn't Nvidia supposed to have a refresh to counter AMD's recent announcements? I heard rumors of a Titan LE or Titan Ultra. That's not happening, is it?
|
# ? Oct 16, 2013 18:45 |
|
Lolcano Eruption posted:Wasn't Nvidia supposed to have a refresh to counter AMD's recent announcements? I heard rumors of a Titan LE or Titan Ultra. That's not happening, is it?
|
# ? Oct 16, 2013 20:19 |
|
Lolcano Eruption posted:Wasn't Nvidia supposed to have a refresh to counter AMD's recent announcements? I heard rumors of a Titan LE or Titan Ultra. That's not happening, is it? Nothing of that sort. The most you'll see concerning the Titan is MAYBE a price drop, but I'd say that's unlikely. What's pretty much confirmed at this point is that NVIDIA will be introducing the 760ti and maybe a 770ti. At least the 760ti, because there's already a page for it.
|
# ? Oct 16, 2013 20:28 |
|
There was speculation months ago about a 790 model, but I haven't heard anything substantiating that.
|
# ? Oct 16, 2013 20:44 |
|
Rahu X posted:What's pretty much confirmed at this point is that NVIDIA will be introducing the 760ti and maybe a 770ti. At least the 760ti, because there's already a page for it.
|
# ? Oct 16, 2013 23:57 |
|
People are losing their loving minds over this AMD launch. This poo poo happens every time one company releases a new card and everyone seems to forget that the other company did the exact same thing the previous year. It's pretty funny to watch the fanboys for each side bicker and fight over which team is better and bring up/ignore "facts" about each side. I just wish they would get it done already so I can decide which card I'm going to get. As it is, a single 7970 overclocked to its breaking point is -not- enough to run 8000x1440 resolution with any sort of eyecandy.
|
# ? Oct 17, 2013 16:40 |
|
You have three monitors, right? If you want to game at those resolutions, you're going to have to go with a multi-GPU setup. Anyway, I do hope the R9-290 won't be so power hungry.
|
# ? Oct 17, 2013 17:41 |
|
veedubfreak posted:People are losing their loving minds over this AMD launch. This poo poo happens every time one company releases a new card and everyone seems to forget that the other company did the exact same thing the previous year. It's pretty funny to watch the fanboys for each side bicker and fight over which team is better and bring up/ignore "facts" about each side. What kind of setup even adds up to 8000x1440? edit: and plus, anybody could have told you that 8000x1440 is well beyond the purview of a single card. It's 40% more pixels than 4k. Even dual- and tri-SLI titans struggle at 4k. Magic Underwear fucked around with this message at 18:10 on Oct 17, 2013 |
# ? Oct 17, 2013 18:06 |
|
Magic Underwear posted:What kind of setup even adds up to 8000x1440? Likely 7680x1440 with some amount of bezel correction in the drivers. I do the same on nVidia and usually end up with a bezel corrected resolution around that.
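Back-of-the-envelope, the bezel-corrected math works out like this (a rough Python sketch; the 160 px per bezel gap is just an illustrative assumption, since the real value depends on your monitors' bezel width and is configured in the driver):

```python
# Sketch of how bezel compensation inflates a triple-monitor resolution.
# The per-gap pixel count is an assumed/illustrative value, not a driver default.

def bezel_corrected_width(panel_width: int, panels: int, bezel_px: int) -> int:
    """Total horizontal resolution the GPU renders: the physical panels
    plus extra 'hidden' pixels for each gap between adjacent panels."""
    gaps = panels - 1
    return panels * panel_width + gaps * bezel_px

# Three 2560x1440 panels with no compensation is a plain 7680 wide;
# ~160 px of compensation per gap lands on the 8000x1440 figure quoted above.
print(bezel_corrected_width(2560, 3, 0))    # -> 7680
print(bezel_corrected_width(2560, 3, 160))  # -> 8000
```

The point being that the GPU really is rendering those extra off-screen pixels, so bezel compensation makes the workload slightly heavier than the raw panel resolutions suggest.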
|
# ? Oct 17, 2013 18:16 |
|
Got a special edition GTX 580, and I wonder how it compares to today's cards. Here are the specifications: Nvidia GeForce GTX 580 Xtreme Lightning Ed. 832 MHz Core 3072MB GDDR5 4200MHz Memory http://www.msi.com/product/vga/N580GTX-Lightning-Xtreme-Edition.html#/?div=Specification Could someone enlighten me as to how this works out compared to today's standards?
|
# ? Oct 17, 2013 18:18 |
|
Ximen posted:Got a special ed GTX 580, and I wonder how it compares to todays specs. Here are the specifications: http://www.anandtech.com/bench/GPU13/704 Look at all these benchmarks, but remember that your card is a high-end overclocked version so it will do considerably better than a stock 580. I would say it will do pretty drat well at 1080p for another few years. For 1440p or 1600p, you're going to have to turn things down to stay at 60 fps.
|
# ? Oct 17, 2013 18:43 |
|
I'm thinking of selling my SLI 2GB 680s and getting a single 290X card because I'm pretty sick of dual GPU woes in some games, and at 1440p I'm not quite getting the performance I expect. However, the market in Quebec sucks dick to sell local and the best offer I got was $550 for both cards; am I doing something wrong?
|
# ? Oct 17, 2013 20:08 |
|
Athropos posted:I'm thinking of selling my SLI 2gb 680s and getting a single 290x card because I'm pretty sick of dual GPU woes in some games and at 1440p I'm not quite getting the performance I expect. However the market in Quebec sucks dick to sell local and the best offer I got was 550$ for both cards, am I doing something wrong? I would try to sell them separately. You make more money that way with any product, and especially, I suppose, with two identical cards that don't work that well together. I see them going for around $300 (USD) each on eBay. Also, why are you averse to shipping? Sell them separately and use shipping. ethanol fucked around with this message at 20:23 on Oct 17, 2013 |
# ? Oct 17, 2013 20:19 |
|
Athropos posted:I'm thinking of selling my SLI 2gb 680s and getting a single 290x card because I'm pretty sick of dual GPU woes in some games and at 1440p I'm not quite getting the performance I expect. However the market in Quebec sucks dick to sell local and the best offer I got was 550$ for both cards, am I doing something wrong? Out of curiosity, which games are you playing at 1440p where the 2x 680s aren't up to the performance? I ask because I've been considering either 2x 770s or 2x 280Xs long-term and I'm at 1440p, so it'd be interesting to see with what games it may not even matter to go SLI or Crossfire.
|
# ? Oct 17, 2013 20:23 |
|
To be fair, I doubt it's the actual power of two 680s that's the problem, but more the 2 GiB of VRAM.
|
# ? Oct 17, 2013 20:26 |
|
SourKraut posted:Out of curiosity, which games are you playing at 1440p where the 2x 680s aren't up to the performance? I ask because I've been considering either 2x 770s or 2x 280Xs long-term and I'm at 1440p, so it'd be interesting to see with what games it may not even matter to go SLI or Crossfire. I dunno, it's wildly inconsistent. I'm one of them "60 fps or bust" people, and Borderlands 2 (maybe due to physics), Final Fantasy 14 (CPU bottlenecks, which SLI doesn't help with), BF4 (which maxes out VRAM usage like mad), and Rome 2 (a poorly coded piece of poo poo that chugs even 5GHz octocores) all lack oomph at that resolution. Some games run flawlessly though, and controller games like Assassin's Creed and the Batman games I can play on my HDTV at 1080, which gives my 2 cards a break. Maybe these frame drops to like 40 fps are because of CPU bottlenecks I don't know about. i7-3770K running at 4.4GHz. I'm thinking that a single GPU solution with more VRAM will bring me fewer problems in the end by putting less stress on the CPU. Athropos fucked around with this message at 22:19 on Oct 17, 2013 |
# ? Oct 17, 2013 22:17 |
|
It doesn't really work like that in practice; none of the single-GPU cards are powerful enough to match twin GTX 680s, except in cases where a lack of VRAM is causing the issue. The new AMD card is not going to be enough of an improvement over a Titan or GTX 780 to measure up.
|
# ? Oct 17, 2013 22:24 |
|
Magic Underwear posted:http://www.anandtech.com/bench/GPU13/704 Well, the problem nowadays is that there are many incompatibilities since it's an overclocked 580. An example is BF4, where I had major frame drops even on the High/Med options, whilst on Ultra it was unplayable. On the other hand, there's BF3, which I can run smoothly on Ultra settings and which in my opinion looks a lot better than BF4.
|
# ? Oct 17, 2013 22:43 |
|
Magic Underwear posted:What kind of setup even adds up to 8000x1440? Oh I'm well aware of that. As noted, it's bezel compensated 3x27". The problem is, MWO, which I can't seem to stop loving playing, does not support multiple cards because PGI is a bunch of loving retards. But I do plan on either picking up 2 290x -if- it comes out with a frame pacing driver that works in crossfire and eyefinity. Otherwise I'm just gonna get 1 and wait until either MWO stops being dumb or nvidia brings out something to beat a single 290x.
|
# ? Oct 17, 2013 22:51 |
|
So I updated my GeForce drivers to 331.40 Beta, for Windows 8.1. It enabled Stereoscopic 3D by default. Everything has this drat fool red/blue split. Why on earth is that a default setting??
|
# ? Oct 17, 2013 23:25 |
|
veedubfreak posted:The problem is, MWO, which I can't seem to stop loving playing, Factory Factory posted:So I updated my GeForce drivers to 331.40 Beta, for Windows 8.1. It enabled Stereoscopic 3D by default. Everything has this drat fool red/blue split. Why on earth is that a default setting??
|
# ? Oct 17, 2013 23:32 |
|
BITCOIN MINING RIG posted:Holy poo poo MWO is still going on? I figured IGP would've gone under by now, if someone didn't just burn their offices to the ground. It was like that in the beta too, I have no clue why.
|
# ? Oct 17, 2013 23:52 |
|
WHERE MY HAT IS AT posted:It was like that in the beta too, I have no clue why.
|
# ? Oct 17, 2013 23:58 |
|
Two interesting things recently posted about that I just wanted to briefly remark on: 1. A person playing games known to be extremely poorly optimized and/or CPU-bound complains of the performance of two 680s in SLI. Legitimate gripe though on the 2GB VRAM limit, that's a bummer. 2. A person playing at a resolution roughly 40% more pixels than 4K is miffed that no modern card supports it adequately, when the current crop of cards (and I include Titan and the 290X there) aren't capable of running the most demanding games of today singly, or in some cases even in tandem, at 1080p at a minimum of 60fps. Such is the price of pixels; the charge toward higher resolutions will remain the domain of PC graphics as it always has been, especially given the performance I think we can expect from consoles at or near 1080p. What am I saying? Nothing bad about either use case, just that some users' demands on hardware are not yet solved problems. Time will tell.
|
# ? Oct 18, 2013 00:24 |
|
AMD's stock lost almost 8% of its value on after hours trading. This is after posting profit for the first time in 4 quarters.
|
# ? Oct 18, 2013 00:31 |
|
True teaser benchmarks of the 290X are in, just in case you guys didn't know. http://www.tomshardware.com/news/amd-radeon-r9-290x-benchmark-performance,24732.html http://www.anandtech.com/show/7424/amd-radeon-r9-290x-performance-preview-bioshock-infinite http://www.legitreviews.com/benchmarks-amd-radeon-r9-290x-versus-nvidia-geforce-gtx780_126653 What interests me most is that on the Legit Reviews site, the results are labeled as being in "Quiet Mode." This wasn't referenced at all by AMD to my knowledge. Either way, I don't know how to feel about this. The NDA will be lifted soon though, which eases my mind some.
|
# ? Oct 18, 2013 01:16 |
|
Rahu X posted:Either way, I don't know how to feel about this. The NDA will be lifted soon though, which eases my mind some. This whole launch has bothered me quite a bit. I'm sure someone in AMD's marketing department probably said "Hey, let's just throw out little bits of information here and there, but never actually say when this thing is being released. It'll totally build suspense and hype!", but it's rather annoying to not even say when the NDA might be lifted or such. They might as well just launch the drat thing or at least just say when it'll actually be released.
|
# ? Oct 18, 2013 04:32 |
|
Does anyone actually feel hyped by the 290X?
|
# ? Oct 18, 2013 04:37 |
|
Factory Factory posted:Does anyone actually feel hyped by the 290X? I'd personally love to upgrade from my 7870Ghz, but if the 290X is going to be $600+ and only similar to a 780, I'll be pretty disappointed. However, if it's $500 and womps a 780, I'll be a happy man.
|
# ? Oct 18, 2013 04:55 |
|
Factory Factory posted:Does anyone actually feel hyped by the 290X? Not that I know of, which is why it baffles me as to why AMD is handling this the way they are.
|
# ? Oct 18, 2013 04:58 |