Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to a pair of really nice gaudy shoes?


Guni posted:

I'd personally love to upgrade from my 7870 GHz Edition, but if the 290X is going to be $600+ and only similar to a 780, I'll be pretty disappointed. However, if it's $500 and womps a 780, I'll be a happy man.

Wasn't the quoted MSRP $549-749?


Gonkish
May 19, 2004

Tab8715 posted:

Wasn't the quoted MSRP $549-749?

They didn't supply an MSRP at the reveal, unlike the rest of the series. An unconfirmed listing on Newegg put it at $729, although that was the custom version from MSI; the guesstimate from that is a $699 stock version.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to a pair of really nice gaudy shoes?


When is the NDA officially over?

Magic Underwear
May 14, 2003


Young Orc

Tab8715 posted:

When is the NDA officially over?

no one knows.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Was nVidia supposed to announce something today? It almost feels like there was some sort of game of chicken to see whether AMD was going to release sensitive info after the past week. I hate everything. Also I'm another one of those cursed with playing games that are hobbled by one CPU or GPU limitation or another.

Magic Underwear
May 14, 2003


Young Orc

Sidesaddle Cavalry posted:

Was nVidia supposed to announce something today? It almost feels like there was some sort of game of chicken to see whether AMD was going to release sensitive info after the past week. I hate everything. Also I'm another one of those cursed with playing games that are hobbled by one CPU or GPU limitation or another.

Well, they did announce a game bundle. http://www.geforce.com/whats-new/articles/announcing-the-geforce-gtx-holiday-bundle-featuring-heroes-pirates-and-spies

Starting the 28th, 770/780/Titan gets you Splinter Cell Blacklist, ACIV, and Batman AO. 760/680/670/660ti/660 gets you the first two. Plus with any of them you get $100 off some piece of hardware that no one cares about.

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

Factory Factory posted:

Does anyone actually feel hyped by the 290X?
You mean you're not wetting yourself over 3D positional audio and a proprietary 3D API only used by specific games? What, was 1998 not good enough?

Rahu X
Oct 4, 2013

"Now I see, lak human beings, dis like da sound of rabbing glass, probably the sound wave of the whistle...rich agh human beings, blows from echos probably irritating the ears of the Namek People, yet none can endure the pain"
-Malaysian King Kai

Factory Factory posted:

Does anyone actually feel hyped by the 290X?

I was at first, but the constantly shifting and unconfirmed NDA dates, along with price rumors of $700 and recent benchmarks that show it barely beating a GTX 780, have really dulled the hype for me.

I think what's drained me the most personally is that I chose to wait on these drat cards. My previous card (a GTX 580) died almost a month ago, and I figured I would wait to see what AMD was offering at the top of their new lineup before I jumped on anything. The prospect of a Titan killer for around what I paid for my GTX 580 at launch really hyped me up. Then I heard the vastly increased price rumors. Then the delayed NDAs. Then the 290 (which was my new go-to card) getting delayed past the 290X. And now these benchmarks.

At this point, I'm seriously considering just getting something like a 280X or two and being done with it, because I'm an impatient sod who can't stand being on an HD 3650 any longer. At least that's the plan if the 290X ends up being too rich for my blood and the 290 doesn't show until November.

I hope that "Quiet Mode" bit from the Legit Reviews benchmarks ends up being true, because if it isn't, AMD might as well stand for Always Massive Disappointments.

Rahu X fucked around with this message at 07:15 on Oct 18, 2013

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Anandtech are the masters of faint praise and of saying a little more than they're allowed to, through generalization:

Anandtech's "Why-Is-AMD-Making-Us-Do-This" Preview posted:

Moving on, it’s interesting to note in this case that both cards are essentially tied at Ultra quality, but when we dial down to medium the 290X takes a very decisive 14% lead. At the highest quality settings we should be shader/texture bound due to the significant use of shader effects on Bioshock’s highest quality settings, whereas at lower quality settings at least some of the bottleneck will shift to elements such as ROP throughput, memory bandwidth, and the geometry pipeline.

Wrapping this preview up, we’ll have more details on the 290X in the near future. AMD has made it clear that they are aiming high with their new flagship video card, so it will be interesting to see what they can pull off as we approach Tahiti/7970’s second birthday.

1. It sure is interesting that if you were to hypothetically present a scenario where known hardware performs at a certain level with THESE settings, and note that said known hardware and this special sauce are tied, but special sauce takes off at THOSE settings, it hypothetically suggests (about no card in particular) ROP and memory bandwidth limitations/geometry throughput. Not saying anything in situ, just, you know, musing. Cough.

2. "AMD has made it clear that they are aiming high" - well, that's a curious statement. Price? Who knows :D

3. "...it will be interesting to see what they can pull off as we approach Tahiti/7970's second birthday ahaha that's just being a dick about it, especially given that much of the R9 ### is made up of tech that is indeed two years old.


I don't think the writers/staff at Anand are super pleased to be involved in this baffling publicity stunt. The card seems like it should perform quite well, as you'd reasonably expect from 6 billion transistors, and I'm looking forward to more information as it comes out, but this struck me as particularly funny: it pushes the boundaries of what seems to have been allowed in testing + "speculation" in a pretty neat way.
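
To make the bottleneck-shifting logic concrete, here's a toy model (a sketch with invented numbers, nothing measured from any actual card): frame time is governed by the slowest pipeline stage, so two cards can tie while shader-bound and then separate once the limit shifts to ROPs or bandwidth.

    # Toy model of bottleneck shifting (all numbers invented, illustrative only).
    # Frame time is set by whichever pipeline stage is slowest.
    def frame_ms(shader_ms, rop_ms, bandwidth_ms):
        return max(shader_ms, rop_ms, bandwidth_ms)

    # "Ultra": heavy shader effects, so both cards are shader-bound and tie.
    known_ultra = frame_ms(shader_ms=20.0, rop_ms=10.0, bandwidth_ms=12.0)
    sauce_ultra = frame_ms(shader_ms=20.0, rop_ms=8.0, bandwidth_ms=9.0)

    # "Medium": shader load drops, so stronger ROPs/bandwidth start to show.
    known_medium = frame_ms(shader_ms=8.0, rop_ms=10.0, bandwidth_ms=12.0)
    sauce_medium = frame_ms(shader_ms=8.0, rop_ms=8.0, bandwidth_ms=9.0)

    print(known_ultra, sauce_ultra)    # 20.0 20.0 -> tied at Ultra
    print(known_medium, sauce_medium)  # 12.0 9.0  -> special sauce pulls ahead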



Edit: Re: Mantle, y'know... I had a 3DFX card and was old enough to think Glide was bullshit then; I don't think AMD has the market penetration to get their proprietary 3D API adopted widely any more than 3DFX could at the time. It's hard to serve one master more than the other when the numbers don't support it, and studios run on tight enough schedules and budgets as it is; see the number of PhysX games for more :cry: Especially since it seems to require a ground-up level of attention, if BF4's release timing is any indication. Smells funny.

Agreed fucked around with this message at 11:49 on Oct 18, 2013

Schpyder
Jun 13, 2002

Attackle Grackle

Agreed posted:

Edit: Re: Mantle, y'know... I had a 3DFX card and was old enough to think Glide was bullshit then; I don't think AMD has the market penetration to get their proprietary 3D API adopted widely any more than 3DFX could at the time.

I think they actually could, and AnandTech makes the same point I was going to before I even loaded that article to read some more about Mantle:

It's the new consoles.

Low-level APIs are de rigueur in the console space. Since AMD is supplying the CPU & GPU for the XB1 and PS4, and since they're both based on GCN, if they keep Mantle as similar as possible to the APIs they provide on the new consoles, then any dev doing cross-platform development has very little work to do to enable Mantle support. And in that regard, their PC market share is largely irrelevant.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to a pair of really nice gaudy shoes?


Where are my 290x reviews?

Dotcom656
Apr 7, 2007
I WILL TAKE BETTER PICTURES OF MY DRAWINGS BEFORE POSTING THEM

Sidesaddle Cavalry posted:

Was nVidia supposed to announce something today? It almost feels like there was some sort of game of chicken to see whether AMD was going to release sensitive info after the past week. I hate everything. Also I'm another one of those cursed with playing games that are hobbled by one CPU or GPU limitation or another.

There is a big announcement today according to a friend within the company, but he's not giving up any info. Not sure when it's going to be, but it's definitely not just some holiday game bundle.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Schpyder posted:

if they keep Mantle as similar as possible to the APIs they provide on the new consoles

Any ideas as to how they plan to do that given the massive differences between how the consoles move workloads around and operate on them vs. how PCs do it?

Here's a block diagram SemiAccurate did on the Xbone:



Bandwidth limitations ahoy trying to port that directly to PC, where there's distance to be crossed over PCIe, memory access isn't unified and direct, etc. I'm not saying that pet-project games that get a lot of development help won't be able to outperform non-Mantle games, but I don't think it's as easy as just saying "we'll do it like the consoles" and that's the end of it, because the consoles do things way, way differently re: coherency.
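
Rough back-of-the-envelope on that bandwidth gap (approximate peak figures of my own, nothing official): PCIe 3.0 x16 moves about 15.75 GB/s each way, while the Xbone's shared DDR3 is quoted around 68 GB/s, so shuffling a console-style working set across the bus every frame just doesn't fit in a frame budget.

    # Back-of-the-envelope: time to move a (deliberately extreme) working set.
    # Bandwidth numbers are approximate theoretical peaks, not measured rates.
    PCIE3_X16_GBPS = 15.75   # PCIe 3.0 x16, one direction
    XBONE_DDR3_GBPS = 68.0   # Xbox One shared DDR3, approximate peak

    def ms_to_move(gigabytes, gbps):
        return gigabytes / gbps * 1000.0

    working_set_gb = 2.0  # hypothetical per-frame working set, chosen for effect
    print(ms_to_move(working_set_gb, PCIE3_X16_GBPS))   # ~127 ms over the bus
    print(ms_to_move(working_set_gb, XBONE_DDR3_GBPS))  # ~29 ms in unified memory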

Agreed fucked around with this message at 15:13 on Oct 18, 2013

Icept
Jul 11, 2001

Factory Factory posted:

Does anyone actually feel hyped by the 290X?

Reasonably hyped for it, considering the only game I really care about performance in is BF4, and Mantle should be providing the goods there. However, there's no way I'm putting down money until we see some independent figures showing at least a 10%+ performance increase over everything else in that price range. And Mantle isn't even going to be implemented until December, so no day-one sale either way.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Factory Factory posted:

Does anyone actually feel hyped by the 290X?

After having an AMD card, and now an Nvidia card, for the past year (with Nvidia before my AMD card), it would take extraordinary performance gains to convince me to go back to AMD. I know both sides have their bad days when it comes to drivers and whatnot, but my experience between the two was night and day. The only hype I feel for the 290X is in wondering whether or not it will drive GTX 780 prices down. It would need to be a good 20% faster to convince me to buy it, and I don't think that's happening.

The_Franz
Aug 8, 2003

Schpyder posted:

I think they actually could, and AnandTech makes the same point I was going to before I even loaded that article to read some more about Mantle:

It's the new consoles.

Low-level APIs are de rigueur in the console space. Since AMD is supplying the CPU & GPU for the XB1 and PS4, and since they're both based on GCN, if they keep Mantle as similar as possible to the APIs they provide on the new consoles, then any dev doing cross-platform development has very little work to do to enable Mantle support. And in that regard, their PC market share is largely irrelevant.

Mantle isn't going to be on the consoles. Microsoft has flat out stated that the only API available on the Xbone, like its predecessors, is D3D. The PS4 already has a proprietary PS3-esque low-level API along with an OpenGL implementation, and apparently a D3D compatibility wrapper built around that to ease porting. Mantle is Windows and *nix only. Even then, AMD's OpenGL guy has stated that the performance difference between Mantle and a properly-implemented modern OpenGL rendering system should be minimal, so whether it's really worth targeting this proprietary API remains to be seen.

Gwaihir
Dec 8, 2009
Hair Elf
"GTX 780Ti", Good job NV. This g-sync initiative sounds pretty damned cool, if I didn't need to get a new monitor to use it.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Gwaihir posted:

"GTX 780Ti", Good job NV. This g-sync initiative sounds pretty damned cool, if I didn't need to get a new monitor to use it.

God drat it nVidia I'm not giving you my money again for this, I'm just not.

So wishing I'd waited a few months to upgrade, could have got the games I'm probably going to be buying for free :mad: I guess I could just say gently caress it and run two 780s in SLI and get a big ol' monitor to make it slightly less stupid, let's see how far I can go to make sure the tail is very firmly wagging the dog here

Gwaihir
Dec 8, 2009
Hair Elf
Yea, this seems like something I would really like to try. It would just mean selling my new 3014 and upgrading from the GTX 680 to something that could push those very high frame rates at high resolutions. From the list of monitor makers they had signed on, it seems like they'll likely be using the same 144 Hz TN panels though, which is sorta... eh. I dunno.

veedubfreak
Apr 2, 2005

by Smythe

Agreed posted:

Two interesting things recently posted about that I just wanted to briefly remark on:

1. A person playing games known to be extremely poorly optimized and/or CPU-bound complains of the performance of two 680s in SLI. Legitimate gripe though on the 2GB VRAM limit, that's a bummer.

2. A person playing at a resolution not quite twice 4K is miffed that no modern card supports it adequately, when the current crop of cards (and I include Titan and the 290X there) aren't capable of running the most demanding games of today singly, or in some cases even in tandem, at 1080p at a minimum of 60fps. Such is the price of pixels; the charge toward higher resolutions will remain the domain of computer graphics as it has been, but especially so given the performance I think we can expect from consoles at or near 1080p.

What am I saying? Nothing bad about either use case, just that some users' demands on hardware are not yet solved problems. Time will tell.

Oh trust me, I know exactly what kind of dumb poo poo I've done. I bought the 690 back in December when my 6970s died, before knowing that the game didn't support SLI. Had I not been a retard I'd have just bought a 7970 or 780, but the 690 was still top dog at the time. I am playing at basically the lowest possible settings with my card overclocked to 1200 and still only pushing around 45-50 fps, but at least it's playable, and triple monitors is the bomb. I'm just waiting to see -if- the 290X beats the 780 by enough to warrant buying it, or if NV comes out with something to top it. I'm buying something, I'm just waiting to see who ends up top dog when the 290 comes out, but at the rate we're going, it's going to be Thanksgiving before I can buy a loving card.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

The smartest thing I have done within a couple generations regarding graphics cards was buying a GTX 680 when they came out and I got a decent deal on it that made the performance edge over the 670 actually mean something. Could have got a 7970 and said the same thing, both Kepler GK104 and Tahiti were distance runners when it comes to performance.

The dumbest thing I have done recently regarding graphics cards was to then sell that top-notch 680, which would overclock like loving crazy, to replace it with a 780 that's kind of an overclocking dud. I mean, obviously it performs significantly better, but... I can't even remember why I thought that was a good idea. Even if I paid for most of it by selling the 680 and the 580 that came before it, I just don't know what motivated me to buy it in the first place. That 680 would still be holding strong right now, and I could have got some pretty rad free games or whatever by upgrading a bit later, or probably just stuck with 1080p through to Maxwell to see what nVidia's up to, since I tend to go with their cards.

Anyone who buys high-end graphics cards does dumb poo poo; I hope it's clear I'm not calling you out. The pursuit of shiny things is what it is.


Edit: Honorable mention for My Dumb Thing - having a PhysX card. That sucker gets so much use you don't even know. :shepface:

Agreed fucked around with this message at 17:02 on Oct 18, 2013

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



With Anand indicating that they think the 780 Ti will take the current $650 price point of the 780, what do you guys think the 780 might drop down to?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

SourKraut posted:

With Anand indicating that they think the 780 Ti will take the current $650 price point of the 780, what do you guys think the 780 might drop down to?

If it gets to around $550 I'll put plan "do a dumb loving thing" into motion and just give up all hope of ever being considered reasonable again. The games bundle and that low of a price would be enough for me. Two 780s. God that's dumb. What is wrong with me. So I'm hoping it just bonks it down to $600 or so.

Right now, though, we're waiting on AMD to lift the NDA and reveal the price of their top-end so nVidia can counter accordingly. I reckon we've underestimated GK110 yields and they've been ready to do this. Or, I've been underestimating them, perhaps something I could have clued into based on the odd lasering and odder validation numbers coming off of the 780s as a whole.

Agreed fucked around with this message at 17:56 on Oct 18, 2013

Animal
Apr 8, 2003

Agreed posted:

If it gets to around $550 I'll put plan "do a dumb loving thing" into motion and just give up all hope of ever being considered reasonable again. The games bundle and that low of a price would be enough for me. Two 780s. God that's dumb. What is wrong with me. So I'm hoping it just bonks it down to $600 or so.

Right now, though, we're waiting on AMD to lift the NDA and reveal the price of their top-end so nVidia can counter accordingly. I reckon we've underestimated GK110 yields and they've been ready to do this. Or, I've been underestimating them, perhaps something I could have clued into based on the odd lasering and odder validation numbers coming off of the 780s as a whole.

Same here. Two 780's. In a MicroATX case :pcgaming:

Wistful of Dollars
Aug 25, 2009

Animal posted:

Same here. Two 780's. In a MicroATX case :pcgaming:

You're a good man. :911:

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Agreed posted:

If it gets to around $550 I'll put plan "do a dumb loving thing" into motion and just give up all hope of ever being considered reasonable again. The games bundle and that low of a price would be enough for me. Two 780s. God that's dumb. What is wrong with me. So I'm hoping it just bonks it down to $600 or so.
Yeah, if it drops to $550, and depending on how the R9-290/X pans out, I may give up my current plan of going with 2x ASUS Matrix 280Xs and grab a 780 instead (then pick up a second 780 sometime next year). I game in Windows, but I like to play around in OS X as a Hackintosh, so the 780 would be more amenable to continuing that.

Yaos
Feb 22, 2003

She is a cat of significant gravy.
So this may be the big thing Nvidia was announcing: Nvidia G-SYNC.
http://blogs.nvidia.com/blog/2013/10/18/g-sync/

Using a chip in the monitor and a GeForce card ( :argh: ), G-Sync gives the monitor a variable refresh rate that eliminates screen tearing and stuttering without V-Sync. It works with most GeForce cards. According to the Anandtech live blog it will be available in Q1 2014.

Yaos fucked around with this message at 18:38 on Oct 18, 2013

Wistful of Dollars
Aug 25, 2009

I just finished reading the Anand story on g-sync and great googly-moogly it looks nice. :staredog:

Grim Up North
Dec 12, 2011

Yeah, sounds nice, but I'm a bit wary. Will this be a repeat of 3DVision, where there's really only one display that supports it, and it's great for gaming but poo poo for everything else?

The_Franz
Aug 8, 2003

Yaos posted:

So this may be the big thing Nvidia was announcing: Nvidia G-SYNC.
http://blogs.nvidia.com/blog/2013/10/18/g-sync/

Using a chip in the monitor and a GeForce card ( :argh: ), G-Sync gives the monitor a variable refresh rate that eliminates screen tearing and stuttering without V-Sync. It works with most GeForce cards. According to the Anandtech live blog it will be available in Q1 2014.

This will be incredibly awesome for applications like video playback and emulation where you are constantly fighting to keep the audio and video in sync due to clock drift or the refresh rate of the output display not quite matching, or even being way off from, the source material. No more weird timing tricks, triple-buffering, occasional skips, tearing or resampling audio to keep everything in sync.

This needs to be in every monitor and TV. Right now.
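
A quick sketch of why that fight exists today (toy code, not any real display API): 23.976 fps film content never lands cleanly on a 60 Hz refresh, so every frame gets shown a little early or late unless you drop, duplicate, or resample; a variable-refresh panel can simply wait for each frame.

    # Toy sketch: where 23.976 fps frames land on a fixed 60 Hz display.
    # No real presentation API here; "shown" is just the nearest vblank slot.
    SOURCE_FPS = 24000 / 1001   # 23.976... fps film content
    FIXED_HZ = 60.0

    for n in range(5):
        ideal = n / SOURCE_FPS                      # when the frame should appear
        shown = round(ideal * FIXED_HZ) / FIXED_HZ  # nearest 60 Hz slot available
        print(f"frame {n}: ideal {ideal*1000:7.2f} ms, shown {shown*1000:7.2f} ms")

    # On a variable-refresh display, "shown" can simply equal "ideal", so
    # nothing needs to be dropped, duplicated, or resampled to stay in sync.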

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

The_Franz posted:

This needs to be in every monitor and TV. Right now.

I agree, except "I'm hearing that NVIDIA wants to try and get the module down to below $100 eventually." Since the G-Sync monitor is $400 and the base model is $280 at "10% off", it makes me wonder about 1) how cheap they can get the module, since it seems near to the cheapness goal already, and 2) the markup on monitors.

It would be a rather fundamental change in tech to go from clock-based frames to "New frame? Okay, updating" across ALL displays. All this clocked poo poo has been engineered like mad and made cheap because the first TVs had to get their sync cues from the AC mains power frequency.
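
Quick arithmetic on what those quoted prices imply for the module, taking the numbers at face value and assuming "10% off" means off list:

    # Implied G-Sync module premium from the quoted monitor prices.
    gsync_price = 400.00
    base_street = 280.00            # quoted as "10% off"
    base_list = base_street / 0.90  # ~$311 list

    print(round(gsync_price - base_list, 2))    # ~88.89 vs. list price
    print(round(gsync_price - base_street, 2))  # 120.0 vs. street price
    # Either way the premium already sits around $90-120, so "below $100
    # eventually" is less of a stretch than it first sounds.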

Factory Factory fucked around with this message at 19:10 on Oct 18, 2013

Rahu X
Oct 4, 2013

"Now I see, lak human beings, dis like da sound of rabbing glass, probably the sound wave of the whistle...rich agh human beings, blows from echos probably irritating the ears of the Namek People, yet none can endure the pain"
-Malaysian King Kai
As said by others, I think I know what card I'm getting if it drops to $550.

That depends on how this whole 290/290X bit shapes up as well, because if the 290 ends up doing well against a 780, and it's cheaper, I might cave and get that.

G-Sync seems interesting too, but the way they make it sound, you'll only be able to install it on certain monitors via the mod kit. Not that it would do much good on my small rear end 21.5" 1080p 60 Hz monitor anyway. Seems like an interesting future purchase though.

Expecting AMD to make something to counter this in the near future too.

Sindai
Jan 24, 2007
i want to achieve immortality through not dying
I sure hope G-Sync is practical and affordable by the time I need to replace my current monitors. It sounds really cool.

Wistful of Dollars
Aug 25, 2009

Well, at least the pending releases of the Ti and the 290X have made things a little interesting for a bit.

I hope the g-sync stuff spreads across the market, I don't fancy the thought of having to buy specific, limited monitors just to use the technology.

TyrantWD
Nov 6, 2010
Ignore my doomerism, I don't think better things are possible
Not being able to do a new build in the summer is looking better and better. I thought I'd have to decide between an overclocked 780 and 290X for $650, but it seems like I will get even more bang for my buck with the 780Ti launching a few weeks later. I'm guessing this is why AMD held off on pricing.

fookolt
Mar 13, 2012

Where there is power
There is resistance
The Nvidia gamestream thing sounds awesome, but is it always going to be limited to the Shield thing? I'd love to just be able to use it with a Windows or Mac laptop with 5GHz wifi.

Gonkish
May 19, 2004

Agreed posted:

God drat it nVidia I'm not giving you my money again for this, I'm just not.

So wishing I'd waited a few months to upgrade, could have got the games I'm probably going to be buying for free :mad: I guess I could just say gently caress it and run two 780s in SLI and get a big ol' monitor to make it slightly less stupid, let's see how far I can go to make sure the tail is very firmly wagging the dog here

DO IT DO IT DO IT DO IT :allears:

Animal posted:

Same here. Two 780's. In a MicroATX case :pcgaming:

You are a loving hero. :patriot:

fookolt posted:

The Nvidia gamestream thing sounds awesome, but is it always going to be limited to the Shield thing? I'd love to just be able to use it with a Windows or Mac laptop with 5GHz wifi.

I'd like to see it on more devices, but right now they're trying to push Shield so that will pretty much never happen.

Gonkish fucked around with this message at 03:05 on Oct 19, 2013

Rosoboronexport
Jun 14, 2006

Get in the bath, baby!
Ramrod XTreme
Nvidia ShadowPlay has a release date again, which is something I've been waiting on for a while. Utilizing the H.264 on-chip encoder, the software saves 1080p60 .mp4 at a 50 Mbps max bitrate. Soon the web will be filled with effortless let's plays and livestreams.
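
For a sense of scale, straight arithmetic from that quoted 50 Mbps figure (the bitrate is the only number taken from the announcement):

    # Disk usage at ShadowPlay's quoted 50 Mbps max bitrate.
    mbps = 50
    mb_per_sec = mbps / 8                  # 6.25 MB/s
    gb_per_hour = mb_per_sec * 3600 / 1000
    print(gb_per_hour)                     # 22.5 GB per hour of footage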

Gamer2210
Nov 15, 2011
Hey guys, I finally saved up enough money and I wanna upgrade my GPU from my Asus GTX 580 (1.5GB).
I'm wondering if I should wait until Nvidia releases competing GPUs to AMD Radeon's recent releases.

I might get an extra monitor or two for a multi-display setup.
So would a single GPU be enough, or should I get the 2x GTX 770 SLI build I was planning?

What would you advise me to get, and when?


Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Gamer2210 posted:

Hey guys, I finally saved up enough money and I wanna upgrade my GPU from my Asus GTX 580 (1.5GB).
I'm wondering if I should wait until Nvidia releases competing GPUs to AMD Radeon's recent releases.

I might get an extra monitor or two for a multi-display setup.
So would a single GPU be enough, or should I get the 2x GTX 770 SLI build I was planning?

What would you advise me to get, and when?

This one's easy, I hope I beat Factory Factory to it, fast typing skills GO!

1. Upgrade when you can afford the thing that you want. The future will always hold better and more powerful components; the best you can do is pay attention to product release schedules and hope you don't invest at a bad time. I didn't do a very good job of assessing the value proposition of replacing my high-performing GTX 680 with a GTX 780, and as a result I paid much more for it than I would have had to shortly after, and on top of that didn't get any AMAZINGLY GOOD free game bundles (I did get 3DMark so I can show off how stupid I am, though, rock and roll :argh:). My advice on timing at this particular moment can be a little more specific, though, because we are an extremely short period of time away from AMD dropping the NDA on their high-end cards, and nVidia will respond accordingly. Wait to make this decision until it can be an informed one, because, for once, you actually have that luxury.

2. If your resolution is higher than 1440p or so, you're probably going to need a multiple-card setup, especially if you're using the better price:performance options rather than the big ol' heavy hitters, cost-is-no-object style. Right now, nVidia's multi-monitor gaming experience is better than AMD's, because AMD got caught flat-footed and are still scrambling to solve their frame pacing issues for even most reasonable (let alone all) use cases. But don't jump on it right now; wait until we learn a bit more. If you do get a multiple-card setup, the extra VRAM on the higher-VRAM models is worth it. Note the people with two 680s running away from them: while the chip can handle all the throughput to make a perfectly nice very-high-res experience, relatively speaking, 2GB of VRAM isn't enough for such high resolutions, and swapping has performance penalties that many find inexcusable given the cost of the setup (see the rough math after this list).

3. As a bit of a qualifier to the above, do you know for a fact that you'll be adding multiple monitors? I mean, I'm an rear end in a top hat who is probably an NDA drop and a price adjustment away from selling a few guitar pedals to fund another goddamned GTX 780, and I'm still on 1080p (so I'll end up buying a bigger monitor, which has, to be fair, been on the to-do list for a while now, but still, watch the tail wag the dog). Be realistic and don't overspend, or you'll turn out like me, and I'm just, god, horrible.
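
Rough math behind the VRAM point in #2, assuming 32-bit color and a handful of render targets (my assumptions, not anyone's measurements): the framebuffers themselves stay small even in surround, which is why it's the high-res textures and AA buffers that actually push a 2GB card over the edge.

    # Rough render-target memory at a few resolutions (assumes 4 bytes/pixel).
    def buffers_mb(width, height, num_targets=4, bytes_per_px=4):
        return width * height * bytes_per_px * num_targets / 1e6

    print(buffers_mb(1920, 1080))  # ~33 MB  (1080p)
    print(buffers_mb(2560, 1440))  # ~59 MB  (1440p)
    print(buffers_mb(5760, 1080))  # ~100 MB (triple-1080p surround)
    # Even surround framebuffers are only ~100 MB; high-res texture sets and
    # MSAA buffers scale much harder, and that's what blows past 2GB.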

  • Reply