Animal
Apr 8, 2003

That would be the first 1440p TN panel that I have read about.


HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

deimos posted:

It was made by AMD and "ratified" by the HT Consortium of companies, including, but not limited to...

Wait for it...



Sorry, I wasn't saying HyperTransport was a vs NVIDIA thing, but more that Intel went its own way in creating QPI when HyperTransport would have done the job.

Nothing particularly fascinating, although if Intel had adopted it as well, maybe we would have seen HTX slots in actual use.

EoRaptor
Sep 13, 2003

by Fluffdaddy

HalloKitty posted:

Sorry, I wasn't saying HyperTransport was a vs NVIDIA thing, but more that Intel went its own way in creating QPI when HyperTransport would have done the job.

Nothing particularly fascinating, although if Intel had adopted it as well, maybe we would have seen HTX slots in actual use.

A bit of a derail, but QPI is extremely well modeled as a transistor set and an electrical interface within Intel. They can easily place it alongside/inside practically any existing silicon and know 100% how it will behave. They are also free to tinker with it without worrying about compatibility, which is another plus. Also, NIH syndrome.

forbidden dialectics
Jul 26, 2005





Sidesaddle Cavalry posted:

(Emphasis mine:)


We apparently live in a world where $800 monitors are regarded as being in that "price:performance" sweet spot. :smithicide: Still, at least we're headed in the right direction. Let's leave it up to reviews to see how passably bearable the image will be.

Seriously, you can get almost 3 of those Korean PLS 27" 1440p screens for that, most of which can hit at least 96Hz.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
New nVIDIA WHQL driver is out. Not much new in the extras aside from the new HD Audio driver version.

Incredulous Dylan
Oct 22, 2004

Fun Shoe
Just installed my new 780 ti to see that new drivers are finally out! Get hype.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

nVidia responds to the so-called FreeSync demo, basically saying "yeah, we're all for that, but it's going to require new monitors anyway and we're not opening up G-Sync for AMD in the meantime." Notably, they agree that it's already supported in the standard (imo it was disingenuous of the AMD rep to claim that nVidia somehow just wouldn't know that), and they feel that eventually that will be the route to go, but they want to "nudge the industry" by releasing an actual product.

Which seems to have worked so far; we'll see how well as things progress.

There's also some slapfight stuff about how AMD's FreeSync, as described, would only work on a handful of laptops at the moment because of their unique panel interface, and how there aren't any monitors with a similar interface; otherwise there wouldn't be a need to replace the scaler ASIC in the first place.

I'm calling FUD on AMD's part: hit after the announcement but just before the tech goes to market, and deny sales on the grounds of "hey, I don't even need to buy that, it'll be possible with any AMD card, and he even said he doesn't think nVidia supports the free one!" nVidia isn't exactly without sin here either; it'd help adoption if they'd at least just license the poo poo. But there are probably enough root differences between the basic "discussion" that the card and the G-Sync FPGA are going to be having and how AMD's GCN 1.1 works that it probably wouldn't be as simple as all that anyway.

Shaocaholica
Oct 29, 2002

Fig. 5E
I'm kinda late to the game on this, but what does Mantle mean for the XBO and PS4? Could they support it, or is the hardware incompatible? What would Mantle bring, if anything, over the current developer APIs? Would it be politically impossible for MS to support, even on a future console?

beejay
Apr 7, 2002

I don't see anywhere that the AMD guy was claiming nvidia didn't know anything. The TechReport story says "His new theory is that the display controller in Nvidia's current GPUs simply can't support variable refresh intervals, hence the need for an external G-Sync unit."

It will be interesting to see how it all shakes out over the next couple of years.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Agreed posted:

nVidia responds to the so-called FreeSync demo, basically saying "yeah, we're all for that, but it's going to require new monitors anyway and we're not opening up G-Sync for AMD in the meantime." Notably, they agree that it's already supported in the standard (imo it was disingenuous of the AMD rep to claim that nVidia somehow just wouldn't know that), and they feel that eventually that will be the route to go, but they want to "nudge the industry" by releasing an actual product.

Which seems to have worked so far; we'll see how well as things progress.

All my eyes are on this, really. If monitor makers can see two different GPU designers competing on this, they'll have more of a reason to accelerate their own development to meet the needs of that competition. I'd like an affordable video card-controlled high-refresh rate monitor with a non-TN panel before I grow gray hairs, drat it :argh:

Generic Monk
Oct 31, 2011

Shaocaholica posted:

I'm kinda late to the game on this, but what does Mantle mean for the XBO and PS4? Could they support it, or is the hardware incompatible? What would Mantle bring, if anything, over the current developer APIs? Would it be politically impossible for MS to support, even on a future console?

Mantle is a low-level API so 'close to the metal' optimizations can be applied to GCN graphics cards in PCs. Console development has always been 'close to the metal' - think of how they managed to wring out performance that could play GTAV at 30FPS on graphics hardware roughly equal to a midrange consumer graphics card from 2005. I don't really think there's any incentive for it to be used in console development; the whole point of the project is to get console-like optimization out of PC hardware.
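
To illustrate roughly what "close to the metal" buys you, here's a toy sketch (completely made-up names, nothing to do with the real Mantle or D3D entry points): the driver stops revalidating state on every draw call, and the app records a command list once and just resubmits it.

code:

# Purely illustrative sketch with hypothetical names, not any real graphics API.
def validate_state(state):
    assert isinstance(state, dict)   # stand-in for driver-side checks

def translate(state):
    return [("set", key, value) for key, value in state.items()]

def submit(cmds):
    print(cmds)                      # stand-in for handing work to the GPU queue

def high_level_draw(mesh, state):
    validate_state(state)            # re-validated on every single draw call
    submit(translate(state) + [("draw", mesh)])

def build_command_buffer(meshes, state):
    validate_state(state)            # validation/translation happens once, up front
    return translate(state) + [("draw", m) for m in meshes]

# The prebuilt buffer is just resubmitted each frame with near-zero overhead.
cmd_buf = build_command_buffer(["tree", "rock"], {"blend": "off"})
for _ in range(2):
    submit(cmd_buf)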

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Sidesaddle Cavalry posted:

All my eyes are on this, really. If monitor makers can see two different GPU designers competing on this, they'll have more of a reason to accelerate their own development to meet the needs of that competition. I'd like an affordable video card-controlled high-refresh rate monitor with a non-TN panel before I grow gray hairs, drat it :argh:

As I understand it, the progression goes something like this:

1. eDP has been a thing in laptops for years, because laptops are much more highly integrated and don't have to worry about separate peripherals. The eDP spec allows for variable refresh, so nVidia and AMD support it in silicon (what I don't know for sure is whether the AMD exec's hypothesis that nVidia can't do hardware-controlled variable framerates at the source on non-mobile platforms is correct)

2. nVidia, on the heels of a lot of work with frame pacing generally and their display pipe, from adaptive vsync on, develops the first FPGA to replace the scaler ASICs in monitors.

3. Both companies acknowledge that regular old DP 1.3 (a step up from 1.2!) will support a newly standardized method, basically taken from eDP, to do variable refresh - but, and here's where the nVidia rep says that they're not letting others get in for free - it's going to require a new ASIC in place of the scaler for any monitor that has a variable refresh rate that does basically what the nVidia FPGA does.

4. Now we wait for competition to do what competition does (assuming AMD can afford to compete and intends to do so, as opposed to pointing out a possibility in the DP 1.3 standard and hoping monitor makers will just make an ASIC that talks to their hardware the same way that eDP did, though that'd be proprietary in the other direction and trickle-down, funnily enough)... and see where it all goes.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Agreed posted:

3. Both companies acknowledge that regular old DP 1.3 (a step up from 1.2!) will support a newly standardized method, basically taken from eDP, to do variable refresh - but, and here's where the nVidia rep says that they're not letting others get in for free - it's going to require a new ASIC in place of the scaler for any monitor that has a variable refresh rate that does basically what the nVidia FPGA does.

Not quite, according to the TR thing you linked. Nvidia says that it's already a capability of DisplayPort (which it is, provided it's implemented on the scaler ASIC), not that it'd be a new one in DP 1.3. AMD wants VRR canonized by VESA so new monitor scaler ASICs must support VRR. Nvidia says "We don't need a new standard, it's already there. Just has to be built in."

Of course, Nvidia is the firm selling the scaler with it built in.

quote:

4. Now we wait for competition to do what competition does (assuming AMD can afford to compete and intends to do so, as opposed to pointing out a possibility in the DP 1.3 standard and hoping monitor makers will just make an ASIC that talks to their hardware the same way that eDP did, though that'd be proprietary in the other direction and trickle-down, funnily enough)... and see where it all goes.

I still think that what could really make the difference here is the ability to do sub-30 FPS elegantly and eliminate framerate judder from home theater stuff. Unfortunately, HDMI is not DisplayPort, but there has to be room for high-end home theater that uses DisplayPort.

Silly hack idea: at sub-30 FPS framerates, double the refresh rate and show each frame twice.
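
Something like this, as a purely illustrative sketch (assumes a panel that can't refresh slower than 30 Hz):

code:

# Hypothetical frame-doubling hack: when the source frame rate drops below the
# panel's assumed minimum refresh rate, repeat each frame so the intervals sent
# to the panel stay within its supported range.
PANEL_MIN_HZ = 30.0

def refresh_intervals(frame_interval_s):
    """Intervals (in seconds) the display holds for one source frame."""
    rate = 1.0 / frame_interval_s
    if rate >= PANEL_MIN_HZ:
        return [frame_interval_s]            # panel can match the frame directly
    repeats = 2
    while rate * repeats < PANEL_MIN_HZ:     # triple/quadruple for very low rates
        repeats += 1
    return [frame_interval_s / repeats] * repeats

print(refresh_intervals(1 / 24))   # 24 fps film -> two 1/48 s refreshes
print(refresh_intervals(1 / 60))   # 60 fps game -> a single 1/60 s refresh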

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Just as an aside while you're both here: Thanks for the help with the 780. It is nice and my new OS is nice.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Factory Factory posted:

Not quite, according to the TR thing you linked. Nvidia says that it's already a capability of DisplayPort (which it is, provided it's implemented on the scaler ASIC), not that it'd be a new one in DP 1.3. AMD wants VRR canonized by VESA so new monitor scaler ASICs must support VRR. Nvidia says "We don't need a new standard, it's already there. Just has to be built in."

Of course, Nvidia is the firm selling the scaler with it built in.

I don't know, it's difficult to parse precisely because AMD's impromptu demo relies on the old eDP standard, which requires the functionality to be built in (that was the bit about not having to worry about peripherals, by the way; I am running on two hours of sleep, so apologies for a lack of clarity). If their method is "Our GPUs can just DO this, haha! So long as the monitor's scaler ASIC has the necessary functionality built in, anyway. Cough." that is not really any different from the buyer's standpoint or the monitor manufacturers' standpoint compared to nVidia's "Our GPUs can do this, so long as the monitor's scaler ASIC is replaced with an FPGA that lets the card do the scaling and does variable framerate voodoo."

One of these things is JUST LIKE THE OTHER, only different angles.

It is a very, very fine point to suggest that AMD is the company that wants this to be broadly standardized in VESA, because, lol, of COURSE they'd love that, free-sync indeed: never have to worry about doing the research to make the ASICs, they can just trust their hardware to talk to the newly VRR-enabled ASICs that are suddenly, magically in everything, ta da.

It certainly won't be that easy. And that's where nVidia has a good point in that, gently caress man, they put a lot of work into this, monitor makers weren't exactly taking steps on their own to make it happen - and this doesn't really count, since it comes quite a while after all the requisite crap to get us to this point.

I agree that improving the home theater experience would be awesome, but I also worry that it's stuck with some variety of HDMI standard 'thru 4K, which will kill any kind of early adoption possibilities. Grr.

Shaocaholica
Oct 29, 2002

Fig. 5E

Generic Monk posted:

Mantle is a low-level API so 'close to the metal' optimizations can be applied to GCN graphics cards in PCs. Console development has always been 'close to the metal' - think of how they managed to wring out performance that could play GTAV at 30FPS on graphics hardware roughly equal to a midrange consumer graphics card from 2005. I don't really think there's any incentive for it to be used in console development; the whole point of the project is to get console-like optimization out of PC hardware.

Are you sure about that? At least on the Xbox, I thought it was MS's initiative to have parity between the Xbox graphics API and DX so developers didn't have to manage two different APIs.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
It also has bare-metal APIs for when you need the performance. That's part of why it has Xbox OS running alongside Windows.

I don't see why you couldn't port Mantle to Xbone or PS4, but I'm hardly an expert in GPU programming. I imagine it'd be higher-level than the native low-level API, though, and so not as fast.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Factory Factory posted:

It also has bare-metal APIs for when you need the performance. That's part of why it has Xbox OS running alongside Windows.

I don't see why you couldn't port Mantle to Xbone or PS4, but I'm hardly an expert in GPU programming. I imagine it'd be higher-level than the native low-level API, though, and so not as fast.

Plus, iirc Microsoft basically said "No." when AMD was pitching Mantle as THE development platform for the XB1. Hard "No." May be misremembering, but it seemed like a big deal at the time.

Say, isn't it January? When is EA going to be done getting sued by their shareholders for how buggy BF4 is so we can see Mantle in action already?

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

Agreed posted:

I don't know, it's difficult to parse precisely because AMD's impromptu demo relies on the old eDP standard, which requires the functionality to be built in (that was the bit about not having to worry about peripherals, by the way; I am running on two hours of sleep, so apologies for a lack of clarity). If their method is "Our GPUs can just DO this, haha! So long as the monitor's scaler ASIC has the necessary functionality built in, anyway. Cough." that is not really any different from the buyer's standpoint or the monitor manufacturers' standpoint compared to nVidia's "Our GPUs can do this, so long as the monitor's scaler ASIC is replaced with an FPGA that lets the card do the scaling and does variable framerate voodoo."

The thing is that most of the work nVidia is doing (framebuffering) is already part of what needs to be done for eDP 1.3's PSR, so why they didn't build on that and push VESA to adopt a standard is beyond me; it seems a bit disingenuous of nVidia to suggest this is a massive change that they need to drive.

Color me a fanboi but there's a sort of zen simplicity in what AMD is proposing.

e: eDP 1.3 not 1.4


Edit2: I mean, it's not like nVidia isn't a part of VESA and doesn't know VRR is becoming part of the 1.3 standard.

Edit3: And while we're at it, there's also precedent for implementing a "draft" standard; see every early Wireless N router.

deimos fucked around with this message at 21:32 on Jan 8, 2014

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

deimos posted:

The thing is that most of the work nVidia is doing (framebuffering) is already part of what needs to be done for eDP 1.3's PSR, so why they didn't build on that and push VESA to adopt a standard is beyond me; it seems a bit disingenuous of nVidia to suggest this is a massive change that they need to drive.

Color me a fanboi but there's a sort of zen simplicity in what AMD is proposing.

e: eDP 1.3 not 1.4


Edit2: I mean, it's not like nVidia isn't a part of VESA and doesn't know VRR is becoming part of the 1.3 standard.

Edit3: And while we're at it, there's also precedent for implementing a "draft" standard; see every early Wireless N router.

FactoryFactory and I had basically the same kind of discussion, but more directly, on Skype a bit earlier - what it boils down to is that I tend to give nVidia credit for pushing the market forward here, while the other side wants to suggest that there's really nothing terribly innovative going on, and anyway it could be done without nVidia's special proprietary Green Team hardware.

I'll expand a bit on the thoughts I discussed with him.

AMD's "Win" scenario looks like this, and is mainly about preventing nVidia from gaining anything from their G-Sync investment.

1. Keep the current GPU market "fight"/conversation about pushing framerates higher (i.e. Mantle) rather than a fundamental shift in display technology - this should make intuitive sense because they don't have a production proven or ready answer to G-Sync, and it is a feature that many have expressed real interest in. AMD obviously wants to compete with that interest, and for them to do so, it's better if the conversation doesn't center around variable refresh rates at this moment in time because they can't sit at that table should it be a big thing.

2. Deny nVidia G-Sync sales if at ALL possible - Again, this is likely just intuitively true and I probably don't have to expand on it, but it's in AMD's best interests if nVidia doesn't gain ground with G-Sync, since nVidia has doubled down on it being very proprietary, and if AMD can stop people from buying G-Sync monitors through any means possible, that prevents them from giving ground on cards since obviously you can't plug an AMD card up to a G-Sync FPGA equipped monitor.

3. Genuinely but ~selfishly push for variable refresh rate in the standard itself - If they can get the standard adjusted so that all or even just enough (say, gamer oriented) monitors come from the factory with an ASIC that'll talk to their hardware, that's just super for AMD. It's also not awesome for nVidia, since nVidia went to the trouble of inventing G-Sync and bringing it to production.


From nVidia's side, the whole thing just boils down to "make G-Sync work as a product," with a light side of enlightened self-interest (which is a nicer word for selfishness :sigh:) in terms of genuinely wanting to move the market forward. I think it's a bit :tinfoil: to suggest that because they're part of the VESA standards organization, there's some kind of conspiratorial insider trading-like thing going on with them rushing out to make their mark on the market before this benevolent free technology from AMD hits - since AMD is trying to move the work of replacing the scaler onto the display manufacturers by getting it standardized, it's not like AMD is doing display makers any favors, whereas nVidia arguably is by taking care of the necessary replacement for the scaler ASIC for them. nVidia has a case when they say they don't feel like making that free for all.

Of course, long-term, we don't want to be bogged down in a proprietary scheme, but I do think nVidia has played a significant role worthy of recognition in bringing the tech practically to market and actually doing the replacement-for-the-scaler thing that will have to happen with the new standard that AMD is proposing, too; AMD is just doing their best to cut into nVidia's market in the limited ways they are able without being able to directly address the product with a work-alike of their own, and nVidia is trying to get out at least what they put in, in terms of R&D and practical expenses, while also being quite adamant about taking credit for variable refresh rates coming to market now instead of in a possible future where it's standardized and the card makers don't have to do anything but (continue to) support it in silicon.

Agreed fucked around with this message at 22:24 on Jan 8, 2014

Stanley Pain
Jun 16, 2001

by Fluffdaddy
So basically the way business is done every day, replacing AMD and NV with company X and Y.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Stanley Pain posted:

So basically the way business is done every day, replacing AMD and NV with company X and Y.

Um... sure. In the sense that there are two competitors and they both want the other to fail so that they can succeed, or possibly force one competitor into a semi-Schumpeterian monopoly if the other becomes totally disarmed competitively, as Intel arguably did to AMD in the higher-performance processor arena. But for display technology.

Honestly, though, it's never as simple as just "the way business is done every day," and oversimplifying can lead to mistaken intuition regarding the nature of the competition going on.

Shaocaholica
Oct 29, 2002

Fig. 5E
Another thing this opens the gates for in the consumer realm, besides consoles also being able to do this, is oddball frame rates like the Hobbit's 48fps HFR. Instead of requesting and publishing a spec for 48Hz over HDMI, you could just allow variable sync, and content authors can use whatever fps/cadence they want, including mixed-fps material. Your digital streams would just have a bunch of frames, and each frame would have a delta-t attached to it, plus possibly some other metadata for audio sync.

Of course this is thinking waaaaaaay out
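
As a toy sketch of the idea (a made-up container format, just to show the delta-t-per-frame part):

code:

# Hypothetical variable-cadence stream: each frame carries its own display
# duration (delta_t), so 24, 48, or mixed frame rates all use the same format.
from dataclasses import dataclass

@dataclass
class Frame:
    pixels: bytes    # encoded image data (placeholder)
    delta_t: float   # seconds this frame stays on screen
    audio_pts: float # hypothetical audio-sync timestamp

def play(stream, present=lambda f: print(f"show frame for {f.delta_t:.4f}s")):
    clock = 0.0
    for frame in stream:
        present(frame)
        clock += frame.delta_t   # the next frame is simply due delta_t later

# 48 fps HFR content is just frames with delta_t = 1/48; a mixed-cadence cut
# could interleave 1/24 and 1/48 frames in the same stream.
play([Frame(b"", 1/48, 0.0), Frame(b"", 1/48, 1/48), Frame(b"", 1/24, 2/48)])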

Magic Underwear
May 14, 2003


Young Orc

Agreed posted:

Um... sure. In the sense that there are two competitors and they both want the other to fail so that they can succeed, or possibly force one competitor into a semi-Schumpeterian monopoly if the other becomes totally disarmed competitively, as Intel arguably did to AMD in the higher-performance processor arena. But for display technology.

Honestly, though, it's never as simple as just "the way business is done every day," and oversimplifying can lead to mistaken intuition regarding the nature of the competition going on.

You are seriously way too invested in this. The sheer number of words you have written about this, not to mention how dramatic you're being, is way out of proportion with the subject matter: a niche gaming graphics feature for high-end enthusiasts.

Shaocaholica
Oct 29, 2002

Fig. 5E

Magic Underwear posted:

...a niche gaming graphics feature for high-end enthusiasts...

IMO this will eventually be and always should be a standard graphics feature across the board. Not just for enthusiasts.

beejay
Apr 7, 2002

I also think it will become standard eventually. It will be interesting to see, for sure. Will monitor makers make the changes to their own hardware and firmware for FreeSync, or will they adopt the nVidia module? That will decide things, I think. If AMD "loses" and G-Sync becomes widespread, then it could be bad news for them. If nVidia "loses", then it sounds like they would be able to use FreeSync anyway.

I wouldn't hitch my wagon to either star yet. It's going to be a while before things shake out. It seems like a bad idea for us as consumers to try to put either one down as all of this should end up with us having a nice new feature and the death of tearing.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Magic Underwear posted:

You are seriously way too invested in this. The sheer number of words you have written about this, not to mention how dramatic you're being, is way out of proportion with the subject matter: a niche gaming graphics feature for high-end enthusiasts.

I was kinda alluding to this.


Don't get me wrong, I'm VERY glad NV has started to push this issue so that LCD manufacturers get off their asses and start pushing out some good gaming hardware and not some poo poo 2fast2furious TN panels.


Agreed posted:

Honestly, though, it's never as simple as just "the way business is done every day," and oversimplifying can lead to mistaken intuition regarding the nature of the competition going on.

Unfortunately, you're wrong. Business in just about every single sector works like this. A few will lead in innovation and the rest follow suit, or we see knee-jerk reactions happening. What we're seeing is AMD's knee-jerk reaction and a bit of verbal jousting levied towards Nvidia.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Magic Underwear posted:

You are seriously way too invested in this. The sheer number of words you have written about this, not to mention how dramatic you're being, is way out of proportion with the subject matter: a niche gaming graphics feature for high-end enthusiasts.

Who the hell are you to tell me what it's okay to care about and how much of a poo poo it's okay to give? Wow, man, this is a seriously hosed up post on your part. Thanks for the completely needless put-down. I have no idea what your problem is, but hopefully this helps you get over it.

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Agreed posted:

Who the hell are you to tell me what it's okay to care about and how much of a poo poo it's okay to give? Wow, man, this is a seriously hosed up post on your part. Thanks for the completely needless put-down. I have no idea what your problem is, but hopefully this helps you get over it.

It's kinda been his thing for some reason. Though with that post I've just taken to ignoring him.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
It's like, not cool to care about stuff, man.

If you are a teenager in the 1990s.

Magic Underwear
May 14, 2003


Young Orc

Agreed posted:

Who the hell are you to tell me what it's okay to care about and how much of a poo poo it's okay to give? Wow, man, this is a seriously hosed up post on your part. Thanks for the completely needless put-down. I have no idea what your problem is, but hopefully this helps you get over it.



That's all I'm saying. From my point of view anyway, you are making way too much of this. I don't mean to be snarky, just...take a step back for a second.

(USER WAS PUT ON PROBATION FOR THIS POST)

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb
making GBS threads on somebody because they are obviously passionate about something is...pretty dumb. You suck Magic Underwear.

Ham Sandwiches
Jul 7, 2000

Magic Underwear posted:



That's all I'm saying. From my point of view anyway, you are making way too much of this. I don't mean to be snarky, just...take a step back for a second.

This is a GPU megathread in SH/SC where people who are very familiar with and very passionate about the subject can exchange information and compare notes on the latest developments in that field.

Syncing monitor refresh to frames is a pretty big improvement in terms of the experience it provides for the user. As you can imagine, the people who are familiar with that field and are excited about developments in that field will chat about it.

The expertise that some of the posters provide in this thread is absolutely badass, and having those same folk hang around and spitball and speculate about stuff is pretty cool too. I enjoy reading their opinions on the general state of the graphics industry, from the business / adoption angle as well as the technical angle.

I think it's totally bizarre that someone would come in and start calling out the regular contributors to a very specialized thread for their postcount and declare that the people providing info care too much.

Double Punctuation
Dec 30, 2009

Ships were made for sinking;
Whiskey made for drinking;
If we were made of cellophane
We'd all get stinking drunk much faster!
I still haven't seen any IPS monitors with VRR in any form. Until that happens, this is all just a bunch of worthless FUD.

Animal
Apr 8, 2003

Magic Underwear posted:



That's all I'm saying. From my point of view anyway, you are making way too much of this. I don't mean to be snarky, just...take a step back for a second.

What a petty little prick.

-edit-
Why don't you go into a sports thread, or an automobile thread, or a movie thread and try the same thing? What makes you think you are above someone else just because their hobby does not appeal to you?

Animal fucked around with this message at 04:13 on Jan 9, 2014

beejay
Apr 7, 2002

dpbjinc posted:

I still haven't seen any IPS monitors with VRR in any form. Until that happens, this is all just a bunch of worthless FUD.

What. Like you are calling G-Sync and FreeSync FUD? I am not sure you're using that right.

Edit: Also, AMD introducing FreeSync isn't FUD either. They demoed it; it exists. Announcing a competing thing is not FUD. Like when the original Xbox was announced, that was not FUD.

beejay fucked around with this message at 04:26 on Jan 9, 2014

Unormal
Nov 16, 2004

Mod sass? This evening?! But the cakes aren't ready! THE CAKES!
Fun Shoe

Magic Underwear posted:



That's all I'm saying. From my point of view anyway, you are making way too much of this. I don't mean to be snarky, just...take a step back for a second.

:gb2gbs:

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
The derail has been handled, please return to your regularly scheduled posting about GPUs and GPU technology!

KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:


I don't suppose we have any real information, BUT do we have any indication of the next generation of cards coming from AMD/NVIDIA in regards to pixel-pushing power? With all this talk of 4K monitors and the ~super gaming elite~ TN panel garbage manufacturers insist on shoveling out, and having recently put together a setup capable of surround gaming, I'm intensely curious. Finding out that NOTHING on the market can really drive a game at 7680x1440 (and that cards struggle even with just 2560x1440) is just kinda sad. With 4K monitors, VRAM limits actually start to count for something.


KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:
EDIT:

Whoops double post
