Boten Anna
Feb 22, 2010

I got a GTX 670 on launch day, which is complete overkill with my current monitor setup (a 1920x1080 primary and a secondary 1600x900 one), but it's quite wonderful so far. Especially since I don't even have any of the high-end games to really test it with, but god drat do the ones I have run well, and I'm getting a feeling of "woah, this is how these games were intended to run this whole time!" overall. 3DMark 11 runs great though! :v:

I bought it mainly because I really like good, smooth performance and would rather kill an ant with an elephant gun than try to fiddle with everything to get into the right "playable" range. I also would rather just plunk down extra money than get something last-gen; I figure I'll get more value out of a $400 launch-day card than out of a cheaper last-gen one that's already headed to the graveyard, with fancy spoilers attached and pinstripes painted on (you know, the things that make it go faster :iiaca:) to push it harder. I may just be ridiculous, though, as at the end of the day I'm probably just trying to justify futureproofing. :v: Really, I just had the chance to get this and Ivy Bridge on launch day and wanted something up to the minute for once :shobon:

That said, I am probably going to get a Korean monitor soon so that I can actually make full use of this thing. And here's hoping this isn't like the ill-fated 560 Ti that bluescreened while loading graphics drivers on two different motherboards, until I just said "fuckit," sent it back, and ended up replacing it with something more expensive!

And god drat, movax, I spent like an hour reading the OP and I didn't even actually physically read all of it. So much information, and so comprehensive. Good work, and thank you!

Boten Anna
Feb 22, 2010

Klyith posted:

One feature you missed: support for 24/32-bit color in 3D. The main competition, the Voodoo2, could only render at 16-bit. At the time I had a completely stacked gaming computer (bought for me by my grandmother for "important college work" :raise:), and I had both of them. In the beginning I was a total quake-head and used the Voodoo2 all the time for Glide... But then a little game called Homeworld came out and showed me that things like color, art, and immersion really mattered in games. I think that game was pretty much the first one that was designed for -- with its fantastic brightly colored gradient backgrounds and layered transparent effects -- full 32-bit color, and only fully enjoyable that way.

(Also does anyone else remember the Riva128 vs Voodoo2 flamewars? So epic. So dumb.)

Since this was the 90s, by "both of them" do you mean you would actually open up your computer and swap them out? Or do you mean they both just sat in their own AGP/PCI slots and you just plugged in whichever one you wanted to use at the time?

Boten Anna
Feb 22, 2010

Woah, that passthrough stuff is amazing, the wonders of the 90s that I didn't get to experience :monocle:

I remember my first graphics card was a Diamond Viper, and it was totally badass and stuff. I also remember playing with UltraHLE and having to use Glide wrappers to get it to work, without entirely knowing what they were. Then having my mom excited to hear I got the new Zelda working (the only video game she ever liked) and having to leave within a minute because it was making her nauseous, haha.

Boten Anna
Feb 22, 2010

I kind of wish this were here now. :shobon: Not to use an nvidia service, but if there were a way my fiancée could somehow play Diablo 3 from my computer at the same time I am, using her own computer for input over the LAN, that'd be rad as h*ck. I've got the horsepower to spare, dammit! This so would have saved me several hundred dollars in building her a D3-capable machine.

Boten Anna
Feb 22, 2010

Is there any hope in the nearish future for SLI without all the wacky restrictions? Like, when my 670 is starting to chug, is there any chance I'll be able to smack in another one and keep at it without all this annoying "full screen only" malarkey and poo poo drivers, maybe thanks to things like the 690? Or is it kind of hosed for the foreseeable future?

Boten Anna
Feb 22, 2010

Dogen posted:

It's the way SLI works, I'm afraid.

Does it work basically by having each card draw different elements on the screen and then merging them together, using hardware or even just the raw video signal somehow?

Is it possible that in the future the connection between the two will be more of a... for lack of a better word, logical link that basically just uses additional cores to throw more hardware at the rendering, similar to a multi-core CPU?

I'm probably phrasing this in all kinds of terrible ways what with having an only rudimentary understanding of how any of this works under the hood.
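
(For illustration only, and very much a lay understanding: a minimal sketch of the alternate-frame-rendering idea, which I gather is one common SLI mode where the cards take turns on whole frames. The class and method names below are made up for the sketch, not any real graphics API.)

```python
# Toy model of alternate-frame rendering (AFR): with two cards, GPU 0 draws
# the even frames and GPU 1 draws the odd ones, and the output just shows
# them in order. Everything here is illustrative, not a real graphics API.

class FakeGPU:
    def __init__(self, name):
        self.name = name

    def render(self, frame_number):
        # Stand-in for actually drawing a frame.
        return f"frame {frame_number} rendered on {self.name}"


def alternate_frame_rendering(gpus, total_frames):
    """Round-robin whole frames across the available GPUs."""
    for frame in range(total_frames):
        gpu = gpus[frame % len(gpus)]
        yield gpu.render(frame)


if __name__ == "__main__":
    for line in alternate_frame_rendering([FakeGPU("GPU0"), FakeGPU("GPU1")], 6):
        print(line)
```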

Boten Anna
Feb 22, 2010

Factory Factory posted:

Crossposting this because it's relevant to some of this thread's interests:

What do they mean exactly by "Pentium 1 cores"? Just that the core lacks all the fancy extensions (MMX, etc.), or is it literally a Pentium 1 just slapped onto a 22nm process so it's now faster? Kinda both?

Boten Anna
Feb 22, 2010

Tessellation is going to be the death of me. I thought I'd be happy forever with a 670 even with my gigantic Korean monitor, but apparently breaking a surface up into about a billion triangles to make things look goddamned amazing is what we're going to be doing going forward, and in some things it does push the 670 at least to where it might dip below 60 FPS :v:
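
(Back-of-the-envelope only, and assuming simple uniform tessellation where the triangle count per patch grows roughly with the square of the tessellation factor; the numbers below are just to show why cranking it up hurts.)

```python
# Rough arithmetic for why tessellation gets expensive fast. Assumes uniform
# tessellation where each patch splits into roughly factor^2 triangles;
# an approximation for illustration, not the exact behavior of any API.

def approx_triangles(patches, tess_factor):
    return patches * tess_factor ** 2

for factor in (1, 4, 16, 64):
    print(f"factor {factor:>2}: ~{approx_triangles(10_000, factor):,} triangles")

# With 10,000 patches, factor 64 is already ~41 million triangles per frame.
```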

Boten Anna
Feb 22, 2010

I was loving around with livestreaming and found that it works Pretty Alright with my 3770K and GTX 670 with the game running at Korean IPS resolution, but I do take a noticeable framerate hit streaming in HD. Is there any way to use Lucid or something to offload some of the processing, or do something otherwise cheap/free? Other than turning off tessellation or downgrading graphics settings? :)

I should try the beta drivers and see if it helps as well, though the game in question is The Secret World which I'm not sure has any particular optimizations yet.
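
(Purely a sketch of the kind of thing I mean, not something I've verified on this setup: if the encode could be handed off to the iGPU's QuickSync engine instead of the CPU, it would look roughly like using ffmpeg's h264_qsv encoder on a desktop capture. The bitrate and output are placeholders, and it assumes an ffmpeg build with gdigrab and h264_qsv compiled in.)

```python
# Hedged sketch: capture the Windows desktop and let Intel QuickSync do the
# H.264 encode via ffmpeg's h264_qsv encoder, so the CPU running the game is
# doing less of the streaming work. Flags and paths are illustrative only.

import subprocess

cmd = [
    "ffmpeg",
    "-f", "gdigrab",        # Windows desktop/screen capture input
    "-framerate", "30",
    "-i", "desktop",
    "-c:v", "h264_qsv",     # hardware encode on the iGPU's QuickSync engine
    "-b:v", "3500k",
    "stream_test.mp4",      # placeholder; a real stream would push to an RTMP URL instead
]

subprocess.run(cmd, check=True)
```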

Boten Anna
Feb 22, 2010

Factory Factory posted:

Ironically, Intel WiDi already does exactly what you want - it encodes the frame buffer with the QuickSync engine for streaming. It just only does so over WiFi to a dedicated receiver and so is completely useless for your purposes.

Yeah I actually like a lot of the GPU streaming stuff they're working on in concept but in execution I can tell it's going to be obnoxiously limited and not do anything I want :argh:

Like there will likely be no way to have an essentially dumb terminal on a LAN that uses the monster GPU on a host to play a game, for instance. That is not a thing that will happen, because of the bottom line. :(

Boten Anna
Feb 22, 2010

Dogen posted:

Yeah you're right, it just works on some MMORPG that came out today :(

...and that game is The Secret World, and it owns :colbert:

Boten Anna
Feb 22, 2010

Animal posted:

Or if you are able to snag it at all, considering past shortages.

I suggest camping the EVGA site on launch morning. This is how I got a 670 on launch day with absolutely no issues, though goon warning: you'll have to wake up before noon. If no specifics are announced ahead of time, just set some parameters for yourself: buy if it's less than $X, set a spending cap so you only grab the TURBO EDITION if it's less than $X, and be ready to mash the "add to cart" button as soon as it pops up in the store.

Boten Anna fucked around with this message at 23:19 on Aug 8, 2012

Boten Anna
Feb 22, 2010

Even though I'm a huge nvidia fan, it sucks that there are really only two competitors, and I'd hate to see AMD get gutted or fold. As such, I also hope this generation isn't going to sink them.

Boten Anna
Feb 22, 2010

The 600 series is pretty dang new and you'll probably get a good generation of life out of them. I don't see anything coming on the horizon that my 670 won't be able to play at max settings, and it may be a good long time before it can't do things at moderate settings.

You'll get a solid 3 years out of the card at least, but very possibly more.

Boten Anna
Feb 22, 2010

I have a GTX 670 and don't want for anything, but I "only" have one 2560x1440 display to drive. Like seriously, the only wall I've hit is streaming The Secret World at that resolution with the graphics on high, and even then it just makes the game run at a playable but noticeably lower framerate. I just don't know what one would need beyond that unless they wanted to do one of those surround monitor setups, but loving christ on a stick, I can't even look at my entire 2560 monitor at once.

Boten Anna
Feb 22, 2010

For shits and giggles I tried it with a copy of Black Swan in VLC, while Final Fantasy XIV was still running (a hog of a game even though I'm just standing in my small inn room). I have a 3770K and a GTX 670.

At 16 simultaneous videos everything started artifacting heavily, though my computer stayed usable; 12 was choppy, but I've seen people think worse is acceptable.

I closed FFXIV and tried again, and it's almost watchable, but there's still artifacting, which I think is a disk I/O issue (256GB Crucial M4 SSD notwithstanding) since it doesn't start until I pick random seek points. It seems like I could do 13 videos: still kind of choppy, but not as bad as 12 with XIV open.

I think you'd need two of my computers to run a 16-screen video wall well, but it'd probably be cheaper to use four lower-specced ones.
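
(For anyone who wants to repeat this very scientific test, a rough sketch of spawning a pile of VLC playback windows at once; the file path and instance count are placeholders, and depending on your VLC preferences you may need to allow multiple instances.)

```python
# Hedged sketch: launch N simultaneous VLC players of the same file to see
# how many a machine can keep smooth. Path and count are placeholders.

import subprocess

VIDEO = r"C:\video\test_movie.mkv"   # placeholder path to any local video
INSTANCES = 13

procs = [subprocess.Popen(["vlc", VIDEO]) for _ in range(INSTANCES)]

input("Press Enter to close all the players...")
for p in procs:
    p.terminate()
```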

Boten Anna fucked around with this message at 09:39 on Sep 4, 2012

Boten Anna
Feb 22, 2010

Avocados posted:

My ATi Radeon HD 5870 is on its way towards death. Home remedies can only keep this thing ticking for so long, and I really think we're at the end of the line now. While it was a beautifully fast and strong card for me, I had no end of nightmares with the drivers. I'm a little scared of choosing ATi again because it's been my only experience with their brand, and my NVIDIA cards were generally problem-free.

What Nvidia cards are out right now that are comparable to the 5870 in terms of performance? I'd rather "replace" the card instead of upgrade, as funds are a little low.

The reasons you've mentioned are exactly why I've been loyal to nVidia for some time now. I'm really enjoying my GTX 670 but that's probably a bit out of your price range at the moment.

If you can hang on and save up, you'll probably be happy with a 660 Ti or greater for quite some time.

Boten Anna
Feb 22, 2010


someone on that link's comments posted:

Never say never. The reality is that if the rumors of little real progress on the home console front are true, then PCs might already far outstrip consoles this year and last.

If that is the case (and traditionally consoles enjoy a six-month to one-year advantage over PCs at launch), then how long do you think it will take Intel, with its constant updates to its GPUs, to catch up to "acceptable" levels of performance? And how long before there are advantages to using that integrated GPU over a discrete GPU? Directly shared memory space, CPU-to-GPU direct access, etc.

Is this person right or kind of full of poo poo?

With stuff like Lucid starting to exist, might hybrid iGPU/dedicated GPU setups become something of the norm, as a way to take advantage of the aforementioned direct access and shared memory while still having something to throw 200W at for a bunch of number crunching? Or does this idea have fundamental flaws?

Boten Anna
Feb 22, 2010

Dang, thanks for the explanation Factory! To tl;dr your tl;dr, I'm getting that that guy is a little bit full of poo poo, in that that kind of direct memory access doesn't really exist in current iGPUs, though a fundamental architecture shift might be needed in a future generation to get to the next level of performance.

Boten Anna
Feb 22, 2010

Jesus christ, every time I mention that I go nvidia because they have good drivers with few serious problems, and that every ATI card I've had has had obnoxious showstopping driver issues at some point, I get "Gosh, this isn't 2003 anymore, Anna!" and then something like the above happens.

:colbert:

My 670 is still rockin', and the only issue I've had is the old version of FFXIV crashing every few hours, simply because they never bothered to QA it with the 600 series while they were busy making an entirely new engine, the alpha version of which runs perfectly at max settings.

Just... ATI's drivers are bad. I hope they some day stop being bad but I haven't seen that happen yet.

Boten Anna
Feb 22, 2010

Rawrbomb posted:

For anecdotal evidence the other way, I had 6 NVidia cards blow out in 2007-9 between me (2) and my friends (4). I also had non-stop crashing issues. I moved to ATi/AMD and never looked back.

edit: I didn't a word.

Yeah see, and I'd take this over software issues any day. If there are software issues, it costs me time to gently caress with it to find a workaround or fix. If it's hardware, I can return it, RMA it, or use it as an excuse to upgrade. This is further mitigated by buying from manufacturers with better reputations and warranties.

I can see where others might feel differently, but my experience with nvidia cards has been great overall. The one problem I ever had was a DOA card I just sent back to Amazon and used the refund toward a better card.

Boten Anna
Feb 22, 2010

Even as much as I personally go for nvidia, I certainly don't want to see them have a monopoly. In fact, I hope Intel's work on iGPUs lights fires under asses to keep things innovative and competitive.

Boten Anna
Feb 22, 2010

One thing I found interesting about the HFR Hobbit is that it seemed to make it more obvious when they sped up the raw footage. The beginning of the movie went back and forth between sped-up and actual-speed footage fairly often, and I suspect that some people conflated the awkwardness of sped-up footage with HFR.

Boten Anna
Feb 22, 2010

hobbesmaster posted:

If thats the case then that is what the real problem is.

As another anecdote, a lot of headaches in 3D movies are caused by improper viewports and the impossible geometry that results. More recent 3D films don't have these issues as much, but early 3D films had them everywhere.

I saw it in HFR 3D and it was quite nice, though I agree it probably won't solve other 3D problems.

The sped-up footage made sense in the battle sequences and such, because, well, the movie is already 3 goddamned hours long and it can't really be that easy to maneuver in all the ridiculous armor and makeup they had on. I think they kind of missed by doing it at the beginning when Frodo was just, like, reading books and stuff; it was making me wonder if something was wrong with the projector.

To keep this GPU related I wasted a bunch of time at work today trying to get Aero to work again after it mysteriously stopped working on my--wait for it!--ATI video card, and I never did get it to work. I told y'all I hate those things. :colbert:

Boten Anna
Feb 22, 2010

I wonder if nVidia/ATI are scared of this, as they kind of should be. At this rate there won't really be a compelling reason to buy a graphics card for all but the most extreme high-performance uses in a couple of generations, leaving GPUs to go the way of the sound card.

Boten Anna
Feb 22, 2010

I'm just surprised that, 11 versions of DirectX in, it sounds like there's still a lot of very low-level tweaking and custom coding that needs to be done to get decent performance out of video cards, even down to variants of the same model, or sometimes even to get them to work at all. I would think by now there'd be more abstraction and standardization, but it seems like change is still happening at too rapid a pace to allow for that.

I don't even know if it's a problem that you can throw money at to make go away. It sounds like it needs some very special, dedicated, incredibly knowledgeable people, who may or may not even exist, to get the drivers right, and quickly.

Boten Anna
Feb 22, 2010

Out of curiosity, what's the tl;dr on the GTX 700 series of graphics cards versus the 600? I assume it's generally not worth full sticker price to upgrade from the previous generation, but is anything interesting happening?
