An Unoriginal Name
Jul 11, 2011

My favorite touhou is my beloved Nitori.
:swoon:
Hardly an issue unless we start seeing games with no texture compression, or you want to start gaming at huge resolutions.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

spasticColon posted:

With newer games eating more VRAM, will the memory bus on my 660Ti start to become a problem in the near future? Right now it seems my 660Ti is borderline overkill at 1080p, but with the new consoles coming with GPUs that have access to 8GB of GDDR5 RAM, I fear that video cards with 2GB of VRAM or less will become obsolete very quickly.

As I've mentioned earlier in the thread, consoles will have shared memory, so having 6-8GB of RAM doesn't necessarily mean they'll use all that up on GPU resources.

It's too early to tell, but I think the real VRAM wasters on PC will be the same as they are now -- large resolutions, MSAA, etc., which you don't often see on consoles.
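To put very rough numbers on that, here's a back-of-the-envelope sketch (my own toy math in Python - it counts a single color + depth render target and ignores textures, geometry, and driver overhead, which usually dominate):

code:

# Rough render-target memory for one color + depth buffer at a given
# MSAA level. Deliberately simplified; real engines allocate several
# such targets on top of all their textures and geometry.
def render_target_mib(width, height, msaa=1, color_bytes=4, depth_bytes=4):
    pixels = width * height * msaa
    return pixels * (color_bytes + depth_bytes) / (1024 ** 2)

for (w, h), msaa in [((1920, 1080), 1), ((1920, 1080), 4),
                     ((2560, 1600), 4), ((2560, 1600), 8)]:
    print(f"{w}x{h} @ {msaa}xMSAA: {render_target_mib(w, h, msaa):6.1f} MiB")

Even this crude estimate puts 2560x1600 with 8xMSAA at around 250 MiB for a single render target, versus about 16 MiB for plain 1080p - and that's before any of the assets consoles and PCs both have to keep resident.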

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

spasticColon posted:

With newer games eating more VRAM, will the memory bus on my 660Ti start to become a problem in the near future? Right now it seems my 660Ti is borderline overkill at 1080p, but with the new consoles coming with GPUs that have access to 8GB of GDDR5 RAM, I fear that video cards with 2GB of VRAM or less will become obsolete very quickly.
On the 660 Ti I wouldn't really consider it an issue, but yes 2GB of VRAM will be the limiting factor on high-end cards before anything else.

coffeetable
Feb 5, 2006

TELL ME AGAIN HOW GREAT BRITAIN WOULD BE IF IT WAS RULED BY THE MERCILESS JACKBOOT OF PRINCE CHARLES

YES I DO TALK TO PLANTS ACTUALLY

spasticColon posted:

With newer games eating more VRAM, will the memory bus on my 660Ti start to become a problem in the near future? Right now it seems my 660Ti is borderline overkill at 1080p, but with the new consoles coming with GPUs that have access to 8GB of GDDR5 RAM, I fear that video cards with 2GB of VRAM or less will become obsolete very quickly.

Well, video cards have almost always become obsolete quickly, and it's only the ridiculously long life cycle of the PS3 & 360 that's given PC gamers a break the past few years. Unified memory or not, system reqs are likely to make some significant leaps over the next year or two.

On the other hand, economies of scale mean GDDR5 RAM is going to be dirt cheap soon enough, so my guess is that the next gen or so of premium GPUs will be handing out the memory like it's candy.

Nfcknblvbl
Jul 15, 2002

Hopefully PC games will now more commonly be 64-bit, since the consoles will have to be 64-bit as well.

The Illusive Man
Mar 27, 2008

~savior of yoomanity~

Alereon posted:

On the 660 Ti I wouldn't really consider it an issue, but yes 2GB of VRAM will be the limiting factor on high-end cards before anything else.

Before I played Bioshock Infinite, I remember thinking my card's 3GB of memory was silly. Now I'm regularly seeing 2+ GB of usage at 1080p. That said, the game is absolutely gorgeous for a UE3 game.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Space Racist posted:

Before I played Bioshock Infinite, I remember thinking my card's 3GB of memory was silly. Now I'm regularly seeing 2+ GB of usage at 1080p. That said, the game is absolutely gorgeous for a UE3 game.

It is, but I have always felt that there is something subtly 'off' about UE3 games. A strange, cartoony wrongness to the graphics.

Tacier
Jul 22, 2003

Space Racist posted:

Before I played Bioshock Infinite, I remember thinking my card's 3GB of memory was silly. Now I'm regularly seeing 2+ GB of usage at 1080p. That said, the game is absolutely gorgeous for a UE3 game.

Meanwhile I bought a 1gb 7850 thinking it'd be obsolete before I had any real need for 2+gb of video memory. How wrong I was...

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Tacier posted:

Meanwhile I bought a 1gb 7850 thinking it'd be obsolete before I had any real need for 2+gb of video memory. How wrong I was...

I'd be annoyed if I bought a card with 1 gram-bit as well... But it is certainly unique

sorry for being a oval office

testtubebaby
Apr 7, 2008

Where we're going,
we won't need eyes to see.


The Lord Bude posted:

It is, but I have always felt that there is something subtly 'off' about UE3 games. A strange, cartoony wrongness to the graphics.

It fits the theme park-esque vibe I get (and love) from Irrational's worlds, both Columbia and Rapture.

I've also asked this in a few other places, but maybe you guys know best: is it to be expected that my 7850 2GB can only manage a pretty much constant 60fps @ 1080p on High in Bioshock Infinite, and that V.High and Ultra see noticeable, extended performance dips at that resolution?

hayden.
Sep 11, 2007

here's a goat on a pig or something
It turns out that I think I had the iehighutil.exe malware (the bitcoin-mining one) on my previous machine. It was randomly generating huge loads and making the GPU spin up, causing constant driver crashes (about every 60 seconds, like clockwork), and I think it ultimately killed my GTX260. Kind of frustrating how much of a headache this has caused me, not to mention the money. I kind of needed an upgrade anyway, but spending $800 right now wasn't something I wanted to do, and the destroyed resale value of my GTX260 has cost me like $50-100.

So anyway, I guess the lesson is don't install poo poo from untrusted sources and use NoScript on your browsers.

The Illusive Man
Mar 27, 2008

~savior of yoomanity~

zenintrude posted:

It fits the theme park-esque vibe I get (and love) from Irrational's worlds, both Columbia and Rapture.

I've also asked this in a few other places, but maybe you guys know best: is it to be expected that my 7850 2GB can only manage a pretty much constant 60fps @ 1080p on High in Bioshock Infinite, and that V.High and Ultra see noticeable, extended performance dips at that resolution?

Are you playing with Vsync on or off? I get consistently higher frame rates with it off (but also horrible screen tearing); with it on, it's pretty common for the frame rate to dip to around 30 fps in more intense scenes. Anyway, I recently sold my 7850 and upgraded to a 7950, so I don't have a basis of comparison for Bioshock, but your experience sounds about right. The 7850 is a great little card, but it's really going to be pushed to its limit by modern games, and as mentioned, Bioshock is much more demanding than you'd expect from a UE3 game.

Bing the Noize
Dec 21, 2008

by The Finn
edit: whoops, sorry

(USER WAS PUT ON PROBATION FOR THIS POST)

unpronounceable
Apr 4, 2010

You mean we still have another game to go through?!
Fallen Rib

ACID POLICE posted:

Trying to get a replacement GPU to make an older PC (Athlon 64 X2 4600+, currently using onboard graphics) more capable as an HTPC, not trying to spend more than $50. It's only got a 250W PSU and I'd like not to have to upgrade that... it already draws just below 200W as it is, and it's not worth replacing the PSU just to run an older, bigger card like a GTX 260.

Can I do considerably better than the GT 210 for around $50 without replacing the PSU in it?

Alternatively, I would be open to putting my 6450 1GB in this computer and putting a nicer GPU in my tower, but I wouldn't want to spend more than around $100 in that case, and it runs OS X 99% of the time so I'd pick something based on how it performs in OS X.

You want to ask here, as this thread is not for upgrade questions.

Rashomon
Jun 21, 2006

This machine kills fascists
Are Nvidia and AMD really not releasing any new graphics cards (except to fill niches or weird things like a 7990 or whatever) until 2014? I'm going to build a new machine with Haswell, but if it's not worth waiting for a graphics card I may pick up a 670 earlier to play Bioshock Infinite on pretty settings.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
It really does look like it. I think it's because TSMC is slow to move to the next process node.

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010
You're going to be hard-pressed to find anything that crushes the current video card crop until 2015. Outside of BF4 and watch_dogs, I think we've seen the best (Bioshock) we're going to get graphically this year.

e: unless the next CoD game for this year is going to include high-def, wrinkly old dudes.

incoherent fucked around with this message at 06:48 on Apr 4, 2013

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

Rashomon posted:

Are Nvidia and AMD really not releasing any new graphics cards (except to fill niches or weird things like a 7990 or whatever) until 2014? I'm going to build a new machine with Haswell, but if it's not worth waiting for a graphics card I may pick up a 670 earlier to play Bioshock Infinite on pretty settings.
It's not really a big deal when you can just add additional cards in SLI for considerable gain (Nvidia, anyway). This generation's GPU value has been pretty favorable versus modern game tech.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

TheRationalRedditor posted:

It's not really a big deal when you can just add additional cards in SLI for considerable gain (Nvidia, anyway). This generation's GPU value has been pretty favorable versus modern game tech.

But SLI won't help if you're running out of VRAM, as seems to be the case for Bioshock Infinite. With these reports of 2GB filling up at 1080p, I'm almost afraid to try it out on my own PC at 2560x1600.

What I really did, though, was buy XCOM and get a Bioshock Infinite preorder for free. And I'm still not done playing XCOM. :haw:

BeanBandit
Mar 15, 2001

Beanbandit?
Son of a bitch!
The VRAM issue with Bioshock Infinite explains why my frame rate dropped so much going from 1920x1200 to 2560x1600. I almost never see a significant performance drop by just bumping up the resolution like that. I was able to get it running smoothly at Ultra settings after dropping the shadows to medium, which turns off the DX11 contact-hardening shadows. I don't notice much difference when actually playing the game.

I do kind of wish I had kept that GTX 680 4GB now, though. :saddowns:

Shimrra Jamaane
Aug 10, 2007

Obscure to all except those well-versed in Yuuzhan Vong lore.
I thought that Nvidia was releasing the 700 series this summer?

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Shimrra Jamaane posted:

I thought that Nvidia was releasing the 700 series this summer?

They are, but it's just a Kepler refresh on the same process; there's no new architecture or die shrink this year.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
The 700s will probably just be budget cards and/or OEM-only as well.

DaNzA
Sep 11, 2001

:D
Grimey Drawer

hayden. posted:

It turns out that I think I had the iehighutil.exe malware (the bitcoin-mining one) on my previous machine. It was randomly generating huge loads and making the GPU spin up, causing constant driver crashes (about every 60 seconds, like clockwork), and I think it ultimately killed my GTX260. Kind of frustrating how much of a headache this has caused me, not to mention the money. I kind of needed an upgrade anyway, but spending $800 right now wasn't something I wanted to do, and the destroyed resale value of my GTX260 has cost me like $50-100.

So anyway, I guess the lesson is don't install poo poo from untrusted sources and use NoScript on your browsers.
I've seen something similar before on another machine; it seemed like the card was dead, because it would lock up/freeze every few minutes until the whole system crashed.

However, a fresh install of Windows brought it back to life, and the card is working again with no crashes at all. So it could be that the malware stays in the background killing the driver until you do a reformat.

Also, completely remove Java from your machine and disable it in all your browsers too, since the malware seems to rely on Java to function properly.

slidebite
Nov 6, 2005

Good egg
:colbert:

Anyone talked about these mini GTX 670s before?

http://www.engadget.com/2013/04/04/asus-geforce-gtx-670-directcu-mini-graphics-card/

Animal
Apr 8, 2003

It's neat. I wonder how loud it will be.

slidebite
Nov 6, 2005

Good egg
:colbert:

According to the box, 3x quieter! :v:

If I were in the market for a GTX670 and they benchmark the same with a similar price, I'd be all over that.

The Illusive Man
Mar 27, 2008

~savior of yoomanity~

slidebite posted:

According to the box, 3x quieter! :v:

If I were in the market for a GTX670 and they benchmark the same with a similar price, I'd be all over that.

Yeah, seriously - that'd be a monster for an HTPC, but I have to imagine *some* compromise went into it compared to the vanilla 670.

slidebite
Nov 6, 2005

Good egg
:colbert:

Here we go
http://www.techpowerup.com/reviews/ASUS/GeForce_GTX_670_Direct_Cu_Mini/

quote:

ASUS pulled off the seemingly impossible with their GeForce GTX 670 DC Mini. They managed to release a high-end gaming-grade graphics card that fits inside a mini-ITX case. NVIDIA laid the foundation for this by engineering their GK104 graphics chip to use as little power as possible and designing their reference GTX 670 with a short PCB. ASUS only had to devise an equally compact cooling solution that could handle the card's heat generation. This seems to have taken some engineering skill to achieve, since they are the first to release a GTX 670 with such a small form factor, and it has been almost a year since the GTX 670's initial release.
When I first heard about the ASUS GTX 670 DC Mini, I expected a lot of compromises to have been made, but that is not the case. The card really feels like any other GTX 670 (and I've tested many of them). Gaming works really well; the card easily handles everything with all the eye-candy turned on at full HD. ASUS also includes a small factory overclock out of the box, barely worth mentioning at just 13 MHz. Overall performance does end up 2% higher than the reference design, which seems to be due to an optimized boost clock algorithm. Even with the Mini-ITX focused design of the DC Mini, I would have liked a bit more overclock out of the box, and a memory overclock; the increased heat output should be negligible.
Speaking of heat, the new ASUS DirectCU cooler handles the card's heat output very well with a vapor-chamber baseplate instead of direct-touch heatpipes. It runs at a comfortable 31°C at idle and 74°C during gaming, which means there is still plenty of headroom for small cases with bad ventilation.
Noise levels are good: much better than the NVIDIA reference design, and better than many custom GTX 670 cards we tested before, but not nearly as good as the full-sized ASUS GTX 670 DirectCU II. Given the compact size of the card and the constraints that come with it, I understand that I can't complain about noise levels under load, but idle noise could have been better, especially for a powerful ITX gaming rig that doubles as a media playback station in the living room.
While overclocking is probably not what most system builders of small form-factor systems have in mind, I can happily report that overclocking works just as well as on any other "big" GTX 670. Maximum GPU and memory overclocking levels were right in the middle of what we've seen on other cards.
ASUS tells us to expect a price of around $395 for the GTX 670 DC Mini, but the exact price has not been finalized yet. I just hope that retailers don't overprice this card like many other small form factor products. If they do, they might even get away with it, because there is, for now, no alternative to the ASUS GTX 670 DC Mini. It's the only high-end graphics card with a small form factor on the market today; the next fastest option is the GeForce GTX 650 Ti, and it is much, much slower. The $395 price is quite a bit higher than the cheapest GTX 670 reference design, which can be had for around $360, but as long as the price stays below $400, I'd consider it reasonable for this card.

slidebite fucked around with this message at 04:43 on Apr 5, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
670 DC Mini makes me want to burn money I don't have on a new ITX build. Maybe when Broadwell comes out and I can match it with a Maxwell DC Mini...

--

In other news, ever wonder how tablet and smartphone SoCs match up to PC hardware in the graphics department? You're in luck, because AnandTech looked into just that using the new Android/iOS release of 3DMark and the newest GL/DXBenchmark. Keeping in mind that 3DMark, while practical in many ways, doesn't directly correlate to real-world application performance, it seems that current top-notch mobile GPU IP blocks tend to hover around half a GeForce 8500 GT, or around the same performance as the 80-shader VLIW5 GPU block in the AMD E-350 APU.

Even earlier cards are difficult to compare, because the 8500 GT was in Nvidia's first generation of unified shader architecture (where pixel and vertex shaders are combined into a single shader unit that can do either job), and pretty much all GPUs use a unified shader architecture these days. GeForce 7000 series and prior cards in the testing hork up majorly on some parts of the 3DMark test for what appears to be architecture mismatch reasons, rather than lacking actual compute capability - on tests which are bound by pixel shaders rather than vertex shaders, the GeForce 7000 series is actually top-performing (above the 8500 GT and Intel HD 4000 in the MS Surface Pro).

Finally, raw performance is one thing, but remember that when you see things like the iPad 4 and Snapdragon 600/Adreno 320 being competitive with a 7900 GS/7800 GT, they're doing so at an order of magnitude lower power consumption. That's pretty rockin'.

--

Also, TechReport published its own FCAT article, if you haven't seen it already. Unlike AnandTech and PCPer, TR wasn't so quick to abandon FRAPS-based timing. Instead, it ran its FRAPS and FCAT frame time analyses in parallel to see what kinds of patterns came up. The rationale is that, while FCAT measures end-of-render-pipeline hiccups that can identify failings in the GPU's ability to time frames properly, combining this with FRAPS data can identify sources of lag and stutter that take place entirely before the render pipeline. And this panned out - a number of times in testing, the combined FRAPS and FCAT data showed frame time spikes that had nothing to do with the render pipeline but were instead the result of simulation hangs (i.e., from within the game engine itself).
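To make that concrete, here's a toy sketch of the kind of cross-check this combined data enables (my own illustration, not TR's actual tooling - the log format and file names are made up): flag frame-time spikes in each data set, then treat spikes that show up in the FRAPS (game-side) numbers but not in the FCAT (end-of-pipeline) numbers as candidates for simulation-side stalls.

code:

# Toy FRAPS-vs-FCAT cross-check. Assumes two hypothetical CSV logs
# with one frame time (in milliseconds) per row.
import csv

def frame_times_ms(path):
    with open(path) as f:
        return [float(row[0]) for row in csv.reader(f)]

def spikes(times, factor=2.0):
    # Frames that took at least `factor` times the overall average.
    avg = sum(times) / len(times)
    return {i for i, t in enumerate(times) if t > factor * avg}

fraps = frame_times_ms("fraps_frametimes.csv")  # logged at Present()
fcat = frame_times_ms("fcat_frametimes.csv")    # logged at the display
sim_side = spikes(fraps) - spikes(fcat)
print(f"{len(sim_side)} spikes likely originate before the render pipeline")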

Factory Factory fucked around with this message at 05:44 on Apr 5, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Also, good news, everyone! Seems that Adobe Premiere Pro CS7 will support OpenCL for the next version of the Mercury engine.

E: And it looks like Virtu may be pretty much obsolete - according to the release notes, QuickSync and OpenCL now work in Windows 8 even when you have a discrete video card.

Also looks like Intel has delivered a giant gently caress you to AMD's HSA, because Haswell's GPU will include a proprietary DirectX hook for unified CPU/GPU memory addressing - you can pass the CPU a pointer for the GPU's RAM.

Factory Factory fucked around with this message at 13:30 on Apr 5, 2013

LRADIKAL
Jun 10, 2001

Fun Shoe
Anandtech totally compared fraps and fcat in their article
http://arstechnica.com/gadgets/2013/03/a-new-era-of-gpu-benchmarking-inside-the-second-with-nvidias-frame-capture-tools/

UGH I'm a dummy, Anand just is partnered with tech report and that is their article anyway!!!!!!

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Also, that's not AnandTech, that's Ars Technica. We're still waiting for AnandTech's part 2 with lots of actual benchmarks.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.
So, with Haswell coming out, has there been any recent news on its integrated GPU? Last I heard, it was surprisingly competitive (compared to previous offerings, at least), and I've been holding off building my HTPC to see if I could also make it one that can half-decently play some games.

Edit:
vvvvv
That's the thing, I'm not looking for a gaming HTPC, I'm looking for an HTPC that just happens to have some gaming ability. Any serious gaming will still take place on my desktop PC, but the option of dicking around with 3-player co-op in Trine 2 would be neat. The priorities for the HTPC are still small size, low power, and low noise.

Jan fucked around with this message at 22:55 on Apr 7, 2013

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Jan posted:

So, with Haswell coming out, has there been any recent news on its integrated GPU? Last I heard, it was surprisingly competitive (compared to previous offerings, at least), and I've been holding off building my HTPC to see if I could also make it one that can half-decently play some games.

I literally wouldn't bother. The best version was only rumoured to be competitive with the GeForce 650M, and if you're looking to make a half-decent gaming HTPC, you have a higher power envelope to play with anyway. I'd simply get the best possible passive card, or honestly one of the many mid-range cards out today - any custom ones with multiple fans are usually extremely quiet, or can have their fan ramps sorted out in something like MSI Afterburner.

Edit: also, if your use case was genuine (OK gaming, no add-in card), AMD has already offered that solution with their APUs.

Double edit: in summary, Haswell's GPU is of interest mainly in expansion-limited situations - laptops.

HalloKitty fucked around with this message at 22:45 on Apr 7, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Jan posted:

So, with Haswell coming out, has there been any recent news on its integrated GPU? Last I heard, it was surprisingly competitive (compared to previous offerings, at least), and I've been holding off building my HTPC to see if I could also make it one that can half-decently play some games.

Edit:
vvvvv
That's the thing, I'm not looking for a gaming HTPC, I'm looking for an HTPC that just happens to have some gaming ability. Any serious gaming will still take place on my desktop PC, but the option of dicking around with 3-player co-op in Trine 2 would be neat. The priorities for the HTPC are still small size, low power, and low noise.

Yeah, I posted rumored SKU lists in the Intel thread. Short version: the GT3e (650M-ish) version will be in mobile high-TDP quad-core i7 parts only, with an -HQ suffix. GT3, the roughly-double-HD4000 version, will be in most of the rest of the mobile parts. Desktop parts will top out at GT2, the slightly-tweaked HD4000 version.

Guni
Mar 11, 2010
What sort of performance increase will the new generation of cards bring? Are we expecting a huge increase with the new consoles? Or will it likely be around 10-20%?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
We don't know yet. Remember that the consoles are getting almost a decade's worth of hardware improvements all at once, though.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

Factory Factory posted:

Also, good news, everyone! Seems that Adobe Premiere Pro CS7 will support OpenCL for the next version of the Mercury engine.

E: And it looks like Virtu may be pretty much obsolete - According to the release notes, QuickSync and OpenCL now works in Windows 8 even when you have a discrete video card.

Also looks like Intel has delivered a giant gently caress you to AMD's HSA, because Haswell's GPU will include a proprietary DirectX hook for unified CPU/GPU memory addressing - you can pass the CPU a pointer for the GPU's RAM.

More good news: Intel's loosened up the licensing restrictions on QuickSync, which will hopefully mean we get one step closer to a small, cheap, low-power NAS box that can transcode video on demand to whatever the client wants.

emdash
Oct 19, 2003

and?
Random GPU semi-technical question: what are the differences between (for example) Quadro cards and GeForce cards (or, for that matter, FireGLs and Radeons)? Especially as it relates to, say, a 660Ti having 900+ CUDA cores but being worse for 3ds Max use than a Quadro NVS 510 with 192 CUDA cores. Or would the 660Ti actually be better?
