|
Hardly an issue unless we start seeing games with no texture compression, or you want to start gaming on huge resolutions.
|
# ? Apr 1, 2013 21:21 |
|

spasticColon posted:With newer games eating more VRAM will the memory bus on my 660Ti start to become a problem in the near future? Right now it seems my 660Ti is borderline overkill at 1080p but with the new consoles coming with GPUs having access to 8GB of GDDR5 RAM I fear that video cards with 2GB of VRAM or less will become obsolete very quick. As I've mentioned earlier in the thread, consoles will have shared memory, so having 6-8GB of RAM doesn't necessarily mean they'll use all that up on GPU resources. It's too early to tell, but I think the real VRAM wasters on PC will be the same as now -- large resolutions, MSAA, etc. which you don't often see on consoles.
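The "large resolutions, MSAA" point can be made concrete with some napkin math. This is just a rough sketch with assumed formats (RGBA8 color plus D24S8 depth, double buffering, MSAA multiplying the render-target footprint by its sample count); it deliberately ignores textures, which usually dominate VRAM use:

```python
# Back-of-envelope VRAM cost of render targets at a given resolution.
# Assumptions (illustrative, not engine-accurate): 4 bytes/pixel color
# (RGBA8) + 4 bytes/pixel depth/stencil (D24S8), double buffering, and
# MSAA scaling the footprint linearly with sample count.
def render_target_mb(width, height, msaa_samples=1, buffers=2):
    bytes_per_pixel = 4 + 4  # RGBA8 color + D24S8 depth/stencil
    total = width * height * bytes_per_pixel * msaa_samples * buffers
    return total / (1024 ** 2)

print(round(render_target_mb(1920, 1080), 1))                  # no AA -> 31.6
print(round(render_target_mb(2560, 1600, msaa_samples=4), 1))  # 4x MSAA -> 250.0
```

Even under these toy assumptions, 4x MSAA at 2560x1600 costs roughly eight times the render-target memory of plain 1080p, which is why high resolution plus MSAA eats VRAM in a way console settings don't.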
|
# ? Apr 1, 2013 21:59 |
|
spasticColon posted:With newer games eating more VRAM will the memory bus on my 660Ti start to become a problem in the near future? Right now it seems my 660Ti is borderline overkill at 1080p but with the new consoles coming with GPUs having access to 8GB of GDDR5 RAM I fear that video cards with 2GB of VRAM or less will become obsolete very quick.
|
# ? Apr 1, 2013 22:13 |
|
spasticColon posted:With newer games eating more VRAM will the memory bus on my 660Ti start to become a problem in the near future? Right now it seems my 660Ti is borderline overkill at 1080p but with the new consoles coming with GPUs having access to 8GB of GDDR5 RAM I fear that video cards with 2GB of VRAM or less will become obsolete very quick. Well video cards have almost always quickly become obsolete, and it's only the ridiculously long life cycle of the PS3 & 360 that's given PC gamers a break the past few years. Unified memory or not, system reqs are likely to make some significant leaps over the next year or two. On the other hand, economies of scale mean GDDR5 RAM is going to be dirt cheap soon enough, so my guess is that the next gen or so of premium GPUs will be handing out the memory like it's candy.
|
# ? Apr 2, 2013 04:17 |
|
Hopefully PC games will now more commonly be 64 bit since consoles will have to be 64 bit as well.
|
# ? Apr 2, 2013 04:57 |
|
Alereon posted:On the 660 Ti I wouldn't really consider it an issue, but yes 2GB of VRAM will be the limiting factor on high-end cards before anything else. Before I played Bioshock: Infinite, I remember thinking my card's 3GB of memory was silly. Now I'm regularly seeing 2+ GB of usage at 1080P. That said, the game is absolutely gorgeous for a UE3 game.
|
# ? Apr 2, 2013 05:54 |
|
Space Racist posted:Before I played Bioshock: Infinite, I remember thinking my card's 3GB of memory was silly. Now I'm regularly seeing 2+ GB of usage at 1080P. That said, the game is absolutely gorgeous for a UE3 game. It is, but I have always felt that there is something subtly 'off' about UE3 games. A strange, cartoony wrongness to the graphics.
|
# ? Apr 2, 2013 05:56 |
|
Space Racist posted:Before I played Bioshock: Infinite, I remember thinking my card's 3GB of memory was silly. Now I'm regularly seeing 2+ GB of usage at 1080P. That said, the game is absolutely gorgeous for a UE3 game. Meanwhile I bought a 1gb 7850 thinking it'd be obsolete before I had any real need for 2+gb of video memory. How wrong I was...
|
# ? Apr 3, 2013 00:03 |
|
Tacier posted:Meanwhile I bought a 1gb 7850 thinking it'd be obsolete before I had any real need for 2+gb of video memory. How wrong I was... I'd be annoyed if I bought a card with 1GB as well... But it is certainly unique sorry for being a oval office
|
# ? Apr 3, 2013 00:44 |
|
The Lord Bude posted:It is, but I have always felt that there is something subtly 'off' about UE3 games. A strange, cartoony wrongness to the graphics. It fits the theme park-esque vibe I get (and love) from Irrational's worlds, both Columbia and Rapture. I've also asked this in a few other places, but maybe you guys know best: is it to be expected that my 7850 2GB would only be able to give me pretty much constant 60fps @ 1080p on High in Bioshock Infinite, and that V.High and Ultra would see noticeable, extended parts where performance dips at that resolution?
|
# ? Apr 3, 2013 16:42 |
|
It turns out that I think I had the iehighutil.exe malware (the bitcoin mining one) on my previous machine. It was randomly having huge loads and making the GPU spin up, causing constant driver crashes (about every 60 seconds like clockwork) and I think it ultimately killed my GTX260. Kind of frustrating how much of a headache this has caused me, not to mention money. I kind of needed an upgrade anyway, but spending $800 right now wasn't something I wanted to do and the resale value of my GTX260 having been destroyed has cost me like $50-100. So anyway, I guess the lesson is don't install poo poo from untrusted sources and use NoScript on your browsers.
|
# ? Apr 4, 2013 02:14 |
|
zenintrude posted:It fits the theme park-esque vibe I get (and love) from Irrational's worlds, both Columbia and Rapture. Are you playing with Vsync on, or off? I get consistently higher framerates with it off (but also get horrible screen tearing), but when it's on it's pretty common for the frame rate to dip to around 30 fps in more intense scenes. Anyway, I recently sold my 7850 and upgraded to a 7950, so I don't have a basis of comparison for Bioshock, but your experience sounds about right. The 7850 is a great little card, but it's really going to be pushed to its limit by modern games, and as mentioned Bioshock is much more demanding than you'd expect from a UE3 game.
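The "dips to 30 fps with vsync on" behavior has a simple mechanical explanation: with double-buffered vsync, a finished frame can only be shown on a refresh boundary, so effective fps snaps to integer divisors of the refresh rate. A toy model (assuming strict double buffering; triple buffering relaxes this):

```python
# Why double-buffered vsync "dips to 30": a frame that takes even slightly
# longer than one 60 Hz tick (16.7 ms) must wait for the *next* tick, so
# displayed fps drops to refresh/n for integer n (60 -> 30 -> 20 -> ...).
import math

def vsync_fps(frame_time_ms, refresh_hz=60):
    tick_ms = 1000.0 / refresh_hz
    ticks_waited = math.ceil(frame_time_ms / tick_ms)  # refresh intervals consumed
    return refresh_hz / ticks_waited

print(vsync_fps(15.0))  # faster than one tick -> 60.0
print(vsync_fps(17.0))  # just misses a tick  -> 30.0
```

That threshold effect is why vsync-off framerates look consistently higher: a 17 ms frame reads as ~59 fps uncapped, but displays at 30 fps under double-buffered vsync.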
|
# ? Apr 4, 2013 02:46 |
|
edit: whoops, sorry (USER WAS PUT ON PROBATION FOR THIS POST)
|
# ? Apr 4, 2013 02:48 |
|
ACID POLICE posted:Trying to get a replacement GPU to make an older PC (Athlon 64 X2 4600+) currently using onboard graphics more capable as a HTPC, not trying to spend more than $50. It's only got a 250W PSU and I'd like to not have to upgrade that... it currently takes up just below 200W as it is, and it's not worth replacing the PSU too so I don't want to have to do that to run an older bigger card like a GTX 260. You want to ask here, as this thread is not for upgrading questions.
|
# ? Apr 4, 2013 02:59 |
|
Are Nvidia and AMD really not releasing any new graphics cards (except to fill niches or weird things like a 7990 or whatever) until 2014? I'm going to build a new machine with Haswell, but if it's not worth waiting for a graphics card I may pick up a 670 earlier to play Bioshock Infinite on pretty settings.
|
# ? Apr 4, 2013 03:03 |
|
It really does look like it. I think it's because TSMC is slow on moving to the next process node.
|
# ? Apr 4, 2013 03:45 |
|
You're going to be hard pressed to find anything that crushes the current video card crop till 2015. Outside of BF4 and watch_dogs I think we've seen the best (bioshock) we're going to get graphically this year. e: unless the next CoD game for this year is going to include high-def, wrinkly old dudes. incoherent fucked around with this message at 06:48 on Apr 4, 2013 |
# ? Apr 4, 2013 06:46 |
|
Rashomon posted:Are Nvidia and AMD really not releasing any new graphics cards (except to fill niches or weird things like a 7990 or whatever) until 2014? I'm going to build a new machine with Haswell, but if it's not worth waiting for a graphics card I may pick up a 670 earlier to play Bioshock Infinite on pretty settings. It's not really a big deal when you can just add additional cards in SLI for considerable gain (Nvidia, anyway). This generation's GPU value has been pretty favorable versus modern game tech.
|
# ? Apr 4, 2013 10:26 |
|
TheRationalRedditor posted:It's not really a big deal when you can just add additional cards in SLI for considerable gain (Nvidia, anyway). This generation's GPU value has been pretty favorable versus modern game tech. But SLI won't help if you're running out of VRAM, as seems to be the case for Bioshock Infinite. With these reports of filling up 2GB in 1080p, I'm almost afraid to try it out on my own PC at 2560x1600. What I really did, though, was buy XCOM and get a Bioshock Infinite preorder for free. And I'm still not done playing XCOM.
|
# ? Apr 4, 2013 16:18 |
|
The VRAM issue with Bioshock Infinite explains why my frame rate dropped so much going from 1920x1200 to 2560x1600. I almost never see a significant performance drop by just bumping up the resolution like that. I was able to get it running smoothly at Ultra settings after dropping the shadows to medium, which turns off the DX11 contact-hardening shadows. I don't notice much difference when actually playing the game. I do kind of wish I had kept that GTX 680 4GB now, though.
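As a sanity check on that observation: if a game is purely fill-rate bound, frame time should scale roughly with pixel count, so the expected slowdown from that resolution bump is easy to estimate (a rough heuristic, not a benchmark):

```python
# Pixel-count ratio between the two resolutions. If performance were purely
# fill-rate bound you'd expect roughly this factor in frame time; a much
# larger drop hints that something else (e.g. spilling over VRAM) is at play.
ratio = (2560 * 1600) / (1920 * 1200)
print(round(ratio, 2))  # ~1.78x the pixels
```

So ~78% more pixels should mean a proportional dip at most; a disproportionate one is consistent with running out of VRAM, as described above.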
|
# ? Apr 4, 2013 17:31 |
|
I thought that Nvidia was releasing the 700 series this summer?
|
# ? Apr 4, 2013 19:49 |
|
Shimrra Jamaane posted:I thought that Nvidia was releasing the 700 series this summer? They are, but it's just a Kepler refresh on the same die size. There's no new architecture or die shrink this year.
|
# ? Apr 4, 2013 19:57 |
|
The 700s will probably just be budget cards and/or OEM-only as well.
|
# ? Apr 4, 2013 20:47 |
|
hayden. posted:It turns out that I think I had the iehighutil.exe malware (the bitcoin mining one) on my previous machine. It was randomly having huge loads and making the GPU spin up, causing constant driver crashes (about every 60 seconds like clockwork) and I think it ultimately killed my GTX260. Kind of frustrating how much of a headache this has caused me, not to mention money. I kind of needed an upgrade anyway, but spending $800 right now wasn't something I wanted to do and the resale value of my GTX260 having been destroyed has cost me like $50-100. However, a fresh install of Windows brought it back to life, and the card is working again with no crashes at all. So it could be that the malware stays in the background killing the driver until you do a reformat. Also, just completely remove Java from your machine and disable Java in all your browsers too, since the malware seems to rely on Java to function properly.
|
# ? Apr 5, 2013 00:48 |
|
Anyone talked about these mini GTX 670s before? http://www.engadget.com/2013/04/04/asus-geforce-gtx-670-directcu-mini-graphics-card/
|
# ? Apr 5, 2013 01:29 |
|
It's neat. I wonder how loud it will be.
|
# ? Apr 5, 2013 01:36 |
|
According to the box, 3x quieter! If I were in the market for a GTX670 and they benchmark the same with a similar price, I'd be all over that.
|
# ? Apr 5, 2013 03:36 |
|
slidebite posted:According to the box, 3x quieter! Yeah, seriously - that'd be a monster for a HTPC, but I have to imagine *some* compromise went into it compared to the vanilla 670.
|
# ? Apr 5, 2013 03:46 |
|
Here we go http://www.techpowerup.com/reviews/ASUS/GeForce_GTX_670_Direct_Cu_Mini/ quote:ASUS pulled off the seemingly impossible with their GeForce GTX 670 DC Mini. They managed to release a high-end gaming-grade graphics card that fits inside a mini-ITX case. NVIDIA laid the foundation for this by engineering their GK104 graphics chip to use as little power as possible and designing their reference GTX 670 with a short PCB. ASUS only had to devise an equally compact cooling solution that could handle the card's heat generation. This seems to have taken some engineering skill to achieve since they are the first to release a GTX 670 with such a small form factor, and it has almost been a year since the GTX 670's initial release. slidebite fucked around with this message at 04:43 on Apr 5, 2013 |
# ? Apr 5, 2013 04:35 |
|
670 DC Mini makes me want to burn money I don't have on a new ITX build. Maybe when Broadwell comes out and I can match it with a Maxwell DC Mini... -- In other news, ever wonder how tablet and smartphone SoCs match up to PC hardware in the graphics department? You're in luck, because AnandTech looked into just that using the new Android/iOS release of 3DMark and newest GL/DXBenchmark. Keeping in mind that 3DMark, while practical in many ways, doesn't directly correlate to real-world application performance, it seems that current top-notch mobile GPU IP blocks tend to hover around half a GeForce 8500 GT, or around the same performance as the 80-shader VLIW5 GPU block in the AMD E-350 APU. Even earlier cards are difficult to compare, because the 8500 GT was in Nvidia's first generation of unified shader architecture (where pixel and vertex shaders are combined into a single shader unit that can do either job), and pretty much all GPUs use a unified shader architecture these days. GeForce 7000 series and prior cards in the testing hork up majorly on some parts of the 3DMark test for what appears to be architecture mismatch reasons, rather than lacking actual compute capability - on tests which are bound by pixel shaders rather than vertex shaders, the GeForce 7000 series is actually top-performing (above the 8500 GT and Intel HD 4000 in the MS Surface Pro). Finally, raw performance is one thing, but remember that when you see things like the iPad 4 and Snapdragon 600/Adreno 320 being competitive with a 7900 GS/7800 GT, they're doing so at an order of magnitude lower power consumption. That's pretty rockin'. -- Also, TechReport published its own FCAT article, if you haven't seen it already. Unlike AnandTech and PCPer, TR wasn't so quick to abandon FRAPS-based timing. Instead, it ran both its FRAPS and FCAT frame time analyses in parallel to see what kinds of patterns came up. 
The rationale is that, while FCAT measures end-of-render-pipeline hiccups that can identify failings in the GPU's ability to time frames properly, combining this with FRAPS data can identify sources of lag and stutter that take place before the render pipeline entirely. And this panned out - a number of times in testing, the combined FRAPS and FCAT data showed frame time spikes that had nothing to do with the render pipeline, but instead were results of simulation hangs (i.e. from within the game engine itself). Factory Factory fucked around with this message at 05:44 on Apr 5, 2013 |
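The kind of analysis TR runs on those frame-time logs (FRAPS or FCAT) boils down to percentile summaries plus spike detection: flag individual frames that take far longer than the running norm. A toy sketch of that idea, with an invented spike threshold and simplified percentile indexing, not TR's actual methodology:

```python
# Summarize a frame-time log with median/99th-percentile and count "spikes":
# frames taking more than spike_factor times the median. The 2.5x threshold
# is made up for illustration.
def frame_time_report(times_ms, spike_factor=2.5):
    ordered = sorted(times_ms)
    median = ordered[len(ordered) // 2]          # simplified median
    p99 = ordered[int(len(ordered) * 0.99)]      # simplified 99th percentile
    spikes = [t for t in times_ms if t > spike_factor * median]
    return {"median_ms": median, "p99_ms": p99, "spikes": len(spikes)}

sample = [16.7] * 95 + [17.1, 18.0, 45.0, 50.0, 16.9]
print(frame_time_report(sample))  # -> {'median_ms': 16.7, 'p99_ms': 50.0, 'spikes': 2}
```

The point of combining two log sources is that a spike appearing in the FRAPS timestamps but not in the FCAT capture localizes the stall before the render pipeline, i.e. in the game simulation.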
# ? Apr 5, 2013 05:18 |
|
Also, good news, everyone! Seems that Adobe Premiere Pro CS7 will support OpenCL for the next version of the Mercury engine. E: And it looks like Virtu may be pretty much obsolete - According to the release notes, QuickSync and OpenCL now works in Windows 8 even when you have a discrete video card. Also looks like Intel has delivered a giant gently caress you to AMD's HSA, because Haswell's GPU will include a proprietary DirectX hook for unified CPU/GPU memory addressing - you can pass the CPU a pointer for the GPU's RAM. Factory Factory fucked around with this message at 13:30 on Apr 5, 2013 |
# ? Apr 5, 2013 13:18 |
|
Anandtech totally compared fraps and fcat in their article http://arstechnica.com/gadgets/2013/03/a-new-era-of-gpu-benchmarking-inside-the-second-with-nvidias-frame-capture-tools/ UGH I'm a dummy, Anand just is partnered with tech report and that is their article anyway!!!!!!
|
# ? Apr 5, 2013 13:33 |
|
Also, that's not AnandTech, that's Ars Technica. We're still waiting for AnandTech's part 2 with lots of actual benchmarks.
|
# ? Apr 5, 2013 13:43 |
|
So, with Haswell coming out, has there been any recent news on its integrated GPU? Last I heard, it was surprisingly competitive (compared to previous offerings, at least), and I've been holding off on building my HTPC to see if I could also make it half-decently play some games. Edit: vvvvv That's the thing, I'm not looking for a gaming HTPC, I'm looking for an HTPC that just happens to have some gaming ability. Any serious gaming will still take place on my desktop PC, but the option of dicking around with 3-player co-op on Trine 2 would be neat. The priority for the HTPC remains to be small, low power and low noise. Jan fucked around with this message at 22:55 on Apr 7, 2013 |
# ? Apr 7, 2013 21:50 |
|
Jan posted:So, with Haswell coming out, have there been any recent news on its integrated GPU? Last I heard, it was surprisingly competitive (compared to previous offerings at least), and I've been holding off building my HTPC to see if I could also make it a HTPC that can half-decently play some games. I literally wouldn't bother. Since the best was rumoured to be competitive with the Geforce 650m, if you're looking to make a half-decent gaming HTPC, you have a higher power envelope to play with, and I'd simply get the best possible passive card, or honestly, one of many mid-range cards today - any custom ones with multiple fans are usually extremely quiet, or can have fan ramps sorted out in something like MSI Afterburner. Edit: also, if your use case was genuine (OK gaming, no add-in card), AMD has already offered that solution with their APUs. Double edit: in summary, Haswell's GPU is of interest mainly in expansion-limited situations - laptops. HalloKitty fucked around with this message at 22:45 on Apr 7, 2013 |
# ? Apr 7, 2013 22:43 |
|
Jan posted:So, with Haswell coming out, have there been any recent news on its integrated GPU? Last I heard, it was surprisingly competitive (compared to previous offerings at least), and I've been holding off building my HTPC to see if I could also make it a HTPC that can half-decently play some games. Yeah, I posted rumored SKU lists in the Intel thread. Short version: The GT3e (650M-ish) version will be in mobile high-TDP quad core i7 parts only, with an -HQ suffix. GT3, the roughly-double-HD4000 version, will be in most of the rest of the mobile parts. Desktop parts will top out with GT2, the slightly-tweaked HD4000 version.
|
# ? Apr 7, 2013 23:24 |
|
What sort of performance increase will the new generation of cards bring? Are we expecting a huge increase with the new consoles? Or will it likely be around 10-20%?
|
# ? Apr 7, 2013 23:57 |
|
We don't know yet. Remember that the consoles are getting almost a decade's worth of hardware improvements all at once, though.
|
# ? Apr 8, 2013 00:00 |
|
Factory Factory posted:Also, good news, everyone! Seems that Adobe Premiere Pro CS7 will support OpenCL for the next version of the Mercury engine. More good news: Intel's loosened up the licensing restrictions on QuickSync, which will hopefully mean we get one step closer to a small, cheap, low-power NAS box that can transcode video on demand to whatever the client wants.
|
# ? Apr 8, 2013 00:13 |
|
Random GPU semi-technical question: what are the differences between (for example) Quadro cards and GeForce cards (or, for that matter, FireGLs and Radeons)? Especially as it relates to f. ex. a 660Ti having 900+ CUDA cores but being worse for 3dsmax usage than a Quadro NVS 510 with 192 CUDA cores. Or would the 660Ti actually be better?
|
# ? Apr 8, 2013 04:53 |