Klyith
Aug 3, 2007

GBS Pledge Week

movax posted:

As a product of the late 90s, the NV3 packed some bitchin' features.
One feature you missed: support for 24/32-bit color in 3d. The main competition, the Voodoo2, could only render at 16-bit. At the time I had a completely stacked gaming computer (bought for me by my grandmother for "important college work" :raise:), and I had both of them. In the beginning I was a total quake-head and used the voodoo2 all the time for glide... But then a little game called Homeworld came out and showed me that things like color, art, and immersion really mattered in games. I think that game was pretty much the first one designed for full 32-bit color -- with its fantastic brightly colored gradient backgrounds and layered transparent effects -- and it was only fully enjoyable that way.

(Also does anyone else remember the Riva128 vs Voodoo2 flamewars? So epic. So dumb.)

Klyith
Aug 3, 2007

GBS Pledge Week

movax posted:

Hm, for some reason I thought I read that the Riva 128 was still limited to 16-bit color depth in 3D operations (I think Wiki says that), but if that isn't the case I will totally update that post!
Oops, I was wrong! :sweatdrop: I had a first-gen TNT. I got the computer in late '98, just after it came out. I now remember that the computer -- ordered through Gateway's then-new custom-build internet store! -- was something like two months late, and during that delay the TNT came out and got put in my machine even though it was ordered with a Riva128.

Man, nostalgia. I never had good computers when I was growing up despite being desperately obsessed with them. No c64, no amiga, a 286 long into the 486 and early pentium days, etc. Getting that college computer was like the apotheosis of all my boyhood computer lust.

Boten Anna posted:

Since this was the 90s, by both of them do you mean you would actually open up your computer and swap them out? Or since this was the 90s, do you mean they both just sat on the uniform AGP/PCI slots and you just plugged in the one you wanted to use at the time?
Nope. Since this was the late 90s, the TNT plugged into the AGP slot, sent video out through VGA into the Voodoo2's passthrough port, and thence out again to the monitor. If you wanted SLI you had to add a third pass to the daisychain! In those days SLI was really about (vga) scan lines!

Klyith
Aug 3, 2007

GBS Pledge Week
So if we're taking a trip down memory lane to the GPU technology of the 90s, let's make a quick stop to mention S3. At the same time as nvidia was making the Riva128, S3 made the ViRGE, which was groundbreaking in its own way. The ViRGE was another combined 2d/3d accelerator, but it was a bit... weak at that 3d thing. In fact it was so bad that as soon as you asked it to do anything more complex than render a single unfiltered texture on a poly, its performance dropped off a cliff. The average CPU of the time doing software rendering could play Quake better than a ViRGE; it was mockingly called a "video de-accelerator" on usenet and the early internet.

However, the silver lining to the dark cloud of S3 failure was their mighty effort to find *some* way to make their followup, the Savage 3D, perform faster. Nothing inspires engineers like having a bunch of nerds mock your work for years. So they invented two things that are still in use today: 1-cycle trilinear filtering and a little thing called S3TC, which was licensed by Microsoft for use in DirectX as DXTC: DirectX Texture Compression.

It wasn't enough at all. The Savage 3D had average performance but couldn't be made cheaply due to horrendous yield. (Wikipedia sez: "S3's yield problems forced Hercules to hand pick usable chips from the silicon wafers.") The Savage 4 fixed the yield problems but didn't get much faster, and was a cheapo second fiddle to the TNT2 and ATI Rage. The Savage2000 was kinda better, with hardware performance nearly equal to the new GeForce, but drivers so atrocious that most games were unplayable.


S3 merged with Diamond Multimedia, then quit the video chip market and sold the graphics business to VIA. Diamond's claim to fame was a cool little product: the Rio PMP, one of the first portable mp3 players. I had a PMP500 (the successor model), with 64mb of internal flash plus a 64mb MMC card. It's impossible to explain how baller it was to walk around with that when everyone else still had big skipping CD walkmen. Later they had the ReplayTV, one of the first time-shifting digital TV recorders. Both of those products got them sued, by the RIAA and MPAA respectively; fighting two huge lawsuits ate all their time and money and eventually killed the company, even though they won. Which is why the media companies to this day still sue any technology they hate and fear, and why Apple rules the world instead of S3.

Klyith
Aug 3, 2007

GBS Pledge Week

HalloKitty posted:

poo poo, why didn't I know about this already? You can even set up a toggle to see the performance impact and the difference in how it looks. Thanks!

Edit: drat, I've been wasting time on anything else. This is amazing, and with no impact discernible. Christ. I've used MLAA before but it mightily ballsed up text and sharp areas of contrast such as the HUD. This just.. doesn't seem to do that.
It's much more GPU intensive than MLAA. SMAA is not a purely post-process filter: it does sub-pixel sampling similar to MSAA to preserve detail and some temporal supersampling. Think of SMAA as the potluck of anti-aliasing: it uses a little bit of everything and combines them together. High quality SMAA looks pretty good, but you wouldn't want to combine it with other AA techniques like you can with MSAA+MLAA (or MSAA+FXAA on nvidia).

I think the weakness of MLAA/FXAA is that they're serving two masters. They're a good supplement to MSAA for getting rid of obvious aliasing from specular lighting or other things that MSAA doesn't touch. But they're also a low-cost AA for modern inexpensive graphics cards that have plenty of shader power but less memory bandwidth and raster power. As a result they're a bit blurrier than they'd otherwise need to be.


And your issue with MLAA blurring text and such is because the game didn't have hints for the driver to not post-process those bits, either because it's old or made by people who didn't know how to do that. Most new-ish games don't have that issue. The loss of detail on regular textures still happens though.


TheRationalRedditor posted:

SweetFX has a stand-alone GUI that can create custom override profiles for every game independent of driver suites, I do believe.
Yes, sweetfx has an SMAA injector. It's a bit clunkier to use (see SweetFX Configurator), but it also has all sorts of other post-process stuff ranging from nice to gimmicky.

Nvidia Inspector is cool but it doesn't have SMAA. That has to be done via some type of injection because it's not implemented in drivers (yet). I just grabbed RadeonPro, and it's running a background service that I guess is doing the injection.

Klyith
Aug 3, 2007

GBS Pledge Week

kuddles posted:

Huh. People keep saying this to me but when I try doing that, the screen still tears like crazy for me.
You can still get tearing with adaptive vsync. When your current framerate is lower than your refresh rate, it's like having no vsync at all. To get the most out of adaptive/dynamic vsync you really have to tune your game settings so the FPS stays above your refresh rate for the bulk of frames. And if you are super-sensitive to tearing, it still might not be for you.
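
If it helps, here's the decision adaptive vsync is making every frame, as a toy Python sketch -- this is just to illustrate the logic, not anyone's actual driver code, and the two present functions are made-up stubs:

code:

# Illustrative sketch of the adaptive vsync decision, not actual driver code.
# The two "present" functions are stand-in stubs just to make the logic readable.
def wait_for_vblank_and_present():
    print("synced to refresh: no tearing")

def present_immediately():
    print("presented mid-scan: tearing possible, no added latency")

def present_frame(frame_time_ms, refresh_hz=60):
    refresh_interval_ms = 1000.0 / refresh_hz
    if frame_time_ms <= refresh_interval_ms:
        wait_for_vblank_and_present()   # running faster than refresh: behaves like normal vsync
    else:
        present_immediately()           # running slower than refresh: behaves like vsync off

present_frame(12.0)   # ~83 fps worth of work on a 60 Hz monitor -> vsync behavior
present_frame(22.0)   # ~45 fps worth of work on a 60 Hz monitor -> tearing, like no vsync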

To answer your original question: yes, RadeonPro has an option in its vsync controls to force triple buffering. So you can have that and SMAA at the same time.

Klyith
Aug 3, 2007

GBS Pledge Week

The Lord Bude posted:

In what parallel universe do you need to do anything in particular to install beta graphics drivers? In seven years of owning my own PC running XP, then later vista and now windows 7 I have only ever installed beta drivers.
Beta drivers from the official nvidia/amd sites still get signed, though possibly not WHQL'd. The stuff that gets put up on guru3d etc. without signatures is supposedly pre-beta or "internal testing" builds. There are three levels of driver signature enforcement: MS signed, vendor signed, and unsigned. And unless you've changed the defaults, Windows will bitch at you about unsigned drivers.

Klyith
Aug 3, 2007

GBS Pledge Week

Dogen posted:

I was installing the new nvidia betas and the installer slideshow indicated a bundle is coming with Metro Last Light
It's already in effect, with GTX 660s and up, so starting at $200. Of course, the AMD bundle is now Bioshock, Tomb Raider, and Far Cry Blood Dragon. Between a 660 and a 7870 the performance is pretty much a wash as well.

Metro is definitely an improvement for nvidia though. That poo poo-tier F2P credit was the main reason I bought my first AMD/ATI card a few months ago, after a 10-year run of nvidias (not a fanboy thing, I just never happened to buy a video card in the generations where ATI had the upper hand).

Klyith
Aug 3, 2007

GBS Pledge Week

Space Racist posted:

Nothing justifies the cost of a $200-$300 card like ~$135 worth of free, new AAA games. Of course, I'm curious how significantly this promo is eating into AMD's profit margins.
I think it's actually helped their revenue -- Q1 2013 was a $150 million loss, but that's better than expected and way better than some previous quarters. They're actually better off selling video chips at a small loss than selling nothing at all. They've had to eat giant penalty charges on their fab contracts because their orders came in below the committed volumes.

Now that we know both the PS4 and the new MS console will use AMD chips, I think their strategy for the past year was to keep their head just barely above water. Margins on the new console chips will be sweet gently caress all, but they improve revenue and solve that fab capacity problem. They're still in a dicey position.

Klyith
Aug 3, 2007

GBS Pledge Week
I don't think 1080p inherently needs more memory. Older video cards were perfectly capable of 1600x1200, which is roughly the same number of output pixels as 1080p.

It's the high-resolution textures, and multiple textures per surface, that are eating most of the memory. Plus lots of pixel shaders need memory buffers. And the trend in games toward things like open worlds and large, diverse levels means keeping a much larger working set in memory.

e:

Jago posted:

Frame buffers and AA are still not free. Look at any benchmarks with a bunch of cards and watch them scale as resolution is increased and decrease.
You have to watch for the difference between memory size and memory bandwidth. Most of the performance hits from increased resolution are not due to running low on memory capacity, they're from increased bandwidth requirements. Multisampling AA is a perfect example: it needs hardly anything in the way of additional capacity, but it needs multiple memory hits for every edge pixel the AA is being applied to.

Framebuffers at this point have been so outstripped by memory growth that they might as well be free. Triple-buffered 1080p at 32-bit color plus a z-buffer is on the order of 30-50 MB depending on what you count. If you're supersampling it balloons up, but even then the bandwidth requirements kill you first.
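
If you want to check that math, here's the back-of-envelope version. I'm counting three color buffers plus one depth buffer; engines keep other render targets around too, so treat the exact total as ballpark:

code:

# Back-of-envelope framebuffer math for 1080p, 32-bit color, triple buffered.
width, height = 1920, 1080
bytes_per_pixel = 4                      # 32-bit RGBA
color_buffers = 3                        # triple buffering
depth_buffers = 1                        # 32-bit depth/stencil

one_buffer_mb = width * height * bytes_per_pixel / 1024**2       # ~7.9 MB
total_mb = one_buffer_mb * (color_buffers + depth_buffers)       # ~32 MB
print(round(one_buffer_mb, 1), round(total_mb, 1))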

Klyith fucked around with this message at 00:07 on Apr 26, 2013

Klyith
Aug 3, 2007

GBS Pledge Week

Mad_Lion posted:

Is it just that games have gotten more complicated, or what?
It's not exactly the games that are more complicated, it's the entire video rendering pipeline. Back in the Voodoo days, nothing in the pipeline was context dependent -- no pixel really cared about the state of its neighbor. The GPU acquired triangles (already transformed by the CPU), applied textures and maps to them, applied lights, and rendered them. Nothing in the process cared where the next pixel in line was being handled.

Now, a huge amount of the GPU's work is higher-level programs that don't react well to being split across scan lines. When you split the work up, every border is a seam across which the GPUs have to pass data to stay synced. Which is a problem, because the fewer borders you have, the worse the load balancing gets.
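
As a toy example of that trade-off (my own sketch, nothing to do with how real SLI/CrossFire drivers actually schedule work): say a screen-space filter reads a few rows of neighboring pixels, and the frame is split into horizontal bands across GPUs.

code:

# Toy illustration of the split-frame trade-off, not how real drivers work.
# A screen-space filter that reads `filter_radius` neighboring rows forces the
# GPUs to exchange that many rows at every internal border between bands.
def rows_to_exchange(num_bands, filter_radius):
    internal_borders = num_bands - 1
    return internal_borders * 2 * filter_radius   # radius rows from each side of a border

for bands in (2, 4, 8):
    print(bands, "bands ->", rows_to_exchange(bands, filter_radius=4), "rows synced per pass")
# Fewer bands mean less syncing, but also coarser load balancing between the GPUs.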

Klyith
Aug 3, 2007

GBS Pledge Week
e: ^^^ My understanding is that if you enable vsync it's relatively fine and always has been. I'd think the main benefit of an SLI/CF setup is that you can enable vsync and never drop below 60 frames.

Mad_Lion posted:

Anandtech's newest review of the 7990 makes it look pretty good vs. the 690. http://www.anandtech.com/show/6915/amd-radeon-hd-7990-review-7990-gets-official

It uses more power and makes a bit more noise, but other than that, it wins in enough games to make it a totally viable choice, especially considering that it has 3GB of ram per GPU vs. 2GB. In addition, if you care about compute, it owns the 680.
Your comparisons are not very good (the thing to put it up against is dual 680s), and your upgrade strategy isn't very good (spending huge amounts on bonkers top-of-the-line parts is less effective than spending sane amounts more frequently, unless someone else is paying for your computer).

But this isn't the thread for upgrade questions / talk.

Klyith
Aug 3, 2007

GBS Pledge Week

dpbjinc posted:

They probably don't have anything new to release yet. As was just mentioned, AMD's working on hUMA. NVIDIA's working on Project Denver, which is probably going to end up connected with their GPUs in some way. Neither are going to be ready for mainstream use before 2014.
Both are also working on various projects in the mobile / tablet space, which is the big growth area. But I think the big thing is that nobody needs new graphics cards this year: everything coming out is still multi-platform and tied to the old consoles, so even if your video card is two generations old you're still ahead of the baseline. Sales were sluggish in the second half of 2012, despite a full range of great options.



In other news, I just got a code for Blood Dragon from AMD, which is pretty awesome since I originally bought just the Tomb Raider and BS:I version. God knows when I'll end up playing it -- I'll have to sign up to Ubistore to get it, but that's not AMD's fault.

Klyith
Aug 3, 2007

GBS Pledge Week

Rigged Death Trap posted:

Why would it be laughable?
It's plausible that the new graphics core can put a game on one monitor at reasonable settings and have it playable. That it will do three is just not in the cards.

Unless "competing with eyefinity" is just the ability to put Windows apps on 3 monitors, in which case whatever. Plenty of office workers get multiple monitors so it's nice they won't have to spec a basic video card for their Dells, but that's not what Eyefinity is all about.

Klyith
Aug 3, 2007

GBS Pledge Week
Your issues sound very similar to a thing I had happen to me about a year ago, except mine was with my nvidia card. At some point I installed new drivers over the top of old ones without fully uninstalling. Windows XP used to be ok with that, but 7 gets hosed up.

Basically, every time after that when I tried to update drivers, Windows would dig out this old version from the depths of WinSXS and try to install its files instead. Driver cleaning programs would remove the active files, but not the backups lodged in whatever buttcrack they were hiding in. Finally I had to remove any new drivers and let Windows go back to the old ones, so I could find which exact version they were. Then I got that exact version from nvidia's driver archive, installed it, and used the uninstaller properly. That finally got rid of everything, and I could update again with no problems.

Klyith
Aug 3, 2007

GBS Pledge Week

Jan posted:

Also, take this with a grain of salt since I haven't ever worked with WinSXS, but I'm pretty sure GPU drivers cannot be run from SXS. The kernel mode driver framework doesn't work in a way that allows running the same driver twice in parallel. Of course, with the amount of non-kernel level libraries that drivers come with nowadays, I wouldn't be surprised if that stuff ends up causing trouble with SXS copies.
It wasn't running the drivers out of winsxs, I'm pretty sure it just had copies of the dlls and driver inf there that it was using to reinstall. And yes, sxs does solve more problems than it creates, usually. It's just way more impenetrable. It can also be annoying on an SSD in terms of space consumed.

quote:

All in all, what I do know of driver programming is painful enough that I wouldn't wish it on my worst enemy. Game engines are bad enough. :v:
God yes. Normal hardware drivers are bad enough, but modern graphics drivers are some unholy combination of driver and real-time compiler. We'll see how well AMD treats me after being on the green team for so long, but even nvidia hasn't been perfect at times.

Klyith
Aug 3, 2007

GBS Pledge Week

TheRationalRedditor posted:

IIRC those after-market GPU coolers all reviewed favorably well, the only drawbacks being price and sometime difficulty of install.
And space requirements, but I suppose anyone who has a 79xx card already had a big roomy case.


One thing to try before spending on an aftermarket heatsink: look at your vsync settings. I was a bit annoyed that my new 7870, which has top-mounted fans rather than a blower, was making a lot more noise than my old GTX460. In particular it was spinning the fans up in relatively simple games like Kerbal Space Program, where I'm not sealed into my headphones to block the noise. But I figured out that RadeonPro disabled vsync by default, so the card was running at max speed to render 140 kerbals per second.

Klyith
Aug 3, 2007

GBS Pledge Week

subx posted:

That's surprising. I haven't kept up with video cards much the past couple of years, but when I did after 2-3 years you could buy something quite a bit faster for ~100-150. The budget market is where I usually pointed people to when they are building a first gaming pc or whatever, so that's pretty disappointing that my 3 year old $350 card is still in that same range. (edit - not disappointing for me so much since it's been a fantastic card, but for people getting into the hobby)

And that link is pretty much exactly what I was looking for, thanks.
Combination of:
1) The 58xx / 4xx generation was a pretty fantastic one. The two since then haven't exactly been stagnant, but the extended console cycle pushed some design attention from raw fps power to stuff like GPU compute, new AA methods, etc.

2) The $100-150 budget card bracket is always pretty crappy. Those cards take forever to catch up to the high-performance cards from previous years, because they're built with a lot of compromises for price. At $100, a lot more of your money is going to fixed costs like the PCB & assembly, packaging, transport, etc. The $200-300 midrange is normally where you get the best value for money across the spread of price points, and you start hitting diminishing returns after that.

Pointing people at a $150 budget card for their first gaming pc is ok if they're really price conscious, a fairly casual gamer, or a WoW junkie. But someone who plays big AAA on a 360/PS3 will probably be disappointed.

Klyith
Aug 3, 2007

GBS Pledge Week

Fauxtool posted:

I have a 7970 that gets as hot as 96C when playing mwo on a triple monitor set up. Most every other game stays below 80C. Is that safe? The fan only ramps up to 66% automatically when I wouldnt mind it going 100%. The noise is a non issue most of the day but I cant just leave it at 100% all the time. Can I set a more aggressive fan profile?
I'd recommend MSI Afterburner for fan control on any video card. It works on any brand; only some advanced hardware tweaks are locked out if you don't have an MSI card. You can make a custom ramp for fan speed based on temperature.
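
The custom ramp is basically just a list of temperature/fan-speed points that get interpolated. Rough sketch of the idea -- illustrative numbers, not a recommended profile, and not Afterburner's actual config format:

code:

# Rough sketch of a custom fan curve: interpolate between (temp C, fan %) points.
# The points below are illustrative only, not a recommended profile.
CURVE = [(40, 30), (60, 45), (75, 70), (85, 100)]

def fan_percent(temp_c, curve=CURVE):
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

print(fan_percent(70))   # ~62% with these example points
print(fan_percent(90))   # pegged at 100% above the last point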


As for whether you need it, temps in the 90s are hot but within acceptable limits for the silicon. They design GPUs to run hot. It might reduce the statistical life of the card a bit, but it's not like you're running it 24/7 mining bitcoins. I wouldn't worry about it too much personally.

But it's odd that the card isn't running the fans at 100% at that temperature. Some 7900s do have really slack fan profiles that let the chip get pretty hot, especially ones with blower coolers, because blowers are so loud -- but not going full speed in the 90s is still pretty unusual.

Klyith
Aug 3, 2007

GBS Pledge Week

Killer robot posted:

I've had this happen too, and have assumed it to be a driver thing since it's seemed to come and go with driver installs.
Same here, though mine only makes glitch squares in the tab & location bar area. I suspect turning off hardware acceleration would fix it, but so does a single alt-tab. I'm pretty agnostic about who's to blame, AMD or Firefox.

Klyith
Aug 3, 2007

GBS Pledge Week

Urzza posted:

I was browsing LGA 2011 motherboards, and while most of them have PCI-E 3.0, I haven't found any processors in that socket type that support 3.0. Is it just that I haven't found a processor that supports 3.0, does it not matter except for high end cards, or is it just all marketing on the part for the motherboard manufactures?
Intel delayed (or maybe cancelled at this point) the Ivy Bridge chips for LGA 2011. Probably they decided to just keep selling Sandy-based Xeons for the server market, since Ivy's performance gains in that area were modest. Without a server chip to rebadge, the foolish enthusiast market isn't enough on its own.

But PCIe 3.0 doesn't matter for any video card, even the high end. It might just barely be measurable for an SLI setup. Video cards are not starved for bandwidth across the PCIe bus. It's been a consistent thing, all the way back to the AGP era, that you can run Card X on the bus from Generation X-2 and see less than a 5% impact on performance.
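
For context, the rough per-direction numbers for an x16 slot by generation (standard spec figures, rounded):

code:

# Rough per-direction bandwidth of an x16 slot, by PCIe generation (rounded).
# Gen 1.x/2.0 use 8b/10b line encoding, gen 3.0 uses 128b/130b.
gens = {
    "PCIe 1.x x16": 2.5e9 * 16 * (8 / 10) / 8 / 1e9,     # ~4 GB/s
    "PCIe 2.0 x16": 5.0e9 * 16 * (8 / 10) / 8 / 1e9,     # ~8 GB/s
    "PCIe 3.0 x16": 8.0e9 * 16 * (128 / 130) / 8 / 1e9,  # ~15.75 GB/s
}
for name, gbs in gens.items():
    print(name, round(gbs, 2), "GB/s per direction")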


Buying a desktop LGA 2011 system right now would be a massive waste of money, what with Haswell right around the corner. Unless you really need a 6 core chip for some reason.

Klyith
Aug 3, 2007

GBS Pledge Week

Animal posted:

The only way it could get damaged is if you have a crap PSU, like OCZ.
There's nothing wrong with OCZ power supplies; their high-end stuff has the same guts as PC Power & Cooling units (because OCZ bought them). The cheap ones are not as good, but cheapo PSUs are crappy no matter what badge is on the case. I'm not saying OCZ is a company people should buy any product from, but if one were doing so, an OCZ PSU isn't a bad pick.
A PSU would also have to be impressively hosed up, not just cheap trash, to permanently damage components.

It's hard to say why Stumpus's computer is running badly because "slow as poo poo" isn't very descriptive, but on a box you haven't used for a few years the BIOS battery might have run down and reset everything to safe defaults. Aside from that, it's probably just the OS being full of old crap; a reinstall would make it snappy again.

Klyith
Aug 3, 2007

GBS Pledge Week
Endymion, it's probably fine. Even the bad post-OCZ PCP&C is only bad in comparison to what PCP&C used to represent. If you're worried, a backup PSU that sits on a shelf is nice to have around.

PSUs can damage other components when they fail, but it's really not very likely; there are a number of failsafes to prevent that. I suspect a lot of the anecdotal "PSU killed my CPU / video card / whatever" stories are explained by people being idiots and not seeing how they hosed up their own system.

Srebrenica Surprise posted:

I'd love to hear about the multitude of OCZ units being sold based on a Seasonic design.
Based on a Seasonic design as in they have one big fan on the bottom and modular cables? Welcome to the PC hardware market, where any product with a superior design gets copied. Or is this something else I'm not aware of?

Animal posted:

Why should anyone buy their brand when Seasonic is the complete opposite and has similar prices?
I said "don't buy any product from OCZ" right?

Klyith
Aug 3, 2007

GBS Pledge Week

NickelSkeleton posted:

So I'm freaking out after my new Radeon 7950 HD stopped working properly.
Sounds like a standard GPU failure, RMA your card. A 500w Seasonic is plenty for that system (unless it's the PSU that's on its way out).

If you want to test it out, you could try using the overclocking panel of CCC to lower the GPU speed in 100MHz chunks and see if it can load a game without crashing. If it works at some lower clock speed, that's good evidence the card is defective.

quote:

another weird thing, when running with the catalyst 13.5 beta2 driver check out my GPU clock:
That's normal, GPUs run at lower clock speed when there's no demand. If you ran a game or something in the background it'd jump up to the rated speed (and crash in your case).

Klyith
Aug 3, 2007

GBS Pledge Week

Joink posted:

On that tab it doesn't update the GPU clock speed, only shows its max setting. Under sensors it shows exactly whats going on with the card. For my card, its running at 300MHz when doing 2D.
Ah, I see. Hadn't actually used GPU-Z myself, so I thought it would be the same as CPU-Z (which does show the current, dynamic clock speed in that location). So that card is definitely hosed in some way.


Zorro KingOfEngland posted:

Has anyone tried out the new Geforce Experience thing nVidia just released? Being able to record the past 20 minutes of gameplay footage sounds really cool, but there's no way it comes without a performance hit.
The ShadowPlay recording part of it isn't released yet; that's coming "this summer" or "late summer".

Still, it's a baller app. Being available on all Kepler-based cards, not just the $650 fuckoff-expensive ones, is awesome. Finally a way to do high-quality recording that doesn't require a high-end computer.

Klyith
Aug 3, 2007

GBS Pledge Week

Alereon posted:

It'll work, but that's a lower-end power supply and you're pushing it pretty hard, so expect higher noise levels and temperatures. Definitely upgrade power supplies if you consider overclocking.
I don't see how his setup can possibly go over 350 watts; the 780 doesn't draw that much power. Anandtech's system with an i7 overclocked to a nutso 4.3 GHz barely broke 400 W, and TR's more normal setup was just over 300 W.

I don't see how a 60-70% load counts as pushing it pretty hard, and I also can't see how a decent Seasonic is a lower-end PSU.

Klyith
Aug 3, 2007

GBS Pledge Week
Last page someone asked about replacing PSUs after X years: I think every 3 years is probably over-cautious, but they don't last forever. PSUs have large liquid electrolytic capacitors, which have a limited lifespan, and 6 years starts getting close if you're like many of us who run our desktops 24/7.

Alereon posted:

You want the load on the power supply to be 50% or lower, beyond that you get progressively worse noise levels, efficiency, and power delivery quality. On that Seasonic power supply, the fan is ramping up to noticeable levels at 60%, and 80% is about the point where the noise and heat output are excessive. That power supply isn't BAD, it'll keep supplying functional power up to its limit, but it was a low-cost power supply when it was new and is a bit behind the curve today. It was a great choice for a lower-draw system, the problem comes when you try to use that same cheap power supply for a system with a top-end videocard.
Higher loads will lead to more fan noise, and a bigger-capacity PSU will help. Marginally -- the size of the load doesn't change. If you're comparing two PSUs with the same efficiency curve, the bigger one might be a percent or three more efficient because it's running at a lower fraction of its capacity. So there's 90 W of waste heat instead of 100 W, woo. And I strongly disagree that delivered power quality is unacceptable at 80% load for decent brands of PSUs. The one big negative of operating at very high loads is that it reduces the lifespan of the PSU.
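
The arithmetic behind that, with made-up but plausible efficiency numbers:

code:

# Rough arithmetic behind the "90 W vs 100 W" point above.
# The efficiency numbers here are assumptions for illustration, not measurements.
def waste_heat(load_watts, efficiency):
    """Watts dissipated inside the PSU for a given DC load and efficiency."""
    return load_watts * (1 / efficiency - 1)

load = 500  # assumed DC watts drawn by the system
print(round(waste_heat(load, 0.83)))  # smaller PSU near its limit: ~102 W of heat
print(round(waste_heat(load, 0.85)))  # bigger PSU loafing along:   ~88 W of heat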


I don't think there's any real reason to buy an 800 W PSU for a 300 W system. Even overclocking a CPU only adds a few tens of watts if you're doing a normal, non-heroic OC. But I do think that if someone is going to spend 650 dollars on a graphics card, they shouldn't complain about buying a PSU that's over $100.

Klyith
Aug 3, 2007

GBS Pledge Week

Shaocaholica posted:

As long as you're not VRAM bound, a Titan or 780 should smoke a Quadro K5000 at GPGPU tasks given the number of cores on the latter right? No real GPGPU special sauce for the Quadro K5000?
The 780 has double-precision FP operations neutered, while the Titan runs them at full speed.

Zotix posted:

Well my question is, does that seem like it's an issue with the card directly? Like if I purchase a 780 GTX, and lets say it's perfect out of the box, will I likely have high temps similar to the 580, or should they be normal?
Man, I want your life. Got a pretty great video card that's clogged with dust? gently caress it, throw that poo poo away and drop 650 bones on a new one.

Checking your video card for dust is not some kind of rocket surgery. If you have one with a blower-style cooler, there will be a couple of obvious screws on the top side of the card that you can remove to pop off the plastic shroud. With the shroud off, you will probably see a gently caress-ton of dust blocking the airflow. Blowers are really susceptible to dust; in any case without filters they will eventually clog up. If it's a top-mounted fan (or dual fan), seeing the dust will be even easier, though you still may need to remove the shroud to really clean it well.

It's less likely to be a thermal paste issue; the card isn't very old and they slather that stuff on pretty well at the factory.

Klyith
Aug 3, 2007

GBS Pledge Week

Don Lapre posted:

770 is so god drat huge. Hopefully the 8 series has a new architecture and we can get a 870 in a smaller card.
Probably just the fact it's a reference card with a Titan cooler. I'd guess retail cards will be similar to 680s.



w00tazn posted:

For next gen, I can imagine the target framebuffer resolution going up, but not much more than that. I doub't we'll be seeing any huge leaps in fidelity anytime soon as we haven't really seen anything that really pushes the boundaries on PCs today and doing so would just mean increased costs for developers who already operate on razor thin margins.
Beyond development costs, the other thing is that unless the new consoles sell ridiculously well, there will still be games being made for cross-gen 360/xbone & PS3/4 all the way through fall 2014. They'll be better looking on the new ones (high res textures, draw distances bumped), but not a huge leap. By the time the standard baseline is the xbone & ps4, PC will be on the next entirely new GPU generation.

But if you think next-gen games aren't going to have improved graphics, you are high. There are ways to use that power that don't cost exponentially more money; there are also players ready to spend more money at the start of a new cycle to establish a lead or new IP.

Klyith
Aug 3, 2007

GBS Pledge Week

Sidesaddle Cavalry posted:

I'm graphics-effects-dumb, how much of that can be taken up by the various methods of anti-aliasing? And where does memory bus size come into all of that? I've read a few comments on various 770 reviews somewhat lamenting missing out on the 384-bit bandwidth that came with GK110 cards like the 780 and Titan. Does the blistering 7 Ghz memory clock on the 770 somewhat make up for it?
Only super-sampling AA requires additional memory capacity in any significant way. Multi-sampling needs memory bandwidth, and the various post-process types (FX-, TX-, ML-, and SMAA) hit GPU shader performance hardest (also needing memory and bandwidth to varying extents, but less so).

Buying a >2gb card is mostly about the larger textures available in the big PC releases. Personally I'd say it's only a major requirement if you're buying an expensive card that you plan to keep for years. With a midrange 2gb card you might have to lower the game's texture setting a notch from ultra quality, meh. Not a huge deal, I'll keep going with cheaper cards more frequently myself. I'm sure by the time 2gb feels constricting I'll be ready for a new one.

Klyith
Aug 3, 2007

GBS Pledge Week

FetalDave posted:

That's not true. I've had issues with stuttery graphics from my 3870, to my 5870, and now to my 7970 (see my post above/youtube video). That's almost 5? years of different generations of AMD/ATI cards with the same issue.
If it still happens when you have vsync enabled, then the problem is by definition something other than the micro-stutter associated with Radeon cards. That means it's an actual drop in frame rate. And if the problem happens in all 3d applications, including ones that the card is overqualified for, then the issue could be in the video card or it could be elsewhere in the system. I'd say 50/50 odds you could put in a 780 and still have it.


---
AMD drivers are still second-best compared to nvidia's, but as a long-time nvidia user who just switched to AMD, I'll say they're not a reason to avoid all AMD cards these days. If you have two competing choices that are tied on price, performance, and pack-ins, the drivers are certainly a good tie-breaker for nvidia. And if you don't give a poo poo about paying more, you could just go with nvidia all the time.

The microstutter thing, for example, they've made good strides toward fixing. Nvidia didn't have the problem because they'd already done work on it by the time it became general knowledge. If TR had started their new method of benchmarking frame delay instead of average FPS a year earlier, both would have shown the stutters.

Finally, nvidia isn't perfect either. They haven't had a major fuckup in a while, but when they have, it's often been the hardware, which IMHO is worse than bad drivers. Software can be fixed, but things like poo poo analog output quality (back when people still had VGA monitors) or a bad heatsink standard resulting in cards that cooked themselves after 18 months are forever. Those examples are from a long time ago, but at the time there was plenty of :argh: about them.

Klyith
Aug 3, 2007

GBS Pledge Week

FetalDave posted:

This. Microstutter is only something that occurs with a dual GPU setup. I have a single GPU setup and it still happens.

I might be getting closer to figuring it out though. In my thread in tech support someone sugguested running FrafsBenchViewer while the stuttering is happening, and there is almost exactly 1 second between the stutters. Maybe it's something in the drivers polling the card for info every second?
I replied in your thread, but for the benefit of this one and the discussion of whether AMD's drivers are good or bad:

quote:

Vsync actually makes it worse. On top of the jitteryness, there's now a distortion bar that runs horizontal and moves from the top of the screen to the bottom every 5 seconds.
AMD does not write drivers that badly.

They may not be as good as nvidia's, but I'm pretty sure they can catch a bug that obvious. I feel quite bad for them that in situations where poo poo is broken and their card happens to be present, people will just say "their drivers suck, get an nvidia card".

Klyith
Aug 3, 2007

GBS Pledge Week

Daysvala posted:

Yeah I have one of those reference blowers that exhaust out the back. My case seems to have sufficient airflow, but I can't check to see if the blower is clogged because one of the screws that secures the blower to the card came improperly manufactured and I can't get a screwdriver head to catch.
You should be able to take off the shroud, the plastic/metal thing that goes over the heatsink and directs airflow from the fan, without removing the entire heatsink from the card. The screws that hold on the shroud are small ones flush on the sides of the heatsink. Once you take it off it will look something like this.

Blower style heatsinks are very prone to getting clogged by dust if you don't have a well-filtered case (or a very clean house).

Klyith
Aug 3, 2007

GBS Pledge Week

Jan posted:

On a different subject, does anyone know of any issues with AMD, DisplayPort and turning off monitors? I sometimes get issues where if I turn off my DisplayPort connected monitor and leave it off for a fair amount of time but leave the computer running. When I come back and turn the monitor back on, the monitor doesn't detect any signal. Sometimes I can get it back by cycling the monitor power a few times, but I've also had one or two cases where the drivers freak out, cause a reboot and then Windows is in VGA mode with the device manager saying the card was disable because it was causing issues.

Quick edit: just realised that the monitor was still using the generic PnP driver. Had it update to get the Dell U3011 driver, I suppose time will tell if makes a difference.
If the monitor driver doesn't help, it might be related to the ZeroCore power state. Like maybe it's not detecting when you turn the display back on because the IO is totally off? I'd suggest using only Windows power saving rather than turning the monitor completely off.

(If you want manual control over turning off your screen, there's a great little utility called nircmd that does a whole bunch of useful little things, including putting the monitor into power-save. It's easy to put a shortcut in Quick Launch or someplace to instantly sleep the monitor whenever you leave the PC.)

quote:

MSI 7850 TwinFrozrs
My previous video card was an (nvidia) MSI twinfrozr. Decent cooling and nicely quiet, but they use some cheapass fans. One of the fans was on the way out with grinding noises after just a year of use. I hacked in a replacement myself because I really didn't want to be without a computer for a week+ of RMA. Then just before I got my new card the second fan started grinding, so if I want to use the thing again I'm probably going to need to pull off both fans and replace them with slim 80s that can do PWM. Ugh.

I saw lots of forum posts about dead twinfrozr fans when I googled about it.

Klyith
Aug 3, 2007

GBS Pledge Week

z06ck posted:

I'm sorry you decided to fix/replace the fan on something that was still in warranty, cause it ain't now.
I'm not. The problem with warranty RMAs for a stupid little problem like that is that, between the cost to ship the part and the value of my computer being a doorstop for however long it takes to get a replacement, warranty service easily had negative worth compared to fixing it myself. I didn't have a spare PCIe card at the time; I had just given away an old card to a friend. And MSI seems to be near-impossible to get cross-shipping from.

Though if I had known then what I know now, I would have gone directly to a solution that ditched both stock fans. (Also, warranty violations, pshaw. If you're patient and meticulous, you can solve that just by being better at reassembling the thing than the guy who put it together in the first place.)

Klyith
Aug 3, 2007

GBS Pledge Week

The Lord Bude posted:

You cannot game adequately at 1080p with a $100 gpu, so building a gaming pc with one in it is a pointless waste of money,
"You can't game with a 7770" -- except not everyone demands Ultra Super Quality and a minimum 60fps to play a loving computer game. And not everyone can spend $500 a year on videocards like some of the people in this thread. Have a sense of perspective.


edit VVVV
Only if the sole purpose of his computer is playing games. Given how he's speccing the computer, the dude probably isn't playing games all the time.

Klyith fucked around with this message at 15:11 on Jul 18, 2013

Klyith
Aug 3, 2007

GBS Pledge Week

quote:

Oculus in 1080p / 1440p / 4k
One offsetting factor is that if you've ever seen an Oculus in action, you'll notice it doesn't actually use all the pixels of its screen. The game world gets rendered as two fisheyed squares, and ~15% of the panel doesn't get used.


The other thing is that rendering at native resolution is not critical for the Oculus. It's nice if you have the horsepower to do 90Hz at very high rez, but the reason they're pushing for the highest-PPI displays they can source is to minimize the screendoor / subpixel effects. Because of the distortions applied to the image, it doesn't have an exact pixel-to-pixel output even when you're running native. So don't run off to buy a super expensive GPU just because you think the Oculus needs 4K-type horsepower. (The Oculus is gonna be expensive enough on its own.)

Klyith
Aug 3, 2007

GBS Pledge Week

jm20 posted:

How much more will AIBs charge for actual heatsinks on an RX480?

Not much. Even though 150w is disappointing, it's not crazy by videocard standards; any of the completely ordinary 3- or 4-heatpipe coolers that the OEMs make by the thousands will work. And I can't think of any time when the first-wave cards with crappy blower coolers were cheaper -- this isn't a new thing. I think it's an artifact of the extra step in the supply chain (the chipmaker handing whole cards to OEMs rather than OEMs doing their own assembly).

Only get one of these AMD blower units if you're planning to replace your heatsink with watercooling or something.

xthetenth posted:

CPUs did. Now there's just the canonical one big core architecture, and Intel iterates on it very slowly.

Which is a good lesson for why being a "fan" of any of these companies is punch-yourself-in-the-dick stupid. The only thing that keeps them from giving you the shaft is a competitor.

I'm already kinda annoyed at nvidia for leveling the performance/price curve down to a straight line for their last couple generations, which they had the leeway to do based on AMD's weak results.

Klyith
Aug 3, 2007

GBS Pledge Week

NewFatMike posted:

I'm definitely interested in DX 12 performance in the future. Hopefully it's not just because of rebrands that AMD cards have aged rather gracefully.
GCN has been pretty amazing at that; it was a solid and forward-looking architecture to begin with and the fact that both consoles use it likely helps.

DX12 and Vulkan, if they get adopted by PC games, could be an ace for them; it's certainly a place where they make up a lot of ground vs nvidia right now. But that very well could fade -- it's quite possible nvidia just hasn't put the same amount of work into DX12 yet because there are hardly any games that use it. I wouldn't buy a 480 just on that expectation if the 1060 ends up being better in current games (relative to $).


xthetenth posted:

The reason everyone's super mad is because it's using a ton of watts to do it which means that all the people who put too much money into their computer (most of the thread) are really worried that they won't be able to fit much performance into 300W for a top end card.

I'm kinda disappointed by the power use because I like a quiet computer, even when playing a game if I can get it, and was hoping the process shrink would fix the biggest flaw in the AMD lineup for the last few years.

For Vega, they better hope to have either their process problems un-hosed or some miraculous performance magic pulled from HBM, because 300+ watt cards are a tough sell.

Klyith
Aug 3, 2007

GBS Pledge Week

Cream posted:

I'm on a 7950, the price of the 480 is looking pretty good. Would it be worth the upgrade?

Yes but
a) don't buy these first ones with the bad reference blowers
b) wait until the price & reviews of the 1060 come out, which is only a bit over a week away


Siets posted:

What is the difference between 144Hz and GSync?
Gsync/freesync = you can use vsync (no tearing) at arbitrary FPS with no jitter (most noticeable in things like character animation when using vsync on a standard monitor and bouncing between 60Hz and 30Hz update rates)

quote:

I know that's the refresh rate, but can GSync also work at those refresh speeds?
Yes, though the benefits of *sync decrease on high-refresh monitors because the natural divisions between possible frame times are smaller.

tl;dr = even without *sync, 50fps on a 144Hz monitor will look slightly better than 50fps on 60Hz
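
A quick way to see the "natural divisions" point: with plain vsync on a fixed-refresh monitor, a finished frame has to wait for the next vblank, so its display time rounds up to a multiple of the refresh interval. Pure arithmetic, nothing vendor-specific:

code:

# With plain vsync, a finished frame waits for the next vblank, so its
# on-screen time rounds up to a multiple of the refresh interval.
import math

def displayed_frame_time(render_ms, refresh_hz):
    interval = 1000.0 / refresh_hz
    return math.ceil(render_ms / interval) * interval

render_ms = 20.0   # 50 fps worth of rendering work
print(displayed_frame_time(render_ms, 60))    # 33.3 ms -- frame sits there for two refreshes
print(displayed_frame_time(render_ms, 144))   # 20.8 ms -- much closer to the real 20 ms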

quote:

I thought the eye couldn't really tell a difference beyond 60Hz or so?
Lots of people with 144Hz monitors say there's a big difference.

The overall "framerate" of our vision seems to be 45-90Hz. But eyes are not computer devices; they perceive certain things very quickly, yet detailed vision likely takes multiple passes through the brain. Biological systems are non-linear.

Klyith
Aug 3, 2007

GBS Pledge Week

Spatial posted:

Human vision doesn't have a framerate. It's a continuous analog system. :negative:
a) I was trying to be concise rather than :goonsay:

b) It's definitely not continuous. Flicker in rapidly rotating objects, persistence of vision, and a number of optical illusions work because there are discrete events. Nothing with nerve cells is continuous; neurons have a "tick rate" just like transistors, but different types run at different speeds.

But it's not a CCD camera either. For one thing, the optic nerve doesn't have enough bandwidth to work like a CCD camera. A few years ago I read a really neat article about scientists trying to extract images from the optic nerve, and trying to decipher the biological data compression -- the pictures they were able to reconstruct were mostly just outlines.

  • Reply