sethsez
Jul 14, 2006

He's soooo dreamy...

Gunjin posted:

The only cards that do GPU acceleration with OpenCL instead of CUDA in CS6 are the HD 6750M and HD 6770M (1GB VRAM versions), and then only on OS X 10.7.x. Everyone else still needs to use a CUDA card.


Note that there is a way to "hack" an unsupported card into Adobe CS, so if you have a Mac Pro with a 5770 or 5870 on OSX 10.7.x you could theoretically add support for one of those cards, but I'd be wary of doing that in a production environment where stability is a primary concern.
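
For the curious, the "hack" usually amounts to adding your card's name to a whitelist text file Adobe ships with the app (CUDA cards use cuda_supported_cards.txt; there's a similar OpenCL list). Here's a rough sketch of the idea; the path and card string below are assumptions about a Mac CS6 install, not gospel, so check your own install and back the file up first:

[code]
# Sketch of the Adobe CS whitelist edit. Path, filename, and card name are
# assumptions for illustration; verify them against your own install.
from pathlib import Path

WHITELIST = Path(
    "/Applications/Adobe Premiere Pro CS6/"
    "Adobe Premiere Pro CS6.app/Contents/cuda_supported_cards.txt"
)
CARD_NAME = "GeForce GTX 570"  # must match the name the driver/OS reports

def add_card(whitelist: Path, card: str) -> None:
    """Append the card to Adobe's whitelist if it isn't already listed."""
    cards = whitelist.read_text().splitlines() if whitelist.exists() else []
    if card not in cards:
        cards.append(card)
        whitelist.write_text("\n".join(cards) + "\n")

if __name__ == "__main__":
    add_card(WHITELIST, CARD_NAME)
[/code]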

This is pretty much the only reason why I have a 570. It's not the best value card for gaming (though it still holds up quite well for 1080/1200p), but it's one of the best values for officially-supported CUDA in Adobe CS, especially with the 600 series CUDA being where it is right now.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
AnandTech has a cool article up about using Thunderbolt and Virtu i-Mode together on a Wintel platform. Windows Thunderbolt isn't fully mature just yet (missing hot-add after boot, though it has hot-remove), but basically poo poo just works.

nmfree
Aug 15, 2001

The Greater Goon: Breaking Hearts and Chains since 2006
Would it be possible to get a link to our own Folding@Home thread in either the OP or the GPU Computing post (or both)? (If you haven't already planned on doing it, I know this is a work in progress :shobon: )

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to a pair of really nice gaudy shoes?


Is there such a thing as a quiet video card? I don't know if it's just me, but I find them obscenely loud.

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010

Tab8715 posted:

Is there such a thing as a quiet video card? I don't know if it's just me, but I find them obscenely loud.

Sure. Many vendors have a mid-to-low range card (that's the catch) that's cooled passively.

incoherent fucked around with this message at 03:35 on May 12, 2012

Powdered Toast Man
Jan 25, 2005

TOAST-A-RIFIC!!!
Awesome thread. You'll never convince me that there is anything better than this:

http://en.wikipedia.org/wiki/File:Quantum3D_Obsidian_X24_SLI_PCI.png

I held a mint example of this card in my hands once. I was working for a small IT shop and my boss had it in the box on a shelf. I should have asked for it but never had the guts to do so. It's kind of funny to think now about how much that thing cost in relation to current high end graphics cards and how powerful they are now. How far we have come.

Edit:

Seriously just check out these bitchin' benchmarks people: http://www.tomshardware.com/reviews/voodoo,85-19.html

WHAT MORE COULD YOU WANT

Powdered Toast Man fucked around with this message at 05:05 on May 12, 2012

hobbesmaster
Jan 28, 2008

Factory Factory posted:

AnandTech has a cool article up about using Thunderbolt and Virtu i-Mode together on a Wintel platform. Windows Thunderbolt isn't fully mature just yet (missing hot-add after boot, though it has hot-remove), but basically poo poo just works.

I doubt you'll see hot-swappable Thunderbolt much. eSATA isn't hot-swappable either, and desktop PCs can take the removal of hard drives better than the removal of PCIe cards.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
It'd be cool if you could get an external GPU for an ultrabook or something so you could game at home. It'd be well worth the premium, or at least I'd pay it :shobon:

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

hobbesmaster posted:

I doubt you'll see hot-swappable Thunderbolt much. eSATA isn't hot-swappable either, and desktop PCs can take the removal of hard drives better than the removal of PCIe cards.

It says right in the article that Intel is requiring hot-swap as part of the certification process, and it's just a matter of drivers. And eSATA is indeed hot-swappable, with the right drivers. You can also fudge it by forcing a "Scan for hardware changes" in Device Manager.

straw man
Jan 5, 2011

"You're a bigger liar than I am."
Has anybody explored ways to turn off the GPU Boost in the 600 series cards, or the basic EVGA 670 in particular? As convinced as nVidia seems to be of the effectiveness of their algorithm, I'm watching my framerates drop into the 20s (in Minecraft) because the card doesn't think it needs to be running at its rated clock speed.

Fanelien
Nov 23, 2003

straw man posted:

Has anybody explored ways to turn off the GPU Boost in the 600 series cards, or the basic EVGA 670 in particular? As convinced as nVidia seems to be of the effectiveness of their algorithm, I'm watching my framerates drop into the 20s (in Minecraft) because the card doesn't think it needs to be running at its rated clock speed.

As I understand it, it's like Turbo Boost, but I've not heard of a way to turn it off. Maybe use EVGA Precision to set a fixed voltage so it can't downvolt and lower the clock speed?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
There isn't a way to turn it off, currently.

straw man
Jan 5, 2011

"You're a bigger liar than I am."

Fanelien posted:

As I understand it, it's like Turbo Boost, but I've not heard of a way to turn it off. Maybe use EVGA Precision to set a fixed voltage so it can't downvolt and lower the clock speed?

Tried it... the card ignores my voltage setting when it decides to switch to another performance step.

Factory Factory posted:

There isn't a way to turn it off, currently.

Thanks, good to know. I'm going to talk to EVGA support about it anyway, in hopes that they connect me with someone in engineering who knows how. If that doesn't work, I'll probably try the same thing with Nvidia.

straw man
Jan 5, 2011

"You're a bigger liar than I am."
Oh, I partially answered my own question, so I'll post it here for public edification. If you use NVIDIA Inspector (http://www.techspot.com/downloads/5077-nvidia-inspector.html) to open the driver settings (the wrench-and-screwdriver icon next to the driver version), there's a "Power management mode" setting in the Common section. Pick "Prefer maximum performance" and it will at least stop dropping below the default clock speed (915 MHz on the non-superclocked 670).

It's still not really respecting my overclock settings, but it seems happy to go well above 1100 MHz when the algorithm decides it's time to boost the speed.

(edit: it's in the nvidia control panel too... i guess that bit got erased when i updated the driver. yeah, that vvv)
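
If you want to see what the boost algorithm is actually doing rather than trusting an OSD, a quick clock logger helps. A minimal sketch, assuming nvidia-smi came with your driver and supports these query fields (older drivers may not):

[code]
# Log graphics/SM clocks, GPU load, and power draw once a second while a
# game runs. Assumes nvidia-smi is on PATH and supports --query-gpu.
import subprocess
import time

FIELDS = "clocks.gr,clocks.sm,utilization.gpu,power.draw"

def sample() -> str:
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        text=True,
    )
    return out.strip()

if __name__ == "__main__":
    for _ in range(60):  # one minute of samples
        print(sample())
        time.sleep(1)
[/code]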

straw man fucked around with this message at 09:54 on May 12, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

That's been a trick for some time, since they introduced power-saving states and some games don't play nice with 'em. Glad it works with the modern cards too. In the 500 series and before, you don't need NVIDIA Inspector; it's just in the regular boring control panel as a power management setting.

Nintendo Kid
Aug 4, 2011

by Smythe

hobbesmaster posted:

I doubt you'll see hot-swappable Thunderbolt much. eSATA isn't hot-swappable either, and desktop PCs can take the removal of hard drives better than the removal of PCIe cards.

I've been hotswapping eSATA hard drives on my laptops for about 2 years now? If you're worried about messing stuff up you can make sure to click the thing on your taskbar to "eject" them first.

Longinus00
Dec 29, 2005
Ur-Quan

Tab8715 posted:

Is there such a thing as a quiet video card? I don't know if it's just me, but I find them obscenely loud.

There are passively cooled video cards (cards with only a big honking heat sink and no fans), but these tend to be mostly lower-power cards. For higher-end cards there are models with dual or triple fans, which lower the noise a lot because they can run the individual fans at much lower RPM and still move the same amount of air.
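
As a back-of-envelope illustration of why that works: airflow scales roughly linearly with fan speed, while noise climbs much faster (a common fan-law rule of thumb is about 50*log10 of the speed ratio, in dB). The numbers below are only illustrative, not measurements of any particular cooler:

[code]
# Rough fan-law arithmetic: two fans at half the RPM move about the same
# air as one fast fan, but each is ~15 dB quieter, and doubling the number
# of noise sources only adds ~3 dB back. Rule-of-thumb figures only.
import math

def noise_delta_db(rpm_new: float, rpm_old: float) -> float:
    """Approximate change in fan sound level for a speed change."""
    return 50 * math.log10(rpm_new / rpm_old)

per_fan = noise_delta_db(1000, 2000)    # about -15 dB per fan
two_sources = 10 * math.log10(2)        # about +3 dB for two fans
print(f"net change vs. one fast fan: {per_fan + two_sources:.1f} dB")
[/code]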

spanko
Apr 7, 2004
winnar
I feel like I can contribute a bit to this thread given my recent experience. I upgraded from a 24" 1920x1200 (ZR24W) monitor to a Korean 27" 2560x1440 monitor with the following specs:

i5 2500k @ 4.2
8GB ddr3 1600
SSD
GTX570

At 1920x1200 with the GTX570 everything was great; I played the following at completely maxed-out settings, except where noted. I generally turn motion blur off in all games, and bloom depending on the game, because the implementation is terrible.

Witcher 2 (ubersampling, motion blur off)
BF3 (motion blur and bloom off, 4x AA)
WoW
SWToR
Skyrim
Tera

After upgrading to the 27", BF3, Witcher 2, and SWToR were no longer playable at 2560x1440 with the same settings, and I had to turn some stuff down: AA to 2x or off, ambient occlusion off, bloom and depth of field off in SWToR, and anisotropic filtering off in BF3. Even with those settings there would be occasional unplayable drops in FPS in those three games. The higher resolution of the 27" monitor made a huge difference in performance, and my 570 ran significantly hotter and louder as well.

So I bought a GTX680 for $500, and I can now run all those games at 2560x1440 at completely maxed settings, and they run better than they did on the 570 at 1920x1200. All that being said, I wouldn't recommend people buy a GTX680 now that 670s are out. It really doesn't look like it's worth the extra $100, and I would have waited and got the 670 instead if I'd known performance was going to be as close as it is.

So basically, anyone considering those 27" Koreans on eBay, keep this in mind. The monitors are fantastic and really cheap for what you get, but don't expect to play games at high settings on anything less than the top tier of GPUs.
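
(For anyone wondering how big that jump really is, the pixel arithmetic is below; just a sketch of the numbers, frame rates will obviously vary by game.)

[code]
# 1920x1200 -> 2560x1440 is roughly 60% more pixels pushed every frame,
# which lines up with the performance drop described above.
old_pixels = 1920 * 1200   # 2,304,000
new_pixels = 2560 * 1440   # 3,686,400
print(f"{new_pixels / old_pixels:.2f}x the pixels per frame")  # ~1.60x
[/code]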

Longinus00
Dec 29, 2005
Ur-Quan

spanko posted:

I feel like I can contribute a bit to this thread given my recent experience. I upgraded from a 24" 1920x1200 (ZR24W) monitor to a Korean 27" 2560x1440 monitor with the following specs:

...

So basically, anyone considering those 27" Koreans on eBay, keep this in mind. The monitors are fantastic and really cheap for what you get, but don't expect to play games at high settings on anything less than the top tier of GPUs.

Why do you keep mentioning the nationality of the monitor? Why would that make any difference? Everything else that you've said is already covered in the OP.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Longinus00 posted:

Why do you keep mentioning the nationality of the monitor? Why would that make any difference? Everything else that you've said is already covered in the OP.

"Korean monitors" is a catchall used in the monitor megathread for imported 27" IPS screens, factory seconds that didn't make it into 27" iMacs. There are at least four brands being imported that have not-very-memorable names. Except for Catleap. Hence they're being referred to by nationality.

movax
Aug 30, 2008

Verizian posted:

Can we talk about non-GPU bits of a video card here too, because I've got a few questions?

  • Why do drivers suck so much?

Mainly an AMD thing from my experience with OpenGL loving up for idTech5 and the Windows 8 CP drivers requiring manual registry editing.
Also heard some horror stories about nVidia drivers too but no first hand knowledge.

It all comes down to registers in the end. Software needs to program registers with certain values, and the hardware executes tasks based on those values. As I wrote in the section on DirectX in the OP, these APIs were created so software could be written against "nicer" APIs instead of bare metal. However, it's up to the hundreds of driver engineers at both companies to create software that translates DirectX/OpenGL API calls into hardware commands the card executes.

Intel released documentation for their HD Graphics hardware. You could pick out, say, Volume 1 Part 3, look at section 1.1.11.2, and see a register called "Observation Architecture Status Register", and in that 32-bit register you've got various bitfields that can be programmed and/or read that affect the render pipeline. (I don't remember exactly what that register does, please don't kill me.)

Basically, you've got a metric shitton of these registers, and a pile of humans to generate software to control them. Also, the software that the first pile of humans creates is used by many other piles of humans to let us murder hookers in Liberty City or kill browns in BF3. :black101:

If you ask me, the "drivers sucking" can probably be traced to management, deadlines, soul-crushing software management practices, etc. I'm not a software engineer by profession, but I think you can see the pain they suffer in Cavern of COBOL if you want.
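
To make the "registers and bitfields" idea concrete, here's a toy sketch. The offsets and field layouts are invented for illustration, not taken from any real Intel or Nvidia document; an actual driver pokes these fields through memory-mapped I/O using the layouts in the hardware spec:

[code]
# Toy bitfield helpers for a pretend 32-bit GPU register. The register value
# and field positions are made up; real layouts come from the hardware docs.
def get_bits(reg: int, lo: int, hi: int) -> int:
    """Extract bits lo..hi (inclusive) from a 32-bit register value."""
    width = hi - lo + 1
    return (reg >> lo) & ((1 << width) - 1)

def set_bits(reg: int, lo: int, hi: int, value: int) -> int:
    """Return reg with bits lo..hi replaced by value."""
    width = hi - lo + 1
    mask = ((1 << width) - 1) << lo
    return (reg & ~mask) | ((value << lo) & mask)

status = 0x00000005                     # pretend readback of a status register
enabled = get_bits(status, 0, 0)        # bit 0: an "enable" flag
status = set_bits(status, 4, 6, 0b011)  # bits 4-6: a 3-bit "mode" field
print(enabled, hex(status))             # 1 0x35
[/code]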

Factory Factory posted:

AnandTech has a cool article up about using Thunderbolt and Virtu i-Mode together on a Wintel platform. Windows Thunderbolt isn't fully mature just yet (missing hot-add after boot, though it has hot-remove), but basically poo poo just works.

This is awesome, but guess who has a P67 :downs:

nmfree posted:

Would it be possible to get a link to our own Folding@Home thread in either the OP or the GPU Computing post (or both)? (If you haven't already planned on doing it, I know this is a work in progress :shobon: )

Should have figured we had a Folding team, I will definitely add that!

Tab8715 posted:

Is there such a thing as a quiet video card? I don't know if it's just me, but I find them obscenely loud.

Like incoherent said, you can totally get passively cooled cards. I had a passively cooled 8600 from Asus in my parents' machine for a while, I think. I use a passive GT210 (IIRC) from MSI to drive my secondary displays because I can't use my iGPU with my P67 chipset.

Powdered Toast Man posted:

Seriously just check out these bitchin' benchmarks people: http://www.tomshardware.com/reviews/voodoo,85-19.html

WHAT MORE COULD YOU WANT

3dfx :patriot:

hobbesmaster posted:

I doubt you'll see hot-swappable Thunderbolt much. eSATA isn't hot-swappable either, and desktop PCs can take the removal of hard drives better than the removal of PCIe cards.

I effortposted about PCIe hotplug in the Intel thread, here. SATA hotplug is a different (easier) game though; that's almost entirely on the OS driver side, and not too much is needed from the BIOS.

e:

Top Quark posted:

Yay my little graph project made it in! I still have plans to work on this but I've been crazy busy recently (moving to a new country and all).

A note: I recently broke the scraping script thingy so the prices are a bit outdated. Also, I'm moving to PassMark scores rather than average FPS as it's too hard to normalise the benchmarks across cards. I will also be adding support for TechReport's 99th-percentile benchmarks.

Also I looked into converting the thing to a dynamic PNG so you can link it here but holy god it looks complicated and I'm not THAT good a coder. I'll keep seeing if I can throw something together though.

Ah, forgot it was you! Added credit to you in the OP as well! A dynamic PNG would be cool, to draw attention to that link, but I'll try to come up with some nice graphical separators/banners/etc. for the OP.

movax fucked around with this message at 23:49 on May 12, 2012

Boten Anna
Feb 22, 2010

I got a GTX 670 on launch day, which is complete overkill with my current monitor setup (a 1920x1080 primary and a secondary 1600x900 one), but it's quite wonderful so far. Especially since I don't even have any of the high-end games to really test it with, but god drat do the ones I have run well, and I'm getting a feeling of "woah, this is how these games were intended to run this whole time!" so far, overall. 3DMark 11 runs great though! :v:

I bought it mainly because I really like good, smooth performance and would rather kill an ant with an elephant gun than try to fiddle with everything to get the right "playable" range. I also would rather just plunk down extra money than get something last-gen; I see it as getting more value out of a $400 launch-day card than out of a cheaper last-gen one with fancy spoilers attached and pinstripes painted on (you know, the things that make it go faster :iiaca:) to push it harder when it's already headed to the graveyard. I may just be ridiculous, though, as at the end of the day I'm probably just trying to justify futureproofing. :v: Really, I just had the chance to get this and Ivy Bridge on launch day and wanted something up to the minute for once :shobon:

That said, I am probably going to get a Korean monitor soon so that I can actually make full use of this thing. And here's to hoping this isn't like the ill-fated 560 Ti that bluescreened on loading graphics drivers on two different motherboards, which I just said "fuckit" to, sent back, and ended up replacing with something more expensive!

And god drat, movax, I spent like an hour reading the OP and I didn't even actually physically read all of it. So much information, and so comprehensive. Good work, and thank you!

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

movax posted:

This is awesome, but guess who has a P67 :downs:

:smith::hf::smith:

--

Re: quiet GPUs, the biggest offender is definitely blower-style fans. Blowers are very utilitarian and robust - they work even if you cram cards right next to each other in a case with poor ventilation.



Open-air coolers (i.e. without the tight-fitting shroud) generally provide lower temperatures and much lower noise characteristics because the fans can be larger, and the heatsinks can have more surface area. However, this comes at the cost of requiring better case ventilation, since air no longer moves over the card unidirectionally. Where a blower sucks air in from the front of the case and spews it out the back, an open-air cooler sprays hot air everywhere locally (though some gets ducted out the back slots).

The quietest, best-of-the-best coolers are triple-slot dealies: open-air designs which take up three expansion slots on the motherboard. These have a ton of room for heatsinks and high-end fans. Many high-end boards will space their CF/SLI-capable slots far enough apart to accept these cards.

One particularly quiet card that came up recently in the system building thread is the Asus GeForce 670 DirectCU II TOP. That fucker has a load fan noise of 25 dBA. It's the quietest high-performance video card I've ever seen, and it's quieter than any current blower-based reference design until you get down to the level of a Radeon 5670. TechPowerUp reviews.

Factory Factory fucked around with this message at 00:50 on May 13, 2012

Deathreaper
Mar 27, 2010
Great thread movax!

Anyways, ever since you convinced me to get a Dell U3011 I've had to run dual-GPU configurations for gaming. I'm replacing my CrossFire 6970s with a GTX 690. Will post a few pics when it arrives.

Dotcom656
Apr 7, 2007
I WILL TAKE BETTER PICTURES OF MY DRAWINGS BEFORE POSTING THEM
I just blew my wad on a Catleap 2560x1440 and an MSI GTX 670. Here's to the incessant waiting over the next week!

hobbesmaster
Jan 28, 2008

movax posted:

I effortposted about PCIe hotplug in the Intel thread, here. SATA hotplug is a different (easier) game though, that's almost entirely on the OS drivers-side. Not too much needed from BIOS.

I saw that in the other thread. Kind of kills my hopes for controlling 10k fps cameras from an MBA, but I'll still try it...

I must just have terrible luck with eSATA drivers... last time I tried hotswapping I just got a BSOD.

movax
Aug 30, 2008

hobbesmaster posted:

I saw that in the other thread. Kind of kills my hopes for controlling 10k fps cameras from an MBA, but I'll still try it...

I must just have terrible luck with eSATA drivers... last time I tried hotswapping I just got a BSOD.

Ah, it may totally be possible...didn't intend for the post to say that, just to kind of elaborate on why it's kind of tricky to get that functionality working. I definitely expect some growing pains, unless Intel did some weird stuff with regards to how downstream PCIe devices appear to software.

eSATA on my mobo is through a JMicron controller; I installed the JMicron drivers and hot-plugging has been working well since then (amazingly).

Dotcom656 posted:

I just blew my wad on a catleap 2560x1440 and a MSI GTX 670. Here's to the incessant waiting of the next week!

Not sure what you're coming from, but 2560x1440 gaming is going to look balls-out awesome, especially with that 670. 2GB VRAM?

Deathreaper posted:

Great thread movax!

Anyways ever since you convinced me to get a Dell U3011 I'v had to run dual GPU configurations for gaming. I'm replacing my crossfire 6970's for a GTX 690. Will post a few pics when it arrives.

Thanks! Does your case happen to limit you to a single-card solution, perchance? Curious as to why you picked the 690 over 2x 680. (Was the 690 actually in stock, unlike the 680? :v:)

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug
Actually, I've never hot-swapped anything on my eSATA port, but I have an internal hot-swap bay in a 5.25" slot, which is totally awesome. The only problem I have is that I typically have to rescan disks in Disk Management after putting something in.

Dotcom656
Apr 7, 2007
I WILL TAKE BETTER PICTURES OF MY DRAWINGS BEFORE POSTING THEM

movax posted:

Not sure what you're coming from, but 2560x1440 gaming is going to look balls-out awesome, especially with that 670. 2GB VRAM?

Coming from a 23" 1920x1080 display and a 6870 1GB.
And the 670 I bought is a 2GB card.

Getting too excited to even sleep tonight.

Nintendo Kid
Aug 4, 2011

by Smythe

hobbesmaster posted:

I must just have terrible luck with eSATA drivers... last time I tried hotswapping I just got a BSOD.

For what it's worth, I've been doing it on my two Dell laptops, which both have eSATAp ports, using an eSATAp enclosure for the drive. They're a Dell Studio 15 and a Dell XPS 15. I DO always go to the little "remove hardware/eject" icon on the taskbar first, though.

Josh Lyman
May 24, 2009


My gaming consists of Starcraft 2 and Diablo 3. Should I just get the GTX 460 for $140 now or wait for prices to fall as Kepler continues to roll out?

Josh Lyman fucked around with this message at 23:08 on May 13, 2012

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Josh Lyman posted:

My gaming consists of Starcraft 2 and Diablo 3. Should I just get the GTX 460 for $140 now or wait for prices to fall as Kepler continues to roll out?

You'd have to be insane to buy a GTX 460 for $140. Head over to the parts picking megathread.

hobbesmaster
Jan 28, 2008

Josh Lyman posted:

My gaming consists of Starcraft 2 and Diablo 3. Should I just get the GTX 460 for $140 now or wait for prices to fall as Kepler continues to roll out?

I bought a GTX460 over a year ago for that price!

Looking on Newegg you can pay $280 for one if you want! What a rip off.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Alereon posted:

You'd have to be insane to buy a GTX 460 for $140. Head over to the parts picking megathread.

Since it recommends a 6850 in that price range, and if $140 is the budget..

Here is a 6850 with a hefty factory overclock that's $140 after a rebate. Too late, now it's $150
http://www.newegg.com/Product/Product.aspx?Item=N82E16814161363

vvvv If the card has 1GB, I personally think you should not waste time on SLI for that resolution; use your money on a card with 2GB+.

HalloKitty fucked around with this message at 08:54 on May 14, 2012

Animal
Apr 8, 2003

I have one of the Korean 1440p monitors, and an MSI 560 Ti-448 Twin Frozr III. It's doing an admirable job driving this resolution while overclocked, but it feels the strain in certain games.

What would be best: sell it while I can still get a good price for it and buy a 670, or wait till their price drops, get another one, and SLI them?

Mayne
Mar 22, 2008

To crooked eyes truth may wear a wry face.
Get a GTX 670. The 560 Ti will be RAM-limited at those resolutions; SLI wouldn't help much.

Mayne fucked around with this message at 23:55 on May 13, 2012

movax
Aug 30, 2008

Blast from the past: an Nvidia Riva 128 die (click for huge):


The NV3 Riva 128 is one of the earliest (1997) consumer-level GPUs with integrated 3D acceleration. Competitors like the 3dfx Voodoo2 still needed a separate video card to handle 2D rendering while its three chips performed 3D duties. The Riva 128 did everything on one IC, built on a 0.35 micron process. Yes, it's so old that process tech was still measured in microns (or micrometres). That's 350 nanometres; the smallest transistor on the NV3 is a little over 12x bigger than one on a 28nm GTX 670. The NV3 was a follow-up to the not-entirely-successful NV1, a PCI "multimedia accelerator card" that sported a sound card (playback only) and gameport in addition to a basic graphics core. The NV1's death was ensured after Microsoft announced a revolutionary new programming API known as DirectX.


Stunning 3D Graphics! Did I mention that at this point in time, the maximum 2D resolution was completely separate from the maximum 3D resolution? From Tom's Hardware Guide

As a product of the late 90s, the NV3 packed some bitchin' features: MPEG-2 acceleration for your DVDs, native support for up to AGP 2x, and some basic VIVO functionality (video-in, video-out). It had a clock speed of 100MHz and was usually accompanied by 4 to 8MB of SGRAM; marketing materials list it as having a pixel fillrate of around 100 megapixels/s. For comparison, the GTX 680 is listed as having a pixel fillrate of 32.2 gigapixels/s, a 322x increase. You could find the NV3 in both PCI and AGP variants, allowing add-in partners huge flexibility in their marketing and production strategies. Older machines (pre-440LX) could get a PCI Riva 128 card, but if you had a 440LX/BX or something else that supported AGP, you could get an AGP version of the card.


A Diamond-branded Riva 128 card

Video-memory-wise, the NV3 sported support for a new type of video memory, SGRAM. SGRAM built on standard SDRAM by adding features of great interest to graphics programmers, such as bit-masking and block-writes. These could be handled in hardware, improving performance over standard SDRAM. This was before the advent of any form of DDR technology in the consumer PC market, so data was transferred only on the rising edge of the 100MHz clock connecting the video memory to the NV3. The interface was, however, a very wide 128 bits, giving an Nvidia-spec'd 1.6GB/s of bandwidth.
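
Those figures check out on the back of an envelope; the snippet below just re-does the arithmetic from the quoted specs, it isn't new data:

[code]
# Back-of-envelope check: 128-bit single-data-rate bus at 100 MHz, plus the
# NV3-vs-GTX680 fillrate ratio quoted above.
bus_bits = 128
clock_hz = 100e6                            # SDR: one transfer per clock
bandwidth_gb_s = bus_bits / 8 * clock_hz / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")         # 1.6 GB/s

nv3_fill = 100e6                            # ~100 Mpixel/s
gtx680_fill = 32.2e9                        # 32.2 Gpixel/s
print(f"{gtx680_fill / nv3_fill:.0f}x")     # ~322x
[/code]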

It's fun to note that the only chip on that board in a BGA package is the GPU itself. Everything else is a (T)SOP or TQFP of some sort. The Diamond board is also missing a PLCC socket that could (probably?) hold an expansion ROM of some type, like Asus chose to include (below). The on-board interfaces between the various chips, like the 100MHz memory bus, were slow enough that the inductance of the QFP leads was not a major concern. As the pin counts and speeds of buses increased, BGA proved superior: it doesn't force connections solely to the edge of the package, and it gives a nice, low-impedance connection to the board through a tiny solder ball. Advances in process technology and manufacturing, coupled with process shrinks, also helped the move to BGA along.


Asus 3DexPlorer 3000 From Tom's Hardware Guide

Sources:
[1] Nvidia Riva 128 Product Page, July 6th, 1997
[2]Wikipedia :downs:

Bonus: non-NDA, complete Riva 128 Datasheet

Stumbled across the die pic and thought a write-up would be fun. I was a little young for the Voodoos; if someone who used them extensively wants to do a write-up on them, that would be pretty cool, I think. Pick a GPU family and write away!

odd2k
Jul 18, 2006
I DIDN'T CONTRIBUTE ANYTHING BUT A SHITTY POST SO I GOT THIS SHITTY CUSTOM TITLE!

Agreed posted:

Barring known incompatibilities like the 560Ti's weird issue with Battlefield 3 and black textures despite attempts from the devs and nVidia's driver team to fix them...

Is this still a thing, or did Nvidia fix this in the latest drivers? I was planning on getting a GTX 560 Ti, but this sounds like a real dealbreaker.

emdash
Oct 19, 2003

and?
Man I remember reading GPU reviews and 2D image quality being a really big deal. Amazing how much things have changed.

Tokin Ring
Jun 12, 2011

  :dong:Teh boners:dong:

Star War Sex Parrot posted:

Holy mother of god.

Agreed.
