slidebite
Nov 6, 2005

Good egg
:colbert:

Zero VGS posted:

I would exchange the card, and/or see if a driver update fixes it.
TBH, I am leaning that way


repiv
Aug 13, 2009

PCPer tested the Titan V in its natural habitat: pure compute



:hellyeah:

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Does that mean you can "SLI" two Titan XP and get more compute for the money, or does it not work that way?

repiv
Aug 13, 2009

Zero VGS posted:

Does that mean you can "SLI" two Titan XP and get more compute for the money, or does it not work that way?

It depends on the application, but in that specific case 2x Titan XP would be more economical, yeah. Raytracing and pathtracing parallelize very well over multiple cards, just ask this guy.
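Path tracing splits cleanly across GPUs because every tile of the image is independent. A minimal sketch of the idea, where render_tile is a hypothetical placeholder for whatever per-device kernel launch a renderer actually uses (not any real API):

code:
# Rough sketch of why path tracing parallelizes across cards: tiles are
# independent, so deal them out round-robin to each GPU and stitch the
# results. render_tile() is a hypothetical stand-in, not a real renderer API.
from concurrent.futures import ThreadPoolExecutor

def render_tile(gpu_id, tile):
    # placeholder: launch the path-tracing kernel for `tile` on device `gpu_id`
    return gpu_id, tile

def render_frame(tiles, num_gpus=2):
    with ThreadPoolExecutor(max_workers=num_gpus) as pool:
        futures = [pool.submit(render_tile, i % num_gpus, tile)
                   for i, tile in enumerate(tiles)]
        return [f.result() for f in futures]
With near-linear scaling, each card handles half the tiles, which is why two cheaper cards can beat one bigger card on cost per ray.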

Regardless, it shows that Volta has some serious IPC improvements under the hood; they're turning 20% more FLOPS into 80% more performance. Now the question is whether driver improvements can translate that over to games.

repiv fucked around with this message at 21:21 on Dec 15, 2017

Wistful of Dollars
Aug 25, 2009


Does it manage to beat an overclocked 1080?

SwissArmyDruid
Feb 14, 2014

by sebmojo

Wistful of Dollars posted:

Does it manage to beat an overclocked 1080?

Name/post combo overwhelming!

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.
I’m not married to Nvidia or anything; if AMD made a card that was objectively better per dollar, I’d switch to them.

That being said, why would anyone use AMD cards (unless they’re getting them used at a great price or some outlier case)? Their flagship from a couple months ago is barely better than Nvidia’s second best card at a price that’s higher.

I mean... I just don’t understand what their strategy is. There’s no reason I can think of, as someone who plays video games, that would make me think about switching. They’re not competitive on performance and they’re not competitive on price.

Do they understand how business works? I want them to do well so NVidia has competition and a compelling reason to not raise prices (lol) and AMD is just making GBS threads the bed.

redeyes
Sep 14, 2002

by Fluffdaddy
With buttminers, gaming doesn't matter that much. It's pretty sad.

GRINDCORE MEGGIDO
Feb 28, 1985


tehinternet posted:

I’m not married to Nvidia or anything; if AMD made a card that was objectively better per dollar, I’d switch to them.
I'd switch over if they matched, they don't even need to be better.

1gnoirents
Jun 28, 2014

hello :)

tehinternet posted:

I’m not married to Nvidia or anything; if AMD made a card that was objectively better per dollar, I’d switch to them.

That being said, why would anyone use AMD cards (unless they’re getting them used at a great price or some outlier case)? Their flagship from a couple months ago is barely better than Nvidia’s second best card at a price that’s higher.

I mean... I just don’t understand what their strategy is. There’s no reason I can think of, as someone who plays video games, that would make me think about switching. They’re not competitive on performance and they’re not competitive on price.

Do they understand how business works? I want them to do well so NVidia has competition and a compelling reason to not raise prices (lol) and AMD is just making GBS threads the bed.

did you see the shroud

To be more serious, I firmly believe that Polaris was a huge disappointment in terms of performance (for whatever multitude of reasons) and it just couldn't recover. The strategy was probably for it not to be stuck at $200 performance levels. I don't know the why of any of that, but the whole "$200 for the gamer" spin was super forced, and Vega suffered as a result. Then Raja left, of course; very little seems to have gone to plan for them.

SlayVus
Jul 10, 2009
Grimey Drawer

tehinternet posted:

I’m not married to Nvidia or anything; if AMD made a card that was objectively better per dollar, I’d switch to them.

I think they screwed themselves over with the HBM2. The Vega 56 should have been like a $300-$350 card at the performance it's at. It released at the same price as the 1070, but it's now the same price as the 1070 Ti and slower. If memory bandwidth was such an issue for the GPU, they should have just used a higher bus width.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

repiv posted:

Common sense says that the Titan XM should have severely depreciated by now, but they're still selling for almost 1080ti money: https://www.ebay.com/sch/i.html?_nkw=titan%20x%20maxwell&LH_Complete=1&LH_Sold=1

Either there's a use-case where the extra 1GB of VRAM makes a big difference or some people are just very stupid :iiam:

I sold my Titan XP a week after the 1080 Ti release for $900 and it sold instantly, people are just dumb :shrug:

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?




tehinternet posted:

I’m not married to Nvidia or anything; if AMD made a card that was objectively better per dollar, I’d switch to them.

That being said, why would anyone use AMD cards (unless they’re getting them used at a great price or some outlier case)? Their flagship from a couple months ago is barely better than Nvidia’s second best card at a price that’s higher.

I mean... I just don’t understand what their strategy is. There’s no reason I can think of, as someone who plays video games, that would make me think about switching. They’re not competitive on performance and they’re not competitive on price.

Do they understand how business works? I want them to do well so NVidia has competition and a compelling reason to not raise prices (lol) and AMD is just making GBS threads the bed.

Maybe if they bought a Freesync monitor and are continuing to live in denial.

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

VulgarandStupid posted:

Maybe if they bought a Freesync monitor and are continuing to live in denial.

Yeah, Freesync seems like the last real reason to justify buying AMD; it’s amazing how cheap those monitors are.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
I have a FreeSync 1080p75 monitor and I use a GTX 1080 :shrug: No reason to fix the situation either.

ZobarStyl
Oct 24, 2005

This isn't a war, it's a moider.

tehinternet posted:

Yeah, Freesync seems like the last real reason to justify buying AMD; it’s amazing how cheap those monitors are.
For me it wasn't just the cost, but also my aversion to proprietary tech like GSync. I can't in good conscience pay for closed solutions in my tech anymore, especially when nVidia's decision not to support an open VRR standard is based not on any technical obstacle but rather the knowledge that no one in their right mind would pay the GSync tax if they did. I'm quite happy with my Vega + Freesync setup, but the AMD group hateboner that is this thread will tell you otherwise.

onesixtwo
Apr 27, 2014

Don't you realize that being nice just makes you get hurt?
Thread, I'm happy to report my 1080 purchase (for gaming) was a good decision, and Witcher 3 on ultra everything is loving beautiful on a 21:9 monitor. Forget G-Sync/FreeSync; the real endgame is 21:9. So many pixels.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
It's because AMD kept designing towards raw compute power (and now Volta strips even that away), and that meant designs being massively bottlenecked by memory and ROPs. I mean, it should have been clear as day when Fiji could barely keep ahead of Hawaii that there were limitations they were going to hit, and quickly. Rather than slowing down core count growth and focusing on making those cores more capable, making sure they were fed and not idling, making sure they sucked up less power, they just kept pushing more design gimmicks, floundering as they searched for a way forward that could be compelling for investors and customers. "Innovations" that became meaningless as time went on, not more relevant, because the way Nvidia designed was simpler, more effective, and easier to program for. Now it's too late for AMD to ever come back because they don't have the money or talent anymore, and by the time they do it won't be profitable to re-enter the dGPU space. Maybe they won't stop this year, or the next (technically support for Lexa lasts till 2024), but soon they'll bow out and simply move to iGPUs, semi-custom, and CPUs.

Craptacular!
Jul 9, 2001

Fuck the DH

Generic Monk posted:

i don't think anyone would call a dominant position in the market, along with good marketing, a 'mind control field'. what i'm saying is not exactly controversial

There isn't a single review of Nvidia cards that doesn't compare them to AMD's cards at some point. That company wrote its own fate going back to its days as ATI. It's not because Nvidia has some PlayStation-like brand equity in gaming. It's because somebody heard something or read something or had a bad experience with something.

The Rage Pro that came in late-90s OEM PCs kept me from buying any of the Radeon cards until the 9700, because I doubted the company that shipped hardware that couldn't run GLQuake in 1999 could build a solid competitor. The state of their drivers in the late 2000s and people going "oh, install the Omega Drivers by some dude" sent me over to Fermi, and I stayed with Nvidia until the RX 470 was mine for about eight months. When I see that they don't even have adaptive VSync, which Nvidia has had since Kepler, but some guy who made the now-deprecated RadeonPro drivers was able to do it for their hardware, it makes me even more leery of buying something in the future.

I'll still keep them in mind, but drat they are working from far behind. Hobbyist coders were frequently making better drivers for their hardware than they can, and it only seems to have stopped because Microsoft has added a complicated signing system to put the brakes on hobby drivers.

Craptacular! fucked around with this message at 02:14 on Dec 16, 2017

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

ZobarStyl posted:

For me it wasn't just the cost, but also my aversion to proprietary tech like GSync. I can't in good conscience pay for closed solutions in my tech anymore, especially when nVidia's decision not to support an open VRR standard is based not on any technical obstacle but rather the knowledge that no one in their right mind would pay the GSync tax if they did. I'm quite happy with my Vega + Freesync setup, but the AMD group hateboner that is this thread will tell you otherwise.

On the other hand, the open standard would never have existed if NVIDIA hadn't gone ahead and invented VRR tech as we know it today. Turns out AMD had been sitting on the tech for years and didn't bother to push it until it became a competitive disadvantage. Same story as their GPGPU compute API and ecosystem. They are a reactive force in the marketplace whose defining move is to push a copycat standard and then have a cry when nobody implements it.

I would rather NVIDIA implemented FreeSync too (and I think they will sooner or later; it may come with Ampere), but it's hard to blame them for not racing out and implementing their competitor's standard for the thing they invented. Innovation costs money, and R&D needs to be paid for.

FaustianQ posted:

It's because AMD kept designing towards raw compute power (and now Volta strips even that away), and that meant designs being massively bottlenecked by memory and ROPs. I mean, it should have been clear as day when Fiji could barely keep ahead of Hawaii that there were limitations they were going to hit, and quickly. Rather than slowing down core count growth and focusing on making those cores more capable, making sure they were fed and not idling, making sure they sucked up less power, they just kept pushing more design gimmicks, floundering as they searched for a way forward that could be compelling for investors and customers. "Innovations" that became meaningless as time went on, not more relevant, because the way Nvidia designed was simpler, more effective, and easier to program for. Now it's too late for AMD to ever come back because they don't have the money or talent anymore, and by the time they do it won't be profitable to re-enter the dGPU space. Maybe they won't stop this year, or the next (technically support for Lexa lasts till 2024), but soon they'll bow out and simply move to iGPUs, semi-custom, and CPUs.

The problem is it's not enough to make only a midrange card, either in mindshare or in technical design. The high-end card is what pushes your technology forward, and that gets ported down into the rest of your lineup.

Paul MaudDib fucked around with this message at 02:39 on Dec 16, 2017

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Paul MaudDib posted:

The problem is it's not enough to make only a midrange card, either in mindshare or in technical design. The high-end card is what pushes your technology forward, and that gets ported down into the rest of your lineup.

I'm not saying AMD couldn't or shouldn't have made a high-end card; rather, they designed towards raw compute power and wasted die space on frivolous features. They could easily have designed something that surpassed Fiji in games or workstation use, but that wasn't their design goal and they paid for it. They ended up making midrange cards with high-end compute performance, and with high-end thermals and power draw as well.

Instead of designing cards that could compete in the consumer space, they kept trying to break into HPC, which Nvidia has on lockdown, wasting time and money with half-assed software, incomplete or late launches, gimmicky "innovations" that did nothing to actually improve performance but did waste thermals, power, and space, lackluster support, and sometimes outright broken features. The tiger they tried riding has thrown them off and is now eating their face.

repiv
Aug 13, 2009

AMD even half assed their push into HPC by failing to release literally any new GPU with double precision support.

Want to do scientific computing with AMD? I hope you like Hawaii cards from four loving years ago!

Wistful of Dollars
Aug 25, 2009

tehinternet posted:

I’m not married to Nvidia or anything; if AMD made a card that was objectively better per dollar, I’d switch to them.

I wanted to buy a Vega card when they were released because I want to be able to give my shekels to AMD instead of Nvidia, but ultimately the proof is in the pudding and it sure wasn't worth replacing my 1080. :shrug:

BurritoJustice
Oct 9, 2012

SlayVus posted:

I'd like to point out that the only reason some Titans keep their resale value is that they aren't gimped on their floating-point performance. The Titan XP and Titan Xp will invariably lose their resale value. I saw people selling the Titan XP on [H] for around $700-800 when the 1080 Ti came out. Or maybe that was when the Xp came out, but I do know it was already selling way below retail used.

TL;DR: The Titan V should keep its resale value, but the Pascal Titans have gimped FP, which hurts resale.

The only Titan where "FP gimping" was relevant was the original and the Black (and by extension the Z but lol), where the x80Ti equivalent actually had arbitrarily disabled FP64 silicon. All the Titan X variants don't have FP64 sequestered off, the FP64 hardware was just completely removed and replaced by additional single precision hardware. The X's would be worse for single and half precision compute if they had kept in the double precision hardware with the same silicon. It actually made them a great buy for many compute scenarios, especially the XM as it was such a bump in VRAM size, due to their chart topping FP16/FP32 performance. The X Maxwell was actually a huge hit for this reason. FP64 is honestly more niche in the professional space than you'd expect, it's not a universal requirement. The equivalent Pascal die to the one used for the Titan V is actually the GP100, which was never released in the consumer space and sees worse FP32 performance due to the different focus of the design. There's no free lunch with double precision performance.

Now that Nvidia has actually unlocked the Quadro-specific application optimisations in the Titan drivers, they have an additional point of differentiation. Until this recent change, the Xp actually got loving destroyed in many benchmarks by the P6000, with which it shares a die, instead performing similarly to the 1080 Ti. Since this change it's much closer to the $5,000 card.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

BurritoJustice posted:

It actually made them a great buy for many compute scenarios, especially the XM as it was such a bump in VRAM size, due to their chart topping FP16/FP32 performance. The X Maxwell was actually a huge hit for this reason. FP64 is honestly more niche in the professional space than you'd expect, it's not a universal requirement.

Yes, this is a point I like to harp on: if you build it they will come. Jack-of-all-trades hardware pleases nobody, but if you make something that's really really fast at one thing people will bend their programs to fit your strengths. NVIDIA had no idea that Maxwell would usher in the deep-learning revolution when they designed it. It was just a balls-out design for gaming, and it also happened to be a balls-out design for neural networks. To their credit, AMD is also the fastest at integer processing (i.e. crypto), by a pretty wide margin. Inferior efficiency, but they can crunch the numbers.

GPUs are inherently a coprocessor model and you don't use a coprocessor for general-purpose stuff, you optimize to whatever the metal is good at. Not saying GV100 is a bad thing, but a hell of a lot of stuff doesn't really need onboard multi-kernel scheduling and FP64, it runs fine with FP32 and software-emulated scheduling like everything else.

BurritoJustice posted:

The equivalent Pascal die to the one used for the Titan V is actually the GP100, which was never released in the consumer space and sees worse FP32 performance due to the different focus of the design.

It's really funny that the people who insisted there would never be a Titan GP100 were right... but I had the last laugh. Guess having FP64 doesn't make a design inherently suck at gaming after all! :smugmrgw:

Paul MaudDib fucked around with this message at 17:33 on Dec 17, 2017

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

tehinternet posted:

Yeah, Freesync seems like the last real reason to justify buying AMD; it’s amazing how cheap those monitors are.

Too bad for AMD, VRR is now part of the HDMI 2.1 standard, which means NV would have to support that eventually, and my crystal ball says they won't kill G-Sync but instead will differentiate it even further from the baseline VRR. Even the FreeSync diehards will switch when that happens.

Craptacular! posted:

There isn't a single review of Nvidia cards that doesn't compare them to AMD's cards at some point. That company wrote its own fate going back to its days as ATI. It's not because Nvidia has some PlayStation-like brand equity in gaming. It's because somebody heard something or read something or had a bad experience with something.

The Rage Pro that came in late-90s OEM PCs kept me from buying any of the Radeon cards until the 9700, because I doubted the company that shipped hardware that couldn't run GLQuake in 1999 could build a solid competitor. The state of their drivers in the late 2000s and people going "oh, install the Omega Drivers by some dude" sent me over to Fermi, and I stayed with Nvidia until the RX 470 was mine for about eight months. When I see that they don't even have adaptive VSync, which Nvidia has had since Kepler, but some guy who made the now-deprecated RadeonPro drivers was able to do it for their hardware, it makes me even more leery of buying something in the future.

I'll still keep them in mind, but drat they are working from far behind. Hobbyist coders were frequently making better drivers for their hardware than they can, and it only seems to have stopped because Microsoft has added a complicated signing system to put the brakes on hobby drivers.

Granted, I never experienced any showstopping driver problems with my 9800 Pro/X1800XT/5850/7950, but I have to say it's almost scary how early NV got the business formula right, from 1998 on: wide compatibility with games (aka none of the 3dfx API bullshit), solid drivers, and good performance, with being the outright best just a bonus.

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

Palladium posted:

Too bad for AMD, VRR is now part of the HDMI 2.1 standard, which means NV would have to support that eventually, and my crystal ball says they won't kill G-Sync but instead will differentiate it even further from the baseline VRR. Even the FreeSync diehards will switch when that happens.

lmao Jesus Christ AMD is turbo hosed as far as GPUs go.

Not a good thing since Nvidia will charge whatever the gently caress they want, but it is what it is now.

ufarn
May 30, 2009
Wouldn't there be a way for Intel/AMD CPUs to support VRR through iGPUs?

Yaoi Gagarin
Feb 20, 2014

repiv posted:

Regardless, it shows that Volta has some serious IPC improvements under the hood; they're turning 20% more FLOPS into 80% more performance.

That's... Not at all how this works. :psyduck:

FLOPS is floating-point operations per second; IPC is instructions per cycle. If IPC goes up, then you could get more instructions per second for the same clock frequency, but that would translate to more FLOPS, not "more performance for the same FLOPS". If a process is actually computation-bound instead of memory-bandwidth-bound, then FLOPS pretty much is performance.

It's also worth noting that hardware companies (and supercomputer designers) love to present the best-case scenario for FLOPS. Nvidia's estimate for the Titan V most likely assumes you're lighting up all the tensor cores, and the FP32/16 stuff, and the FP64 stuff, at 100% utilization all the time. It's just unrealistic, especially since the tensor cores are so specialized that they do just one thing (4x4 matrix multiply and accumulate, IIRC).
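For a back-of-the-envelope sense of where those marketing peaks come from (the core count and clock below are illustrative assumptions, not quoted Titan V specs), peak FLOPS is just cores x clock x FMA ops per clock, and the tensor-core number only counts when the workload is literally that one 4x4 multiply-accumulate:

code:
# Back-of-the-envelope peak-FLOPS math; the numbers are illustrative
# assumptions, not quoted specs.
def peak_gflops(cores, clock_ghz, flops_per_core_per_clock=2):
    # 2 = one fused multiply-add (FMA) counted as two floating-point ops
    return cores * clock_ghz * flops_per_core_per_clock

print(peak_gflops(5120, 1.4))  # ~14,300 GFLOPS, only if every core FMAs every cycle

# The tensor-core peak applies only to this exact shape of work:
# D = A*B + C on 4x4 matrices (FP16 inputs, wider accumulate).
def mma_4x4(A, B, C):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) + C[i][j]
             for j in range(4)] for i in range(4)]
Any part of the chip that isn't doing exactly its one job every cycle drags real throughput below that peak, which is why quoted TFLOPS and delivered performance diverge.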

ufarn
May 30, 2009
So uh, it seems that people are still getting crashes in Overwatch after the patch? Was the issue actually resolved, or?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
VRR is an optional part of the HDMI 2.1 standard, NVIDIA is under no obligation to implement it.

Nevertheless I think they eventually will, because it would take away the last straw that people have been clinging to in order to justify purchasing AMD products. The living-room market is going FreeSync, since AMD has the console market locked up, and NVIDIA has spent a lot of money on various products to try and make inroads into that market and the "non-PC gaming market" generally (Shield, etc). VRR is just too compelling of a value story for consoles that can't render at a reasonable refresh rate (also iGPUs).

The fact that they didn't support it with Pascal doesn't mean they never will. Gsync was still pretty young when Pascal launched (less than three years old) and I think they will eventually support it now that it's not threatening their ROI. I wouldn't even be surprised if they put out a line of GSync-branded FreeSync monitors at some point with LFC support/etc.

Paul MaudDib fucked around with this message at 17:38 on Dec 17, 2017

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

ufarn posted:

So uh, it seems that people are still getting crashes in Overwatch after the patch? Was the issue actually resolved, or?

Kinda seems like not. But the problem is that the "rendering device lost" message seems to be a generic "unrecoverable error" message for crashes in the renderer; you can get it, for example, if you have overclocked too much. NVIDIA users were getting it too. So it's entirely possible that there are multiple bugs that are giving the same user-facing error but have different underlying causes.

Some of them may just be users whose GPU chips are not really stable and need to back down their undervolts/overclocks to factory settings or below. Some of them could even be subtle memory corruption from the IMC issues ("segfault bug" is a bit of a misnomer) on Ryzen, or just an unstable CPU overclock in general.

Some games are just touchier than others and don't like aggressive overclocks. The newer driver may have made OW touchier in general and pushed some users' chips over the line into intermittent stability issues - people are pushing the chips really hard now that everyone is undervolting/etc. Everybody is running the chips with zero safety margin and assuming that just because it works in a few games that it's 100% stable in everything.

So yeah, it's hard to say, that's not a very specific error message. Could be they didn't quite fix the bug, could be another bug, could just be users with unstable systems. The latter is almost certainly true of at least some users regardless of whether there is a bug.

Paul MaudDib fucked around with this message at 04:26 on Dec 17, 2017

Regrettable
Jan 5, 2010



Paul MaudDib posted:


Some games are just touchier than others and don't like aggressive overclocks.

Hell, Battlefleet Gothic Armada will crash from any GPU overclock.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Heads-up: EVGA has B-stock 980 Ti reference cards for $350 with free shipping. Not super fantastic, but decent given how totally hosed 1070 prices are (and unlike eBay cards they do come with a year of warranty). Per their notice, you buy by Tuesday 2 PM Pacific Time or they don't ship until after New Year's.

Paul MaudDib fucked around with this message at 18:11 on Dec 17, 2017

Maxwell Adams
Oct 21, 2000

T E E F S

https://www.youtube.com/watch?v=lDD2MFGdRCE&t=671s

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
If for some reason any of you have kept eVGA GeForce 4 ACS cards from 2002: https://www.techpowerup.com/239713/evga-offers-usd-1-000-for-your-old-geforce4-acs-acs2-graphics-cards

Of course, if any of these were for sale on eBay, they sure as hell aren't now.

CaptainSarcastic
Jul 6, 2013



BIG HEADLINE posted:

If for some reason any of you have kept eVGA GeForce 4 ACS cards from 2002: https://www.techpowerup.com/239713/evga-offers-usd-1-000-for-your-old-geforce4-acs-acs2-graphics-cards

Of course, if any of these were for sale on eBay, they sure as hell aren't now.

Goddamnit, I think the oldest Nvidia card I still have around is an FX 5200 PCI card in an old PIII server. Pretty sure it's eVGA. Other than that, I think I have a 9500 GT and a 9800 GTX from them around here somewhere.

craig588
Nov 19, 2005

by Nyc_Tattoo
Don't feel bad, it's only the first 3 cards, and on Twitter someone tried to submit one hours after it started and they were unable to, so I think it was over basically the minute it opened. I have an ACS2 somewhere but didn't even bother searching for it, because of the people who keep their video cards on shelves ready to photograph or whatever.

CaptainSarcastic
Jul 6, 2013



This made me think of the video cards I have lingering around, and I'm pretty sure I also have a GTX 460 from eVGA in a box here somewhere. Also have a GTX 660 somewhere, but that one is Zotac, I think.

It's me - I'm the GPU hoarder.


Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

BIG HEADLINE posted:

If for some reason any of you have kept eVGA GeForce 4 ACS cards from 2002: https://www.techpowerup.com/239713/evga-offers-usd-1-000-for-your-old-geforce4-acs-acs2-graphics-cards

Of course, if any of these were for sale on eBay, they sure as hell aren't now.

It's almost like "poo poo, we didn't have everything we made in the glass cabinet collection."
