Volguus
Mar 3, 2009

wolrah posted:

Yeah a well-chosen power supply is probably the longest lasting functional component. The ATX12v 2.0 spec in 2003 was the last major change, bringing the 24 pin ATX connector, PCIe power, and mandating SATA support. Even for that an older supply with sufficient capacity could be used with adapters to handle the new connectors. A well-chosen PSU from the early 2000s could easily still be in use today.

Now that we have modular cables as pretty much standard on the nicer models we're even fairly future-proof on the connections. Unless we need a new voltage or there's a sudden increase in demand for 3.3/5v power any modern modular power supply should be useful for its entire operational life.

I had an Enermax power supply too, bought around 2007 or so. But, in 2012 when I bought my current x79 cpu/mb the drat thing would not power on. Not a blip. Went and bought a new Seasonic power supply and everything was peachy. The supply itself is still working on the old cpu/mb combo that I had before, but not a chance on the newer one. I'm thinking that there must have been some ATX standard change that the old supply didn't support and the new motherboard (ASUS) required.

dissss
Nov 10, 2007

I'm a terrible forums poster with terrible opinions.

Here's a cat fucking a squid.

HalloKitty posted:

Many at the low end, but not too many high end ones.

vvv Many more, but it would be pretty tedious to compile a complete list. A good example of rebrands run amok is the Wikipedia list of 600 series cards. Again, we're talking about low-end cards where NVIDIA went crazy; a GT 640 can come with five different chips, for example.
At the higher end (cards people might actually have bought) I recall things like the 8800GT to 9800GT, 680 to 770...

They also did annoying stuff with their mobile chips, like the 850M being available in both DDR3 and GDDR5 versions and the 860M being either Kepler or Maxwell.

Craptacular!
Jul 9, 2001

Fuck the DH

MaxxBot posted:

What confuses me the most are PC desktop users who are Intel fanboys; they must have masochistic personalities.

Like I said above, VIA soured my enjoyment of AMD early on. I don't remember all my CPUs, but my P3 was not part of the first run; it was a 700MHz Coppermine that had been rumored after Athlon trounced the first batch of P3s. My P4 was a 1.8GHz Northwood. I *think* I also had a Prescott, but the period of 2004-2006 was loaded with drama in my life and I don't remember much.

I always wanted to own a P2/P3 with the big heatsink shroud and the connector that reminded me of an SNES cartridge.



So, speaking of nostalgia, bear with me as I look back over my past....

I've been watching a few of Adored's "long winding history lecture" videos, where he rambles on about all the generations where AMD had better cards and they didn't sell. He likes to hammer on about Nvidia's mindshare and how so many regular non-enthusiast people didn't know Radeon existed. To me, GPUs were originally called "3DFX cards" because if you wanted to play GLQuake(World), what the hell were you doing buying anything other than a Voodoo? Nvidia was the one who merged the 2D/3D cards together, and when they stumbled with the FX cards and their loud-as-hell blower coolers is when ATI finally had an advantage in things.

I've never owned any of these cards he talks about that dominated the Steam hardware survey. I had no idea the 8800 GT was such a big deal.

My GPU history is as follows:
Voodoo -> TNT2 -> GeForce DDR -> GF Ti4400 -> Radeon 9700 PRO -> GF 6800GT -> GF 7600GT -> RHD 3470 -> GTX 460 -> GTX 660 -> RX470 -> GTX 1070

At some point, I stopped buying popular cards, I guess, because the card that probably had the best representation in the hardware survey was the 660. I've told the story before, but I spent an arm and a leg on the 6800GT; its price tag, even in 2004 dollars, was slightly higher than the 1070 I bought last month, so it remains the most expensive graphics card I ever bought. Not only was my thinking behind the idea dumb (being a Quake/id fanboy, I promised myself years ago I'd buy the best card I could find for Doom 3, a game I wound up hating), but also it was on an AGP bus that was deprecated by Intel before its usefulness was dead. And rather than buy a then-shoddy ASRock motherboard that included an AGP port for legacy purposes, I decided to sell the card and buy the next step.

My 3470 was a passively cooled Sapphire which I bought because I was getting less interested in PC gaming and moved everything to the PlayStation 3 in the latter half of the 2000s. UserBenchmark shows a 1,800% improvement going from the 3470 to the GTX 460, which can't be right unless things really improved a drastic poo poo-ton in two years.

Starting with the GTX 460, every purchase I've made has been less about value or fanboyism and more about where the drivers stand in non-Windows operating systems (Linux from 2011-2013, Hackintosh 2013-today). Even the RX 470, which doesn't work as well, because Nvidia hadn't released their 10-series Mac drivers at the time.

In conclusion, please post (or perhaps link) the most embarrassing GPU box art you've ever actually purchased. Mine would be this one, though also this X-shaped box from XFX earns points for being the most complicated packaging of all time.

GRINDCORE MEGGIDO
Feb 28, 1985


Whoah gently caress that's a post

Craptacular!
Jul 9, 2001

Fuck the DH
Yes but please, share embarrassing boxes. It's time this thread focused on the real important issue, which is that GPU marketing has always been cringy as gently caress.

Shrimp or Shrimps
Feb 14, 2012


While I loved it at the time, the chrome gargoyle of my 9800 pro box is pretty lame now.

My X1800XT box with the girl in red (I want to say "Ruby"?) was also pretty meh.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

wolrah posted:

Yeah a well-chosen power supply is probably the longest lasting functional component. The ATX12v 2.0 spec in 2003 was the last major change, bringing the 24 pin ATX connector, PCIe power, and mandating SATA support. Even for that an older supply with sufficient capacity could be used with adapters to handle the new connectors. A well-chosen PSU from the early 2000s could easily still be in use today.

Now that we have modular cables as pretty much standard on the nicer models we're even fairly future-proof on the connections. Unless we need a new voltage or there's a sudden increase in demand for 3.3/5v power any modern modular power supply should be useful for its entire operational life.

Haswell required some PSU changes to support its very low-current C7 sleep state. Some PSUs were too clever for their own good and decided that, since C7 pulled so little power, the whole computer must actually be off.
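
To illustrate what "too clever" means here, a toy sketch in Python. The amp figures are made-up placeholders, not numbers from the ATX spec; the point is just that a supply designed around a higher minimum load can't tell deep idle from "off".

code:

# Toy model of the Haswell C7 / PSU minimum-load mismatch.
# All current values are illustrative placeholders, not spec numbers.

def rail_stays_regulated(psu_min_load_amps, cpu_draw_amps):
    # Crude check: the 12V rail only behaves if the CPU draws at least
    # as much as the supply was designed to expect as a minimum load.
    return cpu_draw_amps >= psu_min_load_amps

C7_DRAW = 0.05          # Haswell deep idle pulls nearly nothing (placeholder)
OLD_PSU_MIN_LOAD = 0.5  # older design assuming a constant baseline load (placeholder)
HASWELL_READY_MIN = 0.0 # "Haswell ready" design tolerating near-zero load

print(rail_stays_regulated(OLD_PSU_MIN_LOAD, C7_DRAW))   # False -> supply may decide the PC is off
print(rail_stays_regulated(HASWELL_READY_MIN, C7_DRAW))  # True  -> stays regulated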

Paul MaudDib fucked around with this message at 23:59 on Aug 26, 2017

Craptacular!
Jul 9, 2001

Fuck the DH

Shrimp or Shrimps posted:

My X1800XT box with the girl in red (I want to say "Ruby"?) was also pretty meh.

Sapphire definitely got into the "CGI Boobs" game, but I think they had a different set of CGI Boobs. "Ruby" was exclusively an ATI internal thing. At least you didn't have the ASUS model, which had a replicant of Fifty Cent, wearing Unreal Tournament armor, presented alongside a giant monkey.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
A user on Reddit is reporting success flashing a Vega 64 LC VBIOS onto a Vega 64 Air card - thus sidestepping AMD's power limiter on that model.

Craptacular!
Jul 9, 2001

Fuck the DH

He's using an aftermarket liquid cooler, so it's at least not going to erupt his PC into a fire... But what exactly is the benefit of this other than not having to buy a games&coupons bundle?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Craptacular! posted:

He's using an aftermarket liquid cooler, so it's at least not going to erupt his PC into a fire... But what exactly is the benefit of this other than not having to buy a games&coupons bundle?

Getting the extra 75 watts on your power limit without having to buy a games+coupons bundle.

Arivia
Mar 17, 2011

Craptacular! posted:

Sapphire definitely got into the "CGI Boobs" game, but I think they had a different set of CGI Boobs. "Ruby" was exclusively an ATI internal thing. At least you didn't have the ASUS model, which had a replicant of Fifty Cent, wearing Unreal Tournament armor, presented alongside a giant monkey.

Is that an external GPU power brick pictured on that box?

Craptacular!
Jul 9, 2001

Fuck the DH

Arivia posted:

Is that an external GPU power brick pictured on that box?

quote:

Why ASUS is Better?
80-Watt External Power Supply Kit : Provided easy-to-install power supply kit to upgrade your system for dedicated and stable power input to maximize the overclock-ability.

EVIL Gibson
Mar 23, 2001

Internet of Things is just someone else's computer that people can't help attaching cameras and door locks to!
:vapes:
Switchblade Switcharoo

Paul MaudDib posted:

Getting the extra 75 watts on your power limit without having to buy a games+coupons bundle.

If you can somehow find non-bundled cards at anywhere near MSRP, considering the buttcoin GPU desert out there.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

Arivia posted:

Is that an external GPU power brick pictured on that box?

nobody remind AMD this used to be acceptable.

WanderingKid
Feb 27, 2005

lives here...

Craptacular! posted:

My GPU history is as follows:
Voodoo -> TNT2 -> GeForce DDR -> GF Ti4400 -> Radeon 9700 PRO -> GF 6800GT -> GF 7600GT -> RHD 3470 -> GTX 460 -> GTX 660 -> RX470 -> GTX 1070

I love gpu histories. I'm like the hurdler who starts with a bad rhythm and knocks over all the fences. I missed out on most of the good poo poo:

AccelGraphics Permedia 2
3DFX Voodoo 3 3000
Geforce 2 GTS
Geforce 4 MX something (it was bad)
Radeon 9600 Pro
Radeon X1950 Pro
GTX 460
GTX 760
GTX 1070

The jaw dropping Half Life 2 demo was running on a 9700 Pro, which I couldn't afford so I went 9600 Pro. Cue 3 years of running hacked driver packages because Catalyst was such a mess back then that each update would break as many things as it fixed. By the time I went back to nVidia the damage was already done. They had a reputation for poo poo drivers all the way up to the hot and loud memes.

I can't find the boxart for my geforce 2 but holy cow there were some bad ones:



Edit:



loving lol. Old school gpu boxart guys were total space cadets.

WanderingKid fucked around with this message at 02:28 on Aug 27, 2017

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.
I'm a baby who came late to the GPU race.

Radeon 9600 Pro
Radeon X1800 GTO
Radeon HD 4870
Radeon HD 6950
a second Radeon HD 6950, doing CrossFire for about a moment before they both died when my cheap PSU exploded
GTX 660 Ti SLI
GTX 980 Ti

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
GeForce 440 MX
8800GT
2x 6870s in crossfire
GTX 760
GTX 1070

Wirth1000
May 12, 2010

#essereFerrari
I flinched and now own an EVGA 1080 Ti FTW3 GPU. This thing is a loving beauty. The construction is unreal.

I installed Precision XOC and now I'm hella confused about fan curves. Apparently this overrides the built-in vbios fan curve and I don't know what to do... play around with fan curves myself (I'd really rather not) or just let this thing regulate it? Maybe uninstall XOC entirely and let the vbios regulate the fans.

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
Voodoo was the coolest loving name ever. Shame mine are in a landfill somewhere.

Craptacular!
Jul 9, 2001

Fuck the DH

Proud Christian Mom posted:

Voodoo was the coolest loving name ever. Shame mine are in a landfill somewhere.

Monster 3D. Every now and then I'm amazed to see Diamond Multimedia still exists as an AMD reference brand.


Wirth1000 posted:

I flinched and now own an EVGA 1080 Ti FTW3 GPU. This thing is a loving beauty. The construction is unreal.

I installed Precision XOC and now I'm hella confused about fan curves. Apparently this overrides the built-in vbios fan curve and I don't know what to do... play around with fan curves myself (I'd really rather not) or just let this thing regulate it? Maybe uninstall XOC entirely and let the vbios regulate the fans.

If you slam the big button that says "DEFAULT", it'll revert back to stock settings. Unless you want to OC, there's no reason to mess with them.

If you're not going to OC (I've never OCed jack poo poo and run everything at stock until replacement), the main reasons to install XOC are RGB lighting control, in-game hardware monitoring, and, for SC2/FTW2/FTW3 owners like you and me who paid extra for a shitload of cooling and sensors, actually reading all those sensors EVGA put in their ICX models. And enabling the thermal-based lighting. My card (1070 SC2) has "G P M" lights that change color based on the temperature; I don't think the 1080 Ti models have letters, but they do have little LED dots for that nonetheless.

Craptacular! fucked around with this message at 03:23 on Aug 27, 2017

Wirth1000
May 12, 2010

#essereFerrari

Craptacular! posted:

Monster 3D. Every now and then I'm amazed to see Diamond Multimedia still exists as an AMD reference brand.


If you slam the big button that says "DEFAULT", it'll revert back to stock settings. Unless you want to OC, there's no reason to mess with them.

If you're not going to OC (I've never OCed jack poo poo and run everything at stock until replacement), the main reason to install XOC is RGB lighting control, in-game hardware monitoring, and for SC2/FTW2/FTW3 owners like you and me who paid extra for a shitload of cooling and sensors, to actually read all those sensors EVGA put in their ICX models. And enable the thermal-based lighting. My card (1070 SC2) has "G P M" lights that change color based on the temperature, I don't think the 1080ti models have letters but they do have little LED dots for that nonetheless.

Yah, I installed XOC to shut the RGB poo poo off. Clicking default didn't default the fan curves though.

Craptacular!
Jul 9, 2001

Fuck the DH

Wirth1000 posted:

Yah, I installed XOC to shut the RGB poo poo off. Clicking default didn't default the fan curves though.

Well, the curve stuff is tied to "Enable Automatic Fan Control" it seems, since disabling it in the settings causes the boxes to disappear.

Here's the default-rear end curve for my card (again, not exactly the same as yours) if it helps at all:


There are two other presets called power and quiet, but this is the default-rear end "custom". What you get if you turn automatic fan control off entirely, I dunno.

EDIT: Disregard that. Now my fans seem to be stuck at 40%. weeeelp.

EDIT 2: gently caress gently caress now I have fans running at idle all the time. Guess I'll uninstall/reinstall XOC. :(

edit 3: Rescue from the EVGA forums:

quote:

Go to your fan curve screen and click on any node to select it. Now press CTRL+D. That should reset your fan curve to default.
P.S. Here's another tip. If you double-click anywhere in that area that isn't a node, you can toggle the fan curve from a curve to a step configuration. This can be useful in saving your software fan control some CPU cycles by going with a step configuration rather than a grad' slope.

Craptacular! fucked around with this message at 04:26 on Aug 27, 2017

Rap Game Goku
Apr 2, 2008

Word to your moms, I came to drop spirit bombs


Rage 128
Radeon 9200 Mobile
X1300 Mobile
560Ti
1070

Didn't do a ton of PC gaming until I built the system with the 560Ti in it. The first two were Macs as well, so they didn't have graphic card boxes associated with them.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Matrox Mystique
Rage 128
GeForce 2 MX
GeForce 4 MX 440
FX 5800 (maybe 5800 ultra? or maybe 6800?)
A midrange GTX 2something
7850 2 GB ($150)
Refurb R9 280 ($150)
B-stock 780 Ti ($185)
Firesale 980 Ti Classified ($400, with Step Up)
1080 FE ($430)

The Mystique wasn't the fastest, and eventually Descent: FreeSpace needed hardware T&L; there was no software emulation option. The Rage 128 was good when it was good, but often wasn't, and APIs were changing a lot back then. Buying cheaper cards more frequently seemed like an OK way to keep up with the APIs. The GF2 MX was fine for what it was, but NVIDIA got me with the whole MX 440 scheme. I got into the practice of "nicer card and run the wheels off" instead.

GCN 1.0 is basically the immortal uarch; it's vastly older than even Tesla was when NVIDIA moved to Fermi. And the truth is, because it's in consoles it will be kicking for a long time; all XB1X titles need to at least run on the XB1...

My GTX 275 (?) gradually poo poo the bed just out of warranty, so my dad was chipping in (I was in grad school) and I pissed him off by waiting through firesale 6000 series cards and getting a Black Friday 7850 instead. That card kicked rear end in its day, and the 280 was very acceptable for a lot of stuff in my later adventures trying higher resolutions on a budget. I got a used Dell P2715Q for $350; I could run at 4K in some optimized console stuff (Mad Max, Titanfall/Source games, etc), and 1440p was usually pretty solid (let's say above 45 fps), with only a few poo poo titles (Wolfenstein: TNO) really making GBS threads the bed and needing 1080p.

The combo of GSync lured me away though - the 780 Ti was like stepping up to a 290 in terms of performance, and also GSync had early-mover advantage on low-framerate play. I got a Dell S2716DG from Amazon Warehouse for under $350 IIRC, and that was that for gaming on the P2715Q. High framerate was the cherry on top, but being able to crank settings and run at 45 fps in a slower-paced RPG is a killer feature. This is something that not even all FreeSync monitors are capable of doing - GSync is smooth running it all the way down until you start noticing that you're getting a "slideshow effect"; it's absolutely critical to cover at least the 30-60 fps range.

Paul MaudDib fucked around with this message at 11:45 on Aug 27, 2017

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
1st: TNT 16MB in a Dell 450MHz Pentium II. For an idiotic reason, I paired this with an 8MB Voodoo II, which gave me very little the TNT couldn't emulate or do better.
2nd: Guillemot GeForce 256 DDR 32MB (which died)
3rd: Guillemot GeForce 2 GTS 32MB (RMA replacement for the GF256 DDR...*also* died)
4th: Radeon 8500 64MB (kind of had enough of nVidia)
5th: GeForce 4 Ti4600 (lucked out on a Dell deal/price mistake, killed this card about two years in by loving up an attempt to use AS thermal epoxy on the graphics RAM - this was also around the time when I shelved the Dell for a 950MHz copper-cored AMD T-Bird)
6th: Radeon 9600XT (didn't use this for too long)
7th: nVidia 6800 GT
8th: A very dark period where I began liking ~gaming laptops~ a bit too much. Mobile 6800 Ultra, then a GFGo 7800GTX, and finally a GTS 9800.
9th: Back to desktops, my initial build used a single 560Ti 448 core, then later on when they fell to ~$220, I bought a second and found out in short order the only thing SLI did for me is raise my case temps.
10th: GTX 970 - current card, but it took me three makers to finally get a card that performed/was built right.
11th: Probably the x70 version of Volta as I'm still pretty happy at 1200p@60Hz, and my next build will change that resolution.

BIG HEADLINE fucked around with this message at 06:47 on Aug 27, 2017

cowofwar
Jul 30, 2002

by Athanatos
I bought an EVGA GeForce GTX 760 2GB 256-bit GDDR5 back in Nov 2014, and looking at video card ranking scores today, only the very top end breaks 2x this card's performance. Only doubling performance in three years seems a bit sad. I'll be holding on to the 760 for another year, I guess.

https://www.videocardbenchmark.net/gpu_list.php

NewFatMike
Jun 11, 2015

Paul MaudDib posted:

Matrox Mystique
Rage 128
GeForce 2 MX
GeForce 4 MX 440
FX 5800 (maybe ultra?)
A midrange GTX 2something
7850 2 GB
Refurb R9 280
B-stock 780 Ti
980 Ti
1080

The Mystique wasn't the fastest and eventually Descent:FreeSpace needed hardware T&L, there was no software emulation option. The Rage 128 was good when it was good, but often wasn't, and APIs were changing a lot back then. Buying cheaper cards more frequently seemed like an OK way to keep up with the APIs. The GF2 MX was fine for what it was, but NVIDIA got me with the whole MX 440 scheme. I got in the practice of "nicer card and run the wheels off" instead.

GCN 1.0 is basically the immortal uarch, it's vastly older than even Tesla was when NVIDIA moved to Fermi. And the truth is because it's in consoles it will be kicking for a long time, all XB1X titles need to at least run on XB1...

My dad was chipping in on a graphics card (I was in grad school) and I pissed him off by waiting through firesale 6000 series cards and getting a 7850 instead. That card kicked rear end in its day, and the 280 was very acceptable for a lot of stuff in my later adventures trying higher resolutions on a budget. I could run at 4K in some optimized console stuff (Mad Max, Titanfall/Source games, etc), 1440p was usually pretty solid (let's say above 45 fps), with only a few poo poo titles (Wolfenstein:TNO) really making GBS threads the bed and needing 1080p.

The combo of GSync lured me away though - the 780 Ti was like stepping up to a 290 in terms of performance, and also GSync had early-mover advantage on low-framerate play. High framerate was the cherry on top, but being able to crank settings and run at 45 fps in a slower-paced RPG is a killer feature. This is something that not even all FreeSync monitors are capable of doing - GSync is smooth running it all the way down until you start noticing that you're getting a "slideshow effect", it's absolutely critical to cover at least the 30-60 fps range.

Since we both have 1080s, how does it handle 1440 ultrawide? I think I'm going to pick up a GSync one once I'm back in the States and I'm looking forward to cranking settings. 60Hz is totally fine for me, I think anything above might be wasted on my crappy eyes.

Craptacular!
Jul 9, 2001

Fuck the DH

Paul MaudDib posted:

Matrox Mystique

It was you! You're the person who bought the creepy clown box! I couldn't stop looking over my shoulder at that thing at Best Buy.



I guess I should share my favorite box, as saved by some guy on a Japanese site, with the not-typical WinDVD sticker:


Creative had a box design that was frequently atrocious and full of all sorts of bad renders, but for the original GeForce they went with this very minimalist eclipse theme. It was so good, they chose to iterate on it for the GeForce2, but like all good things in the 90s they kept re-upping it, and by the 3/4 era it was a mess.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

NewFatMike posted:

Since we both have 1080s, how does it handle 1440 ultrawide? I think I'm going to pick up a GSync one once I'm back in the States and I'm looking forward to cranking settings. 60Hz is totally fine for me, I think anything above might be wasted on my crappy eyes.

I'd say it falls about where a 1070 does with a regular 16:9 1440p (which makes sense, this is 31% more pixels and that's roughly the difference between those cards). You're going to crank a reliable 60fps even at ultra everything but if you want to chase framerate in the 75+ range then you're going to have to make some choices in settings. The only problem I've noticed is that since it's quite a bit more pixels, it seems like your drops tend to drop harder. I usually don't notice framedrops when GSync is enabled but nowadays it's almost to the point of stutter during more severe drops for load-ins/etc.
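
(Quick back-of-the-envelope on the pixel-count comparison, assuming 3440x1440 for the ultrawide against 2560x1440 for regular 16:9 1440p - the exact percentage obviously depends on which panels you're comparing.)

code:

# Rough pixel-count comparison: 21:9 UW-QHD vs 16:9 WQHD.
# Resolutions are the common panel sizes; swap in your own to compare.
uwqhd = 3440 * 1440   # 4,953,600 pixels
wqhd = 2560 * 1440    # 3,686,400 pixels

extra = uwqhd / wqhd - 1
print(f"UW-QHD pushes {extra:.0%} more pixels than WQHD")  # roughly a third more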

If you want to chase framerate I'd probably say go with a 1080 Ti instead, given the relative pricing. If you can find the 1080 Ti for like $650 that's probably a nicer option at the high end, especially if you are trying to futureproof (as always, best of luck with that).

Subjective: The curve is super great, it gives you proper angular FOV so you don't get distortion (34" flat ultrawide is terrible). I play in the dark and honestly I perceive it as a "flat surface" visually. My unit has terrible backlight bleed, which I posted a while ago. I do not notice the bleed once I get in-game but it's annoying on loading screens and stuff. The one I haven't tried is games that need to be black-barred down to 16:9, that may be a bit ugly... It also picks up fingerprints like a mofo - I think I must have been used to light matte and this is suddenly full gloss. This has the 2nd-gen GSync module with a HDMI input for another non-GSync machine, which is nice, plus speakers and a USB hub if you care about that. The stand isn't good imo, but I haven't really tinkered with it much.

Another downside is of course that software support is mixed. NVIDIA Shadowplay does not like recording BF1 at ultrawide, where it was fine at 16:9, and 21:9 is fine in other games too. It potentially seems a little crashier too, like games will sometimes poo poo themselves tabbing in and out of full screen. Of course :dice:, so that may not be the best tell. Also along with the framedrops it's possible this is just observer sensitivity, or that I need to reinstall drivers/clear game settings or cache/reinstall Windows.

It does overclock to 100 Hz, no weird scan artifacts or anything. I'm not sure if I should keep it, or exchange it and roll the dice again.

Or option C is I'm still loving drooling over that Acer XR382CQK, it's basically the same price. It's a little bigger, has a slightly tighter curve, and kicks up the resolution to 1600p, but it's max 75 Hz and FreeSync. So on an NVIDIA GPU you'd be giving up adaptive sync for now. On the AMD side you'd need something Vega 56 caliber minimum, 64 (1080 performance) would be better, but I do not want to be involved in the AMD early-access circus. CrossFire Fury X or 580 would probably be decent too. Best of all that format basically happens to be native res for "cinematic 4K" content shot in wider aspects...

Man I wish monitor companies would lighten up on everything needing to be zero-bezel - my understanding is that traditional wide-bezel monitors suffer from this much less (Nixeus commented on this). At 34-38", ultrawide is finally to the point where you don't need 2 separate monitors so bezel size really doesn't matter that much to me, whereas BLB is obnoxious.

Paul MaudDib fucked around with this message at 07:44 on Aug 27, 2017

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Craptacular! posted:

It was you! You're the person who bought the creepy clown box! I couldn't stop looking over my shoulder at that thing at Best Buy.

I was young at that point so my dad handled the hardware. But yeah, that's a thing :staredog:



It was fine for like Descent and Warcraft 2 and stuff but it kinda stumbled based on some features that it didn't support. Descent:FreeSpace was massively hyped and required hardware T&L. And to be honest it lived up to the hype, it was actually a pretty fantastic space sim, FS2 still has a massive cult following. My dad replaced it with a Rage 128, which was eventually another casualty of how fast APIs were moving back then.

The Voodoo series was the gold standard back then. Anyone remember Glide? Don't forget to check out Tachyon: The Fringe during the next Steam Sale; it's an open-world space sim with hours of content voiced by Bruce Campbell, brought to modern audiences thanks to nglide. But the Voodoo's critical flaw was always that it needed a separate card for 2D; it would literally only work in 3D mode. Then 3DFX went into a tailspin.

Heh, anyone remember when OEM-specific drivers used to be a thing? Literally the worst, you know they're not updating it after the launch drivers.

Paul MaudDib fucked around with this message at 07:30 on Aug 27, 2017

NewFatMike
Jun 11, 2015

Paul MaudDib posted:

I'd say it falls about where a 1070 does with a regular 16:9 1440p (which makes sense, this is 31% more pixels and that's roughly the difference between those cards). You're going to crank roughly 60fps even at ultra everything but if you want to chase framerate in the 75+ range then you're going to have to make some choices in settings. The only problem I've noticed is that since it's quite a bit more pixels, your drops tend to drop harder. I usually don't notice framedrops when GSync is enabled but nowadays it's almost to the point of stutter during more severe drops for load-ins/etc (it's possible this is just observer sensitivity, or that I need to reinstall drivers/clear game settings or cache/reinstall Windows).

If you want to chase framerate I'd probably say go with a 1080 Ti instead, given the relative pricing. If you can find the 1080 Ti for like $650 that's probably a nicer option at the high end, especially if you are trying to futureproof (as always, best of luck with that).

edit: I guess I should mention that this was with 90% power limit to pamper the dried-out thermal paste on my blower, I'm going to punch it up and see how it does with my CLC.

Subjective: The curve is super great, it gives you proper angular FOV so you don't get distortion (34" flat ultrawide is terrible). I play in the dark and honestly I perceive it as a "flat surface" visually. My unit has terrible backlight bleed, which I posted a while ago. I do not notice the bleed once I get in-game but it's annoying on loading screens and stuff. The one I haven't tried is games that need to be black-barred down to 16:9, that may be a bit ugly... It also picks up fingerprints like a mofo - I think I must have been used to light matte and this is suddenly full gloss. This has the 2nd-gen GSync module with a HDMI input for another non-GSync machine, which is nice, plus speakers and a USB hub if you care about that. The stand isn't good imo, but I haven't really tinkered with it much.

It does overclock to 100 Hz, no weird scan artifacts or anything. I'm not sure if I should keep it, or exchange it and roll the dice again.

Or option C is I'm still loving drooling over that Acer XR382CQK, it's basically the same price. It's a little bigger, has a slightly tighter curve, and kicks up the resolution to 1600p, but it's max 75 Hz and FreeSync. So on an NVIDIA GPU you'd be giving up adaptive sync for now. On the AMD side you'd need something Vega 56 caliber minimum, 64 (1080 performance) would be better, but I do not want to be involved in the AMD early-access circus. CrossFire Fury X or 580 would probably be decent too. Best of all that format basically happens to be native res for "cinematic 4K" content shot in wider aspects...

Man I wish monitor companies would lighten up on everything needing to be zero-bezel - my understanding is that traditional wide-bezel monitors suffer from this much less (Nixeus commented on this). At 34-38", ultrawide is finally to the point where you don't need 2 separate monitors so bezel size really doesn't matter that much to me, whereas BLB is obnoxious.

Excellent, thanks! 60Hz is totally fine, and high vs ultra isn't something that I'll cry about if a few titles are getting stutter. I already have the 1080, so sadly the Acer is out and yeah the current Vega situation is :staredog:

This setup will probably hold me over for 5 years, so I'm pretty psyched about that. I think I'm going to get a lot of utility out of Witcher 3 and Destiny 2 on that monitor. I'm glad I live up the street from Micro Center, too, so I can check them out in person.

Thanks again for the help!

Rosoboronexport
Jun 14, 2006

Get in the bath, baby!
Ramrod XTreme

Paul MaudDib posted:

Descent:FreeSpace was massively hyped and required hardware T&L.

It most definitely did not. FreeSpace 1 and 2 had Glide and D3D support. FS1 was released in 1998, FS2 in 1999, and the first T&L cards were only just out in 1999. I remember that Call of Duty 1 was one of the first games that required hardware T&L to run (and even then I think it could be bypassed).

The Mystique lacked many other things besides T&L support. Bilinear filtering and mipmapping were probably the most visible omissions, so you got surface pixelation, unlike on 3dfx, ATI and Nvidia cards. Matrox was hailed as the king of VGA quality, but their 3D features were constantly lagging behind speed-wise. The G400 finally had good performance and features, but it did not have T&L.

Then Matrox went on a hiatus and returned with the Matrox Parhelia during the GeForce4 Ti 4000/Radeon 9000 battle. It was slower than Nvidia's offerings and turned out not to be fully DX9-compliant in hardware, unlike what was advertised. The only standout feature it had was support for multi-monitor gaming, which took other manufacturers 7 years to catch up on (and before that you could use Matrox's TripleHead2Go/DualHead2Go, which would work on Nvidia/ATI hardware).

Silhouette Wires
May 9, 2014

Never Knows Best

Paul MaudDib posted:

Anyone remember Glide? Don't forget to check out Tachyon: The Fringe during the next Steam Sale, it's an open-world space sim with hours of content voiced by Bruce Campbell, brought to modern audiences thanks to nglide.

No joke my favorite video game growing up. I should do an LP at some point. :awesomelon: I still have that disk. No box though.

I'm seriously considering getting a 1080 ti even though I already have a nice freesync monitor. Oh well, better luck next time AMD

Truga
May 4, 2014
Lipstick Apathy

Paul MaudDib posted:

My dad replaced it with a Rage 128, which was eventually another casualty of how fast APIs were moving back then.

Rage 128 and then Radeon 7500 were probably the best cards I ever had, hands down. Many games in those days kept trying to use cool new tricks to implement fog, and those two didn't want to hear anything about that. So there was no fog. I'd snipe dudes forever, meanwhile I was hiding in distance fog for everyone else, it was the loving best.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

NewFatMike posted:

Excellent, thanks! 60Hz is totally fine, and high vs ultra isn't something that I'll cry about if a few titles are getting stutter. I already have the 1080, so sadly the Acer is out and yeah the current Vega situation is :staredog:

This setup will probably hold me over for 5 years, so I'm pretty psyched about that. I think I'm going to get a lot of utility out of Witcher 3 and Destiny 2 on that monitor. I'm glad I live up the street from Micro Center, too, so I can check them out in person.

Thanks again for the help!

Yeah complaining about having to drop from ultra to high settings if you want to chase framerate is very first-world-problem. A 1080 is good for 1440p UW-QHD, just not total overkill like it is with WQHD.

I am in the same boat. If I were building from scratch I'd probably go 1080 Ti but I already have the 1080 and it's not really much of an increment to merit an upgrade. Because I am a huge idiot I am half-entertaining the thought of doing 1080 SLI instead - rationally I know that it would be better to go 1080 Ti first but I've never done SLI. :shobon: (oddly enough I owned a pair of 780 Tis but never SLI'd them - the other card was in a portable Vive PC)

I said pretty much the same thing about a 980 Ti and ended up getting a deal on a 1080 that was too good to pass up ($430 - actually in hindsight it was pretty close to the historical low for that card given how the crypto-craze has trashed the GPU market). My current plan (given the state of the GPU market) is to try and hold out for the 1180 Ti. I'd rather be on the Ti cycle than on the x80 cycle - at the end of the day I don't really care all that much about efficiency if the performance is actually there, and the x80 Ti cards are always vastly more overbuilt in the things that help a card age well (RAM capacity, bandwidth, etc).

(Seriously if AMD could just get the performance up nobody would give a poo poo about the power. I have been running a generation behind to save money and get the overbuilt x80 Ti cards, so my cards lately have literally gone 280 (7950), 780 Ti, 980 Ti. I don't care about eating power if the performance is there, it's the fact that Vega is barely competing with a chip AMD has spent a year trashing as a "midrange product". Well, that and the pre-alpha quality drivers...)

But if I'm gonna be truly honest, over the last couple years I've been the gaming version of this comic.



"I have framerate to spare, I'm going to buy a better monitor." "Oh no, my framerate isn't locked at the limit, I need a faster GPU." "Oh nooooooo....."

Fortunately I buy low so I don't take too much of a bath if I flip my old gear relatively promptly. Using my 780 Tis during the Maxwell generation cost me roughly $15 each, which I consider exceptional for more than a year of use. My 980 Ti cost me $50 for a year, and that was more than planned due to Step Up falling through. I've tried out a couple high-end 27" monitors which cost me roughly $50 each to own/use for a year. Looking at it on a monthly basis, as long as you don't get screwed on the initial purchase, owning a high-end PC actually isn't too expensive relative to other hobbies, especially given the amount of use I get out of it. It works out to probably the same cost as premium cable with some movie channels and poo poo :shrug:
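
(If anyone wants to sanity-check the "buy low, flip promptly" math, it's just the buy/sell spread divided by how long you keep the card. The numbers below are made-up placeholders, not my actual purchase and resale prices.)

code:

# Net cost of a GPU you eventually resell: (buy - sell) spread over months owned.
# All prices/durations below are illustrative placeholders.
def monthly_cost(buy_price, resale_price, months_owned):
    return (buy_price - resale_price) / months_owned

# buying used/B-stock and flipping before the value craters
print(f"${monthly_cost(185, 170, 14):.2f}/mo")  # ~$1.07/mo
# buying a flagship at launch MSRP and holding it for two years
print(f"${monthly_cost(700, 400, 24):.2f}/mo")  # ~$12.50/mo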

Best advice though, if you know you're gonna want the 1080 Ti and would pull the trigger if you saw a great deal, then don't talk yourself down to the lower card. Just hang in there with what you've got and then wait for a deal on the thing you want. And again, that's where GSync comes in, you can ride out some pretty lovely framerates thanks to GSync's LFC mode if necessary. I could probably drive this panel with a 1060: it would suck rear end and I'd be running minimum settings but it should be doable if needed. Like if I wanna flip before a new generation, or if my main card shits itself.
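
(The reason GSync can ride out lousy framerates is basically frame multiplication - LFC. Here's a toy sketch of the idea, assuming a hypothetical 30 Hz panel minimum rather than any particular monitor's range: below the minimum, the driver just shows each frame enough times to land back in the supported window.)

code:

# Toy model of low-framerate compensation (LFC).
# PANEL_MIN_HZ is an assumed variable-refresh floor, not a real monitor's spec.
PANEL_MIN_HZ = 30

def lfc_multiplier(game_fps):
    # Below the panel's minimum refresh, repeat each frame n times so that
    # n * fps lands back inside the supported range.
    if game_fps >= PANEL_MIN_HZ:
        return 1
    n = 2
    while game_fps * n < PANEL_MIN_HZ:
        n += 1
    return n

for fps in (90, 45, 24, 12):
    n = lfc_multiplier(fps)
    print(f"{fps} fps -> show each frame {n}x -> panel refreshes at {fps * n} Hz")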

Paul MaudDib fucked around with this message at 15:17 on Aug 27, 2017

heated game moment
Oct 30, 2003

Lipstick Apathy
Let's see here..
Matrox M3D (PowerVR)
Voodoo 3
GeForce 3?
ti 4400
7800 GTX 256MB (then another one later for water-cooled SLI)
8800GTS 320
9600GT
4870 1gb
5870
7970
1070

I think of all these the 7800 GTX was the most expensive at around $600 followed by the 7970 and 1070.
May be leaving some things out from the old days but it's hard to remember back that far and all of those systems are long gone.

LOL I had almost forgotten how bad the 5000 series GeForce cards were:

quote:

The first family in the GeForce 6 product-line, the 6800 series catered to the high-performance gaming market. As the very first GeForce 6 model, the 16 pixel pipeline GeForce 6800 Ultra (NV40) was 2 to 2.5 times faster than Nvidia's previous top-line product (the GeForce FX 5950 Ultra), packed four times the number of pixel pipelines, twice the number of texture units and added a much improved pixel-shader architecture. Yet, the 6800 Ultra was fabricated on the same (IBM) 130 nanometer process node as the FX 5950, and it consumed slightly less power.

heated game moment fucked around with this message at 09:56 on Aug 27, 2017

orcane
Jun 13, 2012

Fun Shoe

Rosoboronexport posted:

It most definitely did not. Freespace 1 and 2 had Glide and D3D support. FS1 was released in 1998, FS2 in 1999 and first T&L cards were just out in 1999. I remember that Call of Duty 1 was one of the first games that required hardware T&L to run (and even then I think it could be bypassed)

What the Mystique lacked were many other things than T&L support. Bilinear filtering and mipmapping was probably the most visible problem so you had the surface pixelation unlike 3dfx, Ati and Nvidia cards. Matrox was hailed as the king of VGA quality an their 3D features were constantly lagging behind speed-wise. G400 finally had good performance and features but it did not feature T&L.

Then Matrox went on a hiatus and returned with Matrox Parhelia during the Geforce 4000/Radeon 9000-battle. It was slower than Nvidia and turned out not be hardware-wise fully compliant with DX9 unlike it was advertised. Only feature it had was support for multi-monitor gaming, which took 7 years from other manufacturers to catch up (and before that you could use Matrox's Triple/Dualhead2Go which would work on Nvidia/Ati hardware).

I had a G200 when it came out. It was good at D3D (slightly slower than a Riva TNT I think) but awful at OpenGL. Which bit me in the rear end, since I played a lot of games on the Quake 2/3 engine back then, frequently at sub-20 fps until I got a GeForce 2 MX.

Later, I had a PowerVR Kyro 2 for a while - performance was good with its deferred rendering, but it lacked hardware T&L and drivers lagged behind new titles more and more, which made them near unplayable until maybe a new driver fixed them.

The last card I had in that P2/P3 build was a GeForce 4 Ti 4200. The reference Ti 4200 launched later with a smaller, cheaper PCB than the Ti 4400/4600, but mine was some Asus deluxe model for which they used a Ti 4400 PCB and high clocks, so it was basically a cheaper Ti 4400. I recently threw it away together with the bonus 3D shutter goggles and AV connection box.

For my next PC I had a Radeon 9800 XT from Asus. Never had performance or driver issues as such, but it eventually died (thermal issues I think - I had problems with alpha textures flickering at factory overclock settings, which always went away if I clocked the card very slightly lower). Went through an X1650 (HIS with lovely DVI output) and then an X1950 before I replaced the entire computer (AGP video card + single core CPU :suicide:).

My computers since then always had Nvidia stuff. GTX 260 (died), GTX 560, GTX 980. And a 1050 Ti I put into a living room SFF PC.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Dislike button posted:

LOL I had almost forgotten how bad the 5000 series GeForce cards were:

The fact ANYONE bought those cards over the incredible-in-DX9 Radeons of the time is unbelievable.

HalloKitty fucked around with this message at 11:32 on Aug 27, 2017

Kazinsal
Dec 13, 2011
I'm finally at the rare point of having an honest-to-god CPU bottleneck in some games. I have a GTX 1070 and an i7-3820, and let me tell you, when you're trying to push 1080p144 that old doesn't-overclock-worth-a-drat Sandy Bridge-E is a problem even with DDR3-1866.

Seriously considering turning this into a stream processing box and going whole hog with an i7-8700K and 1080 Ti when it's out, but then I'll want to buy a 1440p144 monitor, and suddenly I will be just like Paul.

(Also, Voodoo 2 -> GeForce 440MX -> GeForce FX 5600 -> Mobility Radeon 3470 -> Radeon HD 6850M -> 2x Radeon HD 5850 -> R9 290 -> a different R9 290 -> GTX 1070.)
