Arzachel
May 12, 2012

Zero VGS posted:

I had two 770s that were just the same as the 280X; uncapped, they wouldn't break 520 watts in my system.

You might be on to something, though; I'll see if a hard cap is handled differently than the vsync cap. There's no reason it should be, but then again there's no reason frame limiting shouldn't be built into the Nvidia/AMD control panels either, those lazy fucks.

Have you tried undervolting the cards? Not sure about Hawaii but Tahiti cards tended to have pretty high stock voltages.


Vectorwulf
May 5, 2010
The card picking thread mentioned at the top of the OP seems to be archived. Where would be a good spot for a quick question about an upgrade?

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry

Humboldt Squid posted:

So, I think my trusty 560 Ti is on its last legs - after about 10 minutes of anything more graphically intensive than Terraria it crashes and the screen becomes multicolored blocks. Updating drivers didn't help, and turning off overclocking at least lets me use my PC for normal, non-gaming stuff, but I get the feeling it's a ticking time bomb, so it's upgrade time! I'm thinking about getting a 750 Ti to replace it - it's a little slower than Radeon's equivalent at that price, but I like the lower power consumption and reduced noise in comparison.
The difference in cost between the 1GB and 2GB models seems to be about $15-20; is there enough of a performance boost to make that worth it? I understand that Asus and EVGA are generally considered good bets, but are there any manufacturers that I should avoid?

I don't know about going from a 560 ti to a 750 ti, but I recently went from a 460 to a 750 ti and it was a moderate upgrade in performance.

Flagrama
Jun 19, 2010

Lipstick Apathy

Vectorwulf posted:

The card picking thread mentioned at the top of the OP seems to be archived. Where would be a good spot for a quick question about an upgrade?

The new parts picking thread is stickied to the top of SH/SC: http://forums.somethingawful.com/showthread.php?threadid=3623433

teh_Broseph
Oct 21, 2010

THE LAST METROID IS IN
CATTIVITY. THE GALAXY
IS AT PEACE...
Lipstick Apathy

Vectorwulf posted:

The card picking thread mentioned at the top of the OP seems to be archived. Where would be a good spot for a quick question about an upgrade?

You want the parts picking thread: http://forums.somethingawful.com/showthread.php?threadid=3623433&userid=0&perpage=40&pagenumber=1

Short answer RE: your other post: yeah, you're safe with a 760 or 770, but also check out the AMD 280/280X, as they trade blows but have been popping up a lot cheaper. The AMD cards can also be found used for cheap from mining rigs being dumped off, but make sure you can still use the warranty: https://docs.google.com/spreadsheet/ccc?key=0ArCtGVODPUTtdEdEUjRiSFdyckZ1Q1dGNUI3bkd5R1E#gid=0. Someone's been putting some up in SA-Mart for awesome prices that I wish I'd been psychic and known about before I bought a card a little while ago.

Vectorwulf
May 5, 2010
Thanks guys! :)

Shaocaholica
Oct 29, 2002

Fig. 5E
Ok gramps, I want to do something from the long long ago. For fun.

I have a Dell M70 laptop with a removable GPU. It's the same board as the Dell D810. Apparently it's possible to transplant a Dell 7800 GTX Go into this thing, and those GPUs are pretty cheap on eBay. I found an old, incomplete article in web archives about it:

http://web.archive.org/web/20060506140443/http://www.laptoplogic.com/resources/detail.php?id=31&PHPSESSID=9a35383d7fa4ddd9febf66e8acd4ffd1

http://web.archive.org/web/20070621103904/http://www.laptoplogic.com/resources/detail.php?id=31&page=2

Missing pages :(

http://web.archive.org/web/20070621103904/http://www.laptoplogic.com/resources/detail.php?id=31&page=8

However, even with the missing pages it's pretty easy to piece together the process. I even found a 3DMark submission from the guy, so I know what the target clocks and voltage are.

http://www.3dmark.com/3dm06/81605

I'm just not sure how to use those tools he mentions in page 2 of the article. I assume I'll have to use WinXP...and a floppy?

Edit: the reason for the underclock and undervolt is that the cooler in the M70/D810 isn't sufficient(?) to cool the 7800 GTX Go, and the stock cooler on the 7800 GTX Go won't fit into the M70/D810. Come to think of it, maybe this isn't a great idea.

Shaocaholica fucked around with this message at 18:54 on Jul 8, 2014

veedubfreak
Apr 2, 2005

by Smythe

Alereon posted:

It is true that there will be lower thermal resistance between the GPU and heatsink (or waterblock) than between the CPU and heatsink, especially versus a CPU like a non-delidded, pre-Devil's Canyon Haswell. Fundamentally, this means that at the same water temperature, the GPU will run much cooler than the CPU. Additionally, the GPU can tolerate running at a much higher temperature than the CPU, so you can get more total thermal dissipation out of your cooler before overheating, since the water temperature can be higher (and the thermal dissipation of the radiator depends on the temperature difference between it and ambient). The question is whether, for a given cooler, the lower thermal resistance of the GPU-waterblock interface is a bigger factor than the increased thermal load from the GPU vs the CPU. For a better water cooler it will be; I was wrong on this before, as higher-end single-fan units can cool even a high-end GPU very capably. However, I think with the H55 the cooling capacity is so low that it's going to stabilize at a point where either the noise is too loud or temperatures are too high/clockspeeds are too low.

Bonus Edit: For similar reasons, Arctic Cooling's coolers have always been very conservatively rated as far as their cooling capacity. For example, the Twin Turbo II was only rated at 250W and recommended for up to a GTX 770, but can handle a much higher load even with low noise levels and temperatures (they are officially compatible with GTX 780 Ti/Titan Black according to a news release). I'm not sure I'd want to try it with an overclocked R9 290(X), but I think you could overclock a 780 Ti capably.

To add to what he said: my CPU is heavily overclocked and is the first thing in the loop. At full tilt it gets up to around 70°C, while my video cards never get more than around 45-46°C. The new CPUs just have so much heat in such a small area that it's hard to get the heat out, versus the huge die of a GPU.

1gnoirents
Jun 28, 2014

hello :)
Cool, I was a bit worried. So the end result is what I thought, but for the wrong reasons. I'm dumping a lot of money into my car at the moment, but I think I'll be picking up an X40 and a second fan for it. I really hope I can figure out how to get the card to 1.21V too.

Sulphuric Sundae
Feb 10, 2006

You can't go in there.
Your father is dead.
So I went out and bought an XFX-branded R7 260x today to replace my Radeon HD 4830. It seems like a good deal and my new monitors don't go past 1080p anyway.
I get home and swap out the old card for the new. Turn it on. Neither of my monitors shows anything. I double-check the power cable, the connection, and even unhook other stuff from power to make sure it's getting enough juice. Reboot, still nothing. The fans are spinning on the card, but the computer doesn't seem to ever actually boot into Windows. I can only tell because the machine doesn't show up as online in Google Remote Desktop after waiting a bit.

I put the old card back in, and everything's fine. I exchange the new card for an identical one at the store, thinking it's just that particular one I bought. Nope. Exact same thing with the second card. I don't think the PSU's an issue, and my mobo should theoretically run the card fine.

So, GPU thread, can I get either:
- some ideas on other things to try because I'm pretty baffled, or
- an nVidia card under $130 that would be comparable to the 260x or at least a good step up from my Radeon HD 4830 for 1080p gaming

Thanks!

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Sulphuric Sundae posted:

So I went out and bought an XFX-branded R7 260x today to replace my Radeon HD 4830. It seems like a good deal and my new monitors don't go past 1080p anyway.
I get home and swap out the old for the new. Turn it on. Neither of my monitors show anything. I double check the power cable, the connection, and even unhook other stuff from power to make sure it's getting enough juice. Reboot, still nothing. The fans are spinning on the card, but the computer doesn't seem to ever actually boot up into Windows. I can only tell because the machine doesn't show up as online in Google Remote Desktop after waiting a bit.

I put the old card back in, and everything's fine. I exchange the new card for an identical one at the store, thinking it's just that particular one I bought. Nope. Exact same thing with the second card. I don't think PSU's an issue, and my mobo should theoretically run the card fine.

So, GPU thread, can I get either:
- some ideas on other things to try because I'm pretty baffled, or
- an nVidia card under $130 that would be comparable to the 260x or at least a good step up from my Radeon HD 4830 for 1080p gaming

Thanks!

- Does it have a Dual Bios switch? Try flipping the switch to the other position if so.

- Does anything show if you plug the monitor into the motherboard?

- Any pins bent on your monitor cables?

- Is your motherboard's BIOS updated to the absolute newest?

- Try hitting F8 for advanced bootup options and picking VGA Safe Mode if you're in Windows 7 or newer.

- With regards to your Nvidia question, I'm selling used 750 Ti cards with transferable warranty for $120 shipped.

Sulphuric Sundae
Feb 10, 2006

You can't go in there.
Your father is dead.

Zero VGS posted:

- Is your motherboard's BIOS updated to the absolute newest?

Oh man, thank you so much. I've never had to update my mobo's BIOS, so this didn't even cross my mind. The newest version of the BIOS on Gigabyte's site even said "support for more high-end video cards" in the description. Card's working fine now!

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

1gnoirents posted:

I can't remember, but I think vsync actually made my cards hotter (so more watts), while actual manual frame rate capping made them work less hard.

Either way, I don't think I'd be comfortable with 2x 290s on 620 watts. I wasn't really happy with 2x 770s on 750 watts.

Thanks, this wound up being the fix. Forcing a 60fps limit through MSI Afterburner / RivaTuner got my wattage down to around 600 instead of the crazy 750-ish it was pulling uncapped with vsync.

Of course, for some dumb reason the screen still tears like crazy with a 60fps limit even if it never drops a frame (it's been this way for me with both Nvidia and AMD), so for now I'm using the FPS cap and vsync together.

Hopefully Adaptive Sync will come through and let me turn off both.

One last thing: whoever told me that the R9 290 had crossfire completely sorted out even though 280X crossfire was a buggy shitfest for me was correct. It works perfectly, instead of the 280X's trifecta of missing textures, artifacting, and microstutter.
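The tearing-with-a-cap behavior can be sketched with a toy model (pure phase arithmetic, not how any driver actually works): a frame limiter fixes the interval between flips but not their phase relative to the monitor's scanout, so each flip still lands while the screen is mid-draw. The function below is purely illustrative.

```python
# Toy model: an fps cap fixes the frame *interval* but not the *phase*
# relative to the monitor's refresh scanout, so flips still land mid-frame.
REFRESH_HZ = 60.0
SCANOUT = 1.0 / REFRESH_HZ          # seconds per refresh (~16.67 ms)

def tear_positions(frame_interval, n_frames=6, start=0.004):
    """Fraction of the screen painted when each flip occurs (0=top, 1=bottom)."""
    positions = []
    t = start
    for _ in range(n_frames):
        phase = (t % SCANOUT) / SCANOUT   # where the scanout is at flip time
        positions.append(round(phase, 3))
        t += frame_interval
    return positions

# A 60fps limiter that isn't phase-locked: the tear sits at one fixed spot.
print(tear_positions(1 / 60))
# A 59fps cap: the tear line crawls down the screen once per second instead.
print(tear_positions(1 / 59))
```

Which is consistent with the report above: a cap alone moves or pins the tear line, but only vsync (or adaptive sync) actually aligns the flip with the scanout.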

td4guy
Jun 13, 2005

I always hated that guy.

Zero VGS posted:

Of course for some dumb reason the screen still tears like crazy with a 60fps limit even if it never drops a frame (it has been this way for me with both Nvidia and AMD) so for now I'm using FPS Cap and VSync together.
Have you tried 59fps instead?

SCheeseman
Apr 23, 2003

I have a 650w PSU and an i5-4670k (slightly overclocked, not over-volted) 3 HDDs and an SSD. No optical drive.

Is running crossfire 280X practical on this setup?

EDIT: Just remembered. One of the cards is actually a HD7950

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

td4guy posted:

Have you tried 59fps instead?

Yes, it was even worse. I still don't get the logic in trying that actually, even when people try to explain it.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

SwissCM posted:

I have a 650w PSU and an i5-4670k (slightly overclocked, not over-volted) 3 HDDs and an SSD. No optical drive.

Is running crossfire 280X practical on this setup?

EDIT: Just remembered. One of the cards is actually a HD7950

That should be fine; I ran two 280Xs on a quality 620 watt PSU with a 4770K overclocked to 4.5GHz, and with CPU/GPU/GPU all benchmarked to 100% load I was only drawing 550 watts from the wall.

It's once you get to 290 cards when poo poo gets real.

Edit: Kill-A-Watt meters from Home Depot give you a lot of insight and only cost $20. I recommend anyone buy one, or at least try one to test your rig and then return it.

Zero VGS fucked around with this message at 13:40 on Jul 9, 2014
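For anyone doing the same wall-meter math: a Kill-A-Watt reads AC draw at the wall, while a PSU's rating is DC output, so the two are related by the unit's efficiency. A quick sketch follows; the 0.87 efficiency figure is an assumption (roughly 80 Plus Gold territory), so substitute your own PSU's curve.

```python
# Rough PSU budgeting from a wall-meter reading. The 0.87 efficiency figure
# is an assumption, not a spec; check your own unit's 80 Plus rating.
def dc_load_from_wall(wall_watts, efficiency=0.87):
    """DC watts the PSU is actually delivering, given AC draw at the wall."""
    return wall_watts * efficiency

def headroom(psu_rating, wall_watts, efficiency=0.87):
    """Spare DC capacity left on the PSU, in watts."""
    return psu_rating - dc_load_from_wall(wall_watts, efficiency)

# 550 W at the wall on a 620 W unit, as in the post above:
load = dc_load_from_wall(550)       # ≈ 478 W of actual DC load
print(f"DC load ≈ {load:.0f} W, headroom ≈ {headroom(620, 550):.0f} W")
```

So a 550 W wall reading on a 620 W PSU is less scary than it looks at first glance, though it leaves little margin for a pair of hungrier 290s.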

1gnoirents
Jun 28, 2014

hello :)

Zero VGS posted:

Yes, it was even worse. I still don't get the logic in trying that actually, even when people try to explain it.

I'd try turning your settings way down, like painfully low, and see if it's resolved. 59fps definitely works for me (flipping between 59 and 60/61 is dramatic), but I know it definitely doesn't do anything for some. Perhaps it's just an SLI/Crossfire difference.

If turning the settings down low changes how bad the tearing is, then the cards are just working a little too hard to stay reasonably in sync. Or (I haven't tried this) reduce your fps cap to something really low, like 30fps, temporarily; that should accomplish the same test more easily.

Unfortunately I don't know what to tell you if nothing stops it, but at least it's something to work from.

Kalenden
Oct 30, 2012
Wait, this was mostly on the previous page, but is 70-80°C for a GPU considered close to unacceptable?

I've recently bought a shiny new laptop (Alienware 17) with an 880M, and the GPU goes to 94°C under load. I was worried and asked around (also on this site) if that was acceptable, and people told me it was.
It doesn't throttle down or become unreasonably hot to the touch, so I decided to just go with it. Should I fix this? If so, how? Will Dell replace the GPU or fix the thermal paste or whatever for issues like these?

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Kalenden posted:

Wait, this was mostly on the previous page, but is 70-80°C for a GPU considered close to unacceptable?

I've recently bought a shiny new laptop (Alienware 17) with an 880M, and the GPU goes to 94°C under load. I was worried and asked around (also on this site) if that was acceptable, and people told me it was.
It doesn't throttle down or become unreasonably hot to the touch, so I decided to just go with it. Should I fix this? If so, how? Will Dell replace the GPU or fix the thermal paste or whatever for issues like these?

80°C for Nvidia and 90°C for ATI are the factory throttle limits and perfectly safe. Laptop GPUs may be different.

Arzachel
May 12, 2012
Nvidia's mobile GPUs throttle at 95°C. Turbo uses up thermal headroom, so I wouldn't worry unless clocks take a large dip under sustained load. You'll probably want to repaste after a year or so, though.

Arzachel fucked around with this message at 15:32 on Jul 10, 2014

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
What's with the R7/R9 series of cards? I don't get ATI's market plan here, 7000-series cards were out for like six months and then rebranded, but kept the same chips? They also frequently seem to be a bit more expensive than the same card under a 7000-series label. First time I've ever been honestly confused by GPU marketing.

I'm also delighted with all the work that has gone into compute languages recently. CUDA Compute 3.5 on a 2600-core card is baller as gently caress, dynamic parallelism is a cool concept, and Jetson TK-1s sound like they'd be awesome clustered. 2GB of memory per device gets you a lot of GPU memory fast, 192 cores can do a lot of work, and you're only pulling 5-10W per device.

AMD meanwhile has gone full-bore with the GPU-on-die thing. Being able to dispatch GPU cores without having to copy data across the PCI-e bus clears up the major bottleneck in GPGPU programs. Nvidia's recent releases let you pretend that memory is unified, but it's still transferring things behind the scenes; there's no way to avoid that with two devices separated by a bus. It might be possible in an SoC like the K-1, but due to the separate CPU/GPU caches I don't know if it can do it.

I'm erogenous to heterogenous computing.

Paul MaudDib fucked around with this message at 00:09 on Jul 11, 2014
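The copy-over-the-bus bottleneck is easy to put rough numbers on. A back-of-envelope sketch (the ~12 GB/s practical PCIe 3.0 x16 figure is an assumption, not a measurement):

```python
# Back-of-envelope cost of shipping data across PCIe versus computing on it.
# The bandwidth constant is an assumed practical figure, not a measurement.
PCIE3_X16_GBS = 12.0  # assumed effective host<->device throughput in GB/s

def transfer_ms(megabytes, gbs=PCIE3_X16_GBS):
    """Milliseconds to move `megabytes` across the bus one way."""
    return megabytes / 1024 / gbs * 1000

# A 512 MB working set copied in and out swamps a few-millisecond kernel,
# which is the case the on-die/unified approaches are meant to eliminate.
round_trip = 2 * transfer_ms(512)
print(f"512 MB round trip ≈ {round_trip:.1f} ms")
```

Tens of milliseconds per round trip is why "unified" memory that secretly still copies only helps ergonomics, not throughput.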

beejay
Apr 7, 2002

Do you mean R9 series? Do you have an example in mind? I think the R9 cards that were rebrands of 7000 series cards have done nothing but go down in price.

Edit: Also, it's "AMD" not "ATI" now and nvidia rebadges too.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

beejay posted:

Do you mean R9 series? Do you have an example in mind? I think the R9 cards that were rebrands of 7000 series cards have done nothing but go down in price.

Edit: Also, it's "AMD" not "ATI" now and nvidia rebadges too.

Well, I got a 7850 pretty quickly after they came out for $150, and I frequently saw them for $180; when they switched to the R# nomenclature, the same money would only buy you a 77xx with a different sticker for a long, long time.

Now that I think about it, it's probably Bitcoin's fault.

vvv Yeah, it was a Black Friday sale (I guess that dates this to about 6 months after release), but I saw 7850s on sale for $170-200 regularly, and when they rebranded it jumped up into the $230-250 range, probably due to coins.

Paul MaudDib fucked around with this message at 00:23 on Jul 11, 2014

beejay
Apr 7, 2002

Yes, the various coins caused all AMD cards to be out of stock and/or priced higher for quite a while; I somehow forgot about that. But that seems to have passed now. However, if you got a 7850 soon after release for $150, you caught a hell of a sale.

Regardless, the technology side of video cards has been creeping along for a while, and we're in kind of a weird place right now, with rebadges and chips that used to cost $300+ going for $200ish. It is a strange time, but it should start picking up next year or so.

beejay fucked around with this message at 00:03 on Jul 11, 2014

1gnoirents
Jun 28, 2014

hello :)
I believe the first real official price drop on AMD stuff was very recently, maybe a week or two ago. I think the 280 non-X came a while before that, but that one was sorely needed, since it was a rebadge meant to take advantage of then-current prices. The rest was of course bitcoin price gouging from ~December (?) to March, then the great offloading. It was sad, because all along the MSRPs of the new AMD cards were priced to basically beat nvidia at nearly every level.

Arzachel
May 12, 2012
Cape Verde 7750/7770 -> 250X/255
Bonaire (GCN 1.1) 7790 -> 260/X
Pitcairn 7850 -> 265
Curacao (GCN 1.1) 270/X -> new chips, slightly higher-clocked Pitcairns with the 290/X feature set
Tahiti 7950/7970 -> 280/X
Hawaii (GCN 1.1) 290/X -> new chips

There was a time when you could snag a new 7950 for under $200, but then altcoins happened.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Paul MaudDib posted:

What's with the R7/R9 series of cards? I don't get ATI's market plan here, 7000-series cards were out for like six months and then rebranded, but kept the same chips? They also frequently seem to be a bit more expensive than the same card under a 7000-series label. First time I've ever been honestly confused by GPU marketing.

Firstly, the 7970 launched in January 2012 and was rebranded as the R9 280X (with tweaks) in October 2013. It would be an extreme exaggeration to call 21 months "6".
I totally agree that the R5/R7/R9 designation is pointless, and can be immediately ignored.

Secondly, NVIDIA's naming can be more confusing, but people often don't notice, since it's usually cards they don't give a poo poo about. (For example: Geforce 430, Geforce 430, Geforce 440, Geforce 440, Geforce 450, Geforce 450, Geforce GT 730, Geforce GT 730, Geforce GT 730, Geforce GT 740, Geforce GT 740. No mistake: they're all separate card configurations, sometimes with different architectures, with repeating names.)

Both can have misleading names, essentially, but I don't know where the idea comes from that AMD's naming is somehow worse; NVIDIA has re-used cards with new names several times too.

Every time you want to buy a card, you just have to put a bit of research into it; it's been like that as long as I can remember. For example, many moons ago, I bought a Geforce 7800 GS... but it wasn't. It was a Geforce 7800 GT, but had GS written on it because that's the only name NVIDIA seemed to allow for the AGP version; my card had a 7800 GT core on it, and performed like one.

HalloKitty fucked around with this message at 21:41 on Jul 11, 2014

Eddain
May 6, 2007
I'm still on a GTX 570, should I just get a 7xx card now or wait for the 8xx series? Most games run well enough at high settings right now but it does struggle for some stuff at max settings, so I'm in no real rush to upgrade. I plan to use the new card to play the games coming out in the near future (Dragon Age, Witcher 3).

JnnyThndrs
May 29, 2001

HERE ARE THE FUCKING TOWELS

Arzachel posted:

There was a time where you could snag a new 7950 for <200$ but then Altcoins happened.

Yeah, I got a factory-overclocked 7950 for $180 after MIR about a week before the R9 2xx rebranding, one of the only times my timing worked out well. It still runs everything at med-high settings @1440p, so I'm holding out for an upgrade until the next-gen stuff is released.

Pimpmust
Oct 1, 2008

Eddain posted:

I'm still on a GTX 570, should I just get a 7xx card now or wait for the 8xx series? Most games run well enough at high settings right now but it does struggle for some stuff at max settings, so I'm in no real rush to upgrade. I plan to use the new card to play the games coming out in the near future (Dragon Age, Witcher 3).

Probably better for you to wait a year or so for the next gen to get off the ground. I just upgraded from a 460 to a 770, but even I was tempted to "wait just another year", and the performance difference for me will be a lot bigger than for you doing something similar. It all depends on how much disposable money you've got for this little hobby, though.

My rule of thumb is to wait about 3 generations between card switches to stay somewhat economical about the whole thing, but that has really been a bit of a drag these past 4 years. Once the 8xx series rolls around, I think we'll be stuck with it and its descendants for a long while longer, short of GDDR6 showing up (supposedly with Maxwell?).

Pimpmust fucked around with this message at 15:35 on Jul 11, 2014

GokieKS
Dec 15, 2012

Mostly Harmless.
I don't know how anyone who has followed the GPU industry can think that nVidia isn't worse with their naming than ATI/AMD.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
I've got an MSI R9 290x for $320 in my SA Mart thread. Are you a bad enough dude to run it at 1080p???

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

GokieKS posted:

I don't know how anyone who has followed the GPU industry can think that nVidia isn't worse with their naming than ATI/AMD.

AMD always seemed to have too many number variations, whereas nvidia changes the first number every gen and then the next 2 numbers tell you where a card sits in the food chain.

GokieKS
Dec 15, 2012

Mostly Harmless.

Don Lapre posted:

AMD always seemed to have too many number variations, whereas nvidia changes the first number every gen and then the next 2 numbers tell you where a card sits in the food chain.

AMD is nowhere near the offender nVidia is, though, in terms of either using the same number for multiple different products or using multiple different numbers for the same product.

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
They're both loving terrible about it, ok

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Don Lapre posted:

AMD always seemed to have too many number variations, whereas nvidia changes the first number every gen and then the next 2 numbers tell you where a card sits in the food chain.

That's pretty much because NVIDIA adds a bunch of random letters instead of changing the numbers, such as Ti

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


GokieKS posted:

AMD is nowhere near the offender nVidia is, though, in terms of either using the same number for multiple different products or using multiple different numbers for the same product.

Most of it is out of the thread's purview, though: the worst of it is in worse-than-integrated desktop parts and the goddamn mess that is laptops (where you should be checking every part of the spec in detail before throwing your hands up and just [YOSPOS Macbook Air/Chromebook/Tablet meme]). I think the worst we've caught lately was that one 650ish card (which may have been a laptop card, come to think of it) that was still Fermi, and for all three companies it's all been "higher number = better than; if same number, then more stuff after number = better than" for a while.

Although having the 750/Ti be Maxwell will cause confusion for a bit.

dont be mean to me fucked around with this message at 23:27 on Jul 11, 2014

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Yeah, I've run into that too. The chip architecture is a good guideline but doesn't tell the whole picture. For example, the DDR3 and GDDR5 versions of the GT 640 are way different: the DDR3 version only supports CUDA Compute 2.1, while the GDDR5 version supports 3.0. I came to the same conclusion: you just need to examine whatever specific brand and model number really drat carefully, because there are a lot of small differences between cards.
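The GT 640 case is a good one to sanity-check with the standard peak-bandwidth formula: effective transfer rate x bus width / 8. The specs below are approximate launch figures quoted from memory, so treat them as illustrative rather than authoritative:

```python
# Why "same model number" can hide very different cards: peak memory
# bandwidth is (effective rate in MT/s) * (bus width in bits) / 8.
# The GT 640 figures below are approximate launch specs, from memory.
def mem_bandwidth_gbs(effective_mts, bus_bits):
    """Peak memory bandwidth in GB/s from effective rate and bus width."""
    return effective_mts * bus_bits / 8 / 1000

ddr3  = mem_bandwidth_gbs(1782, 128)   # GT 640 DDR3:  ~28.5 GB/s
gddr5 = mem_bandwidth_gbs(5000, 64)    # GT 640 GDDR5: ~40 GB/s
print(f"DDR3 {ddr3:.1f} GB/s vs GDDR5 {gddr5:.1f} GB/s")
```

Two cards sold under one name, yet roughly a 40% bandwidth gap before you even get to the compute-capability difference.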


GokieKS
Dec 15, 2012

Mostly Harmless.

Sir Unimaginative posted:

Most of it is out of the thread's purview, though: the worst of it is in worse-than-integrated desktop parts and the goddamn mess that is laptops (where you should be checking every part of the spec in detail before throwing your hands up and just [YOSPOS Macbook Air/Chromebook/Tablet meme]). I think the worst we've caught lately was that one 650ish card (which may have been a laptop card come to think of it) that was still Fermi and for all three companies it's all been "higher number = better than; if same number then more stuff after number = better than" for a while.

Although having 750/Ti being Maxwell will cause confusion for a bit.

Enthusiast parts aren't quite as bad in that the different parts aren't named EXACTLY the same, but 260/260 Core 216, 560/560 Ti/560 Ti with 448 Cores, and 650/650 Ti/650 Ti BOOST are pretty drat bad in terms of naming confusion. And then there was the G92, which went from 8800 to 9800 to GTS 240/250.

I mean, AMD is far from guilt-free in this, but I definitely think nVidia is a bigger offender, especially once you throw in the OEM and low-end parts (the current GT 730 situation is probably the worst example of terrible numbering ever).
