Blackfyre
Jul 8, 2012

I want wings.
Realistically speaking, and based on past experience, I'd expect Nvidia to push ahead when they do put out a new round of cards, probably around September/October. If it matters, they'll sort out a GCN-style feature set for their cards, or we might just find it's not such a big thing on most PC titles. With their developer partnerships and extra GameWorks things, it's in their best interest to keep themselves competitive on all the features that matter.

Apologies if this is a poorly written post; I'm on my phone at the in-laws'.


Corvo
Feb 5, 2015

Quick question about mobile GPUs. I currently have a GTX 780M with 4GB of GDDR5 memory; would a Radeon R9 M395X with 4GB of GDDR5 be a big upgrade? I was wondering how far things have progressed, since the 780M was considered one of the high-end mobile GPUs, and whether I'm right in understanding that Nvidia will have some brand-new mobile GPUs out later this year. I'm in no rush, so I can happily wait and see, but I was wondering whether the M395X offers much more than the 780M, and how much better the new cards later this year will be.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

DuckConference posted:

I don't believe there are any actual duties on video cards, but even if there aren't, you still have to pay GST/PST on anything over $20 of declared value that you buy in the US.

You're right, sorry -- I was lumping taxes and duties together.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Blackfyre posted:

Realistically speaking, and based on past experience, I'd expect Nvidia to push ahead when they do put out a new round of cards, probably around September/October. If it matters, they'll sort out a GCN-style feature set for their cards, or we might just find it's not such a big thing on most PC titles. With their developer partnerships and extra GameWorks things, it's in their best interest to keep themselves competitive on all the features that matter.

Apologies if this is a poorly written post; I'm on my phone at the in-laws'.

I'd expect them to catch up. I'd be rather surprised if their first attempt at building GCN-style cards was as good as AMD's fourth, but they should be at least within striking distance. Their past few generations have gotten their gains in part by stripping out hardware that wasn't needed then but is now, so I'd expect that to eat up some or all of the advantage they had relative to AMD when Maxwell launched.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Corvo posted:

Quick question about mobile GPUs. I currently have a GTX 780M with 4GB of GDDR5 memory; would a Radeon R9 M395X with 4GB of GDDR5 be a big upgrade? I was wondering how far things have progressed, since the 780M was considered one of the high-end mobile GPUs, and whether I'm right in understanding that Nvidia will have some brand-new mobile GPUs out later this year. I'm in no rush, so I can happily wait and see, but I was wondering whether the M395X offers much more than the 780M, and how much better the new cards later this year will be.

The M395X is about 40% faster, so it would be a significant bump. I'm guessing mobile GPUs will see a big jump with the new chips because of the smaller process size and greater power efficiency, so you might want to wait and see.

sauer kraut
Oct 2, 2004

Corvo posted:

Quick question about mobile GPUs. I currently have a GTX 780M with 4GB of GDDR5 memory; would a Radeon R9 M395X with 4GB of GDDR5 be a big upgrade? I was wondering how far things have progressed, since the 780M was considered one of the high-end mobile GPUs, and whether I'm right in understanding that Nvidia will have some brand-new mobile GPUs out later this year. I'm in no rush, so I can happily wait and see, but I was wondering whether the M395X offers much more than the 780M, and how much better the new cards later this year will be.

Nah, the M395X is just a rebranded Tonga from 2014, and those two should be fairly close. Not worth buying at a premium price when new stuff is around the corner.
I'm sure mobile bins will benefit a lot from the die shrink.

Edmond Dantes
Sep 12, 2007

Reactor: Online
Sensors: Online
Weapons: Online

ALL SYSTEMS NOMINAL
Today I got a GTX 970. I used the Nvidia driver uninstaller utility to clean up my old drivers and uninstalled Asus GPU Tweak before installing the new card, just in case. It installed without issues; I tested it with some MAXED OUT :pcgaming: games and it's a beauty. Dirt Rally was stunning and running smooth as butter.

I monitored the temps for a bit, and even while hitting 99% GPU usage the card never went above 70C (I think the actual max was 67C), but after I stopped playing I noticed it was idling at ~60C. I checked Task Manager to see if anything had exited incorrectly and was still running in the background, but nothing. I restarted the PC and now it's idling at ~40C, but I noticed the fans aren't running. I googled a bit and got a lot of hits saying that 50-60C is a normal idle temp for this card, and that the fans not spinning is normal because they only kick in at ~67C. Is this possible? :psyduck:

I went into GPU Tweak to make a fan curve like I had with the 660 Ti, but I don't have that option anymore. Now there's just "Auto" and "Manual", and Manual is only a fixed speed; I don't have the little gear symbol next to the Manual button anymore. I do have a "GPU Target Temp" setting that wasn't there with my previous card; it defaults to 79C and I left it there. The fans work fine if I hit Manual and select any percentage above 0, and if I leave it on Auto it stays at 0 until I run a game, at which point it seems to do its job properly, since as I mentioned the temp never breaks 70.
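For anyone wondering what a fan curve actually computes: it's just piecewise-linear interpolation from temperature to fan duty. A minimal sketch in Python (the breakpoints below are made up for illustration, not GPU Tweak's defaults):

```python
# Fan curve as (temperature C, fan duty %) breakpoints.
# These numbers are illustrative, not any vendor's actual defaults.
CURVE = [(30, 0), (60, 30), (70, 50), (80, 80), (90, 100)]

def fan_duty(temp_c):
    """Interpolate fan duty (%) for a temperature, clamping at both ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(25))  # 0: below the first breakpoint, fans stay off
print(fan_duty(65))  # 40.0: halfway between (60, 30) and (70, 50)
print(fan_duty(95))  # 100: clamped at the top
```

The "Auto with no gear icon" mode is effectively a fixed curve like this that you can't edit until advanced mode is enabled.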

Sorry if I don't make much sense and this all turns out to be normal for the card; I have new-hardware nerves. :ohdear:

MagusDraco
Nov 11, 2011

even speedwagon was trolled
Yeah, the Asus and MSI GTX 970s don't turn the fans on by default until the temperature gets into the 60C-ish range. The card is perfectly happy to run fanless at idle; the stock blower cooler runs the card at around 82-85C, I think, so it's nothing to worry about.
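The semi-passive behavior is basic hysteresis: the fans stay off until one threshold, then stay on until the temperature falls below a lower one, so they don't flap on and off around a single trigger point. A rough sketch (the thresholds are guesses for illustration, not Asus/MSI firmware values):

```python
FAN_ON_C = 67   # spin up at or above this (guessed, not the real firmware value)
FAN_OFF_C = 55  # spin down again only at or below this

def update_fan(temp_c, fan_running):
    """Return whether the fan should run, with hysteresis between thresholds."""
    if not fan_running and temp_c >= FAN_ON_C:
        return True
    if fan_running and temp_c <= FAN_OFF_C:
        return False
    return fan_running  # between thresholds: keep doing whatever we were doing

state = False
for t in (40, 68, 60, 55, 40):  # heat up under load, then cool back down
    state = update_fan(t, state)
    print(t, state)
```

This is why the card sits at 60C idle with the fans off: it cooled past the spin-up point but never needs to spin up again.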

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Edmond Dantes posted:

Sorry if I don't make much sense and this all turns out to be normal for the card; I have new-hardware nerves. :ohdear:

That all sounds normal for a good 970; the fans are pretty conservative because of how efficient Maxwell is. If your in-game temps are good, things are working correctly.

There are some monitor configurations (multiple monitors, with one or both above 60Hz) that cause NVIDIA cards to idle a bit high, but that still shouldn't be enough to trigger the fans if your case has decent airflow.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!
EVGA and even Gigabyte got their act together on idle fan-stop with their newer revisions. I wish Gigabyte used bigger fans, though; I just got a G1 980 Ti because it was the cheapest, and I'm not really a fan of Gigabyte's cooler. Bigger fans would have been appreciated even if it meant a longer card, since the 92mm fans get audible.

You should still be able to set custom fan curves; I'm not sure about GPU Tweak, but MSI Afterburner works.

NVIDIA's stupid 4K + multi-monitor bug means my 980 Ti, and my 970 before it, "idled" at 900MHz, which is enough to get the fans to spin up most of the time.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Desuwa posted:

NVIDIA's stupid 4K + multi-monitor bug means my 980 Ti, and my 970 before it, "idled" at 900MHz, which is enough to get the fans to spin up most of the time.

They're actually going to fix that, I hear.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Subjunctive posted:

They're actually going to fix that, I hear.

Good. It's messed up that my 290 idles lower than my 970 did without workarounds.

GokieKS
Dec 15, 2012

Mostly Harmless.

Paul MaudDib posted:

No, "DDR3" is a euphemism: it's always GDDR3. Graphics work would push DDR3 right to its limit, let alone DDR2; GDDR3 is [simulated] dual-port and offers quad speed (dual speed in each direction) relative to DDR3's dual speed. Or at least that's my understanding.

What? No, the cards listed as using "DDR3" are actually using plain old DDR3, not GDDR3. If you look at the VRAM chips on any of the (terrible) nVidia cards with DDR3 that they keep rebadging and search for the part numbers, you'll see they're actual DDR3 modules.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Subjunctive posted:

They're actually going to fix that, I hear.

They've been "fixing" it for like a year now.

I just said to hell with it and slapped an AIO + G10 on it. Got one of the Zalman 320s off Newegg for $30, and even with the fan set below audible (via a fan controller) it never gets above 75C, even overclocked and running FurMark. Highly recommend.

EdEddnEddy
Apr 5, 2012



Subjunctive posted:

They're actually going to fix that, I hear.

In the latest VR drivers, it appears they have. My 980 Ti finally idles at 405MHz instead of the ~900MHz it sat at before with my multi-monitor setup.

It's nice to finally have that working, since it never did with the SLI 780s.

Edmond Dantes
Sep 12, 2007

Reactor: Online
Sensors: Online
Weapons: Online

ALL SYSTEMS NOMINAL
Thanks for the 970 temp talk; it's a relief knowing it's not going to burst into flames. :v:

Now, time to reinstall half my steam library and set everything to max. All of it.

Rukus
Mar 13, 2007

Hmph.

Edmond Dantes posted:

I went into GPU Tweak to make a fan curve like I had with the 660 Ti, but I don't have that option anymore. Now there's just "Auto" and "Manual", and Manual is only a fixed speed; I don't have the little gear symbol next to the Manual button anymore. I do have a "GPU Target Temp" setting that wasn't there with my previous card; it defaults to 79C and I left it there. The fans work fine if I hit Manual and select any percentage above 0, and if I leave it on Auto it stays at 0 until I run a game, at which point it seems to do its job properly, since as I mentioned the temp never breaks 70.

I just went through this with my 780 (the card downclocks from my OC with the regular fan curve). You have to click on the hexagon thing on the bottom left of the panel:


That'll enable advanced mode to allow for fan curves.

Edmond Dantes
Sep 12, 2007

Reactor: Online
Sensors: Online
Weapons: Online

ALL SYSTEMS NOMINAL

Rukus posted:

I just went through this with my 780 (the card downclocks from my OC with the regular fan curve). You have to click on the hexagon thing on the bottom left of the panel:


That'll enable advanced mode to allow for fan curves.

Aha! That did it. I saw you could add more tuning options in the "Tune" tab of the settings and thought that was the advanced mode. I'll see how it handles temps over the next few days and adjust if I need to; even if it idles a bit higher than I'm used to, the top temp is still lower than my 660 Ti's with its fans going off like crazy.

Thanks! :tipshat:

sout
Apr 24, 2014

Is GTX 1080 kind of a bad name? 1080p doesn't sound impressive now.
I guess people buying GPUs are generally not totally clueless and won't get confused by it but still.

Blackfyre
Jul 8, 2012

I want wings.

sout posted:

Is GTX 1080 kind of a bad name? 1080p doesn't sound impressive now.
I guess people buying GPUs are generally not totally clueless and won't get confused by it but still.

It probably won't be that. I mean, we had the 8800 series and then went back to 280/290, so they'll probably just use a different numbering system.

HMS Boromir
Jul 16, 2011

by Lowtax
GTX 18. Next year we'll get the one with the titan chip in it, the GTX 18+.

Dead Goon
Dec 13, 2002

No Obvious Flaws



GTX Make Faster More Pretty

Panty Saluter
Jan 17, 2004

Making learning fun!

Dead Goon posted:

GTX Make Faster More Pretty

Just take my credit card info, the cost is not important.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

HMS Boromir posted:

GTX 18. Next year we'll get the one with the titan chip in it, the GTX 18+.

18, the Ti version is P18, the dual GPU card is P18D. Factory overclocks are underlined, you pronounce it two octaves lower. Elon has figured this out for them.

sout
Apr 24, 2014

GTX-Rated
GTXXX-Rated is the Ti

fozzy fosbourne
Apr 21, 2010

GTX 10 Benjamins

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

sout posted:

GTX-Rated
GTXXX-Rated is the Ti

GTX gon give it to ya

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
GTXtreme
GTXtreme Rush
GTXtreme Hyper
GTXtreme Warp

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

GTX 4K16
GTX 2K16
GTX 1K16

Increment the second half with the year; 1K is the ~$200 range, 2K is the $300-400 range, and 4K is everything on up. You can tack a plus or a Ti on the end for the better card in a bracket.

Bonuses: scales infinitely.
Problems: rebranding churn and/or letting people know how old that card actually is.
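For whatever it's worth, the scheme above is concrete enough to implement. A joke sketch (the price cutoffs are the ones from the post; the Ti suffix stands in for "plus or Ti"):

```python
def gtx_name(price_usd, year, better_in_bracket=False):
    """Name a card under the bracket-plus-year scheme: 1K ~$200, 2K $300-400, 4K above."""
    if price_usd > 400:
        bracket = "4K"
    elif price_usd >= 300:
        bracket = "2K"
    else:
        bracket = "1K"
    suffix = " Ti" if better_in_bracket else ""
    return f"GTX {bracket}{year % 100}{suffix}"

print(gtx_name(650, 2016))        # GTX 4K16
print(gtx_name(330, 2016, True))  # GTX 2K16 Ti
print(gtx_name(200, 2017))        # GTX 1K17
```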

Panty Saluter
Jan 17, 2004

Making learning fun!

Dogen posted:

GTX gon give it to ya

never not funny

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

xthetenth posted:

Problems: rebranding churn and/or letting people know how old that card actually is.

That's not exactly a negative as far as Nvidia is concerned.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

GokieKS posted:

What? No, the cards listed as using "DDR3" are actually using plain old DDR3, not GDDR3. If you look at the VRAM chips on any of the (terrible) nVidia cards with DDR3 that they keep rebadging and search for the part numbers, you'll see they're actual DDR3 modules.

Indeed. GDDR5 is based on DDR3, while GDDR3 is based on DDR2, so lower-end versions of GDDR5 cards use DDR3 instead. This absolutely murders performance, since memory bandwidth ends up slower than onboard video's.
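To put rough numbers on "murders performance": peak memory bandwidth is data rate times bus width in bytes. The data rates below are typical ballpark figures for illustration, not specs of any particular card:

```python
def bandwidth_gbs(data_rate_mtps, bus_bits):
    """Peak memory bandwidth in GB/s: (transfers/sec) * (bytes per transfer)."""
    return data_rate_mtps * 1e6 * bus_bits / 8 / 1e9

print(bandwidth_gbs(1800, 64))   # 14.4 - low-end DDR3 card on a 64-bit bus
print(bandwidth_gbs(5000, 128))  # 80.0 - modest GDDR5 card on a 128-bit bus
```

Even a narrow GDDR5 configuration is several times faster, which is why the DDR3 variants of the same GPU benchmark so poorly.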

Josh Lyman
May 24, 2009


Both Nvidia and AMD have flip-flopped between 3- and 4-digit numbering, so I would expect Pascal to be the 2080.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

GokieKS posted:

What? No, the cards listed as using "DDR3" are actually using plain old DDR3, not GDDR3. If you look at the VRAM chips on any of the (terrible) nVidia cards with DDR3 that they keep rebadging and search for the part numbers, you'll see they're actual DDR3 modules.

No, some are actually using GDDR3. For example, here is a board shot of a Zotac GT 640 Zone Edition (GK107) that is listed as using "DDR3":



Closeup on the RAM chips:



P/N 2IKI2 D9PRS is a gDDR3 chip from Micron (D9PRS is the Micron FBGA code):

https://www.micron.com/~/media/documents/products/data-sheet/dram/ddr3/ddr3_2gb_graphics_addendum.pdf

Here's an OEM GT 640 (GF116) from NVIDIA:



P/N k4w2g1646c-hc11 is a gDDR3 chip from Samsung:

https://octopart.com/k4w2g1646c-hc11-samsung-30947808

On the other hand, an Asus GT 730 (GF106) does appear to be using Nanya nt5cb128m16hp-cg chips, which look like regular DDR3, so...



Reference GT640 (GK107) appears to use the same 2IKI2 D9PRS part:



An AFOX GT 640 (GK107) uses Hynix h5tq2G63dfa; there's no direct reference, but a similar P/N is listed as DDR3, and it appears to be used as cache on an SSD, so it's probably DDR3.



So either there are actually multiple versions of some chips (GK107 has both gDDR3 and DDR3 boards, plus a GDDR5 version), or some manufacturers don't bother to differentiate gDDR3 parts in their datasheets (Micron only designates them in a "gDDR3 addendum"), or "gDDR3" is not the same thing as "GDDR3", or they're relatively compatible (probably not)...
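Summing up the board-shot detective work above as data (only the parts identified in this post, with "gDDR3" kept distinct from GDDR3 on purpose):

```python
# VRAM part numbers observed above, per the identifications in this post.
OBSERVED = {
    "2IKI2 D9PRS":      ("Micron",  "gDDR3"),            # Zotac GT 640 Zone, reference GT 640
    "K4W2G1646C-HC11":  ("Samsung", "gDDR3"),            # OEM GT 640
    "NT5CB128M16HP-CG": ("Nanya",   "DDR3"),             # Asus GT 730
    "H5TQ2G63DFA":      ("Hynix",   "DDR3 (probably)"),  # AFOX GT 640
}

for part, (vendor, kind) in OBSERVED.items():
    print(f"{part}: {vendor} {kind}")
```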

Paul MaudDib fucked around with this message at 21:51 on Apr 10, 2016

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
So my S2716DG is here. I don't think it looks as vivid as my P2715Q, but I haven't fully tuned the settings yet. I installed the TFTCentral ICC profile (I think) but didn't really notice much of a difference.

However, my 780 Ti is idling at 28% TDP now. I know NVIDIA was going to fix that for Maxwell; is there anything they can do about Kepler?

GokieKS
Dec 15, 2012

Mostly Harmless.

Paul MaudDib posted:

or "gDDR3" is not the same thing as "GDDR3"

This is the actual answer. It looks like the memory manufacturers call the DDR3 chips used on GPUs, rather than DIMMs, "gDDR3". The part number MT41J128M16 listed in that PDF is the same as the one in their PDF for SDRAM used in DIMMs, and you can see the same specifications.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

GokieKS posted:

This is the actual answer. It looks like the memory manufacturers call the DDR3 chips used on GPUs, rather than DIMMs, "gDDR3". The part number MT41J128M16 listed in that PDF is the same as the one in their PDF for SDRAM used in DIMMs, and you can see the same specifications.

It looks like what Micron is doing is saying "here are our DDR3 specs/modules, and here are some additional specifications that supersede some of those for the 'gDDR3'-series modules listed below", which I guess also argues in favor of them being basically DDR3.

Weird, but I guess that makes sense.

Paul MaudDib fucked around with this message at 00:09 on Apr 11, 2016

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

sout posted:

Is GTX 1080 kind of a bad name? 1080p doesn't sound impressive now.
I guess people buying GPUs are generally not totally clueless and won't get confused by it but still.

They'd honestly be better off switching to 2K* and 4K*, with an "M" and "Ti/Titan" attached for mobile and high-end, respectively.

So the lowest-end 4K part would be the Geforce 4K1 (or reverse the order so "1" is the highest, just to purposely confuse people). It'd also be more truth in advertising because some OEMs have begun putting 2GB GPUs into laptops with 48Hz 4K screens *cough* Lenovo *cough*.

Or, even better, just make the Pascals the Geforce 2K1/4K1 for the first generation (even though I'm sure a 2K part would be a re-labeled Maxwell), with identifying characteristics that follow it. Then the next gen would be 2K2/4K2 (and maybe a 5K1).

BIG HEADLINE fucked around with this message at 02:06 on Apr 11, 2016

GokieKS
Dec 15, 2012

Mostly Harmless.

Paul MaudDib posted:

It looks like what Micron is doing is saying "here are our DDR3 specs/modules, and here are some additional specifications that supersede some of those for the 'gDDR3'-series modules listed below", which I guess also argues in favor of them being basically DDR3.

Weird, but I guess that makes sense.

Yes, the use of "gDDR3" is confusing when there's an actual GDDR3 (which predates it, since GDDR3 is based on DDR2). The long and short of it for most people, though, is simple enough: don't buy a GPU with DDR3 if you care about performance or can help it.


EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
So wait, would DDR4 in fact make a decent performance boost over DDR3? Say a 2GB DDR4 card at 4000MT/s with a 128-bit bus; I'm likely miscalculating, but that should be ~64GB/s? There should be at least some PCB area and power savings too, right?
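The arithmetic checks out: bandwidth is data rate times bus width in bytes. A quick sanity check of the 64GB/s figure (4000MT/s is the post's hypothetical, not a shipping graphics part):

```python
data_rate_mtps = 4000  # the post's hypothetical DDR4 speed, in MT/s
bus_bits = 128         # the post's hypothetical bus width

# transfers/sec * bytes per transfer, expressed in GB/s
bandwidth_gbs = data_rate_mtps * 1e6 * bus_bits / 8 / 1e9
print(bandwidth_gbs)  # 64.0
```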
