|
Well, realistically speaking and based on past experience, I would expect Nvidia to push ahead when they do push out a new round of cards. I'd expect something around Sept/Oct. If it matters, they'll sort out a GCN-style thing for their cards, or we might just find it's not such a big thing on most PC titles. With their partnerships with devs and extra GameWorks things, it's in their best interest to keep themselves competitive on all functions that matter. Apologies if this is a poorly written post, I'm on my phone at the in-laws.
|
# ? Apr 9, 2016 14:10 |
|
|
Quick question about mobile GPUs. I currently have a GTX 780M with 4GB GDDR5 memory, would a Radeon R9 M395X with 4GB GDDR5 memory be a big upgrade? Was wondering how far things have progressed since the 780M was considered one of the high-end mobile GPUs and if I'm correct in understanding that Nvidia will have some brand new mobile GPUs out sometime later this year? I'm in no rush so can happily wait and see, but I was just wondering if the M395X offers much more than the 780M and how much better the new cards later this year will be.
|
# ? Apr 9, 2016 15:20 |
|
DuckConference posted:I don't believe there are any actual duties on video cards, but even if there aren't, you still have to pay GST/PST on anything over $20 of declared value that you buy in the US You're right, sorry -- I was lumping taxes and duties together.
|
# ? Apr 9, 2016 15:35 |
|
Blackfyre posted:Well realistically speaking and based on past experience I would expect Nvidia to push ahead when they do push out a new round of cards. Would expect something around sept/oct. If it matters they'll sort a GCN style thing for their cards or we might just find its not such a big thing on most PC titles. With their partnerships with Devs and extra game works things it's in their best interest to keep themselves competitive on all functions that matter. I'd expect them to catch up. I'd be rather surprised if their first attempt at building GCN style cards was as good as AMD's fourth but they should be at least within striking distance. Their past few generations have been getting their gains in part by stripping stuff out that wasn't needed but now is, so I'd expect that to eat up some or all of the advantage they had relative to AMD when Maxwell launched.
|
# ? Apr 9, 2016 15:44 |
Corvo posted:Quick question about mobile GPUs. I currently have a GTX 780M with 4GB GDDR5 memory, would a Radeon R9 M395X with 4GB GDDR5 memory be a big upgrade? Was wondering how far things have progressed since the 780M was considered one of the high-end mobile GPUs and if I'm correct in understanding that Nvidia will have some brand new mobile GPUs out sometime later this year? I'm in no rush so can happily wait and see, but I was just wondering if the M395X offers much more than the 780M and how much better the new cards later this year will be. The M395X is about 40% faster, so it would be a significant bump. I'm guessing mobile GPUs will see a significant jump with the new chips because of the smaller process size and greater power efficiency, so you might want to wait and see.
|
|
# ? Apr 9, 2016 16:14 |
|
Corvo posted:Quick question about mobile GPUs. I currently have a GTX 780M with 4GB GDDR5 memory, would a Radeon R9 M395X with 4GB GDDR5 memory be a big upgrade? Was wondering how far things have progressed since the 780M was considered one of the high-end mobile GPUs and if I'm correct in understanding that Nvidia will have some brand new mobile GPUs out sometime later this year? I'm in no rush so can happily wait and see, but I was just wondering if the M395X offers much more than the 780M and how much better the new cards later this year will be. Nah, the M395X is just a rebranded Tonga from 2014, and those two should be fairly close. Not worth buying at a premium price when new stuff is around the corner. I'm sure mobile bins will benefit a lot from the die shrink.
|
# ? Apr 9, 2016 16:30 |
|
Today I got a GTX 970. I used the Nvidia driver uninstaller thingy to clean up my old drivers and uninstalled Asus GPUTweak before installing the new card, just in case. Installed without issues, tested it with some MAXED OUT games and it's a beauty; Dirt Rally was stunning and running smooth as butter.

I monitored the temps for a bit, and even while hitting 99% GPU usage the card never went above 70C (I think the actual max temp was 67), but afterwards I stopped playing and noticed it was idling at ~60. I checked my Task Manager to see if anything had exited incorrectly and was still running in the background, but nothing. I restarted the PC and now it's idling at ~40, but I noticed the fans aren't running; I googled a bit and got a lot of hits saying that 50~60 is a normal idle temp for this card and the fans not spinning is normal because they only kick in at ~67. Is this possible?

I went to GPUTweak to make a fan curve as I had with the 660ti, but I don't have that option anymore; now you have "auto" and "manual", but manual is just a fixed speed, and I don't have the little gear symbol anymore next to the manual button. I do have a "GPU Target Temp" that wasn't there with my previous card; it's defaulted to 79C and I left it there. The fans work fine if I hit manual and select any % bigger than 0, and if I leave it in Auto, it's always 0 until I run a game, when it looks like it's doing its job properly since, as I mentioned before, the temp never breaks 70.

Sorry if I don't make much sense and it's all normal for this card, I have the new piece of hardware nerves.
|
# ? Apr 9, 2016 20:17 |
|
Yeah the Asus and MSI GTX 970s don't turn on the fan by default until the temperature gets in the 60Cish range. The card will be perfectly happy to run fanless. Stock blower cooler runs the card at around 82-85C I think so it's nothing to worry about.
|
# ? Apr 9, 2016 20:23 |
|
Edmond Dantes posted:Sorry if I don't make much sense and it's all normal for this card, I have the new piece of hardware nerves. That all sounds normal for a good 970, the fans are pretty conservative because of how efficient Maxwell is. If your in-game temps are good, things are working correctly. There are some monitor configs (multiple, one or both > 60Hz) that cause NVIDIA cards to idle a bit high, but still shouldn't be enough to trigger fans if your case has decent airflow.
|
# ? Apr 9, 2016 20:25 |
|
EVGA and even Gigabyte got their act together on the idle fans with their newer revisions. I wish Gigabyte used bigger fans though; I just got a G1 980 Ti because it was the cheapest, and I'm not really a fan of Gigabyte's cooler. Bigger fans would have been appreciated even if it meant a longer card; the 92mm fans get audible. You should still be able to set custom fan curves, not sure about GPU Tweak but MSI Afterburner works. NVIDIA's stupid 4K + multi-monitor bug means my 980Ti and my 970 before that "idled" at 900MHz though, which is enough to get the fans to spin up most of the time.
|
# ? Apr 9, 2016 20:48 |
|
Desuwa posted:NVIDIA's stupid 4K + multi-monitor bug means my 980Ti and my 970 before that "idled" at 900mhz though, which is enough to get the fans to spin up most of the time. They're actually going to fix that, I hear.
|
# ? Apr 9, 2016 20:50 |
|
Subjunctive posted:They're actually going to fix that, I hear. Good. It's messed up that my 290 idles lower than my 970 did without workarounds.
|
# ? Apr 9, 2016 20:52 |
|
Paul MaudDib posted:No, DDR3 is a euphemism but it's always GDDR3. Graphics work would push DDR3 right to its limit let alone DDR2, GDDR3 is [simulated] dual-port and offers quad speed (dual speed in each direction) relative to DDR3's dual-speed. Or at least that's my understanding. What? No, the cards listed as using "DDR3" are actually using plain old DDR3, not GDDR3. If you look at the VRAM chips on any of the (terrible) nVidia cards with DDR3 that they keep rebadging and search the part numbers, you'll see they're actual DDR3 modules.
|
# ? Apr 9, 2016 20:59 |
|
Subjunctive posted:They're actually going to fix that, I hear. They've been "fixing" it for like a year now. I just said to hell with it and slapped an AIO + G10 on it. Got one of the Zalman 320s off Newegg for $30, and even with the fan set to below audible (via a fan controller) it never gets above 75C even overclocked running Furmark. Highly recommend.
|
# ? Apr 9, 2016 21:33 |
|
Subjunctive posted:They're actually going to fix that, I hear. In the latest VR drivers, it appears they have. My 980Ti finally idles at 405MHz instead of the 900 or so it was at before with my multi-monitor setup. It's nice to finally have that working, since it never did with the SLI 780s.
|
# ? Apr 9, 2016 22:03 |
|
Thanks for the 970 temp talk, it's a relief knowing it's not going to burst into flames. Now, time to reinstall half my Steam library and set everything to max. All of it.
|
# ? Apr 9, 2016 22:07 |
|
Edmond Dantes posted:I went to GPUTweak to make a fan curve as I had with the 660ti, but I don't have that option anymore, now you have "auto" and "manual", but manual is just a fixed speed, I don't have the little gear symbol anymore next to the manual button. I do have a "GPU Target Temp" that wasn't there with my previous card, it's defaulted to 79C and I left it there. The fans work fine if I hit manual and select any % bigger than 0, and if I leave it in Auto, it's always 0 until I run a game, when it looks like it's doing its job properly since as I mentioned before, the temp never breaks 70. I just went through this with my 780. (The card downclocks from my OC with the regular fan curve.) You have to click on the hexagon thing on the bottom left of the panel: That'll enable advanced mode to allow for fan curves.
|
# ? Apr 9, 2016 22:34 |
|
Rukus posted:I just went through this with my 780. (The card downclocks from my OC with the regular fan curve). You have to click on the hexagon thing on the bottom left of the panel: Aha! That did it; I saw you could add more tuning options in the "tune" tab of the settings and thought that was the advanced mode. I'll see how it handles temps the next few days and adjust accordingly if I need to, considering that even if it idles a bit higher than I'm used to, the top temp is still lower than my 660's with the fans going off like crazy. Thanks! :thipshat:
|
# ? Apr 9, 2016 22:56 |
|
Is GTX 1080 kind of a bad name? 1080p doesn't sound impressive now. I guess people buying GPUs are generally not totally clueless and won't get confused by it but still.
|
# ? Apr 10, 2016 15:23 |
|
sout posted:Is GTX 1080 kind of a bad name? 1080p doesn't sound impressive now. It probably won't be that; I mean, we had the 8800 etc. then went back to 280/290, so they'll probably just use a different numbering system.
|
# ? Apr 10, 2016 15:29 |
|
GTX 18. Next year we'll get the one with the titan chip in it, the GTX 18+.
|
# ? Apr 10, 2016 15:33 |
|
GTX Make Faster More Pretty
|
# ? Apr 10, 2016 15:54 |
|
Dead Goon posted:GTX Make Faster More Pretty Just take my credit card info, the cost is not important.
|
# ? Apr 10, 2016 15:57 |
|
HMS Boromir posted:GTX 18. Next year we'll get the one with the titan chip in it, the GTX 18+. 18, the Ti version is P18, the dual GPU card is P18D. Factory overclocks are underlined, you pronounce it two octaves lower. Elon has figured this out for them.
|
# ? Apr 10, 2016 16:00 |
|
GTX-Rated GTXXX-Rated is the Ti
|
# ? Apr 10, 2016 16:01 |
|
GTX 10 Benjamins
|
# ? Apr 10, 2016 17:17 |
|
sout posted:GTX-Rated GTX gon give it to ya
|
# ? Apr 10, 2016 17:44 |
|
GTXtreme GTXtreme Rush GTXtreme Hyper GTXtreme Warp
|
# ? Apr 10, 2016 17:54 |
|
GTX 4K16, GTX 2K16, GTX 1K16. Increment the second half with the year; 1K is in the ~$200 range, 2K is the 300-400 range, 4K is everything on up. Can tack a plus or a Ti on the end for the better card in the bracket. Bonuses: scales infinitely. Problems: rebranding churn and/or letting people know how old that card actually is.
|
# ? Apr 10, 2016 18:14 |
|
Dogen posted:GTX gon give it to ya never not funny
|
# ? Apr 10, 2016 18:18 |
|
xthetenth posted:Problems: rebranding churn and/or letting people know how old that card actually is. That's not exactly a negative as far as Nvidia is concerned.
|
# ? Apr 10, 2016 18:18 |
|
GokieKS posted:What? No, the cards listed as using "DDR3" are actually using plain old DDR3, not GDDR3 - If you look at the VRAM chips on any of the (terrible) nVidia cards with DDR3 they keep rebadging and search the part numbers, you'll see they're actual DDR3 modules.
|
# ? Apr 10, 2016 20:23 |
|
Both Nvidia and AMD have flip flopped between 3 and 4 digit numbering, so I would expect Pascal to be 2080.
|
# ? Apr 10, 2016 20:27 |
|
GokieKS posted:What? No, the cards listed as using "DDR3" are actually using plain old DDR3, not GDDR3 - If you look at the VRAM chips on any of the (terrible) nVidia cards with DDR3 they keep rebadging and search the part numbers, you'll see they're actual DDR3 modules.

No, some are actually using GDDR3. For example, here is a board shot of a Zotac GT640 Zone Edition (GK107) that is listed as using "DDR3". Closeup on the RAM chips: P/N 2IKI2 D9PRS is a gDDR3 chip from Micron (abbreviated as D9PRS): https://www.micron.com/~/media/documents/products/data-sheet/dram/ddr3/ddr3_2gb_graphics_addendum.pdf

Here's an OEM GT 640 (GF116) from NVIDIA: P/N k4w2g1646c-hc11 is a gDDR3 chip from Samsung: https://octopart.com/k4w2g1646c-hc11-samsung-30947808

On the other hand, an Asus GT 730 (GF106) does appear to be using Nanya nt5cb128m16hp-cg chips, which look like regular DDR3, so...

The reference GT640 (GK107) appears to use the same 2IKI2 D9PRS part. An AFOX GT 640 (GK107) uses Hynix h5tq2G63dfa; there's no direct reference, but a similar P/N is listed as DDR3 and appears to be used as cache on an SSD, so it's probably DDR3.

So either there are actually multiple versions of some chips (GK107 has both GDDR3 and DDR3, and also a GDDR5 version), or some manufacturers don't care to differentiate GDDR3 parts in their datasheets (Micron only designates them in a "gDDR3 addendum"), or "gDDR3" is not the same thing as "GDDR3", or they're relatively compatible (probably not)... Paul MaudDib fucked around with this message at 21:51 on Apr 10, 2016 |
# ? Apr 10, 2016 20:48 |
|
So my S2716DG is here. I don't think it looks as vivid as my P2715Q but I haven't fully tuned the settings yet. I installed the TFTCentral ICC profile (I think) but didn't really notice much of a difference. However, my 780 Ti is idling at 28% TDP now. I know NVIDIA was going to fix that for Maxwell, is there anything they can do about Kepler?
|
# ? Apr 10, 2016 23:40 |
|
Paul MaudDib posted:or "gDDR3" is not the same thing as "GDDR3" This is the actual answer. It looks like the OEMs call the DDR3 chips that are used in GPUs instead of DIMMs "gDDR3". The part number "MT41J128M16" listed in that PDF is the same as what's listed in their PDF for SDRAM used in DIMMs, and you can see the same specifications.
|
# ? Apr 10, 2016 23:59 |
|
GokieKS posted:This is the actual answer. It looks like the OEMs calls the DDR3 chips that are used in GPUs instead of DIMMs "gDDR3". The part number "MT41J128M16" listed in that PDF is the same as what's listed in their PDF for SDRAM used in DIMMs, and you can see the same specifications. It looks like what Micron is doing is saying "here's our DDR3 specs/modules, and here's some additional specifications that supersede some of those on the 'gDDR3'-series modules listed below", but I guess that also argues in favor of them being basically DDR3. Weird, but I guess that makes sense. Paul MaudDib fucked around with this message at 00:09 on Apr 11, 2016 |
# ? Apr 11, 2016 00:05 |
|
sout posted:Is GTX 1080 kind of a bad name? 1080p doesn't sound impressive now. They'd honestly be better off switching to 2K* and 4K*, with an "M" and "Ti/Titan" attached for mobile and high-end, respectively. So the lowest-end 4K part would be the Geforce 4K1 (or reverse the order so "1" is the highest, just to purposely confuse people). It'd also be more truth in advertising because some OEMs have begun putting 2GB GPUs into laptops with 48Hz 4K screens *cough* Lenovo *cough*. Or, even better, just make the Pascals the Geforce 2K1/4K1 for the first generation (even though I'm sure a 2K part would be a re-labeled Maxwell), with identifying characteristics that follow it. Then the next gen would be 2K2/4K2 (and maybe a 5K1). BIG HEADLINE fucked around with this message at 02:06 on Apr 11, 2016 |
# ? Apr 11, 2016 00:06 |
|
Paul MaudDib posted:It looks like what Micron is doing is saying "here's our DDR3 specs/modules, and here's some additional specifications that supercede some of those on the 'gDDR3'-series modules listed below" but I guess that also argues in favor of them being basically DDR3. Yes, the use of "gDDR3" is confusing when there's an actual GDDR3 (which precedes it, as GDDR3 is based on DDR2). The long and short of it for most people, though, is simple enough: don't buy a GPU with DDR3 if you care about performance and can help it.
|
# ? Apr 11, 2016 00:12 |
|
|
So wait, would DDR4 in fact make a decent enough performance boost over DDR3? Say a 2GB DDR4 @ 4000MHz card with a 128-bit bus; I'm likely miscalculating, but that should be ~64GB/s? There should be at least some PCB area and power savings, right?
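For what it's worth, that back-of-the-envelope math checks out: theoretical peak bandwidth is just bus width (bits) times effective transfer rate, divided by 8 to get bytes. A quick Python sketch (function name made up for illustration):

```python
# Theoretical peak memory bandwidth: bus width (bits) x effective
# transfer rate (MT/s), divided by 8 to convert bits to bytes,
# then by 1000 to go from MB/s to GB/s.
def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_mts: int) -> float:
    return bus_width_bits * transfer_rate_mts / 8 / 1000

print(peak_bandwidth_gbs(128, 4000))  # hypothetical DDR4-4000 on a 128-bit bus -> 64.0 GB/s
print(peak_bandwidth_gbs(128, 1800))  # DDR3-1800 on the same bus -> 28.8 GB/s
```

So over the same 128-bit bus, DDR4-4000 would roughly double what typical DDR3 cards manage, though still well short of what GDDR5 delivers.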
|
# ? Apr 11, 2016 00:50 |