Nitramster
Mar 10, 2006
THERE'S NO TIME!!!
Thanks, I was able to get the EVGA Superclocked 1070 with 8 GB of RAM for just over $400. Installing it tonight; can't wait to see a game played at 2K at 144 Hz!

1gnoirents
Jun 28, 2014

hello :)
After 9 months or so, this Gigabyte G1 1070 I have is starting to make really obnoxious coil whine (currently just at high menu FPS). What's troubling is that it didn't use to do this; it was very quiet, if it happened at all. I've never had a GPU go from no whine to whine. I learned something today!!

I'd frame limit, but Fast Sync works very well and I'd prefer not to.

Anime Schoolgirl
Nov 28, 2002

I use both a frame limit (2x the highest refresh rate, e.g. a 288 fps cap on a 144 Hz panel) and Fast Sync with no problems.

Grundulum
Feb 28, 2006

Anime Schoolgirl posted:

I use both a frame limit (2x the highest refresh rate, e.g. a 288 fps cap on a 144 Hz panel) and Fast Sync with no problems.

Why double the refresh rate? Why not use exactly the monitor's refresh rate as your limit?

Anime Schoolgirl
Nov 28, 2002

Grundulum posted:

Why double the refresh rate? Why not use exactly the monitor's refresh rate as your limit?
You often get a tearing line right in the middle of your screen when you do that, even with Fast Sync.

SwissArmyDruid
Feb 14, 2014

by sebmojo
About to do a swap between the AMD M5100 and an Nvidia M2000M in my workstation laptop.

Wish me luck, and hope that it doesn't take more than DDU and new drivers.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Paul MaudDib posted:

Have you run the numbers on how much this is going to cost you per month? Take the hours per month you game with the card drawing full power, multiply by its wattage, divide by 1,000, then multiply by your price per kilowatt-hour. You'll probably spend at least $75-100 to swap to a 1070 with equivalent performance, let alone a 1080.

Reducing wattage/heat isn't always about saving money on power. Being able to run the fans slower and cut the noise also matters to a lot of people. For instance, I bother water-cooling my 1080 even though it's absolutely overkill, simply because I enjoy being able to game on a nearly silent computer. More heat makes that harder to do.
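
For anyone who wants to actually run the quoted math, here's a minimal Python sketch. The hours, wattage, and electricity rate below are illustrative assumptions, not numbers from the thread.

[code]
# Per-month cost of a GPU's gaming power draw, per the math quoted above.

def monthly_power_cost(hours_per_month: float,
                       card_watts: float,
                       price_per_kwh: float) -> float:
    """Hours at full power x watts / 1000 = kWh; kWh x rate = dollars."""
    kwh = hours_per_month * card_watts / 1000.0
    return kwh * price_per_kwh

# Example (assumed): 60 hours/month on a ~250 W card at $0.12/kWh.
print(f"${monthly_power_cost(60, 250, 0.12):.2f}/month")  # -> $1.80/month
[/code]

At numbers like these, the electricity savings alone rarely pay for a card swap, which is the quote's point - and DrDork's point is that heat and noise can matter even when the dollars don't.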

GRINDCORE MEGGIDO
Feb 28, 1985


SwissArmyDruid posted:

About to do a swap between the AMD M5100 and an Nvidia M2000M in my workstation laptop.

Wish me luck, and hope that it doesn't take more than DDU and new drivers.

I've never changed a laptop graphics card before. How do they connect to cooling?

SwissArmyDruid
Feb 14, 2014

by sebmojo
The same way any other GPU connects to cooling: with thermal compound, a copper contact surface, heat pipes, and fins.

I'll have photos in a bit, but for now, let me just say that it would have been a WHOLE loving lot easier if Dell hadn't updated the drivers on their website to, y'know, some other version that doesn't recognize my hardware despite being listed on my model's support page.

Here's hoping that the generic Nvidia drivers work. I just booted things up to make sure things worked before I put everything back together, since getting at the mainboard is surprisingly involved (read: I had to take almost loving everything off both sides of the chassis).

edit: Nope, nothing loving doing, RIP me, I'm out $250 until I can hock this thing.

edit edit: Gonna try hacking the inf.

edit edit edit: Wait, this is an _HP_ M2000M, for one of their ZBooks. Maybe I can install the HP driver instead...?

edit x4: HA! THE INF HACK ON THE HP DRIVER WORKED!

edit x5: OH gently caress YOU, INSTALLER.

edit x6: Trying the INF hack on the Dell driver, no dice. Packing it in and putting everything back the way it was; I'm officially calling this a failure.

edit x7: Last ditch, INF hack on the generic driver. Also no dice.

SwissArmyDruid fucked around with this message at 12:19 on Jan 23, 2017
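
(A note for anyone lost on the "INF hack": the idea is to edit the driver package's .inf so it lists your card's PCI hardware ID, so the installer will accept hardware it doesn't officially list. Below is a rough Python sketch of the concept only; the file name, section name, and install-section label are assumptions that vary between driver packages. 10DE:13B0 is the Quadro M2000M's vendor:device ID.)

[code]
# Sketch of an "INF hack": insert a hardware-ID line into a driver INF.
# File and section names here are assumptions; check your actual package.
from pathlib import Path

inf = Path("nvdmi.inf")                      # hypothetical INF file name
section = "[NVIDIA_Devices.NTamd64.10.0]"    # assumed device section
# 10DE = Nvidia, 13B0 = Quadro M2000M. "Section001" stands in for whatever
# install section the INF actually defines.
new_line = "%NVIDIA_DEV.13B0% = Section001, PCI\\VEN_10DE&DEV_13B0\n"

lines = inf.read_text().splitlines(keepends=True)
for i, line in enumerate(lines):
    if line.strip() == section:
        lines.insert(i + 1, new_line)
        break
inf.write_text("".join(lines))
# Caveat: editing the INF breaks the package signature, so 64-bit Windows
# will usually refuse to install it unless signature enforcement is off.
[/code]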

Dejawesp
Jan 8, 2017

You have to follow the beat!
I run two GTX 970s in SLI, and the heat exhaust on my computer is hot enough to singe the hairs off my hand. Is this thing about to break down or what?

80 degrees seems like a lot.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE
80 degrees (under load) is well inside normal operating temperatures for a graphics card - in fact it's on the low side, especially considering you have two cards in SLI.

That being said, your fan is only operating at 53% - at that temperature I would have expected it to be higher. You should consider checking how you've set up your fan curve. You may be leaving performance on the table if the GPU is throttling rather than increasing the fan speed.

The Lord Bude fucked around with this message at 12:41 on Jan 23, 2017

Dejawesp
Jan 8, 2017

You have to follow the beat!

The Lord Bude posted:

80 degrees (under load) is well inside normal operating temperatures for a graphics card - in fact it's on the low side, especially considering you have two cards in SLI.

That being said, your fan is only operating at 53% - at that temperature I would have expected it to be higher. You should consider checking how you've set up your fan curve. You may be leaving performance on the table if the GPU is throttling rather than increasing the fan speed.

I bought the whole rig complete. Should I take a pic of how they're set up?

Or do you mean some fan setting I can adjust on the software side?

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Dejawesp posted:

I bought the whole rig complete. Should I take a pic of how they're set up?

Or do you mean some fan setting I can adjust on the software side?

In software. If it isn't already installed, download and install some GPU overclocking software - MSI, Asus, EVGA, etc. all have software available for their respective GPUs. You can use it to monitor the GPU and adjust how the fan works. You aren't in any danger of actual hardware damage unless the temperature is getting over around 95C. Typically you strike a balance between noise and temps by having the fan gradually increase in speed as the temps increase - I'd set it up so that the fans run at minimum speed when the GPU temp is under 40C, then gradually step up, with the fans reaching 100% when the temperature hits 80C.
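
To make the shape of that curve concrete, here's a minimal Python sketch. The 40C and 80C breakpoints come from the advice above; the 30% idle floor and the straight-line ramp between the points are my assumptions (the real tools let you drag arbitrary points).

[code]
# Fan curve per the advice above: minimum speed below 40C, 100% at 80C.
MIN_SPEED = 30.0  # percent; assumed idle floor - use your card's minimum

def fan_speed(temp_c: float) -> float:
    """Map GPU temperature (C) to fan duty cycle (%)."""
    if temp_c <= 40.0:
        return MIN_SPEED
    if temp_c >= 80.0:
        return 100.0
    frac = (temp_c - 40.0) / 40.0  # assumed linear ramp between breakpoints
    return MIN_SPEED + frac * (100.0 - MIN_SPEED)

for t in (35, 50, 65, 80):
    print(f"{t}C -> {fan_speed(t):.0f}%")  # 30%, 48%, 74%, 100%
[/code]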

The Lord Bude fucked around with this message at 12:56 on Jan 23, 2017

1gnoirents
Jun 28, 2014

hello :)

SwissArmyDruid posted:

The same way any other GPU connects to cooling: with thermal compound, a copper contact surface, heat pipes, and fins.

...

edit x7: Last ditch, INF hack on the generic driver. Also no dice.

I was going to ask if you had to swap the whole mobo, but it sounds like you did... :( Sorry about the no luck.


Dejawesp posted:

I run two GTX 970s in SLI, and the heat exhaust on my computer is hot enough to singe the hairs off my hand. Is this thing about to break down or what?

80 degrees seems like a lot.



The typical throttle temp is 82 or 83C, though IIRC they can run up to 90+. For SLI this is not surprising for the top card, as long as that's the top card you have pictured. Make sure you check both cards.

53% does seem low for 80 degrees, but this will depend on the manufacturer. If 80 degrees is as hot as the hottest card ever gets, there may be no reason to mess with the fan. You can also repaste the cards and mess with case fans. Both things can make tremendous differences... or none at all.

Also, what kind of cards are they, specifically?

1gnoirents fucked around with this message at 13:37 on Jan 23, 2017

Dejawesp
Jan 8, 2017

You have to follow the beat!

The Lord Bude posted:

In software. If it isn't already installed, download and install some GPU overclocking software - MSI, Asus, EVGA, etc. all have software available for their respective GPUs. You can use it to monitor the GPU and adjust how the fan works. You aren't in any danger of actual hardware damage unless the temperature is getting over around 95C. Typically you strike a balance between noise and temps by having the fan gradually increase in speed as the temps increase - I'd set it up so that the fans run at minimum speed when the GPU temp is under 40C, then gradually step up, with the fans reaching 100% when the temperature hits 80C.

I took a screenshot of the curve



It doesn't match the earlier screenshot of 50% at 80 degrees. Is this what MSI Afterburner has now set it to, or the old default?

Dejawesp fucked around with this message at 13:39 on Jan 23, 2017

Dejawesp
Jan 8, 2017

You have to follow the beat!
Okay new data.

I put it through the same stress test as before, but with MSI Afterburner installed and no settings changed.



The sound is a bit louder, but it's also a much healthier sound: higher in frequency, a bit more mellow. The heat exhaust port blows out more air, but it's no longer scorching hot. I can touch it now; before, I couldn't get my hand within 5 cm of it.

You have been a lot of help. Can I get you something from the SA store? A gift certificate for anything?

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE
That's OK, it was just a 30-second forum post - that's what we're here for. You might want to adjust the curve slightly so that it hits 100% at, say, 80 degrees. It will be a bit louder when you game, but the temperature will be lower.

I also encourage you to overclock - the 970 can gain substantial amounts of extra performance. You can start with this basic guide:

https://forums.somethingawful.com/showthread.php?noseen=0&threadid=3484126&perpage=40&pagenumber=85#post417157441

Even if you just move the power limit and temp limit sliders all the way to the right, you'll still be getting a decent whack of free performance.

Dejawesp
Jan 8, 2017

You have to follow the beat!

The Lord Bude posted:

That's OK, it was just a 30-second forum post - that's what we're here for. You might want to adjust the curve slightly so that it hits 100% at, say, 80 degrees. It will be a bit louder when you game, but the temperature will be lower.

I also encourage you to overclock - the 970 can gain substantial amounts of extra performance. You can start with this basic guide:

https://forums.somethingawful.com/showthread.php?noseen=0&threadid=3484126&perpage=40&pagenumber=85#post417157441

Even if you just move the power limit and temp limit sliders all the way to the right, you'll still be getting a decent whack of free performance.

Okay, I'll give that a try. This overclocking business seems pretty easy. I haven't tried it since 1998 or so, and back then it involved opening up the computer and digging around inside to install homemade water cooling and things like that.

1gnoirents
Jun 28, 2014

hello :)

Dejawesp posted:

Okay, I'll give that a try. This overclocking business seems pretty easy. I haven't tried it since 1998 or so, and back then it involved opening up the computer and digging around inside to install homemade water cooling and things like that.

It's downright expected (from the manufacturer's perspective) that you'll OC, and for GPUs you're given a very tightly controlled, safe sandbox to do it in. In the past this was certainly not the case; I remember strict warnings across the board that any attempt to do so would void your warranty. Now, if it's mentioned at all, it's bland catch-all phrasing that's really only in place to catch BIOS modding and so on.

Also, after a quick glance you'll see there was a big jump in speed from the fan change alone. Though it's amazing to think it was throttling yet only using 53% of your fan.

1gnoirents fucked around with this message at 14:29 on Jan 23, 2017

Mystic Stylez
Dec 19, 2009

Any good freeware that shows my GPU temperature in game?

Edit VVVVVVVVVVVVV: Thanks, guys!

Mystic Stylez fucked around with this message at 15:05 on Jan 23, 2017

Dead Goon
Dec 13, 2002

No Obvious Flaws



Mystic Stylez posted:

Any good freeware that shows my GPU temperature in game?

Also, should/could I use MSI Afterburner with an EVGA Card or is EVGA Precision X fine?

I know EVGA Precision X can show an OSD in the corner of your screen which shows, amongst other things, temperature.

Any of the overclocking utilities are fine with whatever card.

1gnoirents
Jun 28, 2014

hello :)

Mystic Stylez posted:

Any good freeware that shows my GPU temperature in game?

EVGA Precision X should, as mentioned, and so does MSI Afterburner. For Afterburner, go to the Monitoring tab (I believe) and click on the items you want in the OSD (on-screen display): highlight one, then check the box that says "Show in OSD" or something to that effect. Then go to the hotkeys tab and see what your OSD hotkey is, and voila - any information you want, always displayed. This can be any and all data as well, which is useful. For instance, I'd typically have GPU temp, fan speed, and clock speed up, but if I'm having CPU or RAM issues I'd have those up as well.
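
Not an in-game overlay, but if you'd rather log temps from a script while you play, nvidia-smi (it ships with the Nvidia driver) can be polled. A minimal sketch, assuming nvidia-smi is on your PATH:

[code]
# Poll nvidia-smi every 5 seconds and print GPU temperature and fan speed.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=temperature.gpu,fan.speed",
         "--format=csv,noheader"]

while True:
    out = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
    print(time.strftime("%H:%M:%S"), out)  # e.g. "14:02:11  68, 53 %"
    time.sleep(5)
[/code]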

SwissArmyDruid
Feb 14, 2014

by sebmojo

GRINDCORE MEGGIDO posted:

I've never changed a laptop graphics card before. How do they connect to cooling?

So, this is a quick rundown on MXM cards.

Back in The Good Ol' Days, before Intel came along with their loving Ultrabooks and ruined laptops until very recently, there was a thing called Whitebooks.

These were an extension of Whiteboxes: barebones computers that you bought and then supplied with your own processor, memory, hard drive, and graphics card. The first three were easy: processor manufacturers had their mobile sockets, and the other two had well-defined standards. But what didn't have a standard yet were video cards.

Enter Nvidia, who put together a consortium to define a standardized mobile graphics card format, the Mobile PCI Express Module, or MXM card.



Before this, graphics cards were completely unupgradable. They were part of the mainboard itself or, if not, on a proprietary daughterboard, which obviously made upgrades either expensive or impossible, and absolutely ruled out cross-system compatibility. You were stuck with whatever graphics card you bought your whitebook or notebook with.

Obviously, having a standardized mobile graphics format has its benefits for the OEM: they can swap graphics cards in and out as part of customizing a laptop to the customer's specifications without having to stock many different kinds of mainboards. The cards install more or less like every other mobile component: one goes into the slot at an angle, pushes down, and is secured - in this case, by two screws at the top corners. The screw holes for the heatsink are standardized, too.



Unfortunately, despite the format being in use for almost a decade at this point, to this day the cards still aren't *completely* drop-in. It's not unheard of to find proprietary MXM modules, and there are two flavors of MXM: the A-type modules you see pictured, and larger B-type modules used for beefier graphics cards that need more surface area. That said, some OEMs really do use the format the way it should be used - MSI is particularly good about this. But sometimes OEMs will have firmware or certain BIOSes gating off support on what should be a drop-in-and-install-new-drivers swap, as with a desktop video card.

As for cooling, well, it's not THAT different from a normal graphics card - just flattened out to accommodate the chassis thickness, with heat pipes used to move the heat away to the edges, where fans can blow on the fins. In any case, there is no physical reason why this should not have worked. Whatever was stopping me from picking up Maxwell and an extra 2 GB of VRAM over a downclocked 7000-series desktop chip must have had something to do with BIOSes, firmware, and/or the drivers.



(I confess, even if this turned out to be a complete waste, I needed the excuse to take the laptop apart to repaste the CPU anyway. When I took the GPU's heatsink apart in order to replace it after my New Year's Eve incident, I determined that the GPU had cooked itself to death because it had turned the grey paraffin thermal poo poo to crusty crap. Might still be under warranty, but I wasn't about to put myself out of action when I could do some preventative maintenance.)

Bonus shot: if anyone ever goes "ewwww, AMD is still using ZIF sockets, how 2003, get with the loving times", just remind them that Intel still uses ZIF sockets aplenty too, on mobile. (TL;DR: LGA has a minimum height dictated by spring-pin length; PGA can be much thinner, which is useful to OEMs for customizing a product line.)



And now I'm off to get a few quick Zs before work.

SwissArmyDruid fucked around with this message at 15:31 on Jan 23, 2017

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
Seems like you like to live on the edge, placing not one but both MXM cards on the outside of an anti-static bag.

1gnoirents
Jun 28, 2014

hello :)

SwissArmyDruid posted:

So, this is a quick rundown on MXM cards.

...

I've taken apart my fair share of laptops (strictly for work, though) and all the dGPUs were just soldered onto the motherboard; I'm guessing they just didn't rate a neat standard like this. Nice to see, though I'm sorry yours didn't quite work out :(. I actually just parted out a fairly recent Dell 6430 I bought off a customer who broke it, and it was still soldered in.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
I once upgraded the GPU in my old Dell Latitude D800 from a lovely FX 5200 (which somehow managed to be lovely AND run hot) to a Radeon 9600 Pro Turbo. That was easy enough and the heatsink and all that fit perfectly. That was some time ago, though!

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord

SwissArmyDruid posted:

edit x7: Last ditch, INF hack on the generic driver. Also no dice.

This post was an emotional journey. I got secondhand anxiety from reading it. Sorry it didn't go well :(

Gwaihir
Dec 8, 2009
Hair Elf

SwissArmyDruid posted:

So, this is a quick rundown on MXM cards.

...

Dang, that sucks that it didn't work out. What model Dell was that? I've been using the generic drivers from Nvidia for the Quadro M2000M in a Precision 7510, although I did have to jump through a lot of hoops to get it working correctly. (It was very finicky about the order of driver installation on a fresh Windows install, which is something I haven't run into for ages.)

Yaoi Gagarin
Feb 20, 2014

Risky Bisquick posted:

Seems like you like to live on the edge placing not one, but both MXM cards on the outside of an anti-static bag.

People do this at work all the time and I feel sad but don't correct them... :smith:

Cowwan
Feb 23, 2011

VostokProgram posted:

People do this at work all the time and I feel sad but don't correct them... :smith:

Is it specific to MXM cards, or is putting stuff on the outside of anti-static bags just a bad idea?

I've never been bitten by it, but I wouldn't be surprised if it is.

Gwaihir
Dec 8, 2009
Hair Elf
Hm, now I want to try to find a 1050 Ti MXM module and experiment, but it looks like those are not really as much of a thing with Pascal. The only one I could find for sale was from a Taiwanese company offering them for embedded solutions: http://www.aetina.com.tw/products-detail.php?i=184

Superficially it looks like a match form-factor-wise, though.

Cowwan posted:

Is it specific to MXM cards, or is putting stuff on the outside of anti-static bags just a bad idea?

I've never been bitten by it, but I wouldn't be surprised if it is.


Nothing specific to MXM cards; it's just that the outside of those bags is where static tends to collect.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Risky Bisquick posted:

Seems like you like to live on the edge placing not one, but both MXM cards on the outside of an anti-static bag.

I assure you, just behind, to the left, and to the right of those photos there was no shortage of metal posts I could, and did, slap the bag up against. The LAST thing I want to do is put the bare dies face-down on the surface of my desk after I loving assiduously cleaned all the grey waxy poo poo off them (and let me tell you, that poo poo's a pain even if I *do* have something that dissolves it now, and a brush that's soft and fine enough to get in between all those goddamn micro-SMC components without damaging anything).

Gwaihir posted:

Dang, that sucks that it didn't work out. What model Dell was that? I've been using the generic drivers from Nvidia for the Quadro M2000M in a Precision 7510, although I did have to jump through a lot of hoops to get it working correctly. (It was very finicky about the order of driver installation on a fresh windows install, which is something I haven't run in to for ages.)

Precision M4800, which is *basically* what a 7510 is, but with older components. The original spec came with either a K1000M (1GB), a K2000M (2GB), or an M5100 (2GB), so you can see why I really wanted the drop-in M2000M to work. And "finicky" is an understatement.

SwissArmyDruid fucked around with this message at 21:55 on Jan 23, 2017

GRINDCORE MEGGIDO
Feb 28, 1985


Thanks for that; that's the first time I've ever seen an MXM card. It sucks that there are incompatibilities, though.

Rap Game Goku
Apr 2, 2008

Word to your moms, I came to drop spirit bombs


That's a way nicer hardware upgrade path than when I "updated" my X1300 to an X1400 in my old Dell.

Gwaihir
Dec 8, 2009
Hair Elf

SwissArmyDruid posted:


Precision M4800. Original spec came with either a K1000M (1GB), K2000M (2GB), or an M5100 (2GB). And "finicky" is an understatement.

It's a good thing that installing Win10 to a PCIe SSD from a fast USB 3 stick only takes about 5 minutes, because I must have done a fresh install at least 10 times before everything was working properly - there was just no uninstalling or do-overs possible.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Wacky Delly posted:

That's a way nicer hardware upgrade path than when I "updated" my X1300 to a X1400 in my old dell.

Hah, well, it would have been... IF IT HAD WORKED.

Again, nothing mechanical getting in the way, just either something in the BIOS or the firmware.

edit: Wish I'd found this YouTube video last night. It purports to show an INF mod and then the driver software installing correctly with an M2000M, but idek anymore. Maybe I'll give it one extra crack tonight, or after I get in touch with this person to ask them if the M2000M they got was special in some way or another.

SwissArmyDruid fucked around with this message at 22:11 on Jan 23, 2017

1gnoirents
Jun 28, 2014

hello :)

SwissArmyDruid posted:

Hah, well, it would have been... IF IT HAD WORKED.

Again, nothing mechanical getting in the way, just either something in the BIOS or the firmware.

edit: Wish I'd found this YouTube video last night. It purports to show an INF mod and then the driver software installing correctly with an M2000M, but idek anymore. Maybe I'll give it one extra crack tonight, or after I get in touch with this person to ask them if the M2000M they got was special in some way or another.

Is it probable that the motherboard that card would have originally come with is a different part number? If so - though this may be going way above the effort level you wanted for this - you could get that mobo, swap the parts in, and flip your old one. Of course, the end result would be more money dunked into it, but parts like that tend to be readily available.

EdEddnEddy
Apr 5, 2012



Yep, MXM is a pain in the rear end, with whatever proprietary stuff they tie into the display outputs, the fan control, and most often the stupid heatsink plate that's just different between all the non-standard MXM GPUs.

I've tossed around the idea a few times of upgrading from the 5870M in my G73JH to a 7970M or similar, as a few users have done it and gotten some really good performance out of it. However, the work required to get the heatsink to mount and cool correctly, along with having to find the correct GPU (Sager only, I believe; Dell cards are common, but then HDMI out and fan control won't work correctly), makes it just not worth it.

Now, with laptops rocking desktop-level 1080s and such, I am just going to do my best to save up and pick up a new beast, someday...

If that three-screen Razer beast becomes an actual thing, though, I may have to pull the trigger. That thing is absolutely gorgeous.


On the desktop GPU side, my drat 980 Ti loves to give me just that tiny little bit of trouble I haven't been able to pin down.

When gaming at 100% GPU usage, if it sits there just long enough, my system will hard lock. CPU usage is low and temps are fine (<70C), but when the GPU is stressed to 100%, POP! It will just lock hard and need a reboot.

I was going to blame the OC, but it does it even at stock (and the CPU is solid even when IT is running at 100% load via a benchmark or some big render job), and it also held stable without this issue when I had SLI 780s running at 100% usage.

Also, being 780s, I'd think they were harder on the PSU as well, especially while I was OC'ing them to all hell to hit some really good benchmark numbers - numbers the 980 Ti still hasn't been able to match, let alone completely surpass.

So I am just confused and a bit irritated. It doesn't happen often, but it is annoying when it does happen.

1gnoirents
Jun 28, 2014

hello :)

EdEddnEddy posted:

...

On the desktop GPU side, my drat 980 Ti loves to give me just that tiny little bit of trouble I haven't been able to pin down.

When gaming at 100% GPU usage, if it sits there just long enough, my system will hard lock. CPU usage is low and temps are fine (<70C), but when the GPU is stressed to 100%, POP! It will just lock hard and need a reboot.

I was going to blame the OC, but it does it even at stock (and the CPU is solid even when IT is running at 100% load via a benchmark or some big render job), and it also held stable without this issue when I had SLI 780s running at 100% usage.

Also, being 780s, I'd think they were harder on the PSU as well, especially while I was OC'ing them to all hell to hit some really good benchmark numbers - numbers the 980 Ti still hasn't been able to match, let alone completely surpass.

So I am just confused and a bit irritated. It doesn't happen often, but it is annoying when it does happen.

This happened to me twice, and it was RAM both times. Both times the sticks passed Memtest too; the only way I really pinned it down was that it just kept getting worse, to the point where a stick simply couldn't run at its rated speed at all anymore. One time there were more telling signs (unusual FPS drops combined with lower GPU usage), but the first time it was nothing but random lockups and reboots, and only while gaming. I've come to sorta hate RAM.

One time I was able to diagnose it faster by disabling XMP, and then it stopped happening. The other time, however, that didn't do anything.

EdEddnEddy
Apr 5, 2012



That would be one hell of a weird connection, since I can load it up in Premiere to the point that I run out of my 32GB, all while it's crunching at 100%... WTF.

If that is it, welp, at least it has a lifetime warranty; the fun part is finding the bad stick.

Though I may try throwing a tiny bump to the memory-controller voltage as well, just to test. It is 8 slots full of DDR3-2133, after all, lol.
