Sininu
Jan 8, 2014

Is it possible to limit video memory in software? I have a GT555M and a game that doesn't work well with video cards that have more than 1 GB.

Sininu
Jan 8, 2014

Rosoboronexport posted:

What game is that? And has the developer really said that that is the issue? Does your laptop have an integrated GPU? Could you use it?

It's Settlers 6. It's a Ubisoft game, so it will never be patched.

quote:

The game has a known problem with video cards featuring more than 2GB of memory: the game forces you to use the lowest-res textures, making it look super ugly.

The cause is obvious: the function that probes the driver for VRAM size uses a signed instead of an unsigned variable type (an unsigned 16-bit integer can store numbers from 0 to 65535, for example, while a signed 16-bit integer can store from -32768 to +32767). Signed types use the highest bit to mark negative numbers, so if a large unsigned number (large enough to have the highest bit set) is interpreted as signed, it becomes a negative number.
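A minimal C++ sketch of that failure mode, under the hypothetical assumption that the driver reports the VRAM size in bytes, which makes a 2 GiB card exactly the first value whose top bit is set:

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    // Hypothetical: the driver reports the VRAM size in bytes.
    std::uint32_t vram_bytes = 2147483648u;  // 2 GiB = 2^31 bytes

    // Buggy pattern from the quote: the game stores the value in a
    // signed type. 2^31 has the highest bit set, so reinterpreted as
    // signed (on a two's-complement platform) it reads as a large
    // negative number, and every "is there enough VRAM?" check fails.
    std::int32_t vram_seen = static_cast<std::int32_t>(vram_bytes);

    std::printf("driver reports: %u bytes\n", vram_bytes);
    std::printf("game sees:      %d bytes\n", vram_seen);  // -2147483648
    return 0;
}
```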
I have 2GB, but I still can't choose settings, and the game ignores the config file if I make manual changes and set it read-only.

The Intel HD 3000 can't even handle simple 2D games like Bejeweled well, so I can't use the integrated GPU.
I've limited system RAM before for Act of War, but I guess doing that with VRAM isn't possible then.

Sininu
Jan 8, 2014

Rosoboronexport posted:

That's where you are wrong. The HD3000 is the Sandy Bridge GPU, which supports DX10.1. Here it is compared to low-end GPUs, against which it fares quite well. It should definitely do OK in a 2007-era game.
How am I wrong when I've played lots of games on it for a minute before realizing they were running on the HD3000 because of the lovely performance? Also, the supported DX version doesn't really indicate a GPU's performance.

Factory Factory posted:

It really can struggle on modern 2D games, though - at least, ones that aren't CPU-rendered and aren't targeted toward tablets (e.g. Windows Store apps).
Can Windows Store apps even be set to run on the dedicated GPU? I once tried to play the latest Asphalt racing game and it ran quite badly, with frequent drops under 30 fps. Changing the global GPU to Nvidia didn't help at all, and I couldn't find its executable.
I just tested Asphalt 8 again, and this time I successfully ran it on the Nvidia GPU, but performance was still the same. I guess it just runs badly.
I doubt the HD3000 will have trouble with most 2D games there, though. At least Hill Climb Racing runs well. Can't be bothered to try anything else.

Sininu fucked around with this message at 18:38 on Jan 4, 2015

Sininu
Jan 8, 2014

Rosoboronexport posted:

My point was that it should run the Settlers 6 game you have problems with. The game recommends a GeForce FX or Radeon 9500, and the HD3000 is faster than those. I'm not suggesting that you should use it instead of the Nvidia chip for any other games. The DVMT is limited to 1720 MB, but I'm not sure how much VRAM it will report to the game.

It detects 62MB.

There are no workarounds for that error. I can only play it with the dedicated card, but then I can't change settings. GTA IV has a similar issue, but it doesn't refuse to run, and there are simple workarounds for it.

Sininu fucked around with this message at 22:20 on Jan 4, 2015

Sininu
Jan 8, 2014

jkyuusai posted:

Have you checked in your BIOS to see if there are any options pertaining to memory for the integrated graphics? Some allow you to alter the minimum amount that's allocated.
Sadly, no. My laptop's BIOS doesn't let me do that.

I might just get a cheap 1GB GPU for my old desktop to play this game, then.

Sininu
Jan 8, 2014

My friend has an issue with his Gigabyte GTX 970. He gets BSODs while playing Guild Wars 2. The minidump says the Nvidia driver nvlddmkm.sys causes this. He only gets these in the less demanding, least crowded areas, and temperatures go unusually high there as well: normally they are 55 degrees, but in those (we assume) less demanding places they go as high as 69 degrees.
He has already tried older drivers, but the issue persists.
He hasn't overclocked his card. I suggested he lower the VRAM and core frequencies by 100 MHz to see if that changes anything.

Edit: Lowering the clocks by 100 MHz stopped the crashes. Should he RMA it?

Sininu fucked around with this message at 22:03 on Feb 15, 2015

Sininu
Jan 8, 2014

Desuwa posted:

Sounds like it's just rendering useless frames as fast as it can and overheating the card. I wouldn't have expected it in GW2, but it's actually a pretty common problem; I know Galactic Civilizations 2, HoMM 5, and a bunch of others had/have the same issue. The card still shouldn't be crashing at stock clocks, so he should RMA it, but once he gets a new card that doesn't crash, he might want to force a frame limit for GW2 to stop it from heating up so much.

Make sure it doesn't crash before setting a limit, though; you want to make sure the card is stable under the worst case at stock clocks so the problem doesn't crop up in other games.
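For reference, the frame limit Desuwa mentions just pads every frame out to a fixed time budget so the GPU idles instead of spinning flat out. A minimal C++ sketch, with render_frame() standing in for the game's real work:

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

// Stand-in for the game's real per-frame work.
static void render_frame() {
    std::this_thread::sleep_for(std::chrono::milliseconds(2));
}

int main() {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(16667);  // ~60 fps cap

    for (int frame = 0; frame < 300; ++frame) {
        const auto start = clock::now();
        render_frame();
        // Sleep off whatever is left of the budget, so the GPU idles
        // instead of rendering useless frames at full power.
        const auto elapsed = clock::now() - start;
        if (elapsed < frame_budget)
            std::this_thread::sleep_for(frame_budget - elapsed);
    }
    std::puts("done");
    return 0;
}
```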

He already had a framerate limiter enabled.
But thanks for the help!

Edit: He ran FurMark for a bit and the temps never went over 70 at stock clocks.
He also sent me a minidump of the crash. The bug check code is 0x00000116 (VIDEO_TDR_FAILURE) and it mainly blames dxgkrnl.sys, but dxgmms1.sys and nvlddmkm.sys also appear.
Could someone check it, please?
http://puu.sh/fY5Pn.dmp

It only happens in one particular area in Guild Wars 2.

Sininu fucked around with this message at 00:49 on Feb 16, 2015

Sininu
Jan 8, 2014

Potato Salad posted:

What is your PSU?

quote:

my Corsair vx450w
lol

Sininu
Jan 8, 2014

Do individual components ever get replaced during the RMA process?
My friend was told that they replaced capacitors on his card, but I thought they would generally just replace the whole card.

Sininu
Jan 8, 2014

THE DOG HOUSE posted:

I'm sure that's very much a case-by-case basis. A capacitor is dirt cheap, though, so if they can quickly (and cheaply) pinpoint a capacitor - like it just exploded, or it's a common failure - replacing it would be like 100 times cheaper than giving you a new board, even with labor and all. Diagnosis is the sticking point.
There was no visual damage, and the supposed replacement did nothing to fix its problems.
He sent it back, and hopefully they'll replace it completely.

Sininu
Jan 8, 2014

I have been under the impression that only the APEX part of PhysX can run on the GPU.
I know Arma 3 and all Unreal Engine 4 games use PhysX for general physics, which runs only on the CPU.

Sininu fucked around with this message at 22:35 on May 19, 2015

Sininu
Jan 8, 2014

El Scotch posted:

How many of those are being hell-choked to death by your CPU?

Pretty much all of them.

Edit: Actually, I don't know about GTA V and BF4, but the other games are really hungry for strong CPU performance.

Sininu
Jan 8, 2014

How can my friend get a replacement or a refund for the Gigabyte GTX 970 he bought in March? He gets BSODs in 50% of the games he plays; he has tested it in other people's computers as well, and they get crashes too.
He has tried to RMA it three times already. Every time, they say nothing's wrong with it.
No crashes occur in FurMark or other benchmark programs, different driver versions don't change anything, and there are no overclocks.

Sininu
Jan 8, 2014

Even if he had bought it with a credit card, he couldn't really do a chargeback because he bought all his PC components at once.
He started RMAs only a week after buying the card. He's now waiting for them to return it for the fourth time.
The BSOD code was 116, and the Nvidia driver nvlddmkm.sys is also mentioned. Like I said, the issue persists across different systems and driver versions.

Zero VGS posted:

Either that or just very subtly destroy the card further so it doesn't work at all... I had a 100% hosed Gigabyte card and they RMA'ed me a replacement without any hassle.

That seems super risky and I'm not going to make my friend do that.

Sininu
Jan 8, 2014

THE DOG HOUSE posted:

I think you're overestimating the difficulty of a chargeback.
I just asked, and he paid the delivery guy cash in hand. I didn't know that was a thing, but it's common practice in Poland.
He's now trying mediation, but he's not optimistic about it.

Sininu
Jan 8, 2014

SinineSiil posted:

How can my friend get a replacement or a refund for the Gigabyte GTX 970 he bought in March? He gets BSODs in 50% of the games he plays; he has tested it in other people's computers as well, and they get crashes too.
He has tried to RMA it three times already. Every time, they say nothing's wrong with it.
No crashes occur in FurMark or other benchmark programs, different driver versions don't change anything, and there are no overclocks.

I just checked Newegg for reviews of this specific card he has:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814125706&cm_re=gigabyte_gtx970-_-14-125-706-_-Product
Most one- and two-star reviews describe the same issue. Gigabyte refuses to acknowledge it.
Does he really have no options besides screaming on social media?

Sininu fucked around with this message at 19:08 on Aug 29, 2015

Sininu
Jan 8, 2014

sauer kraut posted:

Of course it was that lovely Gigabyte Mini that's perpetually on super sale.
I feel like it was my fault that he got that card. I wasn't aware of its existence back when he showed me the parts list, so I assumed it was a normal Gigabyte 970 and didn't check closer. No one else who looked at the parts list caught it either.
Also, that card apparently had great reviews all around back then.

Sininu
Jan 8, 2014

https://www.youtube.com/watch?v=_jJpbmxGnss
Nvidia will turn this off, won't they?

Sininu
Jan 8, 2014

wipeout posted:

I'd post this in the FO4 thread, but it's pretty much full retard in there right now, so I hope this isn't off-topic.

Which card would be better for running Fallout 4, at 1920 x 1080, and is there much in it?

750ti
7850

Both 2GB.

The CPU is a 4.4 GHz 2500K, currently with a 750 Ti in there.
I'd expect the 7850 to be slightly faster, but I don't know if it's worth the effort of swapping cards.
I have no idea about the 7850's performance, but these videos may help you:
https://www.youtube.com/watch?v=37CzgecyvTs
https://www.youtube.com/watch?v=imcj_BxGqD4

Sininu
Jan 8, 2014

cat doter posted:

Why the heck does the Nvidia control panel take like a minute to load up every time? I've got Windows on my SSD and my computer boots up in literally like 8 seconds, but just the Nvidia panel takes forever. Sometimes I have to close it manually and open it again just to get it to stop hanging.
I don't think the SSD even makes a difference. It's awfully slow no matter what.

Sininu
Jan 8, 2014

xthetenth posted:

Yeah, it's got some weirdnesses. I had a file with a crazy-long, machine-generated file name lying in a folder, and every time I tried to open the per-game settings window it would crash the program, because it seems to scan the whole file system, or at least a significant fraction of it. It took some combing through logs in Process Explorer to figure that one out, because exception handling is for scrubs.

I had to reinstall it once because it kept crashing on startup.
Awful.

Sininu
Jan 8, 2014

What is currently the best way to limit frame rate on a game-by-game basis?

Also, why is DSR still not available on laptops?

Sininu fucked around with this message at 21:22 on Dec 20, 2015

Sininu
Jan 8, 2014

Rosoboronexport posted:

Adaptive VSync, if your card supports it on Nvidia's side. I don't know what AMD calls it in settings. Or through MSI Afterburner/RTSS.

Standard VSync creates awful-feeling mouse lag. Is Adaptive VSync better?
I'll try Afterburner.
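For what it's worth, my understanding (not from this thread) is that Adaptive VSync behaves like normal VSync while the game holds the refresh rate and switches itself off below it, so you avoid the sudden drop to half rate; the synced-case input lag is the same as standard VSync. A rough C++ sketch of that policy, all names hypothetical:

```cpp
#include <cstdio>

// Hypothetical sketch of the Adaptive VSync decision: sync to vblank
// only while the renderer keeps up with the display's refresh rate.
bool should_wait_for_vblank(double current_fps, double refresh_hz) {
    // At or above refresh: wait for vblank (no tearing, normal VSync lag).
    // Below refresh: present immediately (tearing, but no halving to 30 fps).
    return current_fps >= refresh_hz;
}

int main() {
    std::printf("90 fps @ 60 Hz -> sync: %d\n", should_wait_for_vblank(90.0, 60.0));
    std::printf("45 fps @ 60 Hz -> sync: %d\n", should_wait_for_vblank(45.0, 60.0));
    return 0;
}
```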

Sininu
Jan 8, 2014

I have a problem with what I assume is ShadowPlay keeping the GPU alive when the computer isn't doing much. The fans are off when nothing uses the GPU, but sometimes ShadowPlay gets stuck active (I can see it in the GPU activity tray icon) and the fans turn on for a bit every five minutes.
Has anyone experienced and solved this? It's a laptop, so the fans are quite audible.

Sininu
Jan 8, 2014

Subjunctive posted:

New low-power (19W) $40 card from NVIDIA (710):

http://www.vortez.net/news_story/low_powerlow_cost_nvidia_launches_the_geforce_gt_710.html

Can drive 4K at 30Hz, up to three displays.

Why do they put a fan on it when the 750 Ti works fine passively? And why double-slot models, when all three outputs fit on a single slot? Why?

Sininu
Jan 8, 2014

Anime Schoolgirl posted:

unless you make it use a heatsink three slots high, it doesn't, actually

but yeah, anything involving a fan is a waste for this 5450-grade GPU

Since I have seen passive 750 Tis, I thought they worked just fine. So they overheat when gaming?

Don Lapre posted:

The 750 Ti is Maxwell. The 710 is not.
Kepler gets too hot for passive cooling even when it's a 19W part?

Sininu
Jan 8, 2014

Watermelon Daiquiri posted:

The article said there would be passive options, though.
There's even a passive one in the picture.
Why have a fan at all is my question.

Sininu
Jan 8, 2014

Thanks, people. I learned something about the viability of passive cooling today.

Sininu
Jan 8, 2014

mobby_6kl posted:

All that talk of multi-chip GPU reminds me of something...



and that didn't end very well for them...

How was that thing cooled?

Sininu
Jan 8, 2014

2007: ATI Radeon 3870 X2 "heater"
2011: Nvidia GT 555M
2015: Nvidia GTX 970M

I haven't had many upgrades.

Also, I think my next upgrade will be a desktop GPU inside that Alienware external enclosure, since I have their laptop.

Sininu fucked around with this message at 23:48 on Mar 16, 2016

Sininu
Jan 8, 2014

Dave Angel posted:

It's an April Fools joke though, right?

Nah, read more and pre-order on their site

Sininu
Jan 8, 2014

Afterburner reports that my 970M core clock is 540 MHz all the time in-game, which is surely incorrect, since games run as expected. Is there anything I could do to fix it?

Sininu
Jan 8, 2014

THE DOG HOUSE posted:

I have found that Afterburner displays the clock speed correctly for me; it was even able to warn me of a driver issue I'd probably have been unaware of, by displaying a low clock. 540 MHz is very close to the limp mode for Nvidia, which could indicate a driver problem. Or you could have your iGPU selected in Afterburner, or the game could have very low demand on the GPU. The first thing I would do is test another game with a known heavy GPU load; if you don't have one, run Heaven 4.0 and see what clock it reports.
Newest drivers, and the only game I've checked the core frequency in is Black Ops 3. Also, Nvidia Inspector seems to give correct clock readings.

Sininu
Jan 8, 2014

THE DOG HOUSE posted:

Is your GPU selected from the drop-down near the top of the program? Also, does the clock graph just show an even 540 MHz across the board?
Yes, I'm looking at the GPU1 reading in Afterburner, which is the 970M.
I tested it in DiRT Rally now too - exact same result.
I also restarted my computer, just in case.

Sininu
Jan 8, 2014

THE DOG HOUSE posted:

As long as another program is reporting correctly, it's safe to say that install is borked. You can try reinstalling, or use EVGA Precision, which does the same things.
I did a little Googling, and it seems Afterburner has trouble reading Alienware laptop GPU core clocks for some reason. Quite lame, tbh.
https://forums.geforce.com/default/topic/811170/is-msi-afterburner-reporting-the-coreclock-incorrectly-/?offset=3
http://forum.notebookreview.com/threads/official-alienware-17-r2-r3-owners-lounge.770314/page-201
http://forum.notebookreview.com/threads/official-alienware-15-r1-r2-benchmark-thread.770319/page-25

Sininu
Jan 8, 2014

Boris Galerkin posted:

Who's Tom and what did he do that he's possibly fired from nvidia for doing? Someone linked to the entire reveal presentation some pages back but I don't want to watch the whole thing.

Watch this
https://www.youtube.com/watch?v=iAueZ_VWBUU

Sininu
Jan 8, 2014

Boris Galerkin posted:

I'm watching it. Which one is Tom? The guy that's not on the stage, right? They both sound pretty awful, actually.

Tom is the one who doesn't call the other guy Tom.

Edit: Yeah, many feel the same way.

Sininu fucked around with this message at 10:21 on May 9, 2016

Sininu
Jan 8, 2014

Peanut3141 posted:

8-pin is spec'd to 150W, 6-pin to 75W.

How do the extra two pins double the amount of power that can go through?

Sininu
Jan 8, 2014

Peanut3141 posted:

They kinda don't. Let me preface this by saying I'm not an electrical engineer, though I've sometimes played one in my career.

Check out John_VanKirk's post from Feb 4, 2010 in this thread: http://www.tomshardware.com/forum/274631-28-power-spec-power-plug

In short, a 6-pin connector seems to be able to carry at least 192W, probably far more than that. That is why I said a sane designer would've overpulled from the 6-pin connector instead of the PCI-E slot.

Standard caveats being that the wires should be at least 18 gauge, well made, and not supplied by a POS PSU.


SourKraut posted:

It doesn't; both the 6- and 8-pin PCIe connectors (should) have the same number of 12 V pins (three). The 8-pin has two additional grounding pins. The PCI-SIG specification requires the 8-pin configuration in order to differentiate which power supplies can support 150W over a single cable off the 12 V rail(s) feeding the connector. In older PSUs, the cables might not physically be capable of carrying the required current without potentially causing damage, or the PSU might not have been able to safely feed it. So the easiest way to (ideally) guarantee someone doesn't use an underpowered PSU is to require an 8-pin connector for GPUs needing more than 75W from the connector. Of course, this requires the PSU to meet standard specifications, which knock-offs, etc. probably won't.

The two extra pins are actually grounds - technically one of them should be a sense pin, but I don't think it often gets included. The other side of this is that you can plug a 6-pin into a GPU's 8-pin connector and it *may* work; if it does, it should technically pull no more than 75W, since it wouldn't detect the grounding pins. But I'm not sure how well GPU manufacturers actually stick to that.


BurritoJustice posted:

The idea isn't exactly that the extra pins carry another 75W; it's that an 8-pin connector is rated for 75 more watts, which means you know that if a power supply offers an 8-pin connection, it will supply 150W over that connector. They could've achieved almost the same thing with a "black 6-pin connector" and a "red 6-pin connector", but they went with 8/6-pin because of various advantages (ease of adaptability, easy to understand, and a minor electrical advantage from the additional grounds).

E: I got beaten by Sour Kraut because I typed this between lives in overwatch :shrug:
Thank you, guys. It feels nice to learn new stuff.
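A quick back-of-the-envelope check of those numbers; the per-pin current rating here is an assumption (figures around 8 A are commonly quoted for Mini-Fit Jr style terminals with decent 18 AWG wire), not something from the spec:

```cpp
#include <cstdio>

int main() {
    // Both 6-pin and 8-pin PCIe power connectors carry three 12 V pins;
    // the extra two pins on the 8-pin are grounds (one nominally sense).
    const double volts = 12.0;
    const int hot_pins = 3;
    const double amps_per_pin = 8.0;  // assumed rating, illustrative only

    const double physical_ceiling = volts * hot_pins * amps_per_pin;
    std::printf("rough physical ceiling: %.0f W\n", physical_ceiling);  // 288 W
    std::printf("spec labels: 75 W (6-pin), 150 W (8-pin)\n");
    // The spec numbers are conservative labels for PSU capability,
    // not hard electrical limits, which matches the posts above.
    return 0;
}
```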

Sininu
Jan 8, 2014

Fallows posted:

Looks the same to me...

It's in an optional beta.
It's quite bad, imo, but that's not much different from before.
