Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
What were other card fuckups? Two that spring to mind are MSI stickergate, and Zotac reference 970s managing to not be able to SLI with any other 970s besides themselves.


GRINDCORE MEGGIDO
Feb 28, 1985


Zero VGS posted:

What were other card fuckups? Two that spring to mind are MSI stickergate, and Zotac reference 970s managing to not be able to SLI with any other 970s besides themselves.

Can't remember the make, might be MSI, but I have a 750 Ti that won't run its noisy fan at anything below 35% or so, even when editing the fan curve. I just fitted a passive cooler in the end and ran it for years. e: it's EVGA

Think there was some raging about heat pipes not actually being connected on some other card as well.
e: EVGA 970

GRINDCORE MEGGIDO fucked around with this message at 18:09 on Mar 23, 2017

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

GRINDCORE MEGGIDO posted:

Can't remember the make, might be MSI, but I have a 750 Ti that won't run its noisy fan at anything below 35% or so, even when editing the fan curve. I just fitted a passive cooler in the end and ran it for years.

Think there was some raging about heat pipes not actually being connected on some other card as well.
e: EVGA 970

I'm certain that was the EVGA 750 Ti FTW; its fans were locked to a comically high RPM for a 75W GPU. A BIOS flash made it slightly quieter, but it was still loud as gently caress even though it was running like 5°C over ambient.

GRINDCORE MEGGIDO
Feb 28, 1985


Yep, that's the same problem. Mine's EVGA, you're right, but a single-fan SC model with no external power connector.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

GRINDCORE MEGGIDO posted:

Yep, that's the same problem. Mine's EVGA, you're right, but a single-fan SC model with no external power connector.

If the noise is still bothering you, you could probably unplug the fan, velcro a 140mm fan over the whole thing, then plug that into the motherboard; it would be dead silent and stay well under 70°C, I'm sure. They ran super cool.

sauer kraut
Oct 2, 2004

redeyes posted:

Does anyone know how to rip WATTMAN out of the current AMD drivers? I don't want it and I think it might be making my stuff crash.

Nuke everything with DDU, then do a custom install of just the driver without any of the garbage extras ticked (not sure if this is still possible tbh, I'm team green)

eames
May 9, 2009

The latest 3DMark update added Vulkan to the API overhead test. Is this strictly a software test or does it also benchmark the CPU's ability to handle draw calls?

It'd be interesting to see how Ryzen compares to Kaby Lake because allegedly AMD CPUs are not as efficient at handling draw calls (which would explain the lower FPS at 1080p) but futuremark decided not to include the API overhead test in their advanced search.

ballpark results from my lowly Haswell/1060/Win10 system:

DX 11 single-thread: 1.23 million draw calls per second
DX 11 multi-thread: 1.37 million
DX 12: 12.23 million
Vulkan: 13.95 million

repiv
Aug 13, 2009

Guru3D ran it on a few different setups: http://www.guru3d.com/news-story/quick-test-futuremark-3dmark-v2-3-3663-vulkan-api-overhead-benchmarks.html

Strange how Nvidia is doing better in Vulkan than DX12; you'd think the DX12 side of the driver would be more mature, since DX12 has been around longer and is used in more games.

Knifegrab
Jul 30, 2014

Gadzooks! I'm terrified of this little child who is going to stab me with a knife. I must wrest the knife away from his control and therefore gain the upperhand.
Still no MSI 1080tis?

SlayVus
Jul 10, 2009
Grimey Drawer

Knifegrab posted:

Still no MSI 1080tis?

Almost no details, just the picture really.

http://www.tomshardware.com/news/msi-geforce-gtx-1080ti-graphics-cards,33965.html

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
slightly better than a 1080 box picture with "ti" terribly MSPainted in

GRINDCORE MEGGIDO
Feb 28, 1985


Zero VGS posted:

If the noise is still bothering you, you could probably unplug the fan, velcro a 140mm fan over the whole thing, then plug that into the motherboard and it would be dead silent and stay well under 70c I'm sure. They ran super cool.

That would work great. I replaced the cooler with an Accelero and ran it passive. I've got a 1080 now, that was a good upgrade.

orcane
Jun 13, 2012

Fun Shoe
There are some better pictures of the Gaming X:
https://www.techpowerup.com/231589/msi-geforce-gtx-1080-ti-gaming-x-pictured

Unfortunately it has a 2.5-slot cooler now, so it won't fit into a Dan A4 (and even if it did, they changed the orientation of the heatsink fins by 90°, so you'd push even less of the hot air out of the case). Oh well :(

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

eames posted:

The latest 3DMark update added Vulkan to the API overhead test. Is this strictly a software test or does it also benchmark the CPU's ability to handle draw calls?

It'd be interesting to see how Ryzen compares to Kaby Lake because allegedly AMD CPUs are not as efficient at handling draw calls (which would explain the lower FPS at 1080p) but futuremark decided not to include the API overhead test in their advanced search.

ballpark results from my lowly Haswell/1060/Win10 system:

DX 11 single-thread: 1.23 million draw calls per second
DX 11 multi-thread: 1.37 million
DX 12: 12.23 million
Vulkan: 13.95 million

Speaking of Vulkan, AMD has released "Anvil" http://gpuopen.com/gaming-product/anvil-vulkan-framework/

If I am reading things right, it's a ready-made drop-in for existing products (as a wrapper) and future ones that automatically integrates all of AMD's optimizations, so my impression is that anything using "Anvil" will have performance close to Doom's. Good on AMD? I mean it's meant to simplify the process and it's optimized for AMD hardware already, so it looks like it takes a lot of the work out, and devs using it would only need to make sure Nvidia runs acceptably, and as above Nvidia seems to run better on Vulkan as well.

It's AMD though so I'm trying to figure out the disadvantage.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

FaustianQ posted:

Speaking of Vulkan, AMD has released "Anvil" http://gpuopen.com/gaming-product/anvil-vulkan-framework/

If I am reading things right, it's a ready-made drop-in for existing products (as a wrapper) and future ones that automatically integrates all of AMD's optimizations, so my impression is that anything using "Anvil" will have performance close to Doom's. Good on AMD? I mean it's meant to simplify the process and it's optimized for AMD hardware already, so it looks like it takes a lot of the work out, and devs using it would only need to make sure Nvidia runs acceptably, and as above Nvidia seems to run better on Vulkan as well.

It's AMD though so I'm trying to figure out the disadvantage.

hahahahah I told you fucks that GPU manufacturers wouldn't be able to hold back from intervening when lovely DX12/Vulkan coders wrote garbage code that ran like poo poo on your architecture, the pressure to have your hardware look good isn't going anywhere

here we go, next stop driver-optimization town

edit:

Paul MaudDib posted:

They won't be coming out with DX11/OpenGL-style wrappers, but I think there's still going to be places where they could inject code to optimize games. You would just need to inject directly into the game's binary or onto the card itself (the "Gameworks rendering pipeline" xthetenth is talking about), rather than into an API shim between the game and the card.

At the end of the day there's still going to be immense pressure to not have "broken drivers" on launch day, and the public isn't going to accept that they can't play Fallout 5 because the gamedevs wrote a lovely renderer that runs on NVIDIA but not AMD.

oh boy was I wrong, here comes the DX11/OpenGL style wrappers

Paul MaudDib fucked around with this message at 02:44 on Mar 24, 2017

Yaoi Gagarin
Feb 20, 2014

Paul MaudDib posted:

hahahahah I told you fucks that GPU manufacturers wouldn't be able to hold back from intervening when lovely DX12/Vulkan coders wrote garbage code that ran like poo poo on your architecture, the pressure to have your hardware look good isn't going anywhere

here we go, next stop driver-optimization town

edit:


oh boy was I wrong, here comes the DX11/OpenGL style wrappers

It is funny how we've come around, but even so it makes sense. A world where you can choose between using a well-defined low-level API and a high-level API implemented in terms of the low-level API is still better than one where your only option is a high-level API with pure magic underneath

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

VostokProgram posted:

It is funny how we've come around, but even so it makes sense. A world where you can choose between using a well-defined low-level API and a high-level API implemented in terms of the low-level API is still better than one where your only option is a high-level API with pure magic underneath

I agree in theory, it's fine to have a framework which does things for you so writing an engine isn't so terribly burdensome, but the problem is it's a real short jump from there to "wow our framerates in Fallout 5 are terrible, what if we detect the game is active and then sub in a different version that optimizes calls or lets SLI/CF profiles work/etc" and then we're right back to the problems with DX11/OpenGL - after all what you've just created is exactly the kind of shim layer that DX12/Vulkan were explicitly not supposed to have

especially since in the feature list they seem to be trying to position this as an OpenGL-to-Vulkan wrapper - this is definitely not going to be something you can compile in and your game magically runs faster than the hand-tweaked-by-driver-wizards version.

Paul MaudDib fucked around with this message at 03:12 on Mar 24, 2017

repiv
Aug 13, 2009

I'm not seeing anything particularly AMD-specific or high level in Anvil. It's mostly just RAII wrappers over raw Vulkan primitives which make it harder to screw up and leak memory/resources, and some random convenience functions :shrug:

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

repiv posted:

I'm not seeing anything particularly AMD-specific or high level in Anvil. It's mostly just RAII wrappers over raw Vulkan primitives which make it harder to screw up and leak memory/resources, and some random convenience functions :shrug:

Chekhov's compiler: if the runtime GLSL transpiler is not meant to be used in the third act, it should not be placed in the framework in the first act; otherwise it should be a static transpiler instead of a runtime one.

Literally what else would you do with a runtime GLSL transpiler except hotpatch optimizations into some shitlord's terrible GLSL code after they've written it? Every single other use could be done statically.

Less facetiously, pick anything on this list:

quote:

More complicated wrappers:

  • Compute / graphics pipeline managers: provide a way to create regular/derivative pipelines with automatic pipeline layout re-use.
  • Descriptor set groups: simplify descriptor set configuration, management and updates.
  • Pipeline layout manager: caches all created pipeline layouts to avoid instantiating duplicate layout instances.

Miscellaneous functionality:
  • Format info reflection: provides information about format properties.
  • GLSL -> SPIR-V conversion: provides glslang-based GLSL->SPIR-V conversion. The conversion is performed in run-time. Disassembly can also be retrieved, if needed.

Paul MaudDib fucked around with this message at 03:29 on Mar 24, 2017

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

repiv posted:

I'm not seeing anything particularly AMD-specific or high level in Anvil. It's mostly just RAII wrappers over raw Vulkan primitives which make it harder to screw up and leak memory/resources, and some random convenience functions :shrug:

Besides what Paul has posted, I'd also point out

quote:

Anvil comes with integrated support for AMD-specific Vulkan extensions, but works on any Vulkan implementation.

The only ones I know of are Shader Intrinsics, but I'd assume that it contains all of AMD and Id's collaboration on Doom's Vulkan implementation.

Yes or no, would this allow AMD to bypass their currently terrible Linux drivers, assuming the game has a Vulkan/Anvil implementation? They've been making huge Linux pushes within the past year so I dunno.

repiv
Aug 13, 2009

Paul MaudDib posted:

Literally what else would you do with a runtime GLSL transpiler except hotpatch optimizations into some shitlord's terrible GLSL code after they've written it? Every single other use could be done statically.

It's very useful for debugging - you can press a hotkey and have the engine reload and recompile your shaders on the fly. Huge timesaver compared to doing a full rebuild and having to restart the game.

I don't see why GLSL is special with regard to GPU vendor hotpatching either, they can just as easily hotpatch SPIR-V bytecode. DirectX has exclusively used bytecode forever and that never stopped the driver wizards sticking their noses in there.

FaustianQ posted:

The only ones I know of are Shader Intrinsics, but I'd assume that it contains all of AMD and Id's collaboration on Doom's Vulkan implementation.

Yes or no, would this allow AMD to bypass their currently terrible Linux drivers, assuming the game has a Vulkan/Anvil implementation? They've been making huge Linux pushes within the past year so I dunno.

I'm only skimming the code (it's late okay) but as best I can tell "integrated support for AMD-specific Vulkan extensions" just means their C++ wrappers have methods which pass through to the raw C extension functions. There's no magic that automatically uses those extensions; they're just re-surfaced through the Anvil API should a developer want to do all the work of actually exploiting that extension themselves.

There's certainly no magic that injects GCN shader intrinsics automatically; if that were possible then they wouldn't need to be intrinsics. They could just apply that magic globally in the driver's compiler backend.

This stuff is seriously benign, really. It just makes Vulkan slightly less annoying/tedious to use.

repiv fucked around with this message at 04:34 on Mar 24, 2017

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
Woah, an AIB 1080 Ti that's in stock and not overpriced.

https://www.newegg.com/Product/Prod...ID=7057735&SID=

I'm still waiting for a more premium model but hopefully this means they're coming soon.

AEMINAL
May 22, 2015

barf barf i am a dog, barf on your carpet, barf
So, um, I've been having this super odd issue with my EVGA 1070 SC connected to my DVI monitor. It's hard to explain, so I shot a video of it happening:

https://www.youtube.com/watch?v=LjxGj9Zhc4g

Has anyone encountered this before?

I blew out the input and cable ends and it seemed to help for a while; changing the screen refresh rate also seems to reset it. Could it just be a bad DVI cable? I only have one, sadly.

edit: it comes and goes, sometimes it will go away for days.

AEMINAL fucked around with this message at 23:00 on Mar 24, 2017

CaptainSarcastic
Jul 6, 2013



AEMINAL posted:

So, um, I've been having this super odd issue with my EVGA 1070 SC connected to my DVI monitor. It's hard to explain, so I shot a video of it happening:

https://www.youtube.com/watch?v=LjxGj9Zhc4g

Has anyone encountered this before?

I blew out the input and cable ends and it seemed to help for a while; changing the screen refresh rate also seems to reset it. Could it just be a bad DVI cable? I only have one, sadly.

edit: it comes and goes, sometimes it will go away for days.

I appreciate the Art of Noise background music. Are your drivers up to date? How old is the monitor?

AEMINAL
May 22, 2015

barf barf i am a dog, barf on your carpet, barf

CaptainSarcastic posted:

I appreciate the Art of Noise background music. Are your drivers up to date? How old is the monitor?

My drivers are up to date; the monitor is a Samsung SyncMaster 940BW, so ancient.

I didn't have this issue before on my GTX 960 :(

edit: thank you for revealing "the art of noise" to me, i had no idea this artist sampled them! super dope!! :q:

AEMINAL fucked around with this message at 23:16 on Mar 24, 2017

EdEddnEddy
Apr 5, 2012



Bad DVI cable or bad DVI port on one of the ends.

Have you tried an HDMI cable or an HDMI-to-DVI adapter to see if it happens there too?

AEMINAL
May 22, 2015

barf barf i am a dog, barf on your carpet, barf

EdEddnEddy posted:

Bad DVI cable or bad DVI port on one of the ends.

Have you tried an HDMI cable or an HDMI-to-DVI adapter to see if it happens there too?

My main monitor using HDMI is fine; it's probably this ancient DVI cable. I'll buy a new one and see.

A bad card would have the same issue on both monitors, right? Never had any issues on my main HDMI one.

EdEddnEddy
Apr 5, 2012



AEMINAL posted:

My main monitor using HDMI is fine; it's probably this ancient DVI cable. I'll buy a new one and see.

A bad card would have the same issue on both monitors, right? Never had any issues on my main HDMI one.

Can you try the DVI on the other monitor just to check?

AEMINAL
May 22, 2015

barf barf i am a dog, barf on your carpet, barf

EdEddnEddy posted:

Can you try the DVI on the other monitor just to check?

It's a dumb modern samsung and only has VGA/HDMI for some baffling reason :(

I'll just buy a new DVI cable. It used to be way worse before I dusted it out, so I'm guessing there's a small short somewhere.

AEMINAL fucked around with this message at 23:31 on Mar 24, 2017

AEMINAL
May 22, 2015

barf barf i am a dog, barf on your carpet, barf
double post

Obsurveyor
Jan 10, 2003

AEMINAL posted:

It's a dumb modern samsung and only has VGA/HDMI for some baffling reason :(

I'll just buy a new DVI cable. It used to be way worse before I dusted it out, so I'm guessing there's a small short somewhere.

I've got a bad DisplayPort to HDMI cable that looked exactly like that. I just had to wiggle on it a bit to make it work a few minutes ago when I replaced my :( bad 1080 :( with a temporary 1060.

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.
'[a dollar] bill wrapped around his dick'

way to dox my ultimate sex move >:[

But seriously, anything like that is almost always the cable. Hell, a ton of problems are due to the cable, especially DisplayPort.

AEMINAL
May 22, 2015

barf barf i am a dog, barf on your carpet, barf
Cool!

Thanks guys. Buying a new cable ASAP.

Also, join #gbs on irc.synirc.net if you want to see how deep the rabbit hole goes...

flashman
Dec 16, 2003

My dad's 660 Ti is kicking the bucket, so I'm going to buy a 1080 and hand my 970 down to him. They've dropped a bit in price, and the two cheapest ones I'm interested in are the MSI Armor and the Asus Strix 8AG. I've read a bunch of benchmarks and there doesn't seem to be much difference in performance between the two; are there any red flags about either of them in particular, or does it matter at all? These are still like 700 dollars Canadian, so I'm a bit leery about buying something that has a known issue. Google gives a lot of suspect or lovely results to my query.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Anyone here have a Quadro mobile GPU in their laptop who could tell me whether they see G-Sync support in the NVIDIA control panel? I'm wondering whether a K2000M would support it on an external monitor or not. It's not in the official compatibility list so I'm guessing not... but there are no other Quadros in the list either, and it looks to be more or less a full GK107 GPU, which is riiiiiiight on the edge of when they added support...

(but again I'm guessing not)

edit: pretty sure that's a no, looks like GM204 and above with mobile chips...

Paul MaudDib fucked around with this message at 05:23 on Mar 25, 2017

1gnoirents
Jun 28, 2014

hello :)

flashman posted:

My dad's 660 Ti is kicking the bucket, so I'm going to buy a 1080 and hand my 970 down to him. They've dropped a bit in price, and the two cheapest ones I'm interested in are the MSI Armor and the Asus Strix 8AG. I've read a bunch of benchmarks and there doesn't seem to be much difference in performance between the two; are there any red flags about either of them in particular, or does it matter at all? These are still like 700 dollars Canadian, so I'm a bit leery about buying something that has a known issue. Google gives a lot of suspect or lovely results to my query.

MSI Armor is usually the "second best cooler" and the strix is the "best" in their lineup. But in general no, those are two good cards. Only thing I'd double check is noise and temps between the two

SlayVus
Jul 10, 2009
Grimey Drawer

1gnoirents posted:

MSI Armor is usually the "second best cooler" and the [s]strix[/s] Gaming X is the "best" in their lineup. But in general no, those are two good cards. Only thing I'd double check is noise and temps between the two

FTFY, Strix is ASUS.

Regrettable
Jan 5, 2010



1gnoirents posted:

MSI Armor is usually the "second best cooler" and the strix is the "best" in their line up usually. But in general no those are two good cards . only thing I'd double check is noise and temps between the two

I've been looking at an MSI Armor 1080 as well, and I've had a difficult time finding reviews from a site I trust. The closest I could find was a 1070 Armor review that makes temperatures seem pretty decent. Are the temps roughly the same for 1080s as for 1070s?

Tenacious J
Nov 20, 2002

I've recently realized that I care much more about performance (fps) than graphics quality (resolution). I was told that going with a 144Hz monitor is "amazing" and "you'll never go back". I'm totally OK with 1080p, so what card would I need to drive a 144Hz rig? Is a 1080 overkill?


Dexo
Aug 15, 2009

A city that was to live by night after the wilderness had passed. A city that was to forge out of steel and blood-red neon its own peculiar wilderness.
80 percent of the people who play this game are probably gonna romance one of those two characters as a male Ryder.
