|
What were other card fuckups? Two that spring to mind were MSI stickergate, and Zotac reference 970's managing to not be able to SLI with any other 970's besides themselves.
|
# ? Mar 23, 2017 16:57 |
|
|
Zero VGS posted:What were other card fuckups? Two that spring to mind were MSI stickergate, and Zotac reference 970's managing to not be able to SLI with any other 970's besides themselves. Can't remember the make, might be MSI, but I have a 750 Ti that won't run its noisy fan at anything below 35% or so, even when editing the fan curve. I just fitted a passive cooler in the end and ran it for years. E: it's EVGA, not MSI. Think there was some raging about heat pipes not actually being connected on some other card as well. E: the EVGA 970. GRINDCORE MEGGIDO fucked around with this message at 18:09 on Mar 23, 2017 |
# ? Mar 23, 2017 17:11 |
|
GRINDCORE MEGGIDO posted:Can't remember the make, might be MSI, but i have a 750ti that won't run its noisy fan at anything below 35% or something like that, even when editing the fan curve. I just fitted a passive cooler in the end and ran it for years. I'm certain that was the EVGA 750 Ti FTW; its fans were locked to a comically high RPM for a 75W GPU. A BIOS flash made it slightly quieter, but it was still loud as gently caress even though it was running like 5°C over ambient.
|
# ? Mar 23, 2017 17:18 |
|
Yep, that's the same problem. Mine's EVGA, you're right, but a single-fan SC model with no external power connector.
|
# ? Mar 23, 2017 18:08 |
|
GRINDCORE MEGGIDO posted:Yep, thats the same problem. Mines EVGA, you're right, but a single fan SC model with no external power connector. If the noise is still bothering you, you could probably unplug the fan, velcro a 140mm fan over the whole thing, then plug that into the motherboard; it would be dead silent and stay well under 70°C, I'm sure. They ran super cool.
|
# ? Mar 23, 2017 18:46 |
|
redeyes posted:Does anyone know how to rip WATTMAN out of the current AMD drivers? I don't want it and I think it might be making my stuff crash. Nuke everything with DDU, then do a custom install of just the driver without any of the garbage extras ticked (not sure if this is still possible tbh, I'm team green)
|
# ? Mar 23, 2017 19:57 |
|
The latest 3DMark update added Vulkan to the API overhead test. Is this strictly a software test or does it also benchmark the CPU's ability to handle draw calls? It'd be interesting to see how Ryzen compares to Kaby Lake, because allegedly AMD CPUs are not as efficient at handling draw calls (which would explain the lower FPS at 1080p), but Futuremark decided not to include the API overhead test in their advanced search. Ballpark results from my lowly Haswell/1060/Win10 system:

DX11 single-thread: 1.23 million draw calls per second
DX11 multi-thread: 1.37 million
DX12: 12.23 million
Vulkan: 13.95 million
|
# ? Mar 23, 2017 20:04 |
|
Guru3D ran it on a few different setups: http://www.guru3d.com/news-story/quick-test-futuremark-3dmark-v2-3-3663-vulkan-api-overhead-benchmarks.html Strange how Nvidia is doing better in Vulkan than DX12; you'd think the DX12 side of the driver would be more mature, since DX12 has been around longer and is used in more games, after all.
|
# ? Mar 23, 2017 20:59 |
|
Still no MSI 1080tis?
|
# ? Mar 23, 2017 22:03 |
|
Knifegrab posted:Still no MSI 1080tis? Almost no details, just the picture really. http://www.tomshardware.com/news/msi-geforce-gtx-1080ti-graphics-cards,33965.html
|
# ? Mar 23, 2017 23:05 |
|
slightly better than a 1080 box picture with "ti" terribly MSPainted in
|
# ? Mar 23, 2017 23:08 |
|
Zero VGS posted:If the noise is still bothering you, you could probably unplug the fan, velcro a 140mm fan over the whole thing, then plug that into the motherboard and it would be dead silent and stay well under 70c I'm sure. They ran super cool. That would work great. I replaced the cooler with an Accelero and ran it passive. I've got a 1080 now; that was a good upgrade.
|
# ? Mar 23, 2017 23:38 |
|
SlayVus posted:Almost no details, just the picture really. https://www.techpowerup.com/231589/msi-geforce-gtx-1080-ti-gaming-x-pictured Unfortunately it has a 2.5-slot cooler now, so it won't fit into a Dan A4 (and even if it would, they changed the orientation of the heatsink fins by 90°, so you'll push even less of the hot air out of the case). Oh well.
|
# ? Mar 23, 2017 23:58 |
|
eames posted:The latest 3DMark update added Vulkan to the API overhead test. Is this strictly a software test or does it also benchmark the CPU's ability to handle draw calls? Speaking of Vulkan, AMD has released "Anvil": http://gpuopen.com/gaming-product/anvil-vulkan-framework/ If I'm reading things right, it's a ready-made drop-in wrapper for existing products and a framework for future ones that automatically integrates all of AMD's optimizations, so my impression is that anything using Anvil will have performance close to Doom's. Good on AMD? I mean, it's meant to simplify the process and it's optimized for AMD hardware already, so it looks like it takes a lot of the work out, and devs using it would only need to make sure Nvidia runs acceptably; as above, Nvidia seems to run better on Vulkan as well. It's AMD though, so I'm trying to figure out the disadvantage.
|
# ? Mar 24, 2017 01:54 |
|
FaustianQ posted:Speaking of Vulkan, AMD has released "Anvil" http://gpuopen.com/gaming-product/anvil-vulkan-framework/ hahahahah I told you fucks that GPU manufacturers wouldn't be able to hold back from intervening when lovely DX12/Vulkan coders wrote garbage code that ran like poo poo on your architecture, the pressure to have your hardware look good isn't going anywhere here we go, next stop driver-optimization town edit: Paul MaudDib posted:They won't be coming out with DX11/OpenGL-style wrappers, but I think there's still going to be places where they could inject code to optimize games. You would just need to inject directly into the game's binary or onto the card itself (the "Gameworks rendering pipeline" xthetenth is talking about), rather than into an API shim between the game and the card. oh boy was I wrong, here comes the DX11/OpenGL style wrappers Paul MaudDib fucked around with this message at 02:44 on Mar 24, 2017 |
# ? Mar 24, 2017 02:30 |
|
Paul MaudDib posted:hahahahah I told you fucks that GPU manufacturers wouldn't be able to hold back from intervening when lovely DX12/Vulkan coders wrote garbage code that ran like poo poo on your architecture, the pressure to have your hardware look good isn't going anywhere It is funny how we've come around, but even so it makes sense. A world where you can choose between a well-defined low-level API and a high-level API implemented in terms of the low-level API is still better than one where your only option is a high-level API with pure magic underneath.
|
# ? Mar 24, 2017 02:52 |
|
VostokProgram posted:It is funny how we've come around, but even so it makes sense. A world where you can choose between using a well-defined low-level API and a high-level API implemented in terms of the low-level API is still better than one where your only option is a high-level API with pure magic underneath I agree in theory. It's fine to have a framework which does things for you so writing an engine isn't so terribly burdensome, but the problem is that it's a real short jump from there to "wow, our framerates in Fallout 5 are terrible, what if we detect the game is active and then sub in a different version that optimizes calls or lets SLI/CF profiles work/etc," and then we're right back to the problems with DX11/OpenGL. After all, what you've just created is exactly the kind of shim layer that DX12/Vulkan were explicitly not supposed to have. Especially since, in the feature list, they seem to be trying to position this as an OpenGL-to-Vulkan wrapper; this is definitely not going to be something you can compile in and have your game magically run faster than the hand-tweaked-by-driver-wizards version. Paul MaudDib fucked around with this message at 03:12 on Mar 24, 2017 |
# ? Mar 24, 2017 03:10 |
|
I'm not seeing anything particularly AMD-specific or high level in Anvil. It's mostly just RAII wrappers over raw Vulkan primitives which make it harder to screw up and leak memory/resources, and some random convenience functions
|
# ? Mar 24, 2017 03:14 |
|
repiv posted:I'm not seeing anything particularly AMD-specific or high level in Anvil. It's mostly just RAII wrappers over raw Vulkan primitives which make it harder to screw up and leak memory/resources, and some random convenience functions Chekhov's compiler: if the runtime GLSL transpiler is not meant to be used in the third act, it should not be placed in the framework in the first act, or else it should be a static transpiler instead of a runtime one. Literally what else would you do with a runtime GLSL transpiler except hotpatch optimizations into some shitlord's terrible GLSL code after they've written it? Every single other use could be done statically. Less facetiously, pick anything on this list: quote:More complicated wrappers: Paul MaudDib fucked around with this message at 03:29 on Mar 24, 2017 |
# ? Mar 24, 2017 03:24 |
|
repiv posted:I'm not seeing anything particularly AMD-specific or high level in Anvil. It's mostly just RAII wrappers over raw Vulkan primitives which make it harder to screw up and leak memory/resources, and some random convenience functions Besides what Paul has posted, I'd also point out quote:Anvil comes with integrated support for AMD-specific Vulkan extensions, but works on any Vulkan implementation. The only ones I know of are Shader Intrinsics, but I'd assume it contains all of AMD and id's collaboration on Doom's Vulkan implementation. Yes or no, would this allow AMD to bypass their currently terrible Linux drivers, assuming the game has a Vulkan/Anvil implementation? They've been making huge Linux pushes within the past year, so I dunno.
|
# ? Mar 24, 2017 03:47 |
|
Paul MaudDib posted:Literally what else would you do with a runtime GLSL transpiler except hotpatch optimizations into some shitlord's terrible GLSL code after they've written it? Every single other use could be done statically. It's very useful for debugging - you can press a hotkey and have the engine reload and recompile your shaders on the fly. Huge timesaver compared to doing a full rebuild and having to restart the game. I don't see why GLSL is special with regard to GPU vendor hotpatching either; they can just as easily hotpatch SPIR-V bytecode. DirectX has exclusively used bytecode forever and that never stopped the driver wizards sticking their noses in there. FaustianQ posted:The only ones I know of are Shader Intrinsics, but I'd assume that it contains all of AMD and Id's collaboration on Doom's Vulkan implementation. I'm only skimming the code (it's late, okay) but as best as I can tell, "integrated support for AMD-specific Vulkan extensions" just means their C++ wrappers have methods which pass through to the raw C extension functions. There's no magic that automatically uses those extensions; they're just re-surfaced through the Anvil API should a developer want to do all the work of actually exploiting an extension themselves. There's certainly no magic that injects GCN shader intrinsics automatically - if that were possible then they wouldn't need to be intrinsics, they could just apply that magic globally in the driver's compiler backend. This stuff is seriously benign, really. It just makes Vulkan slightly less annoying/tedious to use. repiv fucked around with this message at 04:34 on Mar 24, 2017 |
# ? Mar 24, 2017 03:51 |
|
Woah, an AIB 1080 Ti that's in stock and not overpriced. https://www.newegg.com/Product/Prod...ID=7057735&SID= I'm still waiting for a more premium model but hopefully this means they're coming soon.
|
# ? Mar 24, 2017 21:25 |
|
So, um, I've been having this super odd issue with my EVGA 1070 SC connected to my DVI monitor; it's hard to explain, so I shot a video of it happening: https://www.youtube.com/watch?v=LjxGj9Zhc4g Has anyone encountered this before? I blew out the input and cable ends and it seemed to help for a while; changing screen Hz also seems to reset it. Could it just be a bad DVI cable? I only have one, sadly. edit: it comes and goes, sometimes it will go away for days. AEMINAL fucked around with this message at 23:00 on Mar 24, 2017 |
# ? Mar 24, 2017 22:55 |
|
AEMINAL posted:So, um, I've been having this super odd issue with my EVGA 1070 SC connected to my DVI monitor, its hard to explain so i shot a video of it happening: I appreciate the Art of Noise background music. Are your drivers up to date? How old is the monitor?
|
# ? Mar 24, 2017 23:04 |
|
CaptainSarcastic posted:I appreciate the Art of Noise background music. Are your drivers up to date? How old is the monitor? My drivers are up to date, and the monitor is a Samsung SyncMaster 940BW, so ancient. I didn't have this issue before on my GTX 960. edit: thank you for revealing the Art of Noise to me, I had no idea this artist sampled them! super dope!! AEMINAL fucked around with this message at 23:16 on Mar 24, 2017 |
# ? Mar 24, 2017 23:07 |
|
Bad DVI cable, or a bad DVI port on one of the ends. Have you tried an HDMI cable or an HDMI-to-DVI adapter to see if it happens there too?
|
# ? Mar 24, 2017 23:19 |
|
EdEddnEddy posted:Bad DVI cable or bad DVI port on one of the ends. My main monitor using HDMI is fine, so it's probably this ancient DVI cable. I'll buy a new one and see. A bad card would have the same issue on both monitors, right? Never had any issues on my main HDMI one.
|
# ? Mar 24, 2017 23:20 |
|
AEMINAL posted:My main monitor using HDMI is fine, its probably this ancient DVI cable. I'll buy a new one and see. Can you try the DVI on the other monitor just to check?
|
# ? Mar 24, 2017 23:22 |
|
EdEddnEddy posted:Can you try the DVI on the other monitor just to check? It's a dumb modern Samsung and only has VGA/HDMI for some baffling reason. I'll just buy a new DVI cable. It used to be way worse before I dusted it out, so I'm guessing there's a small short somewhere. AEMINAL fucked around with this message at 23:31 on Mar 24, 2017 |
# ? Mar 24, 2017 23:29 |
|
double post
|
# ? Mar 24, 2017 23:30 |
|
AEMINAL posted:It's a dumb modern samsung and only has VGA/HDMI for some baffling reason I've got a bad DisplayPort to HDMI cable that looked exactly like that. I just had to wiggle on it a bit to make it work a few minutes ago when I replaced my bad 1080 with a temporary 1060.
|
# ? Mar 25, 2017 00:43 |
'[a dollar] bill wrapped around his dick' way to dox my ultimate sex move >:[ But seriously, anything like that is almost always the cable. Hell, a ton of problems are due to the cable, especially DisplayPort.
|
|
# ? Mar 25, 2017 01:19 |
|
Cool! Thanks guys. Buying a new cable ASAP.
|
# ? Mar 25, 2017 01:23 |
|
My dad's 660 Ti is kicking the bucket, so I'm going to buy a 1080 and hand my 970 down to him. They've dropped a bit in price, and the two cheapest ones I'm interested in are the MSI Armor and the Asus Strix 8AG. I've read a bunch of benchmarks and there doesn't seem to be much difference in performance between the two; are there any red flags about either of them in particular, or does it matter at all? These are still like 700 dollars Canadian, so I'm a bit leery about buying something that has a known issue. Google gives a lot of suspect or lovely results to my query.
|
# ? Mar 25, 2017 04:51 |
|
Anyone here have a Quadro mobile GPU in their laptop who could tell me whether they see G-Sync support in the NVIDIA control panel? I'm wondering whether a K2000M would support it on an external monitor or not. It's not in the official compatibility list so I'm guessing not... but there's no other Quadros in the list either and it looks to be more or less a full GK107 GPU which is riiiiiiight on the edge of when they added support... (but again I'm guessing not) edit: pretty sure that's a no, looks like GM204 and above with mobile chips... Paul MaudDib fucked around with this message at 05:23 on Mar 25, 2017 |
# ? Mar 25, 2017 05:12 |
|
flashman posted:My dads 660ti is kicking the bucket so I'm going to buy a 1080 so I can hand me down my 970 to him. They've dropped a bit in price and the two cheapest ones that I am interested in are the MSI Armor or the Asus Strix 8AG. I've read a bunch of benchmarks and there doesn't seem to be much difference in performance in the two, is there any red flags about either of them in particular or does it matter at all? These are like 700 dollars Canadian still so I'm a bit leery about buying something that has a known issue. Google gives a lot of suspect or lovely results to my query. MSI Armor is usually the "second best cooler" and the strix is the "best" in their line up usually. But in general, no, those are two good cards. The only thing I'd double-check is noise and temps between the two.
|
# ? Mar 25, 2017 06:17 |
|
1gnoirents posted:MSI Armor is usually the "second best cooler" and the FTFY, Strix is ASUS.
|
# ? Mar 25, 2017 06:19 |
|
1gnoirents posted:MSI Armor is usually the "second best cooler" and the strix is the "best" in their line up usually. But in general no those are two good cards . only thing I'd double check is noise and temps between the two I've been looking at an MSI Armor 1080 as well, and I've had a difficult time finding reviews from a site I trust. The closest I could find was a 1070 Armor review that makes temperatures seem pretty decent. Are the temps roughly the same for 1080s as for 1070s?
|
# ? Mar 25, 2017 06:22 |
|
I've recently realized that I care much more about performance (fps) than graphics quality (resolution). I was told that going with a 144Hz monitor is "amazing" and "you'll never go back". I'm totally OK with 1080p, so what card would I need to drive a 144Hz rig? Is a 1080 overkill?
|
# ? Mar 25, 2017 06:34 |
|
|
80 percent of the people who play this game are probably gonna romance one of those two characters as a male Ryder.
|
# ? Mar 25, 2017 06:41 |