|
could i use something like this (if not this exact thing) https://www.newegg.com/Product/Prod...A8EF5VJ0319-_-2 to effectively mount a GPU wherever I wanted? Want to make my own PC case and I'm just kind of scoping out what options there may be
|
# ? Oct 20, 2018 21:23 |
|
Pretty sure that's what that is for. I'm going custom in-desk next build so I'll be looking at options like that as well.
|
# ? Oct 20, 2018 21:55 |
|
i want to make one out of a box of grenades and an ammo can or 2
|
# ? Oct 20, 2018 21:57 |
|
Has anyone played ghost recon wildlands? I just got it and I've been getting around 50 fps @ 1440p with my 1080ti. That seems low? I have all the settings on ultra but I didn't think it would be sub 60...
|
# ? Oct 20, 2018 22:21 |
|
VostokProgram posted:Has anyone played ghost recon wildlands? I just got it and I've been getting around 50 fps @ 1440p with my 1080ti. That seems low? I have all the settings on ultra but I didn't think it would be sub 60... i have that game @ play it at 1440p with a 1080ti. its not a high FPS game. what processor do you have? E: ill fire it up right now and see what kind of settings i have and what FPS i get
|
# ? Oct 20, 2018 22:21 |
|
Statutory Ape posted:i have that game @ play it at 1440p with a 1080ti 4790k, probably boosts to about 4.1 or 4.2 under load. Also thanks for checking!
|
# ? Oct 20, 2018 22:26 |
|
getting ~60 fps with an 8700k OC'd to 5, and the 1080ti is an EVGA FTW3 that i OC'd a bit beyond what they did. thats a hard game to pull fps in on good settings. E: ironically whatever settings i have it set to on my laptop smokes this computer in FPS and it still looks pretty good. guessing the game doesnt like higher resolutions. looks amazing tho for an open world game! Worf fucked around with this message at 22:31 on Oct 20, 2018 |
# ? Oct 20, 2018 22:28 |
|
Huh, guess I'm fine then. I have long range shadows on, anisotropic filtering at 16x, and the draw distance at the next setting up (probably ultra). That might account for the 10ish fps delta between us. Next time I'm at home I might play around with turning some texture quality stuff down to "just" high and see if I get a lot more fps out of that
|
# ? Oct 20, 2018 22:48 |
|
Aren't all Ubisoft games super CPU/core bottlenecked - perhaps in part because of Denuvo DRM? It sure as poo poo wasn't because of their GPUs that AMD sponsored AC:O. Usually, Nvidia make great graphics settings guides for games, but I can't find one for Wildlands.
|
# ? Oct 20, 2018 22:49 |
|
ufarn posted:Aren't all Ubisoft games super CPU and core bottlenecked - perhaps in part because of Denuvo DRM? I've been watching my CPU usage in task manager while I play and it doesn't *look* like it's CPU bound. Although now that I think about it, it could easily be memory bound because I only have ddr3 1600 lol
|
# ? Oct 20, 2018 22:51 |
|
i have the thing upscaling 40%, which i honestly dont even know what it does, but it drops me down a shitload of FPS so i assume it looks better somewhere. i am actually bad at computers
|
# ? Oct 20, 2018 22:55 |
|
Might be it's just a terribly optimized game. It is Ubisoft after all.
|
# ? Oct 20, 2018 22:57 |
|
Statutory Ape posted:i have the thing upscaling 40% which i honestly dont even know what it does, but it drops me down a shitload of FPS so i assume it looks better somewhere 40% upscaling means it's rendering 1.4 times pixels your monitor has.
|
# ? Oct 20, 2018 23:02 |
|
Craptacular! posted:The basic gist of it is once people have a good 144hz 1440p performance below a certain price level, people who aren’t already at 1440/144 will buy Freesync displays and get locked in regardless of what the competing card does. Nvidia can drop 2070s all they want, but Gsync monitors don’t keep dropping like cards do; they’re just eternally higher than their competition.

The price point for 1440/144 performance has not been going down, and when the next-gen consoles hit it's going to spike hard again, plus 4k panels will start dropping like a rock in price and that will become the new standard. AMD is absolutely hosed if they can't catch up to Nvidia's architectural lead, and there's no reason to believe they will.

Also, Freesync displays don't lock you in - they cost pretty much the same as a normal display, so you can just turn around and sell it to anyone who wants an ordinary-rear end monitor at any point, or swap it over to be your second monitor. Only G-sync's price premium makes any sort of lock-in happen, but AMD is irrelevant to anyone willing to pay that premium unless they can contest the high end again, which almost certainly won't happen until any current monitor is outdated - meaning those people are always going to buy Nvidia anyway.

It'd be great if AMD could catch up and drive high end GPU prices down, but it's a complete pipe dream. Intel managing to do so is far more likely, and that's still a minimum of 5-7 years away if everything goes perfectly.
|
# ? Oct 20, 2018 23:04 |
|
Truga posted:40% upscaling means it's rendering 1.4 times pixels your monitor has. is that bad
|
# ? Oct 20, 2018 23:08 |
|
It's supersampling, which is a really high-quality AA that comes with a huge performance penalty. Supersampling is generally not worth it unless you're doing at least 2x native, so I'd probably turn it off. It's mostly useful as a way to get really good image quality out of old games where you can run 4x supersampling and still get 200 FPS.
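The "render big, average back down" part is easy to sketch if anyone's curious - toy Python with made-up grayscale values, nothing from the actual game:

```python
# Toy 2x supersampling downsample: average each 2x2 block of the
# oversized render into one output pixel (a simple box filter).

def downsample_2x(img):
    """img is a 2N x 2N grid of brightness values; returns the N x N average."""
    n = len(img) // 2
    out = [[0.0] * n for _ in range(n)]
    for y in range(n):
        for x in range(n):
            block = (img[2*y][2*x] + img[2*y][2*x+1] +
                     img[2*y+1][2*x] + img[2*y+1][2*x+1])
            out[y][x] = block / 4.0
    return out

# A hard black/white edge rendered at 2x native resolution...
hi_res = [
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 1.0, 1.0, 1.0],
    [0.0, 1.0, 1.0, 1.0],
]
# ...downsamples to an intermediate gray along the jagged step,
# which is exactly the edge smoothing you pay all those pixels for.
print(downsample_2x(hi_res))  # [[0.0, 1.0], [0.5, 1.0]]
```

The 0.5 in the output is the antialiased edge pixel; the GPU paid for four shaded samples to get it.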
|
# ? Oct 20, 2018 23:11 |
|
supersampling is a very basic and power hungry form of antialiasing, though it also produces the best results i think. tl;dr version is, the game is drawing a picture bigger than your screen, then it gets scaled down to your screen size and the extra information is used to make the picture appear smoother. e;fb
|
# ? Oct 20, 2018 23:12 |
|
wait if statutory ape is 1.4x supersampled and is still getting higher framerate than me then maybe i'm hosed
|
# ? Oct 20, 2018 23:17 |
|
Truga posted:40% upscaling means it's rendering 1.4 times pixels your monitor has. Usually the value controls the amount per axis, so it's 1.96 times as many pixels~ Though a few do it by total pixel count instead. Llamadeus fucked around with this message at 00:19 on Oct 21, 2018 |
# ? Oct 20, 2018 23:39 |
|
I remember 1 shadow or lighting setting that added like 50 FPS setting it from ultra to high. You could definitely see a difference, I don't know what those people talking about high looks almost as good as ultra are on about, but going from 50 to 100 FPS was worth the image quality cut to me.
|
# ? Oct 20, 2018 23:40 |
|
ufarn posted:Aren't all Ubisoft games super CPU/core bottlenecked - perhaps in part because of Denuvo DRM? Ubisoft games a few years ago were notorious for memory leaks as well. It's funny because all their games are the same - drive around and kill stuff in an open world - but they still haven't optimized it at all.
|
# ? Oct 20, 2018 23:43 |
|
Identifying a CPU limit in a game is trivially easy: disable AA modes and crank the resolution down (like take that upscaling setting and move it to negative 50%). If your FPS barely changes at all, you are at a CPU limit. If on the other hand it goes up significantly, then it's a GPU limit and you shouldn't worry about the CPU. Also I don't know if it is the DRM or what, but Ubisoft does suck at this - Far Cry 5 on a 4.8 GHz i7-7700k/GTX1080 is totally CPU limited even at 1080p with a heavy AA mode. The base engine they are using is like 10 years old at this point; you would think they would have ironed out some optimizations by now.
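That resolution-drop test is basically a one-liner. Rough Python sketch - the fps numbers and the 10% cutoff are made up for illustration, not measured from anything:

```python
def likely_bottleneck(fps_native, fps_low_res, threshold=0.10):
    """Crude heuristic version of the test above: crank the resolution way
    down and compare. If fps barely moves, the CPU was the wall all along;
    if it jumps, you were GPU limited. The 10% threshold is an arbitrary
    guess, not a benchmarked figure."""
    gain = (fps_low_res - fps_native) / fps_native
    return "CPU-limited" if gain < threshold else "GPU-limited"

print(likely_bottleneck(60, 62))   # barely moved at low res -> "CPU-limited"
print(likely_bottleneck(60, 110))  # big jump at low res -> "GPU-limited"
```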
|
# ? Oct 20, 2018 23:58 |
|
Truga posted:40% upscaling means it's rendering 1.4 times pixels your monitor has. It's worse than that - 40% supersampling means 40% scaling along each axis, so (1.4)^2 = 1.96x the number of pixels. So 40% supersampling reduces frame rate by a bit less than half, probably somewhere around a 30-40% performance hit. E:fb! VostokProgram posted:wait if statutory ape is 1.4x supersampled and is still getting higher framerate than me then maybe i'm hosed Ghost Recon has a couple of settings that are big performance hits for not much gain. This guy is getting a solid 60 fps on a 4790K/1080 @ 1440p just by turning the shadows down to "Very High" and turning off turf effects. If you like the turf effects, you could probably get similar performance by switching the level of detail to "very high", which doesn't really affect graphical fidelity much. Stickman fucked around with this message at 00:09 on Oct 21, 2018 |
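The per-axis arithmetic, for anyone following along (quick Python sketch):

```python
def supersample_cost(percent_per_axis):
    """A +N% per-axis supersampling slider scales both width and height,
    so the pixel count (and roughly the shading cost) grows by the square."""
    scale = 1 + percent_per_axis / 100
    return round(scale * scale, 4)

print(supersample_cost(40))   # 1.96 -> almost double the pixels
print(supersample_cost(100))  # 4.0 -> classic 4x supersampling
```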
# ? Oct 20, 2018 23:58 |
|
Turf effects in that game rule
|
# ? Oct 21, 2018 00:04 |
|
VostokProgram posted:wait if statutory ape is 1.4x supersampled and is still getting higher framerate than me then maybe i'm hosed Wildlands is like one of the most common examples used when talking about the massive performance hit "Ultra" has over "High" without much - if any - visible improvement in fidelity. Shadows and anything to do with clouds are probably the culprits.
|
# ? Oct 21, 2018 00:55 |
|
Llamadeus posted:Usually the value controls the amount per axis, so it's 1.96 times as many pixels~ Oops! I figured everyone does it by pixels, since nvidia does it by pixel count - i have to set 4x scaling in control panel to get 5120x3200 IIRC, which was very counterintuitive to me. e: Is the latest nvidia driver safe right now? I want to test steamvr beta poo poo and I'll need the latest driver, I think lol Truga fucked around with this message at 01:12 on Oct 21, 2018 |
# ? Oct 21, 2018 01:10 |
|
VostokProgram posted:Has anyone played ghost recon wildlands? I just got it and I've been getting around 50 fps @ 1440p with my 1080ti. That seems low? I have all the settings on ultra but I didn't think it would be sub 60... Ultra settings in that game bring GPUs to their knees. Your results are in line with most benchmarks out there, and you are most definitely not CPU bound at those settings unless you're using a Core 2 Duo or something.
|
# ? Oct 21, 2018 01:31 |
|
K8.0 posted:The price point for 1440/144 performance has not been going down, and when the next-gen consoles hit it's going to spike hard again, plus 4k panels will start dropping like a rock in price and that will become the new standard. My understanding was the 1080 was the 1440/144/Gsync ideal card and the 1080ti the 4K/60 card. I do sort of think AMD need to tie the 1080ti to be in it again, but that’s because if you’re the kind of person who goes hard into refresh rate wanking you’re less likely to buy an AMD CPU.
|
# ? Oct 21, 2018 04:45 |
|
Craptacular! posted:My understanding was the 1080 was the 1440/144/Gsync ideal card and the 1080ti the 4K/60 card. I do sort of think AMD need to tie the 1080ti to be in it again, but that’s because if you’re the kind of person who goes hard into refresh rate wanking you’re less likely to buy an AMD CPU. I think that was more true when the cards were released, not as much now. I've got a 1080ti and new games are trending a lot closer to 60-80fps average at 1440p high settings for me now. That's not bad for sure, but it's pretty far from hitting 144 e: not all new games though, Forza Horizon 4 looks awesome and can keep it pegged up near my cap of 143 Enos Cabell fucked around with this message at 04:53 on Oct 21, 2018 |
# ? Oct 21, 2018 04:50 |
|
So what you’re saying is... I waited so long to get my desired monitor price for this card... that I need a new card.
|
# ? Oct 21, 2018 04:55 |
|
VostokProgram posted:4790k, probably boosts to about 4.1 or 4.2 under load. Also thanks for checking! Should be going to 4.4GHz stock.
|
# ? Oct 21, 2018 06:46 |
|
Craptacular! posted:So what you’re saying is... I waited so long to get my desired monitor price for this card... that I need a new card. Oh I think you'd still be really happy with it, maybe just temper expectations a bit. I only upgraded from my 1080 because I needed another card for a second PC.
|
# ? Oct 21, 2018 07:41 |
|
Craptacular! posted:My understanding was the 1080 was the 1440/144/Gsync ideal card and the 1080ti the 4K/60 card. I do sort of think AMD need to tie the 1080ti to be in it again, but that’s because if you’re the kind of person who goes hard into refresh rate wanking you’re less likely to buy an AMD CPU. The 1080ti is borderline for 4k/60. It has enough headroom to do it in some games, but most newer, GPU heavy games are going to give it trouble. The 2080ti is the only card that'll maintain 4k/60 with max settings on pretty much everything.
|
# ? Oct 21, 2018 07:47 |
|
Really wasn't planning to spend this Sunday afternoon computer janitoring, but alas, I noticed colors were looking wonky, and sure enough the monitor had got completely stuck in YCbCr 422 mode for no apparent reason whatsoever. Attempting to change back to RGB from the Nvidia control panel was impossible - changing away from "use default color settings" to "use NVIDIA color settings" and hitting apply just immediately went back to default.

Doing a clean driver reinstall via Geforce Experience changed jack poo poo. Rebooting multiple times changed jack poo poo. Booting into safe mode did solve it while in safe mode, so it was clearly a driver issue.

According to the internet the generally accepted way of resolving this is to use DDU in safe mode to completely reinstall the drivers, which I did, and it did solve the problem, but what the gently caress? Why? There must be some specific configuration poo poo that can be cleared out in a targeted way rather than reinstalling the entire goddamned thing.
|
# ? Oct 21, 2018 14:29 |
|
TheFluff posted:Really wasn't planning to spend this Sunday afternoon computer janitoring, but alas, I noticed colors were looking wonky, and sure enough the monitor had got completely stuck in YCbCr 422 mode for no apparent reason whatsoever. Attempting to change back to RGB from the Nvidia control panel was impossible - changing away from "use default color settings" to "use NVIDIA color settings" and hitting apply just immediately went back to default. Doing a clean driver reinstall via Geforce Experience changed jack poo poo. Rebooting multiple times changed jack poo poo. Booting into safe mode did solve it while in safe mode, so it was clearly a driver issue. Yeah, I can kind of see why nvidia is still using an expensive FPGA for gsync hdr - there is no point in taping out an ASIC for it in the absence of enough displayport or hdmi bandwidth to drive HDR 4:4:4 at the 144 Hz target refresh rate. With the current 8b/10b encoding used in HDMI and DP, you'd need 60 gbps to deliver 4k HDR at 144 Hz. If they ever target 4k HDR 240 Hz they would need over 100 gbps on the cable to pull it off.
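Back-of-envelope in Python for where numbers like that come from - active pixels only, 10-bit RGB assumed for HDR; real links also carry blanking intervals, so actual cable requirements land meaningfully higher than this:

```python
# Rough link-bandwidth math: active pixels * bits per pixel * refresh rate,
# then the 8b/10b line-code penalty (10 bits on the wire per 8 data bits).
# Blanking intervals are ignored here, so treat these as lower bounds.

def link_rate_gbps(width, height, hz, bits_per_channel=10, channels=3):
    data_bits = width * height * hz * bits_per_channel * channels
    return data_bits * 10 / 8 / 1e9  # 8b/10b overhead

print(round(link_rate_gbps(3840, 2160, 144), 1))  # ~44.8 Gbps, active pixels only
print(round(link_rate_gbps(3840, 2160, 240), 1))  # ~74.6 Gbps, active pixels only
```

Add blanking and you can see why 4k/144 HDR already outruns DP 1.4's raw rate without tricks like chroma subsampling.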
|
# ? Oct 21, 2018 16:21 |
|
Indiana_Krom posted:Yeah, I can kind of see why nvidia is still using an expensive FPGA for gsync hdr, there is no point to taping out an ASIC for it in the absence of enough displayport or hdmi bandwidth to drive HDR 4:4:4 at the 144 Hz target refresh rate. With the current 8b/10b encoding used in HDMI and DP, you'd need 60 gbps to deliver 144 Hz HDR. If they ever target 4k HDR 240 Hz they would need over 100 gbps on the cable to pull it off. I have a standard 60Hz 4K SDR monitor though, there is no reason at all for me to ever use 4:2:2 and I sure as heck haven't set it myself. I just sat down at the computer today and noticed the colors were off. Might possibly have come with the last driver update, but I dunno.
|
# ? Oct 21, 2018 16:28 |
|
The 416 series driver from NVidia has intermittent problems waking up from display-off idle. For gently caress's sake, NVidia, what the hell is going on over there?!
|
# ? Oct 21, 2018 17:39 |
|
VelociBacon posted:Should be going to 4.4GHz stock. I think it's maybe thermal limited since I'm using the stock cooler on it. I'll have to double check but on full 100% load in handbrake for example it'll even go as low as 3.9
|
# ? Oct 21, 2018 19:02 |
|
Combat Pretzel posted:The 416 series driver from NVidia has intermittent problems waking up from display-off idle. For gently caress's sake, NVidia, what the hell is going on over there?! The whole 400 series of drivers has been crap. Watch as Nvidia gets a complete pass on this, while AMD will get roasted for not having perfect performance in one out of ten games the next time they launch a genuinely new product. The whole situation is poo poo - Nvidia had a terrible, terrible driver when both Vista and 7 launched, and nobody batted an eye. Nobody has a lock on 'good' drivers, and nobody has for years.
|
# ? Oct 21, 2018 20:34 |
|
Is there anything specific I should be looking out for if I were to want to buy a 1080 Ti off ebay? I know they're mostly mining cards, but I'm seeing them between $499-599 right now, mostly listed as "gently used" or "always ran at 65º", and I just don't want to drop that kinda cash if I end up getting burned. Paypal usually sides with the buyer if I get a dud, but I'd rather not go through that if I can avoid it.
|
# ? Oct 21, 2018 21:54 |