|
repiv posted:The RX 5500 has already leaked a bunch; the mobile version of this is probably going into the 16" MacBook Pros
|
# ? Oct 1, 2019 20:35 |
|
AMD has released new drivers for the next Tom Clancy game and also added Radeon Image Sharpening to the Vega cards. It seems some people wanted that.
|
# ? Oct 1, 2019 20:55 |
|
Not doing that from the start was a generally stupid decision all around. If there's one place RIS will have the most effect, it's on underpowered Vega chips that have to run at a lower resolution before upsampling. You know, like they do on, I dunno... ALL THEIR CURRENT loving APUs, WHICH ALL USE VEGA SILICON.
|
# ? Oct 1, 2019 21:01 |
|
Drakhoran posted:Amd has released new drivers for the next Tom Clancy game and also added Radeon Image Sharpening to the Vega cards. It seems some people wanted that. Nice. The curse of Raja is appeased. For now.
|
# ? Oct 1, 2019 21:17 |
|
So I've found a used Gainward GTX 1080 Ti Phoenix going for 470€ and I'm tempted to go for it. Reviews were a bit sparse, but apparently it's actually decently cool and quiet. Anyone know more about the manufacturer, or should I jump on it while it's still there?
|
# ? Oct 2, 2019 07:09 |
|
It's identical to the Palit GameRock, just with a red shroud, and it has a really good cooler. Euro only tho, so it's not much spoken of here. The price seems high for a used card when you can buy https://geizhals.de/powercolor-radeon-rx-5700-red-dragon-axrx-5700-8gbd6-3dhr-oc-a2122768.html?hloc=de for 379€, or the big daddy model https://geizhals.de/powercolor-radeon-rx-5700-xt-red-devil-axrx-5700xt-8gbd6-3dhe-oc-a2115899.html?hloc=de for 465€. I dunno, maybe the Nvidia premium is worth it to you sauer kraut fucked around with this message at 07:38 on Oct 2, 2019 |
# ? Oct 2, 2019 07:23 |
|
Oof, yeah that's a hard pass then. I just compared that price to the same card new and forgot to compare it to other cards.
|
# ? Oct 2, 2019 09:51 |
|
Is there anything to look forward to for someone who has a 1080 and doesn't want to spend more than 400-450€ on a new GPU? Or will I have to keep this GPU until death do us part? E: for 1440p/60 gaming
|
# ? Oct 2, 2019 12:42 |
|
TorakFade posted:Is there anything to look forward to, for someone who has a 1080 and does not want to spend more than 400-450€ for a new GPU ? Nvidia releasing 7nm GPUs sometime next year.
|
# ? Oct 2, 2019 13:13 |
|
A used 1080ti in 2019 probably means a very used 1080ti, unless you can get a receipt of purchase from within the past two years. I bought used myself, but that was a while ago, and not a 1080ti. Hopefully we get that 7nm node shrink next year with a bunch of cool options, but it's a bit late in the cycle to buy a 1080ti.
|
# ? Oct 2, 2019 14:18 |
|
I dunno euro prices, but would a 2070 Super fit the bill? I'm quite pleased with the performance coming from a 980 Ti. It does 1440p at 60+ if you don't mess with ray tracing.
|
# ? Oct 2, 2019 15:16 |
|
A 5700 XT would fit the price point pretty perfectly, a good one like the Gigabyte Gaming OC is about ~430€. But it's not a huge upgrade over a 1080. You're better off waiting if you don't want to spend more money.
Lambert fucked around with this message at 16:02 on Oct 2, 2019 |
# ? Oct 2, 2019 16:00 |
|
The 2070 Super is a definite upgrade, but it's around 600-650€ as far as I can tell, which is more than I'm comfortable spending right now, given that I'm relatively satisfied with current performance except in a few games where I have to tone down details (I go Ultra or Very High, don't want to go any lower if I can help it).

The 5700 XT is right around 450€, but it's like 10-15% better than a 1080 as far as I can tell... and mine, besides being a "gaming" SKU (EVGA FTW) with a pretty good factory overclock, still has 2 years' warranty since I got it on clearance last year just before the 20XX series came out.

All in all I think it's definitely not worth the money to me, since I can still play most games quite well. I'd rather wait for the 7nm Nvidia GPUs next year; by then maybe a 3060 will be a nice improvement for not too much money, or I could save more and splurge on a 3070/3080.

Thanks guys, I'll be waiting for exciting news. Hopefully AMD makes an even better comeback next year and Nvidia is forced to lower prices; it's totally insane that a high (not even top!) tier card costs 800-1000€
|
# ? Oct 2, 2019 18:23 |
|
Nvidia 436.51 update. This seems like a small update, but I guess enough people were complaining about it to get it pushed. https://nvidia.custhelp.com/app/answers/detail/a_id/4876?linkId=100000008374816 Two fixes: FIFA 19/FIFA 20/Star Wars: Battlefront II (2017) – game may randomly crash to desktop. Apex Legends: random flicker during gameplay.
|
# ? Oct 3, 2019 13:54 |
|
repiv posted:FWIW 436.48 also fixed a problem I was having with Tetris Effect, where it capped at 100fps and ran with weird jittery frame pacing for some reason. Now it's back at smooth unlocked FPS. I've been having issues with a controller (Xbox One via Bluetooth) causing severe fps drops lately, and every time it's been fixed by putting in new batteries. Something about controller vibration combined with low batteries affects the render speed of games; it's very weird.
|
# ? Oct 3, 2019 17:42 |
|
I have a GTX 1080 currently, and just recently bought a 1440p/144Hz monitor. Currently playing Destiny 2, I'm getting around 70-100 fps in typical play. I'd ideally like to push 120-140 fps in Destiny 2, as even turning everything to low or medium is barely enough of a bump (from 70 to 90) for my liking. I quickly spoiled myself with 144Hz and won't look back; the monitor thread wasn't wrong. Is a 2070/2080 capable of 144Hz 1440p in Destiny 2, or am I looking at a 2080 Ti (out of budget) or waiting on the 2020 models to be announced?
|
# ? Oct 3, 2019 17:43 |
|
onesixtwo posted:I have a GTX 1080 currently, and just recently bought a 1440p / 144hz monitor. Currently playing Destiny 2, i'm getting around 70-100 in typical play. I'd like to ideally push 120-140 fps in Destiny 2, as even turning everything to low or medium is barely a bump (from 70 to 90) enough to my liking. I quickly spoiled myself with 144hz and won't look back, the monitor thread wasn't wrong. I have a 1070 and get 85-120 fps at 1440p with a mix of high and highest settings aside from DoF which is off. Seems like there might be something else going on here.
|
# ? Oct 3, 2019 18:11 |
|
This site tells me a 1080 gets you 152+ FPS at 2560x1440 with high details already and a 2070 or 2070S would not get you considerably more. I'd take a look at your system first before buying anything else.
lllllllllllllllllll fucked around with this message at 18:23 on Oct 3, 2019 |
# ? Oct 3, 2019 18:17 |
|
Huh.. wow, yeah that's not even close to attainable for me at the moment. Even with 25% scaling (which is unplayable) I still won't go above 100 near enemies / another person. Something else is the culprit then, which is good to hear my 1080 is still viable in the end. Sounds like a new CPU might be in order of upgrade, have a two year old i5-8600K Coffee Lake 6-Core 3.6 GHz in it now, but didn't expect that to be my bottleneck at all. Also that site is fantastic, thank you!
|
# ? Oct 3, 2019 18:31 |
|
onesixtwo posted:Huh.. wow, yeah that's not even close to attainable for me at the moment. Even with 25% scaling (which is unplayable) I still won't go above 100 near enemies / another person. Something else is the culprit then, which is good to hear my 1080 is still viable in the end. Sounds like a new CPU might be in order of upgrade, have a two year old i5-8600K Coffee Lake 6-Core 3.6 GHz in it now, but didn't expect that to be my bottleneck at all. I highly doubt it's your CPU as I'm using a 6600k though I did OC it to 4.7GHz.
|
# ? Oct 3, 2019 18:44 |
|
Is it overclocked? If not, overclocking it will (probably) be all you need to do.
|
# ? Oct 3, 2019 18:44 |
|
I don't fully understand how to overclock, so I haven't toyed with the BIOS settings. I figured I'd learn if there ever seemed to be a solid reason to do so... this sounds like the time. My biggest fear is melting my card by plugging in a voltage number. Regrettable posted:I highly doubt it's your CPU as I'm using a 6600k though I did OC it to 4.7GHz. Cool, this gives me a target to look for. https://www.reddit.com/r/overclocking/comments/5ci9ju/i5_6600k_oc_47_ghz_1385v/ Reading this and a few other results, it seems I should be able to plug in 4.7 GHz and just play with the voltage until it doesn't crash on me in a game?
|
# ? Oct 3, 2019 19:46 |
|
You've updated drivers and all that good stuff? I'd get something to monitor thermals too. Most common culprit here is heat causing thermal throttling.
|
# ? Oct 3, 2019 19:50 |
|
Yeah, you should probably start by downloading HWiNFO so you can monitor temps while the game is running to see if anything is thermal throttling. This will also allow you to monitor your voltages in case you do decide to OC. If you do decide to OC, make sure you have adequate cooling. The maximum safe Vcore for an 8600k is 1.5V according to Intel but I wouldn't personally push it that far. Around 1.4V should be fine in most cases but may not even be necessary depending on how lucky you got with your CPU. You should also check out the Overclocking Megathread before you get started so you have a better idea of what you're doing: https://forums.somethingawful.com/showthread.php?threadid=3465021
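For what it's worth, the limits above can be written down as a trivial sanity check. This is a sketch only; the 1.4V practical ceiling and 1.5V Intel maximum are the figures from this post, and what your particular chip and cooling actually tolerate may be lower.

```python
# Hedged sketch of the Vcore sanity check described in the post above.
# Thresholds are the post's figures, not universal safe limits.
INTEL_MAX_VCORE = 1.50       # stated Intel maximum for the 8600K
CONSERVATIVE_VCORE = 1.40    # the post's suggested practical ceiling

def vcore_verdict(planned_volts: float) -> str:
    """Classify a planned overclock voltage against the two limits."""
    if planned_volts > INTEL_MAX_VCORE:
        return "over spec"
    if planned_volts > CONSERVATIVE_VCORE:
        return "risky"
    return "ok"

print(vcore_verdict(1.385))  # ok   (the 4.7 GHz example voltage upthread)
print(vcore_verdict(1.45))   # risky
```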
|
# ? Oct 3, 2019 20:17 |
|
My GPU drivers are up to date; if there have been significant improvements on the mobo/BIOS side, I haven't updated that since I initially did the build, so it'd be ~2 years dated. Otherwise it's Windows 10 64-bit, 16GB RAM, 860 PSU. Running idle, my 6 cores are showing... whatever this means inside of Core Temp. Open Hardware Monitor doesn't seem to display temps for my individual cores, just the CPU core. Sitting idle / max temps are from me playing Destiny 2 for ~5 minutes. Are those scary, or do you think an OC would be possible with it?
|
# ? Oct 3, 2019 20:19 |
|
Those are your individual core temps and 60c is well within the safe range. It shouldn't start thermal throttling until you hit around 90c. I'm more curious about your GPU temps at the moment.
|
# ? Oct 3, 2019 20:28 |
|
GPUCheck is often wildly inaccurate because, rather than actual testing, its numbers are based on an algorithm that makes some very simplified assumptions about settings scaling and the interaction between CPU and GPU power. It's definitely not a worthwhile substitute for proper benchmarks, or even random YouTube videos from people with similar settings.

Since Destiny 2 doesn't have any benchmarking tools, the frame rates you see in benchmarks or that other people report are going to be heavily dependent on what they're doing in the game, and especially on how many other players are nearby. That said, I have an 8400 + 1080 Ti, and at 1440p/Ultra settings I get 85-115 fps during the invasion fights near the Trostland landing zone. Turning the settings down to medium had basically no effect, but low was 95-125. With no nearby players, traversing worlds with only a few enemies nearby was often 144+ fps, but fighting groups of enemies was more often ~100-120 fps. That suggests to me that the firefights themselves might be limited by something other than the GPU, especially in proximity to other players. Those numbers also line up with this video's 2080/6700k performance, and with an 8700k/2080 Ti @ 1440p.

I suspect you might be able to get some small gains with a GPU upgrade, but consistent 144+ performance just doesn't seem possible. Depending on what you mean by "typical play", your numbers do look a little low for a 1080, though. If you're playing in borderless windowed mode, I'd try changing to fullscreen to see if that boosts performance. Also, if you have RAM faster than 2400MHz, check that you have the XMP profile set in the BIOS so it can run at full speed. You can check your RAM speed with CPU-Z - it will show half the listed data rate, because DDR transfers data twice per clock cycle.

E: Check that your RAM is running in dual-channel rather than single-channel mode too.

EE: Also disable VSync and set an FPS limit instead.
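A tiny illustration of the CPU-Z point above (the ×2 is the "double data rate" in DDR; the module ratings are just examples):

```python
# DDR SDRAM transfers data on both rising and falling clock edges,
# so tools like CPU-Z show the real clock: half the advertised data rate.
def ddr_data_rate(reported_clock_mhz: int) -> int:
    """Effective data rate (MT/s) from the memory clock CPU-Z shows."""
    return reported_clock_mhz * 2

# A DDR4-3200 kit shows up as ~1600 MHz in CPU-Z:
print(ddr_data_rate(1600))  # 3200
# A 2400 kit stuck at JEDEC defaults without XMP might show ~1066:
print(ddr_data_rate(1066))  # 2132
```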
Stickman fucked around with this message at 20:43 on Oct 3, 2019 |
# ? Oct 3, 2019 20:30 |
|
One more datapoint: I'm getting 60 fps with no noticeable drops at 4K on med/high settings with just a 1070 Ti and a Ryzen 2600. As people have already said, I'd look into whether your RAM is running in dual-channel mode and has XMP enabled, and then look at overclocking the CPU somewhat.
|
# ? Oct 3, 2019 20:36 |
|
Okay wow, already a significant improvement. XMP wasn't turned on because I apparently had no idea what it was and wasn't curious enough? Enabling that, my baseline is now floating around 100-144 without tweaking settings beyond putting it on High. The number sounds similar, but it's floating at a much higher value than before, when it would consistently dip below 90 more often than not. Same High settings that were giving me sub-100 anywhere but the menus. RAM is in dual channel; the next thing to look into is overclocking my CPU a minor amount, but I think this is totally doable without a purchase. Thanks all..
|
# ? Oct 3, 2019 20:57 |
|
Sweet, good to hear! Going by the videos I've been seeing, that sounds like about max performance even with much more expensive hardware.
|
# ? Oct 3, 2019 23:36 |
|
Does AMD still have ridiculously bad drivers? Looking to get a 2060S but was wondering if I should hop on the red wagon.
|
# ? Oct 4, 2019 06:24 |
|
Mr. Crow posted:Does AMD still have ridiculously bad drivers? Looking to get a 2060S but was wondering if I should hop on the red wagon. The last major issue I remember was them taking a long time to fix a crash in Overwatch, but it's pretty much a wash nowadays. For some reason the control panel software is much better on AMD's side, same with the Linux drivers, although it sounds like RDNA is having some teething issues. Arzachel fucked around with this message at 08:15 on Oct 4, 2019 |
# ? Oct 4, 2019 07:19 |
|
Arzachel posted:The last major issue I remember was them taking a long time to fix a crash in Overwatch but it's pretty much a wash nowadays. For some reason the control panel software is much better on AMD's side, same with Linux drivers although sounds like rdna is having some teething issues. Yeah, AMD drivers got a bit better and Nvidia drivers had a period where they got worse and buggier with every release, so it's about even now. There may be some games/edge cases where one is better than the other (HW encoding quality for streaming comes to mind), but driver quality shouldn't be a deciding factor anymore.
|
# ? Oct 4, 2019 10:30 |
|
So if I'm running Linux, AMD is doing better with drivers? I was looking to replace my Steam Link and pick up an Nvidia Shield as well; I'm assuming this doesn't work unless I've got an Nvidia card?
|
# ? Oct 4, 2019 18:54 |
|
Mr. Crow posted:So if I'm running Linux AMD is doing better with the drivers? I believe the Shield uses GameStream, which requires Nvidia. For what it's worth, I had more luck a couple years ago with Nvidia streaming through the Steam Link than AMD, but I did not do nearly enough testing to say whether or not it was specific to the GPU manufacturer.
|
# ? Oct 4, 2019 19:13 |
|
If you want to lose the Steam Link device for a more powerful device, Valve has slowly converted Steam Link into software for other devices. I know the current Apple TVs have a Steam Link app AND have added Xbox/PS4 controller support over Bluetooth with the most recent software update. They’re also one of the few devices that has Prime Video, if that interests you.
|
# ? Oct 4, 2019 22:36 |
|
Craptacular! posted:I know the current Apple TVs have a Steam Link app AND have added Xbox/PS4 controller support over Bluetooth with the most recent software update. They’re also one of the few devices that has Prime Video, if that interests you. The Tick was cancelled.
|
# ? Oct 4, 2019 23:53 |
|
Zeluth posted:The Tick was cancelled. Fuckin don't remind me dude
|
# ? Oct 5, 2019 08:05 |
|
AMD has now announced the RX 5500 series. No actual launch date (beyond Q4) or price was given, but since they're comparing it to the GTX 1650, I'd expect it to be priced below $200 by quite a bit.
|
# ? Oct 7, 2019 17:38 |
|
|
And a mobile version of it too. With 4GB VRAM max. lol.
|
# ? Oct 7, 2019 17:40 |