|
Let's not forget Arch, which I use.
|
# ? Dec 20, 2019 15:25 |
|
And of course I discovered the hard way that the Xfce window compositor can cause tearing in Steam In-Home Streaming on 18.04 LTS even if vsync is on, but 19.10 seems fine. Point is, it's Xubuntu-specific (i.e. an Ubuntu variant).
|
# ? Dec 20, 2019 16:15 |
|
lol

Phoronix posted:Mesa 20.0's RADV + ACO Vulkan Driver Now Consistently Beating AMD's AMDVLK Linux Driver

Computers are dumb and I love them.
|
# ? Dec 20, 2019 20:08 |
|
I figured out the solution to the graphical glitches in Discord, Twitch, the Rockstar launcher, GOG, etc. It was the new image sharpening feature at the top of the Nvidia Control Panel. I had it on globally, and I still do, but I've made special profiles excluding that feature for certain applications. Steam was the only one smart enough to have it disabled by default, go figure.
|
# ? Dec 20, 2019 21:18 |
|
Quaint Quail Quilt posted:I figured out the solution to the graphical glitches in Discord, Twitch, the Rockstar launcher, GOG, etc. NewFatMike posted:Computers are dumb and I love them.
|
# ? Dec 20, 2019 21:47 |
|
Looks like Nvidia are regretting relying on Samsung and moving most production to TSMC instead.
|
# ? Dec 20, 2019 23:26 |
|
All hail our TSMC overlords, I guess.
|
# ? Dec 20, 2019 23:33 |
|
SwissArmyDruid posted:All hail our TSMC overlords, I guess. Indeed: The other real winner (at least watching the stock price) is ASML... EUV is really hard, and really expensive.
|
# ? Dec 21, 2019 00:17 |
|
The Samsung 7nm EUV rumor sorta felt like the yearly "Apple is testing AMD based computers!!!" thing from the start. Nvidia has played the two against each other to lower their wafer prices before; wouldn't surprise me if that's what was happening all along here too.
|
# ? Dec 21, 2019 00:37 |
|
movax posted:Indeed: TSMC's market cap is now bigger than Intel's. Which is silly, but that's what a year of all good news vs. a year of nothing but bad news does to the stock market. (Also thread relevant: I keep seeing more bad news about Intel's GPU in my phone news feed, but it seems to be republishing reddit rumours. It seems utterly crazy that their performance could be that dismal on a dGPU with real ram, but... who knows? It would fit with Intel's no-good very bad year.)
|
# ? Dec 21, 2019 02:43 |
|
ufarn posted:Looks like Nvidia are regretting relying on Samsung and moving most production to TSMC instead. Owned, lol. Guess that explains the delay and Turing.
|
# ? Dec 21, 2019 02:46 |
|
Cygni posted:don’t use linux for games and expect it to work. Things that work: Blizzard games. Valve games. League of Legends. Borderlands 3. Path of Exile. Monster Hunter. Warframe. Dark Souls franchise. Minecraft (obviously). Battlefront II. Most major singleplayer/AAA releases of the past however many years. Hell, even Halo Reach, released the other week, works. Things that don't work: DX12 (working on it). Destiny 2. Anything that uses Easy Anti-Cheat (notably Fortnite, Apex Legends, Dauntless). Unormal posted:Linux is like 1% of a games sales it doesn't matter to anyone and is usually costing you more in support than it earns. This is often true, though all most people ask is "don't ban us for not running bare-metal Windows and we'll figure the rest out ourselves." Generally speaking, Steam games will let you at least try with a Valve-built wrapper of WINE and DXVK, and if there's some missing library that can be added in, or only one specific branch of WINE works but not another, Lutris is all about the install scripts and management/containerization of those WINE prefixes. Edit: Paul MaudDib posted:Especially since “Linux” isn’t a single homogenous thing like windows. You have the Debian/Ubuntu ecosystem, the Red Hat/CentOS ecosystem, Mint, SuSE, etc. All will probably cost you at least some additional marginal support. While there are certain issues (Distro A uses /bin/share, Distro B uses /usr/bin/share), generally speaking the very largest issues surrounding these "ecosystems" have been solved by systemd. (Systemd's biggest benefit has been to the people who have to maintain the distro, which is why basically all the major distros are defaulting to it in spite of how controversial it is.) That said, Valve only officially supports the Ubuntu LTS, but allows any distro to repackage it to work with their ecosystem. 
Due to US software patents, Red Hat is the most patent-paranoid maintainer, but Fedora (which is what you would run on a gaming machine, not CentOS or RHEL) includes a Steam repo. I still wouldn't recommend Linux for most of the people ITT, because if you care about GPUs you probably have Nvidia, and the current state of the Nvidia drivers is poor. Weirdly, if you're still one of the people hanging on and buying bad, shroud-focused cards in the name of fighting back against the man, AMD cards actually run better, because AMD went "aw gently caress it, we can't even support one OS decently let alone two, you guys have this code" and open-sourced it, so hackers have made the in-kernel AMD driver so optimized that Windows games running through Linux run faster than they do on Windows through AMD's own driver. RTG can't make a driver better than a bunch of people doing it for free. Craptacular! fucked around with this message at 05:40 on Dec 21, 2019 |
# ? Dec 21, 2019 04:37 |
|
honestly a lot of stuff that's hostile to linux gamers (proprietary DRM, paranoid anti-cheat software) is bad for games preservation in the long term, and a supreme annoyance for honest, paying customers in the here and now anyway.
|
# ? Dec 21, 2019 04:55 |
|
Klyith posted:(Also thread relevant: I keep seeing more bad news about Intel's GPU in my phone news feed, but it seems to be republishing reddit rumours. It seems utterly crazy that their performance could be that dismal on a dGPU with real ram, but... who knows? It would fit with Intel's no-good very bad year.) It's not that hard to believe, really. GPUs are non-trivial: they use a conceptual architecture considerably different from CPUs, and then you have to get the driver stack working to optimize the actual jobs on the hardware, which, as we've seen from AMD, is at least as important as the hardware itself. I don't think there was ever a real chance Intel's first release would be meaningfully competitive with AMD/Nvidia's higher-end offerings. From a business perspective, though, they may not have to be if they can sell them at the right price point. But it's Intel, so we'll see how realistic they decide to be on pricing.
|
# ? Dec 21, 2019 17:24 |
|
Is it just me, or is it odd that the new Xbox is going to be based on AMD's RDNA when the second version is going to be announced in January 2020 and available later in the year?
|
# ? Dec 21, 2019 18:38 |
|
Tab8715 posted:Is it just me, or is it odd that the new Xbox is going to be based on AMD's RDNA when the second version is going to be announced in January 2020 and available later in the year? Why are you assuming it's using the first iteration of RDNA when it very well might use the second? We don't know which it will be using at this time, and we don't know if AMD is announcing RDNA 2 this January; it's all rumors at this point. And if console makers (or anyone) keep waiting for something better, they'll never build anything. B-Mac fucked around with this message at 18:45 on Dec 21, 2019 |
# ? Dec 21, 2019 18:42 |
|
The new Xbox has features that don't exist in RDNA1 (hardware raytracing and variable rate shading have been confirmed), so it's pretty safe to assume it's using AMD's next iteration of RDNA.
|
# ? Dec 21, 2019 18:49 |
|
https://www.pcgameshardware.de/Radeon-RX-5500-XT-Grafikkarte-275529/Specials/PCI-Express-3-vs-PCI-E-4-GPU-1339415/ PCGH noticed something weird - the 5500XT only has 8 PCIe lanes wired up, which chokes the GPU unless you have a PCIe4 capable system.
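For a sense of scale (my own back-of-the-envelope arithmetic, not numbers from the article), the theoretical per-direction link bandwidth works out like this:

```python
# Rough theoretical PCIe bandwidth per direction.
# PCIe 3.0 runs at 8 GT/s per lane, PCIe 4.0 at 16 GT/s,
# both with 128b/130b encoding overhead.
def pcie_bandwidth_gbps(gen, lanes):
    gts = {3: 8.0, 4: 16.0}[gen]          # giga-transfers/s per lane
    return gts * lanes * (128 / 130) / 8  # GB/s after encoding overhead

print(f"3.0 x8:  {pcie_bandwidth_gbps(3, 8):.2f} GB/s")   # ~7.88
print(f"4.0 x8:  {pcie_bandwidth_gbps(4, 8):.2f} GB/s")   # ~15.75
print(f"3.0 x16: {pcie_bandwidth_gbps(3, 16):.2f} GB/s")  # ~15.75
```

So a 4.0 x8 link has the same theoretical bandwidth as a 3.0 x16 link, but on a 3.0 board the card is stuck with half that, which only bites when traffic actually saturates the link.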
|
# ? Dec 21, 2019 19:11 |
|
I'm pretty sure what those tests show is that interconnect speed matters if you run out of VRAM, which has always been true. Remember PCI GPUs that would drop from FPS to SPF if you ran out of VRAM? I'm not sure it matters much, because no poo poo you can't run max details on a 4GB GPU in modern titles.
|
# ? Dec 21, 2019 19:18 |
|
repiv posted:https://www.pcgameshardware.de/Radeon-RX-5500-XT-Grafikkarte-275529/Specials/PCI-Express-3-vs-PCI-E-4-GPU-1339415/ The massive qualifier being that the difference is only there when you're running out of vram which isn't a typical scenario to say the least.
|
# ? Dec 21, 2019 19:19 |
|
It registers on the 8GB version too, which shouldn't be swapping VRAM surely? BF5 has 10% better avg and 17% better min FPS when running in PCIe4 mode, and Wolfenstein has +15% avg and +18% min.
repiv fucked around with this message at 19:35 on Dec 21, 2019 |
# ? Dec 21, 2019 19:22 |
|
repiv posted:https://www.pcgameshardware.de/Radeon-RX-5500-XT-Grafikkarte-275529/Specials/PCI-Express-3-vs-PCI-E-4-GPU-1339415/ this is the benefit of pcie4 for consumers: ideally we shift over to x8 and then free up the other x8 for thunderbolt or something Arzachel posted:The massive qualifier being that the difference is only there when you're running out of vram which isn't a typical scenario to say the least. Of course: pcie is only a bottleneck if you need to constantly stream data over it.
|
# ? Dec 21, 2019 19:42 |
|
Malcolm XML posted:this is the the benefit of pcie4 for consumers: ideally we shift over to x8 and then free up the x8 for thunderbolt or something ideally you wire up all 16 lanes and then if you put a card in the second slot it drops to x8 like normal.
|
# ? Dec 21, 2019 21:07 |
|
B-Mac posted:Why are you assuming it's using the first iteration of RDNA when it very well might use the second? We don't know which it will be using at this time, and we don't know if AMD is announcing RDNA 2 this January; it's all rumors at this point. And if console makers (or anyone) keep waiting for something better, they'll never build anything. That's just what I read earlier, but this makes more sense. As a follow-up, are AMD's drivers still buggy? That's the last thing holding me to Nvidia.
|
# ? Dec 21, 2019 22:18 |
|
I haven't had any problems with any drivers except the most recent ones. Those seem to singularly be a tire fire and a half.
|
# ? Dec 21, 2019 22:20 |
|
I paid $60 for this in 2019! NIB!
|
# ? Dec 21, 2019 22:45 |
|
repiv posted:https://www.pcgameshardware.de/Radeon-RX-5500-XT-Grafikkarte-275529/Specials/PCI-Express-3-vs-PCI-E-4-GPU-1339415/ Like, you have to know that 99% of the systems this will be going into are going to be PCIe 3. Cripes, that's a hosed up launch.
|
# ? Dec 22, 2019 01:15 |
|
Yep, pretty poo poo of them to limit it to 8 lanes.
|
# ? Dec 22, 2019 01:49 |
|
Shaocaholica posted:I paid $60 for this in 2019! NIB! Way to defy the thread title!
|
# ? Dec 22, 2019 01:51 |
|
Rexxed posted:Way to defy the thread title! When did the CG girlies go away?
|
# ? Dec 22, 2019 06:32 |
|
the original gamergate
|
# ? Dec 22, 2019 06:51 |
|
That probably explains the poor benchmarking for the 5500XT. Well done AMD.
|
# ? Dec 22, 2019 06:58 |
|
I had that MS mouse, and the black one of the same, for many, many years. It might be one of the last cheap peripherals ($29-39) I bought that lasted forever. Nice kb too!
|
# ? Dec 22, 2019 09:36 |
|
Cygni posted:The Samsung 7nm EUV rumor sorta felt like the yearly "Apple is testing AMD based computers!!!" thing from the start. Nvidia has played the two against each other to lower their wafer prices before; wouldn't surprise me if that's what was happening all along here too. The rumor was that NV really hated TSMC's prices for 7nm/7nm+ and really, really wanted to have a viable option with Samsung, but Samsung couldn't deliver something as good as TSMC's 7nm, much less 7nm+. Nor could they do it in anything like the time frame NV wanted, so NV was forced to go back to TSMC; otherwise they'd essentially be giving AMD a process advantage. Samsung's 7nm production is still supposed to be a fair amount cheaper than TSMC's, at least, so maybe they'll use them for some lower-cost parts, but by all accounts (again, it's a rumor...) they didn't get what they wanted.
|
# ? Dec 22, 2019 12:16 |
|
Shaocaholica posted:I paid $60 for this in 2019! NIB! drat, I didn't know there was an AGP 3850. I managed to get a 4650 back in like 2015 and in my Pentium M @2.4/4GB DDR400 desktop, it was capable of running Skyrim and WoW playably. Had to use Windows 8 though, since with 10 they dropped support for everything before 5xxx. I was surprised by how poorly WoW performed (like, 720p/medium or 1080p/low) because I played it for years on the same system back when it was my daily driver with a 6800GT, but I guess Blizzard really did increase requirements in the intervening seven years.
|
# ? Dec 22, 2019 17:22 |
|
I'm disinclined to believe that PCIe 3.0 x8 is a limiting factor for the RX 5500. This was tested with a GTX 1080, which is much closer to the RX 5700 XT:

Gamers Nexus posted:From a quick look, there is a little below a 1% performance difference in PCI-e 3.0 x16 and PCI-e 3.0 x8 slots. The difference is not even close to perceptible and should be ignored as inconsequential to users fretting over potential slot or lane limitations. We are not sure how this scales with SLI (particularly MDA 'mode') or dual-GPU cards, but hope to research once we've got more hardware in the lab.

https://www.gamersnexus.net/guides/2488-pci-e-3-x8-vs-x16-performance-impact-on-gpus
|
# ? Dec 22, 2019 18:27 |
|
NewFatMike posted:I'm disinclined to believe that PCIe 3.0 x8 is a limiting factor for the RX 5500. This was tested with a GTX 1080, which is much closer to the RX 5700 XT: Which is completely irrelevant to a card with a 4GB frame buffer that frequently overspills into RAM in modern games, necessitating the use of the available PCIe bandwidth. I mean dude, there's literally benchmarks of the specific card in question under PCIe 3 vs 4.
|
# ? Dec 22, 2019 18:43 |
|
I trust the mush-mouth YouTube man, as he's been kind to me in the past
|
# ? Dec 22, 2019 18:58 |
|
Happy_Misanthrope posted:Which is completely irrelevant to a card with a 4GB frame buffer that frequently overspills into RAM in modern games, necessitating the use of the available PCIe bandwidth. I mean dude, there's literally benchmarks of the specific card in question under PCIe 3 vs 4. Oh, gotcha, I missed that it was 4GB vs 8GB. My mistake.
|
# ? Dec 22, 2019 19:09 |
|
|
Happy_Misanthrope posted:Which is completely irrelevant to a card with a 4GB frame buffer that frequently overspills into RAM in modern games, necessitating the use of the available PCIe bandwidth. I mean dude, there's literally benchmarks of the specific card in question under PCIe 3 vs 4. The solution is to lower the settings so that you're not running out of VRAM in the first place; PCIe bandwidth is a band-aid on a gushing wound. That the 8GB cards also show improvements over the usual 1-3% is the weird part.
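To put rough, purely illustrative numbers on the band-aid (the 100 MB spill figure below is made up, not a measurement from the article):

```python
# Illustrative: time to stream overspilled assets per frame over the
# PCIe link, against a 60 fps frame budget of ~16.7 ms.
def fetch_ms(megabytes, link_gbps):
    # transfer size in GB divided by link speed in GB/s, in milliseconds
    return megabytes / 1024 / link_gbps * 1000

spill_mb = 100  # hypothetical per-frame overspill
print(f"PCIe 3.0 x8 (~7.9 GB/s):  {fetch_ms(spill_mb, 7.9):.1f} ms")
print(f"PCIe 4.0 x8 (~15.8 GB/s): {fetch_ms(spill_mb, 15.8):.1f} ms")
```

Even doubling the link speed only shrinks a ~12 ms transfer to ~6 ms, which still eats a big chunk of the 16.7 ms frame budget; keeping everything resident in VRAM avoids the cost entirely.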
|
# ? Dec 22, 2019 19:12 |