Stanley Pain
Jun 16, 2001

by Fluffdaddy
Let's not forget Arch, which I use. :derp:

Rooted Vegetable
Jun 1, 2002
And of course I discovered the hard way that the Xfce window compositor can cause tearing with Steam In-Home Streaming on 18.04 LTS even if vsync is on, but 19.10 seems fine. Point is that it's Xubuntu-specific (i.e. an Ubuntu variant)

NewFatMike
Jun 11, 2015

lol


Phoronix posted:

Mesa 20.0's RADV + ACO Vulkan Driver Now Consistently Beating AMD's AMDVLK Linux Driver
https://www.phoronix.com/scan.php?page=article&item=mesa20radv-aco-amdvlk&num=1

Computers are dumb and I love them.

Quaint Quail Quilt
Jun 19, 2006


Ask me about that time I told people mixing bleach and vinegar is okay
I figured out the solution to the graphical glitches on discord, twitch, rockstar launcher, gog etc...
It was the new Image sharpening feature at the top of the Nvidia control panel, I had it on globally, and I still do, but I've made special profiles excluding that feature on certain applications.

Steam was the only one smart enough to have it disabled by default, go figure.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Quaint Quail Quilt posted:

I figured out the solution to the graphical glitches on discord, twitch, rockstar launcher, gog etc...
It was the new Image sharpening feature at the top of the Nvidia control panel, I had it on globally, and I still do, but I've made special profiles excluding that feature on certain applications.

Steam was the only one smart enough to have it disabled by default, go figure.

NewFatMike posted:

Computers are dumb and I love them.

ufarn
May 30, 2009
Looks like Nvidia regrets relying on Samsung and is moving most production to TSMC instead.

SwissArmyDruid
Feb 14, 2014

by sebmojo
All hail our TSMC overlords, I guess.

movax
Aug 30, 2008

SwissArmyDruid posted:

All hail our TSMC overlords, I guess.

Indeed:



The other real winner (at least watching the stock price) is ASML... EUV is really hard, and really expensive.

Cygni
Nov 12, 2005

raring to post

The Samsung 7nm EUV rumor sorta felt like the yearly "Apple is testing AMD based computers!!!" thing from the start. Nvidia has played the two against each other to lower their wafer prices before; wouldn't surprise me if that's what was happening all along here too.

Klyith
Aug 3, 2007

GBS Pledge Week

movax posted:

Indeed:

TSMC's market cap is now bigger than Intel's. Which is silly, but that's what a year of all good news vs a year of nothing but bad news does to the stock market.


(Also thread relevant: I keep seeing more bad news about Intel's GPU in my phone news feed, but it seems to be republishing reddit rumours. It seems utterly crazy that their performance could be that dismal on a dGPU with real ram, but... who knows? It would fit with Intel's no-good very bad year.)

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

ufarn posted:

Looks like Nvidia are regretting relying on Samsung and moving most production to TSMC instead.

Owned lol


Guess that explains the delay and Turing

Craptacular!
Jul 9, 2001

Fuck the DH

Cygni posted:

don’t use linux for games and expect it to work.

Things that work: Blizzard games. Valve games. League of Legends. Borderlands 3. Path of Exile. Monster Hunter. Warframe. Dark Souls franchise. Minecraft (obviously). Battlefront II. Most major singleplayer/AAA releases of the past however many years. Hell even Halo Reach released the other week works.

Things that don't work: DX12 (working on it). Destiny 2. Anything that uses Easy Anti-Cheat (notably Fortnite, Apex Legends, Dauntless).


Unormal posted:

Linux is like 1% of a games sales it doesn't matter to anyone and is usually costing you more in support than it earns.

This is often true, though "don't ban us for not running bare-metal Windows and we'll figure the rest out ourselves" is all most people ask.

Generally speaking, Steam games will at least let you try with Proton, Valve's build of WINE and DXVK. If there's some missing library that can be added in, or only one specific branch of WINE works but not another, Lutris is all about install scripts and the management/containerization of those WINE prefixes.

Edit:

Paul MaudDib posted:

Especially since “Linux” isn’t a single homogenous thing like windows. You have the Debian/Ubuntu ecosystem, the Red Hat/CentOS ecosystem, Mint, SuSE, etc. All will probably cost you at least some additional marginal support.

While there are certain issues (Distro A uses /bin/share, Distro B uses /usr/bin/share), generally speaking the largest issues surrounding these "ecosystems" have been solved by systemd. (Systemd's biggest benefit has been to the people who have to maintain the distro, which is why basically all the major distros default to it in spite of how controversial it is.)

That said, Valve only officially supports the Ubuntu LTS, but allows any distro to repackage it to work with their ecosystem. Due to US software patents, Red Hat is the most patent-paranoid maintainer, but Fedora (which is what you would run on a gaming machine, not CentOS or RHEL) includes a Steam repo. I still wouldn't recommend Linux for most of the people ITT, because if you care about GPUs you probably have Nvidia, and the current state of the Nvidia drivers is poor.

Weirdly, if you're still one of the people hanging on and buying bad, shroud-focused cards in the name of fighting back against the man, AMD cards actually run better, because AMD went "aw gently caress it, we can't even support one OS decently let alone two, you guys have this code" and open sourced it, so hackers have made the in-kernel AMD driver stack so optimized that Windows games running through Linux run faster than they do on Windows through AMD's own driver. RTG can't make a driver better than a bunch of people doing it for free.

Craptacular! fucked around with this message at 05:40 on Dec 21, 2019

The Big Bad Worf
Jan 26, 2004
Quad-greatness
honestly a lot of stuff that's hostile to linux gamers (proprietary DRM, paranoid anti-cheat software) is bad for games preservation in the long term, and a supreme annoyance for honest, paying customers in the here and now anyway.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Klyith posted:

(Also thread relevant: I keep seeing more bad news about Intel's GPU in my phone news feed, but it seems to be republishing reddit rumours. It seems utterly crazy that their performance could be that dismal on a dGPU with real ram, but... who knows? It would fit with Intel's no-good very bad year.)

It's not that hard to believe, really. GPUs are non-trivial. They're built on a conceptual architecture considerably different from CPUs, and then you have to get the driver stack working to optimize the actual jobs on the hardware, which as we've seen from AMD is at least as important as the hardware itself. I don't think there was ever a real chance Intel's first release would be meaningfully competitive in performance with AMD/Nvidia's higher-end offerings. From a business perspective, though, it may not have to be if they can sell at the right price point. But it's Intel, so we'll see how realistic they decide to be on pricing.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


Is it just me, or is it odd that the new Xbox is going to be based on AMD's RDNA when the second version is going to be announced in January 2020 and available later in the year?

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Tab8715 posted:

Is it just me or am I the only one that finds it odd that the new Xbox is going to be based off of AMDs RDNA when the second version is going to be announced in January 2020 and available later in the year?

Why are you assuming it's using the first iteration of RDNA when it very well might use the second? We don't know which it will be using at this time, and we don't know if AMD is announcing RDNA 2 this January; it's all rumors at this point. That, and if console makers or anyone else keep waiting for something better, they'll end up never building anything.

B-Mac fucked around with this message at 18:45 on Dec 21, 2019

repiv
Aug 13, 2009

The new Xbox has features that don't exist in RDNA1 (hardware raytracing and variable rate shading have been confirmed), so it's pretty safe to assume it's using AMD's next iteration of RDNA.

repiv
Aug 13, 2009

https://www.pcgameshardware.de/Radeon-RX-5500-XT-Grafikkarte-275529/Specials/PCI-Express-3-vs-PCI-E-4-GPU-1339415/

PCGH noticed something weird - the 5500XT only has 8 PCIe lanes wired up, which chokes the GPU unless you have a PCIe4 capable system.
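For anyone wanting the raw numbers behind the x8 complaint, here's a quick back-of-the-envelope sketch. The line rates and 128b/130b encoding are the published PCIe figures; this is just arithmetic, not a measurement from the card:

```python
# Theoretical one-direction PCIe payload bandwidth:
# line rate per lane * encoding efficiency * lane count.
GEN_RATE_GTS = {3: 8.0, 4: 16.0}   # gigatransfers/s per lane (PCIe 3.0 / 4.0)
ENCODING_EFFICIENCY = 128 / 130    # 128b/130b line coding on Gen 3 and 4

def pcie_bandwidth_gbs(gen, lanes):
    """Peak payload bandwidth of a PCIe link in GB/s."""
    bits_per_sec = GEN_RATE_GTS[gen] * 1e9 * ENCODING_EFFICIENCY * lanes
    return bits_per_sec / 8 / 1e9  # bits -> bytes -> GB

for gen, lanes in [(3, 16), (3, 8), (4, 8)]:
    print(f"PCIe {gen}.0 x{lanes}: {pcie_bandwidth_gbs(gen, lanes):.1f} GB/s")
```

So an x8 card gets roughly 7.9 GB/s on a PCIe 3 board but the full ~15.8 GB/s (same as 3.0 x16) on PCIe 4, which is why only PCIe 4 systems dodge the choke PCGH saw.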

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
I'm pretty sure what their tests show is that interconnect speed matters if you run out of VRAM, which has always been true. Remember PCI GPUs that would drop from FPS to SPF if you ran out of vram?

I'm not sure it matters much, because no poo poo you can't run max details on a 4GB GPU in modern titles.

Arzachel
May 12, 2012

repiv posted:

https://www.pcgameshardware.de/Radeon-RX-5500-XT-Grafikkarte-275529/Specials/PCI-Express-3-vs-PCI-E-4-GPU-1339415/

PCGH noticed something weird - the 5500XT only has 8 PCIe lanes wired up, which chokes the GPU unless you have a PCIe4 capable system.

The massive qualifier being that the difference is only there when you're running out of vram which isn't a typical scenario to say the least.

repiv
Aug 13, 2009

It registers on the 8GB version too, which surely shouldn't be swapping VRAM? BF5 has 10% better avg and 17% better min FPS when running in PCIe 4 mode, and Wolfenstein has +15% avg and +18% min.

repiv fucked around with this message at 19:35 on Dec 21, 2019

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

repiv posted:

https://www.pcgameshardware.de/Radeon-RX-5500-XT-Grafikkarte-275529/Specials/PCI-Express-3-vs-PCI-E-4-GPU-1339415/

PCGH noticed something weird - the 5500XT only has 8 PCIe lanes wired up, which chokes the GPU unless you have a PCIe4 capable system.

this is the benefit of pcie4 for consumers: ideally we shift over to x8 and then free up the other x8 for thunderbolt or something

Arzachel posted:

The massive qualifier being that the difference is only there when you're running out of vram which isn't a typical scenario to say the least.

Of course: pcie is only a bottleneck if you need to constantly stream data over it.
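To put rough numbers on "constantly streaming": the bandwidth figures below are the theoretical PCIe payload rates, but the per-frame spill amount is a made-up illustration (nobody has published what the 5500XT actually moves per frame), so treat it as a sketch of why minimums tank, not data:

```python
# How much of a frame budget gets eaten by shuffling overspilled
# assets across the bus. The 100 MB/frame figure is hypothetical.
PCIE3_X8_GBS = 7.9            # ~GB/s, theoretical PCIe 3.0 x8 payload rate
PCIE4_X8_GBS = 15.8           # ~GB/s, theoretical PCIe 4.0 x8 payload rate
FRAME_BUDGET_MS = 1000 / 60   # ms per frame at a 60 fps target

def transfer_ms(megabytes, link_gbs):
    """Milliseconds spent moving `megabytes` of data over the link."""
    return megabytes / 1000 / link_gbs * 1000

spill_mb = 100  # hypothetical VRAM overspill traffic per frame
for name, bw in [("PCIe 3.0 x8", PCIE3_X8_GBS), ("PCIe 4.0 x8", PCIE4_X8_GBS)]:
    ms = transfer_ms(spill_mb, bw)
    print(f"{name}: {ms:.1f} ms of a {FRAME_BUDGET_MS:.1f} ms frame budget")
```

Over 3.0 x8 that hypothetical spill burns most of a 16.7 ms frame; doubling the link halves it, which lines up with the gap showing hardest in minimum FPS.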

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Malcolm XML posted:

this is the benefit of pcie4 for consumers: ideally we shift over to x8 and then free up the other x8 for thunderbolt or something


Of course: pcie is only a bottleneck if you need to constantly stream data over it.

ideally you wire up all 16 lanes and then if you put a card in the second slot it drops to x8 like normal.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


B-Mac posted:

Why are you assuming it's using the first iteration of RDNA when it very well might use the second? We don't know which it will be using at this time, and we don't know if AMD is announcing RDNA 2 this January; it's all rumors at this point. That, and if console makers or anyone else keep waiting for something better, they'll end up never building anything.

That's just what I read earlier, but this makes more sense.

As a follow-up, are AMD's drivers still buggy? That's the last thing keeping me on Nvidia.

SwissArmyDruid
Feb 14, 2014

by sebmojo
I haven't had problems with any drivers other than the most recent ones. Those alone seem to be a tire fire and a half.

Shaocaholica
Oct 29, 2002

Fig. 5E
I paid $60 for this in 2019! NIB!



Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

repiv posted:

https://www.pcgameshardware.de/Radeon-RX-5500-XT-Grafikkarte-275529/Specials/PCI-Express-3-vs-PCI-E-4-GPU-1339415/

PCGH noticed something weird - the 5500XT only has 8 PCIe lanes wired up, which chokes the GPU unless you have a PCIe4 capable system.

Like, you have to know that 99% of the systems this will be going into are going to be PCIe 3. Cripes, that's a hosed up launch.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
Yep, pretty poo poo of them to limit it to 8 lanes.

Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!

Shaocaholica posted:

I paid $60 for this in 2019! NIB!





Way to defy the thread title!

Shaocaholica
Oct 29, 2002

Fig. 5E

Rexxed posted:

Way to defy the thread title!

When did the CG girlies go away?

taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

the original gamergate

iospace
Jan 19, 2038


That probably explains the poor benchmarking for the 5500XT. Well done AMD.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

I had that MS mouse, and the black one of the same, for many, many years. It might be one of the last cheap peripherals ($29-39) I bought that lasted forever

Nice kb too!

PC LOAD LETTER
May 23, 2005
WTF?!

Cygni posted:

The Samsung 7nm EUV rumor sorta felt like the yearly "Apple is testing AMD based computers!!!" thing from the start. Nvidia has played the two against each other to lower their wafer prices before, wouldn't surprise me if thats what was happening all along here too.

The rumor was that NV really hated TSMC's prices for 7nm/+ and really, really wanted a viable option with Samsung, but Samsung couldn't deliver something as good as TSMC's 7nm, much less 7nm+. Nor could they do it in anything like the time frame NV wanted, so NV was forced to go back to TSMC; otherwise they'd essentially be giving AMD a process advantage.

Samsung's 7nm production is still supposed to be a fair amount cheaper than TSMC's, at least, so maybe NV will use them for some lower-cost parts or something, but by all accounts (again, it's a rumor...) they didn't get what they wanted.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

Shaocaholica posted:

I paid $60 for this in 2019! NIB!





drat, I didn't know there was an AGP 3850. I managed to get a 4650 back in like 2015, and in my Pentium M @ 2.4GHz / 4GB DDR400 desktop it was capable of running Skyrim and WoW playably. Had to use Windows 8 though, since with 10 they dropped support for everything before the 5xxx series.

I was surprised by how poorly WoW performed (like, 720p/medium or 1080p/low) because I played it for years on the same system back when it was my daily driver with a 6800GT, but I guess Blizzard really did increase requirements in the intervening seven years.

NewFatMike
Jun 11, 2015

I'm disinclined to believe that PCIe 3.0 x8 is a limiting factor for the RX 5500. This was tested with a GTX 1080, which is much closer to the RX 5700XT:

Gamers Nexus posted:

From a quick look, there is a little below a 1% performance difference in PCI-e 3.0 x16 and PCI-e 3.0 x8 slots. The difference is not even close to perceptible and should be ignored as inconsequential to users fretting over potential slot or lane limitations. We are not sure how this scales with SLI (particularly MDA 'mode') or dual-GPU cards, but hope to research once we've got more hardware in the lab.

https://www.gamersnexus.net/guides/2488-pci-e-3-x8-vs-x16-performance-impact-on-gpus

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

NewFatMike posted:

I'm disinclined to believe that PCIe 3.0 x8 is a limiting factor for the RX 5500. This was tested with a GTX 1080, which is much closer to the RX 5700XT:

Which is completely irrelevant to a card with a 4GB frame buffer that frequently overspills into RAM in modern games, necessitating the use of the available PCIe bandwidth. I mean dude, there's literally benchmarks of the specific card in question under PCIe 3 vs 4.

Cavauro
Jan 9, 2008

I trust the mush-mouth YouTube man, as he's been kind to me in the past

NewFatMike
Jun 11, 2015

Happy_Misanthrope posted:

Which is completely irrelevant to a card with a 4GB frame buffer that frequently overspills into RAM in modern games, necessitating the use of the available PCIe bandwidth. I mean dude, there's literally benchmarks of the specific card in question under PCIe 3 vs 4.

Oh gotcha, I missed that it was 4GB vs 8GB. My mistake.

Arzachel
May 12, 2012

Happy_Misanthrope posted:

Which is completely irrelevant to a card with a 4GB frame buffer that frequently overspills into RAM in modern games, necessitating the use of the available PCIe bandwidth. I mean dude, there's literally benchmarks of the specific card in question under PCIe 3 vs 4.

The solution is to lower the settings so that you're not running out of VRAM in the first place; PCIe bandwidth is a band-aid on a gushing wound. That the 8GB cards also show improvements over the usual 1-3% is the weird part.
