Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Happy_Misanthrope posted:

Which is completely irrelevant to a card with a 4GB frame buffer that frequently spills over into RAM in modern games, necessitating the use of the available PCIe bandwidth. I mean dude, there's literally benchmarks of the specific card in question under PCIe 3 vs 4.

The 8GB card is also showing scaling with PCIe speeds, so it’s not purely caused by a VRAM limitation.


Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Cavauro posted:

I trust the mush-mouth YouTube man, as he's been kind to me in the past

this could be applying to 20 tech guys

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
another reviewer confirming PCGH.de’s findings.

boy, it’s a good thing that AMD stopped partners from enabling 4.0 on older boards, guess you’ll just have to buy a new $200+ motherboard if you want to run your AMD card at full speed.

Boy, they probably went out of their way to not use the standard 16 lanes. Already working hard on that vendor lock-in. Ahem, I meant “brand-specific synergies”.

Paul MaudDib fucked around with this message at 00:27 on Dec 23, 2019

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Paul MaudDib posted:

another reviewer confirming PCGH.de’s findings.

boy, it’s a good thing that AMD stopped partners from enabling 4.0 on older boards, guess you’ll just have to buy a new $200+ motherboard if you want to run your AMD card at full speed.

Boy, they probably went out of their way to not use the standard 16 lanes. Already working hard on that vendor lock-in. Ahem, I meant “brand-specific synergies”.

:catdrugs: + :lsd:

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
Seriously gently caress AMD for only enabling 8 lanes, what a dick move.

Arzachel
May 12, 2012

Paul MaudDib posted:

another reviewer confirming PCGH.de’s findings.

boy, it’s a good thing that AMD stopped partners from enabling 4.0 on older boards, guess you’ll just have to buy a new $200+ motherboard if you want to run your AMD card at full speed.

Boy, they probably went out of their way to not use the standard 16 lanes. Already working hard on that vendor lock-in. Ahem, I meant “brand-specific synergies”.

Are you banking on no one actually clicking the link cause lmao

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

B-Mac posted:

Seriously gently caress AMD for only enabling 8 lanes, what a dick move.

"So we're ahead of Intel in value for money, hardware security, and cores for the first time in decades. I suppose there's only one thing left to concern ourselves with...how dickish can we be to our low/mid-range end-users now?" :v:

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
AMD aren't being dicks. They're just discouraging users from buying their lovely GPUs.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
Yep, no reason to buy the 5500 XT vs the 1650S or 1660 right now.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
so other cards don't have this problem because even if they're on PCI-E 3.0, they use 16 lanes, and that's roughly as good as PCI-E 4.0 with 8 lanes?

what would even be the point of limiting themselves to 8 lanes?
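Roughly, yes: per-lane throughput doubles each PCIe generation, so 3.0 x16 and 4.0 x8 land in the same place. A quick back-of-envelope sketch (the per-lane rates are the standard spec figures after encoding overhead; the helper function is just for illustration):

```python
# Approximate usable per-lane bandwidth in GB/s, one direction,
# after 128b/130b encoding overhead.
PER_LANE_GBPS = {
    "3.0": 0.985,   # 8 GT/s
    "4.0": 1.969,   # 16 GT/s
}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Total one-direction link bandwidth in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

print(link_bandwidth("3.0", 16))  # ~15.8 GB/s
print(link_bandwidth("4.0", 8))   # ~15.8 GB/s -- same ballpark
print(link_bandwidth("3.0", 8))   # ~7.9 GB/s -- what an x8 card gets on a 3.0 board
```

So an x8 card only pays the penalty when it lands in a 3.0 slot, which is exactly the scenario the benchmarks are testing.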

GRINDCORE MEGGIDO
Feb 28, 1985


Slightly simpler PCB, fewer lanes to the port.

Craptacular!
Jul 9, 2001

Fuck the DH
It really wouldn’t surprise me, if they can take advantage of it, that AMD pushes bus evolution to keep the upgrades going. You know, give the people a good value, but in such a way that it requires giving AMD money regularly like some kind of crack monkey so you truly don’t save much in the end.

“Yes this graphics card offers great value for money, but only if paired with an X570/B550 motherboard. Guess you’ll have to buy a new board for that 1600. Here’s our new ray tracing model, it’s a great price but the performance suffers if you don’t use the AM2000 FutureSocket. Aunt Lisa’s upgrade train is leaving the station!”

Worf
Sep 12, 2017

If only Seth would love me like I love him!

it would surprise me if AMD turned lovely with sockets, since not replacing a mobo every generation is the leading thing AMD has on Intel besides price right now (for my use), and prices can be changed real fast

Seamonster
Apr 30, 2007

IMMER SIEGREICH
Maybe, but sockets have to change for DDR5 regardless.

Cygni
Nov 12, 2005

raring to post

But guys, this move will bring board costs down $1.03 per unit!!! Think of the savings!!!

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
I assume that limit is just there for differentiation between models, right?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Craptacular! posted:

It really wouldn’t surprise me, if they can take advantage of it, that AMD pushes bus evolution to keep the upgrades going. You know, give the people a good value, but in such a way that it requires giving AMD money regularly like some kind of crack monkey so you truly don’t save much in the end.

“Yes this graphics card offers great value for money, but only if paired with an X570/B550 motherboard. Guess you’ll have to buy a new board for that 1600. Here’s our new ray tracing model, it’s a great price but the performance suffers if you don’t use the AM2000 FutureSocket. Aunt Lisa’s upgrade train is leaving the station!”

Especially since AMD explicitly shut down the prospect of PCIe 4.0 when using Zen 2 on older boards. Partners were working on enabling it (just like they got the 240GE to overclock) and AMD put their foot down and made them stop, because it was messing up their product segmentation.

The wires on 4.0 boards aren't magic 4.0 wires, it's all just a question of signal integrity, and most older boards can probably run 4.0 on at least the top slot since that's shortest. Partners evidently thought so, at least.

Paul MaudDib fucked around with this message at 06:54 on Dec 23, 2019

treasure bear
Dec 10, 2012

i run my RTX2070S on PCI-E 3.0 x8 and it seems fine

Arzachel
May 12, 2012

treasure bear posted:

i run my RTX2070S on PCI-E 3.0 x8 and it seems fine

Goons have never seen a budget GPU, so -50% vs -35% seems like a big deal even though that's fixed by dropping the texture settings down a notch. Memory management on the 5500/XT cards might be broken and swapping from main memory too aggressively even when not hitting a VRAM limit, which would explain the 8GB results though.

Arzachel fucked around with this message at 11:11 on Dec 23, 2019

shrike82
Jun 11, 2005

Any good rumours on what Ampere is going to look like - especially in terms of GPU ram?

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Arzachel posted:

Goons have never seen a budget GPU

I was using a GTX 650 since July, then replaced it with an RX 560 in September before moving to an RX 580 last week, or do neither of the first two count

Worf
Sep 12, 2017

If only Seth would love me like I love him!

i recently sold a 1050 on SA mart and have a 1050ti in another computer :shrug:

considering how often you hear people recommend just buying used GPUs, im not sure i agree on that

granted yes a lot of us are playing our 90s retro games on 9/10/20-80ti’s

Klyith
Aug 3, 2007

GBS Pledge Week

GRINDCORE MEGGIDO posted:

Slightly simpler PCB, less lanes to the port.

It's possible that dropping the PCIe lanes + only a 128-bit memory bus let them make it with fewer PCB layers, which would be a real savings. I bet that AMD, just like all of us two days ago, didn't think it would have this much effect on a low-end card.

What I want to see is a review / investigation that tries to pin down why the performance changes -- the 8GB result says it's not from out-of-vram problems. That German site tested a wide range of games; I'd like to see someone test a couple games at multiple different settings to see if there is any difference there. I like Arzachel's idea that something in their drivers is busted.



Also, the conspiracy idea that they intentionally did this to push PCIe 4 is dumb; people who are buying cheap low-end graphics cards are not going to say "oh I need an X570 mobo that costs more than the GPU to use it in!" The only result is that the 5500 reviewed worse than it might otherwise have, since all the reviewers are still using Intel as their benchmarking platforms.
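To put rough numbers on why the spill-over hurts so much more at x8: any texture that doesn't fit in VRAM has to cross the bus every time it's needed, and at PCIe 3.0 x8 that bus is half as wide. A toy calculation (the per-frame spill size here is a made-up assumption for illustration, not a measurement from any review):

```python
# Back-of-envelope: extra frame time from streaming spilled textures
# over the PCIe link. The 200 MB/frame figure is an illustrative
# assumption, not measured data.

def transfer_ms(megabytes_per_frame: float, link_gbps: float) -> float:
    """Milliseconds spent moving `megabytes_per_frame` over a link of `link_gbps` GB/s."""
    return megabytes_per_frame / 1024 / link_gbps * 1000

# Suppose a game overflows the 4GB of VRAM and streams ~200 MB per frame:
for label, bw in [("3.0 x8", 7.9), ("4.0 x8 / 3.0 x16", 15.8)]:
    print(f"{label}: +{transfer_ms(200, bw):.1f} ms/frame")
```

The absolute numbers are fiction, but the ratio isn't: halving the link doubles the transfer penalty, which is exactly the kind of gap that shows up as "fine on 4.0, tanks on 3.0" in the benchmarks.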

Drakhoran
Oct 21, 2012

Paul MaudDib posted:



The wires on 4.0 boards aren't magic 4.0 wires, it's all just a question of signal integrity, and most older boards can probably run 4.0 on at least the top slot since that's shortest. Partners evidently thought so, at least.

On the other hand, the main use of PCIe 4.0 is NVMe SSDs, and a lot of M.2 slots are further from the CPU than the top slot. I can easily imagine AMD trying to look into the future, seeing 50,000 reddit threads about AMD systems causing file system corruption and data loss, and going: Nope, don't want none of that. We'll only allow 4.0 on boards designed for it. NO EXCEPTIONS!!!!

Craptacular!
Jul 9, 2001

Fuck the DH

Statutory Ape posted:

yes a lot of us are playing our 90s retro games on 9/10/20-80ti’s

Yeah, I need to make something clear because we keep going in circles: I don’t shame people for owning expensive video cards, I shame people for shitposting about owning expensive video cards. At some point, and you should know where it is without anyone needing to tell you, you’re not contributing anything of value whatsoever, you’re flaunting that you’re comfortable enough to own more than you need, to the point where it feels like a bad Family Guy joke about late stage capitalism. (“Hey Lois, I have income. Watch what I’m gonna do with it. Eeehhehehehe.”)

As long as you’re not making stupid loving shitposts, you won’t catch poo poo for it in this thread. At least not from me. But nobody wants to applaud a post about how you have money to spend poorly; the people who can’t say the same think you’re pitching an advertisement for guillotines, and the people who can say the same don’t care.

Craptacular! fucked around with this message at 19:28 on Dec 23, 2019

Shaocaholica
Oct 29, 2002

Fig. 5E
So what would be the Nvidia geforce/quadro equivalent of the AMD firepro W4300?

e: Oh, it's an R7 260 with 4GB?

Shaocaholica fucked around with this message at 20:00 on Dec 23, 2019

Lungboy
Aug 23, 2002

NEED SQUAT FORM HELP
It'll be interesting to see if the 5600 suffers the same issue as the 5500 once testers get their hands on them.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Lungboy posted:

It'll be interesting to see if the 5600 suffers the same issue as the 5500 once testers get their hands on them.

It will still be an AMD GPU, yes

bus hustler
Mar 14, 2019

Ha Ha Ha... YES!

Shaocaholica posted:

So what would be the Nvidia geforce/quadro equivalent of the AMD firepro W4300?

e: Oh, it's an R7 260 with 4GB?

Like a modern one, or a direct equivalent to that GPU from 2014? It probably most directly competed with the Quadro K2200 in its day. For a replacement, probably a Quadro P1000 through P2200 depending on performance needed.

Shaocaholica
Oct 29, 2002

Fig. 5E

charity rereg posted:

Like a modern one or a direct equivalent to that GPU from 2014? It probably most directly competed with the Quadro K2200 in its day. For a replacement probably a Quadro P1000 through P2200 depending on performance needed

Thanks. Is there a modern FirePro equivalent that’s also low profile single slot?

bus hustler
Mar 14, 2019

Ha Ha Ha... YES!

Shaocaholica posted:

Thanks. Is there a modern FirePro equivalent that’s also low profile single slot?

The P400, P600, P1000 all come in single slot SFF.

AMD also makes Radeon Pro Single Slot SFF in the WX3100 and WX4100. The WX4100 runs great in an optiplex 3040 SFF and plays rocket league really well at 1200p dont ask me how i know, i only use mine for work purposes

bus hustler fucked around with this message at 21:19 on Dec 23, 2019

Shaocaholica
Oct 29, 2002

Fig. 5E

charity rereg posted:

The P400, P600, P1000 all come in single slot SFF.

AMD also makes Radeon Pro Single Slot SFF in the WX3100 and WX4100. The WX4100 runs great in an optiplex 3040 SFF and plays rocket league really well at 1200p dont ask me how i know, i only use mine for work purposes

Thanks. I already have a P1000 so I just wanted an AMD part for my mirror clone SFF for vanity reasons.

How well does the WX4100 stack up to a P1000 for modern ~games~

Shaocaholica fucked around with this message at 21:45 on Dec 23, 2019

bus hustler
Mar 14, 2019

Ha Ha Ha... YES!
Probably slightly better. This comparison omits the P1000, but the WX4100 is probably comparable: https://techgage.com/article/radeon-pro-vs-quadro-a-fresh-look-at-workstation-gpu-performance/7/

garfield hentai
Feb 29, 2004

Statutory Ape posted:

i recently sold a 1050 on SA mart and have a 1050ti in another computer :shrug:

considering how often you hear people recommend just buying used GPUs, im not sure i agree on that

granted yes a lot of us are playing our 90s retro games on 9/10/20-80ti’s

I recently played Earthbound on Retroarch using what I read was a pretty heavily GPU intensive CRT filter where it renders each individual phosphor. There's a part in the game where you're crossing a suspension bridge with narrow white cables and thanks to this CRT filter the cables did this rainbow shimmer effect I hadn't seen since I was a little kid playing on my actual SNES on an actual CRT and god dammit buying a high end GPU for 90s games is worth it

Worf
Sep 12, 2017

If only Seth would love me like I love him!

What resolution/fps were you pushing?

Shaocaholica
Oct 29, 2002

Fig. 5E

garfield hentai posted:

I recently played Earthbound on Retroarch using what I read was a pretty heavily GPU intensive CRT filter where it renders each individual phosphor. There's a part in the game where you're crossing a suspension bridge with narrow white cables and thanks to this CRT filter the cables did this rainbow shimmer effect I hadn't seen since I was a little kid playing on my actual SNES on an actual CRT and god dammit buying a high end GPU for 90s games is worth it

Lol CRT emulator to play Atari games on a $1000 GPU.

Spiderdrake
May 12, 2001



Finally, the transparent water in Sonic 2 will be restored!

CrazyLoon
Aug 10, 2015

"..."
So, a bit of an update from my last post in here (that turned out to be a DOA motherboard instead): turns out there is still something the matter with my used XFX XXX mining card RX 580 8GB. Namely the VRAM, which I need to lock to State 1 if I don't want the card to crash and the system to restart while running any modern game (this even includes games like CK2, which are not graphically demanding whatsoever).

It's not the end of the world, since this does give me a stable 50-60 FPS at High-Ultra settings, and that's all I really want for bargain-bin pricing I suppose. Just yeah, wondering if there's something I've missed about this model that anyone here knows about that might let me activate State 2, or if the VRAM speed on this is somehow just borked. (To clarify, I've done heavy stress tests on my PSU with GPU + CPU and I'm sure it's not the issue, considering it's a new Seasonic Platinum 650W. Otherwise, I've tried just about everything you usually would: raising the power limit by 50%, switching to the non-mining BIOS, trying all the different BIOS versions for this model from TechPowerUp, lowering the core clock and other settings to the most basic ones... doesn't matter. If I activate State 2 on the VRAM, even at the lowest possible speed of 1750MHz, I get an instant crash.)

CrazyLoon fucked around with this message at 05:14 on Dec 24, 2019

Indiana_Krom
Jun 18, 2007
Net Slacker
If you really want to get back some of the experience of playing a game on a CRT, you should also get a monitor with ULMB or some other back light strobe/low persistence mode. The motion clarity is pretty close, and you also lose a ton of peak luminosity so it is dim just like CRTs.


Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

CrazyLoon posted:

So, a bit of an update from my last post in here (that turned out to be a DOA motherboard instead): turns out there is still something the matter with my used XFX XXX mining card RX 580 8GB. Namely the VRAM, which I need to lock to State 1 if I don't want the card to crash and the system to restart while running any modern game.

It's not the end of the world, since this does give me a stable 50-60 FPS at High-Ultra settings, and that's all I really want for bargain-bin pricing I suppose. Just yeah, wondering if there's something I've missed about this model that anyone here knows about that might let me activate State 2, or if the VRAM speed on this is somehow just borked. (To clarify, I've done heavy stress tests on my PSU with GPU + CPU and I'm sure it's not the issue, considering it's a new Seasonic Platinum 650W. Otherwise, I've tried just about everything you usually would: raising the power limit by 50%, switching to the non-mining BIOS, trying all the different BIOS versions for this model from TechPowerUp, lowering the core clock and other settings to the most basic ones... doesn't matter. If I activate State 2 on the VRAM, even at the lowest possible speed of 1750MHz, I get an instant crash.)

It's probably had a mining VBIOS flashed, and if you can find the appropriate VBIOS for your exact model (brand, model, memory manufacturer, possibly power configuration) then you can probably flash it back to the original and it'll work normally.
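For reference, the usual reflash workflow with AMD's flashing tool looks roughly like this. This is a sketch only, and the filenames are made up; flags vary between tool versions, so check the tool's own help output and match your card's exact model on TechPowerUp before writing anything:

```shell
# Illustrative amdvbflash workflow -- verify every flag against your
# tool version's help output before flashing anything.

# List adapters and note the index of the card you want to flash:
amdvbflash -i

# ALWAYS save a backup of whatever VBIOS is currently on the card:
amdvbflash -s 0 current_mining_vbios.rom

# Program the stock VBIOS downloaded for the EXACT model
# (brand, board revision, memory manufacturer, 8GB variant):
amdvbflash -p 0 stock_xfx_rx580_8gb.rom   # hypothetical filename

# Reboot afterwards. Only reach for force flags if the tool refuses
# an image you have verified matches your board.
```

A wrong image can leave the card unbootable, which is why the backup step comes first; dual-BIOS cards like the XFX mining models give you a fallback switch if it goes sideways.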
