|
Happy_Misanthrope posted:Which is completely irrelevant to a card with a 4GB frame buffer that frequently overspills into RAM in modern games, necessitating the use of the available PCIe bandwidth. I mean dude there's literally benchmarks of the specific card in question under PCIe 3 vs 4. The 8GB card is also showing scaling with PCIe speeds, so it’s not purely caused by a VRAM limitation.
|
# ? Dec 22, 2019 19:47 |
|
Cavauro posted:I trust the mush-mouth YouTube man, as he's been kind to me in the past this could apply to 20 tech guys
|
# ? Dec 22, 2019 20:30 |
|
another reviewer confirming PCGH.de’s findings. boy, it’s a good thing that AMD stopped partners from enabling 4.0 on older boards, guess you’ll just have to buy a new $200+ motherboard if you want to run your AMD card at full speed. Boy, they probably went out of their way to not use the standard 16 lanes. Already working hard on that vendor lock-in. Ahem, I meant “brand-specific synergies”. Paul MaudDib fucked around with this message at 00:27 on Dec 23, 2019 |
# ? Dec 22, 2019 22:27 |
|
Paul MaudDib posted:another reviewer confirming PCGH.de’s findings. +
|
# ? Dec 22, 2019 22:50 |
|
Seriously gently caress AMD for only enabling 8 lanes, what a dick move.
|
# ? Dec 23, 2019 00:02 |
|
Paul MaudDib posted:another reviewer confirming PCGH.de’s findings. Are you banking on no one actually clicking the link cause lmao
|
# ? Dec 23, 2019 00:13 |
|
B-Mac posted:Seriously gently caress AMD for only enabling 8 lanes, what a dick move. "So we're ahead of Intel in value for money, hardware security, and cores for the first time in decades. I suppose there's only one thing left to concern ourselves with...how dickish can we be to our low/mid-range end-users now?"
|
# ? Dec 23, 2019 00:23 |
|
AMD aren't being dicks. They're just discouraging users from buying their lovely GPUs.
|
# ? Dec 23, 2019 00:48 |
|
Yep, no reason to buy the 5500 XT vs the 1650S or 1660 right now.
|
# ? Dec 23, 2019 01:20 |
|
so other cards don't have this problem because even if they're on PCI-E 3.0, they use 16 lanes, and that's roughly as good as PCI-E 4.0 with 8 lanes? what would even be the point of limiting themselves to 8 lanes?
|
# ? Dec 23, 2019 01:32 |
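The x16 ≈ 4.0 x8 equivalence above checks out in raw numbers. A quick back-of-the-envelope sketch (the per-lane rates are the published PCIe spec values after 128b/130b encoding overhead; the helper function is just for illustration):

```python
# Approximate usable per-lane throughput in GB/s after 128b/130b encoding:
# PCIe 3.0 runs 8 GT/s (~0.985 GB/s/lane), PCIe 4.0 runs 16 GT/s (~1.969 GB/s/lane).
PER_LANE_GBPS = {"3.0": 0.985, "4.0": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Theoretical one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

gen3_x16 = link_bandwidth("3.0", 16)  # ~15.75 GB/s
gen4_x8  = link_bandwidth("4.0", 8)   # ~15.75 GB/s -- same ballpark as 3.0 x16
gen3_x8  = link_bandwidth("3.0", 8)   # ~7.88 GB/s -- the 5500 XT on an older board
```

So on a 4.0 board the x8 link costs nothing, but drop the same card into a 3.0 slot and it runs at half the bandwidth of a normal x16 card.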
|
Slightly simpler PCB, fewer lanes to the port.
|
# ? Dec 23, 2019 01:59 |
|
It really wouldn’t surprise me, if they can take advantage of it, that AMD pushes bus evolution to keep the upgrades going. You know, give the people a good value, but in such a way that it requires giving AMD money regularly like some kind of crack monkey so you truly don’t save much in the end. “Yes this graphics card offers great value for money, but only if paired with an X570/B550 motherboard. Guess you’ll have to buy a new board for that 1600. Here’s our new ray tracing model, it’s a great price but the performance suffers if you don’t use the AM2000 FutureSocket. Aunt Lisa’s upgrade train is leaving the station!”
|
# ? Dec 23, 2019 02:11 |
|
it would surprise me if AMD turned lovely with sockets, since not replacing a mobo every generation is the main thing AMD has on Intel besides price right now (for my use), and prices can be changed real fast
|
# ? Dec 23, 2019 02:18 |
|
Maybe but sockets have to change for DDR5 regardless.
|
# ? Dec 23, 2019 02:42 |
|
But guys, this move will bring board costs down $1.03 per unit!!! Think of the savings!!!
|
# ? Dec 23, 2019 03:26 |
|
I assume that limit is just there for differentiation between models, right?
|
# ? Dec 23, 2019 03:42 |
|
Craptacular! posted:It really wouldn’t surprise me, if they can take advantage of it, that AMD pushes bus evolution to keep the upgrades going. You know, give the people a good value, but in such a way that it requires giving AMD money regularly like some kind of crack monkey so you truly don’t save much in the end. Especially since AMD explicitly shut down the prospect of PCIe 4.0 when using Zen 2 on older boards. Partners were working on enabling it (just like they got the 240GE to overclock) and AMD put their foot down and made them stop because it was messing up their product segmentation. The wires on 4.0 boards aren't magic 4.0 wires; it's all just a question of signal integrity, and most older boards can probably run 4.0 on at least the top slot, since that's shortest. Partners evidently thought so, at least. Paul MaudDib fucked around with this message at 06:54 on Dec 23, 2019 |
# ? Dec 23, 2019 05:48 |
|
i run my RTX2070S on PCI-E 3.0 x8 and it seems fine
|
# ? Dec 23, 2019 07:00 |
|
treasure bear posted:i run my RTX2070S on PCI-E 3.0 x8 and it seems fine Goons have never seen a budget GPU, so -50% vs -35% seems like a big deal even though that's fixed by dropping the texture settings down a notch. Memory management on the 5500/XT cards might be broken and swapping from main memory too aggressively even when not hitting a VRAM limit, which would explain the 8GB results though. Arzachel fucked around with this message at 11:11 on Dec 23, 2019 |
# ? Dec 23, 2019 09:39 |
|
Any good rumours on what Ampere is going to look like - especially in terms of GPU ram?
|
# ? Dec 23, 2019 12:44 |
|
Arzachel posted:Goons have never seen a budget GPU I'd been using a GTX 650 since July, then replaced it with an RX 560 in September before moving to an RX 580 last week, or do neither of the first two count
|
# ? Dec 23, 2019 13:03 |
|
i recently sold a 1050 on SA mart and have a 1050ti in another computer. considering how often you hear people recommend just buying used GPUs, im not sure i agree on that. granted, yes, a lot of us are playing our 90s retro games on 9/10/20-80ti’s
|
# ? Dec 23, 2019 13:32 |
|
GRINDCORE MEGGIDO posted:Slightly simpler PCB, fewer lanes to the port. It's possible that dropping the PCIe lanes + only a 128-bit memory bus let them make it with fewer PCB layers, which would be a real savings. I bet that AMD, just like all of us two days ago, didn't think it would have this much effect on a low-end card. What I want to see is a review / investigation that tries to pin down why the performance changes -- the 8GB result says it's not from out-of-VRAM problems. That German site tested a wide range of games; I'd like to see someone test a couple of games at multiple different settings to see if there is any difference there. I like Arzachel's idea that something in their drivers is busted. Also, the conspiracy idea that they intentionally did this to push PCIe 4 is dumb; people who are buying cheap low-end graphics cards are not going to say "oh I need a x570 mobo that costs more than the GPU to use it in!" The only result is that the 5500 reviewed worse than it might have otherwise, since all the reviewers are still using Intel as their benchmarking platforms.
|
# ? Dec 23, 2019 16:34 |
|
Paul MaudDib posted:
On the other hand, the main use of PCIe 4.0 is NVMe SSDs, and a lot of M.2 slots are further from the CPU than the top slot. I can easily imagine AMD trying to look into the future, seeing 50,000 reddit threads about AMD systems causing file system corruption and data loss, and going: Nope, don't want none of that. We'll only allow 4.0 on boards designed for it. NO EXCEPTION!!!!
|
# ? Dec 23, 2019 17:53 |
|
Statutory Ape posted:yes a lot of us are playing our 90s retro games on 9/10/20-80ti’s Yeah, I need to make something clear because we keep going in this circle: I don’t try to shame people for owning expensive video cards, I shame people for shitposting about owning expensive video cards. At some point, and you should know what it is and not need anyone to tell you, you’re not contributing anything of value whatsoever, you’re flaunting that you’re comfortable enough to own more than you need, to the point where it feels like a bad Family Guy joke about late stage capitalism. (“Hey Lois, I have income. Watch what I’m gonna do with it. Eeehhehehehe.”) As long as you’re not making stupid loving shitposts, you won’t catch poo poo for it in this thread. At least not from me. But nobody wants to applaud a post about how you have money to spend poorly; the people who can’t say the same think you’re pitching an advertisement for guillotines, and the people who can say the same don’t care. Craptacular! fucked around with this message at 19:28 on Dec 23, 2019 |
# ? Dec 23, 2019 19:18 |
|
So what would be the Nvidia geforce/quadro equivalent of the AMD firepro W4300? e: Oh it's an R7 260 with 4GB? Shaocaholica fucked around with this message at 20:00 on Dec 23, 2019 |
# ? Dec 23, 2019 19:57 |
|
It'll be interesting to see if the 5600 suffers the same issue as the 5500 once testers get their hands on them.
|
# ? Dec 23, 2019 20:07 |
|
Lungboy posted:It'll be interesting to see if the 5600 suffers the same issue as the 5500 once testers get their hands on them. It will still be an AMD GPU, yes
|
# ? Dec 23, 2019 20:31 |
|
Shaocaholica posted:So what would be the Nvidia geforce/quadro equivalent of the AMD firepro W4300? Like a modern one or a direct equivalent to that GPU from 2014? It probably most directly competed with the Quadro K2200 in its day. For a replacement probably a Quadro P1000 through P2200 depending on performance needed
|
# ? Dec 23, 2019 20:51 |
|
charity rereg posted:Like a modern one or a direct equivalent to that GPU from 2014? It probably most directly competed with the Quadro K2200 in its day. For a replacement probably a Quadro P1000 through P2200 depending on performance needed Thanks. Is there a modern FirePro equivalent that’s also low profile single slot?
|
# ? Dec 23, 2019 21:06 |
|
Shaocaholica posted:Thanks. Is there a modern FirePro equivalent that’s also low profile single slot? The P400, P600, and P1000 all come in single-slot SFF. AMD also makes single-slot SFF Radeon Pro cards: the WX3100 and WX4100. The WX4100 runs great in an Optiplex 3040 SFF and plays rocket league really well at 1200p. dont ask me how i know, i only use mine for work purposes bus hustler fucked around with this message at 21:19 on Dec 23, 2019 |
# ? Dec 23, 2019 21:11 |
|
charity rereg posted:The P400, P600, P1000 all come in single slot SFF. Thanks. I already have a P1000 so I just wanted an AMD part for my mirror clone SFF for vanity reasons. How well does the WX4100 stack up to a P1000 for modern ~games~ Shaocaholica fucked around with this message at 21:45 on Dec 23, 2019 |
# ? Dec 23, 2019 21:42 |
|
Probably slightly better; this comparison drops the P1000, but the WX4100 is probably comparable. https://techgage.com/article/radeon-pro-vs-quadro-a-fresh-look-at-workstation-gpu-performance/7/
|
# ? Dec 23, 2019 22:54 |
|
Statutory Ape posted:i recently sold a 1050 on SA mart and have a 1050ti in another computer I recently played Earthbound on Retroarch using what I read was a pretty heavily GPU-intensive CRT filter that renders each individual phosphor. There's a part in the game where you're crossing a suspension bridge with narrow white cables, and thanks to this CRT filter the cables did this rainbow shimmer effect I hadn't seen since I was a little kid playing on my actual SNES on an actual CRT. god dammit, buying a high-end GPU for 90s games is worth it
|
# ? Dec 24, 2019 00:19 |
|
What resolution/fps were you pushing?
|
# ? Dec 24, 2019 00:25 |
|
garfield hentai posted:I recently played Earthbound on Retroarch using what I read was a pretty heavily GPU intensive CRT filter where it renders each individual phosphor. There's a part in the game where you're crossing a suspension bridge with narrow white cables and thanks to this CRT filter the cables did this rainbow shimmer effect I hadn't seen since I was a little kid playing on my actual SNES on an actual CRT and god dammit buying a high end GPU for 90s games is worth it Lol CRT emulator to play Atari games on a $1000 GPU.
|
# ? Dec 24, 2019 02:59 |
|
Finally, the transparent water in Sonic 2 will be restored!
|
# ? Dec 24, 2019 03:01 |
|
So a bit of an update from my last post in here (that turned out to be a DOA motherboard instead): turns out there is still something the matter with my used XFX XXX mining card RX 580 8GB. Namely the VRAM, which I need to lock to State 1 if I don't want the card to crash and system restart while running any modern game (this even includes games like CK2 that are not graphically demanding whatsoever). It's not the end of the world, since this does give me a stable 50-60 FPS at High-Ultra settings, and that's all I really want for bargain bin pricing I suppose. Just yea, wondering if there's something that I've missed about this model that anyone here knows about, that might let me activate State 2, or if the VRAM speed on this is somehow just borked. (To clarify, I've done heavy stress tests on my PSU with GPU + CPU and I am sure it's not the issue, considering it's a new Seasonic Platinum 650W. Otherwise, I've tried just about most of the things you usually would, from raising the power limit by 50%, to switching to the non-mining BIOS, to reinstalling all the different VBIOS versions of this model from TechPowerUp, to lowering the normal clock and other settings to the most basic ones...doesn't matter...if I activate State 2 on the VRAM, even at the lowest possible speed of 1750MHz, I get an instant crash.) CrazyLoon fucked around with this message at 05:14 on Dec 24, 2019 |
# ? Dec 24, 2019 05:01 |
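For anyone poking at the same problem on Linux, the amdgpu driver exposes those VRAM P-states through sysfs, which is one way to pin the memory state outside of the Windows tools. A minimal sketch, assuming a single GPU at card0 (the sysfs paths are the standard amdgpu ones; the sample string and parser are just for illustration, and writing the files needs root):

```python
# Sketch for Linux/amdgpu only: the driver lists VRAM P-states in
# pp_dpm_mclk, and writing a state index back pins the card to that
# memory clock. Paths assume the GPU is card0.
DEVICE = "/sys/class/drm/card0/device"

def parse_mclk_states(text: str) -> dict:
    """Parse pp_dpm_mclk output like '1: 1000Mhz *' into {index: clock}.
    The '*' marks the currently active state and is stripped."""
    states = {}
    for line in text.splitlines():
        idx, clock = line.split(":")
        states[int(idx)] = clock.strip().rstrip("*").strip()
    return states

def pin_mclk_state(index: int) -> None:
    # 'manual' lets us force individual DPM levels; requires root.
    with open(f"{DEVICE}/power_dpm_force_performance_level", "w") as f:
        f.write("manual")
    with open(f"{DEVICE}/pp_dpm_mclk", "w") as f:
        f.write(str(index))

# Example of what an RX 580's memory states typically look like:
sample = "0: 300Mhz\n1: 1000Mhz *\n2: 1750Mhz"
print(parse_mclk_states(sample))  # {0: '300Mhz', 1: '1000Mhz', 2: '1750Mhz'}
```

If State 2 crashes even at stock speed, pinning to 1 this way is the same workaround the poster describes, just driver-level instead of through an overclocking tool.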
|
If you really want to get back some of the experience of playing a game on a CRT, you should also get a monitor with ULMB or some other back light strobe/low persistence mode. The motion clarity is pretty close, and you also lose a ton of peak luminosity so it is dim just like CRTs.
|
# ? Dec 24, 2019 05:06 |
|
|
CrazyLoon posted:So a bit of an update from my last post in here, that turned out to be a DOA motherboard instead: turns out there is still something the matter with my used XFX XXX mining card RX 580 8GB. Namely the VRAM, which I need to lock to State 1 if I don't want the card to crash and system restart while running any modern game. it's probably had a mining VBIOS flashed and if you can find the appropriate VBIOS for your exact model (brand, model, memory manufacturer, possibly power configuration) then you can probably flash it back to original and it'll work normally
|
# ? Dec 24, 2019 05:10 |
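A hedged sketch of that flash-back procedure using the amdvbflash CLI (the usual tool for this; `-s` saves the current VBIOS and `-p` programs a new one -- the adapter index and file names here are placeholders, and flashing a ROM that doesn't exactly match your card's brand, model, and memory manufacturer can brick it):

```python
import subprocess

ADAPTER = 0  # placeholder: pick the right GPU from 'amdvbflash -i'

def flash_commands(adapter: int, backup: str, rom: str) -> list:
    """Build the amdvbflash invocations: back up the current (possibly
    mining) VBIOS first, then program the stock ROM, e.g. one matching
    your exact card from TechPowerUp's VBIOS database."""
    return [
        ["amdvbflash", "-s", str(adapter), backup],  # save current VBIOS
        ["amdvbflash", "-p", str(adapter), rom],     # program replacement
    ]

def flash(adapter: int, backup: str, rom: str) -> None:
    # Needs admin/root, and a reboot afterwards for the new VBIOS to load.
    for cmd in flash_commands(adapter, backup, rom):
        subprocess.run(cmd, check=True)

print(flash_commands(ADAPTER, "backup.rom", "stock.rom"))
```

Keeping the backup means you can always return to the mining BIOS if the stock ROM turns out not to fix the memory-state crashes.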