The Big Bad Worf
Jan 26, 2004
Quad-greatness

Indiana_Krom posted:

If you really want to get back some of the experience of playing a game on a CRT, you should also get a monitor with ULMB or some other back light strobe/low persistence mode. The motion clarity is pretty close, and you also lose a ton of peak luminosity so it is dim just like CRTs.

TVs are actually really suited for exactly this since TV manufacturers aren't afraid to backlight strobe or do black frame insertion at 60Hz. PC monitor manufacturers don't let you strobe below 75Hz in most cases, which isn't low enough to give CRT-like motion clarity to emulated games or 60Hz/60fps locked experiences. In some cases (like with emulation) you can use an emulator front end like RetroArch to combine your monitor's ULMB (backlight strobing) with software-based black frame insertion: say, run your monitor at 240Hz, with 1 frame being a real game frame and 3 being black frames. 120Hz with alternating real / black frames can work too, but you might also get temporary image retention (depending on your specific monitor) :v:

Whenever I want to sit on a couch and play old console games I don't own / for a console I don't yet have a flashcart for, I use RetroArch on my media PC connected to an LG C9, with black frame insertion enabled + a fancy CRT shader. 4K resolution gives enough overhead to create a convincing enough simulation of the phosphors, and the black frame insertion (plus the response time of this specific set) provides a reasonable simulation of a CRT with slower-than-average phosphor decay. Still pretty crisp, and I can do the broke brain poo poo like simulating composite / S-Video artifacts too.
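For anyone curious about the arithmetic behind that 240Hz trick: here is a rough sketch in Python of how a real-frame/black-frame cadence translates into content rate, persistence, and brightness loss. The numbers are illustrative only and aren't tied to any particular monitor or RetroArch option.

```python
# Rough sketch of the software BFI cadence described above.
# Numbers are illustrative; real behaviour depends on the panel and emulator.

def bfi_stats(panel_hz: float, real_frames: int, black_frames: int) -> dict:
    """Stats for a repeating cadence of `real_frames` shown frames followed by
    `black_frames` black frames on a panel refreshing at `panel_hz`."""
    cycle = real_frames + black_frames
    frame_time_ms = 1000.0 / panel_hz
    return {
        "content_rate_hz": panel_hz / cycle,            # how often a new game frame appears
        "persistence_ms": real_frames * frame_time_ms,  # how long each game frame stays lit
        "relative_brightness": real_frames / cycle,     # fraction of time the panel is lit
    }

# 240Hz panel, 1 real + 3 black frames: 60fps content, ~4.2ms persistence, ~25% brightness.
print(bfi_stats(240, 1, 3))

# 120Hz panel, alternating real/black: 60fps content, ~8.3ms persistence, 50% brightness.
print(bfi_stats(120, 1, 1))
```

The brightness figure is why strobing/BFI setups end up dim, much like the CRTs they're imitating.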

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
Jesus, emulation is a dark rabbit hole.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
You don't know the half of it. Just wait until you start learning about how impractical it is to do truly proper emulation of even the simplest pieces of hardware. There are electrical and timing factors that emulators just completely overlook because they're impossible to truly simulate and we're nowhere near emulating them all yet.

Of course, it essentially never matters, and in practice it will be easier to put together hacks to make every single game work right, but that doesn't mean people will stop chasing perfection.

LRADIKAL
Jun 10, 2001

Fun Shoe
Eventually we'll have enough processing power to actually simulate the physical chips and boards, good reason to upgrade.

Shaocaholica
Oct 29, 2002

Fig. 5E

LRADIKAL posted:

Eventually we'll have enough processing power to actually simulate the physical chips and boards, good reason to upgrade.

You mean the wires and transistors. Sure why not.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

is 3d printing your own c64 parts 'simulating' ? lol

good luck to the AMD guy w the wrong BIOS on your card, genuinely hope you can flash that thing to where it needs to be. basically tried the same thing myself this summer

SwissArmyDruid
Feb 14, 2014

by sebmojo
Techpowerup is a good place to start looking for that kind of thing.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Lockback posted:

Jesus, emulation is a dark rabbit hole.

That's why I stick to real hardware with a bunch of rgb scart cables, a gscart switcher and framemeister...
So simple :v: ..then there's the question of flash carts..

Seriously though, it's nice using the real controllers and getting back to the simplicity of gaming on older machines. No popups, distractions, save states or any of that crap. I find using emulators it's just too easy to speed skip a cutscene, alt-tab to look at a website, or whatever.

HalloKitty fucked around with this message at 10:44 on Dec 24, 2019

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Arzachel posted:

Are you banking on no one actually clicking the link cause lmao

btw now that we've got HUB covering this topic, what exactly are these posts supposed to convey here, besides general disdain for any criticism of AMD at any level?

not like AMD choosing to gimp their cards to x8 is a good thing, lol

Paul MaudDib fucked around with this message at 11:46 on Dec 24, 2019

Arzachel
May 12, 2012

Paul MaudDib posted:

btw now that we've got HUB covering this topic, what exactly are these posts supposed to convey here, besides general disdain for any criticism of AMD at any level?

not like AMD choosing to gimp their cards to x8 is a good thing, lol

There's plenty of criticism of AMD in this thread that doesn't get that response because they aren't the youtube equivalent of a random reddit poster + "This is bad for AMD". It's almost like you post bad.

They're "gimping" the cards as much as anyone running something on the second PCIE socket on an Intel consumer platform.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Arzachel posted:

They're "gimping" the cards as much as anyone running something on the second PCIE socket on an Intel consumer platform.

NVIDIA cards running in the second slot don't take that much of a hit though. Hell, a 2080 Ti in the second slot (3.0 x8 at 1080p) apparently takes less than 1/3 the hit a 5500XT does. :thunk:

Paul MaudDib fucked around with this message at 12:45 on Dec 24, 2019

Arzachel
May 12, 2012

Paul MaudDib posted:

NVIDIA cards running in the second slot don't take that much of a hit though. Hell, a 2080 Ti in the second slot (3.0 x8 at 1080p) apparently takes less than 1/3 the hit a 5500XT does. :thunk:

Exactly! The x8 bus width on the cards, or how much the frame rate tanks *when you're out of VRAM*, is meaningless, but the 5500/XT swapping way more memory than they have any right to will actually hurt performance.

I would be very surprised if that turned out to be a hardware issue and not AMD botching the launch drivers again.

GRINDCORE MEGGIDO
Feb 28, 1985


HalloKitty posted:

Seriously though, it's nice using the real controllers and getting back to the simplicity of gaming on older machines. No popups, distractions, save states or any of that crap. I find using emulators it's just too easy to speed skip a cutscene, alt-tab to look at a website, or whatever.

I'm stoked about building an Amiga system.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Arzachel posted:

Exactly! The x8 bus width on the cards, or how much the frame rate tanks *when you're out of VRAM*, is meaningless, but the 5500/XT swapping way more memory than they have any right to will actually hurt performance.

I would be very surprised if that turned out to be a hardware issue and not AMD botching the launch drivers again.

then why does the 5500XT 8GB still show scaling? :thunk:

Never saw the 1080 8 GB scaling that much at x8...

Paul MaudDib fucked around with this message at 13:13 on Dec 24, 2019

Arzachel
May 12, 2012
poo poo launch drivers is my guess, which with AMD's track record is a very safe bet.

RME
Feb 20, 2012

the CRT shaders out there now (not the lovely filters you see in official collections) are cool but even intense ones don't really need a Ti or anything lol

e; i haven't run into a shader that my 1060 can't handle, and even that's prolly overkill. The GPU isn't engaged in anything else during emulation really; it's almost entirely CPU bound

RME fucked around with this message at 18:07 on Dec 24, 2019

orcane
Jun 13, 2012

Fun Shoe
I guess if you spend all day on :reddit: /r/amd to get yourself mad at the underdog it makes sense AMD is gimping their $200 (= entry level mainstream) card on purpose to sell more $200+ mainboards to people building computers for Fortnite. They didn't want to sell them to people using cheaper computers or Intel chipsets to begin with.

:thunk:

Schadenboner
Aug 15, 2011

by Shine

Cygni posted:

don’t use Linux for games and expect it to work. that's been true for like 20 years. gaming in Linux is the realm of grey beards and people who like to do computer setups explicitly because they are hard and suck (technology dark souls)

I'll have you know my beard is a lustrous chestnut with only hints of grey.

:mad:

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Arzachel posted:

poo poo launch drivers is my guess, which with AMD's track record is a very safe bet.

I wonder if it’s cause AMD is using a software-based HBCC style strategy while NVIDIA is hand-tuning what gets pinned into memory at a driver level? If AMD is blindly swapping big assets into and out of VRAM while NVIDIA is choosing to hold the important stuff and swap smaller stuff, that might explain why AMD seemingly sees more bus traffic. Basically a cache thrashing problem from setting “swappiness” too high.

The big question to me is why the 8GB seemingly swaps so much more than NVIDIA 8GB cards. I can accept that the 4GB might be VRAM limited at max settings but that shouldn’t happen too much on 8GB.

I don’t really see what else it could be besides swapping. It’s the same graphics API calls in both cases.

Is there a meter in GPU-Z or HWMonitor for PCIe bus utilization? I think monitoring that number and comparing across brands/generations/VRAM capacity is the next step here for GN/HUB/etc.
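To make the cache-thrashing idea above concrete, here is a toy model in Python: VRAM as an LRU cache of fixed-size assets, counting how many megabytes have to come over the bus per frame. It's deliberately naive (no real driver manages memory like this), just an illustration of why sitting right at the edge of VRAM capacity can blow up bus traffic.

```python
# Toy model of VRAM as an LRU cache; purely illustrative, not how any real driver works.
from collections import OrderedDict

def steady_state_bus_mb(vram_mb: int, asset_mb: int, assets_per_frame: int) -> int:
    """MB pulled over PCIe for one frame, after a warm-up frame, when every frame
    touches the same assets in the same order."""
    capacity = vram_mb // asset_mb
    resident = OrderedDict()

    def render_frame() -> int:
        uploaded = 0
        for asset in range(assets_per_frame):
            if asset in resident:
                resident.move_to_end(asset)       # hit: no bus traffic
            else:
                uploaded += asset_mb              # miss: asset comes over PCIe
                resident[asset] = None
                if len(resident) > capacity:
                    resident.popitem(last=False)  # evict least recently used
        return uploaded

    render_frame()         # warm-up frame
    return render_frame()  # steady state

# 8 GB card, 128 MB assets: a working set just over capacity thrashes every frame,
# one that fits generates no bus traffic at all once resident.
print(steady_state_bus_mb(8192, 128, 65))  # 65 x 128 MB > 8 GB -> 8320 MB per frame
print(steady_state_bus_mb(8192, 128, 63))  # fits in VRAM       -> 0 MB per frame
```

In a model this crude, a few hundred MB of extra overhead is the difference between no swapping and re-uploading the whole working set every frame, which is roughly the "riding the edge of VRAM" scenario being discussed.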

repiv
Aug 13, 2009

Paul MaudDib posted:

Is there a meter in GPU-Z or HWMonitor for PCIe bus utilization? I think monitoring that number and comparing across brands/generations/VRAM capacity is the next step here for GN/HUB/etc.

GPU-Z has "Bus Interface Load" on Nvidia but I don't have an AMD system handy to see if their driver exposes a similar metric.
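On the programmatic side, NVIDIA's NVML library exposes PCIe throughput counters, so a sketch like the following (Python, assuming the pynvml bindings are installed; NVIDIA-only, and whether AMD exposes anything comparable is exactly the open question here) could log the number during a benchmark run:

```python
# Minimal sketch: poll NVML's PCIe throughput counters once per second.
# Assumes an NVIDIA GPU and the pynvml bindings (pip install pynvml).
# NVML reports these values in KB/s over a short sampling window.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        rx = pynvml.nvmlDeviceGetPcieThroughput(handle, pynvml.NVML_PCIE_UTIL_RX_BYTES)
        tx = pynvml.nvmlDeviceGetPcieThroughput(handle, pynvml.NVML_PCIE_UTIL_TX_BYTES)
        print(f"PCIe rx {rx / 1024:7.1f} MB/s  tx {tx / 1024:7.1f} MB/s")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```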

Freakazoid_
Jul 5, 2013


Buglord

Lockback posted:

Jesus, emulation is a dark rabbit hole.

I'm weirded out by it. But only because I am the true weirdo who has managed to keep almost all his childhood consoles and more, plus has at least two crt tvs (one of which I got for free from a warehouse cleanout earlier this year, unopened) that I can connect them to at any time and play some old games. From my perspective it's like some sort of necromancy.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

orcane posted:

I guess if you spend all day on :reddit: /r/amd to get yourself mad at the underdog it makes sense AMD is gimping their $200 (= entry level mainstream) card on purpose to sell more $200+ mainboards to people building computers for Fortnite. They didn't want to sell them to people using cheaper computers or Intel chipsets to begin with.

:thunk:

lol yes it’s definitely my fault, personally, that AMD gimped their own card with too small of a PCIe bus to cut costs. /s

Like, x16 buses have been normal for a long time; this is the first time we’ve seen x8 outside of ultra low cost garbage like the 5450 and GT 210 and stuff. They went out of their way to do that. And that’s just a simple factual observation: nobody else does x8 on 290X/580/1060 tier cards, including AMD's own previous cards.

I do think AMD is hoping it’ll be a bit of a brand lock-in once they get lower cost 4.0 boards out. B550 will support 4.0 on the GPU lanes (not that X470 and others couldn’t have if AMD hadn't gimped them too).

I guess it is what it is, not like the 5500XT is some amazing value anyway, so it’s just another reason not to buy it. 580 or 1650 Super work fine without needing a specific motherboard and are outright better price-to-perf.

Looking forward to HUB benchmarks that require purchasing all-AMD with hand tuned DRAM timings to match their numbers, lol.

Paul MaudDib fucked around with this message at 22:38 on Dec 24, 2019

eames
May 9, 2009

Paul MaudDib posted:

The big question to me is why the 8GB seemingly swaps so much more than NVIDIA 8GB cards. I can accept that the 4GB might be VRAM limited at max settings but that shouldn’t happen too much on 8GB.

I remember seeing side-by-side comparisons between Polaris and Pascal cards where the AMD part consistently showed higher (>10%) VRAM utilization. The reviewers stated that this was due to different compression algorithms. If nothing has changed since then, this could explain additional swapping, particularly in titles/settings optimized for 8GB NVIDIA cards.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

eames posted:

I remember seeing side-by-side comparisons between Polaris and Pascal cards where the AMD part consistently showed higher (>10%) VRAM utilization. The reviewers stated that this was due to different compression algorithms. If nothing has changed since then, this could explain additional swapping, particularly in titles/settings optimized for 8GB NVIDIA cards.

That’s a misunderstanding of how delta compression works. It increases effective bandwidth but doesn’t reduce the footprint in VRAM.

It could, but then you’d have to maintain a lookup table of “virtual” memory addresses for the compressed data and update it whenever things get swapped, etc., so they don’t. Instead there’s a “stop codon”, and after that the texture is padded out to its full size with zeroes. You don’t have to read the rest once you see the stop marker, but the space is still reserved.

(at least that’s my understanding of delta compression, maybe it’s changed by now)

That said, it’s certainly possible that AMD is just using a little more VRAM (less efficient driver/firmware with more buffers, etc) and they’re riding right on the edge of the VRAM capacity, so AMD bumps over a little but NVIDIA doesn’t? It’s just not delta compression that does it.

Paul MaudDib fucked around with this message at 22:50 on Dec 24, 2019

repiv
Aug 13, 2009

fun fact: delta compression actually increases memory footprint because in addition to the image data (which is reserved for the worst case, where no tiles can be compressed) it also has to store a metadata image to keep track of which encoding method each tile was stored with, and any parameters for the encoding if they aren't stored inline with the compressed tile

it's probably not a huge amount of data in practice (dcc only applies to framebuffer-type images, texture compression is handled differently) but it's something you have to take into account when working with low level apis: even if you intuitively know how many bytes a resource should take, you still have to ask the driver how many bytes to allocate in case the hardware wants to store some metadata on the side

repiv fucked around with this message at 23:19 on Dec 24, 2019
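To put rough numbers on "probably not a huge amount of data": a back-of-envelope Python sketch with invented tile and metadata sizes (real GPUs don't document these details, so treat the constants as placeholders) shows the metadata plane is a fraction of a percent of the image itself.

```python
# Back-of-envelope illustration: how much a per-tile metadata plane adds to a
# delta-compressed render target. Tile size and metadata bytes are made up.

def dcc_footprint(width: int, height: int, bpp: int = 4,
                  tile: int = 8, meta_bytes_per_tile: int = 1) -> tuple:
    """Return (image_bytes, metadata_bytes) for a hypothetical scheme that keeps
    worst-case image storage plus one metadata byte per tile x tile block."""
    image = width * height * bpp
    tiles_x = -(-width // tile)   # ceiling division
    tiles_y = -(-height // tile)
    return image, tiles_x * tiles_y * meta_bytes_per_tile

img, meta = dcc_footprint(3840, 2160)  # a 4K RGBA8 render target
print(f"image {img / 2**20:.1f} MiB, metadata {meta / 2**10:.1f} KiB "
      f"({100 * meta / img:.2f}% overhead)")
```

Small, but nonzero, which is why the size the driver hands back for an allocation can be a little bigger than width x height x bytes per pixel.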

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
I realize there are problems with this comparison, but it would be interesting to virtualize an OS, take an MI50, and lock it to PCIe 3.0 at various lane counts to see how it scales. Use a DX12 title so the drivers don’t matter.

That’s the only other PCIe 4.0 GPU that exists right now (other than V100 I guess).

Maybe the problem has existed for a while and nobody noticed it because these are the first consumer 4.0 GPUs. All the PCIe scaling tests I’ve seen for the last 5 years have used NVIDIA because they were faster, and it was assumed that both uarchs behaved similarly...

Paul MaudDib fucked around with this message at 23:05 on Dec 24, 2019

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

V100 is Gen3. Good luck getting ahold of an MI50 if you aren’t buying thousands.

CrazyLoon
Aug 10, 2015

"..."

Statutory Ape posted:

good luck to the AMD guy w the wrong BIOS on your card, genuinely hope you can flash that thing to where it needs to be. basically tried the same thing myself this summer

Appreciated. I'll take my time and just try BIOSes from XFX RX 580s carefully one by one, I think... and if anything goes wrong during the flashing, well, that's why dual BIOS with a simple flick of a switch, plus Buildzoid's tutorial on how to recover from a bad flash, is honestly a godsend.

I might succeed or I might not... I honestly half-expected this, considering I got an 8GB card like this for 130 EUR. Still, it is stable at State 1 atm, and it's led me to learn more about my computer than I even remotely would've if I'd just plugged it all in and it all worked, so... no regrets.

Schadenboner
Aug 15, 2011

by Shine
I think some of the 1660 Super cards (the <180mm shorties, still otherwise full-spec from what I can see) are logical 3.0 x8s?

Craptacular!
Jul 9, 2001

Fuck the DH

Arzachel posted:

poo poo launch drivers is my guess, which with AMD's track record is a very safe bet.

At the risk of being a Linux pedant again, if there's a problem with AMD's drivers then it shouldn't exist in Linux because AMD doesn't develop them.

Realistically, I kind of doubt AMD would expect 5500 buyers to buy X570 boards. That's Bx50 territory.

SwissArmyDruid
Feb 14, 2014

by sebmojo
PowerColor releases a single-fan, dual-slot, single-8-pin RX 5700.

BUT ONLY IN JAPAN.

http://www.gdm.or.jp/pressrelease/2019/1218/331737

apropos man
Sep 5, 2016

You get a hundred and forty one thousand years and you're out in eight!

Schadenboner posted:

I think some of the 1660 Super cards (the <180mm shorties, still otherwise full-spec from what I can see) are logical 3.0 x8s?

I'm on the lookout for a 1660 Super or a Ti after Christmas is over. I take it the MSI cards aren't bandwidth limited?

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Is there anywhere that has a handy table listing which pro cards have the same GPU as which consumer cards?

repiv
Aug 13, 2009

ItBreathes posted:

Is there anywhere that has a handy table listing which pro cards have the same GPU as which consumer cards?

it's not a direct mapping but you can figure it out pretty easily from these tables

https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units
https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units

just match up the chip name then narrow it down to the closest core count, etc

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

That's what I've been doing / was hoping to get away from.

Sininu
Jan 8, 2014

So I ordered myself a Kraken G12, an H55, and some VRAM heatsinks. What should I know before that stuff arrives and I start mounting it?

I've got a 2070S inside a Meshify C case like a couple of other people here.

CrazyLoon
Aug 10, 2015

"..."
Welp, I'm stumped. I am dead certain I have the right card matched with the right TechPowerUp BIOS, and yet no progress, other than the fact that it might take an extra second with this one before it crashes, so... I suppose that's some confirmation at least. I even tried hooking up two 6+2-pin connectors from the PSU to the included 8-pin to dual 6-pin adapter, but it didn't make a lick of difference.

My only possible conclusions are that either a) this was initially released as some sort of mining-only GPU and also configured that way by XFX, and that's what's limiting it, or b) there's something physically screwy with the VRM inside the card itself, hence why it can't supply enough power for State 2, and I don't have anywhere near enough experience to fiddle with that and not mess poo poo up.

Ah well... c'est la vie... if I ever feel like it I might try fiddling with the Polaris BIOS Editor, like the coin people do, and see if I can figure out these hash timings myself somehow, in case that is still screwing it up, but I rather doubt it by this point...

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Sininu posted:

So I ordered myself a Kraken G12, an H55, and some VRAM heatsinks. What should I know before that stuff arrives and I start mounting it?

I've got a 2070S inside a Meshify C case like a couple of other people here.

It's pretty straightforward, or at least it was for me. Use the AMD brackets for the G12, they fit perfectly (but the Nvidia ones don't). Careful when unplugging the fan/LED connectors of the stock cooler from the GPU - don't pull on the cables. Think about where you want the tubing to go when mounting the pump/coldplate assembly on the GPU - you can orient it whichever way you want. If you can re-attach the backplate of the card after installing the G12 you might want to do that, it can help with cooling the back of the board and with keeping the card from sagging. In my case though I couldn't do that (and the backplate didn't have any thermal pads anyway so it didn't do anything for cooling).

codo27
Apr 21, 2008

So I brought my tower home for Christmas, left my monitor and stuff back at the apartment. Just got it hooked up to my X850E. Playing Nioh fine, runs great. I got RE2 remake for Christmas and I just fired it up, going through the graphics settings, and all of a sudden everything gets a pink hue. I tried disabling HDR, everything went a lot dimmer but still off. I alt-tabbed out and noticed an HDR setting in the Windows display settings, and when I hit that, the game took focus and I could see it was fixed, but when I actually opened the window it went pink again and I noticed that the HDR setting in Windows didn't stick. It just flicks back whenever I try to slide it over. I know the TV and receiver (TXNR676) both support HDR, and I'm sure my 2080 does. What gives?

Klyith
Aug 3, 2007

GBS Pledge Week

CrazyLoon posted:

My only possible conclusions are that either a) this was initially released as some sort of mining-only GPU and also configured that way by XFX, and that's what's limiting it, or b) there's something physically screwy with the VRM inside the card itself, hence why it can't supply enough power for State 2, and I don't have anywhere near enough experience to fiddle with that and not mess poo poo up.

c) the miner you bought it from damaged the memory controller by overvolting or some other stupidity


codo27 posted:

So I brought my tower home for Christmas, left my monitor and stuff back at the apartment. Just got it hooked up to my X850E. Playing Nioh fine, runs great. I got RE2 remake for Christmas and I just fired it up, going through the graphics settings, and all of a sudden everything gets a pink hue. I tried disabling HDR, everything went a lot dimmer but still off. I alt-tabbed out and noticed an HDR setting in the Windows display settings, and when I hit that, the game took focus and I could see it was fixed, but when I actually opened the window it went pink again and I noticed that the HDR setting in Windows didn't stick. It just flicks back whenever I try to slide it over. I know the TV and receiver (TXNR676) both support HDR, and I'm sure my 2080 does. What gives?

Try plugging it directly into the TV and/or a different HDMI cable?
