K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
It stops people from getting locked into solutions that tie them to Nvidia hardware. Companies generally don't just go "hey, time for an upgrade cycle, we're going to replace all our software and all our hardware simultaneously!"

repiv
Aug 13, 2009

It also stacks the deck in AMD's favour because although ProRender runs on Nvidia cards, it doesn't use the RTX hardware features for acceleration.

Nvidia's raytracing advantage is nullified anywhere that ProRender/RadeonRays gets adopted over something based on DXR/VKRT/OptiX.

repiv fucked around with this message at 21:59 on Mar 28, 2019

Anime Schoolgirl
Nov 28, 2002

BIG HEADLINE posted:

So I've the chance to pick up an "XFX Radeon RX 580 GTS XXX Edition" for ~$160 through a credit card promo, but I have to confess an ignorance toward AMD card makers. Is XFX still decent or is there a reason they're the cheapest?
XFX is still above the median when it comes to AMD cards.

Listerine
Jan 5, 2005

Exquisite Corpse

Cavauro posted:

Installing the drivers is extremely straightforward. That poster who works in rendering probably just doesn't want any bloat and doesn't happen to know what GeForce Experience is

This is it. I know how to install drivers on my own, I don't need a program that does it for me, and I definitely don't need something making changes that may or may not break how my software works. Does GeForce Experience operate automatically in the background, or does it ask before updating the drivers?

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
It downloads drivers automatically and then very politely asks if you want to install a new driver that has a 95% chance to gently caress SOMETHING up.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

I just unplug high value items if there's a bad storm, sometimes.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
GFE isn't terrible, but I think its biggest sin is making you create an account and log in just to download drivers. The Shadowplay stuff is pretty cool though. Like the XB1 and PS4, you can set it to constantly record and then hit a key to save the last X seconds of gameplay. Anecdotal, but I haven't had problems with driver updates. The GFE install process is pretty simple and takes less time than me manually going online and doing it.
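
(Not that this is how GFE actually implements it, but the "constantly record, hit a key to save the last X seconds" trick is basically a ring buffer of encoded frames. A minimal Python sketch with made-up names:)

code:
from collections import deque

class ReplayBuffer:
    """Hold only the most recent `seconds` worth of captured frames."""
    def __init__(self, seconds=30, fps=60):
        # a deque with maxlen silently drops the oldest frame once it's full
        self.frames = deque(maxlen=seconds * fps)

    def push(self, encoded_frame: bytes):
        # called once per captured frame by the capture loop
        self.frames.append(encoded_frame)

    def save_clip(self, path: str):
        # hotkey handler: dump whatever is currently buffered to disk
        with open(path, "wb") as f:
            for frame in self.frames:
                f.write(frame)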

However, I used to have an (AMD) graphics card and know the horror of buggy drivers breaking more things than they fix. I know the value of staying with drivers that work, even if they're months or years old.

repiv
Aug 13, 2009

Nvidia is cutting RTX prices across the board in the UK:

2080ti: £1,099.00 -> £979.99
2080: £749.00 -> £599.99
2070: £549.00 -> £419.99
2060: £329.00 -> £299.99

Dunno if this is happening worldwide or only in some regions.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

repiv posted:

Nvidia is cutting RTX prices across the board in the UK:

2080ti: £1,099.00 -> £979.99
2080: £749.00 -> £599.99
2070: £549.00 -> £419.99
2060: £329.00 -> £299.99

Dunno if this is happening worldwide or only in some regions.

At one retailer. Might just be a sale. It also brings it in line with US prices after conversion.
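
(Back-of-the-envelope check on that, assuming the GBP prices include 20% UK VAT and using roughly $1.30 to the pound, which is about where the exchange rate sat in March 2019; neither number is from the post itself:)

code:
# Strip VAT and convert, since US MSRPs are quoted before sales tax.
VAT = 0.20
USD_PER_GBP = 1.30  # approximate March 2019 rate

uk_prices = {"2080 Ti": 979.99, "2080": 599.99, "2070": 419.99, "2060": 299.99}

for card, gbp in uk_prices.items():
    usd = gbp / (1 + VAT) * USD_PER_GBP
    print(f"{card}: £{gbp:,.2f} inc. VAT ≈ ${usd:,.0f} ex-tax")

# The 2080 Ti comes out around $1,060, between the $999 partner-card MSRP
# and the $1,199 Founders Edition price.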

EdEddnEddy
Apr 5, 2012



I am enjoying all the news rumors saying that AMD's next GPU could do ray tracing and beat Nvidia's 2080 Ti!!!!

Yeah, we have all heard this story before. If they had just had a single-generation performance slip or something, then we could maybe buy it, but when it's been like four, I think going in cautious would be better for AMD overall. Could they possibly put out something on the GPU side similar to what they did with Ryzen? Sure. Will they? Who the heck knows at this point.

We all hope they can, so it can bring some solid competition back into the GPU world, but for now I think wait and see would be best, as unfortunately AMD hasn't been able to hit Nvidia in the high end directly for ages.

Wasn't the last time AMD could hit the high end of the Nvidia market literally the 290/390 vs GTX 7xx series?

I do miss the old ATI days where the battle was pretty solid in the 1800/1900XT/X days, then the 8800GTX vs the 4870/5870 days. Those were pretty good times. (Minus the DX10 driver fiasco with Nvidia and the lovely solder days, but eh.)

wargames
Mar 16, 2008

official yospos cat censor

EdEddnEddy posted:

I am enjoying all the news rumors saying that AMD's next GPU could do ray tracing and beat Nvidia's 2080 Ti!!!!

Yeah, we have all heard this story before. If they had just had a single-generation performance slip or something, then we could maybe buy it, but when it's been like four, I think going in cautious would be better for AMD overall. Could they possibly put out something on the GPU side similar to what they did with Ryzen? Sure. Will they? Who the heck knows at this point.

We all hope they can, so it can bring some solid competition back into the GPU world, but for now I think wait and see would be best, as unfortunately AMD hasn't been able to hit Nvidia in the high end directly for ages.

Wasn't the last time AMD could hit the high end of the Nvidia market literally the 290/390 vs GTX 7xx series?

I do miss the old ATI days where the battle was pretty solid in the 1800/1900XT/X days, then the 8800GTX vs the 4870/5870 days. Those were pretty good times. (Minus the DX10 driver fiasco with Nvidia and the lovely solder days, but eh.)

I mean, the AMD GPUs aren't terrible, and the AMD driver stack and bonus software are good. Just get a good price/perf card and who cares who has the best $1000+ GPU. You really can't go wrong with either team.

Enos Cabell
Nov 3, 2004


wargames posted:

You really can't go wrong with either team at 1080p.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
He was talking about the future, though. AMD shows no signs of life ahead. Nvidia is going to release 7nm RTX GPUs at some point, and AMD is going to release... another GCN respin. Being competitive at the bottom of the barrel isn't going to cut it, especially as more and more of the GPU-sector revenue is coming from areas outside gaming where Nvidia continues to dominate. Ironically, it's AMD's only win that's going to hurt them most. The new consoles will come out, people will need new GPUs to run games at the settings they actually want, and AMD won't have anything to sell those people.

wargames
Mar 16, 2008

official yospos cat censor

K8.0 posted:

He was talking about the future, though. AMD shows no signs of life ahead. Nvidia is going to release 7nm RTX GPUs at some point, and AMD is going to release... another GCN respin. Being competitive at the bottom of the barrel isn't going to cut it, especially as more and more of the GPU-sector revenue is coming from areas outside gaming where Nvidia continues to dominate. Ironically, it's AMD's only win that's going to hurt them most. The new consoles will come out, people will need new GPUs to run games at the settings they actually want, and AMD won't have anything to sell those people.

Is the 2070 bottom of the barrel nowadays? AMD has a card that equals it but loses money on that card because of the HBM, and I do agree Navi will be another GCN respin, but it's not like AMD is going to stay on GCN forever. What happens after Navi: another GCN respin, or their new GPU design?

Craptacular!
Jul 9, 2001

Fuck the DH
AMD doesn't have anything as good as the best Nvidia card, but that's not distressing because they've made it clear for years that they're not competing on halo products. That means people who can afford to be gouged for the very best will be gouged further, but as the guy who keeps pivoting this thread into class warfare, you'll have to forgive me for playing the tiny violin on that one.

The really distressing problem is that the things AMD does make are falling flat against mid-range Nvidia, and it's entirely because their bet on HBM was a losing one and they're still suffering the consequences.

Seamonster
Apr 30, 2007

IMMER SIEGREICH
Feels like I'm dying of old age speculating/waiting on Navi details. Just die shrink Polaris to 7nm, up the clock speed accordingly, hook it up to GDDR6 and hit the send button.

Xerophyte
Mar 17, 2008

This space intentionally left blank

TheFluff posted:

The thing about video games though is that strictly speaking they're not sampled temporally in the signals processing sense of the term - the game engine computes movement in discrete steps; it doesn't sample a continuous signal. You could say the input (keyboard/mouse) is sampling, though. You could also make a good argument that things like speedrunners moving so fast that they break the collision detection and clip through walls are temporal aliasing bugs. I think that these things are what you're after, so that's why I'm saying that I think you're talking about temporal aliasing, not spatial aliasing.

Good post! It's arguably a little more complex than that for modern games, though. It's true a game only computes the game state at discrete times. Nowadays I'd say it's most commonly done at some fixed lockstep rate that's decoupled from the frame rate. Expect a fixed 60 Hz for action-y games, but it can be less. Older RTSes like Starcraft or Supreme Commander actually run at 10 Hz since the game state updates are relatively heavy (supcom can still bring modern CPUs to their proverbial knees, which is kind of impressive). When rendering, the game displays a state that's a weighted interpolation between the two closest computed game states. The displayed image is basically sampled from some piecewise linear-ish continuous video signal, and we can vary the sampling rate for the linear approximation we use in the simulation independently of the sampling rate used for display. Both those sample rates can cause aliasing issues, either in the simulation of the game or in displaying the image.

Other possible approaches are to do variable rate updates where the game state is computed with a timestep based on the current frame rate, but that produces inconsistent results, so it's usually avoided for robustness and for generally making multiplayer a thing that is possible. Console games often lock graphics and game to a fixed absolute cadence of either 30 Hz or 60 Hz and just double the last frame if a state update isn't done in time, which is very easy and consistent but only works when you're on fixed hardware and can design your entire game to guarantee that your specified framerate will very nearly always be met by anyone playing. That design quirk is incidentally why ports of console exclusives to PC often have fixed refresh rates: changing the entire game to a variable step rate after the fact is going to cause an absolute shitton of infuriating and hard-to-fix little collision bugs and the like. Fixed rate is pretty uncommon now that even the consoles come with different specs.
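
(Concretely, the lockstep-plus-interpolation scheme described above boils down to something like this. A minimal sketch, not any particular engine's code; the game object and its update/interpolate/render methods are made up:)

code:
import time

TICK_RATE = 60          # fixed simulation rate, decoupled from the frame rate
DT = 1.0 / TICK_RATE

def run(game):
    previous = current = game.initial_state()
    accumulator = 0.0
    last_time = time.perf_counter()
    while game.running:
        now = time.perf_counter()
        accumulator += now - last_time
        last_time = now

        # advance the simulation in fixed steps until it catches up to real time
        while accumulator >= DT:
            previous, current = current, game.update(current, DT)
            accumulator -= DT

        # alpha is how far between ticks we are; render a blend of the two
        # closest computed states so motion looks smooth at any display rate
        alpha = accumulator / DT
        game.render(game.interpolate(previous, current, alpha))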

I think people are additionally confused about temporal aliasing in particular because temporal anti-aliasing is of course a common technique used in games to remove spatial aliasing errors.


The majority of anti-aliasing efforts in graphics in general and games in particular are still aimed at combating the issues caused by insufficient spatial sampling. Mostly it's prefiltering techniques like mipmaps, various specular reflection filters to account for pixel and light area, normal map filters like LEAN or LEADR, etc. Then there's all the fullscreen AA filters like TAA, SMAA or FXAA, and maybe some degree of multisampling. If you could somehow run a modern game in 4K with all those filtering techniques turned off and only strict point sampling it would look like a godawful crawly noisy speckled mess: we're a couple of orders of magnitude away from a resolution where we can get away with not filtering anything at all. I guess 40K would probably work, most of the time, if there's not too much detail.
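
(A toy illustration of the point sampling problem, nothing like a real renderer's filtering, just one sample per pixel versus averaging a grid of samples per pixel:)

code:
import math

def scene(x, y):
    # stand-in for a detailed scene: stripes far too fine for a coarse "screen"
    return 0.5 + 0.5 * math.sin(400.0 * x) * math.sin(400.0 * y)

def point_sample(px, py, res):
    # one sample at the pixel centre: aliases badly into crawling moire patterns
    return scene((px + 0.5) / res, (py + 0.5) / res)

def box_filtered(px, py, res, n=4):
    # n*n samples averaged per pixel: a crude box prefilter that tames the noise
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += scene((px + (i + 0.5) / n) / res, (py + (j + 0.5) / n) / res)
    return total / (n * n)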

Averaging over the time of a frame is definitely necessary for properly conveying fast-moving things, but it's both more niche and much harder to do in general. In offline graphics we just sample the aforementioned underlying piecewise linear-ish video signal around the frame time and average, but that's extremely expensive and not possible in games. For real time you can up the framerate to a point, but even then fast motion has to be mocked with cheap, inaccurate post-process motion blur, which is a little too blunt to use everywhere.
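
(That offline averaging looks roughly like this. A sketch only; render_at() here is a stand-in for evaluating the interpolated scene at an arbitrary time and returning an image that supports + and /:)

code:
import random

def motion_blurred_frame(render_at, frame_time, shutter=1.0 / 48.0, samples=16):
    # sample the scene at several instants inside the shutter interval
    # around the frame time, then average the resulting images
    accum = None
    for _ in range(samples):
        t = frame_time + random.uniform(-0.5, 0.5) * shutter
        image = render_at(t)
        accum = image if accum is None else accum + image
    return accum / samples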

Anyhow, I wouldn't say our sampling rates are totally out of whack at 4K 60 Hz as such, but it is the case that we have very limited temporal filtering techniques versus very good spatial ones. Effectively that means that frame rate impacts your perception of fast motion much more than resolution impacts your perception of high detail, even if both are aliasing problems, so your weighting of the two ends up very subjective based on what sort of content you're looking at.

Xerophyte fucked around with this message at 21:56 on Mar 29, 2019

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Xerophyte posted:

I guess 40K would probably work, most of the time, if there's not too much detail.

In the grim darkness of the far future, there is only TAA and sharpening filters

Falcorum
Oct 21, 2010

Zedsdeadbaby posted:

In the grim darkness of the far future, there is only TAA and sharpening filters

Praise the RTX.

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Xerophyte posted:

Good post! It's arguably a little more complex than that for modern games, though. It's true a game only computes the game state at discrete times. Nowadays I'd say it's most commonly done at some fixed lockstep rate that's decoupled from the frame rate. Expect a fixed 60 Hz for action-y games, but it can be less. Older RTSes like Starcraft or Supreme Commander actually run at 10 Hz since the game state updates are relatively heavy (supcom can still bring modern CPUs to their proverbial knees, which is kind of impressive). When rendering, the game displays a state that's a weighted interpolation between the two closest computed game states. The displayed image is basically sampled from some piecewise linear-ish continuous video signal, and we can vary the sampling rate for the linear approximation we use in the simulation independently of the sampling rate used for display. Both those sample rates can cause aliasing issues, either in the simulation of the game or in displaying the image.

Other possible approaches are to do variable rate updates where the game state is computed with a timestep based on the current frame rate, but that produces inconsistent results, so it's usually avoided for robustness and for generally making multiplayer a thing that is possible. Console games often lock graphics and game to a fixed absolute cadence of either 30 Hz or 60 Hz and just double the last frame if a state update isn't done in time, which is very easy and consistent but only works when you're on fixed hardware and can design your entire game to guarantee that your specified framerate will very nearly always be met by anyone playing. That design quirk is incidentally why ports of console exclusives to PC often have fixed refresh rates: changing the entire game to a variable step rate after the fact is going to cause an absolute shitton of infuriating and hard-to-fix little collision bugs and the like. Fixed rate is pretty uncommon now that even the consoles come with different specs.

I think people are additionally confused about temporal aliasing in particular because temporal anti-aliasing is of course a common technique used in games to remove spatial aliasing errors.


The majority of anti-aliasing efforts in graphics in general and games in particular are still aimed at combating the issues caused by insufficient spatial sampling. Mostly it's prefiltering techniques like mipmaps, various specular reflection filters to account for pixel and light area, normal map filters like LEAN or LEADR, etc. Then there's all the fullscreen AA filters like TAA, SMAA or FXAA, and maybe some degree of multisampling. If you could somehow run a modern game in 4K with all those filtering techniques turned off and only strict point sampling it would look like a godawful crawly noisy speckled mess: we're a couple of orders of magnitude away from a resolution where we can get away with not filtering anything at all. I guess 40K would probably work, most of the time, if there's not too much detail.

Averaging over the time of a frame is definitely necessary for properly conveying fast-moving things, but it's both more niche and much harder to do in general. In offline graphics we just sample the aforementioned underlying piecewise linear-ish video signal around the frame time and average, but that's extremely expensive and not possible in games. For real time you can up the framerate to a point, but even then fast motion has to be mocked with cheap, inaccurate post-process motion blur, which is a little too blunt to use everywhere.

Anyhow, I wouldn't say our sampling rates are totally out of whack at 4K 60 Hz as such, but it is the case that we have very limited temporal filtering techniques versus very good spatial ones. Effectively that means that frame rate impacts your perception of fast motion much more than resolution impacts your perception of high detail, even if both are aliasing problems, so your weighting of the two ends up very subjective based on what sort of content you're looking at.

Interesting, thanks. When I said 4K doesn't really need much anti-aliasing, I was pretty much only talking about the full-screen sort. I didn't really think about how many other parts of 3D rendering are also anti-aliasing in some way.

I'm still slowly poking through the references cited in that NASA paper - haven't really understood it fully yet.

TheFluff fucked around with this message at 18:08 on Mar 30, 2019

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
:siren: TOM STATUS: NO LONGER EMPLOYED BY NVIDIA :siren:

RIP TOM'S JOB 2005-2019

Paul MaudDib fucked around with this message at 23:20 on Mar 30, 2019

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Ashraf Eassa says Ampere taped out this week on N7+.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Nvidia admits its 2080 Ti cards have a problem, but isn’t saying what it is

quote:

After weeks of speculation, rumors, and mounting evidence that Nvidia’s RTX 2080 Ti Founders Edition graphics cards were experiencing a serious problem that caused many cards to outright die on their new owners, Nvidia has admitted that there’s a problem. It hasn’t explained what it is, but has alluded, in a roundabout manner, that the problem managed to sneak past quality control somehow. It also pledged to continue to work with affected consumers.

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.


It’s too awesome guys

Alchenar
Apr 9, 2008

Ooh, I bet the problem involves an email on NVIDIA's internal server containing the words "ah it'll probably be fine"

GRINDCORE MEGGIDO
Feb 28, 1985


Just buy it™

Absurd Alhazred
Mar 27, 2010

by Athanatos
I'm bricking it®

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

quote:

Posted on 11.16.18 - 7:37AM PST

This is from half a year ago and that issue was discussed a lot back then.

SwissArmyDruid
Feb 14, 2014

by sebmojo
....alright, that's on me for not checking the date of things that pop to the top of my news feed. Sorry, my bad.

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

TheFluff posted:

This is from half a year ago and that issue was discussed a lot back then.

Was that the whole debacle with the memory issues? I remember seeing that & saw a few reviewers were able to stabilize cards with the "faulty" Micron memory by underclocking them. Made me think that it wasn't really bad chips per se, but something with the memory settings, whether it was clock speed or timings.

SwissArmyDruid
Feb 14, 2014

by sebmojo
https://i.imgur.com/CXDEQWk.gifv

Allegedly, some guy named "Sonic Ether" makes raytracing shaders for Minecraft?

repiv
Aug 13, 2009

yeah it's some good poo poo

https://www.youtube.com/watch?v=YD7sBYBj06Q

there's no rtx acceleration though, someone needs to port the minecraft renderer from opengl to vulkan before that can happen

ufarn
May 30, 2009
Pathtracing, not raytracing, was what I heard.

Media Bloodbath
Mar 1, 2018

PIVOT TO ETERNAL SUFFERING
:hb:
My 2080 Ti died after one week, so I got my money back.
Should I take it as a sign to keep on trucking with my 1070 (non-Ti) and wait for the next generation, or get a new one with a custom PCB?

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
It entirely depends on what you're going to do. There's no point in buying hardware you won't use, and if you have money to burn, little point in not buying hardware you will use.

ufarn
May 30, 2009
Depends on the resolution you play at. The next 7nm GPU should hopefully be a decent jump in performance, but we're talking autumn/winter at the earliest.

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE
If that rumor about Ampere taping out this week is true (and Ampere is compute-only AFAIUI, no consumer parts), then I'd be very surprised if we see the corresponding consumer cards any sooner than a year from now. Summer/fall 2020 seems more likely to me.

eames
May 9, 2009

BOOTY-ADE posted:

Was that the whole debacle with the memory issues? I remember seeing that & saw a few reviewers were able to stabilize cards with the "faulty" Micron memory by underclocking them. Made me think that it wasn't really bad chips per se, but something with the memory settings, whether it was clock speed or timings.

A pretty credible theory is that the VRMs of the 2080 Ti dissipate too much heat too close to four of the VRAM chips. The relatively massive copper power plane transfers the heat to the chips and cooks them from below. Reputable review sites have measured >100°C at the back of the PCB between the VRM and GPU, where those chips are. The chip in the hottest location also sits in the position that's left unpopulated on the 1080 Ti.
Many of the newer third-party cards now have a big copper shim around the GPU to better cool the VRAM chips.

Media Bloodbath
Mar 1, 2018

PIVOT TO ETERNAL SUFFERING
:hb:

K8.0 posted:

It entirely depends on what you're going to do. There's no point in buying hardware you won't use, and if you have money to burn, little point in not buying hardware you will use.

True enough. I'm currently on paid leave for a few months, so I'd get quite a lot of mileage out of it.
1440p on an Acer 271hu. I went with the Ti because I wanted to get stupidly high FPS or run DSR instead of anti-aliasing.
