|
It stops people from getting locked into solutions that tie them to Nvidia hardware. Companies generally don't just go "hey, time for an upgrade cycle, we're going to replace all our software and all our hardware simultaneously!"
|
# ? Mar 28, 2019 21:33 |
|
|
It also stacks the deck in AMD's favour because although ProRender runs on Nvidia cards, it doesn't use the RTX hardware features for acceleration. Nvidia's raytracing advantage is nullified anywhere that ProRender/RadeonRays gets adopted over something based on DXR/VKRT/OptiX. repiv fucked around with this message at 21:59 on Mar 28, 2019 |
# ? Mar 28, 2019 21:50 |
|
BIG HEADLINE posted:So I've the chance to pick up an "XFX Radeon RX 580 GTS XXX Edition" for ~$160 through a credit card promo, but I have to confess an ignorance toward AMD card makers. Is XFX still decent or is there a reason they're the cheapest?
|
# ? Mar 29, 2019 00:55 |
|
Cavauro posted:Installing the drivers is extremely straightforward. That poster who works in rendering probably just doesn't want any bloat and doesn't happen to know what GeForce Experience is This is it. I know how to install drivers on my own, I don't need a program that does it for me, and I definitely don't need something making changes that may or may not break how my software works. Does GeForce Experience operate automatically in the background, or does it ask before updating the drivers?
|
# ? Mar 29, 2019 01:44 |
|
It downloads drivers automatically and then very politely asks if you want to install a new driver that has a 95% chance to gently caress SOMETHING up.
|
# ? Mar 29, 2019 02:19 |
|
I just unplug high value items if there's a bad storm, sometimes.
|
# ? Mar 29, 2019 02:35 |
|
GFE isn't terrible, but I think its biggest sin is making you create an account and log in just to download drivers. The Shadowplay stuff is pretty cool though. Like the XB1 and PS4, you can set it to constantly record and then hit a key to save the last X seconds of gameplay. Anecdotal, but I haven't had problems with driver updates. The GFE install process is pretty simple and takes less time than me manually going online and doing it. However, I used to have an (AMD) graphics card and know the horror of buggy drivers breaking more things than they fix. I know the value of staying with drivers that work, even if it's months or years old.
|
# ? Mar 29, 2019 04:43 |
|
Nvidia is cutting RTX prices across the board in the UK: 2080ti: £1,099.00 -> £979.99 2080: £749.00 -> £599.99 2070: £549.00 -> £419.99 2060: £329.00 -> £299.99 Dunno if this is happening worldwide or only in some regions.
|
# ? Mar 29, 2019 15:54 |
|
repiv posted:Nvidia is cutting RTX prices across the board in the UK: At one retailer. Might just be a sale. It also brings it inline with US prices after conversion.
|
# ? Mar 29, 2019 16:07 |
|
I am enjoying all the news rumors saying that AMD's next GPU could do Ray Tracing and Beat Nvidia's 2080Ti!!!! Yeah, we have all heard this story before. If they'd just had a single generation performance slip or something then we could maybe buy it, but when it's been like 4, I think going in cautious would be better for AMD overall. Could they possibly put out something on the GPU side similar to what they did with Ryzen? Sure. Will they? Who the heck knows at this point. We all hope they can so it can bring some solid competition back into the GPU world, but for now, I think wait and see would be best, as unfortunately AMD hasn't been able to hit Nvidia in the high end directly for ages. Wasn't the last time AMD could hit the high end of the Nvidia market literally the 290/390 vs GTX 7XX series? I do miss the old ATI days where the battle was pretty solid in the 1800/1900XT/X days, then the 8800GTX vs the 4870/5870 days. Those were pretty good times. (Minus the DX10 driver fiasco with Nvidia and the lovely solder days, but eh.)
|
# ? Mar 29, 2019 17:51 |
|
EdEddnEddy posted:I am enjoying all the news rumors saying that AMD's next GPU could do Ray Tracing and Beat Nvidia 2080Ti!!!! I mean the AMD GPUs aren't terrible and the AMD driver stack and bonus software is good, just get a good price/perf card and who cares who has the 1000+ best GPU. You really can't go wrong with either team.
|
# ? Mar 29, 2019 17:57 |
|
wargames posted:You really can't go wrong with either team at 1080p.
|
# ? Mar 29, 2019 18:06 |
|
He was talking about the future though. AMD shows no signs of life ahead. Nvidia is going to release 7nm RTX GPUs at some point, and AMD is going to release... another GCN respin. Being competitive at the bottom of the barrel isn't going to cut it, especially as more and more of the GPU-sector revenue is coming from areas outside gaming where Nvidia continues to dominate. Ironically, it's AMD's only win that's going to hurt them most. The new consoles will come out, people will need new GPUs to run games at the settings they actually want, and AMD won't have anything to sell those people.
|
# ? Mar 29, 2019 18:11 |
|
K8.0 posted:He was talking about the future though. AMD shows no signs of life ahead. Nvidia is going to release 7nm RTX GPUs at some point, and AMD is going to release... another GCN respin. Being competitive at the bottom of the barrel isn't going to cut it, especially as more and more of the GPU-sector revenue is coming from areas outside gaming where Nvidia continues to dominate. Ironically, it's AMD's only win that's going to hurt them most. The new consoles will come out, people will need new GPUs to run games at the settings they actually want, and AMD won't have anything to sell those people. Is the 2070 bottom of the barrel nowadays? AMD has a card that equals it but loses money on that card because of the HBM, and I do agree Navi will be another GCN respin, but it's not like AMD is going to stay on GCN forever. What happens after Navi: another GCN, or their new GPU design?
|
# ? Mar 29, 2019 18:55 |
|
AMD doesn’t have anything as good as the best Nvidia card, but that’s not distressing because they already made it clear years ago that they’re not competing on halo products. That means people who can afford to be gouged for the very best will be gouged further, but as the guy who keeps pivoting this thread into class warfare you’ll have to forgive me for playing the tiny violin on that one. The real distressing problem is that the things AMD does make are falling flat against mid-range Nvidia, and it’s entirely because their bet on HBM was a loser and they are still suffering the consequences.
|
# ? Mar 29, 2019 19:18 |
|
Feels like I'm dying of old age speculating/waiting on Navi details. Just die shrink Polaris to 7nm, up the clock speed accordingly, hook it up to GDDR6 and hit the send button.
|
# ? Mar 29, 2019 19:42 |
|
TheFluff posted:The thing about video games though is that strictly speaking they're not sampled temporally in the signals processing sense of the term - the game engine computes movement in discrete steps, it doesn't sample a continuous signal. You could say the input (keyboard/mouse) is sampling, though. You could also make a good argument that things like speedrunners moving so fast that they break the collision detection and clip through walls are temporal aliasing bugs. I think that these things are what you're after, so that's why I'm saying that I think you're talking about temporal aliasing, not spatial aliasing. Good post! It's arguably a little more complex than that for modern games, though. It's true a game only computes the game state at discrete times. Nowadays I'd say it's most commonly done at some fixed lockstep rate that's decoupled from the frame rate. Expect a fixed 60 Hz for action-y games, but it can be less. Older RTSes like Starcraft or Supreme Commander actually run at 10 Hz since the game state updates are relatively heavy (supcom can still bring modern CPUs to their proverbial knees, which is kind of impressive). When rendering, the game displays the state that's a weighted interpolation between the two closest computed game states. The displayed image is basically sampled from some piecewise linear-ish continuous video signal, and we can vary the sampling rate for the linear approximation we use in the simulation independently of the sampling rate used for display. Both those sample rates can cause aliasing issues, either in the simulation of the game or in displaying the image. Other possible approaches are to do variable rate updates where the game state is computed with a timestep based on the current frame rate, but that produces inconsistent results, so it's usually avoided for robustness and for generally making multiplayer a thing that is possible.
Console games often lock graphics and game to a fixed absolute cadence of either 30 Hz or 60 Hz and just double the last frame if a state update isn't done in time, which is very easy and consistent but only works when you're on fixed hardware and can design your entire game to guarantee that your specified framerate will very nearly always be met by anyone playing. That design quirk is incidentally why ports of console exclusives to PC often have fixed refresh rates: changing the entire game to a variable step rate after the fact is going to cause an absolute shitton of infuriating and hard-to-fix little collision bugs and the like. Fixed rate is pretty uncommon now that even the consoles come with different specs. I think people are additionally confused about temporal aliasing in particular because temporal anti-aliasing is of course a common technique used in games to remove spatial aliasing errors. The majority of anti-aliasing efforts in graphics in general and games in particular are still aimed at combating the issues caused by insufficient spatial sampling. Mostly it's prefiltering techniques like mipmaps, various specular reflection filters to account for pixel and light area, normal map filters like LEAN or LEADR, etc. Then there's all the fullscreen AA filters like TAA, SMAA or FXAA, and maybe some degree of multisampling. If you could somehow run a modern game in 4K with all those filtering techniques turned off and only strict point sampling it would look like a godawful crawly noisy speckled mess: we're a couple of orders of magnitude away from a resolution where we can get away with not filtering anything at all. I guess 40K would probably work, most of the time, if there's not too much detail. Averaging over the time of a frame is definitely necessary for properly conveying fast moving things but it's both more niche and also much harder to do in general.
In offline graphics we just sample the aforementioned underlying piecewise linear-ish video signal around the frame time and average, but that's extremely expensive and not possible in games. For real time you can up the framerate to a point, but even then fast motion has to be mocked with cheap, inaccurate post-process-based motion blur which is a little too blunt to use everywhere. Anyhow, I wouldn't say our sampling rates are totally out of whack at 4K 60Hz as such, but it is the case that we have very limited temporal filtering techniques versus very good spatial ones. Effectively that means that frame rate impacts your perception of fast motion much more than resolution impacts your perception of high detail even if both are aliasing problems, so your weighting of the two ends up very subjective based on what sort of content you're looking at. Xerophyte fucked around with this message at 21:56 on Mar 29, 2019 |
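The fixed-lockstep-plus-interpolation scheme described above can be sketched in a few lines. This is a generic illustration of the idea, not any particular engine's code: all the names (`simulate`, `interpolate`, `TICK_RATE`) are made up, and the "game state" is reduced to a single position/velocity pair.

```python
# Sketch of a fixed-timestep game loop with render-side interpolation.
# All names here are illustrative, not taken from any real engine.

TICK_RATE = 60                 # simulation updates per second (fixed lockstep)
DT = 1.0 / TICK_RATE           # fixed simulation timestep in seconds

def simulate(state, dt):
    """Advance the game state by one fixed step (here: constant velocity)."""
    pos, vel = state
    return (pos + vel * dt, vel)

def interpolate(prev, curr, alpha):
    """Blend the two newest states for display (0 <= alpha < 1)."""
    return prev[0] * (1.0 - alpha) + curr[0] * alpha

def run(frame_times):
    """frame_times: wall-clock duration of each rendered frame, in seconds.
    Returns the interpolated position displayed on each frame."""
    prev_state = curr_state = (0.0, 10.0)  # position 0, velocity 10 units/s
    accumulator = 0.0
    rendered = []
    for frame_dt in frame_times:
        accumulator += frame_dt
        # Run as many fixed simulation steps as the elapsed time requires.
        while accumulator >= DT:
            prev_state, curr_state = curr_state, simulate(curr_state, DT)
            accumulator -= DT
        # Display a weighted interpolation between the two closest computed
        # game states, so the render rate is decoupled from the sim rate.
        alpha = accumulator / DT
        rendered.append(interpolate(prev_state, curr_state, alpha))
    return rendered

# Three uneven frames at roughly 90 fps:
frames = run([0.011, 0.012, 0.010])
```

The accumulator is what decouples the two sampling rates: a fast renderer just interpolates more displayed frames between the same two simulated states, while a slow one runs several simulation steps per displayed frame.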
# ? Mar 29, 2019 21:52 |
|
Xerophyte posted:I guess 40K would probably work, most of the time, if there's not too much detail. In the grim darkness of the far future, there is only TAA and sharpening filters
|
# ? Mar 30, 2019 01:14 |
|
Zedsdeadbaby posted:In the grim darkness of the far future, there is only TAA and sharpening filters Praise the RTX.
|
# ? Mar 30, 2019 16:54 |
|
Xerophyte posted:Good post! It's arguably a little more complex than that for modern games, though. It's true a game only computes the game state at discrete times. Nowadays I'd say it's most commonly done at some fixed lockstep rate that's decoupled from the frame rate. Expect a fixed 60 Hz for action-y games, but it can be less. Older RTSes like Starcraft or Supreme Commander actually run at 10 Hz since the game state updates are relatively heavy (supcom can still bring modern CPUs to their proverbial knees, which is kind of impressive). When rendering, the game displays the state that's a weighted interpolation between the two closest computed game states. The displayed image is basically sampled from some piecewise linear-ish continuous video signal, and we can vary the sampling rate for the linear approximation we use in the simulation independently of the sampling rate used for display. Both those sample rates can cause aliasing issues, either in the simulation of the game or in displaying the image. Interesting, thanks. When I said 4K doesn't really need much antialiasing, I was pretty much only talking about the full screen sort. I didn't really think about how many other parts of 3D rendering are also anti-aliasing in some way. I'm still slowly poking through the references cited in that NASA paper - haven't really understood it fully yet. TheFluff fucked around with this message at 18:08 on Mar 30, 2019 |
# ? Mar 30, 2019 18:04 |
|
TOM STATUS: NO LONGER EMPLOYED BY NVIDIA RIP TOM'S JOB 2005-2019 Paul MaudDib fucked around with this message at 23:20 on Mar 30, 2019 |
# ? Mar 30, 2019 18:37 |
|
Ashraf Eassa says Ampere taped out this week on N7+.
|
# ? Mar 30, 2019 23:13 |
|
Nvidia admits its 2080 Ti cards have a problem, but isn’t saying what it is quote:After weeks of speculation, rumors, and mounting evidence that Nvidia’s RTX 2080 Ti Founders Edition graphics cards were experiencing a serious problem that caused many cards to outright die on their new owners, Nvidia has admitted that there’s a problem. It hasn’t explained what it is, but has alluded, in a roundabout manner, that the problem managed to sneak past quality control somehow. It also pledged to continue to work with affected consumers.
|
# ? Mar 31, 2019 00:43 |
|
It’s too awesome guys
|
# ? Mar 31, 2019 00:45 |
|
Ooh, I bet the problem involves an email on NVIDIA's internal server containing the words "ah it'll probably be fine"
|
# ? Mar 31, 2019 00:49 |
|
Just buy it™
|
# ? Mar 31, 2019 01:33 |
|
I'm bricking it®
|
# ? Mar 31, 2019 01:36 |
|
GRINDCORE MEGGIDO posted:Just fry it™
|
# ? Mar 31, 2019 01:50 |
|
quote:Posted on 11.16.18 - 7:37AM PST This is from half a year ago and that issue was discussed a lot back then.
|
# ? Mar 31, 2019 02:08 |
|
....alright, that's on me for not checking the date of things that pop to the top of my news feed. Sorry, my bad.
|
# ? Mar 31, 2019 02:58 |
|
TheFluff posted:This is from half a year ago and that issue was discussed a lot back then. Was that the whole debacle with the memory issues? I remember seeing that & saw a few reviewers were able to stabilize cards with the "faulty" Micron memory by underclocking them. Made me think that it wasn't really bad chips per se, but something with the memory settings, whether it was clock speed or timings.
|
# ? Mar 31, 2019 05:14 |
|
https://i.imgur.com/CXDEQWk.gifv Allegedly, some guy named "Sonic Ether" makes raytracing shaders for Minecraft?
|
# ? Mar 31, 2019 14:41 |
|
yeah it's some good poo poo https://www.youtube.com/watch?v=YD7sBYBj06Q there's no rtx acceleration though, someone needs to port the minecraft renderer from opengl to vulkan before that can happen
|
# ? Mar 31, 2019 14:52 |
|
Pathtracing, not raytracing, was what I heard.
|
# ? Mar 31, 2019 14:56 |
|
My 2080ti died after one week so I got my money back. Should I take it as a sign to keep on trucking with my 1070 (non ti) and wait for the next generation or get a new one with a custom pcb?
|
# ? Mar 31, 2019 15:52 |
|
It entirely depends on what you're going to do. There's no point in buying hardware you won't use, and if you have money to burn, little point in not buying hardware you will use.
|
# ? Mar 31, 2019 15:56 |
|
Depends on the resolution you play on. The next 7nm GPU should hopefully be a decent jump in performance, but we're talking Autumn/Winter at best.
|
# ? Mar 31, 2019 16:16 |
|
If that rumor about taping out Ampere this week is true (and Ampere is compute only AFAIUI, no consumer parts), then I'd be very surprised if we see the corresponding consumer cards any sooner than a year from now. Summer/fall 2020 seems more likely to me.
|
# ? Mar 31, 2019 16:19 |
|
BOOTY-ADE posted:Was that the whole debacle with the memory issues? I remember seeing that & saw a few reviewers were able to stabilize cards with the "faulty" Micron memory by underclocking them. Made me think that it wasn't really bad chips per se, but something with the memory settings, whether it was clock speed or timings. A pretty credible theory is that the 2080 Ti's VRMs dissipate too much heat too close to four of the VRAM chips. The relatively massive copper power plane transfers the heat to the chips and cooks them from below. Reputable review sites have measured >100°C at the back of the PCB between VRM and GPU where those chips sit. The one in the hottest location is also the one chip position that's left empty on the 1080 Ti. Many of the newer third-party cards now have a big copper shim around the GPU to better cool the VRAM chips.
|
# ? Mar 31, 2019 16:30 |
|
|
K8.0 posted:It entirely depends on what you're going to do. There's no point in buying hardware you won't use, and if you have money to burn, little point in not buying hardware you will use. True enough, I'm currently on paid leave for a few months so I'd get quite a lot of mileage out of it. 1440p on an Acer 271hu. I went with the Ti because I wanted to get stupidly high FPS or run DSR instead of anti-aliasing.
|
# ? Mar 31, 2019 16:39 |