|
Sorry, this is unrelated, but would there be any way to squeeze out more details from grainy low-res videos from the new algorithm? I suppose you need to know the direction the camera is moving, but perhaps the video itself has enough information about camera-shake or movement. That would be fascinating, but probably not possible. Anyway, looking forward to an experimental DLSS-based VLC plug-in around 2030 or so, hopefully.
|
# ? Jul 21, 2020 21:53 |
|
Paul MaudDib posted:
> Nvidia has done mid-generation refreshes that either drop prices on the first batch of products, or obsolete them and effectively replace them, since forever. This time around it’s called Super, previously it was the 980 Ti and 1080 Ti, before that it was the 700 series (Kepler refresh), etc.

Yeah that's one thing but you suggested launching the top end card at basically half price of the last gen and upsetting their whole stack pricing (top apart from whatever 3090 they might have planned).
|
# ? Jul 21, 2020 21:56 |
|
Carecat posted:
> Yeah that's one thing but you suggested launching the top end card at basically half price of the last gen and upsetting their whole stack pricing (top apart from whatever 3090 they might have planned).

Well, remember that NVidia's MSRP for the 2080 Ti was $999. That all the AIBs sold cards for considerably more than that was not really NVidia's doing. But I agree, I think NVidia has determined that enough people will pay $1k for the top-end card that they're not going to undercut that price point unless absolutely forced to. So the real question is more whether the 3080 Ti will be available at launch to begin with, or if it'll be going back to the 9- and 10-series staggered launch with a $700 (probably more like $750-$800 now) xx80 part and a $1200+ Titan/3090 part, letting the 3080 Ti pop up during the mid-cycle refresh a year or so later.
|
# ? Jul 21, 2020 22:11 |
|
lllllllllllllllllll posted:
> Sorry, this is unrelated, but would there be any way to squeeze out more details from grainy low-res videos from the new algorithm? I suppose you need to know the direction the camera is moving, but perhaps the video itself has enough information about camera-shake or movement. That would be fascinating, but probably not possible. Anyway, looking forward to an experimental DLSS-based VLC plug-in around 2030 or so, hopefully.

No. The motion information DLSS is using is not about camera movement, but about how samples are moving relative to the camera.

Imagine a screenshot, but think of the pixels as points with space in between them instead of squares. Imagine each one has a motion vector showing how far it will move by the next frame. Then you move them all by that amount, and now you have extra data you can keep from frame to frame. Now jitter the sampling in the X and Y direction slightly for each frame, so that even when still you aren't just collecting the same piece of data repeatedly frame after frame. By keeping the last X frames' worth of data around, you have a very dense network of data to sample for the current frame you want to create, even if the data from each frame isn't enough to construct a single full frame.

Without those motion vectors and the ability to jitter exactly where you're sampling, an algorithm like DLSS is pretty useless. Offline (i.e. non-realtime) upscaling algorithms will always be able to do more than a real-time one like DLSS anyway, because they can take all the time they need to reconstruct data, as well as looking forward and not just backward in time.
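The mechanics described above (per-pixel motion vectors, reprojection of a history buffer, and jittered sampling) can be sketched roughly like this. To be clear, this is an illustrative toy, not NVIDIA's actual implementation: all function names are made up, and the simple moving-average blend stands in for the trained network DLSS actually uses to decide how to combine samples.

```python
# Toy sketch of temporal accumulation with motion vectors and jitter.
# Purely illustrative -- not NVIDIA's DLSS code.
import numpy as np

def jitter_offset(frame_idx, num_phases=8):
    """Sub-pixel jitter so each frame samples a slightly different spot.
    Real implementations use a low-discrepancy sequence (e.g. Halton);
    this just cycles a simple pattern."""
    phase = frame_idx % num_phases
    return ((phase % 4) / 4.0 - 0.375, (phase // 4) / 2.0 - 0.25)

def accumulate(history, current, motion_vectors, alpha=0.1):
    """Reproject last frame's accumulated data along per-pixel motion
    vectors, then blend in the new (jittered) samples."""
    h, w, _ = current.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # where each pixel's data was last frame (motion vectors in pixels)
    src_x = np.clip((xs - motion_vectors[..., 0]).round().astype(int), 0, w - 1)
    src_y = np.clip((ys - motion_vectors[..., 1]).round().astype(int), 0, h - 1)
    reprojected = history[src_y, src_x]
    # exponential moving average: old samples persist, new ones refine them
    return (1 - alpha) * reprojected + alpha * current
```

Without the motion vectors, the reprojection step has nothing to align the history with, which is why a generic video can't simply be fed through this kind of algorithm.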
|
# ? Jul 21, 2020 22:25 |
|
The only AIB maker who put a 2080Ti out at $999 (and *kept* putting them out periodically) was EVGA and they were rare as hell in the retail channel. I think the majority of people who have them got them through the Step Up program.
|
# ? Jul 21, 2020 23:07 |
|
DrDork posted:
> Basically we have no idea, other than devs are going to have to figure something reasonable out or lose money on PC sales they'd otherwise have gotten.

Build a similar high-speed SSD into the graphics card itself, with a super high-speed connection to the VRAM and any necessary decoder hardware. Downside is your initial loading time and an even more expensive graphics card. I have no idea if this is practical (or cheaper than x GB of GDDR), but I can't see the same motherboard-to-VRAM connection that avoids CPU overhead getting done easily on PC, and everyone would need new motherboards, drives and GPUs.

Carecat fucked around with this message at 23:30 on Jul 21, 2020 |
# ? Jul 21, 2020 23:27 |
|
DrDork posted:
> Well, remember that NVidia's MSRP for the 2080 Ti was $999. That all the AIB's sold cards for considerably more than that was not really NVidia's doing. But I agree, I think NVidia has determined that enough people will pay $1k for the top-end card that they're not going to undercut that price point unless absolutely forced to.

I hope the Ti variant is always at launch of each new series, because it makes very little commercial sense to stagger it later. As a non-price-conscious (and therefore a core 3080 Ti demo) buyer, I'm not going to wait. I'll just buy a vanilla 3080 and they lose their hat versus the margins they could have had. I'm not gonna wait, and I'm also not going to upgrade to a theoretically better card in only 6 months. I'm guessing there are a lot of people like me in that demo. Relatively speaking.

I'm not very worried about it though. I think that's exactly why the 2080 Ti launched alongside the rest of the stack, and I expect that to be the case for the foreseeable future barring technical issues. For example, if Nvidia really, really wants the 3080 Ti or the 3090 to be on TSMC 7nm, I guess that could maybe justify a wait, but I doubt it personally.
|
# ? Jul 21, 2020 23:48 |
|
BIG HEADLINE posted:
> The only AIB maker who put a 2080Ti out at $999 (and *kept* putting them out periodically) was EVGA and they were rare as hell in the retail channel. I think the majority of people who have them got them through the Step Up program.

They must have made changes since the 900/1000 series, because I waited about seven months to try to go from a 980 Ti to their first 1080 model. EVGA had no problem selling most of their stock to retailers during that time. In the time it took to get my step-up email, they had revised their coolers two or three times and released most of the launch 1080 Ti models. I wasn't dying to pay full MSRP on a launch 1080 with mediocre cooling more than half a year after release, so I am still rocking the 980 Ti.
|
# ? Jul 21, 2020 23:55 |
|
lol if you're "non-price conscious" and buy a 3080 over a Ti/3090 because you can't wait. just get a PS5/XSX and wait for the higher-end card.

on a separate note, do people feel like their willingness-to-pay for PC parts has gone up over the years? it's mostly earning more as I grow older, but I recall aiming for a 1.5K build (excluding monitor) a decade ago while these days my budget is at 3K. part of it is also realizing how cheap PC gaming is as a "hobby" compared to anything else, e.g. owning a boat or being an audiophile.
|
# ? Jul 21, 2020 23:55 |
|
The amount I've spent on my builds has definitely gone up since I started building PCs, almost certainly because I've got a full-time job now compared to back when I started building PCs in grade school, but that amount certainly spiked unfairly high when I had to replace my last PC in the middle of the latest cryptocurrency-caused GPU drought.

I bought a Gigabyte 1080 Ti marked up $300 from its base price, advertised as a factory-overclocked model, only for it to be a dud card that would only run stable if I underclocked it about 200 MHz below stock speeds. Gigabyte refused to honor a warranty replacement because I couldn't prove it was the card at fault without a second graphics card to test against. The only other graphics card available and in stock at the time? A Titan V.

Turns out that was a dud too, for other reasons: it would crash frequently while running games, and the stock blower cooler was just awful at keeping it at sane temps. NVIDIA wouldn't replace the card for the former issue, so to at least make it usable, I built a watercooling loop for it to keep the temps under control. That helped the stability issue, but unfortunately the pump in the loop died in the middle of a gaming session and the card was toast after that.

So I'm looking forward to a new tier of cards coming out so I can replace the busted 1080 Ti I kept as backup, and crossing my fingers I can get my hands on one before the inevitable price gouging begins and I'm forced to make more poor decisions.

MadFriarAvelyn fucked around with this message at 00:26 on Jul 22, 2020 |
# ? Jul 22, 2020 00:18 |
|
shrike82 posted:
> lol if you're "non-price conscious" and buy a 3080 over a Ti/3090 because you can't wait. just get a PS5/XSX and wait for the higher-end card.

I need HDMI 2.1 immediately for my PC stack. I'm also buying a PS5, of course. That's the point: I'll spend whatever, but I'm also not willing to wait. I need 2.1 too dearly. Not sure why you put quotes around real words. In fact, if you could just kindly not respond or engage with me, that would be ideal, no offense. Just doesn't seem like anything good ever comes of it, so why bother, ya know.

Taima fucked around with this message at 00:34 on Jul 22, 2020 |
# ? Jul 22, 2020 00:26 |
|
shrike82 posted:
> lol if you're "non-price conscious" and buy a 3080 over a Ti/3090 because you can't wait. just get a PS5/XSX and wait for the higher-end card.

My willingness-to-pay has maybe gone up a little bit, but it's secondary to my ability-to-pay. I also tend to have a good idea what goals I am trying to achieve with my hardware, and try to have the most cost-effective approach to reaching them. I spent more on my 2070 Super than I have spent on any single computer component before, but my goal was to be able to drive 1440p at the highest graphics settings and it meets that goal.
|
# ? Jul 22, 2020 00:54 |
|
CaptainSarcastic posted:
> My willingness-to-pay has maybe gone up a little bit, but it's secondary to my ability-to-pay. I also tend to have a good idea what goals I am trying to achieve with my hardware, and try to have the most cost-effective approach to reaching them. I spent more on my 2070 Super than I have spent on any single computer component before, but my goal was to be able to drive 1440p at the highest graphics settings and it meets that goal.

It's both ability and willingness that's changed for me. The first computer I got was when I was making $18k yearly on a grad student stipend in 1999. Having a proper salary now means I can afford a greater range of parts. But my needs have also changed for what I'm doing with the machine, so that's probably a bigger driver in terms of what I'll put down for.

Since I'm not gaming, whether or not I buy a 3000 series (and when) will depend on the numbers those cards put up for GPU rendering. Since that application scales with the number of cards, it may make more sense in terms of price/performance for me to just add another 2000 series.
|
# ? Jul 22, 2020 02:56 |
|
I'm going to open palm slam down money on the next Titan that comes out because gently caress it
|
# ? Jul 22, 2020 02:58 |
|
Carecat posted:
> Build a similar high speed SSD into the graphics card itself with the super high speed connection to the VRAM and any necessary decoder hardware.

That...actually could be a pretty interesting option. Some of the devs working on next-gen games have noted that part of the reason games are so loving bloated these days is multiple textures per resolution, and straight-up data duplication to help deal with the physical reality of spinning hard drives. You could probably slap 25-50GB or so of flash storage onto a card and use that as the fast data buffer, only pulling stuff occasionally from actual storage. No idea how much it would cost, but <100GB SSDs are real cheap these days, so I'd imagine it'd be more a question of the engineering and board space than the extra BOM.

It might be a better solution than tossing a few extra GB of VRAM in there, because that's really not going to fix much: if a game is actively paging in and out 20GB of stuff, 4GB more VRAM isn't going to make much of a difference.

But, again, we don't know what devs are actually going to do with it. There's a very real possibility the answer is "not a whole lot" because of the disparity between XBox and PS5 specs on that front. And since the XBox specs can be equaled by a high-end NVMe drive, maybe having one of those will be enough. Or, hell, maybe just any NVMe drive might be enough, and the utility is going to be more about being able to load small textures on demand at a rate of a few hundred MBps but with very low latency, with the multi-GBps speeds only really useful for initial game/level loads.
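The "flash buffer on the card" idea above is essentially a cache tier sitting between bulk storage and VRAM. A toy LRU model of that tier, just to make the behavior concrete (all class/method names and sizes here are hypothetical, not any vendor's actual design):

```python
# Toy model of an on-card flash pool acting as an LRU cache between
# texture requests and slow bulk storage. Illustrative only.
from collections import OrderedDict

class FlashTextureCache:
    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.used = 0
        self.cache = OrderedDict()  # texture_id -> size_bytes, LRU order

    def fetch(self, texture_id, size_bytes, load_from_disk):
        """Serve a texture from on-card flash when possible; otherwise
        evict least-recently-used entries and hit bulk storage."""
        if texture_id in self.cache:
            self.cache.move_to_end(texture_id)   # mark as recently used
            return ("flash_hit", texture_id)
        # miss: evict LRU textures until the new one fits
        while self.used + size_bytes > self.capacity and self.cache:
            _, evicted_size = self.cache.popitem(last=False)
            self.used -= evicted_size
        load_from_disk(texture_id)               # slow path: bulk storage
        self.cache[texture_id] = size_bytes
        self.used += size_bytes
        return ("disk_miss", texture_id)
```

The point of the sketch: once the working set fits in the flash tier, the slow path is only taken occasionally, which is the "fast data buffer" behavior described above.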
|
# ? Jul 22, 2020 02:59 |
|
MadFriarAvelyn posted:
> Gigabyte refused to honor a warranty replacement because I couldn't prove it was the card at fault without a second graphics card to test against

lol good to know, ty
|
# ? Jul 22, 2020 03:01 |
|
shrike82 posted:
> on a separate note, do people feel like their willingness-to-pay for PC parts has gone up over the years? it's mostly earning more as I grow older but I recall aiming for a 1.5K build (excluding monitor) a decade ago while these day, my budget is at 3K. part of it is also realizing how cheap PC gaming is as a "hobby" compared to anything else e.g. owning a boat, or being an audiophile.

Inflation is also a thing. $1500 in 2000 is $2250 today. $1500 in 1990 is $2950 today.

But, yeah, the bits others have noted about getting a "real job" and the like also shift your idea of what "a lot" of money is, and the more that shifts, the more reasonable it seems to shell out for hardware you'd never have even considered 20 years ago when you were living off ramen and stolen Subway bread or whatever.
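For reference, those figures are just a CPI multiplication. A quick sketch (the multipliers are rough approximations of US CPI to 2020, not exact BLS data):

```python
# Rough inflation adjustment using approximate US CPI multipliers to 2020.
# The multiplier values are approximations, not official CPI figures.
CPI_MULTIPLIER_TO_2020 = {2000: 1.50, 1990: 1.97}

def in_2020_dollars(amount, year):
    """Convert a historical dollar amount to approximate 2020 dollars."""
    return round(amount * CPI_MULTIPLIER_TO_2020[year])
```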
|
# ? Jul 22, 2020 03:02 |
|
shrike82 posted:
> lol if you're "non-price conscious" and buy a 3080 over a Ti/3090 because you can't wait. just get a PS5/XSX and wait for the higher-end card.

You're old now and making more money/better at handling money. SA is also very skewed to white males in their 30s in the tech sector so.... we're basically the most disposable-income-oriented demographic there can be.
|
# ? Jul 22, 2020 03:19 |
|
I think my theoretical willingness to spend has gone up, but my practical willingness has gone down in a lot of ways when it comes to GPUs. I'd buy a 3080 (or Ti if available) day one if there were games I was actually excited to play with it, but mostly I just wind up playing stuff that's older or indie, because AAA gaming has become so rotten over the past 15 years that I'm not usually in a rush to play the maybe one AAA game a year that interests me. Combine that with multiplayer FPSs being as dead as FMV adventure games were in the 2000s, and while the money isn't a big deal to me, a lot of the time I'm not even sure I care enough to bother swapping GPUs out, since it won't even matter for the stuff I actually want to play right now.
|
# ? Jul 22, 2020 03:29 |
|
DrDork posted:
> Inflation is also a thing. $1500 in 2000 is $2250 today. $1500 in 1990 is $2950 today.

This is wading into economics chat, but we've seen deflationary pressure on consumer electronics and vidjagames for the past couple decades. My dad spent more on a gaming PC in the 90s in real terms than I'd be spending on an equivalent (relative to game requirements) one today. That's why there's that broader discussion about whether next-gen games are going to go up in price.

The flipside is that his college education cost a tenth of mine, and a starter home likewise. So I guess his generation had the last laugh.
|
# ? Jul 22, 2020 03:53 |
|
It's the opposite for me even though I earn more, but I had terrible PC component spending habits in my 20s that I've reined in now that I'm in my 30s. I have a greater focus on saving and being more intelligent with my purchases. But back in the day, I wanted to upgrade every generation which, in hindsight, probably wasn't all that necessary, and had a full custom water loop with full-cover blocks, for some silly reason cared about getting binned chips or known quantities from overclockers selling at a markup, ran quad-SLI GTX 295s, etc. Just stupidly spending every dime I earned.

Now I don't play a lot of bleeding-edge graphics games and don't care if I have to turn graphics settings down. The only triple-A I ever play is the Battlefield series, and there I turn everything to minimum for framerate stability in multiplayer anyway.
|
# ? Jul 22, 2020 04:01 |
|
Shrimp or Shrimps posted:
> ran quad-sli GTX295s, etc

lol how did that run
|
# ? Jul 22, 2020 04:04 |
|
K8.0 posted:
> I think my theoretical willingness to spend has gone up, but my practical willingness has gone down in a lot of ways when it comes to GPUs. I'd buy a 3080 (or Ti if available) day one if there were games I was actually excited to play with it, but mostly I just wind up playing stuff that's older or indie because AAA gaming has become so rotten over the past 15 years that I'm not usually in a rush to play the maybe one AAA game a year that interests me. Combine that with multiplayer FPSs being as dead as FMV adventure games were in the 2000s and while the money isn't a big deal to me, a lot of the time I'm not even sure I care enough to bother swapping GPUs out since it won't even matter for the stuff I actually want to play right now.

The Skinner-box nonsense has really gotten out of control, and it's to the point where it's ruining games. Even single-player games are now chock full of microtransactions. I'm not buying a game and then paying to not play it, to bypass artificial grinding and so on. It's seriously affecting my enjoyment of newer AAA titles. (It really is a conflicting feeling that Team Fortress 2, a game that I enjoy so much, put us on this path of incredible over-monetization. TF2 MTX weren't really particularly exploitative, but everyone else saw that and decided to see what else people would pay for.)

At this point I think my upgrades are mostly going to be monitor-driven; my next upgrade will probably be the 38GL950G, and then I'll get a fast GPU to help push it. I think I can coast on my 1080 Ti for a year or two more until they come down a bit. Conversely, as someone who is aiming at 175 Hz gaming and plays a lot of older CPU-bound titles, "overspending" on the CPU relative to the GPU doesn't bother me.

Paul MaudDib fucked around with this message at 04:21 on Jul 22, 2020 |
# ? Jul 22, 2020 04:13 |
|
shrike82 posted:
> lol how did that run

Good for benches, otherwise pretty awful, and the main game I played was Tribes 2, which could only use 1 GPU core anyway lol. When redoing a waterblock after upgrading my loop to dual DDCs, I must have tightened the block on one side too much or done something, because one of the cores busted and I could only run tri-SLI. I am honestly surprised it even worked. Managed to sell both to someone at a proportional discount due to only having 3 cores, but PayPal fees, currency conversion, shipping costs which I ate, etc. made them stupid purchases to begin with.

Shrimp or Shrimps fucked around with this message at 04:26 on Jul 22, 2020 |
# ? Jul 22, 2020 04:18 |
|
ConanTheLibrarian posted:
> Is there any reason to think that PCs with sufficient RAM aren't going to be at least as good as the consoles at loading textures on demand? 8GB+ on the card and 16GB system RAM is a lot after all.

The new consoles have dedicated hardware to provide texture decompression from storage with basically zero CPU involvement, plus they can load textures directly accessible to the GPU without having to go through main memory -> PCIe bus -> GPU memory.
|
# ? Jul 22, 2020 04:45 |
|
shrike82 posted:
> This is wading into economics chat but we've seen deflationary pressure on consumer electronics and vidjagames for the past couple decades.

The CGW review of a high-end computer in the issue where Deus Ex came out had a price tag of over $8k in 2020 money. It was basically obsolete in two years.
|
# ? Jul 22, 2020 04:52 |
|
I only started getting seriously into PC hardware last year, so I don't feel too bad spending on something that I've only very seldom spent money on before, but I think I have a weakness for old/budget hardware, so I'm probably bad with money anyway!

Happy_Misanthrope posted:
> The new consoles have dedicated hardware to provide texture decompression from storage with basically zero CPU involvement, plus can load textures directly accessible to the GPU without having to go through main memory -> PCIE bus -> GPU memory.

and the thread goes full circle yet again
|
# ? Jul 22, 2020 05:03 |
|
Carecat posted:
> I found an actual chart to see what usage is like, there doesn't seem to be many tests of VRAM usage.

Are they maxing out at 6 GB because that's genuinely all they use, or are they maxing out at 6 GB because they're intentionally being programmed that way?

MadFriarAvelyn posted:
> The amount I've spent on my builds has definitely gone up since I started building PCs, almost certainly because I've got a full time job now compared to back when I started building PCs in grade school, but that amount certainly spiked unfairly high when I had to replace my last PC in the middle of the latest cryptocurrency caused GPU drought.

Should have said something in Catgirl Bargains or the FFXIV thread, I could have absolutely sent you over something to sort that poo poo out. Been sitting on a 1650 Super and a 1050 Ti I should really get around to hocking. Hell, if the warranty window is still open *now*, I'll send one over.
|
# ? Jul 22, 2020 05:13 |
|
shrike82 posted:
> on a separate note, do people feel like their willingness-to-pay for PC parts has gone up over the years? it's mostly earning more as I grow older but I recall aiming for a 1.5K build (excluding monitor) a decade ago while these day, my budget is at 3K. part of it is also realizing how cheap PC gaming is as a "hobby" compared to anything else e.g. owning a boat, or being an audiophile.

No, the opposite. It has probably gone down due to a combination of parts lasting longer (particularly CPUs, which don't have 18-month lifespans like before) and the cost of consoles going up from all players. You see this all over tech; in Linux land they used to keep a bunch of no-frills GUIs for older hardware, where "older" was defined as ten years. Except ten-year-old computers don't struggle to simply boot and load a browser the way a 1993 computer did in 2003.

That said, it's natural if you're in the US due to the inflation we've experienced from years of policy mismanagement. $450 is my limit for a graphics card today because that's what I spent on a 6600GT in the summer of 2004, though thanks to inflation that would be $650 now, and I would never pay that much for a GPU (especially since it was to max out Doom 3, which oh my god was a terrible idea).

K8.0 posted:
> I think my theoretical willingness to spend has gone up, but my practical willingness has gone down in a lot of ways when it comes to GPUs. I'd buy a 3080 (or Ti if available) day one if there were games I was actually excited to play with it, but mostly I just wind up playing stuff that's older or indie because AAA gaming has become so rotten over the past 15 years that I'm not usually in a rush to play the maybe one AAA game a year that interests me. Combine that with multiplayer FPSs being as dead as FMV adventure games were in the 2000s and while the money isn't a big deal to me, a lot of the time I'm not even sure I care enough to bother swapping GPUs out since it won't even matter for the stuff I actually want to play right now.

There's also a lot of this. In an age of microtransactions, battle passes, and "games as a service" where even lowly arena shooters are supposed to be a five-years-or-greater epic journey slowly dripping out plot details with each in-game 'season', there's really no point in buying a high-end GPU anymore. If every movie took the Marvel Cinematic Universe approach, I would never go to IMAX again, and the same applies here. Why go whole-hog into graphics if games are going to be like Anthem? I know that's an exceptionally terrible game to use as an example, but it's one that was graphically pushing things and people were excited for. And then they touched it, and oops.

Craptacular! fucked around with this message at 06:43 on Jul 22, 2020 |
# ? Jul 22, 2020 06:30 |
|
Craptacular! posted:
> No, the opposite. It has probably gone down due to a combination of parts lasting longer (particularly CPUs, which don't have 18 month lifespans like before) and the cost of consoles going up from all players. You see this all over tech; in Linux land they used to keep a bunch of no-frills GUIs for older hardware, where "older" was defined as ten years. Except ten year old computers don't struggle to simply boot and load a browser as a 1993 computer did in 2003.

Okay, Boomer.
|
# ? Jul 22, 2020 07:20 |
|
I’m really excited for the next Switch, though.
|
# ? Jul 22, 2020 07:35 |
|
CaptainSarcastic posted:
> Okay, Boomer.

Did I accidentally stumble into a YouTube video's comments section?
|
# ? Jul 22, 2020 09:07 |
|
Shrimp or Shrimps posted:
> It's the opposite for me even though I earn more, but I had terrible PC component spending habits in my 20s that I've reined in now that I'm in my 30s. I have a greater focus on saving and being more intelligent with my purchases. But back in the day, I wanted to upgrade every generation which, in hindsight, probably wasn't all that necessary, and had a full custom water loop, full-cover blocks, for some silly reason cared about getting binned chips or known quantities from overclockers selling at a markup, ran quad-sli GTX295s, etc. Just stupidly spending every dime I earned.

That said, I'm really itching to upgrade now. Unfortunately the new Ryzens seem to be not what I'm looking for, so maybe by the end of the year with Ampere and whatever Intel/AMD come up with.
|
# ? Jul 22, 2020 09:30 |
|
Listerine posted:
> Since I'm not gaming, whether or not I buy a 3000 series (and when) will depend on the numbers those cards put up for GPU rendering. Since that application scales with the number of cards, it may make more sense in terms of price/performance for me to just add another 2000 series.

Can consumer GPUs do NVLink memory pooling?
|
# ? Jul 22, 2020 10:55 |
|
Mercrom posted:
> Can consumer GPU's do NVLink memory pooling?

The cheap NVLink that's on the top-end consumer Turings does like 4% of VRAM transfer speeds. NVLink 3 is the good stuff for pro cards, but the little connector bridge clamp thing is the price of a whole 2080 Super last time I read something about it.
|
# ? Jul 22, 2020 12:04 |
|
haha two chips at the same time, man
|
# ? Jul 22, 2020 12:36 |
|
Mercrom posted:
> Can consumer GPU's do NVLink memory pooling?

I have no idea what that is, I don't know technical stuff. I'm using Redshift for rendering; it uses every GPU in your system, so you don't have to swap out when you upgrade, you can just keep adding cards to get better render times. I'm running it with a 780 and a 2070.
|
# ? Jul 22, 2020 12:44 |
|
Scarecow posted:
> I'm going to open palm slam down money on the next Titan that comes out because gently caress it

Yeah, this is me. I'm convinced, from just looking around and seeing what's happening, that civilisation as we know it is coming to an end in the next decade or so, so I'm gonna blow my resources on the things I like to do in the meantime and then deal with that (or die) when it comes.

So if it turns out I have to buy a new PSU with my GPU? Ok then, I will, and I'll file that decision in the same folder as I filed the "Ok well, I'll just buy the game on Epic store then" decision. It's the shrug folder. We all have one, and the only thing that differs between us is what we choose to file in it.
|
# ? Jul 22, 2020 15:19 |
|
Statutory Ape posted:
> haha two chips at the same time , man

that's it? you have a million dollars and you'd have two chips at the same time?
|
# ? Jul 22, 2020 15:54 |
|
Cactus posted:
> Yeah this is me. I'm convinced, from just looking around and seeing what's happening, that civilisation as we know it is coming to an end in the next decade or so,

New Nvidia marketing campaign
|
# ? Jul 22, 2020 15:55 |