lllllllllllllllllll
Feb 28, 2010

Now the scene's lighting is perfect!
Sorry, this is unrelated, but would there be any way to squeeze out more details from grainy low-res videos with the new algorithm? I suppose you need to know the direction the camera is moving, but perhaps the video itself has enough information about camera shake or movement. That would be fascinating, but probably not possible. Anyway, looking forward to an experimental DLSS-based VLC plug-in around 2030 or so, hopefully.

Carecat
Apr 27, 2004

Buglord

Paul MaudDib posted:

Nvidia has done mid-generation refreshes that either drop prices on the first batch of products, or obsolete them and effectively replace them, since forever. This time around it’s called Super, previously it was the 980 Ti and 1080 Ti, before that it was the 700 series (Kepler refresh), etc.

Yeah, that's one thing, but you suggested launching the top-end card at basically half the price of the last gen and upsetting their whole stack pricing (the top card, apart from whatever 3090 they might have planned).

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Carecat posted:

Yeah, that's one thing, but you suggested launching the top-end card at basically half the price of the last gen and upsetting their whole stack pricing (the top card, apart from whatever 3090 they might have planned).

Well, remember that NVidia's MSRP for the 2080 Ti was $999. That all the AIBs sold cards for considerably more than that was not really NVidia's doing. But I agree, I think NVidia has determined that enough people will pay $1k for the top-end card that they're not going to undercut that price point unless absolutely forced to.

So the real question is more whether or not the 3080Ti will be available at launch to begin with, or if it'll be going back to the 9- and 10- series staggered launch with a $700 (probably more like $750-$800 now) xx80 part and a $1200+ Titan/3090 part and let the 3080 Ti pop up during the mid-cycle refresh a year or so later.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

lllllllllllllllllll posted:

Sorry, this is unrelated, but would there be any way to squeeze out more details from grainy low-res videos with the new algorithm? I suppose you need to know the direction the camera is moving, but perhaps the video itself has enough information about camera shake or movement. That would be fascinating, but probably not possible. Anyway, looking forward to an experimental DLSS-based VLC plug-in around 2030 or so, hopefully.

No. The motion information DLSS is using is not about camera movement, but about how samples are moving relative to the camera. Imagine a screenshot, but think of the pixels as points with space in between them instead of squares. Imagine each one has a motion vector showing how far it will move by the next frame. Then you move them all by that amount, and now you have extra data you can keep from frame to frame. Now jitter the sampling in the X and Y direction slightly for each frame, so that even when still you aren't just collecting the same piece of data repeatedly frame after frame.

By keeping the last X frames' worth of data around, you have a very dense network of data to sample for the current frame you want to create, even if the data from each frame isn't enough to construct a single full frame. Without those motion vectors and the ability to jitter exactly where you're sampling, an algorithm like DLSS is pretty useless.

Offline (i.e. non-realtime) upscaling algorithms will always be able to do more than a real-time one like DLSS anyway, because they can take all the time they need to reconstruct data, as well as looking both forward and backward in time.
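
Roughly, the accumulation step described above might look something like this as a toy sketch (Python, with made-up buffer names and a made-up blend weight, purely to illustrate the idea; this is not how DLSS itself is implemented):

code:
import numpy as np

def halton(i, base):
    """Low-discrepancy sequence often used to generate sub-pixel jitter offsets."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def accumulate(history, current, motion, blend=0.1):
    """Reproject last frame's accumulated samples along per-pixel motion vectors,
    then blend in the current frame's (jittered) low-res samples."""
    h, w = current.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # For each pixel, fetch where that sample was in the previous frame.
    prev_x = np.clip(xs - motion[..., 0], 0, w - 1).astype(int)
    prev_y = np.clip(ys - motion[..., 1], 0, h - 1).astype(int)
    reprojected = history[prev_y, prev_x]
    # Exponential blend keeps roughly the last ~1/blend frames of data around.
    return (1.0 - blend) * reprojected + blend * current

# Per-frame sub-pixel jitter (applied at render time when sampling), so a still
# view doesn't just collect the same sample over and over:
jitter = lambda frame: (halton(frame, 2) - 0.5, halton(frame, 3) - 0.5)

The sketch only shows the two ingredients the post calls out: per-sample motion vectors to carry old data forward, and jitter so the old data isn't redundant.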

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
The only AIB maker who put a 2080Ti out at $999 (and *kept* putting them out periodically) was EVGA and they were rare as hell in the retail channel. I think the majority of people who have them got them through the Step Up program.

Carecat
Apr 27, 2004

Buglord

DrDork posted:

Basically we have no idea, other than devs are going to have to figure something reasonable out or lose money on PC sales they'd otherwise have gotten.

Build a similar high-speed SSD into the graphics card itself, with a super high-speed connection to the VRAM and any necessary decoder hardware.

Downside is your initial loading time and an even more expensive graphics card. I have no idea if this is practical (or cheaper than x GB of GDDR), but I can't see a motherboard-to-VRAM connection that avoids CPU overhead getting done easily, and everyone would need new motherboards, drives, and GPUs.

Carecat fucked around with this message at 23:30 on Jul 21, 2020

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

DrDork posted:

Well, remember that NVidia's MSRP for the 2080 Ti was $999. That all the AIBs sold cards for considerably more than that was not really NVidia's doing. But I agree, I think NVidia has determined that enough people will pay $1k for the top-end card that they're not going to undercut that price point unless absolutely forced to.

So the real question is more whether or not the 3080Ti will be available at launch to begin with, or if it'll be going back to the 9- and 10- series staggered launch with a $700 (probably more like $750-$800 now) xx80 part and a $1200+ Titan/3090 part and let the 3080 Ti pop up during the mid-cycle refresh a year or so later.

I hope the Ti variant is always at launch of each new series because it makes very little commercial sense to stagger it later.

As a non-price conscious (and therefore a core 3080Ti demo) buyer, I’m not going to wait. I’ll just buy a vanilla 3080 and they lose their hat versus the margins they could have had.

I’m not gonna wait and I’m also not going to upgrade to a theoretical better card in only 6 months. I’m guessing there are a lot of people like me in that demo. Relatively speaking.

I’m not very worried about it though. I think that’s exactly why the 2080Ti launched alongside, and I expect that to be the case for the foreseeable future barring technical issues.

For example if Nvidia really, really wants the 3080Ti or the 3090 to be on TSMC 7nm, I guess that could maybe justify a wait, but I doubt it personally.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

BIG HEADLINE posted:

The only AIB maker who put a 2080Ti out at $999 (and *kept* putting them out periodically) was EVGA and they were rare as hell in the retail channel. I think the majority of people who have them got them through the Step Up program.

They must have made changes since the 900/1000 series because I waited about seven months to try and go from a 980 ti to their first 1080 model. EVGA had no problem selling most of their stock to retailers during that time. In the time it took to get my step-up email, they had revised their coolers two or three times and created most of the launch 1080 ti models. I wasn't dying to pay full MSRP on a launch 1080 with mediocre cooling more than half a year after release, so I am still rocking the 980 ti.

shrike82
Jun 11, 2005

lol if you're "non-price conscious" and buy a 3080 over a Ti/3090 because you can't wait. just get a PS5/XSX and wait for the higher-end card.

on a separate note, do people feel like their willingness-to-pay for PC parts has gone up over the years? it's mostly earning more as I grow older, but I recall aiming for a 1.5K build (excluding monitor) a decade ago while these days my budget is at 3K. part of it is also realizing how cheap PC gaming is as a "hobby" compared to anything else, e.g. owning a boat or being an audiophile.

MadFriarAvelyn
Sep 25, 2007

The amount I've spent on my builds has definitely gone up since I started building PCs, almost certainly because I've got a full-time job now compared to back when I started in grade school, but that amount certainly spiked unfairly high when I had to replace my last PC in the middle of the latest cryptocurrency-caused GPU drought.

I bought a Gigabyte 1080 Ti marked up $300 from its base price, advertised as being a factory overclocked model, only for it to be a dud card that would only run stable if I underclocked it about 200 MHz below stock speeds. Gigabyte refused to honor a warranty replacement because I couldn't prove it was the card at fault without a second graphics card to test against. The only other graphics card available and in stock at the time?

A Titan V.

:shepicide:

Turns out that was a dud too for other reasons: it would crash frequently while running games, and the stock blower cooler was just awful at keeping it at sane temps. NVIDIA wouldn't replace the card for the former issue, so to try and at least make it usable, I built a watercooling loop for it to keep the temps under control. The temps staying under control helped the stability issue, but unfortunately the pump in the loop died in the middle of a gaming session and the card was toast after that.

So I'm looking forward to a new tier of cards to come out so I can replace the busted 1080 Ti I kept as backup, and crossing my fingers I can get my hands on one before the inevitable price gouging begins and I'm forced to make more poor decisions.

MadFriarAvelyn fucked around with this message at 00:26 on Jul 22, 2020

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

shrike82 posted:

lol if you're "non-price conscious" and buy a 3080 over a Ti/3090 because you can't wait. just get a PS5/XSX and wait for the higher-end card.

I need HDMI 2.1 immediately for my PC stack. I’m also buying a PS5, of course. That’s the point: I’ll spend whatever, but I’m also not willing to wait. I need 2.1 too dearly.

Not sure why you put quotes around real words. In fact if you could just kindly not respond or engage with me that would be ideal, no offense. Just doesn’t seem like anything good ever comes of it, so why bother, ya know.

Taima fucked around with this message at 00:34 on Jul 22, 2020

CaptainSarcastic
Jul 6, 2013



shrike82 posted:

lol if you're "non-price conscious" and buy a 3080 over a Ti/3090 because you can't wait. just get a PS5/XSX and wait for the higher-end card.

on a separate note, do people feel like their willingness-to-pay for PC parts has gone up over the years? it's mostly earning more as I grow older, but I recall aiming for a 1.5K build (excluding monitor) a decade ago while these days my budget is at 3K. part of it is also realizing how cheap PC gaming is as a "hobby" compared to anything else, e.g. owning a boat or being an audiophile.

My willingness-to-pay has maybe gone up a little bit, but it's secondary to my ability-to-pay. I also tend to have a good idea what goals I am trying to achieve with my hardware, and try to have the most cost-effective approach to reaching them. I spent more on my 2070 Super than I have spent on any single computer component before, but my goal was to be able to drive 1440p at the highest graphics settings and it meets that goal.

Listerine
Jan 5, 2005

Exquisite Corpse

CaptainSarcastic posted:

My willingness-to-pay has maybe gone up a little bit, but it's secondary to my ability-to-pay. I also tend to have a good idea what goals I am trying to achieve with my hardware, and try to have the most cost-effective approach to reaching them. I spent more on my 2070 Super than I have spent on any single computer component before, but my goal was to be able to drive 1440p at the highest graphics settings and it meets that goal.

It's both ability and willingness that's changed for me. The first computer I got was when I was making $18k yearly on a grad student stipend in 1999. Having a proper salary now means I can afford a greater range of parts. But my needs have also changed for what I'm doing with the machine, so that's probably a bigger driver in terms of how much I'll put down. Since I'm not gaming, whether or not I buy a 3000 series (and when) will depend on the numbers those cards put up for GPU rendering. Since that application scales with the number of cards, it may make more sense in terms of price/performance for me to just add another 2000 series.

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy
I'm going to open palm slam down money on the next Titan that comes out because gently caress it

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Carecat posted:

Build a similar high-speed SSD into the graphics card itself, with a super high-speed connection to the VRAM and any necessary decoder hardware.

Downside is your initial loading time and an even more expensive graphics card. I have no idea if this is practical (or cheaper than x GB of GDDR), but I can't see a motherboard-to-VRAM connection that avoids CPU overhead getting done easily, and everyone would need new motherboards, drives, and GPUs.

That...actually could be a pretty interesting option. Some of the devs working on the next-gen games have noted that part of the reason games are so loving bloated these days is multiple textures per resolution, and straight up data duplication to help deal with the physical reality of spinning hard drives. You could probably slap 25-50GB or so of flash storage onto a card and use that as the fast data buffer and only pull stuff occasionally from actual storage. No idea how much it would cost, but <100GB SSDs are real cheap these days, so I'd imagine it'd be more a question of the engineering and board space than the extra BOM.

It might be a better solution than tossing a few extra GB of VRAM in there, because that's really not going to fix much: if a game is actively paging in and out 20GB of stuff, 4GB more VRAM isn't going to make much of a difference.

But, again, we don't know what devs are actually going to do with it. There's a very real possibility the answer is "not a whole lot" because of the disparity between XBox and PS5 specs on that front. And since the XBox specs can be equaled by a high-end NVMe drive, maybe having one of those will be enough. Or, hell, maybe just having an NVMe drive at all might be enough, and the utility is going to be more about being able to load small textures on-demand at a rate of a few hundred MBps but with very low latency, and the multi-GBps speeds will only really be useful for initial game/level loads.
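
For what it's worth, the flash-as-a-staging-buffer idea is basically just another cache tier. Here's a toy sketch of the lookup logic (Python; the class, method names, and capacity are invented for illustration, no card actually exposes anything like this):

code:
from collections import OrderedDict

class OnCardFlash:
    """Toy LRU cache standing in for a hypothetical 25-50GB flash pool on the card."""
    def __init__(self, capacity_bytes=32 * 1024**3):
        self.capacity = capacity_bytes
        self.used = 0
        self.entries = OrderedDict()  # asset_id -> size, kept in LRU order

    def hit(self, asset_id):
        if asset_id in self.entries:
            self.entries.move_to_end(asset_id)   # mark as recently used
            return True                          # fast path: data stays on the card
        return False

    def stage(self, asset_id, size):
        while self.used + size > self.capacity and self.entries:
            _, evicted_size = self.entries.popitem(last=False)  # evict least recent
            self.used -= evicted_size
        self.entries[asset_id] = size
        self.used += size

def fetch(cache, asset_id, size, load_from_system_storage):
    """On a VRAM miss: try the on-card flash first, otherwise take the slow
    trip out to system storage and stage the asset for next time."""
    if not cache.hit(asset_id):
        load_from_system_storage(asset_id)  # NVMe/SATA -> PCIe -> card
        cache.stage(asset_id, size)
    # ...decode/upload into VRAM as usual...

Whether the eviction policy or capacity would look anything like that is anyone's guess; the point is just that the working set lives one tier below VRAM, which is why it sidesteps the "a few extra GB of VRAM won't help" problem.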

Worf
Sep 12, 2017

If only Seth would love me like I love him!

MadFriarAvelyn posted:

Gigabyte refused to honor a warranty replacement because I couldn't prove it was the card at fault without a second graphics card to test against

lol good to know, ty

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

shrike82 posted:

on a separate note, do people feel like their willingness-to-pay for PC parts has gone up over the years? it's mostly earning more as I grow older, but I recall aiming for a 1.5K build (excluding monitor) a decade ago while these days my budget is at 3K. part of it is also realizing how cheap PC gaming is as a "hobby" compared to anything else, e.g. owning a boat or being an audiophile.

Inflation is also a thing. $1500 in 2000 is $2250 today. $1500 in 1990 is $2950 today. But, yeah, the bits others have noted about getting a "real job" and the like also shift your idea of what "a lot" of money is, and the more that shifts, the more reasonable it seems to shell out for hardware you'd never have even considered 20 years ago when you were living off ramen and stolen Subway bread or whatever.
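
(If anyone wants to sanity-check those numbers, it's just a CPI multiplier; the factors below are rough approximations to ~2020 dollars for illustration only, not official figures:)

code:
# rough CPI multipliers to ~2020 dollars (approximate, illustration only)
TO_2020 = {1990: 1.97, 2000: 1.50, 2010: 1.18}

def in_2020_dollars(amount, year):
    return amount * TO_2020[year]

print(in_2020_dollars(1500, 2000))  # ~2250
print(in_2020_dollars(1500, 1990))  # ~2955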

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

shrike82 posted:

lol if you're "non-price conscious" and buy a 3080 over a Ti/3090 because you can't wait. just get a PS5/XSX and wait for the higher-end card.

on a separate note, do people feel like their willingness-to-pay for PC parts has gone up over the years? it's mostly earning more as I grow older, but I recall aiming for a 1.5K build (excluding monitor) a decade ago while these days my budget is at 3K. part of it is also realizing how cheap PC gaming is as a "hobby" compared to anything else, e.g. owning a boat or being an audiophile.

You're old now and making more money/better at handling money. SA is also very skewed to white males in their 30s in the tech sector so.... we're basically the most disposable-income oriented demographic there can be.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
I think my theoretical willingness to spend has gone up, but my practical willingness has gone down in a lot of ways when it comes to GPUs. I'd buy a 3080 (or Ti if available) day one if there were games I was actually excited to play with it, but mostly I just wind up playing stuff that's older or indie, because AAA gaming has become so rotten over the past 15 years that I'm not usually in a rush to play the maybe one AAA game a year that interests me. Combine that with multiplayer FPSs being as dead as FMV adventure games were in the 2000s, and while the money isn't a big deal to me, a lot of the time I'm not even sure I care enough to bother swapping GPUs out, since it won't even matter for the stuff I actually want to play right now.

shrike82
Jun 11, 2005

DrDork posted:

Inflation is also a thing. $1500 in 2000 is $2250 today. $1500 in 1990 is $2950 today.

This is wading into economics chat but we've seen deflationary pressure on consumer electronics and vidjagames for the past couple decades.
My dad spent more on a gaming PC in the 90s in real terms than I'd be spending on an equivalent (relative to game requirements) one today. That's why there's that broader discussion about whether next-gen games are going to go up in price.

The flipside is that his college education cost a tenth of mine and a starter home likewise. So I guess his generation had the last laugh.

Shrimp or Shrimps
Feb 14, 2012


It's the opposite for me even though I earn more: I had terrible PC component spending habits in my 20s that I've reined in now that I'm in my 30s. I have a greater focus on saving and being more intelligent with my purchases. But back in the day, I wanted to upgrade every generation, which, in hindsight, probably wasn't all that necessary; I had a full custom water loop with full-cover blocks, for some silly reason cared about getting binned chips or known quantities from overclockers selling at a markup, ran quad-SLI GTX 295s, etc. Just stupidly spending every dime I earned.

Now I don't play a lot of bleeding edge graphics games and don't care if I have to turn graphics settings down. The only triple-A I ever play is the Battlefield series and there I turn everything to minimum for framerate stability in multiplayer anyway.

shrike82
Jun 11, 2005

Shrimp or Shrimps posted:

ran quad-sli GTX295s, etc

lol how did that run

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

K8.0 posted:

I think my theoretical willingness to spend has gone up, but my practical willingness has gone down in a lot of ways when it comes to GPUs. I'd buy a 3080 (or Ti if available) day one if there were games I was actually excited to play with it, but mostly I just wind up playing stuff that's older or indie, because AAA gaming has become so rotten over the past 15 years that I'm not usually in a rush to play the maybe one AAA game a year that interests me. Combine that with multiplayer FPSs being as dead as FMV adventure games were in the 2000s, and while the money isn't a big deal to me, a lot of the time I'm not even sure I care enough to bother swapping GPUs out, since it won't even matter for the stuff I actually want to play right now.

The Skinner box nonsense has really gotten out of control and it’s to the point where it’s ruining games. Even single player games are now chock full of microtransactions; I’m not buying a game and then paying to not play it to bypass artificial grinding and so on. It is to the point where it’s seriously affecting my enjoyment of newer AAA titles.

(It really is a conflicting feeling that Team Fortress 2, a game that I enjoy so much, put us on this path of incredible over-monetization. TF2 MTX weren’t really particularly exploitative but everyone else saw that and decided to see what else people would pay for.)

At this point I think my upgrades are mostly going to be monitor driven, my next upgrade will probably be the 38GL950G and then I’ll get a fast GPU to help push it. I think I can coast on my 1080 Ti for a year or two more until they come down a bit.

Conversely though as someone who is aiming at 175 Hz gaming, and someone who does a lot of older CPU-bound titles... “overspending” on the CPU relative to the GPU doesn’t bother me.

Paul MaudDib fucked around with this message at 04:21 on Jul 22, 2020

Shrimp or Shrimps
Feb 14, 2012


shrike82 posted:

lol how did that run

Good for benches, otherwise pretty awful and the main game I played was Tribes 2 which could only use 1 GPU core anyway lol.

When redoing a waterblock after upgrading my loop to dual DDCs one time, I must have tightened the block on one side too much or done something, because one of the cores busted and I could only run tri-SLI. I am honestly surprised it even worked. Managed to sell both to someone for a proportional discount due to only having 3 cores, but PayPal fees, currency conversion, shipping costs which I ate, etc. made them stupid purchases to begin with.

Shrimp or Shrimps fucked around with this message at 04:26 on Jul 22, 2020

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

ConanTheLibrarian posted:

Is there any reason to think that PCs with sufficient RAM aren't going to be at least as good as the consoles at loading textures on demand? 8GB+ on the card and 16GB system RAM is a lot after all.

The new consoles have dedicated hardware to handle texture decompression from storage with basically zero CPU involvement, plus they can load textures into memory directly accessible to the GPU without having to go through main memory -> PCIe bus -> GPU memory.

Ugly In The Morning
Jul 1, 2010
Pillbug

shrike82 posted:

This is wading into economics chat but we've seen deflationary pressure on consumer electronics and vidjagames for the past couple decades.
My dad spent more on a gaming PC in the 90s in real terms than I'd be spending on an equivalent (relative to game requirements) one today. That's why there's that broader discussion about whether next-gen games are going to go up in price.

The flipside is that his college education cost a tenth of mine and a starter home likewise. So I guess his generation had the last laugh.

The CGW review of a high-end computer in the issue where Deus Ex came out had a price tag of over $8k in 2020 money. It was basically obsolete in two years.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I only started getting seriously into PC hardware last year so I don't feel too bad spending on something that I've only very seldom spent money on before, but I think I have a weakness for old/budget hardware, so I'm probably bad with money anyway!

Happy_Misanthrope posted:

The new consoles have dedicated hardware to handle texture decompression from storage with basically zero CPU involvement, plus they can load textures into memory directly accessible to the GPU without having to go through main memory -> PCIe bus -> GPU memory.

and the thread goes full circle yet again

SwissArmyDruid
Feb 14, 2014

by sebmojo

Carecat posted:

I found an actual chart to see what usage is like; there don't seem to be many tests of VRAM usage.

https://www.techspot.com/article/1785-nvidia-geforce-rtx-2060-vram-enough/


So most games are maxing out at 6GB right now. The new consoles will be able to use 8-10GB of VRAM depending on how much else they need to fit in memory, but what is the crazy storage -> RAM throughput that both consoles are boasting going to mean?

are they maxing out at 6 GB because they are maxing out at 6 GB, or are they maxing out at 6 GB because they're intentionally being programmed that way?

MadFriarAvelyn posted:

The amount I've spent on my builds has definitely gone up since I started building PCs, almost certainly because I've got a full-time job now compared to back when I started in grade school, but that amount certainly spiked unfairly high when I had to replace my last PC in the middle of the latest cryptocurrency-caused GPU drought.

I bought a Gigabyte 1080 Ti marked up $300 from its base price, advertised as being a factory overclocked model, only for it to be a dud card that would only run stable if I underclocked it about 200 MHz below stock speeds. Gigabyte refused to honor a warranty replacement because I couldn't prove it was the card at fault without a second graphics card to test against. The only other graphics card available and in stock at the time?

A Titan V.

:shepicide:

Turns out that was a dud too for other reasons: it would crash frequently while running games, and the stock blower cooler was just awful at keeping it at sane temps. NVIDIA wouldn't replace the card for the former issue, so to try and at least make it usable, I built a watercooling loop for it to keep the temps under control. The temps staying under control helped the stability issue, but unfortunately the pump in the loop died in the middle of a gaming session and the card was toast after that.

So I'm looking forward to a new tier of cards to come out so I can replace the busted 1080 Ti I kept as backup, and crossing my fingers I can get my hands on one before the inevitable price gouging begins and I'm forced to make more poor decisions.

Should have said something in Catgirl Bargains or the FFXIV thread, I could have absolutely sent you over something to sort that poo poo out. Been sitting on a 1650 Super and a 1050 Ti I should really get about hocking.

Hell, if the warranty window is still open *now*, I'll send one over.

Craptacular!
Jul 9, 2001

Fuck the DH

shrike82 posted:

on a separate note, do people feel like their willingness-to-pay for PC parts has gone up over the years? it's mostly earning more as I grow older, but I recall aiming for a 1.5K build (excluding monitor) a decade ago while these days my budget is at 3K. part of it is also realizing how cheap PC gaming is as a "hobby" compared to anything else, e.g. owning a boat or being an audiophile.

No, the opposite. It has probably gone down due to a combination of parts lasting longer (particularly CPUs, which don't have 18 month lifespans like before) and the cost of consoles going up from all players. You see this all over tech; in Linux land they used to keep a bunch of no-frills GUIs for older hardware, where "older" was defined as ten years. Except ten year old computers don't struggle to simply boot and load a browser as a 1993 computer did in 2003.

That said, it's natural if you're in the US due to the inflation we've experienced from years of policy mismanagement. $450 is my limit for a graphics card today because that's what I spent on a 6600GT in the summer of 2004, though thanks to inflation that would be $650 and I would never pay that much for a GPU (especially since it was to max out Doom3, which oh my god was a terrible idea.)

K8.0 posted:

I think my theoretical willingness to spend has gone up, but my practical willingness has gone down in a lot of ways when it comes to GPUs. I'd buy a 3080 (or Ti if available) day one if there were games I was actually excited to play with it, but mostly I just wind up playing stuff that's older or indie, because AAA gaming has become so rotten over the past 15 years that I'm not usually in a rush to play the maybe one AAA game a year that interests me. Combine that with multiplayer FPSs being as dead as FMV adventure games were in the 2000s, and while the money isn't a big deal to me, a lot of the time I'm not even sure I care enough to bother swapping GPUs out, since it won't even matter for the stuff I actually want to play right now.

There's also a lot of this. In an age of microtransactions, battle passes, and "games as a service," where even lowly arena shooters are supposed to be a five-year-or-greater epic journey slowly dripping out plot details with each in-game 'season', there's really no point in buying a high-end GPU anymore. If every movie took the Marvel Cinematic Universe approach, I would never go to IMAX again, and the same applies here.

Why go whole-hog into graphics if games are going to be like Anthem? I know that's an exceptionally terrible game to use as an example but it's one that was graphically pushing things and people were excited for. And then they touched it, and oops.

Craptacular! fucked around with this message at 06:43 on Jul 22, 2020

CaptainSarcastic
Jul 6, 2013



Craptacular! posted:

No, the opposite. It has probably gone down due to a combination of parts lasting longer (particularly CPUs, which don't have 18 month lifespans like before) and the cost of consoles going up from all players. You see this all over tech; in Linux land they used to keep a bunch of no-frills GUIs for older hardware, where "older" was defined as ten years. Except ten year old computers don't struggle to simply boot and load a browser as a 1993 computer did in 2003.

That said, it's natural if you're in the US due to the inflation we've experienced from years of policy mismanagement. $450 is my limit for a graphics card today because that's what I spent on a 6600GT in the summer of 2004, though thanks to inflation that would be $650 and I would never pay that much for a GPU (especially since it was to max out Doom3, which oh my god was a terrible idea.)


There's also a lot of this. In an age of microtransactions, battle passes, and "games as a service," where even lowly arena shooters are supposed to be a five-year-or-greater epic journey slowly dripping out plot details with each in-game 'season', there's really no point in buying a high-end GPU anymore. If every movie took the Marvel Cinematic Universe approach, I would never go to IMAX again, and the same applies here.

Why go whole-hog into graphics if games are going to be like Anthem? I know that's an exceptionally terrible game to use as an example but it's one that was graphically pushing things and people were excited for. And then they touched it, and oops.

Okay, Boomer.

Craptacular!
Jul 9, 2001

Fuck the DH
I’m really excited for the next Switch, though.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Did I accidentally stumble into a YouTube video's comments section?

mobby_6kl
Aug 9, 2009

by Fluffdaddy

Shrimp or Shrimps posted:

It's the opposite for me even though I earn more: I had terrible PC component spending habits in my 20s that I've reined in now that I'm in my 30s. I have a greater focus on saving and being more intelligent with my purchases. But back in the day, I wanted to upgrade every generation, which, in hindsight, probably wasn't all that necessary; I had a full custom water loop with full-cover blocks, for some silly reason cared about getting binned chips or known quantities from overclockers selling at a markup, ran quad-SLI GTX 295s, etc. Just stupidly spending every dime I earned.

Now I don't play a lot of bleeding edge graphics games and don't care if I have to turn graphics settings down. The only triple-A I ever play is the Battlefield series and there I turn everything to minimum for framerate stability in multiplayer anyway.
My willingness to spend money on PC poo poo seems to be inversely proportional to how much I make. I bought a brand new Q6600 / 8800GT machine a few months into my first full time job. I'm now making like 8 times more and my computer is a refurbished Optiplex with a 1070 in it. There's just less time for playing games and more alternatives to spend time and money on. I'd rather go to a track day than spend any more time sitting alone in front of a monitor.

That said, I'm really itching to upgrade now. Unfortunately the new Ryzens seem to be not what I'm looking for, so maybe by the end of the year with Ampere and whatever Intel/AMD come up with.

Mercrom
Jul 17, 2009

Listerine posted:

Since I'm not gaming, whether or not I buy a 3000 series (and when) will depend on the numbers those cards put up for GPU rendering. Since that application scales with the number of cards, it may make more sense in terms of price/performance for me to just add another 2000 series.

Can consumer GPUs do NVLink memory pooling?

sauer kraut
Oct 2, 2004

Mercrom posted:

Can consumer GPUs do NVLink memory pooling?

The cheap NVLink that's on the top-end consumer Turings does like 4% of VRAM transfer speeds.
NVLink 3 is the good stuff for pro cards, but the little connector bridge clamp thing is the price of a whole 2080 Super last time I read something about it.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

haha two chips at the same time, man

Listerine
Jan 5, 2005

Exquisite Corpse

Mercrom posted:

Can consumer GPUs do NVLink memory pooling?

I have no idea what that is, I don't know technical stuff. I'm using Redshift for rendering, it uses every GPU in your system, so you don't have to swap out when you upgrade, you can just keep adding cards to get better render times. I'm running it with a 780 and a 2070.

Cactus
Jun 24, 2006

Scarecow posted:

I'm going to open palm slam down money on the next Titan that comes out because gently caress it

Yeah this is me. I'm convinced, from just looking around and seeing what's happening, that civilisation as we know it is coming to an end in the next decade or so, so I'm gonna blow my resources on the things I like to do in the meantime and then deal with that (or die) when it comes. So if it turns out I have to buy a new PSU with my GPU? Ok then, I will, and I'll file that decision in the same folder as I filed the "Ok well, I'll just buy the game on Epic store then" folder. It's the shrug folder. We all have one, and the only thing that differs between us is what we choose to file in it.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Statutory Ape posted:

haha two chips at the same time, man

that's it? you have a million dollars you'd have two chips at the same time?

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Cactus posted:

Yeah this is me. I'm convinced, from just looking around and seeing what's happening, that civilisation as we know it is coming to an end in the next decade or so,

New Nvidia marketing campaign
