|
I would wait for the next generation, the 1080 Ti is still a pretty good card and the 2080 Ti is now more than a year old.
|
# ? Oct 10, 2019 08:36 |
|
Yeah, Nthing the advice here. The 1080 Ti is really strong, so unless you NEED RTX right now you should be good until the 3080 Ti rolls around (or whatever it's called at that time).
|
# ? Oct 10, 2019 08:50 |
|
If you hesitate to buy the 2080 Ti, you don't have the money to waste on it. It's not worth upgrading from a 1080 Ti yet.
|
# ? Oct 10, 2019 09:15 |
|
Thelonius Van Funk posted:I have a GTX 1080 Ti that I am pretty pleased with, but the prices for the RTX 2080 Ti have finally started dropping in Sweden. Is there any point in upgrading or should I just hold out for the new cards in 2020?

The 2080 Ti is 30-40% faster than the 1080 Ti, so the question is whether that's worth the price for you. If the used market in Sweden is anything like it is in the US, you can still get a good price for a used 1080 Ti, reducing the total upgrade cost. If you've got a monitor and/or games that could really benefit from that 30-40% bump, and you want it now, then the 2080 Ti is a great card to have.

That said, I have to agree with everyone else: NVidia should be releasing their next gen cards within 6 months or so. These cards are widely expected to be on the newer 7nm node, meaning that the 2180 Ti (or whatever they call it) should be, at minimum, 30-50% faster than the 2080 Ti, and will likely come with respun RTX segments to let you actually use RTX at the resolutions and framerates you'd expect from a $1000+ card. So if you're not really hurting for a new card right this minute, I'd wait. If you are, though, it's not like you're going to have a bad time rockin' a 2080 Ti for a few years.
|
# ? Oct 10, 2019 14:31 |
|
Not an old game but I would KILL to see an RTX enabled version of the Deathwing game.
|
# ? Oct 10, 2019 14:36 |
|
DrDork posted:NVidia should be releasing their next gen cards within 6 months or so.

Why so soon? Is there anything that actually indicates they'll cut the usual two year product cycle short? There's still no meaningful competition, and we've even had a mid-cycle refresh with the Supers this time around. I don't expect an announcement until like a year from now, and mainstream availability/reasonable pricing will probably take another 2-3 months in many parts of the world.
|
# ? Oct 10, 2019 15:08 |
|
repiv posted:Nvidia is hiring people to work on more remasters a-la Quake 2 RTX

Pong
|
# ? Oct 10, 2019 15:16 |
|
TheFluff posted:Why so soon? Is there anything that actually indicates they'll cut the usual two year product cycle short? There's still no meaningful competition and we've even had a mid-cycle refresh with the supers this time around. I don't expect an announcement until like a year from now, and mainstream availability/reasonable pricing will probably take another 2-3 months in many parts of the world.

While NVidia hasn't announced it themselves, there have been multiple corroborating reports that they're targeting a 1H2020 release. The reason why is pretty simple: the 20-series cards were lackluster in a lot of ways and have not sold particularly well. Compared to the 10-series, they didn't improve on performance nearly as much as NVidia generally does with a new generation, and the price:performance ratio was disappointing thanks to price hikes. There are a lot of reasons to believe that they never really wanted to do the 20-series on 12nm, but just couldn't make the jump to 7nm in time. Instead, they basically burned almost an entire generation on priming the pump for RTX.

The 21-series (or whatever) should provide a substantial performance, power efficiency, and density jump compared to current cards. The end result is that they should be able to get better yields (read: lower prices or better profits), and be able to turn RTX from its current status as more or less a tech demo into something that people with cards that don't end in 80 Ti will actually be able to use and want.

And while AMD might not be able to compete with the 2080 Ti, their mid-tier and low-end products are actually viable alternatives in many cases. Quickly turning to the next generation on a 7nm node should allow NVidia to take that space back again, and that space is where a ton of money is made. I mean, AMD was able to survive for the past how many years without a flagship card precisely because of mid- and low-end card sales.

There's really no reason for them to want to stay on 12nm any longer than they absolutely have to. Everyone has been assuming the 20-series would be short-lived compared to normal generations since day 1, really.
|
# ? Oct 10, 2019 15:38 |
|
DrDork posted:While NVidia hasn't announced it themselves, there have been multiple corroborating reports that they're targeting a 1H2020 release.

The recent "leaks" in the last few days all seem to be barely better than pure speculation. They are also all talking about Ampere, which last I heard was compute-only (like Volta) with no consumer parts. I'd still say new consumer cards before summer 2020 is more wishful thinking than anything else.

e: pretty much all the reporting in the last few days that "new nvidia gpus are coming 1H2020!" from all the various outlets can be traced back to a single sentence near the end of an igor's lab post on the 1660 Super and the 5500 XT, which says:

quote:Whereby up to the half of the year 2020 Ampere should finally come

it's 100% wishful thinking all the way down lmao. the entire idea that turing's product cycle would be shorter than normal is just some weird idea people got into their heads because it wasn't as good as they'd hoped, it has no foundation in anything at all other than disappointment with the product

TheFluff fucked around with this message at 16:14 on Oct 10, 2019 |
# ? Oct 10, 2019 16:00 |
|
The longer they take, the more I can save up, and the better the card I'll be able to get to replace my GTX 1080, a GPU released over 3 years ago that's still perfectly fine in every single AAA game at 1440p with mostly ultra settings. These are the times where it feels good not having 4K or 144Hz. I guess I'll be on the lookout for a new card around the time Cyberpunk 2077 comes out, or thereabouts.
|
# ? Oct 10, 2019 16:31 |
|
TheFluff posted:Why so soon? Is there anything that actually indicates they'll cut the usual two year product cycle short? There's still no meaningful competition and we've even had a mid-cycle refresh with the supers this time around. I don't expect an announcement until like a year from now, and mainstream availability/reasonable pricing will probably take another 2-3 months in many parts of the world.

It's not soon, it's almost precisely on the schedule Nvidia have been keeping since 2013.
|
# ? Oct 10, 2019 17:10 |
|
Riflen posted:It's not soon, it's almost precisely on the schedule Nvidia have been keeping since 2013.

GTX 680/Kepler, 28nm: March 2012
GTX 780/Kepler, 28nm: May 2013 (14 months)
GTX 750/Maxwell, 28nm: February 2014 (23 months from Kepler)
GTX 980/Maxwell, 28nm: September 2014 (16 months from 780)
GTX 1080/Pascal, 16nm: May 2016 (20 months from 980, 27 months from Maxwell)
RTX 2080/Turing, 12nm: September 2018 (28 months)

680 and 780 were both Kepler, and Maxwell actually launched as early as February 2014 with the GTX 750, and both Kepler and Maxwell were on the 28nm node. But no, that could not possibly have made development faster and product cycles shorter. Clearly there is a strict 18 month product cycle here!

e: date calculation is hard, fixed some errors. the shortest time between new consumer card architectures since 2012 has been 23 months.

e2: not only that, but there's clearly no schedule even if you just consider times between x80 cards - rather, there's a clear trend towards longer intervals (14/16/20/28 months between releases).

TheFluff fucked around with this message at 18:04 on Oct 10, 2019 |
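For anyone who wants to redo that interval math themselves (the poster admits "date calculation is hard"), here's a quick sketch. The exact launch days below are assumptions from memory; only the year and month matter for the month counts:

```python
from datetime import date

# Whole months between two launch dates (day-of-month deliberately ignored,
# matching how the intervals in the post above were counted)
def months_between(a: date, b: date) -> int:
    return (b.year - a.year) * 12 + (b.month - a.month)

# Launch dates as listed in the post; exact days are assumed for illustration
launches = [
    ("GTX 680 (Kepler)",  date(2012, 3, 22)),
    ("GTX 780 (Kepler)",  date(2013, 5, 23)),
    ("GTX 750 (Maxwell)", date(2014, 2, 18)),
    ("GTX 980 (Maxwell)", date(2014, 9, 18)),
    ("GTX 1080 (Pascal)", date(2016, 5, 27)),
    ("RTX 2080 (Turing)", date(2018, 9, 20)),
]

for (prev_name, prev_date), (name, d) in zip(launches, launches[1:]):
    print(f"{prev_name} -> {name}: {months_between(prev_date, d)} months")
```

This reproduces the figures in the post, e.g. 680 to 780 is 14 months and 1080 to 2080 is 28 months.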
# ? Oct 10, 2019 17:27 |
|
DrDork posted:There's really no reason for them to want to stay on 12nm any longer than they absolutely have to. Everyone has been assuming the 20-series would be short-lived compared to normal generations since day 1, really.

This has been wishful thinking since day 1, honestly.
|
# ? Oct 10, 2019 17:45 |
|
Cygni posted:This has been wishful thinking since day 1, honestly.

Yea. I mean... NVidia's track record these past few years really has been so poo poo that it's made me consider AMD options as just far better deals overall, especially aftermarket ones. But that doesn't diminish the big pile of money NVidia is sitting on, which allows them to throw RTX sand in everyone's eyes while they take their sweet time with their version of 7nm.
|
# ? Oct 10, 2019 18:01 |
|
TheFluff posted:
I didn't say 2012, you did. Since 2013, the average time between releases of x80 parts has been 21 months. The average time between releases of the x80 Ti parts has been 19 months. There has been a Titan GPU released at least every calendar year (in 2017 we got two, and the average time between releases is currently just under 12 months). These schedules are dictated by technological and economic factors, and this trend only continues until it doesn't. But the middle of next year for an x80 Ampere, or whatever it'll be called, is in keeping with trends seen over the last 6 years. Typically the x104 part (x80) is the first Geforce product released from a new architecture.

Edit: Seeing as you edited, I'll add that there were very credible reports that the next GPU design taped out in April 2019. In the past, tape out happens about a year and a bit before the first products are released, but again, it depends.

Riflen fucked around with this message at 18:26 on Oct 10, 2019 |
# ? Oct 10, 2019 18:13 |
|
I wonder if Cyberpunk 2077 is going to be a big enough deal for GPU manufacturers to rush something to market before it releases.
|
# ? Oct 10, 2019 18:35 |
|
Riflen posted:I didn't say 2012, you did.

The x80 Tis are just irrelevant to the discussion and so are the Titans. Again, the only reason the 780 and the 980 were so close in time was that Maxwell launched with the 750, not with the 780. Look at the microarchs, not the x80 cards.

Riflen posted:Edit: Seeing as you edited, I'll add that there were very credible reports that the next GPU design taped out in April 2019. In the past, tape out happens about a year and a bit before the first products are released but again, it depends.

Yes, I remember that leak too. What I also remember from that leak, though, that you either forgot or missed at the time, is that it said that Ampere was likely compute-only.

TheFluff fucked around with this message at 18:51 on Oct 10, 2019 |
# ? Oct 10, 2019 18:38 |
|
eames posted:I wonder if Cyberpunk 2077 is going to be a big enough deal for GPU manufacturers to rush something to market before it releases.
|
# ? Oct 10, 2019 18:48 |
|
Isn't the biggest problem with the 20-series, from Nvidia's perspective, that it's kinda inefficient and expensive to make due to the AI/deeplearning tensor cores being essentially wasted die space for gaming use? I could see them rushing out new high end GPUs that ditch almost all the AI poo poo.
|
# ? Oct 11, 2019 06:02 |
|
FuturePastNow posted:Isn't the biggest problem with the 20-series, from Nvidia's perspective, that it's kinda inefficient and expensive to make due to the AI/deeplearning tensor cores being essentially wasted die space for gaming use?

They won't do this because it would invalidate their earlier efforts + they have stock to clear.
|
# ? Oct 11, 2019 06:30 |
|
FuturePastNow posted:Isn't the biggest problem with the 20-series, from Nvidia's perspective, that it's kinda inefficient and expensive to make due to the AI/deeplearning tensor cores being essentially wasted die space for gaming use?

That's why they're trying to push DLSS and all that so they do have a gaming use, but so far it hasn't been going well.
|
# ? Oct 11, 2019 07:17 |
|
And so far doesn't actually use the tensor cores
|
# ? Oct 11, 2019 07:18 |
|
FuturePastNow posted:Isn't the biggest problem with the 20-series, from Nvidia's perspective, that it's kinda inefficient and expensive to make due to the AI/deeplearning tensor cores being essentially wasted die space for gaming use?

12nm wafers should be much cheaper than anything 7nm. If anything, Nvidia is saving money despite the large die sizes.
|
# ? Oct 11, 2019 08:27 |
|
TheFluff posted:Come on dude. When the actual intervals between x80 cards have been 14/16/20/28 months in that order, saying that the average is 21 months is not just meaningless, it's actively misleading. If you're excluding the 680 you're also taking an average of three (3) values.

You seem to want to have an argument about this. There's no need to be combative. All I said was June is 1H 2020 and June-Sept is likely for Gx104 considering how Nvidia have operated in recent years. It's all speculation based on what little the public knows ahead of time anyway. I agree these recent reports have zero information in them and are probably just websites trying to get ad hits, but grabbing on to one-off events like how Maxwell was launched doesn't make a strong case for any other launch window either.
|
# ? Oct 11, 2019 09:23 |
|
I figure I'll leave this here because we don't seem to have a SH/SC streaming thread yet and the readers of this topic might be interested.

Google Stadia VP of engineering posted:"Ultimately, we think in a year or two we'll have games that are running faster and feel more responsive in the cloud than they do locally," Bakar says to Edge, "regardless of how powerful the local machine is."

https://www.pcgamesn.com/stadia/negative-latency-prediction

They plan to do this using fancy algorithms for input anticipation (prediction?), kind of what I failed to describe here.
|
# ? Oct 11, 2019 11:06 |
|
eames posted:I figure I'll leave this here because we don't seem to have a SH/SC streaming thread yet and the readers of this topic might be interested.

My Little Stadia: Algorithms are Magic
|
# ? Oct 11, 2019 11:28 |
|
eames posted:
Not with the kind of latency, ping and generally poo poo broadband connections we have here, no. I have FTTC, but cabinet to house is still good ol' copper wire, and while I can get 60mbps download/20mbps upload, ping is very rarely less than 70-80ms in real-world situations and latency is pretty noticeable, much like the 7mbps ADSL we all used to have a few years ago (or even 640kbps). Unless they manage to renovate the entire network infrastructure on a global level so we all get FTTH in 1-2 years, which sounds absolutely ridiculous. Maybe in smaller nations with already great infrastructure (Japan, Sweden...) and maybe big cities with >1,000,000 people in major nations? But there's not many of those where I live.

E: ok sorry, missed the part of your post where apparently their solution to this problem is "algorithm predicting what action, button, or movement you're likely to do next and render it ready for you", which I believe is utter bullshit

TorakFade fucked around with this message at 11:58 on Oct 11, 2019 |
# ? Oct 11, 2019 11:55 |
|
Yeah, it still needs to figure out which input you actually did and stream the image back to you, so it’ll only eliminate the internal server latency at best. I’d put good money on the network latency dwarfing the internal latency. E: Unless it’s sending you 20 streams and you get to pick the one you did Stickman fucked around with this message at 12:13 on Oct 11, 2019 |
# ? Oct 11, 2019 12:06 |
|
Latency? We have... negative latency.
|
# ? Oct 11, 2019 12:30 |
|
Without getting into the quagmire of just how they're leveraging predictive action into reducing perceived latency, I want to know how they plan on avoiding the latency impact of failed predictions. Like, what happens to the mouse cursor? If I move left when Stadia thinks right, is it going to jump? I could see a game being designed around asymmetric UI paradigms and have each 'layer' predicted separately: Stadia doesn't predict the mouse layer, the UI layer can be cheaply prerendered, the game world layer runs at its own speed, and it's all stitched together locally. But that's something you'd have to have the game built for from the ground up; you can't retrofit existing games into that kind of paradigm.
|
# ? Oct 11, 2019 12:51 |
|
I'm pretty sure it's just bullshit marketing and they're not actually going to do any of that. They're trying to downplay the downsides to this with "we'll use math to fix it". It would probably be easier for them to send everyone who has Stadia a monitor with minimal input lag instead of a TV with 30ms+. Google where's my CRT monitor
|
# ? Oct 11, 2019 14:01 |
|
I have high quality FTTH, and even then average RTT to local datacenters is in the 8-10ms range. Most of my gaming targets sub-15ms frame times. With just the network layer taking half or more of the time it takes my local system to render an entire frame, good luck ever getting latency to compare. The only way I could see this being true is if they cherry-picked the scenario so you had someone on fiber sitting next to their datacenter but playing on some potato computer with an iGPU. Then maybe it would technically be faster because their local processing capacity would be almost nil, but that's not what their clipping said, so...

Wouldn't be the first tech startup to outright lie about the capabilities of their product, though.
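As a rough sketch of that budget, using the numbers from the post (the server frame time and encode/decode overhead are assumptions for illustration, not measurements):

```python
# Rough latency budget comparison; numbers are illustrative, not measured
network_rtt_ms = 9      # FTTH to a nearby datacenter (8-10 ms per the post)
local_frame_ms = 15     # local render target: sub-15 ms frame times
server_frame_ms = 15    # assumption: the datacenter renders no faster than local
encode_decode_ms = 5    # assumption: video encode + decode overhead

streamed_total_ms = network_rtt_ms + server_frame_ms + encode_decode_ms
print(f"local:    {local_frame_ms} ms")
print(f"streamed: {streamed_total_ms} ms")  # → 29 ms: network + encode alone roughly double a local frame
```

Even with every assumed stage set generously, the streamed path can't beat the local one, which is the point being made above.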
|
# ? Oct 11, 2019 14:03 |
|
isndl posted:Without getting into the quagmire of just how they're leveraging predictive action into reducing perceived latency, I want to know how they plan on avoiding the latency impact of failed predictions. Like, what happens to the mouse cursor? If I move left when Stadia thinks right, is it going to jump?

You could minimise the latency impact by rolling back to a previous game state and replaying it with your actual inputs, but that sounds like a stuttery mess for anything with direct camera controls.
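A toy sketch of that rollback-and-replay idea (purely illustrative; a real game state is vastly more complex, but the mechanism is the same: keep the old state, re-simulate with the real inputs once they arrive):

```python
# Toy deterministic simulation: state is a 1D position, input is -1/0/+1 per tick
def step(state: int, inp: int) -> int:
    return state + inp

def simulate(initial: int, inputs: list[int]) -> int:
    state = initial
    for inp in inputs:
        state = step(state, inp)
    return state

# The server predicted the player keeps moving right (+1) for ticks 0..2...
predicted_inputs = [+1, +1, +1]
predicted_state = simulate(0, predicted_inputs)   # → 3

# ...but the real inputs arrive late and say the player turned left on tick 1.
# Rollback: discard the mispredicted state, replay from the saved state (0).
actual_inputs = [+1, -1, -1]
corrected_state = simulate(0, actual_inputs)      # → -1

print(predicted_state, corrected_state)  # the on-screen jump from 3 to -1 is the "stutter"
```

The gap between the predicted and corrected states is exactly the visual snap the post is worried about for camera controls.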
|
# ? Oct 11, 2019 14:10 |
|
Rollback netcode for all games not just fighting games, let's go.
|
# ? Oct 11, 2019 14:47 |
|
Riflen posted:You seem to want to have an argument about this. There's no need to be combative. All I said was June is 1H 2020 and June-Sept is likely for Gx104 considering how Nvidia have operated in recent years. It's all speculation based on what little the public knows ahead of time anyway. I agree these recent reports have zero information in them and are probably just websites trying to get ad hits, but grabbing on to one off events like how Maxwell was launched doesn't make a strong case for any other launch window either. I disagree, and I'd say there's a pretty convincing case for the next xx80 coming later than that. We have the leak saying Ampere taped out back in April and that it's likely compute-only. Supporting that is the EEC filing that WCCFTech reported on recently where the only two Ampere parts listed are a GA104 and a GA104-400. A GA104 with no suffix sounds like a Quadro part, and while a GA104-400 could be an xx80 part it's more likely a Titan. This supports my belief that Ampere will be like Volta, and the next xx80 parts will probably not launch before Q4 2020 given the past track record of 24 months or more between consumer microarch launches in the last decade (it goes further back than Kepler - Fermi launched in April 2010, 24 months before Kepler). Fermi and Kepler had two x80 parts each, which contributed to shorter product cycles, but that hasn't been a thing since Maxwell. Even supposedly well-informed publications like WCCFTech bring up completely inane arguments like "well Nvidia took a beating in the recent quarterly reports" as something that makes a consumer card launch likely to happen earlier, but designing, prototyping, testing and mass producing a new microarch takes several years and involves huge supplier lead times - it's nothing you can just rush out because the latest quarterly report looks bad. 
The roadmap and launch quarter for Ampere and whatever comes next was probably decided over a year ago already and there's nothing that can be done to change it now. Or maybe I'm wrong and Ampere will launch compute only at first but get consumer parts soon after, what do I know. I don't see any arguments other than "it would feel good" for a 1H2020 consumer launch though. I really want to replace my GTX1080 myself but I don't feel hopeful about getting to do it soon. edit: haha never mind, i shouldn't have trusted wccftech. the filing they're reporting on is from back in august 2018, at which time they made this piece of highly accurate reporting based on it (like seriously, it's hilarious how confidently they are wrong). never mind, not much reason to believe anything at all right now. TheFluff fucked around with this message at 17:03 on Oct 11, 2019 |
# ? Oct 11, 2019 14:55 |
|
SwissArmyDruid posted:Rollback netcode for all games not just fighting games, let's go.

Fighting games were the last genre to get it. Half-Life pioneered it in 1999 or 2000 and almost every non-fighting game made since at least 2004 has it.
|
# ? Oct 11, 2019 16:44 |
|
lllllllllllllllllll posted:Latency? We have... negative latency.

Stadia is officially a console, as we have reached console level lovely marketing.
|
# ? Oct 11, 2019 17:07 |
|
Yeah, and that's why netcode post-Quake 3 has been a pile of garbage. Quake 2 still has the best amount of prediction for an online multiplayer game.
|
# ? Oct 11, 2019 17:11 |
|
reminder that everyone on a 60hz monitor that's using v-sync (which is to say most people) likely has ~90ms input lag already
TheFluff fucked around with this message at 17:23 on Oct 11, 2019 |
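One back-of-the-envelope for where a figure like ~90ms comes from (the stage breakdown below is an assumption for illustration; real pipelines and displays vary a lot):

```python
# Assumed input-to-photon pipeline for a 60 Hz monitor with v-sync enabled
frame_ms = 1000 / 60  # 16.67 ms per refresh at 60 Hz

stages_ms = {
    "input sampling (avg. half a frame)": frame_ms / 2,
    "CPU+GPU frames in flight (render queue)": 2 * frame_ms,
    "v-sync wait for next scanout": frame_ms,
    "display internal processing": frame_ms,
    "pixel response (assumed)": 10,
}

total_ms = sum(stages_ms.values())
print(f"{total_ms:.0f} ms")  # → 85 ms with these assumptions
```

With these (hedged) stage estimates the total lands in the same ballpark as the ~90ms claim, which is the point: local play with v-sync is not some latency-free baseline either.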
# ? Oct 11, 2019 17:16 |
|
pzy posted:Maybe Half-Life: https://www.pcgamer.com/the-original-half-life-just-got-patched-for-some-reason/

Well, we're never going to see Half-Life 3, so may as well raytrace the gently caress out of the games we do have. Black Mesa does look pretty good, though!
|
# ? Oct 11, 2019 17:22 |