|
Buy the new card: 4 GB of RAM today, 3.5 GB of RAM tomorrow
|
# ? Sep 19, 2018 21:03 |
|
|
I like Steve and I prefer his videos to any other GPU source of information out there, but I think he's really stretching the "this is good for developers..." angle. It's possible all the raytracing games fall through or maybe the performance hit will be crazy, but these games are coming out very soon. He and others are putting zero value in RTX features because they aren't available now. I'd agree with that (much like I did with DX12) if it weren't for the release dates of these major AAA games. So that is to say I don't agree with that at all. At worst it's an unknown, but I'm pretty sure we'll know sooner rather than later. Regardless, the pricing puts a major negative shade on everything
|
# ? Sep 19, 2018 21:11 |
|
As unspectacular as the top end of the product stack is, I kind of feel like Nvidia would have been better off holding the 2080Ti to the usual mid-cycle timing.
|
# ? Sep 19, 2018 21:13 |
|
1gnoirents posted:I like Steve and I prefer his videos to any other GPU source of information out there, but I think he's really stretching the "this is good for developers..." angle. It's possible all the raytracing games fall through or maybe the performance hit will be crazy, but these games are coming out very soon. He and others are putting zero value in RTX features because they aren't available now. I'd agree with that (much like I did with DX12) if it weren't for the release dates of these major AAA games. So that is to say I don't agree with that at all. At worst it's an unknown, but I'm pretty sure we'll know sooner rather than later.

It's tough for reviews to put value on something that they can't test, though, especially if it's potentially a performance hit.

SwissArmyDruid posted:As unspectacular as the top end of the product stack is, I kind of feel like Nvidia would have been better off holding the 2080Ti to the usual mid-cycle timing.

I feel like they included it precisely because that's where there was a gap in the market. The 2080 and 2070 are basically flat for their price point (problematic when there's a glut of overstocked 1080/Tis floating around), but the 2080 Ti doesn't have a competitor.

Stickman fucked around with this message at 21:21 on Sep 19, 2018 |
# ? Sep 19, 2018 21:15 |
|
Paul MaudDib posted:Again, if you could pick up a 1080 Ti for like $400 then by all means it would be a great deal but when you're already spending $600 on a card you should really think seriously about dropping the extra $100 to get the newer card even if it's not quite as good a price-to-performance today. The 2080 is already a little faster today, it's potentially a lot faster down the road, and as a current-gen product you are the priority for support/optimizations in future titles.

counterpoint: don't spend the better part of a thousand dollars based on something that might potentially happen, jesus loving christ

The fact is that today the 2080 is a lovely deal. Don't try to talk people into buying it based on the idea that in a year from now, it might turn out to have been a less lovely deal than it seems like right now. It's highly unlikely that it's going to go up in price, so how about you wait it out and make an informed decision later rather than throwing hundreds of dollars at a speculative purchase today?

I mean, yeah, if you're sitting on some old garbage card and you were going to upgrade this month anyway and you have more money than sense, then sure, fine, go ahead and buy a 2080 instead of a 1080Ti. If not though, just wait it out and see what happens.
|
# ? Sep 19, 2018 21:17 |
|
SwissArmyDruid posted:As unspectacular as the top end of the product stack is, I kind of feel like Nvidia would have been better off holding the 2080Ti to the usual mid-cycle timing.

Why? Outside of DLSS and RT, the 2080 Ti is the only card that has a value proposition at all. The others are all roughly equivalent to Pascal MSRP (let alone current retail price) in terms of perf/$, which is like... why bother releasing them. The 2080 Ti actually has a purpose by virtue of not having a Pascal equivalent at all. I will admit Turing's performance is a little better than I expected based on the Titan V, but still not enough to justify the prices on the lower SKUs. For machine learning these cards are insanely good value though.
|
# ? Sep 19, 2018 21:19 |
|
Stickman posted:It's tough for reviews to put value on something that they can't test, though, especially if it's potentially a performance hit.

I understand, but he is implying it should be discounted like it won't happen at all. It's one thing to not include it in your judgement, but it's another thing to say those features are only useful for... developers?
|
# ? Sep 19, 2018 21:21 |
|
TheFluff posted:I mean, yeah, if you're sitting on some old garbage card and you were going to upgrade this month anyway and you have more money than sense, then sure, fine, go ahead and buy a 2080 instead of a 1080Ti. If not though, just wait it out and see what happens.

I don't think I or anyone else suggested you should upgrade from a 1080 Ti to a 2080, that would be stupid. Obviously that would implicitly narrow things down to people who are upgrading from low-end or older cards, so way to get mad about something nobody said?

If you are looking to upgrade and you don't want to spend $1200 for a 2080 Ti, the 2080 makes more sense at this point in time than the 1080 Ti. The 2080 is already 6-8% faster and in FP16 titles it's more like 20% faster. Again, we've already seen aftermarket cards at $750, at which point the 1080 Ti is starting to get overpriced at $600-700. The 1080 Ti really needs to drop down to like $500-550 to really stay viable, especially as aftermarket 2080s start approaching MSRP.

If anything Pascal cards are actually bouncing upwards right now due to the perceived weakness of Turing, like you can't actually find 1080 Tis at $600 anymore, they're all $630 or higher on Newegg right now, with high-end models at $650-700. Paying $650 for a 1080 Ti on the eve of the Turing launch is loving stupid.

Paul MaudDib fucked around with this message at 21:27 on Sep 19, 2018 |
# ? Sep 19, 2018 21:22 |
|
1gnoirents posted:I understand, but he is implying it should be discounted like it won't happen at all. It's one thing to not include it in your judgement, but it's another thing to say those features are only useful for... developers?

It sounds like he's just saying he can't recommend it now.

quote:Until we see a price drop in the 2080, compelling RTX implementations in an actually relevant game, or depleted stock on the 1080 Ti, there is no strong reason we would recommend the RTX 2080 card.

Seems sensible. It's tough to recommend something sight-unseen; when more RTX implementations actually show up then the recommendation can be revised.

Stickman fucked around with this message at 21:28 on Sep 19, 2018 |
# ? Sep 19, 2018 21:23 |
|
Paul MaudDib posted:That's just like, your opinion man.

The 780 Ti was pretty bad. It ran very hot and was hampered by 3 GB of RAM. Really regretted buying that card. But I didn't buy it for $180, either (full launch price).

Lambert fucked around with this message at 21:26 on Sep 19, 2018 |
# ? Sep 19, 2018 21:23 |
|
Paul MaudDib posted:Also can we please stop using GTA:V as a benchmark now, that engine is all kinds of hosed up and everything can run it at a million fps, we don't need to be benchmarking it in TYOOL 2018.

I don't think we necessarily pay ENOUGH attention to games that are actually widely played when we recommend things in the PC part picking thread or in this thread. Here's Steam's top ten list:

code:
Current    Peak Today    Game
402,047    650,922       Dota 2
385,816    477,031       Counter-Strike: Global Offensive
365,992    994,205       PLAYERUNKNOWN'S BATTLEGROUNDS
 64,689     92,532       Tom Clancy's Rainbow Six Siege
 52,181     55,741       Rocket League
 51,945     57,112       Warframe
 46,905     47,025       Football Manager 2018
 43,632    124,025       MONSTER HUNTER: WORLD
 42,509     44,118       Path of Exile
 40,612     62,834       Grand Theft Auto V

With the exception of PUBG and Monster Hunter (and kind of GTA), there's not anything on here that requires a very high performance graphics card. I know Nvidia has only released their high end, but the amount of games in which this high end stuff matters is pretty slim. On the flipside of that, below are the top graphics cards in gamer systems (I extended the list a little to get an AMD on it, lol):

code:

If you're a developer, what do you target for your mainstream performance? A 960? Maybe a 1060 (3GB? 6GB?)? NVIDIA has pushed up the tent poles, but for many devs, implementing these features might be more work than it's worth given current graphics market share. I do hope that these technologies actually end up lowering the amount of work devs need to do, but for now they'll still need to do all the rasterization stuff for all the games released.

I think the point I'm trying to make is that these releases do nothing to bring the majority of gamers forward into new tech and value. Perhaps we'll see that on the lower end with a 2060 or 2050, but it's not looking good.
|
# ? Sep 19, 2018 21:26 |
|
SwissArmyDruid posted:As unspectacular as the top end of the product stack is, I kind of feel like Nvidia would have been better off holding the 2080Ti to the usual mid-cycle timing.

As others have guessed, the problem is we're somewhat on the cusp of the 7nm generation; it just wouldn't make sense to hold the 2080 Ti for ~6 months when the 7nm refresh would come out just a few months later. I think this is basically an architecture being crammed into 12nm when it really becomes far more viable at 7. It would also mean that with the same timing and the lack of RTX-supported games, without a 2080 Ti the reviews today would basically be "So, 1080 Ti performance, 3 GB less memory, for $200 more". I mean, most reviews point out how absurdly expensive the 2080 Ti is, but at least it makes for some wowza-graphs with nothing touching it performance-wise.
|
# ? Sep 19, 2018 21:28 |
|
Stickman posted:The loss of detail is pretty noticeable in the FFXV review, too

Yeah, those videos actually made me less enthusiastic about DLSS. The Infiltrator demo in particular: the city scene had a lot of shimmering/aliasing while the native was perfectly still. Note that the Candyland video only showed stills from that; you need to download the German hardware site video to see it (Candyland is produced by a PC OEM, mind you, I wouldn't necessarily go to them for the most unbiased angle).

So much we don't know about DLSS - if you enable it, does the source resolution have to stay fixed based on what they submitted to Nvidia? Meaning, could you choose an intermediate resolution based on your own preference, or does Nvidia need to work from that specific res to make DLSS work at all? Who knows. It's also something that would make far more sense for the lower-end cards; for something like a potential 2060/70 it may hold far more value than it does for cards that are already getting close to 60fps at 4k. Based on those videos I'd probably want to turn down some shadow settings first if I wanted 4k 60fps for the few games that couldn't manage it maxed before enabling DLSS.

Happy_Misanthrope fucked around with this message at 21:52 on Sep 19, 2018 |
# ? Sep 19, 2018 21:35 |
|
I understand it's blurrier, but the TAA side is really bad; the aliasing is so bad it's affecting the textures. Look at the headrest and the nearest door edge. Besides, if you pause even a crystal clear video at any specific frame it's always kind of blurry; in motion it looks great. I haven't downloaded any of the comparison videos yet, but the stills don't bother me unless it literally looks blurry in all the frames
|
# ? Sep 19, 2018 21:52 |
|
1gnoirents posted:I understand it's blurrier, but the TAA side is really bad; the aliasing is so bad it's affecting the textures. Look at the headrest and the nearest door edge. Besides, if you pause even a crystal clear video at any specific frame it's always kind of blurry; in motion it looks great. I haven't downloaded any of the comparison videos yet, but the stills don't bother me unless it literally looks blurry in all the frames

If you have a 4k monitor, there's a FF XV 4k video up (running on a 2080). I'm at 1440p, so it's tough to tell what's due to downsampling, what's from the game engine, and what's DLSS.

E: And what's from the video encoding, I suppose.
|
# ? Sep 19, 2018 21:58 |
|
Stickman posted:If you have a 4k monitor, there's a FF XV 4k video up (running on a 2080). I'm at 1440p, so it's tough to tell what's due to downsampling, what's from the game engine, and what's DLSS.

The Germans have better quality videos for comparison:

repiv posted:ComputerBase has uploaded extremely high bitrate captures of FF15 Benchmark and UE4 Infiltrator, comparing native TAA to upscaled DLSS.
|
# ? Sep 19, 2018 22:01 |
|
Buying a 20xx with the idea of future-proofing can't be sensible. This is the first version of RTX, which will most likely be much more refined in the next generation.
|
# ? Sep 19, 2018 22:02 |
|
TheFluff posted:counterpoint: don't spend the better part of a thousand dollars based on something that might potentially happen, jesus loving christ Seriously, it's hilarious seeing the nvidia fanboys like paulie here doing the whole fine wine song and dance how the turntables
|
# ? Sep 19, 2018 22:09 |
|
wow it performed exactly as i expected a card destined for the deep compute market (with functions hastily repurposed for trayracing because retards wouldn't stop asking when the next card is coming out) would weird
|
# ? Sep 19, 2018 22:12 |
|
repiv posted:The Germans have better quality videos for comparison:

Thanks! The native DLSS (2X) looks very nice compared to TAA, and is still a pretty hefty performance boost: https://www.youtube.com/watch?v=ZD6aBykmayc

At least I think that's 2X DLSS - the frame rate doesn't seem to line up with the standard DLSS framerate, and '2X' is mentioned in the preceding paragraph (which I can't read because I'm one of the lazy Americans).

E: I google-translated it and now I'm more confused. They seem to be talking about DLSS2X as if it's upscaling?

Stickman fucked around with this message at 22:17 on Sep 19, 2018 |
# ? Sep 19, 2018 22:13 |
|
3peat posted:Seriously, it's hilarious seeing the nvidia fanboys like paulie here doing the whole fine wine song and dance

nah, 1080 Ti is fine, just not at 85% of the price of the 2080. If Vega 64 was 8% faster than the 1080 and launched at a 15% higher price then reviews would have gone a lot different, but it was more like 8% less performance and 25% higher price, even before we bring the "bundles" into the mix. The bundles put the V64 at $600, at a time when the 1080 was running $400.

If the 5% difference in price-to-performance really matters to you then buy the 1080 Ti, but the 2080 really is not behind much in price-to-performance and is worth a small premium to have the latest/best-supported. We literally know that some of the changes like FP16 are there and are working but need to be coded for.

But I guess AMD put out a bad GPU once and therefore

Paul MaudDib fucked around with this message at 22:54 on Sep 19, 2018 |
# ? Sep 19, 2018 22:17 |
|
Stickman posted:Thanks! The native DLSS (2X) looks very nice compared to TAA, and is still a pretty hefty performance boost:

My interpretation of the Google Translate was that neither the FF15 nor Infiltrator builds they were provided support DLSS2X, so they only tested DLSS upscaling.

Google Translate posted:Both programs have one thing in common: DLSS2X, which is the most visually beautiful mode that adds "KI upscaling" to the native resolution, does not exist. Instead, the normal DLSS is used. The goal is to visually produce a 4K-like graphic, but without rendering in 4K or Ultra HD. That said, the internal resolution is lower than Ultra HD, so the performance increases. The image to be seen should achieve a comparable quality.

german goons please assist
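For intuition, the resolution math the translated quote describes can be sketched in a few lines: render at a lower internal resolution, then reconstruct the display resolution from it. This is only a sketch - a naive bilinear filter stands in for DLSS's trained network, and the 1920x1080 internal resolution is purely an assumption for illustration (as the posts note, Nvidia hasn't disclosed the actual figure).

```python
import numpy as np

def upscale_bilinear(img, out_h, out_w):
    """Naive bilinear upscale of an HxW (grayscale) image to out_h x out_w.

    Stand-in for the 'reconstruct 4K from a lower internal resolution' step;
    DLSS replaces this interpolation with a neural network.
    """
    in_h, in_w = img.shape
    # Map each output pixel back into fractional input coordinates.
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, in_h - 1)
    x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    # Blend the four surrounding input pixels.
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# Assumed internal resolution: 1920x1080, upscaled to 4K output.
internal = np.random.rand(1080, 1920).astype(np.float32)
output = upscale_bilinear(internal, 2160, 3840)
print(output.shape)                  # (2160, 3840)
# The card only shaded a quarter as many pixels as native 4K:
print(internal.size / output.size)   # 0.25
```

That 4x reduction in shaded pixels is where the performance headroom in the benchmarks comes from; the open question in the thread is how much image quality the network recovers versus this kind of dumb interpolation.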
|
# ? Sep 19, 2018 22:19 |
|
Paul MaudDib posted:nah, 1080 Ti is fine, just not at 90% of the price of the 2080.

Your FP16 point was good - the Wolfenstein 2 benchmarks show a ~15% increase going from 1080 Ti -> 2080. I assume FP16 rendering is likely to become more ubiquitous over the next few years?

repiv posted:My interpretation of the Google Translate was that neither the FF15 or Infiltrator builds they were provided support DLSS2X, so they only tested DLSS upscaling.

That makes more sense - it still looks pretty good when scaled back down to 1440p (come on reviewers - post something for us plebs)!

Stickman fucked around with this message at 22:22 on Sep 19, 2018 |
# ? Sep 19, 2018 22:20 |
|
repiv posted:My interpretation of the Google Translate was that neither the FF15 or Infiltrator builds they were provided support DLSS2X, so they only tested DLSS upscaling.

The Google translation is accurate. The next paragraph states that the DLSS rendering resolution of FFXV is unknown. Infiltrator is stated to be "half resolution", but they don't know what that means in terms of actual rendering resolution.
|
# ? Sep 19, 2018 22:25 |
|
Stickman posted:Your FP16 point was good - the Wolfenstein 2 benchmarks show a ~15% increase going from 1080 Ti -> 2080. I assume FP16 rendering is likely to become more ubiquitous over the next few years?

Yeah, the PS4 Pro supports FP16 so adoption is inevitable as devs try to squeeze more performance out of that platform.
|
# ? Sep 19, 2018 22:26 |
|
Stickman posted:Your FP16 point was good - the Wolfenstein 2 benchmarks show a ~15% increase going from 1080 Ti -> 2080. I assume FP16 rendering is likely to become more ubiquitous over the next few years?

AMD is going to be pushing it for consoles, and NVIDIA will be pushing it too now that it helps sell their latest cards, so it's a very safe bet. Ironically this is going to end up giving Vega the same 10-15% boost relative to Pascal. Still not nearly enough to catch the 1080 Ti but every bit helps.

This was one of the few easy/obvious gains in Vega and I kinda suspected that NVIDIA would pick it up next cycle too. There aren't too many easy uarch wins left anymore.
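Since the thread keeps coming back to FP16, here's a rough numpy sketch of the trade being discussed (CPU-side, not shader code - just an illustration): half precision stores each value in 2 bytes instead of 4, halving memory traffic, and its roughly 3 decimal digits of precision are still finer than an 8-bit color step.

```python
import numpy as np

# Half precision stores each shaded value in 2 bytes instead of 4,
# halving memory traffic for the same number of values:
print(np.dtype(np.float16).itemsize, np.dtype(np.float32).itemsize)  # 2 4

# The cost: float16 keeps only 10 mantissa bits (~3 decimal digits).
# For color math that's usually invisible - the worst-case rounding
# error on [0, 1] is still smaller than one 8-bit color step (1/255):
vals = np.linspace(0, 1, 10000, dtype=np.float32)
roundtrip = vals.astype(np.float16).astype(np.float32)
max_err = np.abs(roundtrip - vals).max()
print(max_err < 1 / 255)  # True
```

The actual 10-15% gains come from GPUs like Vega and Turing issuing packed FP16 math at double the FP32 rate, which numpy can't demonstrate; this only shows why the precision loss is acceptable for rendering work.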
|
# ? Sep 19, 2018 22:27 |
|
1gnoirents posted:I like Steve and I prefer his videos to any other GPU source of information out there, but I think he's really stretching the "this is good for developers..." angle. It's possible all the raytracing games fall through or maybe the performance hit will be crazy, but these games are coming out very soon. He and others are putting zero value in RTX features because they aren't available now. I'd agree with that (much like I did with DX12) if it weren't for the release dates of these major AAA games. So that is to say I don't agree with that at all. At worst it's an unknown, but I'm pretty sure we'll know sooner rather than later.

It’s easy to see raytracing going nowhere because it’s a PCMR feature. It’s presently money invested exclusively in the PC edition of a game that contributes zero to the console release. When DirectXbox 12 is released, you’ll see better adoption.
|
# ? Sep 19, 2018 22:31 |
|
It is a little hilarious to me that FP16 rendering is becoming the norm. I remember when ATI, in the last year or so before AMD retired the branding entirely, got in trouble for using FP16 demotion to cheat their way to better scores over Nvidia.
|
# ? Sep 19, 2018 22:37 |
|
The $430 I spent on a 1070 over 2 years ago seems like a pretty good deal in hindsight
|
# ? Sep 19, 2018 22:40 |
|
SwissArmyDruid posted:It is a little hilarious to me that FP16 rendering is becoming the norm. I remember when ATI, in the last year or so before AMD retired the branding entirely, got in trouble for using FP16 demotion to cheat their way to better scores over Nvidia.

The controversy there was that ATI wasn't using FP16; when the game created an FP16 render target, ATI's driver silently used an even lower precision R11G11B10 format instead. FP16 render targets are still widely used now, but shaders operate on FP32 and throw away the extra precision on write. The new poo poo with Vega/Turing is directly operating on FP16 values.
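The bandwidth motive behind that cheat is easy to put in numbers. A sketch, assuming the standard layouts (FP16 RGBA at 16 bits per channel, versus the packed 32-bit R11G11B10 float format whose channels carry 6/6/5 mantissa bits and no sign bit):

```python
# Bytes per pixel for the two render-target formats (standard layouts assumed):
fp16_rgba_bytes = 4 * 16 // 8            # four FP16 channels -> 8 bytes
r11g11b10_bytes = (11 + 11 + 10) // 8    # one packed 32-bit word -> 4 bytes
print(fp16_rgba_bytes, r11g11b10_bytes)  # 8 4 - the swap halved bandwidth

# Precision cost: the spacing between representable values in [1, 2)
# is 2^-mantissa_bits. FP16 has 10 mantissa bits; R11/G11 have only 6.
def ulp_near_one(mantissa_bits):
    return 2.0 ** -mantissa_bits

print(ulp_near_one(6) / ulp_near_one(10))  # 16.0 - 16x coarser quantization
```

So the demotion bought a 2x bandwidth saving at the cost of 16x coarser red/green channels (and no sign bit), which is why it read as a benchmark cheat rather than an optimization.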
|
# ? Sep 19, 2018 22:42 |
|
There is actually one EVGA review out right now: JayzTwoCents has one. https://www.youtube.com/watch?v=WDrpsv0QIR0&t=635s

"The EVGA is anywhere from 10 to 15 C cooler at the same frequencies than the founders edition because again it's a 2.75 slot."

His theory on why the FE cards aren't better thermally than the EVGA is the shroud. Where the shroud dips down past the top of the fins, it blocks the potential air flow from the fans.
|
# ? Sep 19, 2018 22:47 |
|
Also I’m watching Steve’s review of the 2080ti now, and I’m starting to think Nvidia’s mistake was not calling the 2080 a 2070 and the 2080ti a 2080. And then if they want to performance bump later they totally could.
|
# ? Sep 19, 2018 22:49 |
|
ufarn posted:MSI has 8+8 for 2080 and 6+8+8 for 2080ti. https://twitter.com/JayzTwoCents/status/1042529712331878400
|
# ? Sep 19, 2018 22:55 |
|
SlayVus posted:There is actually one EVGA review out right now. JayzTwoCents has one.

> tests Doom to see how Turing performs under Vulkan
> makes a point of using SMAA instead of TSSAA

But TSSAA is the only anti-aliasing mode that uses async compute
|
# ? Sep 19, 2018 22:59 |
|
Craptacular! posted:Also I’m watching Steve’s review of the 2080ti now, and I’m starting to think Nvidia’s mistake was not calling the 2080 a 2070 and the 2080ti a 2080. And then if they want to performance bump later they totally could.

They probably wanted to keep the mainstream marketing in a reasonable price range. The Ti is just a Titan with a new marketing name to exploit the new tier of ultra nerd that Nvidia discovered during the mining boom, and boy has it worked based on the attention the thing is getting. If they would have called it the Titan T, they would have gotten like 20% of this press and hype.
|
# ? Sep 19, 2018 23:00 |
|
Craptacular! posted:Also I’m watching Steve’s review of the 2080ti now, and I’m starting to think Nvidia’s mistake was not calling the 2080 a 2070 and the 2080ti a 2080. And then if they want to performance bump later they totally could.

They'd likely have to make another, even bigger chip then, and they probably decided the yields wouldn't be worth it at Ti prices. They did leave room for a Titan, but I wouldn't be surprised if that got skipped this gen.
|
# ? Sep 19, 2018 23:02 |
|
All about on par with what I expected. However, I think I am content with my 980 Ti for at least another 6+ months. By the time I need to replace it, I think it will be time for a whole new rebuild anyway. For now 1440p/100Hz ultrawide G-Sync is still above 60 FPS 99% of the time, so yeah. That late VR upgrade to the 980 Ti wasn't a terrible move back in, what, May 2016?
|
# ? Sep 19, 2018 23:03 |
|
Craptacular! posted:It’s easy to see raytracing going nowhere because it’s a PCMR feature. It’s presently money invested exclusively in the PC edition of a game that contributes zero to the console release.

Well, I hope not. I'd be more skeptical if it wasn't ~raytracing~. It's been the goal for as long as I can personally remember after literal real-time 3D graphics came around. If not, I guess I lose the gamble with the fastest GPU ever made that I will just resell in 11 months for $975
|
# ? Sep 19, 2018 23:16 |
|
1gnoirents posted:Well, I hope not. I'd be more skeptical if it wasn't ~raytracing~. It's been the goal for as long as I can personally remember after literal real-time 3D graphics came around. If not, I guess I lose the gamble with the fastest GPU ever made that I will just resell in 11 months for $975

I don’t know if 11 months is going to be enough time, but you’ll see increased DX12 adoption when you see games on this series of consoles that are produced with the quiet expectation to re-release on the next generation.
|
# ? Sep 19, 2018 23:45 |
|
|
Craptacular! posted:I don’t know if 11 months is going to be enough time, but you’ll see increased DX12 adoption when you see games on this series of consoles that are produced with the quiet expectation to re-release on the next generation.

I can't have a video card for more than a year otherwise it gets old

gently caress, that made me sadder than I thought it would. I should mention again I pre-ordered within 1 hour when you could, so this is probably bad news for a lot of people, though I suppose not unexpected

1gnoirents fucked around with this message at 23:57 on Sep 19, 2018 |
# ? Sep 19, 2018 23:48 |