Got Watch Dogs 2 with my last card, and even though I have no hopes for Legion being nearly as good, it's a neat bit of recursion.
|
|
# ? Sep 2, 2020 00:20 |
|
Out of curiosity I went to try and trace back my GPUs... only got as far back as a used Ti 4600, but from there I think I hit a 7900, 8800 GTS, GTX 460, GTX 670 and then GTX 1080. Surprised at past me for upgrading so "quick" from 460 to 670, but maybe that was a big jump.
|
# ? Sep 2, 2020 00:24 |
|
No matter what initially got Nvidia on Samsung, one thing is for sure: Nvidia is making out like loving bandits slinging the 3080 at $699. There's no way they're just dumping their margins to hold off AMD. In all likelihood they got a killer deal from Samsung.
|
# ? Sep 2, 2020 00:26 |
|
K8.0 posted:HWUnboxed just did a video on CPU performance for benching the 30 series. At 4k the 3950x tends to pull ahead in a lot of DX12/Vulkan titles. Not surprising, with newer APIs/newer engines having better multithreading support, having a bunch more cores can help avoid any waiting for the CPU when you're GPU bottlenecked. It's not something I previously considered very much, but it's worth thinking about. The 3950X has 2 extra cores and costs 200 bucks more than the 10900K, but it just barely squeezes out a few more frames in some games and gets beaten or matched in DX12, which means the 3950X isn't really a great buy. You're also counting on game devs putting in the extra effort (lol) to optimize for multithreading just to get barely equal performance, which is a lot less likely to happen for non triple-A titles.
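The value argument above boils down to a cost-per-frame comparison. A minimal sketch, with placeholder prices and FPS numbers that are purely illustrative (not taken from the video):

```python
def cost_per_frame(cpu_price_usd: float, avg_fps: float) -> float:
    """Dollars spent per average frame delivered -- lower is better value."""
    return cpu_price_usd / avg_fps

# Hypothetical numbers for illustration only: a pricier CPU that is
# only a few frames faster ends up costing more per frame.
value_10900k = cost_per_frame(500.0, 140.0)
value_3950x = cost_per_frame(700.0, 144.0)
assert value_3950x > value_10900k  # more dollars per frame despite more cores
```

The comparison holds for any pair of prices/framerates where the price gap is proportionally larger than the framerate gap.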
|
# ? Sep 2, 2020 00:27 |
|
shrike82 posted:I don't expect AMD to beat Nvidia on pure performance but there's definitely an opening for them on some kind of efficiency metric versus a 320W 3080 esp since they have both TSMC and HBM. I'm not only curious about AMD's raw performance but especially about their raytracing solution, because both next-gen consoles prominently feature RT on the RDNA 2 design. It will be interesting to see how AMD and Nvidia end up in direct raytracing comparisons and benchmarks. About the aggressive pricing: the 3080 at 699 is interesting, as it now sits EXACTLY in the performance-to-price bracket where many would have expected AMD's counter to the 3080. The Radeon VII had a great pricing environment, as the 2080 Ti was beyond 1200 €/$ and the 2080s were around 800-900 €/$, so it was an easy undercut at 700 €/$. That breathing room is denied by Nvidia now, as their high-end Ampere „flagship“ GPU, the 3080 (the 3090 is targeted at enthusiasts, designers, and disposable-income nerds), is now sold for less than the expected (feared) 800-1000 moneys. Custom street prices will be 100-200 more, depending on demand and AIC solution obviously. Mr.PayDay fucked around with this message at 00:32 on Sep 2, 2020 |
# ? Sep 2, 2020 00:28 |
|
The Radeon VII was weird. An absolutely gigantic portion of its total cost was just in the HBM alone. If they hadn't gone HBM that card could arguably have been a lot more successful at a lower price point.
|
# ? Sep 2, 2020 00:30 |
|
Encrypted posted:The 3950X has 2 extra cores and costs 200 bucks more than the 10900K, but it just barely squeezes out a few more frames in some games and gets beaten or matched in DX12, which means the 3950X isn't really a great buy. the real reason to buy high end amd chips is 64mb of cache. 2020 is the first time ever i'm playing stellaris into late game with gameplay staying smooth, despite all the excuses from paradox about how their game can't be made any faster
|
# ? Sep 2, 2020 00:30 |
|
Mr.PayDay posted:Herkelman (CVP & GM AMD Radeon) just answered with a 🤔 It has been less than 12 hours since Huang walked on stage. Relax, RDNA 2 leaks will be coming soon. Certainly before the 3080 launches. You will know AMD is royally hosed if you don't hear anything in the next 2 weeks. Subjunctive posted:How do you think that AMD will manage to exceed NVIDIA on efficiency? What evidence has there been that they'll even be able to hit the 3080 mark ignoring DLSS capability? Many second-place market contenders have learned that just wanting really badly to catch the leader doesn't get you there, and AMD's record on this stuff--especially in terms of running cool--doesn't make me at all confident that they'll catch the 3080 in 2020 at least. TSMC's 7nm node is already better than the Samsung node that Nvidia is using for gaming Ampere. This is why they will run quieter. The latest leaks have RDNA 2 using TSMC's N7P (not EUV), which can supply another 10% energy saving or 7% clock speed, your choice. I haven't heard of RDNA 2 being moved to TSMC 7nm EUV (N7+), which would be even better. It is almost certain the node advantage will let RDNA 2 win the efficiency crown. DLSS remains Nvidia's trump card, but DLSS is not in every title yet. I am optimistic about mass adoption by next year, but that is not certain. Without DLSS, though, there is no reason to doubt that AMD can crank out something that can match the 3080. They matched the 2070 with RDNA 1's 5700 XT and cancelled their top RDNA 1 die, which was scheduled to compete with the 2080 (the original 'Big Navi'). Maybe some day we will know why, but it might have been that RDNA 1 had issues with power consumption despite being on TSMC N7. That's probably why they didn't come out with their rumored 72 CU RDNA 1 card. But now we know the Xbox and PS5 have RDNA 2 on an APU that maxes out at 300W.
A discrete RDNA 2 GPU can have 300W to itself (less power than the 3080), so a full-sized 80 CU RDNA 2 GPU should be able to match or come close to the 3080, which is about 25% better than a 2080 Ti. I believe in Ampere and will probably be on the 3080 train after reviews come out and we get a better idea of what RDNA 2 is or isn't. But to act like they can't catch the 3080 is hubris. Jensen wouldn't have been nearly as aggressive on pricing today if he himself didn't think RDNA 2 was a threat.
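The 80 CU scaling argument above can be made concrete with back-of-envelope peak-throughput math. A rough sketch, assuming RDNA's 64 shaders per CU and 2 FP32 ops per clock (the Series X figures of 52 CUs at 1825 MHz are public; the 80 CU part and its clock are the post's speculation):

```python
def fp32_tflops(cus: int, clock_mhz: float, shaders_per_cu: int = 64) -> float:
    """Peak FP32 throughput: shaders x 2 ops/clock x clock frequency."""
    return cus * shaders_per_cu * 2 * clock_mhz * 1e6 / 1e12

series_x = fp32_tflops(52, 1825)  # ~12.1 TFLOPS, matching the published Series X spec
big_navi = fp32_tflops(80, 1825)  # ~18.7 TFLOPS, *if* clocks hold when scaling up
```

Peak TFLOPS is of course not a direct proxy for gaming performance across architectures, which is why the post still hedges on "match or come close."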
|
# ? Sep 2, 2020 00:32 |
|
Taima posted:The Radeon VII was weird. An absolutely gigantic portion of its total cost was just in the HBM alone. If they hadn't gone HBM that card could arguably have been a lot more successful at a lower price point. Don't forget the SHROUD. I've a feeling Raja's a 'swing for the fences' guy. He just starts swinging before the ball is even thrown.
|
# ? Sep 2, 2020 00:32 |
|
OMG 1000 posts. Where can I go to reserve this fabled $700 3080? Preferably somewhere I can get an EVGA version since I think the XC3 is the only one that will fit. Just post all the reservation sites anyone.
|
# ? Sep 2, 2020 00:32 |
|
Taima posted:The Radeon VII was weird. An absolutely gigantic portion of its total cost was just in the HBM alone. If they hadn't gone HBM that card could arguably have been a lot more successful at a lower price point. Radeon VII makes more sense when you realize it was a Titan-style prestige card that was never intended to sell units, just to blunt the “lol it's 2019 and AMD still has nothing to compete with 1080 Ti let alone anything newer” factor until they could get Navi out six months or so later. They absolutely knew Navi was going to be better and cheaper and nobody should really have bought VII, that wasn't the point. It was leftover trash MI50 compute cards that they couldn’t sell because ROCm is a loving mess and welp might as well see what we can get for them from the gamers. Easy to turn into a "prestige" product. (Bearing in mind this is exactly how Titans got started as well!) It also probably would not have existed at all if NVIDIA had come out swinging with Turing pricing like they are doing on Ampere. Paul MaudDib fucked around with this message at 00:47 on Sep 2, 2020 |
# ? Sep 2, 2020 00:32 |
|
https://news.developer.nvidia.com/new-features-to-dlss-coming-to-nvidia-rtx-unreal-engine-4-branch/ quote:Improved VR support. Maintaining the 2880×1600 resolution of top-end VR head mounted displays while delivering the recommended 90 FPS has been made easier with DLSS. I think this is the first time Nvidia has talked about DLSS being suitable for VR?
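As a rough sketch of why DLSS helps here: each mode renders at a reduced internal resolution per axis and upscales to the target. The scale factors below are the commonly published DLSS 2.0 ratios; Nvidia hasn't said what a VR mode would actually use, so treat the numbers as illustrative:

```python
def internal_resolution(target_w: int, target_h: int, scale: float) -> tuple:
    """Per-axis render resolution before DLSS upscales to the target."""
    return round(target_w / scale), round(target_h / scale)

# 2880x1600 across both eyes; Quality mode renders at 1/1.5 per axis,
# Performance mode at 1/2 per axis.
quality = internal_resolution(2880, 1600, 1.5)      # (1920, 1067)
performance = internal_resolution(2880, 1600, 2.0)  # (1440, 800)
```

Halving each axis quarters the shaded pixel count, which is where the headroom for a locked 90 FPS would come from.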
|
# ? Sep 2, 2020 00:33 |
|
Zero VGS posted:OMG 1000 posts. Where can I go to reserve this fabled $700 3080? Preferably somewhere I can get an EVGA version since I think the XC3 is the only one that will fit. Just post all the reservation sites anyone. Preorders aren't live yet and probably won't be until 9/17.
|
# ? Sep 2, 2020 00:33 |
|
Yeah, AMD's GPU division seems like it's getting back on its feet after Raja left. It's hilarious how he's failing upwards and is a serious contender to take over as CEO at Intel.
|
# ? Sep 2, 2020 00:33 |
|
shrike82 posted:Yeah AMD's GPU division seems like it's getting back on its feet after Raja left. Their *hardware* is. They desperately need to find a driver team worth a drat and send them a few dump trucks full of money.
|
# ? Sep 2, 2020 00:35 |
|
repiv posted:I think this is the first time Nvidia has talked about DLSS being suitable for VR? yeah afaik there's still a lot of work that needs to be done (not to mention having games actually support it), but it's definitely coming sometime during the 3000 generation if it's coming.
|
# ? Sep 2, 2020 00:35 |
|
For anyone interested in UK/EU pricing, Overclockers is adding models: https://www.overclockers.co.uk/pc-components/graphics-cards/nvidia/geforce-rtx-3080. The fancier 3080s will go for £750-800. I would probably have been happy with a £650 FE but the illustration of hot air being shitted into the position of the CPU air cooler ain’t great.
|
# ? Sep 2, 2020 00:36 |
|
BIG HEADLINE posted:Their *hardware* is. They desperately need to find a driver team worth a drat and send them a few dump trucks full of money. i'm honestly waiting for them to scrap the windows driver and just run a wrapper around the linux one, in a reverse "old linux ATi catalyst" way
|
# ? Sep 2, 2020 00:37 |
|
Encrypted posted:The 3950X has 2 extra cores and costs 200 bucks more than the 10900K, but it just barely squeezes out a few more frames in some games and gets beaten or matched in DX12, which means the 3950X isn't really a great buy. The story in that isn't so much the 3950X as it is the PCIe 4.0 GPU gaining a relative performance advantage at 4K, while on the PCIe 3.0 GPU it was a statistical tie. That's the relevant part for people looking forward.
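For context on the PCIe comparison: Gen4 doubles Gen3's per-lane signaling rate (16 vs 8 GT/s, both with 128b/130b encoding), so an x16 link goes from roughly 15.75 to 31.5 GB/s per direction. A quick sketch of the math:

```python
def pcie_gbps_per_direction(gt_per_s: float, lanes: int) -> float:
    """Usable GB/s per direction: raw rate x 128b/130b encoding overhead / 8 bits."""
    return gt_per_s * (128 / 130) / 8 * lanes

gen3_x16 = pcie_gbps_per_direction(8, 16)   # ~15.75 GB/s
gen4_x16 = pcie_gbps_per_direction(16, 16)  # ~31.51 GB/s
```

Whether games actually saturate either link is exactly what the benchmarks in question were probing.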
|
# ? Sep 2, 2020 00:42 |
|
MikeC posted:It has been less than 12 hours since Huang walked on stage. Relax, RDNA 2 leaks will be coming soon. Certainly before the 3080 launches. You will know AMD is royally hosed if you don't hear anything in the next 2 weeks. RDNA was basically at parity vs turing efficiency wise, and that was tsmc7 vs 12. I don't think the situation improves vs ampere on Samsung 8. They may still be on a better process, but that gap is shrinking.
|
# ? Sep 2, 2020 00:43 |
|
Encrypted posted:The 3950X has 2 extra cores and costs 200 bucks more than the 10900K, but it just barely squeezes out a few more frames in some games and gets beaten or matched in DX12, which means the 3950X isn't really a great buy. The point isn't "you should buy a 3950X," it's "the 3950X is the proper platform for benching Ampere." Same reason that you should bench CPUs with a 3090; it doesn't mean everyone needs a 3090.
|
# ? Sep 2, 2020 00:45 |
|
BIG HEADLINE posted:Their *hardware* is. They desperately need to find a driver team worth a drat and send them a few dump trucks full of money. I was going to ask — one of the historical differentiators all the way back to the ATI R100/R200 days was that driver quality differed greatly. I've never first-hand owned or run an ATI/AMD GPU — is this historical FUD or do AMD drivers still leave a lot to be desired vs NV?
|
# ? Sep 2, 2020 00:46 |
|
shrike82 posted:odd that they got stuck on SS 8nm and G6X (as opposed to HBM) I wonder if it came down to them worrying about supply. As it's playing out, their consumer cards aren't competing for fab capacity or memory with a100. Otherwise hbm2e makes all the sense in the world if you want to push that much bandwidth.
|
# ? Sep 2, 2020 00:47 |
|
I think a lot of people hoping to build a system for Cyberpunk are going to be running AMD cards, because it's very likely that nobody here will be able to get an Ampere card at release due to bots and demand. Before a 3080 reaches your hands you will need to pay the reseller tax, anywhere from 200 to 800 depending on how badly you need it. Finding a 3080 or 3090 in October will be about as easy as finding toilet paper in April.
|
# ? Sep 2, 2020 00:47 |
|
movax posted:I was going to ask — one of the historical differentiators all the way back to the ATI R100/R200 days was that driver quality differed greatly. I've never first-hand owned or run an ATI/AMD GPU — is this historical FUD or do AMD drivers still leave a lot to be desired vs NV? early amd drivers weren't good but they fixed their poo poo eventually. now it's FUD about 95% of the time, but now and then it isn't and people fall over each other calling the drivers trash for 5 years every time and here we are
|
# ? Sep 2, 2020 00:49 |
|
Paul MaudDib posted:Radeon VII makes more sense when you realize it was a Titan-style prestige card that was never intended to sell units, just to blunt the “lol it's 2019 and AMD still has nothing to compete with 1080 Ti let alone anything newer” factor until they could get Navi out six months or so later. They absolutely knew Navi was going to be better and cheaper and nobody should really have bought VII, that wasn't the point. radeon vii was purely to satisfy investors that there was, in fact, a 7nm consumer gpu it was clearly a shitshow in every other way except as a cheap MI50
|
# ? Sep 2, 2020 00:52 |
|
Truga posted:early amd drivers weren't good but they fixed their poo poo eventually. now it's FUD about 95% of the time, but now and then it isn't and people fall over each other calling the drivers trash for 5 years every time and here we are nah the RDNA 1 drivers still suck rear end
|
# ? Sep 2, 2020 00:52 |
|
movax posted:I was going to ask — one of the historical differentiators all the way back to the ATI R100/R200 days was that driver quality differed greatly. I've never first-hand owned or run an ATI/AMD GPU — is this historical FUD or do AMD drivers still leave a lot to be desired vs NV? Some people are legitimately screwed over. The 5700XT was a great example. The mainstream reviewers never had issues on their setups. I had occasional black screens until the big Xmas update. For a small minority of people, problems continue to persist. There is a dude on Youtube who consistently tests every single release, beta or otherwise, and can report bugs on one setup or another. VorpalFish posted:RDNA was basically at parity vs turing efficiency wise, and that was tsmc7 vs 12. Yes, but they obviously fixed that for RDNA 2, since the APU fits in a 300W package of which the GPU only gets a portion. So unless you have magic information we don't, saying they can't keep the same efficiency when scaling past 56 CUs at the Xbox clock speeds, you are really just blowing smoke. MikeC fucked around with this message at 00:54 on Sep 2, 2020 |
# ? Sep 2, 2020 00:52 |
|
the trailer for CP2077 with RTX on looked... meh? bolting on RT at a late stage is going to limit the game's visual flair - especially with a game that's been in development for a while don't forget that it was originally slated to launch at the start of this year (i.e., with Turing cards). people are going to be fine as long as they don't feel the need to run the game at "ultra" settings
|
# ? Sep 2, 2020 00:53 |
I am curious to know what the price will be here in Norway. Demand is not quite as high, so there's a higher chance of getting it closer to launch, but shit's also more expensive here because we're richer than y'all.
|
|
# ? Sep 2, 2020 00:54 |
|
shrike82 posted:as long as they don't feel the need to run the game at "ultra" settings lol
|
# ? Sep 2, 2020 00:56 |
|
Here is a post from the EVGA forums detailing their Ampere products: https://forums.evga.com/Introducing-the-EVGA-GeForce-RTX-30-Series-with-iCX3-Technology-m3072847-p7.aspx#3073192 EVGA_TechLeeM posted:Currently no pre-order planned. Similarly, no pricing is available at this time.
|
# ? Sep 2, 2020 00:57 |
|
Black Griffon posted:I am curious to know what the price will be here in Norway. Demand is not quite as high, so there's a higher chance of getting it closer to launch, but shits also more expensive here because we're richer than y'all. https://www.nvidia.com/nb-no/geforce/buy/
|
# ? Sep 2, 2020 00:57 |
|
Kraftwerk posted:Here is a post from the EVGA Forums detailing their ampere product: So the idea in the hybrids is to watercool GPU, air cool VRM/RAM, and then the other hybrid is a full waterblock?
|
# ? Sep 2, 2020 00:59 |
|
shrike82 posted:the trailer for CP2077 with RTX on looked... meh? bolting on RT at a late stage is going to limit the game's visual flair - especially with a game that's been in development for a while Something about 2077 in general feels off and I just can’t tell if it’s the gameplay not being good enough to carry everything else or if everything else will carry the gameplay. Like a lot of it looks real good, but the few snippets of gunplay look... stiff? Same with the driving.
|
# ? Sep 2, 2020 00:59 |
|
Happy Noodle Boy posted:Something about 2077 in general feels off and I just can't tell if it's the gameplay not being good enough to carry everything else or if everything else will carry the gameplay. the fact that it's first-person?
|
# ? Sep 2, 2020 01:00 |
oh hey there we go. ... that's a lot of kroner.
|
|
# ? Sep 2, 2020 01:00 |
|
Happy Noodle Boy posted:Like a lot of it looks real good but the few snippets of gunplay looks... stiff? Same with the driving. the combat in TW3 sucked and that was their third attempt at it their first go at first person shooter combat probably won't be amazing
|
# ? Sep 2, 2020 01:00 |
|
Anyone have experience with preordering from Microcenter? Wondering if trying to get a card through them will be easier than directly through Nvidia.
|
# ? Sep 2, 2020 01:01 |