|
Taima posted:- Some kind of memory compression technology that will reduce the need for high VRAM cards yeah right
|
# ? May 4, 2020 09:05 |
|
|
|
VRAM is cheap.
|
# ? May 4, 2020 09:21 |
|
Maybe if it’s direct NVMe access with hardware decompression, but it seems like that would require motherboard support? They’ve been working on it for HPC at least. They probably wouldn’t describe it as a “compression technology”, though.
|
# ? May 4, 2020 09:24 |
|
Stickman posted:Maybe if it’s direct NVMe access with hardware decompression, but it seems like that would require motherboard support? They’ve been working on it for HPC at least. They probably wouldn’t describe it as a “compression technology”, though.

For the record, NVIDIA GPUDirect NVMe has existed for the best part of a decade, and AMD just tried to do a thing with Vega in like 2017. You've even been able to do remote DMA access to NVMe on a completely different system in hard-realtime at full data transfer rate for like a decade now. Copyright 2012: "RDMA accelerated communication with storage devices" http://developer.download.nvidia.com/devzone/devcenter/cuda/docs/GPUDirect_Technology_Overview.pdf

What AMD did was put a PLX chip on a card, like any average dual-GPU card, and sat a GPU and an m.2 drive behind it. It leeches 4 GB/s of bandwidth from the shared connection (that would otherwise have been served by the host) and lets you put an m.2 behind a shared x16 link. It's the new Radeon SSG (tm).

It doesn't really work in any case, because "full NVMe transfer rate" is like 4 GB/s compared to hundreds of GB/s for normal VRAM. You can't sustain performance-significant amounts of data access over a 4 GB/s bus, whether that's host memory (HBCC lol) or another peer PCIe NVMe device, local or remote.

(Does anyone remember HBCC? Anyone want to do a benchmark revisit on how much it lets you "use system RAM as VRAM" these days? Boy, that AMD marketing was sure "honest"...)

Paul MaudDib fucked around with this message at 09:58 on May 4, 2020 |
# ? May 4, 2020 09:35 |
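A quick back-of-the-envelope on the bandwidth gap described above. The numbers here are illustrative round figures (PCIe 3.0 x4 NVMe vs a GDDR6 card in the 2080 class), not measurements of any specific product:

```python
# Hypothetical streaming workload: how long does it take to touch 256 MB of
# assets over each link, versus a 60 fps frame budget?
NVME_GBPS = 4.0      # roughly "full NVMe transfer rate" over PCIe 3.0 x4
VRAM_GBPS = 448.0    # e.g. GDDR6 on a 2080-class card; "hundreds of GB/s"

def transfer_ms(size_gb: float, bandwidth_gbps: float) -> float:
    """Milliseconds to move size_gb over a link of bandwidth_gbps."""
    return size_gb / bandwidth_gbps * 1000.0

frame_budget_ms = 1000.0 / 60.0            # ~16.7 ms per frame at 60 fps
over_nvme = transfer_ms(0.25, NVME_GBPS)   # 62.5 ms: nearly 4 frame budgets
over_vram = transfer_ms(0.25, VRAM_GBPS)   # well under 1 ms

print(f"NVMe: {over_nvme:.1f} ms, VRAM: {over_vram:.2f} ms, "
      f"budget: {frame_budget_ms:.1f} ms")
```

Which is the whole argument in two lines: any working set you actually touch every frame has to live in VRAM, because the 4 GB/s pipe blows the frame budget by itself.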
|
Taima posted:- Some kind of memory compression technology that will reduce the need for high VRAM cards Ha. quote:- September launch Ha! quote:- No more login for Geforce Experience HA!
|
# ? May 4, 2020 09:54 |
|
oh, this is Moore's Law Is Dead. Guy is part of the AMD vlogosphere. Like RedGamingTech, AdoredTV, Mindblanktech, nerdtechgasm (god I can't even keep all the AMD tech-orgasm-themed channels straight anymore), etc etc. "AMD is totally going to own NVIDIA this time guys" is like, his channel's whole premise, along with all those other channels. Sooner or later it has to be right.
|
# ? May 4, 2020 10:01 |
|
Paul MaudDib posted:oh, this is Moore's Law Is Dead. I was confused why he was going on about how "this was going to be the most competitive Fall in living memory" when Nvidia seems like they're pulling a new rabbit out of their hat every week and AMD has . . . RDNA2 coming soon. Someday. (Granted, I don't know what RDNA2 actually entails, but I already got the impression that DLSS 2.0 was a "kick 'em while they're down" surprise and even he made it sound like Ampere is going to roll out with enough goodies that DLSS 2.0 will be a distant memory by then)
|
# ? May 4, 2020 11:43 |
Lockback posted:The fact that it's EVGA is why it took you this long to realize you've had so many RMAs. Yeah that is true. I am looking down a double whammy of issues at the moment where my RAM can't go above 2133MHz regardless of timing, plus this video card issue. Lowering the RAM timing fixes the stuttering but it is affecting the overclock I have on my 8600k. G.Skill is the biggest pain in the rear end to deal with. It has a lifetime warranty, sure, but they told me that the RAM was optimized for AMD and my usage on Z370 will vary. EVGA has been ace the whole time so no complaints from me. I am replacing the 16GB of G.Skill RAM with 32GB of Corsair RAM so I would have better tech support. Again, Corsair support is so good that they make the RMA process trivial compared to other vendors.
|
|
# ? May 4, 2020 12:30 |
|
Zarin posted:I was confused why he was going on about how "this was going to be the most competitive Fall in living memory" when Nvidia seems like they're pulling a new rabbit out of their hat every week and AMD has . . . RDNA2 coming soon. Someday. (Granted, I don't know what RDNA2 actually entails, but I already got the impression that DLSS 2.0 was a "kick 'em while they're down" surprise and even he made it sound like Ampere is going to roll out with enough goodies that DLSS 2.0 will be a distant memory by then) I feel like it’s been this way for a while. AMD GPUs are great in theory, not so much in practice.
|
# ? May 4, 2020 12:32 |
|
CaptainSarcastic posted:Starting to seriously consider getting an RTX 2070. Price-wise I'm seeing some around the price of a 2060 Super, and they perform slightly better. The 2070 Super is priced just high enough that I am leaning toward the stock 2070 since the pricing is looking a lot better. I'm no expert, but between DLSS (however limited the game support is at the moment) and things like RTX Voice and of course ray tracing itself, Nvidia does seem to be ahead of the game if you can at all afford an RTX card, even if we disregard all the AMD driver issues.
|
# ? May 4, 2020 12:41 |
|
That video reeks of someone with limited knowledge making poo poo up, there's absolutely no way Nvidia can get DLSS working as a generic thing that overrides any TAA implementation. Unlike MSAA, TAA is implemented purely in shaders so the driver has no indication of where the TAA is happening (or if it's happening at all) or how the input data is laid out. The best they could do is per-game profiles to splice DLSS in place of TAA for some games.
|
# ? May 4, 2020 13:41 |
|
Paul MaudDib posted:oh, this is Moore's Law Is Dead. Whatever about the veracity of the rumours, he did present them in a neutral light before getting into more biased comparisons with AMD. I was surprised to see DLSS 3.0 being mentioned when 2.0 is barely out. Is there even a list of games that support/will support 2.0?
|
# ? May 4, 2020 15:25 |
|
Wolfenstein Youngblood, Control, Minecraft, Deliver Us The Moon, and MechWarrior 5, IIRC
|
# ? May 4, 2020 15:30 |
|
How was this tiny 2011 eGPU supposed to actually accelerate anything over a USB3 interface? : https://www.youtube.com/watch?v=mEcWj52NCDU
|
# ? May 4, 2020 19:54 |
|
Maybe it wasn't USB3? The dock's connector is designed so it only fits in that one port on that one laptop, so they could have done some weird proprietary signalling to jam PCIe through a USB3 port.
|
# ? May 4, 2020 20:16 |
|
It used a proprietary Thunderbolt implementation. It was also very bad. If you thought performance on the internal display was questionable with TB3 eGPU boxes, now let's do it on something with 25% of the bandwidth (being charitable) and a non-standard implementation that never got a driver update.
|
# ? May 4, 2020 20:20 |
|
Paul MaudDib posted:oh, this is Moore's Law Is Dead. I remember a video where he 'revealed' (this was like a month before the Xbox Series X's specs were actually introduced) that the next Xbox would be upgradable, that you'd be able to swap out the GPU through a daughtercard
|
# ? May 4, 2020 20:39 |
|
Shaocaholica posted:How was this tiny 2011 eGPU supposed to actually accelerate anything over a USB3 interface? : Maybe AMD could have actually done this with Adreno. edit: You know what, remembering this has given me a new take on AMD and RDNA2, and on when, if ever, AMD will move to chiplet-based APUs. AMD's about to get a lot of money from Samsung to do GPUs for their Exynos processors, right? Maybe this will finally let them figure out if it's actually possible, and whether it is or not, they will really know how low they can clock RDNA down and how efficient they can make it. SwissArmyDruid fucked around with this message at 21:47 on May 4, 2020 |
# ? May 4, 2020 21:41 |
|
i just read about 200 posts itt and in summation, i will believe that GFE doesnt require a login when it actually happens. im really curious to see what theyre going to implement instead.
|
# ? May 4, 2020 22:49 |
|
After like the 14th time I had to solve a captcha (wtf) after being forcibly logged out of a desktop app, I just gave up on it and uninstalled. I am skeptical that anyone involved has ever had to actually use a computer.
|
# ? May 4, 2020 22:55 |
|
Statutory Ape posted:im really curious to see what theyre going to implement instead. SSO via MAAD + phone-based authenticator app.
|
# ? May 5, 2020 01:40 |
|
Pretty much what I expect. You don't really snap your fingers and decide to start collecting less data on your customers in 2020, so I really don't think GeForce Experience is going to change. They could eliminate captchas from it, though, and I'd call it even
|
# ? May 5, 2020 02:39 |
|
I've avoided the GeForce Experience for years and years. Has it ever had real advantages over just setting program-specific stuff in the Nvidia control panel?
|
# ? May 5, 2020 03:12 |
|
You really don't need it unless you want to use shadowplay. I mostly use it so that it auto checks for a new driver when my computer boots, then i close it. People get very upset talking about it though so be careful of venturing down that path.
|
# ? May 5, 2020 03:43 |
|
The recommended game settings can be cool if you like to just jump in and start playing rather than play trial and error to optimize the experience. It's fine to keep drivers up to date? I definitely updated more frequently when it was installed. A lot of people really hate the fact that it feeds data to Nvidia, but at this point every company in the world is pretty much up your rear end with a microscope. The required logging in is absolutely maddening, and the captcha is so beyond weird I don't even know how to parse it. Like, I've only seen captchas used on websites, and I see them used generally in one of two ways: a) to restrict account creation to try and limit posting bots, and b) to rate-limit requests that can be issued without logging in to an account. A captcha on an installed program to log in to an account that already exists? Why is it there? What is it for? I must know.
|
# ? May 5, 2020 12:59 |
|
The GTA5/Rockstar Social Club launcher also uses captchas and it's beyond stupid. I'm sure it's to mess with hackers who threaten their precious shark card scam but come on.
|
# ? May 5, 2020 13:09 |
|
VorpalFish posted:The recommended game settings can be cool if you like to just jump in and start playing rather than play trial and error to optimize the experience. It's fine to keep drivers up to date? I definitely updated more frequently when it was installed. Yeah, basically this. GFE also likes to update itself in kind of an obnoxious way, which is annoying, but it does have some nice features like Shadowplay and an FPS counter if you're playing games off Steam (and don't want to install your own 500KB version). I like the driver reminders, and the suggested settings are kinda nice to flip through.
|
# ? May 5, 2020 14:59 |
|
Steam literally has its own FPS counter though.
|
# ? May 5, 2020 17:16 |
|
c0burn posted:Steam literally has its own FPS counter though. and thats what he intimated, yes
|
# ? May 5, 2020 17:32 |
|
c0burn posted:Steam literally has its own FPS counter though. Playing games off Steam: playing games AWAY from the Steam platform, sorry, that was a little ambiguous. I mean when you are not playing in Steam but want to turn on an FPS counter.
|
# ? May 5, 2020 17:39 |
|
If you add any game in Steam's game list, and launch it from there, you can have the overlay, but you probably know that anyway
|
# ? May 5, 2020 18:12 |
|
And if they don't want to be running steam at that moment?
|
# ? May 5, 2020 18:19 |
|
Cygni posted:People get very upset talking about it though so be careful of venturing down that path. Prophecy proven correct. Entire thread lookin' like ww1 trenches preparing for battle.
|
# ? May 5, 2020 18:24 |
|
Twibbit posted:And if they don't want to be running steam at that moment? Nvidia Frameview works well enough as FPS counter for me. It includes detailed statistics of power consumption (down to the chip and board level) and accurate frametimes with a 95th and 99th percentile counter. https://www.nvidia.com/en-us/geforce/technologies/frameview/
|
# ? May 5, 2020 18:32 |
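For anyone wondering what those percentile numbers actually mean: here's a toy sketch (mine, not FrameView's real implementation) of turning a list of per-frame times into average FPS plus 95th/99th percentile frametimes, using a simple nearest-rank percentile:

```python
# Toy version of the stats a frametime tool reports. Average FPS hides
# stutter; the 95th/99th percentile frametimes show how slow your worst
# frames are.
def percentile(sorted_ms, p):
    """Nearest-rank percentile of an already-sorted list of frametimes."""
    k = max(0, int(round(p / 100.0 * len(sorted_ms))) - 1)
    return sorted_ms[k]

def summarize(frametimes_ms):
    """Return (average FPS, 95th pct frametime, 99th pct frametime)."""
    s = sorted(frametimes_ms)
    avg_fps = 1000.0 * len(s) / sum(s)  # frames divided by total seconds
    return avg_fps, percentile(s, 95), percentile(s, 99)

# Example: 95 smooth frames at 16.7 ms, 5 stutters at 33.3 ms.
# Average FPS still looks close to 60, but the 99th percentile
# frametime exposes the stutter.
avg, p95, p99 = summarize([16.7] * 95 + [33.3] * 5)
print(f"avg {avg:.1f} fps, p95 {p95} ms, p99 {p99} ms")
```

The point of the example: a handful of 33 ms frames barely moves the average, which is exactly why tools report the percentile frametimes alongside it.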
|
eames posted:Nvidia Frameview works well enough as FPS counter for me. It includes detailed statistics of power consumption (down to the chip and board level) and accurate frametimes with a 95th and 99th percentile counter. I somehow missed that this existed, huh! Guess i can uninstall that ancient copy of Fraps i was keeping around just to do frame counter stuff in finicky games that don't like steam overlays. Thanks man!
|
# ? May 5, 2020 18:37 |
|
Happy_Misanthrope posted:I remember a video where he 'revealed' (this was like a month before the Xbox Series X's specs were actually introduced) is that the next Xbox will be upgradable, you'll be able to swap out the GPU through a daughtercard
|
# ? May 5, 2020 18:41 |
|
Didn't it end up being true that there will be hot-swappable SSDs or something, or was that fake too, it's hard to keep track
|
# ? May 5, 2020 18:49 |
|
Taima posted:Didn't it end up being true that there will be hot-swappable SSDs or something, or was that fake too, it's hard to keep track Not sure about them being hot-swappable, but yeah, the Xbox Series X will have an expansion slot for a second SSD (currently only Seagate has announced support, with a 1TB module). The PS5 also apparently will have an m.2 expansion bay for "certified" SSDs.
|
# ? May 5, 2020 19:01 |
|
HalloKitty posted:If you add any game in Steam's game list, and launch it from there, you can have the overlay, but you probably know that anyway Yeah, that's what I used to do back when 99% of what I was playing was on Steam. Now, with free games on Epic/Twitch and occasional better deals, it's more likely I'm not running off Steam. Usually I only care about the FPS counter when doing initial setup and then I turn it off, so it being part of GFE is kinda nice. Like I said, it's a really small reason though.
|
# ? May 5, 2020 20:35 |
|
|
DigiTimes dropped a bunch of big Nvidia news behind their paywall. Supposedly Ampere is going to be split between Samsung 7nm/8nm for lower-end parts and TSMC's N7FF+ EUV node for higher-end parts. This is similar to what they did with Pascal, and meets their stated intent to diversify away from TSMC's super-high-demand (and higher-cost) nodes. The TSMC parts were the ones that were supposed to have already launched by now, and were intended to be available Q3 2020, so who knows how that's all been impacted. First unveil will be May 14th. DigiTimes also dropped that Nvidia's next architecture, Hopper, has already been ordered on TSMC's 5nm node for 2021 production, which will likely make it the launch customer for a "big mask" part.
|
# ? May 5, 2020 21:28 |