orcane
Jun 13, 2012

Fun Shoe

Taima posted:

- Some kind of memory compression technology that will reduce the need for high VRAM cards

:laffo: yeah right

Shaocaholica
Oct 29, 2002

Fig. 5E
VRAM is cheap.

Stickman
Feb 1, 2004

Maybe if it’s direct NVMe access with hardware decompression, but it seems like that would require motherboard support? They’ve been working on it for HPC at least. They probably wouldn’t describe it as a “compression technology”, though.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Stickman posted:

Maybe if it’s direct NVMe access with hardware decompression, but it seems like that would require motherboard support? They’ve been working on it for HPC at least. They probably wouldn’t describe it as a “compression technology”, though.

for the record, NVIDIA GPUDirect NVMe has existed for the best part of a decade, and AMD just tried to do a thing with Vega in like 2017. You've even been able to do remote DMA access to NVMe on a completely different system, in hard realtime at full transfer rate, for like a decade now.

copyright 2012: "RDMA accelerated communication with storage devices" http://developer.download.nvidia.com/devzone/devcenter/cuda/docs/GPUDirect_Technology_Overview.pdf

what AMD did was put a PLX chip on a card, like any average dual-GPU card, and sit a GPU and an M.2 drive behind it. It leeches 4 GB/s of bandwidth from the shared connection (bandwidth that would otherwise have served the host) and lets you put an M.2 behind a shared x16 link. It's the new Radeon SSG (tm).

it doesn't really work in any case, because "full NVMe transfer rate" is like 4 GB/s, compared to hundreds of GB/s for normal VRAM. You can't sustain performance-significant amounts of data access over a 4 GB/s bus, whether that's host memory (HBCC lol) or another peer PCIe NVMe device, local or remote.

(does anyone remember HBCC? Anyone want to do a benchmark revisit on how much it lets you "use system RAM as VRAM" these days? Boy that AMD marketing was sure "honest"...)
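To put that bandwidth gap in per-frame terms, a quick back-of-the-envelope sketch (the 448 GB/s VRAM figure is an assumed example for a GDDR6 card on a 256-bit bus, not a number from the post):

```python
# Back-of-the-envelope per-frame budgets (illustrative figures, not measured):
nvme_gbps = 4.0    # ~PCIe 3.0 x4 NVMe link, GB/s (the figure above)
vram_gbps = 448.0  # assumed example: GDDR6 on a 256-bit bus, GB/s

frame_budget_s = 1 / 60  # ~16.7 ms per frame at 60 fps

# How much data each link can move inside one frame, in MB
mb_per_frame_nvme = nvme_gbps * 1000 * frame_budget_s
mb_per_frame_vram = vram_gbps * 1000 * frame_budget_s

print(f"NVMe: ~{mb_per_frame_nvme:.0f} MB touchable per frame")   # ~67 MB
print(f"VRAM: ~{mb_per_frame_vram:.0f} MB touchable per frame")   # ~7467 MB
```

About two orders of magnitude apart, which is the whole point: you can stream assets in over NVMe, but you can't treat it as working VRAM.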

Paul MaudDib fucked around with this message at 09:58 on May 4, 2020

SwissArmyDruid
Feb 14, 2014

by sebmojo

Taima posted:

- Some kind of memory compression technology that will reduce the need for high VRAM cards

Ha.

quote:

- September launch

Ha!

quote:

- No more login for Geforce Experience

HA!

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
oh, this is Moore's Law Is Dead.

Guy is part of the AMD vlogosphere. Like RedGamingTech, AdoredTV, Mindblanktech, nerdtechgasm (god I can't even keep all the AMD tech-orgasm-themed channels straight anymore), etc etc.

"AMD is totally going to own NVIDIA this time guys" is like, his channel's whole premise, along with all those other channels. Sooner or later it has to be right.

Zarin
Nov 11, 2008

I SEE YOU

Paul MaudDib posted:

oh, this is Moore's Law Is Dead.

Guy is part of the AMD vlogosphere. Like RedGamingTech, AdoredTV, Mindblanktech, nerdtechgasm (god I can't even keep all the AMD tech-orgasm-themed channels straight anymore), etc etc.

"AMD is totally going to own NVIDIA this time guys" is like, his channel's whole premise, along with all those other channels. Sooner or later it has to be right.

I was confused why he was going on about how "this was going to be the most competitive Fall in living memory" when Nvidia seems like they're pulling a new rabbit out of their hat every week and AMD has . . . RDNA2 coming soon. Someday. (Granted, I don't know what RDNA2 actually entails, but I already got the impression that DLSS 2.0 was a "kick 'em while they're down" surprise and even he made it sound like Ampere is going to roll out with enough goodies that DLSS 2.0 will be a distant memory by then)

TacticalHoodie
May 7, 2007

Lockback posted:

The fact that it's EVGA is why it took you this long to realize you've had so many RMAs.

Yeah, that is true. I'm looking down a double whammy of issues at the moment: my RAM can't go above 2133 MHz regardless of timings, plus this video card issue. Lowering the RAM timing fixes the stuttering, but it affects the overclock I have on my 8600K. G.Skill is the biggest pain in the rear end to deal with. It has a lifetime warranty, sure, but they told me the RAM was optimized for AMD and my mileage on Z370 will vary. EVGA has been ace the whole time so no complaints from me.

I am replacing the 16 GB of G.Skill RAM with 32 GB of Corsair RAM so I get better tech support. Again, Corsair support is so good that they make the RMA process trivial compared to other vendors.

Ganondork
Dec 26, 2012

Ganondork

Zarin posted:

I was confused why he was going on about how "this was going to be the most competitive Fall in living memory" when Nvidia seems like they're pulling a new rabbit out of their hat every week and AMD has . . . RDNA2 coming soon. Someday. (Granted, I don't know what RDNA2 actually entails, but I already got the impression that DLSS 2.0 was a "kick 'em while they're down" surprise and even he made it sound like Ampere is going to roll out with enough goodies that DLSS 2.0 will be a distant memory by then)

I feel like it’s been this way for a while. AMD GPUs are great in theory, not so much in practice.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

CaptainSarcastic posted:

Starting to seriously consider getting an RTX 2070. Price-wise I'm seeing some around the price of a 2060 Super, and they perform slightly better. The 2070 Super is priced just high enough that I am leaning toward the stock 2070 since the pricing is looking a lot better.

I had been thinking about a 5700 XT, but the driver issues give me pause. I haven't run an AMD GPU since the ATI 9600 Pro AIW, and since my last two builds have been around AMD CPUs I have been curious about running an all-AMD setup, but it is not looking like the time is right for that at the moment.

I know the usual arguments about waiting for the next generation to drop, but I just built a new desktop and bought a new monitor, so a GPU upgrade feels like a sooner rather than later thing. My GTX 1060 6GB is holding its own, but it would be nice to run 1440p with more demanding settings and a higher frame rate.

I'm no expert, but between DLSS (however limited the game-support is at the moment) and things like RTX Voice and of course ray-tracing itself, Nvidia does seem to be ahead of the game if you can at all afford an RTX card, even if we disregarded all the AMD driver issues.

repiv
Aug 13, 2009

That video reeks of someone with limited knowledge making poo poo up; there's absolutely no way Nvidia can get DLSS working as a generic thing that overrides any TAA implementation.

Unlike MSAA, TAA is implemented purely in shaders so the driver has no indication of where the TAA is happening (or if it's happening at all) or how the input data is laid out.

The best they could do is per-game profiles to splice DLSS in place of TAA for some games.
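The "per-game profiles" idea sketches out roughly like this (purely hypothetical: the table entries, hashes, and function names are illustrative, not how the driver is actually structured):

```python
# Hypothetical sketch of the per-game profile approach described above.
# The driver can't detect a shader-based TAA pass generically, so it would
# need a hand-built table mapping each game to where its TAA pass lives.
DLSS_PROFILES = {
    "control.exe":    {"taa_pass": "shader_hash_a1b2", "motion_vectors": "rt_slot_3"},
    "youngblood.exe": {"taa_pass": "shader_hash_c3d4", "motion_vectors": "rt_slot_1"},
}

def splice_dlss(process_name: str) -> str:
    """Return what the driver would do for a given game executable."""
    profile = DLSS_PROFILES.get(process_name)
    if profile is None:
        # No profile: the driver has no way to find the TAA pass on its own
        return "fall back to game's native TAA"
    return f"replace {profile['taa_pass']} with DLSS upscale"

print(splice_dlss("control.exe"))
print(splice_dlss("unknown_game.exe"))
```

Which is exactly why a "works with any TAA game" claim doesn't hold up: every entry in that table is manual reverse-engineering work.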

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib

Paul MaudDib posted:

oh, this is Moore's Law Is Dead.

Guy is part of the AMD vlogosphere. Like RedGamingTech, AdoredTV, Mindblanktech, nerdtechgasm (god I can't even keep all the AMD tech-orgasm-themed channels straight anymore), etc etc.

"AMD is totally going to own NVIDIA this time guys" is like, his channel's whole premise, along with all those other channels. Sooner or later it has to be right.

Whatever about the veracity of the rumours, he did present them in a neutral light before getting into more biased comparisons with AMD.


I was surprised to see DLSS 3.0 being mentioned when 2.0 is barely out. Is there even a list of games that support/will support 2.0?

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
Wolfenstein: Youngblood, Control, Minecraft, Deliver Us The Moon, and MechWarrior 5, IIRC

Shaocaholica
Oct 29, 2002

Fig. 5E
How was this tiny 2011 eGPU supposed to actually accelerate anything over a USB3 interface? :

https://www.youtube.com/watch?v=mEcWj52NCDU

repiv
Aug 13, 2009

Maybe it wasn't USB3? The dock's connector is designed so it only fits that one port on that one laptop, so they could have done some weird proprietary signalling to jam PCIe through a USB3 port.

Cygni
Nov 12, 2005

raring to post

It used a proprietary Thunderbolt implementation. It was also very bad. If you thought performance on the internal display was questionable with TB3 eGPU boxes, now let's do it on something with 25% of the bandwidth (being charitable) and a non-standard implementation that never got a driver update.
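The "25% (being charitable)" figure roughly checks out against nominal link rates (assumed: 5 Gbps USB 3.0 signaling versus ~32 Gbps of usable PCIe payload over a TB3 x4 Gen3 link; real-world throughput would be lower on both):

```python
# Nominal link-rate comparison (assumed figures, not measurements):
usb3_gbps = 5.0        # USB 3.0 signaling rate
tb3_pcie_gbps = 32.0   # usable PCIe bandwidth over Thunderbolt 3 (x4 Gen3)

ratio = usb3_gbps / tb3_pcie_gbps
print(f"USB 3.0 is ~{ratio:.0%} of a TB3 eGPU link")  # ~16%
```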

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Paul MaudDib posted:

oh, this is Moore's Law Is Dead.

Guy is part of the AMD vlogosphere. Like RedGamingTech, AdoredTV, Mindblanktech, nerdtechgasm (god I can't even keep all the AMD tech-orgasm-themed channels straight anymore), etc etc.

"AMD is totally going to own NVIDIA this time guys" is like, his channel's whole premise, along with all those other channels. Sooner or later it has to be right.

I remember a video where he 'revealed' (this was like a month before the Xbox Series X's specs were actually introduced) that the next Xbox would be upgradable: you'd be able to swap out the GPU through a daughtercard.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Shaocaholica posted:

How was this tiny 2011 eGPU supposed to actually accelerate anything over a USB3 interface? :

https://www.youtube.com/watch?v=mEcWj52NCDU

Maybe AMD could have actually done this with Adreno.

edit: You know what, remembering this has given me a new take on AMD and RDNA2, and on when, if ever, AMD will move to chiplet-based APUs.

AMD's about to get a lot of money from Samsung to do GPUs for their Exynos processors, right? Maybe this will finally let them figure out if it's actually possible, and either way they will really know how low they can clock RDNA and how efficient they can make it.

SwissArmyDruid fucked around with this message at 21:47 on May 4, 2020

Worf
Sep 12, 2017

If only Seth would love me like I love him!

i just read about 200 posts itt and in summation, i will believe that GFE doesnt require a login when it actually happens.

im really curious to see what theyre going to implement instead.

VorpalFish
Mar 22, 2007
reasonably awesometm

After like the 14th time I had to solve a captcha (wtf) after being forcibly logged out of a desktop app, I just gave up on it and uninstalled.

I am skeptical that anyone involved has ever had to actually use a computer.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Statutory Ape posted:

im really curious to see what theyre going to implement instead.

SSO via MAAD + phone-based authenticator app.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Pretty much what I expect.

You don't really snap your fingers and decide to start collecting less data on your customers in 2020.

I really don't think GeForce Experience is going to change.

They could eliminate captchas from it, though, and I'd call it even.

CaptainSarcastic
Jul 6, 2013



I've avoided the GeForce Experience for years and years. Has it ever had real advantages over just setting program-specific stuff in the Nvidia control panel?

Cygni
Nov 12, 2005

raring to post

You really don't need it unless you want to use shadowplay. I mostly use it so that it auto checks for a new driver when my computer boots, then i close it.

People get very upset talking about it though so be careful of venturing down that path.

VorpalFish
Mar 22, 2007
reasonably awesometm

The recommended game settings can be cool if you like to just jump in and start playing rather than play trial and error to optimize the experience. It's fine to keep drivers up to date? I definitely updated more frequently when it was installed.

A lot of people really hate the fact that it feeds data to nvidia but at this point every company in the world is pretty much up your rear end with a microscope.

The required logging in is absolutely maddening and the captcha is so beyond weird I don't even know how to parse it.

Like I've only seen captchas used on websites, and I see them used generally one of 2 ways: a) restrict account creation to try and limit posting bots and b) rate limit requests that can be issued without logging in to an account.

A captcha on an installed program to log in to an account that already exists. Why is it there? What is it for? I must know.

orcane
Jun 13, 2012

Fun Shoe
The GTA5/Rockstar Social Club launcher also uses captchas and it's beyond stupid. I'm sure it's to mess with hackers who threaten their precious shark card scam, but come on.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

VorpalFish posted:

The recommended game settings can be cool if you like to just jump in and start playing rather than play trial and error to optimize the experience. It's fine to keep drivers up to date? I definitely updated more frequently when it was installed.

A lot of people really hate the fact that it feeds data to nvidia but at this point every company in the world is pretty much up your rear end with a microscope.

The required logging in is absolutely maddening and the captcha is so beyond weird I don't even know how to parse it.

Like I've only seen captchas used on websites, and I see them used generally one of 2 ways: a) restrict account creation to try and limit posting bots and b) rate limit requests that can be issued without logging in to an account.

A captcha on an installed program to log in to an account that already exists. Why is it there? What is it for? I must know.

Yeah, basically this. GFE also likes to update itself in a kind of obnoxious way, which is annoying, but it does have some nice features like Shadowplay and an FPS counter if you're playing games off Steam (and don't want to install your own 500KB version). I like the driver reminders, and the suggested settings are kinda nice to flip through.

c0burn
Sep 2, 2003

The KKKing
Steam literally has its own FPS counter though.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

c0burn posted:

Steam literally has its own FPS counter though.

and thats what he intimated, yes

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

c0burn posted:

Steam literally has its own FPS counter though.

Playing games off Steam: playing games AWAY from the Steam platform. Sorry, that was a little ambiguous. I mean when you're not playing in Steam but want to turn on an FPS counter.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
If you add any game in Steam's game list, and launch it from there, you can have the overlay, but you probably know that anyway

Twibbit
Mar 7, 2013

Is your refrigerator running?
And if they don't want to be running Steam at that moment?

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Cygni posted:

People get very upset talking about it though so be careful of venturing down that path.

Prophecy proven correct. Entire thread lookin' like ww1 trenches preparing for battle.

eames
May 9, 2009

Twibbit posted:

And if they don't want to be running steam at that moment?

Nvidia FrameView works well enough as an FPS counter for me. It includes detailed statistics on power consumption (down to the chip and board level) and accurate frametimes with 95th and 99th percentile counters.

https://www.nvidia.com/en-us/geforce/technologies/frameview/
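Percentile frametime stats like the ones FrameView reports are easy to reproduce by hand from a frametime log. A minimal sketch with made-up sample data (nearest-rank percentile, not necessarily FrameView's exact method):

```python
import math

# Example frametimes in milliseconds (made-up sample data)
frametimes_ms = [16.7, 16.9, 17.1, 16.5, 33.4, 16.8, 17.0, 16.6, 16.9, 25.0]

def percentile(data, pct):
    """Nearest-rank percentile: smallest value >= pct% of the samples."""
    s = sorted(data)
    k = max(0, math.ceil(pct / 100 * len(s)) - 1)
    return s[k]

avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
p95 = percentile(frametimes_ms, 95)  # 95% of frames rendered faster than this
p99 = percentile(frametimes_ms, 99)

print(f"avg fps: {avg_fps:.1f}")
print(f"95th percentile frametime: {p95:.1f} ms")
print(f"99th percentile frametime: {p99:.1f} ms")
```

The percentile numbers matter more than the average: a few 33 ms spikes barely move the mean but show up immediately in the tail.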

Cygni
Nov 12, 2005

raring to post

eames posted:

Nvidia FrameView works well enough as an FPS counter for me. It includes detailed statistics on power consumption (down to the chip and board level) and accurate frametimes with 95th and 99th percentile counters.

https://www.nvidia.com/en-us/geforce/technologies/frameview/

I somehow missed that this existed, huh! Guess i can uninstall that ancient copy of Fraps i was keeping around just to do frame counter stuff in finicky games that don't like steam overlays. Thanks man!

wolrah
May 8, 2006
what?

Happy_Misanthrope posted:

I remember a video where he 'revealed' (this was like a month before the Xbox Series X's specs were actually introduced) that the next Xbox would be upgradable: you'd be able to swap out the GPU through a daughtercard.
There's a part of me that's always wanted to start a site that tracks all the various "leak" sites' claims and their accuracy over time, but :effort:

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Didn't it end up being true that there will be hot-swappable SSDs or something? Or was that fake too? It's hard to keep track.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Taima posted:

Didn't it end up being true that there will be hot-swappable SSDs or something, or was that fake too, it's hard to keep track

Not sure about them being hot-swappable, but yeah, the Xbox Series X will have an expansion slot for a second SSD (currently only Seagate has announced support, with a 1TB module). The PS5 also apparently will have an M.2 expansion bay for "certified" SSDs.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

HalloKitty posted:

If you add any game in Steam's game list, and launch it from there, you can have the overlay, but you probably know that anyway

Yeah, that's what I used to do back when 99% of what I was playing was on Steam. Now, with free games on Epic/Twitch and occasional better deals, it's more likely I'm not running off Steam. Usually I only care about the FPS counter when doing initial setup, then I turn it off, so it being part of GFE is kinda nice.

Like I said, it's a really small reason though.

Cygni
Nov 12, 2005

raring to post

DigiTimes dropped a bunch of big Nvidia news behind their paywall. Supposedly Ampere is going to be split between Samsung 7nm/8nm for lower-end parts and TSMC's N7FF+ EUV node for higher-end parts. This is similar to what they did with Pascal, and meets their stated intent to diversify away from TSMC's super-high-demand (and higher-cost) nodes. The TSMC parts were the ones that were supposed to have already launched by now, and were intended to be available Q3 2020, so who knows how that's all been impacted. First unveil will be May 14th.

DigiTimes also dropped that Nvidia's next architecture, Hopper, has already been ordered on TSMC's 5nm node for 2021 production, which will likely make it the launch customer for a "big mask" part.
