Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
That would be a perfectly reasonable workstation-tier GPU, though (like an RDNA2 Frontier Edition), so it's a great example of an April Fools' joke that is not really funny

edit: except GDDR7 I guess, wow so funny

Paul MaudDib fucked around with this message at 16:46 on Apr 2, 2020

lllllllllllllllllll
Feb 28, 2010

Now the scene's lighting is perfect!

ShaneB posted:

Someone talk me out of moving from an RX580 to an RTX 2060 Super... please?

Seems like a reasonable upgrade? :)

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Lube banjo posted:

It's probably nothing/false since :WCCFTECH: but 32GB of VRAM, what the hell

dude, really

ShaneB
Oct 22, 2002


lllllllllllllllllll posted:

Seems like a reasonable upgrade? :)

Yeah but then I am dropping $400. QUESTIONABLE.

Setset
Apr 14, 2012
Grimey Drawer

Yeah sorry. I got like 3 hrs of sleep last night and I've been in zombie-mode playing Fire Emblem all day. It popped up in a news feed and I didn't see anyone talking about it.

Seemed somewhat realistic but I did notice the blower fans were spinning in opposite directions. Thought it was kind of weird but hey :justpost:

Yaoi Gagarin
Feb 20, 2014

So AMD's joke was "we made a good GPU"?

Shaocaholica
Oct 29, 2002

Fig. 5E
32GB not a joke tho :confused:

greasyhands
Oct 28, 2006

Best quality posts,
freshly delivered

VostokProgram posted:

So AMD's joke was "we made a good GPU"?

I think it was a techbro website joke, not AMD's own

lllllllllllllllllll
Feb 28, 2010

Now the scene's lighting is perfect!

ShaneB posted:

Yeah but then I am dropping $400. QUESTIONABLE.

Well, you'll probably have about half a year of fun with it, ignoring leaks and promises of upcoming tech. Then the real fun starts: two new consoles, new cards from both NVIDIA and AMD, and things will get wild. So a few months of patience may be worth it. ;)

Inept
Jul 8, 2003

April 1 - push bullshit to the limit to see what people will still click on and share

eames
May 9, 2009

every day is April 1st if you write for WCCF Tech

Quaint Quail Quilt
Jun 19, 2006


Ask me about that time I told people mixing bleach and vinegar is okay

TorakFade posted:

Hmm, after the last couple of driver updates (Nvidia, I have an EVGA 1080 FTW bought at the end of 2018) RDR2 seems to have some weird graphics glitch, with random red and white flashes.

My baseless guess/suggestion: try switching the render API between DX12 and Vulkan and see what that does.

repiv
Aug 13, 2009

The GTC talk on DLSS 2.0 went public if anyone wants a deeper dive: https://developer.nvidia.com/gtc/2020/video/s22698

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

repiv posted:

The GTC talk on DLSS 2.0 went public if anyone wants a deeper dive: https://developer.nvidia.com/gtc/2020/video/s22698

This is great, thanks for the link.

repiv
Aug 13, 2009

Really it's more of a primer on the limitations of other upscalers, and a demonstration that DLSS avoids those problems, but they don't give any real details on the inner workings.

Nvidia are playing their cards close to their chest with this one; it's their secret free performance juice and they're not sharing with the class :colbert:

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
The one thing I find really striking in that presentation is the comparison shots with 32x supersampling. I was mind-blown to find out that much of what I thought was oversharpening in DLSS 2.0 is actually just better conformance to a supersampled image. My assumption is that their algorithm must be incredibly good at using the motion data to re-use samples over time, allowing them to very effectively supersample anything that isn't in extreme motion. I also think the fact that they got the DLSS processing time down is very important. Being able to run at high base framerates matters for anything like this to work well, because for a given motion, a higher framerate means more samples, less difference between samples, and easier correction, and thus better image quality.
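Concretely, the reuse-over-time idea is basically TAA-style temporal accumulation. A minimal sketch of that skeleton, not Nvidia's actual algorithm; the reprojection scheme, blend weights, and the motion-speed rejection knob here are all assumptions:

```python
import numpy as np

def temporal_accumulate(history, current, motion, alpha=0.1):
    """Blend this frame into an accumulated history buffer.

    history: (H, W, 3) color accumulated from previous frames
    current: (H, W, 3) this frame's color
    motion:  (H, W, 2) per-pixel screen-space motion vectors, in pixels
    alpha:   base blend weight; smaller alpha = longer accumulation,
             i.e. more effective samples per pixel for static content
    """
    h, w, _ = current.shape
    ys, xs = np.mgrid[0:h, 0:w]

    # Reproject: look up where each pixel was last frame via motion vectors.
    px = np.clip((xs - motion[..., 0]).round().astype(int), 0, w - 1)
    py = np.clip((ys - motion[..., 1]).round().astype(int), 0, h - 1)
    reprojected = history[py, px]

    # Fast-moving pixels keep less history: old samples are less trustworthy,
    # which is why high base framerates help (less motion between frames).
    speed = np.linalg.norm(motion, axis=-1, keepdims=True)
    weight = np.clip(alpha + speed / 16.0, alpha, 1.0)  # 16.0: arbitrary knob

    # Exponential moving average: static regions converge toward a
    # supersampled result; extreme motion falls back to the current frame.
    return (1.0 - weight) * reprojected + weight * current
```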

I wonder what AMD's response is. If they haven't been playing catch-up in secret, I can't imagine them not getting absolutely hosed next gen. Nvidia is almost certainly going to include tensor cores across the stack, and so even if an AMD card benches better, in practice being able to use DLSS for maybe a minor IQ hit and a huge performance gain is going to make AMD largely irrelevant. It's going to be an interesting time, because AMD will probably be able to point to benchmarks and truthfully claim that they are providing better 1:1 performance at a given price point, but Nvidia will probably be providing significantly better actual experience per dollar without any sufficiently objective way to quantify that.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

K8.0 posted:

AMD will probably be able to point to benchmarks and truthfully claim that they are providing better 1:1 performance at a given price point, but Nvidia will probably be providing significantly better actual experience per dollar without any sufficiently objective way to quantify that.

Well, there are a few parts to this. First, unless AMD has some sort of :master: that no one has seen coming, NVidia will likely dominate the upper-mid and high-end tiers again. I'd be real surprised if AMD could come out with a price-competitive card to match the xx70 on up without also accepting some silly compromises, like 300+W power use or something. So NVidia won't even have to bother with benchmark wars there; they'll be able to say that not only is the xx80(Ti) X% faster in benchmarks, it's X+50% faster with DLSS.

In the mid-tier where AMD probably will be price-competitive, I don't think the story changes much in their favor from today: even at the price points where they are competitive, they often lose out to NVidia based on drivers, name recognition, etc. There's a reason that the xx60 has been a massive seller, and it's not because the 580/590s are bad cards. Regardless of whether DLSS goes down to the xx50/60 cards, I don't think many people are going to care that the 680/690 bench 5% higher (if they even do--which I think is doubtful).

The low end...who the gently caress knows. That's always been a weird space with no rules and very strange decisions.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
The Xbox Series X (still a poo poo name) presents an interesting case, though.
Ray tracing hardware, a newer architecture, 52 CUs at 1.825GHz... 12 TFLOPS. It seems like it could be a serious contender if it were released as a normal card, maybe pushing the boost out even a little further and possibly enabling the remaining 4 CUs for an XT part.
It's a lot beefier than anything they're selling right now.
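That 12 TFLOPS falls straight out of the published specs, assuming the standard RDNA counts of 64 lanes per CU and 2 FLOPs per lane per clock (one fused multiply-add):

```python
cus = 52             # active compute units on the Series X (56 physical)
lanes_per_cu = 64    # stream processors per RDNA CU
flops_per_clock = 2  # a fused multiply-add counts as 2 floating-point ops
clock_hz = 1.825e9   # fixed GPU clock

tflops = cus * lanes_per_cu * flops_per_clock * clock_hz / 1e12
print(f"{tflops:.2f} TFLOPS")  # ~12.15
```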

HalloKitty fucked around with this message at 17:11 on Apr 3, 2020

Perplx
Jun 26, 2004


Best viewed on Orgasma Plasma
Lipstick Apathy

repiv posted:

Really it's more of a primer on the limitations of other upscalers, and a demonstration that DLSS avoids those problems, but they don't give any real details on the inner workings.

Nvidia are playing their cards close to their chest with this one; it's their secret free performance juice and they're not sharing with the class :colbert:

I think the secret sauce is compute time; they needed 1.5 years after DLSS 1.0 to get the model trained with enough data.

repiv
Aug 13, 2009

Now the question is: how quickly can Nvidia's devrel people crank out DLSS patches for games?

DLSS 2.0 was just confirmed for Amid Evil, which doesn't sound particularly useful for such a lightweight game, but they're probably taking the shotgun approach and trying to get it into as many UE4 games as possible now that the integration work is done.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Are you sure the integration requires Nvidia assistance? Sounds like devs are implementing it themselves, but I'm no expert.

repiv
Aug 13, 2009

The developers can do it themselves if they want to (there's no need for NV to train a model for them anymore), but NV have a bunch of people whose job is to embed in game studios and assist with integrating NV tech into their engines. It's in NV's best interest to speed things along.

repiv fucked around with this message at 18:25 on Apr 3, 2020

Murvin
Jan 7, 2008
Jet-setter and raconteur


So I jumped the gun and upgraded to a 2560x1440 monitor before my hardware was probably ready for it, and some games are struggling a bit. I'm using the old and formerly very popular i5-4460 CPU and a GTX 1070 GPU. At this point I know I need a new CPU and that may be bottlenecking me, but is my GPU also good enough for 1440p, or should I be considering a step up? If so, is there a current card in the sweet spot for that resolution right now?

Stanley Tucheetos
May 15, 2012

Murvin posted:

So I jumped the gun and upgraded to a 2560x1440 monitor before my hardware was probably ready for it, and some games are struggling a bit. I'm using the old and formerly very popular i5-4460 CPU and a GTX 1070 GPU. At this point I know I need a new CPU and that may be bottlenecking me, but is my GPU also good enough for 1440p, or should I be considering a step up? If so, is there a current card in the sweet spot for that resolution right now?

1070 should be fine for most games at 1440p as long as you aren't trying for 144Hz. What games in particular are you struggling with?

ChazTurbo
Oct 4, 2014

Murvin posted:

So I jumped the gun and upgraded to a 2560x1440 monitor before my hardware was probably ready for it, and some games are struggling a bit. I'm using the old and formerly very popular i5-4460 CPU and a GTX 1070 GPU. At this point I know I need a new CPU and that may be bottlenecking me, but is my GPU also good enough for 1440p, or should I be considering a step up? If so, is there a current card in the sweet spot for that resolution right now?

1070 should be fine so long as you don't mind turning some settings down.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
Yep, dollar for dollar you are still probably better off spending on a GPU, so I wouldn't say you're bottlenecked, but a 1070 should get you 60fps+ in most games with settings turned down a smidge.
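A cheap way to check which side is actually limiting you before spending anything (a rough heuristic; the threshold and the FPS numbers below are made up for illustration): drop the render resolution and see how much your framerate moves.

```python
def likely_bottleneck(fps_native, fps_reduced, threshold=1.15):
    """Crude CPU-vs-GPU bound heuristic.

    fps_native:  average FPS at 2560x1440
    fps_reduced: average FPS at a much lower resolution (e.g. 1920x1080)
    If lowering the resolution barely helps, the GPU wasn't the limit.
    """
    speedup = fps_reduced / fps_native
    return "GPU-bound" if speedup >= threshold else "CPU-bound"

print(likely_bottleneck(55, 80))  # GPU-bound: a faster GPU would help
print(likely_bottleneck(55, 58))  # CPU-bound: a faster CPU would help
```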

Ganondork
Dec 26, 2012

Ganondork

Murvin posted:

So I jumped the gun and upgraded to a 2560x1440 monitor before my hardware was probably ready for it, and some games are struggling a bit. I'm using the old and formerly very popular i5-4460 CPU and a GTX 1070 GPU. At this point I know I need a new CPU and that may be bottlenecking me, but is my GPU also good enough for 1440p, or should I be considering a step up? If so, is there a current card in the sweet spot for that resolution right now?

A little insight from someone who went from an O/C'd 3770K to a 3800X + 3600 CL15 RAM. I was running a 2070 under both setups, and frame rates were pretty good in both cases, but there was a noticeable increase in the minimum FPS, and FPS stability in general, with the upgrade to the 3800X.

Prior to the 2070, I had a 1070 until it popped a capacitor. The 1070 is a decent 1440p card, and should handle most of what you throw at it with some compromises. You’ll definitely see better FPS with a vid card upgrade, but more consistent FPS with a CPU upgrade.

I guess what I’m trying to say is, either way you’ll benefit.
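FWIW, the "minimum FPS" improvement from a CPU upgrade is what reviewers report as 1% lows. If you can log frame times, the usual calculation is simple (a sketch; exact percentile conventions vary between tools):

```python
import numpy as np

def fps_stats(frame_times_ms):
    """Summarize a frame-time log (milliseconds per frame).

    Average FPS hides stutter; the 1% low (FPS of the slowest 1% of
    frames) is what a CPU upgrade tends to improve most.
    """
    ft = np.asarray(frame_times_ms)
    avg_fps = 1000.0 / ft.mean()
    slow = np.percentile(ft, 99)   # 99th-percentile frame time
    one_pct_low = 1000.0 / slow
    return avg_fps, one_pct_low

# Mostly 10 ms frames (~100 FPS) with occasional 30 ms stutters:
times = [10.0] * 97 + [30.0] * 3
avg, low = fps_stats(times)
print(f"avg {avg:.0f} FPS, 1% low {low:.0f} FPS")  # avg ~94, 1% low ~33
```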

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

What’s the forecast look like for USB-C monitor/GPU connection in the near future? I’m trying to figure out how to share my peripherals between my MacBook and my next desktop, and it would be pretty nifty if I didn’t have to chain a bunch of moderately expensive converters and switches to do so.

repiv
Aug 13, 2009

RTX cards can already do it; the USB-C VirtualLink port can also drive regular USB-C monitors.

Make sure the SKU you choose has the port though, some non-reference designs omit it.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Murvin posted:

So I jumped the gun and upgraded to a 2560x1440 monitor before my hardware was probably ready for it, and some games are struggling a bit. I'm using the old and formerly very popular i5-4460 CPU and a GTX 1070 GPU. At this point I know I need a new CPU and that may be bottlenecking me, but is my GPU also good enough for 1440p, or should I be considering a step up? If so, is there a current card in the sweet spot for that resolution right now?

In addition to what other people have said, consider that big value jumps in GPUs are coming sooner than big value jumps in CPUs. Unless you really need a GPU right now, I'd go for the CPU upgrade first.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

repiv posted:

RTX cards can already do it; the USB-C VirtualLink port can also drive regular USB-C monitors.

Make sure the SKU you choose has the port though, some non-reference designs omit it.

Ah, good. Then I'd just need to add yet another criterion when shopping for my next monitor, which tbh may be a mercy given how confusing I found the process last time!

Cygni
Nov 12, 2005

raring to post

repiv posted:

RTX cards can already do it; the USB-C VirtualLink port can also drive regular USB-C monitors.

Make sure the SKU you choose has the port though, some non-reference designs omit it.

Huh, I didn't know that. Is it DisplayLink over USB-C? I assumed that was just gonna be a useless plug nobody ever used, like SATA Express.

ufarn
May 30, 2009
The MSI Optix MAG251RX has a USB-C port, and it's looking like the best 1080p240 monitor when it comes out this summer. Not sure how it works with video, especially Sync.

repiv
Aug 13, 2009

Cygni posted:

Huh, I didn't know that. Is it DisplayLink over USB-C? I assumed that was just gonna be a useless plug nobody ever used, like SATA Express.

Isn't DisplayLink a fallback thing for cramming video down a data-only USB port? The port on the RTX cards has native DisplayPort-over-USB-C.

It's also a general-purpose USB 3.1 Gen 2 port you can use for whatever, and it supports USB-PD up to 27W, so it can serve as a fast charger.

VirtualLink seems to be stillborn, though; Valve cancelled the VL cable for the Index and I don't think anyone else has adopted it.

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry
Yeah, Valve said they couldn't get it to work reliably enough.

Cygni
Nov 12, 2005

raring to post

repiv posted:

Isn't DisplayLink a fallback thing for cramming video down a data-only USB port? The port on the RTX cards has native DisplayPort-over-USB-C.

It's also a general-purpose USB 3.1 Gen 2 port you can use for whatever, and it supports USB-PD up to 27W, so it can serve as a fast charger.

VirtualLink seems to be stillborn, though; Valve cancelled the VL cable for the Index and I don't think anyone else has adopted it.

Nice, I didn't know about the DisplayPort or general USB usage; I literally just thought it was a dumb dead port for me to ignore on my card forever. I guess it's slightly more useful than that!

Ganondork
Dec 26, 2012

Ganondork

Lowen SoDium posted:

Yeah, Valve said they couldn't get it to work reliably enough.

Is it just me, or has every iteration of USB since 2.0 been less and less reliable?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Ganondork posted:

Is it just me, or has every iteration of USB since 2.0 been less and less reliable?

The actual reliability of USB 3 is perfectly fine. What's not fine is the compatibility, since 3.x introduced multiple layers of confusing and conflicting speeds, capabilities, etc. They'll all work as a generic 3.0 host, but whether a given port supports the higher speeds, a given wattage for charging, video, etc. is all a poo poo-show.
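For reference, the renaming mess in one place; the same physical 5 Gbps port has worn three names across spec revisions, which is most of the confusion:

```python
# Speeds per the USB-IF specs; each revision renamed the older tiers.
usb3_speeds_gbps = {
    "USB 3.0 / USB 3.1 Gen 1 / USB 3.2 Gen 1": 5,
    "USB 3.1 Gen 2 / USB 3.2 Gen 2": 10,
    "USB 3.2 Gen 2x2": 20,  # dual-lane, USB-C connectors only
}

for names, gbps in usb3_speeds_gbps.items():
    print(f"{gbps:>2} Gbps: {names}")
```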

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE
The selling point with USB-C in particular is that hey, it's one connector that does everything and you don't have to flip it three times when plugging in, sweet. The problem is that if you look at any random USB-C port there's absolutely no way of telling what it might support. Does it supply power? Maybe, but at 5V, 12V or 20V, or any of them? How much power? Enough to charge your laptop? Who knows! Does it support power input, for that matter? Sometimes only one of the ports on a device does. Can you use it for display output? Quite possibly, but it's anyone's guess what resolutions and refresh rates are supported, or even which protocol is being used (HDMI is possible, AFAIK). It's quite common that 4K 60Hz doesn't work. And even if you just use it for data transfer, there's no consistency there either. It's a loving mess.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

DrDork posted:

The actual reliability of USB 3 is perfectly fine. What's not fine is the compatibility, since 3.x introduced multiple layers of confusing and conflicting speeds, capabilities, etc. They'll all work as a generic 3.0 host, but whether a given port supports the higher speeds, a given wattage for charging, video, etc. is all a poo poo-show.

We wanted so badly for USB 3 to be viable for the original Rift, because it would have reduced tracker latency and improved accuracy, but so many chipsets were broken in so many ways that we abandoned work on that mode. A poor guy on my team spent hours each day on the phone with firmware teams, trying to get them to even acknowledge that they might want to be compatible at some point in the future. I forget what it was that we wanted to do, maybe something with isochronous mode, but pretty much every engagement with a vendor ended with "oh, yeah, the standard says that but we don't do it. yeah, we advertise it because we do this other thing and some software expects to see both together, sorry about your life".

Maybe it’s better now, but I wouldn’t have the heart to ask Justin and find out!
