Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

redreader posted:

When I asked about this earlier in the thread someone said that ray-tracing is being standardized in directx and that Nvidia will have better ray-tracing due to dedicated hardware. AMD's big navi will be on 7nm and doesn't have dedicated ray-tracing hardware, maybe they'll have better standard 3d performance. It's possible that their cards are a cheaper way to get better standard 3d performance if you don't care about ray-tracing. I do, though.

AirRaid posted:

Big Navi does support hardware ray tracing though.

yeah, people misunderstood what that patent was, AMD built their dedicated hardware into the texture unit, but they do have dedicated hardware ("BVH Traversal unit").

I think of it as being analogous to an execution port on a CPU, AMD's RT and texture units share an execution port. They're betting those operations won't happen at the same time often enough to hurt performance too much.
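To make the "BVH traversal unit" concrete, here's a toy software sketch of the loop that hardware accelerates: walk a bounding-volume hierarchy with a stack, doing a ray-vs-box slab test at each node. The node layout and names here are invented for illustration and have nothing to do with AMD's actual design.

```python
# Toy stack-based BVH traversal -- a software sketch of the loop a
# hardware "BVH traversal unit" accelerates. Node layout is made up.

def slab_test(ray_o, ray_inv_d, lo, hi):
    """Ray vs axis-aligned box via the slab method.
    (Simplified: no handling of the 0*inf corner case.)"""
    tmin, tmax = 0.0, float("inf")
    for axis in range(3):
        t1 = (lo[axis] - ray_o[axis]) * ray_inv_d[axis]
        t2 = (hi[axis] - ray_o[axis]) * ray_inv_d[axis]
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def traverse(root, ray_o, ray_d):
    """Return the 'prim' ids of leaf boxes the ray hits. Nodes are
    dicts: {'lo', 'hi', 'left', 'right'}, leaves carry 'prim'."""
    inv_d = tuple(1.0 / d if d != 0 else float("inf") for d in ray_d)
    hits, stack = [], [root]
    while stack:
        node = stack.pop()
        if not slab_test(ray_o, inv_d, node["lo"], node["hi"]):
            continue
        if "prim" in node:
            hits.append(node["prim"])
        else:
            stack.append(node["left"])
            stack.append(node["right"])
    return hits
```

The point of the execution-port analogy: every step of that loop is a handful of multiplies, min/max ops, and a memory fetch, which is exactly the kind of fixed pattern that's cheap to bolt onto an existing unit (the texture unit, in AMD's case) rather than build standalone.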


lDDQD
Apr 16, 2006

redreader posted:

When I asked about this earlier in the thread someone said that ray-tracing is being standardized in directx and that Nvidia will have better ray-tracing due to dedicated hardware. AMD's big navi will be on 7nm and doesn't have dedicated ray-tracing hardware, maybe they'll have better standard 3d performance. It's possible that their cards are a cheaper way to get better standard 3d performance if you don't care about ray-tracing. I do, though.

There's only one known way to compute BVH trees fast, and it requires fixed-function hardware, so if AMD wants to do real time raytracing, they have to have it. I'm fairly sure both companies are copying notes from the exact same academic paper, either way.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

AirRaid posted:

Big Navi does support hardware ray tracing though.

But in what manner is the question. They could cheap out and have "ray tracing support" to the same extent the consoles will, which is to say a limited, low-end implementation that makes no real attempt to replicate what something like RTX Quality can do. Or they could take a real stab at it and bank on many titles already baking console RT into their engines, then layer on top of that in a manner that lets them leverage the optimizations from a console-centric origin, so the end result is comparable to RTX quality.

We really have no idea, though, because AMD hasn't said much of anything about what their cards will be able to do yet. Just gotta wait.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

VorpalFish posted:

I mean obviously the games you play and your expectations for settings matter a lot. AAA titles with high settings still belongs to the more money than sense I don't care about value crowd, and even there you're seeing frame rate dips.

We'll see, maybe the 3080 finally does it under $1k.

yeah it just all seems fairly pointless, 1440p looks awesome, and the same hardware that gets you 4K60 gets you high refresh 1440p. And 34"/38" gaming ultrawides own too, the current crop do at least 120 hz and up to 175 hz in some cases, there are really no high-refresh 4K gaming ultrawides.

If you want it for productivity, and for older games, yeah, I guess. But for modern AAA titles it's just needlessly intensive for very little real gain imo, and means you're missing out on a bunch of other cool hardware niches.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Paul MaudDib posted:

there are really no high-refresh 4K gaming ultrawides.

To be fair, this has largely been because you couldn't shove enough bandwidth to drive one down a DP 1.4 or HDMI 2.0b line. HDMI 2.1 opens that up considerably, so I would expect we'll start seeing stuff like 5120x2160@120Hz in the nearish future.
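The bandwidth argument checks out with back-of-envelope math: pixels × refresh × bits per pixel against the effective link rate. A quick sketch (ignoring blanking overhead, which adds roughly 10-20% on top in real timings):

```python
def data_rate_gbps(h, v, hz, bits_per_pixel):
    """Uncompressed video data rate in Gbps, ignoring blanking
    overhead (real CVT timings need somewhat more than this)."""
    return h * v * hz * bits_per_pixel / 1e9

# Effective (post-encoding) link rates, Gbps:
DP_1_4   = 25.92   # 32.4 Gbps raw, 8b/10b encoding
HDMI_2_0 = 14.4    # 18 Gbps raw, 8b/10b encoding
HDMI_2_1 = 42.67   # 48 Gbps FRL, 16b/18b encoding

uw = data_rate_gbps(5120, 2160, 120, 24)      # 8-bit RGB ultrawide
print(f"5120x2160@120 needs ~{uw:.1f} Gbps")  # ~31.9 Gbps
print("fits DP 1.4:", uw < DP_1_4)            # False
print("fits HDMI 2.1:", uw < HDMI_2_1)        # True
```

So even before blanking overhead, 5120x2160@120 blows past DP 1.4 and HDMI 2.0b but fits inside HDMI 2.1, which is exactly why those monitors haven't existed yet.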

Cygni
Nov 12, 2005

raring to post

AirRaid posted:

It's a cover. Nvidia's site has images of both the 3080 with the cover also and the 3090 without -



ok word, i guess them little fins are gonna be that delicate in the 3090. this cooling solution really is insane, the boost clocks better be mental.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

DrDork posted:

To be fair, this has largely been because you couldn't shove enough bandwidth to drive one down a DP 1.4 or HDMI 2.0b line. HDMI 2.1 opens that up considerably, so I would expect we'll start seeing stuff like 5120x2160@120Hz in the nearish future.

that's the point though - 4K's bandwidth requirements are so extreme that it will forever be lagging behind what is possible at 1440p and 1080p. When HDMI 2.1 comes around we can do 1440p240 and stuff like that as well.

Every resolution is a particular set of compromises, I just don't feel like 4K gaming is really a set of compromises that work that well for many people, high-refresh 1440p is just a better general-purpose monitor for most tasks, and much cheaper.

(and 8K is going to be lol, like 30 fps or less outside DLSS-accelerated titles.)

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Paul MaudDib posted:

that's the point though - 4K's bandwidth requirements are so extreme that it will forever be lagging behind what is possible at 1440p and 1080p. When HDMI 2.1 comes around we can do 1440p240 and stuff like that as well.

Sure, but diminishing returns also exist. It's not like the difference between 240Hz and 300Hz is anywhere near as noticeable as between 60Hz and 120Hz, after all. With HDMI 2.1 we'll be able to get 4-5k into the 100-120Hz range, which is a great option for a lot of people. Is it perfect for everyone? Nope--some will legitimately prefer 1440p@200Hz or whatever, and others will say gently caress it and stick with 1080p@60Hz because it won't take a super-expensive GPU to drive.
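The diminishing-returns point falls straight out of frame times, since perceived smoothness tracks the milliseconds saved per frame, not the Hz number:

```python
def frame_time_ms(hz):
    """Time per frame in milliseconds at a given refresh rate."""
    return 1000.0 / hz

# 60 -> 120Hz shaves ~8.3ms off every frame;
# 240 -> 300Hz shaves less than 1ms.
print(frame_time_ms(60) - frame_time_ms(120))   # ~8.33 ms
print(frame_time_ms(240) - frame_time_ms(300))  # ~0.83 ms
```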

Personally, I've been loving my ultrawide, and am very much looking forward to a modest bump in both resolution and refresh rate. But to each their own.

shrike82
Jun 11, 2005

4K PC gaming doesn’t really make sense until we get decent cheap gaming monitors at that resolution.

Ugly In The Morning
Jul 1, 2010
Pillbug

shrike82 posted:

4K PC gaming doesn’t really make sense until we get decent cheap gaming monitors at that resolution.

For sure. I was just shopping those and it looks like getting one without dropping six hundo isn’t going to happen. I’m happy with 1440p144hz for now, dropping that much on a monitor feels crazy to me (I say, planning to buy a 3080)

ufarn
May 30, 2009
Some DirectML news: TensorFlow code is on GitHub.

https://twitter.com/DirectX12/status/1303440401848823809

So here is what I don't get about DirectML; who's supposed to be in charge of building and training the data we will eventually use for DirectML, and how, or through what, would that be made available to a general audience? GPU drivers? A Windows 10 update?

kloa
Feb 14, 2007


I dunno, I bought a nice Samsung LED IPS 49” to use as a gaming monitor and it’s fuggin nice.

Granted I’m not playing BR games like y’all that need 9999 FPS to shoot 12 year olds so :shrug:

Truga
May 4, 2014
Lipstick Apathy

Ugly In The Morning posted:

For sure. I was just shopping those and it looks like getting one without dropping six hundo isn’t going to happen. I’m happy with 1440p144hz for now, dropping that much on a monitor feels crazy to me (I say, planning to buy a 3080)

for me dropping a bunch on a monitor is a no-brainer, since it'll work fine and be very good for at least 10 years, unlike gpus that are good for 2 years and passable for another 3-4 max

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

shrike82 posted:

4K PC gaming doesn’t really make sense until we get decent cheap gaming monitors at that resolution.

Guess that depends on your definition of "cheap."

You can get a 4k@60 IPS monitor for ~$350. Ain't nothin, but it's also not crazy expensive or anything, especially since if you're buying into 4k gaming you're probably not thinking of driving it with a $300 GPU.

If you mean 4k@144Hz, yeah, those are still quite expensive, but several good models exist around $800. Obviously a different market at that point, but when we're also talking about people trying to decide between a $700 and a $1500 GPU, well, it's not that outlandish, either. There are numerous 1440p@144-165Hz monitors that cost around the same.

You can argue whether you think more resolution or higher refresh rates is better for a given scenario, but it's not like one is really any cheaper than the other.

Truga posted:

for me dropping a bunch on a monitor is a no-brainer, since it'll work fine and be very good for at least 10 years, unlike gpus that are good for 2 years and passable for another 3-4 max

Pretty much this. I mean we all have our priorities and all, but if you invest in a real good monitor chances are it'll outlast pretty much any other part of your system.

shrike82
Jun 11, 2005

Except even the high end 4K gaming monitors are compromised - isn’t the 2nd batch of Samsung G9s ridden with issues?

Ugly In The Morning
Jul 1, 2010
Pillbug

Truga posted:

for me dropping a bunch on a monitor is a no-brainer, since it'll work fine and be very good for at least 10 years, unlike gpus that are good for 2 years and passable for another 3-4 max

I think for me part of it is that I can’t gently caress around with monitors as much so they’re just not as fun as a new graphics card. There’s no OCing or benchmarks or stress tests to optimize, so... :effort:

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

MikeC posted:

Should have quoted I guess.


Obviously an over the top take but amusing nonetheless. I continue to hold out hope that RDNA 2 brings back some reasonable degree of competition in the upper mid range to high end space. Even if it is just raw performance without the software stack.

I think we can all agree that if the best AMD can squeeze out is a repeat of the GCN years they've given up and ceded the interesting part of the market to nvidia and maybe intel. But low-end/midrange dGPUs are kind of a dying breed to begin with considering the quality of integrated GPUs.

Would make a lot of sense for AMD to focus on APUs/Consoles only but that'd lock them out of half of the HPC market that Epyc opened up.

shrike82
Jun 11, 2005

Lisa Bae has a pretty good handle on the company - it’s probably a competitive advantage to be one of the few shops to be able to spin a CPU+GPU solution. I’d be surprised if their GPU division was in the red.

They’re still recovering from years of mismanagement under Koduri and now that he’s loving things up at Intel, the hope is that AMD can rebuild that division.

movax
Aug 30, 2008

Truga posted:

for me dropping a bunch on a monitor is a no-brainer, since it'll work fine and be very good for at least 10 years, unlike gpus that are good for 2 years and passable for another 3-4 max

I paid $1245 for my U3011 on Thanksgiving 2010 — still have the original box and it's moved like 6 times, only real "complaint" is the backlight.

shrike82 posted:

Lisa Bae has a pretty good handle on the company - it’s probably a competitive advantage to be one of the few shops to be able to spin a CPU+GPU solution. I’d be surprised if their GPU division was in the red.

They’re still recovering from years of mismanagement under Koduri and now that he’s loving things up at Intel, the hope is that AMD can rebuild that division.

What were some of the major strategic boo-boos Koduri made?

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Malcolm XML posted:

But low-end/midrange dGPUs are kind of a dying breed to begin with considering the quality of integrated GPUs.

As impressive as they are, isn't there going to be a big jump with next gen that'll put most new ("big") games out of their range again? Or will they catch up quicker this time?

shrike82
Jun 11, 2005

movax posted:

What were some of the major strategic boo-boos Koduri made?

We don’t have that much visibility into AMD then but he presided over two lovely GPU generations while their marketing made pretty dumb claims about them. He seems to be repeating his performance at Intel by coyly hinting at the gaming performance of Xe discrete which I’m guessing will be a shitshow. And the funniest thing is he’s supposedly in the running for CEO there.

repiv
Aug 13, 2009

ufarn posted:

So here is what I don't get about DirectML; who's supposed to be in charge of building and training the data we will eventually use for DirectML, and how, or through what, would that be made available to a general audience? GPU drivers? A Windows 10 update?

the model would be trained by whoever is developing the software that utilizes directml and shipped with the software

directml is just a set of low-level ML primitives that a model can be implemented on top of, it doesn't handle models directly
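The split repiv describes can be sketched like this: the runtime supplies generic low-level primitives with no knowledge of any model, while the application ships its own trained weights and wires the primitives together. This is plain Python standing in for hardware-accelerated ops purely for illustration; the real DirectML API is a C++/COM interface, and the function names here are invented.

```python
# "Runtime" side (what DirectML provides): generic primitives,
# no knowledge of any particular model.
def matmul(x, w):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*w)]
            for row in x]

def add_bias(x, b):
    return [[v + bi for v, bi in zip(row, b)] for row in x]

def relu(x):
    return [[max(v, 0.0) for v in row] for row in x]

# "Application" side (what the game/app ships): weights trained
# offline, plus the graph wiring them into the primitives.
def tiny_model(x, w1, b1, w2, b2):
    h = relu(add_bias(matmul(x, w1), b1))
    return add_bias(matmul(h, w2), b2)
```

So to ufarn's question: nobody central trains anything. Each app trains (or licenses) its own weights, bundles them with its install, and calls into the primitives the OS/driver exposes.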

shrike82
Jun 11, 2005

I suspect that people who buy 3080 FE will be in for a bad time - the 2 slot dual fan form factor doesn’t make sense cooling a 320W GPU. Even Nvidia marketing copy for the card shows it running 20c hotter than the 3090 at various noise levels.

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack
Well good news, the FE reviews are supposed to hit 3 days before launch so hopefully some actual meaningful data will be available.

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

shrike82 posted:

I suspect that people who buy 3080 FE will be in for a bad time - the 2 slot dual fan form factor doesn’t make sense cooling a 320W GPU. Even Nvidia marketing copy for the card shows it running 20c hotter than the 3090 at various noise levels.

Yeah I think I've relegated myself to shooting for a higher end EVGA or ASUS for the BIOS switch.

Speaking of that, I know ASUS has confirmed it for the Strix but has anyone seen it for the EVGA FTW3?

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Rinkles posted:

As impressive as they are, isn't there going to be a big jump with next gen that'll put most new ("big") games out of their range again? Or will they catch up quicker this time?

Even the best iGPUs these days are outperformed by a 1030, he's talking out his rear end.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

shrike82 posted:

4K PC gaming doesn’t really make sense until we get decent cheap gaming monitors at that resolution.

There is only one reason I care about 4K and plan my build around it: 4K OLED with a setup dedicated entirely to that, where you're sitting close to the set so you take full advantage of the pixels (for example that's about 6-8 feet away for a 65 inch).

You need to be taking full advantage of the benefits of OLED and HDMI 2.1 VRR for the whole setup to really be worth it.

It's insanely niche and I agree that almost no one should care about 4K pc gaming yet

Ugly In The Morning
Jul 1, 2010
Pillbug

Some Goon posted:

Even the best iGPUs these days are outperformed by a 1030, he's talking out his rear end.

I just checked and even the Iris Plus G7 is ~25 percent behind the 1030 on benchmarks. Fine for Windows, but yeah, incredibly anemic for most stuff past that.

shrike82
Jun 11, 2005

highend gaming monitors going down the route of large ultrawides with huge curves is a deadend too -

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

jisforjosh posted:

Yeah I think I've relegated myself to shooting for a higher end EVGA or ASUS for the BIOS switch.

Speaking of that, I know ASUS has confirmed it for the Strix but has anyone seen it for the EVGA FTW3?

The FTW3 will have dual bios.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

shrike82 posted:

Except even the high end 4K gaming monitors are compromised - isn’t the 2nd batch of Samsung G9s ridden with issues?

If your argument is that one incredibly niche monitor had issues bad enough for a recall, therefore 4k gaming is dumb, then might I also suggest you never buy a video card at all because every GPU manufacturer out there has had a similar recall / whoopsie? I mean, even thread-darling EVGA has hosed it up pretty bad before.

I will agree that current 4k@>100Hz monitors are all compromised in the sense that they need to use DSC to hit those numbers, which is part of why I'm excited to see HDMI 2.1 actually launch for real and hopefully spur some advances in monitor tech.

shrike82
Jun 11, 2005

what's a no compromise 4k144 monitor I can buy today that I don't have to worry about for 5 years then?

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy

shrike82 posted:

what's a no compromise 4k144 monitor I can buy today that I don't have to worry about for 5 years then?

Well it's close but the lg 48 inch oled will do 4k/120fps +gsync on hdmi 2.1

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

shrike82 posted:

what's a no compromise 4k144 monitor I can buy today that I don't have to worry about for 5 years then?

I keep saying "now that HDMI 2.1 is actually launching we should see good 4k high-hz monitors soon" and you keep ignoring that "soon" is not "now."

But, yeah, the C9 series is great if you're in the situation where you can use a TV for a monitor. I'm not, so I'm stuck waiting.

Chimp_On_Stilts
Aug 31, 2004
Holy Hell.

shrike82 posted:

highend gaming monitors going down the route of large ultrawides with huge curves is a deadend too -

Why do you think it's a dead end? Do you mean a fad that will die like 3D TV or something else?

VorpalFish
Mar 22, 2007
reasonably awesometm

HDMI 2.1 solves the bandwidth problem but oled is still the only solution for really good blacks and fast response times, oled options are pretty sparse in monitor sizes and then you get to deal with burn in and on top of all that gpus still aren't close to pushing high refresh rate.

I figure 1440p144 is a good stopgap until 4k144 microled is actually viable in the market. If I'm gonna drop 1k+ on a monitor it better be perfect.

shrike82
Jun 11, 2005

quote:

Ethereum Miners Eye NVIDIA’s RTX 30 Series GPU as RTX 3080 Offers 3-4x Better Performance in Eth
https://www.hardwaretimes.com/ethereum-miners-eye-nvidias-rtx-30-series-gpu-as-rtx-3080-offers-3-4x-better-performance-in-eth/
In the above image, you can see a mining farm using up to 8 GeForce RTX 3080 cards, possibly the iChill variant from Inno3D. As per the miners, the RTX 3080 is nearly 3-4x faster than the RTX 2080 in terms of Ethereum mining capabilities. You’re looking at 115 Mh/s, while the RTX 2080 manages just about 30-40 Mh/s.

This is a bit worrying as the Ampere GPUs are already supposed to be limited in supply for the first couple of months. Both Ethereum and Bitcoin prices have seen a revival over the past 6 months. At present, Eth is trading at $347 while Bitcoin is placed north of the 10K mark at $10,174. It’ll be really unfortunate if like the last mining boom, consumer graphics cards are cannibalized by miners.

:yeshaha:
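For what it's worth, the quoted article's numbers are internally consistent. Quick arithmetic on what's claimed (115 MH/s per 3080, 30-40 MH/s per 2080, 8 cards in the pictured rig):

```python
# Sanity check on the article's quoted Ethereum hashrates.
mh_3080, mh_2080 = 115, 35   # 35 = midpoint of the quoted 30-40 MH/s
speedup = mh_3080 / mh_2080
farm_rate = 8 * mh_3080      # the 8-card rig in the article's photo

print(round(speedup, 2))     # ~3.29x, in the claimed "3-4x" range
print(farm_rate)             # 920 MH/s for the whole rig
```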

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

VorpalFish posted:

gpus still aren't close to pushing high refresh rate.

Watch me bitch :agesilaus:

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

this is good news...

for AMD!!!


K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
AMD counters by announcing that it was never in fact Big Navi, but Dig Navi, the most powerful mining GPU of all time.
