BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
I think the $200 price deltas between the products are a hush-hush way of announcing that *some* kind of refresh is guaranteed. The $600 delta at the highest end leaves room for both a 3080Ti and a 3080 Super.


Rabid Snake
Aug 6, 2004



sean10mm posted:

Yeah if a 3070 is almost a 2080 loving TI, they can have my $600 or whatever.

But of course this is all still dumb rumors so who knows.

I remember when the 1070 matched the 980 Ti.

What sucked was the 980 Ti was priced at $649

and the 1070 was priced at $379.

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC
Did anyone actually manage to buy a 1070 for under $400?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Rabid Snake posted:

I remember when the 1070 matched the 980 Ti.

What sucked was the 980 Ti was priced at $649

and the 1070 was priced at $379.

I mean, it sucked for the people who bought a 980Ti late in its life, but if you bought it early you got a year+ out of it before the 1070 was released. For everyone else, that was great because they moved the performance bar up considerably with the 10-series. I'd much prefer that sort of "suck" over the suck that was the original 20-series launch.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Martian Manfucker posted:

Is 75-85c~ under load Too Hot for a 1660 ti? I feel like that might be too hot. It idles around 50c. I have the EVGA 1660 ti SC Ultra and it has a pretty dinky looking cooling apparatus, but it doesn't seem like the kind of card I'd want to spend any extra money on. I haven't messed with the fan curve on it at all or overclocked it beyond whatever the factory did.

It's hot, it's not "Too Hot".

Regrettable
Jan 5, 2010



OhFunny posted:

Did anyone actually manage to buy a 1070 for under $400?

Yeah, I got an Asus Dual 1070 for $360 right before all the bitcoin fuckery vastly inflated the prices.

RGX
Sep 23, 2004
Unstoppable

Some Goon posted:

It's hot, it's not "Too Hot".

:hmmyes:

A lot of people get way too concerned about temps imo. Modern hardware is very good at throttling when things get too toasty, and those temps are nowhere near "too hot" levels. By all means have a look at your airflow if you want an efficiency project or if your card is underperforming in the games you play, but don't let it keep you up at night, particularly if you run it with hot ambient temps. That's well within reasonable margins.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Some Goon posted:

It's hot, it's not "Too Hot".

There were several generations of cards used to mine bitcoin and some people would have them run at 90c 24/7 for months with no ill effect. You're gonna have a 3 year warranty anyway.

MonkeyLibFront
Feb 26, 2003
Where's the cake?
So if the bitcoin fuckery is all over, why are GPUs still expensive as hell, and why do we think that the new ones will come at a more "reasonable" price? Asking as someone who owns a 1050 and will be buying a 3080 variant, because I now have the disposable income that I didn't have 5 years ago.

Seamonster
Apr 30, 2007

IMMER SIEGREICH

OhFunny posted:

Did anyone actually manage to buy a 1070 for under $400?

My 1080 was 4 bills on the dot.

Martian Manfucker
Dec 27, 2012

misandry is real

RGX posted:

:hmmyes:

A lot of people get way too concerned about temps imo. Modern hardware is very good at throttling when things get too toasty, and those temps are nowhere near "too hot" levels. By all means have a look at your airflow if you want an efficiency project or if your card is underperforming in the games you play, but don't let it keep you up at night, particularly if you run it with hot ambient temps. That's well within reasonable margins.

Thanks for the reality check. I got myself all worried because a friend mentioned offhand that the temperature was high and I might be getting throttling, etc. But if throttling doesn't kick in until 90c+, I've never seen temps that high, so I'm not gonna worry about it anymore. I'm also gonna put that sound dampening padding back on the case, because the fan noise is too much for me and having all those open holes right near the motherboard is just asking for something to get spilled directly in there.

Cygni
Nov 12, 2005

raring to post

MonkeyLibFront posted:

So if the bitcoin fuckery is all over, why are GPUs still expensive as hell, and why do we think that the new ones will come at a more "reasonable" price? Asking as someone who owns a 1050 and will be buying a 3080 variant, because I now have the disposable income that I didn't have 5 years ago.

Like all technologies reaching their more developed/derived phases, every % of performance increase costs more and more to develop. The days of some random untrained dork stumbling into the room, saying "why don't you turn this part around backwards!!!" and magically doubling performance are long since past. So with the "easy wins" mostly already out of the way, every small gain becomes much more of a slog and much more rife with conscious trade-offs. The cheap and easy stuff was already done years ago. R&D costs go up, R&D costs get passed to consumers, prices go up.

Part of the rising R&D cost cycle is that it also strips competition out of the market. If you don't have billions in liquid assets to bankroll development and thousands of trained engineers on staff, you can't even begin to play in the gaming GPU space. Less competition lets the margins rise.

And of course, there are real physical limits in the universe. Investors always want to think that if you just invest some amount of cubic dollars, you can force the next big leap forward. They are stuck in that post-WW2 mentality where everything advanced fast forever. But technology doesn't work that way. There really is a point where you've made the best hammer possible, and that's it forever.

I personally don't think the 3XXX series will really be a huge price/performance step, unless Nvidia got a great deal on Samsung's manufacturing, for the simple reason that they don't have to do that. People have demonstrated that they are willing to pay high prices, and Nvidia has little competition. Price/performance was flat with the Turing launch, got a little better with the Supers, and I bet it is going to be relatively flat again this time. The wildcard here is whether Nvidia really does feel they have something to fear from the consoles.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

MonkeyLibFront posted:

So if the bitcoin fuckery is all over, why are GPUs still expensive as hell, and why do we think that the new ones will come at a more "reasonable" price? Asking as someone who owns a 1050 and will be buying a 3080 variant, because I now have the disposable income that I didn't have 5 years ago.

We don't have a particularly good idea what pricing is going to be; Jensen is known for changing his mind on pricing literally at the last minute. It's probably not going to be cheap, but it should represent a significant leap in value from Turing.

A GTX 1050 is 132 mm2 and launched at $109. A GTX 1060 6GB is 200 mm2 and launched at $249. A GTX 1650 is 200 mm2 and launched at $149.
A GTX 1080Ti is 471 mm2 and launched at $700. An RTX 2070 is 445 mm2 and launched at ~$499-599. An RTX 2080Ti is 754 mm2 and launched at ~$1200.

Size is a very significant factor in the price of Turing. Semiconductor production isn't perfect; there are defects. The larger a die is, the more likely a defect will ruin it, even if you've tried to build in redundancy. It's really loving expensive to produce huge pieces of silicon, but since GPU work is ultra-parallel, the main way to make GPUs faster is to make them bigger. The 3000 series is a significant process shrink, but the dies are still going to be huge - a 3080 is rumored to have as many cores as a 2080Ti, plus extra hardware such as more raytracing power. Also, AMD offers Nvidia no competition except at the low end.
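
To put a number on that yield point, here's a minimal sketch using the classic Poisson yield model; the defect density is an illustrative assumption, not a real fab figure:

```python
import math

# Poisson yield model: probability a die of a given area has zero killer defects.
def die_yield(area_mm2, d0_per_mm2):
    return math.exp(-d0_per_mm2 * area_mm2)

D0 = 0.001  # assumed defect density in defects/mm^2 -- illustrative only

for name, area in [("GTX 1050", 132), ("RTX 2070", 445), ("RTX 2080Ti", 754)]:
    print(f"{name} ({area} mm2): ~{die_yield(area, D0):.0%} of dies defect-free")
```

Cost per good die scales roughly with area divided by yield, so under these made-up numbers a 754 mm2 die doesn't just cost 1.7x a 445 mm2 one, it costs well over 2x.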

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

MonkeyLibFront posted:

So if the bitcoin fuckery is all over, why are GPUs still expensive as hell

Because the manufacturers figured out that people were still buying GPUs at the inflated prices caused by mining, so the prices stuck

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC

Regrettable posted:

Yeah, I got an Asus Dual 1070 for $360 right before all the bitcoin fuckery vastly inflated the prices.


Seamonster posted:

My 1080 was 4 bills on the dot.

drat. I paid $436, but then again I did buy at launch.

BurritoJustice
Oct 9, 2012

Nemo2342 posted:

I'm still using a 5 year old 980 (I skipped the 20 series because I needed to replace everything else in my PC).

980 gang :coolfish:

Also my 980 SLI setup with a stupid OC beats a stock 2080 in TS and FSE, for the one game I currently play with SLI support

BurritoJustice fucked around with this message at 09:25 on Aug 28, 2020

Arzachel
May 12, 2012

Rabid Snake posted:

I remember when the 1070 matched the 980 Ti.

What sucked was the 980 Ti was priced at $649

and the 1070 was priced at $379.

By then you could get a 980ti for less than the price of a 1070. Pascal got a lukewarm reception on launch since the 970 was such amazing value despite the vram fuckery. That it's now fondly remembered illustrates how hosed we are right now :v:

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
https://twitter.com/kopite7kimi/status/1299237294742724608?s=20

Shipon
Nov 7, 2005
curses that i bought a gsync monitor instead of a gsync compatible freesync monitor so i'm locked into nvidia at this point

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Shipon posted:

curses that i bought a gsync monitor instead of a gsync compatible freesync monitor so i'm locked into nvidia at this point

depends on which one actually, some of the newer ones support Adaptive Sync and HDMI VRR

Vir
Dec 14, 2007

Does it tickle when your Body Thetans flap their wings, eh Beatrice?
I wonder if the new Avatar movies will bring a renewed push for 3D monitors, and the implications that might have for graphics card bandwidth. No-glasses 3D is a thing, I guess, but all you have to put in a monitor to make it 3D-ready is a polarization filter, right? The rest is software. James Cameron is a bit of a techno geek, so he apparently forced Disney to put both types of HDR (Dolby Vision and HDR10+) on the Alita 4K Blu-ray. But 3D seemed to be rejected by the market last time, and at the moment you can barely find 3D screens for sale - only projectors.

Almost Smart
Sep 14, 2001

so your telling me you wasn't drunk or fucked up in anyway. when you had sex with me and that monkey
Is there a reliable way to estimate power consumption and determine if your PSU is sufficient? And could a current card like a 2080ti or RTX Titan be used as a surrogate in these calculations (unless we already have an idea of how power hungry the new cards will be)?

I think I'll probably be ok where I'm at, but I want to get an idea of how much if any headroom I'll have left.

Sphyre
Jun 14, 2001

your 750W psu will be fine to power the 3090 and that's a Sphyre Promise

ufarn
May 30, 2009

Almost Smart posted:

Is there a reliable way to estimate power consumption and determine if your PSU is sufficient? And could a current card like a 2080ti or RTX Titan be used as a surrogate in these calculations (unless we already have an idea of how power hungry the new cards will be)?

I think I'll probably be ok where I'm at, but I want to get an idea of how much if any headroom I'll have left.

If you're asking, you're probably fine. You can add your components to PCPP (PCPartPicker) and have it calculate the usage for you, as long as it has the wattage data for your components.

Assume ~350W for the Ampere GPU and add your CPU's power draw on top of that.

CPU fans are negligible, and storage drives are probably, what, 5W each roughly?

People are generally fine, but "generally" does not deal with edge cases where people have all kinds of crap hooked up to their USB ports and PCIe slots.
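
If you'd rather ballpark it by hand, here's a minimal sketch of the same tally; every wattage below is an assumption for illustration, so swap in your own parts' specs:

```python
# Rough PSU headroom estimate. All wattages are illustrative assumptions;
# check your actual components' TDP/TBP specs.
components_w = {
    "GPU (rumored Ampere figure)": 350,
    "CPU under load": 150,
    "Motherboard + RAM": 50,
    "Storage (2 drives @ ~5W)": 10,
    "Fans, USB, misc": 30,
}

psu_w = 750
total_w = sum(components_w.values())

print(f"Estimated draw: {total_w}W of {psu_w}W "
      f"({total_w / psu_w:.0%} load, {psu_w - total_w}W headroom)")
```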

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack
There are a number of PSU wattage calculators, and they seem to be broadly consistent. I plugged my specs into a couple of them and saw only a ~15W variance in the results.

repiv
Aug 13, 2009

https://twitter.com/VideoCardz/status/1299313471297814528?s=19

https://twitter.com/VideoCardz/status/1299315759173296128?s=19

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack
Such detail and eloquence in these tweets.

repiv
Aug 13, 2009

https://twitter.com/VideoCardz/status/1299320860386430977

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack
I wonder how long it'll take for the 20GB 3080s to surface.

shrike82
Jun 11, 2005

lol that the TDP is 350W (3090), 320W (3080) and then ... 220W for the 3070. A major difference seems to be that the first two are on G6X and the 3070 is on G6.

And they’ve also confirmed the 20GB 3080.

Alchenar
Apr 9, 2008

shrike82 posted:

And they’ve also confirmed the 20GB 3080.

Coming some months from now. So presumably that one will get the Ti tag.

VorpalFish
Mar 22, 2007
reasonably awesometm

shrike82 posted:

lol that the TDP is 350W (3090), 320W (3080) and then ... 220W for the 3070. A major difference seems to be that the first two are on G6X and the 3070 is on G6.

And they’ve also confirmed the 20GB 3080.

GDDR6X being the cause of the ballooning power consumption would make a lot of sense.

If we're expecting better efficiency from uarch enhancements and a more advanced node, bumping power by 25% would lead me to expect somewhere around +60% performance over the 2080Ti, which seems like a lot.

Really makes it seem like it would have been worth going HBM2e instead for the high-end SKUs, though. They have the flexibility to price them pretty much as they please.
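
Back-of-envelope version of that estimate; both numbers are assumptions (the ~25% power bump from the leak, plus a guessed ~30% perf/W gain from the node shrink and uarch):

```python
# Assumptions: +25% power versus the 2080Ti and ~30% better perf/W
# from the node shrink + architecture changes. Neither is confirmed.
power_ratio = 1.25
perf_per_watt_ratio = 1.30

perf_ratio = power_ratio * perf_per_watt_ratio  # 1.625
print(f"~+{perf_ratio - 1:.0%} over a 2080Ti")  # roughly +62%
```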

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Arzachel posted:

By then you could get a 980ti for less than the price of a 1070. Pascal got a lukewarm reception on launch since the 970 was such amazing value despite the vram fuckery. That it's now fondly remembered illustrates how hosed we are right now :v:

Pascal was a huge hit on release. The 980Tis weren't under $400 until after Pascal's launch, unless you're talking used. The 1070 offered about the same performance while drawing way less power and putting out less heat.

The 970 was still a great value, sure. But the biggest complaint about Pascal was that the cards kept selling out. Nothing about that launch was lukewarm.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

I think the 970 is remembered fondly because, despite being kind of hosed, it kept people running games at decent enough settings on their 1080p screens. Only a handful of games and settings would challenge the card's available VRAM, and outside of VRAM limits it's competitive with midrange Pascal and low-end Turing for performance.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
When I was looking at my options for a cheap upgrade for freesync compatibility, from what I gleaned, the 970 still outperformed the 1650 most of the time.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Rinkles posted:

When I was looking at my options for a cheap upgrade for freesync compatibility, from what I gleaned, the 970 still outperformed the 1650 most of the time.

I'd take a 1650 Super over a 970 zero questions asked

and the reality is, i'd probably take the integer scaling in turing over the performance in the 970 too, even in the regular 1650

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
i think they don't come blower style

repiv
Aug 13, 2009

VorpalFish posted:

Gddr6x being the cause of the ballooning power consumption would make a lot of sense.

if the memory is extremely spicy i wonder if that will be a problem for g10/g12 users

passive vram heatsinks might not cut it this time?

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

repiv posted:

if the memory is extremely spicy i wonder if that will be a problem for g10/g12 users

passive vram heatsinks might not cut it this time?

Yeah, I've already resigned myself to selling my 1080Ti with the AIO to a friend


Rollie Fingers
Jul 28, 2002

Only 10GB of memory on the 3080 is incredibly cynical from Nvidia. Sigh.

For those of us who aren't gamers and instead need a beefy graphics card for certain production software, it's basically saying 'you're hosed. go buy a Quadro that costs 150% more'.
