whatis
Jun 6, 2012

AirRaid posted:

Regarding the Titan name, has anyone even looked at the range of Titan cards recently? When it was just one it was fine, but now there are:

GTX Titan, released in 2013
GTX Titan Black, released in February 2014
GTX Titan Z, released in March 2014
GTX Titan X, released in 2015
Titan X (2016), released in 2016
Titan Xp, released in April 2017
Titan V, released in December 2017
Titan RTX, released in 2018

That is confusing as poo poo. Best drop it and roll the nomenclature into the generation numbering they have.

hoping they get rid of the Ti / Super designation as well. just go with 3085, 3075, etc

Maxwell Adams
Oct 21, 2000

T E E F S

I just can't deal with that 8-pin connector right in the middle of the card.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Taima posted:

Would also argue that you're overstating the case on mining, because that's not "constant load for hours", that's "constant load 24/7", and actual gamers had nowhere near these issues.

Most "real" miners also undervolted to keep power use down and increase profit margins, so neither overclocking performance nor flat-out cooling ability really mattered. They bought AIB cards because they could often get them cheaper than FE cards, not because of cooling issues.

That said, yeah, "reference coolers" (non-FE) have frequently been highly compromised options for people who strangely can spend $700 on a card but can't spend $750 for one with a good cooler.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

CapnAndy posted:

Yeah, $200 isn't much and I don't do this more than once per decade.

What does the monitor have to do with it, though? I need new ones of those, too.

If you have a 1080p/60Hz monitor, you're literally never going to drop below your monitor's max, so why spend the money?

If you have 4k, a 3070 is going to have some deficiencies that are a real problem.

1440p is kind of in between at high refresh. Depends on the use case, mostly. I'm guessing the 3080 is a better deal, but I think people are trying to make decisions without seeing any unbiased reviews, which is how mistakes are made.

mobby_6kl
Aug 9, 2009

by Fluffdaddy

Combat Pretzel posted:

Hmm the 3080 is 720 eurobux here in Belgium.

Kind of hilarious that an extreme performance device with a hard to manufacture gently caress-off huge die and idiotic thermals is available for less than a loving iPhone.

The iPhone's got bionic though.

repiv
Aug 13, 2009

it was a foregone conclusion but virtuallink is indeed dead, the FE cards have 1xHDMI, 3xDP and no USB-C

some AIB cards have 2xHDMI and 3xDP but none have been spotted with USB-C so far

CapnAndy
Feb 27, 2004

Some teeth long for ripping, gleaming wet from black dog gums. So you keep your eyes closed at the end. You don't want to see such a mouth up close. before the bite, before its oblivion in the goring of your soft parts, the speckled lips will curl back in a whinny of excitement. You just know it.

Lockback posted:

If you have a 1080p/60Hz monitor, you're literally never going to drop below your monitor's max, so why spend the money?

If you have 4k, a 3070 is going to have some deficiencies that are a real problem.

1440p is kind of in between at high refresh. Depends on the use case, mostly. I'm guessing the 3080 is a better deal, but I think people are trying to make decisions without seeing any unbiased reviews, which is how mistakes are made.

So basically to figure out what card I want, I need to decide if I'm gonna spend the money on 4k monitors? What an odd state of affairs.

movax
Aug 30, 2008

chaleski posted:

I'm gonna go from a 770 straight to a 3080 and I'm pretty sure my head's going to explode

Did someone post about going from a 6xx to Ampere? I want to know which goon has the biggest head explosion, because it'll be glorious.

Cao Ni Ma
May 25, 2010



$700 for the 3080 is better than I expected. I won't be getting one at release, but when the Supers come out? Def jumping into a new rig.


vvvv
Price-segmentation-wise, look at the 3080 like the 980 Ti, unless they pop in with a stripped-down version of the 3090 that's like $1k.

Cao Ni Ma fucked around with this message at 19:00 on Sep 1, 2020

sauer kraut
Oct 2, 2004
Just caught up on the video. For someone as challenged as Jensen in the presentation department, that was a smashing success through use of props :allears:
I don't like that prices stayed at the inflated Turing levels; realistically it's $100 too much for the 70/80 cards, but I understand why Nvidia did it.

Black Griffon
Mar 12, 2005

Now, in the quantum moment before the closure, when all become one. One moment left. One point of space and time.

I know who you are. You are destiny.


Wondering if there's any chance of selling my 1070. I can't really see any reason anyone would buy one, though.

Internet Explorer
Jun 1, 2005

movax posted:

Did someone post about going from a 6xx to Ampere? I want to know which goon has the biggest head explosion, because it'll be glorious.

I'll be going 980ti to 3080, which I feel like is a significant jump, but not quite 6xx. Have been using it since release on a 3440x1440 monitor and it wasn't really ever able to do ultra settings on new games. Looking forward to the upgrade. That being said, younger me is really amazed that I kept a graphics card for 5 years.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


How many bitcoin miners are going to buy all of these and mess up pricing?

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
DF video: 3080 is 70-100% faster than the 2080

Rubellavator
Aug 16, 2007

Should I upgrade to a 750W PSU or is there reason to go above that?

codo27
Apr 21, 2008

Black Griffon posted:

wondering if there's any chance to sell my 1070. I can't really see any reason to buy one.

Imagine poor me. I bought the fuckin 2080; seconds after the trade-up window expired, they announced the 2080 Super, which they obviously knew about all along. Now what can I get for this card? Not much

Nfcknblvbl
Jul 15, 2002

Internet Explorer posted:

I'll be going 980ti to 3080, which I feel like is a significant jump, but not quite 6xx. Have been using it since release on a 3440x1440 monitor and it wasn't really ever able to do ultra settings on new games. Looking forward to the upgrade. That being said, younger me is really amazed that I kept a graphics card for 5 years.

Same, but at 2560x1440 144Hz. I'm definitely going with low-medium settings on my 980ti these days, but I'm still glad about how long this card lasted.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

CapnAndy posted:

So basically to figure out what card I want, I need to decide if I'm gonna spend the money on 4k monitors? What an odd state of affairs.

I mean, yes?

To figure out what engine you need, first you should know how fast you want to go and how heavy the car is, no? Same here. A 4k screen has literally 4x the pixels of a 1080p one, so you need to be able to push 4x the pixels. Same with going high-refresh: 1080p@60 takes 1/8th the power that 4k@120Hz does.

So, yeah, you can either decide what monitors you want and then pick a GPU that will support those well, or go the other way around and figure out what you're comfortable spending for a GPU, and then buy monitors that will match your GPU. But you need to think of them together, rather than separately, to get the most out of your money.
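
A quick worked version of that pixel math (a minimal sketch in Python; "pixels per second" is used here as a crude proxy for GPU load, and real games don't scale perfectly linearly with pixel count):

```python
# Back-of-the-envelope pixel throughput for the monitor targets discussed above.
# This is only a rough proxy for GPU load; actual performance scaling varies by game.

def pixels_per_second(width: int, height: int, refresh_hz: int) -> int:
    """Raw pixels a GPU must deliver each second at a given resolution and refresh rate."""
    return width * height * refresh_hz

baseline = pixels_per_second(1920, 1080, 60)  # 1080p @ 60 Hz

targets = {
    "1080p @ 60 Hz":  (1920, 1080, 60),
    "1440p @ 144 Hz": (2560, 1440, 144),
    "4K @ 60 Hz":     (3840, 2160, 60),
    "4K @ 120 Hz":    (3840, 2160, 120),
}

for name, spec in targets.items():
    ratio = pixels_per_second(*spec) / baseline
    print(f"{name:>14}: {ratio:.1f}x the pixel rate of 1080p @ 60 Hz")
```

Running it puts 4K@60 at 4.0x and 4K@120 at 8.0x the 1080p@60 baseline, which is where the "4x the pixels" and "1/8th the power" comparisons above come from.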

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

CapnAndy posted:

So basically to figure out what card I want, I need to decide if I'm gonna spend the money on 4k monitors? What an odd state of affairs.

I mean, I don't know how else you'd do it? Basically you decide what your target is (the monitor), then you get a card in the right tier for that target.

If you're getting a card and a monitor, I'd just decide on a budget and then figure out what fits in there. If your budget is $800, a 3080 on a $100 monitor is dumb, so get a 3070 and a decent monitor. If 4K is a big deal to you, then you have to spend more overall.

It's how segmentation works.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Rubellavator posted:

Should I upgrade to a 750W PSU or is there reason to go above that?

No and no.

The "recommended" 750W from NVidia is likely assuming you're buying some no-name Chinese poo poo-tier 750W PSU. If you have something from an actual brand name you recognize, you can probably get away with considerably less than 750W. Either way, you're fine.
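
If you want to sanity-check that for your own build, the rough math looks something like this (a minimal sketch; the 320 W board power for the 3080 is Nvidia's published figure, while the other wattages are ballpark assumptions you should swap for your actual parts):

```python
# Rough PSU sizing estimate. The GPU figure is the 3080's published 320 W board
# power; everything else is a ballpark assumption for illustration only.

ASSUMED_DRAW_WATTS = {
    "RTX 3080 board power": 320,        # published spec
    "CPU under gaming load": 150,       # varies a lot by model and overclock (assumption)
    "Motherboard / RAM / drives": 50,   # assumption
    "Fans / USB / peripherals": 30,     # assumption
}

HEADROOM = 1.2  # ~20% margin for transient spikes and PSU aging (assumption)

total_draw = sum(ASSUMED_DRAW_WATTS.values())
suggested_psu = total_draw * HEADROOM
margin_pct = round((HEADROOM - 1) * 100)

print(f"Estimated peak system draw: {total_draw} W")
print(f"With ~{margin_pct}% headroom: look for a quality PSU around {suggested_psu:.0f} W")
```

With those numbers it lands around 550 W of draw and roughly 660 W with margin, which is why a decent 650-750 W unit is comfortable and the blanket 750 W "requirement" mostly exists to cover bad PSUs.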

Cantide
Jun 13, 2001
Pillbug

Rubellavator posted:

Should I upgrade to a 750W PSU or is there reason to go above that?

Gamersnexus just did a video on that.
https://youtu.be/X_wtoCBahhM

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

repiv posted:

it was a foregone conclusion but virtuallink is indeed dead, the FE cards have 1xHDMI, 3xDP and no USB-C

some AIB cards have 2xHDMI and 3xDP but none have been spotted with USB-C so far

Look at how they massacred my boy, RIP virtual link.

MarcusSA
Sep 23, 2007

I kinda feel like a high-refresh-rate 1440p is better than a 4K 60Hz for the long term.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

MarcusSA posted:

I kinda feel like a high-refresh-rate 1440p is better than a 4K 60Hz for the long term.

It is. 60hz is just sickly now. We aren't buying these new monster GPUs just to play games at such a low output.

That said, these GPUs are clearly paving the way for 4k144.

Fuzz
Jun 2, 2003

Avatar brought to you by the TG Sanity fund
EVGA, take my money and lemme buy a 3090. :f5:

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Digital Foundry's written article; they have a 3080 already, as previously mentioned:

https://www.eurogamer.net/articles/digitalfoundry-2020-hands-on-with-nvidia-rtx-3080

quote:

in fact, the card has been with me since late last week

codo27
Apr 21, 2008

If you don't have a high refresh rate you might as well be on console. Once you've seen it, everything else is like looking at a slide show.

Black Griffon
Mar 12, 2005

Now, in the quantum moment before the closure, when all become one. One moment left. One point of space and time.

I know who you are. You are destiny.


DrDork posted:

No and no.

The "recommended" 750W from NVidia is likely assuming you're buying some no-name Chinese poo poo-tier 750W PSU. If you have something from an actual brand name you recognize, you can probably get away with considerably less than 750W. Either way, you're fine.

What about my five year old 550W for a 3080 though?

codo27 posted:

Imagine poor me. I bought the fuckin 2080; seconds after the trade-up window expired, they announced the 2080 Super, which they obviously knew about all along. Now what can I get for this card? Not much

for sale, 2080, non-super

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

Well, I've settled on the 3080 since I have the Asus ROG P278Q. By the time my end-of-year bonus pays out next year, I can re-evaluate the 4k monitor market and see whether reselling the 3080 for whatever 3090/Ti equivalent offers the best 4k performance is worth it, if it really matters that much...

Is 4k really that great anyway? I found 1440p blew me away jumping from 1080p.

KingKapalone
Dec 20, 2005
1/16 Native American + 1/2 Hungarian = Totally Badass
Do AIBs come out on 9/17 too? What vendors have been good lately? My 1070 is from Gigabyte but I haven't heard much about them lately.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Maxwell Adams posted:

I just can't deal with that 8-pin connector right in the middle of the card.

That's actually the new 12-pin.

repiv
Aug 13, 2009

BIG HEADLINE posted:

That's actually the new 12-pin.

and it's mounted at a 45-degree angle, so the cable should wrap neatly around the card when installed

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Zedsdeadbaby posted:

Digital Foundry's written article; they have a 3080 already, as previously mentioned:

https://www.eurogamer.net/articles/digitalfoundry-2020-hands-on-with-nvidia-rtx-3080

Yuck on these review terms:

quote:

Full disclosure: I can bring you the results of key tests today, but there are caveats in place. Nvidia has selected the games covered, for starters, and specified 4K resolution to remove the CPU completely from the test results and in all cases, settings were maxed as much as they could be. The games in question are Doom Eternal, Control, Shadow of the Tomb Raider, Battlefield 5, Borderlands 3 and Quake 2 RTX. Secondly, frame-time and frame-rate metrics are reserved for the reviews cycle, meaning our tests were limited to comparisons with RTX 2080 (its last-gen equivalent in both naming and price) and differences had to be expressed in percentage terms

Shogunner
Apr 29, 2010

Ready to crash and burn.
I never learn.
I'm on the rapetrain.
So ZOTAC says here https://twitter.com/ZOTAC/status/1300856066566225927/photo/1 "first availability will be the RTX 3080 Series Sept 17"

So I guess we're all just going to be mashing F5 on launch day?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Black Griffon posted:

What about my five year old 550W for a 3080 though?

Quite iffy, and heavily dependent on what CPU you're running and if you overclock it. 600-650W is probably where the "safe" bottom is gonna be.

Also, at 5 years old, I'd look to see if it's still under warranty. If it's not, I might consider replacing it sooner rather than later regardless.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Lockback posted:

Yuck on these review terms:

It's likely just down to drivers not being ready yet.

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

Zedsdeadbaby posted:

Digital Foundry's written article; they have a 3080 already, as previously mentioned:

https://www.eurogamer.net/articles/digitalfoundry-2020-hands-on-with-nvidia-rtx-3080

70-80% uplift at 4K on rasterization? :aaaaa:

repiv
Aug 13, 2009

https://www.nvidia.com/en-us/geforce/news/30-series-watch-dogs-legion-bundle/

the 3070/3080/3090 have a day-one bundle with watch dogs legion and 1 year of geforce now

mcbexx
Jul 4, 2004

British dentistry is
not on trial here!



Here's that chart in all its glory ...

mcbexx fucked around with this message at 19:50 on Sep 1, 2020

Shrimp or Shrimps
Feb 14, 2012


#LiveToGame
