Drone
Aug 22, 2003

Incredible machine
:smug:


So if I'm only interested in 1440p@144 Hz, I should probably get a 3070? That's kinda the vibe I'm getting from reviews that conclude the 3080 is probably overkill for anyone not interested in 4k.

Or am I interpreting this wrong?

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


Drone posted:

So if I'm only interested in 1440p@144 Hz, I should probably get a 3070? That's kinda the vibe I'm getting from reviews that conclude the 3080 is probably overkill for anyone not interested in 4k.

Or am I interpreting this wrong?

You'll still likely see a performance gain over the 3070, but those gains seem to shrink as resolution drops. At 1440p it's looking more like a 25-30% improvement over the 2080 Ti in scenarios without ray tracing or DLSS.

Rolo
Nov 16, 2005

Hmm, what have we here?
Well, if I’m gonna upgrade from 3200 memory, is there a current sweet spot that isn’t throwing 400 dollars at diminishing returns? Is 16GB still plenty?

I’m gonna be going for 4K on a 3080 and a Ryzen 7 3700X

shrike82
Jun 11, 2005

for compute - the 3080's way faster on anything that uses ray tracing, we're talking about 2x
no one's done any ML-related benchmarks yet, which sucks
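if anyone who lands one wants to post a rough number, something like this is all it'd take - just a sketch, assumes PyTorch with CUDA set up, and resnet50 / batch 64 are arbitrary picks on my part rather than any standard benchmark:

code:
# rough images/sec check for training throughput - not a rigorous benchmark
import time
import torch
import torchvision

model = torchvision.models.resnet50().cuda()
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.CrossEntropyLoss()
x = torch.randn(64, 3, 224, 224, device="cuda")
y = torch.randint(0, 1000, (64,), device="cuda")

def step():
    opt.zero_grad()
    loss_fn(model(x), y).backward()
    opt.step()

for _ in range(10):   # warm-up so the timing ignores startup costs
    step()
torch.cuda.synchronize()

iters = 50
t0 = time.time()
for _ in range(iters):
    step()
torch.cuda.synchronize()
print(f"{iters * 64 / (time.time() - t0):.1f} images/sec")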

sauer kraut
Oct 2, 2004
So the 3080 is jacked almost to the max, AMD style; I wonder why they did that.
Computerbase recommends lowering the power limit to ~270W to get an efficiency boost worthy of the new node (and improve the already decent cooler a bit more), while only losing 4% performance on average.
OC beyond 320W is a hearty lol unless you wanna play around with the memory.
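If you want to try that power-limit drop yourself, you don't even need to touch voltage curves; a minimal sketch, assuming an NVIDIA driver with nvidia-smi on the PATH and admin/root rights (270W here is just the Computerbase-style target, not a magic number):

code:
# cap the board power limit on GPU 0, then read back the power readings
import subprocess

subprocess.run(["nvidia-smi", "-i", "0", "-pl", "270"], check=True)
subprocess.run(["nvidia-smi", "-i", "0", "-q", "-d", "POWER"], check=True)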

KidDynamite
Feb 11, 2005

I'm on a 980 Ti and if I can't get one of these cards before the end of the year I'm just going to wait for the Ti or Super gen.

Cabbages and VHS
Aug 25, 2004

Listen, I've been around a bit, you know, and I thought I'd seen some creepy things go on in the movie business, but I really have to say this is the most disgusting thing that's ever happened to me.

Drone posted:

So if I'm only interested in 1440p@144 Hz, I should probably get a 3070? That's kinda the vibe I'm getting from reviews that conclude the 3080 is probably overkill for anyone not interested in 4k.

Or am I interpreting this wrong?

I think it depends on how future proof you want to be. After having x80 series cards, I'll never personally buy an x70 again.

My current setup is also 2K @ 144 and I don't see that changing soon; on the other hand, I also have a Rift and do some VR, so that's a consideration for me (a GTX 1080 cannot run Alyx at full detail). There's also stuff like Flight Sim, though I haven't seen any 3070 vs. 3080 benchmarks for that.

KidDynamite posted:

I'm on a 980 Ti and if I can't get one of these cards before the end of the year I'm just going to wait for the Ti or Super gen.

It's kinda frustrating to me that they can't meet demand a bit faster. Ultimately I'm almost 40, work full time, and have a kid, so the difference between landing one of these cards tomorrow and sometime in February isn't that significant, but the whole thing is still sort of obnoxious to me. I'd be fine with a normal preorder process that included a definite ship date at some point in the future; having to play whack-a-mole with a bunch of different sites is grating.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Rolo posted:

Well, if I’m gonna upgrade from 3200 memory, is there a current sweet spot that isn’t throwing 400 dollars at diminishing returns? Is 16GB still plenty?

I’m gonna be going for 4K on a 3080 and a Ryzen 7 3700X

The performance benefits aren't enough to justify replacing memory unless you're only hitting 3200 MHz with really terrible timings.

ChazTurbo
Oct 4, 2014
Any word on the length of the Gigabyte models?

Drone
Aug 22, 2003

Incredible machine
:smug:


Cabbages and VHS posted:

I think it depends on how future proof you want to be.

Pretty future-proof? I mean I'd like this card to last me a good 4 years or so at least, whichever card it is I end up getting. I don't see an upgrade to 4K in the cards for me for at least that long. Right now I'm on a 1050 Ti with a shitty 1080p/60 Hz cheapo monitor, which I'd upgrade to a 1440p/144 Hz monitor in parallel with my card purchase.

I've never had anything but budget or horribly out-of-date GPUs in the past, and I'm building a new PC right now anyway (i5 10600K), so the temptation is there to go slightly all-out in the name of future-proofing and finally, for once in my damn life, get that "damn, I didn't have to compromise on anything in this PC to get the performance I want" feeling.

Rolo
Nov 16, 2005

Hmm, what have we here?

K8.0 posted:

The performance benefits aren't enough to justify replacing memory unless you're only hitting 3200 MHz with really terrible timings.

Ok cool.

shrike82
Jun 11, 2005

lol Dave Lee had issues running the 3080 on a 600W PSU

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Drone posted:

Pretty future-proof? I mean I'd like this card to last me a good 4 years or so at least, whichever card it is I end up getting. I don't see an upgrade to 4K in the cards for me for at least that long. Right now I'm on a 1050 Ti with a shitty 1080p/60 Hz cheapo monitor, which I'd upgrade to a 1440p/144 Hz monitor in parallel with my card purchase.

I've never had anything but budget or horribly out-of-date GPUs in the past, and I'm building a new PC right now anyway (i5 10600K), so the temptation is there to go slightly all-out in the name of future-proofing and finally, for once in my damn life, get that "damn, I didn't have to compromise on anything in this PC to get the performance I want" feeling.

Usually it's better across that timeline to buy a cheaper card now and another one in 2 years to get the most performance over the 4. But the timelines have been stretching and, uhh, that 1050 Ti's been doing some work, so a 3080 might not be a bad get given your circumstances.

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack

shrike82 posted:

lol Dave Lee had issues running the 3080 on a 600W PSU

No fucking shit. The damn thing eats 335W without an OC; that's over half the rated output. Assuming he's testing it with other top-end parts like an i9 or whatever huge Ryzen CPU, fuck yeah he's gonna have problems.

E: this rant is aimed at him, not you.

sauer kraut
Oct 2, 2004

Drone posted:

So if I'm only interested in 1440p@144 Hz, I should probably get a 3070? That's kinda the vibe I'm getting from reviews that conclude the 3080 is probably overkill for anyone not interested in 4k.

Or am I interpreting this wrong?

It varies heavily from game to game.
To brute-force the latest Ubisoft stuff, Horizon Zero Dawn-quality ports, or god forbid FS2020, you want the best card you can afford.
For better-optimized and lighter stuff a 3070 would do fine.

Drone
Aug 22, 2003

Incredible machine
:smug:


sauer kraut posted:

god forbid FS2020

Welp I guess I'm getting a 3080 then.

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack
Bear in mind FS2020 is HEAVILY single-thread CPU bound at the moment.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

AirRaid posted:

The Gigabyte website says nothing about 4 years for the Eagle; it only touts it on the page for the Gaming line.

A gentle reminder to anyone looking for free extended warranties: many credit cards give you an extra 12 months for free, and Citibank cards and a few others give you an extra 24 months. Worth considering how you pay for your electronics sometimes.

Raymond Hog
Nov 28, 2013

Haven't seen this mentioned in the reviews I've skimmed over, but does the FE cooler design affect CPU tower cooling more than usual?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

AirRaid posted:

No fucking shit. The damn thing eats 335W without an OC; that's over half the rated output. Assuming he's testing it with other top-end parts like an i9 or whatever huge Ryzen CPU, fuck yeah he's gonna have problems.

E: this rant is aimed at him, not you.

Yeah, if anything the testing so far has validated that a 650W (hell, even a 600W) PSU should be more than enough for a 3080 + reasonable AMD CPU.

One of the other takeaways from the FE testing is that overclocking it does basically nothing: 3-5% in most cases, for an extra 50+W in power. So if you're trying to fit within a constrained temp/power envelope, just...don't overclock it.

Boz0r
Sep 7, 2006
The Rocketship in action.
When are the non-FE reviews coming out?

Cancelbot
Nov 22, 2006

Canceling spam since 1928

I am so glad my new PSU is 750W. Looks like it's right at, or slightly over, the recommended spec.

shrike82
Jun 11, 2005

a bit surprised they've pulled an AMD and pushed the 3080's TDP so hard. looks like undervolting will be a good idea, with 270W a realistic target without losing perf

Enos Cabell
Nov 3, 2004


Boz0r posted:

When are the non-FE reviews coming out?

Tomorrow at the same time you can buy them, from what I understand.

repiv
Aug 13, 2009

Raymond Hog posted:

Haven't seen this mentioned in the reviews I've skimmed over, but does the FE cooler design affect CPU tower cooling more than usual?

Reviewers aren't allowed to compare it to the more conventional AIB coolers until the AIB embargo drops tomorrow

It could go either way: the FE cooler does blast hot air at the CPU, but only half of the card's thermal output goes that way; the other half gets ejected out of the back of the case

OTOH the AIB cards will circulate all of the card's heat into the case. It's not as direct as with the FE card, but much of it will still end up being drawn into the CPU cooler

shrike82
Jun 11, 2005

and it's clear why nvidia's been mum on the 3090 - it looks like roughly 1.2x the 3080, which is itself only a ~25% uplift over the 2080 Ti...

my bar is a 50% uplift in ML perf over my RTX Titan... seems iffy the 3090 will hit it
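(napkin math: 1.2 x 1.25 ≈ 1.5, so even taking those gaming numbers at face value it only just lands at that bar, and ML perf doesn't necessarily scale the same way)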

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

Looks like I'm staying on 2560x1440 with a 3080 for the next 4-5 years, which should give me enough performance to keep graphics on ultra with good FPS.

I imagine in 3-4 years 4K RTX+DLSS performance will be cheap and easily accessible, while 8K occupies the slot 4K does today.

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack

Raymond Hog posted:

Haven't seen this mentioned in the reviews I've skimmed over, but does the FE cooler design affect CPU tower cooling more than usual?

In addition to what repiv said, the 3080 FE is not even the hottest card tested in any of the benchmarks. Shouldn't be too much of an issue.

waffle enthusiast
Nov 16, 2007



Er… I have a 650W PSU and was hoping not to have to upgrade (an i7-10700 arrives today). Is this a bad idea?

waffle enthusiast fucked around with this message at 15:39 on Sep 16, 2020

shrike82
Jun 11, 2005

waffle enthusiast posted:

Er… I have a 650W PSU and was hoping not to have to upgrade (an i7-10700 arrives today). Is this a bad idea?

he said 650W was fine

spunkshui
Oct 5, 2011



shrike82 posted:

a bit surprised they've pulled an AMD and pushed the 3080 on TDP. looks like undervolting will be a good idea and 270W a realistic target without losing perf

Can you just go hammer down on a card with 3 power plugs?

Does it cook itself into lower clock speeds?

repiv
Aug 13, 2009

spunkshui posted:

Can you just go hammer down on a card with 3 power plugs?

tune in tomorrow when the AIB cards with 3 power plugs get reviewed

we don't know what the power limit on those will be yet

spunkshui
Oct 5, 2011



repiv posted:

tune in tomorrow when the AIB cards with 3 power plugs get reviewed

we don't know what the power limit on those will be yet

Oh neat. Didn’t know that was so soon.

Incessant Excess
Aug 15, 2005

Cause of glitch:
Pretentiousness

Raymond Hog posted:

Haven't seen this mentioned in the reviews I've skimmed over, but does the FE cooler design affect CPU tower cooling more than usual?

German site saw no significant difference in a test:

TorakFade
Oct 3, 2006

I strongly disapprove


so,

my 750W EVGA G3 power supply is good enough for the 3080 (well, it better be: it has a 12-year warranty and I bought it 2 years ago, so I intend to use it until 2025 at least), upgrading from a 1080 would get me 100% more performance (plus DLSS and ray tracing), and the price/performance ratio seems very good considering that the card is 25-50% faster than the previous $1200 top-of-the-line GPU

it's overkill for my 1440p60 monitor, but that should allow me to enjoy games at maxed settings for a good few years, probably until the 5XXX series

my computer and wallet are ready, but where's the stock? No shop in my country even has it listed at all, neither the FE nor the third-party ones. I hope there's going to be at least some models in stock by Christmas :ohdear:

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

waffle enthusiast posted:

Er… I have a 650W PSU and was hoping not to have to upgrade (an i7-10700 arrives today). Is this a bad idea?

It will be fine. If you got a 10700K, you may have to keep the OCing in check, since you can about double its power use that way. But since you can only get like 5-7% absolute max out of overclocking it without water, it's again probably not really worth doing in the first place.

At stock settings for both you should be somewhere around 200-250W for the system + 325W for the 3080 under load, so at worst probably staying under 600W unless you want to OC (which, at least on the FE, seems pointless).
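Rough napkin version of that budget, using the same numbers (assumes no CPU OC and ignores transient spikes):

code:
# worst-case-ish draw from the estimates above
system = (200, 250)   # W for CPU + rest of the system under load
gpu = 325             # W for the 3080 at stock
for s in system:
    total = s + gpu
    print(f"{total} W total, {650 - total} W of headroom on a 650 W unit")
# -> 525 W / 575 W totals, i.e. roughly 75-125 W of headroom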

Shipon
Nov 7, 2005
As an upgrade from a 1080 Ti this looks like a very solid performance gain - I'll be trying to pick one up tomorrow for sure then.

pocket pool
Aug 4, 2003

B U T T S

Bleak Gremlin
Like a lot of other people in this thread, I'm gaming at 1440p/144. I kind of wish I had a 4K monitor, because it feels like the decision to upgrade here would be a little more of a no-brainer. As it stands, I feel like the most practical thing to do would be to wait a bit to see what 3080 partner card and 3070 performance looks like, particularly because I've almost entirely been playing CS:GO lately.

The performance jump is big, for sure, but I'm wondering if it's big enough to justify $400 more over a 3070 - especially if you have a G-Sync monitor like me.

Sickening
Jul 16, 2007

Black summer was the best summer.
I am just ready for this to be over with. I am already bored with it.

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack

pocket pool posted:

Like a lot of other people in this thread, I'm gaming at 1440p/144. I kind of wish I had a 4K monitor, because it feels like the decision to upgrade here would be a little more of a no-brainer. As it stands, I feel like the most practical thing to do would be to wait a bit to see what 3080 partner card and 3070 performance looks like, particularly because I've almost entirely been playing CS:GO lately.

The performance jump is big, for sure, but I'm wondering if it's big enough to justify $400 more over a 3070 - especially if you have a G-Sync monitor like me.

I am in the same boat and will be :f5:ing tomorrow for a 3080.
