sauer kraut
Oct 2, 2004

Lungboy posted:

Is a 1650 Super enough to run stuff on high on a 1200p 60Hz screen, or would a 1660 Super be better suited? I usually run a second screen at 900p alongside, with a movie or something playing on it.

e: or is it more a case of "don't buy 4gb cards in 2020" ?

Depends on what kinda stuff you have in mind. Generally the 1650 is more of a notebook chip or Asian internet cafe thing imo.

Indiana_Krom
Jun 18, 2007
Net Slacker

Lungboy posted:

e: or is it more a case of "don't buy 4gb cards in 2020" ?
This. "High" quality on most modern games is not going to work well on 4 GB, even 6 is pretty suspect. I'd say 8 should be the minimum these days if you are buying a card to use for more than just driving a display and watching video. If you are married to nvidia but still have a budget for mortals, a used 1070 or better card should still have plenty of usable life on current and upcoming games for 1200p/60 Hz "high" (except for RTX stuff of course).

Worf
Sep 12, 2017

If only Seth would love me like I love him!

sauer kraut posted:

Depends on what kinda stuff you have in mind. Generally the 1650 is more of a notebook chip or Asian internet cafe thing imo.

1650 super will go toe to toe w a 1070 when 4gb vram doesn't smoke it

Check it out on YouTube. That card freaking owns

The pure 1650 is absolutely a notebook card tho or low end / rail powered etc agree 100%

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Lungboy posted:

Is a 1650 Super enough to run stuff on high on a 1200p 60Hz screen, or would a 1660 Super be better suited? I usually run a second screen at 900p alongside, with a movie or something playing on it.

e: or is it more a case of "don't buy 4gb cards in 2020" ?

For 1200p on High, if you play graphics-intensive games, I think you'd be happier with a 1660 Super unless you absolutely think you'll upgrade as soon as next gen comes around. The 1650S is a decent card, but I just don't know how long you'd be happy with it unless you're just talking about playing games that aren't going to push too hard.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Statutory Ape posted:

1650 super will go toe to toe w a 1070 when 4gb vram doesn't smoke it

Check it out on YouTube. That card freaking owns

The pure 1650 is absolutely a notebook card tho or low end / rail powered etc agree 100%

Do you mean the 1660 Super will go toe to toe with the 1070? 1650 Super isn’t close to a 1070, otherwise no one would buy the 1660, 1660 Super or 1660 ti.

Christ, fuck Nvidia's naming lineup this generation.

Shaocaholica
Oct 29, 2002

Fig. 5E
I need a low profile 2 slot card. Should I wait or get a 1650 for $150?

SwissArmyDruid
Feb 14, 2014

by sebmojo
I am pretty sure Zotac makes a low-pro 1650.

edit: They do. It's even $150. https://www.amazon.com/ZOTAC-GeForce-Graphics-Low-Profile-ZT-T16500H-10L/dp/B07W15Y8R8

edit edit: MSI also has one, if you're allergic to Zotac, but you pay a $10 premium. https://www.amazon.com/MSI-GeForce-GTX-1650-4GT/dp/B07ZTXYVMN

SwissArmyDruid fucked around with this message at 03:18 on Jan 19, 2020

Maxwell Adams
Oct 21, 2000

T E E F S

B-Mac posted:

Do you mean the 1660 Super will go toe to toe with the 1070? 1650 Super isn’t close to a 1070, otherwise no one would buy the 1660, 1660 Super or 1660 ti.

This is correct.

quote:

Christ, fuck Nvidia's naming lineup this generation.

This is also correct.

Shaocaholica
Oct 29, 2002

Fig. 5E

SwissArmyDruid posted:

I am pretty sure Zotac makes a low-pro 1650.

edit: They do. It's even $150. https://www.amazon.com/ZOTAC-GeForce-Graphics-Low-Profile-ZT-T16500H-10L/dp/B07W15Y8R8

edit edit: MSI also has one, if you're allergic to Zotac, but you pay a $10 premium. https://www.amazon.com/MSI-GeForce-GTX-1650-4GT/dp/B07ZTXYVMN

That's what I meant. Should I get that or is something better on the horizon?

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Ok, it's not that close; otoh, it does play a lot of the same games at the same settings at playable framerates, and it's a way lower tier card.

Idk, that they were even that close was impressive to me esp considering the OG 1650

SwissArmyDruid
Feb 14, 2014

by sebmojo

Shaocaholica posted:

That's what I meant. Should I get that or is something better on the horizon?

Nothing better is on the horizon. Even if you were to wait for Ampere, Nvidia comes out with their flagship cards first, and then fills out the bottom end months later.

And AMD isn't capable of shipping anything that comes close in the 75W, powered-solely-by-the-PCIe-bus category where low-profile cards most frequently find their use case.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

ItBreathes posted:

Right now that's the one market AMD GPUs are competitive in. The 5700XT costs 80% of the 2070S while delivering ~90% of the performance. If the extra performance is worth it to you, knock yourself out; the 2070S is a fine card, but even when you have the money to spend, it's not like it can't be put towards other things.

I have a 1080ti. It's going to be a long wait

wargames
Mar 16, 2008

official yospos cat censor

SwissArmyDruid posted:

Nothing better is on the horizon. Even if you were to wait for Ampere, Nvidia comes out with their flagship cards first, and then fills out the bottom end months later.

And AMD isn't capable of shipping anything that comes close in the 75W, powered-solely-by-the-PCIe-bus category where low-profile cards most frequently find their use case.

intel with the dg1 might be able to do the 75w slot-power-only card, at least i think that was what was on show at ces.

Shaocaholica
Oct 29, 2002

Fig. 5E
I see everyone dumping on the DG1 but is it really that bad -per watt-?

SwissArmyDruid
Feb 14, 2014

by sebmojo
Let's put it this way, how does playing Destiny 2 at 1080p Low at around 40fps with significant input lag off a bus-powered card sound?

If that sounds good to you, sure, wait for DG1, because that's what they were "showing off", if it can even be called that, at CES, based on TechJesus's sneaky frame counting of high-speed footage.

Be aware that the 1650 you're looking at, powered off the same 75W, is roughly analogous to a 1060 now. The 1060 may command one 6-pin or one 8-pin connector depending on overclocks, but even the laptop version, which I am presently using, can do 1080p60 on medium-high settings.

SwissArmyDruid fucked around with this message at 18:40 on Jan 19, 2020

Shaocaholica
Oct 29, 2002

Fig. 5E
Oh I'm not saying the DG1 is comparable for my actual use case. Just curious what the per watt performance is. Yeah it's impractical regardless.

SwissArmyDruid
Feb 14, 2014

by sebmojo
In Intel's defense (and that's my one allotted use of the phrase for the year, damn, I sure got it out of the way early this time), it is entirely possible that DG1 exists within a sub-75W power band, like somewhere in the 15W-35W range, because its end goal is to be put into iGPUs, and it's only on a physical PCIe card to make it easier to slot into existing machines for software and driver testing. At that point it would be more impressive, but until we actually know how much power it's pulling, the comparison doesn't look good at all, and it raises the question of why even bother showing it off at CES. EDIT: Why bother even showing it off at CES without disclosing those caveats to make it look less bad.

SwissArmyDruid fucked around with this message at 20:05 on Jan 19, 2020

Shaocaholica
Oct 29, 2002

Fig. 5E

SwissArmyDruid posted:

...the question of why even bother showing it off at CES.

Yeah, totally. Intel just can't help itself lately, it seems. It's like Gil from The Simpsons, only with billions of dollars.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.

Shaocaholica posted:

I see everyone dumping on the DG1 but is it really that bad -per watt-?

Or... Per inch?

redeyes
Sep 14, 2002

by Fluffdaddy
The new shroud is super sweet though.

GRINDCORE MEGGIDO
Feb 28, 1985


redeyes posted:

The new shroud is super sweet though.

Raja doing what Raja do

Hopefully he'll get fired from Intel eventually and have a bright future designing shrouds for xfx

Maxwell Adams
Oct 21, 2000

T E E F S

SwissArmyDruid posted:

Let's put it this way, how does playing Destiny 2 at 1080p Low at around 40fps with significant input lag off a bus-powered card sound?

It was only able to pull off 40+ fps in that hand-picked part of the game where they just went up a narrow staircase where nothing was happening.

For some reason I thought Intel's GPU was going to be a compute beast that could also do high-end gaming. I'm really disappointed that it turned out to be a weak-ass integrated GPU that was split off into its own card. It would have been nice for someone to compete with Nvidia at the high end.

Sininu
Jan 8, 2014

I got my H55 back. They said they didn't detect any issues. I did some more extensive testing now and concluded that the noise is there using any of the fan headers, not just the AIO header, and confirmed for sure they were running in DC mode at 100%.
I also found this thread where someone recorded that noise. Corsair employees in the thread are saying it's normal.
https://forum.corsair.com/v3/showthread.php?p=990682
https://www.youtube.com/watch?v=B_w0TYafOSE
But I can hear it from the fucking other side of the room!! 4 meters away. Fuck this thing.

E: I found the water-cooling thread so I'll post there from now on.

Sininu fucked around with this message at 11:38 on Jan 20, 2020

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Maxwell Adams posted:

It was only able to pull off 40+ fps in that hand-picked part of the game where they just went up a narrow staircase where nothing was happening.

For some reason I thought Intel's GPU was going to be a compute beast that could also do high-end gaming. I'm really disappointed that it turned out to be a weakass integrated GPU that was split off into it's own card. Would have been nice for someone to compete with nVidia at the high end.

As far as competitiveness is concerned, Intel has two primary concerns: server hardware and laptops. It makes sense for them to work toward laptop hardware first, because they already have momentum in that market, whereas in the big compute space they have to try to dislodge Nvidia.

I'm sure their goal is to ultimately produce some beastly high-end product, but it didn't make sense for them to try to rush there.

The unfortunate reality is that no one is going to compete with Nvidia at the high end for at least several years, but I wouldn't completely give up hope that Intel can eventually make it there.

TorakFade
Oct 3, 2006

I strongly disapprove


with Cyberpunk being delayed to September, hopefully a "next gen" GPU will be out by then :toot:

I actually wonder if that's part of the reason for the delay, with RDR2 already giving a figurative wedgie even to the 2080Ti ... we're getting back to the wonderful Crysis times where you can't max a new, beautiful AAA game even with a top-end GPU

only back then a top-end GPU was 600-800€, today it's 1200€ :gonk:

orcane
Jun 13, 2012

Fun Shoe

TorakFade posted:

with Cyberpunk being delayed to September, hopefully a "next gen" GPU will be out by then :toot:

I actually wonder if that's part of the reason for the delay, with RDR2 already giving a figurative wedgie even to the 2080Ti ... we're getting back to the wonderful Crysis times where you can't max a new, beautiful AAA game even with a top-end GPU

only back then a top-end GPU was 600-800€, today it's 1200€ :gonk:
Uh what? Those times never went away, because badly optimized games always existed (and the goalposts of what it means to "max" a game keep shifting somewhat). You can find plenty of benchmarks where that previous €800(+) top GPU, the GTX 1080 Ti, couldn't hit 60 (average) fps in certain contemporary games at the time of the GPU's launch, and I bet you can do the same comparison with older generations.

TorakFade
Oct 3, 2006

I strongly disapprove


orcane posted:

Uh what? Those times never went away, because badly optimized games always existed (and the goalposts of what it means to "max" a game keep shifting somewhat). You can find plenty of benchmarks where that previous €800(+) top GPU, the GTX 1080 Ti, couldn't hit 60 (average) fps in certain contemporary games at the time of the GPU's launch, and I bet you can do the same comparison with older generations.

dunno, since I got my 1080 I could literally max every game without issue (1440p60) until RDR2 and AC: Odyssey which are both 2019 games ... of course that's just my empirical worthless experience, but last time that I heard people complaining about that happening was, literally, when Crysis came out.

Admittedly the era of the 970 was almost too good to be true, and today the big issue is growing resolution requiring exponentially more processing power, and I wasn't thinking of that. I bet I could still max everything with my 1080, including RDR2 and AC:Odyssey if I went back to 1080p

orcane
Jun 13, 2012

Fun Shoe
Historically, PC games could be crippled with features like ultra shadows, Hairworks/TressFX, PhysX or MSAA very regularly. Or using high-res texture packs.

Odyssey is not a good example of a game that needs parts faster than current flagship hardware to run well. It's a great example of how it's more common today to have visual options (here: volumetric clouds) whose effect is hard to see outside of zoomed-in screenshots while they cripple your framerate. It also relies on the CPU a huge deal and wastes part of its potential by aggressively unloading/reloading assets, so even a GPU faster than the RTX 2080 Ti is probably not going to help much.

And RDR2 does the thing where it offers you the "future-proof" option to drag settings beyond reasonable limits, but it also shows very weird jumps in the relative power of GPUs between resolutions and presets, so I'm thinking it's not as well optimized as it should be. That makes it hard to use as proof of a general trend towards hardware that's not even out yet. If anything, that's a Rockstar trend and not exactly a new one (GTA5 did the same, to a lesser extent).

CDPR had no problem releasing The Witcher 3 when even a GTX Titan X couldn't reliably hit 60 fps in Full HD with max. details except HairWorks. I really doubt they postponed Cyberpunk because they need a new GPU generation to run the game well.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

TorakFade posted:

dunno, since I got my 1080 I could literally max every game without issue (1440p60) until RDR2 and AC: Odyssey which are both 2019 games ... of course that's just my empirical worthless experience, but last time that I heard people complaining about that happening was, literally, when Crysis came out.

Admittedly the era of the 970 was almost too good to be true, and today the big issue is growing resolution requiring exponentially more processing power, and I wasn't thinking of that. I bet I could still max everything with my 1080, including RDR2 and AC:Odyssey if I went back to 1080p

That has to do with the console refreshes, not the GPU market. A few years ago there was still an attempt at maintaining at least some parity between the two, but that's less likely right now. With the new gen of consoles it may go back to that for a couple years.

Also, the 1080 is a nearly 4 year old GPU. I don't think there has ever been a time when you could take a 4 year old card and expect it to do 1440p60 on new games. When the 1080 came out, if you go back the same 44 months, the 680 had just come out, and I don't think that would have been able to push new games at that resolution/FPS either.

And the 690 (which was the top of the stack then, no 680ti) was $1000 in 2012 dollars (equivalent to 1200 today), so the cost complaint also doesn't hold water.

Cavauro
Jan 9, 2008

The $500 GPU I bought for Cyberpunk 2077, which now isn't coming out until September or possibly even next March, is actually going to run the game at the maximum settings except for one or two wonky features turned down to 'high' or 'medium' if it's really not very impressive and that's the end of it

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"
Take this with a mountain of salt because it just sounds like someone making shit up completely, and it's Tweaktown.

https://www.tweaktown.com/news/70053/nvidia-geforce-rtx-3080-3070-leaked-specs-up-20gb-gddr6-ram/index.html


Lol Tweaktown posted:

GA103 (GeForce RTX 3080)
10/20GB GDDR6
320-bit memory interface
60 SMs
3480 stream processors

GA104 (GeForce RTX 3070)
8/16GB GDDR6
256-bit memory interface
48 SMs
3072 stream processors

:laffo: at 16GB on the 3070
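
For what it's worth, a quick arithmetic check on that leak (assuming 64 FP32 cores per SM as on Turing; that's an assumption, since Ampere's actual SM layout wasn't public at this point):

code:

# Sanity-check the leaked SM vs. stream-processor counts, assuming 64 cores
# per SM as on Turing (an assumption; Ampere's real SM layout wasn't known).
CORES_PER_SM = 64

leak = {
    "GA103 (RTX 3080)": {"sms": 60, "claimed": 3480},
    "GA104 (RTX 3070)": {"sms": 48, "claimed": 3072},
}

for name, spec in leak.items():
    expected = spec["sms"] * CORES_PER_SM
    verdict = "consistent" if expected == spec["claimed"] else "inconsistent"
    print(f"{name}: {spec['sms']} SMs x {CORES_PER_SM} = {expected}, "
          f"leak claims {spec['claimed']} -> {verdict}")

# GA104 checks out (48 x 64 = 3072); GA103 doesn't (60 x 64 = 3840, not 3480),
# so at least one of those two GA103 numbers is wrong or a typo.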

eames
May 9, 2009

8 and 10 GB seems very realistic in light of memory prices. 16 and 20 not so much.

TechPowerUp reviewed the new $299 RTX 2060 and found that it uses the same larger TU104 die as the 2070 Super. Seems like the early reports of Nvidia producing price-cut 2070s were accurate, in a way...

repiv
Aug 13, 2009

jisforjosh posted:

:laffo: at 16GB on the 3070

As the article says, it would probably be 8GB on the RTX3070 and 16GB on the Quadro/Tesla counterparts. Nvidia nearly always doubles up the memory on the pro variants.

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin
https://twitter.com/VideoCardz/status/1219171309298561024

Pretty plausible imo. That spec leak from tweaktown seems made up though.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

TorakFade posted:

dunno, since I got my 1080 I could literally max every game without issue (1440p60) until RDR2 and AC: Odyssey which are both 2019 games ... of course that's just my empirical worthless experience, but last time that I heard people complaining about that happening was, literally, when Crysis came out.

Admittedly the era of the 970 was almost too good to be true, and today the big issue is growing resolution requiring exponentially more processing power, and I wasn't thinking of that. I bet I could still max everything with my 1080, including RDR2 and AC:Odyssey if I went back to 1080p

AC:O is just a shitshow. The 1080 will do 50fps average at 1440p, but only 60fps at 1080p (per TechPowerUp). That's a 20% performance improvement despite rendering roughly 44% fewer pixels. While "how does it run the games I want to play" is the key metric for performance, I don't think you can generalize the hardware's future performance based on anomalous software. It's probably a decent benchmark for how well it will perform in similarly troublesome games, but it (and RDR2, though I haven't looked back at that since the launch issues) isn't representative of how basically any other game performs.
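
For reference, the arithmetic behind that comparison, using the fps figures quoted above (a minimal sketch; the numbers are the cited TechPowerUp averages, nothing re-measured):

code:

# Resolution-scaling check for AC: Odyssey on a GTX 1080, using the quoted
# TechPowerUp averages (50 fps at 1440p, 60 fps at 1080p).
px_1440p = 2560 * 1440        # 3,686,400 pixels per frame
px_1080p = 1920 * 1080        # 2,073,600 pixels per frame
fps_1440p, fps_1080p = 50, 60

fewer_pixels = 1 - px_1080p / px_1440p     # ~0.44 -> ~44% fewer pixels
fps_gain = fps_1080p / fps_1440p - 1       # 0.20  -> 20% more fps

print(f"{fewer_pixels:.0%} fewer pixels, but only {fps_gain:.0%} more fps")
# A purely GPU-bound game would scale much closer to the pixel reduction;
# the gap suggests a CPU or engine bottleneck rather than raw GPU limits.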

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
Odyssey definitely has a few graphics settings that tank performance; dropping volumetric lighting to medium maintained the same visual quality but added double-digit fps gains. The game is a CPU hog too, especially in cities. The 9900K even bottlenecks at 1440p; I'd see GPU usage drop to 70% in certain areas.

Hardware Unboxed has a good optimization video on it.

https://m.youtube.com/watch?v=chqQanHcvHk
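
If you want to check for the same kind of bottleneck on your own system, watching GPU utilization while you play is the quick-and-dirty way; here's a minimal sketch using nvidia-smi (assumes an Nvidia card with nvidia-smi on the PATH):

code:

# Quick CPU-bottleneck check: poll GPU utilization once a second while a game
# runs. Utilization stuck well below ~95% at GPU-heavy settings usually means
# the card is waiting on the CPU or the engine, not running out of grunt.
# Assumes an Nvidia card with nvidia-smi available on the PATH.
import subprocess
import time

def gpu_utilization_percent() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

if __name__ == "__main__":
    for _ in range(60):  # sample for roughly a minute
        print(f"GPU utilization: {gpu_utilization_percent()}%")
        time.sleep(1)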

Rosoboronexport
Jun 14, 2006

Get in the bath, baby!
Ramrod XTreme
I'm posting here for a bit of perspective and sanity check: I'm thinking of moving from my current monitor (Dell 2415, 1920x1200@60) to Samsung 27-inch 1440p@144 monitor. However, my current GPU is GTX 1060 6GB, CPU is mildly overclocked i7-3770k.

There are some GTX 1070Ti ex-miner cards being sold for 280 eur so I could buy one and sell the 1060 for around 100-120 eur while waiting for the Ampere cards to pop up. But then again I mostly play CSGO and Battletech, the first one should have no problem running at 1440p and the other I can play at under 20 fps.

Are there other GTX 1060 owners with 1440p displays? How do you manage with this combination, and should I just skip updating the GPU? I'm aiming to upgrade the GPU when Cyberpunk is released in September.
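
For a rough sense of how much bigger an ask that monitor is for the 1060, here's the pixel-throughput arithmetic (a minimal sketch; it treats refresh rate as the target frame rate, which is the worst case, since you don't have to hit 144 fps to benefit from a 144 Hz panel):

code:

# Pixel-throughput jump from the current monitor to the proposed one.
old_px, old_hz = 1920 * 1200, 60      # Dell 1200p at 60 Hz
new_px, new_hz = 2560 * 1440, 144     # Samsung 1440p at 144 Hz

pixels_per_frame = new_px / old_px                         # 1.6x
pixels_per_second = (new_px * new_hz) / (old_px * old_hz)  # ~3.8x

print(f"{pixels_per_frame:.1f}x the pixels per frame, "
      f"{pixels_per_second:.1f}x the pixel rate at full refresh")
# CS:GO will likely still push high frame rates at 1440p on a 1060 6GB, and
# Battletech doesn't care, but anything heavier will need lower settings or
# the GPU upgrade already planned around Cyberpunk.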

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
If you don't plan on playing anything particularly new/demanding at high settings for the next few months, you will be perfectly fine.

However, I would strongly advise against that particular monitor unless watching TV/movies is your primary concern. It's a VA panel, so while the black levels are pretty solid, it has extremely slow response times on darker transitions (as in it struggles to keep up with 30 FPS, never mind 144). You can get a good 27" 144Hz IPS Freesync monitor in the same price range and have a much better gaming experience.

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC

jisforjosh posted:

Take this with a mountain of salt because it just sounds like someone making shit up completely, and it's Tweaktown.

https://www.tweaktown.com/news/70053/nvidia-geforce-rtx-3080-3070-leaked-specs-up-20gb-gddr6-ram/index.html


:laffo: at 16GB on the 3070

Nvidia has never used odd numbers in its die codenames, which makes that GA103 look suspect.

Lockback posted:

That has to do with the console refreshes, not the GPU market. A few years ago there was still an attempt at maintaining at least some parity between the two, but that's less likely right now. With the new gen of consoles it may go back to that for a couple years.

Also, the 1080 is a nearly 4 year old GPU. I don't think there has ever been a time when you could take a 4 year old card and expect it to do 1440p60 on new games. When the 1080 came out, if you go back the same 44 months, the 680 had just come out, and I don't think that would have been able to push new games at that resolution/FPS either.

And the 690 (which was the top of the stack then, no 680ti) was $1000 in 2012 dollars (equivalent to 1200 today), so the cost complaint also doesn't hold water.

Forget new games. I was rather shocked my 1080 had trouble holding 60fps at 1440p when I decided to revisit The Witcher 3 and turned every setting to max.

Rosoboronexport
Jun 14, 2006

Get in the bath, baby!
Ramrod XTreme

K8.0 posted:

You can get a good 27" 144hz IPS Freesync monitor in the same price range and have a much better gaming experience.
Could you link some? I've been searching around and the only other 27" 1440p fast-refresh monitors I've found have had TN panels in them. Though I might have been searching for "HDR" panels at the same time.
