Shaocaholica
Oct 29, 2002

Fig. 5E
Fastest low-profile single-slot card under 35W is the GT 1030?

E: lol I guess there’s the Quadro P1000

Shaocaholica fucked around with this message at 02:59 on Apr 16, 2018

Zarin
Nov 11, 2008

I SEE YOU
I've never actually paid attention to a GPU launch cycle before. However, this upcoming one from Nvidia is intriguing to me, most notably because I've been holding out for a new GPU for so long that I probably want to just become an early adopter of the next generation. So, I have a couple of questions:

1) What comes out first? The xx80? Something lower, like the xx50? Or something in between?
2) How long after Nvidia launches "Founders Edition" stuff do secondary manufacturers (such as EVGA) begin producing product?

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Zarin posted:

I've never actually paid attention to a GPU launch cycle before. However, this upcoming one from Nvidia is intriguing to me, most notably because I've been holding out for a new GPU for so long that I probably want to just become an early adopter of the next generation. So, I have a couple of questions:

1) What comes out first? The xx80? Something lower, like the xx50? Or something in between?
2) How long after Nvidia launches "Founders Edition" stuff do secondary manufacturers (such as EVGA) begin producing product?

1) The x70 and x80 parts generally come out first, while the x60 part follows about 6-8 months later in a blatant profit-grab designed to get people who HAVE TO HAVE IT to buy the more expensive cards. The x50/x50Ti part follows the x60 part ~3+ months later, and then they quietly launch the 'I just want a discrete GPU, I don't care about games you nerds' x30-like parts whenever. Then, historically, the sorta-gaming-focused Titan comes out, followed about 6-9 months later by the x80Ti part.

2) A few weeks to a month, usually. GPP partners might get a jump on the others this time around. The FE thing is still 'new' and I'd imagine nVidia might do some stupid/crazy/quasi-illegal poo poo with it.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
What do x30 cards do that integrated GPUs can't do nowadays? At least the integrated ones I've used on my MacBook and work PC can actually play some games.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler
I would assume they are often used in older machines or ones without integrated graphics. I bought a 1030 for a Core 2 Duo system with very basic VGA-only integrated graphics so that I could use it as a 4K HTPC. I am considering building a Sandy Bridge-E workstation to replace my work desktop, and although I already have a GT 640 that does the job, if I decided to upgrade to a 4K monitor, something like a 1030 (or an RX 550, to get a third output) would be a good fit.

Eletriarnation fucked around with this message at 04:15 on Apr 16, 2018

Zarin
Nov 11, 2008

I SEE YOU

BIG HEADLINE posted:

1) The x70 and x80 parts generally come out first, while the x60 part follows about 6-8 months later in a blatant profit-grab designed to get people who HAVE TO HAVE IT to buy the more expensive cards. The x50/x50Ti part follows the x60 part ~3+ months later, and then they quietly launch the 'I just want a discrete GPU, I don't care about games you nerds' x30-like parts whenever. Then, historically, the sorta-gaming-focused Titan comes out, followed about 6-9 months later by the x80Ti part.

2) A few weeks to a month, usually. GPP partners might get a jump on the others this time around. The FE thing is still 'new' and I'd imagine nVidia might do some stupid/crazy/quasi-illegal poo poo with it.

Good info, thanks!

Is the initial launch of the xx70 and xx80 parts noticeably more expensive than later in the lifecycle? Or is the MSRP at launch pretty much what the MSRP will be throughout the lifespan?

To put it another way: if I buy early in the cycle, am I looking at getting fleeced compared to if I had waited, say, six months?

I'm guessing that prices are probably going to be pretty stable for the first six months, and that I wouldn't expect to see deep discounts until the end of the production run, IF there are any discounts to be had. (I usually buy my graphics cards when the next gen is announced and prices kinda drop, but, welp)

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

buglord posted:

What do x30 cards do that integrated GPUs can't do nowadays? At least the integrated ones I've used on my MacBook and work PC can actually play some games.

For one, they don't eat into your system memory, and even in the case of the chopped-down 1030 part they just put out that uses DDR4, it still has a larger fixed buffer pool than the iGPU takes.

Zarin posted:

Good info, thanks!

Is the initial launch of the xx70 and xx80 parts noticeably more expensive than later in the lifecycle? Or is the MSRP at launch pretty much what the MSRP will be throughout the lifespan?

To put it another way: if I buy early in the cycle, am I looking at getting fleeced compared to if I had waited, say, six months?

I'm guessing that prices are probably going to be pretty stable for the first six months, and that I wouldn't expect to see deep discounts until the end of the production run, IF there are any discounts to be had. (I usually buy my graphics cards when the next gen is announced and prices kinda drop, but, welp)

No one knows this time around. If crypto keeps roller-coastering in price, there's a 50/50 chance of being able to MAYBE buy a card for a semi-realistic price. But nVidia WILL figure out how to ~get dat papah~ if prices keep climbing. They want to hit $300 on their stock price. Similarly, nVidia has *no reason whatsoever* at the moment to be in a hurry to launch their new GPUs - this might be the first time in history where they'll have a stupidly low rate of product leftovers headed into a new architecture. If there's nothing old to discount, there's way less reason to be ~generous~ on an MSRP.

I wouldn't put it past nVidia to sell Founder's Editions to people who are willing to jump through hoops to prove they're only buying 1-2 cards, but I doubt they'll go out of their way to do that on a large scale.

All of us who've been reading the writing on the wall are expecting that this time around the MSRP of the x70 part is going to be $499 or higher. $799 isn't out of the realm of possibility for the x80 part, and at the worst peak of the BTC boom, people were saying the 1180 was going to cost $1200-$1500.

BIG HEADLINE fucked around with this message at 04:26 on Apr 16, 2018

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
I’m tempted to get the next x70 GPU out of sheer anxiety that prices will go into orbit again. I’ll likely just wait, though. How do next-gen cards perform? I read somewhere that a current-gen x70 is a last-gen x80, ditto with current x60s being the last-gen equivalent of the x70. Then there’s that whole Ti lineup, and I dunno where in the world that falls.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
Generally the current-generation x70 is comparable to the prior generation's Ti part, and the x80 is usually ~15-25% faster than that. Then everyone knows the Titan and Ti are coming.

The 970 and 980 bucked this trend, as (on paper, at launch) the 980 was only ~5-15% faster than the 970 part. Of course, 980 owners never had to worry about the 3.5GB/0.5GB fiasco.

BIG HEADLINE fucked around with this message at 05:34 on Apr 16, 2018

PITT
Sep 21, 2004
MISTER

buglord posted:

I’m tempted to get the next x70 GPU out of sheer anxiety that prices will go into orbit again. I’ll likely just wait, though. How do next-gen cards perform? I read somewhere that a current-gen x70 is a last-gen x80, ditto with current x60s being the last-gen equivalent of the x70. Then there’s that whole Ti lineup, and I dunno where in the world that falls.

I just did exactly this for an EVGA FTW3 Hybrid on Amazon. I've been watching ETH prices as my bellwether: at the end of March it dropped to $369, and around that time GPU prices took a noticeable hit, but I haven't seen them move at all since then. Now, two weeks later, ETH is up over $500, so I hope I made the right call. My new Coffee Lake system has been sad without a GPU upgrade.

Fuzz1111
Mar 17, 2001

Sorry. I couldn't find anyone to make you a cool cipher-themed avatar, and the look on this guy's face cracks me the fuck up.

magimix posted:

Thanks for the advice. It is *definitely* one of the GPU fans...
I've managed to stop 12cm case fans vibrating by putting a small piece of tape on one of the fins (it seemed to knock the fan off balance just enough to stop oscillation in the fan bearings from starting); that might work in your case, if it's not a blower-style fan anyway.

Oh, and by the way: instead of load testing to get the fans spinning, you could have just used something like Precision X or MSI Afterburner to set the GPU fan speed to whatever you wanted (there are also programs that can do the same for the CPU and other fans, and you can probably change speeds in the BIOS too). The PSU fan is probably the only one you can't manually control.
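
If you'd rather script it than run a GUI tool, here's a minimal sketch that reads the current fan speed and core temp on an nvidia card, assuming the pynvml bindings (which wrap the same NVML interface those tools sit on top of). Actually forcing a fixed speed is still vendor-tool territory:

code:

# a sketch, assumes an nvidia GPU and `pip install pynvml`
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetFanSpeed, nvmlDeviceGetTemperature,
                    NVML_TEMPERATURE_GPU)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
    fan = nvmlDeviceGetFanSpeed(gpu)      # intended fan speed, % of max
    temp = nvmlDeviceGetTemperature(gpu, NVML_TEMPERATURE_GPU)  # core temp, C
    print(f"fan {fan}% / {temp}C")
finally:
    nvmlShutdown()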

Struensee
Nov 9, 2011
I'm still sitting on an old as gently caress HD 6950, and am thinking about going straight to 4K 120Hz when this eventually comes out. Is the 1180 probably going to be good enough, or do I probably need to wait for the 1180 Ti?

Remo
Oct 10, 2007

I wish this would go on forever
I think you will need like double the power of a GTX 1080 Ti to hit 4K120 at the highest settings, which I doubt the 1180 or 1180 Ti will manage.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Was in a discussion in one of the goon Discords, and went back and looked up how much I paid for a pair of 1050 Tis for the stepbrothers last summer. Cards I would have liked to be 1060 6GBs instead, but those weren't in stock anywhere.

I paid $154 for an ASUS GeForce GTX 1050 Ti 4GB Phoenix, and $139 for an EVGA GeForce GTX 1050 Ti SC GAMING.

Those cards are now $228 and $239 new, respectively, and still sell for $175-ish used.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Struensee posted:

I'm still sitting on an old as gently caress HD 6950, and am thinking about going straight to 4K 120Hz when this eventually comes out. Is the 1180 probably going to be good enough, or do I probably need to wait for the 1180 Ti?

4K 120Hz in new games, at high settings in all situations, is almost certainly not happening for quite some time, maybe the generation after the next.
Even so, it's a given that games will also increase their requirements to the point where it never quite happens/becomes very expensive.
If we get a new round of consoles next year, or the year after, expect that 4K 120Hz dream to be pushed back even further.

HalloKitty fucked around with this message at 15:52 on Apr 16, 2018

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC

buglord posted:

I’m tempted to get the next x70 GPU out of sheer anxiety that prices will go into orbit again. I’ll likely just wait, though. How do next-gen cards perform? I read somewhere that a current-gen x70 is a last-gen x80, ditto with current x60s being the last-gen equivalent of the x70. Then there’s that whole Ti lineup, and I dunno where in the world that falls.

I know how you feel. I couldn't wait any longer for the next gen and grabbed a GTX 1080 FE at the local Best Buy for $589 to replace the GTX 670 I'd been using for the last year.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
4k120 isn't going to happen for a good few years at least. The only chance of it coming early is if PC games adopt checkerboard rendering, and the odds of that are slim to none since nvidia and AMD want you to keep buying their fancy new shiny graphics cards.

repiv
Aug 13, 2009

Zedsdeadbaby posted:

4k120 isn't going to happen for a good few years at least. The only chance of it coming early is if PC games adopt checkerboard rendering, and the odds of that are slim to none since nvidia and AMD want you to keep buying their fancy new shiny graphics cards.

UE4 recently started shipping a temporal upscaler which is similar in principle to checkerboard rendering, and works on PC. Here's an example from their docs:

70% resolution scale, naive upscale: https://a.safe.moe/PrrH5a4.jpg
70% resolution scale, temporal upscale: https://a.safe.moe/lmteaFB.jpg

4K monitors will be far more manageable with this kind of temporal fuckery available :)
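
For napkin math on why that helps: pixel cost scales with the square of the resolution scale, so a 70% scale only renders about half the pixels (nothing engine-specific here, just arithmetic):

code:

# rendered pixel count at a given resolution scale - plain arithmetic
def pixels(width, height, scale=1.0):
    return int(width * scale) * int(height * scale)

native = pixels(3840, 2160)        # 8,294,400 px
scaled = pixels(3840, 2160, 0.7)   # 2688 x 1512 = 4,064,256 px
print(f"{scaled / native:.0%} of native pixels rendered")  # ~49%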

repiv fucked around with this message at 16:38 on Apr 16, 2018

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
It looks less poo poo, but still like poo poo. The main reason it's valuable is that while 1440p will remain the ideal resolution, we're likely about to be forced to 4k in a year or two, because 4k panels are where the development is happening and thus where the next big generation of high-performance displays will be.

repiv
Aug 13, 2009

The ideal will be when we can run both temporal upscaling and dynamic resolution at the same time; then you could nominally target native 4K but allow it to dip to subnative+temporal during intense moments.
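
Something like this, say - a hypothetical controller (made-up numbers, not any engine's actual API) that sheds render scale when a frame blows its budget and creeps back toward native when there's headroom, with the temporal upscaler assumed to fill in the rest up to 4K:

code:

# hypothetical dynamic-resolution loop; the temporal upscaler is assumed to
# reconstruct the final 3840x2160 image from whatever scale we render at
TARGET_MS = 1000 / 60              # frame budget for a 60 fps target
MIN_SCALE, MAX_SCALE = 0.7, 1.0

def next_render_scale(scale, frame_ms):
    if frame_ms > TARGET_MS * 1.05:    # over budget: shed pixels quickly
        scale *= 0.95
    elif frame_ms < TARGET_MS * 0.90:  # headroom: creep back toward native
        scale *= 1.02
    return min(MAX_SCALE, max(MIN_SCALE, scale))

print(next_render_scale(1.0, 20.0))  # a 20ms frame at native drops scale to 0.95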

Rap Game Goku
Apr 2, 2008

Word to your moms, I came to drop spirit bombs


The consoles are doing more of that for their 4K models. I'd guess some of that might make it back over to the PC.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

K8.0 posted:

It looks less poo poo, but still like poo poo. The main reason it's valuable is that while 1440p will remain the ideal resolution, we're likely about to be forced to 4k in a year or two, because 4k panels are where the development is happening and thus where the next big generation of high-performance displays will be.

EDIT: Doh, I'm a dummy, I thought the second pic was native and the first was the temporal method. Actually that looks drat good for 70% upscale IMO.

Happy_Misanthrope fucked around with this message at 18:53 on Apr 16, 2018

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Wacky Delly posted:

The consoles are doing more of that for their 4K models. I'd guess some of that might make it back over to the PC.

Well, far more the PS4 Pro, and even then it seems it's far less prevalent than Sony indicated it would be (1440p is quite a common Pro resolution) - it's obviously not as simple as flicking a software switch.

The Xbox One X is rendering either at native 4K in a lot of its enhanced titles, or so close with dynamic resolution that it's basically 'close enough' 95% of the time.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
Is 4k something a lot of people are using nowadays? I see it marketed everywhere but I have yet to find more than 1 person with a 4k TV/monitor.

Is there any point to having a monitor below 30 inches and it being 4k? 1080p still looks great to my stupid eyes, but I'll admit none of my monitors ever look as good as my laptop's Retina screen, which is basically 4k. RIP game framerates though.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
4k is getting adopted far faster than HD did, but it's still rather slow in modern tech terms.

repiv
Aug 13, 2009

Happy_Misanthrope posted:

EDIT: Doh, I'm a dummy, I thought the second pic was native and the first was the temporal method. Actually that looks drat good for 70% upscale IMO.

Yeah I was going to say, 70% of ~800p is never going to look amazing but it's certainly a big improvement over bicubic.

I'd like to see how it performs in a more realistic scenario like 70% of 4K but I don't have a 4K monitor to test on.

wolrah
May 8, 2006
what?

buglord posted:

Is 4k something a lot of people are using nowadays? I see it marketed everywhere but I have yet to find more than 1 person with a 4k TV/monitor.
With TVs, 4K is basically the standard for larger screens and is pretty widely available at 49" and above. In that size range, only the cheap ones are still 1080p.

Most midrange to higher-end laptops in the 15" and 17" categories at least offer a 4K screen, and smaller ones often offer increased resolutions in between those two.

With desktop monitors, the situation still kinda sucks. 1080p is the standard at 24", 1440p at 27", and most 4K offerings are either expensive or compromised in some way.

quote:

Is there any point to having a monitor below 30 inches and it being 4k? 1080p still looks great to my stupid eyes, but I'll admit none of my monitors ever look as good as my laptop's Retina screen, which is basically 4k. RIP game framerates though.
Same point as on the Retina laptops. Text and any other vectorized or high-DPI UI elements look much clearer and sharper, because they are. Older games can be played at higher detail levels. Newer games can still be played at 1080p with integer scaling. Unfortunately, while Apple had control over one end of the chicken-and-egg problem and was able to force developers to care about their apps' behavior in high-DPI environments by just telling them "fix it or be broken on most new Macs", the PC world has no one able to do the same and push an ultimatum on lazy corporate app developers.
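
Integer scaling itself is trivial, which is what makes its absence so annoying - every 1080p pixel just becomes a crisp 2x2 block at 4K instead of getting interpolated. A toy sketch with numpy (illustrative only, not how any driver actually exposes it):

code:

import numpy as np

def integer_upscale(frame, factor=2):
    # nearest-neighbor: each source pixel becomes a factor-x-factor block
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # dummy RGB frame
frame_4k = integer_upscale(frame_1080p, 2)
assert frame_4k.shape == (2160, 3840, 3)  # pixel-perfect, no interpolation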

Alpha Mayo
Jan 15, 2007
hi how are you?
there was this racist piece of shit in your av so I fixed it
you're welcome
pay it forward~
I thought nvidia GPUs don't do integer scaling and it has to be an option in your monitor? That's one thing holding me back; I don't want soft/interpolated 1080p when it should be pixel-perfect.

repiv
Aug 13, 2009

repiv posted:

Yeah I was going to say, 70% of ~800p is never going to look amazing but it's certainly a big improvement over bicubic.

I'd like to see how it performs in a more realistic scenario like 70% of 4K but I don't have a 4K monitor to test on.

oh yeah i can just fake it with dsr :doh:

Here's a better example then:
Real native 4K: https://a.safe.moe/PbCSEJQ.png
Temporal 70% 4K: https://a.safe.moe/4zpd61C.png
Standard 70% 4K: https://a.safe.moe/jQ7dh2L.png

Does the temporal upscale look as good as native? Nope, but it's really loving close despite only rendering half the pixels. The consoles have the right idea here.

repiv fucked around with this message at 20:28 on Apr 16, 2018

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

buglord posted:

Is 4k something a lot of people are using nowadays? I see it marketed everywhere but I have yet to find more than 1 person with a 4k TV/monitor.

Is there any point to having a monitor below 30 inches and it being 4k? 1080p still looks great to my stupid eyes, but I'll admit none of my monitors ever look as good as my laptop's Retina screen, which is basically 4k. RIP game framerates though.

I have a 27" 1440 monitor and I would definitely love a 27" 4k monitor for games but I'm holding out for a larger one because I don't want to deal with Windows UI scaling issues.

Unfortunately I've been waiting for 4k prices to drop for like 5 years now, and the next-gen GPUs probably won't even have an output for my Korean monitor anymore, while a monitor upgrade will still be $1k :negative:

MaxxBot fucked around with this message at 20:53 on Apr 16, 2018

1gnoirents
Jun 28, 2014

hello :)
I'd say 4k monitors are noticeably different at 27" to my eyes, over 1440p that is. 1080p is wildly different at that size and higher, imo. But like everybody, I'm waiting for 4k monitors at high refresh and, most importantly, GPUs that can handle that. 1440p is still my sweet spot after all these years.

4k TVs with 4k content these days look really great. For a long time it was kind of a waste of money, but I don't think it is anymore (it helps they've come down a lot too).

What's silly to me is 4k phones, 4k laptops... I mean, I can still tell on a 4k laptop, but it's just sooooooooo pointless unless you're drawing on it.

NewFatMike
Jun 11, 2015

If those upscaling tricks work down to 1080p, that would give me a (stronger) boner for APUs.

1080p30+ guaranteed on a ULV processor is worth being horny for.

Verizian
Dec 18, 2004
The spiky one.

MaxxBot posted:

I have a 27" 1440 monitor and I would definitely love a 27" 4k monitor for games but I'm holding out for a larger one because I don't want to deal with Windows UI scaling issues.

Unfortunately I've been waiting for 4k prices to drop for like 5 years now, and the next-gen GPUs probably won't even have an output for my Korean monitor anymore, while a monitor upgrade will still be $1k :negative:

Nvidia's BFGD 65" hybrid displays are due out at "executive tier" pricing later this year, and are reported to solve the issue of Windows UI scaling by using the built-in Shield to run post-effects over the top of picture-in-picture signals.

The major downside is that it's going to be at least 18 months after that before a prosumer version is released, and it's confirmed to use HDMI 2.0b, forcing people to use nvidia cards for the built-in G-Sync and hinting that next-gen nvidia cards, at least, won't have HDMI 2.1.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
I'm going to laugh so hard if Nvidia's new line of gaming video cards doesn't have HDMI 2.1 support.

repiv
Aug 13, 2009

edit: wait nevermind that's wrong

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

spasticColon posted:

I'm going to laugh so hard if Nvidia's new line of gaming video cards doesn't have HDMI 2.1 support.

It wouldn't surprise me at all. HDMI 2.0 already covers 4k@60Hz, which is sufficient for the vast majority of users. If you want higher than that, DP 1.4 already covers 4k@120Hz native and up to 4k@240Hz/8k@60Hz with DSC, so monitors are already taken care of. Want better than 4k@60 on a TV? Buy the NVidia-branded BFGD! Problem solved!

Given that no AMD solution can reasonably push 4k@much-of-anything, it's not like NVidia needs the feature set to compete. Frankly, it'd only encourage people to get non-BFGD TVs, effectively costing NVidia a few bucks.
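
For a rough sense of where those cutoffs come from, here's the back-of-the-envelope bandwidth math (active 8-bit RGB pixel data only, ignoring blanking overhead; the link rates are the usual effective figures after 8b/10b encoding):

code:

# back-of-the-envelope display link check; ignores blanking overhead
def gbit_per_s(w, h, hz, bits_per_pixel=24):  # 24 = 8-bit RGB
    return w * h * hz * bits_per_pixel / 1e9

HDMI_2_0 = 14.4   # effective Gbit/s after 8b/10b encoding
DP_1_4   = 25.92  # effective Gbit/s (HBR3) after 8b/10b encoding

for hz in (60, 120, 240):
    need = gbit_per_s(3840, 2160, hz)
    print(f"4K@{hz}Hz needs ~{need:.1f} Gbit/s | "
          f"HDMI 2.0: {'ok' if need <= HDMI_2_0 else 'no'} | "
          f"DP 1.4: {'ok' if need <= DP_1_4 else 'needs DSC'}")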

Slider
Jun 6, 2004

POINTS
High refresh rate 4k monitors confuse me; are these mainly for CS:GO and Overwatch players?

Seamonster
Apr 30, 2007

IMMER SIEGREICH
They're for people who want a bigass monitor but aren't into the whole ultrawide thing. Like me.

Because for the time being and the near future, a more orthodox aspect ratio really is just more practical. And you don't look like an rear end in a top hat when you post a screenshot that nobody can view properly on a 16:9 or 16:10 screen.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

PerrineClostermann posted:

4k is getting adopted far faster than HD did, but it's still rather slow in modern tech terms.

IMO it has less to do with people wanting 4K and more to do with it getting harder to buy a TV that isn't 4K, especially when people tend to go up in screen size when buying a new one.

BTW, 55 inchers and above are only like 15% of the entire TV market, IIRC.

TOOT BOOT
May 25, 2010

4k monitors don't have much penetration at all. I think the Steam survey had it at 1-2%.
