|
Fastest low profile single slot card under 35W is the GT 1030? E: lol I guess there’s the Quadro P1000 Shaocaholica fucked around with this message at 02:59 on Apr 16, 2018 |
# ? Apr 16, 2018 01:47 |
|
|
I've never actually paid attention to a GPU launch cycle until now. However, this upcoming one for Nvidia is intriguing to me, most notably because I've been holding out for a new GPU for so long that I probably want to just become an early adopter of the next generation. So, I have a couple questions: 1). What comes out first? The xx80? Something lower, like the xx50? Or something in between? 2). How long after Nvidia launches "founders edition" stuff do secondary manufacturers (such as EVGA) begin producing product?
|
# ? Apr 16, 2018 03:45 |
|
Zarin posted:I've never actually paid attention until a GPU launch cycle. However, this upcoming one for Nvidia is intriguing to me, most notably because I've been holding out for new GPU for so long that I probably want to just become an early adopter of the next generation. So, I have a couple questions: 1) The x70 and x80 parts generally come out first, while the x60 part follows about 6-8 months later in a blatant profit-grab designed to get people who HAVE TO HAVE IT buy the more expensive cards. The x50/x50Ti part follows the x60 part ~3+ months after, and then they quietly launch the 'I just want a discrete GPU, I don't care about games you nerds' x30-like parts whenever. Then, historically, the sorta-gaming-focused Titan comes out, followed about 6-9 months later by the x80Ti part. 2) A few weeks to a month, usually. GPP partners might get a jump on the others this time around. The FE thing is still 'new' and I'd imagine nVidia might do some stupid/crazy/quasi-illegal poo poo with it.
|
# ? Apr 16, 2018 03:54 |
|
what do x30 cards do that integrated GPUs can't do nowadays? At least the integrated ones I've used on my MacBook and work PC can actually play some games.
|
# ? Apr 16, 2018 04:05 |
|
I would assume they are often used in older machines or ones without integrated graphics. I bought a 1030 for a Core 2 Duo system with very basic VGA-only integrated graphics so that I could use it as a 4K HTPC. I am considering building a Sandy Bridge-E workstation to replace my work desktop, and although I already have a GT 640 that does the job, if I decided to upgrade to a 4K monitor then something like a 1030 (or RX 550, to get a third output) would be a good fit.
Eletriarnation fucked around with this message at 04:15 on Apr 16, 2018 |
# ? Apr 16, 2018 04:11 |
|
BIG HEADLINE posted:1) The x70 and x80 parts generally come out first, while the x60 part follows about 6-8 months later in a blatant profit-grab designed to get people who HAVE TO HAVE IT buy the more expensive cards. The x50/x50Ti part follows the x60 part ~3+ months after, and then they quietly launch the 'I just want a discrete GPU, I don't care about games you nerds' x30-like parts whenever. Then, historically, the sorta-gaming-focused Titan comes out, followed about 6-9 months later by the x80Ti part. Good info, thanks! Is the initial launch of the xx70 and xx80 parts noticeably more expensive than later in the lifecycle? Or is the MSRP at launch pretty much what the MSRP will be throughout the lifespan? To put it another way: if I buy early in the cycle, am I looking at getting fleeced compared to if I had waited, say, six months? I'm guessing that prices are probably going to be pretty stable for the first six months, and that I wouldn't expect to see deep discounts until the end of production run, IF there will be any discounts to be had. (I usually buy my graphics cards when the next gen is announced and prices kinda drop, but, welp)
|
# ? Apr 16, 2018 04:12 |
|
buglord posted:what do x30 cards do that integrated GPUs cant do nowadays? At least the integrated ones I've used on my macbook and work pc can actually play some games. For one, they don't eat into your system memory, and even in the case of the chopped-down 1030 part they just put out that uses DDR4, it's still a larger fixed buffer pool than the iGPU takes. Zarin posted:Good info, thanks! No one knows this time around. If crypto keeps roller-coastering in price, there's a 50/50 chance of being able to MAYBE buy a card for a semi-realistic price. But nVidia WILL figure out how to ~get dat papah~ if prices keep climbing. They want to hit $300 on their stock price. Similarly, nVidia has *no reason whatsoever* at the moment to be in a hurry to launch their new GPUs - this might be the first time in history where they'll have a stupidly low rate of product leftovers headed into a new architecture. If there's nothing old to discount, there's way less reason to be ~generous~ on an MSRP. I wouldn't put it past nVidia to sell Founder's Editions to people who are willing to jump through hoops to prove they're only buying 1-2 cards, but I doubt they'll go out of their way to do that on a large scale. All of us who've been reading the writing on the wall are expecting that this time around the MSRP of the x70 part is going to be $499 or higher. $799 isn't out of the realm of possibility for the x80 part, and at the worst peak of the BTC boom, people were saying the "1180 was going to cost $12-1500." BIG HEADLINE fucked around with this message at 04:26 on Apr 16, 2018 |
# ? Apr 16, 2018 04:20 |
|
I’m tempted to get the next x70 GPU out of sheer anxiety that prices will go into orbit again. I’ll likely just wait though. How do next gen cards perform? I read somewhere that a current gen x70 is a last gen x80, ditto with current x60s being the last gen equivalent to x70. Then there’s that whole Ti lineup and I dunno what in the world that falls into.
|
# ? Apr 16, 2018 05:14 |
|
Generally the current generation x70 is comparable to the prior generation's Ti part, and the x80 is usually ~15-25% faster than that. Then everyone knows the Titan and Ti are coming. The 970 and 980 bucked this trend, as (on paper, at launch), the 980 was only ~5-15% faster than the 970 part. Of course, the 980 owners never had to worry about the 3.5/0.5 fiasco. BIG HEADLINE fucked around with this message at 05:34 on Apr 16, 2018 |
# ? Apr 16, 2018 05:32 |
|
buglord posted:I’m tempted to get the next x70 GPU out of sheer anxiety that prices will go into orbit again. I’ll likely just wait though. How do next gen cards perform? I read somewhere that a current gen x70 is a last gen x80, ditto with current x60s being the last gen equivalent to x70. Then there’s that whole Ti lineup and I dunno what in the world that falls into. I just did exactly this for an EVGA FTW3 Hybrid on Amazon. I've been watching ETH prices as my bellwether: at the end of March it dropped to $369, and around that time GPU prices took a noticeable hit, but I haven't seen them move at all since. Now, two weeks later, ETH is up over $500, so I hope I made the right call. My new Coffee Lake system has been sad without a GPU upgrade.
|
# ? Apr 16, 2018 07:13 |
|
magimix posted:Thanks for the advice. It is *definitely* one of the GPU fans... Oh, and by the way: instead of load testing to get the fans spinning, you could have just used something like Precision X or MSI Afterburner to set the GPU fan speed to whatever you wanted (there are also programs that can do the same for CPU and case fans, and you can probably change speeds in the BIOS too). The PSU fan is probably the only one you can't manually control.
|
# ? Apr 16, 2018 08:33 |
|
I'm still sitting on an old as gently caress 6950HD, and am thinking about going straight into 4k 120 Hz when this eventually comes out. Is the 1180 probably going to be good enough, or do I probably need to wait for the 1180 Ti?
|
# ? Apr 16, 2018 08:39 |
|
I think you will need like double the power of a GTX 1080 ti to hit 4K120 at highest settings, which I doubt the 1180 or 1180 ti will hit.
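Rough pixel-throughput arithmetic behind that guess (a sketch only; real GPU cost doesn't scale purely with pixel count):

```python
def pixels_per_second(width, height, fps):
    """Raw pixels pushed per second at a given resolution and refresh rate."""
    return width * height * fps

# 4K120 vs. 1440p60, a load a 1080 Ti handles comfortably today
ratio = pixels_per_second(3840, 2160, 120) / pixels_per_second(2560, 1440, 60)
print(ratio)  # 4.5 -- 4K120 is 4.5x the pixel throughput of 1440p60
```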
|
# ? Apr 16, 2018 09:43 |
|
Was in a discussion in one of the goon discords. Went back and looked up how much I paid for a pair of 1050tis for the stepbrothers last summer. Cards that I would have liked to have been 1060 6GBs instead, but those were in stock nowhere. I paid $154 for a ASUS Geforce GTX 1050 Ti 4GB Phoenix, and $139 for a EVGA GeForce GTX 1050 Ti SC GAMING. Those cards are now $228 and $239, respectively, and are still selling for $175-ish, used.
|
# ? Apr 16, 2018 09:50 |
|
Struensee posted:I'm stilling sitting on an old as gently caress 6950HD, and am thinking about going straight into 4k 120 Hz when this eventually comes out. Is the 1180 probably going to be good enough, or do I probably need to wait for the 1180 Ti? 4K 120Hz in new games, at high settings in all situations, is almost certainly not happening for quite some time, maybe the generation after the next. Even so, it's a given that games will also increase their requirements to the point where it never quite happens/becomes very expensive. If we get a new round of consoles next year, or the year after, expect that 4K 120Hz dream to be pushed back even further. HalloKitty fucked around with this message at 15:52 on Apr 16, 2018 |
# ? Apr 16, 2018 11:42 |
|
buglord posted:I’m tempted to get the next x70 GPU out of sheer anxiety that prices will go into orbit again. I’ll likely just wait though. How do next gen cards perform? I read somewhere that a current gen x70 is a last gen x80, ditto with current x60s being the last gen equivalent to x70. Then there’s that whole Ti lineup and I dunno what in the world that falls into. I know how you feel. I couldn't wait any longer for the next-gen and grabbed a GTX 1080 FE at the local Best Buy for $589 to replace the GTX 670 I'd been using for the last year.
|
# ? Apr 16, 2018 12:31 |
|
4k120 isn't going to happen for a good few years at least. The only chance of that coming early is if PC games adopt checkerboard rendering, which is slim to none since nvidia and AMD want you to keep buying their fancy new shiny graphics cards.
|
# ? Apr 16, 2018 15:51 |
|
Zedsdeadbaby posted:4k120 isn't going to happen for a good few years at least. The only chance of that coming early is if PC games adopt checkerboard rendering, which is slim to none since nvidia and AMD want you to keep buying their fancy new shiny graphics cards. UE4 recently started shipping a temporal upscaler which is similar in principle to checkerboard rendering, and works on PC. Here's an example from their docs: 70% resolution scale, naive upscale: https://a.safe.moe/PrrH5a4.jpg 70% resolution scale, temporal upscale: https://a.safe.moe/lmteaFB.jpg 4K monitors will be far more manageable with this kind of temporal fuckery available repiv fucked around with this message at 16:38 on Apr 16, 2018 |
# ? Apr 16, 2018 16:12 |
|
It looks less poo poo, but still like poo poo. The main reason it's valuable is that while 1440 will remain the ideal resolution, we're likely about to be forced to 4k in a year or two because 4k panels are where the development is happening and thus where the next big generation of high performance displays will be.
|
# ? Apr 16, 2018 17:10 |
|
The ideal will be when we can run both temporal upscaling and dynamic resolution at the same time, then you could nominally target native 4K but allow it to dip to subnative+temporal during intense moments.
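A controller like that is conceptually just a feedback loop on frame time. A minimal sketch, assuming a 60 fps budget and the 70% floor from the examples above (the step size and thresholds are made-up tuning numbers, not anything a real engine ships):

```python
TARGET_FRAME_MS = 1000 / 60   # 60 fps frame-time budget
MIN_SCALE, MAX_SCALE = 0.70, 1.00
STEP = 0.05

def adjust_scale(scale, last_frame_ms):
    """Shrink the render scale when over budget, grow it back when there's headroom."""
    if last_frame_ms > TARGET_FRAME_MS * 1.05:   # over budget: drop toward the floor
        return max(MIN_SCALE, scale - STEP)
    if last_frame_ms < TARGET_FRAME_MS * 0.90:   # comfortable headroom: creep back up
        return min(MAX_SCALE, scale + STEP)
    return scale                                 # close enough: hold steady

# Simulated frame times: an intense stretch, then the scene calms down
scale = 1.00
for frame_ms in [15.0, 19.0, 19.5, 18.0, 14.0, 13.5]:
    scale = adjust_scale(scale, frame_ms)
```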
|
# ? Apr 16, 2018 17:13 |
|
The Consoles are doing more of that for their 4K models. I'd guess some of that might make it back over to the PC.
|
# ? Apr 16, 2018 17:18 |
|
K8.0 posted:It looks less poo poo, but still like poo poo. The main reason it's valuable is that while 1440 will remain the ideal resolution, we're likely about to be forced to 4k in a year or two because 4k panels are where the development is happening and thus where the next big generation of high performance displays will be. EDIT: Doh I'm a dummy, I thought the second pic was native and the first was the temporal method. Actually that looks drat good for 70% upscale IMO. Happy_Misanthrope fucked around with this message at 18:53 on Apr 16, 2018 |
# ? Apr 16, 2018 18:43 |
|
Wacky Delly posted:The Consoles are doing more of that for their 4K models. I'd guess some of that might make it back over to the PC. Well, far more the PS4 Pro, and even then it seems it's far less prevalent than Sony indicated it would be (1440p is quite a common Pro res) - it's obviously not as simple as flicking on a software switch. The Xbox One X is rendering either at native 4K in a lot of its enhanced titles, or so close with dynamic resolution that it's basically 'close enough' 95% of the time.
|
# ? Apr 16, 2018 18:46 |
|
Is 4k something a lot of people are using nowadays? I see it marketed everywhere but I have yet to find more than 1 person with a 4k TV/monitor. Is there any point to having a monitor below 30 inches and it being 4k? 1080p still looks great to my stupid eyes, but I'll admit none of my monitors ever look as good as my laptop's Retina screen, which is basically 4k. RIP game framerates though.
|
# ? Apr 16, 2018 18:58 |
|
4k is getting adopted far faster than HD did, but it's still rather slow in modern tech-terms.
|
# ? Apr 16, 2018 18:59 |
|
Happy_Misanthrope posted:EDIT: Doh I'm a dummy, I though the second pic was native and the first was the temporal method. Actually that looks drat good for 70% upscale IMO. Yeah I was going to say, 70% of ~800p is never going to look amazing but it's certainly a big improvement over bicubic. I'd like to see how it performs in a more realistic scenario like 70% of 4K but I don't have a 4K monitor to test on.
|
# ? Apr 16, 2018 19:05 |
|
buglord posted:Is 4k something a lot of people are using nowadays? I see it marketed everywhere but I have yet to find more than 1 person with a 4k TV/monitor. Most midrange to higher-end laptops in the 15" and 17" categories at least offer a 4K screen, and smaller ones often offer increased resolutions in between those two. With desktop monitors, the situation still kinda sucks: 1080p is the standard at 24", 1440p at 27", and most 4K offerings are either expensive or compromised in some way.
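Pixel density is the quickest way to see what 4K buys you at a given size; a quick sketch using the sizes above:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 24)))  # ~92  -- standard 24" 1080p
print(round(ppi(2560, 1440, 27)))  # ~109 -- standard 27" 1440p
print(round(ppi(3840, 2160, 27)))  # ~163 -- 27" 4K, Retina-ish territory
```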
|
# ? Apr 16, 2018 19:13 |
|
I thought nvidia GPUs don't do integer scaling and that it has to be an option on your monitor? That's one thing holding me back; I don't want soft/interpolated 1080p when it should be pixel-perfect.
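For reference, integer scaling is just nearest-neighbour upscaling by a whole-number factor: each source pixel becomes an exact block of duplicates, so 1080p lands on a 4K panel with no interpolation blur. A toy sketch:

```python
def integer_scale(img, factor):
    """Nearest-neighbour upscale by a whole-number factor: no interpolation,
    each source pixel becomes an exact factor-by-factor block."""
    out = []
    for row in img:
        scaled_row = [px for px in row for _ in range(factor)]
        out.extend(scaled_row[:] for _ in range(factor))
    return out

# A 2x2 toy "image" doubled to 4x4, the way 1080p maps onto a 4K panel
print(integer_scale([[1, 2], [3, 4]], 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```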
|
# ? Apr 16, 2018 19:16 |
|
repiv posted:Yeah I was going to say, 70% of ~800p is never going to look amazing but it's certainly a big improvement over bicubic. oh yeah i can just fake it with dsr Here's a better example then: Real native 4K: https://a.safe.moe/PbCSEJQ.png Temporal 70% 4K: https://a.safe.moe/4zpd61C.png Standard 70% 4K: https://a.safe.moe/jQ7dh2L.png Does the temporal upscale look as good as native? Nope, but it's really loving close despite only rendering half the pixels. The consoles have the right idea here. repiv fucked around with this message at 20:28 on Apr 16, 2018 |
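The "half the pixels" claim checks out, since the resolution scale applies to both axes:

```python
def rendered_fraction(scale, w=3840, h=2160):
    """Fraction of native pixels actually rendered at a given resolution scale."""
    return round(w * scale) * round(h * scale) / (w * h)

print(rendered_fraction(0.70))  # 0.49 -- a 70% scale renders ~half the pixels of native 4K
```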
# ? Apr 16, 2018 19:53 |
|
buglord posted:Is 4k something a lot of people are using nowadays? I see it marketed everywhere but I have yet to find more than 1 person with a 4k TV/monitor. I have a 27" 1440 monitor and I would definitely love a 27" 4k monitor for games but I'm holding out for a larger one because I don't want to deal with Windows UI scaling issues. Unfortunately I've been waiting for 4k prices to drop for like 5 years now and the next gen GPUs probably won't even have an output for my Korean monitor anymore while a monitor upgrade will still be $1k MaxxBot fucked around with this message at 20:53 on Apr 16, 2018 |
# ? Apr 16, 2018 20:50 |
|
I'd say 4K monitors are appreciably different at 27" to my eyes, over 1440p that is. 1080p is wildly different at that size and higher imo. But like everybody I'm waiting for 4K monitors at high refresh and, most importantly, GPUs that can handle that. 1440p is still my sweet spot after all these years. 4K TVs with 4K content these days look really great. For a long time it was kind of a waste of money but I don't think it is anymore (it helps they've come down a lot too). What's silly to me is 4K phones, 4K laptops... I mean I can still tell on a 4K laptop but it's just sooooooooo pointless unless you're drawing on it.
|
# ? Apr 16, 2018 22:37 |
|
If those upscaling tricks work down to 1080p, that would give me a (stronger) boner for APUs. 1080p30+ guaranteed on a ULV processor is worth being horny for.
|
# ? Apr 16, 2018 22:51 |
|
MaxxBot posted:I have a 27" 1440 monitor and I would definitely love a 27" 4k monitor for games but I'm holding out for a larger one because I don't want to deal with Windows UI scaling issues. Nvidia's BFGD 65" hybrid displays are due out for "executive tier" pricing later this year and are reported to solve the issue of windows UI scaling using the built in Shield to run post-effects over the top of picture in picture signals. Major downside is it's going to be at least 18 months after that for a prosumer version to be released and it's confirmed to use HDMI 2.0b, forcing people to use nvidia cards for the built in Gsync and hinting next gen nvidia cards at least won't have HDMI2.1
|
# ? Apr 16, 2018 23:27 |
|
I'm going to laugh so hard if Nvidia's new line of gaming video cards don't have HDMI 2.1 support.
|
# ? Apr 17, 2018 01:29 |
|
edit: wait nevermind that's wrong
|
# ? Apr 17, 2018 01:40 |
|
spasticColon posted:I'm going to laugh so hard if Nvidia's new line of gaming video cards don't have HDMI 2.1 support. It wouldn't surprise me at all. HDMI 2.0 already covers 4k@60Hz, which is sufficient for the vast majority of users. If you want higher than that, DP 1.4 already covers 4k@120Hz native and up to 4k@240Hz/8k@60Hz with DSC, so monitors are already taken care of. Want better than 4k@60 on a TV? Buy the NVidia-branded BFGD! Problem solved! In that no AMD solution can reasonably push 4k@much-of-anything, it's not like NVidia needs the feature set to compete. Frankly, it'd only encourage people to get non-BFGD TVs, effectively costing NVidia a few bucks.
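Back-of-the-envelope bandwidth numbers behind those claims (a rough sketch: the 1.07 blanking factor approximates reduced-blanking timings, and the "effective" rates are the raw link rates after 8b/10b encoding):

```python
def payload_gbps(w, h, fps, bits_per_pixel=24, blanking=1.07):
    """Approximate link bandwidth needed, in Gbit/s, for 8-bit RGB.
    blanking=1.07 roughly models CVT reduced-blanking timings."""
    return w * h * fps * bits_per_pixel * blanking / 1e9

HDMI_20 = 14.4   # 18 Gbps raw TMDS, ~14.4 effective after 8b/10b
DP_14   = 25.92  # 32.4 Gbps raw (HBR3), ~25.92 effective after 8b/10b

print(payload_gbps(3840, 2160, 60))   # ~12.8 -> fits HDMI 2.0
print(payload_gbps(3840, 2160, 120))  # ~25.6 -> just fits DP 1.4, far beyond HDMI 2.0
```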
|
# ? Apr 17, 2018 01:40 |
|
High refresh rate 4k monitors confuse me, are these mainly for csgo and overwatch players?
|
# ? Apr 17, 2018 01:57 |
|
They're for people who want a bigass monitor but aren't into the whole ultrawide thing. Like me. Because for the time being and the near future, a more orthodox aspect ratio really is just more practical. And you don't look like an rear end in a top hat when you post a screenshot that nobody can view properly on a 16:9 or 16:10 screen.
|
# ? Apr 17, 2018 02:04 |
|
PerrineClostermann posted:4k is getting adopted far faster than HD did, but it's still rather slow in modern tech-terms. IMO it has less to do with people wanting 4K and more to do with it getting harder to buy a TV that isn't 4K, especially when people tend to go up in screen size when buying a new one. BTW, 55 inchers and above are only like 15% of the entire TV market IIRC.
|
# ? Apr 17, 2018 02:23 |
|
|
4k monitors don't have much penetration at all. I think the Steam survey had it at 1-2%.
|
# ? Apr 17, 2018 02:55 |