Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
Buy the new card: 4 GB of RAM today, 3.5 GB of RAM tomorrow


1gnoirents
Jun 28, 2014

hello :)
I like Steve and I prefer his videos to any other GPU source of information out there, but I think he's really stretching the "this is good for developers..." angle. It's possible all the raytracing games fall through, or maybe the performance hit will be crazy, but these games are coming out very soon. He and others are putting zero value on RTX features because they aren't available now. I'd agree with that (much like I did with DX12) if it weren't for the release dates of these major AAA games, which is to say I don't agree with that at all. At worst it's an unknown, but I'm pretty sure we'll know sooner rather than later.

Regardless, the pricing puts a major negative shade on everything.

SwissArmyDruid
Feb 14, 2014

by sebmojo
As unspectacular as the top end of the product stack is, I kind of feel like Nvidia would have been better off holding the 2080Ti to the usual mid-cycle timing.

Stickman
Feb 1, 2004

1gnoirents posted:

I like Steve and I prefer his videos to any other GPU source of information out there, but I think he's really stretching the "this is good for developers..." angle. It's possible all the raytracing games fall through, or maybe the performance hit will be crazy, but these games are coming out very soon. He and others are putting zero value on RTX features because they aren't available now. I'd agree with that (much like I did with DX12) if it weren't for the release dates of these major AAA games, which is to say I don't agree with that at all. At worst it's an unknown, but I'm pretty sure we'll know sooner rather than later.

Regardless, the pricing puts a major negative shade on everything.

It's tough for reviews to put value on something that they can't test, though, especially if it's potentially a performance hit.

SwissArmyDruid posted:

As unspectacular as the top end of the product stack is, I kind of feel like Nvidia would have been better off holding the 2080Ti to the usual mid-cycle timing.

I feel like they included it precisely because that's where there was a gap in the market. The 2080 and 2070 are basically flat for their price point (problematic when there's a glut of overstocked 1080/Tis floating around), but the 2080 Ti doesn't have a competitor.

Stickman fucked around with this message at 21:21 on Sep 19, 2018

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Paul MaudDib posted:

Again, if you could pick up a 1080 Ti for like $400 then by all means it would be a great deal, but when you're already spending $600 on a card you should really think seriously about dropping the extra $100 to get the newer card, even if it doesn't have quite as good a price-to-performance ratio today. The 2080 is already a little faster today, it's potentially a lot faster down the road, and as a current-gen product you are the priority for support/optimizations in future titles.

counterpoint: don't spend the better part of a thousand dollars based on something that might potentially happen, jesus loving christ

The fact is that today the 2080 is a lovely deal. Don't try to talk people into buying it based on the idea that in a year from now, it might turn out to have been a less lovely deal than it seems like right now. It's highly unlikely that it's going to go up in price, so how about you wait it out and make an informed decision later rather than throwing hundreds of dollars at a speculative purchase today?

I mean, yeah, if you're sitting on some old garbage card and you were going to upgrade this month anyway and you have more money than sense, then sure, fine, go ahead and buy a 2080 instead of a 1080Ti. If not though, just wait it out and see what happens.

Eyes Only
May 20, 2008

Do not attempt to adjust your set.

SwissArmyDruid posted:

As unspectacular as the top end of the product stack is, I kind of feel like Nvidia would have been better off holding the 2080Ti to the usual mid-cycle timing.

Why? Outside of DLSS and RT, the 2080 Ti is the only card that has a value proposition at all. The others are all roughly equivalent to Pascal MSRP (let alone current retail prices) in terms of perf/$, which is like... why bother releasing them? The 2080 Ti actually has a purpose by virtue of not having a Pascal equivalent at all.

I will admit Turing's performance is a little better than I expected based on the Titan V, but still not enough to justify the prices on the lower SKUs.

For machine learning these cards are insanely good value though.

1gnoirents
Jun 28, 2014

hello :)

Stickman posted:

It's tough for reviews to put value on something that they can't test, though, especially if it's potentially a performance hit.


I feel like they included it precisely because that's where there was a gap in the market. The 2080 and 2070 are basically flat for their price point, but the 2080 Ti doesn't have a competitor.

I understand, but he's implying it should be discounted as if it won't happen at all. It's one thing not to include it in your judgement, but it's another thing to say those features are only useful for... developers?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

TheFluff posted:

I mean, yeah, if you're sitting on some old garbage card and you were going to upgrade this month anyway and you have more money than sense, then sure, fine, go ahead and buy a 2080 instead of a 1080Ti. If not though, just wait it out and see what happens.

I don't think I or anyone else suggested you should upgrade from a 1080 Ti to a 2080; that would be stupid. Obviously that would implicitly narrow things down to people who are upgrading from low-end or older cards, so way to get mad about something nobody said?

If you're looking to upgrade and don't want to spend $1,200 on a 2080 Ti, the 2080 makes more sense at this point in time than the 1080 Ti. The 2080 is already 6-8% faster, and in FP16 titles it's more like 20% faster. Again, we've already seen aftermarket cards at $750, at which point the 1080 Ti is starting to look overpriced at $600-700. The 1080 Ti needs to drop to like $500-550 to really stay viable, especially as aftermarket 2080s approach MSRP.

If anything, Pascal cards are actually bouncing upwards right now due to the perceived weakness of Turing: you can't actually find 1080 Tis at $600 anymore, they're all $630 or higher on Newegg right now, with high-end models at $650-700. Paying $650 for a 1080 Ti on the eve of the Turing launch is loving stupid.
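
To put rough numbers on that (a back-of-the-envelope sketch in Python, using the thread's assumed ~7% performance delta and the street prices above, not benchmark data):

code:
# Quick perf-per-dollar comparison using the thread's rough numbers.
cards = {
    # name: (performance relative to a 1080 Ti, street price in USD)
    "1080 Ti": (1.00, 650),  # ~$630-700 on Newegg per the post above
    "2080":    (1.07, 750),  # ~6-8% faster, aftermarket cards seen at $750
}

for name, (perf, price) in cards.items():
    print(f"{name}: {perf / price * 1000:.2f} relative perf per $1000")

# 1080 Ti: ~1.54 perf per $1000
# 2080:    ~1.43 perf per $1000
# At these prices the 1080 Ti still edges out raw perf/$; the case for the
# 2080 rests on FP16/RTX gains materializing later, which is the gamble.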

Paul MaudDib fucked around with this message at 21:27 on Sep 19, 2018

Stickman
Feb 1, 2004

1gnoirents posted:

I understand, but he's implying it should be discounted as if it won't happen at all. It's one thing not to include it in your judgement, but it's another thing to say those features are only useful for... developers?

It sounds like he's just saying he can't recommend it now.

quote:

Until we see a price drop in the 2080, compelling RTX implementations in an actually relevant game, or depleted stock on the 1080 Ti, there is no strong reason we would recommend the RTX 2080 card.

Seems sensible. It's tough to recommend something sight-unseen; when more RTX implementations actually show up then the recommendation can be revised.

Stickman fucked around with this message at 21:28 on Sep 19, 2018

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib

Paul MaudDib posted:

That's just like, your opinion man.

I bought the 780 Ti after Maxwell released; at the time EVGA was running B-stock for $180 pretty regularly. It was a good card and I got my money out of it.

The 780 Ti was pretty bad. It ran very hot and was hampered by 3 GB of RAM. I really regretted buying that card. But I didn't buy it for $180, either (full launch price).

Lambert fucked around with this message at 21:26 on Sep 19, 2018

LRADIKAL
Jun 10, 2001

Fun Shoe

Paul MaudDib posted:

Also can we please stop using GTA:V as a benchmark now; that engine is all kinds of hosed up and everything can run it at a million fps. We don't need to be benchmarking it in TYOOL 2018.

I don't think we necessarily pay ENOUGH attention to games that are actually widely played when we recommend things in the PC part picking thread or in this thread. Here's Steam's top ten list:

Current  Peak today  Game
402,047  650,922     Dota 2
385,816  477,031     Counter-Strike: Global Offensive
365,992  994,205     PLAYERUNKNOWN'S BATTLEGROUNDS
 64,689   92,532     Tom Clancy's Rainbow Six Siege
 52,181   55,741     Rocket League
 51,945   57,112     Warframe
 46,905   47,025     Football Manager 2018
 43,632  124,025     MONSTER HUNTER: WORLD
 42,509   44,118     Path of Exile
 40,612   62,834     Grand Theft Auto V

With the exception of PUBG and Monster Hunter (and kind of GTA), there's nothing on here that requires a very high-performance graphics card. I know Nvidia has only released its high end, but the number of games in which this high-end stuff matters is pretty slim.

On the flip side, below are the top graphics cards in gamers' systems (I extended the list a little to get an AMD card on it, lol).
If you're a developer, what do you target for your mainstream performance? A 960? Maybe a 1060 (3GB? 6GB?)? NVIDIA has pushed up the tent poles, but for many devs, implementing these features might be more work than it's worth given current graphics market share (see the quick tally after the list). I do hope these technologies actually end up lowering the amount of work devs need to do, but for now they'll still need to do all the rasterization work for every game they release.

I think the point I'm trying to make is that these releases do nothing to bring the majority of gamers forward into new tech and value. Perhaps we'll see that on the lower end with a 2060 or 2050, but it's not looking good.

code:
NVIDIA GeForce GTX 1060
13.31%

NVIDIA GeForce GTX 1050 Ti
8.45%

NVIDIA GeForce GTX 1050
4.69%

NVIDIA GeForce GTX 960
4.18%

NVIDIA GeForce GTX 1070
4.02%

NVIDIA GeForce GTX 750 Ti
3.51%

NVIDIA GeForce GTX 970
3.41%

NVIDIA GeForce GTX 1080
2.72%

NVIDIA GeForce GTX 960M
1.89%

NVIDIA GeForce GTX 1080 Ti
1.45%

NVIDIA GeForce GTX 950
1.41%

Intel HD Graphics 4000
1.34%

NVIDIA GeForce GT 730
1.13%

NVIDIA GeForce 940M
1.12%

NVIDIA GeForce GTX 950M
1.08%

NVIDIA GeForce GTX 760
1.02%

Intel Haswell
0.99%

AMD Radeon R7 Graphics
0.92%
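
A quick tally of that list makes the point (a rough sketch; the tier groupings are my own assumption, and the percentages are just the ones quoted above):

code:
# Sum the quoted Steam survey shares by rough performance tier.
midrange_or_below = [
    13.31, 8.45, 4.69, 4.18,       # 1060, 1050 Ti, 1050, 960
    3.51, 3.41, 1.89, 1.41,        # 750 Ti, 970, 960M, 950
    1.34, 1.13, 1.12, 1.08, 1.02,  # HD 4000, GT 730, 940M, 950M, 760
    0.99, 0.92,                    # Haswell iGPU, Radeon R7
]
high_end = [4.02, 2.72, 1.45]      # 1070, 1080, 1080 Ti

print(f"midrange or below: {sum(midrange_or_below):.1f}%")  # ~48%
print(f"1070 and up:       {sum(high_end):.1f}%")           # ~8%
# Even restricted to the cards on this list, the high end is a small slice
# of the install base, which is the developer-targeting problem above.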

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

SwissArmyDruid posted:

As unspectacular as the top end of the product stack is, I kind of feel like Nvidia would have been better off holding the 2080Ti to the usual mid-cycle timing.

As others have guessed, the problem is that we're somewhat on the cusp of the 7nm generation; it wouldn't have made sense to hold the card for ~6 months only to have the 7nm refresh show up a few months after that. I think this is basically an architecture being crammed into 12nm when it really becomes far more viable at 7nm.

It would also mean that, with the same timing and the lack of RTX-enabled games, reviews today without a 2080 Ti would basically read "So, 1080 Ti performance with 3 GB less memory, for $200 more". I mean, most reviews point out how absurdly expensive the 2080 Ti is, but at least it makes for some wowza graphs with nothing touching it performance-wise.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Stickman posted:

The loss of detail is pretty noticeable in the FFXV review, too



I suspect it's probably unnoticeable for action games, at least, since you're generally not looking at tiny details anyway. Has anyone posted reviews with the performance of native-resolution DLSS? It seems like it should still slightly outperform TAA, since it'll be running in parallel on the tensor cores, unless it bottlenecks the scene. What about 1440p? I remember you saying that lower resolutions may have lower fidelity, since important features will have less information than at 4k.

Yeah, those videos actually made me less enthusiastic about DLSS. In the Infiltrator demo in particular, the city scene had a lot of shimmering/aliasing while the native version was perfectly still. Note that the Candyland video only showed stills from that; you need to download the German hardware site's video to see it (Candyland is produced by a PC OEM, mind you; I wouldn't necessarily go to them for the most unbiased angle).

There's so much we don't know about DLSS. If you enable it, does the source resolution have to stay fixed based on what the developer submitted to Nvidia? Meaning, could you choose an intermediate resolution based on your own preference, or does Nvidia need to work from that specific res to make DLSS work at all? Who knows. It's also something that would make far more sense for the lower-end cards; for a potential 2060/2070 it may hold far more value than it does for cards that are already getting close to 60 fps at 4k. Based on those videos, I'd probably want to turn down some shadow settings first if I wanted 4k 60 fps for the few games that couldn't manage it maxed, before enabling DLSS.

Happy_Misanthrope fucked around with this message at 21:52 on Sep 19, 2018

1gnoirents
Jun 28, 2014

hello :)
I understand it's blurrier, but the TAA side is really bad; the aliasing is so bad it's affecting the textures. Look at the headrest and the nearest door edge. Besides, if you pause a crystal-clear video at any specific frame it always looks kind of blurry; in motion it looks great. I haven't downloaded any of the comparison videos yet, but the stills don't bother me unless it literally looks blurry in all the frames.

Stickman
Feb 1, 2004

1gnoirents posted:

I understand it's blurrier, but the TAA side is really bad; the aliasing is so bad it's affecting the textures. Look at the headrest and the nearest door edge. Besides, if you pause a crystal-clear video at any specific frame it always looks kind of blurry; in motion it looks great. I haven't downloaded any of the comparison videos yet, but the stills don't bother me unless it literally looks blurry in all the frames.

If you have a 4k monitor, there's a FF XV 4k video up (running on a 2080). I'm at 1440p, so it's tough to tell what's due to downsampling, what's from the game engine, and what's DLSS.

E: And what's from the video encoding, I suppose.

repiv
Aug 13, 2009

Stickman posted:

If you have a 4k monitor, there's a FF XV 4k video up (running on a 2080). I'm at 1440p, so it's tough to tell what's due to downsampling, what's from the game engine, and what's DLSS.

E: And what's from the video encoding, I suppose.

The Germans have better quality videos for comparison:

Carecat
Apr 27, 2004

Buglord
Buying a 20xx card with the idea of futureproofing can't be sensible. This is the first version of RTX, and it will most likely be much more refined in the next.

3peat
May 6, 2010

TheFluff posted:

counterpoint: don't spend the better part of a thousand dollars based on something that might potentially happen, jesus loving christ

Seriously, it's hilarious seeing the nvidia fanboys like paulie here doing the whole fine wine song and dance
how the turntables

Anime Schoolgirl
Nov 28, 2002

wow it performed exactly as i expected a card destined for the deep compute market (with functions hastily repurposed for trayracing because retards wouldn't stop asking when the next card is coming out) would

weird

Stickman
Feb 1, 2004

repiv posted:

The Germans have better quality videos for comparison:

Thanks! The native DLSS (2X) looks very nice compared to TAA, and is still a pretty hefty performance boost:

https://www.youtube.com/watch?v=ZD6aBykmayc

At least I think that's 2X DLSS - the frame rate doesn't seem to line up with the standard DLSS framerate, and '2X' is mentioned in the preceding paragraph (which I can't read because I'm one of the lazy Americans).

E: I google-translated it and now I'm more confused. They seem to be talking about DLSS2X as if it's upscaling?

Stickman fucked around with this message at 22:17 on Sep 19, 2018

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

3peat posted:

Seriously, it's hilarious seeing the nvidia fanboys like paulie here doing the whole fine wine song and dance
how the turntables

nah, 1080 Ti is fine, just not at 85% of the price of the 2080.

If Vega 64 had been 8% faster than the 1080 and launched at a 15% higher price, reviews would have gone a lot differently, but it was more like 8% less performance at a 25% higher price, even before we bring the "bundles" into the mix. The bundles put the V64 at $600, at a time when the 1080 was running $400.

If the 5% difference in price-to-performance really matters to you, then buy the 1080 Ti, but the 2080 really isn't far behind in price-to-performance and is worth a small premium to have the latest, best-supported card. We literally know that some of the changes, like FP16, are there and working but need to be coded for.

But I guess AMD put out a bad GPU once and therefore

Paul MaudDib fucked around with this message at 22:54 on Sep 19, 2018

repiv
Aug 13, 2009

Stickman posted:

Thanks! The native DLSS (2X) looks very nice compared to TAA, and is still a pretty hefty performance boost:

At least I think that's 2X DLSS - the frame rate doesn't seem to line up with the standard DLSS framerate, and '2X' is mentioned in the preceding paragraph (which I can't read because I'm one of the lazy Americans).

E: I google-translated it and now I'm more confused. They seem to be talking about DLSS2X as if it's upscaling?

My interpretation of the Google Translate was that neither the FF15 nor the Infiltrator builds they were provided support DLSS2X, so they only tested DLSS upscaling.

Google Translate posted:

Both programs have one thing in common: DLSS2X, which is the most visually beautiful mode that adds "KI upscaling" to the native resolution, does not exist. Instead, the normal DLSS is used. The goal is to visually produce a 4K-like graphic, but without rendering in 4K or Ultra HD. That said, the internal resolution is lower than Ultra HD, so the performance increases. The image to be seen should achieve a comparable quality.

german goons please assist
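
If that translation is right, the distinction between the two modes is roughly this (a conceptual sketch only; the function names and the internal resolution are made up for illustration, not Nvidia's actual API):

code:
# Conceptual sketch of the two DLSS modes as the article describes them.

def render(res):
    # Stand-in for the game's rasterizer.
    return {"rendered_at": res}

def dlss_network(frame, out_res):
    # Stand-in for the tensor-core inference pass.
    return {"output": out_res, "source": frame["rendered_at"]}

def dlss(target=(3840, 2160)):
    # Standard DLSS: render below the target res, let the network upscale.
    # The actual internal resolution is undisclosed; 1440p is a guess.
    return dlss_network(render((2560, 1440)), out_res=target)

def dlss_2x(target=(3840, 2160)):
    # DLSS 2X: render at native res; the network only cleans up the image,
    # so there's no speedup, just (in theory) better AA than TAA.
    return dlss_network(render(target), out_res=target)

print(dlss())     # {'output': (3840, 2160), 'source': (2560, 1440)}
print(dlss_2x())  # {'output': (3840, 2160), 'source': (3840, 2160)}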

Stickman
Feb 1, 2004

Paul MaudDib posted:

nah, 1080 Ti is fine, just not at 90% of the price of the 2080.

If Vega 64 was 8% faster than the 1080 and launched at 15% higher price then reviews would have gone a lot different than 8% less performance and 25% higher price.

Your FP16 point was good - the Wolfenstein 2 benchmarks show a ~15% increase going from 1080 Ti -> 2080. I assume FP16 rendering is likely to become more ubiquitous over the next few years?

repiv posted:

My interpretation of the Google Translate was that neither the FF15 nor the Infiltrator builds they were provided support DLSS2X, so they only tested DLSS upscaling.

german goons please assist

That makes more sense - it still looks pretty good when scaled back down to 1440p (come on reviewers - post something for us plebs)!

Stickman fucked around with this message at 22:22 on Sep 19, 2018

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib

repiv posted:

My interpretation of the Google Translate was that neither the FF15 nor the Infiltrator builds they were provided support DLSS2X, so they only tested DLSS upscaling.


german goons please assist

The Google translation is accurate. The next paragraph states that the DLSS rendering resolution of FFXV is unknown. Infiltrator is stated to be "half resolution", but they don't know what that means in terms of actual rendering resolution.

repiv
Aug 13, 2009

Stickman posted:

Your FP16 point was good - the Wolfenstein 2 benchmarks show a ~15% increase going from 1080 Ti -> 2080. I assume FP16 rendering is likely to become more ubiquitous over the next few years?

Yeah, the PS4 Pro supports FP16 so adoption is inevitable as devs try to squeeze more performance out of that platform.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Stickman posted:

Your FP16 point was good - the Wolfenstein 2 benchmarks show a ~15% increase going from 1080 Ti -> 2080. I assume FP16 rendering is likely to become more ubiquitous over the next few years?

AMD is going to be pushing it for consoles, and NVIDIA will be pushing it too now that it helps sell their latest cards, so it's a very safe bet.

Ironically this is going to end up giving Vega the same 10-15% boost relative to Pascal. Still not nearly enough to catch the 1080 Ti but every bit helps.

This was one of the few easy/obvious gains in Vega and I kinda suspected that NVIDIA would pick it up next cycle too. There aren't too many easy uarch wins left anymore.

Craptacular!
Jul 9, 2001

Fuck the DH

1gnoirents posted:

I like Steve and I prefer his videos to any other GPU source of information out there, but I think he's really stretching the "this is good for developers..." angle. It's possible all the raytracing games fall through, or maybe the performance hit will be crazy, but these games are coming out very soon. He and others are putting zero value on RTX features because they aren't available now. I'd agree with that (much like I did with DX12) if it weren't for the release dates of these major AAA games, which is to say I don't agree with that at all. At worst it's an unknown, but I'm pretty sure we'll know sooner rather than later.

Regardless, the pricing puts a major negative shade on everything.

It’s easy to see raytracing going nowhere because it’s a PCMR feature. It’s presently money invested exclusively in the PC edition of a game that contributes zero to the console release.

When DirectXbox 12 is released, you’ll see better adoption.

SwissArmyDruid
Feb 14, 2014

by sebmojo
It is a little hilarious to me that FP16 rendering is becoming the norm. I remember when ATI, in the last year or so before AMD retired the branding entirely, got in trouble for using FP16 demotion to cheat their way to better scores over Nvidia.

heated game moment
Oct 30, 2003

Lipstick Apathy
The $430 I spent on a 1070 over 2 years ago seems like a pretty good deal in hindsight

repiv
Aug 13, 2009

SwissArmyDruid posted:

It is a little hilarious to me that FP16 rendering is becoming the norm. I remember when ATI, in the last year or so before AMD retired the branding entirely, got in trouble for using FP16 demotion to cheat their way to better scores over Nvidia.

The controversy there was that ATI wasn't actually using FP16: when a game created an FP16 render target, ATI's driver silently substituted the even lower-precision R11G11B10 format instead.

FP16 render targets are still widely used now, but shaders operate on FP32 and throw away the extra precision on write; the new poo poo with Vega/Turing is operating directly on FP16 values.
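
For a sense of how much precision that substitution threw away (a rough sketch; FP16 has a 10-bit mantissa, the R/G channels of R11G11B10 get 6 bits and B gets 5, all with 5-bit exponents):

code:
import math

def quantize(x, mantissa_bits):
    # Round-trip x through a float with the given number of explicit
    # mantissa bits (ignores exponent range; fine for mid-range values).
    m, e = math.frexp(x)                # m in [0.5, 1)
    scale = 2.0 ** (mantissa_bits + 1)  # +1 for the implicit leading bit
    return math.ldexp(round(m * scale) / scale, e)

x = 0.7137                              # arbitrary shader output value
print(abs(x - quantize(x, 10)))         # FP16:            ~2e-4
print(abs(x - quantize(x, 6)))          # R11/G11 channel: ~3e-3
print(abs(x - quantize(x, 5)))          # B10 channel:     ~5e-3
# Worst-case rounding error is ~16x larger in a 6-bit-mantissa channel
# than in FP16, so the swap was not a like-for-like optimization.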

SlayVus
Jul 10, 2009
Grimey Drawer
There is actually one EVGA review out right now. JayzTwoCents has one.

https://www.youtube.com/watch?v=WDrpsv0QIR0&t=635s

"The EVGA is any where from 10 to 15 C cooler at the same frequencies than the founders edition because again it's a 2.75 slot." His theory why the FE cards aren't better thermally than the EVGA is because of the shroud. Where the shroud dips down past the top of the fins, it blocks the potential air flow from the fans.

Craptacular!
Jul 9, 2001

Fuck the DH
Also, I'm watching Steve's review of the 2080 Ti now, and I'm starting to think Nvidia's mistake was not calling the 2080 a 2070 and the 2080 Ti a 2080. And then if they wanted to do a performance bump later, they totally could.

ufarn
May 30, 2009

ufarn posted:

MSI has 8+8 for 2080 and 6+8+8 for 2080ti. :stare:

https://www.youtube.com/watch?v=zRR5CbBF9yU


Absolutely disgusting:

https://twitter.com/JayzTwoCents/status/1042529712331878400

repiv
Aug 13, 2009

SlayVus posted:

There is actually one EVGA review out right now. JayzTwoCents has one.

https://www.youtube.com/watch?v=WDrpsv0QIR0&t=635s

> tests doom to see how turing performs under vulkan
> makes a point of using smaa instead of tssaa

But TSSAA is the only anti-aliasing mode that uses async compute :thunk:

Cygni
Nov 12, 2005

raring to post

Craptacular! posted:

Also, I'm watching Steve's review of the 2080 Ti now, and I'm starting to think Nvidia's mistake was not calling the 2080 a 2070 and the 2080 Ti a 2080. And then if they wanted to do a performance bump later, they totally could.

They probably wanted to keep the mainstream marketing in a reasonable price range. The Ti is just a Titan with a new marketing name to exploit the new tier of ultra-nerd that Nvidia discovered during the mining boom, and boy has it worked, judging by the attention the thing is getting.

If they had called it the Titan T, they would have gotten like 20% of this press and hype.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Craptacular! posted:

Also, I'm watching Steve's review of the 2080 Ti now, and I'm starting to think Nvidia's mistake was not calling the 2080 a 2070 and the 2080 Ti a 2080. And then if they wanted to do a performance bump later, they totally could.

They'd likely have to make another, even bigger chip then, and they probably decided the yields wouldn't be worth it at Ti prices. They did leave room for a Titan, but I wouldn't be surprised if that got skipped this gen.

EdEddnEddy
Apr 5, 2012



All about on par with what I expected.

However, I think I'm content with my 980 Ti for at least another 6+ months. By the time I need to replace it, I think it will be time for a whole new rebuild anyway. For now, 1440p/100Hz ultrawide G-Sync is still above 60 FPS 99% of the time, so yeah. That late VR upgrade to the 980 Ti wasn't a terrible move back in, what, May 2016?

1gnoirents
Jun 28, 2014

hello :)

Craptacular! posted:

It’s easy to see raytracing going nowhere because it’s a PCMR feature. It’s presently money invested exclusively in the PC edition of a game that contributes zero to the console release.

When DirectXbox 12 is released, you’ll see better adoption.

Well, I hope not. I'd be more skeptical if it wasn't ~raytracing~. It's been the goal for as long as I can personally remember, ever since literal real-time 3D graphics came around. If not, I guess I lose the gamble with the fastest GPU ever made, which I'll just resell in 11 months for $975.

Craptacular!
Jul 9, 2001

Fuck the DH

1gnoirents posted:

Well, I hope not. I'd be more skeptical if it wasn't ~raytracing~. It's been the goal for as long as I can personally remember, ever since literal real-time 3D graphics came around. If not, I guess I lose the gamble with the fastest GPU ever made, which I'll just resell in 11 months for $975.

I don't know if 11 months is going to be enough time, but you'll see increased DX12 adoption when you see games on this series of consoles that are produced with the quiet expectation of a re-release on the next generation.


1gnoirents
Jun 28, 2014

hello :)

Craptacular! posted:

I don't know if 11 months is going to be enough time, but you'll see increased DX12 adoption when you see games on this series of consoles that are produced with the quiet expectation of a re-release on the next generation.

I can't have a video card for more than a year, otherwise it gets old



gently caress, that made me sadder than I thought it would

I should mention again that I pre-ordered within an hour of when you could, so this is probably bad news for a lot of people, though I suppose not unexpected

1gnoirents fucked around with this message at 23:57 on Sep 19, 2018
