pyrotek
May 21, 2004



HalloKitty posted:

Make sure you don't get 2GiB cards, because they will struggle very soon.
That silly ASUS card that has two 2GiB 760s on one card is totally pointless.

How long does everybody think 3GB cards will last? I was thinking about getting a 780, but with the new consoles having 8GB of unified memory to play with, 3GB kind of scares me.


pyrotek
May 21, 2004



fletcher posted:

Any good deals on video cards for Black Friday?

NCIX had a Powercolor 7970 for $230 ($210 AR), but that is unsurprisingly sold out.

Newegg has an MSI 2GB GTX 770 for $295 ($280 AR). They also had another MSI 770 with a custom cooler for $5 more, but that one is sold out.

NCIX has a Zotac GTX 780 for $450. That is the best price I've seen on a 780, and I'm tempted to pick one up. It's probably overkill for 1200p gaming, but I could always get one of those ASUS VG248QEs and a G-Sync module when they come out and not let all of those extra frames go to waste...

pyrotek
May 21, 2004



Agreed posted:

Looks like they've started to figure out that 280x is the same card as the 7970GHz but with a few easily-corrected differences (from the perspective of a coin miner). I look forward to the mass influx of nVidia owners as coin miners will absolutely keep stock cleared on AMD's side of things perpetually until this bubble dies down. Welcome to team green by default, y'all! Feel free to ask questions about neat features and why nVidia cards are pretty cool, I'll do my best to make you feel welcome. We're a welcoming bunch, y'know.

And when the bubble bursts there will be a flood of cheap 7970/7990/280x cards on the market. I can't wait!

pyrotek
May 21, 2004



Ganondork posted:

RE: DLSS 2.0 - I wouldn’t get your hopes up. The DLSS dev page only mentions access for Unreal Engine. Given the need for motion vectors, it seems like DLSS isn’t a general solution, and requires support from the engine and game dev.

https://developer.nvidia.com/dlss

Given that few games implement DLSS 1.0, I’d expect the same for 2.0. Hopefully, we’ll see wider adoption, but given gaming’s past with proprietary tech, and consoles consistently going AMD...I doubt it.

Neither Control nor Wolfenstein: Youngblood uses Unreal, so hopefully it will see wider adoption than you think. It really looks great in person. Light halos and other sharpening artifacts really annoy me, so I'm normally not a fan of sharpening techniques, but the effects seem minimal with DLSS 2.0 and the performance gains compared to native rendering are massive.

pyrotek
May 21, 2004



DLSS 2.0 is amazing in the whopping four games that support it right now, but I can't stop thinking how much I'd love to have the technology in a new Switch revision.

pyrotek
May 21, 2004



eames posted:

I feel like it could go either way and might turn out to be a two-edged sword, because if this catches on, native rendering output might be gone for good.

Yes, it's indubitably cool to get a free 50% performance gain without an immediately obvious difference in image quality, but I'm not sure I'll be as thrilled when <insert big AAA publisher> forces those tradeoffs for me and the locally rendered title displays videocompression-esque artifacts to achieve playable framerates.
I realize I'm overly negative again but :tinfoil:

Look at it from the other way: why waste time rendering at a needlessly high resolution when you can get the same or better quality using DLSS, and spend the savings pushing overall rendering quality higher?

Variable rate shading improves image quality in a similar way: instead of wasting time shading every pixel at the highest rate when the differences wouldn't be visible, it frees up performance to render the pixels that matter most to the image at higher quality.

Real-time rendering is all about making trade-offs between rendering speed and quality. Rendering internally at your monitor's native resolution may be one of the next things we have to get used to not always being the best idea, especially as monitor resolutions continue to climb.

If you really wanted to, you could use DSR to render at a resolution higher than your monitor and set the DLSS resolution to the same as your monitor. Would that count as native rendering output?
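
To put rough numbers on that last idea (a back-of-the-envelope sketch; the per-axis scale factor is my assumption based on how Nvidia describes DLSS Performance mode, not something from this thread):

code:
# Pixel math for the DSR + DLSS combo on a 2560x1440 monitor.
# Assumes 4x DSR doubles each axis and DLSS Performance mode
# renders at half the output resolution per axis.
monitor = (2560, 1440)
dsr_4x = (monitor[0] * 2, monitor[1] * 2)    # 4x DSR output: 5120x2880
internal = (dsr_4x[0] // 2, dsr_4x[1] // 2)  # DLSS internal render res

print(internal)             # (2560, 1440)
print(internal == monitor)  # True: the internal render is "native" again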

pyrotek
May 21, 2004



Mercrom posted:

But it's not 1/2 pixels though? What's the actual reason for those numbers?

4/9 doesn't sound as catchy.
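
For anyone wondering where 4/9 comes from: DLSS scales each axis, so the rendered pixel count scales with the square. The per-axis factors below are the commonly cited ones for DLSS 2.0; treat them as my assumption rather than official spec.

code:
# Fraction of output pixels actually rendered per DLSS 2.0 mode,
# assuming the commonly cited per-axis scales.
from fractions import Fraction

modes = {
    "Quality": Fraction(2, 3),      # 2/3 per axis -> 4/9 of the pixels
    "Performance": Fraction(1, 2),  # 1/2 per axis -> 1/4 of the pixels
}
for name, per_axis in modes.items():
    print(f"{name}: renders {per_axis ** 2} of the output pixels")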

pyrotek
May 21, 2004



Taima posted:

Just out of curiosity, as I'm no expert on the whole timeline of video cards: what other historical technologies have delivered a performance gain similar to or greater than DLSS 2.0?

Obviously it's super powerful but I have no idea about the histories of such technologies. Presumably there have been jumps similar to this.

Outside of new card releases? Nothing.

The closest analog to DLSS 2.0 I can think of is having some games I already owned add Glide support when I got a card with the 3dfx Voodoo Graphics chipset. Support wasn't very widespread at first for old or new games there, either. Then GLQuake came out and everything changed very quickly.

pyrotek
May 21, 2004



HalloKitty posted:

Don't forget the layers of DRM that have been proven to cause stutter!


SwissArmyDruid posted:

And if you took more than two seconds to look at the graphs you linked, you'd see the MASSIVE difference in frametime pacing.



See those white spikes on the left side that you don't see on the right? Those are framedrops and microstutters caused by load spikes from the DRM. The DRM killing performance is absolutely real.

It is not enough, in this day and age, to rely on simplistic averages to determine performance. An average obscures the actual moment-to-moment values of what is actually happening. It's the same thing where the average of 4 and 6 is 5, and the average of 0 and 10 is also 5, but one is more even than the other. We did not get here on the backs of the work of Wasson et al. to just ignore everything on the left side of those screenshots to go, "UNGA, NUMBER SAME, NO PROBLEM".

And even if there weren't any problems on this VERY well-specced machine, there WILL be problems when you scale your hardware down. So on top of being a caveman that can't read graphs, does this mean you're also a hardware elitist?

The ONLY counterargument here that you may have is that the CPU graph is hosed, but the other two graphs still agree with the assertion that the DRM kills performance.

Weird that I don't get any spikes on similar hardware to those two sources (3600, 2060 Super).

pyrotek
May 21, 2004



https://twitter.com/TUM_APISAK/status/1274560482192457728

Can't wait to find out which card that is.

pyrotek
May 21, 2004



shrike82 posted:

There is absolutely a loss of quality at least when I tried it on Control on the highest quality setting. We can argue about how (in)significant it is but then we're making similar arguments to "if you turn down these settings from Ultra to High, you'll never notice it in-game".

I think it looks better in some ways and worse in others than native rendering.

https://www.youtube.com/watch?v=tMtMneugt0A

That video convinced me that some of the things I thought were flaws were actually closer to the 32-sample-per-pixel reference they trained the algorithm with. We are just used to the types of image flaws that current rendering brings, and DLSS does look different from that.

I certainly understand people having different opinions on the perceptual quality of the image, but for such massive performance gains I'll live with "pretty close".

pyrotek
May 21, 2004



Happy_Misanthrope posted:

There's also this video by DF, in particular this section which shows that in motion, DLSS can have significantly superior detail compared to TAA. Remember that even 'native' is compromised to some degree by how TAA works.

The specific part that looks worse to me is the halos they point out from the sharpening. Halos on edges are one of those things that really bother me (lots of old Blu-rays had that problem from over-sharpening too), and I can see them in motion in any DLSS 2.0 game. It is probably worst in Minecraft.

Ignoring that issue, I prefer the DLSS 2.0 image to native resolution. Hopefully future games add DLSS options for things such as sharpening.

pyrotek
May 21, 2004



sean10mm posted:

Control on my current non-RTX GPU can't even run the normal reflection effects without glitches, but looks rad even with them turned off. I'd love to see it on RTX hardware.

Control almost seems like it was designed specifically to show off the various ray tracing effects.

pyrotek
May 21, 2004



shrike82 posted:

Wonder if it's theoretically possible for a future DLSS version to support all games without the need for additional development work.
That'd be a huge win.

Rumor is that they are working on a way to turn on DLSS through the control panel or GeForce Experience in games that use TAA. There is a chance they might make that exclusive to the new generation of cards, though.

I really hope that isn't bullshit because I'd like a performance boost in RDR2 among other games.

pyrotek
May 21, 2004



repiv posted:

Nvidia currently isn't pushing DLSS for VR anyway, why not is anyone's guess.

Even if MSAA is still better than reconstruction in VR, it would be nice to have a better TAA for the VR games that use deferred shading and can't support MSAA.

Perhaps the AI upscaling isn't and can't be 100% identical per eye, causing disconcerting artifacts?

pyrotek
May 21, 2004



repiv posted:

yikes, so much for death strandings port boding well for horizon zero dawn

it's pretty clear they were ported independently of each other

https://www.youtube.com/watch?v=7w-pZm7Zay4

I was wondering why this game didn't have DLSS support like Death Stranding, and I think this video hints at why. I have to imagine all the parts that are limited to 30 FPS keep the temporal reconstruction from working correctly.

pyrotek
May 21, 2004



Happy_Misanthrope posted:

Nah, I don't really see sharpening artifacts. Look at the crowd and the top of the video display on the right, DLSS has noticeably more stair-stepping, as well the railing on the sign in the distance is significantly thicker. It just looks like lower res, the performance boost with it isn't massive like other games as well - this implementation looks barely improved, if at all, from just dropping the res a tad based on these shots. Maybe Guru3D screwed it up?

It looks to me like there is a thin black border around the... whatever it is on top of the video board. You can see it is there in both versions above the display on the left side. It is pretty hard to tell which one is more accurate without knowing how it "should" look.

Either way, it is close enough that I'd gladly take whichever version gave me 30+% more performance.

pyrotek
May 21, 2004



Lockback posted:

I don't think AMD and NVidia are necessarily the biggest threats to each other. Mostly they're competing against people just using whatever GPU they already have, more than against the competitor.

In particular, anything that makes CPUs a bigger bottleneck is a net gain for them. They're going to get a tailwind anytime NVidia does something cool.

AMD being the only one with PCIe 4.0 also helps, though I doubt it will make much of a difference unless you are buying a 3090.

pyrotek
May 21, 2004



Control Volume posted:

So whats the best time to pick up a used 2080 Ti on the back of this? Because I definitely dont think Im going to be able to snag a 3070 lol

Used, they'll probably still be more expensive than 3070s; I'd wait for benchmarks before making that decision.

There are a lot of advantages to getting a new card (warranty, better driver optimizations, reduced power consumption, etc.) compared to the extra memory on the 2080 Ti. Maybe the memory will make a difference, but reviews should make that clear.

pyrotek
May 21, 2004



Zedsdeadbaby posted:

Microcenter sounds like one of those insufferable stores that artificially tries to prop up their failing bricks and mortar shops to the detriment of their online service cos they can't see the writing on the wall and refuse to adapt

In the UK we gleefully let them collapse, our largest PC shop by a huge margin is an online-only shop with one single store in Newcastle

The Micro Centers near me are so incredibly far from suffering it is unbelievable. I ordered something for pickup a few weeks ago on a Sunday, walked into the store, saw that the checkout line wrapped around half the store, and walked right back out and ordered on Amazon. Everyone was wearing a mask, but there was no distancing in the line, and it was at least a couple hundred people long.

It was worse than Black Friday and there weren't even any huge sales going on. People like having a local store with better prices than online.

pyrotek
May 21, 2004



DrDork posted:

They do if you want to enable RTX at >1080p, though. But, yes, it'd be nice to see them add it to games where the visual quality improvements would be a bit more noticeable.

MSFS is up to MS to fix, considering it's on DX11 right now.

e; lol, just checked the player numbers and freaking Farming Simulator 19 has over double the active players of MSFS on Steam. It also loses handily to Conan Exiles (wut?), VRChat, SMITE, Euro Truck Simulator 2, Borderlands 3, and Stardew Valley.

I imagine most of the people playing MSFS are doing so through Game Pass.

Also, unlike MSFS, all of those games are free or have had huge sales, and none of them except Borderlands requires a high-end computer to run well.

pyrotek
May 21, 2004



Cancelbot posted:

Worried about the 3080 FE melting my air-cooled CPU. Turns out my 8086k was set to some wacky auto voltage where prime95/cinebench would set the CPU to "MELT SELF" mode, told AVX to go to hell with a -3 offset, set the voltage properly and now it's nice and cool.

I'm assuming you mostly use your computer for gaming? The Decima engine (Horizon: Zero Dawn and Death Stranding) uses AVX and it is part of DX12, so I would expect more usage in the future. Prime95 and Cinebench in no way represent a realistic gaming workload. I would recommend turning the AVX offset back to normal; the proper voltage should suffice.

pyrotek
May 21, 2004



gradenko_2000 posted:

I'm still on 2666 MHz :(

anyway those big huge charts from TechPowerUp tell me that even a 2060 would be enough for my 1080p60Hz rear end

It absolutely is with current games.

shrike82 posted:

there seems to be a lot of variance in benchmarks between the reviewers... some have Control 4K RTX+DLSS >60, others <60 ....
it might be highlighting other bottlenecks

Could also depend on where they are doing the testing. If one place is testing the most demanding spot in the game and the other isn't, that would make a big difference.

pyrotek
May 21, 2004



jisforjosh posted:

Some of GN's results were showing worse frametime consistency with those overclocks.

This page does a good job explaining memory overclocking on the 3080.

techpowerup posted:

For memory overclocking, this means you can no longer rely on crashes or artifacts as indicators of the maximum memory clock. Rather, you have to observe performance, which will reach a plateau when memory is 100% stable and no transactions have to be replayed. Replaying memory errors reduces overall memory performance, which will be visible in reduced FPS scores. For the following OC testing, I took this into account, of course.

So GN pushed too far on the GDDR6X. I imagine this will be common at first until word gets out.

Is there a good benchmark that automatically does 1% and 0.1% lows to test that easily?
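
In the meantime, if your capture tool can dump a frametime log (CapFrameX and FrameView both export CSVs), the math is simple enough to do by hand. A rough sketch: the column name is a placeholder for whatever your tool actually writes, and note that some sites define "1% low" as the 99th-percentile frametime rather than the average of the slowest 1%:

code:
# Quick-and-dirty average FPS plus 1% / 0.1% lows from a frametime log.
# Assumes a CSV with per-frame times in milliseconds in a column named
# "frametime_ms" -- adjust for your capture tool's actual format.
import csv

with open("frametimes.csv", newline="") as f:
    times_ms = [float(row["frametime_ms"]) for row in csv.DictReader(f)]

def fps_of_slowest(times, fraction):
    """Average FPS over the slowest `fraction` of frames."""
    worst = sorted(times, reverse=True)   # longest frametimes first
    n = max(1, int(len(worst) * fraction))
    return 1000 * n / sum(worst[:n])

print(f"average: {1000 * len(times_ms) / sum(times_ms):.1f} fps")
print(f"1% low: {fps_of_slowest(times_ms, 0.01):.1f} fps")
print(f"0.1% low: {fps_of_slowest(times_ms, 0.001):.1f} fps")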

pyrotek
May 21, 2004



Josh Lyman posted:

All the reviews focusing on comparing it to 2080 Ti are flawed. Yes it should be faster and it is by 25%, but the proper comparison is the 2080. Just because reviewers got them for free doesn’t change the fact that almost nobody bought a $1200 video card.

edit: apparently Steam survey says 2080 is 0.96% and 2080Ti is 0.91% so they basically had the same sales? :psyduck:

I'd almost add the 2070 Super and 2080 Super to those totals since they are basically the same as the 2080 +/- ~5%.

pyrotek
May 21, 2004



Incessant Excess posted:

It also offers very limited overclocking headroom and we don't yet know how the acoustics compare to AIB cards.

Putting an embargo on AIB reviews until the exact moment your cards go on sale is pretty suspicious...

That said, the FE looks like a good design and they seem to have already pushed power about as far as is reasonable. I'm very curious to see what kind of gains you get from overclocking on the AIB cards and how much efficiency you lose.

pyrotek
May 21, 2004



CaptainSarcastic posted:

If AMD pulls a rabbit out of their hat and has competitive GPUs at a reasonable price in the next month then this launch will look like an even bigger clusterfuck.

In that case, AMD would have to be able to produce even more of them than Nvidia, which is absolutely not a sure thing.

pyrotek
May 21, 2004



I know everyone here has been asking over and over for a blower-style 3090 that will fit in an SFF case. Well, I have good news!

https://twitter.com/momomo_us/status/1306947499626618883?s=20

Clocks TBD

pyrotek
May 21, 2004



VorpalFish posted:

That is going to be the loudest imaginable thing.

Like that has to be 40mm 8k rpm delta fan loud.

I'm actually really curious to see what they do with this thing. Will it be louder than a blow dryer? Will they expect people to undervolt or lower the power target for reasonable thermals?

pyrotek
May 21, 2004



redreader posted:

I saw a tweet yesterday saying something like 'msi hosed up and somehow confirmed a 20gb 3080 and a 16gb 3070'. So clearly they're not announcing them until after they've sold all of their 3080s and 3070s. Probably (?) not selling those Ti/Supers any time soon, but who knows!

They are probably planned to be a response to big Navi.

pyrotek
May 21, 2004



Kraftwerk posted:

I paid because having a quiet and effective cooling solution mattered to me.

The FE is subsidized by Nvidia and they paid AIBs $50 per card as a rebate so the initial base model cards would be in line with the FE price. Once FEs sell out, prices will go up. So those premium Strix, Trio, FTW3 cards are closer to the actual price/cost the AIBs face.

What makes you think the production of the FE cards is any more limited than the AIB cards? I haven't read anything about that.

phosdex posted:

oo, just got my shipment notification from nvidia. Fedex is saying delivery today, but it's still in the label-created stage at 5:30pm already, so that doesn't seem possible.

You actually managed to get in an order yesterday with Nvidia? How?

pyrotek fucked around with this message at 23:41 on Sep 18, 2020

pyrotek
May 21, 2004



Truga posted:

yes, but i don't care about dlss or rtx at this point in time, nor am i looking for 144hz@4k in cp2077 on ultramax or whatever

So you think AMD will be competitive as long as you don't care about ray tracing, AI upscaling, high framerates, high resolution, or high settings in highly anticipated upcoming games?

pyrotek
May 21, 2004



Truga posted:

I know for a fact AMD has way better drivers for my operating system of choice which has been a pretty annoying issue for me recently. Also, out of the list of games that are supposed to support either DLSS or RTX, I'm only interested in 2077. That's one game I'll probably play for 20-40 hours, out of the hundreds of hours a year I spend playing games.

No other game I currently play or intend to play in the next couple years supports either, and most of the games I play on my 980Ti run fine at 4k60 vsync locked, by only lowering or disabling settings that don't actually visually improve the picture quality, or even degrade it (like blur/dof etc).

Additionally, DLSS isn't supported in VR games, and is unlikely to be in the next couple years, which is the thing I'm actually buying gpu power for at this point in time, so if I can get a ~3070 equivalent with a not-lovely driver situation for ~500, I'll probably buy it immediately just because I've been waiting for so long, thanks to 2000 series being overpriced as gently caress.

I just hope this 980Ti keeps working fine until then, recently it stopped boosting to 1600MHz, which has hurt my VR performance just enough that I get bad stutters in some games that ran fairly smoothly before. it now stops at 1488Mhz, if I try going for 1492 or higher it crashes instantly :godwinning: no matter how high i crank the voltage

I'm glad you understand your use case, but it is hard to say AMD will be competitive if they can't keep up in framerates, resolution, etc. at high settings in popular games. Even ray tracing will probably be important soon with the new consoles supporting it.

DLSS 2.1 does actually support VR. How many games will actually use it is definitely an open question, given that I haven't seen any announce support yet.

pyrotek
May 21, 2004



Alchenar posted:

Yeah, I'm feeling extremely happy about getting that Xbox preorder now. I can push pause on a new build until spring and see how everything has shaken out.

Enjoy your launch Xbox. Those never have problems.

pyrotek
May 21, 2004



Mikojan posted:

I was convinced at first that the 3080 was the card for me, but it's only 10-15% faster than a 2080 Ti? And the 3070 will most likely be slightly slower than a 2080 Ti?

I mean, with the 3080 being more than 50% more expensive and probably only 20% faster, it seems like the 3070 will be better value?

On top of that, I'm put off by the huge power usage, so I think I'm cancelling my preorder and waiting it out for the 3070.

But it is 40% more expensive, not 50%, and around 25% faster at 1440p and 30% faster at 2160p? Where are you getting 10-15%? The 3080 is actually pretty compelling in terms of price-to-performance compared to most high-end cards. There are always diminishing returns, but much less than usual here.

Based on trends, I'd expect the 3070 to be slower than the 2080 Ti at 1440p and probably only faster at 2160p.
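
For reference, here's the arithmetic with the $699 and $499 MSRPs (street prices will obviously vary, and the performance deltas are the rough reviewer numbers above, not my own measurements):

code:
# MSRP math for the 3080 vs. 3070 value comparison. The ~25-30%
# uplifts are the reviewer figures quoted above; treat all of this
# as rough.
price_3080, price_3070 = 699, 499

print(f"price premium: {price_3080 / price_3070 - 1:.0%}")  # ~40%, not 50%

for uplift in (0.25, 0.30):
    cost_per_frame = (price_3080 / price_3070) / (1 + uplift) - 1
    print(f"{uplift:.0%} faster -> {cost_per_frame:+.0%} more cost per frame")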

pyrotek fucked around with this message at 16:22 on Oct 8, 2020

pyrotek
May 21, 2004



Pivo posted:

I used to fly real airplanes as a hobbyist (65hrs in a 152 but no PPL to show for it, life got in the way) and when I loaded it up and set it up to take off from my dinky little old field, I was 100% sold within minutes. They've even got the tiny untowered fields I used to practice touch & gos at. Without a moving map flying VFR IRL you get up to some tricks like identifying POIs and following highways and I was surprised at how much of the landscape I was able to recognize by sight. It's nothing like flying the real thing, mostly because the real thing shakes and can break and the humans involved aren't robots, but it's insane how much of that "feeling" of "being there" it replicates. Now the little Cessna isn't exactly difficult to model so I can't speak to accuracy flying bigger and faster things, but I've got a Logi / Saitek X52 Pro on the way just so I can take that little Cessna to places I've never gotten to go. This is going to be the thing that makes me take the plunge into VR as well (although SW Squadrons looks good too). poo poo it just might put me back in the real cockpit as well.

My buddy used to play Flight Sim for hours and hours when he was doing his multi-engine and then his commercial, and I *get it* now.

My 3080 pulls 40-50fps at 1440p with Alex Battaglia's optimized settings with the GPU load sitting around 50-60% ... I mean sure I hope they optimize it, but it's not a twitch game, I never even felt the framerate for a second.

It's pretty tough to overstate how exciting this technology is.

If you would like to better utilize your 3080 in the game, increasing the render scaling setting shouldn't affect performance since you are CPU bound.

Just out of personal curiosity, what CPU do you have?

pyrotek
May 21, 2004



Pivo posted:

9900k @ 4.9GHz all-core

I haven't actually benchmarked MSFS, I glanced at stats a couple of times while playing. If there's actual interest I can try to set up a proper benchmark run later today.

Render scale is cool but I don't see many jaggies. Certainly some headroom to play with.

Increasing the render scale should improve clarity more than it reduces jaggies. I agree the TAA solution in the game is very effective.

If you don't mind, give it a shot and let me know how it works! One day in the theoretical future where they are in stock, I'd like to get a 3080, and MSFS @ 1440 is one of the big reasons for me to upgrade.

pyrotek
May 21, 2004



Shlomo Palestein posted:

Let's say I have an EVGA 600 watt 80+ Gold supply. It has the two 8 pins I need for the incoming ASUS, but it looks like the ASUS itself can be a bit power-hungry. I assume I'm skirting the line here if I have a 3600 as my CPU and all my internal hard drives are SSD's...? I do have a few external USB drives, but those all have power warts anyhow.

EDIT: Yep, cutting it pretty close:



That estimate seems really high for a system with a 3600. How many SSDs do you have? That said, I wouldn't recommend overclocking the 3080 at all.
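
For a rough sanity check, here's a back-of-the-envelope power budget (the draw figures are typical full-load numbers from reviews, not measurements of this particular system):

code:
# Ballpark power budget for a Ryzen 5 3600 + RTX 3080 on a 600 W PSU.
# All draws are assumed typical full-load figures; the 3080 also has
# brief transient spikes well above its average board power.
draws_w = {
    "RTX 3080 (board power)": 320,
    "Ryzen 5 3600 (full load)": 90,
    "motherboard / RAM / fans": 50,
    "SSDs (a few watts each)": 10,
}
total = sum(draws_w.values())
print(f"estimated sustained load: {total} W / 600 W ({total / 600:.0%})")
# -> estimated sustained load: 470 W / 600 W (78%)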

pyrotek
May 21, 2004



Pivo posted:

Right. So I have. Here's a better run. I think having a moving map popped out was interfering with frametime capture.



GPU usage remains far below 100%. I don't think it necessarily scales like that.

Surprising and confusing. Thanks for your work.


pyrotek
May 21, 2004



Zedsdeadbaby posted:

More like they literally can't even make them if they tried

Why even bother with new models if they can't make the current ones to save their lives?

Also, if the 7nm rumors are true, maybe they want to save the higher VRAM models for that.
