Incessant Excess
Aug 15, 2005

Cause of glitch:
Pretentiousness
Anyone here play some RDR2 on their 3080 yet? From benchmarks you can expect just a few dips below 60 at 4K ultra, but I'm wondering how close to locked it actually feels.


FiftySeven
Jan 1, 2006


I WON THE BETTING POOL ON TESSAS THIRD STUPID VOTE AND ALL I GOT WAS THIS HALF-ASSED TITLE



Slippery Tilde

Cancelbot posted:

Are air bubbles common in AIOs? It's the next thing I want after the GPU. I guess they can't be 100% filled, and it's like a spirit level bubble that exists in the system itself. My PC is 80% heatsink with a Dark Rock 4 and the 2080 Super, not that it will improve with a huge radiator.

I can't decide which I hate more - transferring a PC to a new case virtually wholesale, or CPU cooler removal/cleaning/installation.

You should watch that video, it actually explains the whole thing, but the short answer is yes: most have between 1% and 10% air inside the loop, and when it's properly oriented it doesn't matter because the bubble just sits at the top, basically never moving. My mistake was that I agitated the radiator several times while it was resting on my desk below the CPU/pump, which made the air flow through the loop to the highest point, which happened to be exactly where the impeller was.

Vir
Dec 14, 2007

Does it tickle when your Body Thetans flap their wings, eh Beatrice?

Pivo posted:

Are there people that think it is? 🤦‍♂️

Sim time isn't something you just stumble upon ... A school for commercial pilots might have one / have time on one, airlines will have them / have access to them, but aviation + certification = $$$$$$$$$$. This is just a program running on your home computer!
The neat thing about X-Plane is that the consumer version is essentially the same program, the same physics engine, the same graphic quality, etc. which runs in the big commercial X-Plane simulators. The certification key mainly switches off the fun spaceflight stuff, enables more support for IO interfaces and simulator screens, and refuses to run if your FPS is too low. Depending on the aircraft you can log up to 25% of your FAA mandated instrument flight time on an "affordable" desktop simulator.
https://www.youtube.com/watch?v=mKQ0OnrZ-uM&t=32s
But yeah, you can self-build a neat 180 degree whole-room simulator with a better GPU for less than what that FAA approved wall of buttons costs.

Admiral Joeslop
Jul 8, 2010




I'm considering a 3070 when they drop to replace this 1660 Ti. I'm also looking at a new monitor, something in the 1440p/144Hz range. Does the 3070 sound capable of hitting 144 fps, even on ultra settings, in a lot of games at 1440p? I'm struggling to hit 144 fps in some games with the 1660 at 1080p.

VorpalFish
Mar 22, 2007
reasonably awesometm

Admiral Joeslop posted:

I'm considering a 3070 when they drop to replace this 1660 Ti. I'm also looking at a new monitor, something in the 1440p/144Hz range. Does the 3070 sound capable of hitting 144 fps, even on ultra settings, in a lot of games at 1440p? I'm struggling to hit 144 fps in some games with the 1660 at 1080p.

Nobody knows yet but if you want an idea I'd look at what the 2080ti benches at now. They're supposed to be close.

Enos Cabell
Nov 3, 2004


Admiral Joeslop posted:

I'm considering a 3070 when they drop to replace this 1660 Ti. I'm also looking at a new monitor, something in the 1440p/144Hz range. Does the 3070 sound capable of hitting 144 fps, even on ultra settings, in a lot of games at 1440p? I'm struggling to hit 144 fps in some games with the 1660 at 1080p.

I really doubt the 3070 will be a compelling 1440p/144hz card if you want ultra settings. Just look at 3080 benchmarks, there are plenty of games out now that it can't even run at 1440p/144 ultra.

latinotwink1997
Jan 2, 2008

Taste my Ball of Hope, foul dragon!


Hemish posted:

Anyone knows why Amazon Canada seems to have received 0 3080 card from EVGA since launch?

Also, everyone knows the real magic is NVIDIA HairWorks and not DLSS, everyone knows that.



Hmmm, doesn’t seem to be working. He’s still bald.

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack

Admiral Joeslop posted:

I'm considering a 3070 when they drop to replace this 1660 Ti. I'm also looking at a new monitor, something in the 1440p/144Hz range. Does the 3070 sound capable of hitting 144 fps, even on ultra settings, in a lot of games at 1440p? I'm struggling to hit 144 fps in some games with the 1660 at 1080p.

Yeah, I have a 3080 now and in some games it doesn't max out my 144Hz 1440p monitor. Everything is 100+ and super smooth with G-Sync compatibility, but at top settings it usually drops 20-30 frames below the cap. Other games it will crush a steady 144Hz tho, like Doom Eternal, which is eternally optimised.

So a 3070 definitely won't do it.

Having said that, as lots of people have said, "max settings" often introduces options which improve image quality by such a small amount that they functionally don't matter, and turning them off nets huge gains in performance. This varies from game to game though.

George RR Fartin
Apr 16, 2003




ASUS TUF just dropped on Amazon and I hit buy it now immediately. Fingers crossed this isn't some weird glitch, but if not, the stupid waiting is finally over!

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Sininu posted:

What clockspeed does everyone's 2070S idle at?
Two displays at 1440p 120hz each, with GSync.



Bumped up for a bit to 780 MHz after the snipping tool took the screen capture.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Shlomo Palestein posted:

ASUS TUF just dropped on Amazon and I hit buy it now immediately. Fingers crossed this isn't some weird glitch, but if not, the stupid waiting is finally over!

drat it! None of my monitors went off to notify me :(

spunkshui
Oct 5, 2011



AirRaid posted:

Yeah, I have a 3080 now and in some games it doesn't max out my 144Hz 1440p monitor. Everything is 100+ and super smooth with G-Sync compatibility, but at top settings it usually drops 20-30 frames below the cap. Other games it will crush a steady 144Hz tho, like Doom Eternal, which is eternally optimised.

So a 3070 definitely won't do it.

Having said that, as lots of people have said, "max settings" often introduces options which improve image quality by such a small amount that they functionally don't matter, and turning them off nets huge gains in performance. This varies from game to game though.

I think the 3080 is going to have higher demand than the 3070, because you either want no compromises or you want to pay less than $500 for a GPU. Then again, brutal supply limits will mean the 3070s will go to people who wanted a 3080.

If AMD can just beat a 2080 super they have a real chance to sell a poo poo ton of GPUs right now.

I'm cool waiting, but if Overwatch 2 comes and I don't have a Strix, I'm going 2x AMD for us just out of pure spite.

Oh wait I have gsync displays, poo poo.

Incessant Excess
Aug 15, 2005

Cause of glitch:
Pretentiousness

Incessant Excess posted:

Anyone here play some RDR2 on their 3080 yet? From benchmarks you can expect just a few dips below 60 at 4K ultra, but I'm wondering how close to locked it actually feels.

Tried it myself for a bit: 4K with every setting manually maxed gives me a lil over 40fps on the benchmark; dropping down to the ultra preset it's pretty much a locked 60.

Wiggly Wayne DDS
Sep 11, 2010



Combat Pretzel posted:

Two displays at 1440p 120hz each, with GSync.



Bumped up for a bit to 780 MHz after the snipping tool took the screen capture.
ya i expected some multi-mon idle issues near launch but haven't hit any at 4k60 (inc. gsync/hdr variances between displays at times)

idling at 210MHz atm

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack
Yeah my 3080 also idles at 210MHz core / 50MHz memory (:v:), with 2 screens (1440/144 and 1080/60)

repiv
Aug 13, 2009

My 2070S doesn't idle properly with a 1440p165 and 1080p60, I guess because the refresh rates don't match or aren't an integer multiple or something
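The integer-multiple guess above is easy to sanity-check; note this is just a toy illustration of the poster's hypothesis about idle clocks, not confirmed NVIDIA driver behavior.

```python
# Toy check of the "integer multiple" guess: the idea is that the driver
# can drive two displays from one relaxed idle clock only if one refresh
# rate divides the other evenly. (The rule itself is the poster's guess.)
def is_integer_multiple(a_hz: int, b_hz: int) -> bool:
    hi, lo = max(a_hz, b_hz), min(a_hz, b_hz)
    return hi % lo == 0

print(is_integer_multiple(165, 60))  # False: the mismatched pair above
print(is_integer_multiple(120, 60))  # True: a pairing that should idle fine
```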

spunkshui
Oct 5, 2011



Auto-refreshing every 10 seconds on Best Buy hits you with a survey, but 20 seconds does not.

I have also learned my computer does not appreciate me auto refreshing 3 websites and playing Overwatch at the same time.

2020 full of new experiences.
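A minimal sketch of that kind of stock-check loop, using the ~20-second interval that reportedly dodges the survey; the URL and the "Add to Cart" marker string are placeholders, not the real Best Buy page internals.

```python
import random
import time
import urllib.request

# Placeholders: substitute the real product page URL and whatever text
# on the page actually indicates stock.
PRODUCT_URL = "https://www.bestbuy.com/site/placeholder-3080-listing"
IN_STOCK_MARKER = "Add to Cart"  # hypothetical marker string

def next_delay(base: float = 20.0, jitter: float = 5.0) -> float:
    """Base interval plus 0..jitter seconds so requests aren't metronomic."""
    return base + random.uniform(0.0, jitter)

def check_once(url: str = PRODUCT_URL) -> bool:
    """Fetch the page once and report whether the marker text appears."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return IN_STOCK_MARKER in resp.read().decode("utf-8", "replace")

def watch() -> None:
    """Poll until the marker shows up, sleeping ~20s between checks."""
    while True:
        try:
            if check_once():
                print("possible stock -- go look!")
                return
        except OSError as exc:
            print(f"fetch failed: {exc}")
        time.sleep(next_delay())
```

Three of these plus Overwatch is apparently where the computer taps out.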

pyrotek
May 21, 2004



Pivo posted:

I used to fly real airplanes as a hobbyist (65hrs in a 152 but no PPL to show for it, life got in the way) and when I loaded it up and set it up to take off from my dinky little old field, I was 100% sold within minutes. They've even got the tiny untowered fields I used to practice touch & gos at. Without a moving map flying VFR IRL you get up to some tricks like identifying POIs and following highways and I was surprised at how much of the landscape I was able to recognize by sight. It's nothing like flying the real thing, mostly because the real thing shakes and can break and the humans involved aren't robots, but it's insane how much of that "feeling" of "being there" it replicates. Now the little Cessna isn't exactly difficult to model so I can't speak to accuracy flying bigger and faster things, but I've got a Logi / Saitek X52 Pro on the way just so I can take that little Cessna to places I've never gotten to go. This is going to be the thing that makes me take the plunge into VR as well (although SW Squadrons looks good too). poo poo it just might put me back in the real cockpit as well.

My buddy used to play Flight Sim for hours and hours when he was doing his multi-engine and then his commercial, and I *get it* now.

My 3080 pulls 40-50fps at 1440p with Alex Battaglia's optimized settings, with the GPU load sitting around 50-60% ... I mean sure I hope they optimize it, but it's not a twitch game, I never even felt the framerate for a second.

It's pretty tough to overstate how exciting this technology is.

If you would like to better utilize your 3080 in the game, increasing the render scaling setting shouldn't affect performance since you are CPU bound.

Just out of personal curiosity, what CPU do you have?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

spunkshui posted:

If AMD can just beat a 2080 super they have a real chance to sell a poo poo ton of GPUs right now.

From what was dropped in the AMD presentation yesterday, we can probably expect their top offering to be roughly 10-15% behind the 3080 (and without DLSS, and likely with lesser RT support).

That still beats a 2080Ti as long as you're not CPU limited (1080p gaming), so the question is gonna be pricing. If they try to ask $650 for it, no one should even consider it. $550? Now we're talkin'.

Pivo
Aug 20, 2004


pyrotek posted:

If you would like to better utilize your 3080 in the game, increasing the render scaling setting shouldn't affect performance since you are CPU bound.

Just out of personal curiosity, what CPU do you have?

9900k @ 4.9GHz all-core

I haven't actually benchmarked MSFS, I glanced at stats a couple of times while playing. If there's actual interest I can try to set up a proper benchmark run later today.

Render scale is cool but I don't see many jaggies. Certainly some headroom to play with.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Picked up my scalped-at-retail-price MSI Ventus 3090 and I'll install it today. Feeling sort of dumb about it, but I think I can resell it without much loss when I get my 3080 of choice, if I don't find a compelling use for All The VRAM or whatever.

Admiral Joeslop
Jul 8, 2010




drat, maybe I'll try saving a bit longer for the 3080 then, they'll probably actually exist by then. Or just settle for less than max settings.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
NVIDIA finds a new algorithm to improve video compression by three orders of magnitude

quote:

NVIDIA Research has invented a way to use AI to dramatically reduce video call bandwidth while simultaneously improving quality.

What the researchers have achieved has remarkable results: by replacing the traditional h.264 video codec with a neural network, they have managed to reduce the required bandwidth for a video call by an order of magnitude. In one example, the required data rate fell from 97.28 KB/frame to a measly 0.1165 KB/frame – a reduction to 0.1% of required bandwidth.

The mechanism behind AI-assisted video conferencing is breathtakingly simple. The technology works by replacing traditional full video frames with neural data. Typically, video calls work by sending h.264 encoded frames to the recipient, and those frames are extremely data-heavy. With AI-assisted video calls, first, the sender sends a reference image of the caller. Then, instead of sending a stream of pixel-packed images, it sends specific reference points on the image around the eyes, nose, and mouth.


A generative adversarial network (or GAN, a type of neural network) on the receiver side then uses the reference image combined with the keypoints to reconstruct subsequent images. Because the keypoints are so much smaller than full pixel images, much less data is sent and therefore an internet connection can be much slower but still provide a clear and functional video chat.

In the researchers’ initial example, they show that a fast internet connection results in pretty much the same quality of stream using both the traditional method and the new neural network method. But what’s most impressive is their subsequent examples, where internet speeds show a considerable degradation of quality using the traditional method, while the neural network is able to produce extremely clear and artifact-free video feeds.

The neural network can work even when the subject is wearing a mask, glasses, headphones, or a hat.





With this technology, more people can enjoy a greater number of features all while using monumentally less data.

But the technology use cases don’t stop there: because the neural network is using reference data instead of the full stream, the technology will allow someone to even change the camera angle to appear like they are looking directly at the screen even if they are not. Called “Free View,” this would allow someone who has a separate camera off-screen to seemingly keep eye contact with those on a video call.



NVIDIA can also use this same method for character animations. Using different keypoints from the original feed, they can add clothing, hair, or even animate video game characters.



Using this kind of neural network will have huge implications for the modern workforce that will not only serve to relieve strain on networks, but also give users more freedom when working remotely. However, because of the way this technology works, there will almost certainly be questions on how it can be deployed and lead to possible issues with “deep fakes” that become more believable and harder to detect.

Looks to be a special case for headshot type scenarios rather than in general but still, that's impressive.
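For scale, a quick check of the quoted per-frame figures (my arithmetic, not from the article): the example works out to roughly an 835x reduction, i.e. closer to the three orders of magnitude in the headline than the "order of magnitude" in the body text.

```python
# Sanity check of the quoted figures (KB per frame, from the article).
h264_kb = 97.28       # traditional h.264 frame
keypoint_kb = 0.1165  # GAN keypoint payload

ratio = h264_kb / keypoint_kb          # reduction factor
percent = keypoint_kb / h264_kb * 100  # remaining bandwidth

print(f"reduction factor: {ratio:.0f}x")       # ~835x
print(f"remaining bandwidth: {percent:.2f}%")  # ~0.12%
```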

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

NVidia posted:

Called “Free View,” this would allow someone who has a separate camera off-screen to seemingly keep eye contact with those on a video call.
...
NVIDIA can also use this same method for character animations. Using different keypoints from the original feed, they can add clothing, hair, or even animate video game characters.

Eagerly awaiting this so all the nerds in my standup meetings can transform themselves into random anime toons who keep 100% locked eye contact with you while the actual engineers are loving off and reading dumb internet forums instead of paying attention.

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World
Saying you need a 3080 for 1440p gaming seems a bit much; you're saying, in effect, that 1440p gaming didn't exist until it released. A 3070 is functionally a 2080ti supposedly, it will be fine for 1440p by most sane standards.

KingKapalone
Dec 20, 2005
1/16 Native American + 1/2 Hungarian = Totally Badass

DrDork posted:

drat it! None of my monitors went off to notify me :(

I don't see a post in the discord for this at all.

Also, I know the best 3080 is whatever one you can get, but are there any poor enough to avoid entirely?

The PSU I wanted came in stock so I have that at least.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Paul MaudDib posted:

NVIDIA finds a new algorithm to improve video compression by three orders of magnitude


Looks to be a special case for headshot type scenarios rather than in general but still, that's impressive.

It’s interesting that we consider this as lossy compression, rather than synthesis. A spectrum of course, but a solid branding choice given the noise around generated video. It seems like, rather than with DLSS just parameterizing a deterministic scaling algorithm, this has the model actually generating the replacement content, but it’s a little hard to tell from what they’ve said so far.

repiv
Aug 13, 2009


speaking of nvidia research they've published a new technique that allows rendering extremely high numbers of shadowed polygonal area lights (on the order of millions) with a fixed budget of a few rays per pixel

https://news.developer.nvidia.com/render-millions-of-direct-lights-in-real-time-with-rtx-direct-illumination-rtxdi/

nicely complements the dynamic probe global illumination they published earlier, we're a way off from this stuff being widely used but it's a cool look at what future renderers might be like

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

KingKapalone posted:

Also, I know the best 3080 is whatever one you can get, but are there any poor enough to avoid entirely?

Not that we've seen so far, no. The ASUS TUF seems to be the best bang for your buck, but all the others seem to be perfectly competent, and mostly seem to differentiate themselves based on aesthetics, frankly.

The FE version tends to be the hottest / loudest of the bunch (though not by much), but it's also notably smaller than most of the others, so that's a simple tradeoff there depending on what you're looking for.

We may see some better overclocking results from cards like the EVGA FTW3 or the ASUS Strix, but I've yet to see comprehensive reviews of them.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
We're also not seeing the numbers yet to know. Keep in mind, even the most unreliable models of past releases had a failure rate of like 3%, versus 1% for the best. There just aren't enough cards in the wild yet to get a sense of that.
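A back-of-the-envelope sketch of why the numbers aren't there yet: a standard two-proportion sample-size calculation, treating the 1% and 3% figures as illustrative rates rather than measured data.

```python
import math

# How many cards per model before a 1% vs 3% failure-rate gap is
# distinguishable from noise? Standard two-proportion power calculation
# at 95% confidence / 80% power. The 1%/3% rates are the post's
# illustrative numbers, not actual RMA data.
p1, p2 = 0.01, 0.03
z_alpha, z_beta = 1.96, 0.84  # two-sided 5% alpha, 80% power
p_bar = (p1 + p2) / 2

n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
      + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
     / (p1 - p2) ** 2)

print(math.ceil(n))  # on the order of ~770 cards per model
```

And that is per model, with honestly reported failures, which is why early anecdotes tell you basically nothing.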

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

DrDork posted:

Eagerly awaiting this so all the nerds in my standup meetings can transform themselves into random anime toons who keep 100% locked eye contact with you while the actual engineers are loving off and reading dumb internet forums instead of paying attention.

George RR Fartin
Apr 16, 2003




DrDork posted:

Not that we've seen so far, no. The ASUS TUF seems to be the best bang for your buck, but all the others seem to be perfectly competent, and mostly seem to differentiate themselves based on aesthetics, frankly.

The FE version tends to be the hottest / loudest of the bunch (though not by much), but it's also notably smaller than most of the others, so that's a simple tradeoff there depending on what you're looking for.

We may see some better overclocking results from cards like the EVGA FTW3 or the ASUS Strix, but I've yet to see comprehensive reviews of them.

And even though the Zotac gets a lot of poo poo for being a Zotac and "the worst", the warranty is long as hell, and it's only like single digit frames "worse" than anything else, which you will not notice. It's also the quietest 3080, so it's really not a horrible deal if you don't get gouged for it.

Enos Cabell
Nov 3, 2004


sean10mm posted:

Saying you need a 3080 for 1440p gaming seems a bit much; you're saying, in effect, that 1440p gaming didn't exist until it released. A 3070 is functionally a 2080ti supposedly, it will be fine for 1440p by most sane standards.

Right, but he was specifically asking about playing at 144fps with ultra settings.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Shlomo Palestein posted:

And even though the Zotac gets a lot of poo poo for being a Zotac and "the worst", the warranty is long as hell, and it's only like single digit frames "worse" than anything else, which you will not notice. It's also the quietest 3080, so it's really not a horrible deal if you don't get gouged for it.

It's worth noting that you need to register your Zotac card within 30 days to get the 3-year warranty; otherwise you just get a 2-year warranty. EVGA gives a 3-year warranty standard, and it's transferable, unlike Zotac's.

Still, you're right that the actual performance differences between the top and bottom of the stack right now are <5%, so functionally meaningless.

pyrotek
May 21, 2004



Pivo posted:

9900k @ 4.9GHz all-core

I haven't actually benchmarked MSFS, I glanced at stats a couple of times while playing. If there's actual interest I can try to set up a proper benchmark run later today.

Render scale is cool but I don't see many jaggies. Certainly some headroom to play with.

Increasing the render scale should help with the clarity more than reduce the jaggies. I agree the TAA solution in the game is very effective.

If you don't mind, give it a shot and let me know how it works! One day in the theoretical future where they are in stock, I'd like to get a 3080, and MSFS @ 1440 is one of the big reasons for me to upgrade.

Vir
Dec 14, 2007

Does it tickle when your Body Thetans flap their wings, eh Beatrice?
The sneaker botters might have just hit Cinemark and taken a bunch of free movie screenings on Halloween. At least the aftermath and bad reactions from the public seem to echo the 3080 FE launch day.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
NVidia Broadcast gonna get patched to let you do vtubing

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

DrDork posted:

Eagerly awaiting this so all the nerds in my standup meetings can transform themselves into random anime toons who keep 100% locked eye contact with you while the actual engineers are loving off and reading dumb internet forums instead of paying attention.

Its primary use is going to be porn. We all know this.

DeadFatDuckFat
Oct 29, 2012

This avatar brought to you by the 'save our dead gay forums' foundation.


Who else is F5ing the best buy site with me??? I'm really just hoping that cards other than that gigabyte go up on BB.


DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

K8.0 posted:

Its primary use is going to be porn. We all know this.

I mean, the entire internet is for porn, so this seems reasonable. How they're gonna use it when it only works for headshots will be interesting, I'm sure.
