Tamba
Apr 5, 2010

Zedsdeadbaby posted:

I remember when people said this about 1280x720, and then 1920x1080, and then.... you get the message. Either you have the hardware or you don't

https://en.m.wikipedia.org/wiki/Fovea_centralis#Angular_size_of_foveal_cones

There is a point where no amount of increased resolution will be useful to anyone but the visual equivalent of audiophiles
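
To put rough numbers on that: 20/20 acuity is commonly quoted as about one arcminute, i.e. roughly 60 pixels per degree. A quick back-of-the-envelope sketch (assuming that 60 PPD figure and a 16:9 panel, so treat the outputs as ballpark only) of the viewing distance beyond which a 27" panel's extra pixels stop being resolvable:

code:
import math

def min_distance_for_ppd(diag_in, horiz_px, target_ppd=60, aspect=(16, 9)):
    # Distance (inches) at which the panel reaches target_ppd; any farther
    # and an eye with ~1 arcminute acuity can no longer resolve single pixels.
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)   # physical panel width
    ppi = horiz_px / width_in                   # linear pixel density
    # one degree of vision spans d * tan(1 deg) inches at distance d,
    # so PPD = ppi * d * tan(1 deg); solve for d
    return target_ppd / (ppi * math.tan(math.radians(1)))

for name, px in [("1440p", 2560), ("4K", 3840), ("8K", 7680)]:
    print(f'27" {name}: ~60 PPD beyond ~{min_distance_for_ppd(27, px):.0f} in')
# 1440p: ~32 in, 4K: ~21 in, 8K: ~11 in -- past those distances the extra
# resolution buys less and less, which is the diminishing-returns argument.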

Cygni
Nov 12, 2005

raring to post

ijyt posted:

what is diminishing returns

4k def has diminishing returns, but 4k vs 1440p on a 27in computer display is absolutely noticeable to me, specifically for text. There are also pretty noticeable differences in lots of slower-paced or less post-processed games. Whether it's worth the current cost of 4k/144Hz is another question, but that cost (and the cost of the GPU horsepower to drive it) comes down every day.

Dr. Video Games 0031
Jul 17, 2004

Listerine posted:

Would ebay be the safest way to resell?

I do not sell stuff online often at all, but my understanding is that ebay is not the safest way to sell poo poo, though it's safe-ish as long as you know how to spot the red flags for scammers. The safest would probably be SA Mart or selling it to somebody local on Craigslist (while taking the typical safe Craigslist precautions, like doing the transaction in a bank or police station lobby or something). You'd get a lower price on SA Mart though, since anyone paying $100 for a 780 is a sucker (who knows when a heavily-used 9-year-old card will just die on you)

Dr. Video Games 0031 fucked around with this message at 23:18 on Aug 12, 2022

Bad Munki
Nov 4, 2008

We're all mad here.


My work Mac laptop, a few years old now, has an AMD Radeon Pro 5500M 4 GB. Work's going to augment me with an extra display or two, I believe the ones they keep on hand are 2k. Not at all using it for gaming, just regular desktop use. Pycharm and email and fifty different chat clients and poo poo like that. Is this thing going to be able to drive both displays or am I going to be sad? I feel like for that type of work, it should be fine, but it's still a lotta pixels to push around.

Dr. Video Games 0031
Jul 17, 2004

Bad Munki posted:

My work Mac laptop, a few years old now, has an AMD Radeon Pro 5500M 4 GB. Work's going to augment me with an extra display or two, I believe the ones they keep on hand are 2k. Not at all using it for gaming, just regular desktop use. Pycharm and email and fifty different chat clients and poo poo like that. Is this thing going to be able to drive both displays or am I going to be sad? I feel like for that type of work, it should be fine, but it's still a lotta pixels to push around.

It's fine. It doesn't take a powerful GPU at all to just do desktop stuff, even with multiple monitors at high resolutions.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

ijyt posted:

what is diminishing returns

You are right of course (and again, I vividly remember exactly this very debate between users of 720p and 1080p, with many saying it's pointless going to the latter), but there is an enormous difference visually between 1440p and 4k. I can immediately tell the two apart, especially on desktop and on games that do not use any kind of temporal anti-aliasing. 1440p isn't stable on edges (or 'jaggy-free') by a long, long shot and Destiny 2 proves that even 4k is very shimmery if there is no proper anti-aliasing to support the image quality.

carry on then
Jul 10, 2010

by VideoGames

(and can't post for 10 years!)

Bad Munki posted:

My work Mac laptop, a few years old now, has an AMD Radeon Pro 5500M 4 GB. Work's going to augment me with an extra display or two, I believe the ones they keep on hand are 2k. Not at all using it for gaming, just regular desktop use. Pycharm and email and fifty different chat clients and poo poo like that. Is this thing going to be able to drive both displays or am I going to be sad? I feel like for that type of work, it should be fine, but it's still a lotta pixels to push around.

I use a 2017 MacBook Pro hooked up to two 4k monitors running at effectively 6k each internally with the display scaling without productivity issues. Of course, google earth and the like will stutter heavily in full screen, but you shouldn’t feel like it’s unusable.
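
For anyone wondering what "effectively 6k each internally" means: macOS HiDPI scaling renders the desktop at twice the selected "looks like" resolution in each dimension, then downsamples to the panel's native pixels. A tiny sketch of the arithmetic, assuming a "looks like 3200x1800" scaled mode on a 4K panel (that specific mode is an assumption, not something stated above):

code:
# macOS HiDPI scaled modes render a 2x backing store, then downsample to native.
looks_like = (3200, 1800)   # assumed "looks like" scaled mode on a 4K panel
native = (3840, 2160)

backing = (looks_like[0] * 2, looks_like[1] * 2)
print(f"backing store: {backing[0]}x{backing[1]}")   # 6400x3600, i.e. roughly "6K"
overdraw = (backing[0] * backing[1]) / (native[0] * native[1])
print(f"pixels rendered vs. native 4K: {overdraw:.1f}x")   # ~2.8x per display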

CaptainSarcastic
Jul 6, 2013



Listerine posted:

I'm about to pop a couple new 3000 series cards into my computer to replace my old 780 and 2070. Just a little nervous since it's been forever since I swapped parts in. This would be the correct cable to plug into the adapter, right? It has VGA written on it, which is throwing me for a loop for some reason, but those are the only 8-pin cables in my PSU's kit.

Also is there any market at all for a 780? I'm guessing since they ended driver support I might as well throw it away?



What wattage is your PSU, and which 3000 series cards did you get? I know you're not using them for gaming, and am not sure what kind of power draw you can expect from rendering, but I figured it was worth asking.

b0ner of doom
Mar 17, 2006
4k vs 1440p is 1000% noticeable to me, and I find games just look better in 4k.

I'm not really sure if there's an established point to going past 4k yet tho.

Bad Munki
Nov 4, 2008

We're all mad here.


Dr. Video Games 0031 posted:

It's fine. It doesn't take a powerful GPU at all to just do desktop stuff, even with multiple monitors at high resolutions.

carry on then posted:

I use a 2017 MacBook Pro hooked up to two 4k monitors running at effectively 6k each internally with the display scaling without productivity issues. Of course, google earth and the like will stutter heavily in full screen, but you shouldn’t feel like it’s unusable.

Great, thanks. That’s all exactly what I figured but a sanity check never hurts. :cheers:

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

b0ner of doom posted:

4k vs 1440p is 1000 noticeable to me, and I find games just look better in 4k.

I'm not really sure if there's an established point to going past 4k yet tho.

Higher resolutions get harder and harder to drive by the simple fact that pixel count is width times height.
4k is 1080p four times over.
8k is 4k four times over.
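
The raw pixel counts, for anyone who wants to check the "four times over" arithmetic (trivial sketch):

code:
resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}

prev = None
for name, (w, h) in resolutions.items():
    px = w * h
    note = f" ({px // prev}x the previous step)" if prev else ""
    print(f"{name}: {px:,} pixels{note}")
    prev = px
# 1080p: 2,073,600 / 4K: 8,294,400 (4x) / 8K: 33,177,600 (4x again)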

Sony tried to pull a fast one and advertise their PS5 as 8k; it's even got an 8K label on the box, yet it can't actually output that at all.

repiv
Aug 13, 2009

if 8k gaming ever becomes an actual thing it's going to be exclusively driven by temporal upscaling, rendering 8k native is just stupid

Craptacular!
Jul 9, 2001

Fuck the DH
Much like how a Hyundai Accent can hit 120 MPH but it takes a Bugatti Veyron to hit 220 MPH, the problem is that 4K is much farther from 1080 than 720 was. You can maybe get much of the way to 4K with small improvements, but if getting the rest of the way requires a 4-slot infernocard that comes pre-equipped with a 280mm rad, you can forget about it.

Partially because of consoles and possibly because of a 90s-2000s era narrative about gaming being The New Hollywood or whatever, we keep letting TV manufacturers and movie studios determine how we use computers. It's time to accept that the spec best for one isn't best for the other. Either that, or we can target gaming on IMAX.

KillHour
Oct 28, 2007


Dr. Video Games 0031 posted:

In normal media where you're looking at like, people in normal, realistic environments, even if it's stylized or whatever, going past 4K will really give you some heavy diminishing returns. It's not that there are no benefits, but you are butting up against the limits of human visual acuity at that point.

This is not a normal use case at all but I just built a theater room and with the huge rear end TV in there, I can see the individual pixels even in 4k. 1080p looks legit rear end. Again, not a normal use case, but situationally, it can be important.

Dr. Video Games 0031
Jul 17, 2004

Yeah, of course. Ultimately what matters is visual density, pixels per degree of vision, which means that display size and seating distance both matter. This site has some pretty interesting information on the matter: http://phrogz.net/tmp/ScreenDensityCalculator.html#find:density,pxW:3840,pxH:2160,size:32,sizeUnit:in,axis:diag,distance:30,distUnit:in

So if you're, for example, sitting 5 feet away from a 100-inch 4K projector screen, then 1) move back you fool, and 2) the visual density will be more akin to sitting 2 feet away from a 27" 1440p screen. Respectable, maybe even good, but with room for improvement.
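
If you'd rather compute it than poke at the calculator, the formula is just pixels per degree from resolution, panel size, and viewing distance. A minimal sketch (assuming 16:9 panels) that reproduces the comparison above:

code:
import math

def pixels_per_degree(horiz_px, diag_in, dist_in, aspect=(16, 9)):
    # Angular pixel density: how many pixels one degree of vision spans.
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)       # physical panel width
    ppi = horiz_px / width_in                       # linear pixel density
    return ppi * dist_in * math.tan(math.radians(1))

print(round(pixels_per_degree(3840, 100, 60)))   # 100" 4K at 5 ft   -> ~46 PPD
print(round(pixels_per_degree(2560, 27, 24)))    # 27" 1440p at 2 ft -> ~46 PPD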

Dr. Video Games 0031 fucked around with this message at 04:35 on Aug 13, 2022

hobbesmaster
Jan 28, 2008

KillHour posted:

This is not a normal use case at all but I just built a theater room and with the huge rear end TV in there, I can see the individual pixels even in 4k. 1080p looks legit rear end. Again, not a normal use case, but situationally, it can be important.

A projector helps because the MTF of the lens acts as a very soft form of anti-aliasing, and you can defocus the image slightly if the pixels are too pronounced.

Listerine
Jan 5, 2005

Exquisite Corpse

CaptainSarcastic posted:

What wattage is your PSU, and which 3000 series cards did you get? I know you're not using them for gaming, and am not sure what kind of power draw you can expect from rendering, but I figured it was worth asking.

1600W, I've got one 3070 and one 3060Ti.

CaptainSarcastic
Jul 6, 2013



Listerine posted:

1600W, I've got one 3070 and one 3060Ti.

Cool, I would think you're more than fine there. Just thought I should ask.

Listerine
Jan 5, 2005

Exquisite Corpse

CaptainSarcastic posted:

Cool, I would think you're more than fine there. Just thought I should ask.

I appreciate it, I was pretty sure I'd be fine but the spot check is reassuring.

KillHour
Oct 28, 2007


hobbesmaster posted:

A projector helps because the MTF of the lens can be considered a very soft form of anti aliasing and you can de focus the image slightly too if pixels are too pronounced.

I could never go projector. I got a fuckoff huge OLED and it's sooooooo good.

Listerine
Jan 5, 2005

Exquisite Corpse
So I popped these two cards in, and the fan is going hard on the 3060Ti, and looking at the temp in both Geforce Experience's performance monitor and in Windows Task Manager, it's in the 80C range.

I'm using the 3070 to drive my monitor, and this is at idle. I can revisit my case's cooling situation and add fans but why would the 3060Ti be operating at 20% when it's not driving any monitors and I'm not using it to render?

[screenshot attachment]

Cream-of-Plenty
Apr 21, 2010

"The world is a hellish place, and bad writing is destroying the quality of our suffering."

Listerine posted:

So I popped these two cards in, and the fan is going hard on the 3060Ti, and looking at the temp in both Geforce Experience's performance monitor and in Windows Task Manager, it's in the 80C range.

I'm using the 3070 to drive my monitor, and this is at idle. I can revisit my case's cooling situation and add fans but why would the 3060Ti be operating at 20% when it's not driving any monitors and I'm not using it to render?



I don't have an answer for you, but on top of that, why would the 3060Ti be hovering at 80C at 20% usage? That seems fucky.

Dr. Video Games 0031
Jul 17, 2004

Listerine posted:

So I popped these two cards in, and the fan is going hard on the 3060Ti, and looking at the temp in both Geforce Experience's performance monitor and in Windows Task Manager, it's in the 80C range.

I'm using the 3070 to drive my monitor, and this is at idle. I can revisit my case's cooling situation and add fans but why would the 3060Ti be operating at 20% when it's not driving any monitors and I'm not using it to render?



Look at the power consumption in the Geforce Experience's performance panel, or in hwinfo64 or something. What's it at?

Cavauro
Jan 9, 2008

is the fan assembly of the 3060 Ti right up against the back of the 3070?

Lord Stimperor
Jun 13, 2018

I'm a lovable meme.

Craptacular! posted:

Either that, or we can target gaming on IMAX.


I agree with what you say, but what you call gaming on IMAX is, I think, actually a reasonable proposition. I have a 27" screen that is starting to feel normal-sized, and eventually I can imagine wanting to go even bigger, perhaps curved. Or simply a wide-FOV VR headset. I'm absolutely convinced that we won't run out of use cases for pixels in the foreseeable future.

Listerine
Jan 5, 2005

Exquisite Corpse

Cavauro posted:

is the fan assembly of the 3060 Ti right up against the back of the 3070?

3070 is above the 3060Ti, so its fans are blowing down onto the 3060Ti, but I spaced them out so there's an empty PCI-e slot between them. So two slots of empty space in between the cards. I could move it down one more slot I suppose, but the 3070 is not running hot and fan is barely moving on that card.

Dr. Video Games 0031 posted:

Look at the power consumption in the Geforce Experience's performance panel, or in hwinfo64 or something. What's it at?

See the screenshot. This is about 2 minutes after booting up, the only program I have open is Firefox.

[screenshot attachment]

Sininu
Jan 8, 2014

Listerine posted:

3070 is above the 3060Ti, so its fans are blowing down onto the 3060Ti, but I spaced them out so there's an empty PCI-e slot between them. So two slots of empty space in between the cards. I could move it down one more slot I suppose, but the 3070 is not running hot and fan is barely moving on that card.

See the screenshot. This is about 2 minutes after booting up, the only program I have open is Firefox.


Holy gently caress, the usage is all sorts of hosed up on the 3060Ti, but also 178W of power use at 0% utilization on the 3070???

Could you check the task manager process list? It shows GPU utilization for each process separately. If you can't see the GPU column, you have to right-click on the column headers and tick GPU.

Dr. Video Games 0031
Jul 17, 2004

Listerine posted:

3070 is above the 3060Ti, so its fans are blowing down onto the 3060Ti, but I spaced them out so there's an empty PCI-e slot between them. So two slots of empty space in between the cards. I could move it down one more slot I suppose, but the 3070 is not running hot and fan is barely moving on that card.

See the screenshot. This is about 2 minutes after booting up, the only program I have open is Firefox.



I admittedly have never used a dual GPU setup so I don't know how it usually is, but this definitely seems odd to me. First of all, the 3070 is clearly misreporting its power consumption. My first thought is that it must be adding the 3060 Ti's consumption to its own, because 30W would be about right for an idling 3070, and 525MHz at 0.7 volts does not draw 178W. The 148W on the 3060 Ti does seem to be real though, considering that it's reporting 89% utilization at 1875MHz and 1.1 volts. Additionally, 82C at 148W power consumption is terrible. Either that GPU has the world's worst cooler, or something else is wrong.

About the card positioning: GPU fans pull air in, not blow it out. They also need a lot of room to pull air from, so I would not move the 3060 Ti any lower if it means moving it closer to a solid case panel. The main point of concern here would be if the 3060 Ti's fans don't have enough room to breathe. But even if you fix the cooling situation, the power draw/utilization situation is bizarre. Can you open up the task manager and sort the tasks by GPU utilization to see what's using it?

edit: It would also be helpful to know the model of card the 3060 Ti is, what case it's in, and the fan configuration. I'm suspecting there are two problems happening in tandem here: something is heavily utilizing the GPU in the background (almost like a backdoor crypto miner would), and the card isn't getting enough air to cool itself. The 3070 misreporting its power usage is also an issue, but just an annoying one and not an especially problematic one.
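
As a side note for anyone else chasing a mystery load like this: besides Task Manager, nvidia-smi can dump per-GPU power/utilization and the processes currently running on each GPU. A rough sketch wrapping it in Python (assumes the NVIDIA driver's nvidia-smi is on the PATH and uses its standard query-gpu/query-compute-apps fields):

code:
import subprocess

# Per-GPU snapshot: utilization, power draw, temperature
print(subprocess.run(
    ["nvidia-smi",
     "--query-gpu=index,name,utilization.gpu,power.draw,temperature.gpu",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout)

# Processes currently using the GPUs (a background miner or a forgotten
# compute client would show up here)
procs = subprocess.run(
    ["nvidia-smi",
     "--query-compute-apps=pid,process_name,used_memory",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout
print(procs or "no compute processes found")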

Dr. Video Games 0031 fucked around with this message at 10:01 on Aug 13, 2022

Listerine
Jan 5, 2005

Exquisite Corpse
Welp, it was Folding at Home. I haven't used it in over a year, haven't opened it or assigned it any new tasks, and I couldn't even tell if it was working on anything, but that was the process that had the GPU cranked up. I don't know why it would come alive again just because I swapped out cards.

Thanks so much, both cards dropped to ~40C drawing 5-10W in the Geforce app shortly after uninstalling FaH.

Both are Nvidia cards in a Cooler Master II Ultra Tower Case, with the stock fans- a 200 mm in the front, 140 mm in the back, 120 mm on the top, and 2x 120mm on the side by the lower hard drive cage.

Dr. Video Games 0031
Jul 17, 2004

There was a fresh RDNA3 leak earlier, and it contradicts some earlier rumors and supports some others: https://www.angstronomics.com/p/amds-rdna-3-graphics (edit: oops, linked the wrong article at first)

This leaker previously accurately leaked many details about the upcoming AM5 chipsets (which were confirmed at Computex). There are no performance estimates or anything in this new leak, but it lays out the chief specs of the three planned RDNA3 GPUs. The main difference from the earlier leaks is that there will be half as much Infinity Cache per MCD as previously thought. AMD is also probably not planning to release any cards with 3D-stacked MCDs at launch, since there isn't enough performance benefit for the cost. This means the 7900 XT will apparently not have as much cache as the 6900 XT, but in exchange, the new cache will have more bandwidth, so maybe it's a wash or a net gain. And I wouldn't be surprised if they do some super expensive 7950 XT with V-Cache at some point down the line.

The idea presented here that they can create different SKUs with the same GPU but with different numbers of MCDs is interesting and must help save costs. They aren't just fusing off parts of the GPU; they're actually removing silicon from it. Taken to the extreme, there could be a future where they have many small GCDs in a GPU, and each SKU is just a different combination of tiny dies. Maybe that's for RDNA4 or 5.

Listerine posted:

Welp it was Folding at Home. I haven't used it in over a year, haven't opened it or assigned it any new tasks, and I couldn't even tell if it was working on anything, but that was the process that had the GPU cranked up. I don't know why it would come alive again just because I swapped out cards

Thanks so much, both cards dropped to ~40C drawing 5-10W in the Geforce app shortly after uninstalling FaH.

Both are Nvidia cards in a Cooler Master II Ultra Tower Case, with the stock fans- a 200 mm in the front, 140 mm in the back, 120 mm on the top, and 2x 120mm on the side by the lower hard drive cage.

I figured it would be something like that, and I'm glad it wasn't a sneaky crypto miner.

I still think 82C for 148W on a 3060 Ti Founders edition is kind of alarming, but maybe F@H was hammering it in a way that heats up the GPU more? I'm looking at that case, and it seems like the GPUs should at least have plenty of room to breathe. That's also a lot of fans, though the positioning of them is a little odd and not conducive to great airflow (nothing you can do about it, that's just how the case is built). But if everything works well during the normal, intended usage of those GPUs, then I guess I wouldn't worry about it.

Dr. Video Games 0031 fucked around with this message at 12:56 on Aug 13, 2022

namlosh
Feb 11, 2014

I name this haircut "The Sad Rhino".


repiv posted:

the boox mira is a 13.3" 2200x1650 e-ink monitor

800 bux though

Well dang! The Mira looks like a winner for my use case of displaying pdfs. You’re right that it’s too rich for my blood right now though.

Thx both of you for enlightening me!

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

teagone posted:

Where are my 24-25" 1080p 144+ Hz OLED monitors? I don't care for 4K PC displays or 4K gaming, personally.

I don’t really get this — how can you not care for a certain resolution? It seems immaterial beyond higher is better, but higher isn’t better for you?

Is it because it’s cost prohibitive or is it a nostalgia thing or something else? I could see wanting to play on old tech for older games, that could make sense.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

tehinternet posted:

I don’t really get this — how can you not care for a certain resolution? It seems immaterial beyond higher is better, but higher isn’t better for you?

Is it because it’s cost prohibitive or is it a nostalgia thing or something else? I could see wanting to play on old tech for older games, that could make sense.

I can see this in the sense that it takes the same GPU power (in theory) to push 1080@240 as it does 4k@60. And some people would very reasonably prefer the sky-high FPS over the increased resolution.

24" OLED ain't happening anytime soon, though--most everyone wants larger than that, so it'll be a long time in trickling down.

CoolCab
Apr 17, 2005

glem

tehinternet posted:

I don’t really get this — how can you not care for a certain resolution? It seems immaterial beyond higher is better, but higher isn’t better for you?

Is it because it’s cost prohibitive or is it a nostalgia thing or something else? I could see wanting to play on old tech for older games, that could make sense.

it costs an absolute ton of processing power for something that i very strongly suspect i wouldn't pass a double blind on - i could probably order 1080, 1440 and 4k if you gave me them all at once, but if you gave me a series of images and told me to guess the resolution i'd be sunk. i guess i'm more responsive to higher framerates, but i've never watched e.g. a 4k video and thought it looked better or even noticeably different from a blu ray. probably comes down to individual visual acuity at some point; my eyes suck even with glasses, so higher levels of detail don't really come across while smoother animation is very clear.

because LCD monitors suck at upscaling, you can't really buy a 4k and run it in 1080 without it looking like dogshit. this is why all the high end esports nerds buy those 300+ Hz monitors in 1080p, too - if you are chasing the most frames then more resolution is totally counterproductive.

teagone
Jun 10, 2003

That was pretty intense, huh?

tehinternet posted:

I don’t really get this — how can you not care for a certain resolution? It seems immaterial beyond higher is better, but higher isn’t better for you?

Is it because it’s cost prohibitive or is it a nostalgia thing or something else? I could see wanting to play on old tech for older games, that could make sense.

I've always used my PCs primarily for gaming, and I would likely see no difference in gaming quality/experience between 4k and 1080p on a 24-25" panel at the distance I'm sitting away from the monitor. I've also always skewed towards mainstream GPUs, and the models in that price bracket ($200-$250ish) typically haven't been able to push past 200+ FPS at 4K resolution.

CoolCab
Apr 17, 2005

glem
the only reason i went up in resolution was pixel density tbh

e: as in i could get a 27 inch monitor without it looking all smeary

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
Isn't some of the point of all of the investment in scaling tech we're seeing recently like DLSS and FSR to get some of the benefit of having 4K displays without having to do computationally expensive 4K rendering?

With TAA in lots of game engines, as well as DLSS and FSR, you can render things at 1080p, 1440p, or a floating resolution in between those, scale it up to 4K, and it will look better than a 1080p or 1440 native render would?

I have to admit I haven't done any investigation into DLSS performance on the 3xxx generation, but the higher end 20x0 GPUs could push 1440p rendering scaled up to 4K via DLSS past 150fps, and I'm assuming that with modern scalers you framerate freaks could run at 720p scaled up to 1440p and push your hundreds of frames even more consistently, and it would look better than native 1080p or especially 720p.
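
For reference, the upscalers' quality modes are usually described as fixed per-axis fractions of the output resolution (roughly 2/3 for Quality, 0.58 for Balanced, 1/2 for Performance, 1/3 for Ultra Performance; treat the exact factors as approximate rather than gospel). A small sketch of what that means for a 4K output:

code:
# Approximate per-axis render-scale factors for DLSS/FSR 2 quality modes
modes = {"Quality": 2 / 3, "Balanced": 0.58,
         "Performance": 1 / 2, "Ultra Performance": 1 / 3}
out_w, out_h = 3840, 2160   # 4K output target

for mode, s in modes.items():
    w, h = round(out_w * s), round(out_h * s)
    print(f"{mode}: renders ~{w}x{h}, reconstructed to {out_w}x{out_h}")
# e.g. Performance mode renders 1920x1080 internally and upscales to 4K, which
# is why "4K DLSS Performance" costs roughly what native 1080p does, plus the
# upscaler's own overhead.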

teagone
Jun 10, 2003

That was pretty intense, huh?

CoolCab posted:

the only reason i went up in resolution was pixel density tbh

e: as in i could get a 27 inch monitor without it looking all smeary

Yeah, if I ever moved up beyond 24 or 25" displays, I'd go for 27" 1440p at the most. But my old XG2560 (240Hz TN G-Sync panel) and my newer XG2405 (FreeSync Premium panel that I also have an Xbox Series S hooked up to) are pretty much perfect for me.

Reason I want OLED is for better colors and eliminating backlight bleed. I care more about higher refresh rate than I do higher resolution.

kliras
Mar 27, 2021

Twerk from Home posted:

Isn't some of the point of all of the investment in scaling tech we're seeing recently like DLSS and FSR to get some of the benefit of having 4K displays without having to do computationally expensive 4K rendering?

With TAA in lots of game engines, as well as DLSS and FSR, you can render things at 1080p, 1440p, or a floating resolution in between those, scale it up to 4K, and it will look better than a 1080p or 1440 native render would?

I have to admit I haven't done any investigation into DLSS performance on the 3xxx generation, but the higher end 20x0 GPUs could push 1440p rendering scaled up to 4K via DLSS past 150fps, and I'm assuming that with modern scalers you framerate freaks could run at 720p scaled up to 1440p and push your hundreds of frames even more consistently, and it would look better than native 1080p or especially 720p.

it's also just because nvidia decided to tank all videogame performance with rtx. kind of the 1-2 punch of offering semi-performant raytracing over the competition

repiv
Aug 13, 2009

Twerk from Home posted:

Isn't some of the point of all of the investment in scaling tech we're seeing recently like DLSS and FSR to get some of the benefit of having 4K displays without having to do computationally expensive 4K rendering?

With TAA in lots of game engines, as well as DLSS and FSR, you can render things at 1080p, 1440p, or a floating resolution in between those, scale it up to 4K, and it will look better than a 1080p or 1440 native render would?

yes, it's been the norm on consoles since the PS4 Pro/XB1X generation, because those systems weren't really capable of driving native 4K, and that tech is filtering its way onto PC now

speaking of which, spiderman PC offers FSR2, DLSS, and insomniac's own TAAU that they use on the playstation. i wonder how that stacks up to the usual PC scalers
