njsykora
Jan 23, 2012

Robots confuse squirrels.


GN in their 4060ti review made a crack about lazy devs not optimising and in the same video showed Control not loading textures properly on the card. Noted badly optimised game, Control.

Dr. Video Games 0031
Jul 17, 2004

You can have good ports with advanced features that don't support 8GB maxed out, and poorly optimized games, both happening at the same time. And it can also be true that badly optimized games are a mainstay of PC ports, and that as 8GB becomes more borderline, you'll run into more games that don't even try to support it well.

Cygni
Nov 12, 2005

raring to post

Presented without a take: 17.75% of users on the steam hardware survey have more than 8gb of VRAM.

Yudo
May 15, 2003

Cygni posted:

Presented without a take: 17.75% of users on the steam hardware survey have more than 8gb of VRAM.

Consoles are king. We are along for the ride, not driving.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Yudo posted:

Consoles are king. We are along for the ride, not driving.

Series S has 8GB of VRAM, and MS doesn't allow you to target only the Series X.

If you aren't optimizing for 8GB, you are effectively a PS5 exclusive.

Series X also only has 10GB of VRAM. If you go past either of these, you get GTX 970 style slowdowns. Nobody is legitimately going to put graphics assets in the slow system memory segment.

Is some cushion nice? Sure. But the problem is devs not utilizing the tools they've been given, and not caring because there's no King Of PC Games who will come down on the studio if it's lovely. Steam refunds aren't the same thing as Sony or MS riding shotgun and telling your publisher that you're a fuckup and you need to be working nights and weekends until this is fixed because we're not releasing this patch like this.

These days you can literally spend the first 45 minutes of your 2 hour refund window just waiting for shaders to compile, even with a 3600.

Like what do you call a game that gets 6fps in everything except the 4090? Well-optimized, I guess? 6fps would not make it past the MS/Sony QA, period. But there's nobody to tell them no on PC, apart from Steam refunds, which barely even matter.

And if your standard is "will bash through broken ports on launch day"... you're in the market for a 4090 and a 7800X3D, nothing short of that is going to do. As we've seen with Jedi Survivor and Gollum and others. 20fps is better than 6fps, right?

Paul MaudDib fucked around with this message at 03:36 on May 28, 2023

Anime Schoolgirl
Nov 28, 2002

Zen 2 and Zen 3 APUs allow you to reserve up to 16gb of DRAM as dedicated VRAM :buddy:

Dr. Video Games 0031
Jul 17, 2004

The current-gen consoles allow developers to utilize around 10GB of their memory pool as VRAM, so this is just another one of those things where, as more current-gen-only games come out, more games that struggle with 8GB will crop up. Especially for games that rely on the PS5's lightning fast data streaming capabilities, since you'll probably need to load more data into memory at once as a buffer to compensate for the slower streaming on PC.
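
To make that concrete, here's a back-of-the-envelope sketch of why slower streaming translates into a bigger resident buffer. All the numbers are made-up placeholders, not measurements from any actual game:

```cpp
// Sketch: if delivery can't keep up with consumption, the shortfall over the
// prefetch window has to already be sitting in VRAM. Numbers are hypothetical.
#include <cstdio>

int main() {
    const double consume_gbps = 2.0;  // rate the game chews through streamed assets
    const double ps5_gbps     = 5.5;  // PS5 raw SSD throughput
    const double pc_gbps      = 1.5;  // hypothetical effective PC streaming rate
    const double lookahead_s  = 3.0;  // how far ahead the engine must prefetch

    auto extra_buffer_gb = [&](double delivery_gbps) {
        double shortfall = consume_gbps - delivery_gbps;
        return shortfall > 0.0 ? shortfall * lookahead_s : 0.0;
    };

    printf("extra VRAM buffer on PS5: %.1f GB\n", extra_buffer_gb(ps5_gbps)); // 0.0
    printf("extra VRAM buffer on PC:  %.1f GB\n", extra_buffer_gb(pc_gbps));  // 1.5
}
```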

How do you get around these issues and still offer robust support for 8GB cards? It'll be possible, but it'll take a large amount of extra effort that not all studios will go through, and 8GB owners will probably have to use lower-res textures and may be locked out of the more advanced RT options. This isn't really a concern for a ~$250 card intended for 1080p, but selling an 8GB card for $400 in 2023 is a joke, and Nvidia should never have been blindsided by this. They can look at what's happening on console as well as anyone.

Paul MaudDib posted:

And if your standard is "will bash through broken ports on launch day"... you're in the market for a 4090 and a 7800X3D, nothing short of that is going to do. As we've seen with Jedi Survivor and Gollum and others. 20fps is better than 6fps, right?

There are degrees of brokenness here, and you're using the most extreme example possible with Gollum. I'd like to place the bar for a $400 card at "Can at least run games at the same level of quality and performance as a 2.5-year-old console," which is something the 4060 Ti may struggle with in games that are heavy on memory usage, even if they are well-optimized.

Dr. Video Games 0031 fucked around with this message at 03:43 on May 28, 2023

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Dr. Video Games 0031 posted:

The current-gen consoles allow developers to utilize around 10GB of their memory pool as VRAM, so this is just another one of those things where, as more current-gen-only games come out, more games that struggle with 8GB will crop up. Especially for games that rely on the PS5's lightning fast data streaming capabilities, since you'll probably need to load more data into memory at once as a buffer to compensate for the slower streaming on PC.

Series S has 8GB and uses the same DirectStorage APIs. MS doesn't let you target only the Series X so any xbox title has to target 8GB.

The problem is mostly from PS5 ports that don't have a DirectStorage implementation, or titles that do dumb poo poo that violates API spec/PC performance norms. Like there is no TLOU for Xbox, because it was a PS5 exclusive. Jedi Survivor was allocating resources inside shaders (which isn't great on console either, but is less catastrophic). That's the problem. And there is no real gating like Sony and MS do for their platforms with pre-launch and pre-patch QA validation: you can slap any old poo poo onto Steam and 95% of people won't refund it. On consoles literally every single patch has to go through MS/Sony QA, who will grill you on why your game is running at 6fps; that simply would never fly.
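
For reference, a minimal sketch of what a DirectStorage file-to-GPU-buffer request looks like. This assumes the DirectStorage SDK on Windows; "assets.bin" and the 64 MiB size are placeholders, and all error handling is omitted:

```cpp
// Minimal DirectStorage sketch: queue a read straight from disk into a GPU
// buffer, no CPU-side memcpy. Link d3d12.lib + dstorage.lib.
#include <windows.h>
#include <wrl/client.h>
#include <d3d12.h>
#include <dstorage.h>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device));

    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    // One queue for file-sourced requests at normal priority.
    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Device     = device.Get();
    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets.bin", IID_PPV_ARGS(&file)); // placeholder path

    const UINT32 size = 64u * 1024 * 1024; // placeholder asset size

    // Destination: a default-heap buffer big enough to hold the asset.
    D3D12_HEAP_PROPERTIES heap{};
    heap.Type = D3D12_HEAP_TYPE_DEFAULT;
    D3D12_RESOURCE_DESC desc{};
    desc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width            = size;
    desc.Height           = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels        = 1;
    desc.SampleDesc.Count = 1;
    desc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;
    ComPtr<ID3D12Resource> buffer;
    device->CreateCommittedResource(&heap, D3D12_HEAP_FLAG_NONE, &desc,
                                    D3D12_RESOURCE_STATE_COMMON, nullptr,
                                    IID_PPV_ARGS(&buffer));

    // The request itself: file -> GPU buffer.
    DSTORAGE_REQUEST req{};
    req.Options.SourceType          = DSTORAGE_REQUEST_SOURCE_FILE;
    req.Options.DestinationType     = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    req.Source.File.Source          = file.Get();
    req.Source.File.Size            = size;
    req.UncompressedSize            = size;
    req.Destination.Buffer.Resource = buffer.Get();
    req.Destination.Buffer.Size     = size;
    queue->EnqueueRequest(&req);

    // Fence lets the render thread wait for the data to land.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    queue->EnqueueSignal(fence.Get(), 1);
    queue->Submit();
    while (fence->GetCompletedValue() < 1) Sleep(1);
}
```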

Anyway, yes, Series S level VRAM doesn't mean you get an ultra-textures experience, but it shouldn't be the kind of bullshit that TLOU pulled, where it's literally PS3 mode if you go down a texture level. That was an implementation problem (ahem, "well-optimized" according to HUB). 8GB should get you medium textures on PC throughout the entire generation, at absolute minimum.

People have become allergic to turning down textures though. And it's an easy quality win, of course. But the reality is a $300 card is the barest of entry-level at this point, $200 is 6500XT/6400 style bullshit ported over from laptops (again, this actually includes 7600 too). Even $400 is below the midrange now. 4070 is the lowest card that isn't compromised and that's where the "enthusiast midrange" starts now. Honestly even after AMD responds I really can't see anything below $500 that doesn't have significant compromises in something that enthusiasts care about.

6700/6700XT is a firesale on an outgoing model and won't be sustained in the long term. And I've been saying for literally a year now that if you like the 6600/6700 family prices you should just buy it because the next-gen cards obviously weren't going to compete with super deeply discounted RDNA2 stuff in the low end. 7600 at $279 is pretty ok for a newly launched card in the grand scheme of things, but like the 2070 it can't possibly measure up to 1080 Tis selling at half of MSRP, so to speak. 6700XT is a $480 MSRP card. And some of that was due to pandemic/mining but even in an ideal market it probably would have launched at $399 or $379 and fallen to $359 or something along those lines.

Consumer expectations have just generally gotten out of line with reality. Low end cards are not advancing at the same rate as midrange cards due to PHY die area and fixed BOM/manufacturing costs. And $300-400 is a low-end card now. If you find that statement offensive, buy a console, I have said they are a more cost-effective way to build a system. They are.

AMD is learning the same lesson as NVIDIA did with the 20-series: don't over-cut your old stuff or the new stuff looks like poo poo. In some cases if you go too deep it can be an actual regression. And video game studios have learned the same lesson... if you train consumers that steam sale prices are the new normal, you damage your sales at more reasonable prices. If you mark all your AAA titles down to $8-10 after a year or two, you won't make any revenue at launch.

Paul MaudDib fucked around with this message at 05:08 on May 28, 2023

Anime Schoolgirl
Nov 28, 2002

many developers have demonstrably been using more than 10gb of "soft-cap" VRAM on the series x deliberately, leading to it trailing behind the PS5 in performance, mostly because the PS5 lets you use 13.5gb freely as video ram, so why shouldn't they do the same thing on xbox? :downs:

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Anime Schoolgirl posted:

Zen 2 and Zen 3 APUs allow you to reserve up to 16gb of DRAM as dedicated VRAM :buddy:

gradenko_2000 posted:

obviously the solution here is to use an APU because then you can have 64 gigs of RAM or whatever

Lmfao hell yeah

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
haha but what if instead of some pissant 2-dimm bus we threw 6 stacks of LPDDR5X on top of the die? ... ... ...

yea $3000 laptop with 64GB of "VRAM" why not, a GPGPU with DDR-level latency and GDDR level bandwidth is probably fun.

🍏

Cygni
Nov 12, 2005

raring to post

it seems that part of porting to the PC lately includes saying "gently caress it, make the install 150gb and give them the original sized textures, who cares if they look no different"

once the 3090 came out with 24gb and gamers actually bought it, the cap for "ultra textures" jumped as well, even if they don't really make any difference in some games.

Anime Schoolgirl
Nov 28, 2002

starfield will probably run better on 8gb cards, since the "developing like the PS5 is the only console that exists" disease deliberately isn't a thing here thanks to exclusivity, but I still expect it to be a massive disaster in many other ways because you can't feasibly onboard a large chunk of a 500 person development team quickly at a studio used to having at most 130 people on a project, unless you have an assembly line setup like the call of duty studios

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Dr. Video Games 0031 posted:

you're using the most extreme example possible with Gollum.

well, would you say it's NVIDIA's planned obsolescence in action? buy a 4090 to play at 20fps, that's the future now?

clearly a well-optimized game, scales down well, and I think we'll be seeing more of this as we move into true next-gen titles. everything below 4090/7900XTX is obsolete now, enjoy your 6fps with 6950XT. 16GB is just not enough anymore - Jedi Survivor showed the same thing, 16GB is really pretty minimal if you want to play these ports. Buy 24GB or buy some crap, your choice.

It's not that 6950XT is obsolete, it's that 16GB just isn't enough anymore. The scaling is good, it's your hardware. Planned obsolescence in action, we've seen it across multiple titles now.

https://twitter.com/HardwareUnboxed/status/1642018602885971968

Paul MaudDib fucked around with this message at 05:26 on May 28, 2023

shrike82
Jun 11, 2005

Anime Schoolgirl posted:

starfield will probably run better on 8gb cards, since the "developing like the PS5 is the only console that exists" disease deliberately isn't a thing here thanks to exclusivity, but I still expect it to be a massive disaster in many other ways because you can't feasibly onboard a large chunk of a 500 person development team quickly at a studio used to having at most 130 people on a project, unless you have an assembly line setup like the call of duty studios

lol that game is going to make for great content either in-game or on a meta level if it's a shitshow
people are expecting it to turn around the xbox's fortunes

Dr. Video Games 0031
Jul 17, 2004

Paul MaudDib posted:

well, would you say it's NVIDIA's planned obsolescence in action? buy a 4090 to play at 20fps, that's the future now?

clearly a well-optimized game, scales down well, and I think we'll be seeing more of this as we move into true next-gen titles. everything below 4090/7900XTX is obsolete now, enjoy your 6fps with 6950XT. 16GB is just not enough anymore - Jedi Survivor showed the same thing, 16GB is really pretty minimal if you want to play these ports. Buy 24GB or buy some crap, your choice.

It's not that 6950XT is obsolete, it's that 16GB just isn't enough anymore. The scaling is good, it's your hardware. Planned obsolescence in action, we've seen it across multiple titles now.

https://twitter.com/HardwareUnboxed/status/1642018602885971968

...What? I was talking about the 4060 Ti. What are you even saying with this post?

Anime Schoolgirl
Nov 28, 2002

shrike82 posted:

lol that game is going to make for great content either in-game or on a meta level if it's a shitshow
people are expecting it to turn around the xbox's fortunes
people are hoping it finally kills bethesda but sadly the reality is that as long as people can add their favorite 20 year old slab mods to it, it'll still sell gangbusters, but any effort made to restrict mods in any way might be the beginning of the end. fallout 4 showed that they can barely make a game and people will buy it in droves anyway.

that strategy just wasn't successful for a full-on multiplayer looter shooter like 76, because people expect an online game to have continual effort spent on it, which is why it's now strangely significantly better than 4 ever was. not that i'd tell people to buy 76 for more than $7 though, it's nowhere near worth its regular price

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Dr. Video Games 0031 posted:

...What? I was talking about the 4060 Ti. What are you even saying with this post?

gollum's hosed on pretty much everything, even a 4090 doesn't bash through it all that well.

https://twitter.com/OfficialPCMR/status/1661790642064834561

There's no way that single-digit FPS on everything except the 4090, 4080, 7900XTX, and 7900XT (ok, the 6950XT gets 15) is right either; it's a poo poo-tier studio that got bumped from AA to AAA and the game sucks and is broken.

it's just broken, and a lot of these ports like TLOU are just broken too. TLOU got conspicuously and dramatically fixed, and it seems like a couple of basic and dramatic mistakes have been identified in Jedi Survivor as well.

I'm not saying every game will get fixed, but it's illustrative of the problem. There's nobody riding herd on a lot of the ports; poo poo is shipping totally broken on literally everybody's machine. It's not like AMD is dumpstering NVIDIA anyway, the 7900XTX is literally sub-30fps (the 4090 is 40fps), and Jedi Survivor didn't run right on PC at all either. And TLOU went from PS3-looking to proper at medium with 30% less VRAM. It was doing obviously broken poo poo that was fixed reasonably quickly once someone bothered.

If you wanna bash through with a 4090/4080/7900XTX at 20-40fps then ok I guess but this is the "how much performance can I buy?" strategy, value is not a factor in this equation. 8GB vs 12GB does not even come into this question, you are buying the 4090 or 7900XTX or 7900XT because those are the fastest and have enough VRAM to just load all the assets, and you will install them on your 5800X3D/7800X3D. That's the current strategy for tanking through bullshit broken launch titles. can you do it, yes always, even fallout 4 etc. is it cost effective? never.

5800X3D/7800X3D might frankly also be more influential than the actual GPU. V-cache actually bashes through broken pointer-chasing bullshit engines super well, and any NVIDIA driver stack overhead probably as well (although I think a lot of this is probably often BVH). buying a long-term build without v-cache is kinda stupid tbh, it's an absurd boost to worst-case CPU performance in bullshit situations. you know, hosed up game ports and poo poo like that.
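
A toy microbenchmark of what "pointer-chasing" means in practice and why a big L3 helps (my own illustrative sketch, nothing from any actual engine): each hop is a dependent load the prefetcher can't predict, and once the working set outgrows L3, every hop is a DRAM round trip. V-cache pushes that cliff out.

```cpp
// Toy pointer-chase: serial dependent loads through a random permutation.
// Once the array outgrows L3, each hop costs a full memory round trip; a
// 96MB v-cache part keeps much bigger working sets on-die.
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

int main() {
    const size_t n = 1 << 24;          // 16M entries * 8 B = 128 MiB
    std::vector<size_t> next(n);
    std::iota(next.begin(), next.end(), size_t{0});

    // Sattolo's algorithm: a single n-cycle, so the chase visits every slot.
    std::mt19937_64 rng{42};
    for (size_t k = n - 1; k > 0; --k) {
        std::uniform_int_distribution<size_t> pick(0, k - 1);
        std::swap(next[k], next[pick(rng)]);
    }

    auto t0 = std::chrono::steady_clock::now();
    size_t i = 0;
    for (size_t hop = 0; hop < n; ++hop) i = next[i];  // dependent loads
    auto t1 = std::chrono::steady_clock::now();

    double ns = std::chrono::duration<double, std::nano>(t1 - t0).count() / n;
    printf("%.1f ns/hop (end=%zu)\n", ns, i);  // print i so it isn't optimized away
}
```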

Paul MaudDib fucked around with this message at 06:45 on May 28, 2023

MarcusSA
Sep 23, 2007

Starfield is gonna be a banger.

Cygni
Nov 12, 2005

raring to post

MarcusSA posted:

Starfield is gonna be a banger.

That’s Right. Even if it’s Skyrim/Fallout 4 with some janky spaceships I will fuckin love that poo poo.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Starfield is going to be such a laggy, hitchy, stuttery piece of poo poo, I cannot wait to see it release and crash & burn to absolutely nobody's surprise.

There's gonna be eulogies for ms's gaming division everywhere

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
I know people are rightfully making GBS threads all over the 4060ti, but I'm building a Plex and Cloud Gaming (Parsec or Moonlight) server right now to put in a friend's place where they have fiber, and I'm considering trying to sell my spare RTX 3070 in order to side-grade.

- Seems like I can get around $350 for the 3070 still, and people are so disgusted by the 4060ti that it is readily available for purchase at $400. At least I'd be refreshing my warranty?

- The 40-series all have the dual AV1 encoders; for the Plex and Parsec, it seems like Plex already has AV1 support, and Parsec should get it eventually? The 30-50% less bandwidth figures sound great if we're going to be blasting out Plex streams left and right, especially if our friends connect over a hotspot or something.

- It looks like the performance is neck and neck with the 3070, but the 4060ti's TDP is 73% of the 3070, so our power bill would appreciate that

I dunno, do I actually have a use-case?

Zedsdeadbaby posted:

Starfield is going to be such a laggy, hitchy, stuttery piece of poo poo, I cannot wait to see it release and crash & burn to absolutely nobody's surprise.

I still maintain that the funniest thing to ever happen in this thread was that when the GTX 970 came out, there were pages of "Will this run Star Citizen?!" "I'm buying this to run Star Citizen!"

And now Todd Howard of all loving people is gonna tackle it

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Zero VGS posted:

I know people are rightfully making GBS threads all over the 4060ti, but I'm building a Plex and Cloud Gaming (Parsec or Moonlight) server right now to put in a friend's place where they have fiber, and I'm considering trying to sell my spare RTX 3070 in order to side-grade.

- Seems like I can get around $350 for the 3070 still, and people are so disgusted by the 4060ti that it is readily available for purchase at $400. At least I'd be refreshing my warranty?

- The 40-series all have the dual AV1 encoders; for the Plex and Parsec, it seems like Plex already has AV1 support, and Parsec should get it eventually? The 30-50% less bandwidth figures sound great if we're going to be blasting out Plex streams left and right, especially if our friends connect over a hotspot or something.

- It looks like the performance is neck and neck with the 3070, but the 4060ti's TDP is 73% of the 3070, so our power bill would appreciate that

I dunno, do I actually have a use-case?

So, you are colo'ing your moonlight server on residential fiber? what's your bw / latency to them under load?

... do you have any spare hardware you could experiment with before you commit to an upgrade?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Subjunctive posted:

I’m not sure I believe everything in them, but I love your effort posts, Paul.

I have a very very heterodox opinion of what NVIDIA does. They've pretty consistently delivered innovation and dominance that follows, and that's what matters.

Don't misunderstand, NVIDIA also has the best OpenCL drivers. You can write your own platform too. CUDA and the NVIDIA software ecosystem are the banners so bright. It's legitimately something that everyone involved is aware of; it's the cage with the golden bars. And there really isn't a replacement, it's way too high-value for anyone to trivially replace.

Muh realized client value, how horrifying.

Also includes things like gsync etc. Super duper satisfied XB270HU IPS owner - I ran it on a 780 Ti for a long time. How? Gsync is awesome. Then 980 Ti/1080/etc. I kept that guy a long time.

( "cost $150 more" ok I paid $270 for my 144 hz IPS that replaced my $300 144 hz tn. get more online and snipe some AcerRecertified dealz in 2016-2017, all the cool kids were doing it. "oh no $425 is way too much for a monitor that actually works but's gonna lock me into the good ecosystem for the next 5 loving years" you loving cheapasses.)

Simply buy more screen real estate in 2016 and you'd be screen-rich right now.

Subjunctive posted:

In 2013, I was working with AMD to get parallel rendering for VR working on pre-release Hawaii, and they basically had to put their entire driver team on it because they had such a lean software team (and the driver stack itself was reportedly a fragile mess). We’d get drivers that would device-reset on widescreen resolutions or have random visual artifacts, so it was always a fun guessing game when someone would drive to Santa Clara to get a new USB key with the latest internal driver dice-roll.

At that time we were absolutely not exploring building a cheap console to pair with a future Rift. AMD’s semicustom people were willing to do just about any mix of IP but I don’t know that we’d have gone that way because of how bad our experience had been with their software folks. NVIDIA was pretty much “take it or leave it”, but at least we were pretty sure what we took would work. Their devrel and driver folks were super responsive and helpful too, though they weren’t as interested in doing special stuff for VR.

I’m not sure 10-year-old memories are really predictive of today, but Paul’s post evoked them so here you go.

ah yes, vendor B

Paul MaudDib fucked around with this message at 10:27 on May 28, 2023

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Hell yeah brother, thank you for responding to my giant post with an equally giant post, that's living. I feel like I understand the situation wayyyyy better now; I approach this stuff from the mind of someone who is close to tech marketing and market deployment but didn't have the technical experience to really break it down, which you totally did. Cheers buddy!

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
I've seen Nvidia's pervasiveness in software all over, but I can't help but remember the crushing pain I always feel when I have to update my Linux kernel. It's always a game of luck whether dkms processes right and I have a display, or the system becomes so buggy that I can't even use it anymore. It's a game I get to play with my work machine and its Nvidia GPU every few months. Otherwise, I sometimes get to reboot just to regain one of my screens, since nothing else works.

CaptainSarcastic
Jul 6, 2013



Rocko Bonaparte posted:

I've seen Nvidia's pervasiveness in software all over, but I can't help but remember the crushing pain I always feel when I have to update my Linux kernel. It's always a game of luck whether dkms processes right and I have a display, or the system becomes so buggy that I can't even use it anymore. It's a game I get to play with my work machine and its Nvidia GPU every few months. Otherwise, I sometimes get to reboot just to regain one of my screens, since nothing else works.

This is anecdotal as hell, and specific to OpenSUSE, but it seems to me this has gotten a lot better in the last year or so. I used to try to catch whether I had a kernel update waiting, and delay it until I saw the Nvidia driver had an update, too. For a while now, though, I've gotten lazy because things have been running fine. The machine I'm typing this on is running Tumbleweed and a 2070 Super, and my main machine is running 15.4 and a 3080 12GB.

The one thing I have been too lazy to deal with is getting secure boot running with the proprietary Nvidia drivers installed. I know how to do it, mostly, but it is just fiddly enough I haven't bothered. Neither machine is running Windows 11 (both are dual-boots with Windows 10), so it's not like I've had a pressing need for it.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Rocko Bonaparte posted:

the crushing pain I always feel when I have to update my Linux kernel. It's always a game of luck

Oh? The two linux people I know have been assuring me that linux is nearly as easy to use as windows and more powerful, and that its day of mainstream dominance is right around the corner. Next year is the year of linux for sure. That's been the refrain for *checks notes* infinity years running.

Just kidding mostly, I know it's in Steam Deck, it's just hilarious that on one hand people are like "Linux right around the corner baby!" since 1924 then another person is like "I have PTSD from having to update my Linux. My therapist says I'm making great progress though, and I no longer wake up screaming about kernels"

The dichotomy is just funny.

shrike82
Jun 11, 2005

why are you updating your work machine's kernel every couple of months?
the only time i had a piece of hardware impact my (linux) work was an AMD CPU config issue which slowed down inter-GPU communications

Saukkis
May 16, 2003

Unless I'm on the inside curve pointing straight at oncoming traffic the high beams stay on and I laugh at your puny protest flashes.
I am Most Important Man. Most Important Man in the World.

shrike82 posted:

why are you updating your work machine's kernel every couple of months?
the only time i had a piece of hardware impact my (linux) work was an AMD CPU config issue which slowed down inter-GPU communications

Because the kernel will get a security update about once a month, and among all the other security updates it's just easier to update everything than try to do it piece by piece.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

CaptainSarcastic posted:

This is anecdotal as hell, and specific to OpenSUSE, but it seems to me this has gotten a lot better in the last year or so. I used to try to catch whether I had a kernel update waiting, and delay it until I saw the Nvidia driver had an update, too. For a while now, though, I've gotten lazy because things have been running fine. The machine I'm typing this on is running Tumbleweed and a 2070 Super, and my main machine is running 15.4 and a 3080 12GB.

The one thing I have been too lazy to deal with is getting secure boot running with the proprietary Nvidia drivers installed. I know how to do it, mostly, but it is just fiddly enough I haven't bothered. Neither machine is running Windows 11 (both are dual-boots with Windows 10), so it's not like I've had a pressing need for it.

NVIDIA got their kernelland into the kernel upstream a while ago, iirc?

Paul MaudDib fucked around with this message at 10:51 on May 28, 2023

Arivia
Mar 17, 2011

Taima posted:

Oh? The two linux people I know have been assuring me that linux is nearly as easy to use as windows and more powerful, and that its day of mainstream dominance is right around the corner. Next year is the year of linux for sure. That's been the refrain for *checks notes* infinity years running.

Just kidding mostly, I know it's in Steam Deck, it's just hilarious that on one hand people are like "Linux right around the corner baby!" since 1924 then another person is like "I have PTSD from having to update my Linux. My therapist says I'm making great progress though, and I no longer wake up screaming about kernels"

The dichotomy is just funny.

it's the year of linux on the desktop, not the year of linux on the portable handheld gaming pc

Mega Comrade
Apr 22, 2004

Listen buddy, we all got problems!

Taima posted:

Oh? The two linux people I know have been assuring me that linux is nearly as easy to use as windows and more powerful, and that its day of mainstream dominance is right around the corner. Next year is the year of linux for sure. That's been the refrain for *checks notes* infinity years running.


If you're running an AMD GPU they are right that it's pretty great these days.
Nvidia's Linux support has historically been awful, to the point of the creator of Linux repeatedly putting them on blast for it.

Paul MaudDib posted:

NVIDIA got their kernelland into the kernel upstream a while ago, iirc?

Yes, but it's not quite there yet. If you're running an Nvidia GPU you still want to spend the time wrestling with their proprietary drivers if you want decent performance. It's very promising though, and shows Nvidia have finally turned a corner on their Linux support. (Probably because of the Steam Deck.)

shrike82
Jun 11, 2005

WSL2 is a decent alternative these days
I got work to replace my MBP with a gaming laptop with a 4090 and being able to do deep learning coding with VS Code and also use applications like Zoom is neat

Mega Comrade
Apr 22, 2004

Listen buddy, we all got problems!
Lol Zoom has a Linux app.
When it comes to boring work-related apps, unless they are Microsoft owned they probably work on Linux.

It's the creative apps that are missing, Adobe etc.

Mega Comrade fucked around with this message at 11:29 on May 28, 2023

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

holy gently caress this is from nearly a decade ago, it could easily be yesterday

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Paul MaudDib posted:

So, you are colo'ing your moonlight server on residential fiber? what's your bw / latency to them under load?

... do you have any spare hardware you could experiment with before you commit to an upgrade?

It's 300 mbps symmetric and tends to be around 10ms ping. I did Moonlight over Fiber before and I know it kicks rear end most of the time, until you start going across states with it.

I dunno what you would mean about experimenting, besides confirming that the streaming works. I'm just saying, it seems like for a server that's going to be doing a ton of streaming, getting a mostly even trade from my 3070 to the 4060ti, even if I lose like 5% of the performance in some cases, is probably made up for by the higher efficiency and the possibilities enabled by AV1.
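
Rough math on the AV1 side (bitrates are hypothetical placeholders; the ~40% saving is just the midpoint of the 30-50% figure mentioned earlier):

```cpp
// Sketch: streams that fit in a 300 Mbps uplink at made-up per-stream bitrates.
#include <cstdio>

int main() {
    const double uplink_mbps = 300.0;
    const double h264_mbps   = 20.0;              // hypothetical 1080p60 game stream
    const double av1_mbps    = h264_mbps * 0.6;   // ~40% less bandwidth

    printf("H.264 streams: %d\n", (int)(uplink_mbps / h264_mbps)); // 15
    printf("AV1 streams:   %d\n", (int)(uplink_mbps / av1_mbps));  // 25
}
```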

Mega Comrade posted:

Lol Zoom has a Linux app.
When it comes to boring work-related apps, unless they are Microsoft owned they probably work on Linux.

It's the creative apps that are missing, Adobe etc.

Yeah I know Zoom has a Linux app, because one of my coworkers in IT was That Guy who insisted on being the only person in the entire company running Linux on his workstation. We had a few weeks in a row where we were missing the last 20 minutes of recordings for our company Zoom meetings, which employees in other timezones needed to view, and we were getting chewed out for having no explanation. Turns out we were the first in the world to reproduce a bug: if an admin who happens to be using Linux logs out of the meeting, it ends the recording even if the other admins who made the meeting remain.

Mega Comrade
Apr 22, 2004

Listen buddy, we all got problems!
Zoom is a piece of poo poo regardless of the platform you are running it on.

Truga
May 4, 2014
Lipstick Apathy
i've moved from a 980ti to a 6900xt during the corona and i've had exactly 1 crash/hang since, and i'm not even sure if that one was driver related or just beta steamvr bullshit, because everything else kept working once i logged out/in again


BurritoJustice
Oct 9, 2012

Zero VGS posted:

- The 40-series all have the dual AV1 encoders; for the Plex and Parsec, it seems like Plex already has AV1 support, and Parsec should get it eventually? The 30-50% less bandwidth figures sound great if we're going to be blasting out Plex streams left and right, especially if our friends connect over a hotspot or something.

The dual encoders are only on the 4070 and above, you get one on the 4060ti. It's still really fast, but not as crazy as higher end Ada.

The AD102 die actually has 3 encoders, 3 decoders, but they only enable 2:1 on the consumer cards.

Zedsdeadbaby posted:

holy gently caress this is from nearly a decade ago, it could easily be yesterday

I mean, the person paul is responding to is talking about their experience in 2013 as well. AMD did eventually rewrite their OpenGL driver, but that was literally only last year, after it was broken for decades. The new one is faster but busted in its own unique ways, in that a bunch of things that used to work don't anymore. If there is one area that AMD deserves a load of poo poo for, it's OpenGL.

BurritoJustice fucked around with this message at 12:38 on May 28, 2023
