Indiana_Krom
Jun 18, 2007
Net Slacker
The AI salesman's claim of "Now you won't have to do <bullshit task> anymore!" is technically true, but usually omits the other part: "So faceless evil megacorp you/others were formerly employed at to do <bullshit task> won't have to pay for workers at all anymore, you're fired for failing to do <bullshit task> as quickly as <bullshit task robot>, now go lay in a ditch and loving die, no severance or unemployment for you!".

Both parts are coming, and I have very low expectations that enough people in high places will realize "If we use AI to replace all workers, who is going to be able to afford our product?" as they watch number go up while slowly bleeding the rest of the world dry.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Software developers unionizing would probably increase the overall power of labour within the US by a fair bit. They represent a lot of economic power.

more on topic, unionized US workers at a future TSMC plant would have a lot of leverage, I think

SlowBloke
Aug 14, 2017

PC LOAD LETTER posted:

LOL no.

Unions and vast swaths of blue-collar workers constantly get screwed and vote to try to do something about it, and usually only in very limited instances are efforts made by the govt to protect them.

This isn't the 1930s or '40s anymore, where strong pro-labor sentiment was commonly found in the populace or govt.

The most that would likely be offered is job retraining programs. Similar stuff was attempted or talked about for coal miners, but even half-assed attempts at doing that went nowhere, or were outright rejected by the miners.

Honestly I'd rather a UBI be instituted than (soon to be) pointless drudgery-job 'protection' anyways. If AI can do the job, let it. The whole point of tech is to make people's lives better anyways. The real issue will be getting the powerful to share the wealth with everyone else.

As always of course.

Maybe in the US, but in Europe I'm already seeing regulatory frameworks getting started https://www.europarl.europa.eu/news...-trustworthy-ai and the limited feedback I hear from public sector unions is that banning AI is a quick and easy deliverable that politicians will deploy instead of structural changes next time push comes to shove.

Star Man
Jun 1, 2008

There's a star maaaaaan
Over the rainbow
How much would an RTX 4070 Super be bogged down by a system with an Intel 8600K, 16GB of RAM, and a PCIe 3.0 slot?

I currently run a GTX 1060. The plan is to replace the rest of the system, but the video card will be the single biggest line item. Am I better off replacing everything else first?

Indiana_Krom
Jun 18, 2007
Net Slacker

Star Man posted:

How much would an RTX 4070 Super be bogged down by a system with an Intel 8600K, 16GB of RAM, and a PCIe 3.0 slot?

I currently run a GTX 1060. The plan is to replace the rest of the system, but the video card will be the single biggest line item. Am I better off replacing everything else first?

Depends on the resolution/refresh rate (and game to some extent).

Like at 4K/60, totally fine: the GPU limit will hit way before the 8600K's CPU limit, so slap it in and don't worry. But say a 1920x1080 @ 240 Hz display in a game that's around three years old? That CPU is going to hold a 4070 Super back by as much as half its potential performance.

Star Man
Jun 1, 2008

There's a star maaaaaan
Over the rainbow
My primary display is 2560x1440, 144Hz capable, and has G-Sync. I have no aspirations for 4K for now.

Indiana_Krom
Jun 18, 2007
Net Slacker
Then most likely the 8600K will hobble your performance some in newer games. I ran a 3080 Ti on an Intel 9900K with a 2560x1440 @ 240 Hz display and saw some pretty significant CPU limits in some games, and a 4070 Super is roughly equivalent to a 3080 Ti.

I later upgraded to a 4090 Founders Edition and saw basically no performance uplift over the 3080 Ti at all until I completed the rest of my upgrade to an AMD 7800X3D, which in some cases more than doubled the performance.

2560x1440 at high refresh basically demands an all-around powerful system, much more so than 4K or 1080p, which shift the load more toward one component or the other.

Ihmemies
Oct 6, 2012

Star Man posted:

How much would an RTX 4070 Super be bogged down by a system with an Intel 8600K, 16GB of RAM, and a PCIe 3.0 slot?

I currently run a GTX 1060. The plan is to replace the rest of the system, but the video card will be the single biggest line item. Am I better off replacing everything else first?

Going from an 8700K to a 12700K with a 3080 increased my average fps by 50%. You're wasting money with the ancient-CPU + new-GPU combo.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Older CPUs mostly gimp your minimum framerates

People mostly think about the maximums; that's why they spend money on new hardware, after all. But don't forget about the other end of the scale too.

Shipon
Nov 7, 2005

Subjunctive posted:

Software developers unionizing would probably increase the overall power of labour within the US by a fair bit. They represent a lot of economic power.

more on topic, unionized US workers at a future TSMC plant would have a lot of leverage, I think

The semi industry is going to be basically impossible for unions to crack

shrike82
Jun 11, 2005

https://twitter.com/digitalfoundry/status/1758896878798713039

Dr. Video Games 0031
Jul 17, 2004


NvTrueHDR is a really neat and superior alternative to AutoHDR, but I actually disagree with Alex when he talks about how good a fit it is for bloom-heavy classic HDR games like Lost Planet and HL2 (Lost Coast, Episodes 1 and 2). Those games render with an HDR pipeline and then tonemap back to SDR, with bloom and camera exposure effects used to simulate the HDR-ness on an SDR screen. NvTrueHDR then tonemaps the already-tonemapped image back into HDR, which only makes the bloom effects look even brighter and more blown out. It's the HDR equivalent of shrinking down an image and then upscaling it again. A proper HDR game will render in HDR internally and then tonemap once according to the capabilities of your display.

That said, it does look really good in games that don't go heavy on the bloom, and it's a step up over Windows AutoHDR in quality. NvTrueHDR images tend to have more black depth and less banding. And it is neat how it's compatible with pretty much everything.
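
To make the double-tonemapping point concrete, here's a toy sketch. It uses a plain Reinhard curve as a stand-in for both the game's SDR tonemapper and the driver's HDR expansion; the real NvTrueHDR/AutoHDR curves aren't public, so this only shows the shape of the problem, not the actual math:

code:
# Toy illustration of the "tonemap the already-tonemapped image" issue.
# Assumes a simple Reinhard operator for the game's SDR pass and its inverse
# for the driver's HDR expansion; the real curves differ, but the effect is similar.

def reinhard(x):          # game: HDR scene value -> SDR in [0, 1)
    return x / (1.0 + x)

def inverse_reinhard(y):  # driver: SDR value -> reconstructed "HDR"
    return y / (1.0 - y)

scene = 4.0                               # a bright bloom highlight, in relative scene units
sdr = min(reinhard(scene) * 1.15, 0.999)  # game also boosts bloom/exposure for SDR punch
fake_hdr = inverse_reinhard(sdr)          # driver re-expands the already-tonemapped pixel

print(f"original scene value: {scene:.2f}")    # 4.00
print(f"SDR output          : {sdr:.3f}")      # 0.920
print(f"re-expanded 'HDR'   : {fake_hdr:.2f}") # ~11.50, far brighter than the source

A proper HDR pipeline would tonemap that 4.0 scene value once, straight to the display's capabilities, instead of re-expanding a pixel that already had bloom and exposure baked in.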

Dr. Video Games 0031 fucked around with this message at 05:42 on Feb 18, 2024

Saukkis
May 16, 2003

Unless I'm on the inside curve pointing straight at oncoming traffic the high beams stay on and I laugh at your puny protest flashes.
I am Most Important Man. Most Important Man in the World.
Is there a way to set different FPS limits on separate displays? My main gaming monitor is 165Hz 1440p, but I also game on my 120Hz 4K LG TV. In the NVIDIA Control Panel I seem to only be able to set a single 115 FPS limit.

Indiana_Krom
Jun 18, 2007
Net Slacker

Saukkis posted:

Is there a way to set different FPS limits on separate displays? My main gaming monitor is 165Hz 1440p, but I also game on my 120Hz 4K LG TV. In the NVIDIA Control Panel I seem to only be able to set a single 115 FPS limit.

Easiest way is to check how your games perform latency-wise with just plain vsync (you should already have low latency mode enabled). A few games have horrible buffering with vsync enabled because the engine is poo poo and buffers a bazillion frames ahead when GPU or vsync limited (eg: Overwatch), and those absolutely require an FPS limit (meaning a CPU framerate cap) to maintain reasonable latency. However, most games do not, and you will barely notice the difference between a cap set 4 FPS below the VRR limit and just plain vsync at the VRR limit. So if your games perform well with plain vsync, use that; it will automatically match whatever refresh rate the display is capable of.

In the case of games that do have a horrible vsync implementation, you can use a per-game profile to frame limit just that game.
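
To put numbers on that "cap a few FPS under the VRR ceiling" rule for your two displays specifically (illustrative only; the per-game profile is where you'd actually enter these):

code:
# Per-display FPS caps following the "4 FPS below the VRR limit" rule of thumb.
displays = {"165 Hz 1440p monitor": 165, "120 Hz 4K LG TV": 120}

for name, refresh_hz in displays.items():
    cap = refresh_hz - 4  # stay inside the VRR range so G-Sync keeps working
    print(f"{name}: set the per-game limit to ~{cap} FPS")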

Dr. Video Games 0031
Jul 17, 2004

People like to preach about best practices a lot, and it's true that setting a cap is ideal. But Vsync is fine a lot of the time too, to be honest. You're not gonna notice if you get 10ms more lag or whatever from having vsync enabled in a single player game.

Otherwise my answer would be to just use the game profiles instead of a global profile. I assume you're sticking to one monitor for each game depending on the type of game it is?
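
For rough context on that 10ms figure (back-of-the-envelope, assuming vsync adds on the order of one or two buffered frames of delay):

code:
# Approximate extra latency from vsync buffering at common refresh rates.
for refresh_hz in (60, 144, 165):
    frame_ms = 1000 / refresh_hz
    print(f"{refresh_hz} Hz: one frame = {frame_ms:.1f} ms, "
          f"so 1-2 buffered frames adds roughly {frame_ms:.0f}-{2 * frame_ms:.0f} ms")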

karthun
Nov 16, 2006

I forgot to post my food for USPOL Thanksgiving but that's okay too!

SlowBloke posted:

If I had to take a guess for the long term, it would be a heavily regulated market, since it has the potential to kill a lot of white collar jobs staffed by weak (40+) workers. That age bracket is a strong voting bloc, which will likely have governments step in to avoid losing votes.

We have used Copilot since Jan 1st at my work. The ability to autogenerate meeting minutes and action items has been its most used and useful feature. It has led to exactly zero layoffs, because no one was making meeting minutes and action items for relatively minor meetings before.

Arzachel
May 12, 2012

Dr. Video Games 0031 posted:

People like to preach about best practices a lot, and it's true that setting a cap is ideal. But Vsync is fine a lot of the time too, to be honest. You're not gonna notice if you get 10ms more lag or whatever from having vsync enabled in a single player game.

*If you have VRR set up properly*

It's going to be a hell of a lot worse than 10ms otherwise

ijyt
Apr 10, 2012

karthun posted:

We have used Copilot since Jan 1st at my work. The ability to autogenerate meeting minutes and action items has been its most used and useful feature. It has led to exactly zero layoffs, because no one was making meeting minutes and action items for relatively minor meetings before.

ok

Star Man
Jun 1, 2008

There's a star maaaaaan
Over the rainbow

Indiana_Krom posted:

Then most likely the 8600K will hobble your performance some in newer games. I ran a 3080 Ti on an Intel 9900K with a 2560x1440 @ 240 Hz display and saw some pretty significant CPU limits in some games, and a 4070 Super is roughly equivalent to a 3080 Ti.

I later upgraded to a 4090 Founders Edition and saw basically no performance uplift over the 3080 Ti at all until I completed the rest of my upgrade to an AMD 7800X3D, which in some cases more than doubled the performance.

2560x1440 at high refresh basically demands an all-around powerful system, much more so than 4K or 1080p, which shift the load more toward one component or the other.

Okay. Looks like I'll direct my attention to upgrading the base system first and get a new video card later.

njsykora
Jan 23, 2012

Robots confuse squirrels.


Another driver check-in with Intel GPUs: the A770 is right up against the 4060 in most games, even with RT enabled, but there are still big games like GTA5 where it's just completely broken.
https://www.youtube.com/watch?v=w3WSqLEciEw

Cyrano4747
Sep 25, 2006

Yes, I know I'm old, get off my fucking lawn so I can yell at these clouds.

Star Man posted:

Okay. Looks like I'll direct my attention to upgrading the base system first and get a new video card later.

I'm going to be the contrary voice and say that the uplift you'll get from even a bottlenecked 4070 compared to your current 1070 is pretty crazy, and you'll notice it right away if you're playing anything at all modern.

Putting your current 1070 in a newer mobo/cpu/ram setup would feel really, really bad.

How tight are your finances? How long between upgrades are you looking at? Because your whole setup is old enough that ideally you'd just upgrade everything at once at this point.

That said, if you need to save up for a few months or even a year between GPU and CPU, you're going to have a much more capable system with your old CPU(and ram etc) and a 4070 than you will with a new CPU and a 1070.

Edit: poo poo, 4070 S and a 1060, got the cards wrong. gently caress, if you're on a 1060 doubly so for everything I said above.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
I agree. I think you'll see a bigger uplift with a new GPU than you will with a CPU, even if you're still CPU limited.

Cygni
Nov 12, 2005

raring to post

Yeah, I think I'm on team Get the GPU. The 8600K will prevent the 4070S from getting its max frames at 1440p in some games, but it will still be a good step up across the board from a 1060.

It doesn't look likely that there are going to be any new GPU launches in that performance/price category for a year+, and Nvidia doesn't have a lot of reason for deep price cuts on their side... so at most, I imagine you get the board vendors giving smallish cuts out of their own margin, if there are any discounts at all. Meanwhile, there are expected to be big shakeups in CPU land (perhaps the biggest in a long time?) by the end of the year. So if you can only get one now, I would go GPU first.

MarcusSA
Sep 23, 2007

Yeah I’m on team GPU

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Can we get “Team GPU” shirts? Maybe with a silhouette of Paul, in AMD red, as the logo.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Make the text green and the shirt material itself an Intel-blue that’s so dark it might as well be indistinguishable from black.

Dr. Video Games 0031
Jul 17, 2004

njsykora posted:

Another driver check-in with Intel GPUs: the A770 is right up against the 4060 in most games, even with RT enabled, but there are still big games like GTA5 where it's just completely broken.
https://www.youtube.com/watch?v=w3WSqLEciEw

Intel really lucked out with Starfield becoming so unpopular, huh? It's kind of insane that the game is still unplayable on Arc GPUs. But still, as long as cases like Starfield continue to exist, Intel GPUs are impossible to recommend. You just never know when the next big game will be literally unplayable for 5+ months.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
lol settings like this from a pc centric dev?



getting a native resolution required changing the quality setting to ultra

runeterra, btw

Bloody Pom
Jun 5, 2011



I'm looking to replace my slowly dying 2070 without spending a small fortune (trying not to spend more than $400 USD). Are the 6700XTs I see on Amazon still my best option? CPU is a 9700K and I'm not looking to go above 1440p.

Bloody Pom fucked around with this message at 02:32 on Feb 19, 2024

njsykora
Jan 23, 2012

Robots confuse squirrels.


Rinkles posted:

lol settings like this from a pc centric dev?



getting a native resolution required changing the quality setting to ultra

runeterra, btw

For a basically abandoned game that was primarily designed for mobile and to run mostly in a window on PC, yeah, it kinda makes sense.

FallenGod
May 23, 2002

Unite, Afro Warriors!

DLDSR is fantastic for letting you turn spare GPU horsepower into super-sampling, but boy does it require more work to figure out than just turning it on and selecting the new resolution in game. Sometimes the game can't see the new resolution / vsync breaks / Windows selects a random refresh rate / you get horrible frame pacing despite the game's performance being perfect. Setting the desktop to the DLDSR resolution before starting the game fixes all that and the game itself works perfectly, but a harder-to-notice problem still lurks.

Literally every part of a PC wants to have a say in scaling the image being displayed on your screen, for some drat reason: Windows, your monitor, the game itself, and Nvidia's own driver. It was easily noticeable on the desktop, where the horizontal "text" lines on the icon of a notepad file would vanish in the DLDSR mode, blurred out of existence by... something. The "adjust desktop size and position" portion of the Nvidia control panel lets you turn all that poo poo off: set scaling to GPU, override the scaling mode for games/programs, and finally select "no scaling". Voila, the promised super-sampled image, fresh from the tensor cores, without every element of your PC trying to ruin it.

track day bro!
Feb 17, 2005

#essereFerrari
Grimey Drawer
Has anybody got a good link to an undervolting guide for Ampere? I tried following the instructions in the Optimum Tech YouTube vid, but during benchmarks I can still see the card hitting like 330W and upwards in Afterburner. I did have this working ages ago when I first got the GPU, but I reset all the settings when I had to RMA the card back to Nvidia.

Only doing this because when I'm playing Yakuza LAD the GPU is just constantly going ham walking around the city and the FE fan noise is a bit annoying. Card is a 3080 Ti.

kliras
Mar 27, 2021
You just power limit them to 80% and call it a day; with older gens you'd undervolt, but now you just power limit, I'm pretty sure.
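
If you'd rather script it than drag the Afterburner slider, something like this should also work (assuming nvidia-smi is on your PATH and you run it elevated; a 3080 Ti FE defaults to 350 W, so 80% works out to roughly 280 W, but check your own card's default first):

code:
import subprocess

# Inspect the card's current, default, and max power limits first.
subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], check=True)

# Then cap GPU 0 at ~80% of a 350 W default. Needs admin/root rights.
subprocess.run(["nvidia-smi", "-i", "0", "-pl", "280"], check=True)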

sauer kraut
Oct 2, 2004

Bloody Pom posted:

I'm looking to replace my slowly dying 2070 without spending a small fortune (trying not to spend more than $400 USD). Are the 6700XTs I see on Amazon still my best option? CPU is a 9700K and I'm not looking to go above 1440p.

Yeah, the 6700XT is it, at around $330.
You can pay 1/3 more for a 7700XT and get 1/3 more fps, but with that old a CPU I'm guessing you're not aiming at the ultra-demanding games at the moment.

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

The original 4070 ti is down to $700 new at Newegg. What’s the lowest price everyone would consider a good buy? $600?

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Isn't the 4070 Ti only a few % ahead of the 4070 Super now? Since you can get the stock 4070 Super at $600, I'd probably want to see that 4070 Ti go down to around $650 before calling it a good buy.

slidebite
Nov 6, 2005

Good egg
:colbert:

KillHour posted:

A slightly deeper (but still simplified - don't @ me) answer is that a model is composed of a bunch of matrix transforms called parameters. The input needs to be run through all these transforms to get the resulting output, so they need to fit in memory on the GPU. Model capabilities tend to correlate with the number of parameters, so the more you can fit in memory, the better. Modern large models can have billions of parameters.

Cyrano4747 posted:

My (very limited) understanding is that the more of their data set or model or whatever they can hold in memory, the better.
Thanks.

Maybe I'll also ask an AI to explain it to me :haw:
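
To put rough numbers on the quoted point about parameters needing to fit in GPU memory (assuming dense fp16 weights and ignoring activations, KV cache, and other overhead, so treat these as lower bounds):

code:
# Back-of-the-envelope VRAM needed just to hold a model's weights in fp16.
BYTES_PER_PARAM_FP16 = 2

for params_billion in (7, 13, 70):
    gib = params_billion * 1e9 * BYTES_PER_PARAM_FP16 / 2**30
    print(f"{params_billion}B parameters -> ~{gib:.0f} GiB of VRAM for the weights alone")

That's why the quoted post ties model capability to how much you can fit in memory, and why people quantize to 8-bit or 4-bit to squeeze bigger models onto consumer cards.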

Smashing Link
Jul 8, 2003

I'll keep chucking bombs at you til you fall off that ledge!
Grimey Drawer
Would a 4060 run Baldur's Gate 3 reasonably well?

SpaceDrake
Dec 22, 2006

I can't avoid filling a game with awful memes, even if I want to. It's in my bones...!

Smashing Link posted:

Would a 4060 run Baldur's Gate 3 reasonably well?

It depends. Resolution, intended settings?

Cyrano4747
Sep 25, 2006

Yes, I know I'm old, get off my fucking lawn so I can yell at these clouds.

Smashing Link posted:

Would a 4060 run Baldur's Gate 3 reasonably well?

Yeah I’ve got a friend happily playing it at 1440 on one right now.

Looked great on my 2080 at 1440 too, and they’re broadly comparable.
