|
The AI salesman's claim of "Now you won't have to do <bullshit task> anymore!" is technically true, but it usually omits the other part: "So the faceless evil megacorp you or others were formerly employed at to do <bullshit task> won't have to pay for workers at all anymore; you're fired for failing to do <bullshit task> as quickly as <bullshit task robot>, now go lie in a ditch and loving die, no severance or unemployment for you!" Both parts are coming, and I have very low expectations that enough people in high places will ask "If we use AI to replace all workers, who is going to be able to afford our product?" as they watch the number go up while slowly bleeding the rest of the world dry.
|
# ? Feb 17, 2024 17:39 |
|
|
Software developers unionizing would probably increase the overall power of labour within the US by a fair bit. They represent a lot of economic power. More on topic, unionized US workers at a future TSMC plant would have a lot of leverage, I think.
|
# ? Feb 17, 2024 18:07 |
|
PC LOAD LETTER posted:LOL no. Maybe in the US, but in Europe I'm already seeing regulatory frameworks getting started https://www.europarl.europa.eu/news...-trustworthy-ai and the limited feedback I can hear from public sector unions is that banning AI is a quick and easy deliverable that politicians will be deploying instead of structural changes next time push comes to shove.
|
# ? Feb 17, 2024 19:54 |
|
How much would an RTX 4070 Super be bogged down by a system with an Intel 8600K, 16GB of RAM, and a PCIe 3.0 slot? I currently run a GTX 1060. The plan is to replace the rest of the system, but the video card will be the single biggest line item. Am I better off replacing everything else first?
|
# ? Feb 17, 2024 20:38 |
|
Star Man posted:How much would an RTX 4070 Super be bogged down by a system with an Intel 8600K, 16GB of RAM, and a PCIe 3.0 slot? Depends on the resolution/refresh rate (and the game, to some extent). At 4K/60 it's totally fine; the GPU limit will hit way before the 8600K's CPU limit, so slap it in and don't worry. But with, say, a 1920x1080 @ 240 Hz display in a game that's around 3 years old? That CPU is going to hold a 4070 Super back by as much as half its potential performance.
|
# ? Feb 17, 2024 21:18 |
|
My primary display is 2560x1440, 144Hz capable, and has G-Sync. I have no aspirations for 4K for now.
|
# ? Feb 17, 2024 22:49 |
|
Then most likely the 8600K CPU will hobble your performance some in newer games. I ran a 3080 Ti on an Intel 9900k with a 2560x1440 @ 240 Hz display and saw some pretty significant CPU limits in some games, and a 4070 super is roughly equivalent to a 3080 Ti. I later upgraded to a 4090 founders edition and saw basically no performance uplift vs the 3080 Ti at all until I completed the rest of my upgrade to an AMD 7800X3D, which in some cases more than doubled the performance. 2560x1440 and high refresh basically demands an all around powerful system, much more so than 4k or 1080p which shift loads more towards one component or the other.
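A rough way to picture that bottleneck: the CPU and GPU each need some amount of time per frame, and your fps is capped by whichever is slower. A minimal sketch, with all millisecond numbers made up for illustration, not benchmarks:

```python
# Rough model: fps is limited by the slower of the CPU and GPU frame times.
# All numbers below are illustrative, not measurements of any real hardware.

def fps_limit(cpu_frame_ms: float, gpu_frame_ms: float) -> float:
    """Achievable fps given per-frame CPU and GPU cost (no pipelining)."""
    return 1000.0 / max(cpu_frame_ms, gpu_frame_ms)

# Hypothetical: an older CPU needs 8 ms/frame, the new GPU only 4 ms/frame.
# The GPU alone could do 250 fps, but the system tops out at 125.
print(fps_limit(8.0, 4.0))   # 125.0
# Raise the resolution so the GPU takes 10 ms/frame and it becomes the limit:
print(fps_limit(8.0, 10.0))  # 100.0
```

This is also why high-refresh 1440p is so demanding on the whole system: the frame-time budget shrinks for CPU and GPU alike.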
|
# ? Feb 17, 2024 23:25 |
|
Star Man posted:How much would an RTX 4070 Super be bogged down by a system with an Intel 8600K, 16GB of RAM, and a PCIe 3.0 slot? Going from an 8700K to a 12700K with a 3080 increased my average fps by 50%. You'd be wasting money with the ancient-CPU, new-GPU combo.
|
# ? Feb 17, 2024 23:30 |
|
Older CPUs mostly gimp your minimum framerates. People mostly think about the maximums (that's why they spend money on new hardware, after all), but don't forget about the other end of the scale too.
|
# ? Feb 17, 2024 23:35 |
|
Subjunctive posted:Software developers unionizing would probably increase the overall power of labour within the US by a fair bit. They represent a lot of economic power. The semi industry is going to be basically impossible for unions to crack.
|
# ? Feb 17, 2024 23:52 |
|
https://twitter.com/digitalfoundry/status/1758896878798713039
|
# ? Feb 18, 2024 03:26 |
|
NvTrueHDR is a really neat and superior alternative to AutoHDR, but I actually disagree with Alex when he talks about how good a fit it is for bloom-heavy classic HDR games like Lost Planet and HL2 (Lost Coast, Episodes 1 and 2). Those games render with an HDR pipeline and then tonemap that back to SDR, with bloom and camera exposure effects used to simulate the HDR-ness on an SDR screen. NvTrueHDR then tonemaps the already-tonemapped image back into HDR, which only makes the bloom effects look even brighter and more blown out. It's the HDR equivalent of shrinking down an image and then upscaling it again. A proper HDR game will render in HDR internally and then tonemap once, according to the capabilities of your display. That said, it does look really good in games that don't go heavy on the bloom, and it's a step up over Windows AutoHDR in quality. NvTrueHDR images tend to have more black depth and less banding. And it is neat how it's compatible with pretty much everything. Dr. Video Games 0031 fucked around with this message at 05:42 on Feb 18, 2024 |
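The double-tonemapping problem can be sketched with a toy Reinhard curve (this is not NvTrueHDR's actual algorithm, just illustrative math): once bright highlights have been compressed and quantized into 8-bit SDR, distinct HDR values collapse onto the same code, and no inverse tonemap can tell them apart afterwards.

```python
# Toy illustration of why tonemapping an already-tonemapped image is lossy.
# Simple Reinhard curve as a stand-in; NOT what NvTrueHDR actually does.

def reinhard(x: float) -> float:
    """Compress a linear HDR value into [0, 1)."""
    return x / (1.0 + x)

def inverse_reinhard(y: float) -> float:
    """Expand an SDR value back toward linear HDR."""
    return y / (1.0 - y)

def quantize_8bit(y: float) -> float:
    """Store the SDR value in an 8-bit channel, like the game's final output."""
    return round(y * 255) / 255

# Two distinct bright bloom highlights in the original HDR render:
a = inverse_reinhard(quantize_8bit(reinhard(20.0)))
b = inverse_reinhard(quantize_8bit(reinhard(21.0)))
print(a == b)  # True: both collapsed to the same 8-bit code, detail is gone
```

A true HDR game sidesteps this by tonemapping exactly once, straight from its internal HDR buffer to whatever the display can show.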
# ? Feb 18, 2024 05:39 |
|
Is there a way to set different FPS limits on separate displays? My main gaming monitor is a 165Hz 1440p panel, but I also game on my 120Hz 4K LG TV. In the NVIDIA Control Panel I seem to only be able to set a common 115 frame limit.
|
# ? Feb 18, 2024 11:44 |
|
Saukkis posted:Is there a way to set different FPS limits on separate displays? My main gaming monitor is a 165Hz 1440p panel, but I also game on my 120Hz 4K LG TV. In the NVIDIA Control Panel I seem to only be able to set a common 115 frame limit. Easiest way is to check how your games perform latency-wise with plain vsync (you should already have low latency mode enabled). A few games have horrible buffering with vsync enabled because the engine is poo poo and buffers a bazillion frames ahead when GPU- or vsync-limited (eg: Overwatch), and those absolutely require an FPS limit (meaning a CPU framerate cap) to maintain reasonable latency. However, most games do not, and you will barely notice the difference between a cap 4 FPS below the VRR limit and plain vsync at the VRR limit. So if your games perform well with plain vsync, use that; it will automatically match whatever refresh rate the display is capable of. For the games that do have a horrible vsync implementation, you can use a per-game profile to frame limit just that game.
|
# ? Feb 18, 2024 13:57 |
|
People like to preach about best practices a lot, and it's true that setting a cap is ideal. But Vsync is fine a lot of the time too, to be honest. You're not gonna notice if you get 10ms more lag or whatever from having vsync enabled in a single player game. Otherwise my answer would be to just use the game profiles instead of a global profile. I assume you're sticking to one monitor for each game depending on the type of game it is?
|
# ? Feb 18, 2024 14:07 |
|
SlowBloke posted:If I had to take a guess for the long term, it would be a heavily regulated market, since it has the potential to kill a lot of white collar jobs staffed by weak (40+) workers. That age bracket is a strong voting bloc, which will likely have governments step in to not lose votes. We have used Copilot since Jan 1st at my work. The ability to auto-generate meeting minutes and action items has been its most used and useful feature. It has led to exactly zero layoffs, because no one made meeting minutes and action items for relatively minor meetings anyway.
|
# ? Feb 18, 2024 14:29 |
|
Dr. Video Games 0031 posted:People like to preach about best practices a lot, and it's true that setting a cap is ideal. But Vsync is fine a lot of the time too, to be honest. You're not gonna notice if you get 10ms more lag or whatever from having vsync enabled in a single player game. *If you have VRR set up properly.* It's going to be a hell of a lot worse than 10ms otherwise.
|
# ? Feb 18, 2024 15:51 |
|
karthun posted:We have used copilot since Jan 1st at my work. The ability to autogenerated meeting minutes and action items has been it's most used and useful feature. It has led to exactly zero layoffs because no one made meeting minutes and action items for relatively minor meetings. ok
|
# ? Feb 18, 2024 16:04 |
|
Indiana_Krom posted:Then most likely the 8600K CPU will hobble your performance some in newer games. I ran a 3080 Ti on an Intel 9900k with a 2560x1440 @ 240 Hz display and saw some pretty significant CPU limits in some games, and a 4070 super is roughly equivalent to a 3080 Ti. Okay. Looks like I'll direct my attention to upgrading the base system first and get a new video card later.
|
# ? Feb 18, 2024 18:32 |
|
Another driver check-in with Intel GPUs: the A770 is right up against the 4060 in most games, even with RT enabled, but there are still big games like GTA5 where it's just completely broken. https://www.youtube.com/watch?v=w3WSqLEciEw
|
# ? Feb 18, 2024 20:23 |
|
Star Man posted:Okay. Looks like I'll direct my attention to upgrading the base system first and get a new video card later. I'm going to be the contrary voice and say that the uplift you'll get from even a bottlenecked 4070 compared to your current 1070 is pretty crazy, and you'll notice it right away if you're playing anything at all modern. Putting your current 1070 in a newer mobo/CPU/RAM setup would feel really, really bad. How tight are your finances? How long between upgrades are you looking at? Your whole setup is old enough that ideally you'd just upgrade everything at once at this point. That said, if you need to save up for a few months or even a year between GPU and CPU, you're going to have a much more capable system with your old CPU (and RAM etc) and a 4070 than you will with a new CPU and a 1070. Edit: poo poo, 4070 S and a 1060, got the cards wrong. gently caress, if you're on a 1060, everything I said above goes doubly.
|
# ? Feb 18, 2024 20:26 |
|
I agree. I think you'll see a bigger uplift with a new GPU than you will with a CPU, even if you're still CPU limited.
|
# ? Feb 18, 2024 21:54 |
|
Yeah, I think I'm on team Get the GPU. The 8600K will prevent the 4070S from getting its max frames at 1440p in some games, but it will still be a good step up across the board from a 1060. It doesn't look likely that there will be any new GPU launches in that performance/price category for a year+, and Nvidia doesn't have a lot of reason for deep price cuts on their side... so at most, I imagine the board vendors give smallish cuts out of their own margin, if there are any discounts at all. Meanwhile, there are expected to be big shakeups in CPU land (perhaps the biggest in a long time?) by the end of the year. So if you can only get one now, I would go GPU first.
|
# ? Feb 18, 2024 22:27 |
|
Yeah I’m on team GPU
|
# ? Feb 18, 2024 22:43 |
|
Can we get “Team GPU” shirts? Maybe with a silhouette of Paul, in AMD red, as the logo.
|
# ? Feb 18, 2024 22:58 |
|
Make the text green and the shirt material itself an Intel-blue that’s so dark it might as well be indistinguishable from black.
|
# ? Feb 18, 2024 23:18 |
|
njsykora posted:Another driver check-in with Intel GPUs, A770 is right up against the 4060 in most games even with RT enabled but there's still big games like GTA5 where it's just completely broken. Intel really lucked out with Starfield becoming so unpopular, huh? It's kind of insane that the game is still unplayable on Arc GPUs. But still, as long as cases like Starfield continue to exist, Intel GPUs are impossible to recommend. You just never know when the next big game will be literally unplayable for 5+ months.
|
# ? Feb 18, 2024 23:41 |
|
lol, settings like this from a PC-centric dev? Getting a native resolution required changing the quality setting to ultra. Runeterra, btw
|
# ? Feb 19, 2024 01:15 |
I'm looking to replace my slowly dying 2070 without spending a small fortune (trying not to spend more than 400USD). Are the 6700XTs I see on Amazon still my best option? CPU is a 9700K and I'm not looking to go above 1440p.
Bloody Pom fucked around with this message at 02:32 on Feb 19, 2024 |
|
# ? Feb 19, 2024 02:30 |
|
Rinkles posted:lol, settings like this from a PC-centric dev? For a game that's basically abandoned, was primarily designed for mobile, and runs mostly in a window on PC, yeah, it kinda makes sense.
|
# ? Feb 19, 2024 02:34 |
|
DLDSR is fantastic for letting you turn spare GPU horsepower into super-sampling, but boy does it require more work to figure out than just turning it on and selecting the new resolution in game. Sometimes the game can't see the new resolution / vsync breaks / Windows selects a random refresh rate / you get horrible frame pacing despite the game's performance being perfect. Setting the desktop to the DLDSR resolution before starting the game fixes all that and the game itself works perfectly, but a harder-to-notice problem still lurks. Literally every part of a PC wants to have a say in scaling the image being displayed on your screen, for some drat reason: Windows, your monitor, the game itself, and Nvidia's own driver. It was easily noticeable on the desktop, where the horizontal "text" lines on the icon of a notepad file would vanish in the DLDSR mode, blurred out of existence by... something. The "adjust desktop size and position" section of the Nvidia control panel lets you turn all that poo poo off: set scaling to GPU, override the scaling mode for games/programs, and finally select "no scaling". Voila, the promised super-sampled image, fresh from the tensor cores, without every element of your PC trying to ruin it.
|
# ? Feb 19, 2024 07:41 |
|
Has anybody got a good link to an undervolting guide for Ampere? I tried following the instructions in the Optimum Tech youtube vid, but during benchmarks I can still see the card hitting 330W and upwards in Afterburner. I did have this working ages ago when I first got the GPU, but I reset all the settings when I had to RMA the card back to Nvidia. Only doing this because when I'm playing Yakuza LAD the GPU is just constantly going ham walking around the city and the FE fan noise is a bit annoying. Card is a 3080 Ti.
|
# ? Feb 19, 2024 14:36 |
|
You just power limit them to 80% and call it a day; with older gens you'd undervolt, but now you just power limit, I'm pretty sure.
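If you'd rather do it from a terminal than through Afterburner, nvidia-smi can set a power cap directly. A sketch, assuming a reference 350 W 3080 Ti; check the min/max range your own board reports before picking a number:

```shell
# Show the current, default, and min/max enforceable power limits.
nvidia-smi -q -d POWER

# Cap the card at roughly 80% of a 350 W reference limit (0.8 * 350 = 280).
# Requires admin/root; the value must be inside the range reported above.
nvidia-smi -pl 280
```

Unlike a manual undervolt, a power limit doesn't touch the voltage/frequency curve; it just tells the boost algorithm where to stop, which is what makes it the low-effort option.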
|
# ? Feb 19, 2024 16:11 |
|
Bloody Pom posted:I'm looking to replace my slowly dying 2070 without spending a small fortune (trying not to spend more than 400USD). Are the 6700XTs I see on Amazon still my best option? CPU is a 9700K and I'm not looking to go above 1440p. Yeah, the 6700XT is it, at around $330. You can pay 1/3 more for a 7700XT and get 1/3 more fps, but with that old a CPU I'm guessing you're not aiming at the ultra-demanding games at the moment.
|
# ? Feb 19, 2024 16:44 |
|
The original 4070 ti is down to $700 new at Newegg. What’s the lowest price everyone would consider a good buy? $600?
|
# ? Feb 19, 2024 17:49 |
|
Isn't the 4070 Ti only a few % ahead of the 4070 Super now? Since you can get the stock 4070 Super at $600, I'd probably want to see that 4070 Ti go down to around $650 before calling it a good buy.
|
# ? Feb 19, 2024 18:08 |
|
KillHour posted:A slightly deeper (but still simplified - don't @ me) answer is that a model is composed of a bunch of matrix transforms called parameters. The input needs to be run through all these transforms to get the resulting output, so they need to fit in memory on the GPU. Model capabilities tend to correlate with the number of parameters, so the more you can fit in memory, the better. Modern large models can have billions of parameters. Cyrano4747 posted:My (very limited) understanding is that the more of their data set or model or whatever they can hold in memory, the better. Maybe I'll also ask an AI to explain it to me
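The "fit in memory" point reduces to simple arithmetic: bytes per parameter times parameter count, for the weights alone (activations, KV cache, and framework overhead all come on top). A quick back-of-the-envelope sketch:

```python
# Rough VRAM needed just to hold a model's weights. Ignores activations,
# KV cache, and framework overhead, which add more in practice.

def weights_gb(params_billions: float, bytes_per_param: int) -> float:
    """Memory in GB for the raw parameters at a given precision."""
    return params_billions * 1e9 * bytes_per_param / 1e9

print(weights_gb(7, 2))   # 14.0  -> a 7B model at fp16 needs ~14 GB
print(weights_gb(70, 2))  # 140.0 -> a 70B model at fp16 won't fit on one card
print(weights_gb(7, 1))   # 7.0   -> 8-bit quantization halves the footprint
```

Which is why quantization and multi-GPU sharding exist: they're the two levers for squeezing more parameters into the memory you have.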
|
# ? Feb 19, 2024 18:30 |
|
Would a 4060 run Baldurs Gate 3 reasonably well?
|
# ? Feb 19, 2024 18:49 |
|
Smashing Link posted:Would a 4060 run Baldurs Gate 3 reasonably well? It depends. Resolution, intended settings?
|
# ? Feb 19, 2024 19:00 |
|
|
Smashing Link posted:Would a 4060 run Baldurs Gate 3 reasonably well? Yeah I’ve got a friend happily playing it at 1440 on one right now. Looked great on my 2080 at 1440 too, and they’re broadly comparable.
|
# ? Feb 19, 2024 19:04 |