|
necrobobsledder posted:The primary reason was to get him to stop being a scalper by threatening him to lose his clearance potentially if there was something legit illegal. So many people in the area have clearances to do their jobs here most folks are pretty drat straight laced outside leadership positions

It's not illegal to scalp, so there's nothing you can do about the dude besides ban him from the store, if you're store management. Threatening adverse action against a person's clearance, if they have one, without legitimate cause, however, is illegal.
|
# ¿ Mar 10, 2021 06:44 |
|
please do not cyberstalk dudes over gpus, it isn't worth it.
|
# ¿ Mar 10, 2021 23:36 |
|
MarcusSA posted:This is surprisingly still in stock

It's a laptop. Laptop cooling capabilities mean the CPU and GPU together have to fit inside about 75W. They allocate about 35W to the CPU and the rest to the GPU. A lot of things in the desktop space will beat that laptop performance-wise. There are one-off stupid builds by the likes of Sager and Clevo etc., but if you're looking at that MSI laptop you can't afford the ones I mentioned.

orange juche fucked around with this message at 07:42 on Mar 11, 2021
# ¿ Mar 11, 2021 07:38 |
|
MarcusSA posted:I’m specifically talking about other laptops.

3060s aren't quick. I think the desktop version got a review of "well, it's a GPU, we guess". The 3060Ti is plenty quick but unobtainable, much like loving everything in the GPU space. I could sell my loving 980Ti I bought 6? years ago for more than 50% of its sale price.

orange juche fucked around with this message at 07:59 on Mar 11, 2021
# ¿ Mar 11, 2021 07:43 |
|
my kinda ape posted:Just sold my GTX 1070 for $490 minus shipping within two hours of putting it up on eBay

I've got a 980Ti "Golden Dragon" edition which I could probably re-paste and then sell for about 400 bucks. Might look into that, since I've still got the box, antistatic bag, and all the goodies for the card.
|
# ¿ Mar 18, 2021 00:09 |
|
Mordekai posted:Am I a fool for going with Pop! OS in my new desktop built around the 3070? I really don't care much about RTX, maybe for trying it out in some games.

Linux isn't an excellent choice if you're gaming, but if you're set on being a "linux gamer", Pop!_OS is Ubuntu-based, with all of the positives and negatives that implies. Proton and DXVK have made linux gaming a lot more workable, but it's still very hit and miss. It's a very deep rabbit hole you can get into, with custom Proton builds for specific games and Lutris to manage your various installs outside of Steam.

orange juche fucked around with this message at 09:27 on Mar 18, 2021
# ¿ Mar 18, 2021 09:23 |
|
I mean it works, but you're going to be experiencing weird bugs. With Warzone for example:

AverytheFurry posted:Yeah it's still a bit poo poo

If you want to go on a quixotic quest to game on linux successfully, you're going to need to work on average twice as hard to get poo poo working vs someone on windows, and many games straight up won't work or will break in awful ways. I have recent experience with linux gaming and it's not worth it if you have an option. Good luck with it though.

orange juche fucked around with this message at 09:40 on Mar 18, 2021
# ¿ Mar 18, 2021 09:32 |
|
Paul MaudDib posted:With GPUs you absolutely do want to pre-spread it, there isn’t a heatspreader to “average” the hotspots so if you have spots that don’t get covered you can potentially damage the die.

For the record, for both that CPU and GPU there is no heat spreader, so he needs to apply an extremely thin layer across the entire die surface on both chips.
|
# ¿ Mar 19, 2021 02:18 |
|
njsykora posted:This is why so many youtubers started hiding how much paste they use.

Paste application methods are like assholes, everyone has one and they all stink.
|
# ¿ Mar 19, 2021 05:32 |
|
shrike82 posted:dumb question but what are the headless plugs for GPUs for? i have machines running headless without them doing GPU compute frequently so i'm curious if there's some use case locked behind needing to pretend a monitor is connected

Certain headless apps, like using GPU transcode with an Nvidia card on a Plex server, require a dummy plug in one of the video outputs in order to utilize the NVENC capabilities. I believe it's also used for GPU passthrough for virtual machines, because if there's nothing plugged into the GPU, the 3D acceleration and video decode functions are shut down/can't be utilized.
|
# ¿ Mar 23, 2021 01:35 |
|
punk rebel ecks posted:https://www.theguardian.com/business/2021/mar/21/global-shortage-in-computer-chips-reaches-crisis-point

Probably a safe bet. On the other hand, in 2023 when semiconductor prices collapse you'll be able to trade 3 3080s for a dominos pizza.
|
# ¿ Mar 23, 2021 07:47 |
|
Cygni posted:go to a potion seller with a big iron cauldron and get a antidote do I have to think of everything asus?

https://www.youtube.com/watch?v=R_FQU4KzN7A

you trying to get BIOS updates from asus
|
# ¿ Mar 31, 2021 20:30 |
|
Both Aveum and that UE5 hobbyist project look not super impressive. They're leaning too far into the hyper-real, which causes your eyes to pick out the inconsistencies, such as how the smoke from the chimney dissipates into nothing instead of being a steadily rising haze, the water in both projects looking like poorly defined poo poo, etc. Aveum looks way the gently caress worse than the other; EA is going to spend a shitpile of money for a complete dogshit game, since the gameplay is as inspired as the graphics, and we're all here to laugh at it.

VVV Pretty much. That bodycam game, even though it was likely "gameplay", did a better job of fooling me into believing it was realistic via sound design, actor/camera movement/distortion, and the faint artifacting you'd see on a lovely bodycam than either of the other 2. There were definitely some smeared textures and you can find lots of problems if you pick through it, but the artifacting covered for it in the moment.

orange juche fucked around with this message at 09:20 on Apr 22, 2023
# ¿ Apr 22, 2023 08:58 |
|
Dr. Video Games 0031 posted:I'm left somewhat confused by these configurations. Why is there no 8CU configuration? Why do they think that gaming handhelds have a use for 8 CPU cores? 6 cores/12CU, 4 cores/8CU, and 4 cores/4CU would make much more sense and would surely be cheaper to produce too. There's a reason why, when valve commissioned a custom APU, they gave it only 4 cores.

It likely will still be better, as the Steam Deck is quite limited on its TDP. I believe it's capped to 15W, where a lot of the handheld PCs are going up to 20W and on through 28W, though a 6800U really doesn't do much more after 20W anyway. The Steam Deck does have the benefit of a known hardware configuration that SteamOS can optimize for, which the other handheld PCs don't, and you can see it in the Steam Deck doing more with a 15W TDP than some other setups do all the way up at 28W.
|
# ¿ Apr 25, 2023 16:06 |
|
Branch Nvidian posted:Okay, so I'm just confused as poo poo and I've been trying a bunch of stuff on my own and haven't been able to figure out the exact source of the problem without going down several rabbit holes about silicon stepping mismatches and other poo poo. I've mentioned my system before, but for the sake of keeping everything in one post I'll list it again

When you swapped the 3070 Ti for the 7900 XTX, did you use DDU to clean every trace of Nvidia software from your PC? Nvidia leaves some poo poo in your registry and system files that is not purged by the Nvidia uninstaller. Swapping from Nvidia to AMD, or vice versa, can cause weird/sporadic issues with clocks and performance if you don't run DDU. https://www.guru3d.com/files-tags/download-ddu.html

We would all like to believe that uninstalling the drivers for something will purge it, but often enough it leaves little bits behind. Purge the drivers for both AMD GPUs and Nvidia GPUs from the system, and then do a clean install of the Radeon drivers.

E: I see you used DDU to purge the Adrenalin poo poo from your system, but I'm unsure if you'd already done that for Nvidia. If you didn't, try it. Failing that, OS refresh: back up your poo poo and then reinstall Windows, as it's very likely something left over from the move from Nvidia to AMD GPUs causing the issue. Did you have Nvidia Broadcast or anything related to NVENC or other Nvidia-specific software installed on the system?

orange juche fucked around with this message at 03:17 on Apr 27, 2023
# ¿ Apr 27, 2023 03:11 |
|
Branch Nvidian posted:I used DDU in safemode to remove all traces of Nvidia software. I also ended up actually doing a whole OS reinstall about two weeks ago anyway. Part of me wonders if I've just got some strange setting toggled somewhere buried under several menus that's causing this to happen.

On the OS reinstall, did you reinstall the Nvidia drivers/3070 Ti under the reinstall, or has it been living wholly on the 7900 XTX? Also, do you use multiple monitors?

E: There's also apparently a compositor thing Microsoft added a couple years back that can cause weird interactions with GPUs, called Multi-Plane Overlay, which allows Windows to continue displaying things that are not in focus in the background. https://github.com/RedDot-3ND7355/MPO-GPU-FIX Apparently it can gently caress up people on most AMD cards from the 6000 series up, and also Nvidia RTX 3000 series cards. I've never had the issue with my RTX 3080, but since the problem and the fix were literally discovered by Nvidia of all things, it is apparently a thing.

orange juche fucked around with this message at 03:34 on Apr 27, 2023
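For reference, the commonly circulated MPO disable fix (including the .reg Nvidia themselves distributed) is just a one-key registry change. I can't vouch for that exact repo's file, but the widely shared version boils down to roughly this; eyeball the file yourself before merging it:

```reg
Windows Registry Editor Version 5.00

; Disable Multi-Plane Overlay by forcing DWM's overlay test mode.
; To undo it, delete the OverlayTestMode value and reboot.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Dwm]
"OverlayTestMode"=dword:00000005
```

Reboot after merging it for DWM to pick up the change.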
# ¿ Apr 27, 2023 03:25 |
|
gradenko_2000 posted:Branch Nvidian is a pro-tier username, dang

Even more ironic since they don't have any Nvidia hardware in their PC now. They drank the red koolaid and that poo poo was poison.
|
# ¿ Apr 27, 2023 03:35 |
|
ijyt posted:£700 and dealing with ASUS made software, hmmm. Don't think I'll be regretting my 512GB Deck purchase.

Yup, just wait for one of the chinese companies to make one that doesn't have weird ASUS bloatware and is also £100 cheaper, with better RAM or storage.
|
# ¿ Apr 27, 2023 10:51 |
|
Dr. Video Games 0031 posted:Just hook up one of the daisy chained power connectors to the adapter. That's perfectly fine to do.

Daisy-chaining the power adapters is bad actually; you can overload the connectors and make the magic smoke come out, especially when you're dealing with a 4090. The 4090 draws 450W, which means you need 3x 150W connectors to guarantee you can drive it, and the 4th is for transients. Don't daisy chain your 4090. Each 8-pin connector is rated to pass 150W, and while the wires can handle more, even up to 300-400W, do you want that power going through a connector that is rated for 150W? Get a slight power hiccup and suddenly your GPU is smoking from the power pins.
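The arithmetic behind that is simple enough to sketch (the 150W figure is the PCIe CEM rating per 8-pin connector; 450W is the 4090's rated board power):

```python
import math

PCIE_8PIN_RATED_W = 150  # PCIe CEM rating per 8-pin PEG connector

def connectors_needed(board_power_w: float, headroom: int = 1) -> int:
    """Minimum 8-pin connectors to carry the rated board power,
    plus extra connector(s) of headroom for transient spikes."""
    return math.ceil(board_power_w / PCIE_8PIN_RATED_W) + headroom

# A 450 W RTX 4090: three connectors carry the rated draw, a fourth
# covers transients, which is exactly what the 4-way adapter provides.
print(connectors_needed(450))  # -> 4

# One daisy-chained cable feeding two of those inputs means a single
# 150 W-rated connector at the PSU end passing up to 300 W:
print(2 * PCIE_8PIN_RATED_W)  # -> 300
```

That 2x overload on one connector is where the melted pins come from.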
|
# ¿ May 1, 2023 23:58 |
|
RDNA3 is actually pretty good at RT; the problem is that FSR2, which is a core component of any raytracing solution at 4K, is basically non-existent in games. DLSS has much wider support, and while FSR2 might eventually wind up in more games, it's a long way from catching up to DLSS in how widespread it is.
|
# ¿ May 3, 2023 16:06 |
|
AzureSkys posted:Kind of a dumb question, but I need to use a few more monitors with my RTX 3090 (Which has 4 connectors and I need two more). I have an old GTX 660ti that fits OK to install on my motherboard and case.

Not really, unless you want to dabble in custom-edited drivers, and you don't want to do that; unsigned drivers are a pretty big safety risk. Get a newer 2nd GPU that fits in your case. There are single-card solutions that are new enough to be supported, I think, if card height is your concern.
|
# ¿ May 3, 2023 16:18 |
|
Yudo posted:I have the terrible feeling that fsr 3 isn't coming until the middle of rdna 3's life cycle. That isn't going to help in terms of developer adoption, though apparently fsr is easy to implement. Maybe if it runs on rdna 2 it will end up on consoles.

FSR3 does run on RDNA2; it's an open standard and AMD specifically stated it would run on older hardware. They'd be silly not to, as current consoles are RDNA2-based, and the lion's share of their graphics hardware goes into consoles.
|
# ¿ May 3, 2023 16:33 |
|
They made the game for console and were told at the last minute "btw you need to release for PC too" but since they made the game all hosed up on account of consoles being able to handle that, the game is just wrecked. Not that their dumb coding doesn't have an impact on consoles, just less of one.
|
# ¿ May 5, 2023 02:11 |
|
Subjunctive posted:This must mean that nobody has even gestured at the build with a profiler on PC, or else the supernova-bright problems are just baked in so deep they don’t know how to attack them.

They got their professional certs at the bottom of a cracker jack box, the entire team.
|
# ¿ May 5, 2023 02:53 |
|
Cross-Section posted:Yeah, I was planning on taking a break until more patches dropped but then I stumbled upon that DLSS3 mod

It didn't get rid of the stutter, it just hid it. The game is still hitching and locking up like complete garbage; your GPU is just masking it from you with a fake frame.
|
# ¿ May 5, 2023 02:58 |
|
gradenko_2000 posted:now I haven't programmed in over a decade, but from a process-oriented perspective, it seems to me like you would approach this problem from the end result backwards

Wiggly Wayne DDS posted:you've just described the FOX Engine as developed at Konami. the development costs for that were around ~$50m from memory, then it got abandoned as everyone gawked from the side at how it ran perfectly on any platform. it costs a lot to create variations of fast or accurate shaders, budget ai, interleave it all across threads, but the end result was insane. that was a ground-up redesign where other engines are bolting on dynamic resolution scaling and calling it a day

Basically the devs planned on AI upscaling via DLSS and AI false frame generation saving the day, and were supremely lazy in prioritization of tasks in the render queue. It's gone from "must have been a bug" to "wow, this is shocking incompetence and laziness, how in the gently caress did this pass QA", and yet EA will still make a fuckton of money because Star Wars is popular. I don't really know how you fix something that is broken at such a low level that your render pipeline has 900ms thread locks. Like, they developed the whole game wrong, as a joke, or something.
|
# ¿ May 5, 2023 04:35 |
|
power crystals posted:The 5800X3D is notoriously a nightmare to cool due to the stacked die topology (the cache die effectively acts as a thermal insulator). The stock cooler won't break it or anything, but getting a fancier one may appreciably improve its performance by letting it boost for longer.

One of the best things you can do with the 5800X3D is undervolt it a bit on vcore. I dropped my voltage down in the BIOS by nearly 0.1V, I think 0.08 or something, and it shaved off 10C as long as I'm not under all-core 100% utilization. The CPU doesn't get nearly as hot under gaming loads as it did before; I was sitting close to 80C while playing VRChat, and went down to 68C with no other changes. The amount of thermal load that the stacked cache puts the CPU under is pretty crazy: the CPU itself won't use more than about 70W under load, but it will drive right on up to 80C even under water cooling, because the cache die blocks heat from traveling from the cores to the IHS.

orange juche fucked around with this message at 00:34 on May 7, 2023
# ¿ May 7, 2023 00:27 |
|
Yudo posted:My undervolted 7900xt uses 80w of power idle (stock it is about 90w). That seems like alot. Aside from that, no driver problems!

Yeah, the MCD setup for the 7xxx series GPUs is an idle power hog. Under load it uses less than the equivalent Nvidia card, but the Nvidia card is less power hungry at idle. Not sure if the 7x series cards just don't clock down as well, or if there are inefficiencies that are somehow gobbling up ~50W of power.
|
# ¿ May 7, 2023 07:56 |
|
Zephro posted:Well, it doesn't until it does. If you start to bump up against the limits of the VRAM then it affects performance pretty dramatically. Forecasting the future is a mug's game but given that most new AAA games will be targeting the Series X and the PS5 and dropping support for older consoles, I'd be a little uncomfortable with 12GB if I was planning to hold onto the card for more than a couple of years. It might be fine, but even if it is it'll be a bit of tight squeeze.

Yeah, I mostly play VR poo poo nowadays, and things like VRChat especially are VRAM hogs, because nobody believes in optimized texture work. I have a 3080 12GB and I can watch my frame rate literally get cut in half as soon as I fill my VRAM and textures wind up cached in system RAM. Even worse, if you somehow wind up overflowing your VRAM significantly, which seems to be pretty much impossible outside of poo poo like VRChat, you can eat up all your system RAM with textures that didn't fit in VRAM, and you can crash your PC.
|
# ¿ May 10, 2023 04:26 |
|
Shipon posted:one time i heard someone bitching about 16gb not being enough because of modded skyrim

I've run across poo poo where people are using 8K textures for skinning models, individual 8K textures for each material, instead of realizing that 4K is quite probably fine, and that they could also atlas the textures to save space in VRAM. Some people build avatars which consume an entire GB of VRAM just for the textures on one model. It's what happens when you give people who know gently caress all about best practices for 3D modelling the ability to import whatever the gently caress they want into your game.
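For a sense of scale, the raw VRAM cost of an uncompressed RGBA8 texture is just width x height x 4 bytes (a full mip chain adds roughly a third on top). A quick sketch:

```python
def texture_vram_bytes(side: int, bytes_per_pixel: int = 4,
                       mipmaps: bool = True) -> int:
    """Approximate VRAM for a square uncompressed texture.
    A full mip chain adds roughly 1/3 over the base level."""
    base = side * side * bytes_per_pixel
    return base + base // 3 if mipmaps else base

MIB = 1024 * 1024
print(texture_vram_bytes(8192, mipmaps=False) // MIB)  # 8K RGBA8: 256 MiB
print(texture_vram_bytes(4096, mipmaps=False) // MIB)  # 4K RGBA8: 64 MiB

# Four separate 8K material textures on one avatar: a full gigabyte
print(4 * texture_vram_bytes(8192, mipmaps=False) // MIB)  # 1024 MiB
```

In practice games ship block-compressed textures (BCn/DXT, around 1 byte per pixel for BC7), which is exactly the kind of optimization these avatar authors skip.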
|
# ¿ May 10, 2023 05:19 |
|
Twibbit posted:Fsr without aa was in splat3 as well. I am not sure what is happening at Nintendo

Company bureaucracy, basically. "This is the way it's always been done" is often the reason for those kinds of decisions.
|
# ¿ May 11, 2023 15:50 |
|
Kazinsal posted:To me it's the fact that Nintendo has just been microwaving old hardware for the past two decades and going "yeah what about it" when challenged on it that miffs me because it forces them to make their games run in environments that are just utter crap in the name of cost and power efficiency.

The fact that the current new hotness, the Asus ROG Ally, can run TOTK at a higher framerate than the official hardware, with all of the overhead that emulation imposes, is pretty damning on the performance front for the Switch. Even a Switch 2 or Switch Pro or whatever is going to be hosed in terms of performance; current gen processors are just too far ahead of what Nintendo wants to use for their poo poo.
|
# ¿ May 15, 2023 08:40 |
|
I will say one thing: Nintendo hits you coming and going. They're the only console maker whose console is not a loss leader; they make money off of every console sold, from the instant you buy it. Sony and Microsoft have to dig out of a hole on price, as their consoles probably cost somewhere between 33 and 50% more to make than they sell for, and they have to make it up in game and peripheral sales before they profit.
|
# ¿ May 15, 2023 09:09 |
|
Starship Troopers without any of the spirit is just not going to be great imo
|
# ¿ Jun 18, 2023 11:45 |
|
Paul MaudDib posted:https://twitter.com/young_particle/status/1631257198327132164?t=4EhHsb0Sg0hFXK89j-FMNA&s=19

Empress is legit loving nuts, she's on Timecube-level brain, but you have to be loving nuts to crack poo poo like Denuvo.
|
# ¿ Jun 19, 2023 10:42 |
|
12GB VRAM is essentially dead at 1440p if developers keep on the path they are travelling, which they will do as the current gen consoles continue to evolve. Diablo 4, for example, will happily consume 12GB of VRAM on my RTX 3080 without blinking, and then spill over into system RAM, causing my whole system to slow down. Seriously, the game barely uses 25% of my GPU's power when the graphics are maxed out, but it inhales VRAM like nobody's business, which absolutely shits on the performance with stuttering.

E: Diablo 4 specifically uses up to 16GB of VRAM in order to match the console experience. Taking textures down to High resolves the VRAM issue on 12GB cards, but the textures are quite noticeably blurrier in the environment. https://www.youtube.com/watch?v=2Rl6sFoeOSU

orange juche fucked around with this message at 09:57 on Jun 27, 2023
# ¿ Jun 27, 2023 08:46 |
|
Dr. Video Games 0031 posted:I think there was probably a mistake made in the settings menu if he was only getting 10 fps with RT Ultra enabled while using DLSS. The 4060 is capable of better than that at native 1080p or 1440p.

They're using Path Tracing/RT Overdrive, which a 4060 is woefully inadequate for. Hell, on my RTX 3080, Path Tracing takes me down to ~40 FPS at 3440x1440 with DLSS Balanced. If they were using RT Ultra they'd probably be in the 40s to high 50s with frame gen.
|
# ¿ Oct 1, 2023 02:06 |
|
What is the percentage performance impact of using a PCIe 4.0 x16 GPU in a 3.0 x16 slot? Assuming you have a pretty decent gaming CPU like a 5800X3D.
|
# ¿ Oct 2, 2023 05:27 |
|
Dr. Video Games 0031 posted:In TechPowerUp's testing, they found a 2% difference on average with the 4090 at 1440p and 4K: https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-pci-express-scaling/

Sweet, I figured it was only a small percent drop. I'm running a 5800X3D on an X470 motherboard, and while I was prepared to buy an X570 board if needed to get 4.0 x16 slots, I was hoping I wouldn't have to Ship of Theseus my computer any more than I already have. The next upgrade after the GPU is going to be rebuilding the entire system on Zen 5 or whatever comes out after the 7000 series Ryzens. I upgrade part of my PC every few years, CPU/mobo/RAM/case then GPU, but since I upgraded the motherboard last time they've gone from 3.0 x16 to 5.0 x16.

E: I may still clean-slate it, because I've pushed the RAM expandability to its limits and I've got a sub-par storage solution going on right now, so it might be time to just start over with a brand new build, carry over my current GPU, and upgrade it 6 months down the road when new GPUs are out/they're trying to kill off stock of current gen GPUs before next gen releases.

orange juche fucked around with this message at 06:02 on Oct 2, 2023
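For the curious, the raw link numbers behind that 2%: x16 bandwidth roughly doubles per generation, with PCIe 3.0 at 8 GT/s per lane and 4.0 at 16 GT/s, both using 128b/130b encoding.

```python
def pcie_x16_gbs(gen: int) -> float:
    """Usable bandwidth in GB/s of an x16 link for PCIe gens 3-5.
    Gens 3+ all use 128b/130b encoding; the raw rate doubles each gen."""
    raw_gt_s = 8 * 2 ** (gen - 3)            # 8, 16, 32 GT/s per lane
    return raw_gt_s * (128 / 130) / 8 * 16   # bits -> bytes, x16 lanes

print(round(pcie_x16_gbs(3), 2))  # -> 15.75
print(round(pcie_x16_gbs(4), 2))  # -> 31.51
```

Half the bandwidth on paper, but a 4090 rarely saturates even the 3.0 x16 link outside of VRAM spill-over, hence the ~2% average.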
# ¿ Oct 2, 2023 05:59 |
|
https://www.youtube.com/watch?v=H1bj19C4ohA Double the frames, double the blur, which means 4x the images in your eyes! Definitely no one saw AMD's huge surprise because it's so loving blurry
|
# ¿ Oct 3, 2023 01:00 |