Cyrano4747
Sep 25, 2006

Yes, I know I'm old, get off my fucking lawn so I can yell at these clouds.

There’s a market for anything if you’re willing to let the price slide. I sold a 1070 a year or so back for ~$150

MarcusSA
Sep 23, 2007

Zotac Store has ZOTAC GAMING GeForce RTX 4090 Trinity 24GB Graphics Card (Open Box, ZT-D40900D-10P) on sale for $1,312.19 when you apply discount code ZTUDISCORD2024 during checkout

https://slickdeals.net/?adobeRef=08...t-d40900d-10p-o


I can't get the non-Slickdeals link to paste, but I guess it's on their site.

Nfcknblvbl
Jul 15, 2002

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.
Out of stock. Knew I should have just pulled the trigger when I saw it this morning.

Sorbus
Apr 1, 2010
Got a TUF 4080 Super for 1060 EUR as it was a customer return to a store. For the past week I have been fucking around with GPU and CPU air cooling curves (I had a 1080 Ti and a 9700K in a water loop, but dismantled it for the upgrade), trying to find out what was causing a "WHOOOOSHH" noise when starting games or benchmarks.

Turns out it's the new PSU I bought, an Asus TUF Gaming 1000W :doh:

I've got a 1000W Seasonic on the way now, and the Asus goes back to the store when it arrives.

shrike82
Jun 11, 2005

Rumored GPU spec for Switch 2

• 1536 CUDA Cores, 48 tensor cores, 12 RT cores
• Ampere architecture with features backported from Ada

https://twitter.com/Okami13_/status/1788657212023325109

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

shrike82 posted:

Rumored GPU spec for Switch 2

• 1536 CUDA Cores, 48 tensor cores, 12 RT cores
• Ampere architecture with features backported from Ada

https://twitter.com/Okami13_/status/1788657212023325109

MicroSD card makers are already licking their chops. More powerful hardware means more ports (I definitely assume BG3 is coming at some point; Larian did do DOS2 for the original Switch), and those types of games will need a ton of storage space

repiv
Aug 13, 2009

if games actually lean into using that ~2GB/sec internal storage then sd cards are going to be a problem; most microsd cards are an order of magnitude slower than that

sd express gets up to 1GB/sec or so but that's still barely available
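
to put that in perspective, some napkin math for a hypothetical 10GB chunk of game data (the size and speeds are just illustrative ballparks, not benchmarks of anything):

```python
# Load-time comparison for a hypothetical 10 GB read at different speeds.
# Figures are rough ballparks, not measurements.

asset_gb = 10
speeds_gb_per_s = {
    "internal storage (~2 GB/s)": 2.0,
    "sd express (~1 GB/s)": 1.0,
    "typical microsd (~0.15 GB/s)": 0.15,
}

for name, speed in speeds_gb_per_s.items():
    print(f"{name}: ~{asset_gb / speed:.0f} s to read {asset_gb} GB")
```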

Dr. Video Games 0031
Jul 17, 2004

shrike82 posted:

Rumored GPU spec for Switch 2

• 1536 CUDA Cores, 48 tensor cores, 12 RT cores
• Ampere architecture with features backported from Ada

https://twitter.com/Okami13_/status/1788657212023325109

These are all specs that were rumored already, but them showing up in a new place adds to the likelihood of them being real. The actual performance you can expect depends on too many unknown variables to reliably estimate though, so I'd take any estimates you see online with a large grain of salt.
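
Just to illustrate how much the unknowns matter: FP32 throughput scales directly with clock speed, and the clocks are one of the big question marks. A back-of-the-envelope with placeholder clocks (these are not leaked or confirmed figures):

```python
# Back-of-the-envelope FP32 throughput for 1536 CUDA cores.
# Clock speeds below are placeholders, not leaked or confirmed numbers.

cuda_cores = 1536
flops_per_core_per_clock = 2  # one FMA per clock counted as 2 FLOPs

for label, clock_ghz in [("placeholder handheld clock", 0.6),
                         ("placeholder docked clock", 1.0)]:
    tflops = cuda_cores * flops_per_core_per_clock * clock_ghz / 1000
    print(f"{label} ({clock_ghz} GHz): ~{tflops:.2f} TFLOPS FP32")
```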

edit: this tweet quotes an estimate that claims it'd be better than a base PS4 in handheld mode before DLSS. For comparison's sake, the Steam Deck can only do at 720p what the PS4 does at 1080p, and rendering at 1080p takes quite a bit more performance than 720p, so this would make the Switch 2 considerably faster than the Steam Deck. Considering Nintendo favors slim, light, and quiet devices, that would be pretty surprising to me. It would have to be manufactured on a pretty advanced process node with an advanced cooling system (vapor chamber, liquid metal, dual fans, etc.) for this to be true, which seems unlikely to me.

Dr. Video Games 0031 fucked around with this message at 00:02 on May 10, 2024

Twibbit
Mar 7, 2013

Is your refrigerator running?
While I also find that claim suspicious, it will be running a lightweight OS and have a pretty close-to-the-metal API like the Switch 1, which will let them squeeze more out of it. Comparisons between the few Switch games that were also on the Nvidia Shield of the same spec show what a difference that can make

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Twibbit posted:

While I also find that claim suspicious, it will be running a lightweight OS and have a pretty close-to-the-metal API like the Switch 1, which will let them squeeze more out of it. Comparisons between the few Switch games that were also on the Nvidia Shield of the same spec show what a difference that can make

is the PS4’s API very heavyweight?

DLSS is going to give that thing serious longevity. I hope Nintendo makes launch titles hit their framerate targets with it disabled.

Twibbit
Mar 7, 2013

Is your refrigerator running?
No, Sony's is lightweight too; I was discussing comparisons with the Steam Deck, which is running a form of Linux.

Nalin
Sep 29, 2007

Hair Elf
I hope the Switch 2 has DLSS frame-gen so in 8 years we can play games getting 30 fps with it enabled.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Are there benchmarks around of the various graphics pipelines and their overhead? Proton is emulating, but otoh sometimes it runs faster than under Windows, so I don’t trust my intuition here.

Like, Linux has io_uring and futexes and other things that are designed to be exceptionally low-overhead, so it’s not like they don’t care about that stuff.

Nalin
Sep 29, 2007

Hair Elf

Subjunctive posted:

Are there benchmarks around of the various graphics pipelines and their overhead? Proton is emulating, but otoh sometimes it runs faster than under Windows, so I don’t trust my intuition here.

Like, Linux has io_uring and futexes and other things that are designed to be exceptionally low-overhead, so it’s not like they don’t care about that stuff.

It's probably a combination of many things, like Linux requiring less overhead, a lower latency process scheduler, the fact that the process scheduler keeps threads on the same CCX for longer, and basically converting everything to Vulkan.

And I'm pretty sure Proton is not really emulating in a normal sense. It uses things like DXVK to convert DirectX to Vulkan. I would think most stuff is actually running Vulkan behind the scenes.

EDIT: DXVK to convert DX9-11 to Vulkan and VKD3D for DX12 to Vulkan

Nalin fucked around with this message at 01:33 on May 10, 2024

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Nalin posted:

And I'm pretty sure Proton is not really emulating in a normal sense. It uses things like DXVK to convert DirectX to Vulkan. I would think most stuff is actually running Vulkan behind the scenes.

I think “convert from unsupported form to supported form” is the most common form of emulation!

Truga
May 4, 2014
Lipstick Apathy
yeah, wine is the stupid backronym (Wine Is Not an Emulator), it's a compatibility layer rather than emulation

it's "just" a bunch of dlls that live in system32 or whatever, that talk to apps in win32 but then talk to the system in linux instead of nt. obviously it's way more complicated than that so there is some overhead, but that's the basic idea

for dxvk there is some more overhead because gpu shit is way more complicated than "normal" syscalls, but performance is generally within 15% even in worst cases unless your cpu is really bad

there are, however, certain outliers where running dxvk or dx12proxy gives a significant perf boost, and those games will obviously run better on proton; they also do on windows if you bother modding dxvk in
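
if you want a feel for what "a dll that speaks win32 on one side and linux on the other" means, here's a toy python sketch of the idea (purely illustrative, nothing like wine's actual code):

```python
import os

# Toy compatibility-layer illustration: an app written against a
# CreateFile-style API gets serviced by plain POSIX calls underneath.
# No CPU emulation anywhere, just the same request re-expressed in the
# host OS's terms.

GENERIC_READ = 0x80000000   # real win32 access-flag values
GENERIC_WRITE = 0x40000000

def create_file(path, access):
    """Stand-in for a win32-flavoured CreateFile: translate the request
    and hand it straight to the host kernel via os.open()."""
    if access & GENERIC_WRITE:
        flags = os.O_RDWR | os.O_CREAT
    else:
        flags = os.O_RDONLY
    return os.open(path, flags, 0o644)

# the "windows" code path, running unmodified on linux:
fd = create_file("save.dat", GENERIC_READ | GENERIC_WRITE)
os.write(fd, b"player progress")
os.close(fd)
```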

Subjunctive posted:

DLSS is going to give that thing serious longevity. I hope Nintendo makes launch titles hit their framerate targets with it disabled.
press x for doubt

maybe for 1st party games lol

repiv
Aug 13, 2009

the irony is that nintendo's first-party engines don't use TAA (or really any AA at all), so they're going to have to catch up real quick to take advantage of DLSS

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
https://twitter.com/VideoCardz/status/1788488460786893217?t=OaqCG1FkzyM5QMioFt5qsg&s=19

I haven't tried using it with a 30 fps base, but I was playing with this all of last night to go from 60 to 120 in Fallout 4, FO76, and Diablo 2 Resurrected, and I like the results. The artifacting has been significantly reduced (to my eyes) and it wasn't giving me a headache anymore.

Antifa Spacemarine
Jan 11, 2011

Tzeentch can suck it.
I've seen some games let you select Vulkan instead of DirectX. I always figured the overhead with Proton on those games must be very minimal, because the only thing it really has to do compat work for is a handful of Windows calls.

Dr. Video Games 0031
Jul 17, 2004

Subjunctive posted:

I think “convert from unsupported form to supported form” is the most common form of emulation!

Wrappers and conversion layers have traditionally not counted as emulation.

Subjunctive posted:

DLSS is going to give that thing serious longevity. I hope Nintendo makes launch titles hit their framerate targets with it disabled.

DLSS may not be the silver bullet a lot of people are hoping for. Many people seem to apply their experience of using it on PC and say that it will make games run a lot faster. Except console games already run at sub-native resolutions pretty much by default, and all DLSS adds to that picture is better image quality, not necessarily better frame rates. This will be especially true in games that are CPU limited.

The other issue is that the Switch 2 hardware may not actually be fast enough to make great use of DLSS. Digital Foundry did some testing on an Nvidia GPU with slightly better specs than the rumored Switch 2 specs, and found that DLSS had some pretty heavy overhead that made it impractical for targeting 60fps.

https://www.youtube.com/watch?v=czUipNJ_Qqs

I know it seems this way to us 300 - 450W GPU owners, but DLSS isn't free. It requires hardware to run it, and on very weak low-power hardware, it may be too heavy. There's some speculation that Nvidia may be designing a special low-overhead version of DLSS just for Nintendo, but this would come with worse image quality—would it really be much better than FSR2, which devs can utilize already?
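
To put rough numbers on "DLSS isn't free": at 60fps you only have about 16.7ms per frame, so even a modest fixed upscaling cost is a meaningful slice of the budget. A quick sketch with made-up costs (none of these are measured Switch 2 numbers):

```python
# Why a fixed per-frame upscaling cost matters more as the frame rate target
# goes up. All millisecond figures here are invented for illustration.

def effective_fps(render_ms, upscale_ms):
    """Frame rate once a fixed upscaling cost is added to the render time."""
    return 1000.0 / (render_ms + upscale_ms)

budget_60fps = 1000.0 / 60  # ~16.7 ms per frame
for upscale_ms in (0.5, 2.0, 4.0):  # hypothetical DLSS costs on weak hardware
    fps = effective_fps(render_ms=14.0, upscale_ms=upscale_ms)
    share = 100 * upscale_ms / budget_60fps
    print(f"{upscale_ms:.1f} ms upscale -> {fps:.1f} fps "
          f"({share:.0f}% of a 60 fps frame budget)")
```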

Dr. Video Games 0031 fucked around with this message at 08:33 on May 10, 2024

Weird Pumpkin
Oct 7, 2007

I thought the big theory was that DLSS would give better upscaling image quality, which means they could render at an even lower resolution and get the same picture?

Still though, it definitely sounds like it wouldn't work out all that great based on that test
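
The pixel math behind that theory, for what it's worth (the scale factors here are the usual quality/performance upscaling ratios, nothing Switch 2 specific):

```python
# Pixels actually rendered at common upscaler input resolutions for a 1080p
# output. Ratios shown are the typical quality (0.67x) and performance (0.5x)
# scale factors, used purely for illustration.

output_w, output_h = 1920, 1080
for name, (w, h) in [("quality (1280x720 internal)", (1280, 720)),
                     ("performance (960x540 internal)", (960, 540))]:
    ratio = (w * h) / (output_w * output_h)
    print(f"{name}: renders {ratio:.0%} of the output pixels")
```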

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

Lol I think I shorted my GPU last night trying to plug in an HDMI cable, because the system died and when I turned it back on, everything was super laggy and kept crashing. I ran DDU and did a fresh driver install, to no avail. A clean Windows install didn't help either. After messing around with it for an entire work day, I finally got the idea to replace my GPU with my old 3080, and everything's working perfectly again. This sucks ass

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
A shorted GPU running slow seems really weird. Usually it dies.

I'd guess it wasn't getting proper power. Try a different power cable with that GPU?

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

Lockback posted:

A shorted GPU running slow seems really weird. Usually it dies.

I'd guess it wasn't getting proper power. Try a different power cable with that GPU?

I'm getting it replaced, but my 3080 (which draws 100 more watts) runs perfectly fine without crashing now. That does make me wonder if it's just the power connector, though. Hopefully ASUS checks it out and doesn't just chuck it in the trash or something.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Lockback posted:

A shorted GPU running slow seems really weird. Usually it dies.

I'd guess it wasn't getting proper power. Try a different power cable with that GPU?

Could the short have damaged part of the power delivery?

Dr. Video Games 0031
Jul 17, 2004

I'm not sure how that could have led you to shorting anything if you were just futzing with the back panel. My guess is that you inadvertently jostled something loose somehow.

VectorSigma
Jan 20, 2004

Transform
and
Freak Out



I've had a machine trip off due to transient voltage in a long HDMI cable, probably from a ground potential difference. Thankfully it powered up again with no issues.

I can totally see how that might damage a component on the board or even the die itself just enough to hobble it without rendering it completely dead. Basically your chip got binned by misadventure.

Cygni
Nov 12, 2005

raring to post

That sounds like an extremely unlikely failure mode to me. Won’t say impossible cause computers can become haunted through various arcane and paranormal processes (fact), but very unlikely.

Sounds like the op already RMAed it tho so I guess we won’t ever know!

Saukkis
May 16, 2003

Unless I'm on the inside curve pointing straight at oncoming traffic the high beams stay on and I laugh at your puny protest flashes.
I am Most Important Man. Most Important Man in the World.
I burned out two HDMI ports on my old TV when hotplugging it to my PC. Since then I've made sure the PC or the TV is turned off when plugging it in.

PC LOAD LETTER
May 23, 2005
WTF?!
Yeah, this is rare but it can happen.

I've caused crashes plugging in HDMI cables maybe 2-3 times over the years. I only know that was the issue because I happened to be looking at the port and saw a small spark when plugging it in. It was the integrated GPU on the mobo or in the CPU, not a dGPU. Rebooting 'fixed' it and they ran fine afterwards, but minor static discharges (I'm assuming it's static, I don't know) can indeed cause issues with them.

Truga
May 4, 2014
Lipstick Apathy
what the fuck? what is hdmi even doing that this can happen??

i've probably plugged in a million vga/dvi/dp monitors over my computer janitor years and have never had that happen

FuturePastNow
May 19, 2014


If you get a little shock while plugging in the cable, it's because you make a ground loop if the computer and display are plugged into separate outlets

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

FuturePastNow posted:

If you get a little shock while plugging in the cable, it's because you make a ground loop if the computer and display are plugged into separate outlets

this reminds me of the big-brained EEs designing everything with a 2-pin AC plug, giving everyone an annoying 120V AC tingle whenever they touch an ungrounded metal surface, just to save $1 by not using a proper grounded 3-pin plug and cable

Indiana_Krom
Jun 18, 2007
Net Slacker

FuturePastNow posted:

If you get a little shock while plugging in the cable, it's because you make a ground loop if the computer and display are plugged into separate outlets

There are a couple of different ways that could happen:

• Separate circuits (especially if they are on opposite sides of the split phase).
• One or the other side isn't grounded properly.

Separate outlets alone is iffy: as long as they're on the same circuit and both are properly grounded, it shouldn't happen (because the grounds would already be bonded).

Best way to avoid it is to make sure your outlets are grounded and plug the PC and all its peripherals into a single power strip, thus guaranteeing the ground plane is bonded across all devices.

PC LOAD LETTER
May 23, 2005
WTF?!
Yeah, at least one of the times it happened that I can recall, the monitor was already plugged in and the HDMI cable was plugged into the monitor first before I went to attach it to the PC. I can't recall how everything was plugged into the power sockets, but it wouldn't surprise me if it was a ground loop issue from them being plugged in differently.

Jippa
Feb 13, 2009
I updated my gtx 1080's drivers for the first time in over a year and it is causing all sorts of problems.

Is there a way to view my driver update history? I specifically want to know which older ones I was using as they worked perfectly.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Jippa posted:

I updated my gtx 1080's drivers for the first time in over a year and it is causing all sorts of problems.

Is there a way to view my driver update history? I specifically want to know which older ones I was using as they worked perfectly.

Find the card in device manager and roll the driver back on the driver tab of the device properties

Jippa
Feb 13, 2009

HalloKitty posted:

Find the card in device manager and roll the driver back on the driver tab of the device properties

Ah, cheers.

MintFresh
Jun 24, 2020

I won a Starforge PC off a Twitch NFL draft contest. It's nowhere near as good as my current PC (I have a 7800X3D AM5 setup) except for the GPU.

I currently have a 2080 Ti that I've had since 2019. The games I do play haven't really required much more power than that.

The PC I won comes with a 4060 Ti, but it's the 8GB version. I looked up benchmarks and they trade blows depending on the game.

I play on a 1440p 170Hz monitor and I worry about the 8GB of VRAM being limiting. I am enticed by the better NVENC (AV1 encoding!), the lower power draw (~150 fewer watts), and the new DLSS features introduced with the 4000 series (frame generation, I think?).

Am I overthinking the VRAM part? I have a feeling that in newer games the 4060 Ti 8GB will end up better except for the VRAM. I plan to use the Starforge PC in the living room as an extra workstation, so it doesn't matter which video card it gets.

edit: Saw it was actually a 150-watt difference
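
If you're wondering what ~150 watts actually comes out to on the power bill, it's easy to ballpark (the hours and electricity price below are assumptions, plug in your own):

```python
# Rough yearly cost of an extra ~150 W of GPU draw while gaming.
# Hours per day and price per kWh are assumptions; adjust to taste.

extra_watts = 150
hours_per_day = 3
usd_per_kwh = 0.15

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"~{kwh_per_year:.0f} kWh/year, about ${kwh_per_year * usd_per_kwh:.0f}/year")
```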

MintFresh fucked around with this message at 04:30 on May 14, 2024
