Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop

gradenko_2000 posted:

https://videocardz.com/newz/asus-dg1-graphics-card-only-compatible-with-two-motherboards

Asus is saying that its Intel DG1 GPUs are only going to work with its corporate/business-grade motherboards. At 80 EUs, it's not going to be much better than an Xe laptop iGPU (I've talked about a 96 EU model here), which is good for some light gaming, but the fact that Asus is placing board-level restrictions means it's going to be even harder, if not impossible, for this card to make a dent in consumer demand.

A passively cooled 1-slot video card probably has a decent niche market.

wolrah posted:

Haha, oops.

Reminds me of the time I learned the hard way that while ROM usually doesn't actually mean Read Only these days, OTP sure as gently caress means One Time.

That said, a dev board in particular should offer some kind of soft-secure option that allows for safe testing without the risk of bricking. Like allowing signing enforcement to be enabled either by a switch on the board or by some kind of eFuse or OTP flag, so developers can test freely, but those who want to just lazily use the dev board in production can permanently enable it on production units.

I'm surprised; the last OTP chip I dealt with had a few banks of parallel XOR eFuses. You pop 1s to 0s on bank 1 to program it, and if you gently caress it up you pop a few bits on bank 2 to flip the result back.

Useful, but since it's only really available in pre-programmed but mutually incompatible configurations, we ended up strapping it to default (ignore OTP) and using the bootloader to set some registers. Made buying so much easier: we could take any of the flashed SKUs when the correct one ran out.

On the flipside, bypassing the OTP voltage range lock was exceptionally fun and exciting.
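
A minimal sketch of the two-bank scheme described above, assuming a hypothetical 32-bit layout rather than any specific part: fuses read 1 until burned to 0, burning is one-way, and the chip reports bank 1 XOR bank 2, so a botched bit in bank 1 can be flipped back by burning the same position in bank 2.

```python
# Hypothetical two-bank XOR eFuse model (not any specific part).
# Each fuse reads 1 until burned to 0, and burning is one-way; the
# chip reports bank 1 XOR bank 2, so a botched bit in bank 1 can be
# cancelled by burning the same position in bank 2.

WIDTH = 32  # assumed fuse bank width

class XorEfuse:
    def __init__(self) -> None:
        self.banks = [(1 << WIDTH) - 1, (1 << WIDTH) - 1]  # all fuses intact

    def burn(self, bank: int, bit: int) -> None:
        """One-way: a fuse can go 1 -> 0, never back."""
        self.banks[bank] &= ~(1 << bit)

    @property
    def effective(self) -> int:
        return self.banks[0] ^ self.banks[1]

fuses = XorEfuse()
assert fuses.effective == 0       # unprogrammed: 1 XOR 1 = 0 everywhere
fuses.burn(0, 3)                  # program bit 3: 0 XOR 1 = 1
assert fuses.effective == 0b1000
fuses.burn(1, 3)                  # burned the wrong bit? pop bank 2: 0 XOR 0 = 0
assert fuses.effective == 0
```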

Harik fucked around with this message at 17:23 on Mar 22, 2021

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Harik posted:

A passively cooled 1-slot video card probably has a decent niche market.

You can probably find buyers for them too, just for the extra video outputs. Otherwise, a CPU with a half-decent integrated GPU is going to be a fine solution.

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop

HalloKitty posted:

You can probably find buyers for them too, just for the extra video outputs. Otherwise, a CPU with a half-decent integrated GPU is going to be a fine solution.

Yeah, PCIe x4 screams 'multihead'. Selling to video wall companies?

denereal visease
Nov 27, 2002

"Research your own experience. Absorb what is useful, reject what is useless, add what is essentially your own."

shrike82 posted:

does memory overclocking improve performance much? never looked into that

Ugly In The Morning posted:

I’ve got it in stress test now at +1100 and I’m not really seeing a difference on the FPS counter.

Ugly In The Morning posted:

So I know I just said that I didn’t see a difference in the stress test but that’s a synthetic benchmark so I decided to go and give it an actual gaming test with something I know is memory bottlenecked a lot, Jedi Fallen Order. Went from 50fps at 4k to hitting my monitor’s max of 60 with few to no dips in the intro segment.

Not an expert, but I think GPU memory overclocking usually provides boosts in Min and Avg FPS (not Max).
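
For anyone who wants to check this against their own logs, a rough sketch of how the min/avg/max and "1% low" figures fall out of a frame-time capture (the capture format and metric names here are assumptions; frame-time exports from benchmarking tools look roughly like this):

```python
# Rough frame-time analysis: min/avg/max FPS plus the "1% low" that
# benchmark overlays usually report. Frame times are in milliseconds.

def fps_stats(frame_times_ms: list[float]) -> dict[str, float]:
    fps = sorted(1000.0 / t for t in frame_times_ms)
    slowest = fps[: max(1, len(fps) // 100)]   # worst 1% of frames
    return {
        "min": fps[0],
        "1% low": sum(slowest) / len(slowest),
        "avg": sum(fps) / len(fps),
        "max": fps[-1],
    }

# e.g. a mostly-60 FPS run with a handful of memory-bound hitches:
log = [16.7] * 500 + [33.3] * 5
print(fps_stats(log))   # avg barely moves; min and the 1% low expose the dips
```

This is also why the max barely budges with a memory overclock: extra bandwidth mostly lifts the worst frames, not the best ones.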

RGX
Sep 23, 2004
Unstoppable
https://www.rockpapershotgun.com/resident-evil-village-pc-requirements

Thought this was interesting: Capcom lists a 3070 as the recommended requirement for 4K 60 FPS with ray tracing on in their new Resident Evil game. Apparently a 2070 can manage 45 FPS at the same settings.

Given how resource-intensive ray tracing currently is in other games, never mind at 4K, do we think this is because Capcom's usage is relatively light, or is ray tracing going to become more optimised in general? Is the resource-hogging aspect of the tech simply because developers have yet to refine their implementations, or are we still not there hardware-wise for it to be viable at 4K on all but the absolute top-end cards?

I wonder, now that developers have had a decent amount of time to play with the tech, whether we are going to see increasingly optimised versions of it that might give lower-end cards more of a chance at decent framerates with fancy lighting.

ufarn
May 30, 2009
oh man, we are not getting that DLSS are we

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
It's a last-gen console game with tacked-on RT. Capcom aren't the sort to bother with extra high-end RT effects for the PC version, especially for something as static as an RE game, where an immense amount of work has gone into hand-tuning the static lighting. Since it has to work on the weak console RT hardware, PC RT hardware will crush it. It's going to be this way for most games for the next 5 years, the same way that, last console generation, it was easy to push games the awful Jaguar CPUs bound to 30 FPS up to 100+ CPU FPS on nearly any PC.

repiv
Aug 13, 2009

The RT in the PS5 demo was pretty conservative: just low-resolution (1/8th-ish) reflections on the few very shiny surfaces and some extra diffuse GI detail in a small radius around the camera.

https://www.youtube.com/watch?v=bP4p_unwBPA&t=338s

It seems to work okay for that kind of game; as K8.0 said, the baked lighting holds up very well.

I wouldn't be surprised if they turn off the RT in the outdoor sections not seen in the demo though

repiv fucked around with this message at 17:52 on Mar 22, 2021

Sagebrush
Feb 26, 2012

RGX posted:

Given how resource-intensive ray tracing currently is in other games, never mind at 4K, do we think this is because Capcom's usage is relatively light, or is ray tracing going to become more optimised in general? Is the resource-hogging aspect of the tech simply because developers have yet to refine their implementations, or are we still not there hardware-wise for it to be viable at 4K on all but the absolute top-end cards?

I wonder, now that developers have had a decent amount of time to play with the tech, whether we are going to see increasingly optimised versions of it that might give lower-end cards more of a chance at decent framerates with fancy lighting.

There will continue to be optimizations, but people have been trying to do real-time raytracing for what, thirty years now? There are definitely SIGGRAPH papers about it from the 90s. The algorithms are pretty well worked out by now and there isn't much more you can do than throw cycles at it.

Of course there are tricks and simplifications, but most of those are already implemented. Things like blending the light transport across several frames, lowering the resolution of effects, etc. The biggest leap in making raytracing feasible is DLSS, which doesn't have anything to do with raytracing specifically -- it just lets you get away with a lower-resolution rendering. And while that technology may continue to improve, you can't just make something out of nothing. You can already see the cracks in DLSS when you set it to performance mode.

So no, I don't think raytracing is going to get significantly more efficient as time goes on.
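
To put rough numbers on that "lower-resolution rendering" point: DLSS 2 renders internally at a fraction of the output resolution and reconstructs up. A sketch of the pixel math, using the commonly cited per-axis scale factors for its modes (treat them as approximate):

```python
# Pixel-count math for DLSS 2 internal render resolutions at 4K output.
# Per-axis scale factors are the commonly cited ones (approximate).

MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

out_w, out_h = 3840, 2160
native = out_w * out_h

for mode, s in MODES.items():
    w, h = round(out_w * s), round(out_h * s)
    print(f"{mode:>17}: {w}x{h}  (~{w * h / native:.0%} of native pixels)")

# Performance mode shades ~25% of the pixels -- a ~4x cut in per-pixel
# ray and shading work, which is where the RT headroom comes from.
```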

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Harik posted:

Yeah, PCIe x4 screams 'multihead'. Selling to video wall companies?

I wonder if it will even support multi-head systems. It already requires custom BIOS support, and that doesn't scream "tons of flexibility in system configuration" either.

I read an interesting thought that maybe it's actually running DMI under the hood and not PCIe and that may be why it requires custom BIOS support...

Shinjobi
Jul 10, 2008


Gravy Boat 2k
I am hesitant to really explore this option, but given just how old my current rig is, I feel it's still worth looking into:


Are there any recommended pre-built options for a new PC? I know absolutely nothing about who's making pre-built stuff, aside from, like... I think I wanted Alienware back when it was 2005.

CoolCab
Apr 17, 2005

glem

Shinjobi posted:

I am hesitant to really explore this option, but given just how old my current rig is, I feel it's still worth looking into

absolutely nothing wrong with a prebuild, particularly given circumstances as they are, where it's the only way you'll get GPUs for ballpark RRP prices.

as for makes, i could give you some UK recommendations, but it tends to be pretty regional; it's a very cut-throat business at the low end particularly.

Ebola Dog
Apr 3, 2011

Dinosaurs are directly related to turtles!

Shinjobi posted:

I am hesitant to really explore this option, but given just how old my current rig is, I feel it's still worth looking into:


Are there any recommended pre-built options for a new PC? I know absolutely nothing about who's making pre-built stuff, aside from, like... I think I wanted Alienware back when it was 2005.

Best place to ask about this sort of thing is the PC building megathread.

Shinjobi
Jul 10, 2008


Gravy Boat 2k
Yeah my brain shorted out, completely forgot about that thread. I'll take my questions there!

8-bit Miniboss
May 24, 2005

CORPO COPS CAME FOR MY :filez:

ufarn posted:

oh man, we are not getting that DLSS are we

Nope. Pretty sure Capcom partnered with AMD on this one.

FuturePastNow
May 19, 2014


Harik posted:

Yeah, PCIe x4 screams 'multihead'. Selling to video wall companies?

Really low-end cards haven't used all 16 PCIe lanes for a while; the GT 1030 and its predecessors the 710/730 are x4 cards, and I think the RX 550 is either x4 or x8 electrically.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

CoolCab posted:

absolutely nothing wrong with a prebuild, particularly given circumstances as they are, where it's the only way you'll get GPUs for ballpark RRP prices.

under normal circumstances I kinda don't like to recommend prebuilds just because they tend to cut corners in places you don't expect. they'll use special one-off versions of motherboards or GPUs that are cost-reduced and have worse cooling/VRMs/etc and will never get proper bios updates, they will use non-standard mounting patterns or connectors that make it impossible to swap a component if they fail, etc. unless they are a brand like microcenter's ("powerspec" I think?) that is just putting together off-the-shelf parts, for a technical user I would recommend building your own even if it's "the same price" just because you know exactly what you are getting.

obviously doesn't apply to building a momputer where that makes you the de-facto tech support, or the current circumstances where DIY prices are just totally hosed, but if you know what you're doing then just take the extra 2 hours and put the parts together yourself.

Kunabomber
Oct 1, 2002


Pillbug
The RX 5500 XT is x8, and IIRC the 4 GB versions got hammered in certain cases when running on PCIe 3.0, whenever data had to be swapped in and out of VRAM.
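
For a sense of the bandwidth involved, a back-of-the-envelope sketch (per-lane rates are the standard published post-line-coding figures; real-world throughput is somewhat lower after protocol overhead):

```python
# Back-of-the-envelope PCIe link bandwidth per direction, ignoring
# protocol overhead. Per-lane rates are the standard published
# post-line-coding figures.

PER_LANE_GB_S = {2: 0.5, 3: 0.985, 4: 1.969}   # PCIe gen -> GB/s per lane

def link_bw(gen: int, lanes: int) -> float:
    return PER_LANE_GB_S[gen] * lanes

for gen, lanes in [(3, 4), (3, 8), (3, 16), (4, 8)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{link_bw(gen, lanes):4.1f} GB/s")

# An x8 card on a PCIe 3.0 slot gets ~7.9 GB/s -- plenty until a 4 GB
# card starts spilling textures over the bus, per the 5500 XT story.
```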

CaptainSarcastic
Jul 6, 2013



Paul MaudDib posted:

under normal circumstances I kinda don't like to recommend prebuilds just because they tend to cut corners in places you don't expect. they'll use special one-off versions of motherboards or GPUs that are cost-reduced and have worse cooling/VRMs/etc and will never get proper bios updates, they will use non-standard mounting patterns or connectors that make it impossible to swap a component if they fail, etc. unless they are a brand like microcenter's ("powerspec" I think?) that is just putting together off-the-shelf parts, for a technical user I would recommend building your own even if it's "the same price" just because you know exactly what you are getting.

obviously doesn't apply to building a momputer where that makes you the de-facto tech support, or the current circumstances where DIY prices are just totally hosed, but if you know what you're doing then just take the extra 2 hours and put the parts together yourself.

I don't know if this is still a thing that happens, but I remember years ago getting a used HP (or other brand) desktop with a dumbed-down BIOS. I researched it, found out it was just a rebadged MSI motherboard, flashed the non-proprietary BIOS onto it, and it was like a new machine.

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

CoolCab posted:

absolutely nothing wrong with a prebuild, particularly given circumstances as they are, where it's the only way you'll get GPUs for ballpark RRP prices.

For instance, here's an HP Omen with a 3070 for $1150, which is uh... the price of that GPU on its own

https://kinjadeals.theinventory.com/buy-a-whole-pc-with-an-rtx-3070-built-in-for-around-the-1846286448

Ugly In The Morning
Jul 1, 2010
Pillbug

change my name posted:

For instance, here's an HP Omen with a 3070 for $1150, which is uh... the price of that GPU on its own

https://kinjadeals.theinventory.com/buy-a-whole-pc-with-an-rtx-3070-built-in-for-around-the-1846286448

HP Omens are weirdly cheap; I got a 2070S/i7-9700 one last year for that price.

denereal visease posted:

Not an expert, but I think GPU memory overclocking usually provides boosts in Min and Avg FPS (not Max).

I deliberately chose a game that was known for running into memory issues; I doubt I could replicate those results with anything else. Maybe pre-patch HZD, but I think its memory fuckery was all just regular RAM stuff and nothing to do with the GPU.

Cygni
Nov 12, 2005

raring to post

You still need to be careful with prebuilts, imo. There are lots of caveats still out there with them. If they work for what you need, i don't think anyone should be shamed for buying one, though.

Just look out for stuff like:

  • single-DIMM, single-rank memory configs with bad speeds and worse timings
  • in-house motherboards with no BIOS options
  • Realtek LAN/WiFi
  • trash-tier DRAM-less SSDs and insane upsell pricing
  • low-wattage power supplies with custom or 12VO connectors
  • very bad, no-good cooling
  • GPUs that are BIOS-locked to the motherboard they ship with (thankfully becoming less common)

And the big one these days: if it uses an in-house motherboard (so not one specifically marketed as ASUS, ASRock, MSI, or Gigabyte) and a CPU without a K or X on the end of the name, you need to assume that you are going to be TDP-throttled heavily and your CPU performance will be nowhere near what you see on 3rd-party charts. This applies to anything Intel with 6 or more cores, and anything AMD with 8 or more cores, generally.

Prebuilds often strictly enforce the Intel/AMD TDP limits on those non-K or non-X SKUs to save a few pennies with a dinky cooling solution, and with a lot of the in-house BIOSes still being blue-screen affairs, good luck overriding it. It is a real problem. It happens in DIY builds too, but most boards will ignore the TDP limits if you turn XMP/DOCP on, and at least you have the ability to override it if you are the type of person who would know the difference.
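
For the Intel case specifically, the mechanism behind that throttling is the PL1/PL2 turbo budget: the chip may burst to PL2 while a moving average of power draw stays under PL1 (the TDP), so a strictly enforced 65 W PL1 caps sustained clocks regardless of what third-party charts show. A toy model of the idea (all numbers invented for illustration):

```python
# Toy model of Intel's PL1/PL2 turbo budget (numbers invented): the
# chip may draw up to PL2 while a moving average of power stays under
# PL1 (= TDP). Enforcing a low PL1 caps sustained, not burst, speed.

PL1, PL2, TAU = 65.0, 180.0, 28.0   # watts, watts, seconds

def granted_power(demand_w: float, seconds: int) -> list[float]:
    ewma, out = 0.0, []
    for _ in range(seconds):
        budget = PL2 if ewma < PL1 else PL1   # burst until the average fills
        p = min(demand_w, budget)
        ewma += (p - ewma) / TAU              # crude EWMA update
        out.append(p)
    return out

trace = granted_power(demand_w=180.0, seconds=300)
print(f"first 10 s: {sum(trace[:10]) / 10:.0f} W avg, "
      f"last 60 s: {sum(trace[-60:]) / 60:.0f} W avg")
# Bursts at 180 W, then settles near 65 W sustained. A board that just
# ignores PL1 (as many DIY boards do with XMP on) stays at 180 W.
```

That is why a TDP-enforced prebuilt can match review numbers in a short benchmark burst and still fall well behind in anything sustained.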

CaptainSarcastic
Jul 6, 2013



Cygni posted:

GPUs that are BIOS-locked to the motherboard they ship with (thankfully becoming less common)

Has that been a thing in recent memory (aside from the Intel thing from this past week)? I'm not trying to be a dick, but I'm honestly surprised if this was a thing going on that I somehow managed to be oblivious to.

shrike82
Jun 11, 2005

dumb question, but what are the headless dummy plugs for GPUs for? i have headless machines doing GPU compute frequently without them, so i'm curious if there's some use case locked behind needing to pretend a monitor is connected

orange juche
Mar 14, 2012



shrike82 posted:

dumb question, but what are the headless dummy plugs for GPUs for? i have headless machines doing GPU compute frequently without them, so i'm curious if there's some use case locked behind needing to pretend a monitor is connected

Certain headless apps, like GPU transcoding with an Nvidia card on a Plex server, require a dummy plug in one of the video outputs in order to use the NVENC capabilities. I believe it's also used for GPU passthrough to virtual machines, because if there's nothing plugged into the GPU, the 3D acceleration and video decode functions are shut down and can't be used.
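
One quick way to sanity-check NVENC from a headless box is to probe ffmpeg's encoder list; a sketch, assuming an ffmpeg build with NVENC support is installed and on PATH (listing encoders only proves support is compiled in, not that init will succeed):

```python
# Headless sanity check: ask ffmpeg which NVENC encoders it has.
# Assumes an ffmpeg build with NVENC support is installed and on PATH.
import subprocess

def nvenc_encoders() -> list[str]:
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-encoders"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Encoder lines look like " V....D h264_nvenc  NVIDIA NVENC ..."
    return [line.split()[1] for line in out.splitlines() if "nvenc" in line]

print(nvenc_encoders() or "no NVENC encoders found")
# Seeing e.g. ['h264_nvenc', 'hevc_nvenc'] means support is compiled in;
# an init failure during an actual encode is what the dummy plug fixes.
```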

Shear Modulus
Jun 9, 2010



I think sometimes the machine will shut off the GPU or put it in low-power mode if it thinks there's no display attached.

Geemer
Nov 4, 2010



Some motherboards refuse to boot without a monitor attached, probably for market segmentation reasons.

You occasionally see support for headless mode added in BIOS updates.

Cygni
Nov 12, 2005

raring to post

CaptainSarcastic posted:

Has that been a thing in recent memory (aside from the Intel thing from this past week)? I'm not trying to be a dick, but I'm honestly surprised if this was a thing going on that I somehow managed to be oblivious to.

Yeah, it was still common in some SFF office machine prebuilds as of the Maxwell-ish generation. They move as much as they can to the motherboard for cost/space, and that includes the goddamn VBIOS chip itself sometimes. Like I said, less common these days thankfully. Haven’t heard of it in a while, but I don’t look at tech support forums very often anymore.

CaptainSarcastic
Jul 6, 2013



Cygni posted:

Yeah, it was still common in some SFF office machine prebuilds as of the Maxwell-ish generation. They move as much as they can to the motherboard for cost/space, and that includes the goddamn VBIOS chip itself sometimes. Like I said, less common these days thankfully. Haven’t heard of it in a while, but I don’t look at tech support forums very often anymore.

Ah, okay, thanks - that makes sense.

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.
https://www.theguardian.com/business/2021/mar/21/global-shortage-in-computer-chips-reaches-crisis-point

I assume the 3080 will be impossible to find until 2023?

punk rebel ecks fucked around with this message at 07:48 on Mar 23, 2021

orange juche
Mar 14, 2012




Probably a safe bet. On the other hand, in 2023 when semiconductor prices collapse, you'll be able to trade three 3080s for a Domino's pizza.

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.
I'll probably just skip the 3080 altogether then and go with Nvidia's next GPU in line.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
what makes you think it will be any more available?

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.

Fauxtool posted:

what makes you think it will be any more available?

No shortage of chips.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
The Switch successor is apparently using hardware with DLSS support :vince:

https://www.bloomberg.com/news/articles/2021-03-23/nintendo-to-use-new-nvidia-graphics-chip-in-2021-switch-upgrade

The caveat being that it will only be for new games, not existing ones (which isn't surprising, considering that games on PS4 and Xbone needed patches to get pro/x support)

shrike82
Jun 11, 2005

lol can't wait to see the next xenoblade switch pro vaselined up from 720P to 4K

it dont matter
Aug 29, 2008

orange juche posted:

Certain headless apps, like GPU transcoding with an Nvidia card on a Plex server, require a dummy plug in one of the video outputs in order to use the NVENC capabilities. I believe it's also used for GPU passthrough to virtual machines, because if there's nothing plugged into the GPU, the 3D acceleration and video decode functions are shut down and can't be used.

Does this also apply if the GPU is plugged into a display, but the screen isn't on? I have a media server PC that's plugged into my TV, but I rarely use it through the TV directly: it's either streaming from Plex via a Shield, or I'm connected via remote desktop.

Sometimes when I reboot the server I can't see anything on the remote connection until I switch the TV on and swap input to the PC. Would using a dummy plug stop that behaviour? Because it's prevented me from getting on the server when I'm not at home to switch on the TV.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

punk rebel ecks posted:

No shortage of chips.

i wouldn't count on it. There was a shortage before several of the fabs got taken offline. Supply won't have gotten better by then, and demand will only have gone up.

MarcusSA
Sep 23, 2007

Zedsdeadbaby posted:

The Switch successor is apparently using hardware with DLSS support :vince:

https://www.bloomberg.com/news/articles/2021-03-23/nintendo-to-use-new-nvidia-graphics-chip-in-2021-switch-upgrade

The caveat being that it will only be for new games, not existing ones (which isn't surprising, considering that games on PS4 and Xbone needed patches to get pro/x support)

I’ll believe it when I see it. These switch rumors have been going since the drat thing launched.

njsykora
Jan 23, 2012

Robots confuse squirrels.


I would be shocked if the hardware upgrade is anything more than bumping it from a Tegra X1 to an X2.
