SwissArmyDruid
Feb 14, 2014

by sebmojo
but will they? I was not aware that Nvidia did things like push firmware updates to their Gsync modules.

Dr. Video Games 0031
Jul 17, 2004

SwissArmyDruid posted:

but will they? I was not aware that Nvidia did things like push firmware updates to their Gsync modules.

Yes. The Asus PG27AQN and Acer Predator XB273U F already have firmware updates available, and updates to the Asus PG248QP and AOC AGON AG276QSG have already been announced.

repiv
Aug 13, 2009

i'm not sure if that justifies them sticking with FPGAs for an entire decade, but it's something

SpartanIvy
May 18, 2007
Hair Elf

Dr. Video Games 0031 posted:

The obvious problem with this is that you will need a special case specifically designed for this. There are some of those already available, but it's a very small market for now.
You underestimate the power of a drill

Dr. Video Games 0031
Jul 17, 2004

https://www.youtube.com/watch?v=IlWT_TdOK6s

The supercut at the start here is really something else.

Branch Nvidian
Nov 29, 2012



Dr. Video Games 0031 posted:

https://www.youtube.com/watch?v=IlWT_TdOK6s

The supercut at the start here is really something else.

:shepface: what the gently caress is going on at Nvidia?

Arrath
Apr 14, 2011


SpartanIvy posted:

You underestimate the power of a drill

Yeah I was just thinking that the fit and finish and lack of knuckle-flaying burrs on modern cases had gone too far. Time to whip out the dremel and hack in some cutouts in the backplate for, *checks notes*, all my power cords.

shrike82
Jun 11, 2005

https://twitter.com/VideoCardz/status/1663178535689760771?s=20

hobbesmaster
Jan 28, 2008

Dr. Video Games 0031 posted:

https://www.youtube.com/watch?v=IlWT_TdOK6s

The supercut at the start here is really something else.

This reminds me of that disastrous Qualcomm keynote (10 years ago!)
https://www.youtube.com/watch?v=Vn8qxSUbOko&t=383s

Anime Schoolgirl
Nov 28, 2002

Branch Nvidian posted:

:shepface: what the gently caress is going on at Nvidia?
valuation from the AI bubble buys a hell of a lot of cocaine

ChaseSP
Mar 25, 2013



release power cables w rgb lights on them that only light up if properly plugged in fully, this is the only solution

UHD
Nov 11, 2006


i'm kind of surprised that isn't a thing already

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

by the way, I think people are also underselling the power implications of APUs in the long term. Console-style APU inherently uses less power than dGPU+separate ICs. But Apple Silicon-style memory stacking uses even less - GDDR PHYs are big and don't shrink and use tons of power, getting rid of off-chip links for your memory is super cool and direct-bonded memory (or cache) will be a huge savings. LPDDR stacking is super cool for highly efficient devices, I think that is where the leading-edge (but perhaps not cost-focused segments) of the market is going to go. It's not making a splash because Apple has zilch for gaming presence, but, imagine if you could get a little AMD APU with 3060 performance at 25W total power. Even if it's not a 3090 or whatever, 3060 performance at 25W is super cool. You can't even fit a worthwhile dGPU+VRAM in that power budget let alone a separate CPU too. How do you compete with that in the long term?

I think Grace is a lot more important than people realize with this. That LTT video was neat, NVIDIA is getting serious about trying to build Grace into an Ampere competitor or whatever. And maybe that's the long-term future of the low-end market as well. AMD and Intel are the only ones who can do x86, and that's a huge barrier. But, I think in the long term, if NVIDIA doesn't do a console to capture some of those "everything below $500 is slowing to a crawl probably forever" leavers from the dGPU market, they're bleeding marketshare/platform access. Switch/Switch NX isn't enough, they also need some presence in the console market and really ideally the living room market too.

And that's why Intel is racing for graphics too. The future is MI300 and Grace style large compute APUs for energy-conscious (and energy-limited) customers like HPC and it'll probably eventually fall through to the server market as well. Gotta get that power gain, off-chip PHYs for everything is too much, you need memory or at least cache right there on die on everything now.

--

Anyway re:the video Jensen is right about Moore's Law and the high end continuing to progress at higher prices (and less burdened by the fixed costs of the low-end market, warranties, and a lot of other things) doesn't mean he's wrong, that's not contradictory. Like again, sorry, but, the low end market is getting eaten up by APUs and that's just how the financials of it work. Nobody buys a $100 GPU anymore, and the gains were marginal before that. That's the cycle here, fixed costs on the board (and fixed die area from things like PHYs) overwhelm the gains from shrinking, the gains become marginal, people stop buying it, companies stop making it. 7850 used to be almost a $100 card and it was only 1GB, I paid $150 for my 2GB 6 months after it launched. What cards are still being released at $100 now, or even $150? RX 6400/6500XT have hit that after deep firesales, that's it, and people hate those, nobody is paying to upgrade their GTX 960 or GTX 1050 to a 6500XT or whatever. That segment is dead, and it's going to continue to progress its way up the stack because moore's law is dead.

GN doing another clickbait outrage video but again, just like with the "wow prebuilts suck haha right guys" series, you're not going to change the market realities of a $10b industry, $25 cheaper on the walmart shelf sells the PC and leaves the competition on the shelf, they're not going to not cut costs just because steve is upset. fix it, rebrand it and cheapen it back up in 6 months, done. XMP not being enabled was because XMP burns out CPUs eventually, that's not just an Asus or a DDR5 thing, it's a nontrivial bump in their warranty costs to have it on. In this case, NVIDIA might be willing to cut $50 or $100 off the 4070, that type of thing, they're not going back to $329 for that tier of product and neither is AMD, that's not financially sustainable. Same thing, they'll throw down a few sops like a Super series eventually to fix 4060 Ti/4070 Ti/4080 pricing (clearly out of line with even 4070/4060/4090) and maybe adjust 4060 down $25 or so and 4070 down $50 or so and call it good (and everyone will nod about "AMD just missing a market opportunity"/"choosing to profiteer with NVIDIA"). NVIDIA is not gonna get to where people are demanding because people's expectations are fundamentally out of line with production costs, and they won't buy them at NVIDIA/AMD's price, and that's fine because that's the process of the price segment dying. It stops being worth upgrading -> it stops being worth it to produce an upgrade -> products in that cost segment are discontinued. Just like my 7850 - it doesn't mean they're conspiring to just not do it, it's not viable to produce a mainstream-enthusiast card for $130-150 anymore. $150 buys you a 64b bus and two RAM chips, and puts it on a PCB for you. Buy a console.

Serious question: what do people think an 8500XT at $200 would even look like? What do you think the entry-level is going to be next time around? Like, the reason AMD is following NVIDIA is because the costs are getting increasingly tight and they can't make great margins at super low prices either. Do people imagine a 4-PHY with 16GB of memory, or a 6-PHY with 12GB, or what? And with the way logic vs PHY is shrinking, poo poo's gonna look like an iGPU scotch-taped onto a PHY bank. PCIe PHYs are eating more and more too, that's the reason they've been cutting lane counts. I don't like it on a performance card, but again, 4070 is safe, 4060 and 7600 are budget cards now.

And I'm sure PHY overhead is part of why AMD stayed at 6nm for 7600 too. If your PHY dominates your die area after you shrink... why not just use a larger node and have larger logic and just accept that in the big picture N6 only ends up 10% bigger than N5P for something 7600-sized or whatever? That's why everybody's doing x8 and x4 PCIe poo poo too, try and cut down some of that PHY.

That's where APUs and stacking come in. Direct-bonded stacked silicon needs much much smaller PHYs than off-chip links. That's a viable replacement for lower-end GPUs in performance terms - but not a cheap one. And the PC market just has not adopted APUs much at all. But Steam Console is where the tech wants to go in the low-end price segment, and it's probably going to speedrun Actual Consoles' evolution from dGPU to APUs. The viable solution already exists and it's the 4700S APU with the graphics turned back on. Or something functionally similar. And then start stacking.

But the diagnosis for the dGPU in the low/mid market is cancer and no it’s not the doctors making you sick. The cancer is eating its way up the stack, it has been for 10 years now (since 28nm), and now it’s starting to get to the places consumers care about in the market. Sucks but it’s not a conspiracy. APUs are just better at that price segment, that’s the tool that’s evolved to address the problems with the low-end dGPU market.

Paul MaudDib fucked around with this message at 04:47 on May 30, 2023

CBD Corndog
Jun 21, 2009



UHD posted:

i'm kind of surprised that isn't a thing already

you're a year late
https://www.youtube.com/watch?v=r76EcD9UMGI

UHD
Nov 11, 2006



ok there it is :pcgaming:

spunkshui
Oct 5, 2011



UHD posted:

i'm kind of surprised that isn't a thing already

It actually is a thing already.

Our 3080 TI STRIX has a red led for each power connector.

Anime Schoolgirl
Nov 28, 2002

the other scuttlebutt for the dGPU market is that sometimes it's not even necessary to buy a card made within the last 3 years if your standards for playability and graphics don't meaningfully increase (or you have an obsession with buying the shittiest hardware and running games at 10 fps like I do.) My main PC's 2070 Super is about 20% below a PS5 in graphical power, but I don't really care that I have to run render resolution at 70% for a game to go above 60fps on my 4K monitor, and I'm sticking with 60Hz mostly because I need the resolution more than the refresh rate for one of my hobbies.

This is probably tech idiots content, but there are several YouTubers that run modern games on GPUs you'd find used at $70-80, and they're still cranking 40-50 fps on medium settings: https://www.youtube.com/watch?v=CyaiP_NKmpg

Lareine
Jul 22, 2007

KIIIRRRYYYUUUUU CHAAAANNNNNN

Anime Schoolgirl posted:

the other scuttlebutt for the dGPU market is that sometimes it's not even necessary to buy a card made within the last 3 years if your standards for playability and graphics don't meaningfully increase (or you have an obsession with buying the shittiest hardware and running games at 10 fps like I do.) My main PC's 2070 Super is about 20% below a PS5 in graphical power, but I don't really care that I have to run render resolution at 70% for a game to go above 60fps on my 4K monitor, and I'm sticking with 60Hz mostly because I need the resolution more than the refresh rate for one of my hobbies.

This is probably tech idiots content, but there are several YouTubers that run modern games on GPUs you'd find used at $70-80, and they're still cranking 40-50 fps on medium settings: https://www.youtube.com/watch?v=CyaiP_NKmpg

I mean, they aren't wrong. I aim for 60 FPS at 1080p and my 1060 is still doing okay in that regard. The games I typically play, I'm able to pull that off with med-high settings. It's just with stuff like Cyberpunk or any of the numerous unoptimized PC ports that I begin to run into problems, but unlike some people, I'm able to live with a stable 30 without melting.

The only real reason I'm considering a 4060 is a paranoia that no card in the future is going to be able to fit in my energy budget without me going through the trouble of undervolting it.

BurritoJustice
Oct 9, 2012

Dr. Video Games 0031 posted:

Yes. The Asus PG27AQN and Acer Predator XB273U F already have firmware updates available, and updates to the Asus PG248QP and AOC AGON AG276QSG have already been announced.

The cooler part is how the firmware updates work: it's done over the DisplayPort cable purely by sending frames. First a trigger frame, then a sequence of images that are actually encodings of the firmware. Kinda wacky, you just sit there while the screen flashes at you like a disco while patching :pcgaming:
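For the curious, here's a minimal sketch of how a firmware-in-frames scheme like that could work. Everything here is invented for illustration (the frame size, the length+CRC header, the padding) — it's not NVIDIA's actual update format, just the general idea of packing a binary blob into fixed-size raw-pixel frames:

```python
import zlib

# Made-up frame geometry: each "frame" carries FRAME_W*FRAME_H RGB pixels,
# i.e. 3 payload bytes per pixel.
FRAME_W, FRAME_H = 64, 64
BYTES_PER_FRAME = FRAME_W * FRAME_H * 3

def encode_firmware(blob: bytes) -> list[bytes]:
    """Split a firmware blob into fixed-size 'frames' of raw RGB bytes."""
    # Prefix with length + CRC32 so the monitor's scaler could validate
    # the payload before committing it to flash.
    header = len(blob).to_bytes(4, "big") + zlib.crc32(blob).to_bytes(4, "big")
    payload = header + blob
    frames = []
    for off in range(0, len(payload), BYTES_PER_FRAME):
        chunk = payload[off:off + BYTES_PER_FRAME]
        chunk += b"\x00" * (BYTES_PER_FRAME - len(chunk))  # pad last frame
        frames.append(chunk)
    return frames

def decode_firmware(frames: list[bytes]) -> bytes:
    """Reassemble and validate the blob on the receiving ('monitor') side."""
    payload = b"".join(frames)
    size = int.from_bytes(payload[:4], "big")
    crc = int.from_bytes(payload[4:8], "big")
    blob = payload[8:8 + size]
    assert zlib.crc32(blob) == crc, "corrupted transfer"
    return blob
```

The nice property of doing it this way is that it needs zero extra wiring or side channels — any GPU that can display an image can deliver the update, which would also explain the disco-flash effect.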

Mega Comrade
Apr 22, 2004

Listen buddy, we all got problems!

Paul MaudDib posted:

To many words

Do you just label everything that's even slightly critical of Nvidia as clickbait?

SlowBloke
Aug 14, 2017
I sorta missed the edge connector power feasibility debate yesterday, but I wanted to point to this:

https://www.te.com/usa-en/products/connectors/pcb-connectors/card-edge-connectors/card-edge-power-connectors.html?tab=pgp-story

The chunkiest one can handle up to 3.4 kW, enough to saturate the average household circuit in Europe.
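Back-of-the-envelope check on that claim (assuming a standard European 230 V / 16 A circuit, which is typical but varies by country):

```python
# One 16 A breaker at 230 V vs. the top card-edge connector rating.
circuit_w = 230 * 16      # 3680 W available on a standard EU circuit
connector_w = 3400        # chunkiest connector's rating from the TE page
fraction = connector_w / circuit_w
print(f"{fraction:.0%} of a 16 A circuit")  # ~92%
```

So yeah — one connector at full rating would eat essentially the whole circuit, with nothing left over for the rest of the PC, the monitor, or the lights.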

Cantide
Jun 13, 2001
Pillbug
Nvidia making stuff like this is cool and all, but the enthusiast in me worries about the end of consumer hardware and the coming cloud computing hellscape :tinfoil:
https://www.youtube.com/watch?v=It9D08W8Z7o

SlowBloke
Aug 14, 2017

Cantide posted:

Nvidia making stuff like this is cool and all, but the enthusiast in me worries about the end of consumer hardware and the coming cloud computing hellscape :tinfoil:
https://www.youtube.com/watch?v=It9D08W8Z7o

My dream would be to go back to a sorta 3dfx era, where the local PC does 2D and video encode/decode and calls an on-demand 3D accelerator in a cloud datacenter. Even on FTTH, latency is too big for that scenario, and passive optical networks are not going to improve on that front (with the time-division multiplexing that is characteristic of xPON) anytime soon tho.
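The latency math on why this is hard is pretty unforgiving. A rough sketch (the datacenter distance and per-hop overheads are assumptions for illustration, not measurements):

```python
# Frame-time budget for remote 3D at 60 fps: the network round trip plus
# render time has to fit inside one frame interval.
frame_budget_ms = 1000 / 60          # ~16.7 ms per frame

fiber_km_per_ms = 200                # light in fiber travels ~200 km per ms
distance_km = 100                    # assumed distance to the datacenter
propagation_rtt_ms = 2 * distance_km / fiber_km_per_ms  # 1.0 ms round trip

# Propagation is the cheap part; xPON upstream TDMA grants, encode/decode,
# and queuing (values assumed here) are what eat the budget.
overhead_ms = 8.0
remaining_for_render_ms = frame_budget_ms - propagation_rtt_ms - overhead_ms
print(round(remaining_for_render_ms, 1))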

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I've watched HWUB's take on the NVidia keynote, but not GN's - though I assume that it might be along similar lines

my understanding of it is, NVidia is going heavy on AI, because they anticipate that it's the Hot New Thing, and they want to see if they can pivot from "we made big bucks selling silicon to crypto miners" to "we made big bucks selling silicon to AI prompt engineers [or whatever]"

now, I get it if this means that NVidia's future GPUs are of middling/bad value that they won't care if it doesn't sell well because what choice do gamers have and they'll make the bulk of their profits from the data center space anyway

but HWUB seemed to be saying that gamers would be "competing" with people who want AI computing power for GPUs, in the same way that gamers were "competing" with miners for GPUs...

... and that one I don't quite follow, because I thought that most AI work was done in the cloud, by servers?

like I posted that hot take that this AI stuff was going to reinvigorate the notion of having lots of horsepower on your desktop again, but apparently it was dumb because you wouldn't use your desktop for AI. And if that's true, then gamers wouldn't really be fighting for GPU stock, right? Like the quality/value might be bad but the data center space isn't exactly hitting up Microcenter for boxes of RTX

unimportantguy
Dec 25, 2012

Hey, Johnny, what's a "shitpost"?
I suspect the issue would be where NV allocates their silicon wafers. Why make RTX cards if you can make data center stuff and make more money?

SlowBloke
Aug 14, 2017

gradenko_2000 posted:

I've watched HWUB's take on the NVidia keynote, but not GN's - though I assume that it might be along similar lines

my understanding of it is, NVidia is going heavy on AI, because they anticipate that it's the Hot New Thing, and they want to see if they can pivot from "we made big bucks selling silicon to crypto miners" to "we made big bucks selling silicon to AI prompt engineers [or whatever]"

now, I get it if this means that NVidia's future GPUs are of middling/bad value that they won't care if it doesn't sell well because what choice do gamers have and they'll make the bulk of their profits from the data center space anyway

but HWUB seemed to be saying that gamers would be "competing" with people who want AI computing power for GPUs, in the same way that gamers were "competing" with miners for GPUs...

... and that one I don't quite follow, because I thought that most AI work was done in the cloud, by servers?

like I posted that hot take that this AI stuff was going to reinvigorate the notion of having lots of horsepower on your desktop again, but apparently it was dumb because you wouldn't use your desktop for AI. And if that's true, then gamers wouldn't really be fighting for GPU stock, right? Like the quality/value might be bad but the data center space isn't exactly hitting up Microcenter for boxes of RTX

AI researchers that don't have big bucks usually rent servers with GPUs in them. This is a short summary of the Azure choices: https://learn.microsoft.com/en-us/azure/virtual-machines/sizes-gpu.
So you are going to see the big buyers (Microsoft, Amazon, and Google) grabbing GPU silicon in bulk, then the big-spender onprem datacenter operators (like universities and other government entities), and then the consumer space.

A nearby university datacenter purchased NV Teslas by the pallet for god knows what amount when the big miner craze was in full effect, and got the cards give or take months after the order.

Mega Comrade
Apr 22, 2004

Listen buddy, we all got problems!

SlowBloke posted:

AI researchers that don't have big bucks usually rent servers with GPUs in them. This is a short summary of the Azure choices: https://learn.microsoft.com/en-us/azure/virtual-machines/sizes-gpu.
So you are going to see the big buyers (Microsoft, Amazon, and Google) grabbing GPU silicon in bulk, then the big-spender onprem datacenter operators (like universities and other government entities), and then the consumer space.

A nearby university datacenter purchased NV Teslas by the pallet for god knows what amount when the big miner craze was in full effect, and got the cards give or take months after the order.

To add to this, there are also hobbyists. Lots of people are currently playing with ML and LLMs on their home PCs using just high-end off-the-shelf Nvidia GPUs.
I don't think it will be like the crypto days, but those people will be competing for stock with gamers.

I think the AI hype train is going a little too hard, but there is little doubt Nvidia is going to make a huge amount of money over the 5 years it will take competitors to catch up and compete in that space.

CatHorse
Jan 5, 2008

SlowBloke posted:

My dream would be to go back to a sorta 3dfx era, where the local PC does 2D and video encode/decode and calls an on-demand 3D accelerator in a cloud datacenter. Even on FTTH, latency is too big for that scenario, and passive optical networks are not going to improve on that front (with the time-division multiplexing that is characteristic of xPON) anytime soon tho.

The latency on the highest tier of GeForce Now is lower than a local console and a TV.

SlowBloke
Aug 14, 2017

MikusR posted:

The latency on the highest tier of GeForce Now is lower than a local console and a TV.

GeForce Now does both roles (visualization and compute); my wish was for running the game onprem (so inputs and visualization) and then calling the datacenter only for the 3D calculations. That way you are not beholden to playing just the games GeForce Now has installed on their nodes.

kliras
Mar 27, 2021
yeah ai support isn't like the crypto fad; more like, i don't know, cuda. it's definitely useful, just as long as you don't think you're a future billionaire for buying a 4080

jensen's talking about ai as the new trend for morons, which is of course true, but ai actually serves meaningful purposes, so he's probably just tapdancing for prospective shareholders, especially as the stock just jumped 30%

kliras fucked around with this message at 10:51 on May 30, 2023

shrike82
Jun 11, 2005

their PE ratio is 200 - they have a solid business selling shovels for the gold rush but jensen's just playing up the biz to investors.
their technical guys are a lot more interested in deep learning than crypto or the metaverse, that's for sure

mobby_6kl
Aug 9, 2009

by Fluffdaddy

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
What is the best GPU showcase title right now?

I'm tempted to say Last of Us. I'm just blown away by how good that game looks. It's like playing a pre-rendered cutscene but in the best way.

The audio mix is also out of this world, we have two cats and they normally understand what's a speaker noise and what's a real life noise. You could play some obnoxious games and they wouldn't bat an eye.

The 7.1 Atmos mix in TLOU is the one game that regularly fucks with them. They get startled all the time by random noises. I genuinely don't know what separates the audio in this game from other ones, but it's so well done that you feel completely immersed in whatever is going on. Even just the simple feeling of being in a rain storm is so convincingly modeled. It's unreal.

Anyways, the reason I'm searching for the most awesome looking PC games is because I have a good friend coming through in a few weeks and I am trying to figure out a game or two we can play that would really make his day! TLOU is definitely on the menu. It really seems to me that the best looking titles are always Sony ports. Maybe that's hyperbolic but all of the games I've appreciated graphically recently are all Sony joints. They also usually support DLSS well which is huge.

njsykora
Jan 23, 2012

Robots confuse squirrels.


Cyberpunk with all the poo poo turned up. I've heard A Plague Tale Requiem is also a GPU melter. The recent Spiderman games I think also look awesome.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
yeah Cyberpunk with the Overdrive mode looks incredible

Guardians of the Galaxy has probably been overtaken since its release, but I still really like how vibrant that game was

SlowBloke
Aug 14, 2017

Taima posted:

What is the best GPU showcase title right now?

I'm tempted to say Last of Us. I'm just blown away by how good that game looks. It's like playing a pre-rendered cutscene but in the best way.

The audio mix is also out of this world, we have two cats and they normally understand what's a speaker noise and what's a real life noise. You could play some obnoxious games and they wouldn't bat an eye.

The 7.1 Atmos mix in TLOU is the one game that regularly fucks with them. They get startled all the time by random noises. I genuinely don't know what separates the audio in this game from other ones, but it's so well done that you feel completely immersed in whatever is going on. Even just the simple feeling of being in a rain storm is so convincingly modeled. It's unreal.

Anyways, the reason I'm searching for the most awesome looking PC games is because I have a good friend coming through in a few weeks and I am trying to figure out a game or two we can play that would really make his day! TLOU is definitely on the menu. It really seems to me that the best looking titles are always Sony ports. Maybe that's hyperbolic but all of the games I've appreciated graphically recently are all Sony joints. They also usually support DLSS well which is huge.

Crysis Remastered at full tilt for old times' sake?

Nfcknblvbl
Jul 15, 2002

Portal with RTX is pretty nice looking, and it's a preview of Remix, which hopefully should be coming out soon.

Annath
Jan 11, 2009

Batatouille is a great and funny play on words for a video game creature and I love silly words like these
Clever Betty
Aloha!

I'm upgrading from a 1080Ti to a 3080, and I want to make sure my PSU can handle it.

This is what I currently have:

CORSAIR HX850i, 850 Watt, 80+ Platinum

My CPU is a Core i5 6600k (it needs an upgrade next...)

Nfcknblvbl
Jul 15, 2002

I'm rocking an 850W PSU with my 4090, you're good.
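For anyone wanting to sanity-check that themselves, the arithmetic is simple. These are ballpark board-power figures (the 320 W is the 3080's reference TGP, 91 W is the 6600K's rated TDP; the platform figure is a rough allowance, not a spec):

```python
# Rough PSU headroom check for the upgrade in question.
parts_w = {
    "RTX 3080 (reference TGP)": 320,
    "Core i5-6600K (TDP)": 91,
    "motherboard/RAM/fans/drives (rough allowance)": 75,
}
psu_w = 850

total = sum(parts_w.values())
headroom = psu_w - total
print(total, headroom)  # 486 364
```

Roughly 360 W of headroom on an 850 W Platinum unit is comfortable even accounting for transient power spikes, which is why the answer above is a confident "you're good."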

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Mega Comrade posted:

Do you just Label everything that's even slightly critical of Nvidia as clickbait?

man that’s really your takeaway from all that?

lol at gamers getting visibly aroused by the techtube equivalent of “Biden impaired speech supercut” videos, Steve knows his audience is morons too.

It just turns out to be very profitable to play to morons. It’s a large base. And that goes for Steve too. Nobody ever went broke betting against the intelligence of the common man
