punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.

K8.0 posted:

It means it's made out of 4 smaller pieces put together instead of one monolithic one. Same sort of principle as Zen chiplets in Ryzen/Threadripper CPUs. It's going to happen with all GPUs eventually, at least at the mid-high end.

I see. Is this a good thing?

Kibner
Oct 21, 2008

Acguy Supremacy

punk rebel ecks posted:

I see. Is this a good thing?

It is less ideal than a monolithic die, but may be a necessary move in order to continue getting more powerful parts at a reasonable price.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Zedsdeadbaby posted:

Five to six years running games at high settings is my guess based on personal experience. My 1080ti was bought at launch in 2017 and it's still running games four years later on high settings at 4K with the occasional lowered resolution scale and individual taxing settings lowered a notch or two.

gradenko_2000 posted:

assuming DLSS keeps getting implemented into games, and FSR gets implemented in whatever doesn't get DLSS (given that FSR works on NVidia GPUs), and optimistically also looking forward to whatever XeSS has to offer both in game support and as a feature that can also be run on NVidia GPUs, I would wager that you'd be able to stretch out an Ampere card to at least half a decade just by progressively toning-down your expectations as the envelope gets pushed further and further

Thanks for the feedback! I used the 210 step up method to get the 3090 and got the 10-year warranty. I’ve been planning to sell it, but there’s a part of me that thinks that with the 10-year warranty, if I get several years out of it, maybe it isn’t so bad.

A friend had a 3090 and downgraded to a 3080ti since he could sell the 3090 for enough to cover it, so I’ve been debating just going with a 3080ti otherwise.

Instruction Manuel
May 15, 2007

Yes, it is what it looks like!

punk rebel ecks posted:

What does 4 graphics chiplets mean?

imagine 4 chiplets on the edge of a wafer...

Shifty Pony
Dec 28, 2004

Up ta somethin'


Kibner posted:

It is less ideal than a monolithic die, but may be a necessary move in order to continue getting more powerful parts at a reasonable price.

To expand a bit, using chiplets can drastically improve yield. If you are using a single hunk of silicon with four cores on it a defect in any of them effectively kills the whole thing, but with a chiplet design a defect in any single core only kills that core.
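
A back-of-the-envelope sketch of that yield argument, using the usual Poisson defect model with made-up numbers (nothing here is real process data):

code:

import math

# Toy comparison: one monolithic 4-core die vs. 4 separate chiplets.
# Defect density and die areas are assumptions for illustration only.
defect_density = 0.1                 # defects per cm^2
chiplet_area = 1.5                   # cm^2 per chiplet
monolithic_area = 4 * chiplet_area   # one big die holding all 4 cores

# Poisson model: P(die has zero defects) = exp(-D * A)
chiplet_yield = math.exp(-defect_density * chiplet_area)
monolithic_yield = math.exp(-defect_density * monolithic_area)

print(f"per-chiplet yield:  {chiplet_yield:.1%}")     # ~86%
print(f"monolithic yield:   {monolithic_yield:.1%}")  # ~55%
# A defect in a chiplet scraps ~1.5 cm^2 of silicon; a defect in the
# monolithic part scraps all 6 cm^2.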

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Really they should make square wafers. Such a waste.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Rinkles posted:

Really they should make square wafers. Such a waste.

Ah yes, the Wendy's Wafer

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Rinkles posted:

Really they should make square wafers. Such a waste.

Silicon is rotationally formed, so it's round by default. You'd have to cut them down to get them square. You may as well make them round and do what you can to fill in the edges - which I guarantee is a heavy focus with the really high-end processes. TSMC isn't run by morons, they aren't wasting anything they don't absolutely have to.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
It was a joke.

njsykora
Jan 23, 2012

Robots confuse squirrels.


They should make pentagonal wafers.

repiv
Aug 13, 2009

making the dies hexagons so they form a honeycomb lattice would waste less wafer area, but i guess they would have to CNC cut them instead of just running a cutting wheel over the whole wafer

Sphyre
Jun 14, 2001

I would simply form the silicon in a square manner.

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

I would simply produce 100 percent perfect wafers every time, ensuring none went to waste

NeverRelax
Jul 16, 2021

by Jeffrey of YOSPOS
3080 Ti FTW reservation on evga.com just popped. It's good for about 7 more hours.
I have my 210 step up to 3090 on the way so I really don't need it.

Anyone looking for an MSRP 3080 Ti?

Shifty Pony
Dec 28, 2004

Up ta somethin'


Typically the most important criterion in laying out a semiconductor wafer is maximizing use of the exposure field of the photolithography equipment. Since that field is exposed by scanning a thin rectangular slit across it, the dies end up rectangular.
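
For a sense of scale, a rough sketch: the ~26 mm x 33 mm field size below is a typical scanner figure used here as an assumption, and the die dimensions are made up.

code:

# Dies have to tile the scanner's rectangular exposure field.
# The field size (~26 mm x 33 mm) is a typical figure, treated as an
# assumption; the chiplet dimensions are hypothetical.
FIELD_W_MM, FIELD_H_MM = 26, 33

def dies_per_exposure(die_w_mm: int, die_h_mm: int) -> int:
    return (FIELD_W_MM // die_w_mm) * (FIELD_H_MM // die_h_mm)

print(dies_per_exposure(13, 16))  # 2 x 2 = 4 hypothetical chiplets per shot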

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

change my name posted:

I would simply produce 100 percent perfect wafers every time, ensuring none went to waste

Agreed

Personally I'd build them in a dick-shaped manner. The longer the shaft, the faster the chip. Hire me Intel

shrike82
Jun 11, 2005

https://twitter.com/greymon55/status/1435217663765057540?s=20

hope this is true

CoolCab
Apr 17, 2005

glem
it's a really good price/performance point. i know i know it's far from scientific (point me at a better one!) but it's the only 30 series on the steam hardware survey doodad for a reason. i love mine very much!

Enos Cabell
Nov 3, 2004



just wait, slide is probably about msrp

or shroud horniness levels

Craptacular!
Jul 9, 2001

Fuck the DH
While I wouldn't sneeze at a 3060ti/3070 if I could actually get one for $500/$700 respectively, I kinda just want either that Intel card or a 6700xt these days just to have the Linux support. In Intel's case I'm hoping the video encoder is good, too. Something like Pascal or better, even if it's not up to Turing, would still be better than AMD at this moment.

Hasturtium
May 19, 2020

And that year, for his birthday, he got six pink ping pong balls in a little pink backpack.

Craptacular! posted:

While I wouldn't sneeze at a 3060ti/3070 if I could actually get one for $500/$700 respectively, I kinda just want either that Intel card or a 6700xt these days just to have the Linux support. In Intel's case I'm hoping the video encoder is good, too. Something like Pascal or better, even if it's not up to Turing, would still be better than AMD at this moment.

QuickSync has been a feather in Intel’s cap for a while. I’d expect it to compare favorably.

Shipon
Nov 7, 2005
The lucky few that actually got a 3070 for $500 got one of the best deals the PC gaming scene has ever seen imo, beefy performance for the price for sure

K8.0 posted:

Silicon is rotationally formed, so it's round by default. You'd have to cut them down to get them square. You may as well make them round and do what you can to fill in the edges - which I guarantee is a heavy focus with the really high-end processes. TSMC isn't run by morons, they aren't wasting anything they don't absolutely have to.
i suppose you could lop the sides off the boule and feed them back into the melt to get a square wafer, but mechanical handling of those sharp edges would make the wafer much more fragile (also any spin-on processes would prob leave a nasty bump around the corners). the amount of waste on a 300mm wafer is fairly small for all but the largest dies anyway
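
To put rough numbers on that, here's the textbook gross-die-per-wafer approximation with assumed die sizes; it's an estimate, not anyone's actual production math.

code:

import math

# Gross dies per wafer ~= pi*r^2/A - pi*d/sqrt(2*A); the second term is
# the round-edge loss under discussion. Die areas below are assumptions.
def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    r = wafer_diameter_mm / 2
    ideal = math.pi * r**2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return ideal - edge_loss, edge_loss / ideal

for label, area in [("~100 mm^2 die", 100), ("~600 mm^2 big GPU die", 600)]:
    dies, lost = dies_per_wafer(300, area)
    print(f"{label}: ~{dies:.0f} dies, ~{lost:.0%} lost to the round edge")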

shrike82
Jun 11, 2005

i hope it's not a zero-sum game where Nvidia, AMD, and Intel are all competing for the same TSMC supply, and that between them they're able to supply more

next year is going to be brutal otherwise

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
IIRC Alchemist is on TSMC 6, so not the same 7 capacity used for RDNA2

Next gen might see overlap though

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.
Will Intel's GPU likely support freesync?

NeverRelax
Jul 16, 2021

by Jeffrey of YOSPOS

punk rebel ecks posted:

Will Intel's GPU likely support freesync?

10th gen and up CPUs with integrated graphics say they support intel adaptive sync which as far as i can tell is freesync. Seems likely the GPUs will too

shrike82
Jun 11, 2005

that’d be an incredible self own if they didn’t

NeverRelax
Jul 16, 2021

by Jeffrey of YOSPOS

shrike82 posted:

that’d be an incredible self own if they didn’t

from 2018 saying they will
https://twitter.com/IntelGraphics/status/1062855813797044224

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

Aren't Intel's graphics drivers still absolute trash, to the point of negatively impacting performance? At least in the videos I've seen comparing their iGPUs with similarly specced Ryzen APUs. So I would maybe wait for them to sort that stuff out first?

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

NeverRelax posted:

10th gen and up CPUs with integrated graphics say they support intel adaptive sync which as far as i can tell is freesync. Seems likely the GPUs will too

hadn't heard of that, that's cool. same goes for similar AMD cpus?

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.

NeverRelax posted:

10th gen and up CPUs with integrated graphics say they support intel adaptive sync which as far as i can tell is freesync. Seems likely the GPUs will too

I want to buy a 4K TV, but I'd be bummed if it supported FreeSync/G-Sync but not whatever Intel is doing for their graphics cards.

punk rebel ecks fucked around with this message at 05:20 on Sep 8, 2021

wargames
Mar 16, 2008

official yospos cat censor

Rinkles posted:

hadn't heard of that, that's cool. same goes for similar AMD cpus?

AMD iGPUs have a lot of great driver support because at heart they're just tiny AMD GPUs, and speaking as an AMD GPU user for the last four years, AMD's GPU drivers are actually quite good.

Craptacular!
Jul 9, 2001

Fuck the DH

change my name posted:

Aren't Intel's graphics drivers still absolute trash, to the point of negatively impacting performance? At least in the videos I've seen comparing their iGPUs with similarly specced Ryzen APUs. So I would maybe wait for them to sort that stuff out first?

Pretty much everybody's drivers are trash to some extent or another these days, because Nvidia sees gaming as secondary to ML and AMD would rather just let other people make drivers for them except Microsoft doesn't allow for that anymore.

Intel also has enough favoritism in the industry that they can push drivers through Windows Update. Nvidia long avoided that because since the days when they were competing with Voodoo they have always wanted you to bear witness to marketing and branding efforts, to know that you're consuming Nvidia product and to look for more Nvidia product in the future. Intel doesn't need to do that poo poo because they're goddamn Intel and they're going to be in every loving OEM machine if they want to be.

BurritoJustice
Oct 9, 2012

Ampere scales extremely well with temperature. My 3080 has a stock boost clock of 1755 but it runs at 2020MHz out of the box because I've got it under water at a peak of 39C. I don't know if I can even get more performance from manual overclocking.
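
A toy model of that behaviour; this is not Nvidia's actual GPU Boost algorithm, and the threshold, step, and ~15 MHz bin size are assumptions chosen just to show the shape of the curve.

code:

# Toy model only, NOT Nvidia's real GPU Boost logic. Assumes the clock
# drops one ~15 MHz bin every few degrees past a threshold temperature.
def boost_clock_mhz(temp_c, max_boost=2020, threshold_c=40, bin_mhz=15, step_c=5):
    if temp_c <= threshold_c:
        return max_boost
    bins_lost = -(-(temp_c - threshold_c) // step_c)  # ceiling division
    return max_boost - bins_lost * bin_mhz

print(boost_clock_mhz(39))  # 2020 -> a water-cooled card holds full boost
print(boost_clock_mhz(75))  # 1915 -> a typical air-cooled load temp sits lower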

MarcusSA
Sep 23, 2007

change my name posted:

Aren't Intel's graphics drivers still absolute trash, to the point of negatively impacting performance? At least in the videos I've seen comparing their iGPUs with similarly specced Ryzen APUs. So I would maybe wait for them to sort that stuff out first?

Yes they are and yes absolutely. Their hardware could be fantastic but their drivers definitely still have issues.

Craptacular! posted:

Pretty much everybody's drivers are trash to some extent or another these days, because Nvidia sees gaming as secondary to ML and AMD would rather just let other people make drivers for them except Microsoft doesn't allow for that anymore.

Intel also has enough favoritism in the industry that they can push drivers through Windows Update. Nvidia long avoided that because since the days when they were competing with Voodoo they have always wanted you to bear witness to marketing and branding efforts, to know that you're consuming Nvidia product and to look for more Nvidia product in the future. Intel doesn't need to do that poo poo because they're goddamn Intel and they're going to be in every loving OEM machine if they want to be.

I’m sure they could, but they haven’t yet, so saying they could is a bit of a pipe dream, especially with the state of their drivers currently.

They have a lot of catching up to do and I’m sure they will do it eventually, but it’s gonna be a rough go at the start, that’s for sure.

shrike82
Jun 11, 2005

i don't recall drivers being that much better in the past

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.
The rule of thumb used to be NVidia was good with drivers while AMD was bad and Intel was horrible.

Dr. Video Games 0031
Jul 17, 2004

Are refresh rates still broken on nvidia drivers if you have multiple monitors with a game playing on one and a video playing on a second?

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

punk rebel ecks posted:

The rule of thumb used to be NVidia was good with drivers while AMD was bad and Intel was horrible.

It was a stupid and wrong rule of thumb. There were plenty of times AMD's drivers were as good or better. The reality is at this point we're sitting on drivers with decades of cobbled together hacks to make games work. They have to be in contention for some of the most nightmarish codebases around.

Dr. Video Games 0031 posted:

Are refresh rates still broken on nvidia drivers if you have multiple monitors with a game playing on one and a video playing on a second?

That's a windows issue, not an Nvidia issue. MS has sorta-kinda but not really fixed it.

Dr. Video Games 0031
Jul 17, 2004

K8.0 posted:

That's a windows issue, not an Nvidia issue. MS has sorta-kinda but not really fixed it.

Then why doesn't it happen on AMD cards?
