|
K8.0 posted:It means it's made out of 4 smaller pieces put together instead of one monolithic one. Same sort of principle as Zen chiplets in Ryzen/Threadripper CPUs. It's going to happen with all GPUs eventually, at least at the mid-high end. I see. Is this a good thing?
|
# ? Sep 7, 2021 19:11 |
|
|
punk rebel ecks posted:I see. Is this a good thing? It is less ideal than a monolithic die, but may be a necessary move in order to continue getting more powerful parts at a reasonable price.
|
# ? Sep 7, 2021 19:19 |
|
Zedsdeadbaby posted:Five to six years running games at high settings is my guess based on personal experience. My 1080ti was bought at launch in 2017 and it's still running games four years later on high settings at 4K with the occasional lowered resolution scale and individual taxing settings lowered a notch or two. gradenko_2000 posted:assuming DLSS keeps getting implemented into games, and FSR gets implemented in whatever doesn't get DLSS (given that FSR works on NVidia GPUs), and optimistically also looking forward to whatever XeSS has to offer both in game support and as a feature that can also be run on NVidia GPUs, I would wager that you'd be able to stretch out an Ampere card to at least half a decade just by progressively toning-down your expectations as the envelope gets pushed further and further Thanks for the feedback! I had used the 210 step up method to get the 3090 and had gotten the 10-year warranty; I’ve been planning now to sell it but there’s a part of me that thinks with the 10 year warranty, if I get several years out of it, maybe it isn’t so bad. A friend had a 3090 and downgraded to a 3080ti since he could sell the 3090 for enough to cover it, so I’ve been debating just going with a 3080ti otherwise.
|
# ? Sep 7, 2021 19:29 |
|
punk rebel ecks posted:What does 4 graphics chiplets mean? imagine 4 chiplets on the edge of a wafer...
|
# ? Sep 7, 2021 19:36 |
Kibner posted:It is less ideal than a monolithic die, but may be a necessary move in order to continue getting more powerful parts at a reasonable price. To expand a bit, using chiplets can drastically improve yield. If you are using a single hunk of silicon with four cores on it, a defect in any of them effectively kills the whole thing, but with a chiplet design a defect in any single die only kills that one chiplet.
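To put rough numbers on the yield argument, here's a toy sketch using a simple Poisson defect model; the defect density and die areas are made-up illustrative values, not real process data:

```python
import math

def die_yield(area_cm2: float, defect_density: float) -> float:
    """Poisson yield model: probability a die of the given area
    has zero defects, i.e. exp(-area * D0)."""
    return math.exp(-area_cm2 * defect_density)

D0 = 0.1         # assumed defects per cm^2 (illustrative)
core_area = 1.5  # assumed area of one chiplet in cm^2 (illustrative)

# Monolithic: the whole 4-core die must be defect-free at once.
monolithic = die_yield(4 * core_area, D0)

# Chiplets: each small die lives or dies independently, so a bad
# one gets binned by itself instead of killing three good neighbors.
chiplet = die_yield(core_area, D0)

print(f"monolithic 4-core die yield: {monolithic:.1%}")  # → 54.9%
print(f"single chiplet yield:        {chiplet:.1%}")     # → 86.1%
```

With those assumed numbers, nearly half the monolithic dies are scrap while most chiplets survive, which is the whole pitch.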
|
|
# ? Sep 7, 2021 19:52 |
|
Really they should make square wafers. Such a waste.
|
# ? Sep 7, 2021 19:55 |
|
Rinkles posted:Really they should make square wafers. Such a waste. Ah yes, the Wendy's Wafer
|
# ? Sep 7, 2021 20:53 |
|
Rinkles posted:Really they should make square wafers. Such a waste. Silicon boules are grown by pulling a rotating seed crystal out of the melt (the Czochralski process), so the ingot, and the wafers sliced from it, are round by default. You'd have to cut them down to get them square. You may as well keep them round and do what you can to fill in the edges - which I guarantee is a heavy focus with the really high-end processes. TSMC isn't run by morons, they aren't wasting anything they don't absolutely have to.
|
# ? Sep 7, 2021 21:51 |
|
It was a joke.
|
# ? Sep 7, 2021 21:55 |
|
They should make pentagonal wafers.
|
# ? Sep 7, 2021 22:41 |
|
making the dies hexagons so they form a honeycomb lattice would waste less wafer area, but i guess they would have to CNC cut them instead of just running a cutting wheel over the whole wafer
|
# ? Sep 7, 2021 22:48 |
|
I would simply form the silicon in a square manner.
|
# ? Sep 7, 2021 22:53 |
|
I would simply produce 100% perfect wafers every time, ensuring none went to waste
|
# ? Sep 7, 2021 22:55 |
|
3080 Ti FTW reservation on evga.com just popped. It's good for about 7 more hours. I have my 210 step up to 3090 on the way so I really don't need it. Anyone looking for an MSRP 3080 Ti?
|
# ? Sep 7, 2021 23:13 |
Typically the most important criterion in laying out a semiconductor wafer is maximizing use of the exposure field of the photolithography equipment. Since that field is formed by scanning a thin rectangular slit across the wafer, the dies end up rectangular.
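For anyone curious how much the round edge actually costs, here's a sketch using the standard gross-dies-per-wafer approximation; the die sizes below are hypothetical, not any real GPU's:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_w_mm: float,
                   die_h_mm: float) -> int:
    """Classic gross-die approximation: wafer area over die area,
    minus an edge-loss term for the partial dies around the rim."""
    die_area = die_w_mm * die_h_mm
    d = wafer_diameter_mm
    gross = (math.pi * d * d) / (4 * die_area) \
            - (math.pi * d) / math.sqrt(2 * die_area)
    return int(gross)

# One big die vs. four quarter-size chiplets on a 300 mm wafer
# (made-up rectangular die sizes for illustration)
print(dies_per_wafer(300, 25, 20))    # → 111 big dies
print(dies_per_wafer(300, 12.5, 10))  # → 505 chiplets, more than 4 * 111
```

Smaller dies also tile the curved edge better, which is why the chiplet count comes out ahead of a straight 4x multiple.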
|
|
# ? Sep 7, 2021 23:23 |
|
change my name posted:I would simply produce 100% percent perfect wafers every time, ensuring none went to waste Agreed Personally I'd build them in a dick-shaped manner. The longer the shaft, the faster the chip. Hire me Intel
|
# ? Sep 8, 2021 02:43 |
|
https://twitter.com/greymon55/status/1435217663765057540?s=20 hope this is true
|
# ? Sep 8, 2021 03:06 |
|
it's a really good price/performance point. i know i know it's far from scientific (point me at a better one!) but it's the only 30 series on the steam hardware survey doodad for a reason. i love mine very much!
|
# ? Sep 8, 2021 03:08 |
|
just wait, slide is probably about msrp or shroud horniness levels
|
# ? Sep 8, 2021 03:10 |
|
While I wouldn't sneeze at a 3060 Ti/3070 if I could actually get one for $500/$700 respectively, I kinda just want either that Intel card or a 6700 XT these days just to have the Linux support. In Intel's case I'm hoping the video encoder is good, too. Something like Pascal or better, even if it's not up to Turing, is still better than AMD at this moment.
|
# ? Sep 8, 2021 03:19 |
|
Craptacular! posted:While I wouldn't sneeze at a 3060ti/3070 if I could actually get one for $500/$700 respectively, I kinda just want either that Intel or an 6700xt these days just to have the Linux support. In Intel's case I'm hoping the video encoder is good, too. Like Pascal or better, even if it's not up to Turing, is still better than AMD at this moment. QuickSync has been a feather in Intel’s cap for a while. I’d expect it to compare favorably.
|
# ? Sep 8, 2021 03:59 |
|
The lucky few that actually got a 3070 for $500 got one of the best deals the PC gaming scene has ever seen imo, beefy performance for the price for sure
|
# ? Sep 8, 2021 04:06 |
|
i hope it's not a zero sum game where Nvidia, AMD, and Intel are competing for the same TSMC supply, and that they're able to supply more combined. next year is going to be brutal otherwise
|
# ? Sep 8, 2021 04:11 |
|
IIRC Alchemist is on TSMC N6, so not the same N7 capacity used for RDNA2. Next gen might see overlap though
|
# ? Sep 8, 2021 04:15 |
|
Will Intel's GPU likely support freesync?
|
# ? Sep 8, 2021 04:34 |
|
punk rebel ecks posted:Will Intel's GPU likely support freesync? 10th gen and up CPUs with integrated graphics say they support Intel adaptive sync, which as far as i can tell is the same VESA Adaptive-Sync standard FreeSync is built on. Seems likely the GPUs will too
|
# ? Sep 8, 2021 04:46 |
|
that’d be an incredible self own if they didn’t
|
# ? Sep 8, 2021 04:47 |
|
shrike82 posted:that’d be an incredible self own if they didn’t from 2018 saying they will https://twitter.com/IntelGraphics/status/1062855813797044224
|
# ? Sep 8, 2021 05:01 |
|
Aren't Intel's graphics drivers still absolute trash to the point of negatively impacting performance? At least in the videos I've seen comparing their iGPUs with similarly spec'd Ryzen APUs. So I would maybe wait for them to sort that stuff out first?
|
# ? Sep 8, 2021 05:08 |
|
NeverRelax posted:10th gen and up CPUs with integrated graphics say they support intel adaptive sync which as far as i can tell is freesync. Seems likely the GPUs will too hadn't heard of that, that's cool. same goes for similar AMD cpus?
|
# ? Sep 8, 2021 05:09 |
|
NeverRelax posted:10th gen and up CPUs with integrated graphics say they support intel adaptive sync which as far as i can tell is freesync. Seems likely the GPUs will too I want to buy a 4K TV, but I'd be bummed if it supported FreeSync/G-Sync but not whatever Intel is doing for their graphics cards. punk rebel ecks fucked around with this message at 05:20 on Sep 8, 2021 |
# ? Sep 8, 2021 05:15 |
|
Rinkles posted:hadn't heard of that, that's cool. same goes for similar AMD cpus? AMD iGPUs have a lot of great driver support because at heart they're just tiny AMD GPUs, and AMD's GPU drivers are actually quite good, speaking as an AMD GPU user for the last 4 years.
|
# ? Sep 8, 2021 05:20 |
|
change my name posted:Aren't Intel's graphics drivers still absolute trash to the point of negatively impacting performance? At least in the videos I've seen comparing their IGUs with similarly specc'ed Ryzen APUs. So I would maybe wait for them to sort that stuff out first? Pretty much everybody's drivers are trash to some extent or another these days, because Nvidia sees gaming as secondary to ML and AMD would rather just let other people make drivers for them except Microsoft doesn't allow for that anymore. Intel also has enough favoritism in the industry that they can push drivers through Windows Update. Nvidia long avoided that because since the days when they were competing with Voodoo they have always wanted you to bear witness to marketing and branding efforts, to know that you're consuming Nvidia product and to look for more Nvidia product in the future. Intel doesn't need to do that poo poo because they're goddamn Intel and they're going to be in every loving OEM machine if they want to be.
|
# ? Sep 8, 2021 05:33 |
|
Ampere scales extremely well with temperature. My 3080 has a stock boost clock of 1755 but it runs at 2020MHz out of the box because I've got it under water at a peak of 39C. I don't know if I can even get more performance from manual overclocking.
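GPU Boost sheds clocks in small fixed-size bins as the core heats up, which is why a cold card under water holds such a high sustained clock. A toy model of that behavior; the bin size, start temperature, and step here are assumptions for illustration, not Nvidia's actual tables:

```python
def boost_clock_mhz(cold_clock: int, temp_c: float,
                    bin_mhz: int = 15, start_c: float = 40.0,
                    step_c: float = 5.0) -> int:
    """Toy GPU Boost model: every step_c degrees above start_c
    sheds one bin_mhz clock bin (all thresholds assumed)."""
    if temp_c <= start_c:
        return cold_clock
    bins_lost = int((temp_c - start_c) // step_c) + 1
    return cold_clock - bins_lost * bin_mhz

# A 39C water-cooled card keeps every bin; an air-cooled card at 75C doesn't.
print(boost_clock_mhz(2020, 39))  # → 2020
print(boost_clock_mhz(2020, 75))  # → 1900
```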
|
# ? Sep 8, 2021 05:34 |
|
change my name posted:Aren't Intel's graphics drivers still absolute trash to the point of negatively impacting performance? At least in the videos I've seen comparing their IGUs with similarly specc'ed Ryzen APUs. So I would maybe wait for them to sort that stuff out first? Yes they are, and yes absolutely. Their hardware could be fantastic but their drivers definitely still have issues. Craptacular! posted:Pretty much everybody's drivers are trash to some extent or another these days, because Nvidia sees gaming as secondary to ML and AMD would rather just let other people make drivers for them except Microsoft doesn't allow for that anymore. I'm sure they could, but they haven't yet, so saying they could is a bit of a pipe dream, especially with the state of their drivers currently. They have a lot of catching up to do and I'm sure they will do it eventually, but it's gonna be a rough go at the start, that's for sure.
|
# ? Sep 8, 2021 05:36 |
|
i don't recall drivers being that much better in the past
|
# ? Sep 8, 2021 05:37 |
|
The rule of thumb used to be NVidia was good with drivers while AMD was bad and Intel was horrible.
|
# ? Sep 8, 2021 05:39 |
|
Are refresh rates still broken on nvidia drivers if you have multiple monitors with a game playing on one and a video playing on a second?
|
# ? Sep 8, 2021 05:56 |
|
punk rebel ecks posted:The rule of thumb used to be NVidia was good with drivers while AMD was bad and Intel was horrible. It was a stupid and wrong rule of thumb. There were plenty of times AMD's drivers were as good or better. The reality is at this point we're sitting on drivers with decades of cobbled together hacks to make games work. They have to be in contention for some of the most nightmarish codebases around. Dr. Video Games 0031 posted:Are refresh rates still broken on nvidia drivers if you have multiple monitors with a game playing on one and a video playing on a second? That's a windows issue, not an Nvidia issue. MS has sorta-kinda but not really fixed it.
|
# ? Sep 8, 2021 06:05 |
|
|
K8.0 posted:That's a windows issue, not an Nvidia issue. MS has sorta-kinda but not really fixed it. Then why doesn't it happen on AMD cards?
|
# ? Sep 8, 2021 06:10 |