|
TheFluff posted:The Direct3D team put their gigantic collection of 30+ years of GPUs up on the walls in their office. Pretty cool stuff. My first card when I got back into PCs in '95ish was a Rendition Verite 1000, which was awesome for the time. A year or two later I paired it with a Voodoo and it rocked Quake and Longbow 2. I still think I have my Voodoo 1 & 2 cards kicking around somewhere, and I'm pretty sure I have a 5500 in its box. AGP though
|
# ? Jan 12, 2019 18:59 |
|
|
When I was in school, 15 nm was where they predicted the laws of physics would become the limit. Turns out it was more like 65 nm; at that point the node names became increasingly disconnected from the actual feature sizes.
|
# ? Jan 12, 2019 19:01 |
|
That's probably a nightmare to dust; I feel bad for their cleaning staff.
|
# ? Jan 12, 2019 19:03 |
|
Surprise Giraffe posted:Give the rate of change some other name or none at all if you like, it's sad to watch it falter like this. Doubt that's all down to physics, all things considered. Hundreds of billions of dollars have gone into it to even get this far, plus thirty years of a funding firehose from Intel, NSF, DoD, etc., and countless PhDs in any domain even tangentially related. We're approaching wires only a single atom wide, patterns laid down by X-rays, vacuums sparser than deep space, but sure, if only there were more random PhDs.
|
# ? Jan 12, 2019 19:14 |
|
German Computerbase Forum, some months ago: „Lmao the 2080 is just 1.5-year-old 1080Ti performance with crippled 8 GB VRAM and a lottery ticket for Turing features being relevant in games in a year.. nvidia failed big time! For 700 bucks lol." German Computerbase Forum now: „Wohoo, the R7 is a great gap closer with dominant 16 GB VRAM and 2080 performance for 700 bucks, that's competition now!" I don't want to interrupt the victory laps of the AMD lovers, but I don't understand why they felt the 2080 was a major fail in the conclusion while the R7 now is amazing? It's an allegedly slightly slower 2080 minus the Turing features, for 700 bucks as well, with irrelevant 16 GB VRAM, unless you refer to the running gag of heavily modded Skyrim at 4K as a relevant scenario. Mr.PayDay fucked around with this message at 19:33 on Jan 12, 2019 |
# ? Jan 12, 2019 19:18 |
|
I find it a bit hilarious that Jensen is chastising "noncompliant" FreeSync monitors for blanking and poo poo like that, when not so long ago their own driver was broken for 3-4 releases, making G-Sync fail in exactly the same way.
|
# ? Jan 12, 2019 19:28 |
|
Combat Pretzel posted:I find it a bit hilarious that Jensen is chastising "noncompliant" FreeSync monitors for blanking and poo poo like that, when not so long ago their own driver was broken for 3-4 releases, making G-Sync fail in exactly the same way. Did the FreeSync monitors get fixed? If you can't or won't ship a fix in a way that's convenient for the customer, you have to be more careful to not have problems in the first place.
|
# ? Jan 12, 2019 19:30 |
|
People love underdogs
|
# ? Jan 12, 2019 19:32 |
|
Subjunctive posted:Did the FreeSync monitors get fixed? If you can’t or won’t ship a fix in a way that’s convenient for the customer, you have to be more careful to not have problems in the first place.
|
# ? Jan 12, 2019 19:36 |
|
GRINDCORE MEGGIDO posted:Wonder if any goons do? A thread by them would be fascinating. "It's so loving exciting to see chips get so small that I keep giving AMD money" has been a constant refrain in my life.
|
# ? Jan 12, 2019 19:39 |
|
Surprise Giraffe posted:Give the rate of change some other name or none at all if you like, it's sad to watch it falter like this. Doubt that's all down to physics, all things considered. There were definitely bad decisions made, like Brian Krzanich's absolutely idiotic quest to cut R&D spending at a company that's incredibly reliant on R&D to advance its product line. Yet even with increased investment it's hard to fight against a cost ramp-up this big. The main limiter on smaller nodes, IMO, is not going to be physical inability to manufacture them but the ramp-up in costs eventually making them economically unviable. MaxxBot fucked around with this message at 19:57 on Jan 12, 2019 |
# ? Jan 12, 2019 19:54 |
|
Combat Pretzel posted:you can't go around pointing fingers You sure can, because there's a difference between "fixed promptly" and "present for the lifetime of the product with no attempt to remedy".
|
# ? Jan 12, 2019 19:59 |
|
Combat Pretzel posted:I find it a bit hilarious that Jensen is chastising "noncompliant" FreeSync monitors for blanking and poo poo like that, when not so long ago their own driver was broken for 3-4 releases, making G-Sync fail in exactly the same way. I've been using a G-Sync display, always updating my drivers as soon as a new one hits, and this has never happened to me. Granted, this is a sample size of one G-Sync display and one GPU; maybe I just dodged the bullet?
|
# ? Jan 12, 2019 20:25 |
|
Mr.PayDay posted:German Computerbase Forum, some months ago One is a well-off company in a dominant position ultimately pushing the prices up and up for similar performance, the other is a struggling company re-using a compute card for consumers with what really is an enormous amount of (incredibly expensive) VRAM. The cost is too high, but that's how it is. Radeon VII is not an amazing solution, it's a stop-gap one, but the thing here is that they're happy there's (potentially) competition at the high end. That ends up being a good thing for everyone, even if the cards involved are all bad purchases. If AMD's GPU share dwindles to absolutely gently caress-all, we'll be trapped in the same position as we were with Intel CPUs, where they basically didn't see any need to increase core counts on the mainstream platform for a crazy long time. HalloKitty fucked around with this message at 20:35 on Jan 12, 2019 |
# ? Jan 12, 2019 20:32 |
|
craig588 posted:When I was in school, 15 nm was where they predicted the laws of physics would become the limit. Turns out it was more like 65 nm; at that point the node names became increasingly disconnected from the actual feature sizes. The physics isn't the problem. Quantum mechanics is extremely precise and accurate; we know how to simulate down to the subatomic level. The problem is engineering. It's easy to say "small features need small wavelengths," but producing EUV (soft X-ray) light is a loving Rube Goldberg machine, since lenses don't work at those frequencies. Also, you've gotta make billions and billions of defect-free devices thousands of times a day. Statistical variance kills you in the placement of atoms.
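To put a rough number on that last point, here's a back-of-envelope sketch (the 1e18 per cm^3 doping level and the cube-shaped channel are made-up illustrative assumptions, not real process parameters): the fewer dopant atoms a feature contains, the larger the relative Poisson fluctuation, roughly 1/sqrt(N).

```python
import math

def dopant_count(feature_nm: float, doping_per_cm3: float = 1e18) -> float:
    """Expected dopant atoms in an idealized cube-shaped channel (illustrative only)."""
    volume_cm3 = (feature_nm * 1e-7) ** 3  # 1 nm = 1e-7 cm
    return doping_per_cm3 * volume_cm3

for nm in (65, 22, 7):
    n = dopant_count(nm)
    # Poisson statistics: relative fluctuation scales like 1/sqrt(N)
    print(f"{nm:>3} nm cube: ~{n:.0f} dopant atoms, ~{100 / math.sqrt(n):.0f}% relative fluctuation")
```

With these toy numbers, a 65 nm cube holds a few hundred dopants (a few percent spread), while a single-digit-nanometer cube holds less than one atom on average, so device-to-device variation blows up.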
|
# ? Jan 12, 2019 20:39 |
|
MaxxBot posted:There were definitely bad decisions made, like Brian Krzanich's absolutely idiotic quest to cut R&D spending at a company that's incredibly reliant on R&D to advance its product line. Yet even with increased investment it's hard to fight against a cost ramp-up this big. The main limiter on smaller nodes, IMO, is not going to be physical inability to manufacture them but the ramp-up in costs eventually making them economically unviable. Yep, we can manufacture tiny, tiny transistors using e-beam lithography, but it would cost a billion dollars a chip
|
# ? Jan 12, 2019 20:40 |
|
HalloKitty posted:One is a well-off company in a dominant position ultimately pushing the prices up and up for similar performance, the other is a struggling company re-using a compute card for consumers with what really is an enormous amount of (incredibly expensive) VRAM. The cost is too high, but that's how it is. Radeon VII is not an amazing solution, it's a stop-gap one, but the thing here is - they're happy that there's (potentially) competition at the high-end. That ends up being a good thing for everyone, even if the cards involved are all bad purchases. AMD's GPU share has been nonexistent at the high end for 4 years, and in general that period has had some of the best value high-end cards in history - the 980Ti and 1080Ti were both remarkably high performance (big gap vs x80) and relatively decent value for high-end cards. The 20 series is an anomaly in performance because it's not a performance focused architecture, Nvidia introduced something new that takes up a ton of space, and they did it on an old process where the resulting GPUs are enormous and expensive to produce. People need to calm the hell down and wait for the die shrink. Of course a GPU with a lot of BOM spent on something with zero software support is not going to be a good value when it's new. Be glad that Nvidia has chosen this largely irrelevant point in time to push something new out, so the next generation can be solid value again.
|
# ? Jan 12, 2019 20:46 |
|
slidebite posted:I wish they had higher res pics of those. Apparently the pics are low-res because a lot of those cards were preproduction models provided to Microsoft under NDA. The NDAs may or may not have expired by now but nobody wants to take the time to dig back through 25 years of them to figure out which they could post high-res pics of and which they couldn't so they just went with low-res for the whole set.
|
# ? Jan 12, 2019 21:06 |
|
GRINDCORE MEGGIDO posted:Wonder if any goons do? A thread by them would be fascinating. This was pretty up-to-date 6 years ago, more on the process side than the physics side though. https://www.youtube.com/watch?v=NGFhc8R_uO4 The problem with looking for goons is that anyone in the industry is going to be behind NDAs with serious teeth. The guy who presented this had to stop himself repeatedly as he got too close to trade secrets. On the physics side, academic research is open, and you can get an idea of what's in production now from what they were working on 5-10 years ago.
|
# ? Jan 12, 2019 21:24 |
|
Subjunctive posted:You sure can, because there's a difference between "fixed promptly" and "present for the lifetime of the product with no attempt to remedy". Indiana_Krom posted:I've been using a G-Sync display, always updating my drivers as soon as a new one hits, and this has never happened to me. Granted, this is a sample size of one G-Sync display and one GPU; maybe I just dodged the bullet? Like that issue where, after a random number of wake-ups and mode switches, the G-Sync module in my displays picks a random column 8 pixels wide and moves it to the left side of the screen? Which can only be fixed with a firmware update done at the manufacturer?
|
# ? Jan 12, 2019 21:31 |
|
Combat Pretzel posted:Like that issue where, after a random number of wake-ups and mode switches, the G-Sync module in my displays picks a random column 8 pixels wide and moves it to the left side of the screen? Which can only be fixed with a firmware update done at the manufacturer? Yours moves to the left side of the screen? Huh, mine moves to the right side on my XB271HU. Didn't know it could go to the left, unless there are two monitor SKUs out there that do this.
|
# ? Jan 12, 2019 21:45 |
|
Mine moves to the left. I always assumed it's a driver thing, because I've never seen it happen in Linux.
|
# ? Jan 12, 2019 21:49 |
|
XB271HU here. It picks a random stripe, puts it on the left, and everything else, up to where that stripe came from, moves to the right. Now that you say it, I've seen the stripe move to the right, too. Now I'm not sure whether it happens regularly on both screens or whether one goes left and the other goes right. Acer wanted me to send them in, but like I'm going to send something bulky around, tempting my luck to get it back slightly round. Craptacular! posted:Mine moves to the left. I always assumed it's a driver thing, because I've never seen it happen in Linux.
|
# ? Jan 12, 2019 21:50 |
|
Always does it to the right on mine. Happens randomly, but it's been more common lately. It has also, for some reason, started randomly turning on Nvidia's stereoscopic 3D Vision mode after waking up. It's done that 3 times total, and I have to power cycle the computer to fix it. I don't have the 3D Vision drivers installed, so I dunno wtf
|
# ? Jan 12, 2019 21:56 |
|
Combat Pretzel posted:Like that issue where, after a random number of wake-ups and mode switches, the G-Sync module in my displays picks a random column 8 pixels wide and moves it to the left side of the screen? Which can only be fixed with a firmware update done at the manufacturer? Yeah, that is bad, though at least there's a way to get it fixed. Is your point "if one produces a thing which has a defect, one cannot decry other people for producing a thing with a different defect"? That seems like a challenging position to defend, so I'm probably wrong. Did this only happen with Acer devices? Do we know whose defect it was?
|
# ? Jan 12, 2019 21:57 |
|
It involves updating the firmware of the G-Sync module specifically. The initial post was about Jensen going all "Ooooh, look how those not-G-Sync displays flicker because of no G-Sync," while their own ostensibly superior solution can be upset pretty easily. Which kind of suggests that the graphics card is actually more involved than expected, and that those supposedly terrible displays can probably be tamed via driver just fine. I guess I'm generally just cranky about that guy. Same with that hollering about Radeon VII. Like, what the hell. He's always been weird, but he never really went on the offensive like that against AMD/ATI before.
|
# ? Jan 12, 2019 22:47 |
|
Did it happen to other manufacturers’ products too? (That the card can cause similar flickering doesn’t really mean that it can cure it if it’s caused on the monitor side, but I hope you’re right because flicker-free high quality adaptive sync shouldn’t just be for the rich!)
|
# ? Jan 12, 2019 23:00 |
|
I've heard of it happening to the Dell TN G-Sync monitor in my travels looking for a solution for my XB271HU. BTW, my monitor shipped with the firmware that supposedly fixes the issue, according to some Acer community guy on their forums, but I still get the stripe. So I guess what I'm saying is: good thing you guys didn't bother when Acer wanted you to send it in.
|
# ? Jan 12, 2019 23:44 |
|
PCjr sidecar posted:Hundreds of billions of dollars have gone into it to even get this far, plus thirty years of a funding firehose from Intel, NSF, DoD, etc., and countless PhDs in any domain even tangentially related. I just wonder how closely the numbers, in terms of people and investment, have followed the increase in the complexity of the problems, but I guess that's pretty vague and political, and would need real research for anyone to say for sure. Harik posted:This was pretty up-to-date 6 years ago, more on the process side than the physics side though. That's really interesting, thanks.
|
# ? Jan 12, 2019 23:48 |
|
yergacheffe posted:I've heard of it happening to the Dell TN G-Sync monitor Latest revision S2716DGR, can confirm. Outstanding screen otherwise.
|
# ? Jan 13, 2019 00:02 |
|
Craptacular! posted:Latest revision S2716DGR, can confirm. Outstanding screen otherwise. Hmm, I've had the A04 revision for over a year now and haven't run into that issue. Maybe I'm just lucky.
|
# ? Jan 13, 2019 00:16 |
|
Craptacular! posted:Latest revision S2716DGR, can confirm. Outstanding screen otherwise. When did you buy it? I got mine around June 2016. If yours is relatively new, it's surprising that it'll still happen.
|
# ? Jan 13, 2019 00:40 |
|
Combat Pretzel posted:When did you buy it? I got mine around June 2016. If yours is relatively new, it's surprising that it'll still happen. I guess I should clarify so we're talking about the same thing: I have a revision A09 panel I bought a couple of months ago on Black Friday. Sometimes, often prompted by switching between fullscreen games and the desktop, the picture will be shifted to the right, with the first however many pixels on the left looking sort of like the rightmost edge of the picture? It looks like it wrapped around or something. This can usually be triggered by alt-tabbing, and fixed by alt-tabbing again or launching a game if need be. Only once or twice have I not been able to reset it and had to reboot the whole computer.
|
# ? Jan 13, 2019 00:51 |
|
That's the issue, but the easiest way to fix it is to turn the monitor off and back on. I've never had that fail to work, and it takes two button presses and 3 seconds.
|
# ? Jan 13, 2019 00:59 |
|
My Asus PG279Q I bought in June 2016 does it too, and yeah a quick power cycle fixes it so I've never bothered with the firmware flash. e: it's a fairly rare occurrence these days so who knows
|
# ? Jan 13, 2019 01:04 |
|
Mr.PayDay posted:German Computerbase Forum, some months ago It's sour grapes. They hate nVidia.
|
# ? Jan 13, 2019 01:32 |
|
i have the same dell and mine does that too but like people say, quick power cycle and we're back in business it happens a few times a month maybe
|
# ? Jan 13, 2019 02:28 |
|
Well poo poo. Re: XV273K: quote:The FreeSync range supported by the screen is 48 - 120/144Hz but will depend on your resolution. The "overclocked" (as Acer like to refer to it) 4K-144Hz mode will not support FreeSync, as confirmed by the user manual. It looks like this is a limitation of using a dual DP connection, with FreeSync only working when powering the screen from a single interface. So if you're running at 4K resolution you will only be able to use FreeSync between 48 - 120Hz. If you have dropped to a lower resolution, like 2560 x 1440 for instance, you can run at a refresh rate of up to 144Hz easily, without needing to use dual DP connections or making any OSD changes. So if you're running at a lower resolution than the native 4K, the FreeSync range will be 48 - 144Hz. Guess I'm still not buying a new monitor.
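The quoted rule boils down to this little sketch (function name and tuple format are just for illustration; the ranges are only what the quote above claims for the XV273K):

```python
def freesync_range_hz(width: int, height: int) -> tuple[int, int]:
    """FreeSync range per the quoted review: single-DP only, so native 4K
    tops out at 120 Hz, while lower resolutions reach the full 144 Hz."""
    if (width, height) == (3840, 2160):
        return (48, 120)  # the 4K-144Hz "overclocked" mode drops FreeSync entirely
    return (48, 144)

print(freesync_range_hz(3840, 2160))  # (48, 120)
print(freesync_range_hz(2560, 1440))  # (48, 144)
```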
|
# ? Jan 13, 2019 10:30 |
|
Truga posted:Well poo poo. Re: XV273K:
|
# ? Jan 13, 2019 11:59 |
|
|
Combat Pretzel posted:I guess I'm generally just cranky about that guy. On reflection, I’m coming around to your crankiness.
|
# ? Jan 13, 2019 14:27 |