slidebite
Nov 6, 2005

Good egg
:colbert:

I wish they had higher res pics of those.

My first card when I got back into PCs in '95 or so was a Rendition Verite 1000, which was awesome for the time. A year or two later I paired it with a Voodoo and it rocked Quake and Longbow 2.

I still think I have my Voodoo 1 & 2 cards kicking around somewhere, and I'm pretty sure I have a 5500 in its box. AGP though :(


craig588
Nov 19, 2005

by Nyc_Tattoo
When I was in school, 15 nm was where they predicted the laws of physics would become the limit. It turned out to be more like 65 nm; from that point on, the node names became increasingly disconnected from the actual feature sizes.

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib
That's probably a nightmare to dust, I feel bad for their cleaning staff.

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

Surprise Giraffe posted:

Give the rate of change some other name or none at all if you like, it's sad to watch it falter like this. Doubt that's all down to physics, all things considered.

There’s hundreds of billions of dollars that have gone into it to even get this far, thirty years of a funding firehose from Intel, NSF, DoD, etc., and countless PhDs in any domain even tangentially related.

We’re approaching wires only a single atom wide, patterns laid down by X-rays, vacuums sparser than deep space, but if only there were more random PhDs, sure.

Mr.PayDay
Jan 2, 2004
life is short - play hard
German Computerbase Forum, some months ago
„Lmao the 2080 is just 1.5 year old 1080Ti performance with crippled 8 GB VRAM and with a lottery ticket for Turing features being relevant in 1 year in games.. nvidia failed big time! For 700 bucks lol.“

German Computerbase Forum now
„Wohoo, the R7 is a great gap closer and dominant 16 GB VRAM and 2080 performance for 700 bucks, that’s competition now!“

I don’t want to interrupt the victory laps of the AMD lovers, but I don’t understand why they felt the 2080 was a major fail in their conclusion while the R7 now is amazing? It’s allegedly a slightly slower 2080 minus the Turing chips, for 700 bucks as well, with irrelevant 16 GB VRAM, unless you consider the running gag of heavily modded Skyrim at 4K a relevant scenario.

Mr.PayDay fucked around with this message at 19:33 on Jan 12, 2019

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I find it a bit hilarious that Jensen is chastising "uncompliant" FreeSync monitors for blanking and poo poo like that, when not so long ago their driver was broken for 3-4 releases, making GSync fail exactly the same way.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Combat Pretzel posted:

I find it a bit hilarious that Jensen is chastising "uncompliant" FreeSync monitors for blanking and poo poo like that, when not so long ago their driver was broken for 3-4 releases, making GSync fail exactly the same way.

Did the FreeSync monitors get fixed? If you can’t or won’t ship a fix in a way that’s convenient for the customer, you have to be more careful to not have problems in the first place.

alex314
Nov 22, 2007

People love underdogs :shrug:

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Subjunctive posted:

Did the FreeSync monitors get fixed? If you can’t or won’t ship a fix in a way that’s convenient for the customer, you have to be more careful to not have problems in the first place.
I have no idea, because I currently have GSync ones. Those got fixed eventually. I'm just saying, you can't go around pointing fingers when you can't keep your own poo poo working, because you broke it with a driver update on a component that's not supposed to be involved in refreshing the LCD. I mean, that's one of the points of the GSync module in the display, per NVidia at CES a few days ago.

ZobarStyl
Oct 24, 2005

This isn't a war, it's a moider.

GRINDCORE MEGGIDO posted:

Wonder if any goons do? A thread by them would be fascinating.
Agreed. This industry is already crashing headlong into my own; massive amounts of modern biology cannot exist without silicon micromanufacturing. Chips already are nanomachines for measurement purposes, and chip-scale manufacturing cannot be far off. And when a chip-scale machine builds the next generation of ever-smaller chips, it's silicon-based life in all but name.

"It's so loving exciting to see chips get so small that I keep giving AMD money" has been a constant refrain in my life.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

Surprise Giraffe posted:

Give the rate of change some other name or none at all if you like, it's sad to watch it falter like this. Doubt that's all down to physics, all things considered.

There were definitely bad decisions made, like Brian Krzanich's absolutely idiotic quest to cut R&D spending for a company that's incredibly reliant on R&D spending to advance their product line. Yet even with increased investment it's hard to fight against a ramp up of costs this big. The main limiter of smaller nodes IMO is not going to be physical inability to manufacture them but this big ramp up in costs eventually making them economically unviable.

MaxxBot fucked around with this message at 19:57 on Jan 12, 2019

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Combat Pretzel posted:

you can't go around pointing fingers

You sure can, because there’s a difference between “fixed promptly” vs “present for the lifetime of the product with no attempt to remedy”.

Indiana_Krom
Jun 18, 2007
Net Slacker

Combat Pretzel posted:

I find it a bit hilarious that Jensen is chastising "uncompliant" FreeSync monitors for blanking and poo poo like that, when not so long ago their driver was broken for 3-4 releases, making GSync fail exactly the same way.

I've been using a gsync display while always updating my drivers when a new one hit and this never happened to me. Granted this is a sample size of one gsync display and one gpu, maybe I just dodged the bullet?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Mr.PayDay posted:

German Computerbase Forum, some months ago
„Lmao the 2080 is just 1.5 year old 1080Ti performance with crippled 8 GB VRAM and with a lottery ticket for Turing features being relevant in 1 year in games.. nvidia failed big time! For 700 bucks lol.“

German Computerbase Forum now
„Wohoo, the R7 is a great gap closer and dominant 16 GB VRAM and 2080 performance for 700 bucks, that’s competition now!“

I don’t want to interrupt the victory laps of the AMD lovers, but I don’t understand why they felt the 2080 was a major fail in their conclusion while the R7 now is amazing? It’s allegedly a slightly slower 2080 minus the Turing chips, for 700 bucks as well, with irrelevant 16 GB VRAM, unless you consider the running gag of heavily modded Skyrim at 4K a relevant scenario.

One is a well-off company in a dominant position ultimately pushing the prices up and up for similar performance, the other is a struggling company re-using a compute card for consumers with what really is an enormous amount of (incredibly expensive) VRAM. The cost is too high, but that's how it is. Radeon VII is not an amazing solution, it's a stop-gap one, but the thing here is - they're happy that there's (potentially) competition at the high-end. That ends up being a good thing for everyone, even if the cards involved are all bad purchases.

If AMD's GPU share dwindles to absolutely gently caress-all, we'll be trapped in the same position as we were with CPUs from Intel for some time, where they basically didn't see any need to increase core counts on the mainstream platform for a crazy long time.

HalloKitty fucked around with this message at 20:35 on Jan 12, 2019

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

craig588 posted:

When I was in school, 15 nm was where they predicted the laws of physics would become the limit. It turned out to be more like 65 nm; from that point on, the node names became increasingly disconnected from the actual feature sizes.

The physics isn't the problem. Quantum mechanics is extremely precise and accurate; we know how to simulate down to the subatomic level.

The problem is engineering. It's really easy to say "small feature, small wavelength" for light, but producing EUV (soft X-ray) light is a loving Rube Goldberg machine, since lenses don't work at those frequencies.

Also, you've got to make billions and billions of defect-free devices, thousands of times a day. Statistical variance kills you in the placement of atoms.
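The "statistical variance kills you" point can be made concrete with the textbook Poisson yield model. This is a back-of-the-envelope sketch; the die area and defect densities below are illustrative numbers, not any real fab's data:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Expected fraction of defect-free dies under a Poisson defect
    model: Y = exp(-A * D0), with area A in cm^2 and D0 in defects/cm^2."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-area_cm2 * defects_per_cm2)

# A hypothetical 600 mm^2 GPU-sized die at two illustrative defect densities:
for d0 in (0.1, 0.5):
    print(f"D0 = {d0}/cm^2 -> yield {poisson_yield(600, d0):.1%}")
```

Even a modest defect density wipes out most of a big die's good-die fraction, which is why huge chips on immature processes are so expensive.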

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

MaxxBot posted:

There were definitely bad decisions made, like Brian Krzanich's absolutely idiotic quest to cut R&D spending for a company that's incredibly reliant on R&D spending to advance their product line. Yet even with increased investment it's hard to fight against a ramp up of costs this big. The main limiter of smaller nodes IMO is not going to be physical inability to manufacture them but this big ramp up in costs eventually making them economically unviable.



Yep, we can manufacture tiny, tiny transistors using e-beam lithography, but it would cost a billion dollars a chip.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

HalloKitty posted:

One is a well-off company in a dominant position ultimately pushing the prices up and up for similar performance, the other is a struggling company re-using a compute card for consumers with what really is an enormous amount of (incredibly expensive) VRAM. The cost is too high, but that's how it is. Radeon VII is not an amazing solution, it's a stop-gap one, but the thing here is - they're happy that there's (potentially) competition at the high-end. That ends up being a good thing for everyone, even if the cards involved are all bad purchases.

If AMD's GPU share dwindles to absolutely gently caress-all, we'll be trapped in the same position as we were with CPUs from Intel for some time, where they basically didn't see any need to increase core counts on the mainstream platform for a crazy long time.


AMD's GPU share has been nonexistent at the high end for 4 years, and in general that period has had some of the best value high-end cards in history - the 980Ti and 1080Ti were both remarkably high performance (big gap vs x80) and relatively decent value for high-end cards. The 20 series is an anomaly in performance because it's not a performance focused architecture, Nvidia introduced something new that takes up a ton of space, and they did it on an old process where the resulting GPUs are enormous and expensive to produce.

People need to calm the hell down and wait for the die shrink. Of course a GPU with a lot of BOM spent on something with zero software support is not going to be a good value when it's new. Be glad that Nvidia has chosen this largely irrelevant point in time to push something new out, so the next generation can be solid value again.

Mr.Radar
Nov 5, 2005

You guys aren't going to believe this, but that guy is our games teacher.

slidebite posted:

I wish they had higher res pics of those.

My first card when I got back into PCs in '95 or so was a Rendition Verite 1000, which was awesome for the time. A year or two later I paired it with a Voodoo and it rocked Quake and Longbow 2.

I still think I have my Voodoo 1 & 2 cards kicking around somewhere, and I'm pretty sure I have a 5500 in its box. AGP though :(

Apparently the pics are low-res because a lot of those cards were preproduction models provided to Microsoft under NDA. The NDAs may or may not have expired by now but nobody wants to take the time to dig back through 25 years of them to figure out which they could post high-res pics of and which they couldn't so they just went with low-res for the whole set.

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop

GRINDCORE MEGGIDO posted:

Wonder if any goons do? A thread by them would be fascinating.

This was pretty up-to-date 6 years ago, more on the process side than the physics side though.

https://www.youtube.com/watch?v=NGFhc8R_uO4

The problem with looking for goons is that anyone in the industry is going to be behind NDAs with serious teeth. The guy who presented this had to stop himself repeatedly as he got too close to trade secrets.

For the physics side, academic research is open, and you can get an idea of what's in production now from what they were working on 5-10 years ago.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Subjunctive posted:

You sure can, because there’s a difference between “fixed promptly” vs “present for the lifetime of the product with no attempt to remedy”.
Like that issue where, after a random number of wake-ups and mode switches, the GSync module in my displays picks a random column 8 pixels wide and moves it to the left side of the screen? Which can only be fixed with a firmware update done at the manufacturer?

Indiana_Krom posted:

I've been using a gsync display while always updating my drivers when a new one hit and this never happened to me. Granted this is a sample size of one gsync display and one gpu, maybe I just dodged the bullet?
When it happened, I found plenty of people noticing the same. Might be specifically related to multiple displays.

MagusDraco
Nov 11, 2011

even speedwagon was trolled

Combat Pretzel posted:

Like that issue where, after a random number of wake-ups and mode switches, the GSync module in my displays picks a random column 8 pixels wide and moves it to the left side of the screen? Which can only be fixed with a firmware update done at the manufacturer?

Yours moves to the left side of the screen? Huh, mine moves to the right side of the screen on my XB271HU. Didn't know it could go to the left, unless there are two monitor SKUs out there that do this.

Craptacular!
Jul 9, 2001

Fuck the DH
Mine moves to the left. I always assumed it's a driver thing, because I've never seen it happen in Linux.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
XB271HU here. It picks a random stripe, puts it on the left, and everything else up to where that stripe came from moves to the right. Now that you say it, I've seen the stripe move to the right, too. So now I'm not sure whether it happens the same way on both screens or whether one goes left and the other goes right.

Acer wanted me to send them in, but like I'm going to ship something that bulky around, tempting my luck to get it back damaged.

Craptacular! posted:

Mine moves to the left. I always assumed it's a driver thing, because I've never seen it happen in Linux.
Doesn't happen when I disable GSync, and quickly power cycling fixes it, so I doubt it's the driver. It takes longer than that for the card to notice the screen's gone; it seems the relevant electronics in the display don't shut down the DisplayPort immediately.

MagusDraco
Nov 11, 2011

even speedwagon was trolled
Always does it to the right on mine. Happens randomly but it's been more common lately.

It also has, for some reason, started randomly turning on Nvidia stereoscopic 3D Vision mode after waking up. It's done that 3 times total, and I have to power cycle the computer to fix it. I don't even have the 3D Vision drivers installed, so I dunno wtf.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Combat Pretzel posted:

Like that issue where, after a random number of wake-ups and mode switches, the GSync module in my displays picks a random column 8 pixels wide and moves it to the left side of the screen? Which can only be fixed with a firmware update done at the manufacturer?

When it happened, I found plenty of people noticing the same. Might be specifically related to multiple displays.

Yeah, that is bad though at least there’s a way to get it fixed. Is your point “if one produces a thing which has a defect, one cannot decry other people for producing a thing with a different defect”? That seems like a challenging position to defend, so I’m probably wrong.

Did this only happen with Acer devices? Do we know whose defect it was?

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
It involves updating the firmware of specifically the GSync module.

The initial post was about Jensen going all "Ooooh, look how those not-GSync displays flicker because of no GSync", while their own ostensibly superior solution can be upset pretty easily. Which kind of suggests that the graphics card is actually more involved than expected, and that those apparently-so-terrible displays can probably be tamed via the driver just fine. I guess I'm generally just cranky about that guy. Same with that hollering about Radeon VII. Like, what the hell. He's always been weird, but never really went on the offensive like that against AMD/ATI before.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Did it happen to other manufacturers’ products too?

(That the card can cause similar flickering doesn’t really mean that it can cure it if it’s caused on the monitor side, but I hope you’re right because flicker-free high quality adaptive sync shouldn’t just be for the rich!)

yergacheffe
Jan 22, 2007
Whaler on the moon.

I've heard of it happening to the Dell TN GSync monitor in my travels looking for a solution for my XB271HU.

BTW, my monitor shipped with the firmware that supposedly fixes the issue, according to some Acer community guy on their forums, but I still get the stripe. So I guess what I'm saying is: good thing you guys didn't bother when Acer wanted you to send yours in.

Surprise Giraffe
Apr 30, 2007
1 Lunar Road
Moon crater
The Moon

PCjr sidecar posted:

There’s hundreds of billions of dollars that have gone into it to even get this far, thirty years of a funding firehose from Intel, NSF, DoD, etc., and countless PhDs in any domain even tangentially related.

We’re approaching wires only a single atom wide, patterns laid down by X-rays, vacuums sparser than deep space, but if only there were more random PhDs, sure.

I just wonder how closely the numbers, in terms of people and investment, track the increasing complexity of the problems, but I guess that's pretty vague and political, and would need real research for anyone to say for sure.

Harik posted:

This was pretty up-to-date 6 years ago, more on the process side than the physics side though.

https://www.youtube.com/watch?v=NGFhc8R_uO4

The problem with looking for goons is that anyone in the industry is going to be behind NDAs with serious teeth. The guy who presented this had to stop himself repeatedly as he got too close to trade secrets.

For the physics side, academic research is open, and you can get an idea of what's in production now from what they were working on 5-10 years ago.

That's really interesting, thanks.

Craptacular!
Jul 9, 2001

Fuck the DH

yergacheffe posted:

I've heard of it happening to the Dell TN GSync monitor

Latest revision S2716DGR, can confirm. Outstanding screen otherwise.

Regrettable
Jan 5, 2010



Craptacular! posted:

Latest revision S2716DGR, can confirm. Outstanding screen otherwise.

Hmm, I've had the A04 revision for over a year now and haven't run into that issue. Maybe I'm just lucky. :shrug:

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Craptacular! posted:

Latest revision S2716DGR, can confirm. Outstanding screen otherwise.
When did you buy it? I got mine around June 2016. If yours is relatively new, it's surprising that it'll still happen.

Craptacular!
Jul 9, 2001

Fuck the DH

Combat Pretzel posted:

When did you buy it? I got mine around June 2016. If yours is relatively new, it's surprising that it'll still happen.

I guess I should clarify so we're talking about the same thing:

I have a revision A09 panel I bought a couple months ago on Black Friday. Sometimes, often prompted by switching between fullscreen games and the desktop, the picture will be shifted to the right, with the first however many pixels on the left looking sort of like the rightmost edge of the picture? It looks like it wrapped around or something. It can usually be triggered by alt-tabbing, and fixed by alt-tabbing again or launching a game if need be.

Only once or twice have I not been able to reset it and had to reboot the whole computer.

Stickman
Feb 1, 2004

That's the issue, but the easiest way to fix it is to turn the monitor off and back on. I've never had that fail to work, and it takes two button presses and 3 seconds.

Enos Cabell
Nov 3, 2004


My Asus PG279Q I bought in June 2016 does it too, and yeah a quick power cycle fixes it so I've never bothered with the firmware flash.

e: it's a fairly rare occurrence these days so who knows

GRINDCORE MEGGIDO
Feb 28, 1985


Mr.PayDay posted:

German Computerbase Forum, some months ago
„Lmao the 2080 is just 1.5 year old 1080Ti performance with crippled 8 GB VRAM and with a lottery ticket for Turing features being relevant in 1 year in games.. nvidia failed big time! For 700 bucks lol.“

German Computerbase Forum now
„Wohoo, the R7 is a great gap closer and dominant 16 GB VRAM and 2080 performance for 700 bucks, that’s competition now!“

I don’t want to interrupt the victory laps of the AMD lovers, but I don’t understand why they felt the 2080 was a major fail in their conclusion while the R7 now is amazing? It’s allegedly a slightly slower 2080 minus the Turing chips, for 700 bucks as well, with irrelevant 16 GB VRAM, unless you consider the running gag of heavily modded Skyrim at 4K a relevant scenario.

It's sour grapes. They hate nVidia.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

i have the same dell and mine does that too but like people say, quick power cycle and we're back in business

it happens a few times a month maybe

Truga
May 4, 2014
Lipstick Apathy
Well poo poo. Re: XV273K:

quote:

The FreeSync range supported by the screen is 48 - 120/144Hz but will depend on your resolution. The "overclocked" (as Acer like to refer to it as) 4K-144Hz mode will not support FreeSync, as confirmed by the user manual. It looks like this is a limitation of using a dual DP connection, with FreeSync only working when powering the screen from a single interface. So if you're running at 4K resolution you will only be able to use FreeSync between 48 - 120Hz. If you have dropped to a lower resolution, like 2560 x 1440 for instance, you can run at a refresh rate of up to 144Hz easily, without needing to use dual DP connections or making any OSD changes. So if you're running at a lower resolution than the native 4K, the FreeSync range will be 48 - 144Hz.

Guess I'm still not buying a new monitor.
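For what it's worth, the quoted behavior boils down to a simple rule. Here's a sketch of my reading of that quote (this is an interpretation of the review text above, not Acer documentation; the function name is made up):

```python
def xv273k_freesync_range(width: int, height: int, refresh_hz: int):
    """My reading of the quoted XV273K behavior: FreeSync only works
    over a single DP connection, and the "overclocked" 4K-144Hz mode
    needs dual DP. Returns the usable (min, max) Hz range, or None."""
    native_4k = (width, height) == (3840, 2160)
    if native_4k and refresh_hz > 120:
        return None                       # dual-DP mode: no FreeSync
    return (48, 120) if native_4k else (48, 144)

print(xv273k_freesync_range(3840, 2160, 144))  # None
print(xv273k_freesync_range(3840, 2160, 120))  # (48, 120)
print(xv273k_freesync_range(2560, 1440, 144))  # (48, 144)
```

In other words, you can have 4K at 144Hz or FreeSync, but not both at once.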

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Truga posted:

Well poo poo. Re: XV273K:


Guess I'm still not buying a new monitor.
Yeah, DP 1.4 is still DP 1.4. At least for me though it's no big deal, 120Hz is still higher than I can probably drive it at for the games where Freesync matters, and the difference between 120 and 144 is pretty insignificant. With how little I play AAA titles these days I suspect I'd run it at 144Hz fixed refresh rate most of the time though. Probably gonna end up ordering it against my better judgement later this month - I really don't need it.


Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Combat Pretzel posted:

I guess I'm generally just cranky about that guy.

On reflection, I’m coming around to your crankiness.
