Junior Jr.
Oct 4, 2014

by sebmojo
Buglord
If AMD and NVIDIA were really going to cater to bitcoin miners, they'd start looking into other crypto markets like Zcash.


Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Watermelon Daiquiri posted:

Even if they don't want to pony up for the ~100 wafers/month on a 12" 65-28nm process that would probably be the best option, they can always bundle their ASIC up with five or so other companies' chips and save even more

I'm kinda curious whether you could guesstimate how many chips per wafer that translates into, and what the actual price would be for a small run like that?

Apparently they are using Altera Arria V FPGAs, which puts them somewhere in the range of 75k-504k logic elements and up to 1,156 DSP blocks.

Bearing in mind I know nothing about how this actually works, wild-ass guess: a 12" (300mm) diameter wafer gives you about 70,686 mm² of area. Call it 25 mm² per chip and you'd end up with ~2,827 chips per wafer at 100% effective utilization; with yields and some wastage, call it 80% effective utilization, or ~2,262 chips per wafer.

No idea on their sales either, but I mean they have to sell like at least 10k per month of the things, so maybe 25k/50k/100k as a guess?

Dunno, it really seems like the math would support just making the ASICs and tossing the spares in a warehouse somewhere. Like I said, I bet they could drop the price and keep their margin. FPGAs are really expensive, like I bet that's at least a $50-75 part even in bulk, and I doubt an ASIC could be that much.
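If anyone wants to poke at the chips-per-wafer guesstimate, here's the arithmetic as a quick Python sketch. All the inputs are the wild-ass guesses from above, and the edge-loss correction is just the standard rule of thumb:

```python
import math

# All inputs are guesses: 300 mm wafer, 25 mm^2 die, 80% effective utilization.
WAFER_DIAMETER_MM = 300.0
DIE_AREA_MM2 = 25.0
EFFECTIVE_UTILIZATION = 0.80

wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2  # ~70,686 mm^2

# Naive upper bound: straight area division.
naive_dies = wafer_area / DIE_AREA_MM2  # ~2,827

# Rule-of-thumb correction for partial dies lost at the wafer edge:
#   dies ~= pi*(d/2)^2 / A - pi*d / sqrt(2*A)
gross_dies = naive_dies - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * DIE_AREA_MM2)

good_dies = gross_dies * EFFECTIVE_UTILIZATION
print(f"naive: {naive_dies:.0f}, gross: {gross_dies:.0f}, good: {good_dies:.0f} dies/wafer")
```

The edge-loss term knocks it down to roughly 2,150 good dies per wafer instead of ~2,260, but the order of magnitude holds either way.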

Paul MaudDib fucked around with this message at 21:16 on May 5, 2017

eames
May 9, 2009

AVeryLargeRadish posted:

Also Gsync has the greatest effect when your frame rate is lower, so that will help out a lot.

I know that, but I have no practical experience with G-Sync.
I'd probably be happier with a 1440p/100Hz IPS G-Sync monitor, since I won't be able to use the higher refresh rate, but such a monitor doesn't seem to exist, so it's kind of a moot point. :shrug: Just ordered the S2417DG; if I'm not happy with it, it shouldn't be too hard to resell at that price.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

eames posted:

Sanity check: Is a GTX1060-6GB and a 1440p/165Hz G-sync panel an ok combination?

I currently have a 9 year old NEC 27" 1080p IPS screen which I use for occasional gaming when I find the time (which isn't much lately).
The Dell S2417DG is currently on sale for ~$300 (converted from local EU pricing). 24" TN 1440p 165Hz and G-Sync.
My PC does about 100-120 FPS at 1440p in Overwatch. My plan is to upgrade to a GTX1170 when Volta is out and a 4k G-Sync HDR screen when they become mainstream.

It's on the low end as a general-purpose system in more demanding AAA titles (probably no more than 50 fps at ultra settings, 60-65 fps at medium settings on graphically intensive newer titles). But it'll do fine; it's not like 60 fps is literally unplayable, and esports titles run fast on almost anything.

At the end of the day, if you can afford it and it plays the games you want at the settings/framerates you want, then go hog wild. Most of this whole discussion comes down to differences of opinion on the games and settings and framerates we want. :v:

Zero VGS posted:

Also, to the people who say you don't need a curved monitor: in fairness, I had a flat 29" ultrawide before, and along the left/right edges of the screen the last millimeter or so sharply faded to black, I assume as a result of how the IPS layers are laminated... my 34" curved doesn't have that effect, so you do see right up to the edge.

It's not a dealbreaker or anything but that's a bit of benefit at least.

I'm guessing this wasn't IPS then. VA and TN panels both have really poor viewing angles. VA loses about half its brightness within 20 degrees of horizontal angle, which is why a lot of VA ultrawide monitors have fairly aggressive curves. For example, the IPS X34 is 3800R while the VA Z35 is 2000R, i.e. almost twice as much curve.

I really don't get why they don't have curved TN. A curve on a 27" IPS monitor is a worthless gimmick, whereas you can see color shift in the corners of a 27" TN monitor if you move your head a little bit. A little curve would go a long way to fix that.
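Back-of-the-envelope on why a curve would help: a quick sketch of the off-axis angle at the edge of a flat 27" panel (screen size and sitting distance are my assumptions):

```python
import math

# Assumptions: 27" 16:9 panel viewed head-on from 60 cm away.
DIAGONAL_IN = 27.0
VIEW_DISTANCE_CM = 60.0

width_in = DIAGONAL_IN * 16 / math.hypot(16, 9)  # ~23.5 in wide
half_width_cm = width_in * 2.54 / 2              # ~29.9 cm to each edge

# Horizontal off-axis angle from your eye to the edge of the panel.
edge_angle_deg = math.degrees(math.atan(half_width_cm / VIEW_DISTANCE_CM))
print(f"~{edge_angle_deg:.0f} degrees off-axis at the edges")  # ~26 degrees
```

Roughly 26 degrees off-axis at the edges is well into the range where TN starts shifting color, so even a gentle curve would go a long way.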

Junior Jr. posted:

If AMD and NVIDIA were really going to cater to bitcoin miners, they'd start looking into other crypto markets like Zcash.

We're talking about the G-Sync hardware module here: it uses an FPGA, and I'm wondering why they haven't just made an ASIC already, since the FPGA has got to be crazy expensive.

On the other hand, while I bitch about the cost, it works amazingly well. None of the flicker problems or sync-rate limitations of (some of) the FreeSync monitors.

Paul MaudDib fucked around with this message at 21:30 on May 5, 2017

PC LOAD LETTER
May 23, 2005
WTF?!

Watermelon Daiquiri posted:

It's not too much of a stretch that per-wafer costs went down a bit as the process matured between the 290X and Fury
Yeah, but they weren't clear about it, so it looks weird to me.

Watermelon Daiquiri posted:

And is that 'fanout' thing supposed to be using a normal wafer that gets processed? If so, isn't that what Intel or whoever it was is doing?
Fanout is more a part of the design than something to do with the wafer processing per se, AFAIK, so that would be on AMD here. But I don't work in a foundry.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
40 fps with G-Sync is way smoother than 60 fps with dips. I really like G-Sync in that I can crank up the settings without having to worry about slowdowns in certain areas of my game. It goes a long way towards being happy with my GPU and not feeling the upgrade itch. Why upgrade when I'm already running Ultra smoothly?

Go G-Sync and don't look back.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Paul MaudDib posted:

I'm guessing this wasn't IPS then.

No, it was IPS, but the extreme sides of the 29" ultrawide viewed head-on at normal sitting distance are screwy, right where they meet the bezel. It's like a slight vignetting that the curved one avoids. We're only talking the last 10 pixels on either side, but once you notice it you don't unsee it.

1gnoirents
Jun 28, 2014

hello :)

metallicaeg posted:

I can't roll my eyes hard enough at this. A 1060 will handle 1080p144 and is adequate for 1440p. A 1080 absolutely will perform well at 4k.

I'd agree that a 1060 level of performance would be adequate and certainly will "do," but it's definitely not ideal for those resolutions. While I've never had a 1060, I've had plenty of 970s, 980s, a 980 Ti, 1070, and 1080, and while it's impressive how well a 980/1060 will perform at 1440p, you are going to be sorely lacking in a lot of games. It will come down to what you play, hands down, and secondarily what you will put up with.

For the record, a 1080 is barely keeping me happy at 1440p/60Hz, so this is all very relative.

I'd pretty much say the same for 4K. A 1080 will do, and it's great for what it is, but it's not ideal at all. Nothing really is, but if there is any case for a 1080 Ti, it's if you have a 4K monitor: you will use every drop of that GPU and still dip down to uncomfortable levels every now and then. Again, this is very game dependent, but if you're interested in graphically demanding games then that's just the way it is. MOBAs and Overwatch excluded.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
Another Nvidia driver update and shit is still broken. I'm going to be stuck on 378.92 for a long time, aren't I?

Anime Schoolgirl
Nov 28, 2002

spasticColon posted:

Another Nvidia driver update and shit is still broken. I'm going to be stuck on 378.92 for a long time, aren't I?
This is the GeForce Experience: waiting to find out which updates are safe for your odd hardware/OS version special case :tipshat:

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Anyone know how to get the Frame Rate Limiter value set in Nvidia Profile Inspector to persist over restarts?

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

Anime Schoolgirl posted:

This is the GeForce Experience: waiting to find out which updates are safe for your odd hardware/OS version special case :tipshat:

All I have to say right now is fuck the Creators Update, because that's what broke everything.

redeyes
Sep 14, 2002

by Fluffdaddy
AMD is just fine under the Creators Update. Even got HDR enabled in Windows.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

redeyes posted:

AMD is just fine under the Creators Update. Even got HDR enabled in Windows.

When do the Vega cards come out? I'll just sell my 1070 and get a 1070-equivalent AMD Vega card. I'm just tired of shit being broken, and it looks like Nvidia's drivers are going to be broken for a long time.

spasticColon fucked around with this message at 02:47 on May 6, 2017

PC LOAD LETTER
May 23, 2005
WTF?!
AMD is saying Q2 for Vega, so I'm interpreting that as a very late June launch, or a paper launch with good supply in July.

Unless they've screwed something up big time, it should perform somewhere between a 1080 and a 1080 Ti.

I'm hoping they're willing to keep the price lower than Nvidia's for that level of performance, but I haven't seen any good leaks on pricing to indicate this.

redeyes
Sep 14, 2002

by Fluffdaddy

spasticColon posted:

When do the Vega cards come out? I'll just sell my 1070 and get a 1070-equivalent AMD Vega card. I'm just tired of shit being broken, and it looks like Nvidia's drivers are going to be broken for a long time.

AMD has their own set of problems... but it's at least stable.

1gnoirents
Jun 28, 2014

hello :)

spasticColon posted:

Another Nvidia driver update and shit is still broken. I'm going to be stuck on 378.92 for a long time, aren't I?

I missed it, what happened?

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

1gnoirents posted:

I missed it, what happened?

Full Dynamic Range still doesn't work for me, so colors look washed out on my TV. The colors look just fine on my PS4 though, so my TV isn't the culprit.

spasticColon fucked around with this message at 03:20 on May 6, 2017

redeyes
Sep 14, 2002

by Fluffdaddy

spasticColon posted:

Full Dynamic Range still doesn't work for me, so colors look washed out on my TV. The colors look just fine on my PS4 though, so my TV isn't the culprit.

That is the same with AMD. I have no idea how it's supposed to work. But it's there.

ufarn
May 30, 2009
One possibility is that your RGB Range setting on the TV is wrong. Video games require Full range, whereas movies and so on use Limited (often labeled Normal). Go check the advanced picture options or something and make sure.
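For the curious, the two settings are just different mappings of the same 8-bit signal; here's a rough sketch of the conversion (the function is made up for illustration):

```python
# Full-range RGB uses the entire 0-255 signal; limited ("Normal"/video) range
# squeezes it into 16-235. If one end of the chain sends limited while the
# other expects full, black sits at 16 instead of 0 and everything looks
# washed out, which is exactly the symptom described above.

def full_to_limited(value: int) -> int:
    """Map a full-range 0-255 value into the limited 16-235 range."""
    return round(16 + value * 219 / 255)

print(full_to_limited(0), full_to_limited(255))  # 16 235
```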

The Illusive Man
Mar 27, 2008

~savior of yoomanity~
1080 Ti owners: how's your experience with coil whine?

I picked up an MSI Gaming X model a week ago and the whine was really bad. Like, audible-in-another-room bad. I returned it to Amazon, and while I'm waiting on a refund I'm looking into other models. Obviously there's a bit of a lottery involved, but is any brand consistently better than others at avoiding the issue?

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Zero VGS posted:

Anyone know how to get the Frame Rate Limiter value set in Nvidia Profile Inspector to persist over restarts?

Edit: Figured it out... you have to make a fucking startup batch file, which will always throw a UAC prompt.

https://www.youtube.com/watch?v=lKgC3CdZYh0

That's one thing that AMD does better: a frame rate limiter that's actually built in.

Space Racist posted:

1080 Ti owners: how's your experience with coil whine?

I picked up an MSI Gaming X model a week ago and the whine was really bad. Like, audible-in-another-room bad. I returned it to Amazon, and while I'm waiting on a refund I'm looking into other models. Obviously there's a bit of a lottery involved, but is any brand consistently better than others at avoiding the issue?

I thought my PNY 1080 Ti FE had bad coil whine, but upon further review it seems to be coming entirely from my HDPlex 300W power supply. Might want to double-check that it's not your PSU making the whine, or at least contributing to it with how it interacts with the card.

Zero VGS fucked around with this message at 19:15 on May 6, 2017

Truga
May 4, 2014
Lipstick Apathy

Space Racist posted:

1080 Ti owners: how's your experience with coil whine?

I picked up an MSI Gaming X model a week ago and the whine was really bad. Like, audible-in-another-room bad. I returned it to Amazon, and while I'm waiting on a refund I'm looking into other models. Obviously there's a bit of a lottery involved, but is any brand consistently better than others at avoiding the issue?

Try over/underclocking it by like 50MHz or moving the voltage around by 0.02V or so. It might make it go away, or at least change the pitch to be the kind that isn't very audible.

My 980 Ti was terrible at stock frequency/voltage, but at around 1300MHz it's silent.

The Illusive Man
Mar 27, 2008

~savior of yoomanity~

Zero VGS posted:

I thought my PNY 1080 Ti FE had bad coil whine, but upon further review it seems to be coming entirely from my HDPlex 300W power supply. Might want to double-check that it's not your PSU making the whine, or at least contributing to it with how it interacts with the card.

Yeah, I already tried two different PSUs (albeit both were Seasonic), exact same whine in both cases.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Zero VGS posted:

300w power supply

:stonk:

The card pulls at least 275W all by itself... 350 or more if you bump the power limit up...

craig588
Nov 19, 2005

by Nyc_Tattoo
Whoooaaaa. Normally people are a little too enthusiastic about getting big power supplies, but yeah, the 1080 Ti is a 250-watt-minimum card all on its own. 300W is too low even if you're using one of those ultra-low-wattage Intel CPUs; the fans, drives, memory, and motherboard itself will easily push you over 300W. You want to replace that as soon as possible.

craig588 fucked around with this message at 20:32 on May 6, 2017

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?




EVGA just started some sort of Elite membership program. I guess they'll let you get in on the pre-orders and such. It's a free program but I guess you need to own either an EVGA video card or a few of their other products to join it. Seems pretty silly, though.

Paul MaudDib posted:

:stonk:

The card pulls at least 275W all by itself... 350 or more if you bump the power limit up...

That little PSU is capable of sustaining 350W, and has no problem with 400W peaks. The Not From Concentrate guy did a video on it on YouTube.

VulgarandStupid fucked around with this message at 20:39 on May 6, 2017

TheJeffers
Jan 31, 2007

VulgarandStupid posted:

That little PSU is capable of sustaining 350W, and has no problem with 400W peaks. The Not From Concentrate guy did a video on it on YouTube.

350W-360W is about what a 1080 Ti + 7700K system is going to draw under load, so even with that in mind...
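Roughly how that adds up from the rated TDPs (the rest-of-system figure is a guess):

```python
# Rated TDPs plus a guessed figure for everything else in the box.
GPU_TDP_W = 250        # GTX 1080 Ti reference board power
CPU_TDP_W = 91         # i7-7700K
REST_OF_SYSTEM_W = 30  # assumption: motherboard, RAM, SSD, fans

peak_dc_load_w = GPU_TDP_W + CPU_TDP_W + REST_OF_SYSTEM_W
print(f"~{peak_dc_load_w}W DC at peak")  # ~371W before raising any power limits
```

Summing TDPs slightly overshoots the measured 350-360W because the CPU and GPU rarely both peak at the same instant.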

GRINDCORE MEGGIDO
Feb 28, 1985


Running a PSU at 100% is not a good idea. Why spend all that and risk going over the rated wattage?

If it's temperature controlled, the fan in it is going to be racing.

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

VulgarandStupid posted:

EVGA just started some sort of Elite membership program. I guess they'll let you get in on the pre-orders and such. It's a free program but I guess you need to own either an EVGA video card or a few of their other products to join it. Seems pretty silly, though.


That little PSU is capable of sustaining 350W, and has no problem with 400W peaks. The Not From Concentrate guy did a video on it on YouTube.

The power supply is the only part of your PC that can brick the rest. A good rule of thumb I've always followed is to have 20% more capacity than your system needs at peak, so you aren't running it at capacity all the time (thus wearing out the only part that can kill your PC faster).
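The 20% rule as a one-liner, if you want to plug in your own numbers (the function is just for illustration):

```python
def recommended_psu_watts(peak_system_draw_w: float, headroom: float = 0.20) -> float:
    """Size the PSU so peak system draw sits ~20% below its rating."""
    return peak_system_draw_w * (1 + headroom)

# e.g. a ~370W 1080 Ti + 7700K build:
print(recommended_psu_watts(370))  # 444.0 -> shop for a 450-550W unit
```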

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?




GRINDCORE MEGGIDO posted:

Running a PSU at 100% is not a good idea. Why spend all that and risk going over the rated wattage?

If it's temperature controlled, the fan in it is going to be racing.

Because he can fit it in a 4-liter case. Also, it has no fan.

Ak Gara
Jul 29, 2005

That's just the way he rolls.
I always heard power supplies are at their max efficiency when running at 50%: 300W from a 600W power supply, 500W from a 1000W supply, etc.

Phuzun
Jul 4, 2007

tehinternet posted:

The power supply is the only part of your PC that can brick the rest. A good rule of thumb I've always followed is to have 20% more capacity than your system needs at peak, so you aren't running it at capacity all the time (thus wearing out the only part that can kill your PC faster).

Exactly this. The capacitors age no matter what, and you always want a good 15-20% buffer when you calculate what's in the system.

The efficiency is that 80 Plus (Bronze, Gold, Platinum, etc.) label that you see on them. Some hardware reviewers will test at various levels of draw to confirm those ratings. Getting a bigger, more efficient power supply is always a good idea if you plan to use it a while.

Phuzun fucked around with this message at 21:24 on May 6, 2017

eames
May 9, 2009

Ak Gara posted:

I always heard power supplies are at their max efficiency when running at 50%. 300w from a 600w power supply, 500w from a 1000w supply, etc.

Oh, this again. Yeah, that's a good rule, assuming your computer runs at 100% CPU and GPU load the second you turn it on.
Mine is on 24/7 and runs at full load perhaps 3-4% of the time.

I'm currently running a 4C/4T Haswell Xeon, 16GB DDR3L-ECC RAM, 3x spun up 8TB helium drives and a GTX1060 off a 290W Dell OEM PSU.

Extrapolating measured numbers from my current machine running Prime95 blend + Unigine Superposition Extreme, this system with a GTX1080 would peak (!) at 82.4% of the PSU's rated DC capacity.
That's calculated from power draw at the wall, assuming 83% efficiency at this load as listed in the PSU data sheet.
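In sketch form, that's just wall draw times efficiency measured against the rated DC output; the wall-draw number below is back-solved from my 82.4% figure, so treat it as illustrative:

```python
# Wall draw is back-solved from the 82.4% figure above; efficiency comes
# from the Dell PSU data sheet at this load.
WALL_DRAW_W = 288.0     # extrapolated wall draw, Prime95 blend + Superposition
EFFICIENCY = 0.83       # PSU efficiency at this load
PSU_RATED_DC_W = 290.0  # rated DC output

dc_load_w = WALL_DRAW_W * EFFICIENCY  # what the PSU actually has to deliver
print(f"{dc_load_w:.0f}W DC = {dc_load_w / PSU_RATED_DC_W:.1%} of the rating")  # ~239W = 82.4%
```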

Would I run a 1080 off this PSU? No, probably not without changing the PSU, but chances are it would work just fine.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

eames posted:

I'm currently running a 4C/4T Haswell Xeon, 16GB DDR3L-ECC RAM, 3x spun up 8TB helium drives and a GTX1060 off a 290W Dell OEM PSU.

Would I run a 1080 off this PSU? No, probably not without changing the PSU, but chances are it would work just fine.

I had an HP workstation with a "450W" power supply; I tried to use an R9 280 in it, and after 5 or 10 minutes of gaming the PSU would overheat and power off.

Anecdotes ahoy, but it's not always that simple, especially on PSUs where the output might be split across multiple rails.

Paul MaudDib fucked around with this message at 22:02 on May 6, 2017

craig588
Nov 19, 2005

by Nyc_Tattoo
I like to use 80% of the PSU's rating. I think, honestly, I made that up because the 80 Plus ratings have the number 80 in them, so 80% of the rating sounds like the same number, and that's good enough for me. 50% is the peak efficiency, but you'll (almost) never make up the extra cost of buying a bigger PSU compared to buying a cheaper lower-rated one. The only time I'd consider efficiencies or load ratings lower than 80% is if you're outfitting a whole office and need to hit certain power numbers.

Of course, I'm also someone who likes novelty more than money, and I spent over 200 dollars on one of the first Platinum-rated PSUs to hit the market. It's saving me dollars a year! I could have just bought a 100-dollar Gold- or Silver-rated one instead if I really cared about saving money. The difference between me running a Silver-rated PSU at near 100% versus running my dumb Platinum one at 50%, if I were to keep my PC on and fully stressed 24/7, is about 6 dollars a month; more realistically, based on my Steam game time tracking, and if we pretend that every game I play loads the PC 100%, it's probably closer to a dollar per month. For normal people: don't do what I did. New hardware is fun, but when the purpose of that new hardware is to save money and it ends up costing more money, it's a real hard case to make.

With aging, heat, and noise in mind, I never recommend speccing a PSU to exactly the peak load your hardware is rated for, but 50% just throws the cost/benefit away. For any 80 Plus-rated PSU, efficiency at 100% load is only 3% worse than at 50%, until you get to Titanium, where the gap grows to 4%. The difference between Platinum at 50% and Silver at 100% is still only 7%.
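If you want to sanity-check the dollar math, here's a rough sketch; load, duty cycle, and electricity price are all assumptions, so the exact figure will differ from mine:

```python
# Assumptions: 400W DC load, 80 Plus efficiencies at 115V, $0.12/kWh, 24/7.
DC_LOAD_W = 400.0
HOURS_PER_MONTH = 24 * 30
PRICE_PER_KWH = 0.12

SILVER_AT_FULL_LOAD = 0.85    # 80 Plus Silver spec at 100% load
PLATINUM_AT_HALF_LOAD = 0.92  # 80 Plus Platinum spec at 50% load

def monthly_cost_usd(efficiency: float) -> float:
    wall_watts = DC_LOAD_W / efficiency  # wall draw needed for the same DC load
    return wall_watts / 1000 * HOURS_PER_MONTH * PRICE_PER_KWH

diff = monthly_cost_usd(SILVER_AT_FULL_LOAD) - monthly_cost_usd(PLATINUM_AT_HALF_LOAD)
print(f"${diff:.2f}/month difference at 24/7 full load")  # ~$3
```

At a realistic duty cycle that shrinks to pocket change, which is the whole point.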

craig588 fucked around with this message at 22:43 on May 6, 2017

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!
I went overkill with my PSU because I got a good deal on it (EVGA SuperNOVA G2 850W for $80 after rebate); it came with a 10-year warranty, and it's got the sort of capacity I need if I ever decide to do something silly like SLI.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
A second Vega benchmark has hit the tower!

craig588
Nov 19, 2005

by Nyc_Tattoo

AVeryLargeRadish posted:

I went overkill with my PSU because I got a good deal on it (EVGA SuperNOVA G2 850W for $80 after rebate); it came with a 10-year warranty, and it's got the sort of capacity I need if I ever decide to do something silly like SLI.


Yeah, that's what makes it (almost) not worth it. With a good sale it can be worth it to step up the efficiency.


GRINDCORE MEGGIDO
Feb 28, 1985


VulgarandStupid posted:

Because he can fit it in a 4-liter case. Also, it has no fan.

Ohhh. I'm super keen on finding out how long that lasts powering that system.
