|
If AMD and NVIDIA were really going to cater to bitcoin miners, they'd start looking into other crypto markets like Zcash.
|
# ? May 5, 2017 20:54 |
|
|
|
Watermelon Daiquiri posted: Even if they don't want to pony up for the ~100 wafers/month in 12" 65-28nm that would probably be the best option, they can always bundle their ASIC up with 5 or so other companies' and save even more

I'm kinda curious if you could guesstimate how many chips per wafer that translates into? And what the actual price would be for a small run like that? Apparently they are using Altera Arria V FPGAs, which puts them somewhere in the range of 75-504k logic elements and 1156 DSP blocks.

Bearing in mind I know nothing about how this actually works, wild rear end guess: a 12" diameter wafer gives you about 72,966 mm^2 of area; call it 25 mm^2 per chip and you'd end up with roughly 2,918 chips per wafer at 100% effective utilization. With yields and some wastage, call it 80% effective utilization, or about 2,335 chips per wafer.

No idea on their sales either, but I mean they have to sell at least 10k per month of the things, so maybe 25k/50k/100k as a guess? Dunno, it really seems like the math would support just making the ASICs and tossing the spares in a warehouse somewhere. Like I said, I bet they could drop the price and keep their margin. FPGAs are really expensive, like I bet that's at least a $50-75 part even in bulk, and I doubt an ASIC could be that much. Paul MaudDib fucked around with this message at 21:16 on May 5, 2017 |
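For the curious, here's that napkin math as a quick script. A slightly better estimate than straight area division is the usual dies-per-wafer approximation, which knocks off the partial dies lost around the wafer edge; the 25 mm² die size and 80% yield are the wild guesses from the post above, and the nominal 300 mm wafer diameter stands in for 12 inches.

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2, yield_fraction=1.0):
    """Estimate good dies per wafer.

    Uses the common approximation that subtracts edge losses:
    dies = pi*d^2/(4*A) - pi*d/sqrt(2*A), then applies a yield factor.
    """
    d, a = wafer_diameter_mm, die_area_mm2
    gross = (math.pi * d**2) / (4 * a) - (math.pi * d) / math.sqrt(2 * a)
    return int(gross * yield_fraction)

# 300 mm wafer, 25 mm^2 die, 80% yield -- the guesses from the post
print(dies_per_wafer(300, 25))        # 2694 gross dies
print(dies_per_wafer(300, 25, 0.80))  # 2155 good dies
```

The edge-loss term is why this lands a few hundred dies below the plain area-divided-by-25 figure in the post.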
# ? May 5, 2017 21:12 |
|
AVeryLargeRadish posted: Also Gsync has the greatest effect when your frame rate is lower, so that will help out a lot.

I know that but have no practical experience with G-Sync. I'd probably be happier with a 1440p/100Hz IPS G-Sync monitor, as I won't be able to use the higher refresh rate, but since such a monitor doesn't seem to exist it's kind of a moot point. Just ordered the S2417DG; if I'm not happy with it, it shouldn't be too hard to resell at that price.
|
# ? May 5, 2017 21:19 |
|
eames posted: Sanity check: Is a GTX1060-6GB and a 1440p/165Hz G-sync panel an ok combination?

It's on the low end for more demanding AAA titles (probably no more than 50 fps at ultra settings, 60-65 fps at medium settings on graphically intensive newer titles). But it'll do fine; it's not like 60 fps is literally unplayable, and esports titles run super fast. At the end of the day, if you can afford it and it plays the games you want at the settings/framerates you want, then go hog wild. Most of this whole discussion comes down to differences of opinion on the games and settings and framerates we want.

Zero VGS posted: Also to the people who say you don't need a curved monitor, in fairness I had a 29" ultrawide before that was flat, and along the left/right edges of the screen at the last millimeter or so it sharply faded to black, I assume as a result of how the IPS layers are laminated... my 34" curved doesn't have that effect so you do see right up to the edge.

I'm guessing this wasn't IPS then. VA and TN panels both have really lovely viewing angles. VA loses about half its brightness at 20 degrees of horizontal angle, which is why a lot of VA ultrawide monitors have fairly aggressive curves. For example, the IPS X34 is 3800R while the VA Z35 is 2000R, i.e. almost twice as much curve. I really don't get why they don't make curved TN. A curve on a 27" IPS monitor is a worthless gimmick, whereas you can see color shift in the corners of a 27" TN monitor if you move your head a little bit; a little curve would go a long way to fix that.

Junior Jr. posted: If AMD and NVIDIA were really going to cater to bitcoin miners, they'd start looking into other crypto markets like Zcash.

We're talking about the G-Sync hardware module here; it uses an FPGA and I'm wondering why they haven't just made an ASIC already, since that's got to be crazy expensive. On the other hand, while I bitch about the cost, it works amazingly well. None of the flicker problems or sync rate limitations of (some of) the FreeSync monitors. Paul MaudDib fucked around with this message at 21:30 on May 5, 2017 |
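For what it's worth, the curve-vs-viewing-angle point is easy to sanity-check with a bit of geometry: at the edge of a flat panel your line of sight is well off the panel normal, and a curve aimed back at you cuts that angle down. A rough sketch, using round numbers (an ~800 mm wide ultrawide viewed from 800 mm) rather than any particular monitor's specs:

```python
import math

def edge_incidence_deg(width_mm, distance_mm, radius_mm=None):
    """Angle (deg) between your line of sight and the panel normal at the
    screen edge. radius_mm=None means a flat panel; otherwise the panel is
    an arc of that curvature radius (e.g. 2000 for a "2000R" monitor)."""
    half = width_mm / 2
    if radius_mm is None:
        return math.degrees(math.atan2(half, distance_mm))
    theta = half / radius_mm                      # arc angle out to the edge
    px, pz = radius_mm * math.sin(theta), radius_mm * (1 - math.cos(theta))
    vx, vz = -px, distance_mm - pz                # vector from edge to eye
    nx, nz = -math.sin(theta), math.cos(theta)    # panel normal at the edge
    dot = (vx * nx + vz * nz) / math.hypot(vx, vz)
    return math.degrees(math.acos(dot))

# ~800 mm wide panel viewed from 800 mm
print(round(edge_incidence_deg(800, 800), 1))        # flat:  ~26.6
print(round(edge_incidence_deg(800, 800, 3800), 1))  # 3800R: ~21.1
print(round(edge_incidence_deg(800, 800, 2000), 1))  # 2000R: ~16.1
```

So the 2000R curve roughly knocks 10 degrees off the edge incidence angle compared to flat, which is a big deal for a panel type that loses half its brightness at 20 degrees.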
# ? May 5, 2017 21:25 |
|
Watermelon Daiquiri posted: It's not too much of a stretch that per-wafer costs went down a bit as the process matured between 290X and Fury

Watermelon Daiquiri posted: And is that 'fanout' thing supposed to be using a normal wafer that gets processed? If so, isn't that what Intel or whoever it was is doing?
|
# ? May 5, 2017 21:33 |
|
40 fps G-Sync is way smoother than 60 fps with dips. I really like G-Sync in that I can crank up the settings without having to worry about slowdowns in certain areas of my game. It goes a long way towards being happy with my GPU and not feeling the upgrade itch. Why upgrade when I'm already running Ultra smoothly? Go G-Sync and don't look back.
|
# ? May 5, 2017 23:17 |
|
Paul MaudDib posted: I'm guessing this wasn't IPS then.

No, it was IPS, but the extreme sides of the 29" ultrawide viewed head-on at normal sitting distance are screwy, right where they meet the bezel. It's like a slight vignetting that the curved one avoids. We're only talking the last 10 pixels on either side, but once you notice it you don't unsee it.
|
# ? May 5, 2017 23:42 |
|
metallicaeg posted: I can't roll my eyes hard enough at this. A 1060 will handle 1080p144 and is adequate for 1440p. A 1080 absolutely will perform well at 4k.

I'd agree that a 1060 level of performance would be adequate and certainly will "do", but it's definitely not ideal for those resolutions. While I've never had a 1060, I've had plenty of 970s, 980s, a 980 Ti, a 1070, and a 1080, and while it's impressive how well a 980/1060 will perform at 1440p, you are going to be sorely lacking in a lot of games. It will come down to what you play, hands down, and secondarily what you will put up with. For the record, a 1080 is barely keeping me happy at 1440p/60Hz, so this is very relative. I'd pretty much say the same for 4K: a 1080 will do and it's great for what it is, but it's not ideal at all. Nothing really is, but if there is any case for a 1080 Ti it's if you have a 4K monitor; you will use every drop of that GPU and still dip down to uncomfortable levels every now and then. Again this is very game dependent, but if you're interested in graphically demanding games then that's just the way it is, MOBAs and Overwatch excluded.
|
# ? May 5, 2017 23:49 |
|
Another Nvidia driver update and poo poo is still broken. I'm going to be stuck on 378.92 for a long time aren't I?
|
# ? May 6, 2017 01:21 |
|
spasticColon posted:Another Nvidia driver update and poo poo is still broken. I'm going to be stuck on 378.92 for a long time aren't I?
|
# ? May 6, 2017 01:47 |
|
Anyone know how to get the Frame Rate Limiter value set in Nvidia Profile Inspector to persist over restarts?
|
# ? May 6, 2017 01:56 |
|
Anime Schoolgirl posted: This is the GeForce Experience, waiting on which updates are safe for your odd hardware/OS version special case

All I have to say right now is gently caress the Creators Update because that's what broke everything.
|
# ? May 6, 2017 02:17 |
|
AMD is just fine under the Creators Update. Even got HDR enabled in Windows.
|
# ? May 6, 2017 02:29 |
|
redeyes posted: AMD is just fine under the Creators Update. Even got HDR enabled in Windows.

When do the Vega cards come out? I'll just sell my 1070 and get a 1070-equivalent AMD Vega card. I'm just tired of poo poo being broken and it looks like Nvidia's drivers are going to be broken for a long time. spasticColon fucked around with this message at 02:47 on May 6, 2017 |
# ? May 6, 2017 02:41 |
|
AMD is saying Q2 for Vega, so I'm interpreting that as a very late June launch, or a paper launch with good supply in July. Unless they've screwed something up big time, it should perform somewhere between a 1080 and a 1080 Ti. I'm hoping they're willing to keep the price lower than Nvidia's for that level of performance, but I haven't seen any good leaks on pricing to indicate it.
|
# ? May 6, 2017 02:50 |
|
spasticColon posted: When do the Vega cards come out? I'll just sell my 1070 and get a 1070-equivalent AMD Vega card. I'm just tired of poo poo being broken and it looks like Nvidia's drivers are going to be broken for a long time.

AMD has their own set of problems... but it's at least stable.
|
# ? May 6, 2017 02:52 |
|
spasticColon posted: Another Nvidia driver update and poo poo is still broken. I'm going to be stuck on 378.92 for a long time aren't I?

I missed it, what happened?
|
# ? May 6, 2017 03:01 |
|
1gnoirents posted:I missed it, what happened ? Full Dynamic Range still doesn't work for me so colors look washed out on my TV. The colors look just fine on my PS4 though so my TV isn't the culprit. spasticColon fucked around with this message at 03:20 on May 6, 2017 |
# ? May 6, 2017 03:17 |
|
spasticColon posted: Full Dynamic Range still doesn't work for me so colors look washed out on my TV. The colors look just fine on my PS4 though so my TV isn't the culprit.

That is the same with AMD. I have no idea how it's supposed to work, but it's there.
|
# ? May 6, 2017 15:02 |
|
One possibility is that the RGB Range setting on the TV is wrong. PCs and game consoles typically output Full range, whereas movies and TV broadcasts use Limited (sometimes labeled Normal). Go check the advanced Picture options or something and make sure it matches what the source is sending.
|
# ? May 6, 2017 15:25 |
|
1080 Ti owners: how's your experience with coil whine? I picked up an MSI Gaming X model a week ago and the whine was really bad. Like, audible in another room bad. I returned it to Amazon, and while I'm waiting on a refund I'm looking into other models. Obviously there's a slight bit of lottery involved, but is any brand consistently better than others at avoiding the issue?
|
# ? May 6, 2017 19:05 |
|
Zero VGS posted: Anyone know how to get the Frame Rate Limiter value set in Nvidia Profile Inspector to persist over restarts?

Edit: Figured it out... you have to make a loving startup batch file which will always throw a UAC prompt. https://www.youtube.com/watch?v=lKgC3CdZYh0 That's one thing that AMD does better: a frame rate limiter that's actually built in.

Space Racist posted: 1080 Ti owners: how's your experience with coil whine?

I thought my PNY 1080 Ti FE had bad coil whine, but upon further review it seems to be coming entirely from my HDPlex 300W power supply. Might want to double-check that it's not your PSU making the whine, or at least contributing to it with how it interacts with the card. Zero VGS fucked around with this message at 19:15 on May 6, 2017 |
# ? May 6, 2017 19:12 |
|
Space Racist posted: 1080 Ti owners: how's your experience with coil whine?

Try over/underclocking it by like 50MHz or moving the voltage around by 0.02V or so. It might make it go away, or at least change the pitch to the kind that isn't very audible. My 980 Ti was terrible at stock frequency/voltage, but at around 1300MHz it's silent.
|
# ? May 6, 2017 19:15 |
|
Zero VGS posted: I thought my PNY 1080ti FE had bad coil whine, but upon further review it seems to be totally coming from my HDPlex 300w power supply. Might want to double check that it's not your PSU making the whine or at least contributing to it with how it interacts with the card.

Yeah, I already tried two different PSUs (albeit both were Seasonic); exact same whine in both cases.
|
# ? May 6, 2017 19:19 |
|
Zero VGS posted: 1080ti

Zero VGS posted: 300w power supply

The card pulls at least 275W all by itself... 350 or more if you bump the power limit up...
|
# ? May 6, 2017 20:13 |
|
Whoooaaaa. Normally people are a little too enthusiastic about getting big power supplies, but yeah, the 1080 ti is a 250 watt minimum card all on its own. 300 is too low even if you're using one of those ultra low wattage Intel CPUs, the fans, drives, memory, and motherboard itself are easily going to push you over 300. You want to replace that as soon as possible.
craig588 fucked around with this message at 20:32 on May 6, 2017 |
# ? May 6, 2017 20:30 |
|
EVGA just started some sort of Elite membership program. I guess they'll let you get in on the pre-orders and such. It's a free program, but I guess you need to own either an EVGA video card or a few of their other products to join it. Seems pretty silly, though.

Paul MaudDib posted: The card pulls at least 275W all by itself... 350 or more if you bump the power limit up...

That little PSU is capable of sustaining 350W, and has no problem with 400W peaks. The Not From Concentrate guy did a video on it on YouTube. VulgarandStupid fucked around with this message at 20:39 on May 6, 2017 |
# ? May 6, 2017 20:36 |
|
VulgarandStupid posted: That little PSU is capable of sustaining 350W, and has no problem with 400W peaks. The Not From Concentrate guy did a video on it on YouTube.

350W-360W is about what a 1080 Ti + 7700K system is going to draw under load, so even with that in mind...
|
# ? May 6, 2017 20:45 |
|
Running a PSU at 100% is not a good idea. Why spend all that and risk going over the rated wattage? If it's temperature controlled, the fan in it is going to be racing.
|
# ? May 6, 2017 20:46 |
|
VulgarandStupid posted: EVGA just started some sort of Elite membership program. I guess they'll let you get in on the pre-orders and such. It's a free program but I guess you need to own either an EVGA video card or a few of their other products to join it. Seems pretty silly, though.

The power supply is the only part of your PC that can brick the rest. A good rule of thumb I've always followed is to have 20% more capacity than your system needs at peak, so you aren't running it at capacity all the time (and wearing out the only part that can kill your PC faster).
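Putting rough numbers on that 20% rule of thumb for the 1080 Ti system under discussion (every figure below is a ballpark typical-draw guess, not a measurement):

```python
# Rough full-system load budget for a 1080 Ti + 7700K gaming box.
# All numbers are ballpark typical-gaming draws, not measurements.
budget_watts = {
    "GTX 1080 Ti": 230,  # 250 W board power limit; ~230 W typical in games
    "i7-7700K":     70,  # gaming load, well under its all-core maximum
    "motherboard":  25,
    "RAM":          10,
    "drives/fans":  15,
}

peak = sum(budget_watts.values())
recommended_psu = peak * 1.20  # the 20% headroom rule of thumb
print(peak, round(recommended_psu))  # 350 420
```

That 350 W total lines up with the 350-360 W system draw figure mentioned above, and the 20% buffer puts the sensible minimum somewhere around a 450 W unit, a long way from a 300 W fanless supply.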
|
# ? May 6, 2017 20:59 |
|
GRINDCORE MEGGIDO posted:Running a PSU at 100% is not a good idea. Why spend all that and risk going over the rated wattage? Because he can fit it in a 4 liter case. Also, it has no fan.
|
# ? May 6, 2017 21:12 |
|
I always heard power supplies are at their max efficiency when running at 50%. 300w from a 600w power supply, 500w from a 1000w supply, etc.
|
# ? May 6, 2017 21:22 |
|
tehinternet posted: The power supply is the only part of your PC that can brick the rest. A good rule of thumb I've always followed is to have 20% more than your system needs at peak so you aren't running it at capacity all the time (thus wearing out the only part that can kill your PC faster).

Exactly this. The capacitors age no matter what, and you always want a good 15-20% buffer when you calculate what's in the system. The efficiency is the 80 Plus Bronze/Gold/Platinum etc. label you see on them; some hardware reviewers will test at various levels of draw to confirm those ratings. Getting a bigger, more efficient power supply is always a good idea if you plan to use it a while. Phuzun fucked around with this message at 21:24 on May 6, 2017 |
# ? May 6, 2017 21:22 |
|
Ak Gara posted: I always heard power supplies are at their max efficiency when running at 50%. 300w from a 600w power supply, 500w from a 1000w supply, etc.

Oh this again. Yeah, that's a good rule assuming your computer runs at 100% CPU and GPU load the second you turn it on. Mine is on 24/7 and runs at full load perhaps 3-4% of the time. I'm currently running a 4C/4T Haswell Xeon, 16GB DDR3L-ECC RAM, 3x spun-up 8TB helium drives and a GTX 1060 off a 290W Dell OEM PSU. Extrapolating measured numbers from my current machine running Prime95 blend + Unigine Superposition Extreme, this system with a GTX 1080 would peak (!) at 82.4% of the PSU's rated DC capacity. That's calculated from power draw at the wall, assuming 83% efficiency at this load as listed in the PSU data sheet. Would I run a 1080 off this PSU? No, probably not without changing the PSU, but chances are it would work just fine.
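That extrapolation is straightforward to reproduce: wall draw times the PSU's efficiency at that load gives the DC-side load, and dividing by the rating gives utilization. A sketch (the 288 W wall figure is back-calculated to match the 82.4% in the post, not a number the post gives):

```python
def psu_utilization(wall_watts, efficiency, rated_watts):
    """DC-side PSU load as a fraction of its rated output.

    wall_watts:  draw measured at the outlet
    efficiency:  PSU efficiency at that load, from its data sheet
    rated_watts: the PSU's rated DC output
    """
    dc_load_watts = wall_watts * efficiency
    return dc_load_watts / rated_watts

# ~288 W at the wall, 83% efficient, 290 W rated Dell OEM PSU
print(round(psu_utilization(288, 0.83, 290) * 100, 1))  # 82.4 (%)
```

The efficiency term matters: a PSU rated for 290 W DC output is drawing well over 290 W from the wall when fully loaded, so a wall-meter reading alone overstates how hard the PSU is actually working.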
|
# ? May 6, 2017 21:52 |
|
eames posted: I'm currently running a 4C/4T Haswell Xeon, 16GB DDR3L-ECC RAM, 3x spun up 8TB helium drives and a GTX1060 off a 290W Dell OEM PSU.

I had an HP workstation with a "450W" power supply; I tried to use an R9 280 in it and after 5 or 10 minutes of gaming the PSU would overheat and power off. Anecdotes ahoy, but it's not always that simple, especially on PSUs where the output might be split into multiple rails. Paul MaudDib fucked around with this message at 22:02 on May 6, 2017 |
# ? May 6, 2017 22:00 |
|
I like to use 80% of the PSU's rating. I think, honestly, I made that up because the 80 Plus ratings have the number 80 in them, so 80% of the rating sounds like the same number and that's good enough for me.

50% is the peak efficiency point, but you'll (almost) never make up the extra cost of buying a bigger PSU compared to buying a cheaper, lower-rated one. The only time I'd consider efficiencies or load ratings below 80% is if you're outfitting a whole office and need to hit certain power numbers.

Of course, I'm also someone who likes novelty more than money, and I spent over 200 dollars on one of the first Platinum-rated PSUs to hit the market. It's saving me dollars a year! I could have just bought a 100 dollar Gold- or Silver-rated one instead if I really cared about saving money. The difference between me running a Silver-rated PSU at near 100% versus running my dumb Platinum one at 50%, if I were to keep my PC on and fully stressed 24/7, is about 6 dollars a month; more realistically, based on my Steam game time tracking and pretending every game I play loads the PC 100%, it's probably closer to a dollar per month. For normal people: don't do what I did. New hardware is fun, but when the purpose of that new hardware is to save money and it ends up costing more money, it's a real hard case to make.

With aging, heat, and noise in mind I never recommend speccing out a PSU to exactly the peak load your hardware is rated for, but 50% just kind of throws the cost/benefit away. The efficiency of any given 80 Plus tier is only about 3% worse at 100% load than at 50%, until you get to Titanium where it's 4%. The difference between Platinum at 50% and Silver at 100% is still only 7%. craig588 fucked around with this message at 22:43 on May 6, 2017 |
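To put the dollars-per-month point in concrete terms, here's a toy comparison of two efficiency tiers at the same component load. The 85%/92% efficiencies, 4 hours a day of full load, and $0.12/kWh electricity price are all assumptions for illustration:

```python
def yearly_cost_usd(dc_load_w, efficiency, hours_per_day, usd_per_kwh=0.12):
    """Yearly electricity cost of components drawing dc_load_w,
    fed through a PSU of the given efficiency."""
    wall_w = dc_load_w / efficiency
    return wall_w / 1000 * hours_per_day * 365 * usd_per_kwh

# 300 W of components, 4 h/day of full load: Silver-ish vs Platinum-ish
silver = yearly_cost_usd(300, 0.85, 4)
platinum = yearly_cost_usd(300, 0.92, 4)
print(round(silver - platinum, 2))  # a few dollars a year
```

At a few dollars a year of savings, a $100+ price premium for the higher efficiency tier basically never pays for itself, which is exactly the point of the post above.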
# ? May 6, 2017 22:38 |
I went overkill with my PSU because I got a good deal on it (EVGA Supernova G2 850W for $80 after rebate), it came with a 10 year warranty, and it's got the sort of capacity I need if I ever decide to do something silly like SLI.
|
|
# ? May 6, 2017 22:45 |
|
A second Vega benchmark has hit the tower!
|
# ? May 6, 2017 22:46 |
|
AVeryLargeRadish posted: I went overkill with my PSU because I got a good deal on it (EVGA Supernova G2 850W for $80 after rebate), it came with a 10 year warranty and it's got the sort of capacity I need if I ever decide to do something silly like SLI.

Yeah, that's what makes it (almost) not worth it. With a good sale it can be worth it to step up the efficiency.
|
# ? May 6, 2017 22:49 |
|
|
|
VulgarandStupid posted: Because he can fit it in a 4 liter case. Also, it has no fan.

Ohhh. I'm super keen on finding out how long that lasts powering that system.
|
# ? May 6, 2017 22:53 |