|
latinotwink1997 posted:Are ASICs just going to basically kill crypto altogether? till the next craze happens, then no more GPUs.
|
# ? Aug 26, 2018 15:03 |
|
latinotwink1997 posted:Are ASICs just going to basically kill crypto altogether? you need holy water and a steak to kill crypto
|
# ? Aug 26, 2018 16:54 |
|
wargames posted:till next craze happen, then no more gpus. I don’t see how any GPUs can compete with any ASIC
|
# ? Aug 26, 2018 16:56 |
|
Comfy Fleece Sweater posted:I don’t see how any Gpus can compete with any asic i can go to best buy and buy 16 gpus and no asics.
|
# ? Aug 26, 2018 19:42 |
|
Klyith posted:you need holy water and a steak to kill crypto sounds like a nice meal
|
# ? Aug 27, 2018 00:08 |
|
there was a scheme to buy a hydro plant in upstate New York, for bitcoin mining! it is of course insane
|
# ? Aug 27, 2018 00:14 |
|
latinotwink1997 posted:Are ASICs just going to basically kill crypto altogether? Pretty much. Anyone who claims their crypto is asic-proof or resistant is lying.
|
# ? Aug 27, 2018 02:10 |
|
wargames posted:i can go to best buy and buy 16 gpus and no asics. that just means that it's harder for a cryptocurrency to remain decentralized once it becomes valuable enough to be mined by ASIC hardware. Like, you're technically correct that you can buy GPUs right off the shelf, but once a cryptocurrency becomes valuable enough to make ASIC mining worthwhile, the economics of GPU mining no longer make sense
|
# ? Aug 27, 2018 03:31 |
|
QuarkJets posted:that just means that it's harder for a cryptocurrency to remain decentralized once it becomes valuable enough to be mined by ASIC hardware This. It just seems like you’ll end up with a couple Chinese ASIC farms controlling a majority of the currency, making it no longer decentralized. They manipulate it how they choose, meaning the benefits of crypto are gone, and suddenly it’s just dead. No one uses it, the farms die out, and we’re back where we started. And honestly, I long to see that day come.
|
# ? Aug 27, 2018 04:17 |
|
latinotwink1997 posted:This. It just seems like you'll end up with a couple Chinese ASIC farms controlling a majority of the currency, making it no longer decentralized. They manipulate it how they choose meaning the benefits of crypto is gone and suddenly its just dead. No one uses it, the farms die out and were back where we started. This is good for bitcoin.
|
# ? Aug 27, 2018 05:00 |
|
Palladium posted:This is good for bitcoin. The dawning realization that it is neither decentralized nor secure enough to warrant its enormous cost, then being put out of its misery? Sure!
|
# ? Aug 27, 2018 06:20 |
|
Stickman posted:The dawning realization that it is neither decentralized nor secure enough to warrant its enormous cost, then being put out of its misery? Sure! this has been the case since 2014, but bitcoin is still here. it turns out bitcoin completely refutes the Efficient Market Hypothesis
|
# ? Aug 27, 2018 09:54 |
|
divabot posted:this has been the case since 2014, but bitcoin is here it's from march but I just came across this wonderful tale in which a crypto conference is held and a cannabis company is hired to cater, which doesn't adequately announce the fact that the entire menu is spiked with THC. predictable, wonderful finger-pointing ensues. i love everything about it.
|
# ? Aug 27, 2018 18:58 |
|
divabot posted:this has been the case since 2014, but bitcoin is here I keep holding out hope that if enough people get screwed by exchanges and enough real money starts shifting out of the US, we'll actually see some regulation. But America
|
# ? Aug 27, 2018 22:55 |
|
latinotwink1997 posted:This. It just seems like you’ll end up with a couple Chinese ASIC farms controlling a majority of the currency, making it no longer decentralized. They manipulate it how they choose meaning the benefits of crypto is gone and suddenly it’s just dead. No one uses it, the farms die out and we’re back where we started. Where we started is inventing random shitcoins and seeing which one takes off. So, yeah, you're right: someone will develop a new coin that is moderately resistant to existing ASICs, and the cycle will begin anew.
|
# ? Aug 28, 2018 01:33 |
|
Interesting. Crypto influencing a major chip maker? Or am I reading this wrong? Granted, 7nm is probably too expensive for them. quote:GlobalFoundries Reshapes Technology Portfolio to Intensify Focus on Growing Demand for Differentiated Offerings
|
# ? Aug 28, 2018 12:07 |
|
We're hitting a hard wall with per-core process improvements and in the absence of that you run custom silicon to accelerate the bottlenecks in your workload. This is an inevitable consequence of the world hitting the limits of silicon die-shrinks and GF is following the money.
|
# ? Aug 28, 2018 13:30 |
|
Lube banjo posted:Interesting. Crypto influencing a major chip maker? or am I reading this wrong you're reading it wrong. ASIC just means application-specific integrated circuit, and could be anything. Most chips in a phone are ASICs.
|
# ? Aug 28, 2018 13:47 |
|
BangersInMyKnickers posted:We're hitting a hard wall with per-core process improvements and in the absence of that you run custom silicon to accelerate the bottlenecks in your workload. This is an inevitable consequence of the world hitting the limits of silicon die-shrinks and GF is following the money. It's a little more complicated than that. GF could choose to chase 7nm/10nm like Intel, Samsung, and TSMC, but the huge capital costs to develop the process need to be amortized over a large number of wafers, and GF didn't want to pay to expand their facilities to the extent that would be required for that. Custom silicon is well and good, but there were process improvements that GF could have made that they chose not to. The article says that their investors are getting antsy with GF continuing to lose money and wanted to go for profitability today rather than another big capital outlay trying to stay competitive with the big dogs.

Of course, there are many tiny fabs out there that did the same thing; you just don't know them because nobody cares about them anymore, and if GF goes down that road then eventually they'll fade out too.

Charlie Demerjian was all-in on the idea that AMD was going to multi-source Zen2, GF made changes to their anticipated 7nm process that put their design rules more in line with TSMC 7SOC, and then all of a sudden AMD announced they were single-sourcing from TSMC and GF announced they weren't doing 7nm. It's not clear which came first, because GF could definitely get cold feet if their lead customer pulled out, but AMD could also have pulled out because GF wasn't serious about building out enough capacity to get costs down sufficiently.
|
# ? Aug 28, 2018 13:53 |
|
GF really needed to get into the 7nm game, to be honest. If they could just get there and stay there for 8+ years, that would have done them wonders, because going past 7nm is going to be basically impossible without EUV.
|
# ? Aug 28, 2018 19:46 |
|
wargames posted:GF really needed to get into the 7nm game to be honest, and just if they could get there and stay there for 8+ years that would have done them wonders becuase going past 7nm is going to be basically impossible without EUV. They may just roll it out after TSMC/Intel/Samsung get it up and running, via the tried and true process of 'hire people from those places, make them in charge of unfucking your product'. Also known as Industrial Cross-Pollination.
|
# ? Aug 28, 2018 20:33 |
|
Methylethylaldehyde posted:They may just roll it out after TSMC/Intel/Samsung get it up and running, via the tried and try process of 'hire people from those places, make them in charge of unfucking your product'. Also known as Industrial Cross-Pollination. I mean, the gloflo 14nm does come from Samsung.
|
# ? Aug 29, 2018 04:31 |
|
Hey goons, I'm writing an article on other potential uses for all the leftover processing power that's been built up over the past few years. Like rendering or AI computing, for example. Are there any ex-miners here who'd be keen to answer some questions? Ideally I'm looking for someone who knows their poo poo and invested in a decent setup.
|
# ? Sep 19, 2018 02:58 |
|
Disco De Soto posted:Hey goons, I'm writing an article on other potential uses for all the leftover processing power that's been built up over the past few years. Like rendering or AI computing, for example. I'm a professional HPC person (computational physics and artificial intelligence) who has also paid a lot of attention to cryptocurrency mining for like... years and years, since at least the days when CPU bitcoin mining was the norm. I remember the response from Satoshi when the first CUDA bitcoin miner was released (he asked that people not use it, because that would violate the spirit of the bitcoin experiment). Most miners won't be able to adequately answer your questions, because they have no experience in those fields.

The GPUs used for cryptocurrency mining are reusable for many tasks, because they're GPUs; lots of HPC is done on GPUs these days, and gamers have always maintained a large second-hand market for used video cards. Plenty of academics and hobbyists use GTX 1080 Tis for machine learning, for instance, because they're cheap and pretty powerful. There are some technical reasons why someone might prefer, say, a Quadro or a Tesla card over a gaming card (example: the GTX cards do not have ECC VRAM, meaning they are less reliable if you need your results to be very accurate; GTX cards also have poor performance for double-precision computation, if that's something that you need). And the window for transitioning hardware to the HPC realm is also slowly closing; the RTX 2080 is shipping to consumers this week, and this time next year it'll be very hard to find an HPC person willing to buy the previous generation of hardware (plenty of gamers will still be willing to buy them, though).

There was some AI graduate student researcher who was trying to build a network for people to sell their GPU power to researchers instead of to cryptocurrency miners (e.g. NiceHash but for neural networks), but IIRC he was doing it in his free time and I don't think it ever got off the ground, and he was never able to solve the honesty problem (e.g. how do you prove that the "computational miner" actually performed the work that you gave them when your outputs are not easily verifiable?)

The huge number of ASIC processors (for BTC, LTC, ETH, etc) are completely useless for other tasks. Those are going to wind up in a landfill. They are custom-made to mine bitcoins/litecoins/any other cryptocurrency using the same hashing algorithm; they can accomplish no other tasks.

Do you have any other questions? QuarkJets fucked around with this message at 10:28 on Sep 19, 2018 |
# ? Sep 19, 2018 07:55 |
|
QuarkJets posted:*awesome poo poo* Thanks, this was very insightful. I do have some other questions - I'll PM you so I don't hijack the thread.
|
# ? Sep 19, 2018 11:26 |
|
Disco De Soto posted:Thanks, this was very insightful. I wouldn’t mind reading your questions and what Quark has to say. I love effortposts. It’s a problem, sorry.
|
# ? Sep 19, 2018 14:35 |
|
tehinternet posted:I wouldn’t mind reading your questions and what Quark has to say. Yeah, you might as well. I mean, honestly, what else is gonna go on in this thread? "Hey, is GPU mining still dead?" "Yup" "Oh, well then should I spend $10k on ASICs?" "Nope, market's fuckin' hosed, son." "Oh, "
|
# ? Sep 19, 2018 14:54 |
|
DrDork posted:"Oh, well then should I spend $10k on ASICs?" Is there a secondhand market for ASICs? Asking for a friend. E: I guess I should say third-hand because they're already second-hand when they get to the original buyers lmao
|
# ? Sep 19, 2018 15:05 |
|
Shrimp or Shrimps posted:Is there a secondhand market for ASICs? Asking for a friend. Someone is always looking for landfill or to melt stuff down, I suppose.
|
# ? Sep 19, 2018 20:55 |
|
Shrimp or Shrimps posted:Is there a secondhand market for ASICs? Asking for a friend. Sure, there will always be people who think that cryptocurrency mining is bound to make a resurgence or that cryptocurrency X is about to shoot to the moon (any day now!)
|
# ? Sep 19, 2018 22:00 |
|
Some more effortly details

- All of the people with GPU mining rigs would probably be really happy to have another way to use their hardware to make money, like what that grad student I mentioned was trying to do. All that matters to them is that they be paid for running their hardware. Most of the computational hardware is in the hands of non-experts, so you just need to make the user interface as simple as possible. NiceHash did a great job of this; "click a button and start slowly making money from your idle hardware" is an easy to understand paradigm and would bring a lot of people on board. In fact, more generalized GPU computing would probably bring in more people than cryptocurrency mining ever did, because it would lack the natural skeeviness attached to cryptocurrency, and people contribute idle resources for free to all kinds of good causes (SETI@Home, BOINC, etc.)
- Cryptocurrency miners are not actively seeking out these opportunities but would surely embrace them if they advertised themselves.
- Amazon is practically printing money because AWS offers a lot of computational power to anyone who wants to pay for it. AWS is highly generalized (e.g. not just for heavy computation), and there's certainly room for a lower-cost alternative focused on heavy computation.
- I can't emphasize the need for an intuitive user interface enough.
- The subset of tasks I mentioned that aren't well-suited to a GTX 1080 Ti are definitely in the minority. Machine learning is the vast majority of computational effort right now; it's a big hot topic in HPC and it doesn't need any of the bells and whistles offered by the premium cards. Caveat: the more premium cards have tensor cores that are extremely well-optimized for machine learning tasks, and they're way better at the half-precision computations that machine learning algorithms frequently perform. A GTX 1080 Ti will never be superior to a V100 for machine learning, but if compute time with ten GTX 1080 Tis is cheaper than compute time on one V100, then most professionals will go with the 1080 Tis.
- Corporations are risk-averse and are going to want assurances that you're not sending their data to their competitors.
- Solving the honesty problem means being able to easily verify computational outputs. This is easy by design in cryptocurrency: figuring out the correct nonce for the next block is hard, but then verifying that the nonce is correct is easy. This is a lot harder for general computational problems. Say that I need to convolve one billion matrices with one billion other matrices; the only way to verify those outputs is to do the difficult computation yourself and check the answer. You can't solve this issue but you can mitigate it; other providers on the network can perform the verification, and you could wrap that into the cost offered to people looking for computational power (e.g. a user could ask that N% of the outputs be independently verified X times by Y independent providers; N could be 100 for renders, and maybe this degree of redundant computation is still cost effective because you don't have to purchase and maintain your own hardware)

QuarkJets fucked around with this message at 22:21 on Sep 19, 2018 |
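The nonce asymmetry described there is easy to show concretely. This is a toy sketch only: the function names are illustrative, and the leading-zero-hex-digits "difficulty" is a simplification of how real coins define their target.

```python
import hashlib


def find_nonce(block: bytes, difficulty: int) -> int:
    """The hard part: brute-force a nonce whose SHA-256 digest
    starts with `difficulty` zero hex digits (toy difficulty rule)."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1


def verify_nonce(block: bytes, nonce: int, difficulty: int) -> bool:
    """The easy part: checking a claimed nonce costs exactly one hash."""
    digest = hashlib.sha256(block + str(nonce).encode()).hexdigest()
    return digest.startswith("0" * difficulty)
```

There's no equivalent one-hash check for a billion matrix convolutions, which is why the fallback is having other providers recompute a sample of the outputs.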
# ? Sep 19, 2018 22:18 |
|
QuarkJets posted:- Solving the honesty problem means being able to easily verify computational outputs. This is easy by design in cryptocurrency, figuring out the correct nonce for the next block is hard but then verifying that the nonce is correct is easy. This is lot harder for computational problems. Say that I need to convolve one billion matrices with one billion other matrices; the only way to verify those outputs is to do the difficult computation yourself and check the answer. You can't solve this issue but you can mitigate it; other providers on the network can perform the verification, and you could wrap that into the cost offered to people looking for computational power (e.g. a user could ask that N% of the outputs be independently verified X times by Y independent providers; N could be 100 for renders, and maybe this degree of redundant computation is still cost effective because you don't have to purchase and maintain your own hardware) It seems like you could also use redundant computes to weed out bad actors - X% failures and your reimbursement rate takes a hit, Y% and you lose your account (perhaps even banning the GPU device ID, though you'd want some sort of amnesty for second-hand purchases).
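That tiered-penalty idea can be sketched directly. The threshold numbers below are invented purely for illustration, not taken from any real service:

```python
def assess_provider(failures: int, checks: int,
                    warn_rate: float = 0.02, ban_rate: float = 0.10) -> str:
    """Map a provider's spot-check failure rate to an outcome.
    warn_rate and ban_rate are made-up illustrative thresholds."""
    if checks == 0:
        return "ok"  # no evidence yet, nothing to penalize
    rate = failures / checks
    if rate >= ban_rate:
        return "banned"          # lose your account
    if rate >= warn_rate:
        return "reduced-payout"  # reimbursement rate takes a hit
    return "ok"
```

A real system would also need the redundancy to be independent, as discussed below, since colluding providers could pass each other's checks.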
|
# ? Sep 19, 2018 23:01 |
|
QuarkJets posted:Some more effortly details What Quarkjets is trying to say here is that you need a Blockchain to solve these problems
|
# ? Sep 20, 2018 01:58 |
|
Stickman posted:It seems like you could also use redundant computes to weed out bad actors - X% failures and your reinbursement rate takes a hit, Y% and you lose your account (perhaps even banning the GPU device ID, though you'd want some sort amnesty for second-hand purchases). Yeah, and you need a way of ensuring that your redundancy is independent. And there's also the matter of padding your hours. Say that provider X takes 4 hours to process some data and provider Y takes 4.5 hours. Did provider Y pad their hours for greater payout? Hard to say. Or if you pay based on availability instead of wall time, how do you detect fake "outages"? These are tough problems but I doubt they're insurmountable.
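One naive way to spot hour-padding under that redundancy scheme is to compare each provider's reported runtime against the median report for the same work unit. The 1.5x tolerance here is an arbitrary illustrative number, and heterogeneous hardware would make a real version much messier:

```python
from statistics import median


def flag_padders(reported_hours: dict[str, float],
                 tolerance: float = 1.5) -> list[str]:
    """Return providers whose reported runtime for the same work unit
    exceeds `tolerance` times the median of all reports.
    The tolerance value is an arbitrary illustrative choice."""
    baseline = median(reported_hours.values())
    return [name for name, hours in reported_hours.items()
            if hours > tolerance * baseline]
```

Under this rule, the 4-hour vs 4.5-hour example above flags nobody, which matches the "hard to say" problem: small padding hides inside normal hardware variance.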
|
# ? Sep 20, 2018 02:18 |
|
QuarkJets posted:Yeah and you need a way of ensuring that your redundancy is independent. I assume you'd probably pay by the computation rather than by the hour in order to promote efficient computation (like *coin mining, but with a more useful outcome). If there are deadlines, you could add a bit extra for priority processing (i.e., the data cruncher would see the deadline for their computation component and receive a bonus if they dedicate enough resources to meet the deadline), or if the problem supports it, simply distribute small enough chunks that you can cut off processing at a certain time. I see your point, though - some problems might not quite be so granular, so there might need to be some additional incentives to meet availability targets.
|
# ? Sep 20, 2018 02:26 |
|
I was reading about bitcoin energy consumption and how proof-of-stake algorithms use way less energy compared to the proof-of-work algorithm that bitcoin currently uses. Do you think it's likely that bitcoin could switch? I was visiting the Auckland Bioengineering Institute today and talking to a PhD student who's working on modeling the pulmonary valve that connects the heart and lungs. They have an HPC setup he sometimes uses to process his work. Seems a much better use for all that processing power than mining bitcoin. Red Rox fucked around with this message at 03:53 on Sep 20, 2018 |
# ? Sep 20, 2018 03:43 |
|
Disco De Soto posted:I was reading bitcoin energy consumption and how proof-of-stake algorithms use way less energy compared to proof-of-work algorithms that bitcoin currently uses. Do you think it's likely that bitcoin could switch? Proof of stake is just getting rid of any pretense that this isn't just a straight ponzi scheme.
|
# ? Sep 20, 2018 16:57 |
|
Proof of steak heh heh heh 🥩
|
# ? Sep 20, 2018 17:23 |
|
Stickman posted:I assume you'd probably pay by the computation rather than hour in order to promote efficient computation (like *coin mining, but with a more useful outcome). If there are deadlines, you could add a bit extra for priority processing (i.e., the data cruncher would see the deadline for their computation component and receive a bonus if they dedicate enough resources to meet the deadline), or if the problem supports it, simply distribute small enough chunks that you can cut off processing at a certain time. I see your point, though - some problems might not quite be so granular, so there might need to be some additional incentives to meet availability targets. Normally supercomputers charge by computational time, which captures what you described. But it's very easy to spoof that to be whatever you want, if you exclusively own the hardware.
|
# ? Sep 21, 2018 01:30 |
|
|
Disco De Soto posted:I was reading bitcoin energy consumption and how proof-of-stake algorithms use way less energy compared to proof-of-work algorithms that bitcoin currently uses. Do you think it's likely that bitcoin could switch? no, bitcoin will never ever switch to proof of stake. Other cryptocurrencies may, some day, if someone can ever figure out a design that doesn't just turn into proof of work with extra steps
|
# ? Sep 21, 2018 01:48 |