Ignoarints
Nov 26, 2010

Zero VGS posted:

I just explained it. You sign up for the Steam In-Home Streaming beta and run it on a desktop with any card (though the 750 Ti is supposed to have the best streaming hardware, and is what I currently own). Then you nab a decent old laptop (I think even one without discrete graphics is fine), put it on a fast wireless network or an extremely fast VPN, and run Steam on that too; it'll let you launch a game off your desktop (locking use of that PC while you're doing so).

Oh I didn't even know that was possible, my brain just sort of glazed over those words.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!
So, which one of us will be the first crazy bastard to set up a VT-d LAN party in a box? 1 server, 1 or 2 GRIDs, and 16 thin clients.

:getin:

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

deimos posted:

So, which one of us will be the first crazy bastard to set up a VT-d LAN party in a box? 1 server, 1 or 2 GRIDs, and 16 thin clients.

The stupid thing is there's no way you can get that to be any cheaper or more effective than 16 fat clients. An Nvidia GRID is $2000-$3000 on eBay and only has a couple of passively cooled Keplers that probably can't handle any 3D-intensive games.

Just for general thin-client computing, I set up my work with a single virtualized server and 100 non-branded Windows CE thin clients I got for $30 shipped each, and they work great. But for brand-name thin clients (anything that could work with GRID) like Dell Wyse and HP, everyone wants $500 each, which is loving laughable.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

Zero VGS posted:

The stupid thing is there's no way you can get that to be any cheaper or more effective than 16 fat clients. An Nvidia GRID is $2000-$3000 on eBay and only has a couple of passively cooled Keplers that probably can't handle any 3D-intensive games.

Just for general thin-client computing, I set up my work with a single virtualized server and 100 non-branded Windows CE thin clients I got for $30 shipped each, and they work great. But for brand-name thin clients (anything that could work with GRID) like Dell Wyse and HP, everyone wants $500 each, which is loving laughable.

No, I meant run X virtual machines with a chunk of GRID each (I was mostly kidding about GRID, it can be multiple video cards) and Steam stream to thin clients.

(Our work VDI servers have quad K1s, 24 cores, and half a TB of RAM; RAM is by far our most precious resource.)

deimos fucked around with this message at 23:04 on Mar 13, 2014

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Zero VGS posted:

The lowest bandwidth setting I see in Steam Streaming is 3Mbit and SpeedTest.net is giving me 4Mbit upload. Plus the 750 Ti isn't going to be a weak card when the laptop it's streaming to is 1600x900 native; 1080p has 44% more pixels than that.

Again, no harm in seeing what happens, I'll report back.

I'm betting it will be slow to unworkable, but with Google Fiber on the horizon in my town and hopefully in other places, this could be of interest.
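
For what it's worth, the raw pixel arithmetic behind that resolution claim is a one-liner to check (a quick Python sanity pass, nothing more):

code:
# Pixel counts for the two resolutions being compared.
laptop = 1600 * 900     # 1,440,000 pixels
full_hd = 1920 * 1080   # 2,073,600 pixels

print(f"1080p / 1600x900 = {full_hd / laptop:.2f}x")                     # 1.44x
print(f"1600x900 has {(1 - laptop / full_hd) * 100:.0f}% fewer pixels")  # ~31%

So 1080p has 44% more pixels than the laptop panel; flipped around, 1600x900 is about 31% fewer, not 44%.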

Star War Sex Parrot
Oct 2, 2003

Zero VGS posted:

The lowest bandwidth setting I see in Steam Streaming is 3Mbit and SpeedTest.net is giving me 4Mbit upload. Plus the 750 Ti isn't going to be a weak card when the laptop it's streaming to is 1600x900 native; 1080p has 44% more pixels than that.

Again, no harm in seeing what happens, I'll report back.
Bandwidth won't make it unplayable -- latency will.
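
To put rough numbers on that, here's an illustrative latency budget. Every figure below is a hypothetical guess, not a measurement; real numbers swing wildly with hardware, encoder settings, and network conditions:

code:
# Hypothetical per-frame pipeline costs for game streaming, in milliseconds.
capture_encode = 10.0   # game capture + H.264 encode on the host (guess)
decode_display = 13.0   # decode + compositing/display on the client (guess)
lan_one_way = 2.0       # one wireless-LAN hop (guess)
wan_one_way = 30.0      # one internet hop, i.e. streaming over broadband (guess)

print(f"LAN: ~{capture_encode + decode_display + lan_one_way:.0f} ms added input lag")
print(f"WAN: ~{capture_encode + decode_display + wan_one_way:.0f} ms added input lag")

Even with bandwidth to spare, the internet hop alone can roughly double the added lag, before counting jitter or retransmits on a lossy link.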

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Jack Trades posted:

I'm using a GeForce GTX 560 with Win7 x64. I already tried updating DirectX and doing a clean reinstall of all Nvidia drivers between versions 314.21 and 335.23.
Except for a clean OS reinstall, I don't have any more ideas on what might fix that.
Uninstall the nVidia drivers, reboot, run Display Driver Uninstaller to remove the remnants, reboot, then install the latest drivers (make sure GeForce Experience does not get installed). Sometimes a driver-cleaning program is necessary to get rid of outstanding issues. If that doesn't help, your card may just be dying - assuming you're sure those games don't exhibit MSAA problems on other systems. For example, CoD: Blops 2 had a broken MSAA resolve filter at launch that looked like your second screenshot.

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

Pimpmust posted:

The 460 is still hosed even with the latest driver and will randomly hang up the computer. Works fine with the old 314 driver :iiam:

My GTX 460 was hosed too on anything after 314.22, until I disabled hardware acceleration in Firefox. That fixed all my crashes and "display driver has stopped responding" crap!

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


^ ^ ^ The driver as installed has not given me trouble in Firefox since 332.21. The only reason I don't claim it as a cure-all is that I can't account for everyone else's use cases. In addition, because of how Windows renders interface graphics, turning off hardware acceleration in browsers does not ensure that your system will not experience TDRs or hard-locks. For these reasons, and while it's nice that it paid off for you, please do not propose workarounds - especially year-old workarounds that drastically decrease application performance and are not a guaranteed fix - as though they were proper solutions.

quite a few Fermi-users actually posted:

[some variant of "I installed newer drivers that claim the TDRs and hardlocks are gone but they're still giving me problems!"]

The problem is that nVidia (and AMD) consumer GPU drivers are incredibly messy, and even DDU isn't guaranteed to remove some of the misconfiguration left by Fermi drivers after 314.22 and before... I want to say 332.21. Also, some bundled software is still built with Fermi cards as not even an afterthought, as Alereon mentioned. The good news is that if nothing you run ever hits the misconfiguration, it's not an emergency. With that in mind:

First step: Extract the contents of the driver package into a folder. Leave all the files in the root, but remove all the folders except Display.Driver, HDAudio, NVI2, and PhysX. Then run setup from the extracted folder.

If you're wondering, HDAudio is the HDMI audio driver for your card and NVI2 is the installer's own operating files. The rest is stuff you either know you need and can feel free to put back (3D Vision, LED visualization, the .NET 4.0 runtime (you probably already have this), Optimus), know you can't use (Shadowplay, NvVAD (the Shield streaming audio driver), and GeForce Experience), or that's outdated (nView). The Update components cover things you probably do manually anyway and, given nVidia's track record, probably don't want to be bugged about.
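
If you'd rather script that pruning step, here's a minimal sketch - Python, assuming the package is already extracted (with 7-Zip or similar) to the hypothetical folder below; double-check the folder names against your own package:

code:
import shutil
import subprocess
from pathlib import Path

# Folder the driver package was extracted into (hypothetical path).
driver_dir = Path(r"C:\NVIDIA\extracted")

# Components to keep, per the advice above.
KEEP = {"Display.Driver", "HDAudio", "NVI2", "PhysX"}

# Delete every top-level folder except the keepers; root files stay put.
for entry in driver_dir.iterdir():
    if entry.is_dir() and entry.name not in KEEP:
        print(f"removing {entry.name}")
        shutil.rmtree(entry)

# Then run the installer from the pruned folder.
subprocess.run([str(driver_dir / "setup.exe")], check=True)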

If that doesn't work: Backing up a system image* (I would do a separate documents and AppData backup too) and reinstalling Windows plus the current nVidia drivers is probably cheaper, both money- and time-wise, than jumping straight to "the drivers killed my card".

*I shouldn't have to explain why this part is important.

dont be mean to me fucked around with this message at 00:19 on Mar 15, 2014

90s Solo Cup
Feb 22, 2011

To understand the cup
He must become the cup



HalloKitty posted:

MSRP is $299.

CamelCamelCamel is a great thing for finding old prices, and I can find 280Xs that were ~$300.

vvv Most definitely, the 280X compares most closely (basically totally equal) to a 770. If it were still available at MSRP all day long, it would be a no-brainer.

Still kicking myself for buying my 270X ($219.99) back when 280X prices had gone stupidly stratospheric ($100 to $200 above MSRP).

Welp, could have bought a GTX 760 for $40 more, but I was being exceptionally cheap about it.

Ignoarints
Nov 26, 2010
Does anyone know if there is any reason to BIOS volt-mod if instability occurs before I reach the highest TDP the card can do stock? My two SLI'd 660 Tis are unstable before they use 114% of the TDP, which is the opposite of the single 660 Ti I had before. I have a feeling it's not worth it, but I thought I'd check in case there is something not straightforward about it.
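
One way to find out which limit you're actually hitting is to log clocks and power draw while your stress test runs - a minimal sketch around nvidia-smi (assuming your driver exposes power.draw; not every consumer card of this era does):

code:
import subprocess
import time

# Log clocks, power draw, and temperature once a second while a stress
# test runs in another window. Field names are standard nvidia-smi query
# fields; power.draw can read "N/A" on consumer cards whose sensors
# aren't exposed to the driver.
QUERY = "index,clocks.gr,clocks.mem,power.draw,temperature.gpu"

try:
    while True:
        out = subprocess.run(
            ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        print(out)
        time.sleep(1)
except KeyboardInterrupt:
    pass

If the cards destabilize while power draw is still well under the 114% target, the limiter is voltage/silicon rather than TDP - which is the one case where a volt mod can buy you something, at the cost of extra heat and risk.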

kode54
Nov 26, 2007

aka kuroshi
Fun Shoe

Everyday Lurker posted:

Still kicking myself for buying my 270X ($219.99) back when 280X prices had gone stupidly stratospheric ($100 to $200 above MSRP).

Welp, could have bought a GTX 760 for $40 more, but I was being exceptionally cheap about it.

If you think that's bad, how about this? I asked for either a 270X, 280X, or 290 for Christmas, and sure I know I didn't pay for it directly, but it ate into the household income. Anyway, the giver went with a 270X, and I had waited until the first week of December to suggest it, so the cheapest one available was $270, shipped from an Amazon Marketplace seller in Canada. It didn't arrive until the 27th.

I still have a kickass video card, though, and I shouldn't complain that I could have waited a few months.

Oh, and for added information, this is for a system with OS X installed as the default environment, and that R9 270X runs Dolphin at full speed in most circumstances - Twilight Princess, for example, runs at full speed almost everywhere. My existing GTX 670 couldn't run that game any faster than half to two-thirds speed except in Windows or Linux. There was one small hitch: it ran the Mac version of BioShock Infinite like molasses when looking at anything but directly at a wall, but 10.9.2 fixed that.

veedubfreak
Apr 2, 2005

by Smythe
One thing I have learned over the years is that if you can afford it, you're best served by buying whatever the best card is on the day it comes out. This assumes you don't wait 3-4 generations. I'm usually able to sell my old card and buy the newest card for a 50-100 buck investment each generation. Considering that I usually get 6 months or more out of a card, I rarely eat too much cost.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I disagree. While it can be good for trying to get in early on overclocking, it's only in the last few generations that AMD in particular has done dual-BIOS cards that trivially flash to their market-segmented, pricier counterpart - something only early adopters are likely to get a chance at (and they skipped the last generation for that entirely). So apart from that, you're spending the most money the card is likely to cost, to get the least performance the card is likely to have, as well as the least stability the card is likely to experience, all in the name of _______?

Just having the most powerful setup around for a while because you can afford to? That's a highly personal purchasing strategy. Note that there are people in the thread remarking on how well their 560 Tis or performance equivalents have held up since they got them. What works for your loaded, somewhat crazy self is very non-universal. I stretched farther than I wanted to in order to be a double early adopter on the 780 AND the 780Ti, and I really wish I had waited for the 780Ti to launch: I pissed about $250 down the drain on that purchase because I made a bad guess about where they'd go with GK110. I pretty much just got hosed on that one. I didn't think yields for a 7-billion-transistor chip - one also powering their highest-end pure compute AND workstation cards - would leave as much room as they have for raising the power envelope a bit, lowering the VRAM substantially, cutting the compute (minus the super Titan, of course), and putting out an otherwise complete GK110 chip at virtually the same price point they introduced the 780 at.

I do it every year, too, but I don't do it because it's a good idea, I do it because I like shiny poo poo in videogames and that's a good way to keep on top of shiny poo poo in videogames. In terms of getting your money's worth, it's pretty foolish.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE
I like shiny poo poo in video games too, Agreed :glomp:.

Speaking of which, I'd like to have a discussion on the relative merits/overclocking capabilities of various manufacturers' GTX 780Ti variants. I thought I'd ask here rather than the parts-picking thread because a) my question is more theoretical overclocking/cooler stuff and b) talking about getting a 780Ti over there sets a bad example and raises Crackbone's blood pressure to unacceptable levels.

The Gigabyte GHz Edition seems to have the highest stock clocks of any 780Ti by a long shot (1150MHz). But reading reviews, I see that the EVGA Classified seems to be much better built - 14 power phases vs 8 in the Gigabyte, better-quality fans on the EVGA, etc. Would I be correct in assuming that despite the Gigabyte having a higher clock speed out of the box, the EVGA could probably be pushed further with manual overclocking? Bear in mind I'm a complete novice at overclocking, and probably won't be doing anything too risky like custom BIOS flashing.

Also, the Asus version - it seems to have the lowest clock speed, and some reviews place it as noisier than the others. How well is it likely to perform relative to the other two?

The Gigabyte card seems to be the most commonly available of the three (the EVGA in particular I'd need to jump through some hoops to obtain), and there's about a $40 difference in price between them. Are they close enough together that I should just get whichever is cheapest/easiest to find, or is one of them worth going to some extra effort to obtain?

I'm in Australia so warranty support issues between manufacturers are irrelevant.

Da Mott Man
Aug 3, 2012


deimos posted:

So, which one of us will be the first crazy bastard to set up a VT-d LAN party in a box? 1 server, 1 or 2 GRIDs, and 16 thin clients.

:getin:

Already done: Hyper-V, Remote Desktop Services Virtualization Host, RemoteFX. Just log into the web portal and connect over RDP; it spools up a VM with GPU-accelerated graphics and off you go gaming. The only limitation is vRAM, so you can't turn on all the shiny crap, but it works great for older games. The best I've been able to do is 20 people playing Counter-Strike 1.6 on a single Hyper-V server.

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy

The Lord Bude posted:

I like shiny poo poo in video games too, Agreed :glomp:.

Speaking of which, I'd like to have a discussion on the relative merits/overclocking capabilities of various manufacturers' GTX 780Ti variants. I thought I'd ask here rather than the parts-picking thread because a) my question is more theoretical overclocking/cooler stuff and b) talking about getting a 780Ti over there sets a bad example and raises Crackbone's blood pressure to unacceptable levels.

The Gigabyte GHz Edition seems to have the highest stock clocks of any 780Ti by a long shot (1150MHz). But reading reviews, I see that the EVGA Classified seems to be much better built - 14 power phases vs 8 in the Gigabyte, better-quality fans on the EVGA, etc. Would I be correct in assuming that despite the Gigabyte having a higher clock speed out of the box, the EVGA could probably be pushed further with manual overclocking? Bear in mind I'm a complete novice at overclocking, and probably won't be doing anything too risky like custom BIOS flashing.

Also, the Asus version - it seems to have the lowest clock speed, and some reviews place it as noisier than the others. How well is it likely to perform relative to the other two?

The Gigabyte card seems to be the most commonly available of the three (the EVGA in particular I'd need to jump through some hoops to obtain), and there's about a $40 difference in price between them. Are they close enough together that I should just get whichever is cheapest/easiest to find, or is one of them worth going to some extra effort to obtain?

I'm in Australia so warranty support issues between manufacturers are irrelevant.

I got the 780Ti Classified and it boosts to 1150MHz out of the box without changing anything. Plus, EVGA supports "unofficial" EVGA BIOSes that let you overvolt the card past the 1.21V limit Nvidia puts on the cards. I'm very happy with mine (it's up to 1300MHz on the core, with the EVGA BIOS, before temps get too high on the cooler).

Senjuro
Aug 19, 2006

Well, that sucks. My GTX 285 still works surprisingly well, even with games like Battlefield 4.

I am not completely up to date with all the GPU news but am I right to think that now is not the best time to get a new card if I want it to last? Here's what I figure:
1. The new console generation has barely just started and since most games are made with consoles in mind, you want your PC to be able to match whatever they throw at the consoles for the rest of the generation.
2. DirectX 12 may be on the horizon.
3. The 800 series with 20nm/Maxwell is on the horizon.

beejay
Apr 7, 2002

1. It's gonna be a while before the new consoles catch up with and surpass computer video cards graphics-wise... if ever. They are basically computers themselves, running AMD 7xxx-series GPUs. However, being able to program specifically for the hardware in consoles gives developers the ability to squeeze a lot out of it as the generation goes on. Anyway, in my opinion anything you buy now will hang with the consoles for at least a year and probably two.

2. Yes, it may be on the horizon, but again, it's going to take a while for games to make full use of it. A lot of DirectX 11 games also support 10, and the same should be true of 12. With so many people still on Windows 7, games will keep supporting 10 and 11 for a while.

3. New cards are always "on the horizon" and I wouldn't expect a 20nm part comparable to say the 760 until late this year.


Bottom line - if there's a game or a few games that you play now that give you trouble, and you want to fix that trouble, it's time to upgrade. Buying a video card for the future is a silly thing to do.

subx
Jan 12, 2003

If we hit that bullseye, the rest of the dominoes should fall like a house of cards. Checkmate.

beejay posted:

1. It's gonna be a while before the new consoles catch up with and surpass computer video cards graphics-wise... if ever. They are basically computers themselves, running AMD 7xxx-series GPUs. However, being able to program specifically for the hardware in consoles gives developers the ability to squeeze a lot out of it as the generation goes on. Anyway, in my opinion anything you buy now will hang with the consoles for at least a year and probably two.

If they aren't caught up with PCs now, how will they catch up? New cards will be coming out in the next few months (the 800 series in Q3 or Q4), and that will push PCs pretty much out of range for the consoles.

Also, 4K PC gaming is something that will happen in the next few years, and consoles often struggle at 1080p.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

subx posted:

If they aren't caught up with PCs now, how will they catch up?

In the current model of making consoles cheaper than a PC, never.

But eventually we'll reach the limit of how much we can shrink semiconductors, Moore's Law will fail, and then all bets are off. Maybe processing power will plateau long enough for console fabrication costs to drop to parity with PCs. More likely, researchers will just figure out something better than silicon by then. Or PC hardware will start relying on quantum computing. (Ha, yeah, right.)

beejay
Apr 7, 2002

Sorry, I meant that if you buy a card now, I doubt console graphics will catch up to or surpass that card for a while.

Pimpmust
Oct 1, 2008

Lovely porting is probably going to be a bigger concern, and having a beefier computer only helps so much with that problem.

Granted, this gen of consoles is more PC-like than ever, so maybe it won't be as huge an issue (as, say, the PS3's architecture was).

Monday_
Feb 18, 2006

Worked-up silent dork without sex ability seeks oblivion and demise.
The Great Twist

subx posted:

If they aren't caught up with PCs now, how will they catch up?

It's a matter of how fast nVidia and AMD can release more powerful GPUs versus how fast devs can coax more performance out of the consoles. Look at modern 360/PS3 graphics versus launch titles like Perfect Dark Zero or Call of Duty 2. There's a big difference there. Battlefield 4 will run on a 360 or PS3, but good luck getting it to run on a high end PC from 2005, even though the PC was more powerful at the time.

Nephilm
Jun 11, 2009

by Lowtax
I'm pretty sure BF4 can run on a C2D E6600 + 8800 GTS at the crummy upscaled 720p quality it does on consoles.

Senjuro
Aug 19, 2006
OK, so thanks for the advice. I guess I'll be getting a new card soon. Any consensus on the best bang-for-the-buck card at the moment? The GeForce 750 Ti seems to be mentioned a lot from what I can find.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


750 Ti is good because it can run off of bus power alone. It's effectively a Maxwell preview.

It's only as powerful as a 560 Ti, though, so you're likely to be making compromises even at 1080p (depending on game, of course).

A 760 will draw something like thrice the power but has longer legs.

Nephilm
Jun 11, 2009

by Lowtax

Senjuro posted:

OK, so thanks for the advice. I guess I'll be getting a new card soon. Any consensus on the best bang-for-the-buck card at the moment? The GeForce 750 Ti seems to be mentioned a lot from what I can find.

The GTX 750Ti is not the best bang for the buck; it's a card a step or two above the entry level that happens to consume very little power, making it a good option for HTPC configurations and a painless upgrade for prebuilt systems that might have dubious PSUs. If you have no such concerns, there are superior performers for the price.

Also, this is a question for the parts picking megathread.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

MondayHotDog posted:

It's a matter of how fast nVidia and AMD can release more powerful GPUs versus how fast devs can coax more performance out of the consoles. Look at modern 360/PS3 graphics versus launch titles like Perfect Dark Zero or Call of Duty 2. There's a big difference there. Battlefield 4 will run on a 360 or PS3, but good luck getting it to run on a high end PC from 2005, even though the PC was more powerful at the time.
True, although Mantle has thankfully put a spotlight on the supposed inefficiency/overhead in the PC driver/API stack, and while it's not 100% the reason, I'm sure it played a part in MS announcing DX12 with an emphasis on reducing overhead. That, plus all the middleware involved in development these days and the architectural similarities between PCs and consoles, suggests that perhaps the PC won't have to completely brute-force its way past consoles and their (historically) more efficient API structure to show its advantages - at least not to the extent of needing 5x the GPU/CPU horsepower to show a 2-3x advantage - although the advantages of having a fixed platform will still be present.

Which is good, because PC GPUs (well, everything) are hitting the same wall CPU scaling has run into.

Happy_Misanthrope fucked around with this message at 22:53 on Mar 16, 2014

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Sir Unimaginative posted:

750 Ti is good because it can run off of bus power alone. It's effectively a Maxwell preview.

It's only as powerful as a 560 Ti, though, so you're likely to be making compromises even at 1080p (depending on game, of course).

A 760 will draw something like thrice the power but has longer legs.

From what I can tell, it seems more like the 750Ti is a step above the 560Ti in terms of performance, and legitimately a good option for 1080p gaming. Reason being: the 650Ti is an almost exact performance analog for the 560Ti, and the 750Ti is roundly ahead of the 650Ti performance-wise - for slightly under half the power usage (of the 560Ti, to be clear, here - the 650Ti isn't THAT dated, heh).

750Ti is probably going to be my recommendation for "I don't need all the shinies, but I'd like many of them?" for 1080p in 2014.

Agreed fucked around with this message at 00:29 on Mar 17, 2014

Ignoarints
Nov 26, 2010
I was playing it conservative with overclocking my SLI'd 660 Tis after reading that cards in SLI don't behave the same as single cards. I became very familiar with the capabilities of my single 660 Ti, and I'm very pleased to find that my two in SLI have virtually the same clock limit and significantly greater (relative) memory limits so far, at stock voltage levels. My single Asus card was in a top percentile of overclocking results, so I am very happy with the performance of these two MSIs.

Where things begin to differ, though, is that I was apparently running into a voltage limitation first with the Asus card - it was BIOS-modded and I could get another 60-80 MHz or so out of it - while these MSI cards are getting unstable at the same clock without utilizing all the available voltage, so perhaps that's as far as they're going to go. I don't know if this is the result of the SLI setup or the cards themselves, but I doubt I'll be modding these cards to find out, since the performance is great. I am only using 1080p. If anybody wonders whether overclocking matters at this resolution in this scenario, I can tell you without a doubt it improved performance. Even frame-limited, there would be short hiccups or dips depending on what was going on, and overclocking GREATLY reduced these effects. Flipping between stock and overclocked settings easily confirmed this for me.

So far one card runs at 1228 MHz, the other at 1201 MHz. Memory is equal, so far, at 6880 MHz, and I haven't reached the limit on that figure either. I must be getting close though.

Edit: Don't know if this is common knowledge (it sort of seemed so online), but after I set HPET as the only timer in Windows, I eliminated all remaining stuttering.

Ignoarints fucked around with this message at 10:26 on Mar 17, 2014

Peechka
Nov 10, 2005
Those coin miners are just killing the market right now, raising the AMD cards $100-200 above MSRP - the same cards that were supposed to give Nvidia a run for its money. It's a shame, really; we should be seeing lower prices across the board with more competition from AMD's side.

I thought that mining coins with video cards was obsolete anyway?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Peechka posted:

I thought that mining coins with video cards was obsolete anyway?

Bitcoin, sure, but most alternative coins use the scrypt algorithm, which was designed to be memory-hard so it would only run well on CPUs - but once it became practical on GPUs, the rest was history.
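
For the curious, the memory-hardness lives in scrypt's cost parameters, and Python's hashlib exposes them directly (Python 3.6+ built against OpenSSL 1.1; the parameters below are purely illustrative - real coins pick their own, and Litecoin's are actually tiny):

code:
import hashlib

# scrypt needs roughly 128 * r * N bytes of RAM per hash, so raising N
# inflates the memory footprint - the property meant to disadvantage
# GPUs and ASICs. (Litecoin uses N=1024, r=1: only ~128 KiB.)
for n in (2**10, 2**14, 2**16):
    digest = hashlib.scrypt(b"block header", salt=b"nonce",
                            n=n, r=8, p=1, maxmem=2**30, dklen=32)
    mib = 128 * 8 * n / 2**20
    print(f"N=2^{n.bit_length() - 1}: ~{mib:5.1f} MiB -> {digest.hex()[:16]}...")

The "rest was history" part is that GPUs ship with plenty of fast memory of their own, which is exactly why they ended up mining scrypt coins just fine.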

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Peechka posted:

Those coin miners are just killing the market right now, raising the AMD cards $100-200 above MSRP - the same cards that were supposed to give Nvidia a run for its money. It's a shame, really; we should be seeing lower prices across the board with more competition from AMD's side.

I thought that mining coins with video cards was obsolete anyway?

Mining Bitcoins with GPUs is obsolete, but those Litecoin/Dogecoin alternatives use an algorithm that needs tons of VRAM to be effective. I have a friend (yes, really a friend, not me, lol) who bought a couple dozen GTX 750 Tis, and they're currently making him $30 a month each after the power bill. As I understand it, he uses a mining pool that automatically points all his cards' hashing power at the most profitable coin that hour, converts the proceeds to Bitcoin automatically, and deposits it into his Coinbase account, which then deposits cash directly into his bank account.

Is it stupid? Yeah. I'm telling him that in all likelihood the "difficulty" of mining the coins will ramp up fast enough over the coming months that he'll never even break even on the cards, and that it all probably looks shady to the IRS even if he reports it properly. But hey, he won't listen, and it is a never-ending source of entertainment for me, much like Bitcoin in general.

At least the 750 series is energy-efficient. I think all two dozen of his cards at full tilt still take less wattage than veedubfreak's rig.
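
The break-even arithmetic he's waving off is simple enough to sketch. Every number here is a hypothetical stand-in: the $30/month is from the post above, the card price is a guess, and the decay factor is a crude stand-in for rising difficulty:

code:
# Hypothetical break-even estimate for a single mining card.
card_cost = 150.0         # rough street price of a 750 Ti, USD (guess)
profit_per_month = 30.0   # net of the power bill, per the post above
monthly_decay = 0.85      # assume profit shrinks 15% per month as difficulty rises

recouped, month = 0.0, 0
while recouped < card_cost and profit_per_month > 1.0:
    month += 1
    recouped += profit_per_month
    profit_per_month *= monthly_decay

if recouped >= card_cost:
    print(f"Breaks even in month {month} (${recouped:.0f} recouped)")
else:
    print(f"Never breaks even - tops out around ${recouped:.0f}")

Nudge monthly_decay down to 0.7 and the card never pays for itself, which is exactly the scenario being described.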

Ignoarints
Nov 26, 2010

Peechka posted:

Those coin miners are just killing the market right now, raising the AMD cards $100-200 above MSRP - the same cards that were supposed to give Nvidia a run for its money. It's a shame, really; we should be seeing lower prices across the board with more competition from AMD's side.

I thought that mining coins with video cards was obsolete anyway?

Killing the market for us, maybe. Despite the fact that AMD doesn't make money off the markups, I'm sure they can't be too terribly upset about selling out of every single thing they make. And Nvidia can't be too upset either, as they don't have to price aggressively and can pad a little, both bank and time. If anything, I hope the lull is well spent investing in future tech more aggressively than they'd have been comfortable with before (and there seems to be no reason to doubt it is). I'm just praying mining is a bubble.

subx
Jan 12, 2003

If we hit that bullseye, the rest of the dominoes should fall like a house of cards. Checkmate.

Ignoarints posted:

Killing the market for us, maybe. Despite the fact that AMD doesn't make money off the markups, I'm sure they can't be too terribly upset about selling out of every single thing they make. And Nvidia can't be too upset either, as they don't have to price aggressively and can pad a little, both bank and time. If anything, I hope the lull is well spent investing in future tech more aggressively than they'd have been comfortable with before (and there seems to be no reason to doubt it is). I'm just praying mining is a bubble.

If cryptocoins actually become a thing, eventually you will start to see more specialized equipment, and it won't hurt video cards. I can see nVidia and AMD being forerunners in the design/development of that equipment. I don't imagine that would hurt their other markets, though; maybe it would end up helping by creating some better algorithms for gaming.

(And yes, I realize cryptocoins "are a thing" already, or the cards wouldn't have jumped in price. They still aren't widely used, though.)

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

subx posted:

If cryptocoins actually become a thing, eventually you will start to see more specialized equipment, and it won't hurt video cards. I can see nVidia and AMD being forerunners in the design/development of that equipment. I don't imagine that would hurt their other markets, though; maybe it would end up helping by creating some better algorithms for gaming.

(And yes, I realize cryptocoins "are a thing" already, or the cards wouldn't have jumped in price. They still aren't widely used, though.)

There are plenty of dedicated Bitcoin ASICs, and some companies are at work on scrypt miners, which are more difficult/expensive to design for, since you need more RAM the faster you mine (the way I understand it).

Nephilm
Jun 11, 2009

by Lowtax
Once the difficulty on Litecoin (or whatever the flavor of the month is) ramps up enough that only ASICs will cut it, people will switch to the next flavor-of-the-month cryptocoin. There'll be no end to it until the fad itself dies.

subx
Jan 12, 2003

If we hit that bullseye, the rest of the dominoes should fall like a house of cards. Checkmate.

Nephilm posted:

Once the difficulty on Litecoin (or whatever the flavor of the month is) ramps up enough that only ASICs will cut it, people will switch to the next flavor-of-the-month cryptocoin. There'll be no end to it until the fad itself dies.

Can something really be a fad when it has such a small following? I still don't get the whole cryptocoin thing; I guess I'm just :corsair:

Peechka
Nov 10, 2005
The AMD R9 280 (non-X), at the $280 price point, just released this week and is already selling out.
