|
Buildzoid definitely has his own Vega as of like two days ago but I haven't seen any of his impressions of the card yet.
|
# ? Jul 9, 2017 06:29 |
|
|
Palladium posted:Linus also did his idiot rant against closed standards on ARM mobile hardware because building a community-backed Linux OS on already-Android devices is obviously such a huge, lucrative and completely viable market. He was right. You can't build anything interesting (hardware-wise) with half the ARM SoCs out there because of loving Mali graphics bullshit. Only the IP licensee can even download the damned drivers, and they all do something dumb to make them only work on their vendor kernel, and are linked against whatever ancient libc they happened to be using 5 years ago when they made the BSP. What you CAN do is copy the vendor's reference design and make the exact product they originally envisioned, with your own branding. Want to do anything other than that? Go gently caress yourself, or buy an IP license and fab your own SoC. Same goes for a number of other hardware modules - Bluetooth and WiFi are frequent offenders, and of course just lol at the idea of including GSM on a product if you don't have $100m in backing. Tablets and phones are boring compared to what you could do with a real quad-core Linux machine on a chip the size of a fingernail.
|
# ? Jul 9, 2017 06:31 |
|
AVeryLargeRadish posted:Buildzoid did a stream today, he talked a fair bit about Vega and he has come to the conclusion that the card is very badly power throttled at 300W; conservatively it's gonna need like 400-500W on water. Basically you will want an 800W+ PSU bare minimum. The water-cooled Vega FE is rated for a 375W TDP and I wouldn't be surprised if it power throttled on that as well; 375W is the max that 2 8-pin connectors + PCI-E slot power are rated for. Of course, this hasn't stopped AMD in the past, but one would like to think that after Powergate on the RX 480s, they would actually try to stick to their rated TDP before overclocking is thrown into the mix. It would be hilarious if Vega can't even consistently hit max boost clocks with a water cooler, never mind actually overclocking the thing.
|
# ? Jul 9, 2017 08:01 |
|
Beautiful Ninja posted:The water-cooled Vega FE is rated for a 375W TDP and I wouldn't be surprised if it power throttled on that as well; 375W is the max that 2 8-pin connectors + PCI-E slot power are rated for. Of course, this hasn't stopped AMD in the past, but one would like to think after Powergate on the RX 480's, they would actually try to stick to their rated TDP before overclocking is thrown into the mix. But each 8-pin is rated for 150 watts and you can pull another 100-150 from the PCIe slot.
|
# ? Jul 9, 2017 09:10 |
|
wargames posted:But each 8pin is rated for 150 watts and you can pull another 100-150 from pci. i'm pretty sure pci-e spec says like 75
|
# ? Jul 9, 2017 10:08 |
|
Cygni posted:Maybe SteamOS will finally be the revolution and 2018 will be the year of linux on the d-*laugh track starts early* Valve has been hiring quite a few people to work on the open AMD drivers lately and is co-authoring a lot of Vulkan extensions which seem aimed at VR and "consolizing" Linux: things like high-priority queues for preemption when working with VR, giving the compositor priority to draw the UI over games, and being able to take true exclusive control of a display from the desktop so you can have tight control of frame timing to do things like race the beam.
|
# ? Jul 9, 2017 14:04 |
|
https://www.youtube.com/watch?v=iiLSxhyiZss No, this can't be true, my Zotac card is supposed to have higher factory core and boost clocks than the others; it's even clocked slightly higher than Aorus's Xtreme card. Please tell me I didn't just waste my cash on this.
|
# ? Jul 9, 2017 15:57 |
|
The_Franz posted:Valve has been hiring quite a few people to work on the open AMD drivers lately and is co-authoring a lot of Vulkan extensions which seem aimed at VR and "consolizing" linux. Things like high priority queues for preemption when working with VR and giving the compositor priority to draw the UI over games and being able to take true exclusive control of a display from the desktop so you can have tight control of frame timing to do things like race the beam. This is good. Some of this stuff is why we chose not to support Linux with Rift CV1.
|
# ? Jul 9, 2017 16:10 |
|
Question: for pretty much the same price, I can get either the Gigabyte GTX 1060 6GB Mini ITX OC (an SFF card) or the Palit GTX 1060 6GB Dual. My case can fit the larger card, but if all else were completely equal I'd get the Mini ITX so I could potentially move it to a smaller case. The larger card has a few more DisplayPort connectors, but I don't care about those. Looking at the cards, I would expect the following: 1) The Gigabyte has slightly higher clocks (both base and OC) so it should perform very slightly better 2) The Palit should be significantly quieter due to its two fans spinning slower 3) There are no other meaningful differences between the cards Are my guesses correct?
|
# ? Jul 9, 2017 17:14 |
|
I am trying to roll back to a specific AMD driver version to get my R9 380 to play nice with Final Fantasy XIV. I have uninstalled the driver with Display Driver Uninstaller in safe mode and installed version 16.3.2. I have disabled automatic updates in the driver's installation program as well as in the Device Installation options. However, after playing for a while and crashing, I checked the driver in Device Manager and it had updated to the most recent version without my knowledge. Is there a way to get it to stop doing this? I believe I read something about drivers being stored in the Windows folder somewhere but like hell am I poking around blind in there. e: Running up-to-date Windows 10 x64. Dr Snofeld fucked around with this message at 17:22 on Jul 9, 2017 |
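e2: I did find mention of a Windows Update group policy that's supposed to skip driver updates entirely. Would setting something like this .reg fragment do it? (Untested on my end, and I believe the ExcludeWUDriversInQualityUpdate value is only honored on Win10 1607+ Pro/Enterprise - back up your registry first.)

```
Windows Registry Editor Version 5.00

; Registry equivalent of the "Do not include drivers with Windows
; Updates" group policy (Windows 10 1607+, Pro/Enterprise)
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate]
"ExcludeWUDriversInQualityUpdate"=dword:00000001
```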
# ? Jul 9, 2017 17:19 |
|
NihilCredo posted:Question: for pretty much the same price, I can get either the Gigabyte GTX 1060 6GB Mini ITX OC (a SFF card) or the Palit GTX 1060 6GB Dual. There's essentially a 100% chance that the Palit card matches the Gigabyte clocks if you tell it to.
|
# ? Jul 9, 2017 17:48 |
|
NihilCredo posted:Question: for pretty much the same price, I can get either the Gigabyte GTX 1060 6GB Mini ITX OC (a SFF card) or the Palit GTX 1060 6GB Dual. Yes, but I'd like to stress point 2, as the difference between compact "mini" cards and dual-fan coolers is stark. I had an EVGA 1060 SC and an Asus 1070 Dual in a badly ventilated case; the 1060 could be heard through noise-canceling headphones, throttling at 83°C at 100% fan speed. The 1070, which has more heat to dissipate, runs at 73°C at 50% fan speed and is almost inaudible between the CPU and PSU fans.
|
# ? Jul 9, 2017 17:59 |
|
Fruit Chewy posted:i'm pretty sure pci-e spec says like 75 Technically the extra 2 pins in an 8-pin are sense pins, so according to the spec they don't transmit any power - the 6-pin part itself carries the full 150W. But in practice the 2 extra pins are only used as sense pins at startup and can then actually transmit even more power, so it's more like 150W on a 6-pin and 250-300W on an 8-pin. Pulling more than 75W from the slot is a bad idea though. Paul MaudDib fucked around with this message at 18:05 on Jul 9, 2017 |
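To put numbers on it, the rated budgets add up like this (quick illustrative Python; the function name is mine, the wattages are the spec figures discussed in the thread):

```python
# Rated PCIe power budget per the spec numbers discussed above.
PCIE_SLOT_W = 75    # max draw from the x16 slot per the PCIe CEM spec
SIX_PIN_W = 75      # spec rating for a 6-pin PEG connector
EIGHT_PIN_W = 150   # spec rating for an 8-pin PEG connector

def rated_board_power(six_pins=0, eight_pins=0):
    """Max spec-rated board power for a given connector layout."""
    return PCIE_SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# Water-cooled Vega FE: 2x 8-pin plus the slot = its 375W rated TDP
print(rated_board_power(eight_pins=2))   # 375
# Launch RX 480 layout: one 6-pin plus the slot
print(rated_board_power(six_pins=1))     # 150
```

Which is why a card drawing 400W+ has to either exceed its connector ratings or pull more than 75W through the slot.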
# ? Jul 9, 2017 18:03 |
|
NihilCredo posted:My case can fit the larger card, but if all else were completely equal I'd get the Mini ITX so I could potentially move it to a smaller case. The larger card has a few more DisplayPort connectors, but I don't care about those. Many modern SFF cases are designed to accommodate full-length GPUs like that Palit, so there's generally no need to compromise with an ITX-length card unless you're dead-set on a particular case that does require one. Check out the Silverstone RVZ02, Dan A4, and Ncase M1.
|
# ? Jul 9, 2017 18:11 |
|
NihilCredo posted:My case can fit the larger card, but if all else were completely equal I'd get the Mini ITX so I could potentially move it to a smaller case. The larger card has a few more DisplayPort connectors, but I don't care about those. I've got an MSI mini-ITX 1070 and it's great. It's silent and performs the same as any other 1070. Unless that 1060 just has a terrible ITX cooler design I'd go for it.
|
# ? Jul 9, 2017 18:11 |
|
Paul MaudDib posted:Technically the extra 2 pins in an 8-pin are sense pins, so they don't transmit any power according to the spec, the 6-pin itself can carry 150W. But in practice the 2 extra pins are only used as sense pins at startup and then actually can also transmit even more power so it's more like 150W on a 6-pin and 250-300W on an 8-pin. Not like AMD or Nvidia hasn't pulled more than 75 watts from PCI-E before
|
# ? Jul 9, 2017 18:46 |
|
Junior Jr. posted:https://www.youtube.com/watch?v=iiLSxhyiZss Probably. GPU Boost 3 lets cards overclock themselves, so a "guaranteed" overclock is a minimum and other cards/coolers can surpass it, even if their stated clock speeds are lower.
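As a toy illustration (this is not Nvidia's actual algorithm, and the clocks and bin counts are made up): Pascal boosts in roughly 13 MHz bins while it has thermal/power headroom, so the advertised boost acts as a floor, not a cap:

```python
# Toy model of GPU Boost 3-style behavior. The advertised boost clock
# is a guaranteed floor; the card keeps adding ~13 MHz bins while it
# still has thermal/power headroom. All numbers are illustrative.
BIN_MHZ = 13

def effective_clock(advertised_boost_mhz, headroom_bins):
    """Hypothetical resulting clock given spare boost bins."""
    return advertised_boost_mhz + headroom_bins * BIN_MHZ

# A hot card with the higher "factory" boost can still end up slower
# than a cooler card with a lower advertised boost.
hot_card = effective_clock(1721, 2)    # 1747 MHz
cool_card = effective_clock(1708, 6)   # 1786 MHz
print(hot_card < cool_card)            # True
```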
|
# ? Jul 9, 2017 18:49 |
|
Junior Jr. posted:https://www.youtube.com/watch?v=iiLSxhyiZss Waste? No. That just shows its cooler isn't well designed for the VRMs. Which, as pointed out there, is pretty dumb considering the size. However that won't affect performance. All reasonably cooled Pascal cards perform the same, so if you thought you were buying faster speeds, that's just not true anymore. All AIBs are guilty of marketing this way still. All you are truly paying for these days is how little noise it can manage. The backplate thing would bug me though. I'd just remove it outright if I had that card. 13 degrees is significant for just a backplate.
|
# ? Jul 9, 2017 19:16 |
|
What's a good all-in-one temperature/fan speed monitor? Speedfan doesn't work with 7000-series i5s. RealTemp doesn't show fan speed or GPU temp.
|
# ? Jul 9, 2017 19:38 |
|
Sanctum posted:What's a good all-in-one temperature/fan speed monitor? I don't know what it is with CPU fan speed info (and setting curves and all that) but it's always some dumb mess. Afterburner alone is so close, but the one thing it doesn't show is CPU fan speed (it shows everything else you need). I've just eventually learned to live without it and estimate based on the CPU temp info, which Afterburner displays nicely
|
# ? Jul 9, 2017 19:56 |
|
Wacky Delly posted:Probably, GPU Boost 3 lets cards overclock themselves so a "guaranteed" overclock is a minimum and other cards/coolers can surpass them; even if their stated clock speeds are lower. So is FireStorm not that good of an overclocking program?
|
# ? Jul 9, 2017 20:07 |
|
wargames posted:Not like amd or nvidia hasn't pulled more then 75 watts from pci-e before Yeah, remember when RX 480s were blowing up motherboards at launch? Most motherboards will handle a little over 75W... for at least a little while.
|
# ? Jul 9, 2017 20:22 |
|
Afterburner, GPU Tweak, Precision, Firestorm and almost certainly every single other one I'm forgetting all are hitting the same points with a different skin applied. Take the one you like the look of most. I like Afterburner because it still has the old style skin like they all used to have back in like 2007.
|
# ? Jul 9, 2017 20:27 |
|
Junior Jr. posted:https://www.youtube.com/watch?v=iiLSxhyiZss Any amount you paid over reference might have been a waste, but it's still a fully functional, operating-within-specs 1080 Ti
|
# ? Jul 9, 2017 20:33 |
|
Notebookcheck posted an article on Nvidia's new "Max Q" chips aka 1070m and 1080m (but don't call them that because notebook chips don't exist anymore!). https://www.notebookcheck.net/Opinion-Nvidia-s-Max-Q-Maximum-efficiency-minimum-performance.232038.0.html
|
# ? Jul 9, 2017 20:39 |
|
Sanctum posted:What's a good all-in-one temperature/fan speed monitor? I like Open Hardware Monitor. You can hide and rename any of the many values it shows, and it has a built-in web server you can enable if you want, so you can monitor from a phone or tablet while full-screen gaming.
|
# ? Jul 9, 2017 20:46 |
|
I sincerely hope that everyone who bought Canada's entire stock of video cards below GTX1080's to mine loving Ethercoin dies in a loving fire. My 7870 is gasping its last and I can't find a (modern) card in this country under $600 right now. poo poo is fuuuuuuuucked.
|
# ? Jul 9, 2017 22:09 |
|
Rime posted:I sincerely hope that everyone who bought Canada's entire stock of video cards below GTX1080's to mine loving Ethercoin dies in a loving fire. My 7870 is gasping its last and I can't find a (modern) card in this country under $600 right now. WTS GPUs, full price
|
# ? Jul 9, 2017 22:11 |
|
Rime posted:I sincerely hope that everyone who bought Canada's entire stock of video cards below GTX1080's to mine loving Ethercoin dies in a loving fire. My 7870 is gasping its last and I can't find a (modern) card in this country under $600 right now. Profitability is tanking. Wait a couple months and then scoop some barely used GTX 1080s on the cheap.
|
# ? Jul 9, 2017 22:16 |
|
Rime posted:I sincerely hope that everyone who bought Canada's entire stock of video cards below GTX1080's to mine loving Ethercoin dies in a loving fire. My 7870 is gasping its last and I can't find a (modern) card in this country under $600 right now. Canada Computers has some in stock: http://www.canadacomputers.com/product_info.php?cPath=43_1200_557_559&item_id=099794
|
# ? Jul 9, 2017 22:44 |
|
Kazinsal posted:Profitability is tanking. Wait a couple months and then scoop some barely used GTX 1080s on the cheap. I'm kind of worried that this won't actually happen. Now that nicehash lets people mine whatever is most profitable at any point in time, and there are lots of cryptocurrencies to choose from, what if we have a situation where it's always profitable to be mining something on a GPU, and so the demand from miners never goes down? An eternal GPU famine...
|
# ? Jul 9, 2017 23:00 |
|
VostokProgram posted:I'm kind of worried that this won't actually happen. Now that nicehash lets people mine whatever is most profitable at any point in time, and there are lots of cryptocurrencies to choose from, what if we have a situation where it's always profitable to be mining something on a GPU, and so the demand from miners never goes down? An eternal GPU famine... Something I've wondered is whether or not it'd be possible for card makers to gimp GPUs w/r/t mining but not impact performance in games (too much). Have the miners compete with ppl who have legit compute needs for cards instead of ppl who just want to game.
|
# ? Jul 9, 2017 23:07 |
|
Always being profitable would mean that someone would always be interested in buying. As soon as Gyft runs out of VC cash and China loosens up how much money wealthy barons can take to Macau to gamble, the interest in any crypto coin is down to smugglers, drug dealers, sex traffickers, and the cops trying to catch them. Craptacular! fucked around with this message at 23:17 on Jul 9, 2017 |
# ? Jul 9, 2017 23:10 |
|
VostokProgram posted:I'm kind of worried that this won't actually happen. Now that nicehash lets people mine whatever is most profitable at any point in time, and there are lots of cryptocurrencies to choose from, what if we have a situation where it's always profitable to be mining something on a GPU, and so the demand from miners never goes down? An eternal GPU famine... It provides no tangible benefit. People will just push nicehash into unprofitability and then exit causing a crash across the board. There's no reason to be involved unless you do exactly that.
|
# ? Jul 9, 2017 23:14 |
|
inkwell posted:Something I've wondered is whether or not it'd be possible for card makers to gimp GPU's w/r/t mining but not impact performance in games(too much). Have the miners compete with ppl who have legit compute needs for cards instead of ppl who just want to game. Why would card makers want to shrink demand for their product? It's not like anyone is blaming EVGA for the current situation.
|
# ? Jul 9, 2017 23:15 |
|
The other thing is that Ether is a weird exception because it was written by a guy who was frustrated that Bitcoin difficulty was raised to the moon by Chinese farms of specialty ASIC devices. It was deliberately designed to be resistant to ASICs and processors and optimal on consumer/gamer GPUs so that the average shmoe has a chance to mine and buy some burritos. This is great for people selling GPUs at great markup but it's not great for coingods who would rather compete with other coingods for hardware and not see the ROI goalposts moved further out because the common guy just trying to play Assassin's Creed at 1440p is partially responsible for driving the price up.
|
# ? Jul 9, 2017 23:17 |
|
VostokProgram posted:I'm kind of worried that this won't actually happen. Now that nicehash lets people mine whatever is most profitable at any point in time, and there are lots of cryptocurrencies to choose from, what if we have a situation where it's always profitable to be mining something on a GPU, and so the demand from miners never goes down? An eternal GPU famine... As someone who's been mining for a while I kind of hoped that Nicehash adding in the new cryptocurrencies would stave off the decline of profitability but it's been dropping steadily for a while now. I'm making a little more than half of what I was when I started a month ago and if the decline continues at this pace I'd imagine a lot of people will stop mining in 1-2 months or so. I just have no idea at what point in the profitability decline people typically start dumping all of their stuff.
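For what it's worth, "a little more than half of a month ago" pencils out like this if the decline keeps compounding (every number here is made up except the rough halving rate):

```python
# Back-of-envelope: if daily mining revenue keeps halving every month,
# when does it drop below the daily electricity cost? All figures are
# illustrative, not anyone's actual rig numbers.
daily_revenue = 4.00       # $/day today - made up
daily_power_cost = 1.00    # $/day in electricity - made up
MONTHLY_FACTOR = 0.5       # revenue roughly halved over the past month

months = 0
while daily_revenue > daily_power_cost:
    daily_revenue *= MONTHLY_FACTOR
    months += 1
print(months)  # 2 - roughly consistent with "1-2 months" above
```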
|
# ? Jul 9, 2017 23:18 |
|
VostokProgram posted:I'm kind of worried that this won't actually happen. Now that nicehash lets people mine whatever is most profitable at any point in time, and there are lots of cryptocurrencies to choose from, what if we have a situation where it's always profitable to be mining something on a GPU, and so the demand from miners never goes down? An eternal GPU famine... LTT and GamersNexus have touched on it, but it seems the shortage is due to AIB manufacturers not ramping up orders, because they assume the crash is coming. They don't want to be stuck holding massive stock of cards that are impossible to move, especially with Vega and Volta coming. They learned from last time. If mining keeps going forever (it won't), AIB partners will ramp up orders and fill the demand, likely with mining-focused products. But they are hesitant to do that yet.
|
# ? Jul 9, 2017 23:21 |
|
Lockback posted:Why would card makers want to shrink demand for their product? It's not like anyone is blaming EVGA for the current situation. https://youtube.com/watch?v=P_oHQPWRWnQ
|
# ? Jul 9, 2017 23:22 |
|
|
As far as I'm concerned, as long as my GPU rig can mine buttcoins better than my crappy ASIC miners, and they offer me a slightly decent payout rate, I'm fine with that.
|
# ? Jul 9, 2017 23:38 |