|
It’s a dumb choice of words, but as far as the meaning goes you can totally see an ex-Apple guy being concerned about the user-facing layer of the thing.
|
# ? Dec 13, 2017 20:58 |
|
It's also a dumb line of thought when you're trying to make a product that has to actually compete on its own merits, not just sell back into a niche of True Believers and professionals suffering from vendor lock-in.
|
# ? Dec 13, 2017 22:32 |
|
DrDork posted:It's also a dumb line of thought when you're trying to make a product that has to actually compete on its own merits, not just sell back into a niche of True Believers and professionals suffering from vendor lock-in. Has that ever stopped AMD?
|
# ? Dec 14, 2017 01:28 |
|
DrDork posted:It's also a dumb line of thought when you're trying to make a product that has to actually compete on its own merits, not just sell back into a niche of True Believers and professionals suffering from vendor lock-in. You’re suggesting most people don't care what a GPU card looks like in their computers, but the array of LEDs and poo poo people are making now says otherwise.
|
# ? Dec 14, 2017 01:46 |
|
i mean, it is a good rear end shroud (the metal ones, the plastic ones look like dogshit)
|
# ? Dec 14, 2017 01:48 |
|
Arivia posted:Has that ever stopped AMD? I worked at an Nvidia shop and I was given a dream GPU request that was identical to one of the higher-end FirePros, so I wrote up a proposal for it that would have cost half as much in terms of hardware and filled their exact request except for being AMD. They decided not to switch because of the cost of retraining and retooling to run everything on AMD cards rather than Nvidia ones; even if AMD was giving away the cards for free it wouldn't have been worth it because of all of the other costs from switching away from Nvidia.
|
# ? Dec 14, 2017 01:49 |
|
Craptacular! posted:You’re suggesting most people don't care what a GPU card looks like in their computers, but the array of LEDs and poo poo people are making now says otherwise. That's more an attempt to stand out a little when, thanks to Boost and highly homogeneous chips, the whole factory overclocked thing has lost a lot of its luster and your options as an AIB company are basically "make it look fancy" and/or "make a cooler that doesn't suck." Flashy lights absolutely aren't convincing anyone to buy a Radeon instead of a similarly priced GTX these days.
|
# ? Dec 14, 2017 02:51 |
|
More developments on headless guest passthrough from Wendell. https://youtu.be/okMGtwfiXMo
|
# ? Dec 14, 2017 12:47 |
|
DrDork posted:It's also a dumb line of thought when you're trying to make a product that has to actually compete on its own merits, not just sell back into a niche of True Believers and professionals suffering from vendor lock-in. even during periods when amd cards were objectively better than nvidia cards the nvidia cards still sold more. don't underestimate the power of branding. nvidia is the master of sycophantic fanboyism (via branding) and professional vendor lock-in (via CUDA); it's a cornerstone of their business model. can't blame amd for wanting a piece of that. i mean graphics cards are just hardware that's sitting inside a box that you aren't looking at 99.9% of the time and should rightly be a commodity, but they're not. that's capitalism! that being said the look of the vega cards fails entirely on its own merits: apart from vega being mud in the eyes of most casual observers anyway, the shroud is ugly as sin. especially the founder's edition one. who thought yellow and blue was a good colour scheme? loving bananaman rear end poo poo Generic Monk fucked around with this message at 13:26 on Dec 14, 2017 |
# ? Dec 14, 2017 13:16 |
|
Craptacular! posted:It’s a dumb choice of words, but as far as the meaning goes you can totally see an ex-Apple guy being concerned about the user-facing layer of the thing. 1) Design the best loving shroud on the planet 2) Install card upside down so no one sees it anyway 3) quit and join intel
|
# ? Dec 14, 2017 13:17 |
|
GRINDCORE MEGGIDO posted:Didn't he work at Apple? I guess the entire ethos there is to have a nice shroud over a Vega thing underneath. i mean the modern apple ethos is to prioritise quiet operation and power efficiency most of the time, almost to a fault. so mission loving failed on that front
|
# ? Dec 14, 2017 13:23 |
|
SwissArmyDruid posted:More developments on headless guest passthrough from Wendell. I like the bit where in the future you might be able to use an AMD graphics card to enable FreeSync on your nVidia card. I wonder if you could save money by buying a cheap AMD card instead of a G-Sync monitor.
|
# ? Dec 14, 2017 13:26 |
|
Generic Monk posted:even during periods when amd cards were objectively better than nvidia cards the nvidia cards still sold more. how much objectively better are we talking, and how much objective sales advantage did that translate to? was this actual advantage, or just in the Doom 2016 (AMD-advantaged) titles of the day? Paul MaudDib fucked around with this message at 14:33 on Dec 14, 2017 |
# ? Dec 14, 2017 14:29 |
|
Paul MaudDib posted:how much objectively better are we talking, and how much objective sales advantage did that translate to? was this actual advantage, or just in the Doom 2016 (AMD-advantaged) titles of the day? When they launched, the 4870 and 290X were objectively the best cards at their price-points by a good bit. They weren't quite the highest powered cards available, but just a cut below at a huge price discount. The sales advantage is what never really appeared: despite AMD being the objectively better option for everything except ultra-high-end SLI type options at a few points in its history, AMD has never captured more than 45% of the market, and usually has been more in the 30% range. NVidia, on the other hand, has never had less than 55% of the market, regardless of how badly they've messed things up.
|
# ? Dec 14, 2017 15:04 |
|
Fermi
|
# ? Dec 14, 2017 15:21 |
|
DrDork posted:When they launched, the 4870 and 290X were objectively the best cards at their price-points by a good bit. They weren't quite the highest powered cards available, but just a cut below at a huge price discount. They probably could have done a bit better if the 290 drivers were worth anything on launch. 770s consistently matched them in popular titles for months when they shouldn't have. Then the price of 290s skyrocketed for the first buttcoin bubble. IIRC there were just a few months of MSRP pricing at the start. But I do agree there was a Buy Nvidia unless you're a big ol Noob mentality for many years leading up to this that was even less warranted (Fermi-era stuff here, though I couldn't even be bothered to read benchmarks then, so that's just what I've been told). But AMD had also built a true culture around them too that only recently it seems has been whittled away to an insignificant number. I'd feel worse for them if they didn't do literally everything they did for years after the 290, lol.
|
# ? Dec 14, 2017 15:38 |
|
Measly Twerp posted:I like the bit where in the future you might be able to use an AMD graphics card to enable FreeSync on your nVidia card. I wonder if you could save money by buying a cheap AMD card instead of a G-Sync monitor. My guess is that NVidia will disable whatever they use as soon as they find it and they'll spend 9 months cussing each other and playing cat/mouse until the project dies.
|
# ? Dec 14, 2017 15:55 |
|
Say what you want about vega, *someone*'s buying them all.
|
# ? Dec 14, 2017 16:04 |
|
I want to see how long that machine can run at full tilt before it thermal throttles. Is it measured in minutes or just seconds?
|
# ? Dec 14, 2017 16:08 |
|
My best guess would be 5 minutes from a cold boot, since it takes a while for the alu case to heat up. Otherwise, yeah, <100 seconds?
|
# ? Dec 14, 2017 16:12 |
|
My bet is disabled at the factory. Like pre-set to 75% power or something.
|
# ? Dec 14, 2017 16:20 |
|
I played with one for about 10 minutes, and couldn’t make the GPU downclock using LuxMark.
|
# ? Dec 14, 2017 16:20 |
|
at that price they should loving give you applecare, jeez
|
# ? Dec 14, 2017 16:22 |
|
Maybe Apple is getting all the ones that can work without being overvolted to toxic levels, and that's why the add-in boards are terrible?
|
# ? Dec 14, 2017 16:23 |
|
16 grand
|
# ? Dec 14, 2017 16:37 |
|
128GB of ECC DDR4 runs close to $3000 just by itself.
|
# ? Dec 14, 2017 16:55 |
|
1gnoirents posted:16 grand The price isn't that bad, you can easily hit 16k with a comparable Dell or HP workstation. The problem is they stubbornly limit you to AMD GPUs in a market where CUDA software still dominates.
|
# ? Dec 14, 2017 17:01 |
|
The fact that it's unupgradable makes that 16k hurt a bit more.
|
# ? Dec 14, 2017 17:05 |
|
Have they moved completely away from hybrid/Fusion/SSHD drives after they couldn't figure out how to support it on High Sierra? The pricing looks insane, but the DRAM costs must be nuts, too.
|
# ? Dec 14, 2017 17:58 |
|
ufarn posted:Have they moved completely away from hybrid/Fusion/SSHD drives after they couldn't figure out how to support it on High Sierra? nah apple just knows not to put spinning disks in a machine that costs this much since they'll at best never be used and at worst actively degrade performance based on the config Nfcknblvbl posted:The fact that it's unupgradable makes that 16k hurt a bit more. i mean most people buying this are going to be doing it through a company and will prob be able to get an upgrade turned around by a business rep relatively sharpish. this looks like a good machine although i will probably never so much as touch one as long as i live. if we weren't in a situation where macs with high core counts and decent gpus were so hard to come by i wouldn't be able to tell you what market segment this is supposed to fill though. companies with tens of thousands of dollars to spend on a pro desktop but not enough floorspace to fit a tower workstation? repiv posted:The price isn't that bad, you can easily hit 16k with a comparable Dell or HP workstation. The problem is they stubbornly limit you to AMD GPUs in a market where CUDA software still dominates. i can't blame them for their antipathy toward nvidia considering how many dud laptop GPUs they were provided with (though honestly they should just eat the loving cost, they can afford it.) current solution is prob to just use an external gpu but it sure would be nice if that modular mac pro they say they're making comes with an actual (shock, horror, ecstasy) pci slot Paul MaudDib posted:how much objectively better are we talking, and how much objective sales advantage did that translate to? was this actual advantage, or just in the Doom 2016 (AMD-advantaged) titles of the day? im not drawing u a graph look it up x Generic Monk fucked around with this message at 20:19 on Dec 14, 2017 |
# ? Dec 14, 2017 20:03 |
|
Nfcknblvbl posted:The fact that it's unupgradable makes that 16k hurt a bit more. No, you see, that's a feature. It ensures you don't use obsolete hardware for too long. They're looking out for your best interests.
|
# ? Dec 14, 2017 20:21 |
|
BIG HEADLINE posted:No, you see, that's a feature. It ensures you don't use obsolete hardware for too long. They're looking out for your best interests. i run a 5yo mac mini that works better today than the day it was sold. i have a thinkpad of comparable age that won't even run the latest version of windows properly. im with angry motherboard repairer/youtube ranter guy on getting rid of a lot of repairability for dubious gain but the hardware and software are, with only a few exceptions, incredibly solid. people regularly make macs last for 8-10 years Generic Monk fucked around with this message at 20:28 on Dec 14, 2017 |
# ? Dec 14, 2017 20:25 |
|
Generic Monk posted:i run a 5yo mac mini that works better today than the day it was sold. Taking FineWine bullshit to the next level
|
# ? Dec 14, 2017 20:32 |
|
Generic Monk posted:i run a 5yo mac mini that works better today than the day it was sold. i have a thinkpad of comparable age that won't even run the latest version of windows properly. im with angry motherboard repairer/youtube ranter guy on getting rid of a lot of repairability for dubious gain but the hardware and software are, with only a few exceptions, incredibly solid. people regularly make macs last for 8-10 years While the claim that a 5yo is better now than before is fishy, I can confirm that for the 6 months of my life that I had a Mac as the company laptop I came to appreciate the hardware. It's solid, and fast for the specs. Now if they could design their software to be usable and fix the keyboard on their laptops, it would be quite a nice machine worthy of the price. As it stands, it isn't.
|
# ? Dec 14, 2017 20:37 |
|
Fermi was objectively the fastest uarch when it came out, and it wasn't uniquely power-hungry. High-end GCN cards had a similar power draw. It just had a lovely blower cooler that wasn't really up to the demands of modern GPUs. It pulled less than, say, Vega. Or even Fiji. Or even the 290X... DrDork posted:When they launched, the 4870 and 290X were objectively the best cards at their price-points by a good bit. They weren't quite the highest powered cards available, but just a cut below at a huge price discount. They weren't Maxwell-level good though, and that's the only time NVIDIA managed to take a dominant advantage. Coincidentally, that's also when AMD thought the discrete GPU market was dying and they could get away with rebrands forever. AMD deserves their current low marketshare. You can't just not do R&D for 5 years and expect people to still buy your products. GCN is loving ancient. It's just AdoredTV-level whining that AMD isn't having 75% marketshare handed to them on a silver platter despite self-admittedly doing pretty much nothing in the market for the last 5 years. The fact that their products haven't been in stock for anywhere near MSRP for the last 9 months is just the cherry on top. Not just Vega either, Polaris has never really recovered from the mining boom. GCN isn't even competitive with Paxwell to begin with, let alone when priced 25-75% above Paxwell. Paul MaudDib fucked around with this message at 20:59 on Dec 14, 2017 |
# ? Dec 14, 2017 20:37 |
|
PCPer says the Titan V is 50% faster than the 1080 Ti in their gaming tests, that's actually better than I (and they) expected given how much of the die is used for FP64 and tensor cores. A GV102 chip with that stuff removed would be a monster. https://www.pcper.com/reviews/Graphics-Cards/NVIDIA-TITAN-V-Review-Part-1-Gaming MaxxBot fucked around with this message at 20:53 on Dec 14, 2017 |
# ? Dec 14, 2017 20:49 |
|
MaxxBot posted:PCPer says the Titan V is 50% faster than the 1080 Ti in their gaming tests, that's actually better than I (and they) expected given how much of the die is used for FP64 and tensor cores. A GV102 chip with that stuff removed would be a monster. It's the power of compound interest. Titan Xp is 10% faster than the 1080 Ti by itself, and Titan Xv adds 42% more cores on top of that. Really it's probably more like 30% on average though. And the minimums are pretty bad right now.
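As a rough sketch of the arithmetic being described (the 10% and 42% figures come from the post above; treating them as multiplicative is an assumption, and real core-count scaling is rarely linear):

```python
# Back-of-the-envelope compounding of the two claimed gaps from the post above.
titan_xp_over_1080ti = 1.10  # Titan Xp claimed ~10% faster than the 1080 Ti
titan_v_extra_cores = 1.42   # Titan V claimed to add ~42% more cores on top of that

naive_speedup = titan_xp_over_1080ti * titan_v_extra_cores
print(f"Naive compounded speedup over a 1080 Ti: {naive_speedup:.2f}x")
# ~1.56x, i.e. roughly the "50% faster" headline; the ~30% average estimate
# reflects cores not scaling perfectly in real games.
```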
|
# ? Dec 14, 2017 20:55 |
|
metallicaeg posted:Taking FineWine bullshit to the next level i did put an ssd in it. which does help. not exactly telling the whole truth i'll give u that. until the last update if u had a shitload of windows open the ui would stutter a bit when you invoked the mission control feature that makes the windows fly all over the place so you can see them better. with the HS update even that tiny bit of lag has been ironed out. thank u based apple
|
# ? Dec 14, 2017 20:55 |
|
Paul MaudDib posted:It's the power of compound interest. Titan Xp is 10% faster than the 1080 Ti by itself, and Titan Xv adds 42% more cores on top of that. I really hope this doesn't mean the 1000-1100 series jump will be tiny, I've grown used to 700-900-1000 being so nice.
|
# ? Dec 14, 2017 21:03 |
|
More importantly PCPer says it does 79MH/s! I hope someone makes an 8x Titan V mining rig. Nfcknblvbl posted:I really hope this doesn't mean the 1000-1100 series jump will be tiny, I've grown used to 700-900-1000 being so nice. I don't expect that the gaming cards will be released based on the GV100, it will probably be an entirely different set of chips.
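Purely for scale, a hedged back-of-the-envelope for the hypothetical rig mentioned above, using the 79 MH/s per-card figure PCPer quotes and assuming perfectly linear scaling across cards (an optimistic assumption):

```python
# Hypothetical 8x Titan V mining rig, per-card hash rate taken from the post above.
cards = 8
mh_per_card = 79
total_mh = cards * mh_per_card  # assumes linear scaling, which real rigs rarely achieve
print(f"Aggregate hash rate: {total_mh} MH/s")  # 632 MH/s
```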
|
# ? Dec 14, 2017 21:04 |