|
craig588 posted:1. Yes, the Volta xx80 is coming in spring, but based on past trends the 1080 Ti will only be a bit slower than it, and if you can use them now it's probably not worth waiting. I think it's also pretty likely that the Volta xx80 will have 8GB of VRAM, in which case the 1080 Ti would still have a clear advantage in VRAM-bound machine learning tasks.
|
# ? Nov 11, 2017 14:58 |
|
|
|
Krailor posted:Another option for a bunch of PCIe lanes would be to build an X99 system. It's a little older but plenty of places are still selling stock, and it will be much cheaper than either a Threadripper or X299 system. Ooh, hadn't thought about the need for blower style. Thanks. And in terms of the power supply, is 1kW enough for a TR + 4x 1080 Tis?
|
# ? Nov 11, 2017 15:04 |
|
shrike82 posted:Ooh, hadn't thought about the need for blower style. Thanks. A 1080 Ti can happily suck down 250-300W, so you'd be looking at 1-1.2kW just for the GPUs. I'd look at ~1400W PSUs.
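That rough math can be sketched like so - the 250-300W per-card figure is from the post above, while the CPU TDP, the "everything else" budget, and the ~10% headroom factor are my own assumptions, not anything from this thread:

```python
# Back-of-the-envelope PSU sizing for a Threadripper + 4x 1080 Ti box.
# Per-GPU draw (250-300W) is from the post above; the CPU TDP,
# "everything else" budget, and headroom factor are assumptions.
def psu_estimate(n_gpus, gpu_watts, cpu_watts=180, other_watts=100, headroom=1.1):
    """Return (estimated total draw, suggested PSU rating) in watts."""
    draw = n_gpus * gpu_watts + cpu_watts + other_watts
    return draw, round(draw * headroom)

print(psu_estimate(4, 250))  # (1280, 1408) -> a ~1400W PSU fits
print(psu_estimate(4, 300))  # (1480, 1628) -> worst case wants even more
```

With the 250W figure this lands right on the ~1400W recommendation; at full 300W per card you'd want more still.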
|
# ? Nov 11, 2017 15:26 |
|
For ML, I've been using nvidia-smi to turn down the power limit on my current 1080 Ti to 180W. I have to check whether my wall outlet supports >1kW safely.
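For anyone curious, that power-limiting looks roughly like this - a sketch of the nvidia-smi invocations, not a full guide; the allowed range varies by card/vBIOS, both commands need root, and the limit has to be re-applied after a reboot:

```shell
# Enable persistence mode so the limit sticks between CUDA contexts
sudo nvidia-smi -pm 1
# Check the card's supported Min/Max/Default power limits first
sudo nvidia-smi -i 0 -q -d POWER
# Cap GPU 0 at 180W
sudo nvidia-smi -i 0 -pl 180
```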
|
# ? Nov 11, 2017 15:32 |
|
VulgarandStupid posted:It's better to have a separate power brick, as it removes a heat-producing part from the enclosure and they're passively cooled. Plus I'd rather have a brick on the floor than that much more volume on my desk. Now, if we start talking about proprietary connectors like the Xbox 360 and certain laptops, that's a different matter entirely - that's just the worst of both worlds. SwissArmyDruid posted:Here's hoping Intel doesn't ship with one of those stupid loving ones where the cable from the brick to the wall is ungodly long and the cable from the brick to the device is too goddamn short. I'd place my bets on a midspan brick with 2.5 feet of cable on the device side and a four-foot C5 cable on the wall side. wolrah fucked around with this message at 15:43 on Nov 11, 2017 |
# ? Nov 11, 2017 15:40 |
|
Heise.de is claiming that NV's new consumer architecture will be called Ampere, not Volta, and should be announced during GTC18 at the end of March. I suppose Ampere for consumers and Volta for HPC makes sense, to avoid a repeat of the Pascal confusion where two very different architectures shared the same name. repiv fucked around with this message at 16:30 on Nov 11, 2017 |
# ? Nov 11, 2017 16:04 |
|
Even with blower cards, the cards in the middle of the stack are going to get hot - or alternatively, the blower is going to get pretty noisy.
|
# ? Nov 11, 2017 16:33 |
|
Guess I have a Vega 64 coming, should be interesting to test and compare to the Pro Duo Polaris and 1070. Was hoping for the WX9100 and a Threadripper, considering I did a bunch of work for them to demo the WX9100+Threadripper on, but there seems to have been a major dearth of AMD hardware, to the point where I heard (don't take this as hard fact, please) even many RTG engineers couldn't get them. And as a side note, man is it hard to create a demo for hardware you don't have, especially when that demo has requirements specific to the hardware in question.

We've had a good relationship with them though, and their newer ProRender/software leadership has meant that there's better followthrough on their end - but I get worried when I read some of the news/rumors here. I'm just an artist that loves rendering tech, so the game side of things is irrelevant to me, but I know if AMD drops the ball on the gaming side, the Pro side I rely on will be affected. Though the statement that Dr. Su wants to focus on the pro side first is really heartening - they were industry leaders in RAM capacity as recently as the Hawaii chip (the W9100 with 32GB of RAM was incredible - it wasn't the fastest card, but man...) and I think they need a unique draw for guys like me. Hell, a Vega with 32GB would be a dream come true, or (dreaming now) any card with 56 or 64GB. Some of our scenes use 100GB+ of RAM when rendering, meaning we can only render on CPU (much slower than GPU unless you have something like a 56+ thread processor), so having those on GPU isn't feasible... but many of our "smaller" scenes use 30-60GB, so having a modern card that could handle that load would be amazing.

Unrelated to RTG, but I recently discovered that the RPD Polaris seems to be nonexistent in the wild. Besides me and like 2 or 3 other guys (all of us who got them provided by AMD), I can't find a single review or purchase comment. I understood when this happened with the original SSG (there were only 50 made and it wasn't easy to leverage for work, that's for sure), but I would've expected at least a couple of artists to pick one up, considering it's 2x WX7100 with 2x RAM for less than the cost of two. But maybe the slower speed, bad timing (right before Vega...) and no review samples meant they didn't make many, and people just didn't want them. So, I think I'll be benchmarking and reviewing this guy as a sort of curiosity/artifact.
|
# ? Nov 11, 2017 18:07 |
|
Zero VGS posted:I like how "ChipHell" gets the Hades Canyon chip, and there's Devil's Canyon too. Wouldn't be surprised, considering their last layoff round, and the new one coming for the finance team - probably just enough to shore up what they have to pay Raja. The morale hell over there might be part of it, but I have to admit I liked Devil's Canyon, and this new one looks even more badass. However, considering how things continue to look and be priced from Intel, I still feel my next build will eventually be a Threadripper and whatever NVIDIA badassness they have with a Ti on the end. My current rig somehow continues to just deliver, so no reason to chuck it yet.
|
# ? Nov 11, 2017 19:10 |
|
repiv posted:Heise.de is claiming that NV's new consumer architecture will be called Ampere, not Volta, and should be announced during GTC18 at the end of March. Would a 1050 Ti be a decent fit for an old i5-2500 (stock clock, non-K) based system? Just to make it last a bit longer and then work as a living room gaming system for less demanding stuff. Of course with this approach, if I do get a VR system around the holidays I'd still be screwed, so yeah.
|
# ? Nov 11, 2017 21:42 |
|
IMO VR kinda sucks value-wise for anyone who isn't an enthusiast. Wait for the tech to improve so you aren't stuck with the 1st-gen tech when everyone else has the new 8K screens at 144Hz. A 1050 Ti will be fine for 1080p and will play what you want, but if you can afford $100 more, the 1060 6GB is a much, much stronger card at approximately the same fps/$.
|
# ? Nov 12, 2017 01:31 |
|
On the other hand, the Rift is $350 on Black Friday.
|
# ? Nov 12, 2017 02:12 |
|
The Gasmask posted:Guess I have a Vega 64 coming, should be interesting to test and compare to the Pro Duo Polaris and 1070. Was hoping for the WX9100 and a threadripper considering I did a bunch of work for them to demo the WX9100+threadripper on, but there seems to have been a major dearth of AMD hardware to the point where I heard (don’t take this as hard fact, please) even many RTG engineers couldn’t get them. And as a side note, man is it hard to create a demo for hardware you don’t have, especially when that demo has requirements specific to the hardware in question. No Radeon SSG card for you? Seems like if you need that much VRAM, it would be useful. Pricing thing?
|
# ? Nov 12, 2017 02:17 |
|
NewFatMike posted:No Radeon SSG card for you? Seems like if you need that much VRAM, it would be useful. Pricing thing? AFAICT the SSG would only be useful for workloads that slowly and sequentially churn through a large dataset; something like a GI renderer that needs to constantly sample textures scattered over the entire dataset is just going to thrash the cache and overload the SSD controller with tiny non-sequential reads.
|
# ? Nov 12, 2017 02:42 |
|
repiv posted:Heise.de is claiming that NV's new consumer architecture will be called Ampere, not Volta, and should be announced during GTC18 at the end of March. Announced in March means general availability in like May or June? I'd been planning to hold onto this 970 until Volta, but now I'm contemplating just getting a 1080 or 1080 Ti...
|
# ? Nov 12, 2017 04:29 |
|
VostokProgram posted:Announced in March means general availability in like may or june? I'd been planning to hold this 970 until Volta but now I'm contemplating just getting a 1080 or 1080ti... Yeah, my 970 went off warranty last month and it feels like walking a tightrope without a net below.
|
# ? Nov 12, 2017 04:44 |
|
NewFatMike posted:No Radeon SSG card for you? Seems like if you need that much VRAM, it would be useful. Pricing thing? We have the original SSG, actually - at least with that one, the problem was that software needed to be specifically coded for it. I think there was a custom version of Premiere available, but for 3D rendering workloads there weren't any custom versions of the programs we use, and considering the GPU in that thing wasn't anywhere near as powerful as even a 1070 and had really buggy drivers, it hasn't been used much. The new one looks cool, and we may end up with one or two of them depending. The benefit and caveat of partnering with hardware companies is we don't pay for this stuff - which means we only get awesome gear if they feel like sending it and we can do something neat with it in return. This fact has made me 10x more appreciative of consumer spending and I've become much more discriminating as a result - if I'm going to drop $500+ on a GPU or CPU, it better drat well be worth the money. I could never afford the bleeding edge before I got this job, so I was always midrange and often one or two gens behind. Then once I started getting the latest practically thrown at me, I realized just how temperamental and buggy that stuff was, and how often the gains were minimal for a significant increase in retail price. The Gasmask fucked around with this message at 05:26 on Nov 12, 2017 |
# ? Nov 12, 2017 05:17 |
|
BIG HEADLINE posted:Yeah, my 970 went off warranty last month and it feels like walking a tightrope without a net below. Out of warranty doesn't mean it will blow up. The worst-case scenario is that it does blow up and you have to buy a new video card, at which point the question becomes: why buy it now when you can buy it tomorrow? Tomorrow you will always get a better deal, barring any Xcoin shenanigans.
|
# ? Nov 12, 2017 06:41 |
|
But then maybe you can't play your games for a few days! Fauxtool posted:IMO VR kinda sucks value-wise for anyone who isn't an enthusiast. Wait for the tech to improve so you aren't stuck with the 1st-gen tech when everyone else has the new 8K screens at 144Hz. As for VR, I've actually been considering the Samsung headset; it's higher-res and does inside-out tracking already.
|
# ? Nov 12, 2017 09:28 |
|
mobby_6kl posted:As for VR I've been actually considering the Samsung headset, it's higher res and does inside-out tracking already.
As someone who has both a Rift and a Microsoft (Dell) headset, think twice before getting any of the Microsoft headsets. The lenses on the Dell at least are really bad and blur everything but the middle third of the image, and they don't have progressive lenses to do a tilt focus like the Rift has, and all of those headsets require you to be looking at your controllers for them to track accurately. If they are down by your sides or behind your back they drift off, and it actually prevents some game mechanics and stuff. The Rift is $400 now and $350 for Black Friday; the Samsung is $500. At least see if you can try the Samsung one first; they have it at some Microsoft Stores. But I don't think they hold up to the Rift at the same price, never mind more expensive. The resolution boost doesn't matter if you don't put money/time into really nice lenses. The Rift controllers are way better ergonomically too.
|
# ? Nov 12, 2017 09:46 |
|
Zero VGS posted:As someone who has both a Rift and a Microsoft (Dell) headset, think twice before getting any of the Microsoft headsets. The lenses on the Dell at least are really bad and blur everything but the middle third of the image, and they don't have progressive lenses to do a tilt focus like the Rift has, and all of those headsets require you to be looking at your controllers for them to track accurately. If they are down by your sides or behind your back they drift off, and it actually prevents some game mechanics and stuff. Yesterday I saw 1070s going for 400€, so buying a 1060 doesn't seem like that good an idea. I didn't realize my 970 was already 3 years old; welp, apparently it is. Luckily I only have a 60Hz monitor, so I don't need a faster card yet. Still waiting for those sweet 100Hz+ 38" G-Sync monitors..
|
# ? Nov 12, 2017 10:17 |
It is just a poo poo time to buy a monitor. I’m sticking with my 1200p 60Hz panel and 970 until high refresh rate adaptive sync HDR IPS/OLED becomes a thing that exists. Otherwise I’ll feel like I’m missing something. 16:10 4K/5K is already a pipe dream I’m going to need to give up on.
|
|
# ? Nov 12, 2017 10:36 |
|
Laslow posted:It is just a poo poo time to buy a monitor. I’m sticking with my 1200p 60Hz panel and 970 until high refresh rate adaptive sync HDR IPS/OLED becomes a thing that exists. Otherwise I’ll feel like I’m missing something. 16:10 4K/5K is already a pipe dream I’m going to need to give up on. CES is January 3rd to January 7th; your dreams might just be answered then. Maybe not the OLED part, but hopefully the HDR 144Hz G-Sync IPS. VostokProgram posted:Announced in March means general availability in like May or June? I'd been planning to hold this 970 until Volta but now I'm contemplating just getting a 1080 or 1080ti... Micron had a stock market statement in May saying they are starting GDDR6 production in April 2018, so yeah, that means you get your 1180 or whatever the gently caress they call it Founders Edition in your hand probably end of June at the earliest. God only knows how long AIB cards will be delayed, but hopefully I can get a good AIB before August. Urthor fucked around with this message at 11:58 on Nov 12, 2017 |
# ? Nov 12, 2017 11:56 |
|
repiv posted:Heise.de is claiming that NVs new consumer architecture will be called Ampere, not Volta, and should be announced during GTC18 at the end of March. Makes sense. Tensor cores are totally pointless for graphics.
|
# ? Nov 12, 2017 12:10 |
|
Urthor posted:Micron had a stock market statement in May saying they are starting GDDR6 production in April 2018, so yeah, that means you get your 1180 or whatever the gently caress they call it Founders Edition in your hand probably end of June at the earliest. On the other hand, SK Hynix said they're producing GDDR6 for a client releasing high-end graphics cards by early 2018, and May/June would really be stretching the definition of "early". Malcolm XML posted:Makes sense. Tensor cores are totally pointless for graphics. Yeah, there was no doubt that Tensor/FP64 support would be stripped out of the consumer chips. I wonder if they'll give us FP16 this time, though; it could go either way.
|
# ? Nov 12, 2017 12:16 |
|
Laslow posted:16:10 4K/5K is already a pipe dream I’m going to need to give up on. It wasn't a pipe dream 16 years ago, but today the idea is indeed dead.
|
# ? Nov 12, 2017 13:45 |
|
3840x1600 is good enough; you can run it at 2560x1600 for 16:10 if needed. Scaling still works so poorly that I don't want to buy a higher-DPI monitor than I did 10 years ago.
|
# ? Nov 12, 2017 13:50 |
Urthor posted:CES is January 3rd to January 7th your dreams might just be answered come then. I remember years ago analysts were saying that discrete video cards were gonna go out of style like sound cards, because graphics couldn't be improved upon much - as if they thought we'd be targeting 1080p60 forever. I wish I had a job like that and just got paid for being fuckin wrong all the time.
|
|
# ? Nov 12, 2017 13:56 |
|
repiv posted:On the other hand SK Hynix said they're producing GDDR6 for a client to release high-end graphics cards by early 2018, and May/June would really be stretching the definition of "early". Well, the timing points to AMD, but "high-end" points to Nvidia.
|
# ? Nov 12, 2017 13:58 |
Ihmemies posted:3840x1600 is good enough, you can run it on 2560x1600 for 16:10 if needed. Scaling still works so poorly I don't want to buy a higher dpi monitor than 10 years ago. Windows 10 is little bit better and MacOS HiDPI is good. I’m tired of the DPI of my computer getting dunked on so hard by my goddamn cellphone.
|
|
# ? Nov 12, 2017 14:01 |
|
Laslow posted:Windows 10 is little bit better and MacOS HiDPI is good. I’m tired of the DPI of my computer getting dunked on so hard by my goddamn cellphone. Stop having your monitor in your face I guess?
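To put numbers on that phone-vs-monitor gap, here's a quick PPI calculation - the specific display sizes below are example picks of mine, not figures from the thread:

```python
# Pixels per inch = diagonal resolution in pixels / diagonal size in inches.
# The display sizes below are illustrative picks, not from the thread.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1200, 24)))   # ~94, a typical 24" 1200p desktop panel
print(round(ppi(3840, 2160, 27)))   # ~163, a 27" 4K monitor
print(round(ppi(2960, 1440, 5.8)))  # ~568, a flagship phone of the era
```

Even a 4K desktop monitor sits well under a third of a modern phone's pixel density, which is the complaint above in a nutshell.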
|
# ? Nov 12, 2017 14:05 |
|
Laslow posted:It is just a poo poo time to buy a monitor. I’m sticking with my 1200p 60Hz panel and 970 until high refresh rate adaptive sync HDR IPS/OLED becomes a thing that exists. Otherwise I’ll feel like I’m missing something. 16:10 4K/5K is already a pipe dream I’m going to need to give up on. The TV market is so much more competitive it's not even funny anymore. Last year on Black Friday I saw a 55" 4K going for $300 with plenty of stock, but make that into a 27" monitor, add factory calibration, and suddenly it's $1000+ "ahaha fuuuuck you PC losers". At the end of the day that still doesn't hold a candle to an LG C7 OLED, now going for $1700, in anything besides PPI.
|
# ? Nov 12, 2017 14:21 |
|
Isn't the time-to-photon pretty poo poo in your average TV?
|
# ? Nov 12, 2017 14:26 |
|
SwissArmyDruid posted:Well, the timing points to AMD, but "high-end" points to Nvidia. comedy option: intel iris extreme with gddr6 launching in january
|
# ? Nov 12, 2017 14:55 |
|
repiv posted:comedy option: intel iris extreme with gddr6 launching in january at this point I would not be surprised
|
# ? Nov 12, 2017 16:26 |
|
Ihmemies posted:Scaling still works so poorly I don't want to buy a higher dpi monitor than 10 years ago. On the other hand, this is a real chicken-and-egg problem. If people continue to mostly not buy higher-DPI desktop displays, the idiots creating all these lovely Windows apps that break when scaled will never have an incentive to fix things. Remember when Apple launched Retina displays? Lots of major apps did not properly support them. I remember the first time I messed with one at an Apple Store, it seemed like half of what they had installed on the demo systems was either tiny or blurry. The majority released updates within a year or so, and many of those that weren't updated were abandoned projects anyway. It's been years since I've seen improper scaling on a Mac. Why does the Mac software get fixed when there are tons of Windows apps that still get it wrong after so many years of these problems being pointed out? Because they were forced to. People were buying Retina Macs and weren't willing to compromise. Unfortunately, the Windows world has no single vendor who can unilaterally push hardware changes, and a lot more users who would rather do backflips through flaming hoops than consider replacing the piece of software they started using in 1997. wolrah fucked around with this message at 17:39 on Nov 12, 2017 |
# ? Nov 12, 2017 17:35 |
|
That is exactly why I got a 49" Samsung 4K quantum dot TV to use as a monitor: 1:1 scaling at around 90 DPI, which renders any app correctly. Bottom line, Windows and high DPI is a non-starter. Modern apps are fine, but I use basically none of those.
|
# ? Nov 12, 2017 19:13 |
|
3dcenter.org reports that the Intel i7-8809G APU is scoring 13341 in 3DMark 11, the same as a GTX 1060 Max-Q and ~10% behind a mobile GTX 1060.
|
# ? Nov 12, 2017 19:14 |
|
eames posted:3dcenter.org reports that the Intel i7-8809G APU is scoring 13341 in 3DMark 11, the same as a GTX 1060 Max-Q and ~10% behind a mobile GTX 1060. I wonder how much power these chips are going to use and how much heat they'll generate.
|
# ? Nov 12, 2017 20:55 |
|
|
|
eames posted:3dcenter.org reports that the Intel i7-8809G APU is scoring 13341 in 3DMark 11, the same as a GTX-1060 MaxQ and ~10% behind a mobile GTX-1060. These numbers do not help me feel less annoyed about buying this laptop!
|
# ? Nov 12, 2017 22:24 |