repiv
Aug 13, 2009

craig588 posted:

1. Yes, the Volta x80 is coming in spring, but based on past trends the 1080 Ti will only be a bit slower than it, and if you can use them now it's probably not worth waiting

I think it's also pretty likely that the Volta xx80 will have 8GB of VRAM, and in that case the 1080 Ti would still have a clear advantage in VRAM-bound machine learning tasks.
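As a back-of-envelope illustration of why that VRAM gap matters for training, here is a rough Python sketch; the model size, optimizer overhead, and activation footprint are all hypothetical assumptions, not measurements:

```python
# Rough VRAM budget for training: weights + gradients + optimizer state
# + activations all have to fit on the card. All figures are illustrative.

def training_vram_gb(params_millions: float,
                     bytes_per_param: int = 4,      # FP32 weights
                     optimizer_copies: int = 2,     # e.g. Adam's two moment buffers
                     activations_gb: float = 3.0):  # assumed batch footprint
    # one copy each for weights and gradients, plus optimizer state
    tensor_bytes = params_millions * 1e6 * bytes_per_param * (2 + optimizer_copies)
    return tensor_bytes / 1e9 + activations_gb

# Hypothetical 300M-parameter model:
need = training_vram_gb(300)   # 4.8 GB of tensors + ~3 GB of activations
print(f"~{need:.1f} GB needed: tight on an 8GB card, comfortable on 11GB")
```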

shrike82
Jun 11, 2005

Krailor posted:

Another option for a bunch of pcie lanes would be to build an x99 system. It's a little older but plenty of places are still selling stock and it will be much cheaper than either a Threadripper or x299 system.

Also if you're going to get 4x GPUs you want to get ones with blower style coolers; they're made for sitting close together in SLI. The open air coolers are great if there's space between cards but they start choking when the cards are right next to each other.

ooh, hadn't thought about the need for blower style. thanks.

and in terms of the power supply, is 1K watts enough for TR+ 4x 1080 Tis?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

shrike82 posted:

ooh, hadn't thought about the need for blower style. thanks.

and in terms of the power supply, is 1K watts enough for TR+ 4x 1080 Tis?

A 1080 Ti can happily suck down 250-300W, so you'd be looking at 1-1.2kW just for the GPUs. I'd look at ~1400W PSUs.
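A quick sanity check on that math; the CPU and platform numbers below are assumptions, not measurements:

```python
# Ballpark PSU sizing for a Threadripper + 4x 1080 Ti build (illustrative figures).
gpu_w_low, gpu_w_high = 250, 300   # per-card draw range cited above
cpu_w = 180                        # assumed Threadripper under load
misc_w = 70                        # assumed board, RAM, drives, fans

low = gpu_w_low * 4 + cpu_w + misc_w     # 1250 W
high = gpu_w_high * 4 + cpu_w + misc_w   # 1450 W
print(f"estimated system load: {low}-{high} W -> a ~1400-1600 W PSU for headroom")
```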

shrike82
Jun 11, 2005

for ML, I've been using nvidia-smi to turn down the power for my current 1080 Ti to 180W

i have to check whether my wall outlet supports >1kW safely
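For reference, that kind of power capping can be scripted; a minimal sketch assuming a hypothetical four-card box (nvidia-smi's -pl flag sets the board power limit in watts and generally needs root/admin):

```python
# Cap each GPU's board power to 180 W via nvidia-smi, as described above.
import subprocess

def set_power_limit(gpu_index: int, watts: int) -> None:
    # -i selects the GPU by index, -pl sets the board power limit in watts
    subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
                   check=True)

for gpu in range(4):   # hypothetical 4x 1080 Ti machine
    set_power_limit(gpu, 180)
```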

wolrah
May 8, 2006
what?

VulgarandStupid posted:

It's better to have a separate power brick as it removes a heat-producing part from the enclosure, and they're passively cooled. Plus I'd rather have a brick on the floor than that much more volume on my desk.
Don't forget the ease of replacement. Power supplies are one of the most common failures I see and the silly custom poo poo you find in a lot of internal PSU SFF machines can be impossible to track down. I have otherwise good Dell Optiplex SFF machines dropping like flies because their OEM PSUs don't like to get hot but are also the primary dust collection zone. Replacing those is a matter of finding random sellers on ebay or Amazon and hoping what they have is what I actually need because there are a dozen different variants of which only some are compatible. An external brick using a standard barrel plug is nice and simple.

Now, if we start talking about proprietary connectors like the Xbox 360 and certain laptops that's a different matter entirely, that's just the worst of both worlds.

SwissArmyDruid posted:

Here's hoping Intel doesn't ship with one of those stupid loving ones with the cable from the brick to the wall ungodly long, and the cable from the brick to the device being too goddamn short.
This part I don't have much confidence in. I recently installed some low-end NUCs and they shipped with wall warts with ~4' cables. The wall wart alone was annoying enough, but the short cable meant we ended up having to attach power strips to the back of the desk rather than just leaving them on the floor where they'd be more accessible.

I'd place my bets on a midspan brick with 2.5 feet of cable on the device side and a four foot C5 cable on the wall side.

wolrah fucked around with this message at 15:43 on Nov 11, 2017

repiv
Aug 13, 2009

Heise.de is claiming that NV's new consumer architecture will be called Ampere, not Volta, and should be announced during GTC18 at the end of March.

I suppose Ampere for consumers and Volta for HPC makes sense, to avoid a repeat of the Pascal confusion where two very different architectures had the same name.

repiv fucked around with this message at 16:30 on Nov 11, 2017

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Even with blower cards, the cards in the middle of the stack are going to be getting hot - or alternately the blower is going to get pretty noisy.

The Gasmask
Nov 30, 2006

Breaking fingers like fractals
Guess I have a Vega 64 coming, should be interesting to test and compare to the Pro Duo Polaris and 1070. Was hoping for the WX9100 and a threadripper considering I did a bunch of work for them to demo the WX9100+threadripper on, but there seems to have been a major dearth of AMD hardware to the point where I heard (don’t take this as hard fact, please) even many RTG engineers couldn’t get them. And as a side note, man is it hard to create a demo for hardware you don’t have, especially when that demo has requirements specific to the hardware in question.

We’ve had a good relationship with them though and their newer ProRender/software leadership has meant that there’s better followthrough on their end - but I get worried when I read some of the news/rumors here.

I’m just an artist who loves rendering tech so the game side of things is irrelevant to me, but I know if AMD drops the ball on the gaming side, the Pro side I rely on will be affected. Though the statement that Dr. Su wants to focus on the pro side first is really heartening - they were industry leaders in RAM capacity as recently as the Hawaii chip (the W9100 with 32GB of RAM was incredible - it wasn’t the fastest card, but man...) and I think they need a unique draw for guys like me.

Hell, a Vega with 32GB would be a dream come true, or (dreaming now) any card with 56 or 64 GB. Some of our scenes use 100GB+ of RAM when rendering, meaning we can only render on CPU (much slower than GPU unless you have something like a 56+ thread processor), so having those on GPU isn’t feasible... but many of our “smaller” scenes use 30-60GB, so having a modern card that could handle that load would be amazing.


Unrelated to RTG, but I recently discovered that the RPD Polaris seems to be nonexistent in the wild. Besides me and like 2 or 3 other guys (all of us who got them provided by AMD), I can’t find a single review or purchase comment. I understood when this happened with the original SSG (there were only 50 made and it wasn’t easy to leverage for work, that’s for sure), but I would’ve expected at least a couple of artists to pick one up, considering it’s 2x WX7100 with 2x RAM for less than the cost of two.

But maybe the slower speed, bad timing (right before Vega...) and no review samples meant they didn’t make many, and people just didn’t want them. So, I think I’ll be benchmarking and reviewing this guy as a sort of curiosity/artifact.

EdEddnEddy
Apr 5, 2012



Zero VGS posted:

I like how "ChipHell" gets the Hades Canyon chip, and there's Devil's Canyon too.

Has Intel been hailing Satan all this time? Explains a lot of things...

Wouldn't be surprised, considering their last layoff round, and the new one coming for the finance team - probably just enough to shore up what they have to pay Raja. The morale hell over there might be part of it, but I have to admit I liked Devil's Canyon, and this new one looks even more badass.

However, considering how things continue to look and be priced from Intel, I still feel my next build will eventually be a Threadripper and whatever NVIDIA badassness has a Ti on the end. My current rig somehow continues to just deliver, so there's no reason to chuck it yet.

mobby_6kl
Aug 9, 2009

by Fluffdaddy

repiv posted:

Heise.de is claiming that NV's new consumer architecture will be called Ampere, not Volta, and should be announced during GTC18 at the end of March.

I suppose Ampere for consumers and Volta for HPC makes sense, to avoid a repeat of the Pascal confusion where two very different architectures had the same name.
Thanks nvidia, now I'm stuck in another awkward spot. My 650 Ti is pretty much done at this point now that it can barely run Wolfenstein, so I was going to go all out and get a new system with an 8700 and maybe a 1070/Ti, and now what.

Would a 1050 Ti card be a decent fit for an old i5-2500 (stock clock, non-K) based system? Just to make it last a bit longer and then work as a living room gaming system for less demanding stuff. Of course with this approach, if I do get a VR system around the holidays, I'd still be screwed, so yeah.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
IMO VR kinda sucks value-wise for anyone who isn't an enthusiast. Wait for the tech to improve so you aren't stuck with 1st-gen tech when everyone else has the new 8K screens at 144Hz.

A 1050 Ti will be fine for 1080p and will play what you want, but if you can afford $100 more, the 1060 6GB is a much, much stronger card at approximately the same fps/$.
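To make the fps/$ comparison concrete, a tiny sketch with made-up prices and framerates (purely illustrative, not benchmark data):

```python
# Hypothetical numbers only: the point is that fps per dollar comes out similar,
# while the 1060 6GB buys more absolute performance headroom.
cards = {
    "1050 Ti": {"price_usd": 150, "avg_fps_1080p": 55},
    "1060 6GB": {"price_usd": 250, "avg_fps_1080p": 90},
}
for name, c in cards.items():
    print(f"{name}: {c['avg_fps_1080p'] / c['price_usd']:.3f} fps/$")
```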

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
On the other hand, the Rift is $350 on Black Friday.

NewFatMike
Jun 11, 2015

The Gasmask posted:

Guess I have a Vega 64 coming, should be interesting to test and compare to the Pro Duo Polaris and 1070. Was hoping for the WX9100 and a threadripper considering I did a bunch of work for them to demo the WX9100+threadripper on, but there seems to have been a major dearth of AMD hardware to the point where I heard (don’t take this as hard fact, please) even many RTG engineers couldn’t get them. And as a side note, man is it hard to create a demo for hardware you don’t have, especially when that demo has requirements specific to the hardware in question.

We’ve had a good relationship with them though and their newer ProRender/software leadership has meant that there’s better followthrough on their end - but I get worried when I read some of the news/rumors here.

I’m just an artist who loves rendering tech so the game side of things is irrelevant to me, but I know if AMD drops the ball on the gaming side, the Pro side I rely on will be affected. Though the statement that Dr. Su wants to focus on the pro side first is really heartening - they were industry leaders in RAM capacity as recently as the Hawaii chip (the W9100 with 32GB of RAM was incredible - it wasn’t the fastest card, but man...) and I think they need a unique draw for guys like me.

Hell, a Vega with 32GB would be a dream come true, or (dreaming now) any card with 56 or 64 GB. Some of our scenes use 100GB+ of RAM when rendering, meaning we can only render on CPU (much slower than GPU unless you have something like a 56+ thread processor), so having those on GPU isn’t feasible... but many of our “smaller” scenes use 30-60GB, so having a modern card that could handle that load would be amazing.


Unrelated to RTG, but I recently discovered that the RPD Polaris seems to be nonexistent in the wild. Besides me and like 2 or 3 other guys (all of us who got them provided by AMD), I can’t find a single review or purchase comment. I understood when this happened with the original SSG (there were only 50 made and it wasn’t easy to leverage for work, that’s for sure), but I would’ve expected at least a couple of artists to pick one up, considering it’s 2x WX7100 with 2x RAM for less than the cost of two.

But maybe the slower speed, bad timing (right before Vega...) and no review samples meant they didn’t make many, and people just didn’t want them. So, I think I’ll be benchmarking and reviewing this guy as a sort of curiosity/artifact.

No Radeon SSG card for you? Seems like if you need that much VRAM, it would be useful. Pricing thing?

repiv
Aug 13, 2009

NewFatMike posted:

No Radeon SSG card for you? Seems like if you need that much VRAM, it would be useful. Pricing thing?

AFAICT SSG is only useful for workloads that slowly and sequentially churn through a large dataset; something like a GI renderer that needs to constantly sample textures scattered over the entire dataset is just going to thrash the cache and overload the SSD controller with tiny non-sequential reads :supaburn:
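A toy model of that bottleneck, with assumed SSD figures (not real SSG specs): random throughput is IOPS-bound until reads get large enough to saturate the drive's sequential bandwidth:

```python
# Effective SSD throughput vs. read size; seq_bw_mbs and max_iops are assumptions.
def effective_bw_mbs(read_size_kb: float,
                     seq_bw_mbs: float = 2000.0,   # assumed sequential ceiling
                     max_iops: float = 50_000) -> float:
    iops_bound = max_iops * read_size_kb / 1024.0  # MB/s if IOPS-limited
    return min(seq_bw_mbs, iops_bound)

for size_kb in (4, 64, 1024):   # texture-sample-sized reads vs. streaming chunks
    print(f"{size_kb:>5} KB reads -> ~{effective_bw_mbs(size_kb):.0f} MB/s")
# 4 KB: ~195 MB/s; 64 KB and up: ~2000 MB/s. Tiny scattered reads waste the drive.
```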

Yaoi Gagarin
Feb 20, 2014

repiv posted:

Heise.de is claiming that NV's new consumer architecture will be called Ampere, not Volta, and should be announced during GTC18 at the end of March.

I suppose Ampere for consumers and Volta for HPC makes sense, to avoid a repeat of the Pascal confusion where two very different architectures had the same name.

Announced in March means general availability in like May or June? I'd been planning to hold this 970 until Volta, but now I'm contemplating just getting a 1080 or 1080 Ti...

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

VostokProgram posted:

Announced in March means general availability in like May or June? I'd been planning to hold this 970 until Volta, but now I'm contemplating just getting a 1080 or 1080 Ti...

Yeah, my 970 went out of warranty last month and it feels like walking a tightrope without a net below.

The Gasmask
Nov 30, 2006

Breaking fingers like fractals

NewFatMike posted:

No Radeon SSG card for you? Seems like if you need that much VRAM, it would be useful. Pricing thing?

We have the original SSG actually - at least with that one, the problem was that software needed to be specifically coded for it. I think there was a custom version of Premiere available, but for 3D rendering workloads there weren’t any custom versions of the programs we use, and considering the GPU in that thing wasn’t anywhere near as powerful as even a 1070 and had really buggy drivers, it hasn’t been used much.

The new one looks cool, and we may end up with one or two of them depending. The benefit and caveat of partnering with hardware companies is we don’t pay for this stuff - which means we only get awesome gear if they feel like sending it and we can do something neat with it in return.

This fact has made me 10x more appreciative of consumer spending and I’ve become much more discriminating as a result - if I’m going to drop $500+ on a GPU or CPU, it better drat well be worth the money.

I could never afford the bleeding edge before I got this job, so I was always midrange and often one or two gens behind. Then once I started getting the latest practically thrown at me, I realized just how temperamental and buggy that stuff was, and how often the gains were minimal for a significant increase in retail price.

The Gasmask fucked around with this message at 05:26 on Nov 12, 2017

Volguus
Mar 3, 2009

BIG HEADLINE posted:

Yeah, my 970 went out of warranty last month and it feels like walking a tightrope without a net below.

Out of warranty doesn't mean it will blow up. The worst-case scenario is that it does blow up and you have to buy a new video card, at which point the question becomes: why buy it now when you can buy it tomorrow? Tomorrow you will always get a better deal, barring any Xcoin shenanigans.

mobby_6kl
Aug 9, 2009

by Fluffdaddy
But then maybe you can't play your games for a few days!

Fauxtool posted:

IMO VR kinda sucks value-wise for anyone who isn't an enthusiast. Wait for the tech to improve so you aren't stuck with 1st-gen tech when everyone else has the new 8K screens at 144Hz.

A 1050 Ti will be fine for 1080p and will play what you want, but if you can afford $100 more, the 1060 6GB is a much, much stronger card at approximately the same fps/$.
Sadly a 1060 is like 350 bucks here, which is almost in the "just get the 1070" territory, but might be a decent compromise.

As for VR, I've actually been considering the Samsung headset; it's higher-res and does inside-out tracking already.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

mobby_6kl posted:


As for VR, I've actually been considering the Samsung headset; it's higher-res and does inside-out tracking already.

As someone who has both a Rift and a Microsoft (Dell) headset, think twice before getting any of the Microsoft headsets. The lenses on the Dell at least are really bad and blur everything but the middle third of the image, and they don't have progressive lenses to do a tilt focus like the Rift has. Also, all of those headsets require you to be looking at your controllers for them to track accurately; if they are down by your sides or behind your back they drift off, and it actually prevents some game mechanics and stuff.

The Rift is $400 now and $350 for Black Friday; the Samsung is $500. At least see if you can try the Samsung one first - they have it at some Microsoft Stores. But I don't think they hold up to the Rift at the same price, never mind more expensive. The resolution boost doesn't matter if you don't put money/time into really nice lenses. The Rift controllers are way better ergonomically too.

Ihmemies
Oct 6, 2012

Zero VGS posted:

As someone who has both a Rift and a Microsoft (Dell) headset, think twice before getting any of the Microsoft headsets. The lenses on the Dell at least are really bad and blur everything but the middle third of the image, and they don't have progressive lenses to do a tilt focus like the Rift has. Also, all of those headsets require you to be looking at your controllers for them to track accurately; if they are down by your sides or behind your back they drift off, and it actually prevents some game mechanics and stuff.

The Rift is $400 now and $350 for Black Friday; the Samsung is $500. At least see if you can try the Samsung one first - they have it at some Microsoft Stores. But I don't think they hold up to the Rift at the same price, never mind more expensive. The resolution boost doesn't matter if you don't put money/time into really nice lenses. The Rift controllers are way better ergonomically too.

Yesterday I saw 1070s going for 400€, so yes, buying a 1060 doesn't seem to be that good an idea. I didn't realize my 970 was already 3 years old - welp, apparently it is. Luckily I have only a 60Hz monitor, so I don't need a faster card yet. Still waiting for those sweet 100Hz+ 38" G-Sync monitors...

Laslow
Jul 18, 2007
It is just a poo poo time to buy a monitor. I’m sticking with my 1200p 60Hz panel and 970 until high refresh rate adaptive sync HDR IPS/OLED becomes a thing that exists. Otherwise I’ll feel like I’m missing something. 16:10 4K/5K is already a pipe dream I’m going to need to give up on.

Urthor
Jul 28, 2014

Laslow posted:

It is just a poo poo time to buy a monitor. I’m sticking with my 1200p 60Hz panel and 970 until high refresh rate adaptive sync HDR IPS/OLED becomes a thing that exists. Otherwise I’ll feel like I’m missing something. 16:10 4K/5K is already a pipe dream I’m going to need to give up on.

CES is January 3rd to January 7th; your dreams might just be answered come then.

Maybe not the OLED part, but hopefully the HDR 144Hz G-Sync IPS.

VostokProgram posted:

Announced in March means general availability in like May or June? I'd been planning to hold this 970 until Volta, but now I'm contemplating just getting a 1080 or 1080 Ti...

Micron said in a May investor statement that they're starting GDDR6 production in April 2018, so that means you get your 1180 (or whatever the gently caress they call it) Founders Edition in hand probably end of June at the earliest. God only knows how long AIB cards will be delayed, but hopefully I can get a good AIB before August.

Urthor fucked around with this message at 11:58 on Nov 12, 2017

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

repiv posted:

Heise.de is claiming that NV's new consumer architecture will be called Ampere, not Volta, and should be announced during GTC18 at the end of March.

I suppose Ampere for consumers and Volta for HPC makes sense, to avoid a repeat of the Pascal confusion where two very different architectures had the same name.

Makes sense. Tensor cores are totally pointless for graphics.

repiv
Aug 13, 2009

Urthor posted:

Micron said in a May investor statement that they're starting GDDR6 production in April 2018, so that means you get your 1180 (or whatever the gently caress they call it) Founders Edition in hand probably end of June at the earliest. God only knows how long AIB cards will be delayed, but hopefully I can get a good AIB before August.

On the other hand, SK Hynix said they're producing GDDR6 for a client to release high-end graphics cards by early 2018, and May/June would really be stretching the definition of "early".

Malcolm XML posted:

Makes sense. Tensor cores are totally pointless for graphics.

Yeah, there was no doubt that Tensor/FP64 support would be stripped out of consumer chips. I wonder if they'll give us FP16 this time, though; it could go either way.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Laslow posted:

16:10 4K/5K is already a pipe dream I’m going to need to give up on.

It wasn't a pipe dream 16 years ago, but today the idea is indeed dead.

Ihmemies
Oct 6, 2012

3840x1600 is good enough; you can run it at 2560x1600 for 16:10 if needed. Scaling still works so poorly that I don't want to buy a higher-DPI monitor than I had 10 years ago.

Laslow
Jul 18, 2007

Urthor posted:

CES is January 3rd to January 7th; your dreams might just be answered come then.

Maybe not the OLED part, but hopefully the HDR 144Hz G-Sync IPS.
It won’t be cheap getting a good enough GPU to drive it properly in any case, so I’m not in a huge rush.

I remember years ago analysts were saying that discrete video cards were gonna go out of style like sound cards because graphics couldn't be improved upon much, as if they thought we'd be targeting 1080p60 forever. I wish I had a job like that and just got paid for being fuckin' wrong all the time.

SwissArmyDruid
Feb 14, 2014

by sebmojo

repiv posted:

On the other hand, SK Hynix said they're producing GDDR6 for a client to release high-end graphics cards by early 2018, and May/June would really be stretching the definition of "early".

Well, the timing points to AMD, but "high-end" points to Nvidia.

Laslow
Jul 18, 2007

Ihmemies posted:

3840x1600 is good enough; you can run it at 2560x1600 for 16:10 if needed. Scaling still works so poorly that I don't want to buy a higher-DPI monitor than I had 10 years ago.
Windows 10 is a little bit better, and macOS HiDPI is good. I’m tired of the DPI of my computer getting dunked on so hard by my goddamn cellphone.

orcane
Jun 13, 2012

Fun Shoe

Laslow posted:

Windows 10 is a little bit better, and macOS HiDPI is good. I’m tired of the DPI of my computer getting dunked on so hard by my goddamn cellphone.

Stop having your monitor in your face I guess?

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Laslow posted:

It is just a poo poo time to buy a monitor. I’m sticking with my 1200p 60Hz panel and 970 until high refresh rate adaptive sync HDR IPS/OLED becomes a thing that exists. Otherwise I’ll feel like I’m missing something. 16:10 4K/5K is already a pipe dream I’m going to need to give up on.

The TV market is so much more competitive it's not even funny anymore. Last Black Friday I saw a 55" 4K going for $300 with plenty of stock, but make that into a 27" monitor, add factory calibration, and suddenly it's $1000+ - "ahaha fuuuuck you PC losers". And at the end of the day that still doesn't hold a candle to an LG C7 OLED, now going for $1700, in anything besides PPI.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
Isn't the time-to-photon pretty poo poo in your average TV?

repiv
Aug 13, 2009

SwissArmyDruid posted:

Well, the timing points to AMD, but "high-end" points to Nvidia.

comedy option: intel iris extreme with gddr6 launching in january

Arivia
Mar 17, 2011

repiv posted:

comedy option: intel iris extreme with gddr6 launching in january

at this point I would not be surprised

wolrah
May 8, 2006
what?

Ihmemies posted:

Scaling still works so poorly that I don't want to buy a higher-DPI monitor than I had 10 years ago.

On the other hand, this is a real chicken-and-egg problem. If people continue to mostly not buy higher-DPI desktop displays, the idiots creating all these lovely Windows apps that break when scaled will never have an incentive to fix things.

Remember when Apple launched Retina displays? Lots of major apps did not properly support them. I remember the first time I messed with one at an Apple Store, it seemed like half of what they had installed on the demo systems was either tiny or blurry.

The majority released updates within a year or so, and many of those that weren't being updated anymore were abandoned. It's been years since I've seen improper scaling on a Mac.

Why does the Mac software get fixed when there are tons of Windows apps that still get it wrong after so many years of these problems being pointed out? Because they were forced to.

People were buying Retina Macs and weren't willing to compromise.

Unfortunately the Windows world has no single vendor who can unilaterally push hardware changes, and a lot more users who would rather do backflips through flaming hoops than consider replacing the piece of software they started using in 1997.

wolrah fucked around with this message at 17:39 on Nov 12, 2017
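For the curious, the opt-in those broken Windows apps never make is a one-time API call. A minimal sketch using the real Win32 shcore API via Python's ctypes; Windows 8.1+ only, and it must run early at startup:

```python
# Declare per-monitor DPI awareness so Windows doesn't bitmap-stretch the app
# (that stretching is what produces the blur described above). Windows-only.
import ctypes

PROCESS_PER_MONITOR_DPI_AWARE = 2   # value from the PROCESS_DPI_AWARENESS enum

def opt_into_dpi_awareness() -> None:
    try:
        # Windows 8.1+: per-monitor DPI awareness
        ctypes.windll.shcore.SetProcessDpiAwareness(PROCESS_PER_MONITOR_DPI_AWARE)
    except (AttributeError, OSError):
        # Vista-8.0 fallback: system-wide DPI awareness only
        ctypes.windll.user32.SetProcessDPIAware()

opt_into_dpi_awareness()
```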

redeyes
Sep 14, 2002

by Fluffdaddy
That is exactly why I got a 49" Samsung 4K quantum dot TV to use as a monitor: 1:1 scaling at around 90 DPI, perfect scaling for any app. Bottom line, Windows and high DPI is a non-starter. Modern apps are fine, but that's really nothing I use.

eames
May 9, 2009

3dcenter.org reports that the Intel i7-8809G APU is scoring 13341 in 3DMark 11, the same as a GTX 1060 Max-Q and ~10% behind a mobile GTX 1060.

Rabid Snake
Aug 6, 2004



eames posted:

3dcenter.org reports that the Intel i7-8809G APU is scoring 13341 in 3DMark 11, the same as a GTX 1060 Max-Q and ~10% behind a mobile GTX 1060.

I wonder how much power these chips are going to use and how much heat they'll generate.

SwissArmyDruid
Feb 14, 2014

by sebmojo

eames posted:

3dcenter.org reports that the Intel i7-8809G APU is scoring 13341 in 3DMark 11, the same as a GTX 1060 Max-Q and ~10% behind a mobile GTX 1060.

These numbers do not help me feel less annoyed about buying this laptop!
