cubicle gangster
Jun 26, 2005

magda, make the tea
On a similar note, I've never understood the benefits of gpu rendering for large scale work. By the time you've got enough cards in your machine to fit a heavy production scene into memory, you could have built 5 upper-mid-range AMD boxes with 128gb of RAM each for a render farm.
It's a much higher entry price point and less flexible.
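To make that concrete, here's the kind of back-of-the-napkin math I mean - every price and capacity below is an illustrative assumption, not a real quote:

```python
# Back-of-the-napkin cost comparison: one multi-GPU box that can hold a heavy
# scene in VRAM vs. a handful of CPU render nodes. All numbers are assumptions.
SCENE_GB = 96            # assumed in-memory footprint of a heavy production scene

GPU_CARD_GB = 24         # assumed VRAM per high-end card
GPU_CARD_COST = 1500     # assumed price per card
GPU_HOST_COST = 2500     # assumed chassis/CPU/PSU that can actually host 4 cards

CPU_BOX_COST = 1500      # assumed upper-mid-range AMD box with 128gb of RAM
CPU_BOXES = 5

# Generously assume the cards can pool memory (NVLink / out-of-core),
# so "enough VRAM" is just a ceiling division.
cards_needed = -(-SCENE_GB // GPU_CARD_GB)
gpu_total = GPU_HOST_COST + cards_needed * GPU_CARD_COST

print(f"GPU route: {cards_needed} cards in one box, ~${gpu_total}")
print(f"CPU route: {CPU_BOXES} boxes, ~${CPU_BOXES * CPU_BOX_COST}, each with 128gb RAM")
```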


SubNat
Nov 27, 2008

Out-of-core rendering is getting more common now though, isn't it?
Last I checked out V-ray they had support for it, meaning that gpu rendering is becoming more flexible in regards to scene sizes, at a hit to performance. (Though dunno exactly how much, as that's kind of dependent on how much data needs to get swapped in.)
(Though I suppose Smart Access from AMD and its soon-to-be-announced nvidia counterpart might mitigate some of that perf hit, due to being able to access more vram at a time.)
As OoC gets more common and performant, you can get away with more, cheaper gpus that have way better bang for your buck, which starts making the value proposition a bit more practical, since you're not targeting the highest possible memory density.

And since vray and blender support pushing rendering to the cpu and gpu at the same time, you can be a lot more flexible in mixing hardware too.
Which is honestly pretty great, though it's problematic to have so much of rendering centered around CUDA, considering it's most likely never going to be open to amd or intel gpus.
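For the Blender side of that, flipping on hybrid CPU+GPU rendering is just a couple of preference toggles - a minimal bpy sketch, assuming the 2.8x/2.9x Cycles property names, so double-check against your build:

```python
import bpy

# Enable hybrid rendering in Cycles: tick both the CUDA GPU(s) and the CPU
# in the preferences, then set the scene to render on "GPU".
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "CUDA"   # "OPTIX" is the other option on RTX cards
prefs.get_devices()                  # refresh the detected device list

for dev in prefs.devices:
    dev.use = dev.type in {"CUDA", "CPU"}   # enable GPUs and the CPU together

bpy.context.scene.cycles.device = "GPU"
```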

echinopsis posted:

Well. I just looked at what I could get out of upgrading my GTX 1060 to an RTX 3090 which seems to be the fastest card there is, and the GPU mark goes from approx 10,000 to 25,000, and I know those numbers don't necessarily indicate how much faster everything would be, but it looks on the surface like I could get a 2.5x increase in speed.

(GPU marks aren't directly linear: a 2080 Ti vs a 3080 is like 21.5k vs 23.8k (~10% more) despite the 3080 performing 20-30% better in a lot of titles. Looking up rendering benchmarks etc. is a far better indicator of perf.)
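To put actual numbers on that gap:

```python
# The synthetic score undersells the real difference:
mark_2080ti, mark_3080 = 21_500, 23_800
print(f"GPU mark delta: {mark_3080 / mark_2080ti - 1:.1%}")   # ~10.7%
# ...versus the 20-30% the 3080 gains in actual workloads, which is why
# renderer-specific benchmarks are the better yardstick.
```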

Yeah, one aspect that doesn't really come through in GPU mark etc., since it only measures raster/game perf, is that all the RTX gpus have a bunch of extra silicon in the form of tensor cores and RT cores, which help kick things up a notch when it comes to rendering, provided the engine can make use of them.
V-ray got an average 40% speedup on an RTX 2080 from getting their renderer to make use of the additional RT hardware on the gpus, compared to just running their old gpu-rendering engine on the card.
(Though that's likely not relevant for you right now, I think blender just supports the optiX denoiser, but not any usage of RT hardware directly/explicitly?)

Elukka
Feb 18, 2011

For All Mankind

SubNat posted:

(Though that's likely not relevant for you right now, I think blender just supports the optiX denoiser, but not any usage of RT hardware directly/explicitly?)
I think it does? At least benchmarks I can find with RTX cards show up to a 100% improvement in render times with OptiX on. (not the denoiser)

e: Yeah, apparently it does since some time last year: https://code.blender.org/2019/07/accelerating-cycles-using-nvidia-rtx/

Elukka fucked around with this message at 18:48 on Dec 11, 2020

SubNat
Nov 27, 2008

Elukka posted:

I think it does? At least benchmarks I can find with RTX cards show up to a 100% improvement in render times with OptiX on. (not the denoiser)

e: Yeah, apparently it does since some time last year: https://code.blender.org/2019/07/accelerating-cycles-using-nvidia-rtx/

Ah ye, egg on my face. Good on 'em. It's surprising how quickly renderers adopted the RT cores.
For whatever reason I assumed OptiX was specifically the AI denoising solution, as opposed to the general engine for hardware-accelerated RT core/raytracing support.

Probably because it gets presented as OptiX Denoising whenever I hear the name come up.
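For reference, the two OptiX pieces in Blender are separate toggles - a rough bpy sketch (property names as of the 2.8x/2.9x API, they've moved around between versions):

```python
import bpy

# OptiX as the Cycles render backend (RTX cards only)...
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"
prefs.get_devices()
for dev in prefs.devices:
    dev.use = dev.type == "OPTIX"

scene = bpy.context.scene
scene.cycles.device = "GPU"

# ...and OptiX as the denoiser, which is a separate setting entirely.
scene.cycles.use_denoising = True
scene.cycles.denoiser = "OPTIX"
```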

Elukka
Feb 18, 2011

For All Mankind
Same, I learned like last week that OptiX was also something other than a denoiser.

It is really cool how fast renderers adopted it. Now if I actually had an RTX card... I'm gonna see if used 2080 prices go down once the 3000 series is more available.

Kanine
Aug 5, 2014

by Nyc_Tattoo
Gonna ask a potentially stupid/uncomfortable question for other devs in the thread:

As a junior artist in the games industry (currently on a contract but looking for more remote work soon), how bad for my career is it that I tweet/retweet/like stuff about crunch, unionization, etc. on my public twitter (where I post my art and have a few hundred followers, mostly other people in the games industry)?

Am I basically loving myself over and making myself unhireable? Is it enough for me to make my account private when I send out applications, or is me doing this essentially getting me put on a literal blacklist that will prevent me from obtaining gainful employment in the industry ever again?

I'm obviously not stupid enough to talk about this on the work slack/discord in the company I'm currently contracting for but several of my coworkers have followed me on twitter since I started there a few months ago.

Kanine fucked around with this message at 03:22 on Dec 12, 2020

ImplicitAssembler
Jan 24, 2013

What do you hope to gain from it?

Stupid_Sexy_Flander
Mar 14, 2007

Is a man not entitled to the haw of his maw?
Grimey Drawer
Possible strange request, but can I hire one of you dudes or dudettes to make a coin holder for me? I've got a design in mind where it holds the coin upright and stable, but it's kinda curved a bit, and I need it to fit a specific coin, and I'm not exactly confident in my ability to match the size and shape.

It's for a file I'm just gonna make public, so if someone wants to knock it out and can hit me with a quote, please feel free to drop me a message.

Sagebrush
Feb 26, 2012

Post a picture of what you're imagining and if it's not too complex I can probably knock it out in a few minutes. No charge.

(If it's like covered in skulls and snakes and poo poo then we can negotiate)

Kanine
Aug 5, 2014

by Nyc_Tattoo

ImplicitAssembler posted:

What do you hope to gain from it?

Do you mean specifically from me talking about stuff on twitter, or like long term if I'm planning on actually doing something concrete once I'm confident in trying to organize a studio?

echinopsis
Apr 13, 2004

by Fluffdaddy
he’s hoping to unionise the graphic design industry

Kanine
Aug 5, 2014

by Nyc_Tattoo
if i were a graphic designer probably yeah

mutata
Mar 1, 2003

Every couple months I run a tweet deleter and delete every tweet, like, and reply older than a week. I think if I was actively applying to places, I would run that, pin a tweet with my top 4 pieces, and just retweet cool art until I got hired. I don't think you're doing any damage, but you should assume that employers and potential employers WILL look at your social media and judge you off it. Especially don't tweet about internal unionizing efforts or whatever, you'll find yourself no longer a "culture fit".

My current employer is pretty liberal and we're small and they're all aware that I'm an angry lefty weirdo who uses Twitter for yelling. YMMV. Social media is bad.

500
Apr 7, 2019

It sounds like you're asking if there's a global shared blacklist that all hiring managers use to keep out undesirables, or something. I don't think most of them care enough about who you are to bother doing something like that. That blacklist sounds like it would be a nightmare to maintain, too -- just some poor sap sifting through thousands of junior game devs and adding them to a spreadsheet if they tweet about workers' rights. Or making an algorithm do the job for them and accidentally filtering a bunch of talented artists out of the hiring pool.

Still, potential employers will most likely check out your twitter before they interview you. If you're worried your tweets might lose you points, then maybe rein it in a bit. But in my experience you need to be reaching a certain quality bar with your work before people even start assessing you for cultural fit.

floofyscorp
Feb 12, 2007

In all seriousness, you have to decide for yourself how much you want a job at a place that would drop you for tweeting about workers' rights. Early in your career, that might be something you're more willing to accept in exchange for experience and money. Or maybe there's just not much choice near you and you don't want to move, so you might want to be more circumspect.

Personally, I have a link to the union I helped found and do work for right in my Twitter bio because I'm too old and grumpy to put up with any poo poo anymore, but I have ten+ years experience and little expectation of working at a AAA studio any time soon anyway. YMMV.

ImplicitAssembler
Jan 24, 2013

I doubt there's a blacklist, but at least for VFX, people most certainly talk cross-company. I also doubt you'll get dropped, but if you pick up a reputation, then you will drop down the list if there's someone else with a similar skillset.
So... are you tweeting because you want to get more followers, or because you genuinely want to unionize? If it's the latter, is twitter really the right place for it?

cubicle gangster
Jun 26, 2005

magda, make the tea
I don't think you're in danger of being looked down on for tweeting about workers' rights, but how you tweet about it has the potential to rub people the wrong way. If you have an unnatural focus on wage theft and never working an hour over your contracted amount, it begins to reflect on what you might be like to work with long term.
There's also the chance it will come across as performative - depending on the content being shared and whether there is a clear intent.


SubNat posted:

Out-of-core rendering is getting more common now though, isn't it?
Last I checked out V-ray they had support for it, meaning that gpu rendering is becoming more flexible in regards to scene sizes, at a hit to performance. (Though dunno exactly how much, as that's kind of dependent on how much data needs to get swapped in.)

It does, but it's still missing basic vray features which we use daily (they're the kind of features that, if you didn't use them, you'd just render with corona or f-storm and save a shitload on vray licenses). And on top of that, out-of-core rendering is slower than CPU-only rendering on a $500 ryzen. They've got such a long way to go, and processors are making such huge gains with every generation, that I just don't see it ever catching up.

cubicle gangster fucked around with this message at 18:51 on Dec 12, 2020

SubNat
Nov 27, 2008

cubicle gangster posted:

It does, but it's still missing basic vray features which we use daily (they're the kind of features that, if you didn't use them, you'd just render with corona or f-storm and save a shitload on vray licenses). And on top of that, out-of-core rendering is slower than CPU-only rendering on a $500 ryzen. They've got such a long way to go, and processors are making such huge gains with every generation, that I just don't see it ever catching up.

That's very understandable then, at least until we hit another round of cpu stagnation (if amd runs out of steam without intel catching up, for example) and gpu development pivots ahead.
A shame they still don't have feature parity between the cpu/gpu renderers either.

(As mentioned though, I am curious whether nvidia rolling out resizable BAR support/their version of smart access will help negate the OoC rendering penalties - I don't think I've seen any benchmarks for how it helps with amd gpu rendering, only vague game benchmarks.
And also the whole 'gpus can read directly from storage without needing to wait for the cpu' feature as well. GPUDirect Storage?)

cubicle gangster
Jun 26, 2005

magda, make the tea

SubNat posted:

That's very understandable then, at least until we hit another round of cpu stagnation (if amd runs out of steam without intel catching up, for example) and gpu development pivots ahead.
A shame they still don't have feature parity between the cpu/gpu renderers either.

Yeah, I do think AMD making huge gains probably hosed up projections of how useful GPU rendering would actually be. On the high end cpu benchmark https://www.cpubenchmark.net/high_end_cpus.html
The 5950x is an $800 processor breaking 45k.
Side by side benchmarks (to get a clean image, in GPU bucket mode) place a 2080ti in line with a score of 20-25k on the CPU side, depending on what's being rendered. So while they've not been directly benchmarked, it seems like a 3090 should be on par with a 5950x.
AMD will have to stagnate and nvidia get the same kind of performance leap for another full generation or two before I can get on board. Right now it's double the workstation cost for a less flexible, inferior approach with no change in speed.
GPUs do scale more easily - you can fit 4 in one box without much effort... but then you've got an $8-10k+ workstation. I'd rather have 4 boxes and run a small farm.
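Roughly the spreadsheet behind that - the scores are the ones above, the build costs are ballpark assumptions for a whole workstation rather than real quotes:

```python
# Points-per-dollar for a whole workstation, using the benchmark scores from
# the post and assumed total build costs (placeholders, not quotes).
cpu_ws = {"name": "5950X box",                       "cost": 2_500, "score": 45_000}
gpu_ws = {"name": "3090 box (assumed ~5950X-class)", "cost": 4_500, "score": 45_000}

for ws in (cpu_ws, gpu_ws):
    print(f"{ws['name']:35s} ${ws['cost']:>5}  {ws['score'] / ws['cost']:.1f} points per $")
```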

Gearman
Dec 6, 2011

cubicle gangster posted:

Yeah, I do think AMD making huge gains probably hosed up projections of how useful GPU rendering would actually be. On the high end cpu benchmark https://www.cpubenchmark.net/high_end_cpus.html
The 5950x is an $800 processor breaking 45k.
Side by side benchmarks (to get a clean image, in GPU bucket mode) place a 2080ti in line with a score of 20-25k on the CPU side, depending on what's being rendered. So while they've not been directly benchmarked, it seems like a 3090 should be on par with a 5950x.
AMD will have to stagnate and nvidia get the same kind of performance leap for another full generation or two before I can get on board. Right now it's double the workstation cost for a less flexible, inferior approach with no change in speed.
GPUs do scale more easily - you can fit 4 in one box without much effort... but then you've got an $8-10k+ workstation. I'd rather have 4 boxes and run a small farm.


In a general sense (because I'm not really sure what scale you're discussing), at "large scale" rendering, CPUs are still certainly better.

Where GPUs become better is realtime feedback, higher performance viewport, and now being able to do dual GPU and CPU rendering. There's also a ton of AI-driven advancements that are only available to GPUs, in addition to specific rendering technology like CUDA.

When you talk about render setups like 4x GPU or CPU, then the better solution is likely just paying a few bucks for a cloud rendering solution. When you talk even bigger, you start considering cloud vs on-prem and which one makes sense really depends on your situation.

Either way, it still makes sense for most CG artists to have a single, beefy GPU just for the viewport performance alone.

cubicle gangster
Jun 26, 2005

magda, make the tea
We've actually retired our amazon EC2 account and gone with local hardware again - we already owned 60 vray render node licenses from our cloud use, so we have a closet full of $1400 ryzen builds running 24/7 now instead. After a lot of spreadsheeting we found it's the most cost effective solution by a long way, it's kind of wild.

I actually had a line in that post about the benefits of GPU - you do get immediate feedback much, much quicker, and it's more reactive to editing the scene, but final renders take longer to clean up. For some people who aren't doing final renders and need more immediate feedback, GPU does make a lot of sense.
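For anyone doing the same comparison, the spreadsheet boils down to a break-even calculation like this - every figure here is a placeholder assumption, plug in your own cloud pricing and utilisation:

```python
# Toy cloud-vs-local break-even model. All numbers are placeholder assumptions.
node_cost = 1_400         # one local Ryzen render node, bought outright
node_power_month = 30     # assumed power/cooling per node per month, USD
cloud_hour = 0.60         # assumed comparable cloud instance, USD per hour
hours_month = 400         # render hours you actually need per node per month

cloud_month = cloud_hour * hours_month
savings_month = cloud_month - node_power_month
print(f"Cloud: ${cloud_month:.0f}/mo vs local: ${node_power_month}/mo after hardware")
print(f"A local node pays for itself in ~{node_cost / savings_month:.1f} months")
```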

Edit: I am properly rambling about nothing, christ. Too much coffee today.

cubicle gangster fucked around with this message at 20:50 on Dec 12, 2020

KinkyJohn
Sep 19, 2002

Is Blender + Octane + Houdini + Zbrush a good combo for game dev and a bit of everything?

I'm hearing that blender really stepped up their game from being kinda crappy to being a seriously capable piece of software

Neon Noodle
Nov 11, 2016

there's nothing wrong here in montana

KinkyJohn posted:

Is Blender + Octane + Houdini + Zbrush a good combo for game dev and a bit of everything?

I'm hearing that blender really stepped up their game from being kinda crappy to being a seriously capable piece of software

Blender is life

500
Apr 7, 2019

KinkyJohn posted:

Is Blender + Octane + Houdini + Zbrush a good combo for game dev and a bit of everything?

I'm hearing that blender really stepped up their game from being kinda crappy to being a seriously capable piece of software

I mean, it depends. Are you about to drop a bunch of money on licenses or something? Assuming you're new to 3D -- maybe start with just Blender and find out what its limitations are first. Blender has decent sculpting tools, for example, so you may not even need zBrush.

sigma 6
Nov 27, 2004

the mirror would do well to reflect further

cubicle gangster posted:

Yeah, I do think AMD making huge gains probably hosed up projections of how useful GPU rendering would actually be. On the high end cpu benchmark https://www.cpubenchmark.net/high_end_cpus.html
The 5950x is an $800 processor breaking 45k.
Side by side benchmarks (to get a clean image, in GPU bucket mode) place a 2080ti in line with a score of 20-25k on the CPU side, depending on what's being rendered. So while they've not been directly benchmarked, it seems like a 3090 should be on par with a 5950x.
AMD will have to stagnate and nvidia get the same kind of performance leap for another full generation or two before I can get on board. Right now it's double the workstation cost for a less flexible, inferior approach with no change in speed.
GPUs do scale more easily - you can fit 4 in one box without much effort... but then you've got an $8-10k+ workstation. I'd rather have 4 boxes and run a small farm.

This is fascinating. I haven't checked CPU scores in forever and I heard the Ryzen Threadrippers were trouncing Intel. This seems to confirm that. Haven't built an AMD computer since the early 2000s, and they usually overheated. Also, since AMD / ATI partnered up I never really looked back, since I prefer Intel / Nvidia for 3d. Now I am wondering if things have changed enough to switch back?... hmmm. Do AMDs have issues with Nvidia cards since they partnered with ATI, or do AMD boxes work as well or better with Nvidia cards? Maybe a dumb question, but Nvidia is no friend of ATI and Apple is no friend of Nvidia and so on ...
Don't care about SLI as much, but is that even a thing anymore? Years ago SLI only offered 1.5x performance vs. 2x performance with 2 graphics cards. Hopefully they fixed that.

Gearman
Dec 6, 2011

sigma 6 posted:

This is fascinating. I haven't checked CPU scores in forever and I heard the Ryzen Threadrippers were trouncing Intel. This seems to confirm that. Haven't built an AMD computer since the early 2000s, and they usually overheated. Also, since AMD / ATI partnered up I never really looked back, since I prefer Intel / Nvidia for 3d. Now I am wondering if things have changed enough to switch back?... hmmm. Do AMDs have issues with Nvidia cards since they partnered with ATI, or do AMD boxes work as well or better with Nvidia cards? Maybe a dumb question, but Nvidia is no friend of ATI and Apple is no friend of Nvidia and so on ...
Don't care about SLI as much, but is that even a thing anymore? Years ago SLI only offered 1.5x performance vs. 2x performance with 2 graphics cards. Hopefully they fixed that.

There's a lot to cover here but for now here's a TL;DR:
- AMD acquired ATI back in 2006.
- Not only are AMD's Threadrippers trouncing Intel, but pretty much the entire lineup of AMD CPUs is beating the equivalent Intel chips. This will likely continue for some time, because Intel is having some pretty serious production issues with their new 7nm process. Intel chips still have strong single-core performance, but AMD's chips are pretty much across the board better and cheaper.
- No noticeable difference whatsoever if you're using an AMD Ryzen CPU and an NVIDIA GPU. AMD has talked about new tech that will make an AMD CPU and GPU work better together using "Smart Access Memory" but NVIDIA can do something similar for NVIDIA GPUs on either Intel or AMD CPUs.
- SLI is dead, but you can still stuff extra GPUs in a box and use them to help reduce render time. Multiple GPUs are now useless for gaming.

If you're building a new machine today, you're probably going to have an AMD CPU and an NVIDIA GPU at most price points.

Gearman fucked around with this message at 04:21 on Dec 13, 2020

SubNat
Nov 27, 2008

Gearman posted:

- No noticeable difference whatsoever if you're using an AMD Ryzen CPU and an NVIDIA GPU. AMD has talked about new tech that will make an AMD CPU and GPU work better together using "Smart Access Memory" but NVIDIA can do something similar for NVIDIA GPUs on either Intel or AMD CPUs.

Smart Access Memory is 'kinda' an AMD GPU feature, not an AMD cpu or motherboard feature, the way they're rolling it out.
The feature is already rolling out for intel motherboards that provide the update for it, and amusingly, most intel cpus from 2014+ support it (provided the mobo gets updated), while only the newest 1-2 generations of Ryzen do so far.
(I believe ASUS has already started rolling out updates for some of their intel mobos?)

It's all marketing - they're just enabling a feature of the PCI-e spec that's been around for ages, but nobody saw a need to support it. (Hence why up-to-6-year-old intel cpus support it, because they implemented that part of the spec as well, just in case.)

But yeah, Nvidia will roll it out as well sometime in the near future now that AMD has it, and I imagine it'll quickly become one of those completely platform-agnostic features, where it only matters whether you have a mobo new enough to support it in the bios. After all, it's just part of the PCI-e spec.
(The feature is 'resizable BAR' support, and in benchmarks it seems to give an extra 1-2% perf in some games - TPU did a test across 22 titles. It's good that it's present, but it's not exactly a noteworthy feature. It's just marketing trying to make a mountain out of a molehill because they implemented it first.)
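If you're curious whether your box already advertises it, on Linux you can grep the PCIe capability list - a quick sketch wrapping lspci (needs pciutils, and root for the full capability dump; output naming can vary by lspci version):

```python
import subprocess

# Look for a "Resizable BAR" capability on GPU devices in `lspci -vv` output.
out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout

for block in out.split("\n\n"):
    if "VGA compatible controller" in block or "3D controller" in block:
        header = block.splitlines()[0]
        print(f"{header}\n  Resizable BAR advertised: {'Resizable BAR' in block}\n")
```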

sigma 6
Nov 27, 2004

the mirror would do well to reflect further

Hmm. Thanks for all the info! Had hoped they had done away with SLI.
Maybe I will look into finding an AMD CPU / mobo combo I can afford, but for now I think I will try and max out the Intel mobo PCIe slot I already have. Definitely some old Geforces in there right now. Someone told me the other day that high-end Nvidia cards are very hard to find at the moment.
Are people still using them for bitcoin mining or is it just a general shortage due to covid / manufacturing problems?

Anyone who is passionate about Blender: Why should I switch from Maya to Blender if I already get Maya free from work? Other than the grease pencil function and arguably sculpting, is there something Blender does much better than Maya? There has been a huge push to Blender for a little while now and I realize this might be opening a can of worms.

sigma 6 fucked around with this message at 06:19 on Dec 13, 2020

cubicle gangster
Jun 26, 2005

magda, make the tea

sigma 6 posted:

Hmm. Thanks for all the info! Had hoped they had done away with SLI.

For what it's worth, the reason is that they have nvlink - which allows you to pool 2 GPUs' memory together. 4x Titan RTXs can be set up in a dual nvlink setup, which gives 48gb of memory.
It's basically the pro version of SLI. They don't need to cater dual-card setups to gamers anymore. Fast, but very expensive.

All of dbox is running modest ryzens and mid to low end nvidia cards - $1500 workstations. Hardware is in a really great place right now, it feels very uncompromised and a few boxes pooled can burn through any tricky render.

cubicle gangster fucked around with this message at 06:34 on Dec 13, 2020

echinopsis
Apr 13, 2004

by Fluffdaddy

sigma 6 posted:

Anyone who is passionate about Blender: Why should I switch from Maya to Blender if I already get Maya free from work? Other than the grease pencil function and arguably sculpting, is there something Blender does much better than Maya? There has been a huge push to Blender for a little while now and I realize this might be opening a can of worms.

Word on the street is that Autodesk is doing some good things lately. If Blender comes up with anything compelling to encourage you to shift, you'll hear about it. Until then stick with what you're good at IMO.

Kanine
Aug 5, 2014

by Nyc_Tattoo
Hey I really appreciate the feedback, that all tracks with the answers I've gotten elsewhere pretty much.

sigma 6
Nov 27, 2004

the mirror would do well to reflect further

echinopsis posted:

Word on the street is that Autodesk is doing some good things lately. If Blender comes up with anything compelling to encourage you to shift, you'll hear about it. Until then stick with what you're good at IMO.

Yeah - the beginning of this 2020.4 video feels a little like a Houdini workflow. That's not a bad thing at all.

cubicle gangster: Thanks for the tip. I have to seriously consider this for my next build. Surely AMD's heat problems must be fixed at this point.

sigma 6 fucked around with this message at 07:45 on Dec 13, 2020

taqueso
Mar 8, 2004



sigma 6 posted:

Surely AMD's heat problems must be fixed at this point.

and how!

Kanine
Aug 5, 2014

by Nyc_Tattoo


I'm thinking of doing a 3d model of one of my blaster designs as a personal fun side project and to improve at some new techniques. I'm curious which one yall think I should try out modelling first?

echinopsis
Apr 13, 2004

by Fluffdaddy
Second from bottom, if my opinion matters lol

sigma 6
Nov 27, 2004

the mirror would do well to reflect further

Bottom one IMO. Did you photobash these?

Sagebrush
Feb 26, 2012

I like the funny little pistol on the left. Bottom one is also decent.

Kanine
Aug 5, 2014

by Nyc_Tattoo

sigma 6 posted:

Bottom one IMO. Did you photobash these?

yeah they're all collaged/photobashed together from various gun bits with some minor painting over to make stuff work.

(royal armouries collections site along with various firearms auction sites are great for finding gun reference but also for good high res photobash source material)

Gearman
Dec 6, 2011

sigma 6 posted:

Yeah - the beginning of this 2020.4 video feels a little like a Houdini workflow. That's not a bad thing at all.

cubicle gangster: Thanks for the tip. I have to seriously consider this for my next build. Surely AMD's heat problems must be fixed at this point.

Yeah, AMD hasn't had heat problems for quite a while now. With modern PC case design, and even a stock heatsink-and-fan combo, you only see high temps if something is wrong. Their chips are also in every Xbox One, Xbox Series S/X, PS4, and PS5. It's been some time since AMD has had a bad chipset.

Re: Blender.
It's an excellent and free tool that's nearly reached parity with Max and Maya. If you get Maya for free and don't have a real need to learn another modeling program, then it's not really worth your time to learn. That said, it definitely has a much bigger plugin and development community than Max and Maya. If you're looking to pick up another tool in the toolbox, I'd probably recommend Houdini first.


BonoMan
Feb 20, 2002

Jade Ear Joe
I just ordered a system from Puget so I'm super excited. 5950x, 3080 and 64 gigs ram.
