Kazinsal
Dec 13, 2011


Elukka posted:

I usually start by thinking about what it's for and who made it. I might have a loose visual idea at this point, or I might not. For that ship, the extent of the latter was "I want it to have a pointy nose". Then I think about what kind of hardware it needs to do what it's for. In my case I like to go with semi-realistic engineering concerns, but you can do the same with entirely fictional tech (like Star Trek did - entirely fictional parts and design rules for ships) or with a more intuitive ruleset. Homeworld is a great example of the latter. In any case, having context and restrictions is very important for me to come up with any compelling visual ideas. I want some narrative context for why the thing exists and I want rules for how things work inside the fiction. I like worldbuilding so that comes naturally to me.

For that ship, I started with the pointy nose, then blocked out approximate volumes for its functional bits. I figured out how much volume would be needed for main propellant, maneuvering thruster propellant, weapons, etc. based on some figures and guesses of material densities, and arranged them in some sensible way. I ended up with a core of main propellant, four liquid tanks for the maneuvering thrusters around it, missiles in external pods, and everything else in the nose section. Then I draped a skin over it and had this:



It's really boring at this point. I did some pro concept art to figure out where I wanted to go with it:



Then I applied that and did some scale adjustments on various bits, generally to make interesting bits more prominent and boring bits less so. (Side note: I think Expanse ships would benefit from another pass like this.) Eventually I got a final shape I was happy with and started scribbling out ideas for details:



Then I drew normal maps for that detail directly. If I was making a high poly model this is where I would be modeling all this detail instead, but I'm bad at dealing with high poly models. After that, I did the texturing proper.



I would probably be able to do a lot more things if I had your aptitude for loving around and finding out neat things!

as someone who does sci-fi modelling occasionally (though admittedly at a much lower detail level and more stylized), I see stuff like this and wish I knew Blender instead of 3ds Max. the idea of just casually painting on normal maps instead of painstakingly modelling everything while trying to stay under a certain self-imposed triangle limit is extremely foreign to me

e: in a past life I was a dev on an MMO-esque mod for a long-forgotten space shooter game, and a few years ago I got hit with nostalgia and started developing parts of an engine for that type of game before I realized what an impossible task I had given myself and gave up. my fondest memories of it are of relaxing and modelling spaceships and station parts and stuff

Kazinsal fucked around with this message at 11:07 on Oct 3, 2021


Elukka
Feb 18, 2011

For All Mankind

Kazinsal posted:

as someone who does sci-fi modelling occasionally (though admittedly at a much lower detail level and more stylized), I see stuff like this and wish I knew Blender instead of 3ds Max. the idea of just casually painting on normal maps instead of painstakingly modelling everything while trying to stay under a certain self-imposed triangle limit is extremely foreign to me
That's not a Blender thing; I did that with Quixel Suite's NDO, which is a Photoshop plugin that's now deprecated and unsupported. At some point I'll have to replace it with something else, probably Substance, which I'm not overly enthusiastic about because it's Adobe and because it's a huge, complex suite of software, most of which I feel I don't really need.

I looked around for software that can do the same and it's either too simplistic or incredibly enormous, so I'll probably have to go for the latter. What NDO does is let you just draw as normal, and what you draw is converted on the fly into a normal map; you have a bunch of sliders to adjust exactly how that happens for each layer (depth of the detail, whether it goes up or down, how smooth it is, etc.).
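If you're curious what that on-the-fly conversion amounts to, the basic idea is to treat whatever you paint as a height map and turn its gradients into tangent-space normals. Here's a tiny numpy sketch of that generic trick (not NDO's actual code; the strength parameter just stands in for those depth/intensity sliders):

code:
import numpy as np

def height_to_normal(height, strength=1.0):
    # x/y gradients of the painted height field
    dy, dx = np.gradient(height.astype(np.float32))
    # build tangent-space normals (-dx, -dy, 1), then normalize;
    # flip the sign of strength to make detail go "in" instead of "out"
    nx = -dx * strength
    ny = -dy * strength
    nz = np.ones_like(height, dtype=np.float32)
    length = np.sqrt(nx * nx + ny * ny + nz * nz)
    normals = np.stack([nx, ny, nz], axis=-1) / length[..., None]
    # remap from [-1, 1] to the usual [0, 255] RGB encoding of a normal map
    return ((normals * 0.5 + 0.5) * 255).astype(np.uint8)

# e.g. a painted "panel" on an otherwise flat surface
painted = np.zeros((256, 256), dtype=np.float32)
painted[96:160, 96:160] = 1.0
normal_map = height_to_normal(painted, strength=4.0)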

e: This is all stuff I just drew in Photoshop and for this kind of thing I find it so much easier than modeling all this detail. Of course, actual modeled detail on a high poly model would still look better... If I was making game models though this would make a whole lot of sense.

Elukka fucked around with this message at 13:53 on Oct 3, 2021

Songbearer
Jul 12, 2007




Fuck you say?
Substance Painter is great for normal mapping and such but again, it's Adobe. If you can stand making a deal with the devil for it I've found it to be a lifesaver for me, since manually photoshopping and texturing stuff is massively outside my wheelhouse.

Jenny Agutter
Mar 18, 2009

Komojo posted:

Bézier curves are polynomial functions, and the smoothness of the curve depends on the degree of the polynomial.

Suppose you have a 3D camera that is following a path. With only linear interpolation, it would move in straight lines and you would see a sudden jump in velocity every time it passes through a control point. If you combine two linear interpolations you get a quadratic curve; this gives constant acceleration within each segment of the path, but as soon as you go from one segment to another the acceleration suddenly changes. For smooth acceleration you need at least a cubic Bézier curve, which is the most common type.

The curvature at any given point can be defined as the inverse of the radius of the osculating circle at that point. There's a handy formula on Wikipedia for computing the curvature if you know the first and second derivatives, which are easy to compute for Bézier curves.
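For reference, the formula in question (and what the script below computes) is the standard curvature formula for a space curve $\mathbf{r}(t)$:

$$\kappa(t) = \frac{\lVert \mathbf{r}'(t) \times \mathbf{r}''(t) \rVert}{\lVert \mathbf{r}'(t) \rVert^{3}}$$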

So just because it is so interesting, I wrote a Python script that can visualize the curvature of a Bézier curve in Blender. It looks for a curve called "BezierCurve" and then creates a new object called "DebugMesh" which shows the curvature overlaid in 3D.



code:
import bpy  
from mathutils import Vector  
import math

scene = bpy.context.scene

# Bezier curve function
def Bezier(p0, p1, p2, p3, t):
    u = 1 - t
    b0 = u * u * u
    b1 = 3 * u * u * t
    b2 = 3 * u * t * t
    b3 = t * t * t
    return (b0 * p0) + (b1 * p1) + (b2 * p2) + (b3 * p3)
    
# Derivative of Bezier curve function
def BezierD1(p0, p1, p2, p3, t):
    u = 1 - t
    b0 = 3 * u * u
    b1 = 6 * u * t
    b2 = 3 * t * t
    return (b0 * (p1 - p0)) + (b1 * (p2 - p1)) + (b2 * (p3 - p2))
    
# Second derivative of Bezier curve function
def BezierD2(p0, p1, p2, p3, t):
    u = 1 - t
    b0 = 6 * u
    b1 = 6 * t
    return (b0 * (p2 - (2 * p1) + p0)) + (b1 * (p3 - (2 * p2) + p1))

# ShowCurvature: Generate a new object that shows the curvature of a Bezier curve
def ShowCurvature(curveObjectName, debugObjectName):

    # Create the mesh object
    bpy.ops.object.select_all(action='DESELECT')
    if (scene.objects.find(debugObjectName) >= 0):
        debugObj = scene.objects[debugObjectName]
        debugObj.select_set(True)
        bpy.ops.object.delete()        

    debugMesh = bpy.data.meshes.new(debugObjectName+"Mesh")
    debugObj = bpy.data.objects.new(debugObjectName, debugMesh)
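    # NOTE: this assumes the scene has a collection named "Main"; change the name to match your file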
    col = bpy.data.collections.get("Main")
    col.objects.link(debugObj)
    debugObj.select_set(True)

    # Read the Bezier curve points        
    curveObj = scene.objects[curveObjectName]
    spline = curveObj.data.splines[0]

    # Add curve points to new mesh
    verts = []
    edges = []
    faces = []

    numPoints = len(spline.bezier_points)
    edgeIndex = 0
    for curveIndex in range(0,numPoints-1):
        p0 = spline.bezier_points[curveIndex+0].co
        p1 = spline.bezier_points[curveIndex+0].handle_right
        p2 = spline.bezier_points[curveIndex+1].handle_left
        p3 = spline.bezier_points[curveIndex+1].co

        DETAIL = 256
        SCALE = 0.5
        for i in range(0, DETAIL+1):
            t = i / DETAIL
            
            # Find the derivatives
            d1 = BezierD1(p0, p1, p2, p3, t)
            d2 = BezierD2(p0, p1, p2, p3, t)
            
            # Find the "out" vector
            sideways = d1.cross(d2)
            out = d1.cross(sideways).normalized()
            
            # Compute the curvature k = |d1 x d2| / |d1|^3 (sideways above is already d1 x d2)
            k = sideways.length / math.pow(d1.length, 3)

            # Add a line segment            
            d0 = Bezier(p0, p1, p2, p3, t)
            verts.append(d0)
            verts.append(d0 + (out * k * SCALE))
            edges.append((edgeIndex, edgeIndex+1))
            edgeIndex += 2

    # Create the mesh                
    debugMesh.from_pydata(verts, edges, faces)

ShowCurvature("BezierCurve", "DebugMesh")


For NURBS curves the math gets a bit more complicated but the basic idea is the same.

yo this owns. blender thread is the best thread

echinopsis
Apr 13, 2004

by Fluffdaddy
yeah that looks like those curve analysis things sagebrush posts

Komojo
Jun 30, 2007

I think I have my default Blender file set up with a collection called "Main" so you might need to change that name to get it to work.

echinopsis
Apr 13, 2004

by Fluffdaddy
I am slowly honing my default blender file to be the one ultimate default file. it’s getting very very good now




also just came across this and drat if this isn’t some inspiration

Bluemillion
Aug 18, 2008

I got your dispensers
right here

echinopsis posted:

I am slowly honing my default blender file to be the one ultimate default file. it’s getting very very good now




also just came across this and drat if this isn’t some inspiration





• Started with a 32-slice cylinder with no cap.
• Scrolled up a ring cut until the vertical slices had good proportions.
• Decimate modifier set to un-subdivide, 1 pass, to turn the squares into diamonds.
• Poke faces to get the points on the knurling.
• Select Similar to select all the points, then S + Shift+Z to scale them out on the X and Y axes.
• Extruded the top and bottom, subdivided without smoothing.
• Selected each ring individually and Shift+Alt+S to sphere-ify.
• Bevel on the top; tiny, tiny bevel modifier for a pinch of realism.
• Light with an environment texture. Switch to Cycles. Go to World and unselect Camera under Ray Visibility to hide the texture.

Edit: Oh yeah, and you need shade smooth and autosmooth for the top and bottom.

That's the model anyhow. Could spend more time on the shader but :effort:
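If you'd rather poke at the core of that from a script, here's a rough bpy sketch of just the un-subdivide + poke part. I've swapped the ring-cut-and-scroll step for a plain Subdivide because loop cuts are a pain to run from a script, and the cut count is made up, so treat it as the idea rather than the exact steps above:

code:
import bpy

# 32-slice open cylinder
bpy.ops.mesh.primitive_cylinder_add(vertices=32, end_fill_type='NOTHING')
knob = bpy.context.active_object

# give the wall some horizontal divisions so the quads are roughly square
# (stand-in for the ring-cut-and-scroll step)
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.subdivide(number_cuts=6)
bpy.ops.object.mode_set(mode='OBJECT')

# Decimate modifier set to un-subdivide, 1 iteration: squares become diamonds
dec = knob.modifiers.new(name="Unsubdiv", type='DECIMATE')
dec.decimate_type = 'UNSUBDIV'
dec.iterations = 1
bpy.ops.object.modifier_apply(modifier=dec.name)

# poke every face to get the raised points of the knurling
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.poke()
bpy.ops.object.mode_set(mode='OBJECT')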

Bluemillion fucked around with this message at 23:53 on Oct 3, 2021

echinopsis
Apr 13, 2004

by Fluffdaddy
mate that’s good. you’ve got a better feel for the array of mesh editing functions than I do.

back when I was doing this model




I really wanted knurled knobs but could never make them
:cry:

Bluemillion
Aug 18, 2008

I got your dispensers
right here

echinopsis posted:

mate that’s good. you’ve got a better feel for the array of mesh editing functions than I do.

back when I was doing this model




I really wanted knurled knobs but could never make them
:cry:

Well, now you can! :glomp:
The real meat of that breakdown is the decimate modifier set to unsubdivide and 1 pass, followed by the poke faces. That's all you really gotta remember.
Also don't be afraid to search for how to do specific things. I've spent a lot of my life watching youtube tutorials.

I insist on doing as much as possible without external add-ons, so I spend a lot of time obsessing over topology and the nitty-gritty of mesh editing.
Your shaders and sims are super cool, so keep at it!

Bluemillion fucked around with this message at 23:26 on Oct 3, 2021

echinopsis
Apr 13, 2004

by Fluffdaddy
alright lol so I upgraded from a gtx 1060 to an rtx 3060ti

I ran a benchmark. 50 samples so not exactly ideal but whatever

GTX 1060 6gb

cuda: 46s
optix: 29s


that alone was interesting, that optix on a non-rtx card would make such a difference. should have been doing that ages ago

now:


RTX 3060Ti

cuda: 20s


optix..?



THREE SECONDS




lol


Bluemillion posted:

I insist on doing as much as possible without external add-ons, so I spend a lot of time obsessing over topology and the nitty-gritty of mesh editing.
Your shaders and sims are super cool, so keep at it!

thanks dude

between all of us, we're quite good

Jenny Agutter
Mar 18, 2009

congrats on the upgrade echi, feels like living in the future doesn’t it?

echinopsis
Apr 13, 2004

by Fluffdaddy
I wasn’t really unhappy with what I had, not sure what started the bug to get a new one. upgraded my power supply too coz i’d be running at close to a hundred percent and I imagine that’s not great


the ryzen and rtx seem to play together very nicely

echinopsis
Apr 13, 2004

by Fluffdaddy
another benchmark. same scene. 50 samples


this is all with rtx


cycles (not cycles x)

cuda: 30s
optix : 24s

cycles x

cuda 10s
optix 5s



I have a feeling that for the prior benchmark I not only switched cards but also loaded a different blender version and switched from cycles to cycles x

cycles x with optix is basically insane. the loading kernels sucks tho lol

echinopsis
Apr 13, 2004

by Fluffdaddy


I left this for an hour

it's.... smooth lol

toiletbrush
May 17, 2010
thats fkin beautiful

how does ray/path tracing simulate subsurface scattering (or any volume scattering really) ? Is there some sort of pre-processing you can do on a volume and do some sort of analytical thing? Or do you basically just have to sample a whole load of randomly bouncing around rays?

Cybernetic Vermin
Apr 18, 2005

toiletbrush posted:

thats fkin beautiful

how does ray/path tracing simulate subsurface scattering (or any volume scattering really) ? Is there some sort of pre-processing you can do on a volume and do some sort of analytical thing? Or do you basically just have to sample a whole load of randomly bouncing around rays?

presumably the latter, it is to some extent the point of raytracing that you just simulate the rays for most light phenomena, including refracting and scattering rays inside materials

echinopsis
Apr 13, 2004

by Fluffdaddy
I think just statistically, as in, the density of a volume is a bit like the chance a ray will scatter/absorb over its step size. but… idk, and that doesn’t explain much


To be clear, I used a volume scatter in that. the surface shader is ordinary shader mixed close to 50/50 with transparent shader

the sub surface scattering part of the regular shader is a hack and it works ok but is more of an effect than really representing scattering within a thing


i’ve been trying to mess around with something that could act to produce veins or something so that the internal density isn’t homogenous. that’s what the cubes were for last time, I wanted it to look like there was something inside.

another interesting thing is that if you change the colour of the scatter to, say, red, the area that gets the light looks red and the rest ends up looking the opposite of red, because the red light has already been scattered away. different to, say, absorbing a shade, which doesn’t tend to do this.

echinopsis
Apr 13, 2004

by Fluffdaddy

Cybernetic Vermin posted:

presumably the latter, it is to some extent the point of raytracing that you just simulate the rays for most light phenomena, including refracting and scattering rays inside materials

ray tracing .. is old skool
path tracing is where it’s at

Cybernetic Vermin
Apr 18, 2005

echinopsis posted:

ray tracing .. is old skool
path tracing is where it’s at

i refuse to consider path tracing distinct from ray tracing

echinopsis
Apr 13, 2004

by Fluffdaddy
can't believe we got graphics chuds itt

Zlodo
Nov 25, 2006
the whole shtick of path tracing is to send random rays across the range of directions where rays should have bounced and use that to approximate the sum of all the rays' contributions at that point (more rays = better approximation = less noise), and I don't think subsurface scattering works differently
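in other words it's just Monte Carlo integration. a toy illustration of the "more rays = better approximation = less noise" part: estimate the light leaving a point by averaging random hemisphere samples. the incoming_light function here is a fake stand-in for "trace a ray that way and see what comes back", not anything out of Cycles:

code:
import math
import random

def incoming_light(theta, phi):
    # fake "what a ray traced in this direction would return"
    return max(0.0, math.cos(theta)) * (1.0 + 0.5 * math.sin(3.0 * phi))

def estimate_outgoing(n_samples):
    total = 0.0
    for _ in range(n_samples):
        # pick a uniformly random direction in the hemisphere (pdf = 1 / 2pi)
        theta = math.acos(random.random())
        phi = 2.0 * math.pi * random.random()
        # Monte Carlo estimator: sample / pdf, then average
        total += incoming_light(theta, phi) * math.cos(theta) * 2.0 * math.pi
    return total / n_samples

# more rays = better approximation = less noise
for n in (8, 64, 512, 4096):
    print(n, estimate_outgoing(n))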

Jenny Agutter
Mar 18, 2009

from last year but holy poo poo
https://twitter.com/jonasdichelle/status/1328860954856140801?s=21

Trabisnikof
Dec 24, 2005

will nodevember allow full use of geo nodes? or will there be a challenge category for only using shader nodes, the way starting with just the base cube is one

echinopsis
Apr 13, 2004

by Fluffdaddy

Zlodo posted:

the whole shtick of path tracing is to send random rays across the range of directions where rays should have bounced and use that to approximate the sum of all the rays' contributions at that point (more rays = better approximation = less noise), and I don't think subsurface scattering works differently

I just checked the docs and it says certain kinds of sss are true volumetric scattering, with the implication that the other types of sss aren’t?

I think the reason I’ve thought of it as an effect is that all sss really is is volumetric scattering, and yet it doesn’t always act like volumetric scattering. I’ve used it before and the results have sometimes been odd on less common geometry.

as in, the results are different than if you set up a volume scatter to be the same. at least that’s what I remember thinking

I could well be wrong 🤷‍♂️

Jenny Agutter
Mar 18, 2009

the point of nodevember is just doing things procedurally and not creating things by hand. like no sculpting, no texture painting, no music composition. you can use whatever you want towards that end. geometry nodes, animation nodes, shader nodes, procedural sound software, anything along those lines.

Bluemillion
Aug 18, 2008

I got your dispensers
right here

Today I learned there's a node limit.

Archduke Frantz Fanon
Sep 7, 2004

https://twitter.com/tessamonash/status/1437268643000164354

fart simpson
Jul 2, 2005

DEATH TO AMERICA
:xickos:


i think i understand how to do this, conceptually

e: the hard part would be drawing all the pictures procedurally. making the cards and moving them around would be relatively easy, but tedious

fart simpson fucked around with this message at 13:49 on Oct 9, 2021

fart simpson
Jul 2, 2005

DEATH TO AMERICA
:xickos:

echinopsis posted:

another benchmark. same scene. 50 samples


this is all with rtx


cycles (not cycles x)

cuda: 30s
optix : 24s

cycles x

cuda 10s
optix 5s



I have a feeling that for the prior benchmark I not only switched cards but also loaded a different blender version and switched from cycles to cycles x

cycles x with optix is basically insane. the loading kernels sucks tho lol

yeah dude i went from 2014 intel cpu rendering to cycles x on a 3060. it’s literally 20-50x faster depending on the scene

Songbearer
Jul 12, 2007




Fuck you say?

This is absolutely true and when I get stuck in a rut with keyframing I just do this for a bit

Sometimes it can actually help

not often tho

echinopsis
Apr 13, 2004

by Fluffdaddy
https://hdrmaps.com/freebies/

just another source of freebies goodness

echinopsis
Apr 13, 2004

by Fluffdaddy
hey so one of these hdri that I downloaded, it has an exr file like normal

but then a folder full of ordinary photos, in dng format.

and it makes me wonder if this hdri has been generated from a bunch of photos. makes me wonder if you could use photogrammetry to make your own hdri files without a mirrorball.

in fact this should result in less distortion?

anyway. cool idea. gonna investigate

Sagebrush
Feb 26, 2012

i made some stitched 360 degree panoramas back in the day. you take a zillion photos in every direction and run them through some software and it makes a 2:1 equirectangular projection and you stick that in your rendering engine and bob's your uncle. (not my uncle)

it would be tricky to get a real HDRI out of this because you'd have to take multiple exposure-bracketed photos at each location and stitch them in somehow. but as long as nothing is moving it should theoretically be possible. and if you don't care about it being a true HDRI, any image will work fine, just crank up the contrast
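for the exposure-merge half of "stitch them in somehow", the toy version is: take each bracket (already linear, e.g. from the DNGs), divide by its shutter time, and do a weighted average that trusts mid-tones and ignores clipped pixels. real tools add alignment and smarter weighting, so this is just the idea, with made-up weights:

code:
import numpy as np

def merge_bracket(images, exposure_times):
    """images: list of same-sized float arrays in [0, 1], assumed linear."""
    hdr = np.zeros_like(images[0], dtype=np.float64)
    weight_sum = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        # hat-shaped weight: trust mid-tones, distrust near-black and near-clipped pixels
        w = 1.0 - 2.0 * np.abs(img.astype(np.float64) - 0.5)
        hdr += w * (img / t)          # dividing by shutter time recovers relative radiance
        weight_sum += w
    return hdr / np.maximum(weight_sum, 1e-6)

# e.g. three brackets a couple of stops apart:
# radiance = merge_bracket([dark, mid, bright], [1/1000, 1/250, 1/60])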

echinopsis
Apr 13, 2004

by Fluffdaddy
oh yeah true


.. meshroom seems to ask about camera data a lot. I wonder if it can calculate exposures or whatnot, and whether you could just use exposure bracketing and do wide brackets

fart simpson
Jul 2, 2005

DEATH TO AMERICA
:xickos:

to warm up a bit for nodevember, i went back and redid one of last year’s submissions:

https://i.imgur.com/jFwjLfV.mp4

since then ive learned a bit more about vector displacement workflows and animation so i was able to improve it a bit:

https://i.imgur.com/VTY36Jw.mp4

gosh i cant wait for nodevember

fart simpson fucked around with this message at 14:24 on Oct 11, 2021

Jenny Agutter
Mar 18, 2009

echinopsis posted:

hey so one of these hdri that I downloaded, it has an exr file like normal

but then a folder full of ordinary photos, in dng format.

and it makes me wonder if this hdri has been generated from a bunch of photos. makes me wonder if you could use photogrammetry to make your own hdri files without a mirrorball.

in fact this should result in less distortion?

anyway. cool idea. gonna investigate

thats exactly how they make hdris
https://blog.polyhaven.com/how-to-create-high-quality-hdri/

although professional groups use a 3-camera rig with super wide angle lenses so they can get the entire sky in one go (the forza horizon team has talked about this)

Jenny Agutter
Mar 18, 2009

fart simpson posted:

to warm up a bit for nodevember, i went back and redid one of last year’s submissions:

https://i.imgur.com/jFwjLfV.mp4

since then ive learned a bit more about vector displacement workflows and animation so i was able to improve it a bit:

https://i.imgur.com/VTY36Jw.mp4

gosh i cant wait for nodevember



ha i'm going to have a hard time keeping up this year. should start practicing too

fart simpson
Jul 2, 2005

DEATH TO AMERICA
:xickos:

Jenny Agutter posted:

ha i'm going to have a hard time keeping up this year. should start practicing too

i was most excited about figuring out how to do the grass.

the trick i was missing last year was alpha masking different chunks of geometry out, and then using a vector scale node with the factor set to the alpha mask but slightly bigger. you can then do any displacement stuff you want and plug it into that scale node with the mask plugged into the fac, and all the displacement will only affect that specific chunk and leave the rest of the object alone. you can then use the same masks as the fac for color mix or shader mix nodes.

last year i was stuck thinking about everything as contiguous single objects, so that grass was just taking a tiny bit of the edge of the cube and stretching it out so there was no real geometry to work with. this time my grass was literally just the bottom face of the cube alpha masked out and only stretched slightly, so there was tons of geometry left in there with which to push up some blades of grass, which was just a voronoi driving both the z axis and the color

this video tutorial was what taught me about this alpha/scale trick:
https://www.youtube.com/watch?v=aEeBAxjvY8U&t=3s

once i figured out that trick, making almost anything became "easy", if tedious, because you can just isolate everything you want, work on each piece as an entirely separate component, and add everything up at the end. each alpha mask is done independently, and added. each shader is done independently, and added with mix shader nodes. each displacement is done independently, and vector-added together. it's a very slick workflow.
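if it helps to see the wiring spelled out, here's a rough bpy version of that mask-and-scale setup. the noise and voronoi textures are just stand-ins for "your alpha mask" and "your displacement", the node and socket names are the standard cycles ones, and it's the idea rather than my actual node tree:

code:
import bpy

mat = bpy.data.materials.new("MaskedVectorDisplacement")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
output = nodes["Material Output"]
bsdf = nodes["Principled BSDF"]

# stand-in for the alpha mask that isolates one chunk (0 outside, 1 inside)
mask = nodes.new("ShaderNodeTexNoise")

# stand-in for the displacement you only want inside that chunk
detail = nodes.new("ShaderNodeTexVoronoi")

# Vector Math set to Scale, with the mask plugged into the Scale factor:
# outside the mask the displacement collapses to zero and the rest of the object is untouched
gate = nodes.new("ShaderNodeVectorMath")
gate.operation = 'SCALE'
links.new(detail.outputs["Color"], gate.inputs[0])
links.new(mask.outputs["Fac"], gate.inputs["Scale"])

# each chunk's gated displacement would get vector-ADDed together here, then fed to the output
vdisp = nodes.new("ShaderNodeVectorDisplacement")
links.new(gate.outputs["Vector"], vdisp.inputs["Vector"])
links.new(vdisp.outputs["Displacement"], output.inputs["Displacement"])

# the same mask drives a Mix Shader so that chunk gets its own shader
# (the chunk's shader would go into the other Shader slot)
mix = nodes.new("ShaderNodeMixShader")
links.new(mask.outputs["Fac"], mix.inputs["Fac"])
links.new(bsdf.outputs["BSDF"], mix.inputs[1])
links.new(mix.outputs["Shader"], output.inputs["Surface"])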

like for the remake of my house the node tree looks huge but it was actually pretty easy to make since each section is isolated and i can easily change one part without affecting anything else:

fart simpson fucked around with this message at 16:06 on Oct 11, 2021


fart simpson
Jul 2, 2005

DEATH TO AMERICA
:xickos:

here's another trick i picked up for animation:



the #frame is just a frame driver (you taught me that last year, jenny). you plug it into the map range node and set the from min to the frame where you want to start animating that particular property. from max sets the frame where you want that animation to be done. you map that to 0-1 and plug it into an rgb curve and now you have an animation driver that works, in the case of this image, from frame 16 to frame 40, following the easing function i drew out with that rgb curve. it's really neat
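for reference, the same setup built from python so the pieces are explicit: a value node with a "frame" driver (which is what typing #frame gives you), a map range node with from min/max as the start and end frames, and an rgb curves node as the easing. the 16 and 40 are the frames from the image; the material name is made up:

code:
import bpy

mat = bpy.data.materials.new("FrameDriven")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links

# Value node driven by the current frame (what typing #frame into a field gives you)
frame_val = nodes.new("ShaderNodeValue")
drv = frame_val.outputs[0].driver_add("default_value").driver
drv.expression = "frame"

# Map Range: From Min/Max = the frames where the animation starts/ends, mapped to 0..1
remap = nodes.new("ShaderNodeMapRange")
remap.inputs["From Min"].default_value = 16
remap.inputs["From Max"].default_value = 40
remap.inputs["To Min"].default_value = 0.0
remap.inputs["To Max"].default_value = 1.0
links.new(frame_val.outputs["Value"], remap.inputs["Value"])

# RGB Curves as a hand-drawn easing function on that 0..1 ramp
curve = nodes.new("ShaderNodeRGBCurve")
links.new(remap.outputs["Result"], curve.inputs["Color"])
# curve.outputs["Color"] now goes 0 -> 1 between frames 16 and 40, shaped by the curve,
# and can drive whatever property you want to animate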

  • Reply