|
Raenir Salazar posted:
Basically I just want to (in Unity) generate procedural Perlin/Simplex noise to a texture to use for real-time terrain generation (I tried using DOTS/Unity's Jobs system but 5k by 2k was an unacceptable 10 seconds).

Right, OK. I'm kind of avoiding doing a deep dive into that particular simplex noise implementation, because it should be irrelevant to your problem and it seems very Unity-specific. I would again be deeply surprised if Unity didn't have a better solution to your problem than "edit this shader," but I don't know Unity; someone who does could probably give a better answer.

This said, the "parallax" you're talking about sounds like what happens when you scale or translate the different octaves of simplex noise independently. I think inoise is called once per octave in that implementation, so if you insert a fixed transform there, the transform will be correct for at most one octave and you get the effect that the different layers of the noise move independently.

For texture transforms, you're on the right track by the sounds of it. You can always do those without touching anything about the details of the texture generation, or even knowing what type of texture you're using. You don't need or want to touch noise generation parameters like frequency or bandwidth to zoom or pan, any more than you'd need to edit an image-based texture in Paint to zoom or pan.

If you have a texture coordinate p and you want to translate it so the texture origin is centered on a point p0, you can do the transform p = p + p0. If you want to zoom in by a factor of k, you can do the transform p = p * k. Transforming the coordinate prior to lookup works for any texture, procedural or image-based, of any dimensions.

So just do that wherever you're specifying the texture coordinate (the o.uv = v.texcoord - float4(0.5, 0.5, 0, 0); line, which I assume is in the vertex shader). You'd end up with something like: code:
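A minimal sketch of what that vertex shader could look like, assuming a standard Unity Cg/HLSL vertex function; _PanOffset and _Zoom are hypothetical material properties introduced here for illustration and are not from the original shader:

```hlsl
// Hypothetical pan/zoom properties (assumptions, not part of the original shader)
float4 _PanOffset; // p0: where the texture origin should be centered
float  _Zoom;      // k: zoom factor

v2f vert (appdata v)
{
    v2f o;
    o.vertex = UnityObjectToClipPos(v.vertex);
    // Transform only the lookup coordinate; the noise generation itself is untouched.
    o.uv = (v.texcoord - float4(0.5, 0.5, 0, 0) + _PanOffset) * _Zoom;
    return o;
}
```

The same pre-lookup transform would work identically for an image-based texture sampled with tex2D.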
I suggested 3D noise because I figured you were texturing 3D data. I'm not sure what you mean by wrap-around, but it sounds unrelated to texture dimension. Since you're texturing a 2D image, use a 2D texture.
|
# ¿ Jan 4, 2021 23:23 |
|
You can apply an offset to gl_FragDepth based on gl_FrontFacing in the fragment shader. Writing gl_FragDepth can limit early z testing, depending on how you're offsetting, and I don't think you can offset by the same amount as glPolygonOffset (which I've never used).

In general you will not be able to do this in the vertex shader by just looking at the post-transform normal. A given vertex can be part of both a front-facing and a back-facing triangle, and a vertex normal can, and almost certainly will, be back-facing from some perspectives even when the vertex belongs to a front-facing triangle. Offsetting the vertex along its normal still works if you know you have nice inputs with well-behaved vertex data, of course, but not for arbitrary input.
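A minimal sketch of the fragment-shader approach, assuming desktop GLSL; the uniform and the bias constant are illustrative assumptions, and this is not equivalent to glPolygonOffset's slope-scaled offset:

```glsl
#version 330 core

uniform vec4 u_color; // hypothetical flat color for illustration
out vec4 fragColor;

void main()
{
    fragColor = u_color;
    // Push back-facing fragments slightly deeper; 1e-4 is an arbitrary bias.
    float bias = gl_FrontFacing ? 0.0 : 1e-4;
    // Note: writing gl_FragDepth at all can disable early depth testing,
    // regardless of which branch is taken.
    gl_FragDepth = gl_FragCoord.z + bias;
}
```

A conservative depth qualifier such as layout (depth_greater) out float gl_FragDepth; can let the driver keep some early-z optimizations when the write only ever increases depth.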
|
# ¿ Aug 11, 2021 21:25 |