|
Progress report, solvable weirdness & hacks: two things. 1) For some reason my sides are flipped: left is right, top-left is top-right, and so on. I did a simple hack to fix it, but I really have no idea why it happens. Here's my current code, which gives the above image; you can see I manually swapped the indexes for each hex direction to get them to line up. code:
e: Your later code, implemented: that looks amazing. (The weird whiteness is just my lovely water texture.) Raenir Salazar fucked around with this message at 03:54 on Sep 10, 2015 |
# ¿ Sep 10, 2015 03:42 |
|
Joda posted: Yay, glad you got it working. Sorry I haven't been very helpful for the last couple of questions; a lot of it was Unity-specific, and I've been too busy with school to find the time to understand your shader properly.

I probably didn't phrase them right, but I don't think my questions were that Unity-specific per se. But yay! Much rejoicing! I thank my stars that I do have some little bit of the problem-solving skill required of a developer: once I decided to buckle down, asked "Okay, is there a pattern to why my shader isn't working?", and determined the flippedness was universal, I could narrow down my trial-and-error fixes.
|
# ¿ Sep 10, 2015 13:53 |
|
Oh god, for my class assignment I need to use OpenGL/C++, and I did not miss this at all while using Unity. Having to manually compile my dependent libraries is a crock (to get the x64 libs, anyway). It works, though! I just wish the guides were clearer.
|
# ¿ Sep 30, 2015 00:47 |
|
So I have an OpenGL application that creates a circle and outputs it to the screen. I want to use an orthographic projection and have everything resize accordingly when the window resizes. After much pain I seem to have the aspect ratio correct, but when I maximize the window the object migrates to the lower-left corner. code:
code:
Edit: I'm using modern OpenGL and GLFW. code:
And soooooolved through some trial and error: it turns out I needed to use glViewport, which I decided to try after noticing that the OpenGL tutorials I was following never update the window and weren't using glViewport either, so it seemed as good an attempt as any. code:
Raenir Salazar fucked around with this message at 03:43 on Oct 6, 2015 |
# ¿ Oct 6, 2015 03:14 |
|
Suspicious Dish posted: I don't see where you update the GL viewport or your MV matrix with the new window size.

It's updated in my code, just not in what I pasted. code:
|
# ¿ Oct 6, 2015 16:52 |
|
Suspicious Dish posted: You should call glViewport when the window changes size (or at the start of every frame, for simple applications).

For games with complex scenes, would that still qualify as "simple"? Right now the size change comes from clicking the maximize button or dragging the window, so I assume it's sufficient to call it at the beginning?

quote: glViewport specifies how GL's clip space maps to "screen space". You can technically have multiple GL viewports per window (think AutoCAD or Maya with its embedded previews), so you have to specify this yourself. For most applications, though, it's simply the size of the window.

quote: Once you have that set up, you can imagine a number line with the far left corner being -1 and the far right corner being +1, and similar for

This I understand.

quote: In order to have a circle that's a certain size in screen space (e.g. "50px radius"), you have to set up the coordinates to convert from the screen space that you want to the clip space that GL wants. You do this by setting up a matrix that transforms them to that clip space, using the screen space as an input.

This I also generally understand, barring the occasional site that doesn't use the terminology consistently. Though here's an interesting thing that's confusing me right now: when I create a quad at coordinates (0,0) to (1,1) (two triangles) and pass it to my shader before applying any projection matrix, it's about a quarter of my app in size. When I do apply a projection transformation (say my viewport is now 800x600), the square becomes really small (IIRC; I'm at work, so I don't have the app on me). What's the ratio of "unit" length (say, radius or side length) to pixels? How would I do the Unity thing of having a square or cube that's "one meter" long, so I know how everything else should be scaled relative to it?
|
# ¿ Oct 7, 2015 21:07 |
|
Yeah! Got a nice golden-ratio distribution of these little red circles. (Each is made with 360 triangles* to give the impression of a smooth circle; if anyone has a good solution for drawing a smooth circle in OpenGL, I'm very interested.) The main success here is implementing a Render() function I can call to draw my shapes, with some simple positioning and the theoretical option to pass different attributes to the shader for each circle. The next milestone is to actually get these moving in some illusion of Newtonian frictionless-vacuum physics, with collisions.

*I use a VBO with an array, so I'm not using immediate mode or any other deprecated features; I build the circle's 1000+ verts once, put them in a vector, and pass that off to my buffer/shader. The next thing I can do is pass a triangle index for further optimization, but really, drawing an arbitrary number of lines/triangles to create the illusion of a circle seems like a very brute-force method to me; there's gotta be a standard solution somewhere, no?
|
# ¿ Oct 11, 2015 08:46 |
|
Yeah, pretty happy about this: a basic OpenGL 2D physics thing. Raenir Salazar fucked around with this message at 01:42 on Oct 16, 2015 |
# ¿ Oct 16, 2015 01:39 |
|
I am using Assimp to load a rigged mesh into an OpenGL application; the application is meant to let me manually manipulate the bones and see the mesh transform in real time. Everything works, except that I want the bones to rotate about the global Z axis facing the camera, so that if a hand is facing mostly towards me, it rotates as though its joint's Z axis pointed out of the screen. Right now I have this result: the bones rotate around their local Blender roll axis. Code: code:
quote:Obtain the transformation from bone space to view space (i.e. the view transformation multiplied by the global bone transformation). Invert it to get the transformation from view space to bone space. Transform the vector (0,0,1,0) by the inverse to get the bone-space rotation axis. I am probably misunderstanding this but, I now have this: code:
Lines show the axis each bone rotates around, or a circle if it happens to point at the camera. CurrentNodeTransform is the transform of the bone relative to its parent; I've tried cramming in, at different locations, a version that multiplied the local transform by its parent's transform, but that just made things weird. e: Edited a couple of times to not break the forum CSS. Raenir Salazar fucked around with this message at 18:39 on Oct 27, 2016 |
# ¿ Oct 27, 2016 18:36 |
|
Aaaaand I solved it. No idea how I was so close before yet somehow fumbled the ball at the last second; I feel like I would have tried this permutation of multiplications, and I just don't know how I missed it. code:
Second to last line needed to be: code:
code:
Raenir Salazar fucked around with this message at 22:47 on Oct 27, 2016 |
# ¿ Oct 27, 2016 22:45 |
|
So, for some reason, code:
code:
The former code results in rotations of roughly 30-degree increments per unit of rotation, so Y: 1 rotates my model about 30 degrees, and Y: 90 overshoots by something like 15 to 30 degrees. But glm::radians works. As far as I can tell, this is completely at odds with the documentation for GLM, which says "angleInDegrees" for the second parameter. What gives?
|
# ¿ Dec 11, 2016 13:37 |
|
That explains it! Google brings me the ~0.9.3 versions of the manual for some reason.
|
# ¿ Dec 11, 2016 14:35 |
|
I managed to finish my final project! Here's a demo video + commentary. Basically I made an OpenGL application that animates a rigged mesh using the Windows Kinect v2. There are two outstanding issues:

1. Right now every frame becomes a keyframe when inserting. I don't have it so that you can have a 30-second animation with, say, 3 keyframes that it interpolates between. I'm seeing if I can fix that, but I get strange bad-allocation memory errors when I try, on super simple lines of code too, like aiVector3D* positionKeys = new aiVector3D[NEW_SIZE]; I don't get it; I'm investigating.

2. It only works in theory on any mesh: they have to share the same skeleton structure and names, and their bones have to have some arbitrary orientation that matches the Kinect. But when I try to fix the orientations to match, it ruins the rigging on the pre-supplied blend files I found on YouTube from Sebastian Lague. I'd have to reskin the meshes to the fixed orientations, which is a huge headache, as I'm not 100% sure how the orientations have to be set up in Blender to make the Kinect happy.

quote: - Bone direction (Y, green) - always matches the skeleton.

Okay, so Y makes sense to me: it follows the length of the bone from joint to joint. I'm not sure if it's positive or negative Y, but I hope it doesn't matter; in Blender the default orientation, following most tutorials, is positive Y facing away from the parent bone. But "Normal" and "Binormal" don't make sense to me in any practical way. If the bone follows my mesh's arm, is Z palm-up or palm-down? That's all I really care about, and I don't see anything in my googling that says what's correct. Using Blender's "Recalculate Bone Roll" with "Global Negative Y Axis" points the left arm's Z axis forward, and sometimes this gives good results?
I want my palm movement to match my palm orientation, but it's hard to get this right, because editing my bones deforms my mesh unless I rerig it, and it's hard to know up front whether I'm right.
|
# ¿ Dec 18, 2016 02:23 |
|
Absurd Alhazred posted: I don't know much about the Kinect, but I would have thought that "bone roll" (which I would have called "bone pitch") would be the vector the bone rotates around, so perpendicular to the plane of movement of the bone.

Would this be that? Y follows the bone, but then X and Z feel like they could be anything. I'm confused: can't the bone rotate on either the X or the Z axis?
|
# ¿ Dec 18, 2016 03:08 |
|
Absurd Alhazred posted: Other than the thumb's metacarpal*, which is all over the place, most bone joints have a natural movement plane, which is what I would think the Z would be perpendicular to. I don't know how they deal with the thumb: do they have any special allowances for it?

IIRC the Kinect treats every bone the same way. It only represents one finger and the thumb, though: MSDN. Neither of my meshes has fingers, IIRC. In Blender, unless I have inverse kinematics, I can rotate the bones however I want when animating; so if Z is the bone normal/roll and X is the binormal, how should the bones be oriented in Blender with respect to their "natural" plane of movement? Which brings me to:

roomforthetuna posted: The fact that you can appear to bend your arm around either axis is really a facet of the Y-axis rotation of the bone above it in the hierarchy; when you rotate *that* bone 90 degrees, you effectively switch the directions of the X and Z axes of the lower bone (sign notwithstanding). The body does a pretty good job of hiding this, but with your elbow out, try touching your fist to your chest, watch the joint, and then try to point your lower arm upwards without the joint rotating. It doesn't work.

I can see this, but what about the shoulder? It can rotate forwards (holding my arms in front of me) and sideways, so that my arms point away from my body. Or is my collarbone doing the rotating that causes that axis change?
|
# ¿ Dec 18, 2016 03:50 |
|
I am struggling with this shader: Perlin Noise Shader. I'd like it so that, regardless of the dimensions of the plane it's on, it properly displays the generated noise instead of stretching or squashing. Can I get it to use world coordinates instead of vertex coordinates?
|
# ¿ Jan 1, 2021 02:24 |
|
Basically I just want to (in Unity) generate procedural Perlin/simplex noise to a texture for real-time terrain generation (I tried using DOTS/Unity's Jobs system, but 5k by 2k took an unacceptable 10 seconds). In Unity you use Graphics.Blit, which at first didn't work very well, but eventually some people were able to help me solve it, changing it from code:
code:
To then get the material displayed on the plane not to squash/stretch when I change the scale of the plane to something non-square (1408, 1, 512), I did this: code:
Where _Scale is something like (2.75, 1) in the 1408-by-512 case, to make sure the noise/material displayed on the plane isn't stretched. But now I want to be able to both zoom and pan (zoom is controlled by _Frequency), and I can't get that to work. If I put the offsets in inoise, it pans and stays centered but has a weird parallax/cloud effect. code:
code:
code:
I don't really need 3D or 4D noise, because I'm likely just going to use a falloff map to ensure the map is surrounded by water, so I don't really need the noise to wrap around. I can probably live with the parallax, but if there's a simple change to fix it, I'd be grateful.
|
# ¿ Jan 1, 2021 21:01 |
|
Xerophyte posted: Right, OK. I'm kind of avoiding doing a deep dive into that particular simplex noise implementation, because it should be irrelevant to your problem and it seems very Unity-specific. I would again be deeply surprised if Unity does not have a better solution to your problem than "edit this shader", but I don't know Unity. Someone who does could probably give a better answer.

Thanks for this! I was able to gradually solve it with some help in the Unity Discord, so for reference I'll document the solution. Unity doesn't really have a better real-time noise-generation solution. I tried the noise-generation function from their math library in DOTS (their multithreading/jobs API), and it still took 10 seconds or more for a 5k-by-2k texture, while this GitHub project was real-time on the shader. It seemed like an unavoidable issue: the snoise(...) function is just very expensive on the CPU with the built-in Unity version, no matter what. Maybe one of those "fastnoise" libraries would've done better, but they use generators and would've needed reimplementing to work multithreaded. The solution in the end looks like this: code:
code:
code:
code:
I figure it's much faster to blend textures on the GPU and re-Blit the result to a new texture than either to use multithreading to apply Photoshop-style blends or just to run a loop. Not that running a loop would've been slow; it was the noise function that made it intolerably slow. But I hated the streaks on the diagonals, so after discovering that Unity's Shader Forge has a Rounded Rectangle node, I spent some time, through sheer trial and error, reimplementing it as a custom node. code:
I then plug the resulting value into another custom-function node that raises the value to a power taken from a separate (a, b) input, to adjust the falloff map. Basically I wanted the "fall off" of a falloff map, but in the shape of a rectangle: the linear falloff map produced ugliness along the diagonals. If I knew how to do something like bicubic sampling or a Gaussian blur, maybe that would've fixed it, but it seemed easier to reimplement the rounded rectangle. code:
Since multiplying ridged noise by fbm noise gives mountainous-looking islands with ridges and so on. I've tested it, and the results are Blit-able back to a texture, so I'm pretty happy that I can get the results I want. I'm basically working on a procedural random world generator where I'd like to give the user/player the ability to create the world they want.
|
# ¿ Jan 5, 2021 18:21 |