|
Colonel J posted:Thanks for your help, I finally got it working. I couldn't really get the bias matrix to work as it would distort my geometry in strange ways. I just multiplied the vertex positions by 0.5 and translated by 0.5 and they're good now. Is the shadowing working correctly as you're intending? At certain points the shadow of the vertical stick should pass over the horizontal ones, but I can see that it's not. Maybe it's an artifact of my older video card, but it doesn't look right to me.
|
# ? Feb 11, 2014 21:23 |
|
HiriseSoftware posted:Is the shadowing working correctly as you're intending? At certain points the shadow of the vertical stick should pass over the horizontal ones, but I can see that it's not. Maybe it's an artifact of my older video card, but it doesn't look right to me. Yeah, that's a bit of weird behavior that seems tied to the renderDepth of the objects in the scene. I played with it a bit but didn't spend too much time; basically some objects render in the wrong order when the shadow map is generated, so they end up under things that are in front of them. I added yCube.renderDepth = 100; to try and force it to render last and it's better: http://jsfiddle.net/7b9G8/2/ However, I fear it wouldn't hold up for every camera orientation, and I'm not sure how to fix it for good.
|
# ? Feb 11, 2014 21:45 |
|
Not really a question, but I just wanted to say that this Metropolis Light Transport is busting my balls. Our advisor has given us a compact implementation of it to use as a reference, but it's not exactly easy reading.
|
# ? Feb 12, 2014 13:22 |
|
Boz0r posted:Not really a question, but I just wanted to say that this Metropolis Light Transport is busting my balls. Our advisor has given us a compact implementation of it to use as a reference, but it's not exactly easy reading. At its core, it's really just a random number generator that stores the numbers it generates. To do a small-step mutation, you pick one of the previously generated numbers and change it slightly. You then reset the generator and "re-play" the numbers as you're building a new path. A large-step mutation just clears the list of numbers and creates new ones. In either case, before mutating, you'd save the state of the generator. If the path generated with the mutated numbers doesn't get accepted, the generator gets restored to the saved version. Because the transition probabilities for mutating random numbers are symmetric, they cancel out in the acceptance computation. To compute the acceptance probability of the newly generated path, you just divide its brightness by your last path's brightness. Aside from substituting calls to the system's RNG with calls to your own RNG, you don't have to touch your ray tracing code. steckles fucked around with this message at 18:42 on Feb 12, 2014 |
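That description maps almost one-to-one onto code. Here's a minimal sketch of such a replayable generator (class and method names are my own invention, not from any particular renderer; a real Kelemen-style implementation would also wrap mutated values back into [0,1)):

```cpp
#include <cassert>
#include <cstddef>
#include <random>
#include <vector>

class ReplayableRng {
public:
    // Returns the next sample, generating a fresh one only if we've run
    // off the end of the stored sequence; otherwise replays a stored one.
    double next() {
        if (cursor_ == samples_.size())
            samples_.push_back(uniform_(gen_));
        return samples_[cursor_++];
    }
    // Small step: save state, perturb one previously generated number,
    // then rewind so the path gets rebuilt from the start.
    void smallStep(std::size_t index, double delta) {
        saved_ = samples_;
        samples_[index] += delta;
        cursor_ = 0;
    }
    // Large step: save state, forget everything, start a fresh sequence.
    void largeStep() {
        saved_ = samples_;
        samples_.clear();
        cursor_ = 0;
    }
    // Rejected path: restore the state saved before the mutation.
    void restore() {
        samples_ = saved_;
        cursor_ = 0;
    }
private:
    std::mt19937 gen_{42};
    std::uniform_real_distribution<double> uniform_{0.0, 1.0};
    std::vector<double> samples_;
    std::vector<double> saved_;
    std::size_t cursor_ = 0;
};
```

After a rejection you call restore(), and the next path rebuild replays the old numbers exactly, so the current state never drifts away from an accepted sample.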
# ? Feb 12, 2014 18:18 |
|
Thanks, those're good points.
|
# ? Feb 12, 2014 20:15 |
|
Am I missing something here or what with initializing GL? I need a valid context to even get wglChoosePixelFormatARB() ... but to get a valid context, I need to have a pixel format set. And then SetPixelFormat() expects a filled out PIXELFORMATDESCRIPTOR anyway.
|
# ? Feb 13, 2014 11:23 |
|
slovach posted:Am I missing something here or what with initializing GL? PIXELFORMATDESCRIPTOR is a Windows-specific struct, not a general OpenGL struct. You fill that out yourself and use your Windows Device Context to get a pixel format (via ChoosePixelFormat). Then with the pixel format it returns, you call SetPixelFormat, then wglCreateContext. code:
Because Microsoft didn't want to implement OpenGL 3, and OpenGL is a messy API in general, you have to create an OpenGL context using the Windows APIs first. Then you can request the function pointers for the newer OpenGL context creation methods. So you'll need to call wglGetProcAddress and find the address of wglCreateContextAttribsARB. Then you'll end up with two contexts, so you'll have to destroy the temporary one. Or use one of the utilities to do it for you.
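The two-context dance might look roughly like this in plain Win32 C++ (a sketch, not a drop-in: error handling is omitted, hwnd is assumed to be an existing window handle, and the WGL_CONTEXT_* constants and PFNWGLCREATECONTEXTATTRIBSARBPROC typedef come from wglext.h):

```cpp
HGLRC createModernContext(HWND hwnd)
{
    PIXELFORMATDESCRIPTOR pfd = {};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;

    HDC dc = GetDC(hwnd);
    SetPixelFormat(dc, ChoosePixelFormat(dc, &pfd), &pfd);

    // 1. Temporary legacy context, only so wglGetProcAddress will work.
    HGLRC tmp = wglCreateContext(dc);
    wglMakeCurrent(dc, tmp);

    // 2. Fetch the modern entry point through the temporary context.
    auto wglCreateContextAttribsARB = (PFNWGLCREATECONTEXTATTRIBSARBPROC)
        wglGetProcAddress("wglCreateContextAttribsARB");

    // 3. Create the real context, then destroy the temporary one.
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 3,
        0
    };
    HGLRC ctx = wglCreateContextAttribsARB(dc, nullptr, attribs);
    wglMakeCurrent(nullptr, nullptr);
    wglDeleteContext(tmp);
    wglMakeCurrent(dc, ctx);
    return ctx;
}
```

The wglChoosePixelFormatARB pointer can be fetched the same way at step 2 if you need multisampled or sRGB formats that ChoosePixelFormat can't describe.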
|
# ? Feb 15, 2014 09:50 |
|
I tried moving my raytracing project to my desktop Windows PC to get some more power, but when I try running it, I get the most vague error message: code:
|
# ? Feb 19, 2014 15:42 |
|
Boz0r posted:I tried moving my raytracing project to my desktop Windows PC to get some more power, but when I try running it, I get the most vague error message: File permissions, perhaps? See if your user account needs to take ownership of the new files. Making a copy of them in your Documents folder and trying with that should show if this is the cause.
|
# ? Feb 19, 2014 17:43 |
|
If I run the executable as administrator from Windows Explorer I still get the error. I pretty much assume it has something to do with OpenGL, glut, glew, or something.
|
# ? Feb 19, 2014 19:36 |
|
Boz0r posted:If I run the executable as administrator from Windows Explorer I still get the error. I pretty much assume it has something to do with OpenGL, glut, glew, or something. Have you looked into the system event logs, and/or tried starting the executable with a debugger attached?
|
# ? Feb 19, 2014 19:52 |
|
Why would a raytracing app depend on OpenGL stuff anyway? I'd suspect maybe a more general C runtime or something, but that's really guessing.
|
# ? Feb 19, 2014 21:54 |
|
I use OpenGL as a wireframe renderer where I can move the camera around, and then click on a button and have it render the raytracing. It's a framework we got from our instructor. I've tried starting the program with a debugger, but I get the error before it triggers the first line in main(). I don't know how to view system event logs.
|
# ? Feb 20, 2014 11:40 |
|
Boz0r posted:I use OpenGL as a wireframe renderer where I can move the camera around, and then click on a button and have it render the raytracing. It's a framework we got from our instructor. You'd probably be better off in the tech support forum, but from a quick look around, at least one person got this issue from a mismatched freeglut.dll (64-bit) and Visual Studio (32-bit), if you just kinda threw things in there without installing the version your machine needs.
|
# ? Feb 20, 2014 11:51 |
|
Yeah, I was thinking I might need to delete all that poo poo and reinstall it, but I wanted to check for other solutions first.
|
# ? Feb 20, 2014 13:31 |
|
Sounds like it could be a bad executable or binary dependency -- have you built the executable on that machine or just copied it there?
|
# ? Feb 20, 2014 15:09 |
|
I got it working by deleting everything related to glew and glut and reinstalling it.
|
# ? Feb 20, 2014 15:27 |
|
A'ight, so while I couldn't get picking to work in time, my arcball rotation was *perfect*, and I'm pretty drat happy to know that most of my classmates couldn't figure that out. It fills me with some confidence that I'm not out of my depth or anything. The next step, for the now-current assignment, is to render animations using a skeleton system. A *lot* has been provided already: the keyframes, transformations, bones and so on are all given in files, so I just need to read them in and apply the transformations. I'm looking at this tutorial, though it uses some SDL stuff we're not bothering with and has a means to manually move 2D bones to create its own keyframes; does this seem like as good a guide as any for Ancient OpenGL, and does anyone have any suggestions? I'm currently shifting the code as I see it from a struct-based thing to a class-based implementation, as I have this irrational hatred of structs I can never understand. For keeping track of child bones, I'm thinking a vector will do the job.
|
# ? Feb 23, 2014 23:12 |
|
Okay, so an actual question: in the tutorial I linked above, the guy defines his bones to have an angle and a length, but he is working in 2D. In 3D this is puzzling me; I suppose I would need to define an angle for all three planes (well, two: the "roll" of a line is probably irrelevant)? Unlike his example, I don't have angles given to me, just the points between two bones. I imagine it's likely less of a headache to use an implementation that just uses defined points for both ends of the bone, but the tutorial uses angles, so there's a time sink either way. So, suppose I have points A and B: do I derive the vector between them, then derive a "fake" vector from its magnitude on the X,Y (or Z?) planes, and find the angles from there?
|
# ? Feb 24, 2014 04:06 |
|
The vector from B to A is <Ax - Bx, Ay - By, Az - Bz>. Yes, you'll need 3 angles per bone for 3D bones. You should probably make the assumption that when the model is first loaded all of the angles are zero.
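In code, with one possible yaw/pitch convention (the convention here is my own choice for illustration, not something from the tutorial):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

// Bone direction from joint B to joint A.
Vec3 boneVector(const Vec3& a, const Vec3& b) {
    return { a.x - b.x, a.y - b.y, a.z - b.z };
}

// Yaw: rotation about the Y axis, measured in the XZ plane.
double yaw(const Vec3& d) {
    return std::atan2(d.x, d.z);
}

// Pitch: elevation out of the XZ plane. Roll about the bone's own axis
// is the third angle, which two endpoints alone cannot encode.
double pitch(const Vec3& d) {
    return std::atan2(d.y, std::sqrt(d.x * d.x + d.z * d.z));
}
```

A bone pointing straight along +Z gets yaw 0 and pitch 0, matching the "all angles start at zero" assumption above only if the rest pose actually points that way; otherwise these are the rest-pose angles you'd measure from the initial frame.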
|
# ? Feb 24, 2014 06:55 |
|
haveblue posted:The vector from B to A is <Ax - Bx, Ay - By, Az - Bz>. The initial frame is given to us and it's a standard/simple skeleton; all of its bones appear to be angled in some form. Here's what my tutor says: quote:Right, an angle has to be *between* two lines/rays/segments. I think my hunch is correct: I need to create a second reference vector to measure the angle against for the initial frame.
|
# ? Feb 24, 2014 16:56 |
|
Boz0r posted:I tried moving my raytracing project to my desktop Windows PC to get some more power, but when I try running it, I get the most vague error message: If you just copied the executable from your development machine to the desktop, chances are you are missing either some DLLs for the external dependencies (e.g. GLUT) or the Visual Studio runtime that your raytracer was compiled against. Each version of Visual Studio has a different set of DLLs that implement various bits and pieces of C/C++ functionality, and those have to be either in your PATH system variable or in the working directory of the application (generally, the same directory as your executable). If you Google "Visual Studio 20XX runtime" you will find download links directly from Microsoft. You might wanna consider building your code locally on your desktop and running it with a debugger attached; that should give you more information about what's happening.
|
# ? Feb 24, 2014 22:13 |
|
shodanjr_gr posted:Each version of Visual Studio has a different set of DLLs that implement various bits and pieces of C/C++ functionality and those have to be either in your PATH system variable or in the working directory of the application (generally, the same directory where your executable is). If you Google "Visual Studio 20XX runtime" you will find download links directly from Microsoft. You can also (more sensibly in my opinion) configure it to do static linking so that you don't have to distribute those files with your exe, for any project that isn't going to be made of a bunch of modules. Assuming you're not using MFC or something. (I expect you'd still need the GLUT DLL, and there's no static linking DirectX either, but the runtime libraries I'm pretty sure still do.)
|
# ? Feb 25, 2014 06:12 |
|
roomforthetuna posted:You can also (more sensibly in my opinion) configure it to do static linking so that you don't have to distribute those files with your exe, for any project that isn't going to be made of a bunch of modules. Assuming you're not using MFC or something. That's actually a better approach for something small-scale. I also believe that GLUT can be static-linked. On another note, is there an OpenGL debugger that works properly in Windows 8.1? I used to use gRemedy's gDebugger in Windows 7 and it worked great for me, but since I moved to Win 8.1, it crashes whenever I try to look at textures/buffers at a breakpoint. I tried AMD's version as well, but it crashes the same way. NVidia's NSight graphics debugger doesn't want to debug non-core GL contexts. edit: To answer my own question, AMD's GPU PerfStudio 2 seems to work fine on NVidia cards (including GLSL on-the-fly editing and all the nice HUD injection stuff). The UI is somewhat more janky than gDebugger's (slower, .NET-based, and talks to a server over HTTP) but it gets the job done and the profiling tools are better. However, it doesn't do other stuff that gDebugger does, like letting you break on certain GL calls, and it doesn't seem to be able to show you stack traces that lead to API calls either. edit 2: Actually, the Frame Debugger and API Trace functionality works, but the frame profiler doesn't, since it seems to need access to low-level hardware counters. Bummer... shodanjr_gr fucked around with this message at 07:14 on Feb 26, 2014 |
# ? Feb 26, 2014 05:07 |
|
I'm currently trying to convert a pbrt scene to fit into our own ray tracing framework. When reading the .pbrt file I see some values get set, while I guess other values assume defaults - there are a lot of translates back and forth. Would it be possible for me to set a flag or parameter for pbrt such that when running it will be super-verbose - like telling me the actual world coordinates of the camera, the up-vector, the vertex coordinates and such?
|
# ? Feb 27, 2014 21:32 |
|
I'm working with Peter Pike Sloan's paper Stupid Spherical Harmonics Tricks right now, trying to implement a Monte Carlo integrator for the rendering equation. In appendix A he gives recurrence equations for calculating the associated Legendre polynomials, P(m,m) = (1 - 2m) P(m-1,m-1), P(m+1,m) = (2m + 1) x P(m,m), and (l - m) P(l,m) = (2l - 1) x P(l-1,m) - (l + m - 1) P(l-2,m), and then states you increment m in the outer loop and l in the inner, which leads me to believe that to find P(l,m) you have to start with P(0,0) = 1 and work your way up the l's to the one you want, then do P(1,1) and work your way up the l's again, etc. It works for the P(l,0) values, but as soon as I hit P(1,1) it's like the equations up there break. Mathematica tells me P(1,1) = 0, but if you use the equation in the paper you find P(1,1) = (1 - 2*1)*P(0,0) = -1. Who's in the wrong here? (probably me)
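For what it's worth, the full associated Legendre polynomial is P(1,1)(x) = -sqrt(1 - x^2), which is 0 at x = ±1 but -1 at x = 0, so the two answers need not contradict each other: it depends on where you evaluate and whether the sqrt(1 - x^2) factor and the Condon-Shortley sign are kept inside P(m,m) or factored out elsewhere (as SH evaluation code often does). A generic CPU-side evaluator using the standard recurrences, with the sqrt factor kept inside, might look like this (an illustrative sketch, not code from the paper):

```cpp
#include <cassert>
#include <cmath>

// Associated Legendre polynomial P_l^m(x), Condon-Shortley phase included.
double assocLegendre(int l, int m, double x) {
    // P_m^m via repeated P_m^m = (1 - 2m) sqrt(1 - x^2) P_{m-1}^{m-1}
    double pmm = 1.0;
    if (m > 0) {
        double somx2 = std::sqrt((1.0 - x) * (1.0 + x));
        double fact = 1.0;
        for (int i = 1; i <= m; ++i) {
            pmm *= -fact * somx2;   // -fact == (1 - 2i)
            fact += 2.0;
        }
    }
    if (l == m) return pmm;

    // P_{m+1}^m = x (2m + 1) P_m^m
    double pmmp1 = x * (2.0 * m + 1.0) * pmm;
    if (l == m + 1) return pmmp1;

    // (l - m) P_l^m = x (2l - 1) P_{l-1}^m - (l + m - 1) P_{l-2}^m
    double pll = 0.0;
    for (int ll = m + 2; ll <= l; ++ll) {
        pll = (x * (2.0 * ll - 1.0) * pmmp1 - (ll + m - 1.0) * pmm) / (ll - m);
        pmm = pmmp1;
        pmmp1 = pll;
    }
    return pll;
}
```

The loop structure matches the "m outer, l inner" description: for each m you seed with P(m,m), step once to P(m+1,m), then walk l upward with the three-term recurrence.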
|
# ? Mar 1, 2014 19:56 |
|
My MLT is actually looking pretty good, the only problem is that the image keeps getting brighter and brighter the longer it runs. I assume this is wrong as it should converge to f(x), right?
|
# ? Mar 3, 2014 12:39 |
|
Boz0r posted:My MLT is actually looking pretty good, the only problem is that the image keeps getting brighter and brighter the longer it runs. I assume this is wrong as it should converge to f(x), right?
|
# ? Mar 3, 2014 20:10 |
|
I've been playing around with rendering resolution independent 2D shapes on the GPU. Now that I've got a few primitives like circles and bezier curves, I want to start casting dynamic shadows. The approach I'm currently working on is to use OpenCL to take my geometry and lights as inputs and return shadow geometry as the output. The shadow geometry would be triangles representing shadows cast by the objects. Then I'm going to render the original geometry and finally render the shadow geometry on top of it. I'm just curious what you all think of this scheme? I haven't had a lot of luck finding existing information on GPU accelerated 2D shadows, just bits and pieces, but that might be because I'm pretty new to this and I don't know exactly what I'm looking for.
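For the hard-shadow case, the per-blocker geometry is simple enough to sketch on the CPU first (names and the kFar scene bound are mine, for illustration; an OpenCL kernel would do the same arithmetic per silhouette edge):

```cpp
#include <cassert>
#include <cmath>

struct Vec2 { double x, y; };

Vec2 sub(Vec2 a, Vec2 b) { return { a.x - b.x, a.y - b.y }; }

Vec2 normalized(Vec2 v) {
    double len = std::sqrt(v.x * v.x + v.y * v.y);
    return { v.x / len, v.y / len };
}

struct Quad { Vec2 v[4]; };

// Shadow quad cast by blocking segment (a, b) from a point light: each
// endpoint is pushed away from the light far enough (kFar, a hypothetical
// scene bound) that the quad covers everything behind the blocker.
Quad shadowQuad(Vec2 a, Vec2 b, Vec2 light, double kFar) {
    Vec2 da = normalized(sub(a, light));
    Vec2 db = normalized(sub(b, light));
    Vec2 a2 = { a.x + da.x * kFar, a.y + da.y * kFar };
    Vec2 b2 = { b.x + db.x * kFar, b.y + db.y * kFar };
    return { { a, b, b2, a2 } };   // draw as triangles (a,b,b2) and (a,b2,a2)
}
```

Rendering these quads in a dark color (or into a stencil/mask pass) over the lit scene is the classic 2D shadow-geometry approach; curved primitives like circles and beziers reduce to the same extrusion once you find their silhouette points relative to the light.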
|
# ? Mar 5, 2014 04:21 |
|
Ok so I'm trying to build a simple 2D engine in OpenGL (JOGL) and I'm running into a bit of trouble. I'll try to avoid posting all of the source here but here's what I think are the relevant bits: My test program currently does this: Java code:
loadGraphic uses JogAmp to load the texture and sticks it into a tree keyed by name: Java code:
I'm currently using immediate rendering like so: Java code:
Java code:
EDIT: Hmm. Manually setting the texID to 1/2 shows the block for both. Somehow the second texture is ending up at both texIDs. DOUBLE EDIT: I'm stupid and thought texture binding was done after glBegin(). Fellatio del Toro fucked around with this message at 09:33 on Mar 8, 2014 |
# ? Mar 8, 2014 09:03 |
|
I haven't used JOGL so I'm not sure what it does internally, but I'm noticing a couple of things wrong with your code. You start a draw call (gl.glBegin()) and then inside of that you render the vertices for both of your primitives. I'm pretty sure that non-vertex state cannot be changed within a draw call (between glBegin() and glEnd()). You should do your glBindTexture call outside the glBegin()/glEnd() block and only submit vertex geometry within that block. Probably what happens in your situation is that the second glBindTexture() call within your draw call ends up getting applied after the draw call ends and is used as the active texture for the draw calls in the next frame. You probably want to rewrite your Graphics2D::draw() function like this: code:
code:
|
# ? Mar 8, 2014 09:30 |
|
Yeah that did it. I never noticed the issue up until now as I never tried more than one texture. Hurrah! Fellatio del Toro fucked around with this message at 09:36 on Mar 8, 2014 |
# ? Mar 8, 2014 09:34 |
|
Does JOGL not have shaders and VBOs and stuff? glBegin and friends have been dead since forever.
|
# ? Mar 8, 2014 14:47 |
|
I'm pretty sure it does but every resource I've found on OpenGL starts with immediate rendering and gets into that stuff later. I was trying to start working with VBOs yesterday but I'm having a hard time finding a good reference.
|
# ? Mar 8, 2014 18:50 |
|
Suspicious Dish posted:Does JOGL not have shaders and VBOs and stuff? glBegin and friends have been dead since forever. When you're only pushing a few hundred triangles, using deprecated immediate mode APIs isn't a big deal.
|
# ? Mar 10, 2014 22:49 |
|
So I'm trying to figure out linear blend skinning and nothing's working. This is the formula we're given: v' = sum_i(w_i * M_i * v), a weighted sum of each bone's transform applied to the vertex. And I thought I understood it but now the more I look at it the more I'm starting to think I have it completely wrong. code:
Thoughts? Either my matrix multiplication isn't working the way I thought it should work or I completely misunderstood the formula.
|
# ? Mar 11, 2014 18:44 |
|
Raenir Salazar posted:*skinning stuff* It's kind of hard to know exactly what your transformations look like just from this code, but in general this is what you want to do for each joint: code:
Once you have a transform for each joint, you can upload all those to a shader and use the formula you posted, something like this (if you limit influence to four transforms): code:
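The blend itself can also be checked on the CPU before touching shaders. A translation-only sketch (my own illustrative code; `skinTransforms` stands in for the per-joint world * inverseBindPose products described above, and a real implementation would use 4x4 matrices):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

struct Vec3 { double x, y, z; };

// Translation "matrix" stand-in, so the arithmetic stays readable.
struct Translation { double x, y, z; };

Vec3 apply(const Translation& t, const Vec3& v) {
    return { v.x + t.x, v.y + t.y, v.z + t.z };
}

// Linear blend skinning: v' = sum_i w_i * (skinTransform_i * v),
// where the weights for a vertex sum to 1.
Vec3 skin(const Vec3& v,
          const std::vector<Translation>& skinTransforms,
          const std::vector<double>& weights) {
    Vec3 out = { 0.0, 0.0, 0.0 };
    for (std::size_t i = 0; i < skinTransforms.size(); ++i) {
        Vec3 p = apply(skinTransforms[i], v);
        out.x += weights[i] * p.x;
        out.y += weights[i] * p.y;
        out.z += weights[i] * p.z;
    }
    return out;
}
```

A vertex weighted half-and-half between a stationary joint and one translated by (2,0,0) lands exactly halfway, which is the sanity check to run before suspecting the matrix code.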
|
# ? Mar 11, 2014 19:33 |
|
Zerf posted:It's kind of hard to know exactly what your transformations look like just from this code, but in general this is what you want to do for each joint: By shaders do you mean modern OpenGL stuff? We've been using immediate mode/old OpenGL so far. What actually outputs the mesh is: quote:glBegin(GL_TRIANGLES); So what I've been trying to do is transform the mesh, upload the new mesh, and then that gets drawn. I don't have a real hierarchy for my skeleton; I just draw lines between two points for each bone, and each bone is a pair of two joints kept in a vector array. So to clarify, my skeleton animation works perfectly, but making the jump from that to my mesh is what's difficult.
|
# ? Mar 11, 2014 19:58 |
|
Raenir Salazar posted:By shaders do you mean modern OpenGL stuff? We've been using immediate mode/old OpenGL so far. Then I think what you are missing is the inverse bind pose transform. Is this a school assignment? Is that transform mentioned somewhere? A simple example why it's needed: Imagine that we have two joints, one at position j1abs(5,0,0) and one at position j2abs(5,2,0). j2abs has a relative transform to j1abs which looks like j2rel(0,2,0). Now we have a vertex which we want to skin. This vertex is placed at v1abs(5,3,0). For simplicity, we want to attach this vertex only to the j2 joint. We cannot apply j2's absolute transform to the vertex position right away (that would give us a new position v1'abs(5+5,2+3,0+0)=(10,5,0), which is not what we want). Therefore, we define the inverse bind pose transform to be the transformation from a joint to the origin of the model. In other words, we want a transform which takes a position in the model to the local space of a joint. With translations this is simple; we can invert the transform by negating it, giving us j2invBindPose(-5,-2,0). Now, let's try and apply these transformations. First we take the vertex v1 and multiply with the inverse bind pose for j2. This results in a position (0,1,0) (see, we are now in joint local space). Now we can simply apply j1rel(5,0,0) and j2rel(0,2,0), which gives us v1'abs(0+5+0,1+0+2,0+0+0)=(5,3,0), right where we started. Now imagine we change j1rel's transform to j1rel(6,1,0). We again take v1abs(5,3,0)*j2invBindPose(-5,-2,0)*j1rel(6,1,0)*j2rel(0,2,0) = (5 + -5 + 6 + 0, 3 + -2 + 1 + 2, 0+0+0+0) = v1'abs(6,4,0), which is exactly what we want. So, does this explanation make sense, or have I just made you more confused?
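Since everything in that walkthrough is a pure translation, the whole thing can be verified with a few lines of code (translation "transforms" compose by addition here, standing in for the full matrix multiplies a real skeleton needs):

```cpp
#include <cassert>

struct Vec3 { double x, y, z; };

Vec3 add(Vec3 a, Vec3 b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }

// The numbers from the example: joint j2 sits at (5,2,0), so its inverse
// bind pose is the translation (-5,-2,0); j1rel and j2rel are the
// (possibly animated) relative joint transforms.
Vec3 skinVertex(Vec3 v, Vec3 invBindPose, Vec3 j1rel, Vec3 j2rel) {
    return add(add(add(v, invBindPose), j1rel), j2rel);
}
```

With the rest-pose transforms the vertex comes back to where it started, and with j1rel moved to (6,1,0) it lands at (6,4,0), exactly as the worked example says.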
|
# ? Mar 11, 2014 20:42 |
|
I think I see what you mean, but isn't that handled by assigning weights? Example, vertex[0] = <0.0018 0.0003 0.8716 0.0003 0.0004 0 0 0.0006 0.0007 0.0001 0 0.0063 0.0046 0 0.0585 0.0546 0.0002> We're given the file that has the associated weights for every vertex. Each float is for a particular bone from 0 to 16; the file has the 17 weights for each of the 6669 vertices. e: Out of curiosity Zerf, do you have Skype? Any chance I could add you, not just for figuring this out but for OpenGL help and advice in general? Raenir Salazar fucked around with this message at 21:06 on Mar 11, 2014 |
# ? Mar 11, 2014 21:01 |