unixbeard
Dec 29, 2004

How can I set a vertex to an arbitrary location in a vertex shader?

I want something like:
code:
void main(void)
{
    vec4 vpos = vec4(200.0, 300.0, 0.0, 1.0);
    gl_Position = gl_ModelViewProjectionMatrix * vpos;
}
I tried this, but it doesn't work. It's using ARB, so I believe the coords are OK, but it's GLSL 1.2.


HiriseSoftware
Dec 3, 2004


unixbeard posted:

How can I set a vertex to an arbitrary location in a vertex shader?

I want something like:
code:
void main(void)
{
    vec4 vpos = vec4(200.0, 300.0, 0.0, 1.0);
    gl_Position = gl_ModelViewProjectionMatrix * vpos;
}
I tried this, but it doesn't work. It's using ARB, so I believe the coords are OK, but it's GLSL 1.2.

So you don't see a single point being drawn? What kind of primitives are you trying to render?

unixbeard
Dec 29, 2004

I see all the points; they should be random, but they end up in a circle shape.

It should look like this: [image of randomly scattered points]

But it ends up looking like this: [image of the points collapsed into a circle]

It's just vertex data drawn as points. The locations are stored in a texture attached to an FBO; the shader gets a bunch of vertices that are offsets into the texture, which it uses to look up the x/y location, and then sets gl_Position to what it gets. It's rendered into an FBO. It feels like I'm missing something simple, but I don't know what.

I'm using openFrameworks, but the code is up here: http://pastebin.com/wcUr17xZ.
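For reference, the lookup described above (per-vertex offsets used to fetch x/y from a position texture) would look something like this in a GLSL 1.2 vertex shader. This is a sketch, not the pastebin code; the posTex name and the use of gl_MultiTexCoord0 to carry the offset are assumptions:

```glsl
// Sketch only: names are hypothetical, not from the pastebin.
uniform sampler2D posTex; // texture holding the x/y locations

void main(void)
{
    // each vertex carries its own offset into the position texture
    vec4 stored = texture2D(posTex, gl_MultiTexCoord0.xy);
    gl_Position = gl_ModelViewProjectionMatrix
                * vec4(stored.x, stored.y, 0.0, 1.0);
}
```

Note that this only reads back exact texel values if the position texture is sampled with GL_NEAREST; with GL_LINEAR, each lookup blends neighbouring texels.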

haveblue
Aug 15, 2005



How are you generating the points? If your source of randomness, or however you convert that source to x, y, and z, isn't uniform, you'll see patterns in the result even if the shader is correct.

haveblue fucked around with this message at 23:55 on Dec 27, 2012

roomforthetuna
Mar 22, 2005


gooby on rails posted:

How are you generating the points? If your source of randomness, or however you convert that source to x, y, and z, isn't uniform, you'll see patterns in the result even if the shader is correct.
If you convert it to x, y, and z, and they are reasonably evenly distributed, and the coordinates are being passed through a perspective transform, then you'll see results like those pictured. You only really want to randomize x and y (relative to the camera) if you want something that looks like noise.

unixbeard
Dec 29, 2004

gooby on rails posted:

How are you generating the points? If your source of randomness, or however you convert that source to x, y, and z, isn't uniform, you'll see patterns in the result even if the shader is correct.

I thought of this, but the points are random. I keep a copy of the points and draw them with this:

code:
            glEnableClientState(GL_VERTEX_ARRAY);
            glVertexPointer(3, GL_FLOAT, 0, pos);
            glDrawArrays(GL_POINTS, 0, numParticles);
            glDisableClientState(GL_VERTEX_ARRAY);
and I also copy them out of the texture and plot them, which gives the first image. So the points are random, and there doesn't seem to be anything wacky going on in the texture. I'm pretty sure there's something else I need to do in the shader.

roomforthetuna posted:

If you convert it to x, y and z, and they are reasonably evenly distributed, and the coordinates are being passed through a perspective transform, then you'll see results like those pictured. You only really want to randomize x and y (relative to the camera) if you want something that looks like noise.

It seems to me I should be able to emulate any of the fixed-pipeline processing in the shader. I've done other stuff with shaders and a bunch of points stored in a texture and not had this issue. I've tried other transforms, but no luck. What exactly I'm missing, I really don't know.

unixbeard
Dec 29, 2004

I tried using a regular grid instead of random locations, and it more or less works, wtf :pwn:

The only issue is that the first row/col seems a bit squished, but the code seems to be OK. There really is something funky going on here.

Max Facetime
Apr 18, 2009

Could be CLAMP_TO_EDGE on the texture that's causing the squishiness.

unixbeard
Dec 29, 2004

Sorry I should have been more clear: it "works" for regularly spaced points, but still turns everything into a circle for the random points. So the vertex translation shader appears to be OK, and something else is causing the random data to end up in a circle.

ynohtna
Feb 16, 2007

Long shot, but mip-mapping and filtering are disabled, right?

unixbeard
Dec 29, 2004

ynohtna posted:

Long shot, but mip-mapping and filtering are disabled, right?

Oops, that's it: it was filtering. I owe you a beer if you ever happen to be in Sydney.

I set the mag filter to GL_NEAREST and now it does what I expected; the default was GL_LINEAR. I was also wondering why some of the particles were darker than others, since I set them all to 1.0 and there was no alpha data; maybe that should've been a hint. Why did you think it might be that? I'm still pretty new to OpenGL. Thanks, I've spent a lot of time on this. I knew it was something dumb :argh:

Jewel
May 2, 2009

Could someone explain why the filtering was making it into a sphere? Seems weird. Also, yo, fellow Sydney person.

The Gripper
Sep 14, 2004
What's with all the Sydney nerds programming at 11PM on a Friday night, during the silly season? Truly a city of lame dudes (put my name down in that list too).

ynohtna
Feb 16, 2007

Cool - glad to have helped!

As to why the filtering pulled the points towards the centre: it's because the filtering averages each lookup across its 4 neighbouring texels, and the average of a group of samples from a uniformly distributed n-dimensional set will bias towards the centre of the domain.
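That bias is easy to check numerically. The sketch below (standalone, not thread code) draws uniformly random 2-D points, averages them in groups of 4 the way bilinear filtering would, and compares mean distances from the centre:

```cpp
#include <cmath>
#include <random>
#include <utility>

// Why GL_LINEAR filtering of a texture full of uniformly random
// positions pulls points toward the centre: each filtered lookup
// averages 4 texels, and averages of uniform samples cluster around
// the middle of the domain.
//
// Returns {mean distance-from-centre of the raw points,
//          mean distance-from-centre of the 4-texel averages}.
std::pair<double, double> meanDistances(int n) {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uni(-1.0, 1.0);
    double raw = 0.0, avg = 0.0;
    for (int i = 0; i < n; ++i) {
        double ax = 0.0, ay = 0.0;
        for (int j = 0; j < 4; ++j) {          // 4 neighbouring "texels"
            double x = uni(rng), y = uni(rng);
            raw += std::hypot(x, y) / 4.0;     // distance of a raw point
            ax += x / 4.0;                     // bilinear-style average
            ay += y / 4.0;
        }
        avg += std::hypot(ax, ay);
    }
    return {raw / n, avg / n};
}
```

With 100,000 groups, the averaged points sit roughly half as far from the centre as the raw points, which matches the pulled-inward circle the thread describes.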

slovach
Oct 6, 2005
Am I missing something obvious here with my input layout? I'm trying to get instancing working...

Runtime is complaining that: D3D11: ERROR: ID3D11Device::CreateInputLayout: The provided input signature expects to read an element with SemanticName/Index: 'TEXCOORD'/0, but the declaration doesn't provide a matching name. [ STATE_CREATION ERROR #163: CREATEINPUTLAYOUT_MISSINGELEMENT ]


The input layout:
code:
	layout[0].SemanticName		= "POSITION";
	layout[0].SemanticIndex		= 0;
	layout[0].Format		= DXGI_FORMAT_R32G32B32A32_FLOAT;
	layout[0].InputSlot		= 0;
	layout[0].AlignedByteOffset	= 0;
	layout[0].InputSlotClass	= D3D11_INPUT_PER_VERTEX_DATA;
	layout[0].InstanceDataStepRate	= 0;

	layout[1].SemanticName		= "COLOR";
	layout[1].SemanticIndex		= 0;
	layout[1].Format		= DXGI_FORMAT_R32G32B32A32_FLOAT;
	layout[1].InputSlot		= 0;
	layout[1].AlignedByteOffset	= D3D11_APPEND_ALIGNED_ELEMENT;
	layout[1].InputSlotClass	= D3D11_INPUT_PER_VERTEX_DATA;
	layout[1].InstanceDataStepRate	= 0;

	layout[2].SemanticName		= "TEXCOORD";
	layout[2].SemanticIndex		= 1;
	layout[2].Format		= DXGI_FORMAT_R32G32B32A32_FLOAT;
	layout[2].InputSlot		= 1;
	layout[2].AlignedByteOffset	= 0;
	layout[2].InputSlotClass	= D3D11_INPUT_PER_INSTANCE_DATA;
	layout[2].InstanceDataStepRate	= 1;
The input on the shader:
code:
VOut vs(float4 pos : POSITION, float4 col : COLOR, float4 inst : TEXCOORD0)
{
...
inst is just meant to be another position.


edit: Jesus loving christ I saw it right after I posted it. I'm loving retarded.
layout[2].SemanticIndex = 1;

Should be 0.


edit 2:
Another question... I want to use a matrix for my instance position instead of just a vector, but I don't see a semantic for anything bigger than a float4. Does it even matter? I just went with TEXCOORD since that was the first thing on Google.

Can I actually have a matrix as the input, or do I really need to use 4 individual vectors?

Like what the gently caress is going on here? Looks like I can do it.
http://www.gamedev.net/topic/608857-dx11-how-to-transfer-matrix-to-vs/

but looking at this...
http://msdn.microsoft.com/en-us/library/windows/desktop/bb509647(v=vs.85).aspx

Why is WORLDVIEW absent from here?

slovach fucked around with this message at 14:03 on Dec 29, 2012
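On the matrix question: in D3D11, semantic names other than the SV_* system values are arbitrary user-defined strings, which is why WORLDVIEW isn't on that MSDN list (that page covers system-value and legacy semantics). A float4x4 input is legal; it simply consumes four consecutive 4-component input registers, so the input layout needs four elements with the same semantic name and indices 0 through 3. A sketch, with a made-up semantic name:

```hlsl
// Hypothetical instancing input: the float4x4 arrives as four float4
// rows with semantics INSTANCEMAT0..INSTANCEMAT3, so the C++ side
// declares four matching D3D11_INPUT_ELEMENT_DESC entries
// ("INSTANCEMAT", SemanticIndex 0..3, DXGI_FORMAT_R32G32B32A32_FLOAT,
// D3D11_INPUT_PER_INSTANCE_DATA, InstanceDataStepRate 1).
struct VIn
{
    float4   pos      : POSITION;
    float4   col      : COLOR;
    float4x4 instance : INSTANCEMAT;
};
```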

Boz0r
Sep 7, 2006
I'm reading this GPU Gems article on shadow maps, and in the section on Percentage-Closer Filtering they say that shadow maps cannot be prefiltered. Why is this?

http://http.developer.nvidia.com/GPUGems/gpugems_ch11.html

OneEightHundred
Feb 28, 2008


Boz0r posted:

I'm reading this GPU Gems article on shadow maps, and in the section on Percentage-Closer Filtering they say that shadow maps cannot be prefiltered. Why is this?

http://http.developer.nvidia.com/GPUGems/gpugems_ch11.html
Because shadow maps don't store light intensity; they store depth. A depth value is either in front of the depth distance you're checking for shadowing or it isn't, so averaging a shadowed depth value with an unshadowed one won't give you a depth value that is half shadowed; it just gives you a new distance that is itself either shadowed or not.

(In case you're wondering why shadows aren't completely hard-edged despite that: modern GPUs will automatically filter shadow-map lookups at a given depth, but the filtering only goes as far as emulating linear filtering.)

Variance shadow maps CAN be filtered and anti-aliased, but they suffer from artifacts in certain scenarios.
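The point about averaging depths can be made concrete with two texels, one in front of the receiver and one behind it. This sketch (standalone, not code from the article) contrasts the two orders of operations:

```cpp
#include <vector>

// shadowTest: 1.0 = lit, 0.0 = shadowed (texel depth is closer to the
// light than the receiver).
double shadowTest(double texelDepth, double receiverDepth) {
    return texelDepth < receiverDepth ? 0.0 : 1.0;
}

// PCF-style: compare each texel first, THEN average the binary results.
double compareThenAverage(const std::vector<double>& texels, double receiver) {
    double sum = 0.0;
    for (double d : texels) sum += shadowTest(d, receiver);
    return sum / texels.size();
}

// "Prefiltered" order: average the depths first, then do one compare.
// The result is still a single depth comparison, so it can only ever
// be 0 or 1, never a soft in-between value.
double averageThenCompare(const std::vector<double>& texels, double receiver) {
    double avg = 0.0;
    for (double d : texels) avg += d;
    return shadowTest(avg / texels.size(), receiver);
}
```

For texel depths {0.2, 0.9} and a receiver at 0.5, compare-then-average gives the soft 0.5; average-then-compare gives a hard 1.0.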

High Protein
Jul 12, 2009
I'm using a really expensive shader, but I only need to run it sometimes. To take care of the 'sometimes' I'm trying to use the stencil buffer: an earlier pass always writes 1, and then the expensive pass has 'not 1' as its stencil check and 'keep' as all its operations.

Unfortunately, the expensive shader needs to output depth. Somehow, this results in it always being run, even when completely masked out by the stencil buffer. Heck, even if I set its stencil buffer check to 'never' it still runs. If I remove the depth output, everything works as expected. Note that the shader also uses 'discard' ops. Anyway, it seems that the fact that I output depth results in the stencil check taking place after the pixel shader runs, instead of before.

Is this just some quirk of my specific GPU (GTX 580) or is there a good reason for this? I don't understand why the fact that the fragment's depth value might change is relevant, as that doesn't change the fact that it's masked out even by the basic stencil check, so the zfail op doesn't matter anymore. And anyway, it's set to 'keep'. Maybe the GPU just turns off early z and early stencil rejection together?


Boz0r
Sep 7, 2006

OneEightHundred posted:

Because shadow maps don't store light intensity; they store depth. A depth value is either in front of the depth distance you're checking for shadowing or it isn't, so averaging a shadowed depth value with an unshadowed one won't give you a depth value that is half shadowed; it just gives you a new distance that is itself either shadowed or not.

(In case you're wondering why shadows aren't completely hard-edged from that, it's because modern GPUs will automatically filter shadow map lookups at a given depth, but the filtering only goes as far as emulating linear filtering)

Variance shadow maps CAN be filtered and anti-aliased, but they suffer from artifacts in certain scenarios.

Thanks, I get it now.

Can someone explain what homogeneous division is? I've tried reading up on it, but I'm still not sure what it actually is.

OneEightHundred
Feb 28, 2008


Boz0r posted:

Can someone explain what homogeneous division is? I've tried reading up on it, but I'm still not sure what it actually is.
When you output vertex coordinates, you output 4 values: X, Y, Z, and W. The actual location where the vertex winds up on your screen will be (X/W, Y/W), and the depth will be Z/W.

The reason this is done is that the division lets the hardware handle various things that can't be done with just 2D coordinates, like perspective correction and screen-edge clipping for coordinates that cross the W=0 plane.

The depth is divided because doing so produces a non-linear depth distribution that concentrates more depth precision close to the camera.
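As a small worked example (made-up numbers, not from the thread), the divide maps clip-space (X, Y, Z, W) to normalized device coordinates:

```cpp
#include <array>

// Homogeneous (perspective) divide: clip space -> normalized device
// coordinates. Screen position comes from the first two components,
// depth from the third.
std::array<double, 3> perspectiveDivide(double x, double y,
                                        double z, double w) {
    return {x / w, y / w, z / w};
}
```

For a clip-space vertex (2, -1, 4, 4) this gives NDC (0.5, -0.25, 1.0).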

Boz0r
Sep 7, 2006
So the homogeneous division is the projection from view space to screen space, more or less?

lord funk
Feb 16, 2004

I feel like I'm thiiiiiiis close to understanding OpenGL + VBOs + VAOs. But maybe not:

I know you can apply matrix transformations to transform objects, but can you change the actual vertex positions in between frames?

code:
GLfloat _vertexData[] = {
//x, y,                   r, g, b, a
        -1.0f, -1.0f,      0.5f, 0.0f, 0.0f, 1.0f,
        1.0f, -1.0f,       0.0f, 0.5f, 0.0f, 1.0f,
        1.0f,  1.0f,       0.0f, 0.0f, 0.5f, 1.0f,
        -1.0f, 1.0f,       0.5f, 0.5f, 0.0f, 1.0f
};

- (void)setupGL {
   glGenVertexArraysOES(1, &_vertexArray);
    glBindVertexArrayOES(_vertexArray);
    
    glGenBuffers(1, &_vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, 4 * 6 *sizeof(GLfloat), _vertexData, GL_DYNAMIC_DRAW);
    
    glEnableVertexAttribArray(GLKVertexAttribPosition);
    glVertexAttribPointer(GLKVertexAttribPosition, 2, GL_FLOAT, GL_FALSE, 24, BUFFER_OFFSET(0));
    glEnableVertexAttribArray(GLKVertexAttribColor);
    glVertexAttribPointer(GLKVertexAttribColor, 4, GL_FLOAT, GL_FALSE, 24, BUFFER_OFFSET(8));
}
Is there a way to move the x/y coordinates in _vertexData? Is there a way to alter color data? If not, what should I be doing?

EDIT: is it glBufferSubData? and if it is, how come I always figure things out 2 minutes after asking?

lord funk fucked around with this message at 23:19 on Jan 12, 2013

rustak
Jan 15, 2007

lord funk posted:

I feel like I'm thiiiiiiis close to understanding OpenGL + VBOs + VAOs. But maybe not:

I know you can apply matrix transformations to transform objects, but can you change the actual vertex positions in between frames?

code:
GLfloat _vertexData[] = {
//x, y,                   r, g, b, a
        -1.0f, -1.0f,      0.5f, 0.0f, 0.0f, 1.0f,
        1.0f, -1.0f,       0.0f, 0.5f, 0.0f, 1.0f,
        1.0f,  1.0f,       0.0f, 0.0f, 0.5f, 1.0f,
        -1.0f, 1.0f,       0.5f, 0.5f, 0.0f, 1.0f
};

- (void)setupGL {
   glGenVertexArraysOES(1, &_vertexArray);
    glBindVertexArrayOES(_vertexArray);
    
    glGenBuffers(1, &_vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, 4 * 6 *sizeof(GLfloat), _vertexData, GL_DYNAMIC_DRAW);
    
    glEnableVertexAttribArray(GLKVertexAttribPosition);
    glVertexAttribPointer(GLKVertexAttribPosition, 2, GL_FLOAT, GL_FALSE, 24, BUFFER_OFFSET(0));
    glEnableVertexAttribArray(GLKVertexAttribColor);
    glVertexAttribPointer(GLKVertexAttribColor, 4, GL_FLOAT, GL_FALSE, 24, BUFFER_OFFSET(8));
}
Is there a way to move the x/y coordinates in _vertexData? Is there a way to alter color data? If not, what should I be doing?

EDIT: is it glBufferSubData? and if it is, how come I always figure things out 2 minutes after asking?

If you want to replace the values entirely, then yes, glBufferSubData; but not with your vertex layout there. To replace the vertex coords you'd have to call glBufferSubData four times, once per vertex; if you plan on changing them, you should put all your vertex coords first and the colors after (so the positions are in one contiguous block followed by the color data). The way you have it laid out has some benefits for performance, but they're nullified if you're updating the data.

However, it depends on what you mean by "move" or "alter": you can also do it programmatically in your shaders, e.g. by passing in a matrix to multiply the vertices by, as you mention, or by passing in a uniform to scale or offset the color/vertex values.
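The stride/offset arithmetic in that interleaved layout can be pinned down with a struct (a sketch mirroring the 2-float position + 4-float color layout above, not library code):

```cpp
#include <cstddef>

// Mirrors the interleaved layout above: 2 position floats followed by
// 4 color floats per vertex, so the stride is 24 bytes and the color
// data starts 8 bytes in -- the numbers passed to glVertexAttribPointer.
struct Vertex {
    float x, y;          // position, byte offset 0
    float r, g, b, a;    // color, byte offset 8
};

// Byte offset of vertex i's position within the buffer. Updating just
// the positions with glBufferSubData means one call per vertex at this
// offset, since the color bytes in between must not be overwritten.
constexpr std::size_t positionOffset(std::size_t i) {
    return i * sizeof(Vertex);
}
```

Here sizeof(Vertex) and offsetof(Vertex, r) reproduce the 24 and 8 in the glVertexAttribPointer calls.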

lord funk
Feb 16, 2004

Thanks - that makes sense. I'm thinking of making two vertexArrays, one with static objects that will be moved using matrix transformations, and another with dynamic objects whose vertices will change each frame. Does that sound like a good idea?

lord funk
Feb 16, 2004

Seems to be working great. Screenshot: [image]

The circles and pac-man wedges are fixed coordinates, but the lines connecting all the touch points change their vertex x/y position each frame.

Party Ape
Mar 5, 2007
Apologies for the massive code dump here, but I'm trying to make sure I've provided enough information for you all to understand what I'm doing: trying to get a grip on this new OpenGL 4.0 thing with the fancy shaders.

I'm pretty sure I'm making a rookie mistake of some kind here, but I've spent all day with Google going through tutorials; most of them are obsolete, and the rest seem to just replicate what I've done.

Here is how I'm going about it:
- init();
- loadTextureLibrary();
- loadShaderLibrary();
- loadBufferLibrary();

Then the engine calls drawFrame() unto infinity.

(Problem is solved so I removed the massive code dump that was here, if you'd like to see it for some bizarre reason, PM me.)

What it produces is: [image]

The tiny square in the top left is the actual texture, and you can see how it's rendering it. If I had to take a wild rear end guess, I'd say it's using the same UV in the fragment shader for every output pixel, but I don't know why, so if anyone has a theory about what is or isn't firing (or even if you see something I've done that's just terrible or deprecated), please let me know.

Also, I have absolutely no idea what glUniform1i(uniforms[UNIFORM_TEXTURE], 0); does. Doesn't GL_TEXTURE0 activate the first texture unit? Why do we need to pass 0 to the uniform as well?

code:
glActiveTexture(GL_TEXTURE0);
glUniform1i(uniforms[UNIFORM_TEXTURE], 0);
glBindSampler(0, samplers[SAMPLER_TILE]);

Edit: if you're wondering what the problem was, I was using different names for the texture coordinate varying in my vertex and fragment shaders.

Party Ape fucked around with this message at 09:02 on Feb 19, 2013

OneEightHundred
Feb 28, 2008


Heliotic posted:

Also, I have absolutely no idea what glUniform1i(uniforms[UNIFORM_TEXTURE], 0); does. Doesn't GL_TEXTURE0 activate the first texture register? Why do we need to pass '0' to the uniform as well?
The uniform gets set to a texture unit, so you can have multiple samplers bound to the same texture unit if you want. It's kind of a useless feature, though, since texture units don't really have any relevant state other than a bound texture; they probably split it up that way for the sake of encapsulation.

MarsMattel
May 25, 2001

Looks like you've not got any glTexParameter calls, could be that.

Party Ape
Mar 5, 2007
Thanks for the suggestion. I was under the impression that you could set the parameters on the sampler object rather than the texture (but unfortunately that didn't fix the problem).

Unfortunately, the new OpenGL SuperBible is still a few weeks off, so I'll keep poking away at it until then.

Max Facetime
Apr 18, 2009

Heliotic posted:

code:
        glVertexAttribPointer(ATTRIB_VERTEX, 3, GL_FLOAT, GL_FALSE, 0, NULL);
        glVertexAttribPointer(ATTRIB_TEXCOORD, 2, GL_FLOAT, GL_TRUE, 0, NULL);

I don't know why the texture coordinates are set to be normalized but the vertex coordinates are not. They both seem to be the same kind of floats in memory, so shouldn't it be the same for both?

A question of my own:

What does context being lost in OpenGL mean and how should it be handled? How should resources like buffers, shaders, programs and textures be managed in general?

Max Facetime fucked around with this message at 17:20 on Feb 18, 2013

OneEightHundred
Feb 28, 2008


Win8 Hetro Experie posted:

What does context being lost in OpenGL mean and how should it be handled? How should resources like buffers, shaders, programs and textures be managed in general?
Windows will evict your D3D context if the application window is minimized or resized, destroying all of its resources and forcing you to reupload them and reset the device before you can do anything. OpenGL specifically requires that data not be spontaneously lost, so the Windows drivers preserve it. It sounds like mobile devices may also lose the context if the device goes to sleep, but I don't know much about that.

ShinAli
May 2, 2003


Heliotic posted:

holy poo poo

Should texOut in the vertex shader be texCoord? That or rename texCoord in the fragment program to texOut?

ShinAli fucked around with this message at 05:06 on Feb 19, 2013

Party Ape
Mar 5, 2007

ShinAli posted:

Should texOut in the vertex shader be texCoord? That or rename texCoord in the fragment program to texOut?

...

I gotta admit I'm disappointed at how stupid that was. :downs: (Especially since I've worked with shaders before in OpenGL ES 2.) I owe you a beer.

Thanks!

Max Facetime
Apr 18, 2009

OneEightHundred posted:

Windows will evict your D3D context if the application window is minimized or resized, destroying all of its resources and forcing you to reupload them and reset the device if you want to do anything. OpenGL specifically requires that data is not able to be spontaneously lost, so the Windows drivers will preserve them. It sounds like mobile devices may also be able to lose the context if the device goes to sleep, but I don't know much about that.

I guess this means going back to square one and starting again from setting up the display mode and pixel format. Is there any harm, or any defensive-programming advantage, in calling glDeleteBuffers etc. with the old buffer names before allocating new ones?

Party Ape
Mar 5, 2007

The Gripper posted:

What's with all the Sydney nerds programming at 11PM on a Friday night, during the silly season? Truly a city of lame dudes (put my name down in that list too).

Put me down as another ex-Sydneysider (now a Canberran) who spent both Friday and Saturday night playing with OpenGL.

Buncha weirdos, if you ask me.

OneEightHundred
Feb 28, 2008


Win8 Hetro Experie posted:

I guess this means going back to square one and starting again from setting up the display mode and pixel format. Is there any harm or defensive programming advantage in calling delete buffers etc. with the old buffer names before allocating new ones?
Again, keep in mind that this only affects D3D; you can effectively ignore it with OpenGL.

If you are using D3D, a lost context will still hand back memory regions when you map resources, but they'll come from a dummy allocation system that throws the data out when you unmap. Probably the best thing to do is program as usual, but have a mechanism for doing a drop/reload of all resources.

haveblue
Aug 15, 2005



Incidentally, this is why a lot of older PC games are so unstable when alt-tabbing: they weren't written defensively enough to catch the invalidation at literally any point in the program.

I've never seen iOS destroy a context without being specifically told to, but I don't know how Android or Windows Phone handle it. I think ES has the same stipulation as full-fat GL.

OneEightHundred
Feb 28, 2008


gooby on rails posted:

Incidentally, this is why a lot of older PC games are so unstable when alt-tabbing- they weren't written defensively enough to catch the invalidation at literally any point in the program.
It's not so much that they wouldn't catch it "at any point" as that they weren't prepared for invalidation at all. Most old games load all of their data at level start and never again, so asking them to reload data mid-level is a huge kludge that doesn't get well tested. A lot of them referenced textures directly as D3D texture objects, in which case good luck combing your entire code base to strip those out and replace them with handles to a drop-friendly resource manager; games are STILL having trouble with it.

If your app can support manually dropping all resources and reloading them while you're playing, then you should be able to survive a context loss without problem, since the D3D API should still be giving you responses that will at least avoid a crash until then. The best practice is probably to only reference D3D resources via a resource manager that can do a centralized drop/reload.
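That centralized drop/reload can be sketched as a tiny handle-based manager (an assumed design, with a plain int standing in for the D3D resource):

```cpp
#include <functional>
#include <unordered_map>

// Minimal drop/reload-friendly resource manager: callers keep integer
// handles; the "device object" behind each handle can be destroyed and
// recreated without invalidating the handle.
class ResourceManager {
public:
    using Handle = int;
    using Factory = std::function<int()>; // stands in for texture/buffer creation

    Handle create(Factory f) {
        Handle h = next_++;
        entries_[h] = {f, f()};            // remember how to rebuild it
        return h;
    }
    int deviceObject(Handle h) const { return entries_.at(h).object; }

    void dropAll() {                       // context lost: forget device objects
        for (auto& [h, e] : entries_) e.object = -1;
    }
    void reloadAll() {                     // context restored: rebuild from factories
        for (auto& [h, e] : entries_) e.object = e.factory();
    }

private:
    struct Entry { Factory factory; int object; };
    std::unordered_map<Handle, Entry> entries_;
    Handle next_ = 0;
};
```

App code only ever holds Handle values, so dropAll()/reloadAll() can run at any point without invalidating anything the rest of the codebase stores.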

Android dropping the context in certain situations is documented, intended behavior.

OneEightHundred fucked around with this message at 19:47 on Feb 23, 2013


Max Facetime
Apr 18, 2009

drat, the terminology is really confusing, but I think I can make some sense of what happens in OpenGL ES 2.0 and similar.

First, OpenGL doesn't lose its context; EGL loses the OpenGL context. The error code is EGL_CONTEXT_LOST, and it's mentioned in the documentation for eglSwapBuffers and a few other functions. Also, what OneEightHundred said about Android applies.

Then, in WebGL there is the error code gl.CONTEXT_LOST_WEBGL, which gl.getError can return at any point. Khronos has a WebGL wiki with instructions for handling it properly. It's a bit odd, though, since some of the issues mentioned, like something hogging the GPU or the driver being updated, must surely apply to desktop OpenGL as well.

In light of that, I think I should be fine with pretending to do an orderly cleanup phase and then starting initialization from the beginning. If nothing else, at least this should be testable.
