Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

SupSuper posted:

Nothing like shaders to hide your lack of artistic skills and make even boring spheres look like something else.



They're even animated.

Are you making a Tron clone?

https://www.youtube.com/watch?v=BbBqPkdheFg

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

ijustam posted:

I've owned this thing for 4 years and kept it through 3 moves and I finally got around to doing something with it.

https://www.youtube.com/watch?v=8mHXzukItVk

Now to figure out what exactly to utilize it for.

A giant, low-resolution thermometer/weather station display

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

steckles posted:

The floor and ceiling geometry is imported from .obj models that meet very specific requirements. Walls are determined by the sector edges and the difference in height between each edge and its neighbours, if it has any. I modelled everything in Wings and textured it "in-game".

Optimization has been an interesting task; modern CPUs are such FP monsters that the limiting factor in engine performance isn't rendering the level, it's writing the pixels to the framebuffer. If I could get SDL to give me a screen surface ordered in columns rather than rows, it'd probably double the frame rate.

Do what modern GPUs do and output in a "block-linear" pattern. Instead of going horizontally (bad for reuse):

WWWW
XXXX
YYYY
ZZZZ

Or vertically (bad for write combining):
WXYZ
WXYZ
WXYZ
WXYZ

Do a combination of both:

WWXX
WWXX
WWXX
WWXX
YYZZ
YYZZ
YYZZ
YYZZ

(The number of outputs you do along a given "row" is determined by your memory architecture; you can just experiment with it.)
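
For illustration, here's a minimal sketch (C++, with 2x2 blocks picked arbitrarily) of mapping an (x, y) pixel to an offset in a block-linear framebuffer; the block-size constants are placeholders to tune for your memory architecture, not anything steckles' engine actually uses:
code:
#include <cstddef>

// Minimal sketch: map (x, y) to an offset in a framebuffer stored as
// 2x2 pixel blocks, with the blocks themselves laid out in row-major
// order. BLOCK_W/BLOCK_H are placeholder values; tune them for your
// memory architecture. Assumes width is a multiple of BLOCK_W.
constexpr std::size_t BLOCK_W = 2;
constexpr std::size_t BLOCK_H = 2;

std::size_t blockLinearOffset(std::size_t x, std::size_t y, std::size_t width)
{
    const std::size_t blocksPerRow = width / BLOCK_W;
    const std::size_t blockIndex   = (y / BLOCK_H) * blocksPerRow + (x / BLOCK_W);
    const std::size_t inBlock      = (y % BLOCK_H) * BLOCK_W + (x % BLOCK_W);
    return blockIndex * (BLOCK_W * BLOCK_H) + inBlock;
}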

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

Suspicious Dish posted:

hey look it's a readable lisp

Polish notation is never readable :colbert:

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

Superschaf posted:

But if you reverse it, it all falls into place!

This actually makes me wonder -- are there any languages that allow for "infix functions"? I ask because it occurred to me that "prefix" is basically what's used for function calls in every other language, and I'm wondering if infix is more readable to me simply out of habit or if there's some other sort of ergonomics at play. Obviously C++ allows for a lot of operator overloading, but that's problematic for its own reasons.

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

piratepilates posted:

It's actually really simple and easy to start with, strangely.

The core of it is just a loop through every pixel of what will be your final image, sending out a ray for each pixel, and seeing how those rays interact with the scene. From there it's just adding on parts of it to make the final image nicer.

Ray-Tracer on the back of a business card
code:
#include <stdlib.h>   // card > aek.ppm
    #include <stdio.h>
    #include <math.h>
    typedef int i;typedef float f;struct v{
    f x,y,z;v operator+(v r){return v(x+r.x
    ,y+r.y,z+r.z);}v operator*(f r){return
    v(x*r,y*r,z*r);}f operator%(v r){return
    x*r.x+y*r.y+z*r.z;}v(){}v operator^(v r
    ){return v(y*r.z-z*r.y,z*r.x-x*r.z,x*r.
    y-y*r.x);}v(f a,f b,f c){x=a;y=b;z=c;}v
    operator!(){return*this*(1/sqrt(*this%*
    this));}};i G[]={247570,280596,280600,
    249748,18578,18577,231184,16,16};f R(){
    return(f)rand()/RAND_MAX;}i T(v o,v d,f
    &t,v&n){t=1e9;i m=0;f p=-o.z/d.z;if(.01
    <p)t=p,n=v(0,0,1),m=1;for(i k=19;k--;)
    for(i j=9;j--;)if(G[j]&1<<k){v p=o+v(-k
    ,0,-j-4);f b=p%d,c=p%p-1,q=b*b-c;if(q>0
    ){f s=-b-sqrt(q);if(s<t&&s>.01)t=s,n=!(
    p+d*t),m=2;}}return m;}v S(v o,v d){f t
    ;v n;i m=T(o,d,t,n);if(!m)return v(.7,
    .6,1)*pow(1-d.z,4);v h=o+d*t,l=!(v(9+R(
    ),9+R(),16)+h*-1),r=d+n*(n%d*-2);f b=l%
    n;if(b<0||T(h,l,t,n))b=0;f p=pow(l%r*(b
    >0),99);if(m&1){h=h*.2;return((i)(ceil(
    h.x)+ceil(h.y))&1?v(3,1,1):v(3,3,3))*(b
    *.2+.1);}return v(p,p,p)+S(h,r)*.5;}i
    main(){printf("P6 512 512 255 ");v g=!v
    (-6,-16,0),a=!(v(0,0,1)^g)*.002,b=!(g^a
    )*.002,c=(a+b)*-256+g;for(i y=512;y--;)
    for(i x=512;x--;){v p(13,13,13);for(i r
    =64;r--;){v t=a*(R()-.5)*99+b*(R()-.5)*
    99;p=S(v(17,16,8)+t,!(t*-1+(a*(R()+x)+b
    *(y+R())+c)*16))*3.5+p;}printf("%c%c%c"
    ,(i)p.x,(i)p.y,(i)p.z);}}

Hubis fucked around with this message at 11:18 on Jun 16, 2015

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

Avenging Dentist posted:

Maybe you should try a nice filter instead.

Or, more helpfully, a median filter.

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

Jo posted:

Much obliged. I'll give this a try. I'm still trying to grok the difference in their behavior.

The median is the 'middle value' of the set (i.e. the one in the middle when all values are sorted by some criterion), whereas the (arithmetic) mean is the average of the set.

Since the median value will match one of the adjacent pixels, there will be no interpolation/blurring, so you'll preserve sharp edges in the data set. This is good if you're upsampling something from a limited palette and want the output to stay in that palette, for example. Of course you can only really do this if you have a reasonable way to sort the set to find the median in the first place (and that can be the tricky part for a median filter) but in your case it would be an easy implementation since the image is already black-and-white.
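
As a rough sketch of what that looks like (plain C++, assuming an 8-bit grayscale image stored row-major, with the border just copied through to keep it short):
code:
#include <algorithm>
#include <array>
#include <cstdint>
#include <vector>

// Sketch: 3x3 median filter over an 8-bit grayscale image stored row-major.
// Border pixels are copied through unchanged to keep the example short.
std::vector<uint8_t> medianFilter3x3(const std::vector<uint8_t>& src,
                                     int width, int height)
{
    std::vector<uint8_t> dst = src;
    for (int y = 1; y < height - 1; ++y) {
        for (int x = 1; x < width - 1; ++x) {
            std::array<uint8_t, 9> window;
            int i = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    window[i++] = src[(y + dy) * width + (x + dx)];
            // The median of the 3x3 neighbourhood is the 5th of the 9 values
            // once sorted; nth_element does just enough work to find it.
            std::nth_element(window.begin(), window.begin() + 4, window.end());
            dst[y * width + x] = window[4];
        }
    }
    return dst;
}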

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

evilentity posted:

Working on smooth lights for box2dlights/libgdx. Left is smooth, right is default. Fun!
https://www.youtube.com/watch?v=tMlrP2UYXss

That's a really neat and elegant solution for 2d. I've been working on something surprisingly similar!

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

HardDisk posted:

I always wanted to build a terrain generator. Do you have any recommendations for articles/tutorials?

there's a ridiculous amount of professional AND hobby-level content out there, but this is as good a place to start as any: http://vterrain.org/Elevation/Artificial/

e: this, along with most of what Amit put up, is also great and probably more current (I couldn't find the link before): http://www-cs-students.stanford.edu/~amitp/game-programming/polygon-map-generation/

Hubis fucked around with this message at 22:55 on Dec 18, 2015

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

lord funk posted:



Working on a rippling water effect for our stage lighting.

Does anyone know how to convert a color value (RGB or HSV) to red, cyan, green, yellow, blue, and magenta values (I don't know, RCGYBM)?

Illumination is additive, so your output can be expressed as some linear combination of an arbitrary number of components. Usually this is Red/Green/Blue -- or, more correctly, whatever the primaries are for the color space you're operating in (which are monochromatic 700nm Red, 546nm Green, and 436nm Blue/Violet for CIE RGB). However, there's no reason you can't add more components. If they are chosen correctly, this lets you cover a much larger percentage of the perceptual gamut than the traditional triangular 3-component space would allow (and probably makes for more efficient lighting, since you need a lot more power in Blue wavelengths than Green ones for the same perceived brightness). You might also have more than three primaries because single-wavelength illumination isn't the same as broad-band illumination, so having more primaries lets you better model "full spectrum" lighting.

The MOST CORRECT way to do it would be:

1) Determine the color space your "color values" are actually expressed in.
Your monitor is probably outputting sRGB, but it's quite likely that the stage lights have a wider gamut than that (for reasons mentioned earlier). This doesn't really matter unless you care a lot about how the colors on the screen map to the colors on the stage. Either the colors on the screen will be less colorful and have their hues clamped/compressed, or the colors on the stage will be limited to the sRGB gamut (and may be less vibrant than the light is actually capable of). In some sense you can kind of skip this step to start, but that just means you're making some implicit decisions/not worrying about the mapping from display to actual lighting.

One option would be to do all your simulation in XYZ space (which uses imaginary colors) which will make step 3 trivial. To display on your monitor, just do an XYZ-sRGB transform first, clamping the final color values to the sRGB gamut.

2) Determine the color coordinates of each of your primaries in xyY space.
You can start with educated guesses using an annotated CIE 1931 chromaticity diagram. Once you've got the system working, you can tweak them a little bit if the output seems biased to one shade or another.

3) Transform your sim output to xyY.
If you had RGB values, you do this by transforming to XYZ first. This is just a matrix multiplication, with the transform matrix depending on your primaries (i.e. the color space). For sRGB, you multiply your input color by this:
code:
0.4124564  0.3575761  0.1804375
0.2126729  0.7151522  0.0721750
0.0193339  0.1191920  0.9503041
Once you have XYZ values, convert to xyY as follows:
code:
x = X / (X+Y+Z)
y = Y / (X+Y+Z)
Y = Y
In this space, 'x' and 'y' are your chromaticity dimensions, and Y is the perceived luminance.
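
For example, a minimal sketch of step 3 (the Vec3 struct and function names are just mine for illustration; the input is assumed to already be linear sRGB, with no gamma applied):
code:
struct Vec3 { float x, y, z; };

// Sketch: linear sRGB -> XYZ using the matrix above, then XYZ -> xyY.
Vec3 srgbToXYZ(const Vec3& rgb)
{
    return {
        0.4124564f * rgb.x + 0.3575761f * rgb.y + 0.1804375f * rgb.z,
        0.2126729f * rgb.x + 0.7151522f * rgb.y + 0.0721750f * rgb.z,
        0.0193339f * rgb.x + 0.1191920f * rgb.y + 0.9503041f * rgb.z
    };
}

Vec3 xyzToXyY(const Vec3& c)   // returns (x, y, Y)
{
    const float sum = c.x + c.y + c.z;
    if (sum <= 0.0f)
        return { 0.3127f, 0.3290f, 0.0f };   // black: fall back to the sRGB/D65 white point chromaticity
    return { c.x / sum, c.y / sum, c.y };
}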

4) Compute the relative proportion of the primaries to produce the output
Find a solution to the following linear equation:
code:
x_sim = K_red*x_red + K_yellow*x_yellow + K_green*x_green + K_cyan*x_cyan + K_blue*x_blue + K_magenta*x_magenta
y_sim = K_red*y_red + K_yellow*y_yellow + K_green*y_green + K_cyan*y_cyan + K_blue*y_blue + K_magenta*y_magenta
Note that this assumes all your x/y values are at the same perceived luminance, Y.
The 'x' and 'y' values above are the coordinates for the primaries you determined in step 2. You want to solve for the various K values, which will be the relative brightness of each output. Since you're solving for 6 variables with only 2 equations, the system is under-constrained, meaning multiple sets of values produce the same perceived color. This is what's called "metamerism". Pick a heuristic to simplify the problem however you want. One option might be to compute the distance in xy space to each of the primaries, and then only solve for the furthest 1 and the nearest 2.
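
One hedged way to implement that: once you've picked three primaries whose triangle contains your target chromaticity, you can pin down a unique answer by adding a sum-to-one constraint on the K values (that constraint is my assumption, not part of the equations above), which amounts to computing barycentric coordinates:
code:
#include <cmath>

struct Xy { float x, y; };

// Sketch: solve K0*p0 + K1*p1 + K2*p2 = target with K0 + K1 + K2 = 1,
// i.e. the barycentric coordinates of 'target' in the triangle (p0, p1, p2).
// The sum-to-one constraint is an added assumption that gives the otherwise
// under-constrained system a unique solution.
bool solveThreePrimaries(Xy p0, Xy p1, Xy p2, Xy target, float K[3])
{
    const float det = (p1.y - p2.y) * (p0.x - p2.x) + (p2.x - p1.x) * (p0.y - p2.y);
    if (std::fabs(det) < 1e-8f)
        return false;   // degenerate triangle of primaries

    K[0] = ((p1.y - p2.y) * (target.x - p2.x) + (p2.x - p1.x) * (target.y - p2.y)) / det;
    K[1] = ((p2.y - p0.y) * (target.x - p2.x) + (p0.x - p2.x) * (target.y - p2.y)) / det;
    K[2] = 1.0f - K[0] - K[1];
    // A negative K means the target lies outside this triangle of primaries;
    // pick a different set (or clamp and renormalize).
    return true;
}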

5) Adjust outputs based on desired luminance
Scale the computed K values by the Y value from step 3. This is your brightness for each primary.
Note that this might not technically be correct, depending on what your desired white point is, etc.

Hubis fucked around with this message at 20:25 on Jan 12, 2016

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...
(multi-posting for clarity)

That being said, a quick and dirty way to do it would be the following:

1) Convert your sim output to HSV

2) Determine the two primaries on the color wheel that your H value lies between.
Hue values are as follows:
Red: 0
Yellow: 60
Green: 120
Cyan: 180
Blue: 240
Magenta: 300

https://upload.wikimedia.org/wikipedia/commons/3/33/Hsv-hexagons-to-circles.svg

3) Determine the saturated color output based on Hue and the two primaries.
If H were 30, for example, your saturated output would be:

R = 0.5
Y = 0.5
G = 0.0
C = 0.0
B = 0.0
M = 0.0

If H were 195, it would be
R = 0.0
Y = 0.0
G = 0.0
C = 0.75
B = 0.25
M = 0.0

4) Based on the Saturation value, determine the actual color output
A color with 0 saturation would produce
R = Y = G = C = B = M = (1/6)

So using the S value, linearly interpolate from the unsaturated value (1/6) to the saturated color determined in step 3 to produce your final component proportions.
code:
R = S * R_s + (1-S) * (1/6)
Y = S * Y_s + (1-S) * (1/6)
G = S * G_s + (1-S) * (1/6)
C = S * C_s + (1-S) * (1/6)
B = S * B_s + (1-S) * (1/6)
M = S * M_s + (1-S) * (1/6)
5) Scale your results based on V
Take the relative component values from step 4 and multiply them by V to get the final overall brightness.
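
Putting the whole quick-and-dirty method together as a sketch (the function name and R/Y/G/C/B/M output ordering are my own; h in degrees, s and v in [0, 1]):
code:
// Sketch of the quick-and-dirty HSV -> 6-primary mapping above.
// h is the hue in degrees [0, 360); s and v are in [0, 1].
// Output order: R, Y, G, C, B, M.
void hsvToSixPrimaries(float h, float s, float v, float out[6])
{
    // Steps 2-3: find the two neighbouring primaries on the hue wheel
    // (R=0, Y=60, G=120, C=180, B=240, M=300) and blend between them.
    float sat[6] = { 0, 0, 0, 0, 0, 0 };
    const int   lower = static_cast<int>(h / 60.0f) % 6;
    const int   upper = (lower + 1) % 6;
    const float frac  = (h - 60.0f * lower) / 60.0f;   // 0 at 'lower', 1 at 'upper'
    sat[lower] = 1.0f - frac;
    sat[upper] = frac;

    // Step 4: interpolate toward the unsaturated mix (1/6 in every component),
    // Step 5: then scale everything by V for overall brightness.
    for (int i = 0; i < 6; ++i)
        out[i] = v * (s * sat[i] + (1.0f - s) * (1.0f / 6.0f));
}
For example, hsvToSixPrimaries(195.0f, 1.0f, 1.0f, out) gives C = 0.75 and B = 0.25 with everything else at zero, matching the example above.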

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

lord funk posted:

You are awesome Hubis. Thanks - that was really helpful.

Glad to help -- it's fortunate that I happened to just be doing a lot of reading on color spaces recently :)

Another way of thinking of that second method, btw, is to picture the color space as a hexagon made of 6 triangles, with the pure colors at the corners and middle grey at the center. Then you use Hue to determine which triangle you fall into, which gives you the three 'primaries' you want to use (middle grey, and the two saturated colors).

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

lord funk posted:

I'm totally with you there. I've got a minBrightness value and some scaling parameters in there to lessen the full on/off effect. Makes it a lot more natural looking. The piece we're playing is going to slowly ramp from choppy/bright water to steady/deep water over 30 minutes.

Hubis you weren't kidding about the LEDs being able to produce colors the laptop monitor doesn't at all. I've got two implementations: one that translates RGB into the 7 color values, and the second 'quick and dirty' implementation you posted above. The latter is better at producing the full range of color, but at 0% saturation the lights are very dim (all at 1/6 strength). I'm messing around with ways to keep the brightness up as the saturation goes down.

I almost posted something about this, but figured it might not be a problem. It could be that the perceived brightness is nonlinear with respect to the numbers you're feeding the system.

One option would be to modify your 'white point' (the 1/6th middle grey) to be brighter, so that as you desaturate you're doing so to a brighter color. This still assumes it's linear, but might be enough to fix the problem. Note that you could also make the white point non-uniform (vary the brightness of different components) to tint the color somehow if you wanted.

Another option would be to multiply your final brightness by a uniform saturation scale factor. In other words, something like
code:
V_out = V * (1 + K*pow(1 - S, N))
This would make your fully desaturated light (1 + K) times brighter than your saturated light, with N controlling the falloff (1 = linear, 2 = quadratic, etc).

Finally, it might be some sort of gamma-like issue. You could try scaling all your individual components by a gamma factor before output:
code:
K_color_out = pow(K_color, 1/N)
(This assumes K_color is in [0, 1]; try N=2.2.)

This would amplify color components with low brightnesses, and leave high brightnesses mostly unchanged. This is akin to the gamma transforms you have to do when outputting a color to the screen when rendering in sRGB. Note that this would have the effect of reducing saturation somewhat, but might be more perceptually correct.
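
A tiny sketch combining those two tweaks (K, N_boost, and N_gamma are placeholder constants to tune by eye; 'components' here are the six proportions from the earlier step):
code:
#include <cmath>

// Sketch: apply a gamma-like lift to each of the six component proportions,
// then scale by a brightness that gets boosted as saturation drops.
// K, N_boost, and N_gamma are placeholders to tune by eye.
void adjustOutput(float components[6], float s, float v)
{
    const float K       = 2.0f;   // fully desaturated light up to (1 + K)x brighter
    const float N_boost = 2.0f;   // shape of the boost as saturation drops
    const float N_gamma = 2.2f;   // gamma-like lift on each component

    const float vOut = v * (1.0f + K * std::pow(1.0f - s, N_boost));
    for (int i = 0; i < 6; ++i)
        components[i] = vOut * std::pow(components[i], 1.0f / N_gamma);
}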

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

Suspicious Dish posted:

watch in amazement as this random noise becomes... a meme!!

*audience oohs*

*trains a deep-learning net on an endless supply of internet memes*
*back-feeds a white noise signal into the outputs*

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

Baloogan posted:

the network dreams of memes

code:
August 29th, 2017 - 2:14am: FYAD becomes self-aware

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...
cross-post from the 3D graphics thread...

Hubis posted:

If anyone is going to be at GDC this year, feel free to swing by and check out my talk Wednesday afternoon: Fast, Flexible, Physically-Based Volumetric Light Scattering

https://www.youtube.com/watch?v=lQIZzKBydk4

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...
I just assumed it was a David Bowie reference

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

Sex Bumbo posted:

I'll make everyone feel better about themselves: here's a burger emoji that I rendered in the stupidest way possible

https://www.shadertoy.com/view/Xlt3zs

Mods plz change my username to "Procedural Emoji" kthx

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

So kind of like a super simple IK solver?
