lord funk
Feb 16, 2004

Sweet benevolent baby jesus I finally got one to work :negative:



Next up: figuring out how to make this way less painful. The method I found that worked used *14* mouse clicks per mesh+bone just to pair them with a 1.0 weighting.

Also I bought an $11 3-button mouse, because jfc the middle mouse button should never be part of any UI, let alone its #1 element.


Ranzear
Jul 25, 2013

I've thought about this a lot and my (unimplemented) solution is to export just the armature and attach the meshes to each bone by name later. Makes it easier to swap out turret types or add extra bones (say, spinning radar dish on top of the mantlet, etc) without breaking previous configurations. Extraneous bones are basically free to add later and meshes can be swapped freely per bone without mucking with weights at all. Animations don't care about the extra bones either.

The only fiddly bit is getting origins and rotations set right for each mesh on export. Since they'll be parented to the bone, one has to reverse all the bone positions and rotations back to origin and then export just the single mesh.

Ranzear fucked around with this message at 03:37 on Dec 17, 2022

lord funk
Feb 16, 2004

In my case, since I expect to always have one mesh = one bone, I should be able to make a script to automatically create the armature, bones, leaf bones, weighting, and joint orientations. Each bone is the same repetitive procedure, so it shouldn't be too bad. Then all I have to do is parent the ones I want to be attached to each other.

And yeah, I still had some work to get the orientation right (this is correct):

Raenir Salazar
Nov 5, 2010

College Slice
And thus the first continents took shape...



Interesting that the overall contours of these landmasses look surprisingly decent with random floodfill.

We have two continents that ended up brushing up against each other, forming a sort of proto-Eurasian landmass. There's also a large number of lakes and a tendency toward forming bays/fjords and the like.

Ideally I'd like it to not use any cells that touch the left- or right-most edges of the texture, but the top and bottom are fine.

I'm not 100% sure this is free of bugs; it's a bit surprising how many "gaps" I'm finding myself with, but it is random floodfill, so maybe this is to be expected?
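The generator itself isn't shown, but a random floodfill over grid cells can be sketched roughly like this (a Python sketch; the cell layout, growth probability, and function name are all my own assumptions, not the actual implementation). The probabilistic acceptance is exactly what produces ragged coastlines, bays, and occasional inland gaps:

```python
import random

def flood_continent(width, height, seed_cell, target_size, rng):
    """Grow one continent by randomized flood fill from seed_cell.

    Neighbours are accepted with some probability, which is what
    produces ragged coastlines, fjords, and the occasional inland gap.
    """
    land = {seed_cell}
    frontier = [seed_cell]
    while frontier and len(land) < target_size:
        # Expand from a random frontier cell rather than FIFO/LIFO order,
        # so growth is blobby instead of diamond-shaped.
        x, y = frontier.pop(rng.randrange(len(frontier)))
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            # 0 < nx < width - 1 keeps continents off the left/right edges
            if 0 < nx < width - 1 and 0 <= ny < height and (nx, ny) not in land:
                if rng.random() < 0.6:  # growth probability (assumed value)
                    land.add((nx, ny))
                    frontier.append((nx, ny))
    return land
```

Rejected cells stay eligible via other frontier neighbours, so the fill can still route around them, which is one way "gaps" like those described can legitimately appear.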

E: With some plate boundaries overlaid using Paint.net:


(I set the line thickness slightly too thick, so it's pinching off some ends; don't worry too much about it!)

So depending on how the movements and densities are, we should hopefully end up with an interesting elevation map as a result.

I still haven't really set up rules yet for how exactly the information of plates colliding gets propagated to form interesting features like island arcs and so on. Also, some features like the Urals aren't a result of plates colliding in the present day, but of plates colliding hundreds of millions of years ago, and the plates have long since merged?

Not sure how to handle something like that so we can have some off-plate mountain ranges. Maybe, without doing a full simulation, do two passes of the above to model a pretend geological history, with an "older" version that acts as an influence on the "newer" version?

e2: I'm also a little worried about how to resolve how these plates are supposed to move and interact with respect to the actual landmasses when some of the shapes and the way they overlap is a little insensible compared to actual geology. :ohdear:

Raenir Salazar fucked around with this message at 06:10 on Dec 23, 2022

Raenir Salazar
Nov 5, 2010

College Slice
I am very confused. I was making some changes to add arrows to display the tectonic plate movement vectors, and weirdly my continents changed. But in writing this I think I figured it out: does adding a call to Random.Range(...) change the result of subsequent calls? It has to, I suspect? Because that's basically how seeds work, right? You're sampling a generated sequence of pseudorandom numbers; the seed determines the sequence, but the order in which you sample it depends on the calls to Random.Range(). So if I add a loop that uses Random.Range to generate random directions, it will change what the calls to Random.Range in my continent generator function return?

Anyways, I got arrows to display, ideally now I'd find a way of overlapping these images together.



e to clarify: Ideally I wanna stop having to overlap these images manually; it'd be nice to automatically output them. I believe I can copy a source texture to a destination texture, though I'm not sure if that factors in opacity. The tricky part is figuring out an easy way to get the outline of a shape in an image in C#/Unity3D; time to google.

e3: I have successfully(?) automated the above process, so now I can overlay images and also grab the contours of my plate or continent maps.



It's not very readable at the moment; I need to pass a different material so the arrows' colour stands out better, and ideally find a way to thicken the lines.

Raenir Salazar fucked around with this message at 06:13 on Dec 24, 2022

Ranzear
Jul 25, 2013

You have it mostly right. A PRNG like that spits out a deterministic sequence of values. You'll consume them in a different order by calling any Random.foo() in new places. I bet Unity also gives you no guarantee about the order of calls somewhere, which will break in hilarious ways down the line.

The easiest thing to do* is get a unique seed running for every little subsystem and use a single main seed to populate all the subsystem seeds in a given order. This is imperative once PRNG runs several systems, because locking the main seed will keep everything else constant even if order-breaking changes are made to a subsystem (and being able to see that those changes happened is nice too).

*Except you're in Unity, so you'll have to find something other than the built-in Random library. Xorshift128 is okay for gamedev, fails statistical tests, but it has no instantiation so you just get the one seed and that's it. Maybe someone can chime in on how to get a C/C++ dll running as a plugin, then just throw PCG at it. Might be my new favorite rust crate.
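The seed-hierarchy scheme above is small enough to sketch concretely. This is a Python stand-in for any instantiable PRNG (the class and subsystem names are invented for illustration): one master seed deterministically spawns per-subsystem seeds, in a fixed order, so extra draws in one subsystem can't shift what another one sees:

```python
import random

class SeededSubsystems:
    """One master seed -> fixed per-subsystem seeds -> independent streams.

    Adding calls inside one subsystem can no longer shift the values
    another subsystem sees, because each has its own generator.
    """
    def __init__(self, master_seed, subsystem_names):
        master = random.Random(master_seed)
        # Draw subsystem seeds in a FIXED (sorted) order so adding a new
        # subsystem name doesn't reshuffle the existing seeds' order.
        self.rngs = {
            name: random.Random(master.getrandbits(64))
            for name in sorted(subsystem_names)
        }

    def rng(self, name):
        return self.rngs[name]

world = SeededSubsystems(1234, ["continents", "plates", "arrows"])
plate_vals = [world.rng("plates").randrange(100) for _ in range(3)]

# Re-create with the same master seed, but consume extra "arrows" values:
world2 = SeededSubsystems(1234, ["continents", "plates", "arrows"])
world2.rng("arrows").randrange(100)   # extra draw elsewhere
assert [world2.rng("plates").randrange(100) for _ in range(3)] == plate_vals
```

Locking the master seed now keeps every subsystem reproducible even while one of them is under active, order-breaking development.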

roomforthetuna
Mar 22, 2005

I don't need to know anything about virii! My CUSTOM PROGRAM keeps me protected! It's not like they'll try to come in through the Internet or something!

Ranzear posted:

*Except you're in Unity, so you'll have to find something other than the built-in Random library. Xorshift128 is okay for gamedev, fails statistical tests, but it has no instantiation so you just get the one seed and that's it.
I always just end up copy-pasting a Mersenne Twister for this purpose (and rewriting it in whatever language I'm using if necessary; it's so simple that it only takes a few minutes, certainly quicker than trying to figure out how to get a rust crate into Unity!)

Raenir Salazar
Nov 5, 2010

College Slice
Well, it isn't something I have to worry about too much currently, since it isn't like the results change each time I hit the play button, but mainly when I add a new bit of logic that calls Random. It's definitely something to keep in mind, though! Mystery resolved. :)

Ranzear
Jul 25, 2013

roomforthetuna posted:

certainly quicker than trying to figure out how to get a rust crate into Unity!

It's a C/C++ lib first. Just happened to see it has a rust crate recently and PCG was a thing back when I picked up Alea for my JavaScript garbage before rust even existed.

MT can just be a little heavy and slow for gamedev, for instance if you poked it multiple times per particle in some crazy effect. That's the really good case for xorshift.

Needing multiple seeds/streams is the main issue though.
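For reference, xorshift128 itself is tiny, and wrapping it in a class is all it takes to get the multiple independent seeds/streams being discussed. A Python sketch of Marsaglia's algorithm (the 32-bit masking is done by hand since Python ints are unbounded; the seeding scheme here is a simplification I chose):

```python
class XorShift128:
    """Marsaglia's xorshift128: fast, instantiable, NOT crypto-grade."""
    MASK = 0xFFFFFFFF

    def __init__(self, seed):
        # Any nonzero state works; a real implementation would mix the
        # seed into all four words instead of just the first.
        s = (seed & self.MASK) or 1
        self.x, self.y, self.z, self.w = s, 362436069, 521288629, 88675123

    def next_u32(self):
        t = (self.x ^ (self.x << 11)) & self.MASK
        self.x, self.y, self.z = self.y, self.z, self.w
        self.w = (self.w ^ (self.w >> 19) ^ t ^ (t >> 8)) & self.MASK
        return self.w

    def randrange(self, n):
        return self.next_u32() % n   # slight modulo bias; fine for gamedev
```

Each instance is its own stream, so one of these per subsystem gives you the multi-seed setup without touching a global generator.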

Ranzear fucked around with this message at 08:09 on Dec 25, 2022

Raenir Salazar
Nov 5, 2010

College Slice
Small update. I wanted to make the lines thicker, so I was trying to figure out a compute shader for it.



I got this result initially, which was weird but also interesting; the double lines look nice, even if sadly there are some junctions where it doesn't work perfectly.

So I was trying to fix this result, and I caught onto the fact that none of my changes were working.

What ended up happening was I never actually renamed the function in question, I had just copied the old function call and pasted it without changing it.

So I was feeding this output:



As input into the same function which got:



It's a nice result, I might keep it, but that was also a few hours of trial and fail before I realized the root issue. :gonk:

At the least this result DOES make it easier to differentiate the plates.

There is a weird stray pixel apparently in the bottom left, gotta figure out what's up with that.

Mata
Dec 23, 2003
So how big of a deal is floating point handling in multiplayer games these days?
My understanding is that desktops are sufficiently compliant with whatever IEEE standard that in practice, the float nondeterminism threat is a bit overblown. However, if implementing like a lockstep multiplayer model that should work between ARM mobile devices / consoles / desktops, you probably want to use fixed point math in your synchronized parts. Is that accurate?

Chainclaw
Feb 14, 2009

Mata posted:

So how big of a deal is floating point handling in multiplayer games these days?
My understanding is that desktops are sufficiently compliant with whatever IEEE standard that in practice, the float nondeterminism threat is a bit overblown. However, if implementing like a lockstep multiplayer model that should work between ARM mobile devices / consoles / desktops, you probably want to use fixed point math in your synchronized parts. Is that accurate?

With floating point you can't really pick how precision is applied. I think the most common side effect of floating point drift, in a lot of animation-to-game pipelines, is when you have a complex animation way off-origin. You'll see extremities (hands, feet) not end up in the right place due to cumulative floating point drift across the skeletal hierarchy.

Here's an example of a game running into issues due to floating point drift:
https://twitter.com/HeatSig/status/716958992891834368

I think Kerbal Space Program works around a lot of it by keeping the origin centered at the current player's ship and moving the whole universe around the player.

Fixed point math is so much nicer; you have direct control over precision.
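"Direct control over precision" is concrete in, say, a Q16.16 format: every value is an integer count of 1/65536 steps, so the step size is uniform everywhere instead of shrinking near zero and growing far from the origin. A minimal Python sketch (the format choice and helper names are mine):

```python
FRAC_BITS = 16
ONE = 1 << FRAC_BITS          # 1.0 in Q16.16

def to_fix(x):
    # float -> fixed; floats only ever appear at this boundary
    return int(round(x * ONE))

def fix_mul(a, b):
    # The raw product carries 32 fractional bits; shift back down to 16.
    # Note: Python's >> floors (rounds toward -inf) for negatives.
    return (a * b) >> FRAC_BITS

def fix_div(a, b):
    # Pre-shift the numerator so the quotient keeps 16 fractional bits.
    return (a << FRAC_BITS) // b

def to_float(a):
    return a / ONE
```

Everything between `to_fix` and `to_float` is pure integer math, so it's bit-identical on every platform, which is the property lockstep needs.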

KillHour
Oct 28, 2007


Chainclaw posted:

I think Kerbal Space Program works around a lot of it by keeping the origin centered at the current player's ship and moving the whole universe around the player.

I really wish more engines were built with this as the default, since it just makes more sense. Maybe there's a performance reason not to but I have to imagine it's solvable without too much hassle for the people using the engine if you plan for it.

Mata
Dec 23, 2003
The way I ideally deal with it is to have an integer-math-based logical gamestate, which feels robust and is usually pretty easy to reason about, and then a whole graphical frontend where I don't really care if a vertex is drawn at 0.3000004 instead of 0.3 or whatever. It's hard to imagine making a 3D game (or even a 2D one) without converting to floats at some point. And yeah, when doing this translation you can move your world origin to keep float inaccuracies to a minimum.

But my question was more about the "logical backend" thing. I do often run into problems where I need to do a square root or some trig operation, and doing those kinds of operations without involving floats or doubles is laborious and relatively expensive.
Is it a bad idea to do something like
code:
let result = (int)(atan2((float)my_fixed_point_coord.y, (float)my_fixed_point_coord.x));
if you need the result to be exactly the same across all networked clients?

Jabor
Jul 16, 2010

#1 Loser at SpaceChem
Casting the result of atan2 to an int is only going to have seven possible values, so I'm guessing that one isn't such a good idea for reasons unrelated to float precision.

For square roots you will be totally fine as long as your input number is exactly representable as a float - even if different implementations somehow find slightly different results, that discrepancy vanishes when you cast back to an int.

Trig functions I would be more concerned about - even stuff like "does the hardware have a fused multiply-add instruction with infinite internal precision" can change the output slightly here, and if you're unlucky that difference could result in different output integers.
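One deterministic option for the trig case is to keep atan2 in integers end to end, e.g. with CORDIC in vectoring mode. A sketch (the Q16.16 angle unit and iteration count are my choices, not anything from this thread; the atan constants are computed via `math.atan` here for brevity, but a real cross-platform build would bake them in as integer literals so no platform's libm is involved at runtime):

```python
import math

FRAC = 16
PI_FIX = round(math.pi * (1 << FRAC))  # pi in Q16.16 angle units
# atan(2^-i) constants; ship these as literals in a real build.
ANGLES = [round(math.atan(2.0 ** -i) * (1 << FRAC)) for i in range(16)]

def fixed_atan2(y, x):
    """Integer-only atan2 via CORDIC vectoring; returns Q16.16 radians.

    Precision degrades for very small |x|, |y| (the shifts truncate),
    so scale small inputs up before calling.
    """
    if x == 0 and y == 0:
        return 0
    base = 0
    if x < 0:
        # Reflect into the right half-plane; correct by +/- pi afterwards.
        x, y = -x, -y
        base = PI_FIX if y <= 0 else -PI_FIX
    angle = 0
    for i, a in enumerate(ANGLES):
        if y > 0:       # rotate the vector clockwise toward the x-axis
            x, y = x + (y >> i), y - (x >> i)
            angle += a
        else:           # rotate counterclockwise
            x, y = x - (y >> i), y + (x >> i)
            angle -= a
    return base + angle
```

With the table as literals, every client runs the same integer operations in the same order, which sidesteps the fused-multiply-add and libm-approximation concerns entirely. (Square roots get the same treatment even more easily: an integer `isqrt` is a few lines of Newton iteration.)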

more falafel please
Feb 26, 2005

forums poster

I'm not sure you want to be using a lockstep model for a game that's going to be on mobile.

Xerophyte
Mar 17, 2008

This space intentionally left blank

Mata posted:

Is it a bad idea to do something like
code:
let result = (int)(atan2((float)my_fixed_point_coord.y, (float)my_fixed_point_coord.x));
if you need the result to be exactly the same across all networked clients?

It certainly can be, depending on how diverse "all clients" is. Consistency is not guaranteed by any of the C or C++ standards, at least.

Even on just PCs, Intel and AMD hardware have used different implementations for their trig instructions. On top of that, the different standard libraries have used different approximations for the various transcendentals, often as workarounds for when the hardware instruction turns out to be slightly rubbish. I think things are a little more consistent today, and as long as your platforms are all Intel/AMD x87 derivatives (E: or all ARM, etc.) then you can probably make it work, as long as you control the toolchain and explicitly set denormal handling, intermediate rounding, etc.

Implementing your own fixpoint (or float) trig isn't too hard if you need to be sure, at least as long as accuracy isn't critical. There are libraries around for it.

Xerophyte fucked around with this message at 08:55 on Dec 29, 2022

Hughlander
May 11, 2005

Mata posted:

So how big of a deal is floating point handling in multiplayer games these days?
My understanding is that desktops are sufficiently compliant with whatever IEEE standard that in practice, the float nondeterminism threat is a bit overblown. However, if implementing like a lockstep multiplayer model that should work between ARM mobile devices / consoles / desktops, you probably want to use fixed point math in your synchronized parts. Is that accurate?

Pretty much. https://randomascii.wordpress.com/category/floating-point/page/3/ has been the gold standard for floats in games for the past 10 years, IMO.

more falafel please
Feb 26, 2005

forums poster

This just reminded me of a bug from Mortal Kombat 2011 that probably would have shipped if I hadn't caught it randomly.

The game was lockstep, so basically the only data sent every frame was the controller input state. We were shipping on Xbox 360 and PS3, both of which supported 16:9 but also 4:3, so you had to support 4:3.

I ran automated tests every night with desync detection turned on (basically, have "fenceposts" littered throughout the code that record __FILE__/__LINE__ into a buffer, optionally with a value. checksum that buffer every frame and send it on the wire to the other player, if your checksums are ever different, end the game and dump your log to a file where they can be diffed to (hopefully) figure out where you diverged). I'd run it on my 3 Xboxes and 3 PS3s, just going through matchmaking with each other and running AI vs AI matches, then I'd look to see if there were any dumps that weren't just from AI logic being nondeterministic (would have been nice to fix that but unnecessary). Two of my Xboxes and PS3s were hooked up to 16:9 TVs, the other set was just on a generic dell monitor from a few years back that was 4:3.
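The fencepost scheme described above is simple enough to sketch generically. This is a Python stand-in (the shipped version was C++ using `__FILE__`/`__LINE__`; the string site tags and CRC32 choice here are my substitutions):

```python
import zlib

class DesyncTracer:
    """Per-frame log of (site, value) fenceposts, reduced to a checksum.

    Each client keeps its own tracer; only the 4-byte checksum goes on
    the wire each frame.  On mismatch, end the game and diff the two
    full logs to find the first divergent fencepost.
    """
    def __init__(self):
        self.log = []

    def fence(self, site, value=0):
        # 'site' stands in for __FILE__ and __LINE__
        self.log.append((site, value))

    def frame_checksum(self):
        # Checksum this frame's log; keep the log around for diffing.
        return zlib.crc32(repr(self.log).encode())

def first_divergence(log_a, log_b):
    """Index and entries of the first differing fencepost, else None."""
    for i in range(max(len(log_a), len(log_b))):
        a = log_a[i] if i < len(log_a) else None
        b = log_b[i] if i < len(log_b) else None
        if a != b:
            return i, a, b
    return None
```

The design point is that the cheap per-frame checksum detects the desync immediately, while the full log (only dumped on mismatch) tells you the exact code site where the two simulations first disagreed.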

We had a function in character scripts called something like AmIOnTheLeft() which was used a lot to figure out mirroring stuff. I have no idea what the usage was here in this specific instance but it was used everywhere.

AmIOnTheLeft() for some reason did screenspace math to determine if you were on the left or not, which was almost always right, but one night I got a dump where one client said AmIOnTheLeft() was true and the other one said it was false, while the two characters were jumping past each other and switching sides.

Fix was to use the goddamn world space "tightrope" that both players were constrained to to trivially figure out if you were more than halfway along it.

tango alpha delta
Sep 9, 2011

Ask me about my wealthy lifestyle and passive income! I love bragging about my wealth to my lessers! My opinions are more valid because I have more money than you! Stealing the fruits of the labor of the working class is okay, so long as you don't do it using crypto. More money = better than!
Going to be the coding snob here and say that unless you code your inner loop in assembly language and then hand roll everything else in C, you are not a real developer yet.

Lol, I'm actually being a teensy bit hypocritical and use Unreal or Game Maker or ClickTeam Fusion, because rolling my own engine is really hard to do. I did back in the Commodore 64 days, but computers have become just a tiny bit more complex over the last forty years.

Ptolo
Oct 31, 2011

This continental drift model looks very cool. Reminds me of some of the things that SimEarth tried to simulate.

I'm wanting to revisit an old project of mine to make a voxel editor:



Main reason I want to go back to it is that I was hosting it on rawgit or something, but it no longer seems to be hosted there? Unsurprising, as I uploaded it five years ago.

I have an itch.io page now so it's a lot easier to just put it up there.

But first I want to strip out all the Minecraft textures, generally move away from trying to replicate the block variety found there. For one thing I dislike having fixed colours for, say, wool blocks and prefer for colour to be material independent.

Is there a standard for voxel data yet, a standardised export format of some kind? I just rolled my own run-length-encoded data contained in a JSON object. I also embedded the JSON data in the image files the program produces, so something like that picture of a basic hut up there contains the data to load it back into the editor :)
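(As far as I know no single interchange standard has won out; MagicaVoxel's .vox is probably the closest to a de facto one.) A roll-your-own RLE-in-JSON along the lines described might look like this Python sketch; the field names and flat-list layout are invented for illustration:

```python
import json

def rle_encode(voxels):
    """Run-length encode a flat list of voxel values (palette indices)."""
    runs = []
    for v in voxels:
        if runs and runs[-1][1] == v:
            runs[-1][0] += 1        # extend the current run
        else:
            runs.append([1, v])     # start a new [count, value] run
    return runs

def rle_decode(runs):
    out = []
    for count, v in runs:
        out.extend([v] * count)
    return out

def to_json(size, voxels):
    # Hypothetical container: grid dimensions plus the run list.
    return json.dumps({"size": size, "rle": rle_encode(voxels)})

def from_json(s):
    doc = json.loads(s)
    return doc["size"], rle_decode(doc["rle"])
```

For the image-embedding trick, the JSON string can ride along in a PNG text chunk (tEXt/zTXt), so the exported picture doubles as the save file, which matches the hut example.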

Raenir Salazar
Nov 5, 2010

College Slice
Not much progress other than to add the cells to my overlay:



And also I FIXED THE ERRANT PIXEL :woop:

Vertices in my mesh that (I suppose) weren't whole numbers on the specific seed I was using resulted in a single pixel being transparent and not coloured with the right colour, presumably because the edges didn't line up correctly with the pixel grid.

While constructing the mesh for Blit'ing I rounded each vertex to the nearest int, and the pixel is fixed. Unfortunately there is still a similar problem where some of the edges of the map are transparent, but I assume it's related; maybe the rounding is off by 1 pixel, or it has something to do with my bounds.

The actual graph overlay though, as seen here:



Is fine with no problematic pixels.

I tried to display cell IDs for debugging purposes, but the quality is a little wanting; some end up cropped at the edges, etc. I could attempt a safe zone and render at a larger resolution, but it doesn't seem important right now.



e: As I suspected, the issue seemed to be my bounds, i.e. (0,0), (0, Height), (Width, Height), (Width, 0) for the bounding sites of my Delaunay triangulation/Voronoi graph. Specifically:

(1,1), (1, Height), (Width, Height), (Width, 1) weirdly works for making all edges of the Voronoi/Delaunay graphs visible, including along the edges of the texture, but results in transparent pixels where the underlying edge doesn't line up correctly and thus doesn't get rendered when the cells are filled in.

(0,0), (0, Height - 1), (Width - 1, Height - 1), (Width - 1, 0) results in the top and right edges being completely transparent by 1 pixel; the Voronoi edges parallel to the top and right edges are visible right before the transparent pixels, but aren't visible at the bottom and left edges.

Meanwhile (0,0), (0, Height), (Width, Height), (Width, 0) works perfectly for the underlying cells being coloured, but not all edges of the Voronoi graph overlay are visible.

This is some chicanery.

Raenir Salazar fucked around with this message at 18:22 on Jan 6, 2023

KillHour
Oct 28, 2007


Raenir Salazar posted:


This is some chicanery.

Joe Biden tried to warn us but we didn't listen

Raenir Salazar
Nov 5, 2010

College Slice

KillHour posted:

Joe Biden tried to warn us but we didn't listen

Hahaha.

In the end I decided the easiest solution is to just NOT have any bounding edges at all, as seen here:



Or rather, I *do* have them, but they're outside the bounds of the texture. It isn't like I really *need* them anyways for anything.

But at least there's no more errant transparent pixels anywhere that I can tell and now if I *do* want to have the outer edge rendered I can just create a rectangle Unity Mesh and Blit it and merge it to the rest of the Texture overlay pile.

e: I am working on making the right and left edges line up, so that if I decide to support some kind of 2D wrapping I don't have to make any weird decisions or handle edge cases. Surprisingly, just mirroring the left-side points to the right-hand side seems to work; what is however exceedingly weird to me is that it also mirrored the cells in the middle area?



e2: The answer, as per usual, is because I'm stoopid. :saddowns: I multiplied the total number of points/area I'm intending to mirror by 0.5 instead of 0.05, so it ended up mirroring all of the left-hand points to the right-hand side!

That feels like a good thing to know, and maybe it has some kind of application. I can't really think of it right now, but it seems like it could be useful.



The cells along the left/right edges are now mirrored and line up. Things to consider:

- Whether the tectonic plates as well as continents should wrap around or basically stop at the left/right edges for simplicity.
- How best to compute adjacent tiles at the wrap-around point. I don't know for sure if the edge cells have their site on the edge (x and y coordinates are zero on the relevant edge). I think it might be the case that their "bounded" property provided by the library I'm using is false, but I'm not wrapping top/bottom so it's not perfect.
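The mirroring trick above boils down to copying the sites near one edge across by the full texture width before building the diagram. A Python sketch of that step (my interpretation; the function name and the 5% margin, i.e. the 0.05 that was accidentally typed as 0.5, are assumptions):

```python
def add_wrap_ghosts(sites, width, margin_frac=0.05):
    """Duplicate sites near the left edge onto the right, shifted by width.

    The Voronoi library then produces matching cells on both edges, so
    the diagram lines up when the map wraps east-west.  Using 0.5 here
    instead of 0.05 copies half the map's points across, which is the
    'mirrored cells in the middle' bug described above.
    """
    margin = width * margin_frac
    ghosts = [(x + width, y) for (x, y) in sites if x < margin]
    return sites + ghosts
```

Since each ghost shares its y coordinate with its source exactly, identifying wrap-adjacent cells later can key on y instead of a distance search.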

Not really coding implementations Food for Thought but:
- I wonder if I should have some means of automatically detecting if a continent goes all the way across the north/south of the map, and either rejecting this or randomly pruning the north, south, or both edges to provide a sea route around? Or just tell Columbus/Magellan gently caress you and let this situation occur.

Still need to implement the actual tectonic plate fault line detection. It seems like it would be best to be on a cell by cell basis as to whether it is converging or diverging.

I'm not particularly happy about needing to potentially implement wrapping, because to my mind it makes the map a little uglier. I like my world maps more "centered", but there isn't a good way to ensure my continents look natural if I were to just prune any cells near the edge.

Raenir Salazar fucked around with this message at 23:59 on Jan 6, 2023

Raenir Salazar
Nov 5, 2010

College Slice
:catstare: :catdrugs:



(I'm trying to see if there's a way of simplifying how I generate my Voronoi mesh for Blit'ing, and instead I found the 10,000 Faces of God.)

e: A slightly more comprehensible version.


And they say the AI can't do art.

So the key idea in my attempt at simplifying my process: I originally took my Voronoi diagram and went face by face, individually triangulating each into its own Unity mesh, which I then merged into one mesh. So for a 5,000-face tessellation it's like having 5,000 submeshes, and each submesh corresponds to a single pixel in my texture map.

I then figured, "What if I still calculated the per vertex UVs but skipped the process of individually meshing each face and just triangulated all of the voronoi diagrams vertices and then converted the whole thing to a Unity3D mesh with the UVs I made?"

Turns out this Will. Not. Work. Because now faces are (I THINK) sharing vertices, which then have many different colours assigned at the same time, which results in the graphics card doing weird interpolation between points.

So maybe I can still simplify the process but I still absolutely need to have separate faces.
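That diagnosis is right: a shared vertex carries one UV/colour attribute, and the rasterizer interpolates it across each triangle, bleeding neighbouring cells together. Flat per-cell colour requires each face to own private copies of its vertices. A Python sketch of that unsharing step (the data layout is assumed, not taken from the actual project):

```python
def unshare_vertices(faces, positions, face_uvs):
    """Duplicate vertices so each face has exclusive copies.

    faces:      list of vertex-index triples (triangles)
    positions:  shared vertex positions
    face_uvs:   one UV per face (the colour-lookup coordinate)
    Returns (vertices, uvs, triangles) with no sharing, so the GPU
    cannot interpolate one face's colour into a neighbour's.
    """
    verts, uvs, tris = [], [], []
    for face, uv in zip(faces, face_uvs):
        tri = []
        for idx in face:
            tri.append(len(verts))      # brand-new index per corner
            verts.append(positions[idx])
            uvs.append(uv)              # every corner: the SAME face UV
        tris.append(tuple(tri))
    return verts, uvs, tris
```

Positions still coincide along cell borders, but since no index is shared, every triangle's three corners carry identical attributes and the interpolation becomes constant across the face. This costs roughly 3x the vertex count, which is the price of flat shading without a geometry shader.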

Raenir Salazar fucked around with this message at 01:51 on Jan 8, 2023

Red Mike
Jul 11, 2011

Raenir Salazar posted:

- I wonder if I should have some means of automatically detecting if a continent goes all the way across the north/south of the map, and either rejecting this or randomly pruning the north, south, or both edges to provide a sea route around? Or just tell Columbus/Magellan gently caress you and let this situation occur.

In a past project I've seen that was similar, their "realistic" solution was to basically make the map that's being generated only fill the space that's non-polar. So basically have the polar areas (a rectangle or polygon at the top/bottom of the final image) pre-generated in a simplified way that makes sure it's traversable, and the non-polar areas (a rectangle or polygon for the rest of the final image) generate using whatever normal method you use. At the end, you merge the two.

The benefit on that project was that it allowed the polar areas to be constrained by some gameplay mechanics (don't accidentally generate endgame things that can be traversed with non-endgame abilities; the rest of the map is allowed to do that because it's not as unbalanced as the polar-area endgame things), while not having to change the generation of the non-polar areas (because all they changed was generating the top/bottom 5% of the image ahead of time). Rejecting or randomly changing the non-polar generation would have been difficult, because it would lead to more unbalanced outcomes.

The downside is that if your generation of the non-polar areas is hard to merge with the polar areas, then you spend more time trying to figure out a nice 'merge'.

Raenir Salazar
Nov 5, 2010

College Slice
Yeah, it depends on how I would go about it. If I tried to slice off two thin rectangles representing the poles and then generated separate Voronoi diagrams on those slices, it would be very spicy to figure out how to merge those diagrams with the main one.

Probably the best way is to iterate through all my cells, check for the ones on the edge (not too difficult, pretty fast with some sort of KD-tree setup) or within that slice, and ideally reserve the bottom of the top slice or the top of the bottom slice as water tiles.

That's if I do want that, though I can see how I might want to leave it as a hyperparameter for the end user. Currently, to link up the east and west edges just to test it, I do an O(N^2) search of the edge tiles, so maybe now's a good time to look into this and also put in that KD-tree.
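For this particular matching, a KD-tree may be overkill: if the right-edge sites were produced by copying the left-edge ones, the y coordinates correspond exactly, so a dictionary keyed on (rounded) y gets the pairing in O(N). A sketch under that assumption (the rounding tolerance is my choice):

```python
def pair_wrapped_edges(left_sites, right_sites, decimals=6):
    """Match each left-edge cell to its wrapped right-edge twin by y.

    Builds an O(N) index instead of comparing every pair.  Assumes the
    right-edge sites were generated from the left-edge ones, so their
    y values match exactly (up to rounding).
    """
    right_by_y = {round(y, decimals): j for j, (_, y) in enumerate(right_sites)}
    pairs = []
    for i, (_, y) in enumerate(left_sites):
        j = right_by_y.get(round(y, decimals))
        if j is not None:
            pairs.append((i, j))
    return pairs
```

A KD-tree becomes worth it later for genuinely unstructured nearest-neighbour queries (the polar-slice merging, say), where no exact key exists to hash on.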

Ranzear
Jul 25, 2013

You're making a rectilinear map, which is always gonna get screwy at the poles anyway.

I once sketchily coded up, but never got anywhere with, a dodecahedron-based projection. Just the mathy bits, borders per face and whatnot. Pentagon faces make up most of a hemisphere, with a pentagon for the pole, and then I planned little rectangular transition strips, or even an entire equatorial rectangle belt that got rid of the weird zig-zag. I also have a note scribbled down somewhere to not adjust the camera until it departs the polar pentagon, which would be less jarring in a top-down setting. My little planetoid was supposed to have six biomes, so it all fit together really nicely.

It'd be fun to adapt your code to triangles on an icosahedron and then do some crazy land-mass-based unfolding into a relatively contiguous flat map.

roomforthetuna
Mar 22, 2005

I don't need to know anything about virii! My CUSTOM PROGRAM keeps me protected! It's not like they'll try to come in through the Internet or something!

Ranzear posted:

You're making a rectilinear map, which is always gonna get screwy at the poles anyway.

I once sketchily coded up, but never got anywhere with, a dodecahedron-based projection. Just the mathy bits, borders per face and whatnot. Pentagon faces make up most of a hemisphere, with a pentagon for the pole, and then I planned little rectangular transition strips, or even an entire equatorial rectangle belt that got rid of the weird zig-zag. I also have a note scribbled down somewhere to not adjust the camera until it departs the polar pentagon, which would be less jarring in a top-down setting. My little planetoid was supposed to have six biomes, so it all fit together really nicely.

It'd be fun to adapt your code to triangles on an icosahedron and then do some crazy land-mass-based unfolding into a relatively contiguous flat map.
I had terrain generation that generated on a the-oval-shape-that-maps-to-a-sphere-alright in a way that wrapped right to left, and did a pretty good job, using some sort of plate-tectonics-style generation. Then it wrapped the resulting image around a sphere and made both a bump map and a color map for a spherical planet. I was doing an Elite-style game, with full ship construction (from parts, not just "here's a hull with all the engines and everything, nail some weapons to it"). I had all the base mechanics working, then decided to do something else instead because the interesting part was done, as is my way. Ages ago; it was using DirectX 7.

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
Is there some Unity asset I can get that's a layer over controller stuff? The knee-jerk reaction is probably "yeah, it's called your game", but I've just spent a bunch of annoying time fixing bugs in things like double button presses that I wish I just didn't have to deal with. I'm already using InControl to abstract the different controllers, but I can see that I could disappear into the realm of managing controls, in a way that's been solved thousands of times before.

leper khan
Dec 28, 2010
Honest to god thinks Half Life 2 is a bad game. But at least he likes Monster Hunter.

Rocko Bonaparte posted:

Is there some Unity asset I can get that is a layer over controller stuff? The knee-jerk reaction is probably, "yeah, it's called your game" but I've just spent a bunch of annoying time fixing some bugs in things like double button pressing that I wish I just didn't have to deal with. I'm already using InControl to abstract the different controllers, but I'm just seeing that I could disappear into the realm of managing controls in a way that's been fixed thousands of times before.

My answer was going to be InControl or rewired, but you already have that.

The 'what do I do when I press a button' problem is pretty easy to solve. I like scriptable objects for it. If you want, you could build one that has a cooldown on firing an event.

Could also be you just need to rework your menus/interactions so they're tolerant of expected player behavior?

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
I'm amused a little over it since, for all the wheels I reinvent, this is one situation where an existing wheel might have actually been useful. I just feel like I devolve too much into conditional, temporally-bound bullshit code when I'm managing my input. Given the little bugs and quirks, it sure looks like just hacking it together to get it done was a bad way to do it.

A lot of it would be better off in state machines. At that point, it's at a level of structure where I would have assumed somebody has been there before and made something out of it.

Like, I'd especially think this would be something you could get as an asset for fighting games in the asset store or something.
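The state-machine version of input handling can be sketched in a few lines (plain Python, all state and action names invented for illustration): a transition table replaces the pile of conditionals, and inputs that aren't valid in the current state just fall through.

```python
class InputStateMachine:
    """Table-driven state machine: (state, input) -> (next_state, action).
    Unknown combinations are ignored instead of spawning conditional spaghetti."""

    def __init__(self, initial, transitions):
        self.state = initial
        self.transitions = transitions  # {(state, input): (next_state, action)}

    def handle(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            return None  # input not valid in this state; ignore it
        next_state, action = self.transitions[key]
        self.state = next_state
        return action

# Example table: a menu that swallows 'fire' while it's open
transitions = {
    ("gameplay", "fire"): ("gameplay", "shoot"),
    ("gameplay", "pause"): ("menu", "open_menu"),
    ("menu", "pause"): ("gameplay", "close_menu"),
}
```

The temporal stuff (holds, double-taps) fits the same shape by adding timestamped states to the table instead of scattering timers through the code.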

orenronen
Nov 7, 2008

Rocko Bonaparte posted:

I'm amused a little over it since, for all the wheels I reinvent, this is one situation where an existing wheel might have actually been useful. I just feel like I devolve too much into conditional, temporally-bound bullshit code when I'm managing my input. Given the little bugs and quirks, it sure looks like just hacking it together to get it done was a bad way to do it.

This kind of complex temporal + state control handling is a perfect fit for UniRx. If you've never used one of the Rx family of libraries before (or any other functional-reactive API) it can be a steep learning curve, but once you figure it out it makes very complex situations trivially easy.
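To give a feel for the idea without UniRx itself, here's a tiny Python sketch of the reactive shape: input events become a stream you filter and transform, and a throttle operator (UniRx has one along the lines of ThrottleFirst) is roughly what kills double-press bugs. The `Stream` class and its operators are invented for illustration, not any real Rx API.

```python
class Stream:
    """Minimal push-based observable: subscribers get each pushed value."""

    def __init__(self):
        self.subscribers = []

    def subscribe(self, fn):
        self.subscribers.append(fn)

    def push(self, value):
        for fn in self.subscribers:
            fn(value)

    def filter(self, pred):
        out = Stream()
        self.subscribe(lambda v: out.push(v) if pred(v) else None)
        return out

    def throttle_first(self, window, clock):
        """Pass the first value, then drop values until `window` time elapses."""
        out = Stream()
        last = [None]
        def on_value(v):
            now = clock()
            if last[0] is None or now - last[0] >= window:
                last[0] = now
                out.push(v)
        self.subscribe(on_value)
        return out
```

The payoff is that "left clicks, at most once per 0.3s" is one chained expression instead of timer bookkeeping spread across an Update loop.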

Isometric Bacon
Jul 24, 2004

Let's get naked!
In my first big project (basically an RTS) I implemented my own raycasting from the camera, with a central input manager and camera control. I had to write extensive input handling code, particularly to have it work with mouse, gamepad, and touchscreen gestures (why doesn't Unity have these by default?), which is always a big operation - but it's quite robust and handles all sorts of different input conditions and devices.

This new project I went 'gently caress it' and allowed my interactable MonoBehaviours to simply use the Pointer Handler interfaces. Wow, that was so much quicker. Starting to run into its limitations pretty quickly, however.

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!

orenronen posted:

This kind of complex temporal + state control handling is a perfect fit for UniRx. If you've never used one of the Rx family of libraries before (or any other functional-reactive API) it can be a steep learning curve, but once you figure it out it makes very complex situations trivially easy.

Yeah I intend to look at that when I have a few contiguous hours again. I'll try to shitpost about it.

Raenir Salazar
Nov 5, 2010

College Slice
Thoughts on how to properly decouple UI from gameplay?

Suppose I have a first person shooter, where if the player puts on sunglasses and left clicks, they give a thumbs up. Without sunglasses they give a wave.

So in the UI that handles tinting the vision, I have a bool that controls "IsWearingSunglasses"; previously, this class is what listened to the left click binding.

I moved the left click control to an input controller class, but to know whether to play the thumbs up or wave animation I still need to know about the internal state of the UI! I'm still tightly coupled!

I suppose I could put the UI state of is wearing sunglasses into a separate gameState class that's globally accessible by the input class, but is that the best way to decouple this?

leper khan
Dec 28, 2010
Honest to god thinks Half Life 2 is a bad game. But at least he likes Monster Hunter.

Raenir Salazar posted:

Thoughts on how to properly decouple UI from gameplay?

Suppose I have a first person shooter, where if the player puts on sunglasses and left clicks, they give a thumbs up. Without sunglasses they give a wave.

So in the UI that handles tinting the vision, I have a bool that controls "IsWearingSunglasses"; previously, this class is what listened to the left click binding.

I moved the left click control to an input controller class, but to know whether to play the thumbs up or wave animation I still need to know about the internal state of the UI! I'm still tightly coupled!

I suppose I could put the UI state of is wearing sunglasses into a separate gameState class that's globally accessible by the input class, but is that the best way to decouple this?

Observer pattern/events. Button click sends a button click event. Your state has something listening to events. It performs the requisite action. The head inventory slot apparently owns this action's result. So on equip, have the thing subscribe to events; on unequip/destroy, unsubscribe.

The sunglasses can fire off a message for the animation system to thumbs up. Works basically the same way, except the animation system subscribes to all of the messages at init. If you want to support very large amounts of animations, you can send a message to the animation system that it should subscribe to some event.
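A sketch of the equip-subscribes / unequip-unsubscribes mechanics described above (plain Python, and `EventHub`, `Sunglasses`, and the event names are all invented for illustration): the input side only publishes "left_click" and never knows who's listening.

```python
class EventHub:
    """Named-event registry: the observer half of the pattern."""

    def __init__(self):
        self.listeners = {}

    def subscribe(self, name, fn):
        self.listeners.setdefault(name, []).append(fn)

    def unsubscribe(self, name, fn):
        self.listeners.get(name, []).remove(fn)

    def publish(self, name):
        for fn in list(self.listeners.get(name, [])):
            fn()

class Sunglasses:
    """While equipped, left clicks queue a thumbs-up animation."""

    def __init__(self, hub, animations):
        self.hub = hub
        self.animations = animations  # stand-in for the animation system

    def on_left_click(self):
        self.animations.append("thumbs_up")

    def equip(self):
        self.hub.subscribe("left_click", self.on_left_click)

    def unequip(self):
        self.hub.unsubscribe("left_click", self.on_left_click)
```

The input class, the UI, and the animation system never reference each other's types; the sunglasses object owns the decision of what a click means while it's equipped.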

dupersaurus
Aug 1, 2012

Futurism was an art movement where dudes were all 'CARS ARE COOL AND THE PAST IS FOR CHUMPS. LET'S DRAW SOME CARS.'

Raenir Salazar posted:

I suppose I could put the UI state of is wearing sunglasses into a separate gameState class that's globally accessible by the input class, but is that the best way to decouple this?

It's this. UI works best when it's as stateless as possible, and any state it does have should only be internal to it

12 rats tied together
Sep 7, 2006

dupersaurus posted:

It's this. UI works best when it's as stateless as possible, and any state it does have should only be internal to it

agreed. your left click should be sending "LeftClick" or some other abstract name for the type of event either to the UI directly or to an event stream that the UI subscribes to

it's the UI's responsibility to figure out what LeftClick means this time, inside handleLeftClick (or whatever)

I usually end up having some kind of intermediary uiContext that, for example, stops LeftClick from bubbling out into the simulation if there is a menu open. I would turn LeftClick into a "named thing" in here and publish it. Responsibility for consuming the Named Thing correctly is delegated elsewhere.

This example seems like you need sunglasses onEquip to publish a "dim screen" and an "adjust named thing animation" event. It's common IMO to be firing off a ton of events like this, and it's one of the challenges of the architecture, so you should consider whether or not it's better to just actually be tightly coupled. Depending on overall complexity, sometimes worse is better here IMO.
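The intermediary-context idea can be sketched like so (plain Python; `UiContext` and the semantic event names are invented for illustration): raw LeftClick goes in, and what comes out is a named thing whose meaning depends on UI state, so a click never reaches the simulation while a menu is open.

```python
class UiContext:
    """Sits between raw input and the simulation: decides what LeftClick
    means right now and republishes it under a semantic name."""

    def __init__(self):
        self.menu_open = False
        self.handlers = {}  # semantic name -> list of callbacks

    def subscribe(self, name, fn):
        self.handlers.setdefault(name, []).append(fn)

    def _emit(self, name):
        for fn in self.handlers.get(name, []):
            fn()

    def on_left_click(self):
        if self.menu_open:
            self._emit("MenuConfirm")   # never bubbles out into the simulation
        else:
            self._emit("PrimaryAction")
```

The simulation subscribes only to "PrimaryAction" and the menu code only to "MenuConfirm"; neither ever sees a raw click.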


Isometric Bacon
Jul 24, 2004

Let's get naked!
One really nice pattern to look into is an EventBus, of which several open-source implementations are available.

In a strongly event-based system, I found it got quite unwieldy with how much code had to directly reference other classes. This meant that if I wanted to grab code from one project and paste it into another, it would need a bunch of code as dependencies.

With an EventBus you can abstract events into enum types or strings.
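A minimal sketch of the enum-keyed bus idea (plain Python, not any particular open-source implementation; the `GameEvent` members and `EventBus` API are invented for illustration): publishers and subscribers share only the enum, never each other's classes, which is what makes the code portable between projects.

```python
from enum import Enum, auto

class GameEvent(Enum):
    SUNGLASSES_EQUIPPED = auto()
    SCREEN_DIMMED = auto()

class EventBus:
    """Publishers and subscribers only share the GameEvent enum,
    never each other's classes -- easy to lift into another project."""

    def __init__(self):
        self._subs = {}

    def subscribe(self, event, fn):
        self._subs.setdefault(event, []).append(fn)

    def publish(self, event, payload=None):
        # Events with no subscribers are silently dropped
        for fn in self._subs.get(event, []):
            fn(payload)
```

The trade-off versus direct C# events is losing compile-time checking of who handles what, which is part of why people sometimes prefer the tighter coupling.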
