|
"Entity Component System" is so named because it has things called Entities, things called Components, and things called Systems. For this reason, to avoid confusion, I suggest taking care not to refer to an ECS library, implementation, architecture, etc. as a "system," even if it is a system. Lots of things are systems.
|
# ? Oct 28, 2022 23:07 |
|
|
# ? May 25, 2024 03:05 |
|
Ranzear posted:
I wonder if this unique "local systems" ECS idea would appeal to you. It's a long video but he goes into his new ideas in the later sections. https://youtu.be/jjEsB611kxs The gist is that each entity updates its own set of systems instead of the conventional "global" ECS querying loops. Of course, global systems can still be there.
|
# ? Oct 28, 2022 23:08 |
|
And yeah, in the new UE5 ECS (Mass) they opted to call systems Processors and components Fragments to avoid confusion. It's fairly similar to the early days of DOTS but it's already been used in their Matrix demo. It is designed to live alongside actors rather than replacing them if that assuages fears of the DOTS debacle.
|
# ? Oct 28, 2022 23:13 |
|
Chainclaw posted:The problem with Entity Component Systems and entity systems with components is they share a lot of terminology but aren't really related.

Inasmuch as I've experienced and understood, Unity and Godot do a lot more of the latter than the former, and that context was rather important to what I wanted to convey. They don't really expose the 'system' side of things in a useful way, to many detriments. I'm currently writing the system I described in Rust and just used it as a pointer on how Unity mucks up temporality so much that every game made in it ends up with the forbidden behavior singleton eventually. But yes, 'let the behavior/system find the entities' does read like how an ECS should work, just like how my isolation of the simulation model (sending only display data for the client to view and receiving only remapped controller inputs back across the server/client gap) might sound familiar too. I'm not claiming to have invented anything new.

Ranzear fucked around with this message at 23:34 on Oct 28, 2022 |
# ? Oct 28, 2022 23:30 |
|
I didn't mean to accuse you of pretending to invent that pattern, I just thought you might enjoy the video.
|
# ? Oct 28, 2022 23:37 |
|
Ranzear posted:In as much as I've experienced and understood, Unity and Godot do a lot more of the latter than the former and that context was rather important to what I wanted to convey. They don't really expose the 'system' side of things in a useful way to many detriments.

Unity has an ECS that behaves nothing like you implied it behaved. And isn't central to the general operation of the engine in any way. You're basically just spewing FUD.
|
# ? Oct 28, 2022 23:37 |
|
Megafunk posted:I didn't mean to accuse you of pretending to invent that pattern, I just thought you might enjoy the video.

It's over two hours long, understandably I didn't get to it yet, lol.

leper khan posted:Unity has an ECS that behaves nothing like you implied it behaved. And isn't central to the general operation of the engine in any way. You're basically just spewing FUD

Maybe a little. I've only had exposure to mostly bad Unity development and pre-3.0 Godot.

Ranzear fucked around with this message at 23:44 on Oct 28, 2022 |
# ? Oct 28, 2022 23:38 |
|
Unity has a system where you use entities and components, and Unity also has an Entity Component System. They are different. Yes, the terminology is stupid.

Another fun overloaded term is "actors". A lot of game engines refer to different things as actors. In some, it's the same as a Unity entity; in others, it's a unique special thing meant for skinned meshes. These are all unrelated to the Actor Model used outside of games. https://en.wikipedia.org/wiki/Actor_model

My personal bugaboo the last few years has been people using AR for a bunch of different things, so I have no idea if they're talking about an automated review system, augmented reality, etc.

Chainclaw fucked around with this message at 23:54 on Oct 28, 2022 |
# ? Oct 28, 2022 23:49 |
|
Uh, was this particular ECS you're talking about added in 2020? Because like, 'they fixed it!' would be a nice response, even though I never really gave a timeframe for my experience. I also happen to know Rocko's project is at least older than that, but they may have kept up, if migrating it to DOTS was even possible. Hell, I had to twist arms to get at least an upgrade to an LTS version when that was a new concept.

The Actor Model is just Smalltalk-era OOP paradigms to me. The same class that bashed some Smalltalk into me also introduced me to Rust.

Ranzear fucked around with this message at 00:07 on Oct 29, 2022 |
# ? Oct 29, 2022 00:03 |
|
Megafunk posted:And yeah, in the new UE5 ECS (Mass) they opted to call systems Processors and components Fragments to avoid confusion. It's fairly similar to the early days of DOTS but it's already been used in their Matrix demo. It is designed to live alongside actors rather than replacing them if that assuages fears of the DOTS debacle.

Makes sense that they call them Processors and Fragments considering they already have long-standing things called Systems and Components.
|
# ? Oct 29, 2022 08:30 |
|
more falafel please posted:Makes sense that they call them Processors and Fragments considering they already have long-standing things called Systems and Components.
|
# ? Oct 29, 2022 12:44 |
|
quote:There are only two hard things in Computer Science: cache invalidation and naming things.

Today I learned the original quote never had "... and off-by-one errors."

Megafunk posted:I wonder if this unique "local systems" ECS idea would appeal to you. It's a long video but he goes into his new ideas in the later sections.

Good video. Putting the follow-up on next. Some of his tweaks are very close to mine. What he describes as proxies are the most critical purpose of my tags: pointers directly into entity data. But it seems his global systems are still run across all entities, whereas my tags will register only the relevant subset for a given behavior to iterate over; his proxies just give a contiguous view. I don't have the local systems at all, nor even per-component logic. My tagging system ideally(!!) turns every system into a local system.

A tag might have additional data but can be as simple as a boolean by merely existing, which I suppose is the canonical description of a component, but I more simply haven't decided yet whether I need to split them into two kinds: static data versus dynamic flags. Making them both lends itself well to the pointer-passing ease of it all and is what has made them more component-like.

My behaviors are supposed to be much more generic too, more akin to ye olde Starcraft map triggers than any sort of whole game system. Ordered instructions like "Add tag x, decrement tag y by delta if present, if y present and ≤ 0 remove tag y now and add w which will remove z later" should be able to handle a metric buttload of stuff without touching code at all. One more bit of logic over top of this is easy declarations that behavior k must finish before behavior j can start (and k can be just an empty behavior that requires l+m+n+o+p have finished, for easy composition). With this I plan to have tags and behaviors be much more granular, such as:

It might be clearer now how multiple tags might make a component and multiple behaviors might make a system, or maybe the whole thing is a 'more true' ECS model. I'm just doing it the way I think is sensible and useful and leads to predictable behavior and, bonus, will serialize into easily edited markup files.

Something I'm actually stuck on is whether tags should have some arithmetic behaviors. For instance, a generic incoming damage tag might get added by multiple sources and should maybe totalize automatically, or maybe I'll just find a way to allow multiple instances of the same tag with some hidden indexing, for stuff like poison_clear to only remove the one that invoked it. Rubber-ducking it here, I realize I probably need both; some systems may stack instances while others add to the value.

We have very different environments and endgame goals. He has hard-threaded everything and strict parallelism goes a long way, while he builds high-entity-count single-scene games and eschews networking entirely. I'm using async (tokio) and aiming for relatively few entities but several to many scenes in parallel in a strictly networked context with nothing to render locally, so I have a bit more wiggle room for mutexes as necessary, with lots of work stealing. I wouldn't be able to do such sparse and narrow behaviors on fixed threads and affinities.

I really like AI 'stims' as a term and I'll steal it from him when I write my vector junk. AI will run on this same framework but I can decouple it from the main simulation tick rate. AI will still see the fresh entity data, it'll just be running less often and allowed to be crunchier with its own entity pool.

I haven't dug into spatial hierarchy yet, but I think our approach is the same. Anybody who leads a spatial hierarchy definition with the tank example is cool in my book. When I need a skeletal tree with per-bone colliders and stuff, it'll all be on a single entity, but this side of things doesn't render, so that'd just be for locational damage in an FPS or fighting game (so it will still have to be aware of animation poses, for instance).

My secret sauce is that the client runs none of this. I'm shoving it raw data of entity positions and animation states and whatever, with looked-ahead deltas for interpolation between updates. The client can render however it likes, and additionally this gives me information control: they can't see or even guess at what the server doesn't tell them.

Ranzear fucked around with this message at 23:07 on Oct 29, 2022 |
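The ordered tag instructions described above ("decrement tag y by delta if present; if y ≤ 0, remove y and add w") can be sketched in a few lines. This is a hypothetical illustration, not Ranzear's actual Rust implementation; the tag names and the Entity shape are invented for the example.

```python
# Hypothetical sketch of the tag-decrement behavior described above.
# Tag names and the Entity structure are illustrative, not from any real engine.

class Entity:
    def __init__(self):
        self.tags = {}  # tag name -> value (a bare flag-style tag stores None)

    def add_tag(self, name, value=None):
        self.tags[name] = value

    def remove_tag(self, name):
        self.tags.pop(name, None)

def poison_behavior(entity, delta):
    """'Decrement tag y by delta if present; if y <= 0, remove y and add w.'"""
    if "poison" in entity.tags:
        entity.tags["poison"] -= delta
        if entity.tags["poison"] <= 0:
            entity.remove_tag("poison")
            entity.add_tag("poison_clear")  # a later behavior removes this

e = Entity()
e.add_tag("poison", 3.0)
poison_behavior(e, 1.5)   # poison drops to 1.5, tag stays
poison_behavior(e, 1.5)   # poison hits 0: removed, swapped for poison_clear
print(e.tags)             # {'poison_clear': None}
```

Whether "poison" here counts as a component or a tag is exactly the static-data-versus-dynamic-flag question from the post; the dict-of-values shape supports both.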
# ? Oct 29, 2022 22:48 |
|
Considering your obvious depth of knowledge here, you've likely already considered and discarded this, but I think you could do what you're proposing with a simple event bus. Just publish events (such as the phases in MTG in your example) and have subscribers react and do what they need to do at that point. "Tags" would essentially just be a curated list of subscribers. I'm sure there's a lot I haven't thought of here that makes this untenable.
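A minimal sketch of the event-bus idea, with hypothetical event names standing in for the MTG phases:

```python
# Minimal event-bus sketch of the suggestion above. The event name ("upkeep")
# and the handlers are hypothetical, purely for illustration.

from collections import defaultdict

class EventBus:
    def __init__(self):
        self.subscribers = defaultdict(list)  # event name -> list of handlers

    def subscribe(self, event, handler):
        self.subscribers[event].append(handler)

    def publish(self, event, payload=None):
        # Handlers run in subscription order, like a curated list of reactions.
        for handler in self.subscribers[event]:
            handler(payload)

bus = EventBus()
log = []
bus.subscribe("upkeep", lambda _: log.append("untap"))
bus.subscribe("upkeep", lambda _: log.append("triggered abilities"))
bus.publish("upkeep")
print(log)  # ['untap', 'triggered abilities']
```

The "tags as curated subscriber lists" reading would make each tag a subscription handle rather than a data field.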
|
# ? Oct 29, 2022 23:30 |
|
Tags can exist as just data without being registered to a behavior. Nothing yet prevents a behavior from interacting with other tags either. I haven't decided how it'll bail nicely if one isn't present, but I want to subvert having dependencies, so maybe a defaulting system. Suppose I want a deck-random system per player entity: it would just store RNG state in a tag, and behaviors would call for it on demand.

Ranzear fucked around with this message at 06:01 on Oct 30, 2022 |
# ? Oct 30, 2022 05:49 |
|
Wait, are you talking about Unity tags that are just strings? This is what I get for asking an architectural moon question and then getting a sinus infection.
|
# ? Oct 30, 2022 09:13 |
|
I'm talking about my own engine. Your question did prompt me to look twice at and fix some stuff though.
|
# ? Oct 30, 2022 09:25 |
|
Well I'm sure I'll have this sinus infection tomorrow so look forward to the bugs I'll find.
|
# ? Oct 30, 2022 09:51 |
|
Ranzear posted:Today I learned the original quote never had "... and off-by-one errors."

Tim Bray says that was added later by someone else. https://twitter.com/dkarlton/status/895871336941211648 https://twitter.com/timbray/status/506146595650699264

And this guy claims to be the source: https://twitter.com/secretGeek/status/552779013890904064
|
# ? Nov 1, 2022 05:19 |
|
Fuschia tude posted:Tim Bray says that was added later by someone else.

Lmao, the off-by-one joke has been around since at least the mid-2000s.
|
# ? Nov 1, 2022 05:55 |
|
Gotta add "date handling" to the joke I guess.
|
# ? Nov 1, 2022 06:05 |
|
Jabor posted:Gotta add "date handling" to the joke I guess.

* Excel has entered the chat *
|
# ? Nov 1, 2022 06:28 |
|
Date handling is a subset of naming things.
|
# ? Nov 1, 2022 07:23 |
|
leper khan posted:Lmao the off by one joke has been around since at least mid 2000s

Yeah, I was gonna say, I heard the off-by-one part way before I heard the cache invalidation part. If anything, I always assumed that was the original joke.
|
# ? Nov 1, 2022 07:48 |
|
So I made a compute shader to try to draw a Delaunay triangulation to an output texture. In general my approach is: given a pixel in my 2D texture space, I want to know if it should be coloured, and it should be coloured if and only if it is on the line between two Delaunay points. So, given a list of edges, for every pixel I iterate through the list of edges and check if it's on the line.

After much trial and error and googling various formulas for this, I came upon an approach that checks if a given point is within a certain distance of the line. This works pretty well. All other approaches, like checking if distance(A,C) + distance(B,C) <= distance(A,B) + Epsilon, would miss something like 90% of all line pixels. My assumption is that since the pixel coordinates are integers, the results aren't mapping correctly to the 2D grid.

The main flaw in the below approach is there's some honestly tolerable inaccuracy, in that the lines in places are too thick. I'd ideally like much more accurate lines, "supercover" lines in particular, but I'm not sure how to accomplish this in a compute shader; my current implementation below: code:
And I am aware I probably shouldn't be looping through 600+ edges to see if the current pixel is relevant to any of them; I should probably pre-compute that sort of spatial information and pass it in but I wanted to test things out first.
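For reference, the "within a certain distance of the line" test described above can be sketched on the CPU. This is Python rather than shader code, and the half-pixel threshold is an arbitrary illustrative choice, not the value from the post's implementation:

```python
# CPU-side sketch (not shader code) of the "pixel within some distance of the
# segment" test described above. The 0.5-pixel half-width is illustrative.

import math

def dist_point_to_segment(px, py, ax, ay, bx, by):
    """Distance from point P to segment AB, clamping the projection to [0, 1]."""
    abx, aby = bx - ax, by - ay
    ab_len_sq = abx * abx + aby * aby
    if ab_len_sq == 0:
        return math.hypot(px - ax, py - ay)  # degenerate zero-length segment
    t = ((px - ax) * abx + (py - ay) * aby) / ab_len_sq
    t = max(0.0, min(1.0, t))                # clamp so we test the segment, not the infinite line
    cx, cy = ax + t * abx, ay + t * aby      # closest point on the segment
    return math.hypot(px - cx, py - cy)

def on_line(px, py, ax, ay, bx, by, half_width=0.5):
    return dist_point_to_segment(px, py, ax, ay, bx, by) <= half_width

print(on_line(5, 0, 0, 0, 10, 0))   # True: exactly on the segment
print(on_line(5, 2, 0, 0, 10, 0))   # False: 2 pixels away
```

The `half_width` parameter is what controls the too-thick-lines trade-off mentioned above: anything much above half a pixel will colour diagonal lines more than one pixel wide.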
|
# ? Nov 2, 2022 01:26 |
|
Have you considered antialiasing your lines? Instead of having a hard "close to a line -> set colour to black; not close enough -> do nothing", you'd set a shade of grey based on how close the pixel is to the line.
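A minimal sketch of that suggestion: map the pixel's distance to the line onto a coverage value instead of a hard cutoff. The half-width and falloff values here are arbitrary illustrative choices.

```python
# Sketch of the antialiasing suggestion above: map distance-to-line to a grey
# level instead of a hard in/out test. The 1-pixel falloff is an arbitrary choice.

def coverage(dist, half_width=0.5, falloff=1.0):
    """1.0 on the line, fading linearly to 0.0 over `falloff` pixels."""
    if dist <= half_width:
        return 1.0
    if dist >= half_width + falloff:
        return 0.0
    return 1.0 - (dist - half_width) / falloff

print(coverage(0.0))   # 1.0  (fully on the line)
print(coverage(1.0))   # 0.5  (halfway through the falloff)
print(coverage(2.0))   # 0.0  (too far away)
```

A smoothstep instead of the linear ramp gives softer edges, but the linear version is easier to eyeball when debugging.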
|
# ? Nov 2, 2022 02:50 |
|
Jabor posted:Have you considered antialiasing your lines? Instead of having a hard "close to a line -> set colour to black; not close enough -> do nothing", you'd set a shade of grey based on how close the pixel is to the line.

Hrm, well, my goal is to eventually be generating a polygon-based procedural world map (finally got back to this project since where I last left off in 2021!!!), so the goal is not necessarily nice lines in and of themselves; I just wanted to test displaying the results to a texture, where it'd be nice if the lines happened to be of a certain type. Last time I drew lines to a texture (using the CPU, which honestly is probably not that slow? Might be worth re-examining, since the number and length of lines might be reasonably finite relative to the area of a texture?), I found that "supercover" lines had the least imperfections for displaying Voronoi boundaries without "doubling over" certain boundaries, making them appear thicker than they should.

For testing purposes I think it's a little clearer if something's wrong when there's a clear delineation between pixels (which is convenient for taking the output texture into Paint.net and using the Wand tool!), and anti-aliasing them might make certain errors blend outside of my ability to spot-check things.

e: Although it occurs to me it might be easier to figure out a way of Blit'ing a wireframe mesh to a texture; my first attempt failed in that it blit the full rendered mesh (as a weirdly shaped concave-looking plane) to a texture when I want the wireframe. Maybe there's an easy way to do that, because then I can also wireframe the Voronoi mesh and just use the compute shader for flood filling.

Raenir Salazar fucked around with this message at 03:18 on Nov 2, 2022 |
# ? Nov 2, 2022 03:16 |
|
Raenir Salazar posted:Hrm, well, my goal is to eventually be generating a polygon based procedural world map (finally got back to this project since where I last left off in 2021!!!) so the goal is not necessarily nice lines in of themselves, I just wanted to test displaying the results to a texture where it'd be nice if the lines happened to be of a certain type. Last time I draw lines to a texture (using the CPU; which honestly is probably not that slow? Might be worth re-examining! Since the number of length of lines might be reasonably finite relative to the area of a texture?)

But what about drawing the lines on the GPU using line-drawing or triangle-pairs? It's super optimized for that poo poo.

Edit: like you said in your edit while I was saying this.
|
# ? Nov 2, 2022 13:23 |
|
roomforthetuna posted:Yeah, if it's a one-off generate-stuff operation, drawing 600 lines on CPU with a Bresenham algorithm is probably just as good as doing 600 operations per pixel in the whole rectangle on the GPU (the increased parallelism isn't really giving you much when you're using it to do hundreds of times more operations).

Yeah, this is just testing, while in my actual use case it will likely be over 10,000 lines. I'll revisit trying to do this by blit'ing a mesh to a RenderTexture; maybe there's a flag to blit the wireframe.
|
# ? Nov 2, 2022 16:21 |
|
Arcane question, I'm not sure where to ask. I'm reverse-engineering old computer games, primarily on x86 or possibly x64 architecture (DOS/Windows), but 65xx (Apple ][, Commodore 64) is also on the table. (How old: 10+ years bare minimum, mostly 25+, possibly 40+.) What I'm trying to crack is the random number generator, which is key to many older games. This long ago, it was common for programmers to roll their own RNG (believe it or not, this was actually a good idea in C before 1997), but far from universal.

I'm not sure where I can find previously-done work on this, however; this is difficult enough work that I'm trying to avoid reinventing the wheel. All resources I know of are either primarily hardware-focused (like vogons.org) or are concerned with adventure games, which mostly don't use RNG (like the ScummVM forums). Is there such a thing? I don't believe this exists for DOS/Windows (you can easily find it for specific console games, like the original Final Fantasy), because all sources I can find are not reliable enough for my purposes.

I'm currently looking at the Turbo Pascal library, which as far as I know is best described in The Random Number Generators of the Turbo Pascal Family (1991). That's reasonably accurate but still has some mistakes and omissions, and anything else I know of (Wikipedia, or any other academic paper I could find) is worse.

If that doesn't exist, I'm interested in curating a list of RNGs, except I'm not sure where to put my work, aside from "my own GitHub repository, or Web page". What forum might be most interested?
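For anyone following along, the generator usually attributed to Turbo Pascal 7 is a plain 32-bit linear congruential generator. The constants below (multiplier 0x08088405, increment 1, modulus 2^32) and the Random(n) scaling are the commonly cited description and should be treated as an assumption to verify against the 1991 paper, not a confirmed reference implementation:

```python
# Emulation sketch of the LCG commonly attributed to Turbo Pascal 7.
# Constants and the Random(n) scaling are assumptions to verify against
# "The Random Number Generators of the Turbo Pascal Family" (1991).

class TPRandom:
    def __init__(self, seed=0):
        self.seed = seed & 0xFFFFFFFF

    def next(self, n):
        """Roughly TP7's Random(n): advance the LCG, scale the state to [0, n)."""
        self.seed = (self.seed * 0x08088405 + 1) & 0xFFFFFFFF
        return (self.seed * n) >> 32   # high bits of the 64-bit product

rng = TPRandom(seed=0)
print([rng.next(100) for _ in range(4)])  # deterministic, reproducible sequence
```

The reason cracking these is feasible at all is exactly this determinism: given a handful of observed outputs and a candidate generator, you can brute-force the 32-bit seed space in seconds on modern hardware.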
|
# ? Nov 2, 2022 18:54 |
|
You might ask this question on the tasvideos.org forums. There's a lot of retrocomputing folks there, and while they're primarily interested in old consoles, that doesn't invalidate their knowledge base. And they may well have connections to other communities that know more.
|
# ? Nov 2, 2022 19:19 |
|
Boldor posted:Arcane question, I'm not sure where to ask.

In the old days on the Commodore 64 we would generate a random seed using the 6581 sound chip's white noise waveform, which is literally random numbers that you can hear. So if you can find a white noise generator you'll be able to spin up pseudo-random numbers pretty easily.
|
# ? Nov 2, 2022 19:28 |
|
roomforthetuna posted:Yeah, if it's a one-off generate-stuff operation, drawing 600 lines on CPU with a bresenham algorithm is probably just as good as doing 600 operations per pixel in the whole rectangle on the GPU (the increased parallelism isn't really giving you much when you're using it to do hundreds of times more operations).

Status Report: Success. All it needed was setting GL.wireframe = true. code:
e: although tbf some of the time in the compute shader function is setting up the edges to be passed in; I'd imagine most of that 140 number is the edge collection. Raenir Salazar fucked around with this message at 23:45 on Nov 2, 2022 |
# ? Nov 2, 2022 23:39 |
|
I have a CoolShots system in my game that takes automatic screenshots when interesting things happen, to show to the player later. This system needs to not draw the HUD, so I turn it off before its camera renders, and then back on afterwards. I've noticed though that sometimes the HUD visibly flickers, which implies that the main camera is also rendering when the CoolShots camera is rendering. Or something. The CoolShots render code is pretty simple: code:
|
# ? Nov 2, 2022 23:53 |
|
TooMuchAbstraction posted:I have a CoolShots system in my game that takes automatic screenshots when interesting things happen, to show to the player later. This system needs to not draw the HUD, so I turn it off before its camera renders, and then back on afterwards. I've noticed though that sometimes the HUD visibly flickers, which implies that the main camera is also rendering when the CoolShots camera is rendering. Or something.

Interestingly enough, I saw something similar to this in the comments for the library I'm using: code:
|
# ? Nov 3, 2022 00:09 |
|
I don't think so... that looks like it's doing some kind of turbo-manual rendering where the projection matrix is being set manually and then not cleared reliably. I'm doing straightforward renders; I just need certain parts of the game to be disabled when I do. (It sure would be nice if I could just put those parts on a layer that's invisible to the CoolShots camera, but Screen Space: Overlay canvases ignore that whole system.)
|
# ? Nov 3, 2022 00:27 |
|
So I got a Voronoi mesh now, but the results aren't ideal; in particular, the Voronoi mesh has a lot of outgoing edges along the outermost perimeter of the mesh, so there's parts where there's like several tiny adjacent Voronoi cells. While I am aiming for something more like this. It isn't exactly clear to me what settings or parameters I'm getting wrong.

TooMuchAbstraction posted:I don't think so...that looks like it's doing some kind of turbo-manual rendering where the projection matrix is being set manually, and then not cleared reliably. I'm doing straightforward renders, I just need certain parts of the game to be disabled when I do.

Maybe, but here's the full script for context in case it does turn out to be useful!
|
# ? Nov 3, 2022 01:06 |
|
Raenir Salazar posted:So I got a voronoi mesh now, but the results aren't ideal; in particular the voronoi mesh has a lot of outgoing edges along the outermost perimeter of the mesh; so there's parts where there's like several tiny adjacent voronoi cells.

Probably easiest to push the edge nodes to the edges.
|
# ? Nov 3, 2022 01:53 |
|
leper khan posted:Probably easiest to push the edge nodes to the edges

No no, that isn't the problem; they just happen to not extend all the way because I was playing with padding and Boundary settings. The problem is how many outgoing edges are close to each other; going all the way to the edge wouldn't change that those regions are very narrow (or very tiny if I clipped them off properly)! Whereas I want every cell to basically look like any other cell.
|
# ? Nov 3, 2022 02:02 |
|
OK, this is a weird issue: I have to restart Unity to get it to actually pick up changes to my preprocessor. I have a BETA preprocessor directive, which is used to control access to certain prerelease game content. Attempting to add/remove that directive via Project Settings > Player > Scripting Define Symbols just straight up does not work unless I restart Unity. I know this because I have a build preprocessor script, which uses the directive to move some files around, and it logs what it's doing. And while the Project Settings dialog may claim that the symbol is/is not set, the logs will say otherwise.

I feel like this has to be some kind of disconnect between Unity and the compiler? I'm using Visual Studio 2019 (version 16.11.16) for this. Any suggestions for steps I might take to fix the issue? It's mildly annoying in that it adds extra steps to the build process. Mostly I'm worried that I'll accidentally send prerelease content into the real world ahead of schedule.
|
# ? Nov 4, 2022 20:35 |
|
|
# ? May 25, 2024 03:05 |
|
TooMuchAbstraction posted:OK, this is a weird issue: I have to restart Unity to get it to actually pick up changes to my preprocessor. I have a BETA preprocessor directive, which is used to control access to certain prerelease game content. Attempting to add/remove that directive via Project Settings > Player > Scripting Define Symbols just straight up does not work unless I restart Unity. I know this because I have a build preprocessor script, which uses the directive to move some files around, and it logs what it's doing. And while the Project Settings dialog may claim that the symbol is/is not set, the logs will say otherwise.

Can you get away with reloading assets instead of restarting? Otherwise, I generally recommend setting up a build pipeline that doesn't include you going through the build menu in Unity on your dev machine.
|
# ? Nov 4, 2022 20:52 |