SavageMessiah
Jan 28, 2009

Emotionally drained and spookified

Toilet Rascal
I started writing a simple entity system for use in a couple of games I've been thinking about. I'm writing it in C because that's something I want to get better at. I'm going for an Artemis-like system with no logic in the components and no direct connection between the entity (which is just a long) and the component data.

Here's what I was thinking; someone tell me if this sounds retarded or not. I'm not used to writing stuff where I actually need to think about memory layout and management (Java programmer, mainly):

The core would be a World structure that contains the next entity id, an array of component manager structs, an array of system structs, a pointer to a dynamically growing array of (ulong) bitfields indicating which components are set on an entity, and the root of an AVL tree mapping ids to indices into that array. I realize this will limit the number of component types to 64, but I don't think that will be a problem for anything I want to make. The component manager structs would have function pointers for getting, setting, etc. the component data for an entity based solely on the entity id. Each component manager would own the mapping from entity ids to the data itself, allowing each component to manage memory in the way most convenient for it. The system structs would hold the logic for the entities and would have a bitmask for the components that are interesting to them. Whenever an entity becomes interesting (or stops being interesting) due to components being added/removed, the add/delete functions for the system would be called so it can do whatever it needs to.
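Roughly what I'm picturing, as a totally untested sketch (all the names are made up, and the AVL tree bit is left out):
code:
#include <stdint.h>
#include <stddef.h>

typedef uint64_t entity_id;

/* One manager per component type; storage is whatever layout suits that component. */
typedef struct ComponentManager {
    void *storage;
    void *(*get)(struct ComponentManager *m, entity_id e);
    void  (*set)(struct ComponentManager *m, entity_id e, const void *data);
    void  (*remove)(struct ComponentManager *m, entity_id e);
} ComponentManager;

/* A system holds logic plus a mask of the component types it cares about. */
typedef struct System {
    uint64_t component_mask;                    /* bit i set => component type i required */
    void (*entity_added)(struct System *s, entity_id e);
    void (*entity_removed)(struct System *s, entity_id e);
    void (*update)(struct System *s, float dt);
} System;

typedef struct World {
    entity_id next_id;
    ComponentManager *managers;                 /* index = component bit position */
    int manager_count;
    System *systems;
    int system_count;
    uint64_t *entity_masks;                     /* growable: which components each entity has */
    size_t entity_capacity;
    /* AVL tree mapping entity_id -> index into entity_masks would go here */
} World;

/* Adding a component flips a bit and notifies any system that just became interested. */
static void world_add_component(World *w, entity_id e, size_t mask_index, int component_bit)
{
    uint64_t old_mask = w->entity_masks[mask_index];
    uint64_t new_mask = old_mask | (1ull << component_bit);
    w->entity_masks[mask_index] = new_mask;

    for (int i = 0; i < w->system_count; i++) {
        uint64_t want = w->systems[i].component_mask;
        if ((old_mask & want) != want && (new_mask & want) == want && w->systems[i].entity_added)
            w->systems[i].entity_added(&w->systems[i], e);
    }
}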

This seems like it would be pretty simple to write/use and provide more than enough performance and scalability for the projects I have in mind, mostly roguelikes and fairly simple 2d games.

tldr; goon desires feedback on ill-advised C E/C/S entity framework

OneEightHundred
Feb 28, 2008

Soon, we will be unstoppable!
On the news front, we're apparently very close to getting a Doom 3 source release:

John Carmack posted:

doom 3 source is packaged and tested, we are waiting on final lawyer clearance for release.

Probably not as impactful as it would have been before we had XNA, UDK, and Unity, but nothing beats stuff like this for loving with renderer internals, and it is a pretty feature-complete package in its own right. They even had basic vehicles working in a mini-mod in the SDK.

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
More component-based stuff: I'm trying to figure out how I want to handle entity death. At this point I have a Health component that emits a signal when health reaches the minimum. The only thing using that right now is the component for animation capability: it goes through the death animations and freezes on the dead-body animation. However, I haven't made anything else use it yet, notably the components that control the entity in some way, like AI or Homing. So with Homing I now have a body on the ground following the player around.

It was at that point I decided that maybe not having every single component listen for this signal was the wiser approach. The best solution I can think of is for something external to listen for the death signal and transform the entity into a dying entity, and then perhaps a dead entity. It would remove the components directly involved in making it do alive stuff, but keep everything else around, so if it gets revived it can get right back to what it was doing. That seems like the best way to handle it that I can think of, but am I missing the benefits of not transforming the entity? Or am I going to get into a mess changing the entities out?
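To make the "transform" idea concrete, something like this is what I mean, in made-up C with component bitmasks (not my actual code):
code:
#include <stdint.h>

/* Hypothetical component bits. */
#define COMP_AI      (1ull << 0)
#define COMP_HOMING  (1ull << 1)
#define COMP_HEALTH  (1ull << 2)
#define COMP_SPRITE  (1ull << 3)

/* Everything that makes the entity "do alive stuff". */
#define ALIVE_COMPONENTS (COMP_AI | COMP_HOMING)

typedef struct Entity {
    uint64_t components;    /* which components are currently attached */
    uint64_t revive_mask;   /* what was stripped at death, for revival */
} Entity;

/* Called by the external thing listening for the death signal. */
void transform_to_dead(Entity *e)
{
    e->revive_mask = e->components & ALIVE_COMPONENTS;
    e->components &= ~ALIVE_COMPONENTS;   /* corpse keeps sprite, position, inventory, ... */
}

void revive(Entity *e)
{
    e->components |= e->revive_mask;      /* pick up right where it left off */
    e->revive_mask = 0;
}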

Unormal
Nov 16, 2004

Mod sass? This evening?! But the cakes aren't ready! THE CAKES!
Fun Shoe

Rocko Bonaparte posted:

More component-based stuff: I'm trying to figure out how I want to handle entity death. At this point I have a Health component that emits a signal when health reaches the minimum. The only thing using that right now is the component for animation capability: it goes through the death animations and freezes on the dead-body animation. However, I haven't made anything else use it yet, notably the components that control the entity in some way, like AI or Homing. So with Homing I now have a body on the ground following the player around.

It was at that point I decided that maybe not having every single component listen for this signal was the wiser approach. The best solution I can think of is for something external to listen for the death signal and transform the entity into a dying entity, and then perhaps a dead entity. It would remove the components directly involved in making it do alive stuff, but keep everything else around, so if it gets revived it can get right back to what it was doing. That seems like the best way to handle it that I can think of, but am I missing the benefits of not transforming the entity? Or am I going to get into a mess changing the entities out?

For Caves of Qud, I have a special "graveyard" area where things go to 'die'. It's basically set up as a safe sandbox where entities can live until they can finally be destroyed (usually at the end of an overall 'game tick', when we're sure we're not in the middle of a big message chain that might revive entities from the dead). It basically just ignores any messages that entities inside it try to send to it or out of it.

The components themselves are responsible for managing the termination of the entity, though.
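In C-ish pseudocode the graveyard idea boils down to something like this (not the real Qud code, just the shape of it):
code:
#include <stddef.h>

#define MAX_DEAD 256

typedef struct Entity Entity;            /* whatever your entity type actually is */

static Entity *graveyard[MAX_DEAD];
static size_t  graveyard_count;

/* Park a "dead" entity; from here on its messages are simply ignored. */
void send_to_graveyard(Entity *e)
{
    if (graveyard_count < MAX_DEAD)
        graveyard[graveyard_count++] = e;
}

/* Only at the end of the overall game tick, when no message chain can
   still revive anything, do we actually destroy what's in there. */
void end_of_tick_cleanup(void (*destroy)(Entity *))
{
    for (size_t i = 0; i < graveyard_count; i++)
        destroy(graveyard[i]);
    graveyard_count = 0;
}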

SavageMessiah posted:

I realize this will limit the number of component types to 64, but I don't think that will be a problem for anything I want to make.

One of the really nice things about a component-based framework is the ability to add a tiny little component that specializes the behavior of an object, without constantly adding infinite specializing junk to your core components. Losing this aspect by having a very limited number of components would be pretty sad.

Unormal fucked around with this message at 19:48 on Nov 1, 2011

Canine Blues Arooo
Jan 7, 2008

when you think about it...i'm the first girl you ever spent the night with

Grimey Drawer
I'm interested in actually starting to put all my GREAT IDEAs on paper but I'm not sure where I should start. I would like to do something simple and 2D but a lot of the engines I'm seeing are aimed at 3D development. I would like to use Unity as I'm already familiar with C#, but again, I don't know if it's appropriate to try and develop a 2D project in said engine.

Are there engines specifically devoted to 2D development, or does one just make 2D work in the current 3D engines with magic?

RoboCicero
Oct 22, 2009

"I'm sick and tired of reading these posts!"
If you're a student you can always give XNA a shot -- it's free through the dreamspark program, I believe.

Orzo
Sep 3, 2004

IT! IT is confusing! Say your goddamn pronouns!
XNA is free anyway.

roomforthetuna
Mar 22, 2005

I don't need to know anything about virii! My CUSTOM PROGRAM keeps me protected! It's not like they'll try to come in through the Internet or something!

Canine Blues Arooo posted:

Are there engines specifically devoted to 2D development, or does one just make 2D work in the current 3D engines with magic?
I think we might need to know what you're expecting an engine to do, to answer your question. As those guys above suggest, XNA is pretty good in itself for 2D graphics, and Box2D has an XNA port, so that'll add physics for you if that's part of what you're expecting of an engine.

If you're expecting easy stuff for a GUI, you might be best just going with Flash, because there's a remarkable dearth of decent menus-and-poo poo libraries for games.

ambushsabre
Sep 1, 2009

It's...it's not shutting down!
Whenever someone asks a question like this I have to pimp flixel (2d really easy flash library) a little bit, because it helped me get into rapid game dev and also got me laid.

fakeedit: it might not have done one of those things.

Shalinor
Jun 10, 2002

Can I buy you a rootbeer?

ambushsabre posted:

Whenever someone asks a question like this I have to pimp flixel (2d really easy flash library) a little bit, because it helped me get into rapid game dev and also got me laid.

fakeedit: it might not have done one of those things.
Flixel will totally get any nerd laid. If it isn't working, they just need to work with Flixel longer.

I also highly recommend Flixel when you're just starting out. Stencyl + Flixel is a massively powerful combination, and is a great jumping-off point for learning game design or programming.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Sorry if this is a broad question, but I've been googling around and can't find anything to address it. In the context of developing a top-down 2D space shooter for Android, which would be the more efficient graphics approach: sprite-based (8-bit NES level of resolution) or vector-based (wireframe objects, like the game Asteroids)? I'd like to have 10-100 simple objects on the screen and still hit 30 frames per second on a midrange smartphone.

Pfhreak
Jan 30, 2004

Frog Blast The Vent Core!

Zero VGS posted:

Sorry if this is a broad question, but I've been googling around and can't find anything to address it. In the context of developing a top-down 2D space shooter for Android, which would be the more efficient graphics approach: sprite-based (8-bit NES level of resolution) or vector-based (wireframe objects, like the game Asteroids)? I'd like to have 10-100 simple objects on the screen and still hit 30 frames per second on a midrange smartphone.

Give them both a try and report back!

Internet Janitor
May 17, 2008

"That isn't the appropriate trash receptacle."

Made another game in Forth. It's like a cross between space invaders and those games where you blow up skyscrapers to land a plane. Here's the source!

edit: And to save anyone curious the hassle of figuring out how to compile it, here's an executable JAR of the game: http://www.mediafire.com/?zwkifkhb9pb7ufp

Internet Janitor fucked around with this message at 05:22 on Nov 2, 2011

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Canine Blues Arooo posted:

I would like to use Unity as I'm already familiar with C#, but again, I don't know if it's appropriate to try and develop a 2D project in said engine.

They at least have a 2D tutorial, so that's something. There also seem to be things like SpriteUI and Sprite Manager to help with such stuff.

I just got the Unity stuff set up here an hour ago, though, so don't take my word for anything!

PDP-1
Oct 12, 2004

It's a beautiful day in the neighborhood.
What do the cool kids use for game clocks on Windows systems these days? I've been looking into the TimerQueueTimer in kernel32.dll, and at least in initial tests it seems to work. Just curious if there's something better out there before I go hog wild with it.

The other option I've seen mentioned is the multimedia timer from winmm.dll, but MSDN says it's obsolete.

OneEightHundred
Feb 28, 2008

Soon, we will be unstoppable!
Microsoft recommends using QueryPerformanceCounter.

QueryPerformanceCounter does not behave well on some older hardware and generally needs Vista or later to be dependable.

timeGetTime is millisecond-precision and generally very reliable.

I think KeQueryInterruptTime is functionally equivalent to timeGetTime.
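For reference, the usual QueryPerformanceCounter pattern for per-frame deltas is something like this:
code:
#include <windows.h>
#include <stdio.h>

int main(void)
{
    LARGE_INTEGER freq, prev, now;
    QueryPerformanceFrequency(&freq);    /* counts per second, fixed at boot */
    QueryPerformanceCounter(&prev);

    for (int frame = 0; frame < 100; frame++) {
        Sleep(16);                       /* stand-in for a frame's worth of work */
        QueryPerformanceCounter(&now);
        double dt = (double)(now.QuadPart - prev.QuadPart) / (double)freq.QuadPart;
        prev = now;
        printf("frame %d: dt = %.4f s\n", frame, dt);
    }
    return 0;
}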

OneEightHundred fucked around with this message at 16:03 on Nov 8, 2011

Orzo
Sep 3, 2004

IT! IT is confusing! Say your goddamn pronouns!
Not sure if this belongs in the 3d graphics questions thread or here, but oh well...

Has anyone had problems with graphics 'stuttering'? I have a very simple test application (C#, SlimDX) where a textured quad (doesn't matter if I use the built-in Sprite class or manually fill my own vertex buffer) moves from left to right over and over, and about once per second there is a visible 'jump' of a few frames. I feel like I've tried everything and cannot figure out what it is: garbage collection every frame, vsync on and off, etc. Eventually I tried the SlimDX samples that come with the SDK and found that they had the same problem, so now I think it's something wrong with my computer. I tried it on my laptop and had a similar, but not quite as extreme, problem. It jumps a little bit there, just not as dramatically as on my main machine. I updated my video card (9800 GTX+) drivers to no avail and turned the debug runtime on and off in the DirectX control panel.

I'm totally at a loss! I know this isn't perfect information, I'm just hoping someone has seen these symptoms before and knows what it is.

Sagacity
May 2, 2003
Hopefully my epitaph will be funnier than my custom title.

Orzo posted:

Has anyone had problems with graphics 'stuttering?'
Is this windowed or fullscreen? Have you tried both?

roomforthetuna
Mar 22, 2005

I don't need to know anything about virii! My CUSTOM PROGRAM keeps me protected! It's not like they'll try to come in through the Internet or something!
I get that stuttering effect a few times shortly after a game starts up, but it doesn't keep going. When that happens I believe it's from some sort of libraries hooking in and/or detaching (based on the VC++ debug output window). With the ongoing stuttering, perhaps you have some service that's horribly misbehaving with its hooks, like a trackpad driver, printer driver, antivirus, or something? Even an antivirus that isn't misbehaving could do it if your game is writing debug info to a log file, maybe? Check whether your hard drive light is flashing in sync with the stutters?

PDP-1
Oct 12, 2004

It's a beautiful day in the neighborhood.

OneEightHundred posted:

Microsoft recommends using QueryPerformanceCounter.

QueryPerformanceCounter does not behave well on some older hardware and generally needs Vista or later to be dependable.

timeGetTime is millisecond-precision and generally very reliable.

I think KeQueryInterruptTime is functionally equivalent to timeGetTime

Thanks for the comments. I realized I omitted some important info: I'm looking for a timer that is event- or callback-based instead of polling-based like QueryPerformanceCounter.

The reason is that this game will be running a fairly heavy simulation that throws chunks of work into a thread pool, so I'd like the main game thread to consume as close to zero CPU time as possible between ticks so the thread pool can do more work. If the main thread is spinning in a tight loop around QueryPerformanceCounter checks it's burning CPU cycles that could be better used elsewhere. With an event or callback based timer I'm hoping I'll be able to just block the main game thread until the next timer tick happens and then signal cross-thread that it's OK to proceed. It's an unusual design to be sure, but faster simulation execution should make the game feel a lot more smooth for the player.

I did also look at timeGetTime and even coded up a quick demo, but it requires timeSetEvent to get going and that function is marked as obsolete on MSDN. KeQueryInterruptTime looks like a more modern version but also appears to be polling-based.

Orzo
Sep 3, 2004

IT! IT is confusing! Say your goddamn pronouns!

Sagacity posted:

Is this windowed or fullscreen? Have you tried both?
I've tried both and it happens in both.

Orzo
Sep 3, 2004

IT! IT is confusing! Say your goddamn pronouns!

roomforthetuna posted:

I get that stuttering effect a few times shortly after a game starts up, but it doesn't keep going. When that happens I believe it's from some sort of libraries hooking in and/or detaching (based on the VC++ debug output window). With the ongoing stuttering, perhaps you have some service that's horribly misbehaving with its hooks, like a trackpad driver, printer driver, antivirus or something? Antivirus even not misbehaving, if your game is writing debug info to a log file maybe? Check if your hard drive light is flashing in sync with the stutters?
Yeah, this one keeps going and it's at a fairly regular (once per second) frequency. I'll have to check about the hard drive light tonight when I get home, thanks.

OneEightHundred
Feb 28, 2008

Soon, we will be unstoppable!

PDP-1 posted:

I'm hoping I'll be able to just block the main game thread until the next timer tick happens and then signal cross-thread that it's OK to proceed.
Use Sleep if you really want to do something like that, but honestly, the difference between idling a CPU core waiting until it's OK for the game thread to wake up, and just spinning until the frame counter ticks over, is practically nothing.

Okay not exactly nothing, but you'd be measuring the difference in kilowatt-hours rather than seconds.

quote:

I did also look at timeGetTime and even coded up a quick demo, but it requires timeSetEvent to get going
The only thing you need before timeGetTime, if anything, is timeBeginPeriod to set the minimum resolution. I think the default minimum is 5ms, which is usually fine.
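Something along these lines, in other words (links against winmm.lib):
code:
#include <windows.h>
#include <stdio.h>
#pragma comment(lib, "winmm.lib")

int main(void)
{
    timeBeginPeriod(1);                  /* request 1 ms timer resolution */

    DWORD prev = timeGetTime();
    for (int frame = 0; frame < 100; frame++) {
        Sleep(15);                       /* hand the rest of the frame back to the OS */
        DWORD now = timeGetTime();
        printf("dt = %lu ms\n", (unsigned long)(now - prev));
        prev = now;
    }

    timeEndPeriod(1);                    /* always pair with timeBeginPeriod */
    return 0;
}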

The Glumslinger
Sep 24, 2008

Coach Nagy, you want me to throw to WHAT side of the field?


Hair Elf
So I've got a question. I'm working on a system to load/save input settings to a file as part of a huge team project. They want me to read the name of each control from the save file and then store them. The rub is that while they are saved as text, they want me to convert those strings to the OIS enums of the same name.

Working in C++, I can't think of a way to do this other than a giant hashmap of strings to enums. But I really don't want to have to manually write and enter 144 entries into the hashmap, because it's both tedious and I'm likely to accidentally introduce an error into the system that way.

Does anyone know of a clever way I can do this, or should I just start populating that hashmap?

DancingMachine
Aug 12, 2004

He's a dancing machine!

crazylakerfan posted:

So I've got a question. I'm working on a system to load/save input settings to a file as part of a huge team project. They want me to read the name of each control from the save file and then store them. The rub is that while they are saved as text, they want me to convert those strings to the OIS enums of the same name.

Working in C++, I can't think of a way to do this other than a giant hashmap of strings to enums. But I really don't want to have to manually write and enter 144 entries into the hashmap, because it's both tedious and I'm likely to accidentally introduce an error into the system that way.

Does anyone know of a clever way I can do this, or should I just start populating that hashmap?

Normally I would handle this with some generated code, i.e. a Perl script that generates a .h and .cpp file at build time.

AntiPseudonym
Apr 1, 2007
I EAT BABIES

:dukedog:

Orzo posted:

Not sure if this belongs in the 3d graphics questions thread or here, but oh well...

Has anyone had problems with graphics 'stuttering'? I have a very simple test application (C#, SlimDX) where a textured quad (doesn't matter if I use the built-in Sprite class or manually fill my own vertex buffer) moves from left to right over and over, and about once per second there is a visible 'jump' of a few frames. I feel like I've tried everything and cannot figure out what it is: garbage collection every frame, vsync on and off, etc. Eventually I tried the SlimDX samples that come with the SDK and found that they had the same problem, so now I think it's something wrong with my computer. I tried it on my laptop and had a similar, but not quite as extreme, problem. It jumps a little bit there, just not as dramatically as on my main machine. I updated my video card (9800 GTX+) drivers to no avail and turned the debug runtime on and off in the DirectX control panel.

I'm totally at a loss! I know this isn't perfect information, I'm just hoping someone has seen these symptoms before and knows what it is.

I had a problem like this that annoyed me for years, where I'd get a very visible framerate drop and stuttering whenever the VC++ debugger was attached. I eventually found a solution a couple of months ago: add _NO_DEBUG_HEAP=1 to Properties->Debugging->Environment and hopefully it'll go away.

Be warned, though, that you'll lose some of the (rather important) heap warnings when this is enabled, so I'd recommend adding it to a separate configuration.

Orzo
Sep 3, 2004

IT! IT is confusing! Say your goddamn pronouns!
Actually, although I haven't solved the problem, I did discover that it's local to my machine (actually, my desktop AND my laptop, which is what convinced me it was a problem with the code). It didn't skip for several other people that I had try it out.

AntiPseudonym
Apr 1, 2007
I EAT BABIES

:dukedog:

Orzo posted:

Actually, although I haven't solved the problem, I did discover that it's local to my machine (actually, my desktop AND my laptop, which is what convinced me it was a problem with the code). It didn't skip for several other people that I had try it out.

Were you running the debugger on your machine, though?

Orzo
Sep 3, 2004

IT! IT is confusing! Say your goddamn pronouns!
Nope, no debugger, tried several release configurations...no luck.

Visible Stink
Mar 31, 2010

Got a light, handsome?

Orzo posted:

Stuttering movement stuff.
This is a little story that happened to me.

I was messing around with SFML to learn C++ and tried moving a texture around with the arrow keys, and its movement had stuttering just as you described. I tried some of the built-in functions for framerates and enabled vertical sync. I tried multiplying by delta time and simply incrementing positions, but still the problem remained. So I switched over to XNA 4.0. Same scenario: moving a sprite with the arrow keys, and the same stuttery movement. So I tried XNA 3.1 and it still kept happening. I tried all the v-sync and delta time stuff as before, but nothing worked. Figuring it must be my machine, I took it in to work to try there, and STILL the stuttery movement persisted.

This drove me mad all week until the weekend, when I found the solution. I have f.lux (that program that adjusts your screen color based on time of day) installed at home and at work. Well, I was doing some image work in Photoshop, and for some reason, rather than choosing to temporarily disable it, I exited it completely. And lo and behold, when I ran my code again a little later it worked! It ran silky smooth with nary a jitter or jerk to be seen.

I guess the moral of the story is to make sure you haven't got something running in the background on both your laptop and desktop that other people would be unlikely to be running (like f.lux), since you mentioned it happened on both your machines but not other people's.

Hope this helps and good luck!

Shalinor
Jun 10, 2002

Can I buy you a rootbeer?
My laptop also stutters during development, and the mouse cursor actually jerks around at times. This only seems to happen when I have UDK open, but I'm pretty sure that's just because the Unreal Editor is a resource-hogging piece of crap.

The other gotcha I run into is how much memory Firefox will collect if you leave it running in the background for a while. It collects contiguous memory blocks like they were Pokemon.

Shalinor fucked around with this message at 16:46 on Nov 15, 2011

PalmTreeFun
Apr 25, 2010

*toot*

Mustard Snobbery posted:

This is a little story that happened to me.

I was messing around with SFML to learn C++ and tried moving a texture around with the arrow keys, and its movement had stuttering just as you described. I tried some of the built-in functions for framerates and enabled vertical sync. I tried multiplying by delta time and simply incrementing positions, but still the problem remained. So I switched over to XNA 4.0. Same scenario: moving a sprite with the arrow keys, and the same stuttery movement. So I tried XNA 3.1 and it still kept happening. I tried all the v-sync and delta time stuff as before, but nothing worked. Figuring it must be my machine, I took it in to work to try there, and STILL the stuttery movement persisted.

This drove me mad all week until the weekend, when I found the solution. I have f.lux (that program that adjusts your screen color based on time of day) installed at home and at work. Well, I was doing some image work in Photoshop, and for some reason, rather than choosing to temporarily disable it, I exited it completely. And lo and behold, when I ran my code again a little later it worked! It ran silky smooth with nary a jitter or jerk to be seen.

I guess the moral of the story is to make sure you haven't got something running in the background on both your laptop and desktop that other people would be unlikely to be running (like f.lux), since you mentioned it happened on both your machines but not other people's.

Hope this helps and good luck!

Oh yeah, I remember Super Meat Boy had this problem too. If you had f.lux on the game would jitter every couple seconds, loving up your jumps and sometimes causing you to phase through walls.

Orzo
Sep 3, 2004

IT! IT is confusing! Say your goddamn pronouns!
Mustard Snobbery:

You are my loving hero. Thank you so, so much.

Orzo
Sep 3, 2004

IT! IT is confusing! Say your goddamn pronouns!

PalmTreeFun posted:

Oh yeah, I remember Super Meat Boy had this problem too. If you had f.lux on the game would jitter every couple seconds, loving up your jumps and sometimes causing you to phase through walls.
Really? I mean I know Super Meat Boy isn't exactly AAA, but it's a well polished game and you'd think that the developers would decouple physics updates from framerates!

OneEightHundred
Feb 28, 2008

Soon, we will be unstoppable!

Orzo posted:

Really? I mean I know Super Meat Boy isn't exactly AAA, but it's a well polished game and you'd think that the developers would decouple physics updates from framerates!
Even if it didn't do that, it would gently caress up the input timing which would kill you just as quickly in that game.

Orzo
Sep 3, 2004

IT! IT is confusing! Say your goddamn pronouns!

OneEightHundred posted:

Even if it didn't do that, it would gently caress up the input timing which would kill you just as quickly in that game.
Yeah, but that's more acceptable than warping through walls.

Anyway, what's weird about the problem I was having that exiting f.lux fixed is that the XNA samples ran smooth. I wonder why.

PDP-1
Oct 12, 2004

It's a beautiful day in the neighborhood.

Orzo posted:

Anyway, what's weird about the problem I was having that exiting f.lux fixed is that the XNA samples ran smooth. I wonder why.

How are you handling loop timing in your non-XNA code? If you're spinning a tight while loop that checks the system clock to see when it's time to do the next Update/Draw cycle, your game thread will try to take 100% of the CPU. Other processes (like f.lux) won't get any time until the scheduler stops your thread to service them all at once, producing a noticeable hitch.

On the other hand, if your game does its Update/Draw and then yields back any excess time via Thread.Sleep(0) or whatever, the other processes can run between game frames and get their work done in a smoother, one-at-a-time way.

I don't know specifically how XNA handles its timing, but I've never noticed it running the CPU to 100% unless you turn off the fixed timestep option. That suggests that they're using the second, 'sleepy' timing model, which might explain why you didn't have the problem inside XNA.

e: I experimented with making my own 'sleepy' game loop a week or so ago using a TimerQueueTimer and it worked pretty well. I blocked the game thread between frames and signaled from the TimerQueueTimer callback routine when it was OK to run.
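Stripped down to the bare idea (this is a minimal sketch, not my actual loop), it looks roughly like this:
code:
#include <windows.h>
#include <stdio.h>

static HANDLE g_tick_event;

/* Runs on a thread-pool thread every period; just wakes the game thread. */
static VOID CALLBACK tick_callback(PVOID param, BOOLEAN timer_fired)
{
    (void)param; (void)timer_fired;
    SetEvent(g_tick_event);
}

int main(void)
{
    HANDLE timer = NULL;
    g_tick_event = CreateEvent(NULL, FALSE, FALSE, NULL);       /* auto-reset event */

    /* Fire immediately, then every 16 ms (~60 Hz). */
    CreateTimerQueueTimer(&timer, NULL, tick_callback, NULL, 0, 16, WT_EXECUTEDEFAULT);

    for (int frame = 0; frame < 100; frame++) {
        WaitForSingleObject(g_tick_event, INFINITE);   /* block until the next tick */
        printf("frame %d\n", frame);                   /* Update/Draw would go here */
    }

    DeleteTimerQueueTimer(NULL, timer, INVALID_HANDLE_VALUE);    /* wait for callbacks to drain */
    CloseHandle(g_tick_event);
    return 0;
}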

Orzo
Sep 3, 2004

IT! IT is confusing! Say your goddamn pronouns!
Maybe. Right now it's a tight while loop with Application.DoEvents to process input (I'm aware that there are better message processing techniques, but it's simple right now for prototyping). But you're right, it could definitely be that.

The1ManMoshPit
Apr 17, 2005

crazylakerfan posted:

So I've got a question. I'm working on a system to load/save input settings to a file as part of a huge team project. They want me to read the name of each control from the save file and then store them. The rub is that while they are saved as text, they want me to convert those strings to the OIS enums of the same name.

Working in C++, I can't think of a way to do this other than a giant hashmap of strings to enums. But I really don't want to have to manually write and enter 144 entries into the hashmap, because it's both tedious and I'm likely to accidentally introduce an error into the system that way.

Does anyone know of a clever way I can do this, or should I just start populating that hashmap?

Use macro expansion, i.e.:
code:
#define MY_LIST_OF_INPUTS \
    INPUT_EXPAND(JumpButton) \
    INPUT_EXPAND(MoveOrWhatever) \
    ...

enum InputEnum
{
#define INPUT_EXPAND(input) input,
    MY_LIST_OF_INPUTS
#undef INPUT_EXPAND
    InputEnumMax
};

static const char* gInputNames[InputEnumMax] =
{
#define INPUT_EXPAND(input) #input,
    MY_LIST_OF_INPUTS
#undef INPUT_EXPAND
};
Now the string for an enum is gInputNames[enumValue]. You could write a different expansion to populate a std::map or whatever to go from string to enum as well.
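For 144 entries a linear scan over the same table is probably fine too, so the reverse lookup can be as dumb as this (using the enum and gInputNames from above):
code:
#include <string.h>

InputEnum input_from_name(const char *name)
{
    for (int i = 0; i < InputEnumMax; i++)
        if (strcmp(gInputNames[i], name) == 0)
            return (InputEnum)i;
    return InputEnumMax;   /* not found */
}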

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
I was wondering how common it is these days for games written in static languages on Windows to actually target 64-bit. I'm asking after wrestling with a lot of middleware that was still 32-bit by default.
