xgalaxy
Jan 27, 2004
i write code
When debugging UTF-8 (which is multi-byte) in Visual Studio, you can append ",s8" to a watch expression and it will display correctly.


ZombieApostate
Mar 13, 2011
Sorry, I didn't read your post.

I'm too busy replying to what I wish you said

:allears:
That is certainly good to know. Hopefully I'll remember that it exists, at least, when I get to that point. I also (re?)discovered that you can use hardcoded strings with the annoying Unicode versions of Windows functions that take strings if you surround them with _T() or use quotes with an L in front, like so: L"". Now I think I'll go cry myself to sleep. I think the link I edited in gives me a good bit to work with at least.

Relaxodon
Oct 2, 2010
Has anyone dabbled in Go game development? The language, not the board game. I am interested in getting into it and would like to combine that with some graphics programming experiments I have in mind (involving tessellation shaders).

There seem to be some basic OpenGL bindings, but it is hard to judge if they are any good without jumping through a lot of hoops first.

OneEightHundred
Feb 28, 2008

Soon, we will be unstoppable!

ZombieApostate posted:

I think I'm done playing with premake and tinkering with extension wranglers for now. I want to ask about Unicode.
Unicode is a pain to deal with unless you deal with it up front. UTF-8 can solve a lot of compatibility issues, but it can make character counting and offset operations break, i.e. returning bad character counts or causing substring operations to break the string in half. You also have to be careful of scenarios where a string ends in the middle of a multi-byte character.

It also depends a lot on your target platform. Linux uses UTF-8 for its Unicode support, Windows uses UTF-16 (wchar_t) almost exclusively.
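Those failure modes are easy to demonstrate concretely. A minimal C++ sketch (the helper name is made up; it counts code points, not user-visible characters):

```cpp
#include <cassert>
#include <cstddef>
#include <string>

// Count Unicode code points in a UTF-8 string by counting every byte
// that is NOT a continuation byte (continuation bytes look like 0b10xxxxxx).
std::size_t utf8CodePoints(const std::string& s) {
    std::size_t count = 0;
    for (unsigned char c : s)
        if ((c & 0xC0) != 0x80)
            ++count;
    return count;
}
```

For example, "café" stored as UTF-8 is 5 bytes ("é" is 0xC3 0xA9), so .size() over-counts by one, and substr(0, 4) slices the "é" in half, leaving a dangling lead byte.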

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe

ZombieApostate posted:

If I understand it correctly, Unicode is always 2 bytes per character and Multi-Byte could be 1 or 2 bytes depending on the character.

Windows (and Microsoft) tend to call UCS-2 "Unicode". That is not the case: the real Unicode is not a byte encoding, it's a standard that specifies everything but the encoding. UTF-8 is really what you should be planning to use, everywhere.

I do not know what the "Unicode" and "Multi-byte" options are in Visual Studio.


OneEightHundred posted:

Unicode is a pain to deal with unless you deal with it up front. UTF-8 can solve a lot of compatibility issues, but it can make character counting and offset operations break, i.e. returning bad character counts or causing substring operations to break the string in half.

UTF-8 can deal with invalid data quite well, as bytes that start characters and continuation bytes are distinct and recognizable, even without existing state.

I tend to use the UTF-8 macros in glib to deal with the situation.

ZombieApostate
Mar 13, 2011
Sorry, I didn't read your post.

I'm too busy replying to what I wish you said

:allears:

Suspicious Dish posted:

Windows (and Microsoft) tend to call UCS-2 "Unicode". That is not the case. The real Unicode is not a byte encoding, it's a standard that specifies everything but. UTF-8 is really what you should be planning to use, everywhere.

I do not know what the "Unicode" and "Multi-byte" options are in Visual Studio.


UTF-8 can deal with invalid data quite well, as bytes that start characters and continuation bytes are distinct and recognizable, even without existing state.

I tend to use the UTF-8 macros in glib to deal with the situation.
I think a lot of the confusion I'm having with Unicode comes from Microsoft being idiots about nomenclature. They seem intent on using unique terms no one else uses for things that already have names and, again, like you say, use "Unicode" as a specific type, rather than a term that encompasses the whole topic like every other sane person on the planet. I'm still not sure if Visual Studio's Unicode setting is UTF-16 or UCS-2, but I'm pretty sure MSDN says it can be 16/32 (edit: I think this is right now) bits per code page point and Multi-Byte can be 8/16/24/32 (edit: think this is also correct now) bits per code page point (if I'm using the right terms).

This seems like pretty solid advice, not having any experience dealing with Unicode before now, and it seems to match up with what you're saying. So unless somebody points out a huge hole in their theory, I think I'll just try to follow that. The whole page should really be required reading for anybody who wants to deal with Unicode. It's really helped me understand what the hell is going on a lot better.

OneEightHundred posted:

Unicode is a pain to deal with unless you deal with it up front. UTF-8 can solve a lot of compatibility issues, but it can make character counting and offset operations break, i.e. returning bad character counts or causing substring operations to break the string in half. You also have to be careful of scenarios where a string ends in the middle of a multi-byte character.

It also depends a lot on your target platform. Linux uses UTF-8 for its Unicode support, Windows uses UTF-16 (wchar_t) almost exclusively.

From what I've been reading, UTF-16 will still have the same problems you mention UTF-8 having, just with fewer languages. If I understand it correctly, most western languages won't have those problems with UTF-16 because they almost always fit in the first 16 bits (2 chars or 1 wchar_t), but Asian languages, in particular, often use 32 bits per character (or 2 wchar_t's). That's not even going into the fact that apparently you can represent some characters, like ones with accents, as two separate code points overlapped (so è could be 1 character, or a combination of a normal e and another accent character). That would even further gently caress up any attempt to count characters by counting bytes. Which means even with UTF-32, where everything is fixed at 4 bytes, you can't count bytes and be sure you're getting the right number of characters. It's crazy how much more complex this is than I thought it would be :gonk:
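The accent point is easy to check in raw bytes; a small sketch (escape values taken from the Unicode tables: U+00E8 for precomposed "è", U+0300 for the combining grave accent):

```cpp
#include <cassert>
#include <string>

// Two byte sequences that render as the same visible "è":
const std::string precomposed = "\xC3\xA8";  // U+00E8, 1 code point, 2 bytes
const std::string combining   = "e\xCC\x80"; // U+0065 + U+0300, 2 code points, 3 bytes
```

Byte-for-byte they are different strings, which is exactly why counting bytes (or even code points) doesn't give you a character count.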

Fortunately I haven't done much of anything with strings yet (just using them for filenames for a few things so far, I think), so if I deal with it now, it shouldn't be that bad.

ZombieApostate fucked around with this message at 19:04 on Jul 23, 2012

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe
Stop using the word "code page". It is irrelevant and incorrect when talking about Unicode, except as a history lesson. Show me the MSDN page that tells you that "Unicode" is "16/32 bits per code page", and I will report it to their documentation team.

UTF-16 is indeed "two bytes per code point", for code points U+0000 to U+FFFF, also known as the "Basic Multilingual Plane". For code points above U+FFFF, there is a mechanism called "surrogate pairs", which uses some special code points (U+D800-U+DFFF) that are marked as "reserved for UTF-16" and aren't mapped to any character.
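That arithmetic is mechanical enough to write down directly; a sketch (function names made up, constants straight from the UTF-16 scheme described above):

```cpp
#include <cassert>
#include <cstdint>
#include <utility>

// Encode a code point above U+FFFF as a UTF-16 surrogate pair: subtract
// 0x10000, then split the remaining 20 bits across the two reserved ranges.
std::pair<std::uint16_t, std::uint16_t> toSurrogatePair(std::uint32_t cp) {
    std::uint32_t v = cp - 0x10000;
    std::uint16_t high = static_cast<std::uint16_t>(0xD800 + (v >> 10));
    std::uint16_t low  = static_cast<std::uint16_t>(0xDC00 + (v & 0x3FF));
    return {high, low};
}

std::uint32_t fromSurrogatePair(std::uint16_t high, std::uint16_t low) {
    return 0x10000 + ((static_cast<std::uint32_t>(high) - 0xD800) << 10)
                   + (low - 0xDC00);
}
```

For instance, U+10437 (outside the BMP) round-trips through the pair D801 DC37.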

I don't see what's so hard about counting characters in UTF-8 as compared to ASCII. ASCII requires a scan for a NUL byte, counting the bytes in between. UTF-8 requires a scan for any non-continuation byte, which is any byte that does not start with "0b10" ((byte & 0xC0) != 0x80).

You can also use the char length table like glib does to "skip to the next character".
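A sketch of that "skip to the next character" idea, along the lines of glib's g_utf8_next_char (illustrative code, not glib's actual implementation):

```cpp
#include <cassert>
#include <cstddef>

// Length in bytes of a UTF-8 sequence, determined from its first byte
// (the same idea as glib's per-byte skip table).
std::size_t utf8SeqLen(unsigned char first) {
    if (first < 0x80) return 1;           // 0xxxxxxx: ASCII
    if ((first & 0xE0) == 0xC0) return 2; // 110xxxxx
    if ((first & 0xF0) == 0xE0) return 3; // 1110xxxx
    if ((first & 0xF8) == 0xF0) return 4; // 11110xxx
    return 1; // continuation or invalid byte: resynchronize by one
}

const char* utf8Next(const char* p) {
    return p + utf8SeqLen(static_cast<unsigned char>(*p));
}
```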

ZombieApostate
Mar 13, 2011
Sorry, I didn't read your post.

I'm too busy replying to what I wish you said

:allears:
Yeah, you're right about the code page/point bit, I got them mixed up. This is the page I was looking at, but now that I read it again, it seems to be implying that the Unicode setting is UCS-2? I think? They don't mention UCS-2/UTF-8/UTF-16 anywhere in the description for it, so it's kind of ambiguous, but it sounds like UCS-2.

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe

ZombieApostate posted:

Yeah, you're right about the code page/point bit, I got them mixed up. This is the page I was looking at, but now that I read it again, it seems to be implying that the Unicode setting is UCS-2? I think? They don't mention UCS-2/UTF-8/UTF-16 anywhere in the description for it, so it's kind of ambiguous, but it sounds like UCS-2.

UCS-2 is simply "two bytes per character". UTF-16 is UCS-2 plus the surrogate pairs stuff. This means that UCS-2 cannot represent characters outside of the Basic Multilingual Plane. The "Unicode" that Microsoft talks about above indeed looks like UCS-2. Shame on you, Microsoft.

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe

OneEightHundred posted:

I've been doing some aggressive optimization. The main takeaways are that Mono's BitConverter implementation is awful (it doesn't save time over individual byte accesses like the MS version does), its store motion and aliasing optimizations are apparently really bad since a ton of time was saved by just unrolling some block operations, Unity's Color32 constructor is really slow, and having Editor Attach enabled is bad for perf.

I've got it down to 0.876 sec of CPU time on a 2.2GHz Phenom 9500 (with TLB patch disabled) to decode 20 seconds of 160x128 24fps video, which is a lot better, but still pretty bad since it means a 640x480 would probably max out the CPU core.

I'm going to try some threading optimizations. At the very least, it should be possible to do the colorspace decode while processing the next frame, but it might be possible to retool the decoder to do reconstructs in parallel.

Is this with or without Editor Attach still enabled? For BitConverter, try using DataConverter instead, and see if it provides any speedups. File a bug for the Unity guys for Color32 (there's an option in the Help menu for it), and if you can get reproducible testcases for store motion/aliasing optimizations, file bugs for Mono.

Nalin
Sep 29, 2007

Hair Elf

ZombieApostate posted:

std::string and std::wstring

Then apparently people generally use ICU or UTFCPP or something to do the actual UTF-8<->UTF-16<->UTF-32 conversion? Any suggestions?
std::string uses char and std::wstring uses wchar_t. That is pretty much it. Neither understands Unicode at all. Be very, very careful if you use any string manipulation functions on strings holding Unicode data.

You shouldn't bother with UTF-32. Since my project will use ICU eventually, I just use UTF-16 internally and convert to/from UTF-8 when writing/reading files.
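If pulling in ICU up front is too heavy, C++11 does ship a stopgap for the UTF-8<->UTF-16 conversion itself (std::wstring_convert, which was later deprecated in C++17, so treat this as a sketch rather than a long-term answer):

```cpp
#include <codecvt>
#include <locale>
#include <string>

// Convert between UTF-8 (std::string) and UTF-16 (std::u16string)
// using only the standard library. Deprecated in C++17, but handy
// until a real Unicode library like ICU is wired in.
std::u16string utf8ToUtf16(const std::string& s) {
    std::wstring_convert<std::codecvt_utf8_utf16<char16_t>, char16_t> conv;
    return conv.from_bytes(s);
}

std::string utf16ToUtf8(const std::u16string& s) {
    std::wstring_convert<std::codecvt_utf8_utf16<char16_t>, char16_t> conv;
    return conv.to_bytes(s);
}
```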

ZombieApostate posted:

I also (re?)discovered that you can use hardcoded strings with the annoying Unicode versions of Windows functions that take strings if you surround them with _T() or use quotes with an L in front like so L"".
Yeah. The L"" denotes the string as a wchar_t string. Just be careful if you try to port your code to something other than Windows. Linux uses UTF-8 encoded data in char strings, and it doesn't support a lot of the wchar_t functions that Windows does.
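A minimal illustration of the difference (kept portable, so no <tchar.h>; note that wchar_t is 16 bits on Windows but usually 32 bits on Linux, which is part of why wchar_t code ports badly):

```cpp
#include <cassert>
#include <cwchar>

// "hello" as a narrow (char) literal and a wide (wchar_t) literal.
// On Windows, _T("hello") from <tchar.h> expands to L"hello" only
// when UNICODE is defined; a plain L"" literal works everywhere.
const char*    narrow = "hello";
const wchar_t* wide   = L"hello";
```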

Suspicious Dish posted:

UCS-2 is simply "two bytes per character". UTF-16 is UCS-2 plus the surrogate pairs stuff. This means that UCS-2 cannot represent characters outside of the Basic Multilingual Plane. The "Unicode" that Microsoft talks about above indeed looks like UCS-2. Shame on you, Microsoft.
Windows has supported UTF-16 since (I believe) Windows XP. If you build with UNICODE defined, all Windows API functions will use the W versions (wchar_t) and support UTF-16 with surrogate pairs.

German Joey
Dec 18, 2004
Does anyone know where I can find a free isometric hex tile set? It doesn't really matter what's on the tiles, I just want to play around with them in the tile engine that I'm creating. (I currently have options for square, isometric-square, and hex tiles, so isometric-hex would complete the set.)

Deki
May 12, 2008

It's Hammer Time!
Man, I want to find the guy who designed Final Fantasy Tactics' AI and shake his hand. I'm working on a strategy RPG AI, and this poo poo is harder than anything I've had to do so far, and I'm not even working with a 3D game.

PDP-1
Oct 12, 2004

It's a beautiful day in the neighborhood.

Deki posted:

Man, I want to find the guy who designed Final fantasy Tactics' AI and shake his hand. I'm working on a strategy RPG AI, and this poo poo is harder than anything I've had to do so far, and I'm not even working with a 3D game.

Have you found any good resources describing how to make that kind of AI? I'm interested in making a societal-level AI like you might find in the Civilization games, but I've never gone further than doing basic individual wander/flee/chase type stuff.

OneEightHundred
Feb 28, 2008

Soon, we will be unstoppable!
I've heard FFT's AI is actually really advanced and the game is only easy because of the terrible loadouts the enemies get; ROM hacks really bring out how much of a dick it is.

I suspect from observation that it evaluates orders per-unit, but it does a lot of introspection on the pending event queue. It'll only consider units that can't move out of the way when evaluating ground-targeted effects, for instance; enemies targeted by your AOEs will often move next to your units to hit them with it and avoid hitting their allies; and they heavily prioritize actions that will result in a guaranteed kill.

Suspicious Dish posted:

Is this with or without Editor Attach still enabled? For BitConverter, try using DataConverter instead, and see if it provides any speedups. File a bug for the Unity guys for Color32 (there's an option in the Help menu for it), and if you can get reproducible testcases for store motion/aliasing optimizations, file bugs for Mono.
The Color32 issue doesn't seem like a bug as much as the JIT handling it as a function call.

Anyway, I redid a ton of the IO, threaded all reconstruction operations, and threaded the colorspace conversion, which got it down to about 0.5 sec of CPU time. I'm about to say "gently caress it" because I'm not sure how much better this can get.

I'm considering porting (or making) a VQ codec for low CPU usage scenarios (i.e. phone) anyway. I've almost got Cinepak ported. I was considering porting Indeo 3 after finding out that the patent on it got canned when they failed to pay maintenance fees, but apparently that's a YUV 4:1:0 codec and gently caress that. Cinepak blows rear end but it's in that class of codecs where you can solve most quality problems by just cranking the bitrate up so whatever.

OneEightHundred fucked around with this message at 23:50 on Jul 24, 2012

General_Failure
Apr 17, 2005
Thanks for all the versioning/multiple location advice earlier. I haven't had much of a chance to look at it yet. I've been a bit busy with other things, but as soon as I get enough sleep to be functional I'm going to chase it up.

Mustach
Mar 2, 2003

In this long line, there's been some real strange genes. You've got 'em all, with some extras thrown in.

Relaxodon posted:

Has anyone dabbled in Go game development? The language not the board game. I am interested in getting into it and would like to combine that with some graphics programming experiments I have in mind. (Involving tessellation shaders)

There seem to be some basic openGL bindings but it is hard to judge if they are any good without jumping through a lot of hoops first.
Things are still fragmented. There are lots of cgo wrappers for SDL on GitHub, but last I checked (which was a while ago, so this may not be a problem anymore) none of them took care of the SDLmain nonsense, so they wouldn't work on OS X or Windows, just Linux. But if you're just doing experiments on Linux, you could probably pick any of them, aside from ones like banthar/Go-SDL, which was first past the post but hasn't been updated for Go 1.

sturgeon general
Jun 27, 2005

Smells like sushi.

OneEightHundred posted:

I've heard FFT's AI is actually really advanced and is only easy because of the terrible loadouts the enemies get, and ROM hacks really bring out how much of a dick it is.

I'm suspecting from observation that it evaluates orders per-unit, but it does a lot of introspection regarding the pending event queue. It'll only consider units that can't move out of the way when considering ground-targeted effects for instance, enemies targeted by your AOEs will often move next to your units to hit them with it and avoid hitting their allies, and they heavily prioritize actions that will result in a guaranteed kill.

I'm sure it's some variant of minimax/alpha-beta pruning like with chess games, where the AI generates a tree of future moves with score values, and the max depth of the tree correlates to difficulty. Units with low HP and defense have a higher value to attack/heal, and of course it factors in future turns. In the later battles, the AI units seem to take longer turns since they have more abilities to factor in.
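The scheme described above (a scored move tree whose search depth maps to difficulty) is textbook minimax with alpha-beta pruning; a toy sketch over an abstract game tree (illustrative only, not FFT's actual AI):

```cpp
#include <algorithm>
#include <cassert>
#include <limits>
#include <vector>

// Toy game-tree node: leaves carry a score for the maximizing side.
struct Node {
    int score = 0;              // used only at leaves
    std::vector<Node> children; // empty => leaf
};

// Minimax with alpha-beta pruning: the depth limit is the "difficulty" knob.
int alphaBeta(const Node& n, int depth, int alpha, int beta, bool maximizing) {
    if (depth == 0 || n.children.empty())
        return n.score;
    if (maximizing) {
        int best = std::numeric_limits<int>::min();
        for (const Node& c : n.children) {
            best = std::max(best, alphaBeta(c, depth - 1, alpha, beta, false));
            alpha = std::max(alpha, best);
            if (beta <= alpha) break; // prune: opponent won't allow this line
        }
        return best;
    } else {
        int best = std::numeric_limits<int>::max();
        for (const Node& c : n.children) {
            best = std::min(best, alphaBeta(c, depth - 1, alpha, beta, true));
            beta = std::min(beta, best);
            if (beta <= alpha) break;
        }
        return best;
    }
}
```

With the pruning, a min-node's remaining children are skipped as soon as its value can no longer beat the best line already found.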

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

Hubis posted:

http://msdn.microsoft.com/en-us/library/windows/desktop/ff476876(v=vs.85).aspx

DirectX 11 introduces back-compatibility feature levels down to DX9.1, so there's no reason to use DX9 over DX11 unless you're concerned about supporting people still running XP (in which case OpenGL is probably a better option).

Coming back to this, the June 2012 Steam Hardware Survey is out.

Almost 80% of systems are DirectX 10 capable (Vista+ and DX10+ hardware) and about half of those support DirectX 11 feature levels as well.

Toper Hollyphant
Jul 31, 2010

Relaxodon posted:

Has anyone dabbled in Go game development? The language not the board game. I am interested in getting into it and would like to combine that with some graphics programming experiments I have in mind. (Involving tessellation shaders)

There seem to be some basic openGL bindings but it is hard to judge if they are any good without jumping through a lot of hoops first.

A bit, but I'm less interested in 3D stuff and more in AI and a bunch of other things. Because of that I'm more of a roguelike guy, so I can't really help with the OpenGL bindings.

There are quite a few of those, though, and several have been updated since the Go 1 release. I'd be surprised if you couldn't find a working library in that selection.

The biggest problem for whatever you're planning will probably be avoiding GC pauses. Not impossible, but non-GC languages make it much easier. With Go you can still avoid allocating memory and thus prevent the GC from doing a sweep.

roomforthetuna
Mar 22, 2005

I don't need to know anything about virii! My CUSTOM PROGRAM keeps me protected! It's not like they'll try to come in through the Internet or something!

Hubis posted:

Coming back to this, the June 2012 Steam Hardware Survey is out.

Almost 80% of systems are DirectX 10 capable (Vista+ and DX10+ hardware) and about half of those support DirectX 11 feature levels as well.
Though that's 80% of systems that use Steam, which is a population preselected toward people who are more gaming-oriented. So it's a perfectly fine metric if you're planning to release a game on Steam, but not a good one if you're targeting casual users or something.

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

roomforthetuna posted:

Though that's 80% of systems that use Steam, which is going to be preselected to people who are more gaming-oriented. So it's a perfectly fine metric if you're planning to release a game on Steam, but not a good metric if you're going to target casual users or something.

I think it would be a pretty hard argument that the audience for any personal or indie game project (that's not something like a Facebook app) is substantially outside the one covered by people who use Steam, but that's a different argument.

ZombieApostate
Mar 13, 2011
Sorry, I didn't read your post.

I'm too busy replying to what I wish you said

:allears:
Anybody actually used MyGUI outside of Ogre? I'm trying to integrate it into my OpenGL engine, but I'm not seeing any actual output. I see the textures loaded in gDEBugger, the vertex buffers are there and don't look weird or anything. There's nothing in the log that looks out of place. When I stepped through the code, it looked like it was trying to draw, and at least it doesn't crap out anywhere that I could see. I left a shader bound by accident at first and it made MyGUI textured triangles sprout all over the last model I drew, but once I unbound it I got nothing. It looks like they disable depth testing before they draw, so that shouldn't be the problem. I tried turning off culling in case they wound their triangles opposite from mine or something, but that didn't help either.

I posted on their forums, but they look pretty dead and it's been a couple days without a response. I'm pretty much out of ideas at this point. I couldn't find any docs on how to set it up with the OpenGL platform (I've just been adapting the Ogre directions), so I could have easily missed a step. Or it could be one of those dumb things you can't see because you've been staring at it too long.

edit: :negative: Derp, I'm an idiot. I unbound the shader but forgot to unbind the vertex array object, which proceeded to gently caress with MyGUI's vertex buffer, since they're bums and aren't using a VAO.

ZombieApostate fucked around with this message at 06:05 on Jul 26, 2012

OneEightHundred
Feb 28, 2008

Soon, we will be unstoppable!

Hubis posted:

I think it would be a pretty hard argument that the audience for any personal or indie game project (that's not something like a Facebook app) is substantially outside the one covered by people who use Steam, but that's a different argument.
The problem with DX10 and DX11 is that they aren't a whole lot better than DX9 capability-wise unless you're dealing with some seriously bleeding-edge poo poo, and usability-wise they're probably worse with the constant buffer and IA management idiocy. They need to offer a better reason to migrate if it's going to cost you 20% of your userbase.

Null Pointer
May 20, 2004

Oh no!

erotic dad posted:

I'm sure it's some variant of minimax/alpha-beta pruning like with chess games, where the AI generates a tree of future moves with score values
I'm sure it's just some vicious and creatively-designed expert system. FFT has a massive fan-out, so I think you'd have a hard time getting better results with negamax, given that FFT has no obvious abstractions and it has to run on a PS1.

They didn't use it in FFT, but I think MCTS would be a better option for a new game. MCTS is definitely faster for games that have a high fan-out, and, unlike chess, single moves are unlikely to have devastating consequences (in either direction). I've been talking about writing an FFT bot with MCTS for a couple of years, but I've never found the time.
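For reference, the selection rule at the heart of most MCTS implementations is UCT; a small sketch of the score function (the names and the exploration constant are mine):

```cpp
#include <cmath>
#include <limits>

// UCT score for picking which child to descend into during MCTS selection:
// mean reward (exploitation) plus a bonus that shrinks as the child is
// visited more often (exploration). c ~ sqrt(2) is a common default.
double uct(double totalReward, int visits, int parentVisits, double c) {
    if (visits == 0)
        return std::numeric_limits<double>::infinity(); // try unvisited children first
    return totalReward / visits
         + c * std::sqrt(std::log(static_cast<double>(parentVisits)) / visits);
}
```

This is what makes MCTS behave well under high fan-out: unpromising children stop being visited without ever being fully expanded.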

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

OneEightHundred posted:

The problem with DX10 and DX11 is that they aren't a whole lot better than DX9 capability-wise unless you're dealing with some seriously bleeding-edge poo poo, and usability-wise they're probably worse with the constant buffer and IA management idiocy. They need a better reason to migrate to them if it's going to cost 20% of your userbase.

The driver model allows drivers to substantially reduce resource management overhead (and thus run-time hitching from streaming content), and the API overhead per draw call is much better, letting you draw more stuff per frame without having to worry about performance. SM 4.0+ introduces a lot of convenient stuff that's hardly "bleeding edge", like first-class support for depth comparison samplers, as well as general improvements to the programming model that make it a lot easier to work with. There are definitely performance reasons worth switching for, but performance would have to be an important concern for you.

It's really a matter of personal opinion, but I don't find them less usable at all. A bad interface you know well is always going to feel more "usable" than a slightly less bad API that you don't, and a lot of it comes down to taste.

Personally, I just wouldn't lose sleep over that 20% of the audience, because odds are they're not going to be playing something I throw up on GitHub anyway. You're cutting out a lot more actual potential players by using DirectX rather than OpenGL to begin with. I'm not trying to start a fight about DirectX or anything; I just think that "a ton of people can't run DX10+, so just use DX9" is conventional wisdom that should be challenged. The only reason most game developers are still using DX9 is that they have existing engines they don't feel like porting, and/or they're targeting the Xbox 360 primarily and their PC engine is just a port of that codebase. I expect DirectX 9 use to vanish within the next 1-3 years as people prep for the next generation of consoles.

Hubis fucked around with this message at 14:36 on Jul 26, 2012

OneEightHundred
Feb 28, 2008

Soon, we will be unstoppable!

Hubis posted:

I'm not trying to make a fight about DirectX or anything, I just think that "A ton of people can't run DX10+ so just use DX9" is conventional wisdom that should be challenged.
Well, I think it's conventional wisdom because DX10 doesn't offer enough. DX9 had some very clear improvements over DX8 capability-wise and usability-wise (i.e. VFDs); it's much harder to find the same class of improvements in DX10. First-class depth compare is nice, but float textures can do roughly the same thing.

DX9 will fade out over the next few years now that there's a more compelling upgrade from XP, and DX10 isn't the userbase toxin it was when it came out, but right now, I feel like you're still better off with DX9 unless you know there's something you absolutely need DX10 to do.

OpenGL should be the solution for the time being, but is kind of a step back just because of a certain manufacturer's consistently terrible driver support for it.

OneEightHundred fucked around with this message at 15:42 on Jul 26, 2012

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

OneEightHundred posted:

Well, I think it's conventional wisdom because DX10 doesn't offer enough. DX9 had some very clear improvements over DX8 capability-wise and usability-wise (i.e. VFDs), it's much harder to find the same class of improvements in DX10. First-class depth compare is nice, but float textures can do roughly the same thing.

DX9 will fade out over the next few years now that there's a more compelling upgrade from XP, and DX10 isn't the userbase toxin it was when it came out, but right now, I feel like you're still better off with DX9 unless you know there's something you absolutely need DX10 to do.

OpenGL should be the solution for the time being, but is kind of a step back just because of a certain manufacturer's consistently terrible driver support for it.

DX10 didn't introduce much feature-wise, I would agree -- I guess I just see DX11 as a lot of DX10's potential realized. The driver/API overhead improvements really are non-trivial, though. I would agree that most people's personal projects can be written using just the features available in DX9 quite easily and without the user-base cost. It's a problem you just don't have to worry about.

I do think that if you're doing something for educational purposes (or creating tech you want to carry with you) then you should target DX11 (with back-supported feature levels) because it's a lot more future-proof, and you have a lot more room for growth down the line.

Max Facetime
Apr 18, 2009

OneEightHundred posted:

OpenGL should be the solution for the time being, but is kind of a step back just because of a certain manufacturer's consistently terrible driver support for it.

How do things like angleproject (an API translator from OpenGL ES 2.0/WebGL to DirectX 9) factor into this? It seems to me that if you're going to specialize heavily in one API, what does going beyond ES 2.0 with OpenGL get you besides driver problems? Why not just go DirectX 11?

OneEightHundred
Feb 28, 2008

Soon, we will be unstoppable!

I am in posted:

How do things like angleproject (an API translator from OpenGL ES 2.0/WebGL to DirectX 9) factor into this? It seems to me that if you were to specialize heavily into one API then what does going OpenGL beyond ES 2.0 get you besides driver problems, why not just go DirectX 11?
I meant that OpenGL has mechanisms to leverage DX10 features on XP and adapts relatively easily to not using them, on top of having significantly lower call overhead than DX9, which SHOULD have made it preferable to DX10 in most scenarios where using 10 was desired.

Unfortunately, OpenGL support on DX10 class hardware has generally been very buggy. It does NOT help that GLSL is anything but "if it works, it's compliant and will work everywhere." NVIDIA lets you do a lot of things that are illegal by the spec, and ATI's GLSL compiler has repeatedly had terrible bugs ranging from blue screens to things that never should have made it past regression testing like "defining a varying after a struct = syntax error."

OneEightHundred fucked around with this message at 21:36 on Jul 27, 2012

Paniolo
Oct 9, 2007

Heads will roll.
Any XNA experts in here? I'm playing around with it but I find the default content loader quite limited. Specifically what I would really like is the ability to use the XML object serializer to load objects, but then to fix up object references to other static resources at load time. For example:

code:
class SpriteSheet
{
    [ResolveOnLoad]
    Texture2D Texture;
    SpriteFrame[] Frames;
}

...

<Asset Type="SpriteSheet">
   <Texture>dog.png</Texture>
   <Frames>
      ...
   </Frames>
</Asset>
What I would like is for ContentManager.Load<SpriteSheet>("dog") to call Load<Texture2D>("dog.png") as part of its deserialization process. I'm okay with subclassing ContentManager and implementing the fixup behavior myself, but the biggest problem I'm having is that the XmlImporter is having none of this.

I have come up with a few workarounds but all of them frankly suck. I'm kind of amazed this functionality isn't in the standard content manager already as it seems pretty universally required.

chglcu
May 17, 2007

I'm so bored with the USA.
I haven't used XNA in a while, but isn't this the sort of thing you use content pipeline extensions for? Basic overview here: http://msdn.microsoft.com/en-us/library/bb447754.aspx

Paniolo
Oct 9, 2007

Heads will roll.

prolecat posted:

I haven't used XNA in a while, but isn't this the sort of thing you use content pipeline extensions for? Basic overview here: http://msdn.microsoft.com/en-us/library/bb447754.aspx

Yeah, I'm trying to avoid completely rewriting the XML pipeline, though, and just add a postprocessing step.

The approach I ended up taking is to define two classes for any object I want to import via xml:

code:
class SpriteDefinition
{
   int Width;
   int Height;
   String Texture;   
}

[ResourceDefinition(typeof(SpriteDefinition))]
class Sprite
{
   int Width;
   int Height;
   [ResourceReference]
   Texture2D Texture;
}
Then I subclassed ContentManager and overloaded Load() so that if you ask for a Sprite, it will call base.Load<SpriteDefinition>, then create a Sprite, copy over the normal properties and then fixup the references.

It works, but it adds boilerplate and maintenance overhead and increases memory usage.
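For reference, the fixup step itself doesn't take much code. Here's a rough sketch of just the reflection side (ResourceReferenceAttribute and the load delegate are my own names, not anything in XNA; in the real version load wraps base.Load on the subclassed ContentManager):
code:

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;

// Marks a field on the runtime type whose definition-side counterpart
// is a content name string to be resolved into a loaded asset (sketch).
[AttributeUsage(AttributeTargets.Field)]
class ResourceReferenceAttribute : Attribute { }

static class ContentFixup
{
    // Copies same-named fields from the definition object onto the
    // target object. Fields tagged [ResourceReference] on the target
    // are resolved by calling 'load' with the string stored in the
    // definition instead of copying the string itself.
    public static void Resolve(object definition, object target,
                               Func<string, object> load)
    {
        const BindingFlags flags =
            BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic;
        foreach (FieldInfo dst in target.GetType().GetFields(flags))
        {
            FieldInfo src = definition.GetType().GetField(dst.Name, flags);
            if (src == null) continue;
            if (dst.IsDefined(typeof(ResourceReferenceAttribute), false))
                dst.SetValue(target, load((string)src.GetValue(definition)));
            else
                dst.SetValue(target, src.GetValue(definition));
        }
    }
}
```
In the real thing, load would be something like name => base.Load<Texture2D>(name), or a type-keyed dispatch; this just shows the copy-and-resolve loop.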

Jewel
May 2, 2009

Does anyone have any good resources on properly implementing OOP or component-based programming, or pretty much "how do I make a complete project properly"?? I'm looking for books, webpages, anything (game-specific tutorials/books would be best). Basically my problem is that, while I know how to program a process to do pretty much any given task, my code winds together into terrible spaghetticode pretty fast, since I don't know how to do even simple stuff properly, like "Say I have a particle manager, how do I let all objects create particles? Do I pass the manager around? Do I have a global manager collection that can be accessed? Do I have the object's parent be an object manager and the object manager's parent be a manager manager, and then access object.parent.parent.particlemanager?? Do I create particles from there with key values, since the object wouldn't have access to the particle types?" Basically that just goes on forever and I forget about the actual programming and fret about the OOP until I stop working on the project forever :sigh:

Paniolo
Oct 9, 2007

Heads will roll.

Jewel posted:

"Say I have a particle manager, how do I let all objects create particles?

In this particular case you're asking the wrong question. Every object shouldn't necessarily be allowed to create particles. There should be only a small number of points, or ideally just one, from which that code can be run.

In my experience, most overengineering arises from writing code without knowing its use case, so you try to cover every possible one. The more you know about the context in which code will run before you start writing it, the less ambiguity you have to deal with and the easier it is to solve these problems.
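To make that concrete, here's one way to do it (a sketch, not the only way): the world owns a single ParticleSystem, and game objects only ever see a narrow interface that gets passed down through the update loop, so "who can make particles" is decided in exactly one place:
code:

```csharp
using System;
using System.Collections.Generic;

// The only doorway through which particles get created (sketch).
interface IParticleSpawner
{
    void Spawn(string effect, float x, float y);
}

class ParticleSystem : IParticleSpawner
{
    public readonly List<string> Live = new List<string>();

    public void Spawn(string effect, float x, float y)
    {
        // A real version would allocate from a pool and track
        // position/velocity; here we just record the effect name.
        Live.Add(effect);
    }
}

// Game objects receive the interface per update instead of holding
// a reference to the manager (or to parent.parent.particlemanager).
class Explosion
{
    public void Update(IParticleSpawner particles)
    {
        particles.Spawn("smoke", 0f, 0f);
    }
}
```
The update loop hands the ParticleSystem down as IParticleSpawner, and nothing else ever touches it.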

edit: You asked for a link, here's the bitsquid blog where they discuss their engine. They take a more data-driven approach which I am a fan of as well.

Jewel
May 2, 2009

Paniolo posted:

In this particular case you're asking the wrong question. Every object shouldn't necessarily be allowed to create particles. There should be only a small number of points, or ideally just one, from which that code can be run.

And that's pretty much my biggest flaw right there. I don't know the """best""" ways to write the communication between things, end up hacking together a system of passing objects/parents, and then it gets too messy and I stop.

Unormal
Nov 16, 2004

Mod sass? This evening?! But the cakes aren't ready! THE CAKES!
Fun Shoe

Jewel posted:

And that's pretty much my biggest flaw right there. I don't know the """best""" ways to write the communication between things, end up hacking together a system of passing objects/parents, and then it gets too messy and I stop.

When in doubt, fire events between things :)
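e.g. a dead-simple type-keyed event bus (just a sketch; a real engine version usually adds unsubscription and queued/deferred dispatch):
code:

```csharp
using System;
using System.Collections.Generic;

class EventBus
{
    readonly Dictionary<Type, List<Delegate>> handlers =
        new Dictionary<Type, List<Delegate>>();

    public void Subscribe<T>(Action<T> handler)
    {
        if (!handlers.TryGetValue(typeof(T), out var list))
            handlers[typeof(T)] = list = new List<Delegate>();
        list.Add(handler);
    }

    // Senders and receivers share only the event type, never
    // references to each other.
    public void Publish<T>(T evt)
    {
        if (handlers.TryGetValue(typeof(T), out var list))
            foreach (Action<T> h in list) h(evt);
    }
}

// Example event: anything may request a particle; only the particle
// system subscribes to it.
class ParticleRequested
{
    public string Effect;
}
```
Usage is just bus.Subscribe<ParticleRequested>(e => ...) in the particle system and bus.Publish(new ParticleRequested { Effect = "spark" }) from wherever.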

leper khan
Dec 28, 2010
Honest to god thinks Half Life 2 is a bad game. But at least he likes Monster Hunter.

Jewel posted:

And that's pretty much my biggest flaw right there. I don't know the """best""" ways to write the communication between things, end up hacking together a system of passing objects/parents, and then it gets too messy and I stop.

Have you tried diagramming out all of the interactions before you write any code? It sounds like you're pressing keys to keyboard before actually knowing the purpose of doing so.

It's probably best to get a class diagram set up, figure out what everything needs to do, and iterate that 1 to n times until it seems manageable. It sounds like what you're asking is, "How do I plan a project while writing code?" The answer is: you plan a project, then you write code.

Jewel
May 2, 2009

leper khan posted:

Have you tried diagramming out all of the interactions before you write any code? It sounds like you're pressing keys to keyboard before actually knowing the purpose of doing so.

Pretty much what I'm doing, yeah. I just wanna code and get so caught up in it :(

I need a big whiteboard so I can sketch out stuff like that in a much nicer way eventually.

I'll try it out though, thanks! I always write down stuff like this and then forget to do it. Also, even though it's been answered, I'm still wondering if there's ever been a book written about programming relationships between entities in gamedev.

Edit: vvv Stuff like this is why I don't like doing anything, and end up freezing up, hahah.

Jewel fucked around with this message at 02:23 on Aug 3, 2012


Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe

Unormal posted:

When in doubt, fire events between things :)

This is a recipe for disaster in a game engine.
