xzzy
Mar 5, 2009

I've played with convolution matrices a little bit, but I just never thought they could be fast enough to do real time like that, because iterating over every pixel on the screen gets slow fast.

While my results did look cool, they were hilariously slow to generate.

Raenir Salazar
Nov 5, 2010

College Slice

orenronen posted:

The best thing that happened to my Unity programming methodology is UniRx, a port of the .NET Reactive Extensions that sadly not many people know about except for a small cult following among the Japanese Unity community (which is where I work). It not only makes standard Rx work in Unity, but also provides many Unity-specific additions that make most common tasks achievable in a reactive way.

The library doesn't force functional methodology on you, but since it's built on top of functional ideas you are often led that way when using it. There's a somewhat steep learning curve, but once you get the hang of it, the kinds of things you can achieve with one or two lines of code are almost magical. It's astounding how applicable it is to every aspect of Unity game development, from UI to recognizing complex mouse gestures or input sequences to achieving complex timing effects, with almost no effect on performance. Seriously, if you're using Unity you should check it out.

I'm working on a strategy game in Unity and I appreciate this link! :)
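For the curious, the canonical UniRx demo is double-click detection; a rough sketch of that, assuming the UniRx package is imported (illustrative only, not code from the quoted post):

C# code:
using System;
using UniRx;
using UnityEngine;

public class DoubleClickDetector : MonoBehaviour
{
    void Start()
    {
        // Every frame in which the left mouse button went down becomes an event.
        var clickStream = Observable.EveryUpdate()
            .Where(_ => Input.GetMouseButtonDown(0));

        // Collect clicks that arrive within 250 ms of each other; two or more means a double click.
        clickStream
            .Buffer(clickStream.Throttle(TimeSpan.FromMilliseconds(250)))
            .Where(clicks => clicks.Count >= 2)
            .Subscribe(clicks => Debug.Log("Double click, count: " + clicks.Count));
    }
}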

Zaphod42
Sep 13, 2012

If there's anything more important than my ego around, I want it caught and shot now.

xzzy posted:

I've played with convolution matrices a little bit, but I just never thought they could be fast enough to do real time like that, because iterating over every pixel on the screen gets slow fast.

While my results did look cool, they were hilariously slow to generate.

So do it on the GPU! :eng101:

Shalinor
Jun 10, 2002

Can I buy you a rootbeer?

Zaphod42 posted:

So do it on the GPU! :eng101:
This. Convolution matrices are the heart of probably 99% of the recent graphical upgrades you've seen. Ambient Occlusion, FSAA variants, Monte Carlo-inspired bounce calcs done in screen space, certain types of reflection, etc. - they all involve iterating across every single pixel and dumping the result into another buffer. Most render pipelines have a stack of at least 6 textures where they're shuffling full-screen calculations like that, one to the next. If you're feeling frisky, just adjust the resolution of some of the buffers to minimize the number of actual pixels being calculated, and get some free blur while you're at it.
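To make the downsampling trick concrete, here's a minimal Unity-style sketch (blurMaterial is a hypothetical shader holding the actual convolution kernel; this isn't any particular engine's pipeline):

C# code:
using UnityEngine;

// Attach to a camera. Runs the convolution at reduced resolution, then upscales:
// fewer pixels to process, and the bilinear upscale adds some blur for free.
[RequireComponent(typeof(Camera))]
public class DownsampledBlur : MonoBehaviour
{
    public Material blurMaterial;            // hypothetical shader with the convolution kernel
    [Range(1, 3)] public int downsample = 1; // 1 = half res, 2 = quarter res, ...

    void OnRenderImage(RenderTexture src, RenderTexture dst)
    {
        int w = src.width >> downsample;
        int h = src.height >> downsample;

        RenderTexture small = RenderTexture.GetTemporary(w, h);
        Graphics.Blit(src, small, blurMaterial);   // convolve at low resolution
        Graphics.Blit(small, dst);                 // stretch back up to full resolution
        RenderTexture.ReleaseTemporary(small);
    }
}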

Shalinor fucked around with this message at 18:01 on Jul 17, 2015

Joda
Apr 24, 2010

When I'm off, I just like to really let go and have fun, y'know?

Fun Shoe
I've run into a problem trying to switch from a window to fullscreen with GLFW. Whenever I do so (at least on Windows) the entire app seems to crash. This is the relevant code:

C++ code:
if(messages[i]->type == FULL_SCREEN_MSG) {
    GLFWwindow* temp = window;
    window = glfwCreateWindow(render.getWidth(), render.getHeight(), "Deep G-buffers", glfwGetPrimaryMonitor(), temp);
    glfwSetKeyCallback(window, processInput);
    glfwMakeContextCurrent(window);
    render.setContext(window);
    messages[i]->kill = true;
}
C++ code:
void RenderEngine::setContext(GLFWwindow *window) {
    context = window;

    glViewport(0,0,width,height);
    glClearColor(1,1,1,1);
}
Does anyone know what might be going on here? It works just fine when I launch the app in fullscreen, but passing the context on seems to give it problems. It seems to be exclusive to Windows btw.

E: What I'm getting is a segmentation fault, and it happens both with and without fullscreen (like if I just try to pass off the context to a new window). Also, it seems to happen in my render pass, and my debugger gives me something related to nvogl, which I assume means the context hasn't been properly transferred or something? More specifically, it happens with the first call to glDrawElements(), which I take to mean something has happened to the context, meaning I can no longer use it?

Xerophyte posted:

I can't actually load the assimp site right now but I think the scene is freed with the importer, yes. It looks that way in the source as well.

E: There's also an Importer::FreeScene() function, which appears to be for when you want to free the scene and keep the Importer around.

Alright, thanks.

Joda fucked around with this message at 06:34 on Jul 18, 2015

Tres Burritos
Sep 3, 2009

Shalinor posted:

This. Convolution matrices are the heart of probably 99% of the recent graphical upgrades you've seen. Ambient Occlusion, FSAA variants, Monte Carlo-inspired bounce calcs done in screen space, certain types of reflection, etc. - they all involve iterating across every single pixel and dumping the result into another buffer. Most render pipelines have a stack of at least 6 textures where they're shuffling full-screen calculations like that, one to the next. If you're feeling frisky, just adjust the resolution of some of the buffers to minimize the number of actual pixels being calculated, and get some free blur while you're at it.

http://www.adriancourreges.com/blog/2015/03/10/deus-ex-human-revolution-graphics-study/

That's a pretty good read for some of that stuff.

Boz0r
Sep 7, 2006
The Rocketship in action.
Very vague question: Is it generally a good or bad idea to have all game logic decoupled from Unity itself?
I did the Roguelike tutorial, where they move stuff around in tiles, and they use raycasts and colliders to check if there's anything in the position they're moving to. That seems pretty convoluted, as I'd just check the grid position for anything. Thoughts?

Cumslut1895
Feb 18, 2015

by FactsAreUseless

Boz0r posted:

Very vague question: Is it generally a good or bad idea to have all game logic decoupled from Unity itself?
I did the Roguelike tutorial, where they move stuff around in tiles, and they use raycasts and colliders to check if there's anything in the position they're moving to. That seems pretty convoluted, as I'd just check the grid position for anything. Thoughts?

I think the ease of use of Unity stops people from having to think about things themselves. Grid checks are a much better solution, but if an easier tool is present, a lot of developers won't bother to put the work in.
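For comparison, the grid-check version of "is there something in the tile I'm moving into" is only a few lines; a sketch with made-up names, not the tutorial's code:

C# code:
public class Board
{
    // Kept in sync by whatever spawns, moves, or removes units; no physics involved.
    bool[,] occupied = new bool[8, 8];

    public bool CanMoveTo(int x, int y)
    {
        if (x < 0 || y < 0 || x >= occupied.GetLength(0) || y >= occupied.GetLength(1))
            return false;             // off the board
        return !occupied[x, y];       // plain array lookup instead of a raycast against colliders
    }

    public void SetOccupied(int x, int y, bool value)
    {
        occupied[x, y] = value;
    }
}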

seiken
Feb 7, 2005

hah ha ha
If I heard about someone doing a tile-based game and actually using raycasts to check collision I would just laugh in their face forever

Boz0r
Sep 7, 2006
The Rocketship in action.
That's what I thought, but it was an official Unity tutorial, so I thought I'd ask. It hurt my software architecture soul.

Yodzilla
Apr 29, 2005

Now who looks even dumber?

Beef Witch
Yeah that sounds like some complete garbage. Now I'm just imagining a Bejeweled-like game with rays shooting all over the goddamn place.

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
You can try, but you will hit some tension with Unity's Mono-specific shenanigans. I like to decouple, especially for unit testing. However, all this stuff uses Vector3 and the like. I found I could sever that for my pathfinding, so I could import the main Unity DLLs and use its types without needing to test and run in Unity directly. Technically I could write my own Vector3 and map back and forth, but that leaves me sour.

I could not do the same for my texture atlas; Texture2D had something specific to Unity's Mono runtime. I could still make a dedicated class for the atlas and use Unity's testing tools to verify it, but it had to be in-process.
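As a rough illustration of what the Vector3 approach buys you: a plain NUnit test can reference UnityEngine.dll just for the math types and run in an ordinary test runner, no editor or player involved (a contrived example, not code from the post above):

C# code:
using NUnit.Framework;
using UnityEngine;   // referenced only for the math structs; nothing here needs Unity running

[TestFixture]
public class PathfindingMathTests
{
    [Test]
    public void ManhattanDistanceOnGrid()
    {
        var a = new Vector3(0f, 0f, 0f);
        var b = new Vector3(3f, 0f, 4f);

        // Plain math on Unity's own struct, executed outside the engine.
        float manhattan = Mathf.Abs(b.x - a.x) + Mathf.Abs(b.z - a.z);
        Assert.AreEqual(7f, manhattan);
    }
}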

dougdrums
Feb 25, 2005
CLIENT REQUESTED ELECTRONIC FUNDING RECEIPT (FUNDS NOW)

Yodzilla posted:

Yeah that sounds like some complete garbage. Now I'm just imagining a Bejeweled-like game with rays shooting all over the goddamn place.

This is the best part of that tutorial:
code:
//By storing the reciprocal of the move time we can use it by multiplying instead of dividing, this is more efficient.
inverseMoveTime = 1f / moveTime;
http://unity3d.com/learn/tutorials/projects/2d-roguelike/movingobject

That's gonna buy you some time, considering ...

Shalinor
Jun 10, 2002

Can I buy you a rootbeer?

Yodzilla posted:

Yeah that sounds like some complete garbage. Now I'm just imagining a Bejeweled-like game with rays shooting all over the goddamn place.
This, and also, over time you will find Unity Is Stupid and the more you airlock your code away from it, the better. The best way to use Unity in a large project is, probably, to treat it mostly like a rendering engine.

Smaller stuff doesn't matter though, just do whatever.

Rocko Bonaparte
Mar 12, 2002

Every day is Friday!
If I am doing manual editor GUI drawing in Unity, is there any way to determine dirty regions and repaint only what I need? I am thinking that my editors are really making Unity spin since it is being told to draw tiles each GUI paint cycle.

ModeSix
Mar 14, 2009

So after several days of experimenting with various ways to create a seamless level, I've come to the conclusion that I need to actually build my world on a cylinder. However, it seems Unreal Engine really doesn't play well with gravity modifications, as the gravity function is coded into the engine itself.

So my new question is: does anyone have any experience dealing with gravity around a cylinder, and having it change so that the down of gravity is always the player's local Z-down instead of whatever the engine says is down?

Unfortunately the AnswerHub and forums aren't as full of relevant examples and discussion about this type of thing as the Unity forums are.

dupersaurus
Aug 1, 2012

Futurism was an art movement where dudes were all 'CARS ARE COOL AND THE PAST IS FOR CHUMPS. LET'S DRAW SOME CARS.'
Rotate the world around the character instead of the character around the world?

ModeSix
Mar 14, 2009

dupersaurus posted:

Rotate the world around the character instead of the character around the world?

This would work, at a huge expense in CPU time. Recalculating the physics on the whole world every frame is really not an ideal solution.

dupersaurus
Aug 1, 2012

Futurism was an art movement where dudes were all 'CARS ARE COOL AND THE PAST IS FOR CHUMPS. LET'S DRAW SOME CARS.'

ModeSix posted:

This would work, at a huge expense in CPU time. Recalculating the physics on the whole world every frame is really not an ideal solution.

The other option would be to not use engine gravity, but manually apply a force to every object that points to the center of the world.
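The per-body force version is only a few lines in any engine. A sketch in Unity-style C# (the syntax most of this thread is using) just to show the idea; cylinderAxis is a hypothetical transform whose up vector runs along the cylinder:

C# code:
using UnityEngine;

// Turn off built-in gravity on the body and pull it toward the cylinder's axis
// every physics step, so "down" is always the local radial direction.
[RequireComponent(typeof(Rigidbody))]
public class RadialGravity : MonoBehaviour
{
    public Transform cylinderAxis;   // hypothetical: up vector runs along the cylinder's length
    public float gravity = 9.81f;

    Rigidbody rb;

    void Start()
    {
        rb = GetComponent<Rigidbody>();
        rb.useGravity = false;       // opt out of the engine's global gravity
    }

    void FixedUpdate()
    {
        // Find the closest point on the axis to this body, then accelerate toward it.
        Vector3 fromAxis = rb.position - cylinderAxis.position;
        Vector3 closestOnAxis = cylinderAxis.position + Vector3.Project(fromAxis, cylinderAxis.up);
        Vector3 down = (closestOnAxis - rb.position).normalized;   // flip the sign to stand on the inside instead

        rb.AddForce(down * gravity, ForceMode.Acceleration);
    }
}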

ModeSix
Mar 14, 2009

dupersaurus posted:

The other option would be to not use engine gravity, but manually apply a force to every object that points to the center of the world.

This is along the lines of what I'm thinking. I've done some reading about radial force, and I suppose I could just apply it in the negative direction.

dupersaurus
Aug 1, 2012

Futurism was an art movement where dudes were all 'CARS ARE COOL AND THE PAST IS FOR CHUMPS. LET'S DRAW SOME CARS.'

ModeSix posted:

This is along the lines of what I'm thinking. I've done some reading about radial force, and I suppose I could just apply it in the negative direction.

Negative force at map center, or a force along the down vector applied to object origin, probably just about a coin flip.

Nition
Feb 25, 2006

You really want to know?
Someone made an "easy to implement suggestion" for my game yesterday that would actually be easy to implement. Is this a sign of the end times?

wayfinder
Jul 7, 2003
But a cylinder only wraps around one surface axis! :) You need to have your game play on a torus :can:

ModeSix
Mar 14, 2009

wayfinder posted:

But a cylinder only wraps around one surface axis! :) You need to have your game play on a torus :can:

Well, it only needs to wrap in one direction, because it will generate a new piece in the other direction. But at this point, with all the weirdness associated with trying to do this the "elegant" way, I may just have it generate new world pieces in all directions instead of trying to make it wrap onto itself.

Spiritus Nox
Sep 2, 2011

Dumb question: I'm making a thing in Unreal Engine 4 and I'm trying to improve performance on lower-end machines by just decreasing graphics quality. I can change the rendering quality in the preview window just fine, but then I package the game and it goes right back to displaying at max quality, with all the AA and fancy lighting and so on slowing things down. Where can I look if I want to modify the default graphics settings for the final, packaged product?

poemdexter
Feb 18, 2005

Hooray Indie Games!

College Slice
If you fully embrace and know how to work within the Entity/Component pattern, you can easily do big projects in Unity. Just my two cents based on experience. I find most people want to take a more strict OOP approach with Unity and struggle because Unity is designed around E/C.

ether
May 20, 2001

ModeSix posted:

So after several days of experimenting with various ways to create a seamless level, I've come to the conclusion that I need to actually build my world on a cylinder. However, it seems Unreal Engine really doesn't play well with gravity modifications, as the gravity function is coded into the engine itself.

So my new question is: does anyone have any experience dealing with gravity around a cylinder, and having it change so that the down of gravity is always the player's local Z-down instead of whatever the engine says is down?

Unfortunately the AnswerHub and forums aren't as full of relevant examples and discussion about this type of thing as the Unity forums are.

Arbitrary gravity direction for CharacterMovement is coming in UE 4.9, I think, based on this pull request.

I've been messing with it for SAGDCX and it mostly works, with some funky behaviour here and there that you need to address because the entire engine assumes gravity is always gonna be along the Z-axis.

Quiet_
Sep 15, 2007

Spiritus Nox posted:

Dumb question: I'm making a thing in Unreal Engine 4 and I'm trying to improve performance on lower-end machines by just decreasing graphics quality. I can change the rendering quality in the preview window just fine, but then I package the game and it goes right back to displaying at max quality, with all the AA and fancy lighting and so on slowing things down. Where can I look if I want to modify the default graphics settings for the final, packaged product?

When I last looked you had to do something like the console commands from here (scroll down to the image "set scalability settings at startup")
https://answers.unrealengine.com/questions/23023/trouble-configuring-game-settings.html

Pollyanna
Mar 5, 2005

Milk's on them.


poemdexter posted:

If you fully embrace and know how to work within the Entity/Component pattern, you can easily do big projects in Unity. Just my two cents based on experience. I find most people want to take a more strict OOP approach with Unity and struggle because Unity is designed around E/C.

I actually really like ECS, cause I think it's a lot easier to comprehend cognitively and is a lot more flexible than typical OOP approaches. I was always under the impression that Unity was all OOP, mostly because of being C#-based, so maybe I should give it another chance.

I'm still curious about ECS itself and I want to try my hand at my own implementation of it, too, though.
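If you do roll your own, the bare bones of the pattern are pretty small. A toy sketch (plain C#, no Unity) just to show the shape; obviously not a production design:

C# code:
using System.Collections.Generic;

// Entities are just ids, components are plain data, and systems iterate over
// whichever entities happen to have the components they care about.
struct Position { public float X, Y; }
struct Velocity { public float X, Y; }

class World
{
    public Dictionary<int, Position> Positions = new Dictionary<int, Position>();
    public Dictionary<int, Velocity> Velocities = new Dictionary<int, Velocity>();
}

static class MovementSystem
{
    public static void Tick(World world, float dt)
    {
        foreach (var entry in world.Velocities)
        {
            Position p;
            if (!world.Positions.TryGetValue(entry.Key, out p)) continue;   // needs both components
            p.X += entry.Value.X * dt;
            p.Y += entry.Value.Y * dt;
            world.Positions[entry.Key] = p;   // structs copy by value, so write the update back
        }
    }
}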

raging bullwinkle
Jun 15, 2011
I have a question about animation inside of Unity.

In most animation programs I've used (After Effects, Flash, Maya, Spine, C4D, etc.) you can change the length of the handles to affect easing:


In Unity, the handles apparently have a set length:


Which means if I want to create the same easing, I need to add extra keyframes and gently caress around a lot. Is that the case, or am I missing something?

Shalinor
Jun 10, 2002

Can I buy you a rootbeer?

poemdexter posted:

If you fully embrace and know how to work within the Entity/Component pattern, you can easily do big projects in Unity. Just my two cents based on experience. I find most people want to take a more strict OOP approach with Unity and struggle because Unity is designed around E/C.
I've got a decade in ECS engines, and still found the process of making a big project in Unity miserable verging on intolerable. It isn't the base engine conceit that's the problem, it's the inherent bugginess of their toolset. And also the way it flips its poo poo when you collaborate via source control and it occasionally corrupts something for no good reason. Oh, and the way it makes it (almost) impossible to keep your changes isolated / it occasionally touches a prefab or scene despite no apparent intentional changes.

It works better the smaller your team is. I don't think I could use it on a project requiring more than one or two artists, one designer, and one or two programmers. (ie. Hot Tin Roof was enough to push it to the brink for us)

... which is why I'd vote for "using it as a rendering engine" as your team size goes up. If you avoid using their toolset, you don't hit any of the issues we did over the project. Unity itself still has some lovely bugs, but they're normal engine bugs at least, not mysterious toolset bugs. As it happens, you want to do that anyways if your game is procedural (using prefabs instead of manual runtime construction caused its own nightmares), so it turns out you aren't missing much anyways for a bunch of genres/game types.

Shalinor fucked around with this message at 14:29 on Jul 20, 2015

Angry_Ed
Mar 30, 2010




Grimey Drawer

Shalinor posted:

... which is why I'd vote for "using it as a rendering engine" as your team size goes up. If you avoid using their toolset, you don't hit any of the issues we did over the project. Unity itself still has some lovely bugs, but they're normal engine bugs at least, not mysterious toolset bugs. As it happens, you want to do that anyways if your game is procedural (using prefabs instead of manual runtime construction caused its own nightmares), so it turns out you aren't missing much anyways for a bunch of genres/game types.

This is going to sound like a very dumb question, but as a person who isn't too well-versed in programming, and is a bit fuzzy on the details of high-level tech/art pipelines in game development, what exactly do you mean by using it as a rendering engine? Do you mean letting a program other than MonoDevelop handle the code compiling, or something far more elaborate?

Angry_Ed fucked around with this message at 15:04 on Jul 20, 2015

poemdexter
Feb 18, 2005

Hooray Indie Games!

College Slice

Shalinor posted:

I've got a decade in ECS engines, and still found the process of making a big project in Unity miserable verging on intolerable. It isn't the base engine conceit that's the problem, it's the inherent bugginess of their toolset. And also the way it flips its poo poo when you collaborate via source control and it occasionally corrupts something for no good reason. Oh, and the way it makes it (almost) impossible to keep your changes isolated / it occasionally touches a prefab or scene despite no apparent intentional changes.

It works better the smaller your team is. I don't think I could use it on a project requiring more than one or two artists, one designer, and one or two programmers. (ie. Hot Tin Roof was enough to push it to the brink for us)

... which is why I'd vote for "using it as a rendering engine" as your team size goes up. If you avoid using their toolset, you don't hit any of the issues we did over the project. Unity itself still has some lovely bugs, but they're normal engine bugs at least, not mysterious toolset bugs. As it happens, you want to do that anyways if your game is procedural (using prefabs instead of manual runtime construction caused its own nightmares), so it turns out you aren't missing much anyways for a bunch of genres/game types.

I was thinking about HTR when I wrote my response, and I agree with you. At most, I've worked with two developers and an artist, and we already had a workflow agreed upon to help with some of the Unity quirks you mentioned. With a team of your size, keeping the Unity project in a state that can compile would probably be like trying to herd cats.

Shalinor
Jun 10, 2002

Can I buy you a rootbeer?

Angry_Ed posted:

This is going to sound like a very dumb question, but as a person who isn't too well-versed in programming, what exactly do you mean by using it as a rendering engine? Do you mean letting a program other than MonoDevelop handle the code compiling, or something far more elaborate?
Don't use scenes (you probably only have one scene that exists to load your manager or whatever), don't use prefabs. Roll your own systems to replace them / roll your own toolset.

h_double
Jul 27, 2001

OneEightHundred posted:

I had to reread that, it sounded like he was talking about 8-bit games until I realized that the quote has nothing else to do with games at all.

I'm kind of curious what the context of that quote is, and what he IS referring to. It sounds like it's all about music, but what medium did musicians have to deal with where 8-bit sound was even a thing? Early synthesizers?


Samplers. And drum machines which used digital samples.


OneEightHundred posted:

Also AFAIK, old game system sound was mostly crappy not so much because of the sound bit depth, but because it was mostly done with FM synthesis, in some cases with FM synth processors that could only do square and sawtooth waves. Maybe it's both though. :v:


FM (frequency modulation) synthesis is most common in early Sega consoles (Master System + Genesis/Megadrive), and arcade games of the mid-late 80s, which used a few different Yamaha sound chips like the YM2151 and YM2612. These are very similar in design to the FM implementation in the Yamaha DX7, which was the first (and is still the most important) keyboard FM synthesizer.

FM synthesis is great for certain types of sounds, it can do airy or metallic or sparkly sounds amazingly well, and especially when it was new it sounded striking and distinctive and unlike anything that had come before. But FM isn't very good at modeling most real-world sounds, and it is notoriously difficult to design FM sounds compared to other methods of sound generation.

Also, classic FM synths don't know how to make anything other than a sine wave. If you want a square or sawtooth wave, you use one sine wave to modulate a second sine wave in a way that creates harmonics approximating a square wave (you know how I said FM sound design was difficult?).

Frequency modulation can be used as a technique in other types of synthesis too: any time you hear an old game with a police siren or pulsating UFO noise, that's one low-frequency oscillator modulating the frequency of a second (audio) tone generator.
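The "one sine wave modulates another" bit really is that literal. A toy two-operator render in a dozen lines (illustrative only, not any particular chip's implementation):

C# code:
using System;

static class FmToy
{
    // A carrier sine whose phase is pushed around by a modulator sine.
    // Higher modIndex means more sidebands: brighter, more metallic timbres.
    public static float[] Render(float carrierHz, float modHz, float modIndex,
                                 float seconds, int sampleRate = 44100)
    {
        int count = (int)(seconds * sampleRate);
        var samples = new float[count];
        for (int i = 0; i < count; i++)
        {
            double t = i / (double)sampleRate;
            double modulator = Math.Sin(2.0 * Math.PI * modHz * t);
            samples[i] = (float)Math.Sin(2.0 * Math.PI * carrierHz * t + modIndex * modulator);
        }
        return samples;
    }
}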

But really, it's a little generous to call anything in the 8-bit (NES and earlier) game era "synthesis" at all. Mostly, early games used simple digital tone generators to create square waves, pulse waves (a pulse wave is a square wave with a variable width), triangle waves (a triangle wave is a digital approximation of a sine wave; it's easier to compute a zig-zag than to do trigonometry), and white noise. There were some cool techniques like pulse modulation (changing the width of a pulse wave over time to make phasey/electronic sounds) that were in use as early as the Atari 2600, but up until the 16-bit era, game sound hardware by and large didn't have much concept of filters or envelopes, which are essential components of most definitions of "synthesis" (the SID chip in the Commodore 64 is an exception).

In the 1980s, digital recording in the form of PCM (pulse code modulation) sampling started to be a thing, though for a while memory costs were prohibitive enough that samples couldn't be used as anything other than a wavetable (sampling a single wave cycle of a few milliseconds) or short lo-fi sound clips. In the 16-bit era, you had a combination of FM synthesis and gradually more sophisticated sample-playback synthesis like what's on the SNES. Nowadays, of course, memory is so cheap that pretty much everything gets generated as sampled sound, and mostly the only "synthesis" used in games is for dynamic effects like echo and filtering.

floofyscorp
Feb 12, 2007

poemdexter posted:

I was thinking about HTR when I wrote my response, and I agree with you. At most, I've worked with two developers and an artist, and we already had a workflow agreed upon to help with some of the Unity quirks you mentioned. With a team of your size, keeping the Unity project in a state that can compile would probably be like trying to herd cats.

TFU had upwards of 100 people working in Unity on it. We had a LOT of tech art scripts and importers and Perforce voodoo and automated build processes running constantly to keep everything together. From my dumb artist perspective, it worked out okay... mostly.

(Then again, TFU died a horrible fiery mass-layoffs death, but I don't think Unity had much to do with that.)

ShinAli
May 2, 2003

The Kid better watch his step.
I've been feeling the Unity project management pain lately, but it's been mostly due to third-party tools and assets that use Unity, for example ProBuilder, AStar Pathfinding, etc. The Asset Store is both a blessing and a curse: truly professional and rock-solid stuff is pretty much non-existent, but gently caress it, I'd rather spend 50 dollars than work on my own system. Still, I feel like using a Quake BSP editor like TrenchBroom would be a way better solution than loving ProBuilder.

The last "professional" Unity project I've worked on (ver 3.5) was much less of a pain; all we used was NGUI and prime[31] stuff.

poemdexter
Feb 18, 2005

Hooray Indie Games!

College Slice

ShinAli posted:

I've been feeling the Unity project management pain lately, but it's been mostly due to third-party tools and assets that use Unity, for example ProBuilder, AStar Pathfinding, etc. The Asset Store is both a blessing and a curse: truly professional and rock-solid stuff is pretty much non-existent, but gently caress it, I'd rather spend 50 dollars than work on my own system. Still, I feel like using a Quake BSP editor like TrenchBroom would be a way better solution than loving ProBuilder.

The last "professional" Unity project I've worked on (ver 3.5) was much less of a pain; all we used was NGUI and prime[31] stuff.

NGUI and prime[31] would be considered "professional" to me. They were both required if you were doing anything mobile.

FuzzySlippers
Feb 6, 2009

Shalinor posted:

Don't use scenes (you probably only have one scene that exists to load your manager or whatever), don't use prefabs. Roll your own systems to replace them / roll your own toolset.

Yeah, I think serialization is one of Unity's biggest problems. There's a lot of wackiness that goes on when it serializes that can go awry, and source control only makes this worse. Prefabs especially can be such a mess.

If you don't want to go the XML/JSON/whatever route and want to be able to drag and drop objects in the editor, ScriptableObjects seem a lot more consistent, especially if you don't treat them as a "DB" by having them store a big list of whatever data. They are still not mergeable in source control, but at least if you split all your data into individual ScriptableObjects (each spell, item, etc. as its own file) you get a bit more flexibility and any mishaps don't scuttle everything.
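A minimal sketch of the one-asset-per-item idea, with made-up fields (assuming a Unity version recent enough to have the CreateAssetMenu attribute):

C# code:
using UnityEngine;

// One small asset per spell or item instead of one giant "database" object,
// so a bad merge or a corrupted file only takes out a single entry.
[CreateAssetMenu(menuName = "Data/Spell")]
public class SpellData : ScriptableObject
{
    public string displayName;
    public int manaCost;
    public float cooldown;
    public Sprite icon;
}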

Unormal
Nov 16, 2004

Mod sass? This evening?! But the cakes aren't ready! THE CAKES!
Fun Shoe

FuzzySlippers posted:

Yeah, I think serialization is one of Unity's biggest problems. There's a lot of wackiness that goes on when it serializes that can go awry, and source control only makes this worse. Prefabs especially can be such a mess.

If you don't want to go the XML/JSON/whatever route and want to be able to drag and drop objects in the editor, ScriptableObjects seem a lot more consistent, especially if you don't treat them as a "DB" by having them store a big list of whatever data. They are still not mergeable in source control, but at least if you split all your data into individual ScriptableObjects (each spell, item, etc. as its own file) you get a bit more flexibility and any mishaps don't scuttle everything.

Are there any good resources on learning to work with ScriptableObjects 'correctly'?
