|
I've played with convolution matrices a little bit, but I never thought they could be fast enough to do real time like that, because iterating over every pixel on the screen gets slow fast. While my results did look cool, they were hilariously slow to generate.
|
# ? Jul 17, 2015 16:15 |
|
orenronen posted:The best thing that happened to my Unity programming methodology is UniRx, a port of the .NET Reactive Extensions that sadly not many people know about, except for a small cult following in the Japanese Unity community (which is where I work). It not only makes standard Rx work in Unity, but also provides many Unity-specific additions that make most common tasks achievable in a reactive way. I'm working on a strategy game in Unity and I appreciate this link!
|
# ? Jul 17, 2015 16:42 |
|
xzzy posted:I've played with convolution matrices a little bit, I just never thought they could be fast enough to do real time like that because iterating over every pixel on the screen gets slow fast. So do it on the GPU!
|
# ? Jul 17, 2015 17:03 |
|
Zaphod42 posted:So do it on the GPU! This. Convolution matrices are the heart of probably 99% of the recent graphical upgrades you've seen. Ambient Occlusion, FSAA variants, monte-carlo inspired bounce calcs done in screen space, certain types of reflection, etc - they all involve iterating across every single pixel and dumping the result into another buffer. Most render pipelines have a stack of at least 6 textures where they're shuffling full screen calculations like that, one to the next. If you're feeling frisky, just adjust the resolution of some of the buffers to minimize the number of actual pixels being calculated, and get some free blur while you're at it. http://www.adriancourreges.com/blog/2015/03/10/deus-ex-human-revolution-graphics-study/ That's a pretty good read for some of that stuff. Shalinor fucked around with this message at 18:01 on Jul 17, 2015 |
# ? Jul 17, 2015 17:59 |
I've run into a problem trying to switch from a window to fullscreen with GLFW. Whenever I do so (at least on Windows) the entire app seems to crash. This is the relevant code:
C++ code:
E: What I'm getting is a segmentation fault, and it happens both with and without fullscreen (e.g. if I just try to pass the context off to a new window). It seems to happen in my render pass, and my debugger gives me something related to nvogl, which I assume means the context hasn't been properly transferred or something? More specifically, it happens on the first call to glDrawElements(), which I take to mean something has happened to the context and I can no longer use it? Xerophyte posted:I can't actually load the assimp site right now but I think the scene is freed with the importer, yes. It looks like that in the source as well. Alright, thanks. Joda fucked around with this message at 06:34 on Jul 18, 2015 |
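For reference, a sketch of how a GLFW 3.1-era fullscreen switch usually has to go (the function names are the real GLFW API; the structure is my guess at the situation, since the original code block didn't survive the thread). There is no glfwSetWindowMonitor until GLFW 3.2, so switching means creating a new window that shares the old context:

```cpp
#include <GLFW/glfw3.h>

// Sketch only. The last argument of glfwCreateWindow shares the old
// window's GL objects with the new context; destroy the old window only
// after the replacement exists.
GLFWwindow* toFullscreen(GLFWwindow* old) {
    GLFWmonitor* monitor = glfwGetPrimaryMonitor();
    const GLFWvidmode* mode = glfwGetVideoMode(monitor);

    GLFWwindow* win = glfwCreateWindow(mode->width, mode->height,
                                       "Game", monitor, old /* share */);
    if (!win) return nullptr;

    glfwMakeContextCurrent(win);
    glfwDestroyWindow(old);

    // Container objects (VAOs, FBOs) are NOT shared between GL contexts,
    // so they must be recreated here before the next draw call.
    return win;
}
```

If the crash really is inside the NVIDIA driver at the first glDrawElements(), a stale VAO bound in the new context would be my first suspect: buffers, textures, and shaders are shared between contexts, but vertex array and framebuffer objects are not.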
|
# ? Jul 18, 2015 03:37 |
|
Shalinor posted:This. Convolution matrices are the heart of probably 99% of the recent graphical upgrades you've seen. Ambient Occlusion, FSAA variants, monte-carlo inspired bounce calcs done in screen space, certain types of reflection, etc - they all involve iterating across every single pixel and dumping the result into another buffer. Most render pipelines have a stack of at least 6 textures where they're shuffling full screen calculations like that, one to the next. If you're feeling frisky, just adjust the resolution of some of the buffers to minimize the number of actual pixels being calculated, and get some free blur while you're at it. http://www.adriancourreges.com/blog/2015/03/10/deus-ex-human-revolution-graphics-study/ That's a pretty good read for some of that stuff.
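As an illustration of what those full-screen passes are doing (my own minimal CPU-side sketch, not from the thread), here is a single 3x3 convolution over a grayscale buffer. On the GPU, the two outer loops become one fragment-shader invocation per pixel of a full-screen quad, all running in parallel, which is why it's fast enough for real time:

```cpp
#include <algorithm>
#include <vector>

// Apply a 3x3 convolution kernel to a grayscale image, clamping reads at
// the edges, and write the result into a second buffer - exactly the
// "iterate every pixel, dump into another texture" step described above.
std::vector<float> convolve3x3(const std::vector<float>& src,
                               int w, int h, const float kernel[9]) {
    std::vector<float> dst(src.size());
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            float acc = 0.0f;
            for (int ky = -1; ky <= 1; ++ky) {
                for (int kx = -1; kx <= 1; ++kx) {
                    int sx = std::min(std::max(x + kx, 0), w - 1);
                    int sy = std::min(std::max(y + ky, 0), h - 1);
                    acc += src[sy * w + sx] * kernel[(ky + 1) * 3 + (kx + 1)];
                }
            }
            dst[y * w + x] = acc;
        }
    }
    return dst;
}
```

The "free blur" trick in the post above falls out of the same structure: run the pass into a half-resolution buffer and you do a quarter of the work while the upscale blurs for free.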
|
# ? Jul 18, 2015 03:54 |
|
Very vague question: Is it generally a good or bad idea to have all game logic decoupled from Unity itself? I did the Roguelike tutorial, where they move stuff around in tiles, and they use raycasts and colliders to check if there's anything in the position they're moving to. That seems pretty convoluted, as I'd just check the grid position for anything. Thoughts?
|
# ? Jul 18, 2015 11:25 |
|
Boz0r posted:Very vague question: Is it generally a good or bad idea to have all game logic decoupled from Unity itself? I think the ease of use of Unity stops people from having to think about things themselves. Grid checks are a much better solution, but if an easier tool is present, a lot of developers won't bother to put the work in.
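As a sketch of the grid-check alternative being endorsed here (hypothetical names, my own illustration rather than anything from the tutorial): keep an occupancy grid in step with the world, and a move check becomes a single array read instead of a physics raycast.

```cpp
#include <vector>

// Minimal tile occupancy grid: out-of-bounds counts as blocked, and a
// move is a lookup plus two flag updates - no colliders or rays involved.
class TileGrid {
public:
    TileGrid(int w, int h) : w_(w), h_(h), occupied_(w * h, false) {}

    bool blocked(int x, int y) const {
        if (x < 0 || y < 0 || x >= w_ || y >= h_) return true;  // walls
        return occupied_[y * w_ + x];
    }

    void setOccupied(int x, int y, bool v) { occupied_[y * w_ + x] = v; }

    // Returns false (and moves nothing) if the destination is taken.
    bool tryMove(int fromX, int fromY, int toX, int toY) {
        if (blocked(toX, toY)) return false;
        setOccupied(fromX, fromY, false);
        setOccupied(toX, toY, true);
        return true;
    }

private:
    int w_, h_;
    std::vector<bool> occupied_;
};
```

The Unity version would be the same idea with a `bool[,]` (or a dictionary of entities) living on a manager component, updated whenever something moves.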
|
# ? Jul 18, 2015 12:26 |
|
If I heard about someone doing a tile-based game and actually using raycasts to check collision I would just laugh in their face forever
|
# ? Jul 18, 2015 13:55 |
|
That's what I thought, but it was an official Unity tutorial, so I thought I'd ask. It hurt my software architecture soul.
|
# ? Jul 18, 2015 14:00 |
|
Yeah that sounds like some complete garbage. Now I'm just imagining a Bejeweled-like game with rays shooting all over the goddamn place.
|
# ? Jul 18, 2015 15:11 |
|
You can try, but you will hit some tension with Unity's mono-specific shenanigans. I like to decouple, especially for unit testing. However, all this stuff uses Vector3 and the like. I found I could sever that for my pathfinding, so I could import the main Unity DLLs and use its types without needing to test and run in Unity directly. Technically I could write my own Vector3 and map back and forth, but that leaves me sour. I could not do the same for my texture atlas; Texture2D had something specific to Unity's mono runtime. I could still make a dedicated class for the atlas and use Unity's testing tools to verify it, but it had to be in-process.
|
# ? Jul 18, 2015 16:32 |
|
Yodzilla posted:Yeah that sounds like some complete garbage. Now I'm just imagining a Bejeweled-like game with rays shooting all over the goddamn place. This is the best part of that tutorial: code:
That's gonna buy you some time, considering ...
|
# ? Jul 18, 2015 17:05 |
|
Yodzilla posted:Yeah that sounds like some complete garbage. Now I'm just imagining a Bejeweled-like game with rays shooting all over the goddamn place. Smaller stuff doesn't matter though, just do whatever.
|
# ? Jul 18, 2015 17:44 |
|
If I am doing manual editor GUI drawing in Unity, is there any way to determine dirty regions and repaint only what I need? I am thinking that my editors are really making Unity spin since it is being told to draw tiles each GUI paint cycle.
|
# ? Jul 18, 2015 19:02 |
|
So after several days of experimenting with various ways to create a seamless level, I've come to the conclusion I need to actually build my world on a cylinder. However, it seems Unreal Engine really doesn't play well with gravity modifications, as the gravity function is coded into the engine itself. So my new question is: does anyone have any experience dealing with gravity around a cylinder, and having it change so that gravity's down is always the player's local Z down instead of what the engine says is down? Unfortunately the AnswerHub and forums aren't as full of relevant examples and discussion about this type of thing as the Unity forums are.
|
# ? Jul 18, 2015 19:33 |
|
Rotate the world around the character instead of the character around the world?
|
# ? Jul 18, 2015 19:43 |
|
dupersaurus posted:Rotate the world around the character instead of the character around the world? This would work at a huge expense to cpu time. Recalculating the physics on the whole world every frame is really not an ideal solution.
|
# ? Jul 18, 2015 19:46 |
|
ModeSix posted:This would work at a huge expense to cpu time. Recalculating the physics on the whole world every frame is really not an ideal solution. The other option would be to not use engine gravity, but manually apply a force to every object that points to the center of the world.
|
# ? Jul 18, 2015 19:55 |
|
dupersaurus posted:The other option would be to not use engine gravity, but manually apply a force to every object that points to the center of the world. This is along the lines of what I'm thinking. I've done some reading about radial force, and I suppose I could just apply it in the negative direction.
|
# ? Jul 18, 2015 20:03 |
|
ModeSix posted:This is what I'm thinking along the lines of. I've done some reading about radial force and I suppose I could just apply it in the negative direction. Negative force at map center, or a force along the down vector applied to object origin, probably just about a coin flip.
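The "force along the down vector" option reduces to a few lines of vector math. A minimal sketch of the idea (names and axis choice are mine; this assumes the level wraps around the X axis, so "down" for each object is the YZ direction toward the axis):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Manual "cylinder gravity": each physics tick, instead of engine gravity,
// apply an acceleration toward the cylinder's axis. With the axis along X,
// the radial direction is just the object's YZ position, negated and
// normalized.
Vec3 cylinderGravity(const Vec3& pos, float g) {
    float r = std::sqrt(pos.y * pos.y + pos.z * pos.z);
    if (r < 1e-6f) return {0.0f, 0.0f, 0.0f};  // on the axis: no "down"
    return {0.0f, -pos.y / r * g, -pos.z / r * g};
}
```

In UE4 terms this would mean zeroing world gravity and calling something like AddForce (mass-scaled) with this vector on every simulating body; orienting the pawn so its local Z opposes the vector is the separate, fiddlier half of the problem the posts above allude to.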
|
# ? Jul 18, 2015 20:14 |
|
Someone made an "easy to implement suggestion" for my game yesterday that would actually be easy to implement. Is this a sign of the end times?
|
# ? Jul 19, 2015 00:04 |
|
But a cylinder only wraps around one surface axis! You need to have your game play on a torus
|
# ? Jul 19, 2015 12:46 |
|
wayfinder posted:But a cylinder only wraps around one surface axis! You need to have your game play on a torus Well it only needs to wrap in one direction, because it will generate a new piece in the other direction, but at this point with all the weirdness associated with trying to do this the "elegant" way I may just have it generate new world pieces in all directions instead of trying to make it wrap on itself.
|
# ? Jul 19, 2015 15:56 |
|
Dumb question: I'm making a thing in Unreal Engine 4 and I'm trying to improve performance on lower end machines by just decreasing graphics quality. I can change the rendering quality in the preview window just fine, but then I package the game and it goes right back to displaying at max quality with all the AA and fancy lighting and so on slowing things down. Where can I look if I want to modify the default graphics settings for the final, packaged product?
|
# ? Jul 19, 2015 23:23 |
|
If you fully embrace and know how to work within the Entity/Component pattern, you can easily do big projects in Unity. Just my two cents based on experience. I find most people want to take a more strict OOP approach with Unity and struggle because Unity is designed around E/C.
|
# ? Jul 20, 2015 04:04 |
|
ModeSix posted:So after several days of experimenting with various ways to create a seamless level I've come to the conclusion I need to actually build my world on a cylinder, however it seems Unreal Engine really doesn't play well with gravity modifications as the gravity function is coded into the engine itself. Arbitrary gravity direction for CharacterMovement is coming in UE 4.9, I think, based on this pull request. I've been messing with it for SAGDCX and it mostly works, with some funky behaviour here and there you need to address because the entire engine assumes gravity is always gonna be on the Z-axis.
|
# ? Jul 20, 2015 08:00 |
|
Spiritus Nox posted:Dumb question: I'm making a thing in Unreal Engine 4 and I'm trying to improve performance on lower end machines by just decreasing graphics quality.I can change the rendering quality in the preview window just fine, but then I package the game and it goes right back to dsiplaying at max quality with all the AA and fancy lighting and so on slowing things down. Where can I look if I want to modify the default graphics settings for the final, packaged product? When I last looked you had to do something like the console commands from here (scroll down to the image "set scalability settings at startup") https://answers.unrealengine.com/questions/23023/trouble-configuring-game-settings.html
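For readers hitting the same thing: besides running those console commands at startup, the scalability defaults can be pinned down in config. A hedged example (the sg.* scalability groups are real UE4 console variables; the file placement and specific values here are illustrative, assuming a DefaultGameUserSettings.ini in the project's Config folder):

```ini
; Hypothetical example: seeding lower default quality for packaged builds.
; Group values run 0-3 (low to epic); ResolutionQuality is a percentage.
[ScalabilityGroups]
sg.ResolutionQuality=75
sg.ViewDistanceQuality=1
sg.AntiAliasingQuality=0
sg.ShadowQuality=1
sg.PostProcessQuality=1
sg.TextureQuality=1
sg.EffectsQuality=1
```

Turning off AA (sg.AntiAliasingQuality=0) and dropping post-processing are usually the two biggest wins on low-end hardware.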
|
# ? Jul 20, 2015 11:47 |
|
poemdexter posted:If you fully embrace and know how to work within the Entity/Component pattern, you can easily do big projects in Unity. Just my two cents based of experience. I find most people want to take a more strict OOP pattern with Unity and struggle because Unity is designed around E/C. I actually really like ECS, cause I think it's a lot easier to comprehend cognitively and is a lot more flexible than typical OOP approaches. I was always under the impression that Unity was all OOP, mostly because of being C#-based, so maybe I should give it another chance. I'm still curious about ECS itself and I want to try my hand at my own implementation of it, too, though.
|
# ? Jul 20, 2015 12:26 |
|
I have a question about animation inside of Unity. In most animation programs I've used (After Effects, Flash, Maya, Spine, C4D, etc.) you can change the length of the handles to affect easing. In Unity, the handles apparently have a set length, which means if I want to create the same easing, I need to add extra keyframes and gently caress around a lot. Is that the case, or am I missing something?
|
# ? Jul 20, 2015 12:37 |
|
poemdexter posted:If you fully embrace and know how to work within the Entity/Component pattern, you can easily do big projects in Unity. Just my two cents based of experience. I find most people want to take a more strict OOP pattern with Unity and struggle because Unity is designed around E/C. It works better the smaller your team is. I don't think I could use it on a project requiring more than one or two artists, one designer, and one or two programmers. (ie. Hot Tin Roof was enough to push it to the brink for us) ... which is why I'd vote for "using it as a rendering engine" as your team size goes up. If you avoid using their toolset, you don't hit any of the issues we did over the project. Unity itself still has some lovely bugs, but they're normal engine bugs at least, not mysterious toolset bugs. As it happens, you want to do that anyways if your game is procedural (using prefabs instead of manual runtime construction caused its own nightmares), so it turns out you aren't missing much anyways for a bunch of genres/game types. Shalinor fucked around with this message at 14:29 on Jul 20, 2015 |
# ? Jul 20, 2015 14:25 |
|
Shalinor posted:... which is why I'd vote for "using it as a rendering engine" as your team size goes up. If you avoid using their toolset, you don't hit any of the issues we did over the project. Unity itself still has some lovely bugs, but they're normal engine bugs at least, not mysterious toolset bugs. As it happens, you want to do that anyways if your game is procedural (using prefabs instead of manual runtime construction caused its own nightmares), so it turns out you aren't missing much anyways for a bunch of genres/game types. This is going to sound like a very dumb question, but as a person who isn't too well-versed in programming, and is a bit fuzzy on the details of high level tech/art pipelines in game development, what exactly do you mean by using it as a rendering engine? Do you mean letting a program other than MonoDevelop handle the code compiling, or something far more elaborate? Angry_Ed fucked around with this message at 15:04 on Jul 20, 2015 |
# ? Jul 20, 2015 15:00 |
|
Shalinor posted:I've got a decade in ECS engines, and still found the process of making a big project in Unity miserable verging on intolerable. It isn't the base engine conceit that's the problem, it's the inherent bugginess of their toolset. And also the way it flips its poo poo when you collaborate via source control and it occasionally corrupts something for no good reason. Oh, and the way it makes it (almost) impossible to keep your changes isolated / it occasionally touches a prefab or scene despite no apparent intentional changes. I was thinking about HTR when I wrote my response and I agree with you. At most, I've worked with two developers and an artist and we already had a workflow agreed upon to help with some of the Unity quirks you mentioned. A team of your size would probably be like trying to herd cats to keep your Unity project in a state that can compile.
|
# ? Jul 20, 2015 15:04 |
|
Angry_Ed posted:This is going to sound like a very dumb question but as a person who isn't too well-versed in programming what exactly do you mean by using it as a rendering engine? Do you mean letting a program other than MonoDevelop handle the code compiling or something far more elaborate? Don't use scenes (you probably only have one scene that exists to load your manager or whatever), don't use prefabs. Roll your own systems to replace them / roll your own toolset.
|
# ? Jul 20, 2015 15:05 |
|
OneEightHundred posted:I had to reread that, it sounded like he was talking about 8-bit games until I realized that the quote has nothing else to do with games at all. Samplers. And drum machines which used digital samples.

OneEightHundred posted:Also AFAIK, old game system sound was mostly crappy not so much because of the sound bit depth, but because it was mostly done with FM synthesis, in some cases with FM synth processors that could only do square and sawtooth waves. Maybe it's both though.

FM (frequency modulation) synthesis is most common in early Sega consoles (Master System + Genesis/Megadrive) and arcade games of the mid-late 80s, which used a few different Yamaha sound chips like the YM2151 and YM2612. These are very similar in design to the FM implementation in the Yamaha DX7, which was the first (and is still the most important) keyboard FM synthesizer. FM synthesis is great for certain types of sounds - it can do airy or metallic or sparkly sounds amazingly well - and especially when it was new it sounded striking and distinctive and unlike anything that had come before. But FM isn't very good at modeling most real-world sounds, and it is notoriously difficult to design FM sounds compared to other methods of sound generation.

Also, classic FM synths don't know how to make anything other than a sine wave. If you want a square or sawtooth wave, you use one sine wave to modulate a second sine wave in a way which creates harmonics that approximate a square wave (you know how I said FM sound design was difficult?). Frequency modulation can be used as a technique in other types of synthesis too: any time you hear an old game with a police siren or pulsating UFO noise, that's one low frequency oscillator modulating the frequency of a second (audio-rate) tone generator.

But really, it's a little generous to call anything in the 8-bit (NES and earlier) game era "synthesis" at all. Mostly early games used simple digital tone generators to create square waves, pulse waves (a pulse wave is a square wave with a variable width), triangle waves (a triangle wave is a digital approximation of a sine wave - it's easier to compute a zig-zag than to do trigonometry), and white noise. There were some cool techniques like pulse modulation (changing the width of a pulse wave over time to make phasey/electronic sounds) that were in use as early as the Atari 2600, but up until the 16-bit era, game sound hardware by and large didn't have much concept of filters or envelopes, which are essential components of most definitions of "synthesis" (the SID chip in the Commodore 64 is an exception).

In the 1980s, digital recording in the form of PCM (pulse code modulation) sampling started to be a thing, though for a while memory costs were too prohibitive for it to be used as anything other than a wavetable (a sampled single wave cycle of a few milliseconds) or short lo-fi sound clips. In the 16-bit era, you had a combination of FM synthesis and increasingly sophisticated sample-playback synthesis like what's on the SNES. Nowadays, of course, memory is so cheap that pretty much everything gets generated as sampled sound, and mostly the only "synthesis" used in games is for dynamic effects like echo and filtering.
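Those tone generators are simple enough to sketch in a few lines (my own naive, non-band-limited illustration; real chips implemented these with hardware counters and duty-cycle registers):

```cpp
#include <cmath>

// `phase` is the position within one cycle, in [0, 1).
// A pulse wave with duty = 0.5 is a plain square wave; sweeping `duty`
// over time gives the pulse-width-modulation effect mentioned above.
float pulseWave(float phase, float duty) {
    return phase < duty ? 1.0f : -1.0f;
}

// Triangle: falls linearly from 1 to -1 over the first half of the cycle,
// then rises back - a sine approximation using only adds and compares,
// no trigonometry.
float triangleWave(float phase) {
    return 4.0f * std::fabs(phase - 0.5f) - 1.0f;
}
```

To actually play a tone at frequency f you'd step `phase` by `f / sampleRate` per sample and wrap it back into [0, 1), which is essentially what the chips' phase counters did in hardware.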
|
# ? Jul 20, 2015 15:26 |
|
poemdexter posted:I was thinking about HTR when I wrote my response and I agree with you. At most, I've worked with two developers and an artist and we already had a workflow agreed upon to help with some of the Unity quirks you mentioned. A team of your size would probably be like trying to herd cats to keep your Unity project in a state that can compile. TFU had upwards of 100 people working in Unity on it. We had a LOT of tech art scripts and importers and Perforce voodoo and automated build processes running constantly to keep everything together. From my dumb artist perspective, it worked out okay... mostly. (Then again, TFU died a horrible fiery mass-layoffs death - but I don't think Unity had much to do with that.)
|
# ? Jul 20, 2015 15:29 |
|
I've been feeling the Unity project management pain lately, but it's been mostly due to third party tools and assets that plug into Unity, for example ProBuilder, A* Pathfinding, etc. The Asset Store is both a blessing and a curse, where truly professional and rock solid stuff is pretty much non-existent; but gently caress it, I'd rather spend 50 dollars than work on my own system. Still, I feel like using a Quake BSP editor like TrenchBroom would be a way better solution than loving ProBuilder. The last "professional" Unity project I worked on (ver 3.5) was much less of a pain; all we used was NGUI and prime[31] stuff.
|
# ? Jul 20, 2015 18:14 |
|
ShinAli posted:I've been feeling the Unity project management pain lately but its been mostly due to third party tools and assets that use Unity, for example ProBuilder, AStar Pathfinding and etc. Asset Store is both a blessing and a curse, where truly professional and rock solid stuff is pretty much non-existent; but gently caress it I'd rather spend 50 dollars than work on my own system. Still, I feel like using a Quake BSP editor like TrenchBoom would be a way better solution than loving ProBuilder. NGUI and prime[31] would be considered "professional" to me. They both were required if you were doing mobile anything.
|
# ? Jul 20, 2015 18:28 |
|
Shalinor posted:Don't use scenes (you probably only have one scene that exists to load your manager or whatever), don't use prefabs. Roll your own systems to replace them / roll your own toolset. Yeah, I think serialization is one of Unity's biggest problems. There's a lot of wackiness that goes on when it serializes that can go awry, and source control only makes this worse. Prefabs especially can be such a mess. If you don't want to go the XML/JSON/whatever route and want to be able to drag-drop objects in the editor, ScriptableObjects seem a lot more consistent, especially if you don't treat them as a "DB" by having them store a big list of whatever data. They are still not mergeable in source control, but at least if you split all your data into individual ScriptableObjects (each spell, item, etc. as a file) you get a bit more flexibility, and any mishaps don't scuttle everything.
|
# ? Jul 20, 2015 19:25 |
|
FuzzySlippers posted:Yeah I think serialization is one of Unity's biggest problems. There's a lot of wackiness that goes on when it serializes that can go awry and source control only makes this worse. Especially prefabs can be such a mess. Are there any good resources on learning to work with ScriptableObjects 'correctly'?
|
# ? Jul 20, 2015 19:29 |