|
Apologies in advance if my tone sucks; I've had a horrible day.

lord funk posted:
You are insightful as always w-rw-rw-.

lord funk posted:
I finally got a chance to play around with things.

Napkin calculation: sqrt(100 * 1024 * 1024 / 300 / 32) is approximately 100. Your memory fail comes from the fact that you're switching between a poo poo ton of bitmaps - the size of which is roughly their actual number of pixels times the number of bytes per pixel.

lord funk posted:
Part 2: I switched to a generic UIView, where I run a CADisplayLink timer that sets the layer.contents to the next CGImageRef in sequence. If I store the images into an array, the memory starts out at zero but builds up to the same amount as earlier:

lord funk posted:
Part 3: Since keeping the memory down is my primary goal, I loaded the image data (as NSData) into an array, and I create the image ref each frame, then release it immediately. It definitely takes some CPU to do this, but I could make a buffer to load maybe 30 frames at a time as a tradeoff.

lord funk posted:
If I have a question it's for clarification here. I'm seeing that setting a layer's contents to a CGImage does take memory. Am I not doing it correctly?

This page might be partially enlightening: http://sean.voisen.org/blog/2013/04/high-performance-image-loading-os-x/ - it might be possible to decode CGImages on a background thread if you're committed to loving your CPU and, by extension, your battery.

But basically, if you want the sweet animation, you can totally do it in a performant, low-memory, low-CPU fashion. You just need to reduce your images to as few as possible, or even draw and save them in code, then animate them with Core Animation. Swapping hundreds of different images from your app to the CPU and crossing into the GPU, fast, *must* by necessity gently caress you on CPU or memory.

Doctor w-rw-rw- fucked around with this message at 06:03 on Aug 29, 2013
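For reference, a minimal sketch of the Part 3 approach quoted above (decode one frame per tick from compressed NSData, hand it to the layer, release immediately). Names like `frameData` and `frameIndex` are made up for illustration, and this assumes the frames are PNGs:

```objc
// Drive the animation with a CADisplayLink; decode each frame on demand
// so only one bitmap is resident at a time.
- (void)startAnimating {
    CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                      selector:@selector(tick:)];
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)tick:(CADisplayLink *)link {
    NSData *png = self.frameData[self.frameIndex]; // array of compressed PNG data
    self.frameIndex = (self.frameIndex + 1) % self.frameData.count;

    CGDataProviderRef provider =
        CGDataProviderCreateWithCFData((__bridge CFDataRef)png);
    CGImageRef frame = CGImageCreateWithPNGDataProvider(provider, NULL, false,
                                                        kCGRenderingIntentDefault);
    CGDataProviderRelease(provider);

    self.layer.contents = (__bridge id)frame;
    CGImageRelease(frame); // the layer retains its contents; we hold nothing else
}
```

This is what trades memory for CPU: the PNG->bitmap decode happens every frame.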
# ? Aug 29, 2013 02:27 |
|
To add on to this, as a separate post because it deserves special attention: the key to performant visuals on mobile (and maybe even generally) is to avoid copying memory. Remember that your images go from being compressed files on disk to being loaded into memory (possibly streamed in and evicted as small chunks as part of decoding), then decoded into a bitmap (definitely all in memory), then *copied to video memory*. Telling the display server to have the GPU map a texture with a different rotation (setting the transform does this for you) is vastly, vastly more performant than loading a new image. The former is basically zero CPU and memory, with a fixed and pretty much expected GPU cost. The latter falls flat from a GPU efficiency standpoint no matter what you do, and fails hard on either CPU or memory even if you try to be smart about it. So yeah, you can totally animate that in code. Are you going to eke any substantial performance or memory wins out of trying it with 300 images instead? The answer isn't just no, but gently caress no.
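For example, the zero-copy rotation described above is a one-liner with Core Animation - the bitmap stays on the GPU and only the transform changes (`someView` and the timing values are illustrative):

```objc
// Spin a layer's existing texture on the GPU; no new image data
// ever crosses from the app to video memory.
CABasicAnimation *spin =
    [CABasicAnimation animationWithKeyPath:@"transform.rotation.z"];
spin.fromValue   = @0;
spin.toValue     = @(2 * M_PI);   // one full revolution
spin.duration    = 2.0;
spin.repeatCount = HUGE_VALF;     // repeat forever
[someView.layer addAnimation:spin forKey:@"spin"];
```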
|
# ? Aug 29, 2013 02:47 |
|
Doctor w-rw-rw- posted:The key to performant visuals on mobile (and maybe even generally) is to avoid copying memory.

This all makes perfect sense to me, and not to be arrogant, I knew this already. But on this basis why do Apple themselves appear to animate large parts of the iOS UI with huge series of PNGs? There was a tool I used once which ripped through the iOS frameworks on your machine and dumped out every image it came across, which revealed things like (the one example that sticks in my head) the Siri purple microphone animation being made up of 100 different PNGs, one for each percentage level of volume input. Sure, the existence of these images doesn't mean they're definitely doing this, but it seems a strong indication to me. In this specific case maybe the purple glow would be more expensive to calculate/render at runtime, but the knock-on effect of loading all these images instead seems pretty extreme. Again, this wasn't the only example of this kind of thing, just the only one I can remember at 6am.
|
# ? Aug 29, 2013 06:21 |
|
Froist posted:This all makes perfect sense to me, and not to be arrogant, I knew this already. But on this basis why do Apple themselves appear to animate large parts of the iOS UI with huge series of PNGs?

Just take a look at <redacted> and it's blatantly clear that performance is no longer necessarily their strong suit. I'm not certain the extent to which I am able to expand on that. In any case, you shouldn't hold Apple up as a paragon of iOS app programming. The hardware is good enough that they don't need the best people pushing performance to the limits anymore. They also have a history of turning visual effects off on older hardware. The 4S had a big jump in fill rate over the 4, if memory serves. And yup, that Siri bit does sound pretty lazy.

To clarify my earlier notes, it's not just using tons of images that is bad performance. Loading and using images can be unnoticeable if you're not overwhelming memory and are managing it well. It's obviously the right thing to do in many cases. But if whatever you're doing involves animation and pushing new bits (or drawing) from the app into some sort of graphics context (CALayer, Core Graphics, OpenGL, whatever), you're probably using a fair amount of effort just for the trip from storage->CPU->GPU.
|
# ? Aug 29, 2013 06:51 |
|
Yeah, it's all about avoiding the memory copy. I used to work in performance for a different handset manufacturer/OS, and seeing this made me wonder whether Apple had a magic solution. Guess not. Just to be clear, it's not me that asked the original question; I was just hijacking the discussion.
|
# ? Aug 29, 2013 07:08 |
|
That does sound extra lazy, since that specific animation could obviously be drawn in code quickly (just set the layer's shadow radius and color), or if you had to use images could be handled with end images and one middle image repeated to fill. For the original question, I agree that if you can get at the source of the animation, it does appear to be a bunch of circles rotating on various orbits. It could be drawn in code if you were really serious about the performance aspect. Unfortunately I don't think DrawCode does animations, but in theory you could have an Illustrator plugin that exported as CoreGraphics calls, or maybe some translator from a vector animation format (not that I know if one exists).
|
# ? Aug 29, 2013 15:45 |
|
I am surprised at the Siri animation. For certain animations you save tons of time if you programmatically draw it (especially when you decide you want to change that shade of purple...). My particular animation could be approximated with a few simple shapes and drawn programmatically, but a) I like that it captures all the detail of the actual performance it came from, b) I don't want to spend a week re-coding the OpenGL drawing code / relationships to make the same animation, and c) it's really the only thing going on in its view, so I'm not battling with other processes. Then there's this alternate animation, which has three versions masked inside a folder so they bubble out:

So you're all totally right: best practice for memory / CPU / battery says it should be drawn in-app. I'll be sacrificing one (probably CPU, to be nice to other apps - it's what this app is all about) because I'm lazy.

Side note: took me about 9 hours to capture that animation. Had to write automated touch generation code for my performance screen, render and capture, shrink and animate... repeat because I didn't like a detail...
|
# ? Aug 29, 2013 20:07 |
|
Doctor w-rw-rw- posted:I would bet money on the possibility that your CPU hit is one of the loading file or PNG->bitmap decoding steps. Oh yeah it's totally the PNG->bitmap step. It's a really clear example that you get performance at the cost of [ ]CPU [ ]Memory <--pick one
|
# ? Aug 29, 2013 20:11 |
|
lord funk posted:It's a really clear example that you get performance at the cost of [ ]CPU [ ]Memory <--pick one That's incorrect, sometimes it's both.
|
# ? Aug 29, 2013 20:30 |
|
lord funk posted:I am surprised at the Siri animation. For certain animations you save tons of time if you programmatically draw it (especially when you decide you want to change that shade of purple...).

Believe me, Core Animation is a lot simpler than you'd guess. If you can write a KVC-compliant property that modifies the relevant transform/bounds/position, you can just write a CABasicAnimation to tween a start and end value for you. At the very least, I would suggest trying a simple animation group to get a feel for it. No need to stray anywhere near icky low-level OpenGL code there.

BTW, maybe you can help me with something. I want to take a CGImage, load it into an FBO, run a shader on it for a disc blur (additional reference: CIDiscBlur - it's basically a blur where a pixel blows up into a circle - like the film effect where you go out of focus and lights turn into circles), then render that back into a CGImage. I'm clueless on everything there that involves OpenGL (ES2 of course). Do you think you'd have anything helpful to offer here?
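To make the "simple animation group" suggestion above concrete, here's a sketch that tweens two standard CALayer properties together (`layer` is whatever layer you're animating; durations and values are made up):

```objc
// Tween position and opacity together with a CAAnimationGroup.
CABasicAnimation *move = [CABasicAnimation animationWithKeyPath:@"position"];
move.fromValue = [NSValue valueWithCGPoint:CGPointMake(0, 100)];
move.toValue   = [NSValue valueWithCGPoint:CGPointMake(200, 100)];

CABasicAnimation *fade = [CABasicAnimation animationWithKeyPath:@"opacity"];
fade.fromValue = @0.0;
fade.toValue   = @1.0;

CAAnimationGroup *group = [CAAnimationGroup animation];
group.animations = @[move, fade];
group.duration   = 0.5;            // both animations share this duration
[layer addAnimation:group forKey:@"moveAndFade"];
```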
|
# ? Aug 29, 2013 20:50 |
|
Doctor w-rw-rw- posted:Believe me, Core Animation is a lot simpler than you'd guess. If you can write a KVC-compliant property that modifies the relevant transform/bounds/position, you can just write a CABasicAnimation to tween a start and end value for you. At the very least, I would suggest trying a simple animation group to get a feel for it. No need to stray anywhere near icky low-level OpenGL code there. quote:BTW, maybe you can help me with something. I want to take a CGImage, load it into a FBO, run a shader on it for a disc blur, (additional reference: CIDiscBlur - it's basically a blur where a pixel blows up into a circle - like the film effect where you go out of focus and lights turn into circles), then render that back into a CGImage. I'm clueless on everything there that involves OpenGL (ES2 of course). Do you think you'd have anything helpful to offer here? I might have something helpful in the form of someone else's work: GPUImage https://github.com/BradLarson/GPUImage I saw this demonstrated a week ago, and it's a huge bank of filters and supposedly cuts waaaay down on the boilerplate backend code.
|
# ? Aug 29, 2013 21:52 |
|
Does anyone know about using iRate in iOS 7? It seems like the URL we used for app ratings in iOS 6 doesn't work anymore in 7.
Juul-Whip fucked around with this message at 00:07 on Aug 30, 2013 |
# ? Aug 29, 2013 23:45 |
|
Doctor w-rw-rw- posted:BTW, maybe you can help me with something. I want to take a CGImage, load it into a FBO, run a shader on it for a disc blur, (additional reference: CIDiscBlur - it's basically a blur where a pixel blows up into a circle - like the film effect where you go out of focus and lights turn into circles), then render that back into a CGImage. I'm clueless on everything there that involves OpenGL (ES2 of course). Do you think you'd have anything helpful to offer here? GPUImage is probably the way to go for this. The big thing is I'd skip the intermediate step of turning it into a CGImage again and have the filter output to a GPUImageView directly. My only caution would be that GPUImage's development definitely tends toward the camera-filter usage, rather than filtering static images. There have been changes before that break the static stuff unintentionally (though not in a while AFAIK). GPUImage is super super nice for avoiding dealing with OpenGL and getting right to using a shader. If you need the CGImage for some reason, like you're going to save the output or do something else CPU-based with it, you might be best off using the Accelerate framework's vImage convolutions to do a blur (just look up a Gaussian kernel or something). Both GPUImage and CoreImage will give you overhead copying CGImages into FBOs and FBOs into CGImages. I can't find my benchmarks from a while back, but we were processing and saving the output to disk, and on a 4S a simple 3*3 blur was slightly faster than GPUImage and in-place color matrix recoloring was like 20% faster. Also, regardless, disc blur is probably a Poisson blur, which is something you don't want in any interactive code because it is slooooowww.
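A rough sketch of the vImage route mentioned above, using a 3x3 box kernel as a stand-in for a proper blur kernel. Setting up the `vImage_Buffer`s from a CGImage (via a bitmap context) is omitted, and the flag choice is one reasonable option, not the only one:

```objc
#import <Accelerate/Accelerate.h>

// src and dest wrap 8-bit ARGB bitmap data (e.g. from CGBitmapContextCreate);
// their allocation and setup is omitted here.
static void BoxBlurARGB8888(const vImage_Buffer *src, const vImage_Buffer *dest) {
    // 3x3 box kernel; the divisor normalizes the weighted sum back into 0-255.
    const int16_t kernel[9] = {1, 1, 1,
                               1, 1, 1,
                               1, 1, 1};
    vImage_Error err = vImageConvolve_ARGB8888(src, dest,
                                               NULL,  // vImage allocates temp storage
                                               0, 0,  // region-of-interest offset
                                               kernel, 3, 3,
                                               9,     // divisor = sum of kernel entries
                                               NULL,  // background color (unused)
                                               kvImageEdgeExtend);
    if (err != kvImageNoError) {
        NSLog(@"vImageConvolve failed: %ld", (long)err);
    }
}
```

Everything stays on the CPU, which is exactly why it avoids the CGImage<->FBO copy overhead described above.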
|
# ? Aug 29, 2013 23:47 |
|
lord funk posted:I might have something helpful in the form of someone else's work: GPUImage

I just used this in an app I started on, and it's easy to use and really fast. I'm just using it to darken and blur a user selected image, but it was 6 lines of code to do it. Good stuff.
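Presumably something like the following, for the darken-and-blur case mentioned above. Treat the exact method names as approximate - GPUImage's still-image capture API has shifted across versions - and the filter values are illustrative:

```objc
#import "GPUImage.h"

// Darken, then Gaussian-blur, a still image entirely on the GPU.
GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:inputImage];

GPUImageBrightnessFilter *darken = [[GPUImageBrightnessFilter alloc] init];
darken.brightness = -0.3;                 // range is -1.0 to 1.0

GPUImageGaussianBlurFilter *blur = [[GPUImageGaussianBlurFilter alloc] init];
blur.blurRadiusInPixels = 4.0;            // older releases called this blurSize

[source addTarget:darken];                // chain: picture -> darken -> blur
[darken addTarget:blur];

[blur useNextFrameForImageCapture];
[source processImage];
UIImage *result = [blur imageFromCurrentFramebuffer];
```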
|
# ? Aug 30, 2013 00:10 |
|
Hmm, nope. My goal isn't to slot in a library and transform an image, it's to write a shader. And the one thing I want to do right now is take a CGImage as input, apply a special blur with OpenGL, then take the output.
|
# ? Aug 30, 2013 01:00 |
|
It's pretty bread and butter with OpenGL, but there are a lot of moving parts to get a handle on before you can do that. There is a book, "iPhone 3D Programming" by Philip Rideout (O'Reilly), which can help get you up to speed. It's a great book and one of the most enjoyable programming books I have read in years. In a nutshell, you would create an FBO, load your image into a GL texture, pass that to a shader as a uniform texture (using sampler2D) which renders to the FBO, then use the FBO texture as your processed image. It is straightforward enough, but OpenGL and shaders require a bit of time to get familiar with.
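In (heavily abbreviated) code, the render-to-texture setup described in the nutshell above looks roughly like this. Shader compilation, the source-texture upload, and the quad's vertex setup are all omitted; `blurProgram`, `srcTex`, and `pixels` stand in for those:

```objc
// Create a texture to render into, and an FBO that targets it.
GLuint destTex, fbo;
glGenTextures(1, &destTex);
glBindTexture(GL_TEXTURE_2D, destTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);   // allocate, no initial data
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, destTex, 0);
NSCAssert(glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE,
          @"FBO incomplete");

// Bind the source image (already uploaded as srcTex) to the shader's
// sampler2D uniform, draw a full-screen quad into the FBO, then read back.
glUseProgram(blurProgram);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, srcTex);
glUniform1i(glGetUniformLocation(blurProgram, "u_texture"), 0);
glViewport(0, 0, width, height);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
// `pixels` can then be wrapped back into a CGImage via CGBitmapContextCreate.
```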
|
# ? Aug 30, 2013 01:54 |
|
Has anyone used FMOD Studio for iOS? Looks really amazing for organizing audio events and setting different properties like pitch randomization, wet/dry levels and priority. I was considering using it, but I don't know if it would be overkill for a 2D adventure game. There's lots of sprites hopping around, doing things, doors opening, things being collected, etc., and I can see where it would be helpful especially for creating some kind of ceiling for maximum audio events and priority. Couldn't find videos on their site, but here's one if anyone's curious: https://www.youtube.com/watch?v=vr6pCpV9mO8 LP0 ON FIRE fucked around with this message at 03:08 on Aug 30, 2013
# ? Aug 30, 2013 03:04 |
|
Doctor w-rw-rw- posted:Hmm, nope. My goal isn't to slot in a library and transform an image, it's to write a shader. And the one thing I want to do right now is take a CGImage as input it, apply a special blur with OpenGL, then take the output. Why not fire up the OpenGL Game template and modify what's already there? It's got the shader there and ready to go.
|
# ? Aug 30, 2013 20:35 |
|
Just curious if anyone has run into performance problems using autolayout for UICollectionViewCells (or UITableViewCells, for that matter)? Wondering if I should do the layout by hand instead.
|
# ? Aug 30, 2013 22:03 |
|
Doctor w-rw-rw- posted:Hmm, nope. My goal isn't to slot in a library and transform an image, it's to write a shader. And the one thing I want to do right now is take a CGImage as input it, apply a special blur with OpenGL, then take the output. You can write your own shaders for use with GPUImage, IIRC.
|
# ? Aug 31, 2013 20:46 |
|
NoDamage posted:Just curious if anyone has run into performance problems using autolayout for UICollectionViewCells (or UITableViewCells, for that matter)? Wondering if I should do the layout by hand instead. Have you used Time Profiler to make sure it's an autolayout issue? Maybe too many resources are being loaded, or it could be something completely different.
|
# ? Aug 31, 2013 20:57 |
|
Is there a way to skin NSMenu? We have a nice look going in OpenEmu, but our implementation basically replicates the standard functionality really badly and the code is super ugly and requires a couple hacks.
|
# ? Aug 31, 2013 23:58 |
|
I need to learn Objective-C & iOS development really fast so I can hit the ground running (and lead a team) on a major project. Hooray startups and rapidly changing things! I'm a strong generalist programmer and definitely don't need intro to OO concepts or pointers or anything like that. Most immediately I've been doing Android development. What's a good place to learn the stuff I need (Objective-C, frameworks, design patterns) ASAP? Are Apple's docs the best for an "expert-level" overview? Any solid books? edit: the product is basically an application-as-an-SDK, there is a huge focus on UI/UX, and there will be lots of animations and network calls. admiraldennis fucked around with this message at 12:28 on Sep 1, 2013 |
# ? Sep 1, 2013 12:18 |
|
admiraldennis posted:I need to learn Objective-C & iOS development really fast so I can hit the ground running (and lead a team) on a major project. Hooray startups and rapidly changing things!

Objective-C: read the PDF http://chachatelier.fr/programmation/fichiers/cpp-objc-en.pdf - it's pre-ARC, but it will give you a grounding in where the language is coming from.

UI/UX frameworks: there's definitely an 'Apple way' of doing things; if you have the time, hit iTunes U for the Stanford University lectures. Watch them at 1.5x speed.

If you need a book, I don't have a good recommendation I'm afraid.
|
# ? Sep 1, 2013 15:04 |
|
admiraldennis posted:I need to learn Objective-C & iOS development really fast so I can hit the ground running (and lead a team) on a major project. Hooray startups and rapidly changing things! Read the docs for most of the rest - they're really well organized and will have a lot of really useful information. Skip books. They'll all suck or be outdated or just show you how to make vanilla user interfaces, which you can learn yourself. PM me with your IM information, or I guess we could set up an IRC channel so you can spam someone who knows more than you with questions. That's how I learned - spamming my friend at Apple with requests for more information.
|
# ? Sep 1, 2013 19:09 |
|
Doctor w-rw-rw- posted:PM me with your IM information, or I guess we could set up an IRC channel so you can spam someone who knows more than you with questions. That's how I learned - spamming my friend at Apple with requests for more information. I would kill for an ios focused IRC channel. I don't have PMs, but if you set it up, please post here.
|
# ? Sep 1, 2013 19:15 |
|
Malloreon posted:I would kill for an ios focused IRC channel. I don't have PMs, but if you set it up, please post here. I'm in. I've never really used IRC so y'all can school me.
|
# ? Sep 1, 2013 19:22 |
|
I've got an iPad 3rd gen. running at 60fps, and an iPad 4th gen. running at 29fps. It has to have something to do with the iOS 7 blur effect being active on the 4th gen and not the 3rd gen. Uuugghhh feel like I'm wasting time tracking this down.
|
# ? Sep 1, 2013 19:27 |
|
Doctor w-rw-rw- posted:Write some UIViews with custom CALayer subclasses, play with AFNetworking (don't waste time on low level networking stuff for now if HTTP/HTTPS is good enough), write a simple uiviewcontroller which contains other uiviewcontrollers using containment. As a gateway to understanding memory management, know how autorelease works as deeply as possible. Skim https://github.com/toulouse/FourHundred for really basic core graphics drawing. Great info, thanks for this. I'll definitely be hitting you up soon with nagging questions.
|
# ? Sep 2, 2013 18:45 |
|
admiraldennis posted:Great info, thanks for this. Sure. I haven't set up any irc channel because I wouldn't idle in one until the work week starts.
|
# ? Sep 2, 2013 20:16 |
|
lord funk posted:Have you used Time Profiler to make sure it's an autolayout issue? Maybe too many resources are being loaded, or it could be something completely different.

Yeah. Sadly I was getting big spikes in CPU activity whenever [UICollectionView performBatchUpdates:] was called and it seemed to be getting stuck in the autolayout engine. For now I've removed autolayout from my cells and performance has increased quite a bit.
|
# ? Sep 2, 2013 20:55 |
|
I've just created #appledevgoons on synirc. server is irc.synirc.net
|
# ? Sep 2, 2013 21:47 |
|
NoDamage posted:Yeah. Sadly I was getting big spikes in CPU activity whenever [UICollectionView performBatchUpdates:] was called and it seemed to be getting stuck in the autolayout engine. For now I've removed autolayout from my cells and performance has increased quite a bit.

Check that you've implemented isEqual on the layouts, I think. They added a property in iOS 7 that contains an object whose default isEqual tests for referential equality. I don't really know the details myself, as I haven't run into it, but I've heard that was a thing.

Doctor w-rw-rw- fucked around with this message at 00:50 on Sep 3, 2013
# ? Sep 2, 2013 22:00 |
|
Doctor w-rw-rw- posted:Check that you've implemented isEqual on the layouts, I think. They added a property in iOS 7 that contains an object whose default isEqual tests for referential equality. I don't really know the details myself, as I haven't run into it, but I've heard that was a thing.

I don't understand what you're saying here. Are you referring to NSLayoutConstraint, UICollectionViewLayout, or something else entirely? I don't see where isEqual fits into any of that...
|
# ? Sep 3, 2013 05:59 |
|
NoDamage posted:I don't understand what you're saying here. Are you referring to NSLayoutConstraint, UICollectionViewLayout, or something else entirely? I don't see where isEqual fits into any of that...

Ah, just looked it up. In iOS 7, UICollectionViews don't call applyLayoutAttributes: on custom UICollectionViewCells if the layout attributes haven't changed, and they check with isEqual:. Probably not your issue, albeit a potential pitfall. Sorry, I'm off my rocker; haven't been myself today.
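For reference, the pitfall looks like this: if you subclass UICollectionViewLayoutAttributes with custom properties (registered via the layout's +layoutAttributesClass), iOS 7 can skip applyLayoutAttributes: unless isEqual: compares those properties too. `MyAttributes` and `glowRadius` are hypothetical names:

```objc
@interface MyAttributes : UICollectionViewLayoutAttributes
@property (nonatomic) CGFloat glowRadius; // hypothetical custom property
@end

@implementation MyAttributes

- (BOOL)isEqual:(id)other {
    if (![super isEqual:other]) return NO;
    if (![other isKindOfClass:[MyAttributes class]]) return NO;
    // Without this comparison, two attribute objects that differ only in
    // glowRadius compare equal, and the cell never sees the update.
    return self.glowRadius == ((MyAttributes *)other).glowRadius;
}

// Layout attributes are copied; custom state must survive the copy.
- (id)copyWithZone:(NSZone *)zone {
    MyAttributes *copy = [super copyWithZone:zone];
    copy.glowRadius = self.glowRadius;
    return copy;
}

@end
```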
|
# ? Sep 3, 2013 09:33 |
|
Is there a way to have a UITableView immediately select its cells upon touch begin and cancel them on scroll, rather than wait until touch end as it normally does? It appears to ignore the delaysContentTouches flag from UIScrollView.
|
# ? Sep 3, 2013 20:43 |
|
Doctor w-rw-rw- posted:Ah, just looked it up. In iOS 7, UICollectionViews don't call applyLayoutAttributes on custom UICollectionViewCells if the layout attributes haven't changed, and checks with isEqual. Probably not your issue, albeit a potential pitfall.
|
# ? Sep 3, 2013 22:22 |
|
Malloreon posted:I've just created #appledevgoons on synirc. server is irc.synirc.net It might be worth adding this info to the OP.
|
# ? Sep 4, 2013 00:59 |
|
Axiem posted:It might be worth adding this info to the OP. Beat you by over an hour!
|
# ? Sep 4, 2013 04:27 |
Is there a reason why iOS REDACTED UIDatePickers go over the bounds they display in Interface Builder (and start slightly lower than they display)? I'd hope by preview 6, that sort of thing would be sorted.
|
# ? Sep 4, 2013 05:10 |