SaTaMaS
Apr 18, 2003

TheReverend posted:

UI tests.

Sometimes it finds my elements?! Sometimes not. I'd say 90% of the time they're found for any single test.

Now do this with 20 tests and you can see why making this part of our CI flow pisses me off!

I'd really like the folks at Apple to not make this suck so hard!

Any hot tips?

(Sorry for the long delay, I just randomly decided to check the thread today) If these are dynamically added elements, you need to set accessibilityElements on the cell and post a layout changed notification. e.g., at the end of the listener where the UI is added:

self.accessibilityElements = [your new UI];
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil)

That updates the list of elements that the UI test can access.
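For a fuller picture, here's a minimal sketch of that pattern in a cell that adds subviews from a callback. The view names are made up, and UIAccessibility.post is just the Swift 4.2 spelling of the notification call above:

code:
// Call this wherever the cell finishes adding its dynamic subviews.
func didFinishAddingDynamicViews() {
    contentView.addSubview(statusLabel)
    contentView.addSubview(retryButton)

    // Replace the cell's accessibility tree with the current set of views...
    accessibilityElements = [titleLabel, statusLabel, retryButton]

    // ...and tell the system (and any running UI test) that the layout changed.
    UIAccessibility.post(notification: .layoutChanged, argument: nil)
}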

Good Sphere
Jun 16, 2018

Edit: just a weird problem I had where two masters showed up in my Source Control Navigator. Unsure how this happened, but it's now fixed.
I moved the hidden ".git" folder outside my main project folder. I could then create a new repository within the Source Control Navigator, create a new remote, and the appropriate projects became visible on GitHub.

Good Sphere fucked around with this message at 18:33 on Feb 13, 2019

Good Sphere
Jun 16, 2018

Remember all the posts I made about getting worse CPU performance on newer phones with live camera CIFilters? I'm sure you do. I have a simplified example project on GitHub if anyone wants to take a look and investigate why this is happening: https://github.com/PunchyBass/Live-Filter-test-project

lord funk
Feb 16, 2004

I just got a fun email from a user. He desperately needs an 'app re-skin' done and is looking for a developer. Of course, it's an easy copy of another existing app, with a new interface 'to avoid copyright infringement.' And he was willing to pay... 'several hundred dollars if necessary!'

Doc Block
Apr 15, 2003
Fun Shoe
Starting a new high end car company. Just need a mechanic to look at a picture of a Bugatti Veyron and make one a little bit different to avoid copyright infringement. Will pay several hundred dollars if necessary!

JawnV6
Jul 4, 2004

So hot ...

Good Sphere posted:

Remember all the posts I made about getting worse CPU performance on newer phones with live camera CIFilters? I'm sure you do. I have a simplified example project on GitHub if anyone wants to take a look and investigate why this is happening: https://github.com/PunchyBass/Live-Filter-test-project

I still don't understand what your expectations are or what the use case is. CPU% is meaningless on its own and apples-to-oranges across years of releases. You're carrying this pervasive assumption that it must be lower for the newer generation and you haven't been that direct with it. Do you need to do CPU-bound work in the idle spots? Measure that instead. Is there a "correct" CPU% that you absolutely must hit? Share it.

Doc Block
Apr 15, 2003
Fun Shoe
I’ll take a look at your sample tonight or tomorrow, but for now, to follow up on what JawnV6 said, look in Instruments and in Apple’s Metal debug tools to see where the CPU and GPU time is being spent. If your code’s CPU and GPU time is the same or lower, don’t worry about it.

While it’s weird that a newer device is taking more CPU and GPU time to do the same task, the CIFilter you’re running is Apple’s code, and it could be doing something differently on the newer device, etc.
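If it helps, one way to make "where is the CPU time going" concrete in Instruments is to bracket the suspect work with signposts; the subsystem and category strings here are made up:

code:
import os.signpost

let renderLog = OSLog(subsystem: "com.example.camerafilter", category: "render")

func draw(in view: MTKView) {
    // Shows up as a labeled interval in Instruments' os_signpost track.
    os_signpost(.begin, log: renderLog, name: "filter+render")
    // ... apply the CIFilter and encode the render pass ...
    os_signpost(.end, log: renderLog, name: "filter+render")
}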

Good Sphere
Jun 16, 2018

JawnV6 posted:

I still don't understand what your expectations are or what the use case is. CPU% is meaningless on its own and apples-to-oranges across years of releases. You're carrying this pervasive assumption that it must be lower for the newer generation and you haven't been that direct with it. Do you need to do CPU-bound work in the idle spots? Measure that instead. Is there a "correct" CPU% that you absolutely must hit? Share it.

I don't really have any expectations, but I don't think I'm out of line for being suspicious that a phone four generations newer is getting much worse performance, and that it's worth investigating. If I don't figure it out soon, I'm not going to delay releasing it, but I think it could have a negative impact and scare people off from using it. It's more obvious than just the CPU readout I'm getting: I can actually feel the XS get hot, and it drains the battery more quickly. I've looked at the idle time, and decreased it through experimentation. I don't know precisely the best way to split the work between the CPU and GPU so there's minimal idle time. I know it's best not to trade work back and forth between the CPU and GPU, and I'm suspicious that's still happening. I expect the CPU to be below 20%.


Doc Block posted:

I’ll take a look at your sample tonight or tomorrow, but for now, to follow up on what JawnV6 said, look in Instruments and in Apple’s Metal debug tools to see where the CPU and GPU time is being spent. If your code’s CPU and GPU time is the same or lower, don’t worry about it.

While it’s weird that a newer device is taking more CPU and GPU time to do the same task, the CIFilter you’re running is Apple’s code, and it could be doing something differently on the newer device, etc.


Thanks. The biggest CPU impact happens in MTKView's draw function, when render is called.

I've been afraid that the newer phones handle CIFilters differently. If that's the case, it should be fixed or labeled as deprecated.
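For context, the hot path looks roughly like this; a simplified sketch rather than the exact project code, with device, commandQueue, filter, and cameraImage standing in for the real properties:

code:
// Created once, not per frame; building a CIContext per frame burns CPU.
let ciContext = CIContext(mtlDevice: device)

func draw(in view: MTKView) {
    guard let drawable = view.currentDrawable,
          let commandBuffer = commandQueue.makeCommandBuffer() else { return }

    filter.setValue(cameraImage, forKey: kCIInputImageKey)
    guard let output = filter.outputImage else { return }

    // Most of the CPU time shows up here: Core Image builds its render
    // graph on the CPU before handing the actual work to the GPU.
    ciContext.render(output,
                     to: drawable.texture,
                     commandBuffer: commandBuffer,
                     bounds: output.extent,
                     colorSpace: CGColorSpaceCreateDeviceRGB())

    commandBuffer.present(drawable)
    commandBuffer.commit()
}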

SaTaMaS
Apr 18, 2003
I need to do a Hilbert transform on some data. A quick Google search turned up this answer on Stack Overflow. Is this about as good as it gets, or is there a better version available?

https://stackoverflow.com/a/21907439
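That answer is the usual Accelerate approach. For reference, a self-contained sketch of the same idea (complex FFT, double the positive frequencies, zero the negative ones, inverse FFT), assuming a power-of-two length:

code:
import Accelerate

// Discrete Hilbert transform via the analytic signal.
func hilbert(_ x: [Float]) -> [Float] {
    let n = x.count
    let log2n = vDSP_Length(log2(Float(n)))
    guard let setup = vDSP_create_fftsetup(log2n, FFTRadix(kFFTRadix2)) else { return [] }
    defer { vDSP_destroy_fftsetup(setup) }

    var re = x
    var im = [Float](repeating: 0, count: n)
    re.withUnsafeMutableBufferPointer { reBuf in
        im.withUnsafeMutableBufferPointer { imBuf in
            var split = DSPSplitComplex(realp: reBuf.baseAddress!, imagp: imBuf.baseAddress!)
            // Forward FFT of the real signal treated as complex.
            vDSP_fft_zip(setup, &split, 1, log2n, FFTDirection(FFT_FORWARD))
            // Analytic signal: keep DC and Nyquist, double the positive
            // frequencies, zero the negative ones.
            for k in 1..<(n / 2) { reBuf[k] *= 2; imBuf[k] *= 2 }
            for k in (n / 2 + 1)..<n { reBuf[k] = 0; imBuf[k] = 0 }
            vDSP_fft_zip(setup, &split, 1, log2n, FFTDirection(FFT_INVERSE))
        }
    }
    // vDSP FFTs are unscaled; the imaginary part / n is the Hilbert transform.
    return im.map { $0 / Float(n) }
}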

jackpot
Aug 31, 2004

First cousin to the Black Rabbit himself. Such was Woundwort's monument...and perhaps it would not have displeased him.
Does this have a name, besides Smart App Banner? I could've sworn it had a catchier, shorter name.

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.
Catchier than what I usually call it, “the lil app banner that you can put on websites that can do the associated domain thingy”.

Doh004
Apr 22, 2007

Mmmmm Donuts...
Yeah that's just the Universal Links app banner.

Which is different from the meta-tag-driven Smart Banner.
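For reference, the Smart Banner is the one you opt into with a single meta tag (the app-id below is a placeholder):

code:
<meta name="apple-itunes-app" content="app-id=123456789, app-argument=https://example.com/thing">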

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.

Doh004 posted:

Yeah that's just the Universal Links app banner.

Which is different from the meta-tag-driven Smart Banner.

Is it? I thought they were indistinguishable.

dizzywhip
Dec 23, 2005

Anyone have any good resources for learning audio programming (sequencing in particular) on Apple platforms? I'm currently using a combination of Core Audio, Core MIDI, and AVAudioEngine to do some relatively simple dynamic sequencing based on user input (i.e. the user writes out some music, then I create a MIDI sequence out of it that gets played back), and I've stumbled along for a while getting the basics working, but I keep hitting road blocks when I get to more involved functionality. For example, there are a lot of things that I want to trigger in sync with the music -- looping sections, automatically stopping playback when the end of the music is reached, flashing a UI component in time with metronome clicks, etc. I've set up custom MIDI events that I can respond to, which sort of work, but the timing is so out of sync that it's not really usable, and I don't know how to even begin debugging it.

I want to spend some time properly learning about audio programming so I can have more control over things like that instead of just fumbling around, but good resources seem hard to come by. Apple's documentation is usually pretty sparse or seemingly out of date, and I haven't found anything that covers audio at a fundamental level -- it's all about using specific APIs and assumes some prior knowledge. The few audio-centric WWDC sessions I've seen are fairly high-level or mostly just cover API changes. Any links to books or online resources would be really appreciated 🙏 Non-Apple stuff is welcome if it seems helpful, though learning from the perspective of Apple's audio APIs would be nice.

Doh004
Apr 22, 2007

Mmmmm Donuts...

pokeyman posted:

Is it? I thought they were indistinguishable.

Si senor.

Old smart banner that you can't deeplink from:


Universal Links banner (Apple docs didn't come up in a cursory search; stole an image of it):

lord funk
Feb 16, 2004

dizzywhip posted:

Anyone have any good resources for learning audio programming (sequencing in particular) on Apple platforms? I'm currently using a combination of Core Audio, Core MIDI, and AVAudioEngine to do some relatively simple dynamic sequencing based on user input (i.e. the user writes out some music, then I create a MIDI sequence out of it that gets played back), and I've stumbled along for a while getting the basics working, but I keep hitting road blocks when I get to more involved functionality. For example, there are a lot of things that I want to trigger in sync with the music -- looping sections, automatically stopping playback when the end of the music is reached, flashing a UI component in time with metronome clicks, etc. I've set up custom MIDI events that I can respond to, which sort of work, but the timing is so out of sync that it's not really usable, and I don't know how to even begin debugging it.

I want to spend some time properly learning about audio programming so I can have more control over things like that instead of just fumbling around, but good resources seem hard to come by. Apple's documentation is usually pretty sparse or seemingly out of date, and I haven't found anything that covers audio at a fundamental level -- it's all about using specific APIs and assumes some prior knowledge. The few audio-centric WWDC sessions I've seen are fairly high-level or mostly just cover API changes. Any links to books or online resources would be really appreciated 🙏 Non-Apple stuff is welcome if it seems helpful, though learning from the perspective of Apple's audio APIs would be nice.

I develop synths on iOS. You do not want to learn CoreAudio as an approach to developing a sequencer, and AVAudio is totally useless (it's designed for incidental, non-timing-essential sound). Rather, you should look at existing audio engines that can be loaded onto iOS.

I use libpd. It allows you to run Pd (Pure Data) patches on iOS, and gives you hooks to communicate between the audio graph and the user interface. So the idea is that you design your audio / sequencer in Pd, then load it into an app, and control it from the UI side. There are tons of apps that use it, including Arpeggionome, which is a sequencing app.
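The glue code is small, too. Roughly this, going from memory, so treat the exact spellings as approximate; "sequencer.pd" and the "tempo" receiver are invented for illustration:

code:
// Stand up the audio session/graph that hosts Pd.
let audio = PdAudioController()
audio?.configurePlayback(withSampleRate: 44100, numberChannels: 2,
                         inputEnabled: false, mixingEnabled: true)

// Load the patch you built and tested on the desktop.
PdBase.openFile("sequencer.pd", path: Bundle.main.resourcePath)
audio?.isActive = true

// UI side -> audio graph: send to a named receiver inside the patch.
PdBase.sendFloat(120, toReceiver: "tempo")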

As for perfect synchronization, you'd want to look into adding Ableton Link into your app, and designing around that paradigm. It's sample-accurate time sync between iOS apps.

Any other questions, just let me know.

dizzywhip
Dec 23, 2005

lord funk posted:

I develop synths on iOS. You do not want to learn CoreAudio as an approach to developing a sequencer, and AVAudio is totally useless (it's designed for incidental, non-timing-essential sound). Rather, you should look at existing audio engines that can be loaded onto iOS.

I use libpd. It allows you to run Pd (Pure Data) patches on iOS, and gives you hooks to communicate between the audio graph and the user interface. So the idea is that you design your audio / sequencer in Pd, then load it into an app, and control it from the UI side. There are tons of apps that use it, including Arpeggionome, which is a sequencing app.

As for perfect synchronization, you'd want to look into adding Ableton Link into your app, and designing around that paradigm. It's sample-accurate time sync between iOS apps.

Any other questions, just let me know.

Thanks for your help! Any particular reason to avoid Core Audio? Is it just too limited? I was hoping to stick to system APIs if possible, but if there's something third-party that works better, then maybe that's the way to go, especially in the short term. But longer term I was hoping to learn more about low-level audio programming in general so that I at least understand more about how these libraries work under the hood.

In any case, I took a look at libpd and Ableton Link and they seem interesting, but I'm not sure they fit my use case. Correct me if I'm wrong, but it looks like PD has you create patches using a GUI that you can load up to play at runtime, but won't really help if I need to do the actual sequencing at runtime. And Ableton Link seems to be for synchronizing audio across devices, but I don't currently have any plans to involve multiple devices or even interface with any MIDI controllers -- everything is happening locally in one app, I just need to synchronize non-audio code with the audio right now.

For now I'm gonna poke around and see if I can find other audio libraries that would be helpful and maybe try diving into the PD source code and see if I can learn anything there. I was looking at AudioKit a while back, but their sequencing APIs were a pretty thin and limited wrapper around Core Audio sequencing, so it wasn't very useful.

Nice work on TC-11 by the way! I'm messing around with it and it's pretty sweet.

Simulated
Sep 28, 2001
Lowtax giveth, and Lowtax taketh away.
College Slice

dizzywhip posted:

Thanks for your help! Any particular reason to avoid Core Audio? Is it just too limited? I was hoping to stick to system APIs if possible, but if there's something third-party that works better, then maybe that's the way to go, especially in the short term. But longer term I was hoping to learn more about low-level audio programming in general so that I at least understand more about how these libraries work under the hood.

In any case, I took a look at libpd and Ableton Link and they seem interesting, but I'm not sure they fit my use case. Correct me if I'm wrong, but it looks like PD has you create patches using a GUI that you can load up to play at runtime, but won't really help if I need to do the actual sequencing at runtime. And Ableton Link seems to be for synchronizing audio across devices, but I don't currently have any plans to involve multiple devices or even interface with any MIDI controllers -- everything is happening locally in one app, I just need to synchronize non-audio code with the audio right now.

For now I'm gonna poke around and see if I can find other audio libraries that would be helpful and maybe try diving into the PD source code and see if I can learn anything there. I was looking at AudioKit a while back, but their sequencing APIs were a pretty thin and limited wrapper around Core Audio sequencing, so it wasn't very useful.

Nice work on TC-11 by the way! I'm messing around with it and it's pretty sweet.

CoreAudio is really good. I'm biased, but no other platform does nearly as good a job. You can run live audio mixing through a Mac, and lots of systems do.

lord funk
Feb 16, 2004

Don't get me wrong - CoreAudio is pretty baller. I just played a concert where I had 8 iPads all running as DACs on a Mac, plus a MOTU 8 channel interface, all combined into a new 'Aggregate Device'. It was pretty sweet.

My warning is that, like all things, you should consider using existing audio engines before writing your own. It's like using Unity to make a game instead of learning to draw triangles on screen yourself with OpenGL.

The benefit of using Pd is that you *can* do low level audio programming by creating your own objects, and that all integrates nicely with Pd's audio callback. It's suuuuper nice to be able to create a great audio graph in an environment built for it (Pd), test it out on the desktop, then load it onto an iOS device and control it from UIKit.

Pd patches run realtime -- you can create a sequencer patch in Pd, then as you run it on your iOS device you can alter everything from the sequence contents to the playback controls, etc. TC-11 is built around this. Every touch controller fires a message into the Pd graph to change the parameters in realtime.

CoreAudio is the backend that gets your audio render callback working; it's not where you get creative and build an app. I haven't looked at AudioKit, but that's the right idea! Between that or libpd or something else, you'll be much happier.

dizzywhip posted:

Nice work on TC-11 by the way! I'm messing around with it and it's pretty sweet.

Thanks! 🍻

TheReverend
Jun 21, 2005

Wrong thread maybe, but can anyone recommend a book on RxSwift?

I'm old and stubborn and like books for this type of stuff.

Also, what's the buzz: is this type of stuff just a fad, or do you think it'll catch on and become more widespread?


SaTaMaS posted:

(Sorry for the long delay, I just randomly decided to check the thread today) If these are dynamically added elements, you need to set accessibilityElements on the cell and post a layout changed notification. e.g., at the end of the listener where the UI is added:

Sorry for my delay. They aren't, but this is something I didn't know, and we have a few VCs with programmatically added stuff, so this will be great when that comes up. Thanks! :)

My UI testing mandate is still in place, and somehow I've become the guy who ends up fixing all the tests when they break, which is awful but appreciated by management. Slowly but surely I'm realizing it's usually others breaking them and not a fault of XCTest.

eschaton
Mar 7, 2007

Don't you just hate when you wind up in a store with people who are in a socioeconomic class that is pretty obviously about two levels lower than your own?
Don’t forget that when doing UI testing, way more poo poo is asynchronous than you might think. You really need to leverage expectations.

Also, don’t use implicit expectations (pseudocode):

code:
func testSomethingFragile() {
    someElement.expectation(predicate: "visible == true")
    // stuff that should result in someElement becoming visible
    waitForExpectations()
}
The expectation here is implicit because it’s not assigned to something and waited on explicitly. It’s a recipe for fragility, since this code might look like it works, but then when composed together with other code, timing differences or UI changes can cause the test to break because something else waited on all expectations and consumed yours, or something else created an implicit expectation that you consumed.

Better to write all your test code like this:

code:
func testSomethingNotFragile() {
    let e = someElement.expectation(predicate: "visible == true")
    // stuff that should result in someElement becoming visible
    waitForExpectations([e])
}
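For the record, the real XCTest spelling of the explicit version uses expectation(for:evaluatedWith:handler:) and wait(for:timeout:); app here is assumed to be your XCUIApplication:

code:
func testSomethingNotFragile() {
    let e = expectation(for: NSPredicate(format: "exists == true"),
                        evaluatedWith: app.buttons["Done"],
                        handler: nil)
    // stuff that should result in the button appearing
    wait(for: [e], timeout: 10)
}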

Axiem
Oct 19, 2005

I want to leave my mind blank, but I'm terrified of what will happen if I do

TheReverend posted:

Also, what's the buzz: is [Rx] just a fad, or do you think it'll catch on and become more widespread?

I think there's a reasonable chance Rx patterns will catch on; they're relatively language-portable, and there are benefits.

However, I've been in codebases that used RxSwift and it was a giant loving mess, and much worse than if they'd just gone with non-RxSwift ways of doing things.

However however, the current project I'm on was greenfield with RxSwift in from the get-go (along with a library some of my coworkers wrote: RxSugar), and it's actually been really nice and useful. It saves a fair amount of Observer-pattern boilerplate code, and it helps reinforce that a lot of things are actually asynchronous under the hood, and forces us to contend with that.
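As a flavor of the boilerplate savings, here's a toy RxSwift/RxCocoa sketch (not from our codebase; nameField and greetingLabel are assumed to be a UITextField and a UILabel):

code:
import RxSwift
import RxCocoa

let disposeBag = DisposeBag()

// One chain replaces a delegate, a target-action, and manual state sync:
// the label always mirrors a derived version of the text field.
nameField.rx.text.orEmpty
    .map { $0.isEmpty ? "Name required" : "Hello, \($0)!" }
    .bind(to: greetingLabel.rx.text)
    .disposed(by: disposeBag)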

Like most tools, it can be used poorly and cause more trouble than it's worth. It's still in my bucket of wanting to use it more to get a better sense of where that boundary is, but I can say it definitely has its useful parts.

I have a sideproject-at-work sort of thing that I'll eventually make public on GitHub—conveniently, also showcasing how I like to do UI Tests—without worrying about NDA issues and such. I'm using RxSwift and RxSugar in it and liking it so far. It's just...taking a while to get to a releasable point, sorry.

Doctor w-rw-rw-
Jun 24, 2008
Hot take: not only was it never a fad, it will never really hit an inflection point and gain significant adoption in the same way that React (for example) has. For apps, anyways.

Good engineers establishing good conventions, and a product built with some foresight, will probably get to experience the best of Rx, because they'd probably make any kind of principled system work in a logically consistent and understandable way.

That's good, but personally, I think that Rx is a footgun: bad programmers or impedance mismatches with the systems you need to interact with can screw things up in difficult-to-debug ways. A system that is heavily loaded, needs async, and requires chains of a variety of modular, interchangeable blocks is needed less often than people think, but as you start to distance the way the code reads from the way it actually executes, debugging your assumptions about what should have happened, on top of debugging the system itself, gets super hard. Granted, a big reason I think so is a former coworker who grossly abused RxCocoa with inappropriate threading and side effects, but generally, you shouldn't need to pay the cost of a whole lot of hypercomplicated concurrency and its upkeep afterwards.

That's not to say you wouldn't benefit from learning and practicing a somewhat functional style of programming but I personally don't find Rx the best way to get there and I haven't seen it gain a meaningful amount of mindshare in the several years since it first appeared on my radar.

Stringent
Dec 22, 2004


Doctor w-rw-rw- posted:

That's good, but personally, I think that Rx is a footgun: bad programmers or impedance mismatches with the systems you need to interact with can screw things up in difficult-to-debug ways.

This matches my experience with it as well. I've been splitting the difference and using PromiseKit in projects that seem to warrant it, but I reckon whenever coroutines make it into Swift the Rx stuff is gonna get abandoned and projects that use it are going to become albatrosses.

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.
PromiseKit is good poo poo. As soon as I have two asynchronous operations happening either in serial or in parallel, it’s promise time.

I do feel a little silly when it takes me a half hour and a whiteboard to figure out what ends up being like four lines of promises and method calls. But that’s less time and fewer calls than the equivalent would usually be without promises.

The one downside is that promises can take some decent time to learn, and you’re not really rewarded with a new way of thinking like you might be if you learned e.g. reactive extensions.
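For a taste, a typical chain looks like this; firstly/then/done/catch are PromiseKit's, while fetchUser and fetchAvatar are hypothetical helpers that return promises:

code:
import PromiseKit

firstly {
    fetchUser(id: 42)          // Promise<User> (hypothetical)
}.then { user in
    fetchAvatar(for: user)     // Promise<UIImage> (hypothetical), runs after
}.done { avatar in
    self.imageView.image = avatar
}.catch { error in
    self.show(error)           // one error path for the whole chain
}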

dizzywhip
Dec 23, 2005

lord funk posted:

Don't get me wrong - CoreAudio is pretty baller. I just played a concert where I had 8 iPads all running as DACs on a Mac, plus a MOTU 8 channel interface, all combined into a new 'Aggregate Device'. It was pretty sweet.

My warning is that, like all things, you should consider using existing audio engines before writing your own. It's like using Unity to make a game instead of learning to draw triangles on screen yourself with OpenGL.

The benefit of using Pd is that you *can* do low level audio programming by creating your own objects, and that all integrates nicely with Pd's audio callback. It's suuuuper nice to be able to create a great audio graph in an environment built for it (Pd), test it out on the desktop, then load it onto an iOS device and control it from UIKit.

Pd patches run realtime -- you can create a sequencer patch in Pd, then as you run it on your iOS device you can alter everything from the sequence contents to the playback controls, etc. TC-11 is built around this. Every touch controller fires a message into the Pd graph to change the parameters in realtime.

CoreAudio is the backend that gets your audio render callback working; it's not where you get creative and build an app. I haven't looked at AudioKit, but that's the right idea! Between that or libpd or something else, you'll be much happier.

Alright, well I've mostly solved my immediate lag problems by dropping AVAudioEngine and moving deeper into Core Audio with AUGraph even though that API was supposed to be deprecated last year according to a WWDC session. I'll take a closer look at Pd in a bit and see if it'll work for me, though I'm still a little wary of moving away from system libraries. I wanna have as much control as possible for some projects I have in mind for the future, and I don't mind taking the time to build up my own audio components on top of the system.

Dog on Fire
Oct 2, 2004

If I recall correctly, in Objective-C, methods must not begin with 'new', like [foo newBar]. But can they begin with 'news', like [foo newsBar]? Does anyone know?

Good Sphere
Jun 16, 2018

I have a super annoying ongoing issue that I have no idea how to solve. When I record a video with the front-facing camera in my app and share it using UIActivityViewController, some apps receive the video upside down and others don't. Saving to my camera roll looks fine. Even saving to my camera roll and then uploading from Facebook or Messenger appears fine. The two major ones that receive it upside down from UIActivityViewController are Facebook and Messenger. I've compared all sorts of information between the shared and saved video files, and they appear identical.

I can't seem to locate where the problem is being caused, and the only thing I can think of is flipping the video as the user selects Facebook or Messenger. Is this even possible, and if it is, could it be slow if the video is large?

Could I be doing something wrong when sharing the video that corrupts it on some platforms? self.fileURL is the video location it was recorded to, using a temporary directory:

code:
func movieURL() -> NSURL {
	let tempDir = NSTemporaryDirectory()
	let url = NSURL(fileURLWithPath: tempDir).appendingPathComponent("tmpMov.mov")
	return url! as NSURL
}
Code for the UIActivityViewController:

code:
let urlData = NSData(contentsOf: self.fileURL)

if urlData != nil {

	let paths:[String] = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)
	let docDirectory:String = paths[0]
	let filePath:String = "\(docDirectory)/tmpMov.mov"
	urlData?.write(toFile: filePath, atomically: true)
	
	let videoLink = NSURL(fileURLWithPath: filePath)
	let objectsToShare:[NSURL] = [videoLink]
	
	let activity: UIActivityViewController = UIActivityViewController(activityItems: objectsToShare, applicationActivities: nil)
	
	DispatchQueue.main.async {
		UIApplication.topViewController?.present(activity, animated: true, completion: nil)
	}

}

Doc Block
Apr 15, 2003
Fun Shoe
When I did a photo app a while back, I remember there being something about an orientation flag for the photo. Maybe video has the same, and it’s getting stripped out when saving to a file instead of the camera roll?
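One quick way to check that theory: front-camera footage usually carries its rotation in the video track's preferredTransform rather than in the pixels, so compare it between the file you share and the one you save (videoURL is whichever file you're inspecting):

code:
import AVFoundation

let asset = AVAsset(url: videoURL)
if let track = asset.tracks(withMediaType: .video).first {
    // An identity transform plus a "sideways" naturalSize suggests the
    // rotation metadata got stripped somewhere along the way.
    print("preferredTransform:", track.preferredTransform)
    print("naturalSize:", track.naturalSize)
}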

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.

Dog on Fire posted:

If I recall correctly, in Objective-C, methods must not begin with 'new', like [foo newBar]. But can they begin with 'news', like [foo newsBar]? Does anyone know?

Are you thinking of ARC method families?

Good Sphere
Jun 16, 2018

Doc Block posted:

When I did a photo app a while back, I remember there being something about an orientation flag for the photo. Maybe video has the same, and it’s getting stripped out when saving to a file instead of the camera roll?

Yeah, I was thinking the same thing, and I tried looking for it using the mdls command and exiftool to check for extra EXIF information. To rotate a video, I apply a transform.

I save my video using PHPhotoLibrary, so maybe it adds something extra about orientation? Can I share a video similar to what I did in my post above, except have PHPhotoLibrary provide the asset? I tried doing something like that, but I don't know if it's possible.

rjmccall
Sep 7, 2007

no worries friend
Fun Shoe

Dog on Fire posted:

If I recall correctly, in Objective-C, methods must not begin with 'new', like [foo newBar]. But can they begin with 'news', like [foo newsBar]? Does anyone know?

The convention is based on "words", where changes in capitalization trigger a word boundary. So no, news and newsBar are not considered to be in the new family of selectors.

Doctor w-rw-rw-
Jun 24, 2008
And even if it were, you could put an objc_method_family attribute of "none" on it to undo the automatic retain behavior.
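i.e., something like this in the header, with a hypothetical Widget class:

code:
// Opt a would-be "new-family" method out of ARC's +1 return convention.
- (Widget *)newBar __attribute__((objc_method_family(none)));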

brap
Aug 23, 2004

Grimey Drawer
So you're saying "must" to mean "strongly discouraged by convention", not "produces a compile error", right?

hackbunny
Jul 22, 2007

I haven't been on SA for years but the person who gave me my previous av as a joke felt guilty for doing so and decided to get me a non-shitty av

brap posted:

So you're saying "must" to mean "strongly discouraged by convention", not "produces a compile error", right?

No, it actually has semantic meaning: "new" methods return an object and pass ownership of it to the caller. It started as a convention, but ARC turned it into a hard rule: an ARC caller will release the object returned by a "new" method, sooner or later, and you drat better take that into account when planning the lifetime of the object.

Doctor w-rw-rw-
Jun 24, 2008

brap posted:

So you're saying "must" to mean "strongly discouraged by convention", not "produces a compile error", right?

It doesn't cause a compile error, but it's not just discouraged by convention; it actually affects how the object is reference counted. If you're a responsible developer you'll try to respect the method family, but if you *really* want to, you can override the behavior.

Plorkyeran
Mar 22, 2007

To Escape The Shackles Of The Old Forums, We Must Reject The Tribal Negativity He Endorsed
Getting method families wrong results in runtime crashes and/or memory corruption rather than compile errors. It's a lot of fun.

TheReverend
Jun 21, 2005

I work on an old app from maybe 2010 that was originally Obj-C and has since been maintained as a mix of Swift and Obj-C.

We use RestKit for our REST API.

Pretty sure that's old and busted now.

Is the new Swift 4.2 Codable stuff good enough now for most REST object-mapping use cases?

Doh004
Apr 22, 2007

Mmmmm Donuts...

TheReverend posted:

I work on an old app from maybe 2010 that was originally Obj-C and has since been maintained as a mix of Swift and Obj-C.

We use RestKit for our REST API.

Pretty sure that's old and busted now.

Is the new Swift 4.2 Codable stuff good enough now for most REST object-mapping use cases?

Yes.

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.
Codable is really good.

Also, best of luck extricating yourself from RestKit. Definitely more of a framework than a library in the "who calls whom" sense.
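For what it's worth, the Codable happy path really is this short (Widget and the JSON shape are made up; responseData is the Data you got back from the request):

code:
struct Widget: Codable {
    let id: Int
    let name: String
    let createdAt: Date
}

let decoder = JSONDecoder()
decoder.dateDecodingStrategy = .iso8601   // match your API's date format
let widgets = try decoder.decode([Widget].self, from: responseData)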
