|
I made an HTML parser and pushed the deployment target back as far as I could without being annoying. I have no idea if anyone uses it that far back. The warnings popped up in Xcode 9.3.
|
# ? Apr 9, 2018 00:01 |
|
ManicJason posted:You're also leaking a UIActivityIndicatorView every time a cell is reused.

Thank you, good catch!

dc3k posted:You're also using KVC methods on a dictionary (valueForKey instead of objectForKey) in one spot, but not the others.

Thank you, and you hit the nail on the head!
|
# ? Apr 9, 2018 09:34 |
|
CuddlyZombie posted:Thank you, and you hit the nail on the head!

I’m not sure I’d look favourably on a candidate who was passing their coding exercise around the local forums for feedback and improvements. I kinda suspected that was what you were doing and was intentionally coy in my response, but it’s probably best to be up front about that in the future. But I also don’t know your situation, and there’s a pretty slim chance your prospective employer will find out, if they even care, so what do I know?
|
# ? Apr 9, 2018 14:51 |
|
edit: I should have posted this in the screen shots thread. Anyway, I'm having design issues. See here: https://forums.somethingawful.com/showthread.php?threadid=2841382&pagenumber=195#post483012137
LP0 ON FIRE fucked around with this message at 18:15 on Apr 10, 2018 |
# ? Apr 10, 2018 18:12 |
|
Today sucked. I've attempted again to solve a problem I've been at for months, spent about 9 hours on it, and failed miserably: I cannot record my mic's input so that AVAssetWriter can capture it! I'm filtering my video capture by modifying the buffer in captureOutput and appending it to an AVAssetWriterInputPixelBufferAdaptor. How does captureOutput get a buffer and turn the microphone's input into something it can write to AVAssetWriter? How do I even know my mic is picking up audio? I cannot find any information about this! https://stackoverflow.com/questions/49784587/how-to-record-audio-from-the-mic-while-appending-a-modified-samplebuffer-image-t
|
# ? Apr 11, 2018 22:22 |
|
Good news: I finally got it to work! Not really in the way I wanted to, but it works fine. I use an additional AVAudioSession, an AVAudioRecorder and another NSURL, instead of just AVAssetWriter, then I merge the results together using AVMutableComposition. I really hope the sync will be OK! Some really good references. One for recording audio (which magically defaults to your phone's microphone as an input): https://www.hackingwithswift.com/example-code/media/how-to-record-audio-using-avaudiorecorder And for merging them both together: https://stackoverflow.com/questions/31984474/swift-merge-audio-and-video-files-into-one-video
|
# ? Apr 12, 2018 20:18 |
|
This is almost certainly the wrong thread as I'm approaching this from a JavaScript perspective, but it's about Apple's ecosystem. Specifically HomePod. In short, I'm trying to programmatically control its volume. As of fairly recently, any iOS device or iTunes instance can control any other AirPlay device that is currently playing. When an Apple TV is connected to HomePod, its physical remote controls the volume. I think this is all happening over AirPlay. I managed to hack a fork of node-airtunes, but it disconnects all other devices. What I really want to do is inject commands like "volume up" or "pause," not take over playback altogether. Where I'm stuck is that I don't know whether Apple is using some internal version of AirPlay 2 to enable this cooperative behavior, or whether I need to dive deep into the weeds of UDP requests and the AirPlay protocol. More broadly, I'm not sure where to look for more information. I've just been piecing together information from various GitHub threads. Any ideas?
|
# ? Apr 19, 2018 05:53 |
|
There's no such thing as an 'external' version of AirPlay 2, though. AirTunes is the old name of AirPlay, so AirTunes v2 isn't AirPlay 2; that library looks pretty ancient. As far as I can tell, AirPlay 2 is covered by MFi, which means it's subject to NDA, and very likely also the MFi chip which secures and encrypts hardware interactions. If it is, you're boned; if not, you'll probably still have a difficult time figuring out how iTunes does it. iTunes *does* have a remote API for the Remote app to use (but that means you're dependent on iTunes), but if it's doing something special over Bluetooth (and it's Apple, so it probably is) you probably won't be able to get what you want. Or maybe there's some simple media control standard they're using in a clever way and it's all simple to get what you want, but I think the likelihood of that is super low.
|
# ? Apr 19, 2018 09:30 |
|
This is a question I think I already know the answer to, but I want to be certain: While signing a framework is technically possible, it is NOT possible to enable App Groups and thus enable sharing pasteboards between an app and another app (signed with a different team ID) which integrates that framework, correct?
|
# ? Apr 19, 2018 12:28 |
|
Your framework gets re-signed when being embedded in another app, doesn’t it? So it’s a question of whether two apps with different team IDs can share a pasteboard. (Which I don’t know the answer to.)
|
# ? Apr 19, 2018 12:44 |
|
I believe Apple doesn’t allow developer teams to generate keychain access groups such that they can be shared outside the developer team.
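The entitlements make that scoping visible. A hypothetical sketch (identifiers made up): keychain access groups are implicitly prefixed with the team ID, and app groups are registered to a single team, so neither can cross teams:

```xml
<!-- Hypothetical entitlements sketch; identifiers are made up. -->
<key>keychain-access-groups</key>
<array>
    <!-- Expands to TEAMID.com.example.shared, so it's scoped to one team. -->
    <string>$(AppIdentifierPrefix)com.example.shared</string>
</array>
<key>com.apple.security.application-groups</key>
<array>
    <!-- group.* identifiers are registered to a single developer team. -->
    <string>group.com.example.shared</string>
</array>
```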
|
# ? Apr 20, 2018 03:18 |
|
As a former Flash developer and now Swift/Obj-C developer the downward trend on this chart is giving me panic attacks https://trends.google.com/trends/explore?date=today%205-y&q=%2Fm%2F010sd4y3 How worried should I be? I'm sure there will be Swift jobs out there as long as there are iPhones, but lately more and more companies have been switching to React Native/NativeScript even though JavaScript is a garbage language for app development.
|
# ? Apr 25, 2018 18:25 |
|
Looks pretty flat to me? If you're talking about the recent dip, then maybe it's a lull between now and WWDC?
|
# ? Apr 25, 2018 18:58 |
|
Doh004 posted:Looks pretty flat to me? If you're talking about the recent dip, then maybe it's a lull between now and WWDC?

It's not dropping that much, but if Swift peaked this fast it's still alarming.
|
# ? Apr 25, 2018 20:03 |
|
You’re going to be fine.
|
# ? Apr 25, 2018 21:53 |
|
No see this time cross platform tools might actually deliver on their promises unlike the past several decades!
|
# ? Apr 26, 2018 01:26 |
|
The future is lovely electron apps.
|
# ? Apr 27, 2018 16:33 |
|
Hey guys, really weird issue I'm having. I wrote a bunch of XCUI tests for a client's app after re-signing it, then sent it off to them with instructions on how to re-sign. It compiles and runs on their machine just fine; however, when they attempt to run the XCUI tests, every one fails with Thread 1: EXC_BAD_ACCESS (code=1, address=0x0) when attempting to enter text into the first UIElement it is supposed to find. All of the tests run beautifully on my machine, so I'm trying to figure out what's going on with theirs. I have verified that they re-signed everything correctly, so now I can't help but wonder if something could be going on with their install of Xcode or something? Any help or pointers would be great, thanks! edit: I found the issue: https://github.com/lionheart/openradar-mirror/issues/19677 switched to an 11.3 simulator device and it works now FAT32 SHAMER fucked around with this message at 19:02 on Apr 27, 2018 |
# ? Apr 27, 2018 16:51 |
|
When I take a screenshot on my iPod in iOS 11, it brings up this neat editor thing. Does anybody know a) if that's a native UIKit library or something (I am not having any luck with Google searches), and b) if it's possible to throw custom data to it instead of it grabbing the contents of the screen?
|
# ? Apr 29, 2018 08:48 |
|
soundsection posted:When I take a screenshot on my iPod in iOS 11, it brings up this neat editor thing. Does anybody know a) if that's a native UIKit library or something (I am not having any luck with Google searches), and b) if it's possible to throw custom data to it instead of it grabbing the contents of the screen?

It’s built in, so my guess is you can’t change anything given to it.
|
# ? Apr 30, 2018 02:28 |
|
Got a problem here: https://www.youtube.com/watch?v=h8LVyCLlwXQ My first view does not autorotate. My second view does. To make my second view controller autorotate, I do something similar to this answer: https://stackoverflow.com/questions...702941#41702941 The main issue is the CATransition to the new view controller. When it runs, the second view is already loaded, so everything rotates, including whatever it cached as an image of the first view. I tried to compensate for that by rotating the first view just before the fade transition, and you can see in the video that this rotates the first view in the fade by the correct amount; I just need to translate the position. But the problem is that the initial rotation is actually visible for a slight moment! Why? code:
|
# ? May 1, 2018 17:09 |
|
idk if it’s the only issue but don’t touch views from a background thread. You’re (almost certainly?) doing so in the block dispatched to a global queue.
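To illustrate, a minimal sketch of the pattern: a hypothetical heavyWork() stands in for your buffer filtering, and a plain serial queue stands in for DispatchQueue.main so the snippet runs anywhere.

```swift
import Dispatch

// Hypothetical stand-in for the expensive, thread-safe part (e.g. filtering a buffer).
func heavyWork() -> String {
    return "rotated view"
}

// Stand-in for DispatchQueue.main so this sketch runs outside an app;
// in real code, use DispatchQueue.main here.
let mainStandIn = DispatchQueue(label: "main-queue-stand-in")
var uiState = "untouched"
let done = DispatchSemaphore(value: 0)

DispatchQueue.global(qos: .userInitiated).async {
    let result = heavyWork()      // fine on a background queue
    mainStandIn.async {           // hop queues *before* touching any view
        uiState = result          // i.e. view.transform = ..., main thread only
        done.signal()
    }
}
done.wait()
print(uiState)                    // -> rotated view
```

In the real code the final hop is DispatchQueue.main.async { /* touch views here */ }; everything before it stays off the main thread.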
|
# ? May 1, 2018 19:07 |
|
pokeyman posted:idk if it’s the only issue but don’t touch views from a background thread. You’re (almost certainly?) doing so in the block dispatched to a global queue.

Good to know. That DispatchGroup was just there temporarily, to try to get the rotation to happen and immediately play out the CATransition and present the new view controller. Before I put the DispatchGroup functionality in, the issue of seeing the first view rotate suddenly was still happening.
|
# ? May 1, 2018 19:58 |
|
At work we've just recently started using Apple TestFlight on a small scale (1 beta released so far, to around 10 external users); we shipped the first beta build around a month ago. I've just prepared a second beta build and uploaded it to iTunes Connect, intending to release it to our "internal" group (4 staff members' personal accounts, including myself) over the weekend, then add it to the external group next week once we've done some sanity testing.

As soon as the build finished processing, I had fear struck into me by a push notification and an email from TestFlight informing me that a new build was available. At this point I hadn't enabled the build for either group, simply uploaded the binary through Application Loader. On double checking, it turns out both these notifications specified the month-old build (which I'd already got installed), and the new build isn't available in TestFlight on devices yet.

While it's a relief that we haven't shipped anything to external testers prematurely, it still seems bizarre for the system to send a notification at that point. I can only presume the external testers received it too. Has anyone else experienced this? Is there a chance it's just temporary bugginess on Apple's side, or is this the norm? We were really thinking about embracing this a bit more and sending different betas to different groups, but if all testers get notified whenever we upload a new build (even if they're not invited to it), that idea's kinda shot in the foot.
|
# ? May 3, 2018 17:12 |
|
I don’t think external testers get a notification until you release the build to external testers. Internal testers can use any build that’s been processed, so that’s when the notification goes out to those users. I’m not aware of a way to prevent internal testers from having access to builds like this, but maybe there’s a way. The only explanation I can think of for the new notification including the old release notes is that you haven’t specified any release notes yet for the new build, probably because iTunes Connect doesn’t ask for notes until you release to external testers.
|
# ? May 3, 2018 17:43 |
|
The thing is, for all intents and purposes I am an external tester. It came to my Gmail account that is in no way associated with the company developer account. Same with a colleague - we set our personal devices up this way so we could test the same way as external testers will, just in a different group. We don’t actually have any internal testers configured in that respect. I also noticed later, once the panic had subsided, that I hadn’t cleared the export control encryption check at that point, so it wasn’t even available to (zero in our case) internal testers.
|
# ? May 3, 2018 19:11 |
|
Ah. Well either something fucky happened or I don’t understand TestFlight as well as I thought! I’m sure the latter is true, can’t really say about the former.
|
# ? May 3, 2018 22:50 |
|
Hello thread. I've got a question about the dual speaker setup on iPhone 7 and newer. I'm curious if they show up as separate outputs. Could someone run this on an actual device **in portrait mode** and report what it logs? I feel lovely asking but I don't have one of these phones myself. I would be willing to venmo/square cash you a tenner for your troubles https://gist.github.com/rfistman/1c63315d6634112eac8b1f7dc9dffe64 To clarify, I'd like to know which device you ran it on and what it reports both in portrait and landscape mode.
|
# ? May 8, 2018 05:53 |
|
iPhone X Portrait:code:
code:
Froist fucked around with this message at 11:40 on May 8, 2018 |
# ? May 8, 2018 11:37 |
|
Froist posted:iPhone X Portrait:

Awesome, thank you!! This makes me wonder if setting an AVAudioPlayer's channelAssignments will control playback to the separate speakers. I might come up with one more basic test to check this idea.
|
# ? May 8, 2018 21:03 |
|
Well, I put together an example that uses kAudioQueueProperty_ChannelAssignments. Basically this should make it easy to test where the sound comes out for the various ways you can route it. TableViewController.m code:
code:
code:
code:
blorpy fucked around with this message at 09:25 on May 11, 2018 |
# ? May 10, 2018 10:35 |
|
Does anyone use the Giphy API? I'm super confused about how I'd upload a gif to Giphy using their Swift API. It doesn't even demonstrate the capability on their GitHub (besides generating animated text). They have an upload endpoint mentioned in their docs, but I don't have the slightest idea how that could relate to the API.
|
# ? May 10, 2018 21:03 |
|
Looks like that Swift SDK isn't super up to date and doesn't include the upload functionality. You could probably write your own client that POSTs to the upload endpoint with your image data and your corresponding API keys - shouldn't be too hard to get up and running. Doh004 fucked around with this message at 03:41 on May 11, 2018 |
# ? May 10, 2018 21:44 |
|
Doh004 posted:Looks like that Swift SDK isn't super up to date and doesn't include the upload functionality.

Yeah, I was thinking about doing that. Giphy wants a screenshot of my API implementation, but mixing JS with Swift just didn't seem proper, so I guess I'll need to do it that way. I shot them an email about it too, just in case.
|
# ? May 10, 2018 22:02 |
|
LP0 ON FIRE posted:Yeah, I was thinking about doing that. Giphy wants a screenshot of my API implementation, but mixing JS with Swift just didn't seem proper, so I guess I'll need to do it that way. I shot them an email about it too, just in case.

Why are you mixing JS with Swift?
|
# ? May 10, 2018 23:17 |
|
Yeah you’re overcomplicating this. Grab Alamofire (iOS doesn’t have multipart/form-data), play with its request serialization options, and aim at the giphy endpoint until it works.
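Or skip the dependency entirely: the body format is simple enough to build by hand. A sketch, with guessed field names ("api_key", "file"); check Giphy's upload docs for the real ones:

```swift
import Foundation

// Sketch of a hand-rolled multipart/form-data body. Field names ("api_key",
// "file") are assumptions; check the upload endpoint's documentation.
func multipartBody(boundary: String, apiKey: String, gifData: Data) -> Data {
    var body = Data()
    func append(_ string: String) { body.append(Data(string.utf8)) }

    // Text field: the API key.
    append("--\(boundary)\r\n")
    append("Content-Disposition: form-data; name=\"api_key\"\r\n\r\n")
    append("\(apiKey)\r\n")

    // File field: the gif bytes.
    append("--\(boundary)\r\n")
    append("Content-Disposition: form-data; name=\"file\"; filename=\"upload.gif\"\r\n")
    append("Content-Type: image/gif\r\n\r\n")
    body.append(gifData)

    // Closing delimiter.
    append("\r\n--\(boundary)--\r\n")
    return body
}

let body = multipartBody(boundary: "Boundary-1234",
                         apiKey: "YOUR_KEY",
                         gifData: Data([0x47, 0x49, 0x46])) // bytes of "GIF"
```

Then set the request's Content-Type header to multipart/form-data; boundary=Boundary-1234 and POST the body to the upload endpoint.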
|
# ? May 10, 2018 23:24 |
|
Has anyone else ever shared my current fresh hell of using app extensions with CocoaPods? My company has an SDK pod that holds most of the communication/data code and a Swift app that uses it. I ran into a few places where the SDK used [UIApplication sharedApplication], which will go boom if compiled in an extension target. I found an article that described using an extension subspec to set a compiler flag and allow conditional compilation within the pod. Cool.

In messing around with that, I found that simply having an extension subspec that fully inherits from the core/default subspec, and using that in the Podfile for the extension target, was enough to get rid of the errors about using [UIApplication sharedApplication]. I'm guessing that having a different subspec caused CocoaPods to stop deduping the SDK pod even though the subspecs are identical otherwise, but I really don't know. I'm a bit scared, and I still expect the sharedApplication call to happen and blow up with this approach.

I'm still very early down the road of developing a today extension, so I may quickly learn that including that whole SDK in the extension target is a bad approach anyway, and instead opt to write a tiny amount of data to a shared container or something.
|
# ? May 11, 2018 23:55 |
|
Can't help you with the CocoaPods stuff; we distribute internal modules via a Carthage-like tool. Maybe instead of relying on availability from the outside calling in, use it inside the framework around the disallowed calls?
|
# ? May 13, 2018 03:47 |
|
My first thought is: can you split the SDK into two subspecs, one that’s purely extension-safe API usage and another that depends on the first and augments with the extension-disallowed API usage? I think, but am not sure, that CocoaPods will include the first subspec in both and so it won’t include duplicate symbols. It’s also possible you’ll have to make two different (non-sub) specs, I forget exactly how CocoaPods works here.
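Roughly like this, in podspec terms (all names and paths are made up, so treat it as a sketch rather than a working spec):

```ruby
# Hypothetical sketch of the subspec split; names and paths are made up.
Pod::Spec.new do |s|
  s.name    = 'MySDK'
  s.version = '1.0.0'
  # ... summary, source, etc. ...

  s.subspec 'Core' do |core|
    # Only extension-safe API usage lives here.
    core.source_files = 'Sources/Core/**/*.{h,m,swift}'
  end

  s.subspec 'App' do |app|
    # UIApplication and friends live here. App targets depend on
    # 'MySDK/App'; extension targets depend on 'MySDK/Core' only.
    app.dependency 'MySDK/Core'
    app.source_files = 'Sources/App/**/*.{h,m,swift}'
  end

  s.default_subspec = 'App'
end
```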
|
# ? May 13, 2018 14:19 |
|
I’ll use this topic of frameworks to make a remarkably clumsy segue to my own question, albeit one actually about libraries: are there any limitations on the iOS SDK version that was used to build a library that is used in a project? To be more specific, we have a library that was built using Xcode 8.3.3. Will we have to rebuild it using Xcode 9 at some point? If so, will we have to do it when the iOS 11 SDK becomes required in July?
|
# ? May 13, 2018 14:55 |