|
I'm working on a data-driven app that uses CATiledLayer (inside a UIView) inside a UIScrollView, but I'm not sure how to make it append new tiles on the right side while scrolling to the left side, as I get new data. On each update I make the contentSize wider and move the contentOffset backwards, but instead of adding on new tiles, it just stretches out the existing tiles. edit: Just needed to change the contentMode and call setNeedsDisplay() on the UIView SaTaMaS fucked around with this message at 16:38 on May 9, 2017 |
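A sketch of what that fix might look like (only the contentMode change and setNeedsDisplay() call come from the edit above; the view class and function names are assumptions):

```swift
import UIKit

// Hypothetical UIView backed by a CATiledLayer, as described above.
class TiledView: UIView {
    override class var layerClass: AnyClass { CATiledLayer.self }
}

// After widening contentSize and shifting contentOffset:
// .redraw makes the view re-render when its bounds change instead of
// stretching the existing layer contents, and setNeedsDisplay() forces
// the tiles to be drawn again for the new geometry.
func appendTiles(to tiledView: TiledView, in scrollView: UIScrollView, newWidth: CGFloat) {
    scrollView.contentSize.width = newWidth
    tiledView.frame.size.width = newWidth
    tiledView.contentMode = .redraw
    tiledView.setNeedsDisplay()
}
```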
# ¿ May 9, 2017 16:20 |
|
|
# ¿ May 2, 2024 00:13 |
|
I'm working on an OSX Cocoa application and using NSArrayControllers for the first time. I have the table displaying data, but one of the variables in my model is an array of strings. For some reason it's only displaying a single value out of the array. I have a workaround by using a derived variable, but is there some way to get a table cell to display an array as a comma-delimited string? (Having a Model Key Path of objectValue.array.description just returns a KVO error)
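The "derived variable" workaround can be done in a bindings-friendly way: expose a joined string and tell KVO it depends on the array, so the cell refreshes when the array changes. A sketch (class and property names are mine, not from the post):

```swift
import Foundation

class Record: NSObject {
    @objc dynamic var tags: [String] = []

    // Derived property for the table column; bind the column's Model Key
    // Path to "tagsDisplay" (e.g. arrangedObjects.tagsDisplay).
    @objc dynamic var tagsDisplay: String {
        tags.joined(separator: ", ")
    }

    // Register the dependency so KVO (and the bound cell) sees changes
    // to `tags` as changes to `tagsDisplay`.
    @objc class func keyPathsForValuesAffectingTagsDisplay() -> Set<String> {
        ["tags"]
    }
}
```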
|
# ¿ Jul 21, 2017 00:52 |
|
Does anyone know of any good 3.5mm headphone/microphone splitters for the iPhone? Apparently it's incredibly difficult to find one that works as expected. I found one that works partially - https://www.amazon.com/StarTech-com-headsets-separate-headphone-microphone/dp/B004SP0WAQ But the problem is that the microphone input bleeds into the headphone output. The only other option seems to be soldering a 1600 Ohm resistor into the microphone line (seriously): https://electronics.stackexchange.com/questions/38452/electronic-aspects-of-iphone-3-5mm-audio-output
|
# ¿ Jul 25, 2017 23:32 |
|
I'm working on an AR app using Vuforia/OpenGL. Is it possible to move between ViewControllers to swap out UI/Trackers while leaving the video background unchanged? The video image is rendered to a texture and drawn in OpenGL. edit: Actually I think I found something that might do it: https://www.cleveroad.com/blog/playing-blurred-video-in-background-with-gpuimage SaTaMaS fucked around with this message at 02:41 on Sep 8, 2017 |
# ¿ Sep 7, 2017 01:39 |
|
I'm trying to import a mesh from Blender into iOS OpenGL. I'm trying two different ways to do it, either would be fine but both are running into problems. 1. Convert a mesh from an OBJ file into a .h file. I've been using mtl2opengl, but it doesn't seem to like the fact that the file contains multiple meshes, and each one is basically a curved line (I'm creating a grid which needs to go over a person's head in my AR app). It counts the vertices, but just outputs an empty file. 2. Import the mesh using Model I/O and render it using GLKit. I'm able to import the file into a GLKMesh, but Apple's documentation for GLKMesh is non-existent (since they want everyone to use Metal instead, but I really want this to be as cross-platform as possible). I have no idea what the rendering call should look like, and my current attempt just crashes - glDrawElements(_mesh.submeshes[0].mode, _mesh.submeshes[0].elementCount, _mesh.submeshes[0].type, (__bridge GLvoid*)_mesh.submeshes[0].elementBuffer); I can't find any sample code, but it seems like there should be some repo somewhere where someone has already done this.
|
# ¿ Oct 8, 2017 22:06 |
|
Doc Block posted:GLKit is (relatively) old and seems to have mostly been intended as a stopgap to help developers switch from OpenGL ES 1.1 to ES 2.0. The main reason is that I'm having a hard time figuring out which method I should go with for loading OBJ files. Model I/O seemed like a straightforward solution except for the lack of sample code. I have everything working with a cube loaded in from a .h file, though I'm just using a simple array buffer instead of an element buffer. I briefly looked at AssImp, maybe that's the solution instead of Model I/O.
|
# ¿ Oct 9, 2017 02:34 |
|
Doc Block posted:GLKit is (relatively) old and seems to have mostly been intended as a stopgap to help developers switch from OpenGL ES 1.1 to ES 2.0. It looks like AssImp also blows up when I try to load lines instead of meshes. I'm getting the error "aiScene::mNumMeshes is 0. At least one mesh must be there". Is there a way to make AssImp expect lines instead of triangle meshes?
|
# ¿ Oct 9, 2017 17:33 |
|
How do people learn Accelerate? I'm trying to use it for linear algebra and signal processing, but I haven't seen any book for it, and the documentation is bare-bones. At the moment I'm digging through github and stackoverflow looking for sample code, but I don't know why it has to be this much of a pain.
|
# ¿ Nov 14, 2017 17:04 |
|
edit: determined what I wanted to do wasn't possible
SaTaMaS fucked around with this message at 04:02 on Feb 4, 2018 |
# ¿ Feb 4, 2018 01:48 |
|
As a former Flash developer and now Swift/Obj-C developer, the downward trend on this chart is giving me panic attacks: https://trends.google.com/trends/explore?date=today%205-y&q=%2Fm%2F010sd4y3 How worried should I be? I'm sure there will be Swift jobs out there as long as there are iPhones, but lately more and more companies have been switching to React Native/NativeScript even though JavaScript is a garbage language for app development.
|
# ¿ Apr 25, 2018 18:25 |
|
Doh004 posted:Looks pretty flat to me? If you're talking about the recent dip, then maybe it's a lull between now and WWDC? It's not dropping that much, but if Swift peaked this fast it's still alarming.
|
# ¿ Apr 25, 2018 20:03 |
|
Toady posted:We're well past the point where Objective-C, as fond as I am of it, could be considered something other than legacy technology. It's clear where the focus is. There seems to be a good reason to still write libraries in it, so you can pull in C/C++ code and interface it with Swift
|
# ¿ Jun 7, 2018 02:59 |
|
I have a 3rd party framework which is used in both a sub-project (that I have set up as a library) and my main project. When I only include the library in one or the other I get a linker error, but when I include it in both I get the error: Class <SharedClass> is implemented in both <SubProject> and <MainProject>. One of the two will be used. Which one is undefined. How do I include the framework in both without getting a conflict?
|
# ¿ Sep 3, 2018 01:20 |
|
pokeyman posted:Is the third party framework a static or dynamic library? If it’s dynamic I think you can just link against it from both and you should be ok? From the sounds of things it’s static, in which case you should try linking it only in the subproject but add the third party framework's headers to your main project's header search paths build setting. That way you only link one copy of the library into your main project's product. It's a dynamic library. I tried linking the same library file in both the parent and sub-project, since if I don't provide something for "Linked Frameworks and Libraries" I get linker errors, but Xcode still seems to include the same framework file twice, which is where the duplication comes from.
|
# ¿ Sep 3, 2018 15:50 |
|
It turned out the key was dragging/reordering the frameworks in "Link Binary with Libraries" so that the sub-project was linked before the library. +1 for button mashing!
|
# ¿ Sep 3, 2018 18:46 |
|
TheReverend posted:UI tests. (Sorry for the long delay, I just randomly decided to check the thread today) If these are dynamically added elements, you need to set accessibilityElements on the cell and post a layout-changed notification, e.g. at the end of the listener where the UI is added: self.accessibilityElements = [your new UI]; UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil). That updates the list of elements that the UI test can access.
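In current Swift the same two calls look like this (a sketch; the cell and control names are placeholders):

```swift
import UIKit

// After dynamically adding subviews, rebuild the accessibility element
// list and announce the layout change so UI tests (and VoiceOver) can
// see the new elements.
func didAddDynamicControls(to cell: UITableViewCell, newControls: [UIView]) {
    cell.accessibilityElements = newControls
    UIAccessibility.post(notification: .layoutChanged, argument: nil)
}
```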
|
# ¿ Feb 4, 2019 20:22 |
|
I need to do a Hilbert transform on some data. A quick google search turned up this answer on stackoverflow, is this about as good as it gets, or is there a better version available? https://stackoverflow.com/a/21907439
|
# ¿ Feb 17, 2019 04:30 |
|
Is Autoresizing being deprecated? When I use it for simple layouts where I don't need to worry about clipping I get the warning "Auto Layout Localization: Views without any layout constraints may clip their content or overlap other views." Is there any way to remove that error without having to use constraints?
|
# ¿ Mar 26, 2019 17:32 |
|
Are there any good resources on functional design patterns for Swift? As I understand it, functional programming has obsoleted a lot of the design patterns as they were presented in the Gang of Four book, but when I look for Swift design patterns online, they tend to be OOP implementations of the GOF patterns that don't take advantage of Swift's functional capabilities.
|
# ¿ May 12, 2019 17:04 |
|
A penetration tester went through my app at work, and one thing they flagged was that stack-smashing protection was off. The way this is checked is by looking for flags in the binary using "otool -I -v AppName|grep stack" and looking for ___stack_chk_fail/___stack_chk_guard. These flags seem to appear in apps that use obj-c but not swift-only apps. I suspect that ssp is on in swift-only apps but the flags are getting stripped out, but can't find any confirmation, has anyone dealt with this before? I'm reading some stuff saying that the flags might show up in an ad-hoc deployment IPA, but I can't generate those with my work certificate SaTaMaS fucked around with this message at 19:25 on May 11, 2020 |
# ¿ May 11, 2020 18:52 |
|
Pulcinella posted:Has anyone tried to do work in RealityKit? It seems unfinished. It seems like it's meant for some improbable future where there needs to be coordination between multiple AR overlays in different parts of a room, when back in this reality one overlay is plenty and all the sample code is still done in SceneKit. Plus the ARKit tracking for just a single anchor is still pretty crappy without the LIDAR hardware no one has, so they're completely putting the cart before the horse.
|
# ¿ Jun 12, 2020 02:50 |
|
Pulcinella posted:I was alerted to it from this thread where an Apple employee mentions it (followed by a few people saying it doesn’t work). The thread makes it seem like SceneKit is basically dead in terms of active development (no surprise).I imagine it was something that was cut from iOS13 and the documentation was never removed, so I expect there will be another big push for AR during WWDC. Reality Composer is also pretty buggy. There's also this (https://www.bloombergquint.com/businessweek/apple-team-working-on-vr-and-ar-headset-and-ar-glasses) quote:Cody White, who helped develop Apple’s RealityKit software, which allows developers to implement 3D rendering in augmented-reality apps for the iPhone and iPad, quit in December Not a good look!
|
# ¿ Jun 19, 2020 14:23 |
|
Stringent posted:I honestly can't shake the feeling that SwiftUI is abandonware. I haven't thought through it enough to give a solid argument for why that may be, but that's what my gut is telling me. The best reason I can come up with to use it is for large teams where the version control for storyboards would quickly become a nightmare. That's not a compelling reason for the vast majority of teams.
|
# ¿ Jun 19, 2020 14:25 |
|
Stringent posted:Alternatively, is Swift's implementation of UIKit not already half baked enough to preclude any kind of major paradigm change like SwiftUI? SwiftUI isn't really a paradigm change, it's a new layer of abstraction on top of UIKit.
|
# ¿ Jun 19, 2020 16:28 |
|
ketchup vs catsup posted:I have not seen a company of any size use storyboards since...2016? It must be wonderful to not have to worry about capricious UX designers
|
# ¿ Jun 19, 2020 16:34 |
|
101 posted:Would there be a reason to do this over using xibs? If you needed a customized UINavigationItem
|
# ¿ Jun 20, 2020 15:07 |
|
It sure is going to be awesome when junior developers start showing up who know SwiftUI + Combine but don't know the first thing about ViewControllers, Core Animation, or how to structure non-SwiftUI apps...
|
# ¿ Jun 20, 2020 21:19 |
|
Has anyone used coremltools to generate a model? I'd really like for the resulting bounding boxes to be normalized the same standard way as other VNImageBasedRequest results are. However, all that seems to be visible on the Python side is the scaled input image in pixels, nothing about the screen size or the original size of the input image, which would be necessary in order to normalize.
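For reference, the conversion itself is just a scale plus a flip to Vision's lower-left-origin space; a sketch of a hypothetical helper, assuming you know the pixel size of the image the model actually saw:

```swift
import CoreGraphics

// Convert a pixel-space box (upper-left origin, as most models emit) from
// the model's scaled input image into the normalized 0...1, lower-left-
// origin space that VNImageBasedRequest results use.
func normalizedRect(forPixelRect r: CGRect, in imageSize: CGSize) -> CGRect {
    CGRect(x: r.minX / imageSize.width,
           y: 1 - (r.maxY / imageSize.height),  // flip to lower-left origin
           width: r.width / imageSize.width,
           height: r.height / imageSize.height)
}
```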
|
# ¿ Aug 6, 2020 13:47 |
|
KidDynamite posted:anyone have a good way to do a bottom half modal? use a UIPresentationController
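A minimal sketch of that suggestion (class names are mine): subclass UIPresentationController to pin the presented controller to the bottom half, and supply it from a transitioning delegate.

```swift
import UIKit

// Pins the presented view controller to the bottom half of the container.
final class HalfSheetPresentationController: UIPresentationController {
    override var frameOfPresentedViewInContainerView: CGRect {
        guard let bounds = containerView?.bounds else { return .zero }
        return CGRect(x: 0, y: bounds.midY,
                      width: bounds.width, height: bounds.height / 2)
    }
}

final class HalfSheetTransitioningDelegate: NSObject, UIViewControllerTransitioningDelegate {
    func presentationController(forPresented presented: UIViewController,
                                presenting: UIViewController?,
                                source: UIViewController) -> UIPresentationController? {
        HalfSheetPresentationController(presentedViewController: presented,
                                        presenting: presenting)
    }
}
```

To use it, set the presented controller's `modalPresentationStyle = .custom` and assign its `transitioningDelegate` (keep a strong reference to the delegate) before presenting.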
|
# ¿ Nov 11, 2020 03:47 |
|
What do Swift developers think of this? https://www.iosapptemplates.com/blog/swiftui/swiftui-drawbacks According to this, SwiftUI isn't yet ready for complex projects. Florian does a lot of good stuff on Swift so I'm inclined to agree.
|
# ¿ Nov 11, 2020 03:51 |
|
Has anyone done much unit testing with Core Data? I have some test files that each succeed individually, but when I try to run them all I get the error "Failed to find a unique match for an NSEntityDescription to a managed object subclass" as though the tests were creating multiple NSPersistentContainers (I'm only creating them in memory) at the same time and the entity names were colliding. These are all instance-level NSPersistentContainer variables, and since they're only in memory I'm not sure why they know about each other or how to fix it.
|
# ¿ Dec 1, 2020 18:46 |
|
pokeyman posted:Random guess: are you creating multiple instances of your managed object model? On each test case I initialize an NSPersistentContainer and call loadPersistentStores in setUp, then call destroyPersistentStore and set the NSPersistentContainer to nil in tearDown. It looks like I'm creating multiple identical persistent stores somehow, even though each test file works individually. Multiple test files extend the same XCTestCase superclass I created for testing Core Data, but since the NSPersistentContainer is instance-level I'm not sure where the problem is SaTaMaS fucked around with this message at 03:00 on Dec 2, 2020 |
# ¿ Dec 2, 2020 02:58 |
|
pokeyman posted:My guess is the first test loads the model, then the second test tries to load the same model again and you're hooped. Does it go away if you init a single NSManagedObjectModel (like at the module level, so there's only ever one instance) and pass it in to the NSPersistentContainer's initializer? Or you could make it a static var in your test case superclass, just make sure there's only ever one. That fixed it, thanks!! I guess Xcode can run multiple tests concurrently, but I'm still fuzzy on how it happens that when multiple NSPersistentContainers are created in memory, the NSManagedObjectModels all get lumped together, creating duplicates.
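The fix sketched out (assuming the model file is named "Model"; the class name is a placeholder): load the NSManagedObjectModel exactly once and hand the same instance to every in-memory container, so the entity-to-class mappings are only registered once.

```swift
import CoreData
import XCTest

class CoreDataTestCase: XCTestCase {
    // One shared model for every test case, loaded exactly once.
    static let model: NSManagedObjectModel = {
        let url = Bundle(for: CoreDataTestCase.self)
            .url(forResource: "Model", withExtension: "momd")!
        return NSManagedObjectModel(contentsOf: url)!
    }()

    var container: NSPersistentContainer!

    override func setUp() {
        super.setUp()
        container = NSPersistentContainer(name: "Model",
                                          managedObjectModel: Self.model)
        let description = NSPersistentStoreDescription()
        description.type = NSInMemoryStoreType
        container.persistentStoreDescriptions = [description]
        container.loadPersistentStores { _, error in
            XCTAssertNil(error)
        }
    }

    override func tearDown() {
        container = nil
        super.tearDown()
    }
}
```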
|
# ¿ Dec 3, 2020 00:51 |
|
I'm working on an app with computer vision, and one thing I'm working on is having the user take multiple images of a target from multiple angles since every so often there is a bad angle that fools the vision model. I'm working with the attitude quaternion in Core Motion and trying to store the axis of rotation while ignoring the tilt (portrait/landscape), then making sure any new axis is at least a small angle from any previous ones. The default basis doesn't seem very conducive to this. I think the tilt of the phone is changing the axis? However I came across some code on github that seems to fix the problem. The issue is I only sort of understand quaternions, and I really don't understand what this code is doing and why it fixes the problem. My guess would be that it's tilting the quaternion from portrait into landscape since it's a VR app, but it's also flipping the x/y/z around for some reason? Also when I change the viewportTiltAngle to 0 it stops working for me. Here is the repo: https://github.com/bartlomiejn/6dof-vr/blob/425393b55312901a84ae91c3204aba8840d23bb0/6dof-vr/Modules/Motion/MotionService.swift Here is the function: code:
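This isn't what the linked repo does, but the standard tool for comparing rotation about one axis while discarding tilt is the swing-twist decomposition: project the quaternion's vector part onto the axis you care about and renormalize. A sketch:

```swift
import simd

// Swing-twist decomposition: extract the component of q that rotates
// about `axis` (the twist) and discard the rest (the swing). Using the
// gravity axis here is one way to ignore portrait/landscape tilt when
// comparing capture directions.
func twist(of q: simd_quatf, about axis: simd_float3) -> simd_quatf {
    let projected = simd_dot(q.imag, axis) * axis
    let t = simd_quatf(ix: projected.x, iy: projected.y,
                       iz: projected.z, r: q.real)
    return simd_normalize(t)
}
```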
|
# ¿ Jan 13, 2022 16:08 |
|
Pulcinella posted:I finally have a chance to use async/await for the project I’m on and…I don’t like it? I know it’s supposed to be simpler and more straightforward than the old method, but I’m so, so used to nested callbacks that it’s been hard to wrap my brain around async await. Like async seems to have this viral quality where when one thing is async, all of a sudden everything around it has to be async and it spreads from there. I just want to wait for one asynchronous data to be retrieved and push a view controller configured with that data but I’m trapped in async world! I just want to get back on the main thread! Give me a pyramid of doom any day. b-but most UI functions are marked as @MainActor so you can call them and have them run on the main thread without having to put DispatchQueue.main.async all over the place. If they aren't, just use await MainActor.run{}
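A sketch of that point (fetchItems() and ItemsViewController are hypothetical stand-ins, not from the thread):

```swift
import UIKit

func fetchItems() async -> [String] { [] }   // pretend network call

final class ItemsViewController: UIViewController {
    init(items: [String]) { super.init(nibName: nil, bundle: nil) }
    required init?(coder: NSCoder) { fatalError("not supported") }
}

func showItems(from navigation: UINavigationController) async {
    let items = await fetchItems()           // may run off the main actor
    await MainActor.run {
        // UIKit is annotated @MainActor, so this closure is the only
        // place that needs the main thread - no DispatchQueue.main.async.
        navigation.pushViewController(ItemsViewController(items: items),
                                      animated: true)
    }
}
```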
|
# ¿ May 28, 2022 19:24 |
|
I'm trying to customize a TabView, and I'm having some trouble understanding how the component works (it's dumb as hell for Apple to not make SwiftUI open-source to get it out of the buggy black-box phase faster)code:
|
# ¿ May 28, 2022 19:41 |
|
ultramiraculous posted:Yeah go with making a protocol IMO and have URLSession and your MockSession conform to it via an extension IMO. I wish they'd built the Foundation APIs to be protocol-based but what can you do at this point. Related WWDC session: https://developer.apple.com/videos/play/wwdc2018-417/?time=539
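One common shape for that suggestion (protocol and type names are mine): declare a protocol matching the URLSession method you use, conform URLSession with an empty extension, and swap in a mock for tests.

```swift
import Foundation

protocol DataFetching {
    func data(for request: URLRequest) async throws -> (Data, URLResponse)
}

// URLSession already has a method with this exact signature, so the
// conformance needs no body.
extension URLSession: DataFetching {}

struct MockSession: DataFetching {
    var stub: Data
    func data(for request: URLRequest) async throws -> (Data, URLResponse) {
        (stub, HTTPURLResponse(url: request.url!, statusCode: 200,
                               httpVersion: nil, headerFields: nil)!)
    }
}
```

Code under test then takes a `DataFetching` instead of a `URLSession`, and the mock is injected in unit tests.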
|
# ¿ Aug 27, 2022 14:44 |
|
The way SwiftUI handles focus seems bizarre to me. For every other UI library I've used, when you type in a text field and then click a button, the textfield loses focus before the action on the button is run, but on SwiftUI the textfield never loses focus so any handlers for that never get called if the button is pressed. Is there any way to rationalize this as anything other than a bug, and what is a clean way to force the lose focus handler on the text field to be called before the button action is run without creating dependencies between the components? I've tried putting sendAction/resignFirstResponder in the button handler, but that causes the lose focus handler to be called after the button action.
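One workaround sketch (not an official fix; names are mine): drive focus with @FocusState and make the ordering explicit in the button action instead of relying on the field losing focus first.

```swift
import SwiftUI

struct NameForm: View {
    @State private var name = ""
    @FocusState private var nameFocused: Bool

    var body: some View {
        VStack {
            TextField("Name", text: $name)
                .focused($nameFocused)
                .onChange(of: nameFocused) { focused in
                    if !focused { commitName() }  // taps elsewhere still commit
                }
            Button("Save") {
                // onChange fires on a later update pass, so run the
                // lose-focus work directly to guarantee it happens first.
                // commitName() should be idempotent, since onChange will
                // also fire once focus actually clears.
                nameFocused = false
                commitName()
                save()
            }
        }
    }

    private func commitName() { /* validate/commit the field value */ }
    private func save() { /* the button's real action */ }
}
```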
|
# ¿ Sep 27, 2022 18:23 |
|
How do you make a deep copy of a Set in Swift? I'm trying to transfer a Set containing a Core Data relationship from one thread to another, and I'm getting a Core Data error which I'm pretty sure is because of the bridged NSSet reference hiding in the Set type.
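Rather than deep-copying the Set, the usual Core Data pattern is to pass NSManagedObjectIDs across threads and re-fetch in the destination context, since managed objects (and the bridged NSSet hiding inside the Set) must not cross thread boundaries. A sketch:

```swift
import CoreData

func transfer<T: NSManagedObject>(_ objects: Set<T>,
                                  to context: NSManagedObjectContext,
                                  completion: @escaping ([T]) -> Void) {
    let ids = objects.map { $0.objectID }   // objectIDs are thread-safe
    context.perform {
        // Re-materialize each object in the destination context.
        let rehomed = ids.compactMap { context.object(with: $0) as? T }
        completion(rehomed)
    }
}
```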
|
# ¿ Oct 8, 2022 16:21 |
|
|
Is anyone using ViewInspector to unit test SwiftUI? I'm about to start using it since it seems like the only game in town, but I'm hesitant because 1. It requires inserting boilerplate code into Views in order to test @State vars 2. Apple could come out with a way to unit test SwiftUI at any time 3. The owner is Russian and hasn't posted in a month, what happens if he gets conscripted?
|
# ¿ Oct 27, 2022 20:13 |