SaTaMaS
Apr 18, 2003
I'm working on a data-driven app that uses CATiledLayer (inside a UIView) inside a UIScrollView, but I'm not sure how to make it append new tiles on the right side while scrolling to the left side, as I get new data. On each update I make the contentSize wider and move the contentOffset backwards, but instead of adding on new tiles, it just stretches out the existing tiles.

edit: Just needed to change the contentMode and call setNeedsDisplay() on the UIView
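For anyone who hits the same thing, the fix looks roughly like this sketch. `tiledView` and `newTileWidth` are placeholder names, and `.redraw` is my assumption for the contentMode change:

```swift
// Sketch of the fix: .redraw makes the view re-render when its bounds
// change instead of scaling the existing layer contents.
tiledView.contentMode = .redraw
tiledView.frame.size.width += newTileWidth
scrollView.contentSize = tiledView.frame.size
scrollView.contentOffset.x += newTileWidth   // keep the visible tiles in place
tiledView.setNeedsDisplay()                  // CATiledLayer requests fresh tiles
```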

SaTaMaS fucked around with this message at 16:38 on May 9, 2017


SaTaMaS
Apr 18, 2003
I'm working on an OSX Cocoa application and using NSArrayControllers for the first time. I have the table displaying data, but one of the variables in my model is an array of strings. For some reason it's only displaying a single value out of the array. I have a workaround by using a derived variable, but is there some way to get a table cell to display an array as a comma-delimited string? (Having a Model Key Path of objectValue.array.description just returns a KVO error)
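The cleanest thing I've found so far is a ValueTransformer bound through the column's value binding. This is a sketch, not a confirmed answer, and the class name is made up:

```swift
import Foundation

// Hypothetical transformer: bind the table column's value through this
// to show an array of strings as "a, b, c" in the cell.
final class JoinArrayTransformer: ValueTransformer {
    override class func transformedValueClass() -> AnyClass { NSString.self }
    override class func allowsReverseTransformation() -> Bool { false }
    override func transformedValue(_ value: Any?) -> Any? {
        guard let strings = value as? [String] else { return nil }
        return strings.joined(separator: ", ")
    }
}

// Register once at launch, then reference it by name in the binding's
// Value Transformer field in Interface Builder:
// ValueTransformer.setValueTransformer(JoinArrayTransformer(),
//                                      forName: .init("JoinArrayTransformer"))
```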

SaTaMaS
Apr 18, 2003
Does anyone know of any good 3.5mm headphone/microphone splitters for the iPhone? Apparently it's incredibly difficult to find one that works as expected. I found one that works partially -
https://www.amazon.com/StarTech-com-headsets-separate-headphone-microphone/dp/B004SP0WAQ

But the problem is that the microphone input bleeds into the headphone output. The only other option seems to be soldering a 1600 Ohm resistor into the microphone line (seriously)
https://electronics.stackexchange.com/questions/38452/electronic-aspects-of-iphone-3-5mm-audio-output

SaTaMaS
Apr 18, 2003
I'm working on an AR app using Vuforia/OpenGL. Is it possible to move between ViewControllers to swap out UI/Trackers while leaving the video background unchanged? The video image is rendered to a texture and drawn in OpenGL.

edit: Actually I think I found something that might do it: https://www.cleveroad.com/blog/playing-blurred-video-in-background-with-gpuimage

SaTaMaS fucked around with this message at 02:41 on Sep 8, 2017

SaTaMaS
Apr 18, 2003
I'm trying to import a mesh from Blender into iOS OpenGL. I'm trying two different ways to do it, either would be fine but both are running into problems.

1. Convert a mesh from an OBJ file into a .h file. I've been using mtl2opengl, but it doesn't seem to like the fact that the file contains multiple meshes, and each one is basically a curved line (I'm creating a grid which needs to go over a person's head in my AR app). It counts the vertices, but just outputs an empty file.
2. Import the mesh using Model I/O and render it using GLKit. I'm able to import the file into a GLKMesh, but Apple's documentation for GLKMesh is non-existent (since they want everyone to use Metal instead, but I really want this to be as cross-platform as possible). I have no idea what the rendering call should look like, and my current attempt just crashes -
glDrawElements(_mesh.submeshes[0].mode, _mesh.submeshes[0].elementCount, _mesh.submeshes[0].type, (__bridge GLvoid*)_mesh.submeshes[0].elementBuffer);
I can't find any sample code, but it seems like there should be some repo somewhere that someone has already done this.
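In case anyone else hits this, here's roughly what I think the draw call needs to look like. This is an unverified sketch: GLKSubmesh's elementBuffer is a GLKMeshBuffer, so it has to be bound by its GL buffer name and indexed by byte offset, not bridged to a pointer (which would explain the crash above). It assumes the VAO for the mesh's vertex buffers is already set up and bound:

```swift
import GLKit

// Sketch of drawing a GLKMesh's submeshes.
func draw(_ mesh: GLKMesh) {
    for submesh in mesh.submeshes {
        // Bind the index buffer by name, then pass its offset as the
        // "indices" argument, the usual GL pattern for buffer-backed indices.
        glBindBuffer(GLenum(GL_ELEMENT_ARRAY_BUFFER),
                     submesh.elementBuffer.glBufferName)
        glDrawElements(submesh.mode,
                       submesh.elementCount,
                       submesh.type,
                       UnsafeRawPointer(bitPattern: submesh.elementBuffer.offset))
    }
}
```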

SaTaMaS
Apr 18, 2003

Doc Block posted:

GLKit is (relatively) old and seems to have mostly been intended as a stopgap to help developers switch from OpenGL ES 1.1 to ES 2.0.

If you're concerned about your app being cross platform, why are you using GLKit and/or Model I/O? Maybe take a look at something like AssImp for loading your model instead.

Does your drawing code work if you replace the GLKMesh with a hand-written cube? You've set up your VAO, VBO, and element buffer correctly, and they're bound?

The main reason is because I'm having a hard time figuring out which method I should go with for loading OBJ files. Model I/O seemed like a straightforward solution except for the lack of sample code. I have everything working with a cube loaded in from a .h file, though I'm just using a simple array buffer instead of an element buffer. I briefly looked at AssImp, maybe that's the solution instead of Model I/O.

SaTaMaS
Apr 18, 2003

Doc Block posted:

GLKit is (relatively) old and seems to have mostly been intended as a stopgap to help developers switch from OpenGL ES 1.1 to ES 2.0.

If you're concerned about your app being cross platform, why are you using GLKit and/or Model I/O? Maybe take a look at something like AssImp for loading your model instead.

Does your drawing code work if you replace the GLKMesh with a hand-written cube? You've set up your VAO, VBO, and element buffer correctly, and they're bound?

It looks like AssImp also blows up when I try to load lines instead of meshes. I'm getting the error "aiScene::mNumMeshes is 0. At least one mesh must be there". Is there a way to make AssImp expect lines instead of triangle meshes?

SaTaMaS
Apr 18, 2003
How do people learn Accelerate? I'm trying to use it for linear algebra and signal processing, but I haven't seen any book for it, and the documentation is bare-bones. At the moment I'm digging through github and stackoverflow looking for sample code, but I don't know why it has to be this much of a pain.
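For reference, the kind of thing I mean: even a trivial vDSP call takes trawling headers and Stack Overflow to figure out. A minimal element-wise add looks like this:

```swift
import Accelerate

// Element-wise vector add with vDSP: result[i] = a[i] + b[i]
let a: [Float] = [1, 2, 3, 4]
let b: [Float] = [10, 20, 30, 40]
var result = [Float](repeating: 0, count: a.count)
vDSP_vadd(a, 1, b, 1, &result, 1, vDSP_Length(a.count))
// result is now [11.0, 22.0, 33.0, 44.0]
```

The stride arguments (the 1s) and the vDSP_Length cast are exactly the sort of thing the docs barely explain.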

SaTaMaS
Apr 18, 2003
edit: determined what I wanted to do wasn't possible

SaTaMaS fucked around with this message at 04:02 on Feb 4, 2018

SaTaMaS
Apr 18, 2003
As a former Flash developer and now Swift/Obj-C developer the downward trend on this chart is giving me panic attacks
https://trends.google.com/trends/explore?date=today%205-y&q=%2Fm%2F010sd4y3

How worried should I be? I'm sure there will be Swift jobs out there as long as there are iPhones, but lately more and more companies have been switching to React Native/NativeScript even though JavaScript is a garbage language for app development.

SaTaMaS
Apr 18, 2003

Doh004 posted:

Looks pretty flat to me? If you're talking about the recent dip, then maybe it's a lull between now and WWDC?

It's not dropping that much, but if Swift peaked this fast it's still alarming

SaTaMaS
Apr 18, 2003

Toady posted:

We're well past the point where Objective-C, as fond as I am of it, could be considered something other than legacy technology. It's clear where the focus is.

There seems to be a good reason to still write libraries in it, so you can pull in C/C++ code and interface it with Swift

SaTaMaS
Apr 18, 2003
I have a 3rd party framework which is used in both a sub-project (that I have set up as a library) and my main project. When I only include the library in one or the other I get a linker error, but when I include it in both I get the error:

Class <SharedClass> is implemented in both <SubProject> and <MainProject>. One of the two will be used. Which one is undefined.

How do I include the framework in both without getting a conflict?

SaTaMaS
Apr 18, 2003

pokeyman posted:

Is the third party framework a static or dynamic library? If it’s dynamic I think you can just link against it from both and you should be ok? From the sounds of things it’s static, in which case you should try linking it only in the subproject but add the third party framework's headers to your main project's header search paths build setting. That way you only link one copy of the library into your main project's product.

Disclaimer: when confronted with scenarios like yours I tend to end up mashing buttons until something happens that approximates what I want. So I may have led you on a wild goose chase here.

It's a dynamic library. I tried linking the same library file in both the parent and sub-project, since if I don't provide something for "Linked Frameworks and Libraries" I get linker errors, but what Xcode seems to want to do is still include the same framework file twice, which produces the duplication.

SaTaMaS
Apr 18, 2003
It turned out the key was dragging/reordering the frameworks in "Link Binary with Libraries" so that the sub-project was linked before the library. +1 for button mashing! :f5:

SaTaMaS
Apr 18, 2003

TheReverend posted:

UI tests.

Sometimes finds my elements?! Sometimes not. I say 90% of the time they are found for any single test.

Now do this with 20 tests and you can see why making this part of our CI flow pisses me off!

I'd really like the folks at Apple not make this suck so hard!

Any hot tips?

(Sorry for the long delay, I just randomly decided to check the thread today) If these are dynamically added elements, you need to set accessibilityElements on the cell and post a layout changed notification. e.g., at the end of the listener where the UI is added:

code:
self.accessibilityElements = [your new UI]
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil)

that updates the list of elements that the UI test can access.

SaTaMaS
Apr 18, 2003
I need to do a Hilbert transform on some data. A quick google search turned up this answer on stackoverflow, is this about as good as it gets, or is there a better version available?

https://stackoverflow.com/a/21907439

SaTaMaS
Apr 18, 2003
Is Autoresizing being deprecated? When I use it for simple layouts where I don't need to worry about clipping I get the warning "Auto Layout Localization: Views without any layout constraints may clip their content or overlap other views." Is there any way to remove that error without having to use constraints?

SaTaMaS
Apr 18, 2003
Are there any good resources on functional design patterns for Swift? As I understand it, functional programming has obsoleted a lot of the design patterns as they were presented in the Gang of Four book, but when I look for Swift design patterns online, they tend to be OOP implementations of the GOF patterns that don't take advantage of Swift's functional capabilities.

SaTaMaS
Apr 18, 2003
A penetration tester went through my app at work, and one thing they flagged was that stack-smashing protection was off. The way this is checked is by looking for flags in the binary using "otool -I -v AppName|grep stack" and looking for ___stack_chk_fail/___stack_chk_guard. These flags seem to appear in apps that use obj-c but not swift-only apps. I suspect that ssp is on in swift-only apps but the flags are getting stripped out, but can't find any confirmation, has anyone dealt with this before?

I'm reading some stuff saying that the flags might show up in an ad-hoc deployment IPA, but I can't generate those with my work certificate

SaTaMaS fucked around with this message at 19:25 on May 11, 2020

SaTaMaS
Apr 18, 2003

Pulcinella posted:

Has anyone tried to do work in RealityKit? It seems unfinished.

The documentation for ARView mentions things that don’t exist. This init method doesn’t exist. There also is no camera mode property as well.

It seems like it's meant for some improbable future where there needs to be coordination between multiple AR overlays in different parts of a room, when back in this reality one overlay is plenty and all the sample code is still done in SceneKit. Plus the ARKit tracking for just a single anchor is still pretty crappy without the LIDAR hardware no one has, so they're completely putting the cart before the horse.

SaTaMaS
Apr 18, 2003

Pulcinella posted:

I was alerted to it from this thread where an Apple employee mentions it (followed by a few people saying it doesn’t work). The thread makes it seem like SceneKit is basically dead in terms of active development (no surprise). I imagine it was something that was cut from iOS 13 and the documentation was never removed, so I expect there will be another big push for AR during WWDC. Reality Composer is also pretty buggy.

There's also this (https://www.bloombergquint.com/businessweek/apple-team-working-on-vr-and-ar-headset-and-ar-glasses)

quote:

Cody White, who helped develop Apple’s RealityKit software, which allows developers to implement 3D rendering in augmented-reality apps for the iPhone and iPad, quit in December

Not a good look!

SaTaMaS
Apr 18, 2003

Stringent posted:

I honestly can't shake the feeling that SwiftUI is abandonware. I haven't thought through it enough to give a solid argument for why that may be, but that's what my gut is telling me.

The best reason I can come up with to use it is for large teams where the version control for storyboards would quickly become a nightmare. That's not a compelling reason for the vast majority of teams.

SaTaMaS
Apr 18, 2003

Stringent posted:

Alternatively, is Swift's implementation of UIKit not already half baked enough to preclude any kind of major paradigm change like SwiftUI?

SwiftUI isn't really a paradigm change, it's a new layer of abstraction on top of UIKit.

SaTaMaS
Apr 18, 2003

ketchup vs catsup posted:

I have not seen a company of any size use storyboards since...2016?

My personal take is the moment the app becomes something you intend to release, lay out your views in code.

It must be wonderful to not have to worry about capricious UX designers

SaTaMaS
Apr 18, 2003

101 posted:

Would there be a reason to do this over using xibs?

If you needed a customized UINavigationItem

SaTaMaS
Apr 18, 2003
It sure is going to be awesome when junior developers start showing up who know SwiftUI + Combine but don't know the first thing about ViewControllers, Core Animation, or how to structure non-SwiftUI apps...:corsair:

SaTaMaS
Apr 18, 2003
Has anyone used coremltools to generate a model? I'd really like the resulting bounding boxes to be normalized the same standard way as other VNImageBasedRequest results are; however, all that seems to be visible on the Python side is the scaled input image in pixels, with nothing about the screen size or the original size of the input image, which would be necessary in order to normalize.

SaTaMaS
Apr 18, 2003

KidDynamite posted:

anyone have a good way to do a bottom half modal?

use a UIPresentationController
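e.g. a rough sketch (the class and view controller names are placeholders):

```swift
import UIKit

// A presentation controller that pins the presented view controller
// to the bottom half of the screen.
final class BottomHalfPresentationController: UIPresentationController {
    override var frameOfPresentedViewInContainerView: CGRect {
        guard let bounds = containerView?.bounds else { return .zero }
        return CGRect(x: 0, y: bounds.midY,
                      width: bounds.width, height: bounds.height / 2)
    }
}

// On the presenting side, set modalPresentationStyle = .custom on the
// presented controller and return this from its transitioning delegate:
extension SomeViewController: UIViewControllerTransitioningDelegate {
    func presentationController(forPresented presented: UIViewController,
                                presenting: UIViewController?,
                                source: UIViewController) -> UIPresentationController? {
        BottomHalfPresentationController(presentedViewController: presented,
                                         presenting: presenting)
    }
}
```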

SaTaMaS
Apr 18, 2003
What do Swift developers think of this? https://www.iosapptemplates.com/blog/swiftui/swiftui-drawbacks
According to this, SwiftUI isn't yet ready for complex projects. Florian does a lot of good stuff on Swift so I'm inclined to agree.

SaTaMaS
Apr 18, 2003
Has anyone done much unit testing with Core Data? I have some test files that each succeed individually, but when I try to run them all I get the error "Failed to find a unique match for an NSEntityDescription to a managed object subclass" as though the tests were creating multiple NSPersistentContainers (I'm only creating them in memory) at the same time and the entity names were colliding. These are all instance level NSPersistentContainer variable and since they're only in memory I'm not sure why they know about each other or how to fix it.

SaTaMaS
Apr 18, 2003

pokeyman posted:

Random guess: are you creating multiple instances of your managed object model?

On each test case I initialize an NSPersistentContainer and loadPersistentStores in setUp and destroyPersistentStore and set the NSPersistentContainer to nil in tearDown

It looks like I'm creating multiple identical persistent stores somehow, even though each test file works when run individually. Multiple test files extend the same XCTestCase superclass I created for testing Core Data, but since the NSPersistentContainer is instance-level I'm not sure where the problem is

SaTaMaS fucked around with this message at 03:00 on Dec 2, 2020

SaTaMaS
Apr 18, 2003

pokeyman posted:

My guess is the first test loads the model, then the second test tries to load the same model again and you're hooped. Does it go away if you init a single NSManagedObjectModel (like at the module level, so there's only ever one instance) and pass it in to the NSPersistentContainer's initializer? Or you could make it a static var in your test case superclass, just make sure there's only ever one.

I could be barking up the wrong tree, but this is sounding awfully familiar, and that's how I got around it.

edit: yeah this https://stackoverflow.com/questions/51851485/multiple-nsentitydescriptions-claim-nsmanagedobject-subclass

That fixed it, thanks!! I guess Xcode can run multiple tests concurrently, but I'm still fuzzy on how creating multiple in-memory NSPersistentContainers causes the NSManagedObjectModels to all get lumped together and create duplicates.
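For posterity, the shape of the fix is a single shared model instance, roughly like this sketch ("Model" stands in for the actual .xcdatamodeld name):

```swift
import CoreData
import XCTest

// Load the managed object model exactly once per process and hand the
// shared instance to every in-memory container, so repeated setUp calls
// don't register duplicate NSEntityDescriptions.
class CoreDataTestCase: XCTestCase {
    static let model: NSManagedObjectModel = {
        let url = Bundle(for: CoreDataTestCase.self)
            .url(forResource: "Model", withExtension: "momd")!
        return NSManagedObjectModel(contentsOf: url)!
    }()

    var container: NSPersistentContainer!

    override func setUp() {
        super.setUp()
        container = NSPersistentContainer(name: "Model",
                                          managedObjectModel: Self.model)
        let description = NSPersistentStoreDescription()
        description.type = NSInMemoryStoreType
        container.persistentStoreDescriptions = [description]
        container.loadPersistentStores { _, error in XCTAssertNil(error) }
    }
}
```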

SaTaMaS
Apr 18, 2003
I'm working on an app with computer vision, and one thing I'm working on is having the user take multiple images of a target from multiple angles since every so often there is a bad angle that fools the vision model. I'm working with the attitude quaternion in Core Motion and trying to store the axis of rotation while ignoring the tilt (portrait/landscape), then making sure any new axis is at least a small angle from any previous ones. The default basis doesn't seem very conducive to this. I think the tilt of the phone is changing the axis? However I came across some code on github that seems to fix the problem. The issue is I only sort of understand quaternions, and I really don't understand what this code is doing and why it fixes the problem. My guess would be that it's tilting the quaternion from portrait into landscape since it's a VR app, but it's also flipping the x/y/z around for some reason? Also when I change the viewportTiltAngle to 0 it stops working for me.

Here is the repo:
https://github.com/bartlomiejn/6dof-vr/blob/425393b55312901a84ae91c3204aba8840d23bb0/6dof-vr/Modules/Motion/MotionService.swift

Here is the function:
code:
    private func correctedRotation(from quaternion: CMQuaternion?) -> simd_float4 {
        guard let quaternion = quaternion else {
            return simd_float4()
        }
        
        let quat = simd_quatf(quaternion)
        let correctedAxisAngle = simd_float4(
            quat.axis.y,
            quat.axis.z,
            -quat.axis.x,
            quat.angle)
        
        let correctedQuat = simd_quatf.fromAxisAngle(correctedAxisAngle)
        
        let viewportTiltAngle = Float(90.0.degreesToRadians)
        let tiltQuat = simd_quatf(
            ix: -1.0 * sin(viewportTiltAngle / 2),
            iy: 0.0 * sin(viewportTiltAngle / 2),
            iz: 0.0 * sin(viewportTiltAngle / 2),
            r: cos(viewportTiltAngle / 2))
        
        let correctedQuat2 = correctedQuat * tiltQuat
        
        let correctedAxisAngle2 = simd_float4(
            correctedQuat2.axis.x,
            correctedQuat2.axis.z,
            correctedQuat2.axis.y,
            correctedQuat2.angle)
        
        return correctedAxisAngle2
    }
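For what it's worth, here's a sketch of a different way to get at what I actually want: rotate a fixed "camera forward" vector by the attitude and compare directions, so roll about the camera's own axis drops out instead of contaminating the axis. The function names are mine and this is untested:

```swift
import Foundation
import simd
import CoreMotion

// Direction the camera is pointing in the reference frame. Roll about
// the view axis leaves this vector unchanged, so tilt is ignored.
func captureDirection(_ attitude: CMQuaternion) -> simd_float3 {
    let q = simd_quatf(ix: Float(attitude.x), iy: Float(attitude.y),
                       iz: Float(attitude.z), r: Float(attitude.w))
    // The rear camera looks along -z in the device frame.
    return simd_normalize(q.act(simd_float3(0, 0, -1)))
}

// True when the new direction is at least `minAngle` radians away from
// every previously stored direction.
func isNovel(_ dir: simd_float3, previous: [simd_float3], minAngle: Float) -> Bool {
    previous.allSatisfy {
        acos(simd_clamp(simd_dot($0, dir), -1, 1)) >= minAngle
    }
}
```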

SaTaMaS
Apr 18, 2003

Pulcinella posted:

I finally have a chance to use async/await for the project I’m on and…I don’t like it? I know it’s supposed to be simpler and more straightforward than the old method, but I’m so, so used to nested callbacks that it’s been hard to wrap my brain around async await. Like async seems to have this viral quality where when one thing is async, all of a sudden everything around it has to be async and it spreads from there. I just want to wait for one asynchronous data to be retrieved and push a view controller configured with that data but I’m trapped in async world! I just want to get back on the main thread! Give me a pyramid of doom any day.

This is mostly just griping. I should probably go re-watch those WWDC videos.

b-but most UI functions are marked as @MainActor so you can call them and have them run on the main thread without having to put DispatchQueue.main.async all over the place. If they aren't, just use await MainActor.run{}
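e.g., a sketch of the shape (`fetchDetail` and `DetailViewController` are made-up names, and this assumes you're inside a view controller):

```swift
// One async fetch, then hop back to the main actor for the UI work.
func showDetail(id: Int) {
    Task {
        let detail = try await fetchDetail(id: id)   // off the main thread
        await MainActor.run {
            let vc = DetailViewController(detail: detail)
            navigationController?.pushViewController(vc, animated: true)
        }
    }
}
```

The Task at the top acts as the boundary, so the async-ness doesn't have to spread any further than this one method.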

SaTaMaS
Apr 18, 2003
I'm trying to customize a TabView, and I'm having some trouble understanding how the component works (it's dumb as hell for Apple to not make SwiftUI open-source to get it out of the buggy black-box phase faster)

code:
struct ContentView: View {
    @State private var selection = 0
    @State private var selection2 = 1
    
    var body: some View {
        TabView(selection: $selection) {
            TabView(selection: $selection2) {
                Text("View A")
                    .font(.title)
                    .tabItem{ Text("View A") }
                    .tag(0)
                Text("View B")
                    .font(.title)
                    .tabItem{ Text("View B") }
                    .tag(1)
                Text("View C")
                    .font(.title)
                    .tabItem{ Text("View C") }
                    .tag(2)
            }
            .tabItem{ Text("View 1") }
            .tag(0)
        }
    }
}
This sample works correctly, with 1 item in the parent and 3 items in the child. It seems likely that Apple is using a PreferenceKey to hold the tabItems, but I don't understand why the .tabItem() calls from inside the child TabView don't propagate to the parent TabView. Is there some way to keep the child and parent PreferenceKey structs separate, or some way to achieve this without using PreferenceKey, which AFAIK gets set all the way up the chain of parent views?

SaTaMaS
Apr 18, 2003

ultramiraculous posted:

Yeah go with making a protocol IMO and have URLSession and your MockSession conform to it via an extension IMO. I wish they'd built the Foundation APIs to be protocol-based but what can you do at this point :shrug:.

related WWDC session: https://developer.apple.com/videos/play/wwdc2018-417/?time=539

SaTaMaS
Apr 18, 2003
The way SwiftUI handles focus seems bizarre to me. For every other UI library I've used, when you type in a text field and then click a button, the textfield loses focus before the action on the button is run, but on SwiftUI the textfield never loses focus so any handlers for that never get called if the button is pressed. Is there any way to rationalize this as anything other than a bug, and what is a clean way to force the lose focus handler on the text field to be called before the button action is run without creating dependencies between the components? I've tried putting sendAction/resignFirstResponder in the button handler, but that causes the lose focus handler to be called after the button action.
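The least-bad workaround I've come up with is to drive focus through @FocusState and run the lose-focus logic explicitly at the top of the button action. An untested sketch, assuming iOS 15, with made-up names:

```swift
import SwiftUI

struct FormView: View {
    @State private var name = ""
    @FocusState private var nameFocused: Bool

    var body: some View {
        VStack {
            TextField("Name", text: $name)
                .focused($nameFocused)
                .onChange(of: nameFocused) { focused in
                    if !focused { validate() }   // lose-focus handler
                }
            Button("Save") {
                nameFocused = false   // make the field resign first
                validate()            // run the handler's logic explicitly
                save()
            }
        }
    }

    func validate() { /* hypothetical */ }
    func save() { /* hypothetical */ }
}
```

Not pretty, since the button has to know the field needs validating, but it avoids reaching into UIKit with sendAction/resignFirstResponder.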

SaTaMaS
Apr 18, 2003
How do you make a deep copy of a Set in Swift? I'm trying to transfer a Set containing a Core Data relationship from one thread to another, and I'm getting a Core Data error which I'm pretty sure is because of the bridged NSSet reference hiding in the Set type.
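For context on why I suspect the NSSet: my understanding is that no copy, deep or not, helps here, because managed objects are tied to the context that created them. The usual pattern I've seen is to pass objectIDs across and re-fetch, roughly like this sketch (`Item` is a placeholder entity):

```swift
import CoreData

// objectIDs are thread-safe tokens; the objects themselves are not.
let ids = sourceSet.map { $0.objectID }

backgroundContext.perform {
    // Re-materialize the objects on the destination context's queue.
    let items = ids.compactMap { backgroundContext.object(with: $0) as? Item }
    // work with `items` here
}
```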


SaTaMaS
Apr 18, 2003
Is anyone using ViewInspector to unit test SwiftUI? I'm about to start using it since it seems like the only game in town, but I'm hesitant because
1. It requires inserting boilerplate code into Views in order to test @State vars
2. Apple could come out with a way to unit test SwiftUI at any time
3. The owner is Russian and hasn't posted in a month, what happens if he gets conscripted?
