LP0 ON FIRE
Jan 25, 2006

beep boop

Dog on Fire posted:

Have you checked your "iPhone Developer: ..." certificate in Keychain Access? If you haven't, maybe there's some useful information there? For example, whether the certificate is valid, and if it isn't, why.

"Checked" as in right-click on the certificate -> "Get Info" or "Evaluate ..."

“Certificate status: Good” for both Apple IDs

But!! “No root certificate found”

Edit: I just checked them again and now it says “success” instead of “No root certificate found”. What??

LP0 ON FIRE fucked around with this message at 20:39 on Feb 21, 2018


LP0 ON FIRE
Jan 25, 2006

beep boop
See my updates in my last post. The Evaluation Status is saying different things sometimes.

LP0 ON FIRE
Jan 25, 2006

beep boop

pokeyman posted:

Agreed. Probably should’ve checked if it was a solo or team situation.

It says team but solo makes sense for me.

LP0 ON FIRE
Jan 25, 2006

beep boop

pokeyman posted:

Sorry I’m being unclear, I meant solo/team as in number of people involved, not the dev portal concept of "team".

lol

LP0 ON FIRE
Jan 25, 2006

beep boop
e: Think I found a pretty good tutorial. Still working through it. https://www.patreon.com/posts/create-ios-xcode-10361342

Is there a built-in UI element similar to the one at the bottom of the camera app where you select video/photo/square? Kind of like a sideways picker view. The closest I can find is the page control element with the dots.

LP0 ON FIRE fucked around with this message at 17:14 on Feb 26, 2018

LP0 ON FIRE
Jan 25, 2006

beep boop
What was mentioned in that video tutorial was great, and it was simple enough that I didn't need to use a pre-made one. The only thing that didn't work for me was setting the size of the label and view added to the UIPickerView in pickerView(_ pickerView: UIPickerView, viewForRow row: Int, forComponent component: Int, reusing view: UIView?) -> UIView

Instead, I used pickerView(_ pickerView: UIPickerView, rowHeightForComponent component: Int) -> CGFloat to return a height for each row.
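For anyone trying the same thing, here's a minimal sketch of that delegate setup, with the row sizing done in rowHeightForComponent rather than by resizing the returned view. The class, item names, and the rotation trick for making the picker read sideways are my own placeholders, not from the tutorial:

```swift
import UIKit

// Sketch: supply row content in viewForRow, control sizing via
// rowHeightForComponent. Names here are placeholders.
class SidewaysPickerDelegate: NSObject, UIPickerViewDelegate, UIPickerViewDataSource {
    let items = ["Video", "Photo", "Square"]

    func numberOfComponents(in pickerView: UIPickerView) -> Int { return 1 }

    func pickerView(_ pickerView: UIPickerView, numberOfRowsInComponent component: Int) -> Int {
        return items.count
    }

    func pickerView(_ pickerView: UIPickerView, viewForRow row: Int,
                    forComponent component: Int, reusing view: UIView?) -> UIView {
        let label = (view as? UILabel) ?? UILabel()
        label.text = items[row]
        label.textAlignment = .center
        // Rotate each row label so a picker that is itself rotated -90°
        // reads horizontally, like the camera app's selector.
        label.transform = CGAffineTransform(rotationAngle: .pi / 2)
        return label
    }

    // Sizing the rows here worked where resizing the view in viewForRow did not.
    func pickerView(_ pickerView: UIPickerView, rowHeightForComponent component: Int) -> CGFloat {
        return 80
    }
}
```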

LP0 ON FIRE
Jan 25, 2006

beep boop
edit: Sorry, I always post here way too soon, but I swear I feel hopeless and out of answers. I think I got it, though I don't know if it's technically correct. I check whether it's cancelled to break out of the loop. But why do I need to? It's cancelled, so shouldn't it handle that itself?:
code:
recordedTimeQueue = DispatchWorkItem {
	for i in 1 ..< 600 { // 600 max seconds
		if (recordedTimeQueue?.isCancelled)! { break }
		usleep(1000000)
		DispatchQueue.main.async {
			recordedTime?.text = "\(i)"
		}
	}
}
I have a DispatchWorkItem I'm trying to invalidate. I can see my timer still runs after it's stopped, like there's an additional timer updating the text.

code:
recordedTimeQueue = DispatchWorkItem {
	for i in 1 ..< 600 { // 600 max seconds
		usleep(1000000)
		DispatchQueue.main.async {
			recordedTime?.text = "\(i)"
		}
	}
}

DispatchQueue.global().async(execute: recordedTimeQueue!)
When I stop it:

code:
print("recording stopped")

recordedTimeQueue?.cancel()
I even tried making recordedTimeQueue nil.

One guess is that I need to create a global bool so I can return out of the for loop, but I don't know why that would be required.

LP0 ON FIRE fucked around with this message at 18:20 on Mar 23, 2018

LP0 ON FIRE
Jan 25, 2006

beep boop
I was having slowdown problems that I thought were from my video capture DispatchWorkItem updating the text, but it looks like it's from attempting a video capture multiple times, eventually ending with this crash message: *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: '*** -[AVAssetWriterInputPixelBufferAdaptor appendPixelBuffer:withPresentationTime:] Must start a session (using -[AVAssetWriter startSessionAtSourceTime:) before appending pixel buffers'

LP0 ON FIRE fucked around with this message at 21:00 on Mar 23, 2018

LP0 ON FIRE
Jan 25, 2006

beep boop

Dog on Fire posted:

I guess the work item, if it has started, will finish even if you cancel it midway. That kind of makes sense to me, because having a piece of code stop at a random position would usually lead to all sorts of nasty things, so I'd say this really can't happen automatically and we just have to do the stopping ourselves when we need to.


Doh004 posted:

I haven't had the opportunity to use the new Dispatch APIs, but my hot take on all of this:

- Don't do things on the main thread if you don't need to
- Use the built-in APIs to check if a worker has been cancelled and break out of the loop

This seems like a good blog post about it: https://medium.com/@yostane/swift-sweet-bits-the-dispatch-framework-ios-10-e34451d59a86

I've been breaking out of the loop with isCancelled, which makes the timer I was having trouble with work correctly now. But it looks like the real problem is that it somehow thinks I'm not starting a session before appending, if I stop and reattempt several videos in a row. I'm stepping through it and don't see how that's possible.

Thanks for the link. There's some stuff in there I'm not familiar with, so I'll go through it and see if anything helps.
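To spell out the cooperative-cancellation point from the replies above: cancel() only sets a flag once the block has started, so the loop has to poll isCancelled itself. A minimal sketch, with placeholder names:

```swift
import Foundation

var timerWork: DispatchWorkItem?

timerWork = DispatchWorkItem {
    for i in 1 ..< 600 { // 600 max seconds
        // Without this check the loop runs to completion even after cancel().
        if timerWork?.isCancelled ?? true { break }
        usleep(1_000_000)
        DispatchQueue.main.async {
            // update the timer label with i here
            print(i)
        }
    }
}

DispatchQueue.global().async(execute: timerWork!)

// Later, when recording stops:
timerWork?.cancel() // takes effect at the next isCancelled check
```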

LP0 ON FIRE fucked around with this message at 18:17 on Mar 26, 2018

LP0 ON FIRE
Jan 25, 2006

beep boop
I THINK I fixed it, although I don't want to say I'm sure until it's been tested for a while. It usually happened after I'd made a video about 15 or 16 times, and so far I can go way past that without the video player turning invisible or crashing with the session-not-started error.

All it took was making the player layer nil when exiting the view. I also still have this wrapped in an asynchronous queue from before, which I saw suggested, and it doesn't seem to hurt.

code:
DispatchQueue.main.async {
	self.player?.pause()
	self.playerLayer?.removeFromSuperlayer()
	self.playerLayer = nil
	self.player = nil
}

LP0 ON FIRE
Jan 25, 2006

beep boop
Right now I have a "touch to focus and set exposure" feature on my camera app. How can I have the best of both worlds, much like Apple's camera app, where it both auto-focuses and can touch to focus?

I have a Stack Overflow thread about it: https://stackoverflow.com/questions/49541837/swift-autofocus-expose-and-continuousautofocus-exposure-at-the-same-time

It's kind of hard for me to imagine how this works, but I guess it checks how much the camera's input changes. I see that there is a custom exposure mode (and something similar with focus, I think), but I haven't seen many good examples of how this works together.

LP0 ON FIRE
Jan 25, 2006

beep boop
Progress! :dance:

I figured out how to have touch focus and switch back to continuous focus when the scene changes. I'm so glad it's easy. Basically you want your device's isSubjectAreaChangeMonitoringEnabled property set to true, then add an observer that looks like this:

code:
NotificationCenter.default.addObserver(self,
	selector: #selector(self.setDefaultFocusAndExposure),
	name: NSNotification.Name.AVCaptureDeviceSubjectAreaDidChange,
	object: nil)
This switches back to the default continuous mode in case the user was using touches and the scene dramatically changes.
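For completeness, the handler that observer points at could look roughly like this (a sketch inside the camera view controller; the device lookup and error handling are simplified placeholders):

```swift
import AVFoundation

@objc func setDefaultFocusAndExposure() {
    guard let device = AVCaptureDevice.default(for: .video) else { return }
    do {
        try device.lockForConfiguration()
        // Required for AVCaptureDeviceSubjectAreaDidChange to fire at all.
        device.isSubjectAreaChangeMonitoringEnabled = true
        if device.isFocusModeSupported(.continuousAutoFocus) {
            device.focusMode = .continuousAutoFocus
        }
        if device.isExposureModeSupported(.continuousAutoExposure) {
            device.exposureMode = .continuousAutoExposure
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
```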

I'm still having a few major issues I'm stumped with. One is trying to rotate a video after I record it. Or maybe I should set that while I'm recording? I just don't know.

The other is getting my mic to record when recording video. I don't know why I'm so stumped with getting this to work. I think I'll add a debug UILabel showing the dB input level just to see if I'm getting any input in the first place.

LP0 ON FIRE fucked around with this message at 18:38 on Apr 6, 2018

LP0 ON FIRE
Jan 25, 2006

beep boop
edit: I should have posted this in the screen shots thread. Anyway, having design issues. See here: https://forums.somethingawful.com/showthread.php?threadid=2841382&pagenumber=195#post483012137

LP0 ON FIRE fucked around with this message at 18:15 on Apr 10, 2018

LP0 ON FIRE
Jan 25, 2006

beep boop
Today sucked. I attempted again to solve a problem I've been at for months, spent about 9 hours on it, and failed miserably: I cannot record my mic's input so that AVAssetWriter can capture it! I'm filtering my video capture by modifying the buffer in captureOutput and appending it to an AVAssetWriterInputPixelBufferAdaptor. How does captureOutput get a buffer and turn the microphone's input into something it can write to AVAssetWriter? How do I even know my mic is picking up audio? I cannot find any information about this!

https://stackoverflow.com/questions/49784587/how-to-record-audio-from-the-mic-while-appending-a-modified-samplebuffer-image-t

LP0 ON FIRE
Jan 25, 2006

beep boop
Good news is I finally got it to work :)

Not really in the way I wanted to, but it works fine. I use an additional AVAudioSession, AVAudioRecorder, and another NSURL instead of just AVAssetWriter, then merge the recordings together using AVMutableComposition. I really hope the sync will be OK!

Some really good references. One for recording audio (which magically defaults to your phone's microphone as an input):
https://www.hackingwithswift.com/example-code/media/how-to-record-audio-using-avaudiorecorder

And merging them both together:
https://stackoverflow.com/questions/31984474/swift-merge-audio-and-video-files-into-one-video
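The merge step, roughly, for anyone who lands here later. This is a sketch along the lines of that Stack Overflow answer, not my exact code; the function name, URLs, and error handling are placeholders:

```swift
import AVFoundation

// Drop the video track and the separately recorded audio track into one
// composition, then export the result.
func merge(videoURL: URL, audioURL: URL, outputURL: URL) {
    let composition = AVMutableComposition()
    let videoAsset = AVURLAsset(url: videoURL)
    let audioAsset = AVURLAsset(url: audioURL)

    guard
        let videoTrack = videoAsset.tracks(withMediaType: .video).first,
        let audioTrack = audioAsset.tracks(withMediaType: .audio).first,
        let compVideo = composition.addMutableTrack(withMediaType: .video,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid),
        let compAudio = composition.addMutableTrack(withMediaType: .audio,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid)
    else { return }

    let range = CMTimeRange(start: kCMTimeZero, duration: videoAsset.duration)
    try? compVideo.insertTimeRange(range, of: videoTrack, at: kCMTimeZero)
    // Sync depends on both recordings starting together; any offset shows up here.
    try? compAudio.insertTimeRange(range, of: audioTrack, at: kCMTimeZero)

    let exporter = AVAssetExportSession(asset: composition,
                                        presetName: AVAssetExportPresetHighestQuality)
    exporter?.outputURL = outputURL
    exporter?.outputFileType = .mp4
    exporter?.exportAsynchronously { /* check exporter?.status here */ }
}
```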

LP0 ON FIRE
Jan 25, 2006

beep boop
Got a problem here

https://www.youtube.com/watch?v=h8LVyCLlwXQ

My first view does not autorotate. My second view does. To make my second view controller autorotate, I do something similar to this answer: https://stackoverflow.com/questions...702941#41702941

The main issue is the CATransition to the new view controller. When it does that, the second view is already loaded, so everything rotates, including whatever it cached as an image in the first view.

I tried to compensate by rotating the first view just before the fade transition, and you can see in the video that it rotates the first view by the correct amount during the fade; I just need to translate the position. The problem is that the initial rotation is actually visible for a slight moment! Why?

code:
UIView.animate(withDuration: 0.3, delay: 0.0, options: UIViewAnimationOptions.curveEaseOut, animations: {
	self.view.transform = CGAffineTransform(scaleX: 0.75, y: 0.75)
}, completion: { (finished: Bool) in
	let group = DispatchGroup()
	group.enter()

	DispatchQueue.global(qos: .userInitiated).sync {
		// this is the rotation that can be seen for a fraction of a second
		self.view.transform = CGAffineTransform(rotationAngle: CGFloat(-90.0 * .pi / 180))
		self.view.setNeedsDisplay()
		group.leave()
	}

	group.notify(queue: .main) {
		let transition = CATransition()
		transition.duration = 1.5
		transition.type = kCATransitionFade
		transition.timingFunction = CAMediaTimingFunction(name: kCAMediaTimingFunctionEaseIn)
		self.view.window!.layer.add(transition, forKey: kCATransition)
		self.present(manageCaptureVC, animated: false, completion: nil)
	}
})

LP0 ON FIRE
Jan 25, 2006

beep boop

pokeyman posted:

idk if it’s the only issue but don’t touch views from a background thread. You’re (almost certainly?) doing so in the block dispatched to a global queue.

Good to know. That DispatchGroup was just there temporarily to try to get the rotation to happen and immediately play out the CATransition and present the new view controller. Before I had put the DispatchGroup functionality in there, the issue of seeing the first view rotate suddenly was still happening.

LP0 ON FIRE
Jan 25, 2006

beep boop
Does anyone use the Giphy API? I'm super confused about how I'd upload a gif to Giphy using their Swift API. It doesn't even demonstrate the capability on their GitHub (besides generating animated text).

They have an upload endpoint mentioned in their docs, but I don't have the slightest idea how that could relate to the API.

LP0 ON FIRE
Jan 25, 2006

beep boop

Doh004 posted:

Looks like that Swift SDK isn't super up to date and doesn't include the upload functionality.

You could probably write your own client that POSTs to the upload endpoint with your image data and your corresponding API keys - shouldn't be too hard to get up and running.

Yeah, I was thinking about doing that. Giphy wants a screenshot of my API implementation, and mixing JS with Swift just didn't seem proper, but I guess I'll need to do it that way. I shot them an email about it too, just in case.

LP0 ON FIRE
Jan 25, 2006

beep boop

Doh004 posted:

Why are you mixing JS with Swift?

Because the Giphy Swift SDK does not seem to have any upload functionality, and at the same time Giphy wants to see screenshots of my implementation, using their code. Only JavaScript upload functionality seems to be available.

pokeyman posted:

Yeah you’re overcomplicating this. Grab Alamofire (iOS doesn’t have multipart/form-data), play with its request serialization options, and aim at the giphy endpoint until it works.

I'll check it out, thanks. Ever since someone from Giphy responded last week saying they'd send it to the right department, they never got back to me.

I'm not sure if I should go through all the work only for them to reject it.
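For the record, a hand-rolled multipart/form-data POST along the lines pokeyman describes doesn't need Alamofire. This is a sketch; the endpoint URL and field names are my reading of Giphy's HTTP docs, so treat them as assumptions:

```swift
import Foundation

func uploadGIF(data: Data, apiKey: String) {
    let boundary = "Boundary-\(UUID().uuidString)"
    var request = URLRequest(url: URL(string: "https://upload.giphy.com/v1/gifs?api_key=\(apiKey)")!)
    request.httpMethod = "POST"
    request.setValue("multipart/form-data; boundary=\(boundary)",
                     forHTTPHeaderField: "Content-Type")

    // Build the multipart body by hand: one "file" part holding the GIF bytes.
    var body = Data()
    body.append("--\(boundary)\r\n".data(using: .utf8)!)
    body.append("Content-Disposition: form-data; name=\"file\"; filename=\"upload.gif\"\r\n".data(using: .utf8)!)
    body.append("Content-Type: image/gif\r\n\r\n".data(using: .utf8)!)
    body.append(data)
    body.append("\r\n--\(boundary)--\r\n".data(using: .utf8)!)
    request.httpBody = body

    URLSession.shared.dataTask(with: request) { data, response, error in
        // inspect the JSON response for the new GIF's id
    }.resume()
}
```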

LP0 ON FIRE
Jan 25, 2006

beep boop

Doh004 posted:

I take it this is for a job interview? If so, I know people would get a ton of extra points if they extended my SDK and added additional functionality to it. Maybe that's the point? :iiam:

Nope. I'm tempted, but at the same time they might not like what I did because it doesn't follow their documentation.

LP0 ON FIRE
Jan 25, 2006

beep boop
For performance and to modernize my code, I'm trying to scale an image with UIGraphicsImageRenderer the way I did with UIGraphicsBeginImageContext. I'm doing something wrong, since the resulting image is twice as large instead of half!

The old way:

code:
let sourceCore = ciImage
let scaledSourceImage: UIImage = UIImage(ciImage: ciImage)

let size = scaledSourceImage.size.applying(CGAffineTransform(scaleX: 0.5, y: 0.5))
let hasAlpha = false
let scale: CGFloat = 1.0 // use scale factor of main screen

UIGraphicsBeginImageContextWithOptions(size, !hasAlpha, scale)
scaledSourceImage.draw(in: CGRect(origin: .zero, size: size))
let scaledImage = UIGraphicsGetImageFromCurrentImageContext()!
UIGraphicsEndImageContext()
		
The new way, that makes a big image instead of half the size:

code:
let sourceCore = ciImage
let scaledSourceImage: UIImage = UIImage(ciImage: ciImage)

let size = scaledSourceImage.size.applying(CGAffineTransform(scaleX: 0.5, y: 0.5))
let hasAlpha = false

let renderFormat = UIGraphicsImageRendererFormat.default()
renderFormat.opaque = !hasAlpha
let renderer = UIGraphicsImageRenderer(size: size, format: renderFormat)
let scaledImage = renderer.image { (context) in
	scaledSourceImage.draw(in: CGRect(origin: .zero, size: size))
}

LP0 ON FIRE
Jan 25, 2006

beep boop

Dog on Fire posted:

I can’t try the code out on my own, but maybe the scale being 1.0 messes things up in the old function, and the image has been smaller than it should be?

What if you compare the images after setting the scale in the old function to this, instead of 1.0: https://developer.apple.com/documentation/uikit/uiscreen/1617836-scale

The old function works great; I’m trying to make the image smaller, though it’s a valid point that the scale variable name is misleading. It’s the new function that isn’t working correctly, and I can’t understand why.

LP0 ON FIRE fucked around with this message at 16:31 on Jun 14, 2018

LP0 ON FIRE
Jan 25, 2006

beep boop

ManicJason posted:

You probably want scale to be 0.0, which means use the main screen scale, or whatever your main screen scale is (2.0 for most devices). 1.0 is specifying non-Retina.

The newer code that I can't get working correctly that uses UIGraphicsImageRendererFormat does not use the scale variable. (Maybe I should somehow?)

LP0 ON FIRE
Jan 25, 2006

beep boop

ManicJason posted:

UIGraphicsImageRendererFormat should default to main screen scale. If your old code worked with 1.0, try setting the UIGraphicsImageRendererFormat scale to 1.0.

You mean in the CGAffineTransform's scaleX and y? That just makes the image even bigger. Besides that, I'm unaware of a scale parameter to send to UIGraphicsImageRendererFormat like UIGraphicsBeginImageContextWithOptions has.

LP0 ON FIRE
Jan 25, 2006

beep boop

ManicJason posted:

I literally mean the property called “scale” on UIGraphicsImageRendererFormat.

Oh! I did not know about that haha. Interesting
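For anyone following along, that fix is a one-liner. By default UIGraphicsImageRendererFormat picks up the main screen's scale (2x or 3x), which is why the output came out larger than the old 1.0-scale context. `size` here is a placeholder:

```swift
import UIKit

let size = CGSize(width: 320, height: 240) // placeholder target size
let renderFormat = UIGraphicsImageRendererFormat.default()
renderFormat.scale = 1.0 // match UIGraphicsBeginImageContextWithOptions(size, true, 1.0)
renderFormat.opaque = true
let renderer = UIGraphicsImageRenderer(size: size, format: renderFormat)
```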

Anyway, my performance went waaaay up by using CIFilters instead to scale (CILanczosScaleTransform) and place (CISourceAtopCompositing) smaller images onto a larger one. I was running it live through the camera view, so draw and UIGraphicsImageRenderer were attempted murder on my phone in comparison. Highly suggested.
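The Core Image route, sketched out. The filter and key names are real Core Image identifiers; the function shape and the 0.5 scale are my own example:

```swift
import CoreImage

// Shrink an image with Lanczos resampling, then composite it over a background.
func scaleAndComposite(_ image: CIImage, over background: CIImage) -> CIImage? {
    let scaleFilter = CIFilter(name: "CILanczosScaleTransform")
    scaleFilter?.setValue(image, forKey: kCIInputImageKey)
    scaleFilter?.setValue(0.5, forKey: kCIInputScaleKey)
    scaleFilter?.setValue(1.0, forKey: kCIInputAspectRatioKey)

    let composite = CIFilter(name: "CISourceAtopCompositing")
    composite?.setValue(scaleFilter?.outputImage, forKey: kCIInputImageKey)
    composite?.setValue(background, forKey: kCIInputBackgroundImageKey)
    return composite?.outputImage
}
```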

LP0 ON FIRE
Jan 25, 2006

beep boop
What’s wrong when my provisioning profile expires way too soon? In most cases it lasts about a week; sometimes only a couple of days before I need to build and verify my app again in Settings.

I just verified my app again in Settings today, which I thought meant it auto-renewed my provisioning profile, but here we are tonight and my app already refuses to open. Same pattern as usual, and I’ll need to go back into work and verify the app again.


LP0 ON FIRE
Jan 25, 2006

beep boop

Plorkyeran posted:

Are you using the free developer account? The provisioning profile for those only lasts a week.

That’s it! Thanks. I’ve been using my own account and need to switch.
