pokeyman
Nov 26, 2006

That elephant ate my entire platoon.
I made an HTML parser and pushed the deployment target back as far as I could without being annoying. I have no idea if anyone uses it that far back. The warnings popped up in Xcode 9.3.

CuddlyZombie
Nov 6, 2005

I wuv your brains.

ManicJason posted:

You're also leaking a UIActivityIndicatorView every time a cell is reused.

Thank you, good catch!

dc3k posted:

You're also using KVC methods on a dictionary (valueForKey instead of objectForKey) in one spot, but not the others.

For array indices you can use array[index] rather than [array objectAtIndex:index], and for dictionaries you can use dictionary[key] instead of [dictionary objectForKey:key] to make the code a bit nicer to read and ~ * more modern * ~

I know you didn't ask for code review, but if "coding challenge" = "job interview take-home" this might help a bit.

Thank you, and you hit the nail on the head! :blush:

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.

CuddlyZombie posted:

Thank you, and you hit the nail on the head! :blush:

I’m not sure I’d look favourably on a candidate who was passing their coding exercise around the local forums for feedback and improvements. I kinda suspected that was what you were doing and was intentionally coy in my response, but it’s probably best to be up front about that in the future.

But I also don’t know your situation and there’s a pretty slim chance your prospective employer will find out if they even care, so what do I know?

LP0 ON FIRE
Jan 25, 2006

beep boop
edit: I should have posted this in the screen shots thread. Anyway, I'm having design issues. See here: https://forums.somethingawful.com/showthread.php?threadid=2841382&pagenumber=195#post483012137

LP0 ON FIRE fucked around with this message at 18:15 on Apr 10, 2018

LP0 ON FIRE
Jan 25, 2006

beep boop
Today sucked. I took another crack at a problem I've been stuck on for months, spent about 9 hours on it, and failed miserably: I cannot record my mic's input so that AVAssetWriter can capture it. I'm filtering my video capture by modifying the buffer in captureOutput and appending it to an AVAssetWriterInputPixelBufferAdaptor. How does captureOutput get a buffer from the microphone's input and turn it into something that can be written to AVAssetWriter? How do I even know my mic is picking up audio? I can't find any information about this!

https://stackoverflow.com/questions/49784587/how-to-record-audio-from-the-mic-while-appending-a-modified-samplebuffer-image-t
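For reference, the approach I'm trying to get working looks roughly like this (a rough sketch pieced together from the AVFoundation docs, not working code; the capture session, the asset writer, and the start-time handling are whatever already exists for the video side): add an AVCaptureAudioDataOutput to the same capture session and append its sample buffers to a second AVAssetWriterInput.
code:
import AVFoundation

// Sketch: feed the microphone into the existing capture session and append
// its sample buffers to an audio AVAssetWriterInput, alongside the pixel
// buffer adaptor already used for video.
final class AudioWriterSketch: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {

    let audioOutput = AVCaptureAudioDataOutput()
    let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVNumberOfChannelsKey: 1,
        AVSampleRateKey: 44_100
    ])

    func attach(to captureSession: AVCaptureSession, writer: AVAssetWriter) throws {
        // Microphone -> session.
        guard let mic = AVCaptureDevice.default(for: .audio) else { return }
        let micInput = try AVCaptureDeviceInput(device: mic)
        if captureSession.canAddInput(micInput) { captureSession.addInput(micInput) }

        // Session -> this delegate, on a serial queue.
        audioOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "audio.buffers"))
        if captureSession.canAddOutput(audioOutput) { captureSession.addOutput(audioOutput) }

        // Register the audio track with the same writer the video uses.
        audioInput.expectsMediaDataInRealTime = true
        if writer.canAdd(audioInput) { writer.add(audioInput) }
    }

    // Same delegate method shape as the video case; check which output called it.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard output == audioOutput, audioInput.isReadyForMoreMediaData else { return }
        if !audioInput.append(sampleBuffer) {
            print("audio append failed")
        }
    }
}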

LP0 ON FIRE
Jan 25, 2006

beep boop
Good news is I finally got it to work :)

Not really in the way I wanted to, but it works fine. Instead of just AVAssetWriter, I use an additional AVAudioSession, an AVAudioRecorder, and another NSURL, then merge the results together using AVMutableComposition. I really hope the sync will be OK!

Some really good references. One for recording audio (which magically defaults to your phone's microphone as an input):
https://www.hackingwithswift.com/example-code/media/how-to-record-audio-using-avaudiorecorder

And merging them both together:
https://stackoverflow.com/questions/31984474/swift-merge-audio-and-video-files-into-one-video
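In case it helps anyone else, the merge step looks roughly like this (a sketch; the function and parameter names are made up, and it assumes the audio and video recordings started at the same moment, which is exactly where the sync worry comes from):
code:
import AVFoundation

// Sketch: combine the video file written by AVAssetWriter with the audio
// file written by AVAudioRecorder into one movie via AVMutableComposition.
func merge(videoURL: URL, audioURL: URL, outputURL: URL,
           completion: @escaping (Error?) -> Void) {
    let composition = AVMutableComposition()
    let videoAsset = AVAsset(url: videoURL)
    let audioAsset = AVAsset(url: audioURL)

    do {
        if let videoTrack = videoAsset.tracks(withMediaType: .video).first,
           let compVideo = composition.addMutableTrack(withMediaType: .video,
                                                       preferredTrackID: kCMPersistentTrackID_Invalid) {
            try compVideo.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration),
                                          of: videoTrack, at: kCMTimeZero)
        }
        if let audioTrack = audioAsset.tracks(withMediaType: .audio).first,
           let compAudio = composition.addMutableTrack(withMediaType: .audio,
                                                       preferredTrackID: kCMPersistentTrackID_Invalid) {
            // Both tracks start at time zero; any offset between the two
            // recordings shows up directly as lip-sync error.
            try compAudio.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration),
                                          of: audioTrack, at: kCMTimeZero)
        }
    } catch {
        completion(error)
        return
    }

    guard let exporter = AVAssetExportSession(asset: composition,
                                              presetName: AVAssetExportPresetHighestQuality) else {
        completion(nil) // couldn't create an export session
        return
    }
    exporter.outputURL = outputURL
    exporter.outputFileType = .mov
    exporter.exportAsynchronously {
        completion(exporter.error)
    }
}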

Kobayashi
Aug 13, 2004

by Nyc_Tattoo
This is almost certainly the wrong thread as I'm approaching this from a JavaScript perspective, but it's about Apple's ecosystem. Specifically HomePod. In short, I'm trying to programmatically control its volume.

As of recently, any iOS device or iTunes instance can control any other AirPlay device that is currently playing. When an Apple TV is connected to HomePod, its physical remote controls the volume. I think this is all happening over AirPlay. I managed to hack a fork of node-airtunes, but it disconnects all other devices. What I really want to do is inject commands like "volume up" or "pause," not take over playback altogether.

Where I'm stuck is, I don't know if Apple is using some internal version of AirPlay 2 to enable this cooperative behavior, or if I need to dive deep into the weeds of UDP requests and the AirPlay protocol. More broadly, I'm not sure where to look for more information. I've just been piecing together information from various GitHub threads. Any ideas?

Doctor w-rw-rw-
Jun 24, 2008
There's no such thing as an 'external' version of AirPlay 2, though. AirTunes is the old name of AirPlay, so AirTunes v2 isn't AirPlay 2; that library looks pretty ancient.

As far as I can tell, AirPlay 2 is covered by MFi, which means it's subject to NDA, and very likely also to the MFi chip that secures and encrypts hardware interactions. If it is, you're boned; if not, you'll probably still have a difficult time figuring out how iTunes does it. iTunes *does* have a remote API for the Remote app to use (but that means you're dependent on iTunes), but if it's doing something special over Bluetooth (and it's Apple, so it probably is) you probably won't be able to get what you want.

Or maybe there's some simple media control standard they're using in a clever way and it's all easy to get what you want, but I think the likelihood of that is super low.

Neco
Mar 13, 2005

listen
This is a question I think I already know the answer to, but I want to be certain:

While signing a framework is technically possible, it is NOT possible to enable App Groups and thus enable sharing pasteboards between an app and another app (signed with a different team ID) which integrates that framework, correct?

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.
Your framework gets re-signed when being embedded in another app, doesn’t it? So it’s a question of whether two apps with different team IDs can share a pasteboard. (Which I don’t know the answer to.)

Doctor w-rw-rw-
Jun 24, 2008
I believe Apple doesn’t allow developer teams to generate keychain access groups such that they can be shared outside the developer team.

SaTaMaS
Apr 18, 2003
As a former Flash developer and now Swift/Obj-C developer, the downward trend on this chart is giving me panic attacks
https://trends.google.com/trends/explore?date=today%205-y&q=%2Fm%2F010sd4y3

How worried should I be? I'm sure there will be Swift jobs out there as long as there are iPhones, but lately more and more companies have been switching to React Native/NativeScript even though JavaScript is a garbage language for app development.

Doh004
Apr 22, 2007

Mmmmm Donuts...
Looks pretty flat to me? If you're talking about the recent dip, then maybe it's a lull between now and WWDC?

SaTaMaS
Apr 18, 2003

Doh004 posted:

Looks pretty flat to me? If you're talking about the recent dip, then maybe it's a lull between now and WWDC?

It's not dropping that much, but if Swift peaked this fast it's still alarming

brap
Aug 23, 2004

Grimey Drawer
You’re going to be fine.

eschaton
Mar 7, 2007

Don't you just hate when you wind up in a store with people who are in a socioeconomic class that is pretty obviously about two levels lower than your own?
No, see, this time cross-platform tools might actually deliver on their promises, unlike the past several decades!

Kobayashi
Aug 13, 2004

by Nyc_Tattoo
The future is lovely Electron apps.

FAT32 SHAMER
Aug 16, 2012



Hey guys, really weird issue I'm having. I wrote a bunch of XCUI tests for a client's app after re-signing it, then sent it off to them with instructions on how to re-sign. It compiles and runs on their machine just fine; however, when they attempt to run the XCUI tests, every one fails with Thread 1: EXC_BAD_ACCESS (code=1, address=0x0) when attempting to enter text into the first UI element it's supposed to find. All of the tests run beautifully on my machine, so I'm trying to figure out what's going on with theirs. I have verified that they re-signed everything correctly, so now I can't help but wonder if something is going on with their install of Xcode or something?

Any help or pointers would be great, thanks!

edit: i found the issue: https://github.com/lionheart/openradar-mirror/issues/19677

switched to an 11.3 simulator device and it works now

FAT32 SHAMER fucked around with this message at 19:02 on Apr 27, 2018

soundsection
May 10, 2010
When I take a screenshot on my iPod in iOS 11, it brings up this neat editor thing. Does anybody know a) whether that's a native UIKit library or something (I'm not having any luck with Google searches), and b) whether it's possible to throw custom data at it instead of it grabbing the contents of the screen?

FAT32 SHAMER
Aug 16, 2012



soundsection posted:

When I take a screenshot on my iPod in iOS 11, it brings up this neat editor thing. Does anybody know a) whether that's a native UIKit library or something (I'm not having any luck with Google searches), and b) whether it's possible to throw custom data at it instead of it grabbing the contents of the screen?

It’s built in so my guess is you can’t change anything given to it

LP0 ON FIRE
Jan 25, 2006

beep boop
Got a problem here

https://www.youtube.com/watch?v=h8LVyCLlwXQ

My first view does not autorotate. My second view does. To make my second view controller autorotate, I do something similar to this answer: https://stackoverflow.com/questions...702941#41702941

The main issue is the CATransition to the new view controller. When it does that, the second view is already loaded, so everything rotates, including whatever it cached as an image in the first view.

I tried to compensate for that by rotating the first view just before the fade transition, and you can see in the video that this rotates the first view in the fade by the correct amount (I just need to translate the position). But the problem is that the initial rotation is actually visible for a brief moment! Why?

code:
UIView.animate(withDuration: 0.3, delay: 0.0, options: UIViewAnimationOptions.curveEaseOut, animations: {
	self.view.transform = CGAffineTransform(scaleX: 0.75, y: 0.75)
}, completion: { (finished: Bool) in
	
	
	let group = DispatchGroup()
	group.enter()
	
	DispatchQueue.global(qos: .userInitiated).sync {
		// this is the rotation that can be seen for a fraction of a second
		self.view.transform = CGAffineTransform(rotationAngle: CGFloat(-90.0 * .pi / 180))
		self.view.setNeedsDisplay()
		group.leave()
	}
	
	group.notify(queue: .main) {		
		let transition = CATransition()
		transition.duration = 1.5
		transition.type = kCATransitionFade
		transition.timingFunction = CAMediaTimingFunction(name:kCAMediaTimingFunctionEaseIn)
		self.view.window!.layer.add(transition, forKey: kCATransition)
		self.present(manageCaptureVC, animated: false, completion: nil)
	}


})

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.
idk if it’s the only issue but don’t touch views from a background thread. You’re (almost certainly?) doing so in the block dispatched to a global queue.
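If the transform really does need to happen right before the transition, something like this (untested, just your snippet rearranged) keeps everything on the main queue and commits the rotation and the presentation in the same pass, which might also get rid of the one-frame flash:
code:
UIView.animate(withDuration: 0.3, delay: 0.0, options: .curveEaseOut, animations: {
	self.view.transform = CGAffineTransform(scaleX: 0.75, y: 0.75)
}, completion: { _ in
	// Still on the main queue here. Apply the rotation and present in the
	// same pass so the rotated-but-not-yet-faded state never gets drawn
	// on its own.
	self.view.transform = CGAffineTransform(rotationAngle: CGFloat(-90.0 * .pi / 180))

	let transition = CATransition()
	transition.duration = 1.5
	transition.type = kCATransitionFade
	transition.timingFunction = CAMediaTimingFunction(name: kCAMediaTimingFunctionEaseIn)
	self.view.window!.layer.add(transition, forKey: kCATransition)
	self.present(manageCaptureVC, animated: false, completion: nil)
})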

LP0 ON FIRE
Jan 25, 2006

beep boop

pokeyman posted:

idk if it’s the only issue but don’t touch views from a background thread. You’re (almost certainly?) doing so in the block dispatched to a global queue.

Good to know. That DispatchGroup was only there temporarily, to try to make the rotation happen and then immediately play the CATransition and present the new view controller. Before I added the DispatchGroup, the issue of the first view visibly snapping to the rotation was still happening.

Froist
Jun 6, 2004

At work we've just recently started using Apple TestFlight on a small scale (one beta released so far, to around 10 external users) - we shipped the first beta build around a month ago. I've just prepared a second beta build and uploaded it to iTunes Connect, intending to release it to our "internal" group (the personal accounts of 4 staff members, including myself) over the weekend, then add it to the external group next week once we've done some sanity testing.

As soon as the build finished processing, I had fear struck into me by a push notification and an email from TestFlight informing me that a new build was available. At this point I hadn't enabled the build for either group, simply uploaded the binary through Application Loader. On double-checking, it turns out both these notifications specified the month-old build (which I already had installed), and the new build isn't available in TestFlight on devices yet.

While it's a relief that we haven't shipped anything to external testers prematurely, it still seems bizarre for the system to send a notification at that point. I can only presume the external testers received it too.

Has anyone else experienced this? Is there a chance it's just temporary bugginess on Apple's side, or is this the norm? We were really thinking about embracing this a bit more and sending different betas to different groups, but if all testers get notified whenever we upload a new build (even if they're not invited to it) that idea's kinda shot in the foot.

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.
I don’t think external testers get a notification until you release the build to external testers.

Internal testers can use any build that’s been processed, so that’s when the notification goes out to those users. I’m not aware of a way to prevent internal testers from having access to builds like this, but maybe there’s a way.

The only explanation I can think of for the new notification including the old release notes is because you haven’t specified any release notes yet for the new build, probably because iTunes Connect doesn’t ask for notes until you release to external testers.

Froist
Jun 6, 2004

The thing is, for all intents and purposes I am an external tester. It came to my Gmail account that is in no way associated with the company developer account. Same with a colleague - we set our personal devices up this way so we could test the same way as external testers will, just in a different group. We don’t actually have any internal testers configured in that respect.

I also noticed later, once the panic had subsided, that I hadn't cleared the export control encryption check at that point, so it wasn't even available to internal testers (of which we have zero) anyway.

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.
Ah. Well either something fucky happened or I don’t understand TestFlight as well as I thought! I’m sure the latter is true, can’t really say about the former.

blorpy
Jan 5, 2005

Hello thread. I've got a question about the dual speaker setup on iPhone 7 and newer. I'm curious if they show up as separate outputs.

Could someone run this on an actual device **in portrait mode** and report what it logs? I feel lovely asking but I don't have one of these phones myself. I would be willing to venmo/square cash you a tenner for your troubles :)

https://gist.github.com/rfistman/1c63315d6634112eac8b1f7dc9dffe64

To clarify, I'd like to know which device you ran it on and what it reports both in portrait and landscape mode.

Froist
Jun 6, 2004

iPhone X Portrait:
code:
2018-05-08 11:35:22.586251+0100 AudioTest[1590:993293] -----
2018-05-08 11:35:22.586337+0100 AudioTest[1590:993293] UID Speaker
2018-05-08 11:35:22.586367+0100 AudioTest[1590:993293] portName Speaker
2018-05-08 11:35:22.586389+0100 AudioTest[1590:993293] portType Speaker
2018-05-08 11:35:22.586516+0100 AudioTest[1590:993293] channels (
    "<AVAudioSessionChannelDescription: 0x1d001c3d0, name = Speaker 1; label = 4294967295 (0xffffffff); number = 1; port UID = Speaker>",
    "<AVAudioSessionChannelDescription: 0x1d001c340, name = Speaker 2; label = 4294967295 (0xffffffff); number = 2; port UID = Speaker>"
)
2018-05-08 11:35:22.586534+0100 AudioTest[1590:993293] dataSources (null)
2018-05-08 11:35:22.586549+0100 AudioTest[1590:993293] selectedDataSource (null)
2018-05-08 11:35:22.587405+0100 AudioTest[1590:993293] preferredDataSource (null)
iPhone X Landscape:
code:
2018-05-08 11:35:28.929764+0100 AudioTest[1590:993293] -----
2018-05-08 11:35:28.929945+0100 AudioTest[1590:993293] UID Speaker
2018-05-08 11:35:28.930036+0100 AudioTest[1590:993293] portName Speaker
2018-05-08 11:35:28.930118+0100 AudioTest[1590:993293] portType Speaker
2018-05-08 11:35:28.930411+0100 AudioTest[1590:993293] channels (
    "<AVAudioSessionChannelDescription: 0x1c001bd90, name = Speaker 1; label = 4294967295 (0xffffffff); number = 1; port UID = Speaker>",
    "<AVAudioSessionChannelDescription: 0x1c001bda0, name = Speaker 2; label = 4294967295 (0xffffffff); number = 2; port UID = Speaker>"
)
2018-05-08 11:35:28.930501+0100 AudioTest[1590:993293] dataSources (null)
2018-05-08 11:35:28.930554+0100 AudioTest[1590:993293] selectedDataSource (null)
2018-05-08 11:35:28.931099+0100 AudioTest[1590:993293] preferredDataSource (null)
.. So I guess the answer you're looking for is "single output with multiple channels" - when I run it on a 6S I only get one channel listed.

Froist fucked around with this message at 11:40 on May 8, 2018

blorpy
Jan 5, 2005

Froist posted:

iPhone X Portrait:
code:
2018-05-08 11:35:22.586251+0100 AudioTest[1590:993293] -----
2018-05-08 11:35:22.586337+0100 AudioTest[1590:993293] UID Speaker
2018-05-08 11:35:22.586367+0100 AudioTest[1590:993293] portName Speaker
2018-05-08 11:35:22.586389+0100 AudioTest[1590:993293] portType Speaker
2018-05-08 11:35:22.586516+0100 AudioTest[1590:993293] channels (
    "<AVAudioSessionChannelDescription: 0x1d001c3d0, name = Speaker 1; label = 4294967295 (0xffffffff); number = 1; port UID = Speaker>",
    "<AVAudioSessionChannelDescription: 0x1d001c340, name = Speaker 2; label = 4294967295 (0xffffffff); number = 2; port UID = Speaker>"
)
2018-05-08 11:35:22.586534+0100 AudioTest[1590:993293] dataSources (null)
2018-05-08 11:35:22.586549+0100 AudioTest[1590:993293] selectedDataSource (null)
2018-05-08 11:35:22.587405+0100 AudioTest[1590:993293] preferredDataSource (null)
iPhone X Landscape:
code:
2018-05-08 11:35:28.929764+0100 AudioTest[1590:993293] -----
2018-05-08 11:35:28.929945+0100 AudioTest[1590:993293] UID Speaker
2018-05-08 11:35:28.930036+0100 AudioTest[1590:993293] portName Speaker
2018-05-08 11:35:28.930118+0100 AudioTest[1590:993293] portType Speaker
2018-05-08 11:35:28.930411+0100 AudioTest[1590:993293] channels (
    "<AVAudioSessionChannelDescription: 0x1c001bd90, name = Speaker 1; label = 4294967295 (0xffffffff); number = 1; port UID = Speaker>",
    "<AVAudioSessionChannelDescription: 0x1c001bda0, name = Speaker 2; label = 4294967295 (0xffffffff); number = 2; port UID = Speaker>"
)
2018-05-08 11:35:28.930501+0100 AudioTest[1590:993293] dataSources (null)
2018-05-08 11:35:28.930554+0100 AudioTest[1590:993293] selectedDataSource (null)
2018-05-08 11:35:28.931099+0100 AudioTest[1590:993293] preferredDataSource (null)
.. So I guess the answer you're looking for is "single output with multiple channels" - when I run it on a 6S I only get one channel listed.

Awesome, thank you!!

This makes me wonder if setting an AVAudioPlayer's channelAssignments will control playback to the separate speakers. I might come up with one more basic test to check this idea.
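Something like this is what I have in mind (a sketch; "tone" is a placeholder mono sound file, and whether the assignment actually routes to one speaker is exactly what I want to find out):
code:
import AVFoundation

// Sketch: play a mono file and pin it to one hardware channel of the
// built-in speaker port via AVAudioPlayer's channelAssignments.
func playTone(onSpeakerChannel index: Int) throws -> AVAudioPlayer {
    let url = Bundle.main.url(forResource: "tone", withExtension: "caf")!
    let player = try AVAudioPlayer(contentsOf: url)

    let outputs = AVAudioSession.sharedInstance().currentRoute.outputs
    if let speaker = outputs.first,
       let channels = speaker.channels,
       index < channels.count {
        // One assignment per player channel; a mono file needs one entry.
        player.channelAssignments = [channels[index]]
    }

    player.play()
    return player
}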

blorpy
Jan 5, 2005

Well, I put together an example that uses kAudioQueueProperty_ChannelAssignments. Basically it should make it easy to test where the sound comes out for each of the ways you can try to control that (writing to only one stereo channel, panning, and channel assignments).

TableViewController.m
code:
#import "TableViewController.h"

#import <AVFoundation/AVFoundation.h>

@interface TableViewController()
@property NSArray *cells;
@property NSInteger channels;
@property NSString *uid;
@property bool started;
@property int playSide;
@property float phase;
@end

static const unsigned int bufferLength = 4096;
static const float freq = 2 * M_PI * (440.f / 44100.f);
static const float amp = 0.1f;

static void output_callback(void *user_data, AudioQueueRef queue, AudioQueueBufferRef buffer)
{
  float *stereo = (float *)buffer->mAudioData;
  TableViewController *controller = (__bridge id)user_data;
  memset(stereo, 0, buffer->mAudioDataBytesCapacity);
  const int playSide = [controller playSide];
  float phase = [controller phase];
  for (unsigned int i = 0; i < bufferLength; i++) {
    if (playSide == -1 || playSide == 0) {
      stereo[2 * i] = amp * sinf(phase);
    }
    if (playSide == -1 || playSide == 1) {
      stereo[(2 * i) + 1] = amp * sinf(phase);
    }
    phase += freq;
    if (phase > 2 * M_PI) {
      phase -= 2 * M_PI;
    }
  }
  [controller setPhase:phase];
  buffer->mAudioDataByteSize = buffer->mAudioDataBytesCapacity;
  AudioQueueEnqueueBuffer(queue, buffer, 0, NULL);
}

@implementation TableViewController {
  AudioQueueRef outputQueue;
  AudioQueueBufferRef *outputBuffers;
}

- (void)loadView
{
  [super loadView];
  
  self.title = @"Speaker Test";
  
  NSMutableArray *labels = [NSMutableArray arrayWithObjects:@"Stereo, Left Only", @"Stereo, Right Only", @"Pan Left", @"Pan Right", nil];
  
  NSArray *outputs = AVAudioSession.sharedInstance.currentRoute.outputs;
  self.channels = 0;
  if ([outputs count] > 0) {
    AVAudioSessionPortDescription *port = outputs[0];
    NSArray *channels = port.channels;
    for (int j = 0; j < [channels count]; ++j) {
      AVAudioSessionChannelDescription *channel = channels[j];
      [labels addObject:[NSString stringWithFormat:@"%@/%@", [port portName], [channel channelName]]];
    }
    self.channels = [channels count];
    self.uid = [channels[0] owningPortUID];
  }
  
  NSMutableArray *mutCells = [NSMutableArray arrayWithCapacity:16];
  
  for (int i = 0; i < [labels count]; ++i) {
    UITableViewCell *cell = [[UITableViewCell alloc] init];
    UILabel *label = [[UILabel alloc] initWithFrame:CGRectInset(cell.bounds, 15, 0)];
    label.text = labels[i];
    [cell addSubview:label];
    [mutCells addObject:cell];
  }
  
  self.cells = mutCells;

  self.started = false;
  self.playSide = -1;
}

- (NSInteger)numberOfSectionsInTableView:(UITableView *)tableView
{
  return 1;
}

- (NSInteger)tableView:(UITableView *)tableView numberOfRowsInSection:(NSInteger)section
{
  if (section == 0) {
    return [self.cells count];
  }
  return 0;
}

- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath
{
  if (indexPath.section == 0) {
    return self.cells[indexPath.row];
  }
  return nil;
}

- (NSString *)tableView:(UITableView *)tableView titleForHeaderInSection:(NSInteger)section
{
  return nil;
}

- (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath
{
  if (indexPath.section != 0) {
    return;
  }
  
  if (self.started) {
    AudioQueueReset(outputQueue);
    AudioQueueDispose(outputQueue, true);
  }

  AudioStreamBasicDescription format;
  format.mSampleRate = 44100.f;
  format.mFormatID = kAudioFormatLinearPCM;
  format.mFormatFlags = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked;
  format.mBitsPerChannel = sizeof(float) * 8;
  format.mChannelsPerFrame = 2;
  format.mBytesPerFrame = sizeof(float) * format.mChannelsPerFrame;
  format.mFramesPerPacket = 1;
  format.mBytesPerPacket = format.mBytesPerFrame * format.mFramesPerPacket;
  format.mReserved = 0;
  
  AudioQueueNewOutput(&format, output_callback, (__bridge void *_Nullable)(self), NULL, NULL, 0, &outputQueue);
  
  if (indexPath.row == 0) {
    self.playSide = 0;
  } else if (indexPath.row == 1) {
    self.playSide = 1;
  } else {
    self.playSide = -1;
  }
  
  if (indexPath.row == 2) {
    AudioQueueSetParameter(outputQueue, kAudioQueueParam_Pan, -1);
  } else if (indexPath.row == 3) {
    AudioQueueSetParameter(outputQueue, kAudioQueueParam_Pan, 1);
  } else {
    AudioQueueSetParameter(outputQueue, kAudioQueueParam_Pan, 0);
  }
  
  if (indexPath.row > 3 && indexPath.row < (4 + self.channels)) {
    AudioQueueChannelAssignment assignments[2];
    assignments[0].mChannelNumber = (unsigned int)(indexPath.row - 3);
    assignments[0].mDeviceUID = (__bridge CFStringRef)self.uid;
    assignments[1].mChannelNumber = (unsigned int)(indexPath.row - 3);
    assignments[1].mDeviceUID = (__bridge CFStringRef)self.uid;
    AudioQueueSetProperty(outputQueue, kAudioQueueProperty_ChannelAssignments, assignments, sizeof(assignments));
  }

  outputBuffers = malloc(3 * sizeof(AudioQueueBufferRef));
  
  for (unsigned int i = 0; i < 3; i++) {
    AudioQueueAllocateBuffer(outputQueue, bufferLength * 2 * sizeof(float), &outputBuffers[i]);
    outputBuffers[i]->mAudioDataByteSize = bufferLength * 2 * sizeof(float);
    output_callback((__bridge void*)(self), outputQueue, outputBuffers[i]);
  }
  
  AudioQueueStart(outputQueue, NULL);
  
  self.started = true;
}

@end

TableViewController.h
code:
#import <UIKit/UIKit.h>

@interface TableViewController : UITableViewController

@end
AppDelegate.m
code:
#import "AppDelegate.h"
#import "TableViewController.h"

#import <AVFoundation/AVFoundation.h>

@interface AppDelegate ()

@end

@implementation AppDelegate


- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
  NSError *error;
  if (![[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryMultiRoute error: &error]) {
    NSLog(@"setCategory: %@", error);
  }
  
  if (![[AVAudioSession sharedInstance] setActive: YES error: &error]) {
    NSLog(@"setActive: %@", error);
  }

  TableViewController *controller = [[TableViewController alloc] initWithStyle:UITableViewStylePlain];
  UINavigationController *nav = [[UINavigationController alloc] initWithRootViewController:controller];
  
  self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
  self.window.rootViewController = nav;
  [self.window makeKeyAndVisible];
  
  return YES;
}

@end
AppDelegate.h
code:
#import <UIKit/UIKit.h>

@interface AppDelegate : UIResponder <UIApplicationDelegate>

@property (strong, nonatomic) UIWindow *window;


@end
If anyone could run this on an iPhone 7 or newer, I'd be super grateful. I've set up various test cases to try to control which speaker the sound comes out of when the phone's in portrait and landscape modes. My suspicion is that panning left/right will work in landscape but not in portrait, but I may be wrong. I have good hopes for the channel mapping working in portrait mode, though. It might be useful to put your finger over each speaker to muffle it, since I'm not sure how easy it is to localize the audio with just your ears :)

blorpy fucked around with this message at 09:25 on May 11, 2018

LP0 ON FIRE
Jan 25, 2006

beep boop
Does anyone use the Giphy API? I'm super confused about how I'd upload a GIF to Giphy using their Swift API. It doesn't even demonstrate that capability on their GitHub (besides generating animated text).

They have an upload endpoint mentioned in their docs, but I don't have the slightest idea how that could relate to the API.

Doh004
Apr 22, 2007

Mmmmm Donuts...
Looks like that Swift SDK isn't super up to date and doesn't include the upload functionality.

You could probably write your own client that POSTs to the upload endpoint with your image data and your corresponding API keys - shouldn't be too hard to get up and running.

Doh004 fucked around with this message at 03:41 on May 11, 2018

LP0 ON FIRE
Jan 25, 2006

beep boop

Doh004 posted:

Looks like that Swift SDK isn't super up to date and doesn't include the upload functionality.

You could probably write your own client that POSTs to the upload endpoint with your image data and your corresponding API keys - shouldn't be too hard to get up and running.

Yeah, I was thinking about doing that. Giphy wants a screenshot of my API implementation, and mixing JS with Swift just didn't seem proper, but I guess I'll need to do it that way. I shot them an email about it too, just in case.

Doh004
Apr 22, 2007

Mmmmm Donuts...

LP0 ON FIRE posted:

Yeah, I was thinking about doing that. Giphy wants a screenshot of my API implementation, and mixing JS with Swift just didn't seem proper, but I guess I'll need to do it that way. I shot them an email about it too, just in case.

Why are you mixing JS with Swift?

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.
Yeah, you're overcomplicating this. Grab Alamofire (iOS doesn't have built-in multipart/form-data support), play with its request serialization options, and aim it at the Giphy upload endpoint until it works.
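Something in this ballpark (Alamofire 4 syntax; I'm going from memory on Giphy's upload endpoint and parameter names, so double-check "api_key" and "file" against their docs):
code:
import Alamofire

// Sketch: multipart upload of a GIF to what I believe is Giphy's upload
// endpoint. Verify the URL and field names against the Giphy docs.
func uploadGIF(_ gifData: Data, apiKey: String) {
    Alamofire.upload(
        multipartFormData: { form in
            form.append(apiKey.data(using: .utf8)!, withName: "api_key")
            form.append(gifData, withName: "file", fileName: "upload.gif", mimeType: "image/gif")
        },
        to: "https://upload.giphy.com/v1/gifs",
        encodingCompletion: { result in
            switch result {
            case .success(let upload, _, _):
                upload.responseJSON { response in
                    print(response.result.value ?? "no response")
                }
            case .failure(let error):
                print(error)
            }
        }
    )
}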

ManicJason
Oct 27, 2003

He doesn't really stop the puck, but he scares the hell out of the other team.
Has anyone ever shared my current fresh hell of using app extensions with CocoaPods?

My company has an SDK pod that holds most of the communication/data code and a Swift app that uses it. I ran into a few places where the SDK used [UIApplication sharedApplication], which will go boom if compiled in an extension target.

I found an article that described using an extension subspec to set a compiler flag and allow conditional compilation within the pod. Cool.

In messing around with that, I found that simply having an extension subspec that fully inherits from the core/default subspec and using that in the Podfile for the extension target was enough to get rid of the errors about using [UIApplication sharedApplication]. I'm guessing that having a different subspec caused CocoaPods to stop deduping the SDK pod even though the subspecs are identical otherwise, but I really don't know. I'm a bit scared, and I still expect the sharedApplication call to happen and blow up with this approach.


I'm still very early down the road of developing a today extension, so I may quickly learn that including that whole SDK in the extension target is a bad approach anyway and instead opt to write a tiny amount of data to a shared container or something.

Kallikrates
Jul 7, 2002
Pro Lurker
Can't help you with the CocoaPods stuff; we distribute internal modules via a Carthage-like tool. But maybe, instead of relying on the outside caller to handle availability, use checks inside the framework around the disallowed calls?

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.
My first thought is: can you split the SDK into two subspecs, one that’s purely extension-safe API usage and another that depends on the first and augments with the extension-disallowed API usage? I think, but am not sure, that CocoaPods will include the first subspec in both and so it won’t include duplicate symbols.

It’s also possible you’ll have to make two different (non-sub) specs, I forget exactly how CocoaPods works here.
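Roughly what I mean on the podspec side (a sketch with made-up names; the extension target's Podfile entry would then be pod 'MySDK/Core' and the app target's pod 'MySDK/App'):
code:
Pod::Spec.new do |s|
  s.name    = 'MySDK'
  s.version = '1.0.0'
  # summary, license, source, platform, etc. omitted

  s.default_subspec = 'App'

  # Everything that's safe to call from an app extension.
  s.subspec 'Core' do |core|
    core.source_files = 'Sources/Core/**/*.{h,m,swift}'
  end

  # App-only code that uses [UIApplication sharedApplication] and friends.
  s.subspec 'App' do |app|
    app.dependency 'MySDK/Core'
    app.source_files = 'Sources/App/**/*.{h,m,swift}'
  end
end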

Dog on Fire
Oct 2, 2004

I’d use this topic of frameworks to make a remarkably clumsy segue to my own question, albeit actually about libraries: are there any limitations on the iOS SDK version that was used to build a library that is used in a project?

To be more specific, we have a library that was built using Xcode 8.3.3. Will we have to rebuild it using Xcode 9 at some point? If so, will we have to do it when the iOS 11 SDK becomes required in July?
