ManicJason
Oct 27, 2003

He doesn't really stop the puck, but he scares the hell out of the other team.

LP0 ON FIRE posted:

why didn't they just make it Swift, or have the language know to do whatever it needs to do without adding "@objc"? Maybe there's not a good reason..
That was the case before Swift 4. Here is the reasoning for the change.

LP0 ON FIRE
Jan 25, 2006

beep boop

ManicJason posted:

That was the case before Swift 4. Here is the reasoning for the change.


Axiem posted:

Things with @objc are visible to Objective-C; things without it aren't. This has effects in how it's compiled under the hood in ways that people more knowledgeable than I can explain, but it basically comes down to "avoiding @objc will make more performant code".


ManicJason posted:

That was the case before Swift 4. Here is the reasoning for the change.

Thanks. Makes me wonder how much longer that will need to be used so often, and it makes me realize that Swift is still very much in its infancy. Still, Swift 3 to 4 has some drastic changes, and pretty much everything I look up has deprecations all over it.

As an unimportant, irrelevant side note, right now I'm struggling with optionals on AVCaptureSession (especially with canAddInput) and with what delegate to set on capturePhoto on AVCapturePhotoOutput. I might make a post about it later, but I think I may just need to understand all this stuff better first.

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.

LP0 ON FIRE posted:

Finally getting my hands dirty with Swift. I'm curious if there's any convincing reason behind prepending "@objc" before some of the functions, for instance ones that are called by Timers (I think because of the selector parameter). I know it's using Obj-C, but why did they design it this way? It seems confusing to remember that some calls must have this. So why didn't they just make it Swift, or have the language know to do whatever it needs to do without adding "@objc"? Maybe there's not a good reason..

As for your particular example, there’s a block-based Timer initializer available in Swift, in case you find it useful.
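For instance, a rough sketch of the two flavours (the class and method names here are made up, so treat this as an illustration rather than anything from your code):
Swift code:
import Foundation

class PollingController: NSObject {

    // Selector-based API: the target method has to be visible to the
    // Objective-C runtime, hence @objc (Swift 4 no longer infers it).
    @objc func tick() {
        print("tick")
    }

    func startSelectorTimer() -> Timer {
        return Timer.scheduledTimer(timeInterval: 1.0,
                                    target: self,
                                    selector: #selector(tick),
                                    userInfo: nil,
                                    repeats: true)
    }

    // Block-based initializer (iOS 10+): no selector, so no @objc needed.
    func startBlockTimer() -> Timer {
        return Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { _ in
            print("tick")
        }
    }
}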

pokeyman fucked around with this message at 03:31 on Oct 18, 2017

LP0 ON FIRE
Jan 25, 2006

beep boop

pokeyman posted:

As for your particular example, there’s a block-based Timer initializer available in Swift, in case you find it useful.

Ah, good to know! Makes more sense now.

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.
For whatever reason there’s been a block-based timer function on CFRunLoopTimer for a while, but never on NSTimer. Always thought that was a strange omission. Happy to delete that NSTimer category from my utils.m now that the Swift overlay has it.

LP0 ON FIRE
Jan 25, 2006

beep boop
Here's my AVCapturePhotoOutput problem. For AVCapturePhotoOutput, I have no idea what the delegate should be set to. Every combination I tried returns an error or crashes with an exception "Could not cast value of type 'myApp.MainViewController' to 'AVCapturePhotoCaptureDelegate'".

The line is self.stillImageOutput.capturePhoto(with: settings, delegate: self as AVCapturePhotoCaptureDelegate)

"self" - returns error "Argument type 'MainViewController' does not conform to expected type 'AVCapturePhotoCaptureDelegate'"

"self as! AVCapturePhotoCaptureDelegate" - crashes immediately "Could not cast value of type 'MyApp.MainViewController' to 'AVCapturePhotoCaptureDelegate'."

"self as AVCapturePhotoCaptureDelegate" - returns error "'MainViewController' is not convertible to 'AVCapturePhotoCaptureDelegate'; did you mean to use 'as!' to force downcast?"

I've also tried setting a constant for my delegate:

let appDelegate = UIApplication.shared.delegate as! AppDelegate

Then trying different combinations with it such as:

self.stillImageOutput.capturePhoto(with: settings, delegate: AppDelegate as AVCapturePhotoCaptureDelegate)

Current implementation (minus viewDidLoad and some other things) where setting it to 'self as AVCapturePhotoCaptureDelegate' is giving me an error (described above):

code:
import UIKit
import AVFoundation

class MainViewController: UIViewController {

    var session = AVCaptureSession()
    var stillImageOutput = AVCapturePhotoOutput()
    var videoPreviewLayer: AVCaptureVideoPreviewLayer?

    // outlets and vars
    @IBOutlet weak var previewView: UIView!

    @IBOutlet weak var captureImageView: UIImageView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        videoPreviewLayer?.frame = previewView.bounds

        session.sessionPreset = AVCaptureSession.Preset.photo

        let backCamera = AVCaptureDevice.default(for:AVMediaType.video)

        var error: NSError?
        var input: AVCaptureDeviceInput!
        do {
            input = try AVCaptureDeviceInput(device: backCamera!)
        } catch let error1 as NSError {
            error = error1
            input = nil
            print(error!.localizedDescription)
        }

        if error == nil &&
            session.canAddInput(input) {
            session.addInput(input)

            let settings = AVCapturePhotoSettings()
            let previewPixelType = settings.__availablePreviewPhotoPixelFormatTypes.first!
            let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                                 kCVPixelBufferWidthKey as String: 160,
                                 kCVPixelBufferHeightKey as String: 160,
                                 ]
            settings.previewPhotoFormat = previewFormat

            // What do I set the delegate to?
            self.stillImageOutput.capturePhoto(with: settings, delegate: self as AVCapturePhotoCaptureDelegate)


            if session.canAddOutput(stillImageOutput) {
                session.addOutput(stillImageOutput)

                videoPreviewLayer = AVCaptureVideoPreviewLayer(session: session)
                videoPreviewLayer!.videoGravity = AVLayerVideoGravity.resizeAspect
                videoPreviewLayer!.connection?.videoOrientation = AVCaptureVideoOrientation.portrait
                previewView.layer.addSublayer(videoPreviewLayer!)
                session.startRunning()

            }

        }

    }
}
I'm still new to Swift so I could be doing something really clueless with views and delegates, but I'm not completely sure.

Doc Block
Apr 15, 2003
Fun Shoe
Your class has to adopt the AVCapturePhotoCaptureDelegate protocol, which means declaring it and then implementing all required methods and any optional methods you need. No different than Objective-C.

IDK Swift, but probably something like
Swift code:
class Whatever : AVCapturePhotoCaptureDelegate {
...
func delegateMethod1
...
func delegateMethod2
... etc
}
Since your class doesn’t do this, the compiler is complaining.
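In Swift syntax that probably looks something like this (going from the docs rather than the post above, so treat the details as a sketch; MainViewController is the class from the earlier post, and the callback shown is the iOS 11 one):
Swift code:
import UIKit
import AVFoundation

// Declare the conformance on the class...
class MainViewController: UIViewController, AVCapturePhotoCaptureDelegate {

    // ...and implement the callback you care about. On iOS 11 the photo
    // arrives as an AVCapturePhoto; iOS 10 used the older
    // didFinishProcessingPhotoSampleBuffer variant instead.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        if let error = error {
            print("Capture failed: \(error.localizedDescription)")
            return
        }
        if let data = photo.fileDataRepresentation(),
           let image = UIImage(data: data) {
            // e.g. hand it to the captureImageView outlet from the post
            print("Captured image of size \(image.size)")
        }
    }
}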

LP0 ON FIRE
Jan 25, 2006

beep boop

Doc Block posted:

Your class has to adopt the AVCapturePhotoCaptureDelegate protocol, which means declaring it and then implementing all required methods and any optional methods you need. No different than Objective-C.

IDK Swift, but probably something like
Swift code:
class Whatever : AVCapturePhotoCaptureDelegate {
...
func delegateMethod1
...
func delegateMethod2
... etc
}
Since your class doesn’t do this, the compiler is complaining.

Thank you :)

hackbunny
Jul 22, 2007

I haven't been on SA for years but the person who gave me my previous av as a joke felt guilty for doing so and decided to get me a non-shitty av
Let's talk CFI, shall we?

Now, Apple men present, I know that CFI is not officially supported on iOS, but the toolchain supports it, and I figured it couldn't hurt to have that extra layer of security in my shipped binaries. Granted, as implemented by the Xcode command line tools, all it does is hit a breakpoint on violations, with no diagnostic message of any kind. To make it even less pleasant to diagnose violations, it requires link-time optimization, which inlines functions so aggressively that I always prayed I'd never have to deal with a genuine CFI crash. My prayers fell on deaf ears.

Disclaimer: I'm going to disable CFI, I've already chosen to, you don't need to tell me. All I want to know is: is this a compiler bug, or is this a compiler bug? (I know it is, because before I upgraded Xcode, the same code didn't crash). The code that crashes is:

code:
    0x10052ecac <+76>:  bl     0x10058d568               ; symbol stub for: operator new(unsigned long)
    0x10052ecb0 <+80>:  mov    x20, x0
    0x10052ecb4 <+84>:  cbz    wzr, 0x10052ed44

...

->  0x10052ed44 <+228>: brk    #0x1
I had to download the AArch64 reference to figure out what was happening, because it didn't make sense to me (it still doesn't make sense, as we'll see). So, the code calls ::operator new to allocate memory for an object; then, it stores the result in register x20. Then, it compares register wzr with zero, and jumps to a breakpoint instruction if it's zero (cbz). Except wzr is always zero, isn't it? It's the special "always zero" register, right? So, is it a regression in the Xcode command line tools, or is it a regression in the Xcode command line tools?

LP0 ON FIRE
Jan 25, 2006

beep boop
Update: Maybe you don't need to specify DISPATCH_QUEUE_SERIAL anymore? I'm afraid this is wrong since it's so different, but I now have this without an error:

code:
videoOutput.setSampleBufferDelegate(self,
queue:  DispatchQueue(label: "sample buffer delegate"))
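(For reference, my current understanding - which may well be off - is that the Swift overlay makes DispatchQueue(label:) serial by default, and the attributes parameter only exists for opting into something else such as .concurrent. The queue labels below are just placeholders:)
Swift code:
import Dispatch

// Serial by default - roughly the old dispatch_queue_create(label, DISPATCH_QUEUE_SERIAL)
let serialQueue = DispatchQueue(label: "sample buffer delegate")

// Concurrency has to be requested explicitly
let concurrentQueue = DispatchQueue(label: "image processing", attributes: .concurrent)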
------------------------------------------------------------------------------------------------------------------------------------


Anyone know what's going on here? I find this error confusing and unhelpful. It has a problem with DISPATCH_QUEUE_SERIAL: "Cannot convert value of type '()' to expected argument type '__OS_dispatch_queue_attr?'"

code:
videoOutput.setSampleBufferDelegate(self, queue: dispatch_queue_create("sample buffer delegate", DISPATCH_QUEUE_SERIAL))
videoOutput is AVCaptureVideoDataOutput. AVCaptureVideoDataOutputSampleBufferDelegate is added to my class.

LP0 ON FIRE fucked around with this message at 17:39 on Oct 25, 2017

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.
Try passing nil or .serial as the second parameter to DispatchQueue.init

LP0 ON FIRE
Jan 25, 2006

beep boop

pokeyman posted:

Try passing nil or .serial as the second parameter to DispatchQueue.init

Update:

I also see a way of passing a label and attributes, but ".serial" does not seem to be supported any longer. Other users say serial is the default, but I wonder how you'd specify it anyway? There doesn't seem to be a list on the Apple website https://developer.apple.com/documentation/dispatch/dispatchqueue.attributes

code:
  DispatchQueue(label: "sample buffer delegate", attributes: [.serial, .qosUtility]))
-----------------------------------------------------------------
Old:
(Edit, didn't realize there was another init method!)

What would that look like? The params are label, qos, attributes, autoreleaseFrequency and target. It doesn't seem like I'd need some other parameters, so I guess I'd pass nil, but qos (the second parameter) doesn't let me pass nil or .serial.

I'm also unsure whether setting it up with DispatchQueue.init and its 5 parameters has anything to do with the problem I'm having right now, which is that captureOutput is not being called.

LP0 ON FIRE fucked around with this message at 21:43 on Oct 25, 2017

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.
I just popped this in a playground and it did the thing:

code:
import Dispatch
let q = DispatchQueue(label: "qq", qos: .default)
q.async {
    print("hi")
}
I’m ok with libdispatch but I haven’t really internalized the Swift overlay yet, sorry for the runaround.

LP0 ON FIRE
Jan 25, 2006

beep boop

pokeyman posted:

I just popped this in a playground and it did the thing:

code:
import Dispatch
let q = DispatchQueue(label: "qq", qos: .default)
q.async {
    print("hi")
}
I’m ok with libdispatch but I haven’t really internalized the Swift overlay yet, sorry for the runaround.

Cool, I didn't think of that! Works too.

I just discovered my main problem was that my captureOutput function was written with the pre-Swift 3 signature - thus never being called :downs:
Should have been:
code:
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)
Now I just need to fix a few main thread errors, but it looks like I'm on my way to getting it working properly.

LP0 ON FIRE
Jan 25, 2006

beep boop


I’m really excited to have the CIFilters working through the camera in real time. I’m curious how the filters are written, so eventually I’ll find the source of those. (But seriously, where is the source for these?) I’m just using the built-in ones that Apple wrote for now.
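Roughly the per-frame pattern I'm working from, sketched from memory rather than copied out of my project (the sepia filter and the 0.8 intensity are just stand-ins for whatever built-in filter you pick): turn each CMSampleBuffer from captureOutput into a CIImage, run it through the filter, and render the result with a shared CIContext.
Swift code:
import AVFoundation
import CoreImage
import UIKit

// Reuse one CIContext and one CIFilter; creating them per frame is expensive.
let ciContext = CIContext()
let sepia = CIFilter(name: "CISepiaTone")!

func filteredImage(from sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    let input = CIImage(cvPixelBuffer: pixelBuffer)
    sepia.setValue(input, forKey: kCIInputImageKey)
    sepia.setValue(0.8, forKey: kCIInputIntensityKey)
    guard let output = sepia.outputImage,
          let cgImage = ciContext.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}
// Remember that captureOutput runs on the sample buffer queue, so any UI
// update with the result has to hop back to the main queue.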

LP0 ON FIRE fucked around with this message at 21:04 on Oct 26, 2017

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.
I don’t think the source for those filters is available. Though I’d imagine if they’re just compiled shaders somewhere you could probably decompile them somehow?

edit: also yay for getting it working!

LP0 ON FIRE
Jan 25, 2006

beep boop

pokeyman posted:

I don’t think the source for those filters is available. Though I’d imagine if they’re just compiled shaders somewhere you could probably decompile them somehow?

edit: also yay for getting it working!

Wow really? I’m really stupid and I’m genuinely curious how this works. Could they possibly be some tiny compiled programs of the shaders somewhere and then they get compiled with the code or something? I thought everything you include is available somewhere.

eschaton
Mar 7, 2007

Don't you just hate when you wind up in a store with people who are in a socioeconomic class that is pretty obviously about two levels lower than your own?

LP0 ON FIRE posted:

Wow really? I’m really stupid and I’m genuinely curious how this works. Could they possibly be some tiny compiled programs of the shaders somewhere and then they get compiled with the code or something? I thought everything you include is available somewhere.

Er, you don’t think UIKit source code is included with Xcode and the iOS SDK, do you?

LP0 ON FIRE
Jan 25, 2006

beep boop

eschaton posted:

Er, you don’t think UIKit source code is included with Xcode and the iOS SDK, do you?

I guess not. I'm just wondering how that works.

Anyway never mind, no one has to explain. I can just look up how frameworks really work and maybe make one myself for the learning experience.

LP0 ON FIRE fucked around with this message at 18:00 on Oct 27, 2017

LP0 ON FIRE
Jan 25, 2006

beep boop
Now the camera output is suddenly not working and I didn't make any changes. Tried cleaning, deleting the app. Making me lose all faith in using Swift. What a waste of time.

hackbunny
Jul 22, 2007

I haven't been on SA for years but the person who gave me my previous av as a joke felt guilty for doing so and decided to get me a non-shitty av

eschaton posted:

Er, you don’t think UIKit source code is included with Xcode and the iOS SDK, do you?

Christ, should we be so lucky

LP0 ON FIRE
Jan 25, 2006

beep boop
Okay so the built-in Camera app didn't even work on my phone! Just a black screen when I went to the camera. Restarted my phone. My app works. Welp, that was 3 hours well spent!

LP0 ON FIRE fucked around with this message at 20:44 on Oct 27, 2017

brap
Aug 23, 2004

Grimey Drawer
:( Programming is just like that sometimes. poo poo don’t work and you dunno why.

Simulated
Sep 28, 2001
Lowtax giveth, and Lowtax taketh away.
College Slice

fleshweasel posted:

:( Programming is just like that sometimes. poo poo don’t work and you dunno why.

I nominate this post for understatement of the year.

lord funk
Feb 16, 2004

Speaking of...

By far my biggest reported crash is OpenGL rendering in the background. GLEngine glFinish_Exec crashes on [GLKView display].

1. Does anyone know of a (fairly) reliable way to recreate this? I can't figure out the series of app entering / exiting that gets this to occur.

2. I only start my displayLink in two places: from the UIApplicationDidBecomeActive notification and the view controller's viewDidAppear. The start function performs a check in each case:

code:
    if ([[UIApplication sharedApplication] applicationState] == UIApplicationStateActive) {
        if (self.isViewLoaded && self.view.window) {
            ...start displayLink...
        }
    }
I've now put the [[UIApplication sharedApplication] applicationState] == UIApplicationStateActive check in both the displayLink tick and even in the draw function. Is this enough? Why is this happening if I perform the check before starting the displayLink?

lord funk
Feb 16, 2004

Oh, and I'm stopping the displayLink on all the following notifications:

UIApplicationWillResignActiveNotification
UIApplicationWillTerminateNotification
UIApplicationDidEnterBackgroundNotification

...and on the view controller's viewWillDisappear.

LLSix
Jan 20, 2010

The real power behind countless overlords

Running into an odd xcodebuild error when trying to build an iPhone app from the command line:

code:
xcodebuild build -quiet -project configcc.xcodeproj -scheme configcc   -configuration Debug  -destination generic/platform=iOS -destination-timeout 1 
xcodebuild: error: Unable to find a destination matching the provided destination specifier:
		{ generic:1, platform:iOS }

	Available destinations for the "configcc" scheme:
		{ platform:macOS, arch:x86_64 }
make[2]: *** [xcodebuild-debug-device] Error 70
make[1]: *** [sub-configcc-configcc-pro-all] Error 2
Any advice on how to resolve it? I'm only seeing this on a new-to-me MacBook that I've been asked to set up as a Jenkins build machine. My usual MacBook builds just fine with the exact same commands, so I must have something wrong in my environment.

Plorkyeran
Mar 22, 2007

To Escape The Shackles Of The Old Forums, We Must Reject The Tribal Negativity He Endorsed
Not having the code signing certs installed on your new machine will produce that error.

LP0 ON FIRE
Jan 25, 2006

beep boop
1) I'm trying to find out more about file storage and retrieval on iPhone. Is it correct to assume the string after Application in file:///var/mobile/Containers/Data/Application/[big random looking string here]/Documents/ is supposed to be secret for security reasons?

2) I'm trying to make a video out of a chain of images, and I found a Swift class for that:
https://stackoverflow.com/a/41159403

Their example shows how to use fileURL in a player, and not how to save it to your phone, so I tried this, but I receive an error "Fetch failed: The operation couldn’t be completed. (Cocoa error -1.)"

code:
var uiImages = [UIImage]()

/** add image to uiImages */
for _ in 0 ... 10 {
	uiImages.append(UIImage(cgImage: image.cgImage!))
}

let settings = CXEImagesToVideo.videoSettings(codec: AVVideoCodecH264, width: (uiImages[0].cgImage?.width)!, height: (uiImages[0].cgImage?.height)!)
let movieMaker = CXEImagesToVideo(videoSettings: settings)
movieMaker.createMovieFrom(images: uiImages){ (fileURL:URL) in
	
	print(fileURL) // file:///var/mobile/Containers/Data/Application/[big random looking string here]/Documents/exportvideo.mp4
	
	PHPhotoLibrary.shared().performChanges({
		PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: fileURL)
	}) { saved, error in
		if saved {
			let alertController = UIAlertController(title: "Your video was successfully saved", message: nil, preferredStyle: .alert)
			let defaultAction = UIAlertAction(title: "OK", style: .default, handler: nil)
			alertController.addAction(defaultAction)
			self.present(alertController, animated: true, completion: nil)
		}else{
			
			print("Fetch failed: \(error!.localizedDescription)")
			
		}
	}
}
I swear I've seen file URLs with "private" before "var" before. Instead it's file:///var/mobile/Containers/Data/Application/[big random looking string here]/Documents/exportvideo.mp4

Stringent
Dec 22, 2004


image text goes here
You need to store relative URLs rather than absolute because the iOS sandboxing allows it to move stuff around as it sees fit. You'll want to do something more along these lines:
code:
 let fileUrl = try? FileManager.default
            .url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false)
            .appendingPathComponent(fileName)
Docs are here: https://developer.apple.com/documentation/foundation/filemanager/1407693-url

LP0 ON FIRE
Jan 25, 2006

beep boop

Stringent posted:

You need to store relative URLs rather than absolute because the iOS sandboxing allows it to move stuff around as it sees fit. You'll want to do something more along these lines:
code:
 let fileUrl = try? FileManager.default
            .url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false)
            .appendingPathComponent(fileName)
Docs are here: https://developer.apple.com/documentation/foundation/filemanager/1407693-url


Update

I think I was just calling the wrong function all along. Now it claims it was saved, but I don't see it in my camera roll. Instead of using
PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: fileURL)
I use
UISaveVideoAtPathToSavedPhotosAlbum(fileURL.path, nil, nil, nil)

Older

Thanks. I'm still stumped, but I will do more research. Your code still looks like it creates an absolute path.

code:
self.fileURL = try? FileManager.default
.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false)
.appendingPathComponent("exportvideo.mp4")
print(self.fileURL) // file:///var/mobile/Containers/Data/Application/[hash string]/Documents/exportvideo.mp4

LP0 ON FIRE fucked around with this message at 20:30 on Nov 2, 2017

Stringent
Dec 22, 2004


image text goes here
It does create an absolute path; the trick is that the [hash string] portion is subject to change by the operating system. So that's why you use the enums when calling url to get the
pre:
file:///var/mobile/Containers/Data/Application/[hash string]/Documents/
portion and just worry about saving
pre:
exportvideo.mp4
yourself.
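Concretely, something like this - persist only the file name and rebuild the full URL at the point of use (exportvideo.mp4 is just the name from your example):
Swift code:
import Foundation

// Store just the file name; the Documents container path can change between launches.
let fileName = "exportvideo.mp4"

func documentsURL(for fileName: String) throws -> URL {
    return try FileManager.default
        .url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false)
        .appendingPathComponent(fileName)
}

// Rebuild the URL whenever you need it instead of persisting the absolute string.
let videoURL = try? documentsURL(for: fileName)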

LP0 ON FIRE
Jan 25, 2006

beep boop
So I finally figured out that nothing was really wrong with the way I was trying to make the video from still images, except that the video was waaaay too large, giving me the vague "-1" error upon saving. Interestingly enough, it was still able to say it was assembling the video and that the file did exist. The only way I was able to get it to crash was by applying my video effect to each frame, which was also vague - it just gave me a "Lost connection to device" error in Xcode!

LP0 ON FIRE fucked around with this message at 22:44 on Nov 7, 2017

LLSix
Jan 20, 2010

The real power behind countless overlords

Plorkyeran posted:

Not having the code signing certs installed on your new machine will produce that error.

Thank you for the quick response! It was very helpful.

LP0 ON FIRE
Jan 25, 2006

beep boop
Xcode is complaining that the way I'm setting the camera flash is deprecated in iOS 10, but I have no idea how to set it now.

code:
try device.lockForConfiguration()
device.flashMode = AVCaptureDevice.FlashMode.auto 
device.unlockForConfiguration()
Setting the flashMode that way gives me the warning 'flashMode' was deprecated in iOS 10.0: Use AVCapturePhotoSettings.flashMode instead.

Which makes me think I'm supposed to set the flash mode like this:

code:
AVCapturePhotoSettings.flashMode = AVCaptureDevice.FlashMode.auto
But that gives me an error: Instance member 'flashMode' cannot be used on type 'AVCapturePhotoSettings'

Setting it with AVCapturePhotoSettings:

code:
let settings = AVCapturePhotoSettings()
...
settings.flashMode = AVCaptureDevice.FlashMode.auto
...
Passes, but nothing happens. This is with "auto" or "on" :confused:
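If I'm reading the docs right (so this is a guess on my part), flashMode has to be set on the same AVCapturePhotoSettings instance that gets handed to capturePhoto(with:delegate:), and it only applies to that one capture - so inside the view controller, something like:
Swift code:
let settings = AVCapturePhotoSettings()
// Only ask for a mode the output actually supports on this device.
if stillImageOutput.supportedFlashModes.contains(.auto) {
    settings.flashMode = .auto
}
stillImageOutput.capturePhoto(with: settings, delegate: self)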

LP0 ON FIRE fucked around with this message at 23:20 on Nov 13, 2017

SaTaMaS
Apr 18, 2003
How do people learn Accelerate? I'm trying to use it for linear algebra and signal processing, but I haven't seen any book for it, and the documentation is bare-bones. At the moment I'm digging through GitHub and Stack Overflow looking for sample code, but I don't know why it has to be this much of a pain.
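For context, the level of sample code I'm after is basically single vDSP calls like this (a trivial, made-up example - element-wise adding two Double vectors):
Swift code:
import Accelerate

let a: [Double] = [1, 2, 3, 4]
let b: [Double] = [10, 20, 30, 40]
var sum = [Double](repeating: 0, count: a.count)

// vDSP_vaddD(A, strideA, B, strideB, C, strideC, count)
vDSP_vaddD(a, 1, b, 1, &sum, 1, vDSP_Length(a.count))
// sum is now [11.0, 22.0, 33.0, 44.0]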

carry on then
Jul 10, 2010

by VideoGames

(and can't post for 10 years!)

How come this is invalid:

code:
return quicksort(smaller) + [list[0]] + quicksort(larger)
(Error "'Int' is not convertible to '[Int]'" on the second '+')

but this is valid:

code:
var intermediate = quicksort(smaller) + [list[0]]
return intermediate + quicksort(larger)

rjmccall
Sep 7, 2007

no worries friend
Fun Shoe
I dunno, that's surprising. What Xcode are you using?

carry on then
Jul 10, 2010

by VideoGames

(and can't post for 10 years!)

rjmccall posted:

I dunno, that's surprising. What Xcode are you using?

9.1 if I remember right. Next time I have the system in front of me I'll check the build number.

e: reproduced on my other system, Xcode 9.1, build is 9B55.

Maybe some context will help someone see what I'm doing wrong? It's just quicksort...

code:
func quicksort(_ list: [Int]) -> [Int] {
    if list.count == 0 {
        return list
    }
    
    let (smaller, larger) = partition(pivot: list[0],
                                      rest: Array(list.suffix(from: 1)),
                                      smaller: [],
                                      larger: [])
    return quicksort(smaller) + [list[0]] + quicksort(larger)
}

func partition(pivot: Int, rest: [Int], smaller: [Int], larger: [Int]) -> ([Int], [Int]) {
    if rest.count == 0 {
        return (smaller, larger)
    }
    
    if rest[0] <= pivot {
        return partition(pivot: pivot,
                         rest: Array(rest.suffix(from: 1)),
                         smaller: [rest[0]] + smaller,
                         larger: larger)
    }
    
    return partition(pivot: pivot,
                     rest: Array(rest.suffix(from: 1)),
                     smaller: smaller,
                     larger: [rest[0]] + larger)
}

carry on then fucked around with this message at 15:17 on Nov 15, 2017

lord funk
Feb 16, 2004

Wow the new code folding is garbage. Scrolling is molasses, and the entire text editor is filled up with buttons for expanding functions instead of the old {. . .} button. You can't position your cursor at the end of a function block anymore 😡

So pointless. And we still don't have a dark mode.

edit: also lol at the function mouseover highlight bug. Just mouse over and scroll away and the button stays highlighted. Did an intern program this at the last second to 'wow' everyone in the office or something?

lord funk fucked around with this message at 20:51 on Nov 15, 2017
