WWDC Initial thoughts

June 9, 2022 • 9:31 AM

New APIs

There are a bunch of new APIs available to developers. Some are entirely new, but most expose features that have been part of iOS since version 14 or even earlier.

WeatherKit

WeatherKit is a paid set of APIs. I initially thought that half a million calls per month was a stingy quota—compared with the Open Weather Maps plan I adopted in PlantPal—but most developers found it quite generous, as it’s basically half the price of Dark Sky (which, mind you, was acquired by Apple years ago).

No plans for me to adopt it anytime soon. I’m already bought in on Open Weather Maps, and on iOS 14 or earlier I could only use WeatherKit through its REST API, so that’s not ideal. I’ll wait.

Notifications (Live Activities)

Officially branded as Live Activities, this notification feature is a new way to keep the user updated on “things that are happening in real time, right from your Lock Screen.”

I can see my PowerTimer app adopting this new feature. Instead of sending a series of sequential notifications as the timer updates, I can now pin a live timer on the Lock Screen.

There’s a small asterisk saying Live Activities are coming later this year. So it could either be missing from the entire beta, or available as early as July.

App Intents

App Intents seems to be the new way for apps to interact with Siri and Shortcuts. I may adopt it for apps that currently don’t use SiriKit (the original Intents framework), but not for PowerTimer yet. I’ll wait until iOS 16 has taken over most of the market.

Not sure why they are upgrading to App Intents. Maybe there’s too much baggage on SiriKit after years of interfacing with Siri, Shortcuts, and — since iOS 14 — widgets.
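
For reference, a minimal sketch of what an App Intents action might look like. StartTimerIntent and its minutes parameter are hypothetical, just to show the shape of the new API:

import AppIntents

struct StartTimerIntent: AppIntent {
	// The name shown in Shortcuts and spoken to Siri.
	static var title: LocalizedStringResource = "Start Timer"

	@Parameter(title: "Minutes")
	var minutes: Int

	func perform() async throws -> some IntentResult {
		// Start the timer in the app here.
		return .result()
	}
}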

Charts

There are finally charts. Officially supported charts, programmed in SwiftUI.

At first glance, there isn’t anything fancy for the data-intensive charts you might see in a data analyst’s typical day. But bar charts, line charts, and scatter plots are more than enough to show basic data in most apps.

Maybe there are ways to layer these basic charts to form a more detailed visualization—I’ll see.
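
For the basics, the SwiftUI API looks pleasantly small. A minimal sketch, assuming a hypothetical WateringRecord type I might chart in PlantPal:

import SwiftUI
import Charts

// Hypothetical data point: how much a plant was watered on a given day.
struct WateringRecord: Identifiable {
	let id = UUID()
	let day: Date
	let millilitres: Double
}

struct WateringChart: View {
	let records: [WateringRecord]

	var body: some View {
		Chart(records) { record in
			BarMark(
				x: .value("Day", record.day, unit: .day),
				y: .value("Water (mL)", record.millilitres)
			)
		}
	}
}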

I’ll definitely adopt these in PlantPal. In PowerTimer, maybe.

Miscellaneous

Screen Time is now accessible to developers. Hopefully I can implement categorized time tracking in PowerTimer. I’ll see what the framework looks like.

I can totally see many other apps adopting Screen Time. Ulysses, which I’m using right now to write this post, will likely adopt folder-based time categorization, to give one example.

Focus Mode is now accessible to developers. There’s definitely an update coming for PowerTimer on that.

Shared with You is now accessible to developers. Interesting addition to the now-open-to-developers category. Not sure if it adds value to my app. But apps like Instagram can likely have a section that shows in-app content shared over Messages.

One thing to note: apps have no access to what’s shared. Instagram cannot tell which posts were shared with you over Messages. Apple really nailed the privacy aspect.

There’s now a good reason to revisit Watch Widgets. They appear on the Lock Screen now, so… there’s that.

Object identification might be available? Developers can now make text recognizable in videos or in UIImageView objects. Plant recognition for PlantPal might be possible this way, but I’ll see what’s opened up to developers and what’s not. I’ve been using the Camera app to take photos of trees and flowers around Vancouver, and the identification rate is top-notch. It would be a shame if this part isn’t open to developers.

The photo library can now track changes. This is likely helpful for apps that integrate heavily with the photo library.

M2 and the New MacBooks

M2 looks great by the numbers. It’s expectedly not as revolutionary as the Intel–to–Apple Silicon jump, but still a good improvement over two years.

I like my 16-inch MacBook Pro’s look—it’s not deceiving or trying to please anyone. It looks hefty and dependable. Along the same lines, I like the look of the new MacBook Air.

Compared to the M1 MacBook Air, which will still be sold for the foreseeable future, the extra $200 gets you the M2, a slightly larger display (with a notch), a better camera, MagSafe (which frees up an extra port when you’re charging), and fast charging.

The new MacBook Pro costs an extra $100 over the new Air. You lose all of those features above (except the M2) in exchange for a fan, and a lump of coal called the Touch Bar. I’ve never met anyone—developers, college students, designers, photographers—who said anything nice about the Touch Bar. The 13-inch MacBook Pro definitely exists for some people out there. I’m just in a different world.

Finally, I was expecting to see the new Mac Pro at WWDC. There was nothing. Traditionally, it’s not out of the ordinary to show the Mac Pro at WWDC, as advanced developers love it. Today, the Mac Pro is just too “Pro” for most developers, the vast majority of whom are served well by the models with M1 Pro, Max, or Ultra chips.

◼︎

The Best Lens to Use: Let iOS Decide

June 9, 2022 • 9:09 AM

When you begin developing an app with camera features, it’s tempting to choose the single back-facing wide lens (the “1x lens”), as it’s a safe choice available on every iPhone model since the beginning. But this choice comes with a problem: you are unintentionally making a decision for iOS, telling it to use only the wide lens for photo capture.

This becomes a very real issue with the iPhone 13 Pro: not allowing iOS to pick the best lens to use results in poor (and often unusable) image quality. Here are two apps I frequently use on my iPhone:

Lululemon (left) cannot scan a credit card1 properly on the iPhone 13 Pro. Foodnoms (right) has issues scanning the barcode up close (it’s a box of Nespresso pods with an equally small barcode).

Why is this happening?

Since the iPhone 7 Plus, the first with a dual camera system, iOS has been able to decide the best lens to use to capture the image, depending on a few factors (and possibly many more):

  • The zoom factor, obviously: on the iPhone 13 Pro, a 3x lens cannot physically take a 2x photo, and is not considered
  • A lens might be blocked, by a finger or some other object
  • A lens might be out of focus for a close-up shot
  • A telephoto lens might produce poor quality in a low-light environment, and iOS decides it might as well crop the image from the 1x lens

The iPhone 13 Pro, with its macro photography capability, really kicked off the discussion on “smart” lens switching. The 26 mm wide-angle lens (the “1x lens”) on the iPhone 13 Pro has a minimum focus distance of approximately 8–10 cm. That’s about as long as the diagonal of your credit card, and a much longer distance than the minimum focus distance on older iPhone models.

Many developers, when starting off with AVFoundation or when copying and pasting code from Stack Overflow, inadvertently tell iOS to stick to the 1x lens no matter what. They use it to scan barcodes, identify credit cards, or even take photos. It all worked well until hell broke loose with the iPhone 13 Pro.

What’s the solution, as a developer?

Unless you are developing a camera app that must strictly use the user-specified lens, you should defer the lens choice to iOS. For the rear-facing cameras, you should adopt the first available AVCaptureDevice object in the following order:

  • Triple Camera, a system that switches automatically among the three lenses
  • Dual Wide Camera, a system that switches between 0.5x and 1x as appropriate
  • Dual Camera, a system that switches between the 1x and telephoto lenses (2x, 2.5x, or 3x depending on the phone model)
  • Wide Camera, the 1x single-lens camera

This table lists the availability of these device types:

                 Triple Camera   Dual Wide Camera   Dual Camera   Wide Camera
  iPhone 13 Pro  Yes             Yes                Yes           Yes
  iPhone 13      No              Yes                No            Yes
  iPhone XS      No              No                 Yes           Yes
  iPhone 11      No              No                 No            Yes
  iPhone 6s      No              No                 No            Yes

Device type availability on key iPhone models, from the latest to the oldest still supported by iOS 15.

Many developers choose the wide lens because it’s “safe” and available on all iPhone models. They are wrong. Don’t do that.

To put all the above in code:

import AVFoundation

// List device types from the most capable camera system down to the
// single wide lens; iOS handles the lens switching within each system.
func preferredBackCamera() -> AVCaptureDevice? {
	let deviceTypes: [AVCaptureDevice.DeviceType] = [
		.builtInTripleCamera,
		.builtInDualWideCamera,
		.builtInDualCamera,
		.builtInWideAngleCamera,
	]
	let discoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: deviceTypes, mediaType: .video, position: .back)
	return discoverySession.devices.first // this can be `nil`
}

Note that the discovery session returns devices in the order you specify in deviceTypes (as per Apple’s documentation). The first element in the devices array is therefore the proper device for a general-purpose camera.
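
To hook that into a capture session, here is a minimal sketch using the snippet above (error handling omitted):

let session = AVCaptureSession()
if let device = preferredBackCamera(),
   let input = try? AVCaptureDeviceInput(device: device),
   session.canAddInput(input) {
	session.addInput(input)
}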

  1. I traced a credit card on the iPad. I didn’t want to expose my card number, and I couldn’t just redact it either, as that would defeat the purpose of showing the blurry capture. In practice, the Lululemon app managed to scan 2 of my credit & debit cards and struggled with the others on the iPhone 13 Pro. The app scanned all cards OK—without breaking a sweat—on my iPhone XS. 

◼︎

Develop for Shortcuts to Process Images

January 7, 2022 • 10:40 PM

The Shortcuts app is the official automation solution supported throughout the Apple ecosystem. It lets users fit your app into their workflows, and you don’t even have to design a UI for the action.

The idea of the Shortcuts action in BORD, my app that un-crops photos and adds adaptive colour backgrounds.

In this tutorial, I will demonstrate how you can develop an action and expose it in the Shortcuts app. In my app, BORD, this action adds coloured borders to your photos, so that a 2:3 portrait DSLR photo can fit Instagram’s 4:5 aspect ratio requirements.

For the sake of this tutorial, we will develop an action that passes its input through as output. However, you are free to perform any image operations on the input as you wish.
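
To give a feel for where we’re headed, here is a minimal sketch of such a pass-through handler. It assumes a custom intent named ProcessImageIntent defined in an intent definition file, with a file-typed inputImage parameter and an outputImage response property—all of these names are placeholders for this tutorial:

import Intents

class ProcessImageIntentHandler: NSObject, ProcessImageIntentHandling {
	func handle(intent: ProcessImageIntent, completion: @escaping (ProcessImageIntentResponse) -> Void) {
		// Pass the input image straight through; real image operations would go here.
		let response = ProcessImageIntentResponse(code: .success, userActivity: nil)
		response.outputImage = intent.inputImage
		completion(response)
	}
}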


◼︎

UIApplication Key Window Replacement

November 9, 2021 • 9:30 AM

With Split Screen on iPad, you may end up with two active windows of the same app. When I was working on building the menu system, I ran into a problem: how can I know which window scene is “active and current”, the one that invoked the keyboard shortcut?

In UIKit, the UIApplication class has a property named keyWindow. This property is marked as deprecated in iOS 13.0, accompanied by a message from Apple:

‘keyWindow’ was deprecated in iOS 13.0: Should not be used for applications that support multiple scenes as it returns a key window across all connected scenes

So what’s its replacement for iOS 15?

TL;DR. As indicated by @saafo on Stack Overflow:

Although UIApplication.shared.keyWindow is deprecated after iOS 13.0 by Apple, it seems this attribute can help you find the active one when multiple scenes [are] in the foreground by using:

UIApplication.shared.keyWindow?.windowScene

So… there you have it. I have no idea why Apple marked keyWindow as deprecated. You should continue to use it until Apple finally replaces it with something functional, or until it becomes obsolete.
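
Wrapped up as a convenience, this might look like the following (a minimal sketch; activeScene is a made-up name, not an API):

import UIKit

extension UIApplication {
	// The scene hosting the key window, i.e. the one the user is interacting with.
	var activeScene: UIWindowScene? {
		keyWindow?.windowScene
	}
}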

Below, I’ll discuss two attempts that did not work as of today on iPadOS 15.0.2.

◼︎

Random Little Things from WWDC 2021

June 7, 2021 • 3:08 PM

Notifications now have “urgency” (or as Apple puts it, an “interruption level”). I can probably update my PowerTimer app so that it uses the Time Sensitive level and keeps functioning with Do Not Disturb on.
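
A minimal sketch of what this looks like on the notification content (the title is just an example):

import UserNotifications

let content = UNMutableNotificationContent()
content.title = "Timer finished"
// Time Sensitive notifications can break through Do Not Disturb,
// provided the app has the corresponding capability enabled.
content.interruptionLevel = .timeSensitive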

Always-On app for watchOS — when the watch is dimmed, you can have your own app on screen. Potentially useful, again, for my timer app. Interestingly, this is powered by the same technology as iOS widgets — TimelineView to schedule view updates, and SwiftUI’s Text.DateStyle for regular view updates.
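
For a countdown, the date-style text alone gets you most of the way. A minimal sketch, where endDate is whatever the timer is counting down to:

import SwiftUI

struct TimerLabel: View {
	let endDate: Date

	var body: some View {
		// Counts down on its own, with no per-second view updates needed,
		// which is what makes it work in the dimmed Always-On state.
		Text(endDate, style: .timer)
	}
}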

Visual Lookup surfaces objects captured in a photo, and will even identify houseplants, pets, landmarks, and more. There’s no developer documentation mentioning this, so I assume it’s available only to the system, at least for now. It’s available on A12 or later chips (A12, A13, A14, and M1).

◼︎

“Let them eat cake.” — The Shitty Take by David Heinemeier Hansson on the Reduced App Store Cut

November 18, 2020 • 7:22 PM

David Heinemeier Hansson — CTO at Basecamp, famously known as a public critic of Apple’s control and monopoly — posted a strongly opinionated thread on Twitter regarding Apple’s announcement today: developers who earn less than $1M will qualify for a 15% cut starting January 1st, 2021.

I followed the entire episode of Basecamp vs. Apple and I agree with DHH on many fronts. But this time, his comments are condescending and downright shitty:

If you’re a developer making $1m, Apple is STILL asking to be paid $150,000, just to process payments on the monopoly computing platform in the US. That’s obscene! You could hire two people at that take, still have money for CC processing.

Sure. I can just hire two people. I can definitely spend just a few days browsing contracting job boards, interview a few dozen people, draft a contract, set up payment, and do the taxes to hire two people. How easy is that.

◼︎

Enumerated Custom Errors in R

September 4, 2020 • 11:58 AM

When designing a complex Shiny app in R, you might want to present different kinds of errors that you expect your users to encounter. These errors should have a summary to show the user what went wrong. There should also be a longer description to give the user more information about the error, as well as possible steps to troubleshoot.

You would also want your custom errors enumerated, so that the messages live in one place for easy management and updates. This is critical as your app grows in complexity, and it’s really just good practice. In this post, we will explore just that: designing your own custom error cases to display helpful error messages in Shiny apps.

◼︎

Developers Whining About Apple’s IAP Cut Are Babies

August 15, 2020 • 5:31 PM

The “I’m An Adult, And I Don’t Pay My Taxes” Argument

As a small iOS developer myself, I found Apple’s 30% cut high, but somewhat acceptable. With it, I’m not worrying about monetary transactions (sales, fraud detection, refund handling, money deposits); I don’t need to find a place to host my app for download; and I don’t need to tell people where to find it, as long as they know the name or have a link.

Plus, with Apple doing a good job with OS updates, supporting legacy systems is the least of my worries. With a nice and properly scoped idea, it’s easy to have an app up and running within literally hours.

So, I’m happy to pay that 30% tax, and just focus on my apps. This is from me, someone who makes a few hundred bucks a month from the App Store.

◼︎

Published and NSManaged

July 13, 2020 • 11:31 PM

While developing a universal app with the updated SwiftUI framework, I found that the @Published property wrapper does not function as intended in objects inheriting from the NSManagedObject class.

What’s Expected & What’s Broken

What exactly is the expected behaviour anyway? We know that —

  • Objects conforming to the ObservableObject protocol can have variables marked with @Published. SwiftUI views that depend on an observable object instance will update when a published variable receives a new value (see the sketch below).
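
A minimal sketch of that expected behaviour with a plain class (no Core Data involved):

import SwiftUI

final class Counter: ObservableObject {
	@Published var count = 0 // changing this publishes, and observing views refresh
}

struct CounterView: View {
	@ObservedObject var counter: Counter

	var body: some View {
		Button("Count: \(counter.count)") {
			counter.count += 1 // triggers a view update through the @Published publisher
		}
	}
}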

◼︎

There's Always a Unique Identifier

July 3, 2020 • 12:00 AM

When a business owner approaches me to set up a new ETL pipeline for their emerging dataset, I always ask one question: “What’s the unique identifier of this dataset?” This question gets paraphrased according to the business owner’s knowledge and understanding of data and databases, and oftentimes I help them identify a candidate (either a single-column primary key or a multi-column composite key).

What I never do is take no for an answer.

The statement that “this dataset has no unique identifier” invites trouble1 — ETL pipelines exist to bring order to the chaotic data from various sources. If they are adamant that the data has no unique identifier, follow up with a few more questions:

  1. How are you gonna tell if the data has duplicates in the future? How do you ensure its quality down the road?
  2. Why do you need this dataset to begin with, given that you can’t ensure its quality?

Chances are you will find that they have absolutely no clue themselves. Kindly ask them to come back later.

  1. Especially with our archaic infrastructure that gets virtually zero IT support. We have a patchwork of solutions that depend on Perl, Python 2, Python 3, R, VBA, JavaScript, C#, .NET, Ruby on Rails, Oracle SQL, SQL Server, and MS Access, to name a few. 

◼︎