Lost in the sauce with Apple’s move to the premium phone space is that iOS 12 is exactly what the iPhone ecosystem has needed for a couple of years. It’s not a true Snow Leopard-type “no new features” release, but it does a LOT of cleaning up under the hood. The Internet is rife with reports of four- and five-year-old iOS devices suddenly restored to the full flush of youth, and I can say that my vintage-Christmas-2013 iPad mini, which was functionally unusable under iOS 11, is usable again under iOS 12. The old quip about “snappy” holds true for sure. There are other nice touches, like the new accents in Siri, or the fact that voice recognition works in low-power mode, or that the utterly unintuitive second step to close apps on the iPhone X is no longer necessary. It’s a polish job for an OS that desperately needed one after the fiasco that was iOS 11/macOS High Sierra (for my money, the worst OS release since…7.5.2? Maybe?) and it seems rock-solid.
But the real magic is Shortcuts.
Shortcuts began life as an app called Workflow, designed to bring some primitive scripting capability to iOS. And it could do some nifty tricks, like helping you select a picture and turn it into a LOLCAT-style meme, or populating a tweet with ASCII art, or parsing QR codes with the camera, or just putting one-touch buttons in the widget pane to call someone or start a certain playlist. And it can still do all of those things. But Apple bought Workflow, renamed it, and integrated it with another acquisition from eight or ten years back…called Siri. So now, not only can you craft your own workflows, you can trigger them through Siri with voice commands. You can nest scripts in other scripts. You can call system functions. And most interesting of all, Siri will parse things you do on a regular basis – on the phone, without sending anything to the cloud for processing – and offer them as options to integrate into other shortcuts.
The practical upshot of this is that at bedtime, I can say “hey Siri, time to go to bed” and it will flip the phone to Do Not Disturb and launch the white noise app. When I walk out the door in the morning, I can say “hey Siri, time to go to work” and it will turn on low-power mode, shut off the wifi (so it doesn’t try to grab the half-assed wifi on every passing bus) and start the mellow playlist that eases me into the day, while launching Transit to let me see what time the train is coming. Or, if I’m at home in the recliner with a pint or just settling into a chair in my favorite no-television bar, I can say “hey Siri, pub mode” and it will kick on Do Not Disturb, launch the Kindle app for reading and start playing my “This Are Two-Tone” list with Madness, the Specials, the English Beat, and all the other stuff I like to have in the background while sipping on a Smithwick’s and reading.
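Shortcuts themselves are assembled in a drag-and-drop editor rather than in code, but the shape of an automation like “pub mode” – one spoken trigger phrase bound to an ordered list of actions – is easy to sketch. Here’s a toy model in Python; every function name here is hypothetical and purely illustrative, not any actual Shortcuts or SiriKit API:

```python
# Toy model of a Siri Shortcut: one trigger phrase, an ordered
# list of actions run in sequence. All names are hypothetical;
# real Shortcuts are built visually, not in code.

def set_do_not_disturb():
    return "Do Not Disturb on"

def open_kindle():
    return "Kindle launched"

def play_playlist(name):
    return f"Playing '{name}'"

class Shortcut:
    def __init__(self, phrase, actions):
        self.phrase = phrase      # what you say to Siri
        self.actions = actions    # steps, in the order they fire

    def run(self):
        # Execute each action in sequence, like steps in the editor.
        return [action() for action in self.actions]

pub_mode = Shortcut(
    phrase="pub mode",
    actions=[
        set_do_not_disturb,
        open_kindle,
        lambda: play_playlist("This Are Two-Tone"),
    ],
)

print(pub_mode.run())
```

The point of the model is the composition: because a shortcut is just a list of steps, any step can itself be another shortcut, which is how the nesting described above works.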
This. Is. HUGE.
Not just because it’s a huge step down the road to JARVIS. It can’t read your mind yet, but it can make educated guesses and offer things that you can approve and incorporate for it to use. And it can do it all with machine learning on the device itself. This is enormous, because unlike Google’s or Facebook’s offerings, it needs no offsite data mining and aggregation to work. It may not be as effective as other digital assistants, but it does its thing without compromising your privacy. In a digital world where privacy is a luxury good, this might be one worth paying for.
And at this point – when you can buy a device knowing the OS will be usably updated for four years, when it can be customized and shaped into the most truly personal computer yet, and when it obviates the need for tablet, Kindle, and maybe even television on top of all the other things the smartphone replaced – you can start to see $800 over four years as a more reasonable arrangement. If Apple’s long game is “value for money over more time” and they can make it pay out, that wouldn’t be the worst thing in the world to happen to this industry, especially when I’m balking at having to replace a watch after three years.
Get back to me after the iOS 14 launch, when we see how this X is holding up.