Cashing in

A month ago, I was up in the mountains, with virtually no internet connection – at my tent-cabin, there was none. Just me, a cooler of beer, a zero-G chair, and a Kindle. Paradise. Only problem is, I managed to lose the Kindle somewhere in the woods. So I took advantage of Amazon’s self-created holiday and bought a replacement. Kindle Paperwhite, as good a single-purpose device as you could ask for. Everything you need, nothing you don’t, sorted. But like the original, I went for the one with “special offers.”

Because right now, you can’t get anything else. They offer the Paperwhite – the cheapest one with built-in illumination, all I really require – for $120. And it comes with advertising, on the lock screen and in a banner at the bottom. Not particularly intrusive, not particularly interesting, mostly for Kindle-original content (which tends toward the self-published, far as I can tell) – but the kicker is, you can pay to turn off the ads. It will cost you a slick $20, one time. Apparently, somewhere in there, Amazon has calculated that the lifetime value of you seeing their ads is $20. But then, you don’t see them very much.

By contrast, look at what Amazon has done since the tremendous bust that was the Kindle Fire phone. (Seriously, Jayne Mansfield didn’t have a bust like the Fire phone was a bust.) Now instead, they have a slew of low-cost Android devices, which can be bought by Prime members…with special offers. The version without the ads will set you back an extra $50. Given that my original Kindle lasted me approximately six years, that most people keep a phone for two to three years, and that you’ll see the lock screen of your phone a LOT more than the lock screen of your Kindle, that’s probably a rough equivalent. We’ll stick with two years, because low-cost phones don’t hold up as long as flagship models, but consider it: if you are a Prime member, it’s worth $25 a year for Amazon to have that eyeball space.

Which makes sense. You paid $99 a year to be on Amazon Prime and have that free shipping, so it’s worth kicking back $25 of that as an annual phone discount to show you more things that your Prime membership is, in turn, likely to speed you along to purchasing. Amazon has the same appetite for data as a Facebook or Google, but it’s very single-minded: Amazon wants all the data you can generate about buying stuff, because they want you to buy more of their stuff. It’s bits in the service of atoms.

When you get right down to it, Amazon and Apple are the only companies who deal in atoms anymore. Apple is largely agnostic about what services you use – one gets the sense that they run their own services out of some atavistic Scarlett O’Hara impulse to never be hungry again after Steve led them back from the brink, but let’s face it, most folks have Google email and probably do streaming music through Pandora or Spotify rather than Apple Music and they’re all using Twitter or Facebook or Instagram or Snapchat. And as long as you laid down the $650 for that iPhone, that’s fine with Cupertino. (Of which more later.) Microsoft is kept alive partly by Xbox gamers but mostly by the fact that every major business still relies on Microsoft Office (even when it runs on a Mac or – heavens! – an iPad). Google and Facebook don’t even have a physical product to sell you, unless you count the slender market for the Pixel or the Chromecast or maybe the abortive $1/year charge for WhatsApp. Those are paid for entirely with your eyeballs.

Amazon has its dubious side, no doubt – mostly because it’s laundered the Wal-Mart monopsony for the digital era and the upper-middle-class market – but in this, at least, it seems straightforward. Amazon will feed you ads for other things Amazon can sell you, much like SiriusXM’s 40s Junction channel will never advertise anything but itself and other SiriusXM stations. But for other companies and other applications, that advertising is everything. Because they sell your eyeballs along to other takers. It seems like almost every new app that comes down the pike on the iTunes App Store is free and then charges you…to remove the ads.

So I have to ask myself again: how much would Google have to charge you for their services – or Facebook, or anyone else – before it would be more profitable than just selling advertising against your data? And since they don’t charge…is it valuable enough they can’t? And at that point, how well off am I not using Facebook or Google products? And then I remember that Facebook owns Instagram and that 80% of the people I email with use Gmail…and maybe my goose is cooked no matter what. And that’s when you need regulation. But that’s a story for another time.

Wonder

DC couldn’t have gone about this more ass backwards. First movie: a Superman reboot, less than a decade after the last Superman reboot – which flopped because 1) the only Superman you need was made in 1978 and 2) Superman is the least interesting character to tell a story about. Okay. Second movie: Superman again, plus Batman – but a different Batman than the one who had three movies in the last decade, or the one who had four between 1989 and 1997, or the one from the TV show. And he’s fighting Superman for some reason. And let’s throw in Wonder Woman, and a bunch of hooks for the team-up movie with a bunch of heroes we haven’t met yet. Third movie: let’s forget the first two movies happened and remake the Dirty Dozen with a bunch of villains no one has heard of (except the Joker) and not really hook it into anything we’ve seen thus far, other than the idea that the Joker and Batman have a past, which anyone could have told you already. And then…the Zack Snyder Grimdark Murderverse is a critical disaster area and an emerging box-office flop? Send the girl.

At least Marvel waited to put identifiable characters in its studio ident until they had some. (Green Lantern? That was on purpose?) If there’s a criticism to make of DC, it’s impatience and incoherence. Marvel took four years and five movies to get to Avengers, they eschewed Yet Another Origin Story for the Hulk (the only MCU character who any person on the street could have named in 2007), and every leading character – Iron Man, Hulk, Thor and Captain America – was introduced and on the radar with their own picture before Avengers. Even Nick Fury and Phil Coulson had three or more appearances to build things out. Yes, Superman and Batman and Wonder Woman need no introduction – which is why they should go last, not first. But if you’re going to go first, why wouldn’t you start on the lowest difficulty setting?

I mean, count it up. Four Superman movies between 1978 and 1987, plus reboots in 2006 and 2013. Seven Batman movies with four different actors from 1989 to 2012, including a major thematic reboot halfway through. We know the deal with Superman and Batman. Wonder Woman was an unbelievable opportunity: THE female superhero, a character that everyone knows, but whose origin story is nebulous and malleable and whose only mass-media portrayal was an admittedly-iconic TV show in the late 70s. You simultaneously get an A-list character and a tabula rasa. It should be the easiest kind of movie to make – all the pop-culture recognition of Star Wars or Superman and none of the existing continuity baggage to weigh it down. So…

Wonder Woman was – and this sounds like damning with faint praise and it sincerely is not meant to be – a perfectly good superhero movie. It’s hands-down the best DCEU movie yet and there is not a second-place finisher; it’s literally the only one worth paying to see. But you can see it struggling against the freight of the ZSGDMV – the fight choreography and color correction just suggest 300 or Watchmen way too much, especially once you put the Mediterranean armor on everyone. I half expected Princess Buttercup to shout “AMAZONS, WHAT IS YOUR PROFESSION” at one point. And it doesn’t help that some of the WWI elements came across as a Captain America: The First Avenger pastiche, with the kinda-sorta Howling Commandos and the heroic self-sacrifice of Steve to prevent a plane full of doomsday weapons from destroying a major world city. I realize that Wonder Woman’s canonical origins involved meeting an American man who brought her into a World War, and that’s as may be, but nothing required the second and third acts to be quite that on-the-nose. And the pre-existing storyline means we’ve basically put her character on ice for a hundred years, so all those supporting characters (including a criminally underused Lucy Davis as Etta Candy) get wasted on a one-and-done. (In fact, Patty Jenkins got handed Gal Gadot rather than doing the casting herself – and it works out splendidly, even Lynda Carter says so, but it does feed the “backward in heels” aspect of having to make this picture fourth.)

But it was World War I, not II, and that was a very good decision to make with the whole Ares angle. The war to end wars and the debut of industrialized warfare on a continental scale, aerial bombing and gas attacks and machine guns and the endless murderous stalemate that traumatized a generation and set the stage for everything after? It gives Diana both a reason to show up and a reason for her to give up for a hundred years after. (That 100 years is a convenient way of pointing up the whole “immortal daughter of Zeus” thing and also making it a bigger deal that now the stakes are high enough for her to come out of hiding in a way that even freakin’ Nazis apparently weren’t.) Which is good, because DC/Warner/Snyder/Johns have done absolutely naff all to set up the villain of Justice League – I assume it’s Darkseid, but one painting that’s supposed to be suggestive of a character no one has heard of is fan service, not exposition.

It’s a good movie. Possibly a great one. But it makes you wonder what Patty Jenkins could have done if she hadn’t been painted into a corner. I think if this had been the first DCEU movie, rather than the fourth, and not had to service the storyline of the first three, you could have gotten a masterpiece and a solid foundation from which to build the DCEU. Instead you merely get a first-rate summer blockbuster that makes you ask “was that so hard, guys?” And while it’s probably too late for Justice League to go to school on it, maybe it’s not too late for the GrimDark MurderVerse brain trust to say “more like Wonder Woman next time.” Which would be a win all around. 

Years ago and time gone by

I know today is the 10th anniversary of the iPhone launch, but I didn’t get my company-issued one for almost a month, so rather than reflect on how I was onsite for the two moments that shaped the 21st century, I’ll look back a little…because I finally got with the times in modern geek/political culture and saw Hamilton yesterday.

I think the obvious thing is: had I seen this twenty-five years ago, it would have changed my life. But like Rent – which I saw for the first time in 2003, after antiretrovirals and the dot-com boom and HIV as a suspended sentence rather than a death warrant – time and events have moved me out of the target demographic. I think the ideal audience for this show is a young American of any age, background or station who hasn’t yet had the opportunity of a shot, let alone the chance of throwing it away. (Had I seen this show twenty years ago, rather than twenty-five, it would have been a lot harder to take, and I can’t vouch for what would have been my reaction even two or three years ago.)

I have said elsewhere that this show reminded me of the iPhone, or more precisely, the iPhone 4: the craftsmanship, the materials, the weight of it and the feel of it. Lin-Manuel Miranda has taken such simple phrases as “it’s quiet uptown” or “that would be enough” and freighted them with enough emotional heft to add them to the list of things that will probably be tattoos on theater majors for years to come. Not that “I am not throwing away my shot” and “young, scrappy and hungry” and “WHO LIVES, WHO DIES, WHO TELLS YOUR STORY” won’t be up there with “defying gravity” and “no day but today” and “the Internet is for porn” (okay, maybe not that last one). But you can see every dollar, every hour, every drop of sweat up there on that stage. This was not something some genius randomly shat out; this was a work, a labor, and if it looks effortless in the telling you can see the effort that made it. Like Willie Mays, Miranda put in the hard hours to make it look easy.

The thing that grabbed me most about that show, though, was the liminality of the moment. Nothing was predestined for the United States of America. Nothing was on rails that said we would inevitably become a superpower. Those early days, those first arguments about how we would regard our allies or how we would finance our government or who would have the upper hand between the agrarian rural lands and the swelling urban districts – those are the same fights we’re having two hundred forty years later, over the disproportionate power and influence of the South or the relative righteousness of the sweat of the brow versus pushing papers.

We write rules, we make laws, we throw everything into that black box and agree to abide by whatever emerges from the room where it happens – but like Gibson’s cyberspace, it’s really a shared consensual hallucination. We have norms and behaviors that are only that way because we agree that’s how it’s going to be. And then when we disagree – what? When we decide that we just don’t have to do what we always did? When we can use a Senate rule – not a law, not a Constitutional process, a mere point of debating order – to shut down majority rule? To deny one branch of government its role in another? When we decide that we need to know a candidate’s finances, until he says “no you don’t” – what then? When a full house beats a flush and the guy with the flush says “no it doesn’t” and scoops the pot – what then? What are you prepared to do? Never mind if you don’t have the votes – what if the votes aren’t enough?

Hamilton and his friends and his foes gave us enough of a government and a nation that we grafted this Founding Father nobility over it and took it for granted. Maybe we can keep it. Maybe not. And nobody knows what comes next. And contra Angelica, Eliza, and Peggy, you’re not always lucky to be alive when history is happening. Of which.

Irony

Look, there’s nothing shocking or tragic here. If we’re willing to shrug off two dozen kids shot dead at school two weeks before Christmas, then a bunch of Congressmen getting sprayed is just the price of doing business. And it’s not like one political party has gone to the verge of incitement to protect the guns. Or like major network news gives airtime to those spreading lies and slander for the sake of protecting the guns. Oh wait.

Lie down with pigs, you’re gonna get shit on you. Make your highest political value “No Gun Left Behind” and eventually someone’s getting shot. Might be time to change how we think…but we won’t. 

“…oh that’s a huge botch”

Theresa May had a slim but viable Tory majority locked in through 2020 in Parliament. Owing to the Fixed-Term Parliaments Act the Conservatives passed to protect their LibDem coalition back in 2011, even the change of leadership and the chaos of Brexit meant they could still hang on without having to call a new election. But she decided to call an election in 2017 anyway, thinking that her party could extend its grip on government and have a mandate to push ahead with Brexit.

Whoops.

The Tories no longer have a majority at all. They are dependent on a right-wing Northern Irish party for a majority, one that some Tories are already pushing back against. The Conservative backbench has already gotten the heads of May’s advisors as the quid pro quo for not turfing her out with a leadership contest which would almost certainly send the country out for another election. And oh by the way, EU negotiations over the terms of Brexit are scheduled to start a week from today – and the mandate for a hard Brexit has been blown into a billion pieces as it becomes increasingly clear that the Great British public would kind of like a do-over on the mistakes of 2016.

So say we all.

If there’s a recurring theme in 2017 politics worldwide, it’s that people are puking up the populism they were over-served in 2016. In France, an explicitly pro-EU technocrat crushed Yet Another Le Pen in the election for President – and the pundits said “well, he won’t have his own party in Parliament, he’s going to struggle, that’s the real election” right up until Macron’s new party won a landslide majority in Sunday’s voting. Angela Merkel, who some thought might be on the rocks coming into 2017, looks to be shaping up well as the new Leader of the Free World. And oh by the way, the American President is now down to a -20 split on approval rating.

This is the problem with American government: we have parliamentary politics but a divided-powers form of government. It yields a huge structural advantage to a party that wants to undermine the role of government and a party that bases itself heavily in small, rural, overly-white states. When the two dovetail, you wind up with what we get now: the United States of Alabama. While the majority of the country would like to be rid of it – after all, a plurality voted for the other leading candidate, not the winner – we don’t have a lot of options for solving this thing in the near future. Probably not until 2020, if we’re being honest.

Because here’s the thing: we’ve had two impeachments of a President in American history. Neither resulted in a guilty finding at Senate trial. The most recent one was an explicitly political act to try to undo what the GOP couldn’t accomplish twice at the ballot box, a ginned-up perjury trap formed from a six-year fishing expedition. The secondary impact was to utterly tar impeachment as a political process and effectively undermine its legitimacy for the future – maybe as revenge for Nixon, who knows, but the point is this: we’ve never actually removed a President that way. Nixon resigned rather than go to trial in the Senate. There are no circumstances right now under which this Senate could muster the votes to convict a Republican President.

And what if they could? Even if “the system works,” as the goo-goos like to say, think about what it would take for half the GOP to turn on their incumbent. Something really, really bad would have to have happened, and that means that sure, he’s gone, but we’re also reckoning with the consequences of whatever made it possible for the Senate to do the deed – collusion with foreign powers, massive abuse of authority, a dead girl and a live boy, whatever – along with, in all likelihood, the activation of a rump faction that has spent decades now dying for an excuse to use their guns. People hoping for a neatly-timed assassination are asking for a world of nightmares – if you thought the country went seven bubbles off plumb after September 11, and it did, try to conceive of the world of shit that would be unleashed if someone took a shot at the President and succeeded. Hint: you don’t want those problems.

The time to sort this thing out was in 2016. We don’t have the same sort of puke-and-rally mechanism a parliamentary system comes with. Right now is the time to work on flipping the Congress in 2018 to further mitigate the damage, but thinking we have an escape route in less than three and a half years is a fool’s errand. In the meantime, there’s a perfectly usable political party and institutions of government, and the thing to do is use what we have right in front of us instead of trying to magic up some sort of miracle-erase undo solution.

Because no matter how things turn out – even if Kamala Harris has her hand on the Bible on January 20, 2021, looking at a 350-seat Democratic House and 70 seats in the Senate, and the land of milk and honey and fried catfish is at hand – we’re never going to have not elected Donald Trump. That’s something we have to deal with, not wish into the cornfield. And there are a lot of implications to that. Of which…

In the meantime…e-o-leven

If there’s one thing you should take away from today’s WWDC keynote, it’s that Apple has heard the mob of developers darkly warning that it’s been four years since the Mac Pro and that Apple isn’t committed to its professional market. This murdered-out hot-rod iMac Pro may be the replacement for the 2013 trashcan, or it may just be a desperate placeholder until a new pro tower can be conceived and birthed. Either way, it’s hard not to hear the subtext of today’s keynote as “Baby! I’ve changed! I swear! Don’t leave! I’ll give you anything you want! I’ll give you APIs for machine learning and augmented reality! Just stay!”

Oh yeah, about that. It looks like Apple is finally arriving for the Great Augmented Reality Fight – and where only a handful of phones support Google’s Project Tango, it looks like Apple’s plan is to make their solution run on any of their 64-bit hardware back to the iPhone 5s.* Some better than others, certainly, and almost certainly optimized for some notional 2017 iPhone (7s? 8? 10th Anniversary?), but for developers, there will be APIs and a pre-existing installed base. Where Apple hasn’t been first to the fight, they’ve generally arrived with a finished solution requiring only some polish once it’s made contact with reality (iTunes, iPod, iPhone…), so it’s not hard to imagine that within a year, Apple will be punching at equal weight with the likes of the Oculus Rift or the HoloLens or Tango.

Except, of course, where the Apple Watch is concerned. We have yet another UI coming for this watch, the third major iteration in as many years, and this one looks like it might stick. When Google was coming up with their watch, I hypothesized that the goal was basically Google Now on the wrist, and that seems to be where watchOS 4 is heading, using Siri and machine learning to mine your device – more on that in a minute – and put the sort of “Your Day Today” stuff on your arm in order. Here a meeting, there a ticket, there driving time home, here tomorrow’s weather forecast, with reminders and notifications slid in as required to alert you to email or advise you to walk that last four minutes you need. There have been God only knows how many attempts at this on iOS – Donna, ARO Saga, Osito, Google Now itself – but they all relied on you giving the app access and mostly depended on you using Google services for mail and calendar. Four years on, this is built in at the OS level, and the key thing will be whether you have to use Apple’s primary apps and services or whether it can work with your Outlook calendar or your Gmail or the like. 

And of course whether that data resides on your phone. Apple is all-in on personal privacy, or at least enough to make it a deliberate selling point. They went out of their way to assuage concerns about their always-listening speaker, to emphasize that their machine learning is searching your phone ON your phone, that your data stays your data. And as the only one of the Big Five tech firms in America whose business model relies purely on selling a physical product for cash on the barrelhead, they can get away with it in ways Google and Facebook simply aren’t capable of and never will be. As iMessage moves into the cloud, of course, this could get complicated, because now some of your data will be obligated to reside on Apple servers somewhere – but that’s why we download Signal, right?

There is a dividing line with this technology. I saw it to some extent when I was doing remote workstation support through ARD and would take over someone’s screen. About half the time the response was “ooh, cool!” and the other half it was “eww, creepy.” That nano-millimeter between cool and creepy is where Apple is trying to tiptoe, trying to offer you a magical experience without giving you reason to engage the suspicion module.  Some of the split may be generational; the Snapchat kids are probably less bothered about the notion that a company could see all their stuff. Then again, when your experience with the Internet began as “they could be ax murderers” instead of “the app said it was OK to get in this guy’s car,” it’s not surprising that your reaction to some of this stuff is “ask questions first and upgrade later.” The goal for Siri – whether in your phone, your arm or the speaker you just parked under your TV – is to be JARVIS without becoming Big Brother.

About that TV…it takes about as much time to take a leak as Tim Cook spent talking about Apple TV. In fact, apart from the news that Amazon Prime Video, the last major streaming holdout, will be available by the end of the year, there wasn’t any news. Instead we got the HomePod, the Apple answer to Alexa and Google Home, and while it is pitched as a music device first and foremost, its Siri and HomeKit integration suggest that it will be the Apple hub for home in a way the Apple TV might have otherwise been. Which is interesting, given that the Apple TV has its own named operating system and App Store and the like.

Two possibilities here, both of which I suspect are true. One is that Apple is getting nowhere with its television plans. Rights-holders and broadband companies aren’t about to play along, especially with Ajit Pai ball-washing the cable companies with every decision, and without some sort of actual television service of its own, the Apple TV is a glorified and overpriced Chromecast. The other possibility is that Apple is really serious about Siri this time, and believes that by the beginning of 2018 it will be a sufficiently capable user interface to be the only interface in very strictly limited circumstances. Play this song, what’s the weather tomorrow, did the S&P 500 close up or down. Not a whole lot, and not appreciably more than you can get out of Siri now, but by using a very tightly-selected few “domains” and making them work well, Apple is betting that it can make voice a plausible UI mechanism which can then be expanded as needed.

It’s an interesting bet, and one that goes along with the emerging meme that the computer itself is being abstracted away. Google is more or less up front about this, saying that Google’s services are the real computer and that your watch or TV or speaker or phone or car or whatever is just your chosen portal through which you interact. Apple is doing something similar, trying to homogenize your phone and laptop and speaker and watch and tablet and desktop into one big lumpy pillow which you can fluff up into whatever configuration you presently require. (The addition of a dock and drag-and-drop and enhanced multitasking and a FILE SYSTEM BROWSER and the like to iOS 11 for iPad suggests that we’re not that far off from one OS to rule them all – “appleOS” maybe, but just as likely “siriOS” at this point. Are they siriOS? Possibly.)

The pieces are all there.  The voice recognition is finally approaching usability. The machine learning – if you can get past the suspicion – is starting to get better about surfacing the right information contextually. If you can go between watch and headphones and the larger phone in your pocket without pulling it out, maybe you do only need just the one 5-inch AMOLED-display VR-ready 2500-mAh-battery iPhone Superba that docks in your 4K display at the office instead of a phone and a tablet and a laptop.

In a lot of ways, then, Apple spent today asserting that they’re still here, and they’re still serious about everything, and that they want to build the future. It’s not time to sell the stock yet.

 

*ETA: according to Phil Michaels, the baseline is an A9 processor running iOS 11, so iPhone 6s and later. The jury is out on whether that includes the iPhone SE, which packs the A9 but not the 3D Touch components.

flashback, part 85 of n

In the beginning, it was still shirt and tie four days a week. Casual Friday was olly-olly-oxen-free, unlike at Sonat, where it just meant you could leave the tie off – any sort of button-up or polo was fine, as were jeans, and I still had those Converse leather All-Stars that I recall wearing, the last time I owned basketball shoes. I still had the Elk for outerwear, the big oversized leather jacket I’d foolishly bought that first semester at Vanderbilt, and (briefly) had an actual trench coat for rain before I quickly returned it at JC Penney to have the credit applied against a MasterCard that was already straining against its credit limit.

It was sometime around the time my dad died that National Geographic went away from any sort of business attire. It wasn’t quite on the level of a dot-com but it was a lot more casual than the rest of DC. I was casting about for the right jeans, and went through several different manufacturers – Britches, Eddie Bauer. It wasn’t until California that I would settle on Levi’s 501s for a decade, followed by the addition of LC King Pointer Brand work jeans – I never owned either one in DC. Same with sunglasses – I went through maybe half a dozen pair of assorted manufacture, here some Clubmasters and there something cheap and there a Fossil or an Oakley. Never adopted amber lenses or the New Wayfarer until California.

I never wore a hat in DC that much – it was too hot in the summer and made my hair a mess, but I did have a Boston Red Sox batting practice hat for when we played softball. Not long after my future wife moved to DC, it was replaced by a Giants BP hat. I certainly owned some Redskins headgear, and there were lids from my alma maters, but they didn’t get much run. If it was hot weather, I didn’t have a hat on, and smartly so – I wouldn’t need a hat on a regular basis until I finally cut my hair down after moving west and getting married.

I wore wide-wale corduroy in the winter and flat-front khaki in the summer. I had mostly black polo shirts in cold weather and mostly untucked button-up resort-type shirts in hot weather, bought on sale from the Macy’s clearance rack at the Ballston mall. Thus my future surrogate big sister’s dig at my packing: “black shirt, black shirt, black shirt, Hawaiian shirt, black Hawaiian shirt.” Between Easter and Labor Day, I didn’t even wear socks – sometimes with fisherman sandals but more commonly with low four-eyelet Dr Martens brown oxfords with a padded collar…and no socks. I owned half a dozen pairs of assorted DMs in DC but never steel toes before joining the fruit company in Cupertino. But the DMs took care of the endless search for the right brown shoe that had consumed my last year in Nashville.

I carried a much larger Leatherman, every day. I carried a Zippo lighter, every day. For years, I carried a pipe and a tobacco pouch. I carried a pager and a cell phone and a rolled-up magazine and a Walkman or MP3 player or iPod and sometimes a Blackberry or Palm Pilot, all of which were completely replaced by the end of summer 2007 with just an iPhone. I dreaded summer because I needed the pocket space of a jacket, and once invested in Dockers with concealed cargo pockets, zippers down the outside seam, so that I could get my smoking and technology apparatus hidden away.

So many of the things that are central to my wardrobe now – the LC King and American Giant, the Indy boots and canoe mocs and all the American-made workwear, the button-fly 501s and Ray-Ban New Wayfarers with polarized lenses, the New Era low-crown 5950 baseball caps, the endless American Apparel T-shirts in 2XL, the Harris Tweed and the Buzz Rickson and the Filson trucker jacket, the black plastic Birkenstock shower shoes – all of that has happened since I came West. Apart from the Indiana Jones leather jacket and the ubiquitous khakis from April to October, I didn’t have any one particular thing that stood out about my DC wardrobe – just a bunch of pieces in and out in search of a unified theme which wasn’t even “failed grad student gone wild” or “upscale vagrant” or “man wearing clothes so he doesn’t catch an indecent exposure bid on the Orange Line at 9:15 AM”.

Maybe I was more liminal then. Or more creative. Or the climate required flexibility. Or maybe I just had it in my head to handle the bandwidth of a little more variability in how I left the house. I’m not yet of the Steve Jobs school of simplification, but there’s a lot more commonality in what I wear out the door in the mornings now, because the climate here is basically “April in Birmingham” 12 months a year. There was certainly no time in DC when I owned five identical T-shirts, four identical pair of jeans, three identical T-shirts of a different style, eight identical pair of black socks and a dozen different baseball caps with almost as many different pairs of shoes.

I don’t know what any of that means. I feel like it means something, or ought to, but I’m not the one to figure out what it is.

More distracting thoughts

So I bought a SIM from US Mobile ($4) and set up a sample plan for them. 100 minutes, 100 texts (both the minimum) and 4 GB of data. The total cost per month is $27. A more realistic configuration for me would probably be 300 minutes and 10 GB of data (just to be safe) but even that is all of $39 a month. They’re backboned off T-Mobile, so I have a pretty good idea what their network is like, and in the early testing on my iPhone 6 (to eliminate iPhone/Android differences in how stats are calculated and bars shown) it seems to be not-terrible. The only caveat is that they don’t support visual voicemail, although the fact that I even thought about voicemail proves I’m over 40.

In the course of doing this, I have come to realize that yes, the iPhone 6 is just a little too big. The Moto X has the same size screen in a smaller package. Not much smaller – the difference is 2 millimeters in width and 9 millimeters in height – but when those differences are all bezel and no screen whatsoever, it makes a world of difference. It also doesn’t help that the iPhone 6 requires a case if you want it to lie flat instead of rocking on the camera hump, which only makes matters worse (not to deny that the Apple leather case is very nice). And of course, the iPhone 6 and 6s are infamous for their battery performance, so much so that Apple had to produce a first-party battery case. But that is neither here nor there at present, and Apple does deserve credit for keeping the same outward body dimensions (and thus the same cases and accessories, apart from headphones) for three generations of the device (as well as keeping the iPhone 5 design from 2012 intact for the SE).

Again, it drives home how the Moto X got it right. Just a hair over 5 inches by 2.5 inches, with a 720p display in AMOLED, 2 GB of RAM, always-listening voice command (“listen up, Friday”) and a 2000 mAh battery (which Apple has yet to fit into a phone that isn’t a cafeteria tray). What I wouldn’t give…but again, neither here nor there.

The point is, I feel like there’s a sweet spot in which a larger phone could be made to work for me. In my continuing quest for the one perfect solution for everything, the notion that there’s one phone that could be the daily driver and the travel phone and a viable Kindle substitute and have my EU SIM card in there at the same time and have replaceable parts in case the camera or battery go south…

And then I see Andy Rubin’s new Essential phone. Standalone device, presumably not bound to carrier updates, ceramic and titanium and a HUGE full-screen display (with a notch for the camera), a real premium device with a 3000 mAh battery…and it’s actually longer (by 4 mm) and wider (by 4 mm) than the iPhone 7 already is (although thick enough to completely contain the camera, so well done there). Which means that by the time you put the case on the iPhone 6 I currently have, it’s…the same size. Just a hair too big. All screen, which is nice, but it’s still a phone that’s just a hair bigger than the phone that was just a hair too big. And if the Pixel and the iPhone 7 are already a hair too big, I don’t have an answer. I don’t think we’re going to get a phone with a 4.7” display in a compact one-hand-able format again. Which is why I’m back to “you will pry this iPhone SE from my cold dead hands.” I haven’t had a work phone upgrade in three years, because I wanted this SE to be mine and mine alone, and I’m glad I did (and that I have a viable option for continuing on prepaid for under $40 a month if circumstances dictate).

Special Edition

A year on, I look at the other things around the cheap-phone space. I only paid $300 for my Moto X in 2014, so any replacement Android device needs to be no more than that. And the thing is…it’s not there. Nothing else at that price point has an AMOLED display AND has NFC AND takes the same size SIM card AND comes in at a comfortable one-hand size. Never mind the crapsack cameras that come with Androids at that price point or the virtual guarantee that you’ll never see more than one major OS upgrade (if that). Sure, it seems nice to have the promise of a 4000 mAh battery(!) in the Moto C Plus, or completely unbranded Android in the Nokia 5 (hopefully with the fit and finish we expected of our old pals from Espoo), but there’s always some kind of compromise.

And then there’s the iPhone SE, which I bought cash on the barrelhead last year for $500. Although in a way it was actually kind of free, because it was completely paid for by my share of the court settlement over Apple, Google, etc. conspiring to restrain employee movement. In any event, it was the first cell phone I put on a credit card of my own since 2014, and only the second since 2010. So it had to be something special to make it worthwhile, especially since the iPhone 6s was the first iPhone I found less desirable than its predecessor.

After one year of use, the SE has proven to be special indeed. I have bopped back and forth between phone cases, and I’ve still pulled out the Moto X on nights or off-days when I needed to be more fully detached without giving up connectivity altogether (read: I want to see Instagram and I might get a Slack message from Kazakhstan). And I still greatly prefer to use my Kindle Paperwhite for reading, because the SE’s display is indeed a little narrow for everyday use (but serviceable in a pinch). But after a year of everyday use, I took the phone to an Oakland A’s game last weekend, and never needed to pull out the external battery despite six hours of Slack, Instagram, text messaging, taking pictures, paying for beer and generally carrying on out and about with friends.

It fits in a pocket. It fits in one hand. It plays nicely with the car’s integrated CarPlay console or with my new Bluetooth noise-cancelling headphones (and I can walk to the fridge and back at work, leaving the phone on my desk, and be just fine). It took a prepaid EE SIM in London and was just as useful on every frequency band as in the US, and it took a prepaid T-Mobile SIM in San Jose and gave me top signal at a California League baseball game and on a fishing boat off Santa Cruz. It takes amazing 12 MP pictures that no point-and-shoot I’ve ever owned could take. It has NFC for payments and a fingerprint reader to unlock it, and it works for airplane tickets and baseball tickets and concert tickets alike. And because it came out in the spring of 2016, it probably has a good three years of OS updates ahead of it.

And this one doesn’t belong to work, and isn’t locked to a carrier, and isn’t hobbled by a contract. This phone is all mine, stem to stern, and I could quit work tomorrow and pop my T-Mob SIM in there and carry right on until I settled on a long-term deal with them or with Cricket to carry me forward for less per month than I ever paid before abandoning my own AT&T foundation account in 2012. I can wait for phone makers to come around to the fact that yes, there are people who want something that doesn’t need a purse, and not everyone needs a 5-inch display.

Two phones I’ve bought since 2010. Neither has ever given me a reason for regret.

For the culture

(Mind wandering in the cause of distraction.)

For some reason, I am a person who completely missed out on what is commonly thought of as “geek culture.” In my life, I’ve seen maybe three episodes of MST3K. I saw a good few episodes of Star Trek – both original and Next Generation – but never really got into them aside from the Borg cliffhanger in 1990, which in my mind was still a better Trek movie than any of the Next Generation-featuring ones. While I was drawn into Doctor Who in my youth, it’s sort of gone by the boards lately. I was into comics for exactly four years between 1983 and 1987, and into tabletop RPGs maybe 1982 to 1987 tops. I was a Star Wars obsessive, obviously, but that just made me a kid in the 70s. I never got much further into the Expanded Universe than the original Thrawn trilogy (and missed very little, by all accounts; the other EU books I did read were pretty much crap). I’ve still never seen Monty Python’s Life of Brian, a single episode of Red Dwarf or Babylon 5, or any of seasons 1 or 4 of Blackadder. I’ve read one Discworld book and don’t remember that much of it, and my sci-fi literary canon basically begins with Connie Willis and ends with William Gibson.

I didn’t have a computer at home until I went off to grad school, aside from the times over Christmas break when my dad would bring one home to noodle around on (never to any great effect). I never did online gaming other than at work, in the days of Quake and Unreal and original-flavor Call of Duty. I bought a PlayStation 2 mostly as a DVD player and owned exactly two games for it (NCAA Basketball ’04 and Arena Football, neither of which I played more than twice). My first BBS membership was in 1994. And while I was on Slashdot in the late 90s, I’ve never had an account on Digg or Reddit or anything similar. I don’t even follow Wil Wheaton or Felicia Day on Twitter.

In short, while I did hit some of the most obvious markers and am broadly conversant in the lingua franca, I never really bought into capital-letter Geek Culture. It’s possible that there just wasn’t that much of it accessible in exurban Alabama in the 1980s, or that the pre-Internet world made it a lot harder to find and connect with things. Or maybe it’s just the same pop-culture blind spot that I still have to this day (a short list of current things I’ve never seen an episode of: Game of Thrones, The Sopranos, Breaking Bad, Stranger Things, House of Cards, Veep, The Walking Dead, Justified). But a healthy grounding in geek culture was necessary if you were going to fit in with the people at my undergrad who didn’t fit in, so…well…there you go.

It’s a tough one. On the one hand you want to reject that there’s any qualitative difference between, say, Tolkien obsessives and Auburn fanatics. But on the other hand, you don’t want to reject caring about everything and anything. But on the third hand…I suppose on the third hand, I’ve rejected a subculture that I was entirely fit for, except inasmuch as it hits the mainstream of American life and not always then. (I mean, I own almost every Marvel Cinematic Universe film, but I’ve still never seen the 2008 Incredible Hulk.)

Then again, it’s not like I was in the mainstream of popular culture…ever. I’ve never seen an episode of Three’s Company or Miami Vice, of Mad Men or The Big Bang Theory (to go to opposite ends). Never saw Dallas or Dynasty. Never watched past the first 20 minutes of Lost (starting on an airplane might have been a mistake). Never seen a single minute of a single Shonda Rhimes show. These days, my television viewing consists of Silicon Valley, Agents of SHIELD, Tiny House Hunters, House Hunters International, maybe Graham Norton and/or Top Gear, and an occasional smattering of rugby, Premier League soccer or the Oakland A’s. Plus reruns of California’s Gold, of course. And that’s about it. No interest in professional sports on TV otherwise, and precious little college sports aside from bowl season or March Madness. And since you can’t really participate in American sports culture if you’re punched out of the NFL…

I was listening to a podcast about expat life for Americans abroad, and apart from the fact that every single one complains about the dearth of decent Mexican food in France or New Zealand or Germany, one comment stuck out, when someone said that as an expat, she felt kind of lost – not a part of America anymore, and somehow different when back here, but not really a part of her now-home country either. And that struck home with me, largely because it fits so well. I still don’t feel wholly Californian (though I am determined to make the effort more than ever these past six months) but wouldn’t feel right returning to the DMV or Nashville.  I definitely didn’t fit in whilst in undergrad but didn’t fit in with the subcultural alternative either. Somehow, I have managed to make myself fit nowhere exactly, which isn’t always a bad outcome. But it dovetails neatly with the “broad but not deep” which has characterized so much of my life…of which, blah blah blah.