“I’ll see you in twenty-five years.”

(OBVIOUSLY ONE IMPERIAL FUCK TON OF SPOILERS FOR “TWIN PEAKS: THE RETURN” FOLLOW)

 Three years ago, when we found out there was going to be new Twin Peaks, it was the most inconceivable thing imaginable. The notion that after all this time, we’re headed back to that tiny town in Washington – well, what did I say when I rediscovered the series a year before the announcement? 

The thing is, Twin Peaks in its time tended to parallel my life. It started with a bang in the spring of 1990, when I was through with high school and anxious to get on with my future. I even bought the cassette single of the theme, deliberately thinking to myself “you know, this would make a fine song to share with the new girlfriend whom I will undoubtedly meet once college gets going.” And then, when the show came back in the fall, it slowly deteriorated until petering out in April…which is just about how my freshman year went. One long slow deterioration until by April, it was obvious that I wasn’t going to be able to save this bird from a hard landing. And just like my college career, the series didn’t have a happy ending either – just a cliffhanger with no obvious hope for how things could be saved.

Twenty-five years on and eighteen episodes later, it turns out – surprise – everyone got old. It’s jarring to have gone from the original series straight into the new one. Everyone got older, everyone got tired, you could see the weight of the years on every single person in that town of 51,201. Hawk especially struck me – hair gone white, moving slowly, connected to a dying friend who maybe only he understood. Special Agent Albert Rosenfield, gone from incisive snark wielder to just-hanging-on veteran agent with the sort of world-weary hangdog look normally associated with Tommy Lee Jones characters. Big Ed, Norma, Nadine, Dr. Jacoby, Shelly, Bobby – everyone’s older, everyone’s put on weight or gotten gaunt, and if you were already in the workplace back then…well, guess what, you’re probably doing the same job. I was off to college when Twin Peaks came out, and in some ways, I’m still there myself. There’s no question of picking up right where you left off, and when you haven’t seen these people in forever, they certainly aren’t going to look like you remembered.

It’s definitely dated, I admit.  The pacing isn’t quite as bad as you’d expect of an 80s prime-time soap opera – and make no mistake, that’s what this is – but then, some of the slowness could be camouflaged by the abiding weirdness David Lynch brings to the table every time out…So many plots and story lines that went nowhere, seemingly. Anything with the Packard Mill got boring in a hurry – Piper Laurie’s scenery-chewing bitchery seems much more suited to something like Dynasty.  The switch from the plot being driven by the expanding Renault crime organization to being propelled by Windom Earle seems fairly abrupt.  And James off with his mysterious woman served no purpose whatsoever.  No wonder it went off the rails – there was just too damn much to keep track of.  Lesson learned: you can be complex without being complicated…  

The pacing sure didn’t change. If anything it got worse, and you wonder what the show would have been like if it had stuck to the original order of only nine episodes. And the hanging threads – worse than before, if anything, including having no idea what the story is with some of our most beloved characters. This is one you definitely have to go back and watch from the beginning to see if it starts to make any more sense after the fact, but I’m not a hundred percent sure it will. And in some ways, that’s OK. This isn’t really about driving the plot to its conclusion, it’s about the setting and the atmosphere and the presence of a strangeness that you will never understand or see the end of. Like, well, life. We’ll never know why Sarah Palmer was like that. We’ll never know what happened to Audrey. We’ll never know if that was really Laura or what kind of world they’re in or when or where. There’s every chance that this whole thing is some sort of Owl Creek fever dream in Coop’s dying moments somewhere in Philadelphia in 1989.

Oh, and Josie Packard’s never getting out of that drawer knob, I guess. 

The look is equally dated, although once again that could be partly Lynch and possibly just an affinity for the era. Let’s be honest; I was 18 and pretty much every one of the women on the show still holds up… Norma in particular is still lovely, although she (and presumably Big Ed) was younger then than I am now, which is kind of disturbing to think about. I’m still rooting for those two, of course – it’s tough to be with the one you love when one has a spouse in prison and the other has a superhumanly strong one with an eye patch and a drape-runner fixation.

The feel was dated, and deliberately so. Lynch is committed to his surreal 1950s horror-beneath-the-surface ethos, although nobody does black-and-white better. But the actors were dated, too. Many others have said it, but this show didn’t shy away from the brutal fact that we all get old, we all die, and not everyone gets a happy ending. In so many ways, the payoff was in episode 16, when Dale Cooper wakes up, pulls out the IV, is back in the suit, crisply demands a revolver and a flight to Spokane, and says “I AM the FBI.” That’s what we all want to imagine it could be like – that we wake from the dream, somehow, and are fresh and ready to go, capable and confident, with the opportunity to take care of that unfinished business. And just like Coop, the dream of unfinished business runs headlong into the reality that time runs one way, you can’t go home again, and everything that happened really happened with no undoing it. The notion that you can always proves to be an illusion. Always.

But that said, I’ll say this, spoiler-free: if you didn’t jump off the couch screaming and punching the air triumphantly at the 10:00 mark of episode 15, you don’t have a soul. If that’s the only payoff from the original that we ever get from this series, it’s the one I would have wanted. And it’s proof that sometimes, rarely, you get that piece of a dream you hadn’t thought would ever come around again.

It wasn’t the old Twin Peaks. It never could have been. But it was enough.

Locking the barn after it’s burned down


Ultimately, the thing is this: at some point the Google Now-like service has to be something that does all its data mining and processing locally on the phone itself. Independent or at least agnostic of service provider, able to get useful info out of your work email without compromising your security in doing so, and able to leverage whatever personal email provider you use without relying on Google’s technology. In a way, that’s already present in iOS – for instance, if you get email with a tracking number from UPS or FedEx and tap on that tracking number, you’ll see “Track Shipment” as an option, irrespective of whence came the email. Apple Data Detectors – a technology that Apple first rolled out in 1997 and then largely ignored until two or three years ago – can do that right now, already parsing out addresses to be sent to the address book (or soon to Maps) or dates to be sent to the calendar. So the technology is there, and it doesn’t take much to suggest that it could be extended to include things like flight confirmation numbers or the like.
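At its core, a data detector is just local pattern matching over text, which is exactly why none of it needs the cloud. A toy sketch in Python – the patterns are deliberately simplified stand-ins of my own devising (the UPS “1Z” shape is real; the FedEx and flight patterns are loose illustrations, not the real formats):

```python
import re

# Illustrative patterns only -- a real detector uses far more robust
# grammars plus checksum validation. Nothing here leaves the device.
DETECTORS = {
    "ups_tracking": re.compile(r"\b1Z[0-9A-Z]{16}\b"),
    "fedex_tracking": re.compile(r"\b\d{12}(?:\d{3})?\b"),
    "flight": re.compile(r"\b[A-Z]{2}\s?\d{1,4}\b"),  # e.g. "UA 1549"
}

def detect(text):
    """Scan a message body locally, returning (kind, match) pairs."""
    hits = []
    for kind, pattern in DETECTORS.items():
        for m in pattern.finditer(text):
            hits.append((kind, m.group(0)))
    return hits

hits = detect("Your package 1Z999AA10123456784 ships today; flight UA 1549.")
# finds the UPS tracking number and the flight designator
```

The real implementation layers context and validation on top, but the point stands: parsing out “Track Shipment” from an email body is a local-computation problem, not a server one.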

All of this is a very roundabout way of saying that I fully expect an Apple watch before long, and I expect it to rely on the functionality of iOS 7 to deliver a thin but satisfying slice of data to a glorified wrist-bound FitBit.  And in doing so, obviate the need for the phone itself to do a lot of the heavy lifting that currently makes it difficult if not impossible to use the iPhone itself as your fitness/presence tracker (see: the battery-slaughter of Saga or Human or Moves).  Anything that can be staffed out to something with its own separate battery is good for your phone.

So now we wait.  Every man his own Big Data.  It’s coming.

– 4 Sept 2013

 

I think I might have been onto something. Four years later, this is more or less what Apple is pitching with Core ML. The promise is that all the processing of your data, all the heavy lifting of sorting through your information to find the patterns and tease out the useful interactions, can be done ENTIRELY on the phone without ever exposing your information to offsite processing. Do I buy it? Maybe. The fact of the matter is, though, Apple is the only vendor in this space explicitly touting the privacy and security of their solution. Amazon pays it lip service, and Google…well, Google has never made any secret of the fact that you’re the product, not the customer. (Another reason I try to avoid their products at all costs, and mostly succeed – except for occasional use of YouTube or text messages to a Google Voice number, or the old Moto X experiment.)

The problem is, there’s just so much that goes to the cloud anyway now. It’s not just storage – Apple had the 100 MB iDisk product as part of the original iTools in 2000, which is probably why Steve Jobs dismissed Dropbox as a feature rather than a product – it’s processing. Try using Siri offline. Doesn’t work. That external processing is mandatory for parsing voice commands. Signal is only slightly above battery in Maslow’s Hierarchy of Modern Needs. And then there’s the whole bit about how the Big Four kind of have you stuck. Only Apple isn’t really doing much with services beyond what is needed for independence from the others for email and music and the like – Google, Amazon and Facebook all have the ability to reconstruct fairly detailed profiles of your interests and behaviors.

And that’s before taking into account the fact that their information is probably not dissimilar to what the big credit bureaus have, and Equifax just demonstrated how well protected that is. The majority of American adults now have their golden-ticket personal ID out somewhere for the use of nefarious types, to the point that we may have to institute credit-freeze-by-default as a security measure. Which they don’t want, obviously, because they make their money providing your information to others. And before saying “that’s different”, consider that the use of Facebook and other social media in credit scoring is already out there.

We’ve poured our lives into the Internet with no thought of security. Facebook in particular got away with the greatest bait and switch in history, offering a walled garden in exchange for your real identity before dynamiting the walls. Throw in the whole “we accidentally the election” and Fuckerberg deserves to be in Gitmo trading cigarettes, not acting like nobody notices him running for President.  Too many companies have spent too much time and made too much money off our data without protecting it. A genuinely populist movement would be pushing back against that. Hard. But we don’t have populism in this country, just redneckery dressed up as populism by the kind of assholes who assume that “people” means “white.”

Meanwhile, get as far off Google and Facebook as you can. Maybe you can stay secure and unprobed. I doubt it, though.

Misuse it and lose it

It’s basically impossible to argue that the Second Amendment of the United States Constitution isn’t past its sell-by date. Written at a time without standing armies, when the militia consisted of a state mandate of authority and training over every able-bodied man with a gun, and at a time when a gun was something you could fire once a minute if you were lucky, when a large city was measured in the tens of thousands – it’s almost impossible to make a case for ownership of military-style weapons in a 21st century urban environment.

After all, what is the point? Sporting use? Most states won’t let you use the NATO-standard 5.56mm cartridge for hunting because it’s too light, so that AR-15 clone has to scale up to something heavier, and besides, how many shots do you need to take at the deer without reloading? Two? Three? Unless that deer manages to return fire, anything over five is bullshit. OK, so how about the ever-popular “bullet box” as last resort of freedom? The United States has a standing army, and it’s huge even before you add in the Navy, Air Force and Marines. Consider things like Ruby Ridge, Waco, and contemplate the words “tactical airpower”. The people who want to need the guns don’t seem to grasp the level of asymmetry a modern government can bring to bear. Don’t forget, at the end of Red Dawn, all the boys were dead.

Yes, there is a case for sporting weapons and certain personal-defense options. The necessary tools for that basically peaked around the mid-20th century, and they’re just as effective as they ever were. (The sidearm of Marine Force Recon, for instance, was designed originally in 1911.) Time was, you could enforce the law just fine with a heavy revolver and a pump shotgun. Probably still could, probably still should in a lot of cases. But in the last fifty years, the net result of the Second Amendment has been to flood the country with endless semi-automatic weapons and bottomless magazines of ammo that have as their intended function the killing of troops on a battlefield. The time to stand up for the Second Amendment was when people were ignoring the “well-regulated militia” part of it and using the rest to legalize M4-geries and SKS imports and 100 rounds in a clip.

This is of concern not because of the Second Amendment, but because of the First.

The First Amendment absolutists are coming out of the woodwork in defense of Nazis. White supremacists, anti-Semites, racists and bigots have somehow managed to become a larger, louder, more cohesive force in American life than in the last fifty years combined, and almost 100% of that can be put down to the use of social media and electronic communication. Infamously, the first email spam I ever received – over twenty years ago – was some sort of white nationalist screed, at a time when it really took some effort to deploy spam over email rather than just saturation-bombing USENET groups. In the ensuing decades – and especially with the rise of Facebook and Twitter – it’s become incredibly easy to find like-minded people, and in a lot of cases, to rally them to attack unlike-minded people with the same tools. 

The “freedom of speech” argument in the First Amendment was made at a time when speech was just that: speech. Or written letters. Or published books or newspapers. You had to go pick up the printed matter and read it, or speak to the person, and if they wanted to force their speech on you, they had to follow you around yelling at you – at which point, things being how they were in the 18th Century, you could probably take a swing at them and be just fine. Now, though, technology means that if you are on social media at all, it is possible for literally thousands of people to bombard you with their speech 24/7/365, and your options are either to get offline or make a career out of blocking them. As it stands, most of these companies – Twitter and Facebook most egregiously – have gone to great lengths to paint themselves as “the free speech wing of the free speech party,” and view this kind of amorphous perpetual harassment as merely the price of doing business.

Which is insane.

Back in the day, that kind of harassment was simply impossible. If a person comes up to you and starts spewing racist invective, they’re right in front of you to confront and deal with and be known to you. Now, the opportunity cost for tens of thousands of people to target you, personally and directly, is effectively nil. If it had been possible for the entire population of Philadelphia in 1787 to materialize in Ben Franklin’s house, masked and anonymous, to abuse him as a Francophile libertine pervert who spent all his time laying pipe on old broads, and vanish as soon as he looked at them only to continue the moment his head turned, I suspect there would have been a slightly different conversation around the Bill of Rights.

If you want to defend free speech, fine. If you want to say the cure for abuse of free speech is more free speech, fine. But you’d better have a plan for allowing free speech without facilitating abuse, harassment, and physical violence, and mitigating them when they do happen. The alternative is the risk that after years and years of a surfeit of utter libertarianism-without-consequence bullshit, the First Amendment will prove itself as obsolete as the Second.

On the eve

I don’t really watch college football anymore. I think the disaster of 2013-14 beat it out of me, to be honest. Cal gets a new coach who takes them to 1-11, proving once and for all that it is easier for a camel to go through the eye of a needle than for the Air Raid to win meaningful games. Vanderbilt has a horror-show incident in the offseason, yet somehow matches its best win total with eight wins plus a bowl victory. And then the coach, who spoke endlessly about how great the program was and how recruits should build a tradition instead of renting one, took many of them off to do just that, breathing life into a program whose “punishment” was to finish with the same record as our best ever. And then we got our most desired candidate, who proved to be in over his head, and we crashed right back down to 3-9 and a poor 3-9 at that. By rights we should have been 1-11 as well in 2014, and the world was quick to proclaim that things were back to normal.

College football is the worst sort of mirror of the real world, because it’s just like the real world. Them that has, gets. Them that’s on top, stays on top. What you do, what you achieve at any given time, is unimportant relative to everyone else’s perception of what you are. If you can sustain excellence for maybe a decade, you can alter perception. Maybe quicker with a compliant media and a personal megaphone. But unless you can get immediately better, hold that for five years and crown it with a national title, the first slip means you’re going right back to the basement irrespective of the merits.

At least the Pac-12 is kinda sorta trying. There are a lot of well-regarded schools there, not least the #1 public university in the world. There are a lot of things they do beyond just football (they basically run the Olympic sports, not to put too fine a point on it) and as we were recently reminded, they gave the world Jackie Robinson (for which UCLA has gained a lot of karma over the years). In the Pac-12, football is not the entirety of your university’s identity unless you pull a USC and make it so, and even then, people will always bring up the film school. There’s balance, there’s rounding, there’s a smidgen of actual perspective.

And then there’s the SEC. 13 football teams surrounded by varying degrees of college. 13 organizations driven and supported by people who never set foot on campus except for gameday. 13 teams who have no problem with the notion that the only thing that matters in evaluating a program is how good you were in your parents’ era. Never mind academics, never mind other sports, never mind simple matters like not breaking the law – thirteen autumn Saturdays are the sum and substance of what you are as a university.

And then there’s Vanderbilt.

People are up in arms at the thought we might move off campus, and I agree with that – college football ought to be played on campus, always, that’s why it’s called college football. But the early returns suggest that to rebuild and repurpose our stadium would cost somewhere in the high eight figures. And at the end of it, we’d accommodate somewhere between thirty and forty thousand people, in a conference where four of our thirteen “peers” can hold over 100,000 in their gladiatorial arenas. When the day comes for the Great Powers of College Football to break the chains of the NCAA that hold them to the barest lip-service of student-athletics, at least ten of those schools will make the jump, and all thirteen will want to.

And then there’s us.

This is why I can’t follow college football anymore. I wish nothing but well to Vanderbilt and Cal, I wish nothing but ill to Stanfurd and Tennessee and Auburn and Texas and a host of others, but my favorite sport from earliest memory for forty years is not something I can engage with any longer. It means overlooking lawbreaking, overlooking exploitation, overlooking an activity that we can no longer pretend doesn’t have serious long term health consequences. It means emotional investment in the most rigged game in the casino. It means being the bull in every bullfight, in a world of outrage at your temerity to maybe try to use your horns before your inevitable demise. It means knowing that every Saturday there’s a 2-out-of-3 chance that it will end badly, in a sport and a world that gives you no credit whatsoever for doing your best or doing things “the right way.”

Maybe it would be different if we really had “peer” institutions. If Alexander Heard’s Magnolia League had come to pass, maybe we’d be playing Duke and UNC and Tulane and SMU and Rice and hell, maybe Navy and Wake Forest and Georgia Tech. Maybe if we’d decamped from the SEC with the Yellow Jackets and the Green Wave in the 1960s we’d be in the ACC and facing down a slightly less trying road in a conference with several private schools already present. Maybe when the football teams break off we’ll be left behind in the SEC for all other sports and can play our football in the Southern Conference or whatever you want to call it. Maybe this ends with the Commodores as the bully of the Southern Athletic Association, pushing around a bunch of religious colleges with enrollments under two thousand in non-scholarship Division III.

But in 2017, the last thing I need in my life is another losing cause, something else where you have to put your nose to the grindstone over and over for no apparent reward other than the nebulous promise that someday things will be better. There’s enough of that going on now without inviting more of it into your life. Once, college football was a joy and a delight. Now it’s just a different flavor of misery. And unlike most of the others, it’s one I can choose to do without. So I’m gonna. Sorry, guys. Best of luck. Anchor Down. Go Bears. Maybe someday.

The slot machine

Tweetie was the first iPhone Twitter client worth having. Loren Brichter started out on the iPhone team at Apple, and after it shipped, he started his own company and produced Tweetie – which included a landmark innovation. By scrolling to the top of the page and continuing to scroll, the “rubber-band” effect would do more than just be a visual marker: it would actually trigger a reload. All you had to do was keep flicking, and you could constantly reload the Twitter stream to see new content. And just to put too fine a point on it, if you dragged far enough, you got a sort of slot-machine wheel that would pop up three birds.
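Mechanically, the trick reduces to a threshold check on the scroll offset at the moment you let go – a minimal sketch, with threshold values I’ve invented for illustration (the real client tuned this by feel):

```python
# A negative offset means the list has been rubber-banded past its
# top edge. The threshold here is made up; only its sign matters.
REFRESH_THRESHOLD = -60.0  # points dragged past the top

def on_release(scroll_offset, reload):
    """Called when the finger lifts; fire the reload only if the
    drag went deep enough past the top of the timeline."""
    if scroll_offset <= REFRESH_THRESHOLD:
        reload()
        return True
    return False

fired = []
on_release(-80.0, lambda: fired.append("reload"))  # deep pull: reloads
on_release(-20.0, lambda: fired.append("reload"))  # shallow pull: nothing
```

That tiny check is the whole innovation: the gesture you were already making to look for new content became the command to fetch it.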

Loren’s on-the-nose Easter egg has come to sum up the Twitter experience for me. This time last year, I had six distinct Twitter accounts with their own functions and purposes. Of those, one is defunct and another is rarely touched. Two of the others have been deleted outright. I have a “public” account, a “professional” account that links to my real identity, and a “personal” account which is locked down tight.

But more to the point, I don’t use them as much. When I blew away those two Twitter accounts, with them went a couple hundred mutual followers and the bulk of my day-to-day socialization. I rarely do much with any account but the “personal” one, which has maybe two dozen followers, and while I like being able to keep up with people, it’s such a firehose of everything in the world I want to avoid that barely a day goes by where I don’t have to log out and quit the browser to try to forestall going back and looking again and again and getting even more out of sorts.

Twitter really is the slot machine, the gamification of attention – at any given moment, if I load Twitter and scroll through it, I have roughly a 20% chance of seeing something I’ll be glad I saw, a 30% chance of seeing something I’ll be angry I saw, and a 50% chance of something of no interest. Those are bad odds, and that’s even after weeding down two or three times to get to a tightly curated private group with only the things I’m likely to want to see. And the problem there is – you can’t even control for those things, because there’s so much to be outraged about, there’s no escaping it short of tapping out altogether. People speak ominously about a “filter bubble” and yes, it’s problematic when people only get a drumbeat of their preferred propaganda rather than any sort of factual news, but at some point you have to balance being informed against being driven insane. Right now, Twitter is a piss-poor tool for achieving that balance, and that may be sullying the good name of piss.
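You can run those odds as a gamble, with hypothetical utility weights of my own choosing (anger weighted double, because it lingers longer than gladness does):

```python
# His rough odds for any given scroll, plus invented utility scores:
# glad = +1, angry = -2, neutral = 0.
odds = {"glad": 0.20, "angry": 0.30, "neutral": 0.50}
utility = {"glad": 1.0, "angry": -2.0, "neutral": 0.0}

expected_payoff = sum(odds[k] * utility[k] for k in odds)
# 0.20*1.0 + 0.30*(-2.0) + 0.50*0.0 = -0.40
```

Half the spins are “free” and you still lose on average every time you pull the lever – which is the slot machine in a nutshell.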

But then, Twitter – and Facebook, which I resolutely refuse to use on a more than bimonthly basis – have a lot to answer for in general. Of which.

The price of “free”

There are two companies in Silly Con Valley who are still invested in the model of “cash on the fucking barrelhead in exchange for goods and services” – one is Apple and the other is Amazon. Everyone else is pushing “free at the point of use.” (To some extent, even Apple is doing this, although it’s hard to quibble with the notion that the Steve-Jobs-as-Scarlett-O’Hara DNA is still in the Cupertino bloodstream, what with the desperate need to never be dependent on Google or Microsoft or anyone else for critical services. Plus selling the phones is the big ticket; my 99 cents a month for extra iCloud storage is peanuts.)

We’re starting to see some of the limits of this. The biggest one is that when you don’t pay for a service, you have no recourse. I don’t know what the legal implications of this are – lack of consideration generally means there’s no contractual obligation, or so I was led to believe – but it certainly seems to me that the combination of “eternal beta” with “free at point of use” means that the service can change in any way at any time and if you don’t like it, tough shit, you paid nothing so don’t look a gift horse in the mouth. 

Problem is, this means you also have no leverage at all. If you quit using Twitter, which has richly earned the label of “a honeypot for assholes,” you’re not taking a penny out of their pocket. If you stop using Google’s Gmail offering, it does you no good, because 4/5 of the other people you email are using it and your content is just going to be parsed and data-mined same as if you were using it. You can bail out of Facebook, but they have half a billion other people willing to keep churning away. It would take a complete collapse of the online advertising market to put a hole in these companies, and for the likes of Google and Facebook, they essentially are the online advertising market.

Meanwhile, while we weren’t looking, Amazon essentially became the Internet’s Wal-Mart, a monopsony buyer which can ruin suppliers by leaving them off the site or demanding they meet a price point or just by moving into their market with their own-label goods. Wal-Mart was a bad company, sure, but the destruction of brick-and-mortar retail means that there aren’t going to be local jobs created by Amazon other than at fulfillment centers. (I suppose they could get into the delivery business themselves, if UPS and USPS aren’t enough. Maybe staff it out to Uber and Lyft, who knows.)

But that’s Amazon. If you’re willing to just go to a store – assuming one is handy – you can avoid that. Problem is, “social media” is pretty much entirely in the hands of Facebook or a handful of others (and don’t get it twisted, WhatsApp and Instagram are both Facebook and both doing everything they can to snipe other services like Snapchat). Trying to keep up with friends without using one of Facebook’s products is damn near impossible at this point unless you’re prepared to round up everyone you know to use something like Slack, or roll your own listserv, or convince everyone to use some other app suitable for group texting (even if it just ends up being shared SMS).

That’s the increasing problem: the Internet is maximizing the worst of humanity, but the only practical solution as an individual is to opt out. I guess I’m just lucky to be part of the last generation that knew you could have a life that wasn’t online, even if that online life got me to where I am now. But it’s reaching a point where depending on the vicissitudes of social media is a fool’s errand. Of which.

The next up

It’s kind of alarming to think about, but if you knock off standby and talk and audio playback time (which any phone can handle these days) and look at the browsing times, the current iPhone 7 Plus – the big one – doesn’t represent that much of an upgrade over the iPhone SE. Nor did the 7 represent that much of an advance over the 6S, which had to have more efficient internals just to make up for the fact that it packed a smaller battery to make room for 3D Touch. (Which raises the question – what would happen if you made the phone thick enough to contain the camera nub, ditched 3D Touch, and used all the created space for battery? But I digress.)

While there are certain phone makers who will go for a huge battery occasionally (as Motorola still attempts in the mid-market), most premium phone makers seem to have thrown up their hands and gone all in on “THIN THIN THIN MY GOD CAN YOU BELIEVE HOW THIN LOOK HOW THIN IT IS OH YEAH IT CHARGES QUICK YOU CAN CHARGE IT BACK UP IN A SECOND BUT LOOK HOW THIN” which is…okay, whatever. I mean, if you don’t have a charge cable at your desk and leave the phone plugged in whenever you’re there, I don’t know what to tell you – but much good that does you when you don’t have a fixed location and aren’t driving to work with the phone on the charger. The fact that Apple still makes the external battery pack for the iPhone 7 is all the proof you need: even they know it’s not enough.

I won’t lie; I’ve been tempted by the notion of a larger screen. The 4.7” of the original Moto X was in a form factor that worked perfectly, but those days are gone. The Great Mentioner is convinced that the iPhone Pro, so-called, will have a 5-plus inch AMOLED display with no bezels in more-or-less the form factor of the iPhone 7 – which is nice if true but still just a hair too big especially once you get a case on it. And at this point, it’s got to be at least a 5 inch screen; I went from the iPhone 6 to the SE and never looked back so 4.7 isn’t going to get it done anymore.

Thing is – what do I really want with a bigger screen? Kindle reading, maybe, it reduces the number of flicks – but as long as you have the Paperwhite, why not use that and save your battery? The wife found the London flyby in the Maps app, and that was pretty sweet, but is it worth buying a bigger phone? There’s always video, I guess, but I don’t watch video on my iPhone SE in the first place and I doubt I’d watch more on a bigger device. For a generation who has YouTube and Netflix in place of television, the incentives are probably different, but I’m getting by OK.

Here’s the thing, though: if the battery life isn’t getting any better from having a bigger phone, and I don’t need the bigger screen, what do I need another phone for at all? I can’t be the only person willing to stretch their device for three years or maybe more, especially one that has suited me as perfectly as the iPhone SE. (Not even the Apple Watch has been as perfect a companion. I don’t take it abroad, I don’t bother with it on days I know I won’t be at work and won’t get the exercise in anyway, and that Ion-X Glass can scratch just fine, thank you. And when the phone is small enough to use one-handed, you don’t need a remote control to avoid fishing it out of your pocket.)

If I need a larger phone at all, it’s as the shutdown device, the alternate-reality Android that I only use for a few hours at a time and never for anything more than a half-dozen apps. Kindle, Wikipedia, Foursquare, Instagram, Slack, perhaps some sort of streaming audio from London or Ireland or baseball, and that’s about it. It’s possible that something like the Amazon-discounted Nokia 6 or Moto G5 would be a replacement once the faithful old Moto X is no longer suitable for Tuesday or Sunday night unplugging and getting away from it all. And it would give me a crack at Android Oreo (is this really the most apt time for Google to push a product that’s got a lot of brown to look at but is all white at the core?) for whatever that’s worth.

Or, you know, this could be it. I could finally be down to one phone, personally owned, just the SE for all things, with the knowledge that there are two or three viable prepaid service options if ever I ditch work. And that’s the thing with the iPhone Pro: if you’re going to drop a thousand dollars on a damned phone, you better know up front that you’re getting value for money for years. At that price point, it has to have a laptop lifespan, not a phone lifespan. Two and done is fine for $200 Shenzhen screwdriver-jobs with Qualcomm guts and no upgrade path. From Cupertino, I need better.

Fair warning, Auburn man and limey prick: don’t screw this up.

Where to go and what to do

I don’t talk much about Vanderbilt football in this space anymore. Nor in the space where I used to publish weekly during the season. The exhilaration of the Brigadoon era crashed and burned once our best success in a century got strip-mined for the benefit of Joe Paterno’s squad, and so far, we’ve proven we can hit the non-Brigadoon highlights since 1982: beat Tennessee, win six games, go to a bowl (and despite the loss, we might be the first SEC team happy to go to Shreveport…ever, really).  The question is, can we beat that? Many have tried, but only two non-Brigadoon coaches have made it even to six wins since that Hall of Fame Bowl in 1982, and one promptly lost the bowl game while the other crashed and burned, going 1-5 down the stretch before the punter made MVP of the bowl and preserved a winning season.

The aspirational model for what our football team could become definitely seems to be Stanford. I mean, we went out and hired their DC and everything. But apparently in 2016, Stanford didn’t sell out their 50,000-seat stadium. Not once. And they’re raising ticket prices. Now, in fairness, Stanford has gotten themselves into a situation where their two biggest-drawing opponents (Cal and Notre Dame) are in the same year, and the other biggest attractions at present (UCLA, Oregon) are in the same year as well. When USC is the only remotely interesting home game, it’s probably a tough sell anyway. But this is a program that’s gone to MULTIPLE Rose Bowls in recent memory, it’s the beneficiary of unlimited slobber-worship by the college football media, and it has a benefactor who showers the athletic department with literally millions of dollars a year. They can’t fill 50K.

There are those who argue that Vanderbilt football is a sleeping giant, a Stanford-esque overnight sensation waiting to happen, and all that we have to do is build a new modernized stadium and mysteriously winning will breed winning and we’ll find ourselves going head-up with Alabama and Florida and battling for a playoff berth every year. These people are insane. The cost of catching up with the rest of the SEC – in facilities, in mindshare, in media coverage – can’t be measured in dollars and cents. It would require a cultural change on campus, it would require a complete transformation in the college football media, and it would require years of repetition before people got in their heads that Same Old Vandy was gone for good. And if you don’t believe that, look at what happened in 2014, when it only took the first half of a weird and rain-delayed game for the world to proclaim that things were back to normal on West End.

So there’s a spectrum. At one end, we drop football altogether as a sport. At the other, we do whatever it takes to keep pace with the giants of college football, irrespective of the cost or impact on the university. Right now, the two factors that are orbiting one another are stadium location and conference affiliation. While Vanderbilt is a peer and competitive member of the SEC in every sport other than football, only one thing matters in the Southeastern Conference and it’s the one thing at which we happen not to be a competitive peer. Meanwhile, the plan appears to be that instead of spending the money on a new on-campus stadium or a massive refurbishment of the one we already have, we’ll borrow someone else’s off-campus stadium and wait to see what happens, not least because an on-campus stadium is a huge chunk of property and they aren’t making any more of that.

At this point, I don’t think we’re ever going to see a larger stadium for Vanderbilt than what we have now. There’s simply no percentage in it. By the same token, I can’t think of anyone in major college football playing their games in an off-campus stadium other than the LA schools, and that represents special circumstances, as neither UCLA nor USC has ever had an on-campus facility in the modern era (they shared the Coliseum until the 80s and UCLA has been at the Rose Bowl ever since). So look at Tulane, which is the poster child for a school that bailed out of big-time athletics – and even they have built a new 30,000-seat stadium, on campus, at a cost of $75 million. Now we have a dollar figure to play with. It’s probably safe to assume that any rebuilt Dudley Field would cost something similar. At that point, you have to think the powers in Kirkland Hall are looking at this notional MLS stadium and thinking “that’s $75 million we don’t have to spend, never mind the potential use value of getting the Dudley land for something else.” I get that. I don’t like it one bit, but I totally see how they get from here to there.

So. Are we going to chase the rest of the SEC no matter what? We are a founding member of the SEC and a competitive peer in every single sport we participate in…except for one, and that one happens to be the thing that defines the SEC and all it really cares about. We can continue to power our way through in baseball, in tennis and golf and cross-country, and possibly even in basketball behind CBD and CSW, but the price of doing that is bashing our football into the rest of the conference every year with one hand tied behind its back. Can that be done? Hell yes, we’ve done it for decades. But it’s unlikely to build a fan base or bring in additional revenue apart from our one-fourteenth share of the TV and bowl money. It’s also worth noting that Brigadoon aside, our baseline improvement is largely a function of playing twelve games a year rather than eleven; those five-win seasons in the DiNardo era would probably have been six in the 21st century.

That’s really the only question. If we don’t chase the SEC, and nothing happens to separate football from the rest of college athletics, we’ll almost certainly end up somewhere else – Conference USA, the Sun Belt, some lower division altogether. I don’t think Vanderbilt will dump football as a sport until football itself goes away, though, and as an example I offer good ol’ Birmingham-Southern College, which in 2006 abandoned its several-year experiment as a Big South Division-I school and reverted to D-III. The first thing they did in D-III? They ADDED football – which they hadn’t played since 1939 – and committed to building an on-campus facility for it. The whole point of leaving Division I was supposedly financial, yet they added the most expensive sport a school can play.

So what now? The optimal scenario: play elsewhere for free until we can figure out what the future of Vanderbilt football looks like, then build accordingly on campus as required. But that leaves too many variables in the hands of others, and the uncertainty will do nothing for the team, the fans or the perception of a program that already gets Hillary Clinton levels of media regard. At last call, the future for a Vandy supporter is the same as it’s always been: unknowable but grim.

flashback, part 87 of n: ten years after

In retrospect, the trouble really began when I yielded my vacation slot to one of my co-workers, who had just gotten married and needed to visit Germany with his new bride before her pregnancy got too far along. Which was fine, I didn’t begrudge him that in the least – but it meant that not only did I not go on vacation, I found myself covering his job in addition to mine.

And that’s when the knee really started to give me trouble. It had always been a bit dicky, ever since my brother took the post-hole diggers, dug a hole, and lured me across it; I stepped in knee-deep and miraculously didn’t break anything. But for whatever reason, it was worse than it had been before, probably from the wear and tear of dockwalloping for two years plus. I hadn’t had to do as much of it lately, but going back to it made things worse, and eventually I was referred to a doctor who recommended surgery to clean it up.

That was the point at which a smarter person would have gone to his boss and said “I need some kind of accommodation.” And had I any inkling of how things would turn out, that’s exactly what I would have done. But I didn’t feel like I could, because our group had trouble in the past with a lead who always said “I’ll come back this afternoon and help you out,” which meant that there would be no help and there might not be an afternoon. So I had to do my usual desk work and then come back and do my share of the forklift jockeying and box moving. And when I got done backfilling for my one colleague, I had to start backfilling for another one who was constantly being re-tasked to assorted secret squirrel projects – which left me doing three jobs, none of which was particularly technical. The dock work could have been done by anyone with a strong back. The scheduling could have been done by a well-crafted piece of JavaScript if we’d had more competent programmers working with us. And the third job, packing out the special kits for sales staff working special events, was deadly dull but just as deadly serious, and if anything went wrong it was all your fault, irrespective of how.

So I panicked. I was terrified that if I didn’t get back into the technical side of things, I would be doomed. I was a troubleshooter, I was a problem solver, and I wanted to be solving more impressive things than how to get an education rep to take ten iPods instead of forty for whatever podcasting demonstration they were going to put on in the Grange hall in Dubuque. And ultimately, that was the foolishness – the notion that I somehow had to stay technical, that I would be in trouble if I didn’t.

Had I stayed, there’s a chance I could have eventually moved into the sales-engineer side of education or government sales. I was known and liked by people in both areas (even if others in EDU hated my guts) and I was building professional connections – the lack of which remains my Achilles’ heel in an industry and a part of the world where your next job almost always comes from a call from a former co-worker. Had I stayed, I was in line for a raise that would have handed me the same salary I got in my next job as a government subcontractor – and with actual benefits, unlike the subcontract gig.  Had I stayed, I could have worked through the incipient depression from a more fortified position, rather than off the back of what I rapidly realized was a catastrophic mistake.

2007 was also when I tried to leave the internet behind and embrace the real world. I signed up for RCIA, but it didn’t go anywhere, partly for want of anyone who would make a viable sponsor but largely because I didn’t feel I could convert to only 60% of a religion I wasn’t raised in. I signed up for a men’s a cappella chorus, but as the youngest one there by twenty years, it didn’t really do anything for me socially and rehearsals took four hours a night. I signed up for a Java programming class at the local community college, only to realize that I have absolutely no interest in programming. And in an attempt to get into the real world, I abandoned my LiveJournal presence in any meaningful way. Which didn’t work out that great, to be honest.

In short, 2007 was a slowly gathering existential crisis which climaxed with me in a cinder block office in December, working what were functionally two part-time jobs without benefits, bereft of whatever psychic gain came from being associated with Apple or National Geographic, without any meaningful support structure that wasn’t a continent away or borrowed second-hand, and convinced that my entire past was crumbling into a black hole behind me as I desperately tried to stay one step ahead. It was enough, eventually, to drive me to medication and a fourth try at some sort of therapy. I’d like to say it worked, and I suppose by summer of 2009 I was sort of okay – but it wasn’t back to normal, it was a new normal, in a way that left me wary of new normals for good.

I eventually made the money back. I eventually got out of field support. But it’s another situation where better choices would have gotten me there sooner, and it’s a useful reminder: be the person you’re becoming rather than trying to cling to the one you were.

Charlottesville and everything after

You can’t be surprised by this. This has been a long time coming, ever since the GOP hitched its wagon to the South in 1994, or 1988, or 1968 – pick whatever date you want. But it obviously wasn’t going to happen under George W. Bush – normally, control of Congress and the White House means you can pursue your aims through political means. The villains in the piece here aren’t just the white supremacists and the President [sic] who enables them – it’s the party that thought they could keep using the opiate of racism just enough to get them by without getting hooked. And now, here we are: keep dog-whistling about the secret Muslim Kenyan usurper and claiming that Democrats are out to destroy white people, and then when you get unified control of government, people like David Duke think it’s finally payday in the village.

And it’s kind of broadly based, because we decided somewhere that the Internet doesn’t count and isn’t the real world. Meantime, the alt-right and the GamerGate pukes and all the other arrested-development adolescent boys took it very very seriously. Now, matters are worse. A huge group can be rallied to Charlottesville with ease, whereas a tiny fraction of that number could be pulled to Birmingham in 1991 for me to elbow one in the dome at the Guns ’n Roses show at the race course. And more to the point, condemning the KKK and their polo-shirt ilk should be the easiest thing in the whole goddamn world for a politician to do. This is cartoon stuff, rookie-difficulty-setting, the kind of stuff you can point to as “REAL racism” to distract from redlining and cutting Obamacare and all the other things that hit nonwhites first and harder and longer for the benefit of folks with money. It should be a layup to condemn those pricks.

And yet.

Here’s an easy rule of thumb: anyone who hesitates to condemn the Klan, anyone who has to hedge their words around lashing out at white supremacists? It’s because they’re on their side. It’s because they rely on their support. It’s because that’s who they are. For decades, Democrats had to live down anyone to the left of the New Deal, had to hem and haw and apologize for rappers or undocumented immigrants or gay people or do some kind of po-faced dance around anything that cast aspersion on anyone white. Well, here you go. Payback is hell. The United Cracker Front in Charlottesville this weekend needs to hang like a millstone around the neck of every Republican from now until the end of time. These are the deplorables. You want to defend that? This isn’t you? This isn’t what you stand for? Fuck you, prove it.