Population 51,201

I had just turned eighteen when Twin Peaks hit American televisions. It was the perfect combination of time and place: something new and slightly strange, just as I was getting ready to leave high school and achieve the big dream – college! independence! a new life for myself!  And a weird, quirky, dreamlike thriller was just the thing to pull me in, especially when the first few episodes ran again all summer and I could get caught up and stuck in.

Over twenty years later, I found it on Netflix, and resolved to plow through all twenty-nine episodes again, this time with the benefit of years and sense and a slightly better grip on things like Tibetan wisdom and terminal ballistics.  Last night I finished the last-but-one episode. There’s only the finale left.  And I’m reluctant to run it…because I don’t want it to end.

It’s definitely dated, I admit.  The pacing isn’t quite as bad as you’d expect of an 80s prime-time soap opera – and make no mistake, that’s what this is – but then, some of the slowness is camouflaged by the abiding weirdness David Lynch brings to the table every time out.  It’s also tough to wrap your head around not only a world without cell phones and the Internet (though credit where it’s due for Macintosh product placement before it was fashionable), but a world where the cops still carry revolvers and people routinely smoke indoors.  Twenty years ago is a foreign country.

The look is equally dated, although once again that could be partly Lynch and possibly just an affinity for the era. Let’s be honest; I was 18 and pretty much every one of the women on the show still holds up (we tend to forget that Audrey Horne was America’s designated sex-incarnate for most of 1990).  Norma in particular is still lovely, although she and (presumably) Big Ed are younger then than I am now, which is kind of disturbing to think about.  I’m still rooting for those two, of course – it’s tough to be with the one you love when one has a spouse in prison and the other has a superhumanly strong one with an eye patch and a drape-runner fixation.

So many plots and story lines seemingly went nowhere. Anything with the Packard Mill got boring in a hurry – Piper Laurie’s scenery-chewing bitchery seems much more suited to something like Dynasty.  The switch from the plot being driven by the expanding Renault crime organization to being propelled by Windom Earle comes off as fairly abrupt.  And James off with his mysterious woman served no purpose whatsoever.  No wonder it went off the rails – there was just too damn much to keep track of.  Lesson learned: you can be complex without being complicated.

But so much of it still works. Special Agent Dale Cooper and Sheriff Harry Truman remain the most underrated bromance of our time, and the evolution of Coop from mysterious eccentric sharpshooting investigative genius to humanized flannel-clad troubled soul in love with the new girl at the diner (and yes, that was Heather Graham, folks) is rather a nice character arc. I would have loved to see more of the Bookhouse Boys and their battles against “the evil in the woods” – and maybe the origins of the Faustian circumstances by which Twin Peaks became this idyllic small town taken out of time and framed by evil on all sides.

The thing is, Twin Peaks in its time tended to parallel my life. It started with a bang in the spring of 1990, when I was through with high school and anxious to get on with my future.  I even bought the cassette single of the theme, deliberately thinking to myself “you know, this would make a fine song to share with the new girlfriend whom I will undoubtedly meet once college gets going.”  And then, when the show came back in the fall, it slowly deteriorated until petering out in April…which is just about how my freshman year went.  One long slow deterioration until by April, it was obvious that I wasn’t going to be able to save this bird from a hard landing.  And like my college career, the series didn’t get a happy ending – just a cliffhanger with no obvious hope for how things could be saved.

Now? Now it’s a waking dream, a little slice of the past brought back out of the black hole, a piece of that sort of “creeping strangeness that we can’t quite bring ourselves to call magic” somewhere between urban fantasy and magical realism I mentioned a few weeks back. It’s a much better selection of jazzy ambient work music (hell, it’s a whole new soundtrack album that didn’t even exist twenty years ago).  It’s an explanation for my affinity for coffee and cherry pie (which never waned).  It’s the existence proof for shows like Lost or The X-Files, which never could have happened were it not for one random murder in the small-town woods south of the Canadian border.

RIP, Laura Palmer. =)

This time it’s different

Much has been made in the last few days of one particular San Francisco techno-douche who famously ranted about what he hated about the city – and who was promptly savaged by pretty much the entire Internet.  As much fun as it was to see the entire 415 going in on some assbag who thirty years ago doubtless would have been coke-snorting his way around Wall Street, it drives home something I’ve been feeling lately – that this time, the up-and-coming generation of rich young dot-communists is materially more unpleasant than before.

I think this is true, I think it goes beyond just the typical generational disdain for those following behind, and I think it can be explained by a couple different phenomena.  The first, and most obvious, is that this particular technology bubble (and it is a bubble, don’t kid yourself) is not happening in parallel with a booming economy in general.  Famously, in 1999, we hired a waitress from Hooters to sit the help desk at $50K a year, because everyone more technically qualified was working elsewhere for more money. I myself got dragged in off the street in 1997 for $40K with not a day’s experience in IT, and saw my salary jump by almost 40% in the ensuing three years.  The dot-com boom went hand-in-glove with the longest sustained economic expansion in the history of this country, and it’s hard to disdain the twenty-something youngsters spinning straw into cash when everybody’s getting paid.

Fast forward to 2013, where the economy has never really recovered from the credit crunch of 2008.  A parsimonious Congress, in the grip of a political cabal that’s more than willing to sink the country in order to slag that colored boy in the White House, has given us an austerity binge that slowed recovery to a trickle.  Interest rates stay low, quantitative easing continues, but the market gets hinky every time it thinks the Fed might turn off the taps – the economy is on permanent life-support, and people stay in jobs they don’t want or can’t stand for the sake of security, or insurance, or because they have to get the kids through school.

Meanwhile, the nature of the technology itself is more solipsistic than ever before. The dot-com boom was about monetizing the web – search engines, advertising, retail that could offer results superior to brick-and-mortar shopping. And, of course, the technology blue chips that were getting you there: Netscape, Cisco, Microsoft, AOL.  Today, that’s all in place; Amazon is everyone’s default shopping choice and everybody’s got an on-ramp to the information superhighway (sneezes cobwebs off the cliché).  The driving forces behind the modern tech economy are largely centered around social networking and mobile computing.  Facebook. Twitter. Instagram. Foursquare. Google and Apple, of course.  It’s all about personal gratification, in a way that can be far more intimate, pervasive and persistent than when our internet experience revolved around desktop computers plugged into DSL (if you were really lucky).

The combination of the two is what really drives people up the wall.  Things like Uber and Lyft and Sidecar are tremendously useful in a city woefully underserved by cabs, but the cabs are (quite rightly) irate at being subject to a slew of regulations that “disrupting” services are spared. The entire ecosystem of private shuttle buses running from San Francisco down the peninsula – taking ridership from a public transit system that needs it to survive, and frequently usurping public bus stops for their own – offends the sense that we’re somehow all in this together.  People looking to buy a house or a condo in the city are finding themselves shut out by people who can swarm in with a cash offer and then just keep the property as an investment.  And Sean Parker’s infamous “fantasy wedding” has become the gold standard of cautionary tales about the solipsistic oblivion of modern Silicon Valley wealth.

So what’s the solution?  There’s not one, really. Eventually this bubble will bust and all the douchebags will go back to Goldman Sachs or whatever, maybe. Or the economy will take off and enough money will shake loose that everybody’s happy – though probably not, or it would have happened by now. Or this will just roll itself into the ongoing trend of social bifurcation where certain people wind up with Platinum Plus Preferred Citizenship and everyone else scuffles out a living as best they can.

Dude, you’re getting a breach

So apparently Fast Eddie Snowden had access to classified material even as a Dell contractor, before joining Booz Allen. This is, politely put, an utter shitshow. It also points up a lot about how modern trends in business and governance have combined to make things worse.

See, there are some things that rightfully ought not be staffed out, and you would think national security would be at the top of the list.  But it has been an article of faith among the chattering classes of the Village that the private sector can always do everything better than the government. Consequently, as the government moved into things that didn’t exist in the 1970s, the odds grew ever greater that those functions would be handled by contracts with private-sector employers.  Result?  Contractors and sub-contractors in positions of rather vital importance. And since government can’t pay like the private sector – because that would be a wasteful use of taxpayer dollars!!* – the sorts of people who can do these things are going to be in the private sector anyway, so you’re more or less obligated to contract out to get it done.

And here’s the problem: I’ve been in the contract game. Hell, I was a government subcontractor. And the arrangement did not exactly inspire in me tons of loyalty – certainly not to the contracting or subcontracting companies. This is the price of treating labor as a fungible and disposable commodity – the workers will in turn treat the jobs as fungible and disposable.  That’s half the appeal of contracting, after all – you’ve got no ties to the company, you’ve got nothing invested with them, and the more you bounce around, the less incentive you have to take care of any particular job – because the next one’s around the corner.

Snowden, then, is the inevitable result of the modern mania for outsourcing.  Somewhere along the way, the powers that be decided that workers just weren’t that critical to the system.  Better to always be able to cut and run, to bring in somebody cheaper, to hit the eject button without even needing a reason.  Staff jobs disappear, population increases, more and more employers decide to go that route, and pretty soon we’re all out there making a virtue of necessity and talking about freedom and personal agency and taking control of Brand You, because that’s all that’s out there.  Add “recruiter” to “financial planner” on the platter of things you do in addition to your black-letter job responsibilities.

So there you go.  Throw our brave new world of work onto the pile with our terrorism-panic and insistence that the government spare no effort to protect us**, and you have an utterly inevitable and predictable situation – the entire Snowden affair, stem to stern, was practically predestined thirty years ago.

* Here’s the question: no matter how gold-plated the benefits, is it really cheaper to pay a staff employee than to pay a contracting company to pimp a contractor to you? I don’t use the word pimp idly – because face it, in contracting, you’re paying to dispose of the staff once the job’s done.
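To put a shape on that math – with numbers that are entirely invented for illustration, not drawn from any actual contract:

```python
# All figures hypothetical, purely to illustrate the comparison.
staff_salary  = 90_000                   # annual salary for a staff hire
benefits_load = 0.40                     # gold-plated benefits, as a share of salary
staff_cost    = staff_salary * (1 + benefits_load)

bill_rate      = 120                     # contracting company's hourly bill rate
billable_hours = 2_000                   # roughly one working year
contract_cost  = bill_rate * billable_hours

print(f"staff, fully loaded:   ${staff_cost:,.0f}/year")    # $126,000/year
print(f"contractor, as billed: ${contract_cost:,.0f}/year")  # $240,000/year
```

The premium over the loaded staff cost is, in effect, the disposal fee.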

** The real danger for Snowden is that all this NSA stuff turns out to be completely legal, which given the exhaustive reach of the USA PATRIOT Act of 2001, its various re-authorizations, the laws enabling the FISA courts and the like – well, it’s not unthinkable.  Short of going all the way to the Supreme Court, which is mostly filled out with people appointed in the War on Terror era or those highly sympathetic to it anyway, it’s not beyond the realm of possibility that for all the whistleblowing, no actual laws were broken.  If so, that puts our little Ed in a much more precarious spot legally.  Then again, being a guest of Putinist Russia might be its own punishment at this point.  It might not hurt for him to leak those girlfriend pics again.

Down in the Delta

So a week or so ago, I decided that since I didn’t have kids, I might as well take the case off my iPhone 5.  Nobody else handles it, I’m reasonably sure-handed, why not?  Sure enough, I’ve only dropped it once and not in a particularly harmful way, so now I have the benefit of it being light and slim and such.  And then, a couple days after, I did what my wife had done to her iPhone 4S with no regrets: pulled the screen protector off and left it off.

This was a huge leap of faith. After all, I scratched the screen of my iPhone 4 within a week of receiving it, and I’ve been paranoid about it ever since.  To take this fragile aluminum-glass iPhone 5 with no case, no screen protector, nothing at all – it feels akin to whipping off my trousers and Porky Pig-ing my way around Plato’s Retreat in 1978 or so. I mean, it may feel great and look very sexy, but before long, I’m going to wish I’d never done it.

But so far, it hasn’t been much of an issue.  And the thing is – it’s like getting a brand new phone. I don’t think I appreciated just how amazing the screen is, after months of having it covered by a scratched layer of polymer. I certainly didn’t appreciate just HOW thin and light it is on a day-to-day basis, or how the chamfer between glass and aluminum looks if anything even more high-tech than its Dieter Rams-influenced predecessors.  And given that right now it will support every known feature of iOS 7, it’s going to feel like a whole different phone again soon – probably by mid-September, given the Great Mentioner’s announce date of September 10 and the likelihood of sales on the 20th.  It’ll almost certainly feel like the biggest shift in the iPhone since it first shipped.

And that’s important. Apple is getting clubbed pretty good in the blogosphere by people who look at the evolving state of Android, or the new design of the Moto X, or the prospect of Google Glass, or just the fact that most Android phones have upward of 5-inch displays, and want to know why Apple isn’t doing anything wildly different.  And this betrays a couple of fundamental misconceptions about how Apple works and how things are in the world.

For starters, Apple tends not to test things out in public. The first iPhone and the first iPad had months to build buzz, but since then, new versions tend to be on the shelves within a couple of weeks of announcement.  The rumor mill goes berserk, always, but Apple themselves never let the cat out of the bag early. The notion of a Google Glass-like approach where an unfinished product is released to a handful of randoms is unthinkable in Apple World.  Sure, Siri has been “beta” since it launched, but how much of an outlier is that? By contrast, how long did it take for Gmail – Gmail! – to drop the beta tag? Five years.  Siri’s ongoing beta status jumps out because it’s unusual for Apple to go to market with an officially unfinished product.

The other consideration is that of the delta: Apple simply hasn’t had as far to come. The first iPhone lacked a few things, but within three years, the iPhone 4 was essentially what we have now.  By contrast, the first Android phone worth criticizing only shipped in 2010…a few months before the iPhone 4.  The Android ecosystem has had much more ground to cover over the last three years, and it made a virtue of necessity by producing ever-larger phones to accommodate the ever-larger batteries required to carry them through a full day.  The result, with half a dozen manufacturers, was a plethora of choice and the appearance that the Android world was somehow advancing further and faster than the Apple one.

In the end, a lot of what people wanted from iOS 7 was change for the sake of change.  Something new, something fresh, something different and exciting. That’s only been made worse by the features Motorola has rolled into the Moto X’s hardware, features that are almost certainly going to require new hardware to emulate and which Apple may not be able to match until 2014.  Unless Apple has something they’ve been playing very close to the vest – à la Siri in 2011 – it’s almost a guarantee that the iPhone 5S* will be heralded as a great disappointment. Because the delta won’t be big enough.

* The Great Mentioner has concluded that we’re looking at a 5S, which given the track record since 2008-09 makes perfect sense.  I also buy into the assertion that the 5 will remain as the $99 option and that there will be a new 5C as the free-with-contract phone, basically made of the guts of the 4S with the screen and Lightning connector of the 5 in a plastic case.  Mainly because Apple wants to standardize on one phone/iPod display and one connector, and the 4S will be the last thing standing not using the stretch screen or the Lightning connector. Plus a brand-new product in the low-cost area might be a big enough delta for some.

The Berry Crack’d

Welp. Blackberry is officially looking for somebody to buy them. This was more or less inevitable; the Z10 was far too little three years too late, while the Playbook was a complete and utter bust as a tablet (Amazon basically schooled them with the extraordinarily similar hardware of the original $199 Kindle Fire).  Sic transit gloria mundi – the gold standard of the connected life in 2002 is a poor fourth in 2013, and arguably has been since Microsoft shipped Windows Phone 7.

It didn’t have to be like this.  Research In Motion had a hammerlock on the corporate market for most of the first decade of the 21st century.  For the longest time, the Blackberry Enterprise Server was the only option if you wanted to get your corporate email sent to a portable device, and it remained the most secure and reliable choice even as the Sidekick and iPhone carved out the “consumer smartphone” market for themselves – because Blackberry Messenger split the difference between texting and IM and became the indispensable unique selling point for RIM’s devices.  Apple didn’t produce an alternative until Messages in 2011.

But what did for Blackberry, ultimately, was Android – which gave half a dozen different companies the opportunity to put a “good enough” smartphone in the hands of any old punter, free with a two-year contract. And Android (and with it the iPhone) got a lot better at accommodating corporate requirements before the Blackberry got the ability to handle consumer smartphone apps and connectivity. I experienced it myself in 2009, when I carried the Bold for a couple of months.  It was the best-received and most highly regarded Blackberry device to date, and it still sported a physical keyboard and a tiny (albeit crystal-clear) screen.  And the most highly recommended application for web browsing was the same Opera proxy browser that I’d been trying to use on a Moto flip phone four years earlier.

RIM thought everyone would stay loyal to the physical keyboard – but of all the high-end smartphones of the last couple of years, exactly none have shipped with a keyboard.  Not the HTC One, not the Samsung Galaxy S3 or S4, not the Moto X, not any iPhone.  Even RIM/Blackberry shipped the Z10, with no keyboard, a month before the Q10 which had it.  RIM also thought people would stay loyal to Blackberry Enterprise Server – but the iPhone has had direct interoperability with Exchange servers for five years, which beats having to run a separate box for wireless email.  And the proliferation of unlimited texting – and ultimately unlimited messaging of all types – made Blackberry Messenger just one more number to remember.

It didn’t have to be like this.  RIM could have, should have jumped onto Android as soon as it became obvious the iPhone wasn’t a gimmick or a fad or a one-off.  Instead of the Storm – with its full screen that clicked as one huge button – RIM should have turned out an Android phone.  Instead of layering it with the likes of Sense or MOTOBLUR or TouchWiz, they should have taken the opportunity to port BBM, to layer their BES interoperability and security over Android.  They could have taken the best, most attractive, most marketable parts of the Blackberry experience, let Google be responsible for the OS and the ecosystem, and established a value proposition that no Android vendor could have rivaled.  Instead, they remained convinced they were indispensable.

Nobody’s indispensable. “Good enough” always carries the day, else the Macintosh would have ruled personal computing in the 1990s.  And “good enough” has carried Android to the market lead, as the Moto X builds buzz and as one blogger after another proclaims the miracle of Google Now.  One in particular said that Android has surpassed the iPhone experience for him because he uses Google services for everything anyway – which makes perfect sense. If you use Google services for everything, well of course a Google phone OS that integrates with your existing Google accounts and services will work better than an iPhone, or a Blackberry, or a Windows Phone device.

But that’s the thing, as I learned when trying to make it work: Google Now is of very limited utility if you don’t use Gmail as your email provider.  I can get the directions to work every morning, I can get the weather (kinda sorta), but I don’t use Gmail at all anymore – so I’m not going to have automated package tracking or flight status updates or boarding passes magically appearing at the airport or hotel and restaurant reservation reminders.  Once somebody comes up with a mechanism that can mine the data on the local device and parse it there, without recourse to reading your data from the server side or piping it back up to the cloud first, this might work out a lot better for me.
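A minimal sketch of what that might look like – assuming, purely for illustration, an app with access to message text already cached on the device; the tracking-number pattern and the sample inbox below are invented:

```python
import re

# Scan locally-stored message bodies for shipment tracking numbers,
# entirely on the device - nothing is read server-side or shipped to
# the cloud first. The pattern matches the common UPS "1Z..." format;
# other carriers would need patterns of their own.
UPS_TRACKING = re.compile(r"\b1Z[0-9A-Z]{16}\b")

def find_tracking_numbers(messages):
    """Return any UPS-style tracking numbers found in the given bodies."""
    hits = []
    for body in messages:
        hits.extend(UPS_TRACKING.findall(body))
    return hits

# Hypothetical locally-cached inbox:
inbox = [
    "Your order has shipped! Tracking: 1Z999AA10123456784",
    "Dinner Thursday?",
]
print(find_tracking_numbers(inbox))  # ['1Z999AA10123456784']
```

The point being that the parsing happens where the data already lives – no Gmail account required, no round trip through anybody else’s servers.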

But for now, I don’t need it that badly. Weather and traffic alone aren’t that vital, they appear to be coming to the iOS 7 notifications automatically anyway, and there’s nothing I can’t get just as effectively elsewhere.  Which could be the ultimate epitaph for Blackberry: in the end, everything else got to be just as good.

The Courting of Marcus Dupree

1981.  A different era. ESPN barely exists.  College football games only appear on television on Saturdays. There’s no such thing as commercial Internet.  Sports talk radio is in its infancy.  The triple-option Wishbone offense is au courant among major programs, not just service academies.  The SEC still has ten teams, SMU is still a national power, and Bear Bryant is still alive.  Basically, from our standpoint thirty-plus years on, it’s prehistoric college football. No realignment or 12-team conferences or first-week-of-December title games or BCS standings.  You know, what I was raised on.

Into this comes one Willie Morris, native of Yazoo City and alumnus of the University of Texas, a Rhodes Scholar in the late 1950s and a famous literary editor who found himself at the University of Mississippi in 1980, just as a young man in Philadelphia, Mississippi, was making a name for himself on the high school football field.  Apparently that young man’s legend had reached all the way to New York City, which is how Willie Morris found himself spending most of the 1981 high school football season in and around Philadelphia to watch the senior season of a certain Marcus Dupree, the consensus #1 high school player in the country.

The book is widely regarded as a classic of college football literature, and so I’m embarrassed to admit I didn’t start reading it until 2013. When I did, though, it was compelling – this, after all, is a town less than two decades removed from the infamous murder of three civil rights workers in Philadelphia in 1964, more or less contemporary with the Birmingham marches.  So reading about Philadelphia in the autumn of 1981 is more or less like reading about my own hometown in the vicinity of 4th grade or so.  Combine that with recruiting in an age with no Twitter, no 7-on-7 camps, no Rivals rankings, no national high-school All-American games, no endless hat games broadcast live by ESPN on National Signing Day…

I mean, think about it.  This is an era where national sports coverage realistically means Sports Illustrated, The Sporting News, and ABC’s Wide World of Sports. There’s no SEC Media Days with more credentialed reporters flocking to Hoover than attend the Super Bowl’s media day; instead, a bunch of beat writers would cram into a rickety DC-3 and touch down in each of the ten SEC towns to see the teams and coaches individually (and the SEC Skywriters Tour has since passed into legend).  For a single high school player to rate that kind of national attention was literally without precedent, and The Courting of Marcus Dupree does an amazing job of showing how a small Southern town, still scarred from the civil rights era, finds itself through the looking glass because of one 17-year-old.  They had no idea how to handle recruiting mania, because the mania hadn’t existed before.

Really, that’s the appeal: at root, The Courting of Marcus Dupree is about a small isolated Southern community having to adapt to the modern world, one halfback sweep at a time.  And yet, for the first time that I can remember, it actually made me a little tiny bit homesick for the idea of a small pastoral town, leaves turning, high school football as the focus of everything, where the “coffee shop” is in fact a diner and the sports talk comes from guys at the counter arguing over what was in the paper and what they hear (the evolution of “What do you hear?” as the greeting of choice is a particularly salient and entertaining point). No social media, no 24-hour cable news and sports, something quiet and manageable.

I’d go crazy inside of a week, I know.  At least, I think I know.

Ad the penny drops.

Not a typo. Effective today, Google has begun introducing advertising into Google Maps. This is why we have Apple Maps: because ads were the price of introducing vector maps and turn-by-turn for the iOS version of Google Maps. Apple said no, took the bullet, and the mapping solution in iOS is still recovering.

Not to say that Apple might not try to monetize its location data in some future version of iOS. Anything’s possible. But Apple is still out to sell you atoms, not bits, and as long as that’s the plan, they’re not going to give away their bits for somebody else to sell. I have no doubt that this news will drive a lot of iPhone users to give Apple Maps another look – and a year on, they might like what they see better than they did on first release.

The Secret World

…is a massively-multiplayer online role-playing game, and it’s the first one that ever actually made me consider signing up.  Largely because it’s not Yet Another Dungeons And Dragons Knockoff.  Between Everquest and World of Warcraft, it’s tough to find any sort of RPG that isn’t caught up in the same old swords-and-sorcery Tolkien knockoffs.  But The Secret World is set in the present day.  Yes there’s magic and such, but there’s also international conspiracy and H.P. Lovecraft and a Hollow Earth and…well, imagine equal parts of Raiders of the Lost Ark and At The Mountains of Madness and you’ll get there.  It doesn’t hurt that two of the factions are Templars and Illuminati (and based on the way the game operates, I can say without hesitation or fear of contradiction that I am all Templar.)

Part of the appeal is just that it’s different.  And part of the appeal is that it apparently doesn’t rely on the endless level-grinding or guild-based dungeon-raiding that other games do; you don’t have to book hours to go clomping along with a bunch of other people in order to play.  The asocial element is kind of appealing, actually.  I think that’s a bit of what drew me to Ingress however briefly – while it’s a huge game of massive factions locked in combat, you actually do all the work by yourself.  I just couldn’t bring it up quickly enough to make it worthwhile, especially with a dodgy side-loaded client.

I guess that’s the genre I’m into now: somewhere between urban fantasy and magical realism. Kentucky Route Zero might be the best example, with its weird and surreal things simmering under the surface of the ordinary world – and that’s as simple a game as you could ask for.  It’s more interactive novel than game, to be honest.  And then, there’s the TV show I’ve unearthed on Netflix and started watching again after twenty years…but that’s a whole post of its own, of which etc etc you know the drill. 

It’s time for something new.  Well, new-ish.  Vampires are played all the way out, the endless Star Trek and Star Wars knockoffs are running out of steam, the superhero genre is starting to feel long in the tooth for everyone that isn’t Marvel Studios, why not break out the magical realism? Skip the Hogwarts stuff and just go for that creeping strangeness that we call magic because we don’t have another explanation for it…after all, isn’t that what the entire genre of magical realism was created for? To work out those things in print that we can’t talk about in polite society?

Or maybe there are just days when you need a demon clawing its way out of the Earth, and a shotgun full of silver pellets, because you just need to be able to shoot something dead with a clear conscience.

Striking

The thing I keep coming back to about the imminent BART strike is the same thing I go to with all manner of public-employee labor disputes: these are the last people left who haven’t yet been screwed.  People talk about their zero-contribution defined-benefit pensions as if this is some sort of mystical ice cream land – when this used to be a regular thing.  Seriously. I still have a piece of one myself from my first job, which I started in 1997.  With an old and conservative and slow-to-change organization, sure, but we’re not talking about some sort of cloud-cuckoo-land.

In fact, what we’re talking about is a lot of what kept people working in public-service jobs for a long time: the promise that you were giving up cash up front in exchange for a secure and reliable retirement.  Corporate America realized it was cheaper and easier to match some money up front, trumpet the ownership society, and make people do their own retirement financial planning (or better yet, line the pockets of the financial services industry to kinda-sorta-halfass it).  But for the most part, government retirement stayed the same: with no shareholder value to maximize, and with things like collective bargaining and workplace rights protected in ways the private sector long since circumvented, the public sector remained an echo of the days when you could stay with an employer for thirty years and retire in good stead.

Supposedly, BART employees haven’t had a raise in four years. That’s the other annoying thing: the idea that this notional platinum-encrusted retirement is so generous that asking for a raise is the height of impudence.  But then again, that’s the Southern way: don’t worry about your penurious circumstances now, because you’ll get milk and honey and fried catfish in Heaven.  Oddly enough, this deal is never that attractive to the ones offering it.  But here on Earth, you can’t pay your mortgage with the promise of a gainful retirement – shit costs money. And prices haven’t really been dropping these last four years – gas is still over $4 a gallon and the cable bill never goes any lower.  For all the Fed’s and the Street’s fretting about inflation risk and damping it down, it hasn’t made stuff cheaper.

No, what happened was that there were jobs that paid and jobs that didn’t.  The jobs that paid got moved to where they could pay people less, and the jobs that didn’t pay so well – the sorts of things filled by high school kids on summer break – stuck around.  We hollowed out that space where people without a professional degree could make a living and raise a family on one income.  There are only a few such jobs left, and the people who have them are going to defend them to the very last.  As well they should. “We already screwed everybody else, now it’s your turn” is no reason to accept a screwing yourself.

The Warning Shot

So the Moto X is here.  It’s attractive, it’s innovative in its way, it’s not particularly future-proof, and it’s a lot more expensive than people were anticipating.  And that last bit, while most disappointing/relieving to me, may very well point up a fundamental problem.

See, one of the selling points of the Moto X is that it’s actually put together in America.  Some of the components are imported, of necessity, but the phones themselves get made in Texas, in an old Nokia plant.  It’s the closest thing in years to a legitimately American-made mobile phone.  And it’s got mid-range specs and a high-end price.  And there are some rumblings that this is the inevitable cost of making a phone in America – and by implication, that the current pricing level for high-tech gadgetry is entirely a function of cheap Chinese wages, and that moving away from that could mean a doubling of costs.

That’s as may be.  The fact is, almost every high-tech manufacturer had decamped to China by the early 21st century (Apple’s own PowerBook manufacturing left Cork for OEM production by Quanta sometime around 2000).  It’s not unlike what happened to the textile industry over a century and a half of seeking cheap labor – clothing moved from the mill towns of New England to the Deep South, and once modern OSHA regulations and labor laws reached the former Confederacy, the industry promptly chose to up sticks and move into Central America, and eventually as far as South Asia. Bangladesh recently experienced a loss of life an order of magnitude worse than the legendary Triangle Shirtwaist Fire of 1911, and that caught the attention of the rest of the world and pointed up the fact that it’s not just high tech: mass manufacturing as a whole has been moved to where labor is cheapest and oversight is slightest.

The thing about technology is this: as a result of ten years of manufacturing in Shenzhen, all by the same half-dozen companies like Foxconn or Pegatron, an infrastructure has grown up around that manufacturing process that provides economies of scale. Setting up shop to make iPhones or iPads in the United States would entail moving the supply chain for parts, or else shipping them halfway around the world (much as Motorola may have to do with the X), and then building the infrastructure to do large-scale assembly of delicate parts. Almost from scratch, at that, since Silicon Valley manufacturing mostly evaporated twenty years ago.  Not for nothing is this Motorola plant in Texas, which leads the country in “business is always right and taxes are always wrong” levels of regulation and workplace safety; the South has always been the site for manufacturing of last resort before leaving the US altogether.  And even with all that, the Moto X’s announced cost at point of purchase is nearly double that of similarly-configured hardware from other manufacturers.

There are other theories around this. The most prominent is that Motorola is attempting to send the signal that because this phone costs $200 on contract, it is a peer of other phones that cost $200 on contract, like the HTC One or Samsung Galaxy S4 – phones that have much more advanced technical specs. Or a peer of the iPhone, which is similarly equipped but almost a year old and due to be replaced with a more advanced model by the end of autumn.  Maybe it’s the designer jeans approach – if you charge $200 for it, people will think it’s worth $200.

But it’s also entirely possible that this is the price of “made in America”.  Just like the Levi’s 501 jeans that cost $40 off the shelf or $178 for the LA-made, North Carolina-denim pair, it may be that what we think of as a mid-range Android phone just has to cost twice what we’re used to if we want to put it together in the United States instead of whatever the cheapest sweatshop in Shenzhen is this month.  Maybe if a lot of different companies went that route – if we had the Fort Worth Special Economic Zone and went nuts with the tax incentives and unemployment was high enough that the necessary skilled labor could be had at a minimal wage and a whole lot of gadget makers started making a whole lot of gadgets here – you might eventually achieve some economy of scale and drive down the cost.

But for whoever goes first, it’s going to be an expensive proposition, and nobody is patriotic enough to take a bath on their earnings while rebuilding the sector. More to the point, Wall Street won’t reward you for the long-term thinking, because they’re not capable of seeing a world beyond the next quarter.  So for Motorola to do this at all, they’re going to have to be able to do some magic – using patriotism and multi-colored shells to convince Ed Earl Brown to buy a quarter for fifty cents.  Whether enough people are willing to buy into the magic…well, I guess we’ll find out.