Take the charge

The one thought I keep coming back to with Facebook Home – more so than having your personal life whored out to every ad buyer on earth, more than the prospect of seeing your friends’ most compromising pics as soon as you pull your phone out on the train – the thing I latch onto is this: unless you get the HTC First, which is apparently dumping Sense in favor of FBH, you’re most likely going to be running FBH over top of some other middleware layer (Sense, MOTOBLUR, TouchWiz) which itself is running over Android. And I think of one thing: battery.

This is the problem. I mean, the problem. Faster processors, more RAM, bigger and higher-res screens, more complex operating systems and apps, and ever-faster cellular data – battery technology hasn’t budged in about six years, since the days when the lithium polymer in a Sony Ericsson Z520 lasted me four days between charges. Then again, it was pushing a tiny screen and no data service to speak of. Hell, most of its power went toward changing themes and flipping between ringtones. No RSS, no Twitter (except via SMS), no web browsing or Instagram or podcasts or streaming March Madness action. And the only real advance in battery life…is to make it bigger. In fact, that’s half the reason Android phones all grew to 5-inch screens: more room for battery in an OS not known for power management.

I accept that I need to plug the iPhone in at the end of the day. Time was, that was enough – a normal workday meant the phone went down to about 30%, but that was good enough. Then came the retina display, and live podcast updating instead of syncing off the laptop, and then Verizon and LTE…and now I have a charge cable at my desk and a spare charge pack in the bag and, most recently, an actual iPod shuffle for reliable standby music once the podcasts are done. It’s almost enough that I can get through the day with battery left to go to dinner or something and not be completely wiped. It’s also why I stick to the Kindle for reading at home rather than use one of the iOS devices. Using two devices – three, now, on Tuesday nights – might seem woefully inefficient, but the Kindle charges once a week and the dumb phone once a month. And there’s something to be said for never having to worry about charge.

So now we’re back to Facebook Home, which is essentially replacing your phone’s UI with a live feed of your FB account. Status updates. Pictures. Chat. All in real time, or something close to it, and no doubt running location services the whole way (how else to sell geolocation-connected information to advertisers? Because yes, Zuck admits as much, there will be ads in your feed in FBH eventually). All this plus Android’s well-known fruit-fly battery performance? You’ll be plugging in the phone twice a day minimum.

And that’s what will do for them. Not privacy concerns from the kind of user base that would have equated AOL with the Internet fifteen years ago. Not performance issues from heaping superfluous code on superfluous code. What’s going to make Ed Earl Brown chuck his FacePhone out the truck window is when he plugs it in at lunchtime to top up and still plugs a dead phone into the cigarette lighter for the drive home. It’s the same reason the first iPhone didn’t ship with 3G capability, or a higher-res screen, or an app framework or any kind of user-alterable code: because if the 2007 iPhone couldn’t make it through the whole day, there probably would never have been an iPhone 3G.

Mark my words, Facebook Home will live or die on one metric: whether your battery does.

Spencer Hall is brilliant

Best block quote ever, part of a brief and incisive piece:


“A critical reader should not assume the word means anything. I grew up Catholic. In no way shape or form can you call me religious in any sense, but it does stick with us in a few ways. The most consistent one is an emphasis on action, and that belief without action is just pissing into a stiff wind and calling it a sunshower. Another is a lifelong focus on blood, violence, and the Old Testament, since that was the part we really enjoyed asking CCD teachers about. We were the worst CCD student ever, and apologize to our teachers for ever being there.”

Home, sick

Long story short: Facebook Home is Zuckface’s effort to get you to run the Facebook app as the UI of your Android phone.  Apparently it’s going to be a full middleware piece, designed to run all your chat through Facebook and ease your camera shots into Facebook and use your News Feed as your home screen and…

This was inevitable, really.  They have to get pride of place on your mobile device to maximize the amount of data they can collect and advertise against, and the middleware approach is a lot cheaper than coming up with an Android fork (like Amazon) or an entire phone OS (like Microsoft) or an entire whole-package phone (like Blackberry or Apple).  But the pitch of “make the Facebook app the UI of your whole phone” appeals to me not at all.  Nothing ever worked better by adding more plumbing.

(Of course, this all assumes you have Facebook at the core of your social strategy rather than “that one thing hanging out on the side that I check once a week to make sure nothing’s happened to my East Coast gang or my old high school friends” – which is increasingly not the case.  Facebook has a way of turning into an avalanche of your mother’s forwarded chain emails punctuated by ads.  I couldn’t function without Twitter, but I wouldn’t notice if Facebook disappeared tomorrow – and the risk that we’ve passed Peak Facebook is probably what’s driving the urgency to get onto mobile in an unmissable way as quickly as they can.)

Operation Clotheshorse

I suppose it began with the ill-advised acquisition of the Saboteur jacket – the waterproof silk-lined sport coat I bought in 2010. Way too expensive and a hair too small, and ultimately not quite right. But it was the start, and it broke the seal.

I suppose things were made worse by the arrival of Team Black Swan East, who brought their Southern sense of dressing well and forced us to step up our game a little. I bought a pair of American-made Lucky Brand jeans, then a pair of American-made Bill’s Khakis, then broke down and finally bought that seersucker sport coat just in time to wear it to New York and buy two Uniqlo blazers for $60.

That really did it. I had the peacoat. I got another pair of Lucky jeans, then bought three more pair of my old reliable 501s – each in a more fashion-forward wash than before. I obtained an actual seersucker suit. My first new pair of Docs in over three years was chukkas. My key Christmas present was the Levi’s-Filson tin cloth trucker jacket I had coveted for some time. And this week, I dropped $300 on hand-sewn American-made gray suede wingtips, the most I’ve ever spent on a single pair of shoes in my life.

So why? The guy with the famously predictable DC wardrobe, who wore the same model jeans with the same pair of workboots for five years in California, who owned 11 pair of Dr Martens and half a dozen black or gray American Apparel T-shirts and a bunch of Vanderbilt gear – why the sudden onset of boat shoes and Palladium boots and new colorful button-ups?

Maturity, possibly. Vandy Lifestyle, for one – need to look the part. Continued life and work in Silicon Valley, for another – I have the old EUS urge to look a cut above the average paste-eating neckbeard in this industry. For a third, as the old campaign poster said in high school, clothes make the man – naked people have little or no influence on society.

Maybe this is all part of the regeneration. This is just who I am now. It’s not the cheapest hobby, I suppose, but when your fixation is finding stuff you can wear for the rest of your life, it’ll pay out over time.

The Quickening

So Google killed off Reader with only three and a half months’ lead time, to the outrage of the entire Internet.  And a week later, they dropped the all-new Google Keep, an addendum to Google Drive meant as an all-purpose note-bucket…which everyone immediately saw as aimed squarely at Evernote, the best-of-breed all-everything data repository (one I make a LOT of use of, including notes toward these very blog entries).

Add to that the rumblings that there’s now a Google Watch in the offing – not surprising, but given the focus on Google Glass, it seemed as if the Beast of Mountain View was going to put most of its “wearable computing” eggs in the “eyewear” basket–

Hold up.

That phrase.  The Beast of Mountain View.  Shouldn’t that be the Beast of Redmond?  Nope.  In fact, one Mac partisan has gone so far as to declare that Microsoft is no longer the enemy.  Sure, Windows is still out there and PCs running Win7 and its predecessors are a dominant ecosystem, but in the post-PC world? Microsoft is an afterthought.  Its digital music offerings ranged from worthless to risible.  Its phone offerings were too little, too late, and while intriguing, they currently have little prospect of rating better than a poor third in the smartphone world.  It’s not aiming to dethrone Android or iOS; it’s aiming to lap Blackberry.  The Xbox – itself starting to get a bit long in the tooth – is the only spot off the desktop where Microsoft has made a dent.  And their attempt to repurpose and rebrand Hotmail as Outlook.com, complete with Google-bashing commercials, was roundly mocked and ultimately dismissed.

Meanwhile, Google’s push into new areas continues.  G+ remains a point of emphasis to compete with Facebook.  Google Play continues to be pushed as the alternative to the iTunes Music Store or Amazon (especially given the irony that Amazon built its own tablet ecosystem on Android). Google Apps for Business have reached the point where one of the hottest startups in Silicon Valley is running its entire ecosystem on them – eschewing Office, Exchange and Outlook altogether.  The Nexus line now includes a 4″ phone, a 7″ tablet and a 10″ tablet to cover the entire range of touch devices, and there’s a nice slim metal ultrabook-style Google Chromebook. And now there’s apparently going to be a mythical Google watch, to go with the mythical iWatch and mythical Samsung watch.

Google has become the new Microsoft again – they were truly indispensable for years in the field of search, especially map search, and Gmail overwhelmed Yahoo and Hotmail and all its other competitors, and one thing led to another and now way too many of us couldn’t get by without at least one Google service.  And tragically, somebody just beat me to this post while it sat in my drafts folder, so here is his take – and he’s spot on.

The fact that people are talking about how to live without Google should be the most disconcerting thing of all for the powers that be out at 1600 Amphitheatre Parkway.  For my own part, now that Reader won’t be a thing anymore, it’s possible for the first time to contemplate life completely free of the big G.  Hell, Apple Maps got me to my physical therapist this morning bang on time and without a hiccup.  2013 may turn out to be the year that Google got added to the old EUS creed: “We don’t drink flavored liquor, we don’t smoke machine-rolled cigars, and we don’t put mission-critical work on Microsoft products.”

The examined life

The bit of Avenue Q that always makes me gulp very hard is the line in “I Wish I Could Go Back To College” (hell, that song is triggering enough) when Princeton sings “I wish I had taken more pictures…”  Because there are almost no photographs at all of my seven years indentured to higher education.  In fairness, a lot of that stems from the fact that there wasn’t very much worth taking pictures of, and my lack of memories is more painful than my lack of evidence – but I don’t remember even having a camera between high school and the ill-starred purchase of one of those Advanced Photo System cameras in 1997, which itself barely lasted a year before going askew under circumstances unremembered. (I chalk it up to 1998 generally.)

My then-girlfriend bought me a digital camera around…when? 2002? 2003? It was a birthday present, and it got some use, though I don’t know what ever became of it.  I’m sure I took some shots on the honeymoon trip, but I don’t recall…but the point is, that was a 2.1 megapixel camera.  And in 2007, I took possession of my first iPhone…with its own 2 megapixel camera.

Now, it wasn’t as good a camera, obviously. No optical zoom at all, no flash, and certainly no prospect of video recording.  But it was a camera that I had on me.  As with flasks and pistols, so with cameras: the one you have on you when you need it is infinitely better than the far superior, feature-laden model sitting in your drawer back at home.  And yet, with 2 MP and poor focus and no flash and no HDR mode to help clean up and no video for a pinch, “I have it with me” was the only real feature it presented.

If you look through my iPhoto library, though, there’s a huge spike in pictures in 2010, even correcting for the import of my wife’s pics of the trip to Europe that year. (When you correct for her pictures during The Summit, the historic weeklong visit of Team Black Swan East en route to their eventual emigration, 2009 is of a piece with 2008.) No, in mid-2010, I upgraded to the iPhone 4, which offered HD video capture, an LED flash, and – most importantly – an improved auto-focus-capable 5 MP camera.  The picture totals go WAY up.  And then in 2012, the phone gets warranty-replaced with the iPhone 4S, which improved to an 8 megapixel camera, and that’s when I start taking pictures all the time.  Sure, you could probably adjust for Twitter and Path and Instagram and Facebook, but the presence of a point-and-shoot replacement in my pocket at all times meant that I finally started taking more pictures.  Landscapes. Cocktails. I have a better record of my life since 2011 than I have of the entire 1990s.

Too, I have blog records going back to 1999.  Not always current, not always frequent, but my life now is on record in ways that it wasn’t before my great regeneration at age 25. And it’s been surprisingly helpful to be able to go back to, say, late 2003 or early 2004 or most of 2007 and 2008, and compare history and see if I’ve learned anything.  But equally important to me, in some ways, is the fact that I have a past now. The black hole has been pushed back; the void in my life is reduced.  I have memories, I have proof, I have fifteen years of experience to look back and say “I remember when” and things I can draw from and build on.  And that, on many levels, has slowly started to patch up the empty place that made me wish I could go back to college.

Appropriate Gestures

With the demise of Google Reader, I’m doing what everyone else is and exploring options.  Personally, I think the best move for iOS is to leverage iCloud and just have Reeder store your OPML file there, but in the meantime, the Internet’s preferred and chosen solution is Feedly.  Which seems nice enough on the website, but the app is unnecessarily colorful and swipey at the expense of usefulness.  Double-tapping an article dismisses it, swiping the wrong way marks everything unread, and the controls are vague and not really that granular…

Which brings us to the swipe problem generally.  Gestures on a phone are more or less standard, at least in the iOS world: pinch-to-zoom, the pull-to-refresh invented by Loren Brichter, and the multi-finger swipe for app-switching on the iPad (or the multi-finger pinch to get back to the home screen).  Beyond that, gestures seem to be defined on a per-app basis.  Reeder, for instance, seems content to scroll cleanly into the previous or next article, and lets you specify what a left/right swipe on an article will do.  With Feedly, it’s unclear what’s a scroll down, how you get to the next article, what suddenly dumps you back into the list of articles – at least in Reeder, there’s a steady sense of back-and-forth, the way there was in the Twitter app for iPad before they chose to fuck it up instead.

The swipe seems to be the new hotness.  Things like Sparrow and Clear for the iPhone are heavily gesture-based, and the new Blackberry OS doesn’t even have a physical button – it’s all swipes. I think a lot of this is people still trying to reproduce Minority Report, but a lot of it also seems to be an attempt to carve out a distinctly different (and presumably patentable) interface paradigm.  Given Apple’s lack of success at trying to patent the multi-touch UI (which was, at the time, a genuinely unique proposition in phones), my suspicion is that any such patent attempt would be the height of foolishness, but it might be enough for prospective players just to be significantly different.  Blackberry’s Z10 (and they seem to finally have adopted Blackberry as the company name rather than RIM) is pushing its swipe-GUI as its unique selling proposition – probably too little too late, but a sign of the times.  Their device, like the HTC One, is also a largely metal affair, as is the BLU Life One, a genuinely interesting proposition – a contract-free, stock-Android, unlocked device for $299 that isn’t a Nexus?  Only Samsung is sticking by their plastic – but then, Samsung is bigger than any other Android player at this point and probably deserves its own post later.

In any event, it’ll be interesting to see the evolution of the gesture-based UI, especially in a world where everyone is waiting for the iWatch and/or Google Glass.  How do you control user interaction when there’s no keyboard or mouse and practically no screen at all?  Is the time of voice finally upon us?  Have Siri and Nuance’s Dragon line and Google’s excellent voice search interpreter finally broken the Star Trek barrier?  And how well will these interfaces cope with a gravelly drawl?  These are important issues.

Live and let diet

I’ve never been on a diet before.  Oh, I’ve given up stuff for Lent in varying quantities since college, and my first college girlfriend didn’t eat red meat so I pretty much didn’t about half the time for three years.  And when the Tired Texan closed I wouldn’t eat McDonald’s for a year.  And I tend to abide by the dietary limitations of whoever prepares dinner in accordance with their own restrictions.  But in terms of serious lifestyle change? Nothing.

Then I had my annual physical and bloodwork, on the heels of the nutritional shitshow that was the February project, and all the numbers were worse than last year.  HDL, LDL, triglycerides, ratio, blood pressure, waist circumference, percent body fat, weight, even *height* – all worse.  Time to make a change.

Over Christmas break, the wife had me read about the Primal diet, which is apparently a variation on the Paleo diet, which is all about eating unprocessed foods and so on, as our caveman ancestors did, which…eh. Once I got past all the exclamation points (a hallmark of all diet literature, in my experience) and tales of Gonk or Gronk or whatever, it boiled down to a lot of the same low-carb stuff previously pitched as Atkins, or South Beach, or what have you.  With an extra helping of “Soylent Green is gluten!  IT’S GLUTEN!!” – which, given that my wife’s had a known gluten allergy for fifteen years at least, means it’s probably the smart way to go for her.

Okay, so how does this work out for me?  For the first week, at least, hardcore:

* No booze. (Ulp.)

* No more Coke Zero – in fact no more processed soda at all.  Just what I fizz up in the Sodastream with some lime juice and/or a dash of bitters maybe.  Certainly nothing pre-packaged and nothing with syrups involved.  Given the amount of Coke Zero I’ve gotten through these last couple of years, that’s no small sacrifice, especially in a world of Coca-Cola Freestyle machines.

* No more junk food.  Nothing out of a vending machine. Nothing pre-packaged to speak of. No grazing on sweet stuff.

* No bread.  No sandwiches, no rolls, no empty calories from starch.  (Effectively, for my purposes anyway, this more or less adds up to “no fast food, period.”)

* No more sweetening the coffee.  Honey, perhaps, but no three packets of sugar or Splenda or what have you.

* Permissible beverages: coffee (black), tea (unsweet), and whatever bottled products I can find that use no sweeteners aside from stevia and the sugar alcohol that normally goes along with it whose name escapes me.  And more water. Lots more water.

Result?

The first week ended up being more like the first 10-12 days, as it turns out.  I was forced to actually walk out to the cafeteria to eat rather than run through Chipotle or just subsist out of the “automated convenience store” at work, and I was actually kind of hungry the first couple of days.  The first week saw me down 7 pounds, almost entirely water weight I’m sure, because I was legit dehydrated (not least because I was probably drinking coffee in the morning where normally I would have consumed some Zero).  And the toughest part was that it went along with a lot of work bullshit, the sort of thing where I would normally just say “to hell with it” and go get a bottle of Zero or some Pop Tarts and tap out for a while the same way I did at the cigar shop ten years ago.

Ultimately, I think I may do more stress-eating than I realized.  The deprivation, for me, is less about “I can’t have this tasty Pop Tart” and more about “I have enough stress in my life, do I really need to add to it by inflicting this kind of disciplinary deprivation upon myself?”  And flying off the handle isn’t nearly as much exercise as you’d think.  Nevertheless, I did power through.  It was Day 13 of the diet before I finally broke down and had myself a couple cans of Zero (at the end of a long and frustrating day where I basically served my guests Pork Shoulder a la Crematoria). That was also the day I had my first alcohol of the whole run – a glass of a nice cab plus a bottle of grapefruit radler (German biking beer) after the fiasco. (A Bushmills at dinner on St Paddy’s was the only other alcohol I’ve had.)  Last night, Day 16, I had In-N-Out for dinner – a 3×2, plain, no fries or salad on it.  On two occasions, I broke down and had a nice gelato bar (Recchiuti burnt caramel, AMAZING), which itself contains fewer carbs than the burger bun alone.  On Day 9, I went to an Irish bar for three hours and had no Guinness, no chips, no curry, no apple pie – just unsweet tea, broccoli, and a half-pound hamburger patty with cheddar and Irish bacon and no bun.

I lost 7 pounds the first week and a couple more since, for what that’s worth.  But weight was never really the concern for me – I want two things out of this diet: a smaller gut and better cholesterol numbers.  So far, I don’t feel like the gut has moved that much, but it may be a gradual thing, or it may take somebody else noticing.  And I won’t know what’s happening with my cholesterol until I give blood and get the basic numbers again to see whether things are declining to a more reasonable level.

Still, as lifestyle changes go, this one appears to be sustainable.  Simple blunt rules – no junk food, no fast food, no endless bottles of drink, no loading up on filler, and prefer red wine if it’s drankin’ time.  That sort of thing is a lot easier for me to remember and stick to than trying to keep track of total carbs or Points™ or constantly looking up nutritional information and ingredient lists online.  Pizza: out, hamburgers: out, burritos: out, Imperial pints of oatmeal porter: out. I don’t have to think too much about it, I just have to abide by the key rules.

The good news: as a platelet donor who can give them a double unit every week, it’s pretty straightforward for me to get that blunt-object cholesterol number checked regularly and see what impact the diet is having.  After that, it’ll be time to start working in more exercise (not to mention rehab on my bum shoulder) and seeing what, if anything, can be gradually added back in.  Pints of Guinness, or slices of Big Sur at Pizza My Heart washed down with lashings of Orange Vanilla Coke Zero, are not out of my life altogether – but they’re not going to be a routine feature either.

64-48

IF I HAD THE WINGS OF AN EAGLE
IF I HAD THE TAIL OF A CROW
I’D FLY MY ASS OVER KENTUCKY
AND SHIT ON THOSE BASTARDS BELOW!!!!

Let’s go steal an industry

I can’t believe that John Rogers, the creator and show runner of Leverage, wasn’t thinking ahead when he wrote this – in 2005. Long before Hulu, before Netflix streaming, before Rob Thomas could raise $2,000,000 in a day for a Veronica Mars feature film from fan contributions.  But John Rogers is a very, very, very smart man.

Read the whole thing and marvel, but for now, this:


The simple, hard-ass center of the new media revolution is that, in order for a show to show a profit on TV in the old model, it needs to stay on the air. To stay on the air, in order to generate enough perceived value for advertisers (for the network) and syndicates (for the studio), a show needs, regularly, ten million consumers a week. Five or seven on a smaller network.

In order for a show to create a profit on DVD (the fat pipe model of the present), it needs one million consumers.

There are a whole lot more risks one can take down here when you only need a million consumers. My proposal, actually, is that the better new media model (as the pipeline broadens, and the BigC’s lose more and more control over both distribution systems and the perception game) is of an insurgent, cell theory of entertainment. (*cable TV is a primitive form of this. Discuss).

It makes more sense for a BigC to cultivate a large number of small, streamlined productions, each of which cultivate a passionate (insurgent) fan base who will make multiple purchases of the entertainment product, than to continue to try for the largest common denominator. In effect, the first BigC who gives up will win. And win big.