Appropriate Gestures

With the demise of Google Reader, I’m doing what everyone else is and exploring options.  Personally, I think the best move for iOS is to leverage iCloud and just have Reeder store your OPML file there, but in the meantime, the Internet’s solution of choice is Feedly.  Which seems nice enough on the website, but the app is unnecessarily colorful and swipey at the expense of useful.  A double-tap in an article dismisses it, a swipe the wrong way marks everything unread, and the controls are vague and not really that granular…
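
For what it’s worth, the plumbing for that iCloud idea is not complicated – an OPML subscription list is just a small XML file, and iCloud document storage will happily sync one. A minimal sketch, with a made-up file layout and feed list (nothing Reeder actually ships):

```swift
import Foundation

// A minimal sketch of "just put the OPML in iCloud": the file name and
// container layout are made up, and a real app would need iCloud
// entitlements plus conflict handling. Not Reeder's actual code.
func saveSubscriptions(opml: String) throws {
    let fm = FileManager.default
    // nil = the app's default ubiquity container (call off the main thread).
    guard let container = fm.url(forUbiquityContainerIdentifier: nil) else {
        throw CocoaError(.fileWriteUnknown) // iCloud unavailable
    }
    let docs = container.appendingPathComponent("Documents", isDirectory: true)
    try fm.createDirectory(at: docs, withIntermediateDirectories: true)
    try opml.write(to: docs.appendingPathComponent("subscriptions.opml"),
                   atomically: true, encoding: .utf8)
}

// The payload itself is tiny - OPML is just XML, one <outline> per feed.
let opml = """
<?xml version="1.0" encoding="UTF-8"?>
<opml version="1.0">
  <head><title>Subscriptions</title></head>
  <body>
    <outline type="rss" text="Example Feed" xmlUrl="http://example.com/feed"/>
  </body>
</opml>
"""
try saveSubscriptions(opml: opml)
```

Anyway – sync is the solvable part of this; the gestures are what bug me.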

Which brings us to the swipe problem generally.  Gestures on a phone are more or less standard, at least in the iOS world: pinch-to-zoom, the pull-to-refresh pioneered by Loren Brichter, and the multi-finger swipe for app-switching on the iPad (or the multi-finger pinch to get back to the home screen).  Beyond that, gestures seem to be defined on a per-app basis.  Reeder, for instance, is content to scroll cleanly into the previous or next article, and lets you specify what a left or right swipe on an article will do.  With Feedly, it’s unclear what counts as a scroll down, how you get to the next article, or what suddenly dumps you back into the list of articles – at least in Reeder, there’s a steady sense of back-and-forth, the way there was in the Twitter app for iPad before they chose to fuck it up instead.
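
None of this is mysterious under the hood, which is what makes the vagueness so annoying – UIKit hands every app discrete swipe recognizers, and mapping them to user-chosen actions is a settings screen and a switch statement. A rough sketch (the action list and bindings are invented, not Reeder’s actual code):

```swift
import UIKit

// Sketch of Reeder-style configurable swipes. UISwipeGestureRecognizer is
// stock UIKit; the SwipeAction enum and default bindings are invented here.
enum SwipeAction { case markRead, star, nextArticle }

final class ArticleViewController: UIViewController {
    // What the user chose in settings for each direction (assumed defaults).
    var leftAction: SwipeAction = .markRead
    var rightAction: SwipeAction = .star

    override func viewDidLoad() {
        super.viewDidLoad()
        for direction in [UISwipeGestureRecognizer.Direction.left, .right] {
            let swipe = UISwipeGestureRecognizer(target: self,
                                                 action: #selector(handleSwipe(_:)))
            swipe.direction = direction
            view.addGestureRecognizer(swipe)
        }
    }

    @objc private func handleSwipe(_ gesture: UISwipeGestureRecognizer) {
        // One switch statement stands between "swipey" and "predictable".
        switch gesture.direction == .left ? leftAction : rightAction {
        case .markRead:    print("mark article read")
        case .star:        print("star article")
        case .nextArticle: print("advance to next article")
        }
    }
}
```

When every app wires that switch differently, though, your muscle memory is worthless – which is exactly the Feedly problem.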

The swipe seems to be the new hotness.  Apps like Sparrow and Clear for the iPhone are heavily gesture-based, and the new BlackBerry OS doesn’t even have a physical button – it’s all swipes. I think a lot of this is people still trying to reproduce Minority Report, but a lot of it also seems to be an attempt to carve out a distinctly different (and presumably patentable) interface paradigm.  Given Apple’s lack of success at trying to patent the multi-touch UI (which was, at the time, a genuinely unique proposition in phones), my suspicion is that any such patent attempt would be the height of foolishness, but it might be enough for prospective players just to be significantly different.  BlackBerry’s Z10 (and they finally seem to have adopted BlackBerry as the company name rather than RIM) is pushing its swipe-GUI as its unique selling proposition – probably too little too late, but a sign of the times.  Their device, like the HTC One, is also largely metal, as is the BLU Life One, a genuinely interesting prospect – a contract-free, unlocked, stock-Android device for $299 that isn’t a Nexus?  Only Samsung is sticking by their plastic – but then, Samsung is bigger than any other Android player at this point and probably deserves its own post later.

In any event, it’ll be interesting to see the evolution of the gesture-based UI, especially in a world where everyone is waiting for the iWatch and/or Google Glass.  How do you control user interaction when there’s no keyboard or mouse and practically no screen at all?  Is the time of voice finally upon us?  Have Siri and Nuance’s Dragon line and Google’s excellent voice search interpreter finally broken the Star Trek barrier?  And how well will these interfaces cope with a gravelly drawl?  These are important issues.

Live and let diet

I’ve never been on a diet before.  Oh, I’ve given up stuff for Lent in varying quantities since college, and my first college girlfriend didn’t eat red meat, so I pretty much didn’t either, about half the time, for three years.  And when the Tired Texan closed I wouldn’t eat McDonald’s for a year.  And I tend to abide by the dietary limitations of whoever prepares dinner, in accordance with their own restrictions.  But in terms of serious lifestyle change? Nothing.

Then I had my annual physical and bloodwork, on the heels of the nutritional shitshow that was the February project, and all the numbers were worse than last year.  HDL, LDL, triglycerides, ratio, blood pressure, waist circumference, percent body fat, weight, even *height* – all worse.  Time to make a change.

Over Christmas break, the wife had me read about the Primal diet, which is apparently a variation on the Paleo diet, which is in line with eating unprocessed foods and the like, as our caveman ancestors did, which…eh. Once I got past all the exclamation points (a hallmark of all diet literature, in my experience) and tales of Gonk or Gronk or whatever, it boiled down to a lot of the same low-carb stuff previously pitched as Atkins, or South Beach, or what have you.  With an extra helping of “Soylent Green is gluten!  IT’S GLUTEN!!” – which, given that my wife’s had a known gluten allergy for fifteen years at least, means it’s probably the smart way to go for her.

Okay, so how does this work out for me?  For the first week, at least, hardcore:

* No booze. (Ulp.)

* No more Coke Zero – in fact, no more processed soda at all.  Just what I fizz up in the Sodastream with some lime juice and/or maybe a dash of bitters.  Certainly nothing pre-packaged and nothing with syrups involved.  Given the amount of Coke Zero I’ve gotten through these last couple of years, that’s no small sacrifice, especially in a world of Coca-Cola Freestyle machines.

* No more junk food.  Nothing out of a vending machine. Nothing pre-packaged to speak of. No grazing on sweet stuff.

* No bread.  No sandwiches, no rolls, no empty calories from starch.  (Effectively, for my purposes anyway, this more or less adds up to “no fast food, period.”)

* No more sweetening the coffee.  Honey, perhaps, but no three packets of sugar or Splenda or what have you.

* Permissible beverages: coffee (black), tea (unsweet), and whatever bottled products I can find that use no sweeteners aside from stevia and the sugar alcohol that normally goes along with it whose name escapes me.  And more water. Lots more water.

Result?

The first week ended up being more like the first 10-12 days, as it turned out.  I was forced to actually walk out to the cafeteria to eat rather than run through Chipotle or just subsist out of the “automated convenience store” at work, and I was actually kind of hungry the first couple of days.  The first week saw me down 7 pounds – almost entirely water weight, I’m sure, because I was legit dehydrated (not least because I was probably drinking coffee in the morning when normally I would have consumed some Zero).  And the toughest part was that it went along with a lot of work bullshit, the sort of thing where I would normally just say “to hell with it” and go get a bottle of Zero or some Pop-Tarts and tap out for a while, the same way I did at the cigar shop ten years ago.

Ultimately, I think I may do more stress-eating than I realized.  The deprivation, for me, is less about “I can’t have this tasty Pop-Tart” and more about “I have enough stress in my life, do I really need to add to it by inflicting this kind of disciplinary deprivation upon myself?”  And flying off the handle isn’t near as much exercise as you’d think.  Nevertheless, I did power through.  It was Day 13 of the diet before I finally broke down and had myself a couple of cans of Zero (at the end of a long and frustrating day where I basically served my guests Pork Shoulder a la Crematoria). That day also brought my first alcohol of the whole run – a glass of a nice cab plus a bottle of grapefruit radler (a German biking beer) after the fiasco. (A Bushmills at dinner on St. Paddy’s was the only other alcohol I’ve had.)  Last night, Day 16, I had In-N-Out for dinner – a 3×2, plain, no fries or salad on it.  On two occasions, I broke down and had a nice gelato bar (Recchiuti burnt caramel, AMAZING), which itself contains fewer carbs than the burger bun alone.  On Day 9, I went to an Irish bar for three hours and had no Guinness, no chips, no curry, no apple pie – just unsweet tea, broccoli, and a half-pound hamburger patty with cheddar and Irish bacon and no bun.

I lost 7 pounds the first week and a couple more since, for what that’s worth.  But weight was never really my concern – I want two things out of this diet: a smaller gut and better cholesterol numbers.  So far, I don’t feel like the gut has moved that much, but it may be a gradual thing, or it may take somebody else noticing.  And I won’t know what’s doing with my cholesterol until I give blood and get the basic numbers again to see whether things are declining to a more reasonable level.

Still, as lifestyle changes go, this one appears to be sustainable.  Simple blunt rules – no junk food, no fast food, no endless bottles of drink, no loading up on filler, and prefer red wine if it’s drankin’ time.  That sort of thing is a lot easier for me to remember and stick to than trying to keep track of total carbs or Points™ or constantly looking up nutritional information and ingredient lists online.  Pizza: out, hamburgers: out, burritos: out, Imperial pints of oatmeal porter: out. I don’t have to think too much about it, I just have to abide by the key rules.

The good news: as a platelet donor who can give a double unit every week, it’s pretty straightforward for me to track that blunt-object cholesterol number and see what impact the diet is having.  After that, it’ll be time to start working in more exercise (not to mention rehab on my bum shoulder) and seeing what, if anything, can be gradually added back in.  Pints of Guinness, or slices of Big Sur at Pizza My Heart washed down with lashings of Orange Vanilla Coke Zero, are not out of my life altogether – but they’re not going to be a routine feature either.

64-48

IF I HAD THE WINGS OF AN EAGLE
IF I HAD THE TAIL OF A CROW
I’D FLY MY ASS OVER KENTUCKY
AND SHIT ON THOSE BASTARDS BELOW!!!!

Let’s go steal an industry

I can’t believe that John Rogers, the creator and showrunner of Leverage, wasn’t thinking ahead when he wrote this – in 2005. Long before Hulu, before Netflix streaming, before Rob Thomas could raise $2,000,000 in fan contributions in a day for a Veronica Mars feature film.  But John Rogers is a very, very, very smart man.

Read the whole thing and marvel, but for now, this:


The simple, hard-ass center of the new media revolution is that, in order for a show to show a profit on TV in the old model, it needs to stay on the air. To stay on the air, in order to generate enough perceived value for advertisers (for the network) and syndicates (for the studio), a show needs, regularly, ten million consumers a week. Five or seven on a smaller network.

In order for a show to create a profit on DVD (the fat pipe model of the present), it needs one million consumers.

There are a whole lot more risks one can take down here when you only need a million consumers. My proposal, actually, is that the better new media model (as the pipeline broadens, and the BigC’s lose more and more control over both distribution systems and the perception game) is of an insurgent, cell theory of entertainment. (*cable TV is a primitive form of this. Discuss).

It makes more sense for a BigC to cultivate a large number of small, streamlined productions, each of which cultivate a passionate (insurgent) fan base who will make multiple purchases of the entertainment product, than to continue to try for the largest common denominator. In effect, the first BigC who gives up will win. And win big.

Fuck Google

Well, there goes that. Google Reader is going down in three and a half months, and the Internet is losing its collective shit. Not least because basically every RSS reader for mobile devices relies on Google Reader for backbone sync.

Be interesting to see who steps up, because from the outcry, there’s clearly a market of people who need this service. Marissa? In the meantime, if you recall, of all the Google services, the two I couldn’t do without were Maps and Reader. Well, Apple has Maps covered for mobile devices now…not perfectly, but well enough. It would now be possible to run Google-free – inconvenient, but possible, because the last irreplaceable part of the Google ecosystem has just pulled the plug, and the makers of those RSS apps will not just throw up their hands and say “oh for fuck’s sake” – Feedly and Reeder have already announced that they won’t go quietly, and Feedly is even working on an API that could be a drop-in replacement.

Meanwhile, one thing is obvious, and has been for a while: if you’re building a product that depends on Google for a critical function, you’re an idiot.
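
Or, to put it constructively: if your product has to live on somebody else’s backend, keep that backend behind an interface, so the day Google kills it you swap one type instead of rewriting the app. A sketch of the shape of it – the protocol and both backends here are hypothetical, not Reeder’s or Feedly’s actual code:

```swift
import Foundation

// Sketch: hide the sync backend behind a protocol so that when Google kills
// the service, you swap one type instead of rewriting the app. The protocol
// and both backend stubs are hypothetical, not any shipping API.
struct FeedItem { let title: String; let url: URL }

protocol FeedSyncService {
    func fetchUnreadItems(completion: @escaping (Result<[FeedItem], Error>) -> Void)
}

struct GoogleReaderBackend: FeedSyncService {
    func fetchUnreadItems(completion: @escaping (Result<[FeedItem], Error>) -> Void) {
        completion(.success([])) // ...the soon-to-be-dead Reader endpoints...
    }
}

struct FeedlyBackend: FeedSyncService {
    func fetchUnreadItems(completion: @escaping (Result<[FeedItem], Error>) -> Void) {
        completion(.success([])) // ...whatever drop-in replacement ships...
    }
}

// The rest of the app only ever sees the protocol; the backend is a
// one-line configuration choice.
let backend: FeedSyncService = FeedlyBackend()
backend.fetchUnreadItems { result in
    if case .success(let items) = result { print("\(items.count) unread") }
}
```

Feedly’s promised drop-in API is the same idea attacked from the server side: keep the interface, replace the company behind it.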

SXSW

Not much news coming out of the annual computing-hipster gangbang in Austin, unless you count the risible panel that argued that 50% of Millennials would rather have no job at all than a job they hate.  In related news, 50% of Millennials should be ground up into free meatloaf for the unemployed. If you can have horse in a meatloaf, you can have a horse’s ass in a meatloaf…but I digress.

The thing that gave SXSWi a name in Silicon Valley was its fame as the launching pad for Twitter in 2007 and Foursquare in 2009.  Twitter already existed, but the splash it made at the conference – essentially as a cheap and easy solution for blast group-texting – sent it on the rocket ride to where it is now.  Foursquare, meanwhile, added gamification to the social check-in functions of Dodgeball (from the same creator).  But Foursquare had an advantage: it happened After iPhone.  Once the iPhone could run apps and use its internal GPS for location services, it was possible to take the key social features of Dodgeball and abstract away the need to type in a place and location via text message.  Dodgeball was acquired by Google, died a slow, neglectful death, and was forgotten.  Foursquare conquered the location-based social networking space.  Smartphone Time again.

Problem is, these things burn out quick. No real news out of SXSW this year, nothing interesting or captivating. You can’t turn it on like a switch, and “South-by” has rapidly turned into another promotional clusterfuck, devalued by expansion and overexposure (TED talks, anybody?) – I suppose it’s nice to let Austin feel like it’s vital to the technology world once a year for a long weekend, but there’s a reason Apple, Google, Twitter and Facebook can all be found within fifty miles of one another.

Speaking of…it looks like Andy Rubin is out as head of Android within Google, to be replaced by him what runs Chrome OS.  If I had to bet, I’d say convergence is on the way, and not a moment too soon – Chrome OS is a nice idea but a niche application at best, and the Chromebook Pixel is the 20th Anniversary Macintosh of Google.  A convergence between Android and Chrome, with Chrome becoming the browser-based runtime for an Android environment that can go on Windows or Mac hardware…that would be interesting.

flashback, part 59 of n

When the Riverchase Galleria opened in February 1986, I had barely been aware it was coming.  At the time, the best thing going for malls in Birmingham was Century Plaza, a mid-70s two-level brick Brutalist monster with four anchor stores, a flat exhibition deck in the middle of the second story (suitable for setting up Santa Claus in December and not much else), a scattering of eateries (including the much-missed Hot Sam pretzels) and a general feel that you could just as easily be underground.  And the anchors – Sears, JC Penney, Pizitz and Rich’s – were pretty much set in stone, in ascending order of posh and respectable.

And then the Galleria opened.

The first hint that things were different came as soon as you walked into Parisian, the major Birmingham department store not represented at Century.  It was huge, airy, with a mezzanine level floating between its two stories, and in 1986 fashion, that level was loaded with nothing but Swatch watches and Coca-Cola rugby shirts.  All by itself, that would have been a revolution in local retail.

But then, you walked out into the mall itself…and it was open and airy itself, with a huge glass atrium (with neon accents!) running the entire length of the mall proper.  It couldn’t have felt more radically different.  The mall had everything I needed at the time – two record stores, two bookstores – but it also had an actual candy store, something unheard of in malls around our area.  It had a store selling nothing but video games (Electronics Boutique), it had a music box store (seriously), and in a stunning turn of events, it had a whole lot of places to eat right next to each other, with a common dining space around a huge fountain spraying three stories high into an atrium between the office tower and the hotel (yes, a hotel in a mall). 

The first food court in town wasn’t the half of it, though. There were glass elevators going to the third-level observation deck (itself mainly just an extension of the office building lobby).  There was a store called Banana Republic that appeared to be some kind of safari outfitter, complete with a jeep halfway through the front glass, surrounded by jungle foliage. And Rich’s, the biggest anchor store, was itself three stories, and the top story even had a tiny grocery section.  You could presumably have a room in the hotel and come over to get Pop-Tarts. And to cap it all off, there was space for another anchor store, one coming in 1987: Macy’s.  Macy’s.  The icon of New York City, opening a store in Alabama.

Two or three years earlier, the notion had come to me in a dream that the mall would be a perfect place to hang out and walk around and spend time as a teenager – that’s how culturally benighted we were; I didn’t get it from movies or TV, it came to me in a freakin’ dream – so to have this amazing modern super-80s temple of American commerce dropped on me at age 14 was absolutely perfect.  The obvious problem, of course, was that it was on the wrong side of town and I didn’t have a driver’s license.  But any time I could get over there, I went like a shot – after a life spent largely on the rural side of town, this was my first routinely accessible exposure to a bigger, brighter, more exciting world.  One that would lead to a couple of major changes within a year…but that’s another story.

I Hate It Here

I hate my job. There it is.  This is not news.  In fact, this has pretty much been par for the course for most of the last fifteen years. 

This blog post nails it – if a job were fun, they wouldn’t have to pay people to do it.  To quote in bulk:

Right now I have what by any criteria would be considered a good job. I’m paid decently, I have basic benefits, and the position is as close to Stable as jobs get these days. Yet I’m not happy because I’m expecting the job to make me happy. I expect it to not suck, when in reality on many days it does suck because it’s a goddamn job. Nowhere was I promised that it would be rewarding and fun all the time, or that it wouldn’t be frustrating, or that I would have days where I come home and wonder why I bother. I bother because they pay me, and getting paid is very useful to me. But that’s it. That’s the deal: I show up and fulfill my responsibilities, and then I get a check. Nobody said anything about fun.

As often as I give this advice to other people, I give it to myself lately. What I can’t figure out is why people in my age group (or younger) have this idea that the task for which they get paid will also be personally enriching. Is it because we lack fulfillment in our personal lives? Is it because we’re spoiled, believing that the working world owes us self-actualization in addition to a means of supporting ourselves? I’m not sure. What is certain is that we should be careful what we wish for. Those factory jobs that no longer exist start to look pretty appealing as our Career-as-Spirit Quest theory runs into reality.

It’s like this blog – I do it, I enjoy doing it, but it’s not something really monetizable.  Because to do that, I’d need to go out and hustle ads, I’d need to produce a base minimum of content, I’d have to start tailoring my output to maximize page views and draw traffic, and next thing you know, I’m not blogging anymore.  I enjoy writing what I feel like, when I feel like it, and I’m pretty sure that shortly after having to meet a deadline on SEO-maximized topics day in and day out, I’d wind up hating it.

When I took my first full-time job out of grad school, I remember telling people that my job was as easy and profitable as picking up money in the street. It didn’t take long to change, largely because our management situation melted down within a year to the point that our lead tech was reporting directly to an out-of-touch and irrationally unreasonable vice president.  But thanks to the foxhole mentality and the relentless churn of the dot-com era, we were quick to build a team of techs who weren’t just whip-smart and capable, but pleasant to work with and a boost to morale (lest we forget, “morale” is a measure of “how people are doing when they aren’t doing well at all,” to quote P.J. O’Rourke, who famously pointed out that you never hear about the morale of people on spring break or at vacation resorts, just prisoners and soldiers and the like).

So what would make my job suck less?

To borrow from Great Place To Work, employees believe they work for great organizations when they consistently:

1) Trust the people they work for;

2) Have pride in what they do, and

3) Enjoy the people they work with.

Well?

1) This is tricky.  I was lucky to report directly to the greatest manager I ever knew for most of my first seven years out of school.  Since then it’s been a mixed bag – and almost without fail, my management from the director level up tends to be indifferent at best and actively antagonistic at worst.  The most constant problem in IT support comes from management that fixates on “customer service” and interprets it as how good we make the end-user feel rather than whether the problem was resolved successfully and in the timeliest possible manner.  This frequently stems from managers who aren’t technical enough to understand the problems their staff is solving, along with the misbegotten notion that a company’s IT staff is providing customer service to their co-workers rather than a peer function.  I never hear anybody going on about whether the electricians or the security or the custodial staff are providing excellent customer service.

2) I’m Winston Wolf.  I solve problems.  May I come in?  I daresay the one thing I do better than anything else in the workplace is solve problems – if you have a thing that needs to be made to work, or linked or integrated or just figured out, I’m your guy.  And inasmuch as I do that, I enjoy it and take pride in my success.  If there’s a real live disaster and I have to shovel coal twelve hours a day for a week to save our asses, I take pride in that too.  What I don’t take pride in is having to spend those twelve hours mopping up somebody else’s foreseeable mistake, or cleaning up from a disaster that we saw coming and that management ignored.  And I certainly don’t take pride in endlessly walking around hand-holding the kind of people who never think to try rebooting the computer, or who sit in front of an open browser window with the cursor in the address field and ask “how do I get to webmail?”  (HINT: the address is webmail.company.domain, and if you were to just type “webmail” and hit return, YOU WOULD ACTUALLY GET THERE.)  Inasmuch as tech support is about problem-solving, I enjoy it.  Inasmuch as it’s a blend of babysitting and veterinary medicine, I hate it.

3) This is the problem…there are two splits here.  One is co-workers, and one is end-users.  End-users are always a problem.  Some are worse than others, some are really a pleasure to work with, but as the computing environment has evolved over the years, less stuff breaks.  We’re not using Token Ring some places and Ethernet others, we’re not struggling with System 7 and trying to make TCP/IP work reliably and trying to pass AppleTalk so people can print, we’re not running Windows NT 4 and terrified every time the virus alert pings.  Ten years of Windows XP, for better or worse, led to most of the rough edges being filed off, while OS X has gotten more robust and reliable with every passing release.  It’s reached a point where support issues, especially with a Macintosh, are only occasionally about “something is wrong with the computer,” and even those are mostly about a Java plugin that stopped working or a printer that requires deleting the queue and setting up a fresh connection – things that any user with admin rights ought to be able to figure out and fix themselves in five minutes.  And in the case of the younger users, they pretty much do.  That’s why I think the job is going away in ten years – partly because the technology is simplifying, but also in part because the generation that entered the workplace before their computers did is finally starting to retire and go away. Ten more years will make thirty years since I was in college, at which point it can be safely assumed that anyone in an office workplace has been using computers in an office since they started working.

The other split is co-workers, and here I was ruined, because my teammates at the first job were the perfect crew.  Replicating that has proven impossible, largely because we failed to weed out the toxic people quickly enough at my first California job and because I haven’t really had that peer-group environment since.  When everyone’s responsible for their own area, there’s commiseration, but not that common experience, that banding-together-against-the-common-foe.  Right now, I have a decent enough group of folks, but few if any are the sort of people I’d want to spend 8 hours at the Four Provinces singing and getting knee-walking drunk alongside.

So what’s the solution?  Right now, the plan is to agitate to move within this existing employer to a job with more future-proofing – something in data center or infrastructure administration, something that will still be necessary when all work is being dictated into iWatches from your home-working desk.  Something that will get me away from a customer-facing environment, something less interrupt-driven (well, slightly) and more project oriented, something not keyed to the workday hours of a call center.  And in the meantime, 5 PM means work is done, not to even be thought about until 8:30 the next morning (or Monday as the case may be).

Maybe this is all just project-related stress. Maybe once encryption is over and done with, it’ll be possible to have a more normal relationship with work.  But given that this was pretty much my situation and feeling for most of 2012, I suspect probably not.  This isn’t run-of-the-mill dysthymia either.  I don’t need antidepressants, I need something for stress.  And probably a ton of Xanax.  And let’s face it, a couple of cocktails wouldn’t hurt at all.  Not likely in the near future.  Of which, etc.


Too big for their britches

No one who lived through high tech in the 1990s will ever entirely trust Microsoft, or view them as anything other than a threat.  The Beast of Redmond used its effective monopoly on PC operating systems to leverage an effective monopoly in productivity software (killing the likes of WordPerfect and Lotus 1-2-3 as alternatives), and then used the Windows-Office duopoly to control the personal computing industry. They missed the boat on web browsing, then bought Spyglass’s commercial Mosaic code to turn into Internet Explorer – and knit it into the operating system to crowd out Netscape, with the side effect of introducing vectors of vulnerability that would provide ten years of malware.  They invested a mere $150 million in Apple for the sake of propping up the only other viable commercial operating system maker – and in return, Apple got continued production of Microsoft Office for Mac, because without it the Macintosh would have been quickly doomed as a platform.

Things are different now.  Companies like Google and Facebook don’t have a monopoly as such – there’s nothing preventing you from using Yahoo for email or DuckDuckGo for web search or Path for social networking.  What Google and Facebook do have is a crippling majority of mindshare: the sense, in Facebook’s case, that you have to be there because everyone else is, while in Google’s case, you’ve probably relied on their services (which have been largely excellent for many years) to the point that you’ve given them a critical mass of data.

Google and Facebook don’t charge end users at the point of service. Essentially, the only way you can give Google money is by buying a Nexus device – even Android is free and open-source at the OS level (the Google-specific apps for Android may still be proprietary).  Facebook is constantly smacking down rumors that they’re going to start charging, but by and large the only way to give Facebook money is to pay to promote a post. Or…to advertise.

I often wonder why Google and/or Facebook couldn’t charge end-users for their services. Think about it – if Google or Facebook got rid of advertising and just charged enough to make up the difference, how many people would stick with it?  And more to the point, could they make as much money just by charging what the market will bear?  Or is this a situation where Google and/or Facebook’s services are provided as a loss leader, to let them harvest enough personal information to make a profit at advertising?
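
The back-of-the-envelope math is instructive. Using my own rough 2012-era ballparks – call it $50 billion a year in revenue for Google and $5 billion for Facebook, each with around a billion users:

```swift
import Foundation

// Back-of-the-envelope: what would an ad-free subscription have to cost to
// replace advertising outright? Revenue and user counts are my rough
// 2012-era ballparks, not audited figures.
let services = [
    (name: "Google",   annualRevenue: 50_000_000_000.0, users: 1_000_000_000.0),
    (name: "Facebook", annualRevenue:  5_000_000_000.0, users: 1_000_000_000.0),
]

for s in services {
    let perYear = s.annualRevenue / s.users
    print("\(s.name): ~$\(Int(perYear))/year, " +
          "~$\(String(format: "%.2f", perYear / 12))/month per user")
}
// Google: ~$50/year (about $4.17/month). Facebook: ~$5/year.
```

Fifty bucks a year for Google, a fiver for Facebook – prices plenty of people would pay, but apparently nothing like what selling your attention brings in. Which rather answers the loss-leader question.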

Microsoft certainly thinks so – they pitched their new Outlook.com email service with some risible “Scroogled” ads, as if they were shocked, shocked that a technology company would take unfair advantage of their position to make more money.  Frankly, they ought to be more worried about getting their uptime on Outlook.com above three nines, but that’s neither here nor there.  Microsoft is finally paying the price for having rested on their laurels after the launch of Windows XP, while Apple was skating off toward the iPod and then the iPhone and then the iPad: Microsoft shat the bed with the Zune, took forever to produce Windows Phone 7, and then screwed all its buyers with Windows Phone 8.  And then Surface…erm, surfaced. Essentially, Microsoft is still stuck on the idea of Windows everywhere and Windows everything, one ring to rule them all.

By contrast, Google’s one thing is its ability to aggregate data and sell advertising.  And they will do anything that feeds into that, and kill it gladly if it doesn’t work out.  Sic transit Google Wave, Google Buzz, Orkut. That’s why Google+ will hang on.  That’s why they’re happy to put a superior Google Maps product on the iPhone instead of keeping it for themselves as an Android differentiator. Google doesn’t care if they make money on the bait; their money’s in the trap.  Which is the same reason Facebook keeps adding things like voice chat and video calling, and is pushing Graph Search (which basically asks you to make your life an open book for the sake of improving search results).  Everything Apple (and Amazon) does is about getting you to buy more Apple (or Amazon) stuff; everything Google and Facebook do is about getting you to let them sell more of you.  That’s why Google Maps and the Kindle app are available on iOS – Apple knows you’re more likely to buy an iPad if it can be a Kindle reader too, and Amazon knows it will sell more Kindle books if it can sell to more people than just those who own a Kindle device.  Similarly, Google wants mapping information from more people than just those who own Android devices, and Apple would rather you buy an iPhone and run Google Maps on it than buy an Android device to do so.

But at least Apple and Amazon are still charging cash on the barrelhead, in the time-honored market tradition of “money can be exchanged for goods and services.” Google and Facebook are still dueling for the right to your online identity – e.g. Facebook Connect or the all-new Google+ Sign-In – and to make themselves the intermediary of your online experience.  Hell, Google’s new Chromebook Pixel charges you MacBook Air money for what is essentially a browser terminal, but comes with one terabyte of cloud storage attached. One terabyte of your stuff…to be stored by Google.  The mind reels.

The moral of the story is this: if you are in fact the product, you have to avoid letting yourself get locked into a single solution.  Putting your entire life in Google is hella convenient and easy to use, but to quote Spencer Hall, “there is always free cheese in the trap, and it is always a deal.”  You don’t have to let yourself be the product. Google and Facebook may be too big for their britches, but they don’t have to be too big for yours.  If privacy and control are important to you, they’re worth paying for.

iWhat?

Bloomberg appears to have decided to will an “iWatch” into existence today, and everyone is running with their report – apparently the nonexistent iWatch is a better investment for them than the nonexistent iTelevision, and the fact that they are advising investors on the merits of products that do not exist is just one more reason why the investor class should be ground up into free meatloaf for the unemployed. But I digress.

The problem is, I’m still struggling with the use case for a notional “iWatch”.  With Google Glass, as I mentioned previously, the model is JARVIS – Tony Stark’s real-time artificial-intelligence assistant tied to a voice and HUD interface in the Iron Man armor.  The idea of being able to look at something, have it identified and pop up its Wikipedia entry instantly? I like that.  Not to mention real-time navigation and directions, possibly tied right to my calendar and incorporating things like traffic and train delays. “Sir, you need to leave NOW if you are to make the connection at Millbrae in time for the Warriors game.”

But the watch?  What would I like to be able to see on the watch?  Well, notifications, I suppose – it would be nice not to have to take the phone out to see who’s calling or who just texted me.  I suppose there are worse things than having a couple of canned replies available from the watch as well (“On my way,” etc.).  It might be handy to have calendar events visible, with reminder alerts for same.  Plus, an alert on the wrist might be more reliable than the phantom vibrate we feel for no reason (or the real one we don’t).

The other trick is Bluetooth 4.0, of course.  What’s the range?  Would this mean alerts in the shower while the phone is outside on the sink counter? (Assuming a similar degree of waterproofing as the Pebble, or my own watch.) How long before you have to charge the watch? Once a week is fine – pop it on the charger overnight on Saturday night – but if it’s only 3 or 4 days, that’s problematic.  And how much power does Bluetooth 4.0 draw?  Are the savings from not taking out the phone and lighting up that big screen going to be wasted in maintaining the Bluetooth connection?  If the early returns on the Pebble are anything to go by, very possibly.
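
You can sketch the tradeoff with rough numbers. Everything below is a guess for illustration – a Pebble-ish battery, ballpark radio and display draws – not anybody’s spec sheet:

```swift
import Foundation

// Rough battery math for a notional BLE watch. Every number below is an
// assumption for illustration, not a spec for the Pebble or anything else.
let capacity_mAh      = 130.0  // small watch cell
let bleIdleDraw_mA    = 0.05   // maintaining a BLE 4.0 connection (~50 µA)
let activeDraw_mA     = 10.0   // display lit, vibrating, pushing pixels
let activeHoursPerDay = 2.0    // glances and alerts add up

let avgDraw_mA  = bleIdleDraw_mA + activeDraw_mA * (activeHoursPerDay / 24.0)
let runtimeDays = capacity_mAh / avgDraw_mA / 24.0
print(String(format: "~%.0f days between charges", runtimeDays))
```

Six-ish days under those guesses; halve the battery or double the chatter and you’re charging twice a week, which is exactly the problematic case.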

So what does that leave?  Some of the Fitbit/Fuelband-type stuff, maybe?  Activity tracking, sleep monitoring, counting steps and miles run and the like?  Okay, sure, might be useful, especially as I work to get in better shape.  But enough to make it worth spending the money? Maybe if everything is rolled in together…at this point, I can see the appeal in waiting for the Apple version of such a device.  It may not be feature-complete in its first incarnation, but the features it has will work.  It’s like the original iPhone: no cut and paste, no 3G, no GPS, no third-party application SDK – largely because those features couldn’t be implemented well, or not without compromising the battery.  The features it did ship with were well-implemented, and that seems to be the difference.  Google (and through them Samsung) will release a boatload of features and let you beta-test for them; Apple will release a limited device that does only the things it can do well.

And maybe they’ll come up with some way of making those things make sense, and come up with some function I didn’t know I needed until I can’t live without it.  But until then, I’ll stick with what I’ve got…