EOF

Leaving the Land of the Giants

Doc Searls

Issue #226, February 2013

The next revolution will be personal. Just like the last three were.

The cover of the December 1st–7th 2012 issue of The Economist shows four giant squid battling each other (www.economist.com/printedition/2012-12-01). The headline reads, “Survival of the biggest: The internet's warring giants”. The squid are Amazon, Apple, Facebook and Google. Inside, the story is filed under “Briefing: Technology giants at war”. The headline below the title graphic reads, “Another game of thrones” (www.economist.com/news/21567361-google-apple-facebook-and-amazon-are-each-others-throats-all-sorts-ways-another-game). The opening slug line reads “Google, Apple, Facebook and Amazon are at each other's throats in all sorts of ways.” (Raising the metaphor count to three.)

Now here's the question: Is that all that's going on? Is it not possible that, in five, ten or twenty years we'll realize that the action that mattered in the early twenty-teens was happening in the rest of the ocean, and not just among the mollusks with the biggest tentacles?

War stories are always interesting, and very easy to tell because the format is formulaic. Remember Linux vs. Microsoft, personalized as Linus vs. Bill? Never mind that Linux as a server OS worked from the start with countless millions (or even billions) of Windows clients. Or that both Linus and Bill had other fish to fry. But personalization is cheap and easy, and there was enough antipathy on both sides to stoke the story-telling fires, so that's what we got. Thus, today we might regard Linux as a winner and Microsoft as a loser (or at least trending in that direction). The facts behind (or ignored by) the stories mostly say that both entities have succeeded or failed largely on their own merits.

Here's a story that illustrates how stories can both lead and mislead.

The time frame was the late 1980s and early 1990s, and the “war” was between CISC (Complex Instruction Set Computing, en.wikipedia.org/wiki/Complex_instruction_set_computing) and RISC (Reduced Instruction Set Computing, en.wikipedia.org/wiki/Reduced_instruction_set_computing). The popular CPUs at the time were CISC, and the big two CISC competitors were Intel's x86 and Motorola's 68000. Intel was winning that one, so Motorola and other chip makers pushed RISC as the Next Big Thing. Motorola had an early RISC lead with the 88000 (before later pivoting to the PowerPC).

At the time, I was working with Sun Microsystems and its allies on SPARC, Sun's RISC design, which was implemented in various ways by a raft of chip makers, including Texas Instruments, Fujitsu and Cypress Semiconductor. In spite of Sun's heft in the marketplace, we had trouble getting attention for SPARC with the tech pubs, because they tended to see everything as an Intel vs. Motorola fight. We felt we couldn't challenge either one of those guys head-on, even if SPARC was superior on technical grounds (which Sun and its partners believed). So we decided the best strategy was for SPARC to pick a fight with another RISC upstart called MIPS.

This was pure bait for the pubs, which came over to this new fight to see what was up. I think we caught MIPS off guard at first, but it defended itself well and ended up selling years later for hundreds of $millions to SGI, which eventually went bankrupt. SPARC is still around, running gear made by Oracle, which acquired Sun. The big winner in the CPU market remained Intel and, therefore, CISC. In fact, the x86 architecture still rules, at least on PCs and servers, but not in mobile devices, where ARM (Advanced RISC Machine) now kicks butt. And for what it's worth, MIPS is now fighting ARM in the Android market, and Motorola's chip business was long since spun off, into Freescale Semiconductor and ON Semiconductor.

So, five points here:

  1. Vendors use stories as marketing strategies.

  2. Vendor war coverage is always to some degree an exercise in misdirection (en.wikipedia.org/wiki/Misdirection_(magic)), even when journalistic intentions are worthy ones.

  3. The real story is always much more complicated than vendor war coverage can characterize.

  4. “Winners” never win forever, especially in tech.

  5. “Losers” don't always die. Often they stay alive by selling out, or they thrive by finding niches and working them.

Now back to our four squid.

The graphic above The Economist's story is an antique-style map (media.economist.com/sites/default/files/cf_images/images-magazine/2012/12/01/FB/20121201_FBD000.png) of the fantasy-fiction kind, drawn by David Parkins (www.davidparkins.com). It shows a large mountainous land, with the Sea of Content to the west and the Sea of Commerce to the east. Dividing the land are four thronedoms: Applechia, Google Earth, Amazonia and Fortress Facebook. A fifth, Empire of the Microserfs, is across the Sea of Content in the northwest corner of the map, bordered by the Cliffs of Surface. In Google Earth are Adsense-land, the Mirkwood of Regulation, the Wastes of Litigation (“Here be lawyers”), Pagerank Pinnacle (at the end of Algorithm Reach), beside which lies The Firth of Android. Applechia has the iPhone Keep. Amazonia has the Cloud Mountains and a volcano named Kindle. Between the latter and Netflix Nation (which lies above the Satrapy of Spotify) intrudes Pirate Bay. Offshore are the eBook Islands. On the opposite shore are OneClick Castle and Prime Port. Somewhere in the middle, between the Cloud Mountains and Fortress Facebook, is the Lost City of MySpace. Out in the Sea of Content are small islands called RIM Rocks and Nokia. Atop the map is The Dark Offline. Floating in the Sea of Commerce is a Chinese junk flying the Samsung banner. A peninsula in the southeast corner features Secondhand City, the Bay of E and the Cape of Coin. There's a dragon smiling out of the Sea of Commerce, named The Next Big Thing. Finally, in the center of the map, between the four thronedoms, is an unnamed body of water surrounding Identity Island.

Parkins' antique style also depicts antique substance in the making—because all four of the thrones (or squid, take your pick) are at least as affected by their own weaknesses as by the strengths of companies they are said to be fighting. And, because so many of us are at their mercy, their weaknesses are to some degree ours as well.

So let's look at those weaknesses, and then at where the rest of the action is, because neither is getting enough attention.

First, Apple.

While it's not wise to bet against a company as successful as Apple has become, it is wise to expect failure from a company whose success is rightly attributed to a dead and irreplaceable CEO. Although it was business as usual for a while after Steve Jobs died in October 2011, it was clear a year later that the wheels were coming off. First there was the Maps app debacle, in which Apple replaced its Google-based Maps app on iOS 6 with one based on a stew of inadequate substitutes—and then failed to improve it for months while Google took its sweet time not producing its own Maps app for the operating system. This not only hurt Apple and iOS 6, but also the new iPhone 5, which featured the Maps value-subtract and was itself an unspectacular successor to the iPhone 4s—which wasn't all that big an improvement on the iPhone 4, which came out way back in 2010. Meanwhile, for all of Apple's continued success with the iPhone, its entire iOS smart-thing hardware market contains just three devices (iPhone, iPad and iPad Mini) made by only one company. Android, by contrast, remains an open platform with countless hardware implementations from many companies. As I write this, the latest Consumer Reports rates various Samsung Galaxy devices ahead of the iPhone, which had formerly topped the magazine's ratings. Countless new Android phones also will hit the market before the iPhone 6.

In 2012, Apple also continued to make fixing or improving its hardware as hard as possible for anybody not an Apple employee. Batteries, RAM and solid-state storage on new Apple hardware tend to be hard-wired or glued in place. One result is the latest MacBook Pro, with its retina display, which Kyle Wiens in Wired calls “the least repairable, least recyclable computer I have encountered in more than a decade of disassembling electronics” (www.wired.com/opinion/2012/10/apple-and-epeat-greenwashing and www.wired.com/gadgetlab/2012/06/opinion-apple-retina-displa).

Credit where due: Apple has been brilliant at retailing and customer support. On the latter count, nobody else is even close. Also, Apple is advantaged by a competitor—Microsoft—that seems hell-bent on sending customers anywhere else.

At this point, it's not clear where Apple is headed. The company's only “wow” product since Steve died was the iPad Mini, which should have come out years earlier. In the past, it was easy to assume that Apple had a “next big thing” up its sleeve. Now it's not.

On to Google.

Last October, Google took the wraps off the biggest thing it has in the physical world: giant data centers, which it immodestly calls “Where the Internet lives” (www.google.com/about/datacenters/gallery/#/). The photos doing the bragging are as artful as can be, considering that the subjects look like power plants: vast and stark white buildings, with glowing racks inside and huge cooling gear outside, veined by an abundance of plumbing. It makes one pause to consider how dependent we have become on giant companies and their very earth-bound “clouds”.

By coincidence, this month is the third anniversary of a column here titled “The Google Exposure” (www.linuxjournal.com/magazine/eof-google-exposure). In it, I wrote:

I'm just worried about the way Google makes money. Nearly all of it comes from advertising. That's what pays for all the infrastructure Google is giving to the rest of us. As our dependency on Google verges on the absolute, this should be a concern. Think of advertising as oil and Google as one big emirate. What happens when the oil runs out?...The free rides won't go on forever. There are better ways than advertising for demand and supply to find each other...and more will be found. Google will be in the middle of that discovery process, no doubt. But it's an open question whether Google will make the same kind of money in a post-advertising marketplace. I'm betting it won't.

Since then, Google has continued growing at a 20+% annual rate, and diversifying a bit (for example, by acquiring Motorola Mobility). But the vulnerabilities are still there: for Google and therefore also for the rest of us. Also, the Internet that “lives” in Google's data centers has become an overwhelmingly commercial one, especially on the Web. The percentage of information on the Web that isn't about selling something continues downward as more and more eyeball-routers get into the ad-based game—and game that game as well. How far can this go before the whole ad-funded system, with Google at its center, begins to fail in big and obvious ways? No way to tell, but the system we have now can't go on forever. Trees do not grow to the sky.

Next, Facebook.

An alpha geek told me recently that the most remarkable thing about Facebook is the sturdiness of its infrastructure: it rarely if ever goes down. Compare that to Twitter, a much smaller service notorious for its familiar “fail whale”. Facebook's infrastructure should be good for many things other than housing a locked-in “social” space where inhabitants get advertised at. What if Facebook started offering paid services to its users, turning them into actual customers? For example, it could work as a fourth-party agency (blogs.law.harvard.edu/vrm/2009/04/12/vrm-and-the-four-party-system), helping customers actually find products and services, rather than merely searching for them, as they do with Google. Facebook could host personal clouds (www.windley.com/archives/2012/11/the_cloud_needs_an_operating_system.shtml) of data kept private for paying customers, selectively disclosing required data to potential sellers (or government agencies, or nonprofits) on a secure need-to-know basis—treating personal data the way a bank (as a fourth party) treats customers' money. Prototype work on this kind of thing has already taken place at Innotribe (innotribe.com), the innovation arm of SWIFT (www.swift.com), the banking nonprofit that moves $trillions around the world every day. I know, because I've been involved in it. But Facebook won't go there because Facebook, like Google, sees its main business as advertising and would rather do business with businesses than with individuals. Also, like Google, it would rather sell its users to advertisers than serve as an intermediary in the far larger retail and services marketplace.

One reason Facebook won't make that move was suggested to me by a top executive at an advertising company a couple years ago. He told me the blinders both Facebook and Google wear are the ones that keep them focused mostly on each other. While this isn't a verbatim quote, it's close enough: “Google envies Facebook's ability to get personal with users, while Facebook envies Google's ability to put ads everywhere on the Web.” Thus, we have locked tentacles rather than evolution by either squid.

Next, Microsoft.

Today in the mail came our copies of Vanity Fair and the New Yorker, both Condé Nast publications. Both looked different and confusing. Instead of the usual cover art, there were collections of squares and rectangles that called to mind the tablet app Flipboard, which organizes “social” content in the form of picture-tiles one can flip through like one would a magazine. I have Flipboard, but its lack of an outline-like organizing structure, such as a directory or a table of contents, annoys me. I thought, “This can't be real. This has to be an ad for something.” Then I saw the small print: “A sample of the new New Yorker experience on the Windows 8 desktop.” Oy vay. Microsoft and Condé Nast hit into a triple play on that one, because it made me hate the OS while at the same time dreading an “experience” like the one it showed.

So far, I've met only one Windows user who likes Windows 8, and that's just for some deeply buried technical stuff. Everybody else either doesn't like it or hates it outright. The UI, reportedly nice on phones and tablets, is strange on anything with a keyboard and mouse or trackpad. The learning curve is more like a wall, and—well, nobody asked for all this new stuff. As for the new Surface tablet, it looks like the second coming (and going) of the Tablet PC (en.wikipedia.org/wiki/Microsoft_Tablet_PC). One version of the OS doesn't even run Microsoft's Office apps. Some game developers called the new OS and its Apple-like “store” for silo-ing apps a “catastrophe” (www.neowin.net/news/valve-co-founder-windows-8-is-a-catastrophe) and a “disaster” (www.neowin.net/news/blizzards-rob-pardo-windows-8-is-not-awesome-for-the-company).

On the mobile front, Microsoft teamed up with Nokia to bet the former mobile-phone giant's farm on Windows-based phones, which promptly tanked in the marketplace. Now farmland for both companies is shrinking like a puddle on a hot day.

In fact, Microsoft has some legacy advantages. It always has been far more open and supportive toward developers than Apple. Unlike Facebook and Google, its users are actually paying customers. And it has always been, at its heart, a personal computing company. That too should give it a kind of advantage over Facebook, Google, Twitter and everybody betting on “social” (read: advertising), “the cloud” and “big data”—all of which are corporate/enterprise plays.

Over the years, I've known and worked with a lot of good people inside Microsoft, all of whom have labored to open the company's technology, make it play better with others in the marketplace, and put some truly innovative technologies to work. The company's decision to default Do Not Track to the “on” position in the latest rev of Internet Explorer was astute, correct and perhaps even brave. It's the kind of thing that a clued-in company would do. I've also seen some excellent Microsoft research on user feelings and preferences with respect to lost on-line privacy. That should energize Microsoft around some fresh opportunities, but the company seems to lack adrenal glands. Opportunities are lost every day the company fails to win hearts and minds by standing behind users—its customers—in the fight against abuses of privacy.

Instead, Microsoft continues to fight Google straight-up with an Online Services Division that has lost $billions over recent years.

Next, Amazon.

Amazon is the strongest of The Economist's four giant squid, or thrones. It succeeds, Jeff Bezos says, “by starting with the customer and moving backwards”. By 2009, Amazon already controlled more than a third of all e-commerce (www.pcmag.com/article2/0,2817,2345381,00.asp). Since then, I've heard numbers as high as 50%. Whatever the number, you can see the result by looking inside any UPS or FedEx delivery truck and eyeballing all the boxes labeled Amazon or Zappos (Amazon's shoe store).

While Apple, Google and Facebook all clearly have good engineers and solid technical infrastructure, Amazon tops them all by connecting its innards directly both to individuals and to techies among business customers. It is a rare example of a geek-driven company that also understands and loves to do business with everybody it can.

Amazon's only shortcoming is one it shares with the rest of retailing, as well as with its big-squid competitors: it runs a big data silo where customer data goes in, but not back out to individuals. For example, I would like to have a cooperative data-sharing relationship with Amazon, in which I tell it everything I own (or feel like telling it I own), so it doesn't bother trying to sell me what I already have but didn't buy from Amazon. I would like my personal API to be one it could program against, just as I (or my fourth party) can program using its APIs. This requires a respect on Amazon's part for the fact that my life is bigger than the corner of it that deals with Amazon—and that I can do more with my own data than it can. Also that this will be a Good Thing for both of us.
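
To make that concrete, here is a minimal sketch of what such a personal API might look like. Everything in it is hypothetical (the endpoint, the token, the field names), and it uses nothing but Python's standard library. The idea is simply that I keep the data, I issue scoped tokens to vendors I choose, and each vendor sees only what its token entitles it to see:

# A hypothetical "personal API": I hold the data, and a vendor holding a
# scoped token can read only the fields that token covers.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# My data, kept on my side of the relationship.
MY_STUFF = {
    "owned": ["espresso machine", "Nikon D90", "standing desk"],
    "wanted": ["35mm f/1.8 lens"],
}

# Tokens I have issued, each limited to what that caller needs to know.
TOKENS = {
    "amazon-demo-token": ["owned"],   # may ask what I own, not what I want
}

class PersonalAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        scopes = TOKENS.get(self.headers.get("Authorization", ""))
        if scopes is None:
            self.send_response(403)   # no recognized token, no data
            self.end_headers()
            return
        # Disclose only the fields this caller is scoped to see.
        disclosed = {k: v for k, v in MY_STUFF.items() if k in scopes}
        body = json.dumps(disclosed).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), PersonalAPI).serve_forever()

A caller presenting the right token gets back only the "owned" list from a single GET to localhost:8000; anyone else gets a 403. That's the point: disclosure on my terms, field by field, instead of my data living only in someone else's silo.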

But there isn't any sign that this will happen, mostly because we don't yet have our own APIs, and managing our own data isn't something many of us do yet, least of all so we can deal in one consistent way with many suppliers. Mostly, we just fill up hard drives and hope whatever we have “in the cloud” is sort of safe and not going to bite us some way in the future.

Which brings us to the rest of the world.

The revolution we're in is a personal one, not a corporate one. It is a revolution in which personal empowerment has turned out to be good for enterprises because it was good for individuals. This fact has been manifest ever since PCs appeared on Earth around the turn of the 1980s.

To MIS directors in 1983, “personal computing” was oxymoronic. Computing was a corporate thing called data processing. It was big and expensive and specialized and centralized. But those same MIS directors had to start dealing with personal computing because individuals in their organizations and out in the marketplace were getting more done with their own word processing, spreadsheets and accounting software than companies could get done with their old big-iron data processing systems.

Likewise, IT directors in 1997 had to start dealing with personal communications (e-mail, instant messaging, personal publishing), because people in their organizations and out in the marketplace had tools of their own that stripped the gears of what the companies could do with their big old legacy systems.

IT directors in 2009 had to start dealing with iThings and Androids because that's what employees and users brought to work, and customers brought to stores, along with zillions of apps that far exceeded what could be done with company-issued BlackBerries.

Today's “big data” bluster—all that stuff about how marketing can now know more about the customer than she knows about herself—is mainframe talk. Individuals know more about themselves than systems of any kind can guess at, no matter how much data those systems gather. Given the means to take control of our own lives, with our own personal platforms (not just on our devices, but on our own piles of data), we will be able to do far more with that data than any other entity can. We also can do it cooperatively with other entities, provided neither side is busy trying to lock in or control the other.

In the next several years, personal data and personal operating systems for managing relationships using that data will be as revolutionary as PCs were in 1983, the Internet was in 1996 and mobile was in 2009. We can keep watching giants battle all they want. But the action that matters most won't be theirs. It will be ours.

Doc Searls is Senior Editor of Linux Journal. He is also a fellow with the Berkman Center for Internet and Society at Harvard University and the Center for Information Technology and Society at UC Santa Barbara.