The Old-Fashioned Way

2008.09.11

INTERVIEWER: Tonight, our guest: Thomas Sargent, Nobel Laureate in economics, and one of the most cited economists in the world. Professor Sargent, can you tell me what CD rates will be in two years?

PROFESSOR SARGENT: No.

INTERVIEWER: (stares blankly)

— Ally Bank commercial, “Predictions,” 2012

SOMETIME in the 1980s Smith Barney ran a series of TV commercials featuring the commanding John Houseman, whom I knew mainly as the company boss in Rollerball and the beachside storyteller at the start of The Fog. Houseman seriously intoned the tag line of the spot: “At Smith Barney, they make their money the old-fashioned way. They earn it.”

My dad’s hilarious comment on those commercials was, “At Smith Barney, we make our money the old-fashioned way — we con people into forking it over.” What made this joke so delightful was that it wasn’t just a slam against Smith Barney (or investment firms in general), but also a wry comment that the old-fashioned way of making money is to trick people into giving it to you.

Why did my father, and why do I as well, view investment firms so dimly? Well, it’s simple. If I knew a way to make money by investing in the stock market, would I make a modest living by selling that information to others? Or would I just use that information to make a huge fortune by investing in the stock market myself? It would be a no-brainer.

Those Who Can’t Do...

The really spooky thing about this observation is that it doesn’t apply only to people who claim to tell you how to invest in the stock market. It also applies broadly to anyone who claims to tell you how to do anything that makes significantly more money than they’re making by selling you the information.

In other words, it applies broadly to educators. Why would I make half as much money teaching students how to program computers if I could make twice as much by getting a job programming computers? Why would I make a third as much money teaching students how to write movie scripts if I could make three times as much by spending that time writing movie scripts?

Of course, this doesn’t apply to all educators. Some professors with advanced degrees actually do make as much as, or more than, they could make by getting a job in the specialty that they are teaching. But most of these are research professors who spend little time teaching students, and most of their time actually working in the subject at hand.

Where does that leave the average student, being lectured, tested, and graded by a person who makes far less money than the student could make by getting a job in the subject? Either (a) they’re being taught by someone who doesn’t understand the subject well, or (b) they’re being taught well, but they’re being taught a subject that most of them will never be able to get a job doing, because the jobs just aren’t there in sufficient quantity — and that’s why this teacher wound up teaching the subject instead of doing it.

Perhaps the old-fashioned way of making money is to sell knowledge of how to make money. Of course, that knowledge has to be poorly taught, bogus, stuffed with fluff, or useless to the vast majority of those purchasing it. Otherwise the teacher wouldn’t be teaching it.

 

Update 2012.08.09 — Horace Dediu on The Critical Path, Episode #48, when asked how listeners can apply his financial analyses:

This is where I get really uncomfortable, I get really nervous, because the answer is, actually unfortunate, that there isn’t one that I can give you.

The answer tends to be something you give yourself. And all I can do is help you to think, or train you, train a thought process in you. I mean, this is the thing about education in general, I believe. The greatest learning isn’t usually a fact you get, but a way of thinking. And so a great school, a great teacher will influence you because they’ll give you this new way of thinking — they’ll train your mind rather than give you data. And so all I would say is this: that if you simply reflect upon what some of these thoughts, maybe they resonate, maybe they’re set in the way that they accord, then you can maybe start to think through.

I hate to sound so cliché, but you know, this is kind-of like, in these Zen or these concepts of going to speak to a guru, and the guru tells you to look within you. I kind-of didn’t understand what that was all about, when I would see these metaphors, or whatever they’re called, these tales — now I sort-of get that, because what it means is when you are in the business of sort-of teaching and finding deeper truths, that you really realize that it’s more about really the person figuring it out themselves. And that’s not to say there’s no value in the teacher, but I think that the teacher should be there to encourage, and make sure that you are seeing everything, and challenges you.

So, you know, it is a dialogue you have. And so you might ask a question, and then it’s replied with another question. And this happens also sometimes in my blog during the commenting phase where someone would ask a question and I could only answer it with yet another question, and hopefully through good question-asking you actually get to somehow solving your problem.

And so I’m sorry to say I can’t give an algorithmic answer on how to do things. And more sort-of saying that you need to do a lot of introspection. I do believe there is a way to think through, and all I can do is give you example after example after example, to train you to think. That’s how case methods work. You just constantly, you get the exercise, and the repetition of thinking, that trains you to follow the right path which you make yourself. But, gosh, I really feel bad about giving you this vague answer, but I’m sorry, that’s about all I can do.

Three cheers for honesty.

 

Update 2012.08.24

It seems to me that there is a group of people who believe that all markets eventually commoditize and everyone will be satisfied with sub-standard products as long as the prices are lower. Not coincidentally, these people also tend to write sub-standard analysis of Apple. —John Moltz

I genuinely (i.e. not sarcastically) laud the honesty of Dediu in the previous update’s quotation. But I can’t express similar approval for Dediu’s mentor, Harvard business professor Clay Christensen.

Christensen is most famous for his “disruption theory,” which attempts to predict the success of companies and their products by the degree to which they disrupt the business of entrenched industry players. It all sounds very intelligent, and Christensen has best-selling books based on it. Plus he teaches a very popular elective class at Harvard.

Now, here’s Christensen in a January 2006 interview, when asked if he thought Microsoft’s inability to deliver Vista in a timely manner, the proliferation of Windows malware, and Apple’s adoption of Intel chips, might help Apple:

I don’t. I think it will allow them to survive for a bit longer.

Survive? For a bit longer?

A year-and-a-half later, you would think Christensen would have realized his assessment of Apple was seriously askew, but instead he had this to say about the just-released iPhone:

[T]he prediction of [my disruption] theory would be that Apple won’t succeed with the iPhone. They’ve launched an innovation that the existing players in the industry are heavily motivated to beat: It’s not [truly] disruptive. History speaks pretty loudly on that, that the probability of success is going to be limited.

In the five years since, the iPhone has shot into the stratosphere of success, and now, by itself, makes more revenue than all of Microsoft’s products combined.

Am I crazy, or when the inventor and primary proponent of a business theory can’t seem to apply that theory without making laughably, 180°-incorrect predictions about the future of an impressive-looking new product from a then-already-very-successful company — then maybe, just maybe, the theory is no good? Either that, or Christensen has some bug up his ass about Apple and simply can’t bear to predict success for that particular company. You decide which.

James Allworth is a close colleague of Christensen — together they have co-authored at least one book — and he seems to have inherited (or, at any rate, shares) Christensen’s ability to downplay Apple in a facts-defying manner. His Harvard Business Review article, “The Fall of Wintel and the Rise of Armdroid,” identifies ARM and Android as the big threat to the “Wintel” hegemony:

Both ARM and Android — Armdroid — are providing everything that tablet manufacturers need ... We will be able to look back and say that this was the CES that saw Wintel fall and Armdroid rise up.

Could Allworth really have been unaware in January 2011 that iPad owned something like 90% of that market? ARM and Android were providing everything tablet makers needed — except sales. And providing it to every tablet maker — but one.

And here we are about nineteen months later and the situation hasn’t noticeably improved for people who want a not-iPad-dominated tablet market, but Allworth doesn’t seem ready to change his feelings about Apple. Just four days before the Apple-v-Samsung jury announced a huge victory for Apple, Allworth typed up “Who Cares If Samsung Copied Apple?”, in which he seriously advances the idea that Samsung (or any other company, presumably) should be legally allowed to nearly exactly replicate Apple’s entire product line (phones, tablets, packaging, advertising, stores, etc.). He describes such activity as “a perfectly functioning competitive market,” and suggests that Apple’s legitimate recourse is to replace its successful products with all-new, equally successful products so fast that blatant copiers like Samsung can’t keep up. Not mentioning why Apple should have to bear the cost of such a frantic pace of R&D all by itself (or even whether that’s possible), he offers up that previous copying of the Mac by Microsoft “didn’t stop” Apple from continuing to innovate — so it must be OK.

It would be easy to dismiss this as a manifestation of some sort of Harvard socialism (i.e. let’s take success away from the big winner and give it to everyone else), but what other successful company is treated this way by Christensen and Allworth? Something more specific must be at work.

Maxwell Wessel, another apparent Christensen believer, wrote “Why Spotify Will Kill iTunes” in Harvard Business Review just over a year ago:

iTunes as we know it is over. It is walking, talking, and continuing to pretend it’s alive, but Spotify, Europe’s outrageously successful streaming music product, has just shown us the future. ... [The] iTunes business model is about to be blown up completely and swiftly. ... To appreciate the truth of this claim, it’s vital to understand one of Clayton Christensen’s theories on marketing and product development: Jobs-to-be-done. ... we know it’s disruption because it is a business model, fundamentally advantaged in one of the characteristics we value in completing the job-to-be-done. Over time this model will displace iTunes. We’ve seen the future, because that’s what disruptive theory lets us do.

I really don’t fault Christensen or his allies for being unable to “see the future.” I do fault them for (a) acting like they can see the future, and (b) trying, for reasons unstated, to find a way to apply their theories to Apple with negative results.

What makes Dediu such a breath of fresh air is that he is willing both to acknowledge and even celebrate Apple’s tremendous success, and to admit very openly his inability to see the future via Christensen’s concepts.

 

Update 2012.09.11 — Another Wessel article:

“Idolize Bill Gates, Not Steve Jobs”

“At the end of his life, Jobs saw his legacy as Apple. Bill Gates stepped away from Microsoft in 2006 and, despite the company’s growing troubles in the face of the mobile disruption, has devoted his genius to solving the world’s biggest problems, despite the fact that solving those problems doesn’t create profit or fame. Gates committed his talents to eliminating diseases, increasing development standards, and generally fighting inequality.”

“I would happily give up my iPhone to put food on the plates of starving children. Steve Jobs turned his company into a decade long leader in the truly new space of mobile computing. Bill Gates decided to eliminate malaria. Who do you think we should be putting up on a pedestal for our children to emulate?”

Now, on any given day, Wessel could quit Harvard and go spend his life as a volunteer worker for an anti-malaria program, but to the best of my knowledge he hasn’t done that. (Neither has Gates, for that matter.) And he could sell his iPhone, terminate his phone plan, and donate the money thus saved to a feed-the-hungry program, but I doubt he’s done even that.

Yet he wants us to idolize Gates and comparatively think of Jobs as some sort of greedy capitalist.

Why Wessel et al. want so badly to slam Apple is a very good question. But an even better question to ponder, when reading anything they write, is this: Do these guys take their own “disruption theory” seriously? Do they really believe they’ve found something useful? Something that really qualifies as an objectively applicable theory? Or is it, even to them, just a pretentious tool of convenience, to be applied in whatever way will yield a wanted result?

 

Update 2012.09.30 — Two days ago, Allworth was the guest on Dediu’s The Critical Path #56 “Strategic Disadvantages.” Pretty much every time Allworth spoke at length, he used the opportunity to advance the idea that the ability of whole-product integrators (e.g. Apple) to be profitable will fade, and modular component suppliers (e.g. Samsung) will take over, due to (a) chip performance becoming more than adequate for most consumers, and (b) component suppliers becoming familiar with how the whole-product design works, then cutting the integrator out of the equation. Other than referencing what happened to Dell at the hands of Asus, and what happened to Apple at the hands of Windows PCs over twenty years ago, I didn’t hear much in the way of explanation of how faster chips and ambitious component suppliers are going to cause serious trouble for Apple.

Tech writer Rob Enderle pretends to be objective in each article he writes, but if you look back at the sum of what he’s written, you realize that he has a very consistent pattern of predicting that Apple, and Apple’s individual products, will do dramatically worse than they actually do. Allworth, I think, is in danger of becoming another Enderle. How many times can you predict trouble for a company, while that company just keeps doing better and better, before it starts becoming apparent to your readers that something other than the desire for predictive accuracy must be driving your assessments?

And while Enderle has no analytic business theory or major university affiliation of which I’m aware, Allworth and Christensen have both. How long can a corporate-success thesis survive at one of America’s most prestigious institutions of higher learning while its key proponents are publicly, enthusiastically, and repeatedly erring about the prospects of the most successful company in the world?

Sit back and let’s see.

 

Update 2012.11.15 — Another Wessel article in HBR from late August, “The Inevitable Disruption of Television”.

Video-related companies, products, and services mentioned in that article:

  • Amazon
  • Amazon Instant Video
  • Breaking Bad
  • Clear
  • Comcast
  • Cox
  • Dish Network
  • Funny or Die
  • Game of Thrones
  • HBO
  • Hulu
  • Hulu Plus
  • MACC
  • Mad Men
  • Netflix
  • Nielsen
  • Reddit
  • Rogers
  • SnagFilms
  • Time Warner
  • YouTube

Not mentioned at all:

  • Apple
  • any of Apple’s products/services

The weirdest thing about team-disruption-theory is that if you wanted an example of a company that recently rose from next-to-nothing to greatness and power by disrupting the entrenched businesses of industry-dominating players, you really couldn’t do better than Apple.

For some inexplicable reason, it appears that these guys despise the very best, stand-out example of their own theory in action.

 

Update 2012.12.21 — In fairness, Wessel (see previous update) also didn’t mention Google, Android, or Google TV. But there’s a good reason not to mention them — they’re not succeeding in the TV space. Apple TV, on the other hand, is doing quite well. And Apple is hotly rumored to be about to do something even bigger in the TV market.

When Apple was hotly rumored to be coming out with a phone, we got the iPhone. When they were hotly rumored to be coming out with a tablet, we got the iPad. And when they were hotly rumored to come out with a smaller tablet, we got the iPad mini. If you’re really in the business of predicting product disruptions, now might be a pretty good time to do it.

 

Update 2012.12.27 — Ally Bank commercial quotation added

 

Update 2013.01.14 — Moltz quote added to August 24 update

 

Update 2013.04.29 — Credit where credit is due: This slideshow by Allworth is pretty kick-ass.

 

Update 2013.05.10 — Harvard dumps its Apple stock. I’m sure they’re gonna be real happy they did that over the next few years. (One-year update: Price is up about 33%.) (Two-year update: Up about 95%.) (Eight-year, two-month update: Up 800%.)

 

Update 2013.06.02 — Somehow missed this one from a year ago. Christensen “worrying” about Apple:

I worry that Apple is in the same situation [Sony was], in that the sequence of extraordinary products has been disruptive relative to the traditional competitors in the marketplace, but relative to Apple’s business model, they have not been disruptive to Apple. So they haven’t seen this problem before. I worry that maybe they have not learned to recognize what’s been happening, because they haven’t seen this kind of problem before. ...

The transition from proprietary architecture to open modular architecture just happens over and over again. It happened in the personal computer. Although it didn’t kill Apple’s computer business, it relegated Apple to the status of a minor player. The iPod is a proprietary integrated product, although that is becoming quite modular. You can download your music from Amazon as easily as you can from iTunes. You also see modularity organized around the Android operating system that is growing much faster than the iPhone. So I worry that modularity will do its work on Apple.

Call me cynical, but I get a sneaking feeling that Christensen’s main worry is that several years from now, Apple will be doing better than ever.

 

Update 2013.10.16,25 — Wessel with Dina Wang in HBR, from right about a year ago:

“The outlook for Apple Maps is grim. Given its outstanding investment in mapping technology and the significant stake in advertising revenue, Google is unlikely to cede its turf in mapping easily.”

“Apple might house the world’s most seasoned innovators, but at the end of the day, it’s hard to compete against good business theory. It’s just plain difficult to fight a sustaining innovation battle against an incumbent.”

Wessel and Wang were either unaware or willfully ignorant of the fact that Apple has always controlled its Maps app in iOS: The moment Apple made the decision to use its own back-end maps service in that app, Google’s data was gone and the entire app was end-to-end controlled by Apple. Apple didn’t need Google to “cede” anything.

Since then, Apple has allowed Google to distribute its own, separate maps app via Apple’s iOS App Store, but most iOS device users are (really) unaware of it, and simply use the built-in Maps app without even knowing, much less caring, that the back-end data is coming from a different source.

Wessel and Wang paint a gloomy picture of Apple’s future with Maps — but here we are a year later and there just doesn’t seem to be much of a problem for Apple at all. Any follow-up from Wessel and Wang on this? Or does Harvard condone the same sort of hit-and-run, fire-and-forget journalism that we would expect from the sleaziest tech media reporters?

 

Update 2014.05.14 — Allworth on Exponent #3, “Valiantly Defending Jobs,” dutifully echoing Christensen’s “worries” about modularity:

By virtue of the fact that [Apple’s] an integrated player, that’s their advantage. And it cuts against them once the way the product all fits together has been determined and people know how all the pieces, all the various modules, fit together, and you can start to establish the interconnects. That’s when the modular players, like the Microsofts, have their advantage. Now, I think we’re getting to a point in the development of mobile where it’s starting to shift to where there’s a benefit to being a [modular] versus an [integrated] player. That being said, I don’t think the solution to this is to go out and buy a headphone manufacturer.

Perhaps the modularity threat isn’t the problem Apple is trying to solve.

 

Update 2014.06.11 — Another font of wisdom at Harvard Business School, professor Gautam Mukunda, explains why Apple’s heyday is over:

There are just forces in any environment and any market that constantly drag companies to the mean. What Apple did was essentially in violation of business physics for an extremely long time.

Translation: Our theories are so true, they’re like the laws of physics. Except that occasionally they can be violated. But only temporarily.

 

Update 2014.06.18 — Allworth again thumping the modular-players-will-win drum on Exponent #6:

I wouldn’t say I disagree [that Apple’s 2014 WWDC was full of great news]. I wouldn’t say that it wasn’t positive. It’s just — this is always the case for me with Apple. When they’ve just announced something new, when they’re going into a new category, for me that’s when things are most exciting. They’re really breaking new ground. And as things start to mature, their ability to do things that other people aren’t doing, to do things that are really revolutionary — it’s not that they can’t do them, it’s just not as revolutionary as when they start out in a new field. And we’ve touched on this previously, this notion of looking at the world through the lens of integration and modularity. And an integrated organization is able to outperform when performance isn’t good enough. And that typically is when a category is created. And as things start to mature, modular players do well.

Allworth (and Christensen) indeed appear to be viewing Apple through a modularity-eventually-beats-integration lens. What I want to know is, why? What evidence is there in Apple’s present or recent past (say, ten years) that would lead an objective person to think that such a lens presents an accurate picture? Is there any justification for wearing this lens that’s more sophisticated than “Apple got beat by Microsoft and its OEMs in the 1980s, so it’s gotta happen again, any year now?” If there is, then I (and probably many other intelligent members of Exponent’s audience) would sure like to hear about it.

 

Update 2014.06.25 — In “The Disruption Machine” (The New Yorker), Jill Lepore does a fine job of exposing how truly poorly Christensen’s Innovator’s Dilemma examples fit his disruption theory, but she includes only a brief mention of Apple (Christensen’s misprediction of iPhone failure). Thankfully, Dediu has just provided us with a transcript of Christensen talking in detail about how Apple’s story purportedly fits into his theory:

[T]o be fast and flexible and responsive, the architecture of the product has to evolve towards a modular architecture, because modularity enables you to upgrade one piece of the system without having to redesign everything, and you can mix and match and plug and play best of breed components to give every customer exactly what they need. ...

So here the rough stages of value added in the computer industry, and during the first two decades, it was essentially dominated by vertically integrated companies because they had to be integrated given the way you had to compete at the time. We could actually insert right in here “Apple Computer.” ... Do you remember in the early years of the PC industry Apple with its proprietary architecture? Those Macs were so much better than the IBM’s. They were so much more convenient to use, they rarely crashed, and the IBM’s were kludgy machines that crashed a lot, because in a sense, that open architecture was prematurely modular.

But then as the functionality got more than good enough, then there was scope, and you could back off of the frontier of what was technologically possible, and the PC industry flipped to a modular architecture. And the vendor of the proprietary system, Apple continues probably to make the neatest computers in the world, but they become a niche player because as the industry disintegrates like this, it’s kind of like you ran the whole industry through a baloney slicer, and it became dominated by a horizontally stratified population of independent companies who could work together at arm’s length interfacing by industry standards.

Now, here’s what actually happened:

  1. IBM, through a combination of accumulated ’60s/’70s FUD and by making a machine that was more appealing to business users, took over the personal computer market at the beginning of the 1980s.
  2. IBM accidentally lost legal control of the spec for their own product, meaning that suddenly, everyone could make PC clones without paying IBM a dime, or even getting IBM’s permission. But they did have to license MS-DOS from Microsoft to ensure that their computers would run the burgeoning market of PC software.
  3. By the time the Mac appeared in 1984, MS-DOS already was the overwhelming market-majority platform, benefiting tremendously from the virtuous, developer-user circle. The Mac — an all-new platform — was fighting the vicious circle, and didn’t even break 10% market share.
  4. Over about ten years (’85-’95), Microsoft replaced MS-DOS with Windows, a close copy of the Mac. They came very near to not getting away with it in court, but won the appellate court’s final verdict.

The market for PCs did not “flip” to modularity because it “had to” or because that’s somehow better for the consumer — it happened by luck, because of IBM’s incredible blunder, plus Microsoft lucking-out in court and getting to take the whole Mac UI for free. PC modularity materialized before the Mac even hit the market; the Mac was in a niche position from day one, and simply stayed there. Macs were not significantly less crash-prone than PCs; they crashed a lot. And Apple tried the “modularity” (OS licensing) approach in the ’90s — it was a horrible debacle that nearly destroyed them; Jobs killed it immediately upon his return to the company.

Christensen simply has no idea what he’s talking about. He’s been fed a fictionalized version of what happened, or he’s just trying to force the facts to fit the theory. Either way, we shouldn’t be surprised in the least if this theory does a poor job of predicting Apple’s prospects, or those of any other company.

 

Update 2014.06.26 — In Exponent #8, Allworth and co-host Ben Thompson discuss Lepore’s article, and Allworth mounts an impassioned defense of the predictive power of Christensen’s disruption theory:

There’s an implicit suggestion in the article that it’s not predictive. I’m not entirely sure that— in fact, that’s probably framing it too gently — I believe disruption is predictive. And it’s predictive because — how is the best way to describe this — it’s predictive because people are predictable. It’s predictive the same way that capitalism is— the way people behave in a capitalist society, by and large, is predictable. You put people inside a large organization with a profit motive, particularly if it’s a successful organization, and there are incentives for them to do certain things.

And those patterns play out time and time again. And that’s why disruption is— I find it such a useful frame, or useful lens to look out into the future because when you see that pattern, and you understand the reason for it, like— and that’s what good theory does: It drives to the reason why something happens. It’s not correlative; it’s causal. It’s like, what causes what to happen and why. When you see the pattern and you understand why, then it’s actually a very useful mechanism to be able to make— to have insight on the future. Whether you’re outside a company looking in, like we are in a lot of instances, or whether you’re a manager inside a company deciding what action to take in the future.

Compare this to the numbered history of 1980s personal computing (previous update, above): Could disruption theory — or any economic theory — have predicted that IBM would decide to march into the nascent personal computer market and take it over? What if they had decided instead to stick to mainframe computers? Could the theory have predicted that IBM would accidentally fail to protect its PC from being rampantly cloned? Could it have predicted that a single appellate judge would decide that Apple’s work in creating the Mac wasn’t protected by law, and so could be freely used by Microsoft? Clearly, all these questions have the same answer: No.

It is easy to look at past events and think that you have found a predictive theory that shows how they had to happen the way they did. Such a theory, if true, would be very useful indeed. But useful-if-true is not evidence of a theory’s truth. A real defense of disruption theory would involve showing that Lepore’s (and my) refuting examples are actually incorrect. A real defense of disruption theory would include making several specific, multi-year predictions that subsequently come true, or running a disruption-theory investment fund that actually earns money. Allworth’s above-quoted defense doesn’t include any such things; instead it offers only repeated descriptions of how nice it would be if we could predict the economic future.

In the end, the theory that wins the day is chaos theory. It has a reliable, accurate prediction: that people who think they have found predictive economic theories, such as Christensen, ultimately will be found to be wrong.

 

Update 2014.07.11 — Dediu (sigh) also thumping the modularity-wins drum, on The Innovation Engine Podcast:

Software’s gonna come in [to health care, energy, education, transportation, etc.]. It comes in with Apple because Apple’s able to integrate it with the product that people buy early on. But over time, that becomes modularized, and then you have platforms and other things, like Android, essentially taking up, or filling that in, as a market. Apple lives with that. That’s reality for them. They’re never going to do a Windows or an Android, so they need to continue moving [to] where that formula which they have makes sense.

No. They don’t.

 

Update 2014.08.05 — Allworth (Exponent #11), hammering the modularity nail until it falls out the other side of the board:

“The way I like to think about this is whether it makes sense to be integrated or modular. That varies over time based on where you are in the product category life cycle. And at the beginning of that, when performance is absolutely not good enough, it makes most sense to be integrated, because you have most control over all the different components. As time goes by, the modular players can see how the integrated player has put everything together, and they can start to copy it, and pull it all apart, and focus on their own little [piece]. And, at least the theory would suggest, the modular players start to catch up. And that was the reason why I asked the question, because it might be the case that the difference between iOS and Android is starting to narrow. I would actually say that’s true.”

“When they [Apple] run out of things to improve, when the things to improve become less obvious, the ability to increase performance, to improve the experience, is relatively limited. I’m just going to put it in provocative language: It becomes easier for the modular players to copy the integrated player, because the integrated player runs out of places to go.”

“If you’re paying for a Porsche, that there’s a car that’s not as expensive that can outperform it — they can only get away with that for so long before the underlying reason for the purchase starts to call into question the status element of it.”

“When you look at it through the integrated-vs.-modular lens, the modular players ... they focus on their own little piece, and they know how all the pieces fit together, and they just drive down price and focus on getting those pieces faster, better than an integrated player is able to perform. There’s a case to be made that potentially the modular players could catch up to the integrated player, and in terms of performance, surpass it. And then the question becomes, OK, if that is to be true, is it just enough that you have a luxury experience around the brand. Is that then enough defense? I’m not sure.”

 

Update 2014.09.06 — You’d think that by February of 2013, Christensen, at long last, would be reassessing his position. But here he is, speaking at Startup Grind in Mountain View:

“[Disruption theory] allows you to predict whether you will kill the incumbents or whether the incumbents will kill you.”

“[I]n smartphones the Android operating system has consummate modularity that now allows hundreds of people in Vietnam and China to assemble these things. Just like I pray for Harvard Business School, I pray for Apple. They always have won with their proprietary architecture and because of their advantage. If you ask them what is the core of the company, they will say it is design and the interaction with the customer. Manufacturing really is not our core competence. So you just give that to the Chinese. But then what happens to them? As the dominant architecture becomes open and modular, the value of their proprietary design becomes commoditized itself. It may not be as good, but almost good enough is often good enough.”

I’m starting to think that Christensen has completely committed not only his career but his whole thought process to this idea, and that he’s going to go to his grave believing that it just can’t be incorrect, no matter what actually happens with Apple in the real world around him.

 

Update 2014.09.07 — comment added to 06.11 update

 

Update 2014.10.05 — Another of Allworth’s comments (Exponent #8) on Jill Lepore’s article:

For someone in an article to pick on, like, a business book that represented the start of research on a topic, and to take out a concept with ignorance of the twenty years of scholarship that’s followed up, I think, like, I don’t know, I think that’s an unreasonable criticism.

Immediately after, Thompson seems about to put his finger on the big flaw in Allworth’s comment, but then veers off into the vacuous idea that no theory is ever really “correct,” even in its currently championed form (with which Allworth happily agrees).

The really serious problem with Allworth’s above-quoted comment is that it doesn’t identify which of the following scenarios it is describing:

  • The theory was essentially correct in Christensen’s original book, and has been subtly refined in the subsequent twenty years so that it is now even more correct.
  • The theory was wildly wrong in Christensen’s original book, but since has been massively altered so that it is now essentially, if not completely, correct.

Fortunately, we don’t need to know which of these scenarios is the right one, because Allworth’s criticism of Lepore is invalid either way.

If Allworth meant that Christensen’s original book was essentially correct, then his comment makes no sense, because Lepore isn’t arguing that the book needed subtle improvements (and thus ignoring the fact that it may have received such improvements over the years); she’s arguing that the book was wildly wrong.

And if Christensen’s original book was, in fact, wildly wrong, then the relevant questions are: Is this Lepore article the first (or first attention-getting) point-by-point exposure of that fact? Isn’t an article like hers long overdue? And why is Christensen’s original book still selling, and still celebrated as essentially correct by Christensen’s promoters?

 

Update 2015.02.05 — Fresh on the heels of the best quarter in Apple’s history — and indeed, the highest profits of any company, in any business, in any quarter, ever — another business professor, Juan Pablo Vazquez Sampere, writing in HBR, is unimpressed:

“We Shouldn’t Be Dazzled by Apple’s Earnings Report”

“[D]oes [this unprecedented quarter] mean that our beloved Apple is alive and well? A look at the bigger picture within which these numbers sit suggests an alternate view. To see that larger picture, let’s locate Apple within its larger context as a once disruptive innovator that’s now essentially an incumbent.

A fundamental tenet of disruptive innovation ... [four dry paragraphs describing disruption dogma]”

“Apple used to revolutionize industries ... That Apple seems no longer present. In this instance, all Apple has done is copy a feature [bigger phones] for its own best customers. While that’s very effective for today, it does not solve the problem of tomorrow for a company that competes on serial innovation.”

“[B]y dazzling us with dollars, it seems that Apple’s leaders are deliberately trying to divert our attention. ... they are inevitable [sic] forcing us to ask ourselves, is this what we get from the new Apple?”

Translation: If “disruption theory” says Apple must be in decline, then dammit, Apple’s in decline!

 

Update 2015.02.08 — On Exponent #33, Allworth comments on Apple’s quarter:

You [Ben Thompson] mentioned [... John] Gruber’s “claim chowder” post, and going through all the analysts and pointing out how they’re wrong. And I certainly wouldn’t say that that’s what you did with your post; you were giving a much more reasoned explanation as to why people got it wrong. That being said, I have a question for you, which is: Gruber’s implicitly saying, “You guys were wrong; I’m right; Apple’s fantastic; Apple’s a fantastic investment; you should keep buying it.” But at some point — unless they sell everything to everybody — at some point, it hits a peak. Right? And I’m curious, we’ve gone into the retrospective explanation as to why people are wrong, and pointing at the analysts to say, look at you guys, you’ve got it horribly wrong. But I’m curious as to like, talking about it in terms of, how would you actually predict, going forward, what’s gonna happen? And I guess the very simple question I could ask you right now is: Would you invest in Apple, given where they are at the moment?

Pretty stupefying: Allworth disses Gruber for exposing wildly incorrect Apple naysayers and allegedly not caring why they were wrong — but in the same breath he essentially argues that Apple must be currently peaking because in a finite world there isn’t room for infinite growth?! Allworth here seems to be answering his own question: The reason why those naysayers were wrong about Apple is because they were willing to uncritically embrace any argument, no matter how absurd, if it gave them an excuse to predict Apple’s impending decline.

Free advice for team-disruption: Why don’t you start spinning your theory to predict that in the future, Apple will do dramatically better, even than it’s doing now? You might have more luck with that.

 

Update 2015.02.28 — Horace Dediu (The Critical Path #142), talking about his work on the recently founded Clayton Christensen Institute for Disruptive Innovation, where they are attempting to create a new, different kind of MBA program based on Christensen’s disruption theories:

If I were to think about an MBA curriculum: How do you redefine an MBA? And is it even marketable as an MBA? Can you get anybody to show up? Can you get anybody to value the result? Can you say, well, I’ve got this special disruptive MBA. And who can say, “Well, that’s what we need out here.” You know? “Welcome, young man.” Most people would be, like, “What!? I want you [to] get a traditional MBA.” [Which] means he’s taken accounting and all these other things, and you see the world through the same lens that everybody else does.

So this would be either something that you could market to someone, and say, look, you already got the MBA; now it’s time to unlearn everything. Now that would be one way to look at it. Another way, because— funny enough, you need to know the rules in order to break them. That’s one of the tragedies of it, actually. But you don’t need to be in the— I think the deeper you are into the program, into the traditional MBA, and the more experience you’ve had, you probably cannot unlearn it. There are issues— you wanna have sufficient exposure to it, but not necessarily enough that it has brainwashed you, and completely blown your ability to really learn. There’s a saying, I think, it takes formal education to become really stupid. So, you know, it takes an MBA to become really ignorant about business.

Brainwashing that can’t be unlearned? Formal education that makes you really stupid? People who’ve completely blown their ability to really learn, making them really ignorant about business? Gee, if I didn’t know better...

 

Update 2015.03.02 — Just picked up this Christensen lulu from last July’s Harvard Magazine, “Disruptive Genius”:

I said, “I don’t think the iPhone will succeed.” [and it did] ... But then comes the Android operating system from Google, which by definition makes the devices open and modular all the way through. So the people using the Android operating system are now Motorola, Samsung, LG. And they are killing Apple: now, Android accounts for about 80 percent of the market. So I was wrong, and then I was right.

Which is worse: Christensen rewriting his six-year-old prediction as, “iPhone will fail some way, someday,” or declaring that iPhone is being killed, a few months before it fuels the most profitable quarter of any company, ever? Perhaps both frauds are overshadowed by Harvard’s own magazine’s willingness to uncritically publish whatever the guy says.

 

Update 2015.04.12 — Allworth (Exponent #41):

You pick up the newspaper every day, and they explain what happens in such a way that it reads like it was blindingly obvious. But really, if anyone writing the newspaper knew it was as blindingly obvious as they made it out to be in retrospect, they probably wouldn’t be writing the news; they’d probably have invested a lot of money and they’d probably be on an island somewhere.

He can see this with crystal clarity when it comes to unspecified news-service business writers. But what about business-theory professors? Why aren’t they sipping tiki cocktails on oversized yachts? Allworth doesn’t grace us with this information — thankfully, Lepore lets us in on the answer:

The theory of disruption is meant to be predictive. On March 10, 2000, Christensen launched a $3.8-million Disruptive Growth Fund, which he managed with Neil Eisner, a broker in St. Louis. Christensen drew on his theory to select stocks. Less than a year later, the fund was quietly liquidated: during a stretch of time when the Nasdaq lost fifty per cent of its value, the Disruptive Growth Fund lost sixty-four per cent.

 

Update 2015.05.06 — Dediu (The Critical Path #148) summarizing Apple Watch:

Two thumbs up. I think it’s a great product. I think it’s going to be a great category. It’s also potentially disruptive to computing as we know it. And what we end up with is maybe not a replacement, but an enhancement, an extension of it. If you bundle it with the iPhone — and let’s assume just as a way of thinking about it — the iPhone goes from being a $600 product to suddenly a $1000 product. And it’s a $1000 product that you might update every two years. And that iPhone franchise is suddenly increased by 30%. Or 50%, I should say. That is a huge thing. And that in itself is a brilliant business. But this is not really what it’s about; it’s about changing where the software lives and how computing is moving on to becoming more personal.

And in the next episode:

The stock market still doesn’t think [Apple’s] worth all that much; it’s still pricing it at 17 P/E, which is just mind-blowing. When you take cash out of that picture, cash and returns and yield to the shareholder, it’s one of the cheapest stocks in any large-cap index, which, again, shocks me to no end.

Would we ever hear Christensen say that a new item from Apple is going to be a great product or a great category, and that it’s potentially disruptive to computing as we know it? Would Christensen ever say that Apple’s stock price is shockingly, mind-blowingly low? This is why my respect for Dediu’s work doesn’t extend to his mentor.

 

Update 2015.08.27 — HBR runs a story called “How Samsung Became a Design Powerhouse” — written by a Samsung designer and a Samsung consultant.

 

Update 2015.09.15 — Ah, another HBR lecture about Apple by Professor Sampere:

“Apple Pay Is Just a Big Giveaway to Credit Card Companies”

“It’s easy to assume Apple Pay is one in a long line of disruptive innovations from the master of serial disruption. But this time that’s not the case.”

This time I’m right.

“There’s no technical reason why the banks need to go through the credit card companies to offer credit services to their customers. So Apple executives could have negotiated with retail banks, just as it [sic] did with the recording labels, to launch Apple Pay. If it [sic] had, Apple Pay would have been a substitute for credit cards, and would truly be disruptive to the credit card industry. Instead, Apple negotiated with the credit card companies, which is why you need to introduce your credit card number, instead of your bank account number, to configure the application.”

So Apple could have tried to get retail banks to become credit card companies, instead of just dealing with the credit card companies. And this would have been the right move, uh — why?

“If Apple succeeds at developing a standard for mobile payments, the credit card companies will retain all the bargaining power they currently have with banks to gain access to people’s money and can circumvent Apple at any moment.”

Oh, right, Apple will get shafted. Apple’s successes will be abruptly taken away. Because, you know. Windows. In the ’90s. Everybody knows that!

 

Update 2015.09.29 — Allworth and Thompson (Exponent #53) explaining how Apple might be losing its way:

JA: [Apple Watch] is confirmatory for me, a little bit, that maybe part of this magic that makes this company so special — its ability to take out everything, everything that’s extraneous, and just leave the very core of what matters most in its products — maybe some of that magic is starting to be lost.

BT: ... You’ll make this point that Apple’s magic comes from its ability to focus, and deliver just the most narrow product, and then evolve from there. And people will respond, oh yeah, but they’re a big company now, like, they can walk and chew gum at the same time, or something to that effect. And that’s to miss the point. You don’t focus and do a minimum viable product to save money, or to save resources. You do it because to presume that you can see that far down the road is to make that five-year plan that I talked about at the beginning, and Apple is well on their way to achieving their plan with the watch, and it’s ending up in the wrong place.

JA: I completely agree with this.

Before Jobs showed off iPhone in January ’07, I speculated that it would be a very narrow, get-just-a-few-features-really-right kind of gadget. To my surprise, it was an extremely general-purpose computer that put many disparate functions into a single device in your pocket. And, of course, it rapidly became one of the most — if not the most — successful technology products of all time.

I don’t pretend to have any idea whether Apple Watch has such a huge future ahead of it, but I can say this: Allworth’s and Thompson’s “narrow product magic” is a myth. It wasn’t true nine years ago, and it’s not true now.

 

Update 2016.02.04 — Thrice published in HBR this past year, Professor André Spicer, PhD, of Cass Business School at City University London, appears to know who his friends are (reported by Andrew Cave in Forbes):

“Spicer ... is concerned. ‘Apple may have had the largest quarterly profits in history but it could go from darling to dud within a few years,’ he says. ‘This happened to Nokia before, and it could easily happen to Apple. Its strategy of providing a limited range of high priced products could backfire. As smartphones increasingly become undifferentiated commodities, people will start asking why they are paying such a huge premium and Apple could find itself trapped by what it is good at. ... Now Apple is trying to make up for flat sales of its core product by moving into other markets like healthcare, financial services and cars. ... It is uncertain where the skills of making cool looking mobile phones will translate into banking.’”

“‘It is easy to paint a doom and gloom scenario,’ he says, ‘but what is more likely is that Apple will shift from being outstanding to being simply ordinary. When this happens, many of the habits which you find in middle aged companies will kick in — cost cutting, fashion following, and pointless and repetitive change programs. ...’”

“‘Some investors will be eyeing Apple’s cash pile, hoping they might see it returned to them rather than being reinvested to reinvigorate the company.’ If that happens, it will probably be the ‘first nail in Apple’s coffin,’ predicts the London professor.”

And in his own Newsweek article, last March:

“The Apple Watch: The Perfect Gizmo For the Narcissist”

“Our research has found that smartwatches ... have a hidden, darker side which the companies selling them are unlikely to talk about. ... We found that heavy smartwatch users valued how the devices helped them track information as it arrived while still appearing socially attentive.”

“In the work Carl Cederström and I have done on the hidden dangers of wearables, a big big concern is privacy. The Apple Watch, like most wearables, is essentially a tracking device ... The result? A database of personal information of which the Stasi could only have dreamed.”

“[W]earables could fuel an unhealthy obsession with personal wellness. ... people start to become more interested in what is going on inside themselves rather than what is happening in the world. ... we pour more attention into monitoring and controlling ourselves, giving us less time to do the things which actually make us happy.”

 

Update 2016.02.05 — It’s a year after Apple’s corporate-history-breaking quarter, and the company’s doing even slightly better than it was then — looks like Professor Sampere can’t resist lobbing more bombast their way, in HBR:

“Apple’s Shrinking Impact in the Smartphone Industry”

“[A]fter Steve Jobs came the iPhone 6. It was a game changer for Apple, and not in a good way. ... screen size is simply an industry feature, one that other smartphone companies have introduced already.”

“The fact that the larger screen was a valued feature for consumers and that it was much less costly for Apple to produce and launch explains the record earnings of the iPhone 6. But the longer-term picture is not so rosy.”

“Now ... there is no smartphone company that is a market-creating innovator. Apple, Samsung, and the others are stuck in a battle of sustaining innovations ... Unfortunately for Apple, the strategic shift to engaging in classical competition instead of continuing leading the industry doesn’t have a good prognosis. In these situations, the incumbent almost always fails ...”

Can’t wait to hear what Sampere thinks of Apple’s “failures” this time next year.

 

Update 2016.04.27 — Bill George, Harvard Business School Senior Fellow, speaking on CNBC, on the subtopic “Has Apple Lost Credibility?”:

The pressure’s really on Tim [Cook]. He needs an Act 2, desperately. He’s done a good job for five years consolidating all the gains that Steve put in place, and he is a supply-chain guy, you got it exactly right, but he’s got plenty of innovators underneath him, Jony Ive and many, but I think he needs to stop just incrementally designing, and come out with that breakthrough. And we haven’t seen that yet.

At this point, would it be at all unfair to describe Harvard as an anti-Apple echo chamber?

 

Update 2017.09.08 — Allworth, in Exponent #123 “There Is No Going Back,” once again doubles-down on Christensen’s correctness:

[Ben,] you’re selling [aggregation theory] a little short. You’re talking about it in terms of explanatory power. I would actually use a different word, which is predictive power. When you get an idea that is able to help you see into the future, and see how things are gonna play out? Those are the most valuable ideas, whether you’re an entrepreneur, an investor, or even an employee trying to decide what kind of company you wanna work at, or which company to work at. Being able to pick something up and use it as a lens to figure out how things are going to play out.

And the other one that we often keep coming back to is disruption. And yeah, the circumstances in which you apply disruption have changed, then the theory needs to evolve as a result of it. But it has tremendous predictive power. And that is the power— and the reason why you shouldn’t feel sheepish about aggregation theory, because if you can use it as a way to figure out what’s gonna happen in the future, that’s an incredibly powerful tool.

After everything that’s transpired in the last twenty years, I would think that at least a decade of solid predictive successes (starting today) would be warranted before any dogmatic restatement of the theory’s accuracy could be considered at all appropriate. So at this point, anyone should be on historically safe ground saying that for Harvard Business School graduates, loyalty to institution and instructor matters much more than whether anything they’re saying is actually true.

 

Update 2018.12.04 — Harvard business professors Willy Shih and David Yoffie, as reported by Jake Swearingen in New York Magazine:

Shih: “Commoditization is the normal cycle for most products. When the first Xerox plain-paper copier came out, they were really cool and Xerox became a fabulously successful company. ... Now, copier machines are a dirt-order commodity. ... Once you get driven into the commodity space, you start to think, ‘Oh, I’ve just gotta come up with the next great feature that will cause people to buy my product over the others.’ But at some point, you way exceed what consumers need or are willing to pay for. And then you become a commodity.”

Yoffie: “When Apple is at an average selling price of $800, and the rest of industry is at $300, you can only defy gravity for so long.”

Swearingen: “How Long Can Ap­ple Defy Gravity? ... its strategy of slowly raising its average selling price while selling fewer phones has a natural limit. ... for Apple to be successful in replacing lost revenue with its services division, it still needs a large number of iOS users. Raising the average selling price is effective in the short term, but may backfire in the long term. ... the next decade will likely be a great shakeout, with some old guard names leaving the market, some newer brands becoming ascendant ... But manufacturer loss is consumer gain; the main differentiator in a mature market where commoditization has fully taken hold is one that should be appealing to anyone who shops for a smartphone: price.”

 

See also:
The Old-Fashioned Way
&
Apple Paves the Way For Apple
&
iPhone 2013 Score Card
&
Disremembering Microsoft
&
What Was Christensen Thinking?
&
Four Analysts
&
Remember the iPod Killers?
&
The Innovator’s Victory
&
Answering the Toughest Question About Disruption Theory
&
Predictive Value
&
It’s Not A Criticism, It’s A Fact
&
Vivek Wadhwa, Scamster Bitcoin Doomsayer
&
Judos vs. Pin Place
&
To the Bitter End

 


Why There’s No MagSafe On the New MacBook

Sundar Pichai Says Devices Will Fade Away

The Question Every Apple Naysayer Must Answer

Apple’s Move To TSMC Is Fine For Apple, Bad For Samsung

Method of Implementing A Secure Backdoor In Mobile Devices

How I Clip My Cat’s Nails

Die Trying

Merger Hindsight

Human Life Decades

Fire and the Wheel — Not Good Examples of A Broken Patent System

Nobody Wants Public Transportation

Seasons By Temperature, Not Solstice

Ode To Coffee

Starting Over

FaceBook Messenger — Why I Don’t Use It

Happy Birthday, Anton Leeuwenhoek

Standard Deviation Defined

Not Hypocrisy

Simple Guide To Progress Bar Correctness

A Secure Backdoor Is Feasible

Don’t Blink

Predictive Value

Answering the Toughest Question About Disruption Theory

SSD TRIM Command In A Nutshell

The Enderle Grope

Aha! A New Way To Screw Apple

Champagne, By Any Other Maker

iOS Jailbreaking — A Perhaps-Biased Assessment

Embittered Anti-Apple Belligerents

Before 2001, After 2001

What A Difference Six Years Doesn’t Make

Stupefying New Year’s Stupidity

The Innovator’s Victory

The Cult of Free

Fitness — The Ultimate Transparency

Millions of Strange Devotees and Fanatics

Remember the iPod Killers?

Theory As Simulation

Four Analysts

What Was Christensen Thinking?

The Grass Is Always Greener — Viewing Angle

Is Using Your Own Patent Still Allowed?

The Upside-Down Tech Future

Motive of the Anti-Apple Pundit

Cheating Like A Human

Disremembering Microsoft

Security-Through-Obscurity Redux — The Best of Both Worlds

iPhone 2013 Score Card

Dominant and Recessive Traits, Demystified

Yes, You Do Have To Be the Best

The United States of Texas

Vertical Disintegration

He’s No Jobs — Fire Him

A Players

McEnroe, Not Borg, Had Class

Conflict Fades Away

Four-Color Theorem Analysis — Rules To Limit the Problem

The Unusual Monopolist

Reasonable Projection

Five Times What They Paid For It

Bypassable Security Certificates Are Useless

I’d Give My Right Arm To Go To Mars

Free Advice About Apple’s iOS App Store Guidelines

Inciting Violence

One Platform

Understanding IDC’s Tablet Market Share Graph

I Vote Socialist Because...

That Person

Product Naming — Google Is the Other Microsoft

Antecessor Hypotheticum

Apple Paves the Way For Apple

Why — A Poem

App Anger — the Supersized-Mastodon-In-the-Room That Marco Arment Doesn’t See

Apple’s Graphic Failure

Why Microsoft Copies Apple (and Google)

Coders Code, Bosses Boss

Droidfood For Thought

Investment Is Not A Sure Thing

Exercise is Two Thirds of Everything

Dan “Real Enderle” Lyons

Fairness

Ignoring the iPod touch

Manual Intervention Should Never Make A Computer Faster

Predictions ’13

Paperless

Zeroth — Why the Century Number Is One More Than the Year Number

Longer Than It Seems

Partners: Believe In Apple

Gun Control: Best Arguments

John C. Dvorak — Translation To English

Destructive Youth

Wiens’s Whine

Free Will — The Grand Equivocation

What Windows-vs.-Mac Actually Proved

A Tale of Two Logos

Microsoft’s Three Paths

Amazon Won’t Be A Big Winner In the DOJ’s Price-Fixing Suit

Infinite Sets, Infinite Authority

Strategy Analytics and Long Term Accountability

The Third Stage of Computing

Why 1 Isn’t Prime, 2 Is Prime, and 2 Is the Only Even Prime

Readability BS

Lie Detection and Psychos

Liking

Steps

Microsoft’s Dim Prospects

Humanity — Just Barely

Hanke-Henry Calendar Won’t Be Adopted

Collatz Conjecture Analysis (But No Proof; Sorry)

Rock-Solid iOS App Stability

Microsoft’s Uncreative Character

Microsoft’s Alternate Reality Bubble

Microsoft’s Three Ruts

Society’s Fascination With Mass Murder

PlaysForSure and Wikipedia — Revisionism At Its Finest

Procrastination

Patent Reform?

How Many Licks

Microsoft’s Incredible Run

Voting Socialist

Darwin Saves

The Size of Things In the Universe

The Self-Fulfilling Prophecy That Wasn’t

Fun

Nobody Was In Love With Windows

Apples To Apples — How Anti-Apple Pundits Shoot Themselves In the Foot

No Holds Barred

Betting Against Humanity

Apple’s Premium Features Are Free

Why So Many Computer Guys Hate Apple

3D TV With No Glasses and No Parallax/Focus Issues

Waves With Particle-Like Properties

Gridlock Is Just Fine

Sex Is A Fantasy

Major Player

Why the iPad Wannabes Will Definitely Flop

Predators and Parasites

Prison Is For Lotto Losers

The False Dichotomy

Wait and See — Windows-vs-Mac Will Repeat Itself

Dishonesty For the Greater Good

Barr Part 2

Enough Information

Zune Is For Apple Haters

Good Open, Bad Open

Beach Bodies — Who’s Really Shallow?

Upgrade? Maybe Not

Eliminating the Impossible

Selfish Desires

Farewell, Pirate Cachet

The Two Risk-Takers

Number of Companies — the Idiocy That Never Dies

Holding On To the Solution

Apple Religion

Long-Term Planning

What You Have To Give Up

The End of Elitism

Good and Evil

Life

How Religion Distorts Science

Laziness and Creativity

Sideloading and the Supersized-Mastodon-In-the-Room That Snell Doesn’t See

Long-Term Self-Delusion

App Store Success Won’t Translate To Books, Movies, and Shows

Silly iPad Spoilsports

I Disagree

Five Rational Counterarguments

Majority Report

Simply Unjust

Zooman Science

Reaganomics — Like A Diet — Works

Free R&D?

Apple’s On the Right Track

Mountains of Evidence

What We Do

Hope Conquers All

Humans Are Special — Just Not That Special

Life = Survival of the Fittest

Excuse Me, We’re Going To Build On Your Property

No Trademark iWorries

Knowing

Twisted Excuses

The Fall of Google

Real Painters

The Meaning of Kicking Ass

How To Really Stop Casual Movie Disc Ripping

The Solitary Path of the High-Talent Programmer

Fixing, Not Preaching

Why Blackmail Is Still Illegal

Designers Cannot Do Anything Imaginable

Wise Dr. Drew

Rats In A Too-Small Cage

Coming To Reason

Everything Isn’t Moving To the Web

Pragmatics, Not Rights

Grey Zone

Methodologically Dogmatic

The Purpose of Language

The Punishment Defines the Crime

Two Many Cooks

Pragmatism

One Last Splurge

Making Money

What Heaven and Hell Are Really About

America — The Last Suburb

Hoarding

What the Cloud Isn’t For

Diminishing Returns

What You’re Seeing

What My Life Needs To Be

Taking An Early Retirement

Office Buildings

A, B, C, D, Pointless Relativity

Stephen Meyer and Michael Medved — Where Is ID Going?

If You Didn’t Vote — Complain Away

iPhone Party-Poopers Redux

What Free Will Is Really About

Spectacularly Well

Pointless Wrappers

PTED — The P Is Silent

Out of Sync

Stupid Stickers

Security Through Normalcy

The Case For Corporate Bonuses

Movie Copyrights Are Forever

Permitted By Whom?

Quantum Cognition and Other Hogwash

The Problem With Message Theory

Bell’s Boring Inequality and the Insanity of the Gaps

Paying the Rent At the 6 Park Avenue Apartments

Primary + Reviewer — An Alternative IT Plan For Corporations

Yes Yes Yes

Feelings

Hey Hey Whine Whine

Microsoft About Microsoft Visual Microsoft Studio Microsoft

Hidden Purple Tiger

Forest Fair Mall and the Second Lamborghini

Intelligent Design — The Straight Dope

Maxwell’s Demon — Three Real-World Examples

Zealots

Entitlement BS

Agenderle

Mutations

Einstein’s Error — The Confusion of Laws With Their Effects

The Museum Is the Art

Polly Sooth the Air Rage

The Truth

The Darkness

Morality = STDs?

Fulfilling the Moral Duty To Disdain

MustWinForSure

Choice

Real Design

The Two Rules of Great Programming

Cynicism

The End of the Nerds

Poverty — Humanity’s Damage Control

Berners-Lee’s Rating System = Google

The Secret Anti-MP3 Trick In “Independent Women” and “You Sang To Me”

ID and the Large Hadron Collider Scare

Not A Bluff

The Fall of Microsoft

Life Sucks When You’re Not Winning

Aware

The Old-Fashioned Way

The Old People Who Pop Into Existence

Theodicy — A Big Stack of Papers

The Designed, Cause-and-Effect Brain

Mosaics

IC Counterarguments

The Capitalist’s Imaginary Line

Education Isn’t Everything

I Don’t Know

Funny iPhone Party-Poopers

Avoiding Conflict At All Costs

Behavior and Free Will, Unconfused

“Reduced To” Absurdum

Suzie and Bubba Redneck — the Carriers of Intelligence

Everything You Need To Know About Haldane’s Dilemma

Darwin + Hitler = Baloney

Meta-ware

Designed For Combat

Speed Racer R Us

Bold — Uh-huh

Conscious of Consciousness

Future Perfect

Where Real and Yahoo Went Wrong

The Purpose of Surface

Eradicating Religion Won’t Eradicate War

Documentation Overkill

A Tale of Two Movies

The Changing Face of Sam Adams

Dinesh D’Souza On ID

Why Quintic (and Higher) Polynomials Have No Algebraic Solution

Translation of Paul Graham’s Footnote To Plain English

What Happened To Moore’s Law?

Goldston On ID

The End of Martial Law

The Two Faces of Evolution

A Fine Recommendation

Free Will and Population Statistics

Dennett/D’Souza Debate — D’Souza

Dennett/D’Souza Debate — Dennett

The Non-Euclidean Geometry That Wasn’t There

Defective Attitude Towards Suburbia

The Twin Deficit Phantoms

Sleep Sync and Vertical Hold

More FUD In Your Eye

The Myth of Rubbernecking

Keeping Intelligent Design Honest

Failure of the Amiga — Not Just Mismanagement

Maxwell’s Silver Hammer = Be My Honey Do?

End Unsecured Debt

The Digits of Pi Cannot Be Sequentially Generated By A Computer Program

Faster Is Better

Goals Can’t Be Avoided

Propped-Up Products

Ignoring ID Won’t Work

The Crabs and the Bucket

Communism As A Side Effect of the Transition To Capitalism

Google and Wikipedia, Revisited

National Geographic’s Obesity BS

Cavemen

Theodicy Is For Losers

Seattle Redux

Quitting

Living Well

A Memory of Gateway

Is Apple’s Font Rendering Really Non-Pixel-Aware?

Humans Are Complexity, Not Choice

A Subtle Shift

Moralism — The Emperor’s New Success

Code Is Our Friend

The Edge of Religion

The Dark Side of Pixel-Aware Font Rendering

The Futility of DVD Encryption

ID Isn’t About Size or Speed

Blood-Curdling Screams

ID Venn Diagram

Rich and Good-Looking? Why Libertarianism Goes Nowhere

FUV — Fear, Uncertainty, and Vista

Malware Isn’t About Total Control

Howard = Second Coming?

Doomsday? Or Just Another Sunday

The Real Function of Wikipedia In A Google World

Objective-C Philosophy

Clarity From Cisco

2007 Macworld Keynote Prediction

FUZ — Fear, Uncertainty, and Zune

No Fear — The Most Important Thing About Intelligent Design

How About A Rational Theodicy

Napster and the Subscription Model

Intelligent Design — Introduction

The One Feature I Want To See In Apple’s Safari.