More on “buy vs. build”

May 18th, 2010  |  Published in Uncategorized  |  3 Comments

In an earlier post I mentioned Nicholas Carr’s book Does IT Matter? and its argument that information technology has become a commodity, and that an organization like the University should therefore buy off-the-shelf systems to meet its IT needs. I think anyone who wants to engage in the “buy vs. build” argument should read this book to understand the reasoning, even if they don’t agree with it in the end.

In this book, Carr reviews how earlier technology advances, like the development of railroads and the harnessing of electricity, played out in business. At first, businesses that embraced the new technology had a strategic advantage over those that didn’t. After a while, though, all businesses were using it (or had gone out of business) so there was no strategic advantage to the technology, it had just become a part of the cost of doing business. Instead of devising a “railroad strategy” or an “electricity strategy,” businesses located near a rail line or hooked up to an electric utility and purchased standard equipment to use the technology. Carr argues that information technology has reached this same stage.

Carr applies his argument to both hardware and software. Now, I’m mostly a software guy, so my discussion will focus on that side. Software is economically unusual: it costs a lot to create a program, but once it has been written, copying and distributing it costs almost nothing. This drives software toward commodity pricing: the more copies of a program you sell, the smaller the share of the fixed development cost each customer has to cover.
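The arithmetic behind this is simple enough to sketch. Here is a toy calculation with made-up numbers (the $1,000,000 development cost and near-zero marginal cost are illustrative assumptions, not figures from Carr):

```python
# Illustrative numbers only: suppose a program costs $1,000,000 to develop
# (the fixed cost) and essentially $0 per additional copy (the marginal cost).
FIXED_COST = 1_000_000
MARGINAL_COST = 0

def cost_per_copy(copies):
    """Average cost each customer must cover for the vendor to break even."""
    return FIXED_COST / copies + MARGINAL_COST

# As the customer base grows, the break-even price per copy collapses,
# which is what pushes widely sold software toward commodity pricing.
for copies in (100, 10_000, 1_000_000):
    print(f"{copies:>9} copies -> ${cost_per_copy(copies):,.2f} per copy")
```

The same fixed cost that makes custom in-house software expensive per user makes a widely sold package cheap per user, which is the economic pull toward “buy.”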

On the other hand, a big part of commoditization is standardization. I can treat electricity as a commodity because every device plugs into a few standard outlets: rather than having to buy, say, the same kind of toaster as my last one because it draws power from the electrical lines in a proprietary way, I can buy any brand. If I have a Ford car, I can replace it with a Chevy without changing the route I drive to work. With software we do not yet have nearly this degree of standardization. As Bryan Cantrill said in The Economics of Software:

The problem is that for all of the rhetoric about software becoming a “commodity”, most software is still very much not a commodity: one software product is rarely completely interchangeable with another. The lack of interchangeability isn’t as much of an issue for a project that is still being specified (one can design around the specific intricacies of a specific piece of software), but it’s very much an issue after a project has deployed: deployed systems are rife with implicit dependencies among the different software components. These dependencies — and thus the cost to replace a given software component — tend to increase over time. That is, your demand becomes more and more price inelastic as time goes on, until you reach a point of complete price inelasticity. Perhaps this is the point when you have so much layered on top of the decision, that a change is economically impossible. Or perhaps it’s the point when the technical talent that would retool your infrastructure around a different product has gone on to do something else — or perhaps they’re no longer with the company. Whatever the reason, it’s the point after which the software has become so baked into your infrastructure, the decision cannot be revisited.

This is another article you should definitely read if you are interested in this issue. If you do, you’ll see that his analysis has influenced mine. Cantrill’s conclusion is that the peculiar economics of software mean that in the long run open source provides the greatest benefits for both software producers and consumers.

So how does this apply to the University? We are definitely in a “vendor lock-in” situation, and the vendors that have us locked in are not just IBM (as some seem to be framing the issue) but also Software AG and, well, ourselves, with the software we’ve developed in-house. Any change is going to be costly, but also any change will most likely result in being locked in to a new set of vendors. Perhaps Cantrill is right, and we should be looking more closely at open source software if we want to have more flexibility in the future.

Responses

  1. Adam Connor says:

    May 19th, 2010 at 11:33 am

    I’m a fan of open source, and I’d support looking more closely at it, but it brings its own requirements, which may be onerous in different ways. We might need different skills and expertise to work with it, and we might not be comfortable with the kind of support we would have. It might require a significant shift in mindset.

    I read Carr’s book 4-5 years ago, so I mostly just remember my impression at this point, which was that it was too facile for my tastes. It is easy to call something a commodity, but that can easily be a half-truth. Hamburgers are certainly a commodity at one level, but that doesn’t mean that we should all eat at McDonald’s because it is cheapest. I think there remain many situations where technical innovation and cleverness can give an organization an advantage, and intuitively we all know this: that’s why technology startups exist and sometimes prosper. Whether that fits with UT’s business strategy I can’t answer, though; maybe we’re McDonald’s, IT-wise.

    More when I have time to think about it…

  2. ross hartshorn says:

    May 20th, 2010 at 11:23 am

    One of the main reasons things become commodities is that there are economies of scale of some kind. In software, of course, there is a lot of that, because the cost of duplication is almost zero. This makes it very attractive economically to use the same software as others.

    There are basically three ways to use the same software as others:
    1) buy something lots of other people buy
    2) join an open-source community
    3) open-source your own software, and convince others to start using it (and eventually contributing as well)

    We have the in-house expertise for (1), but it almost inevitably leads to vendor lock-in. While I would love to do (3), it requires using tools lots of other people use, so that they are able to join in (e.g. Java rather than Natural, relational databases). Really, though, (2) almost requires that too, since in order to customize things to your own needs you’ll have to work in the tools the open-source options are written in, and you will almost certainly need to customize something.

    Our current strategy seems to be a mix of (1) for hardware and (3) for software, but the only people we let in are other components of UT (probably also the only people we could talk into using it).

  3. Adam Connor says:

    May 21st, 2010 at 7:25 am

    I would add: while being part of a community (vendor, open source, whatever) reduces cost by spreading the fixed costs over more users, it can only provide the common portion that many or most users need. Java still has the best libraries I’ve ever seen, but you aren’t going to find a Java jar that replicates our accounting system. In the end, you either write your custom stuff yourself, or you change your practices to match the software that is available.

    Having a lot of patches on top of someone else’s platform can be a painful experience, since every time there is a new release your patches may break. That is part of the genius of open source — there is a chance you can get your patches into the core. But that’s less likely unless they are generally useful, so there is still a lot of implicit pressure to conform to the common pattern.
