The following text is copyright 2004 by Network World. Permission is hereby given for reproduction, as long as attribution is given and this notice is included.

 

Just how is this news?

 

By Scott Bradner

 

One of the two top stories on CNET for January 18th was called "seeds of destruction" and focused on a report that some computer security experts see a parallel between the spread of agricultural blights and computer viruses.  In particular, these experts think that today's technology monocultures (for example, Microsoft desktop machines) may react like the agricultural monocultures of the past when confronted by a new virus: namely, most of them will die.

 

What I do not understand is why CNET thought this was news in January 2004.  This has been the view of computer security experts and amateurs alike for many years.  CNET itself ran multiple stories on the topic in 2003, including one about the U.S. National Science Foundation (NSF) funding university researchers to look at the issue.

 

This conclusion is also completely obvious.  If 90-plus percent of the computers in an organization run exactly the same operating system, then 90-plus percent of the machines in that organization are vulnerable when a new virus shows up to exploit a previously unknown bug in that operating system.  In case you have been living in a cave, exploits aimed at the dominant operating system are not uncommon, even if they tend to go after known bugs rather than new ones.  If the same organization ran machines with operating systems from 10 different vendors who did not share their code, then only 10% or so of the organization would be at risk.  It might be an important 10%, but the impact on the organization would still likely be a lot less than in the monoculture (and currently common) case.
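
To put rough numbers on that reasoning, here is a minimal back-of-the-envelope sketch (Python, purely illustrative; the vendor names and shares are made up): the fraction of machines exposed to an exploit aimed at a single operating system is simply that operating system's share of the organization's installed base.

    # Illustrative sketch: exposure to a single-OS exploit under
    # a monoculture versus a hypothetical diversified deployment.

    def fraction_at_risk(os_shares, targeted_os):
        """os_shares maps OS name -> share of the organization's machines."""
        return os_shares.get(targeted_os, 0.0)

    # Monoculture: 90-plus percent of machines run one OS.
    monoculture = {"dominant_os": 0.92, "other": 0.08}

    # Hypothetical diversified shop: ten vendors, roughly equal shares.
    diversified = {f"vendor_{i}": 0.10 for i in range(10)}

    print(fraction_at_risk(monoculture, "dominant_os"))  # ~0.92 of machines exposed
    print(fraction_at_risk(diversified, "vendor_3"))     # ~0.10 of machines exposed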

 

 

But so what? 

 

There are not enough vendors of personal computer operating systems to make any significant difference to this threat.  Even if one did not take into account the advantages of managing a uniform environment and the need for good application-level interoperability, it would be next to impossible to come up with more than three viable possibilities.

 

It seems to me that all this talk of the dangers of software monocultures (some of it my own) is accurate but irrelevant.  As history has made very clear, no matter how often security holes are found, companies are not going to swap out their Microsoft systems for alternatives in enough quantity to make any significant difference.  Even if they did, it would just make the alternative a more attractive target.

 

Macs, at least the latest versions, do have quite good interoperability with Windows environments, but they do seem to scare the support people.  When I'm feeling selfish I'd like to see Macs stabilize at 15 to 20 percent of the market.  That would be a big enough market to keep Apple creating these fantastic products but not so big as to become too much of a target for the wackos.  When I'm not feeling quite so selfish I think the better-for-Apple level would be 25 percent or so.

 

I'm all for NSF funding good research and hope the project focuses more on ways to deal with the effects of the monoculture than on lamenting its existence.

 

disclaimer:  Many things can be accurately said about Harvard; "monoculture" is not one of them.  But the above lament on lamenting is my own.