title:  Old bugs don't die easily

 

by: Scott Bradner

 

One might think that a vulnerability first described in 1985 would not be a factor in today's Internet, especially when a good way to eliminate it was published in 1996.  But, sad to say, that is not the case.  The Wall Street Journal reported the other day that Internet consulting company Guardent claims an old bug has risen from its supposed grave.

 

Transmission Control Protocol (TCP), described in RFC 793 (http://www.ietf.org/rfc/rfc793.txt), is the basic reliable data delivery protocol used in the Internet.  TCP uses a data sequence number as the basis of its reliable delivery process.  When your computer sends data to another computer on an IP network, it breaks the data stream into chunks, known as packets, for transmission.  Each packet carries a sequence number that marks where its data falls in the byte stream, in effect a running count of the bytes sent so far in that conversation.  The destination host responds with an acknowledgement packet containing the sequence number of the next byte of data it expects to see.  The sender uses this acknowledgement to find out what data has made it to the destination node.
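
To make that bookkeeping concrete, here is a rough Python sketch of the idea.  The numbers and names in it are made up for illustration; real TCP starts from a negotiated initial sequence number, spends one sequence number on the SYN and deals with loss and retransmission, none of which appears here.

    # A much-simplified model of TCP-style sequence and acknowledgement
    # numbering.  The initial sequence number and packet size below are
    # arbitrary values chosen for the example.

    ISN = 1000   # assumed initial sequence number for this conversation
    MSS = 4      # assumed packet payload size in bytes

    def send_stream(data):
        """Split a byte stream into packets, tagging each with the sequence
        number of its first byte (ISN plus the bytes already sent)."""
        packets, sent = [], 0
        while sent < len(data):
            chunk = data[sent:sent + MSS]
            packets.append({"seq": ISN + sent, "data": chunk})
            sent += len(chunk)
        return packets

    def acknowledgement(packet):
        """The receiver's acknowledgement names the next byte it expects,
        telling the sender everything before that byte has arrived."""
        return packet["seq"] + len(packet["data"])

    for p in send_stream(b"hello world!"):
        print(p["seq"], p["data"], "-> ack", acknowledgement(p))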

 

In many environments trust relationships are defined between hosts, for example between a file server and its clients.  An attacker who knows what sequence numbers a file server will use and who can forge IP addresses can fool the file server into thinking it is talking to a trusted client when it is actually talking to the attacker.  To make this attack harder, computers try to make the initial sequence number used in each new conversation difficult to guess.
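
A toy simulation makes the danger plainer.  The "server" below is just a Python object, and the fixed 64,000-per-connection increment stands in for the sort of predictable sequence number generators older systems used; the point is only that an attacker who never sees the server's replies can still guess the acknowledgement the server expects and complete a forged handshake.

    import itertools

    # Old-style predictable initial sequence numbers: each new connection
    # simply gets the previous value plus a fixed increment.
    STEP = 64000
    isn_counter = itertools.count(start=1, step=STEP)

    class Server:
        def syn(self, claimed_source):
            """Pick an ISN and remember the acknowledgement number that would
            complete the handshake.  The SYN-ACK carrying the ISN goes to the
            claimed source address, so a spoofing attacker never sees it."""
            isn = next(isn_counter)
            self.expected_ack = isn + 1
            return isn

        def ack(self, claimed_source, ack_number):
            """Accept the connection if the acknowledgement matches."""
            return ack_number == self.expected_ack

    server = Server()
    # Step 1: the attacker connects from its own address to sample the pattern.
    probe_isn = server.syn("attacker.example")
    # Step 2: the attacker sends a SYN that claims to come from a trusted
    # client.  The reply goes to the real client, not to the attacker.
    server.syn("trusted-client.example")
    # Step 3: the attacker blindly guesses that the ISN grew by one step.
    print("forged handshake accepted:",
          server.ack("trusted-client.example", probe_isn + STEP + 1))

In practice an attacker also has to keep the real client quiet so it does not reset the forged connection, but the guessable sequence number is the heart of the trick.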

 

But coming up with a good way to make the initial sequence number hard to guess has proved troublesome.  Robert Morris's February 1985 paper (ftp://ftp.research.att.com/dist/internet_security/117.ps.Z) details the above attack and makes some suggestions on how to prevent it.  A decade later, Steve Bellovin published a more detailed description and a set of recommendations in RFC 1948 (http://www.ietf.org/rfc/rfc1948.txt).
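
Bellovin's recommendation, roughly, keeps the traditional clock-driven sequence space but gives every connection its own offset, computed from a hash of the connection's addresses and ports plus a secret known only to the host, so watching the sequence numbers on one connection reveals nothing useful about another.  The Python sketch below shows the shape of that computation; the per-boot secret, the use of MD5 (the hash the RFC mentions) and the clock handling are simplified stand-ins, not what any particular vendor ships.

    import hashlib
    import os
    import time

    # RFC 1948 style: ISN = M + F(local address, local port,
    #                             remote address, remote port, secret)
    # where M is the standard 4-microsecond clock and F is a cryptographic
    # hash of the connection identifiers and a host secret.

    SECRET = os.urandom(16)   # stand-in for a secret chosen at boot time

    def initial_sequence_number(local_ip, local_port, remote_ip, remote_port):
        # M: a counter that ticks every 4 microseconds, wrapped to 32 bits
        m = int(time.time() * 1_000_000 // 4) & 0xFFFFFFFF
        # F: keyed hash of the connection identifiers
        material = (f"{local_ip}:{local_port}:{remote_ip}:{remote_port}".encode()
                    + SECRET)
        f = int.from_bytes(hashlib.md5(material).digest()[:4], "big")
        return (m + f) & 0xFFFFFFFF

    print(initial_sequence_number("192.0.2.1", 1023, "198.51.100.7", 2049))

Because the offset depends on the secret, an attacker who opens its own connection to the server learns its own sequence numbers but nothing about the ones the server will use when talking to a trusted client.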

 

But, just like the users I mentioned in last week's column, system vendors are sometimes not all that good at fixing their software to avoid vulnerabilities, even when the vulnerabilities have been known for a very long time (centuries of Internet time).  In this case vendors did generally try to plug the security hole after a well-publicized attack in 1994, but they did not then add the additional protections that Bellovin described two years later in RFC 1948 because those were seen as too hard to do.

 

But Guardent's report indicates that avoiding the hard work just meant that the problem did not go away.  It is, of course, a truism in the security area that good security is not easy. This example should be taken as just another reminder of that truth for anyone concerned with the security of their own network and systems.

 

disclaimer:  Harvard's motto makes a claim for truth but the observation in this column is mine.