This story appeared on Network World Fusion at http://www.nwfusion.com/columnists/2002/1202bradner.html

'Net Insider: A resilient architecture

By Scott Bradner
Network World, 12/02/02

Sept. 11, 2001, was a generally quiet day on the Internet. This was true even though the attacks in New York destroyed some important network facilities. It might not have looked that way to those trying to get through to CNN and other news sources, but those problems turned out to be local to the news sources. There also were connectivity disruptions to a few countries because of poor design choices made in the past. These are some of the conclusions in a recently released National Research Council report. On a somewhat more worrisome note, the report indicates that the Internet might not fare so well if it were the direct target of a major attack.

The report, "The Internet Under Crisis Conditions: Learning from Sept. 11," is available for online reading (through a crappy reader) or for purchase here.

The main reason the Internet was largely unaffected on Sept. 11 is its underlying architectural vision. This vision comes from some early research that led to the ARPANET (see On Distributed Communications: Introduction to Distributed Communications Network) and from the initial ARPANET design philosophy (The Design Philosophy of the DARPA Internet Protocols).

The Internet consists of many highly interconnected individual networks, most of which are highly interconnected internally. This architecture means the loss of major interconnection points or major communications links has little effect, because traffic simply bypasses the outage through other links or interconnection points (a short sketch at the end of this column illustrates the effect).

A few network outages occurred on Sept. 11 where connectivity was not as rich as it might have been, or where users were directly connected to network equipment that was destroyed or that lost power in the aftermath. But these outages were isolated.

Less isolated were the visible problems with news sites such as cnn.com. These sites, or the links to them, quickly became overloaded as office workers tried to find out what was happening. Most problems were fixed within a few hours as the sites did what they should have done in the first place and distributed their content among a number of redundant servers around the network. The same basic problem struck South Africa when it turned out that the country's name server was not replicated as it should have been, but instead was located only in New York (a second sketch at the end shows one way to check a zone's name servers).

The report specifically does not attempt to predict how the Internet would perform if its infrastructure were the target of a sustained attack. One hint came a few weeks ago when the root name servers were subjected to a denial-of-service attack. In that case there was little effect, but we might not be so lucky in the future unless some of the known vulnerabilities are addressed.

This type of objective analysis of such a terrible day does make me feel funny; it's a bit like Federal Aviation Administration accident investigators saying the engines were working just fine when a plane crashed. It is needed, but it must not overlook the human cost.

Disclaimer: No joke this week. I am not speaking for the university in the above.

Bradner is a consultant with Harvard University's University Information Systems. Reach him at sob@sobco.com.
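
The column's point about rich interconnection can be illustrated with a toy reachability check. The sketch below is not taken from the NRC report; it builds two invented five-node topologies, one that hangs every network off a single exchange point and one with redundant links between the same networks, then asks whether the survivors can still reach one another after that exchange point is lost. It uses only the Python standard library, and all of the node names are made up for the illustration.

    # Toy illustration (not from the NRC report) of why rich interconnection
    # matters: remove one "interconnection point" from each topology and see
    # whether the remaining networks can still reach one another.
    from collections import deque

    def reachable(adj, start, removed):
        """Return the set of nodes reachable from `start`, skipping `removed`."""
        seen, queue = {start}, deque([start])
        while queue:
            node = queue.popleft()
            for nxt in adj[node]:
                if nxt not in seen and nxt not in removed:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

    def still_connected(adj, removed):
        """True if every surviving node can still reach every other survivor."""
        survivors = [n for n in adj if n not in removed]
        return reachable(adj, survivors[0], removed) >= set(survivors)

    # Hub-and-spoke: every network hangs off a single exchange point, "hub".
    hub_and_spoke = {
        "hub": {"a", "b", "c", "d"},
        "a": {"hub"}, "b": {"hub"}, "c": {"hub"}, "d": {"hub"},
    }

    # Richly meshed: the same networks, but each pair has more than one path.
    meshed = {
        "hub": {"a", "b", "c", "d"},
        "a": {"hub", "b", "d"},
        "b": {"hub", "a", "c"},
        "c": {"hub", "b", "d"},
        "d": {"hub", "c", "a"},
    }

    for name, topology in [("hub-and-spoke", hub_and_spoke), ("meshed", meshed)]:
        ok = still_connected(topology, removed={"hub"})
        print(f"{name}: survivors still connected after losing the hub? {ok}")
    # hub-and-spoke: False -- the single interconnection point was a fatal loss.
    # meshed: True -- traffic simply routes around the outage over other links.

The same comparison scales up: the more independent paths between any two networks, the more simultaneous failures it takes to partition them.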
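
The South Africa anecdote comes down to name server replication: a zone is supposed to list several name servers, placed in different locations and on different networks, so that losing one site does not take the whole domain offline. The following sketch is one quick way to eyeball that for a zone you care about. It assumes the third-party dnspython package (version 2.x, where the lookup call is dns.resolver.resolve; older releases named it dns.resolver.query), and "example.org" is only a placeholder zone, not one discussed in the report.

    # Minimal sketch: list a zone's NS records and the IPv4 address of each
    # server. A zone whose "redundant" servers all resolve into one building
    # or one network has the same weakness the column describes.
    import dns.resolver

    def name_server_report(zone: str) -> None:
        """Print the NS records for `zone` and the IPv4 address of each server."""
        servers = sorted(str(ns.target).rstrip(".")
                         for ns in dns.resolver.resolve(zone, "NS"))
        print(f"{zone}: {len(servers)} name server(s) listed")
        for host in servers:
            try:
                addrs = [a.to_text() for a in dns.resolver.resolve(host, "A")]
            except dns.resolver.NoAnswer:
                addrs = []
            print(f"  {host:<30} {', '.join(addrs) or '(no IPv4 address)'}")

    if __name__ == "__main__":
        # "example.org" is just a placeholder; substitute the zone to inspect.
        name_server_report("example.org")

Counting the servers is only the first step; the addresses matter too, since several NS records that all point into the same facility give no real redundancy, which is roughly the problem the column describes.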