title: Forgetting DCE
by: Scott Bradner
Some of you might remember the
Distributed Computing Environment (DCE), but it's not clear that many of the
industry pundits or venture capitalists do. Or at least they haven't internalized a principal reason
that DCE is, to put it politely, not prevalent today.
For those who do not
remember, the Distributed Computing Environment is a set of technologies developed
by the Open Software Foundation (now called "The Open Group,"
www.opengroup.org) that enables a computer user to make use of network-based
resources to augment their local computer. DCE, quoting from an IBM web page, "is a comprehensive suite of
integrated, yet modular, products which support transparent file access and
secure resource sharing in heterogeneous, networked computing environments." (For more info on DCE see
http://www.faqs.org/faqs/dce/faq/.)
I'm sure the Open Group will
call this simplistic, but in my mind a major reason that DCE was developed was
to share resources, such as disk space and processor cycles, over a network because
having enough dedicated resources for each individual was too expensive. With DCE a user can get access to
databases without having to keep a local copy and can get heavy-duty processing
done without having to have as powerful a computer on their desk.
But the DCE proponents did
not take into account the continued development of technology. Before the DCE specifications could be
fully developed, disk and computer technology had advanced enough to negate many of
the assumed advantages of using DCE.
DCE was based on the assumption that the cost of managing distributed
resources would remain less than the cost of replicating those
resources. That assumption did not
prove to be long-lived.
There just may be a lesson in
the history of DCE for those who are considering investing in peer-to-peer
networking, storage-as-a-service offerings, or maybe even virtual private
networks (VPNs).
I am leaving out a number of
other arguments that were made in the case of DCE: single sign-on, centralized
backup, centralized authorization management, and more. Some of these arguments are now made
for the newer technologies -- they may prove to be just as non-decisive as they were
for DCE. I am also leaving out the
ego factor that leads network managers to think they should control
everything that connects to their networks. That factor is harder to analyze -- some of the egos are
rather strong.
An undercurrent of Clayton Christensen's
"The Innovator's Dilemma" is
that it is quite hard for people to take into account the fact that technology
does not stand still when they evaluate their options. It is much too easy to see what you can buy today and assume
that it represents what will be available in the future. A case in point may be the pundits who
dismiss using the best-effort Internet for telephony -- all they can see is that
it would not work well enough for them today. They forget that using today as a guide is what led to developing
DCE.
disclaimer: Harvard knows yesterday and today
rather well but, like me, has trouble with tomorrow; still, I provide the above
caution anyway.