From RISKS Digest 19.95
The discussion about the lack of NT security books seems to miss the larger issue. Much
of the thinking about computer security seems to hark back to the good old days when we knew
how to build secure systems, and to simple systems such as Unix that provide simple
security models for simple problems.
I too am nostalgic for the old days of Multics, when security was defined in terms of
usability. The requirement was that users be able to trust the system. This was less an
issue of military security than of being able to specify what access was to be given and to
whom. The system was honed with just Read/Execute/Write (and Append, but that's already a
messy one) on files (and directories). The access list could explicitly name users or
projects (administrative groupings of users). If the user didn't know about the security
system, the defaults didn't create any further exposures.
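To make that model concrete, here is a minimal sketch of the access-list idea -- Python with made-up principals, not anything from Multics itself. A file carries an explicit list of users or projects and their modes, and anyone not listed gets nothing:

```python
# A minimal sketch (not Multics code) of the access-list idea: each file
# carries explicit (principal, modes) entries, where a principal is a user
# or a project, and anyone not listed gets nothing by default.

READ, EXECUTE, WRITE, APPEND = "r", "e", "w", "a"

class AccessList:
    def __init__(self):
        self.entries = {}                      # principal -> set of modes

    def grant(self, principal, *modes):
        self.entries.setdefault(principal, set()).update(modes)

    def allows(self, user, project, mode):
        # Default is deny: a user unaware of the security system
        # exposes nothing beyond what is explicitly listed.
        modes = self.entries.get(user, set()) | self.entries.get(project, set())
        return mode in modes

acl = AccessList()
acl.grant("alice", READ, WRITE)        # an individual user (hypothetical)
acl.grant("Project-MAC", READ)         # an administrative grouping of users

print(acl.allows("alice", "Project-MAC", WRITE))   # True
print(acl.allows("bob", "Project-MAC", READ))      # True, via the project
print(acl.allows("bob", "Guests", WRITE))          # False -- default deny
```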
Unix had a different design point. The system was assumed to be like a friendly work
group with papers left on desks for easy access. The Unix default was to allow the world to
read all of your files. Not safe for the naive, but unlike Multics the assumption was that
you really didn't want to keep anything confidential on your computer. (I know I'll be
flamed for this, but defaulting to world-readable means that only fools would trust the system
security. Alas, the default is to be a fool.) The initial access system with groups made
any attempt to really control access problematic. And then SU was thrown in to get around
all of this. Perhaps some of this has been addressed, but to think of Unix as the model for
security seems silly.
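The point about defaults can be seen directly on any Unix-like system. This small Python check (the scratch file path is just an example) shows that with the conventional umask of 022, a newly created file is readable by everyone on the system:

```python
# Illustration (Unix-like systems only) of the default the text complains about:
# with the conventional umask of 022, a newly created file is world-readable
# unless the user takes explicit action.

import os, stat

os.umask(0o022)                              # the usual default on many systems

path = "/tmp/demo_world_readable.txt"        # hypothetical scratch file
with open(path, "w") as f:
    f.write("left on the desk for easy access\n")

mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))                             # typically 0o644
print(bool(mode & stat.S_IROTH))             # True: anyone on the system can read it
os.remove(path)
```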
Then there were PCs and other small systems that simply lacked any notion of access
control. That was fine since they were never, ever going to be on a network.
So now we have gripes about NT. In fact, NT has a very sophisticated security model
with the ability to specify access control for all sorts of objects. And it has been C2
certified (at least in a vacuum; networks tend to be problematic in the world of
security). The problem with NT is that there is no effective UI for security. More to the
point, thanks to the sophistication (AKA complexity), it is difficult to understand what
is really going on in terms of security, especially with mechanisms such as path access
(security on shares) mixed in.
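As a rough illustration of why the mix is confusing: access to a file reached through a share is typically the more restrictive combination of the share permissions and the per-file ACL, so loosening one list may have no visible effect. The sketch below is illustrative Python with made-up permission names, not a real Windows API:

```python
# Sketch of why "path access" (share permissions) mixed with per-file ACLs is
# hard to reason about: over the network the effective rights are roughly the
# intersection of the two lists. Names are illustrative, not Windows APIs.

def effective_access(share_perms: set, ntfs_perms: set) -> set:
    # The more restrictive of the two wins for access over a share.
    return share_perms & ntfs_perms

ntfs  = {"read", "write", "delete"}      # what the file's ACL grants the user
share = {"read"}                         # what the share grants the same user

print(effective_access(share, ntfs))     # {'read'} -- write is silently lost
```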
Of course, these systems have common problems, like security being specified in terms of
local certification authorities (i.e., the local system directory or the NT domain).
But the REAL problem is that the de jure (or military) model of security has
little to do with the real world:
- The attributes associated with files have little to do with the access appropriate for
programs (agents) acting on behalf of users. The result is that programs get wide access
(in Unix via SU; in NT the grain might be finer) and then must make sure they don't
"abuse" this authority due to bugs.
- Mapping identity onto a login ID is naive since users might have different authority
based on roles (see the sketch after this list).
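A sketch of what these two points might look like in practice. The role names, permission strings, and delegation call below are all hypothetical, not any real operating system's API:

```python
# Sketch: authority attaches to the role a user is currently acting in, and a
# program (agent) run by that user gets only the slice of authority it needs,
# rather than everything the login ID can do. All names are hypothetical.

ROLE_PERMS = {
    "payroll-clerk": {"read:payroll", "update:payroll"},
    "employee":      {"read:own-record"},
}

class Session:
    def __init__(self, user, role):
        self.user, self.role = user, role
        self.perms = ROLE_PERMS[role]

    def delegate(self, needed):
        # Hand an agent only the permissions it asks for AND the role has.
        return self.perms & set(needed)

alice_at_work = Session("alice", "payroll-clerk")
alice_at_home = Session("alice", "employee")

report_agent = alice_at_work.delegate({"read:payroll"})
print(report_agent)                              # {'read:payroll'} -- not update
print(alice_at_home.delegate({"read:payroll"}))  # set() -- same login, different role
```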
But the most important issue goes back to the Multics principle that all is moot if
the user doesn't understand the system and is not comfortable with that understanding. And
the defaults must be safe. The current systems require vigilance. Perhaps that's OK in a
de jure world where the user is blamed for mistakes. But in the real world
this is called an insensitive bureaucracy, for good reason.
As if this weren't bad enough, wrapping firewalls and other bad ideas around all this and
calling it a solution just adds to the complexity.
If I need to read a manual to be secure, I don't have security.
Furthermore, system security is a minor part of the problem. The real issue is
security of the applications: what authority am I giving to what user, with what intent,
under what circumstances? How do the operating system mechanisms serve these needs?
In the old days of time-sharing systems, where we just wanted to protect files online,
things were simpler. The world is more interesting now, and so is the question of what we
mean by security.
Note that some people wondered about the contradiction between sophisticated and
problematic. The explanation is simple -- complexity. A sophisticated system is also
likely to be too complex to be understood, and thus the consequences of one's actions cannot
be trusted. Ideally this complexity is compensated for by greater expressive power, but that's
not the norm.
Note that the October 1998 issue of Scientific American
has an article on hacking. It gives a good illustration of why it is necessary to have
access control at every end point rather than providing complete entrée to those who get
past the front door (firewall). Unfortunately,
the article isn't posted online, so you must find the paper edition.
I attended the Upside Summit
last week. While attendance was disappointing, that worked in favor of the attendees by
providing access to the stellar set of speakers. I won't attempt to summarize the meeting
since you can read about it at the Upside site.
One can speculate on why the attendance was low, but my assumption is that the
heads-down focus that serves the entrepreneur makes it difficult to worry about
seemingly distant policy issues. But, whether we like it or not, the Internet and our
industry are a major obsession of society and, by extension, Congress. And even if
we weren't worried about legislation about the Internet and the computer industry, the
increasing overlap between the consumer electronics industry, the "media" and
the computer industry requires that we face up to the regulatory legacy of these other
industries.
I came with my own agenda. Most briefly, I call it unencumbered IPv6: specifically,
trying to keep the Internet simple, with enough addresses so that we do not need to create
choke points such as firewalls and proxies. I go into this in more detail elsewhere,
so I won't repeat it here. It is necessary to reduce this to a sound bite that
gets attention in order to be able to discuss it further. It's still difficult to make
people realize the problems with the current Internet architecture. But this is not the
issue for the Washington crowd.
The real concern is regulating without understanding what the Internet is and what it is
about the Internet that has opened up communications to real competition. Traditionally,
communications has been regulated based on the assumption of scarcity and the assumption
that one can identify the purpose of each communications path. The Internet has changed
the rules.
Richard Wiley, former FCC Chairman, gave a talk that covered Digital TV among other
issues. I spoke to him briefly, and he still believed that communications is a scarce
resource. No surprise, since it is hard to understand what it means to grow by factors of
10 or 100. Thus Congress is attempting to force everyone to buy a new TV in the next ten
years. It's as if we had legislated quadraphonic sound, 8-track tapes or other failed
technologies.
Perhaps if it were the only path to improving television one could argue for legislation
and the investment of a few billion dollars. But it is not necessary to legislate signal
formats when conversions are a matter of software and, perhaps, some assist chips. It is
even less obvious why we must give away a huge amount of spectrum to the broadcasters to
bribe them into forcing us to buy new TVs.
Prof. Stan Liebowitz (and his colleague whose card I can't find) researched path
dependence. It is a myth that Beta was better than VHS. Even if it had some advantages,
the convenience of VHS dominated. And Laserdiscs, which produce a much better picture, just
didn't find a market that valued picture quality enough to put up with the clunkiness of
the medium. Forcing people to accept an arbitrary definition of video quality whether they
want it or not also defies the vote of the marketplace.
Of course I want better quality. But the IP infrastructure will give us a much better
way to achieve it.
I was also on a telecommunications panel. Again, this is an issue I've discussed
elsewhere. I'll touch upon the concept of the user defining the product in the next
section.
I didn't pay much attention to the debates on the Microsoft antitrust suit since it
seemed silly to have lawyers argue technical issues. The whole notion of an operating
system boundary is obsolete. I do find some of the marketing deals questionable and can
see some action on this. But there is just too much danger in trying to legislate software
design decisions. I view the whole issue as moot since the PC is at the end of its lifecycle.
But, perhaps, that's the only time when lawyers can have the illusion of understanding.
The conference coincided with Congress passing the CDA-II act, which attempts to
prohibit stuff that offends those who are afraid of knowledge. (OK, so I'm biased.) Larry Magid noted the inconsistency between this act and
the release of the Starr Report to the net. But speakers such as Robert Bork and
Orrin Hatch didn't see any contradiction. Of course, one shouldn't take politicians too
seriously. (Speaking of Bork, you might find this review of his moral diatribe
entertaining.)
User-defined products are the real revolution that is going to displace the PC. The PC, in fact,
succeeded because consumers defined the product. One doesn't buy a word processor. One
buys a computer and uses it as a word processor.
This was a result of the commoditization of computation. By being able to buy
computation independent of the particular application, one could optimize the platform for
price and performance and then rely on a software marketplace to create the most effective
applications. The users themselves defined the platform by their choices. They could also
define the platform more directly by writing programs which, in turn, become products for
others to buy. The result is a rapid cycle of innovation building on previous innovation, and
a process of steadily falling hardware prices.
The commoditization of communications, coupled with the ability to create very
inexpensive processors, is the basis for this next generation of products. The availability
of the IP infrastructure allows products to emphasize form as well as function and to make
compromises for their particular application, while complementing their local function by
connecting to other devices and services.
The Palm Pilot is a good example of such a transitional product. It is defined by the
ability to be carried casually, with simple access to its internal functions. It does have
some programmability, which allows for its use in other applications. Its main
extensibility comes from its ability to exchange contents with the PC.
With the PC, the business and personal machines are essentially the same. The general
purpose nature of the platform penalizes over-specialization with a smaller marketplace
and, more important, the reduced availability of software.
In the next generation, form will typically define the product. Initially these will be
defined by the manufacturer. A watch is a watch; a cellular phone is just like the desk
phone but mobile. And the DTV (Digital Television) will tune to the limited selection of
programs provided by broadcasters.
But this will rapidly change, if for no other reason than the difficulty of solving
users' problems through industry committees. We are still stuck with the Set Top Box that
does more to complicate the entertainment system than to enable it. In the absence of
standards, we have a plethora of clever work-arounds that don't really work and are very
confusing. In telephony, we have services that will sequence through your numbers until
they find you. And, even then, trying to coordinate voice mailboxes, home/office phones,
family phones, etc., is complex.
Imagine having protocols which allow you to write your own rules for how these devices
function. More likely, you'll buy precanned rules and tune them, just as you buy
programs and tweak them. A phone would be able to screen for the calls that you want to answer
and would coordinate with your messaging services to handle messages and provide you with
a list. You could then view the messages on a PC or listen to them on the phone. There are
pieces of these products, but they come burdened with other capabilities rather than
allowing you to mix and match and adjust them to your needs.
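As a sketch of what such user-written rules might look like, here is a hypothetical call-routing rule set in Python; the numbers, actions, and record format are invented for illustration, not any real telephony API:

```python
# Sketch of "writing your own rules" for a phone. Everything here -- the
# rule format, the Call record, the action names -- is hypothetical; the
# point is that the logic is ordinary user-editable code, not something
# baked into the carrier's switch.

from dataclasses import dataclass

@dataclass
class Call:
    caller: str
    hour: int                                 # 0-23, local time

FAMILY = {"+1-555-0100", "+1-555-0101"}       # made-up numbers

def route(call: Call) -> str:
    if call.caller in FAMILY:
        return "ring-all-devices"
    if 9 <= call.hour < 18:
        return "ring-office-phone"
    return "take-message-and-list-it"         # review later on the PC or the phone

print(route(Call("+1-555-0100", 22)))         # family gets through any time
print(route(Call("+1-555-0199", 22)))         # strangers after hours go to messaging
```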
The experience with electronic mail provides a clear example of the power of
user-defined products. The X.400 email standard had the backing of all the
telecommunications authorities, their governments and government contractors. But it took
ten years just to start the process. In the meantime, the Internet mail protocol was
trivial and could be implemented over a weekend. And, unlike X.400, one could experiment
with it without any tools other than a standard terminal program (Telnet).
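For the flavor of that simplicity: the core of Internet mail (SMTP) is a handful of plain-text commands that one can literally type by hand over Telnet. The sketch below drives the same exchange from Python, assuming a mail server listening on localhost port 25 and made-up addresses:

```python
# The core of Internet mail (SMTP) is a handful of plain-text commands; this
# sketch just types them over a socket, the way one could by hand with Telnet.
# It assumes a mail server on localhost:25 and uses hypothetical addresses.

import socket

def send_line(sock, line):
    sock.sendall((line + "\r\n").encode("ascii"))
    print(sock.recv(1024).decode("ascii").strip())   # server's reply

with socket.create_connection(("localhost", 25)) as s:
    print(s.recv(1024).decode("ascii").strip())      # greeting banner
    send_line(s, "HELO example.org")
    send_line(s, "MAIL FROM:<alice@example.org>")
    send_line(s, "RCPT TO:<bob@example.org>")
    send_line(s, "DATA")
    s.sendall(b"Subject: hello\r\n\r\nImplemented over a weekend.\r\n.\r\n")
    print(s.recv(1024).decode("ascii").strip())      # acceptance of the message
    send_line(s, "QUIT")
```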
The ability to experiment with user-defined products is going to force a redefinition
of the product marketplace. Telecommunications is the first to feel this. The phone
companies will no longer define what telephony is. As IP telephony improves, traditional
telephony will disappear. Not because of any price advantage. Simply because the
limitations imposed by the PSTN will cease to make sense. They will also no longer be able
to charge outrageous fees such as $5/month to stop blocking caller ID. There is no underlying
cost to these services. They are only possible because the service is being held hostage by the
dinosaurs.
This isn't to say services have no value, just that they must be subject to marketplace
forces. The ability to price on value rather than cost is very attractive to service
providers, but technological changes are giving consumers the option of creating the
services themselves, and the service providers must adjust to the new rules.
Broadcasters will also discover that they no longer control the "bandwidth
budget", nor do they own eyeballs when the world is just a click away. It is unclear
how many people will continue to choose to watch the same broadcasts at the same time.
One's environment will no longer be defined by the limitations of thermostats and
rigid systems. If you want to reduce your energy usage, you'll be able to ask the
appliances for their usage history and buy scheduling programs to coordinate
usage against your priorities and energy prices. Or maybe you'll just store energy in fuel
cells.
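A toy sketch of that kind of scheduling, with invented prices and appliances: given hourly energy prices and how long each appliance needs to run, pick the cheapest contiguous window for it.

```python
# Toy version of the scheduling idea: find the cheapest contiguous window for
# each flexible appliance. Prices and appliances are made up for illustration.

def cheapest_window(prices, hours_needed):
    return min(
        range(len(prices) - hours_needed + 1),
        key=lambda start: sum(prices[start:start + hours_needed]),
    )

hourly_price = [0.18, 0.17, 0.12, 0.08, 0.07, 0.07, 0.09, 0.15]   # $/kWh, hypothetical
appliances = {"dishwasher": 2, "water-heater": 3}                  # run length in hours

for name, hours in appliances.items():
    start = cheapest_window(hourly_price, hours)
    print(f"{name}: run from hour {start} to hour {start + hours}")
```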
The Web shows the power of electronic connectivity. But it is just a hint of what will
happen when we have a rich set of protocols describing information and move beyond
the limitations of keyboard-defined interactions to let the devices cooperate directly.
Stay tuned; the ride is just beginning.