On Fri, 2006-04-28 at 16:18 +0200, Marcus Brinkmann wrote:
> Say Joe wants to shoot Sue, so Joe asks me for a gun. If I don't give
> him a gun to shoot Sue, but explain to him why shooting Sue is really a
> bad idea, I have not solved the problem?
Perhaps I have misunderstood your position on confinement. If so, I apologize.
I know that I should resist gun discussions in email, but I believe that
the scenario you establish above is actually analogous to your view on
confinement. It is artificial and wrong.
The problem with your scenario is that it presumes you know what the gun
will be used for. Certainly, if I come to you and say "Can I borrow your
gun so that I can shoot Bas (nothing personal)," then you should say no.
However, if I instead come to you and say "Can I borrow your gun?" it is
another discussion entirely. A gun is a tool, and there are legitimate
uses for it. Perhaps I wish to put a damaged animal out of its misery.
Perhaps I have reason to know that Bas is planning to break into my home
and I wish to defend my home. Perhaps I merely wish to shoot small
pieces of lead at inoffensive paper targets. Perhaps I plan to destroy
the gun. Without further discussion, you do not know whether the use of
the gun is proper.
In the same way, if I come to you and say "I would like a confinement
solution", you do not know, without further discussion, whether I plan to
use it for DRM or for some legitimate purpose.
Your position on confinement appears to be "It supports DRM, therefore
it should be banned". This is similar to the position "guns can be used
to kill people, therefore guns should be banned." Both positions are
dogma: they do not admit reason. Both positions are, in fact, immoral.
Both reject the legitimate uses of their respective tools, uses for which
no realistically feasible alternative is currently known.
Banning a device (or a technical means) is justified ONLY when a society
concludes that the overwhelming majority of actual uses are harmful to
others. Both requirements must be met: the overwhelming majority of uses
must be harmful, and the harm must be caused to third parties.
In my opinion, you have not satisfied either test. In fact, I do not
believe that either test *can* be satisfied today. There is an
insufficient base of knowledge about the uses of confinement from which
to draw any conclusion.
In fact, I do not believe that DRM satisfies either test. Yes, there is
a bad legal state today. I believe that the current situation is an
anomaly, and that balance will be restored. Such struggles for balance
have been the history of law for hundreds of years. You may not agree
with me. Neither of us will know who is correct for another 40 years.
Setting this aside, the consensus of society does not agree that the use
of DRM is harmful.
If I really wanted to ban something, I would ban software. Software has
been responsible for *far* greater harm than DRM. Think about it.
> In the end, it will be at the discretion of the machine owner whether
> he allows such an operation or not, just as it is at the
> discretion of the machine owner whether he enables the DRM chip in the
> computer or not.
This analogy also fails. In the DRM chip case, the chip is present and
the user merely needs to turn it on or off. In the Hurd case you have
(at least in the past) proposed that the basic feature of confinement
should be omitted from the design, and that people should have to do
significant work to add it.
There are two problems with this position:
1. The level of effort involved in the two cases is different by many
orders of magnitude.
2. Removing confinement from the design space compromises the
entire architecture of the system. This cannot be repaired
merely by adding some confinement software later.
> You seem to think that a principle and a dogma are two different
> things. I don't see why. They seem to differ mostly in what people
> connote with them.
A dogma is a position that does not admit of change or reason. It is
therefore irrational. A principle is a position based on the best
available reason, but is subject to change in the face of better
information, new facts, or clearer understanding. The anti-DRM position is a
dogma.
> > ...The people I love around me sometimes have
> > to protect me from myself, and I sometimes have to protect them from
> > themselves. And we are generally very grateful that we did that to each
> > other. That could be the case between Alice and Bob here.
> Yes, but that is a social contract between Alice and Bob, and I don't
> think that's a good guiding principle for an operating system design.
As an operating system architect, I am substantially more expert and
more knowledgeable than my users. I can anticipate, and architect to
prevent, errors and compromises that most users cannot even recognize!
Any knowing failure to do so is an ethical compromise. Any failure to do
so that leaves my users open to non-consensual harm is immoral. Note
that a user cannot give consent concerning issues that they do not
understand.
I believe that rejecting confinement as a basic building block is a
profoundly unethical decision.
L4-hurd mailing list