TUCoPS :: Crypto :: submit.txt

Submissions to NIST CSSAB on encryption and CLIPPER technology initiative

>From: padgett@tccslr.dnet.mmc.com (A. PADGETT PETERSON, P.E., INFORMATION
SECURITY (407)826-1101)
To: "kammer@micf.nist.gov"@UVS1.dnet.mmc.com,
Subject: Clipper/Capstone Key Escrow Management

re: maintaining Clipper/Capstone key confidentiality

Recently, in an e-mail conversation with Dorothy Denning, a thought occurred
to me concerning a means to avoid the key-management problems inherent
in authorized wiretaps.  Since this has been one of the apparent stumbling
blocks concerning the issue, and since Mrs. Denning indicated that this
possibility had not come up in her conversations with NSA/NIST, the
current "call for comments" seemed to be an appropriate time to present
my concept formally.

The objection seems to have been primarily this: if the keys are released
for a particular chip or chips so that a properly ordered wiretap may
take place, would not the keys (and the chips) have to be considered
compromised thereafter?

My concept is simply that the keys are never distributed outside the
escrow agency.  Instead, on presentation of a properly approved wiretap
order, the requesting agency receives a special Clipper chip, complementary
to the one mentioned in the order, that is configured for "receive only".

The chip is then used by the requesting agency for the duration of the tap
and is required to be returned to the escrow agency on expiration of the
order.

Utilizing this concept, three advantages accrue:

 1) Since the keys are never divulged, confidentiality is restored once
    the wiretap chip is returned to the escrow agency.

 2) Since the wiretap chip is unique and identifiable hardware, full
    accountability is maintained.

 3) Since the wiretap chip is "receive only", a recording of the encrypted
    transmission might be admissible as part of the "chain of evidence", as
    only the original Clipper could have produced it.

Note: while I have discussed the first two points before, I believe this is
      the first public mention of the third possibility.


                              A. Padgett Peterson, P.E.  

From forman@cs.washington.edu Thu May 13 12:29:49 1993
Received: from june.cs.washington.edu by csrc.ncsl.nist.gov (4.1/NIST)
     id AA03876; Thu, 13 May 93 12:29:41 EDT
Posted-Date: Thu, 13 May 93 09:29:41 -0700
Received-Date: Thu, 13 May 93 12:29:41 EDT
Received: by june.cs.washington.edu (5.65b/7.1ju)
     id AA23713; Thu, 13 May 93 09:29:41 -0700
Date: Thu, 13 May 93 09:29:41 -0700
>From: forman@cs.washington.edu (George Forman - GHF)
Return-Path: <forman@cs.washington.edu>
Message-Id: <9305131629.AA23713@june.cs.washington.edu>
To: crypto@csrc.ncsl.nist.gov
Cc: forman@cs.washington.edu


(Certainly others have submitted these ideas, so I'll be very terse.)

#1. I believe no attempt should be made to limit domestic use of strong
encryption techniques.  

(One cannot legislate that all communication be intelligible to the
government.  Such laws cannot be enforced.  Information can be sent in
many subtle ways.  Only the good guys and the dumb bad guys will comply.)

#2. While I think "key escrow cryptography" is interesting technology
(and perhaps useful within some businesses), I do not believe it should
be adopted as a national standard.

(Its costs and risks outweigh its practical benefit.  Consider #1 above.
Also, power corrupts -- the escrowed keys will be the subject of many
attacks.  Consider also the complexity and cost of maintaining nearly
infinitely many keys forever.  And how hard will it be for the FBI to
obtain the right escrow keys if a bad guy is using several stolen phones,
and perhaps encrypting his e-mail messages with standard encryption
programs available on BBSs and the Internet?)

#3. I think the details of any nationally adopted encryption scheme
should be published.

(I think publishing the details of the encryption system has a great
benefit-- lots of people who care will proof read it and test its
robustness.  Having only a few great minds proof it isn't as good as
having a lot of people beat on it.)

Thank you for your effort to collect responses,

     George Forman
     PhD candidate, Univ of Washington, Seattle
From tad@ksr.com Thu May 13 12:52:06 1993
Return-Path: <tad@ksr.com>
Received: from hopscotch.ksr.com by csrc.ncsl.nist.gov (4.1/NIST)
     id AA03910; Thu, 13 May 93 12:52:00 EDT
Posted-Date: Thu, 13 May 93 12:52:37 EDT
Received-Date: Thu, 13 May 93 12:52:00 EDT
Received: from ksr.com (frankenstein.ksr.com) by hopscotch.ksr.com with
     id AA28692; Thu, 13 May 1993 12:51:42 -0400
Received: from foramena.ksr.com by ksr.com (4.0/SMI-3.2)
     id AA11458; Thu, 13 May 93 12:52:39 EDT
Received: by foramena.ksr.com (4.1/KSR-2.0)
     id AA02284; Thu, 13 May 93 12:52:37 EDT
Date: Thu, 13 May 93 12:52:37 EDT
>From: tad@ksr.com
Message-Id: <9305131652.AA02284@foramena.ksr.com>
To: crypto@csrc.ncsl.nist.gov
Subject: Clipper chip review

   I wish to add my voice to the NIST review of the Clipper/Capstone
proposal. I welcome the opportunity to do so electronically.
   The Clipper Chip proposal seems to have value by establishing a
standard. However, it destroys that value through the secrecy of the
algorithm, the lack of a software implementation (which is required
to retain that secrecy), and the government access to decryption. I,
as a citizen, would not use any encryption as weak as this scheme
appears. Better schemes and better implementations exist in the open
market. I believe that the best approach for the federal government is
to relax the export criteria for encryption devices. The Clipper Chip
comes through as a waste of time and money.

                              Jeff Deutch
                              Computer Engineer
                              Kendall Square Research Corp.
(Affiliation given for identification purposes only.)

From john_fletcher@lccmail.ocf.llnl.gov Thu May 13 16:45:40 1993
Return-Path: <john_fletcher@lccmail.ocf.llnl.gov>
Received: from ocfmail.ocf.llnl.gov ([]) by csrc.ncsl.nist.gov
     id AA06513; Thu, 13 May 93 16:45:34 EDT
Posted-Date: 13 May 1993 13:41:19 U
Received-Date: Thu, 13 May 93 16:45:34 EDT
Received: from lccmail.ocf.llnl.gov by ocfmail.ocf.llnl.gov (4.1/SMI-4.0)
     id AA07003; Thu, 13 May 93 13:45:30 PDT
Message-Id: <9305132045.AA07003@ocfmail.ocf.llnl.gov>
Date: 13 May 1993 13:41:19 U
>From: "John Fletcher" <john_fletcher@lccmail.ocf.llnl.gov>
Subject: Cryptographic Issue Statement
To: "Crypto Issue" <crypto@csrc.ncsl.nist.gov>

                       Subject:                               Time:13:16
  OFFICE MEMO          Cryptographic Issue Statement          Date:5/13/93
Just by following the specifications published and widely available in FIPS
PUB 46, I personally programmed a C-language DES subroutine package in
about one week.  Subsequently I located a similar package developed in
Australia and available over the Internet.  I then found a DES program 
printed on page 506 of the book "Computer Networks" (2nd edition) by Andrew
S. Tanenbaum.  These are just three examples illustrating that the DES
"cat" is "out of the bag" and available to anyone.
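
Fletcher's point that DES can be rebuilt from public documents is easy to
believe given how simple the cipher's skeleton is.  The sketch below is a
toy Feistel network of the kind DES is built on; it is an illustration
only, with a placeholder round function and made-up keys, not the actual
FIPS PUB 46 tables or key schedule:

```python
# Toy Feistel network illustrating the structure at the heart of DES.
# The real cipher (FIPS PUB 46) has 16 rounds, fixed permutation tables,
# and S-boxes; the round function below is a stand-in.

def round_fn(half, subkey):
    # Placeholder for DES's expansion/S-box/permutation step.
    return (half * 0x9E3779B1 ^ subkey) & 0xFFFFFFFF

def feistel_encrypt(block, subkeys):
    left, right = block >> 32, block & 0xFFFFFFFF
    for k in subkeys:
        left, right = right, left ^ round_fn(right, k)
    return (right << 32) | left   # final half-swap, as in DES

def feistel_decrypt(block, subkeys):
    # Decryption is the same walk with the subkeys reversed.
    return feistel_encrypt(block, list(reversed(subkeys)))

keys = [0x0F1571C9, 0x47D9E859, 0x0B7A75C1, 0x6E5A7D3C]
ct = feistel_encrypt(0x0123456789ABCDEF, keys)
assert feistel_decrypt(ct, keys) == 0x0123456789ABCDEF
```

That decryption is the same loop run with the subkeys reversed is exactly
the property that makes the Feistel construction easy to implement in
either software or hardware.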

In view of this, I believe that the export ban on DES is misguided.
It is so clearly ineffective in limiting access to DES that it comes across
as just foolish.  My fear is that foolish regulations tend to reduce
respect for all regulations, even the ones that are well-founded.  That is,
I fear that there are those who might conclude that there are no secrets
worthy of containment when they see efforts expended to contain what is so
clearly not a secret, and I fear that they may act on that conclusion.

From BDCARRD1%BUDGET.BITNET@ENH.NIST.GOV Fri May 14 14:20:27 1993
Received: from ENH.NIST.GOV by csrc.ncsl.nist.gov (4.1/NIST)
     id AA10161; Fri, 14 May 93 14:20:21 EDT
Posted-Date: 14 May 1993 14:04:22 -0400 (EDT)
Received-Date: Fri, 14 May 93 14:20:21 EDT
Received: from BUDGET (BDCARRD1) by BUDGET.BITNET (Mailer R2.10 ptf000)
 BSMTP id 5675; Fri, 14 May 93 14:05:03 EDT
Date: 14 May 1993 14:04:22 -0400 (EDT)
Subject: Comment on Legal and Constitutional Issues
To: crypto@csrc.ncsl.nist.gov
Message-Id: <01GY62TS7P9E002LWL@ENH.NIST.GOV>
Content-Transfer-Encoding: 7BIT
Comments: Converted from PROFS to RFC822 format by PUMP V2.2X

     I would like to comment on the proposals concerning the Clipper
Chip with respect to the related area of Legal and Constitutional Issues.
I have worked in a variety of positions in the computer field including
technical support of security software and management of a data security
department. I have also completed part of a doctoral program in
Information Science with a concentration in Information Policy. Finally, I am
an active citizen with a strong love of the Bill of Rights.
     I am very concerned that our government should presume to reserve
the ability to gain access to our communications. If two individuals
today spoke a language that police eavesdroppers did not understand,
they ought not to be compelled to explain what the language was so that
the police could decipher their conversations. I believe that the case
of encryption is similar. If individuals are able to encrypt their
communications securely so that only they hold the key, then government
must do without that information. Any other rule would be a form of
a priori self-incrimination which would make a mockery of the Fifth
Amendment.
     The case with respect to the Fourth Amendment is similar. Even given
a legal search warrant, the police are not guaranteed that they will find
the evidence they seek. If the target of their search has hidden the
evidence very well, the search may fail. The target person cannot be
compelled to tell them where the evidence is. I see encryption as similar
to a physical hiding of potential evidence. Giving government the key
would be tantamount to telling the police where to find the evidence,
and compelling people to provide that key would be unconstitutional.
     If the Clipper Chip is only one option and people are free to use
any other hardware or software encryption, then it would not be a threat
to civil liberties. It would also be of questionable value.
     I urge a policy that would remove the NSA from any dominant role
in determining civil liberties questions, something that agency has
proven itself totally incapable of understanding. I urge a policy that
would preserve privacy and freedom from government interference, even
if this means that law enforcement must find other ways to gain evidence.
     Thank you for the opportunity to express my opinion on this important
public policy issue with which I am concerned both professionally and
personally. I give permission to reproduce this comment in any format.
I also make the standard disclaimer that these are my opinions alone.
     David G. Carroll
     1092 Van Antwerp Rd.
     Schenectady, NY   12309
     (518) 377-9384 (Home - weekdays aft. 6pm EDT or weekends)

From mrosing@igc.apc.org Sat May 15 11:16:19 1993
Return-Path: <mrosing@igc.apc.org>
Received: from cdp.igc.org by csrc.ncsl.nist.gov (4.1/NIST)
     id AA13703; Sat, 15 May 93 11:16:04 EDT
Posted-Date: Sat, 15 May 93 08:16:12 PDT
Received-Date: Sat, 15 May 93 11:16:04 EDT
Received: by igc.apc.org (4.1/Revision: 1.85 )
     id AA04916; Sat, 15 May 93 08:16:12 PDT
Date: Sat, 15 May 93 08:16:12 PDT
>From: Mike Rosing <mrosing@igc.apc.org>
Message-Id: <9305151516.AA04916@igc.apc.org>
To: crypto@csrc.ncsl.nist.gov
Subject: Crypto Issue Statement
Cc: eff@eff.org

          Cryptographic Issue Statement

     The purpose of this statement is to address some of the
issues raised by the Computer System Security and Privacy Advisory
Board from my personal perspective.  As I am not yet a professional
cryptographer I will leave the details for others and attempt to
focus on civil liberties and privacy issues.

     The whole issue presented by the Skipjack algorithm with key
escrow reminds me of a line in Lao Tzu's "Tao Te Ching": the more
rules and regulations a government creates, the more clever the people
become.  The United States government has told the American people
"trust us" for many years.  They gave us Viet Nam, the War on Drugs,
and an invasion at Waco.  Very few sane people trust the U.S.
government.  While the majority of the population is ignorant of what
cryptography can do for them, the people government wants to catch
(terrorists and drug dealers for example) are well aware of how to
keep a secret.

     It is impossible to give government access to all private
encrypted transmissions.  There are enough clever people in the U.S.
who can develop mathematical algorithms for encryption which can be
put into computers for easy use.  Will the government decide to outlaw
"strong crypto" because it defeats their ability to crack it?  If not,
will the government take a recorded encryption and force the creator
of the record to divulge the key?  This would be an exception clause
to the 5th amendment, but there are so many exceptions to the Bill
of Rights today that the majority of the populace won't notice.

     I ask these questions because I am working on a strong
crypto system that I believe I can sell to big business.  It will work
with software or hardware.  It should be straightforward to encrypt
a conversation before it goes to the Skipjack routine in the newly
proposed escrow system.  People will then know that even if the
government is listening to their conversation, it will be very
difficult to crack.  I'm sure there is a market for such a device.

     I have the feeling that it really doesn't matter what any
private citizen thinks about the key escrow scheme.  To keep our
privacy we are simply going to have to become more clever.  Keeping the
algorithm secret and then saying "there is no back door" and expecting
people to believe it is wishful thinking.  To be blunt, the whole
thing stinks.

     From what I know about government bureaucracy, this
committee is nothing but window dressing.  You can't actually do
anything about the introduction of the escrow technology system,
except slow it down by a few months.  Will you have the courage to say
this in your final report?  Does any government employee know how to
tell the truth?  You are going to read and hear many arguments against
this technology for the obvious reason that it gives too much power to
government.  The people who really need security will bypass this
technology for something they can trust.  What does the government do then?

     If you really want to accomplish something, allow private
citizens such as myself to play with the Skipjack algorithm and the
escrow chip.  When several thousand individual citizens have played
with the device and algorithm there will be a level of trust built up.
The "back door" we fear may be there, but it might be so hard to use
that we could be certain that only the NSA has the resources to use
it.  Without this level of trust, there is little point in introducing
this technology.  If people can't trust their government, all the
other issues are secondary.

Patience, persistence, truth,         reality:  dvader@hemp-imi.hep.anl.gov
Dr. mike                              home:           mrosing@igc.org  
IMI, P.O. BOX 2242, Darien IL 60559   phone: 708-859-0499

From mrnoise@econs.umass.edu Wed May 19 14:05:32 1993
Return-Path: <mrnoise@econs.umass.edu>
Received: from POBOX.UCS.UMASS.EDU by csrc.ncsl.nist.gov (4.1/NIST)
     id AA02699; Wed, 19 May 93 14:05:22 EDT
Posted-Date: 19 May 1993 14:05:09 -0400 (EDT)
Received-Date: Wed, 19 May 93 14:05:22 EDT
Received: from titan.ucs.umass.edu by POBOX.UCS.UMASS.EDU (PMDF #2573 ) id
 <01GYD1QUXJ8W00N6OJ@POBOX.UCS.UMASS.EDU>; Wed, 19 May 1993 14:05:12 -0400
Received: by titan.ucs.umass.edu (5.65/DEC-Ultrix/4.3) id AA13647; Wed,
 19 May 1993 14:05:10 -0400
Date: 19 May 1993 14:05:09 -0400 (EDT)
>From: "Mr. Noise" <mrnoise@econs.umass.edu>
Subject: NIST Open Meeting
To: crypto@csrc.ncsl.nist.gov
Cc: mrnoise@titan.ucs.umass.edu (Mr. Noise)
Message-Id: <9305191805.AA13647@titan.ucs.umass.edu>
Content-Type: text
Content-Transfer-Encoding: 7BIT
X-Mailer: ELM [version 2.4 PL21]
Content-Length: 2898

*  Permission granted to reproduce this message in any medium *

To Whom It May Concern:

Thank you for this opportunity to comment on the recent proposals to
establish a "key escrow" system and on the public availability of strong
cryptography in general.  I am a graduate student in the Economics
department at UMASS-Amherst as well as a partner in Brazerko,
a small company started this year in Connecticut to provide electronic
communications services in New London County.  I am writing to you
both as a citizen concerned with his right to privacy and as someone
concerned with the effect of government regulation in the rapidly-expanding
electronic communications sector.

As a citizen, I hold dear my rights to free speech and privacy, and I am
proud of America's rich heritage of liberty.  Naturally, I recognize the
government's legitimate need to impinge on that liberty when the nation's
security is at stake, but too often in our history the liberty of the
citizenry has been infringed on a flimsy pretext.  By making it costly for
the government to violate our privacy, the public availability of strong
cryptography will ensure that the government obtains access to our private
communications only when there is a clear need.  In my estimation, this
added liberty is surely worth the costs in increased difficulties for law
enforcement.

We must also ask ourselves who would be harmed by legislation restricting
availability of strong cryptography or eroding its security through a "key
escrow" system.  Clearly, criminals will not be harmed, but rather the
average, law-abiding citizen.  Just as a ban on small arms would leave
law-abiding citizens unable to defend themselves against gun-wielding
criminals, so too would restrictions on cryptography leave them defenseless
against those who would lawlessly invade their privacy.

The ability to communicate securely over the growing electronic network is
also important to industry.  If firms can conduct their business securely
on the public network, they will take advantage of the opportunities for
increased productivity it affords, engendering growth throughout the
economy.  This is especially important in our global economy, where the
worldwide communications network allows companies to expand overseas with
relative ease.  Not only must encryption be made available for public use,
it must be allowed to cross political boundaries.

While the government's proposal would provide the needed security in
theory, I am sure that others will write in to suggest why public registry
of keys makes the proposal flawed in the real world.  Only individual
citizens and firms can provide security for themselves.  A technology as
important as cryptography cannot be left to the vagaries of the public sector.


Robert Szarka

(For verification or further comment: 1-203-886-6294, voice)

From pcw@access.digex.net Tue May 25 14:59:24 1993
Return-Path: <pcw@access.digex.net>
Received: from access.digex.net by csrc.ncsl.nist.gov (4.1/NIST)
     id AA01394; Tue, 25 May 93 14:59:16 EDT
Posted-Date: Tue, 25 May 1993 14:59:10 -0400
Received-Date: Tue, 25 May 93 14:59:16 EDT
Received: by access.digex.net id AA18141
  (5.65c/IDA-1.4.4 for crypto@csrc.ncsl.nist.gov); Tue, 25 May 1993
14:59:10 -0400
Date: Tue, 25 May 1993 14:59:10 -0400
>From: Peter Wayner <pcw@access.digex.net>
Message-Id: <199305251859.AA18141@access.digex.net>
To: crypto@csrc.ncsl.nist.gov

Raymond Kammer 

Dear Mr. Kammer:

I'm filing my comments on the NIST Clipper Chip. I would like the opportunity
to testify at your meeting on June 2, 3, or 4.

Thank you for taking the time to solicit public comment on the chip.

-Peter Wayner

Comments on the National Institute of Standards and Technology's
(NIST) Proposed Encryption Chip with Key Escrow.
Peter Wayner

Permission is granted to freely distribute this text.

Abstract: My comments are limited to the practical problems involving
pure hardware solutions. I feel that such systems are unwieldy,
expensive, and not easily retrofitted into machines that are already in
service. More importantly, the key escrow system adds an additional
weakness that, if compromised, could render the standard obsolete. If
such a "Digital Pearl Harbor" occurred, the country would be without
secure channels until all of the hardware in the country could be
replaced, and this could easily take over a year.

My comments are limited to the practical problems involved in
implementing a hardware-based encryption standard for the country. I
believe that specialized hardware is an unnecessarily expensive and
overly complicated approach for providing solid encryption
capabilities and these costs will deter people from adopting the
standard. More importantly, these high costs and the general
inflexibility would prevent the US from having a quick response in the
event that the key escrow system became compromised.

Although it is hard to estimate the true effect that the NIST chip
could have on the price of telephones and computers, it is possible to
make ballpark guesses. Manufacturers like Sun Microsystems and IBM
multiply the cost of a part by about 4 to determine the impact of
adding that part to the final price of the machine. This would mean
that a chip that costs $25 would add about $100 to the purchaser's cost.
This rule of thumb includes the cost of adding extra inventory,
reworking the assembly lines, re-engineering circuit boards, re-
programming system software, training support staff, re-writing
manuals and other extraneous tasks that are not directly related to
the cost of the part.
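
As a rough check, the markup arithmetic quoted above can be written out.
The 4x multiplier is the author's ballpark figure attributed to
manufacturers like Sun and IBM, not an industry constant:

```python
# Rule-of-thumb price impact: a part's effect on the machine's final
# price is roughly 4x its component cost (the multiplier quoted above).
MARKUP_FACTOR = 4

def price_impact(part_cost, markup=MARKUP_FACTOR):
    """Estimated increase in the machine's final price."""
    return part_cost * markup

assert price_impact(25) == 100  # a $25 chip adds about $100
```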

Some low-end PC manufacturers are able to use lower multiples because
they provide less support and assistance for the final customer. More
importantly, they use very standard designs with off-the-shelf
chipsets that are optimized to make cheap computers available to all.
At this time, though, the chipsets are not designed to allow for an
encryption "co-processor" and adding the chip could be more
expensive. For this reason, I feel that the chip could also add
$100 to the price of off-the-shelf PCs -- an amount that is almost 10%
for many models.

The cost of adding the chip to any of the existing computers, though,
could be much more expensive. The chip would need to be mounted on an
expansion board that fits into computers. The cost for this board
would need to be about $100 to cover the costs of marketing,
packaging and stocking the product. Some computers, however, do not
have expansion slots and others have all of their expansion slots
filled up already. Computer manufacturers routinely survey users to
discover how many cards they use so the computers can be built with
the minimum necessary slots. In time, there would be enough space for
a NIST encryption chip card, but until then many users would have
trouble adding the chip to their current system.

The high cost is bound to slow the adoption of the standard because
the risk of data insecurity is nebulous and ill-formed. Will customers
be willing to pay extra for this security? Will the American people be
willing to add the chip to their home phones to protect themselves
from eavesdroppers listening for their credit card numbers? The
problems are severe, but people often don't protect themselves until
it is too late. If the cost is significant, then many people will
certainly balk at the added cost and slow, if not stop, the development
of the standard.

A Cheaper Solution

Naturally, every new feature is going to cost something. But the fact
is that encryption does not need to cost this much money if it is
accomplished in software. It could be almost free.  A student on
summer vacation can turn out a system that lives in the public domain.
There is ample evidence that people are willing to do this. PGP
(Pretty Good Privacy) is a system that Phil Zimmerman developed on his
own and gave to the world. NIST could easily pay someone to generate a
public-domain software version for general distribution if it wanted
to provide the lowest cost standard for the people.
There is already ample evidence that software solutions succeed and
hardware solutions do not. Several corporations, including Cryptech and
AMD, have manufactured fast DES chips for years. Yet the chips are
rarely found in applications. Public-domain implementations of DES
accomplish much of the DES encryption which is done in this country.

I think that most people would agree that a secure standard for data
encryption is necessary to the country's economic health. For this
reason, I believe that a free software implementation is the best way
to achieve this goal. Cost will not prevent people from adopting the
standard.

The Telephone Problem

Perhaps the best example of the cost of converting a $25 chip into a
marketable product is the AT&T secure phone announced on the same day
as the NIST chip. It was priced at over $1000. Certainly, some of this
cost covers the extra electronics to process the voice, but the need
to mark up products to pay for the work is still evident. The price on
these phones is sure to drop as the market grows more mature, but it
should be obvious that the market won't grow substantially until the
price drops more. The Government may be able to afford these rates,
but even the average corporation cannot.

The cost of adding secure encryption to the handheld market is more
difficult to estimate. Here size, weight and power consumption are
just as important as price and an extra chip adds to each of these
problems. Cellular companies currently aim to manufacture devices at
a price point of $100/unit in wholesale costs. The NIST chip would
mark up the price by at least 25%, drop the battery life, increase the
weight, and add to pocket bulge. These are not positive effects on a
product. Yet digital cellular phones and digital cordless phones are
perhaps the most important market for a secure encryption device
because the signals travel over the airwaves.
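
The cellular figure above follows the same back-of-the-envelope style;
this small check uses only the $100/unit wholesale price point and the
$25 chip cost quoted in the text:

```python
# Relative cost of the chip at the quoted wholesale price point.
wholesale_price = 100   # dollars per unit, the price point in the text
chip_cost = 25          # dollars, same chip-cost figure used earlier

markup_fraction = chip_cost / wholesale_price
assert markup_fraction == 0.25  # "at least 25%"
```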

As before, all of the work of the Clipper chip could be accomplished
in software. Many of the current digital cellular phones use
highly-integrated Digital Signal Processing computers that both
control the phone and handle the signalling chores. Adding encryption
to a phone can be done by merely instructing the programmer to add an
additional function. The cost per unit is minimal and the extra
feature does not affect the power consumption. There is no doubt that
most people would rather have a software solution.

"Digital Pearl Harbors"

The Key Escrow system allows the law enforcement agencies to access
the content of a signal when they are duly authorized. The NIST plan
requires that the key be split up and held by two separate agencies.
This is both a concession to those who fear abuse and a good safety
procedure. But we must remember Ben Franklin's admonishment that
"three can keep a secret if two are dead."

Does NIST have plans for replacing the chips throughout the country if
the key escrow services are compromised? Although I realize that
serious precautions will be taken to protect the keys, I hope that
NIST realizes their value. The Russians were able to obtain the
secrets of the atomic bomb and the hydrogen bomb for very little
money. There have been several high-profile spy cases involving
cryptographic information. The intelligence community recognizes the
need to keep information compartmentalized and to frequently change
codes and ciphers but there are still breaches of security. This system,
however, is barely compartmentalized.

Criminals are becoming increasingly adept with technology. One group
placed a fake Automated Teller Machine in a mall and used it to steal
account information which they later used to make fake withdrawals.
Many crimes like this will be possible in the future, and I have little
doubt that the escrowed keys will have much more value than the atomic
secrets.

The cost of replacing all of the NIST chips around the country would
be prohibitive. What would happen if the FBI discovered that two
people in the different escrow agencies succumbed to bribery? Would
NIST announce a recall of all encryption chips? What would they use to
replace the chips? It could take 6 months to design and fabricate a
new chip in sufficient quantities. There are at least 250 million
phones around the country and 50 million computers. Even if each
computer and phone had a zero-insertion-force socket that made
exchanging the chips easy, the cost to the country would be over $7
billion at $25 a chip.
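
The recall estimate above works out as follows, using only the figures
quoted in the text (250 million phones, 50 million computers, $25 per chip):

```python
# Back-of-the-envelope cost of recalling every escrowed-key chip.
phones = 250_000_000
computers = 50_000_000
chip_cost = 25  # dollars per replacement chip

total_cost = (phones + computers) * chip_cost
assert total_cost == 7_500_000_000  # $7.5 billion, i.e. "over $7 billion"
```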

A software solution, on the other hand, could be changed very quickly
in the event of a compromise. Many companies that manufacture virus
software include provisions for delivering updates whenever a new
virus is discovered. The solution often travels substantially faster
than the virus itself because people are able to download the
anti-virus from bulletin boards.

The military and the intelligence community routinely change their
cipher systems because they know that mistakes can be made and leaks
can emerge in even the best system. The economic health of the country
is resting, in some part, on the success of large, broadly implemented
encryption systems. Many foreign companies pay princely sums for
American technology. They routinely pay sums that are 10 times larger
than the largest offered by the old Soviet Union. Can we be certain
that two escrow agencies are going to be any more secure than the
atomic scientists or the intelligence community?

The NIST system is too expensive and too unwieldy for general use.
NIST would be better advised to develop a standard implemented in
software that could be made available to all at no cost. It could be
essentially free and much less prone to dangerous interruptions of
services in case the system was compromised.

From upsetter@mcl.mcl.ucsb.edu Tue May 25 20:25:09 1993
Return-Path: <upsetter@mcl.mcl.ucsb.edu>
Received: from hub.ucsb.edu by csrc.ncsl.nist.gov (4.1/NIST)
     id AA01716; Tue, 25 May 93 20:24:55 EDT
Posted-Date: Tue, 25 May 93 17:25:01 PDT
Received-Date: Tue, 25 May 93 20:24:55 EDT
Received: from mcl.mcl.ucsb.edu by hub.ucsb.edu; id AA26554
     sendmail 4.1/UCSB-2.0-sun
     Tue, 25 May 93 17:25:24 PDT for crypto@csrc.ncsl.nist.gov
Message-Id: <9305260025.AA26554@hub.ucsb.edu>
Received: by mcl.mcl.ucsb.edu
     ( id AA25644; Tue, 25 May 93 17:25:02
>From: Jason Hillyard <upsetter@mcl.mcl.ucsb.edu>
Subject: Cryptographic Issue Statement
To: crypto@csrc.ncsl.nist.gov
Date: Tue, 25 May 93 17:25:01 PDT
Mailer: Elm [revision: 70.85]

This submission is for the NIST's Computer System Security and Privacy
Advisory Board hearing on the Clipper Chip.  It is my understanding that
submissions must be received by May 27 and that hearing will be held on
June 2-4.

I can be contacted at this Internet address or via the information given
at the end of this submission.

Jason Hillyard


     On June 5, 1991, Philip Zimmerman released a computer program called
PGP to the world.  PGP, which stands for "Pretty Good Privacy", is an
encryption program, a bunch of bits and bytes which Zimmerman himself calls
"guerrilla software".  It was written in response to what he perceived as a
threat to our privacy -- the proposed Digital Telephony legislation pushed
by the FBI and the Department of Justice.  This software engineer decided
to take direct action.  He wrote a high quality encryption program and gave
it away for free.  Today there are versions of PGP available for all kinds
of computers, from Macs to VAX, and programmers all over the world are
working on future versions.

     Zimmerman's actions can be seen as a strong affirmation that
cryptography has gone public.  What was once the exclusive domain of the
NSA and military signal intelligence experts has become a thriving field of
academic inquiry, and it has been for twenty years.  Now encryption is
starting to hit the street.  Our personal computers are perfectly capable
of providing us with the type of communications security once reserved for
the military and intelligence communities.  The digital telecommunications
networks we will become personally acquainted with in the near future will
also provide more opportunities for the public use of encryption.

     Recently, however, there has been a growing public debate about how 
strong encryption technology should be and who should be able to use it.  
One major player in this debate is the federal government.  Different gears 
in the federal machine are squeaking for different reasons.  The executive 
branch wants to build its "information infrastructure".  The FBI wants to 
keep its ability to easily eavesdrop on telephone conversations.  The NSA 
must preserve its position as supreme code maker and code breaker.  In the 
past few years a new brand of civil libertarian has also vigorously joined 
the debate.  Public-interest groups such as the EFF (Electronic Frontier 
Foundation) and CPSR (Computer Professionals for Social Responsibility) 
seek to ensure our privacy and civil liberties are not compromised by new 
technologies.  They are challenging the government's attempt to influence 
the public use of encryption.

     I would also like to introduce a third player in the debate-- the 
"technicians".  These are the computer scientists and engineers who 
develop, design, and implement encryption systems.  As the ones who will 
actually be building the encryption and telecommunication systems of the 
future, we have a unique position to take a leading role in the debate.  
Rather than blindly accept government standards and regulations, we should 
examine the issues and decide for ourselves how encryption technology 
should be used.


     The fundamental question boils down to this:  How much access should 
the government have to our personal communications?  This presents a 
trade-off between the obligations of the government to protect national 
security and the rights of the citizens to privacy and free speech.  
Proponents of government control insist restrictions on encryption 
technology are necessary to conduct lawful investigations of terrorists, 
drug dealers, and gangsters.  Opponents cry out that any restrictions 
intrude on our right to privacy and right to free speech.

     These arguments are currently being made in the debates on encryption 
technology and the Digital Telephony proposal.  I tend to side with the 
freedom of speech argument-- but with a twist.  The real issue at stake is 
communication.  Simply put, we should have the freedom to communicate, in 
any way we wish by whatever medium we wish.  If that means communicating so 
nobody else can understand us, so be it.  This is not about restricting 
freedom of speech.  As the proponents of government control point out, 
there are restrictions on our freedom of speech.  People cannot make 
slanderous or libelous remarks.  There are laws against "obscenity".  But 
restrictions on freedom of speech deal with speech which can be 
understood-- the restrictions are based on content.  What about speech 
which nobody, except the parties who are speaking, can understand?  How in 
the world could that speech be restricted for its content?

     It can't.  Restrictions on encrypted speech would prevent speech 
simply because it had the potential to be obscene, the potential to be 
libelous, the potential to be a threat to national security.  The idea of 
the government restricting speech simply because it has the potential to be 
dangerous is a drastic expansion of government power.  Restrictions on 
encryption technology, whether by export control or government-influenced 
standards, essentially result in restrictions on encrypted speech.


     Many people won't agree with me-- but that's fine.  As technicians we 
should examine the issues and decide for ourselves how encryption 
technology should be used.  Upon making that decision, we can design 
systems to deal with the issues and satisfy the needs of the public.  If 
one engineer wants to design an escrowed key system, that's fine.  If 
another wants to design a highly secure system, that's fine.

     However, the federal government is ready to decide for us what kind 
of communication systems we must design.  That is why we must take a stand 
and demand what I call a "level playing field" when it comes to 
communication technology.  The technology we design should be built to meet 
the specifications of those who use it.  The purpose of the technology 
should not be manipulated for the political benefits of a few, as the 
Digital Telephony proposal would do.  Communication networks should be 
designed to facilitate communications between interested parties.  They 
should not be designed to facilitate communications between interested 
parties and provide the cops lawful access to those communications.  
Encryption systems should be designed to provide the best security possible 
for a given application.  They should not be designed to provide the best 
security possible, but no security when law enforcement has a warrant to tap 
the line.  The law enforcement agencies have no place in demanding special 
consideration when it comes to developing or providing communications 
technology for the public.

     The government should also realize that changes in technology will 
change the way law enforcement does its job.  That's the way the game will 
be played on the level playing field.  Our access to technology is based on 
how much time, money, and skill we have available.  The FBI should and does 
use the technology it feels necessary to do its job better.  And hey, the 
drug dealers also use technology:  fast cars, cellular telephones, beepers.  
But should we not develop certain benign technologies simply because the 
bad guys will use them?  That's a decision the engineers should make, not 
the government.


     Given this, industry should take the initiative to design and develop 
authentication and encryption products to meet public demand.  They could 
start by developing some international standards.  Interestingly, the 
government always seems to be there when encryption standards are 
developed.  This is not true for other telecommunications standards.  What 
normally happens is that a standards organization, such as the 
International Telecommunication Union or the International Organization 
for Standardization, gets together and decides on the specifications for a 
proposed standard.  Then various companies go to work on their various 
solutions and propose them to a committee.  After a debate, the committee 
decides on a standard.  The government never plays a part.

     But for some reason, the NIST and the NSA feel they have been given 
the authority to develop encryption standards.  They were involved in the 
design of the Data Encryption Standard and the Digital Signature Standard.  
Now the NSA has helped design the Clipper Chip.  This leads to possible conflicts 
of interest, since the NSA is tasked with making codes for public use as 
well as breaking codes.  But the government involvement is totally 
unnecessary.  Sure, the government should make its own standards for 
government communications.  But it's about time for industry to develop 
its own authentication and encryption standards and implement these 
standards, without any meddling from the government.

     Even if the export restrictions persist, international industry 
standards would encourage international development.  If U.S. companies 
can't provide secure products for Americans, we could get compatible 
products from other countries.  Or better yet, multinationals like Motorola 
or AT&T could develop standard encryption devices overseas, for overseas 
markets as well as domestic markets.


     Unfortunately, I believe it would be very difficult for the 
technicians to accomplish this in the present political climate.  One 
engineering professor I spoke with suggested it would be even more 
difficult to create an international encryption standard, since foreign 
governments would have similar motivations to repress encryption 
technology.  However, as engineers and computer scientists, we should 
exercise our professional authority on the technical issues and get 
involved in the policy debate.  It's about time cryptography was treated as 
a science and not a secret.  It's about time the use of cryptography was 
treated as a telecommunications issue, not a national security issue.  As 
technicians, we will be the ones building the communication systems, and we 
have the final say if we wish to take a stand.

Jason Hillyard      5/25/93
P.O. Box 14685
Santa Barbara, CA 93107

From floydf@iphase.com Thu May 27 00:21:41 1993
Return-Path: <floydf@iphase.com>
Received: from iphase.com by csrc.ncsl.nist.gov (4.1/NIST)
     id AA10073; Thu, 27 May 93 00:21:26 EDT
Posted-Date: Wed, 26 May 93 23:21:10 CDT
Received-Date: Thu, 27 May 93 00:21:26 EDT
Received: from wildcat.iphase.com by iphase.com (4.1/1.34)
     id AA18150; Wed, 26 May 93 23:21:11 CDT
Received: by wildcat.iphase.com (4.1/SMI-4.1)
     id AA02679; Wed, 26 May 93 23:21:10 CDT
Date: Wed, 26 May 93 23:21:10 CDT
>From: floydf@iphase.com (Floyd Ferguson)
Message-Id: <9305270421.AA02679@wildcat.iphase.com>
To: crypto@csrc.ncsl.nist.gov

Computer System Security and Privacy Advisory Board
Technology Building
National Institute of Standards and Technology
Gaithersburg, MD

26 May 1993


The April 16th announcement of the President's Clipper and data
encryption initiative was followed immediately by an electronic storm
of discussion, much focused on the secrecy of the Skipjack algorithm,
the lack of details about the escrow mechanism, seasoned with the
usual blend of wild speculations and paranoiac guesses.  Buried under
all this noise, the silent, classified Presidential Directive
initiating a comprehensive inquiry into related public policy issues
lay largely unmentioned, apparently unnoticed.

As distance-related cost factors of carried telecom traffic drop, 
network-related products seem destined to take the same price plunge 
seen in the recent past with both microprocessors and mass storage.  The 
results? As network technology becomes a commodity item, used by
millions, for thousands of different purposes, more and more of us
will move more of our personal and business connections from the
familiar physical world of smell and sight to the new digital realms
of electronic mail, remote logins, video conferencing, and a host of
services and products not yet conceived.

From the standpoint of public and social policy, Clipper (the silicon
plus the secret policy development) suffers one fundamental defect: by
shrouding policy issues regarding privacy protection, encryption, and
law enforcement in secrecy, through use of a secret Presidential
Directive, and by failing to disclose the particulars of the proposed
key escrow mechanisms beyond vague affirmations placing everything in
the Attorney General's hands, this initiative substantially diminishes
the openness, the vitality and the good will required to develop a
robust, valuable and productive public digital network that would
rank with our voice telephone network, our public transport system,
and our numerous public utilities as the best in the world.

Today's digital pathways truly form a frontier; services are primitive,
reliability and availability remain poor, and only a few elite corners
of society benefit from the power provided by the new digital
networks.  If (not when) tomorrow's digital highways extend these
benefits to many, maybe most, these new citizens of the digital realms
will find their digital identity, their personal network "face", as it
were, tied to their key, which allows them to communicate with others
securely and privately, and to establish those firm personal
identities necessary for productive social and commercial interaction.

Under the Clipper initiative, the sole responsibility is placed with
the Attorney General to make arrangements to hold these keys, and to
determine the legal procedures by which those keys can be obtained by
governmental agencies.  The preservation and integrity of my digital
identity appear to be a secret!  Who keeps the keys, why, how, when,
where?  These questions remain not only unanswered, but unasked.

As a user of a digital service, I may not be happy about using a
secret encryption device delivered from the government to the public
on a silicon platter.  I may be unhappy paying more than I would need
to were the process to be open to the optimizations available in a
competitive environment.  But, I absolutely will not ever place the
integrity of my digital identity solely, unconditionally, and
irrevocably in the hands of a single, centralized agent, particularly
if that agent happens to bear the wealth, the power, and the weight of
the State.

Legitimate arguments can be advanced for keeping the technology of
encryption a secret: none can be made for keeping secret the
mechanisms of key registry.

This is not a technological issue: I "own" my personal identity more
truly than I own any merely material thing, and I can also assess
and manage the personal risks associated with my use of publicly provided
services.  I, and a hundred million other Americans, users of cars and
highways, owners of homes, shops and small businesses, daily manage
personal risk, selecting insurance providers, making payments always
and claims occasionally, while understanding little or none of the
technical details of the provided service.  But, we have choices, and
our ability to freely choose allows us to freely use valuable shared
resources, like our physical transportation network, to both
contribute to our personal and family well-being, as well as the
shared good of society, while managing personal risks incurred.  We
need that same freedom of choice in the emerging digital realm.

Society has as much legitimate interest in the regulation of
crypto-technology in the new digital networks as it does in regulating
the insurance industry, the use and operation of the telephone system,
and the public highways.  Social structures and policy bodies have
evolved to address those needs.  Certainly the tasks of regulating
publicly available encryption for use with a national digital
infrastructure will require different forms and pose different
challenges than the regulation of other products available to
individual users.  But, this task is not insurmountable, nor is it
primarily technological. It should not be initiated solely (and
secretly) at the hands of the Attorney General, or of any other agency
of the Executive branch.  It should involve, and does require, the
participation and involvement of the citizens affected, and both
deserves and requires the public attention and debate possible through
our duly elected public legislators.

This is not a technological issue: we share words to communicate, not
only with each other but with all who speak our language, those
present, and those past who formed our words and minds and voices.
Words are not private, but by their exchange become part of our common
human inheritance.  Secrets shared are no longer secrets, but secrets
encrypted build a wall between those with the key and those without.
Sometimes these walls can protect us, allowing us to traverse digital
ways safely, engaging in human discourse and activities not possible
were each word visible and observed.  But, these walls can also
conceal, allowing some to prey on others with diminished fear of
detection and reprisal.  Highways and roads, too, are subject to the
same ambivalence; they can be used or abused, but, in order to be
useful, they must be regulated.  No one would want to drive to work on
a freeway system without rules; likely no one could.  Cryptography
provides a powerful tool to tame today's wild electronic frontier.

By shrouding the social and policy issues in secrecy the Clipper
initiative moves further from this goal, and obstructs many possible
paths of progress.  As the rest of the world moves away from
centralized economic planning and modes of government that reduce
personal freedom and the resulting healthy diversity, it is critical
that the legitimate interests, needs, and capacities of the public
sector be accommodated in this debate.  The Clipper initiative must be
opened to public debate.  Key escrow is fundamental to personal
identity in the new digital world: the public must participate in the
discussion, the debate, and ultimately, through their duly elected
legislators, in the formulation of effective, equitable policy and
law, prior to the implementation of such policy by executive agency.

Floyd Ferguson
Concerned citizen

From sgs@grebyn.com Thu May 27 00:35:45 1993
Return-Path: <sgs@grebyn.com>
Received: from grebyn.com (leviticus.grebyn.com) by csrc.ncsl.nist.gov
     id AA10095; Thu, 27 May 93 00:35:33 EDT
Posted-Date: Thu, 27 May 1993 00:35:27 -0400 (EDT)
Received-Date: Thu, 27 May 93 00:35:33 EDT
Received: by grebyn.com (4.1/SMI-4.1/ccg.7.2.91)
     id AA11018; Thu, 27 May 93 00:35:28 EDT
>From: sgs@grebyn.com (Stephen G. Smith)
Message-Id: <9305270435.AA11018@grebyn.com>
Subject: Clipper Comments
To: crypto@csrc.ncsl.nist.gov
Date: Thu, 27 May 1993 00:35:27 -0400 (EDT)
X-Mailer: ELM [version 2.4 PL21]
Mime-Version: 1.0
Content-Type: text/plain; charset=US-ASCII
Content-Transfer-Encoding: 7bit
Content-Length: 8153      

Cryptographic Issue Statements
Computer System Security and Privacy Advisory Board
Technology Building, Room B-154
National Institute of Standards and Technology
Gaithersburg, MD, 20899

In answer to your request for comments on the issues raised by the 
"Clipper chip":

I see a number of very disturbing aspects of the "Clipper" announcement.  
Others will probably comment on the Constitutional problems in the 
assumption that the Government has the right to tap telephones at will, 
and the legal and ethical problems of a secret sole-source contract award 
for a system that is potentially extremely lucrative.  I will limit my 
comments to a couple of technical issues, with a postscript on some wider 
concerns.

TECHNICAL ISSUES:  Is the system secure?

The first and biggest problem with the Clipper technology is that it is 
classified.  There is simply no way to verify that the chips do what is 
claimed for them, or even if they do anything at all.  In the absence of 
any real review of the algorithms, systems, or chip designs, we naturally 
tend to assume the worst.

I am not qualified to comment on the cryptographic algorithms used in the 
"Clipper" and "Capstone" chips, even if they weren't classified. However, 
it is a truism that a secure *algorithm* does not ensure a secure 
*system*.  The problems that I see with the system are:

A.  The Man in the Middle.

Take two Clipper chips.  Connect them back-to-back, so that the "clear" 
output of one feeds into the "clear" input of the other.  Add some signal 
processing "glue" to pass dialing and routing information around the 
Clippers.  Take the resulting assembly and insert it into a phone line, 
either by actually cutting the wires or by fooling around with the 
switch.  We now have the following arrangement:
     +---+         +---+         +---+
     | A | ------- | M | ------- | B |
     +---+         +---+         +---+

At the start of a call, A dials B's number.  M routes this information 
directly to B.  A then initiates the key exchange supposedly with B, but 
actually with M.  M, sensing that this is an encrypted call, begins 
negotiating a key exchange with B.  At this time, A and B assume that 
they are negotiating with each other, when they are actually both 
negotiating with M.

When the key exchange is complete, A sends encrypted data to B.  M 
decrypts the data, reads it, re-encrypts it, and sends it to B.

This method is completely general and can be used against any "zero 
knowledge" system.  M needs no knowledge of the cryptosystems in use, and 
only needs to tell the difference between normal routing signals and 
encrypted data.

Supposedly, the Bell Atlantic systems that are the first to use the 
Clipper have a display that shows the session key that is currently in 
use.  I rather doubt this, as you don't normally want your session key 
where anybody can see it.  In use, supposedly, A reads off the key to B. 
If the key doesn't match, then there is a "man in the middle."  In 
reality, I doubt that users would take the trouble to read off a long (20 
digits?) number every time they make a call.
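
That verbal check can be sketched in a few lines of code.  This is a toy
illustration, not the actual display logic of any real product: the
hash-based derivation and the 20-digit readout are assumptions.

```python
import hashlib

def key_readout(session_key: bytes, digits: int = 20) -> str:
    """Derive a short decimal string from a session key, for reading aloud."""
    h = hashlib.sha256(session_key).hexdigest()
    return str(int(h, 16))[:digits]

# Normal call: both ends hold the same session key, so the readouts match.
key_ab = b"session key negotiated A<->B"
assert key_readout(key_ab) == key_readout(key_ab)

# Man in the middle: A shares one key with M, and B shares a *different*
# key with M, so the spoken readouts disagree and the tap is exposed.
key_am = b"session key negotiated A<->M"
key_mb = b"session key negotiated M<->B"
assert key_readout(key_am) != key_readout(key_mb)
```

The weakness, as noted above, is human: the check only works if callers
actually bother to compare the digits on every call.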

In any case, if Clipper or something similar comes into widespread use, I 
foresee the "back-to-back" chips taking over the niche currently occupied 
by the "two alligator clips and a headset" tap that works on current 
non-digital telephones -- simple, cheap, effective, and illegal.

How can we keep the "man in the middle" out of things?  With only two 
stations and no prior arrangement between A and B, we can't.  There must 
be some prior arrangement between A and B.  This can be provided 
automatically by a third party.  See Internet RFC 1421 for an example of 
a system that uses this approach.
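
The third-party arrangement can be sketched as follows.  The HMAC here is
only a stand-in for the digital signature a real certifying authority (as
in RFC 1421) would compute; in a real system B would verify the certificate
with the authority's *public* key rather than a shared secret, and all
names and keys below are made up for illustration.

```python
import hashlib
import hmac

# Stand-in for a certifying authority's signing key (illustrative only).
CA_SECRET = b"authority signing key"

def certify(name: str, public_key: bytes) -> bytes:
    """Authority binds a name to a public key (a toy 'certificate')."""
    return hmac.new(CA_SECRET, name.encode() + b"|" + public_key,
                    hashlib.sha256).digest()

def verify(name: str, public_key: bytes, cert: bytes) -> bool:
    """Check that the certificate matches the (name, key) pair."""
    return hmac.compare_digest(cert, certify(name, public_key))

# A registers her key with the authority once, out of band.
a_key = b"A's genuine public key"
a_cert = certify("A", a_key)

# Later, B receives a key claimed to be A's and checks it before use.
assert verify("A", a_key, a_cert)           # genuine key accepted
assert not verify("A", b"M's key", a_cert)  # substituted key rejected
```

Since M cannot produce a valid certificate binding A's name to M's key,
the substitution in the diagram above fails.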

B.  Back Doors

Doubts about a back door could be cleared up by releasing the algorithms 
and chip design for public review.

The proposed system of key escrow makes decoding of encrypted 
conversations a very tedious process.  I find it very difficult to 
believe that law enforcement agencies would be willing to put up with it.  
This leads to speculations about a "back door" that would allow those who 
know it to decrypt any message they wanted, without touching the escrowed 
keys.

The simplest way of compromising the Clipper would be to subvert the key 
exchange.  The "man in the middle" (above) would take an active part in 
the negotiations to determine the secret session key.  This would make 
the "man in the middle" undetectable even in the unlikely event that the 
two users read the "keys" off to each other.

An example of a protocol that this would work on might be: A sends B the 
public part of a public key cipher.  B uses this key to encrypt a 
randomly chosen secret session key.  B sends the encrypted key to A, A 
deciphers the secret key, and they both use it to encrypt further 
communications.

With the man in the middle, M intercepts A's key and passes M's own key 
to B.  B uses M's key to encrypt a secret key.  M decrypts the key, 
re-encrypts it with A's public key, and sends it on to A.  All three are 
now using the same key.
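
The interception just described can be simulated end to end with textbook
RSA on toy primes.  Everything here is illustrative and wholly insecure;
the point is only that the protocol completes normally for A and B while
M learns the key.

```python
import random

def make_keypair(p: int, q: int, e: int = 17):
    """Textbook RSA from two small primes -- toy sized, insecure."""
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)              # private exponent
    return (n, e), (n, d)            # (public key, private key)

def encrypt(pub, m):
    n, e = pub
    return pow(m, e, n)

def decrypt(prv, c):
    n, d = prv
    return pow(c, d, n)

a_pub, a_prv = make_keypair(61, 53)  # A's genuine keypair
m_pub, m_prv = make_keypair(89, 97)  # the keypair M substitutes for A's

session_key = random.randrange(2, 1000)  # B's randomly chosen secret

# B believes m_pub is A's public key and encrypts the session key with it.
c_to_m = encrypt(m_pub, session_key)

# M decrypts with its own private key, learns the session key, then
# re-encrypts it under A's genuine public key and forwards it.
learned = decrypt(m_prv, c_to_m)
c_to_a = encrypt(a_pub, learned)

# A decrypts normally.  All three parties now hold the same key.
assert decrypt(a_prv, c_to_a) == session_key == learned
```

Neither A nor B sees anything out of the ordinary, which is exactly why
the exchange must be authenticated by some prior arrangement.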

Can this happen in Clipper?  We can't know as long as the algorithm and 
the chip design are classified.  It would likely be hidden in circuitry 
purported to provide "conference calling."

Are there other ways of doing it?  We can't know.


The Clipper appears to be an attempt by the Government to ensure that it 
will always be able to tap telephones and other forms of electronic 
communications at will.  We remember the abuses of J. Edgar Hoover, 
Richard Nixon, John Mitchell, and Ed Meese.  Are they the worst that the 
United States will ever have?  We would be foolish to assume so.

The implication that I have seen is that the Government intends to 
require that anyone doing "sensitive" business with the Government 
use Clipper, and no other form of communications data 
security.  This will presumably generate a large enough installed base of 
"secure" telephones that no competing "commercial-only" standard can 
survive.  (It will also generate windfall profits for the companies 
making the equipment.  That, I am sure, is somebody else's argument.)

If the Government were interested only in official Government business,
it could simply require a centralized key distribution facility, as is
currently done with classified communications.  This would be much
easier than the rigmarole with "key escrow," but it would probably not
be acceptable for business use.

For me, the final nail in the Clipper's technical coffin is that it is 
not being cleared for use with classified data.  If it's that good, why 
won't the Government use it?

The Clipper concept is flawed technically, but it is probably the only 
way that the Government can even attempt to maintain "universal 
tappability" of all telephones.  Systems that are more secure lose the 
"universal tappability."  Even with Clipper, someone who didn't want to 
be tapped could simply pre-encrypt the data before feeding it into the 
Clipper chip.  The fact that the data is encrypted could not even be 
*detected* without a court order.
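
The pre-encryption point can be demonstrated with two layered stream
ciphers.  The XOR keystream below stands in for both the user's own cipher
and the Clipper layer; it is a toy, but the layering argument holds for
any ciphers.

```python
import hashlib

def keystream(key: bytes):
    """Toy keystream from chained hashing -- a stand-in for any cipher."""
    block = key
    while True:
        block = hashlib.sha256(block).digest()
        yield from block

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR with the keystream; applying it twice restores the input."""
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

private_key = b"key known only to the two parties"
escrowed_key = b"Clipper session key held in escrow"

plaintext = b"meet at the usual place"
inner = xor_cipher(private_key, plaintext)   # user's own pre-encryption
outer = xor_cipher(escrowed_key, inner)      # Clipper layer on top

# A wiretap armed with the escrowed key strips only the outer layer
# and recovers ciphertext, not the conversation.
tapped = xor_cipher(escrowed_key, outer)
assert tapped == inner and tapped != plaintext

# Only the private key, which is never escrowed, yields the plaintext.
assert xor_cipher(private_key, tapped) == plaintext
```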

Let it go.  The advantages of solidly secure communications far outweigh 
the rather dubious advantages of phone taps.

At the beginning of the twentieth century, a criminal could only get far 
away from the scene of the crime by taking a train.  Trains and train 
stations are rare and easily watched by law enforcement officers.  The 
advent of the automobile meant that a criminal could commit a crime and 
be far away by the time the crime was discovered.  Would we have a better 
society today if the growth of the automobile had been "managed" to the 
benefit of law enforcement?  Somehow, I doubt it.

The current computer revolution is at least as much of a change as the 
introduction of the automobile.  Attempts to "manage" it for the 
temporary advantage of a narrow group of people are simply doomed. 

Steve Smith                     Agincourt Computing
sgs@grebyn.com                  (301) 681 7395
"Truth is stranger than fiction because fiction has to make sense."

From djw@eff.org Thu May 27 12:54:41 1993
Return-Path: <djw@eff.org>
Received: from eff.org by csrc.ncsl.nist.gov (4.1/NIST)
     id AA10654; Thu, 27 May 93 12:54:33 EDT
Posted-Date: Thu, 27 May 1993 12:59:53 -0500
Received-Date: Thu, 27 May 93 12:54:33 EDT
Received: from [] (jackson.eff.org) by eff.org with SMTP id
  (5.65c/IDA-1.5/ident for <crypto@csrc.ncsl.nist.gov>); Thu, 27 May 1993
12:56:07 -0400
Message-Id: <199305271656.AA01012@eff.org>
Date: Thu, 27 May 1993 12:59:53 -0500
To: crypto@csrc.ncsl.nist.gov
>From: djw@eff.org (Daniel J. Weitzner)
Subject: Comments of the Electronic Frontier Foundation

May 27, 1993

Before the 
Computer System Security and Privacy Advisory Board
Technology Building, Room B-154
National Institute of Standards and Technology
Gaithersburg, MD  20899



Key Escrow Chip Cryptographic Technology and Government Cryptographic
Policies and Regulations

        The Electronic Frontier Foundation (EFF) commends the Computer
System Security and Privacy Advisory Board for offering the public the
opportunity to comment on developments in cryptography and communications
privacy policy.  Recent Administration proposals, including use of the
Clipper Chip and establishment of a government-controlled key escrow
system, raise questions that cut to the core of privacy protection in the
age of digital communication technology.  The questions noted by the
Advisory Board in its Notice of Open Meeting (58 FR 28855) reflect a broad
range of concerns, from civil liberties to global competitiveness.  The
Digital Privacy and Security Working Group -- a cooperative effort of civil
liberties organizations and corporate users and developers of communication
technology which is chaired by the EFF -- has also submitted over one
hundred questions to the Administration.  (These questions are being
submitted to the Advisory Board under separate cover on behalf of the
Working Group.)  That there are so many questions demonstrates the need for
a comprehensive review of cryptography and privacy policy.  

        We are encouraged that the Administration has expressed a
willingness to undertake such a review.  However, it has become clear that
plans for rapid introduction of the Clipper Chip could unacceptably distort
this important policy review.  The Administration has made no secret of
the fact that they hope to use government purchasing power to promote
Clipper as a de facto standard for encryption.  With Clipper on the market,
the policy process will be biased toward a long-term solution such as
Clipper with key escrow.  Moreover, the rush to introduce Clipper is
already forcing a hasty policy review which may fail to provide adequate
public dialogue on the fundamental privacy questions which must be resolved
to reach a satisfactory cryptography policy.  Based on the depth and
complexity of questions raised by this review, EFF believes that no
solution, whether the Clipper Chip or otherwise, should be adopted by the
government until the comprehensive cryptography review initiated by the
Administration is complete.

        EFF is a nonprofit, public interest organization whose public
policy mission is to ensure that the new electronic highways emerging from
the convergence of telephone, cable, broadcast, and other communications
technologies enhance free speech and privacy rights, and are open and
accessible to all segments of society.  

        In these comments, we will elaborate on questions 1, 2, and 3
listed in the Advisory Board's Notice.  We offer these comments primarily
to raise additional questions that must be answered during the course of
the Administration's policy review.


        Unraveling the current encryption policy tangle must begin with one
threshold question: will there come a day when the federal government
controls the domestic use of encryption through mandated key escrow schemes
or outright prohibitions against the use of particular encryption
technologies?  Is Clipper the first step in this direction?  A mandatory
encryption regime raises profound constitutional questions, some of which
we will discuss below.  So far, the Administration has not declared that
use of Clipper will be mandatory, but several factors point in that 
direction:

1.  Secrecy of the algorithm justified by need to ensure key escrow

        Many parties have already questioned the need for a secret
algorithm, especially given the existence of robust, public-domain
encryption techniques.  The most common explanation given for use of a
secret algorithm is the need to prevent users from by-passing the key
escrow system proposed along with the Clipper Chip.  If the system is truly
voluntary, then why go to such lengths to ensure compliance with the escrow 
system?

2.  How does a voluntary system solve law enforcement's problems?

        The major stated rationale for government intervention in the
domestic encryption arena is to ensure that law enforcement has access to
criminal communications, even if they are encrypted.  Yet, a voluntary
scheme seems inadequate to meet this goal.  Criminals who seek to avoid
interception and decryption of their communications would simply use
another system, free from escrow provisions.  Unless a government-proposed
encryption scheme is mandatory, it would fail to achieve its primary law
enforcement purpose.  In a voluntary regime, only the law-abiding would use
the escrow system.  


        Even if government-proposed encryption standards remain voluntary,
the use of key escrow systems still raises serious concerns:

1. Is it wise to rely on government agencies, or government-selected
private institutions to protect the communications privacy of all who would
someday use a system such as Clipper?

2.  Will the public ever trust a secret algorithm with an escrow system
enough to make such a standard widely used?


        Beyond the present voluntary system is the possibility that
specific government controls on domestic encryption could be enacted.  Any
attempt to mandate a particular cryptographic standard for private
communications, a requirement that an escrow system be used, or a
prohibition against the use of specific encryption algorithms, would raise
fundamental constitutional questions.  In order to appreciate the
importance of the concerns raised, we must recognize that we are entering
an era in which most of society will rely on encryption to protect the
privacy of their electronic communications.  The following questions arise:

1.  Does a key escrow system force a mass waiver of all users' Fifth
Amendment right against self-incrimination?

        The Fifth Amendment protects individuals facing criminal charges
from having to reveal information which might incriminate them at trial. 
So far, no court has determined whether or not the Fifth Amendment allows a
defendant to refuse to disclose his or her cryptographic key.  As society
and technology have changed, courts and legislatures have gradually adapted
fundamental constitutional rights to new circumstances.  The age of digital
communications brings many such challenges to be resolved.  Such decisions
require careful, deliberate action.  But the existence of a key escrow
system would, in a single step, have the effect of waiving this right for
every person who used the system.  We believe that this question certainly
deserves more discussion.  

2.  Does a mandatory key escrow system violate the Fourth Amendment
prohibition against "unreasonable search and seizure"?

        In an era when people work for "virtual corporations" and conduct
personal and political lives in cyberspace, the distinction between
communication of information and storage of information is increasingly
vague.  The organization in which one works or lives may constitute a
single virtual space, but be physically dispersed.  So, the papers and
files of the organization or individual may be moved within the
organization by means of telecommunications technology.  Until now, the law
of search and seizure has made a sharp distinction between, on the one
hand, seizures of papers and other items in a person's physical possession,
and on the other hand, wiretapping of communications.  Seizure of papers or
personal effects must be conducted with the owner's knowledge, upon
presentation of a search warrant.  Only in the exceptional case of
wiretapping, may a person's privacy be invaded by law enforcement without
simultaneously informing the target.  Instantaneous access to encryption
keys, without prior notice to the communicating parties, may well
constitute a secret search, if the target is a virtual organization or an
individual whose "papers" are physically dispersed.  Under the Fourth
Amendment, secret searches are unconstitutional.

3.  Does a prohibition against the use of certain cryptographic techniques
infringe individuals' right to free speech?

        Any government restriction on or control of speech is to be
regarded with the utmost scrutiny.  Prohibiting the use of a particular
form of cryptography for the express purpose of making communication
intelligible to law enforcement is akin to prohibiting anyone from speaking
a language not understood by law enforcement.  Some may argue that
cryptography limitations are controls on the "time, place and manner" of
speech, and therefore subject to a more lenient legal standard.  However,
time, place and manner restrictions that have been upheld by courts include
laws which limit the volume of loudspeakers so that they do not interfere
with surrounding activities, or those which confine demonstrators to certain
physical areas.  No court has ever upheld an outright ban on the use of a particular
language.  Moreover, even a time, place and manner restriction must be
shown to be the "least restrictive means" of accomplishing the government's
goal. It is precisely this question -- the availability of alternatives
which could solve law enforcement's actual problems -- that must be
explored before a solution such as Clipper is promoted.


        As this Advisory Board is well aware, the Computer Security Act of
1987 clearly established that neither military nor law enforcement agencies
are the proper protectors of personal privacy.  When considering the law,
Congress asked, "whether it is proper for a super-secret agency [the NSA]
that operates without public scrutiny to involve itself in domestic
activities...?"  The answer was a clear "no."  Recent Administration
announcements regarding the Clipper Chip suggest that the principle
established in the 1987 Act has been circumvented.  For example, this
Advisory Board was not consulted until after public outcry over the
Clipper announcements.  Not only did the initial failure to consult eschew
the guidance of the 1987 Act, it also ignored the fact that this Advisory
Board was already in the process of conducting a cryptography review.

        As important as the principle of civilian control was in 1987, it
is even more critical today.  The more individuals around the country come
to depend on secure communications to protect their privacy, the more
important it is to conduct privacy and security policy dialogues in public,
civilian forums.


The EFF thanks the Advisory Board for the opportunity to comment on these
critical public policy issues.  In light of the wide range of difficult
issues raised in this inquiry, we encourage the Advisory Board to call on
the Administration to delay the introduction of Clipper-based products
until a thorough, public dialogue on encryption and privacy policy has been
completed.

Respectfully Submitted,

Electronic Frontier Foundation
+1 202-544-9237

Jerry Berman
Executive Director

Daniel J. Weitzner 
Senior Staff Counsel

From djw@eff.org Thu May 27 12:55:14 1993
Date: Thu, 27 May 1993 13:00:07 -0500
To: crypto@csrc.ncsl.nist.gov
From: djw@eff.org (Daniel J. Weitzner)
Subject: Digital Privacy and Security Working Group Comments

The Digital Privacy and Security Working Group, whose members are listed
below, submitted the following questions to the Clinton Administration
regarding Clipper and Cryptography Policy.  The Working Group hereby
submits this set of questions for the consideration of the Computer System
Security and Privacy Advisory Board.

Members of the Digital Privacy and Security Working Group:

abcd, The Microcomputer Industry Association
Advanced Network & Services, Inc.
American Civil Liberties Union
Apple Computer, Inc.
Business Software Alliance
Cavanagh Associates, Inc.
Cellular Telephone Industry Association
Computer Professionals for Social Responsibility
Computer & Business Equipment Manufacturers Association
Computer & Communications Industry Association
Crest Industries, Inc.
Digital Equipment Corporation
Electronic Frontier Foundation
Electronic Mail Association
Hewlett-Packard Company
Information Technology Association of America
Information Industry Association
International Communication Association
Iris Associates
Lotus Development Corporation
McCaw Cellular Communications
Microsoft Corporation
National Association of Manufacturers
RSA Data Security, Inc.
Software Publishers Association
Sun Microsystems, Inc.
Telecommunications Industry Association
Toolmaker, Inc.
Trusted Information Systems
United States Telephone Association

Work Group Questions:


A. Process by Which the Proposal Was Developed

1.      Why was the encryption scheme developed in such secrecy?
Were any members of the computer, communications, or security industries
consulted? Were any privacy experts consulted? Has the Justice Department
or the White House Office of Legal Counsel considered the constitutional
implications?
2.      The Administration's announcement implies that a policy review on
encryption has been commenced; but at the same time, it appears that a
decision has already been reached to support the Clipper proposal or some
other key-escrow scheme.  Is any review of the Clipper chip itself now
underway?  What progress has been made?  When will this expedited review be
completed?

3.      What role has the National Security Agency played in the
development and selection of the Clipper Chip and key escrow system?  What
will NSA's role be in the deployment and evaluation of the system?  Are
these roles consistent with the principle of civilian control of computer
security, as required by the Computer Security Act of 1987?

4.      What efforts are underway to improve the government's ability to
decrypt non-Clipper algorithms which are likely to be used by criminals? 
Can the government decrypt all commercially available hardware sold
domestically and abroad? If not, wouldn't it be a better policy to direct
U.S. resources in that direction instead of the Clipper approach?

5.      What percentage of the 800 to 900 annual Title III interceptions
encounter encrypted communications?  What percentage of the encryption law
enforcement encounters is estimated to be Clipper, as opposed to other
encryption schemes?  Is this a solution in search of a problem?

6.      Did the government consider commercially-available encryption
schemes and reject them? If so, why were they rejected, and is that
analysis available? If not, why not?

7.      Capstone is the successor to Clipper with the addition of public
key exchange and digital signature capabilities. Is Clipper just an
intermediate step before Capstone is released? Why did the White House
press release not mention Capstone?

8.      How will this relate to the FBI's Digital Telephony Proposal?  Has
the Administration committed to supporting, discarding or reintroducing the
proposal in a new form?

9.      What is the history of the proposal?  How long has it been under
consideration?

10.     How long have the Clipper Chip and the escrow concept been in
development?  Which agency originated these concepts?

B. Secrecy of the Algorithm

11.     Will the Clipper proposal have the same degree of public review
that other NIST standards, such as DSS, have gone through?

12.     How can the public trust the security and reliability of an
algorithm that is kept classified?

13.     If American firms are not able to have their encryption experts
examine the algorithm, how can they be sure that there is no "trap door"
that would allow any Clipper Chip security system to be overridden?  Dr.
Kammer of NIST has said that "respected experts from outside the government
will be offered access" to the algorithm. How do interested parties go
about obtaining this access to the classified material about the Clipper
algorithm and participate in the analysis of the design to search for trap
doors and other weaknesses?  What specific reports from this process will
serve to reassure users regarding the integrity of the Clipper Chip?

14.     What will be the consequence if the algorithm is published? Will it
become less secure?  If publication (i.e., de-classification) would make it
less secure, how secure can it be? 

15.     If the Clipper Chip is too weak to protect classified government
communications, why should it be used for sensitive proprietary private
sector communications?

16.     Executive Order 12356 has procedures on classification and
declassification of information.  Is the algorithm being classified under
the framework of this order? What agency is in charge of classification and
declassification?

17.     How much effort has the government put into the design and
cryptanalysis of the Clipper Chip as compared to the public analysis of
the Data Encryption Standard during the last 16 years?

18.     Is the Skipjack algorithm being used by the Clipper Chip derived
from codes used in the management of our nuclear arsenal?  Is this why the
algorithm is being kept secret?  If this is so, why are we using this
secret system for a dubious commercial standard?  If there is a national
security justification to avoid having this encryption technique revealed,
why risk compromising it by integrating it into publicly distributed
products?

19.     If the algorithm is classified, how will it be legal to distribute
the chips to users not qualified to handle classified encryption equipment?
This seems contrary to Facility Security Clearance procedures and the
Personal Security Clearance requirements of DoD 5220.22-M, Industrial
Security Manual for Safeguarding Classified Information.

20.     Is it illegal to reverse engineer the Clipper Chip?  If it were
reverse engineered, would it then be illegal to reveal the algorithm?  

C. Voluntariness of Clipper System

21.     Will this system be truly voluntary? If so, won't criminals and
terrorists just use some other type of encryption?

22.     If the use of the Clipper Chip is "voluntary," why would any party
desiring privacy or secrecy of communications use it, knowing that the US.
government has a process to allow decryption?  If the Administration's
ultimate goal is to ban other forms of encryption for use domestically,
what is the legal basis for such an approach?

23.     Isn't the Administration doing more than "encouraging" use of
Clipper?  (E.g., discontinuing DES at the end of the current certification
cycle, directing NIST to adopt Clipper as a Federal standard, and
maintaining export restrictions on hardware/software using different
algorithms.)

24.     Does the government have any plans to campaign for the
implementation of the Clipper Chip as a standard for data cryptography?

25.     What impact will the introduction of Clipper have on the market for
other encryption technologies?  Will the government otherwise try to
discourage other cryptographic mechanisms from being marketed domestically
and abroad?

26.     Isn't the government dictating the design of technology into
commercial products rather than allowing market demand to dictate?

27.     What prevents a sender of information from encrypting with secure,
easy to obtain software using DES or RSA algorithms before sending data
through a channel encrypted with the Clipper system?
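
The superencryption scenario in Question 27 can be sketched in a few lines.
The XOR-keystream cipher below is a toy stand-in (an assumption for
illustration only, not DES, RSA, or Skipjack), but it shows the structural
point: a wiretapper holding the escrowed keys strips only the outer Clipper
layer and is still left facing the inner ciphertext.

```python
# Toy illustration of superencryption (Question 27).  A hash-derived
# XOR keystream stands in for both the user's private cipher and the
# Clipper layer; this cipher is an illustrative assumption, not the
# real algorithms.
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt by XORing data with a keystream derived from key."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

plaintext = b"meet at midnight"
inner_key = b"users-private-key"    # never escrowed
clipper_key = b"escrowed-unit-key"  # recoverable through the escrow agents

# Sender: encrypt with the private cipher first, then the Clipper layer.
inner_ct = keystream_xor(inner_key, plaintext)
wire_ct = keystream_xor(clipper_key, inner_ct)

# A wiretapper with the escrowed key removes only the outer layer...
recovered = keystream_xor(clipper_key, wire_ct)
assert recovered == inner_ct      # ...and is left holding ciphertext,
assert recovered != plaintext     # not the plaintext.
```

Nothing in the escrow system prevents this layering, which is the point of
the question: escrow access yields the traffic, not the plaintext, whenever
the parties pre-encrypt.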

28.     Would the Administration ever consider making the Clipper Chip or
other key escrow system mandatory?

D. Key Escrow System

29.     How can the government assure us that the keys held in escrow are
not compromised?  What public or private agencies have sufficient integrity
and public trust to serve as escrow agents?

30.     How can the public be sure that keys will only be revealed upon
proper warrant?  Will there be clerks who actually operate the equipment
who could get anyone's keys?  Or will judges have personal keys, which
would be directly authenticated to the escrow agents' equipment that
protects the users' keys?

31.     Once the keys are obtained from the escrow holders, is it
envisioned that electronic surveillance can be done "real-time," or will
recording and post-processing be required?

32.     To hear both sides of a conversation, does law enforcement need the
keys of both participants?

33.     After law enforcement has properly obtained a pair of unit keys
from the escrow agents and conducted a wiretap, will the keys be "returned"
to the agents?  What safeguards exist to prevent law enforcement from
re-using the keys without authorization in the future?

34.     Once in possession of the unit keys, can the government pretend to
be ("spoof") the original unit owner?

35.     What is the smallest number of people who would be in a position to
compromise the security of the system?

36.     Can an escrow agent exercise discretion in the release of key
information?  E.g., can they refuse an inappropriate request?  (Phone
companies ensure that court orders are facially valid.)  Can they publicize
an inappropriate request?  Can they tell the person whose communications
were to be intercepted?

37.     Who will be responsible for auditing the escrow process and the use
of revealed keys?

38.     How will the government ensure that unanticipated uses of the
escrow database are prevented in the long term?  (E.g., the Census database
was supposed to stay confidential for 75 years, but was released during
World War Two to allow Japanese-Americans to be imprisoned without cause.
What protections are in place to make sure that this never happens again?)

39.     What happens when one discovers that the keys have been captured
through theft?  How difficult would it be to change keys?  What is done in
the meanwhile?  How difficult is it to reprogram the chip, or do you need a
new chip?

40.     If the chip can be reprogrammed, how do you prevent covert changes
that will not be discovered until authorization to tap is received and
execution of the warrant is forestalled?

41.     It appears that once a given chip has been compromised due to use
of the escrowed keys, the chip and the equipment it is used in are
vulnerable forever.  Is there any mechanism or program to re-key or replace
compromised hardware?  Is there any method for a potential acquiring party
to verify whether the keys on a given chip have been compromised?  Who
should bear the cost of replacement or re-keying of compromised hardware?

42.     What safeguards will be used when transporting the escrow keys?

43.     What are the national security implications of widespread
deployment of Clipper?  Does it make our communications more susceptible to
disruption or jamming?

44.     Doesn't the two-escrowee approach make these locations targets of
opportunity for any party or foreign government that wants to gain access
to sensitive US. information?  If an escrow location is compromised, all
chip data contained there is compromised.  Wouldn't these locations also
become targets of opportunity for any criminal or terrorist organization
that wanted to disrupt US. law enforcement?  What back-up or physical
security measures are envisioned?  If multiple copies are kept, doesn't
this increase the threat of compromise?

E. Choice of Agents for the Keys

45.     Who will be the agents for the keys? How secure will they be from
the outside and from the inside?  What is the cost of maintaining the
escrow system?  Who will pay?  Who will profit?

46.     When will the escrow agents be announced? Will there be a process
to allow input into the selection of these individuals/agencies?

47.     Although it has been reported that the escrow holders will not be
the FBI, DoD, CIA or NSA, is it envisioned that one or both of the escrow
locations will be non-government entities?  Can one or both be private
parties?  What will the process be to determine what private party will be
awarded the contract for key holder?

48.     Can the set of escrow agents be changed after the initial
selection? How can the government be prevented from moving the escrow
contract to a more pliable escrow agent, if one of the agents stands up
against the government for the rights of the people whose keys they are
holding?

49.     Will escrow agents be immune from prosecution during their term of
office, like Members of Congress, the President, and Justices of the
Supreme Court?  If not, what will prevent the government from harassing the
agents during a dispute with the Justice Department?

50.     Will there be a mechanism for particular people to keep their keys
out of the key escrow database, or to obtain Clipper Chips with keys that
have not been escrowed? (E.g. Judges, law enforcement officers, NSA
officials, the President, etc.)

F. Level of Security of Clipper Chip Encryption
51.     How will the government assure American businesses that their
proprietary information is not compromised?  Given the extremely
competitive nature of the high-tech industries, and the importance of
intellectual property, how can American firms be adequately protected?

52.     How will the government assure American citizens that the privacy
of their electronic communications and the security of personal information
that is transmitted in electronic form will be protected under the Clipper
system?

53.     If the Administration is so confident about the level of security of
the Clipper Chip scheme, why will classified information not be encrypted
with it?

54.     What warranty is the US. government prepared to make regarding the
security of the Clipper Chip compared to other algorithms, and what
indemnity will it offer for breaches of the algorithm, for chips that are
compromised due to failures in the security of the escrow system, or for
other failures in the Clipper approach?

55.     What effect does Clipper have on other NSA and DOD programs aimed
at encryption and authentication of unclassified messages (e.g., MOSAIC)?

56.     If Clipper is not approved for classified traffic, what government
agencies will be utilizing Clipper, and for what applications?

57.     Normal security procedures involve changing cryptography keys
periodically, in case one has been compromised. But the family and unit
keys cannot be changed by the user. If these keys are compromised, it won't
matter how frequently the user changed their session keys. Doesn't the long
use of the same family and unit keys increase the likelihood that these
keys will be compromised while they are still in use? Doesn't this also
eliminate a significant degree of the user's control over the level of
security that his or her system provides?
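
The layering underlying Question 57 can be made concrete.  Public accounts
of the scheme describe each call's session key traveling in a Law
Enforcement Access Field (LEAF) wrapped under the chip's fixed unit key.
The sketch below uses a toy XOR cipher (an assumption, not Skipjack) to show
why a single compromise of the unchangeable unit key retroactively opens
every session key ever wrapped under it.

```python
# Sketch of Question 57's concern: fresh session keys per call, all
# wrapped under one fixed unit key.  The toy XOR cipher here is an
# illustrative assumption, not the classified Skipjack algorithm.
import hashlib
import os

def toy_wrap(key: bytes, data: bytes) -> bytes:
    """XOR data with a key-derived keystream (toy cipher; its own inverse)."""
    stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, stream))

toy_unwrap = toy_wrap  # XOR cipher: wrapping and unwrapping are the same op

unit_key = b"burned-in-unit-key"  # fixed for the life of the chip

# Three calls, three fresh session keys, three LEAFs -- but every LEAF
# wraps its session key under the same unit key.
session_keys = [os.urandom(16) for _ in range(3)]
leafs = [toy_wrap(unit_key, k) for k in session_keys]

# One compromise of the unit key opens every call, past and future.
recovered = [toy_unwrap(unit_key, leaf) for leaf in leafs]
assert recovered == session_keys
```

Changing session keys frequently, the usual hygiene, buys nothing once the
unit key is out, because the user cannot re-key the chip.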

58.     If the government discovered that the algorithm or family key had
been discovered by a foreign government or private individuals, would it
tell the public that the system had been compromised?  Are there plans to
restore privacy and authentication if the algorithm is compromised?

59.     How secure is the Clipper algorithm if it is attacked by a person
with half the key? 

G. Level of Privacy Protection

60.     Given the dramatic growth in transmission and storage of personal
information in electronic form, does the Administration recognize that
private individuals, as well as large organizations, need access to
affordable, robust encryption systems?

61.     Is law enforcement permitted to identify the specific piece of
communications equipment without obtaining a warrant?  If encrypted
communications include the serial number ("chip family key"), will law
enforcement be able to keep track of communications traffic and track
private citizens without even securing the keys from the escrow agents?

62.     Does the Administration believe that all household phones are going
to be replaced with secure versions over some period of time?  At what
cost?

63.     It has been impossible to keep any large collection of information
completely private, including Social Security records, tax information,
police files, motor vehicle records, medical records, video rentals, highly
classified military information, and information on abuses of power. How
will users be able to tell when this happens to the key escrow information?

H. Constitutional/Legal Implications

64.     Has the Administration fully considered the constitutional
implications of the Clipper Chip and other key escrow systems?

65.     Does forcing someone to disclose a key for future law enforcement
access infringe the fundamental right against self-incrimination embodied
in the Fifth Amendment?

66.     Does requiring key disclosure in conjunction with a particular
technology violate users' right to free speech under the First Amendment? 
Courts frown most severely on any government attempts to compel a
particular form of speech.

67.     Does the escrow system violate the letter or the spirit of the
Fourth Amendment protections which safeguard citizens against intrusive law
enforcement practices?

68.     When the Administration says "nor is the U.S. saying that 'every
American, as a matter of right, is entitled to an unbreakable commercial
encryption product,'" are they therefore saying the inverse, that every
American is not allowed to have an unbreakable commercial encryption
product?

69.     Does the Administration see the need for any new legislation to
implement its Clipper Chip proposal? If so, specifically identify.

70.     In the event that one or more escrow keys are obtained through
unauthorized means, what liability, if any, might the equipment
manufacturer have to bear?

71.     What will be the relationship between Federal and state law
enforcement?  Will the policy pre-empt state law?  How will state law
enforcement access the "key" system?

72.     What is the statutory authority for regulation of domestic
encryption?  Are any of these statutes cold war relics?  Should the
efficacy of all statutes that affect civilian encryption be reviewed?

73.     What protections do we have against blackmailing by escrow agents,
or by others who have gained possession of escrowed keys?  Is there civil
or criminal liability for escrow agents who reveal keys illegally?

74.     What is the impact on society if the right to hold a truly private
conversation is withdrawn?

75.     Is strong encryption technology important for protecting
intellectual property in a digital network environment?

I. Logistics of Chip Development and Manufacture

76.     Why weren't other chip manufacturers given the chance to bid on the
chip production process?  Why was the choice made to have only one
manufacturer?

77.     Since the Clipper Chip design data will need to be released to
manufacturers, how will we be assured that this information, in itself,
will not allow the user systems to be compromised?

78.     What assurances will there be that the manufacturer is not keeping
a record of all keys issued?

79.     We have read Dorothy Denning's explanation of how the two 80-bit
keys will be created in the SCIF.  Is this description accurate? If not,
how would this process occur? If so, is the system feasible? What will the
cost be for this process and for the increased security of the involved
government agents?
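
Dr. Denning's published description has the 80-bit unit key formed as the
exclusive-OR of two components, one held by each escrow agent.  Assuming
that XOR construction is accurate, the splitting can be sketched in a few
lines: either component alone is a uniformly random value carrying no
information about the unit key (the situation behind Question 59's "person
with half the key"), while the two together recover it exactly.

```python
# Sketch of two-component key escrow via XOR secret splitting,
# assuming the construction in Dorothy Denning's public description.
import secrets

KEY_BITS = 80  # reported size of the Clipper unit key

def split_key(unit_key):
    """Split a unit key into two escrow components whose XOR is the key."""
    k1 = secrets.randbits(KEY_BITS)  # component for escrow agent 1
    k2 = unit_key ^ k1               # component for escrow agent 2
    return k1, k2

unit_key = secrets.randbits(KEY_BITS)
k1, k2 = split_key(unit_key)

# Both components together reconstruct the unit key...
assert k1 ^ k2 == unit_key
# ...but each component taken alone is uniformly distributed and
# independent of the unit key, so one compromised agent learns nothing.
```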

80.     The chips will be programmed in a Sensitive Compartmented Information
Facility (SCIF). Does this suggest that the chips should at some point be
classified Secret or Top Secret? What is the classification of the Clipper
and Capstone chips and the Skipjack algorithm? How will these chips be
declassified once leaving the SCIF?

81.     Some of the press reports imply that AT&T has had access to this
information in order to incorporate Clipper into some of its equipment
designs. Is that implication accurate?

82.     Can this scheme be implemented in software? If so, why haven't we
seen information on that software?  If not, were issues of how this
hardware solution would affect continued use of software encryption
adequately evaluated? Were the comparative costs of software and hardware
encryption schemes evaluated? Is this evaluation available for analysis?

83.     Current high speed DES processors have encryption rates of
approximately 200 megabits per second, while the Clipper Chip has a
throughput of 12.5 megabits per second.  Within two to five years, 100 Mbs+
technologies, such as Fast Ethernet, FDDI and ATM, will become commonplace.
 How will the Clipper technology be used in environments where data is sent
at 100 Mbs or faster?
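
The arithmetic implicit in Question 83 is simple but stark: at 12.5
megabits per second per chip, keeping a faster link fully encrypted means
running chips in parallel, before any interleaving or synchronization
overhead.  A quick check using the figures from the question (the 155 Mbs
OC-3 rate for ATM is an added assumption):

```python
# Back-of-envelope check of the throughput gap raised in Question 83.
# The 12.5 Mbs Clipper figure and the 100/200 Mbs rates come from the
# question; 155 Mbs (ATM over OC-3) is an added assumption.
import math

CLIPPER_MBPS = 12.5

for link_mbps in (100, 155, 200):
    chips = math.ceil(link_mbps / CLIPPER_MBPS)
    print(f"{link_mbps} Mbs link needs at least {chips} chips in parallel")
```

For a 100 Mbs link the minimum is eight chips, and the gap only widens as
link speeds grow.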

J. Feasibility/Implementation

84.     What testing has been done to verify the ability of Clipper to work
across the panoply of new emerging technologies?  If the underlying digital
transport protocol drops a bit or two, will that interfere with Clipper
operation?  How critical is synchronization of the bit stream for Clipper
operation?  Has this technology been tested with ISDN, TDMA, Cellular, CDMA
Cellular, ATM, SONET, SMDS, etc. and other emerging technologies?  What
effect does Clipper have on the Cellular Authentication and Voice
Encryption (CAVE) algorithm?  Are these differences for key generation,
authentication, or voice privacy?

85.     Does the Administration seek to extend the Clipper Chip proposal to
the TDMA and CDMA digital cellular standards?

86.     When will the government publish the various Modes of Operation and
other documents for Clipper, together with a physical implementation
standard (similar to the old FS-1027)?

87.     Will the government consider the development of alternate sources
for the chip or will vendors be limited to a single, monopoly supplier?

88.     Initially, the Clipper Chip is being proposed for telephone
technology, but the White House specifically mentions that the technology
will be used for electronic data transmission. What is the timetable for
implementing this?

89.     What is the scope that the Administration envisions for the Clipper
Chip's algorithm use?  What about Capstone?  Is it limited to voice, or
does it encompass electronic mail, network encryption, security modems,
long-haul bulk encryptors, video applications, computer password
protection, Intelligent Vehicle Highway Systems ("IVHS"), satellite
communications -- both transport and control, electronic funds transfers,
and so on?

90.     What is the Administration's policy on other security mechanisms
beyond privacy, such as message authentication codes for banking and EFT,
and for integrity and digital signatures for sender authentication and
non-repudiation? What is the impact on international standards such as
X.500 and X.509?

91.     Since Clipper, as currently defined, cannot be implemented in
software, what options are available to those who can benefit from
cryptography in software? Was a study of the impact on these vendors or of
the potential cost to the software industry conducted?

92.     What are the success criteria for the Clipper initiative? 
Would the government abandon its initiative if the Clipper is shown to be
unsuccessful beyond government use?

93.     What is the expected useful lifetime of the Clipper technology?
What do you expect will render it useless at some point?

94.     Is it true that the name "Clipper Chip" is the intellectual
property of another company?

K. Impact on American Competitiveness

95.     As the key-escrow approach is designed to ensure the ability of the
American government to access confidential data, do NIST and NSA expect
overseas customers (who do not have the protection of due process) to
purchase the chip for data protection?

96.     In testimony before the House Telecommunications Subcommittee, Mr.
Kammer of NIST indicated that if he were a foreign customer, he would not
purchase devices that included the Clipper Chip. Doesn't this raise serious
balance-of-trade problems?

97.     Will the technology, or the Chip itself, be shared with other
allied governments  (e.g., the UK), or will US. producers of data security
products, forced by government standards to develop clipper-based products
for the US. market, be permanently closed out of the overseas security
market?

98.     If Clipper won't be commercially accepted abroad, and export
controls continue to prohibit the exportation of other encryption schemes,
isn't the US. government limiting American companies to a US. market?

99.     Given the restrictions on who can build Clipper devices, how will
Clipper keep up with advances in semiconductor speed, power, capacity and
integration? Openly available devices, such as Intel-compatible
microprocessors, have seen dramatic gains, but only because everyone was
free to try to build a better version.

100.    Will the Clipper Chip be used nationally and internationally? How
will multinational operations accommodate this new system?

101.    Banking and finance are truly global today. Most European financial
institutions use technology described in standards such as ISO 9796. Many
innovative new financial products and services will employ the reversible
cryptography described in these standards. Clipper does not comply with
these standards. Will US. financial institutions be able to export Clipper?
If so, will their overseas customers find Clipper acceptable?

102.    If overseas companies provide systems based on algorithms that do
not have key escrow schemes that encrypt faster and more securely, how will
we compete internationally? We are market leaders in applications software
and operating systems.  Our world leadership in operating systems is
dependent on integrating security in internationally distributed systems.

103.    Internet Privacy Enhanced Mail (PEM) is becoming an internationally
recognized system for encrypting Electronic Mail. Would Skipjack encryption
become a U.S. standard for encrypting electronic mail while the rest of the
world used PEM? How would E-mail traffic between the U.S. and other
countries be encrypted?

L. Effect on Export Control Policy

104.    In light of the Clipper initiative, will export restrictions on
hardware and software encryption regimes using DES and RSA algorithms
(which are widely available abroad) remain in place?

105.    Will American firms be allowed to sell devices containing the
Clipper Chip abroad? Under which governmental regulatory regime would
exports of devices containing the Clipper Chip fall? What conditions would
be applied to exports of devices containing the Clipper Chip? (E.g., would
American firms be allowed to export devices to non-U.S. customers without
the escrow requirement? If not, who would hold the keys?)

106.    What governmental regulations will apply to imports of devices
containing the Clipper Chip? Given that most U.S. companies source most
customer premise equipment (e.g., telephones, fax machines, etc.) offshore,
how will the logistics be handled for the export of the Clipper Chip as a
component, and the subsequent import of the device containing the chip?
Will the U.S. permit non-U.S. manufacturers to have the Clipper algorithm? If
not, how will the Administration justify this trade barrier?

107.    If the Clipper Chip cannot be reverse-engineered, and if the U.S.
government is capable of decrypting, why would there be any reason to limit
Clipper products from being exported?

108.    If Clipper is allowed to be exported, does the U.S. government
foresee a problem with other governments? Would the U.S. government's access
to escrow keys be viewed as an exercise of extraterritorial jurisdiction?

M. Implications for Installed-Base/Existing Products

109.    What are the implications of NSA/NIST withdrawing the certification
of DES? Although it may -- at some point in the future -- no longer be used
for government purposes, that is not going to affect commercial or private
users' applications of DES. What about the embedded base of DES hardware?

110.    Will existing systems need to be replaced?

111.    What efforts were spent to make the new encryption approach
compatible with the embedded base of equipment?  If DES was becoming weak
(vulnerable), wouldn't merely extending the DES key length to 80 bits have
solved that problem?

112.    There are a number of companies that employ non-escrowed
cryptography in their products today.  These products range from secure
voice, data, and fax, to secure e-mail, electronic forms, and software
distribution, to name but a few.  With over a million such products in use
today, what does the Clipper scheme foretell for these products and the
many corporations and individuals that are invested in them and use them? 
Will the investment made by the vendors in encryption-enhanced products be
protected?  If so, how?  Is it envisioned that they will add escrow
features to their products or be asked to employ Clipper?

N. Process by which Input Will Be Received from Industry/Public Interest

113.    If the outcome of the policy review is not pre-ordained, then the
process to analyze the issues and arrive at solutions would seem to need a
great deal of definition. What roles have been identified for Congress, the
private sector, and other interested parties? Who is coordinating the
process?
114.    Why does the Presidential directive on the review process remain
classified?

From jim@RSA.COM Thu May 27 14:32:44 1993
Return-Path: <jim@RSA.COM>
Received: from RSA.COM (CHIRALITY.RSA.COM) by csrc.ncsl.nist.gov (4.1/NIST)
     id AA10737; Thu, 27 May 93 14:32:35 EDT
Posted-Date: Thu, 27 May 93 11:31:27 PDT
Received-Date: Thu, 27 May 93 14:32:35 EDT
Received: by RSA.COM 
     id AA22646; Thu, 27 May 93 11:31:27 PDT
Date: Thu, 27 May 93 11:31:27 PDT
From: jim@RSA.COM (Jim Bidzos)
Message-Id: <9305271831.AA22646@RSA.COM>
To: crypto@csrc.ncsl.nist.gov
Subject: Submission


Cryptographic Issue Statements
Computer System Security and Privacy Advisory Board
Technology Building Room B-154
National Institute of Standards and Technology
Gaithersburg, MD  20899

Statement of Jim Bidzos, President, RSA Data Security, Inc.


RSA Data Security, Inc.
100 Marine Parkway
Redwood City, CA  94065

Phone: 415/595-8782
Fax:   415/595-5198

email: jim@rsa.com

To Whom It May Concern:

Much has been said about Clipper and Capstone (the term Clipper will
be used to describe both) recently.  Essentially, Clipper is a
government-sponsored tamper-resistant chip that employs a classified
algorithm and a key escrow facility that allows law enforcement, with
the cooperation of two other parties, to decipher Clipper-encrypted
traffic.  The stated purpose of the program is to offer
telecommunications privacy to individuals, businesses, and government,
while protecting the ability of law enforcement to conduct
court-authorized wiretapping.

The announcement said, among other things, that there is currently no
plan to attempt to legislate Clipper as the only legal means to
protect telecommunications.  Many have speculated that Clipper, since
it is only effective in achieving its stated objectives if everyone
uses it, will be followed by legislative attempts to make it the only
legal telecommunications protection allowed. This remains to be seen.
In light of past attempts at this type of legislation (S266 in May
1991 and the Digital Telephony Bill of 1992) one must believe the
issue is being given serious consideration by law enforcement.

There are a number of companies that employ non-escrowed cryptography
in their products today.  These products provide security for voice,
data, and fax transmissions in networks all over the US.  Since
Clipper, as currently defined, cannot be implemented in software, what
options are available to those who can benefit from cryptography in
software?  Will NIST state clearly that the investment these companies
are making will not be threatened by legislation?

In 1992, the number of deployed products licensed by RSA Data Security
which use public-key went over one million. (This is the number of
products, not users.  There are likely more users than products.) The
majority of these products use BSAFE or TIPEM, software toolkits
offering DES and RSA, and no escrow features.  This number will grow
quickly as it does not include the RSA-enhanced Apple Macintosh OS or
Novell NetWare 4.0, both of which began shipping in 1993.  Apple sells
millions of Macs yearly, and Novell has well over 13 million
customers, most of whom will naturally upgrade to release 4.  Has NIST
considered and valued the impact of Clipper on the software industry?

Banking and finance (as well as general commerce) are truly global
today. Most European financial institutions use technology described
in standards such as ISO 9796.  Many innovative new financial products
and services will employ the reversible cryptography described in
these standards.  Clipper does not comply with these standards.  The
basis for international commerce will be compatible communications and
security systems.  Will US financial institutions be able to export
Clipper? If so, will their overseas customers or correspondent banks
find Clipper acceptable?  Will the governments of other countries
allow Clipper equipment into financial institutions that, in many
cases, they partially or entirely own?  Why was no study of the
potential impact of Clipper on US competitiveness conducted?

During its policy review in June 1993, NIST asked US industry to
detail actual losses and projections due to Clipper and export
controls.  This is unfair.  No company wants to admit publicly where
and how it lost business to competition.  Doing so simply provides
valuable information that can be used by competitors against them
again, or worse, by other competitors they haven't lost to yet.

If the government holds that export controls are working, even though
they don't contain the technology in the US, then let them tell us
where and how they benefit from the policy, or let's begin removing
the controls.  If that sounds unreasonable, it's only the equivalent
of their request to industry to "put up or shut up."

In what must be seen as the tip of the iceberg, warning signals exist.
Australia's Courier Mail reported in a lead business story on May 18,
1992 that U.S. export controls will be directly responsible for three
Australian companies taking over $100 million per year in business
from U.S. suppliers in Australia alone for Pay-TV systems.  They
report that the full amount could be billions in the emerging Pacific
market for these systems.

At a June 1992 conference in Washington, DC, five panelists discussed
how export controls were affecting their business.  One, a
representative of a Fortune 5 company, described how two of their
major clients were lost because adequate security could not be offered
by the US company in Europe. Another panelist, representing a major
computer company, described how a European company was specifically
created and funded to exploit market opportunities created by US
export controls.  He further stated that his company had lost system
sales --hardware and software-- due to their inability to provide
adequate security to foreign buyers.

Export controls coupled with Clipper, which puts the US at odds with
the rest of the world, could cost US industry billions of dollars in
lost commerce opportunities and lost jobs.  Clipper will cost US
industry billions of dollars, and create the potential for a national
catastrophe by putting all the keys to a nationwide security system in
one place.  The impact of these policies and actions deserves more open
study than NIST and NSA have been willing to provide.

This is the problem with Clipper/Capstone.  There is a presumption on
the part of the government that a wiretap capability through escrowed
cryptography must be protected regardless of the cost to industry.
This is what we should be debating.

From hanson@ptolemy.arc.nasa.gov Thu May 27 15:12:55 1993
Return-Path: <hanson@ptolemy.arc.nasa.gov>
Received: from ptolemy.arc.nasa.gov (ptolemy-ethernet.arc.nasa.gov) by
csrc.ncsl.nist.gov (4.1/NIST)
     id AA10795; Thu, 27 May 93 15:12:46 EDT
Posted-Date: Thu, 27 May 93 12:15:07 PDT
Received-Date: Thu, 27 May 93 15:12:46 EDT
Received: from jabberwock.arc.nasa.gov by ptolemy.arc.nasa.gov (4.1/) id
<AA24469>; Thu, 27 May 93 12:15:07 PDT
Date: Thu, 27 May 93 12:15:07 PDT
From: Robin Hanson <hanson@ptolemy.arc.nasa.gov>
Message-Id: <9305271915.AA24469@ptolemy.arc.nasa.gov>
Received: by jabberwock.arc.nasa.gov (4.1/SMI-4.1)
     id AA13922; Thu, 27 May 93 12:12:25 PDT
To: crypto@csrc.ncsl.nist.gov
Cc: hanson@ptolemy.arc.nasa.gov
Subject:  Cryptographic Issue Statement

[This is an updated version of a message I sent May 13.]

You have announced:

  "The Board solicits all interested parties to submit well-written,
  concise issue papers, position statements, and background materials on
  areas such as those listed below. ... Because of the volume of
  responses expected, submittors are asked to identify the issues above
  to which their submission(s) are responsive."

My paper included below addresses this issue: 

  ... Issues involved in balancing various interests affected by 
  government cryptographic policies.

Specifically, I examine whether government cryptographic policies
intended to preserve wiretap abilities can cost phone users less than
they benefit citizens seeking law enforcement.  This seems unlikely.

Robin Hanson

                          by Robin Hanson
              hanson@ptolemy.arc.nasa.gov 510-651-7483  
               47164 Male Terrace, Fremont, CA 94539
                            May 21, 1993
                         Distribute Freely

  SUMMARY: Compared to an average monthly phone bill of eighty dollars,
  the option to wiretap the average phone line is probably worth less than
  twelve cents a month to police and spy agencies.  Claims that this 
  option is worth over a dollar a month ignore the basic economics of 
  law enforcement.  Thus recently proposed government policies to preserve
  wiretap abilities in the face of technological change must raise phone 
  costs by less than one part in seven hundred to be cost-effective.  
  Why not let a market decide if wiretaps make sense?  


Until now, telephones have happened to allow the existence of "wiretaps",
cheap detectors which can pick up conversations on a phone line without the
consent of either party to the conversation.  And since 1968, U.S. police
have been allowed to request such wiretaps from judges, and must compensate
phone companies for expenses to assist a tap.  Since then, law enforcement
agencies have come to rely on this capability to aid in criminal
investigations.
However, wiretaps have become more difficult as phone companies have
switched to digital technologies.  And powerful new encryption technologies
threaten to make truly private communication possible; a small chip in each
phone could soon make it virtually impossible to overhear a conversation
without a physical microphone at either end.  So the U.S. government has
begun to actively respond to these threats to police wiretap abilities.

Regarding digital phone issues, an "FBI Digital Telephone Bill" was
circulated early in 1992 [1], proposing to require all communication
services to support easy wiretaps, now without compensation from the
police.  Each tapped conversation would have to be followed smoothly as the
parties used call-forwarding or moved around with cellular phones.  The
data for that conversation would have to be separated out from other
conversations, translated to a "form representing the content of the
communication", and sent without detection or degradation to a remote
government monitoring facility, to be received as quickly as the parties to
the conversation hear themselves talk.  Congress has yet to pass this bill.

Regarding encryption issues, the White House announced on April 16, 1993 
that 1) they had developed and begun manufacturing a special "wiretap" (or
"Clipper") chip to be placed in future phones, instead of the total privacy
chips which have been under private development, 2) they plan to require
this chip in most phones the government buys, and 3) they will request all
manufacturers of encrypted communications hardware to use this wiretap
chip.  The same day, AT&T announced it would use these chips "in all its
secure telephone products".  

The plan seems to be to, at the very least, create a de facto standard for
encryption chips, so that alternatives become prohibitively expensive for
ordinary phone users, and to intimidate through the threat of further
legislation.  Such legislation would be required to stop privacy fans and
dedicated criminals, who might be willing to pay much more to use an
alternative total privacy standard.

Both the specific wiretap chip design and the general algorithm are secret.
Each chip would be created under strict government supervision, where it
would be given a fixed identifier and encryption key [2].  At some
unspecified frequency during each conversation, the chip would broadcast
its identifier and other info in a special "law enforcement field".  Law
enforcement officers with a court order could then obtain the key
corresponding to this identifier from certain unspecified agencies, and
could thereby listen in on any future or previously recorded conversations
on that phone.

To date, most concerns voiced about the wiretap chip have been about its
security.  Encryption algorithms are usually published, to allow the
absence of public demonstrations of how to break the code to testify to the
strength of that code.  And it is not clear what government agency could be
trusted with the keys.  Many suspect the government will not limit its
access in the way it has claimed; the track records of previous
administrations [3], and of foreign governments [4], do not inspire
confidence on this point.

This paper, however, will neglect these concerns, and ask instead whether
this new wiretap chip, and other policies to preserve phone wiretaps, are
cost-effective tools for police investigation.  That is, which is a cheaper
way for society to investigate crime: force phone communications to support
wiretaps, or give police agencies more money to investigate crimes as they
see fit?  Or to put it another way, would police agencies still be willing
to pay for each wiretap, if each wiretapping agency were charged its share
of the full cost, to phone users, of forcing phones to support wiretaps?

A recent U.S. General Accounting Office report on the FBI bill stated [1]:

 "[N]either the FBI nor the telecommunications industry has 
  systematically identified the alternatives, or evaluated their costs, 
  benefits, or feasibility."

While this paper will not change this sad fact, it does aspire to improve
on the current confusion.  To begin to answer the above questions, we might
compare the current benefits wiretaps provide to law enforcement agencies
with projected costs of implementing the new wiretap chip and other wiretap
support policies.


1992 is the latest year for which wiretap statistics are available [5].
According to the Office of U.S. Courts, 919 wiretap installations were
requested by local, state, and federal police in 1990, no requests were
denied, and 846 taps were installed.  2685 arrests resulted from wiretaps
started the same year, 1211 arrests came from wiretaps in previous years,
and about 60% of arrests eventually lead to convictions.  About 37% of
wiretaps were requested by federal authorities, and 67% of state wiretaps
were in New York, New Jersey, and Florida.  28 states had no wiretaps, and
10 states do not allow wiretaps.

About 69% of taps were regarding drug offenses, and 10% for racketeering,
and 7% for gambling offenses.  Wiretaps are most useful for investigating
"victimless" crimes, since victims will often give police permission to
record their calls.

Each wiretap installation heard an average of 1861 calls, 19% of them
incriminating, among 117 people.  Of 829 installations reporting costs, the
average cost was $46,492.  Federal taps cost about twice as much as state
taps, so federal agencies paid 53% of total wiretap costs.  $1.1 million
was also spent following up on wiretaps from previous years.  Thus a total
of $40.4 million was spent on wiretaps, to obtain about 4000 arrests, at
about $10,000 per arrest, or four times as much as the $2500 per arrest
figure one gets by dividing the $28 billion spent by all police nationally
by the total 11 million non-traffic arrests in 1987 [6].  Thus wiretaps are
a relatively expensive form of investigations.
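As a sanity check, the per-arrest figures quoted above can be reproduced in a
few lines; every number comes from this paragraph and reference [6], nothing
here is new data:

```python
# Cost per arrest from wiretaps vs. from general police spending.
wiretap_spending = 40.4e6        # total spent on wiretaps, 1990 dollars
wiretap_arrests = 2685 + 1211    # arrests from current- and prior-year taps
cost_per_wiretap_arrest = wiretap_spending / wiretap_arrests

police_spending = 28e9           # national police spending [6]
all_arrests = 11e6               # non-traffic arrests, 1987 [6]
cost_per_arrest = police_spending / all_arrests

print(round(cost_per_wiretap_arrest))   # ~10370, i.e. about $10,000
print(round(cost_per_arrest))           # 2545, i.e. about $2,500
print(round(cost_per_wiretap_arrest / cost_per_arrest, 1))  # 4.1, about 4x
```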

75% of the wiretaps were for phone lines (vs. pagers, email, etc.), and
these are the focus of this paper.  The $30 million per year spent on phone
taps represents only one thousandth of the total police expenditures.
Projecting previous trends from the 138 million phone "access" lines in the
country in 1990 [6] suggests 147 million access lines in 1992.  Thus about
20 cents per year per phone line, or about two cents a month, is spent on
phone wiretaps.  Since 1978, our foreign intelligence agencies
have also been authorized to tap international phone calls.  No statistics
are published on these taps, so let us assume a similar number of "spy"
wiretaps are done, giving a total of ~$60 million annually, or four cents
per month per phone line spent on wiretaps.
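The per-line figures work out as follows; the "equal number of spy taps" line
is the paper's own assumption, since no statistics are published on those:

```python
# Monthly phone-tap spending per access line.
phone_tap_spending = 0.75 * 40.4e6   # 75% of wiretap spending was phone taps
access_lines = 147e6                 # projected 1992 access lines
monthly_per_line = phone_tap_spending / access_lines / 12
print(round(monthly_per_line * 100, 1))  # 1.7 cents; "about two cents"

# Assume an equal number of unreported foreign-intelligence taps:
total_tap_spending = 2 * phone_tap_spending  # ~$60 million annually
monthly_total = total_tap_spending / access_lines / 12
print(round(monthly_total * 100, 1))     # 3.4 cents; rounded up to four
```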

Of course the amount police spend on wiretaps is not the same as the
benefits of wiretaps.  How can we estimate benefits?  Dorothy Denning, an
advocate of both the FBI bill and the wiretap chip, claims that "the
economic benefits [of wiretaps] alone are estimated to be billions of
dollars per year" [7], and then refers to amounts fined, recovered, and "$2
billion in prevented potential economic loss" by the FBI from 1985 to 1991.
Denning further relays fascinating FBI claims that through wiretaps "the
hierarchy of organized crime has been neutralized or destabilized", and
that "the war on drugs ... would be substantially ... lost" without them.

Two billion dollars per year of wiretap benefit would translate to a little
over a dollar a month per phone line.  Denning, however, offers no support
for her claims, and appears to be relaying internal FBI figures, which the
FBI itself has neither revealed nor explained to the public.  And the FBI
is hardly a neutral party on this subject.

Estimating the benefits of police investigations is not as simple as it
might seem, however, and certainly requires more than adding up amounts
fined or recovered.  Long and well-established results in the economics of
law enforcement [8] tell us to reject the notion that we should be willing
to spend up to one dollar on police, in order to collect another dollar in
fines or to prevent another dollar of theft.  So, for example, we rightly
reject IRS pleas for increased budget based solely on estimates of how many
more dollars can be collected in taxes for each dollar spent by the IRS.
In fact, a main reason given for using public police to investigate crime,
instead of private bounty hunters, is to avoid such police overspending.

In general, we deter a given class of criminals through a combination of
some perceived probability of being caught and convicted, and some expected
punishment level if convicted.  And some crime is directly prevented, rather
than deterred, through some level of police monitoring.  The optimum police
budget is a complex tradeoff between social costs due to the crimes
themselves, the punishment exacted, and police expenses.

How then can we estimate wiretap benefits?  Let us assume that about the
right total amount is being spent on police, and that police have about the
right incentives, to spend their budget to monitor where it would help the
most, and to get as many as possible of the right kinds of convictions.
(If police budgets are too low, then the answer is to increase them, rather
than trying to crudely subsidize any one of their expenses.)

In this case the social benefit of being able to wiretap is no more than
about the additional amount police would be willing to pay, beyond what
they now pay, to undertake the same wiretaps (assuming this remains a small
fraction of total police budgets).  The benefit of wiretaps is actually
less than this value, because were wiretaps to become more expensive, we
might prefer to get the same criminal deterrence by instead raising
punishment and lowering the probability of conviction, or perhaps we might
accept a lower deterrence level, or even decriminalize certain activities.
Police monitoring might be similarly adjusted.

How much police would be willing to pay for each wiretap would depend, of
course, on what alternatives are available.  If unable to wiretap a
particular suspect's phone line, police might instead use hidden
microphones, informants, grant immunity to related suspects, or investigate
a suspect in other ways.

The law requires that police requesting a wiretap must convince a judge
that other approaches "reasonably appear to be unlikely to succeed if tried
or to be too dangerous".  But in practice judges don't often question
boilerplate claims to this effect in police requests [9], and
investigations often continue even after a wiretap has failed to aid an
investigation.  Experienced investigators advise wiretaps as a last resort,
but mainly because wiretaps are so expensive.

More importantly, police can also choose to focus on similar suspects who
are more easily investigated without wiretaps.  Most police cases are near
the borderline where it is not clear that they are worth pursuing, and will
be simply dropped should a more pressing case suddenly arise.  Many cases
reach the point where a wiretap might help, but are dropped because a
wiretap seems too costly.  And most cases now using wiretaps would probably
be abandoned if wiretaps became dramatically more expensive.

No doubt a few wiretaps are so valuable that it would have cost ten times
as much to obtain similar results through other means.  But on average, it
is hard to imagine that police would be willing to pay more than a few
times what they now pay for each wiretap.  If we assume that police would
on average be willing to pay twice as much for each tap, then the social
benefit of phone wiretaps is about equal to the current spending level of
four cents a month per phone line.  If we assume that police would on
average be willing to pay four times as much per wiretap, the option to
wiretap the average phone would be worth twelve cents a month.
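The two willingness-to-pay scenarios reduce to a one-line calculation: the
benefit is only the *extra* amount police would pay beyond the roughly four
cents per line per month already being spent:

```python
# Wiretap option value per line per month, in cents, under each scenario.
current_cents = 4  # current police + spy wiretap spending per line per month
benefits = {multiple: (multiple - 1) * current_cents for multiple in (2, 4)}
print(benefits)  # {2: 4, 4: 12}: four cents at 2x, twelve cents at 4x
```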

A better estimate of wiretap values might come from randomly asking recent
wiretap requestors whether they would have still requested that wiretap had
they expected it to take twice as much labor to get the results they had
expected, or three times as much, etc.  The FBI will not allow such a
survey by ordinary citizens, but perhaps some state police would.  But
until such research is done, the twelve cent figure seems a reasonably
generous estimate, and the four cent figure may be closer to reality.

Of course the value of the option to tap any particular phone line
presumably varies a great deal from the average value.  But unless the
police can somehow pay only for the option to wiretap particular phone
lines of its choosing, it is the average value that matters for a
cost/benefit analysis.


Let us for the moment optimistically assume that the U.S. government
encryption scheme used in the wiretap chip is as secure as whatever private
enterprise would have offered instead, protecting our conversations from
the spying ears of neighbors, corporations, and governments, both foreign
and domestic.  Even so, the use of this chip, and of other policies to
support wiretaps, would create many additional costs to build and maintain
our communication system.

Some phone companies must have perceived a non-trivial cost in continuing
to support wiretaps while moving to digital phone transmissions, even when
compared to the widely recognized value of staying on the good side of the
police.  Otherwise the police would not have complained of "instances in
which court orders authorizing the interception of communications have not
been fulfilled because of technical limitations within particular
telecommunications networks" [1].  

The wiretap chip requires extra law enforcement fields to be added to phone
transmissions, increasing traffic by some unknown percentage.  A special
secure process must be used to add encryption keys to chips, while securely
distributing these keys to special agencies, which must be funded and
monitored.  The chips themselves are manufactured through a special process
so that the chip becomes nearly impossible to take apart, and the pool of
those who can compete to design better implementations is severely limited.
Private encryption systems not supporting wiretaps would require none of
these extra costs.

Perhaps most important, government decree would at least partially replace
private marketplace evolution of standards for how voice is to be
represented, encrypted, and exchanged in our future phones.  It is widely
believed that governments are less efficient than private enterprise in
procuring products and standards, though they may perhaps perform a useful
brokering role when we choose between competing private standards.  How
much less efficient is a matter of debate: some say governments pay twice
as much, while others might say they pay only 10% more.

This type of wiretap support also raises costs by preventing full use of a
global market for telephone systems.  It pushes certain domestic phone
standards, which foreign countries may not adopt, and requires the use of
encryption methods known only to our government, which foreign countries
are quite unlikely to adopt.

In 1990, 53 U.S. phone companies had total revenues of $117.7 billion for
domestic calls, $4.4 billion for overseas calls, and $4.5 billion for
cellular calls [6], for a total cost of $126.6 billion dollars to run the
phone system.  Extrapolating recent trends suggests $138 billion for 1992,
and an average monthly phone bill of $78 per line.  If we generously assume
that police and spies would on average be willing to pay four times as much
as the ~$60 million they now spend on wiretaps annually, we find that
wiretaps are not cost effective if we must raise phone costs by as much as
one part in 700 to preserve wiretap abilities in the face of technological
change.  The twelve cents per line wiretap option value must be compared
with an average seventy dollar monthly phone bill.  (If we assume that
police would only pay twice as much on average, then this limit falls to
one part in 2300!)
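The cost-effectiveness thresholds follow directly from these figures; computed
exactly they come out slightly below the text's round numbers of 700 and 2300:

```python
# Maximum tolerable phone-cost increase as a fraction of the monthly bill.
monthly_bill = 78.0           # estimated average monthly bill per line, dollars
option_value_generous = 0.12  # 4x willingness to pay: twelve cents
option_value_modest = 0.04    # 2x willingness to pay: four cents
print(round(monthly_bill / option_value_generous))  # 650: ~one part in 700
print(round(monthly_bill / option_value_modest))    # 1950: ~one part in 2300
```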

Dorothy Denning relays FBI claims that $300 million is the maximum
cumulative development cost "for a switch-based software solution" so that
phone companies can continue to support wiretaps [7].  Denning does not,
however, say how long this solution would be good for, nor what the
software maintenance and extra operating costs would be.  And again this is
a figure which the FBI itself has neither revealed nor explained to the
public.  If we use a standard estimate that software maintenance typically
costs twice as much as development [10], and accept this FBI estimate, then
total software costs would by themselves be five times the above generous
estimate of annual wiretap benefits.
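Combining the FBI's $300 million figure with the 2-to-1 maintenance rule of
thumb [10] and the generous benefit estimate above (an extra three times the
~$60 million now spent annually):

```python
# Total switch-software cost vs. a generous annual wiretap benefit.
dev_cost = 300e6             # FBI's claimed maximum development cost
maintenance = 2 * dev_cost   # rule of thumb: maintenance ~2x development [10]
total_software = dev_cost + maintenance      # $900 million

annual_benefit = 3 * 60e6    # extra 3x the ~$60M current annual spending
print(total_software / annual_benefit)       # 5.0, i.e. five times
```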

The current government contractor claims it will offer the wiretap chips
for about $26 each in lots of 10,000 [2], over twice the $10 each a
competing private developer claims it would charge [11] for a chip with
comparable functionality, minus wiretap support.  And the wiretap chip
price probably doesn't reflect the full cost of government funded NSA
research to develop it.  If only one phone (or answering machine) is
replaced per phone line every five years, the extra cost for these chips
alone comes out to about 27 cents extra a month per line, or by itself more
than two times a twelve cent estimated wiretap option value.  Of course
most phones wouldn't have encryption chips for a while, but the wiretap
benefit is per phone, so this argument still applies.
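The chip arithmetic, using the $26 and $10 prices quoted above and one
replaced phone (or answering machine) per line every five years:

```python
# Extra monthly cost per line of the wiretap chip over a private alternative.
wiretap_chip = 26.0   # claimed government-contractor price per chip [2]
private_chip = 10.0   # claimed price of a comparable non-escrow chip [11]
extra = wiretap_chip - private_chip   # $16 extra per chip
months = 5 * 12                       # one phone replaced every five years
extra_monthly_cents = extra / months * 100
print(round(extra_monthly_cents, 1))  # 26.7, i.e. about 27 cents a month
```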


Given the dramatic difference between the total cost of running the phone
system and an estimated social value of wiretaps, we can justify only the
slightest modification of the phone system to accommodate wiretaps.  When
the only modification required was to allow investigators in to attach
clips to phone wires, wiretap support may have been reasonable.  But when
considering more substantial modification, the burden of proof is clearly
on those proposing such modification to show how the costs would really be
less than the benefits.  This is especially true if we consider the costs
neglected above, of invasions of the privacy of innocents, and the risk
that future administrations will not act in good faith [3].

If consensus cannot be obtained on the relative costs and benefits of
wiretaps, we might do better to focus on structuring incentives so that
people will want to make the right choices, whatever those might be.
Regarding phone company support for wiretaps, it seems clear that if
wiretaps are in fact cost-effective, there must be some price per wiretap
so that police would be willing to pay for wiretaps, and phone companies
would be willing to support them.  As long as the current law requiring
police to pay phone company "expenses" is interpreted liberally enough, the
market should provide wiretaps, if they are valuable.

Monopoly market power of phone companies, or of police, might be an issue,
but if we must legislate to deal with monopoly here, why not do so the same
way we deal with monopoly elsewhere, such as through price regulation?
Legislating the price to be zero, however, as the FBI bill seems to
propose, seems hard to justify.  And having each police agency pay for
wiretaps, rather than all phone companies, seems fairer to states which
forbid or greatly restrict the use of wiretaps.

Regarding encryption chips, recall that without legislation outlawing
private encryption, serious criminals would not be affected.  In this case,
it does not seem unreasonable to allow phone companies to offer discounts
to their customers who buy phones supporting wiretaps, and thereby help
that phone company sell wiretaps to police.  Each phone user could then
decide if this discount was worth buying a more expensive phone chip, and
risking possible unlawful invasions of their privacy.  Adverse selection,
however, might make privacy lovers pay more than they would in an ideal
market.
If outlawing private encryption is seriously considered, then we might do
better to instead just declare an extra punishment for crimes committed
with the aid of strong encryption, similar to current extra punishments for
using a gun, crossing state lines, or conspiring with several other people.
As in these other situations, a higher punishment compensates for lower
probabilities of conviction for such crimes, and for higher enforcement costs,
while still allowing individual tradeoffs regarding wiretap support.

If, as seems quite possible, the stringent cost requirements described here
for preserving wiretap abilities cannot be met, then we should accept that
history has passed the economical wiretap by.  Police functioned before
1968, and would function again after wiretaps.

[1] ftp: ftp.eff.org /pub/EFF/legislation/new-fbi-wiretap-bill

[2] Clipper Chip Technology, ftp: csrc.ncsl.nist.gov /pub/nistnews/clip.txt

[3] Alexander Charns, Cloak and Gavel, FBI Wiretaps, Bugs, Informers, and
    the Supreme Court, Univ. Ill. Press, Chicago, 1992.

[4] Headrick, The Invisible Weapon, Oxford Univ. Press, 1991.

[5] Report on Applications for Orders Authorizing or Approving the
    Interception of Wire, Oral, or Electronic Communications, 1992,
    Administrative Office of U.S. Courts, Washington, DC 20544.

[6] Statistical Abstract of the United States, 1992.

[7] Dorothy Denning, "To Tap Or Not To Tap", Comm. of the ACM, March 1993.

[8] Richard Posner, Economic Analysis of Law, 4th Ed., 1992, Chapter 22.

[9] Report of the National Commission for the Review of Federal and State
    Laws Relating to Wiretapping and Electronic Surveillance, Washington,

[10] Barry Boehm, Software Engineering Economics, Prentice Hall, 1981.

[11] Conversation with Steven Bryen, representative of Secure 
     Communications Technology, 301-588-2200, April 25, 1993.

No one paid Robin anything to write or research this (unfortunately :-)

From Ralph.Durham@Forsythe.Stanford.EDU Thu May 27 19:11:05 1993
Return-Path: <Ralph.Durham@Forsythe.Stanford.EDU>
Received: from Forsythe.Stanford.EDU by csrc.ncsl.nist.gov (4.1/NIST)
     id AA11125; Thu, 27 May 93 19:10:56 EDT
Posted-Date:      Thu, 27 May 93 16:10:43 PDT
Received-Date: Thu, 27 May 93 19:10:56 EDT
Message-Id: <9305272310.AA11125@csrc.ncsl.nist.gov>
Date:      Thu, 27 May 93 16:10:43 PDT
To: crypto@csrc.ncsl.nist.gov
From: "Ralph Durham" <Ralph.Durham@Forsythe.Stanford.EDU>
Subject: Clipper Chips! NO!

Madam / Sir:

RE: Clipper chip;

I currently have no vested interest in the encryption of computer
files or the transmission of same.

Encryption technology has become a potential commodity for the
masses because of the PC. There will be no way to stop people from
encrypting their files for safekeeping or transmission should they
want. This is akin to prohibition, gun laws, and drugs. This is not
a policeable issue.

This technology, the clipper chip, will reduce our country's
technological edge in this field because it will lead to stagnation.

This will drive prices up for honest people because monopolistic
encryption chips will be needed. Add to this the cost of a secure
network to manufacture, program, sell, install, and keep the 2nd key
required to use the system. For what? I for one would like to see a
cost benefit analysis done for this issue alone.

The other issue is privacy. Without the public knowing what this
encryption is like and how it will be used, we cannot be sure that it
is really secure or needed. What precautions are going to be taken
to ensure that this is the best way to encrypt data, or that only law
enforcement can get the 2nd key for justified reasons? The way our
government, and other governments, have acted in the past leaves me
leery of the future. How will we be able to protect our rights and
privacy?

How will these keys be stored? How will access be decided? Will we
have criminal charges for the unauthorized use of the 2nd key? Or the
data thus gained? Or the theft of keys and data?

Drop this ill-fated plan. Spend the money on something better for
the country than the marginal, at best, law enforcement gain from
this idea. Address privacy issues, do the cost-benefit analysis, give
children inoculations and adequate educations, and we will have less
need for a police state.

Ralph G. Durham
104A Escondido Village
Stanford, CA 94305

