- Newsgroups: sci.crypt
- Path: sparky!uunet!news.univie.ac.at!scsing.switch.ch!bernina!caronni
- From: caronni@nessie.cs.id.ethz.ch (Germano Caronni)
- Subject: Re: Generating good keys
- Message-ID: <1993Jan29.030159.7177@bernina.ethz.ch>
- Keywords: random sources, usage of, public random data, entropy measurement
- Sender: news@bernina.ethz.ch (USENET News System)
- Organization: Swiss Federal Institute of Technology (ETH), Zurich, CH
- References: <1993Jan28.201110.23210@qualcomm.com>
- Date: Fri, 29 Jan 1993 03:01:59 GMT
- Lines: 103
-
-
- In article <1993Jan28.201110.23210@qualcomm.com> you (Phil Karn) write:
- >
- >The question, of course, is knowing when you have enough. MD-5 itself
- >is pretty fast, but gathering all of this information could still slow
- >you down. And you'd run the risk of generating correlated results if
- >you do it too often. Comments?
- >
- Hi,
- just bear with me if this gets close to nonsense :-)
- Your idea sounds good to me. And intuitively I agree with the 'distillation'
- of random data by e.g. MD5. Perhaps there are formal reasons for it?
-
- I can't say anything certain about getting correlated results, but I guess
- when sampling at fixed intervals or at certain given times, correlation
- might get higher. How about sampling at randomly :-) specified times, and
- storing this data in a private place, where it can be consumed as a
- one-time random-bit stream? 'Randomly' here means using the last sample(s)
- of random data to determine the time of the next sample.
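- A minimal sketch of such a self-scheduling sampler in Python (just an
- illustration: read_sample() is a hypothetical stand-in for a real noise
- source, and the 1..50 ms delay range is an arbitrary choice):

```python
import hashlib
import time

def read_sample():
    # Hypothetical noise source: low bits of a high-resolution clock.
    return time.perf_counter_ns() & 0xFFFF

def gather(n_samples):
    pool = bytearray()
    for _ in range(n_samples):
        s = read_sample()
        pool += s.to_bytes(2, 'big')
        # The freshest sample decides when the next one is taken, so
        # the sampling schedule itself depends on collected randomness.
        time.sleep(0.001 * (1 + s % 50))    # 1..50 ms
    # Distill the redundant pool down to a 16-byte one-time value.
    return hashlib.md5(bytes(pool)).digest()
```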
-
- What sources do we have (I am just trying to think them up; some of
- them are valid only for certain OSes)?
-
- Time of Day
- Mouse position, (latency of) actual keystrokes
- number of the current scanline on the monitor
- contents of the currently displayed image
- Hash of main memory, FATs, kernel tables, disk blocks, as you stated
- Access/modify times of /dev/tty??s
- CPU load
- Network packets
- Login/logout times of users
- PID/UID
- Response times for some ping(1)s :-)
- Temperature/pressure/etc. if measurable
- (e.g. input of measurement units for the current environment)
- (like turning on the mike on a NeXT and doing some sampling :-))
- Current number of ethernet collisions
- Remote users ... (everything you can gain from a network)
- The latest news (as lately stated)
- ...
-
- Can you imagine other sources ?
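- A few of the portable entries above can be mixed and distilled in a
- handful of lines (a sketch; the OS-specific sources like the scanline
- or collision counters are left out):

```python
import hashlib
import os
import time

def sample_sources():
    # A handful of cheap, portable sources from the list above; real
    # use would add as many OS-specific ones as are available.
    parts = [
        str(time.time_ns()),    # time of day, sub-second resolution
        str(os.getpid()),       # PID
        str(os.times()),        # CPU times, a rough load proxy
        repr(os.stat('.')),     # access/modify times of a directory
    ]
    return '|'.join(parts).encode()

# Distill the redundant raw samples down to 16 bytes with MD5.
key = hashlib.md5(sample_sources()).digest()
```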
-
-
-
- Now, about knowing when you have got enough.
-
- Reference: J. of Cryptology (1992) 5: 89-105,
- A 'Universal' Statistical Test for Random Bit Generators,
-
- from which I derive the idea explained below.
-
- I could collect several of these events and transform them into
- a bit stream, which would have high redundancy, as most of the
- subsequently collected data does not change enough, or not randomly
- enough. Then I could calculate the entropy of this sequence.
- If the entropy is high enough, I have enough randomness, can now
- hash the collection with MD5 and store it for further use. Some
- time later, at a point perhaps determined by the first byte of the
- 16 bytes obtained this way, I begin a new sampling sequence.
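- Such a sampling round, with the first byte of the digest scheduling the
- next one, might look like this (a sketch; collect_events() and
- entropy_high_enough() are trivial placeholders for a real event source
- and a real per-bit entropy test):

```python
import hashlib
import time

def collect_events():
    # Placeholder event source: low byte of a high-resolution clock.
    return bytes(time.perf_counter_ns() & 0xFF for _ in range(256))

def entropy_high_enough(data):
    # Placeholder test; a real one would estimate per-bit entropy.
    return len(set(data)) > 16

def one_round():
    data = collect_events()
    while not entropy_high_enough(data):
        data += collect_events()
    digest = hashlib.md5(data).digest()   # 16 bytes for further use
    next_delay = digest[0]                # first byte schedules next round
    return digest, next_delay
```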
-
- The question is, how do I find the actual entropy of the sequence?
- Would it be sufficient to try to 'compress' it, and be content
- with the result if a sufficiently low degree of compressibility
- is reached? Or are there reasons why you can't use this?
- You do not need to actually compress the sampled data; in the paper
- mentioned above an 'easy' way is shown to calculate the per-bit
- entropy of the sequence you want to test.
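- As a crude stand-in for the paper's per-bit entropy estimate, one could
- simply ask a general-purpose compressor (zlib here; the 0.95 threshold
- is an arbitrary illustration, and os.urandom is used only to exercise
- the check):

```python
import os
import zlib

def looks_random(data, threshold=0.95):
    # If zlib can shrink the data noticeably, it is still too
    # redundant to be hashed into key material.
    return len(zlib.compress(data, 9)) / len(data) >= threshold

assert not looks_random(b'A' * 1024)      # highly redundant: rejected
assert looks_random(os.urandom(4096))     # incompressible: accepted
```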
-
- Hmmm. How about taking adaptive compression: compress all input
- and use the output of this continual process? (While forgetting
- the input data and remembering only the current compression tables.)
- Wouldn't this data meet all needed standards for non-redundancy? Then
- you could hash the compressed output and use it.
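- With zlib's streaming interface that continual process could be sketched
- like this (zlib stands in for any adaptive compressor; only the
- compressor's output is consumed, its tables stay internal):

```python
import hashlib
import zlib

def whiten(sample_chunks):
    # Push raw samples through an adaptive compressor and hash the
    # compressed stream; redundancy is squeezed out before hashing.
    comp = zlib.compressobj(9)
    md5 = hashlib.md5()
    for chunk in sample_chunks:
        md5.update(comp.compress(chunk))
    md5.update(comp.flush())
    return md5.digest()   # 16 non-redundant bytes
```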
-
- I guess on some systems you could install a kind of daemon,
- which collects such data and presents a periodically changing
- window (size: several megs?) of it in a publicly available file.
-
- I could imagine such daemons on several machines, occasionally
- exchanging data to make the public random source even more
- random.
- Having the user, when he needs some random data, access some
- remote daemons to get it isn't a bad idea either, as long as
- enemies do not know what parts of the received data you actually
- use. (But I prefer regular exchange between daemons, which then hash
- the data into (parts of) their own public random data.)
-
- So, what is your opinion about this ?
-
- And how would the user choose the part of the random data visible
- in that file for his own use?
-
- Friendly greetings,
- Germano Caronni
-
- --
- Instruments register only through things they're designed to register.
- Space still contains infinite unknowns.
-
- Germano Caronni caronni@nessie.cs.id.ethz.ch
-