[Beowulf] Quasi-Non-Von-Neumann hardware in a Beowulf

Robert G. Brown rgb at phy.duke.edu
Thu Mar 10 22:31:07 EST 2005


On Thu, 10 Mar 2005, Joe Landman wrote:

> 
> 
> Robert G. Brown wrote:
> 
> > Actually IIRC there was some effort by Intel to build something into a
> > CPU or some other part of a mobo chipset, but I don't know what came of
> > it.  The people who need this commercially are webvolken and encryption
> > algorithms, where "unpredictable" is almost as good as "random to the
> > nth bit of randomness".  Nobody uses marginally secure encryption if
> > they can help it.  However, these folks are totally happy with
> > kilorands/sec, let alone megarands/sec.  I need hundreds of
> > megarands/sec.
> 
> (thinking aloud)
> 
> Hmmm.  I bet we could generate this (if you don't mind tt800 or its MT 
> ilk) in software, using some really neat and very powerful (and 
> inexpensive) COTS chips.  Don't know if we could hit 0.1 GRPS (gigarands 
> per second), but I bet it could come pretty close.

As I said, it isn't as easy as it sounds to generate truly random
numbers.  Pseudorandom, yes, and there I can give you immediate data
(from my dieharder program, for example) on how fast you can make them.
On a dual Opteron 242, for example, using the mt19937_1999 generator
(one of the best, fastest rngs in the GSL) I get roughly 50 MRPS
(megarands per second) per processor, or 100 MRPS total.  Generators of
comparable or slightly better quality tend to run anywhere from a bit
slower to a lot slower.  The built-in /dev/random entropy generator in
Linux is VERY slow and blocks when there is insufficient entropy.
/dev/urandom doesn't block and is quite unpredictable, but is STILL
very, very slow -- less than 1 MRPS on nearly any modern hardware.
This gives you some idea of how difficult it is to reach 100 MRPS on
ANY sort of hardware, even with pseudorandom number generators.

Hardware generators have the same issues.  Radioactivity-based
generators need to be pretty damn hot to generate 3.2 random gigabits
per second.  Photon counters have all sorts of hardware response time
issues (and tend not to be as Poissonian/random as you might think; see
Hanbury Brown-Twiss or computations of antibunching in the fluorescence
spectrum).  "Entropy"-based generators can be OK, but again, it is
difficult to know all the timescales of decay/decorrelation of all
aspects of order, as nearly ALL hardware-based sources are not
independently random over very short times -- rather, you hope that
they have "only" relatively short autocorrelation times and don't
sample them too often.

This is one of the things I might be working on a bit over the next
year, and is one reason I started working on dieharder (the tool I use
both to measure generators' quality of randomness, so to speak, and to
time them).

> I have been thinking about a USB dongle for the past few months that 
> provided random ints at a pretty nice rate.  The chip I have in mind for 
> this might be able to be powered off the USB port also (I think, I need 
> to see the overall power needs).
> 
> Could do a PCI card... would cost more to fabricate and sell.  The $30 
> mark might be low.  I'd guess closer to about $100 for that, unless we 
> could get significant volumes  ... (economies of scale work wonders for 
> component pricing).

FIRST you have to come up with a design, THEN you have to build a
prototype and test it.  Chances are that when you test it you'll find
that while perhaps (if you're lucky) it is unpredictable, it isn't
"uniformly random" at the bit level.  After all, plenty of things sample
a random DISTRIBUTION (say a Gaussian) that is unpredictable but not
uniform.  Then you get to look for transformations that will take your
non-uniform random distribution and make it uniform at the bit level.
Typically this will cost you bits -- for example, if your original
distribution has a bias in its 0 vs 1 numbers you can combine pairs of
bits to create a distribution that is uniform in 0 and 1.  Then you have
to look at 00, 01, 10, and 11 -- do they all occur binomially
distributed (on average) with p = 0.25?  How about 000, 001, 010, 011,
100, 101, 110, 111 with p = 0.125?  Etc.  (I find that NO pseudorandom
number generators I've tested have the right six or greater bit
distribution, that is, 000000, 000001, 000010, ... are not each
binomially distributed with p = 1/64 in a very large bit sample.)

Eventually you get it to pass diehard (and maybe even dieharder) at the
expense of many bits and a consequent halving (or worse) of the peak
rates.  If it is STILL really fast -- 100 MRPS is a good target, as it
is the equivalent of a dedicated dual Opteron -- THEN you can look for
a bus and a market.

Not easy at all.  Or cheap.  Intel and other companies have long been
looking for a good way of doing this, and if you find one (especially
one that could be incorporated on a single chip) you can probably sell
it directly to them and retire.

   rgb

-- 
Robert G. Brown	                       http://www.phy.duke.edu/~rgb/
Duke University Dept. of Physics, Box 90305
Durham, N.C. 27708-0305
Phone: 1-919-660-2567  Fax: 919-660-2525     email:rgb at phy.duke.edu


_______________________________________________
Beowulf mailing list, Beowulf at beowulf.org
To change your subscription (digest mode or unsubscribe) visit http://www.beowulf.org/mailman/listinfo/beowulf
