Hardware RNG (measures AC mains chaos to make random numbers)

MrAl said:
...Normally when I say 'generator' I mean an algorithm that is purely mathematical in some way and does not include real-world chaos of any kind.
...

I already have a well-tuned software PRNG here;
https://www.romanblack.com/black_rng.htm
which has a massive number of states, greater than that of any standard software RNG I have seen. It also changes its algorithm over time, and randomly mixes data processed by older algorithms with data from the current algorithm, two features not generally seen in software RNGs.
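
Just to illustrate what I mean by changing the algorithm over time and mixing old data with new (this is NOT the black_rng code, just a throwaway toy in C with made-up rules and a made-up schedule):

Code:
/* Toy illustration only -- not the actual black_rng algorithm.
 * Two different update rules are swapped every 1024 outputs, and data
 * produced under the previous rule is occasionally XOR-mixed back in. */
#include <stdint.h>
#include <stdio.h>

static uint32_t s = 0x12345678u;   /* state (arbitrary non-zero seed)      */
static uint32_t old_out = 0;       /* last value produced by the old rule  */

static uint32_t rule_a(uint32_t x) { return x * 1664525u + 1013904223u; }                 /* LCG step      */
static uint32_t rule_b(uint32_t x) { x ^= x << 13; x ^= x >> 17; x ^= x << 5; return x; } /* xorshift step */

static uint32_t toy_next(void)
{
    static uint32_t n = 0;
    uint32_t out;

    s = ((n >> 10) & 1u) ? rule_b(s) : rule_a(s);   /* swap rules every 1024 outputs */
    if (s == 0) s = 0x9E3779B9u;                    /* keep xorshift away from 0     */
    out = s;

    if (n % 7 == 0)          /* every 7th output, mix in older-rule data */
        out ^= old_out;

    old_out = s;
    n++;
    return out;
}

int main(void)
{
    for (int i = 0; i < 8; i++)
        printf("%08X\n", toy_next());
    return 0;
}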

It would be good to hear more ideas on a very simple hardware system to generate some entropy. When I said "self contained" I meant that the simple hardware makes the entropy and all the PIC needs to do is measure it, preferably with a time-based measurement, as that rules out the issues caused by voltage-based (ADC) measurements. :)

Something with a few gates or a few transistors, that reacts chaotically but also reliably would be perfect. Maybe a simulator job? ;)

()blivion said:
..."RC slope->digital input triggering points are always unpredictable"
If it's non-Schmitt-triggered, yes. That's the illegal logic area which is classified as "result not determinable". I have always thought about this and wanted to try to develop an RNG based on it, but digital logic tends to fry when you put its inputs in the "illegal zone". And that scares me. :)

I would use Schmitt gates, for reliability. They still have some µV or mV of fuzziness on their triggering edges, which is enough to add to the overall entropy of the reaction.

Jpanhalt said:
...It returns a result for 900,000 to 900,009 as 3587510954. Do you know whether that is correct? I only memorized pi to 10 decimal places (1415926535). At least it checks out that far.
...

Thanks for the link to the Pi tester. (Also thanks to ()blivion for the other Pi software link.) :) I have already downloaded some Pi files, but unfortunately they are all in decimal!! Raw Pi itself is essentially a "random" number, but does converting it into decimal corrupt it? My software really needs binary data to run the tests properly and can only do some basic data counting when the data is decimal (no nice displays).

(Added) I ran the test on 1.25 MB of Pi from the Gutenberg book file and compared it to 1.25 MB of "random" decimal digits 0-9 generated by rand(10).

Pi looks to be VERY much in line with what you would expect from a perfectly random sequence. It has no digit bias, matching the random data exactly. The only small difference I can detect is in the 2-digit pairs: Pi shows a slightly smoother distribution than rand(10) for the most common 2-digit pairs, but they are so close that both can be considered "good" random data (a rough sketch of the pair-count code follows the results);
Code:
(note! there are 10 digits 0-9, as ascii 48-57)
Results from 1254540 bytes of Pi (Gutenberg book);
Byte-aligned 2byte freq (largest 20); 
 'Perfect' next byte freq = 12545.4 times
  53,  53 = 12876 times 
  50,  55 = 12834 times 
  57,  52 = 12772 times 
  53,  52 = 12726 times 
  49,  57 = 12724 times 
  54,  48 = 12720 times 
  51,  48 = 12715 times 
  50,  49 = 12707 times 
  52,  54 = 12706 times 
  50,  50 = 12698 times 
  48,  57 = 12697 times 
  51,  55 = 12696 times 
  56,  56 = 12685 times 
  57,  56 = 12672 times 
  57,  57 = 12656 times 
  56,  51 = 12655 times 
  51,  51 = 12642 times 
  55,  54 = 12640 times 
  51,  53 = 12636 times 
  50,  52 = 12633 times 

Results from 1254540 bytes of rand(10);
Byte-aligned 2byte freq (largest 20); 
 'Perfect' next byte freq = 12545.4 times
  54,  50 = 12949 times 
  54,  56 = 12877 times 
  54,  49 = 12848 times 
  53,  54 = 12844 times 
  53,  48 = 12830 times 
  51,  54 = 12780 times 
  48,  54 = 12747 times 
  57,  54 = 12738 times 
  54,  53 = 12706 times 
  50,  51 = 12698 times 
  52,  54 = 12691 times 
  56,  54 = 12667 times 
  50,  54 = 12658 times 
  56,  51 = 12656 times 
  56,  56 = 12655 times 
  52,  52 = 12654 times 
  56,  55 = 12651 times 
  55,  53 = 12627 times 
  54,  48 = 12623 times 
  51,  57 = 12619 times
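
For reference, the pair counting above is nothing fancy; a rough C sketch of it would be something like this (the input file name is just a placeholder, and it only counts the ASCII digits 48-57, skipping anything else):

Code:
/* Count byte-aligned 2-byte pair frequencies over the ASCII digits in a
 * file and print the 20 most common pairs. Rough sketch only; the file
 * name is just a placeholder. */
#include <stdio.h>

int main(void)
{
    static long freq[256][256];                 /* pair counts            */
    FILE *f = fopen("pi_gutenberg.txt", "rb");
    if (!f) { perror("open"); return 1; }

    int prev = -1, c;
    long total = 0;
    while ((c = fgetc(f)) != EOF) {
        if (c < '0' || c > '9') continue;       /* keep only digits 48-57 */
        if (prev >= 0) { freq[prev][c]++; total++; }
        prev = c;
    }
    fclose(f);

    /* 10x10 possible digit pairs, so 'perfect' = pairs/100 */
    printf("'Perfect' next byte freq = %.1f times\n", total / 100.0);

    for (int k = 0; k < 20; k++) {              /* crude top-20 scan      */
        long best = -1; int bi = 0, bj = 0;
        for (int i = '0'; i <= '9'; i++)
            for (int j = '0'; j <= '9'; j++)
                if (freq[i][j] > best) { best = freq[i][j]; bi = i; bj = j; }
        printf("%3d, %3d = %ld times\n", bi, bj, best);
        freq[bi][bj] = -1;                      /* clear it so the next pass finds the next largest */
    }
    return 0;
}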

Maybe MisterT can comment on this comparison; his statistics math was pretty good, if I remember right. :)
 
The hardware RNG I've always considered would be battery powered, with individual buttons for 2, 3, 4, 6, 8, 10, 12, 16, 20, 30, and 100 sided "dice". I did it with a PIC chip and a PRNG on a breadboard (minus the 30- and 100-sided dice) about 20 years ago but was disappointed with the results.

From what I understand, modern zeners have been designed to minimize noise. I've also read somewhere, probably in a Radio-Electronics magazine 15 years ago, that a reverse-biased base-emitter junction, especially one in an older transistor, will behave like a very noisy zener. Maybe I'll try that this winter.
 
I've read this thread with interest; there are some interesting opinions on randomness here. Name your own choice of random number generator or seed generator.
Mine would start with a Geiger-Müller tube for the entropy input. How you deal with the random periods between its pulses is open to many suggestions, using counters, LFSRs or hashes.
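
One simple way to turn those random periods into bits is to timestamp successive pulses and compare pairs of intervals (the same basic trick the HotBits service uses, if I recall correctly). A rough C sketch, where get_pulse_time() is a made-up stand-in for a real timer-capture routine (stubbed here purely so the example runs):

Code:
/* Extract bits from Geiger pulse timing: compare successive inter-pulse
 * intervals, t1 < t2 -> 0, t1 > t2 -> 1, equal -> discard and retry. */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical capture routine: on real hardware this would return a
 * timer value latched on the tube's output pulse. Stubbed with rand()
 * here purely so the sketch compiles and runs. */
static uint32_t get_pulse_time(void)
{
    static uint32_t t = 0;
    t += 1u + (uint32_t)(rand() % 1000);   /* stand-in for a random inter-pulse gap */
    return t;
}

static int next_random_bit(void)
{
    for (;;) {
        uint32_t a = get_pulse_time();
        uint32_t b = get_pulse_time();
        uint32_t c = get_pulse_time();
        uint32_t t1 = b - a, t2 = c - b;
        if (t1 < t2) return 0;
        if (t1 > t2) return 1;
        /* equal intervals: throw the pair away and try again */
    }
}

int main(void)
{
    for (int i = 0; i < 32; i++)
        putchar('0' + next_random_bit());
    putchar('\n');
    return 0;
}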
 
One of my ideas was to use a hex Schmitt inverter IC as 6 independent RC oscillators, then resistor-couple some of their outputs back into some of their inputs.
Is this the sort of thing?
View attachment 67077
It's only 4-bit, with each bit modulating the clock period, but demonstrates the principle.
I have a feeling (but not the maths to back it up) that the number sequence will repeat at some function of the lowest common multiple of the various oscillator periods; but perhaps that's not the case?
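
For anyone curious, here's a very crude numerical sketch in C of resistor-coupled Schmitt RC oscillators (all component values, thresholds and the coupling pattern below are invented for illustration; they are not taken from the attached circuit):

Code:
/* Rough numerical sketch of 4 Schmitt-inverter RC oscillators whose
 * outputs are resistor-coupled into each other's RC nodes. Simple Euler
 * integration; every value here is invented for illustration only. */
#include <stdio.h>

#define N 4

int main(void)
{
    double vdd = 5.0, vt_hi = 3.0, vt_lo = 2.0;   /* assumed Schmitt thresholds      */
    double C = 10e-9;                             /* 10 nF timing caps               */
    double R[N] = { 10e3, 12e3, 15e3, 18e3 };     /* deliberately unequal R values   */
    double Rc = 47e3;                             /* coupling resistor               */
    double v[N] = { 0.1, 0.4, 0.9, 1.6 };         /* capacitor voltages              */
    double o[N] = { vdd, vdd, vdd, vdd };         /* inverter outputs                */
    double dt = 1e-7;                             /* 100 ns time step                */

    for (long step = 0; step < 200000; step++) {
        for (int i = 0; i < N; i++) {
            int j = (i + 1) % N;                  /* couple in the next oscillator's output */
            double icap = (o[i] - v[i]) / R[i] + (o[j] - v[i]) / Rc;
            v[i] += dt * icap / C;
        }
        for (int i = 0; i < N; i++) {             /* Schmitt inverter with hysteresis */
            if (v[i] > vt_hi) o[i] = 0.0;
            else if (v[i] < vt_lo) o[i] = vdd;
        }
        if (step % 1000 == 0) {                   /* sample the 4 digital outputs as a nibble */
            int nibble = 0;
            for (int i = 0; i < N; i++)
                nibble = (nibble << 1) | (o[i] > vdd / 2);
            printf("%X", nibble);
        }
    }
    printf("\n");
    return 0;
}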
 
Pi looks to be VERY much in line with what you would expect from a perfectly random sequence. It has no digit bias, matching the random data exactly. The only small difference I can detect is in the 2-digit pairs: Pi shows a slightly smoother distribution than rand(10) for the most common 2-digit pairs, but they are so close that both can be considered "good" random data;

Maybe MisterT can comment on this comparison; his statistics math was pretty good, if I remember right. :)

I don't think I can say much from that kind of comparison. The standard C++ library rand() is very likely a poor PRNG, optimized for speed, not quality. And your data set is still relatively small. I believe Pi is a very good sequence of random digits. At least it is extensively studied and "everybody" seems to agree that it is random and uniform.
 
At least it is extensively studied and "everybody" seems to agree that it is random and uniform.
And 100% predictable.
 
Name your own choice of [hardware] random number generator or seed generator.

LFSR PRNGs. (shrinking generator, alternating step generator.)

Fast, compact, power efficient, easy to implement in silicon, and random enough for almost everything other than academic and other highly strict niche applications. And if that's not good enough by itself, you can feed its output into a software PRNG algorithm.
 
If no two snowflakes are alike then I would say that parts of nature are truly random.
It's just a matter of measuring it.
 
Really rad machine find, 4pyros. I love it.

As for randomness as a whole: in the strictest sense, there really is no such thing as tangible "truly, fully, completely random". The universe has finite quantum states and only exists for a finite time (admittedly, both are very large). One of the requirements for tangible perfect randomness is that neither of those be true. Also, just because something can go on forever does not mean it is random. Take snowflakes (patterned ice crystals), for example: say no two of them are ever exactly the same. However, they are almost always round-ish, made of water, and about the same size. So there is a pattern, and thus they are not 100% random. There is always some pattern. Pi has a pattern; that pattern *IS* Pi. It may be unique to Pi, but that doesn't make it not a pattern. As Sceadwian said, it's 100% predictable.

It's semantics though. I guess the real definition of "random" is that there must be no PERCEIVABLE pattern. And a lot of PRNGs satisfy this.

In the end, if the only practical use for better and better randomness is cryptology, then for practical purposes predictable irrational numbers are not random. LFSRs are a far better choice.
 
Cachehiker said:
The hardware RNG I've always considered would be battery powered, with individual buttons for 2, 3, 4, 6, 8, 10, 12, 16, 20, 30, and 100 sided "dice". I did it with a PIC chip and a PRNG on a breadboard (minus the 30- and 100-sided dice) about 20 years ago but was disappointed with the results.
...

A manual-press RNG is one of the simplest and best systems: you just run a fast counter and grab the value when your finger releases the button (wheel-of-fortune style). If the counter is fast enough and spends the same time period on each digit, then you get a perfectly random result when the button releases. Semiconductor junction noise won't be as good as a captured counter.
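
A bare-bones sketch of that counter-capture idea in C (TIMER_VALUE() and BUTTON_DOWN() are placeholders for whatever the target PIC actually provides, so this is only an outline):

Code:
/* Wheel-of-fortune capture: a fast free-running counter is sampled at
 * the instant the button is released. TIMER_VALUE() and BUTTON_DOWN()
 * are hypothetical placeholders for the target hardware. */
#include <stdint.h>

extern uint16_t TIMER_VALUE(void);   /* hypothetical: fast free-running 16-bit timer */
extern int BUTTON_DOWN(void);        /* hypothetical: debounced button state         */

/* Returns a roll of an n-sided die, 1..n. Captured values in the top
 * "leftover" range are rejected (ask for another press) so every face
 * maps to the same number of counter values -- no modulo bias. */
uint16_t roll_die(uint16_t n)
{
    uint32_t limit = 65536UL - (65536UL % n);   /* reject captures at or above this */
    uint16_t v;

    do {
        while (!BUTTON_DOWN()) ;     /* wait for press   */
        while (BUTTON_DOWN())  ;     /* wait for release */
        v = TIMER_VALUE();           /* grab the counter */
    } while ((uint32_t)v >= limit);  /* rare: reject and wait for another press */

    return (uint16_t)(v % n) + 1;    /* 1..n */
}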

Alec_t said:
Is this the sort of thing?

It's only 4-bit, with each bit modulating the clock period, but demonstrates the principle.
I have a feeling (but not the maths to back it up) that the number sequence will repeat at some function of the lowest common multiple of the various oscillator periods; but perhaps that's not the case?

Cool! The first Chaoscillator simulation, thanks for that! :) The output looks somewhat chaotic, but I think there are not enough feedback paths. You have 4 free-running oscillators that have analogue feedback to the RC of one other oscillator. Maybe you could remove the digital flipflops and make the output digital from just one oscillator. Then re-route the feedback paths more circularly so all oscillators affect 2 or more others, and all are affected by at least one other oscillator.

Here's an XOR-shift LFSR from Wikipedia; you can see that data from the highest parts of the cycle is mixed and fed back into the start of the cycle. Something like that in analogue is what ()blivion and I have been suggesting as one Chaoscillator solution.

**broken link removed**
https://en.wikipedia.org/wiki/Linear_feedback_shift_register
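
For anyone who wants to play with the software version first, the 16-bit Galois LFSR example from that article is only a few lines of C:

Code:
/* 16-bit Galois LFSR, maximal-length (period 65535). The tap mask 0xB400
 * corresponds to the polynomial x^16 + x^14 + x^13 + x^11 + 1, essentially
 * the same worked example as on the Wikipedia page linked above. */
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint16_t lfsr = 0xACE1u;         /* any non-zero seed       */
    uint32_t period = 0;

    do {
        unsigned lsb = lfsr & 1u;    /* bit that 'falls out'    */
        lfsr >>= 1;                  /* shift the register      */
        if (lsb)
            lfsr ^= 0xB400u;         /* apply the feedback taps */
        period++;
    } while (lfsr != 0xACE1u);

    printf("period = %lu\n", (unsigned long)period);   /* 65535 for a maximal LFSR */
    return 0;
}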

MisterT said:
I don't think I can say much from that kind of comparison. The standard C++ library rand() is very likely a poor PRNG, optimized for speed, not quality. And your data set is still relatively small. I believe Pi is a very good sequence of random digits. At least it is extensively studied and "everybody" seems to agree that it is random and uniform.

Thanks for your input, MisterT. I was not planning on using Pi, even as a seed for other processes (especially now that I have hardware RNG data for seeds). ;) I was still surprised that converting the pure concept of Pi into base 10 did not seem to damage its entropy; I thought there would be some patterns introduced by the binary->base-10 conversion. It looks like another case of entropy + pattern conversion = entropy, I guess.

()blivion said:
LFSR PRNGs. (shrinking generator, alternating step generator.)

Fast, compact, power efficient, easy to implement in silicon, and random enough for almost everything other than academic and other highly strict niche applications. And if that's not good enough by itself, you can feed its output into a software PRNG algorithm.

By all means experiment with software RNGs if you like; I think everyone should code their own software RNG and test its results, it's a lot of fun! However, I would still like to focus more on simple, easy-to-build hardware RNGs in this thread if possible.

4pyros said:
And then there is this on the hardware side;
https://hackaday.com/2009/05/26/dice-o-matic/

Haha! This is just a little bit too much (7-foot-tall) "hardware" for me! It's a fun article for sure.

It gave me an idea for a mechanical RNG using a couple of doodad weights hanging on a short piece of fishing line; hit it with pulses of magnetic field to make it dance around and then capture some data from its movement. Imagine a little "executive's desktop toy" powered by a USB port and outputting quality RNG data over USB as it dances around. ;)
 
Maybe you could remove the digital flipflops and make the output digital from just one oscillator.
:confused: The best way I can think of to 'make the output digital' is sampling that one oscillator's output using a clock and a flip-flop/latch. For a useful stream of random data I think the clock period would need to be regular, not chaotic.
 
Hello,

For those who say pi is predictable, that's STILL ONLY if you know the starting point. Before you reply to this, find the sequence "159" and then tell me what the next digit is. Predictable? Then show me the next number and I'll tell you if it is the same as the one I have written down.

Here's a cute little PRNG I dug up on the web. It's so simple, but it's also so interesting for a number of reasons...

x=(3+x*27) mod 128

and start with x=7.

This is really cute, and it does work, but it has one serious flaw: it will not produce some numbers at all. For example, it will never produce 1 or 2.

To make up for this unfortunate shortcoming, I used a simple compression technique to squeeze the data into a neat space of 64. The nice thing about it now is that it produces numbers from 1 to 64 but never repeats a number until AFTER all 64 values 1 through 64 have been produced. That means we'll never see 1,4,4,5,6 because 4 would never repeat that soon. This also means the mean is exactly 31.5 (for y-1, i.e. the values 0 through 63) after every 64 generated numbers.

The simple algorithm with additional compression technique is still so simple:
x=(3+x*27) mod 128
if x AND 1 = 1 then
    a = -floor(x/2)
else
    a = 1 - floor(x/2)
end if
y=x+a

and now you can print either y or y-1 (y-1 to sum to get the mean).

Repeating the algorithm 64 times generates the values 1 through 64 in a perfectly even distribution (each value occurs only once).
Repeating the algorithm 640 times generates the values 1 through 64 exactly 10 times each (of course, because of the above).

This is what I call a 'well behaved' PRNG. It may or may not be something you would want to use, except for electronics testing.
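
In case anyone wants to try it, here is the recipe above written out in C, exactly as given (starting value x=7, plus the fold squeezing 0-127 down to 1-64):

Code:
/* The little PRNG above plus the fold-down to 1..64, written out in C.
 * Starting value x = 7, as given above. */
#include <stdio.h>

int main(void)
{
    int x = 7;

    for (int n = 0; n < 64; n++) {
        int a, y;
        x = (3 + x * 27) % 128;      /* the basic generator              */
        if (x & 1)                   /* the 'compression' step:          */
            a = -(x / 2);            /*   odd x  -> y = (x+1)/2          */
        else
            a = 1 - x / 2;           /*   even x -> y = x/2 + 1          */
        y = x + a;                   /* y is now in 1..64                */
        printf("%d ", y);            /* (print y-1 instead for 0..63)    */
    }
    printf("\n");
    return 0;
}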


My PHRNG (pseudo hardware random number generator) is simple enough...
Take the year, month, day, hour, min, sec, compress/expand it into the minimum space, then use it for the seed, also using the seconds digits + a constant for the number of times through the algorithm before the PR number is spit out. Use this seed and start-up for a regular shift-register type PRNG in code.
Total required hardware: Just the PC and an operating system.
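
Something along those lines in C, just to make it concrete (the seed packing below is an arbitrary choice of mine, and a plain xorshift stands in for the "regular shift register type PRNG"):

Code:
/* "Pseudo hardware" RNG: squeeze the current date/time into a seed, then
 * run an ordinary shift-register style PRNG (xorshift32 standing in here).
 * The exact packing and warm-up below are arbitrary illustrations. */
#include <stdint.h>
#include <stdio.h>
#include <time.h>

static uint32_t state;

static uint32_t xorshift32(void)
{
    state ^= state << 13;
    state ^= state >> 17;
    state ^= state << 5;
    return state;
}

int main(void)
{
    time_t now = time(NULL);
    struct tm *t = localtime(&now);

    /* pack year/month/day/hour/min/sec into one word for the seed */
    state = ((uint32_t)t->tm_year << 26) ^ ((uint32_t)t->tm_mon << 22) ^
            ((uint32_t)t->tm_mday << 17) ^ ((uint32_t)t->tm_hour << 12) ^
            ((uint32_t)t->tm_min  <<  6) ^  (uint32_t)t->tm_sec;
    if (state == 0) state = 1;           /* xorshift must not start at 0 */

    /* "seconds + constant" passes through the algorithm before output */
    for (int i = 0; i < t->tm_sec + 16; i++)
        xorshift32();

    for (int i = 0; i < 8; i++)
        printf("%08X\n", xorshift32());
    return 0;
}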
 
a fast counter and grab the value when your finger releases the button

20 years later it would also be capable of more than individual dice from individual pushbuttons; more like that plus 6d6 (the sum of six six-sided dice) as well as dice with 7, 11, and 13 sides that don't even exist in three dimensions. I'd suspect the program loop could end up routinely generating values separated by regularly defined intervals under those conditions.

Even 20 years ago it would've likely evolved into something more than just individual pushbuttons. I just lacked the funds to buy sufficient decoders and multiplexers and 7-segment LEDs and a breadboard big enough to hold them all, and, oh yeah, the PIC chip and programmer technically belonged to my employer.

Alas, if only I had $5 for every pseudo-project that was never taken any further. The reverse-biased base-emitter junction experiment is mostly curiosity and will likely qualify as yet another one if I ever get around to tinkering with it.
 
Alec_t said:
The best way I can think of to 'make the output digital' is sampling that one oscillator's output using a clock and a flip-flop/latch. For a useful stream of random data I think the clock period would need to be regular, not chaotic.

Sorry, I may have misunderstood your circuit. Basically, I would use a PIC to sample (at a set clock rate) the digital output of one of the oscillators. That sample should have some entropy component, but it can be whitened (to remove 0/1 bit bias) in software. The point I was trying to make is that rather than using 4 free-running stable oscillators that all affect just one other oscillator, I would arrange them in a circuit so each oscillator affects one or more others, making the relationship much more complex. :)
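
The whitening step can be as simple as von Neumann's pairing trick; a minimal sketch, where raw_bit() stands for whatever routine samples the oscillator:

Code:
/* Von Neumann de-biasing: take the raw bits in pairs, keep 01 -> 0 and
 * 10 -> 1, discard 00 and 11. This removes any constant 0/1 bias at the
 * cost of throwing away well over half the raw bits.
 * raw_bit() is a placeholder for whatever samples the oscillator. */
extern int raw_bit(void);        /* hypothetical: one raw sampled bit */

int whitened_bit(void)
{
    for (;;) {
        int a = raw_bit();
        int b = raw_bit();
        if (a != b)
            return a;            /* 10 -> 1, 01 -> 0 */
        /* 00 or 11: discard the pair and take another */
    }
}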

4pyros said:
So more good reading;
**broken link removed**

Thank you! Those are some awesome Chaoscillators, although he does seem to be optimising the design to keep the mathematical relationships, by keeping all the feedback analogue and unclipped between the integrators. I would prefer to use the analogue waves to trigger digital transitions so it smashes the mathematical relationships and produces something with a lot less patterning and a lot more entropy.

He does have a more complex circuit, a "Cyclic chaos system" with 4 integrators, that seems to be getting close to good entropy even without using digital logic to break the patterns;
**broken link removed**

So an analogue-digital Chaoscillator looks very possible for sure. :)

And his method of using the 'scope to XY-plot two different points in the circuit is a fantastic idea for displaying the "chaos" in the circuit. I really appreciate that link and will try that 'scope method of displaying the chaos; even with one axis on a fixed sine wave and the other on the circuit output, it would give a good idea of the output result sampled at any fixed frequency.

()blivion said:
I now hold in my hand 1 µCi of Americium-241.
... Now if only I had a Geiger-Müller tube...

You may not need one. Elektor magazine did a radiation detector using just a semiconductor device, I think it was a photodiode. They got some usable results.
 
MrAl said:
...
Here's a cute little PRNG I dug up on the web. It's so simple, but it's also so interesting for a number of reasons...

x=(3+x*27) mod 128

and start with x=7. ...

Hmmm, I ran some tests, MrAl, and your cute RNG looks very broken. I used x=(3+(x*27)) mod 128, which I assume is the correct algorithm?

I generated and tested 1 MB of data, and the data is just a fixed string of 64 bytes that repeats constantly; there is no entropy over time. The first two pictures below show the output as stats and as raw HEX data. In the HEX file you can see the 64-byte repeating string. The stats show an exactly equal count for all 64 byte values (which you would expect ;)), and only 64 of the 128 possible byte values are represented, in an even pattern with no 2,3 or 6,7 or 10,11 etc.

Then I added your compression code as you typed it; it does indeed compress the byte values 0-127 into 64 values (though they are 1-64 and not 0-63), however there is still only a single repeating constant string of 64 bytes. See the second two pictures. If I have messed up the algorithm somehow I apologise in advance, but at first glance it looks like a broken RNG. :(

As for Pi repeating, you are correct that it would be hard to locate a 3-digit decimal string, as any given one occurs approximately once every 1000 digits. But when breaking RNGs, attackers often acquire a string of thousands of digits and locate it within all the possible output data. Even a simple string of 20 digits (like from playing 10 or so Blackjack hands) would be enough data to find a location in Pi quite easily.
 
As for Pi repeating, you are correct that it would be hard to locate a 3-digit decimal string, as any given one occurs approximately once every 1000 digits. But when breaking RNGs, attackers often acquire a string of thousands of digits and locate it within all the possible output data. Even a simple string of 20 digits (like from playing 10 Blackjack hands) would be enough data to find a location in Pi quite easily.

Also, speaking strictly of use in cryptology, it's less about finding the EXACT point and more about narrowing your search area enough to make a computer-assisted brute-force attack possible. For example, say that as attackers we need to find the digits around 159 to break your (MrAl's) encryption. Well, it may not be the first occurrence of it, 3.14>159<2654... but I can still say for certain that it's not going to be in the next 738 digits that follow. So if I keep that up, it goes...

3.14>159<265
or...
418>159<813
731>159<562
910>159<195
Et Cetera...

The point is, by knowing that your three numbers are for sure "159" and that you are for sure using Pi, I have narrowed a brute force of what would be 1 million places in Pi down to just 1012 possibilities for that same space (the number of times the sequence 159 appears in the first 1 million digits of Pi). If that trend holds for all of Pi, then I have reduced the search space by roughly 1,000-fold.

A significant flaw for security needs.
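
The narrowing is easy to do mechanically too; a small C sketch that scans a file of decimal digits (the file name is just a placeholder) and lists every offset where "159" occurs, i.e. the candidate list you would then brute-force:

Code:
/* List every position of the digit string "159" in a file of decimal
 * digits. "pi_digits.txt" is only a placeholder name; positions are
 * counted over all digits in the file, skipping '.' and whitespace. */
#include <stdio.h>

int main(void)
{
    FILE *f = fopen("pi_digits.txt", "rb");
    if (!f) { perror("open"); return 1; }

    long pos = 0, hits = 0;
    int a = 0, b = 0, c;
    while ((c = fgetc(f)) != EOF) {
        if (c < '0' || c > '9') continue;        /* skip "3." and line breaks */
        pos++;
        if (a == '1' && b == '5' && c == '9') {
            printf("match ending at digit %ld\n", pos);
            hits++;
        }
        a = b;
        b = c;
    }
    fclose(f);
    printf("%ld candidate positions\n", hits);
    return 0;
}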
 
I would arrange them in a circuit so each oscillator affects one or more others, making the relationship much more complex.
I too would imagine that increases the entropy; but I have a nagging suspicion there could also be a tendency to pull the oscillators into sync, which would reduce the entropy.
 
