José Fonseca, MSc.


A Hardware Pseudo-Random Number Generator for FPGA



During my Master's thesis, I wondered whether I could perform on-chip training of my LSTM neural network. Doing this with backpropagation on an FPGA is a very difficult task, since it involves an explicit calculation of the gradients. Instead, we decided to apply a statistical method that estimates the gradients rather than computing them: the Simultaneous Perturbation Stochastic Approximation (SPSA) method.
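To illustrate the idea behind SPSA, here is a minimal sketch of its gradient estimate. All names, the perturbation magnitude `c`, and the toy quadratic loss are my own illustrative choices, not details from the thesis; the key point is that the whole gradient vector is estimated from only two loss evaluations, no matter how many parameters there are.

```python
import random

def spsa_gradient(loss, theta, c=0.1):
    """Estimate the gradient of `loss` at `theta` with two evaluations.

    Every component of the perturbation is an independent random +/-1,
    so a single pair of evaluations perturbs all parameters at once
    (hence "simultaneous perturbation").
    """
    delta = [random.choice((-1.0, 1.0)) for _ in theta]
    theta_plus = [t + c * d for t, d in zip(theta, delta)]
    theta_minus = [t - c * d for t, d in zip(theta, delta)]
    diff = loss(theta_plus) - loss(theta_minus)
    # Central difference along the random direction, component by component.
    return [diff / (2.0 * c * d) for d in delta]

# Toy example: for loss(theta) = sum(t^2), the true gradient is 2*theta.
# A single SPSA estimate is noisy, but it is unbiased: averaging many
# estimates converges to the true gradient.
loss = lambda th: sum(t * t for t in th)
grad = spsa_gradient(loss, [1.0, -2.0, 3.0])
```

This two-evaluation property is what makes the method attractive in hardware, where each extra forward pass is expensive.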


To that end, I needed to generate a stream of random weight values, so I implemented a hardware Pseudo-Random Number Generator (PRNG) using an approach inspired by this article. The algorithm's main idea is to run two different PRNGs in parallel (a Cellular Automaton and an LFSR) and combine their outputs with a bitwise XOR operation.
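The scheme can be sketched in a few lines of software. The register width, LFSR taps, CA rule, and seeds below are illustrative assumptions of mine; the original article and my HDL implementation may use different parameters.

```python
WIDTH = 16  # assumed register width, for illustration only

def lfsr_step(state):
    """One step of a 16-bit Fibonacci LFSR (taps 16, 14, 13, 11,
    a maximal-length polynomial)."""
    bit = (state ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
    return (state >> 1) | (bit << (WIDTH - 1))

def ca_step(state):
    """One step of a Rule-30 cellular automaton on a circular register."""
    new = 0
    for i in range(WIDTH):
        left = (state >> ((i + 1) % WIDTH)) & 1
        center = (state >> i) & 1
        right = (state >> ((i - 1) % WIDTH)) & 1
        # Rule 30: next cell = left XOR (center OR right)
        if left ^ (center | right):
            new |= 1 << i
    return new

def prng(seed_lfsr=0xACE1, seed_ca=0x0001):
    """Yield WIDTH-bit words: the XOR of both generators run in lockstep."""
    l, c = seed_lfsr, seed_ca
    while True:
        l, c = lfsr_step(l), ca_step(c)
        yield l ^ c
```

In hardware, both generators advance once per clock cycle and the XOR is a single layer of gates, so one fresh word is produced every cycle.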

Of course, the outputs are not truly random numbers, but the pseudo-random sequence has good statistical properties and, according to the original authors, passes the DIEHARD battery of tests.



Figure 1 - Output of the Python script that implements this algorithm

The previous figure has three columns: the leftmost depicts the output of the LFSR, the center one the Cellular Automaton, and the rightmost the combined output. Each row represents one iteration of each PRNG, with white and black representing zeroes and ones, respectively.

It can be clearly seen that the LFSR presents its typical oblique stripes, a result of the horizontal shifting, while the Cellular Automaton presents a few clusters of similar values. Their XOR combination, on the other hand, visually resembles the noise we used to see whenever our old TVs were not properly tuned! Due to time restrictions, no exhaustive statistical quality tests were performed; since the original authors had already run DIEHARD, repeating it would be redundant.
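That said, a lightweight sanity check is cheap to run. Below is a sketch of the NIST SP 800-22 frequency (monobit) test, which checks whether ones and zeroes are equally likely in a bit stream; this was not part of the thesis, just an example of the kind of quick check one could apply to the generator's output.

```python
import math

def monobit_pvalue(bits):
    """NIST SP 800-22 frequency (monobit) test.

    Returns the p-value for the null hypothesis that the stream is
    balanced; p-values above ~0.01 are consistent with randomness
    (for this one property only).
    """
    n = len(bits)
    # Map 0/1 to -1/+1 and take the absolute partial sum.
    s = abs(sum(1 if b else -1 for b in bits))
    return math.erfc(s / math.sqrt(2.0 * n))
```

A perfectly balanced stream yields a p-value of 1.0, while a heavily biased one drives it toward zero. Passing this test is of course far weaker evidence than the full DIEHARD battery.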



The HDL code can be downloaded from this link. The contents are organized in the following fashion: