We hardware engineers prefer to test our control and analysis programs on live hardware, but sometimes that is not possible. Then we need to write programs that simulate our input or output signals and test with software until we can get our actual hardware back (this is code for: "pry it out of the hands of the software application developers").
Recently I have been working on FFT processing of RF digitizers, and I don't always have access to working hardware. It is easy to generate sine waves and two-tone signals in C#, but it is not so easy to generate simulated noise.
I needed a routine for Gaussian random numbers to represent ADC noise in my FFT and windowing testing. So I came up with this little routine based on a classic technique, the polar method, developed by Dr. Marsaglia and published in 1962. This implementation brings the classic Fortran work to C# and .NET.
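For readers who have not seen it, the polar method itself is quite compact. The sketch below is my own reconstruction of the core transform, not the posted routine; the class and method names are made up for illustration. It draws uniform points inside the unit circle and converts each accepted pair into two independent standard-normal values.

```csharp
using System;

public static class PolarDemo
{
    static readonly Random Rng = new Random();

    // Marsaglia's polar method: returns a pair of independent
    // standard-normal (mean 0, variance 1) random values.
    public static (double, double) NextGaussianPair()
    {
        double u, v, s;
        do
        {
            u = 2.0 * Rng.NextDouble() - 1.0;   // uniform on (-1, 1)
            v = 2.0 * Rng.NextDouble() - 1.0;
            s = u * u + v * v;
        } while (s >= 1.0 || s == 0.0);         // accept only points inside the unit circle

        double f = Math.Sqrt(-2.0 * Math.Log(s) / s);
        return (u * f, v * f);
    }

    public static void Main()
    {
        var (a, b) = NextGaussianPair();
        Console.WriteLine($"{a} {b}");
    }
}
```

Roughly 21% of candidate pairs are rejected (1 - pi/4), but unlike the basic Box-Muller transform no trigonometric calls are needed, which is part of the method's appeal.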
The routine accepts these input parameters:
Int32 size => The size of the data array to produce.
double scaling_vrms => The magnitude of the noise in Vrms in the frequency domain, scaled for the DFT size.
The routine scales the noise so that if you transform it with a standard DFT or FFT routine, the noise will always come out at the requested spectral value regardless of the size of the DFT (or its equivalent noise bandwidth). You can strip this scaling out if you wish.
For instance, if you request 1 uV RMS of noise with either 1,000 or 10,000 data points, the routine scales the output so that both DFT sizes show 1 uV RMS of noise. This scaling assumes a 1/N-scaled DFT or FFT (most are) and saves you from having to calculate the noise bandwidth for a given DFT size.
The routine also guarantees that there is no DC offset by subtracting the average of all the random values near the end of the routine.
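Putting the pieces together, the steps above can be sketched as follows. This is a reconstruction under stated assumptions, not the posted code: the method name `GaussianNoise` is made up, and the scaling expression is my own derivation. For white noise viewed through a single-sided, 1/N-scaled DFT, each bin shows sqrt(2) * sigma / sqrt(N) in RMS terms, so hitting the requested bin value takes a time-domain RMS of scaling_vrms * sqrt(N/2); that is where the Sqrt(2) factor mentioned in the note at the end appears.

```csharp
using System;
using System.Linq;

public static class NoiseGen
{
    static readonly Random Rng = new Random();

    // Sketch of the Gaussian noise routine described above (my
    // reconstruction; names and scaling expression are assumptions).
    // size         => number of points to produce
    // scaling_vrms => desired per-bin noise in Vrms for a single-sided,
    //                 1/N-scaled DFT of the output
    public static double[] GaussianNoise(int size, double scaling_vrms)
    {
        // Time-domain RMS needed so a 1/N-scaled, single-sided DFT
        // shows scaling_vrms in each bin, independent of DFT size.
        double sigma = scaling_vrms * Math.Sqrt(size / 2.0);
        double[] data = new double[size];

        for (int i = 0; i < size; i += 2)
        {
            // Marsaglia's polar method: two independent normals per pass.
            double u, v, s;
            do
            {
                u = 2.0 * Rng.NextDouble() - 1.0;
                v = 2.0 * Rng.NextDouble() - 1.0;
                s = u * u + v * v;
            } while (s >= 1.0 || s == 0.0);

            double f = Math.Sqrt(-2.0 * Math.Log(s) / s);
            data[i] = sigma * u * f;
            if (i + 1 < size)
                data[i + 1] = sigma * v * f;
        }

        // Guarantee zero DC offset by subtracting the sample mean.
        double mean = data.Average();
        for (int i = 0; i < size; i++)
            data[i] -= mean;

        return data;
    }
}
```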
Figure 1: A histogram of 10,000 numbers generated by the Gaussian Noise routine. As can be seen, the generated noise is decently Gaussian.
Figure 2: 1,000 generated points when the requested value was 1 uV RMS. Note that the scaling factor is for the frequency domain. The Gaussian Noise output looks just like real noise.
Figure 3: The results of a non-windowed 100,000-point DFT. The requested scale factor was 1 uV RMS, and the output is spot on. You can also see that the spectral density is constant, so the noise is not only Gaussian but also white (flat with frequency).
Figure 4: With the Gaussian Noise routine, it is a simple matter to produce any desired signal-to-noise ratio with a single call to generate some noise for any signal. This is the result of a 100,000-point DFT of a 1 volt RMS, C#-generated sine wave with 100,000 points of 1 mV RMS noise added. This creates a simulated 60 dB signal-to-noise ratio with very little effort, letting me generate very lifelike signals for testing even without the actual hardware.
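The 60 dB figure follows directly from the RMS voltages, since SNR in dB is 20 * log10(Vsignal / Vnoise). A small sketch of the arithmetic (the values match the Figure 4 setup; the class name is made up):

```csharp
using System;

public static class SnrDemo
{
    public static void Main()
    {
        // Figure 4 setup: a 1 V RMS sine (peak amplitude 1 * sqrt(2),
        // about 1.414 V) with 1 mV RMS of added noise.
        double signalVrms = 1.0;
        double noiseVrms = 0.001;

        // Signal-to-noise ratio in dB from the two RMS voltages:
        double snrDb = 20.0 * Math.Log10(signalVrms / noiseVrms);
        Console.WriteLine($"SNR = {snrDb:F1} dB");   // prints "SNR = 60.0 dB"
    }
}
```

To hit a different SNR, just pick the noise RMS accordingly: for example, 10 uV RMS of noise against the same 1 V RMS sine gives 100 dB.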
Now everyone is happy (well, hopefully) - I have lifelike signals to use in my calibration and test software development, and the application developers get to spend a few days with the actual hardware. There just never seems to be enough actual hardware to go around.
I hope you find the routine and the testing results useful.
Note: An earlier version of the posted code had the wrong scale factor. The actual scale factor is Sqrt(2).