Thursday, 02 May 2013

SPECTRUM ANALYZER; Chapter 5 Sensitivity and Noise


Sensitivity
One of the primary uses of a spectrum analyzer is to search out and measure low-level signals. The limitation in these measurements is the noise generated within the spectrum analyzer itself. This noise, generated by the random electron motion in various circuit elements, is amplified by multiple gain stages in the analyzer and appears on the display as a noise signal. On a spectrum analyzer, this noise is commonly referred to as the Displayed Average Noise Level, or DANL.¹ While there are techniques to measure signals slightly below the DANL, this noise power ultimately limits our ability to make measurements of low-level signals.

Let's assume that a 50 ohm termination is attached to the spectrum analyzer input to prevent any unwanted signals from entering the analyzer. This passive termination generates a small amount of noise energy equal to kTB, where:
k = Boltzmann's constant (1.38 x 10^-23 joule/K)
T = temperature, in kelvin
B = bandwidth in which the noise is measured, in hertz

Since the total noise power is a function of measurement bandwidth, the value is typically normalized to a 1 Hz bandwidth. Therefore, at room temperature, the noise power density is -174 dBm/Hz. When this noise reaches the first gain stage in the analyzer, the amplifier boosts the noise, plus adds some of its own. As the noise signal passes on through the system, it is typically high enough in amplitude that the noise generated in subsequent gain stages adds only a small amount to the total noise power. Note that the input attenuator and one or more mixers may be between the input connector of a spectrum analyzer and the first stage of gain, and all of these components generate noise. However, the noise they generate is at or near the absolute minimum of -174 dBm/Hz, so they do not significantly affect the noise level input to, and amplified by, the first gain stage.
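As a quick sanity check on that -174 dBm/Hz figure, here is a minimal sketch of the kTB calculation normalized to a 1 Hz bandwidth, assuming the conventional reference temperature of 290 K:

import math

k = 1.38e-23   # Boltzmann's constant, joule/K
T = 290.0      # assumed room temperature, kelvin
B = 1.0        # 1 Hz normalization bandwidth

noise_power_watts = k * T * B
noise_power_dbm = 10 * math.log10(noise_power_watts / 1e-3)
print(f"kTB noise density: {noise_power_dbm:.1f} dBm/Hz")   # about -174.0 dBm/Hz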

While the input attenuator, mixer, and other circuit elements between the input connector and first gain stage have little effect on the actual system noise, they do have a marked effect on the ability of an analyzer to display low-level signals because they attenuate the input signal. That is, they reduce the signal-to-noise ratio and so degrade sensitivity. 

We can determine the DANL simply by noting the noise level indicated on the display when the spectrum analyzer input is terminated with a 50 ohm load. This level is the spectrum analyzer's own noise floor. Signals below this level are masked by the noise and cannot be seen. However, the DANL is not the actual noise level at the input, but rather the effective noise level. An analyzer display is calibrated to reflect the level of a signal at the analyzer input, so the displayed noise floor represents a fictitious, or effective noise floor at the input. 

The actual noise level at the input is a function of the input signal. Indeed, noise is sometimes the signal of interest. Like any discrete signal, a noise signal is much easier to measure when it is well above the effective (displayed) noise floor. The effective input noise floor includes the losses caused by the input attenuator, mixer conversion loss, and other circuit elements prior to the first gain stage. We cannot do anything about the conversion loss of the mixers, but we can change the RF input attenuator. This enables us to control the input signal power to the first mixer and thus change the displayed signal-to-noise-floor ratio. Clearly, we get the lowest DANL by selecting minimum (zero) RF attenuation.

1. Displayed average noise level is sometimes confused with the term sensitivity. While related, these terms have different meanings. Sensitivity is a measure of the minimum signal level that yields a defined signal-to-noise ratio (SNR) or bit error rate (BER). It is a common metric of radio receiver performance. Spectrum analyzer specifications are always given in terms of the DANL.

Because the input attenuator has no effect on the actual noise generated in the system, some early spectrum analyzers simply left the displayed noise at the same position on the display regardless of the input attenuator setting. That is, the IF gain remained constant. This being the case, the input attenuator affected the location of a true input signal on the display. As input attenuation was increased, further attenuating the input signal, the location of the signal on the display went down while the noise remained stationary.

Beginning in the late 1970s, spectrum analyzer designers took a different approach. In newer analyzers, an internal microprocessor changes the IF gain to offset changes in the input attenuator. Thus, signals present at the analyzer's input remain stationary on the display as we change the input attenuator, while the displayed noise moves up and down. In this case, the reference level remains unchanged. This is shown in Figure 5-1. As the attenuation increases from 5 to 15 to 25 dB, the displayed noise rises while the -30 dBm signal remains constant. In either case, we get the best signal-to-noise ratio by selecting minimum input attenuation.


Figure 5-1. Reference level remains constant when changing input attenuation 
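The coupling described above is easy to sketch numerically. In the following illustration (the DANL value is hypothetical; the signal level is the one from Figure 5-1), the displayed signal stays put because the IF gain is raised to offset each attenuation change, while the displayed noise rises dB-for-dB with the attenuator setting:

signal_at_input_dbm = -30.0       # CW signal level at the analyzer input
danl_at_0dB_atten_dbm = -115.0    # hypothetical displayed noise with 0 dB attenuation

for atten_db in (5, 15, 25):
    # The IF gain is increased by the same amount as the attenuation, so the
    # displayed signal does not move ...
    displayed_signal_dbm = signal_at_input_dbm
    # ... but the noise generated after the attenuator, re-amplified by that
    # extra IF gain, rises dB-for-dB with the attenuator setting.
    displayed_noise_dbm = danl_at_0dB_atten_dbm + atten_db
    print(f"{atten_db:>2} dB attenuation: signal {displayed_signal_dbm} dBm, "
          f"displayed noise {displayed_noise_dbm} dBm")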

Resolution bandwidth also affects signal-to-noise ratio, or sensitivity. The noise generated in the analyzer is random and has a constant amplitude over a wide frequency range. Since the resolution, or IF, bandwidth filters come after the first gain stage, the total noise power that passes through the filters is determined by the width of the filters. This noise signal is detected and ultimately reaches the display. The random nature of the noise signal causes the displayed level to vary as:

10 log (BW2 / BW1)

where BW1 is the starting resolution bandwidth and BW2 is the ending resolution bandwidth.

So if we change the resolution bandwidth by a factor of 10, the displayed noise level changes by 10 dB, as shown in Figure 5-2. For continuous wave (CW) signals, we get best signal-to-noise ratio, or best sensitivity, using the minimum resolution bandwidth available in our spectrum analyzer.²


Figure 5-2. Displayed noise level changes as 10 log (BW2 / BW1) 
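A short sketch of the relationship in Figure 5-2, using illustrative bandwidth values:

import math

def displayed_noise_change_db(bw1_hz, bw2_hz):
    # Change in displayed average noise level when the RBW goes from bw1 to bw2.
    return 10 * math.log10(bw2_hz / bw1_hz)

print(displayed_noise_change_db(100e3, 10e3))   # -10 dB: a 10x narrower RBW
print(displayed_noise_change_db(100e3, 1e3))    # -20 dB: a 100x narrower RBW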

A spectrum analyzer displays signal plus noise, and a low signal-to-noise ratio makes the signal difficult to distinguish. We noted previously that the video filter can be used to reduce the amplitude fluctuations of noisy signals while at the same time having no effect on constant signals. Figure 5-3 shows how the video filter can improve our ability to discern low-level signals. It should be noted that the video filter does not affect the average noise level and so does not, by this definition, affect the sensitivity of an analyzer. 
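As a rough statistical illustration of that point (not a model of any particular detector), averaging a noisy trace, which is essentially what a narrow video bandwidth does, shrinks the fluctuations around the noise floor without changing its average value:

import random, statistics

random.seed(1)
# Fluctuating detected noise samples, in arbitrary units around the noise floor.
trace = [random.gauss(0.0, 1.0) for _ in range(10_000)]

# Narrowing the video bandwidth behaves like averaging adjacent trace points.
smoothed = [statistics.fmean(trace[i:i + 50]) for i in range(0, len(trace), 50)]

print(statistics.fmean(trace), statistics.pstdev(trace))        # average ~0, large fluctuation
print(statistics.fmean(smoothed), statistics.pstdev(smoothed))  # same average, much smaller fluctuation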

In summary, we get best sensitivity for narrowband signals by selecting the minimum resolution bandwidth and minimum input attenuation. These settings give us best signal-to-noise ratio. We can also select minimum video bandwidth to help us see a signal at or close to the noise level.³ Of course, selecting narrow resolution and video bandwidths does lengthen the sweep time.
Figure 5-3. Video filtering makes low-level signals more discernible

2. Broadband, pulsed signals can exhibit the opposite behavior, where the SNR increases as the bandwidth gets larger.
3. For the effect of noise on accuracy, see "Dynamic range versus measurement uncertainty" in Chapter 6.






