Sumit Prasad
New Member
Hi All,
I'm currently working on a project where I'm trying to minimize the effect of discontinuities in input signal samples (the effect known as spectral leakage). From my understanding of the domain and the literature I've studied, the majority of signal analyzers use window functions to minimize this discontinuity effect.
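To illustrate the effect I'm talking about, here is a minimal NumPy sketch (the sample rate, the tone frequency, and the choice of a Hann window are just illustrative assumptions on my part, not anything from a real analyzer):

```python
import numpy as np

fs = 1000.0            # sample rate in Hz, illustrative only
n = 1024               # FFT length
t = np.arange(n) / fs

# A tone that does not land exactly on an FFT bin, so the implicit
# rectangular window produces visible spectral leakage.
f0 = 123.4
x = np.sin(2 * np.pi * f0 * t)

spec_rect = np.abs(np.fft.rfft(x))                  # no window
spec_hann = np.abs(np.fft.rfft(x * np.hanning(n)))  # Hann window

# Leakage shows up as energy far from the tone's bin; the Hann window
# suppresses these sidelobes at the cost of a wider main lobe.
freqs = np.fft.rfftfreq(n, 1 / fs)
far = np.abs(freqs - f0) > 50   # bins well away from the tone
print("far-from-tone energy, rectangular:", spec_rect[far].sum())
print("far-from-tone energy, Hann:       ", spec_hann[far].sum())
```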
However, I was thinking of achieving the same thing using a reference signal at the time of reception. When signal samples are received, we can incorporate a reference signal that maintains phase coherence with the input samples. Whenever a discontinuity is observed, a new reference signal is generated (or, rather, the existing reference signal is readjusted) based on parameters such as the number of missed samples and the frequency of the reference signal. The adjustment is to the initial phase of the reference signal. Phase coherence is thus maintained, and the effect of the discontinuity should be removed.
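To make the idea concrete, here is a toy sketch of the readjustment step as I imagine it (the reference frequency, the number of missed samples, and the reference_block helper are all hypothetical, for illustration only):

```python
import numpy as np

fs = 1000.0    # sample rate in Hz, illustrative only
f_ref = 50.0   # reference signal frequency in Hz, illustrative only

def reference_block(n, phase0):
    """Generate n reference samples starting at initial phase phase0;
    also return the phase at which the next block should continue."""
    k = np.arange(n)
    ref = np.sin(2 * np.pi * f_ref * k / fs + phase0)
    return ref, phase0 + 2 * np.pi * f_ref * n / fs

phase = 0.0
block1, phase = reference_block(256, phase)

# Suppose 10 samples are missed (a discontinuity in the input).
# Advance the reference's initial phase by the phase the signal would
# have accumulated over those missed samples, so the next reference
# block stays phase coherent with the input.
missed = 10
phase += 2 * np.pi * f_ref * missed / fs

block2, phase = reference_block(256, phase)
```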
I'm trying to simulate a circuit based on the above concept; however, I was wondering why this concept is not implemented in commercial spectrum analyzers.
Is there any technical limitation?
If not, can anyone point me to such analyzers where I can study the leakage effect?
Thanks in anticipation.
Kind regards,
Sumit