I have been going through some graphs, and I have a question about the ones that show Error vs. Eb/N0.
Now, Eb is the energy per bit, and Error is the number of errors. What is N0? Is it Gaussian white noise of zero mean? And why is the Eb/N0 axis measured in dB? What is the reference here?
For example, the error probability graph for NRZ digital data looks like the frequency response of a low-pass filter, but I just cannot understand it very well. Does that mean that if my bit has more energy, there is less probability of an error? Why? (An image of the curve was shown here.)
So is N0 constant?
Does this graph give me any information about the bit rate? For example, if I have a faster bit rate, will I have more errors?
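To make my question concrete: I believe (this is my assumption, not something stated on the graph) that the theoretical curve for polar NRZ / BPSK over an AWGN channel is Pe = Q(sqrt(2·Eb/N0)). Here is a quick sketch of what I think the graph is plotting, showing Pe dropping as Eb/N0 increases:

```python
import math

def ber_bpsk(ebno_db: float) -> float:
    """Assumed bit-error probability for polar NRZ / BPSK over AWGN:
    Pe = Q(sqrt(2 * Eb/N0)), where Q(x) = 0.5 * erfc(x / sqrt(2))."""
    ebno_linear = 10 ** (ebno_db / 10)   # convert the dB value back to a ratio
    # Q(sqrt(2 * ebno_linear)) simplifies to 0.5 * erfc(sqrt(ebno_linear))
    return 0.5 * math.erfc(math.sqrt(ebno_linear))

# Pe falls off very steeply as Eb/N0 grows, which matches the
# "waterfall" shape I am asking about.
for db in (0, 4, 8, 12):
    print(f"Eb/N0 = {db:2d} dB -> Pe = {ber_bpsk(db):.2e}")
```

Is this the relationship the graph is showing, and is that why more energy per bit means fewer errors?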