I read the following: "The BER is an indication of how often a packet or other data unit has to be retransmitted because of an error."
But the opponent at my bachelor thesis defense asked me whether this is really true of BER.
I didn't know what to tell him, since I read it somewhere and didn't just make it up myself.
So, is this statement true?
I am used to working with BER in the neighborhood of: 1e-4 is lousy, 1e-6 is decent, 1e-8 is pretty good. I have never seen it expressed as a percentage. 1% would be awful!
Thanks for the quick reply,
but it doesn't answer my question: is the following statement correct? "The BER is an indication of how often a packet or other data unit has to be retransmitted because of an error."
A higher BER will require more retransmissions. Exactly how many more depends on the packet size, the error correction used, and the nature of the errors. Bursts of errors are often handled differently than truly random errors.
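To make the dependence on packet size concrete, here is a minimal sketch (my own illustration, not from the thread) assuming independent random bit errors, no forward error correction, and retransmission until a packet arrives intact:

```python
def packet_error_rate(ber: float, bits_per_packet: int) -> float:
    """Probability that at least one bit in the packet is corrupted,
    assuming each bit flips independently with probability `ber`."""
    return 1.0 - (1.0 - ber) ** bits_per_packet

def expected_transmissions(ber: float, bits_per_packet: int) -> float:
    """Mean number of sends until one succeeds (geometric distribution)."""
    return 1.0 / (1.0 - packet_error_rate(ber, bits_per_packet))

# Example: a 1500-byte (12000-bit) packet at various bit error rates.
for ber in (1e-4, 1e-6, 1e-8):
    n = 12000
    print(f"BER={ber:g}: PER={packet_error_rate(ber, n):.4f}, "
          f"avg sends={expected_transmissions(ber, n):.3f}")
```

Under these assumptions, a BER of 1e-4 already forces most 1500-byte packets to be resent, while at 1e-8 retransmissions are negligible, which is why the same BER can mean very different retransmission rates for different packet sizes.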
BER gives you the raw bit error rate, not the retransmission rate.