Abstract

The most important measure for error detection is the Hamming distance: the number of bit positions in which two code words differ, i.e. the number of transmitted bits that must change for one code word to be received as another. Adding more check bits allows a greater Hamming distance, and the objective of a good error-detecting code is to maximize the minimum Hamming distance between any pair of code words, denoted d(C). For example, a code with a minimum Hamming distance of 1 cannot be used to detect errors, because a single error in one bit can turn a valid code word into another valid code word. A minimum Hamming distance of 2 allows one error to be detected. In general, a code C can detect up to N errors in any code word if d(C) is greater than or equal to N + 1 (i.e. d(C) ≥ N + 1).
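The relationship between d(C) and detectable errors can be illustrated with a minimal sketch in Python. The function names and the even-parity example code below are illustrative choices, not taken from the paper:

```python
from itertools import combinations

def hamming_distance(a: int, b: int) -> int:
    """Number of bit positions in which two code words differ."""
    return bin(a ^ b).count("1")

def min_distance(codewords) -> int:
    """Minimum pairwise Hamming distance d(C) over a code."""
    return min(hamming_distance(a, b) for a, b in combinations(codewords, 2))

# Example: 3 data bits plus one even-parity bit gives 4-bit code words.
# Every pair of valid code words differs in at least 2 bits, so d(C) = 2
# and the code can detect up to d(C) - 1 = 1 error.
code = [0b0000, 0b0011, 0b0101, 0b0110, 0b1001, 0b1010, 0b1100, 0b1111]
```

Flipping any single bit of a valid code word in this example yields a word of odd parity, which is not in the code, so the error is detected; flipping two bits can land on another valid code word, which is why two errors are not guaranteed to be detected.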
