{{Short description|Number of bits that differ between two strings}}
{{Use American English|date=March 2019}}
{{more inline|date=May 2015}}
{{Infobox algorithm
|name = Hamming distance
|class = [[String metric|String similarity]]
|image = {{image array
 | perrow = 2
 | width = 150
 | image1 = Hamming distance 4 bit binary.svg
 | alt1 = 4-bit binary tesseract
 | caption1 = 4-bit binary [[tesseract]] for finding Hamming distance.
 | image2 = Hamming distance 4 bit binary example.svg
 | alt2 = 4-bit binary tesseract Hamming distance examples
 | caption2 = Two example distances: {{font color|red|0100→1001}} has distance 3; {{font color|blue|0110→1110}} has distance 1
}}
|caption =
|data = [[String (computer science)|string]]
|time = <math>O(n)</math>
|best-time = <math>O(n)</math>
|average-time = <math>O(n)</math>
|space = <math>O(n)</math>
}}
{{multiple image
 | width = 150
 | image1 = Hamming distance 3 bit binary.svg
 | alt1 = 3-bit binary cube
 | caption1 = 3-bit binary [[cube]] for finding Hamming distance
 | image2 = Hamming distance 3 bit binary example.svg
 | alt2 = 3-bit binary cube Hamming distance examples
 | caption2 = Two example distances: {{font color|red|100→011}} has distance 3; {{font color|blue|010→111}} has distance 2
 | footer = The minimum distance between any two vertices is the Hamming distance between the two binary strings.
}}
In [[information theory]], the '''Hamming distance''' between two [[String (computer science)|strings]] or vectors of equal length is the number of positions at which the corresponding [[symbol]]s are different. In other words, it measures the minimum number of ''substitutions'' required to change one string into the other, or equivalently, the minimum number of ''errors'' that could have transformed one string into the other. In a more general context, the Hamming distance is one of several [[string metric]]s for measuring the [[edit distance]] between two sequences. It is named after the American mathematician [[Richard Hamming]].

A major application is in [[coding theory]], more specifically to [[block code]]s, in which the equal-length strings are [[Vector space|vectors]] over a [[finite field]].

== Definition ==
The Hamming distance between two equal-length strings of symbols is the number of positions at which the corresponding symbols are different.<ref>{{cite book |last1=Waggener |first1=Bill |title=Pulse Code Modulation Techniques |date=1995 |publisher=Springer |isbn=978-0-442-01436-0 |page=206 |url=https://books.google.com/books?id=8l_o6kI3760C&pg=PA206 |access-date=13 June 2020}}</ref>

== Examples ==
The symbols may be letters, bits, or decimal digits, among other possibilities. For example, the Hamming distance between:
* "'''ka{{red|rol}}in'''" and "'''ka{{red|thr}}in'''" is 3.
* "'''k{{red|a}}r{{red|ol}}in'''" and "'''k{{red|e}}r{{red|st}}in'''" is 3.
* "'''k{{red|athr}}in'''" and "'''k{{red|erst}}in'''" is 4.
* '''{{red|0000}}''' and '''{{red|1111}}''' is 4.
* '''2{{red|17}}3{{red|8}}96''' and '''2{{red|23}}3{{red|7}}96''' is 3.
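Each of these distances can be checked by counting mismatched positions; a minimal Python check (full implementations appear in the [[#Algorithm example|algorithm example]] section below):

<syntaxhighlight lang="python">
# Each pair of corresponding symbols contributes 1 when the symbols differ.
assert sum(a != b for a, b in zip("karolin", "kathrin")) == 3
assert sum(a != b for a, b in zip("karolin", "kerstin")) == 3
assert sum(a != b for a, b in zip("kathrin", "kerstin")) == 4
assert sum(a != b for a, b in zip("0000", "1111")) == 4
assert sum(a != b for a, b in zip("2173896", "2233796")) == 3
</syntaxhighlight>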
== Properties ==
For a fixed length ''n'', the Hamming distance is a [[Metric (mathematics)|metric]] on the set of the [[word (formal language theory)|words]] of length ''n'' (also known as a [[Hamming space]]): it is non-negative and symmetric, the Hamming distance of two words is 0 if and only if the two words are identical, and it satisfies the [[triangle inequality]].<ref name="Robinson2003" /> Indeed, if we fix three words ''a'', ''b'' and ''c'', then whenever there is a difference between the ''i''th letter of ''a'' and the ''i''th letter of ''c'', there must be a difference between the ''i''th letter of ''a'' and the ''i''th letter of ''b'', or between the ''i''th letter of ''b'' and the ''i''th letter of ''c''. Hence the Hamming distance between ''a'' and ''c'' is not larger than the sum of the Hamming distances between ''a'' and ''b'' and between ''b'' and ''c''.

The Hamming distance between two words ''a'' and ''b'' can also be seen as the [[Hamming weight]] of ''a'' − ''b'' for an appropriate choice of the − operator, much as the difference between two integers can be seen as a distance from zero on the number line.{{clarify|date=June 2020}}

For binary strings ''a'' and ''b'' the Hamming distance is equal to the number of ones ([[Hamming weight|population count]]) in ''a'' [[Exclusive or|XOR]] ''b''.<ref name="Warren_2013" /> The metric space of length-''n'' binary strings, with the Hamming distance, is known as the ''Hamming cube''; it is equivalent as a metric space to the set of distances between vertices in a [[hypercube graph]]. One can also view a binary string of length ''n'' as a vector in <math>\mathbb{R}^{n}</math> by treating each symbol in the string as a real coordinate; with this embedding, the strings form the vertices of an ''n''-dimensional [[hypercube]], and the Hamming distance of the strings is equivalent to the [[Manhattan distance]] between the vertices.

== Error detection and error correction ==
The '''minimum Hamming distance''' or '''minimum distance''' (usually denoted by ''d<sub>min</sub>'') is used to define some essential notions in [[coding theory]], such as [[Error detection and correction|error detecting and error correcting codes]]. In particular, a [[code (coding theory)|code]] ''C'' is said to be ''k'' error detecting if, and only if, the minimum Hamming distance between any two of its codewords is at least ''k''+1.<ref name="Robinson2003">{{cite book |author-first=Derek J. S. |author-last=Robinson |title=An Introduction to Abstract Algebra |date=2003 |publisher=[[Walter de Gruyter]] |isbn=978-3-11-019816-4 |pages=255–257}}</ref>

For example, consider a code consisting of two codewords "000" and "111". The Hamming distance between these two words is 3, and therefore the code is ''k''=2 error detecting. This means that if one or two bits are flipped, the error can be detected. If three bits are flipped, then "000" becomes "111" and the error cannot be detected.

A code ''C'' is said to be ''k-error correcting'' if, for every word ''w'' in the underlying Hamming space ''H'', there exists at most one codeword ''c'' (from ''C'') such that the Hamming distance between ''w'' and ''c'' is at most ''k''. In other words, a code is ''k''-error correcting if the minimum Hamming distance between any two of its codewords is at least 2''k''+1.
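Both conditions can be checked computationally from the minimum distance of a code; the following is a minimal Python sketch, in which the helper <code>minimum_distance</code> is introduced only for illustration:

<syntaxhighlight lang="python">
from itertools import combinations

def minimum_distance(codewords):
    """Minimum Hamming distance over all pairs of distinct codewords."""
    return min(sum(a != b for a, b in zip(u, v))
               for u, v in combinations(codewords, 2))

d = minimum_distance(["000", "111"])  # the example code above
assert d == 3
assert d >= 2 + 1      # minimum distance at least k+1, so k = 2 error detecting
assert d >= 2 * 1 + 1  # minimum distance at least 2k+1, so k = 1 error correcting
</syntaxhighlight>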
The ''k''-error-correcting condition can also be understood geometrically: the [[Ball (mathematics)#In general metric spaces|closed balls]] of radius ''k'' centered on distinct codewords are disjoint.<ref name="Robinson2003" /> These balls are also called ''[[Hamming sphere]]s'' in this context.<ref name="cc17" />

For example, consider the same 3-bit code consisting of the two codewords "000" and "111". The Hamming space consists of 8 words: 000, 001, 010, 011, 100, 101, 110 and 111. The codeword "000" and the single-bit-error words "001", "010" and "100" are all within Hamming distance 1 of "000". Likewise, the codeword "111" and its single-bit-error words "110", "101" and "011" are all within Hamming distance 1 of "111". In this code, a word produced by a single bit error is always within Hamming distance 1 of exactly one codeword, so the code is ''1-error correcting'', that is, ''k''=1. Since the Hamming distance between "000" and "111" is 3, and those comprise the entire set of codewords, the minimum Hamming distance is 3, which satisfies ''2k+1 = 3''.

Thus a code with minimum Hamming distance ''d'' between its codewords can detect at most ''d''-1 errors and can correct ⌊(''d''-1)/2⌋ errors.<ref name="Robinson2003" /> The latter number is also called the ''[[Sphere packing#Other spaces|packing radius]]'' or the ''error-correcting capability'' of the code.<ref name="cc17">{{citation |title=Covering Codes |volume=54 |series=North-Holland Mathematical Library |author-first1=G. |author-last1=Cohen |author1-link=Gérard Denis Cohen |author-first2=I. |author-last2=Honkala |author-first3=S. |author-last3=Litsyn |author-first4=A. |author-last4=Lobstein |publisher=[[Elsevier]] |date=1997 |isbn=978-0-08-053007-9 |pages=16–17}}</ref>

== History and applications ==
The Hamming distance is named after [[Richard Hamming]], who introduced the concept in his fundamental paper on [[Hamming code]]s, ''Error detecting and error correcting codes'', in 1950.<ref>{{Cite journal |last=Hamming |first=R. W. |date=April 1950 |title=Error detecting and error correcting codes |journal=The Bell System Technical Journal |volume=29 |issue=2 |pages=147–160 |doi=10.1002/j.1538-7305.1950.tb00463.x |s2cid=61141773 |issn=0005-8580 |url=https://calhoun.nps.edu/bitstream/10945/46756/1/Hamming_1982.pdf |archive-url=https://ghostarchive.org/archive/20221009/https://calhoun.nps.edu/bitstream/10945/46756/1/Hamming_1982.pdf |doi-access=free |hdl=10945/46756 |hdl-access=free |archive-date=2022-10-09 |url-status=live}}</ref>

Hamming weight analysis of bits is used in several disciplines including [[information theory]], [[coding theory]], and [[cryptography]].<ref>{{Cite book |last1=Jarrous |first1=Ayman |last2=Pinkas |first2=Benny |title=Applied Cryptography and Network Security |chapter=Secure Hamming Distance Based Computation and Its Applications |series=Lecture Notes in Computer Science |date=2009 |volume=5536 |editor-last=Abdalla |editor-first=Michel |editor2-last=Pointcheval |editor2-first=David |editor3-last=Fouque |editor3-first=Pierre-Alain |editor4-last=Vergnaud |editor4-first=Damien |language=en |location=Berlin, Heidelberg |publisher=Springer |pages=107–124 |doi=10.1007/978-3-642-01957-9_7 |isbn=978-3-642-01957-9 |doi-access=free}}</ref> It is used in [[telecommunication]] to count the number of flipped bits in a fixed-length binary word as an estimate of error, and therefore is sometimes called the '''signal distance'''.<ref name="Ayala2012">{{cite book |author-first=Jose |author-last=Ayala |title=Integrated Circuit and System Design |date=2012 |publisher=[[Springer Science+Business Media|Springer]] |isbn=978-3-642-36156-2 |page=62}}</ref>

For ''q''-ary strings over an [[alphabet]] of size ''q'' ≥ 2 the Hamming distance is applied in the case of the [[Binary symmetric channel|q-ary symmetric channel]], while the [[Lee distance]] is used for [[phase-shift keying]] or, more generally, channels susceptible to [[synchronization error]]s, because the Lee distance accounts for errors of ±1.<ref name="Roth2006">{{cite book |author-first=Ron |author-last=Roth |title=Introduction to Coding Theory |date=2006 |publisher=[[Cambridge University Press]] |isbn=978-0-521-84504-5 |page=298}}</ref> If <math>q = 2</math> or <math>q = 3</math> the two distances coincide, because any pair of distinct elements of <math display="inline">\mathbb{Z}/2\mathbb{Z}</math> or <math display="inline">\mathbb{Z}/3\mathbb{Z}</math> differ by 1, but the distances are different for larger <math>q</math>.

The Hamming distance is also used in [[systematics]] as a measure of genetic distance.<ref>{{Cite journal |last1=Pilcher |first1=Christopher D. |last2=Wong |first2=Joseph K. |last3=Pillai |first3=Satish K. |date=2008-03-18 |title=Inferring HIV Transmission Dynamics from Phylogenetic Sequence Relationships |journal=PLOS Medicine |language=en |volume=5 |issue=3 |pages=e69 |doi=10.1371/journal.pmed.0050069 |pmid=18351799 |pmc=2267810 |issn=1549-1676 |doi-access=free}}</ref>

However, for comparing strings of different lengths, or strings where not just substitutions but also insertions or deletions have to be expected, a more sophisticated metric like the [[Levenshtein distance]] is more appropriate.
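The contrast between the two metrics can be made concrete; a minimal Python sketch, assuming the standard coordinate-wise Lee distance <math display="inline">\min(|x - y|,\ q - |x - y|)</math> on <math display="inline">\mathbb{Z}/q\mathbb{Z}</math>:

<syntaxhighlight lang="python">
def hamming(a, b):
    """Number of coordinates where the q-ary words a and b differ."""
    return sum(x != y for x, y in zip(a, b))

def lee(a, b, q):
    """Lee distance: each coordinate contributes the shorter way around Z/qZ."""
    return sum(min(abs(x - y), q - abs(x - y)) for x, y in zip(a, b))

# For q = 3 the two metrics coincide ...
assert hamming([0, 2, 1], [1, 0, 1]) == lee([0, 2, 1], [1, 0, 1], q=3) == 2
# ... but for larger q a single substitution can cost more than 1 in Lee distance.
assert hamming([0], [3]) == 1
assert lee([0], [3], q=5) == 2
</syntaxhighlight>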
== Algorithm example ==
The following function, written in Python 3, returns the Hamming distance between two strings:

<syntaxhighlight lang="python3" line="1">
def hamming_distance(string1: str, string2: str) -> int:
    """Return the Hamming distance between two strings."""
    if len(string1) != len(string2):
        raise ValueError("Strings must be of equal length.")
    dist_counter = 0
    for n in range(len(string1)):
        if string1[n] != string2[n]:
            dist_counter += 1
    return dist_counter
</syntaxhighlight>

Or, in a shorter expression:

<syntaxhighlight lang="python">
sum(char1 != char2 for char1, char2 in zip(string1, string2, strict=True))
</syntaxhighlight>

The function <code>hamming_distance()</code>, implemented in [[Python (programming language)|Python 3]], computes the Hamming distance between two strings (or other [[Iterator|iterable]] objects) of equal length by creating a sequence of Boolean values indicating mismatches and matches between corresponding positions in the two inputs, then summing the sequence, with True and False values interpreted as one and zero, respectively.

{{Clear}}

<syntaxhighlight lang="python">
def hamming_distance(s1: str, s2: str) -> int:
    """Return the Hamming distance between equal-length sequences."""
    if len(s1) != len(s2):
        raise ValueError("Undefined for sequences of unequal length.")
    return sum(char1 != char2 for char1, char2 in zip(s1, s2))
</syntaxhighlight>

where the [https://docs.python.org/3/library/functions.html#zip zip()] function merges two equal-length collections in pairs.

The following [[C (programming language)|C]] function will compute the Hamming distance of two integers (considered as binary values, that is, as sequences of bits). The running time of this procedure is proportional to the Hamming distance rather than to the number of bits in the inputs. It computes the [[bitwise operation|bitwise]] [[exclusive or]] of the two inputs, and then finds the [[Hamming weight]] of the result (the number of nonzero bits) using an algorithm of {{harvtxt|Wegner|1960}} that repeatedly finds and clears the lowest-order nonzero bit. Some compilers support the [[Hamming weight#Language support|__builtin_popcount]] function, which can calculate this using specialized processor hardware where available.

<syntaxhighlight lang="c">
int hamming_distance(unsigned x, unsigned y)
{
    int dist = 0;

    // The ^ operator sets to 1 only the bits that are different.
    for (unsigned val = x ^ y; val > 0; ++dist) {
        // We then count the bits set to 1 using the Peter Wegner way.
        val = val & (val - 1); // Set to zero val's lowest-order 1
    }

    // Return the number of differing bits.
    return dist;
}
</syntaxhighlight>

A faster alternative is to use the population count (''popcount'') assembly instruction.
Certain compilers such as GCC and Clang make it available via an intrinsic function:

<syntaxhighlight lang="c">
// Hamming distance for 32-bit integers
int hamming_distance32(unsigned int x, unsigned int y)
{
    return __builtin_popcount(x ^ y);
}

// Hamming distance for 64-bit integers
int hamming_distance64(unsigned long long x, unsigned long long y)
{
    return __builtin_popcountll(x ^ y);
}
</syntaxhighlight>

== See also ==
{{Portal|Mathematics}}
* [[Closest string]]
* [[Damerau–Levenshtein distance]]
* [[Euclidean distance]]
* [[Gap-Hamming problem]]
* [[Gray code]]
* [[Jaccard index]]
* [[Jaro–Winkler distance]]
* [[Levenshtein distance]]
* [[Mahalanobis distance]]
* [[Mannheim distance]]
* [[Sørensen similarity index]]
* [[Sparse distributed memory]]
* [[Word ladder]]

== References ==
{{Reflist|refs=
<ref name="Warren_2013">{{cite book |title=Hacker's Delight |title-link=Hacker's Delight |author-first=Henry S. |author-last=Warren Jr. |date=2013 |orig-year=2002 |edition=2 |publisher=[[Addison Wesley]] – [[Pearson Education, Inc.]] |isbn=978-0-321-84268-8 |id=0-321-84268-5 |pages=81–96}}</ref>
}}

== Further reading ==
* {{FS1037C}}
* {{cite journal |author-last=Wegner |author-first=Peter |author-link=Peter Wegner (computer scientist) |doi=10.1145/367236.367286 |issue=5 |journal=[[Communications of the ACM]] |page=322 |title=A technique for counting ones in a binary computer |volume=3 |date=1960 |s2cid=31683715 |doi-access=free}}
* {{cite book |author-link=David J. C. MacKay |author-last=MacKay |author-first=David J. C. |url=http://www.inference.phy.cam.ac.uk/mackay/itila/book.html |title=Information Theory, Inference, and Learning Algorithms |location=Cambridge |publisher=[[Cambridge University Press]] |date=2003 |isbn=0-521-64298-1}}

{{Strings}}
{{Authority control}}

[[Category:String metrics]]
[[Category:Coding theory]]
[[Category:Articles with example Python (programming language) code]]
[[Category:Articles with example C++ code]]
[[Category:Metric geometry]]
[[Category:Cubes]]
[[Category:Computational linguistics]]
[[Category:Information theory]]