==Information theory and signal processing==
{{main|Information theory|Signal processing}}

===Coding theory===
{{further|Coding theory}}

====Error detection and correction====
{{further|Error detection and correction}}
* [[BCH Code]]s
** [[Berlekamp–Massey algorithm]]
** [[Peterson–Gorenstein–Zierler algorithm]]
** [[Reed–Solomon error correction]]
* [[BCJR algorithm]]: decoding of error correcting codes defined on trellises (principally convolutional codes)
* [[Forward error correction]]
* [[Gray code]]
* [[Hamming code]]s
** [[Hamming(7,4)]]: a [[Hamming code]] that encodes 4 bits of data into 7 bits by adding 3 parity bits (a minimal sketch follows this list)
** [[Hamming distance]]: number of positions at which two strings of equal length differ
** [[Hamming weight]] (population count): find the number of 1 bits in a binary word
* [[Redundancy check]]s
** [[Adler-32]]
** [[Cyclic redundancy check]]
** [[Damm algorithm]]
** [[Fletcher's checksum]]
** [[Longitudinal redundancy check]] (LRC)
** [[Luhn algorithm]]: a method of validating identification numbers
** [[Luhn mod N algorithm]]: extension of Luhn to non-numeric characters
** [[Parity bit|Parity]]: simple/fast error detection technique
** [[Verhoeff algorithm]]
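The Hamming(7,4) code listed above is small enough to show in full. The following is a minimal sketch in Python of the usual encoder and single-error-correcting decoder; the function names are illustrative only and do not come from any particular library.

<syntaxhighlight lang="python">
def hamming74_encode(d1, d2, d3, d4):
    """Encode 4 data bits into a 7-bit Hamming(7,4) codeword.

    Parity bits cover the standard position sets:
    p1 checks d1, d2, d4; p2 checks d1, d3, d4; p3 checks d2, d3, d4.
    """
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    # Codeword positions 1..7 are p1 p2 d1 p3 d2 d3 d4.
    return [p1, p2, d1, p3, d2, d3, d4]


def hamming74_decode(r):
    """Correct at most one flipped bit and return the 4 data bits."""
    r1, r2, r3, r4, r5, r6, r7 = r
    # Syndrome: each parity check recomputed over the received word.
    s1 = r1 ^ r3 ^ r5 ^ r7
    s2 = r2 ^ r3 ^ r6 ^ r7
    s3 = r4 ^ r5 ^ r6 ^ r7
    error_pos = s1 + 2 * s2 + 4 * s3   # 0 means no detectable error
    if error_pos:
        r[error_pos - 1] ^= 1          # flip the corrupted bit back
    return [r[2], r[4], r[5], r[6]]    # data bits d1 d2 d3 d4


# Example: encode, corrupt one bit, and still recover the original data.
word = hamming74_encode(1, 0, 1, 1)
word[5] ^= 1
assert hamming74_decode(word) == [1, 0, 1, 1]
</syntaxhighlight>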
====Lossless compression algorithms====
{{main|:Category:Lossless compression algorithms|l1=Lossless compression algorithms}}
* [[Burrows–Wheeler transform]]: preprocessing useful for improving [[Lossless data compression|lossless compression]]
* [[Context tree weighting]]
* [[Delta encoding]]: aid to compression of data in which sequential data occurs frequently
* [[Dynamic Markov compression]]: compression using predictive arithmetic coding
* [[Dictionary coder]]s
** [[Byte pair encoding]] (BPE)
** [[Deflate]]
** [[Lempel–Ziv]]
*** [[LZ77 and LZ78]]
*** [[LZJB|Lempel–Ziv Jeff Bonwick]] (LZJB)
*** [[Lempel–Ziv–Markov chain algorithm]] (LZMA)
*** [[Lempel–Ziv–Oberhumer]] (LZO): speed oriented
*** [[Lempel–Ziv–Stac]] (LZS)
*** [[Lempel–Ziv–Storer–Szymanski]] (LZSS)
*** [[Lempel–Ziv–Welch]] (LZW)
*** [[LZWL]]: syllable-based variant
*** [[LZX]]
*** [[LZRW|Lempel–Ziv Ross Williams]] (LZRW)
* [[Entropy encoding]]: coding scheme that assigns codes to symbols so as to match code lengths with the probabilities of the symbols
** [[Arithmetic coding]]: advanced [[entropy]] coding
*** [[Range encoding]]: same as [[arithmetic coding]], but looked at in a slightly different way
** [[Huffman coding]]: simple lossless compression taking advantage of relative character frequencies
*** [[Adaptive Huffman coding]]: [[adaptive coding]] technique based on Huffman coding
*** [[Package-merge algorithm]]: optimizes Huffman coding subject to a length restriction on code strings
** [[Shannon–Fano coding]]
** [[Shannon–Fano–Elias coding]]: precursor to arithmetic encoding<ref>{{Cite web |title=Shannon-Fano-Elias Coding |url=https://my.ece.msstate.edu/faculty/fowler/Classes/ECE8813/Handouts/shannon_fano_elias.pdf |url-status=dead |archive-url=https://web.archive.org/web/20210228164521/https://my.ece.msstate.edu/faculty/fowler/Classes/ECE8813/Handouts/shannon_fano_elias.pdf |archive-date=2021-02-28 |access-date=2023-10-11 |website=my.ece.msstate.edu}}</ref>
* [[Entropy encoding|Entropy coding with known entropy characteristics]]
** [[Golomb coding]]: form of entropy coding that is optimal for alphabets following geometric distributions
** [[Rice coding]]: form of entropy coding that is optimal for alphabets following geometric distributions
** [[Truncated binary encoding]]
** [[Unary coding]]: code that represents a number n with n ones followed by a zero
** [[Universal code (data compression)|Universal codes]]: encodes positive integers into binary code words
*** Elias [[Elias delta coding|delta]], [[Elias gamma coding|gamma]], and [[Elias omega coding|omega]] coding
*** [[Exponential-Golomb coding]]
*** [[Fibonacci coding]]
*** [[Levenshtein coding]]
* [[FELICS|Fast Efficient & Lossless Image Compression System]] (FELICS): a lossless image compression algorithm
* [[Incremental encoding]]: delta encoding applied to sequences of strings
* [[PPM compression algorithm|Prediction by partial matching]] (PPM): an adaptive statistical data compression technique based on context modeling and prediction
* [[Run-length encoding]]: lossless data compression taking advantage of strings of repeated characters (a short sketch follows this list)
* [[SEQUITUR algorithm]]: lossless compression by incremental grammar inference on a string
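Run-length encoding, listed above, is the simplest scheme in this group, so a compact sketch is easy to give. The function names below are illustrative only; the code shows the idea rather than any particular implementation or on-disk format.

<syntaxhighlight lang="python">
def rle_encode(s):
    """Collapse runs of repeated characters into (character, count) pairs."""
    runs = []
    for ch in s:
        if runs and runs[-1][0] == ch:
            runs[-1][1] += 1
        else:
            runs.append([ch, 1])
    return [(ch, n) for ch, n in runs]


def rle_decode(runs):
    """Expand (character, count) pairs back into the original string."""
    return "".join(ch * n for ch, n in runs)


# A run-heavy input compresses well; an input with no runs does not.
assert rle_encode("aaaabbbcca") == [("a", 4), ("b", 3), ("c", 2), ("a", 1)]
assert rle_decode(rle_encode("aaaabbbcca")) == "aaaabbbcca"
</syntaxhighlight>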
====Lossy compression algorithms====
{{main|:Category:Lossy compression algorithms|l1=Lossy compression algorithms}}
* [[3Dc]]: a lossy data compression algorithm for [[Normal mapping|normal maps]]
* [[Audio data compression|Audio]] and [[speech encoding|speech]] compression
** [[A-law algorithm]]: standard companding algorithm
** [[Code-excited linear prediction]] (CELP): low bit-rate speech compression
** [[Linear predictive coding]] (LPC): lossy compression by representing the [[spectral envelope]] of a digital signal of speech in compressed form
** [[Mu-law algorithm]]: standard analog signal compression or companding algorithm
** [[Warped Linear Predictive Coding]] (WLPC)
* [[Image compression]]
** [[Block Truncation Coding]] (BTC): a type of lossy image compression technique for greyscale images
** [[Embedded Zerotree Wavelet]] (EZW)
** [[Fast Cosine Transform|Fast Cosine Transform algorithm]]s (FCT algorithms): compute the discrete cosine transform (DCT) efficiently
** [[Fractal compression]]: method used to compress images using fractals
** [[Set Partitioning in Hierarchical Trees]] (SPIHT)
** [[Wavelet compression]]: form of data compression well suited for [[image compression]] (sometimes also video compression and audio compression)
* [[Transform coding]]: type of data compression for "natural" data like audio signals or photographic images
* [[Video compression]]
* [[Vector quantization]]: technique often used in lossy data compression

===Digital signal processing===
{{further|Digital signal processing}}
* [[Adaptive-additive algorithm]] (AA algorithm): find the spatial frequency phase of an observed wave source
* [[Discrete Fourier transform]]: determines the frequencies contained in a (segment of a) signal
** [[Bluestein's FFT algorithm]]
** [[Bruun's FFT algorithm]]
** [[Cooley–Tukey FFT algorithm]]
** [[Fast Fourier transform]]
** [[Prime-factor FFT algorithm]]
** [[Rader's FFT algorithm]]
* [[Fast folding algorithm]]: an efficient algorithm for the detection of approximately periodic events within time series data
* [[Gerchberg–Saxton algorithm]]: phase retrieval algorithm for optical planes
* [[Goertzel algorithm]]: identify a particular frequency component in a signal. Can be used for [[DTMF]] digit decoding.
* [[Karplus-Strong string synthesis]]: physical modelling synthesis to simulate the sound of a hammered or plucked string or some types of percussion

====Image processing====
{{further|Digital image processing}}
* Contrast enhancement
** [[Histogram equalization]]: use histogram to improve image contrast (a minimal sketch follows this list)
** [[Adaptive histogram equalization]]: histogram equalization which adapts to local changes in contrast
* [[Connected-component labeling]]: find and label disjoint regions
* [[Dithering]] and [[half-toning]]
** [[Error diffusion]]
** [[Floyd–Steinberg dithering]]
** [[Ordered dithering]]
** [[Riemersma dithering]]
* Elser [[difference-map algorithm]]: a search algorithm for general constraint satisfaction problems. Originally used for [[X-ray crystallography|X-ray diffraction]] microscopy
* [[Feature detection (computer vision)|Feature detection]]
** [[Canny edge detector]]: detect a wide range of edges in images
** [[Generalised Hough transform]]
** [[Hough transform]]
** [[Marr–Hildreth algorithm]]: an early [[edge detection]] algorithm
** [[Scale-invariant feature transform|SIFT]] (Scale-invariant feature transform): an algorithm to detect and describe local features in images
** {{visible anchor|SURF ([[speeded up robust features|Speeded Up Robust Features]])}}: a robust local feature detector, first presented by Herbert Bay et al. in 2006, that can be used in computer vision tasks like object recognition or 3D reconstruction. It is partly inspired by the SIFT descriptor. The standard version of SURF is several times faster than SIFT and is claimed by its authors to be more robust against different image transformations than SIFT.<ref>{{cite web |url=http://www.vision.ee.ethz.ch/~surf/eccv06.pdf |title=Archived copy |website=www.vision.ee.ethz.ch |access-date=13 January 2022 |archive-url=https://web.archive.org/web/20070221214147/http://www.vision.ee.ethz.ch/~surf/eccv06.pdf |archive-date=21 February 2007 |url-status=dead}}</ref><ref>{{cite web |url=http://glorfindel.mavrinac.com/~aaron/school/pdf/bay06_surf.pdf |title=Archived copy |access-date=2013-10-05 |url-status=dead |archive-url=https://web.archive.org/web/20131006113018/http://glorfindel.mavrinac.com/~aaron/school/pdf/bay06_surf.pdf |archive-date=2013-10-06 }}</ref>
* [[Richardson–Lucy deconvolution]]: image de-blurring algorithm
* [[Blind deconvolution]]: image de-blurring algorithm when the [[point spread function]] is unknown
* [[Median filtering]]
* [[Seam carving]]: content-aware image resizing algorithm
* [[Segmentation (image processing)|Segmentation]]: partition a digital image into two or more regions
** [[GrowCut algorithm]]: an interactive segmentation algorithm
** [[Random walker algorithm]]
** [[Region growing]]
** [[Watershed (algorithm)|Watershed transformation]]: a class of algorithms based on the watershed analogy
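Histogram equalization, listed under contrast enhancement above, is one of the simpler image-processing entries to demonstrate. The sketch below assumes an 8-bit greyscale image stored in a NumPy array; the function name is illustrative, and the mapping is the basic cumulative-distribution form without the refinements found in real imaging libraries.

<syntaxhighlight lang="python">
import numpy as np

def equalize_histogram(img):
    """Basic histogram equalization for an 8-bit greyscale image.

    Maps each grey level through the normalised cumulative histogram so
    that the output levels are spread more evenly over 0..255.
    """
    hist, _ = np.histogram(img.flatten(), bins=256, range=(0, 256))
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()          # cumulative count at the first occupied level
    # Scale the cumulative distribution to the full 0..255 output range.
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255).astype(np.uint8)
    return lut[img]

# Example: a dark, low-contrast gradient gets stretched across the full range.
dark = np.tile(np.arange(40, 80, dtype=np.uint8), (64, 1))
bright = equalize_histogram(dark)
print(dark.min(), dark.max(), "->", bright.min(), bright.max())   # 40 79 -> 0 255
</syntaxhighlight>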