==Feature vectors==
{{See also|Word embedding}}
{{Redirect|Feature space|feature spaces in kernel machines|Kernel method}}

In [[pattern recognition]] and [[machine learning]], a '''feature vector''' is an ''n''-dimensional [[Vector (mathematics and physics)|vector]] of numerical features that represents some object. Many [[algorithm]]s in machine learning require a numerical representation of objects, since such representations facilitate processing and statistical analysis. When representing images, the feature values might correspond to the pixels of an image, while when representing text the features might be the frequencies of occurrence of textual terms. Feature vectors are equivalent to the vectors of [[explanatory variable]]s used in [[statistics|statistical]] procedures such as [[linear regression]]. Feature vectors are often combined with weights using a [[dot product]] in order to construct a [[linear predictor function]] that is used to determine a score for making a prediction.

The [[vector space]] associated with these vectors is often called the '''feature space'''. In order to reduce the dimensionality of the feature space, a number of [[dimensionality reduction]] techniques can be employed.

Higher-level features can be obtained from already available features and added to the feature vector; for example, in the study of diseases the feature 'Age' is useful and is defined as ''Age'' = 'Year of death' − 'Year of birth'. This process is referred to as '''feature construction'''.<ref name=Liu1998>Liu, H., Motoda, H. (1998). ''[https://books.google.com/books?id=aaDbBwAAQBAJ Feature Selection for Knowledge Discovery and Data Mining]''. Kluwer Academic Publishers, Norwell, MA, USA.</ref><ref name=Piramithu2009>Piramuthu, S., Sikora, R. T. [https://www.sciencedirect.com/science/article/pii/S0957417408001309 Iterative feature construction for improving inductive learning algorithms]. Journal of Expert Systems with Applications, Vol. 36, Iss. 2 (March 2009), pp. 3401–3406.</ref> Feature construction is the application of a set of constructive operators to a set of existing features, resulting in the construction of new features. Examples of such constructive operators include the equality conditions {=, ≠}, the arithmetic operators {+, −, ×, /}, and the array operators {max(S), min(S), average(S)}, as well as other more sophisticated operators, for example count(S,C),<ref name=bloedorn1998>Bloedorn, E., Michalski, R. Data-driven constructive induction: a methodology and its applications. IEEE Intelligent Systems, Special issue on Feature Transformation and Subset Selection, pp. 30–37, March/April 1998.</ref> which counts the number of features in the feature vector S satisfying some condition C, or, for example, distances to other recognition classes generalized by some accepting device. Feature construction has long been considered a powerful tool for increasing both accuracy and understanding of structure, particularly in high-dimensional problems.<ref name=breinman1984>Breiman, L., Friedman, J., Olshen, R., Stone, C. (1984). ''Classification and Regression Trees''. Wadsworth.</ref> Applications include studies of disease and [[emotion recognition]] from speech.<ref name=Sidorova2009>Sidorova, J., Badia, T. [https://ieeexplore.ieee.org/abstract/document/5402574/ Syntactic learning for ESEDA.1, tool for enhanced speech emotion detection and analysis]. Internet Technology and Secured Transactions Conference 2009 (ICITST-2009), London, November 9–12. IEEE.</ref>
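The ideas above — a feature vector, dot-product scoring with a linear predictor, and feature construction with operators such as average(S) and count(S,C) — can be sketched in a few lines of Python. This is a minimal illustration only; the helper names (linear_score, count_satisfying) and the example patient data are hypothetical, not taken from the cited works.

```python
# Illustrative sketch: feature vectors, linear scoring, and feature construction.
# Helper names and data are made up for this example.

def linear_score(features, weights):
    """Combine a feature vector with weights via a dot product (linear predictor)."""
    return sum(f * w for f, w in zip(features, weights))

def count_satisfying(S, C):
    """count(S, C): number of features in the vector S satisfying condition C."""
    return sum(1 for f in S if C(f))

# A subject described by a feature vector: [year_of_birth, year_of_death, weight_kg]
subject = [1912, 1984, 70.0]

# Feature construction: Age = 'Year of death' - 'Year of birth'
age = subject[1] - subject[0]            # 1984 - 1912 = 72
extended = subject + [age]               # higher-level feature appended to the vector

# Array operators applied to the extended feature vector
summary = {
    "max": max(extended),
    "min": min(extended),
    "average": sum(extended) / len(extended),
    "count_over_100": count_satisfying(extended, lambda f: f > 100),
}

# Dot product with a (hypothetical) weight vector yields a prediction score
score = linear_score(extended, [0.0, 0.0, 0.1, 0.5])   # 0.1*70 + 0.5*72 = 43.0
```

The constructed 'Age' feature is simply appended to the original vector, so downstream procedures such as the linear predictor treat it like any other feature.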