Sensor fusion
== Levels ==

There are several categories or levels of sensor fusion that are commonly used.<ref>[http://www.infofusion.buffalo.edu/tm/Dr.Llinas'stuff/Rethinking%20JDL%20Data%20Fusion%20Levels_BowmanSteinberg.pdf Rethinking JDL Data Fusion Levels]</ref><ref>Blasch, E., Plano, S. (2003) "Level 5: User Refinement to aid the Fusion Process", Proceedings of the SPIE, Vol. 5099.</ref><ref>{{cite conference | author1 = J. Llinas | author2 = C. Bowman | author3 = G. Rogova | author4 = A. Steinberg | author5 = E. Waltz | author6 = F. White | citeseerx = 10.1.1.58.2996 | title = Revisiting the JDL data fusion model II | conference = International Conference on Information Fusion | year = 2004 }}</ref><ref>Blasch, E. (2006) "[http://www.iut-amiens.fr/~ricquebourg/these/fusion_2006/Papers/394.pdf Sensor, user, mission (SUM) resource management and their interaction with level 2/3 fusion]{{dead link|date=May 2018 |bot=InternetArchiveBot |fix-attempted=yes }}" International Conference on Information Fusion.</ref><ref>{{Cite web|url=http://defensesystems.com/articles/2009/09/02/c4isr1-sensor-fusion.aspx|title=Harnessing the full power of sensor fusion|date=3 April 2024}}</ref><ref>Blasch, E., Steinberg, A., Das, S., Llinas, J., Chong, C.-Y., Kessler, O., Waltz, E., White, F. (2013) "Revisiting the JDL model for information Exploitation," International Conference on Information Fusion.</ref>

* Level 0 – Data alignment
* Level 1 – Entity assessment (e.g. signal/feature/object).
** Tracking and object detection/recognition/identification
* Level 2 – Situation assessment
* Level 3 – Impact assessment
* Level 4 – Process refinement (i.e. sensor management)
* Level 5 – User refinement

The sensor fusion level can also be defined based on the kind of information used to feed the fusion algorithm.<ref name="GravinaAlinia2017">{{cite journal|last1=Gravina|first1=Raffaele|last2=Alinia|first2=Parastoo|last3=Ghasemzadeh|first3=Hassan|last4=Fortino|first4=Giancarlo|title=Multi-sensor fusion in body sensor networks: State-of-the-art and research challenges|journal=Information Fusion|volume=35|year=2017|pages=68–80|issn=1566-2535|doi=10.1016/j.inffus.2016.09.005|s2cid=40608207}}</ref> More precisely, sensor fusion can be performed by fusing raw data coming from different sources, extracted features, or even decisions made by single nodes.

* Data level – data-level (or early) fusion aims to fuse raw data from multiple sources and represents the fusion technique at the lowest level of abstraction. It is the most common sensor fusion technique in many fields of application. Data-level fusion algorithms usually aim to combine multiple homogeneous sources of sensory data to achieve more accurate and synthetic readings.<ref name="GaoSong2015">{{cite journal|last1=Gao|first1=Teng|last2=Song|first2=Jin-Yan|last3=Zou|first3=Ji-Yan|last4=Ding|first4=Jin-Hua|last5=Wang|first5=De-Quan|last6=Jin|first6=Ren-Cheng|title=An overview of performance trade-off mechanisms in routing protocol for green wireless sensor networks|journal=Wireless Networks|volume=22|issue=1|year=2015|pages=135–157|issn=1022-0038|doi=10.1007/s11276-015-0960-x|s2cid=34505498}}</ref> When portable devices are employed, data compression is an important factor, since collecting raw information from multiple sources generates huge information spaces that can become a problem in terms of memory or communication bandwidth for portable systems. Data-level information fusion tends to generate big input spaces, which slow down the decision-making procedure. Data-level fusion also often cannot handle incomplete measurements.
If one sensor modality becomes useless due to malfunction, breakdown, or other reasons, the whole system may produce ambiguous outcomes.
* Feature level – features represent information computed on board by each sensing node. These features are then sent to a fusion node to feed the fusion algorithm.<ref name="ChenJafari2015">{{cite journal|last1=Chen|first1=Chen|last2=Jafari|first2=Roozbeh|last3=Kehtarnavaz|first3=Nasser|title=A survey of depth and inertial sensor fusion for human action recognition|journal=Multimedia Tools and Applications|volume=76|issue=3|year=2015|pages=4405–4425|issn=1380-7501|doi=10.1007/s11042-015-3177-1|s2cid=18112361}}</ref> This procedure generates smaller information spaces than data-level fusion, which is preferable in terms of computational load. Naturally, it is important to properly select the features on which classification procedures are defined: choosing the most efficient feature set should be a main aspect of method design. Using feature-selection algorithms that properly detect correlated features and feature subsets improves recognition accuracy, but large training sets are usually required to find the most significant feature subset.<ref name="GravinaAlinia2017"/>
* Decision level – decision-level (or late) fusion is the procedure of selecting a hypothesis from a set of hypotheses generated by the individual (usually weaker) decisions of multiple nodes.<ref name="BanovicBuzali2016">{{cite book|last1=Banovic|first1=Nikola|title=Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems - CHI '16|last2=Buzali|first2=Tofi|last3=Chevalier|first3=Fanny|last4=Mankoff|first4=Jennifer|last5=Dey|first5=Anind K.|chapter=Modeling and Understanding Human Routine Behavior|year=2016|pages=248–260|doi=10.1145/2858036.2858557|isbn=9781450333627|s2cid=872756}}</ref> It is the highest level of abstraction and uses information that has already been elaborated through preliminary data-level or feature-level processing. The main goal of decision fusion is to use a meta-level classifier, while the data from the nodes are preprocessed by extracting features from them.<ref name="MariaSever2015">{{cite book|last1=Maria|first1=Aileni Raluca|title=2015 Conference Grid, Cloud & High Performance Computing in Science (ROLCG)|last2=Sever|first2=Pasca|last3=Carlos|first3=Valderrama|chapter=Biomedical sensors data fusion algorithm for enhancing the efficiency of fault-tolerant systems in case of wearable electronics device|year=2015|pages=1–4|doi=10.1109/ROLCG.2015.7367228|isbn=978-6-0673-7040-9|s2cid=18782930}}</ref> Typically, decision-level sensor fusion is used in classification and recognition activities, and the two most common approaches are majority voting and naive Bayes.{{Citation needed|date=December 2019|reason=removed citation to predatory publisher content}} Advantages of decision-level fusion include reduced communication bandwidth and improved decision accuracy. It also allows the combination of heterogeneous sensors.<ref name="ChenJafari2015"/>
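The contrast between the lowest and highest levels of abstraction can be illustrated with a minimal sketch. The sensor values, variances, and labels below are purely illustrative assumptions, not drawn from the cited literature: data-level fusion is shown as an inverse-variance weighted average of homogeneous raw readings, and decision-level fusion as a majority vote over node-local classification decisions.

```python
from collections import Counter

def fuse_data_level(readings, variances):
    """Data-level (early) fusion sketch: inverse-variance weighted
    average of homogeneous raw readings of the same quantity, so
    noisier sensors contribute less to the fused estimate."""
    weights = [1.0 / v for v in variances]
    return sum(w * r for w, r in zip(weights, readings)) / sum(weights)

def fuse_decision_level(decisions):
    """Decision-level (late) fusion sketch: majority vote over the
    class labels produced independently by each sensing node."""
    label, _count = Counter(decisions).most_common(1)[0]
    return label

# Three hypothetical temperature sensors observing the same room;
# the third is much noisier, so it receives a smaller weight.
fused_reading = fuse_data_level([21.8, 22.1, 25.0], [0.1, 0.1, 2.0])

# Three hypothetical node-local classifiers voting on an activity label.
fused_label = fuse_decision_level(["walking", "walking", "running"])
```

Note that the voting scheme works even when the nodes carry heterogeneous sensors, since only the final labels are combined, whereas the weighted average requires homogeneous readings of the same physical quantity.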