Editing Sensor fusion (section)
== Centralized versus decentralized ==

In sensor fusion, centralized versus decentralized refers to where the fusion of the data occurs. In centralized fusion, the clients simply forward all of the data to a central location, and some entity at the central location is responsible for correlating and fusing the data. In decentralized fusion, the clients take full responsibility for fusing the data. "In this case, every sensor or platform can be viewed as an intelligent asset having some degree of autonomy in decision-making."<ref>{{cite web|title=Multi-sensor management for information fusion: issues and approaches|url=http://www.elsevier.com/locate/inffus|author=N. Xiong |author2=P. Svensson |publisher = Information Fusion|year = 2002|page = 3(2):163–186}}</ref> Multiple combinations of centralized and decentralized systems exist.

Another classification of sensor configuration refers to the coordination of information flow between sensors.<ref name="Durrant-Whyte2016">{{cite journal|last1=Durrant-Whyte|first1=Hugh F.|title=Sensor Models and Multisensor Integration|journal=The International Journal of Robotics Research|volume=7|issue=6|year=2016|pages=97–113|issn=0278-3649|doi=10.1177/027836498800700608|s2cid=35656213}}</ref><ref name="Galar">{{cite book|title=eMaintenance: Essential Electronic Tools for Efficiency|first1=Diego|last1= Galar|first2= Uday |last2=Kumar|isbn=9780128111543|page=26 |publisher=Academic Press|year=2017}}</ref> These mechanisms provide a way to resolve conflicts or disagreements and to allow the development of dynamic sensing strategies. Sensors are in a redundant (or competitive) configuration if each node delivers independent measures of the same properties. This configuration can be used for error correction when comparing information from multiple nodes.
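The redundant configuration described above can be sketched in a few lines. This is a minimal illustration, not code from any cited work: the function name, the median-based estimator, and the tolerance value are all illustrative assumptions.

```python
import statistics

def fuse_redundant(readings, tolerance=0.5):
    """Fuse redundant readings of the same property from several nodes.

    The median serves as a robust fused estimate; any node whose reading
    deviates from it by more than `tolerance` is flagged as a likely
    fault. Names and the tolerance value are illustrative assumptions.
    """
    estimate = statistics.median(readings)
    faulty = [i for i, r in enumerate(readings) if abs(r - estimate) > tolerance]
    return estimate, faulty

# Three nodes independently measure the same temperature; node 2 disagrees.
estimate, faulty = fuse_redundant([21.1, 20.9, 35.0])
# estimate == 21.1 (the median), faulty == [2]
```

Because every node measures the same property, a single faulty sensor can be outvoted by the others, which is the error-correction benefit of redundancy.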
Redundant strategies are often used with high-level fusion in voting procedures.<ref name="LiBao2012">{{cite book|last1=Li|first1=Wenfeng|title=2012 12th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing (ccgrid 2012)|last2=Bao|first2=Junrong|last3=Fu|first3=Xiuwen|last4=Fortino|first4=Giancarlo|last5=Galzarano|first5=Stefano|chapter=Human Postures Recognition Based on D-S Evidence Theory and Multi-sensor Data Fusion|year=2012|pages=912–917|doi=10.1109/CCGrid.2012.144|isbn=978-1-4673-1395-7|s2cid=1571720}}</ref><ref name="FortinoGravina2015">{{cite book|last1=Fortino|first1=Giancarlo|title=Proceedings of the 10th EAI International Conference on Body Area Networks|last2=Gravina|first2=Raffaele|chapter=Fall-MobileGuard: a Smart Real-Time Fall Detection System|year=2015|doi=10.4108/eai.28-9-2015.2261462|isbn=978-1-63190-084-6|s2cid=38913107}}</ref> A complementary configuration occurs when multiple information sources supply different information about the same features. This strategy is used for fusing information at the raw data level within decision-making algorithms.
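A common way to realize the complementary configuration is to concatenate the feature vectors produced by different sensors into one joint vector before classification. The sketch below is illustrative only; the sensor names and feature values are assumptions, not taken from the cited studies.

```python
def fuse_complementary(feature_vectors):
    """Complementary fusion: concatenate per-sensor feature vectors.

    Each sensor contributes different information about the same
    activity; the joint vector is what a downstream classifier sees.
    All names and values here are illustrative assumptions.
    """
    fused = []
    for vector in feature_vectors:
        fused.extend(vector)
    return fused

accel_features = [0.12, 0.80]  # e.g. mean and variance of acceleration
gyro_features = [0.05, 0.33]   # e.g. mean and variance of angular rate
joint = fuse_complementary([accel_features, gyro_features])
# joint == [0.12, 0.80, 0.05, 0.33]
```

Unlike the redundant case, neither sensor here duplicates the other; each fills in information the other lacks.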
Complementary features are typically applied in motion recognition tasks with [[neural network]]s,<ref name="TaoZhang2018">{{cite journal|last1=Tao|first1=Shuai|last2=Zhang|first2=Xiaowei|last3=Cai|first3=Huaying|last4=Lv|first4=Zeping|last5=Hu|first5=Caiyou|last6=Xie|first6=Haiqun|title=Gait based biometric personal authentication by using MEMS inertial sensors|journal=Journal of Ambient Intelligence and Humanized Computing|volume=9|issue=5|year=2018|pages=1705–1712|issn=1868-5137|doi=10.1007/s12652-018-0880-6|s2cid=52304214}}</ref><ref name="DehzangiTaherisadr2017">{{cite journal|last1=Dehzangi|first1=Omid|last2=Taherisadr|first2=Mojtaba|last3=ChangalVala|first3=Raghvendar|title=IMU-Based Gait Recognition Using Convolutional Neural Networks and Multi-Sensor Fusion|journal=Sensors|volume=17|issue=12|year=2017|pages=2735|issn=1424-8220|doi=10.3390/s17122735|pmid=29186887|pmc=5750784|bibcode=2017Senso..17.2735D|doi-access=free}}</ref> [[hidden Markov model]]s,<ref name="GuenterbergYang2009">{{cite journal|last1=Guenterberg|first1=E.|last2=Yang|first2=A.Y.|last3=Ghasemzadeh|first3=H.|last4=Jafari|first4=R.|last5=Bajcsy|first5=R.|last6=Sastry|first6=S.S.|title=A Method for Extracting Temporal Parameters Based on Hidden Markov Models in Body Sensor Networks With Inertial Sensors|journal=IEEE Transactions on Information Technology in Biomedicine|volume=13|issue=6|year=2009|pages=1019–1030|issn=1089-7771|doi=10.1109/TITB.2009.2028421|pmid=19726268|s2cid=1829011|url=http://www.eecs.berkeley.edu/~yang/paper/GuenterbergE-Biomedicine.pdf}}</ref><ref name="ParisiFerrari2016">{{cite journal|last1=Parisi|first1=Federico|last2=Ferrari|first2=Gianluigi|last3=Giuberti|first3=Matteo|last4=Contin|first4=Laura|last5=Cimolin|first5=Veronica|last6=Azzaro|first6=Corrado|last7=Albani|first7=Giovanni|last8=Mauro|first8=Alessandro|title=Inertial BSN-Based Characterization and Automatic UPDRS Evaluation of the Gait Task of Parkinsonians|journal=IEEE Transactions on Affective Computing|volume=7|issue=3|year=2016|pages=258–271|issn=1949-3045|doi=10.1109/TAFFC.2016.2549533|s2cid=16866555}}</ref> [[support vector machine]]s,<ref name="GaoBourke2014">{{cite journal|last1=Gao|first1=Lei|last2=Bourke|first2=A.K.|last3=Nelson|first3=John|title=Evaluation of accelerometer based multi-sensor versus single-sensor activity recognition systems|journal=Medical Engineering & Physics|volume=36|issue=6|year=2014|pages=779–785|issn=1350-4533|doi=10.1016/j.medengphy.2014.02.012|pmid=24636448}}</ref> clustering methods and other techniques.<ref name="GaoBourke2014"/><ref name="ParisiFerrari2016"/>

Cooperative sensor fusion uses the information extracted by multiple independent sensors to provide information that would not be available from any single sensor; for example, sensors attached to adjacent body segments can together yield the angle between them. Cooperative information fusion can be used in motion recognition,<ref name="XuWang2016">{{cite journal|last1=Xu|first1=James Y.|last2=Wang|first2=Yan|last3=Barrett|first3=Mick|last4=Dobkin|first4=Bruce|last5=Pottie|first5=Greg J.|last6=Kaiser|first6=William J.|title=Personalized Multilayer Daily Life Profiling Through Context Enabled Activity Classification and Motion Reconstruction: An Integrated System Approach|journal=IEEE Journal of Biomedical and Health Informatics|volume=20|issue=1|year=2016|pages=177–188|issn=2168-2194|doi=10.1109/JBHI.2014.2385694|pmid=25546868|s2cid=16785375|doi-access=free}}</ref> [[gait analysis]] and [[motion analysis]].<ref name="Chia BejaranoAmbrosini2015">{{cite journal|last1=Chia Bejarano|first1=Noelia|last2=Ambrosini|first2=Emilia|last3=Pedrocchi|first3=Alessandra|last4=Ferrigno|first4=Giancarlo|last5=Monticone|first5=Marco|last6=Ferrante|first6=Simona|title=A Novel Adaptive, Real-Time Algorithm to Detect Gait Events From Wearable Sensors|journal=IEEE Transactions on Neural Systems and Rehabilitation Engineering|volume=23|issue=3|year=2015|pages=413–422|issn=1534-4320|doi=10.1109/TNSRE.2014.2337914|pmid=25069118|s2cid=25828466|hdl=11311/865739|hdl-access=free}}</ref><ref name="WangQiu2013">{{cite journal|last1=Wang|first1=Zhelong|last2=Qiu|first2=Sen|last3=Cao|first3=Zhongkai|last4=Jiang|first4=Ming|title=Quantitative assessment of dual gait analysis based on inertial sensors with body sensor network|journal=Sensor Review|volume=33|issue=1|year=2013|pages=48–56|issn=0260-2288|doi=10.1108/02602281311294342}}</ref><ref name="KongWanning2017">{{cite journal|last1=Kong|first1=Weisheng|last2=Wanning|first2=Lauren|last3=Sessa|first3=Salvatore|last4=Zecca|first4=Massimiliano|last5=Magistro|first5=Daniele|last6=Takeuchi|first6=Hikaru|last7=Kawashima|first7=Ryuta|last8=Takanishi|first8=Atsuo|title=Step Sequence and Direction Detection of Four Square Step Test|journal=IEEE Robotics and Automation Letters|volume=2|issue=4|year=2017|pages=2194–2200|issn=2377-3766|doi=10.1109/LRA.2017.2723929|s2cid=23410874|url=http://irep.ntu.ac.uk/id/eprint/33502/1/10999_Magistro.pdf}}</ref>
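The body-segment example above can be sketched as follows. This is a deliberately simplified illustration, not an algorithm from the cited papers: each node is assumed to report its segment's direction as a 2-D unit-style vector in a shared reference frame, and the joint angle is recovered only by combining both.

```python
import math

def angle_between(u, v):
    """Cooperative fusion: angle in degrees between two segment vectors.

    Each inertial node reports only its own segment's direction in a
    shared frame; neither node alone can provide the joint angle.
    The 2-D direction-vector model is an illustrative assumption.
    """
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    cos_t = max(-1.0, min(1.0, dot / (norm_u * norm_v)))  # clamp rounding error
    return math.degrees(math.acos(cos_t))

# One sensor on the thigh (pointing straight down), one on the shank
# (flexed forward): the knee angle emerges only from fusing the pair.
knee = angle_between([0.0, 1.0], [1.0, 1.0])
# knee ≈ 45.0 degrees
```

This is what distinguishes the cooperative configuration from the redundant and complementary ones: the fused quantity (the inter-segment angle) is not measured by any node, it only exists in the combination.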