Multisensor Fusion: A Minimal Representation Framework

World Scientific, 1999 - Science - 315 pages
The fusion of information from sensors with different physical characteristics, such as sight, touch, and sound, enhances our understanding of the surroundings and provides the basis for planning, decision-making, and control of autonomous and intelligent machines.

The minimal representation approach to multisensor fusion uses an information measure as a universal yardstick for fusion. Given models of sensor uncertainty, the representation size guides the integration of widely varying types of data and maximizes the information contributed to a consistent interpretation.
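As a rough illustration of this idea (a sketch, not the book's formulation), the Python fragment below scores a candidate interpretation by a total description length: a fixed cost per model parameter plus the cost of encoding each sensor's residuals under that sensor's Gaussian uncertainty model; the interpretation with the smallest total size is preferred. The function names, the 32-bit parameter cost, and the toy "camera" and "touch" sensors are assumptions made for this example.

```python
# A minimal sketch of a representation-size criterion for multisensor fusion:
# the preferred interpretation is the one with the smallest total encoding
# cost (model parameters plus data residuals under each sensor's noise model).

import math

def gaussian_residual_cost(residual, sigma):
    """Bits needed to encode one residual under a Gaussian sensor model
    (up to a constant resolution term, which cancels when comparing models)."""
    return 0.5 * math.log2(2 * math.pi * sigma**2) + residual**2 / (2 * sigma**2 * math.log(2))

def representation_size(model_params, observations, predict, sigmas, bits_per_param=32):
    """Total representation size: model cost plus the encoding cost of each
    sensor's residuals under that sensor's uncertainty model."""
    model_cost = bits_per_param * len(model_params)
    data_cost = 0.0
    for sensor_id, values in observations.items():
        sigma = sigmas[sensor_id]
        for obs in values:
            residual = obs - predict(model_params, sensor_id)
            data_cost += gaussian_residual_cost(residual, sigma)
    return model_cost + data_cost

# Toy usage: two candidate interpretations of a scalar environment parameter,
# observed by a "camera" and a "touch probe" with different noise levels.
observations = {"camera": [1.02, 0.98, 1.05], "touch": [1.10]}
sigmas = {"camera": 0.05, "touch": 0.02}
predict = lambda params, sensor: params[0]   # both sensors observe the same quantity

for candidate in ([1.0], [1.1]):
    print(candidate, representation_size(candidate, observations, predict=predict, sigmas=sigmas))
```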

In this book, the general theory of minimal representation multisensor fusion is developed and applied in a series of experimental studies of sensor-based robot manipulation. A novel application of differential evolutionary computation is introduced to achieve practical and effective solutions to this difficult computational problem.
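For orientation, the sketch below shows a basic differential evolution loop (the common DE/rand/1/bin variant), the kind of evolutionary search referred to here. The population size, control parameters F and CR, and the quadratic toy objective are illustrative assumptions, not the book's experimental setup; in the book's context the objective would be a representation-size criterion over candidate interpretations.

```python
# A minimal sketch of differential evolution (DE/rand/1/bin).

import numpy as np

def differential_evolution(objective, bounds, pop_size=20, F=0.8, CR=0.9, generations=100, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T        # bounds: one (low, high) pair per dimension
    dim = len(bounds)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))   # random initial population
    fitness = np.array([objective(x) for x in pop])

    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three distinct individuals other than i.
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover with the current individual.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # Greedy selection: keep the trial only if it improves the objective.
            f_trial = objective(trial)
            if f_trial < fitness[i]:
                pop[i], fitness[i] = trial, f_trial
    best = np.argmin(fitness)
    return pop[best], fitness[best]

# Toy usage: recover a 3-D parameter vector minimizing a quadratic objective.
target = np.array([1.0, -2.0, 0.5])
best_x, best_f = differential_evolution(lambda x: float(np.sum((x - target) ** 2)),
                                         bounds=[(-5, 5)] * 3)
print(best_x, best_f)
```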

Contents

Preface ... 1
Multisensor Fusion in Object Recognition ... 43
Minimal Representation ... 57
Environment and Sensor Models ... 75
Minimal Representation Multisensor Fusion ... 91
Multisensor Data Fusion ... 125
Applying the Abstract Framework to Concrete ... 151
Multisensor Object Recognition in Three Dimensions ... 173
Laboratory Experiments ... 203
Discussion of Experimental Results ... 225
Conclusion ... 237
Appendix A  List of Symbols ... 247
Error Residuals ... 257
Appendix E  Properties of Mixture Representation Size ... 263
Appendix H  Quaternion Algebra ... 275
