
Sensor fusion in dynamical systems - applications and research challenges
Thomas B. Schön

Citation
Thomas B. Schön. "Sensor fusion in dynamical systems - applications and research challenges". Talk or presentation, 11, December, 2012.

Abstract
Sensor fusion refers to the problem of computing state estimates using measurements from several different, often complementary, sensors. The strategy is explained and (perhaps more importantly) illustrated using four different industrial/research applications, very briefly introduced below. Guided partly by these applications, we will highlight key directions for future research within the area of sensor fusion. Given that the number of available sensors is skyrocketing, this technology is likely to become even more important in the future. The four applications are: 1. Real-time pose estimation and autonomous landing of a helicopter (using inertial sensors and a camera). 2. Pose estimation of a helicopter using an already existing map (a processed version of an aerial photograph of the operational area), inertial sensors and a camera. 3. Vehicle motion and road surface estimation (using inertial sensors, a steering wheel sensor and an infrared camera). 4. Indoor pose estimation of a human body (using inertial sensors and ultra-wideband).
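
To make the abstract's core idea concrete, the sketch below shows one common way of fusing two complementary sensors with a linear Kalman filter: a high-rate inertial-style acceleration signal drives the prediction step, while a low-rate camera-style position measurement drives the correction step. This is an illustrative sketch only; the model, noise levels and sample rates are assumptions and are not taken from the talk.

# Minimal sketch (not from the talk): Kalman-filter fusion of an inertial-like
# acceleration input with a camera-like position measurement. All numbers below
# are illustrative assumptions.
import numpy as np

dt = 0.01                      # sample time (assumed)
F = np.array([[1.0, dt],       # constant-velocity model, state x = [position, velocity]
              [0.0, 1.0]])
B = np.array([[0.5 * dt**2],   # measured acceleration enters as a known input
              [dt]])
H = np.array([[1.0, 0.0]])     # the camera-like sensor observes position only
Q = 1e-3 * np.eye(2)           # process noise covariance (assumed)
R = np.array([[1e-2]])         # measurement noise covariance (assumed)

x = np.zeros((2, 1))           # state estimate
P = np.eye(2)                  # estimate covariance

def predict(acc):
    """Time update driven by the inertial (acceleration) measurement."""
    global x, P
    x = F @ x + B * acc
    P = F @ P @ F.T + Q

def update(pos):
    """Measurement update using the camera-like position measurement."""
    global x, P
    y = np.array([[pos]]) - H @ x        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

# Example run: noisy constant acceleration, position fixes at a lower rate.
rng = np.random.default_rng(0)
for k in range(500):
    predict(acc=1.0 + 0.1 * rng.standard_normal())    # high-rate inertial data
    if k % 10 == 0:                                   # low-rate camera-like fix
        true_pos = 0.5 * 1.0 * (k * dt) ** 2
        update(pos=true_pos + 0.1 * rng.standard_normal())

print("fused position/velocity estimate:", x.ravel())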

Citation formats
  • HTML
    Thomas B. Schön. <a href="http://chess.eecs.berkeley.edu/pubs/955.html"><i>Sensor fusion in dynamical systems - applications and research challenges</i></a>, Talk or presentation, 11, December, 2012.
  • Plain text
    Thomas B. Schön. "Sensor fusion in dynamical systems - applications and research challenges". Talk or presentation, 11, December, 2012.
  • BibTeX
    @presentation{Schn12_SensorFusionInDynamicalSystemsApplicationsResearchChallenges,
        author = {Thomas B. Schön},
        title = {Sensor fusion in dynamical systems - applications
                  and research challenges},
        day = {11},
        month = {December},
        year = {2012},
        abstract = {Sensor fusion refers to the problem of computing
                  state estimates using measurements from several
                  different, often complementary, sensors. The
                  strategy is explained and (perhaps more
                  importantly) illustrated using four different
                  industrial/research applications, very briefly
                  introduced below. Guided partly by these
                  applications, we will highlight key directions for
                  future research within the area of sensor fusion.
                  Given that the number of available sensors is
                  skyrocketing, this technology is likely to become
                  even more important in the future. The four
                  applications are: 1. Real-time pose estimation and
                  autonomous landing of a helicopter (using inertial
                  sensors and a camera). 2. Pose estimation of a
                  helicopter using an already existing map (a
                  processed version of an aerial photograph of the
                  operational area), inertial sensors and a camera.
                  3. Vehicle motion and road surface estimation
                  (using inertial sensors, a steering wheel sensor
                  and an infrared camera). 4. Indoor pose estimation
                  of a human body (using inertial sensors and
                  ultra-wideband).},
        URL = {http://chess.eecs.berkeley.edu/pubs/955.html}
    }
    

Posted by David Broman on 12 Dec 2012.