
Technical report from Automatic Control at Linköpings universitet

Sensor Fusion for Augmented Reality

Fredrik Gustafsson, Thomas B. Schön, Jeroen D. Hol

Division of Automatic Control

E-mail: fredrik@isy.liu.se, schon@isy.liu.se, hol@isy.liu.se

8th January 2009

Report no.: LiTH-ISY-R-2875

Accepted for publication in Proceedings of the 17th IFAC World Congress

Address:

Department of Electrical Engineering, Linköpings universitet

SE-581 83 Linköping, Sweden

WWW: http://www.control.isy.liu.se


Technical reports from the Automatic Control group in Linköping are available from http://www.control.isy.liu.se/publications.


Abstract

The problem of estimating the position and orientation (pose) of a camera is approached by fusing measurements from inertial sensors (accelerometers and rate gyroscopes) and a camera. The sensor fusion approach described in this contribution is based on nonlinear filtering using the measurements from these complementary sensors. This way, accurate and robust pose estimates are available for the primary purpose of augmented reality applications, but with the secondary effect of reducing computation time and improving the performance in vision processing. A real-time implementation of a nonlinear filter is described, using a dynamic model for the 22 states, where 100 Hz inertial measurements and 12.5 Hz vision measurements are processed. An example where an industrial robot is used to move the sensor unit, possessing almost perfect precision and repeatability, is presented. The results show that position and orientation accuracy is sufficient for a number of augmented reality applications.

Keywords: Sensor fusion, nonlinear filtering, tracking, Kalman filter, augmented reality.


Sensor Fusion for Augmented Reality ⋆

Fredrik Gustafsson, Thomas B. Schön, Jeroen D. Hol∗

∗Division of Automatic Control, Linköping University, SE-581 83 Linköping, Sweden (e-mail: {fredrik, schon, hol}@isy.liu.se)

Abstract: The problem of estimating the position and orientation (pose) of a camera is approached by fusing measurements from inertial sensors (accelerometers and rate gyroscopes) and a camera. The sensor fusion approach described in this contribution is based on nonlinear filtering using the measurements from these complementary sensors. This way, accurate and robust pose estimates are available for the primary purpose of augmented reality applications, but with the secondary effect of reducing computation time and improving the performance in vision processing. A real-time implementation of a nonlinear filter is described, using a dynamic model for the 22 states, where 100 Hz inertial measurements and 12.5 Hz vision measurements are processed. An example where an industrial robot is used to move the sensor unit, possessing almost perfect precision and repeatability, is presented. The results show that position and orientation accuracy is sufficient for a number of augmented reality applications.

Keywords: Sensor fusion, nonlinear filtering, tracking, Kalman filter, augmented reality

1. EXTENDED ABSTRACT

This contribution deals with estimating the position and orientation (pose) of a camera in real-time, using measurements from inertial sensors (accelerometers and rate gyroscopes) and a camera. A system has been developed to solve this problem in unprepared environments, assuming that a map or scene model is available. For a more detailed description of the overall system and the model building we refer to Hol et al. [2007].

Existing pose tracking algorithms in the literature stem from the computer vision community, where the primary information is taken from the image stream. IMUs are in some studies [Davison et al., 2007] used as a supporting sensor. We have taken the reverse approach, inspired by navigation systems in aircraft: the IMU is the primary sensor, and vision provides the supporting information, used in a Kalman filter to stabilize the inherent drift that arises when integrating IMU measurements of accelerations and angular velocities. This system design has proved to give a pose estimate that is both accurate and very robust to occlusion and fast camera movements. As a side effect it also decreases the computational burden of the computer vision algorithms, since an accurate prior estimate of feature locations in the image can be computed.
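
The IMU-primary structure can be sketched as a Kalman filter whose time update runs at the inertial rate and whose measurement update fires only when a vision frame arrives. The following is a minimal sketch under our own simplifying assumptions, not the authors' implementation: a 1-D constant-acceleration toy with state x = [position, velocity] stands in for the paper's 22-dimensional model, with 100 Hz IMU accelerations driving the prediction and 12.5 Hz camera position fixes (every 8th sample) correcting the drift.

```python
import numpy as np

dt = 0.01                                # 100 Hz IMU rate
F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity dynamics
B = np.array([[0.5 * dt**2], [dt]])      # how acceleration enters the state
H = np.array([[1.0, 0.0]])               # the camera observes position only
Q = 1e-4 * np.eye(2)                     # process noise: IMU integration errors
R = np.array([[1e-2]])                   # vision measurement noise

x = np.zeros((2, 1))                     # state estimate
P = np.eye(2)                            # state covariance
rng = np.random.default_rng(0)
a_true = 1.0                             # constant true acceleration (toy)

for k in range(800):                     # 8 seconds of data
    acc = a_true + rng.normal(0, 0.01)   # noisy IMU acceleration sample
    # Time update at the IMU rate: integrate acceleration, grow covariance.
    x = F @ x + B * acc
    P = F @ P @ F.T + Q
    if (k + 1) % 8 == 0:                 # a vision frame arrives (12.5 Hz)
        t = (k + 1) * dt
        z = 0.5 * a_true * t**2 + rng.normal(0, 0.1)  # camera position fix
        S = H @ P @ H.T + R              # innovation covariance
        K = P @ H.T / S                  # Kalman gain (scalar measurement)
        x = x + K * (z - (H @ x).item()) # vision correction stabilizes drift
        P = (np.eye(2) - K @ H) @ P
```

Between vision frames the estimate coasts on integrated IMU data, which is what keeps the filter robust to occlusion; the predicted state also gives the prior feature locations mentioned above.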

A further fundamental contribution is to use SLAM (simultaneous localization and mapping) for on-line learning of a scene model, which is required for the pose estimator. SLAM [Thrun et al., 2005] was developed in the mobile robotics community, where only a few dynamical states (typically three) are used, while we have developed marginalization techniques enabling complex dynamic models with a high-dimensional (22 in this application) state vector.
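
The key algebraic tool behind such marginalization can be illustrated in a few lines. This is our own toy sketch, not the paper's algorithm: marginalizing variables out of a Gaussian in information form via the Schur complement, which in SLAM lets landmark states be folded into the prior so the filter can carry a rich platform model (22 states here) without the state growing with the map.

```python
import numpy as np

# Joint Gaussian in information form over [kept states a | removed states b].
L = np.array([[4.0, 1.0, 0.5],
              [1.0, 3.0, 0.2],
              [0.5, 0.2, 2.0]])          # information (inverse covariance) matrix
eta = np.array([1.0, 0.5, 0.3])          # information vector

na = 2                                   # keep the first two states
Laa, Lab = L[:na, :na], L[:na, na:]
Lba, Lbb = L[na:, :na], L[na:, na:]

# Schur complement: exact marginal over the kept states.
Lbb_inv = np.linalg.inv(Lbb)
L_marg = Laa - Lab @ Lbb_inv @ Lba
eta_marg = eta[:na] - Lab @ Lbb_inv @ eta[na:]
```

In covariance form, marginalization is just dropping rows and columns; the Schur complement is the information-form equivalent, which is what makes retiring old landmark states cheap while the high-dimensional platform dynamics are kept intact.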

⋆ This work has been performed within the MATRIS consortium, which is a sixth framework research program within the European Union (EU), contract number: IST-002013.

The supporting video illustrates:

• Broadcasting applications of augmented reality in sports, news and entertainment.

• The difference between augmented and virtual reality.

• Prior state of the art in the form of expensive marker-based systems.

• The individual contributions from the MATRIS consortium, explaining and providing an overview of the MATRIS system.

• In particular, the IMU sensor, which is the core of the hardware, shown in two different versions. The first one is to be attached to the film camera. The second one can be used standalone, as it integrates synchronized and calibrated three-dimensional accelerometers, gyroscopes, magnetometers and one miniature camera in a single housing.

• Our development platform, where the sensor kit is attached to an industrial robot. In this way, pre-programmed motions can be repeated indefinitely in a Monte Carlo fashion, where for instance robustness to changes in lighting and scene can be evaluated.

• Finally, the performance of the MATRIS demonstrator is exemplified.

The performance of the developed pose tracking algorithm satisfies the television performance requirements of less than two centimeters and one degree error at all times, independent of the camera motion.

REFERENCES

Davison, A. J., Reid, I., Molton, N., and Stasse, O. (2007). MonoSLAM: Real-time single camera SLAM. IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(6):1052–1067.

Thrun, S., Burgard, W., and Fox, D. (2005). Probabilistic Robotics. Intelligent Robotics and Autonomous Agents. The MIT Press, Cambridge, MA, USA.

Hol, J. D., Schön, T. B., Luinge, H., Slycke, P., and Gustafsson, F. (2007). Enabling real-time tracking by fusing measurements from inertial and vision sensors. Journal of Real-Time Image Processing.
