
Introduction to autonomous mobile robots

Mobile robots range from the Sojourner of the Mars Pathfinder mission to the cleaning robots in the Paris Metro. This text offers students and other interested readers an introduction to the fundamentals of mobile robotics, covering the mechanical, motor, sensory, and cognitive layers…


Bibliographic Details
Main Author: Siegwart, Roland
Other Authors: Nourbakhsh, Illah Reza, 1970-; Scaramuzza, Davide
Format: Book
Language: English
Published: Cambridge, Mass. : MIT Press, 2011, ©2011.
Edition: 2nd ed.
Series: Intelligent robotics and autonomous agents
Subjects:
Table of Contents:
  • 1. Introduction
  • 1.1. Introduction
  • 1.2. An Overview of the Book
  • 2. Locomotion
  • 2.1. Introduction
  • 2.1.1. Key issues for locomotion
  • 2.2. Legged Mobile Robots
  • 2.2.1. Leg configurations and stability
  • 2.2.2. Consideration of dynamics
  • 2.2.3. Examples of legged robot locomotion
  • 2.3. Wheeled Mobile Robots
  • 2.3.1. Wheeled locomotion: The design space
  • 2.3.2. Wheeled locomotion: Case studies
  • 2.4. Aerial Mobile Robots
  • 2.4.1. Introduction
  • 2.4.2. Aircraft configurations
  • 2.4.3. State of the art in autonomous VTOL
  • 2.5. Problems
  • 3. Mobile Robot Kinematics
  • 3.1. Introduction
  • 3.2. Kinematic Models and Constraints
  • 3.2.1. Representing robot position
  • 3.2.2. Forward kinematic models
  • 3.2.3. Wheel kinematic constraints
  • 3.2.4. Robot kinematic constraints
  • 3.2.5. Examples: Robot kinematic models and constraints
  • 3.3. Mobile Robot Maneuverability
  • 3.3.1. Degree of mobility
  • 3.3.2. Degree of steerability
  • 3.3.3. Robot maneuverability
  • 3.4. Mobile Robot Workspace
  • 3.4.1. Degrees of freedom
  • 3.4.2. Holonomic robots
  • 3.4.3. Path and trajectory considerations
  • 3.5. Beyond Basic Kinematics
  • 3.6. Motion Control (Kinematic Control)
  • 3.6.1. Open loop control (trajectory-following)
  • 3.6.2. Feedback control
  • 3.7. Problems
  • 4. Perception
  • 4.1. Sensors for Mobile Robots
  • 4.1.1. Sensor classification
  • 4.1.2. Characterizing sensor performance
  • 4.1.3. Representing uncertainty
  • 4.1.4. Wheel/motor sensors
  • 4.1.5. Heading sensors
  • 4.1.6. Accelerometers
  • 4.1.7. Inertial measurement unit (IMU)
  • 4.1.8. Ground beacons
  • 4.1.9. Active ranging
  • 4.1.10. Motion/speed sensors
  • 4.1.11. Vision sensors
  • 4.2. Fundamentals of Computer Vision
  • 4.2.1. Introduction
  • 4.2.2. The digital camera
  • 4.2.3. Image formation
  • 4.2.4. Omnidirectional cameras
  • 4.2.5. Structure from stereo
  • 4.2.6. Structure from motion
  • 4.2.7. Motion and optical flow
  • 4.2.8. Color tracking
  • 4.3. Fundamentals of Image Processing
  • 4.3.1. Image filtering
  • 4.3.2. Edge detection
  • 4.3.3. Computing image similarity
  • 4.4. Feature Extraction
  • 4.5. Image Feature Extraction: Interest Point Detectors
  • 4.5.1. Introduction
  • 4.5.2. Properties of the ideal feature detector
  • 4.5.3. Corner detectors
  • 4.5.4. Invariance to photometric and geometric changes
  • 4.5.5. Blob detectors
  • 4.6. Place Recognition
  • 4.6.1. Introduction
  • 4.6.2. From bag of features to visual words
  • 4.6.3. Efficient location recognition by using an inverted file
  • 4.6.4. Geometric verification for robust place recognition
  • 4.6.5. Applications
  • 4.6.6. Other image representations for place recognition
  • 4.7. Feature Extraction Based on Range Data (Laser, Ultrasonic)
  • 4.7.1. Line fitting
  • 4.7.2. Six line-extraction algorithms
  • 4.7.3. Range histogram features
  • 4.7.4. Extracting other geometric features
  • 4.8. Problems
  • 5. Mobile Robot Localization
  • 5.1. Introduction
  • 5.2. The Challenge of Localization: Noise and Aliasing
  • 5.2.1. Sensor noise
  • 5.2.2. Sensor aliasing
  • 5.2.3. Effector noise
  • 5.2.4. An error model for odometric position estimation
  • 5.3. To Localize or Not to Localize: Localization-Based Navigation Versus Programmed Solutions
  • 5.4. Belief Representation
  • 5.4.1. Single-hypothesis belief
  • 5.4.2. Multiple-hypothesis belief
  • 5.5. Map Representation
  • 5.5.1. Continuous representations
  • 5.5.2. Decomposition strategies
  • 5.5.3. State of the art: Current challenges in map representation
  • 5.6. Probabilistic Map-Based Localization
  • 5.6.1. Introduction
  • 5.6.2. The robot localization problem
  • 5.6.3. Basic concepts of probability theory
  • 5.6.4. Terminology
  • 5.6.5. The ingredients of probabilistic map-based localization
  • 5.6.6. Classification of localization problems
  • 5.6.7. Markov localization
  • 5.6.8. Kalman filter localization
  • 5.7. Other Examples of Localization Systems
  • 5.7.1. Landmark-based navigation
  • 5.7.2. Globally unique localization
  • 5.7.3. Positioning beacon systems
  • 5.7.4. Route-based localization
  • 5.8. Autonomous Map Building
  • 5.8.1. Introduction
  • 5.8.2. SLAM: The simultaneous localization and mapping problem
  • 5.8.3. Mathematical definition of SLAM
  • 5.8.4. Extended Kalman Filter (EKF) SLAM
  • 5.8.5. Visual SLAM with a single camera
  • 5.8.6. Discussion on EKF SLAM
  • 5.8.7. Graph-based SLAM
  • 5.8.8. Particle filter SLAM
  • 5.8.9. Open challenges in SLAM
  • 5.8.10. Open source SLAM software and other resources
  • 5.9. Problems
  • 6. Planning and Navigation
  • 6.1. Introduction
  • 6.2. Competences for Navigation: Planning and Reacting
  • 6.3. Path Planning
  • 6.3.1. Graph search
  • 6.3.2. Potential field path planning
  • 6.4. Obstacle Avoidance
  • 6.4.1. Bug algorithm
  • 6.4.2. Vector field histogram
  • 6.4.3. The bubble band technique
  • 6.4.4. Curvature velocity techniques
  • 6.4.5. Dynamic window approaches
  • 6.4.6. The Schlegel approach to obstacle avoidance
  • 6.4.7. Nearness diagram
  • 6.4.8. Gradient method
  • 6.4.9. Adding dynamic constraints
  • 6.4.10. Other approaches
  • 6.4.11. Overview
  • 6.5. Navigation Architectures
  • 6.5.1. Modularity for code reuse and sharing
  • 6.5.2. Control localization
  • 6.5.3. Techniques for decomposition
  • 6.5.4. Case studies: Tiered robot architectures.