
Embedded Robotics

Thomas Bräunl

EMBEDDED ROBOTICS

Mobile Robot Design and Applications

with Embedded Systems

Second Edition

With 233 Figures and 24 Tables


Thomas Bräunl

School of Electrical, Electronic and Computer Engineering

The University of Western Australia

35 Stirling Highway

Crawley, Perth, WA 6009

Australia

Library of Congress Control Number: 2006925479

ACM Computing Classification (1998): I.2.9, C.3

ISBN-10 3-540-34318-0 Springer Berlin Heidelberg New York

ISBN-13 978-3-540-34318-9 Springer Berlin Heidelberg New York

ISBN-10 3-540-03436-6 1st Edition Springer Berlin Heidelberg New York

This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilm or in any other way, and storage in data banks. Duplication of this publication or parts thereof is permitted only under the provisions of the German Copyright Law of September 9, 1965, in its current version, and permission for use must always be obtained from Springer. Violations are liable for prosecution under the German Copyright Law.

Springer is a part of Springer Science+Business Media

springer.com

© Springer-Verlag Berlin Heidelberg 2003, 2006

Printed in Germany

The use of general descriptive names, registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

Typesetting: Camera-ready by the author

Production: LE-TEX Jelonek, Schmidt & Vöckler GbR, Leipzig

Cover design: KünkelLopka, Heidelberg

PREFACE

It all started with a new robot lab course I had developed to accompany my robotics lectures. We already had three large, heavy, and expensive mobile robots for research projects, but nothing simple and safe that we could give to students to practice on in an introductory course.

We selected a mobile robot kit based on an 8-bit controller, and used it for the first couple of years of this course. This gave students not only the enjoyment of working with real robots but, more importantly, hands-on experience with control systems, real-time systems, concurrency, fault tolerance, sensor and motor technology, etc. It was a very successful lab and was greatly enjoyed by the students. Typical tasks were, for example, driving straight, finding a light source, or following a leading vehicle. Since the robots were rather inexpensive, it was possible to furnish a whole lab with them and to conduct multi-robot experiments as well.

Simplicity, however, had its drawbacks. The robot mechanics were unreliable, the sensors were quite poor, and extendability and processing power were very limited. What we wanted was a similar robot at a more advanced level: the processing power had to be reasonably fast, it had to use precision motors and sensors, and – most challenging – the robot had to be able to do on-board image processing. This had never been accomplished before on a robot of such a small size (about 12cm × 9cm × 14cm). Appropriately, the robot project was called “EyeBot”. It consisted of a full 32-bit controller (“EyeCon”), interfacing directly to a digital camera (“EyeCam”) and a large graphics display for visual feedback. A row of user buttons below the LCD was included as “soft keys” to allow a simple user interface, which most other mobile robots lack. The processing power of the controller is about 1,000 times that of robots based on most 8-bit controllers (25MHz processor speed versus 1MHz, 32-bit data width versus 8-bit, compiled C code versus interpretation), and this does not even take into account special CPU features like the “time processor unit” (TPU).

The EyeBot family includes several driving robots with differential steering, tracked vehicles, omni-directional vehicles, balancing robots, six-legged walkers, biped android walkers, and autonomous flying and underwater robots, as well as simulation systems for driving robots (“EyeSim”) and underwater robots (“SubSim”). EyeCon controllers are used in several other projects, with and without mobile robots. Numerous universities use EyeCons to drive their own mobile robot creations. We use boxed EyeCons for experiments in a second-year course in Embedded Systems as part of the Electrical Engineering, Information Technology, and Mechatronics curricula. And one lonely EyeCon controller sits on a pole on Rottnest Island off the coast of Western Australia, taking care of a local weather station.

Acknowledgements

While the controller hardware and robot mechanics were developed commercially, several universities and numerous students contributed to the EyeBot software collection. The universities involved in the EyeBot project are:

University of Stuttgart, Germany

University of Kaiserslautern, Germany

Rochester Institute of Technology, USA

The University of Auckland, New Zealand

The University of Manitoba, Winnipeg, Canada

The University of Western Australia (UWA), Perth, Australia

The author would like to thank the following students, technicians, and colleagues: Gerrit Heitsch, Thomas Lampart, Jörg Henne, Frank Sautter, Elliot Nicholls, Joon Ng, Jesse Pepper, Richard Meager, Gordon Menck, Andrew McCandless, Nathan Scott, Ivan Neubronner, Waldemar Spädt, Petter Reinholdtsen, Birgit Graf, Michael Kasper, Jacky Baltes, Peter Lawrence, Nan Schaller, Walter Bankes, Barb Linn, Jason Foo, Alistair Sutherland, Joshua Petitt, Axel Waggershauser, Alexandra Unkelbach, Martin Wicke, Tee Yee Ng, Tong An, Adrian Boeing, Courtney Smith, Nicholas Stamatiou, Jonathan Purdie, Jippy Jungpakdee, Daniel Venkitachalam, Tommy Cristobal, Sean Ong, and Klaus Schmitt.

Thanks for proofreading the manuscript and numerous suggestions go to Marion Baer, Linda Barbour, Adrian Boeing, Michael Kasper, Joshua Petitt, Klaus Schmitt, Sandra Snook, Anthony Zaknich, and everyone at Springer-Verlag.

Contributions

A number of colleagues and former students contributed to this book. The author would like to thank everyone for their effort in putting the material together.

JACKY BALTES The University of Manitoba, Winnipeg, contributed to the section on PID control,


ADRIAN BOEING UWA, coauthored the chapters on the evolution of walking gaits and genetic algorithms, and contributed to the section on SubSim,

CHRISTOPH BRAUNSCHÄDEL FH Koblenz, contributed data plots to the sections on PID control and on/off control,

MICHAEL DRTIL FH Koblenz, contributed to the chapter on AUVs,

LOUIS GONZALEZ UWA, contributed to the chapter on AUVs,

BIRGIT GRAF Fraunhofer IPA, Stuttgart, coauthored the chapter on robot soccer,

HIROYUKI HARADA Hokkaido University, Sapporo, contributed the visualization diagrams to the section on biped robot design,

YVES HWANG UWA, coauthored the chapter on genetic programming,

PHILIPPE LECLERCQ UWA, contributed to the section on color segmentation,

JAMES NG UWA, coauthored the sections on probabilistic localization and the DistBug navigation algorithm,

JOSHUA PETITT UWA, contributed to the section on DC motors,

KLAUS SCHMITT Univ. Kaiserslautern, coauthored the section on the RoBIOS operating system,

ALISTAIR SUTHERLAND UWA, coauthored the chapter on balancing robots,

NICHOLAS TAY DSTO, Canberra, coauthored the chapter on map generation,

DANIEL VENKITACHALAM UWA, coauthored the chapters on genetic algorithms and behavior-based systems and contributed to the chapter on neural networks.

EYESIM was implemented by Axel Waggershauser (V5) and Andreas Koestler (V6), UWA, Univ. Kaiserslautern, and FH Giessen.

SUBSIM was implemented by Adrian Boeing, Andreas Koestler, and Joshua Petitt (V1), and Thorsten Rühl and Tobias Bielohlawek (V2), UWA, FH Giessen, and Univ. Kaiserslautern.

Additional Material

Hardware and mechanics of the “EyeCon” controller and various robots of the EyeBot family are available from INROSOFT and various distributors:

http://inrosoft.com

All system software discussed in this book, the RoBIOS operating system, C/C++ compilers for Linux and Windows, system tools, image processing tools, simulation system, and a large collection of example programs are available free from:

http://robotics.ee.uwa.edu.au/eyebot/


Lecturers who adopt this book for a course can receive a full set of the author’s course notes (PowerPoint slides), tutorials, and labs from this website. And finally, if you have developed some robot application programs you would like to share, please feel free to submit them to our website.

Second Edition

Less than three years have passed since this book was first published, and I have since used it successfully in courses on Embedded Systems and on Mobile Robots / Intelligent Systems. Both courses are accompanied by hands-on lab sessions using the EyeBot controllers and robot systems, which the students found most interesting and which I believe contribute significantly to the learning process.

What started as a few minor changes and corrections to the text turned into a major rework, and additional material has been added in several areas. A new chapter on autonomous vessels and underwater vehicles and a new section on AUV simulation have been added, the material on localization and navigation has been extended and moved to a separate chapter, the kinematics sections for driving and omni-directional robots have been updated, and a couple of chapters have been shifted to the Appendix.

Again, I would like to thank all students and visitors who conducted research and development work in my lab and contributed to this book in one form or another.

All software presented in this book, especially the EyeSim and SubSim simulation systems, can be freely downloaded from:

http://robotics.ee.uwa.edu.au

Perth, Australia, June 2006

Thomas Bräunl


CONTENTS


PART I: EMBEDDED SYSTEMS

 

1 Robots and Controllers

3

1.1 Mobile Robots . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4

1.2 Embedded Controllers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

1.3 Interfaces . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10

1.4 Operating System. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13

1.5 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15

2 Sensors

17

2.1 Sensor Categories . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18

2.2 Binary Sensor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

2.3 Analog versus Digital Sensors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

2.4 Shaft Encoder . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

2.5 A/D Converter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22

2.6 Position Sensitive Device . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23

2.7 Compass . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

2.8 Gyroscope, Accelerometer, Inclinometer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27

2.9 Digital Camera . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30

2.10 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38

3 Actuators

41

3.1 DC Motors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41

3.2 H-Bridge . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44

3.3 Pulse Width Modulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46

3.4 Stepper Motors. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48

3.5 Servos . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49

3.6 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50

4 Control

51

4.1 On-Off Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51

4.2 PID Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56

4.3 Velocity Control and Position Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62

4.4 Multiple Motors – Driving Straight . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63

4.5 V-Omega Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66

4.6 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68


5 Multitasking

69

5.1 Cooperative Multitasking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69

5.2 Preemptive Multitasking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71

5.3 Synchronization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73

5.4 Scheduling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77

5.5 Interrupts and Timer-Activated Tasks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80

5.6 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82

6 Wireless Communication

83

6.1 Communication Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84

6.2 Messages . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86

6.3 Fault-Tolerant Self-Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87

6.4 User Interface and Remote Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89

6.5 Sample Application Program. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92

6.6 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93

PART II: MOBILE ROBOT DESIGN

7 Driving Robots

97

7.1 Single Wheel Drive . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97

7.2 Differential Drive. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98

7.3 Tracked Robots . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102

7.4 Synchro-Drive . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103

7.5 Ackermann Steering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105

7.6 Drive Kinematics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107

7.7 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111

8 Omni-Directional Robots

113

8.1 Mecanum Wheels . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113

8.2 Omni-Directional Drive. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115

8.3 Kinematics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117

8.4 Omni-Directional Robot Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 118

8.5 Driving Program . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119

8.6 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120

9 Balancing Robots

123

9.1 Simulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123

9.2 Inverted Pendulum Robot . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124

9.3 Double Inverted Pendulum . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 128

9.4 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 129

10 Walking Robots

131

10.1 Six-Legged Robot Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 131

10.2 Biped Robot Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 134

10.3 Sensors for Walking Robots . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 139

10.4 Static Balance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 140


10.5 Dynamic Balance. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 143

10.6 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148

11 Autonomous Planes

151

11.1 Application . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 151

11.2 Control System and Sensors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 154

11.3 Flight Program . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 155

11.4 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 159

12 Autonomous Vessels and Underwater Vehicles

161

12.1 Application . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 161

12.2 Dynamic Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163

12.3 AUV Design Mako . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163

12.4 AUV Design USAL. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167

12.5 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 170

13 Simulation Systems

171

13.1 Mobile Robot Simulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171

13.2 EyeSim Simulation System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 172

13.3 Multiple Robot Simulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 177

13.4 EyeSim Application. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 178

13.5 EyeSim Environment and Parameter Files . . . . . . . . . . . . . . . . . . . . . . . . . . . 179

13.6 SubSim Simulation System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 184

13.7 Actuator and Sensor Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 186

13.8 SubSim Application. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 188

13.9 SubSim Environment and Parameter Files . . . . . . . . . . . . . . . . . . . . . . . . . . . 190

13.10 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 193

PART III: MOBILE ROBOT APPLICATIONS

14 Localization and Navigation

197

14.1 Localization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 197

14.2 Probabilistic Localization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 201

14.3 Coordinate Systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 205

14.4 Dijkstra’s Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 206

14.5 A* Algorithm. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 210

14.6 Potential Field Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 211

14.7 Wandering Standpoint Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 212

14.8 DistBug Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 213

14.9 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 215

15 Maze Exploration

217

15.1 Micro Mouse Contest . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 217

15.2 Maze Exploration Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 219

15.3 Simulated versus Real Maze Program . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 226

15.4 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 228
