Embedded Robotics (Thomas Braunl, 2nd ed., 2006)


d. Search the row number with the maximum count value.

e. Search the column number with the maximum count value.

f. These two values are the object's image coordinates.

EXPERIMENT 23 Object Tracking

Extending the previous experiment, we want the robot to follow the detected object. For this task, we should extend the detection process to also return the size of the detected object, which we can translate into an object distance, provided we know the size of the object.
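Translating apparent size into distance follows from the intercept theorem: for a pinhole camera, distance scales with the object's real size divided by its size in the image. A minimal sketch (the function name and constant values below are my own, not the book's):

```c
/* Distance estimation from apparent object size (a sketch; the
 * numbers in the usage comment assume a golf ball of 4.3 cm diameter
 * and a focal length of 80 pixels -- both hypothetical values). */
double object_distance(double focal_len_px,  /* focal length [pixels] */
                       double real_diam_m,   /* known object size [m] */
                       int    image_diam_px) /* measured size [pixels] */
{
    if (image_diam_px <= 0)
        return -1.0; /* object not detected */
    return focal_len_px * real_diam_m / (double)image_diam_px;
}
```

With the assumed values, a ball 8 pixels wide would be estimated at about 0.43 m.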

Once an object has been detected, the robot should “lock onto” the object and drive toward it, trying to maintain the object’s center in the center of its viewing field.

A nice application of this technique is having a robot detect and track either a golf ball or a tennis ball. This application can be extended by introducing a ball kicking motion and can finally lead to robot soccer.

Consider a number of techniques the robot could use to search for the object again once it has lost it.
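The "lock on and center" behavior can be sketched as a proportional controller that maps the object's image column to a turn rate. This is an illustration only; the image width matches the low-resolution EyeCam images used in the book, but the gain and function name are assumptions:

```c
/* Visual-servoing sketch: steer so the detected object stays in the
 * image center. Positive error (object right of center) yields a
 * negative (clockwise) turn rate. K_TURN is an assumed gain. */
#define IMG_WIDTH 82    /* EyeCam low-resolution image width */
#define K_TURN    0.02  /* proportional gain (hypothetical) */

double track_turn_rate(int obj_col)
{
    int error = obj_col - IMG_WIDTH / 2; /* pixels off-center */
    return -K_TURN * (double)error;      /* turn toward the object */
}
```

In a driving loop this turn rate would be fed to the v-omega interface together with a forward speed derived from the estimated object distance.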

Lab 11 Robot Groups

Now we have a number of robots interacting with each other.

EXPERIMENT 24 Following a Leading Robot

Program a robot to drive along a path made of random curves while still avoiding obstacles.

Program a second robot to follow the first robot. Detecting the leading robot can be done by using either infrared sensors or the camera, assuming the leading robot is the only moving object in the following robot’s field of view.
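The follower's forward speed can be regulated proportionally to the measured gap to the leader, so it slows down as it closes in. A sketch under assumed parameters (GAP_TARGET, K_V, and the function name are hypothetical, not from the book):

```c
/* Leader-following speed control sketch: drive faster the larger the
 * measured gap, stop when at or inside the target distance. */
#define GAP_TARGET 0.30 /* desired following distance [m] (assumed) */
#define K_V        0.5  /* proportional gain (assumed) */

double follow_speed(double gap_m)
{
    double v = K_V * (gap_m - GAP_TARGET);
    if (v < 0.0)
        v = 0.0; /* never drive backward into the leader */
    return v;
}
```

Combined with the turn-rate control from the previous experiment, this keeps the follower behind the leader at a roughly constant distance.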

EXPERIMENT 25 Foraging

A group of robots has to search for food items, collect them, and bring them home. This experiment combines the object detection task with self-localization and object avoidance.

Food items are uniquely colored cubes or balls to simplify the detection task. The robot’s home area can be marked either by a second unique color or by other features that can be easily detected.

This experiment can be conducted by:

a. A single robot

b. A group of cooperating robots

c. Two competing groups of robots
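Foraging naturally decomposes into a small finite state machine: search for food, approach and grab it, carry it home, then search again. A sketch of the transition logic (the state and event names are my own, not the book's):

```c
/* Foraging behavior as a finite state machine (illustrative only). */
typedef enum { SEARCH, APPROACH, HOME } State;
typedef enum { FOOD_SEEN, FOOD_GRABBED, FOOD_LOST, AT_HOME } Event;

State forage_step(State s, Event e)
{
    switch (s) {
    case SEARCH:
        if (e == FOOD_SEEN)    return APPROACH;
        break;
    case APPROACH:
        if (e == FOOD_GRABBED) return HOME;
        if (e == FOOD_LOST)    return SEARCH;
        break;
    case HOME:
        if (e == AT_HOME)      return SEARCH; /* drop item, repeat */
        break;
    }
    return s; /* irrelevant events leave the state unchanged */
}
```

Each state would bind its own driving routine: random exploration in SEARCH, visual servoing in APPROACH, and homing toward the marked home area in HOME.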


E Laboratories

EXPERIMENT 26 Can Collecting

A variation of the previous experiment is to use magnetic cans instead of balls or cubes. This requires a different detection task and the use of a magnetic actuator, added to the robot hardware.

This experiment can be conducted by:

a. A single robot

b. A group of cooperating robots

c. Two competing groups of robots

EXPERIMENT 27 Robot Soccer

Robot soccer is, of course, a whole field in its own right. There are numerous publications available, two independent yearly world championships, and many local tournaments for robot soccer. Have a look at the web pages of the two world organizations, FIRA and RoboCup:

http://www.fira.net/

http://www.robocup.org/


F SOLUTIONS

Lab 1 Controller

EXPERIMENT 1 Etch-a-Sketch

/* ------------------------------------------------------
| Filename:    etch.c
| Authors:     Thomas Braunl
| Description: pixel operations resembl. "etch a sketch"
| ----------------------------------------------------- */
#include <eyebot.h>

void main()
{ int k;
  int x=0, y=0, xd=1, yd=1;

  LCDMenu("Y","X","+/-","END");
  while(KEY4 != (k=KEYRead())) {
    LCDSetPixel(y,x, 1);
    switch (k) {
      case KEY1: y = (y + yd + 64) % 64;   break;
      case KEY2: x = (x + xd + 128) % 128; break;
      case KEY3: xd = -xd; yd = -yd;       break;
    }
    LCDSetPrintf(1,5);
    LCDPrintf("y%3d:x%3d", y,x);
  }
}



EXPERIMENT 2 Reaction Test Game

/* ------------------------------------------------------
| Filename:    react.c
| Authors:     Thomas Braunl
| Description: reaction test
| ----------------------------------------------------- */
#include "eyebot.h"
#define MAX_RAND 32767

void main()
{ int time, old, new;

  LCDPrintf(" Reaction Test\n");
  LCDMenu("GO"," "," "," ");
  KEYWait(ANYKEY);
  time = 100 + 700 * rand() / MAX_RAND; /* 1..8 s */
  LCDMenu(" "," "," "," ");

  OSWait(time);
  LCDMenu("HIT","HIT","HIT","HIT");
  if (KEYRead()) printf("no cheating !!\n");
  else
  { old = OSGetCount();
    KEYWait(ANYKEY);
    new = OSGetCount();
    LCDPrintf("time: %1.2f\n", (float)(new-old) / 100.0);
  }

  LCDMenu(" "," "," ","END");
  KEYWait(KEY4);
}



EXPERIMENT 3 Analog Input and Graphics Output

/* ------------------------------------------------------
| Filename:    micro.c
| Authors:     Klaus Schmitt
| Description: Displays microphone input graphically
|              and numerically
| ----------------------------------------------------- */
#include "eyebot.h"

void main ()
{ int disttab[32];
  int pointer=0;
  int i,j;
  int val;

  /* clear the graphic-array */
  for(i=0; i<32; i++)
    disttab[i]=0;

  LCDSetPos(0,3);
  LCDPrintf("MIC-Demo");
  LCDMenu("","","","END");

  while (KEYRead() != KEY4)
  { /* get actual data and scale it for the LCD */
    disttab[pointer] = 64 - ((val=AUCaptureMic(0))>>4);

    /* draw graphics */
    for(i=0; i<32; i++)
    { j = (i+pointer)%32;
      LCDLine(i,disttab[j], i+4,disttab[(j+1)%32], 1);
    }

    /* print actual distance and raw-data */
    LCDSetPos(7,0);
    LCDPrintf("AD0:%3X",val);

    /* clear LCD */
    for(i=0; i<32; i++)
    { j = (i+pointer)%32;
      LCDLine(i,disttab[j], i+4,disttab[(j+1)%32], 0);
    }

    /* scroll the graphics */
    pointer = (pointer+1)%32;
  }
}



Lab 2 Simple Driving

Simple driving, using no other sensors than shaft encoders.

EXPERIMENT 4 Drive a Fixed Distance and Return

/* ------------------------------------------------------
| Filename:    drive.c
| Authors:     Thomas Braunl
| Description: Drive a fixed distance, then come back
| ----------------------------------------------------- */
#include "eyebot.h"
#define DIST   0.4
#define SPEED  0.1
#define TSPEED 1.0

void main()
{ VWHandle     vw;
  PositionType pos;
  int          i;

  LCDPutString("Drive Demo\n");
  vw = VWInit(VW_DRIVE,1); /* init v-omega interface */
  if(vw == 0)
  { LCDPutString("VWInit Error!\n\a");
    OSWait(200); return;
  }
  VWStartControl(vw,7,0.3,7,0.1);
  OSSleep(100); /* delay before starting */

  for (i=0; i<4; i++) /* do 2 drives + 2 turns */
  { if (i%2==0) { LCDSetString(2,0,"Drive");
                  VWDriveStraight(vw,DIST,SPEED);
                }
    else        { LCDSetString(2,0,"Turn ");
                  VWDriveTurn(vw,M_PI,TSPEED);
                }
    while (!VWDriveDone(vw))
    { OSWait(33);
      VWGetPosition(vw,&pos);
      LCDSetPrintf(3,0,"Pos: %4.2f x %4.2f",pos.x,pos.y);
      LCDSetPrintf(4,0,"Heading:%5.1f",
                   pos.phi*180.0/M_PI);
    }
  }
  OSWait(200);
  VWRelease(vw);
}


INDEX

A

A* algorithm 210 A/D converter 22 abstraction layer 379

accelerometer 27, 125, 139 Ackermann steering 5, 105 actuator 41, 267

Actuator models 186 adaptive controller 327, 333 adaptive driving 228

AI 325

air-speed sensor 154 altimeter 154 analog sensor 19 android 134

Andy Droid 135 application program 14, 374 artificial horizon 144 artificial intelligence 325 assemble 364

assembly language 364 audio demo 376 auto-brightness 245 auto-download 375 autonomous flying 151

autonomous underwater vehicle 161 autopilot 151

AUV 161

B

background debugger 12 background debugger module 366 balancing robot 123

ball detection 269

ball kicking 275 bang-bang controller 52 Bayer pattern 33, 249 BD32 366

BDM 12, 366 beacon 197, 198 behavior 326, 327

behavior selection 327 behavioral memory 350 behavior-based robotics 326

behavior-based software architecture 326 behavior-based systems 325

belief 202

bias neurons 287 binary sensor 19 biped robot 134, 145

artificial horizon 144 dynamic balance 143 fuzzy control 144 genetic algorithms 144 inverted pendulum 143 jumping 353

minimal design 145 optical flow 144 PID 144

sensor data 142 static balance 140 uneven terrain 353

walking sequence 145, 147 ZMP 143

biped sensor data 142 blocked 78

boot procedure 383 bootstrap-loader 15 boundary-following algorithm 232 Braitenberg vehicles 6



breakpoint 367 bumper 373

C

C 362 C++ 362

camera 30, 125, 139, 268 auto-brightness 245 Bayer pattern 33 color 33 demosaicing 34 EyeSim 175

focus pattern 244 grayscale 33

image processing 243 interface 243

pixel 33 Siemens star 244

software interface 36 camera demo 376 camera sensor data 33 chip-select line 383 chromosome 334, 339 CIIPS Glory 264

classical software architecture 325 cleaning 104

closed loop control 48, 51 color class 256

color cone 250 color cube 249 color hue 250

color object detection 251 color space 249

combining C and assembly 365 communication 268

fault tolerance 87 frame structure 86 master 85 message 86 message types 87 polling 84 remote control 90

robot-to-robot 268 self-configuration 87 token ring 84


user interface 89 wild card 85 wireless 83

compass 25, 154, 200, 268, 374 compression 15

concurrency 69 configuration space 231 control 51

bang-bang 52 D 59

driving straight 63 fuzzy 144

I 58 on-off 51 P 57

parameter tuning 61 PID 56, 144, 267 position 62

spline generation 346 spline joint controller 347 steering 106

velocity 62 controller 7

controller evolution 349 cooperative multitasking 69 coordinate systems 205 coordinates

image 258 world 258

corrupted flash-ROM 367 Crab 132

cross-compiler 361

D

DC motor 41

dead reckoning 200 demo programs 376 demos.hex 376 demosaicing 34 device drivers 14, 377 device type 373 differential drive 5, 98 digital camera 30 digital control 51 digital sensor 19


digital servo 136 Dijkstra’s algorithm 206 disassemble 367 distance estimation 269 distance map 225

DistBug algorithm 213, 229 download 372

driving demo 377 driving experiments 236 driving kinematics 107 driving robot 97, 113 driving routines 271 driving straight 63

duty cycle 46 DynaMechs 351 dynamic balance 143

dynamic walking methods 143

E

edge detection 246 electromagnetic compatibility 357 embedded controller 3, 7 embedded systems 7, 357 embedded vision system 243 EMC 357

emergence 329

emergent functionality 328 emergent intelligence 328 encoder 51, 373

encoder feedback 51 error model 175 Eve 99, 230 evolution 334, 345 evolved gait 352

extended temperature range 357 EyeBot 4, 429

buttons 431 chip select 430 controller 7

electrical data 432 family 4

hardware versions 429 interrupt request lines 431 IRQ 431

memory map 430

physical data 432 pinouts 433

EyeBox 12, 154 EyeCam 32, 100, 268 EyeCon 4, 7

schematics 10

EyeSim 171, 235, 254, 274, 334 3D representation 174 actuator modeling 174 console 174

environment 179 error model 175 maze format 179 multiple robots 177 parameter files 182 robi file 183 Saphira format 179 sim file 182

user interface 173 world format 179

F

fault tolerance 87 feedback 41, 51, 348 FIFO buffer 33

FIRA competition 263

fitness function 328, 337, 351, 354 flash command 368

flash-ROM 15, 375 flight path 158 flight program 155

flood fill algorithm 224 flying robot 151

focus pattern 244 Four Stooges 105 frame structure 86

fully connected network 280 function stubs 377

functional software architecture 325 fuzzy control 144

G

GA 349

gait 345, 352



gait generation tool 141 Gaussian noise 176 gene 334

genetic algorithm 144, 333, 349 global coordinates 258

global positioning system 197 global sensors 197

global variables 375 GNU 361

GPS 151, 197

gyroscope 28, 124, 139, 154

H

Hall-effect sensor 20, 42

Hardware Description Table 14, 379 hardware settings 372

hardware versions 429 H-bridge 44

HDT 9, 14, 132, 379 HDT access functions 382 HDT compilation 381 HDT components 380 HDT magic number 382 hello world program 363 Hermite spline 274, 346 Hexapod 132

hierarchical software architecture 325 holonomic robot 113

homing beacon 198 HSI color model 250 HSV color model 251 hue 250

hue-histogram algorithm 251 hundreds-and-thousands noise 176 hysteresis 53

I

image coordinates 258 image processing 243, 269

auto-brightness 245 color object detection 251 edge detection 246

HSV color model 251 hue-histogram algorithm 251


Laplace operator 246 motion detection 248 optical flow 144 RGB color model 251 segmentation 256 Sobel operator 246

image segmentation 256 image sensor 30

inclinometer 30, 125, 126, 139, 154, 348 infrared 373

infrared proximity 139 infrared PSD 139 intercept theorem 260 interface connections 12 interfaces 10

International Aerial Robotics Comp. 151 interrupt 80

introduction 3 inverted pendulum 143 IRQ 431

IRTV 374

J

Jack Daniels 134

Johnny Walker 134 jumping biped robot 353

K

kinematics 107, 117 knowledge representation 327

L

LabBot 100

laboratory assignments 437 laboratory solutions 447 Laplace operator 246 learning 333

legged robots 6 library 377 Linux 361

local coordinates 205, 258 local sensors 198 localization 197, 257