Embedded Robotics (Thomas Bräunl, 2nd ed., 2006)

User Interface and Remote Control

signal is then sent to the appropriate robot, which reacts as if one of its physical buttons had been pressed (see Figure 6.3).

Another advantage of the remote control application is the fact that the host PC supports color, while current EyeCon LCDs are still monochrome for cost

Program 6.1: Wireless “ping” program for controller

#include "eyebot.h"

int main()
{ BYTE myId, nextId, fromId;
  BYTE mes[20];                       /* message buffer */
  int  len, err;

  LCDPutString("Wireless Network");
  LCDPutString("----------------");
  LCDMenu(" "," "," ","END");

  myId = OSMachineID();
  if (myId==0) { LCDPutString("RadioLib not enabled!\n");
                 return 1; }
  else LCDPrintf("I am robot %d\n", myId);
  switch(myId)
  { case 1 : nextId = 2; break;
    case 2 : nextId = 1; break;
    default: LCDPutString("Set ID 1 or 2\n"); return 1;
  }

  LCDPutString("Radio");
  err = RADIOInit();
  if (err) { LCDPutString("Error Radio Init\n"); return 1; }
  else LCDPutString("Init\n");

  if (myId == 1)                      /* robot 1 gets to send first */
  { mes[0] = 0;
    err = RADIOSend(nextId, 1, mes);
    if (err) { LCDPutString("Error Send\n"); return 1; }
  }

  while (KEYRead() != KEY4)
  { if (RADIOCheck())                 /* check whether a message is waiting */
    { RADIORecv(&fromId, &len, mes);  /* wait for message */
      LCDPrintf("Recv %d-%d: %3d\a\n", fromId, len, mes[0]);
      mes[0]++;                       /* increment number and send again */
      err = RADIOSend(nextId, 1, mes);
      if (err) { LCDPutString("Error Send\n"); return 1; }
    }
  }
  RADIOTerm();
  return 0;
}


6 Wireless Communication

reasons. If a color image is being displayed on the EyeCon’s LCD, either the full or a reduced version of the image’s color information (depending on the remote control settings) is transmitted to and displayed on the host PC. This way, the processing of color data on the EyeCon can be tested and debugged much more easily.

An interesting extension of the remote control application would be including transmission of all robots’ sensor and position data. That way, the movements of a group of robots could be tracked, similar to the simulation environment (see Chapter 13).

6.5 Sample Application Program

Program 6.1 shows a simple application of the wireless library functions. This program allows two EyeCons to communicate with each other by simply exchanging “pings”, i.e. a new message is sent as soon as one is received. For reasons of simplicity, the program requires the participating robots’ IDs to be 1 and 2, with number 1 starting the communication.

Program 6.2: Wireless host program

#include <stdio.h>   /* printf on the host PC */
#include "remote.h"
#include "eyebot.h"

int main()
{ BYTE myId, nextId, fromId;
  BYTE mes[20];                       /* message buffer */
  int  len, err;
  RadioIOParameters radioParams;

  RADIOGetIoctl(&radioParams);        /* get parameters */
  radioParams.speed = SER38400;
  radioParams.interface = SERIAL3;    /* COM 3 */
  RADIOSetIoctl(radioParams);         /* set parameters */

  err = RADIOInit();
  if (err) { printf("Error Radio Init\n"); return 1; }
  nextId = 1;        /* PC (id 0) will send to EyeBot no. 1 */

  while (1)
  { if (RADIOCheck())                 /* check if message is waiting */
    { RADIORecv(&fromId, &len, mes);  /* wait for next message */
      printf("Recv %d-%d: %3d\a\n", fromId, len, mes[0]);
      mes[0]++;                       /* increment number and send again */
      err = RADIOSend(nextId, 1, mes);
      if (err) { printf("Error Send\n"); return 1; }
    }
  }
  RADIOTerm();
  return 0;
}


Each EyeCon initializes the wireless communication by using “RADIOInit”, while EyeCon number 1 also sends the first message. In the subsequent while-loop, each EyeCon waits for a message, and then sends another message with a single integer number as contents, which is incremented for every data exchange.

In order to communicate between a host PC and an EyeCon, this example program does not have to be changed much. On the EyeCon side it is only necessary to adapt the partner’s ID number (the host PC has ID 0 by default). The program for the host PC is listed in Program 6.2.

It can be seen that the host PC program looks almost identical to the EyeCon program. This has been accomplished by providing an EyeBot library for the Linux and Windows environments that mirrors the one in RoBIOS. That way, source programs for a PC host can be developed in the same way, and in many cases even with identical source code, as robot application programs.

6.6 References

BALCH, T., ARKIN, R. Communication in Reactive Multiagent Robotic Systems, Autonomous Robots, vol. 1, 1995, pp. 27-52

BRÄUNL, T., WILKE, P. Flexible Wireless Communication Network for Mobile Robot Agents, Industrial Robot International Journal, vol. 28, no. 3, 2001, pp. 220-232

FUKUDA, F., SEKIYAMA, K. Communication Reduction with Risk Estimate for Multiple Robotic Systems, IEEE Proceedings of the Conference on Robotics and Automation, 1994, pp. 2864-2869

MACLENNAN, B. Synthetic Ecology: An Approach to the Study of Communication, in C. Langton, D. Farmer, C. Taylor (Eds.), Artificial Life II, Proceedings of the Workshop on Artificial Life, held Feb. 1990 in Santa Fe NM, Addison-Wesley, Reading MA, 1991

WANG, J., PREMVUTI, S. Resource Sharing in Distributed Robotic Systems based on a Wireless Medium Access Protocol, Proceedings of the IEEE/RSJ/GI, 1994, pp. 784-791

WERNER, G., DYER, M. Evolution of Communication in Artificial Organisms, Technical Report UCLA-AI-90-06, University of California at Los Angeles, June 1990


PART II:

MOBILE ROBOT DESIGN


7   DRIVING ROBOTS

Using two DC motors and two wheels is the easiest way to build a mobile robot. In this chapter we will discuss several designs such as differential drive, synchro-drive, and Ackermann steering. Omnidirectional robot designs are dealt with in Chapter 8. A collection of related research papers can be found in [Rückert, Sitte, Witkowski 2001] and [Cho, Lee 2002]. Introductory textbooks are [Borenstein, Everett, Feng 1998], [Arkin 1998], [Jones, Flynn, Seiger 1999], and [McKerrow 1991].

7.1 Single Wheel Drive

Having a single wheel that is both driven and steered is the simplest conceptual design for a mobile robot. This design also requires two passive caster wheels in the back, since at least three ground contact points are always needed.

Linear velocity and angular velocity of the robot are completely decoupled. For driving straight, the front wheel is set to the straight-ahead (middle) position and driven at the desired speed. For driving in a curve, the wheel is turned to an angle matching the desired curve.

Figure 7.1: Driving and rotation of single wheel drive


7 Driving Robots

Figure 7.1 shows the driving action for different steering settings. Curve driving is following the arc of a circle; however, this robot design cannot turn on the spot. With the front wheel set to 90° the robot will rotate about the midpoint between the two caster wheels (see Figure 7.1, right). So the minimum turning radius is the distance between the front wheel and midpoint of the back wheels.

7.2 Differential Drive

The differential drive design has two motors mounted in fixed positions on the left and right side of the robot, independently driving one wheel each. Since three ground contact points are necessary, this design requires one or two additional passive caster wheels or sliders, depending on the location of the driven wheels. Differential drive is mechanically simpler than the single wheel drive, because it does not require rotation of a driven axis. However, driving control for differential drive is more complex than for single wheel drive, because it requires the coordination of two driven wheels.

The minimal differential drive design with only a single passive wheel cannot have the driving wheels in the middle of the robot, for stability reasons. So when turning on the spot, the robot will rotate about the off-center midpoint between the two driven wheels. The design with two passive wheels or sliders, one each in the front and at the back of the robot, allows rotation about the center of the robot. However, this design can introduce surface contact problems, because it is using four contact points.

Figure 7.2 demonstrates the driving actions of a differential drive robot. If both motors run at the same speed, the robot drives straight forward or backward; if one motor runs faster than the other, the robot drives in a curve along the arc of a circle; and if both motors run at the same speed in opposite directions, the robot turns on the spot.

Figure 7.2: Driving and rotation of differential drive


In Figure 7.2:
Driving straight, forward: vL = vR, vL > 0
Driving in a right curve: vL > vR, e.g. vL = 2·vR
Turning on the spot, counter-clockwise: vL = –vR, vR > 0

Eve We have built a number of robots using a differential drive. The first one was the EyeBot Vehicle, or Eve for short. It carried an EyeBot controller (Figure 7.3) and had a custom shaped I/O board to match the robot outline – a design approach that was later dropped in favor of a standard versatile controller.

The robot has a differential drive actuator design, using two Faulhaber motors with encapsulated gearboxes and encapsulated encoders. The robot is equipped with a number of sensors, some of which are experimental setups:

Shaft encoders (2 units)

Infrared PSD (1-3 units)

Infrared proximity sensors (7 units)

Acoustic bump sensors (2 units)

QuickCam digital grayscale or color camera (1 unit)

Figure 7.3: Eve

One of the novel ideas is the acoustic bumper, designed as an air-filled tube surrounding the robot chassis. Two microphones are attached to the tube ends. Any collision of the robot will result in an audible bump that can be registered by the microphones. Provided that the microphones can be polled fast enough or generate an interrupt and the bumper is acoustically sufficiently isolated from the rest of the chassis, it is possible to determine the point of impact from the time difference between the two microphone signals.

SoccerBot Eve was constructed before robot soccer competitions became popular. As it turned out, Eve was about 1cm too wide, according to the RoboCup rules. As a consequence, we came up with a redesigned robot that qualified to compete in the robot soccer events RoboCup [Asada 1998] small size league and FIRA RoboSot [Cho, Lee 2002].


The robot has a narrower wheel base, which was accomplished by using gears and placing the motors side by side. Two servos are used as additional actuators, one for panning the camera and one for activating the ball kicking mechanism. Three PSDs are now used (to the left, front, and right), but no infrared proximity sensors or a bumper. However, it is possible to detect a collision by feedback from the driving routines without using any additional sensors (see function VWStalled in Appendix B.5.12).

Figure 7.4: SoccerBot

The digital color camera EyeCam is used on the SoccerBot, replacing the obsolete QuickCam. With an optional wireless communication module, the robots can send messages to each other or to a PC host system. The network software uses a Virtual Token Ring structure (see Chapter 6). It is self-organizing and does not require a specific master node.

A team of robots participated in both the RoboCup small size league and FIRA RoboSot. However, only RoboSot is a competition for autonomous mobile robots. The RoboCup small size league does allow the use of an overhead camera as a global sensor and remote computing on a central host system. Therefore, this event is more in the area of real-time image processing than robotics.

Figure 7.4 shows the current third generation of the SoccerBot design. It carries an EyeBot controller and EyeCam camera for on-board image processing and is powered by a lithium-ion rechargeable battery. This robot is commercially available from InroSoft [InroSoft 2006].

LabBot For our robotics lab course we wanted a simpler and more robust version of the SoccerBot that does not have to comply with any size restrictions. LabBot was designed by going back to the simpler design of Eve, connecting the motors directly to the wheels without the need for gears or additional bearings.


The controller is again flat on the robot top and the two-part chassis can be opened to add sensors or actuators.

Getting away from robot soccer, we had one lab task in mind, namely to simulate foraging behavior. The robot should be able to detect colored cans, collect them, and bring them to a designated location. For this reason, LabBot does not have a kicker. Instead, we designed it with a circular bar in front (Figure 7.5) and equipped it with an electromagnet that can be switched on and off using one of the digital outputs.

Figure 7.5: LabBot with colored band for detection

The typical experiment in the lab course is to have one robot, or even two competing robots, drive in an enclosed environment and search for and collect cans (Figure 7.6). Each robot has to avoid obstacles (walls and other robots) and use image processing to find a can. After detecting a can, the robot has to close in on it and switch on the electromagnet; the magnet is switched off again once the robot has reached the collection area, which also requires on-board localization.

Figure 7.6: Can collection task


7.3 Tracked Robots

A tracked mobile robot can be seen as a special case of a wheeled robot with differential drive. In fact, the only differences are the robot’s better maneuverability in rough terrain and its higher friction in turns, due to its tracks and multiple points of contact with the surface.

Figure 7.7 shows EyeTrack, a model snow truck that was modified into a mobile robot. As discussed in Section 7.2, a model car can be simply connected to an EyeBot controller by driving its speed controller and steering servo from the EyeBot instead of a remote control receiver. Normally, a tracked vehicle would have two driving motors, one for each track. In this particular model, however, because of cost reasons there is only a single driving motor plus a servo for steering, which brakes the left or right track.

Figure 7.7: EyeTrack robot and bottom view with sensors attached

EyeTrack is equipped with a number of sensors required for navigating rough terrain. Most of the sensors are mounted on the bottom of the robot. In Figure 7.7, right, the following are visible: top: PSD sensor; middle (left to right): digital compass, braking servo, electronic speed controller; bottom: gyroscope. The sensors used on this robot are:

Digital color camera

Like all our robots, EyeTrack is equipped with a camera. It is mounted in the “driver cabin” and can be steered in all three axes by using three servos. This allows the camera to be kept stable when combined with the robot’s orientation sensors shown below. The camera will actively stay locked on to a desired target, while the robot chassis is driving over the terrain.

Digital compass

The compass allows the determination of the robot’s orientation at all
