Assignments for laboratory work (PRK) / Professional Microsoft Robotics Developer Studio
www.it-ebooks.info
Chapter 5: The MRDS Visual Simulation Environment
The state associated with a SimplifiedConvexMeshEnvironmentEntity is very similar to the other entities except that it must have a 3D mesh specified in the EntityState. This is the mesh that will be used to construct the convex mesh. The maximum number of polygons in the simplified convex mesh is 256.
TriangleMeshEnvironmentEntity
This entity is very similar to the SimplifiedConvexMeshEnvironmentEntity except that the entire 3D mesh is used for the physics shape instead of constructing a convex mesh. Because the physics shape is identical to the visual shape, the entity will collide and move just like the visual shape.
You may be tempted to use a TriangleMeshEnvironmentEntity for every object in the scene, but you should be aware that this type of entity has some limitations. Collision detection is not as robust as it is for other shapes. If a shape moves fast enough that its center passes inside the triangle mesh shape within a single physics step, a collision may not be registered. In addition, TriangleMeshEnvironmentEntities must be static entities, meaning they cannot be moved around in the environment, and they act as if they have infinite mass.
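A back-of-the-envelope check makes the tunneling limitation concrete. The sketch below (plain Python with invented names; the simulator performs its collision tests internally) shows why a fast-moving shape can skip past a thin triangle mesh between two discrete physics steps:

```python
def may_tunnel(speed, timestep, thickness):
    """Discrete collision detection can miss a hit when the distance
    traveled in one physics step exceeds the obstacle's thickness."""
    return speed * timestep > thickness

# At 60 physics steps per second, a shape moving 10 m/s covers about
# 0.17 m per step -- enough to jump entirely past a 0.1 m-thick wall.
print(may_tunnel(10.0, 1 / 60, 0.1))  # True: collision may be missed
print(may_tunnel(1.0, 1 / 60, 0.1))   # False
```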
Robot Entities
Let’s see, what was this book about again? Oh yeah, robots! We should talk about some robots. This chapter covers simulating differential drive robots. In Chapter 8, you’ll look at robots with joints. A differential drive consists of two wheels with independent motors. A differential drive robot can drive forward or backward by driving both wheels in the same direction, and it turns left or right by driving the wheels in opposite directions. Usually, one or two other wheels or castors provide stability.
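The driving behavior just described follows directly from differential drive kinematics. The following Python sketch is illustrative only (MRDS services are written in C#, and these function names are invented); it integrates one timestep of a two-wheeled drive:

```python
import math

def diff_drive_step(x, z, heading, v_left, v_right, wheel_base, dt):
    """One integration step of a differential drive.
    Equal wheel speeds drive straight; opposite speeds turn in place."""
    v = (v_left + v_right) / 2.0             # forward speed
    omega = (v_right - v_left) / wheel_base  # turn rate (rad/s)
    heading += omega * dt
    # Convention used in this chapter: heading 0 faces -Z, increasing left
    x += -math.sin(heading) * v * dt
    z += -math.cos(heading) * v * dt
    return x, z, heading
```

With equal wheel speeds the turn rate is zero and the robot moves straight ahead; with equal and opposite speeds the forward speed is zero and it rotates in place, which are exactly the two behaviors described above.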
In this section, you will learn how to drive the iRobot Create, the LEGO NXT, and the Pioneer 3DX robots in the simulation environment. Each of these robots is based on the DifferentialDriveEntity. You’ll also learn how to use sensors such as bumpers, laser range finders, and cameras.
Out for a Drive with the iRobot Create
The best way to become familiar with differential drive robots is to drive one around. Click Start ⇒ Visual Simulation Environment ⇒ iRobot Create Simulation to start up an environment that contains a number of environmental entities and an iRobot Create robot. This manifest starts several services, listed as follows, including the SimpleDashboard service (shown in Figure 5-22):
SimulationEngine: This is the simulator and it has a state partner that causes the simulator to load its initial scene from the file iRobot.Create.Simulation.xml.
SimulatedDifferentialDrive: This service connects with the iRobot Create entity in the simulation environment and then drives each of its motors according to the commands that it receives.
SimulatedBumper: This service sends a notification to other services when the bumpers on the iRobot Create make contact with another object.
SimpleDashboard: This service provides a Windows Forms user interface that can be used to send commands to the DifferentialDrive service to drive the robot.
Part II: Simulations
Figure 5-22
Follow these steps to connect the dashboard to the differential drive service and drive the iRobot Create in the environment:
1. Type your machine name in the Machine textbox. Alternatively, you can just type localhost to select the current machine.
2. Click the Connect button. The (IRobotCreateMotorBase)/simulateddifferentialdrive service should appear in the list labeled Service Directory.
3. Double-click the (IRobotCreateMotorBase)/simulateddifferentialdrive/ entry. The Motor label should change to On in the Differential Drive group box.
4. Click the Drive button in the Direct Input Device group box to enable the drive control. Don’t forget this step!
5. Drag the trackball graphic forward, backward, to the left, and to the right to drive the robot in the environment. Go wild. Crash into entities to your heart’s content and take satisfaction in knowing that no one will present you with a bill for a damaged blue sphere.
You can also drive simulated robots and actual robots using a wired Xbox 360 controller. Plug the controller into an open USB port. You will be able to drive the robot using the left thumbstick. If you are using the simulator, you can also move the camera around the environment using the right thumbstick. The only downside is that it’s hard to convince your colleagues that you are doing serious robotics work while holding an Xbox 360 controller.
The LEGO NXT Tribot
The LEGO NXT Tribot is the first robot you build with the LEGO NXT kit. It has two drive wheels and a castor in the back. It also has a number of sensors that you can mount to the robot, but the one in the simulation model is the touch sensor, or bumper. You can run an environment that contains a LEGO NXT Tribot, a Pioneer 3DX robot, and a lovely table by selecting Start ⇒ Visual Simulation Environment ⇒ Multiple Simulated Robots.
When you connect the Dashboard, you’ll see three services. You can drive the Tribot by double-clicking the (LEGONXTMotorBase)/simulateddifferentialdrive/ service. This is the service that is used with all three differential drive robots in the simulator. It provides a connection between the DifferentialDriveEntity in the simulator and services outside the simulator. When you examine the ServiceContract property of the LEGO NXT Tribot, you will notice that it references the contract ID for the SimulatedDifferentialDrive service.
Put the simulator into Edit mode by pressing F5. You’ll notice that the Physics menu turns red, indicating that the physics engine has been disabled. You’ll also notice that you can no longer drive the robot. When you can’t drive a robot in the simulation, ensure that the physics engine is enabled.
BumperArrayEntity
The LEGONXTMotorBase entity in the Entities pane has a single child entity, LEGONXTBumpers, which is a BumperArrayEntity. This entity has an array of box shapes that define where the bumpers are located. The LEGO NXT Tribot uses only a single box shape. The ServiceContract property of the LEGONXTBumpers entity is set to the contract identifier for the SimulatedBumper service. This service connects with the BumperArrayEntity and provides a way for other services to be notified when the bumpers are in contact with another shape.
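Conceptually, a bumper reports contact when one of its box shapes touches another shape. The Python sketch below reduces that idea to an axis-aligned box overlap test (the real physics engine handles arbitrarily oriented shapes; the names here are invented for illustration):

```python
def boxes_overlap(center_a, half_a, center_b, half_b):
    """Axis-aligned box overlap test: two boxes touch exactly when their
    extents overlap on every axis. A bumper service would send a contact
    notification when this becomes true."""
    return all(abs(ca - cb) <= ha + hb
               for ca, cb, ha, hb in zip(center_a, center_b, half_a, half_b))

# Bumper box at the robot's front versus a wall just ahead of it
print(boxes_overlap((0, 0, -0.2), (0.1, 0.05, 0.02),
                    (0, 0, -0.25), (1.0, 0.5, 0.05)))  # True: contact
```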
DifferentialDriveEntity
The iRobot Create, the LEGO NXT Tribot, and the Pioneer 3DX entities all inherit from the DifferentialDriveEntity. This is what enables the SimulatedDifferentialDrive service to work with all three of them. Most of the properties of the DifferentialDriveEntity can only be set programmatically, but a few are useful in the Simulation Editor:
CurrentHeading: This is a read-only property that indicates the current heading of the robot in radians. The heading is 0 when the robot faces along the –Z axis and it increases as the robot turns left.
MotorTorqueScaling: This factor multiplies the torque applied to the wheels when
SetMotorTorque is called.
IsEnabled: This is True if the drive is enabled.
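The CurrentHeading convention (0 along –Z, increasing as the robot turns left about +Y) can be captured in a few lines. This is a Python illustration of the convention, not MRDS code:

```python
import math

def forward_from_heading(heading):
    """Unit forward vector for the chapter's convention: heading 0 faces
    along -Z, and the heading grows as the robot turns left (about +Y)."""
    return (-math.sin(heading), 0.0, -math.cos(heading))

def heading_from_forward(x, z):
    """Inverse: recover the heading from a forward direction."""
    return math.atan2(-x, -z)
```

At heading 0 the forward vector is (0, 0, –1); after a quarter turn to the left (heading π/2) it is (–1, 0, 0).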
Not all robots in the simulation environment need to inherit from the DifferentialDriveEntity. It simply serves as a convenient way to provide common services and properties for two-wheeled differential drive robots. You’ll learn how to implement more complicated robots in the next chapter.
Pioneer3DX
The Pioneer 3DX Robot is manufactured by Mobile Robots, Inc. It is interesting because it has an onboard laser range finder and an onboard computer, so it is capable of autonomous movement. It also has bumpers, and the version in the simulator has a mounted webcam.
You’ve already learned about cameras in the scene, but this camera is mounted to the robot as a child entity. When an entity is added as a child of another entity, its position and orientation are relative to the position and orientation of the parent. When the parent moves, the child moves with it. Adding a CameraEntity as a child entity of the Pioneer3DX entity causes the camera and its image to follow the robot.
Try this out by double-clicking the (P3DMotorBase)/simulateddifferentialdrive/ service in the Service Directory list of the Dashboard. Click the Drive button to enable the drive and verify that you can drive the Pioneer by dragging the trackball icon. Now switch the simulator view to the camera mounted on the Pioneer by selecting robocam from the Camera menu. Drive the robot around and you will see what the robot sees.
The ServiceContract property of the robocam entity shows that it is associated with the SimulatedWebcam service. This service reads the images generated by the real-time camera and makes them available to other services or to users via a browser window. You can watch updated images from any real-time camera in the scene in a browser window by following these steps:
1. Make sure the Visual Simulation Environment ⇒ Multiple Simulated Robots scene is running.
2. Open a browser window and point it to http://localhost:50000. The image from the SimulatedWebcam service appears in a browser window (see Figure 5-23).
3. Select Service Directory from the left pane.
4. Click either service that begins with /simulatedwebcam/.
5. Select a refresh interval and click Start. The web page should periodically update with a new image from the camera.
Figure 5-23
LaserRangeFinderEntity
A laser range finder is a sophisticated and expensive device for measuring distance. It sweeps a low-power laser beam in a horizontal line in front of the robot and measures the distance to objects by recording the light that bounces back off of them. As the laser sweeps in a horizontal arc, the number of samples that it takes defines its angular resolution.
The LaserRangeFinderEntity simulates a laser range finder by casting rays into the simulation environment that correspond to laser samples. These rays are intersected with the physics shapes in the environment, and an array of distances is returned. Small, flashing, red circles are drawn in the environment at each intersection point. You may notice that these hit points are sometimes drawn in mid-air where no object is visible. This is usually because the physics shape that represents an entity is different from the visual mesh that represents the entity. This is the case with the table in the scene. Switching to combined rendering mode shows that the box shapes defining the sides of the table are quite different from the visual mesh of the table. The laser always operates on physics shapes.
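The ray-casting idea is easy to prototype outside the simulator. Below is a toy 2D version in Python (invented names; circles stand in for real physics shapes) that mirrors what the LaserRangeFinderEntity does: sample an arc, intersect each ray with the scene, and return an array of distances:

```python
import math

def sweep(origin, facing, fov, samples, max_range, circles):
    """Cast `samples` rays across `fov` radians from `origin` and return
    the nearest hit distance per ray (max_range when nothing is hit).
    Obstacles are (cx, cz, radius) circles in a top-down 2D world."""
    distances = []
    for i in range(samples):
        ang = facing - fov / 2 + fov * i / (samples - 1)
        dx, dz = math.sin(ang), math.cos(ang)
        best = max_range
        for cx, cz, r in circles:
            # Ray-circle intersection: solve |o + t*d - c| = r for t >= 0
            ox, oz = origin[0] - cx, origin[1] - cz
            b = ox * dx + oz * dz
            disc = b * b - (ox * ox + oz * oz - r * r)
            if disc >= 0:
                t = -b - math.sqrt(disc)
                if 0 <= t < best:
                    best = t
        distances.append(best)
    return distances
```

For a robot at the origin with a single circular obstacle 3 m ahead (radius 1 m), the center ray of a 180° sweep reports a distance of 2 m, while rays that miss the obstacle report the maximum range.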
The Dashboard user interface provides a way to visualize the laser range finder data. Follow these steps to enable this feature:
1. Click Start ⇒ Visual Simulation Environment ⇒ Multiple Simulated Robots.
2. Connect the Dashboard by entering the computer name or localhost in the Machine textbox.
3. Double-click the (P3DXLaserRangeFinder)/simulatedlrf service in the Service Directory list.
4. An image of two box shapes should appear in the Laser Range Finder window at the bottom of the dialog. Darker pixel colors represent objects that are nearer to the robot.
Summary
This chapter has served as a user guide to the simulator: it explained the functionality the simulator provides, described the built-in entities, and demonstrated several example scenes. You might want to remember the following key points as you head into other chapters:
The Microsoft Robotics Developer Studio Visual Simulation Environment is a 3D virtual environment with a powerful physics engine. It can be used to prototype algorithms and new robot designs.
The simulator provides a user interface to move through the environment, as well as to change the rendering mode and save and restore scenes.
The simulator has a built-in editor that enables entire scenes to be defined. Robots and other entities can be created and placed in the environment.
The simulator provides several built-in entities, including robots and sensors.
Custom entities can be defined and used in the environment.
The environments and robots provided with the SDK are interesting, but the real utility and fun comes from building something new. That is what is covered in the next four chapters, where you will learn about adding new entities and simulation services and creating new environments.
Chapter 6: Extending the MRDS Visual Simulation Environment
The previous chapter showed how to use the MRDS Visual Simulation Environment, including making simple edits to the environment using the Simulation Editor. The robot entities and environments provided with the MRDS SDK are great, but they only tap a small part of the simulator’s potential.
This chapter demonstrates how to add your own custom entities and services to the simulation environment. You will define a new four-wheel-drive robot with a camera and IR distance sensors, along with the services needed to drive the motors and read the sensor values. In the next chapter, you’ll use this robot in a simulation of the SRS Robo-Magellan contest.
By the time you complete these two chapters, you will know how to build an entire simulation scenario, complete with special environmental entities, a custom robot, services to interface with the entities, and a high-level orchestration service to control the behavior of the robot. Figure 6-1 shows the Corobot entity defined in this chapter.
Figure 6-1
Simulation DLLs and Types
Before you set out on any great adventure, it pays to know what resources are available to you. When you’re writing services that interact with the simulation engine, you need to create classes and use types that are defined in the following DLLs.
RoboticsCommon.DLL
This DLL defines a number of common types and generic contracts that both hardware and simulation services can use. Most simulation services will need to reference this DLL so that they can use at least some of the types defined in the Microsoft.Robotics.PhysicalModel namespace. Some of the types you’ll use in this chapter are as follows:
Vector2, Vector3, Vector4: Structures that contain two, three, or four floating-point values, respectively. Vectors are typically used to represent 2D, 3D, or homogeneous coordinate vectors. Vector3 and Vector4 are also sometimes used to represent colors.
Quaternion: 3D rotations are represented by the physics engine as quaternions. It is beyond the scope of this chapter to completely explain the math behind quaternions, but you can reference the following link to learn more: http://en.wikipedia.org/wiki/Quaternion. The Quaternion type contains four floating-point values: X, Y, Z, and W.
Pose: A Pose defines the position and orientation of an entity within the simulation environment. It consists of a Vector3 position and a Quaternion orientation.
Matrix: As you might expect, this is a 4 × 4 array of floating-point values that represents a transform for points in the simulation environment. The XNA library also provides a matrix type, which is used more commonly than the RoboticsCommon matrix type because it supports more built-in operations.
ColorValue: This contains four floating-point values (Alpha, Red, Green, and Blue) that range from 0 to 1 to define a color.
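Since quaternion math comes up repeatedly when posing entities, a minimal illustration may help. This Python sketch (not MRDS code) builds a quaternion from an axis and angle using the same X, Y, Z, W layout described above, and rotates a vector with it:

```python
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion (x, y, z, w) for a rotation of `angle` radians
    about a unit-length axis."""
    s = math.sin(angle / 2)
    return (axis[0] * s, axis[1] * s, axis[2] * s, math.cos(angle / 2))

def rotate(q, v):
    """Rotate vector v by unit quaternion q (the q * v * q-conjugate
    sandwich product, written out to avoid a full quaternion class)."""
    x, y, z, w = q
    vx, vy, vz = v
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    return (vx + w * tx + y * tz - z * ty,
            vy + w * ty + z * tx - x * tz,
            vz + w * tz + x * ty - y * tx)
```

Rotating the forward vector (0, 0, –1) by 90° about +Y yields (–1, 0, 0) — the heading convention from the previous chapter falls out of the same math.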
In addition to these basic types, RoboticsCommon.dll also defines a number of generic contracts, which define particular operations but don’t necessarily associate any behavior with the operations. It is often useful to write a hardware service and a simulation service that implement a generic contract. The orchestration service that drives the robot can then work properly in simulation and in the real world by making a simple manifest change. Some of the generic contracts defined in RoboticsCommon are as follows:
AnalogSensor: A generic analog sensor that returns a floating-point value representing the current state of a continuously varying hardware sensor
AnalogSensorArray: Multiple analog sensors
Battery: A generic battery contract that enables the system to report on the state of the battery and provide notifications when the battery level falls below a critical threshold.
ContactSensor: A generic sensor that has a pressed and unpressed state. This is suitable for bumpers, pushbuttons, and other similar sensors.
Drive: A generic two-wheel differential drive that provides operations such as DriveDistance, RotateDegrees, AllStop, and so on. It provides control for two wheels that are driven independently.
Encoder: A generic wheel encoder sensor that provides information about the current encoder state
Motor: Provides a way to control a generic single motor
Sonar: Exposes a generic sonar device, including information about the current distance measurement and angular range and resolution
Webcam: Provides a way to retrieve images from a generic camera such as a webcam
You’ll be using the AnalogSensor, Drive, and Webcam generic contracts as you develop the Robo-Magellan simulation.
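The value of a generic contract is that the orchestration service binds to the contract rather than to a particular implementation. The shape of that pattern, reduced to a Python sketch with invented names (MRDS expresses it with DSS contracts and a manifest change rather than an abstract base class):

```python
from abc import ABC, abstractmethod

class Drive(ABC):
    """Stand-in for a generic Drive contract: it names the operations
    without fixing how they are carried out."""
    @abstractmethod
    def set_drive_power(self, left, right): ...

class SimulatedDrive(Drive):
    """Implementation that would talk to a simulated drive entity."""
    def __init__(self):
        self.log = []
    def set_drive_power(self, left, right):
        self.log.append(("sim", left, right))

class HardwareDrive(Drive):
    """Implementation that would talk to real motor controllers."""
    def __init__(self):
        self.log = []
    def set_drive_power(self, left, right):
        self.log.append(("hw", left, right))

def orchestrator(drive: Drive):
    # Works with either backend -- the analog of swapping a manifest entry
    drive.set_drive_power(0.5, 0.5)   # forward
    drive.set_drive_power(-0.3, 0.3)  # turn left in place
```

The orchestrator is unchanged whether it is handed the simulated or the hardware implementation, which is exactly the property the text describes.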
SimulationCommon.DLL
This DLL defines types that are used only in the simulation environment. The types specific to the simulation engine are contained in the Microsoft.Robotics.Simulation namespace. Some of these types include the following:
Entity: The base type for all entities in the simulation environment. This type contains all of the information common to both the simulation engine and the physics engine.
EntityState: This type contains information about the entity such as its Pose, Velocity, Angular Velocity, and Name. In addition, it contains a list of all the physics primitives associated with the entity, as well as physical properties such as the mass and density of the object and visual properties such as the default texture, mesh, and rendering effect. Miscellaneous flags are provided to control various aspects of the rendering or physics behavior of the entity.
LightEntity: This is a deprecated type, only present in MRDS 1.5 to provide backward compatibility. Lights are now represented by LightSourceEntities.
SimulationState: This is what is returned from a Get operation on the simulator. It contains information about the Main Camera, and some other information such as the current render mode and whether the physics engine is paused. The most important thing it contains is a list of all the entities in the simulation environment.
The types that the physics engine uses are defined under the namespace Microsoft.Robotics.Simulation.Physics. These types are too numerous to completely list here, but the most commonly used types are as follows:
BoxShapeProperties, CapsuleShapeProperties, SphereShapeProperties, ConvexMeshShapeProperties, TriangleMeshShapeProperties, HeightFieldShapeProperties, WheelShapeProperties: These types hold state information about each of the shape objects supported by the physics engine. Some of the information is specific to a particular shape, such as the ProcessedMeshResource in the ConvexMeshShapeProperties. Much of the information is common to all or some shapes, such as Dimensions, Radius, and LocalPose. These shape properties are covered in more detail in the previous chapter.
BoxShape, CapsuleShape, SphereShape, ConvexMeshShape, TriangleMeshShape, HeightFieldShape, WheelShape: These types are the shapes created from their associated shape properties. They contain a reference to the actual physical shape representation in the AGEIA physics engine.
UIMath: This type contains static methods that you can use to convert between an Euler angle rotation representation and a quaternion rotation representation. A method is also provided to round a double value to the nearest hundredth.
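To make the UIMath helpers concrete, here is one way such conversions look in Python. The Euler axis order below is an assumption (conventions vary between libraries), so treat this as a sketch of the shape of the conversion rather than the exact formula UIMath uses:

```python
import math

def quaternion_from_euler(yaw, pitch, roll):
    """One common Euler-to-quaternion conversion (angles in radians);
    returns (x, y, z, w) in the same layout the text describes."""
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    return (sr * cp * cy - cr * sp * sy,
            cr * sp * cy + sr * cp * sy,
            cr * cp * sy - sr * sp * cy,
            cr * cp * cy + sr * sp * sy)

def round_to_hundredth(value):
    """Matches the rounding helper described for UIMath: round a value
    to the nearest hundredth."""
    return round(value * 100) / 100
```

A zero rotation maps to the identity quaternion (0, 0, 0, 1), which is a quick sanity check for any Euler convention.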
Using Visual Studio Help to find out more about MRDS types
When the Microsoft Robotics Developer Studio SDK is installed, it integrates its API help with the Visual Studio help file. You can bring up Visual Studio help by selecting Help ⇒ Index. When the help index is displayed, select (unfiltered) from the Filtered by drop-down menu. Type the name of an MRDS type or method in the Look for: textbox and information about that type will be displayed.
SimulationEngine.DLL
This DLL contains most of the simulator functionality. From a programmer’s perspective, the most important types that it contains are in the Microsoft.Robotics.Simulation.Engine namespace. These are the built-in entity types provided with the simulator, such as CameraEntity, SkyDomeEntity, SingleShapeEntity, and so on. Many of these entities are described in the previous chapter. The full source code for all of these entities is in samples\simulation\entities\entities.cs. You can use this code as an example for your own custom entities.
Note that all of these entities inherit from the VisualEntity class. Some of the members and properties of the VisualEntity class are important to understand when creating new entities and services for the simulation environment.
VisualEntity Methods and Members
Initialize, Update, and Render are three virtual methods on VisualEntity that can be overridden in a subclass to define new behavior for the entity. This section describes these methods, as well as other important methods and member variables of the VisualEntity class.
Initialize: This method is called after the entity has been inserted into the simulation environment. In this method, the state values of the entity are used to instantiate run-time objects, such as shapes and meshes, which enable the entity to function in the simulator. It is important to keep all of the code that creates run-time objects in the Initialize function and not in the constructor. When an entity is deserialized from an XML file, such as when it is pasted using the Simulation Editor, the deserializer calls the default constructor for the entity. This is the constructor with no parameters. It then sets all of the state variables according to the XML data and calls Initialize on the entity to instantiate run-time objects. If there are run-time objects that are initialized in a nondefault constructor, the entity will fail to initialize properly when it is deserialized. It is a good idea to enclose most of the code in the Initialize method within a Try/Catch block and to set the value of the InitError field with the text of any errors that are encountered.
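The constructor-versus-Initialize pattern described above can be sketched outside MRDS. This Python illustration (invented names; the real entities are C# classes) shows why run-time objects belong in Initialize: a deserializer default-constructs the object, sets its state, and only then calls Initialize, so both construction paths end in the same place:

```python
class CustomEntity:
    """Sketch of the pattern: a parameterless constructor that only sets
    serializable state, and an initialize() that builds run-time objects
    and records any failure instead of crashing."""

    def __init__(self):
        # Only serializable state here; no run-time objects
        self.name = ""
        self.dimensions = (1.0, 1.0, 1.0)
        self.init_error = ""
        self._shape = None  # run-time object, created later

    def initialize(self):
        try:
            if min(self.dimensions) <= 0:
                raise ValueError("dimensions must be positive")
            # Stand-in for creating physics shapes and meshes
            self._shape = {"kind": "box", "dims": self.dimensions}
        except Exception as e:
            # Mirror the advice in the text: record the error in a field
            self.init_error = str(e)

# Deserialization path: default-construct, set state, then initialize
e = CustomEntity()
e.name, e.dimensions = "crate", (0.5, 0.5, 0.5)
e.initialize()
```

If run-time objects were created in a parameterized constructor instead, this default-construct-then-initialize path would skip them, which is exactly the deserialization failure the text warns about.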
Update: This method is called once each frame while the physics engine is not processing the frame. This is important because some physics engine functions cannot be called while the
