Chapter 6: Extending the MRDS Visual Simulation Environment
    http://schemas.tempuri.org/2006/09/simulatedwebcam.html
  </dssp:Contract>
  <dssp:Service>http://localhost/Corobot/Cam</dssp:Service>
  <dssp:PartnerList>
    <dssp:Partner>
      <dssp:Service>http://localhost/Corobot_cam</dssp:Service>
      <dssp:Name>simcommon:Entity</dssp:Name>
    </dssp:Partner>
  </dssp:PartnerList>
</ServiceRecordType>
The service is started by specifying its contract, and the entity name Corobot_cam is added as a partner (in the form of a URI). The service is given a run-time identifier of /Corobot/Cam. This is the identifier that will appear in the service directory.
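If you prefer to start the service from code rather than from a manifest, the MRDS simulation tutorials use a pattern along the lines of the following sketch. The contract string is the one shown in the manifest entry above, and CreateEntityPartner builds the same kind of entity-partner URI; note that the service name assigned by CreateService is not necessarily the /Corobot/Cam identifier the manifest specifies, so treat this only as an illustration.

// Sketch (assumption): creating the simulated webcam service from another
// service's code and partnering it with the Corobot_cam entity by name.
const string simulatedWebcamContract =
    "http://schemas.tempuri.org/2006/09/simulatedwebcam.html";

CreateService(
    simulatedWebcamContract,
    Microsoft.Robotics.Simulation.Partners.CreateEntityPartner(
        "http://localhost/Corobot_cam"));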
Verify that the SimulatedWebCam service is working properly by running the manifest. Open a browser window and navigate to http://localhost:50000. Select Service Directory in the left column and then click the entry called /Corobot/Cam. A web page will be displayed showing the current view from the camera. If you select a refresh interval and press the Start button, the web page will be updated with new images at the specified rate.
Adding Infrared Distance Sensors
The Corobot has infrared (IR) distance sensors mounted on the front and rear of its chassis. These sensors serve as virtual bumpers, telling the robot when it is about to collide with an obstacle. In this section, you modify the LaserRangeFinderEntity included with the MRDS SDK to become an IR sensor. You also add a service to read the value of an IR sensor and send notifications to other services when it changes.
The CorobotIREntity
It is difficult to fully simulate all of the properties of an infrared sensor. The main purpose of the sensor is to return an approximate distance value based on the amount of infrared light reflected back to the sensor. The sensor can also be used as a reflectivity sensor because the value it returns changes with the reflectivity of different materials even if they are at the same distance. This property of the IR sensor is used to advantage in the MRDS Sumo Competition Package. The iRobot Create robots used as sumo players use the IR sensors mounted on their underside to detect the change in reflectivity at the outer edge of the sumo ring so that they don’t drive out of it.
The simulation environment does not currently support modeling reflectivity, so this aspect of the sensor is hard to model. The CorobotIREntity that you’ll create for the Corobot assumes that every material in the world has the same reflectivity, so it will simply return distance.
The LaserRangeFinderEntity provided in the SDK is a very accurate distance sensor that sweeps a laser horizontally across the environment and measures the light that is reflected back. Simulation Tutorial 2 (run by clicking Start ⇒ Visual Simulation Environment ⇒ Multiple Simulated Robots) contains a Pioneer3DX robot with a laser range finder mounted on top. The laser impact points are illuminated to help visualize the geometry that the laser is scanning.
You can easily modify this entity to become your CorobotIREntity. Copy the source code for the
LaserRangeFinderEntity from samples\simulation\entities\entities.cs into Corobot.cs. Change the name of the entity to CorobotIREntity and include the following member variables:
public class CorobotIREntity : VisualEntity
{
    [DataMember]
    public float DispersionConeAngle = 8f;  // in degrees

    [DataMember]
    public float Samples = 3f;  // the number of rays in each direction

    [DataMember]
    public float MaximumRange =
        (30f * 2.54f / 100.0f);  // 30 inches converted to meters

    float _elapsedSinceLastScan;
    Port<RaycastResult> _raycastResultsPort;
    RaycastResult _lastResults;
    Port<RaycastResult> _raycastResults = new Port<RaycastResult>();
    RaycastProperties _raycastProperties;
    CachedEffectParameter _timeAttenuationHandle;
    float _appTime;
    Shape _particlePlane;

    /// <summary>
    /// Raycast configuration
    /// </summary>
    public RaycastProperties RaycastProperties
    {
        get { return _raycastProperties; }
        set { _raycastProperties = value; }
    }

    float _distance;
    [DataMember]
    public float Distance
    {
        get { return _distance; }
        set { _distance = value; }
    }
DispersionConeAngle is a new variable that sets the angle across which the infrared rays spread out from the emitter. The Samples variable specifies the number of distance samples to take horizontally and vertically. The MaximumRange variable specifies the farthest distance from which the sensor returns any meaningful data. If objects are farther than this distance, the sensor reports MaximumRange as the distance. You can use the Distance property to retrieve the last reading from the sensor. The rest of the variables are copied directly from the LaserRangeFinderEntity.
The constructors are renamed to match the new class name. State.Assets.Effect is set to “LaserRangeFinder.fx”. This effect is used when the laser impact points are rendered.
/// <summary>
/// Default constructor used when this entity is deserialized
/// </summary>
public CorobotIREntity()
{
}

/// <summary>
/// Initialization constructor used when this entity is built programmatically
/// </summary>
/// <param name="name">Entity instance name</param>
/// <param name="initialPose">Initial pose of the sensor</param>
public CorobotIREntity(string name, Pose initialPose)
{
    base.State.Name = name;
    base.State.Pose = initialPose;

    // used for rendering impact points
    base.State.Assets.Effect = "LaserRangeFinder.fx";
}
The Initialize method is not substantially different from the LaserRangeFinderEntity:
public override void Initialize(xnagrfx.GraphicsDevice device,
    PhysicsEngine physicsEngine)
{
    try
    {
        if (Parent == null)
            throw new Exception(
                "This entity must be a child of another entity.");

        // make sure that we take at least 2 samples in each direction
        if (Samples < 2f)
            Samples = 2f;

        _raycastProperties = new RaycastProperties();
        _raycastProperties.StartAngle = -DispersionConeAngle / 2.0f;
        _raycastProperties.EndAngle = DispersionConeAngle / 2.0f;
        _raycastProperties.AngleIncrement =
            DispersionConeAngle / (Samples - 1f);
        _raycastProperties.Range = MaximumRange;
        _raycastProperties.OriginPose = new Pose();
The sensor calculates a distance value by casting rays out into the environment to see where they intersect with physics objects. The _raycastProperties structure specifies the number of rays and the angle over which they are cast.
        // set flag so rendering engine renders us last
        Flags |= VisualEntityProperties.UsesAlphaBlending;

        base.Initialize(device, physicsEngine);
The LaserRangeFinder.fx effect is created in base.Initialize. The following code creates the mesh that is used to render the laser impact point. You keep this functionality in your sensor because it is often useful in debugging services to see exactly where the IR sensor is pointing. The HeightFieldShape is created solely for the purpose of constructing a planar mesh that is two centimeters on a side. The mesh
is created and added to the Meshes collection. It is given a texture map called particle.bmp that uses transparency to give the rendered plane a circular appearance:
        // set up for rendering impact points
        HeightFieldShapeProperties hf = new HeightFieldShapeProperties(
            "height field", 2, 0.02f, 2, 0.02f, 0, 0, 1, 1);
        hf.HeightSamples =
            new HeightFieldSample[hf.RowCount * hf.ColumnCount];
        for (int i = 0; i < hf.HeightSamples.Length; i++)
            hf.HeightSamples[i] = new HeightFieldSample();

        _particlePlane = new Shape(hf);
        _particlePlane.State.Name = "laser impact plane";

        // The mesh is used to render the ray impact points
        // rather than the sensor geometry.
        int index = Meshes.Count;
        Meshes.Add(SimulationEngine.ResourceCache.CreateMesh(
            device, _particlePlane.State));
        Meshes[0].Textures[0] =
            SimulationEngine.ResourceCache.CreateTextureFromFile(
                device, "particle.bmp");

        if (Effect != null)
            _timeAttenuationHandle = Effect.GetParameter(
                "timeAttenuation");
    }
    catch (Exception ex)
    {
        // clean up
        if (PhysicsEntity != null)
            PhysicsEngine.DeleteEntity(PhysicsEntity);

        HasBeenInitialized = false;
        InitError = ex.ToString();
    }
}
This entity is unusual in that its single mesh is used to render impact points, rather than the geometry of the entity. This won’t be a problem for the Corobot because the IR sensors are very small compared to the body of the robot and they are fixed to the body, so it isn’t necessary to render them as a separate mesh.
Most of the work that the entity does is in the Update method:
public override void Update(FrameUpdate update)
{
    base.Update(update);

    _elapsedSinceLastScan += (float)update.ElapsedTime;
    _appTime = (float)update.ApplicationTime;
    // only retrieve raycast results every SCAN_INTERVAL.
    if ((_elapsedSinceLastScan > SCAN_INTERVAL) &&
        (_raycastProperties != null))
    {
It is a fairly expensive operation to cast rays into the physics environment to determine which physics object they intersect, so this operation is not done every frame. The SCAN_INTERVAL constant determines the frequency with which the distance value is updated.
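SCAN_INTERVAL itself is one of the constants carried over from the LaserRangeFinderEntity. If it did not come along when you copied the code, a definition such as the following will do; the 0.25-second value is an assumption (a few scans per second), and you can raise or lower it to trade raycast cost against how quickly Distance is refreshed.

// Assumed value: refresh the distance reading four times per second rather
// than every frame. A larger interval reduces the raycast load on the physics
// engine; a smaller one makes Distance respond faster.
const float SCAN_INTERVAL = 0.250f;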
The position and orientation of the raycast pattern are set according to the Pose of this entity and the Pose of its parent:
        _elapsedSinceLastScan = 0;

        _raycastProperties.OriginPose.Orientation = TypeConversion.FromXNA(
            TypeConversion.ToXNA(Parent.State.Pose.Orientation) *
            TypeConversion.ToXNA(State.Pose.Orientation));
        _raycastProperties.OriginPose.Position = TypeConversion.FromXNA(
            xna.Vector3.Transform(
                TypeConversion.ToXNA(State.Pose.Position),
                Parent.World));
You use the PhysicsEngine Raycast2D API to find the intersections of the rays with physics shapes in the environment. The first set of rays you cast is in the horizontal plane:
        // cast rays on a horizontal plane and again on a vertical plane
        _raycastResultsPort =
            PhysicsEngine.Raycast2D(_raycastProperties);
If the first raycast was successful, then you cast a second set of rays in the vertical plane to form a cross pattern:
        _raycastResultsPort.Test(out _lastResults);
        if (_lastResults != null)
        {
You combine the impact points, if any, from both sets of rays and then find the distance to the closest intersection:
            RaycastResult verticalResults;

            // rotate the plane by 90 degrees
            _raycastProperties.OriginPose.Orientation =
                TypeConversion.FromXNA(
                    TypeConversion.ToXNA(
                        _raycastProperties.OriginPose.Orientation) *
                    xna.Quaternion.CreateFromAxisAngle(
                        new xna.Vector3(0, 0, 1), (float)Math.PI / 2f));
            _raycastResultsPort = PhysicsEngine.Raycast2D(_raycastProperties);
            _raycastResultsPort.Test(out verticalResults);

            // combine the results of the second raycast with the first
            if (verticalResults != null)
            {
                foreach (RaycastImpactPoint impact in verticalResults.ImpactPoints)
                    _lastResults.ImpactPoints.Add(impact);
            }
The shortest of these distances is the value the sensor returns; if there is no intersection at all, MaximumRange is returned:
            // find the shortest distance to an impact point
            float minRange = MaximumRange * MaximumRange;
            xna.Vector4 origin = new xna.Vector4(
                TypeConversion.ToXNA(
                    _raycastProperties.OriginPose.Position), 1);

            foreach (RaycastImpactPoint impact in _lastResults.ImpactPoints)
            {
                xna.Vector3 impactVector = new xna.Vector3(
                    impact.Position.X - origin.X,
                    impact.Position.Y - origin.Y,
                    impact.Position.Z - origin.Z);

                float impactDistanceSquared = impactVector.LengthSquared();
                if (impactDistanceSquared < minRange)
                    minRange = impactDistanceSquared;
            }

            _distance = (float)Math.Sqrt(minRange);
}
}
}
The final two entity methods render the impact points of the rays:
public override void Render(
    RenderMode renderMode,
    MatrixTransforms transforms,
    CameraEntity currentCamera)
{
    if ((int)(Flags & VisualEntityProperties.DisableRendering) > 0)
        return;
Rendering of the impact points is disabled if the DisableRendering flag is set:
    if (_lastResults != null)
        RenderResults(renderMode, transforms, currentCamera);
}

void RenderResults(
    RenderMode renderMode,
    MatrixTransforms transforms,
    CameraEntity currentCamera)
{
    _timeAttenuationHandle.SetValue(
        new xna.Vector4(100 * (float)Math.Cos(
            _appTime * (1.0f / SCAN_INTERVAL)), 0, 0, 1));
This sets a value in the effect that causes the impact points to flash on and off. A local transform matrix is built that rotates the impact point mesh so that it faces the camera:
    // render impact points as a quad
    xna.Matrix inverseViewRotation = currentCamera.ViewMatrix;
    inverseViewRotation.M41 =
        inverseViewRotation.M42 = inverseViewRotation.M43 = 0;
    xna.Matrix.Invert(ref inverseViewRotation, out inverseViewRotation);
    xna.Matrix localTransform = xna.Matrix.CreateFromAxisAngle(
        new xna.Vector3(1, 0, 0),
        (float)-Math.PI / 2) * inverseViewRotation;

    SimulationEngine.GlobalInstance.Device.RenderState.
        DepthBufferWriteEnable = false;
Writing to the depth buffer is disabled because these impact points should not occlude other objects. The impact point mesh is adjusted to be a little closer to the ray emitter than the exact impact point:
    for (int i = 0; i < _lastResults.ImpactPoints.Count; i++)
    {
        xna.Vector3 pos = new xna.Vector3(
            _lastResults.ImpactPoints[i].Position.X,
            _lastResults.ImpactPoints[i].Position.Y,
            _lastResults.ImpactPoints[i].Position.Z);

        xna.Vector3 resultDir = pos - Parent.Position;
        resultDir.Normalize();

        localTransform.Translation = pos - .02f * resultDir;
        transforms.World = localTransform;
This helps the impact points to show up clearly instead of being rendered in the same plane as the shape they intersected.
        base.Render(renderMode, transforms, Meshes[0]);
    }

    SimulationEngine.GlobalInstance.Device.RenderState.
        DepthBufferWriteEnable = true;
}
Now that you’ve completely defined the CorobotIREntity, you want to add two of them to your Corobot: one in the front and one in the rear. Add the following code to the CorobotEntity nondefault constructor:
InsertEntityGlobal(
    new CorobotIREntity(
        name + "_rearIR",
        new Pose(new Vector3(
            0,
            chassisDimensions.Y / 2.0f + chassisClearance,
            chassisDimensions.Z / 2.0f))));
The default orientation for the IR entity is facing toward the +Z direction. That faces toward the rear of the CorobotEntity, so the rear IR sensor is inserted with a default orientation. The coordinates of the position vector place the sensor in the middle of the rear face of the chassis. The position is specified using world coordinates instead of coordinates relative to the parent entity because the InsertEntityGlobal method is used to add the child entity.
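For comparison, the following hypothetical sketch shows the parent-relative alternative: InsertEntity (without the Global suffix) interprets the child's Pose in the parent's coordinate frame instead of world coordinates. The numeric offsets only coincide with the world-coordinate version if the parent is still at the origin with no rotation when the child is added.

// Hypothetical alternative: InsertEntity treats the child's pose as relative
// to the parent entity, so no world-to-local conversion is applied.
InsertEntity(
    new CorobotIREntity(
        name + "_rearIR",
        new Pose(new Vector3(
            0,
            chassisDimensions.Y / 2.0f + chassisClearance,
            chassisDimensions.Z / 2.0f))));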
The call to insert the front IR entity is essentially the same except that the position coordinates place it in the center of the front face of the chassis shape, just above the camera:
InsertEntityGlobal(
    new CorobotIREntity(
        name + "_frontIR",
        new Pose(
            new Vector3(
                0,
                chassisDimensions.Y / 2.0f + chassisClearance,
                -chassisDimensions.Z / 2.0f),
            TypeConversion.FromXNA(
                xna.Quaternion.CreateFromAxisAngle(
                    new xna.Vector3(0, 1, 0), (float)Math.PI)))));
In addition, the entity is created with a Pose that rotates it 180 degrees around the +Y axis so that it is facing toward the front of the entity.
With all of the rotations and transformations going on, it is important to test these new sensors to ensure that they are oriented and mounted correctly:
1. Run the Corobot manifest. You should see the rendered impact marks from the rear IR sensor flashing on and off on the side of the giant box.

2. Move the Corobot slightly forward. The impact points should disappear. This indicates that the giant box is farther from the sensor than 30 inches.

3. Move the entity closer to the giant box until the impact points are again visible.

4. Start the Simulation Editor by pressing F5, expand the Corobot entity in the Entities pane to see its children, and select the Corobot_rearIR entity.

5. Set the DisableRendering flag in the entity flags. The impact points should disappear. This verifies that the rear IR sensor is actually generating those impacts.

6. Check the Distance property of the Corobot_rearIR sensor. It should change as the Corobot moves closer to the giant box.
Repeat these tests with the front IR sensor to verify that it is also working correctly. Figure 6-12 shows how the laser impact points should appear in the scene when the IR sensor is working properly. (Because this book is printed in black and white, the impact points have been enhanced; on the screen they appear red.)
Figure 6-12
Adding a SimulatedIR Service
Now that you have an IR entity, you also need a SimulatedIR service to go along with it. This one will be much simpler than the SimulatedQuadDifferentialDrive. You’re going to use the existing AnalogSensor contract defined in RoboticsCommon.dll to reduce the amount of code you need to write. Unlike the SimulatedQuadDifferentialDrive service, you won’t need to implement a _mainPort that supports different operations than the alternate contract. This _mainPort will implement only the AnalogSensor operations. This is analogous to subclassing an existing class and overriding methods to change behavior but adding no additional methods or public variables.
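To make the subclassing analogy concrete, the finished service ends up with a Get handler along the lines of the following sketch. The AnalogSensorState fields (RawMeasurement, RawMeasurementRange, NormalizedMeasurement, TimeStamp) come from the AnalogSensor contract in RoboticsCommon.dll, but the pxanalogsensor alias, the _state and _entity member names, and the exact mapping of Distance onto the state are assumptions here; adjust them to match the code that dssnewservice generates for you.

// Sketch (assumption): serving the entity's Distance value through the
// generic AnalogSensor contract. _entity is the CorobotIREntity that this
// service instance wraps.
[ServiceHandler(ServiceHandlerBehavior.Concurrent)]
public IEnumerator<ITask> GetHandler(pxanalogsensor.Get get)
{
    _state.RawMeasurement = _entity.Distance;               // meters
    _state.RawMeasurementRange = _entity.MaximumRange;      // 30 inches in meters
    _state.NormalizedMeasurement =
        _state.RawMeasurement / _state.RawMeasurementRange; // 0.0 to 1.0
    _state.TimeStamp = DateTime.Now;

    get.ResponsePort.Post(_state);
    yield break;
}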
Start an MRDS command prompt and change to the ProMRDS\MyChapter6 directory. Use the following command to generate the SimulatedIR service:
C:\Microsoft Robotics Studio (1.5)>cd ProMRDS

C:\Microsoft Robotics Studio (1.5)\ProMRDS>cd MyChapter6

C:\Microsoft Robotics Studio (1.5)\ProMRDS\MyChapter6>dssnewservice
    /Service:"SimulatedIR" /Namespace:"ProMRDS.Simulation.SimulatedIR"
    /alt:"http://schemas.microsoft.com/robotics/2006/06/analogsensor.html"
    /i:"\Microsoft Robotics Studio (1.5)\bin\RoboticsCommon.dll" /year:"07" /month:"08"
As you would by now expect, this generates a service called SimulatedIR, which supports the AnalogSensor contract. Open your newly generated simulatedIR.csproj from the command line so that Visual Studio inherits the environment from the MRDS command-line environment. Use the
following steps to transform this generic service into a SimulatedIR service. Refer to the completed service in the Chapter6 directory as necessary.
1. Add the using statements and DLL references required for a simulation service just as you did in the section “The SimulatedQuadDifferentialDrive Service” earlier in this chapter. Don’t forget to set the CopyLocal and SpecificVersion properties to false for each reference added.

2. Add the following additional using statement and a reference to the Corobot service:

   using corobot = ProMRDS.Simulation.Corobot;
3. Change the DisplayName and Description attributes to describe the service.

4. Add two private class members to handle subscribing to the simulation engine:

   corobot.CorobotIREntity _entity;
   simengine.SimulationEnginePort _notificationTarget;
5. Change the AllowMultipleInstances parameter of the ServicePort attribute on the _mainPort from false to true. You want multiple instances of this service running because you have multiple IR sensors to support.

   [ServicePort("/simulatedir", AllowMultipleInstances=true)]

6. Add a SubscriptionManagerPort to handle interactions with the SubscriptionManager service:

   [Partner("SubMgr",
       Contract = submgr.Contract.Identifier,
       CreationPolicy = PartnerCreationPolicy.CreateAlways)]
   private submgr.SubscriptionManagerPort _submgrPort =
       new submgr.SubscriptionManagerPort();
7. Add the following code to the Start method to subscribe for a partner entity from the SimulationEngine service and to set up a handler for the notification just as you did in the SimulatedQuadDifferentialDrive service:
protected override void Start()
{
    _notificationTarget = new simengine.SimulationEnginePort();

    // PartnerType.Service is the entity instance name.
    simengine.SimulationEngine.GlobalInstancePort.Subscribe(
        ServiceInfo.PartnerList, _notificationTarget);

    // don't start listening to DSSP operations, other than drop,
    // until notification of entity
    Activate(new Interleave(
        new TeardownReceiverGroup
        (
            Arbiter.Receive<simengine.InsertSimulationEntity>(
                false,
                _notificationTarget,
                InsertEntityNotificationHandlerFirstTime),
            Arbiter.Receive<dssp.DsspDefaultDrop>(