Professional Microsoft Robotics Developer Studio

Chapter 8: Simulating Articulated Entities

//Add a camera to see what the gripper is gripping
AttachedCameraEntity gripCam = new AttachedCameraEntity();
gripCam.State.Name = "Arm Cam";

//move the camera above the L4Entity and look at the grippers
gripCam.State.Pose = new Pose(new Vector3(0.05f, -0.01f, 0),
    Quaternion.FromAxisAngle(0, 1, 0, (float)(Math.PI / 2)) *
    Quaternion.FromAxisAngle(1, 0, 0, (float)(Math.PI / 3)));

//adjust the near plane so that we can see the grippers
gripCam.Near = 0.01f;

//the gripCam coordinates are relative to the L4Entity, don't use InsertEntityGlobal
L4Entity.InsertEntity(gripCam);

The position and orientation of the camera are defined relative to the L4Entity, and InsertEntity is used rather than InsertEntityGlobal. Another interesting thing to note is that the near plane of the camera is adjusted to be 1 centimeter. Objects closer to the camera than the near plane are not displayed. The default value for the near plane is 10 centimeters, which clips the grippers from the scene. A closer near plane allows the camera to get closer to objects in the scene at the cost of reduced depth-buffer resolution. In some extreme circumstances, this can cause occlusion problems with some objects in the scene unless the far plane is brought in by the same factor as the near plane.
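If such occlusion artifacts do appear, bringing the far plane in is a one-line change. This sketch assumes CameraEntity exposes a writable Far property, which may vary by SDK version:

//bring the far plane in by the same 10x factor applied to the near plane
//(the default 0.1m near plane became 0.01m above)
gripCam.Far /= 10.0f;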

The AttachedCameraEntity

The CameraEntity provided with the MRDS 1.5 SDK always attempts to keep the horizon level in the displayed scene. This works well for scene cameras but isn't really desirable for a camera attached to a robot or an arm. A different type of camera called AttachedCamera, defined in SimulatedLynxL6Arm.cs, enables the camera view to roll with the object to which it is attached. This is accomplished by overriding the Initialize method and setting the private _frameworkCamera field with a new Camera object called AttachedCameraModel. The code is interesting because it uses reflection to set a private field. You should use this type of camera whenever you want to attach a camera to an entity that may not always stay lined up with the horizon.
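The reflection trick is worth a closer look. The following is a minimal sketch of the idea rather than the book's exact code; the field name _frameworkCamera matches the text, but the AttachedCameraModel constructor argument is a placeholder:

using System.Reflection;

public override void Initialize(
    Microsoft.Xna.Framework.Graphics.GraphicsDevice device,
    PhysicsEngine physicsEngine)
{
    base.Initialize(device, physicsEngine);

    //CameraEntity stores its camera model in a private field;
    //swap in a camera that rolls with its parent entity
    FieldInfo field = typeof(CameraEntity).GetField(
        "_frameworkCamera",
        BindingFlags.NonPublic | BindingFlags.Instance);
    field.SetValue(this, new AttachedCameraModel(device));
}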

The physics model is perfectly adequate to model the motion and physical constraints of the arm, but it is pretty uninteresting visually. A few custom meshes make quite a difference to the simulation, as shown in Figure 8-11.


Figure 8-11

The custom meshes were modeled in Maya. The six Maya files for the models are included in the ProMRDS\mayamodels directory, starting with l6_base.ma and ending with l6_l4.ma. Corresponding .obj and .mtl files are included in the store\media directory for use by the simulator. The combined physics and visual view of the arm model is shown in Figure 8-12.

Figure 8-12


Running the Arm Service

This is a good time to run the arm simulation and to manipulate the joints to get a feel for the range and capabilities of the arm. You can easily run the arm simulation from the command line using the provided .cmd file as follows:

C:\Microsoft Robotics Studio (1.5)>SimulatedLynxL6Arm

This starts a DSS node with the Lynx.L6Arm.Simulation manifest. This manifest runs the SimulatedLynxL6Arm service along with the JointMover service. You can use the JointMover service to manipulate the arm joints and move the arm around. The SimulatedLynxL6Arm user interface is explained in a later section.
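The .cmd file is simply a wrapper around dsshost. Its expansion typically looks something like the following; the port numbers and manifest path are illustrative and may differ in your installation:

C:\Microsoft Robotics Studio (1.5)>bin\dsshost /port:50000 /tcpport:50001 /manifest:"samples\config\Lynx.L6Arm.Simulation.manifest.xml"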

Moving the Arm

The SimulatedLynxL6ArmEntity defines a MoveTo method that can be used to move the arm to a specific position. The method takes the following parameters, which completely specify the arm position:

Parameter     Units     Description
baseVal       Degrees   Rotation angle of the base joint
shoulderVal   Degrees   Pivot angle of the shoulder joint
elbowVal      Degrees   Pivot angle of the elbow joint
wristVal      Degrees   Pivot angle of the wrist joint
rotateVal     Degrees   Rotation angle of the wrist joint
gripperVal    Meters    Distance of separation between the grippers
time          Seconds   Time to complete the motion

This method returns a SuccessFailurePort after the motion has been initiated. A Success message is posted to the port when the motion successfully completes. Otherwise, an Exception message is posted if an error was encountered.
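A caller typically initiates the move and then waits on the returned port with a CCR Choice inside an iterator handler. A minimal sketch, with illustrative joint values:

yield return Arbiter.Choice(
    _l6Arm.MoveTo(0f, -45f, 90f, 45f, 0f, 0.02f, 2f),
    delegate(SuccessResult s) { /* motion completed */ },
    delegate(Exception e) { LogError(e); });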

The arm entity has a private Boolean variable called _moveToActive that indicates whether a move operation is currently underway. If a call is made to MoveTo while a move operation is currently active, an exception message is posted to the response port.


Each parameter is checked against the bounds specified in the corresponding joint description. An invalid parameter results in an exception being posted to the response port, with a message indicating which parameter was bad:

SuccessFailurePort responsePort = new SuccessFailurePort();

if (_moveToActive)
{
    responsePort.Post(new Exception("Previous MoveTo still active."));
    return responsePort;
}

//check bounds. If the target is invalid, post an exception message
//to the response port with a helpful error.
if(!_joints[0].ValidTarget(baseVal))
{
    responsePort.Post(new Exception(
        _joints[0].Name + "Joint set to invalid value: " + baseVal.ToString()));
    return responsePort;
}

After all of the parameters have been validated, each joint description's Target value is set to the specified value. In addition, a speed value is calculated for each joint from the distance between its current value and its target value and the overall time specified for the motion. Each joint gets its own speed value so that joints with farther to travel move more quickly, and all joints complete their motion at the same time:

//set the target values on the joint descriptors
_joints[0].Target = baseVal;
_joints[1].Target = shoulderVal;
_joints[2].Target = elbowVal;
_joints[3].Target = wristVal;
_joints[4].Target = rotateVal;
_joints[5].Target = gripperVal;

//calculate a speed value for each joint that will cause it to complete
//its motion in the specified time
for(int i=0; i<6; i++)
    _joints[i].Speed = Math.Abs(_joints[i].Target - _joints[i].Current) / time;

The MoveTo method then sets the _moveToActive flag to true and returns the response port. As you can see, the motion is set up in this method, but the joints are not actually moved until the entity's Update method is called.

The entity's Update method is called once each frame, ideally about 60 times per second. Each time this method is called, the joints are moved a small amount based on how much time has elapsed since the last update.


The first time Update is called after the entity is initialized, it sets references to the newly created joints in the joint description array. It then follows the pattern shown in the following code to update each joint:

// update joints if necessary
if (_moveToActive)
{
    bool done = true;

    // Check each joint and update it if necessary.
    if (_joints[0].NeedToMove(_epsilon))
    {
        done = false;

        Vector3 normal = _joints[0].Joint.State.Connectors[0].JointNormal;
        _joints[0].UpdateCurrent(_prevTime);
        _joints[0].Joint.SetAngularDriveOrientation(
            Quaternion.FromAxisAngle(
                normal.X, normal.Y, normal.Z,
                DegreesToRadians(_joints[0].Current)));
    }

If a motion sequence is currently active, then each joint is evaluated to determine whether it still needs to be moved to hit its target value. If NeedToMove returns true, then UpdateCurrent is called to move the joint toward its target in a step that depends on the speed of the current movement and the amount of time that has elapsed since the last time Update was called. A new joint orientation is calculated from the new joint position and the joint is set to its new position.
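A plausible sketch of those two helpers on the joint descriptor follows; the class name, fields, and elapsed-time handling are assumptions, and the shipped code may differ:

using System;

//hypothetical joint descriptor showing how NeedToMove and
//UpdateCurrent could be implemented
class JointDesc
{
    public float Current;  // current angle in degrees
    public float Target;   // requested angle in degrees
    public float Speed;    // degrees per second, set by MoveTo

    // true while the joint is farther than epsilon degrees from its target
    public bool NeedToMove(float epsilon)
    {
        return Math.Abs(Target - Current) > epsilon;
    }

    // step toward the target by Speed * elapsed seconds, never overshooting
    public void UpdateCurrent(double elapsedSeconds)
    {
        float step = (float)(Speed * elapsedSeconds);
        if (Math.Abs(Target - Current) <= step)
            Current = Target;
        else
            Current += Math.Sign(Target - Current) * step;
    }
}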

If none of the joints need to be updated, then the motion sequence is finished. _moveToActive is set to false and a new SuccessResult message is posted to the response port.

This is a better way of controlling joint movement than the approach shown in Simulation Tutorial 4 in the SDK, which relies on the damping coefficient of the joint drive to dictate the speed at which the joint moves. The method illustrated in this example provides much more control over the joint's speed and allows for the possibility of capping joint speed at a maximum that matches the characteristics of the physical motors on the arm.
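Such a cap could be applied right after the speeds are computed in MoveTo; maxMotorSpeed here is a hypothetical constant in degrees per second:

//clamp each joint's computed speed to the physical servo limit
//(a capped joint will finish later than the requested time)
for (int i = 0; i < 6; i++)
    _joints[i].Speed = Math.Min(_joints[i].Speed, maxMotorSpeed);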

A Fly in the Ointment

At this point, you have a great arm model that closely simulates the capabilities of the real arm. Unfortunately, the AGEIA physics engine appears to have a limitation that significantly affects this arm model. Although the gripper joints bring the grippers together and it is possible to close them on an object, the physics engine does not do a good job of simulating the friction between the grippers and the grasped object. Even with a high-friction material specified for the object and the grippers, the arm cannot pick up an object without it sliding away from the grippers. What do you do when the physics engine won’t work properly? You cheat!

Because an arm really isn’t very interesting if it is not able to pick up objects, you will define code that detects when the arm is closing on an object. A joint will be created on-the-fly to attach the object to the grippers, and then that object will follow the grippers as they move — just as if it had been grasped.

You’ll see how this works in detail because it provides a good example of using some additional simulator functionality.


Look at the following lines of code near the bottom of the MoveTo function that was described in the section “Moving the Arm”:

if((_joints[5].Target > gripperVal) && (Payload == null))
{
    _attachPayload = true;
}
else if ((_joints[5].Target < gripperVal) && (Payload != null))
{
    _dropPayload = true;
}

_joints[5] represents the gripper joint. When the motion sequence is being set up, you detect whether the grippers are closing or opening. If they are closing, then you set _attachPayload to true. If they are opening, then you set _dropPayload to true. These flags are not used until the motion sequence is completed. This code is in the Update method:

// no joints needed to be updated, the movement is finished
if (_attachPayload)
{
    // we are attaching a payload object after the motion has completed
    if (_intersect == null)
    {
        // haven't yet cast the intersection ray, do it now
        _intersect = new IntersectRay(new Ray(Origin, Direction));
        SimulationEngine.GlobalInstancePort.Post(_intersect);
    }

    List<TriangleIntersectionRecord> results =
        _intersect.ResponsePort.Test<List<TriangleIntersectionRecord>>();
    if (results != null)
    {
        //we've received the intersection results,
        //see if we need to attach a payload
        AttachPayload(results);
        if (_payload != null)
        {
            //create a joint to hold the payload
            _payload.PhysicsEntity.UpdateState(false);
            L4Entity.PhysicsEntity.UpdateState(false);

            Vector3 jointLocation = TypeConversion.FromXNA(xna.Vector3.Transform(
                TypeConversion.ToXNA(_payload.State.Pose.Position),
                xna.Matrix.Invert(L4Entity.World)));

            Vector3 normal = new Vector3(0, 1, 0);
            Vector3 axis = new Vector3(1, 0, 0);

            //calculate a joint orientation that will preserve the orientation
            //relationship between L4Entity and the payload
            Vector3 parentNormal =
                Quaternion.Rotate(L4Entity.State.Pose.Orientation, normal);
            Vector3 parentAxis =
                Quaternion.Rotate(L4Entity.State.Pose.Orientation, axis);
            Vector3 thisNormal =
                Quaternion.Rotate(_payload.State.Pose.Orientation, normal);
            Vector3 thisAxis =
                Quaternion.Rotate(_payload.State.Pose.Orientation, axis);

            EntityJointConnector[] payloadConnectors = new EntityJointConnector[2]
            {
                new EntityJointConnector(_payload,
                    thisNormal, thisAxis, new Vector3(0, 0, 0)),
                new EntityJointConnector(L4Entity,
                    parentNormal, parentAxis, jointLocation)
            };

            _payloadJoint = PhysicsJoint.Create(
                new JointProperties((JointAngularProperties)null, payloadConnectors));
            _payloadJoint.State.Name = "Payload Joint";
            PhysicsEngine.InsertJoint(_payloadJoint);
            // the payload is now fixed to the L4Entity
        }

        //the payload attach is complete
        _attachPayload = false;
        _intersect = null;

        //the motion is also complete, send the completion message
        _moveToActive = false;
        _moveToResponsePort.Post(new SuccessResult());
    }
}

//once a ray has been cast into the environment,
//this method interprets the results and
//sets a payload in _payload if one is found within the grippers
public void AttachPayload(List<TriangleIntersectionRecord> results)
{
    foreach (TriangleIntersectionRecord candidate in results)
    {
        if (candidate.OwnerEntity.GetType() == typeof(SingleShapeSegmentEntity))
            continue;
        if (candidate.IntersectionDistance > Grip)
            break;

        _payload = candidate.OwnerEntity;
    }
}

If _attachPayload is set after the motion is completed, a ray is constructed that originates in the center of the L4Entity and extends in the direction of the grippers. The ray is intersected with all entities in the environment. If a valid entity is found, that entity is attached to the L4Entity with a joint.

The first time this code is executed, _intersect is null, and a new ray is constructed and cast into the environment by posting an IntersectRay message on the SimulationEngine port. This is the first time you've seen this method. It calculates intersections with the visual mesh of an object. The laser range finder discussed in Chapter 6 uses the PhysicsEngine.Raycast2D method to cast a ray into the environment that intersects with physics shapes. You can use either method, depending on whether you desire an intersection with the visual mesh or an intersection with the physics shapes.

Each subsequent time the Update method is called, the response port of the IntersectRay message is checked for results. When the results are available, they are passed to the AttachPayload method, which determines whether any of the intersection results represent an object that should be attached to the arm. If so, _payload is set to that entity. If a valid payload is found, then a joint is built that attaches the payload entity to the L4Entity while maintaining its relative position and orientation.

Once the results have been processed, a message is sent on the MoveTo response port to indicate that the motion sequence has been completed.

That covers the sequence of events when the payload is grasped. When the grippers are opened and a payload entity is currently attached, the _dropPayload Boolean is set to true. If this flag is set at the end of the Update method, the joint that attaches the payload is deleted and the payload entity is no longer fixed to the arm.
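The drop path is the mirror image of the attach path. A minimal sketch of what the end of Update might look like, assuming the physics engine exposes a DeleteJoint method (the flag and field names match the attach code above):

if (_dropPayload && (_payloadJoint != null))
{
    //remove the fixed joint so the payload is no longer attached to the arm
    PhysicsEngine.DeleteJoint(_payloadJoint);
    _payloadJoint = null;
    _payload = null;
    _dropPayload = false;
}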

What are the implications of this terrible hack? It means that the arm simulation does a poor job of modeling how well an object is grasped by the grippers. In fact, if an object is within the grippers and they are closed even slightly, the object will be picked up. In the real world, the arm can only pick up objects if the grippers are set to an appropriate value for the size of the object. In addition, once the simulation arm grips an object, the object is firmly attached until it is released. With a real arm, moving the arm too fast may cause the object to dislodge from the grippers. These limitations should be taken into account when using the simulated arm to simulate algorithms intended for a real arm.

Inverse Kinematics

Technically, you have everything you need at this point to move the arm around, but it isn't much fun to maneuver the arm to pick up an object by specifying each of the joint angles. It is much more convenient to simply specify the X, Y, Z coordinates of the grippers along with their rotation and angle of approach, and let the arm move itself to that configuration. The process of calculating the joint angles from such a high-level specification is called inverse kinematics. The reverse process, calculating the gripper position from the joint angles, is called forward kinematics.
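As a point of reference, forward kinematics for a planar two-link arm is just a chain of rotations; the sketch below is illustrative only and is not part of the arm service:

using System;

static class PlanarArm
{
    //given the shoulder and elbow angles (radians) and link lengths (meters),
    //compute the wrist position in the arm's vertical plane
    public static void ForwardKinematics(
        float l1, float l2, float shoulder, float elbow,
        out float r, out float y)
    {
        r = (float)(l1 * Math.Cos(shoulder) + l2 * Math.Cos(shoulder + elbow));
        y = (float)(l1 * Math.Sin(shoulder) + l2 * Math.Sin(shoulder + elbow));
    }
}

Inverse kinematics runs this chain backward; for a two-link arm, a given wrist position generally has zero, one, or two solutions.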

There are three solutions for the inverse kinematics for the Lynx 6 robotic arm, available on the Lynxmotion website at the following URLs:

www.lynxmotion.com/images/html/proj073.htm

www.lynxmotion.com/images/html/proj057.htm

www.lynxmotion.com/images/html/proj058.htm

The SimulatedLynxL6Arm service described in this chapter uses an inverse kinematics solution similar to the third link. The service implements a method called MoveToPosition that takes the parameters in the following table, calculates the joint positions, and then calls MoveTo on the arm entity to execute the motion:

Parameter  Units    Description
X          Meters   X position of the center of the tip of the gripper
Y          Meters   Y position of the center of the tip of the gripper
Z          Meters   Z position of the center of the tip of the gripper
P          Degrees  Angle of approach of the gripper (-90 is vertical with the gripper down)
W          Degrees  Rotation angle of the wrist joint
Grip       Meters   Distance of separation between the grippers
Time       Seconds  Amount of time to complete the motion

The code is fairly straightforward even if the math isn’t:

//This method calculates the joint angles necessary to place the arm into the
//specified position. The arm position is specified by the X,Y,Z coordinates
//of the gripper tip as well as the angle of the grip, the rotation of the grip,
//and the open distance of the grip. The motion is completed in the
//specified time.
public SuccessFailurePort MoveToPosition(
    float x,     // x position
    float y,     // y position
    float z,     // z position
    float p,     // angle of the grip
    float w,     // rotation of the grip
    float grip,  // distance the grip is open
    float time)  // time to complete the movement
{
    //taken from Hoon Hong's ik2.xls IK method posted on the Lynx website

    //physical attributes of the arm
    float L1 = InchesToMeters(4.75f);
    float L2 = InchesToMeters(4.75f);
    float Grip = InchesToMeters(2.5f);
    float L3 = InchesToMeters(5.75f);
    float H = InchesToMeters(3f);  // height of the base
    float G = InchesToMeters(2f);  // radius of the base

    float r = (float)Math.Sqrt(x * x + z * z);    // horizontal distance to the target
    float baseAngle = (float)Math.Atan2(-z, -x);  // angle to the target
    float pRad = DegreesToRadians(p);

    float rb = (float)((r - L3 * Math.Cos(pRad)) / (2 * L1));
    float yb = (float)((y - H - L3 * Math.Sin(pRad)) / (2 * L1));
    float q = (float)(Math.Sqrt(1 / (rb * rb + yb * yb) - 1));
    float p1 = (float)(Math.Atan2(yb + q * rb, rb - q * yb));
    float p2 = (float)(Math.Atan2(yb - q * rb, rb + q * yb));

    float shoulder = p1 - DegreesToRadians(90);  // angle of the shoulder joint


    float elbow = p2 - shoulder;  // angle of the elbow joint
    float wrist = pRad - p2;      // angle of the wrist joint

    // Position the arm with the calculated joint angles.
    return _l6Arm.MoveTo(
        RadiansToDegrees(baseAngle),
        RadiansToDegrees(shoulder),
        RadiansToDegrees(elbow),
        RadiansToDegrees(wrist),
        w,
        grip,
        time);
}

The X and Z coordinates are converted to a radius and an angle (cylindrical coordinates). The angle of approach (p) of the gripper, along with the radius and elevation (Y coordinate), is used to calculate joint angles from the gripper back to the base. The radian joint values are converted back to degrees and passed to the MoveTo method.
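For reference, the geometry behind those expressions can be written out as follows; this reconstruction is mine, not the book's:

r_w = r - L_3\cos p, \qquad y_w = y - H - L_3\sin p

\bar{r} = \frac{r_w}{2L_1}, \qquad \bar{y} = \frac{y_w}{2L_1}, \qquad
q = \sqrt{\frac{1}{\bar{r}^2 + \bar{y}^2} - 1} = \tan\alpha

p_{1,2} = \operatorname{atan2}\left(\bar{y} \pm q\,\bar{r},\; \bar{r} \mp q\,\bar{y}\right)

Here (r_w, y_w) is the wrist position after subtracting the gripper segment and base height, and \alpha is the angle each of the two equal links (L_1 = L_2) makes with the chord from shoulder to wrist (\cos\alpha = d/2L_1 for chord length d), so p_1 and p_2 are the absolute elevation angles of the upper arm and forearm links.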

It is not possible for the arm to accommodate all combinations of gripper coordinates and angle-of-approach values. If an impossible position is requested, one of the joint angles will be out of bounds and the MoveTo method will post an exception message to the response port.
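A typical call looks like this; the coordinates are illustrative, and an unreachable pose comes back as an Exception on the returned port:

//move the gripper tip to a point 15cm from the base at 5cm height,
//approaching straight down with the grippers 2cm apart, over 2 seconds
SuccessFailurePort result =
    MoveToPosition(0.15f, 0.05f, 0f, -90f, 0f, 0.02f, 2f);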

Using the Arm User Interface

The SimulatedLynxL6Arm service implements a Windows Forms user interface using the same principles described in Chapter 7. The window is shown in Figure 8-13.

Figure 8-13

The seven parameters to the MoveToPosition method appear in this dialog. You enter distances in meters, angles in degrees, and the time in seconds. When you press the Submit button, the arm moves to the requested position if it is valid; otherwise, an error message is displayed.
