PyBullet (9): Computing the Jacobian under position control to move the robot arm's end effector in a straight line


  • Preface
  • 1. Overall idea
  • 2. Code walkthrough
    • 2.1 Code step by step
      • 2.1.1 Libraries
      • 2.1.2 Connecting to the physics simulation
      • 2.1.3 A "fixed" camera view
      • 2.1.4 Loading the models
      • 2.1.5 Helper functions used in the code
      • 2.1.6 Initializing the arm pose
      • 2.1.7 Getting the arm's starting pose and some basic preparation
      • 2.1.8 Main loop
    • 2.2 Full code
  • 3. Results
  • 4. Summary

Series index:
Previous post:
PyBullet (8): Real-time control of robot arm parameters (using the UR5 arm as an example) (YouTube video available on request)
Next post:

System: Win10
PyBullet: (official link) (official guide)


Preface

A small practice demo my supervisor asked me to do; writing it down here for the record.

1. Overall idea

Nothing fancy, let's just dive in! (In short: at every simulation step, compare where the end effector should be on the desired straight line with where it actually is, map that Cartesian error to joint-angle increments through the pseudo-inverse of the translational Jacobian, and send the updated joint targets with position control.)


2. Code walkthrough


2.1 Code step by step


2.1.1 Libraries

from pyquaternion import Quaternion
import pybullet as p
import time
import pybullet_data
from pybullet_utils import bullet_client as bc
import numpy as np
from numpy.linalg import inv
from numpy.linalg import pinv
import math
import random
import copy
import matplotlib  
import matplotlib.pyplot as plt  

These are just the basic package imports. If any of them are missing, install them yourself with pip/pip3 install ......


2.1.2 Connecting to the physics simulation

physicsClient = p.connect(p.GUI)  # or p.DIRECT for non-graphical version
p.setAdditionalSearchPath(pybullet_data.getDataPath())  # optionally
p.setGravity(0, 0, -9.81)  # set gravity

After importing the PyBullet module, the first thing to do is to "connect" to the physics simulation. PyBullet is designed around a client-server API: the client sends commands and the physics server returns status. PyBullet has several built-in physics servers: DIRECT and GUI. Both the GUI and DIRECT connections execute the physics simulation and rendering in the same process as PyBullet itself.

As for p.setAdditionalSearchPath(pybullet_data.getDataPath()): you can provide your own data files, or use the pybullet_data package that ships with PyBullet. To use the latter, import pybullet_data and register its directory with pybullet.setAdditionalSearchPath(pybullet_data.getDataPath()).
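
As a hedged aside (not part of the original demo): if you only want the numbers and no window, the same setup works with p.DIRECT. Debug drawing such as p.addUserDebugLine and the camera calls below only have a visible effect in GUI mode.

# Minimal headless sketch (assumption, not from the original post):
# p.DIRECT runs the physics without opening a window.
import pybullet as p
import pybullet_data

physicsClient = p.connect(p.DIRECT)
p.setAdditionalSearchPath(pybullet_data.getDataPath())
p.setGravity(0, 0, -9.81)
print("connected:", p.isConnected())
p.disconnect()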


2.1.3 A "fixed" camera view

Ha, it's not that the view can't be changed; it's that the camera starts from the same fixed viewpoint every time the program is launched.

p.resetDebugVisualizerCamera(cameraDistance=2, cameraYaw=0, cameraPitch=-40, cameraTargetPosition=[0.5, -0.9, 0.5])  # set the camera view
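
A small hedged tip (not from the original post): if you are not sure which numbers to pass to resetDebugVisualizerCamera, you can move the camera with the mouse in the GUI and read the current values back; in recent PyBullet versions the last four entries returned by p.getDebugVisualizerCamera() are the yaw, pitch, distance and target.

# Hedged sketch: read the current GUI camera back so its values can be reused
# in resetDebugVisualizerCamera() on the next run.
*_, cam_yaw, cam_pitch, cam_dist, cam_target = p.getDebugVisualizerCamera()
print("yaw:", cam_yaw, "pitch:", cam_pitch, "dist:", cam_dist, "target:", cam_target)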

2.1.4 Loading the models

#load and set real robot
cylinder_real_robot_start_pos = [0, 0, 0]
cylinder_real_robot_start_orientation = p.getQuaternionFromEuler([0,0,0])
ur5_robot_id = p.loadURDF("D:/Study/.../urdf/real_arm.urdf",cylinder_real_robot_start_pos,cylinder_real_robot_start_orientation)
plane_id = p.loadURDF("plane.urdf")

Remember to change the path that the model is loaded from!

The model:

<?xml version="1.0" ?>
<!-- =================================================================================== -->
<!-- |    This document was autogenerated by xacro from sisbot.xacro                   | -->
<!-- |    EDITING THIS FILE BY HAND IS NOT RECOMMENDED                                 | -->
<!-- =================================================================================== -->
<robot name="sisbot" xmlns:xacro="http://ros.org/wiki/xacro"><!--########################################################              arm                    ###########################################################--><link name="base_link"><visual><geometry><mesh filename="D:/Study/University_of_Leeds/PhD/PhD_project/code/object_ur5/ur5pybullet-master/meshes/ur5/visual/base.obj"/></geometry><material name="LightGrey"><color rgba="0.7 0.7 0.7 1.0"/></material></visual> <collision><geometry><mesh filename="D:/Study/University_of_Leeds/PhD/PhD_project/code/object_ur5/ur5pybullet-master/meshes/ur5/collision/base.stl"/></geometry></collision><inertial><mass value="0.0"/><origin rpy="0 0 0" xyz="0.0 0.0 0.0"/><inertia ixx="0.00443333156" ixy="0.0" ixz="0.0" iyy="0.00443333156" iyz="0.0" izz="0.0072"/></inertial></link><joint name="shoulder_pan_joint" type="revolute"><parent link="base_link"/><child link="shoulder_link"/><origin rpy="0.0 0.0 0.0" xyz="0.0 0.0 0.089159"/><axis xyz="0 0 1"/><limit effort="300.0" lower="-3.14159265359" upper="3.14159265359" velocity="10"/><dynamics damping="0.5" friction="0.0"/></joint><link name="shoulder_link"><visual><geometry><mesh filename="D:/Study/University_of_Leeds/PhD/PhD_project/code/object_ur5/ur5pybullet-master/meshes/ur5/visual/shoulder.obj"/></geometry><material name="LightGrey"><color rgba="0.7 0.7 0.7 1.0"/></material></visual><collision><geometry><mesh filename="D:/Study/University_of_Leeds/PhD/PhD_project/code/object_ur5/ur5pybullet-master/meshes/ur5/collision/shoulder.stl"/></geometry></collision><inertial><mass value="3.7"/><origin rpy="0 0 0" xyz="0.0 0.0 0.5"/><inertia ixx="0.010267495893" ixy="0.0" ixz="0.0" iyy="0.010267495893" iyz="0.0" izz="0.00666"/></inertial></link><joint name="shoulder_lift_joint" type="revolute"><parent link="shoulder_link"/><child link="upper_arm_link"/><origin rpy="0.0 0.0 0.0" xyz="0.0 0.13585 0.0"/><axis xyz="0 1 0"/><limit effort="300.0" lower="-3.14159265359" upper="3.14159265359" velocity="10"/><dynamics damping="0.5" friction="0.0"/></joint><link name="upper_arm_link"><visual><geometry><mesh filename="D:/Study/University_of_Leeds/PhD/PhD_project/code/object_ur5/ur5pybullet-master/meshes/ur5/visual/upperarm.obj"/></geometry><material name="LightGrey"><color rgba="0.7 0.7 0.7 1.0"/></material></visual><collision><geometry><mesh filename="D:/Study/University_of_Leeds/PhD/PhD_project/code/object_ur5/ur5pybullet-master/meshes/ur5/collision/upperarm.stl"/></geometry></collision><inertial><mass value="8.393"/><origin rpy="0 0 0" xyz="0.0 0.0 0.28"/><inertia ixx="0.22689067591" ixy="0.0" ixz="0.0" iyy="0.22689067591" iyz="0.0" izz="0.0151074"/></inertial></link><joint name="elbow_joint" type="revolute"><parent link="upper_arm_link"/><child link="forearm_link"/><origin rpy="0.0 0.0 0.0" xyz="0.0 -0.1197 0.425"/><axis xyz="0 1 0"/><limit effort="300.0" lower="-3.14159265359" upper="3.14159265359" velocity="10"/><dynamics damping="0.5" friction="0.0"/></joint><link name="forearm_link"><visual><geometry><mesh filename="D:/Study/University_of_Leeds/PhD/PhD_project/code/object_ur5/ur5pybullet-master/meshes/ur5/visual/forearm.obj"/></geometry><material name="LightGrey"><color rgba="0.7 0.7 0.7 1.0"/></material></visual><collision><geometry><mesh filename="D:/Study/University_of_Leeds/PhD/PhD_project/code/object_ur5/ur5pybullet-master/meshes/ur5/collision/forearm.stl"/></geometry></collision><inertial><mass value="2.275"/><origin rpy="0 0 0" xyz="0.0 0.0 0.25"/><inertia ixx="0.049443313556" ixy="0.0" ixz="0.0" 
iyy="0.049443313556" iyz="0.0" izz="0.004095"/></inertial></link><joint name="wrist_1_joint" type="revolute"><parent link="forearm_link"/><child link="wrist_1_link"/><origin rpy="0.0 0.0 0.0" xyz="0.0 0.0 0.39225"/><axis xyz="0 1 0"/><limit effort="300.0" lower="-3.14159265359" upper="3.14159265359" velocity="10"/><dynamics damping="0.5" friction="0.0"/></joint><link name="wrist_1_link"><visual><geometry><mesh filename="D:/Study/University_of_Leeds/PhD/PhD_project/code/object_ur5/ur5pybullet-master/meshes/ur5/visual/wrist1.obj"/></geometry><material name="LightGrey"><color rgba="0.7 0.7 0.7 1.0"/></material></visual><collision><geometry><mesh filename="D:/Study/University_of_Leeds/PhD/PhD_project/code/object_ur5/ur5pybullet-master/meshes/ur5/collision/wrist1.stl"/></geometry></collision><inertial><mass value="1.219"/><origin rpy="0 0 0" xyz="0.0 0.0 0.0"/><inertia ixx="0.111172755531" ixy="0.0" ixz="0.0" iyy="0.111172755531" iyz="0.0" izz="0.21942"/></inertial></link><joint name="wrist_2_joint" type="revolute"><parent link="wrist_1_link"/><child link="wrist_2_link"/><origin rpy="0.0 0.0 0.0" xyz="0.0 0.093 0.0"/><axis xyz="0 0 1"/><limit effort="300.0" lower="-3.14159265359" upper="3.14159265359" velocity="10"/><dynamics damping="0.5" friction="0.0"/></joint><link name="wrist_2_link"><visual><geometry><mesh filename="D:/Study/University_of_Leeds/PhD/PhD_project/code/object_ur5/ur5pybullet-master/meshes/ur5/visual/wrist2.obj"/></geometry><material name="LightGrey"><color rgba="0.7 0.7 0.7 1.0"/></material></visual><collision><geometry><mesh filename="D:/Study/University_of_Leeds/PhD/PhD_project/code/object_ur5/ur5pybullet-master/meshes/ur5/collision/wrist2.stl"/></geometry></collision><inertial><mass value="1.219"/><origin rpy="0 0 0" xyz="0.0 0.0 0.0"/><inertia ixx="0.111172755531" ixy="0.0" ixz="0.0" iyy="0.111172755531" iyz="0.0" izz="0.21942"/></inertial></link><joint name="wrist_3_joint" type="revolute"><parent link="wrist_2_link"/><child link="wrist_3_link"/><origin rpy="0.0 0.0 0.0" xyz="0.0 0.0 0.09465"/><axis xyz="0 1 0"/><limit effort="300.0" lower="-3.14159265359" upper="3.14159265359" velocity="10"/><dynamics damping="0.5" friction="0.0"/></joint><link name="wrist_3_link"><visual><geometry><mesh filename="D:/Study/University_of_Leeds/PhD/PhD_project/code/object_ur5/ur5pybullet-master/meshes/ur5/visual/wrist3.obj"/></geometry><material name="LightGrey"><color rgba="0.7 0.7 0.7 1.0"/></material></visual><collision><geometry><mesh filename="D:/Study/University_of_Leeds/PhD/PhD_project/code/object_ur5/ur5pybullet-master/meshes/ur5/collision/wrist3.stl"/></geometry></collision><inertial><mass value="0.1879"/><origin rpy="0 0 0" xyz="0.0 0.0 0.0"/><inertia ixx="0.0171364731454" ixy="0.0" ixz="0.0" iyy="0.0171364731454" iyz="0.0" izz="0.033822"/></inertial></link></robot>

2.1.5 Helper functions used in the code

A few small helper functions collected in one place.

def setJointPosition(robot, position):
    num_joints = p.getNumJoints(robot)
    if len(position) == num_joints:
        p.setJointMotorControlArray(robot, range(num_joints), p.POSITION_CONTROL, targetPositions=position)

def getJointStates(robot):
    joint_states = p.getJointStates(robot, range(p.getNumJoints(robot)))
    joint_positions = [state[0] for state in joint_states]
    joint_velocities = [state[1] for state in joint_states]
    joint_torques = [state[3] for state in joint_states]
    return joint_positions, joint_velocities, joint_torques

def getMotorJointStates(robot):
    joint_states = p.getJointStates(robot, range(p.getNumJoints(robot)))
    joint_infos = [p.getJointInfo(robot, i) for i in range(p.getNumJoints(robot))]
    joint_states = [j for j, i in zip(joint_states, joint_infos) if i[3] > -1]
    joint_positions = [state[0] for state in joint_states]
    joint_velocities = [state[1] for state in joint_states]
    joint_torques = [state[3] for state in joint_states]
    return joint_positions, joint_velocities, joint_torques

You can print the return values to see what they contain; if you are not sure how a particular function works, look it up online, I won't go through them one by one here. A small inspection sketch follows.
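
For example, a hedged sketch (not in the original post) that prints each joint's index, name and type for the loaded arm:

# Hedged inspection sketch: list every joint of the loaded arm.
# getJointInfo()[1] is the joint name (bytes) and [2] is the joint type;
# the non-fixed joints are the ones getMotorJointStates() above keeps.
for joint_index in range(p.getNumJoints(ur5_robot_id)):
    info = p.getJointInfo(ur5_robot_id, joint_index)
    print(joint_index, info[1].decode("utf-8"), "type:", info[2],
          "(fixed)" if info[2] == p.JOINT_FIXED else "")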


2.1.6 Initializing the arm pose

numJoints = p.getNumJoints(ur5_robot_id)
print("numJoints:", numJoints)
ur5EndEffectorIndex = numJoints - 1
print("ur5EndEffectorIndex:", ur5EndEffectorIndex)
robotEndOrientation = p.getQuaternionFromEuler([0, 0, -1.57])
targetPositionsJoints = [0, -1.57/2, 1.57/2, 0, -1.57, 0]
setJointPosition(ur5_robot_id, targetPositionsJoints)
for i in range(240):
    p.stepSimulation()
    time.sleep(1./240.)
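
Optionally, a hedged sanity check (not in the original code): after the 240 settling steps you can compare the commanded pose with what the position controller actually reached.

# Hedged sanity check: the motors should be close to targetPositionsJoints
# after one second of settling at 240 Hz.
reached, _, _ = getJointStates(ur5_robot_id)
print("commanded:", targetPositionsJoints)
print("reached:  ", [round(q, 3) for q in reached])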

2.1.7 Getting the arm's starting pose and some basic preparation

result = p.getLinkState(ur5_robot_id,ur5EndEffectorIndex,computeLinkVelocity=1,computeForwardKinematics=1)
org_link_trn, org_link_rot, org_com_trn, org_com_rot, org_frame_pos, org_frame_rot, org_link_vt, org_link_vr = result
print("link_trn:",org_link_trn)
print("link_rot:",org_link_rot)
print("com_trn:",org_com_trn)
print("com_rot:",org_com_rot)
print("frame_pos:",org_frame_pos)
print("frame_rot:",org_frame_rot)
print("link_vt:",org_link_vt)
print("link_vr:",org_link_vr)org_link_rot_new = [org_link_rot[3], org_link_rot[0], org_link_rot[1], org_link_rot[2]]single_step_duration = 0.03 #seconds
step_flag = 0
move_step = 0.001
last_link_trn = org_link_trn

2.1.8 Main loop

for i in range(2*240):
    step_flag = step_flag + 1
    pos, vel, torq = getJointStates(ur5_robot_id)
    mpos, mvel, mtorq = getMotorJointStates(ur5_robot_id)
    # Get the end-effector pose while the arm is moving
    result = p.getLinkState(ur5_robot_id, ur5EndEffectorIndex, computeLinkVelocity=1, computeForwardKinematics=1)
    link_trn, link_rot, com_trn, com_rot, frame_pos, frame_rot, link_vt, link_vr = result
    # Draw the trajectory of the end effector
    p.addUserDebugLine(last_link_trn, link_trn, lineColorRGB=[0, 0, 1])
    last_link_trn = link_trn
    # Correction of the end-effector path: next point on the straight line minus the current position
    delat_x = org_link_trn[0] + (step_flag * move_step) - link_trn[0]
    delat_y = org_link_trn[1] + step_flag * 0 - link_trn[1]
    delat_z = org_link_trn[2] + step_flag * 0 - link_trn[2]
    movement_vector = [delat_x, delat_y, delat_z]
    #print("movement_vector:", movement_vector)
    zero_vec = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
    zero_acc = [0.0] * len(mpos)
    # Compute the Jacobian
    jac_t, jac_r = p.calculateJacobian(ur5_robot_id, ur5EndEffectorIndex, com_trn, mpos, zero_vec, zero_acc)
    # Keeping the end-effector orientation fixed is not considered yet,
    # so only the translational part "jac_t" is used for now
    jac_t_pi = pinv(jac_t)
    # Joint increments the arm needs to apply
    expected_delta_q_dot_1 = list(np.dot(jac_t_pi, movement_vector))
    targetPositionsJoints = list(np.sum([expected_delta_q_dot_1, targetPositionsJoints], axis=0))
    # Move the arm
    setJointPosition(ur5_robot_id, targetPositionsJoints)
    p.stepSimulation()
    time.sleep(1./240.)
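
As the comment in the loop says, only the translational Jacobian jac_t is used, so the end-effector orientation is free to drift while its position is being corrected. A hedged sketch of one possible extension (not part of the original code; the helpers stacked_jacobian_step and orientation_error are made-up names for illustration): stack jac_t and jac_r into a single 6 x n Jacobian and feed its pseudo-inverse a 6-vector made of the position error and a small-angle orientation error.

import numpy as np
from numpy.linalg import pinv
from pyquaternion import Quaternion

def orientation_error(current_xyzw, target_xyzw):
    # Small-angle rotation vector (radians) taking the current orientation to
    # the target one. PyBullet returns quaternions as [x, y, z, w] while
    # pyquaternion expects [w, x, y, z], hence the reordering.
    q_cur = Quaternion(current_xyzw[3], current_xyzw[0], current_xyzw[1], current_xyzw[2])
    q_tgt = Quaternion(target_xyzw[3], target_xyzw[0], target_xyzw[1], target_xyzw[2])
    q_err = q_tgt * q_cur.inverse
    return q_err.angle * np.array(q_err.axis)

def stacked_jacobian_step(jac_t, jac_r, position_error, current_rot, target_rot):
    # 6 x n Jacobian: translational rows on top, rotational rows below.
    J = np.vstack([np.array(jac_t), np.array(jac_r)])
    twist = np.hstack([position_error, orientation_error(current_rot, target_rot)])
    return pinv(J).dot(twist)  # joint increments, length n

Inside the loop this would replace the pinv(jac_t) step, e.g. expected_delta_q_dot_1 = list(stacked_jacobian_step(jac_t, jac_r, movement_vector, link_rot, org_link_rot)).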

2.2 Full code


from pyquaternion import Quaternion
import pybullet as p
import time
import pybullet_data
from pybullet_utils import bullet_client as bc
import numpy as np
from numpy.linalg import inv
from numpy.linalg import pinv
import math
import random
import copy
import matplotlib  
import matplotlib.pyplot as plt

physicsClient = p.connect(p.GUI)  # or p.DIRECT for non-graphical version
p.setAdditionalSearchPath(pybullet_data.getDataPath())  # optionally
p.setGravity(0, 0, -9.81)  # set gravity
p.resetDebugVisualizerCamera(cameraDistance=2, cameraYaw=0, cameraPitch=-40, cameraTargetPosition=[0.5, -0.9, 0.5])  # set the camera view

# load and set real robot
cylinder_real_robot_start_pos = [0, 0, 0]
cylinder_real_robot_start_orientation = p.getQuaternionFromEuler([0, 0, 0])
ur5_robot_id = p.loadURDF("D:/Study/.../urdf/real_arm.urdf", cylinder_real_robot_start_pos, cylinder_real_robot_start_orientation)
plane_id = p.loadURDF("plane.urdf")

def setJointPosition(robot, position):
    num_joints = p.getNumJoints(robot)
    if len(position) == num_joints:
        p.setJointMotorControlArray(robot, range(num_joints), p.POSITION_CONTROL, targetPositions=position)

def getJointStates(robot):
    joint_states = p.getJointStates(robot, range(p.getNumJoints(robot)))
    joint_positions = [state[0] for state in joint_states]
    joint_velocities = [state[1] for state in joint_states]
    joint_torques = [state[3] for state in joint_states]
    return joint_positions, joint_velocities, joint_torques

def getMotorJointStates(robot):
    joint_states = p.getJointStates(robot, range(p.getNumJoints(robot)))
    joint_infos = [p.getJointInfo(robot, i) for i in range(p.getNumJoints(robot))]
    joint_states = [j for j, i in zip(joint_states, joint_infos) if i[3] > -1]
    joint_positions = [state[0] for state in joint_states]
    joint_velocities = [state[1] for state in joint_states]
    joint_torques = [state[3] for state in joint_states]
    return joint_positions, joint_velocities, joint_torques

numJoints = p.getNumJoints(ur5_robot_id)
print("numJoints:", numJoints)
ur5EndEffectorIndex = numJoints - 1
print("ur5EndEffectorIndex:", ur5EndEffectorIndex)
robotEndOrientation = p.getQuaternionFromEuler([0, 0, -1.57])
targetPositionsJoints = [0, -1.57/2, 1.57/2, 0, -1.57, 0]
setJointPosition(ur5_robot_id, targetPositionsJoints)
for i in range(240):
    p.stepSimulation()
    time.sleep(1./240.)

result = p.getLinkState(ur5_robot_id, ur5EndEffectorIndex, computeLinkVelocity=1, computeForwardKinematics=1)
org_link_trn, org_link_rot, org_com_trn, org_com_rot, org_frame_pos, org_frame_rot, org_link_vt, org_link_vr = result
print("link_trn:", org_link_trn)
print("link_rot:", org_link_rot)
print("com_trn:", org_com_trn)
print("com_rot:", org_com_rot)
print("frame_pos:", org_frame_pos)
print("frame_rot:", org_frame_rot)
print("link_vt:", org_link_vt)
print("link_vr:", org_link_vr)
# PyBullet quaternions are [x, y, z, w]; reorder to [w, x, y, z] for pyquaternion
org_link_rot_new = [org_link_rot[3], org_link_rot[0], org_link_rot[1], org_link_rot[2]]
single_step_duration = 0.03  # seconds
step_flag = 0
move_step = 0.001
last_link_trn = org_link_trn

for i in range(2*240):
    step_flag = step_flag + 1
    pos, vel, torq = getJointStates(ur5_robot_id)
    mpos, mvel, mtorq = getMotorJointStates(ur5_robot_id)
    # Get the end-effector pose while the arm is moving
    result = p.getLinkState(ur5_robot_id, ur5EndEffectorIndex, computeLinkVelocity=1, computeForwardKinematics=1)
    link_trn, link_rot, com_trn, com_rot, frame_pos, frame_rot, link_vt, link_vr = result
    # Draw the trajectory of the end effector
    p.addUserDebugLine(last_link_trn, link_trn, lineColorRGB=[0, 0, 1])
    last_link_trn = link_trn
    # Correction of the end-effector path: next point on the straight line minus the current position
    delat_x = org_link_trn[0] + (step_flag * move_step) - link_trn[0]
    delat_y = org_link_trn[1] + step_flag * 0 - link_trn[1]
    delat_z = org_link_trn[2] + step_flag * 0 - link_trn[2]
    movement_vector = [delat_x, delat_y, delat_z]
    #print("movement_vector:", movement_vector)
    zero_vec = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
    zero_acc = [0.0] * len(mpos)
    # Compute the Jacobian
    jac_t, jac_r = p.calculateJacobian(ur5_robot_id, ur5EndEffectorIndex, com_trn, mpos, zero_vec, zero_acc)
    # Keeping the end-effector orientation fixed is not considered yet,
    # so only the translational part "jac_t" is used for now
    jac_t_pi = pinv(jac_t)
    # Joint increments the arm needs to apply
    expected_delta_q_dot_1 = list(np.dot(jac_t_pi, movement_vector))
    targetPositionsJoints = list(np.sum([expected_delta_q_dot_1, targetPositionsJoints], axis=0))
    # Move the arm
    setJointPosition(ur5_robot_id, targetPositionsJoints)
    p.stepSimulation()
    time.sleep(1./240.)

3. Results

Video demo (Bilibili)
Video demo (YouTube) (not yet available)


4. Summary

The purpose of this code is to make it easier to experiment with and debug the robot arm. Let's keep improving together!

