Author: shaniadolphin

Purpose

This post demonstrates an application of pose estimation: measuring a scene with a single monocular camera. In short, using the two photos below, and knowing the world coordinates of the four points P1, P2, P3 and P4, we compute the world coordinates of other points. As shown in the figures, a standard calibration board (each square measures 30 mm) provides the four known points P1, P2, P3 and P4 at its corners; next to it stands a tea canister, with P5 and P6 as the points to be found.

With this technique, one object of known dimensions photographed from two different viewpoints lets us measure unknown objects, for example using the calibration board in the photos to size the tea canister. In real applications the same idea can measure the size of a car, the height of a building, the volume of cargo, and so on.

Principle

As shown in the figure below, from the world coordinates of P1, P2, P3 and P4, OpenCV's PnP solver recovers the camera pose of each of the two shots, and from those poses the two camera centers Oc1 and Oc2 in world coordinates. Joining Oc1 and Oc2 to the target point P gives two lines; solving the resulting pair of line equations for their intersection yields the target point's world coordinate.

1. Finding the point's camera-frame coordinates

Refer to the figure above for how P maps from 2D back to 3D. Suppose the target point P has already been located in the image at pixel coordinates (u, v). Its coordinates (xc, yc, zc) in the camera frame (the point P' in the figure) follow from

xc = (u - u0) * f / fx
yc = (v - v0) * f / fy
zc = f

where f is the physical focal length of the lens (a Zhongyi Optics 35 mm manual lens in this experiment), (u, v) are the pixel coordinates of P, and fx, fy, u0, v0 are the camera intrinsics.

The inputs are the points of the captured photo, including the pixel coordinates of the point to find, for example:

```python
P11 = np.array([0, 0, 0])
P12 = np.array([0, 300, 0])
P13 = np.array([210, 0, 0])
P14 = np.array([210, 300, 0])
p11 = np.array([1765, 725])
p12 = np.array([3068, 1254])
p13 = np.array([1249, 1430])
p14 = np.array([2648, 2072])
p4psolver1.Points3D[0] = np.array([P11, P12, P13, P14])
p4psolver1.Points2D[0] = np.array([p11, p12, p13, p14])
#p4psolver1.point2find = np.array([4149, 671])
#p4psolver1.point2find = np.array([675, 835])
p4psolver1.point2find = np.array([691, 336])
```

The code below reads the camera intrinsics from the calibration file. The sensor size is preset in the code: the author's Nikon D7000 is a DX-format body whose sensor, according to the published specifications, measures 23.6 mm x 15.6 mm, and the lens mounted on it is the Zhongyi 35 mm manual prime.

```python
def getudistmap(self, filename):
    with open(filename, 'r', newline='') as csvfile:
        spamreader = csv.reader(csvfile, delimiter=',', quotechar='"')
        rows = [row for row in spamreader]
        self.cameraMatrix = np.zeros((3, 3))
        size_w = 23.6   # sensor width in mm
        size_h = 15.6   # sensor height in mm
        imageWidth = int(rows[0][1])
        imageHeight = int(rows[0][2])
        self.cameraMatrix[0][0] = rows[1][1]
        self.cameraMatrix[1][1] = rows[1][2]
        self.cameraMatrix[0][2] = rows[1][3]
        self.cameraMatrix[1][2] = rows[1][4]
        self.cameraMatrix[2][2] = 1
        print(len(rows[2]))
        if len(rows[2]) == 5:
            print('fisheye')
            self.distCoefs = np.zeros((4, 1))
            self.distCoefs[0][0] = rows[2][1]
            self.distCoefs[1][0] = rows[2][2]
            self.distCoefs[2][0] = rows[2][3]
            self.distCoefs[3][0] = rows[2][4]
            scaled_K = self.cameraMatrix * 0.8  # the values of K scale with the image dimension
            scaled_K[2][2] = 1.0
            #newcameramtx = cv2.fisheye.estimateNewCameraMatrixForUndistortRectify(self.cameraMatrix, self.distCoefs, (imageWidth, imageHeight), np.eye(3), balance=0)
            #map1, map2 = cv2.fisheye.initUndistortRectifyMap(self.cameraMatrix, self.distCoefs, np.eye(3), newcameramtx, (imageWidth, imageHeight), cv2.CV_32FC1)
        else:
            print('normal')
            self.distCoefs = np.zeros((1, 5))
            self.distCoefs[0][0] = rows[2][1]
            self.distCoefs[0][1] = rows[2][2]
            self.distCoefs[0][2] = rows[2][3]
            self.distCoefs[0][3] = rows[2][4]
            self.distCoefs[0][4] = rows[2][5]
            #newcameramtx, roi = cv2.getOptimalNewCameraMatrix(self.cameraMatrix, self.distCoefs, (imageWidth, imageHeight), 1, (imageWidth, imageHeight))
            #map1, map2 = cv2.initUndistortRectifyMap(self.cameraMatrix, self.distCoefs, None, newcameramtx, (imageWidth, imageHeight), cv2.CV_32FC1)
        print('dim %d*%d' % (imageWidth, imageHeight))
        print('Kt \n', self.cameraMatrix)
        #print('newcameramtx \n', newcameramtx)
        print('Dt \n', self.distCoefs)
        self.f = [self.cameraMatrix[0][0] * (size_w / imageWidth),
                  self.cameraMatrix[1][1] * (size_h / imageHeight)]
        #self.f = [350, 350]
        print('f \n', self.f)
        return
```

With the intrinsics loaded, pixel coordinates can be converted:

```python
def WordFrame2ImageFrame(self, WorldPoints):
    pro_points, jacobian = cv2.projectPoints(WorldPoints, self.rvec, self.tvec, self.cameraMatrix, self.distCoefs)
    return pro_points

def ImageFrame2CameraFrame(self, pixPoints):
    fx = self.cameraMatrix[0][0]
    u0 = self.cameraMatrix[0][2]
    fy = self.cameraMatrix[1][1]
    v0 = self.cameraMatrix[1][2]
    zc = (self.f[0] + self.f[1]) / 2
    xc = (pixPoints[0] - u0) * self.f[0] / fx   # f = fx * sensor size / resolution
    yc = (pixPoints[1] - v0) * self.f[1] / fy
    point = np.array([xc, yc, zc])
    return point
```

2. Finding P's direction vector in the world frame

The computation above yields P', but its coordinates are expressed in the camera frame, so we still need the corresponding coordinates in the world frame. To convert P' from camera coordinates to world coordinates, that is, to find the camera origin's position in world coordinates, we use the three Euler angles obtained when solving the pose. Rotating the camera frame about its Z, Y and X axes in turn by those angles makes it fully parallel to the world frame. During these three rotations P' rotates along with the frame, so its position in the world frame would change; to keep P' at its original position in the world frame W, we therefore apply the three rotations in reverse to P'. The rotated point has new camera-frame coordinates whose vector from the camera origin Oc equals the same vector expressed in the world frame, and the world coordinate of P' is then the camera's world position plus this vector.

The concrete steps are as follows. First rotation: the original camera frame is rotated about the Z axis by thetaz into a new frame; rotating the point about Z by -thetaz gives its coordinates in that frame.

```python
def CodeRotateByZ(self, x, y, thetaz):  # rotate a point about the Z axis
    x1 = x  # copy the inputs first so calls such as x, y = CodeRotateByZ(x, y, ...) stay correct
    y1 = y
    rz = thetaz * 3.141592653589793 / 180
    outx = math.cos(rz) * x1 - math.sin(rz) * y1
    outy = math.sin(rz) * x1 + math.cos(rz) * y1
    return outx, outy
```

Second rotation: the frame is rotated about the Y axis by thetay; rotating the point about Y by -thetay gives its coordinates in the new frame.

```python
def CodeRotateByY(self, x, z, thetay):
    x1 = x
    z1 = z
    ry = thetay * 3.141592653589793 / 180
    outx = math.cos(ry) * x1 + math.sin(ry) * z1
    outz = math.cos(ry) * z1 - math.sin(ry) * x1
    return outx, outz
```

Third rotation: the frame is rotated about the X axis by thetax; rotating the point about X by -thetax gives its coordinates in the final frame.

```python
def CodeRotateByX(self, y, z, thetax):
    y1 = y
    z1 = z
    rx = thetax * 3.141592653589793 / 180
    outy = math.cos(rx) * y1 - math.sin(rx) * z1
    outz = math.cos(rx) * z1 + math.sin(rx) * y1
    return outy, outz
```

At this point the camera's position in the world frame is the negated, back-rotated translation vector. Combining the rotation functions above, the complete pose solver reads:

```python
def solver(self):
    retval, self.rvec, self.tvec = cv2.solvePnP(self.Points3D, self.Points2D, self.cameraMatrix, self.distCoefs)
    #print('self.rvec:', self.rvec)
    #print('self.tvec:', self.tvec)
    thetax, thetay, thetaz = self.rotationVectorToEulerAngles(self.rvec, 0)
    x = self.tvec[0][0]
    y = self.tvec[1][0]
    z = self.tvec[2][0]
    self.Position_OwInCx = x
    self.Position_OwInCy = y
    self.Position_OwInCz = z
    self.Position_theta = [thetax, thetay, thetaz]
    #print('Position_theta:', self.Position_theta)
    x, y = self.CodeRotateByZ(x, y, -1 * thetaz)
    x, z = self.CodeRotateByY(x, z, -1 * thetay)
    y, z = self.CodeRotateByX(y, z, -1 * thetax)
    self.Theta_W2C = (-1 * thetax, -1 * thetay, -1 * thetaz)
    self.Position_OcInWx = x * (-1)
    self.Position_OcInWy = y * (-1)
    self.Position_OcInWz = z * (-1)
    self.Position_OcInW = np.array([self.Position_OcInWx, self.Position_OcInWy, self.Position_OcInWz])
    print('Position_OcInW:', self.Position_OcInW)
```

The camera's world coordinate a1 and the point's coordinate a2 then define the first line A:

```python
def SetLineA(self, A1x, A1y, A1z, A2x, A2y, A2z):
    self.a1 = np.array([A1x, A1y, A1z])
    self.a2 = np.array([A2x, A2y, A2z])
```
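The three inverse rotations followed by a sign flip are equivalent to the compact identity Oc_w = -R^T * t, where R is the world-to-camera rotation and t the solvePnP translation. Here is a minimal NumPy sketch checking that equivalence; the Euler angles and translation are made-up stand-ins for a solvePnP result, not values from this post:

```python
import math
import numpy as np

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Made-up Euler angles (radians, ZYX order) and translation for illustration.
tx, ty, tz = 0.2, -0.4, 1.1
t = np.array([10.0, -5.0, 80.0])
R = rot_z(tz) @ rot_y(ty) @ rot_x(tx)   # world-to-camera rotation

# Step-by-step route taken by solver(): undo Z, then Y, then X, then negate.
v = rot_z(-tz) @ t
v = rot_y(-ty) @ v
v = rot_x(-tx) @ v
cam_pos_stepwise = -v

# Compact closed form: camera center in world coordinates.
cam_pos_direct = -R.T @ t

assert np.allclose(cam_pos_stepwise, cam_pos_direct)
```

The per-axis rotation matrices match the sign conventions of CodeRotateByZ/Y/X above, so the step-by-step route is exactly what solver() computes, only expressed with matrices.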
3. Computing P's world coordinate from the two lines

Carrying out the same procedure on the other photo yields the two lines A and B, and the world coordinate of the target point is their intersection. In practice, however, measurement error means A and B will almost never actually intersect, so instead we compute the closest pair of points between them.

```python
class GetDistanceOf2linesIn3D():
    def __init__(self):
        print('GetDistanceOf2linesIn3D class')

    def dot(self, ax, ay, az, bx, by, bz):
        return ax * bx + ay * by + az * bz

    def cross(self, ax, ay, az, bx, by, bz):
        x = ay * bz - az * by
        y = az * bx - ax * bz
        z = ax * by - ay * bx
        return x, y, z

    def crossarray(self, a, b):
        x = a[1] * b[2] - a[2] * b[1]
        y = a[2] * b[0] - a[0] * b[2]
        z = a[0] * b[1] - a[1] * b[0]
        return np.array([x, y, z])

    def norm(self, ax, ay, az):
        return math.sqrt(self.dot(ax, ay, az, ax, ay, az))

    def norm2(self, one):
        return math.sqrt(np.dot(one, one))

    def SetLineA(self, A1x, A1y, A1z, A2x, A2y, A2z):
        self.a1 = np.array([A1x, A1y, A1z])
        self.a2 = np.array([A2x, A2y, A2z])

    def SetLineB(self, B1x, B1y, B1z, B2x, B2y, B2z):
        self.b1 = np.array([B1x, B1y, B1z])
        self.b2 = np.array([B2x, B2y, B2z])

    def GetDistance(self):
        d1 = self.a2 - self.a1
        d2 = self.b2 - self.b1
        e = self.b1 - self.a1
        cross_e_d2 = self.crossarray(e, d2)
        cross_e_d1 = self.crossarray(e, d1)
        cross_d1_d2 = self.crossarray(d1, d2)
        dd = self.norm2(cross_d1_d2)
        t1 = np.dot(cross_e_d2, cross_d1_d2) / (dd * dd)
        t2 = np.dot(cross_e_d1, cross_d1_d2) / (dd * dd)
        self.PonA = self.a1 + (self.a2 - self.a1) * t1
        self.PonB = self.b1 + (self.b2 - self.b1) * t2
        self.distance = self.norm2(self.PonB - self.PonA)
        print('distance=', self.distance)
        return self.distance
```

Summary and verification

The above explains the overall principle and procedure. The complete solver code and its results follow. The "calibration.csv" file opened in the code was generated by calibration and stores the intrinsics obtained for the author's D7000:

```
dim,3696,2448
cameraMatrix,5546.18009098042,5572.883,1821.049,1347.461
distCoefs,-0.12735,0.200792,-0.00209,0.000943,-0.79439
```

Full code listing:

```python
#!/usr/bin/env python3
# coding:utf-8
import cv2
import numpy as np
import math
import csv
import argparse


class PNPSolver():
    def __init__(self):
        self.COLOR_WHITE = (255, 255, 255)
        self.COLOR_BLUE = (255, 0, 0)
        self.COLOR_LBLUE = (255, 200, 100)
        self.COLOR_GREEN = (0, 240, 0)
        self.COLOR_RED = (0, 0, 255)
        self.COLOR_DRED = (0, 0, 139)
        self.COLOR_YELLOW = (29, 227, 245)
        self.COLOR_PURPLE = (224, 27, 217)
        self.COLOR_GRAY = (127, 127, 127)
        self.Points3D = np.zeros((1, 4, 3), np.float32)  # the four world-frame points
        self.Points2D = np.zeros((1, 4, 2), np.float32)  # the four pixel-frame points
        self.point2find = np.zeros((1, 2), np.float32)
        self.cameraMatrix = None
        self.distCoefs = None
        self.f = 0

    def rotationVectorToEulerAngles(self, rvecs, anglestype):
        R = np.zeros((3, 3), dtype=np.float64)
        cv2.Rodrigues(rvecs, R)
        sy = math.sqrt(R[2, 1] * R[2, 1] + R[2, 2] * R[2, 2])
        singular = sy < 1e-6
        if not singular:
            x = math.atan2(R[2, 1], R[2, 2])
            y = math.atan2(-R[2, 0], sy)
            z = math.atan2(R[1, 0], R[0, 0])
        else:
            x = math.atan2(-R[1, 2], R[1, 1])
            y = math.atan2(-R[2, 0], sy)
            z = 0
        if anglestype == 0:      # degrees
            x = x * 180.0 / 3.141592653589793
            y = y * 180.0 / 3.141592653589793
            z = z * 180.0 / 3.141592653589793
        elif anglestype == 1:    # radians: leave as-is
            pass
        print(x)
        return x, y, z

    def CodeRotateByZ(self, x, y, thetaz):  # rotate a point about the Z axis
        x1 = x  # copy the inputs first so calls such as x, y = CodeRotateByZ(x, y, ...) stay correct
        y1 = y
        rz = thetaz * 3.141592653589793 / 180
        outx = math.cos(rz) * x1 - math.sin(rz) * y1
        outy = math.sin(rz) * x1 + math.cos(rz) * y1
        return outx, outy

    def CodeRotateByY(self, x, z, thetay):
        x1 = x
        z1 = z
        ry = thetay * 3.141592653589793 / 180
        outx = math.cos(ry) * x1 + math.sin(ry) * z1
        outz = math.cos(ry) * z1 - math.sin(ry) * x1
        return outx, outz

    def CodeRotateByX(self, y, z, thetax):
        y1 = y
        z1 = z
        rx = thetax * 3.141592653589793 / 180
        outy = math.cos(rx) * y1 - math.sin(rx) * z1
        outz = math.cos(rx) * z1 + math.sin(rx) * y1
        return outy, outz

    def solver(self):
        retval, self.rvec, self.tvec = cv2.solvePnP(self.Points3D, self.Points2D, self.cameraMatrix, self.distCoefs)
        thetax, thetay, thetaz = self.rotationVectorToEulerAngles(self.rvec, 0)
        x = self.tvec[0][0]
        y = self.tvec[1][0]
        z = self.tvec[2][0]
        self.Position_OwInCx = x
        self.Position_OwInCy = y
        self.Position_OwInCz = z
        self.Position_theta = [thetax, thetay, thetaz]
        #print('Position_theta:', self.Position_theta)
        x, y = self.CodeRotateByZ(x, y, -1 * thetaz)
        x, z = self.CodeRotateByY(x, z, -1 * thetay)
        y, z = self.CodeRotateByX(y, z, -1 * thetax)
        self.Theta_W2C = (-1 * thetax, -1 * thetay, -1 * thetaz)
        self.Position_OcInWx = x * (-1)
        self.Position_OcInWy = y * (-1)
        self.Position_OcInWz = z * (-1)
        self.Position_OcInW = np.array([self.Position_OcInWx, self.Position_OcInWy, self.Position_OcInWz])
        print('Position_OcInW:', self.Position_OcInW)

    def WordFrame2ImageFrame(self, WorldPoints):
        pro_points, jacobian = cv2.projectPoints(WorldPoints, self.rvec, self.tvec, self.cameraMatrix, self.distCoefs)
        return pro_points

    def ImageFrame2CameraFrame(self, pixPoints):
        fx = self.cameraMatrix[0][0]
        u0 = self.cameraMatrix[0][2]
        fy = self.cameraMatrix[1][1]
        v0 = self.cameraMatrix[1][2]
        zc = (self.f[0] + self.f[1]) / 2
        xc = (pixPoints[0] - u0) * self.f[0] / fx   # f = fx * sensor size / resolution
        yc = (pixPoints[1] - v0) * self.f[1] / fy
        point = np.array([xc, yc, zc])
        return point

    def getudistmap(self, filename):
        with open(filename, 'r', newline='') as csvfile:
            spamreader = csv.reader(csvfile, delimiter=',', quotechar='"')
            rows = [row for row in spamreader]
            self.cameraMatrix = np.zeros((3, 3))
            size_w = 23.6   # sensor width in mm
            size_h = 15.6   # sensor height in mm
            imageWidth = int(rows[0][1])
            imageHeight = int(rows[0][2])
            self.cameraMatrix[0][0] = rows[1][1]
            self.cameraMatrix[1][1] = rows[1][2]
            self.cameraMatrix[0][2] = rows[1][3]
            self.cameraMatrix[1][2] = rows[1][4]
            self.cameraMatrix[2][2] = 1
            print(len(rows[2]))
            if len(rows[2]) == 5:
                print('fisheye')
                self.distCoefs = np.zeros((4, 1))
                self.distCoefs[0][0] = rows[2][1]
                self.distCoefs[1][0] = rows[2][2]
                self.distCoefs[2][0] = rows[2][3]
                self.distCoefs[3][0] = rows[2][4]
                scaled_K = self.cameraMatrix * 0.8  # the values of K scale with the image dimension
                scaled_K[2][2] = 1.0
            else:
                print('normal')
                self.distCoefs = np.zeros((1, 5))
                self.distCoefs[0][0] = rows[2][1]
                self.distCoefs[0][1] = rows[2][2]
                self.distCoefs[0][2] = rows[2][3]
                self.distCoefs[0][3] = rows[2][4]
                self.distCoefs[0][4] = rows[2][5]
            print('dim %d*%d' % (imageWidth, imageHeight))
            print('Kt \n', self.cameraMatrix)
            print('Dt \n', self.distCoefs)
            self.f = [self.cameraMatrix[0][0] * (size_w / imageWidth),
                      self.cameraMatrix[1][1] * (size_h / imageHeight)]
            print('f \n', self.f)
            return


class GetDistanceOf2linesIn3D():
    def __init__(self):
        print('GetDistanceOf2linesIn3D class')

    def dot(self, ax, ay, az, bx, by, bz):
        return ax * bx + ay * by + az * bz

    def cross(self, ax, ay, az, bx, by, bz):
        x = ay * bz - az * by
        y = az * bx - ax * bz
        z = ax * by - ay * bx
        return x, y, z

    def crossarray(self, a, b):
        x = a[1] * b[2] - a[2] * b[1]
        y = a[2] * b[0] - a[0] * b[2]
        z = a[0] * b[1] - a[1] * b[0]
        return np.array([x, y, z])

    def norm(self, ax, ay, az):
        return math.sqrt(self.dot(ax, ay, az, ax, ay, az))

    def norm2(self, one):
        return math.sqrt(np.dot(one, one))

    def SetLineA(self, A1x, A1y, A1z, A2x, A2y, A2z):
        self.a1 = np.array([A1x, A1y, A1z])
        self.a2 = np.array([A2x, A2y, A2z])

    def SetLineB(self, B1x, B1y, B1z, B2x, B2y, B2z):
        self.b1 = np.array([B1x, B1y, B1z])
        self.b2 = np.array([B2x, B2y, B2z])

    def GetDistance(self):
        d1 = self.a2 - self.a1
        d2 = self.b2 - self.b1
        e = self.b1 - self.a1
        cross_e_d2 = self.crossarray(e, d2)
        cross_e_d1 = self.crossarray(e, d1)
        cross_d1_d2 = self.crossarray(d1, d2)
        dd = self.norm2(cross_d1_d2)
        t1 = np.dot(cross_e_d2, cross_d1_d2) / (dd * dd)
        t2 = np.dot(cross_e_d1, cross_d1_d2) / (dd * dd)
        self.PonA = self.a1 + (self.a2 - self.a1) * t1
        self.PonB = self.b1 + (self.b2 - self.b1) * t2
        self.distance = self.norm2(self.PonB - self.PonA)
        print('distance=', self.distance)
        return self.distance


if __name__ == '__main__':
    print('***************************************')
    print('test example')
    print('***************************************')
    parser = argparse.ArgumentParser(description='test')
    parser.add_argument('-file', type=str, default='calibration.csv')
    args = parser.parse_args()
    calibrationfile = args.file

    p4psolver1 = PNPSolver()
    # first test scene (these values are immediately overwritten below)
    P11 = np.array([0, 0, 0])
    P12 = np.array([0, 200, 0])
    P13 = np.array([150, 0, 0])
    P14 = np.array([150, 200, 0])
    p11 = np.array([2985, 1688])
    p12 = np.array([5081, 1690])
    p13 = np.array([2997, 2797])
    p14 = np.array([5544, 2757])
    # second test scene: the 210 mm x 300 mm board used in this post
    P11 = np.array([0, 0, 0])
    P12 = np.array([0, 300, 0])
    P13 = np.array([210, 0, 0])
    P14 = np.array([210, 300, 0])
    p11 = np.array([1765, 725])
    p12 = np.array([3068, 1254])
    p13 = np.array([1249, 1430])
    p14 = np.array([2648, 2072])
    p4psolver1.Points3D[0] = np.array([P11, P12, P13, P14])
    p4psolver1.Points2D[0] = np.array([p11, p12, p13, p14])
    #p4psolver1.point2find = np.array([4149, 671])
    #p4psolver1.point2find = np.array([675, 835])
    p4psolver1.point2find = np.array([691, 336])
    p4psolver1.getudistmap(calibrationfile)
    p4psolver1.solver()

    p4psolver2 = PNPSolver()
    # first test scene (overwritten below, as above)
    P21 = np.array([0, 0, 0])
    P22 = np.array([0, 200, 0])
    P23 = np.array([150, 0, 0])
    P24 = np.array([150, 200, 0])
    p21 = np.array([3062, 3073])
    p22 = np.array([3809, 3089])
    p23 = np.array([3035, 3208])
    p24 = np.array([3838, 3217])
    # second test scene
    P21 = np.array([0, 0, 0])
    P22 = np.array([0, 300, 0])
    P23 = np.array([210, 0, 0])
    P24 = np.array([210, 300, 0])
    p21 = np.array([1307, 790])
    p22 = np.array([2555, 797])
    p23 = np.array([1226, 1459])
    p24 = np.array([2620, 1470])
    p4psolver2.Points3D[0] = np.array([P21, P22, P23, P24])
    p4psolver2.Points2D[0] = np.array([p21, p22, p23, p24])
    #p4psolver2.point2find = np.array([3439, 2691])
    #p4psolver2.point2find = np.array([712, 1016])
    p4psolver2.point2find = np.array([453, 655])
    p4psolver2.getudistmap(calibrationfile)
    p4psolver2.solver()

    # line A: from camera center 1 through the target point
    point2find1_CF = p4psolver1.ImageFrame2CameraFrame(p4psolver1.point2find)
    Oc1P_1 = np.array(point2find1_CF)
    print(Oc1P_1)
    Oc1P_1[0], Oc1P_1[1] = p4psolver1.CodeRotateByZ(Oc1P_1[0], Oc1P_1[1], p4psolver1.Theta_W2C[2])
    Oc1P_1[0], Oc1P_1[2] = p4psolver1.CodeRotateByY(Oc1P_1[0], Oc1P_1[2], p4psolver1.Theta_W2C[1])
    Oc1P_1[1], Oc1P_1[2] = p4psolver1.CodeRotateByX(Oc1P_1[1], Oc1P_1[2], p4psolver1.Theta_W2C[0])
    a1 = np.array([p4psolver1.Position_OcInWx, p4psolver1.Position_OcInWy, p4psolver1.Position_OcInWz])
    a2 = a1 + Oc1P_1

    # line B: from camera center 2 through the target point
    point2find2_CF = p4psolver2.ImageFrame2CameraFrame(p4psolver2.point2find)
    Oc2P_2 = np.array(point2find2_CF)
    print(Oc2P_2)
    Oc2P_2[0], Oc2P_2[1] = p4psolver2.CodeRotateByZ(Oc2P_2[0], Oc2P_2[1], p4psolver2.Theta_W2C[2])
    Oc2P_2[0], Oc2P_2[2] = p4psolver2.CodeRotateByY(Oc2P_2[0], Oc2P_2[2], p4psolver2.Theta_W2C[1])
    Oc2P_2[1], Oc2P_2[2] = p4psolver2.CodeRotateByX(Oc2P_2[1], Oc2P_2[2], p4psolver2.Theta_W2C[0])
    b1 = np.array([p4psolver2.Position_OcInWx, p4psolver2.Position_OcInWy, p4psolver2.Position_OcInWz])
    b2 = b1 + Oc2P_2

    g = GetDistanceOf2linesIn3D()
    g.SetLineA(a1[0], a1[1], a1[2], a2[0], a2[1], a2[2])
    g.SetLineB(b1[0], b1[1], b1[2], b2[0], b2[1], b2[2])
    distance = g.GetDistance()
    pt = (g.PonA + g.PonB) / 2
    print(pt)

    A = np.array([241.64926392, -78.7377477, 166.08307887])
    B = np.array([141.010851, -146.64449841, 167.28164652])
    print(math.sqrt(np.dot(A - B, A - B)))
```

The world coordinate found for point A is:

```
distance= 0.13766177937900279
[241.64926392  -78.7377477  166.08307887]
```

The world coordinate found for point B is:

```
distance= 0.7392672771306183
[ 141.010851  -146.64449841  167.28164652]
```

Computing the distance between A and B:

```python
A = np.array([241.64926392, -78.7377477, 166.08307887])
B = np.array([141.010851, -146.64449841, 167.28164652])
print(math.sqrt(np.dot(A - B, A - B)))
```

gives 121.41191667813395.

These numbers show the tea canister is about 171 mm tall (the glass calibration board is itself 4 mm thick) and about 121 mm across the diagonal. Alternatively, the board's thickness can be entered into the Z component when generating the world coordinates:

```python
P11 = np.array([0, 0, 4])
P12 = np.array([0, 300, 4])
P13 = np.array([210, 0, 4])
P14 = np.array([210, 300, 4])
P21 = np.array([0, 0, 4])
P22 = np.array([0, 300, 4])
P23 = np.array([210, 0, 4])
P24 = np.array([210, 300, 4])
```

which directly yields the corresponding result:

```
[ 141.010851  -146.64449841  171.28164652]
121.41191667813395
```

References

1. https://www.cnblogs.com/singlex/p/pose_estimation_3.html
2. https://www.cnblogs.com/singlex/p/6037020.html
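The closest-point construction in GetDistance() can be sanity-checked in isolation. Below is a small functional re-implementation of the same formulas (the helper name closest_points_on_lines is mine, not from the original code), applied to two perpendicular skew lines whose common-perpendicular distance is known to be exactly 1:

```python
import numpy as np

def closest_points_on_lines(a1, a2, b1, b2):
    """Closest points between line A (through a1, a2) and line B (through b1, b2).

    Same construction as GetDistance(): project the connecting vector onto
    the common normal cross(d1, d2) to obtain the line parameters t1 and t2.
    """
    a1, a2, b1, b2 = (np.asarray(p, dtype=float) for p in (a1, a2, b1, b2))
    d1, d2, e = a2 - a1, b2 - b1, b1 - a1
    n = np.cross(d1, d2)            # common normal of the two lines
    nn = np.dot(n, n)
    if nn < 1e-12:                  # parallel lines have no unique closest pair
        raise ValueError("lines are (nearly) parallel")
    t1 = np.dot(np.cross(e, d2), n) / nn
    t2 = np.dot(np.cross(e, d1), n) / nn
    pa, pb = a1 + t1 * d1, b1 + t2 * d2
    return pa, pb, float(np.linalg.norm(pb - pa))

# Line A: the x axis. Line B: parallel to the y axis, offset to z = 1.
pa, pb, dist = closest_points_on_lines([0, 0, 0], [1, 0, 0],
                                       [0, 0, 1], [0, 1, 1])
# dist is 1.0, and the estimated "intersection" (pa + pb) / 2 is (0, 0, 0.5).
```

In the measurement pipeline above, that midpoint (g.PonA + g.PonB) / 2 is exactly what is reported as the target's world coordinate, with dist serving as a rough error indicator.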