
triangulate

3-D locations of undistorted matching points in stereo images

Description


worldPoints = triangulate(matchedPoints1,matchedPoints2,stereoParams) returns the 3-D locations of matching pairs of undistorted image points from two stereo images.

worldPoints = triangulate(matchedPoints1,matchedPoints2,cameraMatrix1,cameraMatrix2) returns the 3-D locations of the matching pairs in a world coordinate system. These locations are defined by the camera projection matrices.

[worldPoints,reprojectionErrors] = triangulate(___) additionally returns reprojection errors for the world points, using any of the input arguments from previous syntaxes.

[worldPoints,reprojectionErrors,validIndex] = triangulate(___) additionally returns the indices of valid and invalid world points. Valid points are located in front of the cameras.
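For instance, a minimal sketch of the full syntax, assuming matchedPoints1, matchedPoints2, and stereoParams are already in the workspace (the 1-pixel error threshold is only an illustrative choice, not a recommendation from this page):

[worldPoints,reprojectionErrors,validIndex] = triangulate(matchedPoints1,matchedPoints2,stereoParams);
% Keep points that lie in front of both cameras and reproject within 1 pixel.
goodPoints = worldPoints(validIndex & reprojectionErrors < 1, :);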

Examples


Load stereo parameters.

load('webcamsSceneReconstruction.mat');

Read in the stereo pair of images.

I1 = imread('sceneReconstructionLeft.jpg');
I2 = imread('sceneReconstructionRight.jpg');

Undistort the images.

I1 = undistortImage(I1,stereoParams.CameraParameters1);
I2 = undistortImage(I2,stereoParams.CameraParameters2);

Detect a face in both images.

faceDetector = vision.CascadeObjectDetector;
face1 = faceDetector(I1);
face2 = faceDetector(I2);

Find the center of the face.

center1 = face1(1:2) + face1(3:4)/2;
center2 = face2(1:2) + face2(3:4)/2;

Compute the distance from camera 1 to the face.

point3d = triangulate(center1, center2, stereoParams);
distanceInMeters = norm(point3d)/1000;

Display the detected face and distance.

distanceAsString = sprintf('%0.2f meters', distanceInMeters);
I1 = insertObjectAnnotation(I1,'rectangle',face1,distanceAsString,'FontSize',18);
I2 = insertObjectAnnotation(I2,'rectangle',face2,distanceAsString,'FontSize',18);
I1 = insertShape(I1,'FilledRectangle',face1);
I2 = insertShape(I2,'FilledRectangle',face2);
imshowpair(I1, I2,'montage');

Figure: The stereo image pair displayed side by side, with the detected face annotated with its distance from camera 1.

Input Arguments


matchedPoints1 — Coordinates of points in image 1, specified as an M-by-2 matrix of M number of [x y] coordinates, or as a KAZEPoints, SURFPoints, MSERRegions, cornerPoints, or BRISKPoints object. The matchedPoints1 and matchedPoints2 inputs must contain points that are matched using a function such as matchFeatures.

matchedPoints2 — Coordinates of points in image 2, specified as an M-by-2 matrix of M number of [x y] coordinates, or as a KAZEPoints, SURFPoints, MSERRegions, cornerPoints, or BRISKPoints object. The matchedPoints1 and matchedPoints2 inputs must contain points that are matched using a function such as matchFeatures.
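As a rough illustration of one way to produce such matched pairs (not part of this page's example), assuming two undistorted grayscale images I1 and I2 and using SURF features:

% Hypothetical sketch: detect, extract, and match features to get point pairs.
points1 = detectSURFFeatures(I1);
points2 = detectSURFFeatures(I2);
[features1,validPts1] = extractFeatures(I1,points1);
[features2,validPts2] = extractFeatures(I2,points2);
indexPairs = matchFeatures(features1,features2);
matchedPoints1 = validPts1(indexPairs(:,1));
matchedPoints2 = validPts2(indexPairs(:,2));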

stereoParams — Camera parameters for stereo system, specified as a stereoParameters object. The object contains the intrinsic, extrinsic, and lens distortion parameters of the stereo camera system. You can use the estimateCameraParameters function to estimate camera parameters and return a stereoParameters object.

When you pass a stereoParameters object to the function, the origin of the world coordinate system is located at the optical center of camera 1. The x-axis points to the right, the y-axis points down, and the z-axis points away from the camera.

cameraMatrix1 — Projection matrix for camera 1, specified as a 4-by-3 matrix. The matrix maps a 3-D point in homogeneous coordinates onto the corresponding point in the image from the camera. This input describes the location and orientation of camera 1 in the world coordinate system. cameraMatrix1 must be a real and nonsparse numeric matrix. You can obtain the camera matrix using the cameraMatrix function.

The camera matrices passed to the function define the world coordinate system.

cameraMatrix2 — Projection matrix for camera 2, specified as a 4-by-3 matrix. The matrix maps a 3-D point in homogeneous coordinates onto the corresponding point in the image from the camera. This input describes the location and orientation of camera 2 in the world coordinate system. cameraMatrix2 must be a real and nonsparse numeric matrix. You can obtain the camera matrix using the cameraMatrix function.

The camera matrices passed to the function define the world coordinate system.
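As one hedged illustration of this syntax, the two projection matrices could be built from a stereoParameters object with the cameraMatrix function, placing camera 1 at the world origin (a sketch under those assumptions, not the only valid setup):

% Minimal sketch: camera 1 defines the world origin; camera 2 uses the stereo extrinsics.
camMatrix1 = cameraMatrix(stereoParams.CameraParameters1,eye(3),[0 0 0]);
camMatrix2 = cameraMatrix(stereoParams.CameraParameters2, ...
    stereoParams.RotationOfCamera2,stereoParams.TranslationOfCamera2);
worldPoints = triangulate(matchedPoints1,matchedPoints2,camMatrix1,camMatrix2);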

Output Arguments


worldPoints — 3-D locations of matching pairs of undistorted image points, returned as an M-by-3 matrix. The matrix contains M number of [x y z] locations of matching pairs of undistorted image points from two stereo images.

When you specify the camera geometry using stereoParams, the world point coordinates are relative to the optical center of camera 1.

When you specify the camera geometry using cameraMatrix1 and cameraMatrix2, the world point coordinates are defined by the camera matrices.

The function returns worldPoints as data type double when matchedPoints1 and matchedPoints2 are of data type double. Otherwise, the function returns worldPoints as data type single.

Data Types: single | double

reprojectionErrors — Reprojection errors, returned as an M-by-1 vector. The function projects each world point back into both images. Then, in each image, the function calculates the reprojection error as the distance between the detected and the reprojected point. The reprojectionErrors vector contains the average reprojection error for each world point.
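A rough sketch of reproducing the reported error for a single point by hand, using the 4-by-3 premultiply convention described above and hypothetical variables worldPoint (1-by-3), camMatrix1, camMatrix2, detected1, and detected2 (each 1-by-2):

% Project the world point back into each image (homogeneous row-vector convention).
proj1 = [worldPoint 1]*camMatrix1;  reproj1 = proj1(1:2)/proj1(3);
proj2 = [worldPoint 1]*camMatrix2;  reproj2 = proj2(1:2)/proj2(3);
% Average the two image-space distances to the originally detected points.
avgError = (norm(detected1 - reproj1) + norm(detected2 - reproj2))/2;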

validIndex — Validity of world points, returned as an M-by-1 logical vector. Valid points, denoted as a logical 1 (true), are located in front of the cameras. Invalid points, denoted as a logical 0 (false), are located behind the cameras.

The validity of a world point with respect to the position of a camera is determined by projecting the world point onto the image using the camera matrix and homogeneous coordinates. The world point is valid if the resulting scale factor is positive.
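In other words, a one-point illustration of that check, again assuming the 4-by-3 premultiply convention with a hypothetical worldPoint and camMatrix:

homog = [worldPoint 1]*camMatrix;  % homogeneous image point [w*x w*y w]
isInFront = homog(3) > 0;          % valid when the scale factor w is positive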

Tips

The triangulate function does not account for lens distortion. You can undistort the images using the undistortImage function before detecting the points. Alternatively, you can undistort the points themselves using the undistortPoints function.
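For example, the point-based alternative might look roughly like this, assuming matchedPoints1 and matchedPoints2 are M-by-2 matrices of distorted pixel coordinates and stereoParams is a stereoParameters object:

% Minimal sketch: undistort the matched points themselves, then triangulate.
undistorted1 = undistortPoints(matchedPoints1,stereoParams.CameraParameters1);
undistorted2 = undistortPoints(matchedPoints2,stereoParams.CameraParameters2);
worldPoints = triangulate(undistorted1,undistorted2,stereoParams);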

References

[1] Hartley, R., and A. Zisserman. Multiple View Geometry in Computer Vision. Cambridge University Press, p. 312, 2003.

Extended Capabilities

Version History

Introduced in R2014b