Journal of Robotics, Volume 2015, Article ID 472461, 6 pages
http://dx.doi.org/10.1155/2015/472461

Research Article

Pose Self-Measurement of Noncooperative Spacecraft Based on Solar Panel Triangle Structure

Jingzhou Song and Caixiu Cao

School of Automation, Beijing University of Posts and Telecommunications, No. 10 Xitucheng Road, Haidian District, Beijing 100876, China

Received 19 August 2014; Revised 5 January 2015; Accepted 5 January 2015; Published 26 January 2015

Academic Editor: Gordon R. Pennock

Copyright © 2015 Jingzhou Song and Caixiu Cao. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Aiming at the recognition and localization of noncooperative spacecraft, this paper presents a monocular vision pose measurement method based on the solar panel triangle structure. First, an autonomous feature-structure recognition algorithm based on the sliding window Hough transform (SWHT) and the inscribed circle of a triangle is proposed; it yields the image coordinates of the feature points on the triangle. Combined with the P4P algorithm and the known structure of the spacecraft, the relative pose of the target, expressed as a rotation matrix and a translation vector, is then calculated. The whole algorithm can be loaded into a prewritten onboard program that completes feature extraction and relative pose measurement autonomously, without human intervention and without mounting any markers on the target. The measured values are compared with accurate values from a laser tracker; the maximum position error is below 5% and the rotation error is below 4%, which meets the requirements of noncooperative spacecraft pose measurement for observation, tracking, and docking in the final rendezvous phase.

1. Introduction
In on-orbit servicing tasks, it is usually necessary to know the position and orientation of the target spacecraft. Most of these spacecraft are noncooperative: they have lost contact with the ground and carry no artificial markers for auxiliary measurement on their structure [1]. For such spacecraft, only natural surface features can be used for relative pose measurement, so the measurement error and difficulty are much larger than for cooperative targets [2]. In recent years, increasing attention has been paid to the pose measurement of noncooperative spacecraft because of its high value and promising applications. Xu et al. proposed an algorithm to measure the position and orientation of a noncooperative spacecraft with the solar panel triangle as the recognition object; in this algorithm, human interaction is used to provide the characteristic information of the object [3]. A pose measurement method for noncooperative spacecraft based on a rectangular structure was presented by Miao et al. [4]; it chooses a large structure as the target feature, so as the approach distance decreases the method fails because the feature leaves the camera's field of view. Terui et al. developed a pose measurement algorithm using stereo vision [5], which requires the three-dimensional model of the target. Another human-interactive pose measurement method was presented by Du et al. [6], in which two cameras were used to guard against camera malfunction. Gao et al. proposed a measurement method based on an incomplete rectangle [7], which cannot recover the position of the target. Lichter and Dubowsky presented a method to estimate the state, shape, and parameters of the target [8]; it requires data from multiple visual sensors at different distances. In summary, although there is extensive research on the relative position and orientation measurement of noncooperative spacecraft, studies specifically addressing spacecraft feature recognition are limited [9]. Existing feature recognition mostly relies on remote human control, its autonomy and real-time performance are low, and the measurement results are affected by transmission delay and reliability. To obtain the relative position and orientation of a noncooperative spacecraft in the close-range rendezvous phase, this paper proposes a single-camera method that calculates the relative pose from a solar panel triangle structure.

2. Relative Pose Measurement

2.1. Coordinate Systems in the Vision System

Four coordinate systems are involved in the noncooperative spacecraft pose measurement system: the target (space) coordinate system $O_t\text{-}x_t y_t z_t$, the camera coordinate system $O_c\text{-}x_c y_c z_c$, the image coordinate system $O\text{-}uv$, and the world coordinate system $O_w\text{-}x_w y_w z_w$. Since the camera is rigidly mounted on the chaser, there is no relative motion between the chaser and the camera. In this paper, only the relative pose between the target coordinate system and the camera coordinate system is considered.

2.2. Pose Measurement

The mapping between the space coordinates and the image coordinates of the feature points is determined by the pinhole camera model [10]. As shown in Figure 1, all light rays pass through the optical center; that is, the object point, the image point, and the optical center are collinear.
Figure 1: Pinhole camera model.

Based on the pinhole camera model, the transformation of a feature point between the camera coordinate system and the image coordinate system can be described as

$$ z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix}, \qquad (1) $$

where $f_x$ and $f_y$ are the equivalent focal lengths along the $u$ and $v$ axes and $(u_0, v_0)$ is the center of the image plane. The rigid relationship of a feature point between the target coordinate system and the camera coordinate system is given by

$$ \begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = R \begin{bmatrix} x_t \\ y_t \\ z_t \end{bmatrix} + T. \qquad (2) $$

Formula (2) can be expressed in homogeneous coordinates as

$$ \begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & T \\ \mathbf{0} & 1 \end{bmatrix} \begin{bmatrix} x_t \\ y_t \\ z_t \\ 1 \end{bmatrix}, \qquad (3) $$

where $R$ is the rotation matrix and $T$ is the translation vector. Substituting (3) into (1), we obtain

$$ z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} R & T \end{bmatrix} \begin{bmatrix} x_t \\ y_t \\ z_t \\ 1 \end{bmatrix}. \qquad (4) $$

Note that $z_c$ has a practical meaning: it is the projection of the distance between the optical center and the object point onto the optical axis. It is assumed that the four feature points in target space are coplanar. Define their plane as $z_t = 0$, so the space coordinates of the feature points are $(x_{ti}, y_{ti}, 0)$, $i = 1, \ldots, 4$; then (4) simplifies to

$$ z_{ci} \begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} r_1 & r_2 & T \end{bmatrix} \begin{bmatrix} x_{ti} \\ y_{ti} \\ 1 \end{bmatrix}, \qquad (5) $$

where $(x_{ti}, y_{ti})$ and $(u_i, v_i)$ are known quantities, $f_x$, $f_y$, $u_0$, $v_0$ are the intrinsic parameters of the camera, which can be obtained by calibration, and $R = [r_1\ r_2\ r_3]$ and $T$ are the quantities to be solved. As $r_3 = r_1 \times r_2$, the problem reduces to solving $r_1$, $r_2$, and $T$. Writing $K$ for the intrinsic matrix in (1) and noting $H = K\,[r_1\ r_2\ T]$, we have

$$ z_{ci} \begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix} = H \begin{bmatrix} x_{ti} \\ y_{ti} \\ 1 \end{bmatrix}. \qquad (6) $$

Four groups of corresponding points provide eight homogeneous linear equations through (6), so $H$ can be determined uniquely up to scale. Since $\lVert r_1 \rVert = \lVert r_2 \rVert = 1$, $r_1$, $r_2$, and $T$ can be solved according to (5), and then $r_3$ and hence $R$ can be obtained as well. The problem of solving the pose of a target from the space coordinates and image coordinates of four coplanar points is called the P4P problem.

3. Triangle Recognition

3.1. Recognition Principle

The structure and size of the spacecraft are assumed to be known. According to the previous section, before calculating the relative pose between the spacecraft and the camera, the image coordinates of four coplanar feature points must be obtained. The solar panel triangle is a typical component connecting the main body of the spacecraft and the solar panels. The image coordinates of the triangle's vertices and of the center of its inscribed circle are obtained with the triangle recognition algorithm based on the SWHT introduced below.

The basic idea of the Hough transform rests on the duality between points and lines: any point on a line in image space maps to a sinusoidal curve in parameter space. All points on the same straight line share the same line parameters in the original coordinate system, which means that all of their curves pass through a single point in parameter space, as shown in Figure 2. Thus, the length of a line segment in the Cartesian coordinate system corresponds to the number of curves that intersect at the same point in the polar parameter space.

Figure 2: Line in Cartesian space mapping to parameter space.

The basic property of the triangle that is exploited is that any triangle has exactly one inscribed circle, and the distances from its center to the three sides are equal (see Figure 3). Based on this property, a small window is slid over the image, with the center of the window taken as the origin of the Cartesian coordinate system. When the center of the sliding window coincides with the center of the inscribed circle, three peaks appear in parameter space whose $|\rho|$ values are equal and whose $\theta$ values are different.

Figure 3: Triangle and its inscribed circle.

3.2. Detection Result

In this paper, the image processing model and the triangle feature detection system are built with MATLAB. A fixed-size window is slid over the original image; when the window center reaches the center of the inscribed circle, the ideal triangle feature is extracted. The detection result is shown in Figure 4.

Figure 4: Experimental result.
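As a concrete illustration of the SWHT recognition principle described above, the following sketch tests whether a candidate window center is the center of the triangle's inscribed circle: a Hough accumulator is built from the edge pixels inside the window (with the window center as the coordinate origin), and the candidate is accepted when three strong peaks with well-separated orientations have nearly equal $|\rho|$. The authors' detector is implemented in MATLAB and is not listed in the paper, so this Python/NumPy version is only a minimal sketch; the function names, vote threshold, and tolerances are assumptions, not values from the paper.

```python
import numpy as np

def window_hough(edges, cx, cy, half, n_theta=180, rho_res=1.0):
    """Hough accumulator of the edge pixels inside a square window centred
    at (cx, cy).  The window centre is the coordinate origin, so a straight
    edge at distance d from the centre votes into bins with |rho| close to d."""
    patch = edges[cy - half:cy + half + 1, cx - half:cx + half + 1]
    ys, xs = np.nonzero(patch)
    xs = xs.astype(float) - half                      # window-centred x
    ys = ys.astype(float) - half                      # window-centred y
    thetas = np.deg2rad(np.arange(n_theta))           # 0 .. 179 degrees
    rhos = np.outer(xs, np.cos(thetas)) + np.outer(ys, np.sin(thetas))
    max_rho = half * np.sqrt(2.0)
    n_rho = int(np.ceil(2.0 * max_rho / rho_res)) + 1
    acc = np.zeros((n_rho, n_theta), dtype=int)
    bins = np.clip(np.round((rhos + max_rho) / rho_res).astype(int), 0, n_rho - 1)
    for j in range(n_theta):                          # vote, one orientation at a time
        np.add.at(acc[:, j], bins[:, j], 1)
    return acc, max_rho

def looks_like_incenter(acc, max_rho, rho_res=1.0, min_votes=20,
                        theta_sep=15, rho_tol=2.0):
    """True when three strong peaks with well-separated orientations have
    (nearly) equal |rho|: the three triangle sides are then equidistant from
    the window centre, i.e. the centre is the inscribed-circle centre."""
    a = acc.copy()
    dists = []
    for _ in range(3):
        i, j = np.unravel_index(np.argmax(a), a.shape)
        if a[i, j] < min_votes:
            return False                              # fewer than three clear lines
        dists.append(abs(i * rho_res - max_rho))      # |rho| of this line
        lo, hi = max(0, j - theta_sep), min(a.shape[1], j + theta_sep)
        a[:, lo:hi] = 0                               # suppress this orientation band
    return max(dists) - min(dists) <= rho_tol

def detect_incenter(edges, half=30, step=2):
    """Slide the window over a binary edge image and return the candidate
    centres that pass the incenter test (illustrative brute-force scan)."""
    h, w = edges.shape
    hits = []
    for cy in range(half, h - half, step):
        for cx in range(half, w - half, step):
            acc, max_rho = window_hough(edges, cx, cy, half)
            if looks_like_incenter(acc, max_rho):
                hits.append((cx, cy))
    return hits
```

The exhaustive grid scan in `detect_incenter` is used here only for clarity; in practice the search would be limited to a region of interest or a coarse-to-fine grid, and the $\theta$ wrap-around near 0°/180° would also need to be handled when suppressing orientation bands.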
The pose information is displayed in Table 1.

Table 1: Pose coordinate information.

3.3. Pose Estimation

Using the constraint relationship mentioned in Section 2.1, the pose is estimated as follows. For the detected triangle, the positions of its vertices on the target spacecraft and the lengths of its sides are known, so the center of the inscribed circle can be calculated from the vertices; these four feature points are coplanar. In the target spacecraft coordinate system, take the triangle plane as $z_t = 0$; the three vertices then have space coordinates $(x_1, y_1, 0)$, $(x_2, y_2, 0)$, and $(x_3, y_3, 0)$, and with $a$, $b$, and $c$ denoting the lengths of the sides opposite these vertices, the center of the inscribed circle is

$$ \left( \frac{a x_1 + b x_2 + c x_3}{a + b + c}, \; \frac{a y_1 + b y_2 + c y_3}{a + b + c}, \; 0 \right). $$

Combining the space coordinates of these four feature points with their image coordinates, the relative pose can be obtained.

4. Experiment Study

4.1. Experiment Facilities

To verify the effectiveness and feasibility of the measurement method, a noncooperative spacecraft autonomous measurement system was built on an experimental platform that simulates the relative position and orientation measurement process. The platform includes a CMOS camera, the target spacecraft, a calibration checkerboard, a laser tracker, onboard computer A, and computer B, as shown in Figure 5.

Figure 5: Pose measurement experimental platform.

The CMOS camera is a Photonfocus DS1-D1312 series camera with a Camera Link Base data interface. The target spacecraft is a Chang'e-2 satellite model (scale 1:30) weighing 1.3 kg; its dimensions are 680 mm (length) × 220 mm (height) × 125 mm, and its main body is a 55 mm × 66 mm × 70 mm cuboid. Because the spacecraft used in this paper is a scale model, the experimental distance is equivalent to a close-range distance of 0~1 m, which matches the real scenario. The laser tracker is a Leica AT901B with a measurement radius of 80 m and an accuracy of 0.06 mm; the measurement distance in this paper is about 4 m. Computer A is a PC that presides over image processing and pose calculation. Computer B is the companion computer of the Leica tracker and is in charge of measuring the coordinates of the feature points. The relationships between the modules are described in Figure 6.

Figure 6: Experimental modules and their relationships. (Steps 1–5 represent the actual measurement processes; steps A, B, and C represent the simulation and calculation processes.)

4.2. Experiment Process

Two experiments were conducted to verify the accuracy of the algorithm: one obtains the calculated values using the method of this paper, and the other obtains the accurate values with the laser tracker. In the first experiment, the camera was mounted on a tripod and calibrated, the feature points of the solar panel triangle were extracted with the proposed algorithm, and the relative pose between the camera coordinate system and the spacecraft coordinate system was calculated with the P4P algorithm. In the second experiment, the relative pose between the checkerboard and the camera was acquired by camera calibration, and the relative pose between the checkerboard and the tracker was measured with the Leica, so that the relative pose between the camera and the Leica could be solved. The world coordinates of the feature points were then measured with the Leica, so that the relative pose between the camera and the target could be calculated; these results are taken as the accurate values. The whole measurement process is shown in Figure 7.

Figure 7: Measurement experimental procedures.
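To make the pose computation of Sections 2.2 and 3.3 concrete, the following minimal NumPy sketch strings the steps together: the inscribed-circle center is computed from the three vertices, the plane-to-image homography of (6) is estimated from the four correspondences by the direct linear transform, and the rotation matrix and translation vector are recovered from it. The authors' MATLAB implementation is not given in the paper, so the function names, the SVD-based orthonormalization, and all numbers in the usage example below are illustrative assumptions.

```python
import numpy as np

def incenter(V):
    """Inscribed-circle centre of a triangle with vertices V (3x2 array of
    target-plane coordinates, z_t = 0); the weights are the lengths of the
    sides opposite each vertex."""
    a = np.linalg.norm(V[1] - V[2])   # side opposite vertex 0
    b = np.linalg.norm(V[2] - V[0])   # side opposite vertex 1
    c = np.linalg.norm(V[0] - V[1])   # side opposite vertex 2
    return (a * V[0] + b * V[1] + c * V[2]) / (a + b + c)

def homography_dlt(obj_xy, img_uv):
    """Plane-to-image homography H (up to scale) from >= 4 correspondences:
    each correspondence contributes two of the homogeneous equations in (6)."""
    rows = []
    for (x, y), (u, v) in zip(obj_xy, img_uv):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return Vt[-1].reshape(3, 3)          # null-space vector -> 3x3 H

def pose_from_homography(H, K):
    """Decompose H = K [r1 r2 T] (feature points lie on the plane z_t = 0)
    into a rotation matrix R and a translation vector T."""
    B = np.linalg.inv(K) @ H
    B /= 0.5 * (np.linalg.norm(B[:, 0]) + np.linalg.norm(B[:, 1]))
    if B[2, 2] < 0:                      # keep the target in front of the camera
        B = -B
    r1, r2, T = B[:, 0], B[:, 1], B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)          # project onto the nearest rotation
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    return R, T

# Hypothetical usage (all numbers are made up for illustration): triangle
# vertices in the target frame (mm), detected image points, calibrated K.
verts = np.array([[0.0, 0.0], [120.0, 0.0], [40.0, 90.0]])
obj_xy = np.vstack([verts, incenter(verts)])            # four coplanar points
img_uv = np.array([[512.3, 400.1], [731.8, 395.6], [590.2, 238.7], [602.4, 345.9]])
K = np.array([[1200.0, 0.0, 656.0], [0.0, 1200.0, 541.0], [0.0, 0.0, 1.0]])
R, T = pose_from_homography(homography_dlt(obj_xy, img_uv), K)
```

The final SVD projection is a practical safeguard: with noisy image coordinates the recovered $r_1$ and $r_2$ are only approximately orthonormal, so the sketch snaps the assembled matrix to the nearest rotation.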
4.3. Results and Discussion

The experiment results are shown in Table 2.

Table 2: Errors of pose parameters.

As illustrated in Table 2, the maximum error of the rotation angles is below 4% and the maximum error of the translation parameters is below 5%, which indicates that the measurement method is effective. When the position and rotation of the spacecraft are changed and the experiments repeated, the errors remain stable.

5. Conclusions

To measure the pose of a noncooperative spacecraft, a method based on the SWHT and the properties of a triangle and its inscribed circle is presented. The method, combined with the P4P algorithm and camera calibration, relies on a solar panel triangle structure on the target spacecraft. Comparing the calculated values with the accurate values validates that the proposed method is effective and feasible. The method may be extended to recognize triangle structures in other fields, such as triangular road signs. The P4P algorithm described in this paper can also be used for pose measurement of a noncooperative spacecraft given any four coplanar feature points. Although the difference between the measured values and the accurate values is small, errors remain due to image feature extraction, the P4P algorithm, camera calibration, and so forth. Further studies will focus on improving the algorithm and the experimental facilities to reduce these errors. In addition, other possible feature structures, such as the oval docking ring, will be investigated.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgment

The authors are thankful for the support from the Virtual Reality Laboratory of Beijing University of Posts and Telecommunications; without their dedication this paper would not have been possible.

References

1. N. Cui, P. Wang, J. Guo, et al., "A review of on-orbit servicing," Journal of Astronautics, vol. 28, no. 4, pp. 33–39, 2007.
2. K. Landzettel and A. Albu-Schäffer, "ROKVISS: verification of advanced light weight robotic joints and tele-presence concepts for future space missions," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '08), Pasadena, Calif, USA, 2008.
3. W.-F. Xu, B. Liang, C. Li, Y. Liu, and W.-Y. Qiang, "The approach and simulation study of the relative pose measurement between spacecrafts based on stereo vision," Journal of Astronautics, vol. 30, no. 4, pp. 1421–1428, 2009.
4. X. Miao, F. Zhu, Y. Hao, Q. Wu, and R. Xia, "Vision pose measurement for non-cooperative space vehicles based on solar panel component," Chinese High Technology Letters, vol. 23, no. 4, pp. 400–406, 2013.
5. F. Terui, H. Kamimura, and S. Nishida, "Motion estimation to a failed satellite on orbit using stereo vision and 3D model matching," in Proceedings of the 9th International Conference on Control, Automation, Robotics and Vision (ICARCV '06), Singapore, December 2006.
6. X. D. Du, B. Liang, W. Xu, and Y. Qiu, "Pose measurement of large non-cooperative satellite based on collaborative cameras," Acta Astronautica, vol. 68, no. 11-12, pp. 2047–2065, 2011.
7. X. H. Gao, B. Liang, and W. F. Xu, "Attitude determination of large non-cooperative spacecrafts in final approach," in Proceedings of the 11th International Conference on Control, Automation, Robotics and Vision (ICARCV '10), pp. 1571–1576, Singapore, December 2010.
8. M. D. Lichter and S. Dubowsky, "State, shape, and parameter estimation of space objects from range images," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '04), vol. 3, pp. 2974–2979, New Orleans, La, USA, April–May 2004.
9. K. Zhai, Z. Li, X. Chen, and X. Qu, "Study on recognition method for non-cooperative spacecraft docking ring," Aerospace Control, vol. 31, no. 5, pp. 76–82, 2013.
10. G. Zuo, Mono-Vision Based Aerocraft Pose Measurement Technology, National University of Defense Technology, Beijing, China, 2010.