COMPUTER ASSISTED SURGERY 2019, VOL. 24, NO. S1, 44–52
https://doi.org/10.1080/24699322.2018.1557907

RESEARCH ARTICLE

A targeting method for robot-assisted percutaneous needle placement under fluoroscopy guidance

Zhonghao Han (a), Keyi Yu (b), Lei Hu (a), Weishi Li (c), Huilin Yang (d), Minfeng Gan (d), Na Guo (a), Biao Yang (a), Hongsheng Liu (a) and Yuhan Wang (a)

(a) School of Mechanical Engineering and Automation, Beihang University, Beijing, China; (b) Department of Orthopaedic Surgery, Peking Union Medical College Hospital, Peking Union Medical College and Chinese Academy of Medical Science, Beijing, China; (c) Orthopaedic Department, Peking University Third Hospital, Beijing, China; (d) Department of Orthopaedic Surgery, The First Affiliated Hospital of Soochow University, Suzhou, Jiangsu, China

CONTACT: Weishi Li, hzh0602@126.com, Orthopaedic Department, Peking University Third Hospital, 49 North Garden Rd., Haidian District, 100191 Beijing, China.

KEYWORDS: Fluoroscopy guidance; robot-assisted surgery; space registration; C-arm calibration

ABSTRACT
Minimally invasive procedures are rapidly growing in popularity thanks to advances in medical robots, visual navigation and space registration techniques. This paper presents a precise and efficient targeting method for robot-assisted percutaneous needle placement under C-arm fluoroscopy. In this method, a specially constructed end-effector performs fluoroscopy calibration and robot-to-image-space registration simultaneously. In addition, formulations are given to compute the robot targeting movement and to evaluate targeting accuracy using only one X-ray image. With these techniques, radiation exposure and operation time are reduced significantly compared with other commonly used methods. A pre-clinical experiment showed that the maximum angle error was 0.94° and the maximum position error of a target located 80 mm below the end-effector was 1.31 mm. Evaluation of the system in a robot-assisted pedicle screw placement surgery confirmed the accuracy and reliability of the proposed method in clinical applications.

Introduction

Minimally invasive procedures make it possible for surgeons to operate while neither looking directly at nor touching the tissues and organs involved [1]. Unfortunately, the obstructed visual field dramatically increases the difficulty of these surgeries [2]. A number of special techniques have been designed to deal with this problem [3,4]. Among them, medical robots, usually aided by navigation tools, serve to position a target with a high degree of accuracy [2,5]. Meanwhile, fluoroscopy units (C-arms), the most commonly used imaging equipment in the modern operating room, provide precise, real-time intraoperative two-dimensional (2-D) visualization. The combination of these two techniques dramatically broadens the applications of minimally invasive surgery [6–9].

Implementations of fluoroscopy guidance can be categorized under two headings: calibrated methods and uncalibrated methods [10]. Calibrated methods typically comprise two critical registration steps: fluoroscopy calibration and robot-to-image-space registration [11]. In the former, the intrinsic and extrinsic parameters of the imager are usually calculated by placing a phantom carrying several fiducial markers in the field of view (FOV) of the fluoroscope. The robot-to-image-space registration requires at least three points whose positions are known both in the robot frame and in the image-space frame [12]. After registration, a point on an X-ray image determines a ray connecting the X-ray source with that point on the image plane [13,14]. Although this approach computes the exact position of a target, the whole registration process is time-consuming and tends to incur a high radiation dose. Furthermore, because the position and orientation of most C-arms are not encoded, it is difficult for the imager to switch reliably between two specific poses.
As a result, biplanar fluoroscopy is favored, or even necessary, in this kind of method [15]. However, their large volume and high cost mean that biplanar imagers are available only in a few specially equipped operating rooms.

The other kind of method is uncalibrated, or visual-servoing based [16–18]. Requiring no additional sensors, no stereotactic equipment and no prior calibration, these methods can achieve 3-D alignment of a target point and a needle using multiple X-ray images collected from two dissimilar views. Nonetheless, they usually take at least 12 iterations to complete one positioning [17] and 6(n + 1) iterations to align n targets [19]. These repeated iterations increase radiation exposure and operating time significantly, especially in multi-point targeting tasks such as pedicle screw placement surgery. In addition, a common feature of these systems is that the robot orients the needle in space while maintaining the location of one specific point (the needle entry point); in some applications, however, it is the direction of the needle rather than the entry point that matters most.

To solve the problems mentioned above, this study extends the calibrated methods and proposes a robot-assisted targeting approach featuring low radiation and high efficiency. We design a particular end-effector to perform the fluoroscopy calibration and the robot-to-image-space registration simultaneously, and give the formulations of the robot movement for aligning a specific point in the view of a uniplanar fluoroscope. We also develop a method to assess alignment accuracy using a single X-ray film. For evaluation, the method is implemented both in a pre-clinical experiment and in a clinical pedicle screw placement surgery.

Materials and methods

System overview

The system (Figure 1) comprises a 6-degrees-of-freedom (DOF) robot, a digital C-arm, an end-effector, an image correction plate and a PC for robot control and image processing. The end-effector, used to position a needle, is always in the FOV of the C-arm. The correction plate is mounted on the image intensifier of the C-arm.

Figure 1. System structure.

End-effector design

Structure of the designed end-effector (Figure 2): an indicator line and two circles of different sizes, made of radiopaque material, are embedded in a radiolucent hollow housing. The center holes of the two circles are used to place a multi-stage guide sleeve adapted to needles of different sizes. Definition of the tool coordinate system (TCS) of the robot (Figure 2): the tool center point (TCP) is located at the center of the small circle; the x-axis is collinear with the indicator line; and the z-axis is collinear with the line connecting the centers of the two circles, directed from the center of the big circle to the center of the small circle. The transform matrix from the TCS to the robot is calibrated after the end-effector is mounted on the robot, and the result is stored in the robot controller. A quick connector guarantees a reliable and convenient installation of the end-effector on the robot.

Figure 2. The structure of the end-effector.
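The tool frame follows directly from these geometric definitions. The sketch below (a minimal illustration using NumPy, not the authors' code; the function and variable names are assumptions) builds the TCS pose from the two circle centers and the direction of the indicator line, assuming all three are available in the robot base frame from a one-time calibration of the mounted end-effector.

```python
import numpy as np

def tool_frame(center_small, center_big, indicator_dir):
    """Build a 4x4 TCS pose from the end-effector geometry.

    center_small, center_big: 3-D circle centers in the robot base frame.
    indicator_dir: a vector along the radiopaque indicator line.
    """
    center_small = np.asarray(center_small, dtype=float)
    center_big = np.asarray(center_big, dtype=float)

    # z-axis: from the big-circle center towards the small-circle center.
    z = center_small - center_big
    z /= np.linalg.norm(z)

    # x-axis: indicator-line direction, made orthogonal to z.
    x = np.asarray(indicator_dir, dtype=float)
    x -= np.dot(x, z) * z
    x /= np.linalg.norm(x)

    y = np.cross(z, x)          # completes a right-handed frame

    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2] = x, y, z
    T[:3, 3] = center_small     # TCP at the small-circle center
    return T
```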
To calibrate the pinhole model of an imager, at least six points whose positions are known in a world coordinate system are needed, and the registration accuracy improves as more such points are provided [20]. Accordingly, we defined 12 fiducial points on the end-effector (Figure 2); these are not separate metal balls but geometric feature points distributed on the indicator line and the circles. This strategy ensures that the fiducial points can be recognized, and their relative positions determined, from just one X-ray image using basic image-processing techniques such as the Hough transform (this will be elaborated later). It also avoids the registration failures caused by overlap between the fiducial points and the points of the correction plate in X-ray images, and the use of the two circles additionally allows the targeting accuracy to be assessed. When the end-effector is used for positioning, it is good practice to arrange the big circle closer to the emitter, which guarantees that the big circle has the larger projection on the X-ray film; this ensures that the program can always correctly distinguish the two circles and the fiducial points distributed on them.

Registration and targeting

Figure 3 shows the movement of the end-effector in the FOV of the fluoroscope before and after targeting. In the starting position, the projections of the circles are two non-concentric ellipses [21], because the lines (L1, L2) connecting the emitter with the two circle centers are not collinear. The TCS in this position is denoted {A}; P1 and P2 are the centers of the small circle and the big circle, respectively. In the target position, the centers of the two circles (P1', P2'), the target M and the emitter are aligned in 3-D; as a result, the two projected ellipses are concentric, with the projection m of the target as their common center. In this position the TCS is denoted {B}. Note that the distance between the center of the small circle and the emitter is kept constant in each targeting process. This makes it possible to set an appropriate distance between the end-effector and the patient manually in the initial position and to maintain it after the movement. The image-space frame is denoted {W}.

Figure 3. The movement of the end-effector in the FOV of a fluoroscope.

According to the mathematical model of a pinhole camera [22], for a space point P(X_wi, Y_wi, Z_wi) in the world coordinate system, the projection point p(u_i, v_i) on the image can be formulated as (image distortion is not considered):

$$Z_{ci}\begin{bmatrix}u_i\\ v_i\\ 1\end{bmatrix}=\begin{bmatrix}a_x & 0 & u_0 & 0\\ 0 & a_y & v_0 & 0\\ 0 & 0 & 1 & 0\end{bmatrix}\begin{bmatrix}R & t\\ \mathbf{0} & 1\end{bmatrix}\tilde{X}_{wi}=MT\tilde{X}_{wi} \qquad (1)$$

where Z_ci is the depth of the point in the camera frame, a_x and a_y are the pixel scaling factors along the image axes, and (u_0, v_0) is the image center. These four parameters in matrix M are called the intrinsic parameters of the imager, since they are determined only by the internal structure of the C-arm. The matrix T is the transformation from the world coordinate system to the camera coordinate system, in which R is a rotation matrix and t is a three-dimensional translation vector; its six independent parameters compose the extrinsic parameters of the imager.

To extract the precise position of each fiducial point, the image is processed as follows. First, a 3 × 3 Gaussian filter is applied to reduce image noise. Second, Canny edge detection is used to obtain the edge contours of the circles and the line. Then a curve-fitting algorithm determines the mathematical expressions of these geometrical elements [23,24]. Finally, the coordinates of the geometric feature points are obtained by solving the resulting simultaneous equations.
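This extraction pipeline maps naturally onto standard image-processing primitives. The sketch below (an illustration only, written against OpenCV 4 and NumPy; thresholds, the contour-length filter and all names are assumptions, not the authors' implementation) follows the same steps: Gaussian smoothing, Canny edge detection and ellipse fitting on the detected contours. The feature points on the indicator line would be obtained similarly, e.g. with a Hough line transform intersected with the fitted ellipses.

```python
import cv2
import numpy as np

def extract_ellipses(xray_img, min_contour_pts=20):
    """Detect the projected circles (ellipses) in an 8-bit grayscale image."""
    blurred = cv2.GaussianBlur(xray_img, (3, 3), 0)      # 3x3 noise reduction
    edges = cv2.Canny(blurred, 50, 150)                  # edge contours

    contours, _ = cv2.findContours(edges, cv2.RETR_LIST,
                                   cv2.CHAIN_APPROX_NONE)
    ellipses = []
    for c in contours:
        if len(c) < min_contour_pts:                     # too short to fit
            continue
        (cx, cy), (d1, d2), angle = cv2.fitEllipse(c)
        ellipses.append({"center": (cx, cy),
                         "semi_major": max(d1, d2) / 2.0,
                         "semi_minor": min(d1, d2) / 2.0,
                         "angle_deg": angle})

    # The two end-effector circles should be the two largest ellipses; the
    # bigger projection belongs to the big circle (closer to the emitter).
    ellipses.sort(key=lambda e: e["semi_major"], reverse=True)
    return ellipses[:2]
```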
Once the calibration has been done, the matrix T in Equation (1) is the transformation matrix from {A} to {W}. For a chosen target point m(u_1, v_1) in image coordinates, the corresponding vector ^W M in the image-space is given by:

$${}^{W}M=\left(\frac{u_1-u_0}{a_x}\,t_z,\ \frac{v_1-v_0}{a_y}\,t_z,\ t_z\right) \qquad (2)$$

where t_z is the z coordinate of M in the image-space. Since ^W M and ^W P_1' are collinear, ^W P_1' can also be written in the form of Equation (2), with its t_z determined from the constraint that the distance from the source to the small-circle center is unchanged:

$$\left\|{}^{W}P_1'\right\|=\left\|{}^{W}P_1\right\|=t \qquad (3)$$

The point ^A P_1' in {A} is then determined by:

$${}^{A}P_1'=\left({}^{W}_{A}T\right)^{-1}\,{}^{W}P_1' \qquad (4)$$

Using ^A P_1', the TCP can be moved directly to the correct position. It is worth noting that the target orientation of the end-effector is not unique: any orientation whose z-axis points along the targeting ray is acceptable. In this method, a minimal rotation from the starting orientation to the target orientation is performed. The rotation axis L and the angle θ are obtained from the z-axes of the two frames:

$$L=\operatorname{norm}\!\left({}^{A}\hat{z}_{A}\times{}^{A}\hat{z}_{B}\right) \qquad (5)$$

$$\theta=\arccos\!\left(\frac{{}^{A}\hat{z}_{A}\cdot{}^{A}\hat{z}_{B}}{\left\|{}^{A}\hat{z}_{A}\right\|\left\|{}^{A}\hat{z}_{B}\right\|}\right) \qquad (6)$$

where ^A ẑ_A and ^A ẑ_B are the z-axes of frames {A} and {B}, respectively, both expressed in {A}. The robot end-effector movement is then given by a translation and a rotation in axis–angle representation:

$$T_{\mathrm{move}}=\left[{}^{A}P_1',\ L\theta\right] \qquad (7)$$
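To make the chain from Equations (2)–(7) concrete, the following NumPy sketch back-projects the selected pixel, rescales it to the constant source-to-TCP distance, maps it into the current tool frame and derives the minimal axis–angle rotation. It is an illustration of the formulas only; the variable names and the transform convention (T_WA maps {A} into {W}) are assumptions, not the authors' code.

```python
import numpy as np

def targeting_motion(u1, v1, intrinsics, T_WA, t):
    """Goal TCP position in {A} and the minimal axis-angle rotation.

    u1, v1     : chosen target pixel on the X-ray image (Eq. 2)
    intrinsics : (a_x, a_y, u_0, v_0) from the C-arm calibration (Eq. 1)
    T_WA       : 4x4 pose of the tool frame {A} expressed in image-space {W}
    t          : constant source-to-small-circle-center distance (Eq. 3)
    """
    a_x, a_y, u_0, v_0 = intrinsics

    # Eq. (2): direction of the targeting ray in {W} (unit z-depth).
    ray = np.array([(u1 - u_0) / a_x, (v1 - v_0) / a_y, 1.0])
    ray_unit = ray / np.linalg.norm(ray)

    # Eq. (3): keep the small-circle center at distance t from the source.
    p_target_W = ray_unit * t

    # Eq. (4): express the goal position in the current tool frame {A}.
    p_target_A = (np.linalg.inv(T_WA) @ np.append(p_target_W, 1.0))[:3]

    # Eqs. (5)-(6): minimal rotation taking the current z-axis onto the ray.
    z_A = np.array([0.0, 0.0, 1.0])                  # z-axis of {A}, in {A}
    z_B = np.linalg.inv(T_WA[:3, :3]) @ ray_unit     # target z-axis, in {A}
    axis = np.cross(z_A, z_B)
    axis /= np.linalg.norm(axis)                     # assumes axes not parallel
    angle = np.arccos(np.clip(np.dot(z_A, z_B), -1.0, 1.0))

    # Eq. (7): the robot move is the pair (translation, axis * angle).
    return p_target_A, axis, angle
```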
Evaluation

The aim of this section is to evaluate the alignment accuracy using one X-ray image. Figure 4 shows the state after the robot has finished the alignment. In this figure, Q and P are the centers of the small circle and the big circle, respectively, and OO' is aligned with the z-axis of the image-space frame; note that O' is not necessarily the center of the image. PV, a radius of the big circle, is parallel to the image plane, as are PD and EQ. The angle error θ is defined as the space angle between QP and ^W M. When the depth of a target point is known, the position error e can be represented by the distance between the point M and the line QP. To obtain these errors, the exact positions of Q and P must be computed first.

Figure 4. The state when the alignment is finished.

Using the similar-triangle relationships

$$\triangle OVP \sim \triangle OV'P', \qquad \triangle ODP \sim \triangle OO'P' \qquad (8)$$

we have the following expression:

$$\frac{VP}{V'P'}=\frac{OD}{OO'} \qquad (9)$$

Given that the source-to-intensifier distance of the C-arm is relatively large and the circles are usually located in the central region of the image, V'P' can be represented by the semi-major axis of the big projected ellipse, L_b, multiplied by an image scaling factor a, while VP is the actual radius of the big circle, R_b. Extending the same relationship to the small circle yields:

$$\frac{OD}{OE}=\frac{R_b\,L_s}{R_s\,L_b} \qquad (10)$$

where R_s is the radius of the small circle and L_s is the semi-major axis of the small ellipse on the X-ray image. In addition, the length of QP is determined by the known distance L_c between the centers of the two circles:

$$\left\|QP\right\|=L_c \qquad (11)$$

From conditions (10) and (11) and Equation (2), we can compute the exact positions of the points Q and P in {W}. The formulations of θ and e are then:

$$\theta=\arccos\!\left(\frac{{}^{W}\vec{M}\cdot\vec{QP}}{\left\|{}^{W}\vec{M}\right\|\,\left\|\vec{QP}\right\|}\right) \qquad (12)$$

$$e=\frac{\left\|\vec{MQ}\times\vec{QP}\right\|}{\left\|\vec{QP}\right\|} \qquad (13)$$
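As an illustration of this reconstruction, the sketch below recovers the relative depths of the two circle centers from the fitted ellipses, fixes the absolute scale with the known center distance, and reports the angle and position errors. It is a simplified reading of Equations (10)–(13) under stated assumptions (NumPy, hypothetical names and a demo target depth), not the authors' implementation.

```python
import numpy as np

def alignment_errors(small_px, big_px, target_px, L_s, L_b,
                     R_s, R_b, L_c, intrinsics, target_depth_mm=80.0):
    """Angle and position error from one post-alignment X-ray image.

    small_px, big_px, target_px : pixel coords of the small/big ellipse
                                  centers and of the chosen target point
    L_s, L_b : semi-major axes of the small/big projected ellipse (pixels)
    R_s, R_b : true radii of the small/big circle (mm)
    L_c      : true distance between the two circle centers (mm)
    intrinsics : (a_x, a_y, u_0, v_0) of the calibrated C-arm
    target_depth_mm : how far beyond the TCP the target lies (80 mm here)
    """
    a_x, a_y, u_0, v_0 = intrinsics

    def ray(px):
        # Eq. (2) with t_z = 1: back-projection ray direction in {W}.
        return np.array([(px[0] - u_0) / a_x, (px[1] - v_0) / a_y, 1.0])

    r_q, r_p, r_m = ray(small_px), ray(big_px), ray(target_px)

    # Eq. (10): depth ratio of the big-circle center P to the small-circle
    # center Q (the pixel-to-mm factor cancels in the ratio).
    depth_ratio = (R_b * L_s) / (R_s * L_b)

    # Eq. (11): fix the absolute scale so that |QP| equals L_c.
    scale = L_c / np.linalg.norm(r_p * depth_ratio - r_q)
    P, Q = r_p * depth_ratio * scale, r_q * scale

    # Eq. (12): angle between the tool axis (big -> small circle) and the
    # targeting ray through the chosen pixel.
    axis = Q - P
    cos_t = np.dot(r_m, axis) / (np.linalg.norm(r_m) * np.linalg.norm(axis))
    theta = np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

    # Eq. (13): distance from the target point M (taken target_depth_mm
    # farther along the ray than Q) to the line QP.
    M = r_m / np.linalg.norm(r_m) * (np.linalg.norm(Q) + target_depth_mm)
    e = np.linalg.norm(np.cross(M - Q, axis)) / np.linalg.norm(axis)
    return theta, e
```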
Experiment and result

For the evaluation, a system was assembled according to the proposed method. The robot is a UR5 (Universal Robots). The digital C-arm has an approximate source-to-intensifier distance of 1 m and an intensifier diameter of 36 cm. The system was first tested for accuracy and reliability in specially designed experiments and then validated clinically for pedicle screw implantation.

Pre-clinical validation

The initial evaluation of the method was done in a laboratory without the participation of subjects (Figure 5). Ten experiments were performed following the procedure below:

Figure 5. Pre-clinical experiment setup.

1. Set the C-arm in a random position and orientation.
2. Teach the robot end-effector to an appropriate position. For consistency with clinical applications, the end-effector is placed 25 cm below the image intensifier.
3. Take one X-ray film (Figure 6a).
4. Correct the image distortion using the projections of the equally spaced balls on the correction plate. The fiducial points on the end-effector are then extracted automatically and used to perform the registration from robot to image-space. Note that this whole step runs automatically without any interaction from the operator, and the time spent is negligible.
5. Select a target point on the PC monitor; the program then computes the movement parameters of the robot.
6. The robot moves to the target position and finishes the 3-D alignment.
7. Take another X-ray film (Figure 6b).
8. Record the positions of the two centers of the projected ellipses and the target point on the image, and evaluate the accuracy of the alignment.

Figure 6. Fluoroscopy images of the end-effector in pre-clinical validation. (a) The fluoroscopy image before alignment; (b) the fluoroscopy image after alignment.

Table 1 shows the experimental data of the 10 alignments. The maximum angle error between the central axis of the end-effector and the targeting ray is 0.94°. The average position error of the points located 80 mm below the end-effector is 1.31 mm, which satisfies the needs of even certain demanding surgeries. The average time to complete one targeting and evaluation is 3.5 minutes, including positioning of the C-arm and the robot. The results of the experimental trials enabled us to proceed confidently to clinical trials.

Table 1. The experimental data of the pre-clinical experiments.

Trial   Target (X, Y)   Small circle (X, Y)   Large circle (X, Y)
1       (396, 515)      (393, 515)            (395, 515)
2       (710, 685)      (710, 682)            (711, 683)
3       (563, 460)      (561, 459)            (561, 461)
4       (350, 713)      (350, 713)            (350, 714)
5       (320, 390)      (322, 391)            (323, 389)
6       (420, 422)      (420, 422)            (419, 420)
7       (563, 628)      (561, 627)            (564, 630)
8       (444, 570)      (441, 569)            (443, 571)
9       (611, 360)      (611, 361)            (612, 359)
10      (641, 503)      (639, 503)            (640, 502)
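Step 8 reduces to a simple concentricity check on the image: after a perfect alignment, the two ellipse centers and the target projection coincide. The snippet below runs that check on the Table 1 data (assuming, as an illustration, that the tabulated X/Y values are pixel coordinates); converting these in-plane offsets into the angle and position errors reported above requires the image-space reconstruction of Equations (10)–(13).

```python
import numpy as np

# Rows of Table 1: (target_x, target_y, small_x, small_y, large_x, large_y).
table1 = np.array([
    [396, 515, 393, 515, 395, 515],
    [710, 685, 710, 682, 711, 683],
    [563, 460, 561, 459, 561, 461],
    [350, 713, 350, 713, 350, 714],
    [320, 390, 322, 391, 323, 389],
    [420, 422, 420, 422, 419, 420],
    [563, 628, 561, 627, 564, 630],
    [444, 570, 441, 569, 443, 571],
    [611, 360, 611, 361, 612, 359],
    [641, 503, 639, 503, 640, 502],
], dtype=float)

target, small, large = table1[:, 0:2], table1[:, 2:4], table1[:, 4:6]

# Distance of each ellipse center from the selected target point.
d_small = np.linalg.norm(small - target, axis=1)
d_large = np.linalg.norm(large - target, axis=1)

for i, (ds, dl) in enumerate(zip(d_small, d_large), start=1):
    print(f"trial {i:2d}: small-circle offset {ds:4.1f}, "
          f"large-circle offset {dl:4.1f}")
print(f"max offsets: {d_small.max():.1f} / {d_large.max():.1f}")
```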
Clinical application

To evaluate feasibility in clinical applications, this method was implemented as a critical step in a robot-assisted pedicle screw placement surgery. In the setup, the robot and C-arm were draped in sterile covers and placed on the same side of the operating bed. The end-effector was sterilized by ultraviolet light, since its material is not resistant to high temperatures. The patient was placed in a prone position under general anesthesia.

After the preparation phase, the surgeon planned the needle insertion path in the program using the anteroposterior (AP) fluoroscopic image and pre-scanned CT data, and the rotation angles of the C-arm from the AP to the axial position (the view in which the projection of the needle becomes a point) were calculated. The C-arm was then adjusted manually to the desired orientation according to these values. Next, steps 2–7 described in the pre-clinical evaluation were carried out sequentially to finish the needle targeting and alignment evaluation. One group of fluoroscopic images before and after alignment is shown in Figure 7. After rotating the C-arm away, the surgeon installed the multi-stage sleeve and performed the pedicle screw placement (Figure 8). The separation of the positioning process and the drilling process ensures that the needle is not inadvertently inserted during the orientation stage and that accidental reorientation does not occur during needle insertion. Figure 9 shows the surgeon verifying the needle targeting and the depth of insertion using other C-arm views.

Figure 7. Fluoroscopy images of the end-effector in clinical application. (a) The fluoroscopy image before alignment; (b) the fluoroscopy image after alignment.

Figure 8. The surgeon finished the needle placement after the targeting.

Figure 9. Verifying the accuracy of pedicle screw placement using the AP and lateral fluoroscopic images.

During the operation, a total of 30 X-ray images were taken, an average of 7.5 per pedicle screw placement. Most of these were used to determine the standard AP view and to assess the depth of the inserted needles. The radiation exposure was significantly reduced compared with other fluoroscopy guidance approaches. Thanks to the simplification of the method, the actual implantation time of the four pedicle screws was reduced to 35 minutes. After the evaluation, the surgeon determined that all four pedicle screws were successfully implanted with highly acceptable accuracy.

Discussion

In this study, we have presented a robot-assisted targeting method under C-arm fluoroscopy guidance based on calibration of the imager. Using a specially designed end-effector, the imager calibration, the registration from the robot to the image-space and the alignment of a target can be finished automatically with just one X-ray image. The accuracy of the positioning can then be evaluated directly using another X-ray image taken from the same view. With these techniques, radiation exposure and operation time are reduced significantly: all the surgeon needs to do is adjust the C-arm to the desired orientation and choose a target point on the image.

Additionally, owing to the simplification of the operation steps, the accuracy of the targeting has been improved. The pre-clinical experiment demonstrates that the maximum angle error was 0.94° and the maximum position error of a target located 80 mm below the end-effector was 1.31 mm. The evaluation in the clinical pedicle screw placement experiment showed that all four screws were successfully implanted with an accuracy highly accepted by the surgeon.

Conclusion

A robot-assisted positioning approach was developed to improve the placement accuracy of needle insertion. The design of the special end-effector and the formulations for targeting and accuracy evaluation were elaborated. The results of both the pre-clinical experiment and the clinical application show that this method can improve positioning accuracy and reduce operation time and radiation exposure dramatically. Another feature of this method is that the orientation of the C-arm imager, rather than an entry point, determines the direction of the inserted needle. This makes the method ideal for a range of surgeries in which the needle path can be pre-planned from preoperative data such as CT or determined by surgeon experience. To sum up, the presented robot-assisted system shows outstanding advantages in improving accuracy and efficiency during pedicle screw placement. Additional clinical cases in such surgeries are scheduled in the near future.

Disclosure statement

No potential conflict of interest was reported by the authors.

Funding

This work was supported by the Natural Science Foundation of China (No. 61333019), the National Hi-Tech Research and Development Program of China ("863" Project, No. 2015AA043204), and the Capital Health Development Research Project (No. 2016-2-4094).

References

[1] Mack MJ. Minimally invasive and robotic surgery. JAMA. 2001;285:568–572.
[2] Dogangil G, Davies BL, Rodriguez y Baena F. A review of medical robotics for minimally invasive soft tissue surgery. Proc Inst Mech Eng H. 2010;224:653–679.
[3] Wang W, Shi Y, Goldenberg AA, et al. Experimental analysis of robot-assisted needle insertion into porcine liver. BME. 2015;26:S375–S380.
[4] Wang WD, Zhang P, Shi YK, et al. Design and compatibility evaluation of magnetic resonance imaging-guided needle insertion system. J Med Imaging Health Inform. 2015;5:1963–1967.
[5] Barbash GI, Glied SA. New technology and health care costs—the case of robot-assisted surgery. N Engl J Med. 2010;363:701–704.
[6] Yaniv Z, Joskowicz L. Precise robot-assisted guide positioning for distal locking of intramedullary nails. IEEE Trans Med Imaging. 2005;24:624–635.
[7] Bernardes MC, Adorno BV, Poignet P, et al. Robot-assisted automatic insertion of steerable needles with closed-loop imaging feedback and intraoperative trajectory replanning. Mechatronics. 2013;23:630–645.
[8] Ringel F, Stüer C, Reinke A, et al. Accuracy of robot-assisted placement of lumbar and sacral pedicle screws: a prospective randomized comparison to conventional freehand screw implantation. Spine. 2012;37:E496–E501.
[9] Yoon HM, Cho H, Park K, et al. Method for C-arm based guide needle insertion assistant system for endoscopic disc surgery. Korean J Comput Des Eng. 2015;20:263–268.
[10] Yao J, Taylor RH, Goldberg RP, et al. A C-arm fluoroscopy-guided progressive cut refinement strategy using a surgical robot. Comput Aided Surg. 2000;5:373–390.
[11] Laudato PA, Pierzchala K, Schizas C. Pedicle screw insertion accuracy using O-arm, robotic guidance, or freehand technique. Spine. 2018;43:E373–E378.
[12] Schreiner S, Anderson JH, Taylor RH, et al. A system for percutaneous delivery of treatment with a fluoroscopically-guided robot. In: CVRMed-MRCAS'97. Berlin, Heidelberg: Springer; 1997. p. 747–756.
[13] Czerny C, Eichler K, Croissant Y, et al. Combining C-arm CT with a new remote operated positioning and guidance system for guidance of minimally invasive spine interventions. J Neurointervent Surg. 2014;7:303–308.
[14] Bzostek A, Schreiner S, Barnes AC, et al. An automated system for precise percutaneous access of the renal collecting system. In: CVRMed-MRCAS'97. Berlin, Heidelberg: Springer; 1997. p. 299–308.
[15] Kim S, Chung J, Yi BJ, et al. An assistive image-guided surgical robot system using O-arm fluoroscopy for pedicle screw insertion: preliminary and cadaveric study. Neurosurgery. 2010;67:1757–1767.
[16] Patriciu A, Stoianovici D, Whitcomb LL, et al. Motion-based robotic instrument targeting under C-arm fluoroscopy. In: International Conference on Medical Image Computing and Computer-Assisted Intervention. Berlin, Heidelberg: Springer; 2000. p. 988–998.
[17] Navab N, Bascle B, Loser M, et al. Visual servoing for automatic and uncalibrated needle placement for percutaneous procedures. In: Proceedings IEEE Conference on Computer Vision and Pattern Recognition, 2000. Hilton Head Island, USA: IEEE; 2000. Vol. 2, p. 327–334.
[18] Navab N, Wiesner S, Benhimane S, et al. Visual servoing for intraoperative positioning and repositioning of mobile C-arms. In: International Conference on Medical Image Computing and Computer-Assisted Intervention. Berlin, Heidelberg: Springer; 2006. p. 551–560.
[19] Bascle B, Navab N, Loser M, et al. Needle placement under X-ray fluoroscopy using perspective invariants. In: Proceedings IEEE Workshop on Mathematical Methods in Biomedical Image Analysis, 2000. Hilton Head Island, USA: IEEE; 2000. p. 46–53.
[20] Szeliski R. Computer vision: algorithms and applications. New York: Springer Science & Business Media; 2010.
[21] Heikkila J, Silven O. A four-step camera calibration procedure with implicit image correction. In: Proceedings IEEE Computer Society Conference on Computer Vision and Pattern Recognition. IEEE; 1997. p. 1106–1112.
[22] Tsai R. A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses. IEEE J Robot Automat. 1987;3:323–344.
[23] Li X, Zhang D, Liu B. A generic geometric calibration method for tomographic imaging systems with flat-panel detectors—a detailed implementation guide. Med Phys. 2010;37:3844–3854.
[24] Li X, Zhang D, Liu B. Sensitivity analysis of a geometric calibration method using projection matrices for digital tomosynthesis systems. Med Phys. 2010;38:202–209.