This work was partially supported by the Grant-in-Aid for Young Scientists B (25750191) from the Ministry of Education, Culture, Sports, Science and Technology of Japan, and the MEXT Global Center of Excellence Program “Global Robot Academia” at Waseda University, Tokyo, Japan.
The goal of this research was to teach a robot to "feel" and understand what is happening when a needle is inserted into soft tissue, just as physicians do. The robot learns from experience the type of tissue it is puncturing and what actions it needs to perform to complete a needle insertion task.
To the best of our knowledge, this was the first time such an approach had been taken.
The trend in cancer treatment is towards Minimally Invasive (MI) treatments, percutaneous in this case, that reduce the physiological and cosmetic impact on patients.
In percutaneous treatments a physician introduces a needle into a cancerous area and then applies radio frequency ablation (RFA), cryoablation or chemotherapy to destroy the cancer, or takes a sample for a biopsy. For example, over 1500 centers performed RFA for liver cancer in 2007 in Japan alone.
However, it is a challenging procedure for physicians, who must guide the needle under limited visibility while coping with needle deflection, low tool maneuverability, and involuntary patient movement; moreover, human tissue deforms easily when a needle is inserted, causing the target to move. In addition, the mechanical properties of human tissue vary between patients, so surgeons must rely on tactile feedback and experience.
Some of the problems mentioned above can be solved by robots, and not surprisingly, percutaneous robots have become a very active research area. Traditionally, the approach has been either to run a finite element method (FEM) simulation before inserting the needle, to calculate a trajectory that accounts for tissue deformation, or to perform intra-operative control using ultrasound (US) images.
However, neither of these approaches has achieved better results than what physicians can accomplish, in part because FEM simulations are complex and require precise models, while US images are of low quality, making it difficult to distinguish the needle or tissue in them.
This research takes a novel approach: instead of trying to create an accurate model of the tissue or to control the robot directly, it teaches the robot to control itself. One advantage of this approach is that conditions may change during the procedure (patient movement, missing imaging, etc.) and the robot can still react to them.
The first step was to prove that it is possible to classify the type of tissue being punctured using only the force measured on the needle. This was especially important because previous research had suggested it was not possible. Although that is the case for certain types of tissue (thin membranes, for example), it might not be the case for all tissues. To prove the feasibility of our proposal we focused first on the liver: on the one hand because the liver is commonly used in needle insertion research, and on the other because it consists of essentially two types of tissue, hepatocytes and veins, which reduces the task to a binary classification problem.
We performed a series of needle insertions into swine liver to measure the force necessary to puncture each type of tissue. An 18G–1 1/2" surgical needle was inserted vertically 15 mm into the tissue using a Cartesian robot. After the insertions were completed (75–200, depending on the liver's size), each of the "holes" made by the needle was inspected for the presence of veins or membranes. The position of each vein or membrane was measured and compared against the force graph, which allowed us to classify each puncture force as belonging to hepatocyte tissue or a vein. The final statistical distribution of the puncturing force can be seen in Fig. 1, which clearly shows two distinctive distributions. Finally, to classify the type of tissue being punctured in real time, we derived a Bayesian classifier.
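As a sketch, such a classifier can be reduced to a maximum-a-posteriori rule over two Gaussian force distributions. The means, standard deviations, and priors below are illustrative placeholders, not the values estimated from the experiments:

```python
import math

def gaussian_pdf(x, mean, std):
    """Probability density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def classify_puncture(force, params, priors):
    """Return the tissue class with the highest posterior probability
    for a measured puncture force (maximum-a-posteriori rule)."""
    posteriors = {
        tissue: gaussian_pdf(force, mean, std) * priors[tissue]
        for tissue, (mean, std) in params.items()
    }
    return max(posteriors, key=posteriors.get)

# Illustrative (not measured) parameters: mean and standard deviation
# of the puncture force in newtons for each tissue class.
params = {"hepatocyte": (0.3, 0.1), "vein": (1.0, 0.25)}
priors = {"hepatocyte": 0.7, "vein": 0.3}

print(classify_puncture(0.25, params, priors))  # low force  -> hepatocyte
print(classify_puncture(1.1, params, priors))   # high force -> vein
```

Because only the puncture force enters the rule, the classifier can run on each detected puncture event with negligible computational cost.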
With the statistical models of the puncturing force for each type of tissue and the Bayesian classifier validated offline, the next step was to create an algorithm that detects each puncture in real time, so that its force can be measured and classified as a particular type of tissue.
The algorithm used to detect punctures is based on time-series fault detection. The underlying idea is to exploit the fact that the insertion force, as measured at the base of the needle, always grows until the tissue is punctured, at which moment a sharp drop in force is observed. In other words, for any point measured at time t0, the force measured at a later time t1 must be greater than that at t0; the only exceptions are a puncture event or noise.
To differentiate between noise and a puncture event, we compare the maximum force value within a window with the last measured force. When the difference between the maximum and the last recorded point is larger than would be expected from noise alone, it is considered a puncture event. Experimental results show a detection rate over 95% with an average detection time of 42.1 ms. The undetected punctures are small puncture events that are negligible for most purposes.
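A minimal sketch of this windowed detection rule, with an illustrative window size and drop threshold rather than the values tuned in the experiments:

```python
from collections import deque

def detect_punctures(forces, window=20, threshold=0.15):
    """Flag sample indices where the force drops below the recent
    maximum by more than `threshold` (newtons), indicating a puncture.
    `window` and `threshold` are illustrative values only."""
    recent = deque(maxlen=window)   # sliding window of recent forces
    events = []
    for i, force in enumerate(forces):
        if recent and max(recent) - force > threshold:
            events.append(i)        # sharp drop -> puncture detected
            recent.clear()          # restart the window after the event
        recent.append(force)
    return events

# Synthetic trace: force rises steadily, then drops sharply at the puncture.
trace = [0.1 * i for i in range(10)] + [0.2, 0.3, 0.4]
print(detect_punctures(trace))  # -> [10]
```

Clearing the window after each event keeps the pre-puncture maximum from triggering repeated detections while the force builds up again.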
Next, we gave "situation awareness" to the robot. Being able to tell the type of tissue being punctured makes it possible to compare each event against a predefined plan: if an event that should not occur is detected, an error has happened; on the other hand, if an expected event has not occurred yet, a different action has to be taken (Fig. 3). To prove the value of giving "awareness" to the robot, we carried out an experiment comparing a system with and without awareness on a vein-puncturing task.
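The plan-comparison logic can be sketched as below; the plan format and tissue labels (including "capsule") are hypothetical illustrations, not the planner used in the experiments:

```python
def check_event(plan, history, new_event):
    """Compare a newly classified puncture event against a predefined
    sequence of expected tissue punctures and return the robot's next
    action. `plan` and `history` are lists of tissue labels."""
    step = len(history)  # index of the next expected event
    if step >= len(plan):
        return "error: unexpected extra puncture"
    if new_event == plan[step]:
        history.append(new_event)
        if len(history) == len(plan):
            return "plan complete: stop insertion"
        return "continue insertion"
    return f"error: expected {plan[step]}, detected {new_event}"

plan = ["capsule", "hepatocyte", "vein"]    # hypothetical insertion plan
history = ["capsule", "hepatocyte"]         # events already observed
print(check_event(plan, history, "vein"))   # -> plan complete: stop insertion
```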
The robot had to insert a surgical needle into a vein in a piece of swine liver. The system without awareness used the standard approach of telling the robot how deep to insert the needle: the position of the vein was first measured with US imaging and then given to the robot. For the system with situation awareness, no vein position was given; instead, the robot had to stop when it detected a vein puncture. A trial was counted as a success if the needle was inside the vein when the robot stopped, and as a failure otherwise. The position of the needle was verified using ultrasound imaging.
The results of the experiment show that the system without awareness correctly placed the needle into the vein in 4 of 10 cases, whereas the system with situation awareness did so in 9 of 10 cases, proving that the proposed system can perform better with less information by "understanding" what is going on.
Finally, we set out to identify the different needle–tissue interaction modes and to model the local values of the non-linear elastic modulus and friction for future use.
The needle–tissue interaction mode can provide information on how the tissue will behave (sudden breaking, slippage), while knowing the value of the elastic modulus could be used to simulate tissue deformation in real time. The simple linear friction model proposed captures the overall shape of the insertion force well, and could be used in the future to look for common patterns in insertion force: given the initial shape of an insertion, the probable evolution of the force could be predicted, making it possible to simulate how the tissue will deform and where the target will move, and thereby to compensate the needle's trajectory.
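As an illustration of how such a piecewise force profile might look, the sketch below uses a hypothetical elastic-loading-then-friction model with made-up parameters; it is not the fitted model from this work:

```python
def insertion_force(depth, k=0.8, mu=0.05, puncture_depth=5.0):
    """Hypothetical piecewise model of insertion force (N) vs depth (mm):
    the force grows elastically with depth until the puncture, after
    which only friction, linear in the inserted length, remains.
    All parameters (k, mu, puncture_depth) are illustrative."""
    if depth < puncture_depth:
        return k * depth       # elastic loading phase before the puncture
    return mu * depth          # post-puncture phase: friction only

# Force profile over 0-10 mm: rises, drops sharply at the puncture,
# then grows slowly again as more needle shaft enters the tissue.
profile = [round(insertion_force(d), 2) for d in range(11)]
print(profile)
```

Even this crude two-phase shape reproduces the rise-drop-rise pattern seen in the measured insertion force, which is what makes pattern matching on the early part of the curve plausible.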