Marta Lagomarsino, V. Groenhuis, M. Casadio, M. Welleweerd, F. Siepel, S. Stramigioli
"Image-guided Breast Biopsy of MRI-visible Lesions with a Hand-mounted Motorised Needle Steering Tool"
2021 International Symposium on Medical Robotics (ISMR), published 2021-06-20. DOI: https://doi.org/10.1109/ISMR48346.2021.9661564
Abstract: A biopsy is the only diagnostic procedure that provides accurate histological confirmation of breast cancer. When sonographic needle placement is not feasible, a Magnetic Resonance Imaging (MRI)-guided biopsy is often preferred. The lack of real-time imaging information and the deformation of the breast make it challenging to bring the needle precisely to the tumour detected in pre-interventional Magnetic Resonance (MR) images. The current manual MRI-guided biopsy workflow is inaccurate and would benefit from a technique that allows real-time tracking and localisation of the tumour lesion during needle insertion. This paper proposes a robotic setup and software architecture to assist the radiologist in targeting MR-detected suspicious tumours. The approach fuses preoperative images with intraoperative optical tracking of markers attached to the patient's skin. A hand-mounted biopsy device has been constructed with an actuated needle base that drives the tip in the desired direction; steering commands can be provided either by user input or by computer guidance. The workflow is validated through phantom experiments: on average, the suspicious breast lesion is targeted to within a radius of 2.3 mm. The results suggest that robotic systems that account for breast deformation have the potential to tackle this clinical challenge.
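The image-fusion step described in the abstract — registering preoperative MR image space to intraoperatively tracked skin markers — is commonly solved with a rigid least-squares point-set fit (the Kabsch algorithm). The paper does not state that it uses exactly this method, so the following is an illustrative sketch under that assumption; all variable names and data are hypothetical.

```python
# Illustrative sketch of rigid marker-based registration (Kabsch algorithm),
# a standard way to align preoperative image space with optical-tracker space.
# This is an assumption for illustration, not the paper's documented method.
import numpy as np

def rigid_register(P_img, P_track):
    """Least-squares rigid transform (R, t) such that P_track ~ R @ p + t.

    P_img, P_track: (N, 3) arrays of corresponding marker positions in
    image coordinates and tracker coordinates.
    """
    c_img = P_img.mean(axis=0)
    c_trk = P_track.mean(axis=0)
    H = (P_img - c_img).T @ (P_track - c_trk)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_trk - R @ c_img
    return R, t

# Synthetic check: recover a known rotation and translation from 5 markers.
rng = np.random.default_rng(42)
markers_img = rng.standard_normal((5, 3))
R_true, _ = np.linalg.qr(rng.standard_normal((3, 3)))
if np.linalg.det(R_true) < 0:
    R_true[:, 0] *= -1.0                        # force a proper rotation
t_true = np.array([10.0, -5.0, 2.0])
markers_trk = markers_img @ R_true.T + t_true   # tracker-space positions

R_est, t_est = rigid_register(markers_img, markers_trk)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))
```

With noiseless, non-degenerate markers the fit is exact; in practice the residual after registration gives a fiducial registration error that bounds how well such a system can localise the lesion.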
Jianxin Gao, Ju Lin, Irfan Kil, R. Singapogu, R. Groff
"Deep Learning for Needle Detection in a Cannulation Simulator"
2021 International Symposium on Medical Robotics (ISMR), published 2021-05-05. DOI: https://doi.org/10.1109/ismr48346.2021.9661529
Abstract: Cannulation for hemodialysis is the act of inserting a needle into a surgically created vascular access (e.g., an arteriovenous fistula) for the purpose of dialysis. The main risk associated with cannulation is infiltration: puncture of the wall of the vascular access after entry, which can cause medical complications. Simulator-based training allows clinicians to gain cannulation experience without putting patients at risk. In this paper, we propose deep-learning-based techniques for detecting, from video, whether the needle tip is inside or has infiltrated the simulated fistula. Three categories of deep neural networks are investigated: modified pre-trained models based on VGG-16 and ResNet-50, light convolutional neural networks (light CNNs), and convolutional recurrent neural networks (CRNNs), which consist of convolutional layers followed by a long short-term memory (LSTM) layer. A data set of cannulation experiments was collected and analyzed. Both the light CNN (test accuracy: 0.983) and the CRNN (test accuracy: 0.983) outperform the pre-trained baselines (test accuracy: 0.968 for modified VGG-16 and 0.971 for modified ResNet-50). The CRNN was implemented in real time on commodity hardware for use in the cannulation simulator, and its performance was verified. Deep-learning video analysis is thus a viable method for detecting needle state in a low-cost cannulation simulator. Our data sets and code are released at https://github.com/axin233/DL_for_Needle_Detection_Cannulation.
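The CRNN described in the abstract extracts convolutional features from each video frame and feeds the resulting sequence into an LSTM layer before classification. A minimal NumPy sketch of such a forward pass is shown below; the toy dimensions, random weights, and variable names are illustrative only and do not come from the paper or its released code.

```python
# Minimal forward-pass sketch of a CRNN-style video classifier (per-frame
# conv features -> LSTM over frames -> binary needle-state probability).
# All sizes and weights are illustrative, not the paper's trained model.
import numpy as np

rng = np.random.default_rng(0)

def conv2d_valid(img, kernel):
    """Naive single-channel 'valid' 2-D correlation."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def frame_features(frame, kernels):
    """Per-frame CNN stage: conv -> ReLU -> global average pool."""
    return np.array([np.maximum(conv2d_valid(frame, k), 0).mean()
                     for k in kernels])

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_forward(xs, Wx, Wh, b, hidden):
    """Single-layer LSTM; returns the final hidden state."""
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    for x in xs:
        z = Wx @ x + Wh @ h + b              # stacked gate pre-activations
        i, f, o, g = np.split(z, 4)          # input, forget, output, candidate
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
    return h

# Toy input: 8 frames of 16x16 grayscale video, 4 conv filters, 8 LSTM units.
frames = rng.standard_normal((8, 16, 16))
kernels = rng.standard_normal((4, 3, 3))
feats = [frame_features(f, kernels) for f in frames]

hidden = 8
Wx = rng.standard_normal((4 * hidden, 4)) * 0.1
Wh = rng.standard_normal((4 * hidden, hidden)) * 0.1
b = np.zeros(4 * hidden)

h_last = lstm_forward(feats, Wx, Wh, b, hidden)
w_out = rng.standard_normal(hidden) * 0.1
p_infiltration = sigmoid(w_out @ h_last)     # probability of infiltration
```

The recurrent state is what distinguishes the CRNN from the light CNN baselines: the LSTM can integrate evidence across frames, which plausibly helps with transient occlusions of the needle tip in a single frame.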