WO2017036023A1 - Positioning system for use during a surgical operation - Google Patents
Positioning system for use during a surgical operation
- Publication number: WO2017036023A1
- Authority: WIPO (PCT)
Definitions
- The present invention relates to the field of medical devices, and in particular to a surgical positioning system.
- Laparoscopy, endoscopy (e.g., gastroscopy, colonoscopy, bronchoscopy), and surgical robots are all representative minimally invasive techniques.
- In these techniques, various cameras are the main observation tools. They replace the human eye and mainly perform two tasks: 1. identifying the lesion and its location in the human body; 2. identifying the surgical instrument and its position in the human body.
- Surgical instruments are relatively large, so the camera has little difficulty recognizing them. Identifying lesions, however, especially early lesions, is difficult for the camera.
- The reasons are as follows:
- 1. Camera imaging uses visible light. Visible light can show the surface of a lesion or organ, but cannot show lesions or tissue structures hidden in deeper layers. For example, in laparoscopic surgery the camera can see large tumors, but cannot see the blood vessels supplying the tumor deep below the surface.
- 2. The techniques that find lesions before surgery are not necessarily camera-based; they may be other imaging examinations such as ultrasound, MRI, and CT.
- The raw signal acquisition methods of these technologies differ from those of cameras, and the kinds of lesions they are good at discovering also differ.
- Some early lesions can be detected early with these other techniques, whereas camera technology must wait until later to discover them. For example, some early breast cancers discovered by MRI or molybdenum-target mammography differ little from normal tissue under the camera and are difficult to distinguish.
- Patent CN200680020112 describes a technique that provides a surgical robot with a laparoscopic ultrasound probe designed specifically for intraoperative use. The probe produces 2D images that a processor can assemble into at least a portion of a 3D anatomical image. The ultrasound image and the camera image are then transmitted to the processor; after processing, the camera image is shown as the main image on the display and the ultrasound image as an auxiliary image. The design can also compare the 3D camera view with 2D ultrasound image slices.
- Patent CN201310298142 describes another technique.
- That technique converts a pre-operative 3D image into a virtual ultrasound image, registers it with intraoperative ultrasound, fuses the resulting image with the endoscopic image during the operation, and finally performs postoperative evaluation on a cloud platform.
- The CN201310298142 patent has other problems: 1) Its cloud server function is an afterthought, placed at the final stage of the workflow, the postoperative evaluation phase. 2) The cloud server function runs in parallel with the registration functions, which reduces the user's dependence on the cloud server: a doctor can complete pre-operative 3D image acquisition, intraoperative ultrasound image fusion, and fusion of the new fused image with the camera image without ever using the cloud server. These two points mean that a great deal of computation must be done on the local processor, which places demands on its configuration. Mobile and wearable devices are inherently limited in size, and compared with desktops or workstations, such configuration requirements are not easily met.
- What is needed is a system that, without a dedicated intraoperative ultrasound device, can compare pre-operative non-real-time image data with the intraoperative visible-light image; whose software requirements are low enough that a mobile or wearable device can conveniently read the 3D images and even display the fusion results; and that can, in principle, guarantee that as long as early screening finds a lesion's location, the lesion can be located during surgery. Popularizing early detection and early treatment has important medical significance.
- An object of the present invention is to provide a surgical positioning system that addresses the deficiencies of existing laparoscopic, endoscopic, and robotic surgical positioning systems.
- Data acquired with signal-acquisition methods different from that of the optical camera is given 3D visualization processing on the cloud server side and then merged with the video or image data of the optical camera, improving the rate of lesion discovery during the operation.
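As a hedged illustration of the cloud-side 3D visualization step, the sketch below turns a CT-like voxel volume into a crude surface point cloud by thresholding. Real systems would use a proper surface-extraction algorithm such as marching cubes; the function name, threshold, and toy volume here are all hypothetical.

```python
import numpy as np

def extract_surface_points(volume, threshold):
    """Keep above-threshold voxels that have at least one below-threshold
    6-neighbour, i.e. a crude voxel 'surface' of the organ or lesion.
    (A production pipeline would use marching cubes; this is a sketch.)"""
    mask = volume > threshold
    # Pad so edge voxels treat the outside of the scan as background.
    padded = np.pad(mask, 1, constant_values=False)
    interior = np.ones_like(mask)
    for axis in range(3):
        for shift in (-1, 1):
            # Neighbour occupancy along this axis, cropped back to mask shape.
            neighbour = np.roll(padded, shift, axis=axis)[1:-1, 1:-1, 1:-1]
            interior &= neighbour
    surface = mask & ~interior
    return np.argwhere(surface)  # (N, 3) voxel coordinates of the 3D model

# Toy "CT volume": a bright 3x3x3 cube inside a 7x7x7 scan.
vol = np.zeros((7, 7, 7))
vol[2:5, 2:5, 2:5] = 100.0
pts = extract_surface_points(vol, threshold=50.0)
```

For the 3×3×3 cube, only the single centre voxel is fully interior, so the surface consists of the remaining 26 voxels.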
- A surgical positioning system performs positioning by direct comparison between real-time visible-light images and non-real-time imaging data;
- The system includes a DICOM data input module, a data visualization processing module, a visible light image input module, a central processing module, and an image display output module;
- The data visualization processing module is located in the cloud, is connected to the central processing module, receives data from the DICOM data input module, visualizes the received data, and transmits the patient's 3D model data to the central processing module and/or the image display output module;
- The DICOM data input module is connected to the data visualization processing module and is configured to upload the examination data in DICOM file format;
- The visible light image input module is connected to the central processing module and transmits the intraoperative real-time image data to the central processing module;
- The central processing module is configured to receive the image data transmitted by the visible light image input module and the 3D model data transmitted by the data visualization processing module;
- The image display output module is divided into a pre-central-processing display output module and a post-central-processing display output module; the two display output modules exist independently and run separately. The pre-central-processing display output module is connected to the cloud data visualization processing module and displays the 3D model; the post-central-processing display output module is connected to the central processing module and displays the optical image together with the 3D model.
- Figure 1 is a schematic view showing the structure of a surgical positioning system.
- As shown in FIG. 1, a structural diagram of the surgical positioning system, positioning is performed by direct comparison between the real-time visible-light image and the non-real-time imaging data. The system includes a DICOM data input module 100, a data visualization processing module 200, a visible light image input module 300, a central processing module 400, and an image display output module;
- The data visualization processing module is located in the cloud, is connected to the central processing module, receives data from the DICOM data input module, visualizes the received data, and transmits the patient's 3D model data to the central processing module and/or the image display output module;
- The DICOM data input module is connected to the data visualization processing module and is configured to upload the examination data in DICOM file format;
- The visible light image input module is connected to the central processing module and transmits the intraoperative real-time image data to the central processing module;
- The central processing module is configured to receive the image data transmitted by the visible light image input module and the 3D model data transmitted by the data visualization processing module;
- The image display output module is divided into a pre-central-processing display output module 501 and a post-central-processing display output module 502; the two display output modules exist independently and run separately;
- The pre-central-processing display output module is connected to the cloud data visualization processing module and displays the 3D model;
- The post-central-processing display output module is connected to the central processing module and displays the optical image and the 3D model.
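The module layout above can be sketched as a minimal data-flow skeleton. The class names mirror modules 100-502 but are hypothetical, and the "visualization" and "fusion" bodies are trivial placeholders standing in for the real algorithms.

```python
import numpy as np

class DataVisualizationModule:          # module 200, cloud side
    """Turns uploaded DICOM-like volume data into a 3D model (here a
    point cloud via simple thresholding, a stand-in for real rendering)."""
    def build_model(self, volume, threshold=50.0):
        return np.argwhere(volume > threshold)   # (N, 3) voxel coords

class CentralProcessingModule:          # module 400
    """Receives the intraoperative camera frame (module 300) and the 3D
    model, and fuses them; fusion here is a trivial overlay placeholder."""
    def fuse(self, frame, model_points):
        fused = frame.copy()
        for z, y, x in model_points:
            if 0 <= y < fused.shape[0] and 0 <= x < fused.shape[1]:
                fused[y, x] = 255                # mark projected model voxels
        return fused

# The DICOM input (module 100) is mimicked by a synthetic volume.
volume = np.zeros((4, 8, 8))
volume[1:3, 2:5, 2:5] = 100.0
model = DataVisualizationModule().build_model(volume)
frame = np.zeros((8, 8), dtype=np.uint8)         # camera image (module 300)
fused = CentralProcessingModule().fuse(frame, model)  # shown by module 502
```

The split between the two display outputs (501 for the raw 3D model, 502 for the fused view) would correspond to rendering `model` and `fused` separately.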
- A patient has stones buried in a renal diverticulum and needs endoscopic treatment. The doctor uploads the patient's CT data to the cloud server in the form of a DICOM file.
- After data visualization, the 3D model data of the patient's kidney and the location data of the stones within the kidney are transmitted to the central processing module.
- The central processing module receives the image data transmitted from the ureteroscope camera and the 3D data transmitted from the data visualization processing module. Through registration and fusion, it determines where the lens lies relative to the patient's 3D model and the travel path required to reach the stones buried in the diverticulum. In this way, by following the path prompted by the central processing module, the diverticular stones buried in the tissue can be found.
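The patent does not specify how the registration between the camera view and the 3D model is performed. One common choice, shown here as a hedged sketch on synthetic landmark data, is least-squares rigid alignment (the Kabsch algorithm); all point values are invented for the example.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (Kabsch): find R, t with dst ~ R @ src + t.
    src, dst: (N, 3) arrays of corresponding landmark points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Landmarks on the 3D model, and the same landmarks as seen from the
# scope after an arbitrary rotation + translation.
rng = np.random.default_rng(0)
model_pts = rng.random((6, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
scope_pts = model_pts @ R_true.T + t_true
R, t = rigid_register(model_pts, scope_pts)
```

With the pose recovered, the stone coordinates from the 3D model can be mapped into the scope's frame to suggest a travel path.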
- A kidney tumor patient needs laparoscopic partial nephrectomy.
- The doctor uploads the patient's kidney CT data to the cloud server in the form of a DICOM file.
- After data visualization, the 3D model data of the patient's kidney, the location data of the tumor within the kidney, and the locations of the renal blood vessels are transmitted to the central processing module.
- The central processing module, in this embodiment also located in the cloud, receives the image data transmitted from the laparoscopic camera and the 3D data transmitted from the data visualization processing module.
- Through registration and fusion, the direction of the blood vessels supplying the renal tumor beneath the renal capsule is determined and displayed on the wearable device (glasses).
- The doctor can thus selectively block only the blood vessels supplying the kidney tumor and complete the surgery. Conventional methods would need to block larger arteries and veins, resulting in more extensive renal tissue ischemia and impaired renal function.
- A patient with peripheral lung cancer needs a bronchoscopic biopsy.
- The doctor uploads the patient's DICOM file to the cloud server.
- After data visualization, the 3D model data of the patient's lungs and bronchi, the location of the tumor within the lungs, and the locations of the blood vessels around the tumor are transmitted to the central processing module.
- The central processing module receives the image data transmitted from the bronchoscope camera and the 3D data transmitted from the data visualization processing module. Through registration and fusion, it determines where the bronchoscope lens lies relative to the patient's 3D model, which bronchi must be traversed to reach the tumor, and which blood vessels around the tumor must be avoided during the biopsy. It can even help identify and select a biopsy site at the margin of the tumor, because detection rates are higher at the tumor's edge than at its center (the proportion of necrotic cells in the center of the tumor is too high).
- A breast cancer patient needs a total endoscopic mastectomy. Before surgery, tiny early breast cancer lesions were found by MRI; such lesions are difficult to identify with the camera alone.
- The doctor sends the patient's MRI DICOM file to the medical data visualization processing module. After data visualization, the 3D model data of the patient's breast, together with the tumor, is passed to the central processing module.
- The central processing module receives the image data transmitted from the endoscope camera and the 3D data transmitted from the data visualization processing module. Through registration and fusion, it determines where the lens lies relative to the patient's 3D model and where it must move to reach the tumor. In the end, the tumor that the camera alone could not easily recognize is found and removed.
- A liver cancer patient needs a robotic partial liver resection.
- The patient's color Doppler ultrasound shows the location of the liver cancer and of the abnormally proliferating tumor blood vessels.
- The doctor sends the patient's preoperative color Doppler ultrasound DICOM file to the medical data visualization processing module.
- After data visualization, the 3D model data of the patient's liver, tumor, and blood vessels is transmitted to the central processing module and to the doctor's mobile phone.
- Using the mobile phone, the doctor gains a general understanding of the blood vessel distribution at the surgical site before the operation.
- During the operation, the central processing module receives the image data transmitted from the robot camera and the 3D data transmitted from the data visualization processing module. Through registration and fusion, it determines where the surgical instrument lies relative to the patient's 3D model, where it must move to reach the tumor, and where the abnormally proliferating blood vessels are buried, helping the doctor find a surgical path that avoids those abnormal vessels and finally remove the tumor with ease.
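The overlay step in the examples above, displaying model structures such as vessels on top of the camera image, can be illustrated with a simple pinhole projection plus alpha blending. The intrinsics (`f`, `cx`, `cy`) and the "vessel points" below are invented for the example; a real system would use calibrated camera parameters.

```python
import numpy as np

def project_points(points, f, cx, cy):
    """Pinhole projection of 3D model points (camera coordinates, z > 0)
    into pixel coordinates; f, cx, cy are assumed intrinsics."""
    z = points[:, 2]
    u = (f * points[:, 0] / z + cx).round().astype(int)
    v = (f * points[:, 1] / z + cy).round().astype(int)
    return u, v

def overlay(frame, u, v, alpha=0.5):
    """Alpha-blend projected model points (as a green overlay) onto the frame."""
    out = frame.astype(float).copy()
    h, w, _ = frame.shape
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)  # keep in-frame points only
    green = np.array([0.0, 255.0, 0.0])
    out[v[ok], u[ok]] = (1 - alpha) * out[v[ok], u[ok]] + alpha * green
    return out.astype(np.uint8)

frame = np.zeros((8, 8, 3), dtype=np.uint8)       # stand-in camera image
pts = np.array([[0.0, 0.0, 1.0], [0.1, 0.0, 1.0]])  # hypothetical vessel points
u, v = project_points(pts, f=10.0, cx=4.0, cy=4.0)
shown = overlay(frame, u, v)
```

The blended result is what the post-central-processing display output module would present to the surgeon.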
Abstract
The present invention relates to a positioning system for use during a surgical operation, the positioning being performed on the basis of a direct comparison between a real-time visible-light image and a non-real-time image obtained by imaging. The system comprises: a DICOM data input module (100), a data visualization processing module (200), a visible light image input module (300), a central processing module (400), and an image display output module. The system is designed to generate, on the basis of the imaging data, a non-real-time 3D imaging model, and then to combine this non-real-time 3D model with a real-time camera image taken during an operation. This reduces the equipment requirements of the operation, for example requiring no laparoscopic ultrasound equipment or professional endoscopic ultrasound equipment, but only a standard pre-surgical examination result. Theoretically, the system can fully guarantee the display of a lesion's location in the 3D model, so the camera can be operated based solely on the relative positions of the surgical instrument and the principal anatomical structures in the 3D map.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510559675.4A CN105213032B (zh) | 2015-09-06 | 2015-09-06 | Surgical positioning system |
CN201510559675.4 | 2015-09-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017036023A1 true WO2017036023A1 (fr) | 2017-03-09 |
Family
ID=54982566
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2015/099144 WO2017036023A1 (fr) | 2015-09-06 | 2015-12-28 | Positioning system for use during a surgical operation |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN105213032B (fr) |
WO (1) | WO2017036023A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114795465A (zh) * | 2022-05-31 | 2022-07-29 | Zhejiang University | Surgical assistance system and method based on three-dimensional reconstruction of medical images |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106326856A (zh) * | 2016-08-18 | 2017-01-11 | 厚凯(天津)医疗科技有限公司 | Surgical image processing method and device |
CN112237477B (zh) * | 2019-07-17 | 2021-11-16 | 杭州三坛医疗科技有限公司 | Positioning and navigation device for closed fracture reduction surgery |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002102249A (ja) * | 2000-09-29 | 2002-04-09 | Olympus Optical Co Ltd | Surgical navigation apparatus and surgical navigation method |
WO2005055008A2 (fr) * | 2003-11-26 | 2005-06-16 | Viatronix Incorporated | Systems and methods for automated segmentation, visualization and analysis of medical images |
CN1874734A (zh) * | 2003-09-01 | 2006-12-06 | Siemens AG | Method and device for visually supporting an electrophysiological catheter application in the heart |
CN102811655A (zh) * | 2010-03-17 | 2012-12-05 | Fujifilm Corporation | Endoscopic observation support system, method, device, and program |
US8348831B2 (en) * | 2009-12-15 | 2013-01-08 | Zhejiang University | Device and method for computer simulated marking targeting biopsy |
CN103793915A (zh) * | 2014-02-18 | 2014-05-14 | Shanghai Jiao Tong University | Low-cost markerless registration system and registration method for neurosurgical navigation |
CN104757951A (zh) * | 2014-04-11 | 2015-07-08 | BOE Technology Group Co., Ltd. | Display system and data processing method |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080147086A1 (en) * | 2006-10-05 | 2008-06-19 | Marcus Pfister | Integrating 3D images into interventional procedures |
US8235530B2 (en) * | 2009-12-07 | 2012-08-07 | C-Rad Positioning Ab | Object positioning with visual feedback |
US10561861B2 (en) * | 2012-05-02 | 2020-02-18 | Viewray Technologies, Inc. | Videographic display of real-time medical treatment |
CN203195768U (zh) * | 2013-03-15 | 2013-09-18 | 应瑛 | Surgical guidance system |
CN103371870B (zh) * | 2013-07-16 | 2015-07-29 | Shenzhen Institutes of Advanced Technology | Multimodal-image-based surgical navigation system |
- 2015-09-06 CN CN201510559675.4A patent/CN105213032B/zh active Active
- 2015-12-28 WO PCT/CN2015/099144 patent/WO2017036023A1/fr active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN105213032B (zh) | 2017-12-15 |
CN105213032A (zh) | 2016-01-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7133474B2 (ja) | Image-based fusion of endoscopic images and ultrasound images | |
Okamoto et al. | Clinical application of navigation surgery using augmented reality in the abdominal field | |
Pratt et al. | An effective visualisation and registration system for image-guided robotic partial nephrectomy | |
Luo et al. | Advanced endoscopic navigation: surgical big data, methodology, and applications | |
CN107456278B (zh) | Endoscopic surgery navigation method and system | |
RU2556593C2 (ru) | Image-integration-based registration and navigation for endoscopic surgery | |
Reynisson et al. | Navigated bronchoscopy: a technical review | |
US20190021681A1 (en) | Medical device approaches | |
JP5486432B2 (ja) | Image processing apparatus, operating method thereof, and program | |
KR20130108320A (ko) | 관련 애플리케이션들에 대한 일치화된 피하 해부구조 참조의 시각화 | |
Bertrand et al. | A case series study of augmented reality in laparoscopic liver resection with a deformable preoperative model | |
JP2013517909A (ja) | Image-based global registration applied to bronchoscopy guidance | |
CN101797182A (zh) | Augmented-reality-based navigation system for minimally invasive endonasal surgery | |
Onda et al. | Short rigid scope and stereo-scope designed specifically for open abdominal navigation surgery: clinical application for hepatobiliary and pancreatic surgery | |
Kriegmair et al. | Digital mapping of the urinary bladder: Potential for standardized cystoscopy reports | |
Amir-Khalili et al. | Automatic segmentation of occluded vasculature via pulsatile motion analysis in endoscopic robot-assisted partial nephrectomy video | |
WO2019047820A1 (fr) | Image display method, device, and system for endoscopic minimally invasive surgical navigation | |
Sorger et al. | A novel platform for electromagnetic navigated ultrasound bronchoscopy (EBUS) | |
Ma et al. | Surgical navigation system for laparoscopic lateral pelvic lymph node dissection in rectal cancer surgery using laparoscopic-vision-tracked ultrasonic imaging | |
WO2017036023A1 (fr) | Positioning system for use during a surgical operation | |
CN115414121A (zh) | Surgical navigation system based on a radio-frequency positioning chip | |
Konishi et al. | Augmented reality navigation system for endoscopic surgery based on three-dimensional ultrasound and computed tomography: Application to 20 clinical cases | |
Nagelhus Hernes et al. | Computer‐assisted 3D ultrasound‐guided neurosurgery: technological contributions, including multimodal registration and advanced display, demonstrating future perspectives | |
CN115375595A (zh) | Image fusion method, apparatus, system, computer device, and storage medium | |
Langø et al. | Navigation in laparoscopy–prototype research platform for improved image‐guided surgery |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15902828 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15902828 Country of ref document: EP Kind code of ref document: A1 |