US20070149846A1 - Anatomical visualization system - Google Patents
- Publication number
- US20070149846A1 (U.S. application Ser. No. 11/634,357)
- Authority
- US
- United States
- Prior art keywords
- software
- endoscope
- computer
- anatomical
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00039—Operational features of endoscopes provided with input arrangements for the user
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/042—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by a proximal camera, e.g. a CCD camera
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/06—Devices, other than using radiation, for detecting or locating foreign bodies ; Determining position of diagnostic devices within or on the body of the patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/06—Devices, other than using radiation, for detecting or locating foreign bodies ; Determining position of diagnostic devices within or on the body of the patient
- A61B5/065—Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/20—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/313—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes
- A61B1/317—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes for bones or joints, e.g. osteoscopes, arthroscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
- A61B2034/256—User interfaces for surgical systems having a database of accessory information, e.g. including context sensitive help or scientific articles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/368—Correlation of different images or relation of image positions in respect to the body changing the image on a display according to the operator's position
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/378—Surgical systems with images on a monitor during operation using ultrasound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/378—Surgical systems with images on a monitor during operation using ultrasound
- A61B2090/3782—Surgical systems with images on a monitor during operation using ultrasound transmitter or receiver in catheter or minimal invasive instrument
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
Definitions
- As is well known in the art, computer 130 also typically includes input devices 145 through which physician 20 can interact with the computer. Input devices 145 preferably comprise the sort of computer input devices generally associated with a Silicon Graphics digital computer, e.g., a keyboard, a mouse, etc. Among other things, input devices 145 permit physician 20 to initiate operation of anatomical visualization system 10, to select various system functions, and to supply the system with various directives, e.g., input devices 145 might be used by physician 20 to specify a particular viewing position for which the application program's image rendering software should render a visualization of the 3-D software models contained in database means 70.
- Database means 70 comprise a data storage device or medium 150 containing one or more 3-D computer models (schematically illustrated as 160 in FIG. 1) of the anatomical objects 30 which are to be visualized by anatomical visualization system 10. The specific data structure used to store the 3-D computer models 160 will depend on the specific nature of computer 130 and on the particular operating system software 135 and the particular application program software 140 being run on computer 130. In general, however, the 3-D computer models 160 contained in data storage device or medium 150 are preferably structured as a collection of software objects. By way of example, a scanned anatomical structure such as a human knee might be modeled as three distinct software objects, with the tibia being one software object (schematically represented at 30A′ in FIG. 4), the femur being a second software object (schematically represented at 30B′ in FIG. 4), and the meniscus being a third software object (schematically represented at 30C′ in FIG. 4). Such software objects are of the sort well known in the art and may have been created, for example, through post-processing of CT or MRI scans of the patient using techniques well known in the art.
- By way of example, the 3-D computer models 160 might comprise software objects defined as polygonal surface models, since such a format is consistent with the aforementioned image rendering software. FIGS. 2 and 3 illustrate a typical manner of defining a software object using a polygonal surface model of the sort utilized by such image rendering software: FIG. 2 illustrates the vertices of a unit cube set in an X-Y-Z coordinate system, and FIG. 3 illustrates the data file format of the polygonal surface model for this simple unit cube. As is well known in the art, more complex shapes such as human anatomical structures can be expressed in corresponding terms.
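The data file format of FIG. 3 is not reproduced in this text, but a polygonal surface model of the FIG. 2 unit cube reduces to a vertex list plus a list of faces indexing into it. The following C++ sketch is illustrative only; the struct layout and names are assumptions, not the patent's actual file format.

```cpp
#include <array>
#include <vector>

// Illustrative in-memory form of a polygonal surface model:
// a list of vertices plus a list of faces that index into it.
struct PolygonalSurfaceModel {
    std::vector<std::array<float, 3>> vertices;   // x, y, z coordinates
    std::vector<std::array<int, 4>>   quadFaces;  // 4 vertex indices per face
};

// The unit cube of FIG. 2: eight corner vertices and six quadrilateral
// faces. Anatomical structures use the same layout, only with many more
// vertices and faces (typically produced from CT/MRI post-processing).
PolygonalSurfaceModel makeUnitCube() {
    PolygonalSurfaceModel cube;
    cube.vertices = {
        {0, 0, 0}, {1, 0, 0}, {1, 1, 0}, {0, 1, 0},   // z = 0 face
        {0, 0, 1}, {1, 0, 1}, {1, 1, 1}, {0, 1, 1},   // z = 1 face
    };
    cube.quadFaces = {
        {0, 1, 2, 3}, {4, 5, 6, 7},                   // bottom, top
        {0, 1, 5, 4}, {1, 2, 6, 5},                   // sides
        {2, 3, 7, 6}, {3, 0, 4, 7},
    };
    return cube;
}
```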
- It should also be appreciated that certain digital computers, such as a Silicon Graphics digital computer of the sort described above, can be adapted such that digital video data of the sort output by video processing means 95 can be made to appear on the surface of a polygonal surface model software object in the final rendered image using the well known technique of texture mapping.
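Since texture mapping is named above as the mechanism for making digitized video appear on a model surface, here is a minimal fixed-function OpenGL sketch of the idea, in the spirit of the IRIS gl/OpenGL software the text mentions. It is not the patent's implementation; the function names and the assumption of RGB frames are illustrative.

```cpp
#include <GL/gl.h>

// Upload the latest digitized video frame (RGB, width x height pixels)
// into an OpenGL texture; called whenever video processing means
// deliver a new frame.
void uploadVideoFrame(GLuint tex, int width, int height,
                      const unsigned char* rgb) {
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, rgb);
}

// Draw one face of a polygonal surface model with the video frame
// texture mapped onto it: each vertex pairs a UV texture coordinate
// with a position in model space.
void drawTexturedQuad(GLuint tex) {
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex);
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex3f(0, 0, 0);
    glTexCoord2f(1, 0); glVertex3f(1, 0, 0);
    glTexCoord2f(1, 1); glVertex3f(1, 1, 0);
    glTexCoord2f(0, 1); glVertex3f(0, 1, 0);
    glEnd();
    glDisable(GL_TEXTURE_2D);
}
```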
- Display means 80 comprise a video display of the sort well known in the art. More particularly, display means 80 comprise a video display 170 of the sort adapted to receive video signals representative of an image and to display that image on a screen 180 for viewing by physician 20. By way of example, video display 170 might comprise a television type of monitor, or it might comprise a head-mounted display or a boom-mounted display, or it might comprise any other display device of the sort suitable for displaying an image corresponding to the video signals received from computer means 60, as will hereinafter be described in further detail. If desired, physician tracking means 185 may be attached to video display 170 and then used to advise computer 130 of the physician's head movements. This can be quite useful, since anatomical visualization system 10 can use such physician head movements to specify a particular viewing position for which the application program's image rendering software should render a visualization of the 3-D software models contained in database means 70. Alternatively, surgeon tracking means 188 (comprising a tracking system 189 similar to the tracking system 97 described above) may be attached directly to surgeon 20 and then used to advise computer 130 of the physician's movements; the anatomical visualization system can use such physician movements in the same way.
- As noted above, an important aspect of the present invention is the provision of an improved anatomical visualization system which is adapted to augment a standard video endoscopic system with a coordinated computer model visualization system so as to enhance the physician's understanding of the patient's interior anatomical structure. In particular, the improved anatomical visualization system is adapted to augment the direct, but somewhat limited, video images generated by a standard video endoscopic system with the indirect, but somewhat more flexible, images generated by a computer model visualization system.
- To this end, database means 70 also comprise one or more 3-D computer models (schematically illustrated at 190 in FIG. 1) of the particular endoscope 90 which is included in anatomical visualization system 10. Again, the specific data structure used to store the 3-D computer models 190 representing endoscope 90 will depend on the specific nature of computer 130 and on the particular operating system software 135 and the particular application program software 140 being run on computer 130. In general, however, the 3-D computer models 190 contained in data storage device or medium 150 are preferably structured as a pair of separate but interrelated software objects, where one of the software objects represents the physical embodiment of endoscope 90, and the other of the software objects represents the video image acquired by endoscope 90.
- More particularly, the 3-D computer models 190 representing endoscope 90 comprise a first software object (schematically represented at 90A′ in FIG. 4) representative of the shaft of endoscope 90, and a second software object (schematically represented at 90B′ in FIG. 4) which is representative of the video image acquired by endoscope 90. Specifically, second software object 90B′ is representative of a planar disk defined by the intersection of the endoscope's field of view with a plane set perpendicular to the center axis of that field of view, wherein the plane is separated from the endoscope by a distance equal to the endoscope's focal distance. See, for example, FIG. 11, which shows how the optical parameters for an endoscope can define the relationship between the endoscope 90A′ and the disk 90B′.
- Furthermore, anatomical visualization system 10 is arranged so that the video signals output by endoscope 90 are, after being properly transformed by video processing means 95 into the digital data format required by digital computer 130, texture mapped onto the planar surface of disk 90B′. Accordingly, software object 90B′ will be representative of the video image acquired by endoscope 90, and the two software objects 90A′ and 90B′ will together represent both the physical structure of endoscope 90 and the video image captured by that endoscope.
- By way of example, the 3-D computer models 190 might comprise software objects defined as polygonal surface models, since such a format is consistent with the aforementioned image rendering software. In such a case, UV texture mapping parameters are established for each of the vertices of the planar surface disk 90B′, and the digitized video signals from endoscope 90 are assigned to be texture map images for 90B′. See, for example, FIG. 10, which is a schematic representation of a unit disk software object where the disk is defined in the X-Y plane and has a diameter of 1.
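Combining FIG. 10's unit disk with FIG. 11's optics, the disk sits on the view axis at the focal distance, and its radius follows from the field of view as radius = focalDistance * tan(fieldOfView / 2), with each vertex carrying a UV coordinate into the video frame. A hedged sketch; the triangle-fan layout and all names are assumptions:

```cpp
#include <cmath>
#include <vector>

struct DiskVertex {
    float x, y, z;   // position in the endoscope's local coordinate frame
    float u, v;      // texture coordinate into the video image
};

// Builds the disk software object of FIGS. 10 and 11 as a triangle fan:
// the disk lies perpendicular to the view axis at the focal distance,
// with radius = focalDistance * tan(fieldOfView / 2).
std::vector<DiskVertex> makeViewDisk(float fieldOfViewRad, float focalDistance,
                                     int rimSegments = 32) {
    constexpr float kTwoPi = 6.28318530718f;
    const float radius = focalDistance * std::tan(fieldOfViewRad / 2.0f);
    std::vector<DiskVertex> fan;
    fan.push_back({0.0f, 0.0f, focalDistance, 0.5f, 0.5f});  // center vertex
    for (int i = 0; i <= rimSegments; ++i) {                 // closed rim
        const float a = kTwoPi * i / rimSegments;
        // Rim UVs trace the circle inscribed in the [0,1] x [0,1] video
        // frame, matching the endoscope's circular image region.
        fan.push_back({radius * std::cos(a), radius * std::sin(a), focalDistance,
                       0.5f + 0.5f * std::cos(a), 0.5f + 0.5f * std::sin(a)});
    }
    return fan;
}
```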
- Since the spatial relationship between shaft software object 90A′ and disk software object 90B′ remains constant, it can sometimes be convenient to think of shaft software object 90A′ and disk software object 90B′ as behaving like a single unit, e.g., when positioning the software objects 90A′ and 90B′ within 3-D computer models.
- Once the anatomical 3-D computer models 160 have been established from anatomical software objects 30A′, 30B′ and 30C′ (representative of the anatomical objects 30A, 30B and 30C which are to be visualized by the system), and once the endoscope 3-D computer models 190 have been established from the endoscope software objects 90A′ and 90B′ (representative of the endoscope and the video image captured by that endoscope), the various software objects are placed into proper registration with one another using techniques well known in the art so as to form a cohesive database for the application program's image rendering software.
- In other words, a principal task of the application program is to first resolve the relative coordinate systems of all the various software objects of anatomical 3-D computer models 160 and of endoscope 3-D computer models 190, and then to use the application program's image rendering software to merge these elements into a single composite image combining live video images derived from endoscope 90 with computer generated images derived from the computer graphics system.
- More particularly, the anatomical software objects 30A′, 30B′ and 30C′ will be defined in 3-D computer models 160 in the context of a particular coordinate system (e.g., the coordinate system established when the anatomical software objects were created), and endoscope software objects 90A′ and 90B′ will be defined in the context of the coordinate system established by endoscope tracking means 50. Inasmuch as the anatomical objects 30A′, 30B′ and 30C′ include unique points of reference which are readily identifiable both visually and within the anatomical 3-D computer models 160, the tracked endoscope can be used to physically touch those unique points of reference; such physical touching with the tracked endoscope will establish the location of those unique points of reference within the coordinate system of the endoscope, and this information can then be used to map the relationship between the endoscope's coordinate system and the coordinate system of the 3-D computer models 160. Alternatively, proper software object registration can also be accomplished by pointing endoscope 90 at various anatomical objects 30A, 30B and 30C and then having the system execute a search algorithm to identify the "virtual camera" position that matches the "real camera" position. Still other techniques for establishing the proper correspondence between two such coordinate systems are well known in the art.
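The landmark-touching procedure just described amounts to computing a rigid transform from the endoscope tracker's coordinate system into the model coordinate system. With three non-collinear reference points measured in both systems, one classical construction builds an orthonormal frame from each point triple and composes the two. The sketch below shows only that construction (illustrative names; exact point correspondences assumed; with more points, or noisy ones, a least-squares fit would be used instead):

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<float, 3>;

static Vec3 sub(Vec3 a, Vec3 b) { return {a[0]-b[0], a[1]-b[1], a[2]-b[2]}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0]};
}
static Vec3 norm(Vec3 a) {
    float l = std::sqrt(a[0]*a[0] + a[1]*a[1] + a[2]*a[2]);
    return {a[0]/l, a[1]/l, a[2]/l};
}

// Orthonormal frame (three column vectors) spanned by three landmarks.
static std::array<Vec3, 3> frameOf(const Vec3 p[3]) {
    Vec3 x = norm(sub(p[1], p[0]));
    Vec3 z = norm(cross(x, sub(p[2], p[0])));
    Vec3 y = cross(z, x);
    return {x, y, z};
}

// Maps endoscope coordinates to model coordinates: p' = R*p + t.
struct RigidTransform { float R[3][3]; Vec3 t; };

RigidTransform registerLandmarks(const Vec3 endo[3], const Vec3 model[3]) {
    auto fe = frameOf(endo);   // frame in endoscope coordinates
    auto fm = frameOf(model);  // same landmarks, model coordinates
    RigidTransform out{};
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 3; ++c)
            // R = [fm.x fm.y fm.z] * [fe.x fe.y fe.z]^T
            out.R[r][c] = fm[0][r]*fe[0][c] + fm[1][r]*fe[1][c] + fm[2][r]*fe[2][c];
    Vec3 Rp{};
    for (int r = 0; r < 3; ++r)
        Rp[r] = out.R[r][0]*endo[0][0] + out.R[r][1]*endo[0][1] + out.R[r][2]*endo[0][2];
    out.t = sub(model[0], Rp);  // t = pModel0 - R * pEndo0
    return out;
}
```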
- Once the foregoing registration is complete, anatomical software objects 30A′, 30B′ and 30C′ and endoscope software objects 90A′ and 90B′ can be considered to simultaneously coexist in a single coordinate system in the manner schematically illustrated in FIG. 4, whereby the application program's image rendering software can generate images of all of the system's software objects (e.g., 30A′, 30B′, 30C′, 90A′ and 90B′) according to some specified point of view. Furthermore, the images generated by the application program's image rendering software will automatically integrate (i) the relatively narrow field of view, live video image data provided by endoscope 90 with (ii) the wider field of view, computer model image data which can be generated by the system's computer graphics. See, for example, FIG. 5, which shows a composite image 200 which combines video image data 210 obtained from endoscope 90 with computer model image data 220 generated by the system's computer graphics.
- It is important to recognize that endoscope tracking means 50 are adapted to continuously monitor the current position of endoscope 90 and report the same to digital computer 130, whereby digital computer 130 can continuously update the 3-D computer models 190 representing endoscope 90. As a result, the images generated by the application program's image rendering software will remain accurate even as endoscope 90 is moved about relative to anatomical objects 30.
- In addition to the foregoing, anatomical object tracking means 230 may be attached to one or more of the anatomical objects 30A, 30B and 30C and then used to advise computer 130 of the current position of that anatomical object (see, for example, FIG. 1, where tracking system 240 has been attached to the patient's tibia 30A and femur 30B). In this way, digital computer 130 can continually update the 3-D computer models 160 representing the anatomical objects. Accordingly, the images generated by the application program's image rendering software will remain accurate even as tibia 30A and/or femur 30B move about relative to endoscope 90.
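In rendering terms, the continuous updating described above reduces to re-posing the tracked software objects from the latest tracker reports before each frame is drawn. The Pose, Tracker, and SceneNode types below are hypothetical stand-ins rather than any real tracker API:

```cpp
// Hypothetical pose report from a tracking system such as tracking
// system 97 or 240; real trackers deliver some equivalent of this.
struct Pose { float position[3]; float rotation[3][3]; };

struct Tracker {                     // stands in for tracking means 50/230
    Pose pose{};
    Pose currentPose() const { return pose; }
};

struct SceneNode {                   // a posable software object in the database
    Pose pose{};
    void setPose(const Pose& p) { pose = p; }
};

// One iteration of the visualization loop: re-pose the endoscope objects
// (shaft 90A' and disk 90B' move as one unit) and any tracked anatomy
// from the latest tracker reports, so the next rendered image stays
// accurate as instruments and anatomy move relative to one another.
void updateFrame(const Tracker& endoscopeTracker, const Tracker& anatomyTracker,
                 SceneNode& endoscopeModel, SceneNode& anatomyModel) {
    endoscopeModel.setPose(endoscopeTracker.currentPose());
    anatomyModel.setPose(anatomyTracker.currentPose());
    // ...then render all software objects from the specified point of view.
}
```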
- FIG. 6 provides additional details on how various elements of system data are input into computer means 60 in connection with the system's generation of output video for display on video display 170 .
- FIG. 7 provides additional details on the methodology employed by anatomical visualization system 10 in connection with rendering a video output image for display on video display 170 .
- Preferably, the application program software 140 of computer means 60 is configured so as to enable physician 20 to quickly and easily specify a particular viewing position (i.e., a "virtual camera" position in computer graphics terminology) for which the application program's image rendering software should render a visualization of the 3-D software models contained in database means 70.
- FIG. 8 shows a typical Apple Newton screen display 300 which provides various user input choices for directing the system's “virtual camera”.
- By way of example, physician 20 may select one of the joystick modes shown generally at 310 for permitting the user to use a joystick-type input device to specify a "virtual camera" position for the system. Alternatively, physician 20 may choose to use physician tracking means 185 or 187 to specify the virtual camera position for the system, in which case movement of the physician will cause a corresponding change in the virtual camera position.
- By way of further example, the physician may specify a virtual camera position disposed at the very end of the endoscope's shaft, whereby the endoscope's shaft is not seen in the rendered image (see, for example, FIG. 5), or the user may specify a virtual camera position disposed mid-way back along the length of the shaft, whereby a portion of the endoscope's shaft will appear in the rendered image. See, for example, FIG. 9, which shows a composite image 320 which combines video image data 330 obtained from endoscope 90 with computer model image data 340 generated by the system's computer graphics, and further wherein a computer graphic representation 350 of the endoscope's shaft appears on the rendered image.
- Additionally, physician 20 may specify a virtual camera position which is related to the spatial position and orientation of endoscope 90, in which case the virtual camera position will move in conjunction with endoscope 90. Alternatively, physician 20 may specify a virtual camera position which is not related to the spatial position and orientation of endoscope 90, in which case the virtual camera position will appear to move independently of endoscope 90.
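One plausible way to realize the endoscope-tethered virtual camera described above is to place the camera a chosen distance back along the shaft axis and aim it along that axis: a setback of zero gives the tip view of FIG. 5, while a positive setback yields the shaft-visible view of FIG. 9. A sketch using the standard GLU call; parameter names are illustrative:

```cpp
#include <GL/glu.h>

// Place the virtual camera 'setback' units behind the endoscope tip,
// looking along the shaft axis. setback = 0 reproduces the view from
// the tip (shaft invisible); setback > 0 pulls the camera back so part
// of the shaft enters the rendered image.
void applyVirtualCamera(const float tip[3],    // endoscope tip position
                        const float axis[3],   // unit vector along shaft, tip-ward
                        const float up[3],     // endoscope "up" direction
                        float setback) {
    float eye[3] = { tip[0] - setback * axis[0],
                     tip[1] - setback * axis[1],
                     tip[2] - setback * axis[2] };
    float at[3]  = { tip[0] + axis[0], tip[1] + axis[1], tip[2] + axis[2] };
    gluLookAt(eye[0], eye[1], eye[2], at[0], at[1], at[2],
              up[0], up[1], up[2]);
}
```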
- It is also possible for physician 20 to use slider control 360 to direct the application program's image rendering software to adjust the field of view set for the computer graphic image data 340 (see FIG. 9) generated by the system's computer graphics. In addition, physician 20 can use slider control 370 to direct the application program's image rendering software to fade the density of the video image which is texture mapped onto the face of disk software object 90B′. In this way, the face of the system's disk can be made to display an overlaid composite made up of both video image data and computer graphic image data, with the relative composition of the image being dictated according to the level of fade selected.
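The fade control maps naturally onto alpha blending: drawing the disk's video texture over the already-rendered computer graphics with an alpha equal to the fade level produces exactly the overlaid composite described above. A fixed-function OpenGL sketch, assuming the disk is drawn after the model geometry (the patent does not specify the mechanism):

```cpp
#include <GL/gl.h>

// Blend the video texture over the computer-graphic image already in
// the framebuffer. fade = 1.0 shows pure video on the disk face,
// fade = 0.0 pure computer graphics, and intermediate values give the
// overlaid composite controlled by slider 370.
void drawVideoDiskWithFade(GLuint videoTex, float fade,
                           void (*drawDiskGeometry)()) {
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, videoTex);
    // Modulate the texture by a constant alpha equal to the fade level.
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
    glColor4f(1.0f, 1.0f, 1.0f, fade);
    drawDiskGeometry();          // emits the disk's textured triangle fan
    glDisable(GL_TEXTURE_2D);
    glDisable(GL_BLEND);
}
```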
- It is also to be appreciated that anatomical visualization system 10 can be configured to work with video acquisition devices other than endoscopes. For example, the system can be configured to work with miniature ultrasound probes of the sort adapted for insertion into a body cavity, in which case the video output of the miniature ultrasound probe would be texture mapped onto the face of disk software object 90B′. Alternatively, other types of video acquisition devices could be used in a corresponding manner.
- It is also possible to use the foregoing visualization system to render images of objects other than anatomical structures. For example, the system could be used to provide images from the interior of complex machines, so long as appropriate 3-D computer models are provided for the physical structures which are to be visualized.
- In the foregoing embodiment, anatomical 3-D computer models 160 were created from software objects 30A′, 30B′ and 30C′ representing anatomical objects 30A, 30B and 30C; endoscope 3-D computer models 190 were created from endoscope software objects 90A′ and 90B′ representing endoscope shaft 90A and endoscope viewing field 90B; and the various software objects were placed into proper registration with one another using techniques well known in the art so as to form a cohesive database for the application program's image rendering software.
- Looking next at FIG. 12, there is shown a software object 30D′ representing the aorta of a patient. Also shown is a series of markers 30E′ placed into the system (e.g., by a human operator using a mouse) and a series of line segments 30F′ extending between selected ones of the markers 30E′. These markers 30E′ and line segments 30F′ may be used to plan a surgical procedure, to determine anatomical lengths or angles, etc.
- Also shown are a straight tube 30G′ which may be used for planning and measurement purposes, a curved tube 30H′ which may likewise be used for planning and measurement purposes, and a box 30I′ which may be used for planning and measurement purposes, e.g., for volume calculations.
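The lengths, angles, and volumes mentioned above are ordinary vector computations over the marker positions; a minimal sketch with illustrative names:

```cpp
#include <cmath>

struct Marker { float x, y, z; };   // a placed marker such as 30E'

// Length of a line segment (such as 30F') between two markers.
float segmentLength(const Marker& a, const Marker& b) {
    float dx = b.x - a.x, dy = b.y - a.y, dz = b.z - a.z;
    return std::sqrt(dx*dx + dy*dy + dz*dz);
}

// Angle (radians) at vertex marker 'v' between segments v->a and v->b.
float angleAt(const Marker& v, const Marker& a, const Marker& b) {
    float ax = a.x - v.x, ay = a.y - v.y, az = a.z - v.z;
    float bx = b.x - v.x, by = b.y - v.y, bz = b.z - v.z;
    float dot = ax*bx + ay*by + az*bz;
    float la  = std::sqrt(ax*ax + ay*ay + az*az);
    float lb  = std::sqrt(bx*bx + by*by + bz*bz);
    return std::acos(dot / (la * lb));
}

// Volume of an axis-aligned measurement box such as 30I'.
float boxVolume(const Marker& lo, const Marker& hi) {
    return (hi.x - lo.x) * (hi.y - lo.y) * (hi.z - lo.z);
}
```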
- FIG. 13 is similar to FIG. 12, except that aorta 30D′ has been rendered transparent.
- In FIG. 14 there is shown a virtual graft 30J′ which represents an arterial stent which may be deployed in the aorta, e.g., to treat an aortic aneurysm.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Public Health (AREA)
- Medical Informatics (AREA)
- Veterinary Medicine (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Physics & Mathematics (AREA)
- Pathology (AREA)
- Biophysics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Optics & Photonics (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- General Business, Economics & Management (AREA)
- Business, Economics & Management (AREA)
- Signal Processing (AREA)
- Radar, Positioning & Navigation (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Remote Sensing (AREA)
- Robotics (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Processing Or Creating Images (AREA)
- Image Processing (AREA)
- Closed-Circuit Television Systems (AREA)
- Endoscopes (AREA)
Abstract
An improved anatomical visualization system which is adapted to augment a standard video endoscopic system with a coordinated computer model visualization system so as to enhance a physician's understanding of the patient's interior anatomical structure.
Description
- This application:
- (1) is a continuation-in-part of pending prior U.S. patent application Ser. No. 09/874,869, filed Jun. 5, 2001 by David T. Chen et al. for ANATOMICAL VISUALIZATION SYSTEM (Attorney's Docket No. MMS-2E CON 2), which patent application is hereby incorporated herein by reference; and
- (2) claims benefit of pending prior U.S. Provisional Patent Application Ser. No. 60/277,643, filed Mar. 21, 2001 by David T. Chen et al. for ANATOMICAL VISUALIZATION SYSTEM (Attorney's Docket No. MMS-A PROV), which patent application is hereby incorporated herein by reference.
- This invention relates to medical apparatus in general, and more particularly to anatomical visualization systems.
- Endoscopic surgical procedures are now becoming increasingly popular due to the greatly reduced patient recovery times resulting from such surgery.
- More particularly, in endoscopic surgical procedures, relatively narrow surgical instruments are inserted into the interior of the patient's body so that the distal (i.e., working) ends of the instruments are positioned at a remote interior surgical site, while the proximal (i.e., handle) ends of the instruments remain outside the patient's body. The physician then manipulates the proximal (i.e., handle) ends of the instruments as required so as to cause the distal (i.e., working) ends of the instruments to carry out the desired surgical procedure at the remote interior surgical site. As a result of this technique, the incisions made in the patient's body can remain relatively small, thereby resulting in significantly faster patient recovery times.
- By way of example, laparoscopic surgical procedures have been developed wherein the abdominal region of the patient is inflated with gas (e.g., CO2) and then surgical instruments are inserted into the interior of the abdominal cavity so as to carry out the desired surgical procedure. By way of further example, arthroscopic surgical procedures have been developed wherein a knee joint is inflated with a fluid (e.g., a saline solution) and then surgical instruments are inserted into the interior of the joint so as to carry out the desired surgical procedure.
- In order to visualize what is taking place at the remote interior site, the physician also inserts an endoscope into the patient's body during the endoscopic surgery, together with an appropriate source of illumination. Such an endoscope generally comprises an elongated shaft having a distal end and a proximal end, and at least one internal passageway extending between the distal end and the proximal end. Image capturing means are disposed at the distal end of the shaft and extend through the shaft's at least one internal passageway, whereby the image capturing means can capture an image of a selected region located substantially adjacent to the distal end of the shaft and convey that image to the proximal end of the shaft. Viewing means are in turn disposed adjacent to the proximal end of the shaft, whereby the image obtained by the image capturing means can be conveyed to a display device which is viewed by the physician.
- Endoscopes of the sort described above are generally sufficient to permit the physician to carry out the desired endoscopic procedure. However, certain problems have been encountered when using such endoscopes in surgical procedures.
- For example, endoscopes of the sort described above generally have a fairly limited field of view. As a result, the physician typically cannot view the entire surgical field in a single image. This can mean that the physician may not see an important development as soon as it occurs, and/or that the physician must expend precious time and energy constantly redirecting the endoscope to different anatomical regions.
- Visualization problems can also occur due to the difficulty of providing proper illumination within a remote interior site.
- Also, visualization problems can occur due to the presence of intervening structures (e.g., fixed anatomical structures, moving debris, flowing blood, the presence of vaporized tissue when cauterizing in laparoscopic surgery, the presence of air bubbles in a liquid medium in the case of arthroscopic surgery, etc.).
- It has also been found that it can be very difficult for the physician to navigate the endoscope about the anatomical structures of interest, due to the relative ambiguity of various anatomical structures when seen through the endoscope's aforementioned limited field of view and due to the aforementioned visualization problems.
- Accordingly, one object of the present invention is to provide an improved anatomical visualization system.
- Another object of the present invention is to provide an improved anatomical visualization system which is adapted to enhance a physician's ability to comprehend the nature and location of internal bodily structures during endoscopic visualization.
- Still another object of the present invention is to provide an improved anatomical visualization system which is adapted to enhance a physician's ability to navigate an endoscope within the body.
- Yet another object of the present invention is to provide an improved anatomical visualization system which is adapted to augment a standard video endoscopic system with a coordinated computer model visualization system so as to enhance the physician's understanding of the patient's interior anatomical structures.
- And another object of the present invention is to provide an improved method for visualizing the interior anatomical structures of a patient.
- And still another object of the present invention is to provide an improved anatomical visualization system which can be used with remote visualization devices other than endoscopes, e.g., miniature ultrasound probes.
- And yet another object of the present invention is to provide an improved visualization system which can be used to visualize remote objects other than interior anatomical structures, e.g., the interiors of complex machines.
- And another object of the present invention is to provide an improved method for visualizing objects.
- These and other objects of the present invention are addressed by the provision and use of an improved anatomical visualization system comprising, in one preferred embodiment, a database of pre-existing software objects, wherein at least one of the software objects corresponds to a physical structure which is to be viewed by the system; a real-time sensor for acquiring data about the physical structure when the physical structure is located within that sensor's data acquisition field, wherein the real-time sensor is capable of being moved about relative to the physical structure; generating means for generating a real-time software object corresponding to the physical structure, using data acquired by the sensor; registration means for positioning the real-time software object in registration with the pre-existing software objects contained in the database; and processing means for generating an image from the software objects contained in the database, based upon a specified point of view.
- In another preferred form of the invention, the generating means create a software object that corresponds to a disk. The generating means may also be adapted to texture map the data acquired by the sensor onto the disk. Also, the registration means may comprise tracking means that are adapted so as to determine the spatial positioning and orientation of the real-time sensor and/or the physical structure.
- In another preferred aspect of the invention, the real-time sensor may comprise an endoscope and the physical structure may comprise an interior anatomical structure. The system may also include either user input means for permitting the user to provide the processing means with the specified point of view, or user tracking means that are adapted to provide the processing means with the specified point of view.
- According to another aspect of the invention, the real-time computer-based viewing system may comprise a database of software objects and image generating means for generating an image from the software objects contained in the database, based upon a specified point of view. In accordance with this aspect of the invention, means are also provided for specifying this point of view. At least one of the software objects contained in the database comprises pre-existing data corresponding to a physical structure which is to be viewed by the system, and at least one of the software objects comprises data generated by a real-time, movable sensor. The system further comprises registration means for positioning the at least one software object, comprising data generated by the real-time movable sensor, in registration with the at least one software object comprising pre-existing data corresponding to the physical structure which is to be viewed by the system.
- A preferred method for utilizing the present invention comprises: (1) positioning the sensor so that the physical structure is located within that sensor's data acquisition field, and generating a real-time software object corresponding to the physical structure using data acquired by the sensor, and positioning the real-time software object in registration with the pre-existing software objects contained in the database; (2) providing a specified point of view to the processing means; and (3) generating an image from the software objects contained in the database according to that specified point of view.
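Read as a pipeline, the three steps above have a direct skeleton in code. Every function below is a placeholder with an assumed name, standing in for the subsystems detailed in the description that follows:

```cpp
// Skeleton of the claimed three-step method; the types and functions
// are illustrative placeholders, not the patent's implementation.
struct SoftwareObject {};
struct PointOfView {};

SoftwareObject acquireRealTimeObject() { return {}; }  // (1) sensor data -> object
void registerWithDatabase(SoftwareObject&) {}          // (1) align with pre-existing objects
PointOfView specifiedPointOfView() { return {}; }      // (2) from user input or tracking
void renderFromDatabase(const PointOfView&) {}         // (3) image rendering software

void visualize() {
    SoftwareObject live = acquireRealTimeObject();     // step (1a)
    registerWithDatabase(live);                        // step (1b)
    renderFromDatabase(specifiedPointOfView());        // steps (2) and (3)
}
```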
- These and other objects and features of the present invention will be more fully disclosed or rendered obvious by the following detailed description of the preferred embodiment of the invention, which is to be considered together with the accompanying drawings wherein:
- FIG. 1 is a schematic view showing an anatomical visualization system formed in accordance with the present invention;
- FIG. 2 is a schematic view of a unit cube for use in defining polygonal surface models;
- FIG. 3 illustrates the data file format of the polygonal surface model for the simple unit cube shown in FIG. 2;
- FIG. 4 illustrates a system of software objects;
- FIG. 5 illustrates an image rendered by the anatomical visualization system;
- FIG. 6 illustrates how various elements of system data are input into computer means 60 in connection with the system's generation of output video for display on video display 170;
- FIG. 7 illustrates additional details on the methodology employed by anatomical visualization system 10 in connection with rendering a video output image for display on video display 170;
- FIG. 8 illustrates a typical screen display provided in accordance with the present invention;
- FIG. 9 illustrates an image rendered by the anatomical visualization system;
- FIG. 10 is a schematic representation of a unit disk software object where the disk is defined in the X-Y plane and has a diameter of 1; and
- FIG. 11 shows how the optical parameters for an endoscope can define the relationship between the endoscope 90A′ and the disk 90B′.
FIG. 1 , there is shown ananatomical visualization system 10 which comprises a preferred embodiment of the present invention.Anatomical visualization system 10 is intended to be used by aphysician 20 to visually inspectanatomical objects 30 located at an interior anatomical site. By way of example,anatomical visualization system 10 might be used byphysician 20 to visually inspect atibia 30A, afemur 30B and ameniscus 30C located within the knee joint of a patient. - An important aspect of the present invention is the provision of an improved anatomical visualization system which is adapted to augment a standard video endoscopic system with a coordinated computer model visualization system so as to enhance the physician's understanding of the patient's interior anatomical structure.
- To that end,
anatomical visualization system 10 generally comprises endoscope means 40, endoscope tracking means 50, computer means 60, database means 70 containing 3-D computer models of various objects which are to be visualized by the system, and display means 80. - Endoscope means 40 comprise an endoscope of the sort well known in the art. More particularly, endoscope means 40 comprise an
endoscope 90 which comprises (i) a lens arrangement which is disposed at the distal end of the endoscope for capturing an image of a selected region located substantially adjacent to the distal end of the endoscope, and (ii) an appropriate image sensor, e.g., a charge coupled device (“CCD”) element or video tube, which is positioned on the endoscope so as to receive an image captured by the lens arrangement and to generate corresponding video signals which are representative of the captured image. - The video signals output from
endoscope 90 are fed as an input into computer means 60. However, inasmuch as endoscope 90 will generally output its video signals in analog form and inasmuch as computer means 60 will generally require its video signal input to be in digital form, some conversion of the endoscope's video feed is generally required. In the preferred embodiment, video processing means 95 are provided to convert the analog video signals output by endoscope 90 into the digital video signals required by computer means 60. Video processing means 95 are of the sort well known in the art and hence need not be described in further detail here.
- Endoscope tracking means 50 comprise a tracking system of the sort well known in the art. More particularly, endoscope tracking means 50 may comprise a tracking system 97 of the sort adapted to monitor the position and orientation of an object in space and to generate output signals which are representative of the position and orientation of that object. By way of example, tracking system 97 might comprise an optical tracking system, an electromagnetic tracking system, an ultrasonic tracking system, or an articulated linkage tracking system, among other alternatives. Such tracking systems are all well known in the art and hence need not be described in further detail here. Tracking system 97 is attached to endoscope 90 such that the output signals generated by tracking system 97 will be representative of the spatial positioning and orientation of endoscope 90. The output signals generated by tracking system 97 are fed as an input into computer means 60.
- Computer means 60 comprise a digital computer 130 of the sort adapted for high speed processing of computer graphics. Such digital computers are well known in the art. By way of example, digital computer 130 might comprise a Silicon Graphics Reality Engine digital computer, or it might comprise a Silicon Graphics Iris Indigo2 Impact digital computer, or it might comprise some equivalent digital computer.
- Computer means 60 also comprise the operating system software (schematically represented at 135 in FIG. 1) and the application program software (schematically represented at 140 in FIG. 1) required to cause computer 130 to operate in the manner hereinafter described. In particular, application program software 140 includes image rendering software of the sort adapted to generate images from the 3-D computer models contained in database means 70 according to a specified point of view. By way of example, where digital computer 130 comprises a Silicon Graphics digital computer of the sort disclosed above, operating system software 135 might comprise the IRIX operating system, and the image rendering software contained in application program software 140 might comprise the IRIS gl image rendering software or the OpenGL image rendering software. Such software is well known in the art. As is also well known in the art, such image rendering software utilizes standard techniques, such as the well-known Z buffer algorithm, to draw images of 3-D computer models according to some specified point of view.
- As is well known in the art, computer 130 also typically includes input devices 145 through which physician 20 can interact with the computer. Input devices 145 preferably comprise the sort of computer input devices generally associated with a Silicon Graphics digital computer, e.g., input devices 145 preferably comprise a keyboard, a mouse, etc. Among other things, input devices 145 permit physician 20 to initiate operation of anatomical visualization system 10, to select various system functions, and to supply the system with various directives, e.g., input devices 145 might be used by physician 20 to specify a particular viewing position for which the application program's image rendering software should render a visualization of the 3-D software models contained in database means 70.
- Database means 70 comprise a data storage device or medium 150 containing one or more 3-D computer models (schematically illustrated as 160 in FIG. 1) of the anatomical objects 30 which are to be visualized by anatomical visualization system 10. The specific data structure used to store the 3-D computer models 160 will depend on the specific nature of computer 130 and on the particular operating system software 135 and the particular application program software 140 being run on computer 130. In general, however, the 3-D computer models 160 contained in data storage device or medium 150 are preferably structured as a collection of software objects. By way of example, a scanned anatomical structure such as a human knee might be modeled as three distinct software objects, with the tibia being one software object (schematically represented at 30A′ in FIG. 4), the femur being a second software object (schematically represented at 30B′ in FIG. 4), and the meniscus being a third software object (schematically represented at 30C′ in FIG. 4). Such software objects are of the sort well known in the art and may have been created, for example, through post-processing of CT or MRI scans of the patient using techniques well known in the art.
- By way of example, in the case where digital computer 130 comprises a Silicon Graphics digital computer of the sort described above, and where the operating system's software comprises the IRIX operating system and the application program's image rendering software comprises the Iris gl or OpenGL image rendering software, the 3-D computer models 160 might comprise software objects defined as polygonal surface models, since such a format is consistent with the aforementioned software. By way of further example, FIGS. 2 and 3 illustrate a typical manner of defining a software object using a polygonal surface model of the sort utilized by such image rendering software. In particular, FIG. 2 illustrates the vertices of a unit cube set in an X-Y-Z coordinate system, and FIG. 3 illustrates the data file format of the polygonal surface model for this simple unit cube. As is well known in the art, more complex shapes such as human anatomical structures can be expressed in corresponding terms. It is also to be appreciated that certain digital computers, such as a Silicon Graphics digital computer of the sort described above, can be adapted such that digital video data of the sort output by video processing means 95 can be made to appear on the surface of a polygonal surface model software object in the final rendered image using the well known technique of texture mapping.
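- As a rough illustration of the kind of information carried by the data file format of FIG. 3, a polygonal surface model can be held as a vertex list plus a face list. The sketch below is a hypothetical in-memory structure (not the patent's actual file format) encoding the unit cube of FIG. 2:

```cpp
#include <array>
#include <vector>

// Hypothetical in-memory form of a polygonal surface model: a list of
// vertices in X-Y-Z coordinates and a list of faces, each face indexing
// four vertices. FIG. 3's file format carries the same information.
struct PolygonalSurfaceModel {
    std::vector<std::array<float, 3>> vertices;   // x, y, z
    std::vector<std::array<int, 4>>   quadFaces;  // vertex indices
};

// The unit cube of FIG. 2: eight vertices and six quadrilateral faces.
PolygonalSurfaceModel makeUnitCube() {
    PolygonalSurfaceModel cube;
    cube.vertices = {
        {0, 0, 0}, {1, 0, 0}, {1, 1, 0}, {0, 1, 0},   // bottom (z = 0)
        {0, 0, 1}, {1, 0, 1}, {1, 1, 1}, {0, 1, 1},   // top    (z = 1)
    };
    cube.quadFaces = {
        {0, 1, 2, 3}, {4, 5, 6, 7},   // bottom, top
        {0, 1, 5, 4}, {1, 2, 6, 5},   // sides
        {2, 3, 7, 6}, {3, 0, 4, 7},
    };
    return cube;
}
```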
- Display means 80 comprise a video display of the sort well known in the art. More particularly, display means 80 comprise a video display 170 of the sort adapted to receive video signals representative of an image and to display that image on a screen 180 for viewing by physician 20. By way of example, video display 170 might comprise a television type of monitor, or it might comprise a head-mounted display or a boom-mounted display, or it might comprise any other display device of the sort suitable for displaying an image corresponding to the video signals received from computer means 60, as will hereinafter be described in further detail. In addition, where video display 170 comprises a head-mounted display or a boom-mounted display or some other sort of display coupled to the physician's head movements, physician tracking means 185 (comprising a tracking system 187 similar to the tracking system 97 described above) may be attached to video display 170 and then used to advise computer 130 of the physician's head movements. This can be quite useful, since the anatomical visualization system 10 can use such physician head movements to specify a particular viewing position for which the application program's image rendering software should render a visualization of the 3-D software models contained in database means 70. - In addition to the foregoing, it should also be appreciated that surgeon tracking means 188 (comprising a
tracking system 189 similar to the tracking system 97 described above) may be attached directly to surgeon 20 and then used to advise computer 130 of the physician's movements. Again, the anatomical visualization system can use such physician movements to specify a particular viewing position for which the application program's image rendering software should render a visualization of the 3-D software models contained in database means 70. - As noted above, an important aspect of the present invention is the provision of an improved anatomical visualization system which is adapted to augment a standard video endoscopic system with a coordinated computer model visualization system so as to enhance the physician's understanding of the patient's interior anatomical structure. In particular, the improved anatomical visualization system is adapted to augment the direct, but somewhat limited, video images generated by a standard video endoscopic system with the indirect, but somewhat more flexible, images generated by a computer model visualization system.
- To this end, and referring now to
FIG. 1, database means 70 also comprise one or more 3-D computer models (schematically illustrated at 190 in FIG. 1) of the particular endoscope 90 which is included in anatomical visualization system 10. Again, the specific data structure used to store the 3-D computer models 190 representing endoscope 90 will depend on the specific nature of computer 130 and on the particular operating system software 135 and the particular application program software 140 being run on computer 130. In general, however, the 3-D computer models 190 contained in data storage device or medium 150 are preferably structured as a pair of separate but interrelated software objects, where one of the software objects represents the physical embodiment of endoscope 90, and the other of the software objects represents the video image acquired by endoscope 90.
- More particularly, the 3-D computer models 190 representing endoscope 90 comprise a first software object (schematically represented at 90A′ in FIG. 4) representative of the shaft of endoscope 90.
- The 3-D computer models 190 representing endoscope 90 also comprise a second software object (schematically represented at 90B′ in FIG. 4) which is representative of the video image acquired by endoscope 90. More particularly, second software object 90B′ is representative of a planar disk defined by the intersection of the endoscope's field of view with a plane set perpendicular to the center axis of that field of view, wherein the plane is separated from the endoscope by a distance equal to the endoscope's focal distance. See, for example, FIG. 11, which shows how the optical parameters for an endoscope can define the relationship between the endoscope 90A′ and the disk 90B′. In addition, and as will hereinafter be described in further detail, the anatomical visualization system 10 is arranged so that the video signals output by endoscope 90 are, after being properly transformed by video processing means 95 into the digital data format required by digital computer 130, texture mapped onto the planar surface of disk 90B′. Thus it will be appreciated that software object 90B′ will be representative of the video image acquired by endoscope 90.
- Furthermore, it will be appreciated that the two software objects 90A′ and 90B′ will together represent both the physical structure of endoscope 90 and the video image captured by that endoscope.
- By way of example, in the case where digital computer 130 comprises a Silicon Graphics computer of the sort described above, and where the operating system's software comprises the IRIX operating system and the application program's image rendering software comprises the Iris gl or OpenGL image rendering software, the 3-D computer models 190 might comprise software objects defined as polygonal surface models, since such a format is consistent with the aforementioned software. Furthermore, in a manner consistent with the aforementioned software, UV texture mapping parameters are established for each of the vertices of the planar surface disk 90B′ and the digitized video signals from endoscope 90 are assigned to be texture map images for 90B′. See, for example, FIG. 10, which is a schematic representation of a unit disk software object where the disk is defined in the X-Y plane and has a diameter of 1.
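- In legacy OpenGL terms, the texture mapping just described might be sketched as follows. The sketch assumes an active OpenGL context and an RGB video frame already digitized by video processing means 95, approximates the unit disk of FIG. 10 with a triangle fan, and uses illustrative names throughout; it is not the patent's actual implementation:

```cpp
#include <GL/gl.h>
#include <cmath>

// Upload the current digitized video frame as an OpenGL texture.
// 'pixels' is assumed to be a width x height RGB buffer.
void uploadVideoFrame(GLuint tex, int width, int height, const unsigned char* pixels) {
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);
}

// Draw a unit-diameter disk in the X-Y plane as a triangle fan, assigning
// UV coordinates to each vertex so the video frame covers the disk's face.
void drawVideoDisk(GLuint tex, int segments = 64) {
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex);
    glBegin(GL_TRIANGLE_FAN);
    glTexCoord2f(0.5f, 0.5f);          // texture centre maps to disk centre
    glVertex3f(0.0f, 0.0f, 0.0f);
    for (int i = 0; i <= segments; ++i) {
        float a = 2.0f * 3.14159265f * i / segments;
        float x = 0.5f * std::cos(a);  // radius 0.5 gives diameter 1
        float y = 0.5f * std::sin(a);
        glTexCoord2f(0.5f + x, 0.5f + y);  // UV stays within [0, 1]
        glVertex3f(x, y, 0.0f);
    }
    glEnd();
    glDisable(GL_TEXTURE_2D);
}
```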
- It is important to recognize that, so long as the optical characteristics of endoscope 90 remain constant, the size and positional relationships between shaft software object 90A′ and disk software object 90B′ will also remain constant. As a result, it can sometimes be convenient to think of shaft software object 90A′ and disk software object 90B′ as behaving like a single unit, e.g., when positioning the software objects 90A′ and 90B′ within 3-D computer models.
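- One convenient way to exploit this constancy is to derive the disk's transform from the tracked shaft transform and a single fixed offset, so that only the shaft's pose changes from frame to frame. The sketch below assumes the shaft axis is the local z axis and a row-major 4×4 matrix convention; both assumptions, like the names, are illustrative:

```cpp
#include <array>

using Mat4 = std::array<std::array<double, 4>, 4>;

// Standard 4x4 matrix product: result = a * b.
Mat4 multiply(const Mat4& a, const Mat4& b) {
    Mat4 r{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k)
                r[i][j] += a[i][k] * b[k][j];
    return r;
}

// The disk's world transform is the tracked shaft transform composed with a
// constant offset: a translation along the (assumed) local z shaft axis by
// the endoscope's focal distance.
Mat4 diskWorldTransform(const Mat4& shaftWorld, double focalDistance) {
    Mat4 offset = {{{1, 0, 0, 0},
                    {0, 1, 0, 0},
                    {0, 0, 1, focalDistance},
                    {0, 0, 0, 1}}};
    return multiply(shaftWorld, offset);
}
```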
- In accordance with the present invention, once the anatomical 3-D computer models 160 have been established from anatomical software objects 30A′, 30B′ and 30C′ (representative of the anatomical objects 30A, 30B and 30C), and once the endoscope 3-D computer models 190 have been established from the endoscope software objects 90A′ and 90B′ (representative of the endoscope and the video image captured by that endoscope), the various software objects are placed into proper registration with one another using techniques well known in the art so as to form a cohesive database for the application program's image rendering software.
- Stated another way, a principal task of the application program is to first resolve the relative coordinate system of all the various software objects of anatomical 3-D computer models 160 and of endoscope 3-D computer models 190, and then to use the application program's image rendering software to merge these elements into a single composite image combining both live video images derived from endoscope 90 with computer generated images derived from the computer graphics system.
- In this respect it will be appreciated that anatomical software objects 30A′, 30B′ and 30C′ will be defined in 3-D computer models 160 in the context of a particular coordinate system (e.g., the coordinate system established when the anatomical software objects were created), and endoscope software objects 90A′ and 90B′ will be defined in the context of the coordinate system established by endoscope tracking means 50.
anatomical objects 30A′, 30B′ and 30C′ include unique points of reference which are readily identifiable both visually and within the anatomical 3-D computer models 160, the tracked endoscope can be used to physically touch those unique points of reference; such physical touching with the tracked endoscope will establish the location of those unique points of reference within the coordinate system of the endoscope, and this information can then be used to map the relationship between the endoscope's coordinate system and the coordinate system of the 3-D computer models 160. Alternatively, proper software object registration can also be accomplished by pointingendoscope 90 at variousanatomical objects - Once the proper correspondence has been established between all of the coordinate systems, anatomical software objects 30A′, 30B′ and 30C′ and endoscope software objects 90A′ and 90B′ can be considered to simultaneously coexist in a single coordinate system in the manner schematically illustrated in
- Once the proper correspondence has been established between all of the coordinate systems, anatomical software objects 30A′, 30B′ and 30C′ and endoscope software objects 90A′ and 90B′ can be considered to simultaneously coexist in a single coordinate system in the manner schematically illustrated in FIG. 4, whereby the application program's image rendering software can generate images of all of the system's software objects (e.g., 30A′, 30B′, 30C′, 90A′ and 90B′) according to some specified point of view.
- Furthermore, inasmuch as the live video output from endoscope 90 is texture mapped onto the surface of disk 90B′, the images generated by the application program's image rendering software will automatically integrate (i) the relatively narrow field of view, live video image data provided by endoscope 90 with (ii) the wider field of view, computer model image data which can be generated by the system's computer graphics. See, for example, FIG. 5, which shows a composite image 200 which combines video image data 210 obtained from endoscope 90 with computer model image data 220 generated by the system's computer graphics.
- It is to be appreciated that, inasmuch as endoscope tracking means 50 are adapted to continuously monitor the current position of endoscope 90 and report the same to digital computer 130, digital computer 130 can continuously update the 3-D computer models 190 representing endoscope 90. As a result, the images generated by the application program's image rendering software will remain accurate even as endoscope 90 is moved about relative to anatomical objects 30.
- In addition to the foregoing, it should also be appreciated that anatomical object tracking means 230 (comprising a tracking system 240 generally similar to the tracking system 97 described above) may be attached to one or more of the anatomical objects 30A, 30B and 30C and then used to advise computer 130 of the current position of that anatomical object (see, for example, FIG. 1, where tracking system 240 has been attached to the patient's tibia 30A and femur 30B). As a result, digital computer 130 can continually update the 3-D computer models 160 representing the anatomical objects. Accordingly, the images generated by the application program's image rendering software will remain accurate even as tibia 30A and/or femur 30B move about relative to endoscope 90.
- FIG. 6 provides additional details on how various elements of system data are input into computer means 60 in connection with the system's generation of output video for display on video display 170.
- FIG. 7 provides additional details on the methodology employed by anatomical visualization system 10 in connection with rendering a video output image for display on video display 170.
- The application program software 140 of computer means 60 is configured so as to enable physician 20 to quickly and easily specify a particular viewing position (i.e., a "virtual camera" position in computer graphics terminology) for which the application program's image rendering software should render a visualization of the 3-D software models contained in database means 70. By way of illustration, FIG. 8 shows a typical Apple Newton screen display 300 which provides various user input choices for directing the system's "virtual camera". For example, physician 20 may select one of the joystick modes as shown generally at 310 for permitting the user to use a joystick-type input device to specify a "virtual camera" position for the system. Alternatively, physician 20 may choose to use physician tracking means 185 or 187 to specify the virtual camera position for the system, in which case movement of the physician will cause a corresponding change in the virtual camera position. Using such tools, the physician may specify a virtual camera position disposed at the very end of the endoscope's shaft, whereby the endoscope's shaft is not seen in the rendered image (see, for example, FIG. 5), or the user may specify a virtual camera position disposed mid-way back along the length of the shaft, whereby a portion of the endoscope's shaft will appear in the rendered image. See, for example, FIG. 9, which shows a composite image 320 which combines video image data 330 obtained from endoscope 90 with computer model image data 340 generated by the system's computer graphics, and further wherein a computer graphic representation 350 of the endoscope's shaft appears on the rendered image.
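- By way of illustration, tying such a "virtual camera" to the tracked endoscope might be sketched as follows, with a single parameter choosing between the very end of the shaft and a point pulled back along it; gluLookAt is the standard GLU viewing call, while the remaining names and conventions are hypothetical:

```cpp
#include <GL/glu.h>

// Place the virtual camera on the endoscope's axis. 'backoff' = 0 puts the
// eye at the tip, so the shaft is not seen in the rendered image; larger
// values pull the eye back along the shaft so part of the shaft appears.
// 'tip' and 'axis' are assumed to come from the endoscope tracking data,
// with 'axis' a unit vector pointing forward along the shaft.
void setVirtualCamera(const double tip[3], const double axis[3],
                      const double up[3], double backoff) {
    double eye[3] = {tip[0] - backoff * axis[0],
                     tip[1] - backoff * axis[1],
                     tip[2] - backoff * axis[2]};
    gluLookAt(eye[0], eye[1], eye[2],
              tip[0] + axis[0], tip[1] + axis[1], tip[2] + axis[2],  // look past the tip
              up[0], up[1], up[2]);
}
```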
- It is to be appreciated that physician 20 may specify a virtual camera position which is related to the spatial position and orientation of endoscope 90, in which case the virtual camera position will move in conjunction with endoscope 90. Alternatively, physician 20 may specify a virtual camera position which is not related to the spatial position and orientation of endoscope 90, in which case the virtual camera position will appear to move independently of endoscope 90.
- Still referring now to FIG. 8, it is also possible for physician 20 to use slider control 360 to direct the application program's image rendering software to adjust the field of view set for the computer graphic image data 340 (see FIG. 9) generated by the system's computer graphics.
- Additionally, it is also possible for physician 20 to use slider control 370 to direct the application program's image rendering software to fade the density of the video image which is texture mapped onto the face of disk software object 90B′. As a result of such fading, the face of the system's disk can be made to display an overlaid composite made up of both video image data and computer graphic image data, with the relative composition of the image being dictated according to the level of fade selected.
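- Such fading can be realized with standard alpha blending, drawing the texture-mapped video over the computer graphic with an alpha value taken from slider control 370. A sketch, again assuming an active OpenGL context and reusing the illustrative drawing routine from the texture-mapping sketch above:

```cpp
#include <GL/gl.h>

void drawVideoDisk(GLuint tex, int segments);  // from the earlier sketch

// Blend the texture-mapped video over the underlying computer graphic.
// fade = 1.0 shows pure video; fade = 0.0 lets the computer graphic show
// through, so the disk's face displays the overlaid composite described
// above. Assumes the video disk is drawn after the rest of the scene.
void drawFadedVideoDisk(GLuint tex, float fade) {
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glColor4f(1.0f, 1.0f, 1.0f, fade);  // default GL_MODULATE texturing scales
                                        // the texture by this colour and alpha
    drawVideoDisk(tex, 64);
    glDisable(GL_BLEND);
}
```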
anatomical visualization system 10 is rendered from a collection of software objects contained in 3-D computer models, it is possible to render the display image according to any preferred vertical axis. Thus, for example, and referring now to control 380 inFIG. 8 , it is possible to render the display image so thatendoscope 90 provides the relative definition of “up”, or so that the real world provides the relative definition of “up”, or so that some other object (e.g., the principal longitudinal axis of the patient's tibia) provides the relative definition of “up”. - It is also to be appreciated that
- It is also to be appreciated that anatomical visualization system 10 can be configured to work with video acquisition devices other than endoscopes. For example, the system can be configured to work with miniature ultrasound probes of the sort adapted for insertion into a body cavity. In this situation the video output of the miniature ultrasound probe would be texture mapped onto the face of disk software object 90B′. Alternatively, other types of video acquisition devices could be used in a corresponding manner. - Also, it is possible to use the foregoing visualization system to render images of objects other than anatomical structures. For example, the system could be used to provide images from the interior of complex machines, so long as appropriate 3-D computer models are provided for the physical structures which are to be visualized.
- In the preceding description, anatomical 3-
D computer models 160 were created from software objects 30A′, 30B′, 30C′ representing anatomical objects 30A, 30B and 30C; endoscope 3-D computer models 190 were created from endoscope software objects 90A′ and 90B′ representing endoscope shaft 90A and endoscope viewing field 90B; and the various software objects were placed into proper registration with one another using techniques well known in the art so as to form a cohesive database for the application program's image rendering software. - In this respect it should be appreciated that it is also possible to create computer models of objects in addition to
anatomical objects 30A, 30B and 30C and endoscope 90, and to place those computer models into the system's database for selective viewing by the system's image rendering software. - By way of example, and looking now at
FIG. 12, there is shown a software object 30D′ representing the aorta of a patient. Also shown is a series of markers 30E′ placed into the system (e.g., by a human operator using a mouse) and a series of line segments 30F′ extending between selected ones of the markers 30E′. These markers 30E′ and line segments 30F′ may be used to plan a surgical procedure, to determine anatomical lengths or angles, etc.
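- By way of illustration, an anatomical length is simply the distance between two such markers, and an angle can be taken at a marker shared by two line segments. A small self-contained sketch with hypothetical names:

```cpp
#include <array>
#include <cmath>

using Marker = std::array<double, 3>;  // a point such as one of the markers 30E'

// Anatomical length: the distance between two markers, i.e. the length of
// a line segment such as 30F'.
double distanceBetween(const Marker& a, const Marker& b) {
    double dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Angle (in radians) at 'vertex' between the segments vertex->a and vertex->b.
double angleAt(const Marker& vertex, const Marker& a, const Marker& b) {
    double u[3] = {a[0] - vertex[0], a[1] - vertex[1], a[2] - vertex[2]};
    double v[3] = {b[0] - vertex[0], b[1] - vertex[1], b[2] - vertex[2]};
    double d  = u[0] * v[0] + u[1] * v[1] + u[2] * v[2];
    double nu = std::sqrt(u[0] * u[0] + u[1] * u[1] + u[2] * u[2]);
    double nv = std::sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
    return std::acos(d / (nu * nv));
}
```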
- Also shown is a straight tube 30G′ which may also be used for planning and measurement purposes, etc., a curved tube 30H′ which may be used for planning and measurement purposes, and a box 30I′ which may be used for planning and measurement purposes, e.g., for volume calculations.
- FIG. 13 is similar to FIG. 12, except that aorta 30D′ has been rendered transparent. -
- Significantly, it is also possible to insert into the system software objects representing virtual grafts, virtual implants, etc. By way of example, in
FIG. 14 there is shown avirtual graft 30J′ which represents an arterial stent which may be deployed in the aorta, e.g., to treat an aortic aneurysm. - It is also to be understood that the present invention is by no means limited to the particular construction herein set forth, and/or shown in the drawings, but also comprises any modifications or equivalents within the scope of the claims.
Claims (1)
1. A real-time computer-based viewing system comprising:
a database of pre-existing software objects, wherein at least one of said software objects corresponds to a physical structure which is to be viewed by said system and at least one of said software objects corresponds to a graft which is to be deployed adjacent said physical structure;
a real-time sensor for acquiring data about said physical structure when said physical structure is located within that sensor's data acquisition field, wherein said sensor is capable of being moved about relative to said physical structure;
generating means for generating a real-time software object corresponding to said physical structure using data acquired by said sensor;
registration means for positioning said real-time software object in registration with said pre-existing software objects contained in said database; and
processing means for generating an image from said software objects contained in said database, based upon a specified point of view.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/634,357 US20070149846A1 (en) | 1995-07-24 | 2006-12-05 | Anatomical visualization system |
US11/732,291 US8285011B2 (en) | 2005-06-02 | 2007-04-03 | Anatomical visualization and measurement system |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/505,587 US5776050A (en) | 1995-07-24 | 1995-07-24 | Anatomical visualization system |
US27764301P | 2001-03-21 | 2001-03-21 | |
US09/874,869 US6612980B2 (en) | 1995-07-24 | 2001-06-05 | Anatomical visualization system |
US10/104,082 US6702736B2 (en) | 1995-07-24 | 2002-03-21 | Anatomical visualization system |
US10/823,811 US7144367B2 (en) | 1995-07-24 | 2004-03-09 | Anatomical visualization system |
US11/634,357 US20070149846A1 (en) | 1995-07-24 | 2006-12-05 | Anatomical visualization system |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/823,811 Continuation US7144367B2 (en) | 1995-07-24 | 2004-03-09 | Anatomical visualization system |
US11/728,205 Continuation-In-Part US20080018645A1 (en) | 2003-11-10 | 2007-03-23 | Anatomical visualization and measurement system |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/637,614 Continuation-In-Part US20070253610A1 (en) | 1995-12-29 | 2006-12-12 | Anatomical visualization and measurement system |
US11/732,291 Continuation-In-Part US8285011B2 (en) | 2005-06-02 | 2007-04-03 | Anatomical visualization and measurement system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20070149846A1 true US20070149846A1 (en) | 2007-06-28 |
Family
ID=46280418
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/104,082 Expired - Lifetime US6702736B2 (en) | 1995-07-24 | 2002-03-21 | Anatomical visualization system |
US10/823,811 Expired - Lifetime US7144367B2 (en) | 1995-07-24 | 2004-03-09 | Anatomical visualization system |
US11/634,357 Abandoned US20070149846A1 (en) | 1995-07-24 | 2006-12-05 | Anatomical visualization system |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/104,082 Expired - Lifetime US6702736B2 (en) | 1995-07-24 | 2002-03-21 | Anatomical visualization system |
US10/823,811 Expired - Lifetime US7144367B2 (en) | 1995-07-24 | 2004-03-09 | Anatomical visualization system |
Country Status (1)
Country | Link |
---|---|
US (3) | US6702736B2 (en) |
Families Citing this family (62)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8944070B2 (en) | 1999-04-07 | 2015-02-03 | Intuitive Surgical Operations, Inc. | Non-force reflecting method for providing tool force information to a user of a telesurgical system |
WO2004010857A1 (en) * | 2002-07-31 | 2004-02-05 | Olympus Corporation | Endoscope |
AU2003208051A1 (en) * | 2003-01-28 | 2004-08-23 | Fujitsu Frontech Limited | Method for supporting data linkage between applications |
US7381183B2 (en) * | 2003-04-21 | 2008-06-03 | Karl Storz Development Corp. | Method for capturing and displaying endoscopic maps |
AU2003252270A1 (en) * | 2003-05-30 | 2005-01-21 | Lattice Technology, Inc. | 3-dimensional graphics data display device |
US7292715B2 (en) * | 2003-06-09 | 2007-11-06 | Infraredx, Inc. | Display of diagnostic data |
US7822461B2 (en) * | 2003-07-11 | 2010-10-26 | Siemens Medical Solutions Usa, Inc. | System and method for endoscopic path planning |
US20050054895A1 (en) * | 2003-09-09 | 2005-03-10 | Hoeg Hans David | Method for using variable direction of view endoscopy in conjunction with image guided surgical systems |
US8276091B2 (en) * | 2003-09-16 | 2012-09-25 | Ram Consulting | Haptic response system and method of use |
WO2005038711A1 (en) * | 2003-10-17 | 2005-04-28 | Koninklijke Philips Electronics, N.V. | Manual tools for model based image segmentation |
US7197170B2 (en) * | 2003-11-10 | 2007-03-27 | M2S, Inc. | Anatomical visualization and measurement system |
JP4343723B2 (en) * | 2004-01-30 | 2009-10-14 | オリンパス株式会社 | Insertion support system |
US9615772B2 (en) * | 2004-02-20 | 2017-04-11 | Karl Storz Imaging, Inc. | Global endoscopic viewing indicator |
US20060098010A1 (en) * | 2004-03-09 | 2006-05-11 | Jeff Dwyer | Anatomical visualization and measurement system |
DE102004044435A1 (en) * | 2004-09-14 | 2006-03-30 | Siemens Ag | Method and device for diagnosis and therapy of the aortic aneurysm |
US7702137B2 (en) | 2004-11-10 | 2010-04-20 | M2S, Inc. | Anatomical visualization and measurement system |
JP2006218129A (en) * | 2005-02-10 | 2006-08-24 | Olympus Corp | Surgery supporting system |
US9789608B2 (en) | 2006-06-29 | 2017-10-17 | Intuitive Surgical Operations, Inc. | Synthetic representation of a surgical robot |
US8285011B2 (en) * | 2005-06-02 | 2012-10-09 | M2S | Anatomical visualization and measurement system |
JP5020945B2 (en) * | 2005-06-06 | 2012-09-05 | ボード・オブ・リージエンツ,ザ・ユニバーシテイ・オブ・テキサス・システム | OCT with spectrally resolved bandwidth |
WO2007030173A1 (en) | 2005-06-06 | 2007-03-15 | Intuitive Surgical, Inc. | Laparoscopic ultrasound robotic surgical system |
US8398541B2 (en) | 2006-06-06 | 2013-03-19 | Intuitive Surgical Operations, Inc. | Interactive user interfaces for robotic minimally invasive surgical systems |
US11259870B2 (en) | 2005-06-06 | 2022-03-01 | Intuitive Surgical Operations, Inc. | Interactive user interfaces for minimally invasive telesurgical systems |
US9289267B2 (en) * | 2005-06-14 | 2016-03-22 | Siemens Medical Solutions Usa, Inc. | Method and apparatus for minimally invasive surgery using endoscopes |
US8125648B2 (en) | 2006-06-05 | 2012-02-28 | Board Of Regents, The University Of Texas System | Polarization-sensitive spectral interferometry |
JP4153963B2 (en) * | 2006-06-12 | 2008-09-24 | オリンパスメディカルシステムズ株式会社 | Endoscope insertion shape detection device |
US8083667B2 (en) | 2006-06-13 | 2011-12-27 | Intuitive Surgical Operations, Inc. | Side looking minimally invasive surgery instrument assembly |
US9718190B2 (en) | 2006-06-29 | 2017-08-01 | Intuitive Surgical Operations, Inc. | Tool position and identification indicator displayed in a boundary area of a computer display screen |
US20090192523A1 (en) | 2006-06-29 | 2009-07-30 | Intuitive Surgical, Inc. | Synthetic representation of a surgical instrument |
US10008017B2 (en) | 2006-06-29 | 2018-06-26 | Intuitive Surgical Operations, Inc. | Rendering tool information as graphic overlays on displayed images of tools |
US10258425B2 (en) | 2008-06-27 | 2019-04-16 | Intuitive Surgical Operations, Inc. | Medical robotic system providing an auxiliary view of articulatable instruments extending out of a distal end of an entry guide |
WO2008004222A2 (en) * | 2006-07-03 | 2008-01-10 | Yissum Research Development Company Of The Hebrew University Of Jerusalem | Computer image-aided method and system for guiding instruments through hollow cavities |
JP4891006B2 (en) * | 2006-09-06 | 2012-03-07 | オリンパス株式会社 | Endoscope device |
US9138129B2 (en) | 2007-06-13 | 2015-09-22 | Intuitive Surgical Operations, Inc. | Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide |
US9089256B2 (en) | 2008-06-27 | 2015-07-28 | Intuitive Surgical Operations, Inc. | Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide |
US9084623B2 (en) | 2009-08-15 | 2015-07-21 | Intuitive Surgical Operations, Inc. | Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide |
US8620473B2 (en) | 2007-06-13 | 2013-12-31 | Intuitive Surgical Operations, Inc. | Medical robotic system with coupled control modes |
US9469034B2 (en) | 2007-06-13 | 2016-10-18 | Intuitive Surgical Operations, Inc. | Method and system for switching modes of a robotic system |
US8926511B2 (en) * | 2008-02-29 | 2015-01-06 | Biosense Webster, Inc. | Location system with virtual touch screen |
US8185354B2 (en) * | 2008-05-19 | 2012-05-22 | The Procter & Gamble Company | Method of determining the dynamic location of a protection device |
US8260578B2 (en) * | 2008-05-19 | 2012-09-04 | The Procter & Gamble Company | Method of determining the dynamic location of a protection |
US8864652B2 (en) | 2008-06-27 | 2014-10-21 | Intuitive Surgical Operations, Inc. | Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the positioning and orienting of its tip |
US12239396B2 (en) | 2008-06-27 | 2025-03-04 | Intuitive Surgical Operations, Inc. | Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide |
US10004387B2 (en) | 2009-03-26 | 2018-06-26 | Intuitive Surgical Operations, Inc. | Method and system for assisting an operator in endoscopic navigation |
US8337397B2 (en) * | 2009-03-26 | 2012-12-25 | Intuitive Surgical Operations, Inc. | Method and system for providing visual guidance to an operator for steering a tip of an endoscopic device toward one or more landmarks in a patient |
US12266040B2 (en) | 2009-03-31 | 2025-04-01 | Intuitive Surgical Operations, Inc. | Rendering tool information as graphic overlays on displayed images of tools |
US8918211B2 (en) | 2010-02-12 | 2014-12-23 | Intuitive Surgical Operations, Inc. | Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument |
US9492927B2 (en) | 2009-08-15 | 2016-11-15 | Intuitive Surgical Operations, Inc. | Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose |
WO2011094518A2 (en) * | 2010-01-28 | 2011-08-04 | The Penn State Research Foundation | Image-based global registration system and method applicable to bronchoscopy guidance |
US8683387B2 (en) | 2010-03-03 | 2014-03-25 | Cast Group Of Companies Inc. | System and method for visualizing virtual objects on a mobile device |
WO2012003861A1 (en) * | 2010-07-06 | 2012-01-12 | Brainlab Ag | Intra-operative image presentation adapted to viewing direction |
JP5628927B2 (en) | 2010-08-31 | 2014-11-19 | 富士フイルム株式会社 | MEDICAL INFORMATION DISPLAY DEVICE AND METHOD, AND PROGRAM |
US10507066B2 (en) | 2013-02-15 | 2019-12-17 | Intuitive Surgical Operations, Inc. | Providing information of tools by filtering image areas adjacent to or on displayed images of the tools |
TWI518368B (en) * | 2013-09-11 | 2016-01-21 | 財團法人工業技術研究院 | Virtual image display apparatus |
WO2015121764A1 (en) * | 2014-02-11 | 2015-08-20 | Koninklijke Philips N.V. | Spatial visualization of internal mammary artery during minimally invasive bypass surgery |
EP3136943A4 (en) * | 2014-05-01 | 2017-12-27 | EndoChoice, Inc. | System and method of scanning a body cavity using a multiple viewing elements endoscope |
US9950194B2 (en) | 2014-09-09 | 2018-04-24 | Mevion Medical Systems, Inc. | Patient positioning system |
DE102017103198A1 (en) * | 2017-02-16 | 2018-08-16 | avateramedical GmBH | Device for determining and retrieving a reference point during a surgical procedure |
CN109345585B (en) * | 2018-10-26 | 2021-11-30 | 强联智创(北京)科技有限公司 | Method and system for measuring morphological parameters of intracranial aneurysm image |
US11684251B2 (en) * | 2019-03-01 | 2023-06-27 | Covidien Ag | Multifunctional visualization instrument with orientation control |
CN110169820A (en) * | 2019-04-24 | 2019-08-27 | 艾瑞迈迪科技石家庄有限公司 | A kind of joint replacement surgery pose scaling method and device |
EP4384985A1 (en) * | 2021-08-10 | 2024-06-19 | Intuitive Surgical Operations, Inc. | Systems and methods for depth-based measurement in a three-dimensional view |
Family Cites Families (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5493595A (en) | 1982-02-24 | 1996-02-20 | Schoolman Scientific Corp. | Stereoscopically displayed three dimensional medical imaging |
JPH0681275B2 (en) | 1985-04-03 | 1994-10-12 | ソニー株式会社 | Image converter |
US4729098A (en) | 1985-06-05 | 1988-03-01 | General Electric Company | System and method employing nonlinear interpolation for the display of surface structures contained within the interior region of a solid body |
US4985855A (en) | 1987-08-24 | 1991-01-15 | International Business Machines Corp. | Method for producing installation instructions for three dimensional assemblies |
US4945478A (en) | 1987-11-06 | 1990-07-31 | Center For Innovative Technology | Noninvasive medical imaging system and method for the identification and 3-D display of atherosclerosis and the like |
US4882679A (en) | 1987-11-27 | 1989-11-21 | Picker International, Inc. | System to reformat images for three-dimensional display |
US5448687A (en) | 1988-09-13 | 1995-09-05 | Computer Design, Inc. | Computer-assisted design system for flattening a three-dimensional surface and for wrapping a flat shape to a three-dimensional surface |
US5099846A (en) | 1988-12-23 | 1992-03-31 | Hardy Tyrone L | Method and apparatus for video presentation from a variety of scanner imaging sources |
US5005559A (en) | 1989-07-27 | 1991-04-09 | Massachusetts Institute Of Technology | Video-graphic arthroscopy system |
US5255352A (en) | 1989-08-03 | 1993-10-19 | Computer Design, Inc. | Mapping of two-dimensional surface detail on three-dimensional surfaces |
US5151856A (en) | 1989-08-30 | 1992-09-29 | Technion R & D Found. Ltd. | Method of displaying coronary function |
JP2845995B2 (en) | 1989-10-27 | 1999-01-13 | 株式会社日立製作所 | Region extraction method |
US5179638A (en) | 1990-04-26 | 1993-01-12 | Honeywell Inc. | Method and apparatus for generating a texture mapped perspective view |
JPH0447479A (en) | 1990-06-13 | 1992-02-17 | Toshiba Corp | Picture display device |
US5231483A (en) | 1990-09-05 | 1993-07-27 | Visionary Products, Inc. | Smart tracking system |
DE69132412T2 (en) | 1990-10-19 | 2001-03-01 | St. Louis University, St. Louis | LOCALIZATION SYSTEM FOR A SURGICAL PROBE FOR USE ON THE HEAD |
JPH0789382B2 (en) | 1991-03-14 | 1995-09-27 | インターナショナル・ビジネス・マシーンズ・コーポレイション | Method and apparatus for generating shape model |
US5291889A (en) | 1991-05-23 | 1994-03-08 | Vanguard Imaging Ltd. | Apparatus and method for spatially positioning images |
US5537638A (en) | 1991-10-25 | 1996-07-16 | Hitachi, Ltd. | Method and system for image mapping |
JP2959249B2 (en) | 1991-11-15 | 1999-10-06 | ソニー株式会社 | Video effect device |
US5274551A (en) | 1991-11-29 | 1993-12-28 | General Electric Company | Method and apparatus for real-time navigation assist in interventional radiological procedures |
US5230623A (en) | 1991-12-10 | 1993-07-27 | Radionics, Inc. | Operating pointer with interactive computergraphics |
JP3117097B2 (en) | 1992-01-28 | 2000-12-11 | ソニー株式会社 | Image conversion device |
EP0574111B1 (en) | 1992-04-24 | 1997-10-01 | Sony United Kingdom Limited | Lighting effects for digital video effects system |
US5329310A (en) | 1992-06-30 | 1994-07-12 | The Walt Disney Company | Method and apparatus for controlling distortion of a projected image |
FR2694881B1 (en) | 1992-07-31 | 1996-09-06 | Univ Joseph Fourier | METHOD FOR DETERMINING THE POSITION OF AN ORGAN. |
US5304806A (en) | 1992-11-25 | 1994-04-19 | Adac Laboratories | Apparatus and method for automatic tracking of a zoomed scan area in a medical camera system |
US5511153A (en) | 1994-01-18 | 1996-04-23 | Massachusetts Institute Of Technology | Method and apparatus for three-dimensional, textured models from plural video images |
US5531227A (en) | 1994-01-28 | 1996-07-02 | Schneider Medical Technologies, Inc. | Imaging device and method |
- 2002-03-21 US US10/104,082 patent/US6702736B2/en not_active Expired - Lifetime
- 2004-03-09 US US10/823,811 patent/US7144367B2/en not_active Expired - Lifetime
- 2006-12-05 US US11/634,357 patent/US20070149846A1/en not_active Abandoned
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4722056A (en) * | 1986-02-18 | 1988-01-26 | Trustees Of Dartmouth College | Reference display systems for superimposing a tomagraphic image onto the focal plane of an operating microscope |
US4922909A (en) * | 1987-07-17 | 1990-05-08 | Little James H | Video monitoring and reapposition monitoring apparatus and methods |
US4989083A (en) * | 1989-06-29 | 1991-01-29 | Olympus Optical Co., Ltd. | Method of inspecting objects by image pickup means |
US5153721A (en) * | 1990-06-04 | 1992-10-06 | Olympus Optical Co., Ltd. | Method and apparatus for measuring an object by correlating displaced and simulated object images |
US5558619A (en) * | 1991-04-23 | 1996-09-24 | Olympus Optical Co., Ltd. | Endoscope system with automatic control according to movement of an operator |
US5261404A (en) * | 1991-07-08 | 1993-11-16 | Mick Peter R | Three-dimensional mammal anatomy imaging system and method |
US5417210A (en) * | 1992-05-27 | 1995-05-23 | International Business Machines Corporation | System and method for augmentation of endoscopic surgery |
US5704897A (en) * | 1992-07-31 | 1998-01-06 | Truppe; Michael J. | Apparatus and method for registration of points of a data field with respective points of an optical image |
US5526812A (en) * | 1993-06-21 | 1996-06-18 | General Electric Company | Display system for enhancing visualization of body structures during medical procedures |
US5526814A (en) * | 1993-11-09 | 1996-06-18 | General Electric Company | Automatically positioned focussed energy system guided by medical imaging |
US5491510A (en) * | 1993-12-03 | 1996-02-13 | Texas Instruments Incorporated | System and method for simultaneously viewing a scene and an obscured object |
US5873822A (en) * | 1994-09-15 | 1999-02-23 | Visualization Technology, Inc. | Automatic registration system for use with position tracking and imaging system for use in medical applications |
US5765561A (en) * | 1994-10-07 | 1998-06-16 | Medical Media Systems | Video-based surgical targeting system |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080094506A1 (en) * | 1998-11-09 | 2008-04-24 | Macinnis Alexander G | Graphics display system with anti-flutter filtering and vertical scaling feature |
US7554553B2 (en) * | 1998-11-09 | 2009-06-30 | Broadcom Corporation | Graphics display system with anti-flutter filtering and vertical scaling feature |
US7837616B2 (en) * | 2004-02-16 | 2010-11-23 | Olympus Corporation | Endoscope,system, and method for detecting relative position of endoscope and markers placed at target area |
US20060276686A1 (en) * | 2004-02-16 | 2006-12-07 | Olympus Corporation | Endoscope and endoscope system |
US20090259102A1 (en) * | 2006-07-10 | 2009-10-15 | Philippe Koninckx | Endoscopic vision system |
US8911358B2 (en) * | 2006-07-10 | 2014-12-16 | Katholieke Universiteit Leuven | Endoscopic vision system |
US20090023993A1 (en) * | 2007-07-18 | 2009-01-22 | Tal Davidson | System and method for combined display of medical devices |
US20110251454A1 (en) * | 2008-11-21 | 2011-10-13 | Mayo Foundation For Medical Education And Research | Colonoscopy Tracking and Evaluation System |
US20110166418A1 (en) * | 2010-01-07 | 2011-07-07 | Kabushiki Kaisha Toshiba | Medical image processing system and a method for processing a medical image |
JP2011139797A (en) * | 2010-01-07 | 2011-07-21 | Toshiba Corp | Medical image processor and medical image processing program |
US9078568B2 (en) * | 2010-01-07 | 2015-07-14 | Kabushiki Kaisha Toshiba | Medical image processing system and a method for processing a medical image |
US8768019B2 (en) * | 2011-02-03 | 2014-07-01 | Medtronic, Inc. | Display of an acquired cine loop for procedure navigation |
US20120201432A1 (en) * | 2011-02-03 | 2012-08-09 | Medtronic, Inc. | Display of an Acquired Cine Loop for Procedure Navigation |
US20120330129A1 (en) * | 2011-06-23 | 2012-12-27 | Richard Awdeh | Medical visualization systems and related methods of use |
US9460536B2 (en) * | 2013-03-06 | 2016-10-04 | Olympus Corporation | Endoscope system and method for operating endoscope system that display an organ model image to which an endoscopic image is pasted |
US20150025316A1 (en) * | 2013-03-06 | 2015-01-22 | Olympus Medical Systems Corp. | Endoscope system and method for operating endoscope system |
US20150305600A1 (en) * | 2013-03-19 | 2015-10-29 | Olympus Corporation | Endoscope system |
US9521944B2 (en) * | 2013-03-19 | 2016-12-20 | Olympus Corporation | Endoscope system for displaying an organ model image to which an endoscope image is pasted |
US9538907B2 (en) * | 2013-04-12 | 2017-01-10 | Olympus Corporation | Endoscope system and actuation method for displaying an organ model image pasted with an endoscopic image |
US20160000307A1 (en) * | 2013-04-12 | 2016-01-07 | Olympus Corporation | Endoscope system and actuation method for endoscope system |
US20160278612A1 (en) * | 2013-09-27 | 2016-09-29 | Olympus Corporation | Endoscope system |
US10413157B2 (en) * | 2013-09-27 | 2019-09-17 | Olympus Corporation | Endoscope system with image pasting on planar model |
US20160073927A1 (en) * | 2013-10-02 | 2016-03-17 | Olympus Corporation | Endoscope system |
US9662042B2 (en) * | 2013-10-02 | 2017-05-30 | Olympus Corporation | Endoscope system for presenting three-dimensional model image with insertion form image and image pickup image |
DE102017215074A1 (en) | 2017-08-29 | 2019-02-28 | Siemens Healthcare Gmbh | Method and selection unit for selecting a virtual object or a picture parameter value |
Also Published As
Publication number | Publication date |
---|---|
US20030018235A1 (en) | 2003-01-23 |
US6702736B2 (en) | 2004-03-09 |
US7144367B2 (en) | 2006-12-05 |
US20040193006A1 (en) | 2004-09-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6702736B2 (en) | Anatomical visualization system | |
US6612980B2 (en) | Anatomical visualization system | |
WO1997003601A9 (en) | Anatomical visualization system | |
US11864850B2 (en) | Path-based navigation of tubular networks | |
Bichlmeier et al. | The virtual mirror: a new interaction paradigm for augmented reality environments | |
US6690960B2 (en) | Video-based surgical targeting system | |
US8116847B2 (en) | System and method for determining an optimal surgical trajectory | |
US7945310B2 (en) | Surgical instrument path computation and display for endoluminal surgery | |
CN103108602B (en) | From the robot controlling of vascular tree image endoscope | |
KR100439756B1 (en) | Apparatus and method for displaying virtual endoscopy diaplay | |
US8248413B2 (en) | Visual navigation system for endoscopic surgery | |
US5820559A (en) | Computerized boundary estimation in medical images | |
US20170215971A1 (en) | System and method for 3-d tracking of surgical instrument in relation to patient body | |
US20070161854A1 (en) | System and method for endoscopic measurement and mapping of internal organs, tumors and other objects | |
US20120062714A1 (en) | Real-time scope tracking and branch labeling without electro-magnetic tracking and pre-operative scan roadmaps | |
US20080071143A1 (en) | Multi-dimensional navigation of endoscopic video | |
US20060036162A1 (en) | Method and apparatus for guiding a medical instrument to a subsurface target site in a patient | |
US20080071141A1 (en) | Method and apparatus for measuring attributes of an anatomical feature during a medical procedure | |
WO2008004222A2 (en) | Computer image-aided method and system for guiding instruments through hollow cavities | |
CN111281534B (en) | System and method for generating a three-dimensional model of a surgical site | |
JP2002253480A (en) | Device for assisting medical treatment | |
Bockholt et al. | Augmented reality for enhancement of endoscopic interventions | |
JPH08280710A (en) | Real time medical device, and method to support operator to perform medical procedure on patient |
JP2004000551A (en) | Endoscope shape detecting device | |
CN118475309A (en) | 3D model reconstruction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |