US20160128783A1 - Surgical navigation system with one or more body borne components and method therefor - Google Patents
- Publication number
- US20160128783A1 (U.S. application Ser. No. 14/926,963)
- Authority
- US
- United States
- Prior art keywords
- sensor
- target
- icu
- display unit
- surgical site
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A61B19/5244—
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B17/16—Instruments for performing osteoclasis; Drills or chisels for bones; Trepans
- A61B17/17—Guides or aligning means for drills, mills, pins or wires
- A61B17/1703—Guides or aligning means for drills, mills, pins or wires using imaging means, e.g. by X-rays
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B46/00—Surgical drapes
- A61B46/10—Surgical drapes specially adapted for instruments, e.g. microscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6846—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be brought in contact with an internal body part, i.e. invasive
- A61B5/6847—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be brought in contact with an internal body part, i.e. invasive mounted on an invasive device
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/30—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
- G06T7/337—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00207—Electrical control of surgical instruments with hand gesture control or hand gesture recognition
-
- A61B2019/262—
-
- A61B2019/5255—
-
- A61B2019/5272—
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2048—Tracking techniques using an accelerometer or inertia sensor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
- A61B2034/2057—Details of tracking cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2068—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2072—Reference field transducer attached to an instrument or patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/061—Measuring instruments not otherwise provided for for measuring dimensions, e.g. length
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/064—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/067—Measuring instruments not otherwise provided for for measuring angles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/373—Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3937—Visible markers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3983—Reference marker arrangements for use with image guided surgery
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30008—Bone
Definitions
- the present application relates to computer-assisted surgery and surgical navigation systems where one or more targets and the one or more objects to which the targets are attached are tracked by an optical sensor, such as one borne by the body of the user (e.g. hand, head, etc.).
- the present application further relates to gestural control for surgical navigation systems.
- the field of computer-assisted surgery creates systems and devices to provide a surgeon with positional measurements of objects in space to allow the surgeon to operate more precisely and accurately.
- Existing surgical navigation systems utilize binocular cameras as optical sensors to detect targets attached to objects within a working volume.
- the binocular cameras are part of large and expensive medical equipment systems.
- the cameras are affixed to medical carts with various computer systems, monitors, etc.
- the binocular-based navigation systems are located outside a surgical sterile field, and can localize (i.e. measure the pose of) targets within the sterile field.
- There are several limitations to existing binocular-based navigation systems including line-of-sight disruptions between the cameras and the objects, ability to control computer navigation software, cost, and complexity.
- a system for performing a navigated surgery.
- the system comprises a first target attached to a patient at a surgical site and a second target at the surgical site.
- An optical sensor is coupled to the user and detects the first target and second target simultaneously in a working volume of the sensor.
- An intra-operative computing unit receives sensor data concerning the first target and second target, calculates a relative pose and provides display information to a display unit.
- the sensor can be handheld, body-mounted or head-mounted, and communicate wirelessly with the ICU.
- the sensor may also be mountable on a fixed structure (e.g. proximate thereto) with the working volume in alignment with the surgical site.
- the ICU may receive user input via the sensor, where the user input is at least one of sensor motions, voice commands, and gestures presented to the optical sensor by the user.
- the display information may be presented via a heads-up or other display unit.
- system for performing a navigated surgery at a surgical site of a patient
- the system comprises: a first target configured to be attached to the patient at the surgical site; a second target at the surgical site; a sensor configured to be coupled to a user, the sensor comprising an optical sensor configured to detect the first target and second target simultaneously; and an intra-operative computing unit (ICU) configured to: receive, from the sensor, sensor data concerning the first target and second target; calculate the relative pose between the first target and second target; and based on the relative pose, provide display information to a display unit.
- ICU intra-operative computing unit
- the second target may be a static reference target.
- the second target may be attached to one of: a surgical instrument; a bone cutting guide; and a bone.
- the sensor may be configured to be at least one of: handheld; body-mounted; and head-mounted.
- a sensor working volume for the sensor may be in alignment with a field of view of the user.
- the sensor may communicate wirelessly with the ICU.
- the sensor may be communicatively connected by wire to a sensor control unit and the sensor control unit is configured to wirelessly communicate with the ICU.
- the sensor may be further configured to be mountable on a fixed structure.
- the ICU may be further configured to present, via the display unit, where the targets are with respect to the sensor field of view.
- the ICU may be further configured to present, via the display unit, an optical sensor video feed from the optical sensor.
- the ICU may be further configured to receive user input via the sensor by at least one of receiving motions of the sensor, where the sensor has additional sensing capabilities to sense motions; receiving voice commands, where the sensor further comprises a microphone; and receiving gestures presented to the optical sensor by the user, the gestures being associated with specific commands.
- the system may further comprise a display unit wherein the display unit is further configured to be positionable within a field of view of the user while the optical sensor is detecting the first target and second target.
- the display unit may be a surgeon-worn heads up display.
- the method comprises receiving, by at least one processor of an intra-operative computing unit (ICU), sensor data from a sensor, where the sensor data comprises information for calculating the relative pose of a first target and a second target, wherein the sensor is coupled to a user and comprises an optical sensor configured to detect the first target and second target simultaneously, and wherein the first target is attached to the patient at the surgical site and the second target is located at the surgical site; calculating, by the at least one processor, the relative pose between the first target and second target; and based on the relative pose, providing, by the at least one processor, display information to a display unit.
- ICU intra-operative computing unit
- a sensor working volume of the sensor may be in alignment with a field of view of the user and the first target and second target are in the sensor working volume.
- the method may further comprise receiving, by the at least one processor further sensor data from the sensor, wherein the sensor is attached to a fixed structure such that the sensor volume is aligned with the surgical site when the sensor is attached.
- the method may further comprise, receiving, by the at least one processor, user input from the sensor for invoking the at least one processor to perform an activity of the navigated surgery.
- the user input may comprise a gesture sensed by the sensor.
- a method comprising: aiming an optical sensor, held by the hand, at a surgical site having two targets at the site and within a working volume of the sensor, one of the two targets attached to patient anatomy, wherein the optical sensor is in communication with a processing unit configured to determine a relative position of the two targets and provide display information, via a display unit, pertaining to the relative position; and receiving the display information via the display unit.
- This method may further comprise providing to the processing unit user input via the sensor for invoking the processing unit to perform an activity of the navigated surgery.
- FIG. 1 illustrates use of a surgical navigation system in Total Knee Arthroplasty (TKA);
- FIG. 2 shows in a block diagram an interaction between various components of a surgical navigation system
- FIG. 3 shows use of a sensor with multiple targets in accordance with an embodiment
- FIG. 4 a shows use of a sensor in a handheld mode as an example for clarity
- FIG. 4 b shows use of a sensor in a non-handheld mode as an example for clarity
- FIG. 5 illustrates, as an example for clarity, use of the system where a center of the working volume of a sensor attached to a surgeon is substantially centered with respect to a site of surgery;
- FIG. 6 shows a gesture of a hand sensed by a sensor
- FIG. 7 illustrates an image, as seen by a sensor when configured to sense a gesture of a hand
- FIG. 8 illustrates motions of a body detected by a sensor attached to the body in accordance with an embodiment
- FIG. 9 shows a sensor communicating wirelessly with an intra-operative computing unit (ICU);
- ICU intra-operative computing unit
- FIG. 10 shows a sensor communicating with an ICU through a sensor control unit (SCU);
- SCU sensor control unit
- FIG. 11 shows a sensor attached to a surgeon's head such that the working volume of the sensor is within the surgeon's field of view;
- FIG. 12 shows a sensor integrated with heads-up display glasses
- FIG. 13 shows heads-up display glasses (with integrated sensor) connected by a cable to a SCU mounted on a surgeon;
- FIG. 14 shows a use of a static reference target as an example for clarity
- FIG. 15 shows a display of an ICU depicting a video feed as seen by an optical sensor, when viewing the targets.
- Field of View (FOV): The angular span (horizontally and vertically) that an optical sensor (e.g. camera) is able to view.
- DOF: Degrees of Freedom.
- Pose: The position and orientation of an object in up to 6 DOF in space.
- Working Volume: A 3D volume relative to an optical sensor within which valid poses may be generated. The Working Volume is a subset of the optical sensor's field of view and is configurable depending on the type of system that uses the optical sensor.
- TKA: Total Knee Arthroplasty.
- a system 100 is used for navigation in TKA.
- Targets 102 are affixed to a patient's anatomy (i.e. a femur bone 104 and a tibia bone 106 that are part of a knee joint) and located within a field of view 108 of a sensor 110 that is held by a hand 112 .
- the sensor 110 is connected to an intra-operative computing unit (ICU) by a cable 113 .
- ICU intra-operative computing unit
- targets 102 are affixed to surgical instruments (in this case, a distal femoral cutting guide 114 ).
- a relative pose between the femur 104 and tibia 106 may allow a surgeon to track the kinematics of a knee joint during a range-of-motion test.
- relative pose between the distal femur cutting guide 114 and the femur 104 may allow a surgeon to precisely align the cutting guide 114 with respect to the femur 104 to achieve desired cut angles.
- the patient rests on an operating table 116. Applicant's U.S. Pat. No. 9,138,319 B2, issued Sep. 22, 2015 and entitled "Method and System for Aligning a Prosthesis During Surgery", describes operations and methods to register components (such as an optical sensor) to a patient's bone, measure relative poses and track objects in a Total Hip Arthroplasty (THA) scenario, the contents of which are incorporated herein by reference.
- THA: Total Hip Arthroplasty
- the architecture of a surgical navigation system is shown in FIG. 2 .
- Multiple targets are affixed to objects (anatomy, instruments, etc.) within a sterile field.
- the targets provide optical positional signals, either through emitting or reflecting optical energy.
- the positional signals are received by a sensor, comprising an optical sensor.
- the optical sensor preferably includes a monocular camera (comprising a lens, an imager etc.), illuminating components, and various optics as applicable (e.g. IR filter, if the optical sensor operates in the IR spectrum).
- the optical sensor is configured to detect a target.
- the optical sensor is configured and focused to function over a relatively large working volume. This is unlike endoscopic applications, where an endoscope is configured to view a scene inside a body cavity and the scene is very close (e.g. <5 cm) to the endoscope optics. Furthermore, endoscopes are used to visualize tissue, whereas the present optical sensor is used to measure relative pose between targets and objects.
- the sensor may also include other sensing components.
- positional sensing components such as accelerometers, gyroscopes, magnetometers, IR detectors, etc. may be used to supplement, augment or enhance the positional measurements obtained from the optical sensor.
- the sensor may be capable of receiving user input, for example, through buttons, visible gestures, motions (e.g. determined via an accelerometer), and microphones to receive voice commands etc.
- the sensor may include indicators that signal the state of the sensor, for example, green LED to signify sensor is on, red LED to signify error.
- the sensor may be in communication with a sensor control unit (SCU).
- SCU sensor control unit
- the SCU is an intermediate device to facilitate communication between the sensor and an intra-operative computing unit (ICU).
- ICU intra-operative computing unit
- the ICU may comprise a laptop, workstation, or other computing device having at least one processing unit and at least one storage device, such as memory storing software (instructions and/or data) as further described herein to configure the execution of the intra-operative computing unit.
- the ICU receives positional data from the sensor via the SCU, processes the data and computes the pose of targets that are within the working volume of the sensor.
- the ICU also performs any further processing associated with the other sensing and/or user input components.
- the ICU may further process the measurements to express them in clinically-relevant terms (e.g. according to anatomical registration).
- the ICU may further implement a user interface to guide a user through a surgical workflow.
- the ICU may further provide the user interface, including measurements, to a display unit for displaying to a surgeon or other user.
- a sensor is configured for handheld use.
- the sensor 110 may localize two or more targets 102 , in order to compute a relative pose between the targets 102 .
- the targets 102 are attached to objects 302 and the relative pose between the objects 302 can also be calculated.
- the objects could be the femur 104 and the tibia 106 , or the femur 104 and the cutting guide 114 .
- both handheld and non-handheld modes may be supported for use of the sensor within the same surgery. This is illustrated in FIG. 4 a and FIG. 4 b.
- a releasable mechanical connection 404 such as the one described in U.S. 20140275940 titled “System and method for intra-operative leg position measurement”, the entire contents of which are incorporated herein.
- the sensor 110 can be removed from its fixed structure 402 if the malleolus 406 is outside the working volume 408 of the optical sensor in the sensor 110 .
- the sensor 110 can be held in a user's hand 112 to bring the malleolus 406 within its working volume 408 , as shown in FIG. 4 b.
- This system configuration has several advantages. This system does not have a wired connection to objects being tracked. This system further allows the sensor to be used for pose measurements of the targets at specific steps in the workflow, and set aside when not in use. When a target is outside of or obstructed from the working volume of the sensor, while the sensor is attached to a static/fixed position, the position of the sensor can be moved, using the sensor in handheld mode, to include the target in its working volume.
- the sensor may comprise user input components (e.g. buttons). This user input may be a part of the surgical workflow, and may be communicated to the ICU.
- indicators on the sensor could provide feedback to the surgeon (e.g. status LEDs indicating that a target is being detected by the sensor). Also, in handheld operation, it may be preferable for a surgical assistant to hold the sensor to align the sensor's working volume with the targets in the surgical site, while the surgeon is performing another task.
- the sensor 110 may be mounted onto a surgeon 502 (or other user) using a sensor mounting structure 504 .
- the sensor mounting structure 504 may allow the sensor to be attached to a surgeon's forehead (by way of example, shown as a headband).
- the sensor 110 is preferably placed on the sensor mounting structure 504 such that the working volume 408 of the sensor 110 and the surgeon's visual field of view 506 are substantially aligned, i.e. the majority of the working volume 408 overlaps with the field of view 108 of the sensor 110 .
- in such a configuration, if the surgeon 502 can see the targets 102 , the sensor 110 (while mounted on the sensor mounting structure 504 ) will likely be able to do so as well.
- the optical sensor in the sensor 110 has a working volume 108 that is approximately centered about the nominal distance between a surgeon's forehead (or other mounting location) and the surgical site.
- the working volume 108 of the sensor 110 is preferably large enough to accommodate a wide range of feasible relative positions between the mounting position on the surgeon and the surgical site.
- the sensor in the surgical navigation system is mounted on the surgeon's body.
- for reasons of sterility and ergonomics, it may not be feasible to use buttons on the sensor in order to interact with the intra-operative computing unit.
- Hand gestures may be sensed by the optical system and used as user input. For example, if the sensor can detect a user's hand in its field of view, the system comprising the sensor, SCU, ICU, and a display unit, may be able to identify predefined gestures that correspond to certain commands within the surgical workflow.
- waving a hand from left to right may correspond to advancing the surgical workflow on the display unit of the ICU; snapping fingers may correspond to saving a measurement that may also be displayed on the display unit of the ICU; waving a hand back and forth may correspond to cancelling an action within the workflow; etc.
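For illustration only (the patent does not disclose an implementation), the mapping from recognized gestures to workflow commands could be as simple as a lookup table. The gesture labels and the workflow method names below are hypothetical and are not taken from the disclosure.

```python
# Minimal sketch: dispatch recognized hand gestures to surgical-workflow commands.
# Gesture labels and workflow methods are hypothetical; an upstream recognizer
# (e.g. image processing on the optical sensor feed) would produce the labels.
GESTURE_COMMANDS = {
    "wave_left_to_right": "advance_workflow",   # advance to the next workflow step
    "snap_fingers": "save_measurement",         # save the currently displayed measurement
    "wave_back_and_forth": "cancel_action",     # cancel the current action
}

def dispatch_gesture(gesture_label, workflow):
    command = GESTURE_COMMANDS.get(gesture_label)
    if command == "advance_workflow":
        workflow.next_step()
    elif command == "save_measurement":
        workflow.save_current_measurement()
    elif command == "cancel_action":
        workflow.cancel()
    # Unrecognized gestures are ignored so incidental hand motion has no effect.
```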
- a user interacts with the system using a hand wave gesture within the field of view 108 of the sensor 110 .
- These gestures are recognized by the sensor and by image processing software executed by the ICU.
- the hand gestures When mounted on the surgeon's forehead, the hand gestures may be performed by another user within the surgery and need not necessarily be performed by the surgeon.
- using hand gestures as user input is not limited to a system where the sensor is placed on the surgeon's body. It is also possible to use gesture control in any system where the sensor is mounted on the operating table, on the patient, hand-held, etc.
- Referring to FIG. 7 , an image 702 as seen by the optical sensor (when configured to sense a gesture of a hand) is illustrated.
- the ICU may receive commands via voice controls (e.g. via a microphone in the sensor or in the ICU itself).
- body motions (e.g. head nods 802 or shakes 804 ) may be detected by additional sensing components (e.g. an embedded accelerometer). The body motions are sensed and interpreted to send signals to the ICU, which may communicate wirelessly with the sensor 110 .
- for example, nodding 802 may signal the surgical workflow to advance, while shaking 804 may signal the surgical workflow to go back one step or cancel an action, etc.
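As a hedged sketch of how such body motions might be classified, assuming the sensor streams angular-rate (gyroscope) samples (an assumption; the disclosure mentions an embedded accelerometer or similar component), a nod could be distinguished from a shake by comparing rotational energy about the pitch and yaw axes. The axis convention and threshold are illustrative only.

```python
import numpy as np

def classify_head_motion(gyro_samples, threshold=1.5):
    """Classify a short window of angular-rate samples as a nod, a shake, or neither.

    gyro_samples: array-like of shape (N, 3); columns assumed to be
    [pitch, yaw, roll] rates in rad/s. The threshold is an illustrative value.
    """
    gyro = np.asarray(gyro_samples, dtype=float)
    pitch_energy = float(np.sum(np.abs(gyro[:, 0])))
    yaw_energy = float(np.sum(np.abs(gyro[:, 1])))
    if pitch_energy > threshold and pitch_energy > yaw_energy:
        return "nod"    # e.g. advance the workflow
    if yaw_energy > threshold and yaw_energy > pitch_energy:
        return "shake"  # e.g. go back one step or cancel
    return None
```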
- the sensor When the sensor is mounted on a surgeon 502 , the sensor may be wireless to allow the surgeon 502 to move freely around the operating room.
- the sensor 110 communicates wirelessly 902 with the ICU 904 , the ICU configured to receive wireless signals 902 from the sensor.
- the sensor 110 is connected by wire to a Sensor Control Unit (SCU) 1002 .
- the SCU 1002 further communicates wirelessly 902 with the ICU 904 .
- the SCU 1002 contains a mounting structure 504 such that it may be mounted on the surgeon 502 .
- the sensor 110 may be mounted to the forehead of the surgeon 502 , and the SCU 1002 may be attached to the belt of the surgeon 502 under a surgical gown.
- the SCU 1002 may have a battery to power the sensor, as well as its own electronics.
- the SCU 1002 may have an integrated battery, and the SCU 1002 may be charged between surgeries, such as on a charging dock.
- any wireless protocol/technology may be used (e.g. WiFi, Bluetooth, ZigBee, etc).
- a sensor mounted on a surgeon's body need not necessarily be sterile.
- the surgeon's forehead is not sterile.
- a sensor may not be made from materials that can be sterilized, and by removing the sensor from the sterile field, there is no requirement to ensure that the sensor is sterile (e.g. through draping, autoclaving, etc.).
- the sensor 110 is attached to a mounting structure 504 to the forehead of the surgeon 502 , and the working volume 408 of the sensor 110 is substantially contained within the surgeon's field of view 506 .
- the sensor 110 is configured to detect targets 102 within its working volume 408 such that the relative pose between the targets 102 may be calculated by the ICU 904 .
- the ICU 904 may comprise a display unit.
- the display unit 1102 is also included in the surgeon's field of view 506 . This is advantageous where real-time measurements based on the pose of the targets 102 are provided to the surgeon 502 via the display unit 1102 .
- the targets 102 and the display unit 1102 are simultaneously visible to the surgeon 502 .
- the display unit 1102 may be positioned within the surgeon's field of view 506 , for example, by being mounted to a mobile cart.
- the display unit 1102 is a heads-up display configured to be worn by the surgeon 502 , and be integrated with the sensor 110 .
- the heads-up display is any transparent display that does not require the surgeon to look away from the usual field of view.
- the heads-up display may be projected onto a visor that surgeons typically wear during surgery.
- a head-mounted sensor allows the surgeon to ensure that the targets are within the working volume of the sensor without additionally trying to look at a static display unit within the operating room, and with a heads-up display, the display unit is visible to the surgeon regardless of where they are looking.
- the sensor 110 is integrated with heads-up display glasses 1202 (such as the Google Glass™ product from Google Inc.). Since the sensor 110 and the display unit 1102 are integrated into one device, there is no longer a need for a dedicated display unit, thus reducing cost and complexity.
- This embodiment is also advantageous since, by default, it places the sensor's working volume within the surgeon's field of view.
- the heads-up display glasses 1202 are connected by a cable to a SCU 1002 , the SCU 1002 attached to a surgeon 502 .
- the SCU 1002 may also comprise the ICU (that is, it may perform the intra-operative computations to generate display information to be displayed on the heads-up display glasses).
- the SCU may include a battery, e.g. a rechargeable battery.
- the SCU need not communicate wirelessly with any other device, since the entire computational/electronic system is provided by the SCU, heads-up display glasses and sensor. That is, the entire surgical localization system is body-worn (with the exception of the targets and their coupling means).
- the sensor In a body-mounted or handheld camera configuration, the sensor is not required to be stationary when capturing relative pose between multiple targets. However, some measurements may be required to be absolute. For example, when calculating the center of rotation (COR) of a hip joint, a fixed sensor may localize a target affixed to the femur as it is rotated about the hip COR. The target is constrained to move along a sphere, the sphere's center being the hip COR. Since the sensor is in a fixed location, and the hip COR is in a fixed location, the pose data from rotating the femur may be used to calculate the pose of the hip COR relative to the target on the femur. In TKA, the pose of the hip COR is useful for determining the mechanical axis of a patient's leg.
- COR center of rotation
- a second stationary target is used as a static reference to compensate for the possible motion of the sensor.
- a target 102 is affixed to a femur 104 that has a static hip COR 1401 .
- the sensor is not necessarily stationary, as it may be body-mounted or hand-held.
- a static reference target 1402 is introduced within the working volume 408 of the sensor 110 .
- the static reference target 1402 must fulfill two conditions: 1) it must remain stationary during steps executed to measure the hip COR 1401 , and 2) it must be within the working volume 408 of the sensor 110 simultaneously with the target 102 attached to the femur 104 while the hip COR 1401 is measured.
- the static reference target 1402 may be simply placed on the OR bed 116 or mounted to a fixed docking structure 402 .
- the femur is rotated about a pivot point of the hip joint while the sensor 110 captures poses of the target 102 and poses of the static reference target 1402 .
- the poses of the static reference target 1402 are used to compensate for any movement of the sensor 110 as it is used in hand-held or body-mounted mode.
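A minimal sketch of the compensation step, assuming the sensor reports a 4x4 homogeneous pose for each target at each instant: expressing the femur-target pose in the frame of the static reference target cancels the unknown (moving) sensor pose, so the compensated positions can then be fed to a COR fit such as the sphere fit above.

```python
import numpy as np

def compensate_sensor_motion(T_sensor_ref, T_sensor_femur):
    """Express the femur-target pose in the static reference target's frame.

    Both 4x4 poses are measured by the (possibly moving) sensor at the same
    instant, so the sensor's own pose cancels in the product.
    """
    return np.linalg.inv(T_sensor_ref) @ T_sensor_femur

# For a sequence of simultaneous measurements (hypothetical `frames` list of
# (T_sensor_ref, T_sensor_femur) pairs), the compensated femur positions are:
# positions = [compensate_sensor_motion(Tr, Tf)[:3, 3] for Tr, Tf in frames]
```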
- This embodiment is described relative to measuring hip COR in the context of TKA; however, this embodiment is meant to be an example for clarity. There are many other embodiments that will be evident to those skilled in the art (e.g. tool calibration, verification of the location of a tip of a tool, etc.).
- it may be advantageous to display the video feed, as seen by the optical sensor, to the surgeon via the display unit.
- This visual feedback may allow the surgeon to see what the optical sensor "sees", and may be useful in a) ensuring that the target(s) are within the working volume of the sensor, and b) diagnosing any occlusions, disturbances, disruptions, etc. that prevent the poses of the targets from being captured by the ICU.
- a persistent optical sensor image feed may be displayed. This is illustrated in FIG. 15 , in which a laptop (integrating the ICU 904 and display 1102 ) displays a video feed 1502 showing two targets within the image.
- Alternative graphics may be shown in addition to or instead of the raw video feed 1502 of the optical sensor, for example, simplified renderings of the targets (e.g. represented as dots).
- a virtual overlay may be displayed on the raw video feed 1502 of the optical sensor, in which the overlay includes colours and shapes overlaid on the target image.
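As a sketch of such an overlay (not the patent's implementation), the detected target positions could be projected through a pinhole camera model and drawn as dots on the raw frame; the intrinsic parameters below are placeholder values for illustration.

```python
import numpy as np
import cv2  # used only for drawing; any 2D drawing API would do

def overlay_targets(frame, target_points_3d, fx=600.0, fy=600.0, cx=320.0, cy=240.0):
    """Draw each detected target as a filled dot on the raw video frame.

    target_points_3d: iterable of (x, y, z) camera-frame positions in metres.
    fx, fy, cx, cy: placeholder pinhole intrinsics, not values from the patent.
    """
    for x, y, z in target_points_3d:
        if z <= 0:
            continue  # behind the camera; nothing to draw
        u = int(round(fx * x / z + cx))
        v = int(round(fy * y / z + cy))
        cv2.circle(frame, (u, v), 8, (0, 255, 0), -1)  # filled green dot
    return frame
```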
Abstract
Description
- This application claims priority to U.S. provisional application No. 62/072,041 titled “Systems, Methods and Devices for Anatomical Registration and Surgical Localization” and filed on Oct. 29, 2014, the entire contents of which are incorporated herein by reference.
- This application claims priority to U.S. provisional application No. 62/072,030 titled "Devices including a surgical navigation camera and systems and methods for surgical navigation" and filed on Oct. 29, 2014, the entire contents of which are incorporated herein by reference.
- This application claims priority to U.S. provisional application No. 62/084,891 titled "Devices, systems and methods for natural feature tracking of surgical tools and other objects" and filed on Nov. 26, 2014, the entire contents of which are incorporated herein by reference.
- This application claims priority to U.S. provisional application No. 62/072,032 titled “Devices, systems and methods for reamer guidance and cup sealing ” and filed on Oct. 29, 2014, the entire contents of which are incorporated herein by reference.
- The present application relates to computer-assisted surgery and surgical navigation systems where one or more targets and the one or more objects to which the targets are attached are tracked by an optical sensor, such as one borne by the body of the user (e.g. hand, head, etc.). The present application further relates to gestural control for surgical navigation systems.
- The field of computer-assisted surgery (or “computer navigation”) creates systems and devices to provide a surgeon with positional measurements of objects in space to allow the surgeon to operate more precisely and accurately. Existing surgical navigation systems utilize binocular cameras as optical sensors to detect targets attached to objects within a working volume. The binocular cameras are part of large and expensive medical equipment systems. The cameras are affixed to medical carts with various computer systems, monitors, etc. The binocular-based navigation systems are located outside a surgical sterile field, and can localize (i.e. measure the pose of) targets within the sterile field. There are several limitations to existing binocular-based navigation systems, including line-of-sight disruptions between the cameras and the objects, ability to control computer navigation software, cost, and complexity.
- In one aspect, a system is disclosed for performing a navigated surgery. The system comprises a first target attached to a patient at a surgical site and a second target at the surgical site. An optical sensor is coupled to the user and detects the first target and second target simultaneously in a working volume of the sensor. An intra-operative computing unit (ICU) receives sensor data concerning the first target and second target, calculates a relative pose and provides display information to a display unit. The sensor can be handheld, body-mounted or head-mounted, and communicate wirelessly with the ICU. The sensor may also be mountable on a fixed structure (e.g. proximate thereto) with the working volume in alignment with the surgical site. The ICU may receive user input via the sensor, where the user input is at least one of sensor motions, voice commands, and gestures presented to the optical sensor by the user. The display information may be presented via a heads-up or other display unit.
- There is provided a system for performing a navigated surgery at a surgical site of a patient where the system comprises: a first target configured to be attached to the patient at the surgical site; a second target at the surgical site; a sensor configured to be coupled to a user, the sensor comprising an optical sensor configured to detect the first target and second target simultaneously; and an intra-operative computing unit (ICU) configured to: receive, from the sensor, sensor data concerning the first target and second target; calculate the relative pose between the first target and second target; and based on the relative pose, provide display information to a display unit.
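The relative-pose computation itself is not detailed in the disclosure; a minimal sketch, assuming each target pose is reported as a 4x4 homogeneous transform in the sensor frame, is:

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def relative_pose(T_sensor_target1, T_sensor_target2):
    """Pose of the second target expressed in the first target's frame.

    Both targets are measured in the same (possibly hand-held or body-borne,
    hence moving) sensor frame, so the sensor pose cancels out of the result.
    """
    return np.linalg.inv(T_sensor_target1) @ T_sensor_target2

# Example: two identity-orientation targets 10 cm apart along the sensor x-axis.
T1 = make_pose(np.eye(3), [0.0, 0.0, 0.3])
T2 = make_pose(np.eye(3), [0.1, 0.0, 0.3])
print(relative_pose(T1, T2)[:3, 3])  # -> [0.1 0.  0. ]
```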
- The second target may be a static reference target. The second target may be attached to one of: a surgical instrument; a bone cutting guide; and a bone.
- The sensor may be configured to be at least one of: handheld; body-mounted; and head-mounted.
- A sensor working volume for the sensor may be in alignment with a field of view of the user.
- The sensor may communicate wirelessly with the ICU.
- The sensor may be communicatively connected by wire to a sensor control unit and the sensor control unit is configured to wirelessly communicate with the ICU.
- The sensor may be further configured to be mountable on a fixed structure.
- The ICU may be further configured to present, via the display unit, where the targets are with respect to the sensor field of view. The ICU may be further configured to present, via the display unit, an optical sensor video feed from the optical sensor. The ICU may be further configured to receive user input via the sensor by at least one of receiving motions of the sensor, where the sensor has additional sensing capabilities to sense motions; receiving voice commands, where the sensor further comprises a microphone; and receiving gestures presented to the optical sensor by the user, the gestures being associated with specific commands.
- The system may further comprise a display unit wherein the display unit is further configured to be positionable within a field of view of the user while the optical sensor is detecting the first target and second target. The display unit may be a surgeon-worn heads up display.
- There is provided a computer-implemented method for performing a navigated surgery at a surgical site of a patient. The method comprises receiving, by at least one processor of an intra-operative computing unit (ICU), sensor data from a sensor where the sensor data comprises information for calculating the relative pose of a first target and a second target, wherein the sensor is coupled to a user and comprises an optical sensor configured to detect the first target and second target simultaneously, and wherein the first target is attached to the patient at the surgical site and the second target is located at the surgical site; calculating, by the at least one processor, the relative pose between the first target and second target; and based on the relative pose, providing, by the at least one processor, display information to a display unit.
- In this method a sensor working volume of the sensor may be in alignment with a field of view of the user and the first target and second target are in the sensor working volume.
- The method may further comprise receiving, by the at least one processor further sensor data from the sensor, wherein the sensor is attached to a fixed structure such that the sensor volume is aligned with the surgical site when the sensor is attached.
- The method may further comprise, receiving, by the at least one processor, user input from the sensor for invoking the at least one processor to perform an activity of the navigated surgery. The user input may comprise a gesture sensed by the sensor.
- There is provided a method comprising: aiming an optical sensor, held by the hand, at a surgical site having two targets at the site and within a working volume of the sensor, one of the two targets attached to patient anatomy, wherein the optical sensor is in communication with a processing unit configured to determine a relative position of the two targets and provide display information, via a display unit, pertaining to the relative position; and receiving the display information via the display unit. This method may further comprise providing to the processing unit user input via the sensor for invoking the processing unit to perform an activity of the navigated surgery.
- Additional objects and advantages of the disclosed embodiments will be set forth in part in the description that follows, and in part will be obvious from the description, or may be learned by practice of the disclosed embodiments. The objects and advantages of the disclosed embodiments will be realized and attained by means of the elements and combinations particularly pointed out in the appended claims.
- It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosed embodiments as claimed.
- The accompanying drawings constitute a part of this specification. The drawings illustrate several embodiments of the present disclosure and, together with the description, serve to explain the principles of the disclosed embodiments as set forth in the accompanying claims.
- Embodiments disclosed herein will be more fully understood from the detailed description and the corresponding drawings, which form a part of this application, and in which:
-
FIG. 1 illustrates use of a surgical navigation system in Total Knee Arthroplasty (TKA); -
FIG. 2 shows in a block diagram an interaction between various components of a surgical navigation system; -
FIG. 3 shows use of a sensor with multiple targets in accordance with an embodiment; -
FIG. 4a shows use of a sensor in a handheld mode as an example for clarity; -
FIG. 4b shows use of a sensor in a non-handheld mode as an example for clarity; -
FIG. 5 illustrates, as an example for clarity, use of the system where a center of the working volume of a sensor attached to a surgeon is substantially centered with respect to a site of surgery; -
FIG. 6 shows a gesture of a hand sensed by a sensor; -
FIG. 7 illustrates an image, as seen by a sensor when configured to sense a gesture of a hand; -
FIG. 8 illustrates motions of a body detected by a sensor attached to the body in accordance with an embodiment; -
FIG. 9 shows a sensor communicating wirelessly with an intra-operative computing unit (ICU); -
FIG. 10 shows a sensor communicating with an ICU through a sensor control unit (SCU); -
FIG. 11 shows a sensor attached to a surgeon's head such that the working volume of the sensor is within the surgeon's field of view; -
FIG. 12 shows a sensor integrated with heads-up display glasses; -
FIG. 13 shows heads-up display glasses (with integrated sensor) connected by a cable to a SCU mounted on a surgeon; -
FIG. 14 shows a use of a static reference target as an example for clarity; and -
FIG. 15 shows a display of an ICU depicting a video feed as seen by an optical sensor, when viewing the targets. - It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity.
- Field of View (FOV): The angular span (horizontally and vertically) that an optical sensor (e.g. camera) is able to view.
- Degrees of Freedom (DOF): independent parameters used to describe the pose (position and orientation) of a rigid body. There are up to 6 DOF for a rigid body: 3 DOF for position (i.e. x, y, z position), and 3 DOF for orientation (e.g. roll, pitch, yaw).
- Pose: The position and orientation of an object in up to 6 DOF in space.
- Working Volume: A 3D volume relative to an optical sensor within which valid poses may be generated. The Working Volume is a subset of the optical sensor's field of view and is configurable depending on the type of system that uses the optical sensor.
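To make the Working Volume definition concrete, here is a hedged sketch that tests whether a sensor-frame point lies inside a working volume parameterized (an assumed parameterization, not one from the disclosure) by horizontal/vertical angular spans and a near/far range along the optical axis.

```python
import math

def in_working_volume(x, y, z, fov_h_deg=60.0, fov_v_deg=45.0, near=0.3, far=1.5):
    """Return True if the point (metres, sensor frame, +z along the optical axis)
    lies inside a working volume bounded by angular span and a distance range.
    All limits are illustrative defaults, not values from the patent."""
    if not (near <= z <= far):
        return False
    h_angle = math.degrees(math.atan2(abs(x), z))
    v_angle = math.degrees(math.atan2(abs(y), z))
    return h_angle <= fov_h_deg / 2.0 and v_angle <= fov_v_deg / 2.0

print(in_working_volume(0.05, 0.02, 0.6))  # True: near the centre of the volume
print(in_working_volume(0.05, 0.02, 2.0))  # False: beyond the far limit
```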
- In navigated surgery, targets are affixed to a patient's anatomy, as well as to surgical instruments, such that a relative position and orientation between the sensor and the target may be measured. Total Knee Arthroplasty (TKA) will be used as an example in this disclosure to describe a navigated surgical procedure. As illustrated in
FIG. 1, a system 100 is used for navigation in TKA. Targets 102 are affixed to a patient's anatomy (i.e. a femur bone 104 and a tibia bone 106 that are part of a knee joint) and located within a field of view 108 of a sensor 110 that is held by a hand 112. The sensor 110 is connected to an intra-operative computing unit (ICU) by a cable 113. Additionally, targets 102 are affixed to surgical instruments (in this case, a distal femoral cutting guide 114). In navigated TKA, a relative pose between the femur 104 and tibia 106 may allow a surgeon to track the kinematics of a knee joint during a range-of-motion test. Furthermore, a relative pose between the distal femur cutting guide 114 and the femur 104 may allow a surgeon to precisely align the cutting guide 114 with respect to the femur 104 to achieve desired cut angles. The patient rests on an operating table 116. Applicant's U.S. Pat. No. 9,138,319 B2 issued Sep. 22, 2015 and entitled "Method and System for Aligning a Prosthesis During Surgery" describes operations and methods to register components (such as an optical sensor) to a patient's bone (for example a pelvis), measure relative poses and track objects in a Total Hip Arthroplasty (THA) scenario, the contents of which are incorporated herein by reference. Similar methods and systems as described therein are applicable to TKA. - The architecture of a surgical navigation system is shown in
FIG. 2. Multiple targets are affixed to objects (anatomy, instruments, etc.) within a sterile field. The targets provide optical positional signals, either through emitting or reflecting optical energy. The positional signals are received by a sensor, comprising an optical sensor. The optical sensor preferably includes a monocular camera (comprising a lens, an imager, etc.), illuminating components, and various optics as applicable (e.g. an IR filter, if the optical sensor operates in the IR spectrum). The optical sensor is configured to detect a target. The optical sensor is configured and focused in order to function over the relatively large working volume. This is unlike endoscopic applications, where an endoscope is configured to view a scene inside a body cavity and the scene is very close (e.g. <5 cm) to the endoscope optics. Furthermore, endoscopes are used to visualize tissue, whereas the present optical sensor is used to measure relative pose between targets and objects. - The sensor may also include other sensing components. For example, positional sensing components such as accelerometers, gyroscopes, magnetometers, IR detectors, etc. may be used to supplement, augment or enhance the positional measurements obtained from the optical sensor. Additionally, the sensor may be capable of receiving user input, for example, through buttons, visible gestures, motions (e.g. determined via an accelerometer), and microphones to receive voice commands, etc. The sensor may include indicators that signal the state of the sensor, for example, a green LED to signify the sensor is on and a red LED to signify an error.
- The sensor may be in communication with a sensor control unit (SCU). The SCU is an intermediate device to facilitate communication between the sensor and an intra-operative computing unit (ICU).
- The ICU may comprise a laptop, workstation, or other computing device having at least one processing unit and at least one storage device, such as memory storing software (instructions and/or data) as further described herein to configure the execution of the intra-operative computing unit. The ICU receives positional data from the sensor via the SCU, processes the data and computes the pose of targets that are within the working volume of the sensor. The ICU also performs any further processing associated with the other sensing and/or user input components. The ICU may further process the measurements to express them in clinically-relevant terms (e.g. according to anatomical registration). The ICU may further implement a user interface to guide a user through a surgical workflow. The ICU may further provide the user interface, including measurements, to a display unit for displaying to a surgeon or other user.
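The patent does not state how the pose measurements are converted into clinically relevant terms; as one hedged illustration, the rotation part of a relative pose could be decomposed into Euler angles and labelled according to the registered anatomical axes. The mapping of angles to varus/valgus, flexion and axial rotation below is purely illustrative and depends entirely on the registration convention.

```python
import numpy as np

def rotation_to_angles_deg(R):
    """Decompose a 3x3 rotation matrix into ZYX (yaw, pitch, roll) Euler angles in degrees."""
    pitch = np.arcsin(-R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return np.degrees([yaw, pitch, roll])

# Illustrative labelling only; the true mapping depends on anatomical registration.
R_rel = np.eye(3)  # rotation block of a relative pose between two tracked targets
yaw, pitch, roll = rotation_to_angles_deg(R_rel)
print(f"axial rotation {yaw:.1f} deg, varus/valgus {pitch:.1f} deg, flexion {roll:.1f} deg")
```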
- In one embodiment, a sensor is configured for handheld use. As depicted in
FIG. 3, the sensor 110 may localize two or more targets 102, in order to compute a relative pose between the targets 102. The targets 102 are attached to objects 302 and the relative pose between the objects 302 can also be calculated. In relation to navigated TKA, the objects could be the femur 104 and the tibia 106, or the femur 104 and the cutting guide 114. - Furthermore, both handheld and non-handheld modes may be supported for use of the sensor within the same surgery. This is illustrated in
FIG. 4a and FIG. 4b. For example, during a TKA, it may be preferable to leave the sensor 110 mounted to a fixed structure 402 for the majority of the procedure with the use of a releasable mechanical connection 404, such as the one described in U.S. 20140275940 titled "System and method for intra-operative leg position measurement", the entire contents of which are incorporated herein. However, during a step to localize a malleolus 406 of the patient's anatomy, the sensor 110 can be removed from its fixed structure 402 if the malleolus 406 is outside the working volume 408 of the optical sensor in the sensor 110. The sensor 110 can be held in a user's hand 112 to bring the malleolus 406 within its working volume 408, as shown in FIG. 4b. - This system configuration has several advantages. This system does not have a wired connection to objects being tracked. This system further allows the sensor to be used for pose measurements of the targets at specific steps in the workflow, and set aside when not in use. When a target is outside of or obstructed from the working volume of the sensor while the sensor is attached to a static/fixed position, the position of the sensor can be moved, using the sensor in handheld mode, to include the target in its working volume. The sensor may comprise user input components (e.g. buttons). This user input may be a part of the surgical workflow, and may be communicated to the ICU. Furthermore, indicators on the sensor could provide feedback to the surgeon (e.g. status LEDs indicating that a target is being detected by the sensor). Also, in handheld operation, it may be preferable for a surgical assistant to hold the sensor to align the sensor's working volume with the targets in the surgical site, while the surgeon is performing another task.
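- To make the notion of a working volume concrete, the following sketch (an assumption for illustration, not a definition from this disclosure) models the working volume as a viewing cone bounded by near and far distances, and checks whether a target position reported in the sensor frame falls inside it. The parameter values are hypothetical.

```python
import numpy as np

def in_working_volume(p_sensor, half_angle_deg=30.0, near_m=0.3, far_m=1.5):
    """Return True if point p_sensor (x, y, z in the sensor frame, metres)
    lies inside a conical working volume centred on the sensor's +z axis."""
    x, y, z = p_sensor
    if not (near_m <= z <= far_m):           # depth limits of the volume
        return False
    radial = np.hypot(x, y)                  # distance from the optical axis
    return radial <= z * np.tan(np.radians(half_angle_deg))

# Hypothetical example: a malleolus target reported 1.1 m in front of the sensor.
print(in_working_volume((0.10, -0.05, 1.10)))   # True: inside the cone
print(in_working_volume((0.80,  0.00, 1.10)))   # False: outside; reposition the sensor
```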
- As illustrated in
FIG. 5, in addition to handheld configurations, the sensor 110 may be mounted onto a surgeon 502 (or other user) using a sensor mounting structure 504. The sensor mounting structure 504 may allow the sensor to be attached to a surgeon's forehead (by way of example, shown as a headband). The sensor 110 is preferably placed on the sensor mounting structure 504 such that the working volume 408 of the sensor 110 and the surgeon's visual field of view 506 are substantially aligned, i.e. the majority of the working volume 408 overlaps with the field of view 108 of the sensor 110. In such a configuration, if the surgeon 502 can see the targets 102, the sensor 110 (while mounted on the sensor mounting structure 504) will likely be able to do so as well. This configuration allows the surgeon to rapidly and intuitively overcome line-of-sight disruptions between the sensor 110 and the targets 102. In this configuration, it is desirable that the optical sensor in the sensor 110 has a working volume 408 that is approximately centered about the nominal distance between a surgeon's forehead (or other mounting location) and the surgical site. The working volume 408 of the sensor 110 is preferably large enough to accommodate a wide range of feasible relative positions between the mounting position on the surgeon and the surgical site. - In the previously described embodiments, the sensor in the surgical navigation system is mounted on the surgeon's body. For reasons of sterility and ergonomics, it may not be feasible to use buttons on the sensor in order to interact with the intra-operative computing unit. Hand gestures may be sensed by the optical system and used as user input. For example, if the sensor can detect a user's hand in its field of view, the system comprising the sensor, SCU, ICU, and a display unit may be able to identify predefined gestures that correspond to certain commands within the surgical workflow. For example, waving a hand from left to right may correspond to advancing the surgical workflow on the display unit of the ICU; snapping fingers may correspond to saving a measurement that may also be displayed on the display unit of the ICU; waving a hand back and forth may correspond to cancelling an action within the workflow; etc. According to
FIG. 6, a user interacts with the system using a hand wave gesture within the field of view 108 of the sensor 110. These gestures are recognized by the sensor and by image processing software executed by the ICU. When mounted on the surgeon's forehead, the hand gestures may be performed by another user within the surgery and need not necessarily be performed by the surgeon. Furthermore, using hand gestures as user input is not limited to a system where the sensor is placed on the surgeon's body. It is also possible to use gesture control in any system where the sensor is mounted on the operating table, on the patient, hand-held, etc.
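- A brief sketch of how recognized gestures might be mapped to workflow commands on the ICU follows. The gesture recognizer itself is not shown; the gesture labels, command names and dispatch mechanism are assumptions for illustration only, loosely following the examples above (wave left to right to advance, snap to save, wave back and forth to cancel).

```python
from enum import Enum, auto

class Command(Enum):
    ADVANCE_WORKFLOW = auto()
    SAVE_MEASUREMENT = auto()
    CANCEL_ACTION = auto()

# Hypothetical mapping from gesture labels (as produced by an upstream
# image-processing step on the ICU) to surgical-workflow commands.
GESTURE_COMMANDS = {
    "wave_left_to_right": Command.ADVANCE_WORKFLOW,
    "finger_snap": Command.SAVE_MEASUREMENT,
    "wave_back_and_forth": Command.CANCEL_ACTION,
}

def dispatch(gesture_label, workflow):
    """Translate a recognized gesture into a workflow command; unknown gestures are ignored."""
    command = GESTURE_COMMANDS.get(gesture_label)
    if command is Command.ADVANCE_WORKFLOW:
        workflow.next_step()                 # hypothetical workflow interface
    elif command is Command.SAVE_MEASUREMENT:
        workflow.save_current_measurement()
    elif command is Command.CANCEL_ACTION:
        workflow.cancel()
```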
- In FIG. 7, an image 702 as seen by the optical sensor is illustrated. In addition to gesture control, the ICU may receive commands via voice controls (e.g. via a microphone in the sensor or in the ICU itself). In another embodiment, as shown in FIG. 8, body motions (e.g. head nods 802 or shakes 804) can be detected by the sensor 110 with additional sensing components, e.g. an embedded accelerometer. The body motions are sensed and interpreted to send signals to the ICU, which may communicate wirelessly with the sensor 110. For example, nodding 802 may signal the surgical workflow to advance, shaking 804 may signal the surgical workflow to go back one step or cancel an action, etc.
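- The sketch below illustrates one way such head motions could be classified from an embedded accelerometer: nods produce most of their motion energy on the vertical axis, shakes on the lateral axis. The axis conventions, thresholds and window length are assumptions for illustration and are not taken from this disclosure.

```python
import numpy as np

def classify_head_motion(samples, threshold=1.5):
    """Classify a short window of accelerometer samples as 'nod', 'shake' or None.

    samples: array of shape (N, 3) holding (x, y, z) accelerations in m/s^2,
    where (assumed convention) y is vertical and x is lateral for a
    forehead-mounted sensor. The static gravity component is removed by
    subtracting the window mean.
    """
    a = np.asarray(samples, dtype=float)
    a = a - a.mean(axis=0)
    energy_x = np.std(a[:, 0])              # lateral motion -> shake
    energy_y = np.std(a[:, 1])              # vertical motion -> nod
    if max(energy_x, energy_y) < threshold:
        return None                         # too little motion: no gesture
    return "nod" if energy_y > energy_x else "shake"

# Hypothetical usage: 'nod' advances the workflow, 'shake' goes back one step.
window = np.random.normal(scale=0.1, size=(50, 3))
window[:, 1] += 3.0 * np.sin(np.linspace(0, 4 * np.pi, 50))  # simulated nodding
print(classify_head_motion(window))  # -> 'nod'
```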
- Reference is now made to FIG. 9. When the sensor is mounted on a surgeon 502, the sensor may be wireless to allow the surgeon 502 to move freely around the operating room. In one embodiment, the sensor 110 communicates wirelessly 902 with the ICU 904, the ICU configured to receive wireless signals 902 from the sensor. - Reference is now made to
FIG. 10. In another embodiment, the sensor 110 is connected by wire to a Sensor Control Unit (SCU) 1002. The SCU 1002 further communicates wirelessly 902 with the ICU 904. The SCU 1002 contains a mounting structure 504 such that it may be mounted on the surgeon 502. For example, the sensor 110 may be mounted to the forehead of the surgeon 502, and the SCU 1002 may be attached to the belt of the surgeon 502 under a surgical gown. The SCU 1002 may have a battery to power the sensor, as well as its own electronics. The SCU 1002 may have an integrated battery, and the SCU 1002 may be charged between surgeries, such as on a charging dock. Multiple SCUs may be available such that if an SCU loses power during surgery, it can be swapped out for a fully charged SCU. Where a wireless sensor is implemented, any wireless protocol/technology may be used (e.g. WiFi, Bluetooth, ZigBee, etc.). - It will be appreciated that a sensor mounted on a surgeon's body need not necessarily be sterile. For example, the surgeon's forehead is not sterile. This is advantageous since a sensor may not be made from materials that can be sterilized, and by removing the sensor from the sterile field, there is no requirement to ensure that the sensor is sterile (e.g. through draping, autoclaving, etc.).
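- Returning to the wireless SCU-to-ICU link described above, a minimal sketch of the kind of message the SCU might relay from the sensor to the ICU is shown below, here as a JSON payload sent over a UDP socket. The message fields, address, port and transport are purely assumptions for illustration; the disclosure only requires that some wireless protocol (e.g. WiFi, Bluetooth, ZigBee) carries the sensor data to the ICU.

```python
import json
import socket
import time

ICU_ADDRESS = ("192.168.0.10", 5005)   # hypothetical ICU host and port

def send_pose_update(sock, target_id, position_m, quaternion_wxyz, battery_pct):
    """Package one tracked-target pose (plus SCU battery level) and send it to the ICU."""
    message = {
        "timestamp": time.time(),
        "target_id": target_id,
        "position_m": list(position_m),          # target position in the sensor frame
        "orientation_wxyz": list(quaternion_wxyz),
        "scu_battery_pct": battery_pct,          # lets the ICU warn before an SCU swap
    }
    sock.sendto(json.dumps(message).encode("utf-8"), ICU_ADDRESS)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_pose_update(sock, "femur_target", (0.02, 0.05, 0.31), (1.0, 0.0, 0.0, 0.0), 87)
```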
- Reference is now made to
FIG. 11. In one embodiment, the sensor 110 is attached by a mounting structure 504 to the forehead of the surgeon 502, and the working volume 408 of the sensor 110 is substantially contained within the surgeon's field of view 506. The sensor 110 is configured to detect targets 102 within its working volume 408 such that the relative pose between the targets 102 may be calculated by the ICU 904. The ICU 904 may comprise a display unit. In this embodiment, the display unit 1102 is also included in the surgeon's field of view 506. This is advantageous where real-time measurements based on the pose of the targets 102 are provided to the surgeon 502 via the display unit 1102. The targets 102 and the display unit 1102 are simultaneously visible to the surgeon 502. The display unit 1102 may be positioned within the surgeon's field of view 506, for example, by being mounted to a mobile cart. - In another embodiment, the
display unit 1102 is a heads-up display configured to be worn by the surgeon 502 and to be integrated with the sensor 110. The heads-up display is any transparent display that does not require the surgeon to look away from the usual field of view. For example, the heads-up display may be projected onto a visor that surgeons typically wear during surgery. A head-mounted sensor allows the surgeon to ensure that the targets are within the working volume of the sensor without additionally trying to look at a static display unit within the operating room, and with a heads-up display, the display unit is visible to the surgeon regardless of where they are looking. - In another embodiment, as illustrated in
FIG. 12, the sensor 110 is integrated with heads-up display glasses 1202 (such as the Google Glass™ product from Google Inc.). Since the sensor 110 and the display unit 1102 are integrated into one device, there is no longer a need for a dedicated display unit, thus reducing cost and complexity. This embodiment is also advantageous since, by default, it places the sensor's working volume within the surgeon's field of view. - In another embodiment, as illustrated in
FIG. 13, the heads-up display glasses 1202 (with integrated sensor) are connected by a cable to an SCU 1002, the SCU 1002 attached to a surgeon 502. In this embodiment, the SCU 1002 may also comprise the ICU (that is, it may perform the intra-operative computations to generate display information to be displayed on the heads-up display glasses). The SCU may include a battery, e.g. a rechargeable battery. In this embodiment, the SCU need not communicate wirelessly with any other device, since the entire computational/electronic system is provided by the SCU, heads-up display glasses and sensor. That is, the entire surgical localization system is body-worn (with the exception of the targets and their coupling means). - In a body-mounted or handheld camera configuration, the sensor is not required to be stationary when capturing relative pose between multiple targets. However, some measurements may be required to be absolute. For example, when calculating the center of rotation (COR) of a hip joint, a fixed sensor may localize a target affixed to the femur as it is rotated about the hip COR. The target is constrained to move along a sphere, the sphere's center being the hip COR. Since the sensor is in a fixed location, and the hip COR is in a fixed location, the pose data from rotating the femur may be used to calculate the pose of the hip COR relative to the target on the femur. In TKA, the pose of the hip COR is useful for determining the mechanical axis of a patient's leg.
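- One common way to recover such a center of rotation is a least-squares sphere fit to the recorded target positions; the sketch below shows the standard linear (algebraic) formulation as an assumed illustration, not the specific method of this disclosure. The input points are target positions collected while the femur is pivoted about the hip.

```python
import numpy as np

def fit_sphere_center(points):
    """Least-squares sphere fit: returns (center, radius) for an (N, 3) array of points.

    Uses the linear formulation |p|^2 = 2 c . p + (r^2 - |c|^2), which works when the
    points sample enough of the sphere (i.e. the femur is swept through an adequate
    range of motion).
    """
    p = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * p, np.ones((len(p), 1))])
    b = np.sum(p * p, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

# Synthetic check: points on a sphere of radius 0.45 m about a known hip COR.
rng = np.random.default_rng(0)
true_center = np.array([0.10, -0.20, 0.90])
directions = rng.normal(size=(200, 3))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
samples = true_center + 0.45 * directions
center, radius = fit_sphere_center(samples)
print(center, radius)   # approximately the true COR and 0.45
```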
- In an embodiment where the sensor may not be fixed, a second stationary target is used as a static reference to compensate for the possible motion of the sensor. In
FIG. 14, a target 102 is affixed to a femur 104 that has a static hip COR 1401. The sensor is not necessarily stationary, as it may be body-mounted or hand-held. A static reference target 1402 is introduced within the working volume 408 of the sensor 110. The static reference target 1402 must fulfill two conditions: 1) it must remain stationary during the steps executed to measure the hip COR 1401, and 2) it must be within the working volume 408 of the sensor 110 simultaneously with the target 102 attached to the femur 104 while the hip COR 1401 is measured. In this example, the static reference target 1402 may be simply placed on the OR bed 116 or mounted to a fixed docking structure 402. To measure the hip COR 1401, the femur is rotated about a pivot point of the hip joint while the sensor 110 captures poses of the target 102 and poses of the static reference target 1402. The poses of the static reference target 1402 are used to compensate for any movement of the sensor 110 as it is used in hand-held or body-mounted mode. This embodiment is described relative to measuring hip COR in the context of TKA; however, this embodiment is meant to be an example for clarity. There are many other embodiments that will be evident to those skilled in the art (e.g. tool calibration, verification of the location of a tip of a tool, etc.).
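- The compensation itself can be sketched as expressing each femur-target pose in the frame of the static reference target rather than in the (moving) sensor frame; positions collected this way are independent of sensor motion and can then be fed to a center-of-rotation fit such as the sphere fit sketched earlier. The function below is an illustrative assumption using 4x4 homogeneous transforms.

```python
import numpy as np

def femur_points_in_reference_frame(pose_pairs):
    """Compensate for sensor motion using a static reference target.

    pose_pairs: list of (T_sensor_ref, T_sensor_femur) tuples, where each entry holds
    the 4x4 poses of the static reference target and the femur target as captured
    simultaneously by the (possibly moving) sensor.

    Returns an (N, 3) array of femur-target positions expressed in the static
    reference frame, suitable for a sphere fit of the hip COR.
    """
    points = []
    for T_sensor_ref, T_sensor_femur in pose_pairs:
        T_ref_femur = np.linalg.inv(T_sensor_ref) @ T_sensor_femur
        points.append(T_ref_femur[:3, 3])
    return np.array(points)
```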
- Where the surgeon has direct control over the working volume of the optical sensor (i.e. in the body-mounted or handheld configurations), it may be advantageous to display the video feed as seen by the optical sensor to the surgeon via the display unit. This visual feedback may allow the surgeon to see what the optical sensor "sees", and may be useful in a) ensuring that the target(s) are within the working volume of the sensor, and b) diagnosing any occlusions, disturbances, disruptions, etc. that prevent the poses of the targets from being captured by the ICU. For example, a persistent optical sensor image feed may be displayed. This is illustrated in FIG. 15, where a laptop (integrated ICU 904 and display 1102) is shown to display a video feed 1502 (showing two targets within the image) of the optical sensor. Alternative graphics may be shown in addition to or instead of the raw video feed 1502 of the optical sensor. For example, simplified renderings of the targets (e.g. represented as dots) may be displayed within a window on the display instead of the video feed 1502. In another example, a virtual overlay may be displayed on the raw video feed 1502 of the optical sensor, in which the overlay includes colours and shapes overlaid on the target image. - Various embodiments have been described herein with reference to the accompanying drawings. It will, however, be evident that various modifications and changes may be made thereto, and additional embodiments may be implemented, without departing from the broader scope of the disclosed embodiments as set forth in the claims that follow.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/926,963 US20160128783A1 (en) | 2014-10-29 | 2015-10-29 | Surgical navigation system with one or more body borne components and method therefor |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462072030P | 2014-10-29 | 2014-10-29 | |
US201462072041P | 2014-10-29 | 2014-10-29 | |
US201462072032P | 2014-10-29 | 2014-10-29 | |
US201462084891P | 2014-11-26 | 2014-11-26 | |
US14/926,963 US20160128783A1 (en) | 2014-10-29 | 2015-10-29 | Surgical navigation system with one or more body borne components and method therefor |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160128783A1 true US20160128783A1 (en) | 2016-05-12 |
Family
ID=55856304
Family Applications (13)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/926,963 Abandoned US20160128783A1 (en) | 2014-10-29 | 2015-10-29 | Surgical navigation system with one or more body borne components and method therefor |
US15/522,503 Abandoned US20170333136A1 (en) | 2014-10-29 | 2015-10-29 | Systems and devices including a surgical navigation camera with a kinematic mount and a surgical drape with a kinematic mount adapter |
US15/522,559 Active 2036-09-22 US10682185B2 (en) | 2014-10-29 | 2015-10-29 | Devices, systems and methods for natural feature tracking of surgical tools and other objects |
US15/148,084 Active US9603671B2 (en) | 2014-10-29 | 2016-05-06 | Systems, methods and devices for anatomical registration and surgical localization |
US15/425,690 Active US9713506B2 (en) | 2014-10-29 | 2017-02-06 | Systems, methods and devices for image registration and surgical localization |
US15/656,347 Active US10034715B2 (en) | 2014-10-29 | 2017-07-21 | Systems, methods and devices to measure and display inclination and track patient motion during a procedure |
US16/036,182 Active 2036-02-21 US10786312B2 (en) | 2014-10-29 | 2018-07-16 | Systems, methods and devices to measure and display inclination and track patient motion during a procedure |
US16/871,982 Active US10898278B2 (en) | 2014-10-29 | 2020-05-11 | Systems, methods and devices to measure and display inclination and track patient motion during a procedure |
US16/901,730 Active 2036-08-20 US11583344B2 (en) | 2014-10-29 | 2020-06-15 | Devices, systems and methods for natural feature tracking of surgical tools and other objects |
US17/143,791 Active 2036-07-10 US12035977B2 (en) | 2014-10-29 | 2021-01-07 | Systems, methods and devices to measure and display inclination and track patient motion during a procedure |
US18/082,016 Pending US20230119870A1 (en) | 2014-10-29 | 2022-12-15 | Devices, systems and methods for natural feature tracking of surgical tools and other objects |
US18/129,698 Active US11896321B2 (en) | 2014-10-29 | 2023-03-31 | Systems and methods to register patient anatomy or to determine and present measurements relative to patient anatomy |
US18/129,724 Active US11890064B2 (en) | 2014-10-29 | 2023-03-31 | Systems and methods to register patient anatomy or to determine and present measurements relative to patient anatomy |
Family Applications After (12)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/522,503 Abandoned US20170333136A1 (en) | 2014-10-29 | 2015-10-29 | Systems and devices including a surgical navigation camera with a kinematic mount and a surgical drape with a kinematic mount adapter |
US15/522,559 Active 2036-09-22 US10682185B2 (en) | 2014-10-29 | 2015-10-29 | Devices, systems and methods for natural feature tracking of surgical tools and other objects |
US15/148,084 Active US9603671B2 (en) | 2014-10-29 | 2016-05-06 | Systems, methods and devices for anatomical registration and surgical localization |
US15/425,690 Active US9713506B2 (en) | 2014-10-29 | 2017-02-06 | Systems, methods and devices for image registration and surgical localization |
US15/656,347 Active US10034715B2 (en) | 2014-10-29 | 2017-07-21 | Systems, methods and devices to measure and display inclination and track patient motion during a procedure |
US16/036,182 Active 2036-02-21 US10786312B2 (en) | 2014-10-29 | 2018-07-16 | Systems, methods and devices to measure and display inclination and track patient motion during a procedure |
US16/871,982 Active US10898278B2 (en) | 2014-10-29 | 2020-05-11 | Systems, methods and devices to measure and display inclination and track patient motion during a procedure |
US16/901,730 Active 2036-08-20 US11583344B2 (en) | 2014-10-29 | 2020-06-15 | Devices, systems and methods for natural feature tracking of surgical tools and other objects |
US17/143,791 Active 2036-07-10 US12035977B2 (en) | 2014-10-29 | 2021-01-07 | Systems, methods and devices to measure and display inclination and track patient motion during a procedure |
US18/082,016 Pending US20230119870A1 (en) | 2014-10-29 | 2022-12-15 | Devices, systems and methods for natural feature tracking of surgical tools and other objects |
US18/129,698 Active US11896321B2 (en) | 2014-10-29 | 2023-03-31 | Systems and methods to register patient anatomy or to determine and present measurements relative to patient anatomy |
US18/129,724 Active US11890064B2 (en) | 2014-10-29 | 2023-03-31 | Systems and methods to register patient anatomy or to determine and present measurements relative to patient anatomy |
Country Status (4)
Country | Link |
---|---|
US (13) | US20160128783A1 (en) |
EP (1) | EP3212110B1 (en) |
AU (5) | AU2015337755B2 (en) |
WO (3) | WO2016065457A1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018166718A1 (en) | 2017-03-15 | 2018-09-20 | Zf Friedrichshafen Ag | Arrangement and method for determining a gradient signal in a vehicle |
US10702344B2 (en) | 2017-04-24 | 2020-07-07 | Intellijoint Surgical Inc. | Cutting tools, systems and methods for navigated bone alterations |
WO2020160388A1 (en) * | 2019-01-31 | 2020-08-06 | Brain Corporation | Systems and methods for laser and imaging odometry for autonomous robots |
CN113384347A (en) * | 2021-06-16 | 2021-09-14 | 瑞龙诺赋(上海)医疗科技有限公司 | Robot calibration method, device, equipment and storage medium |
CN114615948A (en) * | 2019-10-18 | 2022-06-10 | 完整植入物有限公司Dba阿克塞勒斯 | Surgical navigation system |
US20230355310A1 (en) * | 2022-05-06 | 2023-11-09 | Stryker European Operations Limited | Technique For Determining A Visualization Based On An Estimated Surgeon Pose |
Families Citing this family (100)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7559931B2 (en) | 2003-06-09 | 2009-07-14 | OrthAlign, Inc. | Surgical orientation system and method |
US20100064216A1 (en) | 2008-07-24 | 2010-03-11 | OrthAlign, Inc. | Systems and methods for joint replacement |
CA2736525C (en) | 2008-09-10 | 2019-10-22 | OrthAlign, Inc. | Hip surgery systems and methods |
US10869771B2 (en) | 2009-07-24 | 2020-12-22 | OrthAlign, Inc. | Systems and methods for joint replacement |
US11399900B2 (en) | 2012-06-21 | 2022-08-02 | Globus Medical, Inc. | Robotic systems providing co-registration using natural fiducials and related methods |
US11857149B2 (en) | 2012-06-21 | 2024-01-02 | Globus Medical, Inc. | Surgical robotic systems with target trajectory deviation monitoring and related methods |
US11045267B2 (en) | 2012-06-21 | 2021-06-29 | Globus Medical, Inc. | Surgical robotic automation with tracking markers |
US11793570B2 (en) | 2012-06-21 | 2023-10-24 | Globus Medical Inc. | Surgical robotic automation with tracking markers |
US11864839B2 (en) | 2012-06-21 | 2024-01-09 | Globus Medical Inc. | Methods of adjusting a virtual implant and related surgical navigation systems |
US11857266B2 (en) | 2012-06-21 | 2024-01-02 | Globus Medical, Inc. | System for a surveillance marker in robotic-assisted surgery |
US11974822B2 (en) | 2012-06-21 | 2024-05-07 | Globus Medical Inc. | Method for a surveillance marker in robotic-assisted surgery |
US11896446B2 (en) | 2012-06-21 | 2024-02-13 | Globus Medical, Inc | Surgical robotic automation with tracking markers |
US11864745B2 (en) | 2012-06-21 | 2024-01-09 | Globus Medical, Inc. | Surgical robotic system with retractor |
US10758315B2 (en) | 2012-06-21 | 2020-09-01 | Globus Medical Inc. | Method and system for improving 2D-3D registration convergence |
US11317971B2 (en) | 2012-06-21 | 2022-05-03 | Globus Medical, Inc. | Systems and methods related to robotic guidance in surgery |
US10624710B2 (en) | 2012-06-21 | 2020-04-21 | Globus Medical, Inc. | System and method for measuring depth of instrumentation |
US11589771B2 (en) | 2012-06-21 | 2023-02-28 | Globus Medical Inc. | Method for recording probe movement and determining an extent of matter removed |
US10799298B2 (en) | 2012-06-21 | 2020-10-13 | Globus Medical Inc. | Robotic fluoroscopic navigation |
US11253327B2 (en) | 2012-06-21 | 2022-02-22 | Globus Medical, Inc. | Systems and methods for automatically changing an end-effector on a surgical robot |
US12220120B2 (en) * | 2012-06-21 | 2025-02-11 | Globus Medical, Inc. | Surgical robotic system with retractor |
US11298196B2 (en) | 2012-06-21 | 2022-04-12 | Globus Medical Inc. | Surgical robotic automation with tracking markers and controlled tool advancement |
US10842461B2 (en) | 2012-06-21 | 2020-11-24 | Globus Medical, Inc. | Systems and methods of checking registrations for surgical systems |
US11963755B2 (en) | 2012-06-21 | 2024-04-23 | Globus Medical Inc. | Apparatus for recording probe movement |
US10874466B2 (en) | 2012-06-21 | 2020-12-29 | Globus Medical, Inc. | System and method for surgical tool insertion using multiaxis force and moment feedback |
US11786324B2 (en) | 2012-06-21 | 2023-10-17 | Globus Medical, Inc. | Surgical robotic automation with tracking markers |
US12004905B2 (en) | 2012-06-21 | 2024-06-11 | Globus Medical, Inc. | Medical imaging systems using robotic actuators and related methods |
US12133699B2 (en) | 2012-06-21 | 2024-11-05 | Globus Medical, Inc. | System and method for surgical tool insertion using multiaxis force and moment feedback |
WO2014025305A1 (en) * | 2012-08-08 | 2014-02-13 | Ortoma Ab | Method and system for computer assisted surgery |
US9649160B2 (en) | 2012-08-14 | 2017-05-16 | OrthAlign, Inc. | Hip replacement navigation system and method |
US9993273B2 (en) | 2013-01-16 | 2018-06-12 | Mako Surgical Corp. | Bone plate and tracking device using a bone plate for attaching to a patient's anatomy |
EP2945559B1 (en) | 2013-01-16 | 2018-09-05 | Stryker Corporation | Navigation systems and methods for indicating and reducing line-of-sight errors |
EP3441039B1 (en) | 2013-03-15 | 2020-10-21 | Stryker Corporation | Assembly for positioning a sterile surgical drape relative to optical position sensors |
US20160128783A1 (en) | 2014-10-29 | 2016-05-12 | Intellijoint Surgical Inc. | Surgical navigation system with one or more body borne components and method therefor |
CN113893039B (en) | 2015-02-20 | 2024-09-03 | 史赛克公司 | Installing the system |
US10363149B2 (en) * | 2015-02-20 | 2019-07-30 | OrthAlign, Inc. | Hip replacement navigation system and method |
US11883217B2 (en) | 2016-02-03 | 2024-01-30 | Globus Medical, Inc. | Portable medical imaging system and method |
US11064904B2 (en) | 2016-02-29 | 2021-07-20 | Extremity Development Company, Llc | Smart drill, jig, and method of orthopedic surgery |
US11432878B2 (en) | 2016-04-28 | 2022-09-06 | Intellijoint Surgical Inc. | Systems, methods and devices to scan 3D surfaces for intra-operative localization |
US10118092B2 (en) | 2016-05-03 | 2018-11-06 | Performance Designed Products Llc | Video gaming system and method of operation |
CA3022838A1 (en) | 2016-05-03 | 2017-11-09 | Performance Designed Products Llc | Video gaming system and method of operation |
US10537395B2 (en) | 2016-05-26 | 2020-01-21 | MAKO Surgical Group | Navigation tracker with kinematic connector assembly |
US10223798B2 (en) * | 2016-05-27 | 2019-03-05 | Intellijoint Surgical Inc. | Systems and methods for tracker characterization and verification |
CN109688963B (en) | 2016-07-15 | 2021-08-06 | 马科外科公司 | System for robot-assisted correction of programs |
WO2018047096A1 (en) * | 2016-09-07 | 2018-03-15 | Intellijoint Surgical Inc. | Systems and methods for surgical navigation, including image-guided navigation of a patient's head |
US10993771B2 (en) * | 2016-09-12 | 2021-05-04 | Synaptive Medical Inc. | Trackable apparatuses and methods |
KR101817438B1 (en) | 2016-09-13 | 2018-01-11 | 재단법인대구경북과학기술원 | A surgical navigation system for total hip arthroplasty |
US10828114B2 (en) | 2016-10-18 | 2020-11-10 | Synaptive Medical (Barbados) Inc. | Methods and systems for providing depth information |
GB2570835B (en) * | 2016-10-21 | 2021-11-03 | Synaptive Medical Inc | Methods and systems for providing depth information |
US12109001B2 (en) * | 2016-10-28 | 2024-10-08 | Naviswiss Ag | Sterile covering for an optical device |
US10441365B2 (en) * | 2017-01-11 | 2019-10-15 | Synaptive Medical (Barbados) Inc. | Patient reference device |
US11160610B2 (en) * | 2017-02-07 | 2021-11-02 | Intellijoint Surgical Inc. | Systems and methods for soft tissue navigation |
WO2018150336A1 (en) * | 2017-02-14 | 2018-08-23 | Atracsys Sàrl | High-speed optical tracking with compression and/or cmos windowing |
AU2018236205B2 (en) | 2017-03-14 | 2023-10-12 | OrthAlign, Inc. | Soft tissue measurement and balancing systems and methods |
JP7661007B2 (en) | 2017-03-14 | 2025-04-14 | オースアライン・インコーポレイテッド | Hip replacement navigation system and method |
US11065069B2 (en) | 2017-05-10 | 2021-07-20 | Mako Surgical Corp. | Robotic spine surgery system and methods |
US11033341B2 (en) | 2017-05-10 | 2021-06-15 | Mako Surgical Corp. | Robotic spine surgery system and methods |
US11259878B2 (en) | 2017-05-29 | 2022-03-01 | Intellijoint Surgical Inc. | Systems and methods for surgical navigation with a tracker instrument |
US11096754B2 (en) | 2017-10-04 | 2021-08-24 | Mako Surgical Corp. | Sterile drape assembly for surgical robot |
SE542086C2 (en) * | 2017-10-25 | 2020-02-25 | Ortoma Ab | Tool for Attachment of Marker of a Surgical Navigation System |
CN107997822B (en) * | 2017-12-06 | 2021-03-19 | 上海卓梦医疗科技有限公司 | Minimally invasive surgery positioning system |
KR20200115518A (en) | 2018-01-26 | 2020-10-07 | 마코 서지컬 코포레이션 | End effector, system and method for impacting prosthesis guided by surgical robot |
US10992078B2 (en) | 2018-01-29 | 2021-04-27 | Bard Access Systems, Inc. | Connection system for establishing an electrical connection through a drape and methods thereof |
WO2019209725A1 (en) | 2018-04-23 | 2019-10-31 | Mako Surgical Corp. | System, method and software program for aiding in positioning of a camera relative to objects in a surgical environment |
EP3793464A4 (en) | 2018-05-18 | 2021-07-21 | Bard Access Systems, Inc. | Connection systems and methods thereof for establishing an electrical connection through a drape |
US11191594B2 (en) | 2018-05-25 | 2021-12-07 | Mako Surgical Corp. | Versatile tracking arrays for a navigation system and methods of recovering registration using the same |
EP3586785B1 (en) * | 2018-06-29 | 2024-01-17 | Globus Medical, Inc. | Surgical robotic automation with tracking markers |
US20220331971A1 (en) * | 2021-04-16 | 2022-10-20 | Intellijoint Surgical Inc. | Systems, targets, and methods for optical registration of tools |
CN113316431B (en) | 2018-12-04 | 2024-09-20 | 马科外科公司 | Mounting system with sterile barrier assembly for coupling surgical components |
CN109498102B (en) * | 2018-12-07 | 2024-03-26 | 北京天智航医疗科技股份有限公司 | Depth measuring device and grinding device |
CN109793563B (en) * | 2019-01-31 | 2020-07-14 | 山东大学齐鲁医院 | Nail placement positioning system for minimally invasive treatment of pelvic fracture and working method |
EP3962396A1 (en) * | 2019-04-29 | 2022-03-09 | Smith&Nephew, Inc. | Multi-level positional tracking |
CN110200699B (en) * | 2019-05-21 | 2020-08-18 | 武汉联影智融医疗科技有限公司 | Medical imaging device guided surgical device, calibration method and calibration system |
US12059804B2 (en) | 2019-05-22 | 2024-08-13 | Mako Surgical Corp. | Bidirectional kinematic mount |
CN110115635A (en) * | 2019-05-23 | 2019-08-13 | 长江大学 | A kind of lighting system of automatic tracing visual area |
US20220338886A1 (en) * | 2019-06-19 | 2022-10-27 | Think Surgical, Inc. | System and method to position a tracking system field-of-view |
AU2020315554B2 (en) * | 2019-07-17 | 2023-02-23 | Mako Surgical Corp. | Surgical registration tools, systems, and methods of use in computer-assisted surgery |
EP3997497A4 (en) | 2019-07-29 | 2023-07-05 | Bard Access Systems, Inc. | Connection systems and methods for establishing optical and electrical connections through a drape |
EP4007932A4 (en) | 2019-08-08 | 2023-07-26 | Bard Access Systems, Inc. | Optical-fiber connector modules including shape-sensing systems and methods thereof |
CA3153123A1 (en) * | 2019-09-03 | 2021-03-11 | Russell J. Bodner | Methods and systems for targeted alignment and sagittal plane positioning for hip replacement surgery |
US20210197401A1 (en) * | 2019-12-31 | 2021-07-01 | Auris Health, Inc. | Tool detection system |
WO2021174117A1 (en) | 2020-02-28 | 2021-09-02 | Bard Access Systems, Inc. | Catheter with optic shape sensing capabilities |
EP4110175B1 (en) * | 2020-02-28 | 2025-05-07 | Bard Access Systems, Inc. | Optical connection systems and methods thereof |
WO2021178578A1 (en) | 2020-03-03 | 2021-09-10 | Bard Access Systems, Inc. | System and method for optic shape sensing and electrical signal conduction |
WO2021231994A1 (en) * | 2020-05-15 | 2021-11-18 | Jeffrey Wilde | Joint implant extraction and placement system and localization device used therewith |
US12070276B2 (en) * | 2020-06-09 | 2024-08-27 | Globus Medical Inc. | Surgical object tracking in visible light via fiducial seeding and synthetic image registration |
AU2021288205A1 (en) * | 2020-06-11 | 2023-02-09 | Monogram Orthopaedics Inc. | Navigational and/or robotic tracking methods and systems |
US11107586B1 (en) * | 2020-06-24 | 2021-08-31 | Cuptimize, Inc. | System and method for analyzing acetabular cup position |
CN112155734B (en) * | 2020-09-29 | 2022-01-28 | 苏州微创畅行机器人有限公司 | Readable storage medium, bone modeling and registering system and bone surgery system |
EP4236851A1 (en) | 2020-10-30 | 2023-09-06 | MAKO Surgical Corp. | Robotic surgical system with slingshot prevention |
US20240016550A1 (en) * | 2020-11-03 | 2024-01-18 | Kico Knee Innovation Company Pty Limited | Scanner for intraoperative application |
US12285572B2 (en) | 2020-11-18 | 2025-04-29 | Bard Access Systems, Inc. | Optical-fiber stylet holders and methods thereof |
US11270418B1 (en) * | 2020-11-19 | 2022-03-08 | VPIX Medical Incorporation | Method and system for correcting phase of image reconstruction signal |
EP4247247A1 (en) | 2020-11-24 | 2023-09-27 | Bard Access Systems, Inc. | Steerable fiber optic shape sensing enabled elongated medical instrument |
US12193718B2 (en) | 2021-04-09 | 2025-01-14 | Smith & Nephew, Inc. | Orthopedic surgical instrument |
USD1044829S1 (en) | 2021-07-29 | 2024-10-01 | Mako Surgical Corp. | Display screen or portion thereof with graphical user interface |
US11975445B2 (en) | 2021-08-06 | 2024-05-07 | DePuy Synthes Products, Inc. | System for connecting end effectors to robot arms that operate under sterile conditions |
US12016642B2 (en) * | 2021-09-08 | 2024-06-25 | Proprio, Inc. | Constellations for tracking instruments, such as surgical instruments, and associated systems and methods |
EP4169470B1 (en) | 2021-10-25 | 2024-05-08 | Erbe Vision GmbH | Apparatus and method for positioning a patient's body and tracking the patient's position during surgery |
US12159420B2 (en) | 2021-12-01 | 2024-12-03 | GE Precision Healthcare LLC | Methods and systems for image registration |
US12011227B2 (en) * | 2022-05-03 | 2024-06-18 | Proprio, Inc. | Methods and systems for determining alignment parameters of a surgical target, such as a spine |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7491198B2 (en) * | 2003-04-28 | 2009-02-17 | Bracco Imaging S.P.A. | Computer enhanced surgical navigation imaging system (camera probe) |
WO2010067267A1 (en) * | 2008-12-09 | 2010-06-17 | Philips Intellectual Property & Standards Gmbh | Head-mounted wireless camera and display unit |
US20150080740A1 (en) * | 2013-09-18 | 2015-03-19 | iMIRGE Medical INC. | Optical targeting and visualization of trajectories |
Family Cites Families (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5288785A (en) | 1993-06-14 | 1994-02-22 | Union Carbide Chemicals & Plastics Technology Corporation | Low voltage power cables |
US5591119A (en) * | 1994-12-07 | 1997-01-07 | Adair; Edwin L. | Sterile surgical coupler and drape |
JPH09128916A (en) | 1995-10-31 | 1997-05-16 | Fujitsu Ltd | Disk unit |
US7666191B2 (en) * | 1996-12-12 | 2010-02-23 | Intuitive Surgical, Inc. | Robotic surgical system with sterile surgical adaptor |
US6285902B1 (en) * | 1999-02-10 | 2001-09-04 | Surgical Insights, Inc. | Computer assisted targeting device for use in orthopaedic surgery |
US6288785B1 (en) * | 1999-10-28 | 2001-09-11 | Northern Digital, Inc. | System for determining spatial position and/or orientation of one or more objects |
WO2001097866A2 (en) * | 2000-06-22 | 2001-12-27 | Visualization Technology, Inc. | Windowed medical drape |
AU2002215822A1 (en) * | 2000-10-23 | 2002-05-06 | Deutsches Krebsforschungszentrum Stiftung Des Offentlichen Rechts | Method, device and navigation aid for navigation during medical interventions |
US6978167B2 (en) | 2002-07-01 | 2005-12-20 | Claron Technology Inc. | Video pose tracking system and method |
US7429259B2 (en) * | 2003-12-02 | 2008-09-30 | Cadeddu Jeffrey A | Surgical anchor and system |
WO2006060632A1 (en) * | 2004-12-02 | 2006-06-08 | Smith & Nephew, Inc. | Systems for providing a reference plane for mounting an acetabular cup |
US9492241B2 (en) * | 2005-01-13 | 2016-11-15 | Mazor Robotics Ltd. | Image guided robotic system for keyhole neurosurgery |
EP3470040B1 (en) | 2005-02-22 | 2022-03-16 | Mako Surgical Corp. | Haptic guidance system and method |
EP3372161A1 (en) * | 2005-06-02 | 2018-09-12 | Orthosoft, Inc. | Leg alignment for surgical parameter measurement in hip replacement surgery |
US8000926B2 (en) | 2005-11-28 | 2011-08-16 | Orthosensor | Method and system for positional measurement using ultrasonic sensing |
US7949386B2 (en) * | 2006-03-21 | 2011-05-24 | A2 Surgical | Computer-aided osteoplasty surgery system |
JP2009537229A (en) | 2006-05-19 | 2009-10-29 | マコ サージカル コーポレーション | Method and apparatus for controlling a haptic device |
US9622813B2 (en) * | 2007-11-01 | 2017-04-18 | Covidien Lp | Method for volume determination and geometric reconstruction |
US8000890B2 (en) * | 2007-11-30 | 2011-08-16 | General Electric Company | Image-guided navigation employing navigated point computation method and system |
US9002076B2 (en) * | 2008-04-15 | 2015-04-07 | Medtronic, Inc. | Method and apparatus for optimal trajectory planning |
US20090306499A1 (en) * | 2008-06-09 | 2009-12-10 | Mako Surgical Corp. | Self-detecting kinematic clamp assembly |
US8348954B2 (en) * | 2008-09-16 | 2013-01-08 | Warsaw Orthopedic, Inc. | Electronic guidance of spinal instrumentation |
US20100305439A1 (en) * | 2009-05-27 | 2010-12-02 | Eyal Shai | Device and Method for Three-Dimensional Guidance and Three-Dimensional Monitoring of Cryoablation |
WO2011162753A1 (en) * | 2010-06-23 | 2011-12-29 | Mako Sugical Corp. | Inertially tracked objects |
US8675939B2 (en) | 2010-07-13 | 2014-03-18 | Stryker Leibinger Gmbh & Co. Kg | Registration of anatomical data sets |
AU2011342900A1 (en) * | 2010-12-17 | 2013-07-18 | Intellijoint Surgical Inc. | Method and system for aligning a prosthesis during surgery |
DE102011050240A1 (en) * | 2011-05-10 | 2012-11-15 | Medizinische Hochschule Hannover | Apparatus and method for determining the relative position and orientation of objects |
US9498231B2 (en) * | 2011-06-27 | 2016-11-22 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
EP2765946B1 (en) * | 2011-10-13 | 2015-08-12 | Brainlab AG | Medical tracking system comprising multi-functional sensor device |
US20130096573A1 (en) * | 2011-10-18 | 2013-04-18 | Hyosig Kang | System and method for surgical tool tracking |
IN2012DE01345A (en) * | 2012-05-02 | 2015-08-07 | Stryker Global Technology Ct | |
DE102012208389A1 (en) | 2012-05-18 | 2013-11-21 | Fiagon Gmbh | Registration method and apparatus for a position detection system |
EP2666428B1 (en) * | 2012-05-21 | 2015-10-28 | Universität Bern | System and method for estimating the spatial position of a tool within an object |
EP2863820B1 (en) * | 2012-06-20 | 2020-10-28 | Intellijoint Surgical Inc. | Method of manufacturing a system for guided surgery |
US9649160B2 (en) * | 2012-08-14 | 2017-05-16 | OrthAlign, Inc. | Hip replacement navigation system and method |
US9603665B2 (en) * | 2013-03-13 | 2017-03-28 | Stryker Corporation | Systems and methods for establishing virtual constraint boundaries |
US9220572B2 (en) * | 2013-03-14 | 2015-12-29 | Biomet Manufacturing, Llc | Method for implanting a hip prosthesis and related system |
US9247998B2 (en) | 2013-03-15 | 2016-02-02 | Intellijoint Surgical Inc. | System and method for intra-operative leg position measurement |
US10799316B2 (en) | 2013-03-15 | 2020-10-13 | Synaptive Medical (Barbados) Inc. | System and method for dynamic validation, correction of registration for surgical navigation |
WO2014147601A2 (en) | 2013-03-18 | 2014-09-25 | Navigate Surgical Technologies, Inc. | Method for determining the location and orientation of a fiducial reference |
WO2015022084A1 (en) * | 2013-08-13 | 2015-02-19 | Brainlab Ag | Medical registration apparatus and method for registering an axis |
US20150305823A1 (en) * | 2014-04-25 | 2015-10-29 | General Electric Company | System and method for processing navigational sensor data |
EP3811891B8 (en) * | 2014-05-14 | 2025-01-08 | Stryker European Operations Holdings LLC | Navigation system and processor arrangement for tracking the position of a work target |
US20160128783A1 (en) * | 2014-10-29 | 2016-05-12 | Intellijoint Surgical Inc. | Surgical navigation system with one or more body borne components and method therefor |
- 2015
- 2015-10-29 US US14/926,963 patent/US20160128783A1/en not_active Abandoned
- 2015-10-29 AU AU2015337755A patent/AU2015337755B2/en active Active
- 2015-10-29 US US15/522,503 patent/US20170333136A1/en not_active Abandoned
- 2015-10-29 EP EP15856161.3A patent/EP3212110B1/en active Active
- 2015-10-29 WO PCT/CA2015/000558 patent/WO2016065457A1/en active Application Filing
- 2015-10-29 WO PCT/CA2015/000560 patent/WO2016065459A1/en active Application Filing
- 2015-10-29 WO PCT/CA2015/000559 patent/WO2016065458A1/en active Application Filing
- 2015-10-29 US US15/522,559 patent/US10682185B2/en active Active
- 2016
- 2016-05-06 US US15/148,084 patent/US9603671B2/en active Active
- 2017
- 2017-02-06 US US15/425,690 patent/US9713506B2/en active Active
- 2017-07-21 US US15/656,347 patent/US10034715B2/en active Active
- 2018
- 2018-07-16 US US16/036,182 patent/US10786312B2/en active Active
- 2019
- 2019-10-15 AU AU2019250135A patent/AU2019250135B2/en active Active
- 2020
- 2020-05-11 US US16/871,982 patent/US10898278B2/en active Active
- 2020-06-15 US US16/901,730 patent/US11583344B2/en active Active
- 2021
- 2021-01-07 US US17/143,791 patent/US12035977B2/en active Active
- 2021-06-04 AU AU2021203699A patent/AU2021203699B2/en active Active
- 2022
- 2022-06-03 AU AU2022203868A patent/AU2022203868B2/en active Active
- 2022-12-15 US US18/082,016 patent/US20230119870A1/en active Pending
- 2023
- 2023-03-31 US US18/129,698 patent/US11896321B2/en active Active
- 2023-03-31 US US18/129,724 patent/US11890064B2/en active Active
- 2024
- 2024-10-23 AU AU2024227568A patent/AU2024227568A1/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7491198B2 (en) * | 2003-04-28 | 2009-02-17 | Bracco Imaging S.P.A. | Computer enhanced surgical navigation imaging system (camera probe) |
WO2010067267A1 (en) * | 2008-12-09 | 2010-06-17 | Philips Intellectual Property & Standards Gmbh | Head-mounted wireless camera and display unit |
US20150080740A1 (en) * | 2013-09-18 | 2015-03-19 | iMIRGE Medical INC. | Optical targeting and visualization of trajectories |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018166718A1 (en) | 2017-03-15 | 2018-09-20 | Zf Friedrichshafen Ag | Arrangement and method for determining a gradient signal in a vehicle |
DE102017204306A1 (en) | 2017-03-15 | 2018-09-20 | Zf Friedrichshafen Ag | Arrangement and method for determining a gradient signal in a vehicle |
US11472415B2 (en) | 2017-03-15 | 2022-10-18 | Zf Friedrichshafen Ag | Arrangement and method for determining a gradient signal in a vehicle |
US10702344B2 (en) | 2017-04-24 | 2020-07-07 | Intellijoint Surgical Inc. | Cutting tools, systems and methods for navigated bone alterations |
US11564753B2 (en) | 2017-04-24 | 2023-01-31 | Intellijoint Surgical Inc. | Cutting tools, systems and methods for navigated bone alterations |
WO2020160388A1 (en) * | 2019-01-31 | 2020-08-06 | Brain Corporation | Systems and methods for laser and imaging odometry for autonomous robots |
US12138812B2 (en) | 2019-01-31 | 2024-11-12 | Brain Corporation | Systems and methods for laser and imaging odometry for autonomous robots |
CN114615948A (en) * | 2019-10-18 | 2022-06-10 | 完整植入物有限公司Dba阿克塞勒斯 | Surgical navigation system |
CN113384347A (en) * | 2021-06-16 | 2021-09-14 | 瑞龙诺赋(上海)医疗科技有限公司 | Robot calibration method, device, equipment and storage medium |
US20230355310A1 (en) * | 2022-05-06 | 2023-11-09 | Stryker European Operations Limited | Technique For Determining A Visualization Based On An Estimated Surgeon Pose |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20160128783A1 (en) | Surgical navigation system with one or more body borne components and method therefor | |
US11883117B2 (en) | Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums | |
JP7216764B2 (en) | Alignment of Surgical Instruments with Reference Arrays Tracked by Cameras in Augmented Reality Headsets for Intraoperative Assisted Navigation | |
US11839435B2 (en) | Extended reality headset tool tracking and control | |
JP7478106B2 (en) | Extended reality visualization of optical instrument tracking volumes for computer-assisted navigation in surgery | |
US11382713B2 (en) | Navigated surgical system with eye to XR headset display calibration | |
US20250017685A1 (en) | Augmented reality headset for navigated robotic surgery | |
EP3970650B1 (en) | Augmented reality navigation systems for use with robotic surgical systems | |
US12239388B2 (en) | Camera tracking bar for computer assisted navigation during surgery | |
US11737831B2 (en) | Surgical object tracking template generation for computer assisted navigation during surgical procedure | |
US11690697B2 (en) | Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment | |
EP3922203A1 (en) | Surgical object tracking in visible light via fiducial seeding and synthetic image registration | |
JP7282816B2 (en) | Extended Reality Instrument Interaction Zones for Navigated Robotic Surgery | |
US20240164844A1 (en) | Bone landmarks extraction by bone surface palpation using ball tip stylus for computer assisted surgery navigation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: INTELLIJOINT SURGICAL INC., CANADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HLADIO, ANDRE NOVOMIR;BAKIRTZIAN, ARMEN GARO;FANSON, RICHARD TYLER;SIGNING DATES FROM 20151130 TO 20151201;REEL/FRAME:037249/0685 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
AS | Assignment |
Owner name: CANADIAN IMPERIAL BANK OF COMMERCE, CANADA Free format text: SECURITY INTEREST;ASSIGNOR:INTELLIJOINT SURGICAL INC.;REEL/FRAME:050820/0965 Effective date: 20191016 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |