US12236536B2 - System and method for location determination using a mixed reality device and a 3D spatial mapping camera - Google Patents
System and method for location determination using a mixed reality device and a 3D spatial mapping camera
- Publication number
- US12236536B2 (application US17/486,677, US202117486677A)
- Authority
- US
- United States
- Prior art keywords
- jig
- bone
- mixed reality
- reality headset
- virtual
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/90—Identification means for patients or instruments, e.g. tags
- A61B90/92—Identification means for patients or instruments, e.g. tags coded with colour
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/90—Identification means for patients or instruments, e.g. tags
- A61B90/94—Identification means for patients or instruments, e.g. tags coded with symbols, e.g. text
- A61B90/96—Identification means for patients or instruments, e.g. tags coded with symbols, e.g. text using barcodes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/75—Determining position or orientation of objects or cameras using feature-based methods involving models
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/102—Modelling of surgical devices, implants or prosthesis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/107—Visualisation of planned trajectories or target regions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2048—Tracking techniques using an accelerometer or inertia sensor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
- A61B2034/2057—Details of tracking cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/372—Details of monitor hardware
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/373—Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3937—Visible markers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/50—Supports for surgical instruments, e.g. articulated arms
- A61B2090/502—Headgear, e.g. helmet, spectacles
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2008—Assembling, disassembling
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2012—Colour editing, changing, or manipulating; Use of colour codes
Definitions
- the present disclosure relates generally to surgical systems and methods of facilitating the efficiency and accuracy of implanting surgical prostheses using mixed reality and 3D spatial mapping devices.
- a surgeon will utilize a metal jig which is used as a drilling or cutting guide to make the necessary corresponding cuts and holes in the bone of the knee to facilitate placement and attachment of the implant to the bone.
- these metal jigs must be stocked in a variety of sizes to accommodate the different needs and sizes of patients; accordingly, significant stocks of metal jigs must be stored and sterilized.
- use of these metal jigs introduces inherent inaccuracies as the surgeon fixes the metal jig with respect to the corresponding bone during use as a drill or cutting guide.
- the femoral implant and tibial implant are designed to be surgically implanted into the distal end of the femur and the proximal end of the tibia, respectively.
- the femoral implant is further designed to cooperate with the tibial implant in simulating the articulating motion of an anatomical knee joint.
- femoral and tibial implants, in combination with ligaments and muscles, attempt to duplicate natural knee motion as well as absorb and control forces generated during the range of flexion. In some instances, however, it may be necessary to replace or modify an existing femoral and/or tibial implant. Such replacements are generally referred to as revision implants.
- the femur and tibia bones must be cut in very specific and precise ways and at very specific and precise angles and locations, so that the prepared bone will properly engage with and be secured to the corresponding implants.
- a surgeon traditionally uses a jig, or surgical cutting guide as known to those skilled in the field, which can be removably attached or secured to the bone, such that slots, or guides, in the jig facilitate the precise cuts necessary to secure the corresponding implants.
- jig shall thus refer broadly to a surgical cutting guide that may be configured and arranged to be fixed or attached to a bone, or secured adjacent to a bone or other tissue to be cut by a surgeon, and that identifies a relative location, angle, and/or cutting plane at which a surgeon should cut or drill the adjacent bone or tissue, as known in the art.
- a jig may include predetermined slots and/or cutting surfaces to identify where a surgeon should cut the adjacent bone or tissue, wherein such cuts may correspond to a shape of a surgical implant that may be attached to the cut bone or tissue.
- a “cutting surface” may refer to a guide edge for guiding the path of a cutting instrument.
- jigs are typically made of a metal alloy and, due to the precise tolerances at which these jigs must be machined, are quite expensive, ranging as high as $40,000-$50,000 in some cases. These metal jigs must also be stored and reused, which adds additional cost and space resources. Additionally, jigs of various sizes must be kept on hand to accommodate patients of different sizes and needs.
- holographic jigs, also referred to as virtual jigs, have been used to enable a surgeon to visualize the positioning and proper sizing of a jig to a bone.
- the physical jig will impair the view of the virtual or holographic jig, making it difficult to utilize the holographic jig to accurately place the physical jig.
- “virtual jig” or “holographic jig,” as used herein, shall thus refer broadly to any visualization, visual rendering, or projection representing an actual physical jig, having all, or mostly all, of the same visual characteristics of the physical jig, including size and shape, as known in the art.
- FIG. 1 is a schematic rendering of a mixed reality system of the present disclosure.
- FIG. 2 is a perspective view of another embodiment of a mixed reality system of the present disclosure.
- FIG. 3 is a side view of a pin guide of the present disclosure.
- FIG. 4 is a front view of a user perspective of a mixed reality headset viewer of the present disclosure.
- FIG. 5 is a front view of another user perspective of a mixed reality headset viewer, during a procedure, of the present disclosure.
- FIG. 6 is a front view of a further user perspective of a mixed reality headset viewer, during a procedure, of the present disclosure.
- the terms “virtual” and “hologram” are used interchangeably, and grammatical equivalents thereof are inclusive or open-ended terms that do not exclude additional, unrecited elements or method steps. These terms are used to describe visual representations of an actual physical device or element, having all, or mostly all, of the same visual characteristics of the physical device, including size and shape.
- Applicant has discovered a novel system and method for generating and using a virtual jig, or virtual instrument, in a surgical procedure, for example, in a knee or tibial implant procedure, or other desired surgical procedure.
- virtual system shall refer broadly to any system capable of generating or creating a simulated or virtual rendering or projection of physical or structural features identical or substantially identical to an actual physical device, instrument or other physical structure, as known in the art.
- a virtual system may also include a device, mechanism, or instrument capable of projecting or displaying the desired simulated or virtual rendering or projection of physical or structural features identical or substantially identical to an actual physical device.
- a virtual system may also enable a user to manipulate, move and/or modify the simulated or virtual rendering or projection.
- mixed or augmented reality system shall refer broadly to any system capable of generating or creating a simulated or virtual rendering or projection of physical or structural features identical or substantially identical to an actual physical device, instrument or other physical structure, as known in the art.
- a mixed or augmented reality system may also include a device, mechanism, or instrument capable of projecting or displaying the desired simulated or virtual rendering or projection of physical or structural features identical or substantially identical to an actual physical device, overlaid or concurrently with actual physical structures, mechanisms, or devices in reality, thus incorporating the virtual rendering or projection in real world settings with actual physical elements.
- a mixed or augmented reality system may also enable a user to manipulate, move and/or modify the simulated or virtual rendering or projection.
- mixed or augmented reality instrument shall refer broadly to any device, mechanism or instrument used in a mixed or augmented reality system, including a device capable of generating or creating a simulated or virtual rendering or projection of physical or structural features identical or substantially identical to an actual physical device, instrument or other physical structure, as known in the art.
- a mixed or augmented reality instrument may also be capable of projecting or displaying the desired simulated or virtual rendering or projection of physical or structural features identical or substantially identical to an actual physical device, overlaid or concurrently with actual physical structures, mechanisms, or devices in reality, thus incorporating the virtual rendering or projection in real world settings with actual physical elements.
- a mixed or augmented reality instrument may also enable a user to manipulate, move and/or modify the simulated or virtual rendering or projection.
- holographic representation shall refer broadly to a visualization or visual rendering or projection representing an actual physical device or element, having all, or mostly all, of the same visual characteristics of the corresponding physical device or element, including size and shape, as known in the art.
- a mixed or augmented reality system 100 can be used to produce, or display, a desired mixed or augmented reality instrument, such as a virtual jig or cutting guide, in a display to a surgeon or user, or stated another way, in a form that is visible and manipulatable by a surgeon or user.
- the mixed or augmented reality system 100 may also enable a user to activate or deactivate, in full or in part, the virtual instrument or instruments, making a virtual instrument appear or disappear, as desired in a mixed reality assisted surgery, for example.
- the mixed or augmented reality system 100 may include a mixed or augmented reality headset 102 which may include a transparent or mostly transparent viewer 104 which can be suspended or positioned in front of a user's eyes.
- the headset 102 may include a headband 106 attached to the viewer 104 , which may be used to secure the headset 102 to a user's head 108 , thereby securing the viewer 104 in place in front of the user's eyes.
- the transparent viewer 104 may be configured to project, or otherwise make viewable, on an interior surface of the viewer 104 , a holographic image or images, such as a virtual device, for example, a virtual cutting guide, which may be positionally manipulated by the user, surgeon, third party or remote system, such as a remote computer system.
- the headset 102 may be configured to view holographic images or, alternatively, the holographic images may be turned off and the user wearing the headset 102 may be able to view the surrounding environment through the transparent viewer 104 , unobstructed.
- a user, such as a surgeon for example, can wear the mixed or augmented reality headset 102 and can choose to activate a holographic image to aid in facilitating a surgical procedure and then shut off the holographic image in order to perform the surgical procedure visually unobscured.
- One embodiment of the disclosed headset 102 may be a product created and manufactured by Microsoft, known as the HoloLens® mixed or augmented reality system, or any suitable mixed or augmented reality system for generating virtual images viewable by a user or surgeon.
- Headset 102 may be a conventional “off the shelf” product with a built-in platform that enables all of the features described herein with respect to the headset 102 .
- the headset 102, such as a Microsoft HoloLens product, can be loaded or preloaded with all desired or required virtual instruments, including virtual jigs or surgical cutting guides, virtual drill bits, and/or a virtual target which can identify relative locations of a plurality of holes to be drilled by a surgeon to facilitate the fastening of a jig or other device onto a desired bone at the proper desired location, and any other desired virtual instruments or holograms.
- the Microsoft HoloLens product and its capabilities and features, or any suitable mixed or augmented reality system such as is described herein with respect to the headset 102 , are known to those skilled in the art.
- the mixed reality system 100 may also include a computer or computer system 200 having enabling software to communicate with the headset 102 , by both receiving information from the headset 102 and transmitting data and images to the headset 102 . It is therefore to be understood, by way of the circuit diagram and dashed lines shown in FIG. 1 , that headset 102 is electronically connected to the computer system 200 and a 3D spatial mapping camera 300 .
- the 3D spatial mapping camera 300 is electronically connected to the headset 102 and the computer system 200, as shown by the circuit diagram and dashed lines in FIG. 1. While the 3D spatial mapping camera 300 may be electronically connected to the headset 102, the 3D spatial mapping camera 300 may be separate from and not mechanically connected to the headset 102.
- the mixed reality system 100 may also include a 3D spatial mapping camera 300 .
- One embodiment of the disclosed spatial mapping camera 300 may be a product created and manufactured by Microsoft, known as the Azure Kinect®, or any suitable 3D spatial mapping camera capable of continuous 3D mapping and transmitting corresponding 3D images, such as of bones, anatomy, or other desired 3D objects.
- the spatial mapping camera 300 may be a conventional “off the shelf” product with a built-in platform that enables all of the features described herein with respect to the spatial mapping camera 300.
- the spatial mapping camera 300, such as a Microsoft Azure Kinect product, can be loaded or preloaded with all necessary software to enable wireless communication between the spatial mapping camera 300 and the computer system 200 and/or the headset 102.
- the Microsoft Azure Kinect product and its capabilities and features, or any suitable 3D spatial mapping camera such as is described herein with respect to the spatial mapping camera 300 , are known to those skilled in the art.
- the spatial mapping camera 300 may include sensor software development kits for low-level sensor and device access, body tracking software development kits for tracking bodies or objects in 3D, and speech cognitive services software development kits for enabling microphone access and cloud-based or wireless speech services.
- the spatial mapping camera 300 may include the following features: depth camera access and mode control (a passive IR mode, plus wide and narrow field-of-view depth modes); RGB camera access and control (for example, exposure and white balance); a motion sensor (gyroscope and accelerometer) access; synchronized depth-RGB camera streaming with configurable delay between cameras; external device synchronization control with configurable delay offset between devices; camera frame meta-data access for image resolution, timestamp, etc.; and device calibration data access.
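- By way of illustration only, the following minimal sketch shows how the depth, color, point-cloud, and calibration data described above might be accessed from an Azure Kinect-class device, assuming the open-source pyk4a Python bindings; the configuration values and attribute names are assumptions of this sketch, not part of the disclosure.

```python
# Minimal sketch, assuming the open-source pyk4a bindings for an Azure Kinect-class camera.
from pyk4a import PyK4A, Config, ColorResolution, DepthMode

# Configure synchronized depth + RGB streaming (values are illustrative).
camera = PyK4A(Config(color_resolution=ColorResolution.RES_720P,
                      depth_mode=DepthMode.NFOV_UNBINNED,
                      synchronized_images_only=True))
camera.start()

capture = camera.get_capture()
depth_mm = capture.depth              # uint16 depth image, in millimetres
color_bgra = capture.color            # BGRA color image
cloud_mm = capture.depth_point_cloud  # (H, W, 3) point cloud in the depth camera frame
calibration = camera.calibration      # intrinsics/extrinsics for coordinate conversion

camera.stop()
```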
- the spatial mapping camera 300 may also include software development kits that provide: a viewer tool to monitor device data streams and configure different modes; a sensor recording tool and playback reader API that uses the Matroska container format, as known to those of ordinary skill in the field; a body tracking library and runtime that, when used with the spatial mapping camera 300, tracks bodies in 3D, contains an anatomically correct skeleton for each partial or full body, provides a unique identity for each body, and can track bodies over time; and speech services, such as speech-to-text, speech translation, and text-to-speech.
- the headset 102, computer system 200, and spatial mapping camera 300 may be programmed and configured to enable a surgeon 107 to see and manipulate a virtual, or holographic, target or jig with respect to a patient's bone 400, anatomy, or any other desired location, which may receive a surgical implant.
- the headset 102, computer system 200, and spatial mapping camera 300 may communicate with one another via a local network connection, Wi-Fi, Bluetooth, or any other known wireless communication signal.
- the spatial mapping camera 300 may utilize such enabling software to map the bone 400 and/or jig 500, or other desired anatomy, to help identify the proper location for fastening a jig, or other device, to the bone 400, prior to cutting the knee.
- the spatial mapping camera 300 differs from traditional imaging, such as an MRI, CT scan, x-ray or the like, in many ways.
- the spatial mapping camera 300 may be mounted, fixedly or moveably, in an operating room, thus giving the most up-to-date mapping information possible.
- the spatial mapping camera 300 may also continuously map the surface and 3D contours of the bone 400 to provide the surgeon 107 with realtime feedback and information to make the proper drill holes, cuts, or other preparations on the bone 400 , before, during, and/or after any given procedure.
- the mixed reality system 100 may also include an alignment jig 500 that can be secured to the exposed bone 400 , or other desired anatomy.
- the jig 500 may include a first marker 502, which may be attached to the jig 500 at a fixed location on the bone 400.
- the first marker 502 may include a scannable, optical or visual label 504, such as a QR code.
- the jig 500 may also include a second marker 510 that may be moveable with respect to the jig 500 and the bone 400 .
- the second marker 510 may also include a scannable, optical or visual label 512, such as a QR code.
- the mixed reality system 100 may also include a surgical tool 600, such as a drill for example, that may also include a scannable label 602, such as a QR code, which can be scanned by the headset 102 and/or spatial mapping camera 300 and which may encode data related to the jig 500, the surgical procedure, and/or the corresponding patient.
- This data may be transmitted to and processed by the computer system 200, which may then prompt or identify to the surgeon 107, via the headset 102, where, and at what angle, to position the tool 600 to drill, cut, or otherwise prepare the bone 400 at the proper location on the bone 400.
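- As a purely illustrative sketch of how a scannable label such as a QR code might be detected and decoded from a camera frame, the following uses OpenCV's QRCodeDetector; the label text format, file name, and any downstream lookup keyed by the decoded string are assumptions of this sketch, not a description of the actual system.

```python
# Illustrative sketch only: decode a QR-style label from a camera frame with OpenCV.
import cv2

detector = cv2.QRCodeDetector()

def read_label(frame):
    """Return the decoded label text and its four pixel corners, if a label is found."""
    data, corners, _ = detector.detectAndDecode(frame)
    if not data:
        return None, None
    return data, corners.reshape(-1, 2)   # e.g. "JIG-500|CASE-1234" (hypothetical) and 4x2 corners

frame = cv2.imread("operating_room_frame.png")   # hypothetical captured frame
label, corners = read_label(frame)
if label is not None:
    print("Scanned label:", label)               # could key a lookup of jig/procedure data
```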
- the headset 102 and the spatial mapping camera 300 may utilize a physical calibration tool 700 to calibrate and align 3D coordinates with the 3D contours of the exposed bone 400 or anatomy.
- the calibration tool 700 may include a specially designed calibration pattern 702 that may be scanned by the headset 102 and/or the spatial mapping camera 300, which can then send the corresponding calibration data to the computer system 200, which can in turn use the calibration data to provide accurate coordinates to the headset 102 to identify where to drill, cut, or otherwise prepare the bone 400.
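- One conventional way to align the headset's coordinate frame with the spatial mapping camera's frame, given matched 3D points observed on a calibration pattern by both devices, is a least-squares rigid (Kabsch) fit; the sketch below is a generic illustration under that assumption, with made-up point values, and is not taken from the disclosure.

```python
# Generic Kabsch/Procrustes fit: rigid transform mapping points A (camera frame)
# onto matched points B (headset frame). Illustrative only.
import numpy as np

def rigid_transform(A, B):
    """A, B: (N, 3) matched 3D points. Returns rotation R (3x3) and translation t (3,)."""
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    H = (A - cA).T @ (B - cB)          # cross-covariance of centred point sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # correct an improper rotation (reflection)
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cB - R @ cA
    return R, t

# Usage: points on the calibration pattern 702 seen by both devices (hypothetical values).
camera_pts = np.array([[0.0, 0.0, 0.5], [0.1, 0.0, 0.5], [0.0, 0.1, 0.5], [0.1, 0.1, 0.6]])
headset_pts = camera_pts @ np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]]).T + [0.2, 0.0, 0.1]
R, t = rigid_transform(camera_pts, headset_pts)   # headset_point ≈ R @ camera_point + t
```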
- the mixed reality system 100 may also include a pin guide 800 that may also include a scannable label 802 , such as a QR code, which can be scanned by the headset 102 , which may then prompt or identify to the surgeon 107 where, and at what angle, to position the pin guide 800 to perform a desired procedure, at the proper location on the bone 400 .
- the pin guide 800 may also include a guide sleeve 804 which can provide a directional guide to receive and orient a pin (not shown) to be placed by the surgeon during a procedure, enabling the surgeon to drill, cut, or otherwise prepare the bone 400.
- a surgeon or user may perform a surgical procedure by first exposing the bone 400 , or other desired anatomy.
- the spatial mapping camera 300 may then continuously 3D spatially map the exposed bone 400 , and send the corresponding mapping information to the computer system 200 .
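- A minimal sketch of what continuous mapping with transmission to a separate computer could look like, again assuming pyk4a for capture and an ordinary TCP socket for transport; the host, port, frame count, and serialization are illustrative assumptions only.

```python
# Illustrative only: stream successive point clouds from the mapping camera to a computer.
import pickle
import socket

from pyk4a import PyK4A

camera = PyK4A()
camera.start()

with socket.create_connection(("192.168.1.50", 9000)) as link:   # hypothetical computer system 200
    for _ in range(300):                         # ~10 s at 30 fps; a real loop would run until stopped
        capture = camera.get_capture()
        cloud = capture.depth_point_cloud        # (H, W, 3) points in millimetres
        if cloud is None:
            continue
        payload = pickle.dumps(cloud[::4, ::4])  # downsample before sending
        link.sendall(len(payload).to_bytes(8, "big") + payload)   # length-prefixed frame

camera.stop()
```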
- multiple spatial mapping cameras 300 may be used in unison, to provide more accurate and expansive imaging.
- the surgeon may then attach the jig 500 to the exposed bone 400 , at a predetermined or desired location.
- the spatial mapping camera 300 may spatially map the jig 500 and the exposed bone 400 , to map the surface of the exposed bone and relative location of the jig 500 .
- the surgeon may then utilize the headset 102 or spatial mapping camera 300 to scan the fixed scannable label 504 of the first marker 502 and send the corresponding information to the computer system 200 .
- the computer system 200 may then utilize data from the 3D spatial mapping camera 300 and the scannable label 504 of the first marker 502 to determine the orientation of the moveable second marker 510 and send the data to the headset 102.
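- For illustration, the orientation of a planar marker such as the second marker 510 can be estimated from the pixel corners of its scanned label and the camera intrinsics using OpenCV's solvePnP; the label size, intrinsics, and corner ordering below are assumptions of this sketch, not values from the disclosure.

```python
# Illustrative sketch: estimate a marker's pose from its label corners with OpenCV.
import cv2
import numpy as np

LABEL_SIZE_MM = 30.0                        # assumed printed size of the square label
half = LABEL_SIZE_MM / 2.0
object_pts = np.array([[-half,  half, 0],   # label corners in the marker's own frame,
                       [ half,  half, 0],   # ordered to match the detector's output
                       [ half, -half, 0],
                       [-half, -half, 0]], dtype=np.float64)

camera_matrix = np.array([[600.0, 0.0, 320.0],    # assumed intrinsics (fx, fy, cx, cy)
                          [0.0, 600.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

def marker_pose(image_corners):
    """image_corners: (4, 2) pixel corners of the label. Returns rotation matrix and translation (mm)."""
    ok, rvec, tvec = cv2.solvePnP(object_pts, np.asarray(image_corners, dtype=np.float64),
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None, None
    R, _ = cv2.Rodrigues(rvec)              # axis-angle -> 3x3 rotation matrix
    return R, tvec.ravel()
```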
- the headset 102 may generate a holographic image 900 , or visualization, viewable by the surgeon on the viewer 104 , such that the surgeon can maintain a real-world view of the bone 400 and simultaneously view the holographic image 900 .
- the holographic image 900 may include a realtime rendering of the exposed bone 400 and a realtime rendering of the jig 500 .
- the headset 102 can then scan the scannable label 512 of the second marker 510 and identify the proper or required position of the second marker 510 , relative to the bone 400 . This identified position of the second marker 510 may be viewed by the surgeon in holographic image 900 .
- the surgeon can then manipulate the rendering of the second marker 510 in the holographic image 900 , until the proper position is set and determined.
- the surgeon can use the headset 102 to lock the holographic image 900 in place. Then the surgeon can manipulate the actual physical second marker 510 to substantially match the positioning of the set rendering of the second marker 510 .
- a target 514 located on the second marker 510 may provide the surgeon with the substantially exact location to drill a required hole, or place a pin, with respect to the exposed bone 400 , such that the surgeon can manipulate the location of the jig with respect to the bone 400 , to substantially match the location of the virtual jig, or holographic image 900 , with respect to a virtual bone.
- the headset 102 may help facilitate the proper orientation of the second marker 510, and corresponding target 514, by illuminating the target 514 or by providing a colored symbol. Either may begin in one color, such as red, and change to another color, such as green, when the target is ultimately moved into the proper position, by operation of a microprocessor (not shown) contained within the headset 102 or computer system 200, the microprocessor being programmed, as known to those skilled in the art of programming, to trigger the change of color when the target is moved into the proper position.
- the headset 102 may then scan the scannable label 602 of tool 600 or scannable label 802 of pin guide 800 and identify the location and angle of the tool 600 or pin guide 800 , to make the required drill hole, or pin insertion.
- the headset 102 may help facilitate the proper orientation of the tool 600 or pin guide 800 by providing a colored symbol 750, for example red, that may change color, for example to green, when the tool 600 or pin guide 800 is ultimately moved into the proper position.
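- A simple way to implement such a red-to-green cue is to compare the tracked pose of the tool or pin guide against the planned pose and switch the symbol's color once both position and angle errors fall within tolerances; the tolerance values and pose representation below are illustrative assumptions.

```python
# Illustrative only: choose the color of symbol 750 from position/angle error tolerances.
import numpy as np

def symbol_color(tool_tip_mm, tool_axis, planned_tip_mm, planned_axis,
                 pos_tol_mm=2.0, angle_tol_deg=2.0):
    """Return 'green' when the tool is within tolerance of the planned entry point and axis."""
    pos_err = float(np.linalg.norm(np.asarray(tool_tip_mm) - np.asarray(planned_tip_mm)))
    a = np.asarray(tool_axis, dtype=float); a /= np.linalg.norm(a)
    b = np.asarray(planned_axis, dtype=float); b /= np.linalg.norm(b)
    angle_err = float(np.degrees(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))))
    return "green" if pos_err <= pos_tol_mm and angle_err <= angle_tol_deg else "red"

# Example: ~1 mm and ~1 degree off -> still "green" under the assumed tolerances.
print(symbol_color([10.0, 20.0, 5.0], [0.0, 0.02, 1.0], [10.5, 20.5, 5.5], [0.0, 0.0, 1.0]))
```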
- the surgeon may then use the properly oriented tool 600 or pin guide 800 to perform the corresponding procedure or preparation of the bone 400 .
- a cutting jig (not shown) may then be attached to the bone 400 at the prepared position.
- a surgical jig is conventionally a surgical tool that may be used to help a surgeon make predetermined and accurate cuts of a desired bone to facilitate attachment of a surgical implant.
- a jig may have one or a series of slots located at specific predetermined locations and at specific predetermined angles, with respect to a body of the jig, such that when the jig is attached to a bone surface, the surgeon can make precise and accurate cuts, using the jig as a guide, without the need of additional measurements.
- the jig 500 may be made of plastic, metal, polyamide, or any other desired material. Manufacturing the jig 500 out of a plastic or polyamide material, or other relatively inexpensive material, may allow the jig 500 to be disposable, while still maintaining the precision and accuracy of traditional metal jigs.
- the jig 500 may also be manufactured using a 3D printer, which can further reduce the cost of manufacturing and storage of jigs, since 3D printed jigs could be made on demand, customized to the size and shape required by individual patients and users.
- the physical jig 500 may also be manufactured using any other known technique for forming a physical jig.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Veterinary Medicine (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Public Health (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Pathology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Robotics (AREA)
- Architecture (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Surgical Instruments (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Claims (2)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/486,677 US12236536B2 (en) | 2020-08-17 | 2021-09-27 | System and method for location determination using a mixed reality device and a 3D spatial mapping camera |
Applications Claiming Priority (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202016994662A | 2020-08-17 | 2020-08-17 | |
| US202017030351A | 2020-09-23 | 2020-09-23 | |
| US202117169274A | 2021-02-05 | 2021-02-05 | |
| US202117473867A | 2021-09-13 | 2021-09-13 | |
| US17/486,677 US12236536B2 (en) | 2020-08-17 | 2021-09-27 | System and method for location determination using a mixed reality device and a 3D spatial mapping camera |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US202117473867A Continuation | 2020-08-17 | 2021-09-13 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20220051483A1 US20220051483A1 (en) | 2022-02-17 |
| US12236536B2 true US12236536B2 (en) | 2025-02-25 |
Family
ID=80224275
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/486,677 Active US12236536B2 (en) | 2020-08-17 | 2021-09-27 | System and method for location determination using a mixed reality device and a 3D spatial mapping camera |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US12236536B2 (en) |
Families Citing this family (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3057524B1 (en) | 2013-10-10 | 2019-11-20 | Imascap | Method for designing and producing a shoulder surgery guide |
| WO2019245865A1 (en) | 2018-06-19 | 2019-12-26 | Tornier, Inc. | Mixed reality indication of points at which 3d bone and implant models collide |
| US12236536B2 (en) | 2020-08-17 | 2025-02-25 | Russell Todd Nevins | System and method for location determination using a mixed reality device and a 3D spatial mapping camera |
| US11571225B2 (en) | 2020-08-17 | 2023-02-07 | Russell Todd Nevins | System and method for location determination using movement between optical labels and a 3D spatial mapping camera |
| US20220070353A1 (en) * | 2020-08-27 | 2022-03-03 | Shield Ai | Unmanned aerial vehicle illumination system |
| US20220331008A1 (en) | 2021-04-02 | 2022-10-20 | Russell Todd Nevins | System and method for location determination using movement of an optical label fixed to a bone using a spatial mapping camera |
| US11600053B1 (en) | 2021-10-04 | 2023-03-07 | Russell Todd Nevins | System and method for location determination using a mixed reality device and multiple imaging cameras |
| GB2618853A (en) * | 2022-05-20 | 2023-11-22 | Drill Surgeries Ltd | Surgical guide system for assisting a user controlling a surgical tool |
Citations (77)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE10103922A1 (en) | 2001-01-30 | 2002-08-01 | Physoptics Opto Electronic Gmb | Interactive data viewing and operating system |
| US20050251030A1 (en) | 2004-04-21 | 2005-11-10 | Azar Fred S | Method for augmented reality instrument placement using an image based navigation system |
| WO2007108776A2 (en) | 2005-12-31 | 2007-09-27 | Bracco Imaging S.P.A. | Systems and methods for collaborative interactive visualization of 3d data sets over a network ('dextronet') |
| US20070270685A1 (en) * | 2006-05-19 | 2007-11-22 | Mako Surgical Corp. | Method and apparatus for controlling a haptic device |
| US7331929B2 (en) | 2004-10-01 | 2008-02-19 | General Electric Company | Method and apparatus for surgical operating room information display gaze detection and user prioritization for control |
| US20080183179A1 (en) | 2004-05-22 | 2008-07-31 | Thomas Siebel | Surgical Jig |
| US20090163923A1 (en) | 2004-02-27 | 2009-06-25 | Magnus Flett | Surgical jig and methods of use |
| US7812815B2 (en) | 2005-01-25 | 2010-10-12 | The Broad of Trustees of the University of Illinois | Compact haptic and augmented virtual reality system |
| WO2012033739A2 (en) | 2010-09-08 | 2012-03-15 | Disruptive Navigational Technologies, Llc | Surgical and medical instrument tracking using a depth-sensing device |
| US8560047B2 (en) | 2006-06-16 | 2013-10-15 | Board Of Regents Of The University Of Nebraska | Method and apparatus for computer aided surgery |
| US20140222462A1 (en) | 2013-02-07 | 2014-08-07 | Ian Shakil | System and Method for Augmenting Healthcare Provider Performance |
| US8876830B2 (en) | 2009-08-13 | 2014-11-04 | Zimmer, Inc. | Virtual implant placement in the OR |
| US8954181B2 (en) | 2010-12-07 | 2015-02-10 | Sirona Dental Systems Gmbh | Systems, methods, apparatuses, and computer-readable storage media for designing and manufacturing custom dental preparation guides |
| US8956165B2 (en) | 2008-01-25 | 2015-02-17 | University Of Florida Research Foundation, Inc. | Devices and methods for implementing endoscopic surgical procedures and instruments within a virtual environment |
| WO2015134953A1 (en) | 2014-03-06 | 2015-09-11 | Virtual Reality Medical Applications, Inc. | Virtual reality medical application system |
| US9165318B1 (en) * | 2013-05-29 | 2015-10-20 | Amazon Technologies, Inc. | Augmented reality presentation |
| US20160191887A1 (en) | 2014-12-30 | 2016-06-30 | Carlos Quiles Casas | Image-guided surgery with surface reconstruction and augmented reality visualization |
| DE102015212352A1 (en) | 2015-07-01 | 2017-01-05 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Method, arrangement and computer program product for the position detection of an object to be examined |
| US9563266B2 (en) | 2012-09-27 | 2017-02-07 | Immersivetouch, Inc. | Haptic augmented and virtual reality system for simulation of surgical procedures |
| WO2017066373A1 (en) | 2015-10-14 | 2017-04-20 | Surgical Theater LLC | Augmented reality surgical navigation |
| US9730713B2 (en) | 2002-05-15 | 2017-08-15 | Howmedica Osteonics Corporation | Arthroplasty jig |
| US20170245781A1 (en) | 2016-02-29 | 2017-08-31 | Extremity Development Company, Llc | Smart drill, jig, and method of orthopedic surgery |
| US20170258526A1 (en) | 2016-03-12 | 2017-09-14 | Philipp K. Lang | Devices and methods for surgery |
| US20170287218A1 (en) * | 2016-03-30 | 2017-10-05 | Microsoft Technology Licensing, Llc | Virtual object manipulation within physical environment |
| US20170312032A1 (en) | 2016-04-27 | 2017-11-02 | Arthrology Consulting, Llc | Method for augmenting a surgical field with virtual guidance content |
| CN107430437A (en) | 2015-02-13 | 2017-12-01 | 厉动公司 | The system and method that real crawl experience is created in virtual reality/augmented reality environment |
| US20170367766A1 (en) | 2016-03-14 | 2017-12-28 | Mohamed R. Mahfouz | Ultra-wideband positioning for wireless ultrasound tracking and communication |
| WO2018007091A1 (en) | 2016-07-06 | 2018-01-11 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Imaging device for an operating theatre |
| US9892564B1 (en) | 2017-03-30 | 2018-02-13 | Novarad Corporation | Augmenting real-time views of a patient with three-dimensional data |
| US20180049622A1 (en) | 2016-08-16 | 2018-02-22 | Insight Medical Systems, Inc. | Systems and methods for sensory augmentation in medical procedures |
| US20180090029A1 (en) | 2016-09-29 | 2018-03-29 | Simbionix Ltd. | Method and system for medical simulation in an operating room in a virtual reality or augmented reality environment |
| US20180098813A1 (en) | 2016-10-07 | 2018-04-12 | Simbionix Ltd. | Method and system for rendering a medical simulation in an operating room in virtual reality or augmented reality environment |
| US9978141B2 (en) | 2015-04-17 | 2018-05-22 | Clear Guide Medical, Inc. | System and method for fused image based navigation with late marker placement |
| US10016243B2 (en) | 2015-03-23 | 2018-07-10 | Justin Esterberg | Systems and methods for assisted surgical navigation |
| WO2018132804A1 (en) | 2017-01-16 | 2018-07-19 | Lang Philipp K | Optical guidance for surgical, medical, and dental procedures |
| US20180240276A1 (en) | 2017-02-23 | 2018-08-23 | Vid Scale, Inc. | Methods and apparatus for personalized virtual reality media interface design |
| WO2018175971A1 (en) | 2017-03-24 | 2018-09-27 | Surgical Theater LLC | System and method for training and collaborating in a virtual environment |
| US10108266B2 (en) | 2012-09-27 | 2018-10-23 | The Board Of Trustees Of The University Of Illinois | Haptic augmented and virtual reality system for simulation of surgical procedures |
| US10194990B2 (en) | 2016-04-27 | 2019-02-05 | Arthrology Consulting, Llc | Method for augmenting a surgical field with virtual guidance content |
| US20190038362A1 (en) | 2017-08-02 | 2019-02-07 | Medtech S.A. | Surgical field camera system |
| WO2019051080A1 (en) | 2017-09-08 | 2019-03-14 | Surgical Theater LLC | Dual mode augmented reality surgical system and method |
| US20190076198A1 (en) * | 2017-09-12 | 2019-03-14 | Michael E. Berend | Femoral medial condyle spherical center tracking |
| US10241569B2 (en) | 2015-12-08 | 2019-03-26 | Facebook Technologies, Llc | Focus adjustment method for a virtual reality headset |
| US10285765B2 (en) | 2014-05-05 | 2019-05-14 | Vicarious Surgical Inc. | Virtual reality surgical device |
| US20190142520A1 (en) | 2017-11-14 | 2019-05-16 | Stryker Corporation | Patient-specific preoperative planning simulation techniques |
| GB2570758A (en) | 2017-11-24 | 2019-08-07 | Abhari Kamyar | Methods and devices for tracking objects by surgical navigation systems |
| US10401954B2 (en) | 2017-04-17 | 2019-09-03 | Intel Corporation | Sensory enhanced augmented reality and virtual reality device |
| US10405873B2 (en) | 2006-08-03 | 2019-09-10 | Orthosoft Ulc | Computer-assisted surgery tools and system |
| US10437335B2 (en) | 2015-04-14 | 2019-10-08 | John James Daniels | Wearable electronic, multi-sensory, human/machine, human/human interfaces |
| CN110431636A (en) | 2017-01-11 | 2019-11-08 | 奇跃公司 | medical assistant |
| US20190380792A1 (en) | 2018-06-19 | 2019-12-19 | Tornier, Inc. | Virtual guidance for orthopedic surgical procedures |
| US20200000527A1 (en) | 2017-02-01 | 2020-01-02 | Laurent CAZAL | Method and device for assisting a surgeon fit a prosthesis, in particular a hip prosthesis, following different surgical protocols |
| US20200037043A1 (en) | 2018-07-27 | 2020-01-30 | Telefonaktiebolaget L M Ericsson (Publ) | SYSTEM AND METHOD FOR INSERTING ADVERTISEMENT CONTENT IN 360º IMMERSIVE VIDEO |
| WO2020033568A2 (en) | 2018-08-07 | 2020-02-13 | Smith & Nephew Inc. | Patella tracking method and system |
| WO2020037308A1 (en) | 2018-08-17 | 2020-02-20 | Smith & Nephew, Inc. | Patient-specific surgical method and system |
| WO2020047051A1 (en) | 2018-08-28 | 2020-03-05 | Smith & Nephew, Inc. | Robotic assisted ligament graft placement and tensioning |
| US20200078100A1 (en) | 2017-01-03 | 2020-03-12 | Mako Surgical Corp. | Systems and methods for surgical navigation |
| US20200107003A1 (en) | 2018-10-01 | 2020-04-02 | Telefonaktiebolaget Lm Ericsson (Publ) | CLIENT OPTIMIZATION FOR PROVIDING QUALITY CONTROL IN 360º IMMERSIVE VIDEO DURING PAUSE |
| US10672288B2 (en) | 2014-09-08 | 2020-06-02 | SimX, Inc. | Augmented and virtual reality simulator for professional and educational training |
| WO2020145826A1 (en) | 2019-01-10 | 2020-07-16 | Augmedit B.V. | Method and assembly for spatial mapping of a model, such as a holographic model, of a surgical tool and/or anatomical structure onto a spatial position of the surgical tool respectively anatomical structure, as well as a surgical tool |
| US10716643B2 (en) | 2017-05-05 | 2020-07-21 | OrbisMV LLC | Surgical projection system and method |
| US20200275976A1 (en) | 2019-02-05 | 2020-09-03 | Smith & Nephew, Inc. | Algorithm-based optimization for knee arthroplasty procedures |
| US20200275988A1 (en) | 2017-10-02 | 2020-09-03 | The Johns Hopkins University | Image to world registration for medical augmented reality applications using a world spatial map |
| US20200302694A1 (en) | 2017-11-07 | 2020-09-24 | Koninklijke Philips N.V. | Augmented reality triggering of devices |
| US20200360093A1 (en) | 2019-04-22 | 2020-11-19 | Khan Surgical Systems, Inc. | System and method to conduct bone surgery |
| US20200375666A1 (en) | 2019-05-29 | 2020-12-03 | Stephen B. Murphy, M.D. | Systems and methods for augmented reality based surgical navigation |
| US10888399B2 (en) | 2016-12-16 | 2021-01-12 | Align Technology, Inc. | Augmented reality enhancements for dental practitioners |
| US20210088811A1 (en) * | 2019-09-24 | 2021-03-25 | Bespoke, Inc. d/b/a Topology Eyewear. | Systems and methods for adjusting stock eyewear frames using a 3d scan of facial features |
| US10980601B2 (en) | 2010-04-28 | 2021-04-20 | Ryerson University | System and methods for intraoperative guidance feedback |
| US20210142508A1 (en) * | 2018-02-03 | 2021-05-13 | The Johns Hopkins University | Calibration system and method to align a 3d virtual scene and a 3d real world for a stereoscopic head-mounted display |
| WO2021094354A1 (en) | 2019-11-11 | 2021-05-20 | Krueger Timo | Method and system for reproducing an insertion point for a medical instrument |
| US11045263B1 (en) | 2019-12-16 | 2021-06-29 | Russell Nevins | System and method for generating a virtual jig for surgical procedures |
| US20210228308A1 (en) | 2020-01-29 | 2021-07-29 | Siemens Healthcare Gmbh | Representation apparatus |
| US20210228286A1 (en) | 2018-05-10 | 2021-07-29 | Live Vue Technologies Inc. | System and method for assisting a user in a surgical procedure |
| US20210244481A1 (en) | 2019-04-29 | 2021-08-12 | Smith & Nephew, Inc. | Multi-level positional tracking |
| US20220051483A1 (en) | 2020-08-17 | 2022-02-17 | Russell Todd Nevins | System and method for location determination using a mixed reality device and a 3d spatial mapping camera |
| US20220047279A1 (en) | 2020-08-17 | 2022-02-17 | Russell Todd Nevins | System and method for location determination using movement between optical labels and a 3d spatial mapping camera |
- 2021-09-27: US application 17/486,677 filed; issued as US12236536B2 (status: Active)
Patent Citations (105)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE10103922A1 (en) | 2001-01-30 | 2002-08-01 | Physoptics Opto Electronic Gmb | Interactive data viewing and operating system |
| US9730713B2 (en) | 2002-05-15 | 2017-08-15 | Howmedica Osteonics Corporation | Arthroplasty jig |
| US20090163923A1 (en) | 2004-02-27 | 2009-06-25 | Magnus Flett | Surgical jig and methods of use |
| US20050251030A1 (en) | 2004-04-21 | 2005-11-10 | Azar Fred S | Method for augmented reality instrument placement using an image based navigation system |
| US8123754B2 (en) | 2004-05-22 | 2012-02-28 | Depuy International Ltd. | Surgical jig |
| US20080183179A1 (en) | 2004-05-22 | 2008-07-31 | Thomas Siebel | Surgical Jig |
| US7331929B2 (en) | 2004-10-01 | 2008-02-19 | General Electric Company | Method and apparatus for surgical operating room information display gaze detection and user prioritization for control |
| US7812815B2 (en) | 2005-01-25 | 2010-10-12 | The Broad of Trustees of the University of Illinois | Compact haptic and augmented virtual reality system |
| WO2007108776A2 (en) | 2005-12-31 | 2007-09-27 | Bracco Imaging S.P.A. | Systems and methods for collaborative interactive visualization of 3d data sets over a network ('dextronet') |
| US20070270685A1 (en) * | 2006-05-19 | 2007-11-22 | Mako Surgical Corp. | Method and apparatus for controlling a haptic device |
| US8560047B2 (en) | 2006-06-16 | 2013-10-15 | Board Of Regents Of The University Of Nebraska | Method and apparatus for computer aided surgery |
| US10405873B2 (en) | 2006-08-03 | 2019-09-10 | Orthosoft Ulc | Computer-assisted surgery tools and system |
| US8956165B2 (en) | 2008-01-25 | 2015-02-17 | University Of Florida Research Foundation, Inc. | Devices and methods for implementing endoscopic surgical procedures and instruments within a virtual environment |
| US8876830B2 (en) | 2009-08-13 | 2014-11-04 | Zimmer, Inc. | Virtual implant placement in the OR |
| US10980601B2 (en) | 2010-04-28 | 2021-04-20 | Ryerson University | System and methods for intraoperative guidance feedback |
| WO2012033739A2 (en) | 2010-09-08 | 2012-03-15 | Disruptive Navigational Technologies, Llc | Surgical and medical instrument tracking using a depth-sensing device |
| US8954181B2 (en) | 2010-12-07 | 2015-02-10 | Sirona Dental Systems Gmbh | Systems, methods, apparatuses, and computer-readable storage media for designing and manufacturing custom dental preparation guides |
| US10108266B2 (en) | 2012-09-27 | 2018-10-23 | The Board Of Trustees Of The University Of Illinois | Haptic augmented and virtual reality system for simulation of surgical procedures |
| US9563266B2 (en) | 2012-09-27 | 2017-02-07 | Immersivetouch, Inc. | Haptic augmented and virtual reality system for simulation of surgical procedures |
| US20180348876A1 (en) | 2012-09-27 | 2018-12-06 | The Board Of Trustees Of The University Of Illinois | Haptic augmented and virtual reality system for simulation of surgical procedures |
| US10437339B2 (en) | 2012-09-27 | 2019-10-08 | The Board Of Trustees Of The University Of Illinois | Haptic augmented and virtual reality system for simulation of surgical procedures |
| US20140222462A1 (en) | 2013-02-07 | 2014-08-07 | Ian Shakil | System and Method for Augmenting Healthcare Provider Performance |
| US9165318B1 (en) * | 2013-05-29 | 2015-10-20 | Amazon Technologies, Inc. | Augmented reality presentation |
| WO2015134953A1 (en) | 2014-03-06 | 2015-09-11 | Virtual Reality Medical Applications, Inc. | Virtual reality medical application system |
| EP3113682A1 (en) | 2014-03-06 | 2017-01-11 | Virtual Reality Medical Applications, Inc. | Virtual reality medical application system |
| US10220181B2 (en) | 2014-03-06 | 2019-03-05 | Virtual Reality Medical Applications, Inc | Virtual reality medical application system |
| US10286179B2 (en) | 2014-03-06 | 2019-05-14 | Virtual Reality Medical Applications, Inc | Virtual reality medical application system |
| US20190366030A1 (en) | 2014-03-06 | 2019-12-05 | Virtual Reality Medical Applications, Inc. | Virtual Reality Medical Application System |
| US20190216562A1 (en) | 2014-05-05 | 2019-07-18 | Vicarious Surgical Inc. | Virtual reality surgical device |
| US10285765B2 (en) | 2014-05-05 | 2019-05-14 | Vicarious Surgical Inc. | Virtual reality surgical device |
| US10672288B2 (en) | 2014-09-08 | 2020-06-02 | SimX, Inc. | Augmented and virtual reality simulator for professional and educational training |
| US10602114B2 (en) | 2014-12-30 | 2020-03-24 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery and spinal procedures using stereoscopic optical see-through head mounted displays and inertial measurement units |
| US20190149797A1 (en) | 2014-12-30 | 2019-05-16 | Onpoint Medical, Inc. | Augmented Reality Guidance for Spinal Surgery and Spinal Procedures |
| US20160191887A1 (en) | 2014-12-30 | 2016-06-30 | Carlos Quiles Casas | Image-guided surgery with surface reconstruction and augmented reality visualization |
| CN107430437A (en) | 2015-02-13 | 2017-12-01 | 厉动公司 | The system and method that real crawl experience is created in virtual reality/augmented reality environment |
| US10016243B2 (en) | 2015-03-23 | 2018-07-10 | Justin Esterberg | Systems and methods for assisted surgical navigation |
| US10437335B2 (en) | 2015-04-14 | 2019-10-08 | John James Daniels | Wearable electronic, multi-sensory, human/machine, human/human interfaces |
| US9978141B2 (en) | 2015-04-17 | 2018-05-22 | Clear Guide Medical, Inc. | System and method for fused image based navigation with late marker placement |
| DE102015212352A1 (en) | 2015-07-01 | 2017-01-05 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Method, arrangement and computer program product for the position detection of an object to be examined |
| US20170367771A1 (en) | 2015-10-14 | 2017-12-28 | Surgical Theater LLC | Surgical Navigation Inside A Body |
| WO2017066373A1 (en) | 2015-10-14 | 2017-04-20 | Surgical Theater LLC | Augmented reality surgical navigation |
| US10241569B2 (en) | 2015-12-08 | 2019-03-26 | Facebook Technologies, Llc | Focus adjustment method for a virtual reality headset |
| US20170245781A1 (en) | 2016-02-29 | 2017-08-31 | Extremity Development Company, Llc | Smart drill, jig, and method of orthopedic surgery |
| US20180116728A1 (en) | 2016-03-12 | 2018-05-03 | Philipp K. Lang | Novel guidance for surgical procedures |
| US10405927B1 (en) | 2016-03-12 | 2019-09-10 | Philipp K. Lang | Augmented reality visualization for guiding physical surgical tools and instruments including robotics |
| US10159530B2 (en) | 2016-03-12 | 2018-12-25 | Philipp K. Lang | Guidance for surgical interventions |
| US20190262078A1 (en) | 2016-03-12 | 2019-08-29 | Philipp K. Lang | Augmented Reality Visualization for Guiding Physical Surgical Tools and Instruments Including Robotics |
| US10368947B2 (en) | 2016-03-12 | 2019-08-06 | Philipp K. Lang | Augmented reality guidance systems for superimposing virtual implant components onto the physical joint of a patient |
| US9980780B2 (en) | 2016-03-12 | 2018-05-29 | Philipp K. Lang | Guidance for surgical procedures |
| US9861446B2 (en) | 2016-03-12 | 2018-01-09 | Philipp K. Lang | Devices and methods for surgery |
| US10292768B2 (en) | 2016-03-12 | 2019-05-21 | Philipp K. Lang | Augmented reality guidance for articular procedures |
| US20190110842A1 (en) | 2016-03-12 | 2019-04-18 | Philipp K. Lang | Augmented Reality Visualization for Guiding Bone Cuts Including Robotics |
| US10278777B1 (en) | 2016-03-12 | 2019-05-07 | Philipp K. Lang | Augmented reality visualization for guiding bone cuts including robotics |
| US11172990B2 (en) | 2016-03-12 | 2021-11-16 | Philipp K. Lang | Systems for augmented reality guidance for aligning physical tools and instruments for arthroplasty component placement, including robotics |
| US20170258526A1 (en) | 2016-03-12 | 2017-09-14 | Philipp K. Lang | Devices and methods for surgery |
| US20170367766A1 (en) | 2016-03-14 | 2017-12-28 | Mohamed R. Mahfouz | Ultra-wideband positioning for wireless ultrasound tracking and communication |
| US20170287218A1 (en) * | 2016-03-30 | 2017-10-05 | Microsoft Technology Licensing, Llc | Virtual object manipulation within physical environment |
| US20170312032A1 (en) | 2016-04-27 | 2017-11-02 | Arthrology Consulting, Llc | Method for augmenting a surgical field with virtual guidance content |
| US10194990B2 (en) | 2016-04-27 | 2019-02-05 | Arthrology Consulting, Llc | Method for augmenting a surgical field with virtual guidance content |
| WO2018007091A1 (en) | 2016-07-06 | 2018-01-11 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Imaging device for an operating theatre |
| US20180049622A1 (en) | 2016-08-16 | 2018-02-22 | Insight Medical Systems, Inc. | Systems and methods for sensory augmentation in medical procedures |
| US20180090029A1 (en) | 2016-09-29 | 2018-03-29 | Simbionix Ltd. | Method and system for medical simulation in an operating room in a virtual reality or augmented reality environment |
| US20180098813A1 (en) | 2016-10-07 | 2018-04-12 | Simbionix Ltd. | Method and system for rendering a medical simulation in an operating room in virtual reality or augmented reality environment |
| US10888399B2 (en) | 2016-12-16 | 2021-01-12 | Align Technology, Inc. | Augmented reality enhancements for dental practitioners |
| US20200078100A1 (en) | 2017-01-03 | 2020-03-12 | Mako Surgical Corp. | Systems and methods for surgical navigation |
| CN110431636A (en) | 2017-01-11 | 2019-11-08 | 奇跃公司 | medical assistant |
| WO2018132804A1 (en) | 2017-01-16 | 2018-07-19 | Lang Philipp K | Optical guidance for surgical, medical, and dental procedures |
| CN110430809A (en) | 2017-01-16 | 2019-11-08 | P·K·朗 | Optical guidance for surgical, medical and dental procedures |
| US20200000527A1 (en) | 2017-02-01 | 2020-01-02 | Laurent CAZAL | Method and device for assisting a surgeon fit a prosthesis, in particular a hip prosthesis, following different surgical protocols |
| US20180240276A1 (en) | 2017-02-23 | 2018-08-23 | Vid Scale, Inc. | Methods and apparatus for personalized virtual reality media interface design |
| JP2020515891A (en) | 2017-03-24 | 2020-05-28 | サージカル シアター エルエルシー | System and method for training and collaboration in a virtual environment |
| WO2018175971A1 (en) | 2017-03-24 | 2018-09-27 | Surgical Theater LLC | System and method for training and collaborating in a virtual environment |
| US9892564B1 (en) | 2017-03-30 | 2018-02-13 | Novarad Corporation | Augmenting real-time views of a patient with three-dimensional data |
| US10401954B2 (en) | 2017-04-17 | 2019-09-03 | Intel Corporation | Sensory enhanced augmented reality and virtual reality device |
| US10716643B2 (en) | 2017-05-05 | 2020-07-21 | OrbisMV LLC | Surgical projection system and method |
| US20190038362A1 (en) | 2017-08-02 | 2019-02-07 | Medtech S.A. | Surgical field camera system |
| WO2019051080A1 (en) | 2017-09-08 | 2019-03-14 | Surgical Theater LLC | Dual mode augmented reality surgical system and method |
| US20190076198A1 (en) * | 2017-09-12 | 2019-03-14 | Michael E. Berend | Femoral medial condyle spherical center tracking |
| US20200275988A1 (en) | 2017-10-02 | 2020-09-03 | The Johns Hopkins University | Image to world registration for medical augmented reality applications using a world spatial map |
| US20200302694A1 (en) | 2017-11-07 | 2020-09-24 | Koninklijke Philips N.V. | Augmented reality triggering of devices |
| US20190142520A1 (en) | 2017-11-14 | 2019-05-16 | Stryker Corporation | Patient-specific preoperative planning simulation techniques |
| GB2570758A (en) | 2017-11-24 | 2019-08-07 | Abhari Kamyar | Methods and devices for tracking objects by surgical navigation systems |
| US20210142508A1 (en) * | 2018-02-03 | 2021-05-13 | The Johns Hopkins University | Calibration system and method to align a 3d virtual scene and a 3d real world for a stereoscopic head-mounted display |
| US20210228286A1 (en) | 2018-05-10 | 2021-07-29 | Live Vue Technologies Inc. | System and method for assisting a user in a surgical procedure |
| US20210093391A1 (en) | 2018-06-19 | 2021-04-01 | Tornier, Inc. | Virtual guidance for orthopedic surgical procedures |
| WO2019245870A1 (en) | 2018-06-19 | 2019-12-26 | Tornier, Inc. | Mixed reality-aided education using virtual models or virtual representations for orthopedic surgical procedures |
| US20190380792A1 (en) | 2018-06-19 | 2019-12-19 | Tornier, Inc. | Virtual guidance for orthopedic surgical procedures |
| US20210093329A1 (en) | 2018-06-19 | 2021-04-01 | Tornier, Inc. | Closed-loop tool control for orthopedic surgical procedures |
| US20210093413A1 (en) | 2018-06-19 | 2021-04-01 | Tornier, Inc. | Mixed reality-aided depth tracking in orthopedic surgical procedures |
| US20200037043A1 (en) | 2018-07-27 | 2020-01-30 | Telefonaktiebolaget L M Ericsson (Publ) | SYSTEM AND METHOD FOR INSERTING ADVERTISEMENT CONTENT IN 360º IMMERSIVE VIDEO |
| WO2020033568A2 (en) | 2018-08-07 | 2020-02-13 | Smith & Nephew Inc. | Patella tracking method and system |
| WO2020037308A1 (en) | 2018-08-17 | 2020-02-20 | Smith & Nephew, Inc. | Patient-specific surgical method and system |
| WO2020047051A1 (en) | 2018-08-28 | 2020-03-05 | Smith & Nephew, Inc. | Robotic assisted ligament graft placement and tensioning |
| US20200107003A1 (en) | 2018-10-01 | 2020-04-02 | Telefonaktiebolaget Lm Ericsson (Publ) | CLIENT OPTIMIZATION FOR PROVIDING QUALITY CONTROL IN 360º IMMERSIVE VIDEO DURING PAUSE |
| WO2020145826A1 (en) | 2019-01-10 | 2020-07-16 | Augmedit B.V. | Method and assembly for spatial mapping of a model, such as a holographic model, of a surgical tool and/or anatomical structure onto a spatial position of the surgical tool respectively anatomical structure, as well as a surgical tool |
| US20200275976A1 (en) | 2019-02-05 | 2020-09-03 | Smith & Nephew, Inc. | Algorithm-based optimization for knee arthroplasty procedures |
| US20200360093A1 (en) | 2019-04-22 | 2020-11-19 | Khan Surgical Systems, Inc. | System and method to conduct bone surgery |
| US20210244481A1 (en) | 2019-04-29 | 2021-08-12 | Smith & Nephew, Inc. | Multi-level positional tracking |
| US20200375666A1 (en) | 2019-05-29 | 2020-12-03 | Stephen B. Murphy, M.D. | Systems and methods for augmented reality based surgical navigation |
| US20210088811A1 (en) * | 2019-09-24 | 2021-03-25 | Bespoke, Inc. d/b/a Topology Eyewear. | Systems and methods for adjusting stock eyewear frames using a 3d scan of facial features |
| WO2021094354A1 (en) | 2019-11-11 | 2021-05-20 | Krueger Timo | Method and system for reproducing an insertion point for a medical instrument |
| US11045263B1 (en) | 2019-12-16 | 2021-06-29 | Russell Nevins | System and method for generating a virtual jig for surgical procedures |
| US20210228308A1 (en) | 2020-01-29 | 2021-07-29 | Siemens Healthcare Gmbh | Representation apparatus |
| US20220051483A1 (en) | 2020-08-17 | 2022-02-17 | Russell Todd Nevins | System and method for location determination using a mixed reality device and a 3d spatial mapping camera |
| US20220047279A1 (en) | 2020-08-17 | 2022-02-17 | Russell Todd Nevins | System and method for location determination using movement between optical labels and a 3d spatial mapping camera |
Non-Patent Citations (18)
| Title |
|---|
| "Augmented and virtual reality in surgery—the digital surgical environment: application, limitations and legal pitfalls," accessed at https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5220044/, visited on Jul. 4, 2020. 12 pages. |
| "Mixed Reality with HoloLens: Where Virtual Reality Meets Augmented Reality in the Operating Room," accessed at https://www.ingentaconnect.com/content/wk/prs/2017/00000140/00000005/art00063, visited on Jul. 4, 2020, 1 page. |
| "The impact of Web3D technologies on medical education and training," Science Direct, accessed at https://www.sciencedirect.com/science/article/pii/S0360131505000825, visited on Jul. 4, 2020, 11 pages. |
| "Virtual Reality Simulation in Neurosurgery: Technologies and Evolution," Abstract, accessed at https://academic.oup.com/neurosurgery/article-abstract/72/suppl_1/A154/2417686, visited on Jul. 4, 2020, 2 pages. |
| Barad, Justin, Virtual and Augmented Reality Can Save Lives by Improving Surgeons' Training, retrieved at https://www.statnews.com/2019/08/16/virtual-reality-improve-surgeon-training, dated Aug. 16, 2019. |
| Daley, Sam, The Cutting Edge: 10 Companies Bringing Virtual Reality & AR to the OR, retrieved at https://builtin.com/healthcare-technology/augmented-virtual-reality-surgery, dated Jul. 5, 2019. |
| Immersive Touch Launches the First Virtual Reality Integrated Suite for Surgical Planning, retrieved at https://spinalnewsinternational.com/immersivetouch-virtual-reality-suite, dated Oct. 5, 2018. |
| Kaluschke et al., HIPS—A Virtual Reality Hip Prosthesis Implantation Simulator, retrieved at https://www.researchgate.net/publication/327329265, upload date Sep. 3, 2018, DOI: 10.1109/VR.2018.8446370. |
| Katanacho, Manuel, Wladimir De la Cadena, and Sebastian Engel. "Surgical navigation with QR codes: Marker detection and pose estimation of QR code markers for surgical navigation." Current Directions in Biomedical Engineering 2.1 (2016): 355-358. |
| Levin et al., The Future of Virtual Reality in Ophthalmology Is Already Here, retrieved at https://www.aao.org/young-ophthalmologists/yo-info/article/future-of-virtual-reality-in-ophthalmology, dated Aug. 16, 2019. |
| LexInnova Patent Landscape Analysis, Virtual Reality, unknown author, copyright date of 2015. |
| Microsoft HoloLens & Mixed Reality Healthcare Industry Deck, unknown author, at least as early as Oct. 14, 2019. |
| New Apple patent filing shows a mixed reality headset that tracks your whole face, Jul. 22, 2019, (downloaded Jul. 1, 2020 at https://www.theverge.com/2019/7/22/20705158/apple-mixed-reality-headset-ar-glasses-patent-application-face-tracking), 2 pages. |
| Patently Apple—Apple Reveals a Mixed Reality Headset that Uses a Direct Retinal Projector System With Holographic Lenses, retrieved at https://www.patentlyapple.com/patently-apple/2019/09/apple-reveals-a-mixed-reality-headset-that-uses-a-direct-retinal-projector-system-with-hologra . . . ; posted date Sep. 19, 2019. |
| Vaughan et al., A Review of Virtual Reality Based Training Simulators for Orthopaedic Surgery, retrieved at https://www.researchgate.net/publication/283727217, posted date Feb. 22, 2019, DOI: 10.1016/j.medengphy.2015.11.021. |
| Vaughan et al., Does Virtual-Reality Training on Orthopaedic Simulators Improve Performance in the Operating Room? Science and Information Conference 2015, Jul. 28-30, 2015, London, UK; retrieved at https://www.researchgate.net/publication/284415791; DOI: 10.1109/SAI.2015.7237125. |
| Virtual & Augmented Reality: Are You Sure it Isn't Real? Kathleen Boyle, CFA, Managing Editor, Citi GPS, dated Oct. 2016. |
| Virtual Reality System Helps Surgeons, Reassures Patients, retrieved at https://medicalgiving.stanford.edu/news/virtual-reality-system-helps-surgeons-reassures-patients.html, retrieved date Oct. 24, 2019. |
Also Published As
| Publication number | Publication date |
|---|---|
| US20220051483A1 (en) | 2022-02-17 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12236536B2 (en) | | System and method for location determination using a mixed reality device and a 3D spatial mapping camera |
| US12290271B2 (en) | | System and method for location determination using movement between optical labels and a 3D spatial mapping camera |
| US12193761B2 (en) | | Systems and methods for augmented reality based surgical navigation |
| US10194990B2 (en) | | Method for augmenting a surgical field with virtual guidance content |
| US20200038112A1 (en) | | Method for augmenting a surgical field with virtual guidance content |
| AU2018316092B2 (en) | | Systems and methods for sensory augmentation in medical procedures |
| US10398514B2 (en) | | Systems and methods for sensory augmentation in medical procedures |
| US10136901B2 (en) | | Anatomical concentric spheres THA |
| WO2018200767A1 (en) | | Method for augmenting a surgical field with virtual guidance content |
| KR20220042478A (en) | | Methods for knee surgery with inertial sensors |
| US11610378B1 (en) | | System and method for location determination using a mixed reality device and multiple imaging cameras |
| US11045263B1 (en) | | System and method for generating a virtual jig for surgical procedures |
| US11806081B2 (en) | | System and method for location determination using movement of an optical label fixed to a bone using a spatial mapping camera |
| US20240221325A1 (en) | | System and method for location determination using a virtual alignment target |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| | FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |