WO2024167823A1 - Systems and methods for evaluating and aiding surgical performance - Google Patents

Systems and methods for evaluating and aiding surgical performance

Info

Publication number
WO2024167823A1
Authority
WO
WIPO (PCT)
Prior art keywords
surgical
data
instrument
procedure
surgeon
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2024/014416
Other languages
French (fr)
Inventor
Suzanne FOLEY
Robert Mitchell BALDWIN
David Noel TALLON
Kathleen RYAN
Debora Pereira SALGADO
Isabel Roomans LEDO
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Stryker Corp
Original Assignee
Stryker Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Stryker Corp filed Critical Stryker Corp
Priority to CN202480011169.9A (publication CN120660142A)
Priority to AU2024216584A
Publication of WO2024167823A1

Classifications

    • G PHYSICS
      • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
        • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
          • G16H10/00 ICT specially adapted for the handling or processing of patient-related medical or healthcare data
            • G16H10/60 ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
          • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
            • G16H40/60 ICT specially adapted for the management or operation of medical equipment or devices
              • G16H40/63 ICT specially adapted for the management or operation of medical equipment or devices for local operation
          • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
            • G16H50/70 ICT specially adapted for medical diagnosis, medical simulation or medical data mining for mining of medical data, e.g. analysing previous cases of other patients
    • A HUMAN NECESSITIES
      • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
            • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
              • A61B2034/2046 Tracking techniques
                • A61B2034/2051 Electromagnetic tracking systems
                • A61B2034/2055 Optical tracking systems
                • A61B2034/2063 Acoustic tracking systems, e.g. using ultrasound
              • A61B2034/2068 Surgical navigation using pointers, e.g. pointers having reference marks for determining coordinates of body points
            • A61B34/25 User interfaces for surgical systems
          • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
            • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
              • A61B90/361 Image-producing devices, e.g. surgical cameras
                • A61B2090/3614 Image-producing devices, e.g. surgical cameras, using optical fibre
              • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
                • A61B2090/365 Augmented reality, i.e. correlating a live optical image with another image
              • A61B90/37 Surgical systems with images on a monitor during operation
                • A61B2090/374 NMR or MRI
                • A61B2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
                  • A61B2090/3762 Surgical systems with images on a monitor during operation using computed tomography systems [CT]

Definitions

  • the present disclosure generally relates to systems and methods for obtaining and processing information relating to surgical performance, and more particularly to systems and methods for pre-operatively, intraoperatively, and/or postoperatively using data obtained from surgical instruments and/or devices in an operating room to evaluate surgical performance, aid surgical performance, and/or reduce a cognitive load on a surgeon during a surgical procedure.
  • a number of factors can affect a patient’s postoperative outcome for a surgical procedure. For example, several studies have found a relationship between a patient outcome and the technical skill of the surgeon that performed the surgery. As another example, during a surgery, a surgeon may be inundated with information from a variety of sources. The competing information provided to the surgeon may increase a cognitive load on the surgeon, which may negatively impact patient outcomes in some instances.
  • a non-transitory computer-readable medium has stored therein instructions that are executable to cause a processor to perform functions including determining a plurality of procedure data sets for a plurality of surgical procedures, and determining, based on the plurality of procedure data sets, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance.
  • Determining the plurality of procedure data sets can include, for each surgical procedure of the plurality of surgical procedures, determining a respective procedure data set of the plurality of procedure data sets by: (i) receiving, from a surgical instrument, instrument data related to an operation of a surgical instrument during the surgical procedure, where the instrument data is based on one or more instrument parameters determined by the surgical instrument at a plurality of points in time during the surgical procedure, (ii) receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to a patient anatomy at the plurality of points in time during the surgical procedure, (iii) determining, based on at least one of the position data or the instrument data, kinematic data at the plurality of points in time, and (iv) correlating, for each of the plurality of points in time, the instrument data, the position data, and the kinematic data to determine the respective procedure data set for the surgical procedure.
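Steps (i)–(iv) above can be sketched as follows. The data layout, the function name `build_procedure_data_set`, and the finite-difference derivation of kinematic (velocity) data from position data are illustrative assumptions, not the patent's actual implementation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Sample:
    t: float                              # point in time during the procedure (s)
    instrument: dict                      # instrument parameters (e.g., drilling speed)
    position: tuple                       # (x, y, z) relative to patient anatomy (mm)
    velocity: Optional[tuple] = None      # derived kinematic data (mm/s)

def build_procedure_data_set(times, instrument_data, position_data):
    """Correlate instrument data, position data, and derived kinematic data
    at each point in time into one procedure data set."""
    samples = [Sample(t, instrument_data[t], position_data[t]) for t in times]
    # Derive kinematic data (velocity) from position via finite differences.
    for prev, cur in zip(samples, samples[1:]):
        dt = cur.t - prev.t
        cur.velocity = tuple((c - p) / dt for c, p in zip(cur.position, prev.position))
    return samples

# Hypothetical example: a drill sampled at two points in time.
times = [0.0, 0.1]
inst = {0.0: {"speed_rpm": 800}, 0.1: {"speed_rpm": 820}}
pos = {0.0: (0.0, 0.0, 0.0), 0.1: (1.0, 0.0, 0.0)}
data_set = build_procedure_data_set(times, inst, pos)
```

Each `Sample` then carries the correlated instrument, position, and kinematic values for one point in time, which is the shape a downstream metric computation would consume.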
  • a non-transitory computer-readable medium has stored therein instructions that are executable to cause a processor to perform functions including (i) receiving, from a surgical instrument, instrument data related to an operation of the surgical instrument at a plurality of points in time during a surgical procedure, (ii) receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to an anatomy of a patient at the plurality of points in time during the surgical procedure, (iii) determining, based on the position data, kinematic data at the plurality of points in time, (iv) determining, using the instrument data and the kinematic data, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance, (v) performing an analysis of (a) the kinematic data and the instrument data relative to (b) the plurality of surgical performance metrics, and (vi) causing, based on the analysis, a user interface to output information to provide feedback to a surgeon relating to a performance of the surgical procedure.
  • a non-transitory computer-readable medium has stored therein instructions that are executable to cause a processor to perform functions including (i) receiving preoperative information for a surgical procedure to be performed, (ii) determining, using the preoperative information, a plurality of surgical performance metrics, (iii) receiving, from a surgical instrument, instrument data related to an operation of the surgical instrument at a plurality of points in time during a surgical procedure, (iv) receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to an anatomy of a patient at the plurality of points in time during the surgical procedure, (v) determining, based on the position data, kinematic data at the plurality of points in time, (vi) performing an analysis of (a) the kinematic data and the instrument data relative to (b) the plurality of surgical performance metrics, and (vii) causing, based on the analysis, a user interface to output information to provide feedback to a surgeon relating to a performance of the surgical procedure.
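The analysis-and-feedback steps above could, for instance, take the form of checking observed instrument and kinematic values against metric ranges. The metric names and numeric bands below are invented for illustration only:

```python
# Hypothetical metric definitions: each maps a measured quantity to an
# acceptable range, e.g., derived from prior procedure data sets.
METRICS = {
    "drill_speed_rpm": (500, 1200),   # assumed acceptable drilling-speed band
    "tip_velocity_mm_s": (0, 15),     # assumed acceptable instrument-tip speed
}

def analyze(observations):
    """Compare instrument/kinematic observations against metric ranges and
    return feedback strings suitable for output on a user interface."""
    feedback = []
    for name, value in observations.items():
        lo, hi = METRICS[name]
        if value < lo:
            feedback.append(f"{name} below expected range ({value} < {lo})")
        elif value > hi:
            feedback.append(f"{name} above expected range ({value} > {hi})")
    return feedback

msgs = analyze({"drill_speed_rpm": 1400, "tip_velocity_mm_s": 10})
```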
  • Figure 1 illustrates a simplified block diagram of a system for evaluating and/or aiding surgical performance, according to an example embodiment.
  • Figure 2 illustrates a simplified block diagram of a system for evaluating and/or aiding surgical performance, according to another example.
  • Figure 3 depicts a first display screen of an application for evaluating and/or aiding surgical performance, according to an example.
  • Figure 4 depicts a second display screen of an application for evaluating and/or aiding surgical performance, according to an example.
  • Figure 5 depicts a third display screen of an application for evaluating and/or aiding surgical performance, according to an example.
  • Figure 6 depicts a fourth display screen of an application for evaluating and/or aiding surgical performance, according to an example.
  • Figure 7 depicts a fifth display screen of an application for evaluating and/or aiding surgical performance, according to an example.
  • Figure 8 depicts a sixth display screen of an application for evaluating and/or aiding surgical performance, according to an example.
  • Figure 9 depicts a flowchart of a method of evaluating surgical performance for a surgical procedure, according to an example.
  • Figure 10 depicts a flowchart of a method of evaluating surgical performance for a surgical procedure, according to an example.
  • Figure 11 depicts a flowchart of a method of evaluating surgical performance for a surgical procedure, according to an example.
  • the current standard for evaluating surgeons is peer review, either intraoperatively or post-operatively via video footage.
  • Peer review is subject to bias, due to subjectivity and individual differences in the rating process (e.g., surgeons at times disagree about what constitutes “good” surgery).
  • the present disclosure provides for quantifying surgical technique in a more consistent and objective manner based, at least in part, on information directly measured from one or more surgical devices used during surgical procedure(s). Determining and/or using surgical performance metrics according to the systems and methods of the present disclosure can reduce subjectivity and bias, and can provide objective feedback useful to individual surgeons, patients, and/or others (e.g., credentialing and licensure committees).
  • the present disclosure further provides for systems that can use computer-based data analytics and/or machine-learning computer algorithms to provide for scalable evaluation of surgical technique and/or surgical performance.
  • the information determined by such computer algorithms can be used preoperatively, intraoperatively, and/or postoperatively to, among other things, (i) allow a surgeon to obtain personalized feedback on their surgical technique, (ii) train a surgeon or members of a surgical team to perform a surgical procedure, (iii) provide a patient with information related to the performance of a surgical procedure, (iv) plan a surgical procedure to be performed, (v) provide predictive analytics and recommendations on surgical technique for a surgical procedure to be performed (e.g., to reduce the occurrence of complications), (vi) provide feedback as to how a surgical instrument performed during a surgical procedure, (vii) provide a knowledge-sharing tool, and/or (viii) provide a memory aid tool to remind a surgeon of clinical choices for a particular surgical procedure.
  • the present disclosure additionally or alternatively provides for reducing a cognitive load on a surgeon during a surgical procedure.
  • Current methods to assess cognitive load rely predominantly on surgeons self-reporting it after surgery (e.g., using the NASA-TLX tool or Surg-TLX).
  • analyzing cognitive load retrospectively may lead to a failure to capture intra-operative fluctuations in cognitive load.
  • the systems and methods of the present disclosure can help to quantify cognitive load in real time and measure its impact at different phases of real surgeries.
  • the systems and methods of the present disclosure provide for measuring a surgeon’s levels of cognitive load based on information provided by one or more surgeon monitoring devices, which can sense one or more physiological conditions of the surgeon during the surgical procedure.
  • the physiological conditions sensed by the surgeon monitoring device(s) can, in some instances, additionally provide an indication of a psychological condition of the surgeon (e.g., a mental state and/or an emotional state of the surgeon).
  • the wearable devices can include one or more devices selected from among a smart ring, eye-tracking glasses, a strap-based sensor (e.g., a chest-strap sensor, a thigh-strap sensor, and a shank-strap sensor), a smartwatch, an immersive device (e.g., an augmented reality (AR) or extended reality (XR) device), and a flexible epidermal sensor (e.g., a wireless heart-rate patch monitor).
  • the systems and methods can, based on the sensed physiological conditions of the surgeon, further provide intraoperative feedback to the surgeon to help reduce the level of cognitive load on the surgeon.
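One way the physiological signals above could feed a real-time cognitive-load estimate is a weighted, normalized index. The signal names, baseline ranges, and weights here are illustrative assumptions, not values from the disclosure:

```python
# Illustrative baseline ranges (rest -> high load) for each physiological signal.
BASELINES = {
    "heart_rate_bpm": (60.0, 120.0),
    "pupil_diameter_mm": (3.0, 7.0),
    "skin_conductance_us": (2.0, 12.0),
}
# Assumed relative weights; they must sum to 1.0.
WEIGHTS = {"heart_rate_bpm": 0.4, "pupil_diameter_mm": 0.3, "skin_conductance_us": 0.3}

def cognitive_load_index(readings):
    """Normalize each signal into [0, 1] against its baseline range and
    combine the normalized values into a single 0-1 load index."""
    index = 0.0
    for name, value in readings.items():
        lo, hi = BASELINES[name]
        norm = min(max((value - lo) / (hi - lo), 0.0), 1.0)
        index += WEIGHTS[name] * norm
    return index

load = cognitive_load_index(
    {"heart_rate_bpm": 90.0, "pupil_diameter_mm": 5.0, "skin_conductance_us": 7.0}
)
```

Recomputing the index at each sampling interval would give the intra-operative fluctuations that retrospective self-reporting misses.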
  • systems and methods of the present disclosure can be applied in non-robotic surgery and/or robotic surgery. Additionally, within examples, the systems and methods of the present disclosure can be implemented in one or more surgical fields including, for example, neurosurgery, spinal surgery, endoscopy, orthopedics, and ear, nose, and throat (ENT) surgery.
  • the system 100 includes a controller 110 and one or more surgical devices 112 that are operated during one or more surgical procedures.
  • the one or more surgical devices 112 include at least one surgical instrument 114 and at least one surgical navigation system 116.
  • the surgical device(s) 112 can include additional or alternative devices in other examples.
  • each of the surgical instrument(s) 114 is operable to perform a surgical task during a surgical procedure.
  • the surgical instrument(s) 114 can include at least one instrument selected from among a group consisting of a drill, a bone cutter, an electrosurgical tool, an aspiration tool, an irrigation tool, a shaver, a microscope, a camera (e.g., an endoscope), a surgical retractor, and an illumination device.
  • the surgical instrument(s) 114 can include one or more surgical instruments that can perform at least one surgical task selected from a group consisting of a drilling operation, a cutting operation, a shaving operation, a tissue retraction operation, a suctioning operation, an irrigation operation, a probing operation, a clamping operation, a coagulation operation, a heating operation, a cooling operation, an ablation operation, an electrical stimulation operation, an image capture operation, a sawing operation, and a grinding operation.
  • one or more of the surgical instrument(s) 114 can include a working element that is operable to perform the at least one surgical task.
  • the working element can include a drill bit, an electrosurgical electrode, an ablation end effector (e.g., a cryoablation balloon, an electrode, a laser light emitter, and/or a heating element), a fluid valve, a vacuum source, and a cutting blade.
  • the surgical instrument(s) 114 can include one or more user input devices that can be actuated to operate the working element of the surgical instrument(s) 114.
  • the user input device(s) can include one or more devices selected from among a group consisting of: one or more buttons, one or more switches, one or more foot pedals, one or more touch screens, one or more dials, one or more triggers, one or more cranks, and one or more suction force control ports.
  • the surgical instrument(s) 114 can include one or more handheld devices that can be gripped, manipulated, and moved by the surgeon during the surgical procedure.
  • the surgical instrument(s) 114 can include one or more stationary devices that remain in a fixed position relative to the patient (and/or an operating room) during the surgical procedure.
  • the surgical instrument(s) 114 can include both the handheld device(s) and the stationary device(s).
  • the surgical instrument(s) 114 can include an electrosurgical pencil and an electrosurgical generator, where the electrosurgical pencil is held and moved by the surgeon while the electrosurgical generator remains in a fixed position during the surgical procedure.
  • the surgical instrument(s) 114 can be entirely operated by the surgeon without robotic assistance.
  • the surgical instrument(s) 114 can include a partially automated robotic device that is operated by a surgeon, and/or a fully automated robotic device that performs the surgical procedure based on preoperative programming input to the surgical instrument(s) 114 by the surgeon.
  • the controller 110 can receive, from the surgical instrument(s) 114, instrument data related to an operation of the surgical instrument(s) 114 during the surgical procedure(s).
  • the instrument data can be based on one or more instrument parameters determined by the surgical instrument 114 at a plurality of times during each surgical procedure.
  • the instrument param eter(s) can be sensed by an instrument sensor 118 coupled to the surgical instrument 114.
  • the instrument sensor 118 can include one or more sensors selected from a group consisting of: a current sensor, a voltage sensor, an electrical power sensor, a flow sensor configured to detect a flow of a liquid, a flow sensor configured to detect a flow of a gas, a temperature sensor, an accelerometer, a piezo-electric sensor, a force sensor (e.g., a ground reaction force sensor), a vibration sensor, a chemical sensor, an optical sensor, a pressure sensor, a humidity sensor, a position sensor, a hall-effect sensor, a capacitive sensor, and a Doppler flow sensor.
  • the instrument sensor 118 can be removably coupled to a housing of the surgical instrument(s) 114. In other examples, the instrument sensor 118 can be additionally or alternatively non-removably coupled to the housing of the surgical instrument(s) 114 (e.g., disposed within an interior cavity of the housing of the surgical instrument(s) 114).
  • the surgical instrument(s) 114 can additionally or alternatively determine the instrument parameter(s) separately from the instrument sensor 118. For instance, in some implementations, the surgical instrument(s) 114 can determine the instrument parameter(s) based on a setting and/or a mode of operation of the surgical instrument(s) 114. As an example, in an implementation in which the surgical instrument(s) 114 includes a bone drill, the instrument parameter(s) of a drilling speed and/or a torque can be determined based on a setting selected from among a plurality of settings on the surgical instrument(s) 114.
  • in an implementation in which the surgical instrument(s) 114 include an electrosurgical pencil and an electrosurgical generator, the instrument parameter(s) of a power and a waveform of electrosurgical energy applied to tissue by the electrosurgical pencil can be determined based on a setting selected from among a plurality of settings on the electrosurgical generator.
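Deriving instrument parameter(s) from a selected setting amounts to a lookup; the setting names and power/waveform values below are hypothetical, not actual generator specifications:

```python
# Hypothetical table mapping a generator setting to the instrument
# parameters (power and waveform) it implies.
GENERATOR_SETTINGS = {
    "cut":   {"power_w": 40, "waveform": "continuous"},
    "coag":  {"power_w": 30, "waveform": "pulsed"},
    "blend": {"power_w": 35, "waveform": "mixed"},
}

def instrument_parameters(setting):
    """Derive the instrument parameter(s) implied by the selected setting."""
    return GENERATOR_SETTINGS[setting]
```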
  • the surgical navigation system 116 is configured to determine a position of one or more of the surgical instrument(s) 114 relative to a patient anatomy during a surgical procedure.
  • the surgical navigation system 116 can additionally or alternatively determine an orientation of the surgical instrument(s) 114 relative to the patient anatomy.
  • the surgical navigation system 116 can be configured to determine the position data using at least one surgical navigation modality selected from a group consisting of: (i) electromagnetic surgical navigation, (ii) optical surgical navigation, (iii) ultrasound surgical navigation, and (iv) machine vision surgical navigation.
  • the surgical navigation system 116 can include an electromagnetic field generator and a position sensor.
  • the electromagnetic field generator can be arranged to emit an electromagnetic field at the patient anatomy.
  • the position sensor can include a current sensor (e.g., sensor coils) that can sense the electromagnetic field and responsively generate a position sensor signal based on one or more properties of the electromagnetic field at a given location of the position sensor.
  • the position sensor can be coupled to the surgical instrument(s) 114 such that the position signal generated by the position sensor is indicative of a position and/or an orientation of the surgical instrument(s) relative to the patient anatomy.
  • the surgical navigation system 116 can include one or more cameras configured to track one or more fiducial markers coupled to the surgical instrument(s) 114 and/or the patient anatomy.
  • the fiducial markers can include one or more passive markers (e.g., one or more markers that reflect light) and/or one or more active markers (e.g., one or more markers that emit light).
  • the surgical navigation system 116 can include one or more ultrasound signal emitters and one or more ultrasound signal detectors coupled to the surgical instrument(s) 114 and/or the patient anatomy.
  • the ultrasound signal emitter(s) can emit an ultrasound signal
  • the ultrasound signal detectors can detect the ultrasound signal emitted by the ultrasound signal emitter(s)
  • the surgical navigation system 116 can determine the position data based on the ultrasound signal emitted by the ultrasound signal emitter(s) and the ultrasound signal received by the ultrasound signal detector(s) (e.g., based on time(s) of flight between the ultrasound signal emitter(s) and the ultrasound signal detector(s)).
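A time-of-flight position fix like the one described above can be sketched in two dimensions: each time of flight converts to a distance (using roughly 1540 m/s, the conventional speed of sound in soft tissue), and three emitter distances pin down a detector position. The anchor layout and the 2-D simplification are illustrative:

```python
SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, approximate speed of sound in soft tissue

def tof_to_distance(t_s):
    """Convert a measured time of flight (s) into a distance (m)."""
    return SPEED_OF_SOUND_TISSUE * t_s

def trilaterate_2d(anchors, distances):
    """Solve for a 2-D position from three emitter positions and distances,
    linearizing the circle equations by subtracting the first from the rest."""
    (x0, y0), (x1, y1), (x2, y2) = anchors
    d0, d1, d2 = distances
    a1, b1 = 2 * (x1 - x0), 2 * (y1 - y0)
    c1 = d0**2 - d1**2 + x1**2 - x0**2 + y1**2 - y0**2
    a2, b2 = 2 * (x2 - x0), 2 * (y2 - y0)
    c2 = d0**2 - d2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Example: recover a detector at (1, 1) from three emitters.
anchors = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
dists = [((x - 1.0) ** 2 + (y - 1.0) ** 2) ** 0.5 for x, y in anchors]
est = trilaterate_2d(anchors, dists)
```

A real navigation system would work in three dimensions with more emitters and a least-squares solve, but the time-of-flight-to-distance step is the same.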
  • the surgical navigation system 116 can include one or more light sources and/or one or more cameras.
  • the light source(s) can illuminate the patient anatomy and/or the surgical instrument(s) 114.
  • the camera(s) can capture one or more images of the patient anatomy and/or the surgical instrument(s) 114 at a surgical site during the surgical procedure.
  • the surgical navigation system 116 can process the image(s) to determine the position data indicating the position of the surgical instrument(s) 114 relative to the patient anatomy.
  • the surgical navigation system 116 can include one or more position sensors 120 that can be coupled to the surgical instrument(s) 114 and/or the patient anatomy.
  • the position sensor(s) 120 can be detectable by one or more components of the surgical navigation system 116 (e.g., via electromagnetic sensing, optical sensing, and/or ultrasonic sensing), and the surgical navigation system 116 can determine the position data based on the detected position sensor(s) 120.
  • the surgical navigation system 116 can include a registration system that is configured to establish a frame of reference for the patient anatomy and the position sensor(s) 120 (and, thus, the position of the surgical instrument(s) 114 indicated by the position sensor(s) 120).
  • the position sensor(s) 120 can be traced along features of the patient anatomy to establish the frame of reference.
  • the surgical navigation system 116 can include one or more touch points on the patient anatomy. At each touch point, the surgical navigation system 116 can register the touch point in space and, using the registered touch points, determine the frame of reference for the patient anatomy in space (e.g., using a three-dimensional coordinate system).
  • the position sensor(s) 120 and the patient anatomy can be mapped to a common frame of reference such that a position of the surgical instrument s) 114 sensed by the position sensor(s) 120 can be correlated (e.g., mapped in space) to the patient anatomy.
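The mapping to a common frame of reference described above can be illustrated with a deliberately simplified, translation-only registration of touch points; a full registration would also solve for rotation (e.g., via the Kabsch algorithm), and all coordinates here are invented:

```python
def centroid(points):
    """Mean of a list of 3-D points."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def translation_registration(touch_points_nav, touch_points_image):
    """Estimate the translation mapping navigation-frame touch points onto
    the image frame. Rotation is assumed already aligned -- a simplification."""
    cn, ci = centroid(touch_points_nav), centroid(touch_points_image)
    return tuple(ci[i] - cn[i] for i in range(3))

def to_image_frame(p, offset):
    """Map a sensed instrument position into the common (image) frame."""
    return tuple(p[i] + offset[i] for i in range(3))

# Hypothetical touch points registered in both frames.
nav = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
img = [(5.0, 5.0, 5.0), (6.0, 5.0, 5.0), (5.0, 6.0, 5.0)]
offset = translation_registration(nav, img)
probe_img = to_image_frame((1.0, 1.0, 0.0), offset)
```

Once the offset (and, in practice, the rotation) is known, every position sensed by the position sensor(s) can be expressed in the anatomy's frame, which is what lets the sensed instrument position be overlaid on the patient images.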
  • the surgical navigation system 116 can be an image-guided surgery system that is configured to correlate, in real time, a sensed position of the surgical instrument(s) 114 and one or more images of the patient anatomy (e.g., preoperative image(s) of the patient anatomy obtained prior to the surgical procedure).
  • the image(s) can be at least one image type selected from a group consisting of a computerized tomography (CT) scan, a magnetic resonance imaging (MRI) scan, and a three-dimensional map.
  • the surgical navigation system 116 can be configured to provide the controller 110 with the position data and image data relating to the patient anatomy.
  • the image data and the position data can both be related to a common frame of reference with respect to the patient anatomy.
  • the surgical navigation system 116 can provide the position data to the controller 110 without providing any image data.
  • the surgical instrument(s) 114 can provide the instrument data to the controller 110 and the surgical navigation system 116 can provide the position data to the controller 110.
  • the surgical instrument(s) 114 and/or the surgical navigation system(s) 116 can be communicatively connected with the controller 110 via a network.
  • the network can include one or more of the following: a direct or indirect physical communication connection, a mobile communication network, the Internet, an intranet, a Local Area Network, a Wide Area Network, a Storage Area Network, and any other form of connecting two or more systems, components, or storage devices together.
  • the controller 110 is a computing device that is configured to receive data (e.g., the instrument data and/or the position data) from the surgical devices 112 operated during one or more surgical procedures, and determine, based on the data, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance.
  • the controller 110 can be implemented using hardware, software, and/or firmware.
  • the controller 110 can include one or more processors 122 and a non-transitory computer-readable medium 124 (e.g., volatile and/or non-volatile memory) that stores machine language instructions or other executable instructions.
  • the instructions when executed by the one or more processors 122, cause the system 100 to carry out various operations described herein.
  • the controller 110 can thus receive data (including data indicated by the surgical instrument(s) 114 and/or the surgical navigation system(s) 116) and store the data in memory as well.
  • the processor(s) 122 and/or the non-transitory computer-readable medium 124 can be implemented in any number of physical devices/machines.
  • the controller 110 can include one or more shared or dedicated general purpose computer systems/servers. Accordingly, principles and advantages of distributed processing, such as redundancy, replication, and the like, also can be implemented, as desired, to increase the robustness and performance of the devices and systems of the controller 110.
  • the physical devices/machines can be implemented by the preparation of integrated circuits or by interconnecting an appropriate network of conventional component circuits, as is appreciated by those skilled in the electrical art(s).
  • the physical devices/machines may include field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), digital signal processors (DSPs), etc.
  • the physical devices/machines may reside on a wired or wireless network, e.g., LAN, WAN, Internet, cloud, near-field communications, etc., to communicate with each other and/or other systems, e.g., Internet/web resources.
  • the controller 110 can receive data from the surgical devices 112 operated during one or more surgical procedures, and determine, based on the data, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance.
  • the non-transitory computer-readable medium 124 has stored therein instructions that are executable to cause the processor(s) 122 to perform functions including: determining a plurality of procedure data sets for a plurality of surgical procedures, and determining, based on the plurality of procedure data sets, the plurality of surgical performance metrics that are indicative of the characteristic of surgical performance.
  • determining the plurality of procedure data sets can include, for each surgical procedure of the plurality of surgical procedures, determining a respective procedure data set of the plurality of procedure data sets by (i) receiving, from the surgical instrument(s) 114, the instrument data related to an operation of the surgical instrument(s) 114 during the surgical procedure, where the instrument data is based on the one or more instrument parameters determined by the surgical instrument(s) 114 at a plurality of points in time during the surgical procedure, (ii) receiving, from the surgical navigation system(s) 116, the position data that is indicative of the position of the surgical instrument(s) 114 relative to the patient anatomy at the plurality of points in time during the surgical procedure, (iii) determining, based on at least one of the position data or the instrument data, kinematic data at the plurality of points in time, and (iv) correlating, for each of the plurality of points in time, the instrument data, the position data, and the kinematic data to determine the respective procedure data set for the surgical procedure.
  • the kinematic data can include data for one or more kinematic parameters selected from a group consisting of: (i) a trajectory of the surgical instrument(s) 114, (ii) a velocity of the surgical instrument(s) 114, (iii) a motion of the surgical instrument(s) 114 in three-dimensional space, (iv) an inertia of the surgical instrument(s) 114, (v) an acceleration of the surgical instrument(s) 114, (vi) a chatter of the surgical instrument(s) 114 (e.g., a movement due to an interaction between the surgical instrument(s) 114 and the patient anatomy such as, for instance, the surgical instrument(s) 114 bouncing off of a bone of the patient), (vii) a jitter of the surgical instrument(s) 114, a smoothness of movement of the surgical instrument(s) 114 (e.g., a jerkiness of the movement of the surgical instrument(s) 114 due to, for instance, relatively quick starts and stops and/or relatively quick changes in direction of movement), an application force applied by the surgical instrument(s) 114 to a patient's anatomy at a surgical site, a movement deviation relative to a preoperative planned route for the surgical instrument(s) 114, and (xii) a site of operation of the surgical instrument(s) 114 relative to a preoperative planned target site.
  • the kinematic data can include information relating to intentional movements of the surgical instrument(s) 114 by the surgeon relative to the patient anatomy, unintentional movements of the surgical instrument(s) 114 by the surgeon relative to the patient anatomy, and/or movements of the surgical instrument(s) 114 due to interactions with the patient anatomy.
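The text does not specify how the kinematic data is derived from the position data. One minimal sketch, assuming the navigation system supplies timestamped `(t, [x, y, z])` samples and using simple finite differences (function names and the jerk-based smoothness proxy are illustrative assumptions, not taken from the disclosure), is:

```python
def derivative(series):
    """Differentiate a list of (t, vector) samples with forward differences."""
    out = []
    for (t0, v0), (t1, v1) in zip(series, series[1:]):
        dt = t1 - t0
        out.append((t1, [(b - a) / dt for a, b in zip(v0, v1)]))
    return out

def kinematics(positions):
    """positions: list of (t, [x, y, z]) sensed by the navigation system.

    Returns velocity, acceleration, and jerk series; a large jerk magnitude
    can serve as a rough proxy for a jerky (less smooth) technique.
    """
    vel = derivative(positions)   # instrument velocity over each interval
    acc = derivative(vel)         # instrument acceleration
    jerk = derivative(acc)        # rate of change of acceleration
    return vel, acc, jerk
```

Higher-order parameters such as chatter would need frequency-domain analysis of the same samples; this sketch covers only the differentiable quantities.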
  • the processor 122 can correlate the instrument data, the position data, and the kinematic data with each other at the plurality of points in time.
  • the processor 122 can correlate the instrument data, the position data, and the kinematic data with each other based on timing information (e.g., timestamp information) provided by the surgical instrument(s) 114 and the surgical navigation system(s) 116.
  • This time-wise synchronization of the instrument data, the position data, and the kinematic data can allow each procedure data set to represent a more complete picture of how the surgeon used the surgical instrument(s) 114 and/or how the surgical instrument(s) 114 themselves performed during the surgical procedure as compared to considering such data in isolation.
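The timestamp-based correlation described above could be implemented as a nearest-timestamp join across the three streams. The sketch below assumes each stream arrives as a `{timestamp: payload}` mapping and that a fixed tolerance bounds acceptable clock skew; both assumptions, and all names, are illustrative:

```python
def correlate(instrument, position, kinematic, tol=0.05):
    """Join three timestamped streams on nearest timestamps within `tol` seconds.

    Returns one record per instrument timestamp that has a match in both of
    the other streams, i.e., one row of a procedure data set.
    """
    def nearest(t, stream):
        best = min(stream, key=lambda s: abs(s - t))
        return stream[best] if abs(best - t) <= tol else None

    records = []
    for t, inst in sorted(instrument.items()):
        pos = nearest(t, position)
        kin = nearest(t, kinematic)
        if pos is not None and kin is not None:
            records.append({"t": t, "instrument": inst,
                            "position": pos, "kinematic": kin})
    return records
```

A production system would more likely resample all streams onto a common clock, but the nearest-match join conveys the same time-wise synchronization idea.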
  • the processor 122 can determine the surgical performance metrics based on objective data, which can provide a more consistent and superior basis for evaluating and characterizing surgical performance as compared to prior approaches (e.g., based on subjective peer review of video footage).
  • the processor 122 can determine the surgical performance metrics using descriptive analytics, diagnostic analytics, predictive analytics, and/or prescriptive analytics to analyze the procedure data sets.
  • the processor 122 can, for example, analyze the sets of procedure data to identify patterns in surgical technique (e.g., indicated by the instrument data, the position data, and the kinematic data) that are predictors for clinical outcomes.
  • the processor 122 can additionally or alternatively determine the surgical performance metrics by using the procedure data sets as training data for a machine learning algorithm. As the surgical performance metrics are determined based, at least in part, on procedure data sets from a plurality of surgical procedures, the processor 122 can analyze the procedure data sets from multiple surgical performances to obtain insights that may be useful for future surgical procedures.
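The machine learning algorithm is left unspecified in the text. As one hedged stand-in, a small pure-Python logistic regression could relate a per-procedure feature vector (e.g., a normalized jerk measure, an assumed input) to a binary outcome label; every name here is illustrative:

```python
import math

def train_logistic(features, labels, lr=0.5, epochs=500):
    """Fit weights w and bias b by per-sample gradient descent on logistic loss."""
    w = [0.0] * len(features[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(features, labels):
            p = predict(w, b, x)
            err = p - y                       # gradient of the logistic loss
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Sigmoid of the linear score: estimated probability of a positive outcome."""
    return 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
```

Trained on procedure data sets from many surgeries, such a model illustrates how patterns in technique could be turned into outcome predictors, though the disclosure does not commit to any particular model family.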
  • the surgical performance metrics can provide a basis for preoperatively planning a future surgical procedure, providing intraoperative feedback to a surgeon during a surgical procedure, and/or providing postoperative feedback to a surgeon and/or a patient on the performance of a surgical procedure.
  • determining the surgical performance metrics can include (i) detecting, based on the procedure data sets, an occurrence of a surgical event during one or more surgical procedures of the surgical procedures, (ii) identifying one or more portions of the procedure data sets that are indicative of a cause of the occurrence of the surgical event, and (iii) determining the surgical performance metrics based on the one or more portions of the procedure data sets identified being indicative of the cause of the occurrence of the surgical event.
  • the surgical event can be at least one event selected from a group consisting of: (i) chatter of the surgical instrument(s) 114, (ii) a wrap event (e.g., a wrapping of gauze and/or tissue around a rotating element of the surgical instrument(s) 114), (iii) overheating of the surgical instrument(s) 114, and (iv) proximity of the surgical instrument(s) 114 to critical anatomical structures.
  • the surgical performance metrics can provide information that can help to better understand characteristics of surgical performance that may increase and/or reduce a risk of the occurrence of the surgical event. This information can help to preoperatively plan a future surgical procedure, provide intraoperative feedback to a surgeon during a surgical procedure, and/or provide postoperative information to provide feedback on the performance of a surgical procedure.
  • the surgical performance metrics are indicative of a characteristic of surgical performance.
  • the surgical performance metrics can include one or more threshold values defining a range of expected values for at least one of (i) one or more instrument parameters of the instrument data or (ii) one or more kinematic parameters of the kinematic data.
  • the threshold value(s) can be communicated to a surgeon preoperatively to provide guidance for performing a future surgical procedure, intraoperatively to provide real-time feedback to the surgeon during a surgical procedure, and/or postoperatively to provide feedback to a surgeon as to when and/or by how much the surgeon exceeded and/or deviated from the threshold value(s) during a surgical procedure. Additional uses of surgical performance metrics including threshold value(s) are described in further detail below.
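A threshold-based metric of this kind reduces to range checks over the synchronized samples. The sketch below assumes metrics arrive as `{parameter: (low, high)}` ranges and samples as `(t, {parameter: value})` rows; the parameter names are hypothetical:

```python
def check_thresholds(samples, metrics):
    """Flag every time point where a parameter leaves its expected range.

    Returns (t, parameter, value) tuples, usable either for real-time alerts
    during a procedure or for a postoperative audit of deviations.
    """
    deviations = []
    for t, values in samples:
        for name, (low, high) in metrics.items():
            v = values.get(name)
            if v is not None and not (low <= v <= high):
                deviations.append((t, name, v))
    return deviations
```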
  • the surgical performance metrics can additionally or alternatively define a scoring system for evaluating the instrument data, the position data, and the kinematic data of at least one surgical procedure selected from among the plurality of surgical procedures.
  • the surgical performance metrics can define data for comparison to instrument data, position data, and/or kinematic data for a surgical procedure to be scored.
  • the surgical performance metrics can further define one or more scores that can be based on the comparison between the surgical performance metrics and the instrument data, the position data, and/or the kinematic data for the surgical procedure to be scored.
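The scoring system is not spelled out; one simple instantiation, assumed here for illustration, awards each parameter a weighted share of 100 points and deducts points in proportion to the relative deviation from the reference value:

```python
def score_procedure(observed, reference, weights=None):
    """Score observed per-procedure parameter averages against reference metrics.

    observed / reference: {"param": value}. Each parameter contributes up to
    its weight in points, scaled down by |observed - reference| / |reference|.
    """
    weights = weights or {k: 100.0 / len(reference) for k in reference}
    score = 0.0
    for name, ref in reference.items():
        rel_dev = abs(observed[name] - ref) / abs(ref)
        score += weights[name] * max(0.0, 1.0 - rel_dev)
    return round(score, 1)
```

A procedure matching the reference exactly scores 100; larger deviations in instrument, position, or kinematic averages lower the score proportionally.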
  • the surgical performance metrics can additionally or alternatively define a classification system for classifying a plurality of types of surgical technique.
  • the plurality of types of surgical technique can include two or more types selected from among a group consisting of: (i) an aggressive approach to surgical performance, (ii) a conservative approach to surgical performance, (iii) a smoother approach to surgical performance, (iv) a jerky approach to surgical performance, (v) a relatively fast approach to surgical performance (e.g., a shorter time for performing a surgical procedure), and (vi) a relatively slow approach to surgical performance.
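The technique classes above are paired opposites, so a minimal classifier could threshold one summary statistic per axis. The cut-off values below are arbitrary placeholders, not clinically derived:

```python
def classify_technique(mean_speed, mean_jerk, duration_s,
                       speed_ref=0.02, jerk_ref=0.5, duration_ref=1800):
    """Label a technique along the three axes named in the disclosure:
    aggressive/conservative, jerky/smooth, and fast/slow."""
    labels = []
    labels.append("aggressive" if mean_speed > speed_ref else "conservative")
    labels.append("jerky" if mean_jerk > jerk_ref else "smooth")
    labels.append("fast" if duration_s < duration_ref else "slow")
    return labels
```

In practice the reference values themselves would come from the learned surgical performance metrics rather than fixed constants.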
  • the surgical performance metrics can additionally or alternatively provide for determining a plurality of reference fingerprints.
  • the controller 110 can use one or more dimension reduction techniques to identify, extract, and summarize characteristics of surgical technique into a unique identifier.
  • the controller 110 can further be configured to provide recommendations and/or guidance related to surgical technique, settings for the surgical instrument(s) 114, selection of a subset of the surgical instrument(s) 114 from among a plurality of surgical instrument(s) 114, and/or selection of a working element for the surgical instrument(s) 114 from among a plurality of working elements based on the reference fingerprints.
  • the controller 110 can be configured to determine, based on one or more surgical procedures performed by a specific surgeon, a query fingerprint for the surgeon, make a comparison of the query fingerprint to the reference fingerprints, identify the reference fingerprint that most closely matches the query fingerprint, and provide information associated with the identified reference fingerprint to the surgeon preoperatively and/or intraoperatively to facilitate a surgical procedure.
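The dimension-reduction technique behind the fingerprints is not named. As a deliberately simple stand-in, a fingerprint could be the per-feature mean and standard deviation over a surgeon's procedures, with the query matched to the nearest reference by Euclidean distance; all names are illustrative:

```python
def fingerprint(feature_rows):
    """Collapse per-procedure feature rows into a compact identifier:
    the per-feature mean followed by the per-feature standard deviation."""
    n = len(feature_rows)
    dims = len(feature_rows[0])
    means = [sum(r[d] for r in feature_rows) / n for d in range(dims)]
    stds = [(sum((r[d] - means[d]) ** 2 for r in feature_rows) / n) ** 0.5
            for d in range(dims)]
    return means + stds

def closest_reference(query, references):
    """Return the key of the reference fingerprint nearest the query fingerprint."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(references, key=lambda k: dist(references[k], query))
```

A real system would more plausibly use PCA or an autoencoder for the reduction step, but the match-a-query-to-the-nearest-reference flow is the same.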
  • the surgical instrument(s) 114 and the surgical navigation system(s) 116 can be a single surgical instrument 114 and a single surgical navigation system 116 used during all of the surgical procedures.
  • the surgical procedures can be performed using re-usable surgical instrument(s) 114 and re-usable surgical navigation system(s) 116 at a single location (e.g., in a single operating room).
  • different surgical instruments 114 and/or different surgical navigation systems 116 can be used during the surgical procedures.
  • the instrument data can be obtained from surgical instruments 114 and/or the position data can be obtained from surgical navigation systems 116 at a plurality of different locations.
  • the surgical instruments 114 and/or the surgical navigation systems 116 can be disposable devices that are intended to be discarded after use during a single surgical procedure.
  • the controller 110 can determine the surgical performance metrics based on the instrument data and the position data. In other examples, the controller 110 can evaluate and/or aid surgical performance based on additional or alternative sources of information.
  • FIG. 2 shows a simplified schematic diagram of a system 200, including one or more additional surgical devices 112 and/or one or more data sources 226, according to another example.
  • the system 200 includes the controller 110 and the surgical device(s) 112, as described above with respect to Figure 1.
  • the controller 110 includes the processor(s) 122 and the non-transitory computer-readable medium 124, and the surgical device(s) 112 can include the surgical instrum ent(s) 114 and the surgical navigation system(s) 116, as described above with respect to Figure 1.
  • the data source(s) 226 can include one or more data sources selected from among a group consisting of: an outcome data source 228, a surgeon historical data source 230, and a patient-specific data source 232.
  • the data source(s) 226 can each be communicatively connected to the controller 110 (e.g., via a network as described above with respect to the surgical instrument(s) 114, the surgical navigation system(s) 116, and the controller 110).
  • the processor 122 can use the data provided by the data source(s) 226 in connection with the procedure data sets to determine the surgical performance metrics.
  • the outcome data source 228 can store outcome data relative to postoperative outcomes of the surgical procedures.
  • the processor 122 can additionally or alternatively receive, for each surgical procedure, respective outcome data relating to a postoperative outcome of the surgical procedure, and the processor 122 can determine the plurality of surgical performance metrics further based on the respective outcome data.
  • the outcome data can include an indication of at least one postoperative outcome selected from among a group consisting of: (i) patient-reported pain scores, (ii) hospital stay duration, (iii) postoperative complications, (iv) restitution of function, (v) alleviation of presenting symptoms, and (vi) mortality.
  • Determining the surgical performance metrics based on the sets of procedure data and the outcome data associated with each set of procedure data can help to identify aspects of instrument data, position data, and kinematic data (which can be indicative of surgical technique) that may lead to positive surgical outcomes and/or negative surgical outcomes. This in turn can help to determine surgical performance metrics that can provide actionable insights to surgeons.
  • the surgeon historical data source 230 can store historical surgeon data.
  • the processor 122 can additionally or alternatively receive, for each surgical procedure of the plurality of surgical procedures, the historical surgeon data relating to one or more surgical procedures performed in the past by a surgeon performing the surgical procedure, and the processor 122 can determine the surgical performance metrics further based on the historical surgeon data.
  • the historical surgeon data can include at least one item of information selected from among a group consisting of: a quantity of surgical procedures performed in the past, a type of surgical procedures performed in the past, years of experience, a number of hours of surgery performed by the surgeon, certifications of the surgeon, and an average time for completing surgical procedures.
  • Determining the surgical performance metrics based on the sets of procedure data and the historical surgeon data associated with each set of procedure data can also help to identify aspects of instrument data, position data, and kinematic data that may lead to positive surgical outcomes and/or negative surgical outcomes for surgical procedures performed by surgeons having certain levels of experience and/or surgical tendencies.
  • the historical surgeon data can additionally or alternatively be used to determine the surgical performance metrics based on preoperative information provided for a future surgical performance to be performed by a specific surgeon.
  • the patient-specific data source 232 can store patient-specific data relating to one or more health records for the patients of the surgical procedures.
  • the processor 122 can additionally or alternatively receive, for each surgical procedure of the plurality of surgical procedures, the patient-specific data relating to one or more health records for a patient of the surgical procedure, and the processor 122 can determine the surgical performance metrics further based on the patient-specific data.
  • the patient-specific data can include at least one item of information selected from among a group consisting of: an age, a sex, a height, a weight, a bone density, a body mass index, an allergy indication, a medical history of the patient, demographics, family health history, lab and test results, medications, prior diagnoses, progress notes, medical images (e.g., radiology images, CT images, and/or MRI images), immunizations, patient-reported outcome measures (PROMs), and information relating to the nature of the medical condition of the patient (e.g., characteristics of cell tissue and/or bone).
  • Determining the surgical performance metrics based on the sets of procedure data and the patient-specific data associated with each set of procedure data can also help to identify aspects of instrument data, position data, and kinematic data that may lead to positive surgical outcomes and/or negative surgical outcomes for patients having certain medical histories and/or health conditions.
  • the patient-specific data can additionally or alternatively be used to determine the surgical performance metrics based on preoperative information provided for a future surgical performance to be performed on a specific patient.
  • the processor(s) 122 can receive image data relating to the patient anatomy from the surgical navigation system 116.
  • the processor(s) 122 can additionally or alternatively receive the image data from the patient-specific data source 232.
  • the image data received from the patient-specific data source 232 and the position data received from the surgical navigation system can both relate to a common frame of reference with respect to the patient anatomy.
  • the image data received from the patient-specific data source can be generated using a registration system that is separate from the surgical navigation system 116.
  • the surgical device(s) 112 can additionally or alternatively include one or more patient monitoring device(s) 234 communicatively connected to the controller 110.
  • the patient monitoring device(s) 234 can determine patient physiological data related to a physiological condition of the patients at the plurality of points in time during the surgical procedures.
  • the patient physiological data can relate to at least one physiological parameter selected from a group consisting of: (i) a heart rate of the patient, (ii) a respiratory rate of the patient, (iii) a body temperature of the patient, (iv) a blood pressure of the patient, and (v) an oxygen saturation of the patient.
  • the patient monitoring device(s) 234 can include one or more patient sensors 236 that are configured to sense the physiological parameter.
  • the processor 122 can receive, for each surgical procedure of the plurality of surgical procedures, the patient physiological data related to the physiological condition of the patient at the plurality of points in time during the surgical procedure. Additionally, for each surgical procedure of the plurality of surgical procedures, the processor 122 can determine the respective procedure data set by correlating, for each of the plurality of points in time, the instrument data, the position data, the kinematic data, and the patient physiological data to determine the respective procedure data set for the surgical procedure.
  • the surgical device(s) 112 can additionally or alternatively include one or more surgeon monitoring device(s) 238 communicatively connected to the controller 110.
  • the surgeon monitoring device(s) 238 can determine surgeon physiological data related to a physiological condition of the surgeon at the plurality of points in time during the surgical procedures.
  • the surgeon physiological data can relate to at least one physiological parameter selected from a group consisting of: (i) a heart rate of the surgeon, (ii) a respiratory rate of the surgeon, (iii) a body temperature of the surgeon, (iv) an eye blink rate of the surgeon, (v) a measure of pupil dilation of the surgeon, (vi) a measure of eye fixation of the surgeon, (vii) a measure of saccade by the surgeon, and (viii) a measure of body movement by the surgeon.
  • the surgeon monitoring device(s) 238 can include one or more surgeon sensors 240 that are configured to sense the physiological parameter.
  • the surgeon monitoring device(s) 238 can include a wearable sensor in contact with the skin of the surgeon (e.g., an Oura ring and/or a strap-based sensor) and/or a wearable sensor that can monitor the eyes of the surgeon (e.g., a pair of glasses having eye tracking features).
  • the processor 122 can receive, for each surgical procedure of the plurality of surgical procedures, the surgeon physiological data related to the physiological condition of the surgeon at the plurality of points in time during the surgical procedure. Additionally, for each surgical procedure of the plurality of surgical procedures, the processor 122 can determine the respective procedure data set by correlating, for each of the plurality of points in time, the instrument data, the position data, the kinematic data, and the surgeon physiological data to determine the respective procedure data set for the surgical procedure. By determining the surgical performance metrics based on procedure data sets that include the surgeon physiological data, the surgical performance metrics can provide insights into how the surgeon physiological data can affect patient outcomes.
  • the surgeon physiological data can additionally or alternatively provide an indication as to a cognitive load on the surgeon performing the surgical procedure.
  • the processor 122 can use the surgeon physiological data to determine when a cognitive load of a surgeon is high and responsively provide feedback to help reduce the cognitive load.
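How the cognitive load estimate is computed is left open. One hedged sketch combines heart rate elevation with blink-rate suppression (blink rate tends to fall under high load) into a 0-to-1 score; the weighting, baselines, and threshold below are illustrative placeholders, not validated clinical values:

```python
def cognitive_load(hr_bpm, blink_rate_hz, rest_hr=65.0, rest_blink=0.3):
    """Estimate cognitive load on a 0..1 scale from two wearable-sensor signals."""
    hr_term = max(0.0, (hr_bpm - rest_hr) / rest_hr)          # HR above baseline
    blink_term = max(0.0, (rest_blink - blink_rate_hz) / rest_blink)  # blinks suppressed
    return min(1.0, 0.5 * hr_term + 0.5 * blink_term)

def feedback_needed(hr_bpm, blink_rate_hz, threshold=0.4):
    """True when estimated load exceeds the threshold, triggering guidance output."""
    return cognitive_load(hr_bpm, blink_rate_hz) > threshold
```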
  • one or more of the surgeon monitoring device(s) 238 can be coupled to the surgeon and at least one of the surgeon monitoring device(s) 238 can be coupled to one or more other members of a surgical team.
  • the surgeon monitoring device(s) 238 can determine the surgeon physiological data related to the physiological condition of the surgeon and the other members of the surgical team at the plurality of points of time during the surgical procedures as described above.
  • the physiological conditions sensed by the surgeon monitoring device(s) can, in some instances, additionally provide an indication of a psychological condition of the surgeon. Accordingly, within examples, determining the surgical performance metrics based on procedure data sets that include the surgeon physiological data can also provide insights into how a mental or emotional state of a surgeon and/or other members of a surgical team can affect patient outcomes.
  • the system 200 can additionally or alternatively include a user interface 242 that can receive one or more inputs from a user and/or provide one or more outputs to the user.
  • the user interface 242 can include one or more buttons, one or more switches, one or more dials, one or more keypads, one or more touchscreens, one or more display devices 244, one or more indicator lights, one or more speakers, and/or one or more haptic output devices.
  • the user interface 242 is communicatively connected with the controller 110.
  • the controller 110 can determine the surgical performance metrics to provide preoperative, intraoperative, and/or postoperative information to a surgeon and/or a patient in connection with one or more surgical procedures.
  • Example implementations for providing preoperative, intraoperative, and/or postoperative information to a surgeon and/or a patient for a surgical procedure are described below.
  • the processor 122 can receive preoperative information for a surgical procedure to be performed.
  • the processor 122 can receive the preoperative information from, for example, the user interface 242 and/or the data sources 226.
  • the preoperative information can include the patient-specific data for the patient of the surgical procedure to be performed and/or the historical surgeon data for the surgeon of the surgical procedure to be performed.
  • the preoperative information can additionally or alternatively include surgeon preference data 246 relating to one or more preferences of the surgeon that will perform the surgical procedure.
  • the surgeon preference data 246 can include one or more preferences selected from among a group consisting of: (i) a preference relating to a type of the surgical instrument(s) 114, (ii) a preference relating to a setting of the surgical instrument(s) 114, (iii) a preference relating to a manner of holding or gripping the surgical instrument(s) 114, and (iv) a preference for an order of steps for performing the surgical procedure.
  • the processor 122 can determine, using the preoperative information, a plurality of surgical performance metrics.
  • the processor 122 can be configured to use the preoperative information and the procedure data sets as inputs to determine the surgical performance metrics.
  • the surgical performance metrics can be tailored to the specific conditions of the surgical procedure to be performed as compared to implementations that do not use the preoperative information as an input.
  • the processor 122 can receive, from the surgical instrument(s) 114, the instrument data related to the operation of the surgical instrument(s) 114 at a plurality of points in time during the surgical procedure.
  • the processor 122 can also receive, from the surgical navigation system(s) 116, the position data that is indicative of a position of the surgical instrument relative to an anatomy of a patient at the plurality of points in time during the surgical procedure.
  • the processor 122 can determine, based on the position data, the kinematic data at the plurality of points in time.
  • the processor 122 can then perform an analysis of (i) the kinematic data and the instrument data relative to (ii) the plurality of surgical performance metrics.
  • the processor 122 can cause, based on the analysis, the user interface 242 to output information to provide feedback to the surgeon relating to the performance of the surgical procedure.
  • the user interface 242 can output the information via the display device 244, the speaker(s), the indicator light(s), and/or the haptic device(s). Examples of information that can be displayed on the display device 244 are shown and described below with respect to Figures 3-8.
  • the processor 122 can determine the plurality of surgical performance metrics, and output information preoperatively. For instance, the processor 122 can determine, based on the preoperative information and/or the surgical performance metrics, a procedure plan for performing the surgical procedure. The processor 122 can cause the user interface 242 to output information related to the procedure plan.
  • the procedure plan can include at least one item of information selected from among a group consisting of: (i) a preoperative planned route of movement of the surgical instrument(s) 114 and/or surgical implants, (ii) a target position for actuating the surgical instrument(s) 114, (iii) a selection of the surgical instrument(s) 114 from among a plurality of potential surgical instruments 114, and (iv) potential complications that may be encountered.
  • the processor 122 can additionally or alternatively perform the analysis and cause the user interface 242 to output the information intraoperatively in real-time during the surgical procedure.
  • the user interface 242 can output at least one item of information selected from a group consisting of: (i) a prediction of the occurrence of an adverse event (e.g., a breach of critical anatomical structures), (ii) an alert of sub-optimal performance of the surgical instrument(s) 114, (iii) an alert of deviation from the preoperative plan, and (iv) an alert of sub-optimal psychological and/or physiological status of the surgeon and/or surgical team. This can help to provide real-time feedback to help enhance surgical performance during the surgical procedure.
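A deviation-from-plan alert of this kind can be sketched as a distance check between the sensed tip position and the planned route. The sketch approximates the route as a polyline and measures distance to the nearest waypoint only (a real system would measure to the nearest segment); the 2 mm limit is an illustrative assumption:

```python
def route_deviation_alert(position, planned_route, limit_mm=2.0):
    """Return (alert, distance): alert is True when the instrument tip is more
    than `limit_mm` from the nearest waypoint of the preoperative planned route.

    position: [x, y, z] in mm; planned_route: list of [x, y, z] waypoints.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    nearest = min(dist(position, p) for p in planned_route)
    return nearest > limit_mm, nearest
```

Run per navigation sample, this check could drive the intraoperative deviation alert named in item (iii) above.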
  • the processor 122 can receive the surgeon physiological data related to the physiological condition of the surgeon performing the surgical procedure. The processor 122 can further make a determination, based on the surgeon physiological data, that a cognitive load on the surgeon is greater than a threshold amount of cognitive load. Responsive to the determination that the cognitive load on the surgeon is greater than the threshold amount of cognitive load, the processor 122 can cause the user interface 242 to output the information. In some examples, the information outputted by the user interface 242 can include instrument guidance information for operating the surgical instrument based on the plurality of surgical performance metrics.
  • the information outputted by the user interface 242 can additionally or alternatively include kinematic guidance for navigating the surgical instrument(s) 114 based on the plurality of surgical performance metrics.
  • the processor 122 can additionally or alternatively perform the analysis and cause the user interface 242 to output the information postoperatively.
  • the user interface 242 can provide information to provide a postoperative audit of the surgeon’s surgical performance during the surgical procedure.
  • the processor 122 can additionally or alternatively determine and/or iteratively update the surgical performance metrics based on the instrument data, the position data, and/or the kinematic data received during the surgical procedure. For instance, in another example, the processor 122 can receive, from the surgical instrument(s) 114, instrument data related to the operation of the surgical instrument(s) 114 at a plurality of points in time during a surgical procedure. The processor 122 can additionally receive, from the surgical navigation system 116, position data that is indicative of the position of the surgical instrument(s) 114 relative to the anatomy of a patient at the plurality of points in time during the surgical procedure.
  • the processor 122 can determine, based on the position data, kinematic data at the plurality of points in time, and determine, using the instrument data and the kinematic data, the surgical performance metrics that are indicative of a characteristic of surgical performance.
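Deriving kinematic data from timestamped position data, as described above, can be done by finite differencing. The sketch below is an assumed one-dimensional illustration (the disclosure does not specify the derivation); real navigation data would be 3-D poses with noise filtering.

```python
def derive_kinematics(times, positions):
    """Return per-interval velocities and accelerations via finite differences.

    `times` and `positions` are parallel lists of samples; this is a
    simplified 1-D sketch of deriving kinematic data from position data.
    """
    velocities = []
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        velocities.append((positions[i] - positions[i - 1]) / dt)
    accelerations = []
    for i in range(1, len(velocities)):
        dt = times[i + 1] - times[i]
        accelerations.append((velocities[i] - velocities[i - 1]) / dt)
    return velocities, accelerations
```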
  • the processor 122 can also perform an analysis of (i) the kinematic data and the instrument data relative to (ii) the plurality of surgical performance metrics.
  • the processor 122 can further cause, based on the analysis, the user interface 242 to output information to provide feedback to a surgeon relating to a performance of the surgical procedure.
  • the processor 122 can cause the user interface 242 to output the information preoperatively, intraoperatively, and/or postoperatively as described above.
  • the processor 122 can determine the surgical performance metrics, perform the analysis, and cause the user interface 242 to output the information intraoperatively in real-time during the surgical procedure.
  • the processor 122 can additionally or alternatively cause the user interface 242 to output information responsive to the processor 122 determining that a level of cognitive load is greater than the threshold amount of cognitive load, as described above.
  • the processor 122 can additionally or alternatively cause the surgical instrument 114 to automatically adjust one or more of the instrument parameter(s) based on the instrument data, the position data, the kinematic data, the surgeon physiological data, the patient physiological data received during the surgical procedure, and/or the surgical performance metrics.
  • the processor 122 can intraoperatively cause the surgical instrument 114 to adjust one or more of: a motor speed, a direction of motor rotation, a motor torque, a motor temperature, a motor current, a motor power consumption, an electrosurgical current, an electrosurgical voltage, an electrosurgical waveform, an electrosurgical impedance (e.g., a resistance encountered by the surgical instrument(s) 114 while cutting/aspirating/coagulating tissue), an irrigation flow rate, a suction flow rate, a depth control (e.g., a depth level for screw placement and/or a cutting depth), and/or a visibility setting for a camera (e.g., an illumination intensity, a white balance setting, an image magnification setting, a focus setting, an image enhancement feature, and/or an air and water insufflation setting).
  • the processor 122 can perform an analysis of (i) the kinematic data and the instrument data relative to (ii) the plurality of surgical performance metrics.
  • the processor 122 can further determine, based on the analysis, an adjustment to the instrument parameter(s), and responsively cause the surgical instrument 114 to adjust the one or more instrument parameter(s) in accordance with the adjustment.
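A closed-loop adjustment of an instrument parameter toward a metric-defined expected range, of the kind described above, can be sketched as follows. The parameter (motor speed), range, and step size are illustrative assumptions; the disclosure does not specify the control law.

```python
def adjust_motor_speed(current_rpm: float, expected_range: tuple, step: float = 100.0) -> float:
    """Nudge motor speed back toward the expected range defined by a metric.

    Moves at most `step` RPM per call, clamping at the range boundary.
    """
    low, high = expected_range
    if current_rpm < low:
        return min(current_rpm + step, low)
    if current_rpm > high:
        return max(current_rpm - step, high)
    return current_rpm  # already within the expected range; no adjustment
```

A design note: stepping gradually rather than jumping directly into range would avoid abrupt changes in instrument behavior mid-procedure, though the disclosure leaves this choice open.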
  • the processor 122 can additionally or alternatively receive the surgeon physiological data related to the physiological condition of the surgeon performing the surgical procedure.
  • the processor 122 can make a determination, based on the surgeon physiological data, that a cognitive load on the surgeon is greater than a threshold amount of cognitive load. Responsive to the determination that the cognitive load on the surgeon is greater than the threshold amount of cognitive load, the processor 122 can cause the surgical instrument 114 to adjust the one or more instrument parameter(s).
  • the processor 122 and the user interface 242 can be configured to allow the surgeon and/or an institution (e.g., an employer of the surgeon and/or a hospital administrator at an institution where the surgeon has practice privileges) to set one or more goals for surgical performance.
  • the processor 122 can additionally or alternatively be configured to receive, from the user interface 242, surgical performance goal data relating to the one or more goals for surgical performance.
  • the one or more goals for surgical performance can include one or more goals selected from a group consisting of: (i) a goal to understand the surgical technique of the surgeon, (ii) a goal to understand management of workflow in the operating room, (iii) a goal to improve safety associated with surgical technique, (iv) a goal to obtain a balance between safety and efficacy, (v) a goal to understand surgical outcomes for surgical procedures performed by the surgeon, (vi) a goal to reduce complication rates, (vii) a goal to obtain continuing professional development (CPD) credits, (viii) a goal to train operating room staff, (ix) a goal to understand patient factors, (x) a goal to reduce wear and tear on a body of the surgeon, and/or (xi) a goal to learn and progress surgical skills.
  • the user interface 242 can receive a user input selecting one or more of the goals for surgical performance and communicate the user input to the processor 122.
  • the processor 122 can, based on the user input, determine the surgical performance goal data.
  • the processor 122 can further set and/or adjust the surgical performance metrics based on the surgical performance goal data.
  • the surgical performance goals can help to fine tune the surgical performance metrics, and guide current and/or future surgical performance towards desired values of the surgeon and/or the institution.
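The goal-based tuning of metrics described above can be sketched as a mapping from selected goals to metric overrides. The goal keys, metric names, and weight values below are all hypothetical, chosen only to illustrate the mechanism.

```python
# Baseline metrics before any goal is applied (illustrative values).
BASE_METRICS = {"smoothness_weight": 1.0, "speed_weight": 1.0, "safety_margin_mm": 2.0}

# Per-goal overrides; a real system might derive these from historical data.
GOAL_ADJUSTMENTS = {
    "improve_safety": {"safety_margin_mm": 3.0},
    "reduce_complications": {"smoothness_weight": 1.5},
}


def apply_goals(metrics: dict, selected_goals: list) -> dict:
    """Return a copy of the metrics tuned by the user-selected goals."""
    tuned = dict(metrics)  # leave the baseline untouched
    for goal in selected_goals:
        tuned.update(GOAL_ADJUSTMENTS.get(goal, {}))
    return tuned
```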
  • the processor 122 can additionally or alternatively be configured to preoperatively, intraoperatively, and/or postoperatively provide information to the surgeon based on the surgical performance metrics.
  • the processor 122 can use the surgical performance goal data to determine what information is provided, when it is provided, and how it is provided to the surgeon to facilitate the surgeon’s progress towards the surgical performance goals (e.g., selected via the user interface 242).
  • the processor 122 can (i) perform an analysis of one or more data sets (e.g., the instrument data, the position data, the kinematic data, the outcome data, the historical surgeon data, the patient-specific data, the image data, the patient physiological data, and/or the surgeon physiological data) as described above, and (ii) determine, based on the analysis and the surgical performance goal data, information that can be outputted to the surgeon by the user interface 242.
  • the processor 122 is configured to cause the user interface 242 to output a first set of information for a first surgical performance goal data and a second set of information for a second surgical performance goal data, where the first surgical performance goal data is different from the second surgical performance goal data and the first set of information is different from the second set of information.
  • the processor 122 and the user interface 242 can be configured to provide information that shows progress towards the surgical performance goals and/or deficiencies in progress towards the surgical performance goals. This can help the surgeon and/or the institution better understand how the surgeon has performed and on what the surgeon may want to focus their attention for additional improvement in performance.
  • the display screen 350 can include a summary of historical surgeon data for a particular surgeon.
  • the display screen 350 can include a first indication 352A of a quantity of surgical procedures performed within a given timeframe and/or a second indication 352B of an average amount of time for completing the quantity of surgical procedures.
  • the display screen 350 can additionally include a first link 354A to upcoming surgical procedures that will be performed by the surgeon, a second link 354B to past surgical procedures that were previously performed by the surgeon, and/or a third link 354C to personalized insights based on an analysis of the procedure data sets associated with the surgeon and the surgical performance metrics determined by the processor 122.
  • a display screen 450 of the application for evaluating and/or aiding surgical performance is shown according to another example.
  • the processor 122 causes the user interface 242 to display a graphical depiction over time of the at least one of the instrument data, the position data, or the kinematic data.
  • the display screen 450 includes a respective identifier 456A-456E for each surgical instrument 114 and a graphical depiction 458A-458E of the instrument data for the surgical instrument 114 relative to an axis 460 indicating the plurality of points of time during the surgical procedure.
  • the display screen 450 includes a first identifier 456A for a drill corresponding to a first graphical depiction 458A of the instrument data for the drill, a second identifier 456B for a camera corresponding to a second graphical depiction 458B of the instrument data for the camera, a third identifier 456C for an aspirator corresponding to a third graphical depiction 458C of the instrument data for the aspirator, a fourth identifier 456D for a bipolar electrosurgical device corresponding to a fourth graphical depiction 458D of the instrument data for the bipolar electrosurgical device, and a fifth identifier 456E for a suction device corresponding to a fifth graphical depiction 458E of the instrument data for the suction device.
  • the plurality of surgical performance metrics define one or more ranges of expected values for at least one of (i) one or more instrument parameters of the instrument data or (ii) one or more kinematic parameters of the kinematic data.
  • the processor 122 can cause the user interface to display an indication 462 that a portion of the instrument data was outside of at least one of the one or more ranges of expected values defined by the plurality of surgical performance metrics.
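Flagging samples that fall outside a metric-defined expected range, which could drive an on-screen indication like the one described, can be sketched as below. The sample values and range are illustrative assumptions.

```python
def find_out_of_range(samples, expected_range):
    """Return indices of samples outside the [low, high] expected range.

    The returned indices could be mapped to timestamps on the display axis
    to position an out-of-range indication.
    """
    low, high = expected_range
    return [i for i, value in enumerate(samples) if value < low or value > high]
```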
  • the processor 122 can additionally or alternatively cause the user interface 242 to display a link 464A to a pre-operative image of an anatomy of a surgical site and/or display a link 464B to a post-operative image of the anatomy.
  • Figure 4 shows graphical depictions 458A-458E of the instrument data and an indication that instrument data was outside of the range(s) of expected values
  • the processor 122 can cause the user interface 242 to display graphical depictions of the kinematic data and/or an indication that a portion of the kinematic data was outside of at least one of the one or more ranges of expected values defined by the plurality of surgical performance metrics in other examples.
  • a display screen 550 of an application for evaluating and/or aiding surgical performance is shown according to an example.
  • the display screen 550 includes a plurality of indicators 556A-556C corresponding to respective graphical depictions 558A-558C of instrument parameters indicated by the instrument data at the plurality of points of time (e.g., indicated by an axis 560) during the surgical procedure.
  • the plurality of surgical performance metrics define one or more ranges of expected values for at least one of (i) one or more instrument parameters of the instrument data or (ii) one or more kinematic parameters of the kinematic data.
  • the processor 122 can cause the user interface to display an indication 562 that a portion of the instrument data was outside of at least one of the one or more ranges of expected values defined by the plurality of surgical performance metrics.
  • the processor 122 can cause the user interface 242 to display, based on the position data and the kinematic data, an animation 564 of a movement of the surgical instrument 114 overlaid on the image of the anatomy.
  • the animation 564 can include a plurality of colors and each color can be based on the kinematic data.
  • the colors can provide a color coded indication of the kinematic data such that there are N ranges of kinematic data values that are respectively indicated by N colors, where N is an integer value greater than two.
  • a first range of smoothness of movement can be indicated by a first color
  • a second range of smoothness of movement can be indicated by a second color
  • an nth range of smoothness of movement can be indicated by an nth color, where n is an integer value greater than two.
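The N-color banding described above can be sketched as a simple range lookup. The band boundaries and color names are assumptions; the disclosure only requires that N ranges of kinematic values map to N colors.

```python
def smoothness_color(value: float,
                     bounds=(0.33, 0.66),
                     colors=("red", "yellow", "green")) -> str:
    """Map a 0-1 smoothness score into one of len(colors) color bands.

    Values below the first bound get the first color, values below the
    second bound the second color, and so on; the last color catches the
    remainder.
    """
    for bound, color in zip(bounds, colors):
        if value < bound:
            return color
    return colors[-1]
```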
  • the processor 122 can correlate the kinematic parameters with each other at the plurality of points in time.
  • the processor 122 can correlate the instrument parameters with each other at the points in time.
  • the instrument data relates to a speed, a chatter, and a current of the surgical instrument 114.
  • the instrument data can relate to the operation of a motor of the surgical instrument(s) 114
  • the instrument data can relate to one or more instrument parameters selected from a group consisting of: a motor speed, a motor torque, a motor temperature, a motor current, and a motor power consumption.
  • a display screen 650 of an application for evaluating and/or aiding surgical performance is shown according to an example.
  • the processor 122 can cause the user interface 242 to display an amount of deviation between (i) an actual location of a device implanted by the surgical instrument 114, and (ii) a preoperatively planned location at which the device was to be implanted by the surgical instrument 114. Additionally, in Figure 6, the processor 122 can cause the user interface to display an indication 666 of a threshold amount of deviation defined by the plurality of surgical performance metrics.
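The deviation display described above implies a distance computation between the actual and the preoperatively planned implant locations. A minimal sketch, assuming 3-D point coordinates in millimeters and an illustrative threshold (the disclosure does not specify either):

```python
import math


def implant_deviation_mm(actual, planned):
    """Euclidean distance (mm) between actual and planned 3-D positions."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, planned)))


def exceeds_threshold(actual, planned, threshold_mm: float = 2.0) -> bool:
    """True when the deviation exceeds the metric-defined threshold."""
    return implant_deviation_mm(actual, planned) > threshold_mm
```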
  • a display screen 750 is shown for an application for evaluating and/or aiding surgical performance according to another example.
  • the processor 122 can cause the user interface 242 to simultaneously display (i) an animation 768 of a surgical procedure, and (ii) a graphical depiction over time of the at least one of the instrument data, the position data, or the kinematic data for the surgical procedure.
  • the display screen 750 includes a plurality of indicators 756A-756C corresponding to respective graphical depictions 758A-758C.
  • the graphical depictions 758A-758C include (i) a first graphical depiction 758A of an acceleration of a tip of the surgical instrument 114 indicated by the kinematic data at a plurality of points in time, (ii) a second graphical depiction 758B of a force indicated by the instrument data (e.g., and measured by a force sensor) over the plurality of points in time, and (iii) a third graphical depiction 758C of a revolutions per minute (RPM) of the working element of the surgical instrument 114 (e.g., a drill bit) over the plurality of points in time.
  • the processor 122 can be configured to play the animation 768 and display the graphical depictions 758A-758C in a time synchronized manner such that the display screen 750 can provide an indication of how the surgical instrument 114 was operated at each point in time over the course of the surgical procedure.
  • the animation 768 can include an animated object representing the surgical instrument 114 and an animated object representing the patient anatomy.
  • the processor 122 can determine the object representing the patient anatomy based on the image data relating to the patient anatomy. For instance, as described above, the processor 122 can receive the image data relating to the patient anatomy from the surgical navigation system 116 and/or the patient-specific data source 232, and the processor 122 can determine the animated object representing the anatomy based on the image data. This can provide a more realistic and/or accurate visualization of the surgical procedure relative to other examples in which the animated object representing the anatomy is a generic representation of the anatomy. However, in other examples, the animated object representing the anatomy can be a generic representation of the anatomy. This can, for example, help to reduce a computational load on the processor 122.
  • the processor 122 can determine the animation 768 based on the instrument data, the position data, and/or the kinematic data for the surgical procedure.
  • the animation 768 can depict (i) relative positions, orientations, and/or movements of the surgical instrument 114 relative to the patient anatomy, (ii) actuations of the surgical instrument 114, and/or (iii) interactions between the surgical instrument 114 and the patient anatomy at the plurality of points of time during the surgical procedure.
  • the animation 768 can be specific to the procedure performed by the practitioner as opposed to a generic animation. Additionally, this can help to provide improved visualization of the use of the surgical instrument 114 and/or the surgical technique.
  • the animation 768 can be a generic animation that is displayed for all performances of a particular type of surgical procedure. This can help, for instance, to reduce computational load on the processor 122.
  • the display screen 750 can further include the surgical performance metrics displayed on the animation 768.
  • the processor 122 can cause the display screen 750 to include a tool tip trajectory color-coded according to a smoothness of the movement of the surgical instrument 114.
  • the processor 122 can cause the display screen 750 to display text 770 that can provide a technique recommendation as feedback to a surgeon relating to a performance of the surgical procedure.
  • the displayed text 770 provides a technique recommendation that relates to a technique for handling the surgical instrument 114.
  • the displayed text 770 can provide a technique recommendation that relates to a recommendation to use a different surgical instrument 114 and/or a different working element on the surgical instrument 114.
  • the processor 122 can determine the text and the technique recommendation based on one or more of the analysis of the instrument data, the position data, the kinematic data, and/or the surgical performance metrics.
  • a display screen 850 is shown for an application for evaluating and/or aiding surgical performance according to another example.
  • the processor 122 can cause the user interface 242 to simultaneously display (i) the animation 768 of the surgical procedure, and (ii) a video 872 recorded by an image capture device during the surgical procedure.
  • the processor 122 can be configured to play the animation 768 and display the video 872 in a time synchronized manner such that the display screen 850 can provide an indication of how the surgical instrument 114 was operated at each point in time over the course of the surgical procedure.
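Time-synchronized playback of the animation 768 and the video 872 can be sketched as indexing both media against a shared procedure clock. The frame rates below are assumptions for illustration; the disclosure only requires that the two views advance together.

```python
def synced_frames(t_seconds: float,
                  animation_fps: float = 30.0,
                  video_fps: float = 60.0):
    """Return (animation_frame, video_frame) indices for procedure time t.

    Both indices are derived from the same clock, so the two views stay
    aligned even when their frame rates differ.
    """
    return int(t_seconds * animation_fps), int(t_seconds * video_fps)
```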
  • the image capture device is external to the surgical site such that the video 872 depicts the hands and/or the body of the practitioner from outside of the surgical site.
  • the animation 768 can show how the surgical instrument 114 interacted with the patient anatomy within the surgical site.
  • the video 872 can help to show a way in which the practitioner grasped the surgical instrument, moved the surgical instrument, and/or actuated the surgical instrument in a time synchronized manner with the animation 768 showing the resulting effect in the surgical site. This can help to provide additional or alternative insights into the surgical technique used during the surgical procedure.
  • the image capture device can be one of the surgical instruments 114 such that the video is obtained from within the surgical site.
  • the video can be captured by an endoscope, a surgical microscope, and/or an exoscope.
  • displaying the animation 768 in a time synchronized manner with the video 872 can help the practitioner to review the surgical procedure in conjunction with the view from the surgical site that was actually available to the practitioner during the surgical procedure.
  • the process 900 includes determining a plurality of procedure data sets for a plurality of surgical procedures at block 910, and determining, based on the plurality of procedure data sets, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance at block 912.
  • Determining the plurality of procedure data sets at block 910 can include, for each surgical procedure of the plurality of surgical procedures, determining a respective procedure data set of the plurality of procedure data sets by: (i) receiving, from a surgical instrument, instrument data related to an operation of a surgical instrument during the surgical procedure at block 914, where the instrument data is based on one or more instrument parameters determined by the surgical instrument at a plurality of points in time during the surgical procedure, (ii) receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to a patient anatomy at the plurality of points in time during the surgical procedure at block 916, (iii) determining, based on at least one of the position data or the instrument data, kinematic data at the plurality of points in time at block 918, and (iv) correlating, for each of the plurality of points in time, the instrument data, the position data, and the kinematic data to determine the respective procedure data set for the surgical procedure at block 920.
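The correlation step at block 920 (aligning instrument, position, and kinematic samples by timestamp into a procedure data set) can be sketched as below. The record field names are assumptions; the disclosure specifies only that the three data streams are correlated at each point in time.

```python
def build_procedure_data_set(times, instrument, position, kinematic):
    """Correlate parallel per-timestamp streams into one procedure data set.

    Each output record ties together the instrument, position, and
    kinematic samples taken at the same point in time.
    """
    return [
        {"t": t, "instrument": instrument[i],
         "position": position[i], "kinematic": kinematic[i]}
        for i, t in enumerate(times)
    ]
```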
  • the process 1000 includes (i) receiving, from a surgical instrument, instrument data related to an operation of the surgical instrument at a plurality of points in time during a surgical procedure at block 1010, (ii) receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to an anatomy of a patient at the plurality of points in time during the surgical procedure at block 1012, (iii) determining, based on the position data, kinematic data at the plurality of points in time at block 1014, (iv) determining, using the instrument data and the kinematic data, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance at block 1016, (v) performing an analysis of (a) the kinematic data and the instrument data relative to (b) the plurality of surgical performance metrics at block 1018, and (vi) causing, based on the analysis, a user interface to output information to provide feedback to a surgeon relating to a performance of the surgical procedure.
  • the process 1100 includes (i) receiving preoperative information for a surgical procedure to be performed at block 1110, (ii) determining, using the preoperative information, a plurality of surgical performance metrics at block 1112, (iii) receiving, from a surgical instrument, instrument data related to an operation of the surgical instrument at a plurality of points in time during a surgical procedure at block 1114, (iv) receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to an anatomy of a patient at the plurality of points in time during the surgical procedure at block 1116, (v) determining, based on the position data, kinematic data at the plurality of points in time at block 1118, (vi) performing an analysis of (a) the kinematic data and the instrument data relative to (b) the plurality of surgical performance metrics at block 1120, and (vii) causing, based on the analysis, a user interface to output information to provide feedback to a surgeon relating to a performance of the surgical procedure.
  • Any of the blocks shown in Figures 9-11 may represent a module, a segment, or a portion of program code, which includes one or more instructions executable by a processor for implementing specific logical functions or steps in the process.
  • the program code may be stored on any type of computer readable medium or data storage, for example, such as a storage device including a disk or hard drive. Further, the program code can be encoded on a computer-readable storage media in a machine-readable format, or on other non-transitory media or articles of manufacture.
  • the computer readable medium may include non-transitory computer readable medium or memory, for example, such as computer-readable media that stores data for short periods of time like register memory, processor cache, and Random Access Memory (RAM).
  • the computer readable medium may also include non-transitory media, such as secondary or persistent long-term storage, like read only memory (ROM), optical or magnetic disks, or compact-disc read only memory (CD-ROM), for example.
  • the computer readable media may also be any other volatile or non-volatile storage systems.
  • the computer readable medium may be considered a tangible computer readable storage medium, for example.
  • components of the devices and/or systems described herein may be configured to perform the functions such that the components are actually configured and structured (with hardware and/or software) to enable such performance.
  • Example configurations then include one or more processors executing instructions to cause the system to perform the functions.
  • components of the devices and/or systems may be configured so as to be arranged or adapted to, capable of, or suited for performing the functions, such as when operated in a specific manner.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Data Mining & Analysis (AREA)
  • Animal Behavior & Ethology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Robotics (AREA)
  • Pathology (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Medical Treatment And Welfare Office Work (AREA)
  • Instructional Devices (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

In an example, a system comprises a processor that can (i) receive, from a surgical instrument, instrument data related to an operation of the surgical instrument during a surgical procedure, (ii) receive, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to an anatomy of a patient during the surgical procedure, (iii) determine, based on the position data, kinematic data, (iv) determine, using the instrument data and the kinematic data, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance, (v) perform an analysis of (a) the kinematic data and the instrument data relative to (b) the plurality of surgical performance metrics, and (vi) cause, based on the analysis, a user interface to output information to provide feedback to a surgeon and/or stakeholders (e.g., a hospital administrator) relating to a performance of the surgical procedure.

Description

Systems and Methods for Evaluating and Aiding Surgical Performance
CROSS-REFERENCE TO RELATED APPLICATION
[0001] The present disclosure claims the benefit of U.S. Provisional Application No. 63/443,588, filed February 6, 2023, the contents of which are hereby incorporated by reference in their entirety.
FIELD
[0002] The present disclosure generally relates to systems and methods for obtaining and processing information relating to surgical performance, and more particularly to systems and methods for pre-operatively, intraoperatively, and/or postoperatively using data obtained from surgical instruments and/or devices in an operating room to evaluate surgical performance, aid surgical performance, and/or reduce a cognitive load on a surgeon during a surgical procedure.
BACKGROUND
[0003] A number of factors can affect a patient’s postoperative outcome for a surgical procedure. For example, several studies have found a relationship between a patient outcome and the technical skill of the surgeon that performed the surgery. As another example, during a surgery, a surgeon may be inundated with information from a variety of sources. The competing information provided to the surgeon may increase a cognitive load on the surgeon, which may negatively impact patient outcomes in some instances.
SUMMARY
[0004] In an example, a non-transitory computer-readable medium has stored therein instructions that are executable to cause a processor to perform functions including determining a plurality of procedure data sets for a plurality of surgical procedures, and determining, based on the plurality of procedure data sets, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance. Determining the plurality of procedure data sets can include, for each surgical procedure of the plurality of surgical procedures, determining a respective procedure data set of the plurality of procedure data sets by: (i) receiving, from a surgical instrument, instrument data related to an operation of a surgical instrument during the surgical procedure, where the instrument data is based on one or more instrument parameters determined by the surgical instrument at a plurality of points in time during the surgical procedure, (ii) receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to a patient anatomy at the plurality of points in time during the surgical procedure, (iii) determining, based on at least one of the position data or the instrument data, kinematic data at the plurality of points in time, and (iv) correlating, for each of the plurality of points in time, the instrument data, the position data, and the kinematic data to determine the respective procedure data set for the surgical procedure.
[0005] In another example, a non-transitory computer-readable medium has stored therein instructions that are executable to cause a processor to perform functions including (i) receiving, from a surgical instrument, instrument data related to an operation of the surgical instrument at a plurality of points in time during a surgical procedure, (ii) receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to an anatomy of a patient at the plurality of points in time during the surgical procedure, (iii) determining, based on the position data, kinematic data at the plurality of points in time, (iv) determining, using the instrument data and the kinematic data, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance, (v) performing an analysis of (a) the kinematic data and the instrument data relative to (b) the plurality of surgical performance metrics, and (vi) causing, based on the analysis, a user interface to output information to provide feedback to a surgeon relating to a performance of the surgical procedure.
[0006] In another example, a non-transitory computer-readable medium has stored therein instructions that are executable to cause a processor to perform functions including (i) receiving preoperative information for a surgical procedure to be performed, (ii) determining, using the preoperative information, a plurality of surgical performance metrics, (iii) receiving, from a surgical instrument, instrument data related to an operation of the surgical instrument at a plurality of points in time during a surgical procedure, (iv) receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to an anatomy of a patient at the plurality of points in time during the surgical procedure, (v) determining, based on the position data, kinematic data at the plurality of points in time, (vi) performing an analysis of (a) the kinematic data and the instrument data relative to (b) the plurality of surgical performance metrics, and (vii) causing, based on the analysis, a user interface to output information to provide feedback to a surgeon relating to a performance of the surgical procedure.
[0007] The features, functions, and advantages that have been discussed can be achieved independently in various embodiments or may be combined in yet other embodiments, further details of which can be seen with reference to the following description and drawings.

BRIEF DESCRIPTION OF THE FIGURES
[0008] The novel features believed characteristic of the illustrative embodiments are set forth in the appended claims. The illustrative embodiments, however, as well as a preferred mode of use, further objectives and descriptions thereof, will best be understood by reference to the following detailed description of an illustrative embodiment of the present disclosure when read in conjunction with the accompanying drawings, wherein:
[0009] Figure 1 illustrates a simplified block diagram of a system for evaluating and/or aiding surgical performance, according to an example embodiment.
[0010] Figure 2 illustrates a simplified block diagram of a system for evaluating and/or aiding surgical performance, according to another example.
[0011] Figure 3 depicts a first display screen of an application for evaluating and/or aiding surgical performance, according to an example.
[0012] Figure 4 depicts a second display screen of an application for evaluating and/or aiding surgical performance, according to an example.
[0013] Figure 5 depicts a third display screen of an application for evaluating and/or aiding surgical performance, according to an example.
[0014] Figure 6 depicts a fourth display screen of an application for evaluating and/or aiding surgical performance, according to an example.
[0015] Figure 7 depicts a fifth display screen of an application for evaluating and/or aiding surgical performance, according to an example.
[0016] Figure 8 depicts a sixth display screen of an application for evaluating and/or aiding surgical performance, according to an example.

[0017] Figure 9 depicts a flowchart of a method of evaluating surgical performance for a surgical procedure, according to an example.
[0018] Figure 10 depicts a flowchart of a method of evaluating surgical performance for a surgical procedure, according to an example.
[0019] Figure 11 depicts a flowchart of a method of evaluating surgical performance for a surgical procedure, according to an example.
DETAILED DESCRIPTION
[0020] Disclosed embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all of the disclosed embodiments are shown. Indeed, several different embodiments may be described, and the disclosure should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are described so that this disclosure will be thorough and complete and will fully convey the scope of the disclosure to those skilled in the art.
[0021] As noted above, factors such as surgical skill and/or cognitive load on a surgeon can affect a patient’s postoperative outcome for a surgical procedure. The present disclosure provides for systems and methods that can help to evaluate surgical performance, aid surgical performance, and/or reduce a cognitive load on a surgeon during a surgical procedure.
[0022] It has been generally thought that increasing surgical case volume and years of practice are associated with improved surgical performance, in a procedure-specific manner. But while surgical expertise is often defined in terms of the number of surgical procedures performed, i.e., surgeon volume, this number is not always associated with the rate of complication, suggesting that it is not a sufficient measure of surgical skills. Experience alone may be insufficient to mitigate risks associated with surgery, and surgical performance cannot be maintained by passive accumulation of experience. Rather, focusing on performance improvement may be beneficial throughout a surgeon's career, such as through use of monitoring tools and structured behavioral modification programs.
[0023] The current standard for evaluating surgeons is peer review, either intraoperatively or post-operatively via video footage. Peer review is subject to bias, due to subjectivity and individual differences in the rating process (e.g., surgeons at times disagree about what constitutes “good” surgery). The present disclosure provides for quantifying surgical technique in a more consistent and objective manner based, at least in part, on information directly measured from one or more surgical devices used during surgical procedure(s). Determining and/or using surgical performance metrics according to the systems and methods of the present disclosure can reduce subjectivity and bias, and can provide objective feedback useful to individual surgeons, patients, and/or others (e.g., credentialing and licensure committees).
[0024] In additional or alternative aspects, the present disclosure further provides for systems that can use computer-based data analytics and/or machine-learning computer algorithms to provide for scalable evaluation of surgical technique and/or surgical performance. Additionally, within examples, the information determined by such computer algorithms can be used preoperatively, intraoperatively, and/or postoperatively to, among other things, (i) allow a surgeon to obtain personalized feedback on their surgical technique, (ii) train a surgeon or members of a surgical team to perform a surgical procedure, (iii) provide a patient with information related to the performance of a surgical procedure, (iv) plan a surgical procedure to be performed, (v) provide predictive analytics and recommendations on surgical technique for a surgical procedure to be performed (e.g., to reduce the occurrence of complications), (vi) provide feedback as to how a surgical instrument performed during a surgical procedure, (vii) provide a knowledge-sharing tool, and/or (viii) provide a memory aid tool to remind a surgeon of clinical choices for a particular surgical procedure.
[0025] As described above, the present disclosure additionally or alternatively provides for reducing a cognitive load on a surgeon during a surgical procedure. Current methods to assess cognitive load rely predominantly on surgeons self-reporting it after surgery (e.g., using the NASA-TLX tool or Surg-TLX). However, analyzing cognitive load retrospectively (i.e., not in real time) may lead to a failure to capture intra-operative fluctuations in cognitive load. In some examples, the systems and methods of the present disclosure can help to quantify cognitive load in real time and measure its impact at different phases of real surgeries. For instance, in some examples, the systems and methods of the present disclosure provide for measuring a surgeon’s levels of cognitive load based on information provided by one or more surgeon monitoring devices, which can sense one or more physiological conditions of the surgeon during the surgical procedure. The physiological conditions sensed by the surgeon monitoring device(s) can, in some instances, additionally provide an indication of a psychological condition of the surgeon (e.g., a mental state and/or an emotional state of the surgeon). As examples, the wearable devices can include one or more devices selected from among a smart ring, eye-tracking glasses, a strap-based sensor (e.g., a chest-strap sensor, a thigh-strap sensor, and a shank-strap sensor), a smartwatch, an immersive device (e.g., an augmented reality (AR) headset and/or an extended reality (XR) headset), and a flexible epidermal sensor (e.g., a wireless heart-rate patch monitor). In some implementations, the systems and methods can, based on the sensed physiological conditions of the surgeon, further provide intraoperative feedback to the surgeon to help reduce the level of cognitive load on the surgeon.
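Purely as a non-limiting illustration of how a physiological signal from such a wearable device could be mapped to a coarse real-time cognitive-load estimate, the following sketch computes RMSSD (a standard heart-rate-variability statistic, where lower variability is commonly associated with higher load) from beat-to-beat intervals. The function names and threshold values are illustrative assumptions, not part of the disclosed embodiments.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between heartbeats (ms).

    Lower RMSSD (reduced heart-rate variability) is commonly associated
    with higher cognitive load.
    """
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def load_level(rr_intervals_ms, low_ms=20.0, high_ms=50.0):
    """Map RMSSD onto a coarse load label (thresholds are placeholders)."""
    value = rmssd(rr_intervals_ms)
    if value < low_ms:
        return "high load"
    if value < high_ms:
        return "moderate load"
    return "low load"
```

In practice, a window of recent beat intervals from a heart-rate patch or smartwatch would be fed to such a function periodically during the procedure, with per-surgeon baselining replacing the fixed thresholds.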
[0026] Within examples, the systems and methods of the present disclosure can be applied in non-robotic surgery and/or robotic surgery. Additionally, within examples, the systems and methods of the present disclosure can be implemented in one or more surgical fields including, for example, neurosurgery, spinal surgery, endoscopy, orthopedics, and Ear, Nose, and Throat (ENT)/Otorhinolaryngology.
[0027] Referring now to Figure 1, a simplified block diagram of a system 100 for determining a plurality of surgical performance metrics is shown according to an example. As shown in Figure 1, the system 100 includes a controller 110 and one or more surgical devices 112 that are operated during one or more surgical procedures. In this example, the one or more surgical devices 112 include at least one surgical instrument 114 and at least one surgical navigation system 116. As described in further detail below, the surgical device(s) 112 can include additional or alternative devices in other examples.
[0028] In general, each of the surgical instrument(s) 114 is operable to perform a surgical task during a surgical procedure. As examples, the surgical instrument(s) 114 can include at least one instrument selected from among a group consisting of a drill, a bone cutter, an electrosurgical tool, an aspiration tool, an irrigation tool, a shaver, a microscope, a camera (e.g., an endoscope), a surgical retractor, and an illumination device. Additionally or alternatively, the surgical instrument(s) 114 can include one or more surgical instruments that can perform at least one surgical task selected from a group consisting of a drilling operation, a cutting operation, a shaving operation, a tissue retraction operation, a suctioning operation, an irrigation operation, a probing operation, a clamping operation, a coagulation operation, a heating operation, a cooling operation, an ablation operation, an electrical stimulation operation, an image capture operation, a sawing operation, and a grinding operation.
[0029] Within examples, one or more of the surgical instrument(s) 114 can include a working element that is operable to perform the at least one surgical task. For instance, the working element can include a drill bit, an electrosurgical electrode, an ablation end effector (e.g., a cryoablation balloon, an electrode, a laser light emitter, and/or a heating element), a fluid valve, a vacuum source, and a cutting blade. In some examples, the surgical instrument(s) 114 can include one or more user input devices that can be actuated to operate the working element of the surgical instrument(s) 114. For instance, the user input device(s) can include one or more devices selected from among a group consisting of: one or more buttons, one or more switches, one or more foot pedals, one or more touch screens, one or more dials, one or more triggers, one or more cranks, and one or more suction force control ports.
[0030] In some examples, the surgical instrument(s) 114 can include one or more handheld devices that can be gripped, manipulated, and moved by the surgeon during the surgical procedure. In other examples, the surgical instrument(s) 114 can include one or more stationary devices that remain in a fixed position relative to the patient (and/or an operating room) during the surgical procedure. In other examples, the surgical instrument(s) 114 can include both the handheld device(s) and the stationary device(s). For instance, in one example, the surgical instrument(s) 114 can include an electrosurgical pencil and an electrosurgical generator, where the electrosurgical pencil is held and moved by the surgeon while the electrosurgical generator remains in a fixed position during the surgical procedure.
[0031] In some examples, the surgical instrument(s) 114 can be entirely operated by the surgeon without robotic assistance. In other examples, the surgical instrument(s) 114 can include a partially automated robotic device that is operated by a surgeon, and/or a fully automated robotic device that performs the surgical procedure based on preoperative programming input to the surgical instrument(s) 114 by the surgeon.
[0032] As shown in Figure 1, the controller 110 can receive, from the surgical instrument(s) 114, instrument data related to an operation of the surgical instrument(s) 114 during the surgical procedure(s). The instrument data can be based on one or more instrument parameters determined by the surgical instrument(s) 114 at a plurality of points in time during each surgical procedure.
[0033] In some implementations, the instrument parameter(s) can be sensed by an instrument sensor 118 coupled to the surgical instrument(s) 114. As examples, the instrument sensor 118 can include one or more sensors selected from a group consisting of: a current sensor, a voltage sensor, an electrical power sensor, a flow sensor configured to detect a flow of a liquid, a flow sensor configured to detect a flow of a gas, a temperature sensor, an accelerometer, a piezo-electric sensor, a force sensor (e.g., a ground reaction force sensor), a vibration sensor, a chemical sensor, an optical sensor, a pressure sensor, a humidity sensor, a position sensor, a Hall-effect sensor, a capacitive sensor, and a Doppler flow sensor. In some examples, the instrument sensor 118 can be removably coupled to a housing of the surgical instrument(s) 114. In other examples, the instrument sensor 118 can be additionally or alternatively non-removably coupled to the housing of the surgical instrument(s) 114 (e.g., disposed within an interior cavity of the housing of the surgical instrument(s) 114).
[0034] In other implementations, the surgical instrument(s) 114 can additionally or alternatively determine the instrument parameter(s) separately from the instrument sensor 118. For instance, in some implementations, the surgical instrument(s) 114 can determine the instrument parameter(s) based on a setting and/or a mode of operation of the surgical instrument(s) 114. As an example, in an implementation in which the surgical instrument(s) 114 includes a bone drill, the instrument parameter(s) of a drilling speed and/or a torque can be determined based on a setting selected from among a plurality of settings on the surgical instrument(s) 114. As another example, in an implementation in which the surgical instrument(s) 114 include an electrosurgical pencil and an electrosurgical generator, the instrument parameter(s) of a power and a waveform of electrosurgical energy applied to tissue by the electrosurgical pencil can be determined based on a setting selected from among a plurality of settings on the electrosurgical generator.
[0035] In general, the surgical navigation system 116 is configured to determine a position of one or more of the surgical instrument(s) 114 relative to a patient anatomy during a surgical procedure. In some implementations, the surgical navigation system 116 can additionally or alternatively determine an orientation of the surgical instrument(s) 114 relative to the patient anatomy. As examples, the surgical navigation system 116 can be configured to determine the position data using at least one surgical navigation modality selected from a group consisting of: (i) electromagnetic surgical navigation, (ii) optical surgical navigation, (iii) ultrasound surgical navigation, and (iv) machine vision surgical navigation.
[0036] For instance, in an implementation in which the surgical navigation system 116 is configured to use electromagnetic surgical navigation, the surgical navigation system 116 can include an electromagnetic field generator and a position sensor. The electromagnetic field generator can be arranged to emit an electromagnetic field at the patient anatomy. The position sensor can include a current sensor (e.g., sensor coils) that can sense the electromagnetic field and responsively generate a position sensor signal based on one or more properties of the electromagnetic field at a given location of the position sensor. In this example, the position sensor can be coupled to the surgical instrument(s) 114 such that the position sensor signal generated by the position sensor is indicative of a position and/or an orientation of the surgical instrument(s) 114 relative to the patient anatomy.
[0037] In an implementation in which the surgical navigation system 116 is configured to use optical surgical navigation, the surgical navigation system 116 can include one or more cameras configured to track one or more fiducial markers coupled to the surgical instrument(s) 114 and/or the patient anatomy. The fiducial markers can include one or more passive markers (e.g., one or more markers that reflect light) and/or one or more active markers (e.g., one or more markers that emit light).
[0038] In an implementation in which the surgical navigation system 116 is configured to use ultrasound surgical navigation, the surgical navigation system 116 can include one or more ultrasound signal emitters and one or more ultrasound signal detectors coupled to the surgical instrument(s) 114 and/or the patient anatomy. The ultrasound signal emitter(s) can emit an ultrasound signal, the ultrasound signal detectors can detect the ultrasound signal emitted by the ultrasound signal emitter(s), and the surgical navigation system 116 can determine the position data based on the ultrasound signal emitted by the ultrasound signal emitter(s) and the ultrasound signal received by the ultrasound signal detector(s) (e.g., based on time(s) of flight between the ultrasound signal emitter(s) and the ultrasound signal detector(s)).
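The time-of-flight position determination described above can be sketched, under simplifying assumptions, as a trilateration from three emitters at known locations. The two-dimensional closed form below, the speed-of-sound constant, and all function names are illustrative assumptions only, not a definitive implementation of the disclosed system.

```python
SPEED_OF_SOUND = 343.0  # m/s in air; soft tissue would use roughly 1540 m/s

def trilaterate_2d(emitters, tofs, c=SPEED_OF_SOUND):
    """Locate a detector from times of flight to three known emitters (2-D sketch).

    Each time of flight is converted to a distance (d = c * t); subtracting
    the three circle equations pairwise yields two linear equations in (x, y).
    """
    (x1, y1), (x2, y2), (x3, y3) = emitters
    d1, d2, d3 = (c * t for t in tofs)
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # nonzero when the emitters are not collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

A three-dimensional system would use at least four emitters and a least-squares solve, and would typically account for measurement noise rather than intersecting ideal circles.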
[0039] In an implementation in which the surgical navigation system 116 is configured to use machine vision surgical navigation, the surgical navigation system 116 can include one or more light sources and/or one or more cameras. The light source(s) can illuminate the patient anatomy and/or the surgical instrument(s) 114. The camera(s) can capture one or more images of the patient anatomy and/or the surgical instrument(s) 114 at a surgical site during the surgical procedure. The surgical navigation system 116 can process the image(s) to determine the position data indicating the position of the surgical instrument(s) 114 relative to the patient anatomy.
[0040] In some examples, the surgical navigation system 116 can include one or more position sensors 120 that can be coupled to the surgical instrument(s) 114 and/or the patient anatomy. The position sensor(s) 120 can be detectable by one or more components of the surgical navigation system 116 (e.g., via electromagnetic sensing, optical sensing, and/or ultrasonic sensing), and the surgical navigation system 116 can determine the position data based on the detected position sensor(s) 120. In other examples, the surgical navigation system 116 can omit the position sensor(s) 120.
[0041] In some examples that include the position sensor(s) 120, the surgical navigation system 116 can include a registration system that is configured to establish a frame of reference for the patient anatomy and the position sensor(s) 120 (and, thus, the position of the surgical instrument(s) 114 indicated by the position sensor(s) 120). For instance, in one implementation, the position sensor(s) 120 can be traced along features of the patient anatomy to establish the frame of reference. In another implementation, for instance, the surgical navigation system 116 can include one or more touch points on the patient anatomy. At each touch point, the surgical navigation system 116 can register the touch point in space and, using the registered touch points, determine the frame of reference for the patient anatomy in space (e.g., using a three-dimensional coordinate system). In this way, the position sensor(s) 120 and the patient anatomy can be mapped to a common frame of reference such that a position of the surgical instrument(s) 114 sensed by the position sensor(s) 120 can be correlated (e.g., mapped in space) to the patient anatomy.
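One minimal sketch of such a touch-point registration is to build an orthonormal patient-anatomy frame from three registered touch points and then express tracker-space positions in that frame. The frame convention (origin at the first touch point, x-axis toward the second, third point fixing the plane) and all function names are illustrative assumptions; a clinical system would use many points and a least-squares rigid registration.

```python
import math

def _sub(a, b): return tuple(x - y for x, y in zip(a, b))
def _dot(a, b): return sum(x * y for x, y in zip(a, b))
def _cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
def _norm(a):
    m = math.sqrt(_dot(a, a))
    return tuple(x / m for x in a)

def anatomy_frame(p0, p1, p2):
    """Build an orthonormal frame from three registered touch points.

    p0 is the origin; the x-axis points toward p1; p2 fixes the plane.
    """
    x = _norm(_sub(p1, p0))
    z = _norm(_cross(x, _sub(p2, p0)))
    y = _cross(z, x)  # completes the right-handed basis
    return p0, (x, y, z)

def to_anatomy_coords(point, frame):
    """Express a tracker-space point in the patient-anatomy frame."""
    origin, (x, y, z) = frame
    v = _sub(point, origin)
    return (_dot(v, x), _dot(v, y), _dot(v, z))
```

Once the frame is established, every position sample from the position sensor(s) can be passed through `to_anatomy_coords` so that instrument motion is reported relative to the patient anatomy rather than the tracker.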
[0042] In some examples, the surgical navigation system 116 can be an image-guided surgery system that is configured to correlate in real-time a sensed position of the surgical instrument(s) 114 and one or more images of the patient anatomy (e.g., preoperative image(s) of the patient anatomy obtained prior to the surgical procedure). As examples, the image(s) can be at least one image type selected from a group consisting of a computerized tomography (CT) scan, a magnetic resonance imaging (MRI) scan, and a three-dimensional map. In some implementations in which the surgical navigation system 116 is an image-guided surgery system, the surgical navigation system 116 can be configured to provide the controller 110 with the position data and image data relating to the patient anatomy. In such implementations, the image data and the position data can both be related to a common frame of reference with respect to the patient anatomy. In other implementations, the surgical navigation system 116 can provide the position data to the controller 110 without providing any image data.
[0043] As described above, the surgical instrument(s) 114 can provide the instrument data to the controller 110 and the surgical navigation system 116 can provide the position data to the controller 110. In some examples, the surgical instrument(s) 114 and/or the surgical navigation system(s) 116 can be communicatively connected with the controller 110 via a network. Examples of the network can include one or more of the following: a direct or indirect physical communication connection, mobile communication network, Internet, intranet, Local Area Network, Wide Area Network, Storage Area Network, and any other form of connecting two or more systems, components, or storage devices together.
[0044] The controller 110 is a computing device that is configured to receive data (e.g., the instrument data and/or the position data) from the surgical devices 112 operated during one or more surgical procedures, and determine, based on the data, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance. The controller 110 can be implemented using hardware, software, and/or firmware. For example, the controller 110 can include one or more processors 122 and a non-transitory computer-readable medium 124 (e.g., volatile and/or non-volatile memory) that stores machine language instructions or other executable instructions. The instructions, when executed by the one or more processors 122, cause the system 100 to carry out various operations described herein. The controller 110, thus, can receive data (including data indicated by the surgical instrument(s) 114 and/or the surgical navigation system(s) 116) and store the data in memory as well.
[0045] The processor(s) 122 and/or the non-transitory computer-readable medium 124 can be implemented in any number of physical devices/machines. For example, the controller 110 can include one or more shared or dedicated general purpose computer systems/servers. Accordingly, principles and advantages of distributed processing, such as redundancy, replication, and the like, also can be implemented, as desired, to increase the robustness and performance of the devices and systems of the controller 110.
[0046] The physical devices/machines can be implemented by the preparation of integrated circuits or by interconnecting an appropriate network of conventional component circuits, as is appreciated by those skilled in the electrical art(s). The physical devices/machines, for example, may include field programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), digital signal processors (DSPs), etc. The physical devices/machines may reside on a wired or wireless network, e.g., LAN, WAN, Internet, cloud, near-field communications, etc., to communicate with each other and/or other systems, e.g., Internet/web resources.
[0047] As described above, the controller 110 can receive data from the surgical devices 112 operated during one or more surgical procedures, and determine, based on the data, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance. In an example, the non-transitory computer-readable medium 124 has stored therein instructions that are executable to cause the processor(s) 122 to perform functions including: determining a plurality of procedure data sets for a plurality of surgical procedures, and determining, based on the plurality of procedure data sets, the plurality of surgical performance metrics that are indicative of the characteristic of surgical performance.
[0048] In this example, determining the plurality of procedure data sets can include, for each surgical procedure of the plurality of surgical procedures, determining a respective procedure data set of the plurality of procedure data sets by (i) receiving, from the surgical instrument(s) 114, the instrument data related to an operation of the surgical instrument(s) 114 during the surgical procedure, where the instrument data is based on the one or more instrument parameters determined by the surgical instrument(s) 114 at a plurality of points in time during the surgical procedure, (ii) receiving, from the surgical navigation system(s) 116, the position data that is indicative of the position of the surgical instrument(s) 114 relative to the patient anatomy at the plurality of points in time during the surgical procedure, (iii) determining, based on at least one of the position data or the instrument data, kinematic data at the plurality of points in time, and (iv) correlating, for each of the plurality of points in time, the instrument data, the position data, and the kinematic data to determine the respective procedure data set for the surgical procedure.
[0049] As examples, the kinematic data can include data for one or more kinematic parameters selected from a group consisting of: (i) a trajectory of the surgical instrument(s) 114, (ii) a velocity of the surgical instrument(s) 114, (iii) a motion of the surgical instrument(s) 114 in three-dimensional space, (iv) an inertia of the surgical instrument(s) 114, (v) an acceleration of the surgical instrument(s) 114, (vi) a chatter of the surgical instrument(s) 114 (e.g., a movement due to an interaction between the surgical instrument(s) 114 and the patient anatomy such as, for instance, the surgical instrument(s) 114 bouncing off of a bone of the patient), (vii) a jitter of the surgical instrument(s) 114 (e.g., due to an unsteadiness of a hand of the surgeon), (viii) a smoothness of movement of the surgical instrument(s) 114 (e.g., a jerkiness of the movement of the surgical instrument(s) 114 due to, for instance, relatively quick starts and stops and/or relatively quick changes in direction of movement), (ix) an application force applied by the surgical instrument(s) 114 to a patient’s anatomy at a surgical site, (x) a movement deviation relative to a preoperative planned route for the surgical instrument(s) 114, and (xi) a site of operation of the surgical instrument(s) 114 relative to a preoperative planned target site. As such, the kinematic data can include information relating to intentional movements of the surgical instrument(s) 114 by the surgeon relative to the patient anatomy, unintentional movements of the surgical instrument(s) 114 by the surgeon relative to the patient anatomy, and/or movements of the surgical instrument(s) 114 due to interactions with the patient anatomy.
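As one non-limiting sketch of how several of these kinematic parameters could be derived from timestamped position samples, the following computes velocity, acceleration, and jerk by finite differences, with mean squared jerk serving as one common smoothness proxy. The one-dimensional simplification, the alignment of derivative samples to timestamps, and the function names are illustrative assumptions.

```python
def derivative(samples, times):
    """First derivative by finite differences over timestamped samples."""
    return [(b - a) / (tb - ta)
            for a, b, ta, tb in zip(samples, samples[1:], times, times[1:])]

def kinematics_1d(positions, times):
    """Derive velocity, acceleration, and jerk from 1-D position samples.

    Each differencing pass shortens the series by one; derived samples are
    crudely aligned to the later timestamp of each interval. Mean squared
    jerk is used here as an illustrative proxy for (lack of) smoothness.
    """
    velocity = derivative(positions, times)
    acceleration = derivative(velocity, times[1:])
    jerk = derivative(acceleration, times[2:])
    mean_sq_jerk = sum(j * j for j in jerk) / len(jerk) if jerk else 0.0
    return velocity, acceleration, jerk, mean_sq_jerk
```

A production system would operate on three-dimensional position vectors and would filter the position stream before differentiating, since finite differences amplify tracker noise.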
[0050] As described above, the processor 122 can correlate the instrument data, the position data, and the kinematic data with each other at the plurality of points in time. In one example, the processor 122 can correlate the instrument data, the position data, and the kinematic data with each other based on timing information (e.g., timestamp information) provided by the surgical instrument(s) 114 and the surgical navigation system(s) 116. This time-wise synchronization of the instrument data, the position data, and the kinematic data can allow each procedure data set to represent a more complete picture of how the surgeon used the surgical instrument(s) 114 and/or how the surgical instrument(s) 114 themselves performed during the surgical procedure as compared to considering such data in isolation. Additionally, because the underlying data of the procedure data sets is obtained from the surgical device(s) 112 that are used to perform the surgical procedures, the processor 122 can determine the surgical performance metrics based on objective data, which can provide a more consistent and superior basis for evaluating and characterizing surgical performance as compared to prior approaches (e.g., based on subjective peer review of video footage).
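The timestamp-based correlation described above could, for instance, be approximated by a nearest-timestamp join of the instrument and position streams, dropping samples that have no counterpart within a tolerance. The tolerance value, the (timestamp, value) layout, and the function name are illustrative assumptions.

```python
import bisect

def correlate_streams(instrument, position, tolerance_s=0.05):
    """Join instrument and navigation samples whose timestamps nearly match.

    Each stream is a list of (timestamp, value) pairs sorted by timestamp.
    Returns (timestamp, instrument_value, position_value) triples for every
    instrument sample with a position sample within the tolerance.
    """
    pos_times = [t for t, _ in position]
    merged = []
    for t, inst_value in instrument:
        i = bisect.bisect_left(pos_times, t)
        # Consider the neighbors on either side of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(position)]
        j = min(candidates, key=lambda k: abs(pos_times[k] - t))
        if abs(pos_times[j] - t) <= tolerance_s:
            merged.append((t, inst_value, position[j][1]))
    return merged
```

The merged triples form one row of a procedure data set per point in time; kinematic values derived from the position stream could be joined in the same way.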
[0051] In some examples, the processor 122 can determine the surgical performance metrics using descriptive analytics, diagnostic analytics, predictive analytics, and/or prescriptive analytics to analyze the procedure data sets. The processor 122 can, for example, analyze the sets of procedure data to identify patterns in surgical technique (e.g., indicated by the instrument data, the position data, and the kinematic data) that are predictors for clinical outcomes. In some examples, the processor 122 can additionally or alternatively determine the surgical performance metrics by using the procedure data sets as training data for a machine learning algorithm. As the surgical performance metrics are determined based, at least in part, on procedure data sets from a plurality of surgical procedures, the processor 122 can analyze the procedure data sets from multiple surgical performances to obtain insights that may be useful for future surgical procedures. Indeed, as described in further detail below, the surgical performance metrics can provide a basis for preoperatively planning a future surgical procedure, providing intraoperative feedback to a surgeon during a surgical procedure, and/or providing postoperative feedback to a surgeon and/or a patient on the performance of a surgical procedure.
[0052] In some examples, determining the surgical performance metrics can include (i) detecting, based on the procedure data sets, an occurrence of a surgical event during one or more surgical procedures of the plurality of surgical procedures, (ii) identifying one or more portions of the procedure data sets that are indicative of a cause of the occurrence of the surgical event, and (iii) determining the surgical performance metrics based on the one or more portions of the procedure data sets identified being indicative of the cause of the occurrence of the surgical event. As examples, the surgical event can be at least one event selected from a group consisting of: (i) chatter of the surgical instrument(s) 114, (ii) a wrap event (e.g., a wrapping of gauze and/or tissue around a rotating element of the surgical instrument(s) 114), (iii) overheating of the surgical instrument(s) 114, and (iv) proximity of the surgical instrument(s) 114 to critical anatomical structures. In such examples, the surgical performance metrics can provide information that can help to better understand characteristics of surgical performance that may increase and/or reduce a risk of the occurrence of the surgical event. This information can help to preoperatively plan a future surgical procedure, provide intraoperative feedback to a surgeon during a surgical procedure, and/or provide postoperative information to provide feedback on the performance of a surgical procedure.
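As a non-limiting sketch of detecting one such surgical event, the following flags chatter by counting rapid sign reversals of a vibration signal within a sliding window. Using zero-crossing density as a chatter signature, along with the window length and reversal threshold, is an illustrative assumption; a real detector would more likely operate on the signal's frequency content.

```python
def detect_chatter(vibration, window=8, min_reversals=5):
    """Flag windows of a vibration signal with rapid sign reversals.

    Returns the start index of every window whose number of zero
    crossings meets the threshold (thresholds are placeholders).
    """
    events = []
    for start in range(0, len(vibration) - window + 1):
        chunk = vibration[start:start + window]
        reversals = sum(1 for a, b in zip(chunk, chunk[1:]) if a * b < 0)
        if reversals >= min_reversals:
            events.append(start)
    return events
```

Once an event window is flagged, the correlated procedure data set rows for those timestamps (instrument settings, position, kinematics) are the portions examined as candidate causes.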
[0053] As noted above, the surgical performance metrics are indicative of a characteristic of surgical performance. In one example, the surgical performance metrics can include one or more threshold values defining a range of expected values for at least one of (i) one or more instrument parameters of the instrument data or (ii) one or more kinematic parameters of the kinematic data. In some implementations, the threshold value(s) can be communicated to a surgeon preoperatively to provide guidance for performing a future surgical procedure, intraoperatively to provide real-time feedback to the surgeon during a surgical procedure, and/or postoperatively to provide feedback to a surgeon as to when and/or by how much the surgeon exceeded and/or deviated from the threshold value(s) during a surgical procedure. Additional uses of surgical performance metrics including threshold value(s) are described in further detail below.
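The threshold values described above can be sketched as a simple expected band derived from historical procedure data, with deviations reported for feedback. The margin-based band and the 10% margin are assumptions for illustration; the text does not fix how the range is computed.

```python
def expected_range(historical_values, margin=0.1):
    """Derive an expected range for an instrument or kinematic parameter
    from historical procedure data. The 10% margin is an assumption."""
    lo, hi = min(historical_values), max(historical_values)
    span = hi - lo
    return (lo - margin * span, hi + margin * span)

def deviations(series, value_range):
    """Return the time indices at which a reading leaves the range,
    e.g., to drive intraoperative or postoperative feedback."""
    lo, hi = value_range
    return [t for t, v in enumerate(series) if not (lo <= v <= hi)]

# Hypothetical historical drill-speed values (RPM) from prior procedures.
rpm_range = expected_range([900, 950, 1000, 1100])
out_of_range = deviations([950, 1000, 1600, 980], rpm_range)
```

Postoperatively, the indices in `out_of_range` would indicate when, and the exceeded values by how much, the surgeon deviated from the threshold value(s).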
[0054] In another example, the surgical performance metrics can additionally or alternatively define a scoring system for evaluating the instrument data, the position data, and the kinematic data of at least one surgical procedure selected from among the plurality of surgical procedures. For instance, in one implementation, the surgical performance metrics can define data for comparison to instrument data, position data, and/or kinematic data for a surgical procedure to be scored. The surgical performance metrics can further define one or more scores that can be based on the comparison between the surgical performance metrics and the instrument data, the position data, and/or the kinematic data for the surgical procedure to be scored.
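The scoring system described above can be sketched as a comparison of a procedure's parameters against reference values. Scoring as 100 minus the mean relative deviation is an illustrative choice, not the method claimed; the parameter names are hypothetical.

```python
def score_procedure(procedure, reference):
    """Score a procedure's instrument/kinematic parameters against
    reference values (the surgical performance metrics). The formula
    is an assumed, illustrative one."""
    penalties = [
        abs(procedure[name] - ref_value) / ref_value
        for name, ref_value in reference.items()
    ]
    return max(0.0, 100.0 * (1.0 - sum(penalties) / len(penalties)))

# Hypothetical reference values for comparison.
reference_metrics = {"mean_rpm": 1000.0, "mean_speed_mm_s": 5.0}
close_match = score_procedure(
    {"mean_rpm": 1050.0, "mean_speed_mm_s": 5.5}, reference_metrics)
poor_match = score_procedure(
    {"mean_rpm": 2000.0, "mean_speed_mm_s": 10.0}, reference_metrics)
```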
[0055] In another example, the surgical performance metrics can additionally or alternatively define a classification system for classifying a plurality of types of surgical technique. For instance, the plurality of types of surgical technique can include two or more types selected from among a group consisting of: (i) an aggressive approach to surgical performance, (ii) a conservative approach to surgical performance, (iii) a smoother approach to surgical performance, (iv) a jerky approach to surgical performance, (v) a relatively fast approach to surgical performance (e.g., a shorter time for performing a surgical procedure), and (vi) a relatively slow approach to surgical performance.
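A minimal sketch of such a classification system follows, distinguishing only the "smoother" and "jerky" types from the magnitude of jerk (the rate of change of acceleration). The single-feature rule and the threshold value are assumptions; the text leaves the classification system open, and a full implementation would cover the other listed types as well.

```python
def classify_technique(jerk_series, jerk_threshold=2.0):
    """Classify a technique trace as 'smoother' or 'jerky' from mean
    absolute jerk. The threshold is an assumed value."""
    mean_abs_jerk = sum(abs(j) for j in jerk_series) / len(jerk_series)
    return "jerky" if mean_abs_jerk > jerk_threshold else "smoother"

smooth_label = classify_technique([0.1, -0.2, 0.15, -0.1])
jerky_label = classify_technique([3.0, -4.0, 5.0, -3.5])
```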
[0056] In another example, the surgical performance metrics can additionally or alternatively provide for determining a plurality of reference fingerprints. For instance, the controller 110 can use one or more dimension reduction techniques to identify, extract, and summarize characteristics of surgical technique into a unique identifier. In an example, the controller 110 can further be configured to provide recommendations and/or guidance related to surgical technique, settings for the surgical instrument(s) 114, selection of a subset of the surgical instrument(s) 114 from among a plurality of surgical instruments 114, and/or selection of a working element for the surgical instrument(s) 114 from among a plurality of working elements based on the reference fingerprints. For instance, the controller 110 can be configured to determine, based on one or more surgical procedures performed by a specific surgeon, a query fingerprint for the surgeon, make a comparison of the query fingerprint to the reference fingerprints, identify the reference fingerprint that most closely matches the query fingerprint, and provide information associated with the identified reference fingerprint to the surgeon preoperatively and/or intraoperatively to facilitate a surgical procedure.
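The fingerprint matching described above can be sketched as follows. Averaging feature rows is a deliberately simple stand-in for the dimension reduction techniques mentioned in the text, and the reference labels and feature choices (mean speed, mean jerk) are hypothetical.

```python
import math

def fingerprint(feature_rows):
    """Summarize per-procedure feature rows into one compact identifier
    by averaging each feature: a simple stand-in for dimension reduction."""
    n = len(feature_rows)
    return tuple(
        sum(row[i] for row in feature_rows) / n
        for i in range(len(feature_rows[0]))
    )

def closest_reference(query, references):
    """Return the label of the reference fingerprint nearest (by
    Euclidean distance) to the query fingerprint."""
    return min(references, key=lambda label: math.dist(query, references[label]))

# Hypothetical reference fingerprints: (mean speed, mean jerk).
references = {
    "conservative": (2.0, 0.2),
    "aggressive": (8.0, 1.5),
}
# Query fingerprint from one surgeon's recent cases (illustrative values).
query = fingerprint([(2.2, 0.25), (1.9, 0.15)])
matched = closest_reference(query, references)
```

Information associated with `matched` (e.g., recommended instrument settings) could then be provided to the surgeon preoperatively and/or intraoperatively.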
[0057] In some implementations, the surgical instrument(s) 114 and the surgical navigation system(s) 116 can be a single surgical instrument 114 and a single surgical navigation system 116 used during all of the surgical procedures. For instance, the surgical procedures can be performed using re-usable surgical instrument(s) 114 and re-usable surgical navigation system(s) 116 at a single location (e.g., in a single operating room). In other implementations, different surgical instruments 114 and/or different surgical navigation systems 116 can be used during the surgical procedures. For instance, the instrument data can be obtained from surgical instruments 114 and/or the position data can be obtained from surgical navigation systems 116 at a plurality of different locations. Additionally or alternatively, for instance, the surgical instruments 114 and/or the surgical navigation systems 116 can be disposable devices that are intended to be discarded after use during a single surgical procedure.
[0058] In the examples described above, the controller 110 can determine the surgical performance metrics based on the instrument data and the position data. In other examples, the controller 110 can evaluate and/or aid surgical performance based on additional or alternative sources of information.
[0059] Figure 2 shows a simplified schematic diagram of a system 200, including one or more additional surgical devices 112 and/or one or more data sources 226, according to another example. As shown in Figure 2, the system 200 includes the controller 110 and the surgical device(s) 112, as described above with respect to Figure 1. In Figure 2, the controller 110 includes the processor(s) 122 and the non-transitory computer-readable medium 124, and the surgical device(s) 112 can include the surgical instrument(s) 114 and the surgical navigation system(s) 116, as described above with respect to Figure 1.
[0060] Additionally, as shown in Figure 2, the data source(s) 226 can include one or more data sources selected from among a group consisting of: an outcome data source 228, a surgeon historical data source 230, and a patient-specific data source 232. The data source(s) 226 can each be communicatively connected to the controller 110 (e.g., via a network as described above with respect to the surgical instrument(s) 114, the surgical navigation system(s) 116, and the controller 110). The processor 122 can use the data provided by the data source(s) 226 in connection with the procedure data sets to determine the surgical performance metrics.
[0061] In examples that include the outcome data source 228, the outcome data source 228 can store outcome data relative to postoperative outcomes of the surgical procedures. In such examples, the processor 122 can additionally or alternatively receive, for each surgical procedure, respective outcome data relating to a postoperative outcome of the surgical procedure, and the processor 122 can determine the plurality of surgical performance metrics further based on the respective outcome data. As examples, the outcome data can include an indication of at least one postoperative outcome selected from among a group consisting of: (i) patient-reported pain scores, (ii) hospital stay duration, (iii) postoperative complications, (iv) restitution of function, (v) alleviation of presenting symptoms, and (vi) mortality. Determining the surgical performance metrics based on the sets of procedure data and the outcome data associated with each set of procedure data can help to identify aspects of instrument data, position data, and kinematic data (which can be indicative of surgical technique) that may lead to positive surgical outcomes and/or negative surgical outcomes. This in turn can help to determine surgical performance metrics that can provide actionable insights to surgeons.
[0062] In examples that include the surgeon historical data source 230, the surgeon historical data source 230 can store historical surgeon data. In such examples, the processor 122 can additionally or alternatively receive, for each surgical procedure of the plurality of surgical procedures, the historical surgeon data relating to one or more surgical procedures performed in the past by a surgeon performing the surgical procedure, and the processor 122 can determine the surgical performance metrics further based on the historical surgeon data. As examples, the historical surgeon data can include at least one item of information selected from among a group consisting of: a quantity of surgical procedures performed in the past, a type of surgical procedures performed in the past, years of experience, a number of hours of surgery performed by the surgeon, certifications of the surgeon, and an average time for completing surgical procedures. Determining the surgical performance metrics based on the sets of procedure data and the historical surgeon data associated with each set of procedure data can also help to identify aspects of instrument data, position data, and kinematic data that may lead to positive surgical outcomes and/or negative surgical outcomes for surgical procedures performed by surgeons having certain levels of experience and/or surgical tendencies. As described in further detail below, the historical surgeon data can additionally or alternatively be used to determine the surgical performance metrics based on preoperative information provided for a future surgical procedure to be performed by a specific surgeon.
[0063] In examples that include the patient-specific data source 232, the patient-specific data source 232 can store patient-specific data relating to one or more health records for the patients of the surgical procedures. In such examples, the processor 122 can additionally or alternatively receive, for each surgical procedure of the plurality of surgical procedures, the patient-specific data relating to one or more health records for a patient of the surgical procedure, and the processor 122 can determine the surgical performance metrics further based on the patient-specific data. As examples, the patient-specific data can include at least one item of information selected from among a group consisting of: an age, a sex, a height, a weight, a bone density, a body mass index, an allergy indication, a medical history of the patient, demographics, family health history, lab and test results, medications, prior diagnoses, progress notes, medical images (e.g., radiology images, CT images, and/or MRI images), immunizations, patient reported outcome measures (PROMs), and information relating to the nature of the medical condition of the patient (e.g., characteristics of cell tissue and/or bone). Determining the surgical performance metrics based on the sets of procedure data and the patient-specific data associated with each set of procedure data can also help to identify aspects of instrument data, position data, and kinematic data that may lead to positive surgical outcomes and/or negative surgical outcomes for patients having certain medical histories and/or health conditions. As described in further detail below, the patient-specific data can additionally or alternatively be used to determine the surgical performance metrics based on preoperative information provided for a future surgical procedure to be performed on a specific patient.
[0064] As described above, in some examples, the processor(s) 122 can receive image data relating to the patient anatomy from the surgical navigation system 116. In other examples, the processor(s) 122 can additionally or alternatively receive the image data from the patient-specific data source 232. The image data received from the patient-specific data source 232 and the position data received from the surgical navigation system can both relate to a common frame of reference with respect to the patient anatomy. For example, the image data received from the patient-specific data source can be generated using a registration system that is separate from the surgical navigation system 116.
[0065] As shown in Figure 2, the surgical device(s) 112 can additionally or alternatively include one or more patient monitoring device(s) 234 communicatively connected to the controller 110. The patient monitoring device(s) 234 can determine patient physiological data related to a physiological condition of the patients at the plurality of points of time during the surgical procedures. As examples, the patient physiological data can relate to at least one physiological parameter selected from a group consisting of: (i) a heart rate of the patient, (ii) a respiratory rate of the patient, (iii) a body temperature of the patient, (iv) a blood pressure of the patient, and (v) an oxygen saturation of the patient. As shown in Figure 2, the patient monitoring device(s) 234 can include one or more patient sensors 236 that are configured to sense the physiological parameter.
[0066] In examples that include the one or more patient monitoring device(s) 234, the processor 122 can receive, for each surgical procedure of the plurality of surgical procedures, the patient physiological data related to the physiological condition of the patient at the plurality of points in time during the surgical procedure. Additionally, for each surgical procedure of the plurality of surgical procedures, the processor 122 can determine the respective procedure data set by correlating, for each of the plurality of points in time, the instrument data, the position data, the kinematic data, and the patient physiological data to determine the respective procedure data set for the surgical procedure.
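The correlation of separate data streams into a per-time-point procedure data set, described in the paragraphs above, can be sketched as follows. The stream contents (RPM, a one-axis position, heart rate) are illustrative, and deriving only velocity from consecutive positions is a simplification of the kinematic data.

```python
def build_procedure_data_set(timestamps, instrument, position, physiological):
    """Correlate separate data streams by time point, producing one
    record per timestamp. Velocity is derived from consecutive positions."""
    records = []
    for i, t in enumerate(timestamps):
        velocity = None
        if i > 0:
            dt = timestamps[i] - timestamps[i - 1]
            velocity = (position[i] - position[i - 1]) / dt
        records.append({
            "t": t,
            "instrument": instrument[i],
            "position": position[i],
            "velocity": velocity,
            "patient": physiological[i],
        })
    return records

data_set = build_procedure_data_set(
    timestamps=[0.0, 0.5, 1.0],
    instrument=[{"rpm": 1000}, {"rpm": 1010}, {"rpm": 990}],
    position=[0.0, 1.0, 3.0],          # mm along one axis, for simplicity
    physiological=[{"hr": 70}, {"hr": 71}, {"hr": 72}],
)
```

Surgeon physiological data, where available, would be merged into each record the same way.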
[0067] As shown in Figure 2, the surgical device(s) 112 can additionally or alternatively include one or more surgeon monitoring device(s) 238 communicatively connected to the controller 110. The surgeon monitoring device(s) 238 can determine surgeon physiological data related to a physiological condition of the surgeon at the plurality of points of time during the surgical procedures. As examples, the surgeon physiological data can relate to at least one physiological parameter selected from a group consisting of: (i) a heart rate of the surgeon, (ii) a respiratory rate of the surgeon, (iii) a body temperature of the surgeon, (iv) an eye blink rate of the surgeon, (v) a measure of pupil dilation of the surgeon, (vi) a measure of eye fixation of the surgeon, (vii) a measure of saccade by the surgeon, and (viii) a measure of body movement by the surgeon. As shown in Figure 2, the surgeon monitoring device(s) 238 can include one or more surgeon sensors 240 that are configured to sense the physiological parameter. As examples, the surgeon monitoring device(s) 238 can include a wearable sensor in contact with the skin of the surgeon (e.g., an Oura ring and/or a strap-based sensor) and/or a wearable sensor that can monitor the eyes of the surgeon (e.g., a pair of glasses having eye tracking features).
[0068] In examples that include the one or more surgeon monitoring device(s) 238, the processor 122 can receive, for each surgical procedure of the plurality of surgical procedures, the surgeon physiological data related to the physiological condition of the surgeon at the plurality of points in time during the surgical procedure. Additionally, for each surgical procedure of the plurality of surgical procedures, the processor 122 can determine the respective procedure data set by correlating, for each of the plurality of points in time, the instrument data, the position data, the kinematic data, and the surgeon physiological data to determine the respective procedure data set for the surgical procedure. By determining the surgical performance metrics based on procedure data sets that include the surgeon physiological data, the surgical performance metrics can provide insights into how the surgeon physiological data can affect patient outcomes.
[0069] The surgeon physiological data can additionally or alternatively provide an indication as to a cognitive load on the surgeon performing the surgical procedure. As described in further detail below, the processor 122 can use the surgeon physiological data to determine when a cognitive load of a surgeon is high and responsively provide feedback to help reduce the cognitive load.
[0070] In other examples, one or more of the surgeon monitoring device(s) 238 can be coupled to the surgeon and at least one of the surgeon monitoring device(s) 238 can be coupled to one or more other members of a surgical team. In such examples, the surgeon monitoring device(s) 238 can determine the surgeon physiological data related to the physiological condition of the surgeon and the other members of the surgical team at the plurality of points of time during the surgical procedures as described above.
[0071] Additionally, as described above, the physiological conditions sensed by the surgeon monitoring device(s) can, in some instances, additionally provide an indication of a psychological condition of the surgeon. Accordingly, within examples, determining the surgical performance metrics based on procedure data sets that include the surgeon physiological data can also provide insights into how a mental or emotional state of a surgeon and/or other members of a surgical team can affect patient outcomes.
[0072] As shown in Figure 2, the system 200 can additionally or alternatively include a user interface 242 that can receive one or more inputs from a user and/or provide one or more outputs to the user. As examples, the user interface 242 can include one or more buttons, one or more switches, one or more dials, one or more keypads, one or more touchscreens, one or more display devices 244, one or more indicator lights, one or more speakers, and/or one or more haptic output devices. The user interface 242 is communicatively connected with the controller 110.
[0073] As noted above, the controller 110 can determine the surgical performance metrics to provide preoperative, intraoperative, and/or postoperative information to a surgeon and/or a patient in connection with one or more surgical procedures. Example implementations for providing preoperative, intraoperative, and/or postoperative information to a surgeon and/or a patient for a surgical procedure are described below.
[0074] In one example, the processor 122 can receive preoperative information for a surgical procedure to be performed. The processor 122 can receive the preoperative information from, for example, the user interface 242 and/or the data sources 226. The preoperative information can include the patient-specific data for the patient of the surgical procedure to be performed and/or the historical surgeon data for the surgeon of the surgical procedure to be performed. The preoperative information can additionally or alternatively include surgeon preference data 246 relating to one or more preferences of the surgeon that will perform the surgical procedure. As examples, the surgeon preference data 246 can include one or more preferences selected from among a group consisting of: (i) a preference relating to a type of the surgical instrument(s) 114, (ii) a preference relating to a setting of the surgical instrument(s) 114, (iii) a preference relating to a manner of holding or gripping the surgical instrument(s) 114, and (iv) a preference for an order of steps for performing the surgical procedure.
[0075] The processor 122 can determine, using the preoperative information, a plurality of surgical performance metrics. For example, the processor 122 can be configured to use the preoperative information and the procedure data sets as inputs to determine the surgical performance metrics. In this way, the surgical performance metrics can be tailored to the specific conditions of the surgical procedure to be performed as compared to implementations that do not use the preoperative information as an input.
[0076] The processor 122 can receive, from the surgical instrument(s) 114, the instrument data related to the operation of the surgical instrument(s) 114 at a plurality of points in time during the surgical procedure. The processor 122 can also receive, from the surgical navigation system(s) 116, the position data that is indicative of a position of the surgical instrument(s) 114 relative to an anatomy of a patient at the plurality of points in time during the surgical procedure. The processor 122 can determine, based on the position data, the kinematic data at the plurality of points in time. The processor 122 can then perform an analysis of (i) the kinematic data and the instrument data relative to (ii) the plurality of surgical performance metrics.
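The derivation of kinematic data from tracked positions, mentioned above, can be realized by finite differences. This is one illustrative method; the coordinate units (mm, with positions as (x, y, z) tuples) and uniform sampling are assumptions.

```python
def kinematics_from_positions(times, positions):
    """Derive speeds and accelerations from tracked instrument positions
    by finite differences. Positions are (x, y, z) tuples."""
    speeds = []
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        dx = [p1 - p0 for p0, p1 in zip(positions[i - 1], positions[i])]
        speeds.append((sum(d * d for d in dx) ** 0.5) / dt)
    accelerations = [
        (speeds[i] - speeds[i - 1]) / (times[i + 1] - times[i])
        for i in range(1, len(speeds))
    ]
    return speeds, accelerations

t = [0.0, 1.0, 2.0, 3.0]
p = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (3.0, 0.0, 0.0), (6.0, 0.0, 0.0)]
speeds, accels = kinematics_from_positions(t, p)
```

A third difference of the same form would yield jerk, which the classification of smoother versus jerky technique could consume.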
[0077] The processor 122 can cause, based on the analysis, the user interface 242 to output information to provide feedback to the surgeon relating to the performance of the surgical procedure. The user interface 242 can output the information via the display device 244, the speaker(s), the indicator light(s), and/or the haptic device(s). Examples of information that can be displayed on the display device 244 are shown and described below with respect to Figures 3-8.
[0078] In some implementations, the processor 122 can determine the plurality of surgical performance metrics and output information preoperatively. For instance, the processor 122 can determine, based on the preoperative information and/or the surgical performance metrics, a procedure plan for performing the surgical procedure. The processor 122 can cause the user interface 242 to output information related to the procedure plan. As examples, the procedure plan can include at least one item of information selected from among a group consisting of: (i) a preoperative planned route of movement of the surgical instrument(s) 114 and/or surgical implants, (ii) a target position for actuating the surgical instrument(s) 114, (iii) a selection of the surgical instrument(s) 114 from among a plurality of potential surgical instruments 114, and (iv) potential complications that may be encountered.
[0079] In some implementations, the processor 122 can additionally or alternatively perform the analysis and cause the user interface 242 to output the information intraoperatively in real-time during the surgical procedure. As examples, the user interface 242 can output at least one item of information selected from a group consisting of: (i) a prediction of the occurrence of an adverse event (e.g., a breach of critical anatomical structures), (ii) an alert of sub-optimal performance of the surgical instrument 114, (iii) an alert of deviation from the preoperative plan, and (iv) an alert of sub-optimal psychological and/or physiological status of the surgeon and/or surgical team. This can help to provide real-time feedback to help enhance surgical performance during the surgical procedure.
[0080] In some implementations that include the surgeon monitoring device(s) 238, the processor 122 can receive the surgeon physiological data related to the physiological condition of the surgeon performing the surgical procedure. The processor 122 can further make a determination, based on the surgeon physiological data, that a cognitive load on the surgeon is greater than a threshold amount of cognitive load. Responsive to the determination that the cognitive load on the surgeon is greater than the threshold amount of cognitive load, the processor 122 can cause the user interface 242 to output the information. In some examples, the information outputted by the user interface 242 can include instrument guidance information for operating the surgical instrument based on the plurality of surgical performance metrics. In other examples, the information outputted by the user interface 242 can additionally or alternatively include kinematic guidance for navigating the surgical instrument(s) 114 based on the plurality of surgical performance metrics.

[0081] In some implementations, the processor 122 can additionally or alternatively perform the analysis and cause the user interface 242 to output the information postoperatively. For example, the user interface 242 can provide information to provide a postoperative audit of the surgeon’s surgical performance during the surgical procedure.
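The cognitive-load gating described above can be sketched as a simple threshold check. The scalar load estimate (e.g., derived from blink rate and pupil dilation), the 0.7 threshold, and the guidance strings are all assumptions for this sketch.

```python
def feedback_if_overloaded(load_estimate, threshold=0.7):
    """Gate guidance output on an estimated cognitive load. The load
    estimate and threshold are assumed, illustrative values."""
    if load_estimate > threshold:
        # Placeholder strings for the instrument and kinematic guidance
        # described in the text.
        return {
            "instrument_guidance": "reduce motor speed toward the expected range",
            "kinematic_guidance": "slow the advance along the planned route",
        }
    return None

high_load_feedback = feedback_if_overloaded(0.9)
low_load_feedback = feedback_if_overloaded(0.3)
```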
[0082] In another example, the processor 122 can additionally or alternatively determine and/or iteratively update the surgical performance metrics based on the instrument data, the position data, and/or the kinematic data received during the surgical procedure. For instance, in another example, the processor 122 can receive, from the surgical instrument(s) 114, instrument data related to the operation of the surgical instrument(s) 114 at a plurality of points in time during a surgical procedure. The processor 122 can additionally receive, from the surgical navigation system 116, position data that is indicative of the position of the surgical instrument(s) 114 relative to the anatomy of a patient at the plurality of points in time during the surgical procedure. The processor 122 can determine, based on the position data, kinematic data at the plurality of points in time, and determine, using the instrument data and the kinematic data, the surgical performance metrics that are indicative of a characteristic of surgical performance. The processor 122 can also perform an analysis of (i) the kinematic data and the instrument data relative to (ii) the plurality of surgical performance metrics. The processor 122 can further cause, based on the analysis, the user interface 242 to output information to provide feedback to a surgeon relating to a performance of the surgical procedure. The processor 122 can cause the user interface 242 to output the information preoperatively, intraoperatively, and/or postoperatively as described above. In an implementation in which the processor 122 causes the user interface 242 to output the information intraoperatively, the processor 122 can determine the surgical performance metrics, perform the analysis, and cause the user interface 242 to output the information in real-time during the surgical procedure.
The processor 122 can additionally or alternatively cause the user interface 242 to output information responsive to the processor 122 determining that a level of cognitive load is greater than the threshold amount of cognitive load, as described above.
[0083] In some examples, the processor 122 can additionally or alternatively cause the surgical instrument 114 to automatically adjust one or more of the instrument parameter(s) based on the instrument data, the position data, the kinematic data, the surgeon physiological data, the patient physiological data received during the surgical procedure, and/or the surgical performance metrics. For instance, the processor 122 can intraoperatively cause the surgical instrument 114 to adjust a motor speed, a direction of motor rotation, a motor torque, a motor temperature, a motor current, a motor power consumption, an electrosurgical current, an electrosurgical voltage, an electrosurgical waveform, an electrosurgical impedance (e.g., a resistance encountered by the surgical instrument(s) 114 while cutting/aspirating/coagulating tissue), an irrigation flow rate, a suction flow rate, a depth control (e.g., a depth level for screw placement and/or a cutting depth), and/or a visibility setting for a camera (e.g., an illumination intensity, a white balance setting, an image magnification setting, a focus setting, an image enhancement feature, and/or an air and water insufflation setting). This can help to, for example, automatically adjust the operation of the surgical instrument 114 to (i) guide the surgical procedure towards the surgical performance metrics, (ii) reduce a risk associated with a high cognitive load, and/or (iii) reduce a risk associated with a change in a condition of the patient during the surgical procedure.
[0084] In one implementation, the processor 122 can perform an analysis of (i) the kinematic data and the instrument data relative to (ii) the plurality of surgical performance metrics. The processor 122 can further determine, based on the analysis, an adjustment to the instrument parameter(s), and responsively cause the surgical instrument 114 to adjust the one or more instrument parameter(s) in accordance with the adjustment to the instrument parameter(s).
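One illustrative way to apply such an adjustment is to step a single parameter toward the value indicated by the surgical performance metrics while limiting the per-cycle change. The rate limit (50 RPM per cycle) is an assumed safety choice, not taken from the text.

```python
def adjust_instrument_parameter(current, target, max_step=50.0):
    """Step one instrument parameter (e.g., motor speed in RPM) toward
    a target value, limiting the per-cycle change. The rate limit is
    an assumed safety choice."""
    delta = max(-max_step, min(max_step, target - current))
    return current + delta

# Converge a motor speed of 1300 RPM toward a 1000 RPM target.
speed = 1300.0
trajectory = []
for _ in range(8):
    speed = adjust_instrument_parameter(speed, 1000.0)
    trajectory.append(speed)
```

Rate-limiting the change keeps each automatic adjustment gradual, which is one plausible way to avoid abrupt behavior changes mid-procedure.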
[0085] In some implementations, the processor 122 can additionally or alternatively receive the surgeon physiological data related to the physiological condition of the surgeon performing the surgical procedure. The processor 122 can make a determination, based on the surgeon physiological data, that a cognitive load on the surgeon is greater than a threshold amount of cognitive load. Responsive to the determination that the cognitive load on the surgeon is greater than the threshold amount of cognitive load, the processor 122 can cause the surgical instrument 114 to adjust the one or more instrument parameter(s).
[0086] In some examples, the processor 122 and the user interface 242 can be configured to allow the surgeon and/or an institution (e.g., an employer of the surgeon and/or an administrator of a hospital at which the surgeon has practice privileges) to set one or more goals for surgical performance. For instance, the processor 122 can additionally or alternatively be configured to receive, from the user interface 242, surgical performance goal data relating to the one or more goals for surgical performance. As examples, the one or more goals for surgical performance can include one or more goals selected from a group consisting of: (i) a goal to understand the surgical technique of the surgeon, (ii) a goal to understand management of workflow in the operating room, (iii) a goal to improve safety associated with surgical technique, (iv) a goal to obtain a balance between safety and efficacy, (v) a goal to understand surgical outcomes for surgical procedures performed by the surgeon, (vi) a goal to reduce complication rates, (vii) a goal to obtain continuing professional development (CPD) credits, (viii) a goal to train operating room staff, (ix) a goal to understand patient factors, (x) a goal to reduce wear and tear on a body of the surgeon, and/or (xi) a goal to learn and progress surgical skills.

[0087] The user interface 242 can receive a user input selecting one or more of the goals for surgical performance and communicate the user input to the processor 122. In some implementations, the processor 122 can, based on the user input, determine the surgical performance goal data. The processor 122 can further set and/or adjust the surgical performance metrics based on the surgical performance goal data. In such implementations, the surgical performance goals can help to fine-tune the surgical performance metrics, and guide current and/or future surgical performance towards desired values of the surgeon and/or the institution.
[0088] In some implementations, the processor 122 can additionally or alternatively be configured to preoperatively, intraoperatively, and/or postoperatively provide information to the surgeon based on the surgical performance metrics. In such implementations, the processor 122 can use the surgical performance goal data to determine what information is provided, when it is provided, and how it is provided to the surgeon to facilitate the surgeon’s progress towards the surgical performance goals (e.g., selected via the user interface 242). Accordingly, the processor 122 can (i) perform an analysis of one or more data sets (e.g., the instrument data, the position data, the kinematic data, the outcome data, the historical surgeon data, the patient-specific data, the image data, the patient physiological data, and/or the surgeon physiological data) as described above, and (ii) determine, based on the analysis and the surgical performance goal data, information that can be outputted to the surgeon by the user interface 242. In this example, for a given data set, the processor 122 is configured to cause the user interface 242 to output a first set of information for a first surgical performance goal data and a second set of information for a second surgical performance goal data, where the first surgical performance goal data is different from the second surgical performance goal data and the first set of information is different from the second set of information.

[0089] In some implementations, the processor 122 and the user interface 242 can be configured to provide information that shows progress towards the surgical performance goals and/or deficiencies in progress towards the surgical performance goals. This can help the surgeon and/or the institution better understand how the surgeon has performed and on what the surgeon may want to focus their attention for additional improvement in performance.
[0090] Referring now to Figure 3, a display screen 350 of an application for evaluating and/or aiding surgical performance is shown according to an example. As shown in Figure 3, the display screen 350 can include a summary of historical surgeon data for a particular surgeon. For instance, the display screen 350 can include a first indication 352A of a quantity of surgical procedures performed within a given timeframe and/or a second indication 352B of an average amount of time for completing the quantity of surgical procedures. The display screen 350 can additionally include a first link 354A to upcoming surgical procedures that will be performed by the surgeon, a second link 354B to past surgical procedures that were previously performed by the surgeon, and/or a third link 354C to personalized insights based on an analysis of the procedure data sets associated with the surgeon and the surgical performance metrics determined by the processor 122.
[0091] Referring now to Figure 4, a display screen 450 of the application for evaluating and/or aiding surgical performance is shown according to another example. In Figure 4, the processor 122 causes the user interface 242 to display a graphical depiction over time of the at least one of the instrument data, the position data, or the kinematic data. For instance, in Figure 4, the display screen 450 includes a respective identifier 456A-456E for each surgical instrument 114 and a graphical depiction 458A-458E of the instrument data for the surgical instrument 114 relative to an axis 460 indicating the plurality of points of time during the surgical procedure. [0092] In this example, the display screen 450 includes a first identifier 456A for a drill corresponding to a first graphical depiction 458A of the instrument data for the drill, a second identifier 456B for a camera corresponding to a second graphical depiction 458B of the instrument data for the camera, a third identifier 456C for an aspirator corresponding to a third graphical depiction 458C of the instrument data for the aspirator, a fourth identifier 456D for a bipolar electrosurgical device corresponding to a fourth graphical depiction 458D of the instrument data for the bipolar electrosurgical device, and a fifth identifier 456E for a suction device corresponding to a fifth graphical depiction 458E of the instrument data for the suction device.
[0093] In Figure 4, the plurality of surgical performance metrics define one or more ranges of expected values for at least one of (i) one or more instrument parameters of the instrument data or (ii) one or more kinematic parameters of the kinematic data. As shown in Figure 4, the processor 122 can cause the user interface to display an indication 462 that a portion of the instrument data was outside of at least one of the one or more ranges of expected values defined by the plurality of surgical performance metrics.
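The range check behind the indication 462 can be pictured as a simple filter over time-stamped samples of an instrument parameter. The sketch below assumes a single scalar parameter with a (low, high) expected range; the disclosure itself leaves the data representation open:

```python
def flag_out_of_range(samples, expected_range):
    """Return the (time, value) samples falling outside an expected range.

    Illustrative sketch of the range check suggested for Figure 4.
    samples: list of (time, value) pairs for one instrument parameter.
    expected_range: (low, high) bounds defined by the performance metrics.
    """
    low, high = expected_range
    return [(t, v) for t, v in samples if not (low <= v <= high)]
```

The flagged samples could then drive an on-screen marker such as the indication 462 at the corresponding points on the time axis 460.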
[0094] As shown in Figure 4, the processor 122 can additionally or alternatively cause the user interface 242 to display a link 464A to a pre-operative image of an anatomy of a surgical site and/or display a link 464B to a post-operative image of the anatomy.
[0095] Although Figure 4 shows graphical depictions 458A-458E of the instrument data and an indication that instrument data was outside of the range(s) of expected values, the processor 122 can cause the user interface 242 to display graphical depictions of the kinematic data and/or an indication that a portion of the kinematic data was outside of at least one of the one or more ranges of expected values defined by the plurality of surgical performance metrics in other examples. [0096] Referring now to Figure 5, a display screen 550 of an application for evaluating and/or aiding surgical performance is shown according to an example. In Figure 5, the display screen 550 includes a plurality of indicators 556A-556C corresponding to respective graphical depictions 558A-558C of instrument parameters indicated by the instrument data at the plurality of points of time (e.g., indicated by an axis 560) during the surgical procedure.
[0097] In Figure 5, the plurality of surgical performance metrics define one or more ranges of expected values for at least one of (i) one or more instrument parameters of the instrument data or (ii) one or more kinematic parameters of the kinematic data. As shown in Figure 5, the processor 122 can cause the user interface to display an indication 562 that a portion of the instrument data was outside of at least one of the one or more ranges of expected values defined by the plurality of surgical performance metrics.
[0098] Additionally, in Figure 5, the processor 122 can cause the user interface 242 to display, based on the position and the kinematic data, an animation 564 of a movement of the surgical instrument 114 overlaid on the image of the anatomy. In one example, the animation 564 can include a plurality of colors and each color can be based on the kinematic data. For instance, the colors can provide a color-coded indication of the kinematic data such that there are N ranges of kinematic data values that are respectively indicated by N colors, where N is an integer value greater than two. In one example implementation, a first range of smoothness of movement can be indicated by a first color, a second range of smoothness of movement can be indicated by a second color, and an Nth range of smoothness of movement can be indicated by an Nth color.
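The N-range color coding described above amounts to bucketing a kinematic value into one of N contiguous ranges and looking up the corresponding color. A minimal sketch, assuming sorted range boundaries and illustrative color names (neither is specified by the disclosure):

```python
import bisect

def color_for_value(value, boundaries, colors):
    """Map a kinematic value (e.g., a smoothness measure) to one of N colors.

    boundaries: sorted upper bounds of the first N-1 ranges.
    colors: N color names, one per range (illustrative).
    """
    return colors[bisect.bisect_left(boundaries, value)]
```

Evaluating this per animation frame would color-code the displayed tool trajectory, as in the smoothness-coded trajectory mentioned for Figure 7.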
[0099] Also, as shown in Figure 5, in implementations in which the kinematic data relates to a plurality of kinematic parameters, the processor 122 can correlate the kinematic parameters with each other at the plurality of points in time. Similarly, in implementations in which the instrument data relates to a plurality of instrument parameters, the processor 122 can correlate the instrument parameters with each other at the plurality of points in time.
[0100] In the example shown in Figure 5, the instrument data relates to a speed, a chatter, and a current of the surgical instrument 114. In another example in which the instrument data relates to the operation of a motor of the surgical instrument(s) 114, the instrument data can relate to one or more instrument parameters selected from a group consisting of: a motor speed, a motor torque, a motor temperature, a motor current, and a motor power consumption.
[0101] Referring now to Figure 6, a display screen 650 of an application for evaluating and/or aiding surgical performance is shown according to an example. In Figure 6, the processor 122 can cause the user interface 242 to display an amount of deviation between (i) an actual location of a device implanted by the surgical instrument 114, and (ii) a preoperatively planned location at which the device was to be implanted by the surgical instrument 114. Additionally, in Figure 6, the processor 122 can cause the user interface to display an indication 666 of a threshold amount of deviation defined by the plurality of surgical performance metrics.
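The deviation displayed in Figure 6 can be pictured as the Euclidean distance between the actual and planned implant positions, compared against the threshold defined by the surgical performance metrics. The three-dimensional coordinates and millimeter units below are assumptions for illustration; the disclosure does not fix a deviation measure:

```python
import math

def implant_deviation(actual, planned):
    """Euclidean distance between actual and planned implant positions.

    actual, planned: (x, y, z) coordinates, assumed here to be in mm
    and expressed in a common navigation frame of reference.
    """
    return math.dist(actual, planned)

def exceeds_threshold(actual, planned, threshold_mm):
    """True when the deviation exceeds the metric-defined threshold (Figure 6)."""
    return implant_deviation(actual, planned) > threshold_mm
```

The boolean result could drive the indication 666, while the scalar deviation itself is what the display screen 650 presents to the surgeon.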
[0102] Referring now to Figure 7, a display screen 750 is shown for an application for evaluating and/or aiding surgical performance according to another example. In Figure 7, the processor 122 can cause the user interface 242 to simultaneously display (i) an animation 768 of a surgical procedure, and (ii) a graphical depiction over time of the at least one of the instrument data, the position data, or the kinematic data for the surgical procedure.
[0103] For instance, in Figure 7, the display screen 750 includes a plurality of indicators 756A-756C corresponding to respective graphical depictions 758A-758C. In this example, the graphical depictions 758A-758C include (i) a first graphical depiction 758A of an acceleration of a tip of the surgical instrument 114 indicated by the kinematic data at a plurality of points in time, (ii) a second graphical depiction 758B of a force indicated by the instrument data (e.g., and measured by a force sensor) over the plurality of points in time, and (iii) a third graphical depiction 758C of a revolutions per minute (RPM) of the working element of the surgical instrument 114 (e.g., a drill bit) over the plurality of points in time. As shown in Figure 7, the processor 122 can be configured to play the animation 768 and display the graphical depictions 758A-758C in a time synchronized manner such that the display screen 750 can provide an indication of how the surgical instrument 114 was operated at each point in time over the course of the surgical procedure.
[0104] As shown in Figure 7, the animation 768 can include an animated object representing the surgical instrument 114 and an animated object representing the patient anatomy. In some examples, the processor 122 can determine the object representing the patient anatomy based on the image data relating to the patient anatomy. For instance, as described above, the processor 122 can receive the image data relating to the patient anatomy from the surgical navigation system 116 and/or the patient-specific data source 232, and the processor 122 can determine the animated object representing the anatomy based on the image data. This can provide a more realistic and/or accurate visualization of the surgical procedure relative to other examples in which the animated object representing the anatomy is a generic representation of the anatomy. However, in other examples, the animated object representing the anatomy can be a generic representation of the anatomy. This can, for example, help to reduce a computational load on the processor 122.
[0105] In some examples, the processor 122 can determine the animation 768 based on the instrument data, the position data, and/or the kinematic data for the surgical procedure. As such, the animation 768 can depict (i) relative positions, orientations, and/or movements of the surgical instrument 114 relative to the patient anatomy, (ii) actuations of the surgical instrument 114, and/or (iii) interactions between the surgical instrument 114 and the patient anatomy at the plurality of points of time during the surgical procedure. In such examples, the animation 768 can be specific to the procedure performed by the practitioner as opposed to a generic animation. Additionally, this can help to provide improved visualization of the use of the surgical instrument 114 and/or the surgical technique. However, in other examples, the animation 768 can be a generic animation that is displayed for all performances of a particular type of surgical procedure. This can help, for instance, to reduce computational load on the processor 122.
[0106] In some examples, the display screen 750 can further include the surgical performance metrics displayed on the animation 768. For instance, as one example, the processor 122 can cause the display screen 750 to include a tool tip trajectory color-coded according to a smoothness of the movement of the surgical instrument 114.
[0107] Additionally, in Figure 7, the processor 122 can cause the display screen 750 to display text 770 that can provide a technique recommendation as feedback to a surgeon relating to a performance of the surgical procedure. In Figure 7, the display text 770 provides a technique recommendation that relates to a technique for handling of the surgical instrument 114. In another example, the display text 770 can provide a technique recommendation that relates to a recommendation to use a different surgical instrument 114 and/or a different working element on the surgical instrument 114. The processor 122 can determine the text and the technique recommendation based on one or more of the analysis of the instrument data, the position data, the kinematic data, and/or the surgical performance metrics.
[0108] Referring now to Figure 8, a display screen 850 is shown for an application for evaluating and/or aiding surgical performance according to another example. In Figure 8, the processor 122 can cause the user interface 242 to simultaneously display (i) the animation 768 of the surgical procedure, and (ii) a video 872 recorded by an image capture device during the surgical procedure. As shown in Figure 8, the processor 122 can be configured to play the animation 768 and display the video 872 in a time synchronized manner such that the display screen 850 can provide an indication of how the surgical instrument 114 was operated at each point in time over the course of the surgical procedure.
[0109] In Figure 8, the image capture device is external to the surgical site such that the video 872 depicts the hands and/or the body of the practitioner from outside of the surgical site. By contrast, the animation 768 can show how the surgical instrument 114 interacted with the patient anatomy within the surgical site. In this way, the video 872 can help to show a way in which the practitioner grasped the surgical instrument, moved the surgical instrument, and/or actuated the surgical instrument in a time synchronized manner with the animation 768 showing the resulting effect in the surgical site. This can help to provide additional or alternative insights into the surgical technique used during the surgical procedure.
[0110] In other examples, the image capture device can be one of the surgical instruments 114 such that the video is obtained from within the surgical site. For instance, the video can be captured by an endoscope, a surgical microscope, and/or an exoscope. In such examples, displaying the animation 768 in a time synchronized manner with the video 872 can help the practitioner to review the surgical procedure in conjunction with the view from the surgical site that was actually available to the practitioner during the surgical procedure.
[0111] Referring now to Figure 9, a flowchart of a process 900 of evaluating surgical performance for a surgical procedure is shown, according to an example. As shown in Figure 9, the process 900 includes determining a plurality of procedure data sets for a plurality of surgical procedures at block 910, and determining, based on the plurality of procedure data sets, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance at block 912. Determining the plurality of procedure data sets at block 910 can include, for each surgical procedure of the plurality of surgical procedures, determining a respective procedure data set of the plurality of procedure data sets by: (i) receiving, from a surgical instrument, instrument data related to an operation of the surgical instrument during the surgical procedure at block 914, where the instrument data is based on one or more instrument parameters determined by the surgical instrument at a plurality of points in time during the surgical procedure, (ii) receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to a patient anatomy at the plurality of points in time during the surgical procedure at block 916, (iii) determining, based on at least one of the position data or the instrument data, kinematic data at the plurality of points in time at block 918, and (iv) correlating, for each of the plurality of points in time, the instrument data, the position data, and the kinematic data to determine the respective procedure data set for the surgical procedure at block 920.
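The correlation step at block 920 can be pictured as joining the three time-stamped streams on their common timestamps into one record per point in time. The dictionary-based representation and field names below are illustrative assumptions, not the disclosed data format:

```python
def build_procedure_data_set(instrument_data, position_data, kinematic_data):
    """Correlate per-timestamp measurements into one procedure data set.

    Sketch of blocks 914-920: each argument is assumed to be a dict
    mapping timestamp -> measurement. Only timestamps present in all
    three streams are kept; field names are assumptions.
    """
    common = sorted(set(instrument_data) & set(position_data) & set(kinematic_data))
    return [
        {"t": t,
         "instrument": instrument_data[t],
         "position": position_data[t],
         "kinematics": kinematic_data[t]}
        for t in common
    ]
```

Repeating this per procedure yields the plurality of procedure data sets that block 912 (or a machine learning algorithm, per claim 2) consumes to derive the surgical performance metrics.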
[0112] Referring now to Figure 10, a process 1000 of evaluating surgical performance for a surgical procedure is shown, according to another example. As shown in Figure 10, the process 1000 includes (i) receiving, from a surgical instrument, instrument data related to an operation of the surgical instrument at a plurality of points in time during a surgical procedure at block 1010, (ii) receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to an anatomy of a patient at the plurality of points in time during the surgical procedure at block 1012, (iii) determining, based on the position data, kinematic data at the plurality of points in time at block 1014, (iv) determining, using the instrument data and the kinematic data, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance at block 1016, (v) performing an analysis of (a) the kinematic data and the instrument data relative to (b) the plurality of surgical performance metrics at block 1018, and (vi) causing, based on the analysis, a user interface to output information to provide feedback to a surgeon relating to a performance of the surgical procedure at block 1020.
[0113] Referring now to Figure 11, a process 1100 of evaluating surgical performance for a surgical procedure is shown, according to another example. As shown in Figure 11, the process 1100 includes (i) receiving preoperative information for a surgical procedure to be performed at block 1110, (ii) determining, using the preoperative information, a plurality of surgical performance metrics at block 1112, (iii) receiving, from a surgical instrument, instrument data related to an operation of the surgical instrument at a plurality of points in time during a surgical procedure at block 1114, (iv) receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to an anatomy of a patient at the plurality of points in time during the surgical procedure at block 1116, (v) determining, based on the position data, kinematic data at the plurality of points in time at block 1118, (vi) performing an analysis of (a) the kinematic data and the instrument data relative to (b) the plurality of surgical performance metrics at block 1120, and (vii) causing, based on the analysis, a user interface to output information to provide feedback to a surgeon relating to a performance of the surgical procedure at block 1122.
[0114] Any of the blocks shown in Figures 9-11 may represent a module, a segment, or a portion of program code, which includes one or more instructions executable by a processor for implementing specific logical functions or steps in the process. The program code may be stored on any type of computer readable medium or data storage, for example, such as a storage device including a disk or hard drive. Further, the program code can be encoded on a computer-readable storage media in a machine-readable format, or on other non-transitory media or articles of manufacture. The computer readable medium may include non-transitory computer readable medium or memory, for example, such as computer-readable media that stores data for short periods of time like register memory, processor cache and Random Access Memory (RAM). The computer readable medium may also include non-transitory media, such as secondary or persistent long term storage, like read only memory (ROM), optical or magnetic disks, compact-disc read only memory (CD-ROM), for example. The computer readable media may also be any other volatile or non-volatile storage systems. The computer readable medium may be considered a tangible computer readable storage medium, for example.
[0115] In some instances, components of the devices and/or systems described herein may be configured to perform the functions such that the components are actually configured and structured (with hardware and/or software) to enable such performance. Example configurations then include one or more processors executing instructions to cause the system to perform the functions. Similarly, components of the devices and/or systems may be configured so as to be arranged or adapted to, capable of, or suited for performing the functions, such as when operated in a specific manner.
[0116] The description of the different advantageous arrangements has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. Further, different advantageous embodiments may describe different advantages as compared to other advantageous embodiments. The embodiment or embodiments selected are chosen and described in order to explain the principles of the embodiments, the practical application, and to enable others of ordinary skill in the art to understand the disclosure for various embodiments with various modifications as are suited to the particular use contemplated.

Claims

What is claimed is:
1. A non-transitory computer-readable medium having stored therein instructions that are executable to cause a processor to perform functions comprising: determining a plurality of procedure data sets for a plurality of surgical procedures, wherein determining the plurality of procedure data sets comprises, for each surgical procedure of the plurality of surgical procedures, determining a respective procedure data set of the plurality of procedure data sets by: receiving, from a surgical instrument, instrument data related to an operation of the surgical instrument during the surgical procedure, wherein the instrument data is based on one or more instrument parameters determined by the surgical instrument at a plurality of points in time during the surgical procedure; receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to a patient anatomy at the plurality of points in time during the surgical procedure; determining, based on at least one of the position data or the instrument data, kinematic data at the plurality of points in time; and correlating, for each of the plurality of points in time, the instrument data, the position data, and the kinematic data to determine the respective procedure data set for the surgical procedure; and determining, based on the plurality of procedure data sets, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance.
2. The non-transitory computer-readable medium of claim 1, wherein determining the plurality of surgical performance metrics comprises using the plurality of procedure data sets as training data for a machine learning algorithm.
3. The non-transitory computer-readable medium of any one of claims 1-2, wherein the instructions are executable to cause the processor to perform the functions further comprising: receiving, from the surgical navigation system, image data relating to the patient anatomy, wherein the image data and the position data are both related to a common frame of reference with respect to the patient anatomy.
4. The non-transitory computer-readable medium of any one of claims 1-3, wherein the surgical navigation system is configured to determine the position data using at least one surgical navigation modality selected from a group consisting of: (i) electromagnetic surgical navigation, (ii) optical surgical navigation, (iii) ultrasound surgical navigation, and (iv) machine vision surgical navigation.
5. The non-transitory computer-readable medium of any one of claims 1-4, wherein the instructions are executable to cause the processor to perform the functions further comprising: receiving, for each surgical procedure, respective outcome data relating to a postoperative outcome of the surgical procedure, wherein determining the plurality of surgical performance metrics is further based on the respective outcome data.
6. The non-transitory computer-readable medium of any one of claims 1-5, wherein the kinematic data comprises data for one or more kinematic parameters selected from a group consisting of: (i) a trajectory of the surgical instrument, (ii) a velocity of the surgical instrument, (iii) a motion of the surgical instrument in three-dimensional space, (iv) an inertia of the surgical instrument, (v) an acceleration of the surgical instrument, (vi) a chatter of the surgical instrument, (vii) a jitter of the surgical instrument due to an unsteadiness of a hand of a surgeon, (viii) a smoothness of movement of the surgical instrument, (ix) an application force applied by the surgical instrument to a patient’s anatomy at a surgical site, (x) a movement deviation relative to a preoperative planned route for the surgical instrument, and (xi) a site of operation of the surgical instrument relative to a preoperative planned target site.
7. The non-transitory computer-readable medium of any one of claims 1-6, wherein the instrument data relates to operation of a motor of the surgical instrument.
8. The non-transitory computer-readable medium of claim 7, wherein the instrument data relates to one or more instrument parameters selected from a group consisting of: a motor speed, a motor torque, a motor temperature, a motor current, and a motor power consumption.
9. The non-transitory computer-readable medium of claim 8, wherein the one or more instrument parameters comprises a plurality of instrument parameters, and wherein the processor is configured to correlate the plurality of instrument parameters with each other at the plurality of points in time.
10. The non-transitory computer-readable medium of any one of claims 1-9, wherein the surgical instrument comprises at least one instrument selected from a group consisting of: a drill, a bone cutter, an electrosurgical tool, an aspiration tool, an irrigation tool, a shaver, a microscope, a camera, a surgical retractor, and an illumination device.
11. The non-transitory computer-readable medium of any one of claims 1-10, wherein the surgical instrument comprises an instrument sensor coupled to the surgical instrument, wherein the instrument sensor comprises one or more sensors selected from a group consisting of: an accelerometer, a ground reaction force sensor, a flow sensor configured to detect a flow of a liquid, a flow sensor configured to detect a flow of a gas, an electrical power sensor, a temperature sensor, a piezo-electric sensor, a vibration sensor, a chemical sensor, an optical sensor, a pressure sensor, a humidity sensor, a position sensor, a Hall-effect sensor, a capacitive sensor, and a Doppler flow sensor.
12. The non-transitory computer-readable medium of any one of claims 1-11, wherein the instructions are executable to cause the processor to perform the functions further comprising: receiving, for each surgical procedure of the plurality of surgical procedures, patient-specific data relating to one or more health records for a patient of the surgical procedure, wherein determining the plurality of surgical performance metrics is further based on the patient-specific data.
13. The non-transitory computer-readable medium of any one of claims 1-12, wherein the instructions are executable to cause the processor to perform the functions further comprising: receiving, for each surgical procedure of the plurality of surgical procedures, historical surgeon data relating to one or more surgical procedures performed in the past by a surgeon performing the surgical procedure, wherein determining the plurality of surgical performance metrics is further based on the historical surgeon data.
14. The non-transitory computer-readable medium of any one of claims 1-13, wherein the instructions are executable to cause the processor to perform the functions further comprising: receiving, for each surgical procedure of the plurality of surgical procedures, surgeon physiological data related to a physiological condition of the surgeon performing the surgical procedure at the plurality of points in time, wherein for each surgical procedure of the plurality of surgical procedures, determining the respective procedure data set comprises correlating, for each of the plurality of points in time, the instrument data, the position data, the kinematic data, and the surgeon physiological data to determine the respective procedure data set for the surgical procedure.
15. The non-transitory computer-readable medium of claim 14, wherein the surgeon physiological data relates to at least one physiological parameter selected from a group consisting of: (i) a heart rate of the surgeon, (ii) a respiratory rate of the surgeon, (iii) a body temperature of the surgeon, (iv) an eye blink rate of the surgeon, (v) a measure of pupil dilation of the surgeon, (vi) a measure of eye fixation of the surgeon, (vii) a measure of saccade by the surgeon, and (viii) a measure of body movement by the surgeon.
16. The non-transitory computer-readable medium of any one of claims 1-15, wherein the instructions are executable to cause the processor to perform the functions further comprising: receiving, for each surgical procedure of the plurality of surgical procedures, patient physiological data related to a physiological condition of the patient at the plurality of points in time during the surgical procedure, wherein for each surgical procedure of the plurality of surgical procedures, determining the respective procedure data set comprises correlating, for each of the plurality of points in time, the instrument data, the position data, the kinematic data, and the patient physiological data to determine the respective procedure data set for the surgical procedure.
17. The non-transitory computer-readable medium of claim 16, wherein the patient physiological data relates to at least one physiological parameter selected from a group consisting of: (i) a heart rate of the patient, (ii) a respiratory rate of the patient, (iii) a body temperature of the patient, (iv) a blood pressure of the patient, and (v) an oxygen saturation of the patient.
18. The non-transitory computer-readable medium of any one of claims 1-17, wherein determining the plurality of surgical performance metrics comprises: detecting, based on the plurality of procedure data sets, an occurrence of a surgical event during one or more surgical procedures of the plurality of surgical procedures; identifying one or more portions of the procedure data sets that are indicative of a cause of the occurrence of the surgical event; and determining the plurality of surgical performance metrics based on the one or more portions of the procedure data sets identified as being indicative of the cause of the occurrence of the surgical event.
19. The non-transitory computer-readable medium of claim 18, wherein the surgical event is at least one event selected from a group consisting of: (i) chatter of the surgical instrument, (ii) wrap, (iii) overheating of the surgical instrument, and (iv) proximity of the surgical instrument to critical anatomical structures.
20. The non-transitory computer-readable medium of any one of claims 1-19, wherein the instructions are executable to cause the processor to perform the functions further comprising: receiving preoperative information for a future surgical procedure to be performed; and determining, based on the preoperative information and the plurality of surgical performance metrics, a procedure plan for performing the future surgical procedure.
21. The non-transitory computer-readable medium of any one of claims 1-20, wherein the plurality of surgical performance metrics comprise one or more threshold values defining a range of expected values for at least one of (i) one or more instrument parameters of the instrument data or (ii) one or more kinematic parameters of the kinematic data.
22. The non-transitory computer-readable medium of any one of claims 1-20, wherein the plurality of surgical performance metrics define a scoring system for evaluating the instrument data, the position data, and the kinematic data of at least one surgical procedure selected from among the plurality of surgical procedures.
23. A non-transitory computer-readable medium having stored therein instructions that are executable to cause a processor to perform functions comprising: receiving, from a surgical instrument, instrument data related to an operation of the surgical instrument at a plurality of points in time during a surgical procedure; receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to an anatomy of a patient at the plurality of points in time during the surgical procedure; determining, based on the position data, kinematic data at the plurality of points in time; determining, using the instrument data and the kinematic data, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance; performing an analysis of (i) the kinematic data and the instrument data relative to (ii) the plurality of surgical performance metrics; and causing, based on the analysis, a user interface to output information to provide feedback to a surgeon relating to a performance of the surgical procedure.
24. The non-transitory computer-readable medium of claim 23, wherein determining the plurality of surgical performance metrics comprises using the instrument data and the kinematic data as inputs to a machine learning algorithm that has been trained on a plurality of procedure data sets, wherein each procedure data set comprises respective instrument data and respective kinematic data for a respective surgical procedure that was performed in the past.
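Claim 24 recites a machine learning algorithm trained on past procedure data sets without specifying the model. As a minimal stand-in only, the sketch below reduces each past procedure to a feature vector and scores a new procedure by its distance to the nearest historical example; a production system would use a richer learned model, and every name here is hypothetical.

```python
# Minimal stand-in for the trained model of claim 24: nearest-neighbour
# comparison of per-procedure feature vectors. Purely illustrative.
import numpy as np

def procedure_features(instrument_data, kinematic_data):
    """Collapse per-sample streams into one feature vector per procedure."""
    inst = np.asarray(instrument_data, dtype=float)
    kin = np.asarray(kinematic_data, dtype=float)
    return np.array([inst.mean(), inst.std(), kin.mean(), kin.std()])

def nearest_history_distance(history, instrument_data, kinematic_data):
    """Distance from a new procedure to its closest historical procedure.

    history: list of (instrument_data, kinematic_data) pairs from past cases.
    Smaller distances suggest the new case resembles prior practice.
    """
    query = procedure_features(instrument_data, kinematic_data)
    bank = np.stack([procedure_features(i, k) for i, k in history])
    return float(np.min(np.linalg.norm(bank - query, axis=1)))
```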
25. The non-transitory computer-readable medium of any one of claims 23-24, wherein causing, based on the analysis, the user interface to output the information is postoperatively performed after the surgical procedure is completed.
26. The non-transitory computer-readable medium of claim 25, wherein causing the user interface to output the information comprises displaying, on a display device, a graphical depiction over time of the at least one of the instrument data, the position data, or the kinematic data.
27. The non-transitory computer-readable medium of claim 26, wherein the plurality of surgical performance metrics define one or more ranges of expected values for at least one of (i) one or more instrument parameters of the instrument data or (ii) one or more kinematic parameters of the kinematic data.
28. The non-transitory computer-readable medium of claim 27, wherein displaying the information further comprises displaying an indication that a portion of the instrument data was outside of at least one of the one or more ranges of expected values defined by the plurality of surgical performance metrics.
29. The non-transitory computer-readable medium of claim 27, wherein displaying the information further comprises displaying an indication that a portion of the kinematic data was outside of at least one of the one or more ranges of expected values defined by the plurality of surgical performance metrics.
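The out-of-range indication recited in claims 27-29 amounts to checking each sample of an instrument or kinematic parameter against a (low, high) band of expected values and reporting the offending samples for display. A minimal sketch, with an assumed band and illustrative names:

```python
# Sketch of the range check behind claims 27-29: return the indices of
# samples that fall outside the metric-defined expected band.
import numpy as np

def out_of_range_indices(samples, low, high):
    """Indices of samples outside the expected [low, high] band."""
    s = np.asarray(samples, dtype=float)
    mask = (s < low) | (s > high)
    return np.flatnonzero(mask)
```

The returned indices could drive the on-screen indication, e.g. highlighting those time points on the graphical depiction of claim 26.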
30. The non-transitory computer-readable medium of any one of claims 27-29, wherein displaying the information further comprises displaying a link to a pre-operative image of an anatomy of a surgical site and displaying a link to a post-operative image of the anatomy.
31. The non-transitory computer-readable medium of any one of claims 27-30, wherein displaying the information further comprises displaying, based on the position data and the kinematic data, an animation of a movement of the surgical instrument overlaid on the image of the anatomy.
32. The non-transitory computer-readable medium of any one of claims 23-31, wherein displaying the information further comprises displaying an amount of deviation between (i) an actual location of a device implanted by the surgical instrument, and (ii) a preoperatively planned location at which the device was to be implanted by the surgical instrument.
33. The non-transitory computer-readable medium of claim 32, wherein displaying the information further comprises displaying an indication of a threshold amount of deviation defined by the plurality of surgical performance metrics.
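The deviation display recited in claims 32-33 compares the planned and actual implant locations and checks the result against a threshold drawn from the performance metrics. A minimal sketch, with illustrative coordinates and names:

```python
# Sketch of the implant-deviation check of claims 32-33: Euclidean distance
# between planned and actual 3-D locations, compared to a metric threshold.
import math

def implant_deviation_mm(planned, actual):
    """Euclidean distance between planned and actual implant locations."""
    return math.dist(planned, actual)

def deviation_flag(planned, actual, threshold_mm):
    """True when the deviation exceeds the metric-defined threshold."""
    return implant_deviation_mm(planned, actual) > threshold_mm
```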
34. The non-transitory computer-readable medium of claim 23, wherein determining the plurality of surgical performance metrics, performing the analysis, and causing the user interface to output the information are intraoperatively performed in real-time during the surgical procedure.
35. The non-transitory computer-readable medium of claim 34, wherein the instructions are executable to cause the processor to perform the functions further comprising: receiving, from a surgeon monitoring device, surgeon physiological data related to a physiological condition of the surgeon performing the surgical procedure; and making a determination, based on the surgeon physiological data, that a cognitive load on the surgeon is greater than a threshold amount of cognitive load, wherein outputting the information is responsive to the determination that the cognitive load on the surgeon is greater than the threshold amount of cognitive load.
36. The non-transitory computer-readable medium of claim 35, wherein the surgeon physiological data relates to at least one physiological parameter selected from a group consisting of: (i) a heart rate of the surgeon, (ii) a respiratory rate of the surgeon, (iii) a body temperature of the surgeon, (iv) an eye blink rate of the surgeon, (v) a measure of pupil dilation of the surgeon, (vi) a measure of eye fixation of the surgeon, (vii) a measure of saccade by the surgeon, and (viii) a measure of body movement by the surgeon.
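The cognitive-load gate recited in claims 35-36 can be sketched as normalising several physiological signals against the surgeon's baseline, combining them, and releasing feedback only when the combined load crosses a threshold. The weights, feature names, and threshold below are illustrative assumptions, not values from the application.

```python
# Sketch of the cognitive-load gate of claims 35-36: weighted deviation of
# physiological signals from baseline, compared to a load threshold.
def cognitive_load(physio, baseline, weights=None):
    """Weighted sum of fractional deviations from baseline physiology.

    physio/baseline: dicts such as {"heart_rate": 88, "blink_rate": 10}.
    """
    weights = weights or {k: 1.0 for k in physio}
    return sum(
        weights[k] * abs(physio[k] - baseline[k]) / baseline[k] for k in physio
    )

def should_output_feedback(physio, baseline, load_threshold):
    """Per claim 35, output feedback only when load exceeds the threshold."""
    return cognitive_load(physio, baseline) > load_threshold
```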
37. The non-transitory computer-readable medium of any one of claims 34-36, wherein the information outputted by the user interface comprises instrument guidance information for operating the surgical instrument based on the plurality of surgical performance metrics.
38. The non-transitory computer-readable medium of any one of claims 34-37, wherein the information outputted by the user interface comprises kinematic guidance for navigating the surgical instrument based on the plurality of surgical performance metrics.
39. A non-transitory computer-readable medium having stored therein instructions that are executable to cause a processor to perform functions comprising: receiving preoperative information for a surgical procedure to be performed; determining, using the preoperative information, a plurality of surgical performance metrics; receiving, from a surgical instrument, instrument data related to an operation of the surgical instrument at a plurality of points in time during the surgical procedure; receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to an anatomy of a patient at the plurality of points in time during the surgical procedure; determining, based on the position data, kinematic data at the plurality of points in time; performing an analysis of (i) the kinematic data and the instrument data relative to (ii) the plurality of surgical performance metrics; and causing, based on the analysis, a user interface to output information to provide feedback to a surgeon relating to a performance of the surgical procedure.
40. The non-transitory computer-readable medium of claim 39, wherein determining the plurality of surgical performance metrics comprises using the instrument data and the kinematic data as inputs to a machine learning algorithm that has been trained on a plurality of procedure data sets, wherein each procedure data set comprises respective instrument data and respective kinematic data for a respective surgical procedure that was performed in the past.
41. The non-transitory computer-readable medium of any one of claims 39-40, wherein the preoperative information comprises patient-specific data relating to one or more health records for a patient of the surgical procedure.
42. The non-transitory computer-readable medium of claim 41, wherein the patient-specific data comprises an image of the anatomy captured prior to the surgical procedure.
43. The non-transitory computer-readable medium of any one of claims 41-42, wherein the preoperative information comprises historical surgeon data relating to one or more surgical procedures performed in the past by a surgeon performing the surgical procedure.
44. The non-transitory computer-readable medium of any one of claims 41-43, wherein the preoperative information comprises surgeon preference data relating to one or more preferences for performing the surgical procedure.
45. A method of evaluating surgical performance for a surgical procedure, comprising: determining a plurality of procedure data sets for a plurality of surgical procedures, wherein determining the plurality of procedure data sets comprises, for each surgical procedure of the plurality of surgical procedures, determining a respective procedure data set of the plurality of procedure data sets by: receiving, from a surgical instrument, instrument data related to an operation of a surgical instrument during the surgical procedure, wherein the instrument data is based on one or more instrument parameters determined by the surgical instrument at a plurality of points in time during the surgical procedure, receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to a patient anatomy at the plurality of points in time during the surgical procedure, determining, based on at least one of the position data or the instrument data, kinematic data at the plurality of points in time, and correlating, for each of the plurality of points in time, the instrument data, the position data, and the kinematic data to determine the respective procedure data set for the surgical procedure; and determining, based on the plurality of procedure data sets, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance.
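The correlation step recited in claim 45 — merging instrument, position, and kinematic samples that share a point in time into one record — can be sketched as a join on common timestamps. Field names and the stream representation below are illustrative assumptions.

```python
# Sketch of claim 45's correlation step: join three {timestamp: value}
# streams on their common timestamps to form the procedure data set.
def build_procedure_data_set(instrument, position, kinematic):
    """One record per timestamp shared by all three data streams."""
    common = sorted(set(instrument) & set(position) & set(kinematic))
    return [
        {
            "t": t,
            "instrument": instrument[t],
            "position": position[t],
            "kinematic": kinematic[t],
        }
        for t in common
    ]
```

In practice the streams would arrive at different rates and need resampling or interpolation onto a shared clock before such a join; the exact-match join keeps the sketch short.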
46. A method of evaluating surgical performance for a surgical procedure, comprising: receiving, from a surgical instrument, instrument data related to an operation of the surgical instrument at a plurality of points in time during the surgical procedure; receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to an anatomy of a patient at the plurality of points in time during the surgical procedure; determining, based on the position data, kinematic data at the plurality of points in time; determining, using the instrument data and the kinematic data, a plurality of surgical performance metrics that are indicative of a characteristic of surgical performance; performing an analysis of (i) the kinematic data and the instrument data relative to (ii) the plurality of surgical performance metrics; and causing, based on the analysis, a user interface to output information to provide feedback to a surgeon relating to a performance of the surgical procedure.
47. A method of evaluating surgical performance for a surgical procedure, comprising: receiving preoperative information for the surgical procedure to be performed; determining, using the preoperative information, a plurality of surgical performance metrics; receiving, from a surgical instrument, instrument data related to an operation of the surgical instrument at a plurality of points in time during the surgical procedure; receiving, from a surgical navigation system, position data that is indicative of a position of the surgical instrument relative to an anatomy of a patient at the plurality of points in time during the surgical procedure; determining, based on the position data, kinematic data at the plurality of points in time; performing an analysis of (i) the kinematic data and the instrument data relative to (ii) the plurality of surgical performance metrics; and causing, based on the analysis, a user interface to output information to provide feedback to a surgeon relating to a performance of the surgical procedure.
48. The method of any one of claims 45-47, wherein determining the plurality of surgical performance metrics comprises using the plurality of procedure data sets as training data for a machine learning algorithm.
49. The method of any one of claims 45-48, further comprising: receiving, from the surgical navigation system, image data relating to the patient anatomy, wherein the image data and the position data are both related to a common frame of reference with respect to the patient anatomy.
50. The method of any one of claims 45-49, wherein the surgical navigation system is configured to determine the position data using at least one surgical navigation modality selected from a group consisting of: (i) electromagnetic surgical navigation, (ii) optical surgical navigation, (iii) ultrasound surgical navigation, and (iv) machine vision surgical navigation.
51. The method of any one of claims 45-50, further comprising: receiving, for each surgical procedure, respective outcome data relating to a postoperative outcome of the surgical procedure, wherein determining the plurality of surgical performance metrics is further based on the respective outcome data.
52. The method of any one of claims 45-51, wherein the kinematic data comprises data for one or more kinematic parameters selected from a group consisting of: (i) a trajectory of the surgical instrument, (ii) a velocity of the surgical instrument, (iii) a motion of the surgical instrument in three-dimensional space, (iv) an inertia of the surgical instrument, (v) an acceleration of the surgical instrument, (vi) a chatter of the surgical instrument, (vii) a jitter of the surgical instrument due to an unsteadiness of a hand of a surgeon, (viii) a smoothness of movement of the surgical instrument, (ix) an application force applied by the surgical instrument to a patient’s anatomy at a surgical site, (x) a movement deviation relative to a preoperative planned route for the surgical instrument, and (xi) a site of operation of the surgical instrument relative to a preoperative planned target site.
53. The method of any one of claims 45-52, wherein the instrument data relates to operation of a motor of the surgical instrument.
54. The method of claim 53, wherein the instrument data relates to one or more instrument parameters selected from a group consisting of: a motor speed, a motor torque, a motor temperature, a motor current, and a motor power consumption.
55. The method of claim 54, wherein the one or more instrument parameters comprise a plurality of instrument parameters, and wherein the method comprises correlating the plurality of instrument parameters with each other at the plurality of points in time.
56. The method of any one of claims 45-55, wherein the surgical instrument comprises at least one instrument selected from a group consisting of: a drill, a bone cutter, an electrosurgical tool, an aspiration tool, an irrigation tool, a shaver, a microscope, a camera, a surgical retractor, and an illumination device.
57. The method of any one of claims 45-56, wherein the surgical instrument comprises an instrument sensor coupled to the surgical instrument, wherein the instrument sensor comprises one or more sensors selected from a group consisting of: an accelerometer, a ground reaction force sensor, a flow sensor configured to detect a flow of a liquid, a flow sensor configured to detect a flow of a gas, an electrical power sensor, a temperature sensor, a piezoelectric sensor, a vibration sensor, a chemical sensor, an optical sensor, a pressure sensor, a humidity sensor, a position sensor, a Hall-effect sensor, a capacitive sensor, and a Doppler flow sensor.
58. The method of any one of claims 45-57, further comprising: receiving, for each surgical procedure of the plurality of surgical procedures, patient-specific data relating to one or more health records for a patient of the surgical procedure, wherein determining the plurality of surgical performance metrics is further based on the patient-specific data.
59. The method of any one of claims 45-58, further comprising: receiving, for each surgical procedure of the plurality of surgical procedures, historical surgeon data relating to one or more surgical procedures performed in the past by a surgeon performing the surgical procedure, wherein determining the plurality of surgical performance metrics is further based on the historical surgeon data.
60. The method of any one of claims 45-59, further comprising: receiving, for each surgical procedure of the plurality of surgical procedures, surgeon physiological data related to a physiological condition of the surgeon performing the surgical procedure at the plurality of points in time, wherein for each surgical procedure of the plurality of surgical procedures, determining the respective procedure data set comprises correlating, for each of the plurality of points in time, the instrument data, the position data, the kinematic data, and the surgeon physiological data to determine the respective procedure data set for the surgical procedure.
61. The method of claim 60, wherein the surgeon physiological data relates to at least one physiological parameter selected from a group consisting of: (i) a heart rate of the surgeon, (ii) a respiratory rate of the surgeon, (iii) a body temperature of the surgeon, (iv) an eye blink rate of the surgeon, (v) a measure of pupil dilation of the surgeon, (vi) a measure of eye fixation of the surgeon, (vii) a measure of saccade by the surgeon, and (viii) a measure of body movement by the surgeon.
62. The method of any one of claims 45-61, further comprising: receiving, for each surgical procedure of the plurality of surgical procedures, patient physiological data related to a physiological condition of the patient at the plurality of points in time during the surgical procedure, wherein for each surgical procedure of the plurality of surgical procedures, determining the respective procedure data set comprises correlating, for each of the plurality of points in time, the instrument data, the position data, the kinematic data, and the patient physiological data to determine the respective procedure data set for the surgical procedure.
63. The method of claim 62, wherein the patient physiological data relates to at least one physiological parameter selected from a group consisting of: (i) a heart rate of the patient, (ii) a respiratory rate of the patient, (iii) a body temperature of the patient, (iv) a blood pressure of the patient, and (v) an oxygen saturation of the patient.
64. The method of any one of claims 45-63, wherein determining the plurality of surgical performance metrics comprises: detecting, based on the plurality of procedure data sets, an occurrence of a surgical event during one or more surgical procedures of the plurality of surgical procedures; and identifying one or more portions of the procedure data sets that are indicative of a cause of the occurrence of the surgical event; and determining the plurality of surgical performance metrics based on the one or more portions of the procedure data sets identified as being indicative of the cause of the occurrence of the surgical event.
65. The method of claim 64, wherein the surgical event is at least one event selected from a group consisting of: (i) chatter of the surgical instrument, (ii) wrap, (iii) overheating of the surgical instrument, and (iv) proximity of the surgical instrument to critical anatomical structures.
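The event-detection step recited in claims 64-65 can be illustrated with a deliberately simple detector: chatter is approximated as a burst of high sample-to-sample variation in an instrument signal, and a surrounding window is returned as the candidate "cause" portion of the data set. This is an illustration only, not the application's actual algorithm, and all thresholds are assumptions.

```python
# Illustrative detector for claims 64-65: flag samples whose step change
# exceeds a threshold and return merged context windows around them.
import numpy as np

def detect_event_windows(signal, diff_threshold, context=2):
    """Return (start, end) index windows around high-variation samples."""
    s = np.asarray(signal, dtype=float)
    spikes = np.flatnonzero(np.abs(np.diff(s)) > diff_threshold)
    windows = []
    for i in spikes:
        start, end = max(0, i - context), min(len(s) - 1, i + 1 + context)
        if windows and start <= windows[-1][1]:
            windows[-1] = (windows[-1][0], end)  # merge overlapping windows
        else:
            windows.append((start, end))
    return windows
```

The returned windows point at the portions of the procedure data set to examine as candidate causes of the detected event.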
66. The method of any one of claims 45-65, further comprising: receiving preoperative information for a future surgical procedure to be performed; and determining, based on the preoperative information and the plurality of surgical performance metrics, a procedure plan for performing the future surgical procedure.
67. The method of any one of claims 45-66, wherein the plurality of surgical performance metrics comprise one or more threshold values defining a range of expected values for at least one of (i) one or more instrument parameters of the instrument data or (ii) one or more kinematic parameters of the kinematic data.
68. The method of any one of claims 45-67, wherein the plurality of surgical performance metrics define a scoring system for evaluating the instrument data, the position data, and the kinematic data of at least one surgical procedure selected from among the plurality of surgical procedures.
69. The method of any one of claims 45-68, wherein determining the plurality of surgical performance metrics comprises using the instrument data and the kinematic data as inputs to a machine learning algorithm that has been trained on a plurality of procedure data sets, wherein each procedure data set comprises respective instrument data and respective kinematic data for a respective surgical procedure that was performed in the past.
70. The method of any one of claims 46-69, wherein causing, based on the analysis, the user interface to output the information is postoperatively performed after the surgical procedure is completed.
71. The method of claim 70, wherein causing the user interface to output the information comprises displaying, on a display device, a graphical depiction over time of the at least one of the instrument data, the position data, or the kinematic data.
72. The method of claim 71, wherein the plurality of surgical performance metrics define one or more ranges of expected values for at least one of (i) one or more instrument parameters of the instrument data or (ii) one or more kinematic parameters of the kinematic data.
73. The method of claim 72, wherein displaying the information further comprises displaying an indication that a portion of the instrument data was outside of at least one of the one or more ranges of expected values defined by the plurality of surgical performance metrics.
74. The method of claim 72, wherein displaying the information further comprises displaying an indication that a portion of the kinematic data was outside of at least one of the one or more ranges of expected values defined by the plurality of surgical performance metrics.
75. The method of any one of claims 72-74, wherein displaying the information further comprises displaying a link to a pre-operative image of an anatomy of a surgical site and displaying a link to a post-operative image of the anatomy.
76. The method of any one of claims 72-75, wherein displaying the information further comprises displaying, based on the position data and the kinematic data, an animation of a movement of the surgical instrument overlaid on the image of the anatomy.
77. The method of any one of claims 45-76, wherein displaying the information further comprises displaying an amount of deviation between (i) an actual location of a device implanted by the surgical instrument, and (ii) a preoperatively planned location at which the device was to be implanted by the surgical instrument.
78. The method of claim 77, wherein displaying the information further comprises displaying an indication of a threshold amount of deviation defined by the plurality of surgical performance metrics.
79. The method of any one of claims 46-78, wherein determining the plurality of surgical performance metrics, performing the analysis, and causing the user interface to output the information are intraoperatively performed in real-time during the surgical procedure.
80. The method of any one of claims 45-79, further comprising: receiving, from a surgeon monitoring device, surgeon physiological data related to a physiological condition of the surgeon performing the surgical procedure; and making a determination, based on the surgeon physiological data, that a cognitive load on the surgeon is greater than a threshold amount of cognitive load, wherein outputting the information is responsive to the determination that the cognitive load on the surgeon is greater than the threshold amount of cognitive load.
81. The method of claim 80, wherein the surgeon physiological data relates to at least one physiological parameter selected from a group consisting of: (i) a heart rate of the surgeon, (ii) a respiratory rate of the surgeon, (iii) a body temperature of the surgeon, (iv) an eye blink rate of the surgeon, (v) a measure of pupil dilation of the surgeon, (vi) a measure of eye fixation of the surgeon, (vii) a measure of saccade by the surgeon, and (viii) a measure of body movement by the surgeon.
82. The method of any one of claims 79-81, wherein the information outputted by the user interface comprises instrument guidance information for operating the surgical instrument based on the plurality of surgical performance metrics.
83. The method of any one of claims 79-82, wherein the information outputted by the user interface comprises kinematic guidance for navigating the surgical instrument based on the plurality of surgical performance metrics.
84. The method of any one of claims 47-83, wherein the preoperative information comprises patient-specific data relating to one or more health records for a patient of the surgical procedure.
85. The method of claim 84, wherein the patient-specific data comprises an image of the anatomy captured prior to the surgical procedure.
86. The method of any one of claims 84-85, wherein the preoperative information comprises historical surgeon data relating to one or more surgical procedures performed in the past by a surgeon performing the surgical procedure.
87. The method of any one of claims 84-86, wherein the preoperative information comprises surgeon preference data relating to one or more preferences for performing the surgical procedure.
88. The method of any one of claims 46-87, further comprising causing the surgical instrument to automatically adjust one or more instrument parameters based on at least one type of data selected from a group consisting of: the instrument data, the position data, and the kinematic data.
89. The method of claim 88, wherein the one or more instrument parameters comprise at least one parameter selected from a group consisting of: a motor speed, a direction of the motor rotation, a motor torque, a motor temperature, a motor current, a motor power consumption, an electrosurgical current, an electrosurgical voltage, an electrosurgical waveform, an electrosurgical impedance, an irrigation flow rate, a suction flow rate, a depth control, and a visibility setting for a camera.
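The automatic adjustment recited in claims 88-89 can be sketched as a simple proportional rule: motor speed is trimmed when a measured application force drifts from a target, clamped to the instrument's safe range. The gain, limits, and function names below are illustrative assumptions, not values from the application.

```python
# Illustrative proportional adjustment for claims 88-89: correct motor speed
# toward a target application force, clamped to an assumed safe RPM range.
def adjust_motor_speed(current_rpm, measured_force, target_force,
                       gain=50.0, min_rpm=1000.0, max_rpm=80000.0):
    """Proportional speed correction, clamped to the instrument's safe range."""
    correction = gain * (target_force - measured_force)
    return min(max_rpm, max(min_rpm, current_rpm + correction))
```

A real instrument controller would add integral/derivative terms, rate limits, and safety interlocks; the sketch only shows the shape of the feedback loop.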
PCT/US2024/014416 2023-02-06 2024-02-05 Systems and methods for evaluating and aiding surgical performance Pending WO2024167823A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202480011169.9A CN120660142A (en) 2023-02-06 2024-02-05 Systems and methods for assessing and assisting surgical performance
AU2024216584A AU2024216584A1 (en) 2023-02-06 2024-02-05 Systems and methods for evaluating and aiding surgical performance

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363443588P 2023-02-06 2023-02-06
US63/443,588 2023-02-06

Publications (1)

Publication Number Publication Date
WO2024167823A1 true WO2024167823A1 (en) 2024-08-15

Family

ID=90368854

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/014416 Pending WO2024167823A1 (en) 2023-02-06 2024-02-05 Systems and methods for evaluating and aiding surgical performance

Country Status (3)

Country Link
CN (1) CN120660142A (en)
AU (1) AU2024216584A1 (en)
WO (1) WO2024167823A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN119785998A (en) * 2025-03-12 2025-04-08 上海市浦东医院(复旦大学附属浦东医院) A method for intelligent management of neurosurgical instruments

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190090969A1 (en) * 2015-11-12 2019-03-28 Intuitive Surgical Operations, Inc. Surgical system with training or assist functions
US20220233253A1 (en) * 2021-01-22 2022-07-28 Ethicon Llc Situation adaptable surgical instrument control


Also Published As

Publication number Publication date
AU2024216584A1 (en) 2025-08-14
CN120660142A (en) 2025-09-16

Similar Documents

Publication Publication Date Title
US12016644B2 (en) Artificial intelligence guidance system for robotic surgery
CN112804958B (en) Indicator system
US12213658B2 (en) Force-indicating retractor device and methods of use
CN103857349B (en) Intelligent Surgical System
US12137883B2 (en) User interface for digital markers in arthroscopy
JP7730822B2 (en) Joint Extension System
EP3930584B1 (en) Measuring force using a tracking system
US12408908B2 (en) Force-indicating retractor device and methods of use
Luz et al. Impact of image-guided surgery on surgeons' performance: a literature review
WO2024167823A1 (en) Systems and methods for evaluating and aiding surgical performance
EP4087467B1 (en) Tensioner tool and sock with pressure sensor grid for use therewith
WO2024072886A1 (en) Systems and methods for configuring surgical systems to perform patient-specific procedure with surgeon preferences
US20220313365A1 (en) Easy to manufacture autoclavable led for optical tracking
US12318148B2 (en) Tracking device for hand tracking in surgical environment
US20250143797A1 (en) Augmented reality registration device for navigated surgery
CN120788694A (en) Trigeminal ganglion puncture dynamic tracking and positioning system based on mixed reality

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24713031

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: AU2024216584

Country of ref document: AU

WWE Wipo information: entry into national phase

Ref document number: 202480011169.9

Country of ref document: CN

ENP Entry into the national phase

Ref document number: 2024216584

Country of ref document: AU

Date of ref document: 20240205

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWP Wipo information: published in national office

Ref document number: 202480011169.9

Country of ref document: CN
