Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the technical solution of the present application and are not intended to limit the present application.
For a better understanding of the technical solution of the present application, the following detailed description will be given with reference to the drawings and the specific embodiments.
In trauma surgery, doctors often rely on intra-operative DR images for lesion localization and intra-operative navigation. Common image navigation systems include CT navigation and CBCT navigation, which rely on intraoperative three-dimensional reconstruction, registration and navigation system support to achieve real-time positioning of the instrument relative to the anatomy of the patient. However, the above-mentioned surgical navigation system needs to rely on a three-dimensional CT imaging device, which increases the cost of the device.
To solve the above problems, an embodiment of the present application provides a navigation method for a surgical instrument, which can present, in real time on a DR image of the patient, the relationship between the actual position of the surgical instrument and the path planned before surgery, without depending on three-dimensional CT imaging equipment, thereby improving surgical precision and the doctor's convenience of operation.
The following describes a navigation method of a surgical instrument according to an embodiment of the present application in detail with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a flow chart illustrating an embodiment of a navigation method of a surgical instrument according to the present application. In this embodiment, the navigation method of the surgical instrument includes steps S10 to S40:
step S10, a calibration module DR image and a calibration module optical image are obtained.
The calibration module DR image is an image of the calibration module captured by the DR imaging device, and the calibration module optical image is an image of the calibration module captured by the optical tracking system; at least one marker point is arranged on the calibration module.
In some embodiments, the DR imaging apparatus may be controlled to capture DR images of the calibration module from the orthotopic viewing angle and the lateral viewing angle respectively, forming an orthotopic calibration module DR image on the imaging plane of the orthotopic DR imaging light source of the DR imaging apparatus and a lateral calibration module DR image on the imaging plane of the lateral DR imaging light source.
In some embodiments, the calibration module comprises a first calibration module and a second calibration module, the first calibration module is provided with at least three first marker points, the second calibration module is provided with at least three second marker points, the calibration module DR image comprises a first calibration module DR image and a second calibration module DR image, and the calibration module optical image comprises a second calibration module optical image.
Further, the first calibration module DR image includes an orthotopic first calibration module DR image formed on the imaging plane of the orthotopic DR imaging light source and a lateral first calibration module DR image formed on the imaging plane of the lateral DR imaging light source. The second calibration module DR image includes an orthotopic second calibration module DR image formed on the imaging plane of the orthotopic DR imaging light source and a lateral second calibration module DR image formed on the imaging plane of the lateral DR imaging light source.
In some embodiments, the DR imaging device may be a C-arm. The C-arm, also called a C-arm X-ray machine, emits X-rays from an X-ray emitter at one end of the C-shaped frame, through the target object, to a receiver at the other end; the receiver converts the received signals into real-time DR images for intra-operative navigation.
Preferably, the first calibration module is an image calibrator, a plurality of metal balls are arranged on the image calibrator to serve as first mark points, the second calibration module is a calibration plate, and a plurality of metal balls are arranged on the calibration plate to serve as second mark points.
As shown in fig. 2, fig. 2 shows the spatial positional relationship of the C-arm, the image calibrator and the calibration plate. The C-arm comprises a C-shaped frame 1; an X-ray emitter 2 is provided at one end of the C-shaped frame, and a flat panel sensor 3 (i.e. the X-ray receiver) is provided at the other end (the end close to the bottom). The image calibrator 4 is placed on the flat panel sensor 3, and the calibration plate 5 is positioned between the X-ray emitter 2 and the image calibrator 4. The image calibrator 4 is provided with first marker points 6, and the calibration plate is provided with second marker points 7. The first marker points 6 and the second marker points 7 may be metal balls.
Step S20, determining a conversion matrix between an optical tracking system coordinate system and a DR imaging plane coordinate system based on the imaging position of the marker point in the DR image of the calibration module and the imaging position of the marker point in the optical image of the calibration module, and marking the conversion matrix as a first conversion matrix.
Specifically, the imaging principle of the DR imaging apparatus can be regarded as the perspective projection process of a monocular camera, so the marker points need to be imaged and used for calibration in order to establish a mapping relationship between a world coordinate system (in this step, the coordinate system of the optical tracking system serves as the world coordinate system) and the DR imaging plane coordinate system, that is, to complete the calibration of the internal and external parameters of the DR imaging apparatus.
The imaging position of the marker point in the calibration module DR image is the coordinate position of the marker point in the DR imaging plane coordinate system. The DR imaging plane coordinate system refers to the projection of the DR imaging light source of the DR imaging apparatus on the physical imaging plane, and is a two-dimensional coordinate system.
In some embodiments, the step S20 may include:
Determining internal parameters of the DR imaging device based on imaging positions of the first mark points in the DR image of the first calibration module;
Determining external parameters of the DR imaging device based on the imaging positions of the second marker points in the second calibration module DR image and the imaging positions of the second marker points in the second calibration module optical image, wherein the external parameters represent the conversion relationship between the optical tracking system coordinate system and the DR imaging device coordinate system;
Based on the internal and external parameters, the transformation matrix between the optical tracking system coordinate system and the DR imaging plane coordinate system is determined, as illustrated by the sketch below.
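For illustration only, the following minimal sketch (Python with numpy; the function name, matrix values and the pinhole-camera modelling are assumptions for explanation, not the exact implementation of this application) shows how intrinsics K and extrinsics [R | t] combine to map a point from the optical tracking system coordinate system onto the DR imaging plane:

```python
# Hedged sketch: compose assumed intrinsics K and extrinsics (R, t) of the DR
# imaging device into a projection from the tracker frame onto the DR imaging plane.
import numpy as np

def project_to_dr_plane(p_tracker: np.ndarray,
                        K: np.ndarray,   # 3x3 intrinsics of the DR imaging device
                        R: np.ndarray,   # 3x3 rotation: tracker frame -> DR device frame
                        t: np.ndarray) -> np.ndarray:  # translation: tracker -> DR device
    """Project a 3-D point given in the optical tracking system frame onto the
    2-D DR imaging plane (pixel coordinates)."""
    p_device = R @ p_tracker + t      # extrinsics: tracker frame -> DR device frame
    uvw = K @ p_device                # intrinsics: perspective projection
    return uvw[:2] / uvw[2]           # homogeneous normalisation -> (u, v)

# Toy example: a marker 1 m in front of the light source on the optical axis.
K = np.array([[1000.0, 0.0, 512.0],
              [0.0, 1000.0, 512.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
print(project_to_dr_plane(np.array([0.0, 0.0, 1000.0]), K, R, t))  # -> [512. 512.]
```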
Step S30, tracking the surgical instrument through an optical tracking system, and projecting instrument feature points of the surgical instrument onto a DR image of a patient based on the first transformation matrix.
In some embodiments, the step S30 may include:
S301, determining the position of a DR image of the patient in the patient reference coordinate system, which is shot by the DR imaging device, and the position of a DR imaging light source of the DR imaging device in the patient reference coordinate system based on the first conversion matrix and a conversion matrix between a predetermined optical tracking system coordinate system and the patient reference coordinate system.
Specifically, a patient reference array is installed at the patient's bone and a patient reference coordinate system is established based on the patient reference array, denoted PATIENTRF, and then real-time spatial tracking of the patient reference coordinate system PATIENTRF is achieved by an optical tracking system (e.g., a binocular camera) to obtain a transformation relationship between the patient reference coordinate system PATIENTRF relative to the optical tracking system coordinate system, i.e., a transformation matrix between the optical tracking system coordinate system and the patient reference coordinate system.
In some embodiments, the patient DR images captured by the DR imaging apparatus include an orthotopic patient DR image and a lateral patient DR image. In implementation, the orthotopic patient DR image, the lateral patient DR image, the orthotopic DR imaging light source and the lateral DR imaging light source are respectively projected into the patient reference coordinate system to obtain their positions in the patient reference coordinate system.
Specifically, the orthotopic patient DR image/lateral patient DR image/orthotopic DR imaging light source/lateral DR imaging light source is projected into a patient reference frame to obtain a position of the orthotopic patient DR image/lateral patient DR image/orthotopic DR imaging light source/lateral DR imaging light source projected in the patient reference frame by:
Firstly, if the first transformation matrix is a transformation matrix from an optical tracking system coordinate system to a DR imaging plane coordinate system, multiplying the first transformation matrix with a transformation matrix from a patient reference coordinate system to the optical tracking system coordinate system to obtain a transformation matrix from the patient reference coordinate system to the DR imaging plane coordinate system;
And then, according to a conversion matrix from the patient reference coordinate system to the DR imaging plane coordinate system, converting the orthotopic patient DR image/the lateral patient DR image/the orthotopic DR imaging light source/the lateral DR imaging light source in the DR imaging plane coordinate system into the patient reference coordinate system to obtain the positions of the orthotopic patient DR image/the lateral patient DR image/the orthotopic DR imaging light source/the lateral DR imaging light source in the patient reference coordinate system.
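As a hedged sketch of the transform chaining just described (4x4 homogeneous matrices in Python/numpy; the pose values and helper names are placeholders rather than values prescribed by this application), the position of, e.g., the orthotopic DR imaging light source can be expressed in the patient reference coordinate system as follows:

```python
# Illustrative only: chain the first transformation matrix (tracker -> DR imaging
# plane frame) with the tracked patient pose, then map a point back into PATIENTRF.
import numpy as np

def compose(T_ab: np.ndarray, T_bc: np.ndarray) -> np.ndarray:
    """Chain two rigid transforms: maps frame c into frame a."""
    return T_ab @ T_bc

def to_patient_frame(p_plane: np.ndarray, T_plane_from_patient: np.ndarray) -> np.ndarray:
    """Map a point expressed in the DR imaging plane frame into PATIENTRF."""
    T_patient_from_plane = np.linalg.inv(T_plane_from_patient)
    return (T_patient_from_plane @ np.append(p_plane, 1.0))[:3]

# Placeholder poses standing in for the calibrated and tracked values.
T_plane_from_tracker = np.eye(4)                       # "first transformation matrix"
T_tracker_from_patient = np.eye(4)
T_tracker_from_patient[:3, 3] = [10.0, 0.0, 0.0]       # tracked patient reference array

T_plane_from_patient = compose(T_plane_from_tracker, T_tracker_from_patient)
light_source_in_patient = to_patient_frame(np.array([0.0, 0.0, 1000.0]), T_plane_from_patient)
print(light_source_in_patient)
```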
S302, calculating projection points of instrument feature points of the surgical instrument in a patient reference coordinate system based on a conversion matrix between a predetermined instrument reference coordinate system and the patient reference coordinate system, and recording the projection points as first instrument projection points.
Specifically, the surgical instrument (e.g., guide sleeve, drill) is fitted with an optical reflection array, an instrument reference coordinate system is established based on the optical reflection array, denoted ToolRF, and then coordinates of the instrument feature points KeyPoint (e.g., drill tip position, sleeve center axis direction) in the instrument reference coordinate system ToolRF are predefined, as shown in fig. 4.
Then, the real-time spatial transformation relation of the instrument reference coordinate system ToolRF relative to the patient reference coordinate system PATIENTRF is acquired by means of the optical tracking system, so that the instrument characteristic points in the instrument reference coordinate system ToolRF are converted into three-dimensional positions under the coordinate system PATIENTRF of the patient reference coordinate system in real time, and a first instrument projection point is obtained.
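The same pattern applies to the instrument feature points. The short sketch below (Python/numpy; names and pose values are illustrative assumptions) converts a feature point predefined in ToolRF into PATIENTRF from the two poses reported in real time by the optical tracking system, yielding the first instrument projection point:

```python
# Illustrative sketch only: map an instrument feature point (e.g. a drill tip
# predefined in ToolRF) into PATIENTRF using real-time tracked poses.
import numpy as np

def keypoint_in_patient(p_tool: np.ndarray,
                        T_tracker_from_tool: np.ndarray,
                        T_tracker_from_patient: np.ndarray) -> np.ndarray:
    """Return the first instrument projection point: the feature point in PATIENTRF."""
    # ToolRF -> tracker frame -> PATIENTRF
    T_patient_from_tool = np.linalg.inv(T_tracker_from_patient) @ T_tracker_from_tool
    return (T_patient_from_tool @ np.append(p_tool, 1.0))[:3]

# Placeholder poses standing in for values streamed by the optical tracking system.
T_tracker_from_tool = np.eye(4); T_tracker_from_tool[:3, 3] = [120.0, 40.0, 300.0]
T_tracker_from_patient = np.eye(4); T_tracker_from_patient[:3, 3] = [100.0, 0.0, 250.0]
print(keypoint_in_patient(np.array([0.0, 0.0, 85.0]),
                          T_tracker_from_tool, T_tracker_from_patient))
```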
S303, based on the position of the DR imaging light source in the patient reference coordinate system, projecting a first instrument projection point onto a DR image of the patient in the patient reference coordinate system.
In some embodiments, the step S303 may include:
constructing, in the patient reference coordinate system, a ray that starts from the position of the DR imaging light source and passes through the first instrument projection point;
calculating the intersection point of this ray with the patient DR image plane in the patient reference coordinate system to obtain the projection point of the first instrument projection point on the patient DR image, which is recorded as the second instrument projection point;
rendering the second instrument projection point onto the patient DR image in the patient reference coordinate system.
Specifically, in the patient reference coordinate system PATIENTRF, let the position of the orthotopic DR imaging light source be S, let the instrument feature point of the surgical instrument be P, let the imaging plane of the orthotopic DR imaging light source be Π with unit normal vector n, and let a known point lying in the orthotopic patient DR image be Q0. The projection point of the first instrument projection point on the patient DR image is then obtained through the following steps:
First, the projection line passing through the orthotopic DR imaging light source S and the instrument feature point P is constructed; its direction vector is
d = P − S.
The parametric equation of the projection line L is
X(t) = S + t·d, where t is a coefficient.
The imaging plane Π of the orthotopic DR imaging light source is written in the general (point-normal) form
n · (X − Q0) = 0,
where X represents a general point on the imaging plane Π; in this embodiment it corresponds to the intersection of the projection line L with the imaging plane Π, i.e. the position vector of the projection point to be solved.
Then, X(t) is substituted into the plane equation to solve for the intersection parameter:
t = n · (Q0 − S) / (n · d).
The final intersection point P' is expressed as
P' = S + t·d = S + [n · (Q0 − S) / (n · d)] (P − S).
The intersection point P' is the projection point of the first instrument projection point on the patient DR image.
In addition, the implementation principle of the projection of the first instrument projection point onto the DR image of the lateral patient is identical to the implementation principle of the projection of the first instrument projection point onto the DR image of the orthotopic patient, and will not be described herein.
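A compact sketch of the line-plane intersection derived above (same S, P, n, Q0 notation; Python/numpy; the numeric values in the example are invented for illustration) is:

```python
# Sketch of the derived formulas: intersect the ray from the DR imaging light
# source S through the instrument feature point P with the image plane
# (unit normal n, known plane point Q0).
import numpy as np

def project_onto_dr_image(S: np.ndarray, P: np.ndarray,
                          n: np.ndarray, Q0: np.ndarray) -> np.ndarray:
    d = P - S                          # direction of the projection line
    denom = np.dot(n, d)
    if abs(denom) < 1e-9:              # ray parallel to the imaging plane
        raise ValueError("ray does not intersect the imaging plane")
    t = np.dot(n, Q0 - S) / denom      # intersection parameter
    return S + t * d                   # P' = S + t * d

# Toy check: source above the plane z = 0, feature point halfway down.
print(project_onto_dr_image(S=np.array([0.0, 0.0, 1000.0]),
                            P=np.array([50.0, 20.0, 500.0]),
                            n=np.array([0.0, 0.0, 1.0]),
                            Q0=np.zeros(3)))   # -> [100.  40.   0.]
```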
For example, referring to fig. 3, fig. 3 is a schematic view of first instrument projection points projected onto the orthotopic patient DR image and the lateral patient DR image according to the present embodiment. In fig. 3, O1 represents the lateral DR imaging light source, O2 represents the orthotopic DR imaging light source, F_APImage represents the imaging plane coordinate system of the orthotopic DR imaging light source, F_LTImage represents the imaging plane coordinate system of the lateral DR imaging light source, C1 and C2 represent the first instrument projection points corresponding to two instrument feature points, AP1 and AP2 represent the projection points of C1 and C2 on the imaging plane of the orthotopic DR imaging light source, and LT1 and LT2 represent the projection points of C1 and C2 on the imaging plane of the lateral DR imaging light source.
In some embodiments, the step of determining the internal parameters of the DR imaging apparatus based on the imaging position of each first marker point in the first calibration module DR image may include steps a1 to a5:
Step a1, determining a transformation matrix between the first calibration module coordinate system and the DR imaging plane coordinate system based on the predetermined coordinates of each first-type first marker point in the first calibration module coordinate system and the imaging position of each first-type first marker point in the first calibration module DR image, and recording it as a second transformation matrix.
Step a2, converting each second-type first marker point from the first calibration module coordinate system into the DR imaging plane coordinate system based on the second transformation matrix, to obtain the projection point of each second-type first marker point in the DR imaging plane coordinate system.
Step a3, connecting each second-type first marker point with its corresponding projection point in the DR imaging plane coordinate system, to obtain a plurality of connecting lines.
Step a4, calculating the coordinate positions of the intersection points of a plurality of groups of connecting lines to obtain the coordinates of a DR imaging light source of the DR imaging device in a DR imaging plane coordinate system;
step a5, determining internal parameters of the DR imaging device based on the coordinates of the DR imaging light source of the DR imaging device in the DR imaging plane coordinate system.
Specifically, the first marker points include first-type first marker points and second-type first marker points; the first-type first marker points are used for DR image registration, and the second-type first marker points are used for calculating the position of the DR imaging light source, that is, for calculating the internal parameters of the DR imaging device.
Taking the image calibrator as an example of the first calibration module, please refer to fig. 2: the image calibrator 4 has an upper layer and a lower layer, the lower layer being closely attached to the flat panel sensor, and both layers containing a plurality of metal balls. Each metal ball arranged on the lower layer serves as a first-type first marker point 61, and each metal ball arranged on the upper layer (i.e. the layer farther from the flat panel sensor) serves as a second-type first marker point 62.
The internal parameters of the DR imaging device are calculated as follows:
First, the coordinates of each first-type first marker point in the first calibration module coordinate system are acquired (these coordinates are known values because they are calibrated in advance). A transformation matrix from the first calibration module coordinate system to the imaging plane coordinate system of the orthotopic DR imaging light source (hereinafter referred to as the orthotopic DR imaging plane coordinate system) is then calculated based on the coordinates of each first-type first marker point in the first calibration module coordinate system and its coordinates in the orthotopic DR imaging plane coordinate system, and is recorded as the orthotopic second transformation matrix. Similarly, a transformation matrix from the first calibration module coordinate system to the imaging plane coordinate system of the lateral DR imaging light source (hereinafter referred to as the lateral DR imaging plane coordinate system) is calculated based on the coordinates of each first-type first marker point in the first calibration module coordinate system and its coordinates in the lateral DR imaging plane coordinate system, and is recorded as the lateral second transformation matrix.
Then, the second-type first marker points in the first calibration module coordinate system are converted into the orthotopic DR imaging plane coordinate system using the orthotopic second transformation matrix to obtain their projection points in the orthotopic DR imaging plane coordinate system; each second-type first marker point is connected with its corresponding projection point to obtain a plurality of connecting lines, and the point where these connecting lines intersect gives the coordinates of the orthotopic DR imaging light source in the orthotopic DR imaging plane coordinate system. Likewise, the second-type first marker points in the first calibration module coordinate system are converted into the lateral DR imaging plane coordinate system using the lateral second transformation matrix to obtain their projection points in the lateral DR imaging plane coordinate system; each second-type first marker point is connected with its corresponding projection point, and the point where the resulting connecting lines intersect gives the coordinates of the lateral DR imaging light source in the lateral DR imaging plane coordinate system.
Finally, the imaging geometry from the orthotopic DR imaging light source to the orthotopic DR imaging plane and from the lateral DR imaging light source to the lateral DR imaging plane is calculated based on the coordinates of the respective DR imaging light source in the corresponding DR imaging plane coordinate system and the vertical distance between the X-ray emitter and the receiver of the DR imaging device, thereby completing the internal parameter calibration of the DR imaging device.
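For illustration, the following sketch (Python/numpy; the marker coordinates are synthetic, the imaging plane frame is treated as a 3-D frame whose z-axis is the plane normal, and the least-squares formulation is one common way of intersecting nearly concurrent lines, not necessarily the exact numerical procedure of this application) recovers a DR imaging light source position from the lines joining second-type first marker points to their projection points:

```python
# Hedged sketch: estimate the light source as the least-squares intersection of
# the lines connecting each elevated (second-type) marker to its projection.
import numpy as np

def light_source_from_lines(markers: np.ndarray, projections: np.ndarray) -> np.ndarray:
    """markers: (N, 3) second-type marker points; projections: (N, 3) their
    projections on the imaging plane. Returns the least-squares intersection."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for a, p in zip(markers, projections):
        d = (p - a) / np.linalg.norm(p - a)   # unit direction of one connecting line
        M = np.eye(3) - np.outer(d, d)        # projector orthogonal to that line
        A += M
        b += M @ a
    return np.linalg.solve(A, b)

# Synthetic check: a source at (0, 0, 900) projecting elevated markers onto z = 0.
src = np.array([0.0, 0.0, 900.0])
markers = np.array([[30.0, 10.0, 50.0], [-20.0, 40.0, 50.0], [15.0, -25.0, 50.0]])
projections = np.array([src + (m - src) * (src[2] / (src[2] - m[2])) for m in markers])
print(light_source_from_lines(markers, projections))   # ~ [0. 0. 900.]
```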
In some embodiments, the method of navigating a surgical instrument further comprises:
calculating a moving path of a projection point of the instrument characteristic point on the DR image of the patient;
comparing the moving path with a preset planning path to obtain the position deviation and the direction deviation of the surgical instrument;
The positional deviation and the directional deviation are visually displayed on the DR image of the patient.
Specifically, the surgical instrument is tracked by the optical tracking system, and the moving path of the projection point of the instrument feature point on the patient DR image is obtained from the projection points of the real-time positions of the instrument feature point on the patient DR image. Let the moving path of the projection points of the instrument feature point on the DR image be
P_actual = {(u1, v1), (u2, v2), …, (un, vn)},
where u1, u2, …, un denote the positions of the 1st, 2nd, …, n-th projection points respectively, and v1, v2, …, vn denote the angles of the 1st, 2nd, …, n-th projection points respectively. Let the preset planned path of the surgical instrument be
P_planned = {(u1', v1'), (u2', v2'), …, (un', vn')},
where u1', u2', …, un' denote the positions of the 1st, 2nd, …, n-th planned points respectively, and v1', v2', …, vn' denote the angles of the 1st, 2nd, …, n-th planned points respectively.
With the coordinate position of the i-th projection point written as (xi, yi) and the coordinate position of the i-th planned point written as (xi', yi'), the position deviation di of the i-th point is calculated as
di = sqrt((xi − xi')² + (yi − yi')²).
Then, the direction vector ai of every two adjacent projection points in the moving path and the direction vector bi of every two adjacent planned points in the preset planned path are calculated, and the direction deviation is calculated as
θi = arccos( (ai · bi) / (|ai| · |bi|) ).
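A minimal sketch of this deviation computation (Python/numpy; the path arrays below are invented sample data, and the helper name is an assumption) could look as follows:

```python
# Hedged sketch: per-point position deviations and per-segment direction
# deviations between an actual projected path and a planned path.
import numpy as np

def path_deviation(actual: np.ndarray, planned: np.ndarray):
    """actual, planned: (n, 2) arrays of path points. Returns (d_i, theta_i)."""
    d = np.linalg.norm(actual - planned, axis=1)   # d_i = sqrt((xi-xi')^2 + (yi-yi')^2)
    a = np.diff(actual, axis=0)                    # direction vectors of the actual path
    b = np.diff(planned, axis=0)                   # direction vectors of the planned path
    cos = np.sum(a * b, axis=1) / (np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1))
    theta = np.arccos(np.clip(cos, -1.0, 1.0))     # angle between corresponding segments
    return d, theta

actual = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.2]])
planned = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
print(path_deviation(actual, planned))
```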
Finally, the difference information containing the position deviation and the direction deviation is visually displayed on the DR image of the patient in a mode of arrow, color coding, numerical labels and the like, so that a visual navigation feedback mechanism is realized.
In this embodiment, by feeding back in the image the difference between the planned path and the actual moving path of the instrument, the doctor can correct direction and depth in time, thereby avoiding surgical risks such as misplaced drilling, angular deviation and accidental perforation, and ensuring the accuracy of the implant position.
It should be noted that the above examples are only for understanding the present application and do not limit the navigation method of the surgical instrument of the present application; simple modifications made on the basis of this technical idea fall within the protection scope of the present application.
The application provides electronic equipment which comprises at least one processor and a memory in communication connection with the at least one processor, wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor so that the at least one processor can execute the navigation method of the surgical instrument in the first embodiment.
Referring now to fig. 5, a schematic diagram of an electronic device suitable for implementing embodiments of the present application is shown. The electronic device in the embodiments of the present application may include, but is not limited to, mobile terminals such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (Personal Digital Assistant), a PAD (tablet computer), a PMP (Portable Media Player) and an in-vehicle terminal (e.g., an in-vehicle navigation terminal), as well as fixed terminals such as a digital TV and a desktop computer. The electronic device shown in fig. 5 is only an example and should not be construed as limiting the functionality and scope of use of the embodiments of the present application.
As shown in fig. 5, the electronic apparatus may include a processing device 1001 (e.g., a central processing unit, a graphics processor, etc.), which may perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 1002 or a program loaded from a storage device 1003 into a Random Access Memory (RAM) 1004. The RAM 1004 also stores various programs and data required for the operation of the electronic device. The processing device 1001, the ROM 1002 and the RAM 1004 are connected to each other by a bus 1005. An input/output (I/O) interface 1006 is also connected to the bus. In general, the following may be connected to the I/O interface 1006: an input device 1007 such as a touch screen, a touch pad, a keyboard, a mouse, an image sensor, a microphone, an accelerometer or a gyroscope; an output device 1008 including a Liquid Crystal Display (LCD), a speaker, a vibrator, etc.; a storage device 1003 including a magnetic tape, a hard disk, etc.; and a communication device 1009. The communication device 1009 may allow the electronic device to communicate wirelessly or by wire with other devices to exchange data. While an electronic device having various components is shown in the figure, it should be understood that not all of the illustrated components are required to be implemented or provided; more or fewer components may alternatively be implemented or provided.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through a communication device, or installed from the storage device 1003, or installed from the ROM 1002. The above-described functions defined in the method of the disclosed embodiment of the application are performed when the computer program is executed by the processing device 1001.
The electronic equipment provided by the application adopts the navigation method of the surgical instrument in the embodiment. Compared with the prior art, the beneficial effects of the electronic device provided by the application are the same as those of the navigation method of the surgical instrument provided by the embodiment, and other technical features of the electronic device are the same as those disclosed by the method of the previous embodiment, and are not repeated herein.
It is to be understood that portions of the present disclosure may be implemented in hardware, software, firmware, or a combination thereof. In the description of the above embodiments, particular features, structures, materials, or characteristics may be combined in any suitable manner in any one or more embodiments or examples.
The present application provides a computer-readable storage medium having computer-readable program instructions (i.e., a computer program) stored thereon for performing the method of navigating a surgical instrument in the above-described embodiments.
The computer-readable storage medium provided by the present application may be, for example, a USB flash drive, but is not limited thereto; it may be an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system or device, or any combination of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM) or flash memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this embodiment, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system or device. Program code embodied on a computer-readable storage medium may be transmitted using any appropriate medium, including but not limited to electrical wiring, optical fiber cable, RF (Radio Frequency), or any suitable combination of the foregoing.
The computer readable storage medium may be included in the electronic device or may exist alone without being incorporated into the electronic device.
Computer program code for carrying out the operations of the present application may be written in one or more programming languages or combinations thereof, including object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules involved in the embodiments of the present application may be implemented in software or in hardware. In some cases, the name of a module does not constitute a limitation on the module itself.
The present application provides a computer-readable storage medium storing computer-readable program instructions (i.e., a computer program) for performing the above-described navigation method of a surgical instrument. Compared with the prior art, the beneficial effects of the computer readable storage medium provided by the application are the same as those of the navigation method of the surgical instrument provided by the above embodiment, and are not described in detail herein.
The application also provides a computer program product comprising a computer program which, when executed by a processor, implements the steps of a method of navigating a surgical instrument as described above.
Compared with the prior art, the beneficial effects of the computer program product provided by the application are the same as those of the navigation method of the surgical instrument provided by the above embodiment, and are not described herein.
The foregoing description is only a partial embodiment of the present application, and is not intended to limit the scope of the present application, and all the equivalent structural changes made by the description and the accompanying drawings under the technical concept of the present application, or the direct/indirect application in other related technical fields are included in the scope of the present application.