WO2018139025A1 - Observation Device (Dispositif d'observation) - Google Patents
Observation Device
- Publication number
- WO2018139025A1 (PCT/JP2017/041964)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
Definitions
- the present invention relates to an observation apparatus.
- in a known configuration, an image projected on a convex mirror provided at the center of a horizontal shaft having a substantially circular cross section is captured to obtain a basic image. The basic image is decomposed into a plurality of concentric annular images having different radii, each annular image is developed into a linear image, and the linear images are sequentially arranged to generate a developed image of the inner wall surface of the horizontal shaft.
- in that configuration, the basic image used for generating the developed image is acquired in a state in which the central axis of the imaging device provided with the convex mirror and the central axis of the horizontal shaft imaged by the imaging device are substantially matched by components such as a suspension arm.
- the present invention has been made in view of the above-described circumstances, and an object thereof is to provide an observation apparatus capable of generating an accurate developed image without using a centering mechanism.
- the observation apparatus includes an illuminating unit that is arranged inside a cylindrical subject having a central axis and is configured to emit illumination light to a cylindrical illumination region centered on a predetermined optical axis, and an imaging unit that is provided in the vicinity of the illuminating unit, is disposed inside the subject, and outputs an imaging signal obtained by imaging the subject in a cylindrical imaging region centered on the predetermined optical axis.
- an image generating unit is configured to generate, based on the imaging signal, an annular image having the predetermined optical axis as the center of the image.
- an offset amount of the predetermined optical axis with respect to the central axis of the subject is calculated based on the inner diameter of the subject and a value acquired from the annular image, and the image obtained by developing the annular image is corrected in accordance with the calculated offset amount.
- a diagram showing an example of a display image generated in the insertion state of the preceding figure
- a diagram showing an example of a cutout image generated from the developed image
- a diagram showing an outline of the developed image
- a diagram showing an example of a display image generated from the developed image
- the endoscope apparatus 1 has a function as an observation apparatus for observing the inside of a cylindrical subject having a central axis such as a pipe.
- the endoscope apparatus 1 includes, for example, an endoscope 2 and an apparatus main body 3 to which the endoscope 2 can be connected as shown in FIG.
- the apparatus body 3 is provided with a display unit 35 that can display an image or the like.
- the endoscope 2 includes an insertion portion 5 formed to have an elongated shape that can be inserted into a cylindrical subject having a central axis, an operation portion 6 provided on the proximal end side of the insertion portion 5, and a universal cord 7 extending from the operation portion 6. The endoscope 2 is configured to be detachably connected to the apparatus main body 3 via the universal cord 7. A light guide 21 (not shown in FIG. 1) for transmitting illumination light supplied from the apparatus main body 3 is inserted through the endoscope 2.
- the insertion portion 5 is configured by sequentially providing a distal end portion 11, a bending portion 12 formed to be bendable, and a long flexible tube portion 13 having flexibility from the distal end side.
- the operation unit 6 is provided with a bending joystick 6a configured to perform an operation for bending the bending unit 12 in a desired direction.
- the operation unit 6 is provided with operation buttons corresponding to functions available in the endoscope apparatus 1, such as a freeze button, a bending lock button, and a recording instruction button.
- the distal end portion 11 is configured so that an optical adapter 40, including one or more optical members corresponding to the imaging region used when the inside of the cylindrical subject is imaged with the endoscope 2, can be detachably attached to it. In the present embodiment, it is assumed that an optical adapter 40 configured to have a cylindrical (360-degree) imaging region centered on an optical axis AC (described later) is attached to the distal end portion 11.
- FIG. 2 is a diagram for explaining a configuration when an optical adapter is attached to the distal end portion of the insertion portion of the endoscope.
- the image sensor 24 may include an imaging element such as a CCD or a CMOS sensor. Further, in the present embodiment, a structure having a shape different from the concavo-convex portion 25 may be formed on the side surface of the distal end portion 11, as long as it can be engaged with the mounting member 49 of the optical adapter 40.
- the optical adapter 40 includes a cover member 41, an observation window 42, a free-form surface lens 43, a frame member 44, a lens holder 45, a lens unit 46, a light diffusing element 47, an illumination window 48, and a mounting member 49.
- the cover member 41 is formed of, for example, a disk-shaped member having a light shielding property. Further, the cover member 41 is fixedly disposed at a position so as to cover the front surface of the free-form surface lens 43.
- the observation window 42 is formed of, for example, a cylindrical translucent member such as a sapphire pipe, and is configured to transmit light incident from the outside of the optical adapter 40 so that the light is emitted to the free-form surface lens 43.
- the observation window 42 is fixedly disposed between the cover member 41 and the frame member 44, at a position that covers the lens side surface of the free-form surface lens 43.
- the observation window 42 is formed with an outer diameter smaller than the outer diameter of a flange 44f (described later) of the frame member 44.
- the free-form surface lens 43 is fixedly disposed between the cover member 41 and the frame member 44 in the optical adapter 40, at a position where light emitted through the observation window 42 is incident on its inside from the lens side surface.
- the free-form lens 43 is formed to have optical characteristics such that light incident inside from the lens side surface is refracted or reflected by the front surface of the lens and emitted to the lens unit 46.
- the frame member 44 is formed by combining, for example, a cylindrical member having a light shielding property and a flange 44 f that shields the front surface of the light diffusing element 47. Further, the frame member 44 is configured such that a part of the cylindrical portion 45a of the lens holder 45 can be fitted from the opening on the flange 44f side.
- the lens holder 45 has a cylindrical portion 45a having an inner diameter capable of holding the lens unit 46, and an enlarged diameter portion 45b having an inner diameter larger than the inner diameter of the cylindrical portion 45a.
- the cylindrical portion 45a is formed so as to hold the lens unit 46 in a state where the central axis of the optical adapter 40 and the central axis of the lens unit 46 are matched.
- the enlarged diameter portion 45b is provided on the rear end side of the cylindrical portion 45a and has an inner diameter capable of accommodating a portion including the distal end surface of the distal end portion 11.
- the enlarged diameter portion 45b is formed to have a shape such that, when the optical adapter 40 is attached to the distal end portion 11, the light exit surface of the lens unit 46 contacts the light incident surface of the imaging lens 23, and the light incident surface of the light diffusing element 47 contacts the light exit surface of the illumination lens 22.
- the mounting member 49 is provided on the rear end side of the enlarged diameter portion 45b.
- the lens unit 46 includes one or more lenses.
- the lens unit 46 is configured to emit light incident through the free-form surface lens 43 to the imaging lens 23 when the optical adapter 40 is attached to the distal end portion 11.
- when the optical adapter 40 is attached to the distal end portion 11, the central axis of the imaging lens 23, the central axis of the optical adapter 40, the central axis of the free-form surface lens 43, and the central axis of the lens unit 46 are located on the same optical axis AC.
- the light diffusing element 47 has, for example, a substantially cylindrical shape, and is fixedly disposed between the frame member 44 and the lens holder 45 in the optical adapter 40 in a state of being fitted into the cylindrical portion 45a.
- the light diffusing element 47 is configured, when the optical adapter 40 is attached to the distal end portion 11, to diffuse the illumination light incident through the light exit surface of the illumination lens 22 and emit it to the side.
- the illumination window 48 is formed of, for example, a cylindrical translucent member such as a sapphire pipe, and is configured to transmit the illumination light diffused by the light diffusing element 47 so that the light is emitted outside the optical adapter 40.
- the illumination window 48 is fixedly disposed between the frame member 44 and the lens holder 45, at a position that covers the side surface of the light diffusing element 47.
- the illumination window 48 is formed to have the same outer diameter as the outer diameter of the flange 44f of the frame member 44.
- the mounting member 49 has, for example, a substantially cylindrical shape, and is formed by providing a groove that can be engaged with the concavo-convex portion 25 on the inner peripheral surface.
- the optical adapter 40 is configured to have a cylindrical (360-degree) illumination region centered on the optical axis AC. Further, with the configuration described above, as schematically shown in FIG. 3, the illumination region LR of the illumination light emitted to the outside of the optical adapter 40 through the illumination window 48 is wider than the imaging region IR of the light incident on the inside of the optical adapter 40 through the observation window 42. In addition, the illumination region LR and the imaging region IR are shifted relative to each other along the optical axis AC (see FIG. 3).
- FIG. 3 is a schematic diagram for explaining the illumination region LR and the imaging region IR when the optical adapter is attached to the distal end portion of the insertion portion of the endoscope.
- the illumination unit in the present embodiment includes a light diffusing element 47 and an illumination window 48.
- the imaging unit in the present embodiment is provided in the vicinity of the illumination unit, and includes the imaging device 24, the observation window 42, and the free-form surface lens 43.
- the apparatus main body 3 includes a light source unit 31, a light source drive unit 32, an image sensor drive unit 33, an image generation unit 34, a display unit 35, a storage unit 36, an input I/F (interface) unit 37, and a CPU 38.
- the apparatus body 3 is provided with a connection port (not shown) for connecting a portable external storage device 51 such as a USB memory.
- FIG. 4 is a block diagram for explaining the configuration of the endoscope apparatus according to the embodiment.
- the light source unit 31 includes, for example, an LED or a lamp.
- the light source unit 31 is configured to be turned on or off in accordance with a light source drive signal output from the light source drive unit 32.
- the light source unit 31 is configured to supply, for example, white light having a light amount corresponding to a light source drive signal output from the light source drive unit 32 to the light guide 21 as illumination light.
- the light source driving unit 32 includes, for example, a light source driving circuit. Further, the light source driving unit 32 is configured to generate and output a light source driving signal for driving the light source unit 31 in accordance with the control of the CPU 38.
- the image sensor driving unit 33 includes, for example, an image sensor driving circuit.
- the image sensor driving unit 33 is configured to generate and output an image sensor driving signal for driving the image sensor 24 under the control of the CPU 38.
- the image generation unit 34 is configured by an integrated circuit such as an FPGA (Field Programmable Gate Array). In addition, the image generation unit 34 performs predetermined signal processing on the imaging signal output from the imaging device 24 to generate an annular endoscope image with the optical axis AC as the center of the image, The generated endoscopic images are sequentially output to the CPU 38.
- the display unit 35 includes, for example, a liquid crystal panel.
- the display unit 35 is configured to display the display image output from the CPU 38 on the display screen.
- the display unit 35 includes a touch panel 35a that detects a touch operation on a GUI (Graphical User Interface) button or the like displayed on the display screen and outputs an instruction corresponding to the detected touch operation to the CPU 38.
- the storage unit 36 includes, for example, a storage circuit such as a memory.
- the storage unit 36 stores various programs corresponding to the operation of the CPU 38, such as a program used for controlling each unit of the endoscope apparatus 1 and a program used for processing related to generation of a wide-area developed image (described later).
- the storage unit 36 is configured to store a plurality of locally developed images (described later) generated in the course of processing related to the generation of the wide area developed image by the CPU 38.
- the storage unit 36 is configured to be able to store information input in response to an operation of the input I / F unit 37.
- the storage unit 36 stores parameters used in processing related to generation of a wide area developed image.
- the input I / F unit 37 is configured to include a switch or the like that can instruct the CPU 38 according to a user's input operation.
- the input I / F unit 37 is configured to be able to input information used in processing related to generation of a wide area developed image by the CPU 38 in accordance with a user operation.
- the CPU 38 is configured to be able to control the light source driving unit 32 and the image sensor driving unit 33 based on an instruction made in response to an operation of the touch panel 35a or the input I / F unit 37.
- the CPU 38 includes a timer (not shown) for measuring time.
- the CPU 38 is configured to measure the insertion distance of the insertion portion 5 based on, for example, an output signal from an acceleration sensor (not shown) provided at the distal end portion 11.
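The text only states that the insertion distance is measured from the output of an acceleration sensor provided at the distal end portion 11. Purely as an illustrative sketch (the function name and the simple Euler double integration are assumptions; a real device would also need gravity compensation and drift correction), the distance could be obtained as:

```python
def insertion_distance(accels, dt):
    """Estimate insertion distance by double-integrating axial
    acceleration samples (illustrative Euler integration only).

    accels: sequence of axial acceleration samples [m/s^2]
    dt:     sampling interval [s]
    """
    velocity = 0.0
    distance = 0.0
    for a in accels:
        velocity += a * dt      # integrate acceleration -> velocity
        distance += velocity * dt  # integrate velocity -> distance
    return distance
```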
- the CPU 38 is configured to generate a display image in which a GUI button or the like is superimposed on an image such as an endoscopic image output from the image generation unit 34 and output the display image to the display unit 35. .
- the CPU 38 is configured to be able to perform an operation for storing information input in accordance with an operation of the input I/F unit 37 in the storage unit 36. Further, the CPU 38 has a function as an image processing unit: it generates a locally developed image by developing each of the annular endoscope images sequentially output from the image generation unit 34, and performs a process for generating a wide-area developed image by pasting together a plurality of the generated locally developed images. Details of this processing will be described later. Further, the CPU 38 is configured to be able to store the locally developed images generated in the course of generating the wide-area developed image in the storage unit 36.
- the CPU 38 is configured to be able to perform an operation for storing the wide-area developed image generated as described above in the external storage device 51. Further, the CPU 38 is configured to be able to encode the endoscopic image output from the image generation unit 34 in a still image format such as JPEG or a moving image format such as MPEG-4 and store the encoded image in the external storage device 51. Further, based on an instruction made in response to an operation of the touch panel 35a or the input I/F unit 37, the CPU 38 can read an image stored in the external storage device 51, generate a display image corresponding to the read image, and output it to the display unit 35. The CPU 38 is configured to perform predetermined image processing, such as color space conversion, interlace/progressive conversion, and gamma correction, on the display image output to the display unit 35.
- after connecting each unit of the endoscope apparatus 1 and turning on the power, the user operates the input I/F unit 37 to input information used in the processing related to the generation of the wide-area developed image.
- for example, by operating the input I/F unit 37, the user inputs the inner diameter φ of the pipe, which is a cylindrical subject having a central axis, the cutout reference position used when generating a locally developed image, and the matching accuracy used in the joining of locally developed images; the input information is stored in the storage unit 36.
- FIG. 5 is a diagram illustrating an example of a state where the insertion portion of the endoscope is inserted into the pipe.
- illumination light emitted through the illumination window 48 reaches the inner peripheral surface 101a of the pipe 101.
- FIG. 6 is a diagram for explaining regions IRB and IRS generated in the imaging region IR in the insertion state as shown in FIG.
- FIG. 7 is a diagram showing an example of an endoscopic image generated in the insertion state as shown in FIG.
- FIG. 8 is a diagram illustrating an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
- the display image 301 includes an annular endoscope image 201 and a developed image acquisition start button 202.
- the endoscopic image 201 is displayed live.
- the developed image acquisition start button 202 is configured as a GUI button capable of giving an instruction to start acquisition of a locally developed image used for generating a wide-area developed image, for example, in response to a user's touch operation.
- the user presses the developed image acquisition start button 202 displayed on the display unit 35 to give an instruction to start acquisition of locally developed images used for generating the wide-area developed image.
- based on an instruction made in response to an operation of the touch panel 35a, the CPU 38 starts acquiring locally developed images used for generating the wide-area developed image, starts measuring the insertion distance of the insertion unit 5, and performs an operation for generating a display image 302 as shown, for example, in FIG. 9 and outputting it to the display unit 35.
- FIG. 9 is a diagram illustrating an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
- the display image 302 includes an annular endoscope image 201, a developed image acquisition end button 203, and insertion distance information 204.
- the endoscopic image 201 is displayed live.
- the developed image acquisition end button 203 is configured as a GUI button capable of giving an instruction to end acquisition of a locally developed image used for generating a wide-area developed image, for example, in response to a user's touch operation.
- the insertion distance information 204 includes a character string for notifying the user of the insertion distance of the insertion unit 5 measured by the CPU 38.
- the insertion distance information 204 is updated substantially in real time according to the advance/retreat movement of the insertion unit 5 during the period from when the developed image acquisition start button 202 is pressed to when the developed image acquisition end button 203 is pressed. Therefore, at the timing immediately after the developed image acquisition start button 202 is pressed, insertion distance information 204 including a character string notifying that the insertion distance of the insertion unit 5 is 0, such as "insertion distance: 0 mm" in FIG. 9, is displayed on the display unit 35.
- after pressing the developed image acquisition start button 202, the user inserts the insertion unit 5 toward the back side of the pipe while checking the display image 302 displayed on the display unit 35.
- FIG. 10 is a flowchart for explaining an example of an operation related to generation and display of a wide area developed image performed in the endoscope apparatus according to the embodiment.
- the CPU 38 performs processing for developing one endoscopic image output from the image generation unit 34 and generating one locally developed image (step S1 in FIG. 10). Further, the CPU 38 adds, to the locally developed image generated by the process of step S1 in FIG. 10, information identifying it as the first image of the wide-area developed image (step S2 in FIG. 10), and then proceeds to the process of step S3 in FIG. 10.
- FIG. 11 is a diagram illustrating an example of an endoscopic image used for processing related to generation of a locally developed image.
- the CPU 38 acquires, in a polar coordinate system, coordinate values corresponding to each pixel position other than the non-imaging region included in the endoscopic image.
- the CPU 38 acquires coordinate values in an orthogonal coordinate system by applying a Jacobian matrix to the polar coordinate values acquired as described above, and rearranges the pixels included in the endoscopic image according to the acquired orthogonal coordinate values, thereby generating, for example, as shown in FIG. 12, a basic developed image IEA in which the endoscopic image is developed into a rectangular shape.
- FIG. 12 is a diagram illustrating an example of a basic developed image generated by developing the endoscopic image of FIG.
- the basic developed image IEA is generated as an image in which the direction parallel to the optical axis AC in the cylindrical imaging region IR, that is, the direction parallel to the central axis AS of the pipe 101, is the vertical direction (hereinafter also referred to as the Y-axis direction). The basic developed image IEA is also generated as an image in which the circumferential direction of the cylindrical imaging region IR, that is, the circumferential direction of the inner peripheral surface 101a, is the left-right direction (hereinafter also referred to as the X-axis direction). In the X-axis direction, the basic developed image IEA includes the inner peripheral surface 101a imaged in the cylindrical imaging region IR for one full round (360 degrees).
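The polar-to-rectangular development described above can be sketched as a simple remapping. The sketch below uses nearest-neighbour sampling in Python; the function name, the ring radii `r_in`/`r_out`, and the output width are illustrative assumptions, and the actual device applies the Jacobian-based rearrangement and excludes the non-imaging region:

```python
import numpy as np

def unwrap_annular(img, cx, cy, r_in, r_out, width=720):
    """Develop an annular image into a rectangular strip.

    X axis: circumferential direction (one full 360-degree turn).
    Y axis: radial direction, mapped to the pipe-axis direction.
    """
    height = int(r_out - r_in)
    out = np.zeros((height, width), dtype=img.dtype)
    for x in range(width):
        theta = 2.0 * np.pi * x / width       # azimuth angle
        for y in range(height):
            r = r_out - y                     # outer ring -> top row
            px = int(round(cx + r * np.cos(theta)))
            py = int(round(cy + r * np.sin(theta)))
            if 0 <= py < img.shape[0] and 0 <= px < img.shape[1]:
                out[y, x] = img[py, px]
    return out
```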
- the CPU 38 generates, for example, a rectangular combined image IJA as shown in FIG. 13 by combining two basic expanded images IEA generated as described above in the X-axis direction. That is, in the X-axis direction of the combined image IJA, the inner peripheral surface 101a imaged in the cylindrical imaging region IR is included for two rounds (for 720 degrees).
- FIG. 13 is a diagram illustrating an example of a combined image generated using the basic development image of FIG.
- the CPU 38 performs processing for generating a cutout image, that is, an image obtained by cutting out one round (360 degrees) of the combined image IJA, based on the information of the cutout reference position stored in the storage unit 36. Specifically, for example, when the cutout reference position stored in the storage unit 36 is the dark region DR, the CPU 38 cuts out one round (360 degrees) of the combined image IJA so that the dark region DR is arranged at both ends in the X-axis direction, thereby generating a rectangular cutout image IEB as shown in FIG. 14.
- FIG. 14 is a diagram illustrating an example of a cutout image generated by cutting out a part of the combined image in FIG.
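The combined-image and cutout steps above can be sketched as follows; this assumes a single-channel strip and, as a stand-in for the stored cutout reference position, takes the darkest circumferential column as the dark region DR:

```python
import numpy as np

def cut_one_round(developed, reference_col=None):
    """Join two copies of a 360-degree strip side by side, then cut
    one full round so the reference column appears at both edges.

    developed:     H x W strip covering exactly one turn (360 degrees)
    reference_col: index of the cut reference (e.g. the darkest
                   column); if None, estimated from column brightness
    """
    h, w = developed.shape[:2]
    joined = np.concatenate([developed, developed], axis=1)  # 720 degrees
    if reference_col is None:
        # assumption: the dark region is the dimmest circumferential column
        reference_col = int(np.argmin(developed.mean(axis=0)))
    # take W columns starting at the reference, so it sits at both ends
    return joined[:, reference_col:reference_col + w]
```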
- the CPU 38 estimates the direction in which the shadow area included in the endoscopic image used for generating the basic development image IEA is the maximum as the offset direction, and calculates the width WS of the shadow area in the offset direction (See FIG. 15). Further, the CPU 38 performs a process for calculating the offset amount ⁇ based on the width WS calculated as described above and the inner diameter ⁇ of the pipe 101 stored in the storage unit 36.
- FIG. 15 is a diagram for explaining the width WS in the offset direction of the shadow region included in the endoscopic image used for generating the basic development image.
- instead of the width WS of the shadow area in the offset direction, the CPU 38 may use, for example, the center pixel Dp of the dark region DR depicted in the endoscopic image used for generating the basic developed image IEA.
- the CPU 38 may also calculate the offset amount δ based on the width WS of the shadow region in the offset direction, the ratio value RV, and the inner diameter φ of the pipe 101.
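The exact relation between the shadow width WS, the inner diameter φ, and the offset amount δ is not reproduced in this excerpt. Purely for illustration, the sketch below assumes a hypothetical linear model in which the shadow width grows from its centered value up to twice that value as the optical axis approaches the wall:

```python
def estimate_offset(ws, ws_centered, phi):
    """Illustrative offset estimate (hypothetical linear model).

    ws:          measured shadow width in the offset direction
    ws_centered: shadow width expected with the optical axis on the
                 pipe axis (assumed known calibration value)
    phi:         inner diameter of the pipe
    Returns delta, clamped to the physical range [0, phi / 2].
    """
    # assumed model: ws = ws_centered at delta = 0, and
    # ws = 2 * ws_centered when the tip touches the wall (delta = phi / 2)
    delta = (ws / ws_centered - 1.0) * (phi / 2.0)
    return min(max(delta, 0.0), phi / 2.0)
```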
- the CPU 38 performs a process for calculating the distance WD (see FIG. 16) between the outermost periphery of the free-form surface lens 43 and the inner peripheral surface 101a over the entire circumference of the inner peripheral surface 101a.
- the outer diameters DA and DB are known parameters used in processing related to the generation of the wide area developed image, and are stored in advance in the storage unit 36 before the processing is performed, for example.
- FIG. 16 is a diagram showing an outline of parameters used in the processing related to the generation of the wide area developed image.
- the imaging magnification β(θ) when the subject located in the direction of the azimuth angle θ is imaged can be expressed as the following formula (1).
- in the following formula (1), f represents the focal length of the free-form surface lens 43, and Enp represents the distance between the outermost periphery of the free-form surface lens 43 and the entrance pupil position (see FIG. 16).
- f and Enp in formula (1) are known parameters used in the processing related to the generation of the wide-area developed image, and are stored in the storage unit 36 in advance, for example, before the processing is performed.
- based on the imaging magnification β(θ) calculated using formula (1) and a predetermined reference magnification βth, the CPU 38 performs a magnification correction process that corrects the width in the Y-axis direction at the position corresponding to the azimuth angle θ in the X-axis direction of the cutout image IEB by a factor of βth/β(θ).
- by this magnification correction process, for example, a pincushion-shaped cutout image IEC as shown in FIG. 17 is generated.
- FIG. 17 is a diagram illustrating an example when the cutout image of FIG. 14 is deformed by the magnification correction process.
- the CPU 38 generates a rectangular locally developed image IEL as shown in FIG. 18 by performing a process of cutting out the upper and lower portions of the cut-out image IEC in accordance with the width WM.
- FIG. 18 is a diagram illustrating an example of a locally developed image generated by excising a part of the deformed cutout image of FIG.
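The magnification correction and the subsequent excision to the common width WM can be sketched as a column-wise vertical rescale. Since formula (1) for β(θ) is not reproduced in this excerpt, it is passed in as a callable; the function names and the nearest-neighbour resampling are illustrative assumptions:

```python
import numpy as np

def magnification_correct(cutout, beta, beta_th):
    """Scale the height of each column x by beta_th / beta(theta(x)).

    cutout:  H x W developed image, X axis spanning 360 degrees
    beta:    callable giving the imaging magnification at azimuth
             theta (radians); stands in for formula (1)
    beta_th: predetermined reference magnification
    Returns a list of corrected columns (heights now differ per
    azimuth, yielding the pincushion shape of FIG. 17).
    """
    h, w = cutout.shape
    columns = []
    for x in range(w):
        theta = 2.0 * np.pi * x / w
        new_h = max(1, int(round(h * beta_th / beta(theta))))
        # nearest-neighbour vertical resample of the column
        idx = np.minimum((np.arange(new_h) * h) // new_h, h - 1)
        columns.append(cutout[idx, x])
    return columns

def excise_to_width(columns, wm):
    """Excise each corrected column to the common height WM
    (centered), yielding a rectangular locally developed image."""
    out = []
    for col in columns:
        start = (len(col) - wm) // 2
        out.append(col[start:start + wm])
    return np.stack(out, axis=1)
```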
- that is, the CPU 38 calculates the offset amount δ based on the inner diameter φ and the width WS and/or the ratio value RV, and generates the locally developed image IEL by performing a magnification correction process that corrects the imaging magnification in the Y-axis direction of the cutout image IEB in accordance with the offset amount δ.
- The CPU 38 performs the same processing as step S1 in FIG. 10 to develop one endoscopic image output from the image generation unit 34 and generate one locally developed image (step S3 in FIG. 10).
- The CPU 38 performs a determination process concerning whether or not the locally developed image generated by the process of step S3 in FIG. 10 is an image suitable for pasting (step S4 in FIG. 10).
- For example, by matching the feature points and feature amounts acquired from the locally developed image IEL1 stored in the storage unit 36 immediately before the process of step S3 in FIG. 10 against those acquired from the locally developed image IEL2 generated by the process of step S3 in FIG. 10, the CPU 38 calculates the ratio PA of the portion of the locally developed image IEL2 that overlaps the locally developed image IEL1 in the Y-axis direction to the width WM in the Y-axis direction.
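For illustration, once the Y-axis displacement between IEL1 and IEL2 has been estimated from the matched feature points (that estimation is not shown here), the ratio PA can be derived as in this sketch; the function name and the displacement-based formulation are assumptions, not the patent's exact procedure.

```python
def overlap_ratio(y_shift, width_wm):
    """Return the ratio PA (%) of the part of IEL2 overlapping IEL1.

    y_shift  : estimated displacement (pixels) of IEL2 relative to IEL1 along
               the Y axis, e.g. the median offset of matched feature points.
    width_wm : width WM of the locally developed images along the Y axis.
    """
    overlap = max(0.0, width_wm - abs(y_shift))  # no overlap once shifted past WM
    return 100.0 * overlap / width_wm
```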
- When the CPU 38 detects that the ratio PA calculated as described above does not belong to the threshold range TR, it obtains a determination result that the locally developed image IEL2 is an image not suitable for pasting.
- When the CPU 38 detects that the ratio PA calculated as described above belongs to the threshold range TR, it obtains a determination result that the locally developed image IEL2 is an image suitable for pasting.
- In other words, the CPU 38 functions as a determination unit that performs the determination process described above.
- The threshold range TR may be set, for example, to 20% or more and less than 50%. The threshold range TR may also be enlarged or reduced according to the matching accuracy stored in the storage unit 36; specifically, when the matching accuracy stored in the storage unit 36 is 100%, for example, the threshold range TR may be narrowed to the single value 20%.
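Using the example values above (TR set to 20% or more and less than 50%, collapsing to 20% alone at 100% matching accuracy), the suitability test could look like the following sketch; the linear shrinking of the range between those two stated endpoints is an assumed interpolation, since the description only gives the two endpoints.

```python
def suitable_for_pasting(pa, matching_accuracy=None,
                         tr_low=20.0, tr_high=50.0):
    """Return True when the overlap ratio PA belongs to the threshold range TR.

    When a matching accuracy (%) is stored, the upper bound is shrunk toward
    the lower one; at 100% accuracy TR collapses to the single value 20%.
    The linear interpolation between those endpoints is an assumption.
    """
    if matching_accuracy is not None:
        tr_high = tr_low + (tr_high - tr_low) * (100.0 - matching_accuracy) / 100.0
    if tr_high == tr_low:
        return pa == tr_low                  # collapsed range: exact match only
    return tr_low <= pa < tr_high            # 20% or more and less than 50%
```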
- When the CPU 38 obtains a determination result that the locally developed image generated by the process of step S3 in FIG. 10 is an image not suitable for pasting (S4: NO), it performs the operation of step S5 in FIG. 10, described later. When the CPU 38 obtains a determination result that the locally developed image generated by the process of step S3 in FIG. 10 is an image suitable for pasting (S4: YES), it performs the operation of step S8 in FIG. 10.
- The CPU 38 performs an operation for guiding the user according to the determination result of step S4 in FIG. 10 (step S5 in FIG. 10).
- For example, when the ratio PA calculated in step S4 of FIG. 10 is less than the lower limit of the threshold range TR, that is, when the overlapping portion of the locally developed image IEL2 with respect to the locally developed image IEL1 is insufficient or absent, the CPU 38 generates a character string or the like prompting the user to move the insertion unit 5 back from its current position and displays it on the display unit 35. Further, for example, when the ratio PA calculated in step S4 of FIG. 10 exceeds the upper limit of the threshold range TR, that is, when the overlapping portion of the locally developed image IEL2 with respect to the locally developed image IEL1 is excessive, the CPU 38 generates a character string or the like prompting the user to increase the insertion speed of the insertion unit 5 from its current speed and displays it on the display unit 35.
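The guidance of step S5 reduces to a comparison of PA against the bounds of TR, as in this sketch; the message strings are illustrative placeholders, not the device's actual display text.

```python
def guidance_message(pa, tr_low=20.0, tr_high=50.0):
    """Choose the user guidance displayed in step S5 from the ratio PA."""
    if pa < tr_low:
        # overlap insufficient or absent: the insertion unit moved too far forward
        return "Pull the insertion unit back from its current position."
    if pa >= tr_high:
        # overlap excessive: advance faster than the current speed
        return "Increase the insertion speed of the insertion unit."
    return None  # PA inside TR: no corrective guidance needed
```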
- After performing the operation of step S5 in FIG. 10, the CPU 38 performs the same process as step S1 in FIG. 10 m consecutive times, based on the number m of re-acquired endoscope images stored in the storage unit 36, thereby generating m locally developed images corresponding to the m endoscopic images output from the image generation unit 34 (step S6 in FIG. 10).
- The CPU 38 performs a determination process similar to that of step S4 in FIG. 10 on each of the m locally developed images generated by the process of step S6 in FIG. 10, thereby determining whether an image suitable for pasting exists among them (step S7 in FIG. 10).
- When the CPU 38 obtains a determination result that no image suitable for pasting exists among the m locally developed images generated by the process of step S6 in FIG. 10 (S7: NO), it performs the operations of steps S5 and S6 in FIG. 10 again.
- When the CPU 38 obtains a determination result that an image suitable for pasting exists among the m locally developed images generated by the process of step S6 in FIG. 10 (S7: YES), it performs the operation of step S8 in FIG. 10.
- In other words, the CPU 38 performs the operation relating to the generation of a locally developed image again when the determination condition of step S4 or step S7 in FIG. 10 is not satisfied.
- When the CPU 38 detects in step S7 of FIG. 10 that there are a plurality of locally developed images suitable for pasting, it selects, for example, the one locally developed image whose ratio PA is closest to a predetermined value included in the threshold range TR, and then proceeds to the operation of step S8 in FIG. 10.
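Selecting one image among several suitable candidates then amounts to a simple minimization, as sketched below; `pa_target` is a hypothetical name for the predetermined value included in the threshold range TR, which the description does not specify.

```python
def select_candidate(candidates, pa_target=35.0):
    """Pick the locally developed image whose ratio PA is closest to pa_target.

    candidates : list of (image, pa) pairs already judged suitable for pasting.
    pa_target  : predetermined value inside the threshold range TR (assumed).
    """
    image, _ = min(candidates, key=lambda c: abs(c[1] - pa_target))
    return image
```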
- The CPU 38 adds, to the locally developed image satisfying the determination condition of step S4 or step S7 in FIG. 10, information identifying it as the n-th (2 ≤ n) image of the wide-area developed image, stores it in the storage unit 36 (step S8 in FIG. 10), and then performs an operation for generating a character string or the like requesting the user to maintain the insertion state of the insertion unit 5 and displaying it on the display unit 35 (step S9 in FIG. 10). That is, in step S8 of FIG. 10, the CPU 38 performs an operation for storing the locally developed images satisfying the determination condition of step S4 or step S7 in FIG. 10 in the storage unit 36 in time series.
- In step S9 of FIG. 10, for example, the CPU 38 generates a character string or the like instructing the user to continue inserting the insertion unit 5 while maintaining the current insertion speed (and insertion direction), and performs an operation for displaying it on the display unit 35.
- The CPU 38 then performs a determination process concerning whether or not the acquisition of locally developed images should be ended (step S10 in FIG. 10).
- When the CPU 38 does not detect an instruction in response to the pressing of the developed image acquisition end button 203 and detects that the storage unit 36 has free space capable of storing at least one locally developed image (S10: NO), it performs the operations from step S3 in FIG. 10 again.
- In this way, a plurality of locally developed images used for generating the wide-area developed image are stored in the storage unit 36 sequentially, in order from the first.
- When the CPU 38 detects that an instruction in response to the pressing of the developed image acquisition end button 203 has been issued, or detects that the storage unit 36 does not have free space capable of storing at least one locally developed image (S10: YES), it stops measuring the insertion distance of the insertion unit 5 and performs a process of pasting together the locally developed images stored in the storage unit 36 to generate one wide-area developed image (step S11 in FIG. 10). That is, when the CPU 38 finishes the operation for storing the locally developed images satisfying the determination condition of step S4 or step S7 of FIG. 10 in the storage unit 36 in time series, it performs an operation for generating one wide-area developed image by pasting together the locally developed images stored in the storage unit 36 and causing the display unit 35 to display the generated wide-area developed image.
- For example, the CPU 38 calculates a homography matrix for each pair of adjacently numbered images among the first through p-th (2 ≤ p) locally developed images IEL stored in the storage unit 36, and pastes them together accordingly, whereby a single wide-area developed image IEW as shown in FIG. 19 is generated.
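Pasting the first through p-th locally developed images via homographies between adjacently numbered pairs amounts to composing those pairwise transforms into each image's mapping onto the first image's frame. The sketch below shows only that composition step with NumPy; estimating each pairwise homography from matched features, and the final warping and blending, are omitted.

```python
import numpy as np

def compose_to_first(pairwise_h):
    """Compose pairwise homographies into transforms to the first image's frame.

    pairwise_h : list of 3x3 arrays, pairwise_h[k] mapping image k+2 onto
                 image k+1 (adjacent pairs, as in the description).
    Returns a list whose k-th entry maps image k+1 onto image 1.
    """
    chain = [np.eye(3)]                 # image 1 maps to itself
    for h in pairwise_h:
        chain.append(chain[-1] @ h)     # accumulate adjacent transforms
    return chain
```

Each entry of the returned chain can then be used to warp the corresponding locally developed image into the first image's coordinate frame before blending.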
- FIG. 19 is a diagram illustrating an example of a wide area developed image generated by pasting together a plurality of locally developed images.
- FIG. 20 is a diagram illustrating an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
- the display image 303 includes insertion distance information 204, a rectangular wide area developed image 205, and a live image display button 206.
- the insertion distance information 204 of the display image 303 includes a character string for notifying the user of the insertion distance of the insertion unit 5 at the timing when the measurement by the CPU 38 is stopped.
- The live image display button 206 is configured, for example, as a GUI button that can issue, in response to a user's touch operation, an instruction to cancel the display of the wide-area developed image 205 and display the endoscopic image 201 live.
- the CPU 38 performs a determination process related to whether or not to end the display of the wide area developed image (step S12 in FIG. 10).
- the CPU 38 stands by while performing an operation for outputting the display image 303 to the display unit 35 until an instruction corresponding to the pressing of the live image display button 206 is detected (S12: NO).
- When the CPU 38 detects that an instruction corresponding to the pressing of the live image display button 206 has been issued (S12: YES), it stores the wide-area developed image generated in step S11 of FIG. 10, performs an operation for displaying the display image 301 again instead of the display image 303, and then ends the series of processes relating to the generation and display of the wide-area developed image.
- Note that the CPU 38 is not limited to generating the wide-area developed image and displaying it on the display unit 35 after the storage of locally developed images in the storage unit 36 is completed.
- The CPU 38 may instead perform an operation for generating the wide-area developed image and displaying it on the display unit 35 while still storing locally developed images in the storage unit 36.
- FIG. 21 is a diagram illustrating an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
- The display image 304a includes an endoscope image 201, a developed image acquisition end button 203, and insertion distance information 204 substantially the same as those included in the display image 302. According to the display image 304a, each time a locally developed image satisfying the determination condition of step S4 or step S7 in FIG. 10 is acquired, the wide-area developed image 211 obtained by pasting the locally developed images together is displayed on the display unit 35. In addition, as indicated by the alternate long and short dash line in FIG. 21, the display area of the wide-area developed image 211 gradually expands each time a locally developed image satisfying that determination condition is acquired.
- FIG. 22 is a diagram illustrating an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
- The display image 304b includes a developed image acquisition end button 203 and insertion distance information 204 substantially the same as those included in the display image 302. According to the display image 304b, each time a locally developed image 221 satisfying the determination condition of step S4 or step S7 in FIG. 10 is acquired, both the locally developed image 221 and the wide-area developed image 211 obtained by pasting it together with the preceding images are displayed on the display unit 35. In addition, as indicated by the one-dot chain line in FIG. 22, the display area of the wide-area developed image 211 gradually expands each time a locally developed image satisfying that determination condition is acquired.
- As described above, according to the present embodiment, the offset amount δ is calculated based on the endoscopic image output from the image generation unit 34, and a locally developed image and a wide-area developed image are generated while the magnification correction process according to the calculated offset amount δ is performed. Therefore, according to the present embodiment, a developed image can be generated without using, for example, a centering mechanism configured to insert the insertion portion 5 into the pipe 101 while keeping the optical axis AC aligned with the center axis AS.
- In addition, since a developed image can be generated without providing the centering mechanism described above in the insertion portion 5, the state of the inside of, for example, a small-diameter pipe matching the outer diameter of the insertion portion 5 can be confirmed with a wide-area developed image.
- Furthermore, according to the present embodiment, by repeatedly performing steps S5 to S7 in FIG. 10, the n-th locally developed image suitable for pasting to the (n−1)-th locally developed image can be reliably acquired. Therefore, according to the present embodiment, an accurate wide-area developed image can be generated even when, for example, the insertion state of the insertion portion 5 inserted into the pipe 101 is temporarily disturbed.
Abstract
An observation device comprises: an illumination unit disposed inside a cylindrical sample having a central axis and emitting illumination light toward a cylindrical illumination region centered on a prescribed optical axis; an imaging unit disposed near the illumination unit inside the sample, imaging a cylindrical imaging region in the sample centered on the prescribed optical axis, and outputting the captured signal; an image generation unit for generating an annular image centered on the prescribed optical axis on the basis of the captured signal; and an image processing unit for calculating a prescribed offset of the optical axis relative to the central axis on the basis of the inner diameter of the sample and a value acquired from the annular image, and generating a locally developed image by correcting the imaging magnification along a direction parallel to the central axis in an image acquired by developing the annular image in accordance with the offset.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
JP2017010126 | 2017-01-24 | |
JP2017-010126 | 2017-01-24 | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018139025A1 true WO2018139025A1 (fr) | 2018-08-02 |
Family
ID=62979131
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
PCT/JP2017/041964 (WO2018139025A1) | Observation device | 2017-01-24 | 2017-11-22
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2018139025A1 (fr) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
JPH03132590A * | 1989-10-18 | 1991-06-05 | Okumura Corp | Apparatus for creating a developed image of a pit wall
JP2012163619A * | 2011-02-03 | 2012-08-30 | Olympus Corp | Omnidirectional observation optical system and omnidirectional observation system including the same
JP2017015836A * | 2015-06-29 | 2017-01-19 | Olympus Corp | Omnidirectional illumination optical member, endoscope omnidirectional illumination optical system including the same, and omnidirectional observation endoscope
- 2017-11-22: WO PCT/JP2017/041964 (WO2018139025A1), active, Application Filing
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17894394; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 17894394; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: JP