
WO2018139025A1 - Observation device - Google Patents

Observation device

Info

Publication number
WO2018139025A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
developed image
cpu
locally
Prior art date
Application number
PCT/JP2017/041964
Other languages
French (fr)
Japanese (ja)
Inventor
Mao Sasaki
Susumu Takahashi
Original Assignee
Olympus Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corporation
Publication of WO2018139025A1

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes, combined with photographic or television appliances
    • A61B 1/045 — Control thereof
    • G — PHYSICS
    • G02 — OPTICS
    • G02B — OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 23/00 — Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B 23/24 — Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 — General purpose image data processing

Definitions

  • the present invention relates to an observation apparatus.
  • In the related art, an image projected on a convex mirror provided at the center of a horizontal shaft having a substantially circular cross section is captured, and basic images are thereby acquired sequentially.
  • A configuration is disclosed in which a developed image of the inner wall surface of the horizontal shaft is generated by decomposing each basic image into a plurality of concentric annular images having different radii, developing the plurality of annular images into linear images, and arranging them sequentially.
  • In that configuration, a basic image used for generating the developed image is acquired in a state in which the central axis of the imaging device provided with the convex mirror and the central axis of the horizontal shaft imaged by the imaging device are substantially matched by components such as a suspension arm.
  • The present invention has been made in view of the above-described circumstances, and an object thereof is to provide an observation apparatus capable of generating an accurate developed image without using a centering mechanism.
  • The observation apparatus includes an illumination unit that is arranged inside a cylindrical subject having a central axis and is configured to emit illumination light to a cylindrical illumination region centered on a predetermined optical axis.
  • It also includes an imaging unit that is provided in the vicinity of the illumination unit, is arranged inside the subject, and generates an imaging signal by imaging the subject in a cylindrical imaging region centered on the predetermined optical axis.
  • It further includes an image generating unit configured to generate, based on the imaging signal, an annular image having the predetermined optical axis as the center of the image.
  • An offset amount of the predetermined optical axis with respect to the central axis is calculated based on the inner diameter of the subject and a value acquired from the annular image, and an image obtained by developing the annular image is corrected in accordance with the offset amount.
  • A diagram showing an example of a display image generated in the insertion state.
  • A diagram showing an example of a cut-out image generated from the developed image.
  • A diagram showing an outline of the developed image.
  • A diagram showing an example of a display image generated from the developed image.
  • the endoscope apparatus 1 has a function as an observation apparatus for observing the inside of a cylindrical subject having a central axis such as a pipe.
  • the endoscope apparatus 1 includes, for example, an endoscope 2 and an apparatus main body 3 to which the endoscope 2 can be connected as shown in FIG.
  • the apparatus body 3 is provided with a display unit 35 that can display an image or the like.
  • the endoscope 2 includes an insertion portion 5 formed to have an elongated shape that can be inserted into a cylindrical subject having a central axis, and an operation portion 6 provided on the proximal end side of the insertion portion 5. And a universal cord 7 extending from the operation unit 6. Further, the endoscope 2 is configured to be detachably connected to the apparatus main body 3 via the universal cord 7. A light guide 21 (not shown in FIG. 1) for transmitting illumination light supplied from the apparatus main body 3 is inserted into the endoscope 2.
  • the insertion portion 5 is configured by sequentially providing a distal end portion 11, a bending portion 12 formed to be bendable, and a long flexible tube portion 13 having flexibility from the distal end side.
  • the operation unit 6 is provided with a bending joystick 6a configured to perform an operation for bending the bending unit 12 in a desired direction.
  • The operation unit 6 is also provided with operation buttons corresponding to functions available in the endoscope apparatus 1, such as a freeze button, a bending lock button, and a recording instruction button.
  • An optical adapter 40, including one or more optical members corresponding to the imaging region used when the inside of the cylindrical subject is imaged with the endoscope 2, can be detachably attached to the distal end portion 11. In the present embodiment, it is assumed that an optical adapter 40 configured to have a cylindrical (360-degree) imaging region centered on an optical axis AC (described later) is attached to the distal end portion 11.
  • FIG. 2 is a diagram for explaining a configuration when an optical adapter is attached to the distal end portion of the insertion portion of the endoscope.
  • The image sensor 24 may be configured to include an image sensor such as a CCD or a CMOS. Further, in the present embodiment, a structure having a shape different from the uneven portion 25 may be formed on the side surface of the distal end portion 11, as long as it can engage with the mounting member 49 of the optical adapter 40.
  • The optical adapter 40 includes a cover member 41, an observation window 42, a free-form surface lens 43, a frame member 44, a lens holder 45, a lens unit 46, a light diffusing element 47, an illumination window 48, and a mounting member 49.
  • the cover member 41 is formed of, for example, a disk-shaped member having a light shielding property. Further, the cover member 41 is fixedly disposed at a position so as to cover the front surface of the free-form surface lens 43.
  • The observation window 42 is formed of, for example, a cylindrical member having translucency, such as a sapphire pipe, and is configured to transmit light incident from the outside of the optical adapter 40 so that the light is emitted to the free-form surface lens 43.
  • The observation window 42 is located between the cover member 41 and the frame member 44, and is fixedly disposed at a position that covers the lens side surface of the free-form surface lens 43.
  • the observation window 42 is formed with an outer diameter smaller than the outer diameter of a flange 44f (described later) of the frame member 44.
  • The free-form surface lens 43 is located between the cover member 41 and the frame member 44 in the optical adapter 40, and is fixedly disposed at a position where light emitted through the observation window 42 is incident on its interior from the lens side surface.
  • The free-form surface lens 43 is formed to have optical characteristics such that light incident on its interior from the lens side surface is refracted or reflected by the front surface of the lens and emitted to the lens unit 46.
  • The frame member 44 is formed by combining, for example, a cylindrical member having a light shielding property and a flange 44f that shields the front surface of the light diffusing element 47. Further, the frame member 44 is configured such that a part of the cylindrical portion 45a of the lens holder 45 can be fitted from the opening on the flange 44f side.
  • the lens holder 45 has a cylindrical portion 45a having an inner diameter capable of holding the lens unit 46, and an enlarged diameter portion 45b having an inner diameter larger than the inner diameter of the cylindrical portion 45a.
  • the cylindrical portion 45a is formed so as to hold the lens unit 46 in a state where the central axis of the optical adapter 40 and the central axis of the lens unit 46 are matched.
  • the enlarged diameter portion 45b is provided on the rear end side of the cylindrical portion 45a and has an inner diameter capable of accommodating a portion including the distal end surface of the distal end portion 11.
  • The enlarged diameter portion 45b is formed in a shape that, when the optical adapter 40 is attached to the distal end portion 11, brings the light exit surface of the lens unit 46 into contact with the light entrance surface of the imaging lens 23, and brings the light entrance surface of the light diffusing element 47 into contact with the light exit surface of the illumination lens 22.
  • An attachment member 49 is provided on the rear end side of the enlarged diameter portion 45b.
  • the lens unit 46 includes one or more lenses.
  • the lens unit 46 is configured to emit light incident through the free-form surface lens 43 to the imaging lens 23 when the optical adapter 40 is attached to the distal end portion 11.
  • When the optical adapter 40 is attached to the distal end portion 11, the central axis of the imaging lens 23, the central axis of the optical adapter 40, the central axis of the free-form surface lens 43, and the central axis of the lens unit 46 are located on the same optical axis AC.
  • the light diffusing element 47 has, for example, a substantially cylindrical shape, and is fixedly disposed between the frame member 44 and the lens holder 45 in the optical adapter 40 in a state of being fitted into the cylindrical portion 45a.
  • The light diffusing element 47 is configured to diffuse the illumination light incident through the light exit surface of the illumination lens 22 and emit it to the side when the optical adapter 40 is attached to the distal end portion 11.
  • The illumination window 48 is formed of, for example, a cylindrical member having translucency, such as a sapphire pipe, and is configured to transmit the illumination light diffused by the light diffusing element 47 so that the light is emitted outside the optical adapter 40.
  • The illumination window 48 is located between the frame member 44 and the lens holder 45, and is fixedly disposed at a position that covers the side surface of the light diffusing element 47.
  • The illumination window 48 is formed to have the same outer diameter as the outer diameter of the flange 44f of the frame member 44.
  • the mounting member 49 has, for example, a substantially cylindrical shape, and is formed by providing a groove that can be engaged with the concavo-convex portion 25 on the inner peripheral surface.
  • The optical adapter 40 is configured to have a cylindrical (360-degree) illumination region centered on the optical axis AC. Further, according to the configuration of the optical adapter 40 as described above, for example, as schematically shown in FIG. 3, the illumination region LR of the illumination light emitted to the outside of the optical adapter 40 through the illumination window 48 becomes wider than the imaging region IR in which light incident on the inside of the optical adapter 40 through the observation window 42 is imaged. Further, according to this configuration, the illumination region LR and the imaging region IR are shifted relative to each other along the optical axis AC (see FIG. 3).
  • FIG. 3 is a schematic diagram for explaining the illumination region LR and the imaging region IR when the optical adapter is attached to the distal end portion of the insertion portion of the endoscope.
  • the illumination unit in the present embodiment includes a light diffusing element 47 and an illumination window 48.
  • the imaging unit in the present embodiment is provided in the vicinity of the illumination unit, and includes the imaging device 24, the observation window 42, and the free-form surface lens 43.
  • The apparatus main body 3 includes a light source unit 31, a light source drive unit 32, an image sensor drive unit 33, an image generation unit 34, a display unit 35, a storage unit 36, an input I/F (interface) unit 37, and a CPU 38.
  • the apparatus body 3 is provided with a connection port (not shown) for connecting a portable external storage device 51 such as a USB memory.
  • FIG. 4 is a block diagram for explaining the configuration of the endoscope apparatus according to the embodiment.
  • the light source unit 31 includes, for example, an LED or a lamp.
  • the light source unit 31 is configured to be turned on or off in accordance with a light source drive signal output from the light source drive unit 32.
  • the light source unit 31 is configured to supply, for example, white light having a light amount corresponding to a light source drive signal output from the light source drive unit 32 to the light guide 21 as illumination light.
  • the light source driving unit 32 includes, for example, a light source driving circuit. Further, the light source driving unit 32 is configured to generate and output a light source driving signal for driving the light source unit 31 in accordance with the control of the CPU 38.
  • the image sensor driving unit 33 includes, for example, an image sensor driving circuit.
  • the image sensor driving unit 33 is configured to generate and output an image sensor driving signal for driving the image sensor 24 under the control of the CPU 38.
  • the image generation unit 34 is configured by an integrated circuit such as an FPGA (Field Programmable Gate Array). In addition, the image generation unit 34 performs predetermined signal processing on the imaging signal output from the imaging device 24 to generate an annular endoscope image with the optical axis AC as the center of the image, The generated endoscopic images are sequentially output to the CPU 38.
  • the display unit 35 includes, for example, a liquid crystal panel.
  • the display unit 35 is configured to display the display image output from the CPU 38 on the display screen.
  • The display unit 35 includes a touch panel 35a that detects a touch operation on a GUI (Graphical User Interface) button or the like displayed on the display screen and outputs an instruction corresponding to the detected touch operation to the CPU 38.
  • the storage unit 36 includes, for example, a storage circuit such as a memory.
  • The storage unit 36 stores various programs used for the operation of the CPU 38, such as a program used for controlling each unit of the endoscope apparatus 1 and a program used for processing related to generation of a wide-area developed image (described later).
  • the storage unit 36 is configured to store a plurality of locally developed images (described later) generated in the course of processing related to the generation of the wide area developed image by the CPU 38.
  • the storage unit 36 is configured to be able to store information input in response to an operation of the input I / F unit 37.
  • the storage unit 36 stores parameters used in processing related to generation of a wide area developed image.
  • the input I / F unit 37 is configured to include a switch or the like that can instruct the CPU 38 according to a user's input operation.
  • the input I / F unit 37 is configured to be able to input information used in processing related to generation of a wide area developed image by the CPU 38 in accordance with a user operation.
  • the CPU 38 is configured to be able to control the light source driving unit 32 and the image sensor driving unit 33 based on an instruction made in response to an operation of the touch panel 35a or the input I / F unit 37.
  • the CPU 38 includes a timer (not shown) for measuring time.
  • the CPU 38 is configured to measure the insertion distance of the insertion portion 5 based on, for example, an output signal from an acceleration sensor (not shown) provided at the distal end portion 11.
  • The CPU 38 is configured to generate a display image in which a GUI button or the like is superimposed on an image such as an endoscopic image output from the image generation unit 34, and to output the display image to the display unit 35.
  • The CPU 38 is configured to be able to perform an operation for storing information input in accordance with an operation of the input I/F unit 37 in the storage unit 36. Further, the CPU 38 has a function as an image processing unit: it generates locally developed images by developing, one by one, the annular endoscopic images sequentially output from the image generating unit 34, and performs processing for generating a wide-area developed image by pasting together the plurality of generated locally developed images. Details of this processing will be described later. Further, the CPU 38 is configured to be able to store the locally developed images generated in the process of generating the wide-area developed image in the storage unit 36.
  • The CPU 38 is configured to be able to perform an operation for storing the wide-area developed image generated as described above in the external storage device 51. Further, the CPU 38 is configured to be able to encode the endoscopic image output from the image generation unit 34 using a still image format such as JPEG or a moving image format such as MPEG-4, and to store the encoded image in the external storage device 51. Further, based on an instruction made in response to an operation of the touch panel 35a or the input I/F unit 37, the CPU 38 can read an image stored in the external storage device 51, generate a display image corresponding to the read image, and output it to the display unit 35. The CPU 38 is configured to perform predetermined image processing, such as color space conversion, interlace/progressive conversion, and gamma correction, on the display image output to the display unit 35.
  • the user inputs information used in the processing related to the generation of the wide area developed image by operating the input I / F unit 37 after connecting each unit of the endoscope apparatus 1 and turning on the power.
  • By operating the input I/F unit 37, the user inputs, for example, the inner diameter φ of the pipe (a cylindrical subject having a central axis), the cutout reference position used when generating a locally developed image, the matching accuracy used when joining locally developed images, and the handling of locally developed images that are not suitable for joining; this information is stored in the storage unit 36.
  • FIG. 5 is a diagram illustrating an example of a state where the insertion portion of the endoscope is inserted into the pipe.
  • illumination light emitted through the illumination window 48 reaches the inner peripheral surface 101a of the pipe 101.
  • FIG. 6 is a diagram for explaining regions IRB and IRS generated in the imaging region IR in the insertion state as shown in FIG.
  • FIG. 7 is a diagram showing an example of an endoscopic image generated in the insertion state as shown in FIG.
  • FIG. 8 is a diagram illustrating an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
  • the display image 301 includes an annular endoscope image 201 and a developed image acquisition start button 202.
  • the endoscopic image 201 is displayed live.
  • the developed image acquisition start button 202 is configured as a GUI button capable of giving an instruction to start acquisition of a locally developed image used for generating a wide-area developed image, for example, in response to a user's touch operation.
  • the user presses a development image acquisition start button 202 displayed on the display unit 35 to start acquisition of a local development image used for generating a wide-area development image. Give instructions to do so.
  • the CPU 38 starts acquiring a locally developed image used for generating a wide area developed image based on an instruction made in response to an operation of the touch panel 35a, starts measuring the insertion distance of the insertion unit 5, and, for example, in FIG. An operation for generating a display image 302 as shown and outputting it to the display unit 35 is performed.
  • FIG. 9 is a diagram illustrating an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
  • the display image 302 includes an annular endoscope image 201, a developed image acquisition end button 203, and insertion distance information 204.
  • the endoscopic image 201 is displayed live.
  • the developed image acquisition end button 203 is configured as a GUI button capable of giving an instruction to end acquisition of a locally developed image used for generating a wide-area developed image, for example, in response to a user's touch operation.
  • the insertion distance information 204 includes a character string for notifying the user of the insertion distance of the insertion unit 5 measured by the CPU 38.
  • The insertion distance information 204 is updated substantially in real time according to the advance/retreat movement of the insertion unit 5 during the period from when the developed image acquisition start button 202 is pressed to when the developed image acquisition end button 203 is pressed. Therefore, immediately after the developed image acquisition start button 202 is pressed, insertion distance information 204 including a character string notifying that the insertion distance of the insertion unit 5 is 0, such as “insertion distance: 0 mm” in FIG. 9, is displayed on the display unit 35.
  • After pressing the developed image acquisition start button 202, the user inserts the insertion unit 5 toward the back side of the pipe while checking the display image 302 displayed on the display unit 35.
  • FIG. 10 is a flowchart for explaining an example of an operation related to generation and display of a wide area developed image performed in the endoscope apparatus according to the embodiment.
  • The CPU 38 performs processing for developing one endoscopic image output from the image generating unit 34 and generating one locally developed image (step S1 in FIG. 10). Further, the CPU 38 adds, to the locally developed image generated by the process of step S1, information identifying it as the first image of the wide-area developed image (step S2 in FIG. 10), and then proceeds to the process of step S3 in FIG. 10.
  • FIG. 11 is a diagram illustrating an example of an endoscopic image used for processing related to generation of a locally developed image.
  • The CPU 38 acquires, in a polar coordinate system, coordinate values corresponding to each pixel position other than the non-imaging region included in the endoscopic image.
  • the CPU 38 acquires the coordinate value of the orthogonal coordinate system by applying the Jacobian matrix to the coordinate value of the polar coordinate system acquired as described above, and the endoscopic image according to the acquired coordinate value of the orthogonal coordinate system By rearranging the pixels included in the basic image, for example, as shown in FIG. 12, a basic expanded image IEA in which the endoscope image is expanded in a rectangular shape is generated.
  • FIG. 12 is a diagram illustrating an example of a basic developed image generated by developing the endoscopic image of FIG.
  • The basic developed image IEA is generated as an image in which the direction parallel to the optical axis AC in the cylindrical imaging region IR, that is, the direction parallel to the central axis AS of the pipe 101, is the vertical direction (hereinafter also referred to as the Y-axis direction). The basic developed image IEA is also generated as an image in which the circumferential direction of the cylindrical imaging region IR, that is, the circumferential direction of the inner peripheral surface 101a, is the left-right direction (hereinafter also referred to as the X-axis direction). In the X-axis direction of the basic developed image IEA, the inner peripheral surface 101a imaged in the cylindrical imaging region IR is included for one round (360 degrees).
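The development of the annular endoscopic image into the rectangular basic developed image can be sketched as follows. This is a minimal, illustrative inverse polar mapping in Python for a grayscale image; the function name, the ring radii `r_inner`/`r_outer`, and the angular resolution `n_theta` are assumptions for illustration, and nearest-neighbor sampling stands in for the Jacobian-based pixel rearrangement described above.

```python
import numpy as np

def unwrap_annular_image(annular, r_inner, r_outer, n_theta=720):
    """Develop an annular (ring-shaped) grayscale image into a rectangular
    strip: rows = radial direction (Y axis), columns = azimuth (X axis)."""
    cy, cx = annular.shape[0] / 2.0, annular.shape[1] / 2.0  # ring center
    n_r = int(r_outer - r_inner)                             # one row per radius step
    thetas = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    radii = np.linspace(r_inner, r_outer, n_r, endpoint=False)
    rr, tt = np.meshgrid(radii, thetas, indexing="ij")
    # For each (radius, angle) output pixel, look up the source pixel it maps to.
    src_y = np.clip((cy + rr * np.sin(tt)).astype(int), 0, annular.shape[0] - 1)
    src_x = np.clip((cx + rr * np.cos(tt)).astype(int), 0, annular.shape[1] - 1)
    return annular[src_y, src_x]
```

One full turn (360 degrees) of the inner peripheral surface ends up along the strip's X axis, matching the description of the basic developed image IEA.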
  • the CPU 38 generates, for example, a rectangular combined image IJA as shown in FIG. 13 by combining two basic expanded images IEA generated as described above in the X-axis direction. That is, in the X-axis direction of the combined image IJA, the inner peripheral surface 101a imaged in the cylindrical imaging region IR is included for two rounds (for 720 degrees).
  • FIG. 13 is a diagram illustrating an example of a combined image generated using the basic development image of FIG.
  • The CPU 38 performs processing for generating a cut-out image, that is, an image obtained by cutting out one round (360 degrees) of the combined image IJA, based on the information on the cutout reference position stored in the storage unit 36. Specifically, for example, when the cutout reference position stored in the storage unit 36 is the dark region DR, the CPU 38 cuts out one round (360 degrees) of the combined image IJA so that the dark region DR is arranged at both ends in the X-axis direction, thereby generating a rectangular cut-out image IEB as shown in FIG. 14.
  • FIG. 14 is a diagram illustrating an example of a cutout image generated by cutting out a part of the combined image in FIG.
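The combining-and-cutout step above can be sketched in Python as follows; this is a minimal version under the assumption that the dark region DR can be located as the column of minimum mean intensity (both the function name and that heuristic are illustrative, not taken from the patent).

```python
import numpy as np

def cut_out_at_dark_region(developed):
    """From a developed image covering one turn (360 deg) along its columns,
    build the doubled image (720 deg) and cut one turn starting at the
    dark region, so that DR ends up at the strip's ends."""
    h, w = developed.shape[:2]
    joined = np.concatenate([developed, developed], axis=1)  # combined image, two turns
    dark_col = int(np.argmin(developed.mean(axis=0)))        # locate the dark region
    # Cutting w columns starting at dark_col wraps exactly once around;
    # a dark region wider than one column is split between the two ends.
    return joined[:, dark_col:dark_col + w]
```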
  • the CPU 38 estimates the direction in which the shadow area included in the endoscopic image used for generating the basic development image IEA is the maximum as the offset direction, and calculates the width WS of the shadow area in the offset direction (See FIG. 15). Further, the CPU 38 performs a process for calculating the offset amount ⁇ based on the width WS calculated as described above and the inner diameter ⁇ of the pipe 101 stored in the storage unit 36.
  • FIG. 15 is a diagram for explaining the width WS in the offset direction of the shadow region included in the endoscopic image used for generating the basic development image.
  • Instead of the width WS of the shadow region in the offset direction, the CPU 38 may use, for example, the position of the center pixel Dp in the dark region DR depicted in the endoscopic image used for generating the basic developed image IEA.
  • the CPU 38 may calculate the offset amount ⁇ based on the width WS of the shadow region in the offset direction, the ratio value RV, and the inner diameter ⁇ of the pipe 101. Good.
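The text does not reproduce the actual relation between the shadow width WS, the inner diameter φ, and the offset amount δ, so the following Python sketch only illustrates the shape of such a computation with an assumed linear mapping; the function name, parameters, and the formula itself are hypothetical.

```python
def estimate_offset(shadow_width_px, strip_width_px, pipe_inner_diameter_mm):
    """Illustrative only: map the shadowed fraction of the circumference
    linearly onto the pipe radius to get an offset estimate in mm.
    The patent's actual WS/phi -> delta relation is not reproduced here."""
    shadow_fraction = shadow_width_px / strip_width_px  # fraction of 360 deg in shadow
    radius_mm = pipe_inner_diameter_mm / 2.0
    return shadow_fraction * radius_mm
```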
  • The CPU 38 performs a process for calculating, over the entire circumference of the inner peripheral surface 101a, the distance WD (see FIG. 16) between the outermost periphery of the free-form surface lens 43 and the inner peripheral surface 101a.
  • the outer diameters DA and DB are known parameters used in processing related to the generation of the wide area developed image, and are stored in advance in the storage unit 36 before the processing is performed, for example.
  • FIG. 16 is a diagram showing an outline of parameters used in the processing related to the generation of the wide area developed image.
  • The imaging magnification β(θ) when the subject located in the direction of the azimuth angle θ is imaged can be expressed as the following formula (1).
  • In the following formula (1), f represents the focal length of the free-form surface lens 43, and Enp represents the distance between the outermost periphery of the free-form surface lens 43 and the entrance pupil position (see FIG. 16).
  • f and Enp are known parameters used in the processing related to the generation of the wide-area developed image, and are stored in the storage unit 36 in advance before the processing is performed.
  • Based on the imaging magnification β(θ) calculated using formula (1) and a predetermined reference magnification βth, the CPU 38 performs a magnification correction process that corrects the width in the Y-axis direction at the position corresponding to the azimuth angle θ in the X-axis direction of the cut-out image IEB by a factor of βth/β(θ).
  • By this magnification correction process, for example, a pincushion-shaped cut-out image IEC as shown in FIG. 17 is generated.
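The per-column magnification correction can be sketched as follows. Formula (1) itself is not reproduced in this text, so the sketch assumes a simple working-distance model β(θ) = f / (WD(θ) + Enp); that model, the function name, and the resampling scheme are assumptions for illustration.

```python
import numpy as np

def magnification_correct(cut_out, wd_per_column, f, enp, beta_th):
    """Rescale each column's height by beta_th / beta(theta), where
    beta(theta) = f / (WD(theta) + Enp) is an assumed magnification model.
    Returns one 1-D array per column (ragged: the pincushion shape IEC)."""
    h, w = cut_out.shape[:2]
    corrected = []
    for x in range(w):
        beta = f / (wd_per_column[x] + enp)
        new_h = max(1, int(round(h * beta_th / beta)))
        # Nearest-neighbor vertical resample of this column to its new height.
        idx = np.clip((np.arange(new_h) * h / new_h).astype(int), 0, h - 1)
        corrected.append(cut_out[idx, x])
    return corrected
```

Columns imaged at a larger working distance get a larger βth/β(θ) factor and thus more rows, which is the mechanism that produces a pincushion-shaped outline.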
  • FIG. 17 is a diagram illustrating an example when the cutout image of FIG. 14 is deformed by the magnification correction process.
  • the CPU 38 generates a rectangular locally developed image IEL as shown in FIG. 18 by performing a process of cutting out the upper and lower portions of the cut-out image IEC in accordance with the width WM.
  • FIG. 18 is a diagram illustrating an example of a locally developed image generated by excising a part of the deformed cutout image of FIG.
  • In this way, the CPU 38 calculates the offset amount δ based on the inner diameter φ and the width WS and/or the ratio value RV, and generates the locally developed image IEL by performing a magnification correction process that corrects the imaging magnification in the Y-axis direction of the cut-out image IEB in accordance with the offset amount δ.
  • the CPU 38 performs the same processing as step S1 in FIG. 10 to develop one endoscopic image output from the image generation unit 34 and generate one locally developed image (step in FIG. 10). S3).
  • the CPU 38 performs a determination process related to whether or not the locally developed image generated by the process of step S3 in FIG. 10 is an image suitable for pasting (step S4 in FIG. 10).
  • The CPU 38 compares, for example, feature points and feature amounts acquired from the locally developed image IEL1 stored in the storage unit 36 with those acquired from the locally developed image IEL2 generated by the process of step S3 of FIG. 10, and calculates the ratio PA by which the locally developed image IEL2 overlaps the width WM of the locally developed image IEL1 in the Y-axis direction.
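The overlap-ratio computation can be sketched as follows; a simple sliding-window sum-of-absolute-differences search stands in for the feature-point/feature-amount matching described above, which is not specified in detail, and all names are illustrative.

```python
import numpy as np

def overlap_ratio(iel1, iel2):
    """Estimate the ratio PA by which the bottom of IEL1 overlaps the
    top of IEL2 along the Y axis, as a fraction of IEL1's height."""
    h = iel1.shape[0]
    best_overlap, best_err = 0, np.inf
    for overlap in range(1, h):  # candidate overlap heights
        err = np.abs(iel1[-overlap:] - iel2[:overlap]).mean()
        if err < best_err:       # keep the best-matching alignment
            best_err, best_overlap = err, overlap
    return best_overlap / h
```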
  • When the CPU 38 detects that the ratio PA calculated as described above does not belong to the threshold range TR, the CPU 38 obtains a determination result that the locally developed image IEL2 is an image not suitable for pasting.
  • When the CPU 38 detects that the ratio PA calculated as described above belongs to the threshold range TR, the CPU 38 obtains a determination result that the locally developed image IEL2 is an image suitable for pasting.
  • the CPU 38 has a function as a determination unit that performs the determination process as described above.
  • The threshold range TR may be set, for example, as a range of 20% or more and less than 50%. Further, the threshold range TR may be enlarged or reduced according to the matching accuracy stored in the storage unit 36; specifically, for example, when the matching accuracy stored in the storage unit 36 is 100%, the threshold range TR may be narrowed to 20% only.
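The suitability test of step S4 can be condensed into a few lines. In the sketch below, the feature-matching step that actually measures the overlap is abstracted into a single ratio `pa`, and the 20%–50% range and the accuracy-based narrowing follow the example values given in the text.

```python
# Sketch of the pasting-suitability decision of step S4: an image is
# suitable when its overlap ratio PA with the previous strip falls inside
# the threshold range TR.
def threshold_range(matching_accuracy=None):
    # Base range: 20% (inclusive) to 50% (exclusive).
    lo, hi = 0.20, 0.50
    if matching_accuracy is not None and matching_accuracy >= 1.0:
        # With perfect matching accuracy the range may shrink to 20% only.
        return (0.20, 0.20)
    return (lo, hi)

def suitable_for_pasting(pa, matching_accuracy=None):
    lo, hi = threshold_range(matching_accuracy)
    if lo == hi:
        return abs(pa - lo) < 1e-9
    return lo <= pa < hi

print(suitable_for_pasting(0.35))  # inside TR
print(suitable_for_pasting(0.10))  # insufficient overlap
print(suitable_for_pasting(0.50))  # excessive overlap (upper limit exclusive)
```

Making the upper limit exclusive ("less than 50%") mirrors the wording of the example range above; whether the real device treats the bounds inclusively is not stated.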
  • When the CPU 38 obtains a determination result that the locally developed image generated by the process of step S3 of FIG. 10 is an image not suitable for pasting (S4: NO), the CPU 38 performs the operation of step S5 of FIG. 10 described later. When the CPU 38 obtains a determination result that the locally developed image is an image suitable for pasting (S4: YES), the CPU 38 performs the operation of step S8 of FIG. 10.
  • The CPU 38 performs an operation for generating guidance for adjusting the insertion state of the insertion unit 5 and causing the display unit 35 to display the guidance (step S5 in FIG. 10).
  • For example, when the ratio PA calculated by the process of step S4 in FIG. 10 is less than the lower limit of the threshold range TR, that is, when the overlapping portion of the locally developed image IEL2 with respect to the locally developed image IEL1 is insufficient or absent, the CPU 38 performs an operation for generating a character string or the like prompting the user to move the insertion unit 5 backward from its current position and causing the display unit 35 to display it. Further, for example, when the ratio PA calculated by the process of step S4 in FIG. 10 exceeds the upper limit of the threshold range TR, that is, when the overlapping portion of the locally developed image IEL2 with respect to the locally developed image IEL1 is excessive, the CPU 38 performs an operation for generating a character string or the like prompting the user to increase the forward insertion speed of the insertion unit 5 and causing the display unit 35 to display it.
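The guidance logic of step S5 is a simple dispatch on where PA fell relative to TR. The message strings below are illustrative placeholders, not the device's actual display text.

```python
# Sketch of the guidance selection of step S5: the message shown on the
# display unit depends on whether the overlap ratio PA fell below or above
# the threshold range TR. Message strings are illustrative only.
def guidance_message(pa, tr=(0.20, 0.50)):
    lo, hi = tr
    if pa < lo:
        # Overlap insufficient or absent: the probe advanced too far.
        return "Pull the insertion unit back from its current position."
    if pa >= hi:
        # Overlap excessive: the probe is advancing too slowly.
        return "Increase the forward insertion speed of the insertion unit."
    return None  # PA inside TR: no corrective guidance needed

print(guidance_message(0.05))
print(guidance_message(0.60))
```

Too little overlap means the distal end moved past the previously captured region, hence the "pull back" branch; too much overlap means the probe barely advanced, hence the "speed up" branch.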
  • After performing the operation of step S5 in FIG. 10, the CPU 38 performs the same process as step S1 in FIG. 10 m consecutive times, based on the number m of re-acquired endoscope images stored in the storage unit 36, thereby generating m locally developed images corresponding to the m endoscopic images output from the image generation unit 34 (step S6 in FIG. 10).
  • The CPU 38 performs a determination process similar to that of step S4 in FIG. 10 on each of the m locally developed images generated by the process of step S6 in FIG. 10, thereby determining whether an image suitable for pasting exists among them (step S7 in FIG. 10).
  • When the CPU 38 obtains a determination result that no image suitable for pasting exists among the m locally developed images generated by the process of step S6 of FIG. 10 (S7: NO), the CPU 38 performs the operations of steps S5 and S6 of FIG. 10 again.
  • When the CPU 38 obtains a determination result that an image suitable for pasting exists among the m locally developed images generated by the process of step S6 of FIG. 10 (S7: YES), the CPU 38 performs the operation of step S8 of FIG. 10.
  • the CPU 38 performs the operation relating to the generation of the locally developed image again when the determination condition of step S4 or step S7 in FIG. 10 is not satisfied.
  • When the CPU 38 detects in step S7 of FIG. 10 that a plurality of locally developed images suitable for pasting exist, it is assumed that the CPU 38 selects, for example, the one locally developed image whose ratio PA is closest to a predetermined value included in the threshold range TR, and then proceeds to the operation of step S8 in FIG. 10.
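The tie-break just described reduces to a nearest-to-target selection. In this sketch the 0.35 target value is an assumption (the text only says "a predetermined value included in the threshold range TR"), and the candidate list is hypothetical.

```python
# Sketch of the tie-break in step S7: when several of the m re-acquired
# strips pass the TR test, pick the one whose overlap ratio PA is closest
# to a predetermined target value inside TR (0.35 here is an assumed value).
def pick_best(candidates, target_pa=0.35):
    # candidates: list of (image_id, pa) pairs that already passed the TR test
    return min(candidates, key=lambda c: abs(c[1] - target_pa))

strips = [("IEL_a", 0.22), ("IEL_b", 0.34), ("IEL_c", 0.48)]
print(pick_best(strips))  # → ('IEL_b', 0.34)
```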
  • The CPU 38 adds, to the locally developed image satisfying the determination condition of step S4 or step S7 in FIG. 10, information identifying it as the n-th (2 ≦ n) image of the wide area developed image, stores it in the storage unit 36 (step S8 in FIG. 10), and performs an operation for generating a character string or the like requesting the user to maintain the insertion state of the insertion unit 5 and causing the display unit 35 to display it (step S9 in FIG. 10). That is, in step S8 of FIG. 10, the CPU 38 performs an operation for storing, in time series, the locally developed images satisfying the determination condition of step S4 or step S7 of FIG. 10 in the storage unit 36.
  • In step S9 of FIG. 10, for example, the CPU 38 performs an operation for generating a character string or the like instructing the user to continue inserting the insertion unit 5 while maintaining the current insertion speed (and insertion direction), and causing the display unit 35 to display it.
  • The CPU 38 performs a determination process related to whether or not to end the acquisition of locally developed images (step S10 in FIG. 10).
  • When the CPU 38 cannot detect an instruction corresponding to pressing of the developed image acquisition end button 203 and detects that the storage unit 36 has free space capable of storing at least one more locally developed image (S10: NO), the CPU 38 performs the operations from step S3 in FIG. 10 again.
  • In this way, a plurality of locally developed images used for generating a wide area developed image are sequentially stored in the storage unit 36, in order starting from the first.
  • When the CPU 38 detects that an instruction corresponding to pressing of the developed image acquisition end button 203 has been performed, or detects that the storage unit 36 no longer has free space capable of storing at least one locally developed image (S10: YES), the CPU 38 stops measuring the insertion distance of the insertion unit 5 and performs a process of pasting together the locally developed images stored in the storage unit 36 to generate one wide area developed image (step S11 in FIG. 10). That is, when the CPU 38 ends the operation for storing, in time series, the locally developed images satisfying the determination condition of step S4 or step S7 of FIG. 10 in the storage unit 36, the CPU 38 performs an operation for pasting together the locally developed images stored in the storage unit 36 to generate one wide area developed image and causing the display unit 35 to display the generated wide area developed image.
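The overall acquisition loop (steps S3 through S10) can be condensed into the following sketch. Here `acquire()`, `suitable()`, the capacity figure, and the end-request callback are all stand-ins for the device's actual image development, overlap determination, storage check, and end-button handling.

```python
# Condensed sketch of the acquisition loop (steps S3-S10): strips are
# stored in time series until the user presses the end button or the
# storage unit runs out of free space.
def acquire_strips(acquire, suitable, capacity, end_requested):
    stored = []
    while not end_requested() and len(stored) < capacity:
        strip = acquire()            # step S3 / S6: develop one image
        if suitable(strip, stored):  # step S4 / S7: overlap check
            stored.append(strip)     # step S8: store in time series
    return stored

# Toy run: every 2nd frame overlaps acceptably; stop after 3 stored strips.
frames = iter(range(100))
stored = acquire_strips(
    acquire=lambda: next(frames),
    suitable=lambda s, st: s % 2 == 0,
    capacity=3,
    end_requested=lambda: False,
)
print(stored)  # → [0, 2, 4]
```

Note that unsuitable frames are simply discarded and re-acquired, which is the behavior the retry loop of steps S5 to S7 guarantees.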
  • Specifically, for example, the CPU 38 calculates a homography matrix between each pair of adjacent locally developed images among the first to p-th (2 ≦ p) locally developed images IEL stored in the storage unit 36, and pastes them together accordingly, thereby generating a single wide area developed image IEW as shown in FIG. 19.
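The real pasting of step S11 estimates a homography between adjacent strips from matched features. The sketch below substitutes the simplest special case, a pure Y-axis shift between consecutive strips with a known overlap, to show only how p local strips accumulate into one wide image; it is not the full homography computation.

```python
# Simplified pasting sketch for step S11: each strip is assumed to be a
# pure Y-axis shift of its predecessor, so pasting reduces to appending the
# non-overlapping rows. Strips are lists of rows (each row a list of pixels).
def paste_strips(strips, overlap_rows):
    # overlap_rows[i]: number of rows shared between strips[i] and strips[i + 1]
    wide = list(strips[0])
    for strip, ov in zip(strips[1:], overlap_rows):
        wide.extend(strip[ov:])  # keep only the rows not already pasted
    return wide

s1 = [[1], [2], [3], [4]]
s2 = [[3], [4], [5], [6]]  # overlaps s1 by 2 rows
s3 = [[6], [7], [8]]       # overlaps s2 by 1 row
wide = paste_strips([s1, s2, s3], [2, 1])
print(len(wide))  # → 8
```

Replacing the fixed `overlap_rows` with per-pair homographies estimated from feature matches (and warping each strip before appending) recovers the behavior the text describes.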
  • FIG. 19 is a diagram illustrating an example of a wide area developed image generated by pasting together a plurality of locally developed images.
  • FIG. 20 is a diagram illustrating an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
  • the display image 303 includes insertion distance information 204, a rectangular wide area developed image 205, and a live image display button 206.
  • the insertion distance information 204 of the display image 303 includes a character string for notifying the user of the insertion distance of the insertion unit 5 at the timing when the measurement by the CPU 38 is stopped.
  • The live image display button 206 is configured, for example, as a GUI button capable of issuing, in response to a user's touch operation, an instruction to cancel the display of the wide area developed image 205 and display the endoscopic image 201 live.
  • the CPU 38 performs a determination process related to whether or not to end the display of the wide area developed image (step S12 in FIG. 10).
  • the CPU 38 stands by while performing an operation for outputting the display image 303 to the display unit 35 until an instruction corresponding to the pressing of the live image display button 206 is detected (S12: NO).
  • When the CPU 38 detects that an instruction corresponding to pressing of the live image display button 206 has been performed (S12: YES), the CPU 38 stores the wide area developed image generated in step S11 of FIG. 10, performs an operation for displaying the display image 301 again in place of the display image 303, and then ends the series of processes relating to generation and display of the wide area developed image.
  • The CPU 38 is not limited to performing the operation for generating the wide area developed image and displaying it on the display unit 35 after the storage of locally developed images in the storage unit 36 is completed; for example, the CPU 38 may perform an operation for generating the wide area developed image and displaying it on the display unit 35 while storing locally developed images in the storage unit 36.
  • FIG. 21 is a diagram illustrating an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
  • The display image 304a includes an endoscope image 201, a developed image acquisition end button 203, and insertion distance information 204 that are substantially the same as those included in the display image 302. Further, according to the display image 304a, each time a locally developed image satisfying the determination condition of step S4 or step S7 in FIG. 10 is acquired, a wide area developed image 211 obtained by pasting that locally developed image is displayed on the display unit 35. In addition, according to the display image 304a, for example, as indicated by the alternate long and short dash line in FIG. 21, the display area of the wide area developed image 211 gradually expands each time a locally developed image satisfying the determination condition of step S4 or step S7 in FIG. 10 is acquired.
  • FIG. 22 is a diagram illustrating an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
  • The display image 304b includes a developed image acquisition end button 203 and insertion distance information 204 that are substantially the same as those included in the display image 302. Further, according to the display image 304b, each time a locally developed image 221 satisfying the determination condition of step S4 or step S7 in FIG. 10 is acquired, both the locally developed image 221 and a wide area developed image 211 obtained by pasting together the locally developed images 221 are displayed on the display unit 35. In addition, according to the display image 304b, for example, as indicated by the one-dot chain line in FIG. 22, the display area of the wide area developed image 211 gradually expands each time a locally developed image satisfying the determination condition of step S4 or step S7 in FIG. 10 is acquired.
  • As described above, according to the present embodiment, the offset amount is calculated based on the endoscopic image output from the image generation unit 34, and the locally developed images and the wide area developed image are generated while performing the magnification correction process according to the calculated offset amount. Therefore, according to the present embodiment, an accurate developed image can be generated without using, for example, a centering mechanism configured to insert the insertion portion 5 into the pipe 101 while keeping the optical axis AC aligned with the center axis AS.
  • Further, according to the present embodiment, since the developed image can be generated without providing the insertion portion 5 with a centering mechanism as described above, for example, the internal state of a small-diameter pipe corresponding to the outer diameter of the insertion portion 5 can be confirmed with a wide area developed image.
  • In addition, according to the present embodiment, steps S5 to S7 in FIG. 10 are repeated, so that the n-th locally developed image suitable for pasting to the (n−1)-th locally developed image can be reliably acquired. Therefore, according to the present embodiment, an accurate wide area developed image can be generated even when, for example, the insertion state of the insertion portion 5 inserted into the pipe 101 is temporarily disturbed.

Landscapes

  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Astronomy & Astrophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Theoretical Computer Science (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)
  • Endoscopes (AREA)

Abstract

An observation device comprises: an illumination unit arranged inside a cylindrical sample with a center axis and outputting illumination light toward a cylindrical illumination region centered at a prescribed optical axis; an imaging unit arranged near the illumination unit inside the sample, capturing a cylindrical capture region in the sample centered at the prescribed optical axis, and outputting the captured signal; an image generation unit for generating an annular image centered at the prescribed optical axis on the basis of the captured signal; and an image processing unit for calculating a prescribed offset for the optical axis relative to the center axis on the basis of the inner diameter of the sample and a value acquired from the annular image, and generating a locally developed image by correcting the imaging magnification along a direction parallel to the center axis in an image acquired by developing the annular image in accordance with the offset.

Description

Observation device
The present invention relates to an observation apparatus.
Conventionally, a technique is known for generating a developed image by developing an image obtained by imaging the entire circumference of the inner peripheral surface of a cylindrical subject.
Specifically, for example, Japanese Patent No. 2562059 discloses a configuration in which an image reflected on a convex mirror provided at the center of a horizontal shaft having a substantially circular cross section is captured to obtain a basic image, the basic image is decomposed into a plurality of concentric annular images having sequentially different radii, and the plurality of annular images are developed into linear images and arranged in order, thereby generating a developed image of the inner wall surface of the horizontal shaft.
Here, according to the configuration disclosed in Japanese Patent No. 2562059, the basic image used for generating the developed image is acquired in a state where the central axis of the imaging device including the convex mirror and the central axis of the horizontal shaft imaged by the imaging device are substantially aligned by means of components such as a suspension arm.
That is, Japanese Patent No. 2562059 has a problem in that an accurate developed image cannot be generated unless the basic image is acquired using a centering mechanism constituted by components such as a suspension arm.
The present invention has been made in view of the above-described circumstances, and an object thereof is to provide an observation apparatus capable of generating an accurate developed image without using a centering mechanism.
An observation apparatus according to one aspect of the present invention includes: an illumination unit arranged inside a cylindrical subject having a central axis and configured to emit illumination light toward a cylindrical illumination region centered on a predetermined optical axis; an imaging unit provided in the vicinity of the illumination unit, arranged inside the subject, and configured to image an object within a cylindrical imaging region centered on the predetermined optical axis and to output an imaging signal; an image generation unit configured to generate, based on the imaging signal, an annular image centered on the predetermined optical axis; and an image processing unit configured to calculate an offset amount of the predetermined optical axis with respect to the central axis based on the inner diameter of the subject and a value acquired from the annular image, and to generate a locally developed image by performing a magnification correction process that corrects, according to the offset amount, the imaging magnification in a direction parallel to the central axis in an image obtained by developing the annular image.
FIG. 1 is a diagram showing the external configuration of the endoscope apparatus according to the embodiment. FIG. 2 is a diagram for explaining the configuration when an optical adapter is attached to the distal end portion of the insertion portion of the endoscope. FIG. 3 is a schematic diagram for explaining the illumination region LR and the imaging region IR when the optical adapter is attached to the distal end portion of the insertion portion of the endoscope. FIG. 4 is a block diagram for explaining the configuration of the endoscope apparatus according to the embodiment. FIG. 5 is a diagram showing an example of a state in which the insertion portion of the endoscope is inserted into a pipe. FIG. 6 is a diagram for explaining the regions IRB and IRS that arise within the imaging region IR in the insertion state of FIG. 5. FIG. 7 is a diagram showing an example of an endoscopic image generated in the insertion state of FIG. 5. FIG. 8 is a diagram showing an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment. FIG. 9 is a diagram showing an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment. FIG. 10 is a flowchart for explaining an example of operations relating to the generation and display of a wide area developed image performed in the endoscope apparatus according to the embodiment.
FIG. 11 is a diagram showing an example of an endoscopic image used in the process of generating a locally developed image. FIG. 12 is a diagram showing an example of a basic developed image generated by developing the endoscopic image of FIG. 11. FIG. 13 is a diagram showing an example of a combined image generated using the basic developed image of FIG. 12. FIG. 14 is a diagram showing an example of a cut-out image generated by cutting out a part of the combined image of FIG. 13. FIG. 15 is a diagram for explaining the width WS, in the offset direction, of the shadow region included in the endoscopic image used for generating the basic developed image. FIG. 16 is a diagram showing an overview of the parameters used in the process of generating a wide area developed image. FIG. 17 is a diagram showing an example in which the cut-out image of FIG. 14 is deformed by the magnification correction process. FIG. 18 is a diagram showing an example of a locally developed image generated by excising a part of the deformed cut-out image of FIG. 17. FIG. 19 is a diagram showing an example of a wide area developed image generated by pasting together a plurality of locally developed images. FIG. 20 is a diagram showing an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
FIG. 21 is a diagram showing an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment. FIG. 22 is a diagram showing an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
FIGS. 1 to 22 relate to the embodiment of the present invention.
The endoscope apparatus 1 has a function as an observation apparatus for observing the inside of a cylindrical subject having a central axis, such as a pipe. Specifically, as shown in FIG. 1, the endoscope apparatus 1 includes, for example, an endoscope 2 and an apparatus main body 3 to which the endoscope 2 can be connected. The apparatus main body 3 is provided with a display unit 35 capable of displaying images and the like.
The endoscope 2 includes an insertion portion 5 formed in an elongated shape insertable into the interior of a cylindrical subject having a central axis, an operation portion 6 provided on the proximal end side of the insertion portion 5, and a universal cord 7 extending from the operation portion 6. The endoscope 2 is configured to be detachably connected to the apparatus main body 3 via the universal cord 7. A light guide 21 (not shown in FIG. 1) for transmitting illumination light supplied from the apparatus main body 3 is inserted through the interior of the endoscope 2.
The insertion portion 5 is configured by providing, in order from the distal end side, a distal end portion 11, a bending portion 12 formed to be bendable, and a long flexible tube portion 13 having flexibility.
The operation portion 6 is provided with a bending joystick 6a configured to allow an operation for bending the bending portion 12 in a desired direction. Although not shown, the operation portion 6 is also provided with operation buttons corresponding to functions available in the endoscope apparatus 1, such as a freeze button, a bending lock button, and a recording instruction button.
The distal end portion 11 is configured such that an optical adapter 40, including one or more optical members corresponding to the imaging region used when imaging the inside of the cylindrical subject with the endoscope 2, can be detachably attached thereto. In the present embodiment, the description assumes that an optical adapter 40 configured to have a cylindrical (360-degree) imaging region centered on an optical axis AC (described later) is attached to the distal end portion 11.
As shown in FIG. 2, the distal end portion 11 is provided with the distal end portion of the light guide 21 inserted through the endoscope 2, and with an illumination lens 22 that emits the illumination light incident via the distal end portion of the light guide 21 toward the light diffusing element 47 of the optical adapter 40. As also shown in FIG. 2, the distal end portion 11 is provided with an imaging lens 23 that forms an image of the light incident via the lens unit 46 of the optical adapter 40, and an imaging element 24 that images the light focused by the imaging lens 23 and outputs an imaging signal. On the side surface of the distal end portion 11, an uneven portion 25 is formed as a structure having a shape engageable with the attachment member 49 of the optical adapter 40. FIG. 2 is a diagram for explaining the configuration when the optical adapter is attached to the distal end portion of the insertion portion of the endoscope.
In the present embodiment, the imaging element 24 need only be configured with an image sensor such as a CCD or CMOS. Further, according to the present embodiment, as long as it is engageable with the attachment member 49 of the optical adapter 40, a structure having a shape different from that of the uneven portion 25 may be formed on the side surface of the distal end portion 11.
As shown in FIG. 2, the optical adapter 40 includes a cover member 41, an observation window 42, a free-form surface lens 43, a frame member 44, a lens holder 45, a lens unit 46, a light diffusing element 47, an illumination window 48, and an attachment member 49.
The cover member 41 is formed of, for example, a disk-shaped member having a light-shielding property, and is fixedly disposed at a position covering the front surface of the free-form surface lens 43.
The observation window 42 is formed of, for example, a cylindrical member having translucency, such as a sapphire pipe, and is configured to transmit light incident from outside the optical adapter 40 and emit it toward the free-form surface lens 43. The observation window 42 is fixedly disposed between the cover member 41 and the frame member 44, at a position covering the lens side surface of the free-form surface lens 43. The observation window 42 is formed with an outer diameter smaller than the outer diameter of a flange 44f (described later) of the frame member 44.
The free-form surface lens 43 is fixedly disposed between the cover member 41 and the frame member 44 within the optical adapter 40, at a position where light emitted through the observation window 42 enters its interior from the lens side surface. The free-form surface lens 43 is formed with optical characteristics such that light entering from the lens side surface is refracted, or reflected at the lens front surface, and emitted toward the lens unit 46.
The frame member 44 is formed by combining, for example, a cylindrical member having a light-shielding property with a flange 44f that shields the front surface of the light diffusing element 47 from light. The frame member 44 is configured such that a part of the cylindrical portion 45a of the lens holder 45 can be fitted in from the opening on the flange 44f side.
The lens holder 45 is formed with a cylindrical portion 45a having an inner diameter capable of holding the lens unit 46, and an enlarged-diameter portion 45b having an inner diameter larger than that of the cylindrical portion 45a.
The cylindrical portion 45a is formed so as to hold the lens unit 46 in a state where the central axis of the optical adapter 40 and the central axis of the lens unit 46 coincide.
The enlarged-diameter portion 45b is provided on the rear end side of the cylindrical portion 45a, and is formed with an inner diameter capable of accommodating a portion including the distal end surface of the distal end portion 11. The enlarged-diameter portion 45b is formed in a shape such that, when the optical adapter 40 is attached to the distal end portion 11, the light exit surface of the lens unit 46 contacts the light entrance surface of the imaging lens 23 while the light entrance surface of the light diffusing element 47 contacts the light exit surface of the illumination lens 22. The attachment member 49 is provided on the rear end side of the enlarged-diameter portion 45b.
The lens unit 46 includes one or more lenses, and is configured to emit the light incident via the free-form surface lens 43 toward the imaging lens 23 when the optical adapter 40 is attached to the distal end portion 11.
In the present embodiment, as shown in FIG. 2, the description assumes that when the optical adapter 40 is attached to the distal end portion 11, the central axis of the imaging lens 23, the central axis of the optical adapter 40, the central axis of the free-form surface lens 43, and the central axis of the lens unit 46 are located on the same optical axis AC.
 The light diffusing element 47 has, for example, a substantially cylindrical shape and is fixed in place, fitted into the cylindrical portion 45a, between the frame member 44 and the lens holder 45 inside the optical adapter 40. When the optical adapter 40 is attached to the distal end portion 11, the light diffusing element 47 diffuses the illumination light that has entered through the light exit surface of the illumination lens 22 and emits it laterally.
 The illumination window 48 is formed of a translucent cylindrical member such as a sapphire pipe, for example, and transmits the illumination light diffused by the light diffusing element 47 so that the light is emitted to the outside of the optical adapter 40. The illumination window 48 is fixed in place between the frame member 44 and the lens holder 45, at a position covering the side surface of the light diffusing element 47. The observation window 42 is formed to have the same outer diameter as that of the flange 44f of the frame member 44.
 The attachment member 49 has, for example, a substantially cylindrical shape and is formed with a groove on its inner peripheral surface that can engage with the concavo-convex portion 25.
 That is, the optical adapter 40 is configured to have a cylindrical (360-degree) illumination region centered on the optical axis AC. With the configuration of the optical adapter 40 described above, as shown schematically in FIG. 3 for example, the illumination region LR of the illumination light emitted to the outside of the optical adapter 40 through the illumination window 48 is wider than the imaging region IR used when imaging the light that has entered the optical adapter 40 through the observation window 42. With this configuration, the illumination region LR and the imaging region IR are also shifted from each other along the optical axis AC (see FIG. 3). Furthermore, with this configuration, a predetermined range in front of the distal end portion 11 becomes a non-imaging region outside the imaging region IR. FIG. 3 is a schematic diagram for explaining the illumination region LR and the imaging region IR when the optical adapter is attached to the distal end portion of the insertion portion of the endoscope. The illumination unit in the present embodiment includes the light diffusing element 47 and the illumination window 48. The imaging unit in the present embodiment is provided in the vicinity of the illumination unit and includes the imaging device 24, the observation window 42, and the free-form surface lens 43.
 As shown in FIG. 4, the apparatus main body 3 includes a light source unit 31, a light source driving unit 32, an image sensor driving unit 33, an image generation unit 34, a display unit 35, a storage unit 36, an input I/F (interface) unit 37, and a CPU 38. The apparatus main body 3 is also provided with a connection port (not shown) for connecting a portable external storage device 51 such as a USB memory. FIG. 4 is a block diagram for explaining the configuration of the endoscope apparatus according to the embodiment.
 The light source unit 31 includes, for example, an LED or a lamp. The light source unit 31 is turned on or off in accordance with a light source drive signal output from the light source driving unit 32, and supplies white light of a light amount corresponding to that drive signal to the light guide 21 as illumination light.
 The light source driving unit 32 includes, for example, a light source driving circuit, and generates and outputs a light source drive signal for driving the light source unit 31 under the control of the CPU 38.
 The image sensor driving unit 33 includes, for example, an image sensor driving circuit, and generates and outputs an image sensor drive signal for driving the imaging device 24 under the control of the CPU 38.
 The image generation unit 34 is configured by an integrated circuit such as an FPGA (Field Programmable Gate Array). By applying predetermined signal processing to the imaging signal output from the imaging device 24, the image generation unit 34 generates an annular endoscopic image centered on the optical axis AC and sequentially outputs the generated endoscopic images to the CPU 38.
 The display unit 35 includes, for example, a liquid crystal panel and displays the display image output from the CPU 38 on its display screen. The display unit 35 also has a touch panel 35a that detects touch operations on GUI (graphical user interface) buttons and the like displayed on the display screen and outputs instructions corresponding to the detected touch operations to the CPU 38.
 The storage unit 36 includes, for example, a storage circuit such as a memory. The storage unit 36 stores various programs corresponding to the operations of the CPU 38, such as a program used for controlling each unit of the endoscope apparatus 1 and a program used for the processing related to the generation of a wide-area developed image (described later). The storage unit 36 can also store a plurality of locally developed images (described later) generated in the course of the processing related to the generation of the wide-area developed image by the CPU 38, as well as information input through operation of the input I/F unit 37. In addition, the storage unit 36 stores parameters used in the processing related to the generation of the wide-area developed image.
 The input I/F unit 37 includes switches and the like capable of issuing instructions to the CPU 38 in accordance with user input operations. Through the input I/F unit 37, the user can input information used in the processing related to the generation of the wide-area developed image by the CPU 38.
 The CPU 38 can control the light source driving unit 32 and the image sensor driving unit 33 based on instructions made through operation of the touch panel 35a or the input I/F unit 37. The CPU 38 includes a timer (not shown) for measuring time, and can measure the insertion distance of the insertion portion 5 based on, for example, an output signal from an acceleration sensor (not shown) provided in the distal end portion 11. The CPU 38 can generate a display image in which GUI buttons and the like are superimposed on an image, such as the endoscopic image output from the image generation unit 34, and output the display image to the display unit 35. The CPU 38 can also store information input through operation of the input I/F unit 37 in the storage unit 36. Furthermore, the CPU 38 functions as an image processing unit: it generates locally developed images by developing, one at a time, the annular endoscopic images sequentially output from the image generation unit 34, and performs processing for generating a wide-area developed image by stitching a plurality of the generated locally developed images together. Details of this processing will be described later. The CPU 38 can store the locally developed images generated in the course of the processing related to the generation of the wide-area developed image in the storage unit 36, and can store the wide-area developed image generated as described above in the external storage device 51. The CPU 38 can also encode the endoscopic image output from the image generation unit 34 using a still-image format such as JPEG or a moving-image format such as MPEG-4 and store it in the external storage device 51. Based on instructions made through operation of the touch panel 35a or the input I/F unit 37, the CPU 38 can read an image stored in the external storage device 51, generate a display image corresponding to the read image, and output it to the display unit 35. The CPU 38 applies predetermined image processing, such as color space conversion, interlace/progressive conversion, and gamma correction, to the display image output to the display unit 35.
 Next, specific operations and the like of the endoscope apparatus 1 of the present embodiment will be described.
 After connecting each unit of the endoscope apparatus 1 and turning on the power, the user operates the input I/F unit 37 to input the information used in the processing related to the generation of the wide-area developed image. Specifically, by operating the input I/F unit 37, the user inputs, for example, the inner diameter Φ of a pipe, which is a cylindrical subject having a central axis; the cut-out reference position used when generating a locally developed image; the matching accuracy used when stitching locally developed images together; and the number m (1 ≤ m) of endoscopic images to be re-acquired when a locally developed image unsuitable for stitching is generated. Through these operations, information capable of specifying each of these values is stored in the storage unit 36.
 Thereafter, the user inserts the insertion portion 5 into the pipe while checking the endoscopic image displayed live on the display unit 35. In the following, as shown in FIG. 5, the case where the insertion portion 5 is inserted with the optical axis AC decentered by an offset amount Δ with respect to the central axis AS of the cylindrical pipe 101 is taken as an example; that is, the offset amount Δ corresponds to the amount of eccentricity of the optical axis AC with respect to the central axis AS. FIG. 5 is a diagram showing an example of a state in which the insertion portion of the endoscope is inserted into a pipe.
 Here, when the insertion portion 5 is inserted with the optical axis AC shifted from the central axis AS, the inner peripheral surface 101a of the pipe 101 contains both a region imaged while the illumination light emitted through the illumination window 48 reaches it and a region imaged while that illumination light does not reach it. That is, when the insertion portion 5 is inserted with the optical axis AC decentered with respect to the central axis AS, as shown in FIG. 6 for example, the imaging region IR used when imaging the inner peripheral surface 101a of the pipe 101 contains a region IRB that belongs to the illumination region LR (reached by the illumination light) and a region IRS that falls outside the illumination region LR (not reached by the illumination light). FIG. 6 is a diagram for explaining the regions IRB and IRS generated within the imaging region IR in an insertion state such as that of FIG. 5.
 Accordingly, when the insertion portion 5 is inserted into the pipe 101 with the optical axis AC shifted from the central axis AS, the image generation unit 34 generates an annular endoscopic image such as that shown in FIG. 7, in which a circular non-imaging region outside the imaging region IR is depicted at the center, and an illuminated region corresponding to the region IRB of the inner peripheral surface 101a and a shadow region corresponding to the region IRS of the inner peripheral surface 101a are depicted around that non-imaging region. FIG. 7 is a diagram showing an example of an endoscopic image generated in an insertion state such as that of FIG. 5.
 Using the endoscopic image output from the image generation unit 34, the CPU 38 generates a display image 301 such as that shown in FIG. 8, for example, and outputs it to the display unit 35. FIG. 8 is a diagram showing an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
 The display image 301 includes an annular endoscopic image 201 and a developed-image acquisition start button 202. In the display image 301, the endoscopic image 201 is displayed live.
 The developed-image acquisition start button 202 is configured as a GUI button through which the user can, for example by a touch operation, instruct the apparatus to start acquiring the locally developed images used for generating the wide-area developed image.
 With the insertion portion 5 inserted inside the pipe, the user presses the developed-image acquisition start button 202 displayed on the display unit 35, thereby instructing the apparatus to start acquiring the locally developed images used for generating the wide-area developed image.
 Based on the instruction made through operation of the touch panel 35a, the CPU 38 starts acquiring the locally developed images used for generating the wide-area developed image, starts measuring the insertion distance of the insertion portion 5, and generates a display image 302 such as that shown in FIG. 9, for example, and outputs it to the display unit 35. FIG. 9 is a diagram showing an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
 The display image 302 includes the annular endoscopic image 201, a developed-image acquisition end button 203, and insertion distance information 204. In the display image 302, the endoscopic image 201 is displayed live.
 The developed-image acquisition end button 203 is configured as a GUI button through which the user can, for example by a touch operation, instruct the apparatus to end acquisition of the locally developed images used for generating the wide-area developed image.
 The insertion distance information 204 includes a character string for notifying the user of the insertion distance of the insertion portion 5 measured by the CPU 38. The insertion distance information 204 is updated substantially in real time as the insertion portion 5 advances and retracts during the period from when the developed-image acquisition start button 202 is pressed to when the developed-image acquisition end button 203 is pressed. Therefore, immediately after the developed-image acquisition start button 202 is pressed, insertion distance information 204 containing a character string indicating that the insertion distance of the insertion portion 5 is 0, such as "Insertion distance: 0 mm" in FIG. 9, is displayed on the display unit 35.
 After pressing the developed-image acquisition start button 202, the user inserts the insertion portion 5 toward the far side of the pipe while checking the display image 302 displayed on the display unit 35.
 A specific example of the operations related to the generation and display of the wide-area developed image will now be described with reference to FIG. 10 and other figures. FIG. 10 is a flowchart for explaining an example of the operations related to the generation and display of the wide-area developed image performed in the endoscope apparatus according to the embodiment.
 The CPU 38 performs processing for developing one endoscopic image output from the image generation unit 34 to generate one locally developed image (step S1 in FIG. 10). The CPU 38 then adds, to the locally developed image generated in step S1, information identifying it as the first image of the wide-area developed image and stores it in the storage unit 36 (step S2 in FIG. 10), after which it proceeds to step S3 in FIG. 10, described later.
 The processing related to the generation of a locally developed image, performed in step S1 of FIG. 10 and elsewhere, will be described using an endoscopic image such as that shown in FIG. 11 as an example. For convenience of explanation, the illuminated region of the endoscopic image of FIG. 11 corresponding to the region IRB of FIG. 6 is divided into four regions: a bright region BR occurring in the offset direction, i.e., the direction of eccentricity of the optical axis AC with respect to the central axis AS; a dark region DR occurring in the direction opposite to the offset direction; and intermediate regions MRA and MRB occurring between the bright region BR and the dark region DR. For simplicity of explanation and illustration, the shadow region arising from the region IRS of FIG. 6 is omitted from the endoscopic image of FIG. 11. FIG. 11 is a diagram showing an example of an endoscopic image used in the processing related to the generation of a locally developed image.
 In a polar coordinate system whose origin is set at the center of the annular endoscopic image output from the image generation unit 34, the CPU 38 obtains the coordinate values corresponding to each pixel position of the endoscopic image other than the non-imaging region.
 The CPU 38 then obtains Cartesian coordinate values by applying a Jacobian matrix to the polar coordinate values obtained as described above, and rearranges the pixels of the endoscopic image according to the obtained Cartesian coordinate values, thereby generating a basic developed image IEA in which the endoscopic image is developed into a rectangle, as shown in FIG. 12 for example. FIG. 12 is a diagram showing an example of a basic developed image generated by developing the endoscopic image of FIG. 11.
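This polar-to-Cartesian development can be sketched as follows. This is a minimal nearest-neighbour resampling illustration only; the image size, annulus radii, sampling resolution, and the function name `unwrap_annular` are assumptions made for the sketch, and the patent does not specify the interpolation method.

```python
import numpy as np

def unwrap_annular(img, r_inner, r_outer, n_theta=360):
    """Resample an annular image (non-imaging disc of radius r_inner at the
    center) into a rectangle: columns = azimuth theta, rows = radius."""
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0      # annulus center = image center
    n_r = int(round(r_outer - r_inner))        # one row per radial pixel
    theta = np.linspace(-np.pi, np.pi, n_theta, endpoint=False)
    r = np.linspace(r_inner, r_outer - 1, n_r)
    rr, tt = np.meshgrid(r, theta, indexing="ij")
    # polar -> Cartesian sample positions (nearest neighbour for brevity)
    ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, w - 1)
    return img[ys, xs]

# toy example: 64x64 image whose pixel value encodes that pixel's azimuth
h = w = 64
y, x = np.mgrid[0:h, 0:w]
img = np.arctan2(y - (h - 1) / 2.0, x - (w - 1) / 2.0).astype(np.float32)
flat = unwrap_annular(img, r_inner=8, r_outer=30)
print(flat.shape)  # (22, 360)
```

Because the toy input encodes azimuth, each column of the unwrapped result holds an approximately constant value, which is what makes the development easy to verify.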
 The basic developed image IEA is generated as an image in which the direction parallel to the optical axis AC in the cylindrical imaging region IR, i.e., the direction parallel to the central axis AS of the pipe 101, is the vertical direction (hereinafter also referred to as the Y-axis direction), and the circumferential direction of the cylindrical imaging region IR, i.e., the circumferential direction of the inner peripheral surface 101a, is the horizontal direction (hereinafter also referred to as the X-axis direction). In the X-axis direction, the basic developed image IEA contains one full turn (360 degrees) of the inner peripheral surface 101a imaged within the cylindrical imaging region IR.
 The CPU 38 joins two copies of the basic developed image IEA generated as described above in the X-axis direction, thereby generating a rectangular combined image IJA such as that shown in FIG. 13. That is, in the X-axis direction, the combined image IJA contains two full turns (720 degrees) of the inner peripheral surface 101a imaged within the cylindrical imaging region IR. FIG. 13 is a diagram showing an example of a combined image generated using the basic developed image of FIG. 12.
 Based on the cut-out reference position information stored in the storage unit 36, the CPU 38 performs processing for generating a cut-out image, which is an image obtained by cutting one full turn (360 degrees) out of the combined image IJA. Specifically, when the cut-out reference position stored in the storage unit 36 is the dark region DR, for example, the CPU 38 cuts one full turn (360 degrees) out of the combined image IJA so that the dark region DR is placed at both ends in the X-axis direction, thereby generating a rectangular cut-out image IEB such as that shown in FIG. 14. FIG. 14 is a diagram showing an example of a cut-out image generated by cutting out part of the combined image of FIG. 13.
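The joining and cutting-out steps amount to simple array concatenation and slicing. The following one-dimensional sketch shows one plausible reading of the step; the function name and the convention that the slice starts at the reference column are assumptions, not specifics from the patent:

```python
import numpy as np

def cut_out_full_turn(strip, ref_col):
    """Join two copies of a 360-degree strip (IEA) along X to form the
    720-degree combined image (IJA), then slice one full turn starting at the
    reference column so that the region around ref_col (e.g. the dark region
    DR) is split between both ends of the result (IEB)."""
    n = strip.shape[1]                               # columns per full turn
    joined = np.concatenate([strip, strip], axis=1)  # combined image IJA
    return joined[:, ref_col:ref_col + n]            # cut-out image IEB

# toy strip: 1 row, 8 columns labelled by angle index 0..7
strip = np.arange(8)[None, :]
out = cut_out_full_turn(strip, ref_col=3)
print(out[0].tolist())  # [3, 4, 5, 6, 7, 0, 1, 2]
```

Because the combined image covers two turns, any 360-degree window can be cut out without wrap-around bookkeeping, which is presumably why the patent joins two copies first.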
 The CPU 38 estimates, as the offset direction, the direction in which the shadow region included in the endoscopic image used for generating the basic developed image IEA is largest, and performs processing for calculating the width WS of that shadow region in the offset direction (see FIG. 15). The CPU 38 then performs processing for calculating the offset amount Δ based on the width WS calculated as described above and the inner diameter Φ of the pipe 101 stored in the storage unit 36. FIG. 15 is a diagram for explaining the width WS, in the offset direction, of the shadow region included in the endoscopic image used for generating the basic developed image.
 In the present embodiment, instead of using the width WS of the shadow region in the offset direction, the CPU 38 may calculate the offset amount Δ using the ratio value RV (= Dpi/Bpi) obtained by dividing the luminance value Dpi of the central pixel Dp in the dark region DR depicted in the endoscopic image used for generating the basic developed image IEA by the luminance value Bpi of the central pixel Bp in the bright region BR depicted at the position facing the dark region DR in that endoscopic image. Alternatively, the CPU 38 may calculate the offset amount Δ based on the width WS of the shadow region in the offset direction, the ratio value RV described above, and the inner diameter Φ of the pipe 101.
 Based on the outer diameter DA of the distal end portion 11, the outer diameter DB of the free-form surface lens 43, the inner diameter Φ of the pipe 101, and the offset amount Δ, the CPU 38 performs processing for calculating the distance WD between the outermost periphery of the free-form surface lens 43 and the inner peripheral surface 101a (see FIG. 16) over the entire circumference of the inner peripheral surface 101a. The outer diameters DA and DB are known parameters used in the processing related to the generation of the wide-area developed image and are, for example, stored in the storage unit 36 in advance before this processing is performed. FIG. 16 is a diagram showing an overview of the parameters used in the processing related to the generation of the wide-area developed image.
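The patent does not spell out a formula for WD(θ). Under the assumption that the pipe cross-section is a circle of diameter Φ and the optical axis is displaced from the pipe axis by Δ, the distance from the axis to the wall in direction θ follows from elementary ray-circle geometry, and WD(θ) is then obtained by subtracting the lens radius DB/2. The sketch below encodes only that assumed geometry:

```python
import math

def wall_distance(theta, phi, delta, lens_od):
    """Assumed geometry (not given explicitly in the patent): distance
    WD(theta) from the outermost periphery of the free-form surface lens to
    the pipe inner wall, with theta = 0 taken as the offset direction
    (the side of the nearest, most brightly lit wall)."""
    r = phi / 2.0
    # distance from the offset optical axis to the circular wall along theta
    axis_to_wall = math.sqrt(r * r - (delta * math.sin(theta)) ** 2) \
                   - delta * math.cos(theta)
    return axis_to_wall - lens_od / 2.0

# illustrative values (units arbitrary): pipe ID 40, offset 5, lens OD 8
wd_near = wall_distance(0.0, phi=40.0, delta=5.0, lens_od=8.0)      # 11.0
wd_far = wall_distance(math.pi, phi=40.0, delta=5.0, lens_od=8.0)   # 21.0
print(wd_near, wd_far)
```

Note that the tip outer diameter DA, which the patent also lists as an input to this step, is not needed in this simplified reading; the sketch is only meant to show how WD varies over the circumference.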
 Here, when the distance WD in the direction of azimuth angle θ about the optical axis AC is denoted WD(θ), the imaging magnification β(θ) applied when imaging a subject located in the direction of azimuth angle θ can be expressed by equation (1) below, where f represents the focal length of the free-form surface lens 43 and Enp represents the distance between the outermost periphery of the free-form surface lens 43 and the entrance pupil position (see FIG. 16). Both f and Enp are known parameters used in the processing related to the generation of the wide-area developed image and are, for example, stored in the storage unit 36 in advance before this processing is performed.

β(θ) = f / (WD(θ) + Enp) … (1)

 Furthermore, since the azimuth angle θ corresponds to the X-axis position in the basic developed image IEA, applying the distance WD(θ) to equation (1) above makes it possible to calculate the imaging magnification β(θ) at the position of the cut-out image IEB corresponding to the azimuth angle θ in the X-axis direction.
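Equation (1) itself is a one-liner. In the sketch below, the focal length f and entrance-pupil distance Enp are placeholder values chosen for illustration, not figures from the patent:

```python
def imaging_magnification(wd, f=2.0, enp=1.5):
    """Equation (1): beta(theta) = f / (WD(theta) + Enp).
    f and enp are illustrative placeholders."""
    return f / (wd + enp)

# a nearer wall (smaller WD) is imaged at a larger magnification
print(imaging_magnification(11.0))  # 0.16
print(imaging_magnification(21.0))  # 0.0888...
```

This is what produces the pincushion deformation described next: the bright, nearer side of the wall is imaged larger than the dark, farther side, so the correction must shrink the former and stretch the latter.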
 Based on the imaging magnification β(θ) calculated using equation (1) and a predetermined reference magnification βth, the CPU 38 performs magnification correction processing that corrects the width in the Y-axis direction at the position corresponding to azimuth angle θ in the X-axis direction of the cut-out image IEB by a factor of βth/β(θ). This magnification correction processing generates, for example, a pincushion-shaped cut-out image IEC as shown in FIG. 17. FIG. 17 is a diagram showing an example in which the cut-out image of FIG. 14 has been deformed by the magnification correction processing.
 That is, as shown in FIG. 17, the cut-out image IEC is generated as an image in which the width of the dark region DR located at both ends of the cut-out image IEB of FIG. 14 (near θ = 180 degrees and −180 degrees) is enlarged in the Y-axis direction, the width of the bright region BR located at the center of the cut-out image IEB (near θ = 0 degrees) is reduced in the Y-axis direction, and the widths of the intermediate regions MRA and MRB of the cut-out image IEB are enlarged or reduced in the Y-axis direction according to the imaging magnification β(θ). Further, in the cut-out image IEC of FIG. 17, the width WM in the Y-axis direction at the center position in the X-axis direction, which corresponds to θ = 0 degrees, is the minimum width.
 The CPU 38 generates a rectangular locally developed image IEL as shown in FIG. 18 by performing processing that trims the upper and lower portions of the cut-out image IEC to the width WM. FIG. 18 is a diagram showing an example of a locally developed image generated by trimming away part of the deformed cut-out image of FIG. 17.
 That is, according to the processing described above, the CPU 38 calculates the offset amount Δ based on the inner diameter Φ and on the width WS and/or the ratio value RV, and further generates the locally developed image IEL by performing magnification correction processing that corrects the imaging magnification in the Y-axis direction of the cut-out image IEB according to the offset amount Δ.
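As a rough sketch of the magnification correction step, each column of the cut-out image (one azimuth θ) can be resampled in the Y-axis direction by the factor βth/β(θ). Linear interpolation is an assumption here; the patent does not specify the resampling method.

```python
import numpy as np

def correct_column(column, beta, beta_th):
    """Rescale one image column in the Y-axis direction by beta_th / beta.
    A column imaged at high magnification (near wall, large beta) is
    shrunk; a column imaged at low magnification (far wall) is stretched."""
    scale = beta_th / beta
    new_h = max(1, int(round(column.shape[0] * scale)))
    # Sample the original column at new_h evenly spaced positions.
    src = np.linspace(0.0, column.shape[0] - 1.0, new_h)
    return np.interp(src, np.arange(column.shape[0]), column)
```

Applying this to every column produces the pincushion-shaped image of FIG. 17, which is then trimmed to the minimum width WM.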
 The CPU 38 performs the same processing as step S1 of FIG. 10, thereby developing one endoscopic image output from the image generation unit 34 and generating one locally developed image (step S3 of FIG. 10).
 The CPU 38 performs determination processing as to whether or not the locally developed image generated by the processing of step S3 of FIG. 10 is an image suitable for stitching (step S4 of FIG. 10).
 Specifically, the CPU 38 compares, for example, the feature points and feature amounts acquired from the locally developed image IEL1 stored in the storage unit 36 immediately before the processing of step S3 of FIG. 10 with the feature points and feature amounts acquired from the locally developed image IEL2 generated by the processing of step S3 of FIG. 10, thereby calculating the proportion PA by which the locally developed image IEL2 overlaps, in the Y-axis direction, the width WM of the locally developed image IEL1 in the Y-axis direction. When the CPU 38 detects that the proportion PA calculated as described above does not fall within the threshold range TR, it obtains a determination result that the locally developed image IEL2 is an image not suitable for stitching. When the CPU 38 detects that the calculated proportion PA falls within the threshold range TR, it obtains a determination result that the locally developed image IEL2 is an image suitable for stitching.
 That is, the CPU 38 functions as a determination unit that performs the determination processing described above.
 The threshold range TR may be set, for example, as a range of 20% or more and less than 50%. The threshold range TR may also be enlarged or reduced according to the matching accuracy stored in the storage unit 36. Specifically, for example, when the matching accuracy stored in the storage unit 36 is 100%, the threshold range TR may be set to the single value of 20%.
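A minimal sketch of this suitability check: estimate the overlap proportion PA between consecutive locally developed images and test it against the threshold range TR. A brute-force vertical correlation search stands in here for the feature-point/feature-amount matching, whose details the patent does not give.

```python
import numpy as np

def overlap_ratio(prev_strip, new_strip):
    """Estimate PA: the fraction of prev_strip's height WM that new_strip
    overlaps, by searching for the vertical shift that best aligns the
    two strips (crude stand-in for feature matching)."""
    h = prev_strip.shape[0]
    best_shift, best_score = 0, -np.inf
    for s in range(h):
        # Compare the trailing rows of the previous strip with the
        # leading rows of the new strip under candidate shift s.
        score = np.mean(prev_strip[s:] * new_strip[: h - s])
        if score > best_score:
            best_shift, best_score = s, score
    return (h - best_shift) / h

def suitable_for_stitching(pa, tr=(0.20, 0.50)):
    # TR set, for example, to 20% or more and less than 50%.
    return tr[0] <= pa < tr[1]
```

An image whose PA falls outside TR would be rejected, triggering the user guidance of step S5.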
 When the CPU 38 obtains a determination result that the locally developed image generated by the processing of step S3 of FIG. 10 is an image not suitable for stitching (S4: NO), it performs the operation of step S5 of FIG. 10 described later. When the CPU 38 obtains a determination result that the locally developed image is an image suitable for stitching (S4: YES), it performs the operation of step S8 of FIG. 10 described later.
 The CPU 38 performs an operation for generating a character string or the like requesting the user to advance or retract the insertion portion 5, and displaying it on the display unit 35 (step S5 of FIG. 10).
 Specifically, for example, when the proportion PA calculated in step S4 of FIG. 10 is below the lower limit of the threshold range TR, that is, when the overlap of the locally developed image IEL2 with the locally developed image IEL1 is insufficient or absent, the CPU 38 performs an operation for generating a character string or the like instructing the user to move the insertion portion 5 back from its current position, and displaying it on the display unit 35. Also, for example, when the proportion PA calculated in step S4 of FIG. 10 exceeds the upper limit of the threshold range TR, that is, when the overlap of the locally developed image IEL2 with the locally developed image IEL1 is excessive, the CPU 38 performs an operation for generating a character string or the like instructing the user to increase the forward insertion speed of the insertion portion 5 from its current speed, and displaying it on the display unit 35.
 After performing the operation of step S5 of FIG. 10, the CPU 38 performs the same processing as step S1 of FIG. 10 m times in succession, based on the re-acquisition count m of endoscopic images stored in the storage unit 36, thereby generating m locally developed images corresponding to the m endoscopic images output from the image generation unit 34 (step S6 of FIG. 10).
 The CPU 38 performs the same determination processing as step S4 of FIG. 10 on each of the m locally developed images generated by the processing of step S6 of FIG. 10, thereby determining whether or not an image suitable for stitching exists among the m locally developed images (step S7 of FIG. 10).
 When the CPU 38 obtains a determination result that no image suitable for stitching exists among the m locally developed images generated by the processing of step S6 of FIG. 10 (S7: NO), it performs the operations of steps S5 and S6 of FIG. 10 again. When the CPU 38 obtains a determination result that an image suitable for stitching exists among the m locally developed images (S7: YES), it performs the operation of step S8 of FIG. 10 described later.
 That is, when the determination condition of step S4 or step S7 of FIG. 10 is not satisfied, the CPU 38 performs the operation for generating a locally developed image again.
 When the CPU 38 detects in step S7 of FIG. 10 that a plurality of locally developed images suitable for stitching exist, it selects, for example, the one locally developed image having the proportion PA closest to a predetermined value included in the threshold range TR, and then proceeds to the operation of step S8 of FIG. 10.
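The tie-breaking selection just described could look like the sketch below. The target value 0.35 is an arbitrary assumption; the patent only says "a predetermined value included in the threshold range TR".

```python
def pick_locally_developed_image(candidates, target_pa=0.35):
    """candidates: list of dicts like {"image": ..., "pa": ...}.
    Return the candidate whose overlap proportion PA is closest to
    the predetermined target value inside TR."""
    return min(candidates, key=lambda c: abs(c["pa"] - target_pa))
```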
 The CPU 38 adds, to the locally developed image that satisfies the determination condition of step S4 or step S7 of FIG. 10, information identifying it as the n-th (2 ≤ n) image of the wide-area developed image and stores it in the storage unit 36 (step S8 of FIG. 10), and then performs an operation for generating a character string or the like requesting the user to maintain the insertion state of the insertion portion 5, and displaying it on the display unit 35 (step S9 of FIG. 10). That is, in step S8 of FIG. 10, the CPU 38 performs an operation for storing, in the storage unit 36 in time series, the locally developed images that satisfy the determination condition of step S4 or step S7 of FIG. 10.
 Specifically, in step S9 of FIG. 10, the CPU 38 performs an operation for generating a character string or the like instructing the user to continue inserting the insertion portion 5 while maintaining, for example, the current insertion speed (and insertion direction), and displaying it on the display unit 35.
 The CPU 38 performs determination processing as to whether or not to end the acquisition of locally developed images (step S10 of FIG. 10).
 For example, when the CPU 38 cannot detect an instruction corresponding to the pressing of the developed-image-acquisition end button 203, and detects that the storage unit 36 has free space capable of storing at least one locally developed image (S10: NO), it performs the operations from step S3 of FIG. 10 again.
 That is, by repeatedly performing the operations of steps S3 to S10 of FIG. 10, a plurality of locally developed images used for generating the wide-area developed image are stored in the storage unit 36 sequentially (in order from the first).
 For example, when the CPU 38 detects that an instruction corresponding to the pressing of the developed-image-acquisition end button 203 has been given, or detects that the storage unit 36 has no free space capable of storing at least one locally developed image (S10: YES), it stops measuring the insertion distance of the insertion portion 5 and performs processing for stitching together the locally developed images stored in the storage unit 36 to generate one wide-area developed image (step S11 of FIG. 10). That is, when ending the operation for storing in the storage unit 36 in time series the locally developed images that satisfy the determination condition of step S4 or step S7 of FIG. 10, the CPU 38 generates one wide-area developed image by stitching together the locally developed images stored in the storage unit 36, and performs an operation for displaying the generated wide-area developed image on the display unit 35.
 Specifically, the CPU 38 generates one wide-area developed image IEW as shown in FIG. 19 by, for example, applying to the first through p-th (2 ≤ p) locally developed images IEL stored in the storage unit 36 a process that stitches the images together while calculating a homography matrix for each pair of adjacently numbered images. FIG. 19 is a diagram showing an example of a wide-area developed image generated by stitching together a plurality of locally developed images.
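The stitching step can be sketched as below. The patent computes a homography matrix between each pair of adjacently numbered images; this simplified version assumes the alignment reduces to a known row overlap per pair (a pure vertical translation) and averages the overlapping rows, which is an assumption rather than the patented method.

```python
import numpy as np

def stitch_strips(strips, overlaps):
    """Stitch locally developed images (equal-width 2-D arrays) into one
    wide-area developed image.  overlaps[i] is the number of rows by
    which strips[i+1] overlaps the current mosaic."""
    mosaic = strips[0]
    for strip, ov in zip(strips[1:], overlaps):
        if ov > 0:
            # Blend the shared rows, then append the non-overlapping rest.
            blended = (mosaic[-ov:] + strip[:ov]) / 2.0
            mosaic = np.vstack([mosaic[:-ov], blended, strip[ov:]])
        else:
            mosaic = np.vstack([mosaic, strip])
    return mosaic
```

In the full method, each per-pair homography would warp the next strip before pasting, handling rotation and scale as well as translation.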
 The CPU 38 performs an operation for generating, instead of the display image 302 shown in FIG. 9, a display image 303 as shown in FIG. 20, for example, and outputting it to the display unit 35 (step S11 of FIG. 10). FIG. 20 is a diagram showing an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
 The display image 303 includes insertion distance information 204, a rectangular wide-area developed image 205, and a live image display button 206. The insertion distance information 204 of the display image 303 includes a character string for notifying the user of the insertion distance of the insertion portion 5 at the time when the measurement by the CPU 38 was stopped.
 The live image display button 206 is configured as a GUI button capable of issuing, for example in response to a user's touch operation, an instruction to cancel the display of the wide-area developed image 205 and to display the endoscopic image 201 live.
 The CPU 38 performs determination processing as to whether or not to end the display of the wide-area developed image (step S12 of FIG. 10).
 The CPU 38 stands by while performing an operation for outputting the display image 303 to the display unit 35 until it detects an instruction corresponding to the pressing of the live image display button 206 (S12: NO).
 When the CPU 38 detects that an instruction corresponding to the pressing of the live image display button 206 has been given (S12: YES), it performs an operation for storing the wide-area developed image generated in step S11 of FIG. 10 in the external storage device 51, further performs an operation for displaying the display image 301 again instead of the display image 303, and then completes the series of processing for generating and displaying the wide-area developed image.
 In the present embodiment, the CPU 38 is not limited to performing the operation for generating the wide-area developed image and displaying it on the display unit 35 after the storage of the locally developed images in the storage unit 36 has been completed; for example, it may perform an operation for generating the wide-area developed image and displaying it on the display unit 35 while storing the locally developed images in the storage unit 36.
 Specifically, for example, as shown in the display image 304a of FIG. 21, when the CPU 38 acquires the latest locally developed image satisfying the determination condition of step S4 or step S7 of FIG. 10, it may, while performing the operation for storing that latest locally developed image in the storage unit 36, stitch together the one or more locally developed images already stored in the storage unit 36 and the latest locally developed image to generate a wide-area developed image 211, and perform an operation for displaying the generated wide-area developed image 211 on the display unit 35. FIG. 21 is a diagram showing an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
 The display image 304a includes an endoscopic image 201, a developed-image-acquisition end button 203, and insertion distance information 204 substantially the same as those included in the display image 302 of FIG. 9. According to the display image 304a, each time a locally developed image satisfying the determination condition of step S4 or step S7 of FIG. 10 is acquired, the wide-area developed image 211 with that locally developed image stitched in is displayed on the display unit 35. Also, according to the display image 304a, as indicated for example by the dash-dotted line in FIG. 21, the display area of the wide-area developed image 211 is gradually expanded each time a locally developed image satisfying the determination condition of step S4 or step S7 of FIG. 10 is acquired.
 Alternatively, for example, as shown in the display image 304b of FIG. 22, when the CPU 38 acquires the latest locally developed image satisfying the determination condition of step S4 or step S7 of FIG. 10, it may, while performing the operation for storing that latest locally developed image in the storage unit 36, perform an operation for displaying on the display unit 35 both the wide-area developed image 211 and a locally developed image 221 corresponding to the latest locally developed image. FIG. 22 is a diagram showing an example of a display image generated according to the operation of the endoscope apparatus according to the embodiment.
 The display image 304b includes a developed-image-acquisition end button 203 and insertion distance information 204 substantially the same as those included in the display image 302 of FIG. 9. According to the display image 304b, each time a locally developed image 221 satisfying the determination condition of step S4 or step S7 of FIG. 10 is acquired, the locally developed image 221 and the wide-area developed image 211 with the locally developed image 221 stitched in are displayed together on the display unit 35. Also, according to the display image 304b, as indicated for example by the dash-dotted line in FIG. 22, the display area of the wide-area developed image 211 is gradually expanded each time a locally developed image satisfying the determination condition of step S4 or step S7 of FIG. 10 is acquired.
 As described above, according to the present embodiment, the offset amount Δ is calculated based on the endoscopic image output from the image generation unit 34, and the locally developed image and the wide-area developed image are generated while performing magnification correction processing according to the calculated offset amount Δ. Therefore, according to the present embodiment, an accurate developed image can be generated without using, for example, a centering mechanism configured to allow the insertion portion 5 to be inserted into the pipe 101 while keeping the optical axis AC aligned with the central axis AS.
 Further, according to the present embodiment, since a developed image can be generated without providing the insertion portion 5 with a centering mechanism as described above, the internal state of a pipe as narrow as the outer diameter of the insertion portion 5 allows can, for example, be confirmed in a wide-area developed image.
 Further, according to the present embodiment, by repeatedly performing the operations of steps S5 to S7 of FIG. 10, the n-th locally developed image suitable for stitching to the (n−1)-th locally developed image can be reliably acquired. Therefore, according to the present embodiment, an accurate wide-area developed image can be generated even if, for example, the insertion state of the insertion portion 5 inserted into the pipe 101 is temporarily disturbed.
 The present invention is not limited to the embodiment described above, and it goes without saying that various modifications and applications are possible without departing from the spirit of the invention.
 This application claims priority on the basis of Japanese Patent Application No. 2017-10126, filed in Japan on January 24, 2017, the disclosure of which is incorporated into the specification and claims of the present application.

Claims (7)

  1.  An observation device comprising:
      an illumination unit arranged inside a cylindrical subject having a central axis and configured to emit illumination light toward a cylindrical illumination region centered on a predetermined optical axis;
      an imaging unit provided in the vicinity of the illumination unit, arranged inside the subject, and configured to image an object within a cylindrical imaging region centered on the predetermined optical axis and to output an imaging signal;
      an image generation unit configured to generate, based on the imaging signal, an annular image centered on the predetermined optical axis; and
      an image processing unit configured to calculate an offset amount of the predetermined optical axis with respect to the central axis based on an inner diameter of the subject and a value acquired from the annular image, and to generate a locally developed image by performing magnification correction processing that corrects, according to the offset amount, an imaging magnification in a direction parallel to the central axis in an image obtained by developing the annular image.
  2.  The observation device according to claim 1, wherein the image processing unit calculates the offset amount based on the inner diameter of the subject and on the width of a shadow region drawn in the annular image when a region of the imaging region outside the illumination region is imaged.
  3.  The observation device according to claim 1, wherein the image processing unit calculates the offset amount based on the inner diameter of the subject, the luminance value of a central pixel in a dark region drawn in the annular image, and the luminance value of a central pixel in a bright region drawn at a position facing the dark region.
  4.  The observation device according to claim 1, further comprising a determination unit configured to perform determination processing as to whether or not the locally developed image is suitable for stitching,
      wherein the image processing unit performs an operation for storing, in a storage unit in time series, the locally developed images for which the determination processing has yielded a determination result of being suitable for stitching.
  5.  The observation device according to claim 4, wherein, when ending the operation for storing in the storage unit in time series the locally developed images for which the determination processing has yielded a determination result of being suitable for stitching, the image processing unit generates one wide-area developed image by stitching together the locally developed images stored in the storage unit, and performs an operation for displaying the generated wide-area developed image on a display unit.
  6.  The observation device according to claim 4, wherein, while performing an operation for storing in the storage unit the latest locally developed image for which the determination processing has yielded a determination result of being suitable for stitching, the image processing unit generates one wide-area developed image by stitching together the one or more locally developed images already stored in the storage unit and the latest locally developed image, and performs an operation for displaying the generated wide-area developed image on a display unit.
  7.  The observation device according to claim 4, wherein, when the determination processing yields a determination result of not being suitable for stitching, the image processing unit performs the operation for generating the locally developed image again.
PCT/JP2017/041964 2017-01-24 2017-11-22 Observation device WO2018139025A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017010126 2017-01-24
JP2017-010126 2017-01-24

Publications (1)

Publication Number Publication Date
WO2018139025A1 true WO2018139025A1 (en) 2018-08-02

Family

ID=62979131

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/041964 WO2018139025A1 (en) 2017-01-24 2017-11-22 Observation device

Country Status (1)

Country Link
WO (1) WO2018139025A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03132590A (en) * 1989-10-18 1991-06-05 Okumura Corp Mine wall development image creation device
JP2012163619A (en) * 2011-02-03 2012-08-30 Olympus Corp Whole circumference observation optical system and whole circumference observation system including the same
JP2017015836A (en) * 2015-06-29 2017-01-19 オリンパス株式会社 Perimeter illuminating optical member, and perimeter illuminating optical system for endoscope and perimeter observing endoscope comprising the optical member


Similar Documents

Publication Publication Date Title
JP4500096B2 (en) Endoscope and endoscope system
JP5942044B2 (en) Endoscope system
WO2018051679A1 (en) Measurement assistance device, endoscope system, processor for endoscope system, and measurement assistance method
US20100128116A1 (en) Endoscope apparatus
JP5977912B1 (en) Endoscope system and endoscope video processor
JP5953443B2 (en) Endoscope system
JP2011206435A (en) Imaging device, imaging method, imaging program and endoscope
JP5889495B2 (en) Endoscope system
JP5231173B2 (en) Endoscope device for measurement and program
JP2022136184A (en) Control device, endoscope system and method of operating the control device
JP4885479B2 (en) Endoscope device for measurement and program for endoscope
JP6352673B2 (en) Endoscope apparatus and operation method of endoscope apparatus
JP2014228851A (en) Endoscope device, image acquisition method, and image acquisition program
JP5113990B2 (en) Endoscope device for measurement
WO2018139025A1 (en) Observation device
JP2019033971A (en) Endoscope apparatus
JPS6354378B2 (en)
JP6064092B2 (en) Endoscope system
CN107920189B (en) Panoramic endoscope device
KR102516406B1 (en) Method and apparatus for calibrating images obtained by confocal endoscopy
JP4776919B2 (en) Medical image processing device
JP2004275359A (en) Measuring endoscope apparatus
US20200306002A1 (en) Medical observation control device and medical observation system
TWI607734B (en) Panoramic endoscope device
JPH02297515A (en) Stereoscopic electronic endoscope

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17894394

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17894394

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP

点击 这是indexloc提供的php浏览器服务,不要输入任何密码和下载