WO2018154871A1 - Observation device, observation system, and method for controlling observation device - Google Patents
- Publication number
- WO2018154871A1 (PCT/JP2017/040772)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- observation
- optical system
- imaging
- image
- imaging optical
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 149
- 230000003287 optical effect Effects 0.000 claims abstract description 336
- 238000003384 imaging method Methods 0.000 claims abstract description 320
- 230000007246 mechanism Effects 0.000 claims abstract description 30
- 230000008859 change Effects 0.000 claims description 21
- 230000010365 information processing Effects 0.000 claims description 9
- 230000008569 process Effects 0.000 description 135
- 210000004027 cell Anatomy 0.000 description 93
- 238000012545 processing Methods 0.000 description 79
- 239000000523 sample Substances 0.000 description 51
- 238000004891 communication Methods 0.000 description 46
- 238000005259 measurement Methods 0.000 description 32
- 238000010586 diagram Methods 0.000 description 22
- 238000005286 illumination Methods 0.000 description 20
- 230000005540 biological transmission Effects 0.000 description 15
- 230000006870 function Effects 0.000 description 15
- 239000011295 pitch Substances 0.000 description 15
- 238000006243 chemical reaction Methods 0.000 description 8
- 238000007689 inspection Methods 0.000 description 8
- 239000002609 medium Substances 0.000 description 6
- 230000015654 memory Effects 0.000 description 6
- 238000007781 pre-processing Methods 0.000 description 6
- 238000004458 analytical method Methods 0.000 description 4
- 239000000470 constituent Substances 0.000 description 4
- 230000007547 defect Effects 0.000 description 4
- 238000005516 engineering process Methods 0.000 description 4
- 238000013473 artificial intelligence Methods 0.000 description 3
- 230000015572 biosynthetic process Effects 0.000 description 3
- 238000011109 contamination Methods 0.000 description 3
- 238000003786 synthesis reaction Methods 0.000 description 3
- 230000004913 activation Effects 0.000 description 2
- 239000012472 biological sample Substances 0.000 description 2
- 238000004364 calculation method Methods 0.000 description 2
- 238000004113 cell culture Methods 0.000 description 2
- 210000004748 cultured cell Anatomy 0.000 description 2
- 238000012258 culturing Methods 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 239000001963 growth medium Substances 0.000 description 2
- 238000010191 image analysis Methods 0.000 description 2
- 239000007788 liquid Substances 0.000 description 2
- 210000001747 pupil Anatomy 0.000 description 2
- 241000233866 Fungi Species 0.000 description 1
- 230000005856 abnormality Effects 0.000 description 1
- 239000000853 adhesive Substances 0.000 description 1
- 230000001070 adhesive effect Effects 0.000 description 1
- 238000004378 air conditioning Methods 0.000 description 1
- 230000002301 combined effect Effects 0.000 description 1
- 239000002131 composite material Substances 0.000 description 1
- 230000007423 decrease Effects 0.000 description 1
- 230000003247 decreasing effect Effects 0.000 description 1
- 230000002950 deficient Effects 0.000 description 1
- 239000000284 extract Substances 0.000 description 1
- 238000007667 floating Methods 0.000 description 1
- 239000011521 glass Substances 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 239000000203 mixture Substances 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000002093 peripheral effect Effects 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 230000035939 shock Effects 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
- 230000002194 synthesizing effect Effects 0.000 description 1
- 230000002123 temporal effect Effects 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
Definitions
- the present invention relates to an observation apparatus, an observation system, and an observation apparatus control method.
- Japanese Patent Application Laid-Open No. 2005-295818 discloses a technique related to a cell culture device that takes an image of the surface of a culture vessel while changing the relative position between the imaging device having a magnifying optical system and the culture vessel.
- Japanese Unexamined Patent Application Publication No. 2014-238558 discloses a technique related to an imaging apparatus that acquires the position of a subject of interest in the optical axis direction based on the parallax between images acquired by two cameras.
- An imaging device that images the surface of a culture vessel while changing the relative position between the imaging device and the culture vessel, and an imaging device that acquires the position of a subject of interest in the optical axis direction based on the parallax between images, are subject to different optical limitations.
- An object of the present invention is to provide an observation apparatus, an observation system, and an observation apparatus control method capable of acquiring depth information of a subject of interest.
- In one aspect, the observation apparatus includes an imaging unit including a first imaging optical system that is a subject-side non-telecentric optical system in which the angle formed between any principal ray on the subject side and the optical axis is 6° or more, the imaging unit imaging a sample using the first imaging optical system to acquire a first image; a moving mechanism that changes the relative position between the sample and the imaging unit; and a three-dimensional information acquisition unit that acquires three-dimensional information about the sample based on a plurality of the first images acquired at different positions of the imaging unit and information on the respective imaging positions at the time of acquisition of the first images.
- the observation system includes the observation device and a controller that acquires a user operation result, outputs the operation result to the observation device, and acquires the observation result of the observation device.
- In one aspect, the observation apparatus control method includes: imaging a sample using a first imaging optical system included in an imaging unit, the first imaging optical system being a magnifying optical system and a subject-side non-telecentric optical system in which the angle formed between any principal ray on the subject side and the optical axis is 6° or more, to acquire a first image; changing the relative position between the sample and the imaging unit; and acquiring three-dimensional information about the sample based on a plurality of the first images acquired at different positions of the imaging unit and information on the respective imaging positions at the time of acquisition of the first images.
- According to the present invention, it is possible to provide an observation apparatus, an observation system, and an observation apparatus control method capable of acquiring depth information of a subject of interest.
- FIG. 1 is a schematic diagram illustrating an example of an outline of the appearance of the observation system according to the first embodiment.
- FIG. 2 is a block diagram illustrating an outline of a configuration example of the observation system according to the first embodiment.
- FIG. 3 is a schematic diagram illustrating an example of a positional relationship between an imaging unit including the second imaging optical system according to the first embodiment and a sample.
- FIG. 4 is a schematic diagram illustrating another example of the positional relationship between the imaging unit including the second imaging optical system according to the first embodiment and the sample.
- FIG. 5 is a schematic diagram illustrating an example of a positional relationship between an imaging unit including the first imaging optical system according to the first embodiment and a sample.
- FIG. 6 is a schematic diagram illustrating another example of the positional relationship between the imaging unit including the first imaging optical system according to the first embodiment and the sample.
- FIG. 7 is a flowchart illustrating an example of the observation apparatus control process according to the first embodiment.
- FIG. 8 is a flowchart of an example of the count scan process according to the first embodiment.
- FIG. 9 is a schematic diagram illustrating an example of count scan processing information according to the first embodiment.
- FIG. 10 is a schematic diagram illustrating an example of a movement pattern of the image acquisition unit in the count scan processing according to the first embodiment.
- FIG. 11 is a flowchart illustrating an example of the 3D scan processing according to the first embodiment.
- FIG. 12 is a schematic diagram illustrating an example of 3D scan processing information according to the first embodiment.
- FIG. 13 is a schematic diagram illustrating an example of a movement pattern of the image acquisition unit in the 3D scan processing according to the first embodiment.
- FIG. 14 is a flowchart illustrating an example of a controller control process according to the first embodiment.
- FIG. 15 is a schematic diagram illustrating an example of a positional relationship between an imaging unit including the first imaging optical system according to the second embodiment and a sample.
- FIG. 16 is a schematic diagram illustrating an example of a side observation image according to the second embodiment.
- FIG. 17 is a flowchart illustrating an example of a side observation process according to the second embodiment.
- the observation system according to the present embodiment is a system for photographing a cell, a cell group, a tissue or the like in culture and recording the number, form, etc. of the cell or cell group.
- An example of an outline of the appearance of the observation system 1 is shown in FIG. 1 as a schematic diagram, and an outline of a configuration example of the observation system 1 is shown in FIG. 2 as a block diagram, and the configuration of the observation system 1 will be described with reference to these figures.
- the observation system 1 includes an observation device 100 and a controller 200.
- the observation apparatus 100 has a substantially flat plate shape.
- a sample 300 to be observed is arranged on the upper surface of the observation apparatus 100, and the observation apparatus 100 and the sample 300 are installed in, for example, an incubator.
- Here, an X axis and a Y axis that are orthogonal to each other are defined in a plane parallel to the surface of the observation apparatus 100 on which the sample 300 is arranged, and a Z axis (observation axis) is defined orthogonal to the X axis and the Y axis.
- the observation apparatus 100 includes a housing 101, a transparent plate 102, an image acquisition unit 150, and a moving mechanism 160.
- a transparent plate 102 is disposed on the upper surface of the housing 101.
- the image acquisition unit 150 is provided inside the housing 101 and includes an imaging unit 151 and an illumination unit 155.
- the imaging unit 151 includes an imaging optical system 152 and an imaging element 153.
- the imaging unit 151 generates image data based on an image (subject image) formed on the imaging surface of the imaging element 153 via the imaging optical system 152.
- the image acquisition unit 150 is moved by the moving mechanism 160 to change the relative position with the sample 300.
- the image acquisition unit 150 illuminates the sample 300 through the transparent plate 102 while being moved, and photographs the sample 300 to acquire an image of the sample 300.
- the controller 200 is installed outside the incubator, for example.
- the observation apparatus 100 and the controller 200 communicate with each other.
- the controller 200 controls the operation of the observation apparatus 100.
- the shooting position in the Z-axis direction may be changed by the moving mechanism 160, or may be changed by changing the in-focus position of the imaging optical system 152.
- the imaging optical system 152 is preferably a zoom optical system that can change the focal length.
- the observation apparatus 100 performs a 3D scan process for acquiring depth information of the subject of interest, and a count scan process for acquiring the size, number, and the like of the subject of interest.
- the imaging optical system 152 includes a first imaging optical system that is an optical system having non-telecentricity at least on the subject side.
- An image captured using the first imaging optical system is referred to as a first image.
- the observation apparatus 100 acquires the first image while moving the image acquisition unit 150 including the first imaging optical system.
- The observation apparatus 100 acquires, as depth information, the position of the subject of interest included in the sample 300 in the optical axis direction (observation axis direction) of the first imaging optical system, based on the parallax between images captured at different positions.
- the depth information includes the position in the optical axis direction of the first imaging optical system related to the subject of interest, a three-dimensional model (3D model), and the like.
- The imaging optical system 152 further includes a second imaging optical system that is an optical system having telecentricity at least on the subject side.
- an image captured using the second imaging optical system is referred to as a second image.
- the observation apparatus 100 acquires the second image while moving the image acquisition unit 150 including the second imaging optical system.
- The observation apparatus 100 combines second images captured at different positions to acquire a wide-area, high-pixel-count image, as if a wide area had been captured at once.
- the observation apparatus 100 acquires the size, number, and the like of the subject of interest based on the second image or the high pixel image.
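- As an illustration of how such a count might be obtained from a second image or a stitched wide-area image, the following is a minimal sketch using simple thresholding and connected-component labeling; the function name, threshold, and minimum area are assumptions for illustration only, not the method specified in this disclosure.

```python
import numpy as np
from scipy import ndimage

def count_cells(image: np.ndarray, threshold: float = 0.5, min_area: int = 20) -> int:
    """Roughly count cell-like blobs in a grayscale image normalized to [0, 1]."""
    mask = image > threshold                       # crude foreground/background split
    labels, num = ndimage.label(mask)              # connected-component labeling
    if num == 0:
        return 0
    areas = ndimage.sum(mask, labels, index=np.arange(1, num + 1))
    return int(np.count_nonzero(areas >= min_area))  # ignore tiny specks

# usage example (random data standing in for a second image)
print(count_cells(np.random.rand(512, 512)))
```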
- the imaging optical system 152 includes a diaphragm 152a and a plurality of lenses including at least an objective lens 152b and an imaging lens 152c.
- the focal length on the image sensor 153 side of the imaging lens 152c is defined as a focal length Ft.
- FIG. 3 An example of the positional relationship between the imaging unit 151 including the second imaging optical system according to the present embodiment and the sample 300 is shown in FIG. 3 as a schematic diagram.
- the sample 300 includes a container 310 and a cell 324 that is a subject of interest.
- A case where the second imaging optical system is a double-sided telecentric optical system is described here as an example.
- As shown in FIG. 3, the second imaging optical system is an optical system having telecentricity on both the subject side and the image side, obtained by making the position of the stop 152a and the exit-side focal point of the objective lens 152b coincide with the incident-side focal point of the imaging lens 152c.
- a principal ray that has entered parallel to the optical axis of the optical system (a ray passing through the center of the stop 152a) is emitted in parallel to the optical axis of the optical system.
- the point P1 on the cell 324 is located on the optical axis of the second imaging optical system.
- Of the rays emitted from the point P1, the ray (principal ray) that can pass through the center of the stop 152a in the second imaging optical system and enter the imaging element 153 is the ray R1.
- The ray R1 enters the objective lens 152b parallel to the optical axis of the second imaging optical system, passes through the center of the stop 152a and through the imaging lens 152c, and enters the point Q1 located on the optical axis of the second imaging optical system on the imaging element 153.
- FIG. 4 is a schematic diagram showing the state after the relative position between the imaging unit 151 and the sample 300 has been changed.
- In this state, of the rays emitted from the point P1, the ray (principal ray) that can pass through the center of the stop 152a and enter the imaging element 153 through the second imaging optical system is the ray R1′.
- The ray R1′ enters the objective lens 152b parallel to the optical axis of the second imaging optical system.
- If the mechanism configuration allows tilting, a tilting movement may also be applied with respect to the optical axis.
- The point on the imaging element 153 on which the ray R1′ is incident is referred to as Q1′.
- The amount of change in image position on the imaging surface of the imaging element 153 (second image movement amount) is the distance between the point Q1 and the point Q1′. Further, in the state shown in FIG. 4, the point that lies on the optical axis of the second imaging optical system and whose position in the optical axis direction is equal to that of the point P1 is defined as the point P11.
- FIG. 5 is a schematic diagram showing an example of the positional relationship between the imaging unit 151 including the first imaging optical system according to the present embodiment and the sample 300. As in FIGS. 3 and 4, the sample 300 includes a container 310 and a cell 324 that is a subject of interest.
- A case where the first imaging optical system is a subject-side non-telecentric optical system (image-side telecentric optical system) is described here as an example.
- As shown in FIG. 5, the first imaging optical system is an optical system obtained from the double-sided telecentric optical system shown in FIG. 3 by moving the objective lens 152b toward the stop 152a along the optical axis of the first imaging optical system, so that the telecentricity is broken only on the subject side.
- In such an optical system, a ray (principal ray) that enters the optical system, is deflected by the objective lens 152b, and then passes through the center of the stop 152a is emitted parallel to the optical axis of the optical system. Since the position of the stop 152a and the exit-side focal point of the objective lens 152b no longer coincide, an off-axis principal ray (shown by a broken line in FIG. 5) is inclined with respect to the optical axis between the subject of interest and the optical system.
- In the state shown in FIG. 5, the point P1 on the cell 324 lies on the optical axis of the first imaging optical system. Therefore, of the rays emitted from the point P1, the ray (principal ray) that can pass through the center of the stop 152a of the first imaging optical system and enter the imaging element 153 is the ray R2, which travels along the optical axis of the first imaging optical system and enters the point Q2 on the imaging element 153.
- FIG. 6 is a schematic diagram showing the state after the relative position between the imaging unit 151 and the sample 300 has been changed.
- In this state, of the rays emitted from the point P1, the principal ray is the ray R2′, which is incident on the objective lens 152b with an inclination of angle θ with respect to the optical axis. The ray R2′ is then deflected by the objective lens 152b, passes through the center of the stop 152a, and enters the point Q2′ on the imaging element 153.
- The amount of change in image position on the imaging surface of the imaging element 153 (first image movement amount ΔX) is the distance between the point Q2 and the point Q2′.
- The point that lies on the optical axis of the first imaging optical system and whose position in the optical axis direction is equal to that of the point P1 is defined as the point P12.
- In the second imaging optical system, each principal ray incident on the imaging element 153 enters the optical system parallel to the optical axis of the second imaging optical system. Therefore, when the relative position between the subject of interest and the optical axis of the second imaging optical system changes, there is no change (parallax) in the direction in which the point P1 is viewed from the entrance side of the second imaging optical system.
- This applies, for example, when the optical axis position (imaging position) of the second imaging optical system moves from the position X1 to the position X1′.
- Accordingly, the second image movement amount is equal to the interval between the imaging positions at which the second images are acquired (first X movement amount ΔX1) multiplied by the magnification of the second imaging optical system. Even if the subject has unevenness, its appearance on the imaging surface does not change. In other words, the second image movement amount does not include any image movement caused by parallax.
- In addition, in the second imaging optical system, the pupil is at infinity when viewed from the subject side, so an image (second image) is obtained as if the subject were viewed from far away.
- Moreover, even when the imaging position changes, the optical path of light that enters the second imaging optical system parallel to the optical axis of the second imaging optical system does not change. Therefore, when a plurality of second images acquired while changing the imaging position are combined to synthesize a wide-area, high-pixel-count image, the joints between the images are easy to process, as in the sketch below. Needless to say, based on such a wide-area image, it is easy to count the number of cells present in the imaging region and to compare cell sizes.
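- Because the second image movement amount is simply the imaging-position interval multiplied by the magnification, the pixel offset between neighboring second images can be computed directly from the movement pitch. The following is a minimal sketch of such deterministic stitching; the magnification and pixel-pitch values are assumed for illustration and are not taken from this disclosure.

```python
import numpy as np

def paste_tiles(tiles, step_um: float, magnification: float, pixel_pitch_um: float):
    """Stitch a row of telecentric (second) images whose imaging positions are step_um apart.

    With subject-side telecentricity the image offset is deterministic:
    offset_px = step_um * magnification / pixel_pitch_um, independent of the
    depth of the subject, so no parallax correction is needed at the joints.
    """
    offset_px = int(round(step_um * magnification / pixel_pitch_um))
    h, w = tiles[0].shape
    mosaic = np.zeros((h, w + offset_px * (len(tiles) - 1)), dtype=tiles[0].dtype)
    for i, tile in enumerate(tiles):
        x0 = i * offset_px
        mosaic[:, x0:x0 + w] = tile        # place each tile at its known offset
    return mosaic

# usage example with three synthetic tiles (assumed 2x magnification, 3.45 um pixels)
wide = paste_tiles([np.random.rand(480, 640) for _ in range(3)], 500.0, 2.0, 3.45)
```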
- the observation apparatus 100 performs imaging using the first imaging optical system when it is desired to acquire depth information.
- In the first imaging optical system, on the other hand, an off-axis principal ray on the subject side is inclined with respect to the optical axis. Therefore, when the relative position between the subject of interest and the optical axis changes, for example when the optical axis position (imaging position) of the first imaging optical system moves from the position X2 to the position X2′, there is a change (parallax) in the direction in which the point P1 is viewed from the entrance side of the first imaging optical system.
- The first image movement amount ΔX is equal to the interval between the imaging positions at which the first images are acquired (second X movement amount ΔX2) multiplied by the magnification of the first imaging optical system.
- However, the magnification on the imaging surface differs for a subject located in front of or behind the subject-side focal plane of the objective lens (the plane that contains P1 and is perpendicular to the optical axis).
- Specifically, the magnification on the imaging surface decreases as the distance from the objective lens 152b increases. That is, the ratio between the second X movement amount ΔX2 and the first image movement amount ΔX changes depending on the depth of the point P1. In other words, the first image movement amount ΔX includes an image movement component caused by parallax.
- The observation apparatus 100 can therefore acquire depth information about a subject of interest such as the cell 324 using the principle of triangulation based on, for example, the first image movement amount ΔX.
- The depth information includes, for example, three-dimensional information and three-dimensional images including information such as thickness and unevenness. Since imaging using the first imaging optical system provides such a distance distribution as well as image information at each depth, the subject can be examined using a variety of information.
- In this way, it is possible to provide an observation apparatus including the imaging unit 151 that acquires images of the sample 300 using an observation optical system, the moving mechanism 160 that changes the relative position between the sample 300 and the imaging unit 151, and a three-dimensional information acquisition unit that acquires three-dimensional information about the sample 300 based on a plurality of images acquired at different positions of the imaging unit 151 and information on the respective imaging positions at the time of acquisition of those images.
- The observation optical system only needs to have an imaging function; magnification may instead be performed electronically.
- In FIG. 6, the triangle formed by the point S0 indicating the center of the stop 152a, the point P1, and the point P12 is similar to the triangle formed by the point S0, the point K1 on the imaging lens 152c through which the ray R2′ passes, and the point K0 at which the optical axis intersects the imaging lens 152c.
- The distance between the point K1 and the point K0 is equal to the first image movement amount ΔX.
- The first image movement amount ΔX is known, because the corresponding point is detected by image processing based on at least two first images and ΔX is calculated as the movement amount of the corresponding point on the image plane.
- The distance between the point S0 and the point K0 is the incident-side focal length Fo of the imaging lens 152c and is known.
- The distance between the point P1 and the point P12 is the distance between the optical axis positions (second X movement amount ΔX2) and is known.
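- Under this similar-triangle construction, a plausible reading of the relation is ΔX2 / Z = ΔX / Fo, so that the depth Z from the point S0 to the plane containing P1 is Z = Fo · ΔX2 / ΔX. The following is a minimal sketch of that calculation; the formula and variable names follow this simplified reading of the geometry in FIG. 6 and are illustrative only.

```python
def depth_from_parallax(delta_x_image: float, delta_x2_stage: float, fo: float) -> float:
    """Depth of the point of interest from the stop center S0 (same units as fo).

    delta_x_image  : first image movement amount (delta X) measured on the sensor
    delta_x2_stage : interval between the imaging positions (delta X2), the baseline
    fo             : incident-side focal length of the imaging lens 152c
    Similar triangles S0-P1-P12 and S0-K1-K0 give delta X2 / Z = delta X / Fo.
    """
    if delta_x_image == 0:
        raise ValueError("no parallax measured for this corresponding point")
    return fo * delta_x2_stage / delta_x_image

# usage example with assumed values in millimetres
print(depth_from_parallax(delta_x_image=0.02, delta_x2_stage=0.5, fo=20.0))  # -> 500.0
```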
- the second imaging optical system shown in FIGS. 3 and 4 and the first imaging optical system shown in FIGS. 5 and 6 are shown as optical systems having telecentricity on the image side, respectively.
- However, the observation apparatus 100 switches the imaging optical system 152 between the first imaging optical system and the second imaging optical system in order to switch whether or not an image acquired by imaging includes depth information of the cell 324. Therefore, the same effect can be obtained even if the first imaging optical system or the second imaging optical system is an optical system having non-telecentricity on the image side.
- Since the first imaging optical system is an optical system having non-telecentricity on the subject side, when the first imaging optical system is moved in its optical axis direction in order to focus, the corresponding point on the imaging element 153 moves.
- As an example, consider the case where the first imaging optical system and the imaging element 153 are moved together away from the cell 324 (in the Z− direction) from the state shown in FIG. 6.
- In this case, the angle formed by the straight line passing through the points P1 and S0 and the straight line passing through the points P12 and S0 becomes smaller. Therefore, the point K1 moves toward the optical axis, and the point Q2′ also moves toward the optical axis. In this way, even when the first imaging optical system is moved in its optical axis direction for focusing or the like while imaging is repeated during movement, the movement of the corresponding point can be determined, and it is clearly still possible to acquire depth information.
- The first image movement amount ΔX is acquired for a plurality of corresponding points included in the first images, so that depth information over the surface of the cell 324 is acquired.
- In the above description, one of the first images used for obtaining depth information is captured with the point P1 positioned on the optical axis of the first imaging optical system.
- the present invention is not limited to this.
- points that are not located on the optical axis at the time of each imaging may be used as corresponding points.
- When the observation apparatus 100 acquires depth information using the first imaging optical system, the depth information includes depth information of the surface of the subject of interest, which makes it easy to obtain a 3D model of the subject of interest.
- An optical system may be regarded as a subject-side telecentric optical system as long as the angle formed between any principal ray on the subject side and the optical axis is 4° or less.
- Conversely, the subject-side non-telecentric optical system may be regarded as an optical system in which the angle formed between any principal ray on the subject side and the optical axis (angle θ in FIG. 6) is 6° or more.
- With such an optical system, sufficient parallax can be obtained between the plurality of first images.
- It is more preferable that the angle formed between any principal ray on the subject side and the optical axis (angle θ in FIG. 6) is 20° or more.
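- The angle thresholds above (4° or less treated as subject-side telecentric, 6° or more treated as subject-side non-telecentric, 20° or more preferred) can be summarized by a small helper, shown below purely as an illustration of the stated ranges.

```python
def classify_subject_side(max_chief_ray_angle_deg: float) -> str:
    """Classify an optical system by its largest subject-side chief-ray angle (degrees)."""
    if max_chief_ray_angle_deg <= 4.0:
        return "may be regarded as subject-side telecentric"
    if max_chief_ray_angle_deg >= 20.0:
        return "subject-side non-telecentric (preferred range)"
    if max_chief_ray_angle_deg >= 6.0:
        return "subject-side non-telecentric (sufficient parallax)"
    return "between the telecentric and non-telecentric thresholds"

for angle in (3.0, 5.0, 10.0, 25.0):
    print(angle, "->", classify_subject_side(angle))
```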
- a sample 300 that is a measurement target of the observation system 1 is, for example, as follows.
- the sample 300 includes, for example, a container 310, a culture medium 322, cells 324, and a reflection plate 360.
- a medium 322 is placed in the container 310, and cells 324 are cultured in the medium 322.
- the container 310 can be, for example, a petri dish, a culture flask, a multiwell plate, or the like.
- the container 310 is a culture container for culturing a biological sample, for example.
- the shape, size, etc. of the container 310 are not limited.
- the medium 322 may be a liquid medium or a solid medium.
- the measurement object is, for example, the cell 324, but this may be an adhesive cell or a floating cell.
- the cell 324 may be a spheroid or a tissue.
- the cell 324 may be derived from any organism, and may be a fungus or the like.
- the sample 300 includes a biological sample that is a living organism or a sample derived from a living organism.
- the reflection plate 360 is for illuminating the cells 324 by reflecting the illumination light incident on the sample 300 via the transparent plate 102, and is disposed on the upper surface of the container 310.
- the transparent plate 102 disposed on the upper surface of the casing 101 of the observation apparatus 100 is made of, for example, glass.
- the observation apparatus 100 is in a state in which the inside is sealed by a member including, for example, a housing 101 and a transparent plate 102.
- the sample 300 is placed on the transparent plate 102.
- FIG. 1 shows an example in which the entire upper surface of the housing 101 is formed of a transparent plate, but the observation apparatus 100 may instead be provided with a transparent plate on only a part of the upper surface of the housing 101, with the other part of the upper surface configured to be opaque.
- Transparent here means transparent with respect to the wavelength of the illumination light.
- the moving mechanism 160 includes a support portion 165, an X feed screw 161 for moving the support portion 165 in the X-axis direction, and an X actuator 162.
- the moving mechanism 160 further includes a Y feed screw 163 and a Y actuator 164 for moving the support portion 165 in the Y-axis direction.
- the moving mechanism 160 may include a Z feed screw and a Z actuator for moving the support portion 165 in the Z-axis direction.
- Here, the direction in which the support portion 165 moves away from the X actuator 162 is defined as the positive X direction (X+ direction),
- the direction in which it moves away from the Y actuator 164 is defined as the positive Y direction (Y+ direction),
- and the direction from the support portion 165 toward the sample 300 is defined as the positive Z direction (Z+ direction).
- the illumination unit 155 included in the image acquisition unit 150 is provided on the support unit 165 included in the moving mechanism 160.
- An imaging unit 151 is provided in the vicinity of the illumination unit 155.
- the illumination unit 155 includes an illumination optical system 156 and a light source 157.
- the illumination light emitted from the light source 157 is irradiated onto the sample 300 via the illumination optical system 156.
- the light source 157 includes, for example, an LED.
- the imaging unit 151 further includes a lens switching unit 154.
- the lens switching unit 154 drives a lens included in the imaging optical system 152 in the optical axis direction so that the imaging optical system 152 becomes the first imaging optical system or the second imaging optical system.
- When, for example, a count scan process that counts the number of cells 324 or an imaging process that acquires a wide-area, high-pixel-count image by combining a plurality of images acquired at a plurality of positions is performed, the lens switching unit 154 according to the present embodiment sets the imaging optical system 152 to the second imaging optical system.
- When, for example, a 3D scan process for acquiring depth information of the cell 324 or an imaging process for acquiring a three-dimensional image of the cell 324 is performed, the lens switching unit 154 sets the imaging optical system 152 to the first imaging optical system.
- the observation apparatus 100 causes the lens switching unit 154 to switch the imaging optical system 152 according to the type of observation, and causes the moving mechanism 160 to change the position of the image acquisition unit 150 in the X direction and the Y direction.
- That is, the sample 300 is repeatedly photographed while the optical axis of the imaging optical system 152 is shifted in these directions while being kept parallel to the observation axis, and a plurality of images is acquired.
- the observation apparatus 100 further includes an observation side recording circuit 130.
- the observation-side recording circuit 130 records, for example, programs and various parameters used in each unit included in the observation apparatus 100 and data obtained by the observation apparatus 100.
- the observation-side recording circuit 130 temporarily records various data such as image data (pixel data), image data for recording, image data for display, and processing data during operation.
- The observation-side recording circuit 130 also records, for example, the range of in-focus positions of the imaging optical system 152 in the optical axis direction as a focus position range.
- As the focus position range, for example, a value corresponding to the size of the sample 300 or the like is set in advance, or the range is set by user input.
- the observation apparatus 100 further includes an image processing circuit 120.
- The image processing circuit 120 performs various image processing on the image data obtained by the imaging unit 151. Data after image processing by the image processing circuit 120 is, for example, recorded in the observation-side recording circuit 130 or transmitted to the controller 200. The image processing circuit 120 may also perform various analyses based on the obtained images. For example, the image processing circuit 120 acquires depth information of a cell 324 or a cell group included in the sample 300 based on the obtained first image. For example, the image processing circuit 120 extracts an image of a cell 324 or a cell group included in the sample 300 based on the obtained second image, or counts the number of cells or cell groups. The analysis results obtained in this way are also, for example, recorded in the observation-side recording circuit 130 or transmitted to the controller 200.
- the observation apparatus 100 further includes an observation side communication apparatus 140.
- As the observation-side communication device 140, wireless communication using, for example, Wi-Fi (registered trademark) or Bluetooth (registered trademark) is used.
- The observation apparatus 100 and the controller 200 may instead be connected to each other by wire and communicate over that connection, or may each be connected to an electrical communication line such as the Internet and communicate via such a line.
- the observation apparatus 100 further includes an observation-side control circuit 110 and a clock unit 172.
- the observation side control circuit 110 controls the operation of each unit included in the observation apparatus 100.
- the observation-side control circuit 110 acquires various information related to the operation of the observation apparatus 100, performs various determinations related to the operation of the observation apparatus 100, and notifies and alerts the user based on the determination result.
- The observation-side control circuit 110 has functions as a position control unit 111, an imaging control unit 112, an illumination control unit 113, a communication control unit 114, a recording control unit 115, a measurement control unit 116, and a distance conversion unit 117.
- the position control unit 111 controls the operation of the moving mechanism 160 and controls the position of the image acquisition unit 150.
- the imaging control unit 112 controls the operation of the imaging unit 151 included in the image acquisition unit 150 and causes the imaging unit 151 to acquire an image of the sample 300.
- the imaging control unit 112 includes a focus / exposure switching unit.
- the imaging control unit 112 performs focus adjustment by moving a focusing lens included in the imaging optical system 152 in the optical axis direction, for example.
- the focusing lens may be a lens having a variable focal length such as a liquid lens. A plurality of lenses with different focal points may be prepared for focusing. If the prepared lens is multi-lens, refocusing technology or the like can be used.
- the focus / exposure switching unit adjusts the exposure by controlling the operation of the diaphragm 152a, for example, and adjusts the zoom by controlling the operation of the lens in the optical axis direction.
- the illumination control unit 113 controls the operation of the illumination unit 155 included in the image acquisition unit 150.
- the communication control unit 114 manages communication with the controller 200 via the observation side communication device 140.
- the recording control unit 115 controls recording of data obtained by the observation apparatus 100.
- the measurement control unit 116 controls the entire measurement such as the timing and number of times of measurement.
- the distance conversion unit 117 acquires position information in the optical axis direction of the cell 324 that is the subject of interest, information on the unevenness of the cell 324, and the like as depth information based on the processing result of the image processing circuit 120, for example.
- the clock unit 172 generates time information and outputs it to the observation side control circuit 110. The time information is used for determination related to the operation of the observation apparatus 100 when recording acquired data, for example.
- The observation-side control circuit 110, the image processing circuit 120, the observation-side recording circuit 130, and the observation-side communication device 140 described above are provided inside the housing 101 as a circuit group 104, for example.
- the observation apparatus 100 includes functions as a three-dimensional information acquisition unit, a corresponding point acquisition unit, and a three-dimensional model generation unit.
- The three-dimensional information acquisition unit acquires three-dimensional information about the sample 300 based on the plurality of first images acquired at different positions of the imaging unit 151 and the information on the respective imaging positions at the time of acquisition of the first images.
- the three-dimensional information acquisition unit acquires information related to the unevenness of the sample 300, for example.
- the three-dimensional information acquisition unit acquires, for example, information related to each imaging position at the time of acquiring the first image from the distance conversion unit 117.
- the three-dimensional information includes depth information acquired by the distance conversion unit 117, for example.
- The corresponding point acquisition unit acquires the corresponding point based on the correlation between the plurality of first images acquired while the imaging unit 151 is moved by the moving mechanism 160, for example, and acquires the first image movement amount ΔX.
- the three-dimensional model generation unit constructs a three-dimensional model of the subject of interest based on the depth information acquired by the distance conversion unit 117, for example.
- the functions as the three-dimensional information acquisition unit, the corresponding point acquisition unit, and the three-dimensional model generation unit can be realized by the observation-side control circuit 110 and / or the image processing circuit 120, for example.
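- As one illustration of corresponding-point acquisition, the shift of a small patch between two first images can be found by normalized cross-correlation; the shift of the best match gives the first image movement amount ΔX in pixels. The sketch below uses OpenCV template matching as one possible implementation; the patch size and the specific functions are assumptions for illustration, not the method defined in this disclosure.

```python
import numpy as np
import cv2

def image_shift(img_a: np.ndarray, img_b: np.ndarray, center, patch: int = 32):
    """Return the (dx, dy) pixel shift of the patch around `center` from img_a to img_b.

    img_a and img_b should be single-channel uint8 or float32 images of equal size.
    """
    cx, cy = center
    template = img_a[cy - patch:cy + patch, cx - patch:cx + patch]
    scores = cv2.matchTemplate(img_b, template, cv2.TM_CCOEFF_NORMED)  # normalized correlation
    _, _, _, (best_x, best_y) = cv2.minMaxLoc(scores)   # top-left corner of the best match
    return (best_x + patch) - cx, (best_y + patch) - cy

# usage example: a synthetic 5-pixel horizontal shift
a = np.random.rand(200, 200).astype(np.float32)
b = np.roll(a, 5, axis=1)
print(image_shift(a, b, center=(100, 100)))  # approximately (5, 0)
```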
- In addition, the reliability of the apparatus is high.
- Its structure is also easy to handle and clean, and can prevent contamination and the like.
- the controller 200 is, for example, a personal computer (PC), a tablet information terminal, or the like.
- FIG. 1 illustrates a tablet information terminal.
- the controller 200 is provided with an input / output device 270 including a display device 272 such as a liquid crystal display and an input device 274 such as a touch panel.
- the input device 274 may include a switch, dial, keyboard, mouse, and the like in addition to the touch panel.
- controller 200 is provided with a controller side communication device 240.
- the controller side communication device 240 is a device for communicating with the observation side communication device 140.
- the observation apparatus 100 and the controller 200 communicate with each other via the observation side communication apparatus 140 and the controller side communication apparatus 240.
- the controller 200 includes a controller-side control circuit 210 and a controller-side recording circuit 230.
- the controller side control circuit 210 controls the operation of each part of the controller 200.
- the controller-side recording circuit 230 records, for example, programs used in the controller-side control circuit 210, various parameters, and data received from the observation apparatus 100.
- the controller-side control circuit 210 has functions as a system control unit 211, a display control unit 212, a recording control unit 213, a communication control unit 214, and a network cooperation unit 215.
- the controller-side control circuit 210 may further have functions as a corresponding point acquisition unit and a three-dimensional model generation unit.
- the system control unit 211 performs various calculations related to control for measurement of the sample 300.
- the display control unit 212 controls the operation of the display device 272.
- the display control unit 212 causes the display device 272 to display necessary information and the like.
- the recording control unit 213 controls information recording in the controller-side recording circuit 230.
- the communication control unit 214 controls communication with the observation device 100 via the controller side communication device 240.
- the network cooperation unit 215 controls the cooperation between the observation system 1 and a network server or the like outside the observation system 1 such as a cloud provided on a telecommunication line such as the Internet.
- The network cooperation unit 215 transmits, for example, observation results such as images acquired by the observation device 100, or observation results acquired by the controller 200, to a server on the network via the observation-side communication device 140 or the controller-side communication device 240.
- the network cooperation unit 215 causes the image processing circuit or the like included in the network server to perform processing such as cell count and depth information calculation based on the observation result, and acquires the result of the processing.
- the network cooperation unit 215 acquires information from an IoT device such as an incubator, an air conditioning facility, and a lighting facility connected to the Internet, and controls the IoT device.
- The observation-side control circuit 110, the image processing circuit 120, and the controller-side control circuit 210 each include an integrated circuit such as a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or a Field Programmable Gate Array (FPGA).
- the observation side control circuit 110, the image processing circuit 120, and the controller side control circuit 210 may each be configured by one integrated circuit or the like, or may be configured by combining a plurality of integrated circuits. Further, the observation side control circuit 110 and the image processing circuit 120 may be configured by one integrated circuit or the like.
- the position control unit 111, the imaging control unit 112, the illumination control unit 113, the communication control unit 114, the recording control unit 115, the measurement control unit 116, and the distance conversion unit 117 of the observation side control circuit 110 are each one integrated circuit. Etc., or a combination of a plurality of integrated circuits or the like. Further, two or more of the position control unit 111, the imaging control unit 112, the illumination control unit 113, the communication control unit 114, the recording control unit 115, the measurement control unit 116, and the distance conversion unit 117 are configured by one integrated circuit or the like. May be.
- The system control unit 211, the display control unit 212, the recording control unit 213, the communication control unit 214, and the network cooperation unit 215 of the controller-side control circuit 210 may each be configured by one integrated circuit or the like, or may be configured by combining a plurality of integrated circuits or the like. Two or more of the system control unit 211, the display control unit 212, the recording control unit 213, the communication control unit 214, and the network cooperation unit 215 may be configured by one integrated circuit or the like. These integrated circuits operate according to a program recorded, for example, in a recording area of the observation-side recording circuit 130, the controller-side recording circuit 230, or the integrated circuit itself.
- The observation-side recording circuit 130 and the controller-side recording circuit 230, or each of the elements included in them, are, for example, non-volatile memories such as flash memory, but are not limited to this and may further include volatile memory such as Static Random Access Memory (SRAM).
- The observation-side recording circuit 130 or each of its elements, and the controller-side recording circuit 230 or each of its elements, may each be configured by one memory or the like, or by combining a plurality of memories or the like. Further, a database or the like outside the observation system 1 may be used as part of the memory.
- In step S101, the observation-side control circuit 110 stands by until, for example, a signal output from the controller 200 in response to a user operation is received.
- In step S102, the observation-side control circuit 110 determines whether, for example, a power-ON signal for turning on the power of the observation apparatus 100 or a power-OFF signal for turning off the power of the observation apparatus 100 has been received from the controller 200. The process proceeds to step S103 if it is determined that a power ON/OFF signal has been received, and proceeds to step S104 if it is not determined that one has been received.
- In step S103, the observation-side control circuit 110 starts supplying power to each part of the observation apparatus 100 if it was determined in step S102 that a power-ON signal was received, and ends the supply of power to each part of the observation apparatus 100 if it was determined in step S102 that a power-OFF signal was received. Note that power continues to be supplied to the observation-side communication device 140 in either case in order to wait for communication. Thereafter, the process returns to step S101.
- The observation device 100 may include a low-power-consumption communication device such as Bluetooth Low Energy (BLE) for transmitting and receiving control signals and the like, and a high-speed communication device such as Wi-Fi for transmitting and receiving data such as observation results including images.
- the controller 200 when the power is turned on at step S103. That's fine.
- Although the power of the observation apparatus 100 is turned on or off here based on the power ON/OFF signal output from the controller 200, the present invention is not limited to this. For example, during cell culturing, observation such as imaging may be performed by turning the observation apparatus 100 on and off at a preset time interval, such as every minute.
- In step S104, the observation-side control circuit 110 determines whether or not control signals related to various settings have been received from the controller 200, for example. The process proceeds to step S105 if it is determined that a control signal related to various settings has been received, and returns to step S101 if it is not determined that one has been received.
- In step S105, the observation-side control circuit 110 configures each part of the observation device 100 according to the control signals related to various settings received by the observation-side communication device 140 in step S104.
- the information set here includes, for example, information related to an observation result such as an image acquired by the observation apparatus 100 or a transmission destination of the measurement result, imaging conditions, measurement conditions, and various parameters.
- the transmission destination of the observation result or measurement result acquired by the observation apparatus 100 is, for example, the observation-side recording circuit 130 of the observation apparatus 100, the controller-side recording circuit 230 of the controller 200, a data server on the network, or the like.
- When observation results or measurement results are transmitted to a cloud or the like constructed on the network in this way, not only is information sharing between different users facilitated, but analysis of acquired images and image processing can also be performed outside the observation system 1.
- In step S106, the observation-side control circuit 110 determines whether or not a control signal instructing execution of the count scan process has been received from the controller 200, for example. The process proceeds to step S107 if it is determined that a control signal instructing execution of the count scan process has been received, and proceeds to step S108 if it is not determined that one has been received.
- a measurement start time or the like is determined in advance, and measurement may be started at the determined measurement start time.
- In step S107, the observation-side control circuit 110 executes the count scan process and counts the number of cells 324. Details of the count scan process are described later. Thereafter, the process proceeds to step S108.
- In step S108, the observation-side control circuit 110 determines whether or not a control signal instructing execution of the 3D scan process has been received from the controller 200, for example. The process proceeds to step S109 if it is determined that a control signal instructing execution of the 3D scan process has been received, and proceeds to step S110 if it is not determined that one has been received.
- In step S109, the observation-side control circuit 110 executes the 3D scan process and acquires depth information such as three-dimensional information of the cell 324, for example along the lines of the sketch below. Details of the 3D scan process are described later. Thereafter, the process proceeds to step S110.
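- As a rough illustration only, the 3D scan in step S109 might be organized as follows: move the imaging position in fixed steps, capture a first image at each position, measure the per-point parallax between consecutive images, and convert it to depth with the triangulation relation given earlier. The helper names (`move_to`, `capture_first_image`, `image_shift`, `depth_from_parallax`) are hypothetical placeholders, not interfaces of the disclosed apparatus.

```python
def scan_3d(positions_um, tracked_points, fo_mm, pixel_pitch_um):
    """Build a sparse depth map {point: depth_mm} from a series of first images.

    positions_um   : imaging positions along X at which first images are taken
    tracked_points : (x, y) pixel coordinates of corresponding points to follow
    """
    depth_map = {}
    prev_image, prev_pos = None, None
    for pos in positions_um:
        move_to(pos)                        # hypothetical moving-mechanism call
        image = capture_first_image()       # hypothetical imaging-unit call
        if prev_image is not None:
            baseline_mm = (pos - prev_pos) / 1000.0            # delta X2 between shots
            for pt in tracked_points:
                dx_px, _ = image_shift(prev_image, image, pt)  # parallax in pixels
                if dx_px != 0:
                    dx_mm = dx_px * pixel_pitch_um / 1000.0    # delta X on the sensor
                    depth_map[pt] = depth_from_parallax(dx_mm, baseline_mm, fo_mm)
        prev_image, prev_pos = image, pos
    return depth_map
```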
- In step S110, the observation-side control circuit 110 determines whether or not to end the processing related to observation or measurement based on, for example, a control signal output by the controller 200 in response to a user operation. The process proceeds to step S111 if it is determined to end, and returns to step S104 if it is determined not to end.
- In step S111, the observation-side control circuit 110 determines whether or not a control signal requesting an observation result or a measurement result has been received from the controller 200, for example.
- The observation result or measurement result includes, for example, various data obtained by the observation apparatus 100, such as measured values, acquired images, shooting positions, and analysis results.
- the shooting position includes an X coordinate, a Y coordinate, and a Z coordinate of the shooting position.
- the X coordinate and the Y coordinate are values used in the control of the moving mechanism 160 and can be acquired from the position control unit 111, for example.
- the Z coordinate is a value used for controlling the imaging optical system 152, and can be acquired from, for example, the imaging control unit 112, the distance conversion unit 117, and the like.
- the process proceeds to step S112 if it is determined that a control signal requesting an observation result or a measurement result has been received, and returns to step S101 if it is not determined that it has been received.
- In step S112, the observation-side control circuit 110 transmits the results obtained by various observations and measurements, such as acquired images and the analysis results obtained by analyzing them, to the transmission destination set in step S105, for example. Thereafter, the process returns to step S101.
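- The overall branching of steps S101 to S112 can be summarized as a simple event loop; the sketch below mirrors the control flow described above, with invented message names and helper methods used purely for illustration.

```python
def observation_control_loop(recv, apparatus):
    """Simplified rendering of the observation-side control flow (steps S101-S112)."""
    while True:
        msg = recv()                                   # S101: wait for a signal from the controller
        if msg.kind == "power":                        # S102/S103: power ON/OFF
            apparatus.set_power(msg.on)
            continue
        if msg.kind == "settings":                     # S104/S105: apply settings
            apparatus.apply_settings(msg.payload)
            continue
        if msg.kind == "count_scan":                   # S106/S107: count scan process
            apparatus.run_count_scan()
        if msg.kind == "scan_3d":                      # S108/S109: 3D scan process
            apparatus.run_3d_scan()
        if msg.kind == "end" and msg.request_results:  # S110-S112: send results on request
            apparatus.send_results()
```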
- An example of the count scan process in step S107 of the observation apparatus control process is shown in FIG. 8 as a flowchart, and the operation of the observation system 1 during the count scan process is described with reference to it.
- In step S201, the observation-side control circuit 110 causes the lens switching unit 154 to set the imaging optical system 152 to the second imaging optical system.
- the second imaging optical system is an optical system having telecentricity on the subject side as described above. Thereafter, the process proceeds to step S202.
- In step S202, the observation-side control circuit 110 executes pre-processing for starting a count scan based on, for example, the count scan processing information recorded in the observation-side recording circuit 130.
- For example, the observation-side control circuit 110 causes the moving mechanism 160 to move the image acquisition unit 150 back to the XY start position of the count scan.
- the observation-side control circuit 110 controls the operations of the imaging optical system 152 and the imaging element 153 or the moving mechanism 160 so that the count scan can be started from the initial position in the Z direction. Thereafter, the observation side control circuit 110 starts a count scan.
- count scan processing information according to the present embodiment is shown in FIG. 9, and information recorded as count scan processing information will be described with reference to this.
- the information is set in advance, for example, or set in step S105 of the observation apparatus control process.
- The count scan processing information includes information CSP relating to the count scan pattern, information CSJ relating to execution of the count scan process, and information CSR obtained by the count scan process.
- The information CSP related to the count scan pattern includes, for example, a count scan start condition CSP1, a start position CSP2, an end condition CSP3, a first X movement pitch CSP5, a first Y movement pitch CSP6, a first X→Y condition CSP10 that is a condition for switching from movement in the X direction to movement in the Y direction, and a first Y→X condition CSP11 that is a condition for switching from movement in the Y direction to movement in the X direction.
- The first X movement pitch CSP5 is a movement (imaging) interval in the X direction, and the first Y movement pitch CSP6 is a movement (imaging) interval in the Y direction. The image acquisition unit 150 captures an image at each of these movement pitches and acquires a second image.
- The information CSJ related to the execution of the count scan process includes, for example, a first NG determination condition CSJ1, which is a determination condition for determining an observation failure, and a first retry determination condition CSJ2, which is a determination condition for determining whether or not to re-execute the count scan when an observation failure is determined based on the first NG determination condition CSJ1.
- The information CSR obtained by the count scan process is recorded in association with each image (second image) acquired using the second imaging optical system in the count scan process, for example. For example, a first result CSR1 includes a first frame CSR11, a first time CSR12 at which the first frame CSR11 was acquired, first AF information CSR13, and a first imaging condition CSR14. The imaging conditions include exposure conditions such as shutter speed and aperture, and other imaging conditions. The imaging conditions may differ for each image capture, may differ for each measurement, or may be common to all image captures. The information CSR may also include information on the position where each second image was acquired, the result of counting the number of cells 324, and the like.
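- As a concrete illustration only, the count scan processing information of FIG. 9 could be held in memory as in the following sketch; the field types and container layout are assumptions, while the CSP/CSJ/CSR identifiers follow the description above.

```python
# A minimal sketch of how the count scan processing information of FIG. 9 could be
# represented; types are illustrative assumptions, not part of the disclosure.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class CountScanPattern:                      # information CSP
    start_condition: str                     # CSP1
    start_position: Tuple[float, float]      # CSP2: (X, Y) start position
    end_condition: str                       # CSP3
    x_pitch: float                           # CSP5: imaging interval in X
    y_pitch: float                           # CSP6: imaging interval in Y
    x_to_y_condition: str                    # CSP10
    y_to_x_condition: str                    # CSP11

@dataclass
class CountScanJudgement:                    # information CSJ
    ng_condition: str                        # CSJ1
    retry_condition: str                     # CSJ2

@dataclass
class CountScanResult:                       # one entry of information CSR
    frame: bytes                             # CSR11: the acquired second image
    time: float                              # CSR12: acquisition time
    af_info: dict                            # CSR13
    imaging_condition: dict                  # CSR14

@dataclass
class CountScanInfo:
    pattern: CountScanPattern
    judgement: CountScanJudgement
    results: List[CountScanResult] = field(default_factory=list)
```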
- An example of the movement pattern of the image acquisition unit 150 in the count scan process according to the present embodiment is shown in FIG. 10 as a schematic diagram, and the movement of the image acquisition unit 150 during the count scan will be described with reference to this figure. Here, a case where the count scan process is executed while the image acquisition unit 150 is moved along the line CL1 shown in FIG. 10 will be described as an example.
- First, the observation-side control circuit 110 moves the image acquisition unit 150 to the start position CP1 and acquires a second image. Next, the observation-side control circuit 110 moves the image acquisition unit 150 in the Y direction by the first Y movement pitch and acquires a second image at the position after the movement. This image acquisition and movement of the image acquisition unit 150 are repeated until the observation-side control circuit 110 determines that a state corresponding to the first Y→X condition CSP11 has been reached, for example, that the image acquisition unit 150 is at the position indicated by the point CP2.
- Then, the observation-side control circuit 110 switches the direction in which the image acquisition unit 150 is moved from the Y direction to the X direction. After the movement direction is switched to the X direction, the observation-side control circuit 110 repeats the process of moving the image acquisition unit 150 by the first X movement pitch and acquiring a second image until it determines that a state corresponding to the first X→Y condition CSP10 has been reached, for example, that the image acquisition unit 150 is at the position indicated by the point CP3. In this way, the observation-side control circuit 110 continues the count scan process until it determines that the end condition CSP3 is satisfied, for example, when the image acquisition unit 150 reaches the point CP10.
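- For orientation only, the boustrophedon (serpentine) movement along such a line can be sketched as below. It is not the claimed control logic: the end condition and the X→Y / Y→X switch conditions are simplified to a fixed rectangular grid of imaging positions.

```python
# A sketch of the serpentine movement of FIG. 10 under simplifying assumptions:
# the scan covers a fixed grid, Y runs within each column, and the X step between
# columns plays the role of the Y->X / X->Y switching conditions.
def serpentine_positions(start_x, start_y, x_pitch, y_pitch, n_cols, n_rows):
    """Yield (x, y) imaging positions column by column, reversing Y every column."""
    for col in range(n_cols):
        rows = range(n_rows)
        if col % 2 == 1:                      # reverse direction on every other column
            rows = reversed(range(n_rows))
        for row in rows:
            yield (start_x + col * x_pitch, start_y + row * y_pitch)

# Example: imaging positions visited for a 3 x 4 grid with 1 mm pitches.
path = list(serpentine_positions(0.0, 0.0, 1.0, 1.0, n_cols=3, n_rows=4))
```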
- In step S203, the observation-side control circuit 110 determines the count scan state. In this determination, it is determined that the count scan needs to be performed again, for example, when an observation failure is detected by causing the image processing circuit 120 to analyze a captured image, or when an operation failure of the moving mechanism 160 is detected. The determination conditions are recorded in the observation-side recording circuit 130 as count scan processing information, for example. A specification is also conceivable in which the image acquired during the count scan is transmitted to the controller 200 and the user makes the determination based on a live view (LV) display performed by the controller 200. The process proceeds to step S204 when it is determined that re-execution of the count scan is necessary, and proceeds to step S205 otherwise.
- In step S204, the observation-side control circuit 110 generates a control signal for warning the user that an observation failure or the like has occurred during the count scan, or that the count scan needs to be re-executed, according to the determination result in step S203, and causes the control signal to be transmitted to the controller 200. Thereafter, the process returns to step S202. Note that the count scan performed after returning to step S202 may be performed again with the image acquisition unit 150 returned to the start position, or may be performed again from the current position, for example, according to the determination result in step S203.
- In step S205, the observation-side control circuit 110 determines whether or not the count scan over the predetermined entire region has ended based on, for example, the end condition CSP3 recorded in the observation-side recording circuit 130 as count scan processing information. The process proceeds to step S211 when it is determined that the scan over the entire region has been completed, and proceeds to step S206 otherwise.
- In step S206, the observation-side control circuit 110 causes the imaging unit 151 to perform focus adjustment (Auto Focus: AF) on a cell 324 that is the subject of interest and to perform an imaging operation to acquire a second image. Since the imaging optical system 152 used during AF is the second imaging optical system, which has telecentricity on the subject side, the size and position of the cell 324 in the image do not change even if the position of the image sensor 153 or the like in the optical axis direction changes. Further, the observation-side control circuit 110 causes the observation-side communication device 140 to transmit the second image acquired as described above with reference to FIG. 9 to a preset transmission destination.
- In step S207, the observation-side control circuit 110 causes the image processing circuit 120 to analyze the acquired second image and count the number of cells 324 or cell groups, and causes the cell count result to be recorded in the observation-side recording circuit 130 or transmitted to the controller 200. Thereafter, the process proceeds to step S208. Note that the cell count may be performed by the controller 200, or may be performed outside the observation system 1. Further, the cell count may be performed based on a wide-range, high-pixel image synthesized from the acquired second images after the count scan of the entire region is completed.
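- The counting algorithm itself is not prescribed here; as one illustrative possibility only, a second image could be counted with a simple threshold-and-label approach such as the following sketch, which assumes a grayscale image in which cells appear darker than the background.

```python
# One common illustrative approach (not the patent's method): intensity thresholding
# followed by connected-component labelling, ignoring very small components.
import numpy as np
from scipy import ndimage

def count_cells(image: np.ndarray, threshold: float, min_area: int = 20) -> int:
    """Return a rough cell count for a single grayscale second image."""
    mask = image < threshold                          # foreground where intensity is low
    labels, n = ndimage.label(mask)                   # connected components
    areas = ndimage.sum(mask, labels, index=range(1, n + 1))
    return int(np.sum(np.asarray(areas) >= min_area))  # discard tiny specks / noise
```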
- In step S208, the observation-side control circuit 110 determines whether or not the current state satisfies the first X→Y condition CSP10 or the first Y→X condition CSP11 recorded as count scan processing information. The process proceeds to step S209 when it is determined that either condition is satisfied, and proceeds to step S210 otherwise.
- In step S209, the observation-side control circuit 110 switches the direction in which the image acquisition unit 150 is moved according to the determination result in step S208. Thereafter, the process proceeds to step S210.
- In step S210, the observation-side control circuit 110 moves the image acquisition unit 150 by the first X movement pitch or the first Y movement pitch according to the movement direction at that time. Thereafter, the process returns to step S203.
- In step S211, when it is determined in step S205 that the count scan over the entire region has been completed, the observation-side control circuit 110 causes the observation-side communication device 140 to transmit an end signal to the controller 200. In addition, the observation-side control circuit 110 causes the image processing circuit 120 to synthesize a wide-range, high-pixel image based on the acquired second images. Thereafter, the count scan process ends, and the process proceeds to step S108 of the observation apparatus control process.
- The 3D scan process in step S109 of the observation apparatus control process is shown in FIG. 11 as a flowchart, and the operation of the observation system 1 during the 3D scan process will be described with reference to this flowchart.
- In step S301, the observation-side control circuit 110 causes the lens switching unit 154 to set the imaging optical system 152 to the first imaging optical system, which is a non-telecentric optical system on the subject side. Thereafter, the process proceeds to step S302.
- In step S302, the observation-side control circuit 110 executes pre-processing for starting the 3D scan based on, for example, the 3D scan processing information recorded in the observation-side recording circuit 130. For example, the observation-side control circuit 110 causes the moving mechanism 160 to move the image acquisition unit 150 back to the XY designated position for the 3D scan. Further, the observation-side control circuit 110 controls the operations of the imaging optical system 152 and the imaging element 153, or the moving mechanism 160, so that the 3D scan can be started from a predetermined position in the Z direction. Thereafter, the observation-side control circuit 110 starts the 3D scan.
- The 3D scan processing information according to the present embodiment is shown in FIG. 12, and the information recorded as 3D scan processing information will be described with reference to this figure. The information is set in advance, for example, or is set in step S105 of the observation apparatus control process.
- The 3D scan process according to the present embodiment is executed using the first imaging optical system, which is a subject-side non-telecentric optical system. Further, the 3D scan process according to the present embodiment is executed on a region (specific region) including a specific position designated by the user.
- The 3D scan processing information includes information TSP related to the 3D scan pattern, information TSJ related to execution of the 3D scan process, and information TSR obtained by the 3D scan process.
- The information TSP related to the 3D scan pattern includes, for example, a designated position TSP1, a range setting TSP2, a second X movement pitch TSP5, a second Y movement pitch TSP6, a second X→Y condition TSP10 that is a condition for switching from movement in the X direction to movement in the Y direction, and a second Y→X condition TSP11 that is a condition for switching from movement in the Y direction to movement in the X direction.
- The second X movement pitch TSP5 is a movement (imaging) interval in the X direction, and the second Y movement pitch TSP6 is a movement (imaging) interval in the Y direction. The observation-side control circuit 110 determines the specific region based on the designated position TSP1 and the range setting TSP2, and also determines the start position and end position of the 3D scan.
- The information TSJ related to the execution of the 3D scan process includes, for example, a second NG determination condition TSJ1, which is a determination condition for determining an observation failure, and a second retry determination condition TSJ2, which is a determination condition for determining whether or not to re-execute the 3D scan when an observation failure is determined based on the second NG determination condition TSJ1.
- The information TSR obtained by the 3D scan process is recorded in association with each image (first image) acquired using the first imaging optical system in the 3D scan process. For example, a first result TSR1 includes a first frame TSR11, a first time TSR12 at which the first frame TSR11 was acquired, first depth information TSR13, and a first 3D imaging condition TSR14. The depth information may include the position of the cell 324 in the optical axis direction of the first imaging optical system, information on a 3D model that can be constructed based on the position information, and the like.
- An example of the movement pattern of the image acquisition unit 150 in the 3D scan process according to the present embodiment is shown in FIG. 13 as a schematic diagram, and the movement of the image acquisition unit 150 during the 3D scan will be described with reference to this figure. Here, a case where the 3D scan process is executed while the image acquisition unit 150 is moved along the line TL1 illustrated in FIG. 13 will be described as an example.
- The observation-side control circuit 110 acquires a first image at each position by performing imaging while moving the image acquisition unit 150 along the line TL1 within a region (specific region) roughly centered on the designated position TSP1 indicated by the point TP0, in accordance with the range setting TSP2.
- The movement of the image acquisition unit 150 during the 3D scan is substantially the same as the movement of the image acquisition unit 150 during the count scan described with reference to FIG. 10. That is, the image acquisition unit 150 starts moving from the start position TP1 and moves in the Y direction until the second Y→X condition TSP11 is satisfied at the point TP2. Thereafter, it moves in the X direction until the second X→Y condition TSP10 is satisfied at the point TP3. In this way, the observation-side control circuit 110 continues the 3D scan process until the image acquisition unit 150 reaches the point TP10.
- In step S303, the observation-side control circuit 110 determines the 3D scan state in the same manner as in step S203 of the count scan process. The determination conditions used here are the second NG determination condition TSJ1 and the second retry determination condition TSJ2. The process proceeds to step S304 if it is determined that the 3D scan needs to be re-executed, and proceeds to step S305 otherwise.
- In step S304, the observation-side control circuit 110 generates a control signal for warning the user that an observation failure or the like has occurred during the 3D scan and that the 3D scan needs to be re-executed, according to the determination result in step S303, and causes the control signal to be transmitted to the controller 200. Thereafter, the process returns to step S302.
- In step S305, the observation-side control circuit 110 determines whether or not the 3D scan in the specific region has ended. The process proceeds to step S310 when it is determined that the 3D scan in the specific region has been completed, and proceeds to step S306 otherwise.
- In step S306, the observation-side control circuit 110 causes the imaging unit 151 to perform an imaging operation to acquire a first image. Here, the first imaging optical system, which is an optical system having non-telecentricity on the subject side, is used. Further, the observation-side control circuit 110 causes the observation-side communication device 140 to transmit the acquired first image to a preset transmission destination.
- Subsequently, the observation-side control circuit 110 performs the same processes as steps S208 to S210 of the count scan process. That is, in step S307, the observation-side control circuit 110 determines whether or not the second X→Y condition TSP10 or the second Y→X condition TSP11 is satisfied, and switches the movement direction if it is determined that either condition is satisfied. The observation-side control circuit 110 then moves the image acquisition unit 150 according to the movement direction and the value of the second X movement pitch TSP5 or the second Y movement pitch TSP6. Thereafter, the process returns to step S303.
- In step S310, the observation-side control circuit 110 acquires depth information related to the cells 324 based on the result of causing the image processing circuit 120 to perform image processing on the first images, as described above with reference to FIGS. 5 and 6. The depth information includes information related to the unevenness of the cell 324 or the cell group in the optical axis direction of the first imaging optical system. The depth information acquired here may also include, for example, a 3D model indicating the three-dimensional shape of the cell 324 or the cell group generated by the three-dimensional model generation unit.
- Note that the depth information may be acquired by the controller 200. For example, when the controller 200 is the transmission destination of the first images and the controller 200 includes an image processing circuit, the depth information may be acquired by the controller 200. Similarly, when a server on a network such as a cloud is the transmission destination of the first images and the cloud has a function corresponding to the image processing circuit, the depth information may be acquired outside the observation system 1.
- In step S311, the observation-side control circuit 110 causes the observation-side communication device 140 to transmit an end signal to the controller 200. Thereafter, the 3D scan process ends, and the process proceeds to step S110 of the observation apparatus control process.
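- The exact relation between the first image movement amount and depth used in step S310 is the one described earlier with reference to FIGS. 5 and 6. Purely as a hedged, simplified illustration, the residual image shift left after subtracting the stage movement can be converted into a depth offset using the inclination of the chief ray, as in the sketch below; the function name and the use of a single chief-ray angle are assumptions for illustration only.

```python
# Hedged sketch only: estimate a depth offset from the extra image shift observed
# between two first images taken dx_stage apart with a subject-side non-telecentric
# optical system (chief-ray angle of 6 degrees or more per the abstract).
import math

def depth_from_parallax(dx_image: float, dx_stage: float,
                        magnification: float, chief_ray_angle_deg: float) -> float:
    observed_object_shift = dx_image / magnification   # back-project shift to object space
    parallax_shift = observed_object_shift - dx_stage  # extra shift caused by subject height
    return parallax_shift / math.tan(math.radians(chief_ray_angle_deg))
```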
- An example of the controller control process performed by the controller 200 is shown in FIG. 14 as a flowchart, and the operation of the observation system 1 will be described with reference to this flowchart. The process illustrated in the flowchart of FIG. 14 starts, for example, in a state where the observation apparatus 100 is waiting for communication.
- In step S401, the controller-side control circuit 210 generates display information for presenting the various functions provided in the controller 200 to the user using text, icons, and the like, and causes the display device 272 to display the display information.
- In step S402, the controller-side control circuit 210 determines whether activation of the inspection application has been instructed based on, for example, a control signal output from the input device 274 according to the result of a user operation. Here, the inspection application is application software having a program for communicating with the observation apparatus 100 to control the observation apparatus 100. The controller control process proceeds to step S403 when it is determined that activation of the inspection application has been instructed, and returns to step S401 otherwise. The controller 200 is, for example, a tablet PC or a smartphone, and in this step a telephone application, a mail application, or the like can also be selected in addition to the inspection application. In the following description, only the case where the inspection application is selected will be described as an example.
- In step S403, the controller-side control circuit 210 accesses the designated camera. Here, the designated camera is, for example, an imaging device to be controlled by the inspection application selected in step S402. The description will be continued assuming that the designated camera is the observation apparatus 100.
- In step S404, the controller-side control circuit 210 determines whether or not the user has performed an operation for turning on the power of the observation apparatus 100 or an operation for turning off the power of the observation apparatus 100 (imaging ON/OFF operation) based on, for example, a control signal output from the input device 274 according to the result of a user operation. The process proceeds to step S405 if it is determined that an imaging ON/OFF operation has been performed, and proceeds to step S406 otherwise.
- In step S405, the controller-side control circuit 210 causes the controller-side communication device 240 to transmit, to the observation apparatus 100, a power ON signal for turning on the observation apparatus 100 or a power OFF signal for turning off the observation apparatus 100, based on the result of the user's imaging ON/OFF operation detected in step S404. Thereafter, the process returns to step S403. Note that the process of this step corresponds to steps S102 to S103 of the observation apparatus control process.
- In step S406, the controller-side control circuit 210 determines, based on, for example, a control signal output from the input device 274 according to the result of a user operation, whether or not the user has made various settings including the transmission destination of observation results or measurement results such as images acquired by the observation apparatus 100, imaging conditions, measurement conditions, and various parameters. The process proceeds to step S407 if it is determined that various settings have been made, and proceeds to step S408 otherwise.
- In step S407, the controller-side control circuit 210 causes the controller-side communication device 240 to transmit control signals related to the various settings detected in step S406 to the observation apparatus 100. Thereafter, the process proceeds to step S408. Note that the process of this step corresponds to steps S104 to S105 of the observation apparatus control process.
- In step S408, the controller-side control circuit 210 determines whether or not the user has instructed execution of the count scan process based on, for example, a control signal output from the input device 274 according to the result of a user operation. The process proceeds to step S409 when it is determined that execution of the count scan process has been instructed, and proceeds to step S410 otherwise.
- In step S409, the controller-side control circuit 210 causes the controller-side communication device 240 to transmit a control signal instructing execution of the count scan process to the observation apparatus 100. Thereafter, the process proceeds to step S410. Note that the process of this step corresponds to steps S106 to S107 of the observation apparatus control process.
- In step S410, the controller-side control circuit 210 determines whether or not the user has instructed execution of the 3D scan process based on, for example, a control signal output from the input device 274 according to the result of a user operation. The controller-side control circuit 210 also determines whether or not the user has made settings such as the designated position TSP1 and the range setting TSP2 related to the specific region where the 3D scan is to be executed. The process proceeds to step S411 when it is determined that execution of the 3D scan process has been instructed or that settings related to the specific region have been made, and proceeds to step S412 otherwise.
- In step S411, if execution of the 3D scan process was instructed in step S410, the controller-side control circuit 210 causes the controller-side communication device 240 to transmit a control signal instructing execution of the 3D scan process to the observation apparatus 100. Similarly, if settings related to the specific region were made, the controller-side control circuit 210 causes the controller-side communication device 240 to transmit a control signal related to those settings to the observation apparatus 100. Thereafter, the process proceeds to step S412. Note that the process of this step corresponds to steps S108 to S109 of the observation apparatus control process.
- In step S412, the controller-side control circuit 210 determines whether or not to receive a measurement result or the like from the outside of the controller 200, for example, according to the result of a user operation. The process proceeds to step S413 when it is determined that the measurement result is to be received, and proceeds to step S414 otherwise.
- In step S413, the controller-side control circuit 210 acquires the measurement results and the like acquired by the observation apparatus 100, and displays them on the display device 272. The measurement results and the like may be acquired from the observation apparatus 100, or may be acquired from the transmission destination of the measurement results of the observation apparatus 100 set in step S406. Thereafter, the process proceeds to step S414. Note that the process of this step corresponds to steps S111 to S112 of the observation apparatus control process.
- In step S414, the controller-side control circuit 210 determines whether or not to end the inspection application, for example, according to the result of a user operation. If it is determined to end the inspection application, the inspection application is ended and the process returns to step S401. If it is determined not to end the inspection application, the process returns to step S403.
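- The branching of steps S404 to S413 can be summarised in a small, purely illustrative handler like the one below. The action keys, signal names, and the comm_send / comm_receive callables are hypothetical; the sketch only mirrors the order of the determinations described above.

```python
# Hypothetical sketch of the inspection-application branching of FIG. 14: read one
# user action, forward the corresponding control signal, and fetch results if asked.
def handle_user_action(action: dict, comm_send, comm_receive):
    kind = action.get("kind")
    if kind == "power":                      # steps S404/S405: imaging ON/OFF operation
        comm_send({"signal": "POWER_ON" if action["on"] else "POWER_OFF"})
    elif kind == "settings":                 # steps S406/S407: transmission destination, conditions
        comm_send({"signal": "SETTINGS", "payload": action["values"]})
    elif kind == "count_scan":               # steps S408/S409
        comm_send({"signal": "COUNT_SCAN"})
    elif kind == "scan_3d":                  # steps S410/S411: including the specific region
        comm_send({"signal": "3D_SCAN", "region": action.get("region")})
    elif kind == "get_results":              # steps S412/S413
        return comm_receive()
    return None
```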
- In the present embodiment, the case has been described where the lens switching unit 154 switches between the first imaging optical system and the second imaging optical system by driving a lens included in the imaging optical system 152 in the optical axis direction of the imaging optical system 152, but the present invention is not limited to this. For example, the observation apparatus 100 may include the first imaging optical system and the second imaging optical system separately. In this case, the lens switching unit 154 switches the imaging optical system by moving, out of the first imaging optical system and the second imaging optical system, the imaging optical system to be used for imaging so that its optical axis is parallel to the observation axis.
- Further, although the illumination unit 155 has been described as being disposed on the support unit 165, it is only necessary that the light emitting unit of the illumination optical system 156 be disposed on the support unit 165; the light source 157 may be disposed anywhere on the observation apparatus 100. Note that, for example, the intensity of illumination may be changed depending on the type of observation in order to reduce damage to an observation object such as the cell 324. As for the control of the illumination light, a control method of intermittent illumination in which the sample 300 is illuminated only at the moment of imaging, or a control method of increasing or decreasing the number of lit illuminations, is conceivable.
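- As a simple illustration of the intermittent-illumination idea (not a prescribed control scheme), the light source could be switched on only around each exposure; the led and camera objects and their methods below are hypothetical placeholders.

```python
# Hypothetical sketch: illuminate the sample 300 only while a frame is exposed,
# which limits the total light dose received by the cells.
import time

def capture_with_flash(led, camera, settle_s: float = 0.01):
    led.on()
    time.sleep(settle_s)        # let the illumination stabilise before exposing
    frame = camera.capture()    # expose only while the light is on
    led.off()
    return frame
```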
- Further, in the present embodiment, the case where the image acquisition unit 150 starts moving in the Y direction from the start position has been described as an example, but the present invention is not limited to this. The scan may start from the start position in the X direction.
- As described above, the observation apparatus 100 according to the present embodiment causes the moving mechanism 160 to change the position of the image acquisition unit 150 in the X direction and the Y direction while maintaining the optical axis of the imaging optical system 152 parallel to the observation axis, and repeatedly photographs the sample 300 to acquire a plurality of images.
- The observation apparatus 100 according to the present embodiment uses the second imaging optical system, which is a subject-side telecentric optical system, to acquire second images suitable for synthesis of a wide-range, high-pixel image. The second images or the high-pixel image are suitable for acquiring the shape and number of subjects of interest such as cells.
- Further, the observation apparatus 100 according to the present embodiment uses the first imaging optical system, which is a subject-side non-telecentric optical system, to acquire a first image movement amount ΔX including an image movement amount caused by parallax from a plurality of first images acquired at different imaging positions, and acquires depth information of a subject of interest such as a cell based on the first image movement amount ΔX.
- In addition, in the observation apparatus 100 according to the present embodiment, the lens switching unit 154 switches the imaging optical system 152 according to the type of observation. Since the observation apparatus 100 includes the lens switching unit 154 that switches between the first imaging optical system and the second imaging optical system, observations with different required optical characteristics, such as the acquisition of depth information and the acquisition of images suitable for cell counting, can be realized. Further, with the observation system 1 according to the present embodiment, the user can acquire the position of a cell in the optical axis direction, a three-dimensional model of the cell, an image in which the shape of the cell is accurately projected, a cell count result, and the like simply by designating the desired observation method.
- In the first embodiment, the case where depth information of a subject of interest such as the cell 324 is acquired when the first imaging optical system is used has been described as an example. Here, a light beam that can reach the imaging element 153 has an inclination with respect to the optical axis of the first imaging optical system between the cell 324 and the objective lens 152b. Therefore, the first image includes information related to the side surface of a subject that exists at a position other than on the optical axis. In the present embodiment, therefore, an observation system 1 that generates a side observation image, as if the side surface of the cell 324 had been imaged, based on the first images acquired using the first imaging optical system will be described.
- The observation apparatus 100 according to the present embodiment further has functions as a stereoscopic image generation unit and a side information processing unit. The stereoscopic image generation unit generates a stereoscopic image of the subject of interest based on, for example, a plurality of first images and depth information. The side information processing unit acquires information related to side observation, such as image data and position information obtained by the side observation process, and generates a side observation image based on the first images and the depth information. The functions as the stereoscopic image generation unit and the side information processing unit can be realized by the observation-side control circuit 110 and/or the image processing circuit 120, for example. Alternatively, the functions as the stereoscopic image generation unit and the side information processing unit may be realized by the controller-side control circuit 210.
- FIG. 15 shows a schematic diagram of an example of the positional relationship between the imaging unit 151 including the first imaging optical system according to the present embodiment and the sample 300. The state shown in FIG. 15 is a state in which the point P1 on the cell 324 has moved in the X+ direction by a third X movement amount ΔX3 from the state in which it was located on the optical axis of the first imaging optical system. That is, when the image position of the point P1 is regarded as a corresponding point, the position of the corresponding point has moved from the point U1 to the point U1'.
- In this state, the light ray R10 emitted from the point P1 on the cell 324 is incident on the point U1' on the image sensor 153, and the light ray R30 emitted from the point P3 is incident on the point U3 on the image sensor 153. As a result, the area A1 on the cell is imaged as the area A1' on the image sensor 153. Here, the area between the position of the point P1 and the position of the point P3 corresponds to the in-focus range (within the depth of focus); that is, it is assumed that the area A1 is in focus. Hereinafter, the in-focus area A1 of the subject of interest is referred to as a specific range, and the region A1' obtained by capturing the specific range of the subject of interest in the first image is referred to as a specific range image.
- The observation apparatus 100 according to the present embodiment causes the image processing circuit 120 to cut out the specific range image from each first image. Further, the image processing circuit 120 synthesizes the specific range images based on, for example, depth information about the subject included in each specific range image, and acquires a side observation image. By combining the plurality of specific range images acquired in this manner, the side information processing unit included in the observation apparatus 100 according to the present embodiment can acquire a side observation image, such as a depth composite image, even though the cell 324 is imaged from below.
- An example of a side observation image according to the present embodiment is shown in FIG. 16 as a schematic diagram. The observation apparatus 100 synthesizes a plurality of specific range images acquired from the respective first images, including a first specific range image I10, a second specific range image I11, and a third specific range image I12.
- The width W in the Z-axis direction (after image processing) of each specific range image to be synthesized in the side observation image corresponds, for example, to the distance in the optical axis direction between the points P1 and P3 when the region A1' is the specific range image, that is, the difference between the depth Z3 and the depth Z1. The width W in the side observation image can therefore differ for each specific range image.
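- Purely as an illustrative sketch (not the described image processing circuit), combining the specific range images can be pictured as resizing each strip along Z to a height proportional to its width W and stacking the strips; the nearest-neighbour resampling and the assumption that all strips share the same number of columns are simplifications.

```python
# Hedged sketch: build a side observation image by resampling each specific range
# image along Z to a height proportional to its width W = Z3 - Z1 and stacking them.
import numpy as np

def stack_side_view(strips, z_ranges, px_per_unit=100):
    """strips: list of 2D arrays with equal column counts; z_ranges: (z_near, z_far) per strip."""
    resized = []
    for strip, (z1, z3) in zip(strips, z_ranges):
        h = max(1, int(round(abs(z3 - z1) * px_per_unit)))   # width W converted to pixels
        rows = np.linspace(0, strip.shape[0] - 1, h).astype(int)
        resized.append(strip[rows])                          # nearest-neighbour resample along Z
    return np.vstack(resized)                                # stack the strips along the Z axis
```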
- In this way, by actively utilizing such side images to perform side observation, it becomes possible to inspect and observe the state of the object with a simple configuration while obtaining more important and abundant image information (stereoscopic characteristics such as shading, color, and structure) than before.
- That is, the three-dimensional information acquisition unit acquires the three-dimensional information of the sample 300 based on the plurality of first images and the information related to each imaging position at the time of movement, and at least one of the plurality of first images is an image in which the subject is captured at a position other than on the optical axis of the first imaging optical system.
- Note that the width W in the Z-axis direction of each image may be determined based on, for example, the first image movement amount ΔX or the angle of view, or may be determined based on the focal length and the width of the in-focus range.
- Further, the present invention is not limited to this. For example, the region on the first image cut out as the specific range image may be a region that is separated from the optical axis of the first imaging optical system by a predetermined threshold or more. In this case, the range is preferably one that is less affected by the image distortion that may be present at the periphery of the image sensor 153. That is, from each acquired first image, a region that is in focus on the cell 324 and is separated from the optical axis by the predetermined threshold or more may be cut out as the specific range image and used for image synthesis. The image synthesis is performed using information on the imaging positions acquired simultaneously with the images.
- Further, for each candidate specific range image, a score may be obtained based on, for example, the presence or absence of overexposure or blackout and the degree of focus on the cell 324, and the specific range images to be used for image synthesis may be selected according to the score.
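- As one hedged illustration of such scoring (the concrete measure is not specified here), a strip could be penalised for clipped pixels and rewarded by a simple gradient-based focus measure, as below; the thresholds, the focus metric, and the assumption of 8-bit grayscale strips are illustrative only.

```python
# Illustrative sketch: score specific range images and keep the best candidates.
import numpy as np

def strip_score(strip: np.ndarray, low=10, high=245) -> float:
    clipped = np.mean((strip <= low) | (strip >= high))    # blackout / overexposure ratio
    focus = np.var(np.diff(strip.astype(float), axis=1))   # crude gradient-based focus measure
    return focus * (1.0 - clipped)

def select_strips(candidates, keep: int):
    """Return the `keep` best-scoring specific range images."""
    return sorted(candidates, key=strip_score, reverse=True)[:keep]
```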
- Further, in the present embodiment, the case where the range focused on the subject of interest in the first image is cut out as the specific range image has been described, but other methods are also possible. For example, the observation apparatus 100 may determine a region of the subject of interest to be imaged, divide it into a plurality of regions, repeat imaging focused on each divided region to acquire first images, and use the range including that region in each first image as the specific range image.
- In this case, the observation apparatus 100 uses, as an AF area, an area on the image sensor 153 corresponding to a specific range that is separated from the optical axis by a predetermined threshold or more, and adjusts the Z position of the first imaging optical system so as to focus on the AF area. In this way, the observation apparatus 100 may acquire the side observation image while fixing the region occupied by the specific range image in the first image. Further, for example, the observation apparatus 100 may acquire first images while moving the image acquisition unit 150 in each of the X direction, the Y direction, and the Z direction according to a preset movement pattern, as in the 3D scan, and acquire the side observation image by using the in-focus range in each acquired first image as the specific range image. The present invention is not limited to this; the side observation image may also be acquired while the image acquisition unit 150 is moved, for example, in the X direction and the Y direction according to a set or selected movement pattern.
- In this manner, the side information processing unit included in the observation apparatus 100 according to the present embodiment can acquire a side observation image that captures the side surface of the cell 324, for example the sample 300 observed from the X+ direction in the state illustrated in FIG. 15, which cannot be observed using a subject-side telecentric optical system.
- The side observation process is performed, for example, after step S205 and before step S210 of the observation apparatus control process described above. The side observation process is started when it is determined, in the same manner as for the count scan process and the 3D scan process, that a control signal instructing execution of the side observation process, output by the controller 200 according to, for example, the result of a user operation, has been received. Various types of information required in the side observation process, such as the scan pattern and the determination conditions, are recorded in the observation-side recording circuit 130 as side observation processing information, for example. The side observation processing information also includes results acquired in the side observation process.
- In step S501, the observation-side control circuit 110 causes the lens switching unit 154 to set the imaging optical system 152 to the first imaging optical system in the same manner as in step S301 of the 3D scan process. Thereafter, the process proceeds to step S502.
- In step S502, the observation-side control circuit 110 performs pre-processing in the same manner as in step S302 of the 3D scan process. In addition, the observation-side control circuit 110 starts a scan for acquiring a side observation image. Thereafter, the process proceeds to step S503.
- In steps S503 and S504, the observation-side control circuit 110 performs determinations regarding the NG determination condition and the retry determination condition, warning processing as necessary, and the like in the same manner as in steps S303 and S304 of the 3D scan process. The process proceeds to step S505 when it is determined in step S503 that neither the NG determination condition nor the retry determination condition is satisfied; when it is determined in step S503 that the NG determination condition or the retry determination condition is satisfied, a warning is given in step S504 as necessary, and the process returns to step S502.
- In step S505, the observation-side control circuit 110 determines whether or not the acquisition of the side observation image in the specific region has been completed. If it is determined that the acquisition has been completed, the process proceeds to step S508; otherwise, the process proceeds to step S506.
- In step S506, the observation-side control circuit 110 performs AF on a region (specific range) whose distance from the optical axis of the first imaging optical system is equal to or greater than a predetermined threshold.
- In step S507, the observation-side control circuit 110 causes the moving mechanism 160 to move the image acquisition unit 150 according to the scan pattern recorded as side observation processing information. For example, the observation-side control circuit 110 moves the image acquisition unit 150 in a direction corresponding to the inclination of the principal ray emitted from the specific range. Thereafter, the process returns to step S503.
- In step S508, the observation-side control circuit 110 causes the image processing circuit 120 to cut out, from each first image acquired in the side observation scan of the specific region, the region in which the specific range was captured as a specific range image, as described above. Further, the observation-side control circuit 110 causes the image processing circuit 120 to convert the plurality of specific range images to the appropriate widths W based on the depth information and to synthesize the converted images.
- In step S509, the observation-side control circuit 110 transmits the side observation image to, for example, a preset transmission destination. The transmission destination may also be determined in this step based on a control signal output by the controller 200 according to the result of a user operation. Thereafter, the process ends.
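- A compact, purely illustrative sketch of steps S501 to S509 is given below; the hardware objects (lens_switcher, stage, camera), their methods, and the extract_specific_range helper are hypothetical placeholders, and the synthesis callable is assumed to behave like the strip-stacking sketch shown earlier.

```python
# Hypothetical sketch of the side observation scan (steps S501 to S509).
def side_observation_scan(lens_switcher, stage, camera, positions, af_offset_px, synthesize):
    lens_switcher.select("first")                # S501: subject-side non-telecentric optical system
    strips, depths = [], []
    for pos in positions:                        # S502/S507: follow the recorded scan pattern
        stage.move_to(pos)
        camera.autofocus(region="off_axis", min_offset=af_offset_px)   # S506: off-axis AF
        frame = camera.capture()
        strip, z_range = extract_specific_range(frame)                 # assumed helper, see below
        strips.append(strip)
        depths.append(z_range)
    return synthesize(strips, depths)            # S508: combine into the side observation image

def extract_specific_range(frame):
    """Hypothetical placeholder: cut out the in-focus, off-axis region and its depth range."""
    return frame, (0.0, 1.0)
```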
- As described above, the observation system 1 according to the present embodiment has the following advantages in addition to the advantages obtained in the first embodiment.
- In an observation apparatus that performs observation or the like while switching between the first imaging optical system and the second imaging optical system according to the purpose, the side information processing unit included in the observation apparatus 100 acquires a side observation image in which the side surface of the cell 324 is captured, using the fact that there is an inclination between the optical axis and the principal ray emitted from a region on the cell 324 that is separated from the optical axis of the first imaging optical system by a predetermined threshold or more. In this way, the observation system 1 according to the present embodiment can acquire a first image including the side surface of the cell 324, which cannot be obtained by an observation apparatus having telecentricity on the subject side, and a side observation image based on the first image.
- Note that the observation apparatus 100 according to the present embodiment and the observation apparatus 100 according to the first embodiment can be combined. For example, the stereoscopic image generation unit performs a process of three-dimensionalizing the specific range images acquired as described in the present embodiment based on the depth information regarding each corresponding point included in the specific range images. The observation apparatus 100 can acquire a stereoscopic image of the cell 324 by synthesizing the three-dimensionalized specific range images acquired in this way at the corresponding positions of the stereoscopic model of the cell 324 described in the first embodiment.
- In the above-described embodiments, the objective lens 152b and the imaging lens 152c are illustrated as positive lenses for simplicity, but the present invention is not limited to this. For example, the objective lens may be a lens group having a negative refractive power in order to reduce the first image movement amount ΔX between the first images to be acquired or to increase the parallax. Needless to say, a plurality of lenses can be used according to the required performance.
- Further, the observation apparatus 100 may perform the AF operation based on the depth information acquired from the parallax images when the first imaging optical system is used.
- In the above-described embodiments, the observation apparatus 100 is used in an incubator, and the description has emphasized use focused on cell observation. However, the present technology can be generalized as an observation apparatus for enlarging and confirming details, which can realize acquisition of the size and the like of an observation object and acquisition of depth information related to the observation object.
- The sample 300 can be put into and taken out of, for example, an incubator or a clean bench while placed on the upper surface of the observation apparatus 100. When the sample is taken in and out, the cell 324 is affected by temperature changes and may receive a heat shock, and contamination may also occur with the taking in and out. Even in such circumstances, the present technology can appropriately acquire the shape and size of an observation object such as a cell or a cell group, and depth information such as the unevenness of the observation object such as a cell or a cell group.
- Further, in the above-described embodiments, the user is warned of the occurrence of an observation failure according to the result of the determination using the NG determination condition and the retry determination condition for the images acquired by the observation apparatus 100, but the application of the present technology is not limited to this. The user can also be warned when an abnormality of the observation object, contamination, or the like is detected. For example, the present technology can also evaluate the state of the culture medium based on the result of image analysis.
- In the above-described embodiments, the case where the observation-side control circuit 110, the image processing circuit 120, the observation-side recording circuit 130, and the observation-side communication device 140 are provided inside the housing 101 as the circuit group 104 has been described as an example, but the present invention is not limited thereto. One or more of these functions may be provided in the image acquisition unit 150. For example, the function as the observation-side communication device 140 may be provided in both the image acquisition unit 150 and the circuit group 104. Further, one or more of the functions of the observation-side control circuit 110, the image processing circuit 120, and the observation-side recording circuit 130 may be provided in the controller 200. That is, for example, some or all of the various determinations, image processing, and the like described above may be performed by the controller 200.
- Further, some elements such as the input/output device 270 may be included in the observation apparatus 100. A configuration in which the observation apparatus 100 and the controller 200 are incorporated in one housing is also conceivable. The observation system 1 in which the observation apparatus 100 and the controller 200 are integrated can be used, for example, when the user himself or herself enters the use environment, such as a temperature-controlled room.
- Further, the observation system 1 may include artificial intelligence (AI) that records and learns observation results such as image analysis results, the usage of the observation system 1 including the usage frequency of the user, incubator settings, and the like, and that presents parameters and the like. The AI may be built inside the observation system 1, for example in a DSP or the like, or may be built outside the observation system 1, for example on the Internet. The observation system 1 including such an AI can determine, for example, the state and type of the cells, the state of the culture medium, the presence or absence of foreign matter, and the like for an acquired image by referring to a database prepared on a server.
- The present technology is also effective when applied to an imaging apparatus that is used at a position where the imaging optical system and the user are separated from each other, such as a surveillance camera or an endoscope. In such a case, the user can acquire observation results such as images suited to the purpose of use without having to replace the imaging apparatus according to the purpose of imaging.
- The present invention is not limited to the above-described embodiments, and can be variously modified in the implementation stage without departing from the gist thereof. Further, the embodiments may be implemented in combination as appropriate, in which case combined effects can be obtained. Furthermore, the above-described embodiments include various inventions, and various inventions can be extracted by combinations selected from the plurality of disclosed constituent elements. For example, even if some constituent elements are deleted from all the constituent elements shown in an embodiment, as long as the problem can be solved and the effects can be obtained, the configuration from which those constituent elements have been deleted can be extracted as an invention.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Optics & Photonics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Microscopes, Condensers (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Apparatus Associated With Microorganisms And Enzymes (AREA)
Abstract
Provided is an observation device (100), comprising: an imaging unit (151); a movement mechanism (160); and a stereoscopic information acquisition unit. The imaging unit (151) includes a first imaging optical system, which is a magnifying optical system and is a subject-side non-telecentric optical system in which an angle formed between one of principal rays on the subject side and the optical axis is 6° or more. The imaging unit (151) uses the first imaging optical system to image a sample (300) and acquire a first image. The movement mechanism (160) changes relative positions of the sample (300) and the imaging unit (151). The stereoscopic information acquisition unit acquires stereoscopic information on the sample (300) on the basis of a plurality of the first images acquired at different positions of the imaging unit (151) and information on respective imaging positions used to acquire the first images.
Description
The present invention relates to an observation apparatus, an observation system, and an observation apparatus control method.
Generally, an apparatus is known in which a culture vessel is placed in an incubator and an image of cultured cells or the like in the culture vessel is obtained. For example, Japanese Patent Application Laid-Open No. 2005-295818 discloses a technique related to a cell culture apparatus that images the surface of a culture vessel while changing the relative position between an imaging device having a magnifying optical system and the culture vessel.
In addition, there is a demand for acquiring depth information relating to cultured cells. As a technique for acquiring the position of a subject of interest in the optical axis direction, for example, Japanese Unexamined Patent Application Publication No. 2014-238558 discloses a technique related to an imaging apparatus that acquires the position of the subject of interest in the optical axis direction based on the parallax between images acquired by two cameras.
Under such circumstances, an imaging apparatus that images the surface of a culture vessel while changing the relative position between the imaging apparatus and the culture vessel, and an imaging apparatus that acquires the position of a subject of interest in the optical axis direction based on the parallax between image captures, are subject to different optical limitations.
An object of the present invention is to provide an observation apparatus, an observation system, and an observation apparatus control method capable of acquiring depth information of a subject of interest.
According to one aspect of the present invention, an observation apparatus includes: an imaging unit that includes a first imaging optical system, which is a magnifying optical system and a subject-side non-telecentric optical system in which an angle formed between any principal ray on the subject side and the optical axis is 6° or more, and that images a sample using the first imaging optical system to acquire a first image; a moving mechanism that changes a relative position between the sample and the imaging unit; and a three-dimensional information acquisition unit that acquires three-dimensional information about the sample based on a plurality of the first images acquired at different positions of the imaging unit and information on each imaging position at the time of acquisition of the first images.
According to an aspect of the present invention, the observation system includes the observation device and a controller that acquires a user operation result, outputs the operation result to the observation device, and acquires the observation result of the observation device.
According to one aspect of the present invention, a method for controlling an observation apparatus includes: acquiring a first image by imaging a sample using an imaging unit that includes a first imaging optical system, which is a magnifying optical system and a subject-side non-telecentric optical system in which an angle formed between any principal ray on the subject side and the optical axis is 6° or more; changing a relative position between the sample and the imaging unit; and acquiring three-dimensional information about the sample based on a plurality of the first images acquired at different positions of the imaging unit and information on each imaging position at the time of acquisition of the first images.
According to the present invention, it is possible to provide an observation apparatus, an observation system, and an observation apparatus control method that can acquire depth information of a subject of interest.
[First Embodiment]
<Configuration of observation system>
(Outline of observation system)
A first embodiment of the present invention will be described with reference to the drawings. The observation system according to the present embodiment is a system for photographing cells, cell groups, tissues, and the like in culture and recording the number, form, and the like of the cells or cell groups. An example of the external appearance of the observation system 1 is shown in FIG. 1 as a schematic diagram, and an outline of a configuration example of the observation system 1 is shown in FIG. 2 as a block diagram; the configuration of the observation system 1 will be described with reference to these figures.
As shown in FIGS. 1 and 2, the observation system 1 includes an observation device 100 and a controller 200. As shown in FIG. 1, the observation apparatus 100 has a substantially flat plate shape. A sample 300 to be observed is arranged on the upper surface of the observation apparatus 100, and the observation apparatus 100 and the sample 300 are installed in, for example, an incubator.
以降の説明のため、観察装置100の試料300が配置される面と平行な面内に互いに直交するX軸及びY軸を定義し、X軸及びY軸と直交するようにZ軸(観察軸)を定義する。
For the following explanation, an X axis and a Y axis that are orthogonal to each other are defined in a plane parallel to a plane on which the sample 300 of the observation apparatus 100 is arranged, and a Z axis (observation axis) is orthogonal to the X axis and the Y axis. ) Is defined.
As shown in FIGS. 1 and 2, the observation apparatus 100 includes a housing 101, a transparent plate 102, an image acquisition unit 150, and a moving mechanism 160. The transparent plate 102 is disposed on the upper surface of the housing 101. The image acquisition unit 150 is provided inside the housing 101 and includes an imaging unit 151 and an illumination unit 155. As shown in FIG. 2, the imaging unit 151 includes an imaging optical system 152 and an image sensor 153. The imaging unit 151 generates image data based on an image (subject image) formed on the imaging surface of the image sensor 153 via the imaging optical system 152. The image acquisition unit 150 is moved by the moving mechanism 160, whereby its position relative to the sample 300 is changed. While being moved, the image acquisition unit 150 illuminates the sample 300 through the transparent plate 102 and photographs the sample 300 to acquire images of the sample 300. The controller 200, on the other hand, is installed, for example, outside the incubator. The observation apparatus 100 and the controller 200 communicate with each other, and the controller 200 controls the operation of the observation apparatus 100.
Note that the imaging position in the Z-axis direction may be changed by the moving mechanism 160 or by changing the in-focus position of the imaging optical system 152. The imaging optical system 152 is preferably a zoom optical system whose focal length can be changed.
Hereinafter, the case where the imaging optical system 152 is a magnifying optical system and its optical axis is parallel to the Z axis (observation axis) will be described as an example. The observation apparatus 100 according to the present embodiment performs a 3D scan process for acquiring depth information of a subject of interest and a count scan process for acquiring the size, number, and the like of the subject of interest.
The imaging optical system 152 according to the present embodiment includes a first imaging optical system, which is an optical system having non-telecentricity at least on the subject side. Hereinafter, an image captured using the first imaging optical system is referred to as a first image. In the 3D scan process, for example, the observation apparatus 100 acquires first images while moving the image acquisition unit 150, which includes the first imaging optical system. Based on the parallax between images captured at different positions, the observation apparatus 100 acquires, as depth information, the position of the subject of interest included in the sample 300 in the optical axis direction (observation axis direction) of the first imaging optical system. The depth information includes the position of the subject of interest in the optical axis direction of the first imaging optical system, a three-dimensional model (3D model) of the subject, and the like.
The imaging optical system 152 according to the present embodiment further includes a second imaging optical system, which is an optical system having telecentricity at least on the subject side. Hereinafter, an image captured using the second imaging optical system is referred to as a second image. In the count scan process, for example, the observation apparatus 100 acquires second images while moving the image acquisition unit 150, which includes the second imaging optical system. The observation apparatus 100 combines the second images captured at different positions to obtain a wide-field, high-pixel-count image as if a wide area had been captured in a single shot. The observation apparatus 100 also acquires the size, number, and the like of the subject of interest based on the second images or on the combined high-pixel-count image.
(About the imaging optical system)
Here, imaging performed while the image acquisition unit 150 is being moved will be described with reference to FIGS. 3 to 6, for the case where the first imaging optical system is used and for the case where the second imaging optical system is used. The imaging optical system 152 according to the present embodiment includes a diaphragm 152a and a plurality of lenses including at least an objective lens 152b and an imaging lens 152c. The focal length of the imaging lens 152c on the image sensor 153 side is defined as a focal length Ft.
First, the case where the second imaging optical system is used as the imaging optical system 152 according to the present embodiment will be described. FIG. 3 is a schematic diagram showing an example of the positional relationship between the imaging unit 151 including the second imaging optical system and the sample 300. The sample 300 includes a container 310 and a cell 324 that is the subject of interest. Here, the case where the second imaging optical system is a double-sided telecentric optical system will be described as an example. As shown in FIG. 3, the second imaging optical system according to the present embodiment is an optical system having telecentricity on both the subject side and the image side, which is obtained by making the position of the diaphragm 152a coincide with the exit-side focal point of the objective lens 152b and the entrance-side focal point of the imaging lens 152c. In such a double-sided telecentric optical system, a principal ray (a ray passing through the center of the diaphragm 152a) that enters the system parallel to its optical axis also exits parallel to the optical axis.
In the state shown in FIG. 3, for example, a point P1 on the cell 324 is located on the optical axis of the second imaging optical system. Of the rays radiated from the point P1, the ray that can pass through the second imaging optical system via the center of the diaphragm 152a and reach the image sensor 153 (the principal ray) is a ray R1 that enters the objective lens 152b parallel to the optical axis of the second imaging optical system. The ray R1 enters the objective lens 152b parallel to that optical axis, passes through the center of the diaphragm 152a and through the imaging lens 152c, and is incident on a point Q1 on the image sensor 153 located on the optical axis of the second imaging optical system.
FIG. 4 is a schematic diagram showing the state after the second imaging optical system has been moved from the state shown in FIG. 3 in the X direction by a first X movement amount ΔX1 by the moving mechanism 160, with its optical axis kept parallel to the Z axis (observation axis).
In the state shown in FIG. 4, of the rays radiated from the point P1, the ray that can pass through the second imaging optical system via the center of the diaphragm 152a and reach the image sensor 153 (the principal ray) is a ray R1´ that enters the objective lens 152b parallel to the optical axis of the second imaging optical system. Of course, if the mechanism is configured to be tiltable, a movement that is inclined with respect to the optical axis is also possible as an application. Hereinafter, the point on the image sensor 153 at which the ray R1´ is incident is denoted Q1´. When the point P1 is taken as a corresponding point, the amount of change of the image position on the imaging surface of the image sensor 153 (the second image movement amount) is the distance between the point Q1 and the point Q1´. In the state shown in FIG. 4, a point that is on the optical axis of the second imaging optical system and whose position in the optical axis direction is equal to that of the point P1 is denoted P11.
Next, the case where the first imaging optical system is used as the imaging optical system 152 according to the present embodiment will be described. FIG. 5 is a schematic diagram showing an example of the positional relationship between the imaging unit 151 including the first imaging optical system and the sample 300. As in FIGS. 3 and 4, the sample 300 includes the container 310 and the cell 324 that is the subject of interest. Here, the case where the first imaging optical system is a subject-side non-telecentric optical system (image-side telecentric optical system) will be described as an example. As shown in FIG. 5, the first imaging optical system according to the present embodiment is obtained from the double-sided telecentric optical system shown in FIG. 3 or FIG. 4 by shifting the position of the objective lens 152b along the optical axis toward the diaphragm 152a, so that the telecentricity is broken only on the subject side. In such a subject-side non-telecentric optical system, a ray (principal ray) that enters the optical system, is deflected by the objective lens 152b, and then passes through the center of the diaphragm 152a exits parallel to the optical axis of the optical system. Because the diaphragm 152a no longer coincides with the exit-side focal point of the objective lens 152b, a principal ray that enters the subject-side non-telecentric optical system from a point not on the optical axis (the off-axis principal ray shown by the broken line in FIG. 5) is inclined with respect to the optical axis between the subject of interest and the optical system.
In the state shown in FIG. 5, for example, the point P1 on the cell 324 is located on the optical axis of the first imaging optical system. Therefore, of the rays radiated from the point P1, the ray that can pass through the center of the diaphragm 152a of the first imaging optical system and reach the image sensor 153 (the principal ray) is a ray R2 that travels along the optical axis of the first imaging optical system and is incident on a point Q2 on the image sensor 153.
FIG. 6 is a schematic diagram showing the state after the first imaging optical system has been moved from the state shown in FIG. 5 in the X direction by a second X movement amount ΔX2 by the moving mechanism 160, with its optical axis kept parallel to the Z axis (observation axis).
In the state shown in FIG. 6, the point P1 is no longer on the optical axis of the first imaging optical system. Of the rays radiated from the point P1, the principal ray is therefore a ray R2´ that enters the objective lens 152b at an angle φ with respect to that optical axis. The ray R2´ is deflected by the objective lens 152b, passes through the center of the diaphragm 152a, and is incident on a point Q2´ on the image sensor 153. When the point P1 is taken as a corresponding point, the amount of change of the image position on the imaging surface of the image sensor 153 (the first image movement amount ΔX) is the distance between the point Q2 and the point Q2´. In the state shown in FIG. 6, a point that is on the optical axis of the first imaging optical system and whose position in the optical axis direction is equal to that of the point P1 is denoted P12.
As described above with reference to FIGS. 3 and 4, when the second imaging optical system is used, every principal ray that reaches the image sensor 153 enters the second imaging optical system parallel to its optical axis. Therefore, even when the relative position between the subject of interest and the optical axis of the second imaging optical system changes, for example when the optical axis position (imaging position) of the second imaging optical system moves from a position X1 to a position X1´, there is no change in the direction in which the point P1 is seen from the entrance end of the second imaging optical system; that is, no parallax arises. The second image movement amount is equal to the interval between the imaging positions at which the second images were acquired (the first X movement amount ΔX1) multiplied by the magnification of the second imaging optical system. Even when the subject has unevenness, its appearance on the imaging surface does not change. In other words, the second image movement amount contains no image movement component caused by parallax.
In imaging using the second imaging optical system, the pupil is thus at infinity as seen from the subject side, so an image (second image) is obtained as if the subject were viewed from far away. Even when the relative position between the cell 324 and the objective lens 152b differs in the optical axis direction, for example when the cells have unevenness, the optical path of the rays entering the second imaging optical system parallel to its optical axis does not change. Therefore, when a plurality of second images acquired while changing the imaging position are stitched together to synthesize a wide-field, high-pixel-count image, processing the seams between the images is easy. Needless to say, counting the cells present in the imaged region and comparing their sizes is easy when such a wide-field image is available.
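Because the second images contain no parallax, each tile can in principle be placed on a common canvas using only the recorded stage position multiplied by the magnification. The following is a minimal sketch of that idea, not the apparatus's actual stitching routine; the function name, the assumption of exactly known stage positions, and a canvas large enough to hold all tiles are illustrative assumptions.

```python
import numpy as np

def stitch_second_images(tiles, stage_positions_mm, magnification,
                         pixel_pitch_mm, canvas_shape):
    """Place telecentric tiles on a canvas using stage coordinates only.

    tiles: list of 2-D grayscale arrays (second images).
    stage_positions_mm: list of (x, y) stage positions at capture time;
        tile spacing (e.g. the first X movement amount) is assumed exact.
    magnification: magnification of the second imaging optical system.
    pixel_pitch_mm: pixel pitch of the image sensor 153.
    Assumes non-negative positions and a sufficiently large canvas.
    """
    canvas = np.zeros(canvas_shape, dtype=np.float32)
    for tile, (sx, sy) in zip(tiles, stage_positions_mm):
        # With a subject-side telecentric system there is no parallax, so the
        # image shift is simply the stage shift times the magnification.
        col = int(round(sx * magnification / pixel_pitch_mm))
        row = int(round(sy * magnification / pixel_pitch_mm))
        h, w = tile.shape
        canvas[row:row + h, col:col + w] = tile
    return canvas
```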
On the other hand, when the cell 324 or a cell group has unevenness along the optical axis direction of the imaging optical system 152, there is also a demand to acquire depth information of the subject of interest. However, with a subject-side telecentric optical system such as the second imaging optical system, even if the relative position between the subject of interest and the optical system changes in the optical axis direction, the second image movement amount contains no component caused by parallax, so the amount of change of the position of the subject of interest in the optical axis direction cannot be obtained. The observation apparatus 100 according to the present embodiment therefore performs imaging using the first imaging optical system when depth information is to be acquired.
As described above with reference to FIGS. 5 and 6, when the first imaging optical system is used, the principal rays radiated from regions not located on the optical axis of the first imaging optical system are inclined with respect to that optical axis. Therefore, when the relative position between the subject of interest and the optical axis changes, for example when the optical axis position (imaging position) of the first imaging optical system moves from a position X2 to a position X2´, the direction in which the point P1 is seen from the entrance end of the optical system changes; that is, parallax arises. The first image movement amount ΔX is equal to the interval between the imaging positions at which the first images were acquired (the second X movement amount ΔX2) multiplied by the magnification of the first imaging optical system. When the subject has unevenness, however, the magnification on the imaging surface differs for portions of the subject located in front of or behind the subject-side focal plane of the objective lens (the plane containing P1 and perpendicular to the optical axis). In the example of FIG. 6, the farther a point is from the objective lens 152b, the smaller its magnification on the imaging surface. In other words, the ratio between the second X movement amount ΔX2 and the first image movement amount ΔX changes according to the depth of the point P1. This can also be expressed by saying that the first image movement amount ΔX includes an image movement component caused by parallax.
In imaging using the first imaging optical system, a first image movement amount ΔX that includes the parallax-induced image movement is thus obtained from a plurality of first images that contain the same corresponding point and are captured at different imaging positions, for example on the same XY plane. Accordingly, the observation apparatus 100 according to the present embodiment can acquire depth information about a subject of interest such as the cell 324 by using the principle of triangulation based on, for example, the first image movement amount ΔX. The depth information includes three-dimensional information, such as thickness and unevenness, and three-dimensional images. Because imaging with the first imaging optical system yields such distance distributions and image information for each depth, the object can be checked using a variety of information. In other words, it is possible to provide an observation apparatus including the imaging unit 151 that acquires images of the sample 300 using an observation optical system, the moving mechanism 160 that changes the relative position between the sample 300 and the imaging unit 151, and a three-dimensional information acquisition unit that acquires three-dimensional information about the sample 300 based on a plurality of images acquired at different positions of the imaging unit 151 and on information relating to the imaging position at which each image was acquired. The observation optical system only needs to have an imaging action; the magnification may also be performed electronically.
(About obtaining depth information)
Here, acquisition of depth information based on the first image acquired in the state shown in FIG. 5 and the first image acquired in the state shown in FIG. 6 will be described. The acquisition interval between these first images is assumed to be sufficiently short compared with the temporal change of the phenomena occurring inside the sample 300. For simplicity, the case where the incident-side and exit-side portions of the ray R2´ passing through the first optical system shown in FIG. 6 lie on the same straight line at the objective lens 152b will be described as an example. In the following description of depth information acquisition, therefore, the triangle formed by the point S0 indicating the center of the diaphragm 152a, the point P1, and the point P12 is assumed to be similar to the right triangle formed by the point S0, the point K1 on the imaging lens 152c through which the ray R2´ passes, and the point K0 indicating the center of the imaging lens 152c. Needless to say, even when the incident-side and exit-side rays at the objective lens 152b are not on the same line, the same processing as described below is possible if the refraction angle at the objective lens 152b is taken into account.
In the right triangle formed by the points S0, K1, and K0, the distance between the points K1 and K0 is equal to the first image movement amount ΔX. The first image movement amount ΔX is known because it is calculated as the movement of a corresponding point on the image plane, the corresponding point being detected by image processing based on at least two first images. The distance between the points S0 and K0 is the entrance-side focal length Fo of the imaging lens 152c and is also known. In the right triangle formed by the points S0, P1, and P12, on the other hand, the distance between the points P1 and P12 is the distance between the optical axes (the second X movement amount ΔX2), which is likewise known.
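The similar-triangle relationship described above can be written compactly as follows (this is only a restatement of the geometry of FIG. 6, with the diaphragm/entrance-pupil offset neglected as in the next paragraph):

```latex
\tan\varphi \;=\; \frac{\Delta X_2}{Z} \;=\; \frac{\Delta X}{F_o}
\qquad\Longrightarrow\qquad
Z \;=\; F_o \cdot \frac{\Delta X_2}{\Delta X}
```

Here Z is the distance between S0 and P1 along the optical axis, ΔX2 is the distance P1–P12, ΔX is the distance K0–K1, Fo is the distance S0–K0, and φ is the inclination of the ray R2´. As a purely illustrative numerical example, Fo = 50 mm, ΔX2 = 0.1 mm, and ΔX = 0.025 mm would give Z = 200 mm.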
Therefore, if the difference in position between the diaphragm 152a and the entrance pupil (the image of the diaphragm 152a as seen from the subject side) can be neglected, the distance Z between the points S0 and P1 along the optical axis of the first imaging optical system is calculated as Z = Fo × ΔX2 / ΔX, and depth information of the cell 324 is thereby acquired.
Although the second imaging optical system shown in FIGS. 3 and 4 and the first imaging optical system shown in FIGS. 5 and 6 are each illustrated as optical systems having telecentricity on the image side, the configuration is not limited to this. As described above, the observation apparatus 100 according to the present embodiment switches the imaging optical system 152 between the first imaging optical system and the second imaging optical system in order to switch whether or not the acquired images include depth information of the cell 324. Therefore, the same effects can be obtained even if the first imaging optical system or the second imaging optical system is an optical system having non-telecentricity on the image side.
Since the first imaging optical system has non-telecentricity on the subject side, the corresponding point on the image sensor 153 also moves when the first imaging optical system is moved along its own optical axis, for example for focusing. Consider, for example, the case where, from the state shown in FIG. 6, the first imaging optical system and the image sensor 153 are moved together in the direction away from the cell 324 (Z- direction). The angle formed by the straight line passing through the points P1 and S0 and the straight line passing through the points P12 and S0 then becomes smaller. Accordingly, the point K1 moves toward the optical axis, and the point Q2´ also moves toward the optical axis. Thus, even if the first imaging optical system is moved along its optical axis for focusing or the like while imaging is being repeated during the movement, it is clear that depth information can still be acquired by tracking the movement of the corresponding points.
For simplicity, acquisition of depth information in the optical axis direction has been described for the case where the imaging optical system 152 is moved in the X direction and attention is paid only to a certain XY plane (the plane containing the imaged point P1), but the present embodiment is not limited to this. It is clear that the first image movement amount ΔX can be obtained for a plurality of corresponding points included in the first images, so that depth information over a surface is obtained for the cell 324.
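One possible way to obtain ΔX for many corresponding points, and from it a per-patch depth map via Z = Fo × ΔX2 / ΔX, is block matching between two first images by normalized correlation. The sketch below is a minimal illustration under stated assumptions (two grayscale first images, a known stage movement ΔX2 along X, the corresponding points shifting in the +X pixel direction, and the entrance-pupil offset neglected); sub-pixel refinement and sign handling are omitted, and the function name and thresholds are illustrative.

```python
import numpy as np

def depth_map_from_first_images(img_a, img_b, dx2_mm, fo_mm, pixel_pitch_mm,
                                patch=32, max_shift=40):
    """Estimate per-patch depth Z = Fo * dX2 / dX from two first images.

    img_a, img_b: 2-D grayscale arrays captured at imaging positions that
        differ by the stage movement dx2_mm along X.
    fo_mm: entrance-side focal length Fo of the imaging lens 152c.
    pixel_pitch_mm: pixel pitch of the image sensor 153.
    Returns a (rows, cols) array of depths in millimetres (NaN if unmatched).
    """
    h, w = img_a.shape
    rows, cols = h // patch, w // patch
    depth = np.full((rows, cols), np.nan)
    for r in range(rows):
        for c in range(cols):
            y0, x0 = r * patch, c * patch
            tpl = img_a[y0:y0 + patch, x0:x0 + patch].astype(np.float64)
            tpl -= tpl.mean()
            best_score, best_shift = -1.0, None
            # Search along X only, since the stage was moved along X.
            for s in range(max_shift):
                x1 = x0 + s
                if x1 + patch > w:
                    break
                cand = img_b[y0:y0 + patch, x1:x1 + patch].astype(np.float64)
                cand -= cand.mean()
                denom = np.linalg.norm(tpl) * np.linalg.norm(cand)
                if denom == 0:
                    continue
                score = float((tpl * cand).sum() / denom)  # normalized correlation
                if score > best_score:
                    best_score, best_shift = score, s
            # Skip zero shifts (no measurable parallax) and weak matches.
            if best_shift and best_score > 0.5:
                dx_mm = best_shift * pixel_pitch_mm   # first image movement dX
                depth[r, c] = fo_mm * dx2_mm / dx_mm  # Z = Fo * dX2 / dX
    return depth
```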
Also for simplicity, the case where the point P1 is located on the optical axis of the first imaging optical system when one of the first images used for depth information acquisition is captured, as shown in FIG. 5, has been described as an example, but the present embodiment is not limited to this. For example, in the at least two first images used for acquiring depth information, a point that is not located on the optical axis at the time of either capture may be used as the corresponding point.
In general, it is easy to acquire depth information for an arbitrary point on the subject of interest based on, for example, an autofocus result. On the other hand, acquiring depth information for an arbitrary surface of the subject of interest based on autofocus results is difficult, because, depending on the desired resolution, an autofocus operation is required for every point included in that surface. In this situation, if depth information is acquired using the first imaging optical system in the observation apparatus 100 according to the present embodiment, the depth information includes depth information over the surface of the subject of interest, which makes it easy to obtain a 3D model of the subject of interest and the like.
An optical system in which every principal ray on the subject side forms an angle of 4° or less with the optical axis may be regarded as a subject-side telecentric optical system.
A subject-side non-telecentric optical system may be regarded as an optical system in which the angle formed between at least one principal ray on the subject side and the optical axis (the angle φ in FIG. 6) is 6° or more. This makes it possible to obtain images with sufficient parallax between the plurality of first images.
Furthermore, when images of the periphery (side surfaces) of the subject are to be captured, the angle formed between at least one principal ray on the subject side and the optical axis (the angle φ in FIG. 6) is preferably 20° or more.
(About the sample)
The sample 300, which is the measurement target of the observation system 1, is, for example, as follows. The sample 300 includes, for example, a container 310, a culture medium 322, cells 324, and a reflection plate 360. The culture medium 322 is placed in the container 310, and the cells 324 are cultured in the culture medium 322. The container 310 may be, for example, a petri dish, a culture flask, or a multiwell plate; that is, the container 310 is, for example, a culture container for culturing a biological sample. The shape, size, and the like of the container 310 are not limited. The culture medium 322 may be a liquid medium or a solid medium. The measurement target is, for example, the cells 324, which may be adherent cells or floating cells. The cells 324 may also be spheroids or tissue. Furthermore, the cells 324 may be derived from any organism and may be bacteria or the like. The sample 300 thus includes a biological sample, that is, an organism or a sample derived from an organism. The reflection plate 360 reflects the illumination light that enters the sample 300 through the transparent plate 102 so as to illuminate the cells 324, and is disposed on the upper surface of the container 310.
(About the observation apparatus)
The transparent plate 102 disposed on the upper surface of the housing 101 of the observation apparatus 100 is formed of, for example, glass. The interior of the observation apparatus 100 is sealed by members including, for example, the housing 101 and the transparent plate 102. The sample 300 is placed on the transparent plate 102. Although FIG. 1 shows an example in which the entire upper surface of the housing 101 is formed of a transparent plate, the observation apparatus 100 may be configured such that a transparent plate is provided on only a part of the upper surface of the housing 101 and the other part of the upper surface is opaque. Here, transparent means transparent with respect to the wavelength of the illumination light.
The moving mechanism 160 includes a support portion 165, an X feed screw 161 for moving the support portion 165 in the X-axis direction, and an X actuator 162. The moving mechanism 160 further includes a Y feed screw 163 and a Y actuator 164 for moving the support portion 165 in the Y-axis direction. The moving mechanism 160 may also include a Z feed screw, a Z actuator, and the like for moving the support portion 165 in the Z-axis direction. For the following description, the direction in which the support portion 165 moves away from the X actuator 162 is defined as the positive X direction (X+ direction), the direction in which it moves away from the Y actuator 164 as the positive Y direction (Y+ direction), and the direction from the support portion 165 toward the sample 300 as the positive Z direction (Z+ direction).
As shown in FIG. 1, the illumination unit 155 of the image acquisition unit 150 is provided on the support portion 165 of the moving mechanism 160, and the imaging unit 151 is provided in the vicinity of the illumination unit 155. The illumination unit 155 includes an illumination optical system 156 and a light source 157. Illumination light emitted from the light source 157 is applied to the sample 300 via the illumination optical system 156. The light source 157 includes, for example, an LED.
As shown in FIG. 2, the imaging unit 151 further includes a lens switching unit 154. The lens switching unit 154 drives, for example, a lens included in the imaging optical system 152 along the optical axis, thereby configuring the imaging optical system 152 as the first imaging optical system or as the second imaging optical system. The lens switching unit 154 according to the present embodiment configures the imaging optical system 152 as the second imaging optical system when, for example, a count scan process for counting the number of cells 324 or an imaging process for synthesizing a wide-field, high-pixel-count image from images acquired at a plurality of positions is performed. The lens switching unit 154 configures the imaging optical system 152 as the first imaging optical system when, for example, a 3D scan process for acquiring depth information of the cells 324 or an imaging process for acquiring a three-dimensional image of the cells 324 is performed.
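The selection rule described above can be summarized in a few lines. The following is only an illustrative sketch of that mapping; the enum members, function name, and string return values are assumptions and do not correspond to identifiers defined in this document.

```python
from enum import Enum, auto

class ScanMode(Enum):
    COUNT_SCAN = auto()  # cell counting / wide-field stitching
    SCAN_3D = auto()     # depth information / 3D model

def select_optical_system(mode: ScanMode) -> str:
    """Return which configuration the lens switching unit 154 should select."""
    if mode is ScanMode.COUNT_SCAN:
        return "second_imaging_optical_system"  # subject-side telecentric
    if mode is ScanMode.SCAN_3D:
        return "first_imaging_optical_system"   # subject-side non-telecentric
    raise ValueError(f"unknown scan mode: {mode}")
```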
In this way, the observation apparatus 100 according to the present embodiment causes the lens switching unit 154 to switch the imaging optical system 152 according to the type of observation, and repeatedly photographs the sample 300 while causing the moving mechanism 160 to change the position of the image acquisition unit 150 in the X and Y directions with the optical axis of the imaging optical system 152 kept parallel to the observation axis, thereby acquiring a plurality of images.
The observation apparatus 100 further includes an observation-side recording circuit 130. The observation-side recording circuit 130 records, for example, programs and various parameters used by each unit of the observation apparatus 100, as well as data obtained by the observation apparatus 100. The observation-side recording circuit 130 also temporarily records various data such as image data (pixel data), image data for recording, image data for display, and processing data generated during operation.
Furthermore, the observation-side recording circuit 130 records the range of in-focus positions in the optical axis direction of the imaging optical system 152 as, for example, a focus position range. The focus position range may be set in advance to a value corresponding to, for example, the size of the sample 300, or may be set by user input.
The observation apparatus 100 further includes an image processing circuit 120. The image processing circuit 120 applies various kinds of image processing to the image data obtained by the imaging unit 151. The data processed by the image processing circuit 120 is, for example, recorded in the observation-side recording circuit 130 or transmitted to the controller 200. The image processing circuit 120 may also perform various analyses based on the obtained images. For example, the image processing circuit 120 acquires depth information of the cells 324 or cell groups included in the sample 300 based on the obtained first images, and extracts images of the cells 324 or cell groups included in the sample 300 or calculates the number of cells or cell groups based on the obtained second images. The analysis results obtained in this way are likewise, for example, recorded in the observation-side recording circuit 130 or transmitted to the controller 200.
In order to perform such communication with the controller 200, the observation apparatus 100 further includes an observation-side communication device 140. For this communication, wireless communication using, for example, Wi-Fi (registered trademark) or Bluetooth (registered trademark) is used. Alternatively, the observation apparatus 100 and the controller 200 may be connected by wire and communicate over the wired connection, or may each be connected to a telecommunication line such as the Internet and communicate via that telecommunication line.
The observation apparatus 100 further includes an observation-side control circuit 110 and a clock unit 172.
The observation-side control circuit 110 controls the operation of each unit of the observation apparatus 100. The observation-side control circuit 110 also acquires various kinds of information relating to the operation of the observation apparatus 100, makes various determinations relating to that operation, and issues notifications and warnings to the user based on the results of those determinations. As shown in FIG. 2, the observation-side control circuit 110 has functions as a position control unit 111, an imaging control unit 112, an illumination control unit 113, a communication control unit 114, a recording control unit 115, a measurement control unit 116, and a distance conversion unit 117. The position control unit 111 controls the operation of the moving mechanism 160 and thereby controls the position of the image acquisition unit 150. The imaging control unit 112 controls the operation of the imaging unit 151 of the image acquisition unit 150 and causes the imaging unit 151 to acquire images of the sample 300. The imaging control unit 112 includes a focus/exposure switching unit, and performs focus adjustment by, for example, moving a focusing lens included in the imaging optical system 152 along the optical axis. The focusing lens may be a lens with a variable focal length, such as a liquid lens, and a plurality of lenses with different focal points may be prepared for focusing; if the prepared lenses form a multi-lens arrangement, refocusing techniques and the like can be used. The focus/exposure switching unit also adjusts the exposure by, for example, controlling the operation of the diaphragm 152a, and adjusts the zoom by controlling the movement of lenses along the optical axis. The illumination control unit 113 controls the operation of the illumination unit 155 of the image acquisition unit 150. The communication control unit 114 manages communication with the controller 200 via the observation-side communication device 140. The recording control unit 115 controls the recording of the data obtained by the observation apparatus 100. The measurement control unit 116 controls the measurement as a whole, such as the timing and the number of measurements. The distance conversion unit 117 acquires, as depth information, the position of the cell 324 (the subject of interest) in the optical axis direction, information on the unevenness of the cell 324, and the like, based on, for example, the processing results of the image processing circuit 120. The clock unit 172 generates time information and outputs it to the observation-side control circuit 110; the time information is used, for example, when recording acquired data and for determinations relating to the operation of the observation apparatus 100.
The observation-side control circuit 110, the image processing circuit 120, the observation-side recording circuit 130, and the observation-side communication device 140 described above are provided inside the housing 101 as a circuit group 104, for example as shown in FIG. 1.
The observation apparatus 100 according to the present embodiment also has functions as a three-dimensional information acquisition unit, a corresponding point acquisition unit, and a three-dimensional model generation unit. The three-dimensional information acquisition unit acquires three-dimensional information about the sample 300 based on a plurality of first images acquired at different positions of the imaging unit 151 and on information relating to the imaging position at which each of these first images was acquired; it acquires, for example, information on the unevenness of the sample 300. The three-dimensional information acquisition unit obtains the information on the imaging positions at the time of first-image acquisition from, for example, the distance conversion unit 117, and the three-dimensional information includes, for example, the depth information acquired by the distance conversion unit 117. The corresponding point acquisition unit acquires corresponding points based on, for example, the correlation between the plurality of first images acquired while the imaging unit 151 is being moved by the moving mechanism 160, and thereby obtains the first image movement amount ΔX. The three-dimensional model generation unit constructs a three-dimensional model of the subject of interest based on, for example, the depth information acquired by the distance conversion unit 117. The functions of the three-dimensional information acquisition unit, the corresponding point acquisition unit, and the three-dimensional model generation unit can each be realized by, for example, the observation-side control circuit 110 and/or the image processing circuit 120.
By providing, inside the housing 101, the image acquisition unit 150 that generates image data by imaging through the transparent plate 102 and the moving mechanism 160 that moves the image acquisition unit 150, a structure can be obtained that is highly reliable, easy to handle and clean, and capable of preventing contamination and the like.
(About the controller)
The controller 200 is, for example, a personal computer (PC) or a tablet-type information terminal. FIG. 1 illustrates a tablet-type information terminal.
The controller 200 is provided with an input/output device 270 including a display device 272 such as a liquid crystal display and an input device 274 such as a touch panel. In addition to the touch panel, the input device 274 may include switches, dials, a keyboard, a mouse, and the like.
The controller 200 is also provided with a controller-side communication device 240, which is a device for communicating with the observation-side communication device 140. The observation apparatus 100 and the controller 200 communicate with each other via the observation-side communication device 140 and the controller-side communication device 240.
The controller 200 further includes a controller-side control circuit 210 and a controller-side recording circuit 230. The controller-side control circuit 210 controls the operation of each unit of the controller 200. The controller-side recording circuit 230 records, for example, programs and various parameters used by the controller-side control circuit 210, and data received from the observation apparatus 100.
The controller-side control circuit 210 has functions as a system control unit 211, a display control unit 212, a recording control unit 213, a communication control unit 214, and a network cooperation unit 215, and may further have functions as a corresponding point acquisition unit and a three-dimensional model generation unit. The system control unit 211 performs various calculations relating to control for measurement of the sample 300. The display control unit 212 controls the operation of the display device 272 and causes it to display necessary information and the like. The recording control unit 213 controls the recording of information in the controller-side recording circuit 230. The communication control unit 214 controls communication with the observation apparatus 100 via the controller-side communication device 240. The network cooperation unit 215 controls cooperation between the observation system 1 and, for example, a network server outside the observation system 1, such as a cloud server provided on a telecommunication line such as the Internet. In this cooperation, the network cooperation unit 215 causes, for example, the observation-side communication device 140 or the controller-side communication device 240 to transmit observation results, such as images acquired by the observation apparatus 100 or observation results acquired by the controller 200, to a server or the like provided on the network. The network cooperation unit 215 then causes, for example, an image processing circuit of the network server to perform processing such as cell counting and depth information calculation based on the observation results, and acquires the results of that processing. The network cooperation unit 215 may also acquire information from IoT devices connected to the Internet, such as incubators, air conditioning equipment, and lighting equipment, and may control those IoT devices.
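As a rough illustration of the network cooperation described above, the following sketch posts an observation result to an external analysis server and reads back the processed result. The server URL and the JSON fields are hypothetical; the document only states that observation results are sent to a network server, which returns processing results such as cell counts or depth information.

```python
import json
import urllib.request

def send_observation_result(server_url, image_id, result):
    """Post an observation result and return the server's processing result."""
    payload = json.dumps({"image_id": image_id, "result": result}).encode("utf-8")
    req = urllib.request.Request(
        server_url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # e.g. {"cell_count": ..., "depth_info": ...} -- field names are assumed
        return json.loads(resp.read().decode("utf-8"))
```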
The observation-side control circuit 110, the image processing circuit 120, and the controller-side control circuit 210 include integrated circuits such as a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or a Field Programmable Gate Array (FPGA). Each of the observation-side control circuit 110, the image processing circuit 120, and the controller-side control circuit 210 may be composed of a single integrated circuit or of a combination of a plurality of integrated circuits, and the observation-side control circuit 110 and the image processing circuit 120 may together be composed of a single integrated circuit. Likewise, the position control unit 111, the imaging control unit 112, the illumination control unit 113, the communication control unit 114, the recording control unit 115, the measurement control unit 116, and the distance conversion unit 117 of the observation-side control circuit 110 may each be composed of a single integrated circuit or of a combination of a plurality of integrated circuits, and two or more of these units may be composed of a single integrated circuit. Similarly, the system control unit 211, the display control unit 212, the recording control unit 213, the communication control unit 214, and the network cooperation unit 215 of the controller-side control circuit 210 may each be composed of a single integrated circuit or of a combination of a plurality of integrated circuits, and two or more of these units may be composed of a single integrated circuit. The operations of these integrated circuits are performed in accordance with programs recorded in, for example, the observation-side recording circuit 130, the controller-side recording circuit 230, or a recording area within the integrated circuits.
なお、観察側記録回路130、コントローラ側記録回路230又はこれらの備える各要素は、例えばフラッシュメモリのような不揮発性メモリであるが、Static Random Access Memory(SRAM)やDynamic Random Access Memory(DRAM)のような揮発性メモリをさらに有していてもよい。また、観察側記録回路130又はこれらの備える各要素と、コントローラ側記録回路230又はこれらの備える各要素とは、それぞれ1つのメモリ等で構成されてもよいし、複数のメモリ等が組み合わされて構成されてもよい。また、観察システム1の外部にあるデータベース等を、そのメモリの一部として利用してもよい。
Note that the observation-side recording circuit 130, the controller-side recording circuit 230, or each of the elements included in the observation-side recording circuit 130 is a non-volatile memory such as a flash memory, for example, but is not limited to Static Random Access Memory (SRAM) Such a volatile memory may be further included. In addition, the observation-side recording circuit 130 or each of the elements included therein and the controller-side recording circuit 230 or each of the elements included in the observation-side recording circuit 130 may be configured by one memory or the like, or a plurality of memories or the like may be combined. It may be configured. Further, a database or the like outside the observation system 1 may be used as a part of the memory.
<Operation of observation system>
An example of the observation device control process performed by the observation device 100 while communicating with the controller 200 is shown as a flowchart in FIG. 7, and the operation of the observation system 1 will be described with reference to this flowchart.
In step S101, the observation-side control circuit 110 waits until it receives, for example, a signal output by the controller 200 in response to a user operation.
In step S102, the observation-side control circuit 110 determines whether it has received from the controller 200, for example, a power-ON signal for turning on the power of the observation device 100 or a power-OFF signal for turning off the power of the observation device 100. The process proceeds to step S103 if it is determined that a power ON/OFF signal has been received, and proceeds to step S104 otherwise.
In step S103, the observation-side control circuit 110 starts supplying power to each part of the observation device 100 if it is determined in step S102 that a power-ON signal has been received, and stops supplying power to each part of the observation device 100 if it is determined in step S102 that a power-OFF signal has been received. In either case, power continues to be supplied to the observation-side communication device 140 so that it can wait for communication. Thereafter, the process returns to step S101.
When the observation object changes only gradually over time, as when observing a cell culture, it is sufficient to perform observation such as imaging at whatever time intervals are required. Such power control therefore contributes to energy saving.
The observation device 100 may include a low-standby-power communication device such as Bluetooth Low Energy (BLE) for transmitting and receiving control signals and the like, and a high-speed communication device such as Wi-Fi for transmitting and receiving data such as observation results including images. In this case, while the power of the observation device 100 is off, the device waits for communication over BLE or the like, and when the power is turned on in step S103, communication such as Wi-Fi is established with the controller 200. Although the power of the observation device 100 has been described as being turned on and off based on the power ON/OFF signal output by the controller 200, the present invention is not limited to this. For example, while cells are being cultured, the power of the observation device 100 may be turned on and off at preset time intervals, such as every minute, and observation such as imaging may be performed.
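As a non-limiting illustration, the split between a low-standby-power control link and a high-speed data link, together with interval-based wake-up, could be organized as in the following minimal Python sketch. The object and method names (the BLE/Wi-Fi link objects, wake_interval_s, observe_once) are assumptions introduced here for illustration and are not part of the embodiment.

import time

class ObservationPowerManager:
    # Keep a low-power control link listening while main power is off, and bring up
    # the high-speed data link only while imaging is needed.
    def __init__(self, ble_link, wifi_link, wake_interval_s=60):
        self.ble = ble_link                  # always listening, low standby power (e.g. BLE)
        self.wifi = wifi_link                # high-speed link for image data (e.g. Wi-Fi)
        self.wake_interval_s = wake_interval_s
        self.powered_on = False

    def handle_control_message(self, message):
        # Power ON/OFF commands arrive over the low-power link (cf. steps S102-S103).
        if message == "POWER_ON" and not self.powered_on:
            self.powered_on = True
            self.wifi.connect()              # establish the data link only when needed
        elif message == "POWER_OFF" and self.powered_on:
            self.wifi.disconnect()           # data link down; BLE keeps listening
            self.powered_on = False

    def run_periodic(self, observe_once):
        # Alternative to controller-driven power control: wake at a fixed interval,
        # observe once, then power down again.
        while True:
            self.handle_control_message("POWER_ON")
            observe_once()
            self.handle_control_message("POWER_OFF")
            time.sleep(self.wake_interval_s)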
In step S104, the observation-side control circuit 110 determines whether control signals related to various settings have been received, for example from the controller 200. The process proceeds to step S105 if it is determined that a control signal related to the various settings has been received, and returns to step S101 otherwise.
In step S105, the observation-side control circuit 110 configures each part of the observation device 100 according to the control signals related to the various settings that the observation-side communication device 140 received in step S104. The information set here includes, for example, information on the destination to which observation results such as images acquired by the observation device 100, or measurement results, are to be transmitted, as well as imaging conditions, measurement conditions, and various parameters. The destination of the observation results or measurement results acquired by the observation device 100 is, for example, the observation-side recording circuit 130 of the observation device 100, the controller-side recording circuit 230 of the controller 200, or a data server on the network. If observation results or measurement results are transmitted to a cloud or the like constructed on the network in this way, not only is information sharing among different users facilitated, but analysis of the acquired images, image processing, and the like can also be performed outside the observation system 1.
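As a purely hypothetical illustration of the kind of information set in step S105, such settings could be carried as a simple key-value payload like the following; the key names and values are assumptions introduced here, not a format defined by the embodiment.

settings = {
    "result_destination": "controller",        # or "observation_device", or a data server URL
    "imaging_conditions": {"shutter_speed_s": 0.01, "aperture": 2.8},
    "measurement_conditions": {"scan": "count", "start_time": "09:00"},
    "parameters": {"x_pitch_um": 500, "y_pitch_um": 500},
}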
In step S106, the observation-side control circuit 110 determines whether a control signal instructing execution of the count scan process has been received, for example from the controller 200. The process proceeds to step S107 if it is determined that a control signal instructing execution of the count scan process has been received, and proceeds to step S108 otherwise. For the count scan process, a measurement start time or the like may be determined in advance, and measurement may be started at that predetermined start time.
In step S107, the observation-side control circuit 110 executes the count scan process and counts the number of cells 324. Details of the count scan process will be described later. Thereafter, the process proceeds to step S108.
In step S108, the observation-side control circuit 110 determines whether a control signal instructing execution of the 3D scan process has been received, for example from the controller 200. The process proceeds to step S109 if it is determined that a control signal instructing execution of the 3D scan process has been received, and proceeds to step S110 otherwise.
In step S109, the observation-side control circuit 110 executes the 3D scan process and acquires depth information such as three-dimensional information on the cells 324. Details of the 3D scan process will be described later. Thereafter, the process proceeds to step S110.
In step S110, the observation-side control circuit 110 determines whether to end the processing related to observation or measurement, based on, for example, a control signal output by the controller 200 in response to a user operation. The process proceeds to step S111 if it is determined to end, and returns to step S104 otherwise.
In step S111, the observation-side control circuit 110 determines whether a control signal requesting an observation result or a measurement result has been received, for example from the controller 200. The observation result or measurement result includes various data obtained by the observation device 100, such as measured values, acquired images, imaging positions, and analysis results. The imaging position includes the X, Y, and Z coordinates of the imaging position. The X and Y coordinates are values used in controlling the moving mechanism 160 and can be acquired, for example, from the position control unit 111. The Z coordinate is a value used in controlling the imaging optical system 152 and can be acquired, for example, from the imaging control unit 112, the distance conversion unit 117, or the like. The process proceeds to step S112 if it is determined that a control signal requesting an observation result or a measurement result has been received, and returns to step S101 otherwise.
In step S112, the observation-side control circuit 110 transmits the results acquired by the various observations and measurements, such as acquired images, and the analysis results obtained by analyzing those results, to the destination set, for example, in step S105. Thereafter, the process returns to step S101.
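As a non-limiting illustration, the control flow of FIG. 7 (steps S101 to S112) can be summarized as an event loop of the following form. This is a minimal Python sketch; the method names (wait_for_signal, apply_settings, count_scan, scan_3d, send_results) are assumptions introduced here for illustration, and the branching is simplified relative to the flowchart.

def observation_device_control_loop(dev):
    # Rough transcription of FIG. 7; 'dev' bundles the control circuit, the
    # communication device, and the current settings (all names are assumptions).
    while True:
        signal = dev.wait_for_signal()                      # S101: wait for the controller
        if signal.kind in ("POWER_ON", "POWER_OFF"):        # S102
            dev.set_power(on=(signal.kind == "POWER_ON"))   # S103: communication device stays powered
            continue                                        # back to S101
        if signal.kind != "SETTINGS":                       # S104: not a settings signal
            continue                                        # back to S101
        dev.apply_settings(signal.payload)                  # S105
        if dev.received("COUNT_SCAN"):                      # S106
            dev.count_scan()                                # S107
        if dev.received("3D_SCAN"):                         # S108
            dev.scan_3d()                                   # S109
        if not dev.end_requested():                         # S110
            continue                                        # simplified: the flowchart returns to S104
        if dev.received("REQUEST_RESULTS"):                 # S111
            dev.send_results(dev.settings["destination"])   # S112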
Here, an example of the count scan process in step S107 of the observation device control process is shown as a flowchart in FIG. 8, and the operation of the observation system 1 during the count scan process will be described with reference to it.
In step S201, the observation-side control circuit 110 causes the lens switching unit 154 to set the imaging optical system 152 to the second imaging optical system. As described above, the second imaging optical system is an optical system that is telecentric on the subject side. Thereafter, the process proceeds to step S202.
In step S202, the observation-side control circuit 110 executes pre-processing for starting the count scan, based on, for example, the count scan processing information recorded in the observation-side recording circuit 130. In this pre-processing, the observation-side control circuit 110 causes the moving mechanism 160 to move the image acquisition unit 150 back to the XY start position of the count scan. The observation-side control circuit 110 also controls the operation of the imaging optical system 152 and the imaging element 153, or of the moving mechanism 160, so that the count scan can be started from the initial position in the Z direction. The observation-side control circuit 110 then starts the count scan.
Here, an example of the count scan processing information according to the present embodiment is shown in FIG. 9, and the information recorded as count scan processing information will be described with reference to it. This information is, for example, set in advance or set in step S105 of the observation device control process.
As shown in FIG. 9, the count scan processing information includes information CSP related to the count scan pattern, information CSJ related to execution of the count scan process, and information CSR obtained by the count scan process. The information CSP related to the count scan pattern includes, for example, a count scan start condition CSP1, a start position CSP2, an end condition CSP3, a first X movement pitch CSP5, a first Y movement pitch CSP6, a first X→Y condition CSP10 that is the condition for switching from movement in the X direction to movement in the Y direction, and a first Y→X condition CSP11 that is the condition for switching from movement in the Y direction to movement in the X direction. Here, for example, the first X movement pitch CSP5 is the movement (imaging) interval in the X direction, and the first Y movement pitch CSP6 is the movement (imaging) interval in the Y direction. The image acquisition unit 150 according to the present embodiment captures an image at each of these movement pitches to acquire a second image. The information CSJ related to execution of the count scan process includes, for example, a first NG determination condition CSJ1, which is a condition for determining an observation failure or the like, and a first retry determination condition CSJ2, which is a condition for determining whether to re-execute the count scan when, for example, an observation failure has been determined based on the first NG determination condition CSJ1. The information CSR obtained by the count scan process is recorded in association with each image (second image) acquired using the second imaging optical system in the count scan process, for example. For example, a first result CSR1 includes a first frame CSR11, together with a first time CSR12, first AF information CSR13, and a first imaging condition CSR14 at the time the first frame CSR11 was acquired. The imaging conditions include exposure conditions such as shutter speed and aperture, and other imaging conditions. The imaging conditions here may differ for each shot, may differ for each measurement, or may be common to all shots. The information may also include information on the position at which the second image was acquired, the result of counting the number of cells 324, and the like.
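As a non-limiting illustration, the count scan processing information of FIG. 9 maps naturally onto record-like structures such as the following minimal Python sketch; the field types are assumptions introduced here for illustration.

from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class CountScanPattern:                      # information CSP
    start_condition: str                     # CSP1
    start_position: Tuple[float, float]      # CSP2
    end_condition: str                       # CSP3
    x_pitch: float                           # CSP5: movement (imaging) interval in X
    y_pitch: float                           # CSP6: movement (imaging) interval in Y
    x_to_y_condition: str                    # CSP10
    y_to_x_condition: str                    # CSP11

@dataclass
class CountScanExecution:                    # information CSJ
    ng_condition: str                        # CSJ1: condition for judging an observation failure
    retry_condition: str                     # CSJ2: condition for deciding whether to rescan

@dataclass
class CountScanResult:                       # one entry of information CSR, tied to one second image
    frame: bytes                             # CSR11
    time: float                              # CSR12
    af_info: dict                            # CSR13
    imaging_condition: dict                  # CSR14: shutter speed, aperture, etc.
    position: Optional[Tuple[float, float]] = None   # optional: where the image was taken
    cell_count: Optional[int] = None                 # optional: count result for this frame

@dataclass
class CountScanProcessingInfo:
    pattern: CountScanPattern
    execution: CountScanExecution
    results: List[CountScanResult] = field(default_factory=list)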
Here, an example of the movement pattern of the image acquisition unit 150 in the count scan process according to the present embodiment is shown schematically in FIG. 10, and the movement of the image acquisition unit 150 during the count scan will be described with reference to it. Here, the case where the count scan process is executed while the image acquisition unit 150 is moved along the line CL1 shown in FIG. 10 will be described as an example.
As shown in FIG. 10, the observation-side control circuit 110 moves the image acquisition unit 150 to the start position CP1 and causes it to acquire a second image. The observation-side control circuit 110 then moves the image acquisition unit 150 in the Y direction by the first Y movement pitch and causes it to acquire a second image at the position after the movement. Thereafter, the observation-side control circuit 110 repeats the acquisition of a second image and the movement of the image acquisition unit 150 until it determines that the first Y→X condition CSP11 is met, for example when the image acquisition unit 150 is at the position indicated by point CP2. When it is determined that the first Y→X condition CSP11 is met, the observation-side control circuit 110 switches the direction in which the image acquisition unit 150 is moved from the Y direction to the X direction. After the movement direction has been switched to the X direction, the observation-side control circuit 110 repeats the process of moving the image acquisition unit 150 by the first X movement pitch and acquiring a second image until it determines that the first X→Y condition CSP10 is met, for example when the image acquisition unit 150 is at the position indicated by point CP3. In this way, the observation-side control circuit 110 continues the count scan process until it determines that the end condition CSP3 is satisfied, for example when the image acquisition unit 150 reaches point CP10.
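The traversal along the line CL1, alternating runs in the Y direction with steps in the X direction, amounts to a serpentine coverage of the scan area. The following minimal Python sketch generates such imaging positions for a rectangular area; note that it takes a single X step between Y runs, whereas in the embodiment the turns are governed by the conditions CSP10 and CSP11 and images are also acquired along the X movement. All parameters here are assumptions for illustration.

def serpentine_positions(x_start, y_start, x_end, y_end, x_pitch, y_pitch):
    # Yield (x, y) imaging positions column by column, reversing the Y direction on
    # every other column; the direction reversals correspond to the Y->X and X->Y
    # switching points of FIG. 10.
    x = x_start
    going_up = True
    while x <= x_end:
        column = []
        y = y_start
        while y <= y_end:
            column.append(y)
            y += y_pitch
        if not going_up:
            column.reverse()
        for y in column:
            yield (x, y)
        x += x_pitch
        going_up = not going_up

For example, list(serpentine_positions(0, 0, 2, 2, 1, 1)) visits nine positions in boustrophedon order.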
Referring again to FIG. 8, the description of the operation of the observation system 1 during the count scan process is continued.
In step S203, the observation-side control circuit 110 determines the state of the count scan. In this determination, it is determined that the count scan needs to be re-executed when, for example, the image processing circuit 120 analyzes a captured image and detects an observation failure, or when the moving mechanism 160 detects an operation failure. The conditions for this determination are recorded, for example, in the observation-side recording circuit 130 as count scan processing information, as shown in FIG. 9. A specification in which the images acquired during the count scan are transmitted to the controller 200 and the user makes the judgment based on a live view (LV) display on the controller 200 is also conceivable. The process proceeds to step S204 if it is determined that the count scan needs to be re-executed, and proceeds to step S205 otherwise.
In step S204, the observation-side control circuit 110 generates, in accordance with the determination result in step S203, a control signal for warning the user that an observation failure or the like has occurred in the count scan, that the count scan needs to be re-executed, and so on, and transmits it to the controller 200. Thereafter, the process returns to step S202.
The count scan process performed after returning to step S202 may be re-executed after the image acquisition unit 150 has been returned to the start position, or re-executed from the current position, depending on, for example, the result of the determination in step S203.
In step S205, the observation-side control circuit 110 determines whether the count scan over the predetermined entire area has been completed, based on, for example, the end condition CSP3 recorded in the observation-side recording circuit 130 as count scan processing information. The process proceeds to step S211 if it is determined that the entire area has been scanned, and proceeds to step S206 otherwise.
In step S206, the observation-side control circuit 110 causes the imaging unit 151 to perform focus adjustment (Auto Focus: AF) on the cells 324, which are the subject of interest, and to perform an imaging operation to acquire a second image. Here, as described above with reference to FIGS. 3 and 4, the count scan process uses the second imaging optical system, which is telecentric on the subject side, so the size and position of the cells 324 do not change even if the positions of the imaging optical system 152, the imaging element 153, and so on change in the optical axis direction during AF. The observation-side control circuit 110 also causes the observation-side communication device 140 to transmit the acquired second image and the related information described above with reference to FIG. 9 to a preset destination.
In step S207, the observation-side control circuit 110 causes the image processing circuit 120 to analyze the acquired second image and count the number of cells 324 or cell groups, and causes the cell count result to be recorded in the observation-side recording circuit 130 or transmitted to the controller 200. Thereafter, the process proceeds to step S208.
For example, when the controller 200 is the destination of the second image and the controller 200 includes an image processing circuit, the cell count may be performed by the controller 200. Also, when a server on a network, such as a cloud, is the destination of the second image and the cloud provides a function corresponding to the image processing circuit, the cell count may be performed outside the observation system 1. Further, the cell count may be performed based on a wide-area, high-pixel image synthesized from the acquired second images after the count scan process over the entire area has been completed.
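The embodiment does not prescribe a particular counting algorithm. Purely as an illustrative sketch of the kind of processing an image processing circuit (or a cloud-side service) might perform on a second image, the following Python fragment counts bright connected regions by thresholding and labeling; the threshold heuristic and the minimum-area parameter are assumptions introduced here, not the method of the image processing circuit 120.

import numpy as np
from scipy import ndimage

def count_cells(image, threshold=None, min_area_px=20):
    # Rough cell count: threshold the image, label connected bright regions,
    # and discard regions smaller than min_area_px.
    img = np.asarray(image, dtype=float)
    if threshold is None:
        threshold = img.mean() + img.std()      # crude global threshold
    mask = img > threshold
    labels, n = ndimage.label(mask)
    if n == 0:
        return 0
    areas = ndimage.sum(mask, labels, index=range(1, n + 1))
    return int(np.sum(np.asarray(areas) >= min_area_px))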
In step S208, the observation-side control circuit 110 determines whether the current state satisfies the first X→Y condition CSP10 or the first Y→X condition CSP11 recorded as count scan processing information. The process proceeds to step S209 if it is determined that the first X→Y condition CSP10 or the first Y→X condition CSP11 is satisfied, and proceeds to step S210 otherwise.
In step S209, the observation-side control circuit 110 switches the direction in which the image acquisition unit 150 is moved, in accordance with the result of the determination in step S208. Thereafter, the process proceeds to step S210.
In step S210, the observation-side control circuit 110 moves the image acquisition unit 150 by the first X movement pitch or the first Y movement pitch, depending on the movement direction at that time. Thereafter, the process returns to step S203.
In step S211, in response to the determination in step S205 that the count scan has been completed over the entire area, the observation-side control circuit 110 causes the observation-side communication device 140 to transmit an end signal to the controller 200. The observation-side control circuit 110 also causes the image processing circuit 120 to synthesize a wide-area, high-pixel image based on the acquired second images. Thereafter, this process ends, and the flow proceeds to step S108 of the observation device control process.
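Because the second imaging optical system is telecentric on the subject side, second images taken at known XY positions can be placed on a common canvas without scale correction, which is what makes the wide-area, high-pixel synthesis straightforward. The following minimal Python sketch illustrates such position-based placement; the pixel-pitch argument and the simple overwriting of overlaps are assumptions for illustration, not the synthesis performed by the image processing circuit 120.

import numpy as np

def stitch_tiles(tiles, positions_um, um_per_px, tile_shape):
    # tiles: list of 2-D arrays; positions_um: (x, y) stage position of each tile's
    # top-left corner in micrometres; returns one large mosaic image.
    h, w = tile_shape
    xs = [int(round(x / um_per_px)) for x, _ in positions_um]
    ys = [int(round(y / um_per_px)) for _, y in positions_um]
    canvas = np.zeros((max(ys) + h, max(xs) + w), dtype=np.float32)
    for tile, x0, y0 in zip(tiles, xs, ys):
        canvas[y0:y0 + h, x0:x0 + w] = tile     # overlaps are simply overwritten here
    return canvas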
Here, an example of the 3D scan process in step S109 of the observation device control process is shown as a flowchart in FIG. 11, and the operation of the observation system 1 during the 3D scan process will be described with reference to it.
In step S301, the observation-side control circuit 110 causes the lens switching unit 154 to set the imaging optical system 152 to the first imaging optical system, which is an optical system that is non-telecentric on the subject side. Thereafter, the process proceeds to step S302.
In step S302, the observation-side control circuit 110 executes pre-processing for starting the 3D scan, based on, for example, the 3D scan processing information recorded in the observation-side recording circuit 130. In this pre-processing, the observation-side control circuit 110 causes the moving mechanism 160 to move the image acquisition unit 150 back to the designated XY position for the 3D scan. The observation-side control circuit 110 also controls the operation of the imaging optical system 152 and the imaging element 153, or of the moving mechanism 160, so that the 3D scan can be started from a predetermined position in the Z direction. The observation-side control circuit 110 then starts the 3D scan.
Here, an example of the 3D scan processing information according to the present embodiment is shown in FIG. 12, and the information recorded as 3D scan processing information will be described with reference to it. This information is, for example, set in advance or set in step S105 of the observation device control process. As described above, the 3D scan process according to the present embodiment is executed using the first imaging optical system, which is a subject-side non-telecentric optical system. The 3D scan process according to the present embodiment is also a process executed on an area (specific area) that includes a specific position designated by the user.
As shown in FIG. 12, the 3D scan processing information includes information TSP related to the 3D scan pattern, information TSJ related to execution of the 3D scan process, and information TSR obtained by the 3D scan process. The information TSP related to the 3D scan pattern includes a designated position TSP1, a range setting TSP2, a second X movement pitch TSP5, a second Y movement pitch TSP6, a second X→Y condition TSP10 that is the condition for switching from movement in the X direction to movement in the Y direction, and a second Y→X condition TSP11 that is the condition for switching from movement in the Y direction to movement in the X direction. Here, for example, the second X movement pitch TSP5 is the movement (imaging) interval in the X direction, and the second Y movement pitch TSP6 is the movement (imaging) interval in the Y direction. For example, the observation-side control circuit 110 determines the specific area based on the designated position TSP1 and the range setting TSP2, and also determines the start position and end position of the 3D scan. The information TSJ related to execution of the 3D scan process includes, for example, a second NG determination condition TSJ1, which is a condition for determining an observation failure or the like, and a second retry determination condition TSJ2, which is a condition for determining whether to re-execute the 3D scan when, for example, an observation failure has been determined based on the second NG determination condition TSJ1. The information TSR obtained by the 3D scan process is recorded in association with each image (first image) acquired using the first imaging optical system in the 3D scan process. For example, a first result TSR1 includes a first frame TSR11, together with a first time TSR12, first depth information TSR13, and a first 3D imaging condition TSR14 at the time the first frame TSR11 was acquired. The depth information may include the position of the cells 324 in the optical axis direction of the first imaging optical system, information on a 3D model that can be constructed based on that position information, and the like.
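As an illustrative sketch only, deriving the specific area and the scan start and end positions from the designated position TSP1 and the range setting TSP2 could be as simple as the following; the assumption of a square area centered on the designated position is introduced here for illustration.

def specific_area(designated_xy, range_setting_um):
    # Return ((x_start, y_start), (x_end, y_end)) of a square specific area
    # roughly centered on the designated position TSP1.
    x0, y0 = designated_xy
    half = range_setting_um / 2.0
    return (x0 - half, y0 - half), (x0 + half, y0 + half)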
Here, an example of the movement pattern of the image acquisition unit 150 in the 3D scan process according to the present embodiment is shown schematically in FIG. 13, and the movement of the image acquisition unit 150 during the 3D scan will be described with reference to it. Here, the case where the 3D scan process is executed while the image acquisition unit 150 is moved along the line TL1 shown in FIG. 13 will be described as an example.
As shown in FIG. 13, the observation-side control circuit 110 causes the image acquisition unit 150 to capture images while moving along the line TL1 within an area (specific area) roughly centered on the designated position TSP1 indicated by point TP0, in accordance with the range setting TSP2, and causes a first image to be acquired at each position. The movement of the image acquisition unit 150 during the 3D scan is substantially the same as the movement of the image acquisition unit 150 during the count scan described with reference to FIG. 10. The image acquisition unit 150 starts moving from the start position TP1 and moves in the Y direction until the second Y→X condition TSP11 is satisfied at point TP2. It then moves in the X direction until the second X→Y condition TSP10 is satisfied at point TP3. In this way, the observation-side control circuit 110 continues the 3D scan process until the image acquisition unit 150 reaches point TP10.
Referring again to FIG. 11, the description of the operation of the observation system 1 during the 3D scan process is continued.
In step S303, the observation-side control circuit 110 determines the state of the 3D scan in the same manner as in step S203 of the count scan process. The determination conditions used here are the second NG determination condition TSJ1 and the second retry determination condition TSJ2. The process proceeds to step S304 if it is determined that the 3D scan needs to be re-executed, and proceeds to step S305 otherwise.
In step S304, the observation-side control circuit 110 generates, in accordance with the determination result in step S303, a control signal for warning the user that an observation failure or the like has occurred in the 3D scan, that the 3D scan needs to be re-executed, and so on, and transmits it to the controller 200. Thereafter, the process returns to step S302.
In step S305, the observation-side control circuit 110 determines whether the 3D scan of the specific area has been completed. The process proceeds to step S310 if it is determined that the 3D scan of the specific area has been completed, and proceeds to step S306 otherwise.
In step S306, the observation-side control circuit 110 causes the imaging unit 151 to perform an imaging operation to acquire a first image. Here, as described above with reference to FIGS. 5 and 6, the 3D scan process uses the first imaging optical system, which is non-telecentric on the subject side. The observation-side control circuit 110 also causes the observation-side communication device 140 to transmit the acquired first image to a preset destination.
In steps S307 to S309, the observation-side control circuit 110 performs processing similar to steps S208 to S210 of the count scan process. In step S307, the observation-side control circuit 110 determines whether the current state satisfies the second X→Y condition TSP10 or the second Y→X condition TSP11, and if it is determined that one of these conditions is satisfied, it switches the movement direction in step S308. Thereafter, in step S309, the observation-side control circuit 110 moves the image acquisition unit 150 according to the movement direction and the value of the second X movement pitch TSP5 or the second Y movement pitch TSP6. Thereafter, the process returns to step S303.
In step S310, the observation-side control circuit 110 acquires depth information on the cells 324 based on the result of having the image processing circuit 120 perform image processing on the first images, as described above with reference to FIGS. 5 and 6.
The depth information includes information on the unevenness and the like of the cells 324 or a cell group in the optical axis direction of the first imaging optical system. Therefore, the depth information acquired here includes, for example, a 3D model representing the three-dimensional shape of the cells 324 or the cell group, generated by the three-dimensional model generation unit.
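Once a depth value has been obtained for each XY sampling position of the 3D scan, a simple height-map representation of the cell surface can be assembled, from which a 3D model may be rendered. The following minimal Python sketch shows such gridding; the data layout is an assumption introduced here and is not the model construction of the three-dimensional model generation unit.

import numpy as np

def build_height_map(samples):
    # samples: iterable of (x, y, z) with z the estimated depth of the cell surface
    # at stage position (x, y); returns a 2-D array of heights on the XY grid.
    samples = list(samples)
    xs = sorted({x for x, _, _ in samples})
    ys = sorted({y for _, y, _ in samples})
    xi = {x: i for i, x in enumerate(xs)}
    yi = {y: i for i, y in enumerate(ys)}
    height = np.full((len(ys), len(xs)), np.nan)
    for x, y, z in samples:
        height[yi[y], xi[x]] = z
    return height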
For example, when the controller 200 is the destination of the first images and the controller 200 includes an image processing circuit, the acquisition of the depth information may be performed by the controller 200. Also, when a server on a network, such as a cloud, is the destination of the first images and the cloud provides a function corresponding to the image processing circuit, the acquisition of the depth information may be performed outside the observation system 1.
In step S311, the observation-side control circuit 110 causes the observation-side communication device 140 to transmit an end signal to the controller 200. Thereafter, this process ends, and the flow proceeds to step S110 of the observation device control process.
An example of the controller control process performed by the controller 200 is shown as a flowchart in FIG. 14, and the operation of the observation system 1 will be described with reference to it. The process shown in the flowchart of FIG. 14 starts, for example, in a state in which the observation device 100 is waiting for communication.
In step S401, the controller-side control circuit 210 generates display information that presents the various functions of the controller 200 to the user using text, icons, and the like, and causes the display device 272 to display it.
In step S402, the controller-side control circuit 210 determines whether launching of the inspection application has been instructed, based on, for example, a control signal output by the input device 274 in response to a user operation. Here, the inspection application is application software that has a program for communicating with the observation device 100 and controlling the observation device 100. The controller control process proceeds to step S403 if it is determined that launching of the inspection application has been instructed, and returns to step S401 otherwise. The controller 200 is, for example, a tablet PC or a smartphone, and in this step a telephone application or a mail application may be selected in addition to the inspection application. In the following description, only the case where the inspection application is selected will be described as an example.
In step S403, the controller-side control circuit 210 accesses the designated camera. The designated camera is, for example, the imaging device to be controlled by the inspection application selected in step S402. In the present embodiment, the description continues on the assumption that the designated camera is the observation device 100.
In step S404, the controller-side control circuit 210 determines whether the user has performed an operation to turn the power of the observation device 100 on or off (imaging ON/OFF operation), based on, for example, a control signal output by the input device 274 in response to a user operation. The process proceeds to step S405 if it is determined that an imaging ON/OFF operation has been performed, and proceeds to step S406 otherwise.
In step S405, based on the result of the user's imaging ON/OFF operation detected in step S404, the controller-side control circuit 210 causes the controller-side communication device 240 to transmit to the observation device 100 a power-ON signal for turning on the power of the observation device 100 or a power-OFF signal for turning off the power of the observation device 100. Thereafter, the process returns to step S403. The processing of this step corresponds to steps S102 to S103 of the observation device control process.
In step S406, the controller-side control circuit 210 determines whether the user has made various settings including information on the destination of observation results or measurement results such as images acquired by the observation device 100, imaging conditions, measurement conditions, and various parameters, based on, for example, a control signal output by the input device 274 in response to a user operation. The process proceeds to step S407 if it is determined that various settings have been made, and proceeds to step S408 otherwise.
In step S407, the controller-side control circuit 210 causes the controller-side communication device 240 to transmit control signals related to the various settings detected in step S406 to the observation device 100. Thereafter, the process proceeds to step S408. The processing of this step corresponds to steps S104 to S105 of the observation device control process.
In step S408, the controller-side control circuit 210 determines whether the user has instructed execution of the count scan process, based on, for example, a control signal output by the input device 274 in response to a user operation. The process proceeds to step S409 if it is determined that execution of the count scan process has been instructed, and proceeds to step S410 otherwise.
In step S409, the controller-side control circuit 210 causes the controller-side communication device 240 to transmit a control signal instructing execution of the count scan process to the observation device 100. Thereafter, the process proceeds to step S410. The processing of this step corresponds to steps S106 to S107 of the observation device control process.
In step S410, the controller-side control circuit 210 determines whether the user has instructed execution of the 3D scan process, based on, for example, a control signal output by the input device 274 in response to a user operation. The controller-side control circuit 210 also determines whether the user has set the designated position TSP1, the range setting TSP2, and the like for the specific area in which the 3D scan is to be executed. The process proceeds to step S411 if it is determined that execution of the 3D scan process has been instructed or that settings related to the specific area have been made, and proceeds to step S412 otherwise.
In step S411, if execution of the 3D scan process was instructed in step S410, the controller-side control circuit 210 causes the controller-side communication device 240 to transmit a control signal instructing execution of the 3D scan process to the observation device 100. If it is determined in step S410 that settings related to the specific area have been made, the controller-side control circuit 210 causes the controller-side communication device 240 to transmit a control signal related to those settings to the observation device 100. Thereafter, the process proceeds to step S412. The processing of this step corresponds to steps S108 to S109 of the observation device control process.
In step S412, the controller-side control circuit 210 determines whether to receive measurement results or the like from outside the controller 200, for example in accordance with the result of a user operation. The process proceeds to step S413 if it is determined that measurement results are to be received, and proceeds to step S414 otherwise.
In step S413, the controller-side control circuit 210 acquires the measurement results and the like obtained by the observation device 100 and displays them on the display device 272. The measurement results and the like may be acquired from the observation device 100, or from the destination of the measurement results of the observation device 100 set in step S406. Thereafter, the process proceeds to step S414. The processing of this step corresponds to steps S111 to S112 of the observation device control process.
In step S414, the controller-side control circuit 210 determines whether to exit the inspection application, for example in accordance with the result of a user operation. If it is determined to exit, the inspection application is terminated and the process returns to step S401; if it is determined not to exit, the process returns to step S403.
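As a non-limiting illustration, the controller-side flow of FIG. 14 amounts to mapping user operations in the inspection application onto control signals sent to the observation device 100. The following minimal Python sketch shows such a dispatch; the signal names and the send()/request() interface are assumptions introduced here for illustration.

def handle_user_operation(op, link, display):
    # Dispatch one user operation from the inspection application; 'link' sends
    # control signals to the observation device 100, 'display' shows results.
    if op.kind == "IMAGING_ON_OFF":                      # S404 -> S405
        link.send("POWER_ON" if op.turn_on else "POWER_OFF")
    elif op.kind == "SETTINGS":                          # S406 -> S407
        link.send("SETTINGS", op.payload)
    elif op.kind == "COUNT_SCAN":                        # S408 -> S409
        link.send("COUNT_SCAN")
    elif op.kind == "3D_SCAN":                           # S410 -> S411
        link.send("3D_SCAN", {"position": op.position, "range": op.range})
    elif op.kind == "RECEIVE_RESULTS":                   # S412 -> S413
        display.show(link.request("REQUEST_RESULTS"))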
The lens switching unit 154 according to the present embodiment has been described as switching between the first imaging optical system and the second imaging optical system by driving a lens included in the imaging optical system 152 in the optical axis direction of the imaging optical system 152, but the present invention is not limited to this. For example, the observation device 100 may include the first imaging optical system and the second imaging optical system as separate optical systems. In this case, the lens switching unit 154 switches the imaging optical system by, for example, moving whichever of the first imaging optical system and the second imaging optical system is to be used for imaging so that its optical axis becomes parallel to the observation axis.
Although the illumination unit 155 according to the present embodiment has been described as being arranged on the support portion 165, it is sufficient for the light emitting portion of the illumination optical system 156 to be arranged on the support portion 165; the light source 157, for example, may be arranged anywhere in the observation device 100. In order to reduce damage to the observation target such as the cells 324, the intensity of the illumination may, for example, be changed depending on the type of observation. As controls of the illumination light, there may also be a control that makes the illumination intermittent so that the sample 300 is illuminated only at the moment of imaging, and a control method that increases or decreases the number of lit illuminations.
In the count scan process and the 3D scan process, the case where the image acquisition unit 150 starts moving from the start position in the Y direction has been described as an example, but the present invention is not limited to this; for example, the scan may be started from the start position in the X direction.
<Advantages of observation system>
As described above, the observation device 100 according to the present embodiment repeatedly images the sample 300 while causing the moving mechanism 160 to change the position of the image acquisition unit 150 in the X direction and the Y direction with the optical axis of the imaging optical system 152 kept parallel to the observation axis, and thereby acquires a plurality of images.
At this time, the observation device 100 according to the present embodiment uses the second imaging optical system, which is a subject-side telecentric optical system, to acquire second images suitable for synthesizing a wide-area, high-pixel image. The second images or the synthesized high-pixel image are suitable for acquiring the shapes and number of the subjects of interest such as cells. On the other hand, the observation device 100 according to the present embodiment uses the first imaging optical system, which is a subject-side non-telecentric optical system, to acquire, from a plurality of first images captured at different imaging positions, a first image movement amount ΔX that includes an image movement amount caused by parallax, and acquires depth information on the subject of interest such as a cell based on the first image movement amount ΔX.
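The excerpt does not give the exact conversion by which the distance conversion unit 117 turns the first image movement amount ΔX into depth. As a hedged illustration of why a parallax-induced image shift carries depth information at all, the following sketch uses the standard pinhole/stereo relation Z = f·B/Δx for two exposures separated laterally by a baseline B; the model and constants are assumptions introduced here and are not the conversion used in the embodiment.

def depth_from_image_shift(delta_x_px, baseline_um, focal_length_px):
    # Illustrative pinhole/stereo relation: for two exposures whose viewpoints are
    # separated laterally by baseline_um, a subject whose image shifts by delta_x_px
    # pixels lies at depth Z = f * B / delta_x. This is an assumption for
    # illustration, not the conversion of the distance conversion unit 117.
    if delta_x_px == 0:
        raise ValueError("zero image shift corresponds to infinite depth in this model")
    return focal_length_px * baseline_um / delta_x_px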
The lens switching unit 154 according to the present embodiment switches the imaging optical system 152 according to the type of observation. By including the lens switching unit 154 that switches between the first imaging optical system and the second imaging optical system in this way, the observation device 100 according to the present embodiment can realize observations with different required optical characteristics, such as the acquisition of depth information and the acquisition of images suitable for cell counting.
Therefore, with the observation system 1 according to the present embodiment, the user can obtain, simply by designating a desired observation method, the position of cells in the optical axis direction, a three-dimensional model of the cells, images onto which the shapes of the cells and the like are accurately projected, cell count results, and so on.
[Second Embodiment]
A second embodiment of the present invention will be described. Here, the differences from the first embodiment are described; the same parts are denoted by the same reference numerals and their description is omitted.
The first embodiment described an observation device that performs observation and the like by switching between the first imaging optical system and the second imaging optical system according to the purpose, taking as an example the case where depth information on the subject of interest such as the cells 324 is acquired when the first imaging optical system is used. When the first imaging optical system is used, the light rays that can reach the imaging element 153 travel, between the cells 324 and the objective lens 152b, at an inclination with respect to the optical axis of the first imaging optical system. Therefore, the first image includes information on the side surfaces of subjects located at positions other than on the optical axis. In the present embodiment, therefore, an observation system 1 that generates a side observation image, as if the side surface of the cell 324 had been imaged, based on first images acquired using the first imaging optical system will be described.
<Configuration of observation system>
The observation apparatus 100 according to the present embodiment further has functions as a stereoscopic image generation unit and a side information processing unit. The stereoscopic image generation unit generates a stereoscopic image of the subject of interest based on, for example, a plurality of first images and depth information. The side information processing unit acquires information related to side observation, such as image data and position information obtained by the side observation process, and generates a side observation image based on the first images and the depth information. The functions of the stereoscopic image generation unit and the side information processing unit can each be realized by, for example, the observation-side control circuit 110 and/or the image processing circuit 120, or may each be realized by the controller-side control circuit 210.
(Side observation of cells using the first imaging optical system)
Here, side observation of the cell 324 is described, which exploits the fact that, when the first imaging optical system is used, the principal rays emitted from regions on the cell 324 separated from the optical axis of the first imaging optical system by a predetermined threshold or more are inclined (have parallax) with respect to that optical axis.
FIG. 15 schematically shows an example of the positional relationship between the imaging unit 151 including the first imaging optical system according to the present embodiment and the sample 300. In the state shown in FIG. 15, the imaging unit has moved in the X+ direction by a third X movement amount ΔX3 from the state in which the point P1 on the cell 324 was located on the optical axis of the first imaging optical system. That is, taking the image position of the point P1 as a corresponding point, the corresponding point has moved from the point U1 to the point U1'. Based on the first image movement amount ΔX (the distance between the point U1 and the point U1') obtained in this way from a plurality of first images, depth information is acquired for each position on the cell 324 in the same manner as in the first embodiment. Since the acquisition of depth information from the first images was described in the first embodiment, the depth information is treated as known in the following description.
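The relationship used here can be illustrated with a short sketch. The function below estimates the distance to a corresponding point from its image shift between two captures displaced by a known baseline, assuming a simple pinhole/triangulation model; the focal length and pixel pitch are hypothetical values and the patent does not prescribe this exact formula.

```python
def depth_from_parallax(dx_image_px, baseline_mm, pixel_pitch_mm, focal_length_mm):
    """Estimate subject distance from the shift of a corresponding point.

    dx_image_px     : shift of the corresponding point on the sensor, in pixels
                      (the first image movement amount ΔX)
    baseline_mm     : lateral movement of the image acquisition unit (ΔX3)
    pixel_pitch_mm  : sensor pixel pitch (assumed value, not from the patent)
    focal_length_mm : effective focal length of the non-telecentric system (assumed)

    Under a simple pinhole/triangulation approximation, a lateral baseline B and
    an image-plane disparity d give a subject distance Z = f * B / d.
    """
    d_mm = dx_image_px * pixel_pitch_mm
    if d_mm == 0:
        raise ValueError("no parallax: the point appears on-axis in both images")
    return focal_length_mm * baseline_mm / d_mm


# Example with hypothetical numbers: 40 px shift, 0.5 mm move, 3.45 µm pixels, f = 10 mm
z_mm = depth_from_parallax(dx_image_px=40, baseline_mm=0.5,
                           pixel_pitch_mm=0.00345, focal_length_mm=10.0)
```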
In the state shown in FIG. 15, the light ray R10 emitted from the point P1 on the cell 324 is incident on the point U1' on the image sensor 153, and the light ray R30 emitted from the point P3 is incident on the point U3 on the image sensor 153. The region A1 on the cell is thus imaged as the region A1' on the image sensor 153.
In the state shown in FIG. 15, the interval between the positions of the points P1 and P3 in the optical axis direction of the first imaging optical system is assumed to correspond to the in-focus range (within the depth of focus); that is, the region A1 is assumed to be in focus. In the side observation according to the present embodiment, the in-focus region A1 of the subject of interest is called the specific range, and the region A1' of the first image in which the specific range of the subject of interest is imaged is called the specific range image.
The observation apparatus 100 according to the present embodiment causes the image processing circuit 120 to cut out the specific range image from each first image. The image processing circuit 120 then combines the specific range images, based on, for example, the depth information of the subject contained in each specific range image, to obtain a side observation image. By combining the plurality of specific range images acquired in this way, the side information processing unit of the observation apparatus 100 according to the present embodiment can obtain a side observation image resembling a focus-stacked image of the cell 324 taken from the side, even though the apparatus images the cell 324 from below.
FIG. 16 schematically shows an example of the side observation image according to the present embodiment. This approach makes effective use of the peripheral part of the image rather than only the part near the optical axis. As shown in FIG. 16, the observation apparatus 100 according to the present embodiment combines a plurality of specific range images, including a first specific range image I10, a second specific range image I11, and a third specific range image I12, each obtained from a different first image. The width W, in the Z-axis direction, of each specific range image in the side observation image (after image processing) is determined from depth information; for example, when the region A1' is the specific range image, it is determined from the optical-axis-direction positions of the points P1 and P3, in which case the width W is the difference between the depth Z3 and the depth Z1. The width W can therefore differ from one specific range image to another. By actively using the side-surface images obtained with the magnifying optical system for side observation in this way, richer and more useful image information (three-dimensional conditions such as shading, color, and structure) can be obtained with a simple configuration for inspecting and observing the state of an object. By providing the moving mechanism 160 that moves the imaging unit 151, which images the sample (object) with the magnifying optical system to acquire the first images, first images acquired at different positions of the imaging unit 151 can be stitched together and used to obtain three-dimensional information. The three-dimensional information acquisition unit acquires the three-dimensional information of the sample 300 based on the information on each imaging position during the movement, and at least one of the plurality of first images is an image, obtained with the first imaging optical system, of a region other than on the optical axis of that optical system.
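A minimal sketch of this compositing step is given below, assuming grayscale strips of equal pixel width and a hypothetical output scale; the resampling method and parameter names are assumptions and are not specified in the patent.

```python
import numpy as np

def build_side_view(strips, um_per_px=1.0):
    """Compose a side observation image from specific range images (a sketch).

    strips    : list of (strip, z_near, z_far); strip is a 2D array cut from one
                first image (the region A1'), and z_near/z_far are the depths of
                the near and far edges of its in-focus range (e.g. Z1 and Z3).
    um_per_px : vertical scale of the output image (assumed calibration value).

    Each strip is resampled (nearest neighbour) so that its height corresponds to
    the width W = |z_far - z_near|, then the strips are stacked along Z.
    All strips are assumed to have the same pixel width.
    """
    rows = []
    for strip, z_near, z_far in strips:
        w_px = max(1, int(round(abs(z_far - z_near) / um_per_px)))
        idx = np.linspace(0, strip.shape[0] - 1, w_px).round().astype(int)
        rows.append(strip[idx, :])
    return np.vstack(rows)
```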
The width W of each image in the Z-axis direction may instead be determined based on, for example, the first image movement amount ΔX or the angle of view θ, or based on the focal length and the width of the in-focus range.
Although the case where a region near the optical axis is cut out as the specific range image has been described as an example, the present invention is not limited to this. The region of the first image cut out as the specific range image may be any region separated from the optical axis of the first imaging optical system by the predetermined threshold or more, although it is preferably within a range little affected by the image distortion that can occur at the periphery of the image sensor 153.
Needless to say, the angle of view θ, the first image movement amount ΔX, and the width of the region A1' in each first image can vary with the X-direction width of the portion of the cell 324 contained in the in-focus range, which itself varies with, for example, the degree of unevenness of the cell 324. The angle of view θ, the first image movement amount ΔX, and the width of the region A1' correspond to the resolution of the side observation image shown in FIG. 16. The third X movement amount ΔX3, the distance between the optical axes at the positions where successive first images are acquired (that is, the movement amount of the image acquisition unit 150), can likewise vary with the required resolution of the side observation image. For this reason, the third X movement amount ΔX3 used for side observation in the present embodiment is preferably smaller than, for example, the second X movement amount ΔX2 used in the 3D scan process described in the first embodiment.
When the specific ranges contained in the plurality of first images overlap, for example because the third X movement amount ΔX3 is set sufficiently small, it suffices to cut out, from each acquired first image, a region that is in focus on the cell 324 and separated from the optical axis by the predetermined threshold or more as the specific range image, and then perform image composition. In this case, the composition is performed using, for example, the imaging position information acquired together with each image. Alternatively, each of the specific range images containing an overlapping specific range may be scored, for example according to the presence or absence of blown-out highlights or crushed shadows and the degree of focus on the cell 324, and the specific range images used for composition may be selected according to the score.
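A possible scoring rule of the kind described above is sketched next; the thresholds, weights, and focus measure are illustrative assumptions rather than values given in the patent.

```python
import numpy as np

def score_specific_range_image(strip):
    """Score a candidate specific range image for use in the side-view composite.

    strip : 2D numpy array (grayscale, 0-255) cut from one first image.

    Penalizes blown-out highlights and crushed shadows, and rewards local
    contrast (gradient energy) as a simple focus measure.
    """
    strip = strip.astype(np.float64)
    blown = np.mean(strip >= 250)        # fraction of near-white pixels
    crushed = np.mean(strip <= 5)        # fraction of near-black pixels
    gy, gx = np.gradient(strip)
    focus = np.mean(gx ** 2 + gy ** 2)   # gradient energy as a focus measure
    return focus - 1000.0 * (blown + crushed)  # weight chosen arbitrarily for illustration

def pick_best(candidates):
    """Select the best-scoring candidate among overlapping specific range images."""
    return max(candidates, key=score_specific_range_image)
```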
As described above, in the side observation according to the present embodiment, the range of the first image that is in focus on the subject of interest is cut out as the specific range image. In such side observation, the observation apparatus 100 may, for example, determine the region of the subject of interest to be imaged, divide it into a plurality of regions, repeat imaging focused on each divided region to acquire first images, and use the range of each first image containing that region as the specific range image. Alternatively, the observation apparatus 100 may use, as an AF area, the region of the image sensor 153 corresponding to a specific region separated from the optical axis by the predetermined threshold or more, and adjust the Z position of the first imaging optical system so that this AF area is in focus. In this way, the observation apparatus 100 may acquire the side observation image while keeping fixed the region of the first image occupied by the specific range image. The observation apparatus 100 may also, as in the 3D scan, acquire first images while moving the image acquisition unit 150 in the X, Y, and Z directions according to a preset movement pattern, and acquire the side observation image using the in-focus range of each acquired first image as the specific range image.
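The fixed off-axis AF window mentioned above could be chosen as in the following sketch, which assumes the optical axis crosses the sensor at its center; the threshold distance and window size are hypothetical parameters, not values from the patent.

```python
def off_axis_af_region(sensor_w, sensor_h, threshold_px, region_size=64):
    """Pick a fixed AF window centred a threshold distance away from the optical axis.

    Returns (left, top, right, bottom) pixel coordinates of the AF window.
    """
    cx, cy = sensor_w // 2, sensor_h // 2   # assumed intersection of the optical axis
    half = region_size // 2
    x0 = cx + threshold_px                  # shift the window off-axis along +X
    x0 = min(max(x0, half), sensor_w - half)  # clamp so the window stays on the sensor
    return (x0 - half, cy - half, x0 + half, cy + half)
```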
Although the case where the image acquisition unit 150 is moved in the X direction by the moving mechanism 160 has been described as an example, the present invention is not limited to this. The side observation image may be acquired while the unit is moved in, for example, both the X and Y directions, according to the set or selected movement pattern.
In this way, the side information processing unit of the observation apparatus 100 according to the present embodiment can acquire a side observation image capturing the side surface of the cell 324, as if the sample 300 were observed from the X+ direction in the state shown in FIG. 15, which cannot be obtained with a subject-side telecentric optical system.
<Operation of observation system>
FIG. 17 shows a flowchart of an example of the side observation process according to the present embodiment, and the operation of the observation system 1 is described with reference to it. The side observation process is performed, for example, between step S205 and step S210 of the observation apparatus control process described above with reference to FIG. 7. Within that control process, the side observation process is started, in the same manner as the count scan process or the 3D scan process, when it is determined that a control signal instructing execution of the side observation process, output by the controller 200 in response to, for example, a user operation, has been received. Various kinds of information required in the side observation process, such as scan patterns and determination conditions, are recorded in the observation-side recording circuit 130 as side observation processing information, which also includes the results acquired in the side observation process.
In step S501, the observation-side control circuit 110 causes the lens switching unit 154 to set the imaging optical system 152 to the first imaging optical system, in the same manner as in step S301 of the 3D scan process. The process then proceeds to step S502.
In step S502, the observation-side control circuit 110 performs pre-processing in the same manner as in step S302 of the 3D scan process and starts the scan for acquiring the side observation image. The process then proceeds to step S503.
In steps S503 and S504, the observation-side control circuit 110 performs, in the same manner as in steps S303 and S304 of the 3D scan process, the pre-processing, the determinations related to the NG determination condition and the retry determination condition, and warning processing as necessary. If it is determined in step S503 that neither the NG determination condition nor the retry determination condition is satisfied, the process proceeds to step S505; if it is determined in step S503 that the NG determination condition or the retry determination condition is satisfied, a warning is issued as necessary in step S504 and the process returns to step S502.
In step S505, the observation-side control circuit 110 determines, among other things, whether acquisition of the side observation image for the specific area has been completed. If it has, the process proceeds to step S508; otherwise, the process proceeds to step S506.
In step S506, the observation-side control circuit 110 performs AF on a region (the specific range) whose distance from the optical axis of the first imaging optical system is equal to or greater than the predetermined threshold, as described above with reference to FIG. 15, and acquires a first image focused on the specific range.
In step S507, the observation-side control circuit 110 causes the moving mechanism 160 to move the image acquisition unit 150 according to the scan pattern recorded as the side observation processing information. The image acquisition unit 150 is moved, for example, in a direction corresponding to the inclination of the principal ray emitted from the specific range. The process then returns to step S503.
In step S508, the observation-side control circuit 110 causes the image processing circuit 120 to cut out, from each first image acquired in the side observation scan of the specific area, the region in which the specific range was imaged, as a specific range image, as described above with reference to FIG. 16. The observation-side control circuit 110 then causes the image processing circuit 120 to convert each of the specific range images, by image processing, to the appropriate width W based on the depth information, and to combine the converted images.
In step S509, the observation-side control circuit 110 transmits the side observation image to, for example, a preset transmission destination. The transmission destination may instead be determined in this step based on a control signal output by the controller 200 in response to a user operation. The process then ends.
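For orientation, the control flow of steps S501 to S509 can be summarized in the following sketch. The driver objects and their method names stand in for the observation-side control circuit 110, the image processing circuit 120, and the moving mechanism 160; they are assumptions made for illustration, not an interface defined in the patent.

```python
def side_observation_process(ctrl, img_proc, mover, scan_pattern, destination):
    """Sketch of the side observation control flow of FIG. 17 (steps S501-S509)."""
    ctrl.select_first_imaging_optics()                 # S501: use the non-telecentric system
    ctrl.run_pre_processing()                          # S502: pre-processing, start the scan
    captures = []
    for position in scan_pattern:                      # loop over S503-S507
        if ctrl.ng_or_retry_condition_met():           # S503: check NG / retry conditions
            ctrl.warn_if_needed()                      # S504: warn if necessary
            ctrl.run_pre_processing()                  # back to S502
            continue
        if ctrl.specific_area_done(captures):          # S505: finished the specific area?
            break
        image = ctrl.autofocus_and_capture_off_axis()  # S506: AF on the off-axis specific range
        captures.append((image, mover.current_position()))
        mover.move_to(position)                        # S507: next position of the scan pattern
    strips = [img_proc.cut_specific_range(img, pos) for img, pos in captures]
    side_image = img_proc.rescale_and_compose(strips)  # S508: width W from depth, then combine
    ctrl.send(side_image, destination)                 # S509: transmit the side observation image
```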
<Advantages of Second Embodiment>
The observation system 1 according to the present embodiment has the following advantages in addition to those obtained in the first embodiment.
In an observation apparatus that switches between the first imaging optical system and the second imaging optical system according to the purpose of observation, the side information processing unit of the observation apparatus 100 according to the present embodiment acquires a side observation image of the side surface of the cell 324 by exploiting the fact that the principal rays emitted from regions on the cell 324 separated from the optical axis of the first imaging optical system by a predetermined threshold or more are inclined with respect to that optical axis.
Therefore, by using the observation apparatus 100 according to the present embodiment, the user can obtain first images containing the side surface of the cell 324, and side observation images based on those images, neither of which can be obtained with an observation apparatus that is telecentric on the subject side.
Furthermore, the observation apparatus 100 according to the present embodiment can be combined with the observation apparatus 100 according to the first embodiment. For example, the stereoscopic image generation unit converts a specific range image acquired as described in the present embodiment into three-dimensional form, based on the depth information of each corresponding point contained in that specific range image. The observation apparatus 100 can then obtain a stereoscopic image of the cell 324 by compositing the three-dimensionalized specific range image onto the corresponding position of the three-dimensional model of the cell 324 described in the first embodiment.
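One way such a three-dimensionalization step might look is sketched below: each pixel of a specific range image is lifted to a 3D point using the depth assigned to its column, so the result can be merged with the cell's three-dimensional model. The parameterization and calibration values are assumptions; the patent does not prescribe this mapping.

```python
import numpy as np

def strip_to_point_cloud(strip, column_depths, x_positions, y_fixed, pixel_pitch_um):
    """Lift a specific range image into 3D points (a simplified sketch).

    strip          : 2D array (rows x cols) cut from one first image
    column_depths  : depth Z (µm) of the corresponding point for each column,
                     obtained from the first image movement amount ΔX
    x_positions    : X position (µm) on the sample for each column
    y_fixed        : Y position (µm) of this scan line
    pixel_pitch_um : sample-referred pixel pitch (assumed calibration value)

    Returns an (N, 4) array of (x, y, z, intensity) points.
    """
    rows, cols = strip.shape
    points = []
    for c in range(cols):
        for r in range(rows):
            # spread the rows of the strip across the width W around the column depth
            z = column_depths[c] + (r - rows / 2) * pixel_pitch_um
            points.append((x_positions[c], y_fixed, z, float(strip[r, c])))
    return np.asarray(points)
```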
<Modifications>
In the first and second embodiments, the objective lens 152b and the imaging lens 152c are drawn as positive lenses for simplicity, but the configuration is not limited to this. For example, the objective lens may be a lens group having negative refractive power in order to make the first image movement amount ΔX between the acquired first images smaller and the parallax larger. Needless to say, a plurality of lenses may be used according to the required performance.
The observation apparatus 100 according to the first and second embodiments may also perform the AF operation based on the depth information acquired from the parallax images when the first imaging optical system is used.
Although the first and second embodiments assume that the observation apparatus 100 is used inside an incubator and emphasize applications focused on cell observation, it goes without saying that the apparatus can be generalized as an observation apparatus for magnifying and checking details, capable of acquiring the shape, number, size, and the like of an observation object together with depth information about the observation object.
The sample 300 can be carried into and out of, for example, an incubator or a clean bench while remaining placed on the upper surface of the observation apparatus 100. In such cases, the cells 324 are exposed to temperature changes and may suffer heat shock, and contamination may also occur during the transfer. The present technology can appropriately acquire both the shape and size of an observation object, such as a cell or a cell group, and depth information such as the unevenness of that object. In the first and second embodiments, the case where the user is warned of an observation failure or the like according to the result of the determination using the NG determination condition and the retry determination condition for the images acquired by the observation apparatus 100 was described as an example, but the application of the present technology is not limited to this; the user can also be warned when an abnormality of the observation object, contamination, or the like is detected. The present technology can also evaluate the state of the culture medium based on the result of image analysis.
In the first and second embodiments, the case where the observation-side control circuit 110, the image processing circuit 120, the observation-side recording circuit 130, and the observation-side communication device 140 are provided inside the housing 101 as the circuit group 104 was described as an example, but the configuration is not limited to this. For example, one or more of these functions may be provided in the image acquisition unit 150, and the function of the observation-side communication device 140, for example, may be provided in both the image acquisition unit 150 and the circuit group 104. One or more of the functions of the observation-side control circuit 110, the image processing circuit 120, and the observation-side recording circuit 130 may also be provided in the controller 200; that is, part or all of the various determinations, image processing, and the like described above may be performed by the controller 200.
Conversely, some of the elements of the controller 200, such as the input/output device 270, may be included in the observation apparatus 100. A configuration in which the observation apparatus 100 and the controller 200 are housed in a single casing is also conceivable. An observation system 1 in which the observation apparatus 100 and the controller 200 are integrated can be used when the user enters the use environment, such as a temperature-controlled room.
The observation system 1 may also include artificial intelligence (AI) that records and learns observation results such as image analysis results, how the observation system 1 is used including the user's frequency of use, incubator settings, and the like, and presents to the user the various conditions, parameters, and the like to be set by the user. The AI may be built inside the observation system 1, for example in a DSP, or may be built on the Internet outside the observation system 1. An observation system 1 including such an AI can, for example, refer to a database prepared on a server to judge, for an acquired image, the state and type of the cells, the state of the culture medium, the presence or absence of foreign matter, and the like.
The present technology is also effective when applied to imaging apparatuses in which the imaging optical system is used at a position remote from the user, such as surveillance cameras and endoscopes. The user can obtain observation results, such as images suited to the purpose of use, without the trouble of replacing the imaging apparatus according to the purpose of imaging.
The present invention is not limited to the above embodiments and can be variously modified at the implementation stage without departing from the gist thereof. The embodiments may also be combined as appropriate, in which case the combined effects are obtained. Furthermore, the above embodiments include various inventions, and various inventions can be extracted by combinations selected from the plurality of disclosed constituent elements. For example, even if some constituent elements are removed from all the constituent elements shown in an embodiment, a configuration from which those constituent elements have been removed can be extracted as an invention, as long as the problem can be solved and the effects can be obtained.
Claims (12)
- An observation device comprising: an imaging unit including a first imaging optical system that is a magnifying optical system and a subject-side non-telecentric optical system in which the angle between at least one subject-side principal ray and the optical axis is 6° or more, the imaging unit imaging a sample with the first imaging optical system to acquire first images; a moving mechanism that changes the relative position between the sample and the imaging unit; and a three-dimensional information acquisition unit that acquires three-dimensional information about the sample based on a plurality of the first images acquired at different positions of the imaging unit and information on the imaging position at the time each first image was acquired.
- The observation device according to claim 1, wherein at least one of the plurality of first images is an image, obtained with the first imaging optical system, of a region other than on the optical axis of the first imaging optical system.
- The observation device according to claim 1 or 2, further comprising a corresponding point acquisition unit that acquires corresponding points based on subject images included in each of the plurality of first images acquired at the different imaging positions, wherein the three-dimensional information acquisition unit acquires the three-dimensional information based on the amount of change of the position of the corresponding point on the imaging surface and the interval between the imaging positions.
- The observation device according to any one of claims 1 to 3, further comprising a three-dimensional model generation unit that generates a three-dimensional model of a subject included in the sample based on the three-dimensional information.
- The observation device according to any one of claims 1 to 3, further comprising a stereoscopic image generation unit that generates a stereoscopic image of a subject included in the sample based on the three-dimensional information and the first images corresponding to the three-dimensional information.
- The observation device according to any one of claims 1 to 3, further comprising a side information processing unit that, for each of the plurality of first images, cuts out a region in focus on the subject as a specific range image, and combines the plurality of specific range images based on the three-dimensional information to generate a side observation image of the subject.
- The observation device according to any one of claims 1 to 6, wherein the imaging unit further includes a second imaging optical system that is a subject-side telecentric optical system in which the angle between every subject-side principal ray and the optical axis is 4° or less, the observation device further comprises a lens switching unit that switches between the first imaging optical system and the second imaging optical system, and the imaging unit further acquires a second image by imaging the sample with the second imaging optical system.
- The observation device according to claim 7, wherein the lens switching unit controls the position, in the optical axis direction, of a lens included in the imaging optical system so as to make the imaging optical system the first imaging optical system or the second imaging optical system.
- The observation device according to any one of claims 1 to 8, wherein the optical axis of the first imaging optical system is parallel to the observation axis, the moving mechanism moves the imaging unit while keeping the optical axis of the first imaging optical system parallel to the observation axis, and the three-dimensional information is information on the position, in the direction of the observation axis, of a subject included in the sample.
- An observation system comprising: the observation device according to any one of claims 1 to 9; and a controller that acquires a user operation result, outputs it to the observation device, and acquires an observation result from the observation device.
- A method for controlling an observation device, the method comprising: imaging a sample with a first imaging optical system to acquire first images, using an imaging unit including the first imaging optical system, which is a magnifying optical system and a subject-side non-telecentric optical system in which the angle between at least one subject-side principal ray and the optical axis is 6° or more; changing the relative position between the sample and the imaging unit; and acquiring three-dimensional information about the sample based on a plurality of the first images acquired at different positions of the imaging unit and information on the imaging position at the time each first image was acquired.
- The method for controlling an observation device according to claim 11, wherein the optical axis of the first imaging optical system is parallel to the observation axis, changing the relative position includes moving the imaging unit while keeping the optical axis of the first imaging optical system parallel to the observation axis, and acquiring the three-dimensional information includes acquiring information on the position, in the direction of the observation axis, of a subject included in the sample.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017029999A JP6330070B1 (en) | 2017-02-21 | 2017-02-21 | Observation device, observation system, and control method for observation device |
JP2017-029999 | 2017-02-21 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018154871A1 true WO2018154871A1 (en) | 2018-08-30 |
Family
ID=62186841
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/040772 WO2018154871A1 (en) | 2017-02-21 | 2017-11-13 | Observation device, observation system, and method for controlling observation device |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6330070B1 (en) |
WO (1) | WO2018154871A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210102887A1 (en) * | 2018-06-08 | 2021-04-08 | Olympus Corporation | Observation device |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7410701B2 (en) * | 2019-12-09 | 2024-01-10 | 株式会社ミツトヨ | Adapter optics and variable focal length optics |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006011145A (en) * | 2004-06-28 | 2006-01-12 | Olympus Corp | Binocular microscope apparatus |
JP2013088490A (en) * | 2011-10-14 | 2013-05-13 | Nikon Corp | Microscope, image acquisition method, program, and recording medium |
JP2016070720A (en) * | 2014-09-29 | 2016-05-09 | 株式会社ミツトヨ | Image measurement device and guidance display method of image measurement device |
JP2016157197A (en) * | 2015-02-23 | 2016-09-01 | 株式会社リコー | Self-position estimation device, self-position estimation method, and program |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4740826B2 (en) * | 2006-02-23 | 2011-08-03 | 株式会社神戸製鋼所 | Shape measuring device and shape measuring method |
2017
- 2017-02-21 JP JP2017029999A patent/JP6330070B1/en not_active Expired - Fee Related
- 2017-11-13 WO PCT/JP2017/040772 patent/WO2018154871A1/en active Application Filing
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210102887A1 (en) * | 2018-06-08 | 2021-04-08 | Olympus Corporation | Observation device |
US11635364B2 (en) * | 2018-06-08 | 2023-04-25 | Evident Corporation | Observation device |
Also Published As
Publication number | Publication date |
---|---|
JP6330070B1 (en) | 2018-05-23 |
JP2018136178A (en) | 2018-08-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8982457B2 (en) | Microscope system and illumination intensity adjusting method | |
CN104111524B (en) | The method of workflow in digit microscope and optimization digit microscope | |
US10116848B2 (en) | Illumination and imaging system for imaging raw samples with liquid in a sample container | |
CN106210520B (en) | A kind of automatic focusing electronic eyepiece and system | |
TW201126624A (en) | System and method for inspecting a wafer (2) | |
CN110261385B (en) | Appearance inspection system, image processing apparatus, imaging apparatus, and inspection method | |
US11119382B2 (en) | Tunable acoustic gradient lens system with amplitude adjustment corresponding to z-height as indicated by calibration data | |
US9291750B2 (en) | Calibration method and apparatus for optical imaging lens system with double optical paths | |
US11249225B2 (en) | Tunable acoustic gradient lens system utilizing amplitude adjustments for acquiring images focused at different z-heights | |
JP6698451B2 (en) | Observation device | |
JP6330070B1 (en) | Observation device, observation system, and control method for observation device | |
US10827114B2 (en) | Imaging system and setting device | |
JP2013034127A (en) | Imaging apparatus | |
CN109194851A (en) | A kind of ultrashort burnt Vision imaging system of miniaturization | |
JPH1172717A (en) | Microscopic digital photographing system | |
JP6640610B2 (en) | Observation device, measurement system and observation method | |
JP6815787B2 (en) | Observation device, control method of observation device, control program of observation device | |
CN113596441B (en) | Optical axis adjusting device, method, system and readable storage medium | |
JP2007327903A (en) | Visual inspection system | |
CN203759357U (en) | Optical image capturing system and optical detection system | |
CN114689281A (en) | Method for detecting pupil drift of optical module | |
JP2007103787A (en) | Inspection apparatus for solid-state imaging device | |
JP5362981B2 (en) | Imaging device | |
WO2020039920A1 (en) | Image processing system, image processing method, and program | |
JP2019033436A (en) | Imaging apparatus, imaging system, and imaging apparatus control method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17897409 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17897409 Country of ref document: EP Kind code of ref document: A1 |