
WO2013088657A1 - Projecting projector device, optical anti-glare method, and optical anti-glare program - Google Patents


Info

Publication number
WO2013088657A1
WO2013088657A1 (PCT/JP2012/007623, JP2012007623W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
pixel
pixel value
foreground
projection
Prior art date
Application number
PCT/JP2012/007623
Other languages
French (fr)
Japanese (ja)
Inventor
Ryoma Oami (大網 亮磨)
Original Assignee
NEC Corporation (日本電気株式会社)
Priority date
Filing date
Publication date
Application filed by NEC Corporation (日本電気株式会社)
Publication of WO2013088657A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/12: Picture reproducers
    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3179: Video signal processing therefor
    • H04N 9/3182: Colour adjustment, e.g. white balance, shading or gamut
    • H04N 9/3191: Testing thereof
    • H04N 9/3194: Testing thereof including sensor feedback
    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 17/00: Details of cameras or camera bodies; Accessories therefor
    • G03B 17/48: Details of cameras or camera bodies; Accessories therefor adapted for combination with other photographic or optical apparatus
    • G03B 17/54: Details of cameras or camera bodies; Accessories therefor adapted for combination with other photographic or optical apparatus with projector
    • G03B 21/00: Projectors or projection-type viewers; Accessories therefor
    • G03B 21/14: Details
    • G03B 21/20: Lamp housings
    • G03B 21/2086: Security or safety means in lamp houses

Definitions

  • The present invention relates to a projection-type projection device, a light anti-glare method, and a light anti-glare program for reducing the glare of the light projected when an image is projected.
  • Patent Document 1 describes a display device that controls the brightness of image light projected onto an area where a person is located.
  • The display device described in Patent Document 1 captures the screen with a camera attached to the projector and, when a presenter is detected in the captured region, reduces the brightness of the image light projected onto the region where the presenter is located.
  • Patent Document 2 describes a projector that prevents the various types of information included in a projected image from being displayed on the face of a presenter.
  • The projector described in Patent Document 2 detects a human face area from a captured image of the projection and projects an image in which the area corresponding to the face is replaced with a predetermined color.
  • Non-Patent Document 1 describes a projector system that adjusts the projector's light so that a speaker is not dazzled even when standing in front of the projector.
  • The projector system described in Non-Patent Document 1 detects the speaker's face area from an image containing both the screen and the speaker, and superimposes a mask image on the detected area.
  • Non-Patent Document 2 describes a person detection method that improves the accuracy of detecting a person in a high-illuminance environment.
  • In the method of Non-Patent Document 2, brightness adjustment is performed between the projected image and a captured image of the projection taken with a camera, and a difference image between the brightness-adjusted images is created, whereby the person is detected.
  • Patent Document 3 describes an image processing apparatus that extracts an object region without being affected by the pixel values of the background region.
  • The image processing apparatus described in Patent Document 3 obtains the difference values between input image data and background image data, and detects an obstacle candidate region using a predetermined binarization threshold.
  • The projected slide is important material for the audience. It is therefore desirable that the region in which the projected light is reduced be confined as closely as possible to the presenter's head, which in turn requires recognizing the presenter's head accurately and reducing the light only at that portion.
  • The display device described in Patent Document 1 detects the presenter by focusing on the luminance change in which the lower part of the screen is darkened by the presenter's body and legs in the image taken by the camera.
  • Because the display device of Patent Document 1 does not detect the person as such, it suffers from many false detections.
  • In the system of Non-Patent Document 1, when the slide is superimposed on the presenter's face, face detection does not operate correctly and the anti-glare function fails to work. The system also has the problem that, when a face is originally contained in the slide itself, that face too is detected and the anti-glare function is erroneously activated.
  • Patent Document 2 describes that the projector performs normalization, including distortion correction and brightness correction, on the comparison image.
  • Since the specific correction method is not described in Patent Document 2, however, it is difficult to say that the projector of Patent Document 2 can detect a person with high accuracy.
  • In Non-Patent Document 2, brightness adjustment is performed by matching the average value of the histogram to the pixel values of the input image.
  • When a predetermined luminance correction value is used, however, luminance correction is also applied to the person who is to be extracted, so the person may not be extracted appropriately.
  • It is therefore an object of the present invention to provide a projection-type projection apparatus, a light anti-glare method, and a light anti-glare program that, when an image is projected with high-illuminance light and a person is present in the projection direction of the image, can detect the person with high accuracy and reduce the glare of the projected light.
  • A projection-type projection apparatus according to the present invention includes: projection means; pixel value correction means for comparing a projection image, which is the image projected from the projection means, with a captured image obtained by photographing the projection image, or the projection image together with a foreground that is a moving object present between the projection image and the projection means, and correcting the pixel values of pixels in the background, the part of the captured image other than the foreground, so that they approach the pixel values of the corresponding pixels in the projection image; difference image generating means for generating a difference image from the difference values between the pixel values of each pixel in the corrected captured image and those of the corresponding pixels in the projection image; foreground extracting means for generating a binarized image, which is the difference image binarized based on a threshold value, and extracting from it a foreground area indicating the foreground portion; head detecting means for detecting the head of the moving object from the area of the captured image corresponding to the foreground area; anti-glare processing means for reducing glare at each pixel of the projection image corresponding to the area including the head; and dynamic threshold calculating means for extracting, from each of the projection image and the corrected captured image, the area corresponding to the background area, the area other than the foreground area in the binarized image, and calculating for each pixel, based on the differences in pixel values between the extracted areas, the threshold value used for binarizing the difference image.
  • The foreground extracting means generates the binarized image from the difference image based on the threshold values calculated by the dynamic threshold calculating means and extracts the foreground area from it, and the pixel value correcting means corrects the pixel values of pixels other than the foreground area extracted by the foreground extracting means.
  • A light anti-glare method according to the present invention compares a projection image, which is the image projected from projection means, with a captured image obtained by photographing the projection image, or the projection image together with a foreground that is a moving object present between the projection image and the projection means; corrects the pixel values of pixels in the background, the part of the captured image other than the foreground, so that they approach the pixel values of the corresponding pixels in the projection image; generates a difference image from the difference values between the pixel values of each pixel in the corrected captured image and those of the corresponding pixels in the projection image; generates a binarized image by binarizing the difference image based on a threshold value; extracts from the binarized image a foreground region indicating the foreground portion; detects the head of the moving object from the region of the captured image corresponding to the foreground region; and reduces glare at each pixel of the projection image corresponding to the region including the head.
  • In the method, the region corresponding to the background region in the binarized image is extracted from each of the projection image and the corrected captured image, and the threshold value used for binarizing the difference image is calculated for each pixel based on the differences between the pixel values of the extracted regions. When the foreground region is extracted, the binarized image is generated from the difference image based on the calculated threshold values, the foreground region is extracted from it, and the pixel values of pixels other than the extracted foreground region are corrected.
  • A light anti-glare program according to the present invention causes a computer to execute: pixel value correction processing for comparing a projection image, which is the image projected from a projection unit, with a captured image obtained by photographing the projection image, or the projection image together with a foreground that is a moving object present between the projection image and the projection unit, and correcting the pixel values of pixels in the background, the part of the captured image other than the foreground, so that they approach the pixel values of the corresponding pixels in the projection image; difference image generation processing for generating a difference image from the difference values between the pixel values of each pixel in the corrected captured image and those of the corresponding pixels in the projection image; foreground extraction processing for generating a binarized image, which is the difference image binarized based on a threshold value, and extracting from it a foreground region indicating the foreground portion; head detection processing for detecting the head of the moving object from the region of the captured image corresponding to the foreground region; anti-glare processing for reducing glare at each pixel of the projection image corresponding to the region including the head; and dynamic threshold calculation processing for extracting, from each of the projection image and the corrected captured image, the region corresponding to the background region, the region other than the foreground region in the binarized image, and calculating for each pixel, based on the differences in pixel values between the extracted regions, the threshold value used for binarizing the difference image.
  • In the foreground extraction processing, the binarized image is generated from the difference image based on the threshold values calculated in the dynamic threshold calculation processing and the foreground region is extracted from it; in the pixel value correction processing, the pixel values of pixels other than the foreground region extracted by the foreground extraction processing are corrected.
  • According to the present invention, when an image is projected with high-illuminance light and a person is present in the projection direction of the image, the person is detected with high accuracy and the glare of the projected light can be reduced.
  • FIG. 1 is a block diagram showing a configuration example of the first embodiment of the projection-type projection apparatus according to the present invention.
  • As shown in FIG. 1, the projection-type projection apparatus of the first embodiment includes a geometric correction unit 101, a difference image generation unit 102, a foreground extraction unit 103, a head detection unit 104, an anti-glare processing unit 105, a pixel value correction unit 121, a pixel value correction model calculation unit 122, a slide change determination unit 123, and a dynamic threshold calculation unit 124.
  • The geometric correction unit 101 receives an image captured by the camera (hereinafter, a camera image) and geometrically corrects it so that it matches the shape of the slide video projected on the screen by the projection-type projector (hereinafter, a slide image).
  • That is, the geometric correction unit 101 geometrically corrects the camera image captured by the camera into a rectangle according to the shape of the slide image.
  • Hereinafter, a camera image geometrically corrected by the geometric correction unit 101 may be referred to as a geometrically corrected image.
  • As the geometric correction method, a widely known general method, such as one using a projective transformation matrix, can be used.
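As an illustrative sketch only (the patent does not fix a particular implementation), correction with a projective transformation matrix can be applied per output pixel with nearest-neighbour sampling, assuming a known 3×3 homography H that maps output (slide-frame) coordinates to camera coordinates:

```python
import numpy as np

def warp_with_homography(camera_img, H, out_shape):
    """Geometrically correct camera_img into the slide's rectangular frame.
    H is a 3x3 homography mapping output (slide) coordinates to camera
    coordinates; sampling is nearest-neighbour."""
    h, w = out_shape
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    src = H @ pts
    src /= src[2]                                   # perspective divide
    sx = np.clip(np.rint(src[0]).astype(int), 0, camera_img.shape[1] - 1)
    sy = np.clip(np.rint(src[1]).astype(int), 0, camera_img.shape[0] - 1)
    return camera_img[sy, sx].reshape(h, w)
```

In practice H would be estimated from four screen-corner correspondences; the function names and sampling strategy here are assumptions for illustration.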
  • In some cases the camera image is an image of only the slide image projected on the screen; in other cases it is an image in which the presenter is captured together with the slide image.
  • Hereinafter, "camera image" means either an image in which the slide image is captured or an image in which both the slide image and the presenter are captured.
  • Since the presenter captured in the camera image is an object that moves while the slide image is being projected, the presenter may be referred to below as a moving object.
  • The slide video is sometimes referred to as the background, and the presenter present in front of the slide video as the foreground.
  • The pixel value correction model calculation unit 122 generates a model defining how the pixel values of the camera image are corrected so as to approach the pixel values of the corresponding slide image (hereinafter, a pixel value correction model).
  • The geometrically corrected image is used as this camera image.
  • A luminance value may also be used as the pixel value.
  • In general, the pixel values of corresponding pixels differ between the captured image (that is, the camera image) and the slide image. Correcting the pixel values of the camera image so that they approach those of the slide image therefore improves the accuracy with which the two images can be compared.
  • The pixel value correction model calculation unit 122 generates the pixel value correction model based on the relationship between the pixel values of pixels in the slide image and the pixel values of the corresponding pixels in the camera image. In the present embodiment, it compares the pixel values of the background pixels of the camera image with the pixel values of the corresponding pixels in the slide image.
  • An appropriate pixel value correction model can be generated by comparing pixel values only in the portion other than the foreground to be extracted (that is, the background), excluding the portion where the slide image and the camera image do not correspond (that is, the foreground).
  • The foreground area indicating the foreground portion of the camera image is specified by the foreground extraction means 103, described later.
  • Hereinafter, information indicating the foreground area may be referred to as foreground information.
  • The pixel value correction model calculation unit 122 may, for example, use least-squares estimation to identify the correspondence between the pixel values of pixels in the slide image and those of the corresponding background pixels of the camera image, and generate the pixel value correction model from that correspondence. Alternatively, it may generate the model based on a conversion process that matches the histograms of the two images.
  • FIG. 2 is an explanatory diagram showing an example of the correspondence between the brightness of the camera image and the brightness of the slide image.
  • In FIG. 2, the horizontal axis indicates the brightness of the camera image and the vertical axis the brightness of the slide image.
  • The black rhombi in FIG. 2 indicate the observed correspondence between the brightness of the camera image and the brightness of the slide image, and the white squares indicate the correspondence between the pixel values of the slide image and those of the camera image identified by least-squares estimation.
  • In this example, the correspondence between the two is represented by a linear (straight-line) relationship.
  • The pixel value correction model calculation unit 122 may generate one pixel value correction model for the entire image, or a separate model for each partial region of the image. When generating a model per region, it may additionally generate a separate model for the vicinity of region boundaries. For example, the relationship between pixel value changes may differ between the center and the periphery of the projected image; in such a case, generating a pixel value correction model for each partial region improves the accuracy of the pixel value conversion.
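A minimal sketch of the least-squares estimation step described above, assuming (as FIG. 2 suggests) a straight-line model slide ≈ a·camera + b fitted over corresponding background pixels only:

```python
import numpy as np

def fit_correction_model(cam_bg, slide_bg):
    """Fit slide ~= a * cam + b by least squares over background pixels.
    cam_bg / slide_bg are 1-D arrays of corresponding background pixel values."""
    a, b = np.polyfit(cam_bg.astype(float), slide_bg.astype(float), 1)
    return a, b

def apply_correction_model(cam_img, a, b):
    # Bring camera pixel values close to the slide image's value range.
    return a * cam_img.astype(float) + b
```

A per-region variant would simply fit one (a, b) pair per partial region; the polynomial or piecewise-linear models mentioned later could replace the degree-1 fit.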
  • The pixel value correction unit 121 compares the slide image and the camera image, and corrects the pixel values of the background portion of the camera image, that is, the portion other than the foreground. Specifically, based on the pixel value correction model calculated by the pixel value correction model calculation unit 122, it corrects the background pixel values of the camera image so that they approach the pixel values of the corresponding slide image. For example, when the pixel value correction model is expressed in linear form, the pixel value correction unit 121 may correct a pixel value by multiplying the background pixel value of the camera image by a constant and adding another constant. The model need not be linear; for example, an expression approximated by a polynomial or a piecewise-linear (broken-line) function may be used as the pixel value correction model.
  • As described above, the pixel value correction unit 121 corrects the pixel values of the background portion based on the pixel value correction model calculated by the pixel value correction model calculation unit 122. It can therefore be said that the pixel value correction unit 121 corrects the pixel values of the pixels other than the foreground area specified by the foreground extraction unit 103.
  • The difference image generation unit 102 generates a difference image from the differences between the pixel values of each pixel in the camera image after pixel value correction (hereinafter, a pixel value corrected image) and the pixel values of the corresponding pixels in the slide image.
  • The foreground extraction unit 103 generates an image obtained by binarizing the difference image generated by the difference image generation unit 102 based on a threshold value determined for each pixel (hereinafter, a binarized image), and extracts from it a foreground area indicating the foreground.
  • This threshold value is dynamically determined by the dynamic threshold calculation unit 124, described later.
  • Specifically, the foreground extraction means 103 identifies the pixels whose value in the difference image is greater than the threshold value.
  • The foreground extraction unit 103 determines that a block area of a predetermined size is part of the foreground when the block contains at least a predetermined number of such pixels, and identifies the set of block areas so determined as the foreground area. The portion other than the foreground area corresponds to the background.
  • An image indicated by the foreground area (that is, the set of block areas) specified in this way may be referred to as a mask image.
  • The foreground extraction means 103 notifies the dynamic threshold calculation means 124 and the pixel value correction model calculation means 122 of information indicating the extracted foreground area.
  • FIG. 3 is an explanatory diagram showing an example of processing for generating a mask image.
  • When the camera image illustrated in FIG. 3A is input, the difference image generation unit 102 generates the difference image illustrated in FIG. 3B.
  • The foreground extraction unit 103 then generates the binarized image illustrated in FIG. 3C by comparison with the per-pixel threshold values, determines for each block area of a predetermined unit whether it is part of the foreground, and generates the mask image illustrated in FIG. 3D.
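The binarization and block-wise foreground decision described above might look like the following sketch; the block size and in-block pixel count are illustrative parameters, not values fixed by the patent:

```python
import numpy as np

def extract_foreground_mask(diff_img, thresh_map, block=8, min_count=32):
    """Binarize with a per-pixel threshold, then mark each block area as
    foreground when it contains at least min_count above-threshold pixels;
    the union of such blocks is the mask image."""
    binary = (diff_img > thresh_map).astype(np.uint8)
    mask = np.zeros_like(binary)
    h, w = binary.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            if binary[y:y + block, x:x + block].sum() >= min_count:
                mask[y:y + block, x:x + block] = 1
    return binary, mask
```

The block-wise vote suppresses isolated noisy pixels that would otherwise survive a purely per-pixel threshold.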
  • The head detection means 104 detects the head of the moving object from the area of the camera image corresponding to the foreground extracted by the foreground extraction means 103.
  • For example, the head detection unit 104 may detect the head by pattern recognition using templates or feature amounts describing head patterns viewed from all directions over 360 degrees. Since methods for detecting a head from an image containing a person are widely known, a detailed description is omitted here.
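As one hedged illustration of the template-based pattern recognition mentioned above (a placeholder for the multi-orientation templates or learned features a real detector would use), exhaustive matching by sum of squared differences over the foreground region can be sketched as:

```python
import numpy as np

def match_head_template(region, template):
    """Exhaustive template matching by sum of squared differences (SSD);
    returns the (y, x) of the best-matching window. A real detector would
    use head templates from many viewing directions, not a single one."""
    rh, rw = region.shape
    th, tw = template.shape
    best, best_pos = np.inf, (0, 0)
    for y in range(rh - th + 1):
        for x in range(rw - tw + 1):
            win = region[y:y + th, x:x + tw].astype(float)
            ssd = np.sum((win - template) ** 2)
            if ssd < best:
                best, best_pos = ssd, (y, x)
    return best_pos
```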
  • The anti-glare processing means 105 performs processing that reduces glare at each pixel of the slide image corresponding to the region that includes the head detected by the head detection means 104 in the camera image. Specifically, the anti-glare processing unit 105 reduces the luminance of each pixel of the slide image corresponding to the region including the head.
  • The anti-glare processing means 105 may, for example, reduce the luminance values of the pixels in the head region uniformly (for example, to a luminance value of 0), or may reduce the luminance by superimposing on the head region an elliptical or circular pattern figure whose luminance gradually increases with distance from the center.
  • FIG. 4 is an explanatory view showing an example of the anti-glare process.
  • When the mask image illustrated in FIG. 4B has been generated, the head detection means 104 detects the head from the part of the camera image corresponding to the foreground area.
  • In FIG. 4C, the detected head is shown with a black frame.
  • The anti-glare processing unit 105 then reduces glare by, for example, superimposing the black rectangular image illustrated in FIG. 4D on the slide image.
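The gradually-brightening circular pattern mentioned above could be realized, for instance, as a radial gain that is 0 at the head centre and reaches 1 at the pattern's rim; the centre and radius would come from the head detector, and this parameterization is an assumption for illustration:

```python
import numpy as np

def dim_head_region(slide, cy, cx, radius):
    """Reduce luminance over the head region with a circular pattern whose
    brightness gradually increases with distance from the centre."""
    ys, xs = np.mgrid[0:slide.shape[0], 0:slide.shape[1]]
    dist = np.hypot(ys - cy, xs - cx)
    gain = np.clip(dist / radius, 0.0, 1.0)     # 0 at centre, 1 outside
    return (slide.astype(float) * gain).astype(slide.dtype)
```

The uniform-dimming alternative is the special case gain = 0 over the whole head rectangle.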
  • The dynamic threshold calculation means 124 calculates, for each pixel, the threshold value that the foreground extraction means 103 uses for binarization of the difference image. First, it extracts the regions corresponding to the background from the slide image and from the pixel value corrected image. It then calculates a threshold value for each pixel based on the difference between the pixel values of the corresponding pixels in the extracted regions: the larger the difference between the pixel values, the larger the calculated threshold value.
  • Let the pixel value at coordinates (x, y) of the slide image be I_slide(x, y), the pixel value of the camera image be I_cam(x, y), and the correction function derived from the pixel value correction model (hereinafter, the luminance correction function) be f(I), where I is a luminance value. The threshold value Th(x, y) of each pixel is then calculated by Equation 1, exemplified below:
  • Th(x, y) = α |I_slide(x, y) − f(I_cam(x, y))|   (Equation 1)
  • Here α (α > 1) is a multiplier; for example, a value such as 1.5 is set for α. From this equation, a portion with a large pixel value difference, such as a character region, receives a large threshold value, and a portion with a small pixel value difference, such as a flat region, receives a small threshold value.
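The per-pixel threshold of Equation 1 reduces to a few lines, with α = 1.5 as in the example and f the luminance correction function obtained from the pixel value correction model:

```python
import numpy as np

def dynamic_threshold(slide, cam, f, alpha=1.5):
    """Per-pixel threshold Th(x, y) = alpha * |I_slide - f(I_cam)|:
    large where the slide and the corrected camera image already disagree
    (e.g. text edges), small in flat regions."""
    return alpha * np.abs(slide.astype(float) - f(cam.astype(float)))
```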
  • The foreground extraction unit 103 generates the binarized image from the difference image based on the threshold values calculated by the dynamic threshold calculation unit 124, which makes the foreground easier to distinguish reliably.
  • The slide change determination means 123 determines whether the slide image has changed. For example, it may judge this from the inter-frame difference (for example, whether the difference is a predetermined value or more), or by using a shot-boundary detection method of the kind used to detect temporal division points in television video.
  • The pixel value correction model calculation unit 122 recalculates the pixel value correction model when the slide change determination unit 123 determines that the slide image has changed. Note that, when determining an image change, it is preferable to use the slide image, whose luminance is stable, rather than the camera image.
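A hedged sketch of the inter-frame difference test for slide changes; the fraction and per-pixel tolerance are illustrative, since the patent only requires comparison against "a predetermined value or more":

```python
import numpy as np

def slide_changed(prev_slide, cur_slide, pix_tol=10, frac=0.05):
    """Declare a slide change when more than `frac` of the pixels moved
    by more than `pix_tol` between consecutive slide frames."""
    moved = np.abs(cur_slide.astype(int) - prev_slide.astype(int)) > pix_tol
    return bool(moved.mean() > frac)
```

On a True result, the pixel value correction model would be recomputed as described above.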
  • The geometric correction means 101, difference image generation means 102, foreground extraction means 103, head detection means 104, anti-glare processing means 105, pixel value correction means 121, pixel value correction model calculation means 122, slide change determination means 123, and dynamic threshold calculation means 124 are realized by the CPU of a computer that operates according to a program (a person detection program).
  • For example, the program is stored in a storage unit (not shown) of the projection-type projection apparatus, and the CPU reads the program and, in accordance with it, operates as the geometric correction unit 101, difference image generation unit 102, foreground extraction unit 103, head detection unit 104, anti-glare processing unit 105, pixel value correction unit 121, pixel value correction model calculation unit 122, slide change determination unit 123, and dynamic threshold calculation unit 124.
  • Alternatively, each of these units may be realized by dedicated hardware.
  • FIG. 5 is a flowchart showing an operation example of the projection-type projection apparatus of the present embodiment.
  • Assume that no moving object is captured in the camera image in the initial state.
  • For the pixel value correction model, a model assumed for this situation is set as the initial model; likewise, a value assumed for this situation is set as the initial threshold value.
  • First, the geometric correction unit 101 geometrically corrects the camera image (step S1).
  • Next, the pixel value correction unit 121 corrects the pixel values of the geometrically corrected image based on the pixel value correction model (step S2). In the initial state, since no moving object is captured in the camera image, the pixel value correction unit 121 corrects the pixel values of the entire geometrically corrected image.
  • The difference image generation means 102 generates a difference image from the differences between the pixel values of each pixel in the pixel value corrected image and those of the corresponding pixels in the slide image (step S3).
  • The foreground extraction means 103 generates a binarized image from the difference image and extracts the foreground area (step S4). In the initial state, no moving object is captured in the camera image, so no foreground area is extracted from the binarized image.
  • The dynamic threshold calculation means 124 calculates a threshold value for each pixel based on the differences between the pixel values of the corresponding pixels of the slide image and the pixel value corrected image (step S5). In the initial state, no moving object is captured in the camera image, so the dynamic threshold calculation unit 124 calculates a threshold value for every pixel of the entire image.
  • The pixel value correction model calculation unit 122 generates a pixel value correction model based on the relationship between the pixel values of the corresponding pixels of the slide image and the camera image (step S6). In the initial state, no moving object is captured in the camera image, so the pixel value correction model calculation unit 122 generates the model from the pixel values of the entire image.
  • FIG. 6 is a flowchart showing another example of the operation of the projection type projection apparatus of this embodiment.
  • the process from the geometric correction unit 101 geometrically correcting the camera image until the difference image is generated from the pixel value corrected image and the slide image is the same as the process shown in steps S1 to S3 in FIG.
  • the foreground extraction means 103 generates a binarized image from the difference image and extracts the foreground area (step S4).
  • the foreground area is extracted from the binarized image.
  • the head detection means 104 detects the head of the moving body from the area corresponding to the foreground area extracted by the foreground extraction means 103 in the camera image (step S11).
  • the anti-glare processing unit 105 performs a process of reducing glare on each pixel of the slide image corresponding to the region including the head detected by the head detection unit 104 from the camera image (step S12).
  • the dynamic threshold value calculation unit 124 compares the pixels other than the foreground area of the slide image and the pixel-value-corrected image, and calculates a threshold value for each pixel based on the difference between the pixel values of the corresponding pixels (step S13).
  • the pixel value correction model calculation unit 122 compares the pixels other than the foreground area of the slide image and the camera image, and generates a pixel value correction model based on the relationship between the pixel values of the corresponding pixels (step S14). Thereafter, the pixel value correction unit 121 corrects the pixel values of the geometrically corrected image based on the pixel value correction model generated by the pixel value correction model calculation unit 122 in step S14.
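Steps S13–S14 above (fitting the pixel value correction model from background pixels only, then correcting the camera image) can be sketched as follows. This is a minimal illustration under an assumed model form: the excerpt does not specify the model, so a per-channel linear fit (gain and offset, least squares over background pixels) stands in for the patent's pixel value correction model.

```python
import numpy as np

def fit_correction_model(camera, slide, background_mask):
    """Fit, per channel, a linear model slide ~ gain * camera + offset
    using only pixels marked as background (foreground excluded),
    as in step S14."""
    gains, offsets = [], []
    for c in range(camera.shape[2]):
        x = camera[..., c][background_mask].astype(np.float64)
        y = slide[..., c][background_mask].astype(np.float64)
        g, o = np.polyfit(x, y, 1)  # least-squares line fit
        gains.append(g)
        offsets.append(o)
    return np.array(gains), np.array(offsets)

def apply_correction(camera, gains, offsets):
    """Correct the camera image so its background pixels approach
    the corresponding slide pixel values."""
    corrected = camera.astype(np.float64) * gains + offsets
    return np.clip(corrected, 0, 255).astype(np.uint8)
```

The background mask here is simply the complement of the foreground region extracted in step S4, so a moving body does not distort the fit.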
  • the pixel value correcting unit 121 compares the slide image with the captured camera image, and corrects the pixel values of the background portion of the camera image so that they approach the pixel values of the corresponding slide image.
  • the difference image generation unit 102 generates a difference image from the difference value between the pixel value of each pixel in the corrected camera image and the pixel value of each pixel in the corresponding slide image.
  • the foreground extraction means 103 generates a binarized image from the difference image based on the threshold value, and extracts the foreground area from the binarized image.
  • the head detecting means 104 detects the presenter's head from an area corresponding to the foreground area in the camera image.
  • the anti-glare processing means 105 performs a process of reducing glare on each pixel of the slide image corresponding to the region including the head.
  • Dynamic threshold calculation means 106 extracts regions corresponding to the background region in the binarized image from the slide image and the corrected camera image, respectively. Then, the dynamic threshold value calculation unit 106 calculates a threshold value used for binarization of the difference image for each pixel based on the difference between the pixel values of the corresponding pixels between the extracted images.
  • the foreground extraction unit 103 generates a binarized image from the difference image based on the threshold value calculated by the dynamic threshold value calculation unit 106, and extracts a foreground region from the binarized image. Further, the pixel value correcting unit 121 corrects the pixel values of the pixels other than the foreground region extracted by the foreground extracting unit 103.
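The recap above (corrected camera image → difference image → per-pixel threshold → binarized mask) might be sketched like this. Equation 1, which defines the actual threshold, is not reproduced in this excerpt, so a hypothetical rule is assumed in which the threshold grows with the residual slide/camera difference on the background plus a base margin.

```python
import numpy as np

def dynamic_thresholds(slide_gray, corrected_gray, background_mask,
                       base=10.0, scale=0.5):
    """Hypothetical per-pixel threshold (stand-in for Equation 1):
    larger where the slide and the corrected camera image already
    disagree, so residual background error is not mistaken for
    foreground."""
    diff = np.abs(slide_gray.astype(np.float64) -
                  corrected_gray.astype(np.float64))
    thresh = base + scale * diff
    thresh[~background_mask] = base  # keep the base margin elsewhere
    return thresh

def extract_foreground(slide_gray, corrected_gray, thresh):
    """Binarize the difference image with the per-pixel thresholds."""
    diff = np.abs(slide_gray.astype(np.float64) -
                  corrected_gray.astype(np.float64))
    return diff > thresh
```

In the loop described above, the mask returned by `extract_foreground` feeds both the head detection step and the next round of threshold and correction-model updates.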
  • FIG. 7 is a block diagram showing a configuration example of the second embodiment of the projection type projection apparatus according to the present invention.
  • like the first embodiment, the projection type projection apparatus of the second embodiment includes a geometric correction unit 101, a difference image generation unit 102, a foreground extraction unit 103, a head detection unit 104, an anti-glare processing unit 105, and a pixel value correction unit 121.
  • the projection type projection apparatus of the second embodiment is different from the projection type projection apparatus of the first embodiment in that the foreground determination means 125 is further provided.
  • the foreground determination means 125 determines the validity of the foreground area extracted by the foreground extraction means 103 and, if it determines that the foreground area is appropriate, notifies the dynamic threshold calculation means 124 and the pixel value correction model calculation means 122 of information indicating the extracted foreground area.
  • when a moving body is photographed in the camera image, the moving body generally exists as a lump in a part of the screen. Therefore, when the foreground area is distributed over the entire screen, a change in the illuminance of the slide, rather than a true foreground, may have been detected. In addition, when the illuminance of the slide changes, the difference in luminance values between time-series camera images is generally small.
  • because the dynamic threshold value calculation unit 124 calculates the threshold values from a camera image in which only the background area is captured, the threshold values tend to be kept low. As a result, there is a possibility that the entire screen is determined to be the foreground area when the illuminance of the slide changes.
  • the foreground determination means 125 determines the validity of the foreground area extracted by the foreground extraction means 103 based on the time-series pixel value change of the camera image and the distribution of the foreground area with respect to the entire camera image.
  • when the foreground determination unit 125 determines that the foreground region was extracted due to a change in illuminance, the pixel value correction model calculation unit 122 regenerates the luminance correction model, and the dynamic threshold calculation unit 124 recalculates the threshold values. Then, after the foreground extraction means 103 regenerates the mask image, the foreground determination means 125 performs the above determination again.
  • the dynamic threshold value calculation means 124 extracts the area corresponding to the background area from the slide image and the corrected camera image using the foreground area determined to be appropriate by the foreground determination means 125. Further, the pixel value correction model calculation unit 122 generates a pixel value correction model.
  • the foreground determination unit 125 first expands the mask image and performs determination on the expanded image.
  • the foreground determination unit 125 performs determination by assuming, for example, an area obtained by expanding the mask image by a certain number of pixels as a foreground area. For example, in order to expand N pixels in the vertical and horizontal directions, the foreground determination unit 125 may perform processing by applying a maximum value filter having a window of (N + 1) ⁇ (N + 1) pixels.
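The dilation step can be sketched directly. Note that a maximum filter with a centered (2N+1)×(2N+1) window expands a binary mask by N pixels in each direction; the sketch below implements that with plain NumPy, with the expansion radius as a parameter.

```python
import numpy as np

def dilate_mask(mask, n):
    """Expand a binary foreground mask by n pixels vertically and
    horizontally, equivalent to a maximum-value filter with a centered
    (2n+1) x (2n+1) window."""
    h, w = mask.shape
    padded = np.pad(mask, n, mode='constant')
    out = np.zeros_like(mask)
    for dy in range(2 * n + 1):
        for dx in range(2 * n + 1):
            out |= padded[dy:dy + h, dx:dx + w]  # OR over shifted copies
    return out
```

The foreground determination step is then run on the dilated mask, so pixels just outside the detected lump are also treated as foreground.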
  • the foreground determination means 125 detects this state separately.
  • FIG. 8 is an explanatory diagram showing an example of the luminance difference between the camera image after the luminance correction and the slide image.
  • as shown in FIG. 8A, when the difference between the slide image and the camera image at frame N is small, the number of pixels exceeding the threshold is small, as shown in FIG. 8B.
  • in FIG. 8C, it is assumed that the content of the slide image does not change, but the illuminance of the projector changes between frame N and frame N+1. In this case, as a result of the overall change in the luminance values of the camera image, a large number of pixels exceeding the threshold are extracted, as shown in FIG. 8D.
  • the foreground determination unit 125 compares the binarized images between two time-series frames, and determines whether or not the number of pixels exceeding the threshold (that is, the pixels determined not to be the background region) has increased over the image as a whole.
  • the foreground determination means 125 compares the binarized images between two frames.
  • for each pixel exceeding the threshold, the foreground determination means 125 obtains the difference between the pixel value of that pixel in the target image and the pixel value of the same pixel in the image one frame before. When this difference is within a certain threshold, the foreground determination means 125 checks the degree of variation in the positions of such pixels.
  • the foreground determination means 125 can determine the degree of variation by obtaining the variance of the coordinate values. When the variance is equal to or greater than a certain value, the pixels are considered to be spread over the entire screen, so the foreground determination unit 125 determines that a change in luminance due to a change in illuminance has occurred.
  • the foreground determination means 125 may perform the determination process using the following method. The determination process described below uses two features: that the absolute value of the luminance change between the compared images is not very large, and that the luminance change appears over the entire screen.
  • the foreground determination means 125 checks by how much the pixel value differences that exceeded the binarization threshold actually exceed that threshold. When the excess is within a certain threshold, the foreground determination means 125 checks the degree of variation in the positions of those pixels. The foreground determination means 125 can determine the degree of variation by obtaining the variance of the coordinate values. When the variance is equal to or greater than a certain value, the pixels are considered to be spread over the entire screen, so the foreground determination unit 125 determines that a change in luminance due to a change in illuminance has occurred.
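Both variants of the validity check share the same skeleton: a small luminance change per candidate pixel combined with a large variance of candidate-pixel coordinates indicates a global illuminance change rather than a moving body. A rough sketch follows; the two cutoff values are illustrative assumptions, not values from the patent.

```python
import numpy as np

def is_illuminance_change(curr_gray, prev_gray, over_threshold_mask,
                          max_luma_delta=15.0, min_coord_var=200.0):
    """Heuristic sketch of the foreground-validity test: small
    frame-to-frame luminance change plus widely scattered candidate
    pixels suggests an illuminance change, not a moving body."""
    ys, xs = np.nonzero(over_threshold_mask)
    if ys.size == 0:
        return False
    delta = np.abs(curr_gray.astype(np.float64) -
                   prev_gray.astype(np.float64))[over_threshold_mask]
    if np.mean(delta) > max_luma_delta:
        return False  # large change: likely a real foreground object
    # variance of the pixel coordinates measures how spread out they are
    coord_var = np.var(ys) + np.var(xs)
    return coord_var >= min_coord_var
```

When this function returns true, the correction model and thresholds would be regenerated as described above rather than triggering the anti-glare processing.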
  • the foreground determination means 125 determines the validity of the foreground area. Therefore, in addition to the effects of the first embodiment, the accuracy of determining the foreground region can be improved.
  • FIG. 9 is a block diagram showing an outline of a projection type projection apparatus according to the present invention.
  • a projection type projection apparatus according to the present invention includes the following means:
  • pixel value correction means 81 (for example, pixel value correction means 121) that compares a projection image (for example, a slide image), which is an image projected from projection means (for example, a projector), with a captured image (for example, a camera image), which is an image obtained by capturing the projection image or by capturing, together with the projection image, a foreground (for example, a presenter) that is a moving body existing between the projection image and the projection means, and corrects the pixel values of background pixels of the captured image so as to approach the pixel values of the corresponding pixels in the projection image;
  • difference image generation means 82 (for example, difference image generation means 102) that generates a difference image from the difference values between the pixel value of each pixel in the corrected captured image and the pixel value of each pixel in the corresponding projection image;
  • foreground extraction means 83 (for example, foreground extraction means 103) that generates a binarized image, which is an image obtained by binarizing the difference image based on a threshold, and extracts a foreground region (for example, a mask image) indicating the foreground portion from the binarized image;
  • head detection means 84 (for example, head detection means 104) that detects the head of the moving body from the region corresponding to the foreground region in the captured image;
  • anti-glare processing means 85 (for example, anti-glare processing means 105) that performs processing to reduce glare on each pixel of the projection image corresponding to the region including the head; and
  • dynamic threshold calculation means 86 (for example, dynamic threshold calculation means 106) that extracts the region corresponding to the background region, which is the region other than the foreground region in the binarized image, from the projection image and the corrected captured image, and calculates, for each pixel, the threshold used for binarization of the difference image based on the differences between the pixel values of the corresponding pixels between the extracted images.
  • the foreground extraction means 83 generates a binary image from the difference image based on the threshold value calculated by the dynamic threshold value calculation means 86, and extracts the foreground region from the binary image.
  • the pixel value correcting unit 81 corrects the pixel values of the pixels other than the foreground area extracted by the foreground extracting unit 83.
  • the projection type projection apparatus may include pixel value correction model calculation means (for example, pixel value correction model calculation means 122) that calculates a pixel value correction model, which is a model defining the pixel value of the captured image in terms of the pixel value of the corresponding projection image, based on the relationship between the pixel values of the pixels of the projection image and the pixel values of the background pixels among the corresponding pixels of the captured image.
  • the pixel value correction model calculation unit may specify the background from the foreground region extracted by the foreground extraction unit 83, and the pixel value correction unit 81 may correct the pixel values of the captured image based on the pixel value correction model.
  • the projection type projector may include a change determination unit (for example, a slide change determination unit 123) that determines whether or not the projection image has changed.
  • the pixel value correction model calculating unit may generate a pixel value correction model when it is determined that the projection image has changed. In this case, the pixel value correction model calculation means can update the pixel value correction model at an appropriate timing.
  • the projection type projection apparatus may include foreground determination means (for example, foreground determination means 125) that determines the validity of the foreground region extracted by the foreground extraction unit 83 based on the time-series pixel value changes of the captured image and the distribution of the foreground region over the entire captured image. Then, the dynamic threshold value calculation means 86 may extract the areas corresponding to the background area from the projection image and the corrected captured image using the foreground area determined to be appropriate by the foreground determination means, and the pixel value correction model calculation means may calculate the pixel value correction model by specifying the background of the captured image using that foreground area. In this case, the accuracy of determining the foreground area can be improved.
  • the projection type projection apparatus may include a geometric correction unit (for example, a geometric conversion unit 101) that geometrically corrects the captured image in accordance with the shape (for example, a rectangle) of the projection image. Then, the pixel value correcting means may correct the pixel value of the captured image by comparing the projected image with the geometrically corrected captured image.
  • the anti-glare processing means 85 may perform processing for reducing the luminance of each pixel of the projection image corresponding to the region including the head (for example, processing for superimposing a black image).
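The anti-glare operation itself can be sketched as darkening the projection image inside the head region; a factor of 0 corresponds to superimposing a black image there, as mentioned above. The factor parameter is an illustrative assumption.

```python
import numpy as np

def reduce_glare(slide, head_mask, factor=0.0):
    """Darken slide pixels in the head region. factor=0.0 corresponds
    to superimposing a black image; intermediate values dim rather
    than black out the region."""
    out = slide.astype(np.float64).copy()
    out[head_mask] *= factor
    return out.astype(slide.dtype)
```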
  • the dynamic threshold value calculation means 86 may calculate a larger threshold value as the difference between the pixel values increases (for example, the threshold value is calculated using Equation 1 above).
  • the present invention is preferably applied to a projection type projection apparatus that reduces glare of light projected when an image is projected.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Projection Apparatus (AREA)

Abstract

In the present invention, a difference image generation means (82) generates a difference image from the difference values between the pixel value of each pixel in a post-correction captured image and the pixel value of each pixel in a corresponding projected image. A foreground extraction means (83) extracts a foreground region indicating a foreground portion from a binarized image, which is an image resulting from binarizing the difference image on the basis of a threshold. An anti-glare processing means (85) performs a glare-reducing process on each pixel of the projected image corresponding to a region containing a head. A dynamic threshold calculation means (86) extracts a region corresponding to the background region, which is the region other than the foreground region in the binarized image, from the projected image and from the post-correction captured image, and on the basis of the difference in pixel values of each corresponding pixel between the extracted images, calculates the threshold used in binarizing the difference image for each pixel.

Description

Projection type projection device, light anti-glare method, and light anti-glare program
The present invention relates to a projection type projection device, a light anti-glare method, and a light anti-glare program for reducing the glare of light projected when an image is projected.
When a presentation is performed using a projector that projects an image onto a screen, the presenter often stands in front of the projected image while explaining. In this case, since the light projected from the projector strikes the presenter's eyes directly, the presenter often finds it very dazzling. Various methods are therefore known for projecting an image so as to alleviate the glare felt by the presenter.
Patent Document 1 describes a display device that controls the brightness of the image light projected onto an area where a person is located. The display device described in Patent Document 1 captures the screen with a camera attached to the projector and, when a presenter is detected within the captured region, reduces the brightness of the image light projected onto the region where the presenter is located.
Patent Document 2 describes a projector that prevents various kinds of information included in a projected image from being displayed on the presenter's face. The projector described in Patent Document 2 detects a human face region from a captured image of the projected image and projects an image in which the area corresponding to the face region is replaced with a predetermined color.
Non-Patent Document 1 describes a projector system that adjusts the projector's light so that a speaker is not dazzled even when standing in front of the projector. The projector system described in Non-Patent Document 1 detects the speaker's face region from an image including the screen and the speaker, and displays a mask image superimposed on the detected region.
Non-Patent Document 2 describes a person detection method that improves the accuracy of detecting a person in a high-illuminance environment. In the method described in Non-Patent Document 2, brightness adjustment is performed between a projected image and a captured image of that projected image, and a person is detected by creating a difference image between the brightness-adjusted images.
Patent Document 3 describes an image processing apparatus that extracts an object region without being affected by the pixel values of the background region. The image processing apparatus described in Patent Document 3 obtains the difference value between input image data and background image data and detects obstacle candidate regions using a predetermined binarization threshold.
JP 2000-305481 A; JP 2010-34820 A; JP 2000-57323 A
In order to reduce the presenter's glare, it is effective to reduce the light projected onto the presenter. On the other hand, the projected slide is important material for the audience. Therefore, it is desirable that the range in which the projected light is reduced be confined as closely as possible to the area around the user's head, and hence that the presenter's head be recognized appropriately so that the light in that portion can be reduced.
The display device described in Patent Document 1 detects the presenter by focusing on the luminance change whereby the lower part of the screen is darkened by the presenter's body and legs in the image captured by the camera. However, since the display device described in Patent Document 1 does not actually detect a person, it suffers from many false detections.
In the system described in Non-Patent Document 1, because the slide is superimposed on the presenter's face, face detection does not operate correctly and the anti-glare function fails to work. Furthermore, when a face originally appears in the slide itself, that face is also detected and the anti-glare function is erroneously activated.
Patent Document 2 describes that the projector performs normalization, including distortion correction and brightness and luminance correction, on the comparison image. However, since Patent Document 2 does not describe a specific correction method, it is difficult to say that the projector described in Patent Document 2 can detect a person with high accuracy.
In Non-Patent Document 2, brightness adjustment is performed by matching the histogram average value to the pixel values of the input image. However, when a predetermined brightness correction value is used, as described in Non-Patent Document 2, brightness correction is also applied to the person to be extracted, so there are cases where the person cannot be extracted appropriately.
Accordingly, an object of the present invention is to provide a projection type projection device, a light anti-glare method, and a light anti-glare program that, when a person is present in the projection direction while an image is projected with high-illuminance light, can reduce the glare of the projected light by detecting that person with high accuracy.
A projection type projection apparatus according to the present invention includes: pixel value correction means that compares a projection image, which is an image projected from projection means, with a captured image, which is an image obtained by capturing the projection image or by capturing, together with the projection image, a foreground that is a moving body existing between the projection image and the projection means, and corrects the pixel values of pixels in the background, which is the portion of the captured image other than the foreground, so as to approach the pixel values of the corresponding pixels in the projection image; difference image generation means that generates a difference image from the difference values between the pixel value of each pixel in the corrected captured image and the pixel value of each pixel in the corresponding projection image; foreground extraction means that generates a binarized image, which is an image obtained by binarizing the difference image based on a threshold, and extracts a foreground region indicating the foreground portion from the binarized image; head detection means that detects the head of the moving body from the region corresponding to the foreground region in the captured image; anti-glare processing means that performs processing to reduce glare on each pixel of the projection image corresponding to the region including the head; and dynamic threshold calculation means that extracts the region corresponding to the background region, which is the region other than the foreground region in the binarized image, from the projection image and the corrected captured image, and calculates, for each pixel, the threshold used for binarization of the difference image based on the differences between the pixel values of the corresponding pixels between the extracted images, wherein the foreground extraction means generates a binarized image from the difference image based on the threshold calculated by the dynamic threshold calculation means and extracts the foreground region from that binarized image, and the pixel value correction means corrects the pixel values of pixels other than the foreground region extracted by the foreground extraction means.
A light anti-glare method according to the present invention includes: comparing a projection image, which is an image projected from projection means, with a captured image, which is an image obtained by capturing the projection image or by capturing, together with the projection image, a foreground that is a moving body existing between the projection image and the projection means, and correcting the pixel values of pixels in the background, which is the portion of the captured image other than the foreground, so as to approach the pixel values of the corresponding pixels in the projection image; generating a difference image from the difference values between the pixel value of each pixel in the corrected captured image and the pixel value of each pixel in the corresponding projection image; generating a binarized image, which is an image obtained by binarizing the difference image based on a threshold, and extracting a foreground region indicating the foreground portion from the binarized image; detecting the head of the moving body from the region corresponding to the foreground region in the captured image; performing processing to reduce glare on each pixel of the projection image corresponding to the region including the head; and extracting the region corresponding to the background region, which is the region other than the foreground region in the binarized image, from the projection image and the corrected captured image, and calculating, for each pixel, the threshold used for binarization of the difference image based on the differences between the pixel values of the corresponding pixels between the extracted images, wherein, when extracting the foreground region, a binarized image is generated from the difference image based on the calculated threshold and the foreground region is extracted from that binarized image, and the pixel values of pixels other than the extracted foreground region are corrected.
A light anti-glare program according to the present invention causes a computer to execute: pixel value correction processing that compares a projection image, which is an image projected from projection means, with a captured image, which is an image obtained by capturing the projection image or by capturing, together with the projection image, a foreground that is a moving body existing between the projection image and the projection means, and corrects the pixel values of pixels in the background, which is the portion of the captured image other than the foreground, so as to approach the pixel values of the corresponding pixels in the projection image; difference image generation processing that generates a difference image from the difference values between the pixel value of each pixel in the corrected captured image and the pixel value of each pixel in the corresponding projection image; foreground extraction processing that generates a binarized image, which is an image obtained by binarizing the difference image based on a threshold, and extracts a foreground region indicating the foreground portion from the binarized image; head detection processing that detects the head of the moving body from the region corresponding to the foreground region in the captured image; anti-glare processing that reduces glare on each pixel of the projection image corresponding to the region including the head; and dynamic threshold calculation processing that extracts the region corresponding to the background region, which is the region other than the foreground region in the binarized image, from the projection image and the corrected captured image, and calculates, for each pixel, the threshold used for binarization of the difference image based on the differences between the pixel values of the corresponding pixels between the extracted images, wherein the foreground extraction processing generates a binarized image from the difference image based on the threshold calculated by the dynamic threshold calculation processing and extracts the foreground region from that binarized image, and the pixel value correction processing corrects the pixel values of pixels other than the foreground region extracted by the foreground extraction processing.
According to the present invention, when an image is projected by casting high-illuminance light and a person is present in the projection direction of the image, the glare of the projected light can be reduced by detecting that person with high accuracy.
FIG. 1 is a block diagram showing a configuration example of a first embodiment of a projection type projection apparatus according to the present invention.
FIG. 2 is an explanatory diagram showing an example of the correspondence between the luminance of a camera image and the luminance of a slide image.
FIG. 3 is an explanatory diagram showing an example of processing for generating a mask image.
FIG. 4 is an explanatory diagram showing an example of anti-glare processing.
FIG. 5 is a flowchart showing an operation example of the projection type projection apparatus of the first embodiment.
FIG. 6 is a flowchart showing another operation example of the projection type projection apparatus of the first embodiment.
FIG. 7 is a block diagram showing a configuration example of a second embodiment of the projection type projection apparatus according to the present invention.
FIG. 8 is an explanatory diagram showing an example of the luminance difference between a luminance-corrected camera image and the slide image.
FIG. 9 is a block diagram showing an outline of the projection type projection apparatus according to the present invention.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
Embodiment 1.
FIG. 1 is a block diagram showing a configuration example of the first embodiment of the projection type projection apparatus according to the present invention. The projection type projection apparatus of the first embodiment includes a geometric correction unit 101, a difference image generation unit 102, a foreground extraction unit 103, a head detection unit 104, an anti-glare processing unit 105, a pixel value correction unit 121, a pixel value correction model calculation unit 122, a slide change determination unit 123, and a dynamic threshold calculation unit 124.
The geometric correction unit 101 receives an image captured by the camera (hereinafter referred to as a camera image) and geometrically corrects the camera image to match the shape of the slide image (hereinafter referred to as a slide image) that the projection type projection apparatus projects onto the screen.
Specifically, when the projector is projecting a rectangular slide image, the geometric correction unit 101 geometrically corrects the camera image into a rectangle matching the shape of the slide image. Hereinafter, a camera image geometrically corrected by the geometric correction unit 101 may be referred to as a geometrically corrected image. A widely known general method, such as one using a projective transformation matrix, is used for the geometric correction.
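As a minimal sketch of such a projective transformation, assuming the four corners of the projected slide have already been located in the camera image (the corner coordinates, slide size, and function names below are hypothetical, not taken from the patent), the 3x3 homography can be estimated from the four correspondences and used to map camera-image points into slide coordinates:

```python
import numpy as np

def homography_from_corners(src, dst):
    """Estimate the 3x3 projective transformation (homography) mapping the
    four source points onto the four destination points via the direct
    linear transform: two linear equations per correspondence, with the
    bottom-right matrix entry fixed to 1 (8 unknowns, 8 equations)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, x, y):
    """Apply the homography to one camera-image point (homogeneous divide)."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Corners of the projected slide as seen in the camera image (hypothetical),
# mapped to the corners of an upright 640x480 slide image.
src = [(102, 80), (538, 95), (548, 410), (95, 395)]
dst = [(0, 0), (640, 0), (640, 480), (0, 480)]
H = homography_from_corners(src, dst)
```

Warping every pixel of the camera image with this matrix (for example by inverse mapping with interpolation) would yield the geometrically corrected image described above.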
Here, the camera image is an image obtained by photographing the slide image projected on the screen. When a presenter stands in front of the projected slide image to give an explanation, the presenter is located between the slide image and the projector; in that case, the camera image means an image in which the presenter is photographed together with the slide image. In this embodiment, the term camera image means either an image in which the slide image alone is photographed or an image in which the slide image and the presenter are photographed together.
Since the presenter photographed in the camera image is an object that moves while the slide image is being projected, the presenter may be referred to as a moving object in the following description. From the viewpoint of the audience of the slides, the slide video may be referred to as the background, and the presenter in front of the slide video as the foreground.
The pixel value correction model calculation unit 122 generates a model that defines how the pixel values of the camera image are corrected so that they approach the pixel values of the corresponding slide image (hereinafter referred to as a pixel value correction model). The geometrically corrected image is used as the camera image here. Luminance values may also be used as the pixel values.
In general, when a slide image is photographed with a camera, the pixel values of corresponding pixels differ between the captured image (that is, the camera image) and the slide image. Correcting the pixel values of the camera image so that they approach the pixel values of the slide image therefore improves the accuracy with which the two images are compared.
The pixel value correction model calculation unit 122 generates the pixel value correction model based on the relationship between the pixel values of pixels in the slide image and the pixel values of the corresponding pixels in the camera image. In this embodiment, however, the pixel value correction model calculation unit 122 compares only the pixel values of the background pixels of the camera image with the pixel values of the corresponding pixels of the slide image.
By excluding the portion where the slide image and the camera image do not correspond (that is, the foreground portion) and comparing only the pixel values of the portion other than the foreground to be extracted (that is, the background portion), an appropriate pixel value correction model can be generated. The foreground region indicating the foreground portion of the camera image is specified by the foreground extraction unit 103 described later. Information indicating the foreground region may hereinafter be referred to as foreground information.
For example, the pixel value correction model calculation unit 122 may identify, by least squares estimation, the correspondence between the pixel values of pixels in the slide image and the pixel values of the corresponding background pixels of the camera image, and generate the pixel value correction model from that correspondence. Alternatively, the pixel value correction model calculation unit 122 may generate the pixel value correction model based on a transformation that matches the histograms of the two images.
FIG. 2 is an explanatory diagram showing an example of the correspondence between the luminance of the camera image and the luminance of the slide image. In the graph illustrated in FIG. 2, the horizontal axis indicates the luminance of the camera image and the vertical axis indicates the luminance of the slide image. The black diamonds in FIG. 2 indicate the observed correspondence between the two, and the white squares indicate the result of identifying that correspondence by least squares estimation. In FIG. 2, the correspondence is represented by a linear model.
The pixel value correction model calculation unit 122 may generate a single pixel value correction model for the entire image, or may generate one for each partial region of the image. When generating a pixel value correction model for each region, the pixel value correction model calculation unit 122 may additionally generate a separate model for the vicinity of the region boundaries. For example, the relationship between pixel value changes may differ between the center and the periphery of the projected image; in such a case, generating a pixel value correction model for each partial region improves the accuracy of the pixel value conversion.
The pixel value correction unit 121 compares the slide image with the camera image and corrects the pixel values of the background pixels of the camera image, that is, the pixels outside the foreground. Specifically, based on the pixel value correction model calculated by the pixel value correction model calculation unit 122, the pixel value correction unit 121 corrects the background pixel values of the camera image so that they approach the pixel values of the corresponding slide image. For example, when the pixel value correction model is linear, the pixel value correction unit 121 may correct each background pixel value by multiplying it by a constant and adding a constant. The pixel value correction model need not be linear; for example, an expression approximated by a polynomial or a piecewise-linear function may be used.
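A sketch of the least-squares linear variant of this model, fitted on background pixels and then applied as "multiply by a constant and add a constant"; the simulated camera response and the 8-bit value range below are assumptions for illustration:

```python
import numpy as np

def fit_pixel_value_model(cam_bg, slide_bg):
    """Fit the linear correction f(I) = a*I + b that maps background
    camera-image pixel values onto the corresponding slide-image pixel
    values by least squares (one possible model; a polynomial or
    piecewise-linear fit could be substituted)."""
    a, b = np.polyfit(cam_bg.ravel(), slide_bg.ravel(), deg=1)
    return a, b

def correct_pixel_values(cam, a, b):
    """Apply the model: multiply by a constant and add a constant,
    clipping back to the valid 8-bit range."""
    return np.clip(a * cam.astype(float) + b, 0, 255)

# Hypothetical example: the camera sees the slide darker and offset.
slide_bg = np.array([0, 64, 128, 192, 255], dtype=float)
cam_bg = 0.5 * slide_bg + 10           # simulated camera response
a, b = fit_pixel_value_model(cam_bg, slide_bg)
corrected = correct_pixel_values(cam_bg, a, b)
```

With the simulated response above, the fit recovers the inverse mapping (a = 2, b = -20), so the corrected background matches the slide image pixel values.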
Since the pixel value correction unit 121 corrects pixel values based on the pixel value correction model that the pixel value correction model calculation unit 122 calculated from the background portion, it can be said that the pixel value correction unit 121 corrects the pixel values of the portion outside the foreground region specified by the foreground extraction unit 103.
The difference image generation unit 102 generates a difference image from the difference values between the pixel value of each pixel in the camera image after pixel value correction (hereinafter referred to as a pixel value corrected image) and the pixel value of the corresponding pixel in the slide image.
The foreground extraction unit 103 generates an image obtained by binarizing the difference image generated by the difference image generation unit 102 based on a threshold determined for each pixel (hereinafter referred to as a binarized image). The foreground extraction unit 103 then extracts a foreground region indicating the foreground from the binarized image. These thresholds are determined dynamically by the dynamic threshold calculation unit 124 described later.
For example, the foreground extraction unit 103 identifies the pixels of the difference image whose pixel values exceed the threshold. When a block region of a predetermined unit size contains at least a certain proportion of such pixels, the foreground extraction unit 103 judges that block region to be part of the foreground, and identifies the set of block regions judged to be part of the foreground as the foreground region. The portion outside the foreground region corresponds to the background. The image indicated by the foreground region identified in this way (that is, the set of block regions) may be referred to as a mask image.
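The per-pixel binarization followed by the block-wise foreground judgment can be sketched as follows; the 8x8 block size and the 30% proportion are illustrative assumptions, since the text leaves the unit block size and the proportion unspecified:

```python
import numpy as np

def extract_foreground_mask(diff, thresh, block=8, ratio=0.3):
    """Binarize the difference image with the per-pixel thresholds, then
    mark each block x block region as foreground when the fraction of
    above-threshold pixels within it is at least `ratio`."""
    binary = diff > thresh                       # binarized image
    h, w = binary.shape
    mask = np.zeros_like(binary)                 # boolean mask image
    for by in range(0, h, block):
        for bx in range(0, w, block):
            cell = binary[by:by + block, bx:bx + block]
            if cell.mean() >= ratio:             # block judged foreground
                mask[by:by + block, bx:bx + block] = True
    return mask

# Hypothetical: a 16x16 difference image with a strong 8x8 foreground patch.
diff = np.zeros((16, 16))
diff[0:8, 0:8] = 50.0
thresh = np.full((16, 16), 20.0)                 # per-pixel thresholds
mask = extract_foreground_mask(diff, thresh)
```

The set of True blocks in `mask` corresponds to the mask image described above; everything outside it is treated as background.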
The foreground extraction unit 103 notifies the dynamic threshold calculation unit 124 and the pixel value correction model calculation unit 122 of information indicating the extracted foreground region.
FIG. 3 is an explanatory diagram showing an example of processing for generating a mask image. When the camera image illustrated in FIG. 3(a) is input, the difference image generation unit 102 generates the difference image illustrated in FIG. 3(b). The foreground extraction unit 103 generates the binarized image illustrated in FIG. 3(c) by comparison with the threshold determined for each pixel. The foreground extraction unit 103 then judges, for each block region of the predetermined unit size, whether it is part of the foreground, and generates the mask image illustrated in FIG. 3(d).
The head detection unit 104 detects the head of the moving object from the region of the camera image corresponding to the foreground extracted by the foreground extraction unit 103. For example, the head detection unit 104 may detect the head by pattern recognition using templates or feature quantities describing head patterns viewed from various directions over 360 degrees. Since methods for detecting a head from an image containing a person are widely known, a detailed description is omitted here.
The anti-glare processing unit 105 performs processing that reduces glare for each pixel of the slide image corresponding to the region including the head detected from the camera image by the head detection unit 104. Specifically, the anti-glare processing unit 105 reduces the luminance of each pixel of the slide image corresponding to the region including the head.
For example, the anti-glare processing unit 105 may reduce the luminance values of the pixels in the head region uniformly (for example, to 0), or may reduce the luminance by overlaying the head region with a pattern figure, such as an ellipse or circle, whose luminance gradually increases with distance from the center. The region whose luminance is reduced need not coincide with the head region; it may, for example, be a rectangular region near the center of the head region. When the head detection unit 104 also detects the orientation of the head, the anti-glare processing unit 105 may reduce the luminance of the region in front of the head where the eyes are likely to be.
FIG. 4 is an explanatory diagram showing an example of the anti-glare processing. When the camera image illustrated in FIG. 4(a) is input, the mask image illustrated in FIG. 4(b) is generated. The head detection unit 104 detects the head from the part of the camera image corresponding to the foreground region; FIG. 4(c) shows the detected head with a black frame. The anti-glare processing unit 105 reduces the glare by, for example, superimposing a black rectangular image such as that illustrated in FIG. 4(d) on the slide image.
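A minimal sketch of the uniform-dimming variant (the black rectangle of FIG. 4(d)); the slide size, box coordinates, and function name are hypothetical:

```python
import numpy as np

def antiglare(slide, box, dim=0.0):
    """Reduce the luminance of the slide pixels inside the head box
    uniformly; dim=0.0 sets them to black, like the black rectangle of
    FIG. 4(d). A radial-falloff pattern could be used instead."""
    out = slide.astype(float).copy()
    x0, y0, x1, y1 = box
    out[y0:y1, x0:x1] *= dim      # scale luminance inside the box only
    return out

slide = np.full((480, 640), 200.0)          # hypothetical bright slide
dimmed = antiglare(slide, (300, 100, 340, 150))
```

Projecting `dimmed` instead of `slide` keeps the rest of the slide unchanged while blacking out the pixels that would shine into the presenter's eyes.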
The dynamic threshold calculation unit 124 calculates, for each pixel, the threshold that the foreground extraction unit 103 uses for binarizing the difference image. First, the dynamic threshold calculation unit 124 extracts the regions corresponding to the background from the slide image and from the pixel value corrected image. The dynamic threshold calculation unit 124 then calculates a threshold for each pixel based on the differences between the pixel values of the corresponding pixels of the extracted images: the larger the pixel value difference, the larger the calculated threshold.
Let I_slide(x, y) be the pixel value at coordinates (x, y) of the slide image, I_cam(x, y) be the pixel value of the camera image, and f(I) (where I is a luminance value) be the correction function derived from the luminance correction model (hereinafter referred to as the luminance correction function). The threshold Th(x, y) for each pixel is then calculated by Equation 1 below.
Th(x, y) = α |f(I_cam(x, y)) − I_slide(x, y)|  (Equation 1)
Here, α (α > 1) is a multiplier; for example, α is set to a value such as 1.5. From this equation, a portion with a large pixel value difference, such as a character region, gets a large threshold, and a portion with a small pixel value difference, such as a flat region, gets a small threshold. The foreground extraction unit 103 generates the binarized image from the difference image based on the thresholds calculated in this way by the dynamic threshold calculation unit 124, which makes the foreground easier to distinguish.
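Equation 1 translates directly into a per-pixel computation; the luminance correction function used below is a hypothetical linear one, chosen only to exercise the formula:

```python
import numpy as np

def dynamic_threshold(cam, slide, f, alpha=1.5):
    """Per-pixel threshold of Equation 1:
    Th(x, y) = alpha * |f(I_cam(x, y)) - I_slide(x, y)|.
    Large where the corrected camera image and slide disagree (e.g. text
    regions), small where they agree (e.g. flat regions)."""
    return alpha * np.abs(f(cam) - slide)

f = lambda i: 2.0 * i - 20.0      # hypothetical luminance correction function
cam = np.array([[60.0, 15.0]])
slide = np.array([[80.0, 10.0]])
th = dynamic_threshold(cam, slide, f)   # [[30.0, 0.0]]
```

The first pixel (a residual mismatch of 20 after correction) gets threshold 30, while the perfectly corrected second pixel gets threshold 0, matching the behavior described above.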
The slide change determination unit 123 determines whether the slide image has changed. For example, the slide change determination unit 123 may make this determination based on the difference between frames (for example, whether the difference is equal to or greater than a predetermined value). Alternatively, the slide change determination unit 123 may determine whether the slide image has changed using a shot segmentation method that detects temporal division points in television video.
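A sketch of the frame-difference variant of this determination; the mean-absolute-difference statistic and the threshold value are assumptions, since the text only requires comparing an inter-frame difference with a predetermined value:

```python
import numpy as np

def slide_changed(prev_slide, cur_slide, thresh=10.0):
    """Judge a slide change from the mean absolute inter-frame difference
    (illustrative; a shot-boundary detector could be substituted)."""
    diff = np.abs(cur_slide.astype(float) - prev_slide.astype(float))
    return diff.mean() >= thresh

a = np.zeros((4, 4))
b = np.full((4, 4), 255.0)
changed = slide_changed(a, b)   # True: the frames differ everywhere
```

When this returns True, the pixel value correction model is recalculated, as described next.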
When the slide image changes, the pixel value correction model changes as well. Therefore, when the slide change determination unit 123 determines that the slide image has changed, the pixel value correction model calculation unit 122 recalculates the pixel value correction model. When determining whether the image has changed, it is preferable to use the slide image, whose luminance is stable, rather than the camera image.
The geometric correction unit 101, the difference image generation unit 102, the foreground extraction unit 103, the head detection unit 104, the anti-glare processing unit 105, the pixel value correction unit 121, the pixel value correction model calculation unit 122, the slide change determination unit 123, and the dynamic threshold calculation unit 124 are realized by the CPU of a computer operating according to a program (a person detection program).
For example, the program may be stored in a storage unit (not shown) of the projection type projection apparatus, and the CPU may read the program and operate, according to the program, as the geometric correction unit 101, the difference image generation unit 102, the foreground extraction unit 103, the head detection unit 104, the anti-glare processing unit 105, the pixel value correction unit 121, the pixel value correction model calculation unit 122, the slide change determination unit 123, and the dynamic threshold calculation unit 124.
Alternatively, each of the geometric correction unit 101, the difference image generation unit 102, the foreground extraction unit 103, the head detection unit 104, the anti-glare processing unit 105, the pixel value correction unit 121, the pixel value correction model calculation unit 122, the slide change determination unit 123, and the dynamic threshold calculation unit 124 may be realized by dedicated hardware.
Next, the operation of the projection type projection apparatus of this embodiment will be described. FIG. 5 is a flowchart showing an operation example of the projection type projection apparatus of this embodiment. In this embodiment, it is assumed that no moving object appears in the camera image in the initial state. It is also assumed that, in the initial state, a model expected for the situation is set as the initial pixel value correction model, and values expected for the situation are set as the initial thresholds.
The geometric correction unit 101 geometrically corrects the camera image (step S1). The pixel value correction unit 121 corrects the pixel values of the geometrically corrected image based on the pixel value correction model (step S2). In the initial state, since no moving object appears in the camera image, the pixel value correction unit 121 corrects the pixel values of the entire geometrically corrected image.
The difference image generation unit 102 generates a difference image from the difference values between the pixel value of each pixel in the pixel value corrected image and the pixel value of the corresponding pixel in the slide image (step S3). The foreground extraction unit 103 generates a binarized image from the difference image and extracts the foreground region (step S4). In the initial state, since no moving object appears in the camera image, no foreground region is extracted from the binarized image.
Meanwhile, the dynamic threshold calculation unit 124 calculates a threshold for each pixel based on the differences between the pixel values of the corresponding pixels of the slide image and the pixel value corrected image (step S5). In the initial state, since no moving object appears in the camera image, the dynamic threshold calculation unit 124 calculates the per-pixel thresholds over the entire image.
The pixel value correction model calculation unit 122 generates the pixel value correction model based on the relationship between the pixel values of the corresponding pixels of the slide image and the camera image (step S6). In the initial state, since no moving object appears in the camera image, the pixel value correction model calculation unit 122 generates the pixel value correction model from the pixel values of the entire image.
Next, the processing when a moving object is present in front of the slide image will be described. FIG. 6 is a flowchart showing another operation example of the projection type projection apparatus of this embodiment. The processing from the geometric correction of the camera image by the geometric correction unit 101 through the generation of the difference image from the pixel value corrected image and the slide image is the same as steps S1 to S3 in FIG. 5.
The foreground extraction unit 103 generates a binarized image from the difference image and extracts the foreground region (step S4). Here, since the moving object appears in the camera image, a foreground region is extracted from the binarized image.
The head detection unit 104 detects the head of the moving object from the region of the camera image corresponding to the foreground region extracted by the foreground extraction unit 103 (step S11). The anti-glare processing unit 105 performs processing that reduces glare for each pixel of the slide image corresponding to the region including the head detected from the camera image by the head detection unit 104 (step S12).
Meanwhile, the dynamic threshold calculation unit 124 compares the pixels of the slide image and the pixel value corrected image outside the foreground region, and calculates a threshold for each pixel based on the differences between the pixel values of the corresponding pixels (step S13).
The pixel value correction model calculation unit 122 compares the pixels of the slide image and the camera image outside the foreground region, and generates the pixel value correction model based on the relationship between the pixel values of the corresponding pixels (step S14). Thereafter, the pixel value correction unit 121 corrects the pixel values of the geometrically corrected image based on the pixel value correction model generated by the pixel value correction model calculation unit 122 in step S14.
As described above, according to this embodiment, the pixel value correction unit 121 compares the slide image with the camera image and corrects the pixel values of the background portion of the camera image so that they approach the pixel values of the corresponding pixels of the slide image. The difference image generation unit 102 generates a difference image from the difference values between the pixel value of each pixel in the corrected camera image and the pixel value of the corresponding pixel in the slide image.
The foreground extraction unit 103 generates a binarized image from the difference image based on the thresholds and extracts the foreground region from the binarized image. The head detection unit 104 detects the presenter's head from the region of the camera image corresponding to the foreground region. The anti-glare processing unit 105 performs processing that reduces glare for each pixel of the slide image corresponding to the region including the head.
The dynamic threshold calculation unit 106 extracts the regions corresponding to the background region of the binarized image from the slide image and from the corrected camera image. The dynamic threshold calculation unit 106 then calculates, for each pixel, the threshold used for binarizing the difference image, based on the differences between the pixel values of the corresponding pixels of the extracted images.
At this time, the foreground extraction unit 103 generates the binarized image from the difference image based on the thresholds calculated by the dynamic threshold calculation unit 106 and extracts the foreground region from that binarized image. The pixel value correction unit 121 corrects the pixel values of the portion outside the foreground region extracted by the foreground extraction unit 103.
 With the above configuration, when an image is projected with high-illuminance light and a person is present in the projection direction, the person can be detected with high accuracy and the glare of the projected light can thereby be reduced.
Embodiment 2.
 FIG. 7 is a block diagram showing a configuration example of a second embodiment of the projection type projection apparatus according to the present invention. Components that are the same as in the first embodiment are given the same reference signs as in FIG. 1, and their description is omitted. The projection type projection apparatus of the second embodiment includes a geometric correction unit 101, a difference image generation unit 102, a foreground extraction unit 103, a head detection unit 104, an anti-glare processing unit 105, a pixel value correction unit 121, a pixel value correction model calculation unit 122, a slide change determination unit 123, a dynamic threshold calculation unit 124, and a foreground determination unit 125.
 That is, the projection type projection apparatus of the second embodiment differs from that of the first embodiment in that it further includes the foreground determination unit 125.
 The foreground determination unit 125 determines the validity of the foreground region extracted by the foreground extraction unit 103 and, when it determines the region to be valid, notifies the dynamic threshold calculation unit 124 and the pixel value correction model calculation unit 122 of information indicating the extracted foreground region.
 When a moving body appears in the camera image, it generally occupies part of the frame as a connected mass. Therefore, when the foreground region is distributed over the entire frame, it is possible that a change in the illuminance of the slide, rather than an actual foreground, has been detected. Moreover, when the illuminance of the slide changes, the luminance differences between camera images compared in time series are generally small.
 In addition, when the dynamic threshold calculation unit 124 calculates thresholds from a camera image in which only the background region is captured, the thresholds tend to be kept low, so the entire frame may be judged to be a foreground region when the illuminance of the slide changes.
 Therefore, the foreground determination unit 125 determines the validity of the foreground region extracted by the foreground extraction unit 103 based on the time-series pixel value changes of the camera image and the distribution of the foreground region over the entire camera image.
 When the foreground determination unit 125 determines that the foreground region was extracted because of a change in illuminance, the pixel value correction model calculation unit 122 regenerates the luminance correction model and the dynamic threshold calculation unit 124 recalculates the dynamic thresholds. After the foreground extraction unit 103 regenerates the mask image, the foreground determination unit 125 performs the above determination again.
 In other words, the dynamic threshold calculation unit 124 extracts the regions corresponding to the background region from the slide image and the corrected camera image using the foreground region that the foreground determination unit 125 has judged to be valid, and the pixel value correction model calculation unit 122 likewise generates the pixel value correction model.
 By performing the determination process on the assumption that the difference between the slide image and the luminance-corrected camera image can become large, a region whose luminance changed merely because of a change in projector illuminance can be prevented from being misjudged as foreground.
 A method by which the foreground determination unit 125 judges the foreground region will now be described in detail. In this embodiment, the foreground determination unit 125 first dilates the mask image and performs the determination on the dilated image. For example, it treats the region obtained by dilating the mask image by a fixed number of pixels as the foreground region. To dilate the mask by N pixels in the vertical and horizontal directions, the foreground determination unit 125 can apply a maximum filter with a window of (N+1)×(N+1) pixels.
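The dilation step can be sketched with a maximum filter over a binary mask; this is an illustrative pure-Python sketch (the helper name is ours), with the window size following the (N+1)×(N+1) figure given in the text:

```python
def dilate_mask(mask, n):
    """Dilate a binary mask with a maximum filter whose window is
    (n+1) x (n+1) pixels, as described in the text (an even n gives a
    centered window of radius n/2)."""
    h, w = len(mask), len(mask[0])
    r = (n + 1) // 2  # half-window
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Maximum over the window, clipped to the image bounds.
            out[y][x] = max(
                mask[yy][xx]
                for yy in range(max(0, y - r), min(h, y + r + 1))
                for xx in range(max(0, x - r), min(w, x + r + 1))
            )
    return out

mask = [
    [0, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
print(dilate_mask(mask, 2))  # 3x3 max filter spreads the single pixel
# [[1, 1, 1, 0], [1, 1, 1, 0], [1, 1, 1, 0]]
```

Dilating the mask before the validity check gives the foreground candidate a safety margin around the extracted region.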
 Even when no foreground is present, the luminance difference between the luminance-corrected camera image and the slide image may become large because of, for example, a change in projector illuminance. The foreground determination unit 125 therefore detects this state separately.
 FIG. 8 is an explanatory diagram showing an example of the luminance difference between the luminance-corrected camera image and the slide image. As shown in FIG. 8(a), when the difference between the slide image and the camera image in frame N is small, few pixels exceed the threshold, as shown in FIG. 8(b). Now suppose, as shown in FIG. 8(c), that the content of the slide image does not change but the illuminance of the projector changes between frame N and frame N+1. In this case, the luminance of the camera image changes as a whole, and as a result a large number of pixels exceeding the threshold are extracted, as shown in FIG. 8(d).
 An example of the determination process performed on the assumption that the difference between the slide image and the luminance-corrected camera image becomes large will now be described. This process exploits two characteristics: the luminance change between frames is not very large, and the change appears across the entire frame. Specifically, the foreground determination unit 125 compares the binarized images of two time-series frames and determines whether the number of pixels exceeding the threshold (that is, pixels judged to belong to the foreground) has increased across the whole frame.
 First, the foreground determination unit 125 compares the binarized images of the two frames. For each pixel exceeding the threshold, it computes the difference between the pixel value in the current image and the luminance value of the same pixel in the image one frame earlier. When this difference falls within a fixed threshold, the foreground determination unit 125 examines how widely the positions of such pixels are scattered. The degree of scatter can be judged by computing the variance of the pixel coordinates. When the variance is equal to or greater than a fixed value, the pixels are considered to be spread across the entire frame, so the foreground determination unit 125 determines that a luminance change caused by an illuminance change has occurred.
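The scatter test above can be sketched as follows; the small-difference limit and the variance limit are free parameters not specified in this excerpt, and all names are illustrative:

```python
def looks_like_illuminance_change(changed_pixels, frame_diffs,
                                  diff_limit=15, var_limit=4.0):
    """changed_pixels: (x, y) coordinates of pixels that exceeded the
    binarization threshold; frame_diffs: for each such pixel, the
    luminance difference against the previous frame.  Returns True when
    the change is small per pixel but spread over the whole frame."""
    if not changed_pixels:
        return False
    # All per-pixel changes must stay small (illuminance drift, not a person).
    if any(abs(d) > diff_limit for d in frame_diffs):
        return False
    # Variance of the coordinates measures how widely the pixels scatter.
    n = len(changed_pixels)
    mx = sum(x for x, _ in changed_pixels) / n
    my = sum(y for _, y in changed_pixels) / n
    var = sum((x - mx) ** 2 + (y - my) ** 2 for x, y in changed_pixels) / n
    return var >= var_limit

# Pixels scattered over a 100x100 frame with small frame-to-frame changes:
spread = [(0, 0), (99, 0), (0, 99), (99, 99), (50, 50)]
print(looks_like_illuminance_change(spread, [5, 6, 4, 7, 5]))  # True
# A compact blob with large changes looks like a real foreground object:
blob = [(10, 10), (10, 11), (11, 10), (11, 11)]
print(looks_like_illuminance_change(blob, [80, 85, 90, 82]))   # False
```

The same skeleton covers the alternative test described next, with `frame_diffs` replaced by each pixel's margin above the binarization threshold.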
 The foreground determination unit 125 may instead perform the determination using the following method, which exploits two characteristics: the absolute value of the luminance change between the compared images is not very large, and the change appears across the entire frame.
 Specifically, the foreground determination unit 125 examines, for each pixel whose difference exceeded the binarization threshold, by how much the difference exceeds that threshold. When this margin falls within a fixed threshold, the foreground determination unit 125 examines the scatter of the pixel positions. The degree of scatter can be judged by computing the variance of the pixel coordinates. When the variance is equal to or greater than a fixed value, the pixels are considered to be spread across the entire frame, so the foreground determination unit 125 determines that a luminance change caused by an illuminance change has occurred.
 As described above, in this embodiment the foreground determination unit 125 determines the validity of the foreground region. In addition to the effects of the first embodiment, this improves the accuracy with which the foreground region is determined.
 Next, an outline of the present invention will be described. FIG. 9 is a block diagram showing an outline of a projection type projection apparatus according to the present invention. The projection type projection apparatus according to the present invention includes: a pixel value correction unit 81 (for example, pixel value correction unit 121) that compares a projection image (for example, a slide image), which is an image projected from a projection unit (for example, a projector), with a captured image (for example, a camera image), which is an image capturing the projection image or an image capturing, together with the projection image, a foreground (for example, a presenter or other person) that is a moving body present between the projection image and the projection unit, and corrects the pixel values of pixels in the background, i.e., the portion of the captured image other than the foreground, so that they approach the pixel values of the corresponding pixels in the projection image; a difference image generation unit 82 (for example, difference image generation unit 102) that generates a difference image from the difference between the pixel value of each pixel in the corrected captured image and the pixel value of the corresponding pixel in the projection image; a foreground extraction unit 83 (for example, foreground extraction unit 103) that generates a binarized image, which is an image obtained by binarizing the difference image based on a threshold, and extracts a foreground region (for example, a mask image) indicating the foreground portion from the binarized image; a head detection unit 84 (for example, head detection unit 104) that detects the head of the moving body from the region of the captured image corresponding to the foreground region; an anti-glare processing unit 85 (for example, anti-glare processing unit 105) that applies glare-reduction processing to each pixel of the projection image corresponding to the region containing the head; and a dynamic threshold calculation unit 86 (for example, the dynamic threshold calculation unit) that extracts the regions corresponding to the background region, i.e., the region of the binarized image other than the foreground region, from the projection image and from the corrected captured image, and calculates, for each pixel, the threshold used to binarize the difference image based on the differences between the pixel values of corresponding pixels in the extracted images.
 The foreground extraction unit 83 generates a binarized image from the difference image based on the threshold calculated by the dynamic threshold calculation unit 86 and extracts the foreground region from that binarized image. The pixel value correction unit 81 corrects the pixel values of the pixels outside the foreground region extracted by the foreground extraction unit 83.
 With such a configuration, when an image is projected with high-illuminance light and a person is present in the projection direction, the person can be detected with high accuracy and the glare of the projected light can thereby be reduced.
 The projection type projection apparatus may further include pixel value correction model calculation means (for example, pixel value correction model calculation unit 122) that calculates a pixel value correction model, i.e., a model defining how to correct the pixel values of the captured image so that they approach the pixel values of the corresponding projection image, based on the relationship between the pixel values of the pixels of the projection image and the pixel values of the background pixels of the corresponding captured image. In this case, the pixel value correction model calculation means may specify the background from the foreground region extracted by the foreground extraction unit 83, and the pixel value correction unit 81 may correct the pixel values of the captured image based on the pixel value correction model.
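The excerpt does not give the pixel value correction model in closed form; one plausible instance is a linear map fitted by least squares over background pixel samples, as sketched below (the linear form, the names, and the parameter handling are all assumptions for illustration):

```python
def fit_correction_model(camera_bg, slide_bg):
    """Fit camera ~= a * slide + b over background pixel samples by
    least squares, then return a function mapping a camera value back
    toward the slide scale: slide_est = (camera - b) / a."""
    n = len(camera_bg)
    ms = sum(slide_bg) / n
    mc = sum(camera_bg) / n
    cov = sum((s - ms) * (c - mc) for s, c in zip(slide_bg, camera_bg))
    var = sum((s - ms) ** 2 for s in slide_bg)
    a = cov / var
    b = mc - a * ms
    return lambda c: (c - b) / a

# Background samples: the camera sees the slide at half brightness plus 10.
slide_bg = [0, 100, 200]
camera_bg = [10, 60, 110]
correct = fit_correction_model(camera_bg, slide_bg)
print(round(correct(60)))   # 100 -> corrected back toward the slide value
print(round(correct(110)))  # 200
```

Fitting over background pixels only, as the text specifies, keeps any occluding foreground from biasing the model.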
 The projection type projection apparatus may further include change determination means (for example, slide change determination unit 123) that determines whether the projection image has changed, and the pixel value correction model calculation means may generate the pixel value correction model when it is determined that the projection image has changed. In this case, the pixel value correction model calculation means can update the pixel value correction model at an appropriate timing.
 The projection type projection apparatus may further include foreground determination means (for example, foreground determination unit 125) that determines the validity of the foreground region extracted by the foreground extraction unit 83 based on the time-series pixel value changes of the captured image and the distribution of the foreground region over the entire captured image. The dynamic threshold calculation unit 86 may then extract the regions corresponding to the background region from the projection image and the corrected captured image using the foreground region judged to be valid by the foreground determination means, and the pixel value correction model calculation means may calculate the pixel value correction model by specifying the background of the captured image using that foreground region. In this case, the accuracy of determining the foreground region can be improved.
 The projection type projection apparatus may further include geometric correction means (for example, geometric correction unit 101) that geometrically corrects the captured image to match the shape (for example, a rectangle) of the projection image. The pixel value correction means may then correct the pixel values of the captured image by comparing the projection image with the geometrically corrected captured image.
 The anti-glare processing unit 85 may perform processing that reduces the luminance of each pixel of the projection image corresponding to the region containing the head (for example, processing that superimposes a black image).
 The dynamic threshold calculation unit 86 may calculate a larger threshold as the difference between the pixel values becomes larger (for example, by calculating the threshold using Equation 1 above).
 Although the present invention has been described above with reference to embodiments and examples, the present invention is not limited to them. Various changes that can be understood by those skilled in the art can be made to the configuration and details of the present invention within the scope of the present invention.
 This application claims priority based on Japanese Patent Application No. 2011-275814, filed on December 16, 2011, the entire disclosure of which is incorporated herein.
 The present invention is suitably applied to a projection type projection apparatus that reduces the glare of the light projected when projecting an image.
DESCRIPTION OF REFERENCE SIGNS
 101 Geometric correction unit
 102 Difference image generation unit
 103 Foreground extraction unit
 104 Head detection unit
 105 Anti-glare processing unit
 121 Pixel value correction unit
 122 Pixel value correction model calculation unit
 123 Slide change determination unit
 124 Dynamic threshold calculation unit
 125 Foreground determination unit

Claims (11)

  1.  A projection type projection apparatus comprising:
     pixel value correction means for comparing a projection image, which is an image projected from projection means, with a captured image, which is an image capturing the projection image or an image capturing, together with the projection image, a foreground that is a moving body present between the projection image and the projection means, and correcting pixel values of pixels in a background, which is a portion of the captured image other than the foreground, so that they approach pixel values of corresponding pixels in the projection image;
     difference image generation means for generating a difference image from difference values between the pixel value of each pixel in the corrected captured image and the pixel value of the corresponding pixel in the projection image;
     foreground extraction means for generating a binarized image, which is an image obtained by binarizing the difference image based on a threshold, and extracting a foreground region indicating the foreground portion from the binarized image;
     head detection means for detecting a head of the moving body from a region of the captured image corresponding to the foreground region;
     anti-glare processing means for performing processing that reduces glare on each pixel of the projection image corresponding to a region containing the head; and
     dynamic threshold calculation means for extracting regions corresponding to a background region, which is a region of the binarized image other than the foreground region, from the projection image and from the corrected captured image, and calculating, for each pixel, the threshold used for binarizing the difference image based on differences between pixel values of corresponding pixels in the extracted images,
     wherein the foreground extraction means generates the binarized image from the difference image based on the threshold calculated by the dynamic threshold calculation means and extracts the foreground region from the binarized image, and
     the pixel value correction means corrects the pixel values of the pixels outside the foreground region extracted by the foreground extraction means.
  2.  The projection type projection apparatus according to claim 1, further comprising pixel value correction model calculation means for calculating a pixel value correction model, which is a model defining how to correct the pixel values of the captured image so that they approach the pixel values of the corresponding projection image, based on a relationship between the pixel values of pixels of the projection image and the pixel values of background pixels of the corresponding captured image,
     wherein the pixel value correction model calculation means specifies the background from the foreground region extracted by the foreground extraction means, and
     the pixel value correction means corrects the pixel values of the captured image based on the pixel value correction model.
  3.  The projection type projection apparatus according to claim 2, further comprising change determination means for determining whether the projection image has changed,
     wherein the pixel value correction model calculation means generates the pixel value correction model when it is determined that the projection image has changed.
  4.  The projection type projection apparatus according to claim 2 or 3, further comprising foreground determination means for determining validity of the foreground region extracted by the foreground extraction means based on time-series pixel value changes of the captured image and a distribution of the foreground region over the entire captured image,
     wherein the dynamic threshold calculation means extracts the regions corresponding to the background region from the projection image and the corrected captured image using the foreground region judged to be valid by the foreground determination means, and
     the pixel value correction model calculation means calculates the pixel value correction model by specifying the background of the captured image using the foreground region judged to be valid by the foreground determination means.
  5.  The projection type projection apparatus according to any one of claims 1 to 4, further comprising geometric correction means for geometrically correcting the captured image to match a shape of the projection image,
     wherein the pixel value correction means corrects the pixel values of the captured image by comparing the projection image with the geometrically corrected captured image.
  6.  The projection type projection apparatus according to any one of claims 1 to 5, wherein the anti-glare processing means performs processing that reduces luminance of each pixel of the projection image corresponding to the region containing the head.
  7.  The projection type projection apparatus according to any one of claims 1 to 6, wherein the dynamic threshold calculation means calculates a larger threshold as the difference between the pixel values becomes larger.
  8.  An optical anti-glare method comprising:
     comparing a projection image, which is an image projected from projection means, with a captured image, which is an image capturing the projection image or an image capturing, together with the projection image, a foreground that is a moving body present between the projection image and the projection means, and correcting pixel values of pixels in a background, which is a portion of the captured image other than the foreground, so that they approach pixel values of corresponding pixels in the projection image;
     generating a difference image from difference values between the pixel value of each pixel in the corrected captured image and the pixel value of the corresponding pixel in the projection image;
     generating a binarized image, which is an image obtained by binarizing the difference image based on a threshold, and extracting a foreground region indicating the foreground portion from the binarized image;
     detecting a head of the moving body from a region of the captured image corresponding to the foreground region;
     performing processing that reduces glare on each pixel of the projection image corresponding to a region containing the head; and
     extracting regions corresponding to a background region, which is a region of the binarized image other than the foreground region, from the projection image and from the corrected captured image, and calculating, for each pixel, the threshold used for binarizing the difference image based on differences between pixel values of corresponding pixels in the extracted images,
     wherein, when the foreground region is extracted, the binarized image is generated from the difference image based on the calculated threshold and the foreground region is extracted from the binarized image, and
     the pixel values of the pixels outside the extracted foreground region are corrected.
  9.  The optical anti-glare method according to claim 8, further comprising calculating a pixel value correction model, which is a model defining how to correct the pixel values of the captured image so that they approach the pixel values of the corresponding projection image, based on a relationship between the pixel values of pixels of the projection image and the pixel values of background pixels of the corresponding captured image,
     wherein, when the pixel value correction model is calculated, the background is specified based on the foreground region extracted from the binarized image, and
     the pixel values of the captured image are corrected based on the pixel value correction model.
  10.  An optical anti-glare program for causing a computer to execute:
     a pixel value correction process of comparing a projection image, which is an image projected by a projection means, with a captured image, which is an image capturing the projection image or an image capturing, together with the projection image, a foreground that is a moving object present between the projection image and the projection means, and correcting the pixel values of pixels in the background, which is the portion of the captured image other than the foreground, so that they approach the pixel values of the corresponding pixels in the projection image;
     a difference image generation process of generating a difference image from the difference between the pixel value of each pixel in the corrected captured image and the pixel value of the corresponding pixel in the projection image;
     a foreground extraction process of generating a binarized image by binarizing the difference image based on a threshold, and extracting from the binarized image a foreground region indicating the foreground portion;
     a head detection process of detecting the head of the moving object in the region of the captured image corresponding to the foreground region;
     an anti-glare process of reducing glare for each pixel of the projection image corresponding to the region including the head; and
     a dynamic threshold calculation process of extracting, from the projection image and the corrected captured image, the regions corresponding to the background region, which is the region of the binarized image other than the foreground region, and calculating, for each pixel, the threshold used for binarizing the difference image based on the difference between the pixel values of corresponding pixels in the extracted images;
     wherein the foreground extraction process generates the binarized image from the difference image based on the thresholds calculated by the dynamic threshold calculation process, and extracts the foreground region from the binarized image; and
     wherein the pixel value correction process corrects the pixel values of pixels outside the foreground region extracted by the foreground extraction process.
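The chain of processes recited in claim 10 can be illustrated with a minimal NumPy sketch. Assumptions (not taken from the patent): the difference image is the per-pixel sum of absolute channel differences, the dynamic threshold is a scaled background residual plus a margin, head detection is omitted, and the whole foreground region is dimmed; all function names and parameter values are illustrative:

```python
import numpy as np

def difference_image(corrected, projected):
    """Per-pixel sum of absolute channel differences between the
    corrected capture and the projection image."""
    return np.abs(corrected.astype(np.int32)
                  - projected.astype(np.int32)).sum(axis=2)

def dynamic_threshold(diff, background_mask, margin=10.0, scale=3.0):
    """Per-pixel threshold derived from the residual differences observed
    in the background region, where capture and projection should agree."""
    noise = np.full(diff.shape, np.median(diff[background_mask]),
                    dtype=np.float64)
    noise[background_mask] = diff[background_mask]
    return scale * noise + margin

def extract_foreground(diff, threshold):
    """Binarize the difference image: True where an occluder is present."""
    return diff > threshold

def antiglare(projected, foreground_mask, dim=0.1):
    """Dim the projected pixels falling on the detected region
    (in the patent, the region including the detected head)."""
    out = projected.astype(np.float64)
    out[foreground_mask] *= dim
    return out.astype(np.uint8)
```

A per-pixel threshold lets noisy or poorly corrected areas of the screen demand a larger difference before being classified as foreground, while clean areas stay sensitive, which is the motivation for the claimed dynamic threshold calculation process.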
  11.  The optical anti-glare program according to claim 10, causing the computer to execute:
     a pixel value correction model calculation process of calculating a pixel value correction model, which is a model defining how to correct a pixel value of the captured image so that it approaches the pixel value of the corresponding projection image, based on a relationship between the pixel values of pixels of the projection image and the pixel values of background pixels among the corresponding pixels of the captured image;
     wherein the pixel value correction model calculation process specifies the background from the foreground region extracted by the foreground extraction process; and
     wherein the pixel value correction process corrects the pixel values of the captured image based on the pixel value correction model.
PCT/JP2012/007623 2011-12-16 2012-11-28 Projecting projector device, optical anti-glare method, and optical anti-glare program WO2013088657A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011275814 2011-12-16
JP2011-275814 2011-12-16

Publications (1)

Publication Number Publication Date
WO2013088657A1 true WO2013088657A1 (en) 2013-06-20

Family

ID=48612135

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/007623 WO2013088657A1 (en) 2011-12-16 2012-11-28 Projecting projector device, optical anti-glare method, and optical anti-glare program

Country Status (1)

Country Link
WO (1) WO2013088657A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0281590A (en) * 1988-09-19 1990-03-22 Toshiba Corp Moving picture encoding system
JP2005004799A (en) * 1998-01-07 2005-01-06 Toshiba Corp Object extraction apparatus
JP2004254145A (en) * 2003-02-21 2004-09-09 Hitachi Ltd Projection display device
JP2008148089A (en) * 2006-12-12 2008-06-26 Nikon Corp Projection apparatus and camera

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015011103A (en) * 2013-06-27 2015-01-19 カシオ計算機株式会社 Projection apparatus, projection method, and program
CN109816662A (en) * 2017-11-22 2019-05-28 瑞昱半导体股份有限公司 Image processing method and electronic device for foreground image extraction
CN109816662B (en) * 2017-11-22 2022-10-18 瑞昱半导体股份有限公司 Image processing method for foreground image extraction and electronic device
CN112700463A (en) * 2020-12-30 2021-04-23 上海幻维数码创意科技股份有限公司 Multimedia exhibition hall interaction method and device based on image detection and storage medium
CN114928728A (en) * 2022-05-27 2022-08-19 海信视像科技股份有限公司 Projection apparatus and foreign matter detection method

Similar Documents

Publication Publication Date Title
WO2022179108A1 (en) Projection correction method and apparatus, storage medium, and electronic device
CN102193287B (en) Projection method and projection system
JP5266954B2 (en) Projection display apparatus and display method
JP5266953B2 (en) Projection display apparatus and display method
US20140176730A1 (en) Projection-type image display device, image projection method, and computer program
US10134118B2 (en) Information processing apparatus and method of obtaining information about a projection surface on which a target is projected
US9781396B2 (en) Projector and adjustment method using a plurality of patterns having different lightness values
JP2015060012A (en) Image processing system, image processing device, image processing method and image processing program as well as display system
JP2011081775A (en) Projection image area detecting device
US9953422B2 (en) Selective local registration based on registration error
US9551918B2 (en) Image processing apparatus, image processing method, and computer-readable storage medium
WO2018233637A1 (en) Video processing method, device, electronic device and storage medium
KR20090090682A (en) Screen distortion correction device and method
CN112272292A (en) Projection correction method, apparatus and storage medium
US10469812B2 (en) Projection display system, information processing apparatus, information processing method, and storage medium therefor
US20170359508A1 (en) Capturing apparatus and method for capturing images using the capturing apparatus
JP2019220887A (en) Image processing system, image processing method, and program
WO2013088657A1 (en) Projecting projector device, optical anti-glare method, and optical anti-glare program
WO2013186994A1 (en) Projection-type projector, anti-glare method, and program for anti-glare
JP2016173791A (en) Image processing apparatus, image processing method, and program
US10536677B2 (en) Image processing apparatus and method
JP2017212638A (en) Display device, control method for display device, and program
JP2021114685A (en) Controller, projection control method, projection system, program, and storage medium
US20160127704A1 (en) Display control apparatus, method of controlling the same, and non-transitory computer-readable storage medium
KR20150101343A (en) Video projection system

Legal Events

Code Title Description
121  Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 12857379; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase (Ref country code: DE)
122  Ep: pct application non-entry in european phase (Ref document number: 12857379; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase (Ref country code: JP)