
WO2018189867A1 - Projector, remote controller, projector system and distortion correction method - Google Patents

Projector, remote controller, projector system and distortion correction method

Info

Publication number
WO2018189867A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
projection
projector
remote controller
distortion correction
Prior art date
Application number
PCT/JP2017/015159
Other languages
French (fr)
Japanese (ja)
Inventor
青柳 寿和
Original Assignee
NEC Display Solutions, Ltd.
Priority date
Filing date
Publication date
Application filed by NEC Display Solutions, Ltd.
Priority to PCT/JP2017/015159
Publication of WO2018189867A1

Classifications

    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00: Projectors or projection-type viewers; Accessories therefor
    • G03B21/14: Details
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/74: Projection arrangements for image reproduction, e.g. using eidophor
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00: Details of colour television systems
    • H04N9/12: Picture reproducers
    • H04N9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]

Definitions

  • The present invention relates to a projector, a remote controller, a projector system, and a distortion correction method.
  • A projector can project an image not only on a screen but also on a structure such as a wall.
  • In that case, the projection surface may not be flat.
  • When the projection surface is not flat, distortion corresponding to the three-dimensional shape of the projection surface occurs in the projected image.
  • For this reason, the projector is equipped with a distortion correction function for correcting such distortion.
  • The distortion of the projected image also varies depending on the position (viewpoint) from which the projected image is viewed. For example, when viewed from an oblique direction with respect to the projection surface, the near side of the projected image looks large and the far side looks small.
  • In general, however, the distortion correction function is configured to correct distortion when a predetermined position (for example, the position of the projector) is the viewpoint, and the viewpoint cannot be changed.
  • A technique for correcting distortion according to an arbitrary viewpoint has therefore been proposed (see Patent Document 1).
  • The projector described in Patent Document 1 includes image modulation means, a tilt sensor, distance measuring means, and a control unit, and an image formed by the image modulation means is projected onto a screen.
  • The tilt sensor detects the tilt angle of the projector with respect to the horizontal plane.
  • The screen is arranged perpendicular to the horizontal plane.
  • The distance measuring means measures the distance from the projector to the center of the screen (the center of the projection area) and the distance from the center of the screen to the person.
  • The distance measuring means includes a light source, a photodetector, and a right-angle prism.
  • The right-angle prism is rotatably provided on the upper surface of the projector casing.
  • The light source outputs pulse-modulated light, and the pulse-modulated light is emitted in the horizontal direction via the right-angle prism. Reflected light from the object enters the photodetector through the right-angle prism. By rotating the right-angle prism, the surroundings of the projector are scanned with the pulse-modulated light.
  • The distance to the object is calculated based on the phase difference between the pulse-modulated light output from the light source and the incident light (reflected light from the object) entering the photodetector.
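For readers implementing something similar, the ranging principle just described boils down to converting a measured phase shift into a round-trip time. The following is a minimal sketch, not from the patent; the function name and the 10 MHz example modulation frequency are assumptions:

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def distance_from_phase(phase_diff_rad: float, mod_freq_hz: float) -> float:
    """One-way distance from the phase difference between emitted and
    received pulse-modulated light (continuous-wave time-of-flight)."""
    # round-trip path length = c * (delta_phi / 2*pi) / f
    round_trip = C * (phase_diff_rad / (2.0 * math.pi)) / mod_freq_hz
    return round_trip / 2.0  # halve: the light travels to the object and back

# Example: a 90-degree phase shift at 10 MHz modulation -> about 3.75 m.
print(distance_from_phase(math.pi / 2, 10e6))
```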
  • Based on the tilt angle detected by the tilt sensor, and on the distance from the projector to the screen and the distance from the screen to the person measured by the distance measuring means, the control unit corrects the distortion of the image formed on the image modulation means so that the projected image is rectangular when viewed from the person.
  • As another distortion correction method, there is a method in which a camera is installed at an arbitrary viewpoint position, the image projected on the object is captured by the camera, and distortion correction data is created from the degree of distortion in the captured image. As yet another distortion correction method, there is a method in which the operator, while observing from an arbitrary position the degree of distortion of the image projected on the object, adjusts the image through a user interface until the distortion is eliminated, thereby creating distortion correction data.
  • Patent Document 1 discloses distance measuring means as the means for detecting the position of the image projection area and the position of the person. However, although Patent Document 1 measures the distance to surrounding objects with the distance measuring means, it is unclear how the position of the image projection area and the position of the person are detected, and it is therefore also unclear whether the technique can be applied to distortion correction with the position of an arbitrary observer as the viewpoint. In the method of installing a camera at an arbitrary viewpoint position, the camera must be re-installed every time the viewpoint is changed, which is very troublesome for the user. Likewise, in the method of correcting distortion through the user interface, the user must re-adjust the distortion correction every time the viewpoint is changed, which is again very troublesome.
  • An object of the present invention is to provide a projector, a remote controller, a projector system, and a distortion correction method capable of correcting distortion from an arbitrary viewpoint with a simple operation.
  • According to the present invention, there is provided a projector used in combination with a remote controller that projects a predetermined pattern including three or more feature points, the projector including: an image forming unit that forms an image; a projection unit that projects the image formed by the image forming unit onto a projection surface; and distortion correction means for correcting the image formed on the image forming unit so that the projected image is not distorted when the projection surface is viewed from the position of the remote controller.
  • The distortion correction means includes: a first acquisition unit that performs three-dimensional measurement of the projection surface and acquires the three-dimensional position coordinates of the projection surface in a projector coordinate system, which represents the space into which the projection unit projects images in three-dimensional coordinates;
  • a second acquisition unit that images the predetermined pattern projected on the projection surface, extracts the three or more feature points from the captured image of the pattern, and acquires the physical coordinates of the three or more feature points in the projector coordinate system;
  • a storage unit storing data indicating the radiation direction of each of the three or more feature points of the remote controller; and a remote controller position calculation unit that calculates the position of the remote controller in the projector coordinate system based on the three-dimensional position coordinates of the projection surface, the physical coordinates of the three or more feature points, and the radiation directions of the three or more feature points.
  • According to the present invention, there is also provided a remote controller including: a projection unit that projects a predetermined pattern including three or more feature points; a transmission unit that wirelessly transmits a distortion correction start request signal for requesting the start of distortion correction; first and second operation keys; and a control unit that causes the projection unit to project the predetermined pattern when the first operation key is pressed, and causes the transmission unit to transmit the distortion correction start request signal when the second operation key is pressed.
  • According to the present invention, there is also provided a projector system having a remote controller that projects a predetermined pattern including three or more feature points, and a projector including an image forming unit that forms an image and a projection unit that projects the image formed by the image forming unit onto a projection surface.
  • According to the present invention, there is further provided a distortion correction method using such a remote controller, the method including correcting the image formed on the image forming unit so that the projected image is not distorted when the projection surface is viewed from the position of the remote controller.
  • In the distortion correction, three-dimensional measurement of the projection surface is performed to acquire the three-dimensional position coordinates of the projection surface in a projector coordinate system, which represents the space into which the projection unit projects images in three-dimensional coordinates; the predetermined pattern projected on the projection surface is imaged, the three or more feature points are extracted from the captured image of the pattern, and the physical coordinates of the three or more feature points in the projector coordinate system are acquired; and the position of the remote controller in the projector coordinate system is calculated based on the three-dimensional position coordinates of the projection surface, the physical coordinates of the three or more feature points, and the radiation directions of the three or more feature points.
  • Distortion correction from an arbitrary viewpoint can thus be performed with a simple operation.
  • FIG. 1 is a block diagram illustrating the configuration of a projector according to the first embodiment of the present invention.
  • FIG. 2 is a block diagram showing the configuration of the remote controller used with the projector shown in FIG. 1.
  • FIG. 3 is a schematic diagram for explaining the positional relationship among the detection range of the three-dimensional sensor unit, the imaging range of the camera unit, and the projection area of the projection lens unit.
  • FIG. 4 is a schematic diagram for explaining the positional relationship between the pattern projected from the remote controller and the projection area of the projector.
  • FIG. 5 is a flowchart showing one procedure of distortion correction data calculation processing.
  • FIG. 6A is a schematic diagram for explaining the relationship among the angle of view, the projection optical axis, and the projection center when launching is performed.
  • FIG. 1 is a block diagram showing a configuration of a projector according to the first embodiment of the present invention.
  • The projector includes a projection unit 2, a receiving unit 3, and a distortion correction data calculation unit 4.
  • The projection unit 2 projects an image based on the video signal 1 from a video supply device.
  • The receiving unit 3 receives a remote control signal (including a distortion correction start request and the like) from the remote controller.
  • The distortion correction data calculation unit 4 calculates distortion correction data so that an image without distortion is seen when the projection surface is viewed from the position of the remote controller.
  • The video supply device is, for example, an information processing device such as a personal computer, or a video device such as a recorder.
  • The projection unit 2 includes a video processing unit 5, a distortion correction unit 6, a video mute unit 7, and a projection lens unit 8.
  • The video signal 1 is input to the video processing unit 5.
  • The video processing unit 5 performs processing for converting the resolution of the video signal 1 to the resolution of the display device of the projection lens unit 8, processing for adjusting image quality, and the like.
  • The distortion correction unit 6 corrects, according to a distortion correction coefficient, the video signal processed by the video processing unit 5 so that the distortion of the projected image is corrected when the position of the remote controller is taken as the viewpoint.
  • The video mute unit 7 either supplies the video signal corrected by the distortion correction unit 6 to the projection lens unit 8 as it is, or creates a black screen image and outputs it to the projection lens unit 8.
  • The projection lens unit 8 includes a display device that forms an image based on the video signal from the video mute unit 7, and a projection lens that projects the image formed by the display device.
  • The projection lens includes a lens movable in the direction of the optical axis, a zoom mechanism that changes the angle of view according to the zoom position (the position of the lens on the optical axis), and a lens shift mechanism that shifts the entire projection lens in a direction orthogonal to the optical axis.
  • The projection lens unit 8 supplies zoom/shift position information indicating the zoom position and the lens shift position of the projection lens to the distortion correction data calculation unit 4.
  • The display device can be regarded as an image forming element having an image forming surface made up of a plurality of pixels.
  • As the display device, a liquid crystal display element, a DMD (digital micromirror device), or the like can be used.
  • The distortion correction data calculation unit 4 includes a projector projection design data storage unit 9, a three-dimensional sensor unit 10, a three-dimensional sensor calibration data storage unit 11, a three-dimensional data projector coordinate conversion unit 12, a camera unit 13, a pattern feature point detection unit 14, a camera imaging design data storage unit 15, a pattern physical coordinate calculation unit 16, a camera calibration data storage unit 17, a pattern projector coordinate conversion unit 18, a pattern storage unit 19, a remote controller projection design data storage unit 20, a remote controller position calculation unit 21, and a distortion correction coefficient calculation unit 22.
  • The projector projection design data storage unit 9 stores design data related to the projection of the projector.
  • The design data is data necessary for setting up the projector coordinate system, such as data for deriving the angle of view and the projection center from the zoom position and the lens shift position.
  • The three-dimensional sensor unit 10 measures the three-dimensional position of the projection surface of the projection object and outputs three-dimensional position data.
  • The three-dimensional sensor unit 10 includes a three-dimensional sensor arranged toward the optical axis direction of the projection lens.
  • The detection range of the three-dimensional sensor includes the entire projectable area of the projection lens.
  • As the three-dimensional sensor, a TOF (Time of Flight) sensor or a triangulation sensor can be used, but the sensor is not limited to these methods.
  • The TOF method performs three-dimensional measurement by projecting light toward an object and measuring the time until the projected light is reflected by the object and returns.
  • Examples of the triangulation method include a passive triangulation method and an active triangulation method.
  • In the passive triangulation method, the object is photographed simultaneously by two cameras arranged side by side, and the distance is measured using the principle of triangulation based on the difference in the object's position between the images captured by the two cameras; this is also called a stereo camera method.
  • The active triangulation method irradiates the object with light and performs three-dimensional measurement using the principle of triangulation based on information about the reflected light from the object.
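For concreteness, the passive (stereo camera) variant reduces, for rectified cameras, to the classic depth-from-disparity relation Z = f·B/d. A minimal sketch; the numeric values are purely illustrative:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z = f * B / d for two rectified cameras with baseline B,
    focal length f (in pixels), and pixel disparity d."""
    if disparity_px <= 0:
        raise ValueError("zero or negative disparity: unmatched or at infinity")
    return focal_px * baseline_m / disparity_px

# Example: f = 1400 px, 10 cm baseline, 35 px disparity -> 4.0 m.
print(stereo_depth(1400.0, 0.10, 35.0))
```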
  • The 3D sensor calibration data storage unit 11 stores 3D sensor calibration data.
  • The three-dimensional sensor calibration data includes parameters (rotation amount and translation amount) for converting the coordinate system of the three-dimensional sensor into the coordinate system of the projector, a reference zoom position, and a reference lens shift position.
  • The rotation amount and the translation amount are data obtained as a result of calibration that measures the positional relationship between the coordinate system of the three-dimensional sensor and the coordinate system of the projector.
  • The reference zoom position and the reference lens shift position are the zoom position and the lens shift position at the time the calibration was performed.
  • The three-dimensional data projector coordinate conversion unit 12 acquires the three-dimensional position data of the projection surface from the three-dimensional sensor unit 10, the three-dimensional sensor calibration data (rotation amount, translation amount, reference zoom position, and reference lens shift position) from the three-dimensional sensor calibration data storage unit 11, the design data from the projector projection design data storage unit 9, and the zoom/shift position information from the projection lens unit 8. Based on the three-dimensional sensor calibration data, the design data, and the zoom/shift position information, the three-dimensional data projector coordinate conversion unit 12 converts the three-dimensional position data of the projection surface into three-dimensional position data in the projector coordinate system with the projection center as the origin.
  • The calibration data and the design data can be called coordinate conversion data for converting the coordinate system of the three-dimensional sensor into the projector coordinate system with the projection center as the origin.
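The conversion named above is a rigid transform. A minimal sketch, assuming the calibration yields a 3x3 rotation matrix R and a translation vector t (the names are illustrative, not the patent's):

```python
import numpy as np

def sensor_to_projector(points_sensor: np.ndarray,
                        R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Express (N, 3) 3D-sensor points in the projector coordinate system
    by applying the calibrated rotation R (3x3) and translation t (3,)."""
    return points_sensor @ R.T + t
```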
  • The camera unit 13 includes a camera arranged toward the optical axis direction of the projection lens, and images the pattern projected from the remote controller onto the projection area.
  • The imaging range of the camera unit 13 includes the entire projectable area of the projection lens.
  • The pattern feature point detection unit 14 extracts the pattern projected by the remote controller from the captured image of the camera unit 13 and detects a plurality of feature points indicated by the pattern.
  • The number of feature points is three or more.
  • The pattern feature point detection unit 14 outputs pattern feature point information (two-dimensional coordinate information) indicating the coordinates of each feature point on the captured image.
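The patent does not specify a detection algorithm. One plausible sketch, assuming the pattern consists of bright spots imaged while a black screen is projected, using OpenCV 4 (the threshold value is an assumption):

```python
import cv2
import numpy as np

def detect_bright_spots(gray: np.ndarray, thresh: int = 200) -> list:
    """Return (x, y) centroids of bright blobs in a grayscale camera frame."""
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    points = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:  # skip degenerate blobs
            points.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return points
```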
  • The camera imaging design data storage unit 15 stores camera imaging design data related to imaging, such as the angle of view of the camera.
  • The camera imaging design data is data necessary for setting up the camera coordinate system, such as the angle of view and the optical center.
  • The pattern physical coordinate calculation unit 16 acquires the camera imaging design data from the camera imaging design data storage unit 15, and acquires the pattern feature point information indicating the coordinates of each feature point from the pattern feature point detection unit 14.
  • The pattern physical coordinate calculation unit 16 calculates the physical coordinates of each feature point in the camera coordinate system based on the camera imaging design data and the pattern feature point information.
  • The pattern physical coordinate calculation unit 16 outputs pattern feature point physical coordinate data, which is the result of calculating the physical coordinates of each feature point.
  • Here, the physical coordinates will be briefly described.
  • The coordinates on the captured image, which are two-dimensional coordinates, expressed as three-dimensional coordinates on a reference plane placed at an arbitrary distance, based on design information such as the angle of view of the camera, are called physical coordinates.
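Under a simple pinhole model, computing physical coordinates amounts to back-projecting a pixel onto a reference plane at a chosen distance. A hedged sketch (function and parameter names are assumptions, not the patent's):

```python
import math

def pixel_to_physical(u: float, v: float, width_px: int, height_px: int,
                      hfov_deg: float, ref_dist: float) -> tuple:
    """Map pixel (u, v) to 3D coordinates on a reference plane at distance
    ref_dist in front of the optical center (pinhole camera model)."""
    # focal length in pixels, derived from the horizontal angle of view
    f_px = (width_px / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    x = (u - width_px / 2.0) / f_px * ref_dist
    y = (v - height_px / 2.0) / f_px * ref_dist
    return (x, y, ref_dist)
```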
  • The camera calibration data storage unit 17 stores camera calibration data, which includes parameters (rotation amount and translation amount) for converting the camera coordinate system into the projector coordinate system, a reference zoom position, and a reference lens shift position.
  • The rotation amount and the translation amount are data obtained as a result of calibration that measures the positional relationship between the camera coordinate system and the projector coordinate system.
  • The reference zoom position and the reference lens shift position are the zoom position and the lens shift position at the time the calibration was performed.
  • The pattern projector coordinate conversion unit 18 acquires the pattern feature point physical coordinate data from the pattern physical coordinate calculation unit 16, the camera calibration data (rotation amount, translation amount, reference zoom position, and reference lens shift position) from the camera calibration data storage unit 17, the design data from the projector projection design data storage unit 9, and the zoom/shift position information from the projection lens unit 8.
  • Based on the camera calibration data, the design data, and the zoom/shift position information, the pattern projector coordinate conversion unit 18 converts the pattern feature point physical coordinate data into pattern feature point physical coordinate data in the projector coordinate system with the projection center as the origin.
  • The pattern storage unit 19 stores pattern information on the pattern projected from the remote controller. For example, in the case of a pattern including four feature points, information such as the positional relationship of the feature points is stored in the pattern storage unit 19.
  • The remote controller projection design data storage unit 20 stores design data related to the projection of the remote controller, such as the projection angle of view of the remote controller. From this design data, the direction (radiation direction) of the light ray toward each feature point can be obtained.
  • The remote controller position calculation unit 21 acquires the three-dimensional position coordinate data of the projection surface in the projector coordinate system from the three-dimensional data projector coordinate conversion unit 12, the pattern feature point physical coordinate data in the projector coordinate system from the pattern projector coordinate conversion unit 18, the pattern information from the pattern storage unit 19, and the remote controller projection design data from the remote controller projection design data storage unit 20.
  • The remote controller position calculation unit 21 calculates the three-dimensional position of the remote controller in the projector coordinate system based on the three-dimensional position coordinate data, the pattern feature point physical coordinate data, the pattern information, and the remote controller projection design data.
  • The distortion correction coefficient calculation unit 22 acquires the zoom/shift position information from the projection lens unit 8, the design data from the projector projection design data storage unit 9, the three-dimensional position coordinate data of the projection surface in the projector coordinate system from the three-dimensional data projector coordinate conversion unit 12, and the three-dimensional position data of the remote controller in the projector coordinate system from the remote controller position calculation unit 21.
  • Based on the zoom/shift position information, the design data, the three-dimensional position coordinate data of the projection surface, and the three-dimensional position data of the remote controller, the distortion correction coefficient calculation unit 22 calculates a distortion correction coefficient for correcting the projection image so that, with the position of the remote controller as the viewpoint, distortion corresponding to the three-dimensional shape of the projection surface does not occur.
  • The distortion correction coefficient is supplied from the distortion correction coefficient calculation unit 22 to the distortion correction unit 6.
  • FIG. 2 is a block diagram showing the configuration of the remote controller.
  • The remote controller 50 includes an operation unit 51, a control unit 52, a transmission unit 53, and a pattern projection unit 54.
  • The operation unit 51 includes a plurality of operation keys, receives input operations using these keys, and supplies operation signals corresponding to the input operations to the control unit 52.
  • The operation unit 51 is provided with a pattern projection key and a distortion correction start key.
  • When the pattern projection key is pressed, an operation signal for starting pattern projection is supplied from the operation unit 51 to the control unit 52.
  • When the distortion correction start key is pressed, an operation signal indicating the start of distortion correction is supplied from the operation unit 51 to the control unit 52.
  • The pattern projection unit 54 projects a predetermined pattern including three or more feature points.
  • The predetermined pattern may be any pattern as long as three or more feature points can be identified from it.
  • For example, the predetermined pattern may be a pattern composed of three or more bright spots, or a figure composed of a plurality of lines.
  • In the present embodiment, a pattern including four feature points is used as the predetermined pattern.
  • The pattern projection unit 54 may have any structure as long as it can project the predetermined pattern.
  • For example, the pattern projection unit 54 may have a structure, such as that used in a laser pointer, for projecting a pattern composed of a plurality of bright spots or a figure pattern composed of a plurality of lines.
  • Specifically, the pattern projection unit 54 may include four light sources arranged on a substrate and a lens group that projects the bright spot pattern formed by these light sources.
  • As each light source, an LD (laser diode), an LED (light emitting diode), or the like can be used.
  • The directions of the chief rays from the four light sources toward the respective four feature points are the radiation directions of the feature points.
  • Alternatively, a combination of three or more laser pointers, each producing one bright spot, may be used as the pattern projection unit 54. In this case, the emission direction of each laser pointer is the radiation direction of the corresponding feature point.
  • The transmission unit 53 transmits a remote control signal (radio signal) using infrared rays or the like. Since remote control signals of this kind are commonly used in video equipment such as recorders, a detailed description of their structure is omitted here.
  • The control unit 52 controls the operations of the transmission unit 53 and the pattern projection unit 54 in accordance with the operation signals from the operation unit 51. Specifically, the control unit 52 causes the pattern projection unit 54 to project the image of the predetermined pattern in response to the operation signal indicating that the pattern projection key has been pressed, and causes the transmission unit 53 to transmit a remote control signal indicating a distortion correction start request in response to the operation signal indicating that the distortion correction start key has been pressed.
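The control flow just described is a simple key dispatch. A minimal sketch; the class, method, and message names are hypothetical stand-ins for the units in FIG. 2:

```python
class RemoteControllerControlUnit:
    """Sketch of control unit 52 dispatching to units 53 and 54."""

    def __init__(self, pattern_projection_unit, transmission_unit):
        self.pattern_projection_unit = pattern_projection_unit  # unit 54
        self.transmission_unit = transmission_unit              # unit 53

    def on_key(self, key: str) -> None:
        if key == "PATTERN_PROJECTION":
            self.pattern_projection_unit.project_pattern()
        elif key == "DISTORTION_CORRECTION_START":
            self.transmission_unit.send("DISTORTION_CORRECTION_START_REQUEST")
```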
  • FIG. 3 schematically shows the positional relationship among the detection range of the three-dimensional sensor unit 10, the imaging range of the camera unit 13, and the projection area of the projection lens unit 8.
  • The three-dimensional sensor 107, the camera 106, and the projection lens 101a are provided on the same surface of the casing.
  • The three-dimensional sensor 107 and the camera 106 are arranged toward the optical axis direction of the projection lens 101a.
  • The projection area can be enlarged or reduced using the zoom function, and can be moved up, down, left, and right using the lens shift function.
  • The projection area 110 is the projection area when the lens is shifted in the upper right direction and the zoom is minimized.
  • The projection area 111 is the projection area when the lens is shifted in the upper right direction and the zoom is maximized.
  • The projection area 112 is the projection area when the lens is shifted in the upper left direction and the zoom is minimized.
  • The projection area 113 is the projection area when the lens is shifted in the upper left direction and the zoom is maximized.
  • The projection area 114 is the projection area when the lens is shifted in the lower right direction and the zoom is minimized.
  • The projection area 115 is the projection area when the lens is shifted in the lower right direction and the zoom is maximized.
  • The projection area 116 is the projection area when the lens is shifted in the lower left direction and the zoom is minimized.
  • The projection area 117 is the projection area when the lens is shifted in the lower left direction and the zoom is maximized.
  • The detection range of the three-dimensional sensor 107 includes the entire projectable area in which the image from the projection lens 101a can be projected, that is, the whole of the projection areas 110 to 117. Therefore, the three-dimensional sensor 107 can measure the three-dimensional position of a three-dimensional object placed in the projectable area.
  • The three-dimensional sensor unit 10 supplies the three-dimensional position data output by the three-dimensional sensor 107 to the three-dimensional data projector coordinate conversion unit 12.
  • The imaging range of the camera 106 also includes the entire projectable area, that is, the whole of the projection areas 110 to 117. Therefore, the camera 106 can capture the predetermined pattern projected by the remote controller 50 onto the projection surface within the projectable area.
  • The camera unit 13 supplies the captured image output by the camera 106 to the pattern feature point detection unit 14.
  • In this way, both the detection range of the three-dimensional sensor 107 and the imaging range of the camera 106 cover the projection area.
  • The user uses the remote controller 50 to designate the position that serves as the viewpoint for viewing the projected image.
  • When the pattern projection key is pressed, the pattern projection unit 54 projects the predetermined pattern onto the projection surface.
  • When the distortion correction start key is pressed, the transmission unit 53 transmits a remote control signal indicating a distortion correction start request.
  • The projector 101 starts distortion correction processing in response to the remote control signal from the remote controller 50.
  • FIG. 4 shows the positional relationship between the pattern projected from the remote controller 50 and the projection area of the projector 101.
  • The remote controller 50 projects a pattern 120 composed of four points onto the projection area 103 on the projection object 104.
  • The projector 101 detects the position of the remote controller 50 based on the pattern 120, and calculates distortion correction data such that an image without distortion is seen when the projection surface is viewed from the detected position as the viewpoint.
  • FIG. 5 shows a procedure of the distortion correction data calculation processing. The operation will be described below with reference to FIGS. 1 to 5.
  • First, it is determined whether the receiving unit 3 has received a remote control signal (distortion correction start request) from the remote controller 50 (step S100).
  • When the remote control signal has been received, the receiving unit 3 notifies the distortion correction data calculation unit 4 to that effect.
  • In the distortion correction data calculation unit 4, in response to the distortion correction start request, the three-dimensional sensor 107 first measures the projection surface of the projection area 103 three-dimensionally and supplies the resulting three-dimensional position data to the three-dimensional data projector coordinate conversion unit 12 (step S101).
  • The three-dimensional position data is point group data indicated by three-dimensional coordinates in the coordinate system of the three-dimensional sensor 107.
  • Next, the three-dimensional data projector coordinate conversion unit 12 converts the three-dimensional position data of the projection surface into three-dimensional position data in the projector coordinate system with the projection center 102 as the origin (step S102).
  • Here, the projection center 102, which is the origin of the projector coordinate system, will be briefly described.
  • FIG. 6A schematically shows the relationship among the angle of view, the projection optical axis, and the projection center when launching is performed.
  • FIG. 6B schematically shows the relationship among the angle of view, the projection optical axis, and the projection center when there is no launching.
  • FIG. 6C schematically shows the relationship among the angle of view, the projection optical axis, and the projection center when the angle of view is enlarged.
  • FIG. 6D schematically shows the relationship among the angle of view, the projection center axis, and the projection center when the lens is shifted upward.
  • The projection optical axis is an axis that passes through the center of the image forming surface and is perpendicular to the image forming surface.
  • FIGS. 6B to 6D correspond to cross sections in the vertical direction.
  • A projection form called launching is used, in which the image is projected above the projection optical axis so that the image appears above the height of the table on which the projector is placed.
  • Launching is one form of lens shift.
  • The projection center 102 is the point at which the straight lines connecting the four corner points of the projection area 103 with the corresponding four corner points of the image forming area of the display device 100 intersect.
  • The projection area 103 is the image of the image forming area of the display device 100 inverted vertically and horizontally. In practice, since the light is refracted by the lens, the lines connecting the four corners of the projection area 103 and the four corners of the image forming area of the display device 100 are not straight lines.
  • Therefore, the projection center 102 needs to be determined in consideration of the lens configuration of the projection lens.
  • Let the four corner points of the image forming area of the display device 100 be points A, B, C, and D, and the four corner points of the projection area 103 be points a, b, c, and d.
  • Points a, b, c, and d correspond to points A, B, C, and D, respectively, and the arrangement of points a, b, c, and d is inverted vertically and horizontally with respect to the arrangement of points A, B, C, and D.
  • The projection center 102 is shown as the mutual intersection of the principal rays: the ray that exits from point A and reaches point a via the lens, the ray that exits from point B and reaches point b via the lens, the ray that exits from point C and reaches point c via the lens, and the ray that exits from point D and reaches point d via the lens.
  • Such an intersection of principal rays can be defined, for example, at the center of the aperture stop of the projection lens, and can be calculated based on lens design data (a sketch of this intersection computation follows).
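Numerically, the principal rays will not meet exactly, so the intersection is naturally computed as the least-squares point closest to all of them. A minimal sketch, assuming each ray is given by a point and a direction (illustrative, not the patent's procedure):

```python
import numpy as np

def nearest_point_to_lines(points: np.ndarray, dirs: np.ndarray) -> np.ndarray:
    """Least-squares point closest to N 3D lines.

    points: (N, 3) a point on each principal ray
    dirs:   (N, 3) ray directions (normalized internally)
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for a, d in zip(points, dirs):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projects onto the plane normal to d
        A += P
        b += P @ a
    return np.linalg.solve(A, b)
```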
  • In FIG. 6A (launching), the projection optical axis 109 passes through the center of the lower end of the projection area 103, and the projection center 102 is located above the projection optical axis 109. In this case, the projection center axis does not coincide with the projection optical axis 109.
  • In FIG. 6B (no launching), the projection optical axis 109 passes through the center of the projection area 103, and the projection center 102 is located on the projection optical axis 109.
  • In this case, the projection center axis coincides with the projection optical axis 109.
  • FIG. 6C is an example in which the angle of view is enlarged compared with the example of FIG. 6B. As in FIG. 6B, the projection optical axis 109 passes through the center of the projection area 103, and the projection center 102 is located on the projection optical axis 109.
  • However, the projection center 102 is arranged closer to the display device 100 than in the example of FIG. 6B.
  • In this case as well, the projection center axis coincides with the projection optical axis 109.
  • FIG. 6D is an example in which the lens is shifted so that the projection area 103 moves upward compared with the example of FIG. 6B.
  • Here, the projection optical axis 109 passes through the center of the lower end of the projection area 103, and the projection center 102 is located above the projection optical axis 109. In this case, the projection center axis does not coincide with the projection optical axis 109.
  • As described above, the position of the projection center 102 changes according to the zoom position and the lens shift position. In other words, the projection center 102 needs to be determined according to the zoom position and the lens shift position.
  • For converting the coordinate system of the three-dimensional sensor into the projector coordinate system, a rotation amount about three mutually orthogonal coordinate axes and a translation amount representing movement along these coordinate axes are used.
  • Obtaining the rotation amount and the translation amount is called calibration.
  • Since the position of the projection center 102, which is the origin of the projector coordinate system, is not fixed but moves with zooming and lens shift, the three-dimensional sensor calibration data storage unit 11 stores, together with the rotation amount and the translation amount obtained by calibration, the zoom position and the lens shift position at the time of calibration as the reference zoom position and the reference lens shift position, respectively.
  • The three-dimensional data projector coordinate conversion unit 12 performs the coordinate conversion using the rotation amount, the translation amount, the reference zoom position, and the reference lens shift position in the following procedure (a code sketch follows the list).
  • (A1) The coordinates of the reference projection center point at the time of calibration are obtained from the design data stored in the projector projection design data storage unit 9, the reference zoom position, and the reference lens shift position.
  • (A2) The coordinates of the current projection center point are obtained in the same manner from the current zoom position and lens shift position supplied from the projection lens unit 8.
  • (A3) A translation amount for converting the coordinates of the reference projection center point into the coordinates of the current projection center point is obtained.
  • (A4) The three-dimensional position data from the three-dimensional sensor 107 is coordinate-converted using the rotation amount and the translation amount stored in the three-dimensional sensor calibration data storage unit 11.
  • (A5) The coordinates are then moved by the translation amount obtained in (A3), in accordance with the current zoom position and the current lens shift position. Through the procedures (A1) to (A5), the three-dimensional position data from the three-dimensional sensor 107 is converted into the projector coordinate system with the current projection center point as the origin.
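A compact sketch of steps (A1) to (A5), assuming the projection-center coordinates at the reference and current zoom/lens-shift positions have already been derived from the design data (all names are illustrative):

```python
import numpy as np

def to_current_projector_coords(points_sensor: np.ndarray,
                                R: np.ndarray, t_calib: np.ndarray,
                                ref_center: np.ndarray,
                                cur_center: np.ndarray) -> np.ndarray:
    """Convert 3D-sensor points into the projector coordinate system whose
    origin is the *current* projection center point."""
    # (A4) sensor frame -> projector frame as it was at calibration time
    pts = points_sensor @ R.T + t_calib
    # (A3)+(A5) shift the origin from the reference projection center
    # to the current projection center
    return pts + (ref_center - cur_center)
```

The same helper applies to the pattern feature points in steps (B1) to (B5) below, with the camera calibration's rotation and translation in place of the sensor's.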
  • Next, the camera unit 13 including the camera 106 supplies a mute start instruction signal to the video mute unit 7 so that a black screen image is output.
  • The video mute unit 7 supplies a black screen video signal to the projection lens unit 8 in response to the mute start instruction signal.
  • The projection lens unit 8 projects a black screen image onto the projection area 103 (step S103). By projecting a black screen image, the predetermined pattern projected from the remote controller 50 can be accurately extracted.
  • Subsequently, the camera 106 images the pattern 120 projected from the remote controller 50 (step S104).
  • After the imaging, the camera unit 13 including the camera 106 supplies a mute release instruction signal to the video mute unit 7.
  • The video mute unit 7 supplies the video signal from the distortion correction unit 6 to the projection lens unit 8 in response to the mute release instruction signal.
  • The pattern feature point detection unit 14 extracts the four feature points of the pattern 120 from the captured image from the camera 106 (step S105).
  • The pattern physical coordinate calculation unit 16 calculates the physical coordinates of each of the four feature points in the camera coordinate system (step S106).
  • The pattern projector coordinate conversion unit 18 converts the physical coordinates of each feature point of the pattern in the camera coordinate system into physical coordinates in the projector coordinate system (step S107).
  • The coordinate conversion in step S107 can be performed by the same method as the three-dimensional sensor projector coordinate conversion described above.
  • Specifically, the pattern projector coordinate conversion unit 18 performs the coordinate conversion using the rotation amount, the translation amount, the reference zoom position, and the reference lens shift position in the following procedure.
  • (B1) The coordinates of the reference projection center point at the time of calibration are obtained from the design data stored in the projector projection design data storage unit 9, the reference zoom position, and the reference lens shift position.
  • (B2) The coordinates of the current projection center point are obtained in the same manner from the current zoom position and lens shift position supplied from the projection lens unit 8.
  • (B3) A translation amount for converting the coordinates of the reference projection center point into the coordinates of the current projection center point is obtained.
  • (B4) The physical coordinates of each feature point of the pattern in the camera coordinate system, supplied from the pattern physical coordinate calculation unit 16, are coordinate-converted using the rotation amount and the translation amount stored in the camera calibration data storage unit 17.
  • (B5) The coordinates are then moved by the translation amount obtained in (B3), in accordance with the current zoom position and the current lens shift position.
  • Through the procedures (B1) to (B5), the physical coordinates of each feature point of the pattern in the camera coordinate system are converted into the projector coordinate system with the current projection center point as the origin.
  • At this time, the origin (0, 0, 0) of the camera coordinate system, which is the optical center of the camera, is also converted into the projector coordinate system.
  • Next, the remote controller position calculation unit 21 calculates the three-dimensional position of the remote controller in the projector coordinate system based on the three-dimensional position data of the projection surface in the projector coordinate system (the three-dimensional position coordinates of the projection surface) and the physical coordinates (three-dimensional position coordinates) of each feature point of the pattern in the projector coordinate system (step S108).
  • Then, based on the three-dimensional position data of the projection surface in the projector coordinate system and the three-dimensional position of the remote controller in the projector coordinate system, the distortion correction coefficient calculation unit 22 calculates a distortion correction coefficient such that, when the projection surface is viewed from the position of the remote controller as the viewpoint, distortion corresponding to the three-dimensional shape of the projection surface does not occur in the projected image (step S109).
  • The distortion correction coefficient calculated as described above is supplied to the distortion correction unit 6.
  • The distortion correction unit 6 performs distortion correction on the video signal from the video processing unit 5 according to the distortion correction coefficient.
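As a concrete but deliberately simplified illustration: if the projection surface happens to be a single plane, the correction for a given viewpoint reduces to pre-warping each frame with a homography that maps the panel corners to the corner positions that appear rectangular from that viewpoint. The corner values below are invented for the example; the patent's coefficient additionally handles non-planar surfaces:

```python
import cv2
import numpy as np

# panel corners of a 1920x1080 image
src = np.float32([[0, 0], [1920, 0], [1920, 1080], [0, 1080]])
# assumed pre-warp targets computed from the viewpoint (illustrative values)
dst = np.float32([[60, 25], [1880, 70], [1850, 1050], [90, 1010]])
H = cv2.getPerspectiveTransform(src, dst)

def correct_frame(frame: np.ndarray) -> np.ndarray:
    """Pre-warp one video frame so it looks rectangular from the viewpoint."""
    return cv2.warpPerspective(frame, H, (1920, 1080))
```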
  • FIG. 7 schematically shows the method for calculating the three-dimensional position of the remote controller 50.
  • In FIG. 7, the coordinate axes of the projector coordinate system are indicated as the XYZ axes, the coordinate axes of the three-dimensional sensor coordinate system as the X′Y′Z′ axes, and the coordinate axes of the camera coordinate system as the X″Y″Z″ axes.
  • The origins of the projector coordinate system, the three-dimensional sensor coordinate system, and the camera coordinate system are different from one another.
  • The pattern feature point detection unit 14 extracts the four feature points of the pattern 120 from the captured image 121 from the camera 106, and the pattern physical coordinate calculation unit 16 calculates the physical coordinates 122 of each of the four feature points in the camera coordinate system.
  • The pattern projector coordinate conversion unit 18 converts the physical coordinates 122 of each feature point in the camera coordinate system into physical coordinates in the projector coordinate system. Furthermore, the pattern projector coordinate conversion unit 18 converts the origin (0, 0, 0) of the camera coordinate system, which is the optical center of the camera 106, into the projector coordinate system.
  • The remote controller position calculation unit 21 obtains, in the projector coordinate system, the four straight lines connecting the physical coordinates of the four feature points with the origin of the camera coordinate system, and obtains the intersection of each straight line with the three-dimensional position of the projection surface in the projector coordinate system converted by the three-dimensional data projector coordinate conversion unit 12.
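Each of these intersections is a ray-surface intersection; with a locally planar fit to the measured point cloud it has a closed form. A minimal sketch (the planar-fit assumption is mine, not the patent's):

```python
import numpy as np

def ray_plane_intersection(origin: np.ndarray, through: np.ndarray,
                           plane_point: np.ndarray,
                           plane_normal: np.ndarray) -> np.ndarray:
    """Intersect the ray from `origin` through `through` with the plane
    defined by `plane_point` and `plane_normal`."""
    d = through - origin
    denom = float(np.dot(plane_normal, d))
    if abs(denom) < 1e-12:
        raise ValueError("ray is parallel to the surface")
    s = float(np.dot(plane_normal, plane_point - origin)) / denom
    return origin + s * d
```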
  • Then, using the pattern data stored in the pattern storage unit 19 and the design data stored in the remote controller projection design data storage unit 20, the remote controller position calculation unit 21 calculates the three-dimensional position of the remote controller 50 in the projector coordinate system from the condition that the pattern projected from the remote controller 50 lands at the three-dimensional positions of the intersections.
  • Specifically, the remote controller position calculation unit 21 obtains the ray directions toward the four feature points of the remote controller 50 based on the design data stored in the remote controller projection design data storage unit 20, and then finds the unique position of the remote controller such that the rays traveling toward the four feature points pass through the four intersection points on the projection surface in the projector coordinate system. The position thus obtained is the three-dimensional position of the remote controller 50 in the projector coordinate system.
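If, as a simplifying assumption, the radiation directions d_i are already expressed in the projector coordinate system (that is, the remote controller's orientation is treated as known; the patent's method also recovers the pose from the stored pattern geometry), the position solves a linear least-squares problem: find p and ray parameters s_i with p + s_i·d_i = P_i for every intersection point P_i. A sketch:

```python
import numpy as np

def solve_remote_position(intersections: np.ndarray,
                          directions: np.ndarray) -> np.ndarray:
    """Least-squares position p such that the rays p + s_i * d_i pass
    through the measured intersection points P_i on the projection surface.

    intersections: (N, 3) points P_i in projector coordinates
    directions:    (N, 3) radiation directions d_i in projector coordinates
    """
    n = len(intersections)
    A = np.zeros((3 * n, 3 + n))  # unknowns: p (3 values) and s_1..s_N
    b = intersections.reshape(-1)
    for i, d in enumerate(directions):
        A[3 * i:3 * i + 3, 0:3] = np.eye(3)  # coefficients of p
        A[3 * i:3 * i + 3, 3 + i] = d        # coefficient of s_i
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3]
```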
  • As described above, in the present embodiment, distortion correction can be performed automatically using the position designated with the remote controller 50 as the viewpoint. Therefore, distortion correction corresponding to an arbitrary viewpoint can be performed with a simple operation using the remote controller.
  • In the above description, the operation has been explained taking a pattern including four feature points as an example; however, the number of feature points required for calculating the position of the remote controller is three or more. Furthermore, a polygonal figure pattern composed of a plurality of lines may be used instead of a bright spot pattern. In this case, each vertex of the polygon is extracted as a feature point.
  • In the present embodiment, the portion including the distortion correction unit 6 and the distortion correction data calculation unit 4 can be referred to as distortion correction means.
  • The portion including the projector projection design data storage unit 9, the three-dimensional sensor unit 10, the three-dimensional sensor calibration data storage unit 11, and the three-dimensional data projector coordinate conversion unit 12 can be referred to as a first acquisition unit.
  • The portion including the camera unit 13, the pattern feature point detection unit 14, the camera imaging design data storage unit 15, the pattern physical coordinate calculation unit 16, the camera calibration data storage unit 17, and the pattern projector coordinate conversion unit 18 can be referred to as a second acquisition unit.
  • The pattern storage unit 19 and the remote controller projection design data storage unit 20 can be referred to as a storage unit.
  • FIG. 8 is a block diagram showing a configuration of a projector according to the second embodiment of the present invention.
  • The projector is used in combination with a remote controller that projects a predetermined pattern including three or more feature points.
  • The projector includes an image forming unit 200 that forms an image, a projection unit 201 that projects the image formed by the image forming unit 200, and a distortion correction unit 202 that corrects the image formed on the image forming unit 200 so that the projected image is not distorted when the projection surface is viewed from the position of the remote controller.
  • The distortion correction unit 202 includes a first acquisition unit 203, a second acquisition unit 204, a storage unit 205, and a remote controller position calculation unit 206.
  • The first acquisition unit 203 performs three-dimensional measurement of the projection surface and acquires the three-dimensional position coordinates of the projection surface in a projector coordinate system, which represents the space into which the projection unit 201 projects images in three-dimensional coordinates.
  • The second acquisition unit 204 images the predetermined pattern projected on the projection surface, extracts the three or more feature points from the captured image of the pattern, and acquires the physical coordinates of the three or more feature points in the projector coordinate system.
  • The storage unit 205 stores data indicating the radiation direction of each of the three or more feature points of the remote controller.
  • The remote controller position calculation unit 206 calculates the position of the remote controller in the projector coordinate system based on the three-dimensional position coordinates of the projection surface, the physical coordinates of the three or more feature points, and the radiation directions of the three or more feature points.
  • According to the present embodiment, distortion correction can be performed automatically using the position designated with the remote controller as the viewpoint. Therefore, distortion correction corresponding to an arbitrary viewpoint can be performed with a simple operation using the remote controller.
  • In the projector according to the present embodiment, the distortion correction unit 202 may include a distortion correction coefficient calculation unit that calculates a distortion correction coefficient for correcting the distortion of the projected image based on the three-dimensional position coordinates of the projection surface and the position of the remote controller, and the image forming unit 200 may correct the formed image according to the distortion correction coefficient. The second acquisition unit 204 may also cause the image forming unit 200 to form a black screen image and capture the predetermined pattern while the black screen is projected.
  • The projector according to the present embodiment may further include a receiving unit that receives the distortion correction start request signal transmitted from the remote controller, and the first acquisition unit 203, the second acquisition unit 204, and the remote controller position calculation unit 206 may be configured to operate in response to the distortion correction start request signal.
  • The image forming unit 200 and the projection unit 201 correspond to the projection lens unit 8 shown in FIG. 1.
  • The distortion correction unit 202 corresponds to the distortion correction unit 6 and the distortion correction data calculation unit 4 illustrated in FIG. 1.
  • The first acquisition unit 203 corresponds to the portion including the projector projection design data storage unit 9, the three-dimensional sensor unit 10, the three-dimensional sensor calibration data storage unit 11, and the three-dimensional data projector coordinate conversion unit 12 illustrated in FIG. 1.
  • The second acquisition unit 204 corresponds to the portion including the camera unit 13, the pattern feature point detection unit 14, the camera imaging design data storage unit 15, the pattern physical coordinate calculation unit 16, the camera calibration data storage unit 17, and the pattern projector coordinate conversion unit 18 illustrated in FIG. 1.
  • The storage unit 205 corresponds to the pattern storage unit 19 and the remote controller projection design data storage unit 20 shown in FIG. 1.
  • The remote controller position calculation unit 206 corresponds to the remote controller position calculation unit 21 shown in FIG. 1.

Abstract

Provided is a projector capable of correcting distortions from any viewpoint via a simple operation. The projector is provided with: an image forming unit (200); a projection unit (201) that projects an image formed by the image forming unit; and a distortion correction means (202) that corrects the image being formed by the image forming unit so that distortion is not generated in the projected image. The distortion correction means is provided with: a first acquisition unit (203) for acquiring three-dimensional position coordinates of a projection surface in a projector coordinate system; a second acquisition unit (204) for extracting three or more characteristic points from a captured image of a prescribed pattern projected on the projection surface from a remote controller, and acquiring physical coordinates of each characteristic point in the projector coordinate system; a storage unit (205) in which data indicating the respective radiation direction of each characteristic point is stored; and a remote controller position calculation unit (206) for calculating the position of the remote controller on the basis of the three-dimensional position coordinates of the projection surface, the physical coordinates of each characteristic point, and the radiation direction of each characteristic point.

Description

Projector, remote controller, projector system, and distortion correction method
The present invention relates to a projector, a remote controller, a projector system, and a distortion correction method.
A projector can project an image not only onto a screen but also onto a structure such as a wall, in which case the projection surface may not be flat. When the projection surface is not flat, distortion corresponding to the three-dimensional shape of the projection surface occurs in the projected image. Even if the projection surface is flat, when the projection light strikes the projection surface obliquely, a distortion called trapezoidal (keystone) distortion occurs in the projected image. For this reason, projectors are equipped with a distortion correction function for correcting such distortion.
The distortion of the projected image also varies depending on the position (viewpoint) from which the projected image is viewed. For example, when viewed obliquely with respect to the projection surface, the near side of the projected image looks large and the far side looks small. It is therefore desirable to correct the distortion of the projected image according to the position of the viewpoint. In general, however, the distortion correction function corrects distortion for a predetermined viewpoint (for example, the position of the projector), and the viewpoint cannot be changed.
Against this background, a technique for correcting distortion according to an arbitrary viewpoint has been proposed (see Patent Document 1).
The projector described in Patent Document 1 includes an image modulation unit, a tilt sensor, a distance measurement unit, and a control unit, and an image formed by the image modulation unit is projected onto a screen. The tilt sensor detects the tilt angle of the projector with respect to the horizontal plane. The screen is arranged perpendicular to the horizontal plane.
The distance measurement unit measures the distance from the projector to the center of the screen (the center of the projection area) and the distance from the center of the screen to a person. Specifically, the distance measurement unit includes a light source, a photodetector, and a right-angle prism. The right-angle prism is rotatably mounted on the upper surface of the projector housing. The light source outputs pulse-modulated light, which is emitted in the horizontal direction via the right-angle prism. Reflected light from an object enters the photodetector through the right-angle prism. By rotating the right-angle prism, the surroundings of the projector are scanned with the pulse-modulated light. The distance to the object is calculated from the phase difference between the pulse-modulated light output from the light source and the light incident on the photodetector (the light reflected from the object).
Based on the tilt angle detected by the tilt sensor and the distances measured by the distance measurement unit, from the projector to the screen and from the screen to the person, the control unit corrects the distortion of the image formed on the image modulation unit so that the projected image appears rectangular when viewed from the person.
As another distortion correction method, there is a method in which a camera is installed at the position of an arbitrary viewpoint, the image projected on the object is captured by the camera, and distortion correction data is created from the degree of distortion of the captured image.
As yet another distortion correction method, there is a method in which an operator stands at an arbitrary position and, while observing the degree of distortion of the image projected on the object, adjusts the image through a user interface until the distortion is eliminated, thereby creating distortion correction data.
JP 2007-324643 A
Patent Document 1 discloses a distance measurement unit as a means for detecting the position of the image projection area and the position of a person. However, Patent Document 1 only discloses that the distance measurement unit measures the distances to surrounding objects; how the position of the image projection area and the position of the person are detected is unclear, and it is therefore also unclear whether the technique can be applied to distortion correction that takes the position of an arbitrary observer as the viewpoint.
In the method of installing a camera at the position of an arbitrary viewpoint, the camera must be reinstalled every time the viewpoint is changed. Such work is very troublesome for the user.
In the method of correcting distortion through a user interface as well, the user must redo the distortion correction adjustment every time the viewpoint is changed. Such work is also very troublesome for the user.
An object of the present invention is to provide a projector, a remote controller, a projector system, and a distortion correction method capable of correcting distortion for an arbitrary viewpoint with a simple operation.
In order to achieve the above object, according to one aspect of the present invention, there is provided
a projector used in combination with a remote controller that projects a predetermined pattern including three or more feature points, the projector comprising:
an image forming unit that forms an image;
a projection unit that projects the image formed by the image forming unit onto a projection surface; and
distortion correction means for correcting the image formed by the image forming unit so that the projected image is not distorted when the projection surface is viewed from the position of the remote controller,
wherein the distortion correction means includes:
a first acquisition unit that measures the projection surface three-dimensionally to obtain the three-dimensional position of the projection surface, and acquires the three-dimensional position coordinates of the projection surface in a projector coordinate system that expresses, in three-dimensional coordinates, the space into which the projection unit projects the image;
a second acquisition unit that captures the predetermined pattern projected on the projection surface, extracts the three or more feature points from the captured image of the pattern, and acquires the physical coordinates of the three or more feature points in the projector coordinate system;
a storage unit that stores data indicating the radiation direction of each of the three or more feature points of the remote controller; and
a remote controller position calculation unit that calculates the position of the remote controller in the projector coordinate system based on the three-dimensional position coordinates of the projection surface, the physical coordinates of the three or more feature points, and the radiation directions of the three or more feature points.
According to another aspect of the present invention, there is provided a remote controller comprising:
a projection unit that projects a predetermined pattern including three or more feature points;
a transmission unit that wirelessly transmits a distortion correction start request signal requesting the start of distortion correction;
first and second operation keys; and
a control unit that causes the projection unit to project the predetermined pattern when the first operation key is pressed, and causes the transmission unit to transmit the distortion correction start request signal when the second operation key is pressed.
According to yet another aspect of the present invention, there is provided a projector system comprising:
the projector described above; and
a remote controller that projects a predetermined pattern including three or more feature points.
According to yet another aspect of the present invention, there is provided a distortion correction method performed by a projector including an image forming unit that forms an image and a projection unit that projects the image formed by the image forming unit onto a projection surface, using a remote controller that projects a predetermined pattern including three or more feature points, the method comprising:
correcting the image formed by the image forming unit so that the projected image is not distorted when the projection surface is viewed from the position of the remote controller,
wherein the correction of the distortion includes:
measuring the projection surface three-dimensionally to obtain the three-dimensional position of the projection surface, and acquiring the three-dimensional position coordinates of the projection surface in a projector coordinate system that expresses, in three-dimensional coordinates, the space into which the projection unit projects the image;
capturing the predetermined pattern projected on the projection surface, extracting the three or more feature points from the captured image of the pattern, and acquiring the physical coordinates of the three or more feature points in the projector coordinate system; and
calculating the position of the remote controller in the projector coordinate system based on the three-dimensional position coordinates of the projection surface, the physical coordinates of the three or more feature points, and the radiation directions of the three or more feature points.
According to the present invention, distortion correction for an arbitrary viewpoint can be performed with a simple operation.
FIG. 1 is a block diagram showing the configuration of a projector according to a first embodiment of the present invention.
FIG. 2 is a block diagram showing the configuration of a remote controller used with the projector shown in FIG. 1.
FIG. 3 is a schematic diagram for explaining the positional relationship among the detection range of a three-dimensional sensor unit, the imaging range of a camera unit, and the projection area of a projection lens unit.
FIG. 4 is a schematic diagram for explaining the positional relationship between the pattern projected from a remote controller and the projection area of a projector.
FIG. 5 is a flowchart showing a procedure of distortion correction data calculation processing.
FIG. 6A is a schematic diagram for explaining the relationship among the angle of view, the projection optical axis, and the projection center point in the case with launch.
FIG. 6B is a schematic diagram for explaining the relationship among the angle of view, the projection optical axis, and the projection center point in the case without launch.
FIG. 6C is a schematic diagram for explaining the relationship among the angle of view, the projection optical axis, and the projection center point when the angle of view is enlarged.
FIG. 6D is a schematic diagram for explaining the relationship among the angle of view, the projection center axis, and the projection center point when the lens is shifted upward.
A schematic diagram for explaining the method of calculating the three-dimensional position of the remote controller.
A block diagram showing the configuration of a projector according to a second embodiment of the present invention.
Next, embodiments of the present invention will be described with reference to the drawings.
(First embodiment)
FIG. 1 is a block diagram showing the configuration of a projector according to the first embodiment of the present invention.
Referring to FIG. 1, the projector includes a projection unit 2, a reception unit 3, and a distortion correction data calculation unit 4. The projection unit 2 projects an image based on a video signal 1 from a video supply device. The reception unit 3 receives a remote control signal (including a distortion correction start request and the like) from the remote controller. The distortion correction data calculation unit 4 calculates distortion correction data such that an image without distortion is seen when the projection surface is viewed from the position of the remote controller. The video supply device is, for example, an information processing device such as a personal computer, or video equipment such as a recorder.
The projection unit 2 includes a video processing unit 5, a distortion correction unit 6, a video mute unit 7, and a projection lens unit 8. The video signal 1 is input to the video processing unit 5. The video processing unit 5 performs processing such as converting the resolution of the video signal 1 to the resolution of the display device of the projection lens unit 8 and adjusting the image quality.
The distortion correction unit 6 corrects, according to a distortion correction coefficient, the video signal processed by the video processing unit 5 so as to remove the distortion of the projected image as seen from the position of the remote controller. The video mute unit 7 either supplies the video signal corrected by the distortion correction unit 6 to the projection lens unit 8 as it is, or creates a black screen image and outputs it to the projection lens unit 8.
The projection lens unit 8 includes a display device that forms an image based on the video signal from the video mute unit 7, and a projection lens that projects the image formed by the display device. The projection lens includes a zoom mechanism having a lens movable in the direction of the optical axis, whose angle of view changes according to the zoom position corresponding to the position of that lens on the optical axis, and a lens shift mechanism that shifts the entire projection lens in a direction orthogonal to the optical axis. The projection lens unit 8 supplies zoom/shift position information indicating the zoom position and the lens shift position of the projection lens to the distortion correction data calculation unit 4. Here, the display device can be called an image forming element having an image forming surface composed of a plurality of pixels. As the display device, a liquid crystal display element, a DMD (digital micromirror device), or the like can be used.
The distortion correction data calculation unit 4 includes a projector projection design data storage unit 9, a three-dimensional sensor unit 10, a three-dimensional sensor calibration data storage unit 11, a three-dimensional data projector coordinate conversion unit 12, a camera unit 13, a pattern feature point detection unit 14, a camera imaging design data storage unit 15, a pattern physical coordinate calculation unit 16, a camera calibration data storage unit 17, a pattern projector coordinate conversion unit 18, a pattern storage unit 19, a remote controller projection design data storage unit 20, a remote controller position calculation unit 21, and a distortion correction coefficient calculation unit 22.
The projector projection design data storage unit 9 stores design data related to the projection of the projector. The design data is the data necessary to set up the projector coordinate system, such as the angle of view and the projection center, from the zoom position and the lens shift position.
The three-dimensional sensor unit 10 measures the three-dimensional position of the projection surface of the projection object and outputs three-dimensional position data. The three-dimensional sensor unit 10 has a three-dimensional sensor oriented along the direction of the optical axis of the projection lens. The detection range of the three-dimensional sensor covers the entire projectable area of the projection lens.
As the three-dimensional sensor, for example, a TOF (Time of Flight) sensor or a triangulation sensor can be used, but the sensor is not limited to these types. The TOF method performs three-dimensional measurement by projecting light toward an object and measuring the time until the projected light is reflected by the object and returns. Triangulation methods include, for example, passive triangulation and active triangulation. In passive triangulation, an object is photographed simultaneously by two cameras arranged side by side, and three-dimensional measurement is performed using the principle of triangulation from the difference in the position of the object in the images captured by the two cameras; this is also called the stereo camera method. In active triangulation, the object is irradiated with light, and three-dimensional measurement is performed using the principle of triangulation based on information from the light reflected from the object.
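For the reader, the distance computation behind TOF ranging can be illustrated in a few lines. The following is a minimal Python sketch, not part of the patent; the function names and numeric values are purely illustrative. Distance follows either from the round-trip time of a pulse, d = c*t/2, or from the phase shift of amplitude-modulated light, d = c*dphi/(4*pi*f).

    import math

    C = 299_792_458.0  # speed of light in m/s

    def tof_distance_from_round_trip(t_seconds):
        # Distance from the round-trip time of a light pulse: d = c * t / 2.
        return C * t_seconds / 2.0

    def tof_distance_from_phase(phase_shift_rad, mod_freq_hz):
        # Distance from the phase shift of amplitude-modulated light:
        # d = c * delta_phi / (4 * pi * f).
        return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

    print(tof_distance_from_round_trip(20e-9))       # about 3.0 m
    print(tof_distance_from_phase(1.0, 10_000_000))  # about 2.4 m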
The three-dimensional sensor calibration data storage unit 11 stores three-dimensional sensor calibration data. The three-dimensional sensor calibration data includes parameters (a rotation amount and a translation amount) for converting the coordinate system of the three-dimensional sensor into the coordinate system of the projector, together with a reference zoom position and a reference lens shift position. The rotation amount and the translation amount are obtained as the result of a calibration that measures the positional relationship between the coordinate system of the three-dimensional sensor and the coordinate system of the projector. The reference zoom position and the reference lens shift position are the zoom position and the lens shift position at the time the calibration was performed.
The three-dimensional data projector coordinate conversion unit 12 acquires the three-dimensional position data of the projection surface from the three-dimensional sensor unit 10, the three-dimensional sensor calibration data (rotation amount, translation amount, reference zoom position, and reference lens shift position) from the three-dimensional sensor calibration data storage unit 11, the design data from the projector projection design data storage unit 9, and the zoom/shift position information from the projection lens unit 8. Based on the three-dimensional sensor calibration data, the design data, and the zoom/shift position information, the three-dimensional data projector coordinate conversion unit 12 converts the three-dimensional position data of the projection surface into three-dimensional position data in the projector coordinate system whose origin is the projection center. The calibration data and the design data can be called coordinate conversion data for converting the coordinate system of the three-dimensional sensor into the projector coordinate system with the projection center as the origin.
The camera unit 13 is oriented along the direction of the optical axis of the projection lens and captures the pattern projected onto the projection area from the remote controller. The imaging range of the camera unit 13 covers the entire projectable area of the projection lens.
The pattern feature point detection unit 14 extracts the pattern projected by the remote controller from the image captured by the camera unit 13 and detects the plurality of feature points indicated by the pattern; the number of feature points is three or more. The pattern feature point detection unit 14 outputs pattern feature point information (two-dimensional coordinate information) indicating the coordinates of each feature point in the captured image.
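How the feature points are extracted is not specified at this level of detail. One plausible realization, assuming the pattern appears as isolated bright spots against the projected black screen (see step S103 below), is thresholding followed by connected-component centroids. A minimal Python sketch; the threshold value and the use of NumPy/SciPy are assumptions, not part of the patent:

    import numpy as np
    from scipy import ndimage

    def detect_feature_points(gray, threshold=200.0):
        # Return (x, y) image coordinates of bright-spot centroids.
        mask = gray >= threshold             # keep only bright pixels
        labels, n = ndimage.label(mask)      # group them into connected blobs
        centroids = ndimage.center_of_mass(gray, labels, range(1, n + 1))
        # center_of_mass returns (row, col); convert to (x, y)
        return [(c, r) for (r, c) in centroids]

    # Synthetic example: four bright spots on a dark frame.
    img = np.zeros((480, 640))
    for (x, y) in [(100, 100), (500, 120), (120, 380), (520, 400)]:
        img[y-2:y+3, x-2:x+3] = 255.0
    print(detect_feature_points(img))  # four centroids near the seeded points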
The camera imaging design data storage unit 15 stores camera imaging design data related to imaging, such as the angle of view of the camera. The camera imaging design data is the data necessary to set up the camera coordinate system, such as the angle of view and the optical center.
The pattern physical coordinate calculation unit 16 acquires the camera imaging design data from the camera imaging design data storage unit 15 and the pattern feature point information indicating the coordinates of each feature point from the pattern feature point detection unit 14. Based on the camera imaging design data and the pattern feature point information, the pattern physical coordinate calculation unit 16 calculates the physical coordinates of each feature point in the camera coordinate system and outputs the result as pattern feature point physical coordinate data.
Here, physical coordinates are briefly explained. Physical coordinates express the coordinates in the captured image, which are two-dimensional, as three-dimensional coordinates on a reference plane placed at an arbitrary distance, based on design information such as the angle of view of the camera. Physical coordinates can be expressed, for example, in the form (x, y, m) (where m is an arbitrary value).
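For illustration, the mapping from a pixel to physical coordinates can be sketched for a simple pinhole model whose optical axis passes through the image center; the actual camera imaging design data may describe a more detailed model. A hedged Python sketch in which all parameter values are hypothetical:

    import math
    import numpy as np

    def pixel_to_physical(u, v, width, height, hfov_deg, vfov_deg, m=1.0):
        # Map a pixel (u, v) to physical coordinates (x, y, m) on a reference
        # plane at distance m from the optical center, assuming a pinhole
        # camera whose optical axis passes through the image center.
        half_w = m * math.tan(math.radians(hfov_deg) / 2.0)
        half_h = m * math.tan(math.radians(vfov_deg) / 2.0)
        x = (2.0 * u / (width - 1) - 1.0) * half_w
        y = (1.0 - 2.0 * v / (height - 1)) * half_h  # image v grows downward
        return np.array([x, y, m])

    print(pixel_to_physical(319.5, 239.5, 640, 480, 60.0, 47.0))  # ~[0, 0, 1]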
The camera calibration data storage unit 17 stores camera calibration data. The camera calibration data includes parameters (a rotation amount and a translation amount) for converting the coordinate system of the camera into the coordinate system of the projector, together with a reference zoom position and a reference lens shift position. The rotation amount and the translation amount are obtained as the result of a calibration that measures the positional relationship between the coordinate system of the camera and the coordinate system of the projector. The reference zoom position and the reference lens shift position are the zoom position and the lens shift position at the time the calibration was performed.
The pattern projector coordinate conversion unit 18 acquires the pattern feature point physical coordinate data from the pattern physical coordinate calculation unit 16, the camera calibration data (rotation amount, translation amount, reference zoom position, and reference lens shift position) from the camera calibration data storage unit 17, the design data from the projector projection design data storage unit 9, and the zoom/shift position information from the projection lens unit 8. Based on the camera calibration data, the design data, and the zoom/shift position information, the pattern projector coordinate conversion unit 18 converts the pattern feature point physical coordinate data into pattern feature point physical coordinate data in the projector coordinate system whose origin is the projection center.
The pattern storage unit 19 stores information on the pattern projected from the remote controller. For example, for a pattern including four feature points, information such as the positional relationship of the feature points is stored in the pattern storage unit 19.
The remote controller projection design data storage unit 20 stores design data related to the projection performed by the remote controller, such as the angle of view of the remote controller's projection. From this design data, the direction of the light ray toward each feature point (the radiation direction) can be obtained.
The remote controller position calculation unit 21 acquires the three-dimensional position coordinate data of the projection surface in the projector coordinate system from the three-dimensional data projector coordinate conversion unit 12, the pattern feature point physical coordinate data in the projector coordinate system from the pattern projector coordinate conversion unit 18, the pattern information from the pattern storage unit 19, and the remote controller projection design data from the remote controller projection design data storage unit 20. Based on the three-dimensional position coordinate data, the pattern feature point physical coordinate data, the pattern information, and the remote controller projection design data, the remote controller position calculation unit 21 calculates the three-dimensional position of the remote controller in the projector coordinate system.
The distortion correction coefficient calculation unit 22 acquires the zoom/shift position information from the projection lens unit 8, the design data from the projector projection design data storage unit 9, the three-dimensional position coordinate data of the projection surface in the projector coordinate system from the three-dimensional data projector coordinate conversion unit 12, and the three-dimensional position data of the remote controller in the projector coordinate system from the remote controller position calculation unit 21. Based on the zoom/shift position information, the design data, the three-dimensional position coordinate data of the projection surface, and the three-dimensional position data of the remote controller, the distortion correction coefficient calculation unit 22 takes the position of the remote controller as the viewpoint and calculates a distortion correction coefficient for correcting the projected image so that, when the projection surface is viewed from that viewpoint, no distortion corresponding to the three-dimensional shape of the projection surface appears. This distortion correction coefficient is supplied from the distortion correction coefficient calculation unit 22 to the distortion correction unit 6.
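The patent does not spell out the coefficient computation at this level, but one plausible way to realize it is to project the measured 3D surface points into a virtual pinhole camera placed at the remote controller's position: content appears undistorted to the viewer exactly when it maps to a regular rectangle in that virtual camera's image plane. A hedged Python sketch of that projection step, an assumption rather than the patent's stated method:

    import numpy as np

    def viewer_image_coords(surface_pts, viewer_pos, view_dir, up=(0.0, 1.0, 0.0)):
        # Project 3D surface points (N, 3), given in projector coordinates,
        # into the normalized image plane of a virtual pinhole camera placed
        # at the remote controller's position.
        z = np.asarray(view_dir, float); z = z / np.linalg.norm(z)
        x = np.cross(np.asarray(up, float), z); x = x / np.linalg.norm(x)
        y = np.cross(z, x)
        rel = np.asarray(surface_pts, float) - np.asarray(viewer_pos, float)
        cam = np.stack([rel @ x, rel @ y, rel @ z], axis=1)
        return cam[:, :2] / cam[:, 2:3]  # perspective divide

    # A pre-warp (the distortion correction coefficients) can then be fit so
    # that the projected content maps onto a regular rectangle in these
    # viewer coordinates, e.g. by inverting the grid-to-viewer-plane mapping.
    pts = np.array([[0.0, 0.0, 2.0], [0.5, 0.0, 2.5]])
    print(viewer_image_coords(pts, viewer_pos=[0.8, 0.0, 0.0], view_dir=[-0.3, 0.0, 1.0]))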
Next, the configuration of the remote controller will be described.
FIG. 2 is a block diagram showing the configuration of the remote controller. Referring to FIG. 2, the remote controller 50 includes an operation unit 51, a control unit 52, a transmission unit 53, and a pattern projection unit 54.
The operation unit 51 has a plurality of operation keys, accepts input operations using these keys, and supplies an operation signal corresponding to each input operation to the control unit 52. For example, a pattern projection key and a distortion correction start key are provided on the operation unit 51. When the pattern projection key is pressed, an operation signal indicating the start of pattern projection is supplied from the operation unit 51 to the control unit 52. When the distortion correction start key is pressed, an operation signal indicating the start of distortion correction is supplied from the operation unit 51 to the control unit 52.
The pattern projection unit 54 projects a predetermined pattern including three or more feature points. The predetermined pattern may be any pattern as long as three or more feature points can be identified from it. For example, the predetermined pattern may be a pattern composed of three or more bright spots, or a figure composed of a plurality of lines. Here, a pattern including four feature points is used as the predetermined pattern.
The pattern projection unit 54 may have any structure as long as it can project the predetermined pattern. For example, the pattern projection unit 54 may have a structure that projects a pattern of bright spots or a figure composed of lines, such as that used in a laser pointer. For example, the pattern projection unit 54 may have four light sources arranged on a substrate and a lens group that projects the bright-spot pattern formed by these light sources. As the light sources, for example, LDs (laser diodes) or LEDs (light-emitting diodes) can be used. The directions of the chief rays from the four light sources toward the four feature points are the radiation directions of the feature points. Alternatively, a combination of three or more laser pointers, each producing one bright spot, may be used as the pattern projection unit 54; in this case, the emission direction of each laser pointer is the radiation direction of the corresponding feature point.
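For illustration only, the radiation directions derived from such design data could look as follows. This Python sketch assumes a hypothetical 2x2 bright-spot layout whose corner spots sit at given half-angles from the remote controller's projection axis; the actual layout is part of the remote controller's design data:

    import math
    import numpy as np

    def radiation_directions(half_angle_h_deg, half_angle_v_deg):
        # Unit direction vectors for a hypothetical 2x2 spot pattern whose
        # corner spots lie at +/- the given half angles from the axis (+z).
        th = math.tan(math.radians(half_angle_h_deg))
        tv = math.tan(math.radians(half_angle_v_deg))
        dirs = []
        for sy in (+1, -1):
            for sx in (-1, +1):
                d = np.array([sx * th, sy * tv, 1.0])
                dirs.append(d / np.linalg.norm(d))
        return dirs

    for d in radiation_directions(5.0, 5.0):
        print(d)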
The transmission unit 53 transmits a remote control signal (a radio signal) using infrared light or the like. Since such remote control signals are commonly used in video equipment such as recorders, a detailed description of their structure is omitted here.
The control unit 52 controls the operation of the transmission unit 53 and the pattern projection unit 54 according to the operation signals from the operation unit 51. Specifically, the control unit 52 causes the pattern projection unit 54 to project the image of the predetermined pattern in response to the operation signal indicating that the pattern projection key has been pressed, and causes the transmission unit 53 to transmit a remote control signal indicating a distortion correction start request in response to the operation signal indicating that the distortion correction start key has been pressed.
Next, the operation of the projector and the remote controller 50 according to the present embodiment will be described in detail.
First, the positional relationship among the detection range of the three-dimensional sensor unit 10, the imaging range of the camera unit 13, and the projection area of the projection lens unit 8 will be described.
FIG. 3 schematically shows the positional relationship among the detection range of the three-dimensional sensor unit 10, the imaging range of the camera unit 13, and the projection area of the projection lens unit 8. As shown in FIG. 3, in the projector 101, the three-dimensional sensor 107, the camera 106, and the projection lens 101a are provided on the same face of the housing. The three-dimensional sensor 107 and the camera 106 are oriented along the direction of the optical axis of the projection lens 101a.
With the projection lens 101a, the projection area can be enlarged or reduced using the zoom function and can be moved up, down, left, and right using the lens shift function.
The projection area 110 is the projection area when the lens is shifted to the upper right and the zoom is at its minimum. The projection area 111 is the projection area when the lens is shifted to the upper right and the zoom is at its maximum. The projection area 112 is the projection area when the lens is shifted to the upper left and the zoom is at its minimum. The projection area 113 is the projection area when the lens is shifted to the upper left and the zoom is at its maximum.
The projection area 114 is the projection area when the lens is shifted to the lower right and the zoom is at its minimum. The projection area 115 is the projection area when the lens is shifted to the lower right and the zoom is at its maximum. The projection area 116 is the projection area when the lens is shifted to the lower left and the zoom is at its minimum. The projection area 117 is the projection area when the lens is shifted to the lower left and the zoom is at its maximum.
The detection range of the three-dimensional sensor 107 covers the entire projectable area in which an image from the projection lens 101a can be projected, that is, all of the projection areas 110 to 117. The three-dimensional sensor 107 can therefore measure the three-dimensional position of a three-dimensional object placed within the projectable area. The three-dimensional sensor unit 10 supplies the three-dimensional position data output by the three-dimensional sensor 107 to the three-dimensional data projector coordinate conversion unit 12.
The imaging range of the camera 106 also covers the entire projectable area, that is, all of the projection areas 110 to 117. The camera 106 can therefore capture, anywhere in the projectable area, the predetermined pattern projected on the projection surface by the remote controller 50. The camera unit 13 supplies the captured image output by the camera 106 to the pattern feature point detection unit 14.
As described above, in the projector 101, the detection range of the three-dimensional sensor 107 and the imaging range of the camera 106 both contain the projection area, even when the projected image is enlarged using the zoom function or moved up, down, left, or right using the lens shift function.
Next, the distortion correction operation of the projector 101 using the remote controller 50 will be described.
The user uses the remote controller 50 to designate the position that will serve as the viewpoint from which the projected image is observed. Specifically, when the user presses the pattern projection key while pointing the remote controller 50 at the projection surface, the pattern projection unit 54 projects the predetermined pattern onto the projection surface. When the user presses the distortion correction start key while the predetermined pattern is projected within the projection area, the transmission unit 53 transmits a remote control signal indicating a distortion correction start request. The projector 101 starts the distortion correction processing in response to the remote control signal from the remote controller 50.
FIG. 4 shows the positional relationship between the pattern projected from the remote controller 50 and the projection area of the projector 101. As shown in FIG. 4, the remote controller 50 projects a pattern 120 composed of four points onto the projection area 103 on the projection object 104.
In response to the remote control signal (distortion correction start request) from the remote controller 50, the projector 101 detects the position of the remote controller 50 based on the pattern 120 and, taking the detected position as the viewpoint, calculates distortion correction data such that an image without distortion is seen from that viewpoint.
The distortion correction data calculation processing performed by the projector 101 will now be described in detail.
FIG. 5 shows a procedure of the distortion correction data calculation processing. The operation is described below with reference to FIGS. 1 to 5.
The reception unit 3 determines whether a remote control signal (distortion correction start request) has been received from the remote controller 50 (step S100). When the remote control signal (distortion correction start request) is received, the reception unit 3 notifies the distortion correction data calculation unit 4 accordingly.
In the distortion correction data calculation unit 4, in response to the distortion correction start request, the three-dimensional sensor 107 first measures the projection surface of the projection area 103 three-dimensionally and supplies the resulting three-dimensional position data to the three-dimensional data projector coordinate conversion unit 12 (step S101). Here, the three-dimensional position data is point cloud data expressed in three-dimensional coordinates in the coordinate system of the three-dimensional sensor 107.
Next, the three-dimensional data projector coordinate conversion unit 12 converts the three-dimensional position data of the projection surface into three-dimensional position data in the projector coordinate system whose origin is the projection center 102 (step S102).
Here, the projection center 102, which serves as the origin of the projector coordinate system, is briefly explained. FIG. 6A schematically shows the relationship among the angle of view, the projection optical axis, and the projection center in the case with launch. FIG. 6B schematically shows the relationship among the angle of view, the projection optical axis, and the projection center in the case without launch. FIG. 6C schematically shows the relationship among the angle of view, the projection optical axis, and the projection center when the angle of view is enlarged. FIG. 6D schematically shows the relationship among the angle of view, the projection center axis, and the projection center when the lens is shifted upward. Here, the projection optical axis is the axis that passes through the center of the image forming surface and intersects the image forming surface perpendicularly. FIGS. 6B to 6D correspond to vertical cross sections.
Usually, when a projector is placed on a table, a projection configuration called launch is used, in which the image is projected upward relative to the projection optical axis so that the image is projected above the height of the table. There is also a projection configuration called lens shift, in which the image is projected at a position moved up, down, left, or right relative to the projection optical axis; launch is one form of lens shift.
For example, as shown in FIG. 6A, the projection center 102 is the point at which the lines connecting the points at the four corners of the projection area 103 with the corresponding points at the four corners of the image forming area of the display device 100 intersect. Here, the image in the projection area 103 is the image of the image forming area of the display device 100 inverted vertically and horizontally. In practice, because of refraction at the lens, the lines connecting the points at the four corners of the projection area 103 with the points at the four corners of the image forming area of the display device 100 are not straight lines, and the projection center 102 must be determined in consideration of the lens configuration of the projection lens.
For example, let the points at the four corners of the image forming area of the display device 100 be points A, B, C, and D, and the points at the four corners of the projection area 103 be points a, b, c, and d, where points a, b, c, and d correspond to points A, B, C, and D, respectively, and the arrangement of points a, b, c, and d is inverted vertically and horizontally relative to the arrangement of points A, B, C, and D. In this case, the projection center 102 is the point at which the chief ray that leaves point A and reaches point a via the lens, the chief ray that leaves point B and reaches point b via the lens, the chief ray that leaves point C and reaches point c via the lens, and the chief ray that leaves point D and reaches point d via the lens all intersect. Such an intersection of chief rays can be defined, for example, at the center of the aperture stop of the projection lens and can be calculated based on the lens design data.
In the example with launch shown in FIG. 6A, the projection optical axis 109 passes through the center of the lower edge of the projection area 103, and the projection center 102 is located above the projection optical axis 109. In this case, the projection center axis does not coincide with the projection optical axis 109.
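With ideal design data the four chief rays meet in a single point; with real lens data they may only nearly intersect, in which case a least-squares nearest point is a natural estimate. A small illustrative Python sketch of that computation, not drawn from the patent itself:

    import numpy as np

    def nearest_point_to_lines(points, dirs):
        # Least-squares point closest to a set of 3D lines (point, direction).
        # If the chief rays meet exactly, this returns their intersection;
        # otherwise it returns the best-fit projection center.
        A = np.zeros((3, 3)); b = np.zeros(3)
        for p, d in zip(points, dirs):
            d = d / np.linalg.norm(d)
            M = np.eye(3) - np.outer(d, d)  # projector onto plane normal to d
            A += M; b += M @ p
        return np.linalg.solve(A, b)

    # Two rays through the origin: along +x and +y; nearest point is (0,0,0).
    pts = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 2.0, 0.0])]
    drs = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
    print(nearest_point_to_lines(pts, drs))  # ~[0, 0, 0]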
In the example without launch shown in FIG. 6B, the projection optical axis 109 passes through the center of the projection area 103, and the projection center 102 is located on the projection optical axis 109. In this case, the projection center axis coincides with the projection optical axis 109.
FIG. 6C is an example in which the angle of view is enlarged compared with the example of FIG. 6B. As in FIG. 6B, the projection optical axis 109 passes through the center of the projection area 103 and the projection center 102 is located on the projection optical axis 109, but the projection center 102 is positioned closer to the display device 100 than in the example of FIG. 6B. In this case as well, the projection center axis coincides with the projection optical axis 109.
FIG. 6D is an example in which the lens shift is performed so that the projection area 103 is shifted upward compared with the example of FIG. 6B. As in the example of FIG. 6A, the projection optical axis 109 passes through the center of the lower edge of the projection area 103, and the projection center 102 is located above the projection optical axis 109. In this case, the projection center axis does not coincide with the projection optical axis 109.
As can be seen from the examples of FIGS. 6A to 6D, the projection center 102 changes according to the zoom position and the lens shift position. In other words, the projection center 102 must be determined according to the zoom position and the lens shift position.
To convert the coordinate system of the three-dimensional position data measured by the three-dimensional sensor 107 into the projector coordinate system, a rotation amount about three mutually orthogonal coordinate axes and a translation amount representing movement along those axes are used. Determining this rotation amount and translation amount is called calibration.
Because the position of the projection center 102, the origin of the projector coordinate system, is not always the same but moves with zooming and lens shifting, the three-dimensional sensor calibration data storage unit 11 stores, together with the rotation amount and the translation amount, the zoom position and the lens shift position at the time the calibration was performed, as the reference zoom position and the reference lens shift position, respectively. The three-dimensional data projector coordinate conversion unit 12 performs the coordinate conversion using the rotation amount, the translation amount, the reference zoom position, and the reference lens shift position, by the following procedure.
(A1) From the design data stored in the projector projection design data storage unit 9, the reference zoom position, and the reference lens shift position, obtain the coordinates of the reference projection center point at the time of calibration.
(A2) From the current zoom position and lens shift position supplied by the projection lens unit 8, obtain the coordinates of the current projection center point in the same manner.
(A3) Obtain the translation amount for converting the coordinates of the reference projection center point into the coordinates of the current projection center point.
(A4) Transform the three-dimensional position data from the three-dimensional sensor 107 using the rotation amount and the translation amount stored in the three-dimensional sensor calibration data storage unit 11.
(A5) Move the coordinates by the translation amount from the reference projection center point to the current projection center point, in accordance with the current zoom position and the current lens shift position.
Through steps (A1) to (A5), the three-dimensional position data from the three-dimensional sensor 107 is converted into the projector coordinate system whose origin is the current projection center point.
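A compact Python sketch of steps (A1) to (A5), assuming the calibration rotation is given as a 3x3 matrix R and the translation as a vector t, and using a hypothetical lookup table in place of the real design-data computation of the projection center. The same transform is reused for the camera-side conversion of steps (B1) to (B5) described later:

    import numpy as np

    def projection_center(design_data, zoom_pos, shift_pos):
        # Hypothetical lookup of the projection center for a given zoom and
        # lens shift position; in the projector this comes from the
        # projection design data (steps A1/A2).
        return np.asarray(design_data[(zoom_pos, shift_pos)], float)

    def sensor_to_projector(points, R, t, design_data,
                            ref_zoom, ref_shift, cur_zoom, cur_shift):
        # Steps A1-A5: apply the calibration rotation R and translation t,
        # then shift the origin from the reference projection center to the
        # current one. `points` is an (N, 3) array of 3D measurements.
        ref_center = projection_center(design_data, ref_zoom, ref_shift)  # A1
        cur_center = projection_center(design_data, cur_zoom, cur_shift)  # A2
        recenter = ref_center - cur_center                                # A3
        in_ref = points @ R.T + t                                         # A4
        return in_ref + recenter                                          # A5

    # Toy usage with an identity calibration and a table of two lens states.
    table = {("z0", "s0"): [0, 0, 0], ("z1", "s1"): [0, 0.01, -0.02]}
    pts = np.array([[0.5, 0.2, 2.0]])
    print(sensor_to_projector(pts, np.eye(3), np.zeros(3), table,
                              "z0", "s0", "z1", "s1"))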
Referring again to FIG. 5.
After the coordinate conversion processing in step S102, the camera unit 13, which includes the camera 106, supplies a mute start instruction signal to the video mute unit 7, instructing it to output a black screen image. In response to the mute start instruction signal, the video mute unit 7 supplies a black-screen video signal to the projection lens unit 8, and the projection lens unit 8 projects the black screen onto the projection area 103 (step S103). Projecting a black screen makes it possible to accurately extract the predetermined pattern projected from the remote controller 50.
After the black screen is projected, the camera 106 captures the pattern 120 projected from the remote controller 50 (step S104). After the pattern has been captured, the camera unit 13 supplies a mute release instruction signal to the video mute unit 7. In response to the mute release instruction signal, the video mute unit 7 resumes supplying the video signal from the distortion correction unit 6 to the projection lens unit 8.
 Next, the pattern feature point detection unit 14 extracts the four feature points of the pattern 120 from the image captured by the camera 106 (step S105). The pattern physical coordinate calculation unit 16 then calculates the physical coordinates of each of the four feature points in the camera coordinate system (step S106). Next, the pattern projector coordinate conversion unit 18 converts the physical coordinates of each feature point from the camera coordinate system into physical coordinates in the projector coordinate system (step S107).
 The coordinate conversion in step S107 can be performed by the same method as the three-dimensional sensor-to-projector coordinate conversion described above.
 Specifically, the pattern projector coordinate conversion unit 18 performs coordinate conversion in the following procedure, using the rotation amount, the translation amount, the reference zoom position, and the reference lens shift position.
 (B1) From the design data stored in the projector projection design data storage unit 9, the reference zoom position, and the reference lens shift position, obtain the coordinates of the reference projection center point at the time of calibration.
 (B2) From the current zoom position and lens shift position reported by the projection lens unit 8, obtain the coordinates of the current projection center point in the same manner.
 (B3) Obtain the translation amount that converts the coordinates of the reference projection center point into the coordinates of the current projection center point.
 (B4) Transform the physical coordinates of each feature point of the pattern, supplied in camera coordinates by the pattern physical coordinate calculation unit 16, using the rotation amount and the translation amount stored in the camera calibration data storage unit 17.
 (B5) Shift the coordinates by the translation amount from the reference projection center point to the current projection center point, so that they match the current zoom position and lens shift position.
 Through steps (B1) to (B5), the physical coordinates of each feature point of the pattern in camera coordinates are converted into the projector coordinate system whose origin is the current projection center point. The origin (0, 0, 0) of the camera coordinate system, which is the optical center of the camera, is also converted into the projector coordinate system; the reuse sketch below illustrates this.
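Under the same assumptions as the earlier sketch, steps (B1) to (B5) amount to reusing `sensor_to_projector()` with the camera's calibration data; the variables here (`feature_pts`, `R_cam`, `t_cam`, and the zoom/shift values) are assumed to be defined as before.

```python
# Pattern feature points (N, 3) in camera physical coordinates, with the
# camera calibration (R_cam, t_cam) from storage unit 17:
pts_proj = sensor_to_projector(feature_pts, R_cam, t_cam, center_of,
                               ref_zoom, ref_shift, cur_zoom, cur_shift)
# The camera's optical center, i.e. the camera-frame origin, goes through
# the same transform so viewing rays can later be formed in projector space:
origin_proj = sensor_to_projector(np.zeros((1, 3)), R_cam, t_cam, center_of,
                                  ref_zoom, ref_shift, cur_zoom, cur_shift)
```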
 Referring again to FIG. 5.
 After the coordinate conversion processing in step S107, the remote controller position calculation unit 21 calculates the three-dimensional position of the remote controller in the projector coordinate system, based on the three-dimensional position data of the projection surface in the projector coordinate system (the three-dimensional position coordinates of the projection surface) and the physical coordinates (three-dimensional position coordinates) of each feature point of the pattern in the projector coordinate system (step S108).
 Finally, the distortion correction coefficient calculation unit 22 calculates, from the three-dimensional position coordinates of the projection surface and the three-dimensional position of the remote controller in the projector coordinate system, a distortion correction coefficient such that, with the remote controller position taken as the viewpoint, the projected image seen from that viewpoint exhibits no distortion corresponding to the three-dimensional shape of the projection surface (step S109).
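For intuition only: in the special case of a single flat projection surface, such a correction reduces to a homography that maps the corner positions the viewer should see back to panel coordinates. The patent itself handles general three-dimensional surfaces, so the planar restriction and the names below are assumptions. A minimal direct-linear-transform sketch:

```python
import numpy as np

def homography(src, dst):
    """Direct linear transform: the 3x3 H with dst ~ H @ src, given four
    or more (x, y) -> (u, v) correspondences."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

# src: where the image corners should appear as seen from the viewpoint;
# dst: the panel coordinates that land there. Warping each frame with H
# pre-distorts it so that it looks rectangular from the viewpoint.
```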
 The distortion correction coefficient calculated as described above is supplied to the distortion correction unit 6, which performs distortion correction on the video signal from the video processing unit 5 according to that coefficient.
 Next, the method for calculating the three-dimensional position of the remote controller 50 is described in detail.
 FIG. 7 schematically shows this calculation. In FIG. 7, the coordinate axes of the projector coordinate system are denoted X, Y, Z; those of the three-dimensional sensor coordinate system X', Y', Z'; and those of the camera coordinate system X'', Y'', Z''. The origins of the projector, three-dimensional sensor, and camera coordinate systems all differ from one another.
 As shown in FIG. 7, the pattern feature point detection unit 14 extracts the four feature points of the pattern 120 from the captured image 121 supplied by the camera 106, and the pattern physical coordinate calculation unit 16 calculates the physical coordinates 122 of each of the four feature points in the camera coordinate system. The pattern projector coordinate conversion unit 18 converts the physical coordinates 122 of each feature point into physical coordinates in the projector coordinate system, and also converts the origin (0, 0, 0) of the camera coordinate system, the optical center of the camera 106, into the projector coordinate system.
 The remote controller position calculation unit 21 then forms, in the projector coordinate system, the four straight lines connecting the physical coordinates of the four feature points with the converted camera origin, and obtains the intersection of each line with the three-dimensional position of the projection surface as converted by the three-dimensional data projector coordinate conversion unit 12. From the pattern data stored in the pattern storage unit 19 and the design data stored in the remote controller projection design data storage unit 20, and from the condition that the pattern projected from the remote controller 50 lands at the three-dimensional positions of these intersections, the remote controller position calculation unit 21 calculates the three-dimensional position of the remote controller 50 in the projector coordinate system.
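A minimal NumPy sketch of this ray casting, assuming the projection surface is locally approximated by a plane through point `p0` with normal `n` (the patent intersects against the measured three-dimensional surface data, which a mesh-based version would handle triangle by triangle):

```python
import numpy as np

def intersect_ray_plane(origin, through, p0, n, eps=1e-9):
    """Intersect the ray from `origin` through `through` with the plane
    containing p0 with normal n; returns None for a parallel ray."""
    d = through - origin                       # ray direction in projector space
    denom = float(np.dot(n, d))
    if abs(denom) < eps:
        return None
    s = float(np.dot(n, p0 - origin)) / denom  # parameter along the ray
    return origin + s * d

# One surface point per pattern feature point, reusing the earlier names:
# hits = [intersect_ray_plane(origin_proj[0], fp, p0, n) for fp in pts_proj]
```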
 Specifically, the remote controller position calculation unit 21 obtains, from the design data stored in the remote controller projection design data storage unit 20, the direction of the light ray toward each of the four feature points of the pattern. It then finds the unique position of the remote controller from which rays in those four directions pass through the four intersection points on the projection surface in the projector coordinate system. That position is the three-dimensional position of the remote controller 50 in the projector coordinate system.
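The patent does not spell out the solver, so as one plausible realization (an assumption on our part), the unique position can be found by least squares over the controller's position and orientation, minimizing how far each designed ray passes from its measured intersection point:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def estimate_controller_position(ray_dirs, hits):
    """ray_dirs: (N, 3) unit ray directions from the design data, in the
    controller's own frame. hits: (N, 3) intersection points on the
    projection surface, in projector coordinates. Returns the controller
    position in projector coordinates."""
    def residuals(x):
        pos, rotvec = x[:3], x[3:]
        Rm = Rotation.from_rotvec(rotvec).as_matrix()
        d = ray_dirs @ Rm.T                  # ray directions in projector space
        v = hits - pos                       # vectors from controller to hits
        along = (v * d).sum(axis=1, keepdims=True) * d
        return (v - along).ravel()           # perpendicular miss of each ray
    x0 = np.zeros(6)
    x0[:3] = hits.mean(axis=0) - np.array([0.0, 0.0, 1.0])  # crude initial guess
    return least_squares(residuals, x0).x[:3]
```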
 As described above, according to the projector of the present embodiment, distortion correction can be performed automatically with the position designated by the remote controller 50 taken as the viewpoint. Distortion correction matched to an arbitrary viewpoint can therefore be performed with a simple operation of the remote controller.
 Although the operation of the projector of the present embodiment has been described using a pattern with four feature points as an example, three or more feature points suffice to calculate the position of the remote controller.
 A polygonal figure pattern composed of a plurality of lines may also be used in place of the bright-spot pattern; in that case, each vertex of the polygon is extracted as a feature point (see the contour sketch below).
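If such a polygonal pattern is used, its vertices could be recovered with standard contour analysis; a sketch using OpenCV, with the threshold and simplification tolerance chosen arbitrarily for illustration:

```python
import cv2

def polygon_vertices(gray):
    """Return the corner points of the largest bright contour in a
    grayscale capture of the projected polygon."""
    _, bw = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(bw, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    outline = max(contours, key=cv2.contourArea)
    eps = 0.02 * cv2.arcLength(outline, True)      # tolerance for simplification
    corners = cv2.approxPolyDP(outline, eps, True)
    return corners.reshape(-1, 2)                  # one (x, y) per vertex
```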
 In the projector of the present embodiment, the portion consisting of the distortion correction unit 6 and the distortion correction data calculation unit 4 can be called the distortion correction means. The portion consisting of the projector projection design data storage unit 9, the three-dimensional sensor unit 10, the three-dimensional sensor calibration data storage unit 11, and the three-dimensional data projector coordinate conversion unit 12 can be called the first acquisition unit. The portion consisting of the camera unit 13, the pattern feature point detection unit 14, the camera imaging design data storage unit 15, the pattern physical coordinate calculation unit 16, the camera calibration data storage unit 17, and the pattern projector coordinate conversion unit 18 can be called the second acquisition unit. The pattern storage unit 19 and the remote controller projection design data storage unit 20 can be called the storage unit.
 (Second Embodiment)
 FIG. 8 is a block diagram showing the configuration of a projector according to the second embodiment of the present invention.
 Referring to FIG. 8, the projector is used in combination with a remote controller that projects a predetermined pattern including three or more feature points. The projector has an image forming unit 200 that forms an image, a projection unit 201 that projects the image formed by the image forming unit 200, and distortion correction means 202 that corrects the image formed on the image forming unit 200 so that the projected image is not distorted when the projection surface is viewed from the position of the remote controller. The distortion correction means 202 includes a first acquisition unit 203, a second acquisition unit 204, a storage unit 205, and a remote controller position calculation unit 206.
 The first acquisition unit 203 measures the projection surface three-dimensionally to obtain its three-dimensional position, and acquires the three-dimensional position coordinates of the projection surface in a projector coordinate system, i.e., a system of three-dimensional coordinates representing the space into which the projection unit 201 projects its image. The second acquisition unit 204 captures an image of the predetermined pattern projected on the projection surface, extracts the three or more feature points from the captured image, and acquires the physical coordinates of those feature points in the projector coordinate system.
 The storage unit 205 stores data indicating the radiation direction of each of the three or more feature points of the remote controller.
 The remote controller position calculation unit 206 calculates the position of the remote controller in the projector coordinate system, based on the three-dimensional position coordinates of the projection surface, the physical coordinates of the three or more feature points, and the radiation directions of those feature points.
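As a summary of the dataflow only, with hypothetical method names standing in for the interfaces of units 203 to 206, one correction pass composes as:

```python
def run_distortion_correction(first_acq, second_acq, storage, pos_calc):
    """Order of operations among units 203-206 for one correction pass."""
    surface = first_acq.measure_surface()          # unit 203: surface coordinates
    feature_pts, cam_origin = second_acq.locate()  # unit 204: pattern coordinates
    ray_dirs = storage.ray_directions()            # unit 205: designed directions
    return pos_calc.solve(surface, feature_pts,    # unit 206: controller position
                          cam_origin, ray_dirs)
```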
 In the projector of this embodiment as well, distortion correction can be performed automatically with the position designated by the remote controller taken as the viewpoint, so distortion correction matched to an arbitrary viewpoint can be performed with a simple operation of the remote controller.
 In the projector of this embodiment, the distortion correction means 202 may further include a distortion correction coefficient calculation unit that calculates a distortion correction coefficient for correcting distortion of the projected image based on the three-dimensional position coordinates of the projection surface and the position of the remote controller, and may be configured to correct the image formed on the image forming unit 200 according to that coefficient.
 Also, the second acquisition unit 204 may cause the image forming unit 200 to form a black-screen image and capture the predetermined pattern while the black screen is projected.
 Furthermore, the projector may include a receiving unit that receives a distortion correction start request signal transmitted from the remote controller, with the first acquisition unit 203, the second acquisition unit 204, and the remote controller position calculation unit 206 configured to operate in response to that signal.
 In the projector of this embodiment, the image forming unit 200 and the projection unit 201 correspond to the projection lens unit 8 shown in FIG. 1. The distortion correction means 202 corresponds to the distortion correction unit 6 and the distortion correction data calculation unit 4 shown in FIG. 1. The first acquisition unit 203 corresponds to the portion consisting of the projector projection design data storage unit 9, the three-dimensional sensor unit 10, the three-dimensional sensor calibration data storage unit 11, and the three-dimensional data projector coordinate conversion unit 12 shown in FIG. 1. The second acquisition unit 204 corresponds to the portion consisting of the camera unit 13, the pattern feature point detection unit 14, the camera imaging design data storage unit 15, the pattern physical coordinate calculation unit 16, the camera calibration data storage unit 17, and the pattern projector coordinate conversion unit 18 shown in FIG. 1. The storage unit 205 corresponds to the pattern storage unit 19 and the remote controller projection design data storage unit 20 shown in FIG. 1. The remote controller position calculation unit 206 corresponds to the remote controller position calculation unit 21 shown in FIG. 1.
 The projectors of the embodiments described above are examples of the present invention; modifications and improvements understandable to those skilled in the art may be applied to their configurations and operations without departing from the spirit of the invention.
 DESCRIPTION OF SYMBOLS
 200 Image forming unit
 201 Projection unit
 202 Distortion correction means
 203 First acquisition unit
 204 Second acquisition unit
 205 Storage unit
 206 Remote controller position calculation unit

Claims (7)

  1.  A projector used in combination with a remote controller that projects a predetermined pattern including three or more feature points, the projector comprising:
     an image forming unit that forms an image;
     a projection unit that projects the image formed by the image forming unit onto a projection surface; and
     distortion correction means that corrects the image formed on the image forming unit so that the projected image is not distorted when the projection surface is viewed from the position of the remote controller,
     wherein the distortion correction means includes:
     a first acquisition unit that measures the projection surface three-dimensionally to obtain a three-dimensional position of the projection surface, and acquires three-dimensional position coordinates of the projection surface in a projector coordinate system representing, in three-dimensional coordinates, the space into which the projection unit projects the image;
     a second acquisition unit that captures the predetermined pattern projected on the projection surface, extracts the three or more feature points from the captured image of the pattern, and acquires physical coordinates of the three or more feature points in the projector coordinate system;
     a storage unit that stores data indicating a radiation direction of each of the three or more feature points of the remote controller; and
     a remote controller position calculation unit that calculates the position of the remote controller in the projector coordinate system based on the three-dimensional position coordinates of the projection surface, the physical coordinates of the three or more feature points, and the radiation directions of the three or more feature points.
  2.  The projector according to claim 1, wherein the distortion correction means includes a distortion correction coefficient calculation unit that calculates a distortion correction coefficient for correcting distortion of the projected image based on the three-dimensional position coordinates of the projection surface and the position of the remote controller, and corrects the image formed on the image forming unit according to the distortion correction coefficient.
  3.  The projector according to claim 1 or 2, wherein the second acquisition unit causes the image forming unit to form a black-screen image and captures the predetermined pattern while the black screen is projected.
  4.  The projector according to any one of claims 1 to 3, further comprising a receiving unit that receives a distortion correction start request signal transmitted from the remote controller, wherein the first acquisition unit, the second acquisition unit, and the remote controller position calculation unit operate in response to the distortion correction start request signal.
  5.  A remote controller comprising:
     a projection unit that projects a predetermined pattern including three or more feature points;
     a transmission unit that wirelessly transmits a distortion correction start request signal requesting the start of distortion correction;
     first and second operation keys; and
     a control unit that causes the projection unit to project the predetermined pattern when the first operation key is pressed, and causes the transmission unit to transmit the distortion correction start request signal when the second operation key is pressed.
  6.  A projector system comprising:
     the projector according to any one of claims 1 to 5; and
     a remote controller that projects a predetermined pattern including three or more feature points onto a projection surface.
  7.  A distortion correction method using a remote controller that projects a predetermined pattern including three or more feature points, the method being performed by a projector including an image forming unit that forms an image and a projection unit that projects the image formed by the image forming unit onto a projection surface, the method comprising:
     correcting the image formed on the image forming unit so that the projected image is not distorted when the projection surface is viewed from the position of the remote controller,
     wherein the correcting of the distortion includes:
     measuring the projection surface three-dimensionally to obtain a three-dimensional position of the projection surface, and acquiring three-dimensional position coordinates of the projection surface in a projector coordinate system representing, in three-dimensional coordinates, the space into which the projection unit projects the image;
     capturing the predetermined pattern projected on the projection surface, extracting the three or more feature points from the captured image of the pattern, and acquiring physical coordinates of the three or more feature points in the projector coordinate system; and
     calculating the position of the remote controller in the projector coordinate system based on the three-dimensional position coordinates of the projection surface, the physical coordinates of the three or more feature points, and the radiation directions of the three or more feature points.
PCT/JP2017/015159 2017-04-13 2017-04-13 Projector, remote controller, projector system and distortion correction method WO2018189867A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/015159 WO2018189867A1 (en) 2017-04-13 2017-04-13 Projector, remote controller, projector system and distortion correction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/015159 WO2018189867A1 (en) 2017-04-13 2017-04-13 Projector, remote controller, projector system and distortion correction method

Publications (1)

Publication Number Publication Date
WO2018189867A1 true WO2018189867A1 (en) 2018-10-18

Family

ID=63793205

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/015159 WO2018189867A1 (en) 2017-04-13 2017-04-13 Projector, remote controller, projector system and distortion correction method

Country Status (1)

Country Link
WO (1) WO2018189867A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004187052A (en) * 2002-12-04 2004-07-02 Seiko Epson Corp Image processing system, projector, portable device, and image processing method
JP2004297401A (en) * 2003-03-26 2004-10-21 Sharp Corp Automatic adjuster for projector
JP2005039518A (en) * 2003-07-15 2005-02-10 Sharp Corp Projection type projector
JP2009206798A (en) * 2008-02-27 2009-09-10 Seiko Epson Corp Image display system and image display method
JP2011176637A (en) * 2010-02-24 2011-09-08 Sanyo Electric Co Ltd Projection type video display apparatus
JP2014056030A (en) * 2012-09-11 2014-03-27 Ricoh Co Ltd Image projection system, operation method for image projection system, image projection device, and remote control device for image projection system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17905061

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17905061

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP
