
WO2018167918A1 - Projector, method for creating data for mapping, program, and projection mapping system


Info

Publication number
WO2018167918A1
Authority
WO
WIPO (PCT)
Prior art keywords
projection
projector
data
dimensional
mapping
Application number
PCT/JP2017/010700
Other languages
English (en)
Japanese (ja)
Inventor
青柳 寿和
Original Assignee
Necディスプレイソリューションズ株式会社
Application filed by Necディスプレイソリューションズ株式会社
Priority to JP2019505625A (JP6990694B2)
Priority to PCT/JP2017/010700
Publication of WO2018167918A1


Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 21/00 Projectors or projection-type viewers; Accessories therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/74 Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • the present invention relates to a projector, a mapping data creation method, a program, and a projection mapping system.
  • FIG. 1A shows an example of a system that performs projection mapping.
  • This system includes a projector 101 and a video processing apparatus 105 including a personal computer.
  • the projector 101 and the video processing device 105 are connected via a video signal cable.
  • the video processing device 105 creates a projection mapping video 106 and supplies the video signal to the projector 101.
  • the projector 101 projects an image based on the image signal from the image processing device 105 toward the three-dimensional object 104 in the projection area 103.
  • the projection mapping image 106 is displayed on a projection surface including the five surfaces 104 a to 104 e of the three-dimensional object 104.
  • FIG. 1B shows an example of the projection mapping video 106.
  • the rectangular original images 141a to 141e are images to be displayed on the surfaces 104a to 104e of the three-dimensional object 104 shown in FIG. 1A.
  • the images 142a to 142e are images obtained by deforming the rectangular original images 141a to 141e according to the outer shapes of the surfaces 104a to 104e, respectively.
  • the projection mapping image 106 is obtained by assigning these images 142a to 142e so as to correspond to the surfaces 104a to 104e, respectively.
  • the method for creating the projection mapping video 106 includes a method for creating a projection mapping video in a two-dimensional space and a method for creating a projection mapping video in a three-dimensional space.
  • FIG. 2 schematically shows a method for creating a projection mapping video in a two-dimensional space.
  • a frame indicating the outer shape of each surface of the three-dimensional object in the projection video is created on the video processing device 105.
  • a frame 146 indicated by a solid line is a frame indicating the outer shape of the surface 104 c of the three-dimensional object illustrated in FIG. 1A.
  • the image of the frame formed on the video processing device 105 is actually projected from the projector 101.
  • the frame is deformed on the video processing device 105 so that the projected image of the frame matches the outer shape of the corresponding surface of the three-dimensional object 104.
  • In the example of FIG. 2, the frame 147 projected onto the three-dimensional object 104 corresponds to the frame 146 created on the video processing device 105, but the frame 147 does not match the outer shape of the corresponding surface of the three-dimensional object 104. Therefore, the frame 146 is deformed on the video processing device 105 so that the frame 147 matches the outer shape of the corresponding surface of the three-dimensional object 104.
  • FIG. 3 schematically shows a method for creating a projection mapping video in a three-dimensional space.
  • a virtual three-dimensional space is formed on the video processing apparatus 105, and a virtual three-dimensional object 104-1 and a virtual camera 148 corresponding to the three-dimensional object 104 are arranged in the three-dimensional space.
  • a video is assigned to each surface of the virtual three-dimensional object 104-1 and reproduced, and a video reproduced on each surface of the virtual three-dimensional object 104-1 is captured using the virtual camera 148.
  • An image captured by the virtual camera 148 is used as the projection mapping image 106.
  • creating a video using a virtual camera 148 is called rendering.
  • the angle of view of the virtual camera 148 is set to be equal to the angle of view of the projector 101.
  • the projector 101 is placed at the position where the virtual camera 148 was placed, and the projector 101 projects the projection mapping image 106 created using the virtual camera 148 toward the three-dimensional object 104.
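  • As an illustration of the condition above (not part of this disclosure), the following Python sketch builds a perspective projection matrix for a rendering camera whose horizontal and vertical angles of view are set equal to the projector's; the numeric angles are placeholders.

    import numpy as np

    def perspective_from_view_angles(h_fov_deg, v_fov_deg, near=0.1, far=100.0):
        # OpenGL-style perspective matrix whose frustum matches the given
        # horizontal / vertical angles of view (symmetric frustum assumed)
        tan_h = np.tan(np.radians(h_fov_deg) / 2.0)
        tan_v = np.tan(np.radians(v_fov_deg) / 2.0)
        return np.array([
            [1.0 / tan_h, 0.0,         0.0,                           0.0],
            [0.0,         1.0 / tan_v, 0.0,                           0.0],
            [0.0,         0.0,         -(far + near) / (far - near), -2.0 * far * near / (far - near)],
            [0.0,         0.0,         -1.0,                          0.0],
        ])

    # Example: a virtual camera matching a projector with a 40 x 25 degree angle of view
    P = perspective_from_view_angles(40.0, 25.0)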
  • Techniques related to projection mapping are described in Patent Documents 1 and 2.
  • In this method, it is necessary to carry out adjustment work such as deforming the frame 146 on the video processing device 105 so that the projected frame 147 matches the outer shape of the corresponding surface of the three-dimensional object 104. Such adjustment work is very laborious for the operator.
  • An object of the present invention is to solve the above-mentioned problem and to provide a projector, a mapping data creation method, a program, and a projection mapping system capable of generating mapping data from which a projection mapping video that matches the outer shape of each surface of a three-dimensional object can be created.
  • A projector according to the present invention includes a display element having an image forming surface composed of a plurality of pixels and a projection lens that projects an image formed on the image forming surface, the image being projected from the projection lens toward a three-dimensional object. The projector further includes a reception unit that receives a data creation request, and a mapping data generation unit that generates mapping data for creating a projection mapping video in response to the data creation request.
  • The mapping data generation unit includes: a three-dimensional sensor unit that three-dimensionally measures the three-dimensional object and outputs three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface of the three-dimensional object onto which the image is projected; a coordinate conversion unit that converts the coordinate system of the three-dimensional position data into a projector coordinate system in which the projection area of the image is expressed in three-dimensional coordinates with, as an origin, the point where the principal rays from the pixels located at the diagonal corners of the image forming surface intersect; and a data generation unit that, based on the three-dimensional position data coordinate-converted by the coordinate conversion unit, acquires, for each surface of the three-dimensional object, the position of the surface and a ridge line indicating the outer shape of the surface, and generates the mapping data based on the position and the ridge line.
  • A mapping data creation method according to the present invention is performed by a projector that includes a display element having an image forming surface composed of a plurality of pixels and a projection lens that projects an image formed on the image forming surface, the image being projected from the projection lens toward a three-dimensional object. The method includes a mapping data generation step of generating mapping data for creating a projection mapping video in response to a data creation request.
  • The mapping data generation step includes: three-dimensionally measuring the three-dimensional object to obtain three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface of the three-dimensional object onto which the image is projected; converting the coordinate system of the three-dimensional position data into a projector coordinate system in which the projection area of the image is expressed in three-dimensional coordinates with, as an origin, the point where the principal rays from the pixels located at the diagonal corners of the image forming surface intersect; and generating the mapping data from the coordinate-converted three-dimensional position data.
  • A program according to the present invention causes a projector, which includes a display element having an image forming surface composed of a plurality of pixels and a projection lens that projects an image formed on the image forming surface, the image being projected from the projection lens toward a three-dimensional object, to execute a mapping data generation process of generating mapping data for creating a projection mapping video in response to a data creation request.
  • The mapping data generation process includes: a process of three-dimensionally measuring the three-dimensional object and obtaining three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface of the three-dimensional object onto which the image is projected; and a process of converting the coordinate system of the three-dimensional position data into a projector coordinate system in which the projection area of the image is expressed in three-dimensional coordinates with, as an origin, the point where the principal rays from the pixels located at the diagonal corners of the image forming surface intersect.
  • A projection mapping system according to the present invention includes the above projector and a video processing device capable of mutual communication with the projector, wherein the video processing device creates a projection mapping video based on the data for a projection mapping video creation tool generated by the projector.
  • A block diagram illustrating the configuration of a projector according to a first embodiment of the present invention.
  • Schematic diagrams for explaining the relationship between the angle of view, the projection optical axis, and the projection center point in the case of launch projection.
  • A flowchart showing a procedure for generating a data file; a schematic diagram showing an example of perspective projection; a schematic diagram for explaining the angle of view in launch projection; a schematic diagram for explaining the angle-of-view central axis; and a schematic diagram for explaining a projection target surface representing one surface of a three-dimensional object.
  • A block diagram showing the configuration of a projector according to a second embodiment of the present invention.
  • A block diagram showing the configuration of a projector according to a third embodiment of the present invention.
  • FIG. 4 is a block diagram showing a configuration of the projector according to the first embodiment of the present invention.
  • As shown in FIG. 4, the projector includes a communication control unit 1, a parameter storage unit 3, a projection unit 5, a projector data generation unit 6, a mapping data generation unit 7, a file storage unit 8, a projector projection design data storage unit 9, a view angle symmetrization unit 10, and an attitude sensor unit 11.
  • the communication control unit 1 includes control means such as a CPU (Central Processing Unit), and is connected to an external device via the communication input / output unit 2 so as to be able to communicate with each other.
  • RS-232C, a wired LAN (Local Area Network), a wireless LAN, or the like can be used as a communication means between the communication input / output unit 2 and an external device, but is not limited thereto.
  • the external device is an information processing apparatus such as a personal computer, for example, and includes a projection mapping video creation tool.
  • the projection mapping video creation tool is a tool for creating a projection mapping video in a three-dimensional space, a tool for creating a projection mapping video in a two-dimensional space, or the like. Since these tools are existing, detailed description thereof is omitted here.
  • the communication control unit 1 controls the operation of each unit of the projector and transmits / receives data and instruction signals (or control signals) to / from the information processing apparatus. For example, the communication control unit 1 controls the projector data generation unit 6 and the mapping data generation unit 7 in accordance with an instruction from the information processing apparatus, and generates data (three-dimensional object data) necessary to create a projection mapping video. 2D data or 3D data or projector data) is generated.
  • The parameter storage unit 3 stores parameters used in the mapping data creation processing necessary to create a projection mapping video (a view angle symmetrization selection parameter, a segmentation parameter, a polygon meshing parameter, a file format parameter, a mapping mode parameter, and the like).
  • the field angle symmetrization selection parameter is setting information indicating whether or not there is field angle symmetrization.
  • the segmentation parameter is a threshold value for extracting each surface of the three-dimensional object based on three-dimensional position data (point cloud data) obtained by three-dimensionally measuring the shape of the surface of the three-dimensional object.
  • the polygon meshing parameter is a parameter (polygon shape, mesh roughness, etc.) necessary for polygonal meshing of each surface of the extracted three-dimensional object.
  • the file format parameter is a parameter for designating the formats of the two-dimensional data file and the three-dimensional data file.
  • The mapping mode parameter is a parameter for designating any one of three modes: mapping for two-dimensional space, mapping for three-dimensional space, and no mapping.
  • a video signal is input from the information processing apparatus to the projection unit 5 via the projection video input unit 4.
  • the projection unit 5 projects a video based on the input video signal from the projection video input unit 4, and includes a video processing unit 12, a distortion correction unit 13, a projection lens unit 14, and a distortion correction coefficient calculation unit 15.
  • the video processing unit 12 performs processing for converting the resolution of the input video signal to the resolution of the display device, processing for adjusting image quality, and the like.
  • the distortion correction unit 13 corrects distortion (for example, trapezoidal distortion) of the image projected on the projection surface not facing the projector, according to the distortion correction coefficient, with respect to the video signal processed by the video processing unit 12.
  • The distortion correction coefficient calculation unit 15 calculates the distortion correction coefficient required by the distortion correction unit 13, based on the distortion correction information set by the user or on the horizontal and vertical inclination information of the projection plane supplied from the view angle symmetrization unit 10.
  • the user can set information for distortion correction using an operation unit (not shown).
  • the projection lens unit 14 includes a display device that forms an image based on the video signal from the distortion correction unit 13 and a projection lens that projects an image formed by the display device.
  • The projection lens includes a lens that can move in the optical axis direction, a zoom mechanism that changes the angle of view according to a zoom position corresponding to the position of the lens on the optical axis, and a lens shift mechanism that shifts the projection lens as a whole in a direction orthogonal to the optical axis.
  • the projection lens unit 14 supplies zoom / shift position information indicating the zoom position and lens shift position of the projection lens to the view angle symmetrizing unit 10.
  • the display device can be referred to as a display element having an image forming surface including a plurality of pixels, and an image based on the video signal from the distortion correction unit 13 is formed on the image forming surface.
  • a liquid crystal display element, DMD (digital micromirror device), or the like can be used as the display element.
  • the projector projection design data storage unit 9 stores design data related to projector projection.
  • The design data is optical data for obtaining the angle of view and the projection center point from the zoom position and lens shift position of the projection lens, that is, data for calculating how the angle of view and the projection center point change depending on the zoom position and the lens shift position. This data is determined by the optical design.
  • The attitude sensor unit 11 includes an attitude sensor, for example a triaxial acceleration sensor, that can detect rotation angles over the full 360° range, and detects the inclination of the projector with respect to the horizontal plane.
  • the output of the attitude sensor unit 11 is supplied to the projector data generation unit 6 and the mapping data generation unit 7.
  • the view angle symmetrization unit 10 acquires zoom / shift position information from the projection lens unit 14, acquires design data from the projector projection design data storage unit 9, and acquires view angle symmetrization selection parameters from the parameter storage unit 3.
  • the mapping mode parameter is acquired from the parameter storage unit 3.
  • In accordance with the mapping mode parameter and the view angle symmetrization selection parameter, the view angle symmetrization unit 10 calculates, based on the zoom/shift position information and the design data, a projection direction vector indicating the direction of the projection center axis of the projector, the angle of view, and the horizontal and vertical inclination angles of the projection plane with respect to the projection plane obtained when the projector directly faces the screen.
  • the angle of view represents the range of projection light from the projection lens (image projection range) as an angle
  • the range of horizontal projection light as an angle is called the horizontal angle of view.
  • a range of vertical projection light expressed as an angle is called a vertical angle of view.
  • the projection central axis is the central axis (corresponding to the central ray) of the projection light from the projection lens, and can be used as a reference for the angle of view.
  • the angle of view, the projection center axis, and the projection center point change according to the zoom position and the lens shift position.
  • FIG. 5A schematically shows the relationship between the angle of view, the projection optical axis, and the projection center point in the case of launch projection.
  • FIG. 5B schematically shows the relationship between the angle of view, the projection optical axis, and the projection center point when there is no launch.
  • FIG. 5C schematically shows the relationship between the angle of view, the projection optical axis, and the projection center point when the angle of view is enlarged.
  • FIG. 5D schematically shows the relationship between the angle of view, the projection center axis, and the projection center point when the lens shift is performed upward.
  • the projection optical axis is an axis passing through the center of the image forming surface and perpendicular to the image forming surface.
  • FIGS. 5B to 5D correspond to cross sections in the vertical direction.
  • A projection form called launch projection, in which the image is projected above the projection optical axis so that the image appears above the height of the table on which the projector is placed, is commonly used. Lens shift is a projection form in which the image is projected while being shifted vertically and horizontally with respect to the projection optical axis, and launch projection is one form of lens shift.
  • The projection center point 102 is the point where the straight lines that connect each of the four corner points of the projection area 103 with the corresponding corner points of the image forming area of the display device 100 intersect.
  • the projection area 103 is obtained by inverting the image of the image forming area of the display device 100 vertically and horizontally.
  • the four corner points of the image forming area of the display device 100 are A point, B point, C point, and D point, respectively, and the four corner points of the projection area 103 are respectively a point, b point, c point, and d point.
  • Points a, b, c, and d correspond to points A, B, C, and D, respectively, and the arrangement of points a, b, c, and d is reversed vertically and horizontally with respect to the arrangement of points A, B, C, and D.
  • The projection center point 102 is shown as the point where the following principal rays intersect: the principal ray that exits from point A and reaches point a via the lens, the principal ray that exits from point B and reaches point b via the lens, the principal ray that exits from point C and reaches point c via the lens, and the principal ray that exits from point D and reaches point d via the lens.
  • Such an intersection of principal rays can be defined, for example, at the center of the aperture stop of the projection lens, and can be calculated based on lens design data.
  • In the example with launch shown in FIG. 5A, the projection optical axis 109 passes through the center of the lower end of the projection area 103, and the projection center point 102 is located above the projection optical axis 109. In this case, the projection center axis does not coincide with the projection optical axis 109.
  • In the example without launch shown in FIG. 5B, the projection optical axis 109 passes through the center of the projection area 103, and the projection center point 102 is located on the projection optical axis 109. In this case, the projection center axis coincides with the projection optical axis 109.
  • FIG. 5C is an example in which the angle of view is enlarged compared with the example of FIG. 5B. As in FIG. 5B, the projection optical axis 109 passes through the center of the projection area 103 and the projection center point 102 is located on the projection optical axis 109, but the projection center point 102 is located closer to the display device 100 than in the example of FIG. 5B. Also in this case, the projection center axis coincides with the projection optical axis 109.
  • FIG. 5D is an example in which the lens shift is performed so that the projection area 103 is shifted upward as compared with the example of FIG. 5B.
  • the projection optical axis 109 passes through the center of the lower end of the projection area 103, and the projection center point 102 is located above the projection optical axis 109.
  • the projection center axis does not coincide with the projection optical axis 109.
  • the angle of view, the projection center axis, and the projection center point change according to the zoom position and the lens shift position. In other words, the angle of view, the projection center axis, and the projection center point need to be determined according to the zoom position and the lens shift position.
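  • Because the corner principal rays of a real lens may not intersect in exactly one point, one plausible way to compute the projection center point from lens design data is a least-squares intersection of the four rays. The sketch below is an illustrative assumption, not a method stated in this document; the corner coordinates and ray directions are dummy values.

    import numpy as np

    def least_squares_ray_intersection(points, directions):
        # point minimizing the summed squared distance to all rays;
        # ray i passes through points[i] with direction directions[i]
        A = np.zeros((3, 3))
        b = np.zeros(3)
        for p, d in zip(points, directions):
            d = d / np.linalg.norm(d)
            M = np.eye(3) - np.outer(d, d)   # projects onto the plane normal to d
            A += M
            b += M @ p
        return np.linalg.solve(A, b)

    # principal rays leaving the four corner pixels A..D of the image forming surface
    corners = [np.array([-8.0,  4.5, 0.0]), np.array([ 8.0,  4.5, 0.0]),
               np.array([-8.0, -4.5, 0.0]), np.array([ 8.0, -4.5, 0.0])]
    ray_dirs = [np.array([ 0.05, -0.03, 1.0]), np.array([-0.05, -0.03, 1.0]),
                np.array([ 0.05,  0.03, 1.0]), np.array([-0.05,  0.03, 1.0])]
    projection_center_point = least_squares_ray_intersection(corners, ray_dirs)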
  • the process of calculating the projection direction vector, the view angle, and the tilt angle differs depending on whether the view angle symmetrization is performed or not.
  • When the view angle symmetrization is not performed, the view angle symmetrization unit 10 executes the following processes (A1) to (A3).
  • (A1) Based on the zoom/shift position information (zoom position and lens shift position) and the design data, the left and right angles of view in the horizontal direction and the upper and lower angles of view in the vertical direction are obtained.
  • (A2) The projection optical axis is regarded as the projection center axis, and a vector whose horizontal and vertical components are 0 is set as the projection direction vector.
  • (A3) Both the horizontal and vertical inclinations are set to zero.
  • When the view angle symmetrization is performed, the view angle symmetrization unit 10 executes the following processes (B1) to (B3).
  • (B1) Based on the zoom/shift position information (zoom position and lens shift position) and the design data, the left and right angles of view in the horizontal direction and the upper and lower angles of view in the vertical direction are obtained.
  • (B2) The projection center axis and a projection plane perpendicular to the projection center axis are determined such that, when distortion correction is performed so that the image projected on the projection plane becomes square, the left and right horizontal angles of view with respect to the projection center axis are equal and the upper and lower vertical angles of view are equal, and the projection direction vector is set in the direction of the projection center axis.
  • (B3) The horizontal and vertical inclinations of the projection plane determined in (B2), which is perpendicular to the projection center axis, with respect to the plane perpendicular to the projection optical axis are obtained.
  • the angle-of-view symmetrization unit 10 supplies the projection direction vector and the calculation result of the angle of view to the projector data generation unit 6 and supplies the calculation result of the tilt angle to the distortion correction coefficient calculation unit 15. Note that when the mapping mode parameter is a mode without mapping, the view angle symmetrization unit 10 does not perform the calculation processing of the projection direction vector, the view angle, and the tilt.
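  • A minimal sketch of the idea behind the symmetrization process, under the simplifying assumption (noted again further below) that the reduction of the projected video area caused by distortion correction is ignored: the projection center axis is tilted so that the upper/lower and left/right angles of view about it become equal. The function name and angle values are illustrative only.

    import numpy as np

    def symmetrize_view_angles(theta_t, theta_b, theta_l, theta_r):
        # angles (deg) of the projection light above/below/left/right of the
        # projection optical axis -> tilt of a projection center axis about
        # which the angles of view become vertically/horizontally symmetric
        tilt_v = (theta_t - theta_b) / 2.0     # vertical tilt of the center axis
        tilt_h = (theta_r - theta_l) / 2.0     # horizontal tilt of the center axis
        half_v = (theta_t + theta_b) / 2.0     # symmetric vertical half angle
        half_h = (theta_r + theta_l) / 2.0     # symmetric horizontal half angle
        # projection direction vector in the projector frame (z = optical axis)
        direction = np.array([np.tan(np.radians(tilt_h)),
                              np.tan(np.radians(tilt_v)),
                              1.0])
        return direction / np.linalg.norm(direction), (half_h, half_v), (tilt_h, tilt_v)

    # launch projection example: all projection light lies above the optical axis
    vec, half_angles, tilts = symmetrize_view_angles(30.0, 0.0, 20.0, 20.0)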
  • the projector data generation unit 6 generates a projector data file for setting a rendering camera and a projector in a virtual three-dimensional space. This projector data file is used in a tool for creating a projection mapping video in a three-dimensional space on the information processing apparatus side.
  • the projector data generation unit 6 includes an initial projector data generation unit 16, a projector data world coordinate conversion unit 17, a projector data vertical offset unit 18, and a projector data file generation unit 19.
  • The initial projector data generation unit 16 sets the origin of the projector coordinate system, which is the projection center point of the projector (the projection center point 102 shown in FIGS. 5A to 5D), that is, the coordinates (0, 0, 0), as the projector position coordinates.
  • the initial projector data generation unit 16 generates projector data including the projector position coordinates, the projection direction vector from the angle-of-view symmetrization unit 10, and the angle of view (up / down / left / right angle of view).
  • The projector data world coordinate conversion unit 17 converts the projection direction vector included in the projector data generated by the initial projector data generation unit 16 into a world coordinate system based on the horizontal plane, using the inclination of the projector with respect to the horizontal plane detected by the attitude sensor unit 11.
  • The projector data vertical offset unit 18 changes the vertical coordinate of the projector position coordinates included in the projector data from the projector data world coordinate conversion unit 17, in accordance with the adjustment of the vertical coordinates of the three-dimensional position data performed by the mapping data generation unit 7. In order to perform this coordinate change, the vertical offset amount is supplied from the mapping data generation unit 7 to the projector data vertical offset unit 18.
  • the projector data file generation unit 19 generates a projector data file 31 based on the projector data from the projector data vertical offset unit 18.
  • the projector data file 31 is stored in the file storage unit 8.
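  • The contents of the projector data file 31 are stated (projector position coordinates, projection direction vector, and up/down/left/right angles of view), but its on-disk format is left to the file format parameter. The following is only a hypothetical JSON layout for illustration.

    import json

    projector_data = {
        "position": [0.0, 1.2, 0.0],        # projector position (projection center point) after vertical offset
        "direction": [0.0, 0.087, 0.996],   # projection direction vector in world coordinates
        "view_angle_deg": {"up": 15.0, "down": 15.0, "left": 20.0, "right": 20.0},
    }

    with open("projector_data_31.json", "w") as f:
        json.dump(projector_data, f, indent=2)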
  • the mapping data generation unit 7 generates a two-dimensional data file 32 indicating the outer position of each surface of the three-dimensional object or a three-dimensional data file 33 of the three-dimensional object.
  • the two-dimensional data file 32 is used by a tool for creating a projection mapping video in a two-dimensional space.
  • the three-dimensional data file 33 is used by a tool that creates a projection mapping video in a three-dimensional space.
  • The two-dimensional data file 32 and the three-dimensional data file 33 are stored in the file storage unit 8.
  • The mapping data generation unit 7 includes a three-dimensional sensor unit 20, a calibration data storage unit 21, a three-dimensional position data projector coordinate conversion unit 22, a three-dimensional position data world coordinate conversion unit 23, a vertical offset calculation unit 24, a three-dimensional position data vertical offset unit 25, a three-dimensional position data segmentation unit 26, a perspective projection unit 27, a three-dimensional position data polygon meshing unit 29, a two-dimensional data file generation unit 28, and a three-dimensional data file generation unit 30.
  • the three-dimensional sensor unit 20 includes a three-dimensional sensor that is arranged toward the optical axis direction of the projection lens and three-dimensionally measures each surface of a three-dimensional object that is a projection target.
  • the detection range of the three-dimensional sensor includes the entire projectable area where the image of the projection lens can be projected.
  • FIG. 6 is a schematic diagram for explaining the relative positional relationship between the detection range of the three-dimensional sensor and the projectable area.
  • the three-dimensional sensor 108 is arranged toward the optical axis direction of the projection lens 101a.
  • the projection area can be enlarged or reduced using the zoom function, and the projection area can be moved up, down, left and right using the lens shift function.
  • the projection area 113 is a projection area when the lens is shifted in the upper right direction to minimize the zoom.
  • the projection area 114 is a projection area when the lens is shifted in the upper right direction to maximize the zoom.
  • the projection area 115 is a projection area when the lens is shifted in the upper left direction to minimize the zoom.
  • the projection area 116 is a projection area when the lens is shifted in the upper left direction to maximize the zoom.
  • the projection area 117 is a projection area when the lens is shifted in the lower right direction to minimize the zoom.
  • the projection area 118 is a projection area when the zoom is maximized by shifting the lens in the lower right direction.
  • the projection area 119 is a projection area when the lens is shifted in the lower left direction to minimize the zoom.
  • the projection area 120 is a projection area when the lens is shifted in the lower left direction to maximize the zoom.
  • the detection range of the three-dimensional sensor 108 includes the entire projectable area where the image from the projection lens 101a can be projected, that is, the entire projection areas 113 to 120. Therefore, the three-dimensional sensor 108 can measure the three-dimensional position of the three-dimensional object arranged in the projectable area.
  • the three-dimensional sensor unit 20 supplies the three-dimensional position data output from the three-dimensional sensor 108 to the three-dimensional position data projector coordinate conversion unit 22.
  • As the three-dimensional sensor 108, for example, a TOF (Time-of-Flight) type or a triangulation type three-dimensional sensor can be used, but the sensor is not limited to these methods.
  • the TOF method is a method of performing three-dimensional measurement by projecting light toward an object and measuring the time until the projected light is reflected by the object and returned.
  • Examples of the triangulation method include a passive triangulation method and an active triangulation method.
  • In the passive triangulation method, an object is photographed simultaneously by two cameras arranged side by side, and three-dimensional measurement is performed using the principle of triangulation based on the difference in the position of the object between the images captured by the two cameras; this is also called the stereo camera method.
  • the active triangulation method is a method of irradiating light on an object and performing three-dimensional measurement using the principle of triangulation based on information on reflected light from the object.
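  • The distance relations used by the two sensing methods just described can be summarized as follows (a basic sketch; the numeric values are examples, not measurements from this system).

    C = 299_792_458.0  # speed of light in m/s

    def tof_distance(round_trip_time_s):
        # TOF method: light travels to the object and back,
        # so the distance is half the round trip multiplied by c
        return C * round_trip_time_s / 2.0

    def stereo_depth(focal_length_px, baseline_m, disparity_px):
        # passive triangulation (stereo camera) method: depth from the
        # disparity between the two cameras' images of the same point
        return focal_length_px * baseline_m / disparity_px

    print(tof_distance(20e-9))              # about 3 m for a 20 ns round trip
    print(stereo_depth(1000.0, 0.1, 25.0))  # 4 m for a 25-pixel disparity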
  • the calibration data storage unit 21 stores calibration data.
  • the calibration data includes parameters (rotation amount and translation amount) for converting the coordinate system of the three-dimensional sensor 108 to the coordinate system of the projector, a reference zoom position, and a reference lens shift position.
  • the amount of rotation and the amount of translation can be obtained by performing calibration that measures the positional relationship between the coordinate system of the three-dimensional sensor and the coordinate system of the projector.
  • the reference zoom position and the reference lens shift position are the zoom position and the lens shift position when calibration is performed.
  • The three-dimensional position data projector coordinate conversion unit 22 acquires the calibration data (rotation amount, translation amount, reference zoom position, and reference lens shift position) from the calibration data storage unit 21 and the design data from the projector projection design data storage unit 9.
  • Based on the calibration data and the design data, the three-dimensional position data projector coordinate conversion unit 22 converts the three-dimensional position data of the three-dimensional object from the three-dimensional sensor into three-dimensional position data in the projector coordinate system with the projection center point as the origin.
  • the calibration data and design data can be referred to as coordinate conversion data for converting the coordinate system of the three-dimensional sensor into a projector coordinate system in which the projection center point is the origin.
  • The three-dimensional position data world coordinate conversion unit 23 acquires the inclination of the projector with respect to the horizontal plane from the attitude sensor unit 11, and acquires the three-dimensional position data coordinate-converted into the projector coordinate system from the three-dimensional position data projector coordinate conversion unit 22.
  • the three-dimensional position data world coordinate conversion unit 23 converts the three-dimensional position data coordinate-converted into the projector coordinate system into a world coordinate system based on the horizontal plane based on the tilt of the projector with respect to the horizontal plane.
  • the three-dimensional position data coordinate-converted into the world coordinate system is supplied to the vertical offset calculation unit 24 and the three-dimensional position data vertical offset unit 25.
  • the vertical offset calculator 24 obtains the minimum value of the vertical coordinates from the three-dimensional position data coordinate-converted to the world coordinate system from the three-dimensional position data world coordinate converter 23. When the minimum value of the vertical coordinate is a negative number, the vertical offset calculation unit 24 outputs a vertical offset amount indicating the absolute value of the minimum value of the vertical coordinate. When the minimum value of the vertical coordinate is a positive number, the vertical offset calculation unit 24 outputs a vertical offset amount indicating 0. The output of the vertical offset calculation unit 24 is supplied to the projector data vertical offset unit 18 and the three-dimensional position data vertical offset unit 25.
  • The three-dimensional position data vertical offset unit 25 offsets, in the vertical direction, the three-dimensional position data coordinate-converted into the world coordinate system by the three-dimensional position data world coordinate conversion unit 23, based on the vertical offset amount calculated by the vertical offset calculation unit 24. Specifically, the vertical offset is performed by adding the vertical offset amount to the vertical coordinate of the three-dimensional position data.
  • The three-dimensional position data that has been offset in the vertical direction is supplied to the three-dimensional position data segmentation unit 26, the three-dimensional position data polygon meshing unit 29, and the three-dimensional data file generation unit 30.
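  • The vertical offset just described can be summarized by the following sketch, assuming the point cloud is an N×3 array whose second column is the vertical (world) coordinate.

    import numpy as np

    def vertical_offset_amount(points_world):
        # vertical offset calculation unit 24: absolute value of the minimum
        # vertical coordinate if it is negative, otherwise 0
        min_y = points_world[:, 1].min()
        return -min_y if min_y < 0 else 0.0

    def apply_vertical_offset(points_world, offset):
        # three-dimensional position data vertical offset unit 25: add the
        # offset to the vertical coordinate of every point
        shifted = points_world.copy()
        shifted[:, 1] += offset
        return shifted

    cloud = np.array([[0.2, -0.5, 3.0], [0.1, 1.4, 2.8]])
    offset = vertical_offset_amount(cloud)            # 0.5
    cloud_shifted = apply_vertical_offset(cloud, offset)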
  • the three-dimensional position data segmentation unit 26 acquires a segmentation parameter from the parameter storage unit 3.
  • the three-dimensional position data segmentation unit 26 detects a surface of a three-dimensional object and a ridge line indicating the shape from the three-dimensional position data from the three-dimensional position data vertical offset unit 25 based on the segmentation parameter.
  • the detection results of the surface and the ridge line are supplied to the perspective projection unit 27 and the three-dimensional position data polygon meshing unit 29.
  • the perspective projection unit 27 acquires projector data from the projector data vertical offset unit 18.
  • The perspective projection unit 27 uses the projector data to perspectively project the ridge lines detected by the three-dimensional position data segmentation unit 26 onto the projection plane, thereby generating two-dimensional data indicating the outline position of each surface of the three-dimensional object in the projection video.
  • the two-dimensional data is supplied to the two-dimensional data file generation unit 28.
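  • A simplified sketch of the perspective projection performed here, under the assumptions that the ridge-line points are already expressed in a frame whose z axis is the projection center axis, that the angle of view is symmetric, and that the panel resolution is 1920×1080 (all illustrative, not specified in this document).

    import numpy as np

    def project_point(p, half_h_deg, half_v_deg, width_px, height_px):
        # perspective projection of a 3D point (z forward) onto the projection image
        x_n = p[0] / p[2] / np.tan(np.radians(half_h_deg))   # normalized -1..1
        y_n = p[1] / p[2] / np.tan(np.radians(half_v_deg))   # normalized -1..1
        u = (x_n + 1.0) / 2.0 * width_px
        v = (1.0 - y_n) / 2.0 * height_px                    # image origin at top-left
        return u, v

    # project every vertex of a detected ridge line to obtain the 2D outline data
    ridge_line = np.array([[0.5, 0.2, 3.0], [0.6, 0.2, 3.1], [0.6, 0.4, 3.1]])
    outline_2d = [project_point(p, 20.0, 12.5, 1920, 1080) for p in ridge_line]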
  • the two-dimensional data file generation unit 28 acquires file format parameters from the parameter storage unit 3.
  • the two-dimensional data file generation unit 28 generates a two-dimensional data file 32 based on the file format indicated by the file format parameter from the two-dimensional data created by the perspective projection unit 27.
  • the two-dimensional data file 32 is stored in the file storage unit 8.
  • the three-dimensional position data polygon meshing unit 29 acquires a polygon meshing parameter and a file format parameter from the parameter storage unit 3.
  • the 3D position data polygon meshing unit 29 converts the surface detected by the 3D position data segmentation unit 26 into a polygon mesh based on the polygon meshing parameter and the file format indicated by the file format parameter.
  • Data of the polygon meshed surface is supplied to the three-dimensional data file generation unit 30.
  • the three-dimensional data file generation unit 30 acquires file format parameters from the parameter storage unit 3.
  • The three-dimensional data file generation unit 30 generates the three-dimensional data file 33 from the surface data polygon-meshed by the three-dimensional position data polygon meshing unit 29, based on the file format indicated by the file format parameter.
  • the three-dimensional data file 33 is stored in the file storage unit 8.
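  • The concrete format of the three-dimensional data file 33 is determined by the file format parameter and is not fixed here; purely as an illustration, polygon-meshed surface data could be written in a common mesh format such as Wavefront OBJ.

    def write_obj(path, vertices, faces):
        # write polygon-meshed surface data as a Wavefront OBJ file
        with open(path, "w") as f:
            for x, y, z in vertices:
                f.write(f"v {x} {y} {z}\n")
            for face in faces:                      # OBJ indexes vertices from 1
                f.write("f " + " ".join(str(i + 1) for i in face) + "\n")

    # one square surface of a solid object, meshed into two triangles
    verts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
    faces = [(0, 1, 2), (0, 2, 3)]
    write_obj("three_dimensional_data_33.obj", verts, faces)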
  • FIG. 7 shows an example of the projection mapping system.
  • the projection mapping system includes a projector 201 and a video processing device 205 such as a personal computer.
  • the projector 201 and the video processing device 205 are connected to each other via a communication unit 207 so that they can communicate with each other.
  • the communication unit 207 may be composed of a communication cable and a video signal cable.
  • The video signal cable is used by the video processing device 205, described later, to supply a projection mapping video for three-dimensional space to the projector 201.
  • The communication cable is used by the projector 201, described later, to supply data such as the projector data file 31, the two-dimensional data file 32, and the three-dimensional data file 33 to the video processing device 205. It is also possible to implement the communication cable and the video signal cable as a single cable.
  • the communication unit 207 may include a wireless communication unit.
  • The projector 201 has the configuration described with reference to FIGS. 4 to 6 and projects video based on the video signal from the video processing device 205 onto each surface of the three-dimensional object 104.
  • The video processing device 205 includes at least one of a projection mapping video creation tool for two-dimensional space and a projection mapping video creation tool for three-dimensional space.
  • Projection mapping is performed according to the following procedure.
  • the projector 201 executes mapping data creation processing in response to a data creation instruction (data creation request) from the video processing device 205.
  • In this mapping data creation process, data for a two-dimensional space projection mapping video creation tool or data for a three-dimensional space projection mapping video creation tool is created.
  • the video processing device 205 is equipped with a projection mapping video creation tool for 3D space, and the projector 201 creates data for the projection mapping video creation tool for 3D space.
  • This data includes three-dimensional data of a three-dimensional object and projector data for setting a rendering camera and a projector in a virtual three-dimensional space.
  • The video processing device 205 acquires the data for the 3D space projection mapping video creation tool generated by the projector 201, that is, the 3D data file 33 and the projector data file 31. Then, the 3D space projection mapping video creation tool creates a 3D space projection mapping video using the 3D data file 33 and the projector data file 31 acquired from the projector 201. The video processing device 205 supplies the 3D space projection mapping video to the projector 201, and the projector 201 projects an image based on it onto each surface of the three-dimensional object 104.
  • When the video processing device 205 is equipped with a projection mapping video creation tool for two-dimensional space, the projector 201 creates data for that tool, that is, the two-dimensional data file 32, instead of the three-dimensional data file 33.
  • The two-dimensional data file 32 contains two-dimensional data indicating the outline position of each surface of the three-dimensional object in the projection video, and is supplied from the projector 201 to the video processing device 205.
  • the two-dimensional space projection mapping video creation tool uses the two-dimensional data file 32 to create a two-dimensional space projection mapping video.
  • FIG. 8 shows a procedure of mapping data creation processing.
  • the operator performs an input operation (data creation request) for setting parameters necessary for creating a projection mapping video on the video processing device 205.
  • a parameter setting start instruction and parameter setting screen information are supplied from the video processing device 205 to the projector 201.
  • the parameter setting screen information is supplied to the projection unit 5, and the communication control unit 1 causes the projection unit 5 to project the parameter setting screen according to the parameter setting start instruction.
  • the operator inputs necessary parameter information with reference to the parameter setting screen.
  • the parameters include parameters necessary for mapping data creation processing (view angle symmetrization selection parameter, segmentation parameter, polygon meshing parameter, file format parameter, mapping mode parameter, etc.).
  • the parameter input information is supplied from the video processing device 205 to the projector 201.
  • the communication control unit 1 stores parameters input by the operator in the parameter storage unit 3.
  • The view angle symmetrization unit 10 acquires the zoom/shift position information from the projection lens unit 14, acquires the design data from the projector projection design data storage unit 9, and acquires the view angle symmetrization selection parameter and the mapping mode parameter from the parameter storage unit 3.
  • In accordance with the mapping mode parameter and the view angle symmetrization selection parameter, the view angle symmetrization unit 10 calculates, based on the zoom/shift position information and the design data, the projection direction vector indicating the direction of the projection center axis of the projector, the angle of view, and the horizontal and vertical inclination angles of the projection plane with respect to the projection plane obtained when the projector directly faces the screen.
  • the calculation process of the angle of view, the projection direction, and the tilt by the angle of view symmetrization unit 10 includes the following first and second processes.
  • First process: the first process is executed when the angle of view is not to be symmetrized vertically and horizontally, that is, when the mapping mode parameter is set to two-dimensional space mapping, or when the mapping mode parameter is set to three-dimensional space mapping and the view angle symmetrization selection parameter is set to "no view angle symmetrization". In these cases, the view angle symmetrization unit 10 executes the first process.
  • The view angle symmetrization unit 10 obtains the angle of view from the zoom position and lens shift position of the projection lens unit 14 and from the projection-related design data stored in the projector projection design data storage unit 9 (the projector's angle of view, zoom characteristics, lens shift characteristics, and the like).
  • the view angle symmetrizing unit 10 sets the projection direction vector to the direction of the projection optical axis, that is, a vector in which the horizontal / vertical direction components are 0 respectively.
  • the angle-of-view symmetrization unit 10 does not change the projection plane, and therefore sets both the horizontal inclination and the vertical inclination to zero.
  • Second process: when the mapping mode parameter is set to three-dimensional space mapping and the view angle symmetrization selection parameter is set to "with view angle symmetrization", the view angle symmetrization unit 10 executes the second process.
  • The view angle symmetrization unit 10 obtains the angle of view from the zoom position and lens shift position of the projection lens unit 14 and from the projection-related design data stored in the projector projection design data storage unit 9 (the projector's angle of view, zoom characteristics, lens shift characteristics, and the like).
  • The view angle symmetrization unit 10 then determines a projection plane perpendicular to the projection center axis such that, when distortion correction is performed so that the image projected on the projection plane becomes square, the left and right horizontal angles of view with respect to the projection center axis are equal and the upper and lower vertical angles of view are equal, and it sets the projection direction vector in the direction of the projection center axis.
  • the view angle symmetrization unit 10 obtains the horizontal and vertical inclinations of the projection plane perpendicular to the projection central axis with respect to the plane perpendicular to the projection optical axis.
  • The projection unit 5 performs distortion correction so that the projected image on the projection plane becomes square. However, since the projected video area after this distortion correction is smaller than before the correction, the distortion correction also affects the angle of view. Therefore, it is desirable to determine the projection center axis for which the left and right horizontal angles of view are equal and the upper and lower vertical angles of view are equal, and the inclination of the projection plane with respect to the projection plane obtained when the projector and the screen face each other, in consideration of the change in the projected video area caused by the distortion correction.
  • The distortion correction coefficient calculation unit 15 normally calculates a distortion correction coefficient according to the distortion correction setting made by the user, and the distortion correction unit 13 uses it to correct the distortion that occurs when projecting onto a screen that does not directly face the projector. However, when two-dimensional space mapping or three-dimensional space mapping is set in the mapping mode parameter, the distortion correction coefficient calculation unit 15 calculates the distortion correction coefficient not from the user's distortion correction setting but from the inclination of the projection plane supplied from the view angle symmetrization unit 10, and the distortion correction unit 13 corrects the distortion based on that coefficient.
  • The processing by the view angle symmetrization unit 10 and the distortion correction processing in the projection unit 5 are executed every time the user adjusts the zoom position or the lens shift position of the projection lens unit 14. This eliminates the deviation between the projected image and the three-dimensional object that would otherwise arise when using a tool for creating a projection mapping video in a three-dimensional space in which the angle of view can only be set to be vertically and horizontally symmetrical.
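  • One common way to realize this kind of distortion (keystone) correction, shown only as an assumption about how it might be implemented, is to pre-warp each video frame with a homography that maps the full frame onto the smaller quadrilateral that will appear square on the tilted projection plane. The corner coordinates below are placeholders.

    import cv2
    import numpy as np

    def keystone_prewarp(frame, corrected_corners):
        # pre-warp a frame so the projected image appears square on a tilted plane
        h, w = frame.shape[:2]
        src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
        dst = np.float32(corrected_corners)     # where the frame corners must land
        H = cv2.getPerspectiveTransform(src, dst)
        return cv2.warpPerspective(frame, H, (w, h))

    frame = np.zeros((1080, 1920, 3), np.uint8)
    warped = keystone_prewarp(frame, [[120, 60], [1800, 30], [1830, 1050], [90, 1020]])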
  • FIG. 9A shows a projection state in which the projection optical axis passes through the center of the lower side of the projection area in launch projection
  • FIG. 9B shows the projection video area before and after distortion correction.
  • the projection center axis 110 does not coincide with the projection optical axis 109.
  • The upper angle is θT and the lower angle is θB.
  • The upper angle is θ′T and the lower angle is θ′B.
  • The projection center axis 110 and the projection plane perpendicular to the projection center axis 110 are obtained, and the inclination of that projection plane with respect to the projection plane obtained when the projector 201 and the screen face each other is determined.
  • The view angle symmetrization unit 10 supplies the inclination of the projection plane to the distortion correction coefficient calculation unit 15, the distortion correction coefficient calculation unit 15 calculates a distortion correction coefficient based on that inclination, and the distortion correction unit 13 performs distortion correction based on the distortion correction coefficient so that the projected image on the projection plane becomes square. By this distortion correction, the projection video area 122 is corrected to the projection video area 123 as shown in FIG. 9B, and as a result, the projection area 103 is corrected to the projection area 121 as shown in FIG. 9A.
  • FIG. 10A shows a projection state when the lens is shifted in the upper left direction
  • FIG. 10B shows a projected video area before and after distortion correction.
  • The upper angle is θ′T and the lower angle is θ′B, and the right angle is θ′R and the left angle is θ′L. Because θ′T ≠ θ′B, the vertical angle of view is not vertically symmetrical, and because θ′R ≠ θ′L, the horizontal angle of view is not horizontally symmetrical.
  • The view angle symmetrization unit 10 supplies the inclination of the projection plane to the distortion correction coefficient calculation unit 15, the distortion correction coefficient calculation unit 15 calculates a distortion correction coefficient based on that inclination, and the distortion correction unit 13 performs distortion correction based on the distortion correction coefficient so that the projected image on the projection plane becomes square. By this distortion correction, the projection video area 122 is corrected to the projection video area 123 as shown in FIG. 10B, and as a result, the projection area 103 is corrected to the projection area 121 as shown in FIG. 10A.
  • In step S12, the communication control unit 1 of the projector 201 determines whether or not a mapping data generation start instruction has been received. After adjusting the position, projection direction, zoom position, and lens shift position of the projector 201 so that the projected image falls on the three-dimensional object that is the projection target, the user performs an input operation on the information processing device 205 to instruct the start of mapping data generation. In response to this input operation, a mapping data generation start instruction is supplied from the information processing device 205 to the projector 201. When the mapping data generation start instruction is received, the communication control unit 1 determines in steps S13 and S14 whether the mapping mode parameter is set to the two-dimensional space mapping mode or the three-dimensional space mapping mode.
  • When the mapping mode parameter is set to the two-dimensional space mapping mode, the communication control unit 1 causes the mapping data generation unit 7 to generate the two-dimensional data file 32 in step S15.
  • When the mapping mode parameter is set to the three-dimensional space mapping mode, the communication control unit 1 causes the mapping data generation unit 7 to generate the three-dimensional data file 33 and causes the projector data generation unit 6 to generate the projector data file 31. If the no-mapping mode is set in the mapping mode parameter, the mapping data creation process is terminated without generating any data file.
  • The mapping data generation unit 7 executes the generation process of the three-dimensional data file 33 in accordance with the mapping data generation start instruction from the communication control unit 1, and the projector data generation unit 6 executes the generation process of the projector data file 31.
  • FIG. 11 shows a procedure for generating the three-dimensional data file 33.
  • the three-dimensional sensor unit 20 measures a three-dimensional object three-dimensionally and outputs three-dimensional position data. This three-dimensional position data is represented by point group data indicated by three-dimensional coordinates in the coordinate system of the three-dimensional sensor 108.
  • In step S21, the three-dimensional position data projector coordinate conversion unit 22 acquires the rotation amount, the translation amount, the reference zoom position, and the reference lens shift position from the calibration data storage unit 21, acquires the current zoom position and lens shift position from the projection lens unit 14, and acquires the design data from the projector projection design data storage unit 9.
  • Based on the rotation amount, the translation amount, the reference zoom position, the reference lens shift position, the current zoom position and lens shift position, and the design data, the three-dimensional position data projector coordinate conversion unit 22 converts the coordinate system of the three-dimensional position data from the three-dimensional sensor unit 20 into the coordinate system of the projector 201 with the projection center point as the origin.
  • FIG. 12 schematically shows the positional relationship between the coordinate system of the three-dimensional sensor 108 and the coordinate system of the projector 201.
  • The origin of the coordinate system of the three-dimensional sensor 108 does not coincide with the origin (projection center point) of the coordinate system of the projector 201, and the detection direction of the three-dimensional sensor 108 does not coincide with the projection direction of the projector 201. For this reason, the coordinate system of the three-dimensional position data measured by the three-dimensional sensor 108 differs from the coordinate system of the projector 201.
  • the coordinate transformation in step S21 can be defined by a rotation amount with respect to the three coordinate axes (XYZ) and a translation amount representing movement with respect to the three coordinate axes (XYZ). Obtaining the amount of rotation and the amount of translation is called calibration.
  • The projection center point that is the origin of the projector coordinate system is not fixed; it moves with zoom and lens shift operations. For this reason, the zoom position and the lens shift position at the time of calibration are stored in the calibration data storage unit 21 as the reference zoom position and the reference lens shift position, respectively.
  • In the coordinate conversion, the three-dimensional position data projector coordinate conversion unit 22 first obtains the coordinates of the reference projection center point at the time of calibration, using the design data, the reference zoom position, and the reference lens shift position. Next, it obtains the coordinates of the current projection center point based on the current zoom position and lens shift position, and then obtains the translation amount for the coordinate transformation from the coordinates of the reference projection center point to the coordinates of the current projection center point.
  • The three-dimensional position data projector coordinate conversion unit 22 converts the three-dimensional position data from the three-dimensional sensor unit 20 based on the rotation amount and the translation amount stored in the calibration data storage unit 21, and then shifts the coordinates by the translation amount from the reference projection center point to the current projection center point obtained from the current zoom position and lens shift position. In this way, the coordinate system of the three-dimensional position data from the three-dimensional sensor unit 20 is converted into a projector coordinate system with the current projection center point as the origin.
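  • To make the two-stage conversion concrete, the following is a minimal sketch under stated assumptions: a single calibration rotation and translation map sensor coordinates into the projector frame, and the origin is then re-based from the reference projection center to the current one. The function and variable names (sensor_to_projector, ref_center, cur_center) are illustrative, not taken from the embodiment.

      import numpy as np

      def sensor_to_projector(points_sensor, R_calib, t_calib, ref_center, cur_center):
          """Convert a 3D-sensor point cloud into the projector coordinate system.

          points_sensor : (N, 3) point cloud in the 3D sensor's coordinate system
          R_calib       : (3, 3) rotation from sensor axes to projector axes (calibration)
          t_calib       : (3,)  translation from the sensor origin to the reference
                          projection center point (calibration)
          ref_center    : (3,)  reference projection center (at the calibration
                          zoom / lens-shift position), in projector coordinates
          cur_center    : (3,)  current projection center derived from the current
                          zoom and lens-shift positions and the design data
          """
          # Rigid transform measured during calibration: sensor -> reference projector frame.
          points_proj = points_sensor @ R_calib.T + t_calib
          # Re-origin the data on the current projection center point,
          # which moves when zoom or lens shift changes.
          return points_proj - (cur_center - ref_center)

      # Example: a point 2 m in front of the sensor, projector shifted 5 cm by lens shift.
      pts = np.array([[0.0, 0.0, 2.0]])
      print(sensor_to_projector(pts, np.eye(3), np.array([0.10, 0.05, 0.0]),
                                ref_center=np.zeros(3),
                                cur_center=np.array([0.0, 0.05, 0.0])))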
  • In step S22, the three-dimensional position data world coordinate conversion unit 23 converts the three-dimensional position data, which has been coordinate-converted into the projector coordinate system, into a world coordinate system referenced to the horizontal plane, based on the inclination of the projector with respect to the horizontal plane detected by the attitude sensor unit 11.
  • The projector 201 may be installed tilted in the front-rear or left-right direction, or installed upside down. The installation state of the projector 201 includes various states such as a horizontal state 124, an upside-down state 125, an upward state 126, a downward state 127, a right-rotation state 128, and a left-rotation state 129. In any of these states, the three-dimensional position data world coordinate conversion unit 23 performs the conversion into the world coordinate system referenced to the horizontal plane, as shown in FIG. 12, based on the inclination from the horizontal detected by the attitude sensor unit 11.
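  • One simplified way to picture this world-coordinate conversion is to undo the projector's pitch and roll reported by an attitude sensor so that the vertical axis of the data becomes perpendicular to the ground. The two-angle model, the axis conventions, and the rotation order in the sketch below are assumptions made for illustration, not the embodiment's exact procedure.

      import numpy as np

      def projector_to_world(points_proj, pitch_deg, roll_deg):
          """Re-level projector-coordinate points into a horizontal-plane world frame.

          pitch_deg : forward/backward tilt reported by the attitude sensor
          roll_deg  : left/right tilt reported by the attitude sensor
          """
          p, r = np.radians([pitch_deg, roll_deg])
          # Rotation about the X axis (pitch).
          Rx = np.array([[1, 0, 0],
                         [0, np.cos(p), -np.sin(p)],
                         [0, np.sin(p),  np.cos(p)]])
          # Rotation about the Z axis (roll).
          Rz = np.array([[np.cos(r), -np.sin(r), 0],
                         [np.sin(r),  np.cos(r), 0],
                         [0, 0, 1]])
          R_tilt = Rz @ Rx      # projector orientation produced by the measured tilt
          R_inv = R_tilt.T      # inverse rotation re-levels the data
          return points_proj @ R_inv.T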
  • A projection mapping video creation tool for three-dimensional space can create not only planar videos but also three-dimensional projection mapping videos that represent a stereoscopic image on the surfaces of a three-dimensional object. In that case, a viewpoint is set at a three-dimensional position, and the video for each surface of the three-dimensional object is created so that, seen from that viewpoint, the object appears as the intended stereoscopic image. Since the viewpoint corresponds to the eye of a person standing on the ground, the data must be converted into a world coordinate system referenced to a horizontal plane parallel to the ground before the viewpoint can be set.
  • In step S23, the vertical offset amount calculation unit 24 calculates a vertical offset amount such that all vertical coordinates of the three-dimensional position data become 0 or more, and the three-dimensional position data vertical offset unit 25 adds this vertical offset amount to all vertical coordinates of the three-dimensional position data. Specifically, the vertical offset amount calculation unit 24 obtains the minimum value of the vertical coordinates of the three-dimensional position data; if the minimum is negative, it outputs the absolute value of the minimum as the vertical offset amount, and if the minimum is 0 or positive, it outputs 0. In step S24, the three-dimensional position data segmentation unit 26 detects, from the three-dimensional position data, the surfaces of the projection-target three-dimensional object and the ridge lines indicating their outer shapes, based on the segmentation parameters stored in the parameter storage unit 3.
  • FIG. 14 shows an example of the segmentation process.
  • The three-dimensional position data segmentation unit 26 first obtains a normal vector 132 for each point of the point cloud data 130 of the three-dimensional position data.
  • The normal vector 132 is a composite vector of the normal vectors 131 of the triangular surfaces formed by the point of interest and its neighboring points.
  • The three-dimensional position data segmentation unit 26 then compares the normal vectors of adjacent points, and if the difference is smaller than the threshold set by the segmentation parameter, it treats the points as belonging to the same surface. In this way, the surfaces 133 and 134 can be extracted.
  • The three-dimensional position data segmentation unit 26 then calculates an edge line 135, which is the intersection line between the adjacent surfaces 133 and 134.
  • In this example the segmentation parameter is a threshold on the difference between normal vectors, but its meaning differs depending on the segmentation method.
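  • For illustration, a toy version of such normal-based segmentation on an organized (grid-ordered) point cloud is sketched below; the neighbor handling, the flood fill, and the fixed angular threshold are simplifying assumptions made for the example, not the specific algorithm of the embodiment.

      import numpy as np

      def estimate_normals(grid_pts):
          """grid_pts: (H, W, 3) organized point cloud from a depth sensor."""
          # Tangent vectors from neighboring samples, then their cross product.
          du = np.gradient(grid_pts, axis=1)
          dv = np.gradient(grid_pts, axis=0)
          n = np.cross(du, dv)
          return n / (np.linalg.norm(n, axis=2, keepdims=True) + 1e-12)

      def segment_by_normals(grid_pts, angle_threshold_deg=10.0):
          """Group neighboring points whose normals differ by less than the threshold."""
          normals = estimate_normals(grid_pts)
          H, W, _ = grid_pts.shape
          labels = -np.ones((H, W), dtype=int)
          cos_thr = np.cos(np.radians(angle_threshold_deg))
          next_label = 0
          for i in range(H):
              for j in range(W):
                  if labels[i, j] >= 0:
                      continue
                  # Flood-fill one region of consistent normals (one detected surface).
                  stack = [(i, j)]
                  labels[i, j] = next_label
                  while stack:
                      y, x = stack.pop()
                      for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                          v, u = y + dy, x + dx
                          if 0 <= v < H and 0 <= u < W and labels[v, u] < 0 \
                                  and np.dot(normals[y, x], normals[v, u]) > cos_thr:
                              labels[v, u] = next_label
                              stack.append((v, u))
                  next_label += 1
          # Ridge (edge) lines lie where the surface label changes between neighbors.
          return labels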
  • The three-dimensional position data polygon meshing unit 29 converts the surfaces detected by the three-dimensional position data segmentation unit 26 into polygon meshes, based on the polygon meshing parameters and the file format parameters stored in the parameter storage unit 3.
  • The polygons that can be handled differ depending on the format of the three-dimensional data file to be generated: for example, some formats can handle only triangular polygons, some can handle triangular and quadrangular polygons, and others can handle general polygons.
  • The three-dimensional position data polygon meshing unit 29 therefore performs the polygonization using a file format parameter that specifies the polygon shape and a polygon meshing parameter that specifies the coarseness of the polygons. Making the polygons finer expresses curved surfaces more smoothly, but increases the amount of computation in the projection mapping video creation tool.
  • FIG. 15A shows an example in which a polygon mesh is formed using both triangular and quadrangular polygons, and FIG. 15B shows an example in which a polygon mesh is formed using only triangular polygons.
  • The virtual three-dimensional object 303, whose polygon mesh is formed only of triangular polygons, has finer polygons than the virtual three-dimensional object 302.
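  • As a small example of adapting the mesh to the output format, the sketch below splits quadrangular faces into triangles for a format that accepts only triangular polygons; the fan triangulation shown is just one possible choice and is an assumption made for illustration.

      def to_triangles(faces):
          """Split quad faces into triangles for file formats that accept only triangles.

          faces: list of vertex-index tuples, each of length 3 or 4.
          """
          tris = []
          for f in faces:
              if len(f) == 3:
                  tris.append(tuple(f))
              elif len(f) == 4:
                  a, b, c, d = f
                  # Fan triangulation of the quad: (a, b, c) and (a, c, d).
                  tris.extend([(a, b, c), (a, c, d)])
              else:
                  raise ValueError("only triangles and quads are handled in this sketch")
          return tris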
  • Based on the file format parameter, the three-dimensional data file generation unit 30 generates a three-dimensional data file 33 in the designated file format, either from the three-dimensional position data from the three-dimensional position data vertical offset unit 25 or from the polygon-meshed three-dimensional position data from the three-dimensional position data polygon meshing unit 29.
  • The three-dimensional data file generation unit 30 also calculates information such as the normal vector of each polygon vertex and the normal vector of each mesh surface, and generates a three-dimensional data file 33 that includes this information.
  • When the three-dimensional data file 33 is generated using the three-dimensional position data from the three-dimensional position data vertical offset unit 25, the three-dimensional data file 33 contains the raw data of the measured three-dimensional positions in the world coordinate system.
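  • For concreteness, the sketch below writes such a mesh, with per-face normal vectors, to a Wavefront OBJ file. OBJ is used here only as one familiar example of a 3D file format; the embodiment does not prescribe it, and the helper name write_obj is an assumption for the example.

      import numpy as np

      def write_obj(path, vertices, faces):
          """Write a polygon mesh with per-face normals to a Wavefront OBJ file.

          vertices : (N, 3) array of 3D positions (world coordinates)
          faces    : list of vertex-index tuples (0-based)
          """
          with open(path, "w") as f:
              for v in vertices:
                  f.write("v {:.6f} {:.6f} {:.6f}\n".format(*v))
              for face in faces:
                  a, b, c = (vertices[i] for i in face[:3])
                  n = np.cross(b - a, c - a)          # face normal from the first three vertices
                  n = n / (np.linalg.norm(n) + 1e-12)
                  f.write("vn {:.6f} {:.6f} {:.6f}\n".format(*n))
              for k, face in enumerate(faces, start=1):
                  # OBJ indices are 1-based; reuse normal k for every vertex of face k.
                  f.write("f " + " ".join(f"{i + 1}//{k}" for i in face) + "\n")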
  • FIG. 16 shows a procedure for generating the projector data file 31.
  • In step S30, the initial projector data generation unit 16 sets the origin of the projector coordinate system, that is, the projection center point of the projector, at coordinates (0, 0, 0) as the projector position coordinates, and generates initial projector data including the projection direction vector, the angle of view, and the projector position coordinates.
  • FIG. 17 shows an example of the projection direction vector, the vertical and horizontal field angles, and the projection center coordinates when no field angle symmetrization is selected by the field angle symmetrization selection parameter.
  • In this example, the projection optical axis 109 passes through the center of the lower end of the projection area 103.
  • The lower field angle θB is therefore 0 and does not match the upper field angle θ′T, but the projection optical axis 109 is nonetheless regarded as the projection center axis 110.
  • The origin of the projector coordinate system, that is, the projection center point 102 of the projector 201 at coordinates (0, 0, 0), is set as the coordinates of the projector position 138.
  • The projection direction vector 136 is the direction of the projection optical axis 109 (its horizontal and vertical components are both 0).
  • FIG. 18 shows an example of the projection direction vector, the vertical and horizontal field angles, and the projection center coordinates when field angle symmetrization is selected by the field angle symmetrization selection parameter.
  • In this case, the vertical field angle within the angle of view 137 is not vertically symmetric about the projection optical axis, so the projection direction vector 136 is a vector indicating the direction about which the upper and lower field angles become symmetric and the left and right field angles become symmetric.
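  • As an illustration only: given asymmetric half-angles measured from the projection optical axis, the symmetrized direction bisects them and the symmetric half-angles are their averages. The sketch below, with its axis conventions (x right, y up, z along the optical axis) and function name chosen for the example, is an assumption-laden sketch rather than the embodiment's actual computation.

      import numpy as np

      def symmetrize_view_angles(theta_top, theta_bottom, theta_left, theta_right):
          """Return a projection direction vector and symmetric half-angles (degrees).

          The input half-angles are measured from the projection optical axis
          (positive up / down / left / right). For a typical launch projection,
          theta_bottom is 0 and theta_top is the full vertical angle of view.
          """
          # Tilt of the symmetrizing axis relative to the optical axis.
          tilt_v = (theta_top - theta_bottom) / 2.0   # positive = axis tilted upward
          tilt_h = (theta_right - theta_left) / 2.0   # positive = axis tilted to the right
          # Direction vector in projector coordinates (x right, y up, z forward).
          direction = np.array([np.tan(np.radians(tilt_h)),
                                np.tan(np.radians(tilt_v)),
                                1.0])
          direction /= np.linalg.norm(direction)
          sym_vertical = (theta_top + theta_bottom) / 2.0     # half-angle above / below the new axis
          sym_horizontal = (theta_left + theta_right) / 2.0   # half-angle left / right of the new axis
          return direction, sym_vertical, sym_horizontal

      # Launch projection example: 30 degrees above the axis, 0 below, 20 left and right.
      print(symmetrize_view_angles(30.0, 0.0, 20.0, 20.0))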
  • In step S31, the projector data world coordinate conversion unit 17 converts the projection direction vector generated by the initial projector data generation unit 16 into the world coordinate system referenced to the horizontal plane, based on the inclination of the projector with respect to the horizontal plane detected by the attitude sensor unit 11, in the same manner as the processing in the three-dimensional position data world coordinate conversion unit 23.
  • In step S32, the projector data vertical offset unit 18 adds the vertical offset amount calculated by the vertical offset amount calculation unit 24 to the vertical coordinate of the projector position coordinates, in the same manner as the processing in the three-dimensional position data vertical offset unit 25, thereby adjusting the vertical coordinate.
  • In step S33, the projector file generation unit 19 generates the projector data file 31.
  • A three-dimensional space projection mapping video creation tool constructs the three-dimensional object in a virtual three-dimensional space based on the contents of the three-dimensional data file 33, assigns a video to each surface of the three-dimensional object, and reproduces it.
  • A rendering camera is then set according to the contents of the projector data file 31, and rendering is performed to create the projection video. If this projection video is projected from the projector, projection mapping can be performed with the video exactly matching the outer shape of the three-dimensional object.
  • Alternatively, when a rendering camera is set at the viewpoint position in order to set illumination and reproduce lighting conditions, a projector is set up in the virtual space according to the contents of the projector data file 31, the rendering camera is placed at the viewpoint position, and rendering is performed; the result is then converted into a projector projection image using the positional relationship with the three-dimensional object, to create the projection video. If this projection video is projected from the projector, projection mapping can likewise be performed with the video exactly matching the outer shape of the three-dimensional object.
  • In the two-dimensional space mapping mode, the mapping data generation unit 7 executes the generation process of the two-dimensional data file 32 in accordance with the mapping data generation start instruction from the communication control unit 1, and the projector data generation unit 6 executes its processing up to the projector data vertical offset unit 18.
  • FIG. 19 shows a procedure for generating the two-dimensional data file 32.
  • In step S40, three-dimensional position data is acquired; in step S41, the coordinate system of the three-dimensional position data is converted into the projector coordinate system; in step S42, it is converted into the world coordinate system; in step S43, the vertical offset is applied; and in step S44, the surfaces and ridge lines of the three-dimensional object are detected. The processes in steps S40 to S44 are the same as the processes in steps S20 to S24 shown in FIG. 11.
  • Meanwhile, the projector data generation unit 6 performs the processing of steps S30 to S33 shown in FIG. 16.
  • In step S45, the perspective projection unit 27 generates two-dimensional data by perspectively projecting the ridge lines in three-dimensional space detected by the three-dimensional position data segmentation unit 26 onto the projection plane of the projector, based on the projector data from the projector data vertical offset unit 18.
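  • A bare-bones sketch of this kind of perspective projection is given below: ridge-line points in world coordinates are expressed in the projector's frame and divided by depth, using a pinhole model with symmetric half-angles. The basis construction, the up-vector choice, and the normalized output range are assumptions made for the example.

      import numpy as np

      def perspective_project(points_world, projector_pos, direction,
                              v_halfangle_deg, h_halfangle_deg):
          """Project 3D ridge-line points onto the projector's image plane.

          points_world    : (N, 3) points in world coordinates
          projector_pos   : (3,) projector position (projection center point)
          direction       : (3,) unit projection direction vector
          *_halfangle_deg : symmetric half-angles of the field of view
          Returns (N, 2) coordinates normalized to [-1, 1] over the projected image.
          """
          # Build an orthonormal camera basis around the projection direction.
          z = direction / np.linalg.norm(direction)
          up = np.array([0.0, 1.0, 0.0])
          x = np.cross(up, z); x /= np.linalg.norm(x)
          y = np.cross(z, x)
          # Express the points in the projector's frame.
          rel = points_world - projector_pos
          cam = rel @ np.stack([x, y, z], axis=1)
          # Pinhole projection, scaled so the field-of-view edges map to +/- 1.
          u = (cam[:, 0] / cam[:, 2]) / np.tan(np.radians(h_halfangle_deg))
          v = (cam[:, 1] / cam[:, 2]) / np.tan(np.radians(v_halfangle_deg))
          return np.stack([u, v], axis=1)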
  • In step S46, the two-dimensional data file generation unit 28 generates a two-dimensional data file 32 in the designated file format from the two-dimensional data generated by the perspective projection unit 27, based on the file format parameters stored in the parameter storage unit 3.
  • A two-dimensional space projection mapping video creation tool allocates a video to each surface based on the outer-shape positions of the surfaces of the three-dimensional object given by the two-dimensional data, and creates a projection video. If this projection video is projected from the projector, projection mapping can be performed with the video matching the outer shape of the three-dimensional object.
  • As described above, in the present embodiment the three-dimensional sensor 108 measures the three-dimensional object three-dimensionally, and the resulting three-dimensional position data (the positions and ridge lines of the projection surfaces on the three-dimensional object) is coordinate-converted into the coordinate system of the projector. Three-dimensional position data of the three-dimensional object in the projector coordinate system can thereby be obtained. Further, two-dimensional data indicating the position and outer shape of each projection surface on the three-dimensional object is created using the coordinate-converted three-dimensional position data. A video for two-dimensional projection mapping can then be created by assigning a video to each projection surface on the three-dimensional object based on the two-dimensional data.
  • Since the video for two-dimensional projection mapping is created based on the three-dimensional position data of the three-dimensional object in the coordinate system of the projector, when the video is projected from the projector, the range of the projected video corresponds to the projection surfaces of the three-dimensional object.
  • In the present embodiment, three-dimensional data indicating the position and outer shape of each projection surface on the three-dimensional object is also generated, together with projector data including a projection direction vector, an angle of view, and projector position coordinates.
  • A three-dimensional space projection mapping video creation tool can construct the three-dimensional object in a virtual three-dimensional space, assign a video to each surface of the virtual three-dimensional object and reproduce it, set a rendering camera based on the projector data, and perform rendering to create a video for three-dimensional projection mapping. Since the video for three-dimensional projection mapping is also created based on the three-dimensional position data of the three-dimensional object in the projector coordinate system, when the video is projected from the projector, the range of the projected video corresponds to the projection surfaces of the three-dimensional object.
  • When a rendering camera is set at the viewpoint position in order to set illumination and reproduce lighting conditions, the projector is set up in the virtual space according to the contents of the projector data file; in this case as well, the range of the projected video matches the projection surfaces of the three-dimensional object.
  • In the present embodiment, it is also possible to make the angle of view of the projector vertically and horizontally symmetric, and to generate a projector data file including a projection direction vector, an angle of view, and projector position coordinates that reflect the distortion correction. Therefore, even with a three-dimensional space projection mapping video creation tool in which only a vertically and horizontally symmetric angle of view can be set for the rendering camera or projector, a video that matches the projection surfaces of the three-dimensional object can be created. Furthermore, since the data is expressed in a world coordinate system referenced to a horizontal plane parallel to the ground, it can be used as it is in such a tool for creating a three-dimensional projection mapping video with a stereoscopic video representation.
  • Next, a problem relating to field angle symmetrization will be described with reference to FIGS. 21A to 21F.
  • In projector installation, a launch projection as shown in FIG. 21A is common.
  • the projection optical axis 109 passes through the center of the lower side of the projection area 103.
  • In this case, in the horizontal angle of view the left angle θL and the right angle θR are equal, but in the vertical angle of view the lower angle θB is 0 and does not match the upper angle θT.
  • the projection direction, the angle of view (zoom), and the position of the projector 201 are determined so that projection can be performed on the projection target surface 153 that represents one surface of the three-dimensional object.
  • the projection target surface 153 is perpendicular to the ground. Note that the position, angle of view (zoom), and direction of the projector 201 are not changed after this positioning.
  • A video for three-dimensional projection mapping is created using the three-dimensional data (the three-dimensional data of the projection target surface 153) and projector data including a position, an angle of view, and a projection direction vector. Here, the position is the coordinates of the projection center point 102, and the projector data is set as the parameters of the rendering camera 148.
  • the virtual imaging surface 151 of the rendering camera 148 is perpendicular to the optical center axis 152.
  • an image to be projected on the virtual projection target surface 154 reproduced in the virtual three-dimensional space is allocated based on the three-dimensional data.
  • the video assigned to the virtual projection target surface 154 is captured by the rendering camera 148.
  • the captured image is like a captured virtual projection target surface 155 in the virtual imaging surface 151.
  • the image created as described above is projected from the projector 201.
  • As shown in FIG. 21F, the projection center axis 110 passing through the center of the projection video 156 projected onto the projection target surface 153 is located above the view angle center axis 150. For this reason, the projection video 156 is shifted upward with respect to the projection target surface 153.
  • Furthermore, although the virtual imaging surface 151 of the rendering camera 148 is perpendicular to the optical center axis 152, the projection surface of the projection area 103 is inclined with respect to the view angle center axis 150. Because the virtual imaging surface 151 and the projection surface are not parallel, the projected video 156 is distorted.
  • With the projector of this embodiment, such shifting and distortion of the projected video can be suppressed. The principle will be described with reference to FIGS. 21G to 21K.
  • In the projector of this embodiment, a projection center axis 110 connecting the projection center point 102 and the center of the projected video, and a projection plane perpendicular to the projection center axis 110, are obtained, and a distortion correction coefficient is set accordingly.
  • After distortion correction, θ″T and θ″B are smaller than θ′T and θ′B, respectively, and θ″L and θ″R are smaller than θL and θR, respectively. For this reason, the angle of view becomes narrower than when distortion correction is not performed.
  • the projection direction and position of the projector 201 are determined so that the projection can be performed on the projection target surface 153 that indicates one surface of the three-dimensional object.
  • the projection target surface 153 is perpendicular to the ground.
  • the position, direction, and angle of view (zoom) of the projector 201 are not changed.
  • A video for three-dimensional projection mapping is created using the three-dimensional data (the three-dimensional data of the projection target surface 153) and projector data including a position, an angle of view, and a projection direction vector. Here, the position is the coordinates of the projection center point 102, and the projector data is set as the parameters of the rendering camera 148.
  • the virtual imaging surface 151 of the rendering camera 148 is perpendicular to the optical center axis 152.
  • an image to be projected on the virtual projection target surface 154 reproduced in the virtual three-dimensional space is allocated based on the three-dimensional data.
  • an image assigned to the virtual projection target surface 154 is captured by the rendering camera 148.
  • the captured image is like a captured virtual projection target surface 155 in the virtual imaging surface 151.
  • the image created as described above is projected from the projector 201.
  • In this case, the projection center axis 110 passing through the center of the projection video 156 projected onto the projection target surface 153 coincides with the view angle center axis 150.
  • The virtual imaging surface 151 of the rendering camera 148 is perpendicular to the optical center axis 152, and the projection surface of the projection area 103 is also perpendicular to the view angle center axis 150, so the projected video coincides with the projection target surface 153 and no distortion occurs.
  • the operation has been described with respect to the example of the launch projection shown in FIG. 21A.
  • FIG. 22 is a block diagram showing a configuration of a projector according to the second embodiment of the present invention.
  • the projector shown in FIG. 22 includes a user interface unit 34 and an external storage device 35 in place of the communication control unit 1 and the communication input / output unit 2, and is different from the first embodiment in this respect.
  • the user interface unit 34 is an operation unit that receives an input operation by the user and performs the same control as the communication control unit 1, and includes, for example, an on-screen display and a key input unit.
  • the external storage device 35 is a removable storage device such as a USB (Universal Serial Bus) memory or an SD card.
  • the projector data file 31, the two-dimensional data file 32, and the three-dimensional data file 33 are supplied from the file storage unit 8 to the external storage device 35.
  • the information processing apparatus can read the projector data file 31, the two-dimensional data file 32, and the three-dimensional data file 33 from the external storage device 35.
  • the projector data file 31, the two-dimensional data file 32, and the three-dimensional data file 33 can be provided to the information processing apparatus.
  • In the first embodiment, by contrast, the projector data file 31, the two-dimensional data file 32, and the three-dimensional data file 33 are sent over a communication cable.
  • A program may be provided that causes a computer to execute the processing corresponding to each unit of the projector (the projector data generation unit 6, the mapping data generation unit 7, the view angle symmetrization unit 10, the distortion correction unit 13, the distortion correction coefficient calculation unit 15, and so on). By executing the program, the computer can realize the functions corresponding to each of these units.
  • the program may be provided in a computer usable or computer readable medium, or may be provided through a network such as the Internet.
  • the computer usable or computer readable medium includes a medium capable of recording or reading information using magnetism, light, electronic, electromagnetic, infrared, or the like.
  • Such media include, for example, semiconductor memories, semiconductor or solid-state storage devices, magnetic tape, removable computer diskettes, random access memory (RAM), read-only memory (ROM), magnetic disks, optical disks, and magneto-optical disks.
  • FIG. 23 is a block diagram showing a configuration of a projector according to the third embodiment of the present invention.
  • the projector includes a display element 400, a projection lens 401, a reception unit 402, and a mapping data generation unit 403.
  • the display element 400 includes an image forming surface including a plurality of pixels.
  • the projection lens 401 projects an image formed on the image forming surface. An image is projected from the projection lens 401 toward a three-dimensional object.
  • the accepting unit 402 accepts a data creation request.
  • the mapping data generation unit 403 generates mapping data for creating a projection mapping video in response to a data creation request.
  • the mapping data generation unit 403 includes a three-dimensional sensor unit 404, a coordinate conversion unit 406, and a data generation unit 407.
  • the three-dimensional sensor unit 404 includes, for example, a three-dimensional sensor 405 that is arranged in a direction in which an image is projected.
  • the three-dimensional sensor unit 404 three-dimensionally measures a three-dimensional object with the three-dimensional sensor 405, and outputs three-dimensional position data that represents the position and shape of each surface on which the image on the three-dimensional object is projected with three-dimensional coordinates. .
  • The coordinate conversion unit 406 converts the coordinate system of the three-dimensional position data into a projector coordinate system in which the projection area of the image is represented in three-dimensional coordinates, with the point where the principal rays from the pixels located at diagonally opposite corners of the image forming surface intersect taken as the origin. Based on the three-dimensional position data coordinate-converted by the coordinate conversion unit 406, the data generation unit 407 acquires, for each surface of the three-dimensional object onto which the image is projected, the position of the surface and a ridge line indicating the outer shape of the surface, and generates the mapping data (two-dimensional data or three-dimensional data of the three-dimensional object) based on the position and the ridge line.
  • The projector according to the present embodiment may further include a zoom mechanism that can move at least some of the plurality of lenses constituting the projection lens 401 in the optical axis direction of the projection lens 401, a lens shift mechanism that can move them in a shift direction orthogonal to the optical axis direction, and a first look-up table showing how the position where the principal rays intersect changes with the zoom position and the lens shift position.
  • The coordinate conversion unit 406 may refer to the first look-up table to determine the position where the principal rays intersect from the current zoom position of the zoom mechanism and the current lens shift position of the lens shift mechanism.
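  • As one way such a look-up table could be consulted, the sketch below bilinearly interpolates a small table indexed by zoom position and lens-shift position; the table layout and the interpolation scheme are assumptions made for illustration, not details specified by the embodiment.

      import numpy as np

      def lookup_projection_center(table, zoom_positions, shift_positions, zoom, shift):
          """Interpolate the principal-ray intersection point for the current zoom / lens shift.

          table           : (Z, S, 3) sampled offsets of the principal-ray intersection point
          zoom_positions  : (Z,) increasing zoom positions at which the table was sampled
          shift_positions : (S,) increasing lens-shift positions at which it was sampled
          """
          # Convert the physical positions to fractional indices into the grid.
          zi = np.interp(zoom, zoom_positions, np.arange(len(zoom_positions)))
          si = np.interp(shift, shift_positions, np.arange(len(shift_positions)))
          z0, s0 = int(np.floor(zi)), int(np.floor(si))
          z1 = min(z0 + 1, len(zoom_positions) - 1)
          s1 = min(s0 + 1, len(shift_positions) - 1)
          fz, fs = zi - z0, si - s0
          # Bilinear blend of the four surrounding table entries.
          top = (1 - fs) * table[z0, s0] + fs * table[z0, s1]
          bot = (1 - fs) * table[z1, s0] + fs * table[z1, s1]
          return (1 - fz) * top + fz * bot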
  • The coordinate conversion unit 406 may convert the coordinate system of the three-dimensional position data into the projector coordinate system and then further convert it into a world coordinate system referenced to the horizontal plane.
  • The projector according to the present embodiment may further include: a second look-up table indicating how the angle of view, which expresses the projection range of the image as an angle, changes with the zoom position and the lens shift position; an angle-of-view symmetrizing unit that refers to the second look-up table to acquire the current angle of view from the current zoom position of the zoom mechanism and the current lens shift position of the lens shift mechanism, sets a projection center axis representing the central ray of the projection light beam from the projection lens 401 based on that angle of view, and outputs angle-of-view symmetrization information including a projection direction vector indicating the direction of the projection center axis and the current angle of view; and a projector data generation unit that generates projector data including projector position coordinates, which are the coordinates of the origin of the projector coordinate system, together with the projection direction vector and the current angle of view indicated by the angle-of-view symmetrization information.
  • The projector data generation unit may convert the projection direction vector into the world coordinate system referenced to the horizontal plane.
  • the data generation unit 407 may generate two-dimensional data by perspectively projecting the ridge line on the projection surface based on the projector data.
  • the field angle symmetrizing unit may output projection plane tilt information indicating the tilt of the plane perpendicular to the projection center axis with respect to the plane perpendicular to the optical axis of the projection lens 401.
  • The projector may further include a distortion correction coefficient calculation unit that calculates, based on the projection plane tilt information, a distortion correction coefficient for correcting distortion of the image projected onto the projection plane, and a distortion correction unit that performs distortion correction on the input video signal based on the distortion correction coefficient; an image based on the distortion-corrected video signal may then be formed on the image forming surface of the display element 400.
  • The data generation unit 407 may create a two-dimensional data file in which the generated two-dimensional data is stored in a predetermined file format.
  • the projector may further include output means for outputting a two-dimensional data file.
  • Similarly, the data generation unit 407 may create a three-dimensional data file in which the generated three-dimensional data is stored in a predetermined file format, and may generate the three-dimensional data by converting each surface of the three-dimensional object into a polygon mesh. In this case, the projector may further include output means for outputting the three-dimensional data file. In the projector according to the present embodiment, the mapping data generation unit 403 may generate a three-dimensional data file in which the three-dimensional data is stored in a predetermined file format, and the projector data generation unit may generate the projector data as a file in a predetermined format.
  • the output means may be a communication means capable of mutual communication with an external information processing apparatus or a removable storage means.
  • A mapping data creation method according to the present embodiment generates mapping data for creating a projection mapping video in response to a mapping data creation request. In this method, the three-dimensional sensor unit 404 measures the three-dimensional object three-dimensionally using the three-dimensional sensor 405 arranged in the direction in which the image is projected, and outputs three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface on which the image on the three-dimensional object is projected. The coordinate conversion unit 406 converts the coordinate system of the three-dimensional position data into a projector coordinate system in which the projection area of the image is expressed in three-dimensional coordinates, with the point where the principal rays from the pixels located at diagonally opposite corners of the image forming surface intersect as the origin. Based on the coordinate-converted three-dimensional position data, the data generation unit 407 obtains, for each surface of the three-dimensional object, the position of the surface and a ridge line indicating the outer shape of the surface, and generates two-dimensional data or three-dimensional data of the three-dimensional object based on the position and the ridge line.
  • A program according to the present embodiment causes a computer to execute mapping data generation processing that generates mapping data in response to a mapping data creation request. The mapping data generation processing includes: a process of measuring the three-dimensional object three-dimensionally using the three-dimensional sensor 405 arranged in the direction in which the image is projected, and outputting three-dimensional position data representing, in three-dimensional coordinates, the position and shape of each surface on which the image on the three-dimensional object is projected; a process of converting the coordinate system of the three-dimensional position data into a projector coordinate system in which the projection area of the image is expressed in three-dimensional coordinates, with the point where the principal rays from the pixels located at diagonally opposite corners of the image forming surface intersect as the origin; and a process of obtaining, for each surface of the three-dimensional object, the position of the surface and a ridge line indicating the outer shape of the surface based on the coordinate-converted three-dimensional position data, and generating two-dimensional data or three-dimensional data of the three-dimensional object based on the position and the ridge line.
  • the program may be provided on a computer-usable or computer-readable medium, or may be provided via a network such as the Internet.
  • the computer usable or computer readable medium includes a medium capable of recording or reading information using magnetism, light, electronic, electromagnetic, infrared, or the like.
  • Such media include, for example, semiconductor memories, semiconductor or solid-state storage devices, magnetic tape, removable computer diskettes, random access memory (RAM), read-only memory (ROM), magnetic disks, optical disks, and magneto-optical disks.


Abstract

The invention relates to the generation of mapping data with which it is possible to create a projection mapping video that matches the outer shape of each surface of a solid object. A projector having a mapping data generation unit (403) generates data for mapping in response to a data creation request. The mapping data generation unit comprises: a three-dimensional sensor (405) for measuring a solid object in three dimensions and outputting three-dimensional position data that represents, in three-dimensional coordinates, the position and shape of each surface of the solid object onto which an image is projected; a coordinate conversion unit (406) for converting the coordinate system of the three-dimensional position data into a projector coordinate system in which an image projection area is represented in three-dimensional coordinates, with a point where principal rays from pixels located at opposite corners of an image forming surface intersect serving as the origin; and a data generation unit (407) for acquiring the position of each surface of the solid object and a ridge line indicating the outer shape of each surface, and generating the data for mapping.
PCT/JP2017/010700 2017-03-16 2017-03-16 Projecteur, procédé de création de données pour mappage, programme et système de mappage par projection WO2018167918A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2019505625A JP6990694B2 (ja) 2017-03-16 2017-03-16 プロジェクタ、マッピング用データ作成方法、プログラム及びプロジェクションマッピングシステム
PCT/JP2017/010700 WO2018167918A1 (fr) 2017-03-16 2017-03-16 Projecteur, procédé de création de données pour mappage, programme et système de mappage par projection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/010700 WO2018167918A1 (fr) 2017-03-16 2017-03-16 Projecteur, procédé de création de données pour mappage, programme et système de mappage par projection

Publications (1)

Publication Number Publication Date
WO2018167918A1 true WO2018167918A1 (fr) 2018-09-20

Family

ID=63523477

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/010700 WO2018167918A1 (fr) 2017-03-16 2017-03-16 Projecteur, procédé de création de données pour mappage, programme et système de mappage par projection

Country Status (2)

Country Link
JP (1) JP6990694B2 (fr)
WO (1) WO2018167918A1 (fr)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001061121A (ja) * 1999-08-23 2001-03-06 Nec Corp プロジェクタ装置
JP2001320652A (ja) * 2000-05-11 2001-11-16 Nec Corp プロジェクタ装置
JP2002062842A (ja) * 2000-08-11 2002-02-28 Nec Corp 投射映像補正システム及びその方法
JP2005291839A (ja) * 2004-03-31 2005-10-20 Brother Ind Ltd 投影装置および3次元形状検出装置
JP2012078490A (ja) * 2010-09-30 2012-04-19 Sanyo Electric Co Ltd 投写型映像表示装置及び画像調整方法
JP2013098712A (ja) * 2011-10-31 2013-05-20 Sanyo Electric Co Ltd 投写型映像表示装置および画像調整方法

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021021810A (ja) * 2019-07-26 2021-02-18 sPods株式会社 投影システム、表示制御装置、投影方法
CN113630588A (zh) * 2020-05-08 2021-11-09 精工爱普生株式会社 图像投射系统的控制方法和图像投射系统
CN113630588B (zh) * 2020-05-08 2023-04-21 精工爱普生株式会社 图像投射系统的控制方法和图像投射系统
US12185031B2 (en) 2020-10-13 2024-12-31 Panasonic Intellectual Property Management Co., Ltd. Installation information acquisition method, correction method, program, and installation information acquisition system
CN114157700A (zh) * 2022-02-09 2022-03-08 国科星图(深圳)数字技术产业研发中心有限公司 大坝水库安全监测系统
CN114157700B (zh) * 2022-02-09 2022-04-15 国科星图(深圳)数字技术产业研发中心有限公司 大坝水库安全监测系统
JP7281576B1 (ja) 2022-03-31 2023-05-25 Kddi株式会社 映像投影システム及び映像投影方法
JP2023151126A (ja) * 2022-03-31 2023-10-16 Kddi株式会社 映像投影システム及び映像投影方法

Also Published As

Publication number Publication date
JPWO2018167918A1 (ja) 2020-05-14
JP6990694B2 (ja) 2022-01-12

Similar Documents

Publication Publication Date Title
JP6764533B2 (ja) キャリブレーション装置、キャリブレーション用チャート、チャートパターン生成装置、およびキャリブレーション方法
WO2021103347A1 (fr) Procédé, appareil, et système de correction de trapèze de projecteur, et support d'informations lisible
WO2018167918A1 (fr) Projecteur, procédé de création de données pour mappage, programme et système de mappage par projection
TWI253006B (en) Image processing system, projector, information storage medium, and image processing method
CN105308503A (zh) 利用短程相机校准显示系统的系统和方法
JP5401940B2 (ja) 投写光学系のズーム比測定方法、そのズーム比測定方法を用いた投写画像の補正方法及びその補正方法を実行するプロジェクタ
WO2018068719A1 (fr) Procédé et appareil de collage d'image
EP3547260B1 (fr) Système et procédé d'étalonnage automatique de dispositifs d'images
JP2020187358A (ja) 投影システム、投影装置及びその表示画像の校正方法
JP6494239B2 (ja) 制御装置、制御方法、及び、プログラム
JP2014178393A (ja) 画像投影システム及び画像投影方法
EP3606059B1 (fr) Procédé d'étalonnage de projecteur et système de projection l'utilisant
KR102222290B1 (ko) 혼합현실 환경의 동적인 3차원 현실데이터 구동을 위한 실사기반의 전방위 3d 모델 비디오 시퀀스 획득 방법
KR20200045176A (ko) 깊이 카메라와 비행체를 이용한 강건한 디스플레이 시스템 및 방법
JP6804056B2 (ja) 投写型表示装置、投写型表示装置の制御方法、及びプログラム
CN114286066A (zh) 投影校正方法、装置、存储介质以及投影设备
CN114615477A (zh) 全景影像中邻近镜头影像间颜色差的补偿方法
JP2016085380A (ja) 制御装置、制御方法、及び、プログラム
JP2015139087A (ja) 投影装置
US10089726B2 (en) Image processing apparatus, image processing method, and storage medium, relating to generating an image corresponding to a predetermined three-dimensional shape by transforming a captured image
TWM594322U (zh) 全向立體視覺的相機配置系統
Yang et al. Effect of field of view on the accuracy of camera calibration
KR20150058660A (ko) 이미지 처리 장치, 이의 동작 방법, 및 이를 포함하는 시스템
JP2007033087A (ja) キャリブレーション装置及び方法
TWI725620B (zh) 全向立體視覺的相機配置系統及相機配置方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17900889

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019505625

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17900889

Country of ref document: EP

Kind code of ref document: A1
