
WO2018196472A1 - Projection method, apparatus and system, and storage medium - Google Patents


Info

Publication number
WO2018196472A1
Authority
WO
WIPO (PCT)
Prior art keywords: projection, image, projected, area, devices
Application number
PCT/CN2018/076995
Other languages
English (en)
Chinese (zh)
Inventor
赵冬晓
尹志良
Original Assignee
中兴通讯股份有限公司
Application filed by 中兴通讯股份有限公司
Publication of WO2018196472A1



Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 — Details of television systems
    • H04N 5/74 — Projection arrangements for image reproduction, e.g. using eidophor
    • H04N 9/00 — Details of colour television systems
    • H04N 9/12 — Picture reproducers
    • H04N 9/31 — Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3141 — Constructional details thereof
    • H04N 9/3147 — Multi-projection systems

Definitions

  • the present disclosure relates to the field of terminal projection, and in particular, to a projection method, apparatus, system, and storage medium.
  • The specific method of using multiple projectors for spliced projection in the related art is as follows: first, the user divides the image to be projected on a computer according to the splicing requirements; secondly, the segmented images are transmitted to the plurality of projectors; then, the plurality of projectors are controlled to project the corresponding segmented images. Thus, in the prior art, the collaborative projection process of multiple projectors must be manually controlled, and the projection result also needs to be manually adjusted. When the number of projectors is large, manually controlling the projection of multiple projectors is time-consuming and labor-intensive, so the collaborative projection of multiple projectors is inefficient.
  • embodiments of the present disclosure are intended to provide a projection method, apparatus and system, and storage medium.
  • An embodiment of the present disclosure provides a projection method, including: acquiring an initial projection image formed by the initial projection of at least two projection devices; determining, from the initial projection image, at least two projection areas corresponding to the at least two projection devices, and determining the maximal inscribed rectangular area formed by the at least two projection areas as the target projection area; determining, from the image to be projected and according to the relative positional relationship between the at least two projection areas and the target projection area, the sub-image to be projected corresponding to each of the at least two projection devices; and respectively controlling the at least two projection devices to project their corresponding sub-images to be projected.
  • An embodiment of the present disclosure provides a projection apparatus, including: an acquisition module configured to acquire, via an image acquisition unit, an initial projection image formed by the initial projection of at least two projection devices; a first processing module configured to determine, from the initial projection image, at least two projection areas corresponding to the at least two projection devices, and to determine the maximal inscribed rectangular area formed by the at least two projection areas as the target projection area; a second processing module configured to determine, from the image to be projected and according to the relative positional relationship between the at least two projection areas and the target projection area, the sub-image to be projected corresponding to each of the at least two projection devices; and a control module configured to respectively control the at least two projection devices to project their corresponding sub-images to be projected.
  • An embodiment of the present disclosure provides a projection system, the system comprising: a camera configured to acquire an initial projection image formed by the initial projection of at least two projection devices; a control device configured to determine, from the initial projection image, at least two projection areas corresponding to the at least two projection devices, determine the maximal inscribed rectangular area formed by the at least two projection areas as the target projection area, determine, from the image to be projected and according to the relative positional relationship between the at least two projection areas and the target projection area, the sub-image to be projected corresponding to each of the at least two projection devices, and respectively control the at least two projection devices to project their corresponding sub-images; and at least two projection devices configured to perform the initial projection and to project the corresponding sub-images to be projected.
  • The projection method, apparatus, system, and storage medium provided by the embodiments of the present disclosure first acquire an initial projection image formed by the initial projection of at least two projection devices; secondly, the projection area corresponding to each projection device and the target projection area are determined from the initial projection image; then, according to the relative positional relationship between each projection area and the target projection area, the sub-image to be projected corresponding to each projection device is determined from the image to be projected; finally, each projection device is controlled to project its corresponding sub-image. The process requires no manual operation, and the collaborative projection of multiple projection devices is realized purely by software control, which improves the degree of intelligence of multi-projection-device collaborative projection and further improves its projection efficiency.
  • FIG. 1 is a first structural schematic view of a projection system in an embodiment of the present disclosure
  • FIG. 2 is a second schematic structural diagram of a projection system in an embodiment of the present disclosure
  • FIG. 3 is a schematic structural diagram of a main projection device in a projection system according to an embodiment of the present disclosure
  • FIG. 4 is a schematic diagram of a first flow of a projection method in an embodiment of the present disclosure
  • FIG. 5 is a schematic diagram of a first projection image of a projection method in an embodiment of the present disclosure
  • FIG. 6 is a schematic diagram of a second projection image of a projection method in an embodiment of the present disclosure.
  • FIG. 7 is a schematic diagram of a third projection image of a projection method in an embodiment of the present disclosure.
  • FIG. 8 is a second schematic flowchart of a projection method in an embodiment of the present disclosure.
  • FIG. 9 is a schematic diagram of a fourth projection image of a projection method in an embodiment of the present disclosure.
  • FIG. 10 is a third schematic flowchart of a projection method in an embodiment of the present disclosure.
  • FIG. 11 is a fourth schematic flowchart of a projection method in an embodiment of the present disclosure.
  • FIG. 12 is a schematic structural diagram of a projection apparatus in an embodiment of the present disclosure.
  • Embodiment 1:
  • the embodiment of the present disclosure provides a projection method applied to a projection system.
  • the projection system 10 includes a camera 11 , a control device 12 , and at least two projection devices 13 .
  • the control device 12 and the projection device 13 may be physically separated or physically combined.
  • the camera 11 can be disposed on the control device 12.
  • When the control device 12 and the projection device 13 are physically separated, the projection process of each projection device 13 is controlled by the control device 12, and the camera 11 is controlled to capture the initial projection image formed by the initial projection of each projection device; here the control device 12 can be a mobile phone, a tablet computer, a notebook computer, etc. When the control device 12 and the projection device 13 are physically combined, the camera 11 is disposed on one projection device 13, which can be referred to as the main projection device; the other projection devices are referred to as slave projection devices.
  • the projection system includes four projection devices, wherein D0 represents a first projection device, D1 represents a second projection device, D2 represents a third projection device, and D3 represents a fourth projection device.
  • The control device and D0 are physically combined, the camera is set on D0, and the position of the camera can be in the same plane as the optical engine of D0 (FIG. 2 shows the front view of each projection device; on the front view of D0, the circle represents the camera and the rectangle represents the optical engine).
  • D0 can be used as the main projection device and D1, D2, and D3 as the slave projection devices; the master and slave devices are connected, D0 controls its own projection process and that of each slave projection device, and D0 controls the camera to acquire the initial projection image.
  • the multi-projection devices in this embodiment can be connected by wireless, for example, through a wireless mirror connection technology, to realize sharing of information between the projection devices.
  • D0 is used as the main projection device, and is connected to each of the slave projection devices D1, D2, and D3 by wireless mirror connection technology.
  • the main projection device is the transmitting end, and each slave projection device is the receiving end.
  • the main projection device controls itself and the output image of each of the slave projection devices.
  • The projection method of this embodiment is further described for the case where the control device and the projection device are physically combined, that is, the control device is physically combined with D0.
  • the CPU (Central Processing Unit) module 121, the display control module 122, and the communication interface 123 in D0 can be regarded as a whole that is equivalent to a control device; the CPU module 121 serves as the overall control module and is responsible for controlling the projection of each device.
  • the display control module 122 controls the optical device 124 of the D0 to perform projection
  • the communication interface of the D0 may be a wireless communication interface, and then communicates with each of the slave projection devices through the wireless network to control the projection of each of the slave projection devices.
  • D0 is connected to D1, D2, and D3 respectively in mirror connection mode and transmits the images divided by the CPU module of D0 to D1, D2, and D3; the CPU module controls the delay between D0 and D1, D2, and D3 to ensure that the collaborative projection screens of all projection devices are displayed synchronously.
  • FIG. 4 is a schematic flowchart of a projection method provided by an embodiment of the present disclosure.
  • The method can be applied to scenes that need to display data or images on a large screen, such as images related to military simulation, industrial design, virtual manufacturing, engineering projection, and complex monitoring. The method includes:
  • S40 Acquire an initial projection image formed by initial projection of at least two projection devices
  • First, a plurality of projection devices are turned on; then, the multiple projection devices perform initial projection; and then, the initial projection image formed by the initial projection of the multiple projection devices is acquired by an image acquisition module.
  • At least two projection devices can be used for initial projection.
  • the number of projection devices can also be the square of a natural number.
  • 4 or 9 projection devices are used for initial projection to improve the projection effect of multi-projection device collaborative projection.
  • FIG. 3 is four projection devices D0, D1, D2, and D3 for initial projection
  • FIG. 5 is the initial projection image of the above four projection devices as captured by the camera.
  • the line type indicated by 501 represents the projection edge of D0; the line type indicated by 502 represents the projection edge of D1; the line type indicated by 503 represents the projection edge of D2; and the line type indicated by 504 represents the projection edge of D3.
  • S41 determining at least two projection regions corresponding to the at least two projection devices from the initial projection image, and determining a maximum inscribed rectangular region formed by the at least two projection regions as the target projection region;
  • The control device may identify the at least two projection regions corresponding to the at least two projection devices in the initial projection image by using an image recognition algorithm, and may calculate, by an edge recognition algorithm or a mathematical algorithm for finding a maximum rectangle, the maximal inscribed rectangular area formed by the at least two projection areas; this maximal inscribed rectangular area is used as the target projection area.
  • the control device and D0 are physically combined, and D0 is used as the main projection device.
  • The CPU module of D0 processes the initial projection image as follows. First, the projection area D0' corresponding to D0, the projection area D1' corresponding to D1, the projection area D2' corresponding to D2, and the projection area D3' corresponding to D3 are identified in the initial projection image by the image recognition algorithm. Secondly, edge recognition is performed on D0', D1', D2', and D3' respectively to obtain the four edges of each projection area. Then, the line containing the lower of the upper edges of D0' and D1' is extended, the line containing the higher of the lower edges of D2' and D3' is extended, the line containing the more rightward of the left edges of D0' and D2' is extended, and the line containing the more leftward of the right edges of D1' and D3' is extended. Finally, the rectangular area bounded by the four extension lines is determined as the target projection area.
  • a person skilled in the art can also set a calculation method according to actual needs to identify at least two projection areas corresponding to at least two projection devices in the initial projection image and to obtain a maximum inscribed rectangular area in the initial projection image.
  • this is not specifically limited in the embodiments of the present disclosure.
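Under the simplifying assumption that the four projection areas are axis-aligned rectangles (the disclosure works with edges found by image recognition, which need not be axis-aligned), the edge-extension rule above can be sketched in Python. The function name and the (left, top, right, bottom) representation are illustrative choices, with image coordinates in which y grows downward, D0'/D1' forming the top row and D2'/D3' the bottom row:

```python
def target_projection_area(d0, d1, d2, d3):
    """Maximal inscribed rectangle of four projection areas, per the
    edge-extension rule: take the lower of the two upper edges, the higher
    of the two lower edges, and the inner left/right edges.
    Each area is (left, top, right, bottom); y grows downward."""
    left = max(d0[0], d2[0])    # more rightward of the left edges of D0', D2'
    top = max(d0[1], d1[1])     # lower of the upper edges of D0', D1'
    right = min(d1[2], d3[2])   # more leftward of the right edges of D1', D3'
    bottom = min(d2[3], d3[3])  # higher of the lower edges of D2', D3'
    if right <= left or bottom <= top:
        raise ValueError("projection areas do not bound a common rectangle")
    return (left, top, right, bottom)
```

For general quadrilateral projection outlines, the same rule would be applied to the lines through the recognized edges rather than to axis-aligned coordinates.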
  • S42 Determine, according to a relative positional relationship between the at least two projection areas and the target projection area, a sub-image to be projected corresponding to each of the at least two projection devices from the image to be projected;
  • The control device may acquire the sub-images to be projected corresponding to the respective projection devices from the image to be projected based on the overlapping areas and non-overlapping areas between the target projection area P and the respective projection areas D0', D1', D2', and D3'.
  • S42 may include the following steps:
  • S801 divide the image to be projected according to a relative positional relationship between the at least two projection areas and the target projection area, to obtain a first area corresponding to each of the at least two projection devices;
  • the control device first acquires an area overlapping each of the projection area and the target projection area.
  • the area overlapped between D0' and P is C0
  • the area overlapped between D1' and P is C1.
  • the area overlapped between D2' and P is C2
  • the area overlapping between D3' and P is C3; then, the control device divides the image to be projected according to C0, C1, C2, and C3, and obtains the first areas corresponding to the four projection devices, that is, the divided images corresponding to the four projection devices.
  • S802 determining, respectively, an area that is not overlapped between the at least two projection areas and the target projection area as a second area corresponding to each of the at least two projection devices, where image parameter values of the respective pixels in the second area are the same;
  • Image parameters may include pixel value, saturation, hue, and the like.
  • the pixel value of each pixel in the second area may be set to black, so that the projection content of the second area is concise and consistent and has no visual impact on the projected content in the target projection area, thereby improving the projection effect of the collaborative projection.
  • S803 Obtain a sub-image to be projected corresponding to each of the at least two projection devices according to the first region and the second region.
  • control device divides the image to be projected according to C0, C1, C2, and C3, and obtains a first divided image corresponding to D0, a second divided image corresponding to D1, a third divided image corresponding to D2, and a fourth divided image corresponding to D3. Then, according to each divided image and the second region, the sub-images to be projected corresponding to D0, D1, D2, and D3, respectively, are obtained.
  • S43 Control at least two projection devices respectively to respectively project corresponding sub-images to be projected.
  • The control device controls each projection device to project its respective divided image, and uniformly fills the second region where each projection region does not overlap the target projection region, so as to achieve the purpose of having each projection device project its corresponding sub-image to be projected.
  • the respective divided images are spliced with the corresponding second regions, and the sub-images to be projected corresponding to the respective projection devices are obtained, and then cooperative projection is performed.
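The splicing of a divided image with its black-filled second area can be illustrated with a small sketch. The function name `sub_image_for_device`, the linear mapping of the image onto the target area, and the argument names are all illustrative assumptions, not part of the original disclosure:

```python
import numpy as np

def sub_image_for_device(image, target, overlap, frame_hw, dest_xy):
    """Crop the part of `image` a device must render (its first area) and
    paste it into an otherwise black frame (the second area is filled
    uniformly with black). `target` and `overlap` are (left, top, right,
    bottom) in projection-surface coordinates; the image is assumed to map
    linearly onto `target`."""
    h, w = image.shape[:2]
    tl, tt, tr, tb = target
    sx, sy = w / (tr - tl), h / (tb - tt)          # surface -> image scale
    ol, ot, o_r, ob = overlap
    crop = image[int((ot - tt) * sy):int((ob - tt) * sy),
                 int((ol - tl) * sx):int((o_r - tl) * sx)]
    frame = np.zeros((*frame_hw, image.shape[2]), dtype=image.dtype)  # black second area
    dx, dy = dest_xy                               # where the crop lands in the device frame
    frame[dy:dy + crop.shape[0], dx:dx + crop.shape[1]] = crop
    return frame
```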
  • S801 may include:
  • S1001 determining, as each third region, an area in which at least two projection areas in the target projection area overlap each other;
  • The control device determines the areas in which the projection areas within the target projection area overlap each other as the respective third regions. A third region is an area jointly projected by multiple projection devices, and the pixel value of each pixel in such an area is the superposition of the pixel values projected by the multiple projection devices.
  • the areas where the four projection areas D0', D1', D2', and D3' overlap each other include: S1, S2, S3, S4, S5, and S6.
  • The control device acquires the number of projection devices corresponding to each third region, from which it follows that the pixel value of a pixel in a third region is the value obtained by superimposing the projections of multiple projection devices. For example, if the control device determines that S1 is an area jointly projected by D0 and D2, the pixel value of a pixel in S1 is the value obtained by the superimposed projection of 2 projection devices; S2 is the area jointly projected by D0 and D1, so the pixel value of a pixel in S2 is the value obtained by the projection of 2 projection devices; S3 is the area jointly projected by D1 and D3, so the pixel value of a pixel in S3 is the value obtained by the projection of 2 projection devices.
  • S4 is the area jointly projected by D2 and D3, then the pixel value of the pixel in S4 is the pixel value obtained by the projection of 2 projection devices;
  • S5 is the area jointly projected by D1, D2, and D3, so the pixel value of a pixel in S5 is the value obtained by the superimposed projection of 3 projection devices;
  • S6 is the area jointly projected by D0, D2, and D3, so the pixel value of a pixel in S6 is the value obtained by the superimposed projection of 3 projection devices.
  • control device may process the image to be projected according to the number of projection devices, and the processing procedure is as follows:
  • Step 1: Determine, according to a mapping relationship between the image to be projected and the target projection area, a fourth region corresponding to each third region from the image to be projected.
  • A mapping relationship between the image to be projected and the target projection area is first established; according to the established mapping relationship, the fourth region corresponding to each third region is determined from the image to be projected, a fourth region being the portion of the image to be projected that corresponds to a third region. For example, the fourth regions corresponding to S1, S2, S3, S4, S5, and S6 are respectively determined from the image to be projected.
  • Step 2 Based on the number of projection devices, the image parameter values corresponding to the respective pixels in the fourth region corresponding to each third region are equally divided.
  • The pixel values of a fourth region may be equally divided according to the number of projection devices. For example, for S5, since the number of projection devices projecting S5 is 3, after acquiring the fourth region corresponding to S5, the pixel values of that region are divided into three equal parts; this is equivalent to each of the three projection devices projecting 1/3 of the pixel value of each pixel in the corresponding fourth region, so that the pixel value of each pixel obtained by the three projection devices jointly projecting S5 is equal to the pixel value of the corresponding pixel in the fourth region of the image to be projected.
  • S1004 Divide the processed image to be projected to obtain a first region corresponding to each of the at least two projection devices.
  • The processed image to be projected is divided to obtain the first region corresponding to each projection device, which is then combined with the corresponding second region to obtain the sub-image to be projected for each projection device; each projection device is then controlled to project its corresponding sub-image. With this method, no visibly overlapped regions appear in the projection result of the multi-projection-device collaborative projection. Compared with the related art, the software-controlled method improves the efficiency of the collaborative projection of the multi-projection devices and ensures its accuracy.
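The equal division of pixel values by the number of covering projectors (Step 2 above) can be sketched as follows. The mask-based formulation and the function name are illustrative assumptions for the case where per-projector coverage has already been mapped into image coordinates:

```python
import numpy as np

def attenuate_overlaps(image, coverage_masks):
    """Equally divide pixel values among the projectors that cover each
    pixel, so that the superimposed projection reproduces the source
    brightness. `coverage_masks` holds one boolean (H, W) mask per
    projector over the image to be projected."""
    count = np.sum(coverage_masks, axis=0)   # projectors covering each pixel
    count = np.maximum(count, 1)             # avoid division by zero outside all masks
    # integer division: each projector later renders image // count in its region
    return (image // count[..., None]).astype(image.dtype)
```

Each projector then crops its own region from the attenuated image, so a pixel covered by N projectors is rendered at 1/N of its value N times over.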
  • S801 may further include:
  • S1101 dividing the image to be projected according to a relative positional relationship between the at least two projection areas and the target projection area, and obtaining a fifth area corresponding to each of the at least two projection devices;
  • The control device first determines the regions C0, C1, C2, and C3 where the four projection regions overlap the target projection region P, and then divides the image to be projected according to C0, C1, C2, and C3 to obtain the fifth area corresponding to each of the four projection devices; a fifth area is the divided image corresponding to a projection device, that is, the first divided image corresponding to D0, the second divided image corresponding to D1, the third divided image corresponding to D2, and the fourth divided image corresponding to D3.
  • S1102 determining, as each sixth region, an area in which at least two projection areas in the target projection area overlap each other;
  • a plurality of sixth regions can be identified, which are: S1, S2, S3, S4, S5, and S6.
  • the control device determines that S1 is an area jointly projected by D0 and D2, and the number of projection devices corresponding to S1 is 2;
  • S2 is an area jointly projected by D0 and D1, and the number of projection devices corresponding to S2 is 2;
  • S3 is D1 and For the area jointly projected by D3, the number of projection devices corresponding to S3 is 2;
  • S4 is the area jointly projected by D2 and D3, and the number of projection devices corresponding to S4 is 2;
  • S5 is the area jointly projected by D1, D2, and D3, and the number of projection devices corresponding to S5 is 3;
  • S6 is the area where D0, D2, and D3 are jointly projected, and the number of projection devices corresponding to S6 is 3.
  • S1104 Processing, according to the number of projection devices, a fifth region corresponding to each of the at least two projection devices, to obtain a first region corresponding to each of the at least two projection devices.
  • The control device first establishes a mapping relationship between each projection area in the target projection area and the corresponding fifth area, for example between D0' and C0, D1' and C1, D2' and C2, and D3' and C3. Secondly, according to the established mapping relationships, the regions corresponding to S1, S2, S5, and S6 are determined from C0, the regions corresponding to S2, S3, and S6 are determined from C1, and the regions corresponding to S1, S4, and S5 are determined from C2 and from C3 correspondingly. Then, according to the number of projection devices for S1, S2, S3, S4, S5, and S6, the pixel values of the pixels in the regions of C0, C1, C2, and C3 that correspond to S1 through S6 are equally divided, obtaining the first region corresponding to each projection device. Further, for each projection device, the corresponding sub-image to be projected can be obtained from the first region and the second region. Finally, by controlling each projection device to project its sub-image, the multiple projection devices project cooperatively and a projection effect without overlapping projections is achieved.
  • The control device controls the multi-projection devices to perform initial projection to obtain the respective projection regions; then, when overlapping regions occur between the projection regions, the control device can quickly process the image to be projected to obtain the sub-image to be projected corresponding to each projection device; finally, the control device controls each projection device to project its corresponding sub-image, thereby avoiding the overlap problem of multi-projection-device collaborative projection and improving its projection effect and projection accuracy.
  • Embodiment 2:
  • the present embodiment performs cooperative projection with four projection devices.
  • This embodiment describes in detail the process in S42 of determining, from the image to be projected and according to the relative positional relationship between the at least two projection regions and the target projection region, the sub-image to be projected corresponding to each of the at least two projection devices.
  • Any pixel in the target projection area can be represented in the following array format: (X, Y, N, X0, Y0, X1, Y1, X2, Y2, X3, Y3), where X and Y are the coordinate values of the pixel in the target projection area, and N indicates that the pixel is jointly projected by N projectors: when N is 1, the pixel is projected by one projector; when N is 2, the pixel is jointly projected by two projectors; and so on. (Xi, Yi) is the coordinate of the pixel in projector Di, with (-1, -1) indicating that Di does not project the pixel.
  • the pixel point (0, 0) in the target projection area is projected by projector D2, and the coordinate value corresponding to D2 is (0, 50); the array corresponding to the pixel point is (0, 0, 1, -1, -1, -1, -1, 0, 50, -1, -1).
  • the pixel point (1280, 0) in the target projection area is projected by projector D3, and the coordinate value corresponding to D3 is (800, 0); the array corresponding to the pixel point is (1280, 0, 1, -1, -1, -1, -1, -1, -1, 800, 0).
  • the pixel point (0, 800) in the target projection area is projected by projector D0, and the coordinate value corresponding to D0 is (60, 600); the array corresponding to the pixel point is (0, 800, 1, 60, 600, -1, -1, -1, -1, -1, -1).
  • the pixel point (1280, 800) in the target projection area is projected by projector D1, and the coordinate value corresponding to D1 is (800, 500); the array corresponding to the pixel point is (1280, 800, 1, -1, -1, 800, 500, -1, -1, -1, -1).
  • the pixel points (320, 400) in the target projection area are projected by the projectors D0 and D2.
  • the coordinate value corresponding to D0 is (410, 0)
  • the coordinate value corresponding to D2 is (380, 290)
  • the corresponding array of the pixel points is (320, 400, 2, 410, 0, -1, -1, 380, 290, -1, -1).
  • the pixel point (640, 400) in the target projection area is jointly projected by projectors D0, D1, D2, and D3; the coordinate value corresponding to D0 is (700, 50), the coordinate value corresponding to D1 is (50, 50), the coordinate value corresponding to D2 is (760, 580), and the coordinate value corresponding to D3 is (50, 560); the array corresponding to the pixel is (640, 400, 4, 700, 50, 50, 50, 760, 580, 50, 560).
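The array format above can be built programmatically. This small helper is an illustrative assumption (not part of the disclosure); it reproduces the examples given for pixels (0, 0) and (640, 400):

```python
def pixel_map_entry(x, y, device_coords):
    """Build the (X, Y, N, X0, Y0, X1, Y1, X2, Y2, X3, Y3) record described
    above. `device_coords` maps device index i -> (Xi, Yi) for the devices
    that project pixel (x, y); absent devices are recorded as (-1, -1)."""
    entry = [x, y, len(device_coords)]       # N = number of covering projectors
    for i in range(4):
        entry.extend(device_coords.get(i, (-1, -1)))
    return tuple(entry)
```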
  • The control device reads the image to be projected cached by the display system. The image to be projected can be represented by the coordinate value of each pixel and its corresponding color value, where the color value is expressed in the three primary colors (RGB), so each pixel can be expressed as (X, Y, R, G, B). The coordinate value (X, Y) of each pixel in the image to be projected corresponds to the coordinate value (X, Y) of each pixel in the target projection area; each pixel of the image to be projected is thus assigned to the corresponding projection devices, and all pixels of the sub-image to be projected for each projection device are obtained.
  • the following array format may be used:
  • D0 (X0, Y0, R/N, G/N, B/N)
  • if X0 > -1 and Y0 > -1, the pixel at coordinates (X0, Y0) in D0 needs to project (R/N, G/N, B/N). The projection device D0 initializes its display array to [X0, Y0, 0, 0, 0], so a pixel that is not assigned a color remains (0, 0, 0). After the above device assignment has been made for each pixel of the target projection area and the pixel values have been filled into the initialized array, the final array [X0, Y0, R, G, B] is obtained; the coordinate values of the array and their corresponding color values are the pixel values that D0 projects.
  • D1 (X1, Y1, R/N, G/N, B/N)
  • if X1 > -1 and Y1 > -1, the pixel at coordinates (X1, Y1) in D1 needs to display (R/N, G/N, B/N). The projection device D1 initializes its display array to [X1, Y1, 0, 0, 0], so a pixel that is not assigned a color remains (0, 0, 0); after the device assignment has been made for each pixel of the target projection area and the color values have been filled into the initialized array, the final array [X1, Y1, R, G, B] is obtained; the coordinate values of the array and their corresponding color values are the pixel values that D1 projects.
  • D2 (X2, Y2, R/N, G/N, B/N): if X2 > -1 and Y2 > -1, the pixel at coordinates (X2, Y2) in D2 needs to display the value (R/N, G/N, B/N). The projection device D2 initializes its display array to [X2, Y2, 0, 0, 0]; a pixel to which no value is assigned keeps the default color (0, 0, 0). Performing the above device assignment for each pixel of the target projection area and filling the color values into the initialized array yields the final array [X2, Y2, R, G, B], whose coordinate values and corresponding color values are the pixel values D2 should project for this pixel.
  • D3 (X3, Y3, R/N, G/N, B/N): the pixel at coordinates (X3, Y3) in D3 needs to display the value (R/N, G/N, B/N). The projection device D3 likewise initializes its display array and fills it, yielding the final array [X3, Y3, R, G, B], whose coordinate values and corresponding color values are the pixel values D3 should project for this pixel.
  • The control device controls D0, D1, D2, and D3 to project their respective corresponding sub-images to be projected.
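The assignment and division steps above can be sketched as follows. The function name and dictionary layout are assumptions for illustration; integer division stands in for the R/N, G/N, B/N computation:

```python
def assign_to_devices(pixel_records, image):
    """For each target-area pixel, divide its (R, G, B) value by the number
    of covering projectors and write the result into each device's display
    array; pixels never written keep the default color (0, 0, 0).

    pixel_records: {(x, y): [(device_id, dx, dy), ...]}
    image:         {(x, y): (R, G, B)}
    Returns:       {device_id: {(dx, dy): (r, g, b)}}
    """
    per_device = {}
    for (x, y), covers in pixel_records.items():
        n = len(covers)
        r, g, b = image[(x, y)]
        value = (r // n, g // n, b // n)  # R/N, G/N, B/N
        for device_id, dx, dy in covers:
            per_device.setdefault(device_id, {})[(dx, dy)] = value
    return per_device
```

For a pixel covered by two devices, each device receives half the color value, so the superimposed projection restores roughly the original brightness.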
  • The projection device in this embodiment may be a digital light processing (DLP) projector, or may be another type of projector.
  • The projector includes the following key parts. The light source: in a DLP system, the light source consists of three LED bulbs emitting red (RED), green (GREEN), and blue (BLUE) light respectively, and the brightness of each LED is adjustable.
  • The digital micromirror device (DMD): the core display device in a DLP projection system, composed of many small rotatable mirrors. The mirrors are pixel-based: each small mirror corresponds to one pixel of the image, i.e. each pixel of the image controls the deflection angle of one small mirror.
  • The RGB data of each pixel of the image is decomposed, and the R, G, and B LEDs are switched on in turn.
  • When the R lamp is on, the small mirror deflects according to the pixel's R value: the larger the R value, the more light the mirror reflects. The G and B lamps are handled by the same process, so that the correct RGB brightness is reflected out within one frame of the image. The lens converges the light reflected by the small mirrors and projects it onto the screen according to the focal length; the lens group inside the lens realizes different focal lengths, through which the sharpness and size of the projected picture can be adjusted.
  • In summary, in a DLP projection system, the picture data on the DMD changes the mirror flips of the DMD to modulate the intensity of the R, G, and B light, realizing the different color and brightness of each pixel of the picture; by changing the brightness of the R, G, and B LEDs, the overall color balance of the picture can be controlled.
  • Since the DLP projector realizes a 1:1 superimposition of pixel values, when multiple projection devices project cooperatively, the pixel values of the image to be projected corresponding to an overlap region can be divided equally by the number of projection devices covering that region, thereby controlling the color balance of the co-projected image.
  • Embodiment 3
  • This embodiment takes cooperative projection by four projection devices as an example and details the process in S41 of determining the projection area corresponding to each projection device and determining the target projection area.
  • Each projection device may perform its initial projection using a rectangular image of a different color.
  • D0 denotes the first projection device, whose projected color is a red rectangular region;
  • D1 denotes the second projection device, whose projected color is a green rectangular region;
  • D2 denotes the third projection device, whose projected color is a yellow rectangular region;
  • D3 denotes the fourth projection device, whose projected color is a blue rectangular region.
  • The initial projection image acquired by the camera under the control of the control device is shown in FIG. 5.
  • The process by which the control device determines the projection areas corresponding to the four projection devices is described below.
  • The projection area corresponding to D0 is determined: each of the areas S1-S6 is checked for adjacency to the red area, and the adjacent areas are taken together to form the projection area corresponding to D0, as in the framed portion D0' in the figure.
  • The projection area corresponding to D1 is determined: each of the areas S1-S6 is checked for adjacency to the green area, and the adjacent areas are taken together to form the projection area corresponding to D1, as in the framed portion D1' in the figure.
  • The projection area corresponding to D2 is determined: each of the areas S1-S6 is checked for adjacency to the yellow area, and the adjacent areas are taken together to form the projection area corresponding to D2, as in the framed portion D2' in the figure.
  • The projection area corresponding to D3 is determined: each of the areas S1-S6 is checked for adjacency to the blue area, and the adjacent areas are taken together to form the projection area corresponding to D3, as in the framed portion D3' in the figure.
  • the target projection area is determined based on the projection areas D0', D1', D2', and D3' corresponding to the four projection devices D0, D1, D2, and D3.
  • The control device performs image recognition on the edges of the initial projection image and recognizes all the horizontal and vertical lines of the different colors; each color has two horizontal lines and two vertical lines.
  • Of the two red horizontal lines in the D0' area, the higher one is taken.
  • Of the two green horizontal lines in the D1' area, the higher one is taken.
  • These two lines are compared, and the lower one is taken and extended to serve as the upper boundary of the target projection area.
  • The rectangular area enclosed by the four extension lines is the target projection area, indicated by P in FIG. 7.
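For the 2x2 layout of this embodiment, the boundary-selection rule above can be sketched as below. The rectangle representation and function name are assumptions for illustration, using image coordinates in which y grows downward (so the "higher" of two horizontal lines is the one with the smaller y, and the "lower" has the larger y):

```python
def target_rectangle_2x2(d0, d1, d2, d3):
    """Largest inscribed rectangle for the 2x2 layout in the text.
    d0 = top-left, d1 = top-right, d2 = bottom-left, d3 = bottom-right,
    each given as (left, top, right, bottom) in image coordinates
    (y grows downward).

    Upper boundary: of D0's and D1's top edges (the 'higher' horizontal
    line of each area), take the lower one on screen, i.e. the larger y.
    The other three boundaries follow symmetrically.
    """
    top = max(d0[1], d1[1])
    bottom = min(d2[3], d3[3])
    left = max(d0[0], d2[0])
    right = min(d1[2], d3[2])
    return (left, top, right, bottom)
```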
  • After the projection area corresponding to each projection device and the target projection area are determined in S41, S42 is described, again taking the cooperative projection of four projection devices as an example. In this embodiment, the case in which the projection areas overlap is taken as an example: the image to be projected is divided first, the divided images are then processed, and finally projection without visible overlap is realized.
  • The control device divides the image to be projected according to the relative positional relationship between the four projection areas and the target projection area, obtaining a fifth area corresponding to each of the four projection devices.
  • The control device establishes a mapping relationship between each projection area within the target projection area and the corresponding fifth area; determines, from each fifth area according to the established mapping relationship, the image to be projected corresponding to each overlapping part; processes the pixel values of the corresponding images to be projected according to the number of projection devices covering each overlapping area; and thus obtains the sub-image to be projected corresponding to each projection device.
  • The sub-image to be projected corresponding to D0 includes the following parts:
  • S5, which overlaps with D2 and D3: after the image to be projected corresponding to S5 is acquired, its pixel values are reduced to 1/3, yielding the processed image to be projected corresponding to S5;
  • The sub-image to be projected corresponding to D1 includes the following parts:
  • The sub-image to be projected corresponding to D2 includes the following parts:
  • S1, which overlaps with D0: after the image to be projected corresponding to S1 is acquired, its pixel values are halved, yielding the processed image to be projected corresponding to S1;
  • S5, which overlaps with D0 and D3: after the image to be projected corresponding to S5 is acquired, its pixel values are reduced to 1/3, yielding the processed image to be projected corresponding to S5;
  • The sub-image to be projected corresponding to D3 includes the following parts:
  • S5, which overlaps with D2 and D0: after the image to be projected corresponding to S5 is acquired, its pixel values are reduced to 1/3, yielding the processed image to be projected corresponding to S5;
  • The control device controls each projection device to project its corresponding sub-image to be projected.
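The scaling rule of this embodiment (halve pixel values in a two-device overlap, reduce them to 1/3 in a three-device overlap) can be sketched as follows; the region names and dictionary layout are assumptions for illustration:

```python
def build_subimage(regions, overlap_count):
    """Assemble one device's sub-image to be projected.

    regions:       {region_name: {(x, y): (R, G, B)}} -- the parts of the
                   image to be projected that fall in this device's area.
    overlap_count: {region_name: n} -- number of devices co-projecting each
                   part (1 for the device's exclusive region).
    Overlap parts are scaled to 1/n of their pixel values.
    """
    sub = {}
    for name, pixels in regions.items():
        n = overlap_count.get(name, 1)
        for xy, (r, g, b) in pixels.items():
            sub[xy] = (r // n, g // n, b // n)
    return sub
```

With this sketch, a region such as S5 shared by three devices contributes one third of its pixel values from each device, so the three co-projected thirds add back up to roughly the original brightness.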
  • Embodiment 4
  • This embodiment provides a projection apparatus, which is disposed on a control device and configured to control the projection process of cooperative projection by multiple projection devices.
  • The apparatus 100 includes: an acquisition module 101 configured to acquire, by an image acquisition unit, an initial projection image formed by the initial projections of at least two projection devices; a first processing module 102 configured to determine, from the initial projection image, at least two projection areas corresponding to the at least two projection devices, and to determine the largest inscribed rectangular area formed by the at least two projection areas as the target projection area; a second processing module 103 configured to determine, from the image to be projected and according to the relative positional relationship between the at least two projection areas and the target projection area, the sub-image to be projected corresponding to each of the at least two projection devices; and a control module 104 configured to respectively control the at least two projection devices to project their corresponding sub-images to be projected.
  • The second processing module 103 is further configured to divide the image to be projected according to the relative positional relationship between the at least two projection areas and the target projection area, obtaining a first area corresponding to each of the at least two projection devices; to determine the non-overlapping parts between the at least two projection areas in the target projection area as the second area corresponding to each of the at least two projection devices, wherein the image parameters of the pixels in a second area are the same; and, for each projection device, to obtain the corresponding sub-image to be projected according to the first area and the second area.
  • The second processing module 103 is further configured to determine the areas in the target projection area where at least two projection areas overlap each other as third areas; to acquire the number of projection devices corresponding to each third area; to process the image to be projected according to the number of projection devices; and to divide the processed image to be projected according to the relative positional relationship between the at least two projection areas and the target projection area, obtaining a first area corresponding to each of the at least two projection devices.
  • The second processing module 103 is further configured to determine, from the image to be projected and according to the mapping relationship between the image to be projected and the target projection area, a fourth area corresponding to each third area, and to divide the image parameter values corresponding to the pixels in the fourth area corresponding to each third area by the number of projection devices.
  • The second processing module 103 is further configured to divide the image to be projected according to the relative positional relationship between the at least two projection areas and the target projection area, obtaining a fifth area corresponding to each of the at least two projection devices; to determine the areas in the target projection area where at least two projection areas overlap each other as sixth areas; to acquire the number of projection devices corresponding to each sixth area; and to process, according to the number of projection devices, the fifth area corresponding to each of the at least two projection devices to obtain the first area corresponding to each of the at least two projection devices.
  • The acquisition module 101, the first processing module 102, the second processing module 103, and the control module 104 may be implemented by a processor in the apparatus.
  • Embodiments of the present disclosure can be provided as a method, a system, or a computer program product. Accordingly, the present disclosure may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present disclosure may take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage and optical storage) containing computer-usable program code.
  • These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • These computer program instructions may also be loaded onto a computer or other programmable data processing device, such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, so that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • An embodiment of the present disclosure further provides a storage medium, in particular a computer-readable storage medium, on which a computer program is stored; when the computer program is executed by a processor, the steps of the methods of the embodiments of the present disclosure are implemented.
  • The solution provided by the embodiments of the present disclosure first acquires an initial projection image formed by the initial projections of at least two projection devices; secondly determines, from the initial projection image, the projection area corresponding to each projection device and the target projection area; then determines, from the image to be projected and according to the relative positional relationship between each projection area and the target projection area, the sub-image to be projected corresponding to each projection device; and finally controls each projection device to project its corresponding sub-image to be projected. This process requires no manual operation:
  • cooperative projection by multiple projection devices can be realized by software control alone, thereby improving the degree of intelligence and the projection efficiency of cooperative projection by multiple projection devices.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Projection Apparatus (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

A projection method is provided, comprising: acquiring an initial projection image formed by the initial projections of at least two projection devices; determining, from the initial projection image, at least two projection areas corresponding to the at least two projection devices, and determining the largest inscribed rectangular area composed of the at least two projection areas as a target projection area; determining, from an image to be projected and according to the relative positional relationship between the at least two projection areas and the target projection area, the sub-images to be projected that respectively correspond to the at least two projection devices; and respectively controlling the at least two projection devices so that they project their respectively corresponding sub-images to be projected. A projection apparatus, a projection system, and a storage medium are also provided.
PCT/CN2018/076995 2017-04-24 2018-02-23 Procédé, appareil et système de projection, et support de stockage WO2018196472A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710271996.3A CN108737799A (zh) 2017-04-24 2017-04-24 一种投影方法、装置及系统
CN201710271996.3 2017-04-24

Publications (1)

Publication Number Publication Date
WO2018196472A1 true WO2018196472A1 (fr) 2018-11-01

Family

ID=63919403

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/076995 WO2018196472A1 (fr) 2017-04-24 2018-02-23 Procédé, appareil et système de projection, et support de stockage

Country Status (2)

Country Link
CN (1) CN108737799A (fr)
WO (1) WO2018196472A1 (fr)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110191326B (zh) * 2019-05-29 2021-09-17 北京小鸟听听科技有限公司 一种投影系统分辨率扩展方法、装置和投影系统
CN110989949B (zh) * 2019-11-13 2023-04-11 浙江大华技术股份有限公司 一种异形拼接显示的方法及装置
CN111158554B (zh) * 2019-12-31 2021-07-16 联想(北京)有限公司 一种图像显示方法、电子设备和图像显示系统
CN111258524B (zh) * 2020-01-20 2023-03-24 北京淳中科技股份有限公司 拼接屏组的控制方法、装置和服务器
CN114007051B (zh) * 2020-07-28 2024-02-20 青岛海信激光显示股份有限公司 激光投影系统及其投影图像的显示方法
CN114360418A (zh) * 2020-10-13 2022-04-15 深圳光峰科技股份有限公司 投影系统
CN112233048B (zh) * 2020-12-11 2021-03-02 成都成电光信科技股份有限公司 一种球形视频图像校正方法
CN112770095B (zh) * 2021-01-28 2023-06-30 广州方硅信息技术有限公司 全景投影方法、装置及电子设备
CN113671782B (zh) * 2021-10-21 2022-02-15 成都极米科技股份有限公司 一种投影设备
CN117041508B (zh) * 2023-10-09 2024-01-16 杭州罗莱迪思科技股份有限公司 一种分布式投影方法、投影系统、设备和介质

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090033874A1 (en) * 2007-07-31 2009-02-05 Richard Aufranc System and method of projecting an image using a plurality of projectors
CN104516482A (zh) * 2013-09-26 2015-04-15 北京天盛世纪科技发展有限公司 一种无影投影系统及方法
CN105681772A (zh) * 2014-12-04 2016-06-15 佳能株式会社 显示控制装置及其控制方法

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103533276B (zh) * 2013-10-21 2017-01-18 北京理工大学 一种平面多投影快速拼接方法
JP6456086B2 (ja) * 2014-09-25 2019-01-23 キヤノン株式会社 投影型画像表示装置及びその制御方法並びにプロジェクタ及びその制御方法
CN105912101B (zh) * 2016-03-31 2020-08-25 联想(北京)有限公司 一种投影控制方法和电子设备

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090033874A1 (en) * 2007-07-31 2009-02-05 Richard Aufranc System and method of projecting an image using a plurality of projectors
CN104516482A (zh) * 2013-09-26 2015-04-15 北京天盛世纪科技发展有限公司 一种无影投影系统及方法
CN105681772A (zh) * 2014-12-04 2016-06-15 佳能株式会社 显示控制装置及其控制方法

Also Published As

Publication number Publication date
CN108737799A (zh) 2018-11-02

Similar Documents

Publication Publication Date Title
WO2018196472A1 (fr) Procédé, appareil et système de projection, et support de stockage
US9298071B2 (en) Multi-projection system
CN103019643B (zh) 一种即插即用的大屏幕投影自动校正与拼接显示方法
US20060181685A1 (en) Projector, method of controlling the projector, program for controlling the projector, and recording medium storing the program
US9671684B2 (en) Theater parameter management apparatus and method
US10638100B2 (en) Projector, multi-projection system, and method for controlling projector
JP6793483B2 (ja) 表示装置、電子機器およびそれらの制御方法
US10871931B2 (en) Display device and control method of display device
JP2023027152A (ja) プロジェクター、画像投写システム、及びプロジェクターの制御方法
CN114071104B (zh) 基于着色器实现多投影机投影渐变融合的方法
CN113890626B (zh) 色散校正方法、装置、激光电视机及存储介质
TW202144892A (zh) 投影系統以及投影方法
CN108353154A (zh) 投影仪、影像显示装置以及影像显示方法
EP3934244B1 (fr) Dispositif, système et procédé permettant de générer un mappage de pixels de projecteur vers des pixels de caméra et/ou des positions d'objet en utilisant des motifs alternatifs
US11327389B2 (en) Image projection system and method of controlling image projection system
CN115118943A (zh) 投射图像的调整方法、信息处理装置以及投射系统
KR20150125244A (ko) 다시점 카메라가 획득한 영상에 대한 색상 보정 방법 및 다시점 카메라가 획득한 영상에 대한 색상을 보정하는 장치
JP5249733B2 (ja) 映像信号処理装置
JP2019047312A (ja) 画像投写システム及びその制御方法
CN115343898B (zh) 投影系统及投射影像叠加方法
CN116095280A (zh) 投影系统以及应用投影系统的投影方法
CN110780513A (zh) 投影仪、投射系统和投射方法
TWI552606B (zh) Image processing device and its projection image fusion method
JP2020053710A (ja) 情報処理装置、投影装置、投影装置の制御方法、プログラム
CN115529446A (zh) 一种基于远程教学的投影切换系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18790934

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18790934

Country of ref document: EP

Kind code of ref document: A1

点击 这是indexloc提供的php浏览器服务,不要输入任何密码和下载