WO2018176341A1 - Video transmission method, receiving method, system and unmanned aerial vehicle - Google Patents
Video transmission method, receiving method, system and unmanned aerial vehicle
- Publication number
- WO2018176341A1 (PCT/CN2017/078871)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sub
- video data
- data unit
- image
- channels
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/238—Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
- H04N21/2385—Channel allocation; Bandwidth allocation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- the present invention relates to the field of image processing technologies, and in particular, to a video transmitting method, a receiving method, a transmitting system, a receiving system, and an unmanned aerial vehicle suitable for a movable object.
- the video captured by the shooting device mounted on the UAV can be transmitted through the video transmission system of the UAV.
- the data processor on the UAV acquires the video data captured by the shooting device, encodes the video data, and then transmits the encoded video data to a receiving device on the ground.
- the communication network between the UAV and the receiving device may include multiple channels, and the UAV may transmit video data to the receiving device over the multiple channels.
- in the prior art, the multiple channels are mainly used for bandwidth expansion: the encoded video data is directly packetized, and the packetized data is distributed to different channels for transmission, that is, the code stream data obtained by encoding one frame of image is split and the resulting data packets are distributed to different channels for transmission; however, if a data packet transmitted on one of the channels is received in error, the frame cannot be reconstructed correctly, which limits the reliability of multi-channel video transmission.
- the invention provides a video transmitting method, a receiving method, a transmitting system, a receiving system and an unmanned aerial vehicle suitable for a movable object, so as to improve reliability and efficiency of multi-channel video transmission.
- a first aspect of the present invention is to provide a video transmitting method suitable for a movable object, including:
- decomposing video data into a plurality of sub-video data units, wherein each sub-video data unit includes one or more sub-images;
- separately encoding the plurality of sub-video data units;
- selecting at least one of a plurality of channels to transmit the encoded sub-video data units based on one or more characteristics of the channels and one or more characteristics of the sub-video data units.
- a second aspect of the present invention is to provide a video receiving method suitable for a movable object, comprising:
- receiving a plurality of encoded sub-video data units from a plurality of channels;
- decoding the plurality of encoded sub-video data units;
- reconstructing video data from the decoded sub-video data units, wherein the video data includes one or more image frames, and each sub-video data unit includes at least one of a plurality of sub-images obtained by decomposing each of the image frames.
- a third aspect of the present invention is to provide a video transmission system suitable for a movable object, comprising:
- One or more imaging devices configured to acquire video data
- One or more processors on the movable object working alone or in concert, the one or more processors being configured to:
- decompose video data into a plurality of sub-video data units, wherein each sub-video data unit includes one or more sub-images;
- separately encode the plurality of sub-video data units;
- select at least one of a plurality of channels to transmit the encoded sub-video data units based on one or more characteristics of the channels and one or more characteristics of the sub-video data units.
- a fourth aspect of the present invention is to provide a video receiving system suitable for a movable object, comprising:
- a communication interface receiving a plurality of encoded sub-video data units from a plurality of channels
- One or more processors working alone or in concert, the one or more processors being configured to:
- decode the plurality of encoded sub-video data units;
- reconstruct video data from the decoded sub-video data units, wherein the video data includes one or more image frames, and each sub-video data unit includes at least one of a plurality of sub-images obtained by decomposing each of the image frames.
- a fifth aspect of the present invention is to provide an unmanned aerial vehicle comprising:
- a fuselage;
- a power system mounted to the fuselage for providing flight power;
- and the video transmission system described above.
- a sixth aspect of the present invention is to provide a receiving device, including:
- the present invention provides a video transmitting method, a receiving method, a transmitting system, a receiving system, and an unmanned aerial vehicle suitable for a movable object. By decomposing video data into a plurality of sub-video data units, separately encoding the plurality of sub-video data units, and selecting at least one of a plurality of channels to transmit the encoded sub-video data units according to the channel characteristics and the characteristics of the sub-video data units, each sub-video data unit is transmitted on a channel matched to it, which increases the efficiency of video transmission while expanding the bandwidth of the video transmission; at the same time, the video receiving device reconstructs the video data from the sub-video data units received from the plurality of channels, which improves the fault tolerance and reliability of the video transmission.
- FIG. 1 is a schematic flowchart of a video sending method applicable to a movable object according to an embodiment of the present invention
- FIG. 2 is a schematic diagram of a sub video data unit according to an embodiment of the present invention.
- FIG. 3 is a schematic diagram of a sub video data unit according to another embodiment of the present invention.
- FIG. 4 is a schematic diagram of a sub video data unit according to another embodiment of the present invention.
- FIG. 5 is a schematic flowchart of a video sending method applicable to a movable object according to another embodiment of the present invention.
- FIG. 6 is a schematic structural diagram of a frame image according to an embodiment of the present disclosure.
- FIG. 7 is a coefficient image of a frame image after Hadamard transform according to an embodiment of the present invention.
- FIG. 8 is a schematic diagram of spatial transformation decomposition according to an embodiment of the present invention.
- FIG. 9 is a schematic diagram of spatial downsampling decomposition according to an embodiment of the present invention.
- FIG. 10 is a schematic flowchart of selecting at least one channel of multiple channels to transmit an encoded sub-video data unit according to an embodiment of the present disclosure
- FIG. 11 is a schematic flowchart diagram of a video receiving method applicable to a movable object according to an embodiment of the present invention.
- FIG. 12 is a schematic diagram of a decoded sub-image according to an embodiment of the present invention.
- FIG. 13 is a schematic diagram of a decoded sub-image according to another embodiment of the present invention.
- FIG. 14 is a schematic diagram of reconstructing an original image according to an embodiment of the present invention.
- FIG. 15 is a schematic diagram of reconstructing an original image according to another embodiment of the present invention.
- FIG. 16 is a structural diagram of a video transmission system suitable for a movable object according to an embodiment of the present invention.
- FIG. 17 is a structural diagram of a video receiving system suitable for a movable object according to an embodiment of the present invention.
- when a component is referred to as being "fixed to" another component, it can be directly on the other component or an intervening component may be present; when a component is considered to be "connected to" another component, it can be directly connected to the other component or an intervening component may be present.
- the embodiment of the present invention provides a video transmission method suitable for a movable object.
- the movable platform can be an unmanned aerial vehicle, a ground mobile robot, a handheld gimbal, etc.
- FIG. 1 is a schematic flowchart of a video sending method applicable to a movable object according to an embodiment of the present invention. As shown in FIG. 1, the method may include:
- S101 Decompose video data into multiple sub-video data units, where each sub-video data unit includes one or more sub-images;
- the execution body of this embodiment may be a data processor, where the data processor may be a dedicated processor, for example, a processor for performing image processing, or a general-purpose processor, which is not specifically limited in the present invention.
- the data processor obtains, in real time, video data captured by the photographing device mounted on the movable platform. The video data may include one image frame or multiple image frames. The data processor decomposes each image frame included in the video data, and each sub-video data unit may include one or more sub-images. Specifically, decomposing the video data into a plurality of sub-video data units is configured to: decompose each of the one or more image frames in the video data into a plurality of sub-images, where each sub-video data unit includes at least one of the plurality of sub-images obtained by decomposing an image frame; specifically, at least one of the plurality of sub-images obtained by decomposing each image frame is selected, and the selected sub-images are combined into sub-video data units.
- the sub-image described above may be part of an image frame.
- the sub-image may also be one or more pixels of the image frame, or the sub-image may also be one or more conversion coefficients of the image frame; for details, please refer to the following sections, which are not repeated here. This embodiment does not limit the number of image frames included in one piece of video data. To schematically illustrate the decomposition process of the video data, it is assumed that the video data includes 6 image frames, that is, 6 frames; in other embodiments, the number of image frames included in the video data may also be other values.
- the video data includes six image frames, namely frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6, and frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6 are each decomposed separately.
- the number of sub-images into which each image frame is decomposed is not limited.
- each image frame is decomposed into four sub-images, which is only a schematic description.
- the number of sub-images obtained after each image frame is decomposed may also be other values.
- Each of the sub-video data units includes at least one of the four sub-images corresponding to each of the image frames obtained by decomposing each of the six image frames.
- each sub-video data unit includes one of the four sub-images corresponding to each image frame in the six image frames.
- the sub-video data unit 210 includes one sub-image 11 of frame 1, a sub-image 21 of frame 2, a sub-image 31 of frame 3, a sub-image 41 of frame 4, a sub-image 51 of frame 5, and a sub-image 61 of frame 6; similarly, sub-video data unit 220, sub-video data unit 230, and sub-video data unit 240 each include one sub-image of each of the six image frames.
- the sub-video data unit 310 includes two sub-images 11, 12 of the frame 1, and two sub-images 21, 22 of the frame 2.
- sub-video data unit 320 includes one sub-image 13 of frame 1, one or more sub-images of frames 2 to 4, two sub-images 52, 53 of frame 5, and two sub-images 62, 63 of frame 6; and sub-video data unit 330 includes one sub-image of each of the six image frames.
- the manner in which at least one of the plurality of sub-images corresponding to each of the plurality of image frames is combined to form a sub-video data unit may also have other combinations, which are not enumerated here.
- the video data may include only one image frame, that is, one frame image.
- 40 indicates an image frame included in the video data, and the image frame 40 is decomposed. This embodiment does not limit the number of sub-images obtained by decomposing an image frame; optionally, the image frame 40 is decomposed into four sub-images, namely sub-image 11, sub-image 12, sub-image 13, and sub-image 14, as shown in FIG. 4.
- sub-image 11, sub-image 12, sub-image 13, and sub-image 14 can be divided into sub-video data units in the following achievable ways:
- each sub-video data unit includes one sub-image, such as sub-video data unit 410, sub-video data unit 420, sub-video data unit 430, sub-video data unit 440, as shown in FIG.
- each sub-video data unit includes two sub-images. This embodiment does not limit the combination of two sub-images included in one sub-video data unit.
- for example, the sub-video data unit 450 and the sub-video data unit 460 shown in FIG. 4, wherein sub-video data unit 450 includes sub-image 11 and sub-image 12, and sub-video data unit 460 includes sub-image 13 and sub-image 14.
- each sub-video data unit includes a different number of sub-images, such as the sub-video data unit 470 and the sub-video data unit 480 shown in FIG. 4, wherein sub-video data unit 470 includes three sub-images, namely sub-image 11, sub-image 12, and sub-image 13, and sub-video data unit 480 includes sub-image 14.
- the data processor encodes each of the plurality of sub-video data units, using the sub-video data unit as a coding unit, and obtains a plurality of pieces of code stream data after encoding; optionally, encoding one sub-video data unit yields one piece of code stream data. The coding includes source coding and/or channel coding; the source coding may include H.263, H.264, H.265, MPEG4, etc., and the channel coding may include error-correcting codes such as an RS code, that is, a Reed-Solomon code, a convolutional code, a Turbo code, a Polar code, an interleaving code, a pseudo-random sequence scrambling code, and the like.
- S103 Select at least one of the plurality of channels to transmit the encoded sub-video data unit according to one or more characteristics of the channel and one or more characteristics of the sub-video data unit.
- one or more characteristics of the channel include at least a bandwidth, or one or more characteristics of the channel include at least one of: noise, interference, signal-to-noise ratio, bit error rate, fading rate, bandwidth, and number of available channels; and one or more characteristics of the sub-video data unit may include: the size of the code stream data encoded from the sub-video data unit, or the energy concentration of the sub-video data unit.
- a plurality of encoded sub-video data units need to be sent out over multiple channels. Before transmission, the characteristics of the multiple channels can be evaluated, and the data processor selects, based on the evaluated one or more characteristics of the channels and one or more characteristics of the sub-video data units, at least one of the plurality of channels to transmit each encoded sub-video data unit, so that each encoded sub-video data unit is transmitted on a channel whose characteristics match it and is thereby transmitted to the receiving device.
- the receiving device may be a remote controller, a smart phone, a tablet computer, a ground control station, a laptop computer, a wearable device (watch, a wristband, etc.), a combination thereof, or the like.
- selecting one of the plurality of channels to transmit the encoded sub-video data unit is implemented by selecting one channel from the plurality of channels for each of the plurality of sub-video data units, The selected channel transmits the sub-video data unit.
- the plurality of sub-video data units may have the same or similar characteristics, where the characteristics include the code stream data size encoded from the sub-video data unit or the priority of the sub-video data unit, and the priorities of the plurality of sub-video data units are determined according to the energy concentration of the sub-video data units; that the plurality of sub-video data units have the same or similar characteristics may mean that the plurality of sub-video data units have the same or similar code stream data size, or the same or similar priority.
- the plurality of sub-video data units may also have different characteristics, including the code stream data size encoded from the sub-video data unit or the priority of the sub-video data unit, that is, the code stream data sizes of the plurality of encoded sub-video data units are different, or the priorities of the plurality of sub-video data units are different.
- one of the plurality of channels may be selected to transmit the corresponding sub-video data unit by the following feasible methods:
- the first feasible manner is: selecting one channel from the plurality of channels for each of the plurality of sub-video data units according to the code stream data size and the channel bandwidth of the encoded sub-video data unit, using the selected channel Transmitting the sub-video data unit; specifically, the code stream data size of the one or more encoded sub-video data units matches the channel bandwidth of the selected channel, so that the encoded sub-video data unit is in its code stream The data size is matched on the channel for transmission.
- the characteristics of the multiple channels may differ at a given time; for example, the bandwidths of the channels may differ, with some channels having a large bandwidth and others having a small bandwidth.
- the data processor decomposes the video data to obtain four sub-video data units
- the four sub-video data units are respectively recorded as a sub-video data unit A, a sub-video data unit B, a sub-video data unit C, and a sub-video data unit D.
- the sizes of the code streams obtained by encoding the four sub-video data units are S0, S1, S2, and S3, respectively, and the sizes of the four code streams may differ; for illustration, it may be assumed that the four code stream data sizes decrease in order. If the currently available radio channels include channel 1, channel 2, channel 3, and channel 4, with bandwidths T0, T1, T2, and T3, respectively, and the bandwidths of the four channels also decrease in order, a channel may be selected for each piece of code stream data according to the current bandwidth of each channel. For example, channel 1 with the largest channel bandwidth can be used to transmit the encoded sub-video data unit A, channel 2 with the second largest channel bandwidth can be used to transmit the encoded sub-video data unit B, channel 3 with the third largest channel bandwidth can be used to transmit the encoded sub-video data unit C, and channel 4 with the smallest channel bandwidth can be used to transmit the encoded sub-video data unit D.
- the channel with strong data transmission capability can transmit the sub-video data unit with larger coded stream data
- the channel with weak data transmission capability can transmit the sub-video data unit with smaller code stream data.
- the sub-video data unit can be matched with the channel, and the data processor can select the channel to transmit the sub-video data unit according to the size of the code stream data encoded by the sub-video data unit and the channel bandwidth.
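- As a rough illustration of the first feasible manner just described, the following Python sketch (not part of the patent; the helper name assign_by_stream_size and the example numbers are assumptions) matches the largest encoded code stream to the channel with the largest bandwidth, the second largest to the second largest, and so on:

```python
def assign_by_stream_size(stream_sizes, channel_bandwidths):
    """Match each encoded sub-video data unit to a channel by size/bandwidth.

    stream_sizes: dict of unit id -> encoded code-stream size
    channel_bandwidths: dict of channel id -> current bandwidth
    The largest code stream goes to the channel with the largest bandwidth,
    the second largest to the second largest, and so on.
    """
    units = sorted(stream_sizes, key=stream_sizes.get, reverse=True)
    channels = sorted(channel_bandwidths, key=channel_bandwidths.get, reverse=True)
    return dict(zip(units, channels))  # unit id -> selected channel id

# Hypothetical example mirroring S0 > S1 > S2 > S3 and T0 > T1 > T2 > T3:
assignment = assign_by_stream_size(
    {"A": 400, "B": 300, "C": 200, "D": 100},
    {"channel 1": 40, "channel 2": 30, "channel 3": 20, "channel 4": 10})
# -> {"A": "channel 1", "B": "channel 2", "C": "channel 3", "D": "channel 4"}
```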
- the second feasible way is that multiple sub-video data units can be prioritized according to energy concentration. Further, according to the priority and channel bandwidth of the encoded sub-video data unit, one channel is selected from the plurality of channels for each of the plurality of sub-video data units, and the sub-video data is transmitted by using the selected channel. unit. Specifically, the priority of the one or more encoded sub-video data units matches the channel bandwidth of the selected channel such that the encoded sub-video data unit is transmitted on a channel that matches its priority.
- the energy concentration of each sub-video data unit in the plurality of sub-video data units may differ due to the use of a specific decomposition method (see below; not described here). Therefore, in some embodiments, if the energy concentrations of the sub-video data units differ, the plurality of sub-video data units are prioritized based on the energy concentration of each sub-video data unit; an optional prioritization rule is: the greater the energy concentration, the higher the priority.
- the data processor may select at least one channel to transmit the encoded sub-video data unit according to the code stream data size of the encoded sub-video data unit and the channel bandwidth, so that the code stream data size of the encoded sub-video data unit matches the channel bandwidth; in addition, one channel may be selected for each sub-video data unit according to the priority of the sub-video data unit and the channel bandwidth, and the sub-video data unit is transmitted on the selected channel, so that the priority of the sub-video data unit matches the channel bandwidth. In this way, an appropriate channel can be selected for each sub-video data unit based on the bandwidth of the channel. For example, if the data processor decomposes the video data to obtain four sub-video data units, recorded as sub-video data unit A, sub-video data unit B, sub-video data unit C, and sub-video data unit D, the priorities of the four sub-video data units may differ; for illustration, it may be assumed that the priorities of the four sub-video data units decrease in order, and that the currently available radio channels include channel 1, channel 2, channel 3, and channel 4, whose bandwidths also decrease in order.
- the priority of each sub-video data unit can then be matched with the bandwidth of a channel, that is, channel 1 with the largest channel bandwidth can be used to transmit the encoded highest-priority sub-video data unit A, channel 2 with the second largest channel bandwidth can be used to transmit the encoded second-highest-priority sub-video data unit B, channel 3 with the third largest channel bandwidth can be used to transmit the encoded third-highest-priority sub-video data unit C, and channel 4 with the smallest channel bandwidth can be used to transmit the encoded lowest-priority sub-video data unit D.
- alternatively, the currently available channels may include only channel 1 and channel 2; in this case, channel 1 can be selected to transmit sub-video data unit A, channel 2 can be selected to transmit sub-video data unit B, and sub-video data units C and D may be left untransmitted.
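- A minimal sketch of the second feasible manner, assuming a hypothetical helper assign_by_priority and example priority values; when there are fewer channels than sub-video data units, the lowest-priority units are simply not transmitted, as in the channel 1/channel 2 example above:

```python
def assign_by_priority(priorities, channel_bandwidths):
    """Match sub-video data units to channels by priority and bandwidth.

    Units are ranked by priority (derived from energy concentration) and
    paired with channels ranked by bandwidth; units left over when the
    channels run out are dropped (not transmitted).
    """
    ranked_units = sorted(priorities, key=priorities.get, reverse=True)
    ranked_channels = sorted(channel_bandwidths, key=channel_bandwidths.get, reverse=True)
    return dict(zip(ranked_units, ranked_channels))

# With only channel 1 and channel 2 available, units C and D are dropped:
assign_by_priority({"A": 4, "B": 3, "C": 2, "D": 1}, {"channel 1": 40, "channel 2": 30})
# -> {"A": "channel 1", "B": "channel 2"}
```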
- the bandwidths of the multiple channels may be the same or substantially the same; in this case, any one of the multiple channels may be randomly selected to transmit each encoded sub-video data unit, while ensuring that the selected channels do not overlap.
- the video sending method applicable to a movable object provided by this embodiment decomposes the video data into a plurality of sub-video data units, encodes the plurality of sub-video data units separately, and selects at least one of the plurality of channels to transmit the encoded sub-video data units according to the channel characteristics and the characteristics of the sub-video data units; when at least one of the plurality of channels is selected to transmit the one or more encoded sub-video data units, each sub-video data unit can be transmitted on a channel matched to it, which improves the efficiency of video transmission while expanding the channel bandwidth of the video transmission system.
- FIG. 5 is a schematic flowchart of a video sending method applicable to a movable object according to another embodiment of the present invention.
- in the present embodiment, the specific implementation manner in which each of the one or more image frames in the video data is decomposed into a plurality of sub-images is not limited, and those skilled in the art may set it according to specific design requirements; optionally, decomposing each of the one or more image frames in the video data into a plurality of sub-images is set to include:
- S501 Decompose each of the one or more image frames in the video data into a plurality of sub-images.
- the video data may include one frame image or consecutive multi-frame images. This embodiment does not limit the number of pixels included in one frame image, nor does it limit the pixel value of each pixel.
- when the data processor decomposes the video data, each of the one or more image frames of the video data may be decomposed into a plurality of sub-images. Taking one frame image included in the video data as an example, the process of spatially decomposing the frame image can be achieved in several possible ways:
- One of the achievable manners is that the decomposition of each of the one or more image frames in the video data into a plurality of sub-images is set to include:
- S502 Decompose each of the one or more image frames in the video data into a plurality of sub-images by using a Fourier correlation transform or an orthogonal transform.
- the Fourier-related transform or the orthogonal transform is selected from a Hadamard transform, a discrete cosine transform, a discrete Fourier transform, a Walsh-Hadamard transform, a Haar transform, or a slant transform.
- FIG. 6 is a schematic diagram of a frame image. This embodiment does not limit the number of pixels included in a frame image; for example, the image includes 16 pixels, denoted P1-P16. A spatial transform is applied to the pixel values of every four adjacent pixels of the 16 pixels, and the frame image is thereby decomposed into four sub-images.
- the following schematically describes the Hadamard transform; the specific spatial transformation decomposition process includes the following steps:
- Step 1: perform a Hadamard transform using every four adjacent pixels of the 16 pixels as a unit.
- four adjacent pixels are selected as one unit only for illustrative purposes, and those skilled in the art may select other methods.
- the conversion coefficients obtained from P1, P2, P3, and P4 after the Hadamard transform are H1, H2, H3, and H4, wherein the relationship between P1, P2, P3, P4 and H1, H2, H3, H4 satisfies formulas (1), (2), (3), and (4):
- H1 = (P1 + P2 + P3 + P4 + 1) >> 1 (1)
- H2 = (P1 + P2 - P3 - P4 + 1) >> 1 (2)
- H3 = (P1 + P3 - P2 - P4 + 1) >> 1 (3)
- H4 = (P1 + P4 - P2 - P3 + 1) >> 1 (4)
- H1 contains the average energy of 4 pixels
- H2 contains the average gradient of 4 pixels in the vertical direction
- H3 contains the average gradient of the 4 pixels in the horizontal direction
- H4 contains the cross gradient of the 4 pixels, i.e., texture information. Therefore, when the receiving device reconstructs the frame image, H1 is the most important, H2 and H3 are the second most important, and H4 is the least important, that is, the importance of H1, H2, H3, and H4 decreases in order.
- Step 2 Decompose the conversion coefficients obtained by the Hadamard transform into different sub-images.
- This embodiment does not limit the number of sub-images obtained by spatially transforming each frame image; the number of sub-images obtained after decomposition being four is only a schematic description, and in other embodiments the number of sub-images obtained by spatially transforming each frame image may also be other values. Optionally, H1 is assigned to the first sub-image, H2 to the second sub-image, H3 to the third sub-image, and H4 to the fourth sub-image.
- H5-H8 is decomposed into 4 sub-images in the same way
- H9-H12 is decomposed into 4 sub-images in the same way
- H13-H16 is decomposed into 4 sub-images in the same way, to obtain the decomposition result shown in FIG. 8.
- the resolution of each of the four sub-images after the spatial transformation decomposition is one quarter of the original image before the decomposition.
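- The following Python sketch (not part of the patent disclosure) illustrates formulas (1)-(4) applied block by block; the function name hadamard_decompose and the assumption that P1, P2 form the top row and P3, P4 the bottom row of each 2x2 block are illustrative choices:

```python
import numpy as np

def hadamard_decompose(frame):
    """Decompose a frame into four quarter-resolution coefficient sub-images.

    Each 2x2 block of adjacent pixels P1..P4 yields coefficients H1..H4 via
    formulas (1)-(4); coefficients of the same kind are gathered into one of
    four sub-images, each a quarter of the original resolution.
    """
    p1 = frame[0::2, 0::2].astype(np.int32)  # assumed top-left of each block
    p2 = frame[0::2, 1::2].astype(np.int32)  # assumed top-right
    p3 = frame[1::2, 0::2].astype(np.int32)  # assumed bottom-left
    p4 = frame[1::2, 1::2].astype(np.int32)  # assumed bottom-right

    h1 = (p1 + p2 + p3 + p4 + 1) >> 1  # average energy of the 4 pixels
    h2 = (p1 + p2 - p3 - p4 + 1) >> 1  # vertical gradient
    h3 = (p1 + p3 - p2 - p4 + 1) >> 1  # horizontal gradient
    h4 = (p1 + p4 - p2 - p3 + 1) >> 1  # cross gradient (texture)
    return h1, h2, h3, h4  # the four coefficient sub-images
```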
- in another achievable manner, decomposing each of the one or more image frames in the video data into a plurality of sub-images is set to include:
- S503 Decompose each of the one or more image frames in the video data into a plurality of sub-images by using spatial downsampling.
- This embodiment does not limit the number of sub-images obtained by spatially downsampling each frame image; the number of sub-images obtained after decomposition being four is only a schematic illustration, and in other embodiments the number of sub-images obtained by spatial downsampling of each frame image may also be other values.
- spatial down-sampling is performed on pixel values of every four adjacent pixel points of 16 pixel points.
- the specific spatial downsampling decomposition process is: using every four adjacent pixels of the 16 pixels as a unit, the 4 pixels in one unit are decomposed into different sub-images; P5-P8 are decomposed into the four sub-images in the same manner, P9-P12 are decomposed into the four sub-images in the same manner, and P13-P16 are decomposed into the four sub-images in the same manner, to obtain the decomposition result shown in FIG. 9.
- the resolution of each of the four sub-images after spatial down-sampling is one-fourth of the original image before the decomposition.
- the size of the original image before decomposition is W*H.
- the original image is decomposed into 4 sub-images, and the row number or column number of the pixel matrix corresponding to the original image or a sub-image is counted from 0;
- the first sub-image may include the pixels with coordinates (2i, 2j) in the original image;
- the second sub-image may include the pixels with coordinates (2i+1, 2j) in the original image;
- the third sub-image may include the pixels with coordinates (2i, 2j+1) in the original image;
- the fourth sub-image may include the pixels with coordinates (2i+1, 2j+1) in the original image, where 2i+1 < W and 2j+1 < H.
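- A minimal sketch of the spatial downsampling decomposition, assuming the first coordinate is the row index; the function name downsample_decompose is an illustrative choice and not from the patent:

```python
import numpy as np

def downsample_decompose(frame):
    """Split a frame into four quarter-resolution sub-images by pixel position.

    Row and column indices are counted from 0; the four sub-images take the
    pixels at (2i, 2j), (2i+1, 2j), (2i, 2j+1) and (2i+1, 2j+1) respectively.
    """
    sub1 = frame[0::2, 0::2]  # pixels at (2i, 2j)
    sub2 = frame[1::2, 0::2]  # pixels at (2i+1, 2j)
    sub3 = frame[0::2, 1::2]  # pixels at (2i, 2j+1)
    sub4 = frame[1::2, 1::2]  # pixels at (2i+1, 2j+1)
    return sub1, sub2, sub3, sub4
```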
- Each of the one or more image frames in the video data can be decomposed into a plurality of sub-images according to any of the above-described decomposition methods or spatial downsampling.
- one image frame is one frame image
- a plurality of image frames are multi-frame images.
- the video data includes one or more image frames
- the sub-video data unit includes at least one of a plurality of sub-images obtained by decomposing each of the image frames.
- the sub-video data unit may include at least one of the plurality of sub-images obtained by decomposing the image frame; for example, a sub-video data unit includes one sub-image, and each sub-image obtained after the decomposition is encoded to obtain the code stream data of the sub-video data unit.
- each image frame, that is, each frame image, is decomposed as shown in FIG. 8 or FIG. 9. For example, if the video data includes 4 image frames and each image frame is decomposed into 4 sub-images, the 4 consecutive image frames are decomposed to obtain 4*4 sub-images, and each sub-video data unit may include multiple of the 4*4 sub-images; specifically, there may be four sub-video data units, and at least one sub-image is selected from the plurality of sub-images (four sub-images) decomposed from each image frame, and the selected sub-images are combined into a sub-video data unit.
- each sub-image includes a portion of an image frame.
- each sub-image includes one or more pixels of an image frame.
- each sub-image includes one or more conversion coefficients of the image frame.
- in FIG. 9, the energy concentrations of sub-image 1, sub-image 2, sub-image 3, and sub-image 4 are the same or similar, so that sub-image 1, sub-image 2, sub-image 3, and sub-image 4 have the same importance.
- in FIG. 8, the energy concentration of sub-image 1 is the largest, the energy concentrations of sub-image 2 and sub-image 3 are slightly smaller than that of sub-image 1, and the energy concentration of sub-image 4 is the smallest; it follows that sub-image 1 is the most important, sub-image 2 and sub-image 3 are of lower importance, and sub-image 4 is of the lowest importance. Therefore, the sub-video data unit containing sub-image 1 has the highest priority, a sub-video data unit containing sub-image 2 or sub-image 3 has a lower priority, and the sub-video data unit containing sub-image 4 has the lowest priority.
- the video transmitting method for a movable object provided in this embodiment decomposes each of the one or more image frames in the video data into a plurality of sub-images by Fourier-related transform, orthogonal transform, or spatial downsampling, thereby implementing the process of decomposing each of the one or more image frames in the video data into a plurality of sub-images. The receiving device reconstructs the image frames from the received sub-images in the sub-video data; when an individual sub-image is received in error, the image frame can still be reconstructed, which improves the reliability and fault tolerance of data transmission.
- one achievable manner is: separately encoding the plurality of sub-video data units to include:
- a plurality of sub-video data units are encoded by a plurality of separate encoders.
- multiple sub-video data units may be encoded in parallel by using multiple independent encoders; or, multiple sub-video data units may be encoded by using different video encoding rules; or Multiple sub-video data units can also be encoded using the same video coding rules.
- Another achievable way is to separately code multiple sub-video data units to include:
- Two or more of the plurality of sub-video data units are encoded by the same encoder.
- Yet another achievable manner is: separately encoding a plurality of sub-video data units to include:
- At least one of the plurality of sub-video data units is encoded using a motion compensation based video compression standard.
- Yet another achievable manner is: separately encoding a plurality of sub-video data units to include:
- Multiple sub-video data units are compressed according to different compression ratios.
- the compression ratio may be determined according to one or more characteristics of the sub-video data unit; and for a plurality of sub-video data units, they may have the same or similar characteristics, or may have different characteristics, and the similar characteristics described above. It may mean that the degree of difference in characteristics is less than or equal to a difference threshold set in advance.
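- As one possible illustration of compressing sub-video data units at different compression ratios (the mapping below is an assumption, not specified by the patent), a unit with a higher energy concentration, and hence a higher priority, could be given a lower compression ratio:

```python
def select_compression_ratio(energy_concentration, low_ratio=8.0, high_ratio=30.0):
    """Hypothetical mapping from a unit's energy concentration to a ratio.

    energy_concentration is assumed to be normalised to [0, 1]; higher values
    (higher priority) map to a lower, less lossy compression ratio.
    """
    e = min(max(energy_concentration, 0.0), 1.0)
    return high_ratio - e * (high_ratio - low_ratio)

# e.g. select_compression_ratio(1.0) -> 8.0, select_compression_ratio(0.0) -> 30.0
```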
- the video sending method applicable to the movable object provided in this embodiment uses different encoders or video compression standards based on motion compensation or different compression ratios to encode or compress multiple sub-video data units, and the implementation manner is various and convenient to operate. And also effectively guarantees the reliability and flexibility of encoding the sub-video data unit.
- FIG. 10 is a schematic flowchart of selecting at least one of a plurality of channels to transmit an encoded sub-video data unit according to an embodiment of the present disclosure. On the basis of the foregoing embodiments, referring to FIG. 10, in this embodiment, selecting at least one of the plurality of channels to transmit the encoded sub-video data unit may be set to include:
- S1001 Divide the encoded multiple sub-video data units into one or more sub-video data unit groups according to one or more characteristics of one or more channels;
- the video data unit group may include one or more encoded sub-video data units.
- the plurality of encoded sub-video data units may be divided into one or more sub-video data unit groups according to the number of available channels; for example, if the video data is decomposed into sub-video data unit A, sub-video data unit B, sub-video data unit C, and sub-video data unit D, and there are currently two available channels, the sub-video data units A, B, C, and D may be divided into two sub-video data unit groups, and for the divided sub-video data unit groups, each sub-video data unit group is transmitted on one available channel.
- the encoded plurality of sub-video data units may also be divided into one or more sub-video data unit groups according to the channel bandwidth; for example, the video data is decomposed into sub-video data unit A, sub-video data unit B, sub-video data unit C, sub-video data unit D, sub-video data unit E, and sub-video data unit F, and the sizes of the code stream data corresponding to the six sub-video data units are S0, S1, S2, S3, S4, and S5, respectively. To meet the transmission delay requirement for the transmission of two sub-video data unit groups, the sub-video data units A, B, C, D, E, and F may be divided into two sub-video data unit groups, with channel 1 transmitting one sub-video data unit group and channel 2 transmitting the other sub-video data unit group.
- S1002 Select at least one of the plurality of channels to send the sub-video data unit group.
- the combined sub-video data unit group can be used as a sending unit.
- selecting at least one of the multiple channels to send the sub-video data unit group is set to include:
- S1003 Select a channel that matches at least one channel bandwidth of the sub-video data unit group to transmit a sub-video data unit group among the plurality of channels.
- a channel whose channel bandwidth matches the code stream data size of the sub-video data unit group may be selected to transmit the sub-video data unit group. For example, there are currently 2 available channels, namely channel 1 and channel 2, with bandwidths T0 and T1, respectively, decreasing in order, and the video data is decomposed into sub-video data unit A, sub-video data unit B, sub-video data unit C, and sub-video data unit D; sub-video data units A and B are divided into one sub-video data unit group 1, and sub-video data units C and D are divided into one sub-video data unit group 2, wherein the code stream data size of sub-video data unit group 1 is S0+S1, and the code stream data size of sub-video data unit group 2 is S2+S3.
- channel 1, which matches sub-video data unit group 1, can be selected to transmit sub-video data unit group 1, and channel 2, which matches sub-video data unit group 2, can be selected to transmit sub-video data unit group 2.
- selecting at least one of the plurality of channels to send a sub-video data unit group is set to include:
- S1004 Select at least one of the plurality of channels to select the sub-video data unit group according to the code stream data size and the channel bandwidth of the sub-video data unit group.
- a channel may be selected for each sub-video data unit group according to the code stream data size of the sub-video data unit group and the channel bandwidth. For example, there are currently 2 channels, namely channel 1 and channel 2, whose bandwidths decrease in order, and the video data is decomposed into sub-video data unit A, sub-video data unit B, sub-video data unit C, and sub-video data unit D, with the sizes of the code stream data corresponding to the 4 encoded sub-video data units decreasing in order; sub-video data units A and B are divided into one sub-video data unit group 1, and sub-video data units C and D are divided into one sub-video data unit group 2, so that the code stream data size of sub-video data unit group 1 is larger than that of sub-video data unit group 2.
- channel 1 with the larger channel bandwidth can then be selected to transmit sub-video data unit group 1, and channel 2 with the smaller channel bandwidth can be selected to transmit sub-video data unit group 2.
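- The following Python sketch (an assumed illustration, not the patent's algorithm) combines S1001 and S1004: encoded units are split into as many groups as there are channels while roughly balancing the total code-stream size, and the group with the larger total is sent on the channel with the larger bandwidth:

```python
def split_into_groups_and_assign(stream_sizes, channel_bandwidths):
    """Divide encoded sub-video data units into groups and pick channels.

    stream_sizes: dict of unit id -> encoded code-stream size
    channel_bandwidths: dict of channel id -> bandwidth
    """
    n = len(channel_bandwidths)
    groups = [[] for _ in range(n)]
    totals = [0] * n
    # Greedy balancing: place the largest remaining code stream into the
    # group whose running total is currently the smallest.
    for unit in sorted(stream_sizes, key=stream_sizes.get, reverse=True):
        i = totals.index(min(totals))
        groups[i].append(unit)
        totals[i] += stream_sizes[unit]
    # Larger group total -> channel with larger bandwidth.
    ranked_groups = sorted(range(n), key=lambda i: totals[i], reverse=True)
    ranked_channels = sorted(channel_bandwidths, key=channel_bandwidths.get, reverse=True)
    return {ranked_channels[k]: groups[ranked_groups[k]] for k in range(n)}

# Six units A-F on two channels whose bandwidths decrease in order:
split_into_groups_and_assign(
    {"A": 60, "B": 50, "C": 40, "D": 30, "E": 20, "F": 10},
    {"channel 1": 40, "channel 2": 30})
```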
- selecting at least one of the plurality of channels to send a sub-video data unit group is set to include:
- S1005 Select at least one of the plurality of channels to select the sub-video data unit group according to the priority of the sub-video data unit group and the channel bandwidth.
- a channel may be selected to transmit each sub-video data unit group according to the channel bandwidth and the priority of the sub-video data unit group. For example, there are currently two channels, namely channel 1 and channel 2, whose bandwidths decrease in order, and the video data is decomposed into sub-video data unit A, sub-video data unit B, sub-video data unit C, and sub-video data unit D, whose priorities decrease in order; sub-video data units A and B are divided into one sub-video data unit group 1, and sub-video data units C and D are divided into one sub-video data unit group 2, so that the priority of sub-video data unit group 1 is greater than the priority of sub-video data unit group 2.
- in this case, in order to ensure that the high-priority sub-video data unit group 1 can be sent out and the transmission delay requirement is met, channel 1 with the larger channel bandwidth can be selected to transmit sub-video data unit group 1, and channel 2 with the smaller channel bandwidth can be selected to transmit sub-video data unit group 2.
- the specific implementation process and effect of the various manners of selecting at least one of the multiple channels to transmit a sub-video data unit group are similar to those of selecting at least one of the multiple channels to transmit an encoded sub-video data unit in the foregoing embodiments; the only difference is that in the foregoing embodiments the channel is used to send an encoded sub-video data unit, whereas in this embodiment the channel is used to transmit a sub-video data unit group into which the sub-video data units are divided. Therefore, for the specific implementation process and effect, reference may be made to the foregoing content, and details are not described herein again.
- the embodiment of the invention further provides a computer storage medium, wherein the computer storage medium stores program instructions, and the program instructions may include some or all of the steps of the sending method in the foregoing embodiments.
- FIG. 11 is a schematic flowchart of a video receiving method applicable to a movable object according to an embodiment of the present invention.
- this embodiment provides a video receiving method suitable for a movable object; for example, the video receiving method is used to receive video image information transmitted by an unmanned aerial vehicle.
- the video receiving method includes:
- S1101 Receive multiple encoded sub-video data units from multiple channels
- the movable object decomposes each image frame included in the video data into a plurality of sub-images before transmitting the video data through the plurality of channels, and each sub-video data unit includes at least one of the plurality of sub-images obtained by decomposing each image frame.
- the receiving device receives the plurality of encoded sub-video data units transmitted by the video transmitting system of the movable object through the multiple channels.
- each channel can transmit one or more encoded sub-video data units.
- the decomposition of the video data is as shown in FIG. 2, FIG. 3, or FIG. 4, and the decomposition of each frame of the video data may be as shown in FIG. 8 or FIG. 9; the specific decomposition process is consistent with the above embodiments and is not described herein again.
- S1102 Decode the encoded multiple sub-video data units.
- the video data unit includes an image frame, and one image frame is decomposed into four sub-video data units, and each sub-video data unit includes one sub-image for illustrative explanation.
- the receiving device may separately decode the four encoded sub-video data units, that is, the code streams corresponding to the four sub-images. The data is decoded separately to obtain a decoded sub-video data unit.
- when a sub-video data unit is transmitted over a wireless channel, noise, interference, multipath effects, fading, and the like may cause the sub-video data unit obtained by the receiving device to differ from the encoded sub-video data unit actually transmitted by the communication system of the UAV; in this case, the receiving device receives an error.
- if spatial transform decomposition is used, the four sub-images obtained by the receiving device after decoding the code stream data are as shown in FIG. 12, wherein if sub-image 1 is transmitted correctly, H1 and h1 are the same, H2 and h2 are the same, H3 and h3 are the same, and H4 and h4 are the same; if sub-image 1 is transmitted incorrectly, at least one of the pairs H1 and h1, H2 and h2, H3 and h3, H4 and h4 differs. Similarly, whether the other sub-images are transmitted correctly or incorrectly, the transform coefficients before transmission and after transmission have the same relationship.
- if spatial downsampling decomposition is used, the four sub-images obtained by the receiving device after decoding the code stream data are as shown in FIG. 13, wherein if sub-image 1 is transmitted correctly, P1 and p1 are the same, P2 and p2 are the same, P3 and p3 are the same, and P4 and p4 are the same; if sub-image 1 is transmitted incorrectly, at least one of the pairs P1 and p1, P2 and p2, P3 and p3, P4 and p4 differs. Similarly, whether the other sub-images are transmitted correctly or incorrectly, the pixels before transmission and after transmission have the same relationship.
- S1103 reconstruct video data according to the decoded sub video data unit.
- whether one or more sub-images of each sub-video data unit are transmitted in error is detected, and the video data is reconstructed from the correctly received sub-images.
- the receiving device decodes the code stream data to obtain 4 sub-images; in order to improve the correctness of the reconstructed video data, each sub-image may be checked for correct or erroneous transmission, and the original image may be reconstructed from the correctly received sub-images.
- for example, the sub-images transmitted by the communication system are the sub-images shown in FIG. 8, and the sub-images received by the receiving device are as shown in FIG. 12; as shown in FIG. 12, assuming that sub-image 2 is received in error while sub-image 1, sub-image 3, and sub-image 4 are all received correctly, the receiving device reconstructs the original image from sub-image 1, sub-image 3, and sub-image 4 shown in FIG. 12 by using an inverse transform.
- when reconstructing the original image, the sub-image of the sub-video data unit that is transmitted in error is assigned a value; specifically, h2, h6, h10, and h14 of sub-image 2 can be set to 0.
- H1, H2, H3, and H4 are obtained from P1, P2, P3, and P4; therefore, when reconstructing the original image, an inverse Hadamard transform needs to be performed on h1, h2, h3, and h4. If another spatial transform is used to decompose the image frame, the receiving device uses the inverse of the corresponding spatial transform when reconstructing the original image.
- in this example, H1 and h1 are the same, H3 and h3 are the same, and H4 and h4 are the same, but h2 has been set to 0; therefore, p1, p2, p3, and p4 obtained by the inverse Hadamard transform may differ from the pixel values P1, P2, P3, and P4 in the original image, but reconstructing the original image from the correctly received sub-images ensures that the reconstructed image is close to the original image.
- h5, h6, h7, and h8 are inversely Hadamard-transformed to obtain p5, p6, p7, and p8; h9, h10, h11, and h12 are inversely Hadamard-transformed to obtain p9, p10, p11, and p12; and h13, h14, h15, and h16 are inversely Hadamard-transformed to obtain p13, p14, p15, and p16, wherein h6, h10, and h14 are all 0. The original image is reconstructed from p1-p16 obtained by the inverse Hadamard transform, as shown in FIG. 14.
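- The reconstruction just described can be sketched as follows (not part of the patent; the helper name and the block layout match the earlier decomposition sketch, and the inverse of formulas (1)-(4) is only approximate because of the rounding in the forward transform):

```python
import numpy as np

def reconstruct_from_hadamard(h1, h2, h3, h4, error_sub_images=()):
    """Rebuild a frame from decoded coefficient sub-images.

    Any sub-image named in error_sub_images (e.g. ("h2",)) was received in
    error and is set to 0 before the inverse transform, as in the h2 = 0
    example above; the result is therefore close to, not identical to, the
    original pixels.
    """
    coeffs = {"h1": h1.astype(np.int32), "h2": h2.astype(np.int32),
              "h3": h3.astype(np.int32), "h4": h4.astype(np.int32)}
    for name in error_sub_images:
        coeffs[name] = np.zeros_like(coeffs[name])
    c1, c2, c3, c4 = coeffs["h1"], coeffs["h2"], coeffs["h3"], coeffs["h4"]

    # Approximate inverse of the 2x2 Hadamard used in formulas (1)-(4).
    p1 = (c1 + c2 + c3 + c4) >> 1
    p2 = (c1 + c2 - c3 - c4) >> 1
    p3 = (c1 - c2 + c3 - c4) >> 1
    p4 = (c1 - c2 - c3 + c4) >> 1

    rows, cols = c1.shape
    frame = np.zeros((2 * rows, 2 * cols), dtype=np.int32)
    frame[0::2, 0::2] = p1  # same block layout as the decomposition sketch
    frame[0::2, 1::2] = p2
    frame[1::2, 0::2] = p3
    frame[1::2, 1::2] = p4
    return frame
```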
- if spatial downsampling decomposition is used, the sub-images sent by the video transmitting system are the sub-images shown in FIG. 9, and the sub-images received by the receiving device are as shown in FIG. 13.
- as shown in FIG. 13, it is assumed that sub-image 3 is received in error while sub-image 1, sub-image 2, and sub-image 4 are all received correctly; the receiving device reconstructs the original image from the sub-images shown in FIG. 13, and when reconstructing the original image, the sub-image of the sub-video data unit that is transmitted in error is assigned a value.
- the value assigned to the erroneously transmitted sub-image of the sub-video data unit is determined by interpolation; specifically, it is determined based on the correctly transmitted sub-images, where the erroneously transmitted sub-image and the correctly transmitted sub-images are from the same image frame.
- since sub-image 3 is received in error and sub-image 1, sub-image 2, and sub-image 4 are all received correctly, sub-image 3 does not participate in the reconstruction process, that is, the receiving device reconstructs the original image only from sub-image 1, sub-image 2, and sub-image 4.
- the specific process is as follows: the original image includes 16 pixels, and sub-image 1, sub-image 2, and sub-image 4 together contain 12 pixels. According to the decomposition described above, every four adjacent pixels of the 16 pixels in the original image are decomposed into four different sub-images. Therefore, when the original image is reconstructed from sub-image 1, sub-image 2, and sub-image 4, the first pixel p1 of sub-image 1, the first pixel p2 of sub-image 2, and the first pixel p4 of sub-image 4 are respectively three pixels P1, P2, and P4 of the first four adjacent pixels of the original image; similarly, p5, p6, and p8 are the three pixels P5, P6, and P8 of the original pixels P5-P8, p9, p10, and p12 are the three pixels P9, P10, and P12 of the original pixels P9-P12, and p13, p14, and p16 are the pixels P13, P14, and P16 of the original pixels P13-P16, respectively. Since p1, p2, p4, p5, p6, p8, p9, p10, p12, p13, p14, and p16 are all correctly received, each of them is the same as the pixel at the same position in the original image.
- the sub-image received in error may be assigned values, and the pixel values of image A may be determined by interpolation according to the correctly received sub-images.
- a feasible interpolation method is: p3 is equal to the arithmetic mean of p1, p2, and p4; p7 is equal to the arithmetic mean of p5, p6, and p8; p11 is equal to the arithmetic mean of p9, p10, and p12; and p15 is equal to the arithmetic mean of p13, p14, and p16, resulting in the reconstructed original image B.
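- A minimal sketch of this interpolation-based reconstruction, assuming sub-image 3 is the one received in error and using the same pixel layout as the downsampling sketch above (the helper name is illustrative):

```python
import numpy as np

def reconstruct_with_interpolation(sub1, sub2, sub4):
    """Rebuild a frame when sub-image 3 is lost.

    The missing sub-image is filled with the integer arithmetic mean of the
    three co-located pixels of the correctly received sub-images, matching
    the p3 = mean(p1, p2, p4) rule described above.
    """
    sub3 = (sub1.astype(np.int32) + sub2 + sub4) // 3  # interpolated pixels

    rows, cols = sub1.shape
    frame = np.zeros((2 * rows, 2 * cols), dtype=np.int32)
    frame[0::2, 0::2] = sub1  # (2i, 2j)
    frame[1::2, 0::2] = sub2  # (2i+1, 2j)
    frame[0::2, 1::2] = sub3  # (2i, 2j+1) -- reconstructed by interpolation
    frame[1::2, 1::2] = sub4  # (2i+1, 2j+1)
    return frame
```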
- the video receiving method for a movable object provided in this embodiment receives a plurality of encoded sub-video data units from a plurality of channels, decodes the encoded sub-video data units separately to obtain decoded sub-video data units, and reconstructs the original image from the decoded sub-video data units, so that as long as the data on one or more channels is received correctly, a reconstructed image without mosaic errors can be obtained; moreover, the more correct sub-video data units are received, the closer the reconstructed video is to the original video, which improves the fault tolerance and reliability of video transmission.
- An embodiment of the invention further provides a computer storage medium, wherein the computer storage medium stores program instructions, and execution of the program instructions may carry out some or all of the steps of the receiving method in the foregoing embodiments.
- the embodiment provides a video transmission system suitable for a movable object.
- The video transmission system 1600 can be disposed on a movable platform, such as an unmanned aerial vehicle, and is configured to perform the above video sending method. Specifically, the video sending system includes:
- One or more imaging devices 1601 configured to acquire video data
- One or more processors 1602, working alone or in concert, the one or more processors 1602 being configured to: decompose the video data into a plurality of sub-video data units, wherein each sub-video data unit includes one or more sub-images; encode the plurality of sub-video data units separately; and select at least one of the plurality of channels to transmit the encoded sub-video data units based on one or more characteristics of the channels and one or more characteristics of the sub-video data units.
- the above video data may include one or more image frames.
- When decomposing the video data into a plurality of sub-video data units, the processor 1602 may be configured to: decompose each of the one or more image frames in the video data into a plurality of sub-images, wherein each sub-video data unit includes at least one of the plurality of sub-images of each image frame.
- Each of the sub-images described above includes a portion of an image frame; specifically, each sub-image may include one or more pixels of the image frame, or each sub-image may include one or more transform coefficients of the image frame.
- The processor 1602 may be configured to spatially decompose each of the one or more image frames in the video data into multiple sub-images.
- One implementation is that the processor 1602 is configured to decompose each of the one or more image frames in the video data into multiple sub-images using a Fourier-related transform or an orthogonal transform, wherein the Fourier-related transform or orthogonal transform is selected from a Hadamard transform, a discrete cosine transform, a discrete Fourier transform, a Walsh-Hadamard transform, a Haar transform, or a slant transform.
- Another implementation is that the processor 1602 is configured to decompose each of the one or more image frames in the video data into a plurality of sub-images using spatial downsampling.
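As a concrete (and purely illustrative) picture of the spatial-downsampling option, the sketch below splits a frame into four sub-images by taking the four polyphase components of each 2x2 pixel block; the exact phase-to-sub-image numbering is an assumption and need not match FIG. 8.

```python
import numpy as np

def spatial_downsample_decompose(frame):
    """Split one image frame into four sub-images by spatial downsampling:
    the four adjacent pixels of every 2x2 block go to four different sub-images."""
    return [frame[0::2, 0::2],   # sub-image 1
            frame[0::2, 1::2],   # sub-image 2
            frame[1::2, 0::2],   # sub-image 3
            frame[1::2, 1::2]]   # sub-image 4

frame = np.arange(16, dtype=np.uint8).reshape(4, 4)   # the 16-pixel example image
sub_images = spatial_downsample_decompose(frame)       # four 2x2 sub-images
```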
- One or more characteristics of the foregoing sub-video data unit include the code stream data size of the encoded sub-video data unit or the priority of the sub-video data unit. When the characteristics of the sub-video data unit include its priority, the priorities of the plurality of sub-video data units may be determined according to the energy concentration of the sub-video data units; alternatively, the plurality of sub-video data units may have similar characteristics.
- one or more characteristics of the channel can be set to include at least the bandwidth.
- Alternatively, one or more characteristics of the channel are set to include at least one of noise, interference, signal-to-noise ratio, bit error rate, fading rate, bandwidth, or the number of available channels.
- When selecting at least one of the plurality of channels, the processor 1602 may be configured to: select one channel from the plurality of channels for each of the plurality of sub-video data units, and transmit the sub-video data unit using the selected channel.
- When the processor 1602 selects at least one of the multiple channels to send the encoded sub-video data units according to one or more characteristics of the channels and one or more characteristics of the sub-video data units, the processor 1602 may be configured to: select one channel from the plurality of channels for each of the plurality of sub-video data units according to the code stream data size of the encoded sub-video data unit and the channel bandwidth, and transmit the sub-video data unit using the selected channel.
- Alternatively, the processor 1602 may be configured to: select one channel from the plurality of channels for each of the plurality of sub-video data units according to the priority of the encoded sub-video data unit and the channel bandwidth, and transmit the sub-video data unit using the selected channel.
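One way the size/bandwidth matching above could be realized is the greedy sketch below. It is not the claimed algorithm, only an assumed illustration: the dictionary field names (`id`, `stream_size`, `bandwidth`) and the largest-first ordering are choices made for the example.

```python
def assign_channels_by_size(units, channels):
    """Greedy sketch: give each encoded sub-video data unit the channel whose
    remaining bandwidth can carry its code stream with the least slack."""
    remaining = {c['id']: c['bandwidth'] for c in channels}
    assignment = {}
    # Place the largest code streams first so they get the widest channels.
    for unit in sorted(units, key=lambda u: u['stream_size'], reverse=True):
        fitting = [cid for cid, bw in remaining.items() if bw >= unit['stream_size']]
        chosen = (min(fitting, key=lambda cid: remaining[cid] - unit['stream_size'])
                  if fitting else max(remaining, key=remaining.get))
        assignment[unit['id']] = chosen
        remaining[chosen] -= unit['stream_size']
    return assignment

units = [{'id': 'sub1', 'stream_size': 4.0}, {'id': 'sub2', 'stream_size': 1.5}]
channels = [{'id': 'ch_a', 'bandwidth': 5.0}, {'id': 'ch_b', 'bandwidth': 2.0}]
print(assign_channels_by_size(units, channels))  # {'sub1': 'ch_a', 'sub2': 'ch_b'}
```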
- Alternatively, the processor 1602 may be configured to: divide the plurality of encoded sub-video data units into one or more data unit groups according to one or more characteristics of the one or more channels, and select at least one of the plurality of channels to transmit the sub-video data unit groups.
- A first achievable manner is: when selecting at least one of the plurality of channels to transmit the sub-video data unit group, the processor 1602 may be configured to: select, among the plurality of channels, a channel whose channel bandwidth matches the code stream data size of the sub-video data unit group, and use it to send the sub-video data unit group.
- A second achievable manner is: when the processor 1602 selects at least one of the multiple channels to transmit the sub-video data unit group, the processor 1602 can be configured to: select at least one of the plurality of channels to transmit the sub-video data unit group according to the code stream data size of the sub-video data unit group and the channel bandwidth.
- A third achievable manner is: when the processor 1602 selects at least one of the plurality of channels to transmit the sub-video data unit group, the processor 1602 can be configured to: select at least one of the plurality of channels to transmit the sub-video data unit group according to the priority of the sub-video data unit group and the channel bandwidth.
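A minimal sketch of this third manner, under the assumption that priority 0 is the highest and that a one-to-one mapping between groups and channels is wanted (field names are illustrative):

```python
def map_groups_to_channels(groups, channels):
    """Sketch of the third manner: the higher-priority data unit groups are sent
    on the channels with more bandwidth (priority 0 highest; field names assumed)."""
    by_priority = sorted(groups, key=lambda g: g['priority'])
    by_bandwidth = sorted(channels, key=lambda c: c['bandwidth'], reverse=True)
    return {g['id']: c['id'] for g, c in zip(by_priority, by_bandwidth)}

groups = [{'id': 'grp_low', 'priority': 1}, {'id': 'grp_high', 'priority': 0}]
channels = [{'id': 'ch_narrow', 'bandwidth': 2.0}, {'id': 'ch_wide', 'bandwidth': 8.0}]
print(map_groups_to_channels(groups, channels))  # {'grp_high': 'ch_wide', 'grp_low': 'ch_narrow'}
```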
- The processor 1602 is further configured to control multiple encoders to encode the plurality of sub-video data units. In one achievable manner, the processor 1602 is specifically configured to control the multiple encoders to encode the plurality of sub-video data units in parallel; in another achievable manner, the processor 1602 is specifically configured to control the multiple encoders to encode the plurality of sub-video data units using different video coding rules; in yet another achievable manner, the processor 1602 is specifically configured to control the multiple encoders to encode the plurality of sub-video data units using the same video coding rule.
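For the parallel case, a thread-pool sketch is one possible realization; `encode_fn` stands in for whatever encoder wrapper is actually used and is an assumption of the example, not an API defined by the disclosure.

```python
from concurrent.futures import ThreadPoolExecutor

def encode_in_parallel(sub_units, encode_fn):
    """Hand each sub-video data unit to its own worker so the units are
    encoded concurrently; results come back in the original order."""
    with ThreadPoolExecutor(max_workers=max(1, len(sub_units))) as pool:
        return list(pool.map(encode_fn, sub_units))
```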
- The processor 1602 is further configured to control an encoder to encode two or more of the plurality of sub-video data units. Alternatively, the processor 1602 is further configured to control the encoder to encode at least one of the plurality of sub-video data units based on a motion-compensated video compression standard. Specifically, when separately encoding the plurality of sub-video data units, the processor 1602 may be configured to compress the plurality of sub-video data units with different compression ratios, wherein the compression ratio is determined based on one or more characteristics of the sub-video data unit.
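How a compression ratio might be derived from a unit's characteristics can be as simple as the illustrative rule below; the linear form and the constants are assumptions for the sketch, not values given in the disclosure.

```python
def choose_compression_ratio(priority, base_ratio=8.0, step=4.0):
    """Illustrative rule only: higher-priority sub-video data units (priority 0
    is the highest) are compressed less aggressively."""
    return base_ratio + step * priority

ratios = [choose_compression_ratio(p) for p in (0, 1, 2, 3)]  # [8.0, 12.0, 16.0, 20.0]
```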
- The movable object is an unmanned aerial vehicle, and the one or more imaging devices are connected to the movable object through a carrier, wherein the carrier may be a multi-axis gimbal.
- In the video transmission system for a movable object, the processor 1602 decomposes the video data into a plurality of sub-video data units, encodes the plurality of sub-video data units separately, and selects at least one of the plurality of channels to transmit the encoded sub-video data units according to the channel characteristics and the characteristics of the sub-video data units. When at least one of the plurality of channels is selected to transmit one or more encoded sub-video data units, each sub-video data unit can be transmitted on the channel matched to it, which expands the bandwidth of video transmission while improving the efficiency of video data transmission. At the same time, the video receiving device reconstructs the video data from the sub-video data units received over multiple channels, which can improve the fault tolerance and reliability of video transmission.
- the present embodiment provides a video receiving system suitable for a movable object.
- The video receiving system 1700 can be configured on a receiving device and is configured to perform the above video receiving method. Specifically, the video receiving system includes:
- A communication interface 1701 configured to receive a plurality of encoded sub-video data units from a plurality of channels;
- One or more processors 1702, working alone or in concert, the one or more processors 1702 being configured to: decode the received encoded plurality of sub-video data units, and reconstruct the video data from the decoded sub-video data units.
- the video data includes one or more image frames, and the sub-video data unit includes at least one of the plurality of sub-images obtained by decomposing each of the image frames.
- When decoding the encoded plurality of sub-video data units, the processor 1702 may be configured to decode each of the encoded plurality of sub-video data units separately.
- The processor 1702 may be configured to: detect a transmission error in one or more sub-images of the decoded sub-video data units, and reconstruct the video data from the correctly received sub-images.
- The processor 1702 is further configured to: assign a value to the erroneously transmitted sub-image in the decoded sub-video data unit. Specifically, the value assigned to the erroneously transmitted sub-image in the decoded sub-video data unit may be 0.
- Alternatively, the processor 1702 may be configured to: determine, by interpolation, the value assigned to the erroneously transmitted sub-image in the decoded sub-video data unit. Specifically, when determining this value by interpolation, the processor 1702 may be configured to determine the value assigned to the erroneously transmitted sub-image from the correctly transmitted sub-images, wherein the erroneously transmitted sub-image and the correctly transmitted sub-images are from the same image frame.
- The processor 1702 may be configured to reconstruct the video data from the decoded sub-video data units by using an inverse transform.
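If the decomposition had been transform-based (for example the 2x2 Hadamard option mentioned earlier), the inverse-transform reconstruction could look like the sketch below; the assumption that each sub-image holds one of the four block coefficients is made only for illustration.

```python
import numpy as np

def inverse_hadamard_reconstruct(subimages):
    """Transform-domain sketch: assume the four sub-images hold the four 2x2
    Hadamard coefficients of each block; the inverse transform restores pixels."""
    H = np.array([[1.0, 1.0], [1.0, -1.0]])
    h, w = subimages[0].shape
    image = np.zeros((2 * h, 2 * w))
    for i in range(h):
        for j in range(w):
            coeffs = np.array([[subimages[0][i, j], subimages[1][i, j]],
                               [subimages[2][i, j], subimages[3][i, j]]])
            image[2*i:2*i+2, 2*j:2*j+2] = H @ coeffs @ H / 4.0  # H @ H = 2I, hence the /4
    return image
```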
- In the video receiving system for the movable object, the processor 1702 receives the plurality of encoded sub-video data units from the plurality of channels through the communication interface 1701, then decodes the encoded sub-video data units separately to obtain the decoded sub-video data units, and reconstructs the video data from them, so that as long as the data of one or more channels is received correctly, a reconstructed image without mosaic errors can be obtained, which improves the fault tolerance and reliability of video transmission.
- This embodiment provides an unmanned aerial vehicle, including:
- a fuselage;
- a power system installed in the fuselage to provide flight power; and
- the video transmission system described above.
- In the unmanned aerial vehicle provided in this embodiment, the above video transmission system is provided on the unmanned aerial vehicle. The processor in the video transmission system can decompose the video data acquired by the imaging device into a plurality of sub-video data units, encode the sub-video data units separately, and, according to the channel characteristics and the characteristics of the sub-video data units, select at least one of the plurality of channels to transmit the encoded sub-video data units. When at least one of the plurality of channels is selected to transmit one or more encoded sub-video data units, each sub-video data unit can be transmitted on the channel matched to it, which expands the bandwidth of video transmission while improving the efficiency of video data transmission. At the same time, the video receiving device reconstructs the video data from the sub-video data units received over multiple channels, which can improve the fault tolerance and reliability of video transmission.
- This embodiment provides a receiving device, including the above video receiving system.
- Based on the video receiving system, the receiving device receives the sub-video data units sent by the mobile platform and reconstructs the video data from the sub-video data units. Specifically, the receiving device may be a remote controller, a smart phone, a tablet computer, a ground control station, a laptop computer, a watch, a wristband, or the like, or a combination thereof, and may also be used to control the unmanned aerial vehicle from the ground.
- the related apparatus and method disclosed may be implemented in other manners.
- the device embodiments described above are merely illustrative.
- The division of the modules or units is only a logical function division; in actual implementation there may be another division manner. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
- the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
- the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
- each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
- the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
- the integrated unit if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer readable storage medium.
- The part of the technical solution of the present invention that is essential or that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, which includes a number of instructions for causing a computer processor to perform all or part of the steps of the methods described in the various embodiments of the present invention.
- The foregoing storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or any other medium that can store program code.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
Abstract
A video transmission method, a reception method and a system for use in a movable object, and an unmanned aerial vehicle. The video transmission method comprises: decomposing video data into multiple sub-video data units, each sub-video data unit comprising one or more sub-images; separately encoding the multiple sub-video data units; and selecting, according to one or more characteristics of multiple channels and one or more characteristics of the sub-video data units, at least one of the multiple channels to transmit the encoded sub-video data units. In the video transmission method, the reception method and the system for use in a movable object, and the unmanned aerial vehicle, at least one channel is selected from multiple channels to transmit one or more encoded sub-video data units, so that the sub-video data units can be transmitted on matching channels, thereby widening the bandwidth for video transmission while improving the efficiency of video data transmission as well as the fault tolerance and reliability of video transmission.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780005035.6A CN108496370A (zh) | 2017-03-30 | 2017-03-30 | 视频发送方法、接收方法、系统以及无人飞行器 |
PCT/CN2017/078871 WO2018176341A1 (fr) | 2017-03-30 | 2017-03-30 | Procédé de transmission vidéo, procédé de réception, système et véhicule aérien sans pilote |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/078871 WO2018176341A1 (fr) | 2017-03-30 | 2017-03-30 | Procédé de transmission vidéo, procédé de réception, système et véhicule aérien sans pilote |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018176341A1 true WO2018176341A1 (fr) | 2018-10-04 |
Family
ID=63344696
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/078871 WO2018176341A1 (fr) | 2017-03-30 | 2017-03-30 | Procédé de transmission vidéo, procédé de réception, système et véhicule aérien sans pilote |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN108496370A (fr) |
WO (1) | WO2018176341A1 (fr) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110913244A (zh) * | 2018-09-18 | 2020-03-24 | 传线网络科技(上海)有限公司 | 视频处理方法及装置、电子设备和存储介质 |
WO2022141121A1 (fr) * | 2020-12-29 | 2022-07-07 | 深圳市大疆创新科技有限公司 | Procédé de transmission d'image, plateforme mobile, dispositif de commande à distance, système, et support de stockage |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013128010A2 (fr) * | 2012-03-02 | 2013-09-06 | Canon Kabushiki Kaisha | Procédé et dispositifs pour coder une séquence d'images en un flux binaire vidéo hiérarchique, et décoder un flux binaire vidéo hiérarchique correspondant |
US20130263202A1 (en) * | 2010-06-22 | 2013-10-03 | Gi Provision Limited | Data stream rate adaptation mechanism |
CN105120230A (zh) * | 2015-09-15 | 2015-12-02 | 成都时代星光科技有限公司 | 无人机图像监控和传输系统 |
CN105208335A (zh) * | 2015-09-22 | 2015-12-30 | 成都时代星光科技有限公司 | 高倍变焦无人机空中高清多维实时侦查传输系统 |
CN105391977A (zh) * | 2015-11-09 | 2016-03-09 | 天津航天中为数据系统科技有限公司 | 一种数据发送方法及系统 |
CN105940627A (zh) * | 2014-08-21 | 2016-09-14 | 深圳市大疆创新科技有限公司 | 无人飞行器通信方法及系统 |
CN106411838A (zh) * | 2016-06-14 | 2017-02-15 | 青岛乾元通数码科技有限公司 | 一种多信道负载均衡音视频传输方法及系统 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7480252B2 (en) * | 2002-10-04 | 2009-01-20 | Koniklijke Philips Electronics N.V. | Method and system for improving transmission efficiency using multiple-description layered encoding |
CN1753493A (zh) * | 2004-09-24 | 2006-03-29 | 松下电器产业株式会社 | 无线多媒体通信系统的跨层联合方法 |
US8189621B2 (en) * | 2006-05-12 | 2012-05-29 | Microsoft Corporation | Stack signaling to application with lack of requested bandwidth |
EP2080270A4 (fr) * | 2006-10-06 | 2010-11-17 | Agency Science Tech & Res | Procédé de codage, procédé de décodage, codeur, décodeur et produits de programme informatique |
JP2008142150A (ja) * | 2006-12-07 | 2008-06-26 | Matsushita Electric Ind Co Ltd | 医療端末および医療端末の制御方法 |
CN101848499B (zh) * | 2009-03-25 | 2013-05-08 | 上海贝尔股份有限公司 | 改进无线系统中的分级业务传输的方法、网络单元及系统 |
CN104349142B (zh) * | 2014-11-03 | 2018-07-06 | 南京航空航天大学 | 一种基于分层表达的无人机视频自适应传输方法 |
CN105025270A (zh) * | 2015-07-28 | 2015-11-04 | 南京中网卫星通信股份有限公司 | 一种天地一体多通道融合的视频传输装置及其视频传输方法 |
- 2017-03-30 WO PCT/CN2017/078871 patent/WO2018176341A1/fr active Application Filing
- 2017-03-30 CN CN201780005035.6A patent/CN108496370A/zh active Pending
Also Published As
Publication number | Publication date |
---|---|
CN108496370A (zh) | 2018-09-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12167035B2 (en) | Hybrid cubemap projection for 360-degree video coding | |
US11277635B2 (en) | Predictive coding for 360-degree video based on geometry padding | |
US11432010B2 (en) | Face discontinuity filtering for 360-degree video coding | |
WO2018176340A1 (fr) | Procédé de transmission vidéo, procédé de réception, système, et véhicule aérien sans pilote | |
EP3729809B1 (fr) | Techniques de récupération de données de codec vidéo pour liaisons sans fil avec perte | |
US20220261616A1 (en) | Clustering-based quantization for neural network compression | |
US9049464B2 (en) | Multiple description coding with plural combined diversity | |
CA3059870A1 (fr) | Codage de video a 360 degres utilisant des continuites de face | |
KR101783963B1 (ko) | 무선 네트워크들에서 비압축된 비디오 송신을 위한 크로마 파티셔닝 및 레이트 적응을 위한 방법 및 시스템 | |
US10542265B2 (en) | Self-adaptive prediction method for multi-layer codec | |
US11825116B2 (en) | Predictive coding for 360-degree video based on geometry padding | |
WO2012045098A1 (fr) | Procédé et appareil pour un codage vidéo à résolution arbitraire au moyen de mesures d'échantillonnage compressif | |
WO2018176303A1 (fr) | Procédé, système et dispositif d'émission et de réception de vidéo, et véhicule aérien sans pilote | |
US20220360778A1 (en) | Methods and apparatus for kernel tensor and tree partition based neural network compression framework | |
WO2018176341A1 (fr) | Procédé de transmission vidéo, procédé de réception, système et véhicule aérien sans pilote | |
US20250056036A1 (en) | Temporal attention-based neural networks for video compression | |
JP2024504689A (ja) | ビデオを符号化又は復号化するための方法及び装置 | |
WO2025011956A1 (fr) | Correction de mouvement pour prédiction d'attributs de nuage de points | |
WO2025078201A1 (fr) | Schéma de codage d'attribut de nuage de points à deux étages avec transformées locales et globales imbriquées | |
WO2025080447A1 (fr) | Codage prédictif implicite pour compression de nuage de points |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17903130; Country of ref document: EP; Kind code of ref document: A1 |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 17903130; Country of ref document: EP; Kind code of ref document: A1 |