
WO2018120350A1 - Method and device for positioning an unmanned aerial vehicle - Google Patents

Info

Publication number
WO2018120350A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
image
current
reference image
feature point
Prior art date
Application number
PCT/CN2017/072477
Other languages
English (en)
Chinese (zh)
Inventor
雷志辉
卞一杰
杨凯斌
贾宁
Original Assignee
深圳市道通智能航空技术有限公司
湖南省道通科技有限公司
Application filed by 深圳市道通智能航空技术有限公司 and 湖南省道通科技有限公司
Publication of WO2018120350A1

Classifications

    • G Physics > G05 Controlling; Regulating > G05D Systems for controlling or regulating non-electric variables > G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/101 Simultaneous control of position or course in three dimensions, specially adapted for aircraft
    • G05D1/102 Simultaneous control of position or course in three dimensions, specially adapted for vertical take-off of aircraft
    • G05D1/0808 Control of attitude, i.e. control of roll, pitch, or yaw, specially adapted for aircraft
    • G05D1/12 Target-seeking control

Definitions

  • the invention relates to the field of UAV control, and in particular to a method and a device for positioning a UAV.
  • UAVs have broad applications in disaster prevention and rescue, scientific investigations, etc., and flight control systems are an important part of drones, playing an important role in the intelligent and practical use of drones.
  • the drone can automatically return to the original route.
  • map data provided by a third party can usually be stored in the flight system of the drone, and the drone is then located on this map by a positioning device such as the Global Positioning System (GPS).
  • the resolution of the map data provided by the third party is related to the height of the drone from the ground: generally, the higher the drone flies above the ground, the lower the resolution. Since the flying height of the drone changes frequently during operation, the resolution difference for a ground target is likely to be large and the matching accuracy low, resulting in poor positioning accuracy when returning.
  • the technical problem to be solved by the present invention is how to improve the positioning accuracy.
  • an embodiment of the present invention discloses a method for positioning a drone, including: generating a reference image; acquiring the current image collected at the current time; and determining the current position of the drone according to the reference image and the current image.
  • generating a reference image includes: collecting a ground image during the flight of the drone; and splicing the ground image to obtain a reference image.
  • collecting the ground image includes: collecting the ground image during the flight of the drone from the starting position to the returning position.
  • the method for positioning the UAV disclosed in this embodiment further includes: determining the return flight.
  • the method for positioning the UAV disclosed in this embodiment further includes: receiving an instruction sent by the controller for indicating the return flight.
  • the method for positioning the UAV disclosed in this embodiment further includes: determining a reverse trajectory of the UAV flying from the starting position to the returning position.
  • the method for positioning the unmanned aerial vehicle disclosed in this embodiment further includes: flying from the returning position to the starting position according to the reverse trajectory.
  • determining a current location of the drone according to the reference image and the current image includes: matching the current image with the reference image to obtain a motion vector of the drone relative to the reference image at the current time; and determining, according to the motion vector, positioning information of the drone relative to the reference image at the current time; wherein the positioning information includes at least one of: the position of the drone, the height of the drone, the attitude of the drone, the azimuth of the drone, the speed of the drone, and the heading of the drone.
  • matching the current image with the reference image to obtain a motion vector of the drone relative to the reference image at the current time includes: performing scene matching on the current image and the reference image to obtain the motion vector of the drone relative to the reference image at the current time.
  • matching the current image with the reference image to obtain a motion vector of the drone relative to the reference image at the current time includes: selecting feature points of the reference image, the selected feature points being used as reference feature points; determining feature points in the current image that match the reference feature points, the matched feature points being used as current feature points; and matching the current feature points with the reference feature points to obtain the motion vector of the drone relative to the reference image at the current time.
  • an apparatus for positioning a drone including:
  • a reference module for generating a reference image during the flight of the drone; an acquisition module for acquiring the current image collected at the current time; and a positioning module for determining the current position of the drone according to the reference image generated by the reference module and the current image acquired by the acquisition module.
  • the reference module includes: a sampling unit, configured to collect a ground image during the flight of the drone; and a splicing unit configured to splicing the ground image collected by the sampling unit to obtain a reference image.
  • the sampling unit is specifically configured to collect a ground image during the flight of the drone from the starting position to the returning position.
  • the apparatus for positioning the UAV disclosed in this embodiment further includes: a determining module, used to determine the return flight.
  • the determining module is further configured to receive an instruction sent by the controller to indicate a return flight.
  • the apparatus for positioning the UAV further includes: a trajectory module, configured to determine, after the reference module generates the reference image, a reverse trajectory of the UAV flying from the starting position to the returning position.
  • the device for positioning the UAV disclosed in this embodiment further includes: a returning module, configured to fly from the return position to the starting position according to the reverse trajectory determined by the trajectory module.
  • the positioning module includes: a matching unit, configured to match the current image with the reference image to obtain a motion vector of the drone relative to the reference image at the current time; and a positioning information unit, configured to determine, according to the motion vector, positioning information of the drone relative to the reference image at the current time; wherein the positioning information includes at least one of: the position of the drone, the height of the drone, the attitude of the drone, the azimuth of the drone, the speed of the drone, and the heading of the drone.
  • the matching unit is specifically configured to perform scene matching on the current image and the reference image to obtain a motion vector of the drone relative to the reference image at the current time.
  • the matching unit includes: a selecting subunit for selecting feature points of the reference image, the selected feature points being used as reference feature points; a feature point determining subunit, configured to determine feature points in the current image that match the reference feature points, the matched feature points being used as current feature points; and a vector subunit, used to match the current feature points with the reference feature points to obtain the motion vector of the drone relative to the reference image at the current time.
  • the method and device for positioning a drone generate a reference image during the flight of the drone, so the reference image can reflect the latest ground situation; the current image collected at the current time is then acquired. Since both the reference image and the current image are acquired during the flight of the drone, there is a certain correlation between them, and the current position of the drone can therefore be determined according to the reference image and the current image. In the solution of the embodiment of the invention, because the reference image is generated, and the current image acquired, during the flight of the drone, the generated reference image can dynamically compensate for the resolution differences produced during the flight. Compared with the fixed resolution of the prior art, the drone can better achieve dynamic matching during the return flight, reducing system errors and thus improving the positioning accuracy of the return flight.
  • determining a reverse trajectory of the drone flying from the starting position to the returning position means that the drone can directly follow the reverse trajectory when flying from the returning position to the starting position; this reduces the planning of the return flight path and improves the efficiency of determining the flight path when returning.
  • the drone can fly back to the starting position by following the reverse trajectory in the case of no signal or communication failure, so that the drone can smoothly return to the starting position.
  • when flying from the starting position to the returning position, the drone will have planned a better trajectory, for example one that bypasses obstacles; flying from the returning position to the starting position along the reverse of this outward trajectory therefore inherits these advantages.
  • FIG. 1 is a flow chart of a method for positioning a drone according to an embodiment of the present invention
  • FIG. 2 is a flowchart of a method for matching a scene to obtain a motion vector of a drone according to an embodiment of the present invention
  • FIG. 3 is a schematic structural diagram of an apparatus for positioning a drone according to an embodiment of the present invention.
  • FIG. 4 is a schematic structural diagram of a system for positioning a drone according to an embodiment of the present invention.
  • In the description of the present invention, it should be noted that the terms "installation", "connected", and "connection" are to be understood broadly unless explicitly stated and defined otherwise: a connection may be fixed, detachable, or integral; mechanical or electrical; direct, indirect through an intermediate medium, or internal communication between two components; and it may be wireless or wired.
  • the specific meaning of the above terms in the present invention can be understood in a specific case by those skilled in the art.
  • this embodiment discloses a method for positioning an unmanned aerial vehicle; FIG. 1 shows a flow chart of the method. The method for positioning includes:
  • Step S101: a reference image is generated during the flight of the drone.
  • the ground image can be collected after the drone takes off from the starting position, and the ground image collected by the drone during the moving to the destination is spliced, and the stitched result is used as the reference image.
  • the so-called ground image refers to an image taken by the drone in a bird's eye view during a flight, and the angle between the overhead viewing direction and the vertical direction is less than 90 degrees.
  • the overhead viewing direction may be vertically downward, in which case the angle between the overhead viewing direction and the vertical is 0 degrees.
  • the drone can store the generated reference image for subsequent use of the reference image.
  • the reference image can also be sent to other drones, so that other drones can also use the reference image.
  • since the hardware parameters of the drone do not change during the flight of the drone, the reference image generated by the drone itself can represent the drone's departure trajectory.
  • the so-called departure trajectory refers to the flight trajectory of the drone from the starting position to the destination position.
  • Step S102: acquire the current image collected at the current time.
  • the current image acquired at the current time can be acquired during the flight of the drone.
  • in order to determine its position at the current time, the drone acquires the image captured at the current moment by the image acquisition device on the drone.
  • Step S103: determining the current position of the drone based on the reference image and the current image.
  • the current image and the reference image may be compared to obtain the difference between them, and the motion vector of the drone may be estimated from this difference, thereby determining the drone's current position.
  • the operation of generating the reference image may include: collecting a ground image during the flight of the drone; and splicing the ground image to obtain a reference image.
  • the ground image may be collected at preset intervals; the preset interval, which may be a preset time interval or a preset distance interval, can be determined from prior knowledge.
  • the preset intervals may be equal intervals or non-equal intervals.
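  As an illustrative sketch (not part of the patent), the preset-interval collection described above might be implemented as follows; the function name and the threshold values are assumptions:

```python
import math

def should_capture(last_time, now, last_pos, pos,
                   min_dt=2.0, min_dist=15.0):
    # Capture a new ground image once either the preset time interval
    # or the preset distance interval since the last capture has been
    # exceeded. min_dt (seconds) and min_dist (meters) are illustrative.
    dt = now - last_time
    dist = math.hypot(pos[0] - last_pos[0], pos[1] - last_pos[1])
    return dt >= min_dt or dist >= min_dist
```

  Either criterion alone suffices, which matches the source's statement that the interval may be defined in the time domain or over position.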
  • the splicing may be performed in overall mode, or the segmentation mode may be used for image splicing.
  • there is usually an overlapping area between adjacent frame images, and the two frames containing the overlapping portion can be combined into one large seamless image. Specifically, the overlapping area of one of the images may be discarded, the remaining portion of that image stitched to the other frame image, and fusion performed in the seam region to obtain a mosaic image.
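  The overlap splice can be sketched as follows. This is a simplified illustration operating on rows of grayscale pixel values, and a simple average is used as the seam fusion; the actual fusion method is not specified in the source:

```python
def stitch_pair(left, right, overlap):
    # Stitch two frames (lists of pixel rows) that share `overlap`
    # columns: blend the shared columns to smooth the seam, then append
    # the non-overlapping remainder of the right frame.
    stitched = []
    for lrow, rrow in zip(left, right):
        seam = [(a + b) / 2 for a, b in zip(lrow[-overlap:], rrow[:overlap])]
        stitched.append(lrow[:-overlap] + seam + rrow[overlap:])
    return stitched
```

  Repeating this over consecutive frames yields the mosaic used as the reference image.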
  • the ground image is acquired during the flight of the drone from the starting position to the returning position.
  • the so-called starting position refers to the position where the drone starts to take off;
  • the so-called returning position refers to the position from which the drone starts to return to the starting position after taking off.
  • the returning position is the destination of the drone, but in the specific implementation process, the returning position may also be the location where the drone receives the returning instruction during the flight to the destination.
  • the return location can also be a location where the drone encounters a special situation in the process of flying to the destination to determine the need to return. For example, during the flight, there are sudden situations such as insufficient power, no GPS signal, and drone failure. At this time, the flight control system in the drone determines the return flight.
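  The return-triggering conditions mentioned above (insufficient power, no GPS signal, drone failure) can be sketched as a simple flight-control check; the function name and battery threshold are illustrative assumptions, not from the patent:

```python
def should_return(battery_pct, gps_ok, failure_detected,
                  min_battery_pct=20.0):
    # Trigger the return flight on insufficient power, loss of the GPS
    # signal, or a detected drone failure. The threshold is illustrative.
    return (battery_pct < min_battery_pct
            or not gps_ok
            or failure_detected)
```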
  • the method may further include:
  • Step S104: it is determined to return.
  • the drone may actively determine the return flight when it encounters a special situation during the flight and needs to return; for example, when the drone has insufficient power, has no GPS signal, or suffers a failure during the flight, the flight control system in the drone determines to return. After flying to the destination and completing the task, the drone can also actively determine the need to return.
  • the controller may also control the drone to return. Specifically, the drone receives an instruction sent by the controller to indicate a return flight. After receiving the instruction, the drone determines to return.
  • the controller may be a remote control dedicated to the drone, or a terminal that remotely controls the drone, such as a mobile terminal, a computer, a notebook, or the like.
  • the method further includes:
  • Step S105: determining a reverse trajectory of the drone flying from the starting position to the returning position.
  • the images are captured during the flight from the starting position to the returning position, and the flight trajectory of the drone can be determined according to the attributes of these images. Following the outward path backwards, from the returning position to the starting position, forms the reverse trajectory of the drone's flight from the starting position to the returning position.
  • the drone can perform the returning operation based on the reverse trajectory.
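  Performing the returning operation based on the reverse trajectory amounts to flying the recorded outward waypoints in reverse order; a minimal sketch, with the waypoint format assumed:

```python
def reverse_trajectory(outward_waypoints):
    # The outward waypoints are recorded from the starting position to
    # the returning position; flying them in reverse order brings the
    # drone back to the starting position along the same path.
    return list(reversed(outward_waypoints))
```

  Because the outward path already avoids obstacles, the reversed list inherits that property without any re-planning.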
  • step S103 may specifically include: matching the current image with the reference image to obtain a motion vector of the drone relative to the reference image at the current time; and determining, according to the motion vector, the positioning information of the drone relative to the reference image at the current time.
  • the positioning information includes at least one of the following: the position of the drone, the height of the drone, the attitude of the drone, the azimuth of the drone, the speed of the drone, and the heading of the drone.
  • the azimuth of the UAV refers to the relative angle between the current image acquired by the aircraft at the current time and the reference image.
  • the heading of the UAV refers to the actual flight direction of the UAV.
  • the current image and the reference image may be scene-matched to obtain the motion vector of the drone relative to the reference image at the current time; specifically, please refer to FIG. 2. The method shown in FIG. 2 includes:
  • Step S201: selecting feature points of the reference image; the selected feature points are used as reference feature points.
  • reference feature points are, for example, texture-rich object edge points.
  • the feature points can be described by mathematical methods such as gradient histograms and local random binary features.
  • Step S202: determining feature points in the current image that match the reference feature points; the feature points obtained by matching are used as the current feature points.
  • the pixels in the current image can be described with the same kind of mathematical description, and the current feature points in the current image that match the reference feature points can then be determined by comparing these descriptions.
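  As a hedged illustration of this matching step (the source mentions local random binary features but does not prescribe an algorithm), binary descriptors can be compared by Hamming distance and paired by nearest neighbor:

```python
def hamming(d1, d2):
    # Hamming distance between two binary descriptors stored as ints.
    return bin(d1 ^ d2).count("1")

def match_features(ref_desc, cur_desc, max_dist=2):
    # For each reference descriptor, find the nearest current descriptor
    # by Hamming distance; keep the pair only if it is close enough.
    # The max_dist threshold is an illustrative assumption.
    matches = []
    for i, r in enumerate(ref_desc):
        j, d = min(((j, hamming(r, c)) for j, c in enumerate(cur_desc)),
                   key=lambda t: t[1])
        if d <= max_dist:
            matches.append((i, j))
    return matches
```

  The resulting index pairs are the (reference feature point, current feature point) correspondences fed into the transformation model of step S203.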
  • Step S203: matching the current feature points with the reference feature points to obtain a motion vector of the drone relative to the reference image at the current time.
  • the current feature points and the reference feature points can be matched by an affine transformation model or a projective transformation model.
  • a description of the affine transformation model or the projective transformation model is as follows.
  • the affine transformation model can be established by means of equations. Specifically, the transformation model established by the equations is: x′ = a0 + a1·x + a2·y, y′ = b0 + b1·x + b2·y.
  • the affine transformation model can also be established in the form of a matrix. The transformation model established by the matrix is: [x′, y′, 1]ᵀ = [[a1, a2, a0], [b1, b2, b0], [0, 0, 1]]·[x, y, 1]ᵀ.
  • (x, y) is the coordinate of the reference feature point in the reference image, (x′, y′) is the coordinate of the feature point in the current image that matches the reference feature point, and a0, a1, a2, b0, b1, and b2 are the affine transformation parameters.
  • when there are three groups of matched feature points, the complete set of affine transformation parameters can be solved exactly; when there are more than three groups of matched feature points, a least-squares solution gives more precise affine transformation parameters.
  • the affine transformation parameters calculated from the affine transformation model can be used to represent the motion vector of the drone.
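  A minimal sketch of the least-squares solution for the affine parameters, via the normal equations (AᵀA)p = Aᵀb with design-matrix rows [1, x, y]. This is an illustrative implementation under that standard formulation, not one prescribed by the patent:

```python
def solve3(m, v):
    # Solve the 3x3 linear system m * x = v by Cramer's rule.
    def det(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
                - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
                + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))
    d = det(m)
    out = []
    for k in range(3):
        mk = [row[:] for row in m]
        for r in range(3):
            mk[r][k] = v[r]
        out.append(det(mk) / d)
    return out

def fit_affine(ref_pts, cur_pts):
    # Least-squares fit of x' = a0 + a1*x + a2*y, y' = b0 + b1*x + b2*y
    # from three or more matched point pairs.
    rows = [[1.0, x, y] for x, y in ref_pts]
    n = len(rows)
    ata = [[sum(rows[r][i] * rows[r][j] for r in range(n))
            for j in range(3)] for i in range(3)]
    def atb(col):
        return [sum(rows[r][i] * cur_pts[r][col] for r in range(n))
                for i in range(3)]
    a = solve3(ata, atb(0))   # (a0, a1, a2)
    b = solve3(ata, atb(1))   # (b0, b1, b2)
    return a, b
```

  With exactly three point pairs the normal equations reduce to the exact solution; with more pairs they give the least-squares estimate mentioned above.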
  • the projective transformation model can likewise be established by means of equations.
  • the projective transformation matrix calculated from the projective transformation model can be used to represent the motion vector of the drone.
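  For illustration, applying a 3×3 projective (homography) matrix to a point involves a division by the homogeneous coordinate; a minimal sketch, assuming the matrix entries have already been estimated:

```python
def apply_projective(h, pt):
    # Map a 2D point through a 3x3 projective (homography) matrix,
    # dividing by the homogeneous coordinate w.
    x, y = pt
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    return ((h[0][0] * x + h[0][1] * y + h[0][2]) / w,
            (h[1][0] * x + h[1][1] * y + h[1][2]) / w)
```

  When the bottom row is [0, 0, 1] the transform degenerates to the affine model above; a non-trivial bottom row models the perspective effects of a tilted camera.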
  • the UAV return positioning device includes: a reference module 301, an acquisition module 302, and a positioning module 303, where:
  • the reference module 301 is configured to generate a reference image during the flight of the drone; the acquisition module 302 is configured to acquire the current image collected at the current time; and the positioning module 303 is configured to determine the current position of the drone according to the reference image generated by the reference module 301 and the current image acquired by the acquisition module 302.
  • the reference module 301 includes: a sampling unit, configured to collect a ground image during the flight of the drone; and a splicing unit configured to splicing the ground image collected by the sampling unit to obtain a reference image.
  • the sampling unit is specifically configured to acquire a ground image during the flight of the drone from the starting position to the return position.
  • the sampling unit may be an imaging device, such as a camera, a digital camera, etc.; the splicing unit may be a processor or a chip or the like.
  • the acquisition module 302 can be an imaging device such as a camera, a digital camera, or the like.
  • the location module 303 can be a processor or a chip.
  • the sampling unit and the acquisition module 302 may be the same camera device.
  • the method further includes: a determining module, configured to determine a return flight.
  • the determining module is further configured to receive an instruction sent by the controller to indicate a return flight.
  • the determining module may be a wireless signal receiver, such as an antenna for receiving a WiFi signal, an antenna for receiving an LTE (Long Term Evolution) signal, or an antenna for receiving a Bluetooth signal.
  • the determining module can also include a processor on this basis.
  • the method further includes: a trajectory module, configured to determine, after the reference module 301 generates the reference image, a reverse trajectory of the drone flying from the starting position to the returning position.
  • the method further includes: a returning module for flying from the returning position to the starting position according to the reverse trajectory determined by the trajectory module.
  • the foregoing trajectory module and the returning module may be a processor or a computing chip, respectively.
  • the above trajectory module and the returning module may be the same processor or a computing chip.
  • the positioning module includes: a matching unit configured to match the current image with the reference image to obtain a motion vector of the drone relative to the reference image at the current time; and a positioning information unit configured to determine, according to the motion vector, the positioning information of the drone relative to the reference image at the current time; wherein the positioning information includes at least one of: the position of the drone, the height of the drone, the attitude of the drone, the azimuth of the drone, the speed of the drone, and the heading of the drone.
  • the matching unit is further configured to perform scene matching on the current image with the reference image to obtain a motion vector of the drone relative to the reference image at the current time.
  • the matching unit includes: a selecting subunit for selecting feature points of the reference image, the selected feature points being used as reference feature points; a feature point determining subunit configured to determine feature points in the current image that match the reference feature points, the matched feature points being used as the current feature points; and a vector subunit used to match the current feature points with the reference feature points to obtain the motion vector of the drone relative to the reference image at the current time.
  • the drone includes a body 401, an image capture device 402, and a processor (not shown), wherein:
  • the body 401 is used to carry the components of the drone, such as a battery, an engine (motor), and a camera device;
  • the image capture device 402 is disposed on the body 401, and the image capture device 402 is configured to acquire images.
  • the image collection device 402 may be a camera.
  • image acquisition device 402 can be used for panoramic photography.
  • the image capture device 402 can include a multi-view camera, can also include a panoramic camera, and can include both a multi-view camera and a panoramic camera to capture images or video from multiple angles.
  • the processor is configured to perform the method disclosed in the foregoing method embodiment.
  • in the method and device for positioning a drone provided by the embodiment of the present invention, a reference image is generated during the flight of the drone, so the reference image can reflect the latest ground situation; the current image collected at the current time is then acquired. Since the reference image and the current image are both acquired during the flight of the drone, there is a certain correlation between them, and the current position of the drone can therefore be determined according to the reference image and the current image. Because the reference image is generated, and the current image acquired, during the flight of the drone, the generated reference image can dynamically compensate for the resolution differences produced during the flight; compared with the fixed resolution of the prior art, dynamic matching can be better achieved during the return flight, reducing system errors and thereby improving the positioning accuracy of the return flight.
  • determining a reverse trajectory of the drone flying from the starting position to the returning position means that the drone can directly follow the reverse trajectory when flying from the returning position to the starting position; this reduces the amount of return planning for the flight path and improves the efficiency of determining the flight path when returning. In addition, when the drone has no signal or suffers a communication failure, it can still smoothly return to the starting position by flying from the returning position to the starting position along the reverse trajectory.
  • when flying from the starting position to the returning position, the drone will have planned a better trajectory, for example one that bypasses obstacles; flying from the returning position to the starting position along the reverse of this outward trajectory therefore inherits these advantages.
  • embodiments of the present invention can be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or a combination of software and hardware. Moreover, the invention can take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) including computer usable program code.
  • the computer program instructions can also be stored in a computer readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer readable memory produce an article of manufacture comprising the instruction device.
  • the instruction device implements the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • these computer program instructions can also be loaded onto a computer or other programmable data processing device, such that a series of operational steps is performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
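The reverse-trajectory return described above can be sketched in a few lines. This is an illustrative sketch only; the class name, waypoint format, and sample data are assumptions for the example, not the patent's actual implementation.

```python
# Hypothetical sketch: record the outbound trajectory as a list of waypoints,
# then replay it in reverse order for the return flight, so no new path
# planning is needed even if communication is lost.
class TrajectoryRecorder:
    def __init__(self):
        self.waypoints = []

    def record(self, position):
        """Append a waypoint sampled during the outbound flight."""
        self.waypoints.append(position)

    def reverse_trajectory(self):
        """Return the outbound waypoints in reverse order for the return leg."""
        return list(reversed(self.waypoints))

rec = TrajectoryRecorder()
for waypoint in [(0, 0), (10, 5), (20, 12)]:   # outbound flight samples
    rec.record(waypoint)

print(rec.reverse_trajectory())  # [(20, 12), (10, 5), (0, 0)]
```

Because the outbound trajectory already avoided obstacles, replaying it in reverse inherits that avoidance without replanning.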

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method and a device for positioning an unmanned aerial vehicle. The method comprises: generating a reference image during the flight of an unmanned aerial vehicle (S101); acquiring a current image captured at the current moment (S102); and determining, according to the reference image and the current image, the current position of the unmanned aerial vehicle (S103). Because the reference image and the current image have a certain degree of correlation, the current position of the unmanned aerial vehicle can be determined from them. Compared with the fixed resolution of the prior art, the present invention allows better dynamic adaptation during the return flight of an unmanned aerial vehicle, thereby reducing systematic errors and improving the positioning accuracy of the return flight.
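The idea of deriving the current position from a reference image and a current image can be sketched as a simple mean-offset estimate over matched feature points. This is an illustrative sketch only; the coordinates, the scale factor, and the sign convention below are assumptions for the example, not the patent's actual algorithm (which may use more robust matching and estimation).

```python
import numpy as np

def estimate_position(ref_pts, cur_pts, ref_position, metres_per_pixel):
    """Estimate the drone's position from matched feature points.

    ref_pts / cur_pts are (N, 2) pixel coordinates of the same scene
    features in the reference image and the current image. The mean pixel
    offset, scaled by the ground resolution, approximates how far the
    drone has moved since the reference image was taken (assumed sign
    convention: the scene shifts opposite to the drone's motion).
    """
    offset_px = np.mean(np.asarray(cur_pts, float) - np.asarray(ref_pts, float),
                        axis=0)
    return np.asarray(ref_position, float) - offset_px * metres_per_pixel

ref = [(100.0, 100.0), (200.0, 150.0)]
cur = [(110.0, 100.0), (210.0, 150.0)]   # features shifted +10 px in x
pos = estimate_position(ref, cur, ref_position=(50.0, 80.0),
                        metres_per_pixel=0.5)
print(pos)  # [45. 80.]
```

Averaging over several matched points reduces the influence of any single noisy match; a production system would typically add outlier rejection before averaging.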
PCT/CN2017/072477 2016-12-28 2017-01-24 Procédé et dispositif de positionnement de véhicule aérien sans pilote WO2018120350A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611240082.2 2016-12-28
CN201611240082.2A CN106774402A (zh) 2016-12-28 2016-12-28 对无人机进行定位的方法及装置

Publications (1)

Publication Number Publication Date
WO2018120350A1 true WO2018120350A1 (fr) 2018-07-05

Family

ID=58923493

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/072477 WO2018120350A1 (fr) 2016-12-28 2017-01-24 Procédé et dispositif de positionnement de véhicule aérien sans pilote

Country Status (2)

Country Link
CN (1) CN106774402A (fr)
WO (1) WO2018120350A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112712558A (zh) * 2020-12-25 2021-04-27 北京三快在线科技有限公司 一种无人驾驶设备的定位方法及装置
CN113361552A (zh) * 2020-03-05 2021-09-07 西安邮电大学 定位方法及装置
CN114348264A (zh) * 2022-01-29 2022-04-15 国家海洋环境预报中心 一种基于海洋环境的无人机搜救方法及系统
CN114930194A (zh) * 2020-12-28 2022-08-19 深圳市大疆创新科技有限公司 可移动平台的位置确定方法、装置及设备
US12008910B2 (en) * 2017-08-04 2024-06-11 ideaForge Technology Pvt. Ltd UAV system emergency path planning on communication failure

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109214984B (zh) * 2017-07-03 2023-03-14 臻迪科技股份有限公司 一种图像获取方法及装置、自主定位导航系统、计算设备
CN107291099A (zh) * 2017-07-06 2017-10-24 杨顺伟 无人机返航方法及装置
WO2019061111A1 (fr) * 2017-09-27 2019-04-04 深圳市大疆创新科技有限公司 Procédé de réglage de trajet et véhicule aérien sans pilote
US10685229B2 (en) * 2017-12-21 2020-06-16 Wing Aviation Llc Image based localization for unmanned aerial vehicles, and associated systems and methods
CN110243357B (zh) * 2018-03-07 2021-09-10 杭州海康机器人技术有限公司 一种无人机定位方法、装置、无人机及存储介质
CN108917768B (zh) * 2018-07-04 2022-03-01 上海应用技术大学 无人机定位导航方法和系统
WO2021056144A1 (fr) * 2019-09-23 2021-04-01 深圳市大疆创新科技有限公司 Procédé et appareil pour commander le retour d'une plate-forme mobile, et plate-forme mobile
CN111722179A (zh) * 2020-06-29 2020-09-29 河南天安润信信息技术有限公司 一种多点布设的无人机信号测向方法
TWI829005B (zh) * 2021-08-12 2024-01-11 國立政治大學 高空定位中心設定方法及高空定位飛行控制方法
CN114779814A (zh) * 2022-05-16 2022-07-22 山东欧齐珞信息科技有限公司 无人机方位自动跟踪方法、装置、设备及可读存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101046387A (zh) * 2006-08-07 2007-10-03 南京航空航天大学 利用景象匹配提高导航系统精度的方法及组合导航仿真系统
CN103411609A (zh) * 2013-07-18 2013-11-27 北京航天自动控制研究所 一种基于在线构图的飞行器返航路线规划方法
CN104807456A (zh) * 2015-04-29 2015-07-29 深圳市保千里电子有限公司 一种gps无信号时自动返航的方法
CN104932515A (zh) * 2015-04-24 2015-09-23 深圳市大疆创新科技有限公司 一种自主巡航方法以及巡航设备
CN106204443A (zh) * 2016-07-01 2016-12-07 成都通甲优博科技有限责任公司 一种基于多目复用的全景无人机系统

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105487555B (zh) * 2016-01-14 2018-09-28 浙江华飞智能科技有限公司 一种无人机的悬停定位方法及装置

Also Published As

Publication number Publication date
CN106774402A (zh) 2017-05-31

Similar Documents

Publication Publication Date Title
WO2018120350A1 (fr) Procédé et dispositif de positionnement de véhicule aérien sans pilote
CN103118230B (zh) 一种全景图像采集方法、装置以及系统
US11073389B2 (en) Hover control
CN108323190B (zh) 一种避障方法、装置和无人机
US11906983B2 (en) System and method for tracking targets
JP2020030204A (ja) 距離測定方法、プログラム、距離測定システム、および可動物体
WO2018120351A1 (fr) Procédé et dispositif de positionnement de véhicule aérien sans pilote
US11057604B2 (en) Image processing method and device
WO2020014909A1 (fr) Procédé et dispositif de photographie, et véhicule aérien sans pilote
CN110022444B (zh) 无人飞行机的全景拍照方法与使用其的无人飞行机
US20200064133A1 (en) Information processing device, aerial photography route generation method, aerial photography route generation system, program, and storage medium
CN113875222B (zh) 拍摄控制方法和装置、无人机及计算机可读存储介质
JP7501535B2 (ja) 情報処理装置、情報処理方法、情報処理プログラム
WO2020198963A1 (fr) Procédé et appareil de traitement de données associés à un dispositif de photographie, et dispositif de traitement d'image
WO2020048365A1 (fr) Procédé et dispositif de commande de vol pour aéronef, et dispositif terminal et système de commande de vol
WO2019183789A1 (fr) Procédé et appareil de commande de véhicule aérien sans pilote, et véhicule aérien sans pilote
WO2020237422A1 (fr) Procédé d'arpentage aérien, aéronef et support d'informations
WO2020237478A1 (fr) Procédé de planification de vol et dispositif associé
JP6265576B1 (ja) 撮像制御装置、影位置特定装置、撮像システム、移動体、撮像制御方法、影位置特定方法、及びプログラム
JP2020036163A (ja) 情報処理装置、撮影制御方法、プログラム及び記録媒体
WO2020019175A1 (fr) Procédé et dispositif de traitement d'image et dispositif photographique et véhicule aérien sans pilote
WO2019100214A1 (fr) Procédé, dispositif et véhicule aérien sans pilote pour générer une image de sortie
WO2022205294A1 (fr) Procédé et appareil de commande d'engin volant sans pilote embarqué, engin volant sans pilote embarqué, et support d'enregistrement
WO2020062255A1 (fr) Procédé de commande de photographie et véhicule aérien sans équipage
WO2020088397A1 (fr) Appareil d'estimation de position, procédé d'estimation de position, programme et support d'enregistrement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17886784

Country of ref document: EP

Kind code of ref document: A1

WA Withdrawal of international application
NENP Non-entry into the national phase

Ref country code: DE
